You pay a different rate per model (OpenRouter shows the pricing transparently). You load your account with credits. I use it daily (undoubtedly far more than the average user) and loaded $50 in credits five months ago, but I still have over half of it left.
I find it hard to believe that Kagi would be any cheaper and have no rate limits.
This means that it's more expensive than calling OpenAI directly, even though they have the same price per token.
If you want to use exactly one API, paying that provider directly is cheaper. However, that's only true with closed-source providers. Anyone can host a server running Llama 3.1 that OpenRouter could (in theory) route to, bringing price competition to model hosting. Closed-source providers have a monopoly on their models and can set prices wherever they want.
I'm okay with spending an extra $2 every six months to access the APIs of any model I want.
https://openrouter.ai/docs/limits
...I haven't had issues with them, but they do exist
And if you care and load up credits, you can do 200 req/s (maybe there's also a surge limit? the docs are unclear)
If you just want text chat with different models, it's great.
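For what it's worth, OpenRouter exposes an OpenAI-compatible chat completions endpoint, so switching between models is just a matter of changing the `model` string. A minimal sketch (the endpoint URL follows OpenRouter's public docs; the helper function and model IDs here are my own illustration):

```python
import json
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a POST request for OpenRouter's chat endpoint.

    Swapping the `model` string is all it takes to switch providers,
    e.g. "openai/gpt-4o" vs "meta-llama/llama-3.1-70b-instruct".
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Same code path, different models -- no per-provider SDKs needed.
req = build_chat_request("meta-llama/llama-3.1-70b-instruct", "Hello!", "sk-or-...")
```

Actually sending the request (e.g. with `urllib.request.urlopen(req)`) would bill against your credits, so it's left out here.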