Keep in mind that, while OpenRouter passes through the upstream price for OpenAI/Anthropic models (so you pay the same per token), there's a loading charge: if you want to load $10 in credits, you pay $12 or so.
This means it's more expensive than calling OpenAI directly, even though the per-token price is the same — the loading charge is amortized over all the calls you make.
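To make the amortization point concrete, here's a rough sketch. The 20% fee rate matches the $10-in-credits-costs-$12 example above, and the $2.50-per-million-tokens base price is purely illustrative, not any provider's actual rate:

```python
def effective_price(per_token: float, fee_rate: float = 0.20) -> float:
    """Per-token cost once a credit-loading fee is folded in.

    Hypothetical: a flat fee_rate on loaded credits raises every call's
    effective cost by the same fraction, no matter how usage is spread out.
    """
    return per_token * (1 + fee_rate)

# Illustrative base price: $2.50 per 1M input tokens.
base = 2.50 / 1_000_000

print(f"direct:   ${base * 1_000_000:.2f} per 1M tokens")
print(f"with fee: ${effective_price(base) * 1_000_000:.2f} per 1M tokens")
```

The takeaway is that a one-time loading fee behaves like a fixed percentage markup on every token, regardless of how many calls you spread the credits over.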
If you only ever use one API, paying that provider directly is cheaper. However, that's only true for closed-source providers: anyone can host a server running Llama 3.1 that OpenRouter could (in theory) route to, bringing price competition to inference costs, while closed-source models have a monopoly and can set their price wherever they want.
I'm okay with spending an extra $2 every six months to access the APIs of any model I want.
Sure, but I only use the first-party hosted APIs, so for me it doesn't make much sense to pay the extra premium. Maybe it doesn't for others either.