So for $25 a month I get access to ChatGPT and Claude models in addition to Kagi search. This sounds like a good deal compared to the $20/month for access to ChatGPT alone. Or am I missing something?
You can use something like OpenRouter, which lets you access essentially all commercially available models, including open-source ones. There are no rate limits.

You pay a different rate per model (OpenRouter shows the pricing transparently). You load your account with credits. I use it daily (undoubtedly far more than the average user) and loaded $50 of credits five months ago, but I still have over half of it left.

I find it hard to believe that Kagi would be any cheaper and have no rate limits.
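
For anyone curious what that looks like in practice: OpenRouter exposes an OpenAI-compatible endpoint, so a minimal sketch (the model slug and prompt here are just examples, and you'd need your own OPENROUTER_API_KEY) is roughly:

    # Minimal sketch of calling OpenRouter through its OpenAI-compatible API.
    # Assumes the `openai` Python package is installed and OPENROUTER_API_KEY is set.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
        api_key=os.environ["OPENROUTER_API_KEY"],
    )

    # The model string picks the provider/model; pricing differs per model.
    response = client.chat.completions.create(
        model="meta-llama/llama-3.1-70b-instruct",  # example slug, check OpenRouter's catalog
        messages=[{"role": "user", "content": "Summarize why per-model pricing matters."}],
    )
    print(response.choices[0].message.content)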

Keep in mind that, while OpenRouter gives you the upstream price for OpenAI/Anthropic models (so you pay the same per token), there's a loading charge, so if you want to load $10 in credits you pay $12 or so.

This means it's more expensive than calling OpenAI directly, even though the per-token price is the same.

Though the loading charge is amortized over all the calls you make.
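
To make that concrete, here's a rough back-of-the-envelope using the ~$2-on-$10 figure from above (the upstream price below is purely hypothetical):

    # Rough sketch: effective markup from OpenRouter's credit loading fee,
    # using the approximate ~$2 fee on a $10 load mentioned above.
    credits_loaded = 10.00   # dollars of usable credits
    loading_fee = 2.00       # approximate fee paid on top
    effective_markup = loading_fee / credits_loaded   # ~20% over upstream token prices

    upstream_cost_per_1m_tokens = 3.00   # hypothetical upstream price, dollars per 1M tokens
    effective_cost = upstream_cost_per_1m_tokens * (1 + effective_markup)
    print(f"~{effective_markup:.0%} markup -> ${effective_cost:.2f} per 1M tokens "
          f"instead of ${upstream_cost_per_1m_tokens:.2f}")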

If you only want to use one API, paying for that API directly is cheaper. However, that's only true with closed-source providers. Anyone can host a server running Llama 3.1 that OpenRouter could (in theory) route to, bringing price competition to model cost. Closed-source providers have a monopoly on their own models and can set prices wherever they want.

I'm okay with spending an extra $2 every six months to access the APIs of any model I want.

Just plug your API keys into a frontend like https://github.com/enricoros/big-AGI and pay as you go for all commercially available models.
Have you tried open-webui? [1] I've been using it and really loving it, but I'm wondering if I should try out big-AGI.

[1]: https://github.com/open-webui/open-webui

There's also big-AGI (really a weird-ass name, probably hurting them), which is good for the same use case. Just paste in your API key and you get a really nice UI to chat through at cost.

https://get.big-agi.com/

A nice thing, as I'm reading it, is that you can hook OpenRouter up to it. OpenRouter's own interface leaves a lot to be desired.
The Chatblade CLI is also worth checking out. No loading fees, and you can pipe code results to files.
Thanks for the hint; a usage-based model looks way more attractive to me right now.
DeepInfra is pretty easy to use too, for Llamas and other "open" models.
OpenRouter absolutely does have rate limits:

https://openrouter.ai/docs/limits

...I haven't had issues with them, but they are there.

Good catch. It's not relevant when used as an "assistant," though.

And if you care and have loaded credits, you can do 200 req/s (maybe there's also a surge limit? the docs are unclear).
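
If you want to see what limits apply to your own key, a minimal sketch (assuming the GET /api/v1/auth/key route and response shape described in the linked limits docs):

    # Sketch: query OpenRouter for the rate limit attached to your API key.
    # Assumes the /api/v1/auth/key endpoint from the linked limits docs.
    import os
    import requests

    resp = requests.get(
        "https://openrouter.ai/api/v1/auth/key",
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        timeout=10,
    )
    resp.raise_for_status()
    info = resp.json().get("data", {})
    print("rate limit:", info.get("rate_limit"))            # e.g. requests allowed per interval
    print("usage / limit:", info.get("usage"), "/", info.get("limit"))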

Also, if you're already a Kagi Pro subscriber, it's really only $15/mo more for access to both models. This is the first time I've actually been tempted by one of these subscription LLMs.
It was a little janky when I tested it in beta, and you don't get all the features of paying for ChatGPT directly (no multimodal, no DALL-E, etc.), but otherwise, yeah, it's a good deal.

If you just want text chat with different models, it's great.

You can just not spend the $25, and you will save $25.
What is the best paid service for private, anonymous, censorship-free access to an LLM chatbot? Are there any that let you choose between multiple LLM backends so you can compare answers or avoid being subject to secret system prompts, while still retaining privacy?
{"deleted":true,"id":41449589,"parent":41449502,"time":1725477736,"type":"comment"}