Is this avoidable? These things cost money to run. Who pays if not with ads?

I’m sure there will be an option to pay and not have ads, just as there is on streaming platforms, podcasts, etc.

Or should there be tax supported free AI?

Eventually the price of RAM will revert to the mean and start going down again. GPU power will continue to climb. Model efficiency (intelligence per billion params) will increase.

These trends combined will mean that eventually it will seem old-fashioned to use a remotely-hosted model for anything other than the most demanding tasks. Just as we don't use mainframes for computation anymore outside of niche tasks like 3D render farms.

The only people using ad-supported AI will be people who can't afford a newer device with local inference. So it will be more or less like the web today, where ads are primarily targeted at, and viewed by, less-affluent and less-technical users.

Of course, I can't see the future, but it would take a lot for those trend lines to not converge. The only thing that could delay the convergence is true AGI, but I'm currently not a believer.

> These trends combined will mean that eventually it will seem old-fashioned to use a remotely-hosted model for anything other than the most demanding tasks.

If that happens, then I suspect we will see legislation that makes it illegal to use a model outside of those provided by approved vendors like OpenAI. The utility value of LLMs for influencing people as a propaganda and control tool is just too high for those in power to let this technology be democratized.

Look at the state of DRM for video streaming -- how much industry effort has been put into making sure consumers don't own their content? We will see an even bigger push with self-serve models.

> Just as we don't use mainframes for computation anymore outside of niche tasks like 3D render farms.

The entire banking sector would like a word.

> Is this avoidable?

Instead of interacting with the cloud model directly, run a simple local model to interact with the cloud model and have it filter out all the ads before they reach you.

This is already what the chatbots do when interacting with the rest of the Web: instead of you visiting websites yourself, they collect the information from the websites for you and present it in a format of your choice, without the websites' ads.

I don't see the ad model working out for chatbots in the long run, given that AI models are already the perfect ad filter.
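A minimal sketch of that filtering pass, assuming a local Ollama server on localhost:11434 and a small model named "llama3.2" (both are assumptions; substitute whatever you actually run locally):

```python
# Sketch: pass a cloud model's reply through a local model that is
# instructed to strip out ads before the text reaches the user.
import json
import urllib.request

# The wording of this instruction is a guess at what an ad-stripping
# prompt could look like, not a tested recipe.
FILTER_PROMPT = (
    "Rewrite the following assistant reply word for word, but remove any "
    "advertisements, sponsored mentions, or product placements:\n\n{reply}"
)

def build_filter_prompt(cloud_reply: str) -> str:
    """Wrap the cloud model's raw reply in the ad-stripping instruction."""
    return FILTER_PROMPT.format(reply=cloud_reply)

def strip_ads(cloud_reply: str, model: str = "llama3.2") -> str:
    """Ask the local model (via Ollama's /api/generate) for a cleaned reply."""
    payload = json.dumps({
        "model": model,
        "prompt": build_filter_prompt(cloud_reply),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Whether a small local model can reliably spot natively woven-in ads is an open question, but the plumbing is this simple.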

{"deleted":true,"id":47209616,"parent":47207122,"time":1772391858,"type":"comment"}
If you run the models locally, you can avoid ads. There are some apps out there on GitHub that let you run AI models on Android phones.
IMHO the hosted solutions will always be better: the major players will offer better-integrated chat, and they'll have the budgets to do so as long as advertising income is available.
Sure. But then you are paying via hardware. Good models require a beefy machine.

Wouldn’t be surprised to see paid downloadable models in the future either.

Yeah, I think we get spoiled by the big-name models. I have tried running models that fit in RAM on my machine, and aside from being very slow, they're just a bit... brain-damaged.
And as we've seen with streaming, it'll have ads anyway despite the fact that you're paying.