I don't see how it's possible to think this. AI coding assistants are some of the most useful technologies ever created, and model quality is by far the most important thing, so it doesn't make sense that local inference would be the path forward unless something fundamentally changes about hardware.
The hardware will change. We know that.