I am seriously considering binge-buying local AI inference hardware. The way things are going, there will be another big GPU crunch soon, because everyone will need local models and/or open-model inference capacity for their programming tasks once the subsidized subscriptions stop flowing.