While the press lumps it all together as "AI", you have to differentiate LLMs (driven by big tech and big money) from the largely unrelated image/video generative models and approaches such as diffusion, NeRF, and Gaussian splatting, which have their roots in academia.
LLMs don't have their roots in academia?
Not anymore.
Not at all - the Transformer was invented by a group of Google employees (while at Google, though many have since left), primarily Jakob Uszkoreit and Noam Shazeer. Of course, as with anything, it builds on what came before, but it's really quite a novel architecture.
This makes no sense. A thing's roots don't change; either it started there or it didn't.