- What if these centralized providers had restricted their LLMs to a small set of corporations / nations / qualified individuals?
- What if Google, which invented the core transformer architecture, had kept the research paper to itself instead of openly publishing it?
- What if the universities and corporations that developed concepts like the attention mechanism, so essential to Google's paper, had instead gatekept them?
- What if the base models, recipes, datasets, and frameworks for training our own LLMs had never been open-sourced and published by Meta/Alibaba/DeepSeek/Mistral/many more?
I'm pretty sure that someone else would have come around the corner with a similar idea some time later, because the fundamentals of this stuff were already being discussed decades before the "Attention Is All You Need" paper; the novel thing they did was combining existing know-how into a new idea and making it public. A couple of ingredients of the underlying research are decades old (interestingly, back then some European universities were leading the field).
I am not trying to be dismissive, but this could apply to all research ever.
Cell phones made communication easier for exactly zero people even though billions have been sold. Why? Because they come from just a few different companies.
Similar story to cell phones.
LLMs are in this state right out the gate.