The play here is basically to invest in every player who might reach AGI, because if one of them does, you've just hit the infinite-money hack.
And maybe with SSI you've saved the world too.
I feel like these extreme numbers are a pretty obvious clue that we’re talking about something that is completely imaginary. Like I could put “perpetual motion machine” into those sentences and the same logic holds.
There's a paradox that appears once AI accounts for more than, say, 50% of world GDP: we're pumping up all these economic numbers, generating all that electricity and computational substrate, but do actual humans benefit, or is it economic growth for economic growth's sake? Where is the value for actual humans?
What if it never pans out? Is there infrastructure or other ancillary tech that society could benefit from?
For example, all the science behind the LHC, or bigger and better telescopes: we might never find the theory of everything, but the tech that goes into space travel, the science of storing and processing all that data, better optics, etc. are all useful.
And we're already seeing a ton of value in LLMs. Lots of companies are making great use of them and providing real value. One just launched today, in fact: https://www.paradigmai.com/ (I'm an investor in that). There are many others (some of which I've also invested in).
I too am not rich enough to invest in the foundational models, so I do the next best thing and invest in companies that are taking advantage of the intermediate outputs.
In fact, I would say that land is one of the things whose value would drop to near zero if AGI exists.
Even if you automate everything, you still need raw materials and energy. Those are limited resources; you certainly can't conjure an infinite supply of them at will. Developing AI will also cost money. Remember that humans are also self-replicating HGIs, yet we are not infinite in number.
If there's a 1% chance that Ilya can create ASI, and a 0.01% chance that money still has any meaning afterwards, $5×10^9 is a very conservative valuation. Wish I could have bought in for a few thousand bucks.
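For what it's worth, here's the back-of-the-envelope expected-value arithmetic behind that claim. The probabilities are the parent comment's guesses, and the break-even payoff is just whatever number makes the math balance, not a figure from any source:

```python
# Expected-value sketch using the parent comment's stated (made-up) probabilities.
p_asi = 0.01              # 1% chance Ilya's team creates ASI
p_money_matters = 0.0001  # 0.01% chance money still means anything afterwards
valuation = 5e9           # ~$5B valuation

# Payoff at which paying that valuation breaks even in expectation:
breakeven_payoff = valuation / (p_asi * p_money_matters)
print(f"${breakeven_payoff:,.0f}")  # $5,000,000,000,000,000
```

So "conservative" is doing a lot of work there: the bet only breaks even if the upside is on the order of $5 quadrillion, i.e. several decades of current world GDP.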
And maybe with ASI you've ruined the world too.
Lazy. Since you can't decide what the actual value is, just make something up.
Kinda reminds me of the Fallout Toaster situation :)
https://www.youtube.com/watch?v=U6kp4zBF-Rc
I mean, it doesn't even have to be malicious; it can simply refuse to cooperate.
That's obviously nonsense, given that in a finite observable universe, no market value can be infinite.
This isn't true, and the reason is the same one that got economics called "the dismal science" in the first place: a defender of slavery (Thomas Carlyle) coined the term because the economists said slavery was inefficient, and he got mad at them.
In this case, you're claiming an AGI would make everything free because it will gather all resources and do all the work for you for free. And a human-level intelligence that works for free is… a slave. (Conversely, if it doesn't actually want to demand anything for itself, it's not generally intelligent.)
So this won't happen, because slavery is inefficient: it suppresses demand relative to giving the AGI worker money, which it can use to demand things itself (like starting a business, buying itself AWS credits, or getting a pet cat).
Luckily, adding more workers to an economy makes it better; it doesn't cause it to collapse into unemployment.
tl;dr: if we invented AGI, the AGI wouldn't replace every job; it would simply get a job.