LB: it's funny to me that in the context of that AI bubble story there's the Google CEO out there saying "The risk of underinvesting is dramatically greater than the risk of overinvesting for us here" -- like, looking at the Google of 10 years ago, they were the company to beat in conventional AI: natural language processing and pre-LLM machine learning as part of actual products and services millions of people use every day.
Sequoia Capital's analysis of the AI arms race in game-theoretic terms is chilling by implication.
Our biggest corporations are locked in a prisoner's dilemma game. And when this bubble bursts it'll swallow about a trillion dollars.
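A minimal sketch of the prisoner's-dilemma framing above, with invented payoff numbers (illustrative only, not from the Sequoia analysis or any company's financials): if "invest" is the better move whether or not the rival invests, every firm invests, and the sector as a whole lands on the burn-a-trillion outcome.

```python
# Toy payoff matrix for the "prisoner's dilemma" framing above.
# All numbers are invented for illustration; they are not from the
# Sequoia analysis or from any company's actual financials.

# Expected payoff (arbitrary $Bn) to one firm, keyed by
# (my_choice, rival_choice): "invest" = pour capex into AI,
# "hold" = keep backing existing products and services.
PAYOFFS = {
    ("hold", "hold"): 0,         # status quo: nobody escalates
    ("hold", "invest"): -200,    # rival captures the (possible) new platform
    ("invest", "hold"): 50,      # you capture it, net of heavy capex
    ("invest", "invest"): -100,  # everyone overbuilds: the bubble cost
}

def best_response(rival_choice: str) -> str:
    """Return the choice that maximises this firm's payoff given the rival's move."""
    return max(("hold", "invest"), key=lambda mine: PAYOFFS[(mine, rival_choice)])

for rival in ("hold", "invest"):
    print(f"rival plays {rival!r:9} -> my best response: {best_response(rival)!r}")

# With these assumed numbers "invest" wins either way (a dominant strategy),
# so both firms invest and end up at the mutual -100 outcome: the
# "risk of underinvesting > risk of overinvesting" logic, applied by
# every player at once.
```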
it's like all of a sudden they decided their remaining products and services were worth zero and they urgently have to pour whatever they can into LLMs, like just another monster-of-the-week venture effort
@rakslice Did you see Sequoia Capital's analysis of what's going on in game-theoretic terms? Not the AI startup grifters, or the GPU manufacturers selling shovels during the gold rush, but the existential terror that's got Microsoft/Google/Amazon/Apple/Facebook and hangers-on all panicking and shovelling billions into the sector?
This made sense of it to me—the missing piece of the picture. The implications are terrifying.
like, if anyone can say there is a risk of overinvesting in some new, unproven, experimental direction compared to the existing stuff _they_ know works, it should be those guys
If cloud services have a combined market of $250Bn, what's the future valuation of the entire AI sector? Bear in mind that AI has a limitless appetite for data centre capacity, burps, and asks for more. Which is what the big vendors (AMZN, AAPL, META, MSFT) are making a play for. All with market caps north of $1Tn each, so essentially bottomless asset pits to borrow against.
The FAANGs all have a record of terrible investment decisions over the past two decades. Apple wasted $8Bn on driverless cars before they gave up. Meta spent, by some estimates, $100Bn pursuing the VR metaverse before they quit. Amazon lost $10Bn on Alexa in 2022 alone.
So like gambling addicts everywhere, they've just moved on up to the top table (AI) and are doubling down.
@cstross but in any case, this is just the startup story again: "betting the farm on something stupid has no downside for the decision-maker because of how bankruptcies work"
@cstross "Whether or not these investments end up being profitable before they depreciate, they are on the critical path to AI’s long-term impact." well, if it turns out it has no long-term impact, then nothing was on the critical path to its long term impact
@Di4na @cstross @KevinMarks Don't jumble all AI together. There have been massive productivity gains from a diverse set of AIs throughout all areas of life and industry in the past decades. Those AIs were by and large not non-specialised generative AI such as the LLMs and image generation we are seeing now. They are specialized and mostly invisible. Talking about GenAI as "AI" as if it covered the whole field is a marketing trick. Don't help them pull it off. ;-)
A huge swath of these orgs are not set up and managed to make money.
Their managing class does not have the training and experience for it. They built their skills in an environment where money was everywhere and eager for any story, regardless of real RoI, and in which succeeding depended less on skill than on luck and on having a massive war chest to invest.
@cstross @KevinMarks there are genuine ways to unlock massive economic windfalls and productivity boosts from software. But they look like http://moraware.com/
Not AI.
Also none of the Tech crowd built this, or is built to understand it. There will be a lot of pain until we retool and reorient toward that.
It could be sped up. It would also be "cheap" at this scale. And it would be more equitable. But... policy makers have not realised that yet.
@cstross @KevinMarks part of the problem is that for the last few decades, being bad at investing did not matter. There are two big reasons:
1. Boomers' savings meant there was a lot of money to buy anything and to support investment. Money needed to go somewhere, and there was more demand for investment products than there was supply of them.
2. Software made money despite bad investments.
Which means it created an industry in which making money is a side thing, not a trained skill.
(There *is* value, e.g. in protein folding research, drug design and delivery, physics, image recognition, and a bundle of other fields. But spicy autocomplete is what gets the headlines and the biggest bucks.)
@cstross even that analysis assumes that the LLM frenzy is actually valuable, and more valuable than existing cloud services. When they realise that there is not enough actual utility people will pay for, how does this shake out then?
@cstross @KevinMarks Like that's basically saying the LLM bubble is an opportunity to do research in the actually useful parts of AI. Sure, but like you say, these companies waste billions on long shots all the time; they could do that any time they want.
@rakslice @KevinMarks I'd be a lot happier if they were throwing this kind of money at aneutronic fusion reactors or orbital solar power stations. Similar startup costs but more likely to turn out useful (AND profitable in the long run).
@cstross I guess I'm missing the part where these corporations setting a trillion dollars on fire is anything other than "don't threaten me with a good time".
@jwz @cstross the problem is that a lot of that money is going to go into the pockets of the very grifters who started the bubble, and who are currently working to elect a fascist regime by hook or by crook. It's going to wreck the "normie" economy, in a way that Bitcoin tried and mostly failed to do, and a bunch of money will be redirected away from productive use by companies that employ people, and towards grifters who'll replace those workers with vastly inferior technology that pretends to do the work.
@cstross @KevinMarks thing is, spicy autocomplete is a great generator of disinformation. And with so many billionaire fascists seeking to take over governments, demand for disinformation generators is huge.
And then there is the long-time fascist love affair with using correlations from sloppy, biased databases and the fake imprimatur of statistics to justify their decisions on who goes to the camps. That too promises money for so-called "AI".
@cstross @rakslice was just reading that, and was struck by the “welp, guess we're watching a horrifyingly expensive game of chicken in progress” quality of it