@evan We need to take seriously the moment we are in, when we can resist the proliferation of this system. On environmental grounds alone we can refuse to support this massively destructive technology.
- Driving a gas-powered car about 10 km generates about 2 kg CO2 equivalent.
- Eating a single beef meal is about 9 kg CO2 equivalent.
- Using an LLM for half an hour is about 0.005 kg CO2 on a dirty coal electrical grid, much less with renewables.
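A quick back-of-envelope sketch using the figures above (all values are the rough estimates quoted in the post, not authoritative measurements):

```python
# Rough CO2-equivalent figures from the post (kg CO2e).
car_10km_kg = 2.0         # ~10 km in a gas-powered car
beef_meal_kg = 9.0        # one beef meal
llm_half_hour_kg = 0.005  # 30 min of LLM use on a coal-heavy grid

# How many half-hour LLM sessions equal one car trip or one beef meal?
print(car_10km_kg / llm_half_hour_kg)   # sessions per 10 km drive
print(beef_meal_kg / llm_half_hour_kg)  # sessions per beef meal
```

On these numbers, one short car trip equals roughly 400 half-hour sessions, and one beef meal roughly 1,800.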
Maybe we have other things we should be working on first.
@Matt_Noyes @evan What's the source for that data? There isn't much trustworthy independent analysis, since these companies aren't sharing data, and what is out there neglects other inputs with potential climate impacts, such as construction, water usage, mining for minerals, and the energy used to produce semiconductors, chips, and other hardware.
@ncoca @Matt_Noyes There are a lot of different kinds of coal, not all of which are used in electricity generation; they all have emissions factors around 300-900 g/kWh. It's not going to change the scale that much.
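A minimal sketch of why the coal type doesn't change the scale: session footprint is just energy used times the grid's emissions factor. The ~5.5 Wh per half-hour session is an assumption, back-solved so that the top of the coal range (~900 g/kWh) reproduces the ~5 g figure quoted above.

```python
# session CO2e (grams) = energy used (kWh) * grid emissions factor (g/kWh)
session_kwh = 0.0055  # assumed energy for 30 min of LLM use

for label, factor_g_per_kwh in [("coal, low end", 300),
                                ("coal, high end", 900),
                                ("renewables-heavy grid", 50)]:
    grams_co2e = session_kwh * factor_g_per_kwh
    print(f"{label}: {grams_co2e:.2f} g CO2e per half-hour session")
```

Even across the full 300-900 g/kWh coal range, the result only moves between roughly 1.7 g and 5 g per session.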
@evan @Matt_Noyes ChatGPT itself is not a reliable source for data about its own emissions. And it doesn't address the concerns about the broader footprint - "per request" is a tiny, tiny piece of the potential impact, and says nothing about broader environmental/social impacts.
I'm looking into another big tech company's data center footprint, and their self-reporting is full of errors and lies. It's not trustworthy at all.
@evan @Matt_Noyes What’s the training cost for a human? Human programmers in the US account for around 20 tons of carbon a year in operational cost. I agree with Evan’s data. It’s in the same range as my own estimates, and I’ve been working with the Green Software Foundation, tracking the carbon use of cloud computing in detail, for the last few years.
@Matt_Noyes @adrianco It's interesting but really hard to read. The comparison starts with all electricity use for the US, then switches to 22% of residential energy use. This is one of the reasons people get really confused about emissions!
@Matt_Noyes @adrianco As best I can tell, it sounds like they project a doubling or tripling of AI use, and thus of AI's electricity needs, by 2028. That very well could happen, but it doesn't sound like the carbon intensity of the activity itself will change.
@Matt_Noyes @adrianco Prima facie, even if people's AI use triples, if it's 1/1000th of the carbon footprint of cars or beef today, it will be 3/1000ths of the footprint in 2028. That's still pretty small.
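The scaling argument above, as a one-line sketch (the 1/1000 share is the post's own illustrative figure):

```python
share_today = 1 / 1000        # AI's assumed share of the car/beef footprint today
share_2028 = share_today * 3  # usage triples; carbon intensity per use stays flat
print(f"{share_2028:.3f}")    # prints 0.003, i.e. 3/1000ths
```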
@Matt_Noyes @adrianco I'll read the MIT paper more carefully and see if I can provide some more insight, though. Maybe I'm missing something important! Thanks for sharing it.
> Using an LLM for half an hour is about 0.005kg CO2 with a dirty coal electrical grid
And if use of the LLM were the biggest part of its electricity cost, this would almost be a fair point (only almost, because there are inputs to that electricity use that are likely not being factored in, but that's not really the main point).
But regular use is not the biggest problem. Training models consumes massive amounts of electricity. And AI companies are constantly training new models to improve their performance.
And the only reason they need to keep training new models is because people keep using them.
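One way to frame the dispute above: per-request figures look small partly because training energy gets amortized over many requests, and how much training adds per request depends entirely on usage volume. A sketch with purely illustrative numbers (both inputs are assumptions, not measured values):

```python
# Illustrative only: amortizing a one-off training run over lifetime queries.
training_mwh = 1000             # assumed training-run energy, MWh
queries_served = 1_000_000_000  # assumed lifetime queries for that model

# Convert MWh to Wh, then spread over all queries.
per_query_wh = training_mwh * 1_000_000 / queries_served
print(per_query_wh)  # prints 1.0 (Wh of training energy per query)
```

With these assumptions each query carries about 1 Wh of training energy; with a tenth of the usage it would carry ten times that, which is why the total footprint, not the per-request slice, is the number in dispute.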
"Using AI barely uses any electricity" isn't a reasonable argument, because it obscures the fact that AI companies are using massive amounts of electricity.