Conversation

“OpenAI’s CEO Sam Altman on Tuesday said an energy breakthrough is necessary for future artificial intelligence, which will consume vastly more power than people have expected. ‘There’s no way to get there without a breakthrough,’ he said. ‘It motivates us to go invest more in fusion.’”
https://www.news18.com/tech/openai-ceo-sam-altman-says-future-of-ai-depends-on-nuclear-fusion-breakthrough-8743104.html

In the greatest crisis humanity has ever faced, caused by overuse of energy, our Tech Bro magicians continue to invent technologies that not alone fail to conserve energy but are massively, massively energy consumptive. From bitcoin to AI, tech becomes greedier and greedier for energy.

Not alone does Big Tech not see these environment-destroying energy demands as an existential problem. They see them as a big opportunity. Altman gushes about “silver linings” and “climate-friendly sources of energy”. They are so excited about what sort of grift they can concoct to billionaire themselves a bit more with their investments in fusion. And we all play along, all clap and sing along as the greed train hurtles to its final destination. Woo hoo.

New specialized AI chips and GPUs consume 2-3x the power of prior generations.
https://www.digitalbridge.com/news/2023-08-04-early-impacts-of-generative-ai

In AI, performance always comes before environmental good.
https://arxiv.org/abs/2307.06440

Multi-purpose AI systems are orders of magnitude more energy-intensive than task-specific systems.
https://arxiv.org/pdf/2311.16863.pdf

Making an image with generative AI uses as much energy as charging your phone
https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/

Industry estimates peg the cost of running generative AI large language models at up to $4 million a day.
https://www.techtarget.com/searchcio/news/366548312/Cloud-costs-continue-to-rise-among-IT-commodities

In 2020, it took about 27 kilowatt hours of energy to train an AI model. By 2022, that had risen to a million kilowatt hours.
https://semiengineering.com/ai-power-consumption-exploding/

ChatGPT may have consumed as much electricity as 175,000 people in January 2023.
https://towardsdatascience.com/chatgpts-electricity-consumption-7873483feac4

AI data center costs will exceed $76 billion by 2028
https://www.forbes.com/sites/tiriasresearch/2023/05/12/generative-ai-breaks-the-data-center-data-center-infrastructure-and-operating-costs-projected-to-increase-to-over-76-billion-by-2028/?sh=c2981e87c15e

The hidden costs of AI: Impending energy and resource strain
https://penntoday.upenn.edu/news/hidden-costs-ai-impending-energy-and-resource-strain

OpenAI’s GPT-3 and Meta’s OPT emitted more than 500 and 75 metric tons of carbon dioxide, respectively, during training.
https://www.technologyreview.com/2022/11/14/1063192/were-getting-a-better-idea-of-ais-true-carbon-footprint

Facebook used 2.6 million kWh of electricity and emitted 1,000 tons of CO2 when developing its new LLaMA models.
https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b

The computing power required for AI increased 300,000-fold from 2012 to 2018.
https://www.wired.com/story/ai-great-things-burn-planet/

The carbon impact of artificial intelligence
https://www.nature.com/articles/s42256-020-0219-9

It takes a lot of energy for machines to learn – here's why AI is so power-hungry
https://theconversation.com/it-takes-a-lot-of-energy-for-machines-to-learn-heres-why-ai-is-so-power-hungry-151825

Energy consumption of AI poses environmental problems
https://www.techtarget.com/searchenterpriseai/feature/Energy-consumption-of-AI-poses-environmental-problems

Silicon Valley and the Environmental Costs of AI
https://www.perc.org.uk/project_posts/silicon-valley-and-the-environmental-costs-of-ai/

@gerrymcgovern
I mean: I am all for huge amounts of cheap and clean energy.

We could finally start to recycle the gigantic plastic trash heaps. Massively reduce CO2, maybe even capture it.

None of that needs AI.

@gerrymcgovern That Medium article is from May 20th, 2023. GPT-4 is from a week earlier. I wonder if the article refers to V3 or V4? Since V4 is a much larger model, it consumes more than V3, and future models will be even worse :(

@gerrymcgovern I read the story. The first item in your post says a query uses the same energy as a 5W bulb running for 26.5 hrs, but the post itself says just over 1 hr. They get that figure by assuming that all the cost of training is consumed in a single day (see comments), and that training was 80% of the total. If that training were instead amortized over 100 days, it would be less than 1% of the total, which seems more likely, and a query would be more like a quarter of an hour of that bulb.
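
That amortization argument can be sketched numerically. A minimal sketch, assuming the post's own figures (roughly one bulb-hour per query in total, 80% of it training cost amortized over a single day, then re-amortized over 100 days); none of these are measured values:

```python
# Sketch of the amortization argument. Assumptions (from the post, not
# measured): a query costs ~1 hour of a 5 W bulb in total, and 80% of
# that is training cost amortized over a single day.
BULB_W = 5.0

total_wh = BULB_W * 1.0               # ~1 bulb-hour per query = 5 Wh
training_wh = 0.8 * total_wh          # training share, 1-day amortization
inference_wh = total_wh - training_wh

# Re-amortize the same training cost over 100 days instead of one.
revised_wh = inference_wh + training_wh / 100

minutes_of_bulb = revised_wh / BULB_W * 60
print(f"{revised_wh:.2f} Wh per query, ~{minutes_of_bulb:.0f} min of a 5 W bulb")
# → 1.04 Wh per query, ~12 min of a 5 W bulb
```

Under those assumptions the per-query figure lands near the "quarter of an hour" of bulb time the post arrives at.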

@adrianco
Here's more info on queries:

"As a standard Google search reportedly uses 0.3 Wh of electricity, this suggests an electricity consumption of approximately 3 Wh per LLM interaction. This figure aligns with SemiAnalysis’ assessment of ChatGPT’s operating costs in early 2023, which estimated that ChatGPT responds to 195 million requests per day, requiring an estimated average electricity consumption of 564 MWh per day, or, at most, 2.9 Wh per request."

https://doi.org/10.1016/j.joule.2023.09.004
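
The quoted figures can be cross-checked directly, using only the numbers in the quote (564 MWh/day, 195 million requests/day, 0.3 Wh per Google search):

```python
# Cross-check of the per-request figure quoted above.
requests_per_day = 195e6      # SemiAnalysis estimate, early 2023
mwh_per_day = 564             # estimated daily electricity use
google_search_wh = 0.3        # reported energy of a standard Google search

wh_per_request = mwh_per_day * 1e6 / requests_per_day
print(f"{wh_per_request:.1f} Wh per request")           # → 2.9 Wh per request
print(f"~{wh_per_request / google_search_wh:.0f}x a Google search")
```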

@gerrymcgovern That estimate is about double the power of the other estimate you reference; 3 Wh is about 40 min of a 5W bulb. I still don’t see where 26.5 h came from.

@adrianco

It was this quote:

"Or in simpler terms, each query on ChatGPT consumes the equivalent amount of energy of running a 5W LED bulb for 26.5 hrs!”
https://medium.com/@zodhyatech/how-much-energy-does-chatgpt-consume-4cba1a7aef85

But they must have updated their page recently. Yes, the 26.5 hours for a 5W bulb is clearly wrong, sorry. I'll remove it.

@gerrymcgovern Thanks! The modeling they did for that story was pretty weak overall. Doesn’t inspire confidence. The ongoing changes in AI training: the H100 replaced the A100 and is much faster for similar power, the software is much more efficient, and the models have got bigger. Meanwhile the amount of inference per training session is increasing, and there’s a big push to reduce the cost (and power) of inference with differently optimized hardware.

@adrianco

It's funny, isn't it, how these systems always start off massively energy wasteful, then become less so, but still very, very wasteful. But no matter what AI does, it will still use much more energy than a basic search, so we progress backwards.

In computing, there is a culture of working without material or energy constraints. Yet, back in the 1960s, we managed to land people on the moon with a millionth of today's computing power.

@gerrymcgovern Electrical energy is the easiest/first thing to clean up for climate change, and we’re making good progress. The big AI training systems are running with a lot of green energy input. Transport, agriculture, manufacturing, mining etc. are a much bigger problem by orders of magnitude.

@adrianco
Agreed. I'm spending most of my time now researching mining. It's truly horrifying what's happening in mining.

@gerrymcgovern Some of the heavy equipment is moving to electric, but the main problem is that atoms are heavy…

@adrianco hitching trade to ever greater levels of dependency on compute, in an era of semiconductors still manufactured on the back of 100-year-old industrial processes which we have to stop using but are not serious about replacing, is not a good plan – materially or culturally

@gerrymcgovern

@urlyman @gerrymcgovern I'm not sure what you mean by hitching trade… A lot of the optimization and material technology development needed to clean up carbon emissions depends on more computing. A small amount of extra compute can save a large amount of carbon by optimizing industrial processes. Silicon production is a big carbon emitter at the moment, but the places it’s made (mostly in Asia) are cleaning up fairly quickly.

@adrianco It's the toxic waste more than anything. Over 100 billion tons a year. By 2050, 170 billion tons. A Mt Everest of waste every year that destroys the water table, the soil, the air. We're devouring our environment for "clean" energy and the "green" transition.

@adrianco by “hitching trade” I’m referring to the AI will save us / magic up growth mindset. Which results in stoking demand when we’re already well into overshoot. The cultural baggage that creates is deadly.

Re “cleaning up”, what is the status of replacing silicon metal manufacture with low carbon processes? And will it be done in the 6 to 8 years or so we have remaining of the Paris carbon budget?
@gerrymcgovern

@adrianco for clarity, I’m not anti-tech as part of the path away from our current energy blindness, but the more or less total absence of any notion of limits is a recurring time-travelling telegram to William Jevons

@gerrymcgovern

@gerrymcgovern They say a ChatGPT query is the equivalent of a 5W bulb for 26 hours. Wow. That's really easy to imagine :-). 130 Wh would be better. 470 kJ would be better, too. That's about a quarter of a yoghurt, if I compute correctly.
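
For what it's worth, the unit conversion in that reply checks out (taking 26 h, as the reply rounds it; 1 Wh = 3.6 kJ):

```python
# Unit conversions behind the reply above: a 5 W bulb for 26 hours,
# expressed in watt-hours and kilojoules (1 Wh = 3.6 kJ).
watts = 5
hours = 26
wh = watts * hours      # watt-hours
kj = wh * 3.6           # kilojoules
print(f"{wh} Wh ≈ {kj:.0f} kJ")  # → 130 Wh ≈ 468 kJ
```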