U.S. Is Facing a Major Energy Crunch Due to AI’s Insatiable Demand

Yves here. We have been regularly harping on the fact that AI consumes substantial amounts of energy while offering at best marginal benefits over existing analytical methods. Yet the touts are in charge, with policy makers for the most part ignoring the environmental costs and the impact on grid operation.

The post below describes a feeble and late reaction from the Department of Energy, even as a fresh CNBC story (hat tip Kevin W) confirms how ugly the coming crunch will be:

This strategy of reducing power use by improving compute efficiency, often referred to as “more work per watt,” is one answer to the AI energy crisis. But it’s not nearly enough.

One ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging your smartphone.

This problem isn’t new. Estimates in 2019 found training one large language model produced as much CO2 as the entire lifetime of five gas-powered cars.

The hyperscalers building data centers to accommodate this massive power draw are also seeing emissions soar. Google’s latest environmental report showed greenhouse gas emissions rose nearly 50% from 2019 to 2023 in part because of data center energy consumption, although it also said its data centers are 1.8 times as energy efficient as a typical data center. Microsoft’s emissions rose nearly 30% from 2020 to 2024, also due in part to data centers.

And in Kansas City, where Meta is building an AI-focused data center, power needs are so high that plans to close a coal-fired power plant are being put on hold.

There are more than 8,000 data centers globally, with the highest concentration in the U.S. And, thanks to AI, there will be far more by the end of the decade. Boston Consulting Group estimates demand for data centers will rise 15%-20% every year through 2030, when they’re expected to comprise 16% of total U.S. power consumption. That’s up from just 2.5% before OpenAI’s ChatGPT was released in 2022, and it’s equivalent to the power used by about two-thirds of the total homes in the U.S.

By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice

  • The rapid growth of artificial intelligence poses significant energy security risks due to its high electricity consumption.
  • The U.S. Department of Energy has proposed a new initiative called FASST to harness AI for the public’s benefit while addressing energy challenges and ensuring responsible AI governance.
  • FASST aims to advance national security, attract a skilled workforce, drive scientific discovery, optimize energy production, and develop expertise for AI governance.

To date, the runaway growth of the Artificial Intelligence industry has proven itself to be all but ungovernable. As the technology has swept through the tech sector like wildfire, regulators have been largely impotent to stay ahead of its spread and evolution. Questions about the reach and responsibility of Artificial Intelligence are being bandied around, but there are few answers to go around. And then there is the issue of the sector’s gargantuan and growing energy footprint and associated carbon emissions, which are now so significant that the developed world is facing a major energy crunch of a kind it hasn’t seen since before the shale revolution.

“AI-powered services involve considerably more computer power – and so electricity – than standard online activity, prompting a series of warnings about the technology’s environmental impact,” the BBC recently reported. A recent study from scientists at Cornell University finds that generative AI systems like ChatGPT use up to 33 times more energy than computers running task-specific software, and each AI-powered internet query consumes about ten times more energy than a standard search.

The global AI sector is expected to be responsible for 3.5 percent of global electricity consumption by 2030. In the United States, data centers alone could consume 9 percent of electricity generation by 2030, double their current levels. Already, this development is making major waves for Big Tech – earlier this month Google revealed that its carbon emissions have skyrocketed by 48 percent over the last five years.

Not only does the United States need far more renewable energy growth to keep up with the insatiable demand of the tech sector, it needs more energy production, period, in order to avoid crippling shortages. Broad and rapid action is needed on several fronts in order to slow the runaway train of AI’s energy consumption, but the United States also needs to keep up with other nations’ AI spending and development for the sake of its own national security. The genie is out of the bottle, and it’s not going back in.

“Certain strategic areas of the US government’s artificial intelligence capabilities currently lag industry while foreign adversaries are investing in AI at scale,” a recent Department of Energy (DoE) bulletin read. “If U.S. government leadership is not rapidly established in this sector, the nation risks falling behind in the development of safe and trustworthy AI for national security, energy, and scientific discovery, and thereby compromising our ability to address pressing national and global challenges.”

So the question now is not how to walk back the global AI takeover, but how to secure new energy sources in a hurry, how to place strategic limits on the intensity of the sector’s growth and consumption rates, and how to ensure that AI is employed responsibly and for the benefit of the energy sector, the nation, the public, and the world as a whole.

To this end, the United States Department of Energy has proposed a new agency-wide initiative to ‘harness and advance artificial intelligence for the public’s benefit,’ according to reporting from Axios. Just this month, the DoE released a roadmap for the program, which was first publicly mentioned back in May of this year. The Frontiers in Artificial Intelligence for Science, Security and Technology (FASST) initiative involves coordinated cooperation among all 17 of the DoE’s national laboratories.

This program would focus on staying competitive in the AI sector on a global scale, but would also put significant resources into developing more energy-efficient computing models to avoid compromising the country’s energy security and climate goals in the process. The five overarching objectives of the program are:

  1. Advance National Security
  2. Attract and Build a Talented Workforce
  3. Harness AI for Scientific Discovery
  4. Address Energy Challenges
  5. Develop Technical Expertise Necessary for AI Governance

Under the “address energy challenges” objective, the Department of Energy states that “FASST will unlock new clean energy sources, optimize energy production, improve grid resilience, and build tomorrow’s advanced energy economy. America needs low-cost energy to support economic growth, and FASST can help us meet this challenge.”

While the proposed FASST program would be a critical first step toward the responsible growth and application of Artificial Intelligence in the United States, it still needs congressional authorization and funding to be put into action. A bipartisan bill has already been introduced in the Senate.
