U.S. is Facing a Major Energy Crunch Due to AI’s Insatiable Demand

Yves here. We have been regularly harping on the fact that AI consumes substantial amounts of energy while offering at best marginal benefits over existing analytical methods. Yet the touts are in charge, with policy makers for the most part ignoring the environmental costs and the impact on grid operation.

The post below describes a feeble and late reaction from the Department of Energy, even as a fresh CNBC story (hat tip Kevin W) confirms how ugly the coming crunch will be:

This strategy of reducing power use by improving compute efficiency, often referred to as “more work per watt,” is one answer to the AI energy crisis. But it’s not nearly enough.

One ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging your smartphone.

This problem isn’t new. Estimates in 2019 found training one large language model produced as much CO2 as the entire lifetime of five gas-powered cars.

The hyperscalers building data centers to accommodate this massive power draw are also seeing emissions soar. Google’s latest environmental report showed greenhouse gas emissions rose nearly 50% from 2019 to 2023 in part because of data center energy consumption, although it also said its data centers are 1.8 times as energy efficient as a typical data center. Microsoft’s emissions rose nearly 30% from 2020 to 2024, also due in part to data centers.

And in Kansas City, where Meta is building an AI-focused data center, power needs are so high that plans to close a coal-fired power plant are being put on hold.

There are more than 8,000 data centers globally, with the highest concentration in the U.S. And, thanks to AI, there will be far more by the end of the decade. Boston Consulting Group estimates demand for data centers will rise 15%-20% every year through 2030, when they’re expected to comprise 16% of total U.S. power consumption. That’s up from just 2.5% before OpenAI’s ChatGPT was released in 2022, and it’s equivalent to the power used by about two-thirds of the total homes in the U.S.

By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice

  • The rapid growth of artificial intelligence poses significant energy security risks due to its high electricity consumption.
  • The U.S. Department of Energy has proposed a new initiative called FASST to harness AI for the public’s benefit while addressing energy challenges and ensuring responsible AI governance.
  • FASST aims to advance national security, attract skilled workforce, drive scientific discovery, optimize energy production, and develop expertise for AI governance.

To date, the runaway growth of the Artificial Intelligence agency has proven itself to be all but ungovernable. As the technology has taken over the tech sector like wildfire, regulators have been largely impotent to stay ahead of its spread and evolution. Questions about the reach and responsibility of Artificial Intelligence are being bandied around, but there are few answers to go around. And then there is the issue of the sector’s gargantuan and growing energy footprint and associated carbon emissions, which are now so significant that the developed world is facing a major energy crunch like it hasn’t seen since before the shale revolution.

“AI-powered services involve considerably more computer power – and so electricity – than standard online activity, prompting a series of warnings about the technology’s environmental impact,” the BBC recently reported. A recent study from scientists at Cornell University finds that generative AI systems like ChatGPT use up to 33 times more energy than computers running task-specific software, and each AI-powered internet query consumes about ten times more energy than a standard search.
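
To put those ratios in rough absolute terms, here is a minimal back-of-envelope sketch in Python. The ~0.3 Wh figure for a conventional search and the one-billion-queries-per-day volume are assumptions for illustration, not numbers from the article; only the roughly tenfold multiplier comes from the reporting above.

    # Back-of-envelope comparison of AI vs. conventional query energy.
    # Assumptions (not from the article): ~0.3 Wh per conventional search,
    # a commonly cited industry estimate, and an illustrative volume of
    # one billion AI queries per day. The 10x ratio is the one quoted above.
    CONVENTIONAL_SEARCH_WH = 0.3      # assumed baseline, watt-hours per query
    AI_QUERY_MULTIPLIER = 10          # AI query vs. standard search, per the article

    ai_query_wh = CONVENTIONAL_SEARCH_WH * AI_QUERY_MULTIPLIER   # ~3 Wh per AI query

    QUERIES_PER_DAY = 1_000_000_000   # illustrative, not a measured figure
    daily_mwh = ai_query_wh * QUERIES_PER_DAY / 1_000_000        # Wh -> MWh

    print(f"Energy per AI query:      {ai_query_wh:.1f} Wh vs. {CONVENTIONAL_SEARCH_WH} Wh")
    print(f"Daily draw at 1B queries: {daily_mwh:,.0f} MWh")
    print(f"Continuous load:          ~{daily_mwh / 24:,.0f} MW, around the clock")

Under those assumptions, every billion daily AI queries adds on the order of 125 MW of round-the-clock load before any model training is counted, which is why a per-query multiplier matters at grid scale.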

The global AI sector is expected to be responsible for 3.5 percent of global electricity consumption by 2030. In the United States, data centers alone could consume 9 percent of electricity generation by 2030, double their current levels. Already, this development is making major waves for Big Tech – earlier this month Google revealed that its carbon emissions have skyrocketed by 48 percent over the last five years.
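
For a rough sense of what those percentage shares mean in absolute terms, the same kind of sketch can convert them into terawatt-hours. The ~4,200 TWh figure for total annual U.S. generation is an assumed round number (in line with recent EIA data), not something stated in the article:

    # Rough conversion of the projected data-center share of U.S. electricity into TWh.
    # Assumption (not from the article): total U.S. generation of ~4,200 TWh per year.
    US_GENERATION_TWH = 4_200

    current_share = 0.045   # roughly half of 9%, per "double their current levels"
    share_2030 = 0.09       # 9% of generation by 2030, the article's figure

    current_twh = US_GENERATION_TWH * current_share
    twh_2030 = US_GENERATION_TWH * share_2030

    print(f"Data centers today:   ~{current_twh:,.0f} TWh/yr")
    print(f"Data centers by 2030: ~{twh_2030:,.0f} TWh/yr")
    print(f"Added annual demand:  ~{twh_2030 - current_twh:,.0f} TWh/yr")

On those assumptions the implied increment is roughly 190 TWh of new annual demand by 2030, which is the scale of shortfall the rest of the article is worried about.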

Not only does the United States need far more renewable growth to keep up with the insatiable demand of the tech sector, it needs more energy production, period, in order to avoid crippling shortages. Broad and rapid action is needed on several fronts in order to slow the runaway train of AI’s energy consumption, but the United States also needs to keep up with other nations’ AI spending and development for its own national security concerns. The genie is out of the bottle, and it’s not going back in.

“Certain strategic areas of the US government’s artificial intelligence capabilities currently lag industry while foreign adversaries are investing in AI at scale,” a recent Department of Energy (DoE) bulletin read. “If U.S. government leadership is not rapidly established in this sector, the nation risks falling behind in the development of safe and trustworthy AI for national security, energy, and scientific discovery, and thereby compromising our ability to address pressing national and global challenges.”

So the question now is not how to walk back the global AI takeover, but how to secure new energy sources in a hurry, how to place strategic limits on the intensity of the sector’s growth and consumption rates, and how to ensure that AI is employed responsibly and for the benefit of the energy sector, the nation, the public, and the world as a whole.

To this end, the Department of Energy has proposed a new agency-wide initiative to ‘harness and advance artificial intelligence for the public’s benefit,’ according to reporting from Axios. Just this month, the DoE released a roadmap for the program, which was first publicly mentioned back in May of this year. The Frontiers in Artificial Intelligence for Science, Security and Technology (FASST) initiative draws on coordinated cooperation from all 17 of the DoE’s national laboratories.

This program would focus on staying competitive in the AI sector on a global scale, but would also put significant resources into making more energy-efficient computer models to avoid compromising the country’s energy security and climate goals in the process. The five overarching objectives of the program are:

  • 1. Advance National Security
  • 2. Attract and build a talented workforce
  • 3. Harness AI for Scientific Discovery
  • 4. Address Energy Challenges
  • 5. Develop technical expertise necessary for AI governance

Under the “address energy challenges” objective, the Department of Energy states that “FASST will unlock new clean energy sources, optimize energy production, and improve grid resilience, and build tomorrow’s advanced energy economy. America needs low-cost energy to support economic growth and FASST can help us meet this challenge.”

While the proposed FASST program will be a critical first step in the right direction for responsible growth and application of Artificial Intelligence in the United States, it still needs congressional authorization and funding to be put into action. A bipartisan bill has already been introduced in the Senate.

29 comments

  1. Acacia

    I tripped over:

    a new initiative called FASST to harness AI for the public’s benefit

    And:

    the runaway growth of the Central Artificial Intelligence agency has proven itself to be all but ungovernable

    So, AI is already spawning a new Agency? Given that the number one objective of FASST is “Advance National Security”… it’s hard to take this seriously.

  2. vao

    So the newfangled LLM-based AI is now estimated to consume at least one order of magnitude more energy per transaction than conventional computing approaches.

    We already had the blockchain-based bitcoins that were consuming three to five orders of magnitude more energy per transaction than a conventional credit card operation.

    And the growing utilization of the Internet cloud for intensive applications means that the energy consumed for the back and forth transfer of data may significantly exceed the energy consumed for computing stricto sensu — whether on a server or on a PC.

    These developments are a puzzle for me.

    What economic mechanisms explain that, at a time when energy is becoming more expensive and its availability is getting strained (because of all those issues with pipelines and power lines), the IT world (and society at large) is enthusiastically adopting practices that are so wasteful, instead of going for more energy-efficiency?

      1. vao

        I understand that the Jevons paradox states that whatever is gained through energy efficiency is nevertheless used up through an increased application of the — more energy efficient — devices.

        Here we are observing new, energy-wasteful (and arguably less useful) approaches being followed instead of the less energy-consuming ones whose utility has a proven track record. This is utterly bizarre.

        1. Henry Moon Pie

          A world of limits is just beyond the boundaries of their worldview. They’re chasing money the way these Olympic sprinters chase the tape. Being oblivious to everything, thinking collectively is just not going to happen with them.

          And then there are others who are in a full-blown religion. Transhumanists believe AI will be some sort of super-intelligence that solves all our problems, even the ones caused by its energy consumption and carbon pollution. It’s Feuerbach’s view of theism as a projection, morphed into an effort to resurrect a god that starts out as our admitted creation. God is dead, so let me create a new one right before your eyes. Man is now a conjurer. The eventual product of that delusion could offer us some amusement. Sort of like the Millerites when April of 1844 arrived. Maybe the Transhumanists will use the old, “We did it wrong. First, we have to do it right (the conjuring), then our silicon god will appear.”

          Psychoses all over the place, and the system selects them for promotion. We spend a lot of time here trying to figure out just when that happened, and the answers range from 2000 to Clinton to ’80 to Henry Wallace’s political defenestration. (hat tip – Lambert) I vote for the ’60s and the Youth Revolution that failed. It’s not that this preference for psychotics started then. That was the last chance we had to thwart or evade it without paying a very high price. If we had seen the future, would there have been more Weathermen and Panthers then?

        2. i just don't like the gravy

          It’s only wasteful because you view energy as something to be conserved.

          All energy is subservient to the profit motive. If something is significantly more wasteful, but is incrementally more profitable, it will be adopted.

        3. jerry

          good insight! I thought on that and remembered another saying

          “the point of a system is what it does”

          we are getting good at wasting energy

  3. Polar Socialist

    So, when will we see the first start-up elevator pitching they can expand Solomonoff–Kolmogorov–Chaitin complexity to neural networks and thus will be able to remove the cruft from the pytorch/tensorflow neural network? Just imagine the savings you can achieve while deploying your AI! And you can deploy it on the Edge, too!

    I’m sure a half-decent Elizabeth Holmes clone could manage to rake in a few billions before the halting problem would become too apparent even to the greediest investors…

  4. GC54

    Time to crank out untested-in-real-world modular nuclear reactors. The Australian GABA awaits their waste stream.

  5. .human

    The bullet points should be tackled in reverse order.

    The Frontiers in Artificial Science, for Computing, Intelligence, Security, and Technology

  6. QuarterBack

    Growth in AI power demand is such that I anticipate it will begin to shift broader energy policy for the U.S. and many treaty organizations in order to make way for increased AI data center demand. A significant portion of the same investor community that supports the transition away from fossil fuels, and subsidies for EV infrastructure and usage, is also heavily invested in AI development. This fact can have more impact on policy because changes in their investment priorities will have a more immediate zero-sum impact on milestone dates and metrics.

    Look for near-term comebacks in trends for natural gas and coal electricity generation in order to satisfy the growing electricity demand. There will also be pressure to slow EV adoption. Recharging infrastructure initiatives will face headwinds from grid enhancements targeting transmission improvements to large data center sites. Subtle pressure and obstacles to EV usage and regulation will offset growing EV electricity demand by letting more transportation energy come from gasoline. We are also seeing a renewed love affair with nuclear power led by some of the big players in AI investment.

  7. lyman alpha blob

    Frying the planet for a solution in search of a problem.

    I disagree that the genie can’t be put back in the bottle. This AI atrocity needs a functioning society to continue to run and we are slowly but surely making sure we won’t have one.

    1. k

      Agree.

      Once corporations do not realize a return on their AI investments, the Genie will return to its bottle.

      However, it will take a long period of non-returns before they do, because it’s a race and all and damn it’s fun spending money on new toys!

      1. Es s Ce Tera

        The problem with that premise is, what if they do realize a return on their AI investments?

  8. eg

    The obsession with the fantasy (and grift) that is AI, consequences be damned, is exposing the rot in our civilization.

    1. tegnost

      On the bright side, this technology should help clear the computational hurdles and bring us self-driving cars in about ten years…
      How long before the population’s energy needs will be throttled in order to feed the AI which is busy saving us from itself? National Security? Look! It’s “bipartisan”! That means the entire donor class wants it, and their democracy is the best democracy and only a racist misogynist disgruntled white working class male would not agree…
      And speaking of throttling, lately it becomes harder and harder to find links to unsavory events that make the donor class look bad such as the cost projections for m4a in 2015 (it was 18 trillion over 10 years in the article I can no longer access) which now would be equal to or less than the subsidies sent into insurance company coffers from .gov. in 2023 (that would be 1.8 trillion, which times 10 years equals?)….can’t have that kind of information, nope… I’m sure that the AI makes the throttling more effective. Evil. But today will be yesterday tomorrow and the day before yesterday will no longer be, so the day after tomorrow is where we should be focusing all our energy because in a post truth world there can be no history since the day before tomorrow is today, and today will be yesterday and will no longer exist which is good because a week from today the AI has solved all the pressing philosophical questions plaguing meat based misinformed thinkers since the day before yesterday when something perplexing happened but I searched on the google and couldn’t find it so I can’t claim to know what it was so I’d better ask the AI……

    2. Kurtismayfield

      And essentially the driving goal of AI is to reduce labor costs. So they want to get rid of labor (people) and heat the planet more. They are anti-humanist.

  9. SocalJimObjects

    What’s “disappointing” is that Skynet apparently will not take over computer systems and have nukes fly all over the world, instead it’s simply making sure that the world runs out of energy sooner. You can’t write a more boring Science Fiction plot than that. Heck, no one will buy an SF book based on that premise.

  10. JohnM_inMN

    “To date, the runaway growth of the Artificial Intelligence agency has proven itself to be all but ungovernable. As the technology has taken over the tech sector like wildfire, regulators have been largely impotent to stay ahead of its spread and evolution.”

    I couldn’t help but think of Thomas Friedman as I read this. One of his patented “technology is growing at a rate beyond our ability to adapt to it” columns can’t be far off. Perhaps he has written it already.

    https://www.rollingstone.com/politics/politics-features/late-is-enough-on-thomas-friedmans-new-book-109962/

  11. The Rev Kev

    I suppose you could tax all those server farms by the amount of power that they are using and invest that money in upgrading the grid. But I am sure that Big Tech would fight this and threaten to move their server farms overseas. They should be told to go ahead and move them to a place that has abundant power & water plus a stable political economy where you won’t have radicals flying drones into those server farms. If they can find such a place, then go ahead. But not China, as Washington wants to scrap with China next year, so that is no good.

    1. Jokerstein

      Jay Inslee, soon-to-be-ex-Governor of Washington State, signed tax breaks for data centers into law, while vetoing a study of data center power use. See here.

  12. John

    To say the “Big Tech” would fight taxation to upgrade the power grid is akin to saying the sun will rise tomorrow. Ungovernability is a function of political will more often than objective fact. Once “politics” became dialing for dollars full time to go along with re-election as the prime directive, governability was fully commodified. Nothing will change as long as “AI” presents a vision of riches. Do not forget the rules. Greed is good. There is no such thing as too much.

    1. Boshko

      I constantly think the same thing. If AI is going to be our panacea, why not train and aim these tools at actual material problems (e.g. energy efficiency, fusion, carbon sequestration, etc.) rather than trivial stuff like an elaborate Turing test, conjuring bad images, and even translation/transcription?

  13. Henni

    No sane person will pay upfront for the required infra, which will only decay in value as the technology matures. There will be no way to recoup the money being thrown at AI at current costs.
