American tech companies are investing billions in artificial intelligence. In doing so, they are creating an electricity problem


More and better artificial intelligence (AI) is the goal the major US tech companies are pursuing. OpenAI, Amazon and Meta have all announced billions in AI investments in recent weeks. Even Donald Trump addressed the topic on his second day in office. A large part of this money will go into the infrastructure that forms the backbone of AI: data centers.
OpenAI and Microsoft plan to invest 100 billion dollars in the "Stargate" project, Alphabet plans to spend 75 billion on AI development this year, Amazon is investing 86 billion in infrastructure, and Meta 65 billion. New data centers are also to be built in Europe, especially in France: French President Emmanuel Macron announced investments of 109 billion euros at an AI summit in Paris. Energy requirements will grow with the expansion of AI infrastructure; by how much is disputed.
There are far more data centers in the USA than in Europe. In them, highly specialized chips train and run AI models around the clock, consuming large amounts of electricity. In the state of Virginia, data centers already account for a quarter of total electricity demand. But where should this energy come from, as reliably and as cleanly as possible?
Artificial intelligence increases energy demand

AI consumes power several times over: large amounts while a model is being trained, and again every time a user submits a query to it. Depending on whether text, images or videos are to be generated, each query to a chatbot such as ChatGPT consumes ten to thirty times as much energy as a query to an online search engine.
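The ten-to-thirty-times multiplier can be made concrete with a back-of-envelope calculation. The 0.3 watt-hour baseline per search and the query volume are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope estimate of per-query chatbot energy, using the
# article's 10-30x multiplier over a conventional search. The 0.3 Wh
# baseline per search and the daily query volume are assumptions.
SEARCH_WH = 0.3                        # assumed energy per search query
low_wh, high_wh = 10 * SEARCH_WH, 30 * SEARCH_WH

# Scale to a billion chatbot queries per day (hypothetical volume):
daily_gwh_low = 1e9 * low_wh / 1e9     # Wh -> GWh (1 GWh = 1e9 Wh)
print(f"per query: {low_wh:.0f}-{high_wh:.0f} Wh")
print(f"1 bn queries/day, low end: {daily_gwh_low:.0f} GWh/day")
```

Even at the low end of the range, a billion daily queries would draw gigawatt-hours per day, which is why the per-query multiplier matters at scale.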
The computer chips needed to train and run AI are also more power-hungry than traditional chips used, for example, for cloud applications. To train a model or process a query, a chip must above all compute, not just store information. Computing generates heat, which is why AI data centers need special cooling. That in turn requires a lot of additional power, especially in hot regions such as Texas or Arizona.
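The cooling overhead is commonly captured by the industry's PUE metric (power usage effectiveness: total facility power divided by the power that reaches the IT equipment). A minimal sketch with assumed values, not figures from the article:

```python
# Cooling overhead via PUE (power usage effectiveness):
# facility_power = it_power * PUE, so everything above a PUE of 1.0
# goes to cooling and other overhead. All values are assumptions.
it_load_mw = 100            # assumed IT load of one large data center
for site, pue in [("temperate site", 1.2), ("hot climate", 1.5)]:
    overhead_mw = it_load_mw * (pue - 1)
    print(f"{site}: {overhead_mw:.0f} MW of overhead on {it_load_mw} MW of IT load")
```

The gap between the two assumed PUE values shows why siting the same data center in Arizona rather than a cooler region can add tens of megawatts of continuous demand.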
This is reflected in projections of data centers' future energy consumption. A study by the consulting firm McKinsey estimates that US data centers will demand 80 gigawatts in 2030, up from 25 gigawatts last year. The Boston Consulting Group (BCG) likewise expects demand to triple, and notes that its calculation already factors in AI becoming more efficient. Data centers are not being built only in the USA; countries and companies around the world are investing in their expansion. The consulting firm Bain writes that the energy consumption of data centers worldwide rose by 72 percent between 2019 and 2023 and is expected to double again by 2027.
Today, data centers account for one percent of global energy demand. If the estimates are correct, that share will reach a full 2.6 percent by 2027. That may still be small, but the rapid increase highlights the need for reliable energy sources.
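A quick check shows that the quoted projections are roughly consistent with one another (the figures are those cited in the article):

```python
# Quick consistency check of the projections quoted in the article.
us_2024_gw, us_2030_gw = 25, 80        # McKinsey: US data-center demand
growth = us_2030_gw / us_2024_gw
print(f"US demand grows {growth:.1f}x by 2030")   # roughly BCG's "tripling"

share_now, share_2027 = 1.0, 2.6       # percent of global energy demand
print(f"global share grows {share_2027 / share_now:.1f}x by 2027")
```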
In general, chips have become increasingly efficient in recent years. With AI chips, however, the trend is toward higher power draw. Nvidia's latest Blackwell chip will require 15 kilowatts of power. Fill an entire data center with such chips and it will easily consume as much electricity as a medium-sized city. Babak Falsafi, a professor at EPFL, studies the efficiency of data centers. He says: "With chips that are developed specifically for AI, energy consumption doubles with each new generation."
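The comparison with a medium-sized city can be made concrete with a rough calculation. The chip count and the household figure are illustrative assumptions; only the 15 kilowatts per unit comes from the article:

```python
# Rough scale of a data center built from 15 kW Blackwell units.
# Chip count and household draw are illustrative assumptions.
CHIP_KW = 15                  # per unit, as cited in the article
N_CHIPS = 10_000              # assumed number of units in one facility
HOUSEHOLD_KW = 1.2            # assumed average household power draw

total_mw = CHIP_KW * N_CHIPS / 1000
households = CHIP_KW * N_CHIPS / HOUSEHOLD_KW
print(f"{total_mw:.0f} MW, comparable to {households:,.0f} households")
```

Under these assumptions a single facility draws 150 megawatts, on the order of a city of a hundred thousand households, before any cooling overhead is counted.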
A year ago, Sam Altman, CEO of OpenAI, warned that an energy shortage would endanger the development of AI.
Better chips can make AI more efficient

The success of the AI of the Chinese startup Deepseek has cast doubt on the assumption that AI really needs ever more computing power. According to the company, Deepseek trained its chatbot with fewer and less powerful chips and still matched the performance of OpenAI's top model. In a research paper, Deepseek explains the techniques it used to make its AI more efficient. Other AI developers can adopt these innovations for their own models.
If computing power can be saved, data centers also use less electricity. Babak Falsafi agrees: "Improvements to the algorithms could make them more efficient and thus save energy."
However, algorithms that train AI more efficiently do not necessarily reduce the overall energy AI requires. They make AI cheaper and therefore more attractive to users. When more people and companies use AI, power consumption rises again; the costs and energy demands simply shift from training to application.
The tech companies are investing in new energy technology

Microsoft, for its part, is relying on nuclear energy and is financing the restart of the Three Mile Island nuclear power plant, which was shut down in 2019 because it was no longer profitable. In the fall, Amazon and Google announced major investments in so-called small modular reactors. These small, modular nuclear power plants generate up to 300 megawatts of power and can supply data centers with electricity directly. None of these mini nuclear power plants is on the grid in the USA yet, or even close to being commissioned.
Sam Altman himself is betting on startups such as Oklo, which is developing small nuclear reactors that run on nuclear waste as fuel. He has also invested in Helion, a company that specializes in nuclear fusion. Altman is putting hundreds of millions of dollars into energy bets in the hope of a breakthrough.
However, this breakthrough is still a long way off. It could take years or even decades before new forms of energy production provide enough electricity. Until then, data centers will often be powered by energy from fossil fuels. The AI hype is consuming electricity today.
nzz.ch