
Written by Xavier Blot, Associate Professor at emlyon business school
After decades of promises, Artificial Intelligence has entered our daily lives. This democratization comes with a growing physical footprint, driven by the energy consumption of data centers.
The IEA estimates that, globally, this type of infrastructure accounted for 1.5% of electricity consumption in 2023, or about 400 TWh. The United States dominates the sector, with over 50% of global computing capacity and 176 TWh consumed, according to a Berkeley Lab study. By 2030, global consumption could reach between 1000 TWh and 1500 TWh. The IEA reference scenario (Stated Policies) expects the sector to account for around 10% of the growth in electricity demand, i.e. 700 TWh to be added to current consumption, for a total of 1100 TWh. In the United States, demand could reach from 300 TWh to nearly 600 TWh (see next graph).
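The arithmetic behind these projections can be made explicit; a minimal sketch using only the figures cited above:

```python
# Back-of-envelope check of the IEA Stated Policies figures cited above.
# All numbers (in TWh) come from the article; this only makes the arithmetic explicit.

current_global = 400           # data center consumption in 2023 (TWh)
added_by_2030 = 700            # additional demand expected by 2030 (TWh)
total_2030 = current_global + added_by_2030

print(total_2030)              # 1100 TWh, within the 1000-1500 TWh range cited

# If data centers account for ~10% of electricity demand growth, total global
# demand growth over the period would be on the order of:
total_demand_growth = added_by_2030 / 0.10
print(total_demand_growth)     # ~7000 TWh of new global electricity demand
```

The second figure is an implication of the 10% share, not a number stated in the article.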

The large uncertainties in these estimates have several explanations.
Since 2019, demand has risen sharply even without generative AI, as online usage has exploded since Covid. It is difficult to isolate AI's share of data center energy consumption: the data are poorly reported, lack standardization, and often mix AI and non-AI uses. This limitation can be partially overcome by tracking chips specific to AI (GPUs, TPUs).
We can nevertheless say that today a generative AI request, such as a ChatGPT query, consumes more energy than a Google search (a factor of about 10 seems consistent). Future consumption could explode with the diversification of outputs (images, sound, etc.) and the democratization of uses. But after an initial growth phase, it could also stabilize, as the sector has already demonstrated, thanks to large gains in software and hardware efficiency (for example with the development of TPUs).
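To make this order of magnitude concrete, a minimal sketch assuming an illustrative ~0.3 Wh per classic search query (a commonly cited estimate, not a figure from this article) together with the factor of 10 mentioned above; the one-billion-requests-per-day volume is likewise a hypothetical round number:

```python
# Illustrative only: the per-query figures below are assumptions, not measurements.
search_wh = 0.3                        # assumed energy per classic search query (Wh)
genai_factor = 10                      # factor cited in the article
genai_wh = search_wh * genai_factor    # ~3 Wh per generative AI request

# Annual energy if, hypothetically, one billion such requests were served per day:
requests_per_day = 1e9
twh_per_year = genai_wh * requests_per_day * 365 / 1e12  # Wh -> TWh
print(twh_per_year)                    # roughly 1.1 TWh per year
```

Even under these generous assumptions, chat-style inference alone stays far below the hundreds of TWh attributed to data centers overall, which is why training, video, and non-AI workloads matter so much in the totals.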
At the international level, the IEA scenario shows that energy demand growth in the sector is likely to be similar to that of desalination technologies, but much smaller than the growth linked to new demand for air conditioning or electric vehicles.
The upward trend nevertheless seems real, and regular articles on the subject, such as Eric Schmidt's recent op-ed in Le Monde, recommend massive investments in electricity production to meet it. It is in this context that we can understand the interest of GAFAM in nuclear power, both large power plants and SMRs. Gas-fired power plants could also expand to serve this demand.
However, this overview is partial, because it hides the sector's highly concentrated geographical pressure. Generative AI is driving the development of ever more powerful data centers: from less than 10 MW to 75-150 MW today, with 500 MW to 1 GW sites already in the works (as in Saudi Arabia). Moreover, recent announcements mainly concern data centers used to train AI. These require stable and predictable power for their intensive computations, do not need to be located anywhere in particular, and a large share will probably be built in the United States. Data centers used to run models (inference) have variable loads and must sit close to users to reduce latency. Their aggregate consumption is potentially greater than that of the training phase, and their impact on local grids is more difficult to anticipate.
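To relate these power ratings to the annual consumption figures discussed earlier, a quick conversion sketch (assuming, as an upper bound, continuous operation at full load):

```python
# Convert data center power ratings (MW) to annual energy (TWh),
# assuming continuous full-load operation (an upper bound on real usage).
HOURS_PER_YEAR = 8760

def annual_twh(power_mw):
    """Annual energy in TWh for a constant load of power_mw megawatts."""
    return power_mw * HOURS_PER_YEAR / 1e6   # MWh -> TWh

print(annual_twh(100))    # a ~100 MW site: about 0.9 TWh/year
print(annual_twh(1000))   # a 1 GW site: about 8.8 TWh/year
```

On this basis, a single 1 GW campus running flat out would consume on the order of 2% of the 400 TWh attributed to all data centers worldwide in 2023, which illustrates the local grid pressure such projects create.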
This is where the future of the sector is at stake. While data centers account for 4.4% of electricity consumption in the United States as a whole, they already exceed 10% in five states, and the figure rises to over 20% in Ireland! This pressure is already forcing trade-offs for Dublin's network operator and for the American regulator (FERC). At the crossroads of local and international issues, the rise of AI will therefore require new energy strategies. The challenge is technical, economic and geopolitical.

