In the digital age, artificial intelligence is considered the new oil: a technology with the potential to automate processes, enable new forms of human interaction, and transform entire industries. Yet the seemingly limitless innovative power of AI carries a risk that has so far received little attention in public debate: the rapidly growing power requirements of modern AI applications and their impact on the physical infrastructure of digitalization.
“It is not just about energy goals, but also about the resilience of digital systems. Digitalization is thus running up against its own physical limits,” says Jerome Evans, founder and CEO of firstcolo GmbH. “Artificial intelligence does not function without massive computing power. Large language models, multimodal neural networks, and machine learning systems for industry, medicine, or research consume enormous amounts of energy.”
While classic applications run with relatively stable load profiles, AI workloads produce complex load peaks and sustained stress that place heavy demands on data centers. “We are talking about workloads that require specialized hardware such as GPUs or TPUs and exhibit high power densities, often at per-rack levels that average data centers are simply not designed for,” Evans explains. Yet the problem is not just the power draw of the servers themselves.
“Power supply, cooling, and redundancy systems must also grow accordingly. That is increasingly difficult, as local power grids in many cities are already at full capacity,” Evans adds. New grid connection capacity is scarce, and obtaining it involves approval processes that take years. Moreover, AI systems do not just need more power; it must also be continuously available. A single outage, for example during model training, can cost millions in wasted resources or bring entire business processes to a standstill.
There is a structural imbalance: AI applications are developed and scaled far faster than energy infrastructure can be planned and built. “Software scales in months, while grid expansion takes years. The gap keeps widening,” Evans warns.
The consequence: many companies outsource their AI processes to data centers without paying attention to the resulting load or the associated risks. “It is therefore a matter of viewing digitalization not just as a software issue, but as a holistic, systemic task. AI innovation can only happen if the infrastructure that enables it keeps pace,” says Evans. “Energy efficiency helps to slow the growth in demand, but not to prevent it.”
Due to high electricity prices, energy efficiency has long been a core requirement for German data centers, one they meet through heat recovery, free cooling, and the use of renewable energy. “However, if total consumption multiplies, even the most efficient infrastructure eventually reaches its limits,” Evans points out.
“A fail-safe data center needs power first and foremost—reliable, stable, and around the clock. Possibilities could include prioritized power access for system-relevant digital services, the promotion of edge computing and decentralized load distribution, or faster approval and grid expansion procedures for high-availability infrastructure,” he outlines. Given the increasing dependence on digital processes, it would be negligent to view power as a secondary matter.
In its coalition agreement, the black-red (CDU/CSU-SPD) federal government set the goal of making Germany, as a data center location, a ‘beacon of Europe’. “However, if power is lacking, such ambitions will come to nothing, and Germany risks falling behind in key digital technologies: Europe’s beacon could quickly become its taillight,” Evans notes.