How AI is Raising the Stakes for Data Centre Load Efficiency in the UK

September 19, 2025

Are you ready?

Graphics processing unit (GPU) clusters are now consuming as much power as small cities, with some burning through 100 megawatt-hours just to train a single model. The AI boom is forcing data centres to face demands that traditional systems weren't designed to handle.

McKinsey has estimated that AI-ready data centre capacity will grow 33% annually until at least 2030. The International Energy Agency warns that the electricity demand associated with this boom in data centres could more than double by the decade's end. Recent EPRI research reveals an even more dramatic shift: rack density is jumping from 8-40kW to 130-600kW, with projections reaching 1.2MW per rack by 2028. As NVIDIA's Jensen Huang noted: "Your revenue is limited if your power is limited."
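To put those rack density figures in perspective, essentially all of the electrical power an IT rack draws is rejected as heat, so rack density is, to first order, the cooling load per rack. The sketch below is simple illustrative arithmetic using the density figures above; the 10-rack row size is an assumption for illustration, not a figure from any cited source.

```python
# Back-of-envelope: thermal load implied by the rack densities above.
# IT equipment converts virtually all electrical draw into heat, so a
# rack's power density approximates its cooling requirement.

RACKS_PER_ROW = 10  # illustrative assumption only


def row_heat_load_kw(rack_density_kw: float, racks: int = RACKS_PER_ROW) -> float:
    """Heat a row of racks must reject, in kW (≈ total electrical draw)."""
    return rack_density_kw * racks


for label, density_kw in [
    ("traditional low end (8 kW/rack)", 8),
    ("traditional high end (40 kW/rack)", 40),
    ("AI today, low end (130 kW/rack)", 130),
    ("AI today, high end (600 kW/rack)", 600),
    ("2028 projection (1.2 MW/rack)", 1200),
]:
    print(f"{label}: {row_heat_load_kw(density_kw):,.0f} kW per 10-rack row")
```

Even at today's high end, a single 10-rack row rejects 6 MW of heat; at the projected 1.2 MW per rack, the same row reaches 12 MW, which is why conventional air-cooling designs stop scaling.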

"The industry needs dynamic thermal management systems that adapt to variable AI loads in real time"

Davin S. Sandhu, Global Portfolio Director for Data Centre Solutions

Understanding AI Data Centres in the UK: Training vs Inference

Not all AI data centres are built the same. AI training facilities — often referred to as “AI factories for model creation” — run continuous, power-intensive workloads that push thermal systems to their limits. These sites are responsible for developing large language models (LLMs) that underpin today’s AI applications.

Inference data centres serve a different purpose. These “AI factories for deployment” manage real-time user interactions, such as those powering tools like Copilot or ChatGPT. They must respond instantly to unpredictable usage spikes across global user bases, maintaining performance and reliability under pressure.

Geography plays a critical role. Industry trends suggest uneven global distribution of inference capacity, with regions like Asia-Pacific potentially underserved compared to more mature markets. This imbalance is driving rapid expansion worldwide. As AI tokens become more affordable, experts anticipate a surge in new applications requiring inference infrastructure closer to end users. That means facilities must be adaptable and scalable across diverse climates and conditions. Cooler regions like Northern Europe can benefit from natural cooling, while warmer areas demand more intensive thermal management.
 

The Real Challenge: Heat and Variability

AI workloads do not just consume more power — they introduce entirely new operational demands. Unlike traditional applications with steady, predictable loads, AI generates sudden power surges and intense heat bursts that can overwhelm conventional cooling systems. Modern AI chips are hotter and denser, creating significant thermal management challenges.

This is not simply about managing higher baseline energy consumption. It is about designing systems that respond in real time to workloads that shift from moderate to peak intensity in milliseconds. Traditional cooling approaches, built for stable operations, are not equipped to handle this level of variability.

The sustainability stakes are equally high. According to McKinsey, the growth of AI infrastructure could outpace global decarbonisation efforts, putting net zero targets at risk. The International Energy Agency projects that by 2030, AI-optimised data centres could consume more electricity than the entire country of Japan does today.

Explore our net zero building strategies to future-proof your data centre.
 

The Path Forward: Adaptive Infrastructure for AI

To meet the demands of AI, the industry must adopt dynamic thermal management systems that adjust to variable loads in real time. This means embedding intelligent controls, predictive analytics and adaptive cooling technologies across every layer of operations. Success depends on solutions that perform consistently across geographies while adapting to local conditions without compromising efficiency.

“The industry is very good at understanding how to remove heat in low- and medium-density scenarios,” says Davin S. Sandhu, Global Portfolio Director for Data Centre Solutions at Johnson Controls. “But as rack density keeps increasing, that's when you have to start a conversation about whether you have the right thermal management solutions in place.”

“And that's when it becomes incredibly important to have a partner who understands these different thermal management challenges and system demands, so that you're not only successful today, but you're prepared for the future.”

Organisations need partners who understand both the technical complexities and strategic imperatives of AI infrastructure. The AI revolution is raising the stakes for everyone in the data centre ecosystem — but it is also unlocking extraordinary opportunities for smarter, more sustainable and efficient operations.

The companies that solve these challenges with the right expertise and strategic support will be the ones that lead the way. The question is not whether we are ready, but whether we will choose solutions that can adapt, scale and deliver the performance required for tomorrow.
 

Ready to Future-Proof Your AI Infrastructure?

Partner with Johnson Controls to navigate the complexity of AI-ready data centres and maintain efficiency, resilience and sustainability. Learn more about our data centre optimisation solutions and how we support high-density environments across the UK.

  
