AI workloads meet thermal constraints in data centres
Artificial intelligence is often framed as a software breakthrough. In practice, it is also an energy and infrastructure challenge.
A single AI query can consume roughly 10 times the electricity of a typical internet search, and demand is climbing rapidly. By 2030, data centres are projected to account for around 3% of global electricity consumption — nearly double their share today and growing far faster than electricity demand from most other sectors.
That level of demand brings with it a practical constraint: heat.
AI infrastructure is becoming denser, more power-hungry and far more thermally intensive than the systems data centres were originally designed to support. Cooling systems that once performed reliably are now approaching their limits.
The shift to high-density, liquid-cooled infrastructure
AI workloads are pushing rack power densities to new extremes.
Where traditional enterprise servers might draw only a few kilowatts per rack, modern GPU-accelerated systems operate at dramatically higher levels. Today’s fully populated GPU racks draw around 132 kilowatts. The next generation of systems is expected to approach 240 kilowatts per rack, and industry roadmaps are already exploring future densities of one megawatt per rack.
These systems rely on graphics processing units and specialised accelerators that generate intense, concentrated heat loads. Moving enough air through racks at this scale quickly becomes inefficient and difficult to manage.
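To see why, consider the airflow a single such rack would demand. The sketch below applies the standard sensible-heat relation (Q = mass flow × specific heat × temperature rise) with illustrative assumptions: a 132 kW rack and a 15°C air-side temperature rise.

```python
# Airflow needed to remove one rack's heat load with air alone.
# Assumptions (illustrative): 132 kW rack, 15 C air temperature rise,
# air density ~1.2 kg/m^3, specific heat ~1005 J/(kg*K).
RACK_KW = 132.0
DELTA_T = 15.0          # air temperature rise across the rack, in K
AIR_DENSITY = 1.2       # kg/m^3 near sea level, ~20 C
AIR_CP = 1005.0         # J/(kg*K)

mass_flow = RACK_KW * 1000 / (AIR_CP * DELTA_T)   # kg/s of air
volume_flow = mass_flow / AIR_DENSITY             # m^3/s
cfm = volume_flow * 2118.88                       # cubic feet per minute

print(f"{mass_flow:.1f} kg/s -> {volume_flow:.1f} m^3/s (~{cfm:,.0f} CFM)")
# ~8.8 kg/s -> ~7.3 m^3/s: on the order of 15,000 CFM for one rack
```

Fan power rises steeply with flow (roughly with its cube), so moving air at this scale carries a heavy energy penalty of its own.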
For decades, air cooling has been the standard approach in data centres. It remains effective at moderate densities, but AI infrastructure is pushing well beyond those thresholds.
Liquid cooling changes the equation. Rather than cooling the surrounding air, direct-to-chip liquid cooling captures heat at the source. Water carries thousands of times more heat per unit volume than air, so it can remove the same heat load from high-density components with a small fraction of the flow.
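The "thousands of times" figure follows directly from the two fluids' volumetric heat capacities. A quick sketch with textbook room-temperature values (illustrative, not specific to any particular coolant):

```python
# Volumetric heat capacity (density * specific heat): how much heat a
# unit volume of fluid carries per degree of temperature rise.
# Textbook values at ~20 C; real coolant properties will differ somewhat.
water = 998 * 4186      # kg/m^3 * J/(kg*K) ~= 4.18e6 J/(m^3*K)
air = 1.2 * 1005        # ~= 1.2e3 J/(m^3*K)
print(f"Water carries ~{water / air:,.0f}x more heat per unit volume")

# Practical upshot: the 132 kW rack that needed ~7 m^3/s of air can be
# cooled by roughly 3 L/s of water at an assumed 10 C temperature rise.
litres_per_s = 132_000 / (4186 * 10) / 998 * 1000
print(f"~{litres_per_s:.1f} L/s of water for a 132 kW rack at dT = 10 C")
```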
As AI systems scale, this approach is increasingly becoming a necessity rather than an optimisation.
Liquid cooling at scale: the sustainability equation
Cooling is typically the second-largest energy consumer in a data centre after the IT equipment itself. Improving cooling efficiency therefore has a direct impact on total power demand.
In many cases, liquid cooling can reduce cooling energy consumption by 30 to 60%. Those gains translate into lower operating costs and lower emissions, particularly in regions where electricity grids still rely heavily on fossil fuels.
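To put the range in context, here is a back-of-envelope sketch using hypothetical but typical figures: a 10 MW IT load, a PUE of 1.5, cooling taking 70% of the non-IT overhead, and a 45% cooling-energy reduction (the midpoint of the range above).

```python
# Back-of-envelope impact of a 45% cooling-energy reduction.
# All inputs are illustrative assumptions, not measurements.
IT_MW = 10.0
PUE = 1.5                     # total facility power / IT power
COOLING_SHARE = 0.70          # fraction of non-IT overhead spent on cooling
REDUCTION = 0.45              # midpoint of the 30-60% range

overhead_mw = IT_MW * (PUE - 1)            # 5.0 MW of non-IT load
cooling_mw = overhead_mw * COOLING_SHARE   # 3.5 MW on cooling
saved_mw = cooling_mw * REDUCTION          # ~1.6 MW saved
new_pue = (IT_MW + overhead_mw - saved_mw) / IT_MW

print(f"PUE: {PUE} -> {new_pue:.2f}")
print(f"Annual saving: ~{saved_mw * 8760:,.0f} MWh")
# PUE drops from 1.5 to ~1.34; roughly 13,800 MWh saved per year
```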
Water use is another factor that deserves closer attention.
Traditional cooling towers and evaporative systems can consume large volumes of water, particularly in warm climates. By contrast, liquid cooling operates as a closed loop at the rack level, meaning the racks themselves do not consume water directly. The overall water footprint then depends largely on how heat is rejected from the facility.
This is where system design becomes critical.
Operating at higher inlet fluid temperatures can significantly reduce both energy consumption and water use. Heat rejection systems also play a role. Air-cooled chillers with economiser modes, for example, rely on cool ambient air to dissipate heat and can dramatically reduce reliance on water-intensive cooling towers.
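The water impact of that choice can be sketched with equally rough numbers. Evaporative towers are often quoted at somewhere around 1.5 to 2 litres of make-up water per kWh of heat rejected, while a dry economiser rejecting heat to ambient air in a closed loop consumes essentially none. The figures below are illustrative assumptions, not vendor data.

```python
# Rough annual water comparison for rejecting 10 MW of heat continuously.
# 1.8 L/kWh of evaporative make-up water is an assumed mid-range figure.
HEAT_MW = 10.0
HOURS_PER_YEAR = 8760
heat_kwh = HEAT_MW * 1000 * HOURS_PER_YEAR   # kWh of heat rejected per year

evaporative_ml = heat_kwh * 1.8 / 1e6        # megalitres via cooling tower
dry_ml = 0.0                                 # closed-loop dry cooler

print(f"Evaporative tower: ~{evaporative_ml:,.0f} ML/year")
print(f"Dry economiser:    ~{dry_ml:,.0f} ML/year")
# On the order of 158 ML/year saved wherever dry cooling is viable
```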
Component choices matter as well. High-efficiency pumps, durable heat exchangers and coordinated control systems help minimise operational energy while extending equipment lifespan and reducing the embedded carbon associated with manufacturing replacements.
Designing for sustainability: key decisions for AI data centres
Adopting liquid cooling is an important step, but it is not sufficient on its own. The sustainability of AI infrastructure depends on how facilities are designed and operated.
Several factors influence the outcome: operating temperatures, heat-rejection strategy, component selection and, increasingly, what happens to the heat once it leaves the rack.
Liquid cooling makes heat reuse far more practical.
Air-cooled facilities typically release low-grade heat that is difficult to repurpose. Liquid-cooled systems, however, produce higher-grade heat streams that can be reused for district heating or nearby industrial processes. While still emerging in many markets, this approach offers a pathway for data centres to contribute energy back into local ecosystems rather than simply rejecting it.
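A rough sense of scale, under loudly hypothetical assumptions (a 10 MW IT load, 60% of its heat recoverable at useful temperatures, and average heating demand of about 10 MWh per home per year):

```python
# Order-of-magnitude estimate of district-heating potential.
# Every figure here is an illustrative assumption.
IT_MW = 10.0
RECOVERABLE = 0.60        # fraction of heat captured at useful temperature
HOME_MWH_PER_YEAR = 10.0  # assumed average space-heating demand per home

heat_mwh = IT_MW * 8760                # ~87,600 MWh rejected per year
usable_mwh = heat_mwh * RECOVERABLE    # ~52,600 MWh recoverable
print(f"Could offset heating for ~{usable_mwh / HOME_MWH_PER_YEAR:,.0f} homes")
# Roughly 5,000 homes from a single 10 MW liquid-cooled facility
```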
A blueprint for future-proofing AI infrastructure
Moving to liquid cooling requires careful planning and coordination.
Historically, IT hardware decisions and facility design have often occurred separately. In the AI era, those processes must happen in parallel. Otherwise organisations risk installing powerful AI hardware that existing infrastructure cannot properly support.
Flexibility is equally important. Hardware generations are evolving quickly, and facilities must be able to accommodate different density scenarios over time. Hybrid environments combining air and liquid cooling are becoming increasingly common, allowing operators to support current workloads while preparing for future systems.
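One way to picture the flexibility problem: the same hall power budget supports wildly different rack counts depending on which hardware generation arrives. A trivial sketch with an assumed 5 MW hall:

```python
# Rack counts a fixed 5 MW hall budget supports at different densities,
# from legacy enterprise gear to the roadmap figures cited above.
HALL_BUDGET_KW = 5000  # assumed usable IT power for one hall

for density_kw in (10, 40, 132, 240, 1000):
    racks = HALL_BUDGET_KW // density_kw
    print(f"{density_kw:>5} kW/rack -> {racks:>3} racks")
# A floor plan and cooling topology sized for 500 air-cooled racks
# look nothing like one sized for five 1 MW liquid-cooled racks.
```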
Early collaboration across the technology ecosystem is also critical. Data centre operators, server manufacturers, cooling specialists and infrastructure partners all bring different expertise. Engaging these partners early helps avoid costly redesigns and ensures systems operate as intended once deployed.
Bottom line: efficient cooling is mission-critical for the AI era
AI will reshape industries and economies in the years ahead. But its growth will depend just as much on infrastructure as on algorithms.
Cooling may not attract the same attention as chips or AI models, yet it is quickly becoming one of the defining engineering challenges of the AI era.