Is liquid cooling a sustainable, long-term solution for UK data centres?

04 June 2025

Paul Mellon, Operations Director, Stellium Datacenters

The data centre community has spent the last ten years investing in air cooling systems for racks. These were mostly indirect air systems that could provide effective cooling for racks of up to 20kW while delivering a very credible PUE of 1.2. The facilities were typically purpose built for such deployments and did not require a significant external footprint.

The seismic shift to AI/HPC has caused significant supply/demand issues, not just in the UK but across Europe and the wider world. Going forward, data centres will be designed to accommodate AI/HPC training racks of up to 150kW, AI/HPC inference/edge racks of up to 50kW, and HPC cloud racks of up to 50kW. Migrating the UK's existing data centre base to support these rack power densities will create many challenges.

Flexibility and efficiency

The most compelling selling points for liquid cooling are its flexibility and efficiency. Because liquid cooling systems are closed loop, their water usage is negligible. Liquids transfer heat far better than air, meaning servers run cooler and consume less cooling energy than when air is simply blown around.

The result? A lower Power Usage Effectiveness (PUE) and actual savings on electricity bills. Plus, higher-density racks (think GPU-packed clusters) are far easier to manage when you’re circulating liquid directly over the hottest components instead of fighting heat pockets with airflow.
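As a rough illustration of what that means in energy terms, here is a minimal sketch, assuming a 1MW IT load and a move from the air-cooled PUE of 1.2 quoted earlier to an assumed liquid-cooled PUE of around 1.1 (all figures are illustrative, not measurements from any particular facility):

    # Rough sketch: annual energy saved by a PUE improvement.
    # All figures are illustrative assumptions, not measured values.
    IT_LOAD_KW = 1_000              # assumed 1MW IT load
    HOURS_PER_YEAR = 8_760

    def facility_energy_kwh(pue: float) -> float:
        # PUE = total facility energy / IT energy
        return IT_LOAD_KW * HOURS_PER_YEAR * pue

    saving = facility_energy_kwh(1.2) - facility_energy_kwh(1.1)
    print(f"Annual saving: {saving:,.0f} kWh")   # -> Annual saving: 876,000 kWh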

Liquid cooling of racks can be configured to match power densities from 20kW to 200kW and beyond, and it requires the least restructuring of the GPUs and supporting IT technology.

Location, location, location

Depending on its location, a data centre may not require evaporative cooling, which demands significant water usage. South of Birmingham, evaporative cooling will most likely be required, whereas sites in the north of the UK can most likely run without it. From a sustainability perspective there are KPIs around water (WUE) as well as power (PUE).
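WUE (Water Usage Effectiveness) is defined analogously to PUE: annual site water usage divided by annual IT energy. A minimal sketch with illustrative numbers (not measurements from any specific site):

    # WUE = annual water usage (litres) / annual IT energy (kWh).
    # Both figures below are illustrative assumptions.
    annual_water_litres = 5_000_000       # assumed evaporative-assisted site
    it_energy_kwh = 1_000 * 8_760         # assumed 1MW IT load, full year

    wue = annual_water_litres / it_energy_kwh
    print(f"WUE: {wue:.2f} L/kWh")        # -> WUE: 0.57 L/kWh
    # A fully closed-loop system with no evaporative assist trends towards zero.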

There are further carbon considerations in the location of data centres: grid carbon intensity in the Southeast of the UK is the highest at 309g CO2/kWh, compared with 25g CO2/kWh in the Northeast. When location, design, monitoring and operation are implemented correctly, the environmental benefits can be substantial.
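The scale of the Southeast/Northeast gap is worth spelling out. Taking the grid intensities quoted above and an assumed 1MW IT load at a PUE of 1.2 (the load and PUE are assumptions for illustration):

    # Annual operational carbon for the same facility on two grids.
    # Grid intensities are the figures quoted above; the load is assumed.
    annual_kwh = 1_000 * 8_760 * 1.2      # 1MW IT load at PUE 1.2

    southeast_tonnes = annual_kwh * 309 / 1_000_000   # g CO2/kWh -> tonnes
    northeast_tonnes = annual_kwh * 25 / 1_000_000

    print(f"Southeast: {southeast_tonnes:,.0f} t CO2/yr")  # ~3,248 t
    print(f"Northeast: {northeast_tonnes:,.0f} t CO2/yr")  # ~263 t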

The higher-grade ‘waste’ heat from a liquid-cooled system (available at higher return temperatures than air exhaust) can be captured and reused more readily, making district heating projects a real possibility and a potentially lucrative energy-recycling strategy for operators.
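To a first approximation, almost all IT power leaves the hall as heat, so the reuse potential is substantial. A back-of-envelope sketch, assuming a 1MW liquid-cooled load, a 70% capture fraction and a very rough 10,000kWh annual heat demand per UK home (all three figures are assumptions):

    # Back-of-envelope heat-reuse potential. All figures are assumptions.
    it_load_kw = 1_000          # assumed liquid-cooled IT load
    capture_fraction = 0.7      # assumed share recoverable via the liquid loop

    heat_kw = it_load_kw * capture_fraction
    homes = heat_kw * 8_760 / 10_000     # ~10,000 kWh heat per home per year
    print(f"~{heat_kw:.0f} kW recoverable, roughly {homes:.0f} homes' heat")
    # -> ~700 kW recoverable, roughly 613 homes' heat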

The UK has set ambitious goals around cutting carbon emissions, and the Northeast around Newcastle currently has the lowest grid carbon intensity. Data centre operators are under the microscope, and cooling is a big piece of the sustainability puzzle. Because liquid cooling can significantly reduce the power needed for thermal management, it’s becoming increasingly attractive to businesses looking to burnish their green credentials.

There’s also the question of resilience in a warming climate. The UK may be mild compared to some parts of the world, but summertime heatwaves and rising average temperatures mean air-cooled data centres often must crank up the fans to cope. Liquid cooling can be a more stable, predictable solution, regardless of seasonal fluctuations.

Installing liquid-cooling infrastructure can initially seem expensive, especially considering the need for specialised equipment and possible retrofitting at older data centres. However, the operational savings in energy often offset those costs over the long run. As more hardware vendors adopt liquid-ready components, the investment hurdle is slowly decreasing.
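A crude payback sketch makes the point, using the 876,000kWh annual saving from the earlier PUE example together with an assumed capex premium and electricity price (both are assumptions, and real projects will vary widely):

    # Simple payback on the liquid-cooling capex premium. Illustrative only.
    capex_premium_gbp = 1_500_000    # assumed extra cost vs air cooling
    annual_kwh_saved = 876_000       # from the PUE 1.2 -> 1.1 sketch above
    price_gbp_per_kwh = 0.25         # assumed commercial electricity price

    payback_years = capex_premium_gbp / (annual_kwh_saved * price_gbp_per_kwh)
    print(f"Payback: {payback_years:.1f} years")   # -> Payback: 6.8 years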

Future-proofing

Let’s also not forget the invaluable intangible: future-proofing. As workloads become more compute-intensive (AI, HPC), designing a data centre around higher densities will pay off later, both in monetary and sustainability terms.

That said, implementing liquid cooling can be more complex and expensive upfront. It often requires revamped infrastructure, specialised piping and sometimes entirely new server designs, and retrofitting these systems into older data centres is not always a trivial task.

For many UK data centres — especially newer facilities or those dealing with AI, machine learning, and HPC workloads like Stellium in Newcastle — the short answer to our title question is ‘absolutely.’ It’s a robust, energy-efficient, and increasingly cost-effective way to tackle current and future thermal challenges.