Data Center Cooling: The Unexpected Challenge to AI
Artificial intelligence (AI) will only ever be as good as the infrastructure that supports it. In particular, its growth depends on a massive expansion of data center capacity.
Data centers are the warehouses of our digital world, holding the data that makes up the cloud. According to recent statistics, the cloud will hold half of all data in existence by next year – around 100 zettabytes. One zettabyte is a trillion gigabytes, so 100 zettabytes is roughly the combined storage capacity of 100 billion top-end, 1-terabyte iPhone 15 Pros.
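To sanity-check that comparison, here is a quick back-of-envelope calculation. It assumes the 1-terabyte iPhone 15 Pro Max as the "top-end" model; the figures are purely illustrative.

```python
# Back-of-envelope check of the storage comparison above (illustrative only).
ZETTABYTE_GB = 1_000_000_000_000   # 1 zettabyte = 1 trillion gigabytes
IPHONE_TOP_GB = 1_000              # assumed top-end iPhone 15 Pro Max: 1 TB

cloud_data_zb = 100                # projected cloud data, in zettabytes
cloud_data_gb = cloud_data_zb * ZETTABYTE_GB

iphones_needed = cloud_data_gb / IPHONE_TOP_GB
print(f"{iphones_needed:,.0f} iPhones")   # 100,000,000,000 – about 100 billion
```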
But far from being virtual, these are real physical buildings with very real challenges, especially when it comes to keeping cool.
Servers in data centers generate lots of heat that must be vented or cooled. If the servers get too hot or humid, they can shut down or suffer damage.
But a recent survey by the data center industry body AFCOM found that just 46% of its members’ facilities had cooling systems that met all their needs. Worse still, more than a third said they persistently run out of cooling capacity.
If data centers are struggling to stay cool today, how will their servers cope with new computer chips essential to AI that are predicted to generate between 5 and 10 times as much heat?
AI turns up the heat
AI is already having a major influence on data center design and infrastructure, according to AFCOM’s 2024 State of the Data Center Report. It cites the likes of Meta, Google and Amazon as pushing for ever-increasing levels of capacity to support AI workloads.
This is most clearly illustrated by the acceleration in data centers’ rack density – the amount of power drawn, and heat generated, by the servers packed into a single rack. Average rack density is expected to jump from 8.5 kW per rack in 2023 to 12 kW per rack in 2024, and 55% of respondents to AFCOM’s survey expect it to rise further over the next 12-36 months.
Greater rack density means more computing capacity in the same floor space, but it also means higher energy use and more heat. Data centers operate optimally between 21 and 24 degrees Celsius, so any increase in rack density must be matched by improvements in cooling capability.
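To put those figures in perspective, here is a simple illustrative calculation. It uses only the rack-density numbers quoted above and the rule of thumb that essentially all power drawn by a rack ends up as heat the cooling system must remove.

```python
# Illustrative heat-load arithmetic based on the AFCOM figures quoted above.
density_2023_kw = 8.5    # average rack density, 2023
density_2024_kw = 12.0   # expected average rack density, 2024

# Virtually every kilowatt a rack draws is eventually rejected as heat.
increase = (density_2024_kw - density_2023_kw) / density_2023_kw
print(f"Extra heat to remove per rack: {increase:.0%}")   # ~41% in a single year

# On top of that, the AI-focused chips mentioned earlier are projected to
# generate 5-10 times as much heat, compounding the cooling challenge.
```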
Traditional air cooling remains the most favored option for those increasing the rack density of their data centers, but 40% said they were looking at newer liquid cooling options – even though only 17% of respondents currently use them.
Liquid cooling to the rescue?
Advanced direct-to-chip liquid cooling is the most widely considered option. In the survey, 48% of respondents said they are investing in a “single-phase” method, in which heat is transferred to water piped through cold plates attached to the chips.
A more efficient but more expensive “two-phase” method uses a dielectric fluid that is allowed to evaporate, carrying away more heat; it is being considered by 15% of respondents.
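A rough way to see why the two-phase approach can carry away more heat per kilogram of coolant is to compare sensible heat (warming a liquid) with latent heat (evaporating it). The property values below are approximate, generic figures for water and an engineered dielectric fluid, chosen purely for illustration; they do not come from the AFCOM survey or any specific product.

```python
# Sensible vs latent heat: why evaporation moves more heat per kilogram of
# coolant. Property values are approximate and for illustration only.

# Single-phase (water through a cold plate): heat absorbed by warming the
# water, Q = c_p * dT per kilogram.
cp_water = 4.18          # kJ/(kg*K), specific heat of water
delta_t = 10.0           # K, assumed temperature rise across the cold plate
q_single_phase = cp_water * delta_t        # ~42 kJ per kg of water

# Two-phase (dielectric fluid): heat absorbed by evaporation,
# Q = h_fg (latent heat of vaporization) per kilogram.
h_fg_dielectric = 90.0   # kJ/kg, typical order of magnitude for such fluids
q_two_phase = h_fg_dielectric              # ~90 kJ per kg of fluid

print(f"Single-phase (water, 10 K rise):      ~{q_single_phase:.0f} kJ/kg")
print(f"Two-phase (dielectric, evaporation):  ~{q_two_phase:.0f} kJ/kg")
```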
Data center cooling specialist ZutaCore has developed a direct-to-chip system called HyperCool that Mitsubishi Heavy Industries (MHI) will incorporate into its own data center equipment.
MHI has also explored newer immersion cooling techniques, in which electrical components are submerged in a bath of dielectric fluid. In tests at a data center owned by Japanese telecom firm KDDI, MHI’s immersion system helped reduce energy consumption by 94%.
A combined 60% of respondents to the AFCOM survey said they were planning to use immersion techniques.
Liquid-based innovations are more expensive, but they enable data center operators to absorb more waste heat and use less energy than air-cooled systems. That also makes them more sustainable, an important consideration at a time when operators are under pressure to reduce data centers’ carbon footprints.
Making data centers sustainable
The rising use of AI will be a key driver behind an expected sixfold increase in data center capacity over the next three years. Data centers already account for about 0.6% of global greenhouse gas emissions and 1% of all electricity consumption, according to the International Energy Agency. Cooling systems account for about a third of that energy use, according to P&S Intelligence.
As rack densities increase and energy demands rise, innovations like liquid cooling solutions are expected to be a key part of operators’ efforts to improve their power usage efficiency and carbon-reduction strategies. Other efficiency innovations include the conversion of waste heat into electricity to power data-center processes.
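Operators commonly track that efficiency through Power Usage Effectiveness (PUE): total facility energy divided by the energy delivered to the IT equipment itself. The sketch below uses hypothetical numbers simply to show how more efficient cooling moves the metric; it is not data from the AFCOM report.

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every kilowatt-hour goes to computing; cooling and
# other overheads push the figure higher. All numbers below are hypothetical.

def pue(it_kwh: float, cooling_kwh: float, other_overhead_kwh: float) -> float:
    total = it_kwh + cooling_kwh + other_overhead_kwh
    return total / it_kwh

# Air-cooled facility: cooling makes up a large share of the overhead.
print(f"Air-cooled:    PUE = {pue(1000, 500, 100):.2f}")   # 1.60

# Same IT load with more efficient liquid cooling (assumed figures).
print(f"Liquid-cooled: PUE = {pue(1000, 150, 100):.2f}")   # 1.25
```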
Another crucial factor in enhancing the sustainability of data centers is the decarbonization of their power sources. Almost three-quarters of respondents to the AFCOM survey said they are considering renewable energy generation. On-site rooftop solar panels and wind turbines are commonplace at smaller data centers, and growing numbers are considering nuclear power as small, localized reactors become available, the survey found.
While larger centers use engines or turbines that run on gas to provide their energy, a new generation of low-carbon power sources is being considered, including hydrogen-powered generators.
Data centers of the future may also harness nature for their cooling and power needs.
For instance, MHI has been working with Keppel Data Centres to build a hydrogen-ready power plant to supply energy to the company’s Floating Data Centre Park in Singapore. The experimental project is a scalable facility that uses seawater to help maintain internal temperatures, a technique that increases cooling efficiency by as much as 80%.
All these infrastructure innovations are vital if data centers are to successfully support the expansion of AI in a sustainable way. Without them, what looks like a limitless technology could easily find its progress hampered by the real, physical systems upon which it depends.
To learn more about MHI’s Data Center System Vision, click here.