Direct liquid cooling is coming to a data center near you

·        Direct liquid cooling (DLC) targets specific components within a server rather than the entire system

·        Dell and others are working to bring more DLC-compatible servers to market as demand increases

·        Concerns include a lack of industry standards, the need to train staff and the potential for leaks

Direct liquid cooling is poised to make a splash in the data center market, with Dell and others pushing to bring more compatible servers to market in the near term. But how exactly does the technology work and what might be holding it back?

David Schmidt, senior director of PowerEdge Product Management at Dell Technologies, told Silverlinings Dell has offered products compatible with direct liquid cooling since 2017. But there just hasn’t been much need for it previously.

“Until now, the number of applications where customers were required to consider DLC has been small,” Schmidt said. “However, with CPUs reaching 350 W to 400 W, GPUs growing to 750 W and beyond, and other devices, like memory also increasing in power consumption, we are seeing a notable increase in demand for DLC.”

The proliferation of high-power (and thus high heat) GPUs in particular is being driven by the rise of artificial intelligence (AI) workloads which require more processing muscle. Soon, Schmidt said, direct liquid cooling won’t be niche anymore.

“We anticipate that DLC enabled servers will become a significant percentage of Dell’s portfolio over the next few generations,” he stated.

How it works

Direct liquid cooling (DLC) is also known as direct-to-chip cooling. It is one of a few kinds of liquid cooling data center operators are looking to in order to meet their changing needs. The other two are rear-door heat exchanger systems and immersion tanks.

As the name implies, direct liquid cooling involves using liquid to cool specific elements – such as a CPU or GPU – within a server rather than the entire box. This is commonly done by placing what is known as a cold plate directly on the element in question and piping liquid through that plate to carry the heat away. Once the liquid absorbs the heat from the chip, it flows back to a Cooling Distribution Unit (CDU), where it is cooled and recirculated.
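
To put that loop in rough numbers, here is a back-of-envelope sketch (in Python; the wattage and temperature-rise figures are illustrative assumptions, not from the article) of how much coolant flow a cold plate needs, using the standard sensible-heat relation Q = ṁ·c·ΔT:

```python
# Back-of-envelope: coolant flow needed to carry away a chip's heat.
# Q = m_dot * c_p * delta_T  =>  m_dot = Q / (c_p * delta_T)

WATER_CP = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_lpm(chip_watts: float, delta_t_kelvin: float,
                      cp: float = WATER_CP, density: float = 1000.0) -> float:
    """Liters per minute of coolant needed to absorb chip_watts
    with a delta_t_kelvin temperature rise across the cold plate."""
    mass_flow = chip_watts / (cp * delta_t_kelvin)   # kg/s
    return mass_flow / density * 1000.0 * 60.0       # kg/s -> L/min

# Illustrative case: a 750 W GPU with a 10 K coolant rise
flow = required_flow_lpm(750, 10)  # roughly 1.1 L/min
```

The takeaway: thanks to water's high heat capacity, even a 750 W GPU needs only on the order of a liter of coolant per minute at a modest temperature rise.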

Nigel Gore, senior director of Liquid Cooling at vendor Vertiv, told Silverlinings it’s key for the cold plate to make as much direct contact as possible with the element being cooled to ensure maximum heat transfer. When configured correctly, a DLC system can remove up to 85% of the heat from a server – which means some air cooling is still necessary.
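
That 85% figure implies a simple heat budget for facility planners. A minimal sketch (the 30 kW rack load is a hypothetical number for illustration):

```python
# Heat budget for a DLC rack: liquid handles up to ~85% of the load
# (per Vertiv's figure above); air cooling must absorb the remainder.

def heat_split_watts(rack_watts: float, liquid_fraction: float = 0.85):
    """Return (liquid_load, air_load) in watts for a DLC rack."""
    liquid = rack_watts * liquid_fraction
    return liquid, rack_watts - liquid

# Hypothetical 30 kW AI rack: ~25.5 kW to the liquid loop, ~4.5 kW to air
liquid_w, air_w = heat_split_watts(30_000)
```

Even with DLC doing the heavy lifting, that residual air load is why operators can’t simply remove their air handling.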

As with other forms of liquid cooling, there are two main types of direct cooling: single-phase and two-phase. In single-phase, the liquid remains a liquid, whereas in two-phase it boils to become a vapor before being cooled back down to return to its liquid state.

The former usually uses some kind of water-based mixture, with glycol a common ingredient to both prevent freezing and inhibit any biological growth that might occur. The latter point matters given how small the passages are: Gore noted the channels that funnel the liquid across the chips are about 100 microns wide. That’s about the diameter of a human hair or the thickness of a sheet of paper.

Interestingly, the liquid being used isn’t as cold as you might think. In fact, Gore said Vertiv is seeing a trend where the supply temperature to the rack is being specified at 32 degrees Celsius – which is nearly 90 degrees Fahrenheit. That’s at the low end, too. Historically, the inlet temperature was closer to 45 degrees Celsius, or roughly 113 degrees Fahrenheit. But that’s nothing compared to the 100 degrees Celsius (212 F) that the chip itself can reach in just a few seconds of operation.
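
For reference, the Celsius figures above convert to Fahrenheit with the usual formula; a quick sketch to check them:

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# Temperatures mentioned above:
#   32 C (modern rack supply spec)  -> 89.6 F
#   45 C (historical inlet)         -> 113 F
#  100 C (chip surface)             -> 212 F
```
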

Beyond Dell, companies pushing support for DLC include Intel, NVIDIA and HPE. Cooling vendors aside from Vertiv include CoolIT Systems, Boyd and Motivair.

According to Schmidt, “the single biggest obstacle to adoption is the lack of industry-accepted standards for all of the various components of DLC systems…Without standards, DLC is more expensive than it needs to be and can be complex and inconvenient for customers to deploy.”

He said Dell is working through this problem as part of both the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) and the Open Compute Project. But Gore also noted there just aren’t that many servers on the market today that support liquid cooling, though that is rapidly changing.

“As a number of servers become available with liquid cooling SKUs,” adoption will increase, he said.

Both Schmidt and Gore also pointed to difficulties in retrofitting and the need for high-reliability systems as factors that have so far given data center operators pause. Gore added that there are maintenance and operational considerations.

“There is some learning that needs to be done by the data center design teams to design the facility to be able to support liquid cooling systems, and then the operation teams need to learn to operate those liquid cooling systems in addition to the traditional air cooling systems,” he said. That includes implementing maintenance schedules to ensure proper upkeep of the liquid chemistry, Gore said.

Finally, Gore and Schmidt acknowledged the industry needs to resolve concerns about leaks. Schmidt said Dell is addressing the issue by developing a leak detection system capable of raising an alert or shutting down a server to protect the equipment.
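
The article doesn’t describe how Dell’s leak detection works internally, but the alert-or-shutdown behavior it mentions can be sketched as a simple two-threshold policy. Everything below – the thresholds, the normalized sensor scale and the function names – is a hypothetical illustration, not Dell’s actual design.

```python
# Hypothetical sketch of a leak-response policy: raise an alert on a
# minor sensor reading, shut the server down on a severe one.
# Thresholds and sensor interface are illustrative assumptions.
from dataclasses import dataclass

ALERT_THRESHOLD = 0.2     # normalized moisture reading that triggers an alert
SHUTDOWN_THRESHOLD = 0.8  # reading that triggers a protective shutdown

@dataclass
class LeakResponse:
    alert: bool
    shutdown: bool

def respond_to_reading(moisture: float) -> LeakResponse:
    """Map a normalized leak-sensor reading (0.0–1.0) to an action."""
    return LeakResponse(
        alert=moisture >= ALERT_THRESHOLD,
        shutdown=moisture >= SHUTDOWN_THRESHOLD,
    )
```

The two-tier design reflects the trade-off the article describes: a small drip warrants a warning, while a serious leak justifies sacrificing uptime to protect the hardware.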

Other uses

One final note: Helen Xenos, senior director of portfolio marketing at Ciena, told Silverlinings direct liquid cooling can be used for more than just the chips in servers. She noted the vendor is experimenting with using the technology to cool optical components as well, including its WaveRouter, WaveServer and pluggable modules.

She added the company is in discussions with about a dozen customers about using the technology, with at least one interested in taking the technology for a test drive via a pilot program next year.

In case you’re wondering what the heck optical technology has to do with data centers, check out this story.
