A 'fluid' way to cool high-density data centres
By Gabey Goh November 17, 2014
- New immersion cooling technology drastically cuts energy and space costs
- Key to the technology are new techniques and new engineered fluids from 3M
IT was an experiment born out of necessity, when solutions provider Allied Control was tasked with building a high-density data centre in Hong Kong, known for its extremely high real estate costs.
Due to the hot and humid summer climate, the team also had to deal with average annual industry power usage effectiveness (or PUE, a measure of how efficiently a data centre uses energy) of 2.2, according to a study by Hong Kong’s biggest electricity provider, China Light and Power.
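To put that baseline in perspective, PUE is total facility power divided by IT equipment power, so a PUE of 2.2 means 1.2W of overhead for every watt that actually reaches the servers. A minimal sketch (the IT load below is an assumed figure, not from the article):

```python
# Illustrative sketch: what a PUE of 2.2 means for a hypothetical facility.
# PUE = total facility power / IT equipment power.

it_load_kw = 1_000.0   # assumed 1MW IT load (not a figure from the article)
pue = 2.2              # the Hong Kong industry average cited above

total_kw = it_load_kw * pue
overhead_kw = total_kw - it_load_kw  # cooling and other non-IT power

print(f"Total draw: {total_kw:.0f}kW, overhead: {overhead_kw:.0f}kW")
# -> Total draw: 2200kW, overhead: 1200kW
```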
In order to decrease operational expenses, the team considered the latest available data centre cooling methods for high performance computing (HPC) applications, but were still not satisfied with the results.
“Furthermore, studies claim that around 2,000 cubic feet of air are required to adequately cool a single high-density server rack of 20kW,” said Kar-Wing Lau (pic), vice president of operations for Allied Control Limited.
"That’s equivalent to a staggering 56km/h of wind speed when forced through a one-foot wide air duct, which is similar to placing your hand out of the car while driving.
"It was not feasible for us to waste that much energy and costs just to provide the necessary cooling. We also found that no single vendor was offering what we needed,” he told Digital News Asia (DNA) via email.
The team found itself with “no other choice” but to experiment with 'evaporative two-phase immersion cooling technology,' by fully immersing electronics into dielectric and environmentally sustainable 3M Novec fluids.
“No one has ever been known to use this technology in a large-scale data centre implementation before,” Lau said.
3M Novec Engineered Fluids are a family of proprietary non-flammable fluids used in a number of industrial applications, including precision cleaning and heat transfer.
“It was really an immense challenge, but we set up our 500kW immersion-cooled data centre in only six months from conceptual design to completion. This included securing even the data centre site and the entire supply chain.
“To make this possible, we had to temporarily upsize the project team to a total of 25 employees. A typical data centre of this size often takes at least double the time to be set up,” Lau claimed.
Using the two-phase immersion cooling technique, the team claimed savings of 99% in cooling electricity and 87% in space compared with traditional air cooling options.
The team’s efforts won them recognition in the form of the “Future Thinking and Design Concepts” award during Singapore Datacenter Week, which was held Oct 13 – 17.
How it works
Lau said that efficient cooling is about navigating intelligently within the physical limits of heat transfer. Liquid cooling, like water-cooling, is much more efficient than air-cooling and allows for higher densities at lower costs.
“But since water and electricity don’t mix well, a physical separation via custom-made metal cold plates containing an intricate system of small copper and plastic tubes needs to be implemented, which quickly becomes outdated with every additional chip and socket generation due to changing form factors,” he said.
Oil-cooling methods, meanwhile, are messy and leave hardware greasy, and the heat transfer coefficients achievable by pumping liquid through the small areas around chips are also limited, he added.
With two-phase immersion cooling, the heat of the completely immersed electronic components causes the fluid to boil at relatively low temperatures of 34° to 56° Celsius, depending on which fluid is used.
The heat is carried away by vapour in the bubbles that rise through the fluid surface and up to a second cooling loop: a larger condensing coil carrying ordinary facility water kept below the fluid’s boiling temperature. There the vapour condenses, transferring its heat to the facility water.
The condensed fluid then falls back, cooler, into the tank containing the electronics. Because the boiling itself is turbulent, no pumps are needed in the first cooling loop, and the fluid reaches even the smallest areas.
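The scale of this vapour transport can be sketched with a back-of-the-envelope calculation; the latent heat figure below is an assumed order of magnitude for dielectric engineered fluids of this class, not a number from the article:

```python
# Back-of-the-envelope sketch: two-phase cooling carries heat away as
# latent heat of vaporisation, so the required vapour mass flow is Q / h_fg.

heat_load_w = 500_000.0       # the 500kW facility described in the article
latent_heat_j_kg = 100_000.0  # assumed ~100 kJ/kg (typical order of
                              # magnitude for engineered dielectric fluids)

vapour_kg_s = heat_load_w / latent_heat_j_kg
print(f"Fluid boiled off: {vapour_kg_s:.1f} kg/s")
# -> Fluid boiled off: 5.0 kg/s
```

All of that fluid condenses on the coil and falls back into the tank, so the loop is closed with no pump at all.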
According to Lau, the advantages of this technology are multi-fold. Firstly, the facility water doesn’t have to be as cold as the chilled water supplying a Computer Room Air Conditioner (CRAC), which is usually at 7° Celsius.
This allows facility water to be cooled with very little energy, even in hot and humid climates.
Secondly, because heat transfer is efficient even at chip level, and because centralised, larger condensing coils replace the many small copper tubes of conventional water-cooling systems, the system requires only a relatively low water flow, which saves a lot of pump energy.
“This, in combination with the first point, leads to cooling electricity savings of 99% and a PUE of 1.01,” said Lau.
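Those two figures are consistent with each other: since PUE is total facility power over IT power, the non-IT overhead per watt of IT load is simply PUE minus 1, and moving from the 2.2 industry average to 1.01 cuts that overhead by roughly 99%. A minimal sketch of the arithmetic:

```python
# Sketch: checking that a PUE of 1.01 versus the 2.2 industry average
# implies roughly the claimed 99% cut in cooling/overhead electricity.

def overhead_per_it_watt(pue: float) -> float:
    # PUE = total facility power / IT power, so overhead per IT watt is PUE - 1
    return pue - 1.0

baseline = overhead_per_it_watt(2.2)    # 1.2W overhead per IT watt
immersion = overhead_per_it_watt(1.01)  # 0.01W overhead per IT watt

savings = 1.0 - immersion / baseline
print(f"Overhead reduction: {savings:.0%}")
# -> Overhead reduction: 99%
```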
Lastly, as all electronic components are cooled uniformly and from all sides, no heat sinks or water cold plates are required.
“By getting rid of all bulky heat sinks and being able to stack mainboards only a few millimetres apart, we can achieve unprecedented densities and thereby save a lot of space,” he added.
Next page: Addressing concerns and what's next