Data Center Technology

Data centers are in the news again. This time it is Microsoft's data center, and it may be an even bigger deal than when researchers submerged a submarine-like data center in the sea.

The rack is submerged in a specially engineered fluid at Microsoft's data center in Quincy, Washington. It serves a similar purpose to Project Natick, the air-tight, computer-filled capsule placed on the ocean floor off the coast of the Orkney Islands, Scotland, and like Natick it is running production workloads.

The production software runs on a handful of servers placed inside a special vessel containing a low-boiling-point fluid. The Quincy deployment is meant to answer basic questions about the approach before it is rolled out at a larger scale to test reliability.

Basic functionality and operability are the two aspects to be tested, according to Christian Belady, Vice President of Microsoft's Data Center Advanced Development Group. His team is trying to answer questions like "Is server performance affected by immersion cooling?" and "How difficult is it for data center engineers to work on servers that are submerged in liquid?".

This is a much smaller deployment than the latest Natick experiment; it is just a single rack. Yet nothing less than the future trajectory of computing is at stake. Chipmakers can no longer double processor performance every two years simply by cramming ever-smaller transistors onto the same-size silicon; further gains now come with significant increases in power consumption. If the benefits of Moore's Law can instead be preserved by packing more processors into a single data center, the problem can be worked around, and that is what Belady's team is working on. A rough illustration of what that means for the cooling system follows below.
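To see why cooling becomes the bottleneck, here is a back-of-the-envelope sketch in Python. The server counts and wattages are illustrative assumptions, not figures from Microsoft or the source article:

```python
# Illustrative arithmetic only: all numbers below are assumptions made for
# the sake of the example, not figures from Microsoft or the source article.

servers_per_rack = 40        # assumed server count in a typical air-cooled rack
watts_per_server = 500       # assumed average power draw per server

air_cooled_rack_kw = servers_per_rack * watts_per_server / 1000

# If per-chip performance stops doubling, one way to keep doubling the compute
# delivered per data center footprint is to double the number of processors
# packed into the same rack space, which also doubles the heat the cooling
# system must remove from that footprint.
dense_rack_kw = air_cooled_rack_kw * 2

print(f"Assumed air-cooled rack load: {air_cooled_rack_kw:.0f} kW")
print(f"Doubled-density rack load:    {dense_rack_kw:.0f} kW")
```

Under these assumed numbers, the denser rack ends up in a power range that air cooling struggles to handle, which is where immersion cooling comes in.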

The question his team is asking is how to keep riding Moore's Law at the level of the data center footprint. Google followed a similar path a few years ago with its liquid-cooled AI hardware, but the similarity only goes so far, because the purpose is different. Microsoft is looking for a way to keep increasing the capacity of its data centers so they can process any workload; it is not just looking at a liquid-cooled subset of its most powerful computers that process the heaviest workloads.

Belady explained that they no longer have the luxury of counting on the chip alone for performance improvements.

Microsoft is using one of several types of liquid cooling systems available for deployment: two-phase immersion cooling. The synthetic fluid is engineered to boil at a low temperature (about 122°F, or 50°C), so it turns to vapor when it comes in contact with the warm processors. The rising vapor carries the heat away and condenses on the tank's lid, and the resulting droplets rain back down into the tank, repeating the cycle.
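As a rough illustration of the energy balance behind this cycle, here is a minimal Python sketch. The rack power, latent heat of vaporization, and fluid density are assumed values loosely in the range of commercial engineered fluids, not Microsoft's actual specifications:

```python
# Back-of-the-envelope energy balance for two-phase immersion cooling.
# All values are assumptions chosen to be roughly plausible for an
# engineered fluid boiling near 50 C (122 F); none come from Microsoft.

rack_heat_load_w = 25_000        # assumed rack power draw: 25 kW
latent_heat_j_per_kg = 100_000   # assumed heat of vaporization: ~100 kJ/kg
fluid_density_kg_per_l = 1.6     # assumed liquid density

# In steady state, the heat from the processors boils fluid at this rate,
# and the same mass condenses on the tank's lid and rains back down, so
# the liquid level stays constant while the heat leaves through the lid.
boil_off_kg_per_s = rack_heat_load_w / latent_heat_j_per_kg
boil_off_l_per_min = boil_off_kg_per_s / fluid_density_kg_per_l * 60

print(f"Fluid vaporized: {boil_off_kg_per_s:.2f} kg/s "
      f"(about {boil_off_l_per_min:.0f} liters of liquid per minute)")
```

The point of the sketch is simply that every watt of server heat has to be carried away by boiling fluid, so the heat load of the rack fixes how fast the boil-and-condense loop must turn over.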

Microsoft has not yet committed to this particular liquid cooling technology for deployment at scale. Belady and his team want to keep evaluating alternative approaches to cooling servers as they look ahead to trends in processor design.

Source – https://www.datacenterknowledge.com/power-and-cooling/microsoft-s-tiny-data-center-liquid-cooling-experiment-big-deal

Author
Shweta
Senior Content Writer