Microsoft contributed $25 million to Project Natick in 2020. The goal was to conserve energy while providing coastal communities with low-latency cloud services. To that end, Microsoft has tested the approach by submerging sealed data centers in the ocean near coastal cities. Data travels a shorter distance in this arrangement, which makes web browsing, video streaming, and online gaming faster and smoother. It’s a smart plan considering that more than half of the world’s population lives within 120 miles of a shore.
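The latency benefit of proximity is easy to estimate: a signal in optical fiber travels at roughly two-thirds the speed of light, so every kilometer of one-way distance adds about 10 microseconds of round-trip time. Here is a minimal sketch of that arithmetic; the distances compared are illustrative assumptions, not figures from Microsoft.

```python
# Rough round-trip propagation delay over optical fiber.
# Assumes a signal speed of ~2e8 m/s (about 2/3 of c in vacuum) and
# ignores routing, queuing, and server processing delays.
FIBER_SPEED_M_PER_S = 2e8

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a one-way distance."""
    return 2 * (distance_km * 1000) / FIBER_SPEED_M_PER_S * 1000

# Hypothetical comparison: a nearby subsea data center vs. a distant inland one.
for label, km in [("offshore, 120 km away", 120), ("inland, 2000 km away", 2000)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Even in this idealized model, moving a data center from 2,000 km inland to 120 km offshore cuts propagation delay from roughly 20 ms to about 1 ms per round trip, which matters for interactive workloads like gaming.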
Additional information about the underwater data center
The idea was first proposed in 2014 during Microsoft’s ThinkWeek, an event where employees exchange unconventional ideas. The core concept was to cool an underwater data center with the consistently cold water below the sea surface. Microsoft also fitted the data center with heat-exchange tubes to create an energy-efficient design, and it evaluated and monitored the reliability and performance of the servers while they ran underwater. The main drawback of undersea data centers is that they cannot receive routine maintenance.
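The capacity of seawater cooling can be sketched with the basic heat-exchanger relation Q = ṁ · c_p · ΔT: the heat removed equals the seawater mass flow rate times its specific heat times the temperature rise across the tubes. All the numbers below are illustrative assumptions, not Project Natick specifications.

```python
# Back-of-the-envelope heat-exchanger capacity for seawater cooling,
# using Q = m_dot * c_p * delta_T. Numbers are illustrative only.
CP_SEAWATER = 3990.0  # J/(kg*K), approximate specific heat of seawater

def cooling_power_kw(flow_kg_per_s: float, delta_t_k: float) -> float:
    """Heat removed (kW) by seawater flowing through heat-exchange tubes."""
    return flow_kg_per_s * CP_SEAWATER * delta_t_k / 1000

# Example: 12 kg/s of seawater warming by 5 K absorbs about 239 kW,
# a rack-scale IT load in this toy model.
print(f"{cooling_power_kw(12, 5):.0f} kW")
```

The design appeal is that the cold side of this exchange is effectively free and unlimited, whereas a land-based data center must run chillers to produce it.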
Undersea data centers also get free cooling from the ocean and don’t require expensive real estate. In Project Natick, the server failure rate was one-eighth that of a conventional server in a land-based data center staffed by humans. Microsoft believes the dry nitrogen atmosphere sealed inside the vessel before deployment played a role.
Traditional data centers are kept humid and filled with oxygen for their human operators, both of which promote chemical deterioration of components. Lower failure rates may also stem from the absence of human technicians, who inevitably make occasional errors.
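The one-eighth ratio translates directly into expected annual failures. A toy calculation follows: the 1/8 ratio is from the article, and the Northern Isles vessel is reported to have held 864 servers, but the 5% baseline annualized land failure rate is a hypothetical illustration.

```python
# Expected annual server failures, land vs. subsea, using Natick's 1/8 ratio.
# The 5% annualized land-based failure rate is a hypothetical illustration.
LAND_ANNUAL_FAILURE_RATE = 0.05
NATICK_RATIO = 1 / 8   # subsea vs. land failure rate (reported by Microsoft)

servers = 864          # server count of the Northern Isles vessel

land_failures = servers * LAND_ANNUAL_FAILURE_RATE
subsea_failures = servers * LAND_ANNUAL_FAILURE_RATE * NATICK_RATIO
print(f"Expected failures/year on land:  {land_failures:.1f}")
print(f"Expected failures/year subsea:   {subsea_failures:.1f}")
```

Under these assumptions, a sealed vessel would expect about 5 server failures per year instead of about 43, which is what makes a maintenance-free, lights-out deployment plausible.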
Microsoft plans to build a second subsea data center for the project in the North Sea. Within a few years, the majority of data centers may adopt a similar operating model. Share your thoughts on these undersea data centers in the comments below.