Microsoft’s Underwater Datacenter Redeploys

Most people would agree that seawater and computers don't go well together, but Microsoft might say otherwise. The tech giant recently announced a redeployment of its underwater datacenter concept, known as Project Natick.

Project Natick Continues

This is the second time Microsoft has submerged a datacenter in the ocean; the first was in 2015 off Southern California. This time is different, however. The datacenter is larger than the 2015 pilot, and the deployment will be longer: five years instead of five months. The general design is the same, though: racks of servers sealed inside a tube, with a cable running to shore.

For this second deployment of Project Natick, Microsoft has partnered with the European Marine Energy Centre (EMEC) in Orkney, Scotland. EMEC is a marine energy testing center that hosts a variety of wave, tidal, and ocean-current devices at its pre-permitted offshore berths. Marine energy developers can use these sites to test their technologies while also supplying power to the grid. These marine energy devices, combined with local wind energy, provide more than enough power for Orkney's grid, including Microsoft's 250 kW underwater datacenter.


Microsoft’s Project Natick being prepared for deployment off Orkney.

Cool Data

Datacenters produce a lot of heat, and if that heat is not removed, efficiency drops and hardware failures become more likely. Keeping servers cool is not easy, and developers have long searched for more effective methods. Energy overhead is captured by a metric called power usage effectiveness (PUE): the ratio of total facility energy to the energy consumed by the IT equipment alone. PUE has been decreasing steadily at Google's datacenters (and likely at competitors like Microsoft as well); Google's fleet-wide PUE has dropped from a high of 1.23 in 2008 to 1.10 today. In layman's terms, for every watt used to run the servers, an extra one-tenth of a watt goes to auxiliary systems, up to 70% of which is cooling. That may not sound like much, but large datacenters draw a great deal of power, hundreds of megawatts for some of the larger facilities, and that one-tenth of a watt adds up.
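To put numbers on that, here's a quick back-of-the-envelope sketch in Python using the figures above. The 100 MW facility size is a stand-in for "one of the larger facilities," not a figure reported by Google or Microsoft.

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.10 means 0.10 W of overhead per watt of IT load.

def overhead_power_mw(it_load_mw: float, pue: float) -> float:
    """Power drawn by non-IT systems (cooling, power delivery, lighting)."""
    return it_load_mw * (pue - 1.0)

it_load_mw = 100.0   # assumed large facility: 100 MW of IT load (illustrative)
pue = 1.10           # Google's reported fleet-wide PUE

overhead = overhead_power_mw(it_load_mw, pue)   # ~10 MW of overhead
cooling = 0.70 * overhead                       # up to ~70% of that is cooling

print(f"Overhead: {overhead:.1f} MW, of which up to {cooling:.1f} MW may go to cooling")
```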

Google's PUE has leveled off since 2013, with only modest improvements over the last five years. This could indicate that its facilities are approaching the practical limit of efficiency. In other words, if developers want to further reduce energy overhead, they will need to get creative. That brings us to why Microsoft thought it would be a good idea to put a datacenter in the ocean. Water has a much higher thermal conductivity than air, which makes the ocean a great heat sink, and best of all, it's free. An underwater datacenter could therefore have a very low energy overhead.
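To see why, here's a rough conduction-only comparison using Fourier's law with standard room-temperature property values. Real datacenter cooling also depends heavily on convection, so treat this as an intuition pump, not a thermal model of Natick.

```python
# Fourier's law for steady-state conduction: flux q = k * dT / d
# k values are textbook properties at ~20 C, not Project Natick data.

K_WATER = 0.60   # W/(m*K), thermal conductivity of water
K_AIR   = 0.026  # W/(m*K), thermal conductivity of air

def heat_flux(k: float, delta_t_k: float, thickness_m: float) -> float:
    """Conductive heat flux in W/m^2 across a layer of given thickness."""
    return k * delta_t_k / thickness_m

dT = 10.0   # K temperature difference across the layer (assumed)
d = 0.01    # m, assumed 1 cm layer (illustrative)

print(f"Water: {heat_flux(K_WATER, dT, d):.0f} W/m^2")
print(f"Air:   {heat_flux(K_AIR, dT, d):.0f} W/m^2")
# Water moves roughly 23x more heat under identical conditions.
```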

Project Natick about to enter the water.

There are two other reasons for submerging a datacenter. One has to do with development time. According to one of the lead engineers on the 2015 pilot of Project Natick, a commercial system could be deployed in approximately three months, a significant improvement over the two to three years it takes to develop a traditional land-based datacenter. That could be extremely useful for disaster-hit areas that need to get reconnected in a hurry. The fact that Microsoft's underwater datacenter design fits within a standard shipping container helps too.

The other reason relates to location and latency. Close to 50 percent of the U.S. population will live near the coastline by 2020. Placing datacenters closer to end users shortens the cabling needed and thus reduces latency, the time delay for data to travel to and from the customer.
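As a rough illustration of the latency math: light travels through optical fiber at about two-thirds its speed in vacuum, roughly 200 km per millisecond. The route distances below are illustrative, not actual cable runs.

```python
# One-way propagation delay through optical fiber (speed ~ 200 km/ms).
# Real-world latency is higher: routes aren't straight, and switching adds time.

FIBER_KM_PER_MS = 200.0

def one_way_delay_ms(route_km: float) -> float:
    return route_km / FIBER_KM_PER_MS

for route_km in (50, 500, 2000):   # nearby coastal site vs distant inland sites
    print(f"{route_km:>5} km -> ~{one_way_delay_ms(route_km):.2f} ms one-way")
```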

Time Will Tell

The idea is not without challenges. Operating a datacenter underwater means a lights-out operation: service technicians can't simply walk over and troubleshoot a faulty server, so whatever servers go into the datacenter must be extremely reliable. Then there are the typical issues of operating in the ocean, such as biofouling and corrosion. Concerns have also been raised about the potential for localized heating of the marine environment, a tough proposition for an already warming ocean. In the 2015 Southern California pilot the localized heating effect was negligible, but as underwater datacenters move from science experiment to reality the effect could become more substantial.

Project Natick could mark the beginning of a new trend for offshoring data. It will be very interesting to see how the system performs over the next five years.

 

For further reading, check out my post on the first deployment of Project Natick in Southern California.

Check out Microsoft's project page for more information and media: https://news.microsoft.com/features/under-the-sea-microsoft-tests-a-datacenter-thats-quick-to-deploy-could-provide-internet-connectivity-for-years/

Images courtesy of Microsoft and Scott Eklund/Red Box Pictures.
