Microsoft finds underwater datacenters are reliable, practical and use energy sustainably

Earlier this summer, marine specialists reeled up a shipping-container-size datacenter coated in algae, barnacles and sea anemones from the seafloor off Scotland’s Orkney Islands.

The retrieval launched the final phase of a years-long effort that proved the concept of underwater datacenters is feasible, as well as logistically, environmentally and economically practical.

Microsoft’s Project Natick team deployed the Northern Isles datacenter 117 feet deep to the seafloor in spring 2018. For the next two years, team members tested and monitored the performance and reliability of the datacenter’s servers.

The team hypothesized that a sealed container on the ocean floor could provide ways to improve the overall reliability of datacenters. On land, corrosion from oxygen and humidity, temperature fluctuations and bumps and jostles from people who replace broken components are all variables that can contribute to equipment failure.

The Northern Isles deployment confirmed their hypothesis, which could have implications for datacenters on land.

Lessons learned from Project Natick also are informing Microsoft’s datacenter sustainability strategy around energy, waste and water, said Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick.

What’s more, he added, the proven reliability of underwater datacenters has prompted discussions with a Microsoft team in Azure that’s looking to serve customers who need to deploy and operate tactical and critical datacenters anywhere in the world.

“We are populating the globe with edge devices, large and small,” said William Chappell, vice president of mission systems for Azure. “To learn how to make datacenters reliable enough not to need human touch is a dream of ours.”

Proof of concept

The underwater datacenter concept splashed onto the scene at Microsoft in 2014 during ThinkWeek, an event that gathers employees to share out-of-the-box ideas. The concept was considered a potential way to provide lightning-quick cloud services to coastal populations and save energy.

More than half the world’s population lives within 120 miles of the coast. By putting datacenters underwater near coastal cities, data would have a short distance to travel, leading to fast and smooth web surfing, video streaming and game playing.
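
As a rough illustration of why that proximity matters (the distances and speeds below are generic assumptions, not Project Natick measurements), here is a short Python sketch of best-case fiber propagation delay:

    # Rough, illustrative estimate of best-case fiber round-trip latency.
    # Assumption (not a Project Natick figure): signals travel through
    # optical fiber at roughly two-thirds the speed of light in a vacuum.

    FIBER_SPEED_KM_PER_S = 299_792 * 2 / 3  # ~200,000 km/s

    def round_trip_ms(distance_km: float) -> float:
        """Best-case propagation delay in milliseconds for a fiber round trip."""
        return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

    # A coastal user roughly 120 miles (~193 km) from an offshore datacenter
    print(f"{round_trip_ms(193):.1f} ms")    # ~1.9 ms
    # The same user reaching a datacenter ~1,500 km inland
    print(f"{round_trip_ms(1500):.1f} ms")   # ~15.0 ms

Real-world latency also depends on routing, switching and server load, but shorter fiber paths set a lower floor on response times.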

The consistently cool subsurface seas also allow for energy-efficient datacenter designs. For example, they can leverage heat-exchange plumbing such as that found on submarines.

Microsoft’s Project Natick team proved the underwater datacenter concept was feasible during a 105-day deployment in the Pacific Ocean in 2015. Phase II of the project included contracting with marine specialists in logistics, ship building and renewable energy to show that the concept is also practical.

“We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more,” Cutler said. “We have done what we need to do. Natick is a key building block for the company to use if it is appropriate.”

Algae, barnacles and sea anemones

The Northern Isles underwater datacenter was manufactured by Naval Group and its subsidiary Naval Energies, experts in naval defense and marine renewable energy. Green Marine, an Orkney Islands-based firm, supported Naval Group and Microsoft on the deployment, maintenance, monitoring and retrieval of the datacenter, which Microsoft’s Special Projects team operated for two years.

The Northern Isles was deployed at the European Marine Energy Centre, a test site for tidal turbines and wave energy converters. Tidal currents there travel up to 9 miles per hour at peak intensity and the sea surface roils with waves that reach more than 60 feet in stormy conditions.

The deployment and retrieval of the Northern Isles underwater datacenter required atypically calm seas and a choreographed dance of robots and winches that played out between the pontoons of a gantry barge. The procedure took a full day on each end.

The Northern Isles was gleaming white when deployed. Two years underwater provided time for a thin coat of algae and barnacles to form, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled base.

“We were pretty impressed with how clean it was, actually,” said Spencer Fowers, a principal member of technical staff for Microsoft’s Special Projects research group. “It did not have a lot of hardened marine growth on it; it was mostly sea scum.”

Power wash and data collection

Once it was hauled up from the seafloor and prior to transportation off the Orkney Islands, the Green Marine team power washed the watertight steel tube that encased the Northern Isles’ 864 servers and related cooling system infrastructure.

The researchers then inserted test tubes through a valve at the top of the vessel to collect air samples for analysis at Microsoft headquarters in Redmond, Washington.

“We left it filled with dry nitrogen, so the environment is pretty benign in there,” Fowers said.

The question, he added, is how gases that are normally released from cables and other equipment may have altered the operating environment for the computers.

The cleaned and air-sampled datacenter was loaded onto a truck and driven to Global Energy Group’s Nigg Energy Park facility in the North of Scotland. There, Naval Group unbolted the endcap and slid out the server racks as Fowers and his team performed health checks and collected components to send to Redmond for analysis.

Among the components crated up and sent to Redmond are a handful of failed servers and related cables. The researchers think this hardware will help them understand why the servers in the underwater datacenter are eight times more reliable than those on land.

“We are like, ‘Hey this looks really good,’” Fowers said. “We have to figure out what exactly gives us this benefit.”

The team hypothesizes that the atmosphere of nitrogen, which is less corrosive than oxygen, and the absence of people to bump and jostle components, are the primary reasons for the difference. If the analysis proves this correct, the team may be able to translate the findings to land datacenters.

“Our failure rate in the water is one-eighth of what we see on land,” Cutler said. “I have an economic model that says if I lose so many servers per unit of time, I’m at least at parity with land,” he added. “We are considerably better than that.”
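
As a sketch of the kind of comparison that quote describes (the land failure rate below is a hypothetical placeholder, not Microsoft’s actual model or data; only the one-eighth ratio comes from the team’s finding), the arithmetic works out roughly as follows:

    # Hypothetical sketch of the parity comparison Cutler describes.
    # The land failure rate is an assumed placeholder, not Project Natick
    # data; only the one-eighth ratio is taken from the reported result.

    SERVERS = 864                        # servers in the Northern Isles vessel
    LAND_ANNUAL_FAILURE_RATE = 0.04      # assumed: 4% of servers fail per year on land
    UNDERWATER_ANNUAL_FAILURE_RATE = LAND_ANNUAL_FAILURE_RATE / 8  # one-eighth

    DEPLOYMENT_YEARS = 2                 # length of the Northern Isles deployment

    land_failures = SERVERS * LAND_ANNUAL_FAILURE_RATE * DEPLOYMENT_YEARS
    underwater_failures = SERVERS * UNDERWATER_ANNUAL_FAILURE_RATE * DEPLOYMENT_YEARS

    print(f"Expected failures on land:    {land_failures:.0f}")       # ~69
    print(f"Expected failures underwater: {underwater_failures:.0f}") # ~9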

Energy, waste and water

Other lessons learned from Project Natick are already informing conversations about how to make datacenters use energy more sustainably, according to the researchers.

For example, the Project Natick team selected the Orkney Islands for the Northern Isles deployment in part because the grid there is supplied 100% by wind and solar as well as experimental green energy technologies under development at the European Marine Energy Centre.

“We have been able to run really well on what most land-based datacenters consider an unreliable grid,” Fowers said. “We are hopeful that we can look at our findings and say maybe we don’t need to have quite as much infrastructure focused on power and reliability.”

Cutler is already thinking of scenarios such as co-locating an underwater datacenter with an offshore windfarm. Even in light winds, there would likely be enough power for the datacenter. As a last resort, a powerline from shore could be bundled with the fiber optic cabling needed to transport data.

Other sustainability-related benefits may include eliminating the need for replacement parts. In a lights-out datacenter, all servers would be swapped out about once every five years. The high reliability of the servers means that the few that fail early are simply taken offline.

In addition, Project Natick has shown that datacenters can be operated and kept cool without tapping freshwater resources that are vital to people, agriculture and wildlife, Cutler noted.

“Now Microsoft is going down the path of finding ways to do this for land datacenters,” he said.

Go anywhere

Early conversations about the potential future of Project Natick centered on how to scale up underwater datacenters to power the full suite of Microsoft Azure cloud services, which may require linking together a dozen or more vessels the size of the Northern Isles.

“As we are moving from generic cloud computing to cloud and edge computing, we are seeing more and more need to have smaller datacenters located closer to customers instead of these large warehouse datacenters out in the middle of nowhere,” Fowers said.

That’s one of the reasons Chappell’s group in Azure is keeping an eye on the progress of Project Natick, including tests of post-quantum encryption technology that could secure data from sensitive and critical sectors. The ability to protect data is core to the mission of Azure in multiple industries.

“The fact that they were very quickly able to deploy it and it has worked as long as it has and it has the level of encryption on the signals going to it combines to tell a pretty compelling vision of the future,” Chappell said.

John Roach writes about Microsoft research and innovation. Follow him on Twitter.