Microsoft goes Underwater

Microsoft is leveraging technology from submarines and working with pioneers in marine energy for the second phase of its moonshot to develop self-sufficient underwater datacenters that can deliver lightning-quick cloud services to coastal cities.


A data center is a facility composed of networked computers and storage that businesses or other organizations use to organize, process, store and disseminate large amounts of data. A business typically relies heavily upon the applications, services and data contained within a data center, making it a focal point and critical asset for everyday operations.

In cloud computing, virtualized resources are hosted by a service provider or IT department and are delivered to users over a network or the internet. These resources include virtual machines and components, such as servers, memory, network switches, firewalls, load balancers and storage.

Google patented a 'water-based data center' in 2009, describing an environmentally friendly, sea-powered telecommunications and storage system.

Jonathan Koomey, a Stanford research fellow and an expert on data centers, said companies such as Dell, Hewlett-Packard, Sun Microsystems and Microsoft have for some time been installing specially built data centers in shipping containers because they are easy to deploy. A containerized data center can be brought online quickly, possibly within three months, far less time than it normally takes to build or scale out a conventional facility, and the new units can be integrated with existing land-based servers.

In 2015, Microsoft tested its first submarine data center, dubbed Leona Philpot (after the Halo character from Microsoft's Xbox franchise), revealing the project publicly in early 2016. Project Natick submerged a 38,000-pound, 10-by-seven-foot steel tube off the coast of California for roughly three months to see whether the servers inside, with about the computing power of 300 desktop PCs, would continue to function. The company says this was the first time a data center had operated under the sea.


The deployment of the Northern Isles datacenter at the European Marine Energy Centre in Orkney, Scotland, marks a milestone in the second phase of Microsoft's Project Natick, a years-long research effort to investigate manufacturing and operating environmentally sustainable, prepackaged datacenter units that can be ordered to size, rapidly deployed and left to operate lights-out on the seafloor for years.

There are several hypothetical advantages to dropping data centers in the deep (relatively speaking). Air conditioning and cooling costs eat a sizable percentage of a data center's budget, and water is far more effective at removing heat than forced-air cooling, so deploying servers in the ocean could drastically reduce that cooling overhead. Microsoft is reportedly considering deploying the servers with their own surface turbines or tidal energy systems to generate power on-site.
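The cooling argument above comes down to simple thermodynamics: water carries far more heat per unit volume than air. The sketch below illustrates this with textbook constants; the 250 kW pod load and 5 K coolant temperature rise are hypothetical figures chosen for illustration, not published Natick numbers.

```python
# Illustrative comparison of air vs. water as a cooling medium.
# Physical constants are standard textbook values; the heat load
# and temperature rise are assumed, not Microsoft's figures.

def coolant_flow_m3_per_s(load_w, density, specific_heat, delta_t_k):
    """Volumetric flow needed to carry away load_w watts of heat
    while the coolant warms by delta_t_k kelvin (Q = rho * V * cp * dT)."""
    return load_w / (density * specific_heat * delta_t_k)

LOAD_W = 250_000   # hypothetical heat load of a sealed server pod (W)
DELTA_T = 5.0      # allowed coolant temperature rise (K)

air = coolant_flow_m3_per_s(LOAD_W, density=1.2, specific_heat=1005.0,
                            delta_t_k=DELTA_T)
water = coolant_flow_m3_per_s(LOAD_W, density=1000.0, specific_heat=4186.0,
                              delta_t_k=DELTA_T)

print(f"air:   {air:8.2f} m^3/s")        # ~41 m^3/s of air
print(f"water: {water:8.4f} m^3/s")      # ~0.012 m^3/s of water
print(f"water moves the same heat in ~{air / water:,.0f}x less volume")
```

The roughly 3,500-fold difference in volumetric heat capacity is why a seawater heat exchanger can replace the fan walls and chillers of a land-based facility.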

Because half the world's population lives within 50 km of an ocean, deploying water-based servers would allow companies to offer consistently low-latency connections to large groups of people.
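The latency benefit follows directly from propagation delay: light in optical fiber travels at roughly two-thirds the speed of light in vacuum, about 200,000 km/s. The back-of-the-envelope sketch below compares a coastal user near an offshore pod with one served from a distant inland region; the 2,000 km figure is an assumed example distance, and real routes add switching and routing delay on top.

```python
# Idealized round-trip propagation time from fiber path length.
# Assumes ~200,000 km/s signal speed in fiber (about 2/3 c);
# ignores routing, queuing, and serialization delays.

FIBER_KM_PER_S = 200_000

def round_trip_ms(distance_km):
    """Round-trip propagation time in milliseconds over a fiber path."""
    return 2 * distance_km / FIBER_KM_PER_S * 1000

print(round_trip_ms(50))     # 0.5 ms  - user 50 km from an offshore pod
print(round_trip_ms(2000))   # 20.0 ms - user served from a distant region
```

Shaving tens of milliseconds off the floor matters most for interactive workloads such as gaming, video calls and web browsing, which is why proximity to population centers is a selling point of the design.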

Project Natick's approach also gains efficiency by standardizing the manufacturing process. Normally, server farms have to be designed around site-specific permitting requirements and construction costs, obstacles that are hard to streamline from project to project. But the researchers built the undersea unit in 90 days, illustrating the potential time savings of modular, factory-built fabrication.

Microsoft has stated that its end goal is to create data centers that are recyclable and do not impact the local environment. In theory, this should be possible, depending on how much waste heat each chassis produces. Closed-loop coolers would still allow for heat exchange between the server pod and the water without pumping seawater in and out of the system (and dealing with the associated corrosion and filtering issues).


Our assessment is that this achievement opens the door for future data center infrastructure to take advantage of the natural cooling properties and renewable energy of the underwater location. The technology also looks promising for the environment: less water consumed, less energy drawn from the grid, and fewer acres of landscape devoted to our computing needs.