Data centers are an essential component of the modern Internet. Without even realizing it, nearly everyone in the world makes use of them. Whether you are streaming a video or checking social media updates on your phone, the data processing happens miles away in a remote data center.
In early June this year, Microsoft created a stir when they announced that they had sunk a data center off the coast of Scotland. The headline raised several questions. What is the benefit of placing a data center underwater? Isn't it risky to put electronics in a corrosive environment like the sea? How will maintenance be done? And why off Scotland? Before answering these questions, let's look at the story from the beginning.
The birth of Project Natick
It all started in 2013, when Microsoft employees submitted a whitepaper describing an underwater data center that could be powered by renewable ocean energy. One of the major cost drivers of any data center is electricity, which is required both to power and to cool equipment. The paper's concept of using renewable ocean resources to power equipment and also to cool it caught the attention of Microsoft's management, and in late 2014 Microsoft kicked off Project Natick.
Phase 1 – Leona Philpot
As the first phase of the project, Microsoft developed a 10' x 7' cylindrical prototype vessel, taking inspiration from submarines, which could house a single server rack. The prototype was whimsically named 'Leona Philpot', after a character from Microsoft's 'Halo' video game series. Because no human would ever need to enter the vessel, the team could maximize the use of internal space and dispense with oxygen altogether.
In 2015, Leona Philpot was deployed underwater off the coast of California for 105 days, with sensors monitoring the system and tracking factors such as motion, pressure, heat, and humidity. The test results exceeded Microsoft's expectations, and the prototype was retrieved and taken back to headquarters for analysis and refitting.

Phase 2 – Orkney, Scotland
For the current phase of the project, Microsoft brought in expertise from marine organizations to lead the design, fabrication, and deployment of the Phase 2 data center. The new module is larger, at 12.2 m long and 3.18 m in diameter, and its payload includes 12 racks containing 864 standard Microsoft data center servers with FPGA acceleration and 27.6 petabytes of disk, enough storage for about 5 million movies. The internal environment is kept at 1 atmosphere of pressure and filled with dry nitrogen instead of air.
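As a quick sanity check on that storage figure, here is a back-of-envelope sketch (assuming a decimal petabyte of 10^15 bytes and treating the 5-million-movie count as approximate) of what it implies per film:

```python
# Back-of-envelope check of the Phase 2 storage claim.
# Assumes a decimal petabyte (10**15 bytes), which the article does not specify.

total_storage_pb = 27.6        # total disk in the Phase 2 vessel (from the article)
movies = 5_000_000             # approximate movie count quoted in the article

bytes_total = total_storage_pb * 10**15
gb_per_movie = bytes_total / movies / 10**9

print(f"Implied average movie size: {gb_per_movie:.1f} GB")
# Prints roughly 5.5 GB, about the size of a typical HD film.
```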
The European Marine Energy Centre in Orkney, Scotland was selected because of the abundant wind and tidal power supplied through the local grid. On 1st June, the data center was deployed from a docking structure.
While the objective of the first phase was to see whether an underwater data center was feasible at all, the second phase looks at whether the concept is logistically, environmentally, and economically practical.
What does this mean for the future of data centers?
Latency looks to be one of the key improvements Microsoft is targeting. Signals travel around 200 km per millisecond across the Internet, so if you are 200 km from a data center, one round trip takes about 2 milliseconds, but if you are 4,000 km away, each round trip takes about 40 milliseconds. The impact of latency is felt most in real-time applications such as gaming and video conferencing. Right now, most of the world's data centers are housed in giant complexes far from urban areas, and this distance to the population increases latency. Half of the world's population, however, lives within 200 kilometers of a coast, so placing data centers in the sea puts them much closer to their users.
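To make that arithmetic concrete, here is a minimal sketch of the round-trip estimate, using only the propagation figure quoted above and ignoring routing, queuing, and processing delays:

```python
# Minimum round-trip latency implied by the ~200 km/ms propagation figure.
# Real-world latency is higher because of routing, queuing, and processing.

SPEED_KM_PER_MS = 200  # approximate signal speed across the Internet

def round_trip_ms(distance_km: float) -> float:
    """Lower-bound round-trip time, in milliseconds, to a data center."""
    return 2 * distance_km / SPEED_KM_PER_MS

for distance_km in (200, 4000):
    print(f"{distance_km:>5} km away -> ~{round_trip_ms(distance_km):.0f} ms round trip")
# 200 km -> ~2 ms, 4000 km -> ~40 ms, matching the figures in the text.
```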
Traditional data centers also take years to construct and deploy. Microsoft is looking at exploiting the modular design of these underwater data centers, which can cut deployment time to 90 days. The size of the data center deployed in Orkney is no accident: it can be transported on a standard 40-foot container lorry, meaning that within 90 days you can add data center capacity using ordinary logistics.
Modern data centers also use a lot of water for cooling, typically 4.8 liters per kWh of electricity consumed, while the Natick modules use only the surrounding seawater for their heat exchangers. Powering the modules with tidal energy in the future could make them zero-emission and truly sustainable.
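For a sense of scale, the following rough sketch applies the 4.8 liters-per-kWh figure to a hypothetical facility drawing a constant 1 MW (an assumed load for illustration, not a Natick number):

```python
# Rough annual cooling-water use for a conventional data center, using the
# 4.8 liters/kWh figure from the text. The 1 MW load is an assumed example.

WATER_LITERS_PER_KWH = 4.8
facility_load_kw = 1_000.0     # hypothetical continuous draw
hours_per_year = 365 * 24

annual_kwh = facility_load_kw * hours_per_year
annual_water_liters = annual_kwh * WATER_LITERS_PER_KWH

print(f"~{annual_water_liters / 1e6:.0f} million liters of water per year")
# A sea-cooled Natick module avoids this draw by using seawater in its heat exchangers.
```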

Environmental concerns
The Project Natick modules are environmentally sustainable, with zero emissions. If there is a concern, it is the heat emitted by the modules. Phase 1 showed that very little heat was released, and only very close to the vessel's surface. During Phase 2, sensors will continue to monitor the heat and noise emitted by the module to assess any environmental impact.
After each 5-year deployment cycle, the data center vessel would be retrieved, reloaded with new computers, and redeployed. The current Project Natick module is expected to last 20 years; after that, the vessel is designed to be retrieved and recycled.
Is it worth the effort?
Data centers have already made great strides in reducing energy usage through methods such as liquid cooling, airflow management, and server virtualization. Even moving data centers to countries with lower temperatures has been shown to improve energy efficiency. Phase 1 of Project Natick operated with a highly efficient PUE (power usage effectiveness, or total facility power divided by server power; 1.0 is perfect) of 1.07. However, Google's data centers already claim a PUE of 1.12, so the efficiency gain alone is modest.
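To put those PUE numbers side by side, here is a small sketch applying the definition above to an assumed 1 MW server load (the load is a hypothetical figure, not one published for either facility):

```python
# PUE (power usage effectiveness) = total facility power / server (IT) power.
# A PUE of 1.0 would mean every watt goes to the servers; the rest is overhead
# such as cooling and power distribution. The 1 MW IT load is hypothetical.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Power spent on everything besides the servers, at a given PUE."""
    return it_load_kw * (pue - 1.0)

it_load_kw = 1_000.0  # assumed server load, not a published Natick or Google figure
for name, pue in (("Natick Phase 1", 1.07), ("Google (claimed)", 1.12)):
    print(f"{name}: PUE {pue:.2f} -> {overhead_kw(it_load_kw, pue):.0f} kW overhead")
# For the same 1 MW of servers, the gap is only about 50 kW of overhead.
```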
Therefore, the greatest advantages may come from lowering water usage for cooling, increasing deployment speed, and improving latency for end users by targeting densely populated coastal cities.
Further plans exist to power the undersea modules with tidal generation, making them run entirely on renewable energy and fully self-sustaining.
Project Natick is still a speculative effort, but if the next few years of testing show that the concept is technologically and commercially viable, it could very well change the economics of the data center business.