FileHippo News

The latest software and tech news

Microsoft Trials Underwater Natick Data Center

Meet Project Natick.

Microsoft has unveiled its latest research project, and it could be a major game changer in how large companies think about cloud computing: submerging a 38,000-pound container a kilometer off the Pacific coast.

According to Microsoft, around 50% of us live close to a coast, so while dropping data centers into the ocean might sound a bit nuts at first, it actually makes a lot of sense.

With the ever-faster expansion of the cloud and the ever-increasing demand for internet services, large technology firms have been keen to find energy-efficient ways to store their growing volumes of data. Data centers consume up to 3% of the world's electricity, and are essential for modern everyday computing.

One of the major power draws in these data centers is heat, or rather the cost of removing it. In a bid to lower power costs, tech firms are routinely considering new options and ideas for housing data: Facebook has moved some of its facilities to countries with colder climates, and Microsoft is now trialing putting them under the sea.

The prototype vessel used to test the theory, the Leona Philpot (named after an Xbox character, apparently), sat one kilometer off the Pacific coast of the United States from August to November of 2015.

Sounds a bit expensive doesn’t it?

Yes, it does, and yes it is. But with the end of Moore's Law (the observation that computing processing power doubles roughly every 18 months), the costs involved are no longer cripplingly expensive. Microsoft postulates that each set of server racks within the underwater data centers will last for five years, while the Natick vessel's steel casing will be usable for up to 20 years.
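To put those lifespan figures in perspective, here is a back-of-envelope sketch of what a strict 18-month doubling period would predict over one five-year rack generation. The doubling cadence and rack lifespan come from the article; treating Moore's Law as a smooth exponential is an illustrative simplification.

```python
# Illustrative only: how much raw compute an 18-month doubling
# cadence (Moore's Law, per the article) would predict over the
# 5-year server-rack lifespan Microsoft quotes for Natick.
rack_lifespan_years = 5
doubling_period_years = 1.5  # 18 months

doublings = rack_lifespan_years / doubling_period_years  # ~3.33 doublings
growth_factor = 2 ** doublings

print(f"~{growth_factor:.1f}x compute growth per rack generation")
# → ~10.1x compute growth per rack generation
```

In other words, by the time a rack is swapped out, off-the-shelf hardware would (on this idealized model) be roughly an order of magnitude faster, which is part of why periodic five-year refreshes inside a 20-year hull make economic sense.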

But there are other reasons too. Here, according to Microsoft, are the main motivations for the ocean testing:

  • AIR CON: Deepwater deployment offers ready access to cooling. Air conditioning is one of the major costs of running data centers, and putting them in the ocean would practically eliminate cooling costs.
  • SUSTAINABLE (maybe): Natick data centers are envisioned to be fully recyclable, with the possible exception of the computing hardware inside. They will be made from recycled material, which in turn will be recycled at the end of the data center's life.
  • RENEWABLE: Powered by offshore renewable energy sources, Natick data centers could be truly zero-emission, with minimal wastage.
  • LATENCY: Gamers of the world unite. As stated above, around half the world's population lives within 200 km of the coast. By placing data centers offshore, Microsoft could dramatically reduce latency and provide better responsiveness, because data would travel a much shorter distance.
  • RAPID DEPLOYMENT: Demand for extra bandwidth and cloud capacity could be met quickly and efficiently. This could prove extremely helpful during natural disasters and special events such as the World Cup.
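The latency point above is easy to sanity-check with a back-of-envelope calculation. The 200 km coastal figure is from the article; the fiber signal speed (roughly 200,000 km/s, about two-thirds the speed of light in vacuum) and the inland comparison distance are illustrative assumptions.

```python
# Rough round-trip signal latency over optical fiber.
# Signal speed in fiber is about 200,000 km/s (~2/3 c);
# distances below are illustrative assumptions.
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (there and back)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A user 200 km from an offshore data center vs. one served
# from a hypothetical facility 2,000 km inland:
print(round_trip_ms(200))    # → 2.0 ms
print(round_trip_ms(2000))   # → 20.0 ms
```

Propagation delay is only one component of real-world latency (routing, queuing, and server time all add to it), but it scales directly with distance, which is why placing data centers near coastal populations helps.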

Project Natick is currently at the research stage. Microsoft only began the research program back in 2013, so it's too early to say whether the concept will ever be deployed at scale, but initial feedback from the prototype Natick has been mostly positive.