Take Microsoft’s underwater data centers with a grain of salt

The problem with an underwater cloud


Microsoft wants to put data center pods underwater. Is that really a good idea? 

Credit: Microsoft

Microsoft revealed Project Natick this weekend – an effort to test data centers that are deployed underwater.

Is it really viable for the cloud to live in the ocean?


The rationale for Project Natick makes a lot of sense. Microsoft says half of Americans live within 200 km of the ocean. Locating data centers near end users reduces latency, and dry land near large population centers is expensive. So why not throw the data centers in the ocean?
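For a rough sense of the latency math (these are my back-of-envelope figures, not Microsoft's): light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, about 200,000 km per second, so every 100 km of distance adds on the order of a millisecond of round-trip time before any routing or processing overhead. A quick sketch in Python:

    # Back-of-envelope propagation delay. Light in optical fiber moves at
    # roughly two-thirds of c, about 200,000 km/s (an approximation).
    FIBER_KM_PER_SEC = 200_000

    def round_trip_ms(distance_km: float) -> float:
        """Best-case round-trip propagation delay, in milliseconds."""
        return 2 * distance_km / FIBER_KM_PER_SEC * 1_000

    for km in (200, 1_000, 4_000):
        print(f"{km:>5} km: ~{round_trip_ms(km):.0f} ms round trip")

    # 200 km works out to ~2 ms; a 4,000 km coast-to-coast hop is ~40 ms,
    # and that's before queuing, routing, and server time pile on top.

Two milliseconds versus forty is a real difference for latency-sensitive applications, which is the whole appeal of parking compute just offshore.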

Microsoft has been developing and testing Project Natick since 2013, when the idea was conceived by an employee who had worked on a US Navy submarine. Microsoft built a miniature data center, enclosed it in a waterproof steel vessel and sank it off the coast of California (see photo above).

Here’s my question: What happens when something breaks?

“It’s kind of like launching a satellite for space,” Project Natick research engineer Jeff Kramer says, emphasizing my point exactly. “Once you’ve built it, and you hand it to the guys with the rocket – or in our case the guys with the crane – you can’t do anything about it if it screws up.”

Here’s the problem: the cloud screws up a lot. Amazon.com CTO Werner Vogels has a famous saying: “Everything fails. All the time.” What happens when a server inexplicably fails, or a router in the underwater pod goes awry?

Microsoft says Project Natick data centers are very resilient. From the project’s website:

With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly. We see this as an opportunity to field long-lived, resilient datacenters that operate “lights out” – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as 10 years. 

An FAQ on the Project Natick website goes on to explain that the data centers are designed to be deployed for five years, then reloaded with new computers and redeployed. That all sounds well and good, but it still doesn’t answer what will happen when – not if – something breaks inside. In practice, those pods would need to come back up to the surface a lot sooner than once every five years.
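To be fair, “lights out” operation typically means the software is expected to route around dead hardware rather than wait for a technician: work simply stops flowing to failed nodes, and capacity shrinks with each failure until the next servicing window. A minimal sketch of that idea in Python (the node names and health check here are hypothetical illustrations, not anything Microsoft has published):

    import random

    # Hypothetical pod of redundant servers; the names are illustrative only.
    SERVERS = [f"natick-node-{i:02d}" for i in range(8)]

    def is_healthy(server: str) -> bool:
        """Stand-in for a real health check (heartbeat, ping, etc.)."""
        return server != "natick-node-03"  # pretend this node has died

    def pick_server() -> str:
        """Dispatch work only to nodes that pass the health check."""
        healthy = [s for s in SERVERS if is_healthy(s)]
        if not healthy:
            raise RuntimeError("no healthy nodes left; pod must resurface")
        return random.choice(healthy)

    print(pick_server())  # never returns the failed node

The catch – and the point of this article – is that the dead node stays dead. Every failure permanently shrinks the pod’s capacity until somebody hauls it back up.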

Let’s say Microsoft does figure that out. It still has to worry about sharks chewing on its undersea cables, as Google knows all too well.
