20,000 Leagues Under the Cloud

Experts say Microsoft's Project Natick could give the world better online services by putting computer servers under the sea

Microsoft workers deploy the test capsule for underwater data centers at the start of the trial.
Photo: Microsoft

In the 2015 film “Creed,” aged boxing legend Rocky Balboa stares up at the sky in confusion after his young protege tells him a smartphone picture has been saved in the cloud. Rocky might feel even more befuddled if he heard about Microsoft’s experiment in putting the cloud’s computer servers under the sea. As crazy as it sounds, the underwater data center initiative, called Project Natick, could revolutionize the way companies deliver Internet services such as streaming video, music, or games.

Microsoft’s first underwater test involved a car-sized capsule that weighs more than 17,000 kilograms and has computing power equivalent to 300 desktop computers. That’s tiny compared with existing data centers. But Project Natick marks a first step toward seeing whether ocean water can help undersea data centers save on the power costs associated with cooling computers. Undersea data centers could also put many more cloud computing hubs closer to the billions of people living near coastlines.
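
One rough way to picture where the cooling savings come from is power usage effectiveness (PUE), the ratio of a data center’s total power draw to the power that actually reaches its computing gear. The sketch below uses purely hypothetical numbers, a 30-kilowatt capsule and assumed PUE values, rather than anything Microsoft has reported; it only illustrates how large the non-computing overhead can be.

```python
# Illustrative only: PUE = total facility power / IT equipment power.
# None of these figures come from Project Natick; they are assumed for the example.

def cooling_overhead_kw(it_load_kw: float, pue: float) -> float:
    """Power spent on everything besides the IT load (mostly cooling), in kilowatts."""
    return it_load_kw * (pue - 1.0)

it_load_kw = 30.0  # hypothetical capsule IT load
conventional = cooling_overhead_kw(it_load_kw, pue=1.6)  # assumed air-cooled facility
seawater = cooling_overhead_kw(it_load_kw, pue=1.1)      # assumed seawater-cooled capsule

print(f"Conventional overhead: {conventional:.1f} kW")
print(f"Seawater-cooled overhead: {seawater:.1f} kW")
print(f"Reduction: {100 * (conventional - seawater) / conventional:.0f} percent")
```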

“You could save on at least one third of your operating infrastructure costs right there on the spot,” says Andres Marquez, a high-performance computing expert at the Pacific Northwest National Laboratory in Richland, Wash. “The location also makes perfect sense because a lot of people live in densely populated coastal areas.”

Marquez does not have any direct involvement with Project Natick. But in his personal opinion, Microsoft’s idea of underwater data centers could play an important role in meeting the world’s growing demand for Internet services. Putting data centers physically closer to customers makes a big difference for streaming Internet services that want to reduce the delay caused by delivering data over longer distances, Marquez says. For ordinary Internet users, that means less annoyance caused by stuttering music streams, buffering videos, or the dreaded lag in online games.
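
A back-of-the-envelope sketch shows why distance matters: light in optical fiber covers roughly 200 kilometers per millisecond, so every extra 1,000 kilometers between a user and a server adds about 10 milliseconds to the round trip before routing, queuing, or server time is even counted. The distances below are hypothetical, chosen only to illustrate the scale.

```python
# Propagation delay alone, ignoring routing, queuing, and processing time.
# Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of c).

FIBER_KM_PER_MS = 200.0  # approximate kilometers of fiber covered per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time over fiber for a given one-way distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

for distance_km in (50, 500, 5000):  # hypothetical nearby capsule vs. distant data centers
    print(f"{distance_km:>5} km away -> at least {round_trip_ms(distance_km):.1f} ms round trip")
```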

The prime coastal location of underwater data centers also offers an alternative to buying expensive real estate within or near cities. Similar thinking once fueled speculation that Google’s floating barges housed offshore data centers, before it was revealed that they were merely showrooms for new Google products.

Microsoft workers assemble the undersea test capsule containing computer servers. Photo: Microsoft

Project Natick’s use of readily available ocean water as a coolant also plays into the location advantage. Tech giants such as Facebook have previously experimented with putting data centers in frigid environments, such as at the edge of the Arctic Circle. But such chilly spots aren’t always close to the customers they serve.

The first Project Natick test capsule—named Leona Philpot in honor of a character in the “Halo” video game series—survived a 105-day underwater trial run at a depth of 9 meters off the coast of central California. It used a heat exchanger system to transfer the heat generated by the computer servers to the surrounding ocean water. The New York Times described it this way:

It is a large white steel tube, covered with heat exchangers, with its ends sealed by metal plates and large bolts. Inside is a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips while the system was tested on the ocean floor.
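
To get a feel for what that heat exchanger has to do, a simple energy balance (Q = ṁ · c_p · ΔT) shows how little seawater is needed to carry away a capsule’s heat if the water is allowed to warm only slightly. The heat load and temperature rise below are assumed for illustration; they are not Project Natick specifications.

```python
# Illustrative seawater heat-exchanger sizing using Q = m_dot * c_p * delta_T.
# The 30 kW load and 2 degree C rise are assumptions, not Microsoft's numbers.

SEAWATER_CP_J_PER_KG_K = 3990.0   # approximate specific heat of seawater
SEAWATER_DENSITY_KG_M3 = 1025.0   # approximate density of seawater

def required_flow_l_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Seawater flow in liters per second needed to absorb heat_load_w with a delta_t_k rise."""
    mass_flow_kg_s = heat_load_w / (SEAWATER_CP_J_PER_KG_K * delta_t_k)
    return 1000.0 * mass_flow_kg_s / SEAWATER_DENSITY_KG_M3

print(f"{required_flow_l_per_s(30_000, 2.0):.1f} liters of seawater per second")
```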

The ocean’s capability to act as a huge, heat-absorbing sponge should come as no surprise, says Junaed Sattar, assistant professor of computer science and engineering at the University of Minnesota in Minneapolis. Sattar, an underwater robotics expert, has no direct connection with Project Natick. But he has observed the difference an ocean plunge can make in drastically reducing the operating temperature of machines such as underwater robots.

“The underwater domain is a prime location to put something heat-producing like a data center,” Sattar says. “Heat will dissipate really fast and really far underwater. I see that as big cost savings.”

Underwater data centers may also encourage the use of renewable power sources such as wave or tidal power. The first Project Natick test still relied upon external power cables connecting the underwater capsule to the existing power grid. In future testing, Microsoft wants to hook the capsule up to a renewable power source that supplies the electricity it needs. Such a setup could complement the ocean water cooling to achieve fossil-fuel-free operations. It would also free underwater data centers from having to be located near the existing power grid.

Eventually, Microsoft hopes Project Natick can lead to data centers capable of operating underwater for up to five years at a time. The tech giant also envisions deploying such data centers within 90 days as a rapid response to changing market demand.

But first, Microsoft must overcome several challenges to make the underwater data center idea work. The capsules must have the material strength to survive the corrosive effects of salty ocean water, and their casings must be watertight, Sattar says. A leak would be catastrophic, letting seawater flood in and ruin the computer servers inside. Survivability is crucial to Microsoft’s long-term plan for underwater deployments lasting years at a time.

The capsule being removed from its underwater location at the end of testing. Photo: Microsoft

Microsoft does not plan to perform underwater upgrades of the computer servers during the five-year undersea deployments. Still, the company may have to use human divers to periodically check on the capsules and possibly clean off organic biofilms or mineral deposits, Marquez says. Over time, such buildup could clog the heat exchangers. (During the 105-day trial period, Microsoft sent a diver down once a month to check on the capsule.)

“I’m not saying Microsoft didn’t solve that problem, but I would be curious to see how they do it,” Marquez says.

Maintenance might get even more complicated during future tests or operational deployments that put data centers at ocean depths beyond the easy reach of human divers. That’s when robot submersibles might come into play, Sattar says. Such remotely operated or autonomous underwater vehicles could help collect data from self-contained sensors and monitoring stations near underwater data centers.

The environmental impact of such underwater data centers also remains unknown. Ideally, the heat and sounds coming from the data center should avoid disturbing fish and other marine life near the seafloor.

Microsoft reported no extra heating of the surrounding water beyond a few inches from the test capsule during Project Natick’s first trial run, according to the New York Times. And the clicking of nearby shrimp supposedly drowned out any sounds from the capsule. The tech company plans to test a bigger data center capsule alongside an ocean-based power source sometime in 2017. But Sattar cautioned that it’s tough to predict the environmental impacts of much larger data centers or underwater data farms containing hundreds of smaller units.

“I do like the approach of putting this underwater and saving energy and expenses, but of course you have to worry about the other side of the equation,” Sattar says. “I’m hopeful and positive about what they’re doing here; if it’s done right, we’ll all benefit from it.”
