Lauraharvsey

Implications of Microsoft's Underwater Data Center Experiments

Recently, Microsoft began Phase II of its experiment to see if underwater data centers are the wave of the future. The testing is part of Project Natick and is being conducted in conjunction with the European Marine Energy Centre (EMEC), based in the Orkney Islands off the coast of Scotland.

The first phase launched in 2015 with a vessel deployed off the California coast, focused on collecting data. That experiment ran for 105 days and concluded with promising results. The second deployment tests whether the outcome of Phase I can be replicated in a variety of marine environments.

What is Project Natick?

The experiment involves building submarine-like data centers and sinking them below the ocean surface. The containers are about 40 feet long, ringed with external heat exchangers, and cooled by seawater. Each 30,000-pound container houses the cooling-system infrastructure and 12 racks holding a total of 864 servers.

Although there are several floating data centers, these are the first that are completely submersible and designed as fail-in-place infrastructure.

The project rests on the theory that, because nearly half of the Earth's population lives within 200 kilometers of the ocean, placing servers on the ocean floor at strategic points along global coastlines will improve service speed and reliability.
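
To put that 200-kilometer figure in perspective, here's a quick back-of-the-envelope sketch in Python. The fiber propagation speed is a standard rule-of-thumb value, not a figure from the project, and this only counts propagation delay (no routing or processing), but it shows why proximity matters:

```python
# Back-of-the-envelope estimate of round-trip network latency vs. distance.
# Assumes signals travel through optical fiber at roughly two-thirds the
# speed of light (~200,000 km/s) -- a standard approximation.

SPEED_IN_FIBER_KM_PER_S = 200_000  # approximate propagation speed in fiber

def round_trip_latency_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given distance."""
    return (2 * distance_km / SPEED_IN_FIBER_KM_PER_S) * 1000

# A coastal user ~200 km from an offshore data center:
print(f"{round_trip_latency_ms(200):.1f} ms")   # ~2.0 ms propagation delay

# The same user reaching a data center 3,000 km inland:
print(f"{round_trip_latency_ms(3000):.1f} ms")  # ~30.0 ms propagation delay
```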

It's also designed to shrink the environmental footprint of growing technology use; energy consumption by data centers is expected to reach 140 billion kilowatt-hours annually by 2020.

The way these submersible data centers are constructed and cooled reduces the energy needed to run and cool servers that are straining under demand from more users, cloud computing, and advanced tech like artificial intelligence. For now, the research remains an applied study, but it could eventually include deploying cloud-computing applications for real customers.
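
One common way to frame that savings is Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. The numbers in this sketch are made up for illustration, not Natick measurements, but they show how slashing cooling overhead pulls the ratio toward the ideal value of 1.0:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to servers; cooling and other
# overhead push it higher. All figures below are illustrative assumptions.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    return (it_kw + cooling_kw + other_kw) / it_kw

# A conventional facility spending heavily on chillers and air handling:
print(f"land-based: {pue(it_kw=1000, cooling_kw=500, other_kw=100):.2f}")  # 1.60

# A sealed vessel cooled passively by the surrounding seawater:
print(f"submerged:  {pue(it_kw=1000, cooling_kw=50, other_kw=20):.2f}")    # 1.07
```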

According to project manager Ben Cutler, “The structure is potentially simpler and more uniform than we have for data centers today.” Feasibility remains hypothetical at this point, but “the expectation is there actually may be a cost advantage to this.”

How Does It Work?

While the concept is progressive, it doesn't employ groundbreaking concepts like blockchain technology. The data center relies on existing land-based electrical grids and fiber-optic cables. Cooling comes from the ocean water itself, which is pumped through the exterior heat exchangers.
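
For a feel of why seawater works so well as a coolant, the standard heat-transfer relation Q = ṁ · c_p · ΔT does the trick. Everything in this sketch, flow rate and temperature rise included, is an assumed figure for illustration rather than a Project Natick spec:

```python
# How much heat can a seawater loop carry away? Q = m_dot * c_p * delta_T,
# where m_dot is mass flow rate (kg/s), c_p is the specific heat of seawater
# (~3,990 J/(kg·K)), and delta_T is how much the water warms while passing
# through the heat exchangers. Values below are assumptions for illustration.

CP_SEAWATER = 3990.0  # J/(kg·K), approximate specific heat of seawater

def heat_rejected_kw(flow_kg_per_s: float, delta_t_k: float) -> float:
    return flow_kg_per_s * CP_SEAWATER * delta_t_k / 1000.0

# A modest 10 kg/s flow warmed by only 5 °C absorbs roughly 200 kW --
# comfortably more than the IT load of a container full of servers:
print(f"{heat_rejected_kw(10, 5):.0f} kW")  # ~200 kW
```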

Cooling busy data centers is one of the biggest sources of energy consumption and pollution in tech environments. The electrical grid in Orkney is already 100 percent powered by renewable wind and solar energy. Researchers also point out that winds are typically stronger along coastlines, which supports the possibility of wind as the primary energy source.

The current phase of testing is expected to take about a year, plus container recovery time, at which point the team will compare their expectations against actual results. The containers themselves are designed to last five years underwater, by which time Microsoft expects advancing technology will have outpaced their usefulness anyway.

What's the Point?

The stated purpose of the experiment is to investigate ways to expand data centers into spaces that promote sustainability by using renewable resources rather than contributing to global warming through business as usual. It also aims to support faster cloud deployment at a lower cost than building and maintaining land-based servers.

If it works, the innovation will reduce the environmental effects of powering large data centers while bringing more reliable service where it's needed most. In a probable future where serverless apps and virtual environments are the norm, creating cost-effective green solutions that actually improve service delivery seems like a no-brainer.

Some at Microsoft are calling this advancement "Cloud-in-Ocean computing." The Northern Isles data center is loaded with FPGA boards relocated from a Microsoft Azure data center. The containers themselves are being built by a marine manufacturing company; they can be mass-produced and turned out in 90 days at a much lower cost than land-based data centers.

What About Security?

Why sink sensitive electronic equipment underwater instead of keeping it safe on dry land? It's an obvious question many skeptics ponder. One problem the Microsoft team is working on is a condition called biofouling, caused by micro-organisms attaching themselves to materials placed in the ocean; it can begin within minutes. One solution would be to construct watertight containers from materials that could eventually be converted into artificial coral reefs, as some coastal areas are already doing.

Another concern is the impact on ocean temperatures. Since these are the only such containers in existence, figures are preliminary, but they show that the temperature change near the containers was about one-thousandth of a degree, and the heated water dissipated within 20 feet of the vessel.

With only a few containers being tested, that's no big deal. But if the technology becomes popular and suddenly thousands or hundreds of thousands of containers are generating heat in the ocean, that's something different.

Underwater Hackers?

As for physical server security, there is no mechanism currently in place to send scuba-diving techs down to service broken or damaged equipment. During testing, a diver conducts a visual inspection of the container monthly.

The design is built around a self-sustaining "lights-out" model that makes routine human oversight and intervention unnecessary. The location also makes physical break-ins, vandalism, and theft unlikely.

Data Security: As outlandish as it might seem, data security on these underwater servers should be unremarkable. Encryption and authentication methods should be no different from what already protects traditional data centers and virtual computing environments. The wrench that might eventually be thrown into the mix is that Microsoft currently holds a patent on the concept and design.

It’s not hard to envision a future in which Microsoft makes it difficult for VPN providers and firewall and anti-virus software makers to create products compatible with the new servers. We have only to look at the federal government's antitrust lawsuit, settled in 2001 (read more about the browser wars), to see how Microsoft has behaved in the past.

The bottom line is that, assuming the playing field is level, data held on these container servers should get the same protection from VPNs, firewalls, and anti-virus software as any other. On the flip side, the servers would also be targeted by hackers to the same degree.

Final Thoughts

The future of tech is in collaborative virtual environments. Placing servers closer to where they're needed will not only lower costs while improving speed and availability, it could also shorten enterprise time-to-market by strengthening remote development capabilities. However, questions remain about the feasibility and security of such environments.

Stay tuned for the results of Phase Two to learn how this experiment pans out and what it may mean for the future of computing!