Microsoft's Submarine Data Center - PixelRocket

Microsoft’s Submarine Data Center

When Bill and Melinda Gates founded the Bill & Melinda Gates Foundation in 2000, part of their vision was connecting the world through internet access. Now, Microsoft’s Project Natick takes this dream one step further. A long, submarine-like tube currently houses a fully functional data center, completely underwater. Not only does this model address the economic and ecological problems of cooling data centers, but since roughly half of the world’s population lives near a large body of water, what better place to drop a bunch of servers than in the ocean?

Microsoft Underwater Data Server Engineer Spencer Fowers

Source: photographer Scott Eklund/Red Box Pictures and Microsoft

Data Processing

If you’re on your computer, smartphone, or tablet right now, you’re relying on one of the millions of data centers in the world; roughly 3 million of them are in the United States alone. As unlimited as this access seems, a look at the anatomy of a data center reveals problems with keeping up this growth.

Consider the following: data processing occurs within a server, and data centers house shelves upon shelves of these servers. More data centers mean more information can be processed from more devices. In the digital age, the explosion of connected devices has caused server traffic to grow immensely. Highly populated areas bear this burden most, resulting in congested networks and slower internet speeds.

Project Natick seeks to reduce traffic by increasing roadways. Microsoft hopes to eventually place more data centers in large bodies of water near major cities. Cloud computing services, in this case, Azure, allow nearby users to benefit from these centers.
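The benefit of putting a data center close to its users comes down to propagation delay. A quick back-of-the-envelope sketch makes the point; everything below is a hypothetical illustration (the distances and the fiber-speed figure are rough assumptions, not numbers from Microsoft):

```python
# Back-of-the-envelope latency sketch (illustrative assumptions only).
# Light travels through optical fiber at roughly two-thirds the speed of
# light in vacuum, i.e. about 200 km per millisecond.
C_FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Ideal propagation-only round-trip time, ignoring routing and queuing."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

# A coastal data center ~50 km offshore vs. an inland one ~1,500 km away:
print(round_trip_ms(50))    # → 0.5 (milliseconds)
print(round_trip_ms(1500))  # → 15.0 (milliseconds)
```

Real-world latency is higher once switching and routing are included, but the proportional advantage of a nearby, offshore data center holds.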

Built elsewhere and towed to their final location, these vessels are easier to deploy than conventional data centers housed in buildings. The Natick vessel itself was built in France before being transported to Scotland, demonstrating how portable the design is.

Hot Commodities

Project Natick not only serves to meet high demand but also tackles the greatest issues of data centers: overheating and expense. Any piece of technology used heavily enough—a computer, a phone, or even a car—will inevitably begin to heat up after enough hours of use. Similarly, a data center full of servers running 24/7 can grow very warm, very quickly. Yet servers can’t simply be turned off to cool down. For this reason, the most expensive part of maintaining a data center is running the air conditioning at full blast, around the clock.

Project Natick solves this problem using submarine-like technology. “The system pipes seawater directly through the radiators on the back of each of the 12 server racks and back out into the ocean,” explains Microsoft’s feature story on the project. With this technique, the 864 servers should stay at functional temperatures.
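The seawater loop Microsoft describes is, at heart, a simple heat balance: the water’s mass flow times its specific heat times its temperature rise must carry away the servers’ heat. The sketch below illustrates that arithmetic; the rack power and temperature figures are invented for illustration, not Microsoft’s published numbers.

```python
# Hedged heat-balance sketch; all numbers are illustrative assumptions.
CP_SEAWATER = 3990.0  # specific heat of seawater, J/(kg*K), approximate

def required_flow_kg_per_s(heat_watts: float, delta_t_k: float) -> float:
    """Seawater mass flow needed to absorb heat_watts with a delta_t_k rise."""
    return heat_watts / (CP_SEAWATER * delta_t_k)

# Suppose the 12 racks dissipate ~240 kW in total and the water is
# allowed to warm by 5 K on its way through the radiators:
flow = required_flow_kg_per_s(240_000, 5.0)
print(f"{flow:.1f} kg of seawater per second")
```

Roughly a dozen kilograms of seawater per second, under these assumed numbers; a modest flow, which hints at why the ocean makes such an attractive heat sink.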

To lower expenses, Microsoft has turned to eco-friendly energy sources. The undersea data center sits off the coast of Scotland because of the country’s use of renewable energy: the Orkney Islands’ wind farms and solar panels power the experiment. Microsoft’s ultimate goal is to harvest tidal energy, making the data center completely independent of the land. This freedom from any other power source would make it possible to place data centers anywhere in the ocean, which would mean faster networks even in isolated locations with undependable electricity.

Microsoft Undersea Data Server

Source: photographer Scott Eklund/Red Box Pictures and Microsoft

Natick Phase 1 and the Future

Thus far our article has focused on Natick Phase 2. While experimental, this trial is actually the successor to an earlier prototype, Natick Phase 1.

In 2015, Microsoft deployed a similar data center off the coast of California. The smaller prototype had only a fraction of the power of its big brother, but it ran entirely on its own, without maintenance, for 105 days. Back then, Phase 1 drew energy from an electric power grid on the mainland, and it was cooled purely by contact with the cold Pacific Ocean. Phase 2 brings many upgrades: not only is it much bigger, but it has the processing power of several thousand consumer computers, and its construction took less than the 90 days the previous unit required.

Microsoft anticipates great results from its newest addition to Project Natick, but it already has ambitions for improvement. A full five years could pass before the vessel requires maintenance, and this “lights out,” zero-maintenance window could eventually be extended to 10 years. The European Marine Energy Centre, a close partner on the project, is also working toward the technology needed to make future units completely independent of shore power. Additionally, recycled materials could make up the container’s hull. All of these refinements would make the project even more environmentally friendly and cost-effective.

What do you think of Project Natick? Could it be the future of data centers, or is it too early to tell? Let us know what you think in the comments below!