June 12, 2018 - Under the Scottish sea, as part of the next phase of Project Natick, Microsoft tests a data center including 864 servers and 27.6 petabytes of storage that’s quick to deploy and could provide internet connectivity for years.
Underwater Data Centers
Microsoft has announced that it is leveraging technology from submarines and working with pioneers in marine energy for the second phase of its moonshot to develop self-sufficient underwater data centers that can deliver lightning-quick cloud services to coastal cities.
An experimental, shipping-container-size prototype is processing workloads on the seafloor near Scotland’s Orkney Islands.
The deployment of the Northern Isles data center at the European Marine Energy Centre marks a milestone in Microsoft’s Project Natick, a years-long research effort to investigate manufacturing and operating environmentally sustainable, prepackaged datacenter units that can be ordered to size, rapidly deployed and left to operate lights out on the seafloor for years.
“That is kind of a crazy set of demands to make,” said Peter Lee, corporate vice president of Microsoft AI and Research, who leads the New Experiences and Technologies, or NExT, group. “Natick is trying to get there.”
Lee’s group pursues what Microsoft CEO Satya Nadella has called “relevant moonshots” with the potential to transform the core of Microsoft’s business and the computer technology industry. Project Natick is an out-of-the-box idea to accommodate exponential growth in demand for cloud computing infrastructure near population centers.
More than half of the world’s population lives within about 120 miles of the coast. By putting data centers in bodies of water near coastal cities, data would have a short distance to travel to reach coastal communities, leading to fast and smooth web surfing, video streaming and game playing as well as authentic experiences for AI-driven technologies.
“For true delivery of AI, we are really cloud dependent today,” said Lee. “If we can be within one internet hop of everyone, then it not only benefits our products but also the products our customers serve.”
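The proximity argument above can be made concrete with a rough latency estimate. The sketch below is illustrative only: it assumes an ideal straight-line fiber run and ignores routing, switching and processing delays, which dominate in practice.

```python
# Why proximity matters: fiber propagation delay for a user roughly
# 120 miles (~193 km) from a coastal data center.
# Light in optical fiber travels at about 2/3 the speed of light in
# a vacuum, i.e. roughly 200,000 km/s (an approximation).
distance_km = 193            # ~120 miles, the coastal-population figure above
fiber_speed_km_per_s = 200_000

round_trip_ms = 2 * distance_km / fiber_speed_km_per_s * 1000
print(f"~{round_trip_ms:.2f} ms round trip")  # ≈ 1.93 ms
```

Even under these idealized assumptions, a data center hundreds of miles inland adds several more milliseconds per round trip, which compounds across the many round trips a typical web page or game session requires.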
From France to Scotland
Project Natick’s 40-foot-long Northern Isles data center is loaded with 12 racks containing a total of 864 servers and associated cooling system infrastructure. The data center was assembled and tested in France and shipped on a flatbed truck to Scotland, where it was attached to a ballast-filled triangular base for deployment on the seabed.
On deployment day, the winds were calm and seas flat under a thick coat of fog. “For us, it was perfect weather,” said Ben Cutler, a project manager in the special projects group within Microsoft’s research organization who leads the Project Natick team.
The data center was towed out to sea partially submerged, cradled by winches and cranes between the pontoons of an industrial catamaran-like gantry barge. At the deployment site, a remotely operated vehicle retrieved a cable containing the fiber optic and power wiring from the seafloor and brought it to the surface, where it was checked and attached to the data center, which was then powered on.
Cutler said there were sighs of relief as each of those risks was retired. As if on cue, the last wisps of fog lifted.
The most complex task of the day was the foot-by-foot lowering of the data center and cable 117 feet to the rock slab seafloor. The marine crew used 10 winches, a crane, a gantry barge and a remotely operated vehicle that accompanied the data center on its journey.
“The most joyful moment of the day was when the data center finally slipped beneath the surface on its slow, carefully scripted journey,” said Cutler. Once the data center made it to the seafloor, the shackles were released, the winch cables were hauled to the surface, and operational control of the Northern Isles passed to the shore station.
Everything learned from the deployment – and operations over the next year and eventual recovery – will allow the researchers to measure their expectations against the reality of operating underwater datacenters in the real world.
Powered by Renewable Energy
The Northern Isles is a chapter in the continuing story of Project Natick, one that tells a tale about researching whether it’s possible to use the existing logistics supply chain to ship and rapidly deploy modular data centers anywhere in the world, even in the roughest patches of sea.
“We know if we can put something in here and it survives, we are good for just about any place we want to go,” said Cutler.
The European Marine Energy Centre is a test site for experimental tidal turbines and wave energy converters that generate electricity from the movement of seawater. Tidal currents there travel up to nine miles per hour at peak intensity and the sea surface regularly roils with 10-foot waves that whip up to more than 60 feet in stormy conditions.
Onshore, wind turbines sprout from farmers’ rolling fields and solar panels adorn roofs of centuries-old homes, generating more than enough electricity to supply the islands’ 10,000 residents with 100 percent renewable energy. A cable from the Orkney Island grid sends electricity to the data center, which requires just under a quarter of a megawatt of power when operating at full capacity.
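The stated figures imply a rough per-server power budget. The back-of-envelope calculation below assumes the full quarter-megawatt is drawn by the 864 servers; in reality, cooling and power-conversion overhead would lower the per-server share.

```python
# Back-of-envelope: per-server power draw for the Northern Isles data center.
# Assumes the entire stated capacity goes to the servers; real overhead
# (cooling, power conversion, networking) would reduce this figure.
total_power_watts = 250_000   # "just under a quarter of a megawatt"
num_servers = 864

watts_per_server = total_power_watts / num_servers
print(f"~{watts_per_server:.0f} W per server")  # ≈ 289 W
```

That is in the ballpark of a densely packed commodity rack server under load, consistent with the description of 12 fully populated racks.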
Colocation with marine renewable energy is a step toward realizing Microsoft’s vision of data centers with their own sustainable power supply, explained Christian Belady, general manager of cloud infrastructure strategy and architecture in Microsoft’s cloud and enterprise division.
Energy self-sufficient data centers, he noted, could be deployed anywhere within reach of a data pipe, bringing Azure cloud services, for example, to regions of the world with unreliable electricity, and eliminating the need for costly backup generators in case of power grid failures.
“Our vision is to be able to deploy compute rapidly anywhere on the planet as needed by our customers,” said Belady, who has long advocated research that explores the marriage of data centers and energy generation to simplify and accelerate the build-out of cloud computing infrastructure.
The Backbone of the Internet
Datacenters are the backbone of the internet, the physical clouds of cloud computing where customers leverage economies of scale to securely store and process data, train machine learning models and run AI algorithms.
Demand for data center resources across the computing industry is growing exponentially as corporations increasingly shift their networks and computing needs to the cloud, and internet-connected intelligent devices ranging from smartphones to robots proliferate.
“When you are in this kind of exponential growth curve, it tells you that most of the datacenters that we’ll ever build we haven’t built yet,” said Cutler, underscoring the need for innovation in the race to build out what is fast becoming a critical piece of 21st-century infrastructure.
The underwater datacenter concept was originally presented in a white paper prepared for a Microsoft event called ThinkWeek that encourages employees to share out-of-the-box ideas. Lee’s group was intrigued. Just 12 months after launching Project Natick in July 2014, the team had deployed a lab-built proof-of-concept prototype in calm, shallow waters off California.
The proof-of-concept vessel operated for 105 days. Encouraged by the results and potential industry impact, the Project Natick team pushed ahead to design, manufacture and test the full-scale module deployed in Scotland. Cutler said the latest version is designed to remain in operation without maintenance for up to five years.
Datacenter and Submarine Synergy
Phase 1 of Project Natick showed the underwater datacenter concept is feasible; Phase 2 is focused on researching whether the concept is logistically, environmentally and economically practical.
At the outset of Phase 2, the Microsoft team knew that scalable manufacture of submarine-like data centers would require outside expertise. That’s why Microsoft chose to work with Naval Group, a 400-year-old France-based company with global expertise in engineering, manufacturing and maintaining military-grade ships and submarines as well as marine energy technologies.
The Microsoft team presented Naval Group with general specifications for the underwater datacenter and let the company take the lead in the design and manufacture of the vessel deployed in Scotland.
“At the first look, we thought there is a big gap between data centers and submarines, but in fact, they have a lot of synergies,” said Eric Papin, senior vice president, chief technical officer and director of innovation for Naval Group.
Submarines, he noted, are essentially big pressure vessels that house complex data management and processing infrastructure for ship management and other systems integrated according to stringent requirements on electricity, volume, weight, thermal balance, and cooling.
Submarine Technology
In fact, Naval Group adapted a heat-exchange process commonly used for cooling submarines to the underwater data center. The system pipes seawater directly through the radiators on the back of each of the 12 server racks and back out into the ocean. Findings from Phase 1 of Project Natick indicate that water from the data center rapidly mixes and dissipates in the surrounding currents.
Spencer Fowers, a senior member of technical staff for Microsoft’s special projects research group, said one key design specification was for the vessel itself to have roughly the dimensions of a standard cargo container used to move supplies on ships, trains, and trucks to optimize the existing logistics supply chain.
Once the data center was bolted shut and all systems checked out in France, the team loaded it onto the back of an 18-wheel truck and drove it to the Orkney Islands, ferry crossings included. In Scotland, the vessel was secured to the ballast-filled triangular base and towed out to sea for deployment from the gantry barge.
“Like any new car, we will kick the tires and run the engine at different speeds to make sure everything works well,” Fowers said. “Then, once we are completely ready to go, we will grab one or two of our clients and hand them over the keys and let them start deploying jobs onto our system.”
Applied Research
The Project Natick team will spend the next 12 months monitoring and recording the performance of the data center, keeping tabs on everything from power consumption and internal humidity levels to sound and temperature levels.
The world’s oceans at depth are consistently cold, offering ready and free access to cooling, which is one of the biggest costs for land-based data centers. Underwater datacenters could also serve as anchor tenants for marine renewable energy such as offshore wind farms or banks of tidal turbines, allowing the two industries to evolve in lockstep.
For now, Project Natick is an applied research project, focused on determining the economic viability of operating containerized datacenters offshore near major population centers to provide cloud computing for a world increasingly dependent on internet connectivity.
“When you go for a moonshot, you might not ever get to the moon,” Lee said. “It is great if you do, but, regardless, you learn a lot and there are unexpected spinoffs along the way. You get Velcro at some point. That is happening in this case. We are learning about disk failures, about rack design, about the mechanical engineering of cooling systems and those things will feed back into our normal data centers.”
Source: SupplyChain247