How Do You Cool a Data Center?
When we access and use today’s online world it’s easy to assume that all of our searching, uploading, downloading, liking, streaming and purchasing is happening in the ether, in some form of virtual world or “cloud” that exists in the air around us.
In truth, all of those actions rely on huge physical support infrastructures; vast built assets that are delivered by the architecture, engineering and construction (AEC) industries.
The scale and performance demands of such facilities are ever increasing as more of the world’s population gains internet access and as more services are provided online. The sheer volumes of data produced by the world’s population are also a major factor. We’re set to generate more data in 2017 than in the previous 5,000 years of human existence.
Historically, data centers have been constructed in remote locations where land tends to be cheaper, and for the passive security benefits that such locations offer. Remote positioning also allows for easy future expansion where necessary.
Above: Google's data center at Mayes County, Oklahoma in the United States (image courtesy of Google).
Placing so much electrical equipment under one roof of course generates significant amounts of heat. Data centers are typically cooled by mechanical plant; that is, air-conditioning units or electrically driven fans. Running this cooling equipment continuously is costly, consumes large amounts of energy and carries a significant environmental impact.
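One common way the industry quantifies this cooling overhead is Power Usage Effectiveness (PUE): total facility power divided by the power drawn by the IT equipment alone. The metric and the figures below are illustrative assumptions for this sketch, not values from the article:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.

    1.0 would mean every watt goes to the servers; legacy
    air-conditioned sites often sit well above that.
    """
    return total_facility_kw / it_equipment_kw

# Assumed, illustrative loads: a 1,000 kW IT load with 600 kW of
# cooling and overhead, versus a free-air-cooled site adding only 100 kW.
legacy_pue = pue(1600, 1000)    # 1.6
free_air_pue = pue(1100, 1000)  # 1.1

# Annual overhead energy saved by the more efficient design (kWh):
saving_kwh = (legacy_pue - free_air_pue) * 1000 * 24 * 365
print(round(legacy_pue, 2), round(free_air_pue, 2), int(saving_kwh))
```

Even a modest PUE improvement, sustained around the clock, compounds into millions of kilowatt-hours per year, which is why operators pursue free-air and water cooling.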
Above: Electrical servers can generate considerable amounts of heat (image courtesy of Google).
The ever growing pressure that our internet usage places on data centers has been partially countered by some improvements in server efficiency and advancements in cooling systems, but these don’t outweigh the rising demands. Several technology giants and professional teams are now considering alternative design concepts.
Facebook recently opened a new data center at Luleå in Northern Sweden, just outside the Arctic Circle.
The building uses fans to draw in the surrounding Arctic air and cool its servers. Whilst these fans still require electric power, the approach is more efficient and environmentally friendly than the air conditioning systems found in older facilities and moves Facebook closer to their aim of running all their data centers on 50% clean and renewable energy by 2018.
Above: Facebook's data center in Luleå is cooled by fans that draw in the surrounding Arctic air (image courtesy of Mark Zuckerberg / Facebook).
The Arctic location has energy benefits too. Historically, Sweden built a number of hydroelectric dams for its steel, iron ore and paper industries. These have since declined, leaving the northern parts of the country with a power surplus.
Microsoft are also looking at innovative cooling techniques, exploring the feasibility of locating data centers on the ocean floor.
The firm tested the concept off the coast of California in 2015 and 2016 in a scheme code-named Project Natick. They placed a server and cooling system within a watertight steel shell that was then positioned on the sea floor and connected back to land via a cable. Heat exchangers on the shell’s exterior used the surrounding cold ocean water to cool the server and maintain an optimum operating temperature.
Above: Microsoft placed a server within a watertight steel shell - a "data center" that would then be placed on the sea floor. Below: The data center returning from its trial deployment (images courtesy of Microsoft).
With an initial trial complete, Microsoft are working to develop their concept into a practical solution that can be delivered at scale.
Whilst the prototype capsule can be submerged for 5 years, the team are targeting a 20 year deployment. They are also investigating tidal power as a more sustainable means of generating electricity, as the land-based power grid was used during the trials.
Using the ocean to cool data centers has other benefits too. Half the world’s population lives within 200km of the sea, potentially reducing latency; that is, the time it takes for data to travel from its source to its destination.
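The distance-to-latency relationship can be sketched with a simple propagation model. This is a best-case estimate only (real latency also includes routing, switching and server time), and the refractive index used is a typical assumed value for silica fibre:

```python
# Best-case signal propagation delay through optical fibre.
C_VACUUM_KM_S = 299_792          # speed of light in vacuum, km/s
FIBRE_REFRACTIVE_INDEX = 1.47    # assumed typical value for silica fibre

def round_trip_ms(distance_km: float) -> float:
    """One-way distance to a best-case round-trip propagation delay in ms."""
    speed_km_s = C_VACUUM_KM_S / FIBRE_REFRACTIVE_INDEX  # ~204,000 km/s
    return 2 * distance_km / speed_km_s * 1000

# A user 200 km from a coastal data center vs. one 2,000 km inland:
print(f"{round_trip_ms(200):.2f} ms")   # roughly 2 ms
print(f"{round_trip_ms(2000):.2f} ms")  # roughly 20 ms
```

Shaving even a few milliseconds matters at the scale of billions of requests, which is part of the appeal of siting capacity close to coastal populations.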
As internet usage and data generation continues to increase, the built assets that the architecture, engineering and construction industries deliver to support it must continue to innovate. Some of the solutions explored here offer sustainable approaches that could well be the shape of things to come.
Above: Cooling plant at Google's data center in Saint Ghislain, Belgium (image courtesy of Google).
This video was kindly powered by Viewpoint.
Images courtesy of Facebook, Google, Mark Zuckerberg and Microsoft. Footage courtesy of Microsoft. We welcome you sharing our content to inspire others, but please be nice and play by our rules.