Innovations in data centre cooling could herald a more efficient and greener future, says Roel Castelein of The Green Grid…

When considering the items on a CIO’s annual budget, few people would think of the price of cooling the organisation’s data centre as a significant factor on the ledger. However, this seemingly esoteric item can make up a large business expenditure – in many companies, cooling costs more than powering and operating the actual IT equipment that the company uses. This puts data centre cooling firmly on the agenda for the CFO and the CIO as both an area where technology can work more efficiently and a place where savings can be made.

However, this is also an issue for anyone concerned with the sustainability of our digital lifestyles. Almost everything we do on a screen, and quite a few of the things we do off one, involve a data centre somewhere – from sending an email, to paying for our shopping, to using social media, to streaming music and video.

Technology is a neutral force, and we can choose how to apply its potential

These data volumes are set to keep increasing at an incredible pace with the advance of data-rich and data-hungry technologies such as the Internet of Things, Virtual and Augmented Reality, and Artificial Intelligence. Data centres now account for around 3% of the world’s energy consumption, putting their greenhouse gas emissions on a similar level to those of the aviation industry. The global data centre market is forecast to grow at a Compound Annual Growth Rate (CAGR) of 11% between 2016 and 2020 – a rate that could well prove higher – meaning these figures will only escalate.

But new technologies need not be a precursor of environmental degradation. Technology is a neutral force, and we can choose how to apply its potential – including to make the data centre more sustainable. Data centres are complicated: equipment, building layout, and the surrounding environment can interact in complex ways. The clever application of technology can cut through that complexity and improve data centre cooling and energy efficiency.

AI applications

Google has taken a pioneering approach to the problem of data centre efficiency. It let loose its DeepMind AI on its data centres, trusting that the same technology that managed to beat a human champion at the notoriously abstract board game of Go could find ways to make its facilities more efficient.

DeepMind researchers collaborated with Google’s data centre team to train deep neural networks on vast swathes of historical data from the data centre, such as power draw, temperatures and cooling set points. By unleashing the networks on approximately 120 variables, covering everything from cooling systems to windows, the AI was able to calculate the most efficient cooling strategies from the data it was fed. The result: a 40% reduction in the energy expended on cooling the data centre – a noteworthy outcome.

AI can currently seem elusive and expensive to many. However, as the technology and its associated techniques become more commonly used, they should also become more accessible to teams that have neither Google’s scale of expertise nor its war chest. Given the complexity of the cooling challenge, other data centre operators would be well advised to keep an eye on AI’s potential to analyse every element and find optimisations that a human could not identify.

Seabed servers

Other organisations have side-stepped intricate technological solutions to the cooling issue and opted for more direct approaches. As operators of Nordic data centres have long known, the environment in which a data centre is situated influences how easy it is to keep all its machinery at a workable temperature.

Microsoft took this to the next level by putting data centre servers underwater off the coast of California. Project Natick, an attempt to work out whether underwater data centres could be feasible, involved sinking 38,000 pounds of servers into the deep sea. Deep-sea temperatures can average between 0 and 6 degrees Celsius, meaning an underwater data centre requires far less energy for cooling.

With some resourcefulness and nous, you don’t necessarily have to be a big-spender to increase the efficiency of your data centre

There is also a way to address the sustainability element of cooling without changing anything inside the data centre: tackle the problem at source. If a data centre can use renewable energy to power its cooling systems, it avoids the environmental damage associated with fossil fuels. This is not always possible, but operators who struggle to move entirely to renewables can still source part of their energy from a renewable secondary supply.

Those building new data centre infrastructure should also bear in mind that locating facilities near renewable resources, such as dams or windy areas, allows them to tap into that energy relatively cheaply – potentially more cheaply than using fossil fuels.

Plugging the gap

However, with some resourcefulness and data centre nous, you don’t necessarily have to be a big spender to increase the efficiency of your data centre cooling, and thereby save money and do less harm to the environment. Most legacy data centres could benefit from plugging leaks with something as simple as duct tape. Just like a refrigerator, a data centre’s cooling system works best when its air flows are controlled and unmixed. Another simple fix is to install blanking panels that seal off unused rack units. Both measures prevent hot and cold air flows from mixing, reducing the cooling load and, with it, energy costs.

Cooling is a key chokepoint in data centre efficiency – a place where significant cost savings and sustainability gains can be made through the judicious application of clever ideas and analysis. Whilst many of these ideas require a not insignificant upfront investment, there is huge ground to be gained, both in reducing the month-by-month OPEX of cooling the data centre and in making the industry as a whole more sustainable for our digital present and future.