Cool Data

Ben Sampson

Engineers face big challenges in making data centres energy efficient, but a new development in London Docklands is said to be particularly green.

The large building looms over the skyline on the fringe of east London’s Docklands as you approach it. The strange circuit pattern on the building’s façade hints at its contents. Telehouse North Two is a data centre and represents the latest of a new type of critical infrastructure. Without such data centres, modern life would be impossible for millions of people.

North Two is the conduit for the UK’s incoming internet traffic from all around the world. It’s also a little ostentatious compared to other data centres. Most are massive one- or two-storey anonymous grey slabs of building. But North Two goes up 11 floors and has something to shout about. Its owner, ultimately Japanese telecoms multinational KDDI, claims it is one of the most advanced and connected data centres in the world, if not the most.

It makes use of the latest sustainable technology and has been purpose-built to meet the power and chilling requirements of computers.

Andrew Dewing, technical director at Telehouse, says that the key aspect of the building has been to future-proof it. “We don’t know what IT will look like in five or ten years, so we’ve engineered in flexibility,” he says. “All our systems are redundant; there is no single point of failure.

“This is the first multi-floor data centre to feature vertical indirect adiabatic and evaporative cooling systems. This is one of the greenest data centres in the world.”

North Two’s power usage effectiveness (PUE) rating is 1.16. The PUE indicates how much power it takes to run the building and its IT equipment – the closer to 1.00 the PUE is, the more effective and efficient the IT operation. The IT equipment the £135 million building will house is so heavy that the foundations have had to be dug as deep as the Shard’s.
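The PUE figure quoted above is simple arithmetic: total facility power divided by the power reaching the IT equipment. The sketch below uses illustrative numbers, not Telehouse’s actual meter readings.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power.

    A PUE of 1.00 would mean every watt drawn by the building reaches
    the IT equipment; anything above 1.00 is cooling, lighting and
    other overhead.
    """
    return total_facility_kw / it_equipment_kw


# Illustrative figures: at a PUE of 1.16, a site drawing 1,160 kW
# overall is delivering 1,000 kW to its IT equipment.
print(round(pue(1160, 1000), 2))  # 1.16
```

In other words, at North Two’s rating roughly 16% of the power drawn goes on overhead such as cooling, rather than computing.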

The opening of North Two last August kicked off what was a very busy second half of 2016 for data centres. Microsoft launched data centres in London, Cardiff and Durham and announced the Ministry of Defence as a client. IBM announced that it was tripling the number of data centres it runs in the UK to six. Then Amazon Web Services said it was opening its first UK data centre in London. The increased activity is partly to meet EU data protection regulations about the location of held data that come into force in 2018, but is also a clear indication of the growing importance of data infrastructure.

Steven Hone, chief executive of the trade association Data Center Alliance, compares data centres to hotels. “You rent rooms in a hotel by the day,” he says. “It’s the same in a data centre – you’re renting a space for a time, with the services provided, like reliable power, security and connectivity, instead of maid service.”

Data centres can also be seen as very big server and communications rooms, which have been around since the early computers in the 1960s. During the early 1980s, IT and communications started to become increasingly important to businesses. As the number of computers increased, the business model of data centres also evolved, from just providing a place for other people’s computers, to providing managed services and the IT hardware inside the data centres.

Then, in the early 2000s, companies began to also provide the software that runs on the hardware inside the data centres, so-called “software as a service”, “platform as a service” or “infrastructure as a service”. Confronted with a myriad of networking services, such as Virtual Private Networks and Voice Over IP, and the delivery of software from word processing to 3D design programs, a buzzword was needed to describe them all. The term digital cloud was coined.

Today massive providers such as Microsoft, Amazon, Google and IBM dominate the market for cloud services. These companies supply services such as storage of photographs, virtual servers for businesses, and software as a service, provided by either their own data centres or “colos”, co-location centres such as Telehouse North Two, where space is shared with other providers.

Since more and more of our everyday activities are conducted online, data centres have become “the fourth utility”. Some data centres are critical infrastructure, with government and emergency services using the cloud. Along with their growing importance, the data centres have become increasingly complex, so that the provision of cloud services can be guaranteed.

“As our reliance on digital services grows, you need to make sure that what you are delivering is reliable,” says Hone. “Data centre sophistication will continue to grow. Engineers from across the industry are having to work together more to design and build data centres.

“But they won’t necessarily get bigger. In the future we will see smaller centres, strategically located next to sustainable power, and equipped with better technology for reliability and cooling.”

The basic engineering challenge with data centres is that you put energy in and get heat out. The problem lies in minimising the heat generated, removing what remains, and doing both as energy efficiently as possible.

So there is increasing pressure from regulators for data centres to become more energy efficient. But Hone seeks to dispel the myth that they are “energy-sucking monsters”. “Data centres house people’s and businesses’ private IT equipment,” he says. “If we didn’t have them, there would be tens of thousands of smaller server and communications rooms.

“A data centre might consume 10MW but in that one building is the equipment for 50,000 businesses. A sustainable centre will be 10 times more efficient than lots of little centres – if it didn’t exist, the power required for those 50,000 businesses’ equipment may be 50MW.”

Initially the IT equipment within data centres was the same as is used in offices, with cooling achieved using air. However, the increasingly power-hungry processors and the corresponding drive for efficiency mean that engineers are being forced to devise innovative technologies.

The average power density per rack in a data centre now needs to be at least 7kW. Hone says there are limits to the amount of cooling that can be achieved with air, usually up to 12kW per rack. Engineers are turning to liquid cooling for racks whose power requirements rise beyond that.
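The trade-off described above can be sketched as a simple threshold check. The 12kW ceiling for air cooling is taken from Hone’s figure; the function name and structure are illustrative, not any vendor’s actual sizing tool.

```python
# Practical upper bound for air cooling per rack, per the figure
# cited in the article (roughly 12 kW).
AIR_COOLING_LIMIT_KW = 12.0


def cooling_method(rack_power_kw: float) -> str:
    """Pick a cooling approach for a rack of a given power density."""
    if rack_power_kw <= AIR_COOLING_LIMIT_KW:
        return "air"
    return "liquid"


print(cooling_method(7.0))   # air  (today's typical average rack density)
print(cooling_method(20.0))  # liquid
```

A real design would weigh many more factors, such as airflow containment, ambient conditions and redundancy, but the basic decision point is the same: beyond roughly 12kW per rack, air alone struggles.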

To help solve the challenges, the sector is seeking engineering talent. The Data Center Alliance is working with the government on an initiative to raise awareness of data centres among students. “There’s a big skills gap and the opportunities are vast,” says Hone.

But a barrier to recruitment is that most people are uninterested in how their Facebook page is updated. Government also doesn’t realise how critical data infrastructure has become. “Politicians have sleepwalked into something we’ve become reliant on,” says Hone.

Perhaps one day all the mechanical and electrical services will be administered by an artificially intelligent computer, removing the most unpredictable element, humans, from data centres and the skills gap will vanish. What strange pattern will the computers bother to put on the outside of their buildings when that happens?

Professional Engineering magazine
