Wednesday, April 22, 2015

Inside a multi-million dollar datacentre (TechRepublic)



The hub

Behind most apps and cloud services lies a datacentre - but just what goes into building one of these hubs of the online world?
Equinix opened the doors to its new $145m (£98m) facility in Slough, England last week - a multi-storey glass and steel block that will initially pack tens of thousands of servers in a 236,000 square foot hall.
Equinix has invested more than $7bn over the past 15 years in the datacentres that make up its International Business Exchange - recently opening new centres in New York, Toronto, Melbourne, Shanghai and Singapore. The LD6 datacentre is the sixth that Equinix has launched in the London area and one of more than 100 it runs in 33 markets worldwide.

Inside the data hall

Initially LD6 can house 1,385 server cabinets in a 236,000 square foot data hall, with space for a further 1,385 cabinets to be made available at a later date.
The hall offers shared colocation space, and each customer will usually deploy their own machines. Those that require a higher-security environment will sit within a cage.
Equinix hasn't named the customers lined up to use the centre, but said financial services firms, network service providers and cloud service providers have expressed interest.
To maximise the space available for computing infrastructure, the building is designed to need support pillars only down the middle of the hall.
The hall can support machine power densities of up to 1.5kW per square metre and cooling of up to 12kVA per cabinet.
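As a rough back-of-the-envelope sketch (assuming, unrealistically, that the quoted density applied uniformly across the whole hall), that figure implies the following theoretical ceiling:

    # Back-of-the-envelope: theoretical IT load if the whole hall ran at
    # the quoted density. Real halls are never uniformly loaded.
    SQ_FT_PER_SQ_M = 10.7639

    hall_sq_ft = 236_000
    density_kw_per_sq_m = 1.5

    hall_sq_m = hall_sq_ft / SQ_FT_PER_SQ_M        # ~21,925 m^2
    max_load_kw = hall_sq_m * density_kw_per_sq_m  # ~32,900 kW

    print(f"Hall area: {hall_sq_m:,.0f} m^2")
    print(f"Theoretical max IT load: {max_load_kw / 1000:,.1f} MW")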

Purpose built

The centre is monitored 24/7 and has biometric controls and a mantrap to prevent unauthorised access.
A building management system polls the infrastructure every 15 seconds for temperature and humidity readings and reports back to the on-site monitoring centre.
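As an illustration of what such a polling cycle looks like (a minimal sketch only; the sensor interface and reporting function here are hypothetical stand-ins, not Equinix's actual building management system):

    import time

    POLL_INTERVAL_S = 15  # the polling period quoted in the article

    def read_sensor(sensor_id):
        # Hypothetical stand-in for a real BMS sensor query; a real
        # system would speak BACnet, Modbus or similar here.
        return {"sensor": sensor_id, "temp_c": 21.4, "humidity_pct": 45.0}

    def report(readings):
        # Hypothetical stand-in for pushing data to the monitoring centre.
        for r in readings:
            print(f"{r['sensor']}: {r['temp_c']}C, {r['humidity_pct']}% RH")

    def poll_loop(sensor_ids):
        # Poll every sensor on a fixed 15-second cycle and report results.
        while True:
            report([read_sensor(s) for s in sensor_ids])
            time.sleep(POLL_INTERVAL_S)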
The building has been designed to provide the optimal width for air ducts for the fresh air cooling system.
"In effect we started with the cooling technology and then designed the building around it," said Equinix UK managing director Russell Poole.
A colocation space of the size offered by LD6 took about two-and-a-half years to fill at another of Equinix's datacentres.

Data from above

Data cables run overhead, while power and cooling are provided underfloor.
Fibre cross-connects run in the yellow tray, with copper cross-connects above, and both will feed the server cabinets below.

A breath of fresh air

Much of the building is devoted to the machinery that keeps the servers powered on, supplied with data and properly cooled.
The building minimises energy use by cooling servers using fresh, rather than artificially chilled, air.
Air is drawn from outside and passed over heat exchangers that cool hot air pulled from the server cabinets. The cooled air is piped back to the data halls and into the cabinets through floor vents, and the process begins again. The system helps keep the server room at a maximum of 22C, and the two air flows never mix, preventing contamination.

Liquid refreshment

This pipe running up the side of the building takes water from a borehole drilled to a depth of 350 metres.

When the temperature rises over 20C, water is sprayed onto the heat exchangers and evaporates, removing additional heat from the air circulating through the facility. If the temperature hits 30C, a chilled water system can be used to extract even more heat.
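Taken together, the quoted thresholds amount to a simple staged escalation. The sketch below illustrates that logic only; the mode names and function are invented for the example rather than taken from Equinix's control system.

    def cooling_mode(outside_temp_c):
        # Pick a cooling stage from the outside air temperature, using
        # the thresholds quoted in the article: fresh air alone up to
        # 20C, evaporative spray above 20C, chilled water from 30C.
        if outside_temp_c >= 30:
            return "chilled-water"      # mechanical cooling assists
        if outside_temp_c > 20:
            return "evaporative-spray"  # water sprayed on heat exchangers
        return "fresh-air"              # outside air alone is enough

    for t in (12, 24, 31):
        print(f"{t}C outside -> {cooling_mode(t)}")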

Piped in

By minimising the energy needed to cool the server halls, Equinix can run the building with a Power Usage Effectiveness (PUE) rating of just 1.2 - ahead of the industry average of 1.7. PUE is the ratio of the total energy consumed by a datacentre to the energy delivered to the IT equipment itself: the closer the figure is to 1.0, the less energy is being spent on cooling and other supporting infrastructure.
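A quick worked comparison (illustrative numbers only) shows what the gap between 1.2 and 1.7 means in practice:

    def pue(total_kwh, it_kwh):
        # Power Usage Effectiveness: total facility energy / IT energy.
        return total_kwh / it_kwh

    # For every 1 kWh delivered to the servers...
    it_kwh = 1.0
    for total_kwh in (1.2, 1.7):
        print(f"PUE {pue(total_kwh, it_kwh):.1f}: "
              f"{total_kwh - it_kwh:.1f} kWh goes to cooling and other "
              f"infrastructure per kWh of IT load")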

The LD6 datacentre is Leadership in Energy and Environmental Design gold-accredited.

Redundant power

Each server is powered by two mains supplies, so that if one fails the other can take over.
Each power line is backed by two uninterruptible power supplies, able to support the full load of the datacentre for up to eight minutes.
That eight minutes should provide enough time for the centre's diesel generators to begin producing power. By the time the centre is complete there should be 32 generators capable of running the centre for 36 hours.
Power to the building's cooling, fire suppression and security systems is also backed up by a UPS.
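Put together, the failover sequence reads: the UPS batteries pick up the load instantly, and the generators must come online within the eight-minute battery window. A toy sketch of that timeline follows; only the eight-minute and 36-hour figures come from the article, and the generator start-up time is an assumption for illustration.

    UPS_RUNTIME_MIN = 8           # quoted battery autonomy at full load
    GENERATOR_START_MIN = 2       # assumed start-up time, not from the article
    GENERATOR_FUEL_HOURS = 36     # quoted generator fuel autonomy

    # The design only works if the generators start inside the battery window.
    assert GENERATOR_START_MIN < UPS_RUNTIME_MIN

    def power_source(minutes_since_outage):
        # Which source carries the load at a given time after mains loss.
        if minutes_since_outage < GENERATOR_START_MIN:
            return "UPS batteries"
        if minutes_since_outage < GENERATOR_FUEL_HOURS * 60:
            return "diesel generators"
        return "none (refuelling required)"

    for t in (0, 5, 60):
        print(f"t+{t} min: {power_source(t)}")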
