FM, IT and Data Centres

Published by: quocirca on Jan 04, 2013
Copyright: Attribution Non-commercial


Copyright Quocirca © 2013
Clive Longbottom
Quocirca Ltd
Tel: +44 118 9483360
Email:
FM, IT and Data Centres
Are Facilities and IT data centre managers implacable enemies, or is it just a need for different priorities and emphases on work that seem to get in the way?
January 2013
Often, Quocirca finds that an organisation has one team, facilities management (FM), looking after the physical facility of the data centre, with another, information technology (IT), looking after the servers, storage and network equipment, along with the software running within it. This can lead to problems where priorities clash, or where a lack of a common language or view of a problem can stop things from happening. This report pulls together a series of articles written for SearchDataCenter throughout 2012.
Data centres can be run hotter than previously
A major means of saving energy in a data centre is to use less cooling, and new guidelines mean that a modern data centre can be run considerably warmer than previously. Combining this with other engineered approaches, such as hot and cold aisles, can provide large cost savings. Originally published here.
Facilities and IT must work more closely together
Facilities management (FM) and information technology (IT) teams are too often working in isolation, and this often leads to them working against each other. This must be addressed through combining teams and managing projects according to business priorities. Originally published here.
UPSs are critical components
The uninterruptable power supply (UPS) is no longer just a piece of equipment that is there as insurance. With increasing intelligence built into the system, the procurement and use of a UPS has to be balanced with the needs of IT and the business, and FM and IT must work together on this. Originally published here.
DC power is no silver bullet
The hoary old chestnut of the use of DC power in a data centre keeps coming back. Although all IT componentry runs on DC, a DC-based data centre would be non-standard and expensive. Only those with massive, bespoke systems will find that a DC infrastructure makes economic sense. Originally published here.
There is increasing choice in how data centres can be implemented
The corporate data centre is no longer the only way to think. Co-location and public cloud computing mean that a hybrid environment will be a more general means of providing an IT platform. For the corporate data centre, this does mean that changes will be required: is a scale-out cloud or a modular approach best? Originally published here.
Physical security is just as important as technical security
Far too often, Quocirca has seen the focus being on creating massively secure technical platforms, with little focus on the physical security of facilities and equipment. FM and IT must work together to ensure that all security issues are dealt with, giving higher levels of corporate security. Originally published here.
The data centre is changing. Although co-location, cloud and software as a service (SaaS) are moving some functions outside of the organisation's control, the corporate data centre will remain a critical part of the overall IT platform for most organisations for the foreseeable future. The data centre cannot be seen as being in two parts, the facility and its contents, but must be designed, implemented, operated and maintained as a single, dynamic system. IT and FM have to work together to ensure that this is the case.
The changing face of ASHRAE data center environmental standards
When mainframes ruled IT, the received wisdom was to keep them as cold as possible. Water cooling was the norm, and cryo-cooling through the use of pumped refrigerants was Hollywood's preferred manner of showing supercomputers in use.

As the use of distributed computing spread, the interdependencies between the data centre facility and the computing equipment held within it became more complex. No longer was the main IT "engine" concentrated in one part of the facility: now, lots of smaller engines were spread around. These early tower systems still had to be kept cool, and many an IT manager has had servers collapse through lack of cooling when fans in such tower systems failed and systems management software failed to pick up the failure.

As the need for more compute power grew, rack systems began to replace towers. These standard-sized racks drove the commoditisation of computer equipment into different multiples of height units (1U, 2U, 4U, etc.) within a 19" rack. Such concentration of equipment density made cooling even harder: radial fans gave way to axial fans, which can shift lower volumes of air.

The data centre facility itself became more important. Computer room air conditioning (CRAC) units became the norm, chilling and treating air to ensure that it could cool the equipment as required, without causing condensation through the moisture content of the air being too high, or the growth of dendrites, which could cause electrical shorting, through it being too dry.

However, for many organisations, getting this right was a bit hit-and-miss, as no official guidelines were available as to what environmental conditions should be applied for cooling air within a data centre. To this end, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) produced a document in 2004 laying out a set of best-practice guidelines for the environmental parameters for running a data centre.

In 2004, the design parameters of IT equipment were such that ASHRAE had to be quite prescriptive in its approach, and it also had to deal with predicted growth in equipment densities and thermal output from the equipment. ASHRAE could not depend on predictions of improvements in the thermal and environmental envelopes of future equipment, however, which led to the advised parameters being well within the requirements of equipment launched even soon after the guidelines were produced.

In 2008, ASHRAE updated the guidelines to reflect that the pace of change in IT equipment had led to a different place than was expected: the increasing use of blades and of multi-core CPUs in multi-CPU chassis meant that equipment densities had massively increased, while the chip manufacturers had done much to improve both the thermal performance of their chips and their resiliency through, for example, the selective shutdown of parts of the chips when not in use.

This second set of guidelines put the focus on maintaining high reliability of the equipment in a data centre in the most energy-efficient manner: a change from the 2004 guidelines, which focused solely on reliability. The increasing focus on energy usage within data centres means that measures such as power usage effectiveness (PUE) have become more important, and just maintaining reliability within a data centre without ensuring low energy usage is no longer valid.
