ASHRAE Datacom Series, Book 1
Fifth Edition
Thermal Guidelines for Data Processing Environments is authored by ASHRAE Tech-
nical Committee (TC) 9.9, Mission Critical Facilities, Technology Spaces and Electronic
Equipment. ASHRAE TC 9.9 is composed of a wide range of industry representatives,
including but not limited to equipment manufacturers, consulting engineers, data center
operators, academia, testing laboratories, and government officials who are all committed
to increasing and sharing the body of knowledge related to data centers.
ASHRAE, Peachtree Corners, GA
ISBN 978-1-947192-64-5 (paperback)
ISBN 978-1-947192-65-2 (PDF)
© 2004, 2008, 2012, 2015, 2021 ASHRAE. All rights reserved.
ASHRAE has compiled this publication with care, but ASHRAE has not investigated, and
ASHRAE expressly disclaims any duty to investigate, any product, service, process, proce-
dure, design, or the like that may be described herein. The appearance of any technical data
or editorial material in this publication does not constitute endorsement, warranty, or guar-
anty by ASHRAE of any product, service, process, procedure, design, or the like. ASHRAE
does not warrant that the information in the publication is free of errors, and ASHRAE does
not necessarily agree with any statement or opinion in this publication. The entire risk of the
use of any information in this publication is assumed by the user.
ASHRAE STAFF
SPECIAL PUBLICATIONS Cindy Sheffield Michaels, Editor
James Madison Walker, Managing Editor of Standards
Lauren Ramsdell, Associate Editor
Mary Bolton, Assistant Editor
Michshell Phillips, Senior Editorial Coordinator
PUBLISHING SERVICES David Soltis, Group Manager of Publishing Services
Jayne Jackson, Publication Traffic Administrator
DIRECTOR OF PUBLICATIONS
AND EDUCATION Mark S. Owen
Contents
Preface to the Fifth Edition. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
Acknowledgments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi
Chapter 1—Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Book Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Primary Users of This Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.3 Adoption . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Although there are reasons to consider the impact of equipment outlet temperature
on the hot aisle, outlet temperature does not affect the reliability or performance of
the ITE. Also, each manufacturer balances design and performance requirements
when determining its equipment's design temperature rise. Data center operators
should understand the equipment inlet temperature distribution throughout their
data centers and take steps to monitor these conditions. A facility designed to
maximize efficiency by aggressively applying new operating ranges and techniques
will require a complex, multivariable optimization performed by an experienced
data center architect.
Although the vast majority of data centers are air cooled at the IT load, liquid
cooling is becoming more commonplace and likely will be adopted to a greater
extent due to its enhanced operational efficiency, potential for increased density, and
opportunity for heat recovery. Consequently, the fourth and fifth editions of Thermal
Guidelines for Data Processing Environments include definitions of liquid-cooled
environmental classes and descriptions of their applications. Even a primarily
liquid-cooled data center may have air-cooled IT within. As a result, a combination
of air-cooled and liquid-cooled classes will typically be specified for a given data
center.
Acknowledgments
ASHRAE Technical Committee (TC) 9.9 would like to thank the following
members of the IT subcommittee for their groundbreaking work and willingness to
share in order to further the understanding of the entire data center industry and for
their active participation, including conference calls, writing/editing, and reviews:
Dustin Demetriou (IBM), Dave Moss (Dell), Mark Steinke (AMD), Roger Schmidt
(IBM, retired), and Robin Steinbrecher (Intel, retired). Thanks also to Roger
Schmidt for leading the effort on updating this fifth edition.
A special thanks is due to Syracuse University Mechanical and Aerospace Engi-
neering Department and the leadership of Professor Jianshun Zhang and his team,
including PhD student Rui Zhang, for carrying out the research to investigate the
effect of high humidity and gaseous pollutants on information technology equip-
ment (ITE). The result of this work was the primary reason for this fifth edition.
ASHRAE TC 9.9 also wishes to thank the following people for their construc-
tive comments on the draft of this edition: Jason Matteson (Isotope), Jon Fitch
(Midas Green Technologies), John Gross (J. M. Gross Engineering, LLC), Dave
Kelley (Vertiv, retired), Ecton English, Gerardo Alfonso (Ingeal), and Vali Sorell
(Microsoft).
Finally, special thanks to Neil Chauhan of DLB Associates for creating a consis-
tent set of graphics for this updated edition.
1
Introduction
Over the years, the power density of electronic equipment has steadily
increased. In addition, the mission-critical nature of computing has sensitized busi-
nesses to the health of their data centers. The combination of these effects makes it
obvious that better alignment is needed between equipment manufacturers and facil-
ity operations personnel to ensure proper and fault-tolerant operation within data
centers.
This need was recognized by an industry consortium in 1999 that began a grass-
roots effort to provide a power density road map and to work toward standardizing
power and cooling of the equipment for seamless integration into a data center. The
Industry Thermal Management Consortium produced the first projection of heat
density trends. The IT Subcommittee of ASHRAE Technical Committee (TC) 9.9
is the successor of that industry consortium. An updated set of power trend charts
was published in IT Equipment Power Trends, Third Edition (ASHRAE 2018b).
These updated equipment power trends extend to 2025.
The objective of Thermal Guidelines for Data Processing Environments, Fifth
Edition, is to do the following:
standard ETSI EN 300 019-1-3 (2014), which is referenced when there is a compar-
ison between data centers and telecom rooms. The comparison is important because
some convergence between these environments may occur in the future.
1.3 ADOPTION
It is the hope of ASHRAE TC 9.9 that many equipment manufacturers and facil-
ities managers will follow the guidance provided in this book. Data center facilities
managers can be confident that these guidelines have been produced by IT manu-
facturers.
Manufacturers can self-certify that specific models of equipment operate as
intended in data processing air-cooling environmental classes A1, A2, A3, A4, and
H1 and the liquid-cooling environmental classes W17 through W+.
1.4 DEFINITIONS
air:
conditioned air: air treated to control its temperature, relative humidity, purity,
pressure, and movement.
supply air: air entering a space from an air-conditioning, heating, or ventilating
apparatus.
annual failure rate (AFR): average number of failures per year.
availability: a percentage value representing the degree to which a system or compo-
nent is operational and accessible when required for use.
basic input/output system (BIOS): set of computer instructions in firmware that
control input and output operations.
cabinet: frame for housing electronic equipment that is enclosed by doors and is
stand-alone; this is generally found with high-end servers.
computer room: a room or portion of a building serving an ITE load less than or
equal to 10 kW or 215 W/m2 (20 W/ft2) of conditioned floor area.
humidity:
absolute humidity: the mass of water vapor in a specific volume of a mixture
of water vapor and dry air.
humidity ratio: the ratio of the mass of water vapor to the mass of dry air in a moist
air sample; it is usually expressed as grams of water per kilogram of dry air (gw/kgda)
or as pounds of water per pound of dry air (lbw/lbda).
relative humidity (RH):
a. Ratio of the partial pressure or density of water vapor to the saturation pres-
sure or density, respectively, at the same dry-bulb temperature and baro-
metric pressure of the ambient air.
b. Ratio of the mole fraction of water vapor to the mole fraction of water
vapor saturated at the same temperature and barometric pressure; at
100% rh, the dry-bulb, wet-bulb, and dew-point temperatures are equal.
information technology (IT): the study or use of systems (especially computers and
telecommunications) for storing, retrieving, and sending information.
information technology equipment (ITE): devices or systems that use digital tech-
niques for purposes such as data processing and computation.
information technology original equipment manufacturer (IT OEM): tradition-
ally, a company whose goods are used as components in the products of another
company, which then sells the finished item to users.
IT space: a space dedicated primarily to computers and servers but with environ-
mental and support requirements typically less stringent than those of a data center.
liquid cooled: cases where liquid must be circulated to and from the electronics
within the ITE for cooling with no other form of heat transfer.
mean time between failures (MTBF): the average time between system breakdowns.
power:
measured power: the heat release in watts, as defined in Chapter 6, Section 6.1,
“Providing Heat Release and Airflow Values.”
nameplate rating: term used for rating according to nameplate (IEC 60950-1,
under clause 1.7.1: “Equipment shall be provided with a power rating marking,
the purpose of which is to specify a supply of correct voltage and frequency, and
of adequate current-carrying capacity” [IEC 2005]).
rated current: “The input current of the equipment as declared by the manu-
facturer” (IEC 2005); the rated current is the absolute maximum current that is
required by the unit from an electrical branch circuit.
rated frequency: the supply frequency as declared by the manufacturer.
rated frequency range: the supply frequency range as declared by the manu-
facturer, expressed by its lower- and upper-rated frequencies.
rated voltage: the supply voltage as declared by the manufacturer.
rated voltage range: the supply voltage range as declared by the manufacturer.
power usage effectiveness (PUE™): the ratio of the total amount of energy used by
a computer data center facility to the energy delivered to the computer equipment.
See PUE™: A Comprehensive Examination of the Metric (ASHRAE 2014c) for
more information.
printed circuit board (PCB): an electronic circuit consisting of thin strips of a
conducting material such as copper that have been etched from a layer fixed to a flat
insulating sheet and to which integrated circuits and other components are attached.
rack: frame for housing electronic equipment.
rack-mounted equipment: equipment that is to be mounted in an Electronic Industries
Alliance (EIA) or similar cabinet; these systems are generally specified in EIA units,
such as 1U, 2U, 3U, where 1U = 44 mm (1.75 in.).
reliability: percentage value representing the probability that a piece of equipment
or system will be operable throughout its mission duration; values of 99.9% (“three
nines”) and higher are common in data and communications equipment areas. For
individual components, reliability is often determined through testing; for assem-
blies and systems, reliability is often the result of a mathematical evaluation based
on the reliability of individual components and any redundancy or diversity that may
be used.
room load capacity: the point at which the equipment heat load in the room no longer
allows the equipment to run within the specified temperature requirements of the
equipment; Chapter 4 defines where these temperatures are measured. The load
capacity is influenced by many factors, the primary factor being the room theoretical
capacity; other factors, such as the layout of the room and load distribution, also
influence the room load capacity.
room theoretical capacity: the capacity of the room based on the mechanical room
equipment capacity; this is the sensible capacity in kilowatts (tons) of the mechan-
ical room for supporting the computer or telecom room heat loads.
stock keeping unit (SKU): an identifier for one specific product available for sale.
If a hardware device or software package comes in different versions, there is an
SKU for each one.
temperature:
dew-point temperature: the temperature at which water vapor has reached the
saturation point (100% rh).
dry-bulb temperature: the temperature of air indicated by a thermometer.
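To make the humidity definitions above concrete, the following minimal sketch (an editorial illustration, not part of the guideline text) converts dry-bulb temperature and RH to dew point using the Magnus approximation; the coefficient pair used is a common published choice, not a value taken from this book.

```python
# Minimal psychrometric sketch using the Magnus approximation.
# The coefficient pair (17.62, 243.12) is a widely used published choice,
# not a value from this book; accuracy is on the order of 0.1°C over
# ordinary data center conditions.
import math

B, C = 17.62, 243.12  # Magnus coefficients (°C-based)

def dew_point(t_db_c: float, rh_pct: float) -> float:
    """Dew-point temperature (°C) from dry-bulb temperature (°C) and RH (%)."""
    gamma = math.log(rh_pct / 100.0) + B * t_db_c / (C + t_db_c)
    return C * gamma / (B - gamma)

if __name__ == "__main__":
    # At 27°C and 70% rh the dew point is about 21°C, which is why a
    # separate dew-point cap (e.g., 15°C DP) can govern before an RH cap.
    print(f"{dew_point(27.0, 70.0):.1f} °C")
```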
2
Environmental Guidelines for Air-Cooled Equipment
2.1 BACKGROUND
TC 9.9 created the original publication Thermal Guidelines for Data Processing
Environments in 2004 (ASHRAE 2004). At the time, the most important goal was
to create a common set of environmental guidelines that ITE would be designed to
meet. Although computing efficiency was important, performance and availability
took precedence. Temperature and humidity limits were set accordingly. In the first
decade of the twenty-first century, increased emphasis was placed on computing
efficiency. Power usage effectiveness (PUE™) has become the metric by
which to measure the effect of design and operation on data center efficiency
(ASHRAE 2014c). To improve PUE, free-cooling techniques, such as air- and water-
side economization, have become more commonplace with a push to use them year
round. To enable improved PUE capability, TC 9.9 created additional environmental
classes, along with guidance on the use of the existing and new classes. Expanding
the capability of ITE to meet wider environmental requirements can change the
equipment’s reliability, power consumption, and performance capabilities; this fifth
edition of the book provides information on how these capabilities are affected.
In the second edition of Thermal Guidelines (ASHRAE 2008), the recom-
mended envelope was expanded along with guidance for data center operators on
maintaining high reliability and also operating their data centers in the most energy-
efficient manner. This expanded envelope was created for general use across all
types of businesses and conditions. However, different environmental envelopes
may be more appropriate for different business values and climate conditions.
Therefore, to allow for the potential to operate a data center in a different envelope
that might provide even greater energy savings, the third edition provided general
guidance on server metrics that assisted data center operators in creating an operat-
ing envelope that matched their business values. Each of these metrics is described
in this book. Using these guidelines, the user should be able to determine what envi-
ronmental conditions best meet their technical and business needs. Any choice
outside the recommended envelope is a balance between additional cooling-system
energy savings and possible deleterious effects on reliability, acoustics, or
performance.
In this fifth edition of the book, more enhancements to the recommended enve-
lope were made to aid in data center energy improvements. While the fourth edition
focused on modifying the recommended envelope based on low-humidity research,
the changes to this fifth edition are primarily a result of the ASHRAE-funded
research project RP-1755 (Zhang et al. 2019a) on the effects of high relative humidity
(RH) and gaseous pollutants on corrosion of ITE. ASHRAE funded the Syracuse
University Mechanical and Aerospace Engineering Department from 2015 to 2018
to investigate the risk of operating data centers at higher levels of moisture when high
levels of gaseous pollutants exist (Zhang et al. 2019). The objective was to evaluate
the ability of increasing the recommended moisture level in support of reducing the
energy required by data centers. Five gaseous pollutants were tested under a variety
of temperature and RH conditions—three pollutants that are pervasive throughout the
planet (SO2, NO2, and O3) and two catalyst pollutants (H2S and Cl2). Pollutant levels
tested were at or near the maximum common concentration levels existing around the
world. The changes made to the recommended envelope based on this research are
summarized in this chapter, and Appendix E provides more insight into why the
changes were made to the recommended envelope based on the research results.
In addition to the temperature and relative humidity (RH) ranges, the maximum dew point (DP) and
maximum elevation values are part of the allowable operating environment defini-
tions. The IT purchaser must consult with the equipment manufacturer to understand
the performance capabilities of the ITE at the extreme upper limits of the allowable
thermal envelopes.
The following notes detail changes made to the recommended envelope with the
intent of maintaining high reliability of the ITE. These notes are critical to using this
fifth edition of Thermal Guidelines for Data Processing Environments.
1. To gain the full advantage of the results of current research (Zhang et al. 2019),
data center operators should use silver and copper coupons inside their data
centers at least twice a year (once in the winter and once in the summer) to
detect the level of corrosion in the environment. See Particulate and Gaseous
Contamination in Datacom Environments (ASHRAE 2014b) for more details
on these measurements.
2. For data center environments tested with silver and copper coupons that are
shown to have corrosion levels less than 300 Å/month for copper and 200 Å/
month for silver, suggesting that only the pervasive pollutants (SO2, NO2, and
O3) may be present, the recommended moisture limit has been raised from 60%
rh to 70% rh. The upper moisture limit is now 70% rh or 15°C (59°F) DP,
whichever is the minimum moisture content.
The data also showed that increasing the recommended temperature from
27°C to 28°C (80.6°F to 82.4°F) would be acceptable from a reliability stand-
point (Zhang et al. 2019). However, because IT manufacturers typically start
increasing airflow through servers around 25°C (77°F) to offset the higher
ambient temperature, this increased air-moving device power draw did not
warrant changing the recommended upper temperature limit.
In addition, the data showed that increasing the dew point from 15°C to 17°C
(59°F to 62.6°F) would be acceptable from a reliability standpoint. However, this
change would put the recommended upper moisture limit coincident with the
upper moisture limit of the allowable envelope of Class A1. For those data centers
that operate to the Class A1 environment, it was decided to maintain the buffer of
2°C (3.6°F) between the recommended and allowable envelopes and to maintain
the recommended envelope the same for all air-cooling classes (A1 through A4).
3. For data center environments tested with silver and copper coupons that are
shown to have levels of corrosion greater than 300 Å/month for copper and
200 Å/month for silver, suggesting that Cl2 and/or H2S (or other corrosive cata-
lysts) may be present, then the recommended moisture levels should be kept
below 50% rh. The upper moisture limit is 50% rh or 15°C (59°F) DP, which-
ever is the minimum moisture content. Chemical filtration should be considered
in these situations.
4. If coupon measurements are not performed to aid in understanding the possible
corrosion impact on ITE, the data center operator should consider maintaining
a lower humidity level to protect the ITE, either below 60% as specified in the
fourth edition of this book or below 50% as specified in note 3 above. (A sketch
of this decision logic appears below.)
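A minimal sketch of the decision logic in notes 1 through 4, assuming coupon results are reported in Å/month; the function name and structure are illustrative only:

```python
from typing import Optional

def recommended_rh_upper_limit(copper_a_per_month: Optional[float],
                               silver_a_per_month: Optional[float]) -> float:
    """Upper RH limit (%) for the recommended envelope, per notes 1-4 above.

    The 300/200 Angstrom-per-month thresholds are the coupon limits from
    the text; the 15°C dew-point cap applies in addition to the value
    returned here (whichever yields the lower moisture content governs).
    """
    if copper_a_per_month is None or silver_a_per_month is None:
        return 60.0  # note 4: no coupon data; stay at the fourth-edition
                     # limit (or 50% for extra margin, per note 3)
    if copper_a_per_month < 300.0 and silver_a_per_month < 200.0:
        return 70.0  # note 2: only pervasive pollutants (SO2, NO2, O3) likely
    return 50.0      # note 3: catalyst pollutants (Cl2/H2S) likely present;
                     # chemical filtration should also be considered
```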
Figure 2.3 2021 recommended and allowable envelopes for Classes A1,
A2, A3 and A4. The recommended envelope is for high levels
of pollutants verified by coupon measurements as indicated in
note 3 of Section 2.2.
Table 2.1  2021 Thermal Guidelines for Air Cooling—SI Version*
(Note letters a through n refer to the notes following the table.)

Recommended (product operation):
Class      Dry-Bulb Temp., °C   Humidity Range, Noncondensing
A1 to A4   18 to 27             –9°C DP to 15°C DP and 70% rh or 50% rh (see note n)

Allowable (product operation):                                                Product power off:
Class   Dry-Bulb     Humidity Range,                              Max. Dew    Max.       Max. Rate of    Dry-Bulb    RH,
        Temp., °C    Noncondensing                                Point, °C   Elev., m   Change, °C/h    Temp., °C   %
A1      15 to 32     –12°C DP and 8% rh to 17°C DP and 80% rh     17          3050       5/20            5 to 45     8 to 80
A2      10 to 35     –12°C DP and 8% rh to 21°C DP and 80% rh     21          3050       5/20            5 to 45     8 to 80
A3      5 to 40      –12°C DP and 8% rh to 24°C DP and 85% rh     24          3050       5/20            5 to 45     8 to 80
A4      5 to 45      –12°C DP and 8% rh to 24°C DP and 90% rh     24          3050       5/20            5 to 45     8 to 80

* For potentially greater energy savings, refer to Appendix C for the process needed to account for multiple server
metrics that impact overall TCO.
Notes for Table 2.1, 2021 Thermal Guidelines for Air Cooling—
SI Version (I-P Version in Appendix B)
a. Classes A3 and A4 are identical to those included in the 2011 version of the thermal guide-
lines (ASHRAE 2012). The 2015 version of the A1 and A2 classes (ASHRAE 2015b) has
expanded RH levels compared to the 2011 version.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to Class A1 as spec-
ified in 2008). Typical requirements: minimum temperature is 15°C, maximum temperature
is 32°C, minimum RH is 20%, maximum RH is 80%, maximum DP is 22°C, rate of change
of temperature is less than 5°C/h, rate of change of humidity is less than 5% rh per hour, and
no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. Classes A1 and A2—Derate maximum allowable dry-bulb temperature 1°C/300 m above 900
m. Above 2400 m altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature. Class A3—Derate maximum allowable dry-bulb temperature 1°C/175
m above 900 m. Class A4—Derate maximum allowable dry-bulb temperature 1°C/125 m
above 900 m.
f. For tape storage: 5°C in an hour. For all other ITE: 20°C in an hour and no more than 5°C in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated as the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 5°C and 20°C temperature changes
are changes within a specified period of time, not rates of change. See Appendix K for
additional information and examples, and the sliding-window sketch following these notes.
g. With a diskette in the drive, the minimum temperature is 10°C (not applicable to Classes A1
or A2).
h. The minimum humidity level for Classes A1, A2, A3, and A4 is the higher (more moisture)
of the –12°C dew point and the 8% rh. These intersect at approximately 25°C. Below this
intersection (~25°C) the dew point (–12°C) represents the minimum moisture level, while
above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-ESD floors and where personnel are allowed to wear non-ESD
shoes may need increased humidity given that the risk of generating 8 kV increases slightly
from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 3050 m.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for each of the classes for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 3050 m requires consultation with the IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.
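Note f describes a windowed max-minus-min check rather than an instantaneous rate. A minimal sketch of that check, assuming inlet temperature samples as (minutes, °C) pairs sorted by time, follows; the function names are illustrative:

```python
# Sliding-window check of note f: the inlet-temperature change within any
# window is max(T) - min(T) over that window, not an instantaneous rate.
def max_change_in_window(samples, window_min: float) -> float:
    """Largest (max - min) inlet temperature over any span of window_min minutes."""
    worst = 0.0
    for i, (t0, _) in enumerate(samples):
        in_window = [temp for t, temp in samples[i:] if t - t0 <= window_min]
        worst = max(worst, max(in_window) - min(in_window))
    return worst

def meets_note_f(samples, tape: bool = False) -> bool:
    if tape:                                    # tape storage: 5°C in an hour
        return max_change_in_window(samples, 60.0) <= 5.0
    return (max_change_in_window(samples, 60.0) <= 20.0 and  # 20°C in an hour
            max_change_in_window(samples, 15.0) <= 5.0)      # 5°C in any 15 min
```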
• The room air should be continuously filtered with MERV 8 filters as recom-
mended by AHRI Standard 1360 (2017).
• Air entering a data center should be filtered with MERV 11 to MERV 13 fil-
ters.
All sources of dust inside data centers should be reduced. Every effort should
be made to filter out dust that has deliquescent relative humidity less than the maxi-
mum allowable relative humidity in the data center.
Note k was added to Table 2.1 to provide further clarification of the allowable
range of relative humidity. The humidity range noted in the table is not for the range
of dry-bulb temperatures specified in the table (this can clearly be seen in the psychro-
metric charts shown in Figures 2.2 and 2.3). As an example, the range of humidity
for Class A3 is shown in Figure 2.4. Additional clarification for the other classes is
provided in Appendix L.
Because equipment manufactured to environmental Classes A1 and A2 may
exist in two different forms that meet either the 2011 or 2015 versions, it is imper-
ative that when referencing equipment in Classes A1 or A2 that the thermal guide-
lines version (2011 or 2015) be noted.
The recommended envelope is highlighted as a separate row in Table 2.1
because of some misconceptions regarding the use of the recommended envelope.
When it was first created, it was intended that within this envelope the most reliable
and most power-efficient operation would be achieved. The main considerations
behind the envelope limits are as follows:
• High RH levels have been shown to affect failure rates of electronic compo-
nents. Examples of failure modes exacerbated by high RH include conductive
anodic failures, hygroscopic dust failures, tape media errors and excessive
wear, and corrosion. The recommended upper RH limit is set to limit this
effect. The new research reported in detail in Appendix E sets the recom-
mended upper RH limit at 70% for data centers that continuously monitor the
corrosion rate of copper and silver and are shown to have levels below 300
and 200 Å/month, respectively.
• Electronic devices are susceptible to damage by ESD, but based on the ESD
research reported in Appendix D, susceptibility to low RH is a lesser concern
than once thought.
• High temperature affects the reliability and life of electronic equipment. The
recommended upper ambient temperature limit is set to limit these tempera-
ture-related reliability effects. To estimate the effects of operating at higher
temperatures, see Section 2.4.3 for a description of the relative ITE failure
rate x-factor.
• The lower the temperature in the room that houses the electronic equipment,
in general the more energy is required by the HVAC equipment. The recom-
mended lower ambient temperature limit is set to limit extreme overcooling.
For data center equipment, each individual manufacturer tests to specific envi-
ronmental ranges, and these may or may not align with the allowable ranges spec-
ified in Table 2.1; regardless, the product that is shipped will in most cases align with
one of the classes.
Regarding the maximum altitude at which data center products should operate,
Figure 2.5 shows that the majority of the population resides below 3000 m (9840 ft);
therefore, the maximum altitude for Classes A1 through A4 was chosen as 3050 m
(10,000 ft).
The purpose of specifying a derating on the maximum dry-bulb temperature for
altitude (see note e of Table 2.1) is to identify acceptable environmental limits that
compensate for degradation in air-cooling performance at high altitudes. The rate of
heat transfer in air-cooled electronics is a function of convective heat transfer and
coolant mass flow rates, both of which decrease as a result of reduced air density,
which accompanies the lower atmospheric pressure at high altitudes. An altitude
derating restricts the maximum allowable upper operating temperature limit when
the system is operated at higher altitudes and permits a higher operating temperature
limit when the system is operated at lower altitudes. Altitude derating thus ensures
that system component temperatures stay within functional limits while extending
the useful operating range to the maximum extent possible for a given cooling
design.
One area that needed careful consideration was the application of the altitude
derating for the environmental classes. Simply reusing the Class A1 and A2 derating
curve for Classes A3 and A4 would have imposed undesirable increases in server
energy, at all altitudes, to support the higher-altitude sites. To provide both a relaxed
operating environment and the lowest TCO for the client, the derating was modified.
The derating curves for Classes A3 and A4 retain significant relaxation while
mitigating the extra expense incurred both during acquisition of the ITE and during
operation due to increased power consumption. The relationship between dry-bulb
temperature, altitude, and air density for the different environments is depicted
graphically in the derating curves of Appendix G.
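The derating rates in note e of Tables 2.1 and 2.2 reduce to simple arithmetic. A minimal sketch follows (class base limits and rates are taken from the tables and notes in this chapter; the function itself is illustrative):

```python
# Altitude derating of the maximum allowable dry-bulb temperature,
# per note e of Tables 2.1 and 2.2. Derating starts above 900 m;
# above 3050 m the IT supplier must be consulted (note m).
BASE_MAX_C = {"A1": 32.0, "A2": 35.0, "A3": 40.0, "A4": 45.0, "H1": 25.0}
M_PER_DEGC = {"A1": 300.0, "A2": 300.0, "A3": 175.0, "A4": 125.0, "H1": 500.0}

def max_dry_bulb(cls: str, elevation_m: float) -> float:
    """Derated maximum allowable dry-bulb temperature (°C) for a class."""
    if elevation_m > 3050.0:
        raise ValueError("Above 3050 m: consult the IT supplier (note m).")
    excess = max(0.0, elevation_m - 900.0)
    return BASE_MAX_C[cls] - excess / M_PER_DEGC[cls]

# Example: a Class A3 site at 2000 m -> 40 - (1100 / 175) ~= 33.7°C.
```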
It was intended that operation within the recommended envelope created by the
equipment manufacturers would provide the most reliable and power-efficient data
center operation. This intent continues to be the goal.
Table 2.2  2021 Thermal Guidelines for High-Density Servers—SI Version
(Same column layout as Table 2.1; note letters refer to the notes following the table.)

Recommended (product operation):
Class   Dry-Bulb Temp., °C   Humidity Range, Noncondensing
H1      18 to 22             –9°C DP to 15°C DP and 70% rh or 50% rh (see note n)

Allowable (product operation):                                                Product power off:
Class   Dry-Bulb     Humidity Range,                              Max. Dew    Max.       Max. Rate of    Dry-Bulb    RH,
        Temp., °C    Noncondensing                                Point, °C   Elev., m   Change, °C/h    Temp., °C   %
H1      5 to 25      –12°C DP and 8% rh to 17°C DP and 80% rh     17          3050       5/20            5 to 45     8 to 80
Notes for Table 2.2, 2021 Thermal Guidelines for High-Density Servers—
SI Version (I-P Version in Appendix B)
a. This is a new class specific to high-density servers. It is at the discretion of the ITE manu-
facturer to determine the need for a product to use this high-density server class. Classes A1
through A4 are separate and are shown in Table 2.1.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to 2011 Class A1).
Typical requirements: minimum temperature is 15°C, maximum temperature is 32°C, mini-
mum RH is 20%, maximum RH is 80%, maximum DP is 22°C, rate of change of temperature
is less than 5°C/h, rate of change of humidity is less than 5% rh per hour, and no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. For H1 class only—Derate maximum allowable dry-bulb temperature 1°C/500 m above 900
m. Above 2400 m altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature.
f. For tape storage: 5°C in an hour. For all other ITE: 20°C in an hour and no more than 5°C in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 5°C and 20°C temperature change
is considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 10°C. With the lowest allowed
temperature of 15°C, there is no problem with diskettes residing in this H1 environment.
h. The minimum humidity level for Class H1 is the higher (more moisture) of the –12°C DP and
the 8% rh. These intersect at approximately 25°C. Below this intersection (~25°C) the DP (–
12°C) represents the minimum moisture level, while above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-ESD floors and where personnel are allowed to wear non-ESD
shoes may need increased humidity given that the risk of generating 8 kV increases slightly
from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 3050 m.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for both product operations and product power OFF.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 3050 m requires consultation with the IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.
ETSI EN 300 019-1-3 Class 3.1/3.1E environmental conditions (ETSI 2014):

     Environmental Parameter                                Unit     Normal   Exceptional (E)
a.   Low air temperature                                    °C       +5       –5
b.   High air temperature                                   °C       +40      +45
c.   Low relative humidity                                  % rh     5        5
d.   High relative humidity                                 % rh     85       90
e.   Low absolute humidity                                  g/m3     1
f.   High absolute humidity                                 g/m3     25
g.   Rate of change of temperature (a)                      °C/min   0.5
h.   Low air pressure                                       kPa      70
i.   High air pressure (b)                                  kPa      106
j.   Solar radiation                                        W/m2     700
k.   Heat radiation                                         W/m2     600
l.   Movement of the surrounding air (c)                    m/s      5
m.   Conditions of condensation                             none     no
n.   Conditions of wind-driven rain, snow, hail, etc.       none     no
o.   Conditions of water from sources other than rain       none     no
p.   Conditions of icing                                    none     no

a. Averaged over a period of 5 min.
b. Conditions in mines are not considered.
c. A cooling system based on non-assisted convection may be disturbed by adverse movement of the surrounding air.
Figure 2.8 Climatogram of the ETSI Class 3.1 and 3.1e environmental
conditions (ETSI 2014).
With five data center classes, the decision process for the data center owner/
operator is complicated when trying to optimize efficiency, reduce TCO, address
reliability issues, and improve performance. Data center optimization is a complex,
multivariable problem and requires a detailed engineering evaluation for any signif-
icant changes to be successful. An alternative operating envelope should be consid-
ered only after appropriate data are collected and interactions within the data center
are understood. Each parameter’s current and planned status could lead to a different
endpoint for the data center optimization path.
The worst-case scenario would be for an end user to carelessly assume that ITE
is capable of operating in Classes A3 or A4 or that the mere definition of these
classes, with their expanded environmental ranges, magically solves existing data
center thermal management or power density or cooling problems. While some new
ITE may operate in these classes, other ITE, including legacy equipment, may not.
Data center problems would most certainly be compounded if the user erroneously
assumes that Class A3 or A4 conditions are acceptable. The rigorous use of the tools
and guidance in this chapter should preclude such errors. Table 2.4 summarizes the
key characteristics and potential options to be considered when evaluating the opti-
mal operating range for each data center.
Climate factors—Humidity (d): range of humidity in the region (obtain bin data and/
or design extremes for RH and DP), coincident temperature and humidity extremes,
and number of hours per year outside potential ASHRAE class humidity ranges
a. Some computer room air-conditioner (CRAC)/computer room air handler (CRAH) units have limited return
temperatures, as low as 30°C (86°F).
b. With good airflow management, server temperature rise can be on the order of 20°C (36°F); with an inlet
temperature of 40°C (104°F) the hot aisle could be 60°C (140°F).
c. Data center type affects reliability/availability requirements.
d. Climate factors are summarized in “ASHRAE Position Document on Climate Change” (2018a).
By understanding the characteristics described in Table 2.4 along with the data
center capability, one can follow the general steps necessary in setting the operating
temperature and humidity range of the data center:
1. Consider the state of best practices for the data center. Most best practices,
including airflow management and cooling-system control strategies, should be
implemented prior to the adoption of higher server inlet temperature.
2. Determine the maximum allowable ASHRAE class environment from
Tables 2.1 and 2.2 based on review of all ITE environmental specifications.
3. Use the default recommended operating envelope (see Tables 2.1 and 2.2) or,
if more energy savings is desired, use the following information to determine
the operating envelope:
a. Climate data for locale (only when using economizers)
b. Server power trend versus ambient temperature (see Section 2.4.1)
c. Acoustical noise levels in the data center versus ambient temperature (see
Section 2.4.2)
d. Server reliability trend versus ambient temperature (see Section 2.4.3)
e. Server reliability versus moisture, contamination, and other temperature
effects (see Section 2.4.4)
f. Server performance trend versus ambient temperature (see Section 2.4.5)
g. Server cost trend versus ambient temperature (see Section 2.4.6)
The steps above provide a simplified view of the flowchart in Appendix C. The
use of Appendix C is highly encouraged as a starting point for the evaluation of the
options. The flowchart provides guidance to data center operators seeking to mini-
mize TCO on how best to position their data center for operating in a specific envi-
ronmental envelope. Possible endpoints range from optimization of TCO within the
recommended envelope as specified in Table 2.1 to a chillerless data center using any
of the data center classes. More importantly, Appendix C describes how to achieve
even greater energy savings through the use of a TCO analysis using the server
metrics provided in the next section.
Data were collected from a number of ITE manufacturers covering a wide range
of products. Most of the data collected for the Class A2 environment fell within the
envelope displayed in Figure 2.9. The power increase is a result of fan power, compo-
nent power, and the power conversion for each. The component power increase is a
result of an increase in leakage current for some silicon devices. As an example of
the use of Figure 2.9, if a data center is normally operating at a server inlet tempera-
ture of 15°C (59°F) and the operator wants to raise this temperature to 30°C (86°F),
it could be expected that the server power would increase in the range of 3% to 7%.
If the inlet temperature increases to 35°C (95°F), the ITE power could increase in
the range of 7% to 20% compared to operating at 15°C (59°F).
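Because Figure 2.9 itself is not reproduced here, the sketch below linearly interpolates only the two band endpoints quoted in the text (3% to 7% at 30°C and 7% to 20% at 35°C, relative to 15°C); the real curves are nonlinear, so treat this purely as an illustration:

```python
# Rough interpolation of the server power increase bands described for
# Figure 2.9, relative to a 15°C inlet. Only the anchor points quoted
# in the text are used; the actual curves are nonlinear.
ANCHORS = [(15.0, 0.0, 0.0), (30.0, 3.0, 7.0), (35.0, 7.0, 20.0)]

def power_increase_band(inlet_c: float) -> tuple[float, float]:
    """(low %, high %) expected server power increase vs. a 15°C inlet."""
    for (t0, lo0, hi0), (t1, lo1, hi1) in zip(ANCHORS, ANCHORS[1:]):
        if t0 <= inlet_c <= t1:
            f = (inlet_c - t0) / (t1 - t0)
            return lo0 + f * (lo1 - lo0), hi0 + f * (hi1 - hi0)
    raise ValueError("outside the 15°C to 35°C range quoted in the text")
```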
The development of the Class A3 envelope shown in Figure 2.9 was simply
extrapolated from the Class A2 trend. New products for this class would likely be
developed with improved heat sinks and/or fans to properly cool the components
within the new data center class, so the power increases over the wider range would
be very similar to that shown for Class A2.
With the increase in fan speed over the range of ambient temperatures, ITE flow
rate also increases. An estimate of the increase in server airflow rates over the
temperature range up to 35°C (95°F) is displayed in Figure 2.10. This increase is very
important when designing data centers intended to take advantage of temperatures
above the 25°C to 27°C (77°F to 80.6°F) inlet ambient temperature range. With higher temperatures as
an operational target, the data center design must be analyzed to be able to accom-
modate the higher volumes of airflow. This includes all aspects of the airflow system.
The base system may be called upon to meet 250% (per Figure 2.10) of the nominal
airflow (the airflow when in the recommended range). This may include the outdoor
air inlet, filtration, cooling coils, dehumidification/humidification, fans, underfloor
plenum, raised-floor tiles/grates, and containment systems. A detailed engineering
evaluation of the data center system’s higher flow rate is a requirement to ensure
successful operation at elevated inlet temperatures.
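One way to appreciate why a 250% airflow requirement demands a detailed evaluation is standard fan affinity-law reasoning (an editorial illustration, not a calculation from this book): for a fixed air path, pressure drop scales roughly with the square of flow and fan power with the cube.

```python
# Affinity-law scaling for a fixed air path: pressure drop ~ flow^2,
# fan power ~ flow^3. Standard fan-system reasoning, shown here only
# to illustrate the cost of a large airflow multiple.
def system_scaling(flow_ratio: float) -> tuple[float, float]:
    """(pressure-drop multiple, fan-power multiple) for a flow multiple."""
    return flow_ratio ** 2, flow_ratio ** 3

dp_mult, power_mult = system_scaling(2.5)
print(f"2.5x flow -> {dp_mult:.1f}x pressure drop, {power_mult:.1f}x fan power")
# 2.5x flow -> 6.2x pressure drop, 15.6x fan power
```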
Figure 2.10 Server flow rate increase versus ambient temperature increase.

Another aspect of the power trend that might help determine a new operating
envelope is understanding the total facility energy consumption, not just the IT load
discussed in this section. For example, as the inlet operating temperature is
increased, it is very possible that the fan speed of servers will also increase, thereby
increasing the server power. This server power increase would probably result in a
lower PUE, giving the false impression that energy use of the data center has
improved, though this is not the case. This situation highlights the importance of
measuring the total data center power usage.
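A small numeric illustration of this false improvement, with hypothetical loads chosen only to show the arithmetic:

```python
# Why a lower PUE can mask higher total energy use.
# All numbers are hypothetical, chosen only to illustrate the arithmetic.
def pue(total_kw: float, it_kw: float) -> float:
    return total_kw / it_kw

infra_kw = 600.0                      # facility overhead, unchanged
it_before, it_after = 1000.0, 1050.0  # server fans speed up at higher inlet temp

print(f"{pue(it_before + infra_kw, it_before):.2f}")  # 1.60
print(f"{pue(it_after + infra_kw, it_after):.2f}")    # 1.57: PUE "improves"...
# ...yet total power rose from 1600 kW to 1650 kW, which is why total
# data center power, not PUE alone, must be measured.
```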
Table 2.5 (not reproduced here) lists the predicted increases in ITE sound power
level for ambient temperatures of 25°C (77°F), 30°C (86°F), 35°C (95°F), 40°C
(104°F), and 45°C (113°F).
In practice, differences would exist between high-end equipment that uses sophisticated fan-speed control
and entry-level equipment using fixed fan speeds or rudimentary speed control.
However, the above increases in noise emission levels with ambient temperature can
serve as a general guideline for data center managers and owners concerned about
noise levels and noise exposure for employees and service personnel. The IT indus-
try has developed its own internationally standardized test codes for measuring the
noise emission levels of its products (ISO 7779 [2018]) and for declaring these noise
levels in a uniform fashion (ISO 9296 [2017]). Noise emission limits for ITE
installed in a variety of environments (including data centers) are stated in Statskon-
toret Technical Standard 26:6 (2004).
This discussion applies to potential increases in noise emission levels (i.e., the
sound energy actually emitted from the equipment, independent of listeners in the
room or the environment in which the equipment is located). Ultimately, the real
concern is about the possible increase in noise exposure, or noise immission levels,
experienced by personnel in the data center. With regard to regulatory workplace
noise limits and protection of employees against potential hearing damage, data
center managers should check whether potential changes in noise levels in their envi-
ronment will cause them to trip various action-level thresholds defined in local, state,
or national codes. The actual regulations should be consulted, as they are complex
and beyond the scope of this book to explain in full. The noise levels of concern in
workplaces are stated in terms of A-weighted sound pressure levels (as opposed to
the A-weighted sound power levels used for rating the emission of noise sources).
For instance, when noise levels in a workplace exceed a sound pressure level of
85 dB(A), hearing conservation programs, which can be quite costly, are mandated,
generally involving baseline audiometric testing, noise level monitoring or dosim-
etry, noise hazard signage, and education and training. When noise levels exceed
87 dB(A) (in Europe) or 90 dB(A) (in the U.S.), further action, such as mandatory
hearing protection, rotation of employees, or engineering controls, must be taken.
Data center managers should consult with acoustical or industrial hygiene experts
to determine whether a noise exposure problem will result when ambient tempera-
tures are increased to the upper ends of the expanded ranges proposed in this book.
In an effort to provide some general guidance on the effects of the proposed
higher ambient temperatures on noise exposure levels in data centers, the following
observations can be made (though, as noted above, it is advised that one seek profes-
sional help in actual situations, because regulatory and legal requirements are at
issue). Modeling and predictions of typical ITE racks in a typical data center with
front-to-back airflow have shown that the sound pressure level in the center of a typi-
cal aisle between two rows of continuous racks will reach the regulatory trip level
of 85 dB(A) when each of the individual racks in the rows has a measured (as
opposed to a statistical upper limit) sound power level of roughly 8.4 B (84 dB). If
it is assumed that this is the starting condition for a 25°C (77°F) ambient data center
temperature—and many fully configured high-end ITE racks today are at or above
this 8.4 B (84 dB) level—the sound pressure level in the center of the aisle would
be expected to increase to 89.7 dB(A) at 30°C (86°F) ambient, to 91.4 dB(A) at 35°C
(95°F) ambient, to 93.4 dB(A) at 40°C (104°F) ambient, and to 97.9 dB(A) at 45°C
(113°F) ambient, using the predicted increases to sound power level shown in
Table 2.5. Needless to say, these levels are extremely high. They are not only above
the regulatory trip levels for mandated action (or fines, in the absence of action), but
they clearly pose a risk of hearing damage unless controls are instituted to avoid
exposure by data center personnel.
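The aisle-level arithmetic quoted above can be reproduced directly; the per-temperature increments below are back-computed from the sound pressure levels stated in the text, since Table 2.5 itself is not reproduced here:

```python
# Reproducing the aisle sound-pressure arithmetic in the text: an 85 dB(A)
# baseline at 25°C plus sound-power increases attributed to Table 2.5.
# The deltas are back-computed from the quoted levels, not read from
# the (unreproduced) table itself.
BASELINE_DBA = 85.0
DELTA_DB = {25: 0.0, 30: 4.7, 35: 6.4, 40: 8.4, 45: 12.9}

for temp_c, delta in DELTA_DB.items():
    print(f"{temp_c}°C ambient -> ~{BASELINE_DBA + delta:.1f} dB(A) in the aisle")
```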
Because there are so many different variables and scenarios to consider for ITE
reliability, the approach taken by ASHRAE TC 9.9 was to initially establish a base-
line failure rate (x-factor) of 1.00 that reflected the average probability of failure
under a constant ITE inlet temperature of 20°C (68°F). Table 2.6 provides x-factors
at other constant ITE inlet temperatures for 7 × 24 × 365 continuous operation condi-
tions. The key to applying the x-factors in Table 2.6 is to understand that they repre-
sent a relative failure rate compared to the baseline of a constant ITE inlet
temperature of 20°C (68°F). This table was created using manufacturers’ reliability
data, which included all components within the volume server package. Table 2.6
provides x-factor data at the average, upper, and lower bounds to take into account
the many variations within a server package among the number of processors, dual
in-line memory modules (DIMMs), hard drives, and other components. The data set
chosen should depend on the level of risk tolerance for a given application.
It is important to note that the 7 × 24 × 365 use conditions corresponding to the
x-factors in Table 2.6 are not a realistic reflection of the three economization scenar-
ios outlined previously. For most climates in the industrialized world, the majority
of the hours in a year are spent at cool temperatures, where mixing cool outdoor air
with air from the hot aisle exhaust keeps the data center temperature in the range of
15°C to 20°C (59°F to 68°F) (x-factor of 0.72 to 1.00). Furthermore, these same
climates spend only 10% to 25% of their annual hours above 27°C (80.6°F), the
upper limit of the ASHRAE recommended range. The correct way to analyze the
x-factors is therefore to weight them by the time spent at each temperature over the
year (e.g., if a baseline expectation of roughly 4 failures per 1000 servers is assumed,
the environment for the 1000 servers incorporates warmer temperatures, and the relative failure rate x-factor
is 1.2, then the expected failure rate would be 5 failures per 1000 servers). To provide
an additional frame of reference on data center hardware failures, sources showed
blade hardware server failures were in the range of 2.5% to 3.8% over 12 months in
two different data centers with supply temperatures approximately 20°C (68°F)
(Patterson et al. 2009; Atwood and Miner 2008). In a similar data center that
included an air-side economizer with temperatures occasionally reaching 35°C
(95°F) (at an elevation around 1600 m [5250 ft]), the failure rate was 4.5%. These
values are provided solely for guidance with an example of failure rates. In these
studies, a failure was deemed to have occurred each time a server required hardware
attention. No attempt to categorize the failure mechanisms was made.
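A time-at-temperature weighting of the x-factor can be sketched as follows; the bin fractions, x-factors, and baseline rate are hypothetical placeholders, since real values come from local climate bin data and Table 2.6:

```python
# Time-at-temperature weighting of the relative failure rate (x-factor).
# Bin fractions, x-factors, and the baseline rate are hypothetical
# placeholders; real values come from climate bin data and Table 2.6.
bins = [
    (0.60, 0.85),   # 60% of hours at cool inlet temperatures (x < 1.0)
    (0.30, 1.00),   # 30% of hours near the 20°C baseline
    (0.10, 1.30),   # 10% of hours at warm inlet temperatures
]

weighted_x = sum(frac * x for frac, x in bins)
baseline_failures_per_1000 = 4.0  # hypothetical baseline at 20°C
print(f"weighted x-factor {weighted_x:.2f} -> "
      f"{weighted_x * baseline_failures_per_1000:.2f} failures per 1000 servers")
```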
To provide additional guidance on the use of Table 2.6, Appendix H gives a
practical example of the impact of a compressorless cooling design on hardware fail-
ures, and Appendix I provides ITE reliability data for selected major U.S. and global
cities.
One other aspect not discussed here is server availability requirements. The
question one has to ask is: are there availability requirements for some servers in the
data center that would require much more stringent temperature controls than might
be allowed through modeling of reliability as described here?
Particulate and gaseous contamination and the types of equipment failures they can cause are well documented. One of the best sources
on the effects of pollution on data centers is Particulate and Gaseous Contamination
in Datacom Environments, Second Edition (ASHRAE 2014b). When selecting a site
for a new data center or when adding an air-side economizer to an existing data
center, the air quality and building materials should be checked carefully for sources
of pollution and particulates. Additional filtration should be added to remove
gaseous pollution and particulates, if needed. Research has shown that in addition
to pollution, both temperature and humidity affect dielectric properties of printed
circuit board (PCB) dielectric materials (Hamilton et al. 2007; Sood 2010; Hinaga
et al. 2010). The dielectric (e.g., FR4) provides the electrical isolation between board
signals. With either increased moisture or higher temperature in the PCB, transmis-
sion line losses increase. Signal integrity may be significantly degraded as the
board’s temperature and moisture content increase. Moisture content changes rela-
tively slowly, on the order of hours and days, based on the absorption rate of the
moisture into the board. Outer board layers are affected first. Temperature changes
on the order of minutes and can quickly affect performance. As more high-speed
signals are routed in the PCB, both temperature and humidity will become even
greater concerns for ITE manufacturers. The cost of PCB material may increase
significantly and may increase the cost of Class A3- and A4-rated ITE. The alter-
native for ITE manufacturers is to use lower-speed bus options, which will lower
performance.
Excessive exposure to high humidity can induce performance degradations or
failures at various circuitry levels. At the PCB level, conductive anodic filament
grows along the delaminated fiber/epoxy interfaces where moisture facilitates the
formation of a conductive path (Turbini and Ready 2002; Turbini et al. 1997). At the
substrate level, moisture can cause surface dendrite growth between pads of opposite
bias due to electrochemical migration. This is a growing concern due to continuing
C4 (solder ball connection) pitch refinement. At the silicon level, moisture can
induce degradation or loss of the adhesive strength in the dielectric layers, while
additional stress can result from hygroscopic swelling in package materials. The
combination of these two effects often causes delamination near the die corner
region where thermal-mechanical stress is inherently high and more vulnerable to
moisture. It is worth noting that temperature plays an important role in moisture
effects. On one hand, higher temperature increases the diffusivity coefficients and
accelerates the electrochemical reaction. On the other hand, the locally higher
temperature due to self-heating also reduces the local RH, thereby drying out the
circuit components and enhancing their reliability.
In addition to the above diffusion-driven mechanism, another obvious issue with
high humidity is condensation. This can result from sudden ambient temperature
drop or the presence of a lower temperature source for water-cooled or refrigeration-
cooled systems. Condensation can cause failures in electrical and mechanical
devices through electrical shorting and corrosion. Other examples of failure mode
exacerbated by high RH include hygroscopic dust failures (Comizzoli et al. 1993),
tape media errors, excessive wear (Van Bogart 1995), and corrosion. These failures
are found in environments that exceed 60% rh for extended periods of time.
As a rule, the typical mission-critical data center must give utmost consideration
to the trade-offs before operating with an RH that exceeds 60% for the following
reasons:
• It is well known that moisture and pollutants are necessary for metals to
corrode. Moisture alone is not sufficient to cause atmospheric corrosion.
Pollution aggravates corrosion in the following ways:
• Corrosion products, such as oxides, may form and protect the metal
and slow down the corrosion rate. In the presence of gaseous pollut-
ants such as sulfur dioxide (SO2) and hydrogen sulfide (H2S) and
ionic pollutants such as chlorides, the corrosion-product films are less
protective, allowing corrosion to proceed somewhat linearly. When the
RH in the data center is greater than the deliquescent RH of the corro-
sion products, such as copper sulfate, cupric chloride, and the like, the
corrosion-product films become wet, dramatically increasing the rate
of corrosion. Cupric chloride, a common corrosion product on copper,
has a deliquescent RH of about 65%. A data center operating with RH
greater than 65% would result in the cupric chloride absorbing mois-
ture, becoming wet, and aggravating the copper corrosion rate.
• Dust is ubiquitous. Even with the best filtration efforts, fine dust will be
present in a data center and will settle on electronic hardware. Fortu-
nately, most dust has particles with high deliquescent RH, which is the
RH at which the dust absorbs enough water to become wet and promote
corrosion and/or ion migration. When the deliquescent RH of dust is
greater than the RH in the data center, the dust stays dry and does not
contribute to corrosion or ion migration. However, on the rare occurrence
when the dust has a deliquescent RH lower than the RH in the data cen-
ter, the dust will absorb moisture, become wet, and promote corrosion
and/or ion migration, degrading hardware reliability. A study by Comizzoli
et al. (1993) showed that, for various locations worldwide, leakage
current due to dust that had settled on PCBs increased exponentially with
RH. This study leads us to the conclusion that maintaining the RH in a
data center below about 60% will keep the leakage current from settled
fine dust in the acceptable subangstrom range.
The conditions noted in the above two bullets do not contradict the 70% rh upper
limit for the recommended envelope as shown in Table 2.1 and Figure 2.2. The
guidelines for the 70% rh upper limit are for a data center that has low levels of
pollutants; namely, copper and silver coupons are measured to be below 300 and 200
Å/month, respectively. If these measurements are higher than these limits, suggest-
ing higher levels of pollutants are present, then the RH should be limited as noted
above and suggested in note 4 of Section 2.2.
Gaseous contamination concentrations that lead to silver and/or copper corro-
sion rates greater than about 300 Å/month have been known to cause the two most
common recent failure modes: copper creep corrosion on circuit boards and the
corrosion of silver metallization in miniature, surface-mounted components.
In summary, if protection of mission-critical data center hardware is paramount,
equipment can best be protected from corrosion by maintaining an RH of less than
70% and limiting the particulate and gaseous contamination concentration to levels
at which the copper and/or silver corrosion rates are less than 300 and 200 Å/month,
respectively. Of course, the data center operator may choose to limit the data center
RH to below 50% at all times for an extra margin of protection for the ITE.
Given these reliability concerns, data center operators need to pay close atten-
tion to the overall data center humidity and local condensation concerns, especially
when running economizers on hot/humid summer days. When operating in polluted
geographies, data center operators must also consider particulate and gaseous
contamination, because the contaminants can influence the acceptable temperature
and humidity limits within which data centers must operate to keep corrosion-related
hardware failure rates at acceptable levels. Dehumidification, filtration, and gas-
phase filtration may become necessary in polluted geographies with high humidity.
Section 2.2 provides additional guidance on minimizing corrosion due to high RH
and gaseous pollutants.
customers and applications but is generally not the default configuration that will,
in most cases, support full operation.
To give ITE manufacturers the greatest flexibility in designing to an allowable
environmental class, power and thermal management may be triggered, and with the
new guidance on allowable ranges, “full-performance operation” has been replaced
with “full operation” in the definition of allowable environmental envelope in
Section 2.2. ITE is designed with little to no margin at the extreme upper limit of the
allowable range. The recommended range previously provided a buffer for excursions to the
allowable limits. That buffer has been removed and, consequently, power and ther-
mal management features may be triggered within the allowable range to ensure
there are no thermal excursions outside the capability of the ITE under extreme load
conditions. ITE is designed based on the probability of a worst-case event occurring,
such as the combination of extreme workloads simultaneously with room tempera-
ture excursions. Because of the low probability of simultaneous worst-case events
occurring, IT manufacturers skew their power and thermal management systems to
ensure that operation is guaranteed. Operating within a particular environmental
class requires full operation of the equipment over the entire allowable environmen-
tal range, based on nonfailure conditions. The IT purchaser must consult with the
equipment manufacturer to understand the performance capability at the extreme
upper limits of the allowable thermal envelopes.
constraints of the server, and the changes required to these components may also
affect server cost. In any case, the cost of servers supporting the newer ASHRAE
classes should be discussed with the individual server manufacturer to understand
whether this will factor into the decision to support the new classes within an indi-
vidual data center.
Liquid cooling more readily enables the reuse of waste heat. If a project is
adequately planned from the beginning, reusing the waste energy from the data
center may reduce the energy use of the site or campus. In this case, liquid cooling
is the obvious choice because the heat in the liquid can most easily be transferred to
other locations. Also, the closer the liquid is to the components, the higher the quality
of the heat that is recovered and available for alternative uses.
3.1.2 Expansions
Another time to change to or add liquid cooling is when adding or upgrading
equipment in an existing data center. Often, existing data centers do not have large
raised-floor heights or the raised floor plenum is full of obstructions such as cabling.
If a new rack of ITE is to be installed that is of higher power density than the existing
raised-floor air cooling can support, liquid cooling can be the ideal solution. Current
typical air-cooled rack powers can range from 6 to 30 kW. In many cases, rack
powers of 30 kW are well beyond what legacy air cooling can handle. Liquid cooling
to a datacom rack, cabinet-mounted chassis, cabinet rear door, or other localized
liquid-cooling system can make these higher-density racks nearly room neutral by
cooling the exhaust temperatures down to room temperature levels.
The facility water is anticipated to support any liquid-cooled ITE using water,
water plus additives, refrigerants, or dielectrics. To date, most liquid-cooling solu-
tions use a CDU as the interface of the ITE to the facility. If there is no CDU, it is
the responsibility of the facility to maintain the water-quality requirements of the
ITE as well as a water temperature guaranteed to be above the data center dew point.
The CDU may be external to the datacom rack, as shown in Figure 3.1, or within the
datacom rack, as shown in Figure 3.2.
Figure 3.2 Combination air- and liquid-cooled rack or cabinet with internal CDU.
Figures 3.1 and 3.2 show the interface for a liquid-cooled rack with remote heat
rejection. The interface is located at the boundary of the facility water system loop
and does not impact the ITE cooling system loops, which are controlled and
managed by the cooling equipment and ITE manufacturers. However, the definition
of the interface at the loop affects both the ITE manufacturers and the facility where
the ITE is housed. For that reason, all of the parameters that are key to this interface
are described in detail here. Liquid Cooling Guidelines for Datacom Equipment
Centers (ASHRAE 2014a) describes the various liquid-cooling loops that could
exist within a data center and its supporting infrastructure. Figure 3.3 shows these
liquid loops as well as two liquids—the coolant contained in the technology cooling
system (TCS) and the coolant contained in the datacom equipment cooling system
(DECS). The TCS may include in-row and overhead forced air-to-liquid heat
exchangers. If the TCS liquid is a dielectric coolant, the external CDU pump may
potentially be used to route the TCS coolant directly to cold plates attached to DECS
internal components in addition to or in place of a separate internal DECS. As seen
in Figure 3.3, the water guidelines that are discussed in this book are at the chilled-
water system (CHWS) loop. If chillers are not installed, then the guidelines would
apply to the condenser water system (CWS) loop.
Although not specifically noted, a building-level CDU may be more appropriate
where there are a large number of racks connected to liquid cooling. In this case, the
location of the interface is defined the same as in Figure 3.1, but the CDU as shown
would be a building-level unit rather than a modular unit. Building-level CDUs
handling many megawatts of power have been built for large HPC systems.
Although Figure 3.1 shows liquid cooling using a raised floor, liquid could be
distributed above the ceiling just as efficiently.
Table 3.1 Liquid-Cooling Classes
Class      Typical Infrastructure Design                                 Facility Supply Water Temperature,a °C (°F)
W17, W27   Chiller/cooling tower; water-side economizer (cooling tower)  17 (62.6); 27 (80.6)
W32, W40   Cooling tower; chiller or district heating system             32 (89.6); 40 (104)
W45, W+    Cooling tower; district heating system                        45 (113); >45 (>113)
a. Minimum water temperature for all classes is 2°C (35.6°F).
The high thermal density and continuous operating hours of data centers can be
an attractive added value in providing low-temperature hot water to high-density
building clusters with high thermal loads such as mixed-use developments, airports,
college and university campuses, and large office developments. The liquid cooling
classes with supply temperatures of 32°C (89.6°F) and higher (shown in Table 3.1) are
candidates for district heating. The option of district heating is shown in Table 3.1
for classes W32, W40, W45, and W+. Data center operators can determine whether
they can take advantage of this option by computing the energy reuse effectiveness
(ERE) metric, as described by the Green Grid (TGG 2010) and enhanced in a short
paper published on the ASHRAE TC 9.9 home page titled “An Improved Energy
Reuse Metric” (Khalifa and Schmidt 2014). Additional details on district heating,
including the supply temperature categories, can be found in Chapter 12 of ASHRAE
Handbook—HVAC Systems and Equipment (2020) and the presentation given at the
4th International Conference on Smart Energy Systems and 4th Generation District
Heating (Lund et al. 2018).
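For reference, a minimal sketch of the ERE calculation follows; the formula is the Green Grid definition (total facility energy less reused energy, divided by IT energy), and the numbers are illustrative assumptions, not measured data:

    def ere(cooling_kwh, power_dist_kwh, lighting_kwh, it_kwh, reused_kwh):
        # ERE = (total facility energy - reused energy) / IT energy
        total = cooling_kwh + power_dist_kwh + lighting_kwh + it_kwh
        return (total - reused_kwh) / it_kwh

    # Hypothetical annual energy totals for a site reusing waste heat:
    print(ere(cooling_kwh=3.0e6, power_dist_kwh=0.8e6, lighting_kwh=0.2e6,
              it_kwh=8.0e6, reused_kwh=2.5e6))  # -> 1.1875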
Although the facility supply water temperatures specified in Table 3.1 are
requirements to be met by the ITE, it is incumbent on the facility owner/designer to
ensure that the approach temperature of any planned CDU is taken into account, ensur-
ing the proper TCS temperature for the ITE. The data center operator should also note
that use of the full range of temperatures within a class may not be required, or even
desirable, given the specific data center infrastructure design.
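A minimal sketch of the CDU approach-temperature check follows; the 3 K approach and the example temperatures are assumptions for illustration, and actual values must come from the CDU vendor and the site design:

    def tcs_supply_c(facility_supply_c, cdu_approach_k=3.0):
        # The TCS loop runs warmer than the facility loop by the CDU approach.
        return facility_supply_c + cdu_approach_k

    facility_supply = 32.0   # degC, e.g., the class W32 upper limit
    room_dew_point = 18.0    # degC, assumed room dew point
    tcs = tcs_supply_c(facility_supply)
    assert facility_supply > room_dew_point, "condensation risk on facility piping"
    print(f"Plan for a TCS supply near {tcs:.1f} degC; verify against the ITE requirement.")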
Until recently, liquid cooling has been sought out for performance, density, or
efficiency reasons. There are now liquid-only processor chips, and there will be
These three tests are hierarchical in nature, and the user should consider all of
them prior to choosing the one that best fits their application. In some cases, the
proper test may be a mix of the above. For instance, a data center with low overall
power density but with localized high-density areas may elect to perform a facility
health and audit test for the entire facility but also perform an equipment installation
verification test for the area with localized high power density.
Sections 4.1 through 4.3 outline the recommended tests for measuring tempera-
ture and humidity. Section 4.4, new for the fifth edition, covers cooling simulation.
• Establish at least one point for every 3 to 9 m (10 to 30 ft) of aisle or every
fourth rack position, as shown in Figure 4.1.
• Locate points midway along the aisle, centered between equipment rows, as
shown in Figure 4.2.
• Where a hot-aisle/cold-aisle configuration is used, establish points in cold
aisles only,1 as shown in Figure 4.3.
1. Hot-aisle temperature levels do not reflect equipment inlet conditions and, therefore, may be
outside the ranges defined in Tables 2.1 and 2.2. Hot-aisle temperature levels may be measured
to help understand the facility, but significant temperature variation with measurement location
is normal.
The objective of these measurements is to ensure that the aisle temperature and
humidity levels are all being maintained within the recommended operating condi-
tions of the class environment, as noted in Tables 2.1 and 2.2 of Chapter 2.
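A minimal sketch of the aisle layout guidance above (at least one point every 3 to 9 m of aisle, centered along its length); the 6 m default spacing is an assumed middle value:

    import math

    def aisle_measurement_points(aisle_length_m, spacing_m=6.0):
        # At least one point per spacing interval, positioned at interval centers.
        n = max(1, math.ceil(aisle_length_m / spacing_m))
        step = aisle_length_m / n
        return [round(step * (i + 0.5), 2) for i in range(n)]

    print(aisle_measurement_points(24.0))  # -> [3.0, 9.0, 15.0, 21.0]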
4.1.3 Evaluation
cold supply air to bypass equipment and return directly to an HVAC unit. The cause
of any short-circuiting should be investigated and evaluated for corrective action.
All temperature and humidity levels should fall within the specifications for the
class environment specified in Tables 2.1 and 2.2. If all measurements are within
limits, equipment failure is most likely not the result of poor environmental condi-
tions. If any measurement falls outside the recommended operating condition, the
facility operations personnel may wish to consult with the equipment manufacturer
regarding the risks involved or to correct the out-of-range condition.
Note: In some facilities, in particular pressurized facilities that control humidity
levels prior to the introduction of air into the data center, the absolute humidity in the
space is typically uniform. This is because significant humidity sources do not usually
exist inside data centers. If there is not a significant source of humidity in the data
center, humidity does not have to be measured at every point, because it can be
calculated as a function of the localized temperature and the (uniform)
absolute humidity in the space at large.
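A minimal sketch of that calculation, using the Magnus approximation for saturation vapor pressure (a common engineering formula; coefficients vary slightly between references):

    import math

    def saturation_vp_hpa(t_c):
        # Magnus approximation for saturation vapor pressure over water, hPa.
        return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

    def rh_from_dew_point(dew_point_c, dry_bulb_c):
        # Local RH from the (uniform) dew point and the local dry-bulb temperature.
        return 100.0 * saturation_vp_hpa(dew_point_c) / saturation_vp_hpa(dry_bulb_c)

    print(round(rh_from_dew_point(15.0, 24.0), 1))  # ~57.2% rh in a cold aisle
    print(round(rh_from_dew_point(15.0, 30.0), 1))  # ~40.2% rh at a warmer rack top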
The recommended airflow protocols for data center equipment in Figure 5.2 closely
follow those recommended for telecom equipment in Telcordia GR-3028-CORE.
Per Telcordia GR-63-CORE (2012), forced-air-cooled equipment is required to
use only a rear aisle exhaust. If approved by exception, top-exhaust airflow equipment
may be used in support of specialized airflow requirements. Forced-air-cooled equip-
ment should use a front-aisle air inlet. Forced-air-cooled equipment with other than
front-aisle-to-rear-aisle airflow may be approved for use when fitted with manufac-
turer-provided air baffles/deflectors that effectively reroute the air to provide front-
aisle-to-rear-aisle airflow. Equipment requiring air baffles/deflectors for airflow
compliance is required to be tested by the manufacturer for compliance to GR-63-
CORE with such hardware in place. Forced-air-cooled equipment with other than
front-aisle air inlets may be approved for use but should not sustain any damage or deteri-
oration of functional performance during its operating life when operated at elevated
air inlet temperatures.
blanking panels should be added to the front cabinet rails, thereby preventing the
recirculation of hot air to the equipment inlet. Vented front and rear doors for the cabi-
net must be nonrestrictive to airflow to reduce the load on information technology
equipment (ITE) fans, which can otherwise cause undesired ITE power consumption. Gener-
ally, 60% open ratio or greater is acceptable. To assist with hot-aisle/cold-aisle isola-
tion, solid-roofed cabinets are preferred.
Two solutions are becoming more common in data centers to eliminate the
mixing of cold and hot air. These containment solutions—the cold-aisle containment
design shown in Figure 5.6 and the hot-aisle containment design shown in
Figure 5.7—prevent the mixing of cold and hot air, thereby improving energy effi-
ciency for data centers significantly in some cases.
Figure 5.4 Example of hot and cold aisles for raised-floor environments
with underfloor cooling.
ment and so on. This can create a potentially harmful situation for the equipment in
the cabinets farther to the rear. If not addressed, this condition would contribute to
increased equipment failures and system downtime. Therefore, place cabinets that
cannot use hot-aisle/cold-aisle configurations together in another area of the data
center, being careful to ensure that exhaust from various equipment is not drawn into
equipment inlets. Temperature measurements can document the effect of recircu-
lated hot air and should be compared to the recommended and allowable temperature
ranges.
Aisle pitch is defined as the distance between the center of the reference cold aisle
and the center of the next cold aisle in either direction. A common aisle pitch for data
centers is seven floor tiles, based on two controlling factors. First, it is advisable to
allow a minimum of one complete floor tile in front of each rack. Second, maintaining
a minimum of three feet in any aisle for wheelchair access may be required by Section
4.3.3 of the Americans with Disabilities Act (ADA), 28 CFR Part 36 (ADA 2010).
Based on the standard-sized domestic floor tile, these two factors result in a seven-tile
pitch, allowing two accessible tiles in the cold aisle, 914.4 mm (3 ft) in the hot aisle,
and reasonably deep rack equipment, as shown in Figure 5.8. Table 5.1 lists potential
equipment depths for a seven-tile pitch. Rack depth would have to be less than 1066.8
mm (42 in.) to maintain a seven-tile pitch.
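The arithmetic behind the seven-tile pitch can be checked directly; the sketch below assumes the standard 610 mm (2 ft) raised-floor tile:

    TILE_MM = 610.0                  # standard 2 ft (610 mm) raised-floor tile
    pitch = 7 * TILE_MM              # seven-tile pitch: 4270 mm (14 ft)
    cold_aisle = 2 * TILE_MM         # two accessible tiles: 1220 mm (4 ft)
    hot_aisle = 914.4                # 3 ft minimum aisle width
    max_rack_depth = (pitch - cold_aisle - hot_aisle) / 2
    print(round(max_rack_depth, 1))  # -> 1067.8 mm, i.e., about the 42 in. limit cited above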
Some installations require that the rear of a cabinet line up with the edge of a
removable floor tile to facilitate underfloor service, such as pulling cables. Adding
this constraint to a seven-tile pitch results in a 1.21 m (4 ft) wide hot aisle and forces
a cold aisle of less than 1.21 m (4 ft), with only one row of vented tiles and more
limited cooling capacity, as shown in Figure 5.9.
• Steady state
• User controls or programs set to a utilization rate that maximizes the number
of simultaneous components, devices, and subsystems that are active
• Nominal voltage input
• Ambient temperature between 18°C and 27°C (64.4°F and 80.6°F)
• Air-moving devices at ambient inlet temperatures as specified above
Airflow values should be reflective of those that would be seen in the ITE oper-
ating in a data center. Representative racking, cabling, and loading should be taken
into account in airflow reporting. Some ITE manufacturers use variable-speed fans,
which can result in a large variance in airflow due to equipment loading and ambient
conditions. Airflow reporting should be based on the following conditions:
• The values predicted for tested configurations are within 10% of the measured
values.
• When the predicted values vary by more than 10% from the measured values,
the predictive algorithm is updated and revalidated.
Sample Heat Release and Airflow Report
Condition / Overall Description: Full Configuration—8 CPU-B, 16 GB (2 GB cards), 64 I/O, 2 frames; ASHRAE Class A1, A2
Reported quantities: Heat Release; Typical Airflowa (Nominal; Maximum @ 35°C [95°F], @ 110 V); Weight; System Dimensionsb (W × D × H); Airflow Diagram; Cooling Scheme: F-R (front to rear)
[Numeric values were not recoverable from this excerpt.]
a. Airflow values are for an air density of 1.2 kg/m3 (0.075 lb/ft3). This corresponds to air at 18°C (64.4°F),
101.3 kPa (14.7 psia), and 50% rh.
b. Footprint does not include service clearance or cable management, which is 0 on the sides, 1168 mm (46 in.)
in the front, and 1016 mm (40 in.) in the rear.
To certify for an ENERGY STAR rating, a computer server must offer processor
power management that is enabled by default in the basic input/output system
(BIOS) and/or through a management controller, service processor, and/or the oper-
ating system shipped with the computer server. All processors must be able to reduce
power consumption in times of low utilization by
A computer server must provide data on input power consumption (W), inlet air
temperature (°C [°F]), and average utilization of all logical central processing units
(CPUs):
should not be a problem, but running near the allowable limits for extended periods
could result in increased reliability issues. (See Table 2.6 in Chapter 2 for the effects
of higher inlet temperatures on server reliability.) In reviewing the available data from
a number of IT manufacturers, the 2008 expanded recommended environmental enve-
lope became the agreed-upon envelope that is acceptable to all IT manufacturers, and
operation within this envelope does not compromise overall reliability of ITE.
This recommended envelope was created for general use across all types of busi-
nesses and conditions. However, different environmental envelopes may be more
appropriate for different business values and climate conditions. Therefore, to allow
for the potential of the ITE to operate in a different envelope that might provide even
greater energy savings, the fourth edition of Thermal Guidelines (ASHRAE 2015b)
provided general guidance on server metrics that can assist data center operators in
creating different operating envelopes that match their business values. Each of these
metrics is described in Chapter 2. By using these guidelines, the user can determine
what environmental conditions best meet their technical and business needs. Any
choice outside of the recommended region involves balancing the additional energy
savings of the cooling system against the deleterious effects that may be created on
total cost of ownership (TCO) (total site energy use, reliability, acoustics, and
performance).
None of the versions of the recommended operating environments ensure that the
data center is operating at optimum energy efficiency. Depending on the cooling
system, design, and outdoor environmental conditions, there will be varying degrees
of efficiency within the recommended zone. For instance, when the ambient tempera-
ture in a data center is raised, the thermal management algorithms within some data-
com equipment increase the speeds of air-moving devices to compensate for the higher
inlet air temperatures, potentially offsetting the gains in energy efficiency due to the
higher ambient temperature. It is incumbent upon each data center operator to review
and determine, with appropriate engineering expertise, the ideal operating point for
each system. This includes taking into account the recommended range and site-
specific conditions. The full recommended envelope is not the most energy-efficient
environment when a refrigeration cooling process is being used. For example, the high
dew point at the upper areas of the envelope results in latent cooling (condensation) on
refrigerated coils, especially in DX units. Latent cooling may decrease the available
sensible cooling capacity for the cooling system and, depending on the specific condi-
tions to be maintained in the data center, make it necessary to humidify to replace
excessive moisture removed from the air.
The ranges included in this book apply to the inlets of all equipment in the data
center (except where IT manufacturers specify other ranges). Attention is needed to
make sure the appropriate inlet conditions are achieved for the top portion of ITE
racks. The inlet air temperature in many data centers tends to be warmer at the top
portion of racks, particularly if the warm rack exhaust air does not have a direct return
path to the computer room air conditioners (CRACs). This warmer air also affects
the relative humidity (RH), resulting in lower values at the top portion of the rack.
Finally, it should be noted that the 2008 change to the recommended upper
temperature limit from 25°C to 27°C (77°F to 80.6°F) can have detrimental effects
on acoustical noise levels in the data center. See the Acoustical Noise Levels section
of this appendix for a discussion of these effects.
Figure A.3 Inlet and component temperatures with fixed fan speed.
• Below a certain inlet temperature (23°C [73.4°F] in the case described above), IT
systems using variable-speed air-moving devices have constant fan power, and
their component temperatures track fairly closely to ambient temperature
changes. Systems that do not use variable-speed air-moving devices track ambient
air temperatures over the full range of allowable ambient temperatures.
• Above a certain inlet temperature (23°C [73.4°F] in the case described
above), the speed of the air-moving device increases to maintain fairly con-
stant component temperatures and, in this case, inlet temperature changes
have little to no effect on component temperatures and thereby no effect on
Figure A.4 Inlet and component temperatures with variable fan speed.
As shown in Figure A.4, IT fan power can increase dramatically as fans ramp
up speed to counter the increased inlet ambient temperature. The graph shows a typi-
cal power increase that results in the near-constant component temperature. In this
case, the fan power increased from 11 W at 23°C (73.4°F) inlet temperature to over
60 W at 35°C (95°F) inlet temperature. The inefficiency in the power supply results
in an even larger system power increase. The total room power (facilities + IT) may
actually increase at warmer temperatures. IT manufacturers should be consulted
when considering system ambient temperatures approaching the upper recom-
mended ASHRAE temperature specification. See the work by Patterson (2008) for
a technical evaluation of the effect of increased environmental temperature, where
it was shown that an increase in temperature can actually increase energy use in a
standard data center but reduce it in a data center with economizers in the cooling
system.
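As a rough illustration of why fan power climbs so steeply, fan affinity laws make fan power scale approximately with the cube of fan speed. The speed ratio below is an assumption chosen to reproduce the 11 W to roughly 60 W example above; it is not manufacturer data:

    base_power_w = 11.0   # fan power at 23 degC (73.4 degF) inlet, minimum speed
    speed_ratio = 1.76    # assumed fan speed increase at 35 degC (95 degF) inlet
    print(round(base_power_w * speed_ratio ** 3))  # -> 60 W (power ~ speed cubed)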
Because of the derating of the maximum allowable temperature with altitude for
Classes A1 and A2, the recommended maximum temperature is derated by 1°C/
300 m (1.8°F/984 ft) above 1800 m (5906 ft).
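A minimal sketch of this derating rule, using the 27°C (80.6°F) recommended upper limit from Table 2.1 as the baseline:

    def derated_recommended_max_c(altitude_m, base_c=27.0):
        # Classes A1 and A2: derate 1 degC per 300 m above 1800 m.
        if altitude_m <= 1800.0:
            return base_c
        return base_c - (altitude_m - 1800.0) / 300.0

    print(derated_recommended_max_c(3000.0))  # -> 23.0 degC at 3000 m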
investigate the effects of gaseous pollutants and high relative humidity on the reli-
ability of ITE. Specifically, it was found that for data center environments where
silver and copper coupon testing shows corrosion levels less than 300 Å/
month for copper and 200 Å/month for silver, suggesting that only the pervasive
pollutants (SO2, NO2, and O3) may be present, the moisture limit could be raised to
70% rh for the recommended environmental envelope. However, before this change
could be made to the recommended envelope, detrimental effects to other IT compo-
nents from raising the RH limits needed to be investigated. Specifically, the question
considered was: what are the effects of this change from 60% to 70% rh on printed
circuit cards, hard disk drives (HDDs), and tape drives? The answers to this question
are addressed in the following subsections.
that is, the bar graph showing this distribution would indicate an equal number of
samples for 30% to 40%, 40% to 50%, and 50% to 60% rh. This might represent
operating a data center with high humidity to a maximum RH limit of 60% (as it was
in the 2015 recommended envelope). In this case the failure rate was 1.23%, less than
the base case of 1.5%. Raising the bins by 10%, where the distribution becomes 40%
to 50%, 50% to 60%, and 60% to 70% rh to reflect operating a data center up to the
higher RH limit of 70% for the recommended envelope, results in a failure rate
computed at 1.78%. (This might be considered a worst-case scenario for operating
a data center at the higher RHs.) This projected failure rate seems acceptable given
that it is much less than that experienced by the data center described in Figure A.5,
where the failure rate was 3.1%, more than two times that of the well-controlled
chiller-based data center with a failure rate of 1.5%.
Two other observations from the Manousakis et al. (2016) paper are worth
including here:
• It was found that in high-RH data centers, server designs that place disks in
the backs of their enclosures can reduce the disk failure rate significantly.
• Though higher RH increases component failures, relying on software tech-
niques to mask them also significantly reduces infrastructure and energy costs
and more than compensates for the cost of the additional failures.
Tape products have been following note c of Table 2.1 where 80% rh is accept-
able: tape products require a stable and more restrictive environment (similar to
Class A1 of the 2011 thermal guidelines). Typical requirements are a minimum
temperature of 15°C (59°F), a maximum temperature of 32°C (89.6°F), a minimum
RH of 20%, a maximum RH of 80%, a maximum dew point of 22°C (71.6°F), a rate
of change of temperature less than 5°C/h (9°F/h), a rate of change of humidity of less
than 5% rh per hour, and no condensation.
The lower limit of moisture for the recommended envelope as shown in Table A.1
was changed in both 2008 and 2015 and will remain as the 2015 limit in this edition
of Thermal Guidelines. The key change from the original 2004 edition to all later
editions was the change from an RH limit to a dew-point limit. The key reason for
this change is to force data center operators to control moisture based on dew point
and not RH, principally because dew point is fairly uniform throughout the data
center whereas RH is not.
Another practical benefit of the change to a dew-point limit from an RH limit
is that the HVAC systems within the data center can operate in sensible-only cooling mode.
Also, having an RH limit greatly complicates the control and operation of the cool-
ing systems and could require added humidification operation at a cost of increased
energy in order to maintain an RH when the space is already above the needed dew-
point temperature. To avoid these complications, the hours of economizer operation
available using the 2004 guidelines were often restricted.
ASHRAE funded a research project conducted by Missouri University Science
and Technology to investigate low moisture levels and the resulting ESD effects
(Pommerenke et al. 2014). The concerns raised prior to this study regarding the
increase of ESD-induced risk with reduced humidity were found to be unjustified. Based on
those results, reported in Appendix D of this book, the lower moisture limit for the
recommended envelope was reduced from 5.5°C (41.9°F) to –9°C (15.8°F) dew
point and for Classes A1 and A2 was reduced from 20% rh to –12°C (10.4°F) and
8% rh. These changes significantly reduce the humidification requirements for data
centers.
national codes. The actual regulations should be consulted, because they are
complex and beyond the scope of this book to explain fully. For instance, when levels
exceed 85 dB(A), hearing conservation programs are mandated, which can be quite
costly and generally involve baseline audiometric testing, noise level monitoring or
dosimetry, noise hazard signage, and education and training. When levels exceed
87 dB(A) (in Europe) or 90 dB(A) (in the United States), further action, such as
mandatory hearing protection, rotation of employees, or engineering controls, must
be taken. Data center managers should consult with acoustical or industrial hygiene
experts to determine whether a noise exposure problem will result from increasing
ambient temperatures to the upper recommended limit.
1. Scenario #1: Expansion of economizer use for longer periods of the year where
hardware failures are not tolerated.
• For short periods of time, it is acceptable to operate outside the recom-
mended envelope and approach the allowable extremes. All manufactur-
ers perform tests to verify that their hardware functions at the allowable
limits. For example, if during the summer months it is desirable to oper-
ate for longer periods of time using an economizer rather than turning on
the chillers, this should be acceptable as long as the period of warmer
inlet air temperatures to the ITE does not exceed several days each year;
otherwise, the long-term reliability of the equipment could be affected.
Operation near the upper end of the allowable range may result in tem-
perature warnings from the ITE. See Section 2.4.3 of Chapter 2 for infor-
mation on estimating the effects of operating at higher temperatures by
using the failure rate x-factor data.
2. Scenario #2: Expansion of economizer use for longer periods of the year where
limited hardware failures are tolerated.
• As previously stated, all manufacturers perform tests to verify that their
hardware functions at the allowable limits. For example, if during the
summer months it is desirable to operate for longer periods of time using
the economizer rather than turning on the chillers, and if the data center
operation is such that periodic hardware failures are acceptable, then operat-
ing for extended periods of time near or at the allowable limits may be
acceptable. Of course, it is a business decision of when to operate within
the allowable and recommended envelopes and for what periods of time.
Operation near the upper end of the allowable range may result in tem-
perature warnings from the ITE. See Section 2.4.3 of Chapter 2 for information
on estimating the effects of operating at higher temperatures by using the failure
rate x-factor data.
Notes for Table B.1, 2021 Thermal Guidelines for Air Cooling—
I-P Version (SI Version in Chapter 2)
a. Classes A3 and A4 are identical to those included in the 2011 version of the thermal guide-
lines. The 2015 version of the A1 and A2 classes has expanded RH levels compared to the
2011 version. The 2021 version of the thermal guidelines maintains the same envelopes for
A1 through A4 but updates the recommended range depending on the level of pollutants in
the data center environment.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to Class A1 as spec-
ified in 2008). Typical requirements: minimum temperature is 59°F, maximum temperature
is 89.6°F, minimum RH is 20%, maximum RH is 80%, maximum dew point (DP) is 71.6°F,
rate of change of temperature is less than 9°F/h, rate of change of humidity is less than 5%
rh per hour, and no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. Classes A1 and A2—Derate maximum allowable dry-bulb temperature 1.8°F/984 ft above
2953 ft. Above 7874 ft altitude, the derated dry-bulb temperature takes precedence over the
recommended temperature. Class A3—Derate maximum allowable dry-bulb temperature
1.8°F/574 ft above 2953 ft. Class A4—Derate maximum allowable dry-bulb temperature
1.8°F/410 ft above 2953 ft.
f. For tape storage: 9°F in an hour. For all other ITE: 36°F in an hour and no more than 9°F in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 9°F and 36°F temperature change is
considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 50°F (not applicable to Classes A1
or A2).
h. The minimum humidity level for Classes A1, A2, A3, and A4 is the higher (more moisture)
of the 10.4°F DP and the 8% rh. These intersect at approximately 77°F. Below this intersec-
tion (~77°F) the DP (10.4°F) represents the minimum moisture level, while above it, RH (8%)
is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-electrostatic discharge (non-ESD) floors and where personnel
are allowed to wear non-ESD shoes need increased humidity given that the risk of gener-
ating 8 kV increases slightly from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for
more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 10,000 ft.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for each of the classes for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 10,000 ft requires consultation with the IT supplier for each specific piece
of equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower.
Notes for Table B.2, 2021 Thermal Guidelines for High-Density Servers—
I-P Version (SI Version in Chapter 2)
a. This is a new class specific to high-density servers. It is at the discretion of the ITE manufac-
turer to determine the need for a product to use this high-density server class. Classes A1
through A4 are separate and are shown in Table 2.1.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to 2011 Class A1).
Typical requirements: minimum temperature is 59°F, maximum temperature is 89.6°F, mini-
mum RH is 20%, maximum RH is 80%, maximum dew point (DP) is 71.6°F, rate of change
of temperature is less than 9°F/h, rate of change of humidity is less than 5% rh per hour, and
no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. For H1 class only—Derate maximum allowable dry-bulb temperature 1°F/1640 ft above 2950
ft. Above 7870 ft altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature.
f. For tape storage: 9°F in an hour. For all other ITE: 36°F in an hour and no more than 9°F in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 9°F or 36°F temperature change is
considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 50°F. With the lowest allowed
temperature of 59°F, there is no problem with diskettes residing in this H1 environment.
h. The minimum humidity level for Class H1 is the higher (more moisture) of the 10.4°F DP and
the 8% rh. These intersect at approximately 77°F. Below this intersection (~77°F) the DP
(10.4°F) represents the minimum moisture level, while above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-electrostatic discharge (non-ESD) floors and where personnel
are allowed to wear non-ESD shoes may need increased humidity given that the risk of gener-
ating 8 kV increases slightly from 0.27% at 25% rh to 0.43% at 8% (see Appendix D for more
details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 10,000 ft.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 10,000 ft requires consultation with IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.
Appendix C
Detailed Flowchart for the
Use and Application of
the ASHRAE Data Center Classes
Figures C.1 through C.4 provide guidance to the data center operator on how to
position a data center to operate in a specific environmental envelope. These figures
permit the continued use of the recommended envelope as specified in Table 2.1 but,
more importantly, they show how to achieve even greater energy savings through the
use of a total cost of ownership (TCO) analysis using the server metrics provided in
Chapter 2.
• Human charging test: The human body voltage of a person walking on the
floor was measured as a function of floor type, footwear, grounding, and envi-
ronmental conditions.
• Cable charging by spooling and dragging: Different cables were dragged
across different surfaces and the induced voltage was measured.
• Human metal discharge: A charged person held a metallic ground and dis-
charged himself. Currents and electric fields were measured.
• Cable discharge: To emulate charges on a jacket, cables were wrapped with
aluminum foil, the foil was charged to a given voltage, and the voltages
induced on the wires were measured.
Only the data from the measurement of voltages generated by people walking
are reported here, as these test results were considered most directly related to the
humidity requirements for the environmental classes. Results from the other exper-
iments can be obtained from the research project final report by Pommerenke et al.
(2014).
The charging experiments were analyzed to obtain both the maximal voltage for
each parameter combination and the effect of parameter changes, especially the
humidity. Furthermore, an extrapolation was performed to obtain the probability of
voltages larger than typical thresholds used for electronic systems robustness. Here,
500 V (for service conditions) and 4 and 8 kV (derived from the IEC 61000-4-2 test
method [IEC 2008]) were used as the limits.
Using ESD-mitigating flooring and footwear, the risk of ESD upset and damage
can be reduced to an insignificant level, even if the humidity is allowed to drop to low
values, such as 8% (the lower limit of relative humidity for Classes A3 and A4). In
addition to using conductive footwear and flooring, other precautions should be taken,
especially under low-humidity conditions, to avoid rapid removal of non-conductive
plastic wrapping when in close proximity to ITE. Furthermore, all office chairs and
carts selected for use in data centers should have ESD-mitigating properties.
The low increase in the ESD risk with reduced humidity indicates that a data
center with a low incident rate of ESD-induced damage operating at 25% rh will
maintain a low incident rate if the humidity is reduced to 8%. The concerns raised
prior to the study regarding the increase in ESD-induced risk with reduced humidity
are not justified. A standard set of ESD mitigation procedures will ensure a very low
ESD incident rate at all humidity levels tested.
All electronic equipment placed in a data center is tested for its ESD robustness
to at least the levels set by IEC 61000-4-2, which is 4 kV contact mode and 8 kV
air discharge mode (IEC 2008). However, human charging can lead to voltages
above these levels, and discharges can have rise times that are faster than the refer-
enced event used to define the ESD test standard IEC 61000-4-2. Three voltage
limits were chosen for expressing the effects of lower humidity levels (Pommerenke
et al. 2014):
• 500 V is the limit during service. This level was selected as an assumed
robustness during service of ITE. During service, shielding panels may be
removed, the operator may handle hard drives or other plug-in devices, and
the operator might connect a laptop via a USB cable to an internal service
connector. Those service actions are usually not considered during standard-
ized IEC 61000-4-2 ESD testing, as these conditions expose sensitive elec-
tronics. In the electronics industry, it is generally considered that a voltage of
100 V is low enough to handle nearly all electronic components (such as inte-
grated circuits or transistors). However, we assume that these components are
integrated into a system, and the system, such as a hard drive, provides some
level of protection and shielding. This assumption and communication with
many people involved in ITE quality control led to a decision to use 500 V as
the service robustness threshold.
• 4 kV is derived from the level 2 contact discharge test method in IEC 61000-
4-2. This test uses a contact mode and the contact mode waveform is based on
the much more severe human metal ESD. In the examples that illustrate a pos-
sible event rate, it was assumed that the operator will only discharge himself
via a piece of metal into the ITE in 1% of the cases when he touches a server
during operation. An example of such discharge might be the discharge from
a handheld key to a key lock on the operator console of a server.
• 8 kV is derived from the level 3 air discharge test method in IEC 61000-4-2.
This is the air discharge test level that is applied to nonconductive surfaces.
Here it was assumed that the failure mechanism inside the ITE is only a func-
tion of having or not having a breakdown, independent of the current or rise
time. The dielectric breakdown threshold is not a function of holding or not
holding a metal part. In the example that illustrates a possible event rate, it
was assumed that every time the operator reaches >8 kV, damage or an upset
may occur (the human/metal ESD calculation assumed that only 1% of the
discharges are via a piece of metal, thus endangering the ITE). An example of
such a discharge might be a discharge from the surface of a touch screen into
the electronics of the screen.
The relative rate of ESD-related failures or upsets is derived for various types
of data centers based on different flooring systems and personal footwear. As esti-
mation of the actual number of ESD-related failures or upsets is impossible, hypo-
thetical scenarios of data centers are considered with the assumption that the
operator actions and ITE are constant in all these data centers. Then, using industry-
accepted ESD robustness thresholds, the probabilities of exceeding these thresholds
are calculated and compared. This analysis allows us to estimate the relative rate of
ESD-related failures or upsets as a function of environmental conditions, flooring
types, and footwear. The simulation is based on a well-defined walking pattern that
has good repeatability (see Figure D.1). Due to limitations on performing the well-
defined walking pattern for long periods of time and due to the small probability of
observing very high voltages, an extrapolation approach is used to determine the
probabilities of exceeding ESD robustness levels. Two approaches have been used
to obtain the extrapolation functions used to predict higher voltage levels:
(1) performing the extrapolation based on the distribution functions measured in the
test, and (2) performing the extrapolation based on literature data. The literature data
predict higher risk levels; however, in many cases both extrapolations lead to the
same conclusions with respect to risk level. Based on the calculated probabilities and
different categories of data center, recommendations regarding the flooring system
and footwear control are provided herein.
For this test, 18 different types of flooring samples were assembled on test plates
0.91 × 0.91 m (3 × 3 ft) in size. Twelve different types of footwear or shoe-grounding
devices were gathered, representing a broad spectrum of shoe types and materials
(shoe types are shown in Table D.1). The electrical resistance ranges are shown in
Table D.2.
Table D.2 Electrical Resistance Ranges for Flooring and Footwear
Flooring: conductive ESD floors, <1 × 10^6 Ω; dissipative ESD floors, 1 × 10^6 to <1 × 10^9 Ω;
high-resistance non-ESD floors, >1 × 10^9 Ω
Shoes: ESD shoes and shoe-grounding devices, <1 × 10^8 Ω; non-ESD shoes, >1 × 10^9 Ω
While not every shoe type was tested in every condition on all the floors,
the main types of shoes were tested as experience was gained during the test program.
Many of the shoe types performed similarly on similar floors. Therefore, in a few
cases floor types and shoe types were skipped in some of the test conditions to reduce
the redundancies. The environmental conditions in between the extremes show the
same tendencies; detail can be found in the work by Pommerenke et al. (2014).
If these data had been recorded over a very long time (e.g., one year), the voltage
might have exceeded 4 kV a few times. It should be noted that both the maximum
voltage in each walking cycle and the shape of the waveform depend on the envi-
ronmental condition, shoe and floor types, and speed and pattern of walking. The
walking experiment is repeated for different environmental conditions while keep-
ing other parameters (walking pattern, speed of walking, and types of flooring and
shoes) constant. The amplitude density of the recorded data is converted to its prob-
ability density function by dividing it by the total number of points in the data set.
The magnitude is taken to include negative charge voltages.
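A minimal sketch of that reduction from samples to exceedance probabilities; the random samples stand in for recorded body voltages and are not measurement data:

    import random

    def exceedance_probability(voltages, thresholds=(500.0, 4000.0, 8000.0)):
        # Empirical probability that the voltage magnitude exceeds each
        # robustness threshold; magnitudes include negative charge voltages.
        mags = [abs(v) for v in voltages]
        n = len(mags)
        return {v0: sum(m > v0 for m in mags) / n for v0 in thresholds}

    random.seed(1)
    samples = [random.gauss(0.0, 400.0) for _ in range(100000)]
    print(exceedance_probability(samples))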
While it may prove impossible to control with certainty the footwear worn by
personnel who enter or work in data centers, facility owners and managers should be
aware that footwear can lead to issues in the daily operation of a data center. Almost
any conventional polymer-based sole mate-
rial may lead to high charge levels, some more so than others—regardless of humid-
ity. A conductive floor will help to mitigate electrostatic charging, even from the
worst possible pair of shoes.
The results from the walking test are summarized in Table D.3.
Table D.3 Results from the Walking Test
Cumulative Probability (V > V0) with ESD Floors and ESD Shoes (Pattern Walking)
Environmental Condition          V0 = 500 V    V0 = 4 kV    V0 = 8 kV
[Values for this combination were not recoverable from this excerpt.]
Cumulative Probability (V > V0) with Non-ESD Floors and Non-ESD Shoes (Pattern Walking)
Environmental Condition          V0 = 500 V    V0 = 4 kV    V0 = 8 kV
45% rh at 27°C (80.6°F)          4.70%         0.01%        0.00%
25% rh at 27°C (80.6°F)          23%           1.13%        0.27%
8% rh at 27°C (80.6°F)           48.80%        2.28%        0.43%
Cumulative Probability (V > V0) with ESD Floors and Non-ESD Shoes (Pattern Walking)
Environmental Condition          V0 = 500 V    V0 = 4 kV    V0 = 8 kV
45% rh at 27°C (80.6°F)          0.15%         7.44E-11     1.17E-13
25% rh at 27°C (80.6°F)          5.80%         7.14E-11     2.12E-10
8% rh at 27°C (80.6°F)           12.20%        2.38E-06     3.01E-09
Because all electronic equipment placed in a data center is tested for its ESD
robustness to at least the levels set by CISPR 24 (IEC 2010) (which is 4 kV contact
mode and 8 kV air discharge mode), the columns in Table D.3 for 4 kV and 8 kV are
of primary interest. The 500 V column associated with servicing of servers is not of
particular interest since wrist straps are required for servicing these days (see foot-
note i in Table 2.1 and Table 2.2). What is noteworthy in Table D.3 is that the test
results for ESD floors/ESD shoes and ESD floors/non-ESD shoes for 4 kV, 8 kV, and
higher have zero risk for relative humidity (RH) levels at 8%. Tests were performed
for the category of non-ESD floors/ESD shoes, but not enough tests were performed
to obtain accurate probability projections. However, the results did indicate that they
would be very similar to the ESD floors/non-ESD shoes results, where the risk at
4 and 8 kV is zero. Finally, the probability results at 4 and 8 kV for non-ESD floors/
non-ESD shoes do show some slight increase in risk in going from 25% to 8% rh,
albeit the risk is low (Pommerenke et al. 2014). Since ITE is tested to 8 kV, there will
need to be some judgment on the part of the data center operator as to whether to
increase moisture levels above 8%, given this increase of risk from 0.27% to 0.43%
for 8 kV.
• Provide a conductive path from the metallic floor structure to a known build-
ing ground source.
• Ground the floor metallic support structure (stringer, pedestals, etc.) to build-
ing steel at several places within the room. The number of ground points is
based on the size of the room. The larger the room, the more ground points
that are required.
• Ensure the maximum resistance for the flooring system is 2 × 10^10 Ω, mea-
sured between the floor surface and the building ground (or an applicable
ground reference). Flooring material with a lower resistance will further
decrease static buildup and discharge. For safety, the floor covering and floor-
ing system should provide a resistance of no less than 150 kΩ when measured
between any two points on the floor space 1 m (3 ft) apart.
• Maintain ESD-control floor coverings (including carpet and tile) according to
the individual supplier’s recommendations. Carpeted floor coverings must
meet electrical conductivity requirements. Use only low-charging materials
with low-propensity ratings.
• Use only ESD-control furniture with conductive casters or wheels.
soils. Based on the results of the literature review and assuming that indoor concen-
trations would be similar to the outdoor levels in a worst-case condition when
outdoor air is used directly for free cooling, realistic indoor worst-case concentra-
tions for corrosion testing were defined as 80 ppb NO2, 60 ppb O3, 40 ppb SO2, 2
ppb Cl2, and 10 ppb H2S (Zhang et al. 2018).
All tests were performed by first exposing the test specimens (standard copper
and silver coupons or printed circuit boards; see Figure E.2) in exposure chambers
of a testing system (Figure E.3) specifically developed for this study. The test spec-
imens were then analyzed by coulometric reduction to determine the total corrosion
thickness and quantities of major corrosion products.
A mixed flowing gas test apparatus (Figure E.3) was developed for this
research. It was based on ASTM B827-05, Standard Practice for Conducting Mixed
Flowing Gas (MFG) Environmental Tests (ASTM 2014). The experiments were
designed around two groups of mixed flowing gas mixtures: one consists of the prev-
alent compounds NO2, O3, and SO2, or their combinations, and the other includes
NO2, O3, and SO2 plus Cl2 or H2S or both.
The corrosion thicknesses of copper and silver coupons were measured after six
days of exposure at 50% rh, 70% rh, and 80% rh and at 21°C and 28°C (69.8°F and
82.4°F) under different pollutant mixtures.
Figure E.2 Test specimens: a) standard copper and silver coupons and b) printed circuit board (PCB) coupons.
Figure E.4 Corrosion thicknesses for copper at 50% rh, 70% rh, and 80% rh.
This research found that copper corro-
sion is strongly dependent on RH (Zhang et al. 2019). Figure E.4 shows that when
no Cl2 or H2S is present (i.e., only NO2, O3, and SO2 were present), increasing the
RH from 50% to 70% did not cause any significant increase of corrosion thickness
for copper, but at 80% rh there was a significant increase in corrosion thickness. It
was also noticed that, for all testing at 50% rh and above with all pollutant mixtures,
none of the results were acceptable and corrosion thicknesses were well beyond
the limits for copper. The corrosion rate of silver (Figure E.5), however, was found
to have no obvious dependence on RH. Increasing the RH did not cause an obvious
difference in the corrosion thickness for the four-compound mixture
(NO2 + O3 + SO2 + Cl2). However, any test mixture with H2S caused significant
corrosion on both the copper and silver coupons.
1. The overall research results from RP-1755 (Zhang et al. 2019) follow. It is
important to note that the conclusions developed from this research are based
on the pollutant concentration at or near a maximum experienced around the
world. In most real-world cases the pollutant levels would be expected to be
much less than those tested in this research.
a. Corrosion development over time: According to the experimental results
from the 30-day tests (21°C [69.8°F] and 50% rh) for the 5-compound gas
mixture, NO2 + O3 + SO2 + Cl2 + H2S, there exists a logarithmic relation-
ship between the corrosion thickness and exposure time for copper (see the
sketch following this list).
Figure E.5 Corrosion thicknesses for silver at 50% rh, 70% rh, and
80% rh.
sion when Cl2 was not present. A further increase of the RH to 80%
resulted in significant corrosion for all gas conditions tested, including O3,
O3 + SO2, NO2 + O3, NO2 + O3 + SO2, NO2 + O3 + SO2 + Cl2, NO2 + O3
+ SO2 + H2S, and NO2 + O3 + SO2 + Cl2 + H2S. This suggests that a critical
RH exists for copper between 70% and 80% rh, above which the corrosion
thickness increases dramatically.
b. For silver, increasing the RH did not cause significant increase in the corro-
sion thickness for all gas conditions tested except for the five-compound
mixture in which increasing the RH from 50% to 70% and even to 80%
resulted in a reduction in the corrosion thickness.
c. Operating data centers with only the three pervasive compounds present
and at RH levels as high as 70% at 21°C (69.8°F) is acceptable for copper
and silver corrosion control.
3. Regarding the effects of temperature on copper and silver corrosion, RP-1755
(Zhang et al. 2019) found the following:
a. For copper, increasing the temperature from 21°C to 28°C (69.8°F to
82.4°F) while keeping the RH at the reference condition (50% rh) dramat-
ically reduced corrosion thickness for all mixture conditions tested. This
was unexpected, but a repeat test confirmed the observation. It is likely that
at a higher temperature less moisture is adsorbed on the coupon surface, so
a much smaller amount of pollutants can be adsorbed or absorbed there to
cause corrosion.
b. For silver, significant corrosion thickness was still detected at 28°C
(82.4°F) and 50% rh for the H2S-containing mixture conditions. The
elevated temperature had no significant impact on silver corrosion when
H2S was not present.
c. For data center environments where Cl2 and H2S are not present, tempera-
tures as high as 28°C (82.4°F) are acceptable for corrosion control.
4. Regarding the effects of voltage bias (electrical current) on copper and silver
corrosion, RP-1755 (Zhang et al. 2019) found the following:
a. Results from scanning electron microscopy (SEM) and energy dispersive
x-ray spectroscopy (EDS) analysis show that the voltage bias on the
printed circuit boards (PCBs) significantly reduced the corrosion at 80% rh
but slightly increased the corrosion at 50% rh.
b. Further testing and analysis are necessary to determine the combined
effects of voltage bias and RH on copper and silver corrosion.
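As noted in item 1a above, RP-1755 found a logarithmic relationship between copper corrosion thickness and exposure time. As a rough illustration of fitting such a law, thickness = a + b·ln(t), the following Python sketch uses hypothetical measurements (ours, for illustration only; these are not RP-1755 data):

    # Illustrative fit of a logarithmic growth law to corrosion measurements.
    # The data points below are hypothetical, not from RP-1755.
    import numpy as np

    t = np.array([1, 3, 7, 14, 21, 30])                    # exposure time, days
    thickness = np.array([120, 260, 370, 460, 510, 560])   # corrosion thickness, angstroms

    # Least-squares fit of thickness against ln(t): returns [slope, intercept]
    b, a = np.polyfit(np.log(t), thickness, 1)
    print(f"thickness ≈ {a:.0f} + {b:.0f}·ln(t days)")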
Appendix F
Psychrometric Charts
The psychrometric charts in this appendix graphically depict (in both SI and
I-P units) the envelopes of the allowable and recommended conditions shown in
tabular form in Tables 2.1 and 2.2 of Chapter 2. These charts would be useful to a
manufacturer trying to determine the appropriate environmental class for a new
information technology (IT) product.
Figures F.1 and F.2 show the recommended and allowable envelopes for Classes
A1, A2, A3, and A4. The recommended envelopes are shown for both low and high
levels of gaseous pollutants.
Figures F.3 and F.4 show the recommended and allowable envelopes for
Class H1. The recommended envelopes are shown for both low and high levels of
gaseous pollutants.
Table H.1 Time-Weighted Failure Rate x-Factor Calculations for Air-Side Economization for ITE in Chicago
Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
Temperature Bin                      % of Hours   x-Factor
15°C ≤ T ≤ 20°C (59°F ≤ T ≤ 68°F)      72.45%      0.865
20°C < T ≤ 25°C (68°F < T ≤ 77°F)      14.63%      1.130
25°C < T ≤ 30°C (77°F < T ≤ 86°F)       9.47%      1.335
30°C < T ≤ 35°C (86°F < T ≤ 95°F)       3.45%      1.482
Net x-Factor for Chicago, IL: 0.970
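The net x-factor is simply the time-at-temperature weighted average of the bin x-factors. As a minimal illustration of the arithmetic (a Python sketch of ours, not a tool from this book), the Chicago value can be reproduced as follows:

    # Reproduce the Table H.1 net x-factor for Chicago as a weighted average.
    # (fraction of hours, x-factor) for each temperature bin, from Table H.1.
    bins = [
        (0.7245, 0.865),  # 15°C <= T <= 20°C
        (0.1463, 1.130),  # 20°C < T <= 25°C
        (0.0947, 1.335),  # 25°C < T <= 30°C
        (0.0345, 1.482),  # 30°C < T <= 35°C
    ]
    net_x_factor = sum(fraction * x for fraction, x in bins)
    print(f"{net_x_factor:.3f}")  # 0.970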
Appendix I
ITE Reliability Data for
Selected Major
U.S. and Global Cities
In general, to make a data center failure rate projection, an accurate histogram
of the time-at-temperature for the given location is needed, and the appropriate air
temperature rise from the type of economizer being used should be considered as
well as the data center environmental control algorithm. For simplicity in the anal-
ysis conducted for this book, the impact of economization on the reliability of data
center hardware is shown here with three key assumptions about economizer
temperature rise, which are described in the paragraphs that follow.
The method of data analysis in this appendix is not meant to imply or recom-
mend a specific algorithm for data center environmental control. A detailed treatise
on economizer approach temperatures is beyond the scope of this book. The intent
here is to demonstrate the methodology applied and provide general guidance. An
engineer well versed in economizer designs should be consulted for exact tempera-
ture rises for a specific economizer type in a specific geographic location.
For air-side economizers, a reasonable assumption for the data center supply air
temperature rise above the outdoor ambient dry-bulb temperature is 1.5°C (2.7°F). For water-
side economizers, the temperature of the cooling water loop is primarily dependent
on the wet-bulb temperature of the outdoor air.
With Chicago as an example, data from Weather Data Viewer (ASHRAE
2009b) can be used to determine the number of hours during a year when compres-
sorless cooling can be used, based on an assumed approach temperature between the
wet-bulb temperature and the supply air temperature.
In the analysis done for this appendix, a reasonable assumption of 9°C (16.2°F)
was used for the combination of approaches for the cooling tower, heat exchanger(s),
and cooling coil in the air handler. For water-side economization with a dry-cooler-
type tower (closed loop, no evaporation), a 12°C (21.6°F) air temperature rise of the
data center air above the outdoor ambient air temperature is assumed. The figures
and tables in this appendix were based upon the above assumptions.
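As an illustration of how these assumptions translate outdoor weather conditions into an assumed ITE inlet air temperature, the following Python sketch applies the temperature rises stated above. The function name and interface are ours, not part of the referenced weather software:

    def inlet_temp_c(economizer: str, dry_bulb_c: float, wet_bulb_c: float) -> float:
        """Assumed ITE inlet air temperature for a given economizer type."""
        if economizer == "air-side":
            return dry_bulb_c + 1.5    # 1.5°C rise above outdoor dry bulb
        if economizer == "water-side":
            return wet_bulb_c + 9.0    # 9°C combined tower/HX/coil approach to wet bulb
        if economizer == "dry-cooler":
            return dry_bulb_c + 12.0   # 12°C rise above outdoor dry bulb (closed loop)
        raise ValueError(f"unknown economizer type: {economizer}")

    # Example hour: 24°C dry bulb, 18°C wet bulb
    print(inlet_temp_c("air-side", 24.0, 18.0))    # 25.5
    print(inlet_temp_c("water-side", 24.0, 18.0))  # 27.0
    print(inlet_temp_c("dry-cooler", 24.0, 18.0))  # 36.0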
Time-at-temperature weighted average failure rate projections are shown in
Figures I.1 through I.6 for selected U.S. and global cities and for different economizer
124 ITE Reliability Data for Selected Major U.S. and Global Cities
scenarios. The calculations for those graphs, including the percentage of hours spent
within each temperature range for each city and the reliability data as a function of
temperature, can be found in the corresponding Tables I.1 through I.6 and are based
on Weather Data Viewer (ASHRAE 2009b) software.
It is important to be clear regarding what the relative failure rate values mean.
The results have been normalized for a data center run continuously at 20°C (68°F),
which has a relative failure rate of 1.00. For those cities with values below 1.00, the
assumption is that the economizer still functions and the data center is cooled below
20°C to 15°C (68°F to 59°F) for those hours each year.
In addition, the relative failure rate shows the expected increase in the number
of failed information technology equipment (ITE) products, not the percentage of
total ITE products failing; for example, if a data center that experiences four failures
per 1000 ITE products incorporates warmer temperatures, and the relative failure
rate is 1.20, then the expected failure rate would be 4 × 1.20 = 4.8, or roughly 5,
failures per 1000 ITE products.
For the majority of U.S. and European cities, the air-side and water-side
economizer projections show failure rates that are very comparable to a tradi-
tional data center run at a steady-state temperature of 20°C (68°F). For a water-
side economizer with a dry-cooler-type tower, the failure rate projections for most
U.S. and global cities are 10% to 40% higher than the 20°C (68°F) steady-state
baseline.
For reference, each of Figures I.1 through I.6 includes three lines showing fail-
ure rate projections for continuous (24 hours a day, 365 days a year) operation at
20°C, 30°C, and 35°C (68°F, 86°F, and 95°F). Even though economized,
compressorless facilities reach temperatures of 30°C (86°F) and higher, their failure
rate projections are still far below the failure rates one would expect from
continuous, high-temperature, steady-state operation.
1. The weather data being considered for both the net x-factor calculation and
hours per year of chiller operation are based only on temperature and not on
humidity.
a. The impact of humidity on the net x-factor calculation is currently under
development and needs to be considered based on the local climate.
b. The impact of humidity on hours per year of chiller operation varies based
on excursion type and humidity management techniques and needs to be
considered based on the local climate.
2. U.S. cities marked with an asterisk on the figures in this appendix are located
in the part of the country where ANSI/ASHRAE/IES Standard 90.1 (ASHRAE
2013) does not mandate economization. Most of these cities lie in a region of
the U.S. that is both warm and humid.
3. The number of hours per year of chiller operation required in the cities analyzed
in Figures I.1 through I.6 is shown in Figures I.7 through I.12 (a sketch of this
count appears after this list). A data center facility located in a climate that
requires zero hours of chiller operation per year could be built without a chiller.
4. For a majority of U.S. and European cities, and even some Asian cities, it is
possible to build economized data centers that rely almost entirely on the local
climate for their cooling needs. However, the availability of Class A3 and A4
capable ITE significantly increases the number of U.S. and global locations
where compressorless facilities could be built and operated. The use of air- and
water-side economization (versus dry-cooler-type water-side economization)
also increases the number of available locations for compressorless facilities.
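The chiller-hours count referenced in item 3 above can be sketched as follows (our illustration; it assumes Class A2 equipment with a 35°C inlet limit, the temperature rises stated earlier, and a year of hourly weather data supplied as (dry-bulb, wet-bulb) pairs):

    def chiller_hours(hourly_weather, rise_c, use_wet_bulb=False, inlet_limit_c=35.0):
        """Count hours in which the economizer alone cannot hold the inlet limit."""
        hours = 0
        for dry_bulb_c, wet_bulb_c in hourly_weather:
            base = wet_bulb_c if use_wet_bulb else dry_bulb_c
            if base + rise_c > inlet_limit_c:
                hours += 1
        return hours

    # Air-side: 1.5°C rise on dry bulb; water-side: 9°C approach on wet bulb;
    # dry-cooler: 12°C rise on dry bulb. `weather` stands in for 8760 hourly pairs.
    # air_hours   = chiller_hours(weather, rise_c=1.5)
    # water_hours = chiller_hours(weather, rise_c=9.0, use_wet_bulb=True)
    # dry_hours   = chiller_hours(weather, rise_c=12.0)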
Figure I.1 Failure rate projections for air-side economizer for selected
U.S. cities.
Figure I.2 Failure rate projections for water-side economizer for selected
U.S. cities.
Table I.1 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Air-Side Economization for
Selected Major U.S. Cities Assuming 1.5°C (2.7°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature
Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins
Table I.2 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Economization for
Selected Major U.S. Cities Assuming 9°C (16.2°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature
Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins
Location            15°C ≤ T ≤ 20°C     20°C < T ≤ 25°C     25°C < T ≤ 30°C     30°C < T ≤ 35°C     Net
                    (59°F ≤ T ≤ 68°F)   (68°F < T ≤ 77°F)   (77°F < T ≤ 86°F)   (86°F < T ≤ 95°F)   x-Factor
                    % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor
Helena, MT 80.69% 0.865 16.49% 1.130 2.81% 1.335 0.01% 1.482 0.922
Denver, CO 72.50% 0.865 22.24% 1.130 5.26% 1.335 0.00% 1.482 0.949
Seattle, WA 64.41% 0.865 30.06% 1.130 5.48% 1.335 0.05% 1.482 0.971
Madison, WI 62.54% 0.865 16.01% 1.130 15.78% 1.335 5.67% 1.482 1.017
Boston, MA 59.42% 0.865 17.90% 1.130 16.98% 1.335 5.70% 1.482 1.027
San Francisco, CA 41.62% 0.865 52.99% 1.130 5.38% 1.335 0.01% 1.482 1.031
Chicago, IL 59.59% 0.865 16.16% 1.130 16.80% 1.335 7.45% 1.482 1.033
Washington DC 48.73% 0.865 15.85% 1.130 19.29% 1.335 16.13% 1.482 1.097
Phoenix, AZ 35.94% 0.865 30.06% 1.130 20.13% 1.335 13.87% 1.482 1.125
Los Angeles, CA 20.92% 0.865 46.95% 1.130 31.50% 1.335 0.62% 1.482 1.141
Atlanta, GA 37.79% 0.865 18.17% 1.130 23.69% 1.335 20.36% 1.482 1.150
Dallas, TX 33.72% 0.865 16.09% 1.130 20.84% 1.335 29.35% 1.482 1.187
Houston, TX 22.14% 0.865 14.95% 1.130 21.60% 1.335 41.31% 1.482 1.261
Miami, FL 2.98% 0.865 8.58% 1.130 27.52% 1.335 60.93% 1.482 1.393
Figure I.4 Failure rate projections for air-side economizer for selected
global cities.
Table I.3 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Dry-Cooler-Type
Tower Economization for Selected Major U.S. Cities Assuming 12°C (21.6°F) Temperature Rise
between Outdoor Ambient Temperature and ITE Inlet Air Temperature
Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization with Dry-Cooler-Type Tower
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins
Location            15°C ≤ T ≤ 20°C     20°C < T ≤ 25°C     25°C < T ≤ 30°C     30°C < T ≤ 35°C     Net
                    (59°F ≤ T ≤ 68°F)   (68°F < T ≤ 77°F)   (77°F < T ≤ 86°F)   (86°F < T ≤ 95°F)   x-Factor
                    % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor
Helena, MT 53.32% 0.865 15.61% 1.130 13.48% 1.335 17.59% 1.482 1.078
Madison, WI 48.26% 0.865 12.46% 1.130 13.79% 1.335 25.49% 1.482 1.120
Seattle, WA 33.56% 0.865 30.79% 1.130 22.16% 1.335 13.48% 1.482 1.134
Denver, CO 44.26% 0.865 14.85% 1.130 15.30% 1.335 25.59% 1.482 1.134
Chicago, IL 44.31% 0.865 12.83% 1.130 13.74% 1.335 29.13% 1.482 1.143
Boston, MA 41.16% 0.865 16.23% 1.130 15.95% 1.335 26.66% 1.482 1.147
Washington DC 29.94% 0.865 15.13% 1.130 14.89% 1.335 40.04% 1.482 1.222
San Francisco, CA 6.42% 0.865 38.41% 1.130 40.38% 1.335 14.79% 1.482 1.248
Atlanta, GA 18.89% 0.865 14.56% 1.130 17.00% 1.335 49.55% 1.482 1.289
Dallas, TX 15.96% 0.865 13.08% 1.130 14.69% 1.335 56.27% 1.482 1.316
Los Angeles, CA 1.01% 0.865 15.41% 1.130 45.39% 1.335 38.19% 1.482 1.355
Houston, TX 9.23% 0.865 10.99% 1.130 14.24% 1.335 65.54% 1.482 1.365
Phoenix, AZ 3.93% 0.865 12.14% 1.130 16.35% 1.335 67.58% 1.482 1.391
Miami, FL 0.30% 0.865 1.91% 1.130 6.15% 1.335 91.64% 1.482 1.464
Table I.4 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Air-Side Economization for
Selected Major Global Cities Assuming 1.5°C (2.7°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature
Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
% Bin Hours and Associated x-Factors for Global Cities at Various Temperature Bins
Figure I.5 Failure rate projections for water-side economizer for selected
global cities.
Table I.6 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Dry-Cooler-Type
Tower Economization for Selected Major Global Cities Assuming 12°C (21.6°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature
Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization with Dry-Cooler-Type Tower
% Bin Hours and Associated x-Factors for Global Cities at Various Temperature Bins
Location            15°C ≤ T ≤ 20°C     20°C < T ≤ 25°C     25°C < T ≤ 30°C     30°C < T ≤ 35°C     Net
                    (59°F ≤ T ≤ 68°F)   (68°F < T ≤ 77°F)   (77°F < T ≤ 86°F)   (86°F < T ≤ 95°F)   x-Factor
                    % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor % of Hours x-Factor
Oslo 56.72% 0.865 18.01% 1.130 16.66% 1.335 8.60% 1.482 1.044
Frankfurt 42.53% 0.865 21.55% 1.130 19.40% 1.335 16.52% 1.482 1.115
London 32.43% 0.865 30.63% 1.130 23.79% 1.335 13.15% 1.482 1.139
Milan 32.39% 0.865 17.26% 1.130 17.55% 1.335 32.80% 1.482 1.196
Rome 16.40% 0.865 22.61% 1.130 23.76% 1.335 37.23% 1.482 1.266
Tokyo 20.82% 0.865 17.79% 1.130 17.84% 1.335 43.55% 1.482 1.265
Sydney 2.73% 0.865 15.74% 1.130 32.88% 1.335 48.65% 1.482 1.361
Hong Kong 0.27% 0.865 3.70% 1.130 15.78% 1.335 80.26% 1.482 1.444
Bangalore 0.00% 0.865 0.03% 1.130 5.66% 1.335 94.31% 1.482 1.474
Singapore 0.00% 0.865 0.00% 1.130 0.00% 1.335 100.00% 1.482 1.482
Mexico City 4.72% 0.865 19.39% 1.130 39.87% 1.335 36.02% 1.482 1.326
São Paulo 0.29% 0.865 6.46% 1.130 30.25% 1.335 63.00% 1.482 1.413
San Jose, CR 0.00% 0.865 0.01% 1.130 4.84% 1.335 95.15% 1.482 1.475
Figure I.7 Number of hours per year of chiller operation required for
air-side economizer for selected U.S. cities.
Figure I.8 Number of hours per year of chiller operation required for
water-side economizer for selected U.S. cities.
Figure I.9 Number of hours per year of chiller operation required for
water-side dry-cooler economizer for selected U.S. cities.
Figure I.10 Number of hours per year of chiller operation required for
air-side economizer for selected global cities.
Figure I.11 Number of hours per year of chiller operation required for
water-side economizer for selected global cities.
Figure I.12 Number of hours per year of chiller operation required for
water-side dry-cooler economizer for selected global cities.
Appendix J
OSHA and
Personnel Working in
High Air Temperatures
As data center cold-aisle air temperatures have significantly increased due to the
increased ASHRAE recommended rack inlet air temperatures, so too have the hot-
aisle temperatures. As a result, many data center owners, operators, and IT manu-
facturers are concerned about personnel who work in these elevated-temperature
environments. The 2011 Thermal Guidelines Classes A3 and A4 allowed for infor-
mation technology equipment (ITE) inlet air temperatures up to 40°C and 45°C
(104°F and 113°F), respectively, which can result in hot-aisle temperatures that
exceed 50°C (122°F). These temperatures are much higher than traditional cold- and
hot-aisle temperatures and can pose a significant health hazard to personnel who
work in these environments.
The U.S. Department of Labor’s Occupational Safety and Health Administra-
tion (OSHA), as well as the European Union’s Agency for Safety and Health at Work
(EU-OSHA), determine the minimum worker safety standards for the United States
and the European Union. As of January 2012, neither organization had any
particular regulations specifying the allowable temperature ranges for working envi-
ronments. Instead, guidance such as that of the UK Health and Safety Executive (HSE)
recommends that workroom temperatures provide “reasonable” comfort levels:
The temperature in workrooms should provide reasonable comfort without the
need for special clothing. Where such a temperature is impractical because of
hot or cold processes, all reasonable steps should be taken to achieve a tempera-
ture which is as close as possible to comfortable. “Workroom” means a room
where people normally work for more than short periods. (HSE 1992)
Although OSHA does not have a particular regulation or standard that covers
high-temperature environments, the General Duty Clause, Section 5(a)(1) of the
Occupational Safety and Health Act of 1970 (OSHA 2019), requires each employer
to “furnish to each of his employees employment and a place of employment which
are free from recognized hazards that are causing or are likely to cause death or seri-
ous physical harm.” OSHA has interpreted this rule such that employers shall
provide means and methods that will reduce the likelihood of worker heat stress.
These means or methods may include issuing personal protective equipment (PPE),
minimizing exposure through frequent breaks and frequent hydration, and developing
a heat stress program. Various manufacturers produce PPE for hot
working environments.
NIOSH (2016) and OSHA (2019) state that employers should develop a written
health and safety policy outlining how workers in hot environments will be protected
from heat stress. As a minimum, the following steps should be taken and addressed:
                            Workload
% of Work per Hour          Light            Moderate         Heavy*           Very Heavy*
75% to 100% (continuous)    31.0°C (87.8°F)  28.0°C (82.4°F)  N/A              N/A
50% to 75%                  31.0°C (87.8°F)  29.0°C (84.2°F)  27.5°C (81.5°F)  N/A
25% to 50%                  32.0°C (89.6°F)  30.0°C (86.0°F)  29.0°C (84.2°F)  28.0°C (82.4°F)
0% to 25%                   32.5°C (90.5°F)  31.5°C (88.7°F)  30.5°C (86.9°F)  30.0°C (86.0°F)
* Criteria values are not provided for heavy or very heavy work for continuous and 25% rest because of the
extreme physical strain. Detailed job hazard analyses and physiological monitoring should be used for these
cases rather than these screening criteria.
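For illustration, the screening limits in the table above can be encoded as a simple lookup (our sketch, not part of the NIOSH or OSHA guidance; None marks the N/A cells, which call for detailed job hazard analysis instead):

    # Screening-criteria temperature limits (°C) keyed by % of work per hour.
    # Column order: light, moderate, heavy, very heavy. None encodes N/A.
    LIMITS_C = {
        "75-100": (31.0, 28.0, None, None),
        "50-75":  (31.0, 29.0, 27.5, None),
        "25-50":  (32.0, 30.0, 29.0, 28.0),
        "0-25":   (32.5, 31.5, 30.5, 30.0),
    }
    WORKLOADS = ("light", "moderate", "heavy", "very heavy")

    def screening_limit_c(pct_work: str, workload: str):
        """Return the screening temperature limit, or None where N/A."""
        return LIMITS_C[pct_work][WORKLOADS.index(workload)]

    print(screening_limit_c("50-75", "moderate"))  # 29.0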
Data center operators should evaluate air tempera-
tures, workload levels, and worker safety in their data centers if the temperatures
exceed 25°C (77°F).
It is important to note that although there are no particular laws or regulations
for the data center industry that prohibit working in 40°C (104°F) and above envi-
ronments, great care must be taken to ensure the safety of all personnel who may be
exposed to such temperatures and that appropriate safety and heat stress prevention
measures are implemented.
Appendix K
Allowable Server
Inlet Temperature
Rate of Change
The inlet air temperature change requirements of 5°C (9°F) in an hour (for tape
equipment) and 20°C (36°F) in an hour (for other types of IT equipment not includ-
ing tape) are not temperature rates of change. Figures K.1 through K.4 provide
examples of air inlet temperatures that are either compliant or noncompliant with the
temperature change requirements for data center rooms with and without tape-based
information technology equipment (ITE).
The control algorithms of many data center HVAC systems generate small
but rapid fluctuations in the cold air supply temperature, which can have a very
high rate of temperature change (see Figure K.5). These small changes are not a
problem for ITE functionality and reliability, because the time scale of the air inlet
temperature changes is typically too short for a large thermal mass, such as a
storage array, to respond to the changes (see Figure K.6).
A time lag of five minutes in responding to a change in air inlet temperature
is not unusual for the hard disk drives (HDDs) in a piece of ITE.
Small but rapid air temperature changes from the data center HVAC system
generally occur on a time scale much shorter than the time lag of the HDDs so that
the hard drives do not have a chance to respond to the rapid rates of temperature
change in the airstream. The extent of temperature change in the HDDs may also
be reduced by the cooling fan control algorithm of the equipment enclosure. Thus,
HDDs in ITE are significantly buffered from temperature changes and the rate of
temperature change of the air in the equipment inlet airstream. Other sub-
assemblies within the ITE (e.g., solid-state drives, option cards, power supplies)
are also somewhat buffered from data center air temperature changes. However,
this buffering is to a degree dependent on their thermal mass, cooling airflow, and
location within the ITE.
The intent of defining the inlet air temperature change requirements as 5°C (9°F) and
20°C (36°F) for tape and other types of ITE, respectively, is twofold: 1) to provide
data center facility-level requirements that will keep the critical internal components
and subassemblies of the ITE within the manufacturer’s requirements, and 2) to
avoid costly and unnecessary data center HVAC system and facility upgrades that
might otherwise be needed to comply with the former rate-of-change-based requirement.
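A compliance check against these requirements can be sketched as follows (our illustration, assuming inlet temperatures logged as chronological (minutes, °C) samples): the temperature spread within every rolling window must stay within the limit for that window length.

    def max_change(samples, window_min):
        """Largest (max - min) temperature spread within any rolling window.

        samples: list of (time_min, temp_c) tuples in chronological order.
        """
        worst = 0.0
        for i, (t0, _) in enumerate(samples):
            window = [temp for t, temp in samples[i:] if t - t0 <= window_min]
            worst = max(worst, max(window) - min(window))
        return worst

    def compliant(samples, has_tape):
        """Check the change-in-a-window requirements described above."""
        if has_tape:
            return max_change(samples, 60) <= 5.0        # 5°C (9°F) in an hour
        return (max_change(samples, 60) <= 20.0          # 20°C (36°F) in an hour
                and max_change(samples, 15) <= 5.0)      # 5°C (9°F) in 15 minutes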
(a) (b)
Figure K.1 Examples of tape equipment inlet air temperature versus time
that are compliant with the 5°C (9°F) in an hour temperature
change requirement for data center rooms with tape
equipment.
(a) (b)
Figure K.2 Examples of tape equipment inlet air temperature versus time
that are noncompliant with the 5°C (9°F) in an hour temperature
change requirement for data center rooms with tape equipment.
(a) (b)
Figure K.3 Examples of equipment inlet air temperature versus time that are
compliant with the 20°C (36°F) in an hour and the 5°C (9°F) in
15 minutes temperature change requirements for data center
rooms that contain other types of ITE not including tape.
(a) (b)
(c)
Figure K.4 Examples of equipment inlet air temperature versus time that:
a) are noncompliant with the 20°C (36°F) in an hour
requirement, b) are noncompliant with the 5°C (9°F) in 15
minutes requirement, and c) are noncompliant with 5°C (9°F)
in 15 minutes requirement but compliant with 20°C (36°F) in an
hour requirement for data center rooms that contain other
types of ITE not including tape.
Figure K.5 Example of ITE air inlet temperature rate of change (°C/h)
calculated over 1 min, 5 min, 15 min, and 60 min time intervals.
Figure K.6 Example of time delay between inlet air temperature change to
storage array and the corresponding temperature change in
HDDs of the storage array.
Appendix L
Allowable Server Inlet RH Limits
versus Maximum Inlet
Dry-Bulb Temperature
In most information technology equipment (ITE) specifications, the allowable inlet
air relative humidity (RH) limits are not static but are instead a function of the inlet air
dry-bulb temperature. In other words, the RH specification is not simply the stated mini-
mum and maximum RH values—these values are usually modified by minimum and
maximum dew-point limits. Whether or not the dew-point limits affect the RH limits is
a function of the dry-bulb temperature of the inlet air. Dew-point limits are typically used
to reduce allowable high humidity values at high dry-bulb temperatures and to increase
the minimum allowable humidity value at low dry-bulb temperatures.
RH is the ratio, expressed as a percentage, of the partial pressure of water vapor to the
saturation pressure at a given dry-bulb temperature. Thus, RH is relative to a given temperature. If the
temperature of a parcel of air is changed, the RH will also change even though the abso-
lute amount of water present in the air remains unchanged.
Dew point is a measure of the absolute water content of a given volume of air. It is
also the temperature to which the air must be cooled for its water vapor to reach saturation (100% rh).
Consider Class A3 from the 2015 thermal guidelines. Class A3 is defined as a mois-
ture range of –12°C (10.4°F) dew point and 8% rh to 24°C (75.2°F) dew point and
85% rh. The 24°C (75.2°F) maximum dew-point limit restricts high RH values at higher
temperatures. The –12°C (10.4°F) minimum dew-point restriction raises the minimum
allowable RH at lower temperatures. These effects are illustrated in Figure L.1.
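As a minimal sketch of the underlying arithmetic (ours, using the common Magnus approximation for saturation vapor pressure rather than any formula from this book), the effective RH implied by a dew-point limit at a given dry-bulb temperature can be computed as follows:

    import math

    def p_sat_kpa(t_c: float) -> float:
        """Approximate saturation vapor pressure (Magnus formula)."""
        return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

    def rh_at(dry_bulb_c: float, dew_point_c: float) -> float:
        """RH (%) implied by a dew point at a given dry-bulb temperature."""
        return 100.0 * p_sat_kpa(dew_point_c) / p_sat_kpa(dry_bulb_c)

    # Class A3 example: the 24°C maximum dew point caps RH well below 85%
    # at warm dry-bulb temperatures, while the -12°C minimum dew point
    # raises the effective RH floor at cool dry-bulb temperatures.
    print(round(rh_at(35.0, 24.0)))   # ~53% effective ceiling at 35°C dry bulb
    print(round(rh_at(18.0, -12.0)))  # ~12% effective floor at 18°C dry bulb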
The purpose of applying dew-point limits to restrict RH values at high and low
temperatures is to minimize known reliability issues. For example, many types of corro-
sion are exponentially accelerated by RH and temperature. The maximum dew-point
limit helps reduce the risk of a corrosion-related failure by limiting the maximum RH
allowed at high temperatures. Similarly, damage to ITE from electrostatic discharge
(ESD) can be a problem at low RH levels. The minimum dew-point value serves to raise
RH limits at low temperatures to mitigate the risk of equipment damage from ESD.
Figures L.2 through L.7 show climatograms to graphically illustrate each of the
2021 thermal guideline classes and to show how the application of dew-point restrictions
changes the RH limits.
Verdingovas, V., M.S. Jellesen, and R. Ambat. 2014. Impact of NaCl contamina-
tion and climatic conditions on the reliability of printed circuit board assem-
blies. IEEE Transactions on Device and Materials Reliability, 14(1):42–51.
Zhang, H., S. Shao, H. Xu, H. Zou, and C. Tian. 2014. Free cooling of data cen-
ters: A review. Renewable and Sustainable Energy Reviews 35:171–82.
Zhang, R., R. Schmidt, J. Gilbert, and J. Zhang. 2018. Effects of gaseous pollution
and thermal conditions on the corrosion rates of copper and silver in data cen-
tre environment: A literature review. 7th International Building Physics Con-
ference, IBPC2018, Proceedings. https://surface.syr.edu/cgi/viewcontent.cgi
?article=1280&context=ibpc.
Zhang, J., R. Zhang, R. Schmidt, J. Gilbert, and B. Guo. 2019. Impact of gaseous
contamination and high humidity on the reliable operation of information
technology equipment in data centers. ASHRAE Research Project 1755, Final
Report. Peachtree Corners, GA: ASHRAE.
Zhang, R., J. Zhang, R. Schmidt, J. Gilbert, and B. Guo. 2020. Effects of moisture
content, temperature and pollutant mixture on atmospheric corrosion of cop-
per and silver and implications for the environmental design of data centers
(RP-1755). Science and Technology for the Built Environment 26(4): 567–86.