
Thermal Guidelines for Data Processing Environments | Fifth Edition


Essential Guidance for Data Center Designers and Operators

Thermal Guidelines for Data Processing Environments provides groundbreaking, vendor-neutral information that empowers data center designers, operators, and managers to better determine the impacts of varying design and operation parameters on information technology equipment (ITE).
This book covers six primary areas:
• Environmental guidelines for air-cooled equipment
• New environmental class for high-density air-cooled equipment
• Environmental guidelines for liquid-cooled equipment
• Facility temperature and humidity measurement
• Equipment placement and airflow patterns
• Equipment manufacturers’ heat load and airflow requirement reporting
Since its first publication in 2004, Thermal Guidelines has enabled HVAC equipment manufacturers and installers, data center designers, and facility operators to find common solutions and standard practices that facilitate ITE interchangeability while preserving industry innovation. This fifth edition features clarified wording throughout, changes due to research on the effects of high relative humidity and gaseous pollutants on the corrosion of ITE, and a new environmental class for high-density server equipment. The book also includes a removable reference card with helpful information for facility managers and others. The reference card may also be accessed online.
This book is the first in the ASHRAE Datacom Series, authored by ASHRAE Technical Committee 9.9, Mission Critical Facilities, Data Centers, Technology Spaces and Electronic Equipment. The series provides comprehensive treatment of datacom cooling and related subjects.

Thermal Guidelines for
Data Processing
Environments

Fifth Edition
Thermal Guidelines for Data Processing Environments is authored by ASHRAE Technical Committee (TC) 9.9, Mission Critical Facilities, Data Centers, Technology Spaces and Electronic Equipment. ASHRAE TC 9.9 is composed of a wide range of industry representatives,
including but not limited to equipment manufacturers, consulting engineers, data center
operators, academia, testing laboratories, and government officials who are all committed
to increasing and sharing the body of knowledge related to data centers.

Thermal Guidelines for Data Processing Environments is not an ASHRAE Guideline and has not been developed in accordance with ASHRAE’s consensus process.

For more information on the ASHRAE Datacom Series, visit www.ashrae.org/datacenterguidance.

For more information on ASHRAE TC 9.9, visit https://tc0909.ashraetcs.org/.

Updates and errata for this publication will be posted on the ASHRAE website at www.ashrae.org/publicationupdates.
Thermal Guidelines
for Data Processing
Environments
Fifth Edition

ASHRAE Datacom Series


Book 1

Peachtree Corners
ISBN 978-1-947192-64-5 (paperback)
ISBN 978-1-947192-65-2 (PDF)
© 2004, 2008, 2012, 2015, 2021 ASHRAE. All rights reserved.

180 Technology Parkway · Peachtree Corners, GA 30092 · www.ashrae.org


ASHRAE is a registered trademark of the
American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.
____________________________________________

ASHRAE has compiled this publication with care, but ASHRAE has not investigated, and
ASHRAE expressly disclaims any duty to investigate, any product, service, process, proce-
dure, design, or the like that may be described herein. The appearance of any technical data
or editorial material in this publication does not constitute endorsement, warranty, or guar-
anty by ASHRAE of any product, service, process, procedure, design, or the like. ASHRAE
does not warrant that the information in the publication is free of errors, and ASHRAE does
not necessarily agree with any statement or opinion in this publication. The entire risk of the
use of any information in this publication is assumed by the user.

No part of this publication may be reproduced without permission in writing from ASHRAE, except by a reviewer who may quote brief passages or reproduce illustrations in
a review with appropriate credit, nor may any part of this publication be reproduced, stored
in a retrieval system, or transmitted in any way or by any means—electronic, photocopying,
recording, or other—without permission in writing from ASHRAE. Requests for permission
should be submitted at www.ashrae.org/permissions.

Library of Congress Cataloging-in-Publication Data

Names: ASHRAE (Firm), author.


Title: Thermal guidelines for data processing environments.
Description: Fifth edition. | Peachtree Corners, GA : ASHRAE : Peachtree
Corners, [2021] | Series: ASHRAE datacom series ; book 1 | Includes
bibliographical references. | Summary: "Covers equipment operating
environment guidelines for air-cooled equipment, environmental
guidelines for liquid-cooled equipment, facility temperature and
humidity measurement, equipment placement and airflow patterns,
equipment manufacturers' heat load and airflow requirements reporting,
and methods for increasing energy efficiency and avoiding electrostatic
discharge"-- Provided by publisher.
Identifiers: LCCN 2020046021 | ISBN 9781947192645 (paperback) | ISBN
9781947192652 (adobe pdf)
Subjects: LCSH: Data processing service centers--Cooling. | Data processing
service centers--Heating and ventilation. | Buildings--Environmental
engineering. | Data processing service centers--Design and construction.
| Electronic data processing departments--Equipment and
supplies--Protection. | Electronic apparatus and appliances--Cooling.
Classification: LCC TH7688.C64 T488 2021 | DDC 697.9/316--dc23
LC record available at https://lccn.loc.gov/2020046021

ASHRAE STAFF
SPECIAL PUBLICATIONS Cindy Sheffield Michaels, Editor
James Madison Walker, Managing Editor of Standards
Lauren Ramsdell, Associate Editor
Mary Bolton, Assistant Editor
Michshell Phillips, Senior Editorial Coordinator
PUBLISHING SERVICES David Soltis, Group Manager of Publishing Services
Jayne Jackson, Publication Traffic Administrator
DIRECTOR OF PUBLICATIONS
AND EDUCATION Mark S. Owen



Contents
Preface to the Fifth Edition. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix

Acknowledgments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi

Chapter 1—Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Book Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Primary Users of This Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.3 Adoption . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

Chapter 2—Environmental Guidelines for Air-Cooled Equipment . . . 9


2.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2 New Air-Cooled Equipment Environmental Specifications . . . . 11
2.2.1 Environmental Class Definitions 
for Air-Cooled Equipment . . . . . . . . . . . . . . . . . . . . . . . . 17
2.2.2 Environmental Class Definition 
for High-Density Air-Cooled Equipment . . . . . . . . . . . . . 21
2.2.3 ETSI Environmental Specifications . . . . . . . . . . . . . . . . 24
2.3 Guide for the Use and Application of the
ASHRAE Data Center Classes . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.4 Server Metrics to Consider in Using Guidelines . . . . . . . . . . . . 27
2.4.1 Server Power Trend versus Ambient Temperature . . . . 28
2.4.2 Acoustical Noise Levels versus Ambient Temperature . . 30
2.4.3 Server Reliability Trend versus Ambient Temperature. . 32
2.4.4 Server Reliability versus Moisture, Contamination, 
and Other Temperature Effects . . . . . . . . . . . . . . . . . . . 35
2.4.5 Server Performance Trend versus 
Ambient Temperature. . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.4.6 Server Cost Trend versus Ambient Temperature. . . . . . 39
2.4.7 Summary of Air-Cooled Equipment
Environmental Specifications . . . . . . . . . . . . . . . . . . . . . 40
vi Contents

Chapter 3—Environmental Guidelines for Liquid-Cooled Equipment . . . . . . . . . . 41
3.1 ITE Liquid Cooling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .42
3.1.1 New Construction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .42
3.1.2 Expansions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .43
3.1.3 High-Performance Computing 
and Other High-Density Workloads . . . . . . . . . . . . . . . . .43
3.1.4 ITE and Facilities Interface . . . . . . . . . . . . . . . . . . . . . . .44
3.2 Facility Water Supply Temperature Classes for ITE . . . . . . . . .46
3.2.1 Liquid Cooling Environmental Class Definitions . . . . . . .46
3.2.2 Condensation Considerations . . . . . . . . . . . . . . . . . . . . .48

Chapter 4—Facility Temperature and Humidity Measurement . . . . .49


4.1 Facility Health and Audit Tests. . . . . . . . . . . . . . . . . . . . . . . . . .50
4.1.1 Aisle Measurement Locations . . . . . . . . . . . . . . . . . . . . .50
4.1.2 HVAC Operational Status . . . . . . . . . . . . . . . . . . . . . . . .51
4.1.3 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .52
4.2 Equipment Installation Verification Tests . . . . . . . . . . . . . . . . . .53
4.3 Equipment Troubleshooting Tests . . . . . . . . . . . . . . . . . . . . . . .54
4.4 Cooling Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .56

Chapter 5—Equipment Placement and Airflow Patterns . . . . . . . . . .57


5.1 Equipment Airflow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .57
5.1.1 Airflow Protocol Syntax . . . . . . . . . . . . . . . . . . . . . . . . . .57
5.1.2 Airflow Protocol for Equipment . . . . . . . . . . . . . . . . . . . .57
5.1.3 Cabinet Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .58
5.2 Equipment Room Airflow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .59
5.2.1 Placement of Cabinets and Rows of Cabinets . . . . . . . .59
5.2.2 Cabinets with Dissimilar Airflow Patterns . . . . . . . . . . . .60
5.2.3 Aisle Pitch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .61

Chapter 6—Equipment Manufacturers’ Heat and Airflow Reporting . . . . . . . . . . 65
6.1 Providing Heat Release and Airflow Values . . . . . . . . . . . . . . . .65
6.2 Equipment Thermal Report . . . . . . . . . . . . . . . . . . . . . . . . . . . .66
6.3 EPA ENERGY STAR Reporting. . . . . . . . . . . . . . . . . . . . . . . .68
Thermal Guidelines for Data Processing Environments, Fifth Edition vii

Appendix A—2021 ASHRAE Environmental Guidelines for ITE—Expanding the Recommended Environmental Envelope . . . . . . . . . . 71
A.1 Dry-Bulb Temperature Limits . . . . . . . . . . . . . . . . . . . . . . . . . . 74
A.1.1 Low End . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
A.1.2 High End. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
A.2 Moisture Limits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
A.2.1 High End. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
A.2.2 Low End . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
A.3 Acoustical Noise Levels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
A.4 Data Center Operation Scenarios for the
Recommended Environmental Limits . . . . . . . . . . . . . . . . . . . . 81

Appendix B—2021 Air-Cooled Equipment Thermal Guidelines (I-P) . . . . . . . . . . 83

Appendix C—Detailed Flowchart for the Use and Application of the ASHRAE Data Center Classes . . . . . . . . . . 89
C.1 Notes for Figures. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
C.2 Nomenclature for Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89

Appendix D—ESD Research and Static Control Measures. . . . . . . . 95


D.1 ESD Background. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
D.2 ESD Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
D.3 Personnel and Operational Issues . . . . . . . . . . . . . . . . . . . . . 102
D.4 Flooring Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
D.4.1 Measuring Floor Resistance. . . . . . . . . . . . . . . . . . . . . 103
D.5 Further Reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103

Appendix E—Research on the Effect of RH and Gaseous Pollutants on ITE Reliability . . . . . . . . . . 105
E.1 Conclusions from the Research . . . . . . . . . . . . . . . . . . . . . . . 108

Appendix F—Psychrometric Charts . . . . . . . . . . . . . . . . . . . . . . . . . 111

Appendix G—Altitude Derating Curves . . . . . . . . . . . . . . . . . . . . . . 117

Appendix H—Practical Example of the Impact of Compressorless Cooling on Hardware Failure Rates . . . . . . . . . . 119
viii Contents

Appendix I—ITE Reliability Data for Selected Major U.S. and Global Cities . . . . . . . . . . 123
I.1 Notes on Figures and Tables . . . . . . . . . . . . . . . . . . . . . . . . . .124

Appendix J—OSHA and Personnel Working in High Air Temperatures . . . . . . . . . . 139

Appendix K—Allowable Server Inlet Temperature Rate of Change . . . . . . . . . . 143

Appendix L—Allowable Server Inlet RH Limits versus Maximum Inlet Dry-Bulb Temperature . . . . . . . . . . 147

References and Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .154

Thermal Guidelines for Data Processing Environments, Fifth Edition, is accompanied by supplemental online content, which can be found at www.ashrae.org/datacom1_5th.



Preface to the Fifth Edition
Prior to the 2004 publication of the first edition of Thermal Guidelines for Data
Processing Environments, there was no single source in the data center industry for
information technology equipment (ITE) temperature and humidity requirements.
This book established groundbreaking common design points endorsed by the major
information technology original equipment manufacturers (IT OEMs). The second
edition, published in 2008, created a new precedent by expanding the recommended
temperature and humidity ranges.
The third edition (2012) broke new ground through the addition of new data
center environmental classes that enable near-full-time use of free-cooling tech-
niques in most of the world’s climates. This exciting development also brought
increased complexity and trade-offs that required more careful evaluation in their
application due to the potential impact on the ITE to be supported.
The fourth edition (2015b) took further steps to increase the energy efficiency
of data centers by reducing the requirements for humidification. ASHRAE funded
the Electromagnetic Compatibility (EMC) Laboratory at the Missouri University of
Science and Technology from 2011 to 2014 to investigate the risk of upsets or
damage to electronics related to electrostatic discharge (ESD). The concerns raised
prior to the study regarding the increase in ESD-induced risk with reduced humidity
were not justified (Pommerenke et al. 2014).
This fifth edition of Thermal Guidelines is primarily focused on two major
changes—one is a result of the ASHRAE-funded research project RP-1755 (Zhang
et al. 2019a) on the effects of high relative humidity (RH) and gaseous pollutants on
corrosion of ITE, and the second is the addition of a new environmental class for
high-density equipment. ASHRAE funded the Syracuse University Mechanical and
Aerospace Engineering Department from 2015 to 2018 to investigate the risk of oper-
ating data centers at higher levels of moisture when high levels of gaseous pollutants
exist. The objective was to evaluate the ability to increase the recommended moisture
level in support of reducing energy required by data centers. The changes made to the
recommended envelope based on this research study are shown in Chapter 2, with the
details for the basis of these changes reported in Appendix E. A new environmental
class for high-density server equipment has also been added to accommodate high-
performance equipment that cannot meet the requirements of the current environ-
mental classes A1 through A4. The fifth edition also changes the naming of the liquid
cooling classes to represent maximum facility water temperatures.
A cornerstone idea carried over from previous editions of Thermal Guide-
lines is that inlet temperature is the only temperature that matters to ITE.
x Preface to the Fifth Edition

Although there are reasons to want to consider the impact of equipment outlet
temperature on the hot aisle, it does not impact the reliability or performance of
the ITE. Also, each manufacturer balances design and performance require-
ments when determining their equipment design temperature rise. Data center
operators should expect to understand the equipment inlet temperature distribu-
tion throughout their data centers and take steps to monitor these conditions. A
facility designed to maximize efficiency by aggressively applying new operating
ranges and techniques will require a complex, multivariable optimization
performed by an experienced data center architect.
Although the vast majority of data centers are air cooled at the IT load, liquid
cooling is becoming more commonplace and likely will be adopted to a greater
extent due to its enhanced operational efficiency, potential for increased density, and
opportunity for heat recovery. Consequently, the fourth and fifth editions of Thermal
Guidelines for Data Processing Environments include definitions of liquid-cooled
environmental classes and descriptions of their applications. Even a primarily
liquid-cooled data center may have air-cooled IT within. As a result, a combination
of air-cooled and liquid-cooled classes will typically be specified for a given data
center.



Acknowledgments
ASHRAE Technical Committee (TC) 9.9 would like to thank the following
members of the IT subcommittee for their groundbreaking work and willingness to
share in order to further the understanding of the entire data center industry and for
their active participation, including conference calls, writing/editing, and reviews:
Dustin Demetriou (IBM), Dave Moss (Dell), Mark Steinke (AMD), Roger Schmidt
(IBM, retired), and Robin Steinbrecher (Intel, retired). Thanks also to Roger
Schmidt for leading the effort on updating this fifth edition.
A special thanks is due to Syracuse University Mechanical and Aerospace Engi-
neering Department and the leadership of Professor Jianshun Zhang and his team,
including PhD student Rui Zhang, for carrying out the research to investigate the
effect of high humidity and gaseous pollutants on information technology equip-
ment (ITE). The result of this work was the primary reason for this fifth edition.
ASHRAE TC 9.9 also wishes to thank the following people for their construc-
tive comments on the draft of this edition: Jason Matteson (Isotope), Jon Fitch
(Midas Green Technologies), John Gross (J. M. Gross Engineering, LLC), Dave
Kelley (Vertiv, retired), Ecton English, Gerardo Alfonso (Ingeal), and Vali Sorell
(Microsoft).
Finally, special thanks to Neil Chauhan of DLB Associates for creating a consis-
tent set of graphics for this updated edition.
1



Introduction
Over the years, the power density of electronic equipment has steadily
increased. In addition, the mission-critical nature of computing has sensitized busi-
nesses to the health of their data centers. The combination of these effects makes it
obvious that better alignment is needed between equipment manufacturers and facil-
ity operations personnel to ensure proper and fault-tolerant operation within data
centers.
This need was recognized by an industry consortium in 1999 that began a grass-
roots effort to provide a power density road map and to work toward standardizing
power and cooling of the equipment for seamless integration into a data center. The
Industry Thermal Management Consortium produced the first projection of heat
density trends. The IT Subcommittee of ASHRAE Technical Committee (TC) 9.9
is the successor of that industry consortium. An updated set of power trend charts
was published in IT Equipment Power Trends, Third Edition (ASHRAE 2018b).
These updated equipment power trends extend to 2025.
The objective of Thermal Guidelines for Data Processing Environments, Fifth
Edition, is to do the following:

• Provide standardized operating environments for equipment


• Provide and define a common environmental interface for the equipment and
its surroundings
• Provide guidance on how to evaluate and test the operational health of a data
center
• Provide a methodology for reporting the environmental characteristics of a
computer system
• Guide data center owners and operators in making changes in the data center
environment
• Provide the basis for measuring the effect of any changes intended to save
energy in data centers

This book provides equipment manufacturers and facility operations personnel with a common set of guidelines for environmental conditions. It is important to
recognize that the ASHRAE TC 9.9 IT Subcommittee is made up of subject matter
experts from the major information technology equipment (ITE) manufacturers. It
is the intent of ASHRAE TC 9.9 to update this book regularly.
Unless otherwise stated, the thermal guidelines in this document refer to data
center and other data-processing environments. Telecom central offices are
discussed in detail in the European Telecommunications Standards Institute (ETSI)
standard ETSI EN 300 019-1-3 (2014), which is referenced when there is a compar-
ison between data centers and telecom rooms. It is important to show the comparison
where some convergence in these environments may occur in the future.

1.1 BOOK FLOW


Following this introductory chapter, this book continues as follows:

• Chapter 2, “Environmental Guidelines for Air-Cooled Equipment,” provides
  • descriptions of the A1-A4 environmental classes and a new H1 high-density server environmental class,
  • temperature and humidity conditions that ITE must meet for all classes,
  • the recommended operating environment for all of the ITE classes,
  • the opportunity for facility operators to plan excursions into the allowable range or modify the recommended operating envelope based on details provided in this book on the effect of data center environments on server operation and reliability, and
  • the effect of altitude on each data center class.
• Chapter 3, “Environmental Guidelines for Liquid-Cooled Equipment,”
provides information on five environmental classes for supply water tempera-
ture and other characteristics.
• Chapter 4, “Facility Temperature and Humidity Measurement,” provides
a recommended procedure for measuring temperature and humidity in a data
center. Different protocols are described depending on whether the purpose of
the measurement is to perform an audit on the data center, an equipment
installation verification test, or an equipment troubleshooting test.
• Chapter 5, “Equipment Placement and Airflow Patterns,” examines rec-
ommended airflow protocols, hot-aisle/cold-aisle configurations, and recom-
mended equipment placement.
• Chapter 6, “Equipment Manufacturers’ Heat and Airflow Reporting,”
provides manufacturers with a methodology for reporting sufficient dimen-
sional, heat load, and airflow data to allow a data center to be adequately
designed to meet equipment requirements but not overdesigned, as might be
the case if nameplate equipment ratings were used to estimate heat loads.
• Appendix A, “2021 ASHRAE Environmental Guidelines for ITE—
Expanding the Recommended Environmental Envelope,” describes some
of the methodology used in determining the recommended envelope and also
some scenarios for how the recommended and allowable envelopes can be
applied in an operational data center.
• Appendix B, “2021 Air-Cooled Equipment Thermal Guidelines (I-P),” shows
the new air-cooled equipment classes in I-P units.
• Appendix C, “Detailed Flowchart for the Use and Application of the
ASHRAE Data Center Classes,” provides, in detail, guidance for data center
Thermal Guidelines for Data Processing Environments, Fifth Edition 3

operators to achieve data center operation within a specific environmental envelope.
• Appendix D, “ESD Research and Static Control Measures,” discusses the
need for minimum humidity levels and basic electrostatic discharge (ESD)
protection protocols in data centers.
• Appendix E, “Research on the Effect of RH and Gaseous Pollutants on
ITE Reliability,” discusses the research that provides an expanded recom-
mended environmental envelope for increased data center energy savings.
• Appendix F, “Psychrometric Charts,” shows various psychrometric charts
for the air-cooled classes in different units.
• Appendix G, “Altitude Derating Curves,” shows the envelopes of tempera-
ture and elevation for Classes A1 through A4 and H1.
• Appendix H, “Practical Example of the Impact of Compressorless Cool-
ing on Hardware Failure Rates,” uses a hypothetical data center implemen-
tation in the city of Chicago to guide the reader through assessing the impact
of a compressorless cooling design on hardware failure rates using the infor-
mation in this book.
• Appendix I, “ITE Reliability Data for Selected Major U.S. and Global
Cities,” uses ASHRAE’s Weather Data Viewer software (2009b) and the rela-
tive hardware failure rate information in this book to provide localized metrics
on net hardware failure rates and annual hours per year of compressorized
cooling needed in selected major U.S. and global cities.
• Appendix J, “OSHA and Personnel Working in High Air Temperatures,”
provides some information and guidance on personnel working in high-
temperature environments.
• Appendix K, “Allowable Server Inlet Temperature Rate of Change,” con-
tains background information that explains the change to the temperature rate
of change specification that was made in the fourth edition of the book
(ASHRAE 2015b). Examples are provided to illustrate temperature changes
that are and are not acceptable for the new specification.
• Appendix L, “Allowable Server Inlet RH Limits versus Maximum Inlet
Dry-Bulb Temperature,” contains x-y climatogram plots to illustrate how
the application of the dew-point limits in ASHRAE specifications can restrict
relative humidity values at high and low temperatures.
• References and Bibliography provides references as cited throughout this
book as well as sources for additional information.
• The Reference Card provides helpful, easy-to-access information for facility
managers and others. This card can be found in the front pocket of the book
and as a downloadable Adobe® Acrobat® PDF at ashrae.org/datacom1_5th. If
the files or information at the link are not accessible, please contact the pub-
lisher.

1.2 PRIMARY USERS OF THIS BOOK


Primary users of this book are those involved in the design, construction,
commissioning, operation, implementation, and maintenance of equipment rooms.
Others who may benefit from this book are those involved in the development and
design of electronic equipment. Specific examples of the book’s intended audience
include the following:

• Computer equipment manufacturers—research and development, marketing, and sales organizations
• Infrastructure equipment manufacturers—cooling and power
• Consultants
• General and trade contractors
• Equipment operators, IT departments, facilities engineers, and chief informa-
tion officers

1.3 ADOPTION
It is the hope of ASHRAE TC 9.9 that many equipment manufacturers and facil-
ities managers will follow the guidance provided in this book. Data center facilities
managers can be confident that these guidelines have been produced by IT manu-
facturers.
Manufacturers can self-certify that specific models of equipment operate as
intended in data processing air-cooling environmental classes A1, A2, A3, A4, and
H1 and the liquid-cooling environmental classes W17 through W+.

1.4 DEFINITIONS
air:
conditioned air: air treated to control its temperature, relative humidity, purity,
pressure, and movement.
supply air: air entering a space from an air-conditioning, heating, or ventilating
apparatus.
annual failure rate (AFR): average number of failures per year.
availability: a percentage value representing the degree to which a system or compo-
nent is operational and accessible when required for use.
basic input/output system (BIOS): set of computer instructions in firmware that
control input and output operations.
cabinet: frame for housing electronic equipment that is enclosed by doors and is
stand-alone; this is generally found with high-end servers.
computer room: a room or portions of a building serving an ITE load less than or equal to 10 kW or 215 W/m² (20 W/ft²) of conditioned floor area.

coolant distribution unit (CDU): 1) creates an isolated secondary loop, separate from the chilled-water supply (building chilled water, dedicated chiller, etc.),
enabling strict containment and precise control of the liquid cooling system for the
ITE and 2) maintains the supply temperature of the liquid cooling loop for the ITE
above the dew point of the data center, preventing condensation and ensuring 100%
sensible cooling.
data center: a room or building, or portions thereof, including computer rooms
served by data center systems, serving a total ITE load greater than 10 kW or 215 W/m² (20 W/ft²) of conditioned floor area.
dew point (DP): the atmospheric temperature (varying according to pressure and
humidity) below which water droplets begin to condense and dew can form.
electrostatic discharge (ESD): the sudden flow of electricity between two electri-
cally charged objects caused by contact, an electrical short, or dielectric breakdown.
equipment: refers but is not limited to servers, storage products, workstations,
personal computers, and transportable computers; may also be referred to as elec-
tronic equipment or ITE.
equipment room: data center or telecom central office room that houses computer
and/or telecom equipment; for rooms housing mostly telecom equipment, see
Telcordia GR-3028-CORE (2001).
framework: structural portion of a frame.
heat:
latent heat: change of enthalpy during a change of state.
sensible heat: heat that causes a change in temperature.
total heat (enthalpy): a thermodynamic quantity equal to the internal energy of a system plus the product of its pressure and volume:
h = U + pv
where
h = enthalpy or total heat content
U = internal energy of the system
p = pressure
v = volume
For the purposes of this document, h = sensible heat + latent heat.
high-performance computing (HPC): most generally refers to the practice of
aggregating computing power in a way that delivers much higher performance than
is possible from a typical desktop computer or workstation in order to solve large
problems in science, engineering, or business.

humidity:
absolute humidity: the mass of water vapor in a specific volume of a mixture
of water vapor and dry air.
humidity ratio: the ratio of the mass of water vapor to the mass of dry air in a moist air sample; it is usually expressed as grams of water per kilogram of dry air (gw/kgda) or as pounds of water per pound of dry air (lbw/lbda).
relative humidity (RH):
a. Ratio of the partial pressure or density of water vapor to the saturation pres-
sure or density, respectively, at the same dry-bulb temperature and baro-
metric pressure of the ambient air.
b. Ratio of the mole fraction of water vapor to the mole fraction of water
vapor saturated at the same temperature and barometric pressure; at
100% rh, the dry-bulb, wet-bulb, and dew-point temperatures are equal.
information technology (IT): the study or use of systems (especially computers and
telecommunications) for storing, retrieving, and sending information.
information technology equipment (ITE): devices or systems that use digital tech-
niques for purposes such as data processing and computation.
information technology original equipment manufacturer (IT OEM): tradition-
ally, a company whose goods are used as components in the products of another
company, which then sells the finished item to users.
IT space: a space dedicated primarily to computers and servers but with environ-
mental and support requirements typically less stringent than those of a data center.
liquid cooled: cases where liquid must be circulated to and from the electronics
within the ITE for cooling with no other form of heat transfer.
mean time between failures (MTBF): the average time between system breakdowns.
power:
measured power: the heat release in watts, as defined in Chapter 6, Section 6.1,
“Providing Heat Release and Airflow Values.”
nameplate rating: term used for rating according to nameplate (IEC 60950-1,
under clause 1.7.1: “Equipment shall be provided with a power rating marking,
the purpose of which is to specify a supply of correct voltage and frequency, and
of adequate current-carrying capacity” [IEC 2005]).
rated current: “The input current of the equipment as declared by the manu-
facturer” (IEC 2005); the rated current is the absolute maximum current that is
required by the unit from an electrical branch circuit.
rated frequency: the supply frequency as declared by the manufacturer.

rated frequency range: the supply frequency range as declared by the manu-
facturer, expressed by its lower- and upper-rated frequencies.
rated voltage: the supply voltage as declared by the manufacturer.
rated voltage range: the supply voltage range as declared by the manufacturer.
power usage effectiveness (PUETM): the ratio of total amount of energy used by a
computer data center facility to the energy delivered to the computer equipment. See
PUETM: A Comprehensive Examination of the Metric (ASHRAE 2014c) for more
information.
printed circuit board (PCB): an electronic circuit consisting of thin strips of a
conducting material such as copper that have been etched from a layer fixed to a flat
insulating sheet and to which integrated circuits and other components are attached.
rack: frame for housing electronic equipment.
rack-mounted equipment: equipment that is to be mounted in an Electronic Industry
Alliance (EIA) or similar cabinet; these systems are generally specified in EIA units,
such as 1U, 2U, 3U, where 1U = 44 mm (1.75 in.).
reliability: percentage value representing the probability that a piece of equipment
or system will be operable throughout its mission duration; values of 99.9% (“three
nines”) and higher are common in data and communications equipment areas. For
individual components, reliability is often determined through testing; for assem-
blies and systems, reliability is often the result of a mathematical evaluation based
on the reliability of individual components and any redundancy or diversity that may
be used.
room load capacity: the point at which the equipment heat load in the room no longer
allows the equipment to run within the specified temperature requirements of the
equipment; Chapter 4 defines where these temperatures are measured. The load
capacity is influenced by many factors, the primary factor being the room theoretical
capacity; other factors, such as the layout of the room and load distribution, also
influence the room load capacity.
room theoretical capacity: the capacity of the room based on the mechanical room
equipment capacity; this is the sensible capacity in kilowatts (tons) of the mechan-
ical room for supporting the computer or telecom room heat loads.
stock keeping unit (SKU): a number assigned to one specific product available for sale. If a hardware device or software package comes in different versions, there is a SKU for each one.
temperature:
dew-point temperature: the temperature at which water vapor has reached the
saturation point (100% rh).
dry-bulb temperature: the temperature of air indicated by a thermometer.

wet-bulb temperature: the temperature indicated by a psychrometer when the bulb of one thermometer is covered with a water-saturated wick over which air
is caused to flow at approximately 4.5 m/s (900 ft/min) to reach an equilibrium
temperature of water evaporating into air, where the heat of vaporization is
supplied by the sensible heat of the air.
thermal design power (TDP): the maximum amount of heat generated by a
computer chip or component (often a CPU, GPU, or system on a chip) that the cool-
ing system in a computer is designed to dissipate under any workload. Sometimes
called thermal design point.
Threshold Limit Values (TLVs®): American Conference of Governmental
Industrial Hygienists (ACGIH) guidelines for work in extreme heat or in hot envi-
ronments that consist of work-rest (WR) allocations designed to ensure a stable core
temperature that does not exceed 38°C (100.4°F) (ACGIH 2017). See Appendix J.
total cost of ownership (TCO): the purchase price of an asset plus the costs of oper-
ation. Assessing the TCO represents taking a bigger-picture look at what the product
is and what its value is over time.
ventilation: the process of supplying or removing air by natural or mechanical
means to or from any space; such air may or may not have been conditioned.
wet-bulb globe temperature (WBGT): a measure of the heat stress in direct sunlight,
which takes into account temperature, humidity, wind speed, sun angle, and cloud
cover (solar radiation). See Appendix J for more information.
x-factor: a dimensionless metric that measures the relative hardware failure rate at
a given constant equipment inlet dry-bulb temperature when compared to a baseline
of the average hardware failure rate at a constant equipment inlet dry-bulb tempera-
ture of 20°C (68°F). See Chapter 2, Section 2.4.3, for a table of x-factor values.
x-factor, time-weighted (or net): a dimensionless metric indicating a statistical
equipment failure rate over a defined range of environmental temperatures when
compared to a constant baseline temperature of 20°C (68°F); it is calculated by
summing individual time-at-temperature bins multiplied by their associated x-factor.
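
The time-weighted x-factor calculation described above lends itself to a short worked example. The sketch below is a reading aid only and is not part of the guideline; the function name is hypothetical, and the x-factor values and temperature bins are placeholders standing in for the values tabulated in Section 2.4.3.

```python
# Illustrative sketch of a time-weighted (net) x-factor calculation.
# The x-factor values below are placeholders; use the table in
# Section 2.4.3 of this book for actual values.

def time_weighted_x_factor(bins):
    """bins: list of (hours_at_temperature, x_factor_for_that_temperature)."""
    total_hours = sum(hours for hours, _ in bins)
    if total_hours == 0:
        raise ValueError("no operating hours supplied")
    # Weight each bin's x-factor by its share of annual operating hours.
    return sum(hours * xf for hours, xf in bins) / total_hours

# Example: a year split across three inlet-temperature bins
# (hours, relative failure rate vs. a constant 20°C baseline).
annual_bins = [
    (6000, 1.00),   # hours near 20°C
    (2000, 1.13),   # hours in a warmer bin (placeholder x-factor)
    (760,  1.24),   # hours in the warmest bin (placeholder x-factor)
]

print(f"Net x-factor: {time_weighted_x_factor(annual_bins):.3f}")
```

A net value above 1.0 indicates a statistically higher expected hardware failure rate than continuous operation at 20°C (68°F); a value below 1.0 indicates a lower one.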
2


Environmental Guidelines for 
Air-Cooled Equipment
Chapters 2 and 3 summarize data center environmental guidelines developed
by members of the ASHRAE TC 9.9 committee representing information technol-
ogy equipment (ITE) manufacturers. These environmental guidelines are for terres-
trial-based systems and do not cover electronic systems designed for aircraft or
spacecraft applications. In this book the term server is used to generically describe
any ITE, such as servers, storage, and network products, used in data-center-like
applications.

2.1 BACKGROUND
TC 9.9 created the original publication Thermal Guidelines for Data Processing
Environments in 2004 (ASHRAE 2004). At the time, the most important goal was
to create a common set of environmental guidelines that ITE would be designed to
meet. Although computing efficiency was important, performance and availability
took precedence. Temperature and humidity limits were set accordingly. In the first
decade of the twenty-first century, increased emphasis has been placed on comput-
ing efficiency. Power usage effectiveness (PUETM) has become the new metric by
which to measure the effect of design and operation on data center efficiency
(ASHRAE 2014c). To improve PUE, free-cooling techniques, such as air- and water-
side economization, have become more commonplace with a push to use them year
round. To enable improved PUE capability, TC 9.9 created additional environmental
classes, along with guidance on the use of the existing and new classes. Expanding
the capability of ITE to meet wider environmental requirements can change the
equipment’s reliability, power consumption, and performance capabilities; this fifth
edition of the book provides information on how these capabilities are affected.
In the second edition of Thermal Guidelines (ASHRAE 2008), the recom-
mended envelope was expanded along with guidance for data center operators on
maintaining high reliability and also operating their data centers in the most energy-
efficient manner. This expanded envelope was created for general use across all
types of businesses and conditions. However, different environmental envelopes
may be more appropriate for different business values and climate conditions.
Therefore, to allow for the potential to operate a data center in a different envelope
that might provide even greater energy savings, the third edition provided general
guidance on server metrics that assisted data center operators in creating an operat-
ing envelope that matched their business values. Each of these metrics is described
in this book. Using these guidelines, the user should be able to determine what envi-
ronmental conditions best meet their technical and business needs. Any choice
outside of the recommended region would be a balance between the additional energy savings of the cooling system and the deleterious effects that may be created
on total cost of ownership (TCO) (total site energy use, reliability, acoustics, or
performance). A simple representation of this process is shown in Figure 2.1 for
those who decide to create their own envelope rather than use the recommended
envelope for operation of their data centers.
A flowchart was also added in the second edition to help guide the user through
the appropriate evaluation steps. Many of these metrics center around simple graphs
that describe the trends. However, the use of these metrics was intended for those
who plan to go beyond the recommended envelope for additional energy savings.
Their use would require significant additional analysis to understand the TCO
impact of operating beyond the recommended envelope.
In the third edition of Thermal Guidelines (ASHRAE 2011), two new classes
(A3 and A4) were added to accommodate different applications and priorities of ITE
operation. Each data center operator is forced to operate in a specific environment
based on the classes of equipment installed and the operator’s own criteria (e.g.,
TCO, reliability, performance).
In the fourth edition of Thermal Guidelines (ASHRAE 2015b), more enhance-
ments to the ITE classes were made to meet data center energy-efficiency improve-
ment requirements. These enhancements were based on electrostatic discharge
(ESD) research funded by ASHRAE (Pommerenke et al. 2014). The details of this
research are reported in Appendix D.

Figure 2.1 Server metrics for determining data center operating environment envelope.

In this fifth edition of the book, more enhancements to the recommended enve-
lope were made to aid in data center energy improvements. While the fourth edition
focused on modifying the recommended envelope based on low-humidity research,
the changes to this fifth edition are primarily a result of the ASHRAE-funded
research project RP-1755 (Zhang et al. 2019a) on the effects of high relative humidity
(RH) and gaseous pollutants on corrosion of ITE. ASHRAE funded the Syracuse
University Mechanical and Aerospace Engineering Department from 2015 to 2018
to investigate the risk of operating data centers at higher levels of moisture when high
levels of gaseous pollutants exist (Zhang et al. 2019). The objective was to evaluate
the ability of increasing the recommended moisture level in support of reducing the
energy required by data centers. Five gaseous pollutants were tested under a variety
of temperature and RH conditions—three pollutants that are pervasive throughout the
planet (SO2, NO2, and O3) and two catalyst pollutants (H2S and Cl2). Pollutant levels
tested were at or near the maximum common concentration levels existing around the
world. The changes made to the recommended envelope based on this research are
summarized in this chapter, and Appendix E provides more insight into why the
changes were made to the recommended envelope based on the research results.

2.2 NEW AIR-COOLED EQUIPMENT ENVIRONMENTAL SPECIFICATIONS
This chapter focuses primarily on the latest environmental specifications, with
Appendix A providing additional information on the recommended environmental
envelope. Before the latest specifications are described, several key definitions need
to be highlighted:

recommended environmental range: Facilities should be designed to achieve, under normal circumstances, ambient conditions that fall within the recommended
range. This recommended range may be as defined either in Table 2.1 or by the
process outlined later in this chapter whereby the user can apply the metrics in
Figure 2.1 (described in more detail in Section 2.4) to define a different recom-
mended range more appropriate to meet specific business objectives. The recom-
mended envelope was chosen based on a number of inputs, the primary being
reliability of ITE, power increases of ITE with higher ambient temperatures, acous-
tical impacts with higher ambient temperatures, and providing a buffer for excur-
sions to the allowable limits caused by facility cooling fails. These events are
discussed in more detail throughout the remainder of this chapter.

allowable environmental envelope: The allowable envelope is where IT manufacturers test their equipment to verify full operation and that it will function within
those environmental boundaries. To enable the greatest latitude in use of all the
classes, power and thermal management features may be triggered within the allow-
able range to ensure there are no thermal excursions outside the capability of the ITE
under extreme load conditions. Typically, IT manufacturers perform a number of
tests prior to the announcement of the product to verify that it meets all the functional
requirements within the environmental envelope. This is not a statement of reliabil-
ity but one of the functionality of the ITE. In addition to the allowable dry-bulb
temperature and relative humidity (RH) ranges, the maximum dew point (DP) and
maximum elevation values are part of the allowable operating environment defini-
tions. The IT purchaser must consult with the equipment manufacturer to understand
the performance capabilities of the ITE at the extreme upper limits of the allowable
thermal envelopes.

practical application: Prolonged exposure of operating equipment to conditions outside its recommended range, especially approaching the extremes of the allowable
operating environment, can result in decreased equipment reliability and longevity
(server reliability values versus inlet air temperatures are provided in Section 2.4.3
to provide some guidance on operating outside the recommended range). Operating
equipment at conditions outside its allowable operating environment risks cata-
strophic equipment failure. With equipment at high power density, it may be difficult
to maintain air entering the equipment within the recommended range, particularly
over the entire face of the equipment. In these situations, reasonable efforts should
be made to achieve conditions within the recommended range. If these efforts prove
unsuccessful, operation outside the recommended range but within the allowable
environment is likely to be adequate, but facility operators may wish to consult with
the equipment manufacturers regarding the risks involved. More information on
operating at high RH levels, in some cases outside the recommended levels, is
provided later in this section. This information is based on the recent research on high
RH levels combined with high levels of pollutants (Zhang et al. 2019).

To restate these important concepts in different words: in general, ITE manufacturers consider their equipment warrantied for operation within the allowable
envelope without any time limit imposed on operation at any temperature and
humidity value within that envelope. However, for long-term reliability, IT manu-
facturers recommend that the equipment be maintained within the recommended
envelope for most of the time. An estimate of the impact of operating a data center
outside the recommended envelope can be made with the use of the server failure rate
x-factor described in Section 2.4.3.
ASHRAE funded the Electromagnetic Compatibility (EMC) Laboratory at the
Missouri University of Science and Technology from 2011 to 2014 to investigate the
risk of upsets or damage to electronics related to electrostatic discharge (ESD).
Emphasis was placed on the increase in risk with reduced humidity. The results from
this study (Pommerenke et al. 2014) show that a data center with a low incident rate
of ESD-induced damage operating at 25% rh will maintain a low incident rate if the
humidity is reduced to 8%. The concerns regarding the increase in ESD-induced risk
with reduced humidity raised prior to the study were found not to be justified. A stan-
dard set of ESD mitigation procedures will ensure a very low ESD incident rate at
humidity levels tested down to 8% rh. As a result of this study, the ASHRAE envi-
ronmental classes were expanded to realize potential energy savings in data centers
by not requiring humidification at low moisture levels.
The previous ESD research was focused on low levels of moisture; this fifth
edition of Thermal Guidelines presents results of research on operating a data center
in an environment with high moisture levels and gaseous pollutants. The following
notes detail changes to the recommended envelope that were made with the
intent of maintaining high reliability of the ITE. These notes are critical to using this
fifth edition of Thermal Guidelines for Data Processing Environments.

1. To gain the full advantage of the results of current research (Zhang et al. 2019),
data center operators should use silver and copper coupons inside their data
centers at least twice a year (once in the winter and once in the summer) to
detect the level of corrosion in the environment. See Particulate and Gaseous
Contamination in Datacom Environments (ASHRAE 2014b) for more details
on these measurements.
2. For data center environments tested with silver and copper coupons that are
shown to have corrosion levels less than 300 Å/month for copper and 200 Å/
month for silver, suggesting that only the pervasive pollutants (SO2, NO2, and
O3) may be present, the recommended moisture limit has been raised from 60%
rh to 70% rh. The upper moisture limit is now 70% rh or 15°C (59°F) DP,
whichever is the minimum moisture content. 
The data also showed that increasing the recommended temperature from
27°C to 28°C (80.6°F to 82.4°F) would be acceptable from a reliability stand-
point (Zhang et al. 2019). However, because IT manufacturers typically start
increasing airflow through servers around 25°C (77°F) to offset the higher
ambient temperature, this increased air-moving device power draw did not
warrant changing the recommended upper temperature limit.
In addition, the data showed that increasing the dew point from 15°C to 17°C
(59°F to 62.6°F) would be acceptable from a reliability standpoint. However, this
change would put the recommended upper moisture limit coincident with the
upper moisture limit of the allowable envelope of Class A1. For those data centers
that operate to the Class A1 environment, it was decided to maintain the buffer of
2°C (3.6°F) between the recommended and allowable envelopes and to maintain
the recommended envelope the same for all air-cooling classes (A1 through A4).
3. For data center environments tested with silver and copper coupons that are
shown to have levels of corrosion greater than 300 Å/month for copper and
200 Å/month for silver, suggesting that Cl2 and/or H2S (or other corrosive cata-
lysts) may be present, the recommended moisture levels should be kept
below 50% rh. The upper moisture limit is 50% rh or 15°C (59°F) DP, which-
ever is the minimum moisture content. Chemical filtration should be considered
in these situations.
4. If coupon measurements are not performed to aid in understanding the possible corrosion impact on ITE, the data center operator should consider maintaining a lower humidity level to protect the ITE, either below 60% as specified in the fourth edition of this book or below 50% as specified in note 3 above. (A sketch of this decision logic follows this list.)
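
The sketch below summarizes the coupon-based decision logic in notes 1 through 4. It is a reading aid rather than an ASHRAE-supplied procedure: the function names are hypothetical, and the Magnus coefficients used to convert the 15°C (59°F) dew-point limit into an equivalent relative humidity are a common approximation, not a formulation taken from this book.

```python
import math

def recommended_rh_cap(copper_angstrom_per_month, silver_angstrom_per_month):
    """RH cap from coupon results, per notes 2-4 above.

    Corrosion below 300 Å/month (copper) and 200 Å/month (silver) suggests
    only pervasive pollutants are present: cap is 70% rh. Otherwise catalyst
    pollutants (e.g., Cl2, H2S) are likely: cap is 50% rh, and chemical
    filtration should be considered.
    """
    if copper_angstrom_per_month < 300 and silver_angstrom_per_month < 200:
        return 70.0
    return 50.0

def saturation_vapor_pressure_hpa(t_celsius):
    # Magnus approximation over water (an assumption, adequate for this sketch).
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def upper_moisture_limit_rh(dry_bulb_c, rh_cap_percent, dew_point_limit_c=15.0):
    """Binding upper moisture limit at a given dry-bulb temperature.

    The limit is whichever of the RH cap or the dew-point limit gives the
    lower moisture content ("whichever is the minimum moisture content").
    """
    rh_at_dp_limit = 100.0 * (saturation_vapor_pressure_hpa(dew_point_limit_c)
                              / saturation_vapor_pressure_hpa(dry_bulb_c))
    return min(rh_cap_percent, rh_at_dp_limit)

# Example: clean coupons, 27°C inlet air -> the 15°C dew point binds (about 48% rh);
# at 18°C inlet air the 70% rh cap binds instead.
cap = recommended_rh_cap(copper_angstrom_per_month=120, silver_angstrom_per_month=80)
for t in (18.0, 27.0):
    print(f"{t:>4.1f}°C: upper limit ≈ {upper_moisture_limit_rh(t, cap):.0f}% rh")
```

The Magnus constants (6.112 hPa, 17.62, 243.12°C) are one common parameterization; any psychrometric routine consistent with ASHRAE fundamentals could be substituted.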

The environmental envelopes, updated based on the study of the effect of RH and gaseous pollutants on the corrosion of copper and silver (Zhang et al. 2019), are
shown in Figures 2.2 and 2.3. Table 2.1 displays the specific values that went into
creating these figures.

Figure 2.2 2021 recommended and allowable envelopes for Classes A1, A2, A3, and A4. The recommended envelope is for low levels of pollutants verified by coupon measurements as indicated in note 2 of Section 2.2.

Figure 2.3 2021 recommended and allowable envelopes for Classes A1,
A2, A3 and A4. The recommended envelope is for high levels
of pollutants verified by coupon measurements as indicated in
note 3 of Section 2.2.

Table 2.1 2021 Thermal Guidelines for Air Cooling—SI Version (I-P Version in Appendix B)

Equipment Environment Specifications for Air Cooling
(Columns 2 through 6 apply to product operation^(b,c); the last two columns apply to product power off^(c,d).)

| Class^a | Dry-Bulb Temp.^(e,g), °C | Humidity Range, Noncond.^(h,i,k,l,n) | Max. Dew Point^k, °C | Max. Elev.^(e,j,m), m | Max. Rate of Change^f, °C/h | Power-Off Dry-Bulb Temp., °C | Power-Off RH^k, % |
|---|---|---|---|---|---|---|---|
| Recommended* (suitable for Classes A1 to A4; explore data center metrics in this book for conditions outside this range) | | | | | | | |
| A1 to A4 | 18 to 27 | –9°C DP to 15°C DP and 70% rh^n or 50% rh^n | — | — | — | — | — |
| Allowable | | | | | | | |
| A1 | 15 to 32 | –12°C DP and 8% rh to 17°C DP and 80% rh^k | 17 | 3050 | 5/20 | 5 to 45 | 8 to 80^k |
| A2 | 10 to 35 | –12°C DP and 8% rh to 21°C DP and 80% rh^k | 21 | 3050 | 5/20 | 5 to 45 | 8 to 80^k |
| A3 | 5 to 40 | –12°C DP and 8% rh to 24°C DP and 85% rh^k | 24 | 3050 | 5/20 | 5 to 45 | 8 to 80^k |
| A4 | 5 to 45 | –12°C DP and 8% rh to 24°C DP and 90% rh^k | 24 | 3050 | 5/20 | 5 to 45 | 8 to 80^k |

* For potentially greater energy savings, refer to Appendix C for the process needed to account for multiple server metrics that impact overall TCO.

Notes for Table 2.1, 2021 Thermal Guidelines for Air Cooling—
SI Version (I-P Version in Appendix B)
a. Classes A3 and A4 are identical to those included in the 2011 version of the thermal guide-
lines (ASHRAE 2012). The 2015 version of the A1 and A2 classes (ASHRAE 2015b) has
expanded RH levels compared to the 2011 version.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to Class A1 as spec-
ified in 2008). Typical requirements: minimum temperature is 15°C, maximum temperature
is 32°C, minimum RH is 20%, maximum RH is 80%, maximum DP is 22°C, rate of change
of temperature is less than 5°C/h, rate of change of humidity is less than 5% rh per hour, and
no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. Classes A1 and A2—Derate maximum allowable dry-bulb temperature 1°C/300 m above 900
m. Above 2400 m altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature. Class A3—Derate maximum allowable dry-bulb temperature 1°C/175
m above 900 m. Class A4—Derate maximum allowable dry-bulb temperature 1°C/125 m
above 900 m.
f. For tape storage: 5°C in an hour. For all other ITE: 20°C in an hour and no more than 5°C in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 5°C and 20°C temperature change
is considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples; a brief illustrative sketch of this window calculation also follows these notes.
g. With a diskette in the drive, the minimum temperature is 10°C (not applicable to Classes A1
or A2).
h. The minimum humidity level for Classes A1, A2, A3, and A4 is the higher (more moisture)
of the –12°C dew point and the 8% rh. These intersect at approximately 25°C. Below this
intersection (~25°C) the dew point (–12°C) represents the minimum moisture level, while
above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-ESD floors and where personnel are allowed to wear non-ESD
shoes may need increased humidity given that the risk of generating 8 kV increases slightly
from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 3050 m.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for each of the classes for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 3050 m requires consultation with the IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.
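
The window calculation referenced in note f can be illustrated with a short sketch. It is a minimal sketch only; the sample readings and function below are hypothetical, and a real monitoring system would apply the same maximum-minus-minimum check to its own logged inlet temperatures.

    def max_change_in_window(samples, window_s):
        """Largest (T_max - T_min) over any window of width window_s seconds.

        samples: list of (time_s, temp_C) tuples, sorted by time.
        """
        worst = 0.0
        for t0, _ in samples:
            in_window = [temp for t, temp in samples if t0 <= t <= t0 + window_s]
            worst = max(worst, max(in_window) - min(in_window))
        return worst

    # Hypothetical inlet readings: (seconds, degrees C)
    readings = [(0, 22.0), (300, 23.5), (600, 25.0), (900, 26.0),
                (1800, 27.5), (2700, 28.0), (3600, 29.0)]

    ok_15min = max_change_in_window(readings, 15 * 60) <= 5.0   # 5 C in any 15 min
    ok_hour = max_change_in_window(readings, 60 * 60) <= 20.0   # 20 C in an hour
    print(ok_15min, ok_hour)
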

In addition to the impact of gaseous contamination, particulate contamination remains important. Data centers must be kept clean to ISO Standard 14644-1 Class 8
(ISO 2015). This level of cleanliness can generally be achieved by an appropriate
filtration scheme as recommended in Particulate and Gaseous Contamination in
Datacom Environments (ASHRAE 2014b). A summary of these recommendations
is included here:

• The room air should be continuously filtered with MERV 8 filters as recom-
mended by AHRI Standard 1360 (2017).
• Air entering a data center should be filtered with MERV 11 to MERV 13 fil-
ters.

All sources of dust inside data centers should be reduced. Every effort should
be made to filter out dust that has deliquescent relative humidity less than the maxi-
mum allowable relative humidity in the data center.

2.2.1 Environmental Class Definitions for Air-Cooled Equipment


ITE operating within a particular allowable envelope may trigger power and
thermal management features when the temperature extremes of the environmental
envelope are approached.

• Class A1: Typically a data center with tightly controlled environmental parameters (DP, temperature, and RH) and mission-critical operations; types
of products typically designed for this environment are enterprise servers and
storage products.
• Classes A2, A3, and A4: Typically an IT space with some control of environ-
mental parameters (DP, temperature, and RH); types of products typically
designed for this environment are volume servers, storage products, personal
computers, and workstations. Among these three classes, A2 has the narrow-
est temperature and moisture requirements and A4 has the widest environ-
mental requirements.

Note k was added to Table 2.1 to provide further clarification of the allowable
range of relative humidity. The humidity range noted in the table does not apply across the entire range of dry-bulb temperatures specified in the table (this can clearly be seen in the psychrometric charts shown in Figures 2.2 and 2.3). As an example, the range of humidity
for Class A3 is shown in Figure 2.4. Additional clarification for the other classes is
provided in Appendix L.
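
As a rough numerical companion to Figure 2.4, the sketch below applies a Magnus-type saturation-pressure approximation to the Class A3 allowable limits (–12°C to 24°C DP, 8% to 85% rh) to show the effective RH range at a chosen dry-bulb temperature. It is illustrative only; the constants are a common approximation rather than values taken from this book, and the charts in Figures 2.2 through 2.4 and Appendix L remain the reference.

    import math

    def sat_vapor_pressure_hpa(t_c):
        # Magnus-type approximation over water (illustrative accuracy only).
        return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

    def a3_rh_limits_at(dry_bulb_c, dp_min=-12.0, dp_max=24.0,
                        rh_min=8.0, rh_max=85.0):
        """Effective allowable RH range (%) at a given dry-bulb temperature."""
        es = sat_vapor_pressure_hpa(dry_bulb_c)
        rh_from_dp_max = 100.0 * sat_vapor_pressure_hpa(dp_max) / es
        rh_from_dp_min = 100.0 * sat_vapor_pressure_hpa(dp_min) / es
        return max(rh_min, rh_from_dp_min), min(rh_max, rh_from_dp_max)

    print(a3_rh_limits_at(25.0))  # roughly (8, 85): both limits governed by the stated RH
    print(a3_rh_limits_at(35.0))  # roughly (8, 53): the 24 C DP caps RH well below 85%
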
Because equipment manufactured to environmental Classes A1 and A2 may
exist in two different forms that meet either the 2011 or 2015 versions, it is imperative that, when referencing equipment in Classes A1 or A2, the thermal guidelines version (2011 or 2015) be noted.

Figure 2.4 Climatogram of Class A3 illustrating how dew-point limits modify relative humidity specification limits.

The recommended envelope is highlighted as a separate row in Table 2.1 because of some misconceptions regarding the use of the recommended envelope. When it was first created, it was intended that within this envelope the most reliable, acceptable, and reasonably power-efficient operation could be achieved. Data from manufacturers were used to create the recommended envelope. It was never intended
that the recommended envelope represent the absolute limits of inlet air temperature
and humidity for ITE. As stated in the second edition of Thermal Guidelines
(ASHRAE 2008), the recommended envelope defined the limits under which ITE
would operate most reliably while still achieving reasonably energy-efficient data
center operation. However, in order to use economizers as much as possible to save
energy during certain times of the year, the inlet server conditions may fall outside
the recommended envelope but still within the allowable envelope. The second
edition of Thermal Guidelines also states that it is acceptable to operate outside the
recommended envelope for short periods of time without risk of affecting the overall
reliability and operation of the ITE. However, some still felt the recommended enve-
lope was mandatory, even though that was never the intent. The effect on the reli-
ability of the equipment operating outside the recommended envelope can be
estimated using the failure rate x-factor described in Section 2.4.3.
Equipment inlet air temperature measurements are specified in Chapter 4. To
aid in data center layout and inlet rack temperature monitoring, manufacturers of
electronic equipment should include temperature sensors within their equipment
that monitor and display or report the inlet air temperature. (See Advancing DCIM
with IT Equipment Integration [ASHRAE 2019] for more information on the
sensors within ITE.) For product operation, the environmental specifications given
in Table 2.1 refer to the air entering the electronic equipment. Air exhausting from electronic equipment is not relevant to the manufacturers of such equipment. However, the exhaust temperature is a concern, for example, for service personnel
working in the hot exhaust airstream. Some information and guidance from Occu-
pational Safety and Health Administration (OSHA) for personnel working in high-
temperature environments is given in Appendix J.
The allowable and recommended envelopes for Classes A1 through A4 are
depicted in psychrometric charts in Appendix F. The recommended environmental
envelope specified in Table 2.1 is based in general on the reliability aspects of the
electronic hardware, specifically:

• High RH levels have been shown to affect failure rates of electronic compo-
nents. Examples of failure modes exacerbated by high RH include conductive
anodic failures, hygroscopic dust failures, tape media errors and excessive
wear, and corrosion. The recommended upper RH limit is set to limit this
effect. The new research reported in detail in Appendix E sets the recom-
mended upper RH limit at 70% for data centers that continuously monitor the
corrosion rate of copper and silver and are shown to have levels below 300
and 200 Å/month, respectively.
• Electronic devices are susceptible to damage by ESD, but based on the ESD
research reported in Appendix D, susceptibility to low RH is a lesser concern
than once thought.
• High temperature affects the reliability and life of electronic equipment. The
recommended upper ambient temperature limit is set to limit these tempera-
ture-related reliability effects. To estimate the effects of operating at higher
temperatures, see Section 2.4.3 for a description of the relative ITE failure
rate x-factor.
• The lower the temperature in the room that houses the electronic equipment,
in general the more energy is required by the HVAC equipment. The recom-
mended lower ambient temperature limit is set to limit extreme overcooling.

For data center equipment, each individual manufacturer tests to specific envi-
ronmental ranges, and these may or may not align with the allowable ranges spec-
ified in Table 2.1; regardless, the product that is shipped will in most cases align with
one of the classes.
Regarding the maximum altitude at which data center products should operate,
Figure 2.5 shows that the majority of the population resides below 3000 m (9840 ft);
therefore, the maximum altitude for Classes A1 through A4 was chosen as 3050 m
(10,000 ft).
The purpose of specifying a derating on the maximum dry-bulb temperature for
altitude (see note e of Table 2.1) is to identify acceptable environmental limits that
compensate for degradation in air-cooling performance at high altitudes. The rate of
heat transfer in air-cooled electronics is a function of convective heat transfer and
coolant mass flow rates, both of which decrease as a result of reduced air density,
which accompanies the lower atmospheric pressure at high altitudes. An altitude
derating restricts the maximum allowable upper operating temperature limit when
the system is operated at higher altitudes and permits a higher operating temperature limit when the system is operated at lower altitudes. Altitude derating thus ensures that system component temperatures stay within functional limits while extending the useful operating range to the maximum extent possible for a given cooling design.

Figure 2.5 World population distribution by altitude. (Courtesy Bill Rankin, www.radicalcartography.net/howhigh.html)
One area that needed careful consideration was the application of the altitude
derating for the environmental classes. Simply applying the Class A1 and A2 derating curve to Classes A3 and A4 would have imposed undesirable increases in server energy on users at all altitudes in order to support the higher altitudes. In an effort to provide both a relaxed operating environment and the lowest TCO for the client, the derating for Classes A3 and A4 was modified. These derating curves maintain significant relaxation while mitigating the extra expense incurred both during acquisition of the ITE and during operation due to increased power consumption. The relationship between dry-bulb
temperature, altitude, and air density for the different environments is depicted
graphically in the derating curves of Appendix G.
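
As a worked illustration of the deratings in note e of Tables 2.1 and 2.2, the sketch below computes the derated maximum allowable dry-bulb temperature for a class at a given altitude. It is a sketch only, assuming linear derating above 900 m at the stated rates and the sea-level allowable maxima from the tables; the curves in Appendix G are the definitive reference.

    # (sea-level allowable max dry bulb in C, derating in C per m above 900 m),
    # taken from note e of Tables 2.1 and 2.2.
    DERATING = {
        "A1": (32.0, 1.0 / 300.0),
        "A2": (35.0, 1.0 / 300.0),
        "A3": (40.0, 1.0 / 175.0),
        "A4": (45.0, 1.0 / 125.0),
        "H1": (25.0, 1.0 / 500.0),
    }

    def derated_max_dry_bulb(cls, altitude_m):
        """Maximum allowable dry-bulb temperature (C) at altitude for a class."""
        if altitude_m > 3050:
            raise ValueError("Above 3050 m, consult the IT supplier (note m).")
        max_db, rate = DERATING[cls]
        return max_db - rate * max(0.0, altitude_m - 900.0)

    print(round(derated_max_dry_bulb("A2", 1500), 1))  # 33.0 C
    print(round(derated_max_dry_bulb("A3", 2400), 1))  # about 31.4 C
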
It was intended that operation within the recommended envelope created by the
equipment manufacturers would provide the most reliable and power-efficient data
center operation. This intent continues to be the goal.

Figure 2.6 2021 recommended and allowable envelopes for ASHRAE Class H1. The recommended envelope is for low levels of pollutants verified by coupon measurements as indicated in note 3 of Section 2.2.

2.2.2 Environmental Class Definition for High-Density Air-Cooled Equipment
High-density products that use high-powered components such as central processing units (CPUs), graphics processing units (GPUs), and memory require increased cooling, which could in principle be provided through greater heat sink volume or fan performance. However, the available server volume does not permit these performance enhancements. To meet the component temperature limits, the ambient temperature
needs to be lowered. Therefore, to address such high-powered ITE, a new air-cooling
class specific to high-density servers has been added. All the current environmental
classes as noted in Section 2.2.1 remain as described.
When a data center includes ITE manufactured to one or more of the envelopes
described in Section 2.2.1 as well as other equipment requiring more restrictive
temperature or humidity control as described in this section, separate areas should be
provided. If necessary, these areas should have separate environmental controls and
may use separate cooling systems to facilitate optimization of cooling efficiency. Of
course, the IT manufacturer will determine if a product requires this class environ-
ment, with the knowledge that more facility cooling energy will be required by the
customer to meet this more restrictive environment. Figures 2.6 and 2.7 display the
new recommended Class H1 high-density envelope and its corresponding allowable
envelope. Table 2.2 provides the specific values for the environmental limits. Appen-
dices B and F include additional graphical representations of these new envelopes.

Figure 2.7 2021 recommended and allowable envelopes for ASHRAE Class H1. The recommended envelope is for high levels of pollutants verified by coupon measurements as indicated in note 3 of Section 2.2.

Table 2.2 2021 Thermal Guidelines for High-Density Servers—SI Version (I-P Version in Appendix B)

Equipment Environment Specifications for High-Density Air Cooling

                            Product Operation (b,c)                                                    Product Power Off (c,d)
Class      Dry-Bulb         Humidity Range,                  Max. Dew      Max. Elev.     Max. Rate of     Dry-Bulb
(a)        Temp. (e,g),     Noncond. (h,i,k,l,n)             Point,        (e,j,m),       Change (f),      Temp.,       RH,
           °C                                                °C            m              °C/h             °C           %

Recommended
H1         18 to 22         –9°C DP to 15°C DP and
                            70% rh (n) or 50% rh (n)

Allowable
H1         5 to 25          –12°C DP and 8% rh to            17            3050           5/20             5 to 45      8 to 80 (k)
                            17°C DP and 80% rh (k)

Notes for Table 2.2, 2021 Thermal Guidelines for High-Density Servers—
SI Version (I-P Version in Appendix B)
a. This is a new class specific to high-density servers. It is at the discretion of the ITE manu-
facturer to determine the need for a product to use this high-density server class. Classes A1
through A4 are separate and are shown in Table 2.1.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to 2011 Class A1).
Typical requirements: minimum temperature is 15°C, maximum temperature is 32°C, mini-
mum RH is 20%, maximum RH is 80%, maximum DP is 22°C, rate of change of temperature
is less than 5°C/h, rate of change of humidity is less than 5% rh per hour, and no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. For H1 class only—Derate maximum allowable dry-bulb temperature 1°C/500 m above 900
m. Above 2400 m altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature.
f. For tape storage: 5°C in an hour. For all other ITE: 20°C in an hour and no more than 5°C in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 5°C and 20°C temperature change
is considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 10°C. With the lowest allowed
temperature of 15°C, there is no problem with diskettes residing in this H1 environment.
h. The minimum humidity level for Class H1 is the higher (more moisture) of the –12°C DP and
the 8% rh. These intersect at approximately 25°C. Below this intersection (~25°C) the DP (–
12°C) represents the minimum moisture level, while above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-ESD floors and where personnel are allowed to wear non-ESD
shoes may need increased humidity given that the risk of generating 8 kV increases slightly
from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 3050 m.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for both product operations and product power OFF.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 3050 m requires consultation with the IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.

Table 2.3 ETSI Class 3.1 and 3.1e Environmental Requirements (ETSI 2014)

     Environmental Parameter                               Unit      Normal    Exceptional (E)
a    Low air temperature                                   °C        +5        –5
b    High air temperature                                  °C        +40       +45
c    Low relative humidity                                 % rh      5         5
d    High relative humidity                                % rh      85        90
e    Low absolute humidity                                 g/m3      1
f    High absolute humidity                                g/m3      25
g    Rate of change of temperature (a)                     °C/min    0.5
h    Low air pressure                                      kPa       70
i    High air pressure (b)                                 kPa       106
j    Solar radiation                                       W/m2      700
k    Heat radiation                                        W/m2      600
l    Movement of the surrounding air (c)                   m/s       5
m    Conditions of condensation                            none      no
n    Conditions of wind—driven rain, snow, hail, etc.      none      no
o    Conditions of water from sources other than rain      none      no
p    Conditions of icing                                   none      no
a. Averaged over a period of 5 min.
b. Conditions in mines are not considered.
c. A cooling system based on non-assisted convection may be disturbed by adverse movement of the surrounding air.

2.2.3 ETSI Environmental Specifications


The European Telecommunications Standards Institute (ETSI) defines stan-
dards for information and communications technologies and is recognized by the
European Union as a European standards organization. ETSI has defined a set of five
environmental classes based on the end-use application. ETSI Classes 3.1 and 3.1e
apply to telecommunications centers, data centers, and similar end-use locations.
These classes assume a noncondensing environment, no risk of biological or animal
contamination, normal levels of airborne pollutants, insignificant vibration and
shock, and that the equipment is not situated near a major source of sand or dust.
Classes 3.1 and 3.1e apply to permanently temperature-controlled enclosed loca-
tions where humidity is not usually controlled. For comparison against ASHRAE
Classes A1 through A4 and H1, a high-level summary of Classes 3.1 and 3.1e is
given in Table 2.3. A climatogram of those same conditions is shown in Figure 2.8.
For more details on the Class 3.1 and 3.1e specification requirements, consult ETSI
300 019-1-3 (ETSI 2014).

Figure 2.8 Climatogram of the ETSI Class 3.1 and 3.1e environmental conditions (ETSI 2014).

2.3 GUIDE FOR THE USE AND APPLICATION OF THE ASHRAE DATA CENTER CLASSES

With five data center classes, the decision process for the data center owner/
operator is complicated when trying to optimize efficiency, reduce TCO, address
reliability issues, and improve performance. Data center optimization is a complex,
multivariable problem and requires a detailed engineering evaluation for any signif-
icant changes to be successful. An alternative operating envelope should be consid-
ered only after appropriate data are collected and interactions within the data center
are understood. Each parameter’s current and planned status could lead to a different
endpoint for the data center optimization path.
The worst-case scenario would be for an end user to carelessly assume that ITE
is capable of operating in Classes A3 or A4 or that the mere definition of these
classes, with their expanded environmental ranges, magically solves existing data
center thermal management or power density or cooling problems. While some new
ITE may operate in these classes, other ITE, including legacy equipment, may not.
Data center problems would most certainly be compounded if the user erroneously
assumes that Class A3 or A4 conditions are acceptable. The rigorous use of the tools
and guidance in this chapter should preclude such errors. Table 2.4 summarizes the
key characteristics and potential options to be considered when evaluating the opti-
mal operating range for each data center.

Table 2.4 Ranges of Options to Consider for Optimizing Energy Savings

Characteristic                       Range of Options

Project type                         New, retrofit, existing upgrade

Architectural aspects                Layout and arrangement, economizer airflow path,
                                     connections between old and new sections

Airflow management                   Extensive range, from none to full containment (a,b),
                                     room's performance in enabling uniform ITE inlet
                                     temperatures and reducing or eliminating undesired
                                     recirculation

Cooling controls sensor location     Cooling system outlet, IT inlet

Temperature/humidity rating          Temperature/humidity rating of: power distribution
of all existing equipment            equipment, cabling, switches and network gear, room
                                     instrumentation, humidification equipment, cooling
                                     unit allowable supply and return temperatures,
                                     personnel health and safety requirements

Economizer                           None, to be added, existing, water-side, air-side

Chiller                              None, existing

Climate factors—Temperature (d)      Range of temperature in the region (obtain bin data
                                     and/or design extremes), number of hours per year
                                     above potential ASHRAE class maximums

Climate factors—Humidity (d)         Range of humidity in the region (obtain bin data and/or
                                     design extremes for RH and DP), coincident temperature
                                     and humidity extremes, number of hours per year outside
                                     potential ASHRAE class humidity ranges

Cooling architecture                 Air, liquid, perimeter, row, rack level

Data center type (c)                 High performance computing (HPC), internet,
                                     enterprise, financial
a. Some computer room air-conditioner (CRAC)/computer room air handler (CRAH) units have limited return
temperatures, as low as 30°C (86°F).
b. With good airflow management, server temperature rise can be on the order of 20°C (36°F); with an inlet
temperature of 40°C (104°F) the hot aisle could be 60°C (140°F).
c. Data center type affects reliability/availability requirements.
d. Climate factors are summarized in “ASHRAE Position Document on Climate Change” (2018a).

By understanding the characteristics described in Table 2.4 along with the data
center capability, one can follow the general steps necessary in setting the operating
temperature and humidity range of the data center:

1. Consider the state of best practices for the data center. Most best practices,
including airflow management and cooling-system control strategies, should be
implemented prior to the adoption of higher server inlet temperature.
2. Determine the maximum allowable ASHRAE class environment from
Tables 2.1 and 2.2 based on review of all ITE environmental specifications.
3. Use the default recommended operating envelope (see Tables 2.1 and 2.2) or,
if more energy savings is desired, use the following information to determine
the operating envelope:
a. Climate data for locale (only when using economizers)
b. Server power trend versus ambient temperature (see Section 2.4.1)
c. Acoustical noise levels in the data center versus ambient temperature (see
Section 2.4.2)
d. Server reliability trend versus ambient temperature (see Section 2.4.3)
e. Server reliability versus moisture, contamination, and other temperature
effects (see Section 2.4.4)
f. Server performance trend versus ambient temperature (see Section 2.4.5)
g. Server cost trend versus ambient temperature (see Section 2.4.6)

The steps above provide a simplified view of the flowchart in Appendix C. The
use of Appendix C is highly encouraged as a starting point for the evaluation of the
options. The flowchart provides guidance to data center operators seeking to mini-
mize TCO on how best to position their data center for operating in a specific envi-
ronmental envelope. Possible endpoints range from optimization of TCO within the
recommended envelope as specified in Table 2.1 to a chillerless data center using any
of the data center classes. More importantly, Appendix C describes how to achieve
even greater energy savings through the use of a TCO analysis using the server
metrics provided in the next section.

2.4 SERVER METRICS TO CONSIDER IN USING GUIDELINES


The development of the recommended envelopes for the 2004 and 2008 editions
of this book was based on IT manufacturers' knowledge of both the reliability and
equipment power consumption trends of servers as a function of inlet air tempera-
ture. To use a different envelope providing greater flexibility in data center opera-
tion, some knowledge of these two factors must be provided. The following
subsections provide trend data for ITE for both power and reliability over a wide
range of ambient temperatures. In addition, some aspects of server performance,
power, acoustics, corrosion, and cost versus ambient temperature and humidity are
also discussed.
A number of server metrics are presented in the following subsections and are
shown as ranges for the parameter of interest. The ranges are meant to capture most of
the volume server market. For specific server information, contact the IT manufacturer.

2.4.1 Server Power Trend versus Ambient Temperature

Data were collected from a number of ITE manufacturers covering a wide range
of products. Most of the data collected for the Class A2 environment fell within the
envelope displayed in Figure 2.9. The power increase is a result of fan power, compo-
nent power, and the power conversion for each. The component power increase is a
result of an increase in leakage current for some silicon devices. As an example of
the use of Figure 2.9, if a data center is normally operating at a server inlet temperature of 15°C (59°F) and the operator wants to raise this temperature to 30°C (86°F),
it could be expected that the server power would increase in the range of 3% to 7%.
If the inlet temperature increases to 35°C (95°F), the ITE power could increase in
the range of 7% to 20% compared to operating at 15°C (59°F).

Figure 2.9 Server power increase (Class A3 is an estimate) versus ambient temperature for Classes A2 and A3.
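
A simple way to apply these ranges is to bracket the expected ITE power at the new inlet temperature. The sketch below uses only the example figures quoted above (3% to 7% at 30°C and 7% to 20% at 35°C relative to 15°C); actual values should come from Figure 2.9 or from the IT manufacturer, and the fleet power number is hypothetical.

    # Power uplift relative to a 15 C inlet, taken from the example above:
    # new inlet temperature (C) -> (low estimate, high estimate) as fractions.
    UPLIFT_VS_15C = {30: (0.03, 0.07), 35: (0.07, 0.20)}

    def it_power_range_kw(nominal_kw_at_15c, new_inlet_c):
        lo, hi = UPLIFT_VS_15C[new_inlet_c]
        return nominal_kw_at_15c * (1 + lo), nominal_kw_at_15c * (1 + hi)

    # Hypothetical 500 kW IT load measured at a 15 C inlet:
    print(it_power_range_kw(500.0, 30))  # (515.0, 535.0) kW
    print(it_power_range_kw(500.0, 35))  # (535.0, 600.0) kW
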
The development of the Class A3 envelope shown in Figure 2.9 was simply
extrapolated from the Class A2 trend. New products for this class would likely be
developed with improved heat sinks and/or fans to properly cool the components
within the new data center class, so the power increases over the wider range would
be very similar to that shown for Class A2.
With the increase in fan speed over the range of ambient temperatures, ITE flow
rate also increases. An estimate of the increase in server airflow rates over the temperature range up to 35°C (95°F) is displayed in Figure 2.10. This increase is very important when designing data centers to take advantage of inlet ambient temperatures above the 25°C to 27°C (77°F to 80.6°F) range. With higher temperatures as
an operational target, the data center design must be analyzed to be able to accom-
modate the higher volumes of airflow. This includes all aspects of the airflow system.
The base system may be called upon to meet 250% (per Figure 2.10) of the nominal
airflow (the airflow when in the recommended range). This may include the outdoor
air inlet, filtration, cooling coils, dehumidification/humidification, fans, underfloor
plenum, raised-floor tiles/grates, and containment systems. A detailed engineering
evaluation of the data center system’s higher flow rate is a requirement to ensure
successful operation at elevated inlet temperatures.

Figure 2.10 Server flow rate increase versus ambient temperature increase.
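
A corresponding check on the airflow path can be sketched as follows: each element is compared against the elevated-temperature airflow rather than the nominal value. The 2.5 multiplier reflects the 250% figure cited above; the element names and capacities are hypothetical, and actual multipliers should be read from Figure 2.10 or obtained from the ITE manufacturer.

    def airflow_shortfalls(nominal_m3_per_h, multiplier, capacities_m3_per_h):
        """Return path elements whose capacity is below the elevated-temperature airflow."""
        required = nominal_m3_per_h * multiplier
        return {name: cap for name, cap in capacities_m3_per_h.items() if cap < required}

    # Hypothetical capacities for a 100 000 m3/h nominal ITE airflow:
    capacities = {
        "outdoor air inlet": 300_000,
        "filtration": 220_000,
        "cooling coils": 260_000,
        "raised-floor tiles/grates": 240_000,
    }
    print(airflow_shortfalls(100_000, 2.5, capacities))
    # {'filtration': 220000, 'raised-floor tiles/grates': 240000}
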
Another aspect of power trend that might help determine a new operating enve-
lope is understanding the total facility energy consumption and not just the IT load
as discussed in this section. For example, as the inlet operating temperature is increased, it is very possible that the fan speed of servers will also increase, thereby
increasing the server power. This server power increase would probably result in a
lower PUE, giving the false impression that energy use of the data center has
improved, though this is not the case. This situation highlights the importance of
measuring the total data center power usage.
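
The effect described here can be seen with simple hypothetical numbers: if server fans push the IT load up while the facility overhead is unchanged, PUE improves even though total energy rises. The figures below are purely illustrative.

    def pue(it_kw, facility_overhead_kw):
        # PUE = total facility power / IT power
        return (it_kw + facility_overhead_kw) / it_kw

    # Hypothetical: raising the inlet temperature adds 5% server fan power
    # while the cooling and power-distribution overhead stays the same.
    before = pue(it_kw=1000.0, facility_overhead_kw=400.0)  # 1.40
    after = pue(it_kw=1050.0, facility_overhead_kw=400.0)   # about 1.38

    total_before = 1000.0 + 400.0  # 1400 kW
    total_after = 1050.0 + 400.0   # 1450 kW: PUE fell, yet total power rose
    print(round(before, 2), round(after, 2), total_before, total_after)
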

2.4.2 Acoustical Noise Levels versus Ambient Temperature


Expanding the operating envelope for datacom facilities may have an adverse
effect on acoustical noise levels. Noise levels in high-end data centers have steadily
increased over the years and have become, or at least will soon become, a serious
concern to data center managers and owners. For background and discussion on this,
see Chapter 9 of Design Considerations for Datacom Equipment Centers, Second
Edition (ASHRAE 2009a).
This subsection addresses ITE noise as opposed to total data center noise, which also includes computer room cooling noise sources that contribute to overall data center noise exposure. The increase in noise levels is the obvious result of
the significant increase in cooling requirements of modern IT and telecommunica-
tions equipment. The increase in concern results from noise levels in data centers
approaching or exceeding regulatory workplace limits, such as those imposed by
OSHA (1980) in the United States or by EC Directives in Europe (Europa 2003).
Telco equipment level sound power requirements are specified in GR-63-CORE
(Telcordia 2012). Empirical fan laws generally predict that the sound power level of
an air-moving device increases with the fifth power of rotational speed; this behavior
has generally been validated over the years for typical high-end rack-mounted serv-
ers, storage units, and input/output (I/O) equipment normally found on data center
floors. This means that a 20% increase in speed (e.g., 3000 to 3600 rpm) equates to
a 4 dB increase in noise level. While it is not possible to predict a priori the effect
on noise levels of a potential 2°C (3.6°F) increase in data center temperatures, it is
not unreasonable to expect to see increases in the range of 3 to 5 dB for such a rise
in ambient temperatures, especially above the maximum recommended temperature
limit, as a result of the air-moving devices speeding up to maintain the same cooling
effect. Data center managers and owners should therefore weigh the trade-offs
between the potential benefits in energy efficiency with this new recommended
operating environment and the potential risks associated with increased noise levels.
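
The fifth-power relationship quoted above translates directly into decibels as a level change of 50 log10(N2/N1), so a 20% speed increase gives roughly 4 dB. A minimal sketch of that arithmetic:

    import math

    def fan_noise_increase_db(rpm_before, rpm_after):
        # Fifth-power empirical fan law: sound power rises as 50*log10 of the speed ratio.
        return 50.0 * math.log10(rpm_after / rpm_before)

    print(round(fan_noise_increase_db(3000, 3600), 1))  # about 4.0 dB
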
The ASHRAE air-cooled equipment guidelines described in this chapter,
specifically Classes A3 and A4 with widely extended operating temperature enve-
lopes, make it instructive to look at the allowable upper temperature ranges and their
potential effects on data center noise levels. Using the fifth power empirical law
mentioned previously, coupled with current practices for increasing air-moving
device speeds based on ambient temperature, the A-weighted sound power level
increases shown in Table 2.5 were predicted for typical air-cooled high-end server
racks containing a mix of compute, I/O, and water-cooled units.

Table 2.5 Expected Increase in A-Weighted Sound Power Level

Ambient temperature       25°C (77°F)   30°C (86°F)   35°C (95°F)   40°C (104°F)   45°C (113°F)
Increase in sound power   0 dB          4.7 dB        6.4 dB        8.4 dB         12.9 dB
Of course, the actual increase in noise level for any particular ITE rack depends
not only on the specific configuration of the rack but also on the cooling schemes and
fan-speed algorithms used for the various rack drawers and components. Differences would exist between high-end equipment that uses sophisticated fan-speed control
and entry-level equipment using fixed fan speeds or rudimentary speed control.
However, the above increases in noise emission levels with ambient temperature can
serve as a general guideline for data center managers and owners concerned about
noise levels and noise exposure for employees and service personnel. The IT indus-
try has developed its own internationally standardized test codes for measuring the
noise emission levels of its products (ISO 7779 [2018]) and for declaring these noise
levels in a uniform fashion (ISO 9296 [2017]). Noise emission limits for ITE
installed in a variety of environments (including data centers) are stated in Statskon-
toret Technical Standard 26:6 (2004).
This discussion applies to potential increases in noise emission levels (i.e., the
sound energy actually emitted from the equipment, independent of listeners in the
room or the environment in which the equipment is located). Ultimately, the real
concern is about the possible increase in noise exposure, or noise immission levels,
experienced by personnel in the data center. With regard to regulatory workplace
noise limits and protection of employees against potential hearing damage, data
center managers should check whether potential changes in noise levels in their envi-
ronment will cause them to trip various action-level thresholds defined in local, state,
or national codes. The actual regulations should be consulted, as they are complex
and beyond the scope of this book to explain in full. The noise levels of concern in
workplaces are stated in terms of A-weighted sound pressure levels (as opposed to
the A-weighted sound power levels used for rating the emission of noise sources).
For instance, when noise levels in a workplace exceed a sound pressure level of
85 dB(A), hearing conservation programs, which can be quite costly, are mandated,
generally involving baseline audiometric testing, noise level monitoring or dosim-
etry, noise hazard signage, and education and training. When noise levels exceed
87 dB(A) (in Europe) or 90 dB(A) (in the U.S.), further action, such as mandatory
hearing protection, rotation of employees, or engineering controls, must be taken.
Data center managers should consult with acoustical or industrial hygiene experts
to determine whether a noise exposure problem will result when ambient tempera-
tures are increased to the upper ends of the expanded ranges proposed in this book.
In an effort to provide some general guidance on the effects of the proposed
higher ambient temperatures on noise exposure levels in data centers, the following
observations can be made (though, as noted above, it is advised that one seek profes-
sional help in actual situations, because regulatory and legal requirements are at
issue). Modeling and predictions of typical ITE racks in a typical data center with
front-to-back airflow have shown that the sound pressure level in the center of a typi-
cal aisle between two rows of continuous racks will reach the regulatory trip level
of 85 dB(A) when each of the individual racks in the rows has a measured (as opposed to a statistical upper limit) sound power level of roughly 8.4 B (84 dB). If
it is assumed that this is the starting condition for a 25°C (77°F) ambient data center
temperature—and many fully configured high-end ITE racks today are at or above
this 8.4 B (84 dB) level—the sound pressure level in the center of the aisle would
be expected to increase to 89.7 dB(A) at 30°C (86°F) ambient, to 91.4 dB(A) at 35°C
(95°F) ambient, to 93.4 dB(A) at 40°C (104°F) ambient, and to 97.9 dB(A) at 45°C
(113°F) ambient, using the predicted increases to sound power level shown in
Table 2.5. Needless to say, these levels are extremely high. They are not only above
the regulatory trip levels for mandated action (or fines, in the absence of action), but
they clearly pose a risk of hearing damage unless controls are instituted to avoid
exposure by data center personnel.
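
The aisle-level figures above follow from adding the Table 2.5 increments to the 85 dB(A) starting point. A minimal sketch reproducing that arithmetic is shown below; it is illustrative only, because actual levels depend on the room, the racks, and the fan-speed algorithms.

    # Sound power increases from Table 2.5, relative to a 25 C ambient.
    TABLE_2_5_DB = {25: 0.0, 30: 4.7, 35: 6.4, 40: 8.4, 45: 12.9}

    def aisle_spl_estimate(spl_at_25c_dba, ambient_c):
        """Estimated aisle sound pressure level if all racks speed up together."""
        return spl_at_25c_dba + TABLE_2_5_DB[ambient_c]

    for t in (30, 35, 40, 45):
        print(t, round(aisle_spl_estimate(85.0, t), 1))
    # 30 -> 89.7, 35 -> 91.4, 40 -> 93.4, 45 -> 97.9 dB(A)
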

2.4.3 Server Reliability Trend versus Ambient Temperature


Before extensively using data center economizers or wider environmental oper-
ating limits, it is important to understand the reliability (failure rate) impact of those
changes. The hardware failure rate within a given data center is determined by the
local climate, the type of economization being implemented, and the temperature
and humidity range over which the economization is being carried out. Most econ-
omized facilities have a means of mixing hot exhaust air with incoming cold air, so
the minimum data center temperature is usually tempered to something in the range
of 15°C to 20°C (59°F to 68°F). All of the ITE (servers, storage, networking, power,
etc.) in a data center using an economizer must be rated to operate within the planned
data center class of temperature and humidity ranges.
This subsection describes the process for evaluating the effect of temperature on
ITE reliability. Actual ITE reliability for any given data center could be better or
worse due to exposure to a wider range of ambient temperatures through the use of
economization. No guidance is provided with respect to the reliability of equipment
other than ITE. Equipment other than ITE must be separately evaluated in combi-
nation with the ITE to determine overall data center reliability.
To understand the impact of temperature on hardware failure rates, one can
model different economization scenarios. First, consider the ways economization
can be implemented and how these would impact the data center temperature.
(Although this subsection focuses on economization and the possible effects on reli-
ability due to increased temperatures, the discussion can also apply to chiller-
designed data centers that operators wish to run at higher temperatures.) For
purposes of this discussion, consider three broad categories of economized facilities:

1. Economization over a narrow temperature range with little or no change to the data center temperature. This is the primary control methodology,
where the data center is properly configured to control the air temperature at or
near the IT inlet to the data center operators’ chosen temperature. The econo-
mizer modulates to bring in more or less cool air (air side) or adjust the cool
water flow or temperature (water side) or adjust the refrigerant flow (refrigerant
side) to meet this required temperature. If the external conditions or internal
load change such that the economizer can no longer handle the task on its own, the chiller or direct-expansion (DX) system ramps up to provide additional cooling capacity, thereby keeping the space at the desired temperature. This is the
most benign implementation of economizing, because the temperature of the
data center is essentially the same as if the data center were not economized. If
there is little or no temperature change, then there should be little or no failure
rate impact of temperature on the data center hardware. This economization
scenario probably represents the vast majority of economized data centers.
2. Expanded temperature range economization, where there may be a net
increase in the data center temperature some of the time. Some data center
owners/operators may choose to reduce cooling energy by expanding econo-
mizer hours or raising computer room air-conditioner (CRAC) set points,
thereby widening the temperature range over which they operate their facilities.
Facilities using this operational mode may be located in an environment where
expanded economizer hours are available, but they typically have mechanical
cooling as part of their system configuration.
3. A chillerless data center facility, where data center temperatures are
higher and vary with the outdoor air temperature and local climate. Some
data center owners/operators in cool climates may want to reduce their data
center construction capital costs by building a chillerless facility. In chillerless
data center facilities the temperature in the data center varies over a much wider
range that is determined, at least in part, by the temperature of the outdoor air
and the local climate. These facilities may use supplemental cooling methods
that are not chiller based, such as evaporative cooling.

Because there are so many different variables and scenarios to consider for ITE
reliability, the approach taken by ASHRAE TC 9.9 was to initially establish a base-
line failure rate (x-factor) of 1.00 that reflected the average probability of failure
under a constant ITE inlet temperature of 20°C (68°F). Table 2.6 provides x-factors
at other constant ITE inlet temperatures for 7 × 24 × 365 continuous operation condi-
tions. The key to applying the x-factors in Table 2.6 is to understand that they repre-
sent a relative failure rate compared to the baseline of a constant ITE inlet
temperature of 20°C (68°F). This table was created using manufacturers’ reliability
data, which included all components within the volume server package. Table 2.6
provides x-factor data at the average, upper, and lower bounds to take into account
the many variations within a server package among the number of processors, dual
in-line memory modules (DIMMs), hard drives, and other components. The data set
chosen should depend on the level of risk tolerance for a given application.

Table 2.6 Relative ITE Failure Rate x-Factor as Function of Constant ITE Air Inlet Temperature

Temperature Impact on Volume Server Hardware Failure Rate

Dry-Bulb Temp.,              Failure Rate x-Factor
°C (°F)             Lower Bound    Average Bound    Upper Bound
15.0 (59.0)             0.72           0.72            0.72
17.5 (63.5)             0.80           0.87            0.95
20.0 (68.0)             0.88           1.00            1.14
22.5 (72.5)             0.96           1.13            1.31
25.0 (77.0)             1.04           1.24            1.43
27.5 (81.5)             1.12           1.34            1.54
30.0 (86.0)             1.19           1.42            1.63
32.5 (90.5)             1.27           1.48            1.69
35.0 (95.0)             1.35           1.55            1.74
37.5 (99.5)             1.43           1.61            1.78
40.0 (104.0)            1.51           1.66            1.81
42.5 (108.5)            1.59           1.71            1.83
45.0 (113.0)            1.67           1.76            1.84

Note: Relative hardware failure rate x-factor for volume servers is shown as a function of continuous operation.
It is important to note that the 7 × 24 × 365 use conditions corresponding to the
x-factors in Table 2.6 are not a realistic reflection of the three economization scenar-
ios outlined previously. For most climates in the industrialized world, the majority
of the hours in a year are spent at cool temperatures, where mixing cool outdoor air
with air from the hot aisle exhaust keeps the data center temperature in the range of
15°C to 20°C (59°F to 68°F) (x-factor of 0.72 to 1.00). Furthermore, these same
climates spend only 10% to 25% of their annual hours above 27°C (80.6°F), the
upper limit of the ASHRAE recommended range. The correct way to analyze the
reliability impact of economization is to use climate data to construct a time-weighted average x-factor. An analysis of time-weighted x-factors will show that,
even for the harshest economization scenario (chillerless), the reliability impact of
economization is much more benign than the 7 × 24 × 365 x-factor data in Table 2.6
would indicate. A summary of time-weighted x-factors for air-side economization
for a variety of U.S. cities is shown in Figure 2.11. (See Appendix I for more details
on how this figure was created.) The data assume a 1.5°C (2.7°F) temperature rise
between the outdoor air temperature and the equipment inlet air temperature. More
than half of the cities have x-factor values at or below 1.0, and even the warmest cities
show an x-factor of only about 1.25 relative to a traditional air-conditioned data
center that is kept at 20°C (68°F).

Figure 2.11 Time-weighted x-factor estimates for air-side economizer use for selected U.S. cities.
It is important to understand the meaning of the relative failure rate values. The
results are normalized to a data center run continuously at 20°C (68°F), which has
the relative failure rate of 1.0. For those cities with values below 1.0, the implication
is that the economizer still functions and the data center is cooled below 20°C (68°F)
(to 15°C [59°F]) for those hours each year. In addition, the relative failure rate in
Table 2.6 shows the expected increase in the number of failed servers, not the
percentage of total servers failing (e.g., if a data center that experiences 4 failures per 1000 servers incorporates warmer temperatures, and the relative failure rate x-factor
is 1.2, then the expected failure rate would be 5 failures per 1000 servers). To provide
an additional frame of reference on data center hardware failures, sources showed
blade hardware server failures were in the range of 2.5% to 3.8% over 12 months in
two different data centers with supply temperatures approximately 20°C (68°F)
(Patterson et al. 2009; Atwood and Miner 2008). In a similar data center that
included an air-side economizer with temperatures occasionally reaching 35°C
(95°F) (at an elevation around 1600 m [5250 ft]), the failure rate was 4.5%. These
values are provided solely for guidance with an example of failure rates. In these
studies, a failure was deemed to have occurred each time a server required hardware
attention. No attempt to categorize the failure mechanisms was made.
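
A time-weighted x-factor can be computed directly from temperature bin data and Table 2.6, and then used to scale a baseline failure count as in the example above. The sketch below interpolates the average-bound column and weights by annual hours; the bin data are hypothetical, and Appendix I describes the method actually used for Figure 2.11.

    # Average-bound x-factors from Table 2.6 (dry-bulb C -> x-factor).
    XF = [(15.0, 0.72), (17.5, 0.87), (20.0, 1.00), (22.5, 1.13), (25.0, 1.24),
          (27.5, 1.34), (30.0, 1.42), (32.5, 1.48), (35.0, 1.55)]

    def x_factor(temp_c):
        """Linearly interpolate Table 2.6 (average bound), clamped at the ends."""
        if temp_c <= XF[0][0]:
            return XF[0][1]
        for (t0, x0), (t1, x1) in zip(XF, XF[1:]):
            if temp_c <= t1:
                return x0 + (x1 - x0) * (temp_c - t0) / (t1 - t0)
        return XF[-1][1]

    # Hypothetical annual ITE inlet temperature bins: (temperature C, hours/year).
    bins = [(15.0, 3500), (20.0, 2800), (25.0, 1700), (30.0, 600), (35.0, 160)]
    hours = sum(h for _, h in bins)
    weighted_x = sum(x_factor(t) * h for t, h in bins) / hours

    baseline_failures_per_1000 = 4.0
    print(round(weighted_x, 2), round(baseline_failures_per_1000 * weighted_x, 1))
    # roughly 0.97 and 3.9 failures per 1000 servers for this hypothetical climate
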
To provide additional guidance on the use of Table 2.6, Appendix H gives a
practical example of the impact of a compressorless cooling design on hardware fail-
ures, and Appendix I provides ITE reliability data for selected major U.S. and global
cities.
One other aspect not discussed here is server availability requirements. The
question one has to ask is: are there availability requirements for some servers in the
data center that would require much more stringent temperature controls than might
be allowed through modeling of reliability as described here?

2.4.4 Server Reliability versus Moisture, Contamination, and Other Temperature Effects
The preceding discussion is almost entirely about temperature, but there are
other factors, such as pollution and humidity, that can cause failures in data center
equipment. The effects of gaseous pollution, particulates, and humidity on the types of equipment failures they can cause are well documented. One of the best sources
on the effects of pollution on data centers is Particulate and Gaseous Contamination
in Datacom Environments, Second Edition (ASHRAE 2014b). When selecting a site
for a new data center or when adding an air-side economizer to an existing data
center, the air quality and building materials should be checked carefully for sources
of pollution and particulates. Additional filtration should be added to remove
gaseous pollution and particulates, if needed. Research has shown that in addition
to pollution, both temperature and humidity affect dielectric properties of printed
circuit board (PCB) dielectric materials (Hamilton et al. 2007; Sood 2010; Hinaga
et al. 2010). The dielectric (e.g., FR4) provides the electrical isolation between board
signals. With either increased moisture or higher temperature in the PCB, transmis-
sion line losses increase. Signal integrity may be significantly degraded as the
board’s temperature and moisture content increase. Moisture content changes rela-
tively slowly, on the order of hours and days, based on the absorption rate of the
moisture into the board. Outer board layers are affected first. Temperature changes
on the order of minutes and can quickly affect performance. As more high-speed
signals are routed in the PCB, both temperature and humidity will become even
greater concerns for ITE manufacturers. The cost of PCB material may increase
significantly and may increase the cost of Class A3- and A4-rated ITE. The alter-
native for ITE manufacturers is to use lower-speed bus options, which will lower
performance.
Excessive exposure to high humidity can induce performance degradations or
failures at various circuitry levels. At the PCB level, conductive anodic filament
grows along the delaminated fiber/epoxy interfaces where moisture facilitates the
formation of a conductive path (Turbini and Ready 2002; Turbini et al. 1997). At the
substrate level, moisture can cause surface dendrite growth between pads of opposite
bias due to electrochemical migration. This is a growing concern due to continuing
C4 (solder ball connection) pitch refinement. At the silicon level, moisture can
induce degradation or loss of the adhesive strength in the dielectric layers, while
additional stress can result from hygroscopic swelling in package materials. The
combination of these two effects often causes delamination near the die corner
region where thermal-mechanical stress is inherently high and more vulnerable to
moisture. It is worth noting that temperature plays an important role in moisture
effects. On one hand, higher temperature increases the diffusivity coefficients and
accelerates the electrochemical reaction. On the other hand, the locally higher
temperature due to self-heating also reduces the local RH, thereby drying out the
circuit components and enhancing their reliability.
In addition to the above diffusion-driven mechanism, another obvious issue with
high humidity is condensation. This can result from sudden ambient temperature
drop or the presence of a lower temperature source for water-cooled or refrigeration-
cooled systems. Condensation can cause failures in electrical and mechanical
devices through electrical shorting and corrosion. Other examples of failure mode
exacerbated by high RH include hygroscopic dust failures (Comizzoli et al. 1993),
tape media errors, excessive wear (Van Bogart 1995), and corrosion. These failures
are found in environments that exceed 60% rh for extended periods of time.

As a rule, the typical mission-critical data center must give utmost consideration
to the trade-offs before operating with an RH that exceeds 60% for the following
reasons:

• It is well known that moisture and pollutants are necessary for metals to
corrode. Moisture alone is not sufficient to cause atmospheric corrosion.
Pollution aggravates corrosion in the following ways:
• Corrosion products, such as oxides, may form and protect the metal
and slow down the corrosion rate. In the presence of gaseous pollut-
ants such as sulfur dioxide (SO2) and hydrogen sulfide (H2S) and
ionic pollutants such as chlorides, the corrosion-product films are less
protective, allowing corrosion to proceed somewhat linearly. When the
RH in the data center is greater than the deliquescent RH of the corro-
sion products, such as copper sulfate, cupric chloride, and the like, the
corrosion-product films become wet, dramatically increasing the rate
of corrosion. Cupric chloride, a common corrosion product on copper,
has a deliquescent RH of about 65%. A data center operating with RH
greater than 65% would result in the cupric chloride absorbing mois-
ture, becoming wet, and aggravating the copper corrosion rate.
• Dust is ubiquitous. Even with the best filtration efforts, fine dust will be
present in a data center and will settle on electronic hardware. Fortu-
nately, most dust has particles with high deliquescent RH, which is the
RH at which the dust absorbs enough water to become wet and promote
corrosion and/or ion migration. When the deliquescent RH of dust is
greater than the RH in the data center, the dust stays dry and does not
contribute to corrosion or ion migration. However, on the rare occurrence
when the dust has a deliquescent RH lower than the RH in the data cen-
ter, the dust will absorb moisture, become wet, and promote corrosion
and/or ion migration, degrading hardware reliability. A study by Comiz-
zoli et al. (1993) showed that, for various locations worldwide, leakage
current due to dust that had settled on PCBs increased exponentially with
RH. This study leads us to the conclusion that maintaining the RH in a
data center below about 60% will keep the leakage current from settled
fine dust at an acceptably low level.

The conditions noted in the above two bullets do not contradict the 70% rh upper
limit for the recommended envelope as shown in Table 2.1 and Figure 2.2. The
guidelines for the 70% rh upper limit are for a data center that has low levels of
pollutants; namely, copper and silver coupons are measured to be below 300 and 200
Å/month, respectively. If these measurements are higher than these limits, suggest-
ing higher levels of pollutants are present, then the RH should be limited as noted
above and suggested in note 4 of Section 2.2.
Gaseous contamination concentrations that lead to silver and/or copper corro-
sion rates greater than about 300 Å/month have been known to cause the two most common recent failure modes: copper creep corrosion on circuit boards and the
corrosion of silver metallization in miniature, surface-mounted components.
In summary, if protection of mission-critical data center hardware is paramount,
equipment can best be protected from corrosion by maintaining an RH of less than
70% and limiting the particulate and gaseous contamination concentration to levels
at which the copper and/or silver corrosion rates are less than 300 and 200 Å/month,
respectively. Of course, the data center operator may choose to limit the data center
RH to below 50% at all times to be overly protective of the ITE.
Given these reliability concerns, data center operators need to pay close atten-
tion to the overall data center humidity and local condensation concerns, especially
when running economizers on hot/humid summer days. When operating in polluted
geographies, data center operators must also consider particulate and gaseous
contamination, because the contaminants can influence the acceptable temperature
and humidity limits within which data centers must operate to keep corrosion-related
hardware failure rates at acceptable levels. Dehumidification, filtration, and gas-
phase filtration may become necessary in polluted geographies with high humidity.
Section 2.2 provides additional guidance on minimizing corrosion due to high RH
and gaseous pollutants.

2.4.5 Server Performance Trend versus Ambient Temperature


Whether the environment supports the ITE depends on the thermal design and
management implementation of the ITE. Each component within the ITE has ther-
mal limits that must be met based on the intended use. Components such as proces-
sors have features that enable maximizing performance within power and thermal
constraints based on a thermal design power (TDP). That TDP is provided to guide
the IT thermal design engineer during the design phase so that cooling is sufficiently
sized. If the ITE is not designed to meet the full capability implied by the TDP,
performance can be impacted. See IT Equipment Design Impact on Data Center
Solutions (ASHRAE 2016) for detailed information on this topic.
With some components, power consumption and performance reductions are
handled gracefully with somewhat predictable results. For example, processors can automatically limit their power consumption if they are in danger of becoming too hot, based on real-time, on-chip thermal measurements. Other components have
little or no power management capability. Many components have no thermal
sensors and no mechanism for power management and, therefore, no way to stay
within their thermal limits. Consequently, if environmental specifications are not
met, the temperature limits of such devices may be exceeded, resulting in loss of data
integrity. A system designed for one class but used in another class may continue to
operate with light workloads but may experience performance degradation with
heavy workloads. Performance degradation is driven by power management features, which are used for protection.
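The protective behavior just described can be pictured as a simple control policy. The sketch below (Python) is illustrative only and does not represent any manufacturer's implementation; the temperature limit, step size, and minimum cap are hypothetical values.

def throttle_power_cap(die_temp_c, power_cap_w,
                       temp_limit_c=95.0, step_w=10.0, min_cap_w=80.0):
    # Reduce the component power cap when an on-chip sensor approaches its
    # limit; performance follows the cap. All numbers are placeholders.
    if die_temp_c >= temp_limit_c:
        return max(power_cap_w - step_w, min_cap_w)   # protect the silicon
    return power_cap_w                                # full performance retained

print(throttle_power_cap(97.0, 200.0))   # 190.0, cap lowered for protection
print(throttle_power_cap(80.0, 200.0))   # 200.0, within thermal limits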
The exception occurs when a system is configured in an energy-saving mode
where power management features are triggered to enable adequate but not peak
performance. A configuration setting such as this may be acceptable for some

customers and applications but is generally not the default configuration that will,
in most cases, support full operation.
To give ITE manufacturers the greatest flexibility in designing to an allowable environmental class, power and thermal management may be triggered, and with the new guidance on allowable ranges, “full-performance operation” has been replaced with “full operation” in the definition of the allowable environmental envelope in Section 2.2. ITE is designed with little to no margin at the extreme upper limit of the allowable range. The recommended range provided a buffer for excursions to the allowable limits. That buffer has been removed and, consequently, power and ther-
mal management features may be triggered within the allowable range to ensure
there are no thermal excursions outside the capability of the ITE under extreme load
conditions. ITE is designed based on the probability of a worst-case event occurring,
such as the combination of extreme workloads simultaneously with room tempera-
ture excursions. Because of the low probability of simultaneous worst-case events
occurring, IT manufacturers skew their power and thermal management systems to
ensure that operation is guaranteed. Operating within a particular environmental
class requires full operation of the equipment over the entire allowable environmen-
tal range, based on nonfailure conditions. The IT purchaser must consult with the
equipment manufacturer to understand the performance capability at the extreme
upper limits of the allowable thermal envelopes.

2.4.6 Server Cost Trend versus Ambient Temperature


With ITE designed to Classes A3 or A4, the IT manufacturer has a number of
ways to support the wider environmental requirements. The trade-offs include cool-
ing solution capability, component selection based on temperature ratings, and
performance capability. With some components, such as processors, an increased
temperature capability will come at either a significant cost increase or a reduced
performance capability. The silicon must be tested to the temperature specification,
and if that specification is higher, the capability to produce a high-performance part
is reduced and it becomes more valuable, thereby increasing cost.
Higher-temperature-rated parts may or may not be available for all components.
As mentioned, improved PCB materials are available but could increase cost signifi-
cantly over lower-performing materials. Improved heat sinks may be used to
improve cooling performance, but such improvement is limited and will normally be
used in conjunction with increased air-mover speeds. The effect of increased air-
mover speeds is evident in the previous power versus temperature guidance
provided. One must be aware that the need for higher air-mover speeds will only
occur when the system inlet temperature moves towards the high range of the ther-
mal envelope. Typical speeds will still remain relatively low under more normal
room temperatures.
Assuming that performance is maintained through cooling improvements, the
cost of a server would likely increase moving from Classes A2 to A3 and then from
Classes A3 to A4. Many server designs may require improved, noncooling compo-
nents (e.g., processors, memory, storage) to achieve Class A3 or A4 operation,
because the cooling system may be incapable of improvement within the volume

constraints of the server, and the changes required to these components may also
affect server cost. In any case, the cost of servers supporting the newer ASHRAE
classes should be discussed with the individual server manufacturer to understand
whether this will factor into the decision to support the new classes within an indi-
vidual data center.

2.4.7 Summary of Air-Cooled Equipment Environmental Specifications

Classes A3 and A4 were added in the 2011 edition of Thermal Guidelines
primarily for facilities wishing to avoid the capital expense of compressor-based
cooling. These classes may offer some additional hours of economization above and
beyond Classes A1 and A2, but there is no guarantee that operation at the extremes
of Classes A3 and A4 actually results in a minimal energy condition. Fan power, both
in the ITE and in the facility, may push the total energy to a higher level than is expe-
rienced when chilling the air. One of the important reasons for the initial recom-
mended envelope was that its upper temperature bound was typical of minimized IT
fan energy. Moreover, higher-temperature operation increases the leakage power of
complementary metal-oxide semiconductor (CMOS)-based electronics, partially
(or in extreme environments, completely) offsetting the energy savings achieved by
compressorless cooling.
This chapter points out that ITE failure rates can increase with temperature in
some cases, but those failure rate increases are moderated by the short time periods
spent at elevated temperatures (see Table 2.6). For many locations and economiza-
tion scenarios, the net increase in ITE failure rate will be negligible during short-
term periods of elevated temperatures. For longer-term periods of elevated tempera-
tures, equipment may experience significant reductions in mean time between fail-
ures (MTBF). The potential reduction in MTBF is directly related to the level of the
elevated temperatures. Diligent management of elevated temperatures to minimize
event duration should minimize any residual effect on MTBF. The guidance
provided here should allow users to quantify the ITE failure rate impact of both their
economization scenarios and the climate where their data center facility is located.
Refer to Tables 2.1 and 2.2 or Appendix F for specific recommended and allowable
temperature limits.
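As a rough illustration of how these two inputs combine, the following sketch (Python) computes a net annual failure rate factor as an hours-weighted average of relative failure rate factors by inlet temperature bin. The bin hours and factors shown are placeholders, not the values of Table 2.6; substitute the published factors and the climate data for the site in question.

hours_per_year = 8760
hours_in_bin = {          # hours per year the ITE inlet spends in each bin
    "15-20C": 3000,
    "20-25C": 4500,
    "25-30C": 1100,
    "30-35C": 160,
}
relative_failure_factor = {   # hypothetical relative factors, not Table 2.6
    "15-20C": 0.9,
    "20-25C": 1.0,
    "25-30C": 1.2,
    "30-35C": 1.4,
}

net_factor = sum(hours_in_bin[b] / hours_per_year * relative_failure_factor[b]
                 for b in hours_in_bin)
print(f"Net annual failure rate factor: {net_factor:.2f}")   # about 1.0 here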
As recent ASHRAE-funded research (Zhang et al. 2019) has shown, the combi-
nation of high humidity and gaseous contamination can be a significant driver for
reduced ITE reliability. Data center operators should monitor the rates of copper and
silver corrosion in their data centers at least twice annually as outlined in Section 2.2.
As stated in note 4 of Section 2.2, or if coupon measurements are not performed to aid in understanding the possible corrosion impact on ITE, the data center operator should consider maintaining a lower humidity level to protect the ITE: either below 60% as specified in the fourth edition of Thermal Guidelines (ASHRAE 2015b) or below 50% as specified in note 3 of Section 2.2 of this edition.
The reasons for the original recommended envelope have not changed. Operation
at wider extremes will have energy and/or reliability impacts. A compressorless data
center could actually provide better reliability than its tightly controlled counterpart.
3 Environmental Guidelines for Liquid-Cooled Equipment

The guidelines for the expanded data center environments discussed in
Chapter 2 are for air-cooled information technology equipment (ITE) and do not
address water temperatures provided by facilities for supporting liquid-cooled ITE
(here liquid-cooled ITE refers to equipment using any liquid, such as water, refrig-
erant, or dielectric, within the design control of the IT manufacturers). In 2014
ASHRAE TC 9.9 published the second edition of Liquid Cooling Guidelines for
Datacom Equipment Centers, which focuses mostly on the design options for liquid-
cooled equipment.
This chapter describes the classes for the temperature ranges of the facility
supply of water to liquid-cooled ITE. The location of this interface is the same as that
defined in Liquid Cooling Guidelines (ASHRAE 2014a) and detailed in Chapter 4
of that book. In addition, this chapter reinforces some of the information provided
in Liquid Cooling Guidelines on the interface between the ITE and infrastructure in
support of the liquid-cooled ITE. Because the classes cover a wide range of facility
water temperatures supplied to the ITE, a brief description is provided for the possi-
ble infrastructure equipment that could be used between the liquid-cooled ITE and
the outdoor environment.
The global interest in expanding the temperature and humidity ranges for air-
cooled ITE continues to increase, driven by the desire to achieve higher data center
operating efficiency and lower total cost of ownership (TCO). This desire also drives
the use of liquid cooling of ITE, which can achieve high energy efficiency and power
densities beyond air-cooled equipment while simultaneously enabling the use of
waste heat when facility supply water temperatures are high enough.
By creating these facility-water cooling classes and not mandating use of a
specific class, ASHRAE TC 9.9 provides server manufacturers the ability to develop
products for each class depending on customer needs and requirements.
Developing these new classes for commercial IT manufacturers, in consultation
with the Energy Efficient High Performance Computing (EE HPC) Working Group,
should produce better results, since the sharing of critical data has resulted in broader
environmental specifications than would otherwise be possible.
The first five water-cooling classes were introduced in the third edition of Ther-
mal Guidelines (2012). In this fifth edition, an additional water-cooling class was
added to fill in the large gap of maximum water temperatures between two of the
classes. In addition, the naming of the classes was changed to reflect the maximum
facility water temperature allowed by each class.

3.1 ITE LIQUID COOLING


The increasing heat density of modern electronics is stretching the ability of air
to adequately cool the electronic components within servers and within the datacom
facilities that house them. To meet this challenge, direct water or refrigerant cooling
at the rack or board level is now being used. The ability of water and refrigerant to
carry much larger amounts of heat per volume or mass also offers tremendous advan-
tages. The heat from these liquid-cooling units is in turn rejected to the outdoor envi-
ronment by using either air or water to transfer heat out of the building or, in some
facilities, to use it for local space heating. Because of the operating temperatures
involved with liquid-cooling solutions, water-side economization fits in well.
Liquid cooling can also offer advantages in terms of lower noise levels and close
control of electronics temperatures. However, liquid in electronic equipment raises
concerns about leaks. This is an issue because of the need to disconnect and recon-
nect the liquid-carrying lines when electronic components are replaced or upgraded.
To overcome this concern, and to eliminate the potential for electric shorts caused
by cooling liquid bridging electrical contacts, IT original equipment manufacturer
(OEM) designers sometimes use a nonconductive liquid, such as a refrigerant or a
dielectric fluid, in the cooling loop for the ITE.
In the past, high-performance mainframes were often water cooled and the
internal piping was supplied by the IT OEM. Components available today have simi-
lar factory-installed and leak-tested piping that can accept the water from the
mechanical cooling system, which may also use a water-side economizer. Increased
standardization of liquid-cooled designs for connection methods and locations will
also help expand their use by minimizing piping concerns and allowing interchange-
ability of diverse liquid-cooled IT products.
The choice to move to liquid cooling may occur at different times in the life of
a data center. There are three main times, discussed in the following subsections,
when the decision between air and liquid cooling must be made. Water’s thermal
properties were discussed previously as being superior to those of air. This is
certainly the case, but it does not mean that liquid cooling is invariably more efficient
than air cooling. Both can be very efficient or inefficient, and which is best generally
has more to do with design and application than the cooling fluid. In fact, modern
air-cooled data centers with air economizers are often more efficient than many
liquid-cooled systems. The choice of liquid-cooled versus air-cooled generally has
more to do with factors other than efficiency.
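A rough comparison of the volumetric advantage mentioned above, using approximate property values near room temperature, is sketched below; it compares the heat carried per unit volumetric flow and per degree of temperature rise (density times specific heat) for water and for air.

rho_cp_water = 997.0 * 4180.0   # J/(m^3*K), approximate for water near 25 C
rho_cp_air = 1.18 * 1006.0      # J/(m^3*K), approximate for air at sea level

print(f"Water: {rho_cp_water / 1e6:.2f} MJ/(m^3*K)")
print(f"Air:   {rho_cp_air / 1e3:.2f} kJ/(m^3*K)")
print(f"Ratio: {rho_cp_water / rho_cp_air:.0f} to 1")   # on the order of 3500 to 1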

3.1.1 New Construction


In the case of a new data center, the cooling architect must consider a number
of factors, including data center workload, availability of space, location-specific
issues, and local climate. If the data center will have an economizer and the climate
is best suited to air-side economizers because of mild temperatures and moderate
humidity, then an air-cooled data center may make the most sense. Conversely, if the
climate is primarily dry, then a water-side economizer may be ideal, with the cooling
fluid conveyed either to the racks or to a coolant distribution unit (CDU).

Liquid cooling more readily enables the reuse of waste heat. If a project is
adequately planned from the beginning, reusing the waste energy from the data
center may reduce the energy use of the site or campus. In this case, liquid cooling
is the obvious choice because the heat in the liquid can most easily be transferred to
other locations. Also, the closer the liquid is to the components, the higher the quality
of the heat that is recovered and available for alternative uses.

3.1.2 Expansions
Another time to change to or add liquid cooling is when adding or upgrading
equipment in an existing data center. Often, existing data centers do not have large
raised-floor heights or the raised floor plenum is full of obstructions such as cabling.
If a new rack of ITE is to be installed that is of higher power density than the existing
raised-floor air cooling can support, liquid cooling can be the ideal solution. Current
typical air-cooled rack powers can range from 6 to 30 kW. In many cases, rack
powers of 30 kW are well beyond what legacy air cooling can handle. Liquid cooling
to a datacom rack, cabinet-mounted chassis, cabinet rear door, or other localized
liquid-cooling system can make these higher-density racks nearly room neutral by
cooling the exhaust temperatures down to room temperature levels.

3.1.3 High-Performance Computing and Other High-Density Workloads
Data centers using high-performance computing (HPC) ITE have been early
adopters of liquid cooling. Other compute models like machine learning and artifi-
cial intelligence also have rack densities similar to those of HPC ITE and also use
liquid cooling. As companies look to take advantage of these newer compute models
to improve their data center performance, the industry could see these dense
machines coming to enterprise data centers, cloud service providers, and co-location
facilities, which will need to learn to design for those densities. One of the main cost
and performance drivers for these dense workloads is the node-to-node interconnect
and the desire to shorten the interconnect distance by densifying the rack. Thirty-
kilowatt racks are typical, with densities extending as high as 80 to 120 kW. Without
some implementation of liquid cooling, these higher powers would be very difficult,
if not impossible, to cool. The advantages of liquid cooling increase as the load
densities increase.
Several implementations of liquid cooling may be used. The most common are
as follows:

• Rear-door, in-row, or above-rack heat exchanger that removes a large percentage of the ITE waste heat from air to liquid
• Totally enclosed cabinet that uses air as the working fluid and an air-to-liquid
heat exchanger
• Direct delivery of the cooling fluid to the components in the system using
cold plates directly attached to processors, application-specific integrated circuits, memory, power supplies, etc., in the system chassis or rack, whether they
be servers or telecom equipment
• Immersive solutions using either single- or two-phase (low boiling point) fluids

3.1.4 ITE and Facilities Interface

The facility water is anticipated to support any liquid-cooled ITE using water,
water plus additives, refrigerants, or dielectrics. To date, most liquid-cooling solu-
tions use a CDU as the interface of the ITE to the facility. If there is no CDU, it is
the responsibility of the facility to maintain the water-quality requirements of the
ITE as well as a water temperature guaranteed to be above the data center dew point.
The CDU may be external to the datacom rack, as shown in Figure 3.1, or within the
datacom rack, as shown in Figure 3.2.
Figures 3.1 and 3.2 show the interface for a liquid-cooled rack with remote heat
rejection. The interface is located at the boundary of the facility water system loop

Figure 3.1 Liquid-cooled rack or cabinet with external CDU.

Figure 3.2 Combination air- and liquid-cooled rack or cabinet with internal
CDU.

and does not impact the ITE cooling system loops, which are controlled and
managed by the cooling equipment and ITE manufacturers. However, the definition
of the interface at the loop affects both the ITE manufacturers and the facility where
the ITE is housed. For that reason, all of the parameters that are key to this interface
are described in detail here. Liquid Cooling Guidelines for Datacom Equipment
Centers (ASHRAE 2014a) describes the various liquid-cooling loops that could
exist within a data center and its supporting infrastructure. Figure 3.3 shows these
liquid loops as well as two liquids—the coolant contained in the technology cooling
system (TCS) and the coolant contained in the datacom equipment cooling system
(DECS). The TCS may include in-row and overhead forced air-to-liquid heat
exchangers. If the TCS liquid is a dielectric coolant, the external CDU pump may
potentially be used to route the TCS coolant directly to cold plates attached to DECS
internal components in addition to or in place of a separate internal DECS. As seen
in Figure 3.3, the water guidelines that are discussed in this book are at the chilled-
water system (CHWS) loop. If chillers are not installed, then the guidelines would
apply to the condenser water system (CWS) loop.
Although not specifically noted, a building-level CDU may be more appropriate
where there are a large number of racks connected to liquid cooling. In this case, the
location of the interface is defined the same as in Figure 3.1, but the CDU as shown
would be a building-level unit rather than a modular unit. Building-level CDUs
handling many megawatts of power have been built for large HPC systems.
Although Figure 3.1 shows liquid cooling using a raised floor, liquid could be
distributed above the ceiling just as efficiently.

Figure 3.3 Liquid-cooling systems/loops for a data center.



3.2 FACILITY WATER SUPPLY TEMPERATURE CLASSES FOR ITE


3.2.1 Liquid Cooling Environmental Class Definitions
Operating within a particular environmental class requires full performance of
the equipment over the entire environmental range of the specified class, based on
nonfailure conditions. The ITE specific for each class requires different design
points for the cooling components (cold plates, thermal interface materials, liquid
flow rates, piping sizes, etc.) used within the ITE. Special care must be taken to
ensure compatibility between the facility water system and the ITE requirements for
working pressures, flow rates, differential pressure, and temperature rates of change.
For IT designs that meet the higher supply temperatures, as referenced in Table 3.1,
enhanced thermal designs are required to maintain the liquid-cooled components
within the desired temperature limits. Generally, the higher the supply water
temperature, the higher the cost of the cooling solutions. The environmental classes
for liquid-cooled ITE are as follows:
• Class W17/W27: These are typically data centers that are traditionally cooled
using chillers and a cooling tower, but with an optional water-side economizer
to improve energy efficiency, depending on the location of the data center (see
Figure 3.4).
• Class W32/W40: For most locations, these data centers may be operated
without chillers (see Figure 3.5). However, some locations may still require
chillers (see Figure 3.4).
• Class W45/W+: These data centers are operated without chillers to take
advantage of energy efficiency and reduce capital expense (see Figure 3.5).
Some locations may not be suitable for dry coolers.

Table 3.1 2021 Thermal Guidelines for Liquid Cooling
Equipment Environment Specifications for Liquid Cooling

Liquid Cooling Class | Typical Infrastructure Design, Primary Facilities | Typical Infrastructure Design, Secondary/Supplemental Facilities | Facility Water Supply Temperature, °C (°F)a
W17 | Chiller/cooling tower | Water-side economizer (cooling tower) | 17 (62.6)
W27 | Chiller/cooling tower | Water-side economizer (cooling tower) | 27 (80.6)
W32 | Cooling tower | Chiller or district heating system | 32 (89.6)
W40 | Cooling tower | Chiller or district heating system | 40 (104)
W45 | Cooling tower | District heating system | 45 (113)
W+ | Cooling tower | District heating system | >45 (>113)

a. Minimum water temperature for all classes is 2°C (35.6°F).

The high thermal density and continuous operating hours of data centers can be
an attractive added value in providing low-temperature hot water to high-density
building clusters with high thermal loads such as mixed-use developments, airports,
college and university campuses, and large office developments. The liquid cooling
classes with supply temperatures of 32°C (89.6°F) and higher (shown in Table 3.1) are
candidates for district heating. The option of district heating is shown in Table 3.1
for classes W32, W40, W45, and W+. Data center operators can determine whether
they can take advantage of this option by computing the energy reuse effectiveness
(ERE) metric, as described by the Green Grid (TGG 2010) and enhanced in a short
paper published on the ASHRAE TC 9.9 home page titled “An Improved Energy
Reuse Metric” (Khalifa and Schmidt 2014). Additional details on district heating,
including the supply temperature categories, can be found in Chapter 12 of ASHRAE
Handbook—HVAC Systems and Equipment (2020) and the presentation given at the
4th International Conference on Smart Energy Systems and 4th Generation District
Heating (Lund et al. 2018).
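As a rough illustration of that computation, the sketch below (Python) evaluates ERE alongside power usage effectiveness (PUE) for hypothetical annual energy values, using the commonly stated form ERE = (total facility energy - energy reused)/ITE energy; refer to the Green Grid and Khalifa and Schmidt (2014) documents cited above for the authoritative definitions and measurement boundaries.

# Hypothetical annual energy values, kWh
it_energy = 10_000_000
cooling_energy = 2_500_000
power_dist_and_lighting = 700_000
reused_energy = 1_800_000        # heat exported to a district heating loop

total_facility_energy = it_energy + cooling_energy + power_dist_and_lighting
pue = total_facility_energy / it_energy
ere = (total_facility_energy - reused_energy) / it_energy
print(f"PUE = {pue:.2f}, ERE = {ere:.2f}")   # reuse pulls ERE below PUE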
Although the facility supply water temperatures specified in Table 3.1 are
requirements to be met by the ITE, it is incumbent on the facility owner/designer to
account for the approach temperature of any planned CDU, thereby ensuring the proper TCS temperature for the ITE. It should also be noted that the data center operator may not require, or even desire, use of the full range of temperatures within the class, given the specific data center infrastructure design.
Until recently, liquid cooling has been sought out for performance, density, or
efficiency reasons. There are now processor chips that can only be cooled with liquid, and there will be

Figure 3.4 Liquid-cooling Classes W17 and W27 typical infrastructure.

Figure 3.5 Liquid-cooling Classes W32, W40, W45, and W+ typical infrastructure.

more in the future. Liquid-cooled equipment is available from most manufacturers, some even capable of Class W+ environments. The industry is, however, on the
verge of large chip power increases that may drive equipment to a lower water
temperature classification in order to support future power density requirements. At
the same time, the increased chip power is driving lower processor reliability and
functionality temperature requirements. IT OEMs have visibility to future chip
powers for several generations into the future. It is quite likely that the products that
will be available just one or two generations in the future will move from Classes
W40 or W45 to W32, for instance, due to these trends. Data centers lacking mechan-
ical cooling as a backup may have to choose between lower performance and a data
center retrofit to add cooling that supports using higher-performance ITE.

3.2.2 Condensation Considerations


All of the liquid-cooling classes allow the water supplied to the ITE to be as low
as 2°C (36°F), which is below the ASHRAE allowable room dew-point guideline of
17°C (63°F) for Class A1 enterprise datacom centers (refer to Table 2.1). Electronics
equipment manufacturers are aware of this and are taking it into account in their
designs. Data center relative humidity and dew point should be managed according
to the guidelines in this book. If low fluid operating temperatures are expected, care-
ful consideration of condensation should be exercised. It is suggested that a CDU (as
shown in Figures 3.1 and 3.2) with a heat exchanger be used to raise the coolant
temperature to at least 18°C (64.4°F) to eliminate condensation issues or have an
adjustable water supply temperature that is set 2°C (3.6°F) or more above the dew
point of the data center space.
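A planning-level check of this condensation margin can be made as in the sketch below (Python), which estimates the room dew point with the Magnus approximation (not the ASHRAE Handbook formulation) and applies the 2°C margin and 18°C (64.4°F) floor noted above; the room condition used is hypothetical.

import math

def dew_point_c(dry_bulb_c, rh_percent):
    # Magnus approximation for dew point; adequate for a planning-level check.
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * dry_bulb_c / (b + dry_bulb_c)
    return b * gamma / (a - gamma)

room_temp_c, room_rh = 24.0, 50.0            # hypothetical room condition
dp = dew_point_c(room_temp_c, room_rh)       # about 12.9 C here
min_supply_c = max(dp + 2.0, 18.0)           # 2 C margin, or raise to 18 C via a CDU
print(f"Dew point ~{dp:.1f} C, minimum coolant supply ~{min_supply_c:.1f} C")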
4 Facility Temperature and Humidity Measurement

Data centers and telecommunications central offices can be a challenge to effec-
tively cool. In many cases, the aggregate internal heat load is less than the theoretical
room cooling capacity, but localized overheating may still occur. Humidity condi-
tions that are out of specification may also cause some problems. Temperature and
humidity measurements are the best way to assess a data center environment. These
measurements may be carried out manually or by using automated data collection
systems built into information technology equipment (ITE) or mounted on equip-
ment racks. Care should be taken to make measurements systematically and to
ensure that they accurately represent equipment intake conditions.
Facilities designed to operate with varying operating temperatures, such as
those making extensive use of free cooling, are encouraged to install automated aisle
temperature and humidity monitoring.
The use of data center infrastructure management (DCIM) has become
commonplace. DCIM is the supervision, administration, and operational control of
data center assets and resources with the aim of optimizing cost and performance in
terms of infrastructure availability, energy efficiency, and operational efficiency. As
data centers carry out more environmental monitoring, it is becoming critical to
automate the collection, processing, alerting, and reporting of this data. ASHRAE
TC 9.9 has recently released Datacom Series Book 14, Advancing DCIM with IT
Equipment Integration (ASHRAE 2019). This publication depicts how a well-
implemented and maintained DCIM system helps safely maximize the efficient use
of power, cooling, and space resources through a comprehensive connective frame-
work. This framework proposes the necessary data sets, naming conventions, moni-
toring and integration points, and key metrics required for judging the effectiveness
of a data center environment. One of the core tenets of this connective framework
is the DCIM Compliance for IT Equipment (CITE), which highlights the core set of
data that should be made available directly from the ITE, thus reducing the burden
and cost to the facility operator to make these measurements.
Temperature and humidity measurements in a facility are generally required for
the following three reasons:

• Facility health and audit tests (refer to Section 4.1)


• Equipment installation verification tests (refer to Section 4.2)
• Equipment troubleshooting tests (refer to Section 4.3)

These three tests are hierarchical in nature, and the user should consider all of
them prior to choosing the one that best fits their application. In some cases, the

proper test may be a mix of the above. For instance, a data center with low overall
power density but with localized high-density areas may elect to perform a facility
health and audit test for the entire facility but also perform an equipment installation
verification test for the area with localized high power density.
Sections 4.1 through 4.3 outline the recommended tests for measuring tempera-
ture and humidity. Section 4.4, new for the fifth edition, covers cooling simulation.

4.1 FACILITY HEALTH AND AUDIT TESTS


Facility health and audit tests are used to proactively assess the health of a data
center to avoid temperature- and humidity-related electronic equipment failures.
These tests can also be used to evaluate a facility’s cooling system for availability
of spare capacity for the future. It is recommended that these tests be conducted on
a regular basis.

4.1.1 Aisle Measurement Locations


Establish temperature and humidity measurement locations in each aisle that
has equipment air inlets. Standard temperature and humidity sensors mounted on
walls and columns are not deemed adequate for this testing. Lacking more elaborate
arrays of temperature and humidity sensors placed at the intakes of individual pieces
of equipment, manual measurement and recording of ambient temperature and
humidity is recommended.
Use the following guidelines to establish locations for measuring aisle ambient
temperature and humidity. It is suggested that points be permanently marked on the
floor for consistency and ease in repetition of measurements.

• Establish at least one point for every 3 to 9 m (10 to 30 ft) of aisle or every
fourth rack position, as shown in Figure 4.1.
• Locate points midway along the aisle, centered between equipment rows, as
shown in Figure 4.2.
• Where a hot-aisle/cold-aisle configuration is used, establish points in cold
aisles only,1 as shown in Figure 4.3.

Points picked should be representative of the ambient temperature and humidity. Telcordia GR-63-CORE (2012) suggests measuring aisle temperature at 1.5 m (4.9 ft) above the floor, which can be useful in some equipment configurations. This will depend on the type of cabinet or rack used near the area where the measurement is being taken. Lacking a more elaborate measurement system, this is considered a minimum measurement.

1. Hot-aisle temperature levels do not reflect equipment inlet conditions and, therefore, may be
outside the ranges defined in Tables 2.1 and 2.2. Hot-aisle temperature levels may be measured
to help understand the facility, but significant temperature variation with measurement location
is normal.

Figure 4.1 Measurement points in aisle.

Figure 4.2 Measurement points between rows.

The objective of these measurements is to ensure that the aisle temperature and
humidity levels are all being maintained within the recommended operating condi-
tions of the class environment, as noted in Tables 2.1 and 2.2 of Chapter 2.

4.1.2 HVAC Operational Status


Measure and record the following status points at all HVAC units, as applicable:

• Operating status of unit: ON, OFF


• Supply fan: status (ON/OFF) and fan speed if variable
• Temperature: supply air temperature, return air temperature
• Humidity: supply air humidity, return air humidity

Figure 4.3 Measurement points in a hot-aisle/cold-aisle configuration.

Automatic logging of HVAC equipment parameters can provide valuable insight into operational trends and may simplify data collection. The objective of
these measurements is to confirm proper HVAC operation.

4.1.3 Evaluation

4.1.3.1 Aisle Temperature and Humidity Levels


The temperature and/or humidity of any aisle with equipment inlets that is found
to be outside the desired operating range for the class environment should be inves-
tigated and the resolution fully documented. The investigation should involve iden-
tification of the source of the out-of-range condition and a possible corrective action.
The corrective action could be as simple as minor air balancing or more complex,
involving major rework of the cooling system. A decision to take no action must be
made with the recognition that prolonged operation outside of the recommended
operating ranges can result in decreased equipment reliability and longevity.

4.1.3.2 HVAC Unit Operation


Temperature and humidity levels at the HVAC unit should be consistent with
design values. Return air temperatures significantly below room ambient temperatures are indicative of short-circuiting of supply air, which is a pathway that allows

Figure 4.4 Monitoring points for configured racks.

cold supply air to bypass equipment and return directly to an HVAC unit. The cause
of any short-circuiting should be investigated and evaluated for corrective action.

4.2 EQUIPMENT INSTALLATION VERIFICATION TESTS

Equipment installation verification tests are used to ensure proper installation of equipment in the room environment. The objective of these tests is to ensure that
the temperature and humidity in front of the cabinet or rack are acceptable.
For the tests, measure and record the temperature and humidity at the geometric
center of the air intake of the top, middle, and bottom racked equipment at 50 mm
(approximately 2 in.) from the front of the equipment. For example, if there are 20
servers in a rack, measure the temperature and humidity at the center of the first,
tenth or eleventh, and twentieth server. Figure 4.4 shows example monitoring points
for configured racks. For configurations with three pieces of equipment or less per
cabinet, measure the inlet temperature and humidity of each piece of equipment at
50 mm (approximately 2 in.) from the front at the geometric center of each piece of
equipment, as shown in Figure 4.4.
All temperature and humidity levels should fall within the specifications for the
class environment specified in Tables 2.1 and 2.2. If any measurement falls outside
of the desired operating conditions as specified, the facility operations personnel
may wish to consult with the equipment manufacturer regarding the risks involved.
Facilities managers sometimes use Telcordia GR-63-CORE (2012) to measure
and record the temperature at 1.5 m (4.9 ft) high and 380 mm (15 in.) from the front
of the frame or cabinet. However, this measurement method was not designed for
computer equipment. It is instead recommended that the preceding tests be used to
verify an installation.

4.3 EQUIPMENT TROUBLESHOOTING TESTS


Equipment troubleshooting tests are used to determine if the failure of equip-
ment is potentially due to environmental effects. These tests are the same as those in
the first paragraph of Section 4.2, except that the temperature and humidity across
the entire intake of the problematic piece of equipment are monitored. The objective
here is to determine if air is being drawn into the equipment within the allowable
conditions specified for the class environment shown in Tables 2.1 and 2.2.

• Case A: For equipment that is 1U to 3U in height, arrange the monitoring points as shown in Figure 4.5.
• Case B: For equipment that is 4U to 6U in height, arrange the monitoring
points as shown in Figure 4.6.
• Case C: For equipment that is 7U and larger in height, arrange the monitoring
points as shown in Figure 4.7.
• Case D: For equipment that has a localized area for inlet air, arrange the mon-
itoring points in a grid pattern on the inlet as shown in Figure 4.8.
• Case E: For equipment cabinets with external doors, monitor the temperature
and humidity with the cabinet in its normal operational mode, which typically
will be with the doors closed.

Figure 4.5 Monitoring points for 1U to 3U equipment.

Figure 4.6 Monitoring points for 4U to 6U equipment.



All temperature and humidity levels should fall within the specifications for the
class environment specified in Tables 2.1 and 2.2. If all measurements are within
limits, equipment failure is most likely not the result of poor environmental condi-
tions. If any measurement falls outside the recommended operating condition, the
facility operations personnel may wish to consult with the equipment manufacturer
regarding the risks involved or to correct the out-of-range condition.
Note: In some facilities, in particular pressurized facilities that control humidity
levels prior to the introduction of air into the data center, the absolute humidity in the
space is typically uniform. This is because significant humidity sources do not usually
exist inside data centers. If there is not a significant source of humidity in the data
center, humidity does not have to be measured at every point, because it can be calculated as a function of the localized temperature and the (uniform)
absolute humidity in the space at large.

Figure 4.7 Monitoring points for 7U and larger equipment.

Figure 4.8 Monitoring points for equipment with localized cooling.



Chapter 1 of ASHRAE Handbook—Fundamentals (2017) provides the equations that relate temperature and absolute humidity to the relative humidity and/or
dew-point values needed to determine compliance with Tables 2.1 and 2.2 of this
book (most psychrometric charts could be used to perform the same calculations).
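A minimal sketch of that calculation follows: the humidity ratio is measured once for the space and combined with each locally measured dry-bulb temperature to estimate the local relative humidity. The saturation pressure here uses a Magnus-type approximation rather than the Handbook equations cited above, which is adequate for screening against Tables 2.1 and 2.2; the humidity ratio and temperatures are hypothetical.

import math

def saturation_pressure_pa(t_c):
    # Magnus-type approximation for saturation vapor pressure over water.
    return 611.2 * math.exp(17.62 * t_c / (243.12 + t_c))

def local_rh(humidity_ratio, local_temp_c, pressure_pa=101325.0):
    # RH at a point from the space's (uniform) humidity ratio, kg water per
    # kg dry air, and the locally measured dry-bulb temperature.
    p_w = pressure_pa * humidity_ratio / (0.622 + humidity_ratio)
    return 100.0 * p_w / saturation_pressure_pa(local_temp_c)

w = 0.0093                       # humidity ratio measured once for the space
for t in (18.0, 24.0, 27.0):     # local dry-bulb readings at different points
    print(f"{t:.0f} C -> {local_rh(w, t):.0f}% rh")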

4.4 COOLING SIMULATION


Cooling simulation has traditionally been based on computational fluid dynam-
ics (CFD). CFD has been used for some time to evaluate the concept design of a data
center or to undertake troubleshooting of an existing facility that is experiencing
cooling issues. CFD programs designed specifically for data center simulation can
quantitatively improve deployment planning decisions and provide insights into
equipment installation prior to or after the deployment. Forecasting or predicting the impact of a deployment on the data center prior to the actual deployment can minimize operational impacts and allow operators to make informed decisions on capac-
ity, availability, and resiliency.
Cooling simulation can be used to complement the facility health and audit tests
described in Section 4.1. CFD traditionally uses graphical views of temperature
planes, flow patterns, and streamlines to illustrate simulation results. The measure-
ment data collected during a facility health test, whether aisle, HVAC, or ITE, can
be used to support the calibration and verification of the CFD model. Once the model has been verified, it can be used with confidence within the facility.
The CFD model also provides the advantage of enabling higher spatial resolution
than can realistically be obtained through discrete sensor measurement points. Many
commercially available CFD tools provide preconfigured visualizations to look at

• ITE inlet temperature,


• ASHRAE environment conformance,
• room- and rack-level recirculation indices, and
• available cooling capacity.
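One simple way to quantify the calibration and verification step described above is to compare measured inlet temperatures from a facility health test with the CFD-predicted values at the same locations, as in the sketch below (Python); the sensor names and readings are hypothetical.

import math

measured = {"R01-top": 23.1, "R01-mid": 22.4, "R07-top": 26.8, "R07-mid": 24.9}
simulated = {"R01-top": 22.6, "R01-mid": 22.2, "R07-top": 27.9, "R07-mid": 25.4}

# Root-mean-square and worst-case errors between model and measurement.
errors = [simulated[k] - measured[k] for k in measured]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
worst = max(errors, key=abs)
print(f"RMSE = {rmse:.2f} C, worst point error = {worst:+.1f} C")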
5 Equipment Placement and Airflow Patterns

Chapter 5 provides airflow guidelines to align equipment manufacturers with
facility designers, operators, and managers regarding the placement of data process-
ing and communication equipment. Aisle pitch and equipment placement in aisles
are also addressed. It is important to note that this chapter focuses on developing
fundamental airflow protocols and the general concept of hot aisle/cold aisle;
detailed or best practices engineering is covered by other books in the ASHRAE
Datacom Series (see www.ashrae.org/datacenterguidance).
Note: Airflow in a high-density environment is a complex and often nonintui-
tive phenomenon. Following the recommendations in this guide does not guarantee
adequate equipment cooling, as detailed airflow design and fluid dynamics are
beyond the scope of this book. Facility managers must perform the appropriate engi-
neering analysis to include the effects of static pressure, dynamic (velocity) pres-
sure, occupancy, T, turbulence, etc. (For example, for an underfloor supply air
system, raised-floor height is a critical parameter, and locating floor grilles near
“downflow” computer room air-conditioning [CRAC] units often has a negative
impact.) In addition, emerging technologies enable localized equipment cooling that
may or may not be compatible with these guidelines. Such technologies require
further analysis.

5.1 EQUIPMENT AIRFLOW


This section addresses the recommended locations of the air intake and air
exhaust for electronic equipment.

5.1.1 Airflow Protocol Syntax


The airflow protocol used here adopts the syntax detailed in Telcordia GR-
3028-CORE (2001) on how the air intake and air exhaust are to be specified, and it
is consistent with Figure 5.1. GR-3028-CORE also defines levels that help describe
the location of the air inlet and exhaust.

5.1.2 Airflow Protocol for Equipment


To be consistent with and to complement a hot-aisle/cold-aisle configuration in
an equipment room, it is advantageous to design equipment using one of the three
airflow protocols shown in Figure 5.2.
The front of the equipment is typically defined as the surface that has cosmetic
skin and/or display. Rack-mounted equipment should follow the F-R protocol shown
in Figure 5.2 only, and cabinet systems can follow any of the three protocols shown.

The recommended airflow protocols for data center equipment in Figure 5.2 closely
follow those recommended for telecom equipment in Telcordia GR-3028-CORE.
Per Telcordia GR-63-CORE (2012), forced-air-cooled equipment is required to
use only a rear aisle exhaust. If approved by exception, top-exhaust airflow equipment
may be used in support of specialized airflow requirements. Forced-air-cooled equip-
ment should use a front-aisle air inlet. Forced-air-cooled equipment with other than
front-aisle-to-rear-aisle airflow may be approved for use when fitted with manufac-
turer-provided air baffles/deflectors that effectively reroute the air to provide front-
aisle-to-rear-aisle airflow. Equipment requiring air baffles/deflectors for airflow
compliance is required to be tested by the manufacturer for compliance to GR-63-
CORE with such hardware in place. Forced-air-cooled equipment with other than front-aisle air inlets may be approved for use but should not sustain any damage or deterioration of functional performance during its operating life when operated at elevated air inlet temperatures.

5.1.3 Cabinet Design


Blanking panels should be installed in all unused rack and cabinet spaces to
maximize and improve the functionality of the hot-aisle/cold-aisle air system. The

Figure 5.1 Syntax of face definitions.

Figure 5.2 Recommended airflow protocol.



blanking panels should be added to the front cabinet rails, thereby preventing the recirculation of hot air to the equipment inlet. Vented front and rear doors for the cabinet must be nonrestrictive to airflow to reduce the load on information technology equipment (ITE) fans, because added fan load causes undesired ITE power consumption. Generally, an open ratio of 60% or greater is acceptable. To assist with hot-aisle/cold-aisle isolation, solid-roofed cabinets are preferred.

5.2 EQUIPMENT ROOM AIRFLOW


To maximize the thermal and physical capabilities of the equipment room, the
equipment and the equipment room need to have compatible airflow schemes. The
following subsections address guidelines that should be followed to achieve this.

5.2.1 Placement of Cabinets and Rows of Cabinets


For equipment that follows the airflow protocol outlined in Section 5.1.2, a hot-
aisle/cold-aisle layout is recommended. Figure 5.3 shows the recommended layout
of aisles to meet the hot-aisle/cold-aisle configuration. The arrows in the cold aisle
and the hot aisle depict the intake airflow and the exhaust airflow, respectively. The
intent of the hot-aisle/cold-aisle concept is to maximize the delivery of cooled air to
the intakes of the electronic equipment and allow for the efficient extraction of the
warmed air discharged by the equipment.
Recirculation can be reduced through tight cabinet placement and the use of
equipment blanking panels, as described in Section 5.1.3. It is the responsibility of
the facility operations personnel to determine the best way to implement hot-aisle/
cold-aisle configurations. Figure 5.4 shows an example of this configuration using
underfloor cooling found in a typical data center.
Figure 5.5 shows a non-raised-floor implementation. The overhead ventilation
system uses multiple air diffusers that inject cool air vertically (downward) into the
cold aisles.

Figure 5.3 View of a hot-aisle/cold-aisle configuration.



Two solutions are becoming more common in data centers to eliminate the
mixing of cold and hot air. These containment solutions—the cold-aisle containment
design shown in Figure 5.6 and the hot-aisle containment design shown in
Figure 5.7—prevent this mixing, thereby significantly improving energy efficiency for data centers in some cases.

5.2.2 Cabinets with Dissimilar Airflow Patterns


It is important to emphasize that the risks of not deploying cabinets with a front-
to-back airflow design in a hot-aisle/cold-aisle configuration are significant, espe-
cially in rooms with high heat densities. The complexities of airflow dynamics are
difficult to predict without training and tools. To make the task easier, keep equip-
ment with the same type of airflow pattern together, with all exhausts directed toward
the hot aisle.
In implementations that do not use the hot-aisle/cold-aisle configuration,
warmed air discharged from the rear of one cabinet can be drawn into the front of
a nearby cabinet. This warmed air can be further warmed by the next row of equip-

Figure 5.4 Example of hot and cold aisles for raised-floor environments
with underfloor cooling.

Figure 5.5 Example of hot and cold aisles for non-raised-floor environments with overhead cooling.

ment and so on. This can create a potentially harmful situation for the equipment in
the cabinets farther to the rear. If not addressed, this condition would contribute to
increased equipment failures and system downtime. Therefore, place cabinets that
cannot use hot-aisle/cold-aisle configurations together in another area of the data
center, being careful to ensure that exhaust from various equipment is not drawn into
equipment inlets. Temperature measurements can document the effect of recircu-
lated hot air and should be compared to the recommended and allowable temperature
ranges.

5.2.3 Aisle Pitch

Aisle pitch is defined as the distance between the center of the reference cold aisle
and the center of the next cold aisle in either direction. A common aisle pitch for data
centers is seven floor tiles, based on two controlling factors. First, it is advisable to
allow a minimum of one complete floor tile in front of each rack. Second, maintaining
a minimum of three feet in any aisle for wheelchair access may be required by Section
4.3.3 of the Americans with Disabilities Act (ADA), 28 CFR Part 36 (ADA 2010).

Figure 5.6 Cold-aisle containment.

Figure 5.7 Hot-aisle containment.



Figure 5.8 Seven-tile aisle pitch, equipment aligned on hot aisle.

Table 5.1 Aisle Pitch Allocation

Region | Tile Size | Aisle Pitch (Cold Aisle to Cold Aisle)a | Nominal Cold Aisle Sizeb | Maximum Space Allocated for Equipment with No Overhangc | Hot Aisle Size
U.S. | 610 mm (2 ft) | 4267 mm (14 ft) | 1220 mm (4 ft) | 1067 mm (42 in.) | 914 mm (3 ft)
Global | 600 mm (23.6 in.) | 4200 mm (13.78 ft) | 1200 mm (3.94 ft) | 1043 mm (41 in.) | 914 mm (3 ft)

a. If considering a pitch other than seven floor tiles, it is advised to increase or decrease the pitch in whole tile increments. Any overhang into the cold aisle should take into account the specific design of the front of the rack and how it affects access to and flow through the tile.
b. Nominal dimension assumes no overhang; less if front door overhang exists.
c. Typically a one-metre rack is 1070 mm (42 in.) deep with the door and would overhang the front tile 3 mm (0.12 in.) for a U.S. configuration and 27 mm (1.06 in.) for a global configuration.

Based on the standard-sized domestic floor tile, these two factors result in a seven-tile
pitch, allowing two accessible tiles in the cold aisle, 914.4 mm (3 ft) in the hot aisle,
and reasonably deep rack equipment, as shown in Figure 5.8. Table 5.1 lists potential
equipment depths for a seven-tile pitch. Rack depth would have to be less than 1066.8
mm (42 in.) to maintain a seven-tile pitch.
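The arithmetic behind those numbers is shown in the sketch below (Python) for the U.S. tile size: the seven-tile pitch, less two cold-aisle tiles and a 3 ft hot aisle, leaves the depth available to the two facing rows of equipment.

tile_mm = 609.6                  # U.S. 610 mm (2 ft) tile
pitch_mm = 7 * tile_mm           # 4267.2 mm seven-tile pitch
cold_aisle_mm = 2 * tile_mm      # two accessible tiles in the cold aisle
hot_aisle_mm = 914.4             # 3 ft hot aisle

# Two facing rows of equipment share what remains of the pitch.
max_equipment_depth_mm = (pitch_mm - cold_aisle_mm - hot_aisle_mm) / 2
print(f"Maximum equipment depth: {max_equipment_depth_mm:.1f} mm")   # 1066.8 mm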
Some installations require that the rear of a cabinet line up with the edge of a
removable floor tile to facilitate underfloor service, such as pulling cables. Adding
this constraint to a seven-tile pitch results in a 1.21 m (4 ft) wide hot aisle and forces
a cold aisle of less than 1.21 m (4 ft), with only one row of vented tiles and more
limited cooling capacity, as shown in Figure 5.9.

Figure 5.9 Seven-tile aisle pitch, equipment aligned on cold aisle.

For larger cabinet sizes and/or higher-power-density equipment, it may be advantageous to use an eight-tile pitch. Similarly, smaller equipment, especially
telecom form factors, can take advantage of tighter pitches. For example, ATIS-
0600336 (2015) defines a universal telecommunication framework (UTF) as having a baseline frame depth of 600 mm (23.6 in.); deeper equipment may be permitted
in special deeper lineups of 750 or 900 mm (29.5 or 35.4 in.) depths. All configu-
rations need to be examined on a case-by-case basis.
Aisle pitch determines how many perforated floor tiles can be placed in a cold
aisle. The opening in the tile together with the static pressure in the raised-floor
plenum determines how much supply airflow is available to cool the ITE.
6 Equipment Manufacturers’ Heat and Airflow Reporting

This chapter provides guidance to users for estimating heat release from infor-
mation technology equipment (ITE) similar to what was developed by Telcordia in
GR-3028-CORE (2001) for the telecom market. Some ITE manufacturers provide
sophisticated tools to more accurately assess power and airflow consumption. When
available, the manufacturer should be consulted and data from their tools should be
used to provide more specific information than may be available in the thermal report
provided by the ITE manufacturers.

6.1 PROVIDING HEAT RELEASE AND AIRFLOW VALUES


This section contains a recommended process for ITE manufacturers to
provide heat release and airflow values to end users that results in more accurate
planning for data center air handling. It is important to emphasize that the heat
release information is intended for thermal management purposes.
Note: Nameplate ratings should at no time be used as a measure of equip-
ment heat release. The purpose of a nameplate rating is solely to indicate the
maximum power draw for safety and regulatory approval. Similarly, the heat
release values should not be used in place of the nameplate rating for safety and
regulatory purposes. Please refer to the definitions for power in Section 1.4 of
Chapter 1.
In determining the correct equipment power and airflow characteristics, the
goal is to have an algorithm that works with variations in configurations and that
is reasonably accurate. The actual method of algorithm development and the
definitions of the equipment configurations are up to the manufacturer. The algo-
rithm can be a combination of empirically gathered test data and predictions, or
it may consist only of measured values. During equipment development, the
algorithm may consist only of predictions, but representative measured values
must be factored into the algorithm by the time the product is announced.
Heat release numbers, in watts, should be based on the following conditions:

• Steady state
• User controls or programs set to a utilization rate that maximizes the number
of simultaneous components, devices, and subsystems that are active
• Nominal voltage input
• Ambient temperature between 18°C and 27°C (64.4°F and 80.6°F)
• Air-moving devices at ambient inlet temperatures as specified above

Airflow values should be reflective of those that would be seen in the ITE oper-
ating in a data center. Representative racking, cabling, and loading should be taken
into account in airflow reporting. Some ITE manufacturers use variable-speed fans,
which can result in a large variance in airflow due to equipment loading and ambient
conditions. Airflow reporting should be based on the following conditions:

• Representative mounting (i.e., inside rack with doors shut)


• Representative cabling (cabling commensurate with the configuration level)
• Steady state
• User controls or programs set to a utilization rate that maximizes the number
of simultaneous components, devices, and subsystems that are active
• Nominal voltage input
• All normally powered fans operating
• Ambient temperature between 18°C and 27°C (64.4°F and 80.6°F)
• Sea level: airflow values at an air density of 1.2 kg/m3 (0.075 lb/ft3) (this corresponds to air at 18°C [64.4°F], 101.3 kPa [14.7 psia], and 50% rh); see the sketch following this list for how the stated density ties reported airflow to heat release
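The stated air density matters because heat release, airflow, and the air temperature rise across the equipment are linked: heat release equals air density times volumetric airflow times the specific heat of air times the temperature rise. The sketch below (Python) applies that relation to the typical configuration values of Table 6.1 and is illustrative only.

rho = 1.2            # kg/m^3, reference air density for reported airflow
cp = 1006.0          # J/(kg*K), approximate specific heat of air

heat_release_w = 5040.0     # "typical" configuration in Table 6.1
airflow_m3h = 943.0         # nominal airflow for the same configuration
airflow_m3s = airflow_m3h / 3600.0

delta_t = heat_release_w / (rho * airflow_m3s * cp)
print(f"Implied air temperature rise: {delta_t:.1f} K")   # roughly 16 K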

For equipment with variable-speed fans, in addition to the nominal airflow value it is recommended that a maximum airflow value be given for each configu-
ration. The conditions that yield the reported maximum flow values should be indi-
cated in the report provided by the ITE manufacturers. An example of thermal
reporting is shown in Table 6.1.
Once representative configurations have been tested, other values may be
obtained through a predictive algorithm. For predicted heat release and airflow
values, the accuracy should adhere to the following guidelines:

• The values predicted for tested configurations are within 10% of the measured
values.
• When the predicted values vary by more than 10% from the measured values,
the predictive algorithm is updated and revalidated.

6.2 EQUIPMENT THERMAL REPORT


The manufacturer’s equipment thermal report should include the following items (see Table 6.1 for an example of thermal reporting; a minimal data-structure sketch follows the list):

• Power for representative configurations. A table of configuration options should always be included in the report. This table may be representative or
exhaustive, but it should span from minimum to maximum configurations.
Listed options should only be those that are orderable by the customer. The
table should include each of the following for each listed configuration:
• Description of configuration.
• Steady-state heat release for equipment in watts for conditions defined in
Section 6.1. A calculator may also be provided at the discretion of the
manufacturer.
• Dimensions of configurations: height, width, and depth of the rack-
mountable or stand-alone equipment in I-P and SI units.

Table 6.1 Example of Thermal Reporting

XYZ Co. Model abc Server: Representative Configurations

Condition Description | Typical Heat Release (@ 110 V), W | Airflowa, Nominal, m3/h (cfm) | Airflow, Maximum @ 35°C (95°F), m3/h (cfm) | Weight, kg (lb) | Overall System Dimensionsb (W × D × H), mm (in.)
Minimum Configuration | 1765 | 680 (400) | 1020 (600) | 406 (896) | 762 × 1016 × 1828 (30 × 42 × 72)
Full Configuration | 10,740 | 1275 (750) | 1913 (1125) | 693 (1528) | 1549 × 1016 × 1828 (61 × 40 × 72)
Typical Configuration | 5040 | 943 (555) | 1415 (833) | 472 (1040) | 762 × 1016 × 1828 (30 × 40 × 72)

Cooling Scheme (airflow diagram): F-R
ASHRAE Class: A1, A2
Minimum Configuration: 1 CPU-A, 1 GB, 2 I/O
Full Configuration: 8 CPU-B, 16 GB, 64 I/O (2 GB cards, 2 frames)
Typical Configuration: 4 CPU-A, 8 GB, 32 I/O (2 GB cards, 1 frame)

a. Airflow values are for an air density of 1.2 kg/m3 (0.075 lb/ft3). This corresponds to air at 18°C (64.4°F), 101.3 kPa (14.7 psia), and 50% rh.
b. Footprint does not include service clearance or cable management, which is 0 on the sides, 1168 mm (46 in.) in the front, and 1016 mm (40 in.) in the rear.

• Weight in pounds and kilograms of the rack-mountable or stand-alone
equipment.
• Airflow characteristics of each configuration in m3/h and cfm for condi-
tions defined in Section 6.1.
• Airflow diagram showing intake and exhaust of system (side, top, front
or back). Specify scheme using syntax defined in Figures 5.1 and 5.2 of
Chapter 5.
• Applicable ASHRAE environmental class designation(s). Compliance
with a particular air-cooled environmental class requires full operation of the
equipment over the entire allowable environmental range, based on nonfailure
conditions.

6.3 EPA ENERGY STAR REPORTING


ASHRAE TC 9.9 has recommended better thermal reporting for a number of
years. Recently the United States Environmental Protection Agency (EPA) has incor-
porated many of ASHRAE's recommendations into its ENERGY STAR program,
particularly in the development of the ENERGY STAR requirements for serv-
ers. Note that not all servers are required to meet these documentation requirements,
only those that the manufacturer desires to have an ENERGY STAR rating. The
ENERGY STAR program is constantly being refined, so the reader is encouraged to
check the EPA website for the latest information. The current version (as of this writ-
ing) is Version 3.0 and can be found on the ENERGY STAR website (EPA 2018).
The ENERGY STAR Version 3.0 requirements state that for a server to be
eligible for certification under this specification, it must meet the definition of
computer server as provided in Section 1 of ENERGY STAR Program Require-
ments for Computer Servers (EPA 2019a). Eligibility under Version 3.0 is limited
to blade, multinode, rack-mounted, or pedestal form-factor computer servers with
no more than four processor sockets in the computer server (or per blade or node
in the case of blade or multinode servers). The tested configurations must include
the following:

• Model name and number, identifying SKU and/or configuration ID


• System characteristics (form factor, available sockets/slots, power specifica-
tions, etc.)
• System type (e.g., resilient)
• System configuration(s) (including low-end performance configuration, high-
end performance configuration, and typical configuration for product family
certification)
• Power consumption and performance data from required active and idle state
efficiency criteria testing
• Available and enabled power-saving features (e.g., power management)
• For product family certifications, a list of qualified configurations with quali-
fied SKUs or configuration IDs
• For blade servers, a list of compatible blade chassis that meet ENERGY
STAR certification criteria

To certify for an ENERGY STAR rating, a computer server must offer processor
power management that is enabled by default in the basic input/output system
(BIOS) and/or through a management controller, service processor, and/or the oper-
ating system shipped with the computer server. All processors must be able to reduce
power consumption in times of low utilization by

• reducing voltage and/or frequency through dynamic voltage and frequency
scaling (DVFS) or
• enabling processor or core reduced power states when a core or socket is not
in use.

A computer server must provide data on input power consumption (W), inlet air
temperature (°C [°F]), and average utilization of all logical central processing units
(CPUs):

• Input power consumption: Measurements must be reported with accuracy
of at least ±5% of the actual value, with a maximum level of accuracy of ±10
W for each installed power supply unit (PSU) (i.e., the power reporting accu-
racy for each power supply is never required to be better than ±10 W) through
the operating range from idle to full power.
• Inlet air temperature: Measurements must be reported with an accuracy of
at least ±2°C (3.6°F).
• CPU utilization: Average utilization must be estimated for each logical CPU
that is visible to the operating system (OS) and must be reported to the opera-
tor or user of the computer server through the operating environment (OS or
hypervisor).

These data must be made available in a published or user-accessible format that is
readable by third-party, nonproprietary management software over a standard
network. For blade and multinode servers and systems, data may be aggregated at
the chassis level.
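As an illustration only, the sketch below shows one way these reported quantities and
their accuracy requirements could be represented in a simple record consumed by
management software; the field names are hypothetical and are not defined by the
ENERGY STAR specification:

    # Hypothetical telemetry record for the three quantities an ENERGY STAR
    # certified server must report, with the accuracy requirements noted above.
    from dataclasses import dataclass

    @dataclass
    class ServerTelemetry:
        input_power_w: float         # accuracy at least ±5%, never required to be better than ±10 W per PSU
        inlet_air_temp_c: float      # accuracy at least ±2°C
        cpu_utilization_pct: list    # average utilization of each logical CPU visible to the OS

    def power_accuracy_requirement_w(reading_w):
        """Required reporting tolerance for a power reading: the looser of ±5% or ±10 W."""
        return max(0.05 * reading_w, 10.0)

    sample = ServerTelemetry(412.0, 24.5, [35.0, 41.2, 28.7, 55.1])
    print(round(power_accuracy_requirement_w(sample.input_power_w), 1))  # 20.6 W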
As ENERGY STAR rated servers (or any servers that report their power and
thermal information) and data center infrastructure management (DCIM) software
that can use the information to manage the data center become more prevalent, the
ability to provide a higher level of integration between IT management and building
management systems will allow data center designers and operators to more fully
optimize data centers for maximum efficiency.
Appendix A
2021 ASHRAE Environmental
Guidelines for ITE—
Expanding the Recommended 
Environmental Envelope
The recommended environmental envelopes for information technology equip-
ment (ITE) are listed in Tables 2.1 and 2.2. The purpose of the recommended envelope
is to give guidance to data center operators on maintaining high reliability and also
operating data centers in the most energy-efficient manner. To provide greater flex-
ibility in facility operations, particularly with the goal of reduced energy consump-
tion in data centers, ASHRAE TC 9.9 revisited the recommended equipment
environmental specifications for the second edition of Thermal Guidelines
(ASHRAE 2008). The result of this effort, detailed in this appendix, was to expand
the recommended operating environment envelope as shown in Table A.1, which
provides a comparison between the 2004, 2008/2011, 2015, and 2021 versions.
Figures A.1 and A.2 show the 2021 recommended envelopes. These recommended
conditions, as well as the allowable conditions, refer to the inlet air entering the data-
com equipment.
IT manufacturers test their ITE in the allowable envelope to verify that the equip-
ment will function within these environmental boundaries. Typically, manufacturers
perform a number of tests prior to the announcement of a product to verify that it meets
all the functionality requirements within this allowable environmental envelope. This
is not a statement of reliability but one of functionality of the ITE. However, the recom-
mended envelope is a statement of reliability. IT manufacturers recommend that data
center operators maintain their environment within the recommended envelope for
extended periods of time. Exceeding the recommended limits for short periods of time
should not be a problem, but running near the allowable limits for extended periods
could result in increased reliability issues. (See Table 2.6 in Chapter 2 for the effects
of higher inlet temperatures on server reliability.) In reviewing the available data from
a number of IT manufacturers, the 2008 expanded recommended environmental enve-
lope became the agreed-upon envelope that is acceptable to all IT manufacturers, and
operation within this envelope does not compromise overall reliability of ITE.

Table A.1 Comparison of 2004, 2008/2011, 2015, and 2021 Versions of Recommended Envelopes

                       2004 Version    2008/2011 Version     2015 Version          2021 Version
Low-end temperature    20°C (68°F)     18°C (64.4°F)         18°C (64.4°F)         18°C (64.4°F)
High-end temperature   25°C (77°F)     27°C (80.6°F)         27°C (80.6°F)         27°C (80.6°F)
Low-end moisture       40% rh          5.5°C (41.9°F) DP     –9°C (15.8°F) DP      –9°C (15.8°F) DP
High-end moisture      55% rh          15°C (59°F) DP        15°C (59°F) DP        15°C (59°F) DP
                                       and 60% rh            and 60% rh            and 70% rh or 50% rh

Figure A.1 Highlighted in red is the 2021 recommended envelope for a low level of pollutants.

Figure A.2 Highlighted in red is the 2021 recommended envelope for a high level of pollutants.
This recommended envelope was created for general use across all types of busi-
nesses and conditions. However, different environmental envelopes may be more
appropriate for different business values and climate conditions. Therefore, to allow
for the potential of the ITE to operate in a different envelope that might provide even
greater energy savings, the fourth edition of Thermal Guidelines (ASHRAE 2015b)
provided general guidance on server metrics that can assist data center operators in
creating different operating envelopes that match their business values. Each of these
metrics is described in Chapter 2. By using these guidelines, the user can determine
what environmental conditions best meet their technical and business needs. Any
choice outside of the recommended region will be a balance between the additional
energy savings of the cooling system and the deleterious effects that may be created
on total cost of ownership (TCO) (total site energy use, reliability, acoustics, and
performance).
None of the versions of the recommended operating environments ensure that the
data center is operating at optimum energy efficiency. Depending on the cooling
system, design, and outdoor environmental conditions, there will be varying degrees
of efficiency within the recommended zone. For instance, when the ambient tempera-
ture in a data center is raised, the thermal management algorithms within some data-
com equipment increase the speeds of air-moving devices to compensate for the higher
inlet air temperatures, potentially offsetting the gains in energy efficiency due to the
higher ambient temperature. It is incumbent upon each data center operator to review
and determine, with appropriate engineering expertise, the ideal operating point for
each system. This includes taking into account the recommended range and site-
specific conditions. The full recommended envelope is not the most energy-efficient
environment when a refrigeration cooling process is being used. For example, the high
dew point at the upper areas of the envelope results in latent cooling (condensation) on
refrigerated coils, especially in DX units. Latent cooling may decrease the available
sensible cooling capacity for the cooling system and, depending on the specific condi-
tions to be maintained in the data center, make it necessary to humidify to replace
excessive moisture removed from the air.
The ranges included in this book apply to the inlets of all equipment in the data
center (except where IT manufacturers specify other ranges). Attention is needed to
make sure the appropriate inlet conditions are achieved for the top portion of ITE
racks. The inlet air temperature in many data centers tends to be warmer at the top
portion of racks, particularly if the warm rack exhaust air does not have a direct return
path to the computer room air conditioners (CRACs). This warmer air also affects
the relative humidity (RH), resulting in lower values at the top portion of the rack.
Finally, it should be noted that the 2008 change to the recommended upper
temperature limit from 25°C to 27°C (77°F to 80.6°F) can have detrimental effects
on acoustical noise levels in the data center. See the Acoustical Noise Levels section
of this appendix for a discussion of these effects.

A.1 DRY-BULB TEMPERATURE LIMITS


Part of the rationale in choosing the new low and high temperature limits
stemmed from the generally accepted practice for the telecommunication industry’s
central office, based on Telcordia GR-3028-CORE (2001), which uses the same dry-
bulb temperature limits as specified in Table 2.1. In addition, this choice provides
precedent for reliable operation of telecommunications equipment based on a long
history of central office installations all over the world.

A.1.1 Low End


From an IT point of view, there is no concern in moving the lower recommended
limit for dry-bulb temperature from 20°C to 18°C (68°F to 64.4°F). In equipment
with constant-speed air-moving devices, a facility temperature drop of 2°C (3.6°F)
results in about a 2°C (3.6°F) drop in all component temperatures. Even if variable-
speed air-moving devices are used, typically no change in speed occurs in this
temperature range, so component temperatures again experience a 2°C (3.6°F) drop.
One reason for lowering the recommended temperature in 2008 was to extend the
control range of economized systems by not requiring a mixing of hot return air to
maintain the previous 20°C (68°F) recommended limit. The lower limit should not
be interpreted as a recommendation to reduce operating temperatures, as this could
increase hours of chiller operation and energy use. A non-economizer-based cooling
system running at 18°C (64.4°F) will most likely carry an energy penalty. (One
reason to use a non-economizer-based cooling system is having a wide range of inlet
rack temperatures due to poor airflow management; however, fixing the airflow
would likely be a good first step toward reducing energy.) Where the set point for
the room temperature is taken at the return to cooling units, the recommended range
should not be applied directly, as this could drive energy costs higher from over-
cooling the space. The recommended range is intended for the inlet to the ITE. If the
recommended range is used as a return air set point, the lower end of the range (18°C
to 20°C [64.4°F to 68°F]) increases the risk of freezing the coils in a direct-expansion
(DX) cooling system.

A.1.2 High End


The greatest justification for increasing the high-side temperature is to increase
the hours of economizer use per year. For non-economizer systems, there may be an
energy benefit by increasing the supply air or chilled-water temperature set points.
However, the move from 25°C to 27°C (77°F to 80.6°F) can have an impact on the
ITE’s power dissipation. Most IT manufacturers start to increase air-moving device
speed around 25°C (77°F) to improve the cooling of the components and thereby
offset the increased ambient air temperature. Therefore, care should be taken before
operating at the higher inlet conditions. The concern that increasing the IT inlet air
temperatures might have a significant effect on reliability is not well founded. An
increase in inlet temperature does not necessarily mean an increase in component
temperatures. Figure A.3 shows a typical component temperature relative to an
increasing ambient temperature for an IT system with constant-speed fans.

Figure A.3 Inlet and component temperatures with fixed fan speed.
In Figure A.3, the component temperature is 21.5°C (38.7°F) above the inlet
temperature of 17°C (62.6°F), and it is 23.8°C (42.8°F) above an inlet ambient
temperature of 38°C (100.4°F). The component temperature tracks the air inlet
ambient temperature very closely.
Now consider the response of a typical component in a system with variable-
speed fan control, as depicted in Figure A.4. Variable-speed fans decrease the fan
flow rate at lower temperatures to save energy. Ideal fan control optimizes the reduc-
tion in fan power to the point that component temperatures are still within vendor
temperature specifications (i.e., the fans are slowed to the point that the component
temperature is constant over a wide range of inlet air temperatures).
This particular system has a constant fan flow up to approximately 23°C (73.4°F).
Below this inlet air temperature, the component temperature tracks closely to the ambi-
ent air temperature. Above this inlet temperature, the fan adjusts flow rate such that the
component temperature is maintained at a relatively constant value.
These data bring up several important observations:

• Below a certain inlet temperature (23°C [73.4°F] in the case described above), IT
systems using variable-speed air-moving devices have constant fan power, and
their component temperatures track fairly closely to ambient temperature
changes. Systems that do not use variable-speed air-moving devices track ambient
air temperatures over the full range of allowable ambient temperatures.
• Above a certain inlet temperature (23°C [73.4°F] in the case described
above), the speed of the air-moving device increases to maintain fairly con-
stant component temperatures and, in this case, inlet temperature changes
have little to no effect on component temperatures and thereby no effect on
reliability, because component temperatures are not affected by ambient tem-
perature changes.
• The introduction of ITE that uses variable-speed air-moving devices has min-
imized the effect on component reliability as a result of changes in ambient
temperatures and has allowed for the potential of large increases in energy
savings, especially in facilities that use economizers.

Figure A.4 Inlet and component temperatures with variable fan speed.

As shown in Figure A.4, the IT fan power can increase dramatically as the fans ramp
up in speed to counter the increased inlet ambient temperature. The graph shows a typi-
cal power increase that results in the near-constant component temperature. In this
case, the fan power increased from 11 W at 23°C (73.4°F) inlet temperature to over
60 W at 35°C (95°F) inlet temperature. The inefficiency in the power supply results
in an even larger system power increase. The total room power (facilities + IT) may
actually increase at warmer temperatures. IT manufacturers should be consulted
when considering system ambient temperatures approaching the upper recom-
mended ASHRAE temperature specification. See the work by Patterson (2008) for
a technical evaluation of the effect of increased environmental temperature, where
it was shown that an increase in temperature can actually increase energy use in a
standard data center but reduce it in a data center with economizers in the cooling
system.
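A simple piecewise model captures the two behaviors discussed above and shown in
Figures A.3 and A.4. The numbers below (a roughly 21.5°C component temperature
rise and a 23°C fan ramp threshold) are only the illustrative values quoted in this
appendix for particular example systems, not general characteristics of ITE:

    # Illustrative model of component temperature versus inlet temperature for
    # constant-speed and variable-speed fan control, using the example values
    # quoted in this appendix.

    TEMP_RISE_C = 21.5       # example component rise above inlet temperature
    FAN_RAMP_START_C = 23.0  # example inlet temperature where variable-speed fans begin to ramp

    def component_temp_fixed_fan(inlet_c):
        """Constant-speed fans: the component temperature tracks the inlet."""
        return inlet_c + TEMP_RISE_C

    def component_temp_variable_fan(inlet_c):
        """Variable-speed fans: above the ramp threshold the fans speed up and
        hold the component temperature roughly constant."""
        return min(inlet_c, FAN_RAMP_START_C) + TEMP_RISE_C

    for inlet in (18, 23, 27, 35):
        print(inlet, component_temp_fixed_fan(inlet), component_temp_variable_fan(inlet))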
Because of the derating of the maximum allowable temperature with altitude for
Classes A1 and A2, the recommended maximum temperature is derated by 1°C/
300 m (1.8°F/984 ft) above 1800 m (5906 ft).
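A minimal sketch of this derating rule, taking the 27°C recommended high-end
temperature from Table A.1 as the starting point:

    # Recommended maximum dry-bulb temperature for Classes A1 and A2,
    # derated 1°C per 300 m above 1800 m as described above.

    def derated_recommended_max_c(altitude_m, base_max_c=27.0):
        """Recommended maximum dry-bulb temperature at a given altitude."""
        excess_m = max(0.0, altitude_m - 1800.0)
        return base_max_c - excess_m / 300.0

    print(derated_recommended_max_c(1800))  # 27.0
    print(derated_recommended_max_c(3000))  # 23.0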

A.2 MOISTURE LIMITS

A.2.1 High End


In 2015 ASHRAE funded a research project conducted by the Syracuse Univer-
sity Mechanical and Aerospace Engineering Department (Zhang et al. 2019) to
investigate the effects of gaseous pollutants and high relative humidity on the reli-
ability of ITE. Specifically, it was found that for data center environments where testing
with silver and copper coupons shows corrosion levels less than 300 Å/month for copper
and 200 Å/month for silver, suggesting that only the pervasive pollutants (SO2, NO2,
and O3) may be present, the moisture limit could be raised to 70% rh for the
recommended environmental envelope. However, before this change
could be made to the recommended envelope, detrimental effects to other IT compo-
nents from raising the RH limits needed to be investigated. Specifically, the question
considered was: what are the effects of this change from 60% to 70% rh on printed
circuit cards, hard disk drives (HDDs), and tape drives? The answers to this question
are addressed in the following subsections.

A.2.1.1 Printed Circuit Boards


Extensive reliability testing of printed circuit board (PCB) laminate materials
has shown that conductive anodic filament (CAF) growth is strongly related to RH
(Sauter 2001). Augis et al. (1989) had determined that there is a humidity threshold
below which CAF formation will not occur. They found that this RH threshold
depends upon operating voltage and temperature. For example, they found that for
a 50 V circuit operating at 25°C (77°F), the critical RH for CAF formation is near
80%. As humidity increases, time to failure rapidly decreases. Extended periods of
high RH can result in failures, especially given the reduced conductor-to-conductor
spacing common in many designs today. The CAF mechanism involves electrolytic
migration after a path is created. Path formation could be due to a breakdown of inner
laminate bonds driven by moisture, which supports the electrolytic migration and
explains why moisture is so key to CAF formation.
Regarding the impact of sodium chloride (NaCl) contamination and climatic conditions
on the reliability of PCB assemblies, it has been shown that there are no significant
changes in leakage current or electrochemical migration susceptibility when the RH is
increased from 60% to 70% (Verdingovas et al. 2014).
There are many other studies where the impact of humidity on PCB assemblies
is investigated, but these mainly focus on contaminants—in particular, weak organic
acids (WOAs), solder flux, and NaCl—on the surface resistivity, corrosion, and elec-
trochemical migration of PCBs. The solubility of the contaminants and the related
deliquescence RH levels of the various contaminant materials have a significant
influence on corrosion reliability of PCBs. The degree to which contaminants are
present, either from manufacturing or the climate, can vary widely and affect the critical
level of humidity above which failures occur. The guidelines in this book assume a clean
and well-controlled manufacturing process free from contaminants.

A.2.1.2 Hard Disk Drives (HDDs) and Tape Drives


The upper moisture region is also important for disk and tape drives. In disk
drives, there are head flyability and corrosion issues at high humidity. In tape drives,
high humidity can increase frictional characteristics of tape and increase head wear
and head corrosion. High RH, in combination with common atmospheric contami-
nants, is required for atmospheric corrosion. The humidity forms monolayers of
water on surfaces, thereby providing the electrolyte for the corrosion process. Sixty
percent RH is associated with enough monolayer buildup for the adsorbed water to
begin taking on fluid-like properties. When humidity levels exceed the critical equilibrium
humidity of a contaminant’s saturated salt, hygroscopic corrosion product is formed,
further enhancing the buildup of acid-electrolyte surface wetness and greatly accel-
erating the corrosion process. Although disk drives do contain internal means to
control and neutralize pollutants, maintaining humidity levels below the critical
humidity levels of multiple monolayer formation retards initiation of the corrosion
process.
The results from important research on disk reliability published in 2016
(Manousakis et al.) can help determine the maximum allowable relative humidity for
operating data centers. In this work, the researchers investigated nine worldwide
Microsoft data centers for a period of 1.5 to 4 years each with a focus on studying
the impact of environmental conditions (absolute temperature, temperature varia-
tion, relative humidity, and relative humidity variation) on disk failures. Using the
data from the nine data centers and more than one million disks, they drew many
interesting observations. One of the main results is that the effects of temperature and
temperature variation on disk reliability are much less significant than those of RH in
modern data center cooling configurations.
With the volume of data available, Manousakis et al. (2016) were able to create
and validate a new model of disk lifetime as a function of environmental conditions.
The data included five free-cooled data centers
(bringing outdoor air directly into the data center), two data centers with water-side
economizers, and two data centers using chillers with well-controlled environments.
Manousakis et al. (2016) reported a number of results, but one data center that
they chose to include details about was one of the highly humid free-cooled data
centers that had a disk population of 168k. The temperature and humidity distribu-
tions for this data center are shown in Figure A.5. This data center is one of the four
(out of nine) highly humid data centers investigated. The carefully controlled chiller-
based data centers where humidity was controlled to 50% were considered the base
case. For those data centers, the disk failure rate was 1.5% AFR (annual failure rate).
The failure rate of the free-cooled data center with the temperature and humidity
distribution shown in Figure A.5 was 3.1%, or two times greater than the base case
(it is important to note that the maximum RH specification for disk operation is 80%
rh). Consider the impact to the failure rate of this data center if the RH is limited to
70%: recomputing the failure rate based on their statistical model (using the expo-
nential parameters shown in Table 5 of their paper) and this 70% rh limit finds that
the failure rate is less than the baseline of the chiller-based data center controlled to
50% rh. (The reason is that, in the Figure A.5 distribution, the time spent below 50% rh,
where failure rates are lower, outweighs the time spent above 50% rh.) The failure rate
is 1.3% using the distribution in Figure A.5 but limited
to a maximum of 70% rh.

Figure A.5 Temperature and humidity distribution of a free-cooled data center.

Two more distributions were fabricated to make comparisons to the base case
of the chiller-controlled 50% rh data center. First, there was a case in which the
humidity varied between 30% and 60% in evenly distributed bins of 10% rh each;
that is, the bar graph showing this distribution would indicate an equal number of
samples for 30% to 40%, 40% to 50%, and 50% to 60% rh. This might represent
operating a data center with high humidity to a maximum RH limit of 60% (as it was
in the 2015 recommended envelope). In this case the failure rate was 1.23%, less than
the base case of 1.5%. Raising the bins by 10%, where the distribution becomes 40%
to 50%, 50% to 60%, and 60% to 70% rh to reflect operating a data center up to the
higher RH limit of 70% for the recommended envelope, results in a failure rate
computed at 1.78%. (This might be considered a worst-case scenario for operating
a data center at the higher RHs.) This projected failure rate seems acceptable given
that it is much less than that experienced by the data center described in Figure A.5,
where the failure rate was 3.1%, or more than two times the well-controlled chiller-
based data center with a failure rate of 1.5%.
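The bin-weighted recomputation described above can be sketched as follows. The
exponential dependence of failure rate on RH follows the description of the
Manousakis et al. (2016) model, but the two coefficients below are placeholders only;
the actual parameters must be taken from Table 5 of their paper, so this sketch does
not reproduce the 1.23% and 1.78% figures quoted above:

    # Weight an RH-dependent failure-rate model over an RH histogram, mirroring
    # the comparison of the two fabricated distributions discussed above.
    # Placeholder coefficients; real values come from Table 5 of Manousakis et al. (2016).
    import math

    A, B = 0.35, 0.025  # placeholder coefficients for AFR(rh) = A * exp(B * rh), rh in percent

    def afr_percent(rh):
        """Placeholder RH-dependent annual failure rate, in percent."""
        return A * math.exp(B * rh)

    def weighted_afr(rh_bins):
        """Weight the model over an RH histogram {bin midpoint: fraction of hours}."""
        return sum(frac * afr_percent(rh) for rh, frac in rh_bins.items())

    bins_30_to_60 = {35: 1/3, 45: 1/3, 55: 1/3}   # evenly distributed 30%-60% rh case
    bins_40_to_70 = {45: 1/3, 55: 1/3, 65: 1/3}   # same bins shifted up by 10% rh
    print(weighted_afr(bins_30_to_60), weighted_afr(bins_40_to_70))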
Two other observations from the Manousakis et al. (2016) paper are worth
including here:

• It was found that in high-RH data centers, server designs that place disks in
the backs of their enclosures can reduce the disk failure rate significantly.
• Though higher RH increases component failures, relying on software tech-
niques to mask them also significantly reduces infrastructure and energy costs
and more than compensates for the cost of the additional failures.

Tape products have been following note c of Table 2.1 where 80% rh is accept-
able: tape products require a stable and more restrictive environment (similar to
Class A1 of the 2011 thermal guidelines). Typical requirements are a minimum
temperature of 15°C (59°F), a maximum temperature of 32°C (89.6°F), a minimum
RH of 20%, a maximum RH of 80%, a maximum dew point of 22°C (71.6°F), a rate
of change of temperature less than 5°C/h (9°F/h), a rate of change of humidity of less
than 5% rh per hour, and no condensation.

A.2.2 Low End


The motivation for lowering the moisture limit is to allow a greater number of
hours per year where humidification (and its associated energy use) is not required.
The lower limit of moisture for the recommended envelope as shown in Table A.1
was changed in both 2008 and 2015 and will remain as the 2015 limit in this edition
of Thermal Guidelines. The key change from the original 2004 edition to all later
editions was the change from an RH limit to a dew-point limit. The key reason for
this change is to force data center operators to control moisture based on dew point
and not RH, principally because dew point is fairly uniform throughout the data
center whereas RH is not.
Another practical benefit of the change to a dew-point limit from an RH limit
is that the operation of the HVAC systems within the data center will be sensible only.
Also, having an RH limit greatly complicates the control and operation of the cool-
ing systems and could require added humidification operation at a cost of increased
energy in order to maintain an RH when the space is already above the needed dew-
point temperature. To avoid these complications, the hours of economizer operation
available using the 2004 guidelines were often restricted.
ASHRAE funded a research project conducted by the Missouri University of Science
and Technology to investigate low moisture levels and the resulting ESD effects
(Pommerenke et al. 2014). The concerns raised prior to this study regarding the
increase of ESD-induced risk with reduced humidity were not justified. Based on
those results, reported in Appendix D of this book, the lower moisture limit for the
recommended envelope was reduced from 5.5°C (41.9°F) to –9°C (15.8°F) dew
point and for Classes A1 and A2 was reduced from 20% rh to –12°C (10.4°F) and
8% rh. These changes significantly reduce the humidification requirements for data
centers.

A.3 ACOUSTICAL NOISE LEVELS


Noise levels in high-end data centers have steadily increased over the years and
are becoming a serious concern for data center managers and owners. For back-
ground and discussion on this, see Chapter 9 of Design Considerations for Datacom
Equipment Centers (ASHRAE 2009a). As stated in Chapter 2, the increase in noise
levels is the obvious result of the significant increase in cooling requirements of new,
high-end ITE, and the increase in concern results from noise levels in data centers
approaching or exceeding regulatory workplace noise limits, such as those imposed
by OSHA (1980) in the United States or by EC Directives in Europe (Europa 2003).
Empirical fan laws generally predict that the sound power level of an air-moving
device increases with the fifth power of rotational speed. This means that a 20%
increase in speed (e.g., 3000 to 3600 rpm) equates to a 4 dB increase in noise level.
While it is not possible to predict a priori the effect on noise levels of a potential 2°C
(3.6°F) increase in data center temperatures, it is not unreasonable to expect to see
increases in the range of 3 to 5 dB. Data center managers and owners should, there-
fore, weigh the trade-offs between the potential energy efficiencies with the recom-
mended new operating environment and the potential increases in noise levels.
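The fifth-power relationship can be applied directly: a speed ratio of N2/N1 gives an
increase of 50·log10(N2/N1) dB, so the 20% increase cited above works out to about
4 dB. A minimal sketch of that arithmetic:

    # Increase in fan sound power level for a given speed ratio, per the
    # empirical fifth-power fan law cited above.
    import math

    def sound_power_increase_db(speed_ratio):
        """Change in sound power level, in dB, for a fan speed ratio N2/N1."""
        return 50.0 * math.log10(speed_ratio)

    print(round(sound_power_increase_db(3600 / 3000), 1))  # ~4 dB for a 20% speed increase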
Again, as stated in Chapter 2, with regard to regulatory workplace noise limits
and to protect employees against potential hearing damage, data center managers
should check whether potential changes in noise levels in their environments will
cause them to trip various “action-level” thresholds defined in local, state, or
national codes. The actual regulations should be consulted, because they are
complex and beyond the scope of this book to explain fully. For instance, when levels
exceed 85 dB(A), hearing conservation programs are mandated, which can be quite
costly and generally involve baseline audiometric testing, noise level monitoring or
dosimetry, noise hazard signage, and education and training. When levels exceed
87 dB(A) (in Europe) or 90 dB(A) (in the United States), further action, such as
mandatory hearing protection, rotation of employees, or engineering controls, must
be taken. Data center managers should consult with acoustical or industrial hygiene
experts to determine whether a noise exposure problem will result from increasing
ambient temperatures to the upper recommended limit.

A.4 DATA CENTER OPERATION SCENARIOS
FOR THE RECOMMENDED ENVIRONMENTAL LIMITS

The recommended thermal guidelines are meant to give guidance to data
center operators on the inlet air conditions to the ITE for the most reliable oper-
ation. Four possible scenarios where data center operators may elect to operate at
conditions that lie outside the recommended environmental window follow.

1. Scenario #1: Expansion of economizer use for longer periods of the year where
hardware failures are not tolerated.
• For short periods of time, it is acceptable to operate outside the recom-
mended envelope and approach the allowable extremes. All manufactur-
ers perform tests to verify that their hardware functions at the allowable
limits. For example, if during the summer months it is desirable to oper-
ate for longer periods of time using an economizer rather than turning on
the chillers, this should be acceptable as long as the period of warmer
inlet air temperatures to the ITE does not exceed several days each year;
otherwise, the long-term reliability of the equipment could be affected.
Operation near the upper end of the allowable range may result in tem-
perature warnings from the ITE. See Section 2.4.3 of Chapter 2 for infor-
mation on estimating the effects of operating at higher temperatures by
using the failure rate x-factor data.
2. Scenario #2: Expansion of economizer use for longer periods of the year where
limited hardware failures are tolerated.
• As previously stated, all manufacturers perform tests to verify that their
hardware functions at the allowable limits. For example, if during the
summer months it is desirable to operate for longer periods of time using
the economizer rather than turning on the chillers, and if the data center
operation is such that periodic hardware fails are acceptable, then operat-
ing for extended periods of time near or at the allowable limits may be
acceptable. Of course, it is a business decision of when to operate within
the allowable and recommended envelopes and for what periods of time.
Operation near the upper end of the allowable range may result in tem-
perature warnings from the ITE. See Section 2.4.3 of Chapter 2 for infor-
mation on estimating the effects of operating at higher temperatures by
using the failure rate x-factor data.
3. Scenario #3: Failure of cooling system or servicing cooling equipment.
• If the system was designed to perform within the recommended environ-
mental limits, it should be acceptable to operate outside the recom-
mended envelope and approach the extremes of the allowable envelope
during a failure. Again, all manufacturers perform tests to verify that their
hardware functions at the allowable limits. For example, if a modular
CRAC unit fails in the data center and the temperatures of the inlet air of
the nearby racks increase beyond the recommended limits but are still
within the allowable limits, this is acceptable for a short period of time
until the failed component is repaired. As long as the repairs are com-
pleted within typical industry times for these types of failures, this opera-
tion should be acceptable. Operation near the upper end of the allowable
range may result in temperature warnings from the ITE.
4. Scenario #4: Addition of new servers that push the environment beyond the
recommended envelope.
• For short periods of time, it should be acceptable to operate outside the
recommended envelope and approach the extremes of the allowable enve-
lope when the temperature is temporarily increased due to the addition of
additional servers. As stated, all manufacturers perform tests to verify
that their hardware functions at the allowable limits. For example, if addi-
tional servers are added to the data center in an area that would increase
the inlet air temperatures to the server racks above the recommended lim-
its but adhere to the allowable limits, this should be acceptable for short
periods of time until the ventilation can be improved. The length of time
operating outside the recommended envelope is somewhat arbitrary, but
several days would be acceptable. Operation near the upper end of the
allowable range may result in temperature warnings from the ITE.
Appendix B


2021 Air-Cooled Equipment 
Thermal Guidelines (I-P)
For potentially greater energy savings than what would result from operating
ITE within the recommended environmental envelope, refer to Appendix C for the
process needed to account for multiple server metrics that impact overall total cost
of ownership (TCO).
Note k of Tables 2.1, 2.2, B.1, and B.2 provides clarification of the allowable
range of relative humidity (RH). The humidity range noted in the tables is not for the
range of dry-bulb temperatures specified in the tables (this can clearly be seen in the
psychrometric chart shown in Figures 2.2 and 2.3). As an example, the range of
humidity for Class A3 is shown in Figure 2.4. Additional clarification for the other
classes is provided in Appendix L.

Table B.1 2021 Thermal Guidelines for Air Cooling—I-P Version (SI Version in Chapter 2)

Equipment Environment Specifications for Air Cooling

Product Operationb,c

Classa     Dry-Bulb        Humidity Range,            Max. Dew    Max.           Max. Rate of
           Temp.e,g,       Noncond.h,i,k,l,n          Pointk,     Elev.e,j,m,    Changef,
           °F                                         °F          ft             °F/h

Recommended (suitable for Classes A1 to A4; explore data center metrics in
this book for conditions outside this range.)

A1 to A4   64.4 to 80.6    15.8°F DP to 59°F DP and
                           70% rhn or 50% rhn

Allowable

A1         59 to 89.6      10.4°F DP and 8% rh to     62.6        10,000         9/36
                           62.6°F DP and 80% rhk
A2         50 to 95        10.4°F DP and 8% rh to     69.8        10,000         9/36
                           69.8°F DP and 80% rhk
A3         41 to 104       10.4°F DP and 8% rh to     75.2        10,000         9/36
                           75.2°F DP and 85% rhk
A4         41 to 113       10.4°F DP and 8% rh to     75.2        10,000         9/36
                           75.2°F DP and 90% rhk

Product Power Offc,d (Classes A1 to A4): dry-bulb temperature 41°F to 113°F and 8% to 80% rhk

* For potentially greater energy savings, refer to Appendix C for the process needed to account for multiple server
metrics that impact overall TCO.

Notes for Table B.1, 2021 Thermal Guidelines for Air Cooling—
I-P Version (SI Version in Chapter 2)
a. Classes A3 and A4 are identical to those included in the 2011 version of the thermal guide-
lines. The 2015 version of the A1 and A2 classes has expanded RH levels compared to the
2011 version. The 2021 version of the thermal guidelines maintains the same envelopes for
A1 through A4 but updates the recommended range depending on the level of pollutants in
the data center environment.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to Class A1 as spec-
ified in 2008). Typical requirements: minimum temperature is 59°F, maximum temperature
is 89.6°F, minimum RH is 20%, maximum RH is 80%, maximum dew point (DP) is 71.6°F,
rate of change of temperature is less than 9°F/h, rate of change of humidity is less than 5%
rh per hour, and no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. Classes A1 and A2—Derate maximum allowable dry-bulb temperature 1.8°F/984 ft above
2953 ft. Above 7874 ft altitude, the derated dry-bulb temperature takes precedence over the
recommended temperature. Class A3—Derate maximum allowable dry-bulb temperature
1.8°F/574 ft above 2953 ft. Class A4—Derate maximum allowable dry-bulb temperature
1.8°F/410 ft above 2953 ft.
f. For tape storage: 9°F in an hour. For all other ITE: 36°F in an hour and no more than 9°F in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 9°F and 36°F temperature change is
considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 50°F (not applicable to Classes A1
or A2).
h. The minimum humidity level for Classes A1, A2, A3, and A4 is the higher (more moisture)
of the 10.4°F DP and the 8% rh. These intersect at approximately 77°F. Below this intersec-
tion (~77°F) the DP (10.4°F) represents the minimum moisture level, while above it, RH (8%)
is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-electrostatic discharge (non-ESD) floors and where personnel
are allowed to wear non-ESD shoes need increased humidity given that the risk of gener-
ating 8 kV increases slightly from 0.27% at 25% rh to 0.43% at 8% rh (see Appendix D for
more details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 10,000 ft.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for each of the classes for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 10,000 ft requires consultation with the IT supplier for each specific piece
of equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower.

Table B.2 2021 Thermal Guidelines for High-Density Servers—I-P Version (SI Version in Chapter 2)

Equipment Environment Specifications for High-Density Air Cooling

Product Operationb,c

Classa     Dry-Bulb        Humidity Range,            Max. Dew    Max.           Max. Rate of
           Temp.e,g,       Noncond.h,i,k,l,n          Point,      Elev.e,j,m,    Changef,
           °F                                         °F          ft             °F/h

Recommended

H1         64.4 to 71.6    15.8°F DP to 59°F DP and
                           70% rhn or 50% rhn

Allowable

H1         59 to 77        10.4°F DP and 8% rh to     62.6        10,000         9/36
                           62.6°F DP and 80% rhk

Product Power Offc,d: dry-bulb temperature 41°F to 113°F and 8% to 80% rhk

Notes for Table B.2, 2021 Thermal Guidelines for High-Density Servers—
I-P Version (SI Version in Chapter 2)
a. This is a new class specific to high-density servers. It is at the discretion of the ITE manufac-
turer to determine the need for a product to use this high-density server class. Classes A1
through A4 are separate and are shown in Table 2.1.
b. Product equipment is powered on.
c. Tape products require a stable and more restrictive environment (similar to 2011 Class A1).
Typical requirements: minimum temperature is 59°F, maximum temperature is 89.6°F, mini-
mum RH is 20%, maximum RH is 80%, maximum dew point (DP) is 71.6°F, rate of change
of temperature is less than 9°F/h, rate of change of humidity is less than 5% rh per hour, and
no condensation.
d. Product equipment is removed from original shipping container and installed but not in use,
e.g., during repair, maintenance, or upgrade.
e. For H1 class only—Derate maximum allowable dry-bulb temperature 1°F/1640 ft above 2950
ft. Above 7870 ft altitude, the derated dry-bulb temperature takes precedence over the recom-
mended temperature.
f. For tape storage: 9°F in an hour. For all other ITE: 36°F in an hour and no more than 9°F in
any 15-minute period of time. The temperature change of the ITE must meet the limits shown
in the table and is calculated to be the maximum air inlet temperature minus the minimum air
inlet temperature within the time window specified. The 9°F or 36°F temperature change is
considered to be a temperature change within a specified period of time and not a rate of
change. See Appendix K for additional information and examples.
g. With a diskette in the drive, the minimum temperature is 50°F. With the lowest allowed
temperature of 59°F, there is no problem with diskettes residing in this H1 environment.
h. The minimum humidity level for Class H1 is the higher (more moisture) of the 10.4°F DP and
the 8% rh. These intersect at approximately 77°F. Below this intersection (~77°F) the DP
(10.4°F) represents the minimum moisture level, while above it, RH (8%) is the minimum.
i. Based on research funded by ASHRAE and performed at low RH (Pommerenke et al. 2014),
the following are the minimum requirements:
1) Data centers that have non-electrostatic discharge (non-ESD) floors and where personnel
are allowed to wear non-ESD shoes may need increased humidity given that the risk of gener-
ating 8 kV increases slightly from 0.27% at 25% rh to 0.43% at 8% (see Appendix D for more
details).
2) All mobile furnishing/equipment is to be made of conductive or static-dissipative materials
and bonded to ground.
3) During maintenance on any hardware, a properly functioning and grounded wrist strap
must be used by any personnel who contacts ITE.
j. To accommodate rounding when converting between SI and I-P units, the maximum elevation
is considered to have a variation of ±0.1%. The impact on ITE thermal performance within
this variation range is negligible and enables the use of the rounded value of 10,000 ft.
k. See Appendix L for graphs that illustrate how the maximum and minimum DP limits restrict
the stated RH range for both product operations and product power off.
l. For the upper moisture limit, the limit is the minimum absolute humidity of the DP and RH
stated. For the lower moisture limit, the limit is the maximum absolute humidity of the DP and
RH stated.
m. Operation above 10,000 ft requires consultation with IT supplier for each specific piece of
equipment.
n. If testing with silver or copper coupons results in values less than 200 and 300 Å/month,
respectively, then operating up to 70% rh is acceptable. If testing shows corrosion levels
exceed these limits, then catalyst-type pollutants are probably present and RH should be
driven to 50% or lower. See note 3 of Section 2.2 for more details.
Appendix C

Detailed Flowchart for the
Use and Application of
the ASHRAE Data Center Classes
Figures C.1 through C.4 provide guidance to the data center operator on how to
position a data center to operate in a specific environmental envelope. These figures
permit the continued use of the recommended envelope as specified in Table 2.1 but,
more importantly, they show how to achieve even greater energy savings through the
use of a total cost of ownership (TCO) analysis using the server metrics provided in
Chapter 2.

C.1 NOTES FOR FIGURES

1. To use the highest inlet temperatures, ensure excellent airflow segregation is in
place to avoid recirculation with warmer IT outlet flows.
2. Higher-temperature information technology equipment (ITE) loses its primary
benefit if mixed with standard ITE; a common cooling system must meet the
most demanding ITE requirements.
3. Ensure “no chiller” option choice meets availability requirements of the IT
workload needs; investigate site extreme temperature and humidity conditions
and economizer uptime risks.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are
understood; temperatures of 55°C to 60°C (131°F to 140°F) may be expected.
5. Ensure higher airflow for ITE above the recommended range is understood.
Data center airflow may have to increase up to 250% (see Figure 2.9).

C.2 NOMENCLATURE FOR FIGURES


Ti = temperature at ITE inlet
Tmax = maximum temperature for a component
Trise = temperature rise across a component
TCO = total cost of ownership
Figure C.1 Guidance for applying thermal guidelines.

Figure C.2 Guidance for applying thermal guidelines to new construction projects.

Figure C.3 Guidance for applying thermal guidelines to major retrofit projects.

Figure C.4 Guidance for applying thermal guidelines to existing facilities


looking for efficiency gains.
Appendix D


ESD Research and 
Static Control Measures
This appendix supplements the summary information included in footnote i of
Table 2.1 and Table 2.2, discussing the need for minimum humidity levels and basic
electrostatic discharge protection protocols in the data center.

D.1 ESD BACKGROUND

Electrostatic discharge (ESD) can cause damage to silicon devices. Shrinking
device feature size means less energy is required in an ESD event to cause device
damage. Additionally, increased device operating speed has limited the effective-
ness of on-chip ESD protection structures; therefore, there is a significant risk to
unprotected IT components from ESD. In general, on-line operational hardware is
protected from ESD. However, when a machine is taken off-line it is no longer well
protected and becomes more susceptible to ESD damage. Most equipment has been
designed to withstand an ESD event of 8 kV while operational and grounded prop-
erly. Human perception of ESD is somewhat less; one can see ESD at 8 kV, hear it
at 6 kV, and feel it at 3 kV (these are order of magnitude estimates). Unprotected
semiconductor devices can be damaged at around 250 V. The next several genera-
tions of components will see this drop to around 125 V. Significant risk to the hard-
ware exists even when there is no perceptible ESD. In fact, damage can occur at ten
times below the perceptible limit. At these very low levels, an extensive ESD proto-
col is required.
ESD can be generated by the personnel in the room or the room hardware itself.
Section D.2 discusses ESD research, and two sections of high-level guidance for
ESD control are presented in Sections D.3 and D.5. Data center operators are encour-
aged to review the sources listed in Section D.5, Further Reading, and to implement
an effective ESD program at their sites.

D.2 ESD RESEARCH

ASHRAE funded the Electromagnetic Compatibility (EMC) Laboratory at the
Missouri University of Science and Technology from 2011 to 2014 to investigate the
risk of upsets or damage to electronics related to electrostatic discharge (ESD) in
data centers. Emphasis was placed on the increase in risk with reduced humidity.
Lower humidity not only increases the charge voltages but also leads to longer
charge retention and more damaging discharges; therefore, the following experiments
were performed under various environmental, footwear, and flooring conditions
(Pommerenke et al. 2014):

• Human charging test: The human body voltage of a person walking on the
floor was measured as a function of floor type, footwear, grounding, and envi-
ronmental conditions.
• Cable charging by spooling and dragging: Different cables were dragged
across different surfaces and the induced voltage was measured.
• Human metal discharge: A charged person held a metallic ground and dis-
charged himself. Currents and electric fields were measured.
• Cable discharge: To emulate charges on a jacket, cables were wrapped with
aluminum foil, the foil was charged to a given voltage, and the voltages
induced on the wires were measured.

Only the data from the measurement of voltages generated by people walking
are reported here, as these test results were considered most directly related to the
humidity requirements for the environmental classes. Results from the other exper-
iments can be obtained from the research project final report by Pommerenke et al.
(2014).
The charging experiments were analyzed to obtain both the maximal voltage for
each parameter combination and the effect of parameter changes, especially the
humidity. Furthermore, an extrapolation was performed to obtain the probability of
voltages larger than typical thresholds used for electronic systems robustness. Here,
500 V (for service conditions) and 4 and 8 kV (derived from the IEC 61000-4-2 test
method [IEC 2008]) were used as the limits.
Using ESD-mitigating flooring and footwear, the risk of ESD upset and damage
can be reduced to an insignificant level, even if the humidity is allowed to drop to low
values, such as 8% (the lower limit of relative humidity for Classes A3 and A4). In
addition to using conductive footwear and flooring, other precautions should be taken,
especially under low-humidity conditions, such as avoiding rapid removal of non-conductive
plastic wrapping when in close proximity to ITE. Furthermore, all office chairs and
carts selected for use in data centers should have ESD-mitigating properties.
The low increase in the ESD risk with reduced humidity indicates that a data
center with a low incident rate of ESD-induced damage operating at 25% rh will
maintain a low incident rate if the humidity is reduced to 8%. The concerns raised
prior to the study regarding the increase in ESD-induced risk with reduced humidity
are not justified. A standard set of ESD mitigation procedures will ensure a very low
ESD incident rate at all humidity levels tested.
All electronic equipment placed in a data center is tested for its ESD robustness
to at least the levels set by IEC 61000-4-2, which is 4 kV contact mode and 8 kV
air discharge mode (IEC 2008). However, human charging can lead to voltages
above these levels, and discharges can have rise times that are faster than the refer-
enced event used to define the ESD test standard IEC 61000-4-2. Three voltage
limits were chosen for expressing the effects of lower humidity levels (Pommerenke
et al. 2014):

• 500 V is the limit during service. This level was selected as an assumed
robustness during service of ITE. During service, shielding panels may be
removed, the operator may handle hard drives or other plug-in devices, and
the operator might connect a laptop via a USB cable to an internal service
connector. Those service actions are usually not considered during standard-
ized IEC 61000-4-2 ESD testing, as these conditions expose sensitive elec-
tronics. In the electronics industry, it is generally considered that a voltage of
100 V is low enough to handle nearly all electronic components (such as inte-
grated circuits or transistors). However, we assume that these components are
integrated into a system, and the system, such as a hard drive, provides some
level of protection and shielding. This assumption and communication with
many people involved in ITE quality control led to a decision to use 500 V as
the service robustness threshold.
• 4 kV is derived from the level 2 contact discharge test method in IEC 61000-
4-2. This test uses a contact mode and the contact mode waveform is based on
the much more severe human metal ESD. In the examples that illustrate a pos-
sible event rate, it was assumed that the operator will only discharge himself
via a piece of metal into the ITE in 1% of the cases when he touches a server
during operation. An example of such discharge might be the discharge from
a handheld key to a key lock on the operator console of a server.
• 8 kV is derived from the level 3 air discharge test method in IEC 61000-4-2.
This is the air discharge test level that is applied to nonconductive surfaces.
Here it was assumed that the failure mechanism inside the ITE is only a func-
tion of having or not having a breakdown, independent of the current or rise
time. The dielectric breakdown threshold is not a function of holding or not
holding a metal part. In the example that illustrates a possible event rate, it
was assumed that every time the operator reaches >8 kV, damage or an upset
may occur (the human/metal ESD calculation assumed that only 1% of the
discharges are via a piece of metal, thus endangering the ITE). An example of
such a discharge might be a discharge from the surface of a touch screen into
the electronics of the screen.

The Pommerenke et al. (2014) testing also includes measurement of resistance
to groundable points from the surface of the flooring test plates using an industry-
recognized contact electrode, as well as measurement of resistance to groundable
points of the floor test plates from personnel wearing various types of footwear. The
main dynamic test procedure measures the accumulation of voltage on a person
while walking in a specified pattern on the floor test plates according to industry-
standard test method ANSI/ESD STM97.2 (ESDA 2006b, 2016). The walking
pattern established in the STM97.2 procedure is shown in Figure D.1. During the
walking test, the person repeats the walking pattern a minimum of ten times while
holding an electrode that is connected by wire to the voltage monitor as shown in
Figure D.2. A data acquisition system records the walking voltage, and associated
software computes the statistics of the voltages recorded for the test iterations.

Figure D.1 Walking pattern according to ANSI/ESD STM97.2 (ESDA 2016). (Reprinted with permission of EOS/ESD Association, Inc.)

Figure D.2 Walking voltage test setup according to ANSI/ESD STM97.2 (ESDA 2016). (Reprinted with permission of EOS/ESD Association, Inc.)

The relative rate of ESD-related failures or upsets is derived for various types
of data centers based on different flooring systems and personal footwear. As esti-
mation of the actual number of ESD-related failures or upsets is impossible, hypo-
thetical scenarios of data centers are considered with the assumption that the
operator actions and ITE are constant in all these data centers. Then, using industry-
accepted ESD robustness thresholds, the probabilities of exceeding these thresholds
are calculated and compared. This analysis allows us to estimate the relative rate of
ESD-related failures or upsets as a function of environmental conditions, flooring
types, and footwear. The simulation is based on a well-defined walking pattern that
has good repeatability (see Figure D.1). Due to limitations on performing the well-
defined walking pattern for long periods of time and due to the small probability of
observing very high voltages, an extrapolation approach is used to determine the
probabilities of exceeding ESD robustness levels. Two approaches have been used
to obtain the extrapolation functions used to predict higher voltage levels:
(1) performing the extrapolation based on the distribution functions measured in the
test, and (2) performing the extrapolation based on literature data. The literature data
predict higher risk levels; however, in many cases both extrapolations lead to the
same conclusions with respect to risk level. Based on the calculated probabilities and
different categories of data center, recommendations regarding the flooring system
and footwear control are provided herein.
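To make the extrapolation step concrete, the following is a minimal sketch, in Python, of the first approach (extrapolating from the measured distribution): empirical exceedance probabilities are computed from recorded walking-voltage peaks, and a log-linear tail fit is evaluated at the 500 V, 4 kV, and 8 kV thresholds. The sample data, the fitting choices, and the function name are illustrative assumptions, not the procedure used in the referenced study.

```python
# Minimal sketch (not the RP-1499 analysis itself) of extrapolating the
# probability of exceeding a voltage threshold from walking-test peak voltages.
# The sample data below are synthetic and for illustration only.
import numpy as np

def tail_exceedance(samples, thresholds, fit_quantile=0.5):
    """Fit log10(P(V > v)) = a + b*v to the upper tail of the measured peak
    voltages and evaluate the fit at (possibly much higher) thresholds."""
    v = np.sort(np.abs(np.asarray(samples, dtype=float)))
    # Empirical exceedance probability assigned to each sorted sample
    p_exceed = 1.0 - (np.arange(1, v.size + 1) - 0.5) / v.size
    tail = v >= np.quantile(v, fit_quantile)
    slope, intercept = np.polyfit(v[tail], np.log10(p_exceed[tail]), 1)
    return 10.0 ** (intercept + slope * np.asarray(thresholds, dtype=float))

rng = np.random.default_rng(0)
peak_voltages = rng.exponential(scale=150.0, size=2000)  # hypothetical peaks, volts

for v0, prob in zip((500, 4000, 8000), tail_exceedance(peak_voltages, (500, 4000, 8000))):
    print(f"P(V > {v0} V) is approximately {prob:.1e}")
```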
For this test, 18 different types of flooring samples were assembled on test plates
0.91 × 0.91 m (3 × 3 ft) in size. Twelve different types of footwear or shoe-grounding
devices were gathered, representing a broad spectrum of shoe types and materials
(shoe types are shown in Table D.1). The electrical resistance ranges are shown in Table D.2.

Table D.1 Types of Flooring and Shoes Used in Test Program

Flooring                             Shoes
Conductive vinyl 1                   Dissipative Asia slip-on
Dissipative vinyl 1                  Full-sole grounder
Green dissipative vinyl              Running shoe
Low-dissipative vinyl                Heel ground strap
Thin conductive vinyl                Full-sole elastic grounder
Epoxy—dissipative                    Dissipative safety shoe
Conductive rubber                    Low-dissipative safety shoe
High-pressure laminate (HPL) F       Deck shoe 1
HPL N                                Deck shoe 2
Asia vinyl                           Deck shoe 3
Asphalt tile                         Molded plastic
Asphalt tile with dissipative wax    Leather dress

Table D.2 Flooring and Shoes Defined by Electrical Resistance

Conductive ESD floors (electrical resistance <1 × 10⁶ Ω):
Conductive rubber; Conductive vinyl 1; Thin conductive vinyl; Asia vinyl

Dissipative ESD floors (1 × 10⁶ Ω to <1 × 10⁹ Ω):
Asphalt tile with dissipative wax; Dissipative vinyl 1; Green dissipative vinyl; Low-dissipative vinyl; Epoxy—dissipative

Non-ESD floors (>1 × 10⁹ Ω):
Asphalt tile; HPL F; HPL N

ESD shoes and shoe-grounding devices (<1 × 10⁸ Ω):
Low-dissipative safety shoe; Dissipative safety shoe; Dissipative slip-on; Full-sole grounder; Heel ground strap; Full-sole elastic grounder

High-resistance shoes (>1 × 10⁹ Ω):
Running shoe; Deck shoe 1; Deck shoe 2; Deck shoe 3; Molded plastic; Leather dress

Floors tested by ANSI/ESD STM7.1 (ESDA 2013) and shoes/shoe grounders by ANSI/ESD STM97.1 (ESDA 2006a).

While not every shoe type was tested in every condition on all the floors, the
main types of shoes were tested as experience was gained during the test program.
Many of the shoe types performed similarly on similar floors. Therefore, in a few
cases floor types and shoe types were skipped in some of the test conditions to reduce
redundancy. The environmental conditions in between the extremes show the
same tendencies; details can be found in the work by Pommerenke et al. (2014).
If these data had been recorded over a very long time (e.g., one year), the voltage
might have exceeded 4 kV a few times. It should be noted that both the maximum
voltage in each walking cycle and the shape of the waveform depend on the envi-
ronmental condition, shoe and floor types, and speed and pattern of walking. The
walking experiment is repeated for different environmental conditions while keep-
ing other parameters (walking pattern, speed of walking, and types of flooring and
shoes) constant. The amplitude density of the recorded data is converted to a prob-
ability density function by dividing by the total number of points in the data set.
The voltage magnitude is used, so negative charge voltages are included.
While it may prove impossible to control with certainty the footwear worn by
personnel who enter or work in data centers, facility owners and managers should
be aware that footwear can lead to issues in the daily operation of a data center.
Almost any conventional polymer-based sole material may lead to high charge
levels, some more so than others, regardless of humidity. A conductive floor will
help to mitigate electrostatic charging, even from the worst possible pair of shoes.
The results from the walking test are summarized in Table D.3.

Table D.3 Probabilities of Voltages from Walking Tests Greater than Threshold Values

Cumulative Probability (V > V0) with ESD Floors and ESD Shoes (Pattern Walking)
Environmental Condition        V0 = 500 V    V0 = 4 kV    V0 = 8 kV
45% rh at 27°C (80.6°F)        1.47E-11      1.68E-19     3.82E-22
25% rh at 27°C (80.6°F)        9.74E-05      3.05E-09     9.61E-11
8% rh at 27°C (80.6°F)         3.76E-06      6.80E-12     8.30E-14

Cumulative Probability (V > V0) with Non-ESD Floors and Non-ESD Shoes (Pattern Walking)
Environmental Condition        V0 = 500 V    V0 = 4 kV    V0 = 8 kV
45% rh at 27°C (80.6°F)        4.70%         0.01%        0.00%
25% rh at 27°C (80.6°F)        23%           1.13%        0.27%
8% rh at 27°C (80.6°F)         48.80%        2.28%        0.43%

Cumulative Probability (V > V0) with ESD Floors and Non-ESD Shoes (Pattern Walking)
Environmental Condition        V0 = 500 V    V0 = 4 kV    V0 = 8 kV
45% rh at 27°C (80.6°F)        0.15%         7.44E-11     1.17E-13
25% rh at 27°C (80.6°F)        5.80%         7.14E-11     2.12E-10
8% rh at 27°C (80.6°F)         12.20%        2.38E-06     3.01E-09

Because all electronic equipment placed in a data center is tested for its ESD
robustness to at least the levels set by CISPR 24 (IEC 2010) (which is 4 kV contact
mode and 8 kV air discharge mode), the columns in Table D.3 for 4 kV and 8 kV are
of primary interest. The 500 V column associated with servicing of servers is not of
particular interest, since wrist straps are now required for servicing (see footnote i
in Table 2.1 and Table 2.2). What is noteworthy in Table D.3 is that the test results
for ESD floors/ESD shoes and ESD floors/non-ESD shoes at 4 kV, 8 kV, and higher
show essentially zero risk at a relative humidity (RH) of 8%. Tests were performed
for the category of non-ESD floors/ESD shoes, but not enough tests were performed
to obtain accurate probability projections. However, the results did indicate that they
would be very similar to the ESD floors/non-ESD shoes results, where the risk at
4 and 8 kV is essentially zero. Finally, the probability results at 4 and 8 kV for non-ESD
floors/non-ESD shoes do show a slight increase in risk in going from 25% to 8% rh,
albeit the risk is low (Pommerenke et al. 2014). Because ITE is tested to 8 kV, the
data center operator will need to exercise some judgment as to whether to increase
moisture levels above 8% rh, given this increase of risk from 0.27% to 0.43% at 8 kV.

D.3 PERSONNEL AND OPERATIONAL ISSUES


Electrostatic charge control must be considered when handling or coming into
contact with electrostatic-sensitive components such as motherboards, central
processing units (CPUs), and others. The goal is to minimize electrostatic voltage
potentials between all items within the area deemed to be ESD sensitive. This is
accomplished through selection of proper materials, such as low-charging (antistatic)
and static-dissipative materials, and by properly grounding items and personnel.
Operations for controlling the buildup and discharge of static electricity should
adhere to the following guidelines:

• Proper grounding is a very important aspect of electrostatic charge control.
Personnel can be grounded either through a wrist strap that is electrically
bonded to a known building or chassis ground or through the use of ESD foot-
wear such as ESD shoes or shoe-grounding straps. The latter method requires
that there be an electroconductive or static-dissipative floor to allow a charge
path from the human to a known building ground source.
• Areas/workstations where ITE will be handled and maintained should have
surfaces that are static dissipative and are grounded to a known building
ground source.
• Personnel working in and around open ITE should use smocks with static-
dissipative properties. A grounded smock is used to contain electrostatic
fields that emanate from the clothing of the personnel.
• Ensure all data center personnel have had ESD awareness training.
• Eliminate nonessential insulators from work areas.
• Ensure work surfaces are grounded and static dissipative.
• Use ESD-shielded bags or containers for all components in non-ESD-controlled
areas.
• Use ESD gloves and finger cots for work in ESD-sensitive areas.
• Use static-dissipative tools at workstations, including static-dissipative vac-
uum wands, suction cups, and tweezers.

D.4 FLOORING ISSUES


Conductive flooring for controlling the buildup and discharge of static electric-
ity should adhere to the following guidelines:

• Provide a conductive path from the metallic floor structure to a known build-
ing ground source.
• Ground the floor metallic support structure (stringer, pedestals, etc.) to build-
ing steel at several places within the room. The number of ground points is
based on the size of the room. The larger the room, the more ground points
that are required.
• Ensure the maximum resistance for the flooring system is 2 × 10¹⁰ Ω, mea-
sured between the floor surface and the building ground (or an applicable
ground reference). Flooring material with a lower resistance will further
decrease static buildup and discharge. For safety, the floor covering and floor-
ing system should provide a resistance of no less than 150 kΩ when measured
between any two points on the floor space 1 m (3 ft) apart. (A minimal compli-
ance check against these limits is sketched after this list.)
• Maintain ESD-control floor coverings (including carpet and tile) according to
the individual supplier’s recommendations. Carpeted floor coverings must
meet electrical conductivity requirements. Use only low-charging materials
with low-propensity ratings.
• Use only ESD-control furniture with conductive casters or wheels.
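
As referenced in the resistance bullet above, these limits lend themselves to a simple automated check of survey data. The following minimal sketch assumes hypothetical resistance readings and hypothetical function and constant names; it simply compares readings against the 2 × 10¹⁰ Ω maximum to ground and the 150 kΩ safety minimum stated above.

```python
# Minimal sketch of a flooring-resistance compliance check using the limits
# stated above: resistance to building ground no greater than 2e10 ohms, and,
# for safety, no less than 1.5e5 ohms between any two points 1 m (3 ft) apart.
MAX_TO_GROUND_OHMS = 2e10
MIN_POINT_TO_POINT_OHMS = 1.5e5

def check_floor(readings_to_ground_ohms, readings_point_to_point_ohms):
    """Return human-readable findings for any out-of-range readings."""
    findings = []
    for i, r in enumerate(readings_to_ground_ohms):
        if r > MAX_TO_GROUND_OHMS:
            findings.append(f"Location {i}: {r:.2e} ohm to ground exceeds the 2e10 ohm limit")
    for i, r in enumerate(readings_point_to_point_ohms):
        if r < MIN_POINT_TO_POINT_OHMS:
            findings.append(f"Pair {i}: {r:.2e} ohm point-to-point is below the 150 kohm safety minimum")
    return findings

# Hypothetical survey readings (ohms)
print(check_floor([5.0e8, 3.1e10], [2.4e6, 9.0e4]) or "All readings within limits")
```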

D.4.1 Measuring Floor Resistance


A test instrument similar to an AEMC-1000 megohmmeter is required for
measuring floor conductivity. Figure D.3 shows the typical test setup for measuring
floor conductivity.

Figure D.3 Typical test setup for measuring floor conductivity.

D.5 FURTHER READING


The most recent editions of the following publications, though not written
specifically for data centers, contain information that may be useful for the devel-
opment and implementation of a static control program:

• ANSI/ESD STM7.1, Characteristics of Materials—Flooring Materials (ESDA 2013)

• ANSI/ESD STM9.1, Footwear—Resistive Characteristics (ESDA 2014b)


• ANSI/ESD S20.20, ESD Association Standard for the Development of an
Electrostatic Discharge Control Program for Protection of Electrical and
Electronic Parts, Assemblies and Equipment (Excluding Electrically Initiated
Explosive Devices) (ESDA 2014a)
• ANSI/ESD S6.1, Grounding (ESDA 2019)
• The Effect of Humidity on Static Electricity Induced Reliability Issues of ICT
Equipment in Data Centers. 1499 TRP (Pommerenke et al. 2014)
Appendix E

Research on the Effect of 
RH and Gaseous Pollutants on 
ITE Reliability
ASHRAE funded research performed at Syracuse University from 2015 to
2018. The final report (Zhang et al. 2019) was submitted to ASHRAE in December
2018 and a technical article summarizing the research was published in 2020 (Zhang
et al. 2020).
The objective of the research was to experimentally investigate how increasing
the relative humidity (RH) and temperature beyond the ASHRAE-recommended
thermal envelope would affect the corrosion rate and corrosion mechanisms of the
primary metals used in the construction of electronics: copper and silver. Copper and
silver test specimens were exposed to different mixed flowing gas and thermal environ-
mental conditions and then analyzed using the coulometric reduction technique, to measure the
corrosion thickness, and scanning electron microscopy and energy dispersive spec-
trometry, to identify the corrosion products on the surface.
Control of air pollution and of the thermal environment in data centers is critical to
the reliability of data and communications equipment and systems. As shown in
Figure E.1, the fourth edition of Thermal Guidelines for Data Processing Environ-
ments (ASHRAE 2015b) provided guidance on the recommended range of tempera-
ture and humidity conditions (i.e., the ASHRAE-recommended thermal envelope)
that would limit the severity of copper and silver corrosion to acceptable levels. That
recommendation was largely based on the field experience that if the temperature
and humidity ratio are within the recommended thermal envelope and the 30-day
corrosion thickness is below 300 Å and 200 Å for copper and silver, respectively, ITE
would not have premature failure within its service life. However, it is not clear under
what pollution conditions these limits are valid or if the thermal envelope could be
expanded if the pollution compounds and concentration levels were better managed.
There was clearly a need to determine the allowable temperature and RH limits given
pollutant levels that are realistic in the environments around the world for data
centers, especially under higher RH conditions. Allowing the recommended thermal
envelope to expand to a wider range would enable significant cooling energy savings
for data centers by using free cooling (Zhang et al. 2014). The fundamental question
for the ASHRAE-funded research was: can the recommended thermal envelope be
expanded for the purpose of reducing cooling energy consumption if the air pollu-
tion conditions are better understood and controlled?
A comprehensive literature review has shown that NO2, SO2, O3, Cl2, and H2S
are of most concern for corrosion in the data center environment (Zhang et al. 2018).
NO2, O3, and SO2 are the most prevalent globally, and their outdoor concentration
levels vary by location. Cl2 and H2S pollutants are generally caused by local events
such as emissions from sewage treatment plants, decay of vegetation in wetlands, and
soils. Based on the results of the literature review and assuming that indoor concen-
trations would be similar to the outdoor levels in a worst-case condition when
outdoor air is used directly for free cooling, realistic indoor worst-case concentra-
tions for corrosion testing were defined as 80 ppb NO2, 60 ppb O3, 40 ppb SO2,
2 ppb Cl2, and 10 ppb H2S (Zhang et al. 2018).

Figure E.1 2015 thermal environmental conditions of air entering ITE (A1, A2, A3, and A4 represent different environmental envelopes for ITE).
All tests were performed by first exposing the test specimens (standard copper
and silver coupons or printed circuit boards; see Figure E.2) in exposure chambers
of a testing system (Figure E.3) specifically developed for this study. The test spec-
imens were then analyzed by coulometric reduction to determine the total corrosion
thickness and quantities of major corrosion products.
A mixed flowing gas test apparatus (Figure E.3) was developed for this
research. It was based on ASTM B827-05, Standard Practice for Conducting Mixed
Flowing Gas (MFG) Environmental Tests (ASTM 2014). The experiments were
designed around two groups of mixed flowing gas mixtures: one consists of the prev-
alent compounds NO2, O3, and SO2, or their combinations, and the other includes
NO2, O3, and SO2 plus Cl2 or H2S or both.
Figure E.2 Test specimens: (a) standard copper and silver coupons and (b) printed circuit board (PCB) coupons.

Figure E.3 Experimental setup for mixed flowing gas testing.

The corrosion thicknesses of copper and silver coupons were measured after six
days of exposure at 50% rh, 70% rh, and 80% rh and 21°C and 28°C (69.8°F and
82.4°F) under different pollutant mixtures. This research found that copper corro-
sion is strongly dependent on RH (Zhang et al. 2019). Figure E.4 shows that when
no Cl2 or H2S is present (i.e., only NO2, O3, and SO2 were present), increasing the
RH from 50% to 70% did not cause any significant increase in corrosion thickness
for copper, but at 80% rh there was a significant increase in corrosion thickness. It
was also noticed that for all testing at 50% rh and above with all pollutant mixtures,
none of the results were acceptable and the corrosion thicknesses were well beyond
the limits for copper. The corrosion rate of silver (Figure E.5), however, was found
to have no obvious dependence on RH. Increasing the RH did not cause a significant
difference in the corrosion thickness for the four-compound mixture
(NO2 + O3 + SO2 + Cl2). However, any test mixture containing H2S caused significant
corrosion on both the copper and silver coupons.

Figure E.4 Corrosion thicknesses for copper at 50% rh, 70% rh, and 80% rh.

Figure E.5 Corrosion thicknesses for silver at 50% rh, 70% rh, and 80% rh.

E.1 CONCLUSIONS FROM THE RESEARCH

1. The overall research results from RP-1755 (Zhang et al. 2019) follow. It is
important to note that the conclusions developed from this research are based
on the pollutant concentration at or near a maximum experienced around the
world. In most real-world cases the pollutant levels would be expected to be
much less than those tested in this research.
a. Corrosion development over time: According to the experimental results
from the 30-day tests (21°C [69.8°F] and 50% rh) for the 5-compound gas
mixture, NO2 + O3 + SO2 + Cl2 + H2S, there exists a logarithmic relation-
ship between the corrosion thickness and exposure time for copper. However,
for silver, a linear relationship appears to be a better description of the
development of the thickness over the exposure time, and increasing the RH
led to a reduction in the corrosion rate. (A simple illustration of fitting these
two growth models to coupon data appears at the end of this appendix.)
b. Effects of pollutant mixture: For the reference temperature and RH
condition (21°C [69.8°F] and 50% rh), significant copper corrosion
occurred only for the Cl2-containing mixtures (NO2 + O3 + SO2 + Cl2 and
NO2 + SO2 + Cl2 + H2S and NO2 + O3 + SO2 + Cl2 + H2S). These results
suggest that Cl2 had the most corrosive effect for copper. Without Cl2 the
corrosion thicknesses were significantly lower. However, for silver, signif-
icant corrosion occurred only when H2S was in the pollutant mixture. The
dominating effects of Cl2 on copper corrosion and of H2S on silver corro-
sion were also evidenced from the corrosion products or elements identi-
fied from the results of coulometric reduction analysis. As a result, separate
design guidelines can be established for data center environments depend-
ing on whether there is Cl2 and/or H2S in the environment—one for envi-
ronments where only the pervasive compounds (NO2, O3, and SO2) in
atmospheric pollution are present, and the other for environments where
Cl2 and/or H2S are also present due to local surroundings and/or indoor
activities.
2. Regarding the effects of RH on copper and silver corrosion, RP-1755 (Zhang
et al. 2019) found the following:
a. For copper, increasing the RH from 50% rh to 70% rh while keeping the
temperature at the reference condition (21°C [69.8°F]) enhanced the corro-
sion when Cl2 was present but did not have a significant impact on corro-
sion when Cl2 was not present. A further increase of the RH to 80%
resulted in significant corrosion for all gas conditions tested, including O3,
O3 + SO2, NO2 + O3, NO2 + O3 + SO2, NO2 + O3 + SO2 + Cl2, NO2 + O3
+ SO2 + H2S, and NO2 + O3 + SO2 + Cl2 + H2S. This suggests that a critical
RH exists for copper between 70% and 80% rh, above which the corrosion
thickness increases dramatically.
b. For silver, increasing the RH did not cause a significant increase in the corro-
sion thickness for any of the gas conditions tested, except for the five-compound
mixture, for which increasing the RH from 50% to 70% and even to 80%
resulted in a reduction in the corrosion thickness.
c. Operating data centers with only the three pervasive compounds present
and at RH levels as high as 70% at 21°C (69.8°F) is acceptable for copper
and silver corrosion control.
3. Regarding the effects of temperature on copper and silver corrosion, RP-1755
(Zhang et al. 2019) found the following:
a. For copper, increasing the temperature from 21°C to 28°C (69.8°F to
82.4°F) while keeping the RH at the reference condition (50% rh) dramat-
ically reduced corrosion thickness for all mixture conditions tested. This
was unexpected, but a repeat test confirmed the observation. It is likely that
at a higher temperature less moisture is adsorbed on the coupon surface, so a
much lower amount of pollutants may be adsorbed or absorbed on the coupon
surface to cause corrosion.
b. For silver, significant corrosion thickness was still detected at 28°C
(82.4°F) and 50% rh for the H2S-containing mixture conditions. The
elevated temperature had no significant impact on silver corrosion when
H2S was not present.
c. For data center environments where Cl2 and H2S are not present, tempera-
tures as high as 28°C (82.4°F) are acceptable for corrosion control.
4. Regarding the effects of voltage bias (electrical current) on copper and silver
corrosion, RP-1755 (Zhang et al. 2019) found the following:
a. Results from scanning electron microscopy (SEM) and energy dispersive
x-ray spectroscopy (EDS) analysis show that the voltage bias on the
printed circuit boards (PCBs) significantly reduced the corrosion at 80% rh
but slightly increased the corrosion at 50% rh.
b. Further testing and analysis are necessary to determine the combined
effects of voltage bias and RH on copper and silver corrosion.
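
As noted in conclusion 1a, copper corrosion growth was described well by a logarithmic function of exposure time and silver by a linear function. The following minimal sketch fits both models to coupon data and screens the 30-day values against the 300 Å (copper) and 200 Å (silver) limits cited earlier in this appendix; the thickness values are invented for illustration and are not RP-1755 results.

```python
# Illustration (not the RP-1755 analysis) of fitting the two growth models named
# in conclusion 1a: logarithmic growth for copper and linear growth for silver.
# The thickness values below are hypothetical.
import numpy as np

days = np.array([1, 3, 7, 14, 21, 30], dtype=float)
copper_angstrom = np.array([60, 110, 150, 185, 205, 225], dtype=float)  # hypothetical
silver_angstrom = np.array([8, 20, 45, 85, 125, 175], dtype=float)      # hypothetical

# Copper: thickness ~ a + b*ln(t); silver: thickness ~ c*t + d
b, a = np.polyfit(np.log(days), copper_angstrom, 1)
c, d = np.polyfit(days, silver_angstrom, 1)

print(f"Copper fit: thickness ~ {a:.1f} + {b:.1f}*ln(days)")
print(f"Silver fit: thickness ~ {d:.1f} + {c:.1f}*days")

# Screening against the 30-day limits cited earlier (300 A copper, 200 A silver)
print("Copper within 300 A at 30 days:", a + b * np.log(30) <= 300)
print("Silver within 200 A at 30 days:", d + c * 30 <= 200)
```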
Appendix F



Psychrometric Charts
The psychrometric charts in this appendix graphically depict (in both SI and
I-P units) the envelopes of the allowable and recommended conditions shown in
tabular form in Tables 2.1 and 2.2 of Chapter 2. These charts would be useful to a
manufacturer trying to determine the appropriate environmental class for a new
information technology (IT) product.
Figures F.1 and F.2 show the recommended and allowable envelopes for Classes
A1, A2, A3, and A4. The recommended envelopes are shown for both low and high
levels of gaseous pollutants.
Figures F.3 and F.4 show the recommended and allowable envelopes for
Class H1. The recommended envelopes are shown for both low and high levels of
gaseous pollutants.

Figure F.1 Classes A1–A4 allowable and recommended operating conditions for (a) low level of pollutants and (b) high level of pollutants (SI units).

Figure F.2 Classes A1–A4 allowable and recommended operating conditions for (a) low level of pollutants and (b) high level of pollutants (I-P units).

Figure F.3 Class H1 allowable and recommended operating conditions for (a) low level of pollutants and (b) high level of pollutants (SI units).

Figure F.4 Class H1 allowable and recommended operating conditions for (a) low level of pollutants and (b) high level of pollutants (I-P units).
Appendix G



Altitude Derating Curves
Figure G.1 shows the altitude derating for the 2021 thermal guidelines
described in the footnotes of Tables 2.1 and 2.2 in Chapter 2. As shown in this graph,
the derating curves for Classes A1 and A2 are the same (parallel lines), while the
curves for Classes A3, A4, and H1 are slightly different. As explained in Chapter 2,
this modification provides operational relief to server energy demands.

Figure G.1 Classes A1 to A4 temperature versus altitude.


Appendix H

Practical Example of the
Impact of Compressorless Cooling
on Hardware Failure Rates
Appendix H analyzes the impact of expanded temperature and compressorless
economization on hardware failure rates. The discussion of the analysis is not meant
to imply a specific data center environmental control algorithm. The method and
approach was chosen to facilitate analysis of the data in a simple manner that illus-
trates key findings.
To understand how a compressorless economized data center implementation
would impact hardware failure rates, consider the city of Chicago. When the annual
time-at-temperature climate data for Chicago is plotted as a histogram (Figure H.1),
one can see the vast majority of the hours in an average year are spent at cool and
cold temperatures (below 20°C [68°F]). Although Chicago does become hot in the
summer, those hot periods do not last long and are only a very small percentage of
the hours in a given year.
With an air-side economizer, the data center fans will do some work on the
incoming air and will raise its temperature by about 1.5°C (2.7°F) going from
outside the data center to the inlet of the information technology equipment (ITE).
Also, most data centers with economizers have a means of air mixing to maintain a
minimum data center temperature in the range of 15°C to 20°C (59°F to 68°F), even
in the winter. Applying these assumptions to the Chicago climate data, the histogram
transforms into the one shown in Figure H.2.
Taking the histogram data in Figure H.2 and calculating the percentage of time
spent in each temperature band, one can create a simple time-at-temperature
weighted average of the equipment failure rate, as shown in Table H.1.
In Table H.1, the values in the columns labeled x-factor are the relative failure
rates for the given temperature bins averaged from the values in Table 2.6 of
Chapter 2. As temperature increases, the ITE failure rate also increases. For an air-
side economizer, the net time-weighted average relative failure rate (x-factor) for a
data center in Chicago is 0.970, which is very close to the value of 1.00 for a data
center that is tightly controlled and continuously run at a temperature of 20°C (68°F).
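
The net x-factor in Table H.1 is simply the time-at-temperature weighted average of the bin x-factors. A minimal sketch of that calculation, using the Chicago bin fractions and x-factors from Table H.1, follows.

```python
# Time-at-temperature weighted x-factor, reproducing the Chicago row of Table H.1.
# Bin fractions are the share of annual hours in each ITE inlet temperature bin.
chicago_bins = [
    # (fraction of hours, x-factor for the bin)
    (0.7245, 0.865),  # 15°C <= T <= 20°C
    (0.1463, 1.130),  # 20°C <  T <= 25°C
    (0.0947, 1.335),  # 25°C <  T <= 30°C
    (0.0345, 1.482),  # 30°C <  T <= 35°C
]

net_x_factor = sum(frac * xf for frac, xf in chicago_bins)
print(f"Net x-factor: {net_x_factor:.3f}")  # ~0.970, matching Table H.1
```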
Even though the failure rate of the hardware increases with temperature, the data
center spends so much time at cool temperatures in the range of 15°C to 20°C (59°F
to 68°F) (where the failure rate is slightly below that for 20°C [68°F] continuous
operation) that the net reliability of the ITE in the data center over a year is very
comparable to the ITE in a data center that is run continuously at 20°C (68°F).
Note that in a data center with an economizer, the hardware failure rate will tend
to be slightly higher during warm periods of the summer, slightly lower during cool
winter months, and about average during fall and spring.

Figure H.1 Histogram of dry-bulb temperatures for Chicago.

Figure H.2 Dry-bulb temperatures for Chicago with economization assumptions that include reuse of ITE exhaust heat to maintain a minimum 15°C to 20°C (59°F to 68°F) temperature and a 1.5°C (2.7°F) temperature rise from outdoor air to server inlet.

Table H.1 Time-Weighted Failure Rate x-Factor Calculations for Air-Side Economization for ITE in Chicago

Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
% Bin Hours and Associated x-Factors at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Chicago, IL 72.45% 0.865 14.63% 1.130 9.47% 1.335 3.45% 1.482 0.970
Appendix I

ITE Reliability Data for 
Selected Major 
U.S. and Global Cities
In general, to make a data center failure rate projection, an accurate histogram
of the time-at-temperature for the given location is needed, and the appropriate air
temperature rise from the type of economizer being used should be considered as
well as the data center environmental control algorithm. For simplicity in the anal-
ysis conducted for this book, the impact of economization on the reliability of data
center hardware is shown here with three key assumptions:

• A minimum data center temperature of 15°C to 20°C (59°F to 68°F) can be
maintained.
• A maximum data center temperature below the maximum of the associated
environmental class can be maintained through mechanical cooling.
• The data center temperature tracks with the outdoor temperature, with the
addition of a temperature rise that is appropriate to the type of economization
being used.

The method of data analysis in this appendix is not meant to imply or recom-
mend a specific algorithm for data center environmental control. A detailed treatise
on economizer approach temperatures is beyond the scope of this book. The intent
here is to demonstrate the methodology applied and provide general guidance. An
engineer well versed in economizer designs should be consulted for exact tempera-
ture rises for a specific economizer type in a specific geographic location.
For air-side economizers, a reasonable assumption for the data center supply air
temperature rise above the outdoor ambient dry-bulb temperature is 1.5°C (2.7°F).
For water-side economizers, the temperature of the cooling water loop is primarily
dependent on the wet-bulb temperature of the outdoor air.
With Chicago as an example, data from Weather Data Viewer (ASHRAE
2009b) can be used to determine the number of hours during a year when compres-
sorless cooling can be used, based on an assumed approach temperature between the
wet-bulb temperature and the supply air temperature.
In the analysis done for this appendix, a reasonable assumption of 9°C (16.2°F)
was used for the combination of approaches for the cooling tower, heat exchanger(s),
and cooling coil in the air handler. For water-side economization with a dry-cooler-
type tower (closed loop, no evaporation), a 12°C (21.6°F) air temperature rise of the
data center air above the outdoor ambient air temperature is assumed. The figures
and tables in this appendix were based upon the above assumptions.
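
A minimal sketch of the histogram transformation behind these projections follows: hourly outdoor dry-bulb temperatures are shifted by the assumed economizer rise, clamped at the assumed minimum supply temperature, and tallied into 5°C bins. The hourly values, the function name, and the lower-edge bin labeling are illustrative assumptions; the tables in this appendix use upper-inclusive bin boundaries and full-year weather data.

```python
# Sketch of the binning step behind Tables I.1-I.6: shift outdoor dry-bulb hours
# by the economizer rise, clamp to the minimum data center supply temperature,
# and tally the fraction of annual hours in each 5°C inlet-temperature bin.
from collections import Counter

def inlet_temperature_bins(outdoor_drybulb_c, rise_c=1.5, min_supply_c=15.0):
    """Return {bin_lower_edge_C: fraction_of_hours} for 5°C-wide bins
    (bins here are labeled by their lower edge, for simplicity)."""
    counts = Counter()
    for t_out in outdoor_drybulb_c:
        t_inlet = max(t_out + rise_c, min_supply_c)  # economizer rise + winter mixing floor
        counts[5 * int(t_inlet // 5)] += 1
    total = len(outdoor_drybulb_c)
    return {edge: n / total for edge, n in sorted(counts.items())}

# Hypothetical hourly outdoor temperatures (°C); real use would read 8760 values
# from a weather data set such as Weather Data Viewer.
sample_hours = [-10, -2, 4, 9, 14, 18, 22, 26, 31, 16, 8, 1]
print(inlet_temperature_bins(sample_hours))                # air-side economizer, 1.5°C rise
print(inlet_temperature_bins(sample_hours, rise_c=12.0))   # dry-cooler water-side, 12°C rise
```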
Time-at-temperature weighted average failure rate projections are shown in
Figures I.1 through I.6 for selected U.S. and global cities and for different economizer
scenarios. The calculations for those graphs, including the percentage of hours spent
within each temperature range for each city and the reliability data as a function of
temperature, can be found in the corresponding Tables I.1 through I.6 and are based
on Weather Data Viewer (ASHRAE 2009b) software.
It is important to be clear regarding what the relative failure rate values mean.
The results have been normalized to a data center run continuously at 20°C (68°F),
which has a relative failure rate of 1.00. For those cities with values below 1.00, the
assumption is that the economizer still functions and the data center is cooled to
between 15°C and 20°C (59°F and 68°F) for those hours each year.
In addition, the relative failure rate shows the expected increase in the number
of failed information technology equipment (ITE) products, not the percentage of
total ITE products failing; for example, if a data center that experiences four failures
per 1000 ITE products incorporates warmer temperatures, and the relative failure
rate is 1.20, then the expected failure rate would be 5 failures per 1000 ITE products.
For the majority of U.S. and European cities, the air-side and water-side
economizer projections show failure rates that are very comparable to a tradi-
tional data center run at a steady-state temperature of 20°C (68°F). For a water-
side economizer with a dry-cooler-type tower, the failure rate projections for most
U.S. and global cities are 10% to 40% higher than the 20°C (68°F) steady-state
baseline.
For reference, each of Figures I.1 through I.6 includes three lines showing fail-
ure rate projections for continuous (7×24×365) operation at 20°C, 30°C, and 35°C
(68°F, 86°F, and 95°F). Even though economized, compressorless facilities reach
temperatures of 30°C (86°F) and higher, their failure rate projections are still far
below the failure rates one would expect from continuous, high-temperature, steady-
state operation.

I.1 NOTES ON FIGURES AND TABLES

1. The weather data being considered for both the net x-factor calculation and
hours per year of chiller operation are based only on temperature and not on
humidity.
a. The impact of humidity on the net x-factor calculation is currently under
development and needs to be considered based on the local climate.
b. The impact of humidity on hours per year of chiller operation varies based
on excursion type and humidity management techniques and needs to be
considered based on the local climate.
2. U.S. cities marked with an asterisk on the figures in this appendix are located
in the part of the country where ANSI/ASHRAE/IES Standard 90.1 (ASHRAE
2013) does not mandate economization. Most of these cities lie in a region of
the U.S. that is both warm and humid.
3. The number of hours per year of chiller operation required in the cities analyzed
in Figures I.1 through I.6 is shown in Figures I.7 through I.12. A data center
facility located in a climate that requires zero hours of chiller operation per year
could be built without a chiller.

4. For a majority of U.S. and European cities, and even some Asian cities, it is
possible to build economized data centers that rely almost entirely on the local
climate for their cooling needs. However, the availability of Class A3 and A4
capable ITE significantly increases the number of U.S. and global locations
where compressorless facilities could be built and operated. The use of air- and
water-side economization (versus dry-cooler-type water-side economization)
also increases the number of available locations for compressorless facilities.

Figure I.1 Failure rate projections for air-side economizer for selected
U.S. cities.

Figure I.2 Failure rate projections for water-side economizer for selected
U.S. cities.
Table I.1 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Air-Side Economization for
Selected Major U.S. Cities Assuming 1.5°C (2.7°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
San Francisco, CA 87.37% 0.865 10.67% 1.130 1.60% 1.335 0.36% 1.482 0.903
Seattle, WA 88.00% 0.865 8.39% 1.130 3.04% 1.335 0.57% 1.482 0.905
Helena, MT 83.55% 0.865 8.52% 1.130 5.09% 1.335 2.84% 1.482 0.929
Madison, WI 76.05% 0.865 13.62% 1.130 8.12% 1.335 2.21% 1.482 0.953
Boston, MA 75.22% 0.865 15.15% 1.130 7.23% 1.335 2.40% 1.482 0.954
Denver, CO 75.91% 0.865 11.83% 1.130 7.15% 1.335 5.11% 1.482 0.961
Los Angeles, CA 67.13% 0.865 28.22% 1.130 4.23% 1.335 0.42% 1.482 0.962
Chicago, IL 72.45% 0.865 14.63% 1.130 9.47% 1.335 3.45% 1.482 0.970
Washington, DC 61.62% 0.865 17.47% 1.130 14.57% 1.335 6.34% 1.482 1.019
Atlanta, GA 52.41% 0.865 23.19% 1.130 16.75% 1.335 7.65% 1.482 1.052
Dallas, TX 45.48% 0.865 18.45% 1.130 19.72% 1.335 16.35% 1.482 1.108
Houston, TX 36.27% 0.865 22.98% 1.130 25.60% 1.335 15.15% 1.482 1.140
Phoenix, AZ 34.13% 0.865 14.89% 1.130 15.62% 1.335 35.36% 1.482 1.196
Miami, FL 9.48% 0.865 24.45% 1.130 48.62% 1.335 17.45% 1.482 1.266
Table I.2 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Economization for
Selected Major U.S. Cities Assuming 9°C (16.2°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Helena, MT 80.69% 0.865 16.49% 1.130 2.81% 1.335 0.01% 1.482 0.922
Denver, CO 72.50% 0.865 22.24% 1.130 5.26% 1.335 0.00% 1.482 0.949
Seattle, WA 64.41% 0.865 30.06% 1.130 5.48% 1.335 0.05% 1.482 0.971
Madison, WI 62.54% 0.865 16.01% 1.130 15.78% 1.335 5.67% 1.482 1.017
Boston, MA 59.42% 0.865 17.90% 1.130 16.98% 1.335 5.70% 1.482 1.027
San Francisco, CA 41.62% 0.865 52.99% 1.130 5.38% 1.335 0.01% 1.482 1.031
Chicago, IL 59.59% 0.865 16.16% 1.130 16.80% 1.335 7.45% 1.482 1.033
Washington DC 48.73% 0.865 15.85% 1.130 19.29% 1.335 16.13% 1.482 1.097
Phoenix, AZ 35.94% 0.865 30.06% 1.130 20.13% 1.335 13.87% 1.482 1.125
Los Angeles, CA 20.92% 0.865 46.95% 1.130 31.50% 1.335 0.62% 1.482 1.141
Atlanta, GA 37.79% 0.865 18.17% 1.130 23.69% 1.335 20.36% 1.482 1.150
Dallas, TX 33.72% 0.865 16.09% 1.130 20.84% 1.335 29.35% 1.482 1.187
Houston, TX 22.14% 0.865 14.95% 1.130 21.60% 1.335 41.31% 1.482 1.261
Miami, FL 2.98% 0.865 8.58% 1.130 27.52% 1.335 60.93% 1.482 1.393

Figure I.3 Failure rate projections for water-side economizer with dry-cooler-type tower for selected U.S. cities.

Figure I.4 Failure rate projections for air-side economizer for selected
global cities.
Table I.3 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Dry-Cooler-Type
Tower Economization for Selected Major U.S. Cities Assuming 12°C (21.6°F) Temperature Rise
between Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization with Dry-Cooler-Type Tower
% Bin Hours and Associated x-Factors for U.S. Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Helena, MT 53.32% 0.865 15.61% 1.130 13.48% 1.335 17.59% 1.482 1.078
Madison, WI 48.26% 0.865 12.46% 1.130 13.79% 1.335 25.49% 1.482 1.120
Seattle, WA 33.56% 0.865 30.79% 1.130 22.16% 1.335 13.48% 1.482 1.134
Denver, CO 44.26% 0.865 14.85% 1.130 15.30% 1.335 25.59% 1.482 1.134
Chicago, IL 44.31% 0.865 12.83% 1.130 13.74% 1.335 29.13% 1.482 1.143
Boston, MA 41.16% 0.865 16.23% 1.130 15.95% 1.335 26.66% 1.482 1.147
Washington DC 29.94% 0.865 15.13% 1.130 14.89% 1.335 40.04% 1.482 1.222
San Francisco, CA 6.42% 0.865 38.41% 1.130 40.38% 1.335 14.79% 1.482 1.248
Atlanta, GA 18.89% 0.865 14.56% 1.130 17.00% 1.335 49.55% 1.482 1.289
Dallas, TX 15.96% 0.865 13.08% 1.130 14.69% 1.335 56.27% 1.482 1.316
Los Angeles, CA 1.01% 0.865 15.41% 1.130 45.39% 1.335 38.19% 1.482 1.355
Houston, TX 9.23% 0.865 10.99% 1.130 14.24% 1.335 65.54% 1.482 1.365
Phoenix, AZ 3.93% 0.865 12.14% 1.130 16.35% 1.335 67.58% 1.482 1.391
Miami, FL 0.30% 0.865 1.91% 1.130 6.15% 1.335 91.64% 1.482 1.464
Table I.4 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Air-Side Economization for
Selected Major Global Cities Assuming 1.5°C (2.7°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Air-Side Economization
% Bin Hours and Associated x-Factors for Global Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Oslo 91.78% 0.865 6.43% 1.130 1.68% 1.335 0.11% 1.482 0.891
London 88.41% 0.865 8.92% 1.130 2.32% 1.335 0.34% 1.482 0.902
Frankfurt 84.90% 0.865 9.91% 1.130 4.13% 1.335 1.06% 1.482 0.917
Milan 68.27% 0.865 17.06% 1.130 10.65% 1.335 4.03% 1.482 0.985
Rome 63.43% 0.865 20.72% 1.130 13.33% 1.335 2.51% 1.482 0.998
Sydney 53.31% 0.865 35.26% 1.130 9.76% 1.335 1.68% 1.482 1.015
Tokyo 58.52% 0.865 20.11% 1.130 15.87% 1.335 5.50% 1.482 1.027
Bangalore 7.33% 0.865 47.11% 1.130 34.56% 1.335 11.00% 1.482 1.220
Hong Kong 19.95% 0.865 22.97% 1.130 34.31% 1.335 22.77% 1.482 1.228
Singapore 0.00% 0.865 1.21% 1.130 67.65% 1.335 31.14% 1.482 1.378
Mexico City 64.52% 0.865 25.37% 1.130 9.58% 1.335 0.54% 1.482 0.981
Sao Paulo 38.03% 0.865 40.67% 1.130 17.18% 1.335 4.12% 1.482 1.079
San Jose, CR 6.10% 0.865 60.88% 1.130 29.15% 1.335 3.86% 1.482 1.187

Figure I.5 Failure rate projections for water-side economizer for selected
global cities.

Figure I.6 Failure rate projections for water-side economizer with dry-cooler-type tower for selected global cities.
Table I.5 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Economization for
Selected Major Global Cities Assuming 9°C (16.2°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization
% Bin Hours and Associated x-Factors for Global Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Oslo 78.20% 0.865 18.55% 1.130 3.25% 1.335 0.01% 1.482 0.929
London 64.83% 0.865 28.82% 1.130 6.32% 1.335 0.03% 1.482 0.971
Frankfurt 66.45% 0.865 24.51% 1.130 8.86% 1.335 0.17% 1.482 0.973
Milan 50.32% 0.865 20.82% 1.130 21.70% 1.335 7.17% 1.482 1.066
Rome 38.46% 0.865 26.32% 1.130 24.48% 1.335 10.74% 1.482 1.116
Tokyo 45.10% 0.865 18.60% 1.130 19.61% 1.335 16.69% 1.482 1.109
Sydney 26.49% 0.865 37.34% 1.130 32.59% 1.335 3.58% 1.482 1.139
Bangalore 0.14% 0.865 14.59% 1.130 71.62% 1.335 13.64% 1.482 1.324
Hong Kong 7.22% 0.865 18.05% 1.130 23.48% 1.335 51.25% 1.482 1.339
Singapore 0.00% 0.865 0.00% 1.130 0.02% 1.335 99.98% 1.482 1.482
Mexico City 45.86% 0.865 53.56% 1.130 0.58% 1.335 0.00% 1.482 1.010
Sao Paulo 4.68% 0.865 34.97% 1.130 54.32% 1.335 6.03% 1.482 1.250
San Jose, CR 0.01% 0.865 7.12% 1.130 78.47% 1.335 14.40% 1.482 1.342
Table I.6 Time-Weighted Failure Rate x-Factor Calculations for Class A2 for Water-Side Dry-Cooler-Type
Tower Economization for Selected Major Global Cities Assuming 12°C (21.6°F) Temperature Rise between
Outdoor Ambient Temperature and ITE Inlet Air Temperature

Time-at-Temperature Weighted Failure Rate Calculation for Water-Side Economization with Dry-Cooler-Type Tower
% Bin Hours and Associated x-Factors for Global Cities at Various Temperature Bins

Location    15°C ≤ T ≤ 20°C       20°C < T ≤ 25°C       25°C < T ≤ 30°C       30°C < T ≤ 35°C       Net
            (59°F ≤ T ≤ 68°F)     (68°F < T ≤ 77°F)     (77°F < T ≤ 86°F)     (86°F < T ≤ 95°F)     x-Factor
            % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor  % of Hours  x-Factor
Oslo 56.72% 0.865 18.01% 1.130 16.66% 1.335 8.60% 1.482 1.044
Frankfurt 42.53% 0.865 21.55% 1.130 19.40% 1.335 16.52% 1.482 1.115
London 32.43% 0.865 30.63% 1.130 23.79% 1.335 13.15% 1.482 1.139
Milan 32.39% 0.865 17.26% 1.130 17.55% 1.335 32.80% 1.482 1.196
Rome 16.40% 0.865 22.61% 1.130 23.76% 1.335 37.23% 1.482 1.266
Tokyo 20.82% 0.865 17.79% 1.130 17.84% 1.335 43.55% 1.482 1.265
Sydney 2.73% 0.865 15.74% 1.130 32.88% 1.335 48.65% 1.482 1.361
Hong Kong 0.27% 0.865 3.70% 1.130 15.78% 1.335 80.26% 1.482 1.444
Bangalore 0.00% 0.865 0.03% 1.130 5.66% 1.335 94.31% 1.482 1.474
Singapore 0.00% 0.865 0.00% 1.130 0.00% 1.335 100.00% 1.482 1.482
Mexico City 4.72% 0.865 19.39% 1.130 39.87% 1.335 36.02% 1.482 1.326
Sao Paulo 0.29% 0.865 6.46% 1.130 30.25% 1.335 63.00% 1.482 1.413
San Jose, CR 0.00% 0.865 0.01% 1.130 4.84% 1.335 95.15% 1.482 1.475

Figure I.7 Number of hours per year of chiller operation required for
air-side economizer for selected U.S. cities.

Figure I.8 Number of hours per year of chiller operation required for
water-side economizer for selected U.S. cities.

Figure I.9 Number of hours per year of chiller operation required for
water-side dry-cooler economizer for selected U.S. cities.

Figure I.10 Number of hours per year of chiller operation required for
air-side economizer for selected global cities.

Figure I.11 Number of hours per year of chiller operation required for
water-side economizer for selected global cities.

Figure I.12 Number of hours per year of chiller operation required for
water-side dry-cooler economizer for selected global cities.
Appendix J

OSHA and 
Personnel Working in 
High Air Temperatures
As data center cold-aisle air temperatures have significantly increased due to the
increased ASHRAE recommended rack inlet air temperatures, so too have the hot-
aisle temperatures. As a result, many data center owners, operators, and IT manu-
facturers are concerned about personnel that work in these elevated temperature
environments. The 2011 Thermal Guidelines Classes A3 and A4 allowed for infor-
mation technology equipment (ITE) inlet air temperatures up to 40°C and 45°C
(104°F and 113°F), respectively, which can result in hot-aisle temperatures that
exceed 50°C (122°F). These temperatures are much higher than traditional cold- and
hot-aisle temperatures and can pose a significant health hazard to personnel who
work in these environments.
The U.S. Department of Labor's Occupational Safety and Health Administra-
tion (OSHA) and the European Union's Agency for Safety and Health at Work
(EU-OSHA) determine the minimum worker safety standards for the United States
and the European Union, respectively. As of January 2012, neither organization had
any particular regulations specifying allowable temperature ranges for working envi-
ronments. Instead, the United Kingdom's Health and Safety Executive (HSE) recommends
that workroom temperatures should provide “reasonable” comfort levels:
The temperature in workrooms should provide reasonable comfort without the
need for special clothing. Where such a temperature is impractical because of
hot or cold processes, all reasonable steps should be taken to achieve a tempera-
ture which is as close as possible to comfortable. “Workroom” means a room
where people normally work for more than short periods. (HSE 1992)
Although OSHA does not have a particular regulation or standard that covers
high-temperature environments, the General Duty Clause, Section 5(a)(1) of the
Occupational Safety and Health Act of 1970 (OSHA 2019), requires each employer
to “furnish to each of his employees employment and a place of employment which
are free from recognized hazards that are causing or are likely to cause death or seri-
ous physical harm.” OSHA has interpreted this rule such that employers shall
provide means and methods that will reduce the likelihood of worker heat stress.
These means or methods may include issuing personal protective equipment (PPE),
minimizing exposure through frequent breaks, frequent hydration, and developing
a heat stress program. There are various manufacturers that produce PPE for hot
working environments.
NIOSH (2016) and OSHA (2019) state that employers should develop a written
health and safety policy outlining how workers in hot environments will be protected
from heat stress. As a minimum, the following steps should be taken and addressed:

• Adjust work practices as necessary when workers complain of heat stress.


• Make controlling exposures through engineering controls the primary means
of control wherever possible.
• Oversee heat stress training and acclimatization for new workers, workers
who have been off the job for a while, and workers with medical conditions.
• Provide worker education and training, including periodic safety talks on heat
stress during hot weather or during work in hot environments.
• Monitor the workplace to determine when hot conditions arise.
• Determine whether workers are drinking enough water.
• Determine a proper work/rest regime for workers.
• Arrange first-aid training for workers.

Additionally, OSHA provides information developed by the American Confer-
ence of Governmental Industrial Hygienists (ACGIH) on heat exposure threshold
limits (ACGIH 2017), as shown in Table J.1. It is important to note that the infor-
mation shown in this table is recommended by the OSHA Technical Manual (OSHA
2017) and is not part of a standard or regulation.
ACGIH’s screening criteria for TLVs and action limits for heat stress (see Table
J.1) are an initial screening tool to evaluate whether a heat stress situation may exist
based on wet-bulb globe temperature (WBGT), workload, and work/rest regimen.
WBGT is a weighted average of dry-bulb, wet-bulb, and globe temperatures and
incorporates the effects of all four environmental heat determinants (air temperature,
relative humidity, air movement, and radiant heat). WBGT has been the preferred
environmental heat metric for heat-related illness prevention in workplaces. OSHA
recognizes that measuring WBGT at a work site provides the most accurate infor-
mation about workers’ heat exposure (OSHA 2020).
Table J.1 shows that even the highest recommended environmental working
temperatures are well below the Class A3 cold-aisle temperature of 40°C (104°F).
This means that data center owners and operators need to be cognizant of tempera-
tures, workload levels, and worker safety in their data centers if the temperatures
exceed 25°C (77°F).

Table J.1 Screening Criteria for ACGIH TLVs® and
Action Limits for Heat Stress Exposure (ACGIH 2017)

                                     Workload
% Work          Light              Moderate           Heavy*             Very Heavy*
75% to 100%
(continuous)    31.0°C (87.8°F)    28.0°C (82.4°F)    N/A                N/A
50% to 75%      31.0°C (87.8°F)    29.0°C (84.2°F)    27.5°C (81.5°F)    N/A
25% to 50%      32.0°C (89.6°F)    30.0°C (86.0°F)    29.0°C (84.2°F)    28.0°C (82.4°F)
0% to 25%       32.5°C (90.5°F)    31.5°C (88.7°F)    30.5°C (86.9°F)    30.0°C (86.0°F)

* Criteria values are not provided for heavy or very heavy work for continuous and 25% rest because of the
extreme physical strain. Detailed job hazard analyses and physiological monitoring should be used for these
cases rather than these screening criteria.
It is important to note that although there are no particular laws or regulations
for the data center industry that prohibit working in 40°C (104°F) and above envi-
ronments, great care must be taken to ensure the safety of all personnel who may be
exposed to such temperatures and that appropriate safety and heat stress prevention
measures are implemented.
Appendix K

Allowable Server 
Inlet Temperature 
Rate of Change
The inlet air temperature change requirements of 5°C (9°F) in an hour (for tape
equipment) and 20°C (36°F) in an hour (for other types of IT equipment not includ-
ing tape) are not temperature rates of change. Figures K.1 through K.4 provide
examples of air inlet temperatures that are either compliant or noncompliant with the
temperature change requirements for data center rooms with and without tape-based
information technology equipment (ITE).
The control algorithms of many data center HVAC systems generate small
but rapid fluctuations in the cold air supply temperature, which can have a very
high rate of temperature change (see Figure K.5). These small changes are not a
problem for ITE functionality and reliability, because the time scale of the air inlet
temperature changes is typically too short for a large thermal mass, such as a
storage array, to respond to the changes (see Figure K.6).
A time lag of five minutes to respond to a change in air inlet temperature is
not an unusual amount of time for hard disk drives (HDDs) in a piece of ITE.
Small but rapid air temperature changes from the data center HVAC system
generally occur on a time scale much shorter than the time lag of the HDDs so that
the hard drives do not have a chance to respond to the rapid rates of temperature
change in the airstream. The extent of temperature change in the HDDs may also
be reduced by the cooling fan control algorithm of the equipment enclosure. Thus,
HDDs in ITE are significantly buffered from temperature changes and the rate of
temperature change of the air in the equipment inlet airstream. Other sub-
assemblies within the ITE (e.g., solid-state drives, option cards, power supplies)
are also somewhat buffered from data center air temperature changes. However,
the degree of buffering depends on each subassembly's thermal mass, cooling
airflow, and location within the ITE.
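One way to visualize this buffering is to model the component temperature, very roughly, as a first-order lag of the inlet air temperature. The Python sketch below only illustrates the principle described above; the five-minute time constant, the 10-second sampling interval, and the supply-air oscillation are assumed values, not measurements or requirements.

    import math

    def first_order_lag(inlet_temps_c, dt_s, tau_s):
        # Approximate a component's temperature as a first-order (exponential) lag
        # of the inlet air temperature, with time constant tau_s.
        alpha = 1.0 - math.exp(-dt_s / tau_s)
        out = [inlet_temps_c[0]]
        for t_air in inlet_temps_c[1:]:
            out.append(out[-1] + alpha * (t_air - out[-1]))
        return out

    # Supply air oscillating +/-2 deg C around 24 deg C with a 2 min period, sampled every 10 s.
    dt = 10.0
    inlet = [24.0 + 2.0 * math.sin(2.0 * math.pi * i * dt / 120.0) for i in range(360)]
    hdd = first_order_lag(inlet, dt, tau_s=300.0)  # assumed 5 min thermal time constant

    print(max(inlet) - min(inlet))                   # 4.0 deg C swing in the airstream
    print(round(max(hdd[180:]) - min(hdd[180:]), 2)) # roughly a quarter-degree swing at the drive

The 4°C swing in the airstream is attenuated to a small fraction of a degree at the simulated drive, consistent with the buffering behavior illustrated in Figure K.6.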
The intent of defining the inlet air temperature change requirements as 5°C (9°F) and
20°C (36°F) for tape and other types of ITE, respectively, is twofold: 1) to provide
data center facility-level requirements that will keep the critical internal components
and subassemblies of the ITE within the manufacturer's requirements, and 2) to
avoid costly and unnecessary data center HVAC system and facility upgrades that
might otherwise be needed to comply with the former rate-of-change-based requirement.
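For operators who log inlet air temperatures, the distinction between a windowed temperature change and an instantaneous rate can be checked programmatically. The Python sketch below scans a time-stamped log for the largest excursion (maximum minus minimum) within any 15-minute and any 60-minute window and applies the limits stated above; the function names and the example log are illustrative assumptions, not part of the guideline.

    def max_change_in_window(samples, window_s):
        # Largest (max - min) temperature excursion within any window of length window_s.
        # samples: list of (time_s, temp_c) tuples, sorted by time.
        worst = 0.0
        start = 0
        for end in range(len(samples)):
            while samples[end][0] - samples[start][0] > window_s:
                start += 1
            window = [t for _, t in samples[start:end + 1]]
            worst = max(worst, max(window) - min(window))
        return worst

    def compliant(samples, has_tape):
        # Rooms with tape: 5 deg C in an hour.
        # Rooms without tape: 20 deg C in an hour and 5 deg C in 15 minutes.
        if has_tape:
            return max_change_in_window(samples, 3600) <= 5.0
        return (max_change_in_window(samples, 3600) <= 20.0 and
                max_change_in_window(samples, 900) <= 5.0)

    # Example: a 6 deg C rise spread over 40 minutes, then steady (2 h of 1 min samples).
    log = [(i * 60, 22.0 + min(i, 40) * 0.15) for i in range(121)]
    print(compliant(log, has_tape=False))  # True: 6 deg C/h < 20 and about 2.3 deg C per 15 min < 5
    print(compliant(log, has_tape=True))   # False: 6 deg C in an hour exceeds the 5 deg C limit

The example shows how the same temperature history can be acceptable for rooms without tape yet noncompliant for rooms with tape equipment, even though no individual sample-to-sample change is large.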

Figure K.1 Examples of tape equipment inlet air temperature versus time
that are compliant with the 5°C (9°F) in an hour temperature
change requirement for data center rooms with tape equipment.


Figure K.2 Examples of tape equipment inlet air temperature versus time
that are noncompliant with the 5°C (9°F) in an hour temperature
change requirement for data center rooms with tape equipment.


Figure K.3 Examples of equipment inlet air temperature versus time that are
compliant with the 20°C (36°F) in an hour and the 5°C (9°F) in
15 minutes temperature change requirements for data center
rooms that contain other types of ITE not including tape.


Figure K.4 Examples of equipment inlet air temperature versus time that:
a) are noncompliant with the 20°C (36°F) in an hour
requirement, b) are noncompliant with the 5°C (9°F) in 15
minutes requirement, and c) are noncompliant with the 5°C (9°F)
in 15 minutes requirement but compliant with the 20°C (36°F) in
an hour requirement for data center rooms that contain other
types of ITE not including tape.

Figure K.5 Example of ITE air inlet temperature rate of change (°C/h)
calculated over 1 min, 5 min, 15 min, and 60 min time intervals.

Figure K.6 Example of time delay between inlet air temperature change to
storage array and the corresponding temperature change in
HDDs of the storage array.
Appendix L

Allowable Server Inlet RH Limits versus Maximum Inlet Dry-Bulb Temperature
In most information technology equipment (ITE) specifications, the allowable inlet
air relative humidity (RH) limits are not static but are instead a function of the inlet air
dry-bulb temperature. In other words, the RH specification is not simply the stated mini-
mum and maximum RH values—these values are usually modified by minimum and
maximum dew-point limits. Whether or not the dew-point limits affect the RH limits is
a function of the dry-bulb temperature of the inlet air. Dew-point limits are typically used
to reduce allowable high humidity values at high dry-bulb temperatures and to increase
the minimum allowable humidity value at low dry-bulb temperatures.
RH is the ratio, expressed as a percentage, of the partial pressure of water vapor to
the saturation pressure of water vapor at a given dry-bulb temperature. Thus, RH is
relative to a given temperature. If the temperature of a parcel of air changes, the RH
also changes even though the absolute amount of water present in the air remains
unchanged.
Dew point is a measure of the absolute water content of a given volume of air. It is
the temperature to which the air must be cooled, at constant pressure and moisture
content, for the water vapor to reach saturation (100% rh).
Consider Class A3 from the 2015 thermal guidelines. Class A3 is defined as a mois-
ture range of –12°C (10.4°F) dew point and 8% rh to 24°C (75.2°F) dew point and
85% rh. The 24°C (75.2°F) maximum dew-point limit restricts high RH values at higher
temperatures. The –12°C (10.4°F) minimum dew-point limit raises the minimum
allowable RH at lower temperatures. These effects are illustrated in Figure L.1.

Figure L.1 Class A3 climatogram illustrating how dew-point limits modify
RH specification limits.
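The effect shown in Figure L.1 can also be computed directly. The Python sketch below converts the Class A3 dew-point limits into the RH limits that actually govern at a given dry-bulb temperature. It uses the Magnus approximation for saturation vapor pressure, which is one common approximation (exact psychrometric formulations are given in ASHRAE Handbook—Fundamentals [ASHRAE 2017]); the function names and the example temperatures are illustrative.

    import math

    def p_ws_kpa(t_c):
        # Saturation vapor pressure (kPa) via the Magnus approximation (over water).
        return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

    def rh_from_dew_point(dew_point_c, dry_bulb_c):
        # RH (%) of air at dry_bulb_c whose dew point is dew_point_c.
        return 100.0 * p_ws_kpa(dew_point_c) / p_ws_kpa(dry_bulb_c)

    def class_a3_rh_limits(dry_bulb_c):
        # Effective Class A3 RH limits at a given dry-bulb temperature:
        # 8% to 85% rh, further restricted by the -12 deg C and 24 deg C dew points.
        rh_min = max(8.0, rh_from_dew_point(-12.0, dry_bulb_c))
        rh_max = min(85.0, rh_from_dew_point(24.0, dry_bulb_c))
        return rh_min, rh_max

    for t in (5, 18, 27, 40):
        lo, hi = class_a3_rh_limits(t)
        print(t, round(lo, 1), round(hi, 1))

Running the loop shows the minimum limit rising well above 8% rh at low dry-bulb temperatures and the maximum limit falling well below 85% rh at high dry-bulb temperatures, which is the behavior plotted in Figure L.1.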


Figure L.2 Climatogram of recommended ranges for Classes A1 to A4
(see Table 2.1 in Chapter 2 for more details): a) low levels of
pollutants and b) high levels of pollutants.

Figure L.3 Class A1 and A2 operation climatograms.

The purpose of applying dew-point limits to restrict RH values at high and low
temperatures is to minimize known reliability issues. For example, many types of corro-
sion are exponentially accelerated by RH and temperature. The maximum dew-point
limit helps reduce the risk of a corrosion-related failure by limiting the maximum RH
allowed at high temperatures. Similarly, damage to ITE from electrostatic discharge
(ESD) can be a problem at low RH levels. The minimum dew-point value serves to raise
the minimum RH limit at low temperatures to mitigate the risk of equipment damage from ESD.
Figures L.2 through L.7 present climatograms that graphically illustrate each of the
2021 thermal guideline classes and show how the application of dew-point restrictions
changes the RH limits.

Figure L.4 Class A3 and A4 operation climatograms.


Figure L.5 Class A1 through A4 power OFF climatogram.




Figure L.6 Climatogram of recommended ranges for Class H1 (see
Table 2.2 in Chapter 2 for more details): a) low levels of
pollutants and b) high levels of pollutants.

Figure L.7 Class H1 operation climatogram.





References and Bibliography
ACGIH. 2017. 2017 TLVs® and BEIs®. Based on the Documentation of the
Threshold Limit Values for Chemical Substances and Physical Agents and
Biological Exposure Indices. Cincinnati, Ohio: American Conference of Gov-
ernmental Industrial Hygienists.
ADA. 2010. Americans with Disabilities Act (ADA), 28 CFR Part 36.
AHRI. 2017. AHRI Standard 1360 (I-P), 2017 Standard for Performance Rating
of Computer and Data Processing Room Air Conditioners. Arlington, VA:
Air-Conditioning, Heating, and Refrigeration Institute.
ASHRAE. 2004. Thermal guidelines for data processing environments. Peachtree
Corners, GA: ASHRAE.
ASHRAE. 2008. Thermal guidelines for data processing environments, 2d ed.
ASHRAE Datacom Series, Book 1. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2009a. Design considerations for datacom equipment centers, 2d ed.
ASHRAE Datacom Series, Book 3. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2009b. Weather Data Viewer, Ver 4. Peachtree Corners, GA:
ASHRAE.
ASHRAE. 2012. Thermal guidelines for data processing environments, 3d ed.
ASHRAE Datacom Series, Book 1. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2013. ANSI/ASHRAE/IES Standard 90.1-2013, Energy standard for
buildings except low-rise residential buildings. Peachtree Corners, GA:
ASHRAE.
ASHRAE. 2014a. Liquid cooling guidelines for datacom equipment centers, 2d
ed. ASHRAE Datacom Series, Book 4. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2014b. Particulate and gaseous contamination in datacom environ-
ments, 2d ed. ASHRAE Datacom Series, Book 8. Peachtree Corners, GA:
ASHRAE.
ASHRAE. 2014c. PUETM: A comprehensive examination of the metric. ASHRAE
Datacom Series, Book 11. ASHRAE and The Green Grid. Peachtree Corners,
GA: ASHRAE.
ASHRAE. 2015a. ASHRAE Handbook—HVAC Applications. Peachtree Corners,
GA: ASHRAE.
ASHRAE. 2015b. Thermal guidelines for data processing environments, 4th ed.
ASHRAE Datacom Series, Book 1. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2016. IT equipment design impact on data center solutions. ASHRAE
Datacom Series, Book 13. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2017. Chapter 1, Psychrometrics. In ASHRAE Handbook—Fundamen-
tals. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2018a. ASHRAE position document on climate change. Peachtree
Corners, GA: ASHRAE. https://www.ashrae.org/file%20library/about/posi-
tion%20documents/ashrae-position-document-on-climate-change.pdf.
ASHRAE. 2018b. IT equipment power trends, 3d ed. ASHRAE Datacom Series,
Book 2. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2019. Advancing DCIM with IT equipment integration. ASHRAE
Datacom Series, Book 14. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2019. ANSI/ASHRAE Standard 90.4-2019, Energy Standard for Data
Centers. Peachtree Corners, GA: ASHRAE.
ASHRAE. 2020. Chapter 12, District heating and cooling. In ASHRAE Hand-
book—HVAC Systems and Equipment. Peachtree Corners, GA: ASHRAE.
ASTM. 2014. ASTM B827-05 (2014), Standard practice for conducting mixed
flowing gas (MFG) environmental tests. West Conshohocken, PA: ASTM
International.
ATIS. 2015. ATIS-0600336.2015, Design requirements for universal cabinets and
framework. Washington, DC: Alliance for Telecommunications Industry
Solutions (ATIS).
Atwood, D., and J.G. Miner. 2008. Reducing data center cost with an air econo-
mizer. Brief, Intel Corporation, Santa Clara, CA. www.intel.com/content/
dam/doc/technology-brief/data-center-efficiency-xeon-reducing-data-center-
cost-with-air-economizer-brief.pdf.
Augis, J.A., D.G. DeNure, M.J. LuValle, J.P. Mitchell, M.R. Pinnel, and T.L.
Welsher. 1989. A humidity threshold for conductive anodic filaments in
epoxy glass printed wiring board. Proceedings of 3rd International SAMPE
Electronics Conference, pp. 1023–30.
Comizzoli, R.B., R.P. Frankenthal, R.E. Lobnig, G.A. Peins, L.A. Psota-Kelty,
D.J. Siconolfi, and J.D. Sinclair. 1993. Corrosion of electronic materials and
devices by submicron atmospheric particles. The Electrochemical Society
Interface 2(3):26–34.
CFR. 1994. Standards for accessible design. Code of Federal Regulations. 28 CFR
Part 36, Section 4.3.3, Width. Washington, DC: U.S. Department of Justice,
ADA Standards for Accessible Design.
EIA. 1992. EIA-310, Revision D, Sept. 1, 1992, Racks, panels and associated
equipment. Englewood, CO: Electronic Industries Association.
EPA. 2018. ENERGY STAR Computer Servers Version 3.0 Final Specification –
September 17, 2018. www.energystar.gov/products/spec/enterprise_servers
_specification_version_3_0_pd.
EPA. 2019. ENERGY STAR program requirements for computer servers. Wash-
ington, DC: U.S. Department of Energy. www.energystar.gov/sites/default/
files/ENERGY%20STAR%20Version%203.0%20Computer%20Servers%20
Program%20Requirements.pdf.
ESDA. 2006a. ANSI/ESD STM97.1, ESD Association standard test method for
the protection of electrostatic discharge susceptible items—Floor materials
and footwear—Resistance measurement in combination with a person. Rome,
NY: Electrostatic Discharge Association.
ESDA. 2006b. ANSI/ESD STM97.2, ESD Association standard test method for
the protection of electrostatic discharge susceptible items—Floor materials
and footwear—Voltage measurement in combination with a person. Rome,
NY: Electrostatic Discharge Association.
ESDA. 2013. ANSI/ESD STM7.1-2013, ESD Association standard test method
for the protection of electrostatic discharge susceptible items—Floor Materi-
als—Resistive characterization of materials. Rome, NY: Electrostatic Dis-
charge Association.
ESDA. 2014a. ANSI/ESD S20.20-2014, ESD Association standard for the devel-
opment of an electrostatic discharge control program for protection of electri-
cal and electronic parts, assemblies and equipment (excluding electrically
initiated explosive devices). Rome, NY: Electrostatic Discharge Association.
ESDA. 2014b. ANSI/ESD STM9.1-2014, Footwear—Resistive Characteristics.
Rome, NY: Electrostatic Discharge Association.
ESDA. 2016. ANSI/ESD STM97.2, ESD Association standard test method for the
protection of electrostatic discharge susceptible items—Footwear/flooring
system—Voltage measurement in combination with a person. Rome, NY:
Electrostatic Discharge Association.
ESDA. 2019. ANSI/ESD S6.1-2019, Grounding. Rome, NY: Electrostatic Dis-
charge Association.
ETSI. 2014. ETSI EN 300 019-1-3 V2.3.2, Equipment engineering (EE); Environ-
mental conditions and environmental tests for telecommunications equip-
ment; Part 1-3: Classification of environmental conditions; Stationary use at
weatherprotected locations. Valbonne, France: European Telecommunica-
tions Standards Institute.
Europa. 2003. Directive 2003/10/EC on the minimum health and safety require-
ments regarding the exposure of workers to the risks arising from physical
agents (noise). Bilbao, Spain: European Agency for Safety and Health at
Work. http://osha.europa.eu/en/legislation/directives/exposure-to-physical
-hazards/osh-directives/82.
Hamilton, P., G. Brist, G. Barnes Jr., and J. Schrader. 2007. Humidity-dependent
loss in PCB substrates. Proceedings of the Technical Conference, IPC Expo/
APEX 2007, February 20–22, Los Angeles, CA.
Hinaga, S., M.Y. Koledintseva, J.L. Drewniak, A. Koul, and F. Zhou. 2010. Ther-
mal effects on PCB laminate material dielectric constant and dissipation fac-
tor. IEEE Symposium on Electromagnetic Compatibility, July 25–30, Fort
Lauderdale, FL.
HSE. 1992. Workplace health, safety and welfare; Workplace (health, safety and
welfare) regulations 1992; Approved code of practice. Liverpool, Merseyside,
England: Health and Safety Executive.
IEC. 2005. IEC 60950-1:2005, Information technology equipment - Safety -
Part 1: General requirements. Geneva, Switzerland: International Electro-
technical Commission.
IEC. 2008. IEC 61000-4-2:2008, Electromagnetic compatibility (EMC)—Part 4-2:
Testing and measurement techniques—Electrostatic discharge immunity test.
Geneva: International Electrotechnical Commission.
IEC. 2010. CISPR 24, Information technology equipment—Immunity characteris-
tics—Limits and methods of measurement. Geneva: International Electrotech-
nical Commission.
ISO. 2015. ISO 14644-1:2015, Cleanrooms and associated controlled environ-
ments—Part 1: Classification of air cleanliness by particle concentration.
Geneva: International Organization for Standardization.
ISO. 2017. ISO 9296:2017, Acoustics—Declared noise emission values of infor-
mation technology and telecommunications equipment. Geneva: International
Organization for Standardization.
ISO. 2018. ISO 7779:2018(en), Acoustics—Measurement of airborne noise emit-
ted by information technology and telecommunications equipment. Geneva:
International Organization for Standardization.
Khalifa, H., and R. Schmidt. 2014. An improved energy reuse metric. ASHRAE
TC 9.9 publication. http://tc0909.ashraetcs.org/documents.php.
Lund, H., P.A. Østergaard, M.A. Chang, and S. Werner. 2018. The status of 4th
generation district heating: Research and results. 4th International Conference
on Smart Energy Systems and 4th Generation District Heating, Aalborg, Den-
mark, November 13–14, 2018.
Manousakis, I., S. Sankar, G. McKnight, T.D. Nguyen, and R. Bianchini. 2016.
Environmental conditions and disk reliability in free-cooled datacenters.
FAST’16: Proceedings of the 14th Usenix Conference on File and Storage
Technologies, pp. 53–65.
NIOSH. 2016. Criteria for a recommended standard: Occupational exposure to
heat and hot environments. DHHS (NIOSH) Publication Number 2016-106.
Centers for Disease Control and Prevention, National Institute for Occupa-
tional Safety and Health. www.cdc.gov/niosh/docs/2016-106/pdfs/
2016-106.pdf?id=10.26616/NIOSHPUB2016106.
OSHA. n.d. Heat stress guide. Washington, DC: U.S. Department of Labor, Occu-
pational Safety and Health Administration. www.osha.gov/SLTC/emergency
preparedness/guides/heat.html.
OSHA. 1980. Noise control: A guide for workers and employers. Washington, DC:
U.S. Department of Labor, Occupational Safety and Health Administration,
Office of Information. www.nonoise.org/hearing/noisecon/noisecon.htm.
OSHA. 2017. Heat stress. In Section III, Chapter 4 of OSHA technical manual.
Directive #TED 01-00-015, U.S. Department of Labor, Occupational Safety
and Health Administration, Washington, D.C. www.osha.gov/dts/osta/otm/
otm_iii/otm_iii_4.html#.
OSHA. 2019. Occupational Safety and Health Act of 1970, General Duty Clause,
Section 5(a)(1). Washington, DC: U.S. Department of Labor, Occupational
Safety and Health Administration. www.safetyandhealthmagazine.com/articles
/19258-oshas-general-duty-clause.
OSHA. 2020. Heat. Washington, DC: U.S. Department of Labor, Occupational
Safety and Health Administration. https://www.osha.gov/SLTC/heatstress/
heat_hazardrecognition.html#environmental_heat.
Patterson, M.K. 2008. The effect of data center temperature on energy efficiency.
Proceedings of Itherm Conference.
Patterson, M.K., D. Atwood, and J.G. Miner. 2009. Evaluation of air-side econo-
mizer use in a compute-intensive data center. Presented at Interpack’09, July
19–23, San Francisco, CA.
Pommerenke, D., A. Talezadeh, X. Gao, F. Wan, A. Patnaik, M. Moradian-
pouchehrazi, and Y. Han. 2014. The effect of humidity on static electricity
induced reliability issues of ICT equipment in data centers. ASHRAE RP-
1499 Final Report. Peachtree Corners, GA: ASHRAE.
Rankin, B. n.d. Humans by altitude. Radical cartography. www.radicalcartography.net/
howhigh.html.
Sauter, K. 2001. Electrochemical migration testing results—Evaluating printed
circuit board design, manufacturing process and laminate material impacts on
CAF resistance. Proceedings of IPC Printed Circuits Expo.
SES&4GDH. 2018. The status of 4th generation district heating: Research and
results. 4th International Conference on Smart Energy Systems and 4th Gen-
eration District Heating, Aalborg, Denmark, November 13–14. www.4dh.eu/
images/HenrikLund_4DH_Conference_13November2018.pdf.
Sood, B. 2010. Effects of moisture content on permittivity and loss tangent of PCB
materials. Webinar, Center for Advanced Life Cycle Engineering (CALCE),
University of Maryland, College Park.
Statskontoret. 2004. Technical Standard 26:6, acoustical noise emission of infor-
mation technology equipment. Stockholm, Sweden: Statskontoret.
Telcordia. 2001. GR-3028-CORE, Thermal management in telecommunications
central offices. Telcordia Technologies Generic Requirements, Issue 1,
December 2001. Piscataway, NJ: Telcordia Technologies, Inc.
Telcordia. 2012. GR-63-CORE, NEBS requirements: Physical protection. Telcor-
dia Technologies Generic Requirements, Issue 4, April 2012. Piscataway, NJ:
Telcordia Technologies, Inc.
TGG. 2010. ERE: A metric for measuring the benefit of reuse energy from a data
center. White paper (WP) #29. Beaverton, Oregon: The Green Grid.
Turbini, L.J., and W.J. Ready. 2002. Conductive anodic filament failure: A materi-
als perspective. School of Materials Science and Engineering, Georgia Insti-
tute of Technology, Atlanta, GA. www.researchgate.net/publication/251
735859_Conductive_Anodic_Filament_Failure_A_Materials_Perspective.
Turbini, L.J., W.J. Ready, and B.A. Smith. 1997. Conductive anodic filament
(CAF) formation: A potential reliability problem for fine-line circuits. Elec-
tronics Packaging Manufacturing, IEEE Transactions 22(1): 80–84.
Van Bogart, W.C. 1995. Magnetic tape storage and handling: A guide for libraries
and archives. Oakdale, MN: National Media Laboratory. https://www.clir.org/
pubs/reports/pub54/.
Verdingovas, V., M.S. Jellesen, and R. Ambat. 2014. Impact of NaCl contamina-
tion and climatic conditions on the reliability of printed circuit board assem-
blies. IEEE Transactions on Device and Materials Reliability, 14(1):42–51.
Zhang, H., S. Shao, H. Xu, H. Zou, and C. Tian. 2014. Free cooling of data cen-
ters: A review. Renewable and Sustainable Energy Reviews 35:171–82.
Zhang, R., R. Schmidt, J. Gilbert, and J. Zhang. 2018. Effects of gaseous pollution
and thermal conditions on the corrosion rates of copper and silver in data cen-
tre environment: A literature review. 7th International Building Physics Con-
ference, IBPC2018, Proceedings. https://surface.syr.edu/cgi/viewcontent.cgi
?article=1280&context=ibpc.
Zhang, J., R. Zhang, R. Schmidt, J. Gilbert, and B. Guo. 2019. Impact of gaseous
contamination and high humidity on the reliable operation of information
technology equipment in data centers. ASHRAE Research Project 1755, Final
Report. Peachtree Corners, GA: ASHRAE.
Zhang, R., J. Zhang, R. Schmidt, J. Gilbert, and B. Guo. 2020. Effects of moisture
content, temperature and pollutant mixture on atmospheric corrosion of cop-
per and silver and implications for the environmental design of data centers
(RP-1755). Science and Technology for the Built Environment 26(4): 567–86.