Best Practices for Data Center Energy Efficiency_20120525
Sections

  • Challenging Conventional Wisdom: Game Changers
  • Workshop Learning Objectives
  • Data Center Energy
  • World Data Center Electricity Use - 2000 and 2005
  • How Much is 152B kWh?
  • Aggressive Programs Make a Difference
  • Projected Data Center Energy Use
  • The Rising Cost of Ownership
  • LBNL Feels the Pain!
  • Energy Efficiency Opportunities
  • Potential Benefits of Data Center Energy Efficiency
  • Electricity Use in Data Centers
  • Benchmarking for Energy Performance Improvement:
  • Energy Performance varies
  • Benchmarks Obtained by LBNL
  • PUE for India Data Centers
  • Indian Data Center Benchmarking Sources – Thanks to:
  • Energy Metrics and Benchmarking
  • Other Data Center Metrics:
  • Metrics & Benchmarking
  • IT Server Performance - Saving a Watt…
  • Moore’s Law
  • IT Equipment Age and Performance
  • Perform IT System Energy Assessments
  • Decommission Unused Servers
  • Cloud Computing
  • Storage Systems and Energy
  • IT System Efficiency Summary…
  • References and Resources
  • Use IT to Manage IT Energy
  • The Importance of Visualization
  • LBNL Wireless Sensor Installation
  • Real-time Temperature Visualization by Level
  • Displayed Under-floor Pressure Map…
  • Provided Real-time Feedback During Floor-tile Tuning
  • Determined Relative CRAC Cooling Energy Impact
  • Feedback Continues to Help: Note impact of IT cart!
  • Real-time PUE Display
  • PUE Calculation Diagram
  • Franchise Tax Board (FTB) Case Study
  • FTB Wireless Sensor Network
  • WSN Smart Software: learns about curtains
  • WSN Provided Effect on Cold-aisle Temperatures:
  • WSN Software = Dramatic Energy Reduction…
  • Cost-Benefit Analysis:
  • An Emerging Technology…
  • Intel Data Center HVAC:
  • Dashboards
  • Why Dashboards?
  • Another Dashboard Example…
  • Use IT to Manage IT: Summary
  • Equipment Environmental Specification
  • Purpose of the Ranges
  • 2011 ASHRAE Allowable Ranges
  • ASHRAE Liquid Cooling Guidelines
  • Summary Recommended Limits
  • Example Server Specification
  • Environmental Conditions: Summary
  • Air Management: The Early Days at LBNL
  • Air Management
  • Benefits of Hot- and Cold-aisles
  • Reduce Bypass and Recirculation
  • Maintain Raised-Floor Seals
  • Manage Blanking Panels
  • Reduce Airflow Restrictions & Congestion
  • Resolve Airflow Balancing
  • Results: Tune Floor Tiles
  • Optimally Locate CRAC/CRAHs
  • Typical Temperature Profile with Under-floor Supply
  • Next step: Air Distribution Return-Air Plenum
  • Return Air Plenum
  • Return-Air Plenum Connections
  • Isolate Hot Return
  • Other Isolation Options
  • Adding Air Curtains for Hot/Cold Isolation
  • Cold Aisle Airflow Containment Example
  • Fan Energy Savings
  • LBNL Air Management Demonstration
  • Isolating Hot and Cold Aisles
  • Efficient Alternatives to Under-floor Air Distribution
  • Example – Local Row-Based Cooling Units
  • Air Distribution – Rack-Mounted Heat Exchangers
  • Review: Airflow Management Basics
  • Air Management and Cooling Efficiency
  • HVAC Systems Overview
  • DX (or AC) units reject heat outside…
  • Computer Room Air Handling (CRAH) units using Chilled-Water
  • Select Efficient Chillers
  • Increase Temperature of Chiller Plant
  • Why Liquid Cooling?
  • In Rack Liquid Cooling
  • “Chill-off 2” Evaluation of Liquid Cooling Solutions
  • Use Free Cooling:
  • Outside-Air Economizers
  • Free Cooling – Outside Air Based
  • System Design Approach:
  • LBNL Example: Rear Door Cooling
  • Improve Humidity Control:
  • Cooling Takeaways…
  • Electrical system end use – Orange bars
  • Electrical Distribution
  • Electrical Systems – Points of Energy Inefficiency
  • UPS, Transformer, & PDU Efficiency
  • Measured UPS efficiency
  • LBNL/EPRI Measured Power Supply Efficiency
  • Emerging Technology: DC Distribution
  • Standby generation loss
  • Standby generator heater
  • Data center lighting
  • Motors and Drives
  • Key Electrical Takeaways
  • DOE Advanced Manufacturing Office (AMO - was ITP)
  • DOE DC Pro Tool Suite
  • Data Center Energy Practitioner (DCEP) Program
  • Energy Star
  • Energy Star Data Center Activities
  • Results at LBNL
  • More to Come

Best Practices for Data Center Energy Efficiency  Workshop

Data Center Dynamics, Mumbai, India May 25, 2012
Presented by:

Dale Sartor, P.E. Applications Team Leader, Building Technologies Lawrence Berkeley National Laboratory
(Version: 4/29/12)

1 | Lawrence Berkeley National Laboratory

eere.energy.gov

Download Presentation

This Presentation is Available for download at:
http://datacenterworkshop.lbl.gov/

2 | Lawrence Berkeley National Laboratory

eere.energy.gov

Agenda
• Introduction
• Performance metrics and benchmarking
• IT equipment and software efficiency
• Use IT to save IT (monitoring and dashboards)
• Data center environmental conditions
• Airflow management
• Cooling systems
• Electrical systems
• Resources
• Workshop summary

3 | Lawrence Berkeley National Laboratory

eere.energy.gov

Challenging Conventional Wisdom: Game Changers
Conventional approach:
• Data centers need to be cool and controlled to tight humidity ranges
• Data centers need raised floors for cold air distribution
• Data centers require highly redundant building infrastructure
Need a holistic approach: IT and Facilities partnership.
4 | Lawrence Berkeley National Laboratory    eere.energy.gov

Introduction
5 | Lawrence Berkeley National Laboratory    eere.energy.gov

Workshop Learning Objectives
• Provide background on data center efficiency
• Raise awareness of efficiency opportunities
• Develop common understanding between IT and Facility staff
• Review of data center efficiency resources
• Group interaction for common issues and solutions
6 | Lawrence Berkeley National Laboratory    eere.energy.gov

High Tech Buildings are Energy Hogs
7 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Energy
• Data centers are energy intensive facilities
  – 10 to 100 times more energy intensive than an office
  – Server racks now designed for more than 25 kW
  – Surging demand for data storage
  – 1.5% of US electricity consumption
  – Projected to double in the next 5 years
  – Power and cooling constraints in existing facilities
8 | Lawrence Berkeley National Laboratory    eere.energy.gov

World Data Center Electricity Use - 2000 and 2005
Source: Koomey 2008
9 | Lawrence Berkeley National Laboratory    eere.energy.gov

How Much is 152B kWh?
[Chart: 2005 final electricity consumption (billion kWh, 0-300 scale) of world data centers compared with Italy, South Africa, Mexico, Iran, Sweden, and Turkey]
Source for country data in 2005: International Energy Agency, World Energy Balances (2007 edition)
10 | Lawrence Berkeley National Laboratory    eere.energy.gov

Aggressive Programs Make a Difference
Energy efficiency programs have helped keep per capita electricity consumption in California flat over the past 30 years.
[Chart: per capita electricity use (kWh), 1960-2000, for the US, California, and Western Europe]
11 | Lawrence Berkeley National Laboratory    eere.energy.gov

Projected Data Center Energy Use
[Chart: current estimated energy use and projected scenarios, EPA Report to Congress, 2008]
12 | Lawrence Berkeley National Laboratory    eere.energy.gov

The Rising Cost of Ownership
• From 2000 to 2006, computing performance increased 25x but energy efficiency only 8x
  – Amount of power consumed per $1,000 of servers purchased has increased 4x
• Cost of electricity and supporting infrastructure now surpassing the capital cost of IT equipment
• Perverse incentives: IT and facilities costs are accounted for separately
Source: The Uptime Institute, 2007
13 | Lawrence Berkeley National Laboratory    eere.energy.gov

Lawrence Berkeley National Laboratory
• LBNL operates large systems along with legacy systems
• We also research energy efficiency opportunities and work on various deployment programs
14 | Lawrence Berkeley National Laboratory    eere.energy.gov

LBNL Feels the Pain!
15 | Lawrence Berkeley National Laboratory    eere.energy.gov

LBNL Super Computer Systems Power
NERSC Computer Systems Power (does not include cooling power; OSF: 4 MW max)
[Chart: projected megawatts, 2001-2017, by system: N8, N7, N6, N5b, N5a, NGF, Bassi, Jacquard, N3E, N3, PDSF, HPSS, Misc.]
16 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Energy Efficiency = 15% (or less)
Energy Efficiency = Useful Computation / Total Source Energy
[Diagram, typical data center energy end use: 100 units of source energy enter power generation; roughly a third (33-35 units) is delivered to the site as electricity; power conversions & distribution and cooling equipment consume a large share of that before the remainder reaches the server load / computing operations]
17 | Lawrence Berkeley National Laboratory    eere.energy.gov

Energy Efficiency Opportunities
Server load / computing operations:
• Server innovation
• Virtualization
• High efficiency power supplies
• Load management
Cooling equipment:
• Better air management
• Move to liquid cooling
• Optimized chilled-water plants
• Use of free cooling
• Heat recovery
Power conversion & distribution:
• High voltage distribution
• High efficiency UPS
• Efficient redundancy strategies
• Use of DC power
On-site generation:
• On-site generation, including fuel cells and renewable sources
• CHP applications (waste heat for cooling)
18 | Lawrence Berkeley National Laboratory    eere.energy.gov

Potential Benefits of Data Center Energy Efficiency
• 20-40% savings typical
• Aggressive strategies can yield 50+% savings
• Extend the life and capacity of existing infrastructure
19 | Lawrence Berkeley National Laboratory    eere.energy.gov

Performance Metrics and Benchmarking
21 | Lawrence Berkeley National Laboratory    eere.energy.gov

Electricity Use in Data Centers
Courtesy of Michael Patterson, Intel Corporation
22 | Lawrence Berkeley National Laboratory    eere.energy.gov

Benchmarking for Energy Performance Improvement:
• Energy benchmarking can allow comparison to peers and help identify best practices
• LBNL conducted studies of over 30 data centers:
  – Wide variation in performance
  – Identified best practices
• Can't manage what isn't measured
23 | Lawrence Berkeley National Laboratory    eere.energy.gov

Energy Performance Varies ("Your Mileage Will Vary")
The relative percentage of the energy that goes to computing varies considerably.
[Pie chart 1: Computer loads 67%, HVAC chiller and pumps 24%, HVAC air movement 7%, Lighting 2%]
[Pie chart 2: Data center server load 51%, data center CRAC units 25%, other 13%, cooling tower plant 4%, electrical room cooling 4%, lighting 2%, office space conditioning 1%]
24 | Lawrence Berkeley National Laboratory    eere.energy.gov

Benchmarks Obtained by LBNL
High-level metric: Power Usage Effectiveness (PUE) = Total Power / IT Power
Average PUE = 1.83
25 | Lawrence Berkeley National Laboratory    eere.energy.gov

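PUE is a simple ratio, but it is easy to compute inconsistently if some loads are left unmetered. A minimal sketch of the calculation; the meter names and kW values below are hypothetical, not LBNL benchmark data:

```python
def pue(total_facility_kw, it_kw):
    """Power Usage Effectiveness = total facility power / IT power."""
    if it_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_kw

# Hypothetical metered loads (kW)
it_load = 500.0        # servers, storage, network (UPS output)
cooling = 300.0        # chillers, CRAC/CRAH fans, pumps, towers
power_losses = 90.0    # UPS, PDU, transformer losses
lighting_misc = 25.0

total = it_load + cooling + power_losses + lighting_misc
print(f"PUE = {pue(total, it_load):.2f}")                   # 1.83 for these numbers
print(f"partial PUE (HVAC) = {1 + cooling / it_load:.2f}")  # cooling subsystem only
```

The same pattern gives the partial PUEs mentioned later: add one subsystem's load to the IT load and divide by the IT load.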
PUE for India Data Centers
[Bar charts of data center total power / IT power: US PUE for 26 benchmarked data centers and India PUE for 5 data centers, both on a 0 to 3.5 scale]
26 | Lawrence Berkeley National Laboratory    eere.energy.gov

Indian Data Center Benchmarking Sources – Thanks to:
• Intel
• Hewlett Packard
• APC
• Maruti
• Texas Instruments
More needed!
27 | Lawrence Berkeley National Laboratory    eere.energy.gov

Energy Metrics and Benchmarking
• Key metrics:
  – PUE and partial PUEs (e.g., HVAC, electrical distribution)
  – Utilization
  – Energy Reuse (ERF)
• The future: computational metrics (e.g., peak flops per Watt, transactions/Watt)
28 | Lawrence Berkeley National Laboratory    eere.energy.gov

Other Data Center Metrics:
• Watts per square foot, watts per rack
• Power distribution: UPS efficiency, IT power supply efficiency
  – Uptime: IT Hardware Power Overhead Multiplier (ITac/ITdc)
• HVAC
  – Fan watts/cfm
  – Pump watts/gpm
  – Chiller plant (or chiller, or overall HVAC) kW/ton
• Air management
  – Rack cooling index (fraction of IT within the recommended temperature range)
  – Return temperature index (RAT-SAT)/ITΔT
• Lighting watts/square foot
29 | Lawrence Berkeley National Laboratory    eere.energy.gov

Metrics & Benchmarking
• Power Usage Effectiveness
• Airflow Efficiency
• Cooling System Efficiency
30 | Lawrence Berkeley National Laboratory    eere.energy.gov

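Two of the air-management metrics listed above (the rack cooling index expressed as the fraction of IT inlets inside the recommended range, and the return temperature index) can be computed directly from logged temperatures. A minimal sketch; the sensor readings are hypothetical and the 18-27°C band is the ASHRAE recommended range discussed later in the workshop:

```python
def rack_cooling_fraction(inlet_temps_c, lo=18.0, hi=27.0):
    """Fraction of IT inlet temperatures within the recommended range."""
    return sum(lo <= t <= hi for t in inlet_temps_c) / len(inlet_temps_c)

def return_temperature_index(rat_c, sat_c, it_delta_t_c):
    """RTI = (RAT - SAT) / IT delta-T; values near 1.0 indicate little bypass or recirculation."""
    return (rat_c - sat_c) / it_delta_t_c

inlets = [19.5, 22.0, 24.5, 28.0, 21.0]   # hypothetical rack inlet readings, deg C
print(f"Inlets within recommended range: {rack_cooling_fraction(inlets):.0%}")
print(f"RTI: {return_temperature_index(rat_c=29.0, sat_c=18.0, it_delta_t_c=12.0):.2f}")
```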

IT Equipment and Software Efficiency
32 | Lawrence Berkeley National Laboratory    eere.energy.gov

IT Server Performance: Saving a Watt…
The value of one watt saved at the IT equipment.
33 | Lawrence Berkeley National Laboratory    eere.energy.gov

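A rough way to see the leverage: at the average benchmarked PUE of 1.83 from the earlier slide, each watt trimmed from the IT load avoids almost two watts at the utility meter. Illustrative arithmetic only:

```python
pue = 1.83                  # average PUE benchmarked by LBNL (earlier slide)
it_watts_saved = 1.0
site_watts_saved = it_watts_saved * pue
annual_kwh = site_watts_saved * 8760 / 1000   # continuous operation for one year
print(f"{site_watts_saved:.2f} W saved at the meter, ~{annual_kwh:.0f} kWh/year")
```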
Moore’s Law
[Charts: "Core Integer Performance Over Time" (1986-2008, 386 through Core 2 Duo Extreme QX6700) and "Power Reduction Over Time" (1970-2010); single-core era]

• Every year Moore's Law is followed, smaller, more energy-efficient transistors result.
• Miniaturization has provided roughly a 1-million-times reduction in energy per transistor over 30+ years.
• Benefits: smaller, faster transistors => faster AND more energy-efficient chips.
Source: Intel Corp. / Intel Corporate Technology Group
34 | Lawrence Berkeley National Laboratory    eere.energy.gov

IT Equipment Age and Performance
[Charts: age distribution, energy consumption, and performance capability of an installed server base, grouped as 2007 & earlier, 2008-2009, and 2010-current]

Old servers consume 60% of the energy but deliver only 4% of the performance capability.
Data collected recently at a Fortune 100 company; courtesy of John Kuzma and William Carter, Intel
35 | Lawrence Berkeley National Laboratory eere.energy.gov

Perform IT System Energy Assessments

IT Energy Use Patterns: Servers
Idle servers consume as much as 50-60% of their full-load power, as shown in SPECpower benchmarks.
[Chart: measured server power at no load vs. at full load]
36 | Lawrence Berkeley National Laboratory eere.energy.gov

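Those two observations (idle servers drawing 50-60% of full-load power, and 15-30% of servers doing no useful work, per the next slide) translate directly into savings that are easy to estimate. A back-of-the-envelope sketch; the fleet size, per-server wattage, and shares below are hypothetical:

```python
servers = 1000                  # hypothetical fleet size
full_load_w = 400               # hypothetical full-load draw per server, W
idle_fraction_of_full = 0.55    # idle draw of 50-60% of full load (slide above)
comatose_share = 0.20           # 15-30% of servers on but unused (next slide)

idle_w = full_load_w * idle_fraction_of_full
comatose_kw = servers * comatose_share * idle_w / 1000
annual_kwh = comatose_kw * 8760
print(f"~{comatose_kw:.0f} kW of comatose IT load, ~{annual_kwh:,.0f} kWh/year before PUE overhead")
```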
Decommission Unused Servers
Physically retire inefficient or unused systems.
• The Uptime Institute reported that 15-30% of servers are on but not being used
• Decommissioning goals include:
  – Regularly inventory and monitor
  – Consolidate/retire poorly utilized hardware
37 | Lawrence Berkeley National Laboratory    eere.energy.gov

Virtualize and Consolidate Servers and Storage
• Run many "virtual" machines on a single "physical" machine
• Developed in the 1960s to achieve better efficiency
• Consolidate underutilized physical machines, increasing utilization
• Energy saved by shutting down underutilized machines
38 | Lawrence Berkeley National Laboratory    eere.energy.gov

Virtualize and Consolidate Servers and Storage
Virtualization enables:
• Server consolidation (R&D and production workloads share hardware): 10:1 in many cases
• Workload provisioning: rapid deployment, reducing the number of idle, staged servers
• Disaster recovery: one standby for many production servers, upholding high levels of business continuity
• Dynamic load balancing across hosts (e.g., 90% vs. 30% CPU usage), balancing utilization with head room
39 | Lawrence Berkeley National Laboratory    eere.energy.gov

Cloud Computing
Virtualized cloud computing can provide:
• Dynamically scalable resources over the internet
• Can be internal or external
• Can balance different application peak loads
• Typically achieves high utilization rates
40 | Lawrence Berkeley National Laboratory    eere.energy.gov

Storage Systems and Energy
• Power is roughly linear with the number of storage modules
• Storage redundancy significantly increases energy
• Consider lower-energy hierarchical storage
• Storage de-duplication: eliminate unnecessary copies
41 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use Efficient Power Supplies
[Chart: LBNL/EPRI measured power supply efficiency vs. load, with the typical operating point marked]
42 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use Efficient Power Supplies
Power supply units:
• Are most efficient in the mid-range of their performance curves
• Right-size for the load
• Power supply redundancy puts operation lower on the curve
• Use ENERGY STAR or Climate Savers power supplies
Source: The Green Grid
43 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use Efficient Power Supplies
80 PLUS Certification Levels (efficiency at rated load)

                          115V Internal Non-Redundant     230V Internal Redundant
Level of Certification     20%     50%     100%            20%     50%     100%
80 PLUS                    80%     80%     80%             n/a     n/a     n/a
80 PLUS Bronze             82%     85%     82%             81%     85%     81%
80 PLUS Silver             85%     88%     85%             85%     89%     85%
80 PLUS Gold               87%     90%     87%             88%     92%     88%
80 PLUS Platinum           n/a     n/a     n/a             90%     94%     91%

44 | Lawrence Berkeley National Laboratory    eere.energy.gov

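The spread between the bottom and top of that table is worth quantifying. A minimal sketch comparing annual losses for a baseline 80%-efficient supply against an 80 PLUS Platinum unit at its 50% load point; the 200 W server draw and electricity price are hypothetical:

```python
dc_load_w = 200.0          # hypothetical DC output drawn by one server
hours = 8760
price_per_kwh = 0.10       # hypothetical tariff, $/kWh

def annual_loss_kwh(efficiency):
    ac_input = dc_load_w / efficiency
    return (ac_input - dc_load_w) * hours / 1000

for label, eff in [("80 PLUS (115V, 50% load)", 0.80),
                   ("80 PLUS Platinum (230V, 50% load)", 0.94)]:
    loss = annual_loss_kwh(eff)
    # note: these losses also show up as extra cooling load, so site savings are larger
    print(f"{label}: {loss:.0f} kWh lost, ${loss * price_per_kwh:.0f}/yr per supply")
```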
Use Efficient Power Supplies
Power supply savings add up…
[Chart: annual savings per rack (standard vs. high-efficiency power supply), roughly $1,000-$8,000 as server power grows from 200 to 500 W, split among mechanical, UPS, and power supply savings]
45 | Lawrence Berkeley National Laboratory    eere.energy.gov

IT System Efficiency Summary…
• Servers: enable power management capabilities! Use ENERGY STAR servers.
• Power supplies: reconsider redundancy; use 80 PLUS or Climate Savers products.
• Storage devices: take superfluous data offline; use thin provisioning technology.
• Consolidation: use virtualization; consider cloud services.
46 | Lawrence Berkeley National Laboratory    eere.energy.gov

References and Resources
• www.80plus.org
• www.climatesaverscomputing.org
• www.ssiforums.org
• http://tc99.ashraetcs.org
47 | Lawrence Berkeley National Laboratory    eere.energy.gov

Using IT to Manage IT
Innovative Application of IT in Data Centers for Energy Efficiency
49 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use IT to Manage IT Energy
Using IT to save energy in IT:
• Most operators lack "visibility" into their data center environment.
• Goals:
  – Provide the same level of monitoring and visualization of the physical space that exists for monitoring the IT environment.
  – Spot problems before they result in high energy cost or downtime.
  – Measure and track performance metrics.
• An operator can't manage what they don't measure.
50 | Lawrence Berkeley National Laboratory    eere.energy.gov

The Importance of Visualization
• IT systems and network administrators have tools for visualization.
• Useful for debugging, benchmarking, capacity planning, forensics.
• Data center facility managers have had comparatively poor visualization tools.
51 | Lawrence Berkeley National Laboratory    eere.energy.gov

LBNL Wireless Sensor Installation
• LBNL installed an 800+ point sensor network. It measures:
  – Temperature
  – Humidity
  – Pressure (under floor)
  – Electrical power
• Presents real-time feedback and historic tracking
• Optimize based on empirical data, not intuition
Image: SynapSense
52 | Lawrence Berkeley National Laboratory    eere.energy.gov

Real-time Temperature Visualization by Level
53 | Lawrence Berkeley National Laboratory    eere.energy.gov

Displayed Under-floor Pressure Map…
[Under-floor pressure map showing five CRAC units]
54 | Lawrence Berkeley National Laboratory    eere.energy.gov

Provided Real-time Feedback During Floor-tile Tuning
Removed guesswork by monitoring under-floor pressure and rack-top temperatures with the visualization tool.
55 | Lawrence Berkeley National Laboratory    eere.energy.gov

Determined Relative CRAC Cooling Energy Impact
• Enhanced knowledge of data center redundancy (under-floor pressure and rack-top temperatures)
• Turned off unnecessary CRAC units to save energy
56 | Lawrence Berkeley National Laboratory    eere.energy.gov

Feedback Continues to Help: Note Impact of IT Cart!
Real-time feedback identified a cold-aisle airflow obstruction.
57 | Lawrence Berkeley National Laboratory    eere.energy.gov

Real-time PUE Display
58 | Lawrence Berkeley National Laboratory    eere.energy.gov

PUE Calculation Diagram
59 | Lawrence Berkeley National Laboratory    eere.energy.gov

Franchise Tax Board (FTB) Case Study
Description:
• 10,000 sq ft
• 12 CRAH cooling units
• 135 kW load
Challenges:
• Over-provisioned
• History of units fighting one another
• Manual shutoff not successful
Solution:
• Intelligent supervisory control software with inlet air sensing
60 | Lawrence Berkeley National Laboratory    eere.energy.gov

FTB Wireless Sensor Network
• WSN included 50 wireless temperature sensors (Dust Networks radios)
• Intelligent control software
61 | Lawrence Berkeley National Laboratory    eere.energy.gov

WSN Smart Software: Learns About Curtains
[Maps of CRAH-03's zone of influence at the start and after curtains were installed]
62 | Lawrence Berkeley National Laboratory    eere.energy.gov

WSN Provided Effect on Cold-aisle Temperatures
[Chart: cold-aisle temperatures (50-90°F) through successive measures: rack blanking panels, VFDs, hot-aisle isolation, floor tile changes, control software; the spread narrows toward the upper and lower limits of the ASHRAE recommended range]
63 | Lawrence Berkeley National Laboratory    eere.energy.gov

WSN Software = Dramatic Energy Reduction…
[Chart: main breaker kW before and after the DASH software started (1/31-2/15); load dropped markedly]
64 | Lawrence Berkeley National Laboratory    eere.energy.gov

Cost-Benefit Analysis:
• DASH cost-benefit (sensors and software)
  – Cost: $56,824
  – Savings: $30,564
  – Payback: 1.9 years
• Total project cost-benefit
  – Cost: $134,057
  – Savings: $42,772
  – Payback: 3.1 years
65 | Lawrence Berkeley National Laboratory    eere.energy.gov

An Emerging Technology…
Control data center air conditioning using the built-in IT server-equipment temperature sensors.
66 | Lawrence Berkeley National Laboratory    eere.energy.gov

Intel Demonstration
• Typically, data center cooling devices use return air temperature as the primary control variable
  – ASHRAE and IT manufacturers agree that IT equipment inlet air temperature is the key operational parameter
  – Optimum control is difficult
• Server inlet air temperature is available from the ICT network via
  – Intelligent Platform Management Interface (IPMI), or
  – Simple Network Management Protocol (SNMP)
67 | Lawrence Berkeley National Laboratory    eere.energy.gov

Intel Data Center HVAC:
68 | Lawrence Berkeley National Laboratory    eere.energy.gov

Intel Demonstration
The demonstration showed:
• Servers can provide temperature data to the facilities control system
• Given server inlet temperature, facility controls improved temperature control and efficiency
• Effective communications and control were accomplished without significant interruption or reconfiguration of systems
69 | Lawrence Berkeley National Laboratory    eere.energy.gov

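As a concrete illustration of the IPMI route mentioned above, many baseboard management controllers report inlet temperature through the standard ipmitool utility, which a monitoring script can poll and hand to the cooling controls. A minimal sketch; the host, credentials, and sensor naming vary by vendor and are hypothetical here:

```python
import subprocess

def read_temperatures(host, user, password):
    """Query a server BMC over IPMI and return its temperature sensor readings (deg C)."""
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
         "sdr", "type", "Temperature"],
        capture_output=True, text=True, check=True).stdout
    temps = {}
    for line in out.splitlines():
        fields = [f.strip() for f in line.split("|")]
        # typical row: "Inlet Temp | 04h | ok | 7.1 | 23 degrees C"
        if len(fields) == 5 and "degrees C" in fields[4]:
            temps[fields[0]] = float(fields[4].split()[0])
    return temps

# Hypothetical usage: poll one host and pick out an inlet sensor if the BMC exposes one
readings = read_temperatures("10.0.0.21", "admin", "secret")
inlet = [v for name, v in readings.items() if "inlet" in name.lower()]
print(readings, inlet)
```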
Dashboards
Dashboards can display multiple systems' information for monitoring and maintaining data center performance.
70 | Lawrence Berkeley National Laboratory    eere.energy.gov

Why Dashboards?
• Provide IT and HVAC system performance at a glance
• Identify operational problems
• Baseline energy use and benchmark performance
• View the effects of changes
• Share information and inform integrated decisions
71 | Lawrence Berkeley National Laboratory    eere.energy.gov

Another Dashboard Example…
72 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use IT to Manage IT: Summary
• Evaluate monitoring systems to enhance operations and controls
• Install dashboards to manage and sustain energy efficiency
73 | Lawrence Berkeley National Laboratory    eere.energy.gov

Environmental Conditions
75 | Lawrence Berkeley National Laboratory    eere.energy.gov

Environmental Conditions
What are the main HVAC energy drivers?
• IT load
• Climate
• Room temperature and humidity
  – Most data centers are overcooled and have humidity control issues
  – Human comfort should not be a driver
76 | Lawrence Berkeley National Laboratory    eere.energy.gov

Equipment Environmental Specification
• Air inlet temperature to the IT equipment is the important specification to meet.
• Outlet temperature is not important to the IT equipment.
77 | Lawrence Berkeley National Laboratory    eere.energy.gov

Environmental Conditions
ASHRAE's Thermal Guidelines:
• Provide common understanding between IT and facility staff
• Endorsed by IT manufacturers
• Recommend a temperature range of 18°C to 27°C (80.6°F), with "allowable" much higher
• Enable large energy savings, especially when using economizers
• New (2011) ASHRAE guidelines:
  – Six classes of equipment identified, with wider allowable upper limits ranging from 32°C to 45°C (113°F) depending on class
  – Provide more justification for operating above the recommended limits (in the allowable range)
  – Provide wider humidity ranges
78 | Lawrence Berkeley National Laboratory    eere.energy.gov

Purpose of the Ranges
• The recommended range is a statement of reliability: for extended periods of time, the IT manufacturers recommend that data centers maintain their environment within these boundaries.
• The allowable range is a statement of functionality: these are the boundaries where IT manufacturers test their equipment to verify that it will function.
79 | Lawrence Berkeley National Laboratory    eere.energy.gov

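A small utility that applies those two ranges to monitoring data makes the distinction concrete. A minimal sketch: the recommended band is the 18-27°C range above, the allowable band shown is the 15-32°C example for one of the 2011 equipment classes (substitute the limits for your class), and the sample readings are hypothetical:

```python
RECOMMENDED = (18.0, 27.0)    # ASHRAE recommended range, deg C (slide above)
ALLOWABLE_EXAMPLE = (15.0, 32.0)  # example allowable range for one 2011 class

def classify(inlet_c):
    lo_r, hi_r = RECOMMENDED
    lo_a, hi_a = ALLOWABLE_EXAMPLE
    if lo_r <= inlet_c <= hi_r:
        return "recommended"
    if lo_a <= inlet_c <= hi_a:
        return "allowable (acceptable for limited periods)"
    return "out of range - investigate"

for t in [17.0, 24.5, 30.0, 35.0]:   # hypothetical rack inlet readings
    print(f"{t:5.1f} C -> {classify(t)}")
```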
2011 ASHRAE Thermal Guidelines
"2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance"
White paper prepared by ASHRAE Technical Committee TC 9.9
80 | Lawrence Berkeley National Laboratory    eere.energy.gov

2011 ASHRAE Allowable Ranges
[Chart: allowable dry-bulb temperature ranges by equipment class]
81 | Lawrence Berkeley National Laboratory    eere.energy.gov

2011 ASHRAE Thermal Guidelines
ASHRAE's key conclusion when considering the potential for increased failures at higher (allowable) temperatures:
"For a majority of US and European cities, the air-side and water-side economizer projections show failure rates that are very comparable to a traditional data center run at a steady state temperature of 20°C."
82 | Lawrence Berkeley National Laboratory    eere.energy.gov

ASHRAE Liquid Cooling Guidelines
ASHRAE and a DOE High Performance Computing (HPC) user group have been developing a white paper on liquid cooling.
• Three temperature standards are defined, based on three mechanical system configurations:
  – Chilled water provided by a chiller (with or without a "tower-side economizer")
  – Cooling water provided by a cooling tower, with possible chiller backup
  – Cooling water provided by a dry cooler, with possible backup using evaporation

83 | Lawrence Berkeley National Laboratory

eere.energy.gov

Summary Recommended Limits

Liquid Cooling Class   Main Cooling Equipment      Supplemental Cooling Equipment   Building Supplied Cooling Liquid Maximum Temperature
L1                     Cooling Tower and Chiller   Not Needed                       17°C (63°F)
L2                     Cooling Tower               Chiller                          32°C (89°F)
L3                     Dry Cooler                  Spray Dry Cooler, or Chiller     43°C (110°F)

84 | Lawrence Berkeley National Laboratory    eere.energy.gov

Example Server Specification

85 | Lawrence Berkeley National Laboratory

eere.energy.gov

Environmental Conditions: Summary
• Most computer room air conditioners (CRACs) are controlled based on return air temperature – this needs to change.
• A cold data center = an efficiency opportunity.
• Perceptions based on old technology lead to cold data centers with tight humidity ranges – this needs to change.
• Many IT manufacturers design for harsher conditions than the ASHRAE guidelines.
• Design data centers for IT equipment performance, not for people comfort.
• Address air management issues first.
86 | Lawrence Berkeley National Laboratory    eere.energy.gov

Airflow Management
88 | Lawrence Berkeley National Laboratory    eere.energy.gov

Air Management: The Early Days at LBNL
• It was cold, but hot spots were everywhere
• High-flow tiles reduced air pressure
• Fans were used to redirect air
89 | Lawrence Berkeley National Laboratory    eere.energy.gov

Air Management
• Typically, more air is circulated than required
• Air mixing and short circuiting lead to:
  – Low supply temperature
  – Low Delta T
• Use hot and cold aisles
• Improve isolation of hot and cold aisles
  – Reduce fan energy
  – Improve air-conditioning efficiency
  – Increase cooling capacity
A hot aisle / cold aisle configuration decreases mixing of intake and exhaust air, promoting efficiency.
90 | Lawrence Berkeley National Laboratory    eere.energy.gov

Benefits of Hot and Cold Aisles
• Improves equipment intake air conditions by separating cold from hot airflow
• Preparation: arrange racks with alternating hot and cold aisles
• Supply cold air to the front of facing servers; hot exhaust air exits into the rear aisles
Graphics courtesy of DLB Associates
91 | Lawrence Berkeley National Laboratory    eere.energy.gov

Reduce Bypass and Recirculation
• Bypass air / short-circuiting and leakage waste cooling capacity
• Recirculation increases inlet temperature to servers
92 | Lawrence Berkeley National Laboratory    eere.energy.gov

Maintain Raised-Floor Seals
Maintain sealing of all potential leaks in the raised-floor plenum (e.g., sealed vs. unsealed cable penetrations).
93 | Lawrence Berkeley National Laboratory    eere.energy.gov

Manage Blanking Panels
• Any opening will degrade the separation of hot and cold air: maintain server blanking and side panels
• One 12" blanking panel added dropped the temperature ~20°F at the top of the rack (SynapSense inlet/outlet measurements at the top and middle of the rack showing air recirculation)
94 | Lawrence Berkeley National Laboratory    eere.energy.gov

Reduce Airflow Restrictions & Congestion
Consider the impact that congestion has on airflow patterns (congested vs. empty floor and ceiling cavities).
95 | Lawrence Berkeley National Laboratory    eere.energy.gov

Resolve Airflow Balancing
• Balancing is required to optimize airflow
• Rebalancing is needed with new IT or HVAC equipment
• Locate perforated floor tiles only in cold aisles
• An under-floor pressure map from wireless sensors assists balancing
96 | Lawrence Berkeley National Laboratory    eere.energy.gov

Results: Tune Floor Tiles
• Too many permeable floor tiles; when airflow is optimized:
  – Under-floor pressure goes up
  – Rack-top temperatures go down
  – Data center capacity increases
• Measurement and visualization assisted the tuning process (SynapSense under-floor pressure and rack-top temperature maps)
97 | Lawrence Berkeley National Laboratory    eere.energy.gov

Optimally Locate CRAC/CRAHs
Locate CRAC/CRAH units at the ends of hot aisles.
98 | Lawrence Berkeley National Laboratory    eere.energy.gov

Typical Temperature Profile with Under-floor Supply
• Hot air comes around the tops and sides of servers (too hot); cold air escapes through the ends of aisles (too cold); only the middle is just right (elevation at a cold aisle looking at the racks)
• There are numerous references in ASHRAE. See, for example, V. Sorell et al., "Comparison of Overhead and Underfloor Air Delivery Systems in a Data Center Environment Using CFD Modeling", ASHRAE Symposium Paper DE-05-11-5, 2005
99 | Lawrence Berkeley National Laboratory    eere.energy.gov

Next Step: Air Distribution Return-Air Plenum
100 | Lawrence Berkeley National Laboratory    eere.energy.gov

Return Air Plenum
• Overhead plenum converted to hot-air return
• Return registers placed over the hot aisle
• CRAC intakes extended to the overhead plenum
(Before and after photos)
101 | Lawrence Berkeley National Laboratory    eere.energy.gov

Return-Air Plenum Connections
A return air duct on top of the CRAC unit connects to the return air plenum.
102 | Lawrence Berkeley National Laboratory    eere.energy.gov

Isolate Hot Return
A duct on top of each rack connects to the return air plenum.
103 | Lawrence Berkeley National Laboratory    eere.energy.gov

Other Isolation Options
• Physical barriers enhance separation of hot and cold airflow
• Curtains, doors, or lids have been used successfully (semi-enclosed and enclosed cold aisles)
• Barrier placement must comply with fire codes
104 | Lawrence Berkeley National Laboratory    eere.energy.gov

Adding Air Curtains for Hot/Cold Isolation
105 | Lawrence Berkeley National Laboratory    eere.energy.gov

Isolate Cold and Hot Aisles
With isolation, supply and return temperatures of 70-80°F and 95-105°F, vs. 45-55°F and 60-70°F in a mixed environment.
106 | Lawrence Berkeley National Laboratory    eere.energy.gov

Cold Aisle Airflow Containment Example
An LBNL cold aisle containment study achieved fan energy savings of ~75%.
107 | Lawrence Berkeley National Laboratory    eere.energy.gov

Fan Energy Savings
• Isolation can significantly reduce air mixing and hence flow
• Fan speed can be reduced, and fan power is proportional to the cube of the flow
• Fan energy savings of 70-80% are possible with variable air volume (VAV) fans in CRAH/CRAC units (or central AHUs)
(Measured with and without enclosure)
108 | Lawrence Berkeley National Laboratory    eere.energy.gov

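The cube-law claim above is easy to sanity-check. A minimal sketch using the ideal fan affinity laws (real VAV fans deviate somewhat at very low speed):

```python
def fan_power_fraction(flow_fraction):
    """Fan affinity laws: power scales roughly with the cube of airflow."""
    return flow_fraction ** 3

for flow in (1.0, 0.8, 0.6):
    p = fan_power_fraction(flow)
    print(f"{flow:.0%} of design airflow -> ~{p:.0%} of fan power ({1 - p:.0%} savings)")
# Cutting flow to ~60% gives ~78% fan energy savings, consistent with the 70-80% figure above.
```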
LBNL Air Management Demonstration
Better airflow management permits warmer supply temperatures!
[Chart: cold aisle NW (PGE12813) temperatures, 6/13-6/16/2006, low/medium/high sensors through baseline, setup, alternate 1, alternate 2, and ASHRAE stages, compared with the ASHRAE recommended range]
109 | Lawrence Berkeley National Laboratory    eere.energy.gov

Isolating Hot and Cold Aisles
• Energy-intensive IT equipment needs good isolation of "cold" inlet and "hot" discharge
• Supply airflow can be reduced if no mixing occurs
• Overall temperature can be raised if air is delivered without mixing
• Cooling systems and economizers use less energy with warmer return air temperatures
• Cooling capacity increases with warmer air temperatures
110 | Lawrence Berkeley National Laboratory    eere.energy.gov

Efficient Alternatives to Underfloor Air Distribution
Localized air cooling systems with hot and cold isolation can supplement or replace under-floor systems (raised floor not required!) Examples include:  Row-based cooling units  Rack-mounted heat exchangers  Both options “Pre-engineer” hot and cold isolation

111 | Lawrence Berkeley National Laboratory

eere.energy.gov

Example – Local Row-Based Cooling Units

112 | Lawrence Berkeley National Laboratory

eere.energy.gov

Air Distribution – Rack-Mounted Heat Exchangers

113 | Lawrence Berkeley National Laboratory

eere.energy.gov

Review: Airflow Management Basics
Air management techniques:
• Seal air leaks in the floor (e.g., cable penetrations)
• Prevent recirculation with blanking panels in racks
• Manage floor tiles (e.g., no perforated tiles in the hot aisle)
• Improve isolation of hot and cold air (e.g., return air plenum, curtains, or complete isolation)
Impact of good isolation:
• Supply airflow reduced
  – Fan savings up to 75%+
• Overall temperature can be raised
  – Cooling system efficiency improves
  – Greater opportunity for economizer ("free") cooling
• Cooling capacity increases
114 | Lawrence Berkeley National Laboratory    eere.energy.gov

Cooling Systems
116 | Lawrence Berkeley National Laboratory    eere.energy.gov

Air Management and Cooling Efficiency
Linking good air management with an optimized cooling system gives:
• Improved efficiencies
• Increased cooling capacity
• More hours for air-side and water-side free cooling
• Lower humidification/dehumidification energy
• Reduced fan energy
117 | Lawrence Berkeley National Laboratory    eere.energy.gov

HVAC Systems Overview
118 | Lawrence Berkeley National Laboratory    eere.energy.gov

Computer Room Air Conditioners (CRACs) and Air Handlers (CRAHs)
• CRAC units
  – Fan, direct expansion (DX) coil, and refrigerant compressor
• CRAH units
  – Fan and chilled water coil
  – Typically in larger facilities with a chiller plant
• Both are often equipped with humidifiers and reheat for dehumidification
• Often independently controlled
  – Tight ranges and poor calibration lead to fighting
119 | Lawrence Berkeley National Laboratory    eere.energy.gov

DX (or AC) Units Reject Heat Outside…
[Diagram: air-cooled DX, dry-cooler DX, evaporatively-cooled DX, and water-cooled DX configurations rejecting heat via refrigerant or water]
120 | Lawrence Berkeley National Laboratory    eere.energy.gov

Computer Room Air Handling (CRAH) Units Using Chilled Water
[Diagram: CRAH units served by air-cooled, evaporatively-cooled, or water-cooled chillers, the last with a cooling tower]
121 | Lawrence Berkeley National Laboratory    eere.energy.gov

Optimize the Chiller Plant
• Have a plant (vs. distributed cooling)
• Use "warm" water cooling (multi-loop)
• Size cooling towers for "free" cooling
• Integrate controls and monitor the efficiency of all primary components
• Thermal storage
• Use variable speed drives on:
  – Fans
  – Pumps
  – Towers
  – Chillers
[Pie chart of chiller plant energy: chiller 68%, CW pump 13%, primary CHW pump 10%, cooling tower 5%, secondary CHW pump 4%]
122 | Lawrence Berkeley National Laboratory    eere.energy.gov

Select Efficient Chillers
[Table: compressor kW/ton at 25%, 50%, 75%, and 100% load for a 400-ton air-cooled chiller, a 1200-ton water-cooled chiller without a VFD, and a 1200-ton water-cooled chiller with a VFD]
123 | Lawrence Berkeley National Laboratory    eere.energy.gov

Increase Temperature of Chiller Plant
[Chart: chiller efficiency (kW/ton) vs. load (200-1000 tons) for a 1,000-ton chiller operating at a 42°F CHWS temperature and 70°F CWS temperature vs. the same chiller operating at a 60°F CHWS temperature; data provided by York International Corporation]
124 | Lawrence Berkeley National Laboratory    eere.energy.gov

Moving (Back) to Liquid Cooling
As heat densities rise, liquid solutions become more attractive.
Volumetric heat capacity comparison: a ~190,000 cubic foot blimp of air [5,380 m3] = 1.5 m3 of water.
125 | Lawrence Berkeley National Laboratory    eere.energy.gov

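The blimp comparison above follows directly from the volumetric heat capacities of the two fluids. A minimal sketch of the arithmetic using standard, rounded property values:

```python
# Approximate properties near room temperature
water = {"density_kg_m3": 998.0, "cp_j_per_kg_k": 4186.0}
air   = {"density_kg_m3": 1.2,   "cp_j_per_kg_k": 1006.0}

def volumetric_heat_capacity(fluid):          # J per m3 per K
    return fluid["density_kg_m3"] * fluid["cp_j_per_kg_k"]

ratio = volumetric_heat_capacity(water) / volumetric_heat_capacity(air)
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume per degree than air")
print(f"So ~{1.5 * ratio:,.0f} m3 of air matches 1.5 m3 of water (slide: ~5,380 m3 blimp)")
```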
Why Liquid Cooling?
• Heat removal efficiency increases as the liquid gets closer to the heat source
• Liquids can provide cooling with higher temperature coolant
  – Improved cooling efficiency
  – Increased economizer hours
  – Greater potential use of waste heat
• Reduced transport energy
126 | Lawrence Berkeley National Laboratory    eere.energy.gov

In-Row Liquid Cooling
Graphics courtesy of Rittal
127 | Lawrence Berkeley National Laboratory    eere.energy.gov

In-Rack Liquid Cooling
Racks with integral cooling coils and full containment.
128 | Lawrence Berkeley National Laboratory    eere.energy.gov

Rear-Door Liquid Cooling
Rear-door heat exchangers (RDHx) with liquid cooling connections inside the rack; doors shown closed and open 90°.
129 | Lawrence Berkeley National Laboratory    eere.energy.gov

On-Board Cooling
130 | Lawrence Berkeley National Laboratory    eere.energy.gov

"Chill-off 2" Evaluation of Liquid Cooling Solutions
131 | Lawrence Berkeley National Laboratory    eere.energy.gov

Use Free Cooling:
Cooling without compressors:
• Outside-air economizers
• Water-side economizers
Let's get rid of chillers in data centers.
132 | Lawrence Berkeley National Laboratory    eere.energy.gov

Outside-Air Economizers
Advantages:
• Lower energy use
• Added reliability (backup for the cooling system)
Potential issues:
• Space
• Dust – not a concern with MERV 13 filters
• Gaseous contaminants – not widespread, and they impact normally cooled data centers as well
• Shut down or bypass if smoke is outside the data center
http://cooling.thegreengrid.org/namerica/WEB_APP/calc_index.html
133 | Lawrence Berkeley National Laboratory    eere.energy.gov

UC's Computational Research and Theory (CRT) Facility
134 | Lawrence Berkeley National Laboratory    eere.energy.gov

Free Cooling – Outside Air Based
[Annual psychrometric chart of Oakland, CA; relative humidity lines stepped by 10%, wet-bulb lines by 10°F]
1. Blue = recommended supply conditions
2. Green can become blue by mixing return and outdoor air
3. Most of the conditions below and to the right of blue can be satisfied with evaporative cooling
4. Hot and humid hours will enter the "allowable" range or require compressor air conditioning
135 | Lawrence Berkeley National Laboratory    eere.energy.gov

System Design Approach:
• Air-side economizer (93% of hours)
• Direct evaporative cooling for humidification/precooling
• Low pressure-drop design (1.5" total static at peak)
Hours of operation by mode:
• Mode 1 – 100% economizer: 2207 hrs
• Mode 2 – OA + RA: 5957 hrs
• Mode 3 – Humidification: 45 hrs
• Mode 4 – Humidification + CH cooling: 38 hrs
• Mode 5 – CH only: 513 hrs
• Total: 8760 hrs
136 | Lawrence Berkeley National Laboratory    eere.energy.gov

Water Cooling (CRT):
• Tower-side economizer
• Four-pipe system
• Waste heat reuse
• Headers, valves, and caps for modularity and flexibility
Predicted CRT performance: annual PUE = 1.1
137 | Lawrence Berkeley National Laboratory    eere.energy.gov

Water-Side Economizers
Advantages:
• Cost effective in cool and dry climates
• Often an easier retrofit
• Added reliability (backup in the event of chiller failure)
• No contamination questions
138 | Lawrence Berkeley National Laboratory    eere.energy.gov

Integrated Water-Side Economizer
[Schematic with example temperatures: ~41°F (5°C) wet bulb, 44-49°F (7-9°C) water, <60°F (16°C) return; either a control valve or a pump can be used]
Make sure the heat exchanger is in series with the chillers on the chilled-water side, not in parallel.
139 | Lawrence Berkeley National Laboratory    eere.energy.gov

Potential for Tower Cooling
• On-board cooling
• Rear-door cooling
• Water-side economizer
140 | Lawrence Berkeley National Laboratory    eere.energy.gov

LBNL Example: Rear Door Cooling
• Used instead of adding CRAC units
• Rear-door water cooling with tower-only water (or central chiller plant in series)
• Both options are significantly more efficient than the existing direct expansion (DX) CRAC units
141 | Lawrence Berkeley National Laboratory    eere.energy.gov

Improve Humidity Control:

• Eliminate inadvertent dehumidification
– Computer load is sensible only

• Use ASHRAE allowable humidity ranges
– Many manufacturers allow even wider humidity range

• Defeat equipment fighting
– Coordinate controls

• Disconnect and only control humidity of makeup air or one CRAC/CRAH unit
• Entirely disconnect (many have!)

142 | Lawrence Berkeley National Laboratory

eere.energy.gov

Cost of Unnecessary Humidification

           Visalia Probe              CRAC Unit Panel
Unit       Temp    RH     Tdp        Temp    RH     Tdp    Mode
AC 005     84.0    27.5   47.0       76      32.0   44.1   Cooling
AC 006     81.8    28.5   46.1       55      51.0   37.2   Cooling & Dehumidification
AC 007     72.8    38.5   46.1       70      47.0   48.9   Cooling
AC 008     80.0    31.5   47.2       74      43.0   50.2   Cooling & Humidification
AC 010     77.5    32.8   46.1       68      45.0   45.9   Cooling
AC 011     78.9    31.4   46.1       70      43.0   46.6   Cooling & Humidification
Min        72.8    27.5   46.1       55.0    32.0   37.2
Max        84.0    38.5   47.2       76.0    51.0   50.2
Avg        79.2    31.7   46.4       68.8    43.5   45.5
(Temperature and dew point Tdp in °F, RH in %)

Humidity down ~2%; CRAC power down 28%

143 | Lawrence Berkeley National Laboratory

eere.energy.gov

Cooling Takeaways…

• Use efficient equipment and a central plant (e.g., chiller/CRAHs) vs. CRAC units
• Use centralized controls on CRAC/CRAH units
  – Prevent simultaneous humidifying and dehumidifying
  – Optimize sequencing and staging

• Move to liquid cooling (room, row, rack, chip)
• Consider VSDs on fans, pumps, chillers, and towers
• Use air- or water-side economizers where possible
• Expand the humidity range and improve humidity control (or disconnect it)
144 | Lawrence Berkeley National Laboratory eere.energy.gov

Electrical Systems
146 | Lawrence Berkeley National Laboratory    eere.energy.gov

Electrical System End Use
[Chart: data center end-use breakdown with the electrical system components highlighted (orange bars)]
Courtesy of Michael Patterson, Intel Corporation
147 | Lawrence Berkeley National Laboratory    eere.energy.gov

Electrical Distribution
• Every power conversion (AC-DC, DC-AC, AC-AC) loses some energy and creates heat
• Efficiency decreases when systems are lightly loaded
• Distributing higher voltage is more efficient and can save capital cost (conductor size is smaller)
• Uninterruptible power supply (UPS), transformer, and PDU efficiency varies – select carefully
• Lowering distribution losses also lowers cooling loads
148 | Lawrence Berkeley National Laboratory    eere.energy.gov

Electrical Systems – Points of Energy Inefficiency
149 | Lawrence Berkeley National Laboratory    eere.energy.gov

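One way to see why the conversion chain above matters: the stage losses multiply. A minimal sketch with hypothetical but plausible efficiency values for each stage (the actual values depend on the equipment and its load point):

```python
# Hypothetical stage efficiencies at a given load point
stages = {
    "UPS (AC-DC-AC)": 0.92,
    "PDU / transformer": 0.98,
    "Server power supply (AC-DC)": 0.88,
}

overall = 1.0
for name, eff in stages.items():
    overall *= eff

it_dc_load_kw = 100.0                  # useful DC power delivered inside the servers
utility_kw = it_dc_load_kw / overall   # power drawn upstream of the whole chain
print(f"Overall electrical chain efficiency: {overall:.1%}")
print(f"{utility_kw - it_dc_load_kw:.1f} kW lost as heat, which the cooling plant must also remove")
```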
UPS, Transformer, & PDU Efficiency
• Efficiencies vary with system design, equipment, and load
• Redundancies impact efficiency
[Chart: factory measurements of UPS efficiency (tested using linear loads) vs. percent of rated active power load for flywheel, double-conversion, and delta-conversion UPS types]
150 | Lawrence Berkeley National Laboratory    eere.energy.gov

Measured UPS Efficiency
[Chart: field-measured UPS efficiency, including units in redundant operation]
151 | Lawrence Berkeley National Laboratory    eere.energy.gov

LBNL/EPRI Measured Power Supply Efficiency
[Chart: measured server power supply efficiency vs. load, with the typical operating point marked]
152 | Lawrence Berkeley National Laboratory    eere.energy.gov

Redundancy
• Understand what redundancy costs and what it gets you – is it worth it?
• Does everything need the same level?
• Different strategies have different energy penalties (e.g., 2N vs. N+1)
• It is possible to more fully load UPS systems and still achieve the desired redundancy
• Redundancy in electrical distribution puts you down the efficiency curve
153 | Lawrence Berkeley National Laboratory    eere.energy.gov

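The redundancy point lends itself to a quick check: the same IT load spread across a 2N pair of UPS strings runs each module at half the load fraction of an N+1 arrangement, which is usually a lower point on the efficiency curve. A minimal sketch; the efficiency curve, module size, and load are hypothetical, so interpolate the factory curve for your own units:

```python
# Hypothetical UPS efficiency vs. load fraction (piecewise-linear lookup)
curve = [(0.1, 0.85), (0.25, 0.90), (0.5, 0.93), (0.75, 0.94), (1.0, 0.94)]

def ups_efficiency(load_fraction):
    pts = sorted(curve)
    lo = max(p for p in pts if p[0] <= load_fraction)
    hi = min(p for p in pts if p[0] >= load_fraction)
    if lo == hi:
        return lo[1]
    frac = (load_fraction - lo[0]) / (hi[0] - lo[0])
    return lo[1] + frac * (hi[1] - lo[1])

it_kw, module_kw = 300.0, 500.0
for label, modules in [("N+1 (2 modules)", 2), ("2N (4 modules)", 4)]:
    load_frac = it_kw / (modules * module_kw)
    eff = ups_efficiency(load_frac)
    print(f"{label}: {load_frac:.0%} load per module, ~{eff:.1%} efficient, "
          f"{it_kw / eff - it_kw:.0f} kW of UPS losses")
```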
From Utility Power to the Chip – Multiple Electrical Power Conversions
[Diagram: utility AC passes through the UPS (rectifier, battery/charger, inverter, bypass) and PDU to the server's multi-output AC/DC power supply (PWM/PFC switcher), then through DC/DC voltage regulator modules (12 V, 5 V, 3.3 V, 1.5/2.5 V, 1.1-1.85 V) feeding the processor, memory controller, SDRAM, graphics controller, and internal/external drives and I/O]
154 | Lawrence Berkeley National Laboratory    eere.energy.gov

Emerging Technology: DC Distribution
• 380 V DC power distribution eliminates several conversions
• Also usable for lighting and variable speed drives
• Use with on-site generation, including renewable energy sources
• Back up with batteries or a flywheel
155 | Lawrence Berkeley National Laboratory    eere.energy.gov

Standby Generation Loss
• Losses
  – Heaters
  – Battery chargers
  – Transfer switches
  – Fuel management systems
• Opportunities exist to reduce or eliminate losses
• Heaters (many operating hours) use more electricity than the generator produces (few operating hours)
  – Check with the generator manufacturer on how to reduce the energy consumption of block heaters (e.g., temperature settings and control)
156 | Lawrence Berkeley National Laboratory    eere.energy.gov

Standby Generator Heater
157 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Lighting
• Lights are on and nobody's home
  – Switch off lights in unused/unoccupied areas or rooms (UPS, switchgear, battery, etc.)
  – Lighting controls such as occupancy sensors are well proven
• Small relative benefit but easy to accomplish – also saves HVAC energy
• Use energy efficient lighting
• Lights should be located over the aisles
158 | Lawrence Berkeley National Laboratory    eere.energy.gov

Motors and Drives
• Since most cooling system equipment operates continuously, premium efficiency motors should be specified everywhere
• Variable speed drives should be used on:
  – Chillers
  – Pumps
  – Air handler fans
  – Cooling tower fans
159 | Lawrence Berkeley National Laboratory    eere.energy.gov

Key Electrical Takeaways
• Choose highly efficient components and configurations
• Reduce power conversions (AC-DC, DC-AC, AC-AC, DC-DC)
• Consider the minimum redundancy required, as efficiency decreases when systems are lightly loaded
• Use higher voltage
160 | Lawrence Berkeley National Laboratory    eere.energy.gov

Resources
162 | Lawrence Berkeley National Laboratory    eere.energy.gov

Resources
Advanced Manufacturing Office:
• Tool suite & metrics for baselining
• Training
• Qualified specialists
• Case studies
• Recognition of high energy savers
• R&D – technology development
Federal Energy Management Program:
• Workshops
• Federal case studies
• Federal policy guidance
• Information exchange & outreach
• Access to financing opportunities
• Technical assistance
GSA:
• Workshops
• Quick Start Efficiency Guide
• Technical assistance
EPA:
• Metrics
• Server performance rating & ENERGY STAR label
• Data center benchmarking
Industry:
• Tools
• Metrics
• Training
• Best practice information
• Best-in-Class guidelines
• IT work productivity standard
163 | Lawrence Berkeley National Laboratory    eere.energy.gov

Resources from the DOE Federal Energy Management Program (FEMP)
• Best Practices Guide
• Benchmarking Guide
• Data Center Programming Guide
• Technology Case Study Bulletins
• Procurement Specifications
• Report Templates
• Process Manuals
• Quick-Start Guide
164 | Lawrence Berkeley National Laboratory    eere.energy.gov

DOE Advanced Manufacturing Office (AMO – was ITP)
DOE's AMO data center program provides tools and resources to help owners and operators:
• DC Pro Software Tool Suite
  – Tools to define baseline energy use and identify energy-saving opportunities
• Information products
  – Manuals, case studies, and other resources
• End-user awareness training
  – Workshops in conjunction with ASHRAE
• Data Center Energy Practitioner (DCEP) certificate program
  – Qualification of professionals to evaluate energy efficiency opportunities
• Research, development, and demonstration of advanced technologies
165 | Lawrence Berkeley National Laboratory    eere.energy.gov

DOE DC Pro Tool Suite
High-level on-line profiling and tracking tool:
• Overall efficiency (Power Usage Effectiveness [PUE])
• End-use breakout
• Potential areas for energy efficiency improvement
• Overall energy use reduction potential
In-depth assessment tools (savings):
• Air management: hot/cold separation, environmental conditions, RCI and RTI
• Electrical systems: UPS, PDU, transformers, lighting, standby generation
• IT equipment: servers, storage & networking, software
• Cooling: air handlers/conditioners, chillers, pumps, fans, free cooling
166 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Energy Practitioner (DCEP) Program
A certificate process for energy practitioners qualified to evaluate energy consumption and efficiency opportunities in data centers.
Key objectives:
• Raise the standards of assessors
• Provide greater repeatability and credibility of recommendations
Target groups include:
• Data center personnel (in-house experts)
• Consulting professionals (for-fee consultants)
167 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Energy Practitioner (DCEP) Program
Training & certificate disciplines/levels/tracks:
• Level 1 Generalist: prequalification, training and exam on all disciplines + assessment process + DC Pro profiling tool
• Level 2 Specialist: prequalification, training and exam on select disciplines + assessment process + DC Pro system assessment tools (air management, cooling systems, electrical systems, and IT equipment; HVAC available, IT in 2012)
Two tracks:
• Certificate track (training + exam)
• Training track (training only)
168 | Lawrence Berkeley National Laboratory    eere.energy.gov

Energy Star
A voluntary public-private partnership program covering:
• Buildings
• Products
169 | Lawrence Berkeley National Laboratory    eere.energy.gov

Energy Star Data Center Activities
• ENERGY STAR Datacenter Rating Tool
  – Builds on the existing ENERGY STAR platform with a similar methodology (1-100 scale)
  – Usable both for stand-alone data centers and for data centers housed within another building
  – Assesses performance at the building level to explain how a building performs, not why it performs a certain way
  – ENERGY STAR label for data centers with a rating of 75+
  – Rating based on data center infrastructure efficiency
• The ideal metric would be a measure of useful work per unit of energy; industry is still discussing how to define useful work
• ENERGY STAR specification for servers
• Evaluating enterprise data storage, UPS, and networking equipment for ENERGY STAR product specifications
170 | Lawrence Berkeley National Laboratory    eere.energy.gov

Resources
• http://www1.eere.energy.gov/femp/program/data_center.html
• http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency
• http://hightech.lbl.gov/datacenters.html
• http://www1.eere.energy.gov/industry/datacenters/
171 | Lawrence Berkeley National Laboratory    eere.energy.gov

Workshop Summary
Best Practices
173 | Lawrence Berkeley National Laboratory    eere.energy.gov

Results at LBNL
LBNL's legacy data center:
• Increased IT load by >50% (~180 kW) with virtually no increase in infrastructure energy use
• Raised room temperature 8+ degrees
• AC unit turned off – one 15-ton unit is now used as backup
• Decreased PUE from 1.65 to 1.45 – a 30% reduction in infrastructure energy
• More to come!
174 | Lawrence Berkeley National Laboratory    eere.energy.gov

More to Come
Next steps for LBNL's legacy data center:
• Integrate CRAC controls with the wireless monitoring system
• Retrofit CRACs with VSDs – a small VAV turndown yields big energy savings
• Improve containment (overcome curtain problems)
• Increase liquid cooling (HP in-rack and APC in-row)
• Increase free cooling (including a tower upgrade)
175 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices Summary
1. Measure and benchmark energy use
2. Identify IT opportunities
3. Use IT to control IT
4. Manage airflow
5. Optimize environmental conditions
6. Evaluate cooling options
7. Improve electrical efficiency
8. Implement energy efficiency measures
176 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
1. Measure and Benchmark Energy Use
• Use metrics to measure efficiency
• Benchmark performance
• Establish continual improvement goals
177 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
2. Identify IT Opportunities
• Specify efficient servers (including power supplies)
• Virtualize
• Refresh IT equipment
• Turn off unused equipment
178 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
3. Use IT to Control IT Energy
• Evaluate monitoring systems to enhance real-time management and efficiency
• Use visualization tools (e.g., thermal maps)
• Install dashboards to manage and sustain energy efficiency
179 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
4. Manage Airflow
• Implement hot and cold aisles
• Fix leaks
• Manage floor tiles
• Isolate hot and cold airstreams
180 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
5. Optimize Environmental Conditions
• Follow ASHRAE guidelines or manufacturer specifications
• Operate to the maximum of the ASHRAE recommended range
• Anticipate that servers will occasionally operate in the allowable range
• Minimize or eliminate humidity control
181 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
6. Evaluate Cooling Options
• Use a centralized cooling system
• Maximize central cooling plant efficiency
• Provide liquid-based heat removal
• Compressorless cooling
182 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
7. Improve Electrical Efficiency
• Select efficient UPS systems and topology
• Examine redundancy levels
• Increase voltage distribution and reduce conversions
183 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
8. Implement Standard Energy Efficiency Measures
• Install premium efficiency motors
• Upgrade the building automation control system
• Integrate CRAC controls
• Variable speed everywhere
• Optimize cooling tower efficiency
184 | Lawrence Berkeley National Laboratory    eere.energy.gov

Data Center Best Practices
Most importantly… get IT and Facilities people talking and working together as a team!!!
185 | Lawrence Berkeley National Laboratory    eere.energy.gov

Contact Information:
Dale Sartor, P.E.
Lawrence Berkeley National Laboratory
Applications Team
MS 90-3111
University of California
Berkeley, CA 94720
DASartor@LBL.gov
(510) 486-5988
http://Ateam.LBL.gov
187 | Lawrence Berkeley National Laboratory    eere.energy.gov
