
Smart Energy | Supplement

Sponsored by

INSIDE

Super smarts: NREL turns to AI Ops for the ultimate system
Server success: Finding energy savings at the rack level
Demand response: Alternate strategies for smart batteries

Contents

4  Super smarts
How NREL uses AI Ops to save energy

6  Energy efficient servers
Intelligence cuts energy use

8  E+I Advertorial
Busways to prevent arc flash

10  Demand response
Data centers can save the utility grid - but why should they?

13  Keeping up with 5G's power demands
5G is set to usher in higher data transfer speeds, enabling a new wave of computing technologies


Smart Energy: thinking how to cut energy use

Work smarter, not harder. It's a good motto for life, but also for data center hardware and infrastructure.

This special supplement is about how AI and human intelligence can be applied right now to digital infrastructure, to save Watts of power demand. We've mostly looked inside the facility, with a foray into the wider world.

Common sense can save lives and equipment lost to arc flash, according to E+I (p8). One way to reduce that risk is to use an intelligent design of busway, that includes protective housing and safety features.

A high-performance computing site at an energy efficiency laboratory was a natural place to develop the use of AI to make computing itself more efficient. The US NREL institution found out that it takes a huge amount of data points and a lot of effort training up algorithms to start making its supercomputers run more efficiently, and to intervene when they work less well (p4). Another finding is that, even with this level of applied intellect, it's too early to completely hand over optimization to AI. We still need to check the working, or else the AI's recommendations run the risk of being impractical or just wrong.

Human intelligence has plenty to contribute, in redesigning racks, servers, and cooling systems, to reduce energy waste. We may be approaching the limits of efficiency achievable through the approach of PUE, which reduces the energy used in cooling the facility. The next step after that is to concentrate on the energy used in the IT equipment. Alex Alley tracks the trends in today's data centers, which may be pointing to a need for new ideas in future (p6).

Supply and demand are well understood in economics, but in data centers, they are still getting to know each other (p10). Data centers want energy, they want reliability, and they also want to use renewable power sources. It's increasingly likely that they won't be able to get all three, unless they start work with the utilities. Renewables are intermittent. At a certain point, utilities can't switch on any more solar or wind power, until sites with stored power start to share it. That's the hurdle demand-response is going to address - and it's a challenge for data centers to deliver on their promises to enable green energy.

5G could change everything - but only if we can actually provide the electric power it needs (p13). The history of cellular comms is a story of continuing change and improvement. Vlad-Gabriel Anghel shows how we can power the next revolution.



AI Ops at the scale of exascale

Super smarts
NREL's highly efficient data center is turning to AI to prepare us for exascale supercomputers, Sebastian Moss reports

Sebastian Moss, Deputy Editor

The world's most efficient data center wants to get even better, using artificial intelligence to eke out more compute power with the same electrical energy.

Building upon a wealth of data, the Energy Systems Integration Facility (ESIF) HPC data center hopes that AI can make its supercomputers smarter, and prepare us for an exascale future.

Nestled amongst the research labs of the National Renewable Energy Laboratory campus in Colorado, the ESIF had an average power usage effectiveness (PUE) of just 1.032 in 2017, and currently captures 97 percent of the waste heat from its supercomputers to warm nearby office and lab space.

For the last five to ten years, researchers at NREL have used sensors to try to track everything happening in the facility, and within its two systems - the HPE machines Peregrine and Eagle. This hoard of data has grown and grown to more than 16 terabytes, just waiting for someone to use it.

A little under three years ago, Mike Vildibill - then VP of HPE's Advanced Technologies Group - had a problem. He was in charge of running his company's exascale computing efforts, funded by the Department of Energy.

"We formed a team to do a very deep analysis and design of what is needed to build an exascale system that is really usable and operational in a real world environment," Vildibill, now HPE's VP & GM of high performance networking, told DCD. "And it was kind of a humbling experience. How do we manage, monitor and control one of these massive behemoth systems?"

Vildibill's team started with a brute force approach, he recalled: "We need to manage and monitor this thing, we have to collect this much data from each server, every storage device, every memory device, and everything else in the data center. We've got to put it in a database. We've got to analyze it, and then we've got to use that to manage, monitor and control the system."

With this approach in mind, the group did a rough calculation for an exascale system. "They came back and told me that they can do it, but that the management system that has to go next to the exascale system would have to be the size of the largest computer in the world [the 200 petaflops Summit system]," he said. "Okay, so we've stumbled across a real problem."

At the time, Vildibill was also looking into AI Ops, the industry buzzword for the application of artificial intelligence to IT operations. "We realized we needed AI Ops on steroids to really manage and control - in an automated manner - a big exascale system," he said.

To train that AI, his team needed data - lots and lots of data. Enter NREL. "They have all this data, not just for the IT equipment, but for what we call the OT equipment, the operational technologies, the control systems that run cooling systems, fans, and towers, as well as the environmental data.

"We realized that that's what we want to use to train our AI."

Armed with a data set with a whopping 150 billion sample points, Vildibill's team last year announced a three year initiative with NREL to train and run an AI Ops system at ESIF.

"Our research collaboration will span the areas of data management, data analytics, and AI/ML optimization for both manual and autonomous intervention in data center operations," Kristin Munch, manager for the data, analysis and visualization group at NREL, said.

"We're excited to join HPE in this multi-year, multi-staged effort - and we hope to eventually build capabilities for an advanced smart facility after demonstrating these techniques in our existing data center."

Vildibill told DCD that the project is already well underway. "We spent several months ingesting that data, training our models, refining our models, and using their [8 petaflops] Eagle supercomputer to do that, although in small fractions - we didn't take the whole supercomputer for a month, but rather, we would use it for 10 minutes, 20 minutes here and there.

"So we now have a trained AI."

The system has now progressed to a stage, Vildibill revealed, that it can "do real
time capturing of the data, put it into a
framework for analytics and storage, and do
the prediction in real time because now we
have it all together.
“We did 150 billion historical data points.
Now we're in a real time model. That’s the
Nirvana of all of this: Real time monitoring,
management and control.”
But, for all its value, data from Eagle
and the outgoing 2.24 petaflops Peregrine
can only get you so far. Exascale systems,
capable of at least 1,000 petaflops, will
produce an order of magnitude more data.
“The next steps we're doing within
NREL is just to bloat or expand the data that
they're producing,” Vildibill said. “Like for
example, if one sensor gives one data point

every second, we want to go in and tweak it and have it do a 100 per second. Not that we need 100 per second, but we're trying to test the scalability of all the infrastructure in planning for a future exascale system."

All this data is ingested and put into dashboards that humans can (hopefully) understand. "I could literally tell you 100 things we want on that dashboard, but one of them is PUE and the efficiency of the data center as a result."

As an efficiency metric for data centers, PUE has some detractors, but it's good enough for NREL. "That's what NREL cares about, but we're building this infrastructure for customers who have requirements that we don't even yet know," Vildibill said.

He noted that the system "might do prediction analysis or anomaly detection," and we "can have dashboards that are about trying to save water. Some geographies like Australia worry as much about how much water is consumed by cooling a data center as they do about how much electricity is consumed. That customer would want a dashboard that says how efficiently they are using their data center by the metric of gallons per minute that are being evaporated into the air.

"Some customers, in metropolitan areas like New York, are really sensitive to how much electricity they used during peak time versus off hours, because they've got to shape their workload to try to minimize electrical usage during peak times. Every customer has a different dashboard. That was the exciting thing about this program."

It's still early days though, Vildibill cautioned, when asked whether the AI Ops program would be available for data centers that did not include HPE or (HPE-owned) Cray equipment. "That's a very fair question," he said. "We're really excited about what we're doing. We're onto something big, but it's not a beta of a product. It is still advanced development. So the question you ask is exactly the very first question that a product team would and will ask, and that is: 'Okay, Vildibill, you guys are on something big. We want to productize it. First question, is it for HPE or is it going to be a product for everybody?' And I don't think that that decision even gets asked until later in the development process."

Alphabet's DeepMind grabbed headlines in 2016 with the announcement that it had cut the PUE of Google's data centers by 15 percent, and expected to gain further savings. It also said that it would share how it did it with the wider community, but DCD understands the company quietly shelved those plans as the AI program required customized implementations unique to Google's data centers.

"I can tell you this - and I'm putting pressure on the future product team that's going to have to make these decisions - but everything I'm describing is entirely transferable," Vildibill said.

"In fact, we envision this being something that could even be picked up by the hyperscalers. It would be very ready for use to manage cloud infrastructure, in addition to being used by our typical customers, both HPC and enterprise, that are running on-premises.

"What I'm driving with this design is entirely transferable, and I think, if it's not, then you depreciate its value entirely."

The six layers of AI Ops

For the AI Ops program, Mike Vildibill breaks the system down into six layers through which data travels.

First is the data source: "we've got to be able to ingest historical data, capture telemetry from servers in real-time, and so on."

Then comes the aggregation and processing of that data: "We pre-process it to make it usable. Instead of 20 different formats or log files, we have to unify it into a format that everyone can understand."

Third is data analytics, followed by AI and machine learning techniques in the fourth layer.

In the layer after that, AI begins to predict and advise the user. For instance, if the system sees a large number of corrected errors on a DIMM in a specific rack, it could recommend that the node should be replaced.

Finally, layer number six will provide automation and control: "This is the big objective. Instead of simply advising what a human should do, the system goes in. For example, it turns off a node, if it has predicted that it's going to fail."

That final stage is still a way off, however: "This program is really touching upon the first five of those six, and we want to get a couple of years of a strong prediction and advice capability under our belt first."

The current focus is to make the process behind the first five layers more transparent to humans.

"I want to be able to make sure that, by the time a piece of advice or prediction comes out, we have a very clear understanding of what data led to that prediction, so that we can go back and audit the decisions that are being made," said Vildibill.

"I don't think we can get all the way to automatically controlled systems unless humans can understand the factors that led to a decision," he concluded. "We can't just hand it over to the AI and say 'I don't know what you're doing. I hope it works out.' I think there's going to be a lot more research involved before we really turn the systems over to complete automated control."
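The six layers described above can be read as a straightforward telemetry pipeline. The Python sketch below is a minimal illustration of the first five layers - ingest, aggregation, analytics, a stand-in for the model, and advice. The record format, metric names and threshold are hypothetical assumptions made for illustration; this is not HPE's or NREL's actual implementation.

```python
# Hypothetical sketch of the five "advice" layers described above.
# Names, record formats, and thresholds are illustrative assumptions,
# not HPE's or NREL's actual AI Ops implementation.
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List, Tuple

@dataclass
class Sample:          # Layer 1: data source (telemetry from IT and OT equipment)
    node: str
    metric: str        # e.g. "dimm_corrected_errors", "inlet_temp_c"
    value: float

def aggregate(samples: List[Sample]) -> Dict[Tuple[str, str], List[float]]:
    """Layer 2: unify raw telemetry into one common format, keyed by (node, metric)."""
    table: Dict[Tuple[str, str], List[float]] = {}
    for s in samples:
        table.setdefault((s.node, s.metric), []).append(s.value)
    return table

def analyze(table: Dict[Tuple[str, str], List[float]]) -> Dict[Tuple[str, str], float]:
    """Layer 3: basic analytics - here just an average per node and metric."""
    return {key: mean(values) for key, values in table.items()}

def predict_failures(stats: Dict[Tuple[str, str], float], threshold: float = 100.0) -> List[str]:
    """Layers 4-5: a stand-in for the ML model, flagging nodes whose corrected
    DIMM error rate suggests the node should be replaced."""
    return [node for (node, metric), avg in stats.items()
            if metric == "dimm_corrected_errors" and avg > threshold]

if __name__ == "__main__":
    telemetry = [Sample("node-17", "dimm_corrected_errors", 180.0),
                 Sample("node-17", "dimm_corrected_errors", 220.0),
                 Sample("node-03", "dimm_corrected_errors", 2.0)]
    advice = predict_failures(analyze(aggregate(telemetry)))
    # Layer 5 stops at advice; layer 6 (automated control) would act on it.
    print("Recommend replacing:", advice)
```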



Efficiency efforts move into the IT stack

Data center power consumption is costly - but servers are getting more efficient, Alex Alley reports

Alex Alley, Reporter

Data centers bring together a large number of servers in one place, and run applications on them. Whether they are enterprise, colocation or cloud data centers, they have to operate 24x7 to support those mission-critical applications so, as data centers emerged, the first priority was to build in reliability.

Once the reliability task was done, costs and efficiency came to the fore. Those early data centers were over-engineered and over-cooled to ensure reliability, but it quickly became apparent that more than half the energy they consumed went into keeping the hardware cool, and less than half was actually used in computation.

Ten years of working on the efficiency of cooling systems has given us a current generation of facilities with a power usage effectiveness (PUE) of 1.2 or less, meaning more than 80 percent of the power they use is burnt in the servers themselves.
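For readers who want the arithmetic behind that figure, here is a short worked example (added for illustration, not taken from the report discussed below). PUE is total facility power divided by IT power, so the share of power that reaches the IT equipment is simply its reciprocal:

```latex
\mathrm{PUE} = \frac{P_{\mathrm{facility}}}{P_{\mathrm{IT}}}
\quad\Rightarrow\quad
\frac{P_{\mathrm{IT}}}{P_{\mathrm{facility}}} = \frac{1}{\mathrm{PUE}} = \frac{1}{1.2} \approx 0.83
```

By the same arithmetic, ESIF's PUE of 1.032 (see p4) corresponds to roughly 97 percent of incoming power reaching the IT equipment.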
So now, it's time to start looking in more detail at the power used by servers, as a major component of the energy used by data centers. In February, the Lawrence Berkeley National Laboratory co-wrote a report commissioned by the US Department of Energy, which revealed some interesting statistics.

Firstly, the study confirmed an oft-quoted rule of thumb, that data centers now consume a small but significant part of global energy. However, while the word on the street has been cranking up to around two percent, the DOE report reckons it was closer to one percent in 2018.

That sounds like a manageable figure, but it masks areas where data centers have become a burden. For instance, Ireland is facing a boom in data center building, and has a limited ability to grow its grid. The Irish Academy of Engineering has predicted that in 2027, 31 percent of all power on the grid will go to data centers.

Secondly, and more interestingly, the report shows that this overall figure is not growing as fast as some had feared.

Over the past decade, things have dramatically changed. In 2018, data center workloads and compute instances increased more than six-fold compared to 2010, yet power usage only went up by six percent.

"Increasing data center energy efficiency is not only an environmentally friendly strategy but also a crucial way of managing costs, making it an area that the industry should be prioritizing," Jim Hearnden, part of Dell Technologies' EMEA data center power division, told DCD. "Most IT managers are keen to increase their energy efficiency in relation to their data center, particularly when doing so also helps improve performance and reduce cost."

It's clear that data centers have seen huge efficiency gains - and as one would expect from the PUE figures, the majority of these have been in the cooling side of the facility. But during that same eight year period, server energy consumption went up by 25 percent.

That's a substantial increase, although it's a much smaller uptick than the six-fold increase in workloads the study noted.

It's clear that servers are also getting more efficient, gaining the ability to handle higher workloads with less power.

Much of this is down to more powerful processors.
We are still in the era of Moore's Law, where the number of transistors on a chip has been doubling every two years, as predicted by Gordon Moore, the one-time CEO of Intel.

More transistors on a chip means more processing power for a given amount of electrical energy, because more of that computation can be done within the chip, using the small power budget of on-chip systems, without having to amplify the signals to transmit to neighboring silicon.

Moore's Law implies that the computational power of processors should double every 18 months, without any increase in electrical energy consumed, according to an observation by Moore's colleague David House in 1975.
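House's 18-month doubling compounds quickly. As a rough illustration (my arithmetic, not a figure from the article), if performance per Watt doubles every 18 months, then over six years:

```latex
\text{performance per Watt} \propto 2^{t / 1.5\,\text{yr}}
\qquad\Rightarrow\qquad
2^{6 / 1.5} = 2^{4} = 16
```

In other words, if the trend held, a processor six years newer would deliver roughly sixteen times the computation for the same electrical power.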
As well as in the processors, there's been waste energy to be eliminated in all the components that make up the actual servers in the data centers.

Supermicro makes "white-label" processor boards used by many large data center builders, and it has been hard at work to shave inefficiencies off its servers, according to Doug Herz, senior director of technical marketing at the company.

"The data center's electric power consumption in the US has started to flatten off," he told DCD in an interview. "It's not going up that fast due to a number of energy-saving technologies. Despite people doing more, they are doing it with less electric power."

Supermicro has spotted the part of the puzzle where it can help: "Manufacturers have not focused on idle servers and their cost," Herz said. "And newer management software can aid in keeping that consumption down."

A five-year old server can use 175W when it is idle, which is not that much less than when it is in use. Idle server power consumption has improved over recent years, but still Herz estimates that data centers with idle servers can be wasting a third or even a half of the power they receive. Newer management software can balance workloads, distributing tasks so servers spend less time idling. "This software is used not only to monitor the servers in your data center but also to load balance the servers in your data center and optimize the electric power," Herz said.

"If you have a set amount of workloads that you have to distribute over a certain number of servers in your data center, maybe there are more efficient ways to go about it. Try optimizing the servers in your data center so that you're running some of them at full capacity. And, that way you're able to get economies of scale."
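The consolidation idea Herz describes can be made concrete with a toy model. The sketch below is purely illustrative - the power curve, capacity units and greedy packing are my own assumptions, not Supermicro's management software - but it shows why filling a few servers and switching the rest off beats running everything at low utilization. The 175W idle draw is the figure quoted above.

```python
# Toy model of workload consolidation: pack work onto fewer servers so the rest
# can be powered down. Power figures are illustrative assumptions, except the
# 175W idle draw quoted above; this is not Supermicro's actual software.
IDLE_W, BUSY_W, CAPACITY = 175.0, 400.0, 100.0

def draw(load: float) -> float:
    """Per-server draw, rising linearly from idle to full load with utilization."""
    return IDLE_W + (BUSY_W - IDLE_W) * (load / CAPACITY)

def total_power(loads: list[float], power_off_idle: bool = False) -> float:
    return sum(0.0 if (load == 0 and power_off_idle) else draw(load) for load in loads)

servers, total_work = 10, 400.0

spread = [total_work / servers] * servers          # every server ticks over at 40% load
packed = [CAPACITY] * int(total_work // CAPACITY)  # four servers at full load...
packed += [0.0] * (servers - len(packed))          # ...six left idle, then switched off

print(f"spread across all servers:    {total_power(spread):.0f} W")         # ~2650 W
print(f"consolidated, idle boxes off: {total_power(packed, True):.0f} W")   # ~1600 W
```

Under these made-up numbers, roughly 40 percent of the spread-out draw is pure idle waste - in the same ballpark as the "third or even a half" Herz estimates.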
Further up the stack, it's possible to optimize at a higher level, where the server power use shades over into the cooling. For instance, French webscale provider OVH takes Supermicro boards and customizes its servers, with specially-adapted racks and proprietary water cooling systems. Small watertight pockets are placed on hot components to conduct heat and transport it away.

"It makes good business sense," OVH's chief industrial officer, Francois Sterin, told DCD. "The goal is that our server needs to be very energy and cost-efficient."

OVH has around 400,000 servers in operation, and its process is just as software-driven as Supermicro's, Sterin told us: "We submit a server to a lot of different tests and environmental tests. This allows us to measure how much energy the rack is consuming. The goal is that our server needs to be energy and cost-efficient."

It's clear that energy efficiency is now top of mind at all levels of the data center stack. More efficient server chips are being managed more effectively, and used more continuously, so they crank out more operations per Watt of supplied power.

At the same time, those servers are being cooled more intelligently. Liquid cooling is ready to reduce the energy demand on cooling systems, while conventional systems are being operated at higher temperatures so less energy is wasted.

We know that Moore's Law is reaching the end of its reign, however. Chip densities can't go on increasing indefinitely, and delivering the same rate of increase in performance per Watt will only get harder.

If we've made cooling as efficient as possible, and chip efficiency begins to level out, where will the next efficiency gains be found? One possibility is in the software running on those processors: how many cycles are wasted due to inefficient code?

Another possibility is in the transmission of power. Between eight and 15 percent of all the power put into the grid is lost in the long-distance high-voltage cables that deliver it. To reduce that would require a shift to a more localized power source, such as a micro-grid at the data center.

The data center sector has great needs, and plenty of ingenuity. The next stage of the efficiency struggle could be even more interesting.



Reducing Data Centre Arc Flash Risk Through Innovative Busway Design
It is estimated that more than 30,000 arc flash incidents occur each year,
resulting in an average of 7,000 burn injuries, 2,000 hospitalisations and
400 fatalities.

NFPA 70E describes arc flash as "a dangerous condition associated with the possible release of energy caused by an electric arc." An arc flash occurs when a surge of electrical current, caused by a short circuit, flows through the air from one energised conductor to another. This results in a release of 'incident energy', which is expressed as an explosion of heat and pressure into the external environment. Despite the transient nature of arc flash incidents, they have the potential to reach temperatures of up to 35,000°F.

Arc Flash Risk in the Data Centre

As the demand for data increases, modern data centres are now seeking higher power capacities, higher rack densities and higher efficiency designs, all of which have an impact on arc flash risk. As power capacity and rack densities increase, all things being equal, so too does the available fault current. Ironically, the quest for higher efficiency designs can also increase the risk of arc flash in the data centre. Transformers represent one of the highest losses in electrical power distribution; however, they also provide inductive and resistive impedance that limits fault current. To reduce losses, modern data centres are being designed with fewer, larger transformers compared to traditional data centres. However, as this also reduces electrical impedance within the power system, the trend of higher efficiency designs tends to increase the available fault current and the risk of arc flash. This can result in consequences such as lost work time, downtime, fines, medical costs, lost business and equipment damage; and, most importantly, arc flash presents a severe safety risk to human life.

Intelligent Medium Powerbar - Safety by Design

E+I Engineering's iMPB open-channel busway solution has been manufactured with the safety of the installer and user as the number one priority. iMPB is the safest and most flexible open-channel busway system on the market, due to a range of in-built features that greatly reduce the likelihood of arc flash in the data centre. These features are designed to ensure continuity of power in a world that demands 24/7 uptime, whilst also making operator safety a priority. Arc flash testing according to IEC/TR 61641:2014 has been completed for E+I Engineering's full iMPB range.

Protective Housing

iMPB lengths are designed as an open track system where tap off units can be plugged in anywhere along the bar. The assembly is designed to exceed the minimum 'finger safe' requirements of both UL and IEC standards; this greatly reduces the risk of any accidental contact and hence reduces the risk of arc flash.
The copper conductors are fully isolated from the housing using a certified thermoplastic material; the insulation has excellent dielectric strength and is impact resistant. The lengths are connected using custom designed, thermally and electrically secure joint packs that can easily be disassembled and reassembled.

iMPB offers an enhanced layer of safety by offering a closure strip that can be fitted over the area of the busbar that is not connected to a tap off unit. This increases the protection to IP3X, further reducing the risk of an arc flash.

Mechanical MCB Interlock

An MCB Safety Interlock can be integrated into E+I Engineering's iMPB product to prevent the tap off unit being fitted to the bar while the MCB is in the 'On' position. Similarly, the tap off unit can only be removed from the busbar when the MCB is in the 'Off' position. The MCB can only be switched on when the contacts are fully engaged with the busbar. This provides users with an extra layer of safety when fitting or removing tap off boxes from the busbar.

The mechanical interlock secures the tap off box to the busbar using high tensile strength lockable hardware which cannot be fitted incorrectly. Once fitted to the bar, the engager handle can be turned. This lifts the contacts into the busbar and has a positive lock once fully rotated.

This mechanical connection between the tap off unit and the busway, made prior to any electrical connection, ensures that there is no risk of an arc flash incident when installing iMPB tap off boxes to the busbar.

"Ground First, Break Last" Technology

iMPB tap off units are fitted to the busbar using E+I Engineering's unique 'earth first, break last' safety feature. Each tap off unit interlocks onto the distribution length with a ground strip. This ensures that the ground is the first point of contact with the busbar system during installation, achieving a lower fault current and a lower fault clearance time, as excess current will always exit the busway system through the grounding strip.

Hook Operated Tap-Off Units

iMPB can be installed vertically or horizontally depending on project requirements. However, it is typically ceiling mounted above the rack. For safer operation, E+I Engineering have introduced a hook operated tap-off unit which can be switched on and off from the floor using a simple hook mechanism. This adds a further level of protection, as users have no direct contact with the tap-off box while it is energised.

In mission critical environments, which constantly demand higher levels of power, the dangers of arc flash can never be completely eliminated, but they can be controlled. This is done by understanding where the potential hazards are and taking steps to mitigate them. In the open channel iMPB busbar product, E+I Engineering have implemented a range of safety features to minimise the risk of an arc-flash incident, preserving both operator safety and system efficiency.
Responding to demand response demand

Demand response - is there a demand?

Data centers can play a role in cutting emissions and reducing the strain on the grid. But why should they?

Peter Judge, Global Editor

In a bid to reduce emissions, renewable sources are being used where possible - but this creates new problems for the grid, making it harder to match generation and consumption. Data centers could help to create a balance, through techniques referred to as "demand response" but so far it's proven difficult to enlist their help.

All the world's economies are attempting to reduce carbon emissions by increasing the share of renewable sources in their electricity generation, and reducing that provided by fossil fuels. However, there are two problems with this.

Firstly, apart from hydroelectric power, renewables are mostly intermittent. Solar panels and wind turbines only deliver energy when the sun shines or the wind blows, and can't be switched on as required. And secondly, the fossil fuel-powered capacity that is being retired is exactly the steady, readily available capacity that the grid needs, providing a continuous baseload, and also extra flexible capacity as needed.

The electricity grid has to satisfy a fluctuating demand - and there are two big factors where long term policies designed to reduce emissions could actually add to the burden on the grid.
Electricity is being proposed as a replacement for fossil fuels in cars and heating. But this will increase the demand for electricity - and it only reduces emissions if green electricity can be increased to match that demand.

To respond to changing supply and demand, the grid has to become flexible. According to Mohan Gandhi, a civil engineer and analyst at the New Bridge Founders thinktank: "As intermittent renewables penetrate further into the generation mix, flexibility becomes an increasingly important feature of the electricity system."

Some of this flexibility is based on moving electricity to where it is needed, but transmitting power is costly and involves losses - and the cables may not even be there: "Renewables are actually being built faster than cables can be laid," says Gandhi. "In Germany, wind generation in the north has grown enormously, but the interconnection cables between the north and south are yet to be built."

Instead of moving electricity around, another approach is to shift demand towards times when electricity is cheaper or more available - an approach dubbed "demand response."

This can be as simple as offering consumers a cheaper tariff for night-time electricity (in the UK often referred to as Economy 7). In industry, energy use is more concentrated, and there is potential for more advanced methods including on-site generation and stored energy, so industrial sites can temporarily shift their load completely off the grid, or even become an energy source, feeding power into the grid.

"Demand response is often the most economical form of flexibility because it requires few new transmission or distribution investments," says Gandhi. It also has a lot of potential: the European Commission estimates that Europe as a whole could deliver 100GW of demand response power (and this is expected to rise to 160GW by 2030). However, the European grid is currently only accessing around 20GW of available demand response capacity. Globally, Gandhi estimates that 20 percent of the world's electricity consumption will be eligible for demand response by 2040.

As a sophisticated and significant electricity user, believed to be using around one percent of the US grid's output, digital infrastructure can play a big role here. "Data centers, with their real-time management and workload flexibility, are good candidates for demand response schemes," says Gandhi. "They can 'shift' load outside peak hours, or deliver surplus energy stored in their batteries and on-site generators to the grid at times of undersupply."

It's been suggested that a group of data centers could help shift demand by migrating their loads amongst themselves to make use of the cheapest and greenest electricity at a given moment. "Many typical data center workloads are delay-tolerant, and could be rescheduled to off-peak hours," says Gandhi.

There are drawbacks to this. Firstly, if a data center is running profitable workloads, then it costs money to move them elsewhere, and the most cost-effective use of that resource is to run it at capacity as long as possible. And secondly, the customer who owns the data may need to ensure that it is processed in a given location to comply with local regulations.

It's actually possible for data centers to reduce their power demands without affecting IT workloads. Research by the Lawrence Berkeley National Laboratory (LBNL) found that energy consumption could be reduced by five percent in five minutes, and 10 percent in 15 minutes, by making changes such as setting a temporarily higher air temperature.

Beyond this, demand response approaches tend to use the facility's uninterruptible power supply (UPS). This is designed to support the data center when the grid fails: there is an alternative source of power (usually diesel gensets), and some energy storage (typically batteries) that will support the data center while the local power starts up.

Why not use the batteries, or even switch to diesel for a few hours, when energy is expensive? "When power is expensive, you can use energy from batteries, not the grid," says Janne Paananen, technology manager of energy equipment company Eaton. "This gives savings in cost of energy. You can do it yourself."

Beyond the DIY approach, there are systems managed by the utilities, which work in a surprisingly simple way. The grid frequency in the UK is 50Hz (plus or minus one percent), but it varies at heavy loads. The utilities use this to regulate the power - the grid detects the change in frequency and uses that to switch on extra capacity.

Because there are industries with their own generating capacity for backup, the electricity industry has come up with a scheme called Firm Frequency Response (FFR), in which those third party resources are turned on in response to the same change in frequency.

Data center UPS systems are designed to switch on immediately, and can be hooked into this sort of scheme.

FFR is in operation in Ireland, and likely to come onstream in the UK shortly. Eaton is working with the FFR scheme in Ireland, says Paananen. "With fast frequency response, you are providing services for the grid, and getting revenue by providing those services. Instead of responding to the cost of energy, you respond to a real-time signal."

In Ireland, facilities on the FFR program get a signal roughly once a month, says Paananen. "Normally the frequency deviation lasts for only a few seconds." This is a level of usage that traditional lead-acid batteries can readily support - and if FFR takes the place of a scheduled battery test, it can actually create less stress to the system.

These systems are proven, says Paananen. On 8 August 2019 in the UK, two power plants went down, the frequency of the grid changed, and that signaled various responses, so numerous factories and facilities went off-grid.

All this has been possible for years, but - as with any new idea - the big hurdles are making it pay, and gaining users' trust. Utilities are prepared to offer cheaper electricity at different times, and even pay consumers to take themselves off-grid. But will data centers take them up on this?

Back in 2013, Ciaran Flanagan of ABB told DCD: "Demand response programs (DRPs) have not only become a tool for grid operators to manage demand, but also a source of revenue for DRP participants. DRPs are in operation today in many commercial and industrial sectors but, ironically, data centers are largely non-participants, even though they are the fastest-growing part of the grid's load."

A big reason for data centers' reluctance is that they make profits from continuous availability, and sharing support systems might increase the risk of failure.

"Participating in demand response programs may reduce the availability or lead to a higher risk of downtime. This risk is exacerbated by the potential surrendering of control to aggregators," says Gandhi. "Data centers are typically in the business of avoiding downtime, minimizing risk and maximizing availability."

One operator put it more simply at a DCD event: "I put in that UPS to support my load when the data center is browning out. Why would I share that just when I most need it?"

"The challenge is educating the market so they understand it is safe," says Paananen. "There are safety features built in. The UPS will refuse to participate when it is needed."

Beyond that, the trouble is that any revenue from demand response is small: "I'm not sure how much the extra revenue is meaningful for data centers." He suggests that the return could be in automated maintenance with no service charge - at least that's something that customers need.
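Mechanically, the frequency-response schemes described above amount to a very small control loop. The sketch below is a simplified illustration with made-up thresholds and a made-up UPS interface - not Eaton's logic or any grid operator's actual FFR specification:

```python
# Simplified illustration of a firm-frequency-response style trigger.
# Thresholds, timings and the UPS interface are illustrative assumptions.
NOMINAL_HZ = 50.0
TRIGGER_HZ = 49.7        # hypothetical deviation at which the site sheds grid load
MIN_RESERVE = 0.5        # never dip below half charge: the UPS's own job comes first

def respond(grid_hz: float, battery_soc: float, on_battery: bool) -> bool:
    """Return True if the facility should draw from its batteries this interval."""
    if battery_soc <= MIN_RESERVE:
        return False                      # protect availability before earning revenue
    if grid_hz < TRIGGER_HZ:
        return True                       # under-frequency event: support the grid
    if on_battery and grid_hz >= NOMINAL_HZ - 0.05:
        return False                      # frequency recovered: fall back to the grid
    return on_battery

# A frequency dip like the one on 8 August 2019 would flip the site to batteries
# for the few seconds the deviation lasts, then hand back to the grid.
state = False
for hz, soc in [(50.0, 0.98), (49.6, 0.98), (49.6, 0.97), (49.99, 0.97)]:
    state = respond(hz, soc, state)
    print(f"{hz:5.2f} Hz -> on battery: {state}")
```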




In a colocation facility, an operator's hands may be tied. The UPS may be shared by users, some of whom object to handing over control to a demand response scheme. In theory, hyperscale data centers, with monolithic applications under one organization's control, have more freedom.

Paananen hopes that a different approach in use in Nordic countries may be a better fit for data centers. In 2017, Eaton launched an "UPS-as-a-reserve" (UPSaaR) service with Swedish energy supplier Fortum, a more flexible approach in which UPS batteries effectively act as part of a "virtual power plant" - and get paid around €50,000 per MW of power allocated to grid support. Similar schemes are now in operation in Finland and Norway.

It's still early days, but Paananen has faith: "Things are progressing, but not nearly as fast as people hoped. The challenge is, if you really want to get commercial benefits from it, you need big batteries." Large deals with hyperscalers will take years to complete, he cautions.

Paananen thinks the Nordic scheme may be the most promising. Ireland uses FFR for "containment," kicking in when the frequency has dropped significantly. But the Nordic UPSaaR schemes are more about fine-tuning or "regulating" the system when it is slightly out of line.

In Nordic markets, the UPS gets used more often, for shorter periods, and with a faster response required: "It can be only 30 seconds, and consumes very little energy. That is very nice for a data center. The UPS can run for less than its full design load."

The trouble is that this kind of use demands more modern batteries: "There's a more constant discharge in that scheme. You would need lithium-ion batteries."

It also requires critical functions in the UPS itself: "The UPS needs to understand, and follow, external signals - and make its own decisions."

With all that effort, an ironic problem can be that the utilities aren't always keen, says Gandhi, due to the basic laws of supply and demand. Demand response programs are often implemented by third-party aggregators who have agreements with utilities and consumers, and pool the demands of a group of customers. Aggregators can hook into proprietary control systems like those of Eaton, or add intelligence to the operation of other vendors' equipment.

Demand response systems led by aggregators help the consumers reduce consumption, and cut their costs. But why would utilities relinquish control and potentially lose revenue?

More subtly, as a third party, the aggregator masks the real demand from the utility. "There is no incentive for electricity suppliers to include aggregators in their contracts with customers, because this undermines many areas of their business model," Gandhi points out.

Some governments are stepping in to demand the adoption of demand response, and regulators are getting involved. "The markets are not yet open in every country," says Paananen.

One aggregator that is optimistic is Upside Energy, a UK operation that was adopted as a partner for data center demand response by equipment maker Vertiv. The pair have made no big wins yet, but Upside CEO Devrim Celal says "we are super excited about data centers, and that business will be increasing significantly in the next few years."

In the meantime, though, there's plenty of interest from other sectors. "There's good activity with cooling and refrigeration, and behind-the-meter cogeneration," says Celal.

But the demand response proponents want data centers on board. Right now, large players like Google are paying to have more renewable energy connected to the grid with power purchase agreements (PPAs), but there are limits to this, says Paananen: "At some stage, you may not be able to use all that renewable energy. Demand response helps the grid to get more renewable energy in - so the only way data centers can get green energy is by helping the grid to get it."

It could change perceptions, he goes on: "Instead of DCs being a problem, they are part of the solution, and help the grid to adapt."



Power efficiency needs to keep up with 5G's demands

5G is set to usher in higher data transfer speeds, enabling a new wave of computing technologies. But this will put new pressure on energy efficiency, Vlad-Gabriel Anghel finds

Vlad-Gabriel Anghel, Contributor

In 1979, Nippon Telegraph and Telephone (NTT) launched the first mobile cellular network in Tokyo, Japan. An analog system, it is now referred to as a "first-generation" or 1G network. By 1983, NTT rolled out coverage throughout the whole of Japan, while other 1G networks were springing up in Europe and elsewhere.

Motorola began service in 1983 in the US - where Bell Labs had proposed such a network as early as 1947, but dismissed it as impractical. 1G had a lot of drawbacks, including poor audio quality and limited coverage. There was no roaming support between networks, because there were no standards. However, it was revolutionary and paved the way for further development within the sector.

The next iteration, 2G, was a big improvement, using digital radio signals. It appeared in 1991 in Finland, and was launched under a standard - GSM - which promised the possibility of international roaming. Providing SMS and MMS, the new technology was adopted widely. Despite slow speeds and relatively small bandwidth, 2G revolutionized businesses and customers on a scale never seen before.

3G evolved in the years leading up to 2000, with the aim of standardizing the network protocols used by vendors, thus providing truly international roaming. NTT Docomo launched the first 3G network in 2001. 3G offered speeds about four times higher than the peak possible with 2G, and supported new protocols and solutions like VoIP, video conferencing and web access.

This opened the packet switching era in mobile phone communications. While functions like Internet access struggled at first, the launch of the iPhone in 2007 stretched 3G's capabilities to the limit. It was clear that 4G would be needed - and international standards bodies including the ITU have been working on it since 2002.

In 2009, the 4G Long Term Evolution (4G LTE) standard got its first run in Sweden and Norway, and was rolled out over the next few years. 4G's speed has enabled multiplayer gaming on the go, high quality video streaming and much more. However, the protocol is plagued by network patchiness in a lot of regions around the globe, with some suffering extremely low 4G penetration.

Enter 5G

This all paved the way for 5G, which was being developed from the moment 4G was delivered and is now already deployed in certain areas. It promises massive improvements, such as allowing up to a million devices per square kilometer - which could revolutionize a lot of sectors.

The latest estimates from IHS Markit see 5G having an economic impact of at least $12 trillion as the focus shifts from connecting people to information towards connecting everything to everyone.
Plus, it promises an energy efficiency revolution in the field of mobile communication networks.

The Internet of Things era is now in full swing, with more devices connected to the Internet than there are people on Earth. All these devices send and receive data on an almost constant basis. While home IoT applications benefit from a local wireless network, business and industrial IoT will require untethered connectivity over long distances, as well as enough bandwidth to accommodate all these applications at the same time.

Currently there are so many devices connecting to both 3G and 4G networks that these networks are close to breaking point, and cannot add the growing number of fresh devices that require perpetual connectivity. This is where 5G comes in. It is faster, smarter and more efficient than its predecessors, with a much higher bandwidth.

Antennas, spectrum, and base stations

Communications networks are increasingly finding that space, power consumption, and emissions are major economical and operational issues. Given the vast increase in connected devices, it is clear that energy efficiency will be a top concern for operators wishing to reduce overheads from both capital and operational expenditure.

So how does 5G fare in terms of power consumption and efficiency compared to its predecessors?

5G's design requirements specify a 90 percent reduction in power consumption compared to current 4G networks - a figure which is based on the whole ecosystem, from the base stations all the way to the energy used by the client device. This massive reduction happens through a combination of updated design practices, hardware optimizations, newer protocols, and smart software management of the underlying infrastructure.

Currently, mobile networks use, on average, anywhere between 15 to 20 percent of their overall power consumption in actual data traffic, with the rest being wasted mostly by keeping the components in a ready-to-operate state.

5G base stations will be able to go to sleep when network activity pauses. This is immensely important, as base stations account for around 80 percent of the power used by a mobile network.

In current networks, the majority of base stations may be idle at any moment. Despite the increased number of devices and higher data rates, 5G may have more opportunities to power down base stations. Since 5G has a higher data transfer rate, data transits the network more quickly, increasing the time when the base station is idle. 5G data packets are more compressed, further reducing traffic volume.
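To see why a faster radio can spend more of its life asleep, here is a deliberately crude sketch - the timings and power figures are hypothetical assumptions, not the actual 3GPP power-saving mechanism. The point is simply that the same traffic, served more quickly, leaves more quiet seconds in which the base station can drop to a low-power state.

```python
# Cartoon of base-station sleep: hypothetical timings and power figures,
# not the actual 3GPP power-saving mechanism.
ACTIVE_W, SLEEP_W, IDLE_BEFORE_SLEEP_S = 1000.0, 150.0, 5.0

def energy_used(seconds_since_last_traffic: list[float]) -> float:
    """Energy in joules over one-second intervals, sleeping after a quiet spell."""
    total = 0.0
    for quiet in seconds_since_last_traffic:
        total += SLEEP_W if quiet >= IDLE_BEFORE_SLEEP_S else ACTIVE_W
    return total

# A faster network drains its queue sooner, so more of the hour is quiet.
busy_4g = [0.0] * 2400 + list(range(1200))   # 40 minutes of traffic, 20 quiet
busy_5g = [0.0] * 600 + list(range(3000))    # same work done in 10 minutes
print(f"4G-like: {energy_used(busy_4g) / 3.6e6:.2f} kWh")
print(f"5G-like: {energy_used(busy_5g) / 3.6e6:.2f} kWh")
```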
The new network also adds a multipath transport control protocol (MPTCP), which reduces the need for packet replication and retransmission and increases reliability as more network paths are created. This is possible because 5G uses cheap and efficient MIMO (multiple input, multiple output) antennas. A MIMO system of antennae can handle more clients and bigger volumes of network activity, and increase reliability by providing more routes for the data to take. Should one route fail, another one can take its place.

MIMO antennas communicate with multiple clients using focused beams of radio waves ("beamforming"). This increases the channel efficiency along with data transfer rates, and reduces the possibility of interference. It also focuses radio energy directly towards the connected device, and can identify the exact amount of power and energy required, further reducing energy consumption for both the base station and the device itself.

Furthermore, 5G makes use of smaller network cells, covering a given area with a larger number of smaller antennas. In mobile networks, a cell refers to the base station, its antennas and the physical area they serve.

5G's small cells are designed to be deployed inside large buildings or outside in highly populated areas. Power consumption increases with the distance between the base station and the client device (the antenna has to "shout"), so these smaller network cells should be deployed to keep the communication distance as small as possible.

Finally, through a combination of new scheduling algorithms, the spectral efficiency of 5G New Radio is massively improved over current networks.

In 4G networks, the signal scheduling included a large number of control and verification codes at regular intervals, which could consume up to 20 percent of the network's energy overhead during higher frequency transmissions.

In 5G streams, the control and verification codes are greatly reduced. This is because the use of smaller mobile network cells cuts the communication distance, reducing the chance of interference or failure.

The scalability and flexibility of 5G networks are increased with software defined networking (SDN) and its related virtualization technologies. SDN works by decoupling the control layer of the wider network from its data plane, merging it into a centralized control plane which has access and overview over the whole network.
This means that the hardware resources can be dynamically assigned by software to optimize traffic flow.

What does 5G actually deliver?

The 5G design specifies an almost Utopian technology which promises an energy efficiency revolution across the whole mobile communication ecosystem, but these promises should be taken with a pinch of salt when analyzing how the technology will perform once deployed.

The rollout of 5G to the public will be slow, mostly because of the relatively high initial investment required. So full 5G coverage to the level of current 4G deployment is within sight, but will take a few years. Full coverage is well beyond that.

For a long time, "legacy" mobile networks like 3G and 4G will account for the majority of a base station's total energy consumption. The latest figures from Cisco's Annual Internet Report 2018 - 2023 further emphasize this: "the top three 5G countries in terms of percent of devices and connections share on 5G will be China (20.7 percent), Japan (20.6 percent), and United Kingdom (19.5 percent), by 2023."

Some analysts have argued that there may in fact be no energy savings when the technology is rolled out - depending on one's definition of energy efficiency across 5G.

For example, the computational power required to process 5G signals is around four times that of 4G. On comparable hardware, one would expect 5G's data processing component to have four times the energy consumption. However, this is only true if 5G is rolled out on the same infrastructure that 4G currently runs on, without adopting the increased efficiency delivered by hardware manufacturers.

Mobile operators have been given significant spectrum for 5G networks, partly due to the imminent saturation of previous networks and the rise of industrial and commercial IoT. However efficient 5G is by design, it will add up to a further increase in energy used by the total mobile communication infrastructure. This will be the case until (and if) previous network generations are phased out.

This can have a domino effect on the wider digital infrastructure. The growth of 5G networks comes hand in hand with the rise and expansion of Edge data centers, so it seems reasonable to include the energy use of these facilities when discussing 5G's efficiency.

Putting critical resources closer to the network edge can reduce latency, so 5G plus Edge facilities may deliver individual client interactions more effectively. However, if we factor in the vast increase in data storage and processing needs of these applications over the coming decade, it seems that 5G will in fact simply enable an even more power-hungry digital infrastructure industry than today's.

This could be potentially offset by a close collaboration between all the interconnecting industries, ensuring a powerful and reliable smart grid backed up by renewables and proper energy storage technologies.

Searching for standards

Standards are slow to emerge, as a consensus is needed between various parts of the communications and energy ecosystem. The number of bits transmitted per Joule of energy expended is now one of the top metrics used to analyse the efficiency of a network.

Currently, a cellular site will deliver an energy efficiency of around 20kbit/Joule, and some research papers in the field forecast that 5G could boost this by more than two orders of magnitude, to 10Mbit/Joule.
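To put that forecast in perspective (my arithmetic, using the two figures just quoted):

```latex
\frac{10\ \text{Mbit/J}}{20\ \text{kbit/J}} = \frac{10{,}000\ \text{kbit/J}}{20\ \text{kbit/J}} = 500 \approx 10^{2.7}
```

That is a factor of 500 - a little under three orders of magnitude more data delivered for each Joule of energy expended.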
The future looks even more exciting, as technologies are already starting to appear that introduce novel ways to harness 5G.

In "radio frequency harvesting," the energy transmitted over radio waves can be captured and used on the client device or parts of the underlying infrastructure. Since radio frequency signals carry both information and energy, it is theoretically possible to harvest some energy and receive some information from the same input.

This system is known as SWIPT - simultaneous wireless information and power transfer. The hardware required for this is still in development, and there's a rate-energy tradeoff between the amount of data and energy derived from a signal.

So SWIPT will never charge your smartphone wirelessly, but it is a novel approach which could offset the power consumption required for data transmission on the client device.

The rollout of 5G is likely to be closely linked to the rise of the modular and containerized "Edge" data center market, which in turn is driven by the need to make communication as efficient as it can be in energy and latency terms, by placing time-sensitive functionality at the network edge.

It's clear that energy efficiency will be key to all future developments in the digital infrastructure industry.

In the next decade, we will see how 5G's energy efficiency - and its impact on the world - plays out. Right now, the technology is in its infancy and there is not enough data to know what impact its energy consumption will have in real life.

As every moving part of the industry becomes more knowledgeable and less risk averse, 5G could bring in an era of near instantaneous communication, with a plethora of new industries and markets making every effort to minimize their impact on the world, through collaboration and smart infrastructure deployment and management.

The roll out of this Edge-based technology will bring huge changes to data centers and other infrastructure. But for that to happen, companies have to ensure that efficiency is paramount, as they rush to compete for the sector.



INTELLIGENT MEDIUM POWERBAR

Delivering power safely and efficiently in mission critical environments.

E+I Engineering's innovative iMPB product is an open channel busway system designed for use in data centres and other mission critical environments. E+I Engineering have completed iMPB installations in data centres across the globe, where security and flexibility of electrical distribution is paramount.

iMPB has been engineered with the safety of the installer and user in mind.

For more information about our full range of products, please contact us at info@e-i-eng.com

Donegal, Ireland | South Carolina, USA | Ras al-Khaimah, UAE

WWW.E-I-ENG.COM
