by Anubama Chinnakannan
Executive summary
Artificial Intelligence (AI) is a fast-emerging tool for
analytics-based services. Enabled by progress in
data generation, storage, and computational power,
operators can leverage AI and set up standardized
operations across the organization. But AI is a broad
term with multiple definitions and applications, which
can lead to confusion. In this paper, we explain AI, its
subset Machine Learning, and its variations. We
show how AI can be applied at three levels (in the
cloud, at the edge, and on the device itself). We
provide generic and building-specific examples for
each concept.
Schneider Electric – Energy Management Research Center White Paper 502 Ver 1
Introduction

The use of commercial AI has tripled¹ over the past two years, leading to a reevaluation of core IT infrastructure to optimize for AI productivity. Digital leadership is essential, and many organizations have already launched AI initiatives in their operations. According to Gartner, despite the global impact of COVID-19, 47% of AI investments were unchanged.² One sector that can benefit greatly from AI is digital buildings, the focus of this paper.
In North America and Europe, people spend 90% of their lives indoors, on average.³ As such, buildings are constantly improved for occupant comfort,
sustainability, efficiency, and other factors, using different technologies. For
example, the built environment is undergoing a digital transformation, starting from
the simple conversion of analog equipment to digital devices, to engaging several
IoT sensors integrated in an intelligent communication sphere. Progressive
technologies such as AI act as enablers in ongoing building optimization efforts, and
they leverage existing investments in IoT products.
Buildings account for 40%⁴ of all energy use in the United States, and on average almost 30%⁵ of the energy consumed in commercial buildings is wasted. AI has begun to address this problem.
The large amounts of data that buildings generate tend to remain siloed in databases, either in the cloud or on-premises. That data becomes an asset once the relationships within it have been understood and established. These relationships can be gathered offline or online and processed for active AI implementation and data analysis (Figure 1).
1. Chirag Dekate, et al., Predicts 2021: Operational AI Infrastructure and Enabling AI Orchestration Platforms, 2020
2. Gartner, 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020
3. Klepeis, Neil; Nelson, William; Ott, Wayne; Robinson, John (2001). The National Human Activity Pattern Survey (NHAPS): A Resource for Assessing Exposure to Environmental Pollutants.
4. https://www1.eere.energy.gov/buildings/publications/pdfs/corporate/bt_stateindustry.pdf
5. https://www.energy.gov/eere/buildings/about-commercial-buildings-integration-program
6. Building information modeling (BIM), Electrical power management system (EPMS), Building automation system (BAS)
Figure 1
AI process outline
The ‘Internet of Things’ (IoT) has led to sensors being generated, implemented, and used everywhere. The data gathered through these sources holds a great deal of untapped potential, and AI can be employed to leverage and increase the value of such investments.
AI enablers
AI has come a long way since the term was first coined in 1956. Exponential development in essential technologies has given traction to AI research, processes, and innovation. The three key enablers are:
Power, speed, storage, and cost go together, and unsurprisingly, there is no shortfall of innovative solutions at this intersection either. The Stanford annual AI Index report⁸ states that, over a span of 18 months, the time required to train a network for supervised image recognition on cloud infrastructure fell from 3 hours in October 2017 to
7. https://deepmind.com/blog/article/deepmind-ai-reduces-google-data-centre-cooling-bill-40
8. https://hai.stanford.edu/sites/default/files/ai_index_2019_report.pdf
about 88 seconds in July 2019. As for cost, in October 2017, reaching 93% accuracy in image classification required about 13 days of training time and cost around $2,323. By March 2020, the latest benchmark available for an advanced process (Stanford DAWNBench running a ResNet model)⁹ on the cloud attained the same accuracy for around $7.
AI explained

The term AI has multiple definitions depending on the industry, application, and use. A quick description: AI makes it possible for a computer to mimic human-like intelligence. It enables a machine to learn from training datasets and then from experience. Some branches of AI have an added layer of decision making¹⁰, some work with processes inspired by the human brain (neural networks), and some, specifically ‘Machine Learning’ (ML), learn from a sample or training dataset. ML sometimes identifies patterns without being led to them and can also be given decision-making capabilities. It can be trained to execute these decisions with minimal human intervention.
AI encompasses several branches (see Figure 2), and this paper focuses on ML
and use cases that define trends in the built environment.
Figure 2
Branches of AI
• Supervised learning
• Unsupervised learning
• Reinforcement learning
9. https://dawn.cs.stanford.edu/benchmark/
10. AI techniques are enriching data-based decision-making capabilities by analyzing large amounts of data and providing critical insights based on its application.
Supervised learning trains a model on a dataset of labelled input-output pairs; once trained, the model can produce the correct output for new inputs on its own. While these models usually have good accuracy, the need for large amounts of labelled data poses a challenge, and experts are not always available to meet this requirement.
The field of predictive analytics in a building management system (BMS) uses this concept. The model can predict a potential fault or a load profile, and can also learn from IoT device data to deliver energy performance and savings insights, optimization solutions, and control algorithms. For example, a snapshot from devices that monitor occupancy and lighting in a building provides data to identify energy wasted in unoccupied spaces. The model can learn arrival times and space occupancy patterns, then connect them with building load profiles and off-peak hours to apply energy-saving measures. A simple example is automating processes such as turning on HVAC units in different zones based on requirements and red-band avoidance. It is also possible to incorporate a learning model that adjusts lighting systems to work with the natural light available and takes occupant preferences into consideration.
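As a minimal sketch of this idea, the toy supervised model below learns from invented, hand-labelled snapshots of (hour of day, lights on, occupancy) to flag likely energy waste. A simple k-nearest-neighbour vote stands in for whatever model a real BMS would use; all feature names and data points are assumptions for illustration.

```python
import math

# Labelled training snapshots: (hour, lights_on, occupied) -> label
# where 1 = energy likely wasted, 0 = normal operation. Data is invented.
TRAINING = [
    ((9,  1, 1), 0), ((13, 1, 1), 0), ((18, 0, 0), 0), ((7, 0, 0), 0),
    ((22, 1, 0), 1), ((2,  1, 0), 1), ((20, 1, 0), 1), ((23, 1, 0), 1),
]

def predict(sample, k=3):
    """k-NN: vote among the k labelled examples closest to the new snapshot."""
    nearest = sorted(TRAINING, key=lambda ex: math.dist(ex[0], sample))
    votes = [label for _, label in nearest[:k]]
    return round(sum(votes) / k)

print(predict((21, 1, 0)))  # lights on in an empty room at 9 pm -> 1 (waste)
print(predict((10, 1, 1)))  # lights on in an occupied room at 10 am -> 0
```

In practice, the labels would come from commissioning engineers or historical audits rather than being hand-written, but the train-then-predict pattern is the same.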
Unsupervised learning contrasts with the above in that it does not train from a pre-labeled dataset. The model looks for patterns in a massive dataset without being told what it is looking for. This is powerful because the search has no predefined boundaries or constraints: it delivers unique insights, truly ‘thinks outside the box’, and identifies correlations that are overlooked or ‘not thought of’ in manual data handling.
Figure 3
Unsupervised learning:
Clustering representation
The model can not only separate the needle from the haystack on its own, but can go on to separate all sorts of other things we did not know were in the haystack. A good use case is creating cluster maps to plot the operational data of cold-water chillers. With minimal configuration, these plots can detect outliers or patterns that can be used for energy efficiency studies and/or identify trends that may eventually lead to a failure. Such outlier data points may not be significant enough to trigger an alarm, but are odd enough for an unsupervised learning algorithm to detect a useful pattern for, say, preventive maintenance. This is illustrated in Figure 3.
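A minimal sketch of label-free outlier detection on chiller data follows, using a simple two-standard-deviation test in place of the full cluster maps described above. The kW readings are invented; the point is that the anomaly is found without any labels or alarm thresholds being configured.

```python
from statistics import mean, stdev

# Invented power draws (kW) for a chiller at a steady load; one reading
# is anomalous but not high enough to trip a conventional alarm.
readings_kw = [41.2, 40.8, 41.5, 42.0, 40.9, 41.1, 55.3, 41.4, 40.7, 41.6]

mu, sigma = mean(readings_kw), stdev(readings_kw)
# Flag any reading more than 2 standard deviations from the mean:
# worth a preventive-maintenance look, even if no alarm fired.
outliers = [x for x in readings_kw if abs(x - mu) > 2 * sigma]

print(outliers)  # -> [55.3]
```

A production system would cluster many variables at once (load, flow, ambient temperature), but the principle of letting the data define "normal" is the same.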
Reinforcement learning, in contrast, practices a trial-and-error method to reach a defined objective. By continuously taking actions, and being rewarded or punished for each action, the model learns whether those actions move it toward or away from the end objective. This can be loosely compared with the dynamic between teachers and students: the objective may be to learn French, and you are rewarded or penalized on your tests and choose to learn from your mistakes. A reinforcement learning algorithm approaches its actions the same way, making trial-and-error-based decisions. It is possible to train a model partly or wholly by trial and error in operation, but long adaptation times may be needed for the system to converge to a relevant behavior.
These learning tools can be used for applications such as adaptive temperature control through the IoT network. An inflow of information from operational devices and controllers, along with observational learning based on weather, occupancy, building type, and other variables, feeds constantly improving models: as the model becomes more accurate at identifying relationships between these variables, it improves. The incentive is to eliminate the need for building occupants to constantly adjust thermostat set points, eventually reducing discomfort and complaints to facility managers.
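The reward-and-punishment loop above can be sketched with tabular Q-learning on a toy thermostat problem. The states, actions, reward, and hyperparameters below are all invented for illustration; a real adaptive controller would learn from far richer inputs (weather, occupancy, building type).

```python
import random

random.seed(0)

# Toy problem: learn to keep room temperature near a 22 degree target
# by nudging the setpoint up or down each step.
TARGET = 22
TEMPS = list(range(18, 27))   # discrete temperature states, 18..26
ACTIONS = [-1, 0, 1]          # lower, hold, raise the setpoint

# Q-table: estimated long-term reward of each action in each state
q = {t: {a: 0.0 for a in ACTIONS} for t in TEMPS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(2000):
    t = random.choice(TEMPS)
    for _ in range(20):
        # epsilon-greedy: mostly exploit the best-known action, sometimes explore
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(q[t], key=q[t].get)
        t_next = min(max(t + a, TEMPS[0]), TEMPS[-1])
        reward = -abs(t_next - TARGET)   # punished for occupant discomfort
        q[t][a] += alpha * (reward + gamma * max(q[t_next].values()) - q[t][a])
        t = t_next

# The learned policy should push cold rooms up, hot rooms down, and hold at target.
policy = {t: max(q[t], key=q[t].get) for t in TEMPS}
print(policy[18], policy[26], policy[22])
```

The "long adaptation times" caveat in the text corresponds to the thousands of episodes needed here even for nine states and three actions.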
Further, the approach can be applied to eliminate the need to manually trim existing temperature controls, which is almost impossible to do while operating a large-scale system. The building environment does not stay the same over time; therefore, control sequences and settings must be continuously adapted if the AI is not doing this for you.
Where is AI applied?

Data-driven analytics and the development and deployment of AI algorithms can be carried out in the cloud, at the edge, or embedded on the device itself. What matters is that the algorithms have access to the data needed and enough computing power to run a model. As such, key criteria for conducting analytics include network connectivity, storage facilities, and cybersecurity.
Cloud AI
Cloud computing services are provided over a network and usually come with resources such as data storage and computational power. Cloud computing provides ease of operation and scalability: administrators can tap into it remotely, and it allows models to be trained on multiple data sources. For example, it is easier to train models with data from many buildings at once. ML/AI building applications in the cloud are emerging, and a few are described below.
Predictive control
Building control is traditionally reactive, i.e. it does not usually take predictive
measures based on forecasted events such as weather or expected occupancy. If it
does, it is commonly rule-based and does not make tradeoffs between comfort,
energy usage, and other factors.
Model predictive control is a technology that incorporates predictive analytics for occupancy levels, energy price forecasts, and weather forecasts to optimize for comfort and energy cost savings. It requires a mathematical model of the building to learn from, for example to analyze how much pre-cooling can be done before a spike in energy prices.
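The pre-cooling tradeoff just described can be sketched as a toy model predictive controller. The one-line building model, price curve, and comfort band below are invented assumptions, and brute-force search over on/off schedules stands in for the optimizer a real MPC would use.

```python
from itertools import product

# Invented price curve with a spike in hours 3-4, and an assumed chiller draw.
PRICES = [0.10, 0.10, 0.10, 0.40, 0.40, 0.10]   # $/kWh per hour
COOL_KW = 5.0                                    # cooling power draw (assumed)
COMFORT = (20.0, 24.0)                           # allowed temperature band

def simulate(schedule, t0=23.0):
    """Toy building model: cooling drops 1.5 deg/h, thermal drift adds 1 deg/h."""
    temps, t = [], t0
    for on in schedule:
        t += -1.5 if on else 1.0
        temps.append(t)
    return temps

best_cost, best_plan = float("inf"), None
for plan in product([0, 1], repeat=len(PRICES)):        # all on/off schedules
    temps = simulate(plan)
    if all(COMFORT[0] <= t <= COMFORT[1] for t in temps):   # comfort constraint
        cost = sum(on * COOL_KW * p for on, p in zip(plan, PRICES))
        if cost < best_cost:
            best_cost, best_plan = cost, plan

print(best_plan, round(best_cost, 2))
```

Under these assumptions the cheapest feasible plan pre-cools during the cheap early hours and leaves the chiller off through the price spike, which is exactly the behavior the text describes.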
Historically, this approach has been held back by the amount of initial effort needed to create the model and tailor it for different building profiles.
In recent years, the use of AI has revived this approach by creating a building model
using supervised ML. The trained model creates assumptions of the building’s
HVAC configuration and can be initiated with little human involvement. It learns how the building responds to heating and cooling, as well as its usage patterns. Manually fine-tuning HVAC controls is an intensive process, but AI, combined with centralized control via networked electronic devices, makes it far easier.
For example, the model can derive an energy optimization schedule from a
supervised learning algorithm. We start with a certain level of accuracy and work on
improving the model over time. It is re-trained and re-directed to constantly improve
efficiency while ensuring uptime. Schneider Electric has been actively involved in
evaluations and investments that develop predictive analytics and condition
management tools. An example of failure prediction is in the Building Analytics
automated fault detection and diagnostics platform. Efforts to develop an AI component enhancing this portfolio, along with other AI-enabled energy management services, help buildings switch from reactive management methods to a pre-emptive approach.
Figure 4
Energy Signature Tool
Energy and temperature meters in the building feed required data points into an
aggregator built to support the tool. This aggregator refines and streamlines the data
chunks to produce daily averages, which are then clustered to form models based
on user definitions such as the energy signature for a workday or holiday. The
baseline model can automatically update, and outliers can be manually disregarded
or assessed if needed. The plot in Figure 4 shows all energy data points plotted
against temperature. It fits within expected values for a building with just heating
11. https://www.gartner.com/smarterwithgartner/gartner-top-10-trends-impacting-infrastructure-operations-for-2020/
Embedded AI
In the case of hardware on site, ML can be applied to connected devices such as a
room controller. Limited computing and data storage capacity likely means that an
algorithm will have to be trained elsewhere, with alternate training sets (Figure 5).
The trained model is a pre-packaged ML tool with minor tweaks. If required, this tool
can accommodate small design changes to adapt to different types of the same
device. It can thus perfectly complement (work as an add-on to) the working of these
devices.
Figure 5
AI embedded on a room
device: process outline
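The "train elsewhere, run on the device" split can be illustrated with a hedged sketch: assume the cloud ships only the learned coefficients of a small logistic model, and the room controller evaluates it with a few arithmetic operations. The feature names and coefficient values below are invented.

```python
import math

# Coefficients assumed to have been trained in the cloud and downloaded
# to the controller; the device itself never trains anything.
COEFFS = {"bias": -4.0, "temp_error": 1.2, "occupancy": 2.5}

def should_boost_cooling(temp_error, occupancy):
    """On-device inference: a logistic score from the pre-trained weights."""
    z = (COEFFS["bias"]
         + COEFFS["temp_error"] * temp_error
         + COEFFS["occupancy"] * occupancy)
    return 1.0 / (1.0 + math.exp(-z)) > 0.5   # act when probability > 50%

print(should_boost_cooling(temp_error=3.0, occupancy=1))  # hot, occupied room
print(should_boost_cooling(temp_error=0.5, occupancy=0))  # near setpoint, empty
```

This is why limited on-device compute is not a blocker: inference over fixed coefficients is cheap even when training is not.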
In summary, the same AI function can be delivered in multiple ways. For example, an AI application can be embedded on a controller or made available in the cloud. There are variations in deployment, cost, and maintainability, but in some cases it can be beneficial to learn a model in the cloud and then deliver it at the edge or embedded-equipment level. This model may or may not need to re-learn at regular intervals, depending on complexity and alterations to data inflows.
Domain knowledge, coupled with a range of equipment and solutions across the three layers of connected products, edge control platforms for buildings, and cloud-based apps, analytics, and services, maintains a strong foundation for smart building development.
What’s next?

The future of AI, and specifically AI in buildings, is evolving and exciting. Gartner’s hype cycle for AI (illustrated in Figure 6) delivers a preview of upcoming advancements in AI technology. Specifically for buildings, a post-pandemic market creates several AI applications, such as contactless building controls equipped with features like speech or pattern recognition, access control, and security.
Figure 6
Hype Cycle for AI by
Gartner
AI promises faster and more accurate automated decision making, better process control, insights for profitable business solutions, and more. AI nudges the shift to new business models, differentiating feature-centric products from value-centric services and moving toward new solutions and experiences such as cognitive human-building interaction.
Leveraging AI
Understanding the key concepts driving the future of smart building technologies
and leveraging AI solutions provides a competitive edge in creating intelligent digital
buildings of the future. Buildings are expected to grow smarter and more interactive
with data-driven analytics, improving business value. Simple starting steps to help
prepare for wider implementation of emerging technologies such as AI are outlined
below.
While these are just a few steps among several ways to get the gears going, they help shift traditional perceptions of buildings and complement the transition to an IoT-centric, analytics-equipped smart building system.
It is important to keep in mind that the quality and reliability of these learned models are statistical and depend on how representative the training dataset is. An AI model performs well when operating within the boundaries of the data space on which it was trained; if there are drastic alterations or an inflow of unfamiliar data points, the model's output can become skewed. Hence, from the first stage of developing an AI model, care must be taken with both the constitution of the learning dataset and the conditions the model might be exposed to once deployed.
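One minimal way to guard against this, sketched below with invented feature names and ranges, is to record the value range of each feature seen during training and flag inputs that fall outside that data space, so the system can fall back to conventional control and queue the model for retraining.

```python
# Feature ranges assumed to have been recorded during training (invented values).
TRAINING_RANGES = {"outdoor_temp": (-5.0, 35.0), "load_kw": (10.0, 120.0)}

def in_training_domain(sample):
    """True only if every feature lies inside the range seen during training."""
    return all(lo <= sample[name] <= hi
               for name, (lo, hi) in TRAINING_RANGES.items())

# A heatwave reading outside the trained range: predictions are unreliable here.
print(in_training_domain({"outdoor_temp": 42.0, "load_kw": 95.0}))  # False
print(in_training_domain({"outdoor_temp": 21.0, "load_kw": 60.0}))  # True
```

Real deployments use richer drift statistics than simple range checks, but even this guard makes the model's validity boundary explicit.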
Further work in “Responsible AI” and AI governance also becomes a priority for AI
on an industrial scale.
The availability of data and an increase in computational power – both key enablers for AI implementation – have kickstarted a new digital disruption. Data from the National Venture Capital Association (Q3 2019) shows that 965 AI-based firms raised $13.5 billion in the first nine months in the United States alone. According to an Economist Intelligence Unit report, 75% of business executives say AI will be implemented in their companies within the next three years. On a large scale, this technology operates as a complementary, add-on innovation to existing systems. The financial advantage of working with data-based models is already presenting itself.
Conclusion

The rise of IoT technology, the proliferation of data, and the hybridization of computing architecture all contribute to the useful implementation of AI in building automation. Buildings generate large amounts of data that AI methods such as ML and neural networks can leverage to deliver solutions for building automation, cost control, energy savings, and optimization, to name a few. AI can be deployed to suit your preferences, either at the edge or in a centralized cloud, while upholding cybersecurity, resiliency, and occupant comfort. Based on the application, as well as data and storage requirements, AI can be implemented at every level, from the individual device up to a remote analytics platform.
Contact us
For feedback and comments about the content of this white paper:
Schneider Electric Science Center
dcsc@schneider-electric.com