
Article

Multi-Agent System Theory based on HMM Learning


for Smart Home Automation
Kalthoum Zaouali 1,†,‡ * , Mohamed Lassaad Ammari 2,†,‡ , Amine Chouaieb 3 and Ridha
Bouallegue 4,†
1 Université Tunis El Manar, Ecole Nationale d'Ingénieurs de Tunis; zaoualikalthoum@gmail.com
2 Department of Electrical and Computer Engineering, Laval University; lammari@gel.ulaval.ca
3 Chifco Company; amine.chouaib@chifco.com
4 Sup’Com, Carthage University; ridha.boualegu@supcom.rnu.tn
* Correspondence: zaoualikalthoum@gmail.com
† Innov’Com Laboratory

Version November 1, 2018 submitted to Sensors

Abstract: Smart home monitoring technology has received considerable attention in the Internet of Things community, driven by the challenges of remote monitoring, intelligent control, advanced energy management, and data analytics and prediction. Smart home applications are inherently dynamic, producing time series data with real-time behavior. We seek to extract and analyze this incoming data to predict features that are useful for a decision-making system. This paper describes a new methodology for smart home data analysis based on a Hidden Markov Model (HMM) used to train a proposed Multi-Agent System (MAS). The experimental results show that the HMM model outperforms the support vector regression model. To our knowledge, a hybrid solution combining MAS and HMM modeling for home automation has not been thoroughly addressed in previous research.

Keywords: Smart home; Distributed control; Home automation; Multi-agent system; Machine learning; Hidden Markov model; Time series prediction.

1. Introduction

Nowadays, people seek refuge, relaxation, security and prosperity at home; they want everything to operate effortlessly, as if by magic. Since homeowners, especially elderly people, hope to make the most of these opportunities at home full-time, their behavior strongly influences their quality of life. The development of integrated electronic devices, together with the decline in their cost, has opened new perspectives for the automated control and monitoring of the habitat, which appears to be a task of great social importance. In recent years, computer technology has been applied to the creation of smart homes in order to improve the lives of people at home. A “Smart Home” is an entity integrating diversified automation, communication and control functions that collaborate conveniently through intelligent tasks to provide services such as energy management, comfort, security and monitoring. The “Smart Home” field is multidisciplinary, with varied architectures, dissimilar devices and systems, and diverse applications and features. In a “Smart Home” environment, inter-operation unpredictability is one of the fundamental causes of uncertainty in the interoperability of systems that have diverse requirements yet must exchange information and federate their routines.
Home automation solutions range from presenting overview information about energy consumption, presence and weather data in illustrative forms, to automating and monitoring appliances either locally or remotely. Traditional building control strategies are no longer efficient or smart enough to support dynamic pricing of electricity and demand-response applications for residential customers.




To overcome this fundamental limitation and to handle such complex and dynamic decision processes, a Multi-Agent System (MAS) is required to provide intelligent energy management and home automation.
As distributed artificial intelligence systems, MASs allow intercommunication and interaction between intelligent agents. These agents interact by perceiving the environment and executing actions according to their decisions, in order to drive the environment and to study its various situations. Agents must cooperate to achieve the overall goals, since each has only a partial representation of its environment. Their actions can be carried out with the intelligent ability to learn, to analyze and to form new knowledge. To design its own plans and to achieve the proposed objectives, an intelligent agent has to be autonomous and active. Its reactivity and pro-activity consist of sensing and interpreting the changes in the environment in real time, starting from the available information [1].

Home automation can be addressed through appliance data prediction and analytics, offering support for more advanced applications such as activity recognition. Automatic human activity recognition requires new context-aware domestic smart services, such as voice control in smart homes [2]. Chahuara et al. [3] investigated on-line recognition of Activities of Daily Living from non-visual, audio and home automation sensors. The performance of this smart service is determined by comparing several learning models, such as conditional random fields, a sequential Markov Logic Network (MLN), a Support Vector Machine, a Random Forest and a non-sequential MLN.
Predicted time series data can be exploited for energy consumption data mining, home automation, and homeowner activity and behavior forecasting. Clusters of historical data are used by machine learning methods that take into account the effects of previous device states. Therefore, we propose a hybrid approach to predict and classify features using the HMM model, compared with a support vector regression model.
The MAS consists of an intelligent distributed artificial system integrating IoT aspects and human cognition for smart home automation. The proposed architecture is based on a virtual multi-agent platform incorporating services that provide wireless sensor network interconnectivity. This design makes it possible for the smart home to include organizational aspects of human activities, improving the auto-control and auto-programming of appliances. Autonomous agents embedded in devices consume and provide different services to offer more distributed and complex functionalities and capabilities. The proposed virtual MAS should guarantee platform scalability, flexibility, and efficient and intelligent communication, with the ability to add new functionalities in the user application layer. Its main function is the fusion of incoming information from heterogeneous sources. The proposed system must be able to monitor the smart home automatically and to predict household energy consumption through a predictive model whose input variables cover weather and space information, appliance states, power consumption and the date-time. Wireless Sensor Networks (WSNs) are used to analyze electrical appliance commands and the interactions between different sensors in order to make decisions automatically on a daily basis.
Currently, smart home owners have trouble controlling and managing their various heterogeneous smart sensors and devices on a daily basis. One of the greatest challenges in the current smart home field is to deal with the collaborative control of these heterogeneous smart devices using a home automation platform.
Our contribution is to propose a universal smart home automation platform architecture based on a multi-agent system and predictive software for decision making. The proposed system provides heterogeneous device connectivity, collaborative agent communication, human-device interaction and appliance auto-control to improve the smart home automation system.
In this paper, we demonstrate the ability to predict the behavior of smart home owners by monitoring daily appliance usage. We developed a robust, flexible and intelligent platform integrating temporal and spatial contextual information based on the incoming sensor data.

The developed prototype is used to forecast daily power consumption and to estimate human actions in a smart home. We tested the performance of our system using statistical metrics, and the experimental results are encouraging. The model features are updated based on the time series analysis. Deploying the proposed smart home scenario required hands-on work with the MAS, the HMM learning algorithm and the Android mobile system. Our solution is well integrated into the professional company's IoT application.
The rest of this paper is structured as follows. In Section 2, we review related work in the literature. We describe our proposed MAS modeling for home automation in Section 3. Section 4 defines the prediction of incoming time-sequence data using the SVM and HMM models. The feature extraction techniques for the training process are explained in Section 5. A discussion of the experiments and results is presented in Section 6, followed by the conclusion.

2. Related works

A smart home should meet the requirements of homeowners, including security, weather information, consumption measurements and remote control of different categories of smart products. In response to rapidly growing expectations for the smart home, several commercial IoT solutions provide home automation using smart sensors, acting as a universal software-defined remote control for smart home appliances that can be handled safely by homeowners [4].
Smart home applications are generally based on the Internet of Things and cognitive dynamic systems. Feng et al. [5] investigated a cognitive, interactive, people-centric IoT in the smart home to improve human life quality through intelligent control of the home setting, by considering perception-action cycles that contribute to the interactive IoT ecosystem.
Viswanath et al. [6] proposed an IoT system design for real-time residential smart grid applications, focused on a large number of home devices communicating through a universal IoT home gateway, a cloud back-end server and user interfaces, offering services such as energy management, demand-response management, dynamic pricing, energy monitoring, home automation and home security.
MASs are a preferred approach for addressing complex systems; they rely on classical artificial intelligence extended to a distributed computing environment and the sharing of tasks, resources and knowledge. Their fully decentralized nature makes them particularly suitable for this type of system. MASs make it possible to work on the overall operation of a system by focusing on its component entities and their interactions. Moreover, MASs are increasingly applied in various fields of real life owing to the technological revolution and the intensive use of Internet services by large companies [7]. Several constraints arise from this diversity of application domains, among which we can cite the difficulty of specifying and systematically modeling applications, the problems related to resource and task allocation, and the complexity of negotiation and inter-cooperation between agents.
Several application areas for agents stand out. One example is the energy field and the application of MASs to the intelligent management of electrical networks (Smart Grid [8], Smart City [9,10] and Smart Home [11]). The problem is to share energy while respecting various criteria such as overall performance, disparities between clients, and the problem of waste and its impact on the environment. Many approaches have been demonstrated, but they often rely on decision rules that must actually be negotiated between the actors. A MAS makes it possible to consider the consequences of the energy allocation rules and the actors' respective attitudes, and to change the parameters in real time. The fields of telecommunications and information can also rely on MASs for network management, remote management of the smart home and information retrieval. Designers build on the cooperative and interactive capabilities of MASs to address real communication problems. Many researchers have applied MASs to dynamic spectrum allocation and sharing in the context of cognitive radio networks, in order to improve real-time application performance for a cognitive radio mobile terminal [12]. These distributed systems are applied in e-commerce systems to reduce Internet search time through intelligent assistants that propose products and alternative merchants, and offer the possibility of negotiating a product between users, customers and vendors [13].

Persistent universes for distributed adaptive tracking-type games have also been designed based on MASs [14]. In the information field, information retrieval techniques over heterogeneous and distributed sources incorporate interacting agents to collect and analyze data, update the knowledge base and diagnose the information system [15]. MASs are also applied in workflow management to handle relevant information that is highly complex, dynamic and drawn from several resources. This offers large companies the possibility of automatically tracking and remotely circulating information between their different customers, as well as negotiating and automatically supporting their business processes [16]. The MAS holds the appropriate meta-data represented as an ontology, and exhibits cooperative and autonomous behavior that strengthens the intentional communication capability between agents. In industry, MASs are operated in production automation processes [17], smart cards [18] and cooperative robots [19]. Manufacturers often try to resolve targeted issues, such as the complexity of real-time systems, effectively. This issue provides an intellectual challenge to researchers who seek to maintain soft real-time systems using agents. Thus, integrating the control, maintenance and technical management of production process automation systems based on MASs can lead to the resolution of various industrial problems [20]. Recently, multi-agent systems have been utilized for crop irrigation monitoring, allowing farmers to rationalize the amount of resources while optimizing the needs of the crops. The system offers an economical solution combining intelligence and context-awareness by merging heterogeneous data from the wireless sensor network [21]. Tele-medicine also applies MASs in the supervision and assistance of sick elderly people. They are obviously used in the field of logistics and travel information as well [22,23].
Smart grid architectures consist of many autonomous systems with partial knowledge that have to cooperate and communicate in order to solve complex problems. Multi-agent technology is a mature and efficient mechanism that provides many features imperatively required by smart grid applications to solve the problems arising from their distributed nature. However, it is still difficult to provide a generic perspective on smart grid architectures. In the automation of power distribution networks, a multi-agent control logic was modeled as communicating logical nodes [24]. Dynamical multi-agent systems have been used for the systematic modeling of the cyber and electrical grids to provide flexibility [25,26]. Other solutions for smart grid architectures have been explored using a MAS to develop decision-making support for distributed power generation [27] and for managing electric vehicle charging stations [28].
In order to minimize the wastage of household energy and adjust energy consumption according to customary behavior, a smart home scenario based on a Wireless Sensor Network (WSN) and a MAS has been set up by building owners [29]. The efficiency of this scenario depends on the correlation between outlet consumption, human presence and weather parameters. The architecture of a MAS-based home energy management system implements agents to optimize home energy use through local control algorithms and collaboration with other agent groups [30].
Several studies identify information security requirements that impact critical societal services, ranging over transport, home automation, energy management, industrial control and health services, as well as users' perceptions and attitudes regarding the security of the Internet of Things (IoT) [31,32].
As dynamic contributors to smart grids, smart home systems ensure interactive communication that affects electricity demand, generation and bills, leading to a reduction in the total demand curve. Smart grid modeling has been based on a multi-agent system that designs each smart home as an autonomous agent making rational decisions for power generation, storage and trading, founded on the expected utilities they offer [33–35]. The individual decisions of these home agents significantly reduce electricity costs, which encourages conventional homes to purchase their own local generation-storage systems and to benefit from the different operational conditions [36].
Integrating the Internet of Things (IoT) with cloud computing [37] and web services, Javed et al. [38] investigated a model based on wireless sensor nodes for smart home monitoring and HVAC, using machine learning such as ANNs to estimate setpoints for controlling the heating, ventilation and cooling of the smart home while reducing human intervention.
Home automation is trending towards IoT and Ambient Intelligence technologies for device self-control.

Ruta et al. [39] proposed a distributed multi-agent framework for home and building automation and energy management, based on a semantic enhancement of domotic standards using the discovery and orchestration of resources with formal annotations, to support the semantic characterization of user profiles and device functionalities. In [40], the authors proposed an adaptive automation architecture for activity recognition in the smart home based on semi-supervised learning algorithms and Markov models, combining the data observations and the feedback of the users' decisions.
Jose et al. [41] studied various security issues and verification approaches to improve smart home security. They proposed a device fingerprinting algorithm that considers the device's geographical location, the user's username and password, and the device used to access the home.
During the last few decades, Hidden Markov Models (HMMs) have become an increasingly important framework in the mining and analysis of data. The HMM has built on its success in speech recognition technology up to the most recent research [42,43]. Similarly, the HMM has been exploited for handwritten manuscript recognition [44]. Algorithms based on HMMs are now used to solve a variety of problems, including gene prediction and protein structure prediction [45,46]. Improving and adapting HMM algorithms for different applications is a promising field of research. Many deterministic and probabilistic models have been proposed for forecasting and prediction. HMM-based solutions have been used to predict stock returns (finance & economics) and stock price indices [47].
In this work, we focus on the HMM for two reasons. First, the power consumption arises from the superposition of a finite set of ON/OFF loads of various devices; such superpositions have been shown in the past to be well modeled using Markov models. Second, HMMs have been widely used to model sequential events and have been shown to combine both parsimony and descriptive power [48]. A discrete process taking place in the real world generates what can be viewed as a symbol at each step, as in the prediction of weather conditions [49]. These conditions and the location add up to form a real-world time series. As time goes by and the process keeps running, we obtain a sequence of symbols generated by the process. Depending on the outcome of the process, these symbols can be discrete or continuous. These sequences of observed symbols can be used to build a statistical model that describes the workings of the process and the generation of symbols over time. The HMM can be used for this purpose to identify or classify other sequences of symbols. A Markov chain is useful when we need to compute a probability for a sequence of events that we can observe in the world. However, the events we are interested in may not be directly observable. An HMM allows us to reason about both observed events and hidden events, which we think of as causal factors in our probabilistic model.

3. Multi-Agent System Modeling for Home Automation

In a world where technology is constantly evolving and competition is fierce, it becomes necessary for companies to use telecommunications and information technology. Home automation is a field of growing importance among the areas of interest of several companies. It is in this context that many companies have decided to improve smart home systems by integrating an automatic process that monitors the daily actions of the homeowners and sets up advisory services and automated rules derived from their behavior. The occupant of a smart home often uses the same devices every day, in the same order. An intelligent learning system could learn the behavior of each device and automatically suggest turning the device ON. The main aim is to integrate a solution that learns, recognizes and automates these modes of interaction with different devices, based on a MAS, in order to satisfy criteria combining comfort, safety and economy in terms of energy consumption. Using an agent framework such as the Java Agent Development (JADE) framework, MASs are able to share information, exchange messages and solve complex problems [50]. The FIPA interaction protocols between agents have been developed to define a standard communication structure governing the conduct of conversations between agents. These protocols use a manager agent that offers the performance of a task in terms of appropriate resources and expertise, selects the best returned proposal and then begins the necessary exchanges with the selected agent. An agent asks another agent to perform an action and informs the initiator about the structure in which the agents exist and operate [51].

3.1. Agent Functionalities

In the MAS-based smart home setting, we use four containers, where each container corresponds to a room of the house and is embedded in a main container (the Smart Home container), as shown in Figure 1. These containers interact to estimate the presence of the owners and their activity in each room of the smart home. Every container consists of a set of agent instances that represent the equipment or environment variables (temperature, humidity, etc.) and are responsible for learning the habits, equipment states, environmental conditions and situations of the home owners, in order to supervise, control and take appropriate actions. Each agent has a specific role.
The proposed MAS architecture for each container is presented in Figure 2. The architecture is a hierarchical MAS consisting of eight agent groups constituting the objects of a container, which meet their goals through collaboration with other containers (agent groups). The specific roles of the agents forming the MAS are the following (a sketch of the resulting message flow is given after the list):

• “Sensor Agent”: This agent is responsible for monitoring and controlling the environment sensors by receiving messages from the agents of the same container and saving them in the database. These messages can be transmitted to the “Electrical Outlet Agent”, the “Air Conditioner Agent” and the “Lamp Agent” in order to predict the user actions.
• “Electrical Outlet Agent”: This agent receives the data sent by the “Sensor Agent” and uses it as an input of the SVM classifier to predict the decision states of the electrical outlet.
• “Air Conditioner Agent”: This agent receives the data sent by the “Sensor Agent” and uses it as an input of the SVM classifier to predict the states of the air conditioner.
• “Lamp Agent”: The data required to predict the states of the lamp are sent by the “Sensor Agent”. The “Lamp Agent” uses this information as an input of the SVM classifier.
• “Temperature Agent”: It handles the prediction of the temperature variable for a defined period using SVM machine learning. The results are then sent to the “Sensor Agent” embedded in the same container.
• “Humidity Agent”: It predicts the humidity variable for a defined period using SVM machine learning. The results are sent to the “Sensor Agent” embedded in the same container.
• “Luminosity Agent”: It predicts the luminosity variable for a defined period using SVM machine learning. The results are sent to the “Sensor Agent” embedded in the same container.
• “Presence Agent”: It predicts the presence state for a defined period using SVM machine learning. The results are sent to the “Sensor Agent” embedded in the same container.
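The paper's agents are implemented on the JADE (Java) platform; as an illustration only, the following Python sketch shows the per-container message flow between the environment agents, the Sensor Agent and the device agents. All class names, method names and the dummy predictors are hypothetical.

```python
# Minimal sketch of the per-container message flow described above.
# The actual system uses JADE (Java) agents and LIBSVM; every class,
# method and the placeholder predictors here are hypothetical.

class EnvironmentAgent:
    """Temperature / Humidity / Luminosity / Presence agent."""
    def __init__(self, name, predictor):
        self.name = name
        self.predictor = predictor          # e.g. a trained regressor

    def inform(self, timestamp):
        # Predict the variable for the requested period and report it.
        return {"agent": self.name, "value": self.predictor(timestamp)}


class SensorAgent:
    """Collects environment predictions and forwards them to device agents."""
    def __init__(self, env_agents, database):
        self.env_agents = env_agents
        self.database = database            # any persistent store

    def collect(self, timestamp):
        readings = [a.inform(timestamp) for a in self.env_agents]
        self.database.append(readings)      # save messages in the database
        return readings


class DeviceAgent:
    """Electrical Outlet / Air Conditioner / Lamp agent."""
    def __init__(self, name, classifier):
        self.name = name
        self.classifier = classifier        # e.g. a trained SVM classifier

    def decide(self, readings):
        features = [r["value"] for r in readings]
        return {"device": self.name, "state": self.classifier(features)}


# Example wiring for one room container (dummy predictors/classifiers):
env = [EnvironmentAgent(n, lambda t: 0.0)
       for n in ("Temperature", "Humidity", "Luminosity", "Presence")]
sensor = SensorAgent(env, database=[])
lamp = DeviceAgent("Lamp", classifier=lambda f: "ON" if f[3] > 0.5 else "OFF")
print(lamp.decide(sensor.collect(timestamp="2015-06-01 20:00")))
```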

[Figure 1. General framework for a Smart Home system. Figure 2. Proposed MAS-based architecture for Smart Home automation: within a Multi-Agent System container, the Electrical Outlet, Lamp and Air Conditioner agents exchange Inform messages with the Sensor Agent, which in turn exchanges Inform messages with the Temperature, Humidity, Luminosity and Presence agents.]

4. Predicting incoming time-sequence data models

In order to predict and classify the measured results, numerous methods can be applied. The SVM method seems well suited to this problem, given its ability to handle a small amount of data. We will compare our proposed HMM-based MAS modeling to the SVM model.

4.1. SVM model for prediction and decision making

SVMs are statistical learning methods focused on predicting and classifying data based on a set of support vectors while minimizing the mean error [52]. SVM models have been widely used to solve classification and regression problems.

4.1.1. Classification function

The SVM separates the different classes of data by a hyperplane corresponding to the following decision function:

f(x) = \mathrm{sign}\left( \langle w, \phi(x) \rangle + b \right)   (1)

where x is a set of observations, w is the vector of coefficients and b is a bias constant.

This supervised machine learning method seeks to deduce a decision function f : X \subset \Re^d \rightarrow \{-1; 1\} from a set of observations that correctly classifies a maximum number of vectors x_i, by describing the optimal hyperplane between the two classes.
Considering two classes of points, labeled -1 and 1, we have a set of N vectors x_i \in X \subset \Re^d, i \in [1, N] (d is the input space dimension) with their associated classes y_i \in \{-1; 1\}.
The constrained quadratic optimization problem can be solved with w = \sum_i \alpha_i \Phi(x_i), in terms of a subset of the N training patterns. The classifier capacity results in the assignment of a new unknown point to the right class. The classification function is given by the minimum of the following function:

f\langle w, \xi \rangle = \min \frac{1}{2} \|w\|^2 + C \sum_i \xi_i   (2)

subject to:

y_i \left( \langle w, \Phi(x_i) \rangle + b \right) \ge 1 - \xi_i, \qquad \xi_i \ge 0

where C is a pre-specified value and the \xi_i are slack variables.

Owing to its rich theoretical basis and the possibility of transferring the problem to a large feature space, the SVM can provide good generalization performance. The sensors send pairs of values and activities forming the training set.
This nonlinear binary classification problem requires an optimization model based on the Lagrange multipliers technique and the kernel function. The SVM classifier manages its nonlinear nature by replacing each scalar product of the activity weights in the dual form with the kernel function. Using the kernel method, the SVM solution w can be written as:

w = \sum_i y_i \alpha_i \Phi(x_i)

The support vectors' coefficients are obtained when a point (x_i, y_i) meets the constraints found by maximizing the following equation:

W(\alpha) = \sum_i \alpha_i - \frac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j K(x_i, x_j)   (3)

subject to:

0 \le \alpha_i \le C, \qquad \sum_i y_i \alpha_i = 0
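As an illustration only, here is a minimal scikit-learn sketch of the soft-margin kernel classifier of Eqs. (1)-(3); the paper itself calls LIBSVM from its JADE agents, and the calendar-style features and labels below are synthetic.

```python
# Illustrative only: the paper uses LIBSVM from JADE agents; this sketch
# reproduces the soft-margin kernel SVM of Eqs. (1)-(3) with scikit-learn.
# Feature layout (hour, minute, day of month, ...) follows Table 1; the
# synthetic data below is made up.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# X: calendar attributes, y: appliance state (0 = OFF, 1 = ON)
X = rng.integers(0, 24, size=(200, 7)).astype(float)
y = (X[:, 0] > 18).astype(int)            # toy rule: ON in the evening

clf = SVC(C=1.0, kernel="rbf")            # C and K(.,.) as in Eqs. (2)-(3)
clf.fit(X, y)
print(clf.predict(X[:5]))                 # sign(<w, phi(x)> + b) per Eq. (1)
```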

4.1.2. Regression function

When SVMs are used in regression problems to predict real values, we speak of the “Support Vector Regression” (SVR) technique. Consider a set of training points (x_1, y_1), ..., (x_N, y_N), where x_i \in X \subset \Re^d, i \in [1, N], and the output targets y_i \in Y \subset \Re, i \in [1, N], with the conditions C > 0 and \epsilon > 0. The SVR performs the regression and estimates its accuracy by computing the decision function y = f(x), where f(x) is the estimated regression function. This function ignores errors that are smaller than a given threshold \epsilon > 0, thus creating a tube around the true output. The optimal regression function, subject to the two constraints

\langle w, \Phi(x_i) \rangle + b - y_i \le \epsilon + \xi_i^{-}
y_i - \langle w, \Phi(x_i) \rangle - b \le \epsilon + \xi_i^{+}

is given by:

f\langle w, \xi \rangle = \min \frac{1}{2} \|w\|^2 + C \sum_i \xi_i^{-} + C \sum_i \xi_i^{+}   (4)

where C is a pre-specified value and \xi_i^{-}, \xi_i^{+} are slack variables representing the upper and lower constraints on the system outputs. The dual form of the SVR, using the kernel function, is:

W(\alpha, \alpha^{*}) = \max_{\alpha, \alpha^{*}} \sum_{i=1}^{l} \left[ \alpha_i^{*} (y_i - \epsilon) - \alpha_i (y_i + \epsilon) \right] - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} (\alpha_i^{*} - \alpha_i)(\alpha_j^{*} - \alpha_j) K(x_i, x_j)   (5)

with \sum_{i=1}^{l} (\alpha_i^{*} - \alpha_i) = 0 and 0 \le \alpha_i, \alpha_i^{*} \le C.

The resulting regression function, obtained by solving the above equation for the Lagrange multipliers \alpha_i, \alpha_i^{*}, is given by:

f(x) = \sum_{SVs} (\bar{\alpha}_i - \bar{\alpha}_i^{*}) K(x_i, x) + \bar{b}   (6)

with:

\bar{b} = -\frac{1}{2} \sum_{i=1}^{l} (\alpha_i - \alpha_i^{*}) \left[ K(x_i, x_r) + K(x_i, x_s) \right]   (7)

\langle \bar{w}, x \rangle = \sum_{i=1}^{l} (\alpha_i - \alpha_i^{*}) K(x_i, x)   (8)
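Similarly, the following is a hedged scikit-learn sketch of the epsilon-SVR of Eqs. (4)-(8), with a synthetic hourly temperature series standing in for the real sensor data.

```python
# Illustrative only: an epsilon-SVR fit in the spirit of Eqs. (4)-(8),
# using scikit-learn instead of the LIBSVM/JADE stack of the paper.
# The calendar feature and the temperature series below are synthetic.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, size=(300, 1)).astype(float)
temp = 20 + 5 * np.sin(2 * np.pi * hours[:, 0] / 24) + rng.normal(0, 0.5, 300)

# C and epsilon play the roles of C and e in Eq. (4); kernel K is RBF.
reg = SVR(kernel="rbf", C=10.0, epsilon=0.1)
reg.fit(hours, temp)
print(reg.predict([[6.0], [15.0], [22.0]]))   # predicted temperatures
```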

Our research provides the opportunity to learn the behavior of a resident of a smart home from all the detected usual tasks. The equipment agents, namely the “Electrical Outlet Agent”, the “Air Conditioner Agent” and the “Lamp Agent”, request the “Sensor Agent” to look for the specific data needed to select the equipment criteria gathered by the “Temperature Agent”, the “Humidity Agent”, the “Luminosity Agent” and the “Presence Agent”. The “Sensor Agent” returns the requested list, which is then processed by the equipment agents. Finally, the results that meet the needs of the resident are returned to the “Sensor Agent”. The interactions between the agents performing the learning algorithm are illustrated by the sequence diagrams shown in Figures 3 and 4.

[Figure 3. Interaction sequence diagram for the temperature, luminosity, humidity and presence agents (with LibSVM and the database): 1. Select Data; 2. Data selected; 3. SVM-Scale(); 4. Data scaled; 5. SVM-Train(); 6. Model; 7. SVM-Predict(); 8. Data predicted; 9. Send Predicted Data; 10. Save Data. Figure 4. Interaction sequence diagram for the Sensor Agent: 1. Select Data; 2. Data selected; 3. SVM-Scale(); 4. Data scaled; 5. SVM-Train(); 6. Model; 7. Send Message; 8. Get Message; 9. SVM-Predict(); 10. Data predicted; 11. Save Data.]

After making its predictions, the MAS sends notifications to the user, who can consult the list of predicted tasks. Based on these results, the user can react by canceling, confirming or changing the date, and the system finally saves these changes.

4.2. HMM model for prediction and decision making

The Hidden Markov Model (HMM) is a good modeling tool describing a natural model structure based on a double stochastic process, which can provide an output distribution model for sequences with temporal constraints, spatial variability along the sequence, and real-world complex processes.

The proposed HMM technique focuses on the forward-backward, Viterbi, Baum-Welch and prediction algorithms to recognize and generate a new sequence of observable symbols.

[Figure 5. Time series data prediction process using HMM: feature extraction (variable selection, normalized vector creation); parameter initialization (transition matrix, observation matrix, initial probability vector); learning of the real model with the Baum-Welch algorithm (best model estimation), the Viterbi algorithm (most probable state sequence prediction) and the forward-backward algorithm (likelihood calculation), iterated until convergence; then variable prediction with the estimated and approved model, compared against the real values through the Mean Squared Error.]

We consider an HMM with N states S = {s_1, s_2, ..., s_N}, where the state at time t is denoted by q_t, and M distinct observation symbols per state, denoted by V = {v_1, ..., v_M}. The HMM parameter notation is \lambda = (A, B, \Pi), where:

• A = \{a_{ij}\}, 1 \le i, j \le N, is the state transition probability distribution, with a_{ij} = P(q_t = s_j \mid q_{t-1} = s_i);
• B = \{b_j(v_k)\}, 1 \le j \le N, 1 \le k \le M, is the observation symbol probability distribution in state j, with b_j(v_k) = P(v_k \text{ at } t \mid q_t = s_j);
• \pi_i = P(q_1 = s_i), 1 \le i \le N, is the initial state distribution.

These parameters correspond to the case where the observations are characterized as discrete symbols chosen from a finite alphabet, and therefore discrete probability densities can be used. However, in practice we mostly deal with continuous observation signals such as speech, biological signals and images. Thus, we have to use an HMM with a continuous observation density. In this work, we are interested in solving three problems, detailed in the following subsections.
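As a minimal numerical illustration of this notation, the following sketch defines \lambda = (A, B, \pi) for a hypothetical two-state (OFF/ON) appliance with three power-level symbols; all probability values are made up.

```python
# Minimal numerical illustration of lambda = (A, B, pi) for a toy
# two-state (OFF/ON) appliance HMM with three power-level symbols;
# all probability values below are made up.
import numpy as np

A = np.array([[0.9, 0.1],          # a_ij = P(q_t = s_j | q_{t-1} = s_i)
              [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05],   # b_j(v_k): symbol probabilities in state j
              [0.05, 0.25, 0.70]])
pi = np.array([0.7, 0.3])          # pi_i = P(q_1 = s_i)

# Sanity check: each row of A and B, and pi, must sum to 1.
assert np.allclose(A.sum(axis=1), 1) and np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```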

4.2.1. Evaluation Problem

The probability of the observation sequence o = o_1, ..., o_T, given a state sequence q = q_1, ..., q_T, is:

P(o \mid q, \lambda) = b_{q_1}(o_1) b_{q_2}(o_2) \cdots b_{q_T}(o_T)   (9)

The probability of such a state sequence can be written as:

P(q \mid \lambda) = \pi_{q_1} a_{q_1 q_2} a_{q_2 q_3} \cdots a_{q_{T-1} q_T}   (10)

Accordingly, the probability of the observation sequence, given the model, is:

P(o \mid \lambda) = \sum_{\forall q} P(q \mid \lambda) P(o \mid q, \lambda) = \sum_{\forall q} \pi_{q_1} b_{q_1}(o_1) a_{q_1 q_2} b_{q_2}(o_2) a_{q_2 q_3} \cdots a_{q_{T-1} q_T} b_{q_T}(o_T)   (11)

The direct computation of this equation involves on the order of 2T N^T operations, which is absolutely infeasible for practical applications. Fortunately, an efficient procedure exists, called the forward-backward procedure, based on the two functions defined by Algorithm 1 and Algorithm 2.

Algorithm 1: Forward-backward algorithm: α forward function

Data: Consider the forward variable α_t(i) defined as:
α_t(i) = P(o_1 o_2 ... o_t, q_t = s_i | λ)

/*Initialization*/
for i = 1 to N do
    α_1(i) ← π_i b_i(o_1)
t ← 1
/*Induction*/
while t ≤ T − 1 do
    for j = 1 to N do
        α_{t+1}(j) ← [ Σ_{i=1}^{N} α_t(i) a_{ij} ] b_j(o_{t+1})
    t ← t + 1
/*Termination*/
P(o|λ) = Σ_{i=1}^{N} α_T(i)


4.2.2. Optimization (Decoding) problem

There are several possible ways to obtain the optimal state sequence associated with the given observation. For example, we can choose the states which are individually the most likely. To implement this, we define a new variable:

γ_t(i) = P(q_t = s_i | o, λ) = α_t(i) β_t(i) / P(o|λ)   (12)

Using this variable, we can solve for the individually most likely state at time t:

q_t = arg max_{1 ≤ i ≤ N} [γ_t(i)]   (13)

This solution is not optimal when there are some null transitions between the different states. Therefore, we need to modify the optimality criterion. Following Algorithm 3, we find the single best state sequence for the given observation sequence in the model (a Python sketch of this Viterbi decoding is given below).
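The following is a hedged Python sketch of the Viterbi decoding of Algorithm 3, computed in the log domain to avoid numerical underflow; the toy model and the observation sequence are made up.

```python
# Hedged sketch of the Viterbi decoding of Algorithm 3 (log domain to
# avoid underflow); A, B, pi follow the toy appliance example, and the
# observation sequence is made up.
import numpy as np

def viterbi(obs, A, B, pi):
    N, T = A.shape[0], len(obs)
    logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    delta = np.zeros((T, N))           # best log-score ending in each state
    psi = np.zeros((T, N), dtype=int)  # backpointers
    delta[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA   # delta_{t-1}(i) + log a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t]]
    # Termination and path backtracking
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1][path[t + 1]]
    return path, delta[-1].max()

A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05], [0.05, 0.25, 0.70]])
pi = np.array([0.7, 0.3])
print(viterbi([0, 0, 2, 2, 1], A, B, pi))   # most probable hidden states
```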

Algorithm 2: Forward-backward algorithm: β backward function

Data: Consider the backward variable defined as:
β_t(i) = P(o_{t+1} o_{t+2} ... o_T | q_t = s_i, λ)

/*Initialization*/
for i = 1 to N do
    β_T(i) ← 1
t ← T − 1
/*Induction*/
while t ≥ 1 do
    for i = 1 to N do
        β_t(i) ← Σ_{j=1}^{N} a_{ij} b_j(o_{t+1}) β_{t+1}(j)
    t ← t − 1
/*Termination*/
P(o|λ) = Σ_{i=1}^{N} π_i b_i(o_1) β_1(i)
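For illustration, here is a hedged Python sketch of the scaled forward and backward recursions of Algorithms 1 and 2; the scaling constants anticipate the precision issue discussed in the next subsection, and the toy model and observations are made up.

```python
# Hedged sketch of the scaled forward and backward recursions of
# Algorithms 1 and 2 (the scaling keeps alpha_t within machine range);
# the model and observation sequence are made up.
import numpy as np

def forward_backward(obs, A, B, pi):
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N)); beta = np.zeros((T, N)); c = np.zeros(T)
    # Forward pass with per-step normalization constants c_t
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    # Backward pass reuses the same scaling constants
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    log_likelihood = np.log(c).sum()       # log P(o | lambda)
    return alpha, beta, log_likelihood

A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05], [0.05, 0.25, 0.70]])
pi = np.array([0.7, 0.3])
print(forward_backward([0, 2, 1, 2], A, B, pi)[2])   # log-likelihood
```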

4.2.3. Training problem

There is no known analytical approach for solving for the model parameters that maximize the probability of the observations. Here, one of the best-known algorithms, the Baum-Welch algorithm, is defined by Algorithm 4. For an appropriate training of the model, we need to feed multi-observation sequences to the re-estimation procedure. The final re-estimation formulas for an ergodic, continuous-observation HMM with multi-observation training can be written in this way. This formulation assumes a mixture of Gaussian distributions as the probability density function of the observations. In this process, the forward and backward variables consist of products of a large number of a and b terms that are generally significantly smaller than 1. As t grows, each term therefore falls outside the precision range of any machine, even in double precision. Hence, the only reasonable way to carry out those computations is to incorporate a scaling procedure, as in the training sketch below.
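As an illustration of such training, the following sketch assumes the hmmlearn Python package (whose CategoricalHMM class was named MultinomialHMM in older releases); the paper itself relies on the MATLAB dhmm_em routine, and the symbol sequence below is synthetic.

```python
# Hedged sketch of Baum-Welch training on a discrete symbol sequence,
# assuming the hmmlearn package (CategoricalHMM; called MultinomialHMM
# in older releases). The paper itself uses the MATLAB dhmm_em routine;
# the symbol sequence below is made up.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(2)
obs = rng.integers(0, 3, size=(500, 1))     # discrete symbols 0..2

model = hmm.CategoricalHMM(n_components=2, n_iter=300, tol=1e-4,
                           random_state=0)
model.fit(obs)                              # EM (Baum-Welch) re-estimation
print(model.transmat_)                      # estimated A
print(model.emissionprob_)                  # estimated B
print(model.score(obs))                     # log P(o | lambda), scaled internally
```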
A discrete process taking place in the real world generates what can be viewed as a symbol at each step. For example, a coin toss leads to a series of heads and tails. As time goes by and the process keeps running, we obtain a sequence of symbols generated by the process. Depending on the outcome of the process, these symbols can be discrete (e.g., the result of a coin toss or the number of appliances in a house) or continuous (e.g., humidity). These sequences of observed symbols can be used to build a statistical model that describes the workings of the process and the generation of the symbols over time. This model can then be used to identify or classify new sequences of observations (symbols); the HMM is such a model. When the information gathered from a system is generated in a sequential manner, state transition networks can be used to represent consistent knowledge about its functioning. The network consists of a set of states and transitions between the states. Transitions between states are triggered by events in the domain. The HMM state transition network has an initial state and an accepting (end) state. The network recognizes a sequence of events if these events start at the initial state and end in the accepting one, once all the events have occurred. In the HMM, the states are referred to as hidden because the system we wish to model may have underlying causes that cannot be observed. Therefore, the hidden or non-observable process taking place in the system can be determined from the process that generates a sequence of observable symbols. The transition and emission matrices are estimated with the MATLAB HMM toolbox using the dhmm_em function. For each observation of a given sequence, we apply the probability distribution P(x(N+1)|x), as shown

Algorithm 3: Viterbi algorithm (decoding)

Data: The best score along a single path at time t, which accounts for the first t observations and ends in state i, is defined as:
δ_t(i) = max_{q_1 ... q_{t−1}} P(q_1 q_2 ... q_t = i, o_1 o_2 ... o_t | λ)

/*Initialization*/
for i = 1 to N do
    δ_1(i) ← π_i b_i(o_1)
    ψ_1(i) ← 0
t ← 2
/*Induction*/
while t ≤ T do
    for j = 1 to N do
        δ_t(j) ← max_{1 ≤ i ≤ N} [δ_{t−1}(i) a_{ij}] b_j(o_t)
        ψ_t(j) ← arg max_{1 ≤ i ≤ N} [δ_{t−1}(i) a_{ij}]
    t ← t + 1
/*Termination*/
P* = max_{1 ≤ i ≤ N} [δ_T(i)]
q*_T = arg max_{1 ≤ i ≤ N} [δ_T(i)]
t ← T − 1
/*Path backtracking*/
while t ≥ 1 do
    q*_t ← ψ_{t+1}(q*_{t+1})
    t ← t − 1

in Algorithm 5, in order to predict (or forecast) the next observation, which depends on the previous state and symbol.

5. Feature extraction for classification and regression

The features were obtained from the wireless sensor network installed in the smart home. These features are classified using the SVM and HMM machine learning models.

5.1. SVM model features

The classifier was developed using the LIBSVM library within the JADE framework. The LIBSVM toolbox generates a structure containing the type of SVM, the kernel type, the number of support vectors, the number of classes, the class labels and the offset. It is employed in both the training and testing phases. Several SVM structures were trained on all the possible combinations of signals. A prediction can be obtained from a combination of previously predicted signals.
In our case study, the SVM machine learning is carried out in four steps (LIBSVM functions; a sketch of this pipeline follows the list):

1. The "SVM-Scale" function, which scales the input attributes to a common range before training.
2. The "SVM-Train" function, which trains on the data to obtain a prediction model.
3. The "SVM-Predict" function, which uses the model to predict the classes, or the values in the case of regression.
4. The "SVM-Test" function, which calculates the model performance. In this step, we select data whose class or values (in the case of regression) are known, use the model to predict their class or value, and finally compare the real and predicted classes (or the real and predicted values).
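For illustration only, here is a scikit-learn sketch of this four-step scale/train/predict/test pipeline; the actual system calls LIBSVM from JADE agents, and the calendar attributes and labels below are synthetic.

```python
# Hedged sketch of the four-step scale/train/predict/test pipeline,
# using scikit-learn as a stand-in for the LIBSVM calls made from JADE.
# The calendar attributes and ON/OFF labels below are synthetic.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.integers(0, 60, size=(400, 7)).astype(float)   # hour, minute, ...
y = (X[:, 0] % 24 > 18).astype(int)                    # toy ON/OFF labels
X_train, X_test, y_train, y_test = X[:300], X[300:], y[:300], y[300:]

scaler = MinMaxScaler().fit(X_train)                   # step 1: "SVM-Scale"
clf = SVC(kernel="rbf", C=1.0)
clf.fit(scaler.transform(X_train), y_train)            # step 2: "SVM-Train"
y_pred = clf.predict(scaler.transform(X_test))         # step 3: "SVM-Predict"
print("MSE:", mean_squared_error(y_test, y_pred))      # step 4: "SVM-Test"
```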

Algorithm 4: Baum-Welch

Data: Consider a set of K observation sequences. We need to maximize the product of the probabilities of the individual observations, given the model: P(o|λ) = ∏_{k=1}^{K} P(o^{(k)}|λ) = ∏_{k=1}^{K} P_k

/*Initialization*/
λ = (A_0, B, Π)
Iterate until convergence:
/*Forward-backward variables: expectation step*/
ξ_t(i, j) = α_t(i) a_{ij} b_j(o_{t+1}) β_{t+1}(j) / [ Σ_{i=1}^{N} Σ_{j=1}^{N} α_t(i) a_{ij} b_j(o_{t+1}) β_{t+1}(j) ]
(P(o|λ), α_t(i), β_t(i), γ_t(i)) ← forward-backward(o, λ)
/*Re-estimation: maximization step*/
â_{ij} = Σ_{t=1}^{T−1} ξ_t(i, j) / Σ_{t=1}^{T−1} γ_t(i)
b̂_j(v_k) = Σ_{t=1, o_t=v_k}^{T} γ_t(j) / Σ_{t=1}^{T} γ_t(j)
π̂_i = γ_1(i)
/*Termination: multi-sequence re-estimation*/
â_{ij} = [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T−1} α_t^{(k)}(i) a_{ij} b_j(o_{t+1}^{(k)}) β_{t+1}^{(k)}(j) ] / [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T−1} α_t^{(k)}(i) β_t^{(k)}(i) ]
b̂_j(v_k) = [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1, o_t^{(k)}=v_k}^{T} α_t^{(k)}(j) β_t^{(k)}(j) ] / [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T} α_t^{(k)}(j) β_t^{(k)}(j) ]
π̂_i = (1/K) Σ_{k=1}^{K} γ_1^{(k)}(i)
λ̂ = (Â, B̂, Π̂)
/*Gaussian mixture distribution parameters: probability density function of continuous observations*/
μ_j = [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T} γ_t^{(k)}(j) o_t^{(k)} ] / [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T} γ_t^{(k)}(j) ]
σ_j^2 = [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T} γ_t^{(k)}(j) (o_t^{(k)} − μ_j)^2 ] / [ Σ_{k=1}^{K} (1/P_k) Σ_{t=1}^{T} γ_t^{(k)}(j) ]

We have chosen to use the time and the date as 'Input-Attributes' and, as 'Input-Labels', the class or the regression values for the prediction of power consumption, presence, open/close and the environment variables (temperature, humidity and luminosity).
The SVM model parameters are specified in Table 1.

Algorithm 5: Prediction algorithm

Data: x_i: observation; z_i: hidden state
/*Prediction of the next observation*/
for each candidate next symbol x_{N+1} do
    P(x_{N+1} | x) = (1 / P(x)) Σ_{z_{N+1}} P(x_{N+1} | z_{N+1}) Σ_{z_N} P(z_{N+1} | z_N) α(z_N)
with P(x) = Σ_{z_N} α(z_N).
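The following is a hedged Python sketch of this predictive step, using the unscaled forward variable α to weight the transition and emission probabilities; the toy model is the appliance example assumed earlier.

```python
# Hedged sketch of the next-symbol predictive distribution of Algorithm 5:
# P(x_{N+1} = v | x_{1:N}) = sum_{z_{N+1}} P(v | z_{N+1})
#                            * sum_{z_N} P(z_{N+1} | z_N) alpha(z_N) / sum alpha(z_N).
# A, B, pi reuse the toy appliance model defined earlier.
import numpy as np

def predict_next(obs, A, B, pi):
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:                       # unscaled forward recursion
        alpha = (alpha @ A) * B[:, o]
    state_posterior = alpha / alpha.sum()   # P(z_N | x_{1:N})
    next_state = state_posterior @ A        # P(z_{N+1} | x_{1:N})
    return next_state @ B                   # P(x_{N+1} = v | x_{1:N}) for each v

A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05], [0.05, 0.25, 0.70]])
pi = np.array([0.7, 0.3])
print(predict_next([0, 0, 2, 2], A, B, pi))   # predictive distribution
```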

Table 1. SVM model parameters for each variable (State, Presence, Door-sensor, Temperature, Humidity, Luminosity, Power).

SVM-Train — Inputs: for every variable, the attributes are the hour, minute, day of the month, day of the week, week of the month, month and year; the labels are class '0': OFF / class '1': ON for State, class '0': Absence / class '1': Presence for Presence, class '0': Close / class '1': Open for Door-sensor, and real values for Temperature, Humidity, Luminosity and Power. Outputs: a classification model for State, Presence and Door-sensor, and a regression model for Temperature, Humidity, Luminosity and Power.

SVM-Predict — Inputs: the same calendar attributes (hour, minute, day of the month, day of the week, week of the month, month, year). Outputs: the predicted labels, i.e. the ON/OFF, Absence/Presence and Close/Open classes, or the predicted real values.

SVM-Test — Inputs: the same calendar attributes and the known labels. Outputs: the model performance, measured by the MSE for each variable.

5.2. HMM model features

In our case study, we set up a training set of luminosity, power consumption, humidity, door opening/closing, equipment state and temperature. Table 2 lists, for each variable to predict, the HMM parameters: the number of observations, the number of hidden states, the transition matrix, the observation matrix, the proposed number of iterations, and the iteration at which the algorithm converges.

Table 2. HMM parameters for each variable.

Variable:                      State         Power         Presence      Temperature   Open/close    Luminosity
Number of observations         X             X             X             X             X             X
Number of hidden states        10            75 or 100     100           100           8             75
Transition matrix              10*10         75*75         100*100       100*100       8*8           75*75
Observation matrix             10*X          75*X          100*X         100*X         8*1500        75*X
Number of iterations           300           300           300           300           300           300
Convergence of the algorithm   Iteration 26  Iteration 43  Iteration 36  Iteration 62  Iteration 23  Iteration 66

5.3. Quality and precision criteria

To assess the quality of the predictions obtained in the SVM-Test step as well as with our HMM model, we mainly consider their respective predictive performances, measured by the Mean Squared Error (MSE) and the Root MSE (RMSE). We work with hourly-normalized data and model each hour independently of the others, which is a common choice given the type of data, hence leading to 24 separate hourly models. Indexing the respective MSE criteria of these models by the instant i = 0, ..., 23 to which they are associated, and given their respective observations y_{1,i}, ..., y_{N,i}, these models return 24 t-day-ahead predictions, defined as the expectations of the predictive distributions, i.e., for i = 0, ..., 23:

MSE = \frac{1}{N} \sum_{i=1}^{N} (\hat{y}_i - y_i)^2   (14)

RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (\hat{y}_i - y_i)^2}   (15)
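A minimal illustration of Eqs. (14) and (15) for a single hourly model, with made-up real and predicted values:

```python
# Minimal illustration of Eqs. (14)-(15) for one hourly model;
# the real/predicted values below are made up.
import numpy as np

y_true = np.array([21.0, 20.5, 19.8, 20.1])
y_pred = np.array([20.6, 20.9, 19.5, 20.4])

mse = np.mean((y_pred - y_true) ** 2)
rmse = np.sqrt(mse)
print(mse, rmse)
```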

In our case study, we managed to set up a training set of three months in 2015 so as to predict the values of the different variables for June 2016.

6. Experiments and Results

We constructed the SVM and HMM learning schemes to predict the environment variables and power consumption in a “Smart Home”. We used the Smart Life dataset to validate the performance of our proposed architectures. The Smart Life dataset is an independent dataset collected over one year from a couple's apartment, using “Smart Life” data gathered from different sensors. The sensors were installed in everyday objects such as the refrigerator, light switches, etc.
We thoroughly analyzed a full year of historical data and used it to configure a large number of input and output datasets for machine learning. The SVM prediction model can simultaneously predict the future values of the time series data. Regarding our proposed MAS based on the HMM modeling process, the issues considered include the selection of the emission density function, the update rule and the characteristics of the selected time series. Each state of the HMM is associated with a probability density function with different parameters, which gives each state different prediction preferences: some states are more sensitive to data of low volatility and others are more accurate on highly volatile segments. Using the MSE and RMSE functions, we investigate the performance of our proposed MAS based on the HMM/SVM predictors. Throughout the experiments, the proposed HMM-based prediction model shows the highest prediction accuracy compared to the SVM model, as shown in Table 3. As a result, these predicted data can be effectively exploited by the proposed MAS modeling for behavior recognition and the energy management system in the “Smart Home”.
Figures 7, 9, 11 and 15 show that, for a long-term (monthly) forecast, the prediction is more accurate for time series with low-intensity harmonic variations, such as the temperature variable (RMSE = 1.122), than for those with high-intensity harmonic variations, such as the humidity (RMSE = 3.334) and the luminosity (RMSE = 3.181). The performance of the HMM model for the classification of the air conditioner operating status is perfect, with an RMSE value of 0, describing a perfect conformity of the predicted equipment state values to the real ones, whereas the SVM accuracy is about RMSE = 3.02. For the power variable, the HMM prediction accuracy (RMSE = 4.594) is better than that of the SVM model (RMSE = 6.243). These simulations confirm that our proposed HMM-based MAS outperforms the SVM model and that the results are very close to the real values.
Furthermore, for a short-term (hourly) forecast, the RMSE of our proposed approach is lower than for the long-term forecast in the steady states, and higher for transient states and harmonics. Figures 6, 8, 10, 14, 12 and 13 show that the HMM model outperforms the SVM model for the hourly prediction of the air-conditioner state and power consumption, the weather variables and the presence variable.
All the simulations confirm the superior performance of our MAS based on the HMM predictor compared with the SVM model. The HMM is a good modeling tool describing a natural model structure based on a double stochastic process, which can provide an output distribution model for sequences with temporal constraints, spatial variability along the sequence, and real-world complex processes. The HMM relies on the efficient evaluation, decoding and training algorithms described previously.

Table 3. Predictive accuracy (MSE, RMSE) of the proposed HMM model compared with the SVM model, used to predict the air-conditioner equipment state and power consumption, the multi-sensor variables (presence, temperature, luminosity and humidity) and the door sensor (Open/Close).

              State           Presence        Open/Close      Temperature     Humidity        Luminosity      Power
              MSE     RMSE    MSE     RMSE    MSE     RMSE    MSE     RMSE    MSE     RMSE    MSE     RMSE    MSE         RMSE
SVM  Hourly   8.98    2.996   0.625   0.790   9.396   3.065   9.977   3.158   11.926  3.453   6.533   2.556   358801.625  599
     Monthly  9.112   3.02    0.482   0.694   9.487   3.080   19.903  4.461   97.927  9.895   27.280  5.223   38.984      6.243
HMM  Hourly   0       0       0       0       0       0       1.022   1.011   4.045   2.011   44.205  6.648   150017.333  387.320
     Monthly  0       0       0       0       0.177   0.420   1.493   1.221   11.115  3.334   10.120  3.181   21.105      4.594

Figure 6. Comparing the HMM model with the SVM model for hourly temperature prediction. Figure 7. Comparing the HMM model with the SVM model for monthly temperature prediction.
Figure 8. Comparing the HMM model with the SVM model for hourly humidity prediction. Figure 9. Comparing the HMM model with the SVM model for monthly humidity prediction.
Figure 10. Comparing the HMM model with the SVM model for hourly luminosity prediction. Figure 11. Comparing the HMM model with the SVM model for monthly luminosity prediction.

Figure 12. Comparing the HMM model with the SVM model for presence prediction. Figure 13. Comparing the HMM model with the SVM model for equipment state prediction.
Figure 14. Comparing the HMM model with the SVM model for hourly power consumption prediction. Figure 15. Comparing the HMM model with the SVM model for daily power consumption prediction.

7. Conclusion

To carry out this work, we started by introducing multi-agent systems and their applications in order to design our solution for smart home automation. This introduction was complemented by a detailed study of artificial intelligence techniques, focusing on HMM machine learning. In this paper, we proposed a new active learning algorithm to predict and classify the features of a WSN installed in a smart home. The SVM is one of the strategies adopted to solve the problem of prediction and multi-class classification for home automation. We proposed a MAS based on HMM machine learning to automate the air conditioning process in a home as a case study. Indeed, we used the HMM model to predict the ambient temperature as well as the state and power value of the air conditioner at a specific timescale. These recorded features are useful for decision making and the automatic control of the air conditioner. The HMM predictor can determine the temperature set point belonging to a predefined class. Experimental results on the real-world "Smart Life" application demonstrate the effectiveness of our proposed MAS based on the HMM model, which outperforms the SVM model. However, the combination of different criteria raises the problem of the uncertainty and diversity of active learning based on the HMM model. In the future, we will explore better ways to measure uncertainty and diversity and examine advanced combination approaches, such as HMM-LSTM deep learning, for home automation, energy prediction and management, and activity recognition. We can also extend this project by using more accurate intelligent sensors. In addition, we can optimize our solution using a consistent big data learning base for real-time prediction of sequential data.

Acknowledgment

This work is part of a PhD thesis supported by the “PASRI-MOBIDOC” project financed by the EU, the “Chifco” company, the “Innov’COM” laboratory, and the National School of Engineering of Tunis (ENIT).

References

444 1. Hurtado, L.A.; Mocanu, E.; Nguyen, P.H.; Gibescu, M.; Kamphuis, R.I.G. Enabling Cooperative Behavior
445 for Building Demand Response Based on Extended Joint Action Learning. IEEE Transactions on Industrial
446 Informatics 2018, 14, 127–136. doi:10.1109/TII.2017.2753408.
447 2. Boucha, D.; Amiri, A.; Chogueur, D. Controlling electronic devices remotely by voice and brain
448 waves. Proc. Int. Conf. Mathematics and Information Technology (ICMIT), 2017, pp. 38–42.
449 doi:10.1109/MATHIT.2017.8259693.
450 3. Chahuara, P.; Fleury, A.; Portet, F.; Vacher, M. On-line human activity recognition from audio and home
451 automation sensors: Comparison of sequential and non-sequential models in realistic Smart Homes1.
452 Journal of Ambient Intelligence and Smart Environments 2016, 8, 399–422.
453 4. Lin, Y.W.; Lin, Y.B.; Hsiao, C.Y.; Wang, Y.Y. IoTtalk-RC: Sensors As Universal Remote Control for
454 Aftermarket Home Appliances. IEEE Internet of Things Journal 2017, 4, 1104–1112.
455 5. Feng, S.; Setoodeh, P.; Haykin, S. Smart Home: Cognitive Interactive People-Centric Internet of Things.
456 IEEE Communications Magazine 2017, 55, 34–39.
457 6. Viswanath, S.K.; Yuen, C.; Tushar, W.; Li, W.T.; Wen, C.K.; Hu, K.; Chen, C.; Liu, X. System design of the
458 internet of things for residential smart grid. IEEE Wireless Communications 2016, 23, 90–98.
459 7. Ruta, M.; Scioscia, F.; Loseto, G.; Sciascio, E.D. A Semantic-Enabled Social Network of Devices for Building
460 Automation. IEEE Transactions on Industrial Informatics 2017, 13, 3379–3388. doi:10.1109/TII.2017.2697907.
461 8. Papatsimpa, C.; Linnartz, J.P.M.G. Energy efficient communication in smart building WSN running
462 distributed Hidden Markov chain presence detection algorithm. Proc. IEEE 4th World Forum Internet of
463 Things (WF-IoT), 2018, pp. 112–117. doi:10.1109/WF-IoT.2018.8355221.
464 9. Song, H.; Srinivasan, R.; Sookoor, T.; Jeschke, S. Building Cyber-Physical Systems – A Smart Building Use
465 Case. In Smart Cities: Foundations, Principles, and Applications; Song, H.; Srinivasan, R.; Sookoor, T.;
466 Jeschke, S., Eds.; John Wiley & Sons, 2017; pp. 606–638.
467 doi:10.1002/9781119226444.ch21.
468 10. Elchamaa, R.; Dafflon, B.; Ouzrout, Y.; Gechter, F. Agent based monitoring for smart cities: Application
469 to traffic lights. Proc. 10th Int. Conf. Software, Knowledge, Information Management and Applications
470 (SKIMA), 2016, pp. 292–297. doi:10.1109/SKIMA.2016.7916235.
471 11. Gazafroudi, A.S.; Pinto, T.; Prieto-Castrillo, F.; Prieto, J.; Corchado, J.M.; Jozi, A.; Vale, Z.; Venayagamoorthy,
472 G.K. Organization-based multi-agent structure of the smart home electricity system. Proc. IEEE Congress
473 on Evolutionary Computation (CEC), 2017, pp. 1327–1334.
474 12. Lu, J.; Li, L.; Shen, D.; Chen, G.; Jia, B.; Blasch, E.; Pham, K. Dynamic multi-arm bandit game based
475 multi-agents spectrum sharing strategy design. Proc. IEEE/AIAA 36th Digital Avionics Systems Conf.
476 (DASC), 2017, pp. 1–6. doi:10.1109/DASC.2017.8102137.
477 13. Al-Haj, A.; Al-Tameemi, M.A. Providing security for NFC-based payment systems using a management
478 authentication server. Proc. 4th Int. Conf. Information Management (ICIM), 2018, pp. 184–187.
479 doi:10.1109/INFOMAN.2018.8392832.
480 14. Zhang, Q.; Zhang, J.F. Adaptive Tracking Games for Coupled Stochastic Linear Multi-Agent Systems:
481 Stability, Optimality and Robustness. IEEE Transactions on Automatic Control 2013, 58, 2862–2877.
482 doi:10.1109/TAC.2013.2270869.
483 15. Patil, R.N.; Kadam, A.D.; Vanjale, S.B.; Thakore, D.; Joshi, S. A semi-supervised research approach to
484 web-Image Re-ranking: Semantic image search engine. Proc. Int. Conf. Electrical, Electronics, and
485 Optimization Techniques (ICEEOT), 2016, pp. 1883–1891. doi:10.1109/ICEEOT.2016.7755015.
486 16. Mokni, M.; Hajlaoui, J.E.; Brahmi, Z. MAS-Based Approach for Scheduling Intensive Workflows in Cloud
487 Computing. Proc. IEEE 27th Int. Conf. Enabling Technologies: Infrastructure for Collaborative Enterprises
488 (WETICE), 2018, pp. 15–20. doi:10.1109/WETICE.2018.00010.
489 17. Huang, V.K.L.; Bruckner, D.; Chen, C.J.; Leitão, P.; Monte, G.; Strasser, T.I.; Tsang, K.F. Past, present and
490 future trends in industrial electronics standardization. Proc. IECON 2017 - 43rd Annual Conf. of the IEEE
491 Industrial Electronics Society, 2017, pp. 6171–6178. doi:10.1109/IECON.2017.8217072.
492 18. Amin, R.; Islam, S.K.H.; Biswas, G.P.; Khan, M.K. An efficient remote mutual authentication scheme using
493 smart mobile phone over insecure networks. Proc. Int. Conf. Cyber Situational Awareness, Data Analytics
494 and Assessment (CyberSA), 2015, pp. 1–7. doi:10.1109/CyberSA.2015.7166114.

495 19. Rizk, Y.; Awad, M.; Tunstel, E.W. Decision Making in Multiagent Systems: A Survey. IEEE Transactions on
496 Cognitive and Developmental Systems 2018, 10, 514–529. doi:10.1109/TCDS.2018.2840971.
497 20. Leitão, P.; Rodrigues, N.; Turrin, C.; Pagani, A. Multiagent System Integrating Process and Quality Control
498 in a Factory Producing Laundry Washing Machines. IEEE Transactions on Industrial Informatics 2015,
499 11, 879–886. doi:10.1109/TII.2015.2431232.
500 21. Villarrubia, G.; Paz, J.F.D.; Iglesia, D.H.; Bajo, J. Combining multi-agent systems and wireless sensor
501 networks for monitoring crop irrigation. Sensors 2017, 17, 1775.
502 22. Kaur, A.; Cheema, S.S. A hybrid multi-agent based particle swarm optimization for telemedicine system
503 for neurological disease. Proc. Int. Conf. Recent Advances and Innovations in Engineering (ICRAIE), 2016,
504 pp. 1–6. doi:10.1109/ICRAIE.2016.7939527.
505 23. Jemaa, A.B.; Ltifi, H.; Ayed, M.B. Visual intelligent remote healthcare monitoring system using multi-agent
506 technology. Proc. 2nd Int. Conf. Advanced Technologies for Signal and Image Processing (ATSIP), 2016,
507 pp. 465–471. doi:10.1109/ATSIP.2016.7523118.
508 24. Vyatkin, V.; Zhabelova, G.; Higgins, N.; Ulieru, M.; Schwarz, K.; Nair, N.K.C. Standards-enabled Smart
509 Grid for the future Energy Web. Proc. Innovative Smart Grid Technologies (ISGT), 2010, pp. 1–9.
510 doi:10.1109/ISGT.2010.5434776.
511 25. Gazafroudi, A.S.; Pinto, T.; Prieto-Castrillo, F.; Corchado, J.M.; Abrishambaf, O.; Jozi, A.; Vale, Z. Energy
512 flexibility assessment of a multi agent-based smart home energy system. Proc. IEEE 17th Int. Conf.
513 Ubiquitous Wireless Broadband (ICUWB), 2017, pp. 1–7. doi:10.1109/ICUWB.2017.8251008.
514 26. Leitão, P.; Colombo, A.W.; Karnouskos, S. Industrial automation based on cyber-physical systems
515 technologies: Prototype implementations and challenges. Computers in Industry 2016, 81, 11–25.
516 doi:10.1016/j.compind.2015.08.004.
517 27. Robitzky, L.; Müller, S.C.; Dalhues, S.; Häger, U.; Rehtanz, C. Agent-based redispatch for real-time overload
518 relief in electrical transmission systems. Proc. IEEE Power Energy Society General Meeting, 2015, pp. 1–5.
519 doi:10.1109/PESGM.2015.7285886.
520 28. Lakshminarayanan, V.; Chemudupati, V.G.S.; Pramanick, S.; Rajashekara, K. Real-time Optimal Energy
521 Management Controller for Electric Vehicle Integration in Workplace Microgrid. IEEE Transactions on
522 Transportation Electrification 2018, p. 1. doi:10.1109/TTE.2018.2869469.
523 29. Conte, G.; Morganti, G.; Perdon, A.M.; Scaradozzi, D. Multi-Agent System Theory for Resource
524 Management in Home Automation Systems. Journal of Physical Agents 2009, 3.
525 30. Asare-Bediako, B.; Kling, W.L.; Ribeiro, P.F. Multi-agent system architecture for smart home
526 energy management and optimization. Proc. IEEE PES ISGT Europe 2013, 2013, pp. 1–5.
527 doi:10.1109/ISGTEurope.2013.6695331.
528 31. Asplund, M.; Nadjm-Tehrani, S. Attitudes and perceptions of IoT security in critical societal services. IEEE
529 Access 2016, 4, 2130–2138.
530 32. Zouai, M.; Kazar, O.; Haba, B.; Saouli, H. Smart house simulation based multi-agent system and internet
531 of things. Proc. Int. Conf. Mathematics and Information Technology (ICMIT), 2017, pp. 201–203.
532 doi:10.1109/MATHIT.2017.8259717.
533 33. Wu, C.; Chen, Y.; Chin, J.; Chen, S. The developments of communication agent and home appliance
534 automation agent based on Intel ACAT. Proc. IEEE Int. Conf. Applied System Invention (ICASI), 2018, pp.
535 465–467. doi:10.1109/ICASI.2018.8394286.
536 34. Saba, D.; Degha, H.E.; Berbaoui, B.; Laallam, F.Z.; Maouedj, R. Contribution to the modeling and simulation
537 of multiagent systems for energy saving in the habitat. Proc. Int. Conf. Mathematics and Information
538 Technology (ICMIT), 2017, pp. 204–208. doi:10.1109/MATHIT.2017.8259718.
539 35. Feron, B.; Monti, A. Integration of space heating demand flexibility in a home energy management system
540 using a market-based multi agent system. Proc. IEEE Power Energy Society General Meeting, 2017, pp.
541 1–5. doi:10.1109/PESGM.2017.8273810.
542 36. Li, W.; Logenthiran, T.; Phan, V.T.; Woo, W.L. Intelligent housing development building management
543 system (HDBMS) for optimized electricity bills. Proc. 2017 IEEE Int. Conf. Environment and Electrical
544 Engineering and 2017 IEEE Industrial and Commercial Power Systems Europe (EEEIC/I&CPS Europe),
545 2017, pp. 1–6.
546 37. Li, X.; Zheng, S. Application of universal control platform in intelligent building. Proc. 3rd Int. Conf.
547 Intelligent Green Building and Smart Grid (IGBSG), 2018, pp. 1–4. doi:10.1109/IGBSG.2018.8393565.

548 38. Javed, A.; Larijani, H.; Ahmadinia, A.; Gibson, D. Smart random neural network controller for HVAC
549 using cloud computing technology. IEEE Transactions on Industrial Informatics 2017, 13, 351–360.
550 39. Ruta, M.; Scioscia, F.; Loseto, G.; Di Sciascio, E. Semantic-based resource discovery and orchestration in
551 home and building automation: A multi-agent approach. IEEE Transactions on Industrial Informatics 2014,
552 10, 730–741.
553 40. Karami, A.B.; Fleury, A.; Boonaert, J.; Lecoeuche, S. User in the Loop: Adaptive Smart Homes Exploiting
554 User Feedback—State of the Art and Future Directions. Information 2016, 7, 35.
555 41. Jose, A.C.; Malekian, R.; Ye, N. Improving Home Automation Security; Integrating Device Fingerprinting
556 Into Smart Home. IEEE Access 2016, 4, 5776–5787.
557 42. Li, T.S.; Kao, M.; Kuo, P. Recognition System for Home-Service-Related Sign Language Using
558 Entropy-Based K-Means Algorithm and ABC-Based HMM. IEEE Trans. Systems, Man, and Cybernetics:
559 Systems 2016, 46, 150–162. doi:10.1109/TSMC.2015.2435702.
560 43. Hadian, H.; Sameti, H.; Povey, D.; Khudanpur, S. Flat-Start Single-Stage Discriminatively Trained
561 HMM-Based Models for ASR. IEEE/ACM Transactions on Audio, Speech, and Language Processing 2018,
562 26, 1949–1961. doi:10.1109/TASLP.2018.2848701.
563 44. Maqqor, A.; Halli, A.; Satori, K.; Tairi, H. Using HMM Toolkit (HTK) for recognition of Arabic manuscripts
564 characters. Proc. Int. Conf. Multimedia Computing and Systems (ICMCS), 2014, pp. 475–479.
565 doi:10.1109/ICMCS.2014.6911316.
566 45. Jose, S.; Nair, P.; Biju, V.G.; Mathew, B.B.; Prashanth, C.M. Hidden Markov Model: Application towards
567 genomic analysis. Proc. Int. Conf. Circuit, Power and Computing Technologies (ICCPCT), 2016, pp. 1–7.
568 doi:10.1109/ICCPCT.2016.7530222.
569 46. Lyons, J.; Dehzangi, A.; Heffernan, R.; Yang, Y.; Zhou, Y.; Sharma, A.; Paliwal, K. Advancing the
570 Accuracy of Protein Fold Recognition by Utilizing Profiles From Hidden Markov Models. IEEE Transactions
571 on NanoBioscience 2015, 14, 761–772. doi:10.1109/TNB.2015.2457906.
572 47. Farshchian, M.; Jahan, M.V. Stock market prediction with Hidden Markov Model. Proc. Int. Congress
573 on Technology, Communication and Knowledge (ICTCK), 2015, pp. 473–477.
574 doi:10.1109/ICTCK.2015.7582714.
575 48. Tokuda, K.; Nankaku, Y.; Toda, T.; Zen, H.; Yamagishi, J.; Oura, K. Speech Synthesis Based on Hidden
576 Markov Models. Proceedings of the IEEE 2013, 101, 1234–1252. doi:10.1109/JPROC.2013.2251852.
577 49. Mocanu, E.; Nguyen, P.H.; Gibescu, M.; Kling, W.L. Comparison of machine learning methods for
578 estimating energy consumption in buildings. Proc. Int. Conf. Probabilistic Methods Applied to Power
579 Systems (PMAPS), 2014, pp. 1–6. doi:10.1109/PMAPS.2014.6960635.
580 50. Siddiqui, U.; et al. Elastic JADE: Dynamically Scalable Multi Agents Using Cloud Resources. Proc. Second
581 Int. Conf. Cloud and Green Computing (CGC), 2012, pp. 167–172. doi:10.1109/CGC.2012.60.
582 51. Dorri, A.; Kanhere, S.S.; Jurdak, R. Multi-Agent Systems: A Survey. IEEE Access 2018, 6, 28573–28593.
583 doi:10.1109/ACCESS.2018.2831228.
584 52. Fleury, A.; Vacher, M.; Noury, N. SVM-based multimodal classification of activities of daily living in health
585 smart homes: sensors, algorithms, and first experimental results. IEEE Transactions on Information
586 Technology in Biomedicine 2010, 14, 274–283.

587 © 2018 by the authors. Submitted to Sensors for possible open access publication under the terms and conditions
588 of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
