
Fire Technology

© 2020 Springer Science+Business Media, LLC, part of Springer Nature


Manufactured in The United States
https://doi.org/10.1007/s10694-020-00964-4

Intelligent Fire Monitor for Fire Robot


Based on Infrared Image Feedback Control

Jinsong Zhu, Wei Li*, Da Lin, Hengyu Cheng and Ge Zhao, School of
Mechatronic Engineering, China University of Mining and Technology,
Xuzhou 221116, People’s Republic of China

Received: 5 November 2019/Accepted: 1 February 2020

Abstract. An infrared image-based feedback control system for an intelligent fire
monitor is proposed in this paper. Its aims are to realize automatic aiming at the fire
site and continuous fire tracking during the fire-extinguishing process by adjusting
the yaw angle of the fire monitor. Firstly, an infrared camera is used to capture
images of the fire site; at high temperatures, the fire is easily observed. An improved
adaptive image threshold segmentation scheme is developed for image background
segmentation and fire acquisition. The fire region in the captured image is taken as
the region of interest (ROI), and its geometric center is calculated. Owing to the
advantageous structural design of the proposed system, the geometric center of the
ROI, representing the actual fire, is taken as the direction at which the fire monitor
should aim. Furthermore, a fuzzy control scheme simulating the operating mode of
firefighters is proposed, with the ROI center position deviation and deviation rate as
inputs of the control system. Experimental results show that, after the fire is detected
by the system, the fire monitor can be yawed to the direction of the fire within
2000 ms, with an image position deviation of less than 1.6 pixels. Similarly, the
proposed system adapts well to changes in fire position caused by combustion: after
no more than three frames of image iteration, the controller realizes aiming of the
fire monitor at the fire, with a final deviation of less than 0.1 pixels. In general, the
proposed infrared image feedback control system for horizontal fire aiming of an
intelligent fire monitor has great potential for intelligent fire extinguishing by fire-
fighting robots.

Keywords: Infrared image, Adaptive threshold segmentation, Servo control, Fire monitor, Fire robots

1. Introduction
Firefighting operations are dangerous, especially when the fire site is full of smoke
or other toxic gases [1]. The fire accident in Tianjin, China, in August 2015, resulted
in the deaths of 99 firefighters [2]. The accident investigation showed that the
explosion of a dangerous-goods warehouse was the main cause of the large number
of deaths during the fire-extinguishing process. Fire robots have long been consid-
ered as an alternative to firefighters entering the fireground to perform fire-fighting

* Correspondence should be addressed to: Wei Li, E-mail: liwei_cmee@163.com


operations. In recent years, this domain has attracted more and more attention. In
China, national-level scientific research projects pay particular attention to
firefighting and rescue equipment. Intelligent fire robots usually need to be able to
detect and locate the fire after approaching it, and to spray water accurately onto
the fire site. Furthermore, the fire robot is required to track the fire trend and
adjust the spray angle of the fire monitor in a timely manner according to changes
in the fire position during the fire-extinguishing process. Nowadays, equipment with
a similar work process is widely used in the fire protection of large-space
buildings, where it is called an automatic fire suppression system. Typically, such a
system is installed in the ceiling of a building, and fire protection for the large
space is achieved by a plurality of such devices. However, after the fire robot
enters the fire region, the action of the fire monitor still relies on remote operation
by a firefighter during the fire-fighting process. Figure 1 shows a wheel-type
fire-fighting robot manufactured by a Chinese company, on which infrared and
vision cameras are mounted. The fire image acquired by the camera is merely used
to help the firefighter assess the fire situation, while the action of the fire monitor
still depends on manual operation. Usually, manual remote control cannot achieve
fast fire-fighting operations, and it places high demands on firefighters' professional
skills.
Many types of fire robots have been proposed and used in indoor or outdoor
fire environments. Kuo et al. [3] designed a fire-fighting robot with one extin-
guisher and three flame sensors for intelligent buildings, and an adaptive fusion
method was proposed for fire detection by the robot. Kim et al. [4] proposed a
fire detection algorithm for a smart fire robot based on a long-wave infrared
camera, an ultraviolet radiation sensor, and laser radar. Experimental results
show that the robot could immediately calculate its path towards the fire and
return to its original starting place when the fire is out. Rangan [5] developed a
fire-fighting robot which uses a modular design concept, including fire detection,

Figure 1. A wheel-type fire-fighting robot during work.



path directing and extinguishing operations. Computer vision based fire detection
algorithm was used to extract the non-static characteristics of fire, and tempera-
ture and UV sensors were used to confirm the presence of fire along with depth
mapping.
Nowadays, firefighting robots are primarily used to assist firefighters in per-
forming fire positioning or fire-fighting. Many intelligent fire robots and
fire-fighting solutions based on multi-sensor fusion have been proposed, but most
remain at the conceptual stage [6–8]. For instance, China CITIC Group [9] designed a
small crawler fire robot which could accurately detect fire temperature and dan-
gerous objects through temperature and vision sensors. Prabha et al. [10] pro-
posed a dual-robot cooperative fire-fighting scheme, in which sensor information
is shared among fire robots, and the fire-fighting operation is performed by the
robot closest to the fire. Gao et al. [11] developed a fire robot based on vision and
infrared sensors, which carries the water nozzle through the mechanical arm, and
the detection, tracking and extinguishing of fire are realized by the control system.
A fire-fighting robot based on vision and ultrasonic sensors was proposed by Sire-
gar et al. [12], the vision sensor was used for fire field detection, and the ultrasonic
sensor was used to avoid collisions between the robot and obstacles. Moreover,
other fire-fighting robot projects that have entered the research phase have been
proposed, based on sensing schemes including infrared stereo vision, laser radar,
monocular vision, and so on.
Joshua [13] uses a pair of infrared cameras to detect and locate the fire, and
designs a feedback controller to adjust the yaw and pitch angles of the nozzle.
Experimental results show that the scheme is ideal for fire extinguishing in close
and small scenes. Furthermore, Joshua [14, 15] combines infrared stereo vision
with ultraviolet sensor for fire extinguishing of automatic fire suppression system
under low visibility, and the servo control of the nozzle is realized through seg-
mentation and positioning of fire targets. Similarly, Kim [16, 17] incorporated laser
radar into the infrared stereo vision. The infrared stereo vision mainly provides
three-dimensional information of fire while laser radar provides a more accurate
distance of objects in the view field. Research shows that this scheme can realize
the positioning and tracking of fire targets based on scene matching algorithm,
and has good adaptability to both smoked and smokeless scenes. Similarly,
Starr [18] applied the fusion data of 3D laser and long-wave infrared stereo vision
system in ranging operations under fire environment. It was found that the fused
data sensors had satisfactory measurement results under different concentrations
of smoke, and it could provide more accurate location service for fire-fighting
robots to enter the fire environment. In general, intelligent operation of a fire-
fighting robot can be divided into several processes: fire detection, fire positioning,
fire aiming, and tracking fire extinguishing.
The focus of this paper is to realize the fire monitor's horizontal aiming and contin-
uous tracking of the fire site based on infrared images during the fire-fighting
process. The main contributions are as follows. Firstly, the horizontal geometric center position
of the infrared image represents the jet direction of the jet trajectory based on the
unique system structure. Then, the improved maximum entropy image segmenta-
tion algorithm achieves accurate extraction of fire ROI, and the comparison with
other segmentation methods further proves the superiority of the proposed
method. Finally, the improved fuzzy control rule simulates the manual operating
style of a professional firefighter, which is closer to the actual fire-suppression process.
The remainder of the paper is organized as follows: Sect. 2 introduces the hard-
ware structure of the intelligent fire monitor system and explains the basic flow of
the control scheme. Section 3 describes the improved adaptive threshold segmenta-
tion method for extracting fire regions from infrared images and presents some
segmentation results. The fuzzy controller and its simulation results are given in
Sect. 4. Section 5 describes the experiment and analyzes the experimental results.
Finally, conclusions are given.

2. System Architecture
The infrared image vision system for intelligent fire monitor proposed in this
paper includes a MAGNITY14min long-wave infrared camera, an electric fire
monitor, and a visual servo controller.

2.1. System Development


The vision system proposed in this paper can achieve horizontal aiming and track-
ing of fire during the fire extinguishing process. The hardware and structure of the
system are shown in Fig. 2. An infrared camera is placed above the fire monitor
nozzle, and an industrial camera is placed on the right side of the infrared camera
in the horizontal direction. In particular, the industrial camera is used in the intelligent

Figure 2. The hardware and structure of the vision system (dotted box is beyond
the scope of this paper).

fire monitor to detect the jet trajectory so as to adjust the range of falling points;
it is therefore beyond the scope of this article (dotted boxes in Figs. 2 and 3,
respectively). The optical axis direction of the infrared camera is consistent with

Figure 3. Workflow of the proposed infrared vision servo system (e and ec are
the deviation and deviation rate, respectively; the dotted box is beyond the scope
of this paper).

the fire monitor's head in the intelligent fire monitor system we designed. Based on
this special structure design, the horizontal geometric center position of the infra-
red image represents the jet direction of the jet trajectory. Therefore, in order to
aim the jet trajectory at the fire, it is only necessary that the fire location in the IR
image coincides with the image horizontal geometric center. The above two cam-
eras are mounted on the fire monitor through a specific bracket. The yaw and
pitch motors included in the fire monitor are used to adjust the yaw and pitch
angles of the nozzle, respectively. In this paper, the infrared vision servo system
was designed to control the yaw motor to achieve the horizontal aiming of the fire
by the fire monitor. In particular, the mounting point of the bracket is between the
yaw and pitch motors of the fire monitor, so the bracket and the two cameras
follow the horizontal rotation of the fire monitor but not its pitch rotation.

2.2. Control Overview


The workflow of the proposed infrared vision servo system is shown in Fig. 3.
The infrared camera is used to detect the fire source in its field of view when the
intelligent fire monitor starts working. Based on previous research results, it is
easy to realize segmentation and primary identification of smoke and fire at high
temperature in an infrared image. Generally, when a fire robot is used to perform
a fire-fighting operation, smoke or fire is usually present.
At the beginning of an actual fire-fighting process, the fire may not be in the
image. The fire monitor then needs to perform blind-zone-free yaw angle adjust-
ments to find the fire. Once the fire appears in the field of view of the infrared
image, the fire monitor immediately begins to spray water. In most cases, the
nozzle of the fire monitor is not aimed at the center of the fire at the beginning of
the operation. According to the features of the proposed visual servo system, the jet trajec-
tory of the fire monitor nozzle coincides with the vertical center line of the
infrared image. Therefore, with this innovative structural design, we can directly
adjust the yaw angle of the fire monitor to achieve the coincidence of the fire cen-
ter position and the vertical center line of the captured infrared image, so as to
achieve the nozzle jet trajectory's aiming at the fire. Furthermore, the improved
adaptive threshold segmentation algorithm helps us to better realize the segmenta-
tion and center position extraction of the fire in the infrared image. In the pro-
posed visual servo system, the deviation adjustment of center point abscissa of fire
in infrared image and infrared image center point abscissa become the focus of
the control algorithm. This part will be elaborated in the Sect. 4. With the help of
the proposed infrared visual servo system, it is possible to achieve continuous aim-
ing of the fire during the entire fire extinguishing process until the fire is extin-
guished.
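The aiming principle above — drive the fire ROI's horizontal center onto the image's vertical centerline — can be sketched as a simple closed loop. This is a minimal illustration under assumptions, not the authors' implementation: the plant model (how much one yaw action shifts the apparent fire position) and the gain are hypothetical, and the segmentation step is abstracted away.

```python
IMAGE_WIDTH = 160  # horizontal resolution of the selected infrared camera


def horizontal_deviation(fire_center_x: float) -> float:
    """Deviation between the fire ROI center abscissa and the image's
    vertical centerline; zero means the nozzle jet trajectory is aimed."""
    return fire_center_x - IMAGE_WIDTH / 2


def aiming_loop_demo(initial_fire_x: float, steps: int = 10) -> float:
    """Simulated servo loop: each yaw action removes a fixed fraction of
    the current deviation (a hypothetical plant model, not the paper's)."""
    GAIN = 0.6  # assumed fraction of deviation corrected per control cycle
    fire_x = initial_fire_x
    for _ in range(steps):
        e = horizontal_deviation(fire_x)
        # Yawing the monitor shifts the fire's apparent position toward the
        # centerline, because the camera rotates together with the monitor.
        fire_x -= GAIN * e
    return horizontal_deviation(fire_x)
```

For the fire center abscissa of 63 pixels reported later for Fig. 4f, `horizontal_deviation(63)` gives −17 pixels, matching the 17-pixel initial deviation the paper reports (up to sign convention).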

3. Fire Identification and Location


The selected infrared camera is the MAGNITY14min (8–14 μm), which outputs a
160 × 120 pixel grayscale image at a maximum acquisition frame rate of 25 Hz
over a fire temperature range of 20°C to 500°C. Its small size of
48 × 45 × 43 mm and light weight of 120 g satisfy the fire robot's requirements
on camera size and mass. As we know, the fire temper-
ature is usually higher than the ambient temperature and therefore typically
appears as a higher grayscale region in the infrared image. However, the fire envi-
ronment is usually complicated, and there are many high temperature areas in the
fire environment due to the existence of heat radiation and heat conduction. These
high temperature regions also typically exhibit higher brightness in infrared ima-
ges. However, the existence of these highlight areas reduces the Signal-to-Noise
Ratio of the infrared image, which makes it difficult to extract the fire ROI
through traditional threshold segmentation method, and even leads to erroneous
segmentation. In our work, the improved maximum entropy adaptive threshold
segmentation method was applied to the fire segmentation in infrared images.
Due to the interference of the fire environment, in addition to the highlighted
area indicating the fire in the image acquired by the infrared camera, there are
some disturbances caused by those high temperature areas. These disturbances,
which are close to the brightness of the fire area, make image segmentation diffi-
cult. Traditional threshold segmentation methods such as average threshold and
Otsu threshold [19] cannot achieve satisfactory fire segmentation. The maximum
adaptive entropy threshold segmentation method [20] can achieve good target seg-
mentation for images with large differences between the target and background.
However, the one-dimensional maximum entropy segmentation method with a
single pixel gray value still has many errors in the segmentation of complex fire
image, due to the low signal-to-noise ratio of the image caused by other high tem-
perature regions. In this paper, we have improved the maximum entropy thresh-
old segmentation method [21, 22]. The image pixel gray value and its
neighborhood gray mean are simultaneously considered for image segmentation.
For a fire grayscale image, the resolution is W  H and the gray level is L. The
gray value of the image pixel at the coordinates (x, y) is f(x, y), and its average
gray value in the n  n neighborhood is expressed as:

$$g(x, y) = \frac{1}{n^{2}} \sum_{i=-[n/2]}^{[n/2]} \sum_{j=-[n/2]}^{[n/2]} f(x+i,\, y+j) \tag{1}$$

where $[n/2]$ represents the integer part of $n/2$, and $n$ is the size of the neighborhood.
Thus, a new image F(x, y) was constructed with the original image pixel point
gray value f(x, y) and its neighborhood gray level mean g(x, y). A two-dimen-
sional threshold was assumed to be used for fire segmentation of the new image
F(x, y), expressed as:


$$F_{s,t}(x, y) = \begin{cases} b_0, & f(x, y) \le s \ \text{and} \ g(x, y) \le t \\ b_1, & f(x, y) > s \ \text{and} \ g(x, y) > t \end{cases} \tag{2}$$

where $0 \le b_0, s, t, b_1 \le L-1$. For image $f(x, y)$, $P_{i,j}$ represents the probability
that pixel gray level $i$ and neighborhood gray mean $j$ appear together in the
image. Thus, the two-dimensional histogram of the image is represented as:

$$P_{i,j} = \frac{F_{s,t}(i, j)}{W \times H} \tag{3}$$

We know that the gray levels of the pixels in the fire region and the background
region are evenly distributed in the image respectively. Therefore, the gray value
of the pixel and its neighbor gray level mean are very close. Conversely, in the
target and background boundary regions of the image, the gray level of the pixel
and the mean value of the gray of the neighborhood are different. This causes the
two-dimensional histogram of the image to exhibit bimodal characteristics. The
discrete two-dimensional entropy of the image is expressed as:
$$H = -\sum_{i} \sum_{j} P_{i,j} \lg P_{i,j} \tag{4}$$

Furthermore, the entropy sum of the target and background in the image is
expressed as:
$$\phi(s, t) = -\left\{ \sum_{i=0}^{s-1} \sum_{j=0}^{t-1} P_{i,j} \lg P_{i,j} + \sum_{i=s}^{L-1} \sum_{j=t}^{L-1} P_{i,j} \lg P_{i,j} \right\} \tag{5}$$

According to the principle of maximum entropy, the result which satisfies
$\max\{\phi(s, t)\}$ is the optimal segmentation threshold, which can be obtained by
image traversal calculation.
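The procedure of Eqs. (1)–(5) can be sketched in pure Python on a tiny synthetic frame. This is an illustrative implementation, not the authors' code: it normalizes each region's probabilities when computing the two entropies (the standard form of two-dimensional maximum entropy), handles image borders by averaging only in-bounds neighbors, and rounds the neighborhood mean to the nearest gray level — all assumptions, since the paper does not spell these details out.

```python
import math
from collections import Counter


def neighborhood_mean(img, x, y, n=3):
    """g(x, y): mean gray value over the n x n neighborhood (Eq. 1);
    out-of-bounds neighbors are skipped (an assumed border policy)."""
    h, w = len(img), len(img[0])
    half = n // 2
    vals = [img[i][j]
            for i in range(max(0, x - half), min(h, x + half + 1))
            for j in range(max(0, y - half), min(w, y + half + 1))]
    return sum(vals) / len(vals)


def two_d_max_entropy_threshold(img, levels):
    """Search (s, t) maximizing the sum of background and foreground
    entropies of the 2D (pixel gray, neighborhood mean) histogram.
    Brute-force traversal; a 256-level image would need optimization."""
    h, w = len(img), len(img[0])
    pairs = Counter((img[x][y], round(neighborhood_mean(img, x, y)))
                    for x in range(h) for y in range(w))
    P = {k: c / (h * w) for k, c in pairs.items()}

    def region_entropy(cells):
        mass = sum(P[c] for c in cells)
        if mass == 0:
            return 0.0
        return -sum((P[c] / mass) * math.log(P[c] / mass) for c in cells)

    best, best_st = -1.0, (0, 0)
    for s in range(levels):
        for t in range(levels):
            bg = [c for c in P if c[0] <= s and c[1] <= t]
            fg = [c for c in P if c[0] > s and c[1] > t]
            phi = region_entropy(bg) + region_entropy(fg)
            if phi > best:
                best, best_st = phi, (s, t)
    return best_st


def segment_and_center(img, levels):
    """Binarize with the optimal (s, t) per Eq. (2) and return the fire
    pixels plus the geometric center (row, col) of the fire region."""
    s, t = two_d_max_entropy_threshold(img, levels)
    h, w = len(img), len(img[0])
    fire = [(x, y) for x in range(h) for y in range(w)
            if img[x][y] > s and round(neighborhood_mean(img, x, y)) > t]
    cx = sum(p[0] for p in fire) / len(fire)
    cy = sum(p[1] for p in fire) / len(fire)
    return fire, (cx, cy)
```

On a synthetic 8 × 8 frame with a warm background (gray 1) and a 4 × 4 hot block (gray 6), the search isolates the hot block and returns its centroid, which is the quantity fed to the yaw controller.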
The segmentation results of a set of fire infrared images are shown in Fig. 4.
Figure 4a shows the original grayscale image of the fire captured by the infrared
camera. In addition to the bright region of the fire, high temperature smoke and
distant high temperature regions also appear as brightness in the image. These
regions that approximate the fire brightness are easily mistaken as a segmentation
target. Figure 4b, c represent the one-dimensional and two-dimensional his-
tograms of the original image, respectively, and the calculated neighborhood size
in the two-dimensional histogram is 10 × 10 pixels. It can be clearly seen that the
one-dimensional histogram is in a unimodal state, which indicates that it is diffi-
cult to find a suitable fire segmentation threshold. In contrast, the bimodal state
of the two-dimensional histogram suggests that it is very easy to find the optimal
segmentation threshold. According to the above principle of the maximum two-di-
mensional entropy value, the maximum segmentation threshold of Fig. 4a is 181,
and the segmentation result is as shown in Fig. 4e. Figure 4f shows the calculated

Figure 4. Segmentation result of fire images (a represents the original grayscale
image of the fire captured by the infrared camera; b and c are the one-dimensional
and two-dimensional histograms of a, respectively; d and e are the
one-dimensional and two-dimensional maximum entropy segmentation results of
a, respectively; and f represents the fire center position of e).

fire center point position (63, 90) based on the segmentation result. In our work,
geometric center position abscissa of the fire region in the image is applied as the
horizontal aiming direction of the fire monitor.
In contrast, the traditional one-dimensional maximum entropy target segmenta-
tion method is used for fire segmentation in Fig. 4a, and the results are as shown
in Fig. 4d. It is not difficult to find that there is still false detection in the seg-
mented image. The main cause of the detection deviation is the interference
caused by high temperature regions, while one-dimensional entropy value based
on single pixel value gray could not reflect the difference between the target and
background. Furthermore, the commonly used adaptive threshold segmentation
method was used for fire segmentation in Fig. 4a, and the result is as shown in
Fig. 5. It is found that the segmentation result of the Yen method is similar to
that using the one-dimensional maximum entropy method, but they are still not
accurate. The calculated segmentation threshold of the Yen method is 176. In
contrast, the results of methods such as average grayscale, Otsu threshold,
percentage threshold, threshold iteration, and bimodal threshold are very poor;
the segmentation thresholds produced by these methods are 103, 87, 88, 87, and
67, respectively. These thresholds are too small, so a large number of false
detections occur.
It can be seen that the image acquired by the infrared camera is usually affected
by smoke and other high temperature regions, resulting in a low signal-to-noise
ratio, and it is difficult to segment the fire accurately with the conventional
method. Fortunately, the two-dimensional maximum entropy method, which
considers both the gray value of a single pixel and its neighborhood gray mean,
obtains satisfactory fire detection results. Experimental results show that the
method is very suitable for fire detection and central geometric position extraction.

Figure 5. Fire image segmentation results by other methods (a–f represent the
segmentation results of Fig. 4a with average grayscale, Otsu threshold, Yen
method, percentage threshold, threshold iteration, and bimodal threshold,
respectively).

4. Anthropomorphic Fuzzy Control Rule


Generally, a manual or electric fire monitor is operated by a firefighter according
to the fire state during the fire-extinguishing process. In detail, when the firefighter
observes that the jet trajectory of the fire monitor nozzle deviates far from the fire
position, the yaw angle of the fire monitor is adjusted by a large amount to
respond quickly to the deviation. Similarly, when the jet trajectory is observed to
be close to the fire position, the yaw angle is adjusted slowly. This adjustment
strategy derives from firefighters' work experience and operational skills. Under
normal circumstances, it is difficult to accurately match the yaw angle of the fire
monitor to changes in the fire position. In this paper, an anthropomorphic
infrared image visual servo controller [23] is designed with reference to the
operation of a firefighter.
The developed controller uses the deviation between the infrared image fire cen-
ter point position and the geometric center position of the image as well as the
deviation rate as system inputs. Simulating the manual operation process, devia-
tion e is divided into 7 fuzzy subsets, which are represented as NB, NM, NS, ZO,
PS, PM, and PB, respectively. These 7 fuzzy subsets sequentially indicate the devi-
ation degree of fire position. Similarly, deviation rate ec is also divided into 7
fuzzy subsets. Here, deviation e and deviation rate ec are expressed by triangular
membership function, represented as:
$$f(x; a, b, c) = \begin{cases} 0, & x \le a \\ \dfrac{x-a}{b-a}, & a \le x \le b \\ \dfrac{c-x}{c-b}, & b \le x \le c \\ 0, & x \ge c \end{cases} \tag{6}$$

where a and c represent the endpoints of the triangle base, and b represents the
abscissa of the apex. Based on the operation mode of the firefighters, the fuzzy control rules of
the controller are designed as shown in Table 1. The rule table is stored in the
controller, and the output of the controller is acquired by means of look-up
table according to deviation e and deviation rate ec. The weighted averaging
method is used to defuzzify the controller output from the rule table, which is
expressed as:
$$u = \frac{\sum_{i=1}^{n} \mu_{A_i}(x)\, \mu_{B_i}(y)\, Z_i}{\sum_{i=1}^{n} \mu_{A_i}(x)\, \mu_{B_i}(y)} \tag{7}$$

where u represents the output of the controller, and Zi represents the abscissa of
the triangle membership function corresponding to u. In our work, the maximum
resolution of the fire image captured by the infrared camera is 160 × 120. There-

Table 1
Fuzzy Controller Rule Table

U: E \ EC   NB   NM   NS   Z    PS   PM   PB
NB          NB   NB   NB   NB   NM   Z    Z
NM          NB   NB   NB   NB   NM   Z    Z
NS          NM   NM   NM   NM   Z    PS   PS
Z           NM   NM   NS   Z    PS   PM   PM
PS          NS   NS   Z    PM   PM   PM   PM
PM          Z    Z    PM   PB   PB   PB   PB
PB          Z    Z    PM   PB   PB   PB   PB

fore, deviation e and deviation change rate ec are set to be 80 and 160, respec-
tively. The maximum rotation time of the selected firefighting yaw motor in the
view field of the infrared camera is taken as the output maximum of the con-
troller, which is 10,000 ms. Taking the fire position deviation in Fig. 4f as an
example, the simulation results of the developed controller are shown in Fig. 6.
As can be seen from Fig. 6, the initial deviation is 17 pixels and the deviation is
reduced to 8.1 pixels after one action of the controller. After 5 actions, the devia-
tion is reduced to less than 4 pixels, and after 11 times, the deviation is less than
2 pixels. The results show that the controller responds quickly to the fire position
deviation, which greatly ensures that the fire robot can quickly adjust the yaw
angle of the fire monitor to ensure that the sprayed water aims at the fire position.
Furthermore, the controller's steady-state error is 0.2 pixels, which is finer than
the pixel resolution of the infrared camera and meets the precision requirements
of the fire monitor for fire aiming.
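The controller of Eqs. (6)–(7) and Table 1 can be sketched as follows. The triangle placement (evenly spaced peaks on a normalized universe), the input scaling by the stated maxima (80 pixels for e, 160 for ec), and the output scaling to the 10,000 ms maximum action time are assumptions made for illustration; only the rule base is taken directly from Table 1.

```python
LABELS = ["NB", "NM", "NS", "Z", "PS", "PM", "PB"]
# Evenly spaced triangle peaks on a normalized universe [-1, 1]
# (an assumed placement; the paper does not give exact parameters).
PEAKS = {lab: -1 + i / 3 for i, lab in enumerate(LABELS)}

RULES = {  # output label indexed by (e label, ec label), from Table 1
    "NB": ["NB", "NB", "NB", "NB", "NM", "Z", "Z"],
    "NM": ["NB", "NB", "NB", "NB", "NM", "Z", "Z"],
    "NS": ["NM", "NM", "NM", "NM", "Z", "PS", "PS"],
    "Z":  ["NM", "NM", "NS", "Z", "PS", "PM", "PM"],
    "PS": ["NS", "NS", "Z", "PM", "PM", "PM", "PM"],
    "PM": ["Z", "Z", "PM", "PB", "PB", "PB", "PB"],
    "PB": ["Z", "Z", "PM", "PB", "PB", "PB", "PB"],
}


def tri(x, center, half_width=1 / 3):
    """Triangular membership function (Eq. 6) with its peak at `center`."""
    return max(0.0, 1.0 - abs(x - center) / half_width)


def fuzzy_yaw_time(e, ec, e_max=80.0, ec_max=160.0, u_max=10_000.0):
    """Weighted-average defuzzification (Eq. 7): returns a signed yaw
    motor action time in ms (sign encoding direction is our convention)."""
    x = max(-1.0, min(1.0, e / e_max))    # normalized deviation
    y = max(-1.0, min(1.0, ec / ec_max))  # normalized deviation rate
    num = den = 0.0
    for el in LABELS:
        mu_e = tri(x, PEAKS[el])
        if mu_e == 0.0:
            continue
        for j, cl in enumerate(LABELS):
            mu_ec = tri(y, PEAKS[cl])
            if mu_ec == 0.0:
                continue
            w = mu_e * mu_ec                 # rule firing strength
            num += w * PEAKS[RULES[el][j]] * u_max
            den += w
    return num / den if den else 0.0
```

A saturated negative deviation with zero deviation rate produces the full negative action, while inputs near zero yield almost no correction, mirroring the firefighter's coarse-then-fine adjustment strategy the rule base encodes.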

Figure 6. Simulation results of the developed controller.



Figure 7. Fire location in the experiment (Exp1.(a) represents the fire location in
the first part of the experiment, corresponding to Fig. 9b. Exp2.(a)–Exp2.(f)
represent the fire locations in the second part of the experiment, respectively,
corresponding to Fig. 11).

Figure 8. Structural diagram of the experimental platform.



5. Experiment and Analysis


In order to verify the performance of the controller, the proposed infrared vision
control system was mounted on a fire monitor for experimental testing, and the
experiment was conducted in an open-environment engineering machinery test
site. The experiment mainly consisted of fire aiming after the fire was detected,
and fire tracking by the fire monitor during the fire-fighting process. The fire
location was therefore artificially disturbed for testing purposes during the
experiment. The fire location during the experiment is shown in Fig. 7 and
explained in detail in Sect. 5.1.

Figure 9. Fire images captured in the first part of the experiment (a and c are
the fire images captured at the beginning and end of one cycle of the developed
controller, respectively; b and d represent the fire center positions of a and c,
respectively).

5.1. Experimental Setup


The experimental device mainly includes a fire monitor, a bracket, an infrared
camera, a development board, and a computer, as shown in Fig. 8. The fuel for
the fire is made of waste wood, and a small amount of diesel is used as a combus-
tion improver. The image acquisition frequency of the developed system is set to
0.5 Hz, and the image pixel deviation adjustment rate of the fire monitor is
18.75 ms/pixel. The selected power equipment is a 60 W brushless DC motor of
type PSKD5×20, which provides yaw angle adjustment for the fire monitor. A
ZLGEPC287 development board is selected to receive the action time and
direction from the controller. A personal computer with a 3.5 GHz CPU and
8 GB RAM is used to process the fire images acquired by the infrared camera,
including image pre-processing, two-dimensional maximum entropy threshold seg-
mentation, fire center position calculation and fuzzy controller computing tasks.
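As a worked example of the stated adjustment rate — a unit conversion only, not the controller's actual output law, which is given by the fuzzy rules:

```python
ADJUST_RATE_MS_PER_PIXEL = 18.75  # image pixel deviation adjustment rate


def pixels_to_action_ms(deviation_pixels: float) -> float:
    """Motor action time implied by the stated rate for a given
    horizontal pixel deviation (direction handled separately)."""
    return abs(deviation_pixels) * ADJUST_RATE_MS_PER_PIXEL
```

For instance, the 17-pixel initial deviation of Fig. 4f corresponds to 17 × 18.75 = 318.75 ms at this rate.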
The first part of the experiment was the fire monitor's aiming at the fire after the
fire was detected by the developed control system. Firstly, the direction of the fire
monitor nozzle was artificially adjusted so as not to be aligned with the fire, and
the fire was ignited afterwards. Then, the first frame captured by the infrared camera is transmitted
to the computer after the system is turned on. The controller system takes the cal-
culated deviation of the first frame fire image as the input, and sends the calcula-
tion result to the development board for adjustment of the yaw angle of the fire
monitor.
In the second part of the experiment, the fire tracking during the fire-fighting
process was considered. The fire location was dragged to simulate the change in
fire state during the combustion process. The fire was deflected from the right and
left sides of the fire monitor nozzle, respectively, and each set of 3 frames of fire
image was captured to analyze the controller’s execution response within 2000 ms.

Figure 10. Controller response curves of the first part of the experiment.

Figure 11. Fire images captured in the second part of the experiment (a and b
are the fire images captured at the beginning and end of the first cycle of the
developed controller, respectively; c and d at the beginning and end of the
second cycle; and e and f at the beginning and end of the third cycle).

5.2. Results Analysis


Figures 9 and 10 show the fire images and controller response curves for the first
part of the experiment. As can be seen from Fig. 9a, b, the fire center position is
located to the left of the image's vertical centerline, and its horizontal center
position is 61 pixels.

Figure 12. Controller response curves of the second part of the experiment.

Figure 10 shows the response curve of the controller after the
first fire frame is acquired and it can be seen that the controller performs 5
actions within the response time of 2000 ms. The controller contains an inactive
region, since the DC motor is insensitive to operating times of less than 40 ms.
After five actions of the controller, the fire position deviation was reduced to
−0.6 pixels. Figure 9c shows the second frame fire image collected by the infrared camera.
It can be seen that the adjusted fire horizontal center pixel is 81 and the final devi-
ation is 1.6 pixels. Because the fire is constantly changing during the combustion
process, such error results are satisfactory. In general, the horizontal deviation of
the fire position in the infrared image is small when the fire monitor performs a
fire-fighting mission, since the firefighter roughly knows the position and
direction of the fire. Therefore, experimental results show that the proposed con-
trol system can quickly adjust the fire monitor nozzle to the fire direction after
detection of fire, and ensure a small error.
Figures 11 and 12 show the fire images and controller response curves for the
second part of the experiment. Figure 11a–c are three consecutive frames of fire
images acquired by the proposed system, with the fire position to the left of the infra-
red image vertical center line. The fire position is artificially moved in the image
field of view, the fire shown in Fig. 11b is moved vertically closer to the fire moni-
tor based on Fig. 11a. Similarly, the fire shown in Fig. 11c is moved horizontally
away from the fire monitor based on Fig. 11b. The left bottom of Fig. 12 shows
the response curve of the proposed controller after obtaining the three consecutive
frames fire images. It can be seen that although the fire location was artificially
changed, the controller’s position adjustment output for the fire monitor is still
reliable. After the response of the third frame image ends, the remaining fire devi-
ation is  0:1 pixels. It shows that the control system is very robust and can resist
interference caused by the change of the fire position.
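The controller input behind this robustness is the horizontal deviation of the ROI center from the image's vertical centerline, recomputed for every frame, so any artificial displacement of the fire is absorbed in the next cycle. A minimal sketch of that computation follows; the image width and helper names are assumptions, not values from the paper:

```python
# Horizontal deviation of the segmented fire ROI from the image centerline.
# Negative deviation: fire lies left of the centerline; positive: right.

IMAGE_WIDTH = 160  # assumed infrared image width in pixels (illustrative)

def roi_center_x(roi_pixels):
    """Geometric center (x coordinate) of the segmented fire region."""
    xs = [x for (x, _y) in roi_pixels]
    return sum(xs) / len(xs)

def deviation(roi_pixels):
    """Controller input: ROI center offset from the vertical centerline."""
    return roi_center_x(roi_pixels) - IMAGE_WIDTH / 2

# Example: a small fire blob centered at x = 61 lies left of center:
blob = [(60, 10), (61, 10), (62, 11)]
print(deviation(blob))  # -> -19.0
```

Because the deviation is recomputed per frame, the fuzzy controller needs no explicit model of how the fire moved between frames.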
A similar operation is performed with the fire located on the right side of the
infrared image's vertical centerline. The fire in Fig. 11e is moved horizontally
away from the fire monitor relative to Fig. 11d, and the fire in Fig. 11f is
moved vertically closer to the fire monitor relative to Fig. 11e. It is not
difficult to find that the fire deviation on the right side of the infrared
image field is still corrected in time. After the response to the third frame
ends, the fire deviation is 0.1 pixels. The controller output response for each
frame in the above experiment is recorded in detail in Table 2.

Table 2
Controller Output Response with Each Frame Image (Anthropomorphic Fuzzy Controller)

Relative   Fire       Frames   Initial deviation   Action times (ms)                       Direction   Final deviation
location   distance            (pixels)                                                               (pixels)
L          16.0       1        39.4                1480, 520                               L           18.5
           12.4       2        23.3                930, 640, 380, 50                       L           3.5
           10.2       3        16.3                730, 440, 240, 180, 140, 100, 90, 80   L           0.1
R          15.0       1        50.0                1940, 60                                R           29.9
           14.2       2        36.3                1680, 320                               R           16.4
           12.1       3        16.7                750, 450, 250, 180, 130, 100, 80, 60   R           0.1
It can be seen from Table 2 that, when the fire position deviation is large, the
controller cannot reduce the deviation to a satisfactory level within the
2000 ms interval. In Fig. 11a, d, e, the horizontal deviation exceeds 35 pixels,
and the deviation remaining after one cycle exceeds 15 pixels. Because the state
changes dramatically while the fire is burning, the sampling frequency of the
controller is set to 0.5 Hz. Such a long sampling interval may make the
controller output inaccurate, and may even allow the fire target to escape the
infrared camera's field of view and be missed. On the positive side, under
manual interference the proposed controller obtained good deviation correction
results for the images in each of the two groups. This means that the proposed
controller can accurately aim the fire monitor nozzle at the fire position
within 6000 ms, with a deviation not exceeding 0.1 pixels. In particular, the
maximum fire position deviation in the experiment was set to 50 pixels in order
to verify controller performance; in practice, fire monitors would encounter
such large positional deviations only at the beginning of a fire-fighting
operation. During burning, it is almost impossible for the fire position
deviation to exceed 60% of the camera's field of view in a short time.
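The 0.5 Hz sampling rate implies a 2000 ms budget per frame, and the action durations recorded in Table 2 fill that budget exactly for every frame. A small script makes this consistency check explicit (the data are taken directly from Table 2; the script itself is illustrative):

```python
# Consistency check on Table 2: the motor action durations issued for
# each frame must fit within one 2000 ms sampling interval (0.5 Hz).

CYCLE_MS = 2000

# (relative location, frame number) -> action durations in ms, from Table 2
table2_actions = {
    ("L", 1): [1480, 520],
    ("L", 2): [930, 640, 380, 50],
    ("L", 3): [730, 440, 240, 180, 140, 100, 90, 80],
    ("R", 1): [1940, 60],
    ("R", 2): [1680, 320],
    ("R", 3): [750, 450, 250, 180, 130, 100, 80, 60],
}

for key, actions in table2_actions.items():
    assert sum(actions) <= CYCLE_MS, f"frame {key} exceeds the cycle budget"
print("all frames fit the 2000 ms cycle")
```

That every frame uses the full 2000 ms is consistent with the observation above: with a large initial deviation, a single cycle is not enough to bring the error down to the 0.1-pixel level.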

6. Conclusion
In this paper, an infrared image feedback control system for fire aiming of a
fire monitor was proposed and experimentally verified. First, the infrared
camera is installed vertically above the fire monitor, so that the center
position of the fire region in the infrared image can be regarded as the
horizontal aiming target of the fire monitor. Then, an improved maximum-entropy
adaptive fire segmentation algorithm is proposed and applied to determine the
fire position in the infrared image under a complex fire environment. Finally, a
fuzzy controller that simulates firefighters' professional operations is built
and verified; the experimental results and action curves confirm the performance
of the controller based on the anthropomorphic fuzzy rules.
After acquiring the fire target through the proposed controller, the fire
monitor can accurately aim at the fire in less than 6000 ms. Furthermore, even
if the fire position changes greatly during the fire-fighting operation, the
proposed controller can still track and aim at the fire. In future work, one
focus is to extend the system so that the jet trajectory range of the fire
monitor is matched to the fire distance; another is to improve the structural
design of the intelligent fire monitor and mount it on a fire robot for
fire-fighting operations in real fire environments.

Acknowledgements
This work is supported by the National Key R&D Program of China
(2016YFC0802900), the Xu-gong Construction Machinery Group (XCMG) Research
Institute, a Project Funded by the Priority Academic Program Development of
Jiangsu Higher Education Institutions, and the Top-notch Academic Programs
Project of Jiangsu Higher Education Institutions.


Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published
maps and institutional affiliations.
