
DESIGN OF AN INTELLIGENT ROBOT SYSTEM TO SUPPORT PATIENT SCREENING AND CONTACTLESS DATA COLLECTION FOR MOBILE ROBOTS TO SUPPORT HOSPITALS IN COVID-19 TREATMENT
Quan Ngo Anh Nguyen, Hoang Thuan Tran*, Quang Ngoc Pham, Thanh Chi Vo, Thang Kim Nguyen,
Submitted: 1 January XXX; accepted

Abstract: This paper presents the construction of an intelligent mobile robot system used in the clinic to assist in screening patients while avoiding contact with COVID-19 infection. The system is built on an intelligent autonomous mobile robot operating in a simulated clinic. To ensure accurate motion along the desired trajectory and arrival at the positions where the robot needs to interact with the patient, we use a new positioning algorithm that fuses the data obtained from the LiDAR sensor and the encoder sensors with an Extended Kalman filter. A program applying a feature-extraction algorithm is also developed to meet the requirements of real-time environment mapping in the room. Motion and obstacle avoidance are handled by the Tangent Bug algorithm using the LiDAR sensor. Health data collection techniques are also presented, including patient identification, body temperature, and blood oxygen saturation (SpO2); the collected data are sent to the center for processing.

Keywords: COVID-19, Extended Kalman filter, Mobile robot, Tangent bug algorithm

1. Introduction

Since the outbreak of the 2019 coronavirus pandemic (COVID-19) in Wuhan, China, advanced technologies have been able to fully support devices in completing their tasks in hospitals and other environments to protect people from the virus. Mobile robots can therefore provide alternative solutions to combat this disease without contact between sick and healthy people. New approaches to robotics in infectious-disease environments are rapidly being exploited to assist hospitals and public areas. Robots have taken on the following COVID-related applications: disinfecting, checking people's temperature, monitoring public places, delivering food and other items, preparing food, interacting with people at a distance, etc. [1]–[3].

Today's mobile robots have made many advances, and this knowledge is applied in performing various tasks. The flexibility in programming these devices allows us to create new applications without having to create new designs, i.e., it allows us to adapt existing designs to current needs. Since the spread of disease can be reduced when interaction is human-machine rather than human-to-human, the applications that have been developed are very diverse.

The combination of engineering technology and infection management in hospitals has led to breakthroughs in infection control technology [4]–[7]. Healthcare organizations around the world have been optimizing hospital information by gathering and organizing information from multiple perspectives to build innovative models in hospital development. Several outcomes related to health services avoiding infectious diseases were shown separately due to specific mandates. It is best to deploy robots at the hospital reception to spread information about the different units/divisions of the hospital and to guide patients and visitors [8]. Nursing robots in hospitals and at home to care for the elderly are studied in [9], [10]. Robots can also provide immediate medical assistance after a critical accident to prevent further injury [11], [12]. Robots can help doctors who are far away from the patient obtain all the physiological parameters and diagnose diseases with audio-visual aids [13]–[16]. Delivery services inside hospitals could be made possible by mobile robots that serve food and drinks, dispense medicines, remove unclean laundry, provide new bed linens, and transport waste through normal and contaminated areas, etc. [13], [14]. Cleaning robots for medical environments seem capable of delivering the innovation that the creators of non-industrial robots have been anticipating for years [15].

At this time, the COVID-19 pandemic is still ongoing; however, there are situations in which people cannot avoid leaving the house, especially when they are sick or when a chronic illness requires re-examination. This is a force majeure situation. Currently, all medical examination and treatment facilities must arrange at least one clinic to isolate suspected cases of respiratory infection (fever, cough, etc.) of unknown cause, take preventive measures, and isolate and conduct COVID-19 testing immediately after detecting suspected cases. Patients who are required to go for medical examination and treatment should pay attention to and follow all instructions of reception staff at hospitals and medical facilities to prevent the risk of COVID-19 infection. Recently published research works [6], [9], [13]–[15], [17]–[23] have solved the problems of disinfection, environmental cleaning, delivery of necessities, etc., but no author has yet solved the problem above. It remains an urgent, unaddressed problem to build robots that can directly contact patients in place of humans.

From the above points, we propose to build a robot system that supports the screening of COVID-19 patients, with many functions integrated into a mobile robot called CeeSRbt. The mobile robot can move in the screening area among the people who come for examination. The robot collects data from the patient and sends it to the doctor or to a data processing center. The hardware and software are designed to be adaptable to the hospital environment. The robot receives commands and operates independently. The engineering design effectively assists patients and doctors.

As shown in Fig. 1, we present the indirect measurement of blood oxygen level (SpO2 is an important indicator for monitoring and evaluating the health of COVID-19 patients in the recommended treatment regimen [20]) and of patient body temperature through thermal image processing or the infrared (IR) spectral detection method [21]. The patient's epidemiological information is also collected through the patient-robot interaction software. Through these data, doctors do not need to be in direct contact with patients and can still classify patients at risk of COVID infection to be sent to the treatment area.

In addition, robot guidance, such as positioning and tracking control to the desired locations, and some theories related to this implementation are presented in the next sections. For the near-obstacle avoidance task in the local area, we applied the Tangent Bug method [24], [25] with the Lidar sensor and developed a control program that allows obstacle avoidance while controlling the robot to track a trajectory to the desired position. The content of the paper is arranged as follows: Section 2 presents the hardware structure, sensors, and communication system of the robot; Section 3 presents the navigation and screening of the patient; and the final section reports the results and discusses them.

Fig. 1. (a) 3D simulation image of CeeSRbt; (b) fabricated CeeSRbt images; (c) detailed mechanical drawings

2. System Model

2.1. Sensor System

Encoder: In mobile robotics, encoders are used to measure the movement (direction and speed) of each wheel of the robot. Determining the position of the robot from these encoders is a popular method known as odometry [8]–[10].

LIDAR sensor (Light Detection and Ranging): To increase the positioning accuracy, we use a Lidar sensor. Lidar is a method for determining ranges (variable distances) by targeting an object with a laser and measuring the time for the reflected light to return to the receiver. In this research, the team used the lidar method to locate the robot in the hospital with the RPLIDAR A1 sensor.

RGB camera: In addition to the functions mentioned above, the robot is also designed to guide the patient while moving in the hospital. Accordingly, the robot both moves and tracks the patient's face to maintain a distance from the patient during movement. If the patient stops or slows down, the robot stops and waits; this is made possible by a method that predicts the distance from the image sensor to the person's face using the monocular camera [11].

Infrared thermal image sensor: The temperature sensor used in this research is Panasonic's AMG8833 (Fig. 2), an infrared thermal image sensor with 64 thermal pixels distributed in a 2D 8x8 matrix, which can measure the temperature of an object in the range from 0°C to 80°C. It is a non-contact infrared sensor with a measuring distance of up to 7 m under ideal conditions, a measuring angle of up to 60° both horizontally and vertically, and a sampling rate of 10 fps (frames per second).

Fig. 2. (a) Infrared thermal image sensor – AMG8833 sensor node; (b) case for the AMG8833 sensor node

The robot is set to move at a speed of 0.2 m/s to 0.6 m/s. Therefore, under ideal conditions, it is possible to collect temperature data even while the robot is moving.

SpO2 sensor: The SpO2 sensor, as shown in Fig. 8, is the MAX30102 MH-ET LIVE module that integrates the MAX30102 heart rate and SpO2 sensor. The MAX30102 sensor contains two light-emitting diodes (LEDs), one infrared (peak wavelength 880 nm) and the other red (peak wavelength 660 nm), along with a photodiode that is sensitive to wavelengths between 600 and 900 nm. The values measured by the sensor are transmitted with a checksum, so that the validity of the data is verified when it is received by the supporting microcontroller. According to Fig. 3, this sensor is worn on the patient's finger.

Fig. 3. (a) SpO2 sensor – MAX30102 sensor node; (b) case for the MAX30102 sensor node

2.2. Control System

Transmission and communication in the robot: The drive system uses high-speed, high-torque, reversible DC motors. Each motor is fitted with a quadrature optical shaft encoder that provides 500 ticks per revolution for precise positioning and speed sensing. The motor control is implemented in a microprocessor-based electronic circuit with embedded firmware that allows the motor to be controlled by a PID algorithm.

The sensory information network of the robot is shown in Fig. 4, in which the computer is the central control unit. Due to the different information rates of the sensors (the information here is point-to-multipoint), a special design for the communication network in the robot has been implemented. The RGB camera sensor is connected to the control unit via a high-speed communication channel.

Fig. 4. Sensors, actuators, and communications in the robot

The control program for the system, which collects patient data and disinfects the floor during robot movement, can be summarized as follows. Starting from the hardware configuration shown above, the software that collects information from the sensors can be divided into modules with the following specific tasks:
• the program for collecting and processing data from the RPLidar A1 sensor, developed specifically in the C++ language;
• camera image information is collected directly into the computer and processed through an image processing program, developed in the Visual C++ environment with Intel's OpenCV tool;
• the control program for the two optical encoder measuring circuits (mounted on the motor shafts), connected directly to a microcomputer (via RS-232C).

Fig. 5 shows an overview of the navigation process for a mobile robot, which can be summarized as follows:
1) The robot collects data through the sensors;
2) Determine the position of the robot;
3) Mapping (if necessary, or import an existing global map);
4) Plan the path;
5) Control the moving robot to follow the trajectory to the destination and avoid obstacles (if any).

Fig. 5. Flowchart of the navigation process for the mobile robot

Collection of patient health data for medical treatment: As mentioned above, the SpO2 index and body temperature are two important measurements in monitoring and treating COVID-19 patients. Our robot collects these two data automatically. When the robot moves to within a radius of 1 meter of the patient's bed, the sensor nodes automatically connect to the Wi-Fi network emitted by a router placed on the robot, and those nodes actively send the measured data to the central computer using the MQTT protocol. MQTT is a TCP-based protocol [26] that uses a topic-based publish-subscribe architecture. It is used to establish communication between multiple devices. This communication protocol is suitable for transferring data between resource-constrained devices with low bandwidth and low power requirements; therefore, this messaging protocol is widely used for communication in IoT frameworks. With the above technical characteristics, MQTT can work under unstable transmission conditions; hence it is suitable for our robot model.

(Figure: one exchange in the MQTT protocol – the AMG8833 and MAX30102 sensor nodes publish their measurements to the topic hRobot/sensor on the MQTT broker, and the robot's data management software on the personal computer subscribes to that topic.)

As shown in Fig. 10, the wireless sensor data collection system is set up with 3 clients, in which the AMG8833 and MAX30102 are 2 sensor nodes (called MQTT clients) that collect data independently of each other and communicate with the robot. Every time the sensor nodes "find" the Wi-Fi network emitted by the robot, they connect to a server (an open-source Mosquitto broker application that runs in the background on the robot's computer – https://mosquitto.org/). All collected sensor data is stored in the software on the computer to provide medical treatment data to the doctor.

3. Proposed Method

3.1. Guidance of the robot to approach the patient

Positioning for the robot using the Extended Kalman filter: As shown in Fig. 6, the coordinate systems and parameter symbols of the mobile robot are defined for motion on the flat floor of the room, where (X_G, Y_G) are the axes of the global coordinate system of the room and (X_R, Y_R) are the axes of the local coordinate system attached to the robot's center.

Fig. 6. The robot's posture and parameters in the two coordinate systems

Let the angular velocities of the right and left wheels be ω_R and ω_L respectively, and let the sampling interval Δt be sufficiently short, so that the input control signals, i.e. the displacement increments applied to these wheels (with R the wheel radius), are:

Δs_R = Δt·R·ω_R,  Δs_L = Δt·R·ω_L    (1)

From this, the distance increment of the robot center, Δs, is:

Δs = (Δs_R + Δs_L)/2    (2)

and the angular increment by which the robot rotates, Δθ, is:

Δθ = (Δs_R − Δs_L)/L    (3)

where L is the distance between the two wheels.

So the equation of state of the robot at time k in the global coordinate system, updated from time k−1, is as follows:
[x_k ; y_k ; θ_k] = [x_{k-1} ; y_{k-1} ; θ_{k-1}] + [ Δs_k cos(θ_{k-1} + Δθ_k/2) ; Δs_k sin(θ_{k-1} + Δθ_k/2) ; Δθ_k ]    (4)

The displacement increments Δs_R and Δs_L (proportional to the set speeds of the two wheels, ω_R and ω_L, as measured by the shaft encoder sensors) are input variables affected by systematic and random errors. In practice, the system (4) encounters systematic errors (such as dimensional errors of mechanical parts, shaft offsets, encoder resolution limits, etc.) and non-systematic errors. In this paper, the errors are taken to be proportional to the two wheel displacement increments Δs_R and Δs_L; when subjected to noise, these increments can be expressed as the sum of two components, a nominal value and a disturbance:

Δs_R = Δs_R0 + ε_R,  Δs_L = Δs_L0 + ε_L    (5)

where Δs_R0 and Δs_L0 are the nominal values of the input signals, and ε_R and ε_L are white-noise processes, which are independent, have zero mean, and are normally distributed.

The equation of state (4) can then be written in more detail as follows:
x_k = x_{k-1} + [(Δs_{R0(k)} + ε_{R(k)}) + (Δs_{L0(k)} + ε_{L(k)})]/2 · cos( θ_{k-1} + [(Δs_{R0(k)} + ε_{R(k)}) − (Δs_{L0(k)} + ε_{L(k)})]/(2L) )
y_k = y_{k-1} + [(Δs_{R0(k)} + ε_{R(k)}) + (Δs_{L0(k)} + ε_{L(k)})]/2 · sin( θ_{k-1} + [(Δs_{R0(k)} + ε_{R(k)}) − (Δs_{L0(k)} + ε_{L(k)})]/(2L) )
θ_k = θ_{k-1} + [(Δs_{R0(k)} + ε_{R(k)}) − (Δs_{L0(k)} + ε_{L(k)})]/L
    (6)
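To make the odometric model above concrete, the following minimal Python sketch propagates the pose (x, y, θ) from the wheel displacement increments according to (2)–(4), optionally adding the white-noise terms of (5). The wheelbase value and the noise magnitude are illustrative assumptions, not parameters reported in the paper.

```python
import math
import random

L = 0.40  # assumed wheelbase in metres (illustrative, not a value from the paper)

def odometry_step(x, y, theta, ds_r, ds_l, noise_std=0.0):
    """Propagate the pose by one step using eqs. (2)-(4).

    ds_r, ds_l are the wheel displacement increments of eq. (1);
    noise_std optionally adds the zero-mean white-noise terms of eq. (5)."""
    ds_r += random.gauss(0.0, noise_std)
    ds_l += random.gauss(0.0, noise_std)

    ds = (ds_r + ds_l) / 2.0          # eq. (2): centre displacement increment
    dtheta = (ds_r - ds_l) / L        # eq. (3): heading increment

    # eq. (4): update the pose in the global frame
    x += ds * math.cos(theta + dtheta / 2.0)
    y += ds * math.sin(theta + dtheta / 2.0)
    theta += dtheta
    return x, y, theta

# usage: drive straight for 10 steps of 1 cm per wheel
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = odometry_step(*pose, ds_r=0.01, ds_l=0.01)
print(pose)
```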
Due to the cumulative nature of the systematic errors, the accuracy of the position estimate worsens as the robot moves [27]. Many studies have tried to increase the reliability of this positioning, including through the use of the Kalman filter. We also applied sensor fusion using the Kalman filtering algorithm for positioning the robot and obtained results that improved the quality of these measurements significantly [28], [29]. The approach can be summarized as follows.

The state vector x ∈ ℝⁿ of the controlled process is represented by (4). This nonlinear equation has the form:

x_k = f(x_{k-1}, u_{k-1}, w_{k-1})    (7)

where f is the system function, x = [x y θ]ᵀ is the state vector, u = [Δs_R Δs_L]ᵀ is the input control vector, and w = [ε_R ε_L]ᵀ is the process noise.

This state is observed by several measurements that make up the output vector z:

z_k = h(x_k, v_k)    (8)

where h is the measurement function and v is the measurement noise vector affecting the readings of the displacement increments from the rotary shaft encoders.

w_k and v_k are independent, white, and normally distributed, with covariance matrices Q and R respectively:

P(w_k) ~ N(0, Q_k);  P(v_k) ~ N(0, R_k);  E(w_i v_jᵀ) = 0    (9)

a) Defining the matrices in the sensor fusion steps

The speed and distance of the robot shown in Fig. 1(c) are measured by the odometry method based on the function f in the system (7).

• Starting phase: From step k = 0, the robot starts from a known initial position with a posterior state estimate x̂_0 and an error covariance estimate P_0.

• Prediction phase: At each time step k, the filter propagates the state x and the covariance P of the system from the previous step to the current step using the time update equations:

x̂⁻_k = f(x̂_{k-1}, u_{k-1}, 0)    (10)

P⁻_k = A_k P_{k-1} A_kᵀ + W_k Q_{k-1} W_kᵀ    (11)

Here, the input control signals u = [Δs_R Δs_L]ᵀ are measured by the encoder sensors. To obtain the filter's matrix P, the matrices Q, A and W are defined as follows.

Q_k is the input noise covariance matrix of size [2×2]:

Q_k = covar(Δs_R(k), Δs_L(k)) = [ δ_R·|Δs_R(k)|   0 ;  0   δ_L·|Δs_L(k)| ]    (12)

Here, δ_R and δ_L are error constants representing the uncertainty parameters of the motor control circuit and of the interaction of the wheels with the floor, and they are independent of each other. Thus, the variance of the process noise is proportional to the absolute value of the distance increment travelled by each wheel, Δs_R and Δs_L. These scale constants depend on the structure of the robot and on the environment. They are determined experimentally as δ_R = δ_L ≡ δ = 0.0003.

A_k is a matrix of size [3×3], calculated by:

A_k = [ ∂f_x/∂x  ∂f_x/∂y  ∂f_x/∂θ ;
        ∂f_y/∂x  ∂f_y/∂y  ∂f_y/∂θ ;
        ∂f_θ/∂x  ∂f_θ/∂y  ∂f_θ/∂θ ]
    = [ 1   0   −Δs_k sin(θ_{k-1} + Δθ_k/2) ;
        0   1    Δs_k cos(θ_{k-1} + Δθ_k/2) ;
        0   0    1 ]    (13)

W_k is a matrix of size [3×2], calculated by:

W_k = [ (1/2)cos(θ_{k-1} + Δθ_k/2) − (Δs_k/2L)sin(θ_{k-1} + Δθ_k/2)    (1/2)cos(θ_{k-1} + Δθ_k/2) + (Δs_k/2L)sin(θ_{k-1} + Δθ_k/2) ;
        (1/2)sin(θ_{k-1} + Δθ_k/2) + (Δs_k/2L)cos(θ_{k-1} + Δθ_k/2)    (1/2)sin(θ_{k-1} + Δθ_k/2) − (Δs_k/2L)cos(θ_{k-1} + Δθ_k/2) ;
        1/L    −1/L ]    (14)
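As an illustration of the prediction phase, the sketch below evaluates the Jacobians A_k (13) and W_k (14) and the input covariance Q_k (12), then propagates the state and covariance as in (10)–(11). The error constant δ = 0.0003 follows the text, while the wheelbase value and the NumPy-based structure are illustrative assumptions only, not the authors' implementation.

```python
import numpy as np

L = 0.40          # assumed wheelbase in metres (illustrative)
DELTA = 0.0003    # error constant delta_R = delta_L from the text

def ekf_predict(x_hat, P, ds_r, ds_l):
    """EKF time update, eqs. (10)-(14). x_hat = [x, y, theta], P is 3x3."""
    ds = (ds_r + ds_l) / 2.0
    dtheta = (ds_r - ds_l) / L
    a = x_hat[2] + dtheta / 2.0                     # theta_{k-1} + dtheta_k / 2

    # eq. (10): propagate the state through the odometry model (4)
    x_pred = x_hat + np.array([ds * np.cos(a), ds * np.sin(a), dtheta])

    # eq. (13): Jacobian of f with respect to the state
    A = np.array([[1.0, 0.0, -ds * np.sin(a)],
                  [0.0, 1.0,  ds * np.cos(a)],
                  [0.0, 0.0,  1.0]])

    # eq. (14): Jacobian of f with respect to the wheel increments
    W = np.array([[0.5*np.cos(a) - ds/(2*L)*np.sin(a), 0.5*np.cos(a) + ds/(2*L)*np.sin(a)],
                  [0.5*np.sin(a) + ds/(2*L)*np.cos(a), 0.5*np.sin(a) - ds/(2*L)*np.cos(a)],
                  [1.0/L,                              -1.0/L]])

    # eq. (12): input noise covariance, proportional to |ds_R| and |ds_L|
    Q = np.diag([DELTA * abs(ds_r), DELTA * abs(ds_l)])

    # eq. (11): propagate the covariance
    P_pred = A @ P @ A.T + W @ Q @ W.T
    return x_pred, P_pred
```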

• Correction phase: The filter corrects the a priori state estimate with the measurements z_k using the measurement update equations:

K_k = P⁻_k H_kᵀ (H_k P⁻_k H_kᵀ + R_k)⁻¹    (15)

x̂_k = x̂⁻_k + K_k [z_k − h(x̂⁻_k)]    (16)

P_k = (I − K_k H_k) P⁻_k    (17)

In this case, we use an absolute measurement from the RPLIDAR sensor. This sensor allows the detection of features of the environment around the robot, namely lines detected in the environment (for example, a line running along a flat wall of the room).

Fig. 7. The Lidar sensor measures the 2 parameters of a straight line in the environment

By a transformation such as the Hough transform, we obtain two parameters: the distance r from the line to the origin and the angle ψ of the line, as shown in Fig. 7. The global map of the built environment consists of a set of line segments described by the parameters β_j and ρ_j. The equation of a straight line in canonical form is:

x_G cos β_j + y_G sin β_j = ρ_j    (18)

As the robot moves, a local map of the environment is built in the coordinate frame attached to the robot. It also consists of a set of line segments, described by the equation:

x_R cos ψ_i + y_R sin ψ_i = r_i    (19)

where ψ_i and r_i are the parameters of the i-th line among N.

Line segments of the local and global maps are then checked for matches. The line segments that are matched to the local map are collected into the vector z_k, used as input for the EKF correction step:

z_k = [r_1(k), ψ_1(k), …, r_N(k), ψ_N(k)]ᵀ    (20)

b) Calculating the matrices R, H and V

From the coordinates and direction of the robot estimated by the odometry method, the parameters β_j and ρ_j of the j-th line segment in the global map are converted to the parameters r_i and ψ_i in the local coordinate system of the robot. They are calculated by:

[r_i ; ψ_i] = [ |C_j| ; β_j − θ + π(−0.5·sign(C_j) + 0.5) ],  with C_j = ρ_j − x cos β_j − y sin β_j    (21)

where r and ψ are the range and bearing from the robot position to the obstacle, respectively.

ψ_i and r_i are measurements that are directly affected by the measurement noise of the RPLIDAR sensor. Calling these measurement disturbances ε_r and ε_ψ, the measured component vector from the laser sensor, including the disturbances, is:

[r_i ; ψ_i] = [ |C_j| ; β_j − θ + π(−0.5·sign(C_j) + 0.5) ] + [ε_r ; ε_ψ]    (22)

Thus, when the Lidar sensor is added, the measurement function h gains 2N components ψ_i and r_i. Omitting the time index k, we have the following matrices.

The matrix H_odl has size [(2N)×3]:

H_odl = [ ∂r_1/∂x   ∂r_1/∂y   ∂r_1/∂θ ;
          ∂ψ_1/∂x   ∂ψ_1/∂y   ∂ψ_1/∂θ ;
          ⋮ ;
          ∂r_N/∂x   ∂r_N/∂y   ∂r_N/∂θ ;
          ∂ψ_N/∂x   ∂ψ_N/∂y   ∂ψ_N/∂θ ]
      = [ −c_1 cos β_1   −c_1 sin β_1    0 ;
           0               0            −1 ;
           ⋮ ;
          −c_N cos β_N   −c_N sin β_N    0 ;
           0               0            −1 ]    (23)

where c_i is the sign function sign(C_i).

The matrix V_odl has size [(2N)×3]:

V_odl = [ ∂r_1/∂ε_φ   ∂r_1/∂ε_r   ∂r_1/∂ε_ψ ;
          ∂ψ_1/∂ε_φ   ∂ψ_1/∂ε_r   ∂ψ_1/∂ε_ψ ;
          ⋮ ;
          ∂r_N/∂ε_φ   ∂r_N/∂ε_r   ∂r_N/∂ε_ψ ;
          ∂ψ_N/∂ε_φ   ∂ψ_N/∂ε_r   ∂ψ_N/∂ε_ψ ]
      = [ 0  1  0 ;
          0  0  1 ;
          ⋮ ;
          0  1  0 ;
          0  0  1 ]    (24)
The matrix R_odl has size [(2N)×(2N)]:

R_odl = diag( var(r_1), var(ψ_1), …, var(r_N), var(ψ_N) )    (25)

c) Identification of the Kalman filter's parameters from the Lidar

This section describes the algorithm used to extract line segments from a set of points based on the least-squares (LSQ) method. The point cloud is obtained from a scanner with a 360° field of view and a 1° angular resolution. The data collected by the RPLIDAR are first converted into Cartesian coordinates and stored in an array according to the equation:

x = r·cos(φ·π/180);  y = r·sin(φ·π/180)    (26)

Points that lie too close to the robot position (< 40 cm) are treated as noise and eliminated. Because the RPLIDAR scans from right to left, neighboring points have a high chance of lying on the same landmark, such as a wall. The LSQ is applied to a group of N1 consecutive laser readings, starting from the first points, to find the best-fit line of these points. A distance threshold disT is predefined and compared with the distance from each point in the group to the best-fit line. If more than N2 of the points in the group (in our research, N2 = 0.75·N1) have a distance smaller than disT, a line is detected and a new best-fit line is calculated based on these points. The point with the smallest index is saved as one end of the line segment. The slope m and intercept k of the line are computed using the formula:

m = (Σxy − (Σx)(Σy)/n) / (Σx² − (Σx)²/n);  k = ȳ − m·x̄    (27)

where n is the total number of data points and x̄ and ȳ are the means of the x- and y-coordinates of the data points, respectively. The algorithm then searches the remaining points of the cloud to add more points to the line segment. Points that lie too far from their previous neighbors in the array are ignored. The last point that meets the distance threshold is then treated as the other end of the found line segment. The algorithm is repeated with a new group until all elements of the collected point cloud have been checked.

Nevertheless, a fixed threshold disT does not work for all scans. Experimental results show that in about 10% of the scans, the threshold on the distance from a point to the best-fit line should be larger in order to detect lines from the data points. However, if the threshold applied to all scans is too large, the extracted line segments are not smooth, which can affect the matching result. In this paper, a simple dynamic threshold was developed to solve this problem. A maximal threshold disTmax is defined as the largest distance for deciding whether a point belongs to a line. For each group of points, instead of comparing the distance from each point to the best-fit line with a fixed disT, all distances are stored in an array and sorted in ascending order. If the N2-th distance is smaller than the maximal threshold disTmax, it is chosen as the distance threshold for this line segment. Therefore, the number of distance thresholds in each scan equals the number of detected line segments.

d) Local and global map matching

The extracted line segments of the same landmarks from the local and global maps are then matched using a straightforward algorithm. The extracted line segments from the global map, G_j, and the local map, L_i, are described as follows:

G_j = [x1_{G,j}  x2_{G,j}  y1_{G,j}  y2_{G,j}],  j = 1…n_G
L_i = [x1_{L,i}  x2_{L,i}  y1_{L,i}  y2_{L,i}],  i = 1…n_L    (28)

where (x1, y1) and (x2, y2) are the Cartesian coordinates of the two end points of the extracted line segments, and n_G and n_L are the numbers of extracted global and local line segments.

The detected global line segments are first transformed into the local map with the equation:

G_{T,j} = G·( [x1_{G,j}  y1_{G,j} ; x2_{G,j}  y2_{G,j}] − [x_0  y_0 ; x_0  y_0] )    (29)

where G = [ cos φ_0  −sin φ_0 ; sin φ_0  cos φ_0 ] is the conversion matrix and [x_0  y_0  φ_0]ᵀ is the robot's pose in the global map estimated by odometry. Each local line segment L_i is compared with all transformed global line segments G_{T,j}, and two line segments are considered "matched" if their range and bearing to the robot position are approximately the same and the overlapping rate between them is less than a previously defined threshold [30].

The range and bearing of the global and local line segments to the robot position in the robot's coordinate frame are denoted (ρ_{G,j}, ψ_{G,j}) and (ρ_{L,j}, ψ_{L,j}), respectively.
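A minimal sketch of the transformation (29) is given below: each global segment endpoint is shifted by the odometric position estimate and then multiplied by the conversion matrix G. The NumPy code, the function name and the example pose are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def global_to_local(segment, x0, y0, phi0):
    """Transform a global line segment into the robot frame, as in eq. (29).

    segment = (x1, y1, x2, y2) in the global map;
    (x0, y0, phi0) is the robot pose estimated by odometry."""
    G = np.array([[np.cos(phi0), -np.sin(phi0)],
                  [np.sin(phi0),  np.cos(phi0)]])
    ends = np.array([[segment[0], segment[1]],
                     [segment[2], segment[3]]], dtype=float)
    ends -= np.array([x0, y0])    # translate the endpoints to the robot origin
    return ends @ G.T             # apply the conversion matrix G of eq. (29) to each endpoint

# usage: a wall segment seen from a robot at (1, 2) with heading 90 degrees
print(global_to_local((0.0, 0.0, 4.0, 0.0), 1.0, 2.0, np.pi / 2))
```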
Fig. 8. The range and bearing of the global and local line segments to the robot's position in the robot's coordinate frame

The overlapping rate between the local line segment L_i and the transformed global line segment G_{T,j} is defined as follows:

O_k(a_k, b_k) = |a_k + b_k − G_{T,j}|
O_k(c_k, d_k) = |c_k + d_k − G_{T,j}|    (30)
k = (j − 1)·n_L + i,  k = 1, 2, …, n_L·n_G − 1, n_L·n_G

where G_{T,j} is the length of the transformed global line segment and a_k, b_k, c_k, and d_k are the Euclidean distances between the end points of the line segments L_i and G_{T,j}.

Fig. 9. The overlapping parameters between the global and local line segments

The inequalities below represent the conditions for matching local and transformed global line segments in the robot's coordinate frame. Two line segments L_i and G_{T,j} are matched if all of the following conditions are met:

O_k(a_k, b_k) < T
O_k(c_k, d_k) < T
(ρ_{L,i} − ρ_{G,j})² < T_ρ    (31)
(ψ_{L,i} − ψ_{G,j})² < T_ψ

where T, T_ρ, and T_ψ are predefined thresholds.

e) Estimation of the line parameters' covariances

In order to compute the measurement covariance for the EKF, each extracted line segment can be represented by its (ρ, ψ) parameters, where ρ stands for the perpendicular distance from the robot's position to the line and ψ is the line orientation:

ρ = |k| / √(m² + 1);  ψ = arctan2( k/(1 + m²), −m·k/(1 + m²) )    (32)

where (m, k) are the slope and intercept of the detected line computed in (27).

From the solution given by [31], the measurement covariance matrix R can be calculated under the assumption that each data point has the same Cartesian uncertainty:

R_i = ( (a_i σ_yy² − b_i σ_xy² + c_i σ_xx²) / ((a_i − c_i)² + b_i²) )·[ e_i²  −e_i ; −e_i  1 ] + ( (σ_yy² cos²φ_i + σ_xx² sin²φ_i − 2σ_xy² sin φ_i cos φ_i) / n )·[ 1  0 ; 0  0 ] ≅ [ var(r_i)  0 ; 0  var(ψ_i) ]    (33)

where e_i = ȳ cos ψ_i − x̄ sin ψ_i;  φ_i = ψ_i + π/2;  x̄ = (1/n)Σx_j;  ȳ = (1/n)Σy_j;  a_i = Σ(x_j − x̄)²;  b_i = 2Σ(x_j − x̄)(y_j − ȳ);  c_i = Σ(y_j − ȳ)².
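The conversion (32) from the fitted slope and intercept to the (ρ, ψ) line parameters can be written directly; the short sketch below only illustrates that formula and is not taken from the authors' code.

```python
import math

def line_to_rho_psi(m, k):
    """Convert a fitted line y = m*x + k into (rho, psi), eq. (32).

    rho is the perpendicular distance from the robot (origin) to the line,
    psi is the orientation of that perpendicular."""
    rho = abs(k) / math.sqrt(m * m + 1.0)
    # (x, y) of the foot of the perpendicular from the origin onto the line
    psi = math.atan2(k / (1.0 + m * m), -m * k / (1.0 + m * m))
    return rho, psi

# usage: a horizontal wall 2 m in front of the robot (y = 2)
print(line_to_rho_psi(0.0, 2.0))   # -> (2.0, pi/2)
```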

Motion control: The main goal is to control the mobile robot to track a given trajectory. A trajectory differs from a path in that time constraints are added to it, which makes the control objective not only to minimize the distance between the robot and the path but also to ensure the travel time.

We define the actual robot state as X = [x y θ]ᵀ and the reference state on the pattern trajectory as X_r = [x_r y_r θ_r]ᵀ.

When the robot moves, an error appears:

e = [e_1 ; e_2 ; e_3] = [ cos θ   sin θ   0 ; −sin θ   cos θ   0 ; 0   0   1 ]·[ x_r − x ; y_r − y ; θ_r − θ ]    (34)

Fig. 10. The positional error between the robot's actual coordinates and the reference coordinates on the trajectory

From the kinematic model (4) and its derivative, we get the error model as follows:

[ ė_1 ; ė_2 ; ė_3 ] = [ cos e_3   0 ; sin e_3   0 ; 0   1 ]·[ v_r ; ω_r ] + [ −1   e_2 ; 0   −e_1 ; 0   −1 ]·[ v ; ω ]    (35)

where v_r and ω_r are the linear and angular velocities of the robot according to the reference trajectory.

The controller for the robot is built as follows:

[ v ; ω ] = [ v_r cos e_3 ; ω_r ] + [ v_fb ; ω_fb ]    (36)

where v_fb and ω_fb are the feedback signals of the controller, selected as follows:

v_fb = k_1 e_1
ω_fb = k_2 v_r (sin e_3 / e_3) e_2 + k_3 e_3    (37)

where k_1 > 0, k_2 > 0 and k_3 are the controller coefficients.

The control law for tracking the trajectory can then be rewritten as:

[ v ; ω ] = [ v_r cos e_3 + k_1 e_1 ; ω_r + k_2 v_r (sin e_3 / e_3) e_2 + k_3 e_3 ]    (38)

Controlling the robot to avoid obstacles with the Tangent Bug algorithm: Tangent Bug uses data from the Lidar to calculate the endpoints of the finite continuous segments on the contour of the obstacle. The sensor data consist of directional distance values of rays emitted from the robot's current position and scattered in a radial pattern. The Lidar sensor is modeled with a raw distance ρ(x, θ), which is the distance to the nearest obstacle along the ray from x at an angle θ. If there is no obstacle, the raw distance function returns its maximum value; if the ray intersects an obstacle, it returns the distance to the point of intersection with the obstacle.

The Tangent Bug algorithm consists of two main distinct behaviors: Motion to Goal and Following Boundary.

Fig. 12. Tangent Bug algorithm block diagram
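Returning briefly to the trajectory-tracking controller above, the sketch below computes the tracking error (34) and the control law (38) for one control step. The gain values and function names are placeholders chosen only for illustration, not values reported in the paper.

```python
import math

K1, K2, K3 = 1.0, 5.0, 2.0   # illustrative gains k1, k2, k3 (not from the paper)

def tracking_control(pose, ref_pose, v_r, w_r):
    """One step of the tracking law (34)-(38).

    pose = (x, y, theta); ref_pose = (x_r, y_r, theta_r);
    v_r, w_r are the reference linear and angular velocities."""
    x, y, th = pose
    xr, yr, thr = ref_pose

    # eq. (34): tracking error expressed in the robot frame
    e1 =  math.cos(th) * (xr - x) + math.sin(th) * (yr - y)
    e2 = -math.sin(th) * (xr - x) + math.cos(th) * (yr - y)
    e3 = thr - th

    # sin(e3)/e3 with the limit value 1 at e3 = 0
    sinc_e3 = math.sin(e3) / e3 if abs(e3) > 1e-9 else 1.0

    # eq. (38): feedforward term plus the feedback terms of (36)-(37)
    v = v_r * math.cos(e3) + K1 * e1
    w = w_r + K2 * v_r * sinc_e3 * e2 + K3 * e3
    return v, w

# usage: robot slightly behind and to the right of the reference point
print(tracking_control((0.0, -0.1, 0.0), (0.2, 0.0, 0.1), v_r=0.3, w_r=0.0))
```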

Fig. 11. (a) Calculating the heuristic distance from the Lidar's data; (b) actual path of the robot when avoiding obstacles

Tangent Bug uses a heuristic function to determine the estimated distance from the current robot position to the target through the tangential discontinuity O_i of the obstacle:

d_heuristic = d(robot, O_i) + d(O_i, goal)    (39)

If there is no obstacle between the robot and the target, then d_heuristic = d(robot, goal).

The algorithm diagram is shown in Fig. 15. Here, a clear-area flag is created to toggle between the two behaviors. Initially, this flag is active and therefore Motion to Goal is applied. If the flag is disabled, the robot switches to Following Boundary behavior. Finally, the robot stops when it reaches the target.

a) Motion to Goal
First, the robot follows the direction of the target as long as no obstacle is detected in front of it. This movement continues as long as the Lidar sensor reading gives the maximum distance for the angle corresponding to the direction from the robot to the target. The movement ends when the robot senses an obstacle in the direction of the target.

After this point, the robot moves on to the second part of the Motion to Goal behavior. In this behavior, the robot detects discontinuities on the obstacle and uses the data from the Lidar sensor to calculate the distance d_heuristic. If this distance decreases compared to the previous calculation, the robot continues the above behavior. If this distance increases, the robot switches to Following Boundary behavior.

b) Following Boundary
Following Boundary behavior is invoked when the d_heuristic distance increases. The robot starts to move in the direction of the discontinuity last chosen from the direction towards the target. With the selected direction, the robot moves along the tangent line. This tangent line is perpendicular to the line connecting the robot and the point on the obstacle closest to the robot. This behavior continues to execute while d_reach ≥ d_followed, where d_followed is the shortest distance between the sensed boundary and the goal, and d_reach is the shortest distance between the blocking obstacle and the goal (or the distance to the goal if no blocking obstacle is visible). Whenever d_reach < d_followed, the robot switches back to Motion to Goal behavior.

Fig. 13. (a) Voice recognition model; (b) flowchart of the passenger answering program algorithm
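The behavior switching described above can be summarized in a few lines. The sketch below only illustrates the decision logic around the heuristic distance (39) and the d_followed / d_reach test; the helper functions and data structures are hypothetical stand-ins for the actual Lidar processing.

```python
import math

def d(p, q):
    """Euclidean distance between two points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def heuristic_distance(robot, o_i, goal):
    """Eq. (39): distance to the goal through the discontinuity point O_i."""
    return d(robot, o_i) + d(o_i, goal)

def select_discontinuity(robot, goal, discontinuities):
    """Pick the discontinuity that minimises the heuristic distance (39).

    `discontinuities` would come from the Lidar scan; here it is just a list
    of candidate (x, y) points, so this is only an illustration."""
    return min(discontinuities, key=lambda o: heuristic_distance(robot, o, goal))

def next_behavior(current, d_heuristic, d_heuristic_prev, d_followed, d_reach):
    """Switching logic between the two Tangent Bug behaviors."""
    if current == "motion_to_goal":
        # switch when the heuristic distance starts to increase
        return "follow_boundary" if d_heuristic > d_heuristic_prev else "motion_to_goal"
    # boundary following ends when the goal becomes reachable again
    return "motion_to_goal" if d_reach < d_followed else "follow_boundary"
```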
3.2. Screening

Voice interaction with the patient: The voice interaction function consists of 3 phases. In the first phase, we build a text classification model by machine learning using the fastText library [32], an open-source library that allows users to learn text representations and text classifiers for classifying text strings. Accordingly, we build a keyword dataset and label the keyword combinations as follows:

Tab. 1. Keyword and label datasheet

No. | Keyword                                              | Tag
1   | hello, hello robot, hi, etc.                         | __label__#greeting#
2   | Do you have a fever?                                 | __label__#fever#
3   | Have you lost your sense of taste?                   | __label__#taste#
4   | Put your finger on the SpO2 meter.                   | __label__#spo2m#
5   | Place your palm upside down in front of the nozzle.  | __label__#sterilization#
6   | bye bye, goodbye, thank you, etc.                    | __label__#goodbye#
…   | …                                                    | …

Initially, the robot greeting "Hello" guides the patient to put his hand in the measuring position to check the parameters, etc. According to Table 1, there are more than 100 labels, where each label represents a type of information provided to the user depending on the user's question.

In the second phase, the user's voice is recorded in real time from the microphone and then transmitted via API functions to Google's Speech-to-Text audio processing system. The text string is returned immediately afterwards and passed to the text classifier, as shown in Fig. 13(a).

In the final phase, the text data converted from the passenger's voice are classified by the model built in phase 1 to determine their label. From there, the computer program on the robot responds with the necessary information to answer the passenger, as shown in the flowchart of Fig. 13(b).

As shown in Fig. 13(b), in case the program cannot determine which label the text belongs to, the text is stored in a log file for us to check and add to the machine learning data.

Measuring temperature and blood oxygen concentration, and sterilization: After the patient puts his hand on the front of the robot, the SpO2 reading and the temperature are checked. In addition, the hand sanitizer spray system is activated after the measurement is completed.

First, blood oxygen saturation (SpO2) is calculated using a formula supplied in the Maxim Integrated™ sample code [33]. The AC and DC components of the pulsatile waveform are calculated for both the red and infrared channels and stored in integer variables (AC_Red, AC_IR, DC_Red, DC_IR) as the mean of 5 consecutive peaks/valleys. A ratio R of the AC and DC components is then calculated from the mean AC and DC values using (40), and the SpO2 value is calculated by (41):

R = (AC_Red / DC_Red) / (AC_IR / DC_IR)    (40)

SpO2 = −45.060·R² + 30.345·R + 94.845    (41)

Next, the AMG8833 sensor measures the temperature of an object based on the thermal radiation emitted by that object. The Stefan–Boltzmann law describes the power emitted by an object as in (42):

P = ε·A·σ·T⁴    (42)

where P is the power radiated from an object, T is the temperature of the object, A is its surface area, σ is the Stefan–Boltzmann constant, and ε is the emissivity of the object (it takes values from 0 to below 1 for objects which do not absorb all incident radiation – grey objects; for an absolute black body it is equal to 1).
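As a numerical illustration of the SpO2 computation in (40) and (41), the sketch below forms the ratio R from the averaged AC/DC components of the red and infrared channels and evaluates the quadratic formula; the sample input values are made up for the example and do not come from the paper.

```python
def spo2_from_ac_dc(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 from the averaged AC/DC components, eqs. (40)-(41)."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)          # eq. (40)
    return -45.060 * r * r + 30.345 * r + 94.845     # eq. (41)

# usage with made-up component values
print(round(spo2_from_ac_dc(ac_red=120.0, dc_red=98000.0,
                            ac_ir=210.0, dc_ir=105000.0), 1))
```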
Applying Moghaddam's formula [33], we calculate the temperature of a grey body as follows:

V_out = k·(T_o⁴ − T_s⁴)    (43)

where V_out is the output voltage of the sensor, T_o is the surface temperature of the object being measured, T_s is the temperature of the sensor's thermistor, and k is an empirical constant representing the parameters A, ε, σ as well as the electronic noise that may exist during the measurement (obtained by calibrating the sensors).

Based on (43), the surface temperature of the object is calculated by the formula (44):

T_o = (V_out/k + T_s⁴)^(1/4)    (44)

4. Experimental Results

4.1. The robot moves to the patient's location

In the case of no obstructions: The screening clinic environment is set up as shown in Fig. 14(a). Two rows of patient seats are arranged 2 m apart; the seats in each row are also 2 m apart, for a total of 8 seats.

Fig. 14. (a) Screening clinic model; (b) robot path trajectory

The robot moves along the rows of seats, visiting the seats in turn, then returns to the starting position and repeats the cycle. When moving past the seats, the robot uses the camera to check whether someone is sitting on the chair; if there is, the robot stops, and if not, it continues to the next seats. At the chair positions with patients, the robot turns to the patient, greets them, and then conducts screening, including temperature measurement, SpO2, and hand sanitizer spraying. If the measurement results are normal, the robot guides the patient into the clinic; if the results are abnormal, the robot guides the patient into the isolation room.

Actual movement results are shown in Fig. 15. As can be seen, the robot only stops at the 3 seats with people sitting and then conducts screening.

Fig. 15. Experimental robot moving to the patients

During the movement, the robot uses the Lidar sensor to localize itself. The data from the Lidar are matched with the clinic's map, which is already set up. In Fig. 16, the data from the Lidar are represented by the green dots, and the red lines are the lines matched to the walls of the map. From the matching results, the robot's state (position and direction) is updated through the EKF; the results show that the robot follows the outlined trajectory (Fig. 17).

Fig. 16. Data obtained from the Lidar and the line segments matched to the map

In the case of obstructions: When there is an obstacle in the path, the robot uses the Tangent Bug algorithm with data from the Lidar to avoid the obstacle and then returns to the original trajectory. Fig. 17 shows the results of obstacle avoidance when a person blocks the robot's path.

Fig. 17. (a) Passenger interacting with the robot; (b) the robot leads the passenger to the check-in counter

4.2. Screening for patients

To predict patients at risk of COVID infection based on symptoms and two indicators, body temperature and blood oxygen concentration (SpO2), the robot screens patients by collecting the above data. The steps are as follows:
• When the robot recognizes the patient in a face-to-face state, it asks questions about COVID-related symptoms for the patient to answer.
• Next, the robot asks the patient to put their finger on the SpO2 measuring sensor in front of the robot, as shown in Fig. 1(a), while the AMG8833 sensor placed on the screen reads the temperature data of that patient. Each patient has 3 data samples read after 5 seconds, and the data are sent to the central computer.
• After collecting the results, the robot analyzes the data and predicts the patient's status (measurement results are shown in Table 2). Patients have their hands disinfected by the robot's nozzle and are directed to go to the next clinic.

Tab. 2. Results of measuring body temperature and blood oxygen concentration

PID      Timespan   SpO2   Heart rate   Body temperature (°C)
m00001   20.21      93     82           36.5
m00001   22.45      93     85           36.2
m00001   24.13      97     81           36.8
m00002   72.01      92     80           37
m00002   74.53      92     80           36.6
m00002   76.24      96     79           36.5
m00003   115.21     96     81           36.6
m00003   117.18     97     82           36.7
m00003   119.89     97     81           36.3

5. Conclusion

CeeSRbt is a new design of mobile robot that has been researched and tested. This study provides an intelligent screening system and an indirect measurement system for a mobile robot, supporting the task of checking a patient's possibility of Covid infection before entering the clinic. The model uses
intelligent image recognition to detect patients, to conduct screening exams, and to remind them to wear masks. The control of the possibility of COVID-19 infection and the classification of patients into the isolation or normal clinic has proven the effectiveness of the proposed method.

The intelligent screening robot does not require human staff, which helps reduce the risk of infection for medical staff and doctors. The results can be developed into practical applications in hospitals and show great promise.

In future developments, we will implement more AI techniques in the system. The system can then operate more independently and adapt to changes in the operating environment. More facilities can be incorporated into the system to provide needed services for both patients and doctors in the hospital.

REFERENCES

[1] R. Bogue, "Robots in a contagious world," Industrial Robot: the international journal of robotics research and application, 2020.

[2] N. Zhu et al., "A novel coronavirus from patients with pneumonia in China, 2019," New England Journal of Medicine, vol. 382, no. 8, 2020, 727-733, 10.1056/NEJMoa2001017.

[3] L. Qun et al., "Early Transmission Dynamics in Wuhan, China, of Novel Coronavirus–Infected Pneumonia," New England Journal of Medicine, vol. 382, no. 13, 2020, 1199-1207, 10.1056/NEJMoa2001316.

[4] G. Lacey, B. L. Sithole, and A. Vorndran, "Technology Supporting Infection Prevention and Control Training in Africa," 2021 IEEE International Humanitarian Technology Conference (IHTC), 2021, 1–4, 10.1109/IHTC53077.2021.9698826.

[5] K. Srividya, S. Nagaraj, B. Puviyarasi, and S. Vani, "Infection Prevention and Control using UV-Disinfectant Bot for COVID," 2nd International Conference for Emerging Technology (INCET), 2021, 1–5, 10.1109/INCET51464.2021.9456444.

[6] S. Nagaraj, K. Srividya, S. Premalatha, Santhosh, and Sabareeswaran, "Infection Prevention and Control Using Self Sterilizing Door Handle for Covid," 4th International Conference on Computing and Communications Technologies (ICCCT), 2021, 515–519, 10.1109/ICCCT53315.2021.9711780.

[7] F. Astrid, Z. Beata, E. Julia, P. Elisabeth, D.-E. Magda, and others, "The use of a UV-C disinfection robot in the routine cleaning process: a field study in an Academic hospital," Antimicrobial Resistance & Infection Control, vol. 10, no. 1, 2021, 1–10.

[8] E. J. Hwang, B. Kyu Ahn, B. A. Macdonald, and H. Seok Ahn, "Demonstration of Hospital Receptionist Robot with Extended Hybrid Code Network to Select Responses and Gestures," 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020, 8013-8018, 10.1109/ICRA40945.2020.9197160.

[9] P. Manikandan, G. Ramesh, G. Likith, D. Sreekanth, and G. Durga Prasad, "Smart Nursing Robot for COVID-19 Patients," 2021 International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE), 2021, 839-842, 10.1109/ICACITE51222.2021.9404698.

[10] A. Ahamed, R. Ahmed, M. I. Hossain Patwary, S. Hossain, S. Ul Alam, and H. al Banna, "Design and Implementation of a Nursing Robot for Old or Paralyzed Person," 2020 IEEE Region 10 Symposium (TENSYMP), 2020, 594-597, 10.1109/TENSYMP50017.2020.9230956.

[11] H. Samani and R. Zhu, "Robotic Automated External Defibrillator Ambulance for Emergency Medical Service in Smart Cities," IEEE Access, vol. 4, 2016, 268-283, 10.1109/ACCESS.2016.2514263.

[12] I. Karabegović and V. Doleček, "The role of service robots and robotic systems in the treatment of patients in medical institutions," Advanced Technologies, Systems, and Applications, 2017, 9–25, 10.1007/978-3-319-47295-9_2.

[13] S. Kavirayani, D. S. Uddandapu, A. Papasani, and T. V. Krishna, "Robot for Delivery of Medicines to Patients Using Artificial Intelligence in Health Care," 2020 IEEE Bangalore Humanitarian Technology Conference (B-HTC), 2020, 1–4, 10.1109/B-HTC50970.2020.9297948.

[14] R. Maan, A. Madiwale, and M. Bishnoi, "Design and Analysis of 'Xenia: The Medi-Assist Robot' for Food Delivery and Sanitization in Hospitals," 2nd Global Conference for Advancement in Technology (GCAT), 2021, 1–7, 10.1109/GCAT52182.2021.9587776.

[15] N. Koceska, S. Koceski, P. Beomonte Zobel, V. Trajkovik, and N. Garcia, "A Telemedicine Robot System for Assisted and Independent Living," Sensors, vol. 19, no. 4, 2019, 10.3390/s19040834.

[16] Md. R. Sarder, F. Ahmed, and B. A. Shakhar, "Design and implementation of a lightweight telepresence robot for medical assistance," 2017 International Conference on Electrical, Computer and Communication Engineering (ECCE), 2017, 779–783, 10.1109/ECACE.2017.7913008.
[17] K. Srividya, S. Nagaraj, B. Puviyarasi, and S. Vani, "Infection Prevention and Control using UV-Disinfectant Bot for COVID," 2021 2nd International Conference for Emerging Technology (INCET), 2021, 1–5, 10.1109/INCET51464.2021.9456444.

[18] A. Madan, A. Ganesh, P. N, S. Bandekar, and A. N. Nagashree, "Mobile Covid Sanitization Robot," 2021 International Symposium of Asian Control Association on Intelligent Robotics and Industrial Automation (IRIA), 2021, 131–136, 10.1109/IRIA53009.2021.9588769.

[19] C.-W. Park et al., "Artificial intelligence in health care: Current applications and issues," Journal of Korean Medical Science, vol. 35, no. 42, 2020, 10.3346/jkms.2020.35.e379.

[20] F. Astrid, Z. Beata, E. Julia, P. Elisabeth, D.-E. Magda, and others, "The use of a UV-C disinfection robot in the routine cleaning process: a field study in an Academic hospital," Antimicrobial Resistance & Infection Control, vol. 10, no. 1, 2021, 1–10, 10.1186/s13756-021-00945-4.

[21] K. Nosirov, S. Begmatov, M. Arabboev, and K. Medetova, "Design of a Model For Disinfection Robot System," 2020 International Conference on Information Science and Communications Technologies (ICISCT), 2020, 1–4, 10.1109/ICISCT50599.2020.9351370.

[22] A. Vyshnavi, A. Manasa, C. Hamsika, S. Sai, and P. Shalini, "UV Disinfection Robot with Automatic Switching on Human Detection," EAI Endorsed Transactions on Internet of Things, vol. 6, no. 23, 2020, 10.4108/eai.25-9-2020.166364.

[23] S. Shaikh, N. Akhter, and R. Manza, "Current trends in the application of thermal imaging in medical condition analysis," Int. J. Innov. Technol. Explor. Eng., vol. 8, no. 8, 2019, 2708–2712.

[24] K. F. Uyanik, "A study on tangent bug algorithm," KOVAN Research Lab., Dept. of Computer Eng., Middle East Technical Univ., Ankara, Turkey.

[25] S. Yousuf and M. B. Kadri, "Implementation of Modified Tangent Bug Navigation Algorithm for Front Wheel Steered and Differential-Drive Robots," 2020 International Symposium on Recent Advances in Electrical Engineering Computer Sciences (RAEECS), vol. 5, 2020, 1–6, 10.1109/RAEECS50817.2020.9265853.

[26] J. Postel et al., "Transmission Control Protocol," RFC 793, Internet Engineering Task Force (IETF), 1981.

[27] J. Borenstein and L. Feng, "UMBmark: a method for measuring, comparing, and correcting dead-reckoning errors in mobile robots," 1994.

[28] S. Moghaddam, J. Lawler, J. Currano, and J. Kim, "Novel Method for Measurement of Total Hemispherical Emissivity," Journal of Thermophysics and Heat Transfer, vol. 21, no. 1, 2007, 128–133, 10.2514/1.26181.

[29] K. Goyal, K. Agarwal, and R. Kumar, "Face detection and tracking: Using OpenCV," 2017 International conference of Electronics, Communication and Aerospace Technology (ICECA), vol. 1, 2017, 474–478, 10.1109/ICECA.2017.8203730.

[30] T. T. Hoang, D. T. Hiep, P. M. Duong, N. T. T. Van, B. G. Duong, and T. Q. Vinh, "Proposal of algorithms for navigation and obstacles avoidance of autonomous mobile robot," IEEE 8th Conference on Industrial Electronics and Applications (ICIEA), 2013, 1308–1313, 10.1109/ICIEA.2013.6566569.

[31] R. Deriche, R. Vaillant, and O. D. Faugeras, "From noisy edge points to 3D reconstruction of a scene: A robust approach and its uncertainty analysis," Theory And Applications Of Image, 1992, 71–78, 10.1142/9789812797896_0008.

[32] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, and T. Mikolov, "Fasttext.zip: Compressing text classification models," arXiv preprint arXiv:1612.03651, 2016.

[33] "MAXREFDES117#: Heart-Rate and Pulse-Oximetry Monitor," 2020. [Online]. Available: https://www.maximintegrated.com/en/design/reference-design-center/system-board/6300.html/tb_tab.
