Article
The Design and Development of an Omni-Directional
Mobile Robot Oriented to an Intelligent
Manufacturing System
Jun Qian 1, Bin Zi 1,*, Daoming Wang 1, Yangang Ma 1 and Dan Zhang 2
1 School of Mechanical Engineering, Hefei University of Technology, 193 Tunxi Road, Hefei 230009, China;
qianjun@hfut.edu.cn (J.Q.); denniswang@hfut.edu.cn (D.W.); myg_ballack@mail.hfut.edu.cn (Y.M.)
2 Lassonde School of Engineering, York University, 4700 Keele Street, Toronto, ON M3J 1P3, Canada;
dzhang99@yorku.ca
* Correspondence: binzi@hfut.edu.cn or zibinhfut@163.com; Tel.: +86-551-6290-1606

Received: 16 July 2017; Accepted: 8 September 2017; Published: 10 September 2017

Abstract: In order to transport materials flexibly and smoothly in a tight plant environment,
an omni-directional mobile robot based on four Mecanum wheels was designed. The mechanical
system of the mobile robot is made up of three separable layers so as to simplify its combination and
reorganization. Each modularized wheel was installed on a vertical suspension mechanism, which
ensures moving stability and keeps the distances between the four wheels invariant. The control system
consists of two-level controllers that implement motion control and multi-sensor data processing,
respectively. In order to make the mobile robot navigate in an unknown semi-structured indoor
environment, the data from a Kinect visual sensor and four wheel encoders were fused to localize the
mobile robot using an extended Kalman filter with specific processing. Finally, the mobile robot was
integrated into an intelligent manufacturing system for material conveying. Experimental results show
that the omni-directional mobile robot can move stably and autonomously in an indoor environment
and in industrial fields.

Keywords: omni-directional mobile robot; localization; extended Kalman filter (EKF); intelligent
manufacturing system; design

1. Introduction
Intelligent manufacturing and logistics have motivated the development of mobile robot
technology in recent years [1]. Intelligent factories have been increasingly demanding autonomous
mobile robots to fulfill various movement tasks [2,3]. In consideration of a complicated working
environment, mobile robots need to be flexible, adaptive, and safe [4,5]. Consequently, traditional
mobile robots based on differential drive with large turning radii [6] are not satisfactory.
Many kinds of omni-directional mobile robots have been developed and
produced [7]. Two typical types of wheels can realize omni-directional movement, i.e., the steerable
wheel and the omni-directional wheel [8]. Both have been used in mature products, including the
Adept Seekur and KUKA omniRob robots. The steerable wheel has its own rotational mechanism,
which can change its steering angle actively [9], while the omni-directional wheel uses a special wheel
structure for flexible movement, examples of which include the Swedish wheel [10] and the Mecanum
wheel. Swedish wheels have been widely used on small moving platforms, especially robot soccer
platforms. Owing to their ideal pushing force, simple mechanism, and easy control, Mecanum wheels
are more appropriate for carrying heavy goods in industrial environments [11]. Unfortunately,
Mecanum wheels also have several disadvantages. The special wheel structure allows for the lateral
movement degree-of-freedom (DOF), but the rotation of the rollers around the Mecanum wheel
results in slippage of the mobile robot [12]. This slippage is difficult to model and calculate.
Meanwhile, the gap between two adjacent rollers causes a periodic vibration of the moving platform,
which noticeably affects movement accuracy and stability.
To deal with accumulated movement errors and estimate the positions of mobile robots, many
efforts have been made to integrate exteroceptive sensors and apply data fusion algorithms to acquire
accurate localization results. The natural environment is usually modified by mounting featured
structural objects for exact measurements, for example, passive reflectors [13], artificial landmarks [14],
and the Radio Frequency Identification (RFID) technique [15]. However, such infrastructure
modifications are not universally applicable because they reduce flexibility, especially in intelligent
factories. Therefore, it is more important to use on-board sensors to measure the natural environment.
Inertial navigation systems are appropriate for most mobile platforms, but high-performance units
remain costly [16].
With the development of the Microsoft Kinect sensor and its successful application in the field of
entertainment, the Kinect sensor can also be integrated with other on-board sensors on mobile
robots [17]. The methods and experiments of data fusion on omni-directional mobile robots
have been widely researched [18,19]. However, further comprehensive work is still needed, since
industrial fields are more complicated and diversified.
The objective of intelligent manufacturing is a cyber-physical system (CPS), which is made
up of physical entities equipped with perception, processing, and decision-making abilities [20].
There are two flows in the intelligent manufacturing system (IMS), i.e., material flow and information
flow. In order to convey materials flexibly and efficiently when facing multi-station and multi-task
applications, intelligent mobile robots need to be specially developed so as to satisfy the whole system
and the industrial environment. Meanwhile, the data exchange between physical entities relies on
unified interfaces and communication standards [21], which are difficult to establish since many
traditional physical entities need upgrades. From this point of view, intelligent mobile robots oriented to IMS are
very important components of smart factories and smart logistics.
In this paper, we focus on designing and developing an omni-directional mobile robot with
four Mecanum wheels from the systematic point of view. The whole process resembles the design
of other intelligent systems or microsystems [22]. Firstly, modularized wheels with a vertical
suspension mechanism are designed to realize the flexible and steady movement of the mobile
robot. The multi-layer mechanical structure of the body is adopted for convenient upgrades
and different applications. Secondly, the control system needs to give consideration to the reliability,
safety, and intelligence of the mobile robot. Therefore, a lower-level controller uses an industrial
grade controller for motion control, and a higher-level controller uses an embedded industrial PC for
multi-sensor data processing. In order to navigate autonomously in an unknown semi-structured indoor
environment, data from wheel encoders and a Kinect visual sensor are fused to localize the mobile
robot. Finally, the mobile robot is applied in an IMS for conveying materials and verified by
experimental tests. It is expected that the mobile robot provides a solution for intelligent mobility in
smart factories at low cost and with relatively high accuracy.
The rest of this paper is organized as follows: Sections 2 and 3 describe the mechanical and control
systems of the mobile robot in detail, respectively. Section 4 presents the positioning system of the
mobile robot by using a simplified extended Kalman filter. Section 5 shows the experimental results of
localization and mapping. Finally, conclusions are presented in Section 6.

2. Mechanical System of the Mobile Robot


Since the environments of intelligent manufacturing systems are commonly favorable indoor sites
crowded with equipment and materials, an omni-directional mobile robot is required to move flexibly
in narrow spaces. In this paper, a four-wheel independently driven structure exploiting the advantages of
Mecanum wheels is designed [23]. The mobile robot is named RedwallBot-1 and shown in Figure 1.
In order to ensure expansibility and versatility, multi-layer mechanical modules and a modular wheel
structure are used for its body.

Figure 1. Omni-directional mobile robot RedwallBot-1.

2.1. Modular Wheel Structure

The small solid rollers on the circumference of each Mecanum wheel are arranged with gaps,
which may bring periodical vibrations in the mobile platform. One solution is to use elastic wheel
hubs to absorb the vibration [24]. Another solution is to add suspension mechanisms on the whole
Mecanum wheel [25]. Different suspension mechanisms have been designed to keep the four
Mecanum wheels in contact with the ground. Traditional suspension mechanisms use a link with
dampers for shock absorption, which is similar to the damping of automobiles. However, a mobile
robot using this suspension has variable distances between the longitudinal or lateral wheels.
A modular wheel structure with vertical suspension mechanisms is designed as shown in
Figure 2. The Mecanum wheel on the bottom board is connected with the mobile chassis compliantly.
The connectors include two cylinder slides and linear bearings, while the shock absorbers include
two springs and buffers, respectively. Therefore, the Mecanum wheel can regulate its position
vertically and passively with the springs according to the loads and the ground topography. In the
design, two hydraulic buffers are used to absorb the damping energy quickly. Both the springs and
the buffers can be replaced easily according to different loads on the mobile platforms.

Figure 2. Modular wheel structure with suspension. (a) The components of the modular wheel;
(b) 3D model of the modular wheel.

2.2. Multi-Layer Mechanical Modules

The body of RedwallBot-1 possesses a three-layered mechanical structure from bottom to top,
i.e., the chassis layer, the control layer, and the application layer. Two adjacent layers are connected
with screw bolts, which can be easily assembled and disassembled. Figure 3 shows the mechanical
components of the robot body. The chassis layer and the control layer together constitute the main
body of the mobile platform. General specifications of RedwallBot-1 are shown in Table 1.

Figure 3. The mechanical components of the robot body. (a) 3D model of the mobile body; (b) three
independent layers.

Table 1. General specifications of RedwallBot-1.

Description                               Quantity
Mobile Body Length                        760 mm
Mobile Body Width                         500 mm
Mobile Body Height                        600 mm
Wheel Diameter                            200 mm
Max. Velocity of the Body                 1.4 m/s
Max. Rotational Velocity of the Body      3.0 rad/s
Mass of the Mobile Body                   80 kg

(1). The Chassis Layer


This layer contains the movement mechanism and the power source. Each modular wheel is
connected with a DC servomotor and a harmonic reducer along the rotational axis of the Mecanum
wheel. Each DC servomotor is a Maxon RE40 with an output power of 150 W. The ratio of the
harmonic reducer is 50:1. Four modular wheels are arranged symmetrically along the longitudinal
center axis of the chassis. Meanwhile, a 48 V 40 Ah lithium battery is used to supply power for the
mobile platform.
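
As a quick drivetrain sanity check (our own arithmetic from Table 1 and the figures above, not a calculation given in the paper): at the maximum body velocity of 1.4 m/s, the 200 mm wheels turn at

$$
\omega_{\mathrm{wheel}} = \frac{v_{\max}}{r} = \frac{1.4\ \mathrm{m/s}}{0.1\ \mathrm{m}} = 14\ \mathrm{rad/s} \approx 134\ \mathrm{rpm},
$$

so behind the 50:1 harmonic reducer the motor shaft runs at roughly 50 × 134 ≈ 6700 rpm, a plausible operating point for a 150 W Maxon RE40.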
(2). The Control Layer

This layer consists of the control system, the on-board sensory suite, and the power converters.
The control system includes the controller, the motor drives, an internet router, and a touch screen.
An Elmo CEL-10/100 DC drive is used for each Maxon servomotor, which has a built-in PID control
function [26–28]. According to the expected input speed of the drive and the feedback from the
encoder at the end of the servomotor, the PID parameters can be regulated manually to obtain ideal
dynamic responses. A touch screen is used to show the parameters and status graphically. Moreover,
a handheld control box is made to manually operate the mobile platform by using a 3-axis joystick
based on three potentiometers.
(3). The Application Layer

This layer has many selections to meet different needs, e.g., a light-weight robot manipulator
and a fork arm. RedwallBot-1 is oriented to convey materials for an intelligent manufacturing system,
so a linear transfer line is designed to carry a rectangular container. Several transfer rollers are placed
horizontally on the top of the mobile chassis. Among those rollers, one electric transfer roller rotates
together with the other passive rollers using a chain transmission. In order to adapt to different
heights at several work stations, two electric cylinders are placed under the transfer line to offer
vertical propulsion.

3. The Control System of the Mobile Robot

3.1. Control Hardware

In order to process considerable sensory data and manipulate the mobile robot autonomously,
the control system of RedwallBot-1 is made up of two layers. The lower-level and higher-level
controllers use a programmable logic controller (PLC) and an embedded industrial PC (E-IPC),
respectively. The two-level controllers communicate with each other via Ethernet. Figure 4 shows
the whole control system of the mobile robot. The peripheral devices are classified into four modules,
i.e., mobile chassis, special application, perception, and human machine interface (HMI).
The lower-level control system is used to provide the function of mobility. In order to control
many devices reliably and flexibly, a PLC with a Siemens S7-1217C CPU was selected as the
lower-level controller. Several extended modules for the PLC were also used, which include an I/O
module, an A/D module, and a D/A module.
The Mecanum wheel-based omni-directional mobile robot uses the motion model of differential
drive. Therefore, each driving wheel is rotated under velocity control. A voltage signal is sent to each
servomotor drive to control the wheel. By changing the ratios of the 4-channel voltage signals,
the mobile robot can move in different motion modes including longitudinal, lateral, diagonal, and
rotational movements. The signal of each encoder installed at the rear of the servomotor is sent to
both the related drive and the PLC simultaneously. Thus, the PLC can calculate the movement of the
mobile chassis based on odometry. Meanwhile, two electric cylinders are controlled synchronously
to lift the transfer rollers up and down.
The higher-level controller is an embedded industrial PC of the type Advantech ARK-2150F.
It acquires multi-sensor data and processes the data in real time. Therefore, localization, map building,
and path planning of the mobile robot can be implemented.

Figure 4. Block diagram of RedwallBot-1’s control system.

3.2. Perception

The on-board sensory suite mainly includes a vision sensor, a laser scanner, ultrasonic sensors,
and inertial measurement units (IMUs). In front of the mobile robot, a CCD camera is installed
obliquely so as to follow the lane line on the ground. Meanwhile, a Kinect visual sensor is utilized
to perceive the nearby environment by integrating the color image and the depth image. A laser
scanner (SICK LMS111 laser range finder) is used to measure the obstacles of the 2D environment
accurately.
Eight Maxbotix ultrasonic sensors are installed evenly on the four faces of the robot. Several
low-cost sensors are used to provide basic perceptions, such as bumpers and photoelectric sensors.
The laser scanner, the ultrasonic sensors, and the bumpers together make up the safety system for
obstacle avoidance. Figure 5 shows the on-board sensors and their assignment on the mobile robot.

Figure 5. Assignment of the on-board sensors.
3.3. Software Architecture

The two-level control software is designed according to the hardware structure. Figure 6 shows
the whole control diagram and the data exchange between the two levels.
the whole control diagram and their data exchange.
Figure 6. Schematic diagram of the control software.

3.3.1. Lower-Level Software

The lower-level software focuses on motion control and the movement safety of the mobile
robot. Figure 7 shows the distribution of the four Mecanum wheels on the mobile chassis from the
vertical view. The forward and backward kinematics calculations are given in Equations (1) and (2),
respectively.

Figure 7. The distribution of the four wheels and their movements.

$$
\begin{bmatrix} v_x \\ v_y \\ w \end{bmatrix}
= \frac{\pi r}{720}
\begin{bmatrix}
1 & 1 & 1 & 1 \\
1 & -1 & -1 & 1 \\
-1/d & 1/d & -1/d & 1/d
\end{bmatrix}
\begin{bmatrix} \dot{\theta}_1 \\ \dot{\theta}_2 \\ \dot{\theta}_3 \\ \dot{\theta}_4 \end{bmatrix}
\qquad (1)
$$

$$
\begin{bmatrix} \dot{\theta}_1 \\ \dot{\theta}_2 \\ \dot{\theta}_3 \\ \dot{\theta}_4 \end{bmatrix}
= \frac{180}{\pi r}
\begin{bmatrix}
1 & 1 & -d \\
1 & -1 & d \\
1 & -1 & -d \\
1 & 1 & d
\end{bmatrix}
\begin{bmatrix} v_x \\ v_y \\ w \end{bmatrix}
\qquad (2)
$$

where vx and vy are the longitudinal and the lateral velocities of the mobile robot, and w is the angular
velocity. They constitute the velocity vector V = (vx, vy, w)^T of the mobile robot. The dot above θi
indicates the rotating speed of the i-th wheel (i = 1, 2, 3, 4) in deg/s. In addition, r is the radius of the
Mecanum wheel. In Figure 7, l and b are the distances between the longitudinal and the lateral wheels,
respectively. d is one 720th of the sum of l and b, so d = (l + b)/720.
In order to control the movement of the mobile chassis, the velocity or the pose deviation is
firstly converted into the rotating speed or angle of each wheel based on the backward kinematics.
Meanwhile, according to data from the four wheel encoders, the robot’s velocity is estimated based on
the forward kinematics.
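
To make the two mappings concrete, here is a minimal numerical sketch of Equations (1) and (2) in Python. The wheel radius comes from Table 1; the wheel spacings l and b are illustrative assumptions of our own, since their values are not listed in the paper.

```python
import numpy as np

r = 0.1              # wheel radius [m] (200 mm diameter, Table 1)
l, b = 0.55, 0.40    # longitudinal/lateral wheel distances [m] (assumed values)
d = (l + b) / 720.0  # d = (l + b)/720, as defined in the text

# Forward kinematics, Equation (1): wheel speeds [deg/s] -> (vx, vy, w).
J = (np.pi * r / 720.0) * np.array([
    [1.0,   1.0,   1.0,  1.0],
    [1.0,  -1.0,  -1.0,  1.0],
    [-1/d,  1/d,  -1/d,  1/d],
])

# Backward kinematics, Equation (2): (vx, vy, w) -> wheel speeds [deg/s].
B = (180.0 / (np.pi * r)) * np.array([
    [1.0,  1.0, -d],
    [1.0, -1.0,  d],
    [1.0, -1.0, -d],
    [1.0,  1.0,  d],
])

def wheel_speeds(v):
    """Body velocity (vx [m/s], vy [m/s], w [rad/s]) -> wheel speeds [deg/s]."""
    return B @ np.asarray(v)

def body_velocity(theta_dot):
    """Wheel speeds [deg/s] -> body velocity (vx, vy, w)."""
    return J @ np.asarray(theta_dot)

# A pure lateral command drives the wheels in a (+, -, -, +) pattern, and
# feeding the result back through Equation (1) recovers the command.
cmd = wheel_speeds([0.0, 0.2, 0.0])
print(cmd)                 # ~[ 114.6, -114.6, -114.6, 114.6 ] deg/s
print(body_velocity(cmd))  # ~[ 0.0, 0.2, 0.0 ]
```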

3.3.2. Higher-Level Software


The higher-level software processes the sensory data and sends the control orders to the mobile
robot for navigation. Two exteroceptive sensors are mainly used, i.e., a CCD camera and a Kinect
sensor. By using OpenCV (Open Source Computer Vision Library), the color image from each visual
sensor can be conveniently processed. When tracking a fixed path on the ground, the lane line is
extracted first and fitted with a straight line or a circular curve. Then, the deviations of the position
and the heading angle are calculated and the results are sent to the lower-level software to control the
mobile robot by using a PID control algorithm. By using a preview control, the mobile robot finds one
destination position on the lane line. According to the movement characteristic of the mobile robot
shown in Equation (1), an arc trajectory is dynamically planned and traced. Therefore, the geometric
center of the mobile robot can keep moving along the lane line. If the mobile robot navigates based on
a free path planned by the higher-level controller, a multi-sensor data fusion algorithm will be used for
localization. Since the mobile robot has omni-directional mobility, obstacle avoidance is flexible
and diverse. In order to shorten the development cycle, the higher-level control system uses Linux and
open-source Robot Operating System (ROS) [29,30]. OPC Unified Architecture (UA) specification [31]
is a platform-independent communication standard for industrial applications. Therefore, ROS-based
higher-level control software could access data with Siemens PLC and other field devices using OPC
UA [32]. The lower-level controller, as an OPC UA server, communicates with E-IPC and other clients
through wired or wireless Ethernet. Therefore, the mobile robot could be competent for complicated
production systems.
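
As an illustration of this data path, the following sketch shows how a ROS-side Python node might poll one value from the PLC acting as an OPC UA server, using the python-opcua package. The endpoint address and node identifier are placeholders of our own; the actual address space of the PLC is not given in the paper.

```python
from opcua import Client  # python-opcua package

# Placeholder endpoint and node id (hypothetical, for illustration only).
PLC_ENDPOINT = "opc.tcp://192.168.0.10:4840"

client = Client(PLC_ENDPOINT)
try:
    client.connect()
    # Read an odometry value exposed by the lower-level controller.
    vx_node = client.get_node("ns=3;s=Chassis.Velocity.Vx")  # hypothetical node
    print("chassis vx [m/s]:", vx_node.get_value())
finally:
    client.disconnect()
```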

3.4. Integration with an Intelligent Manufacturing System


The mobile robot RedwallBot-1 is applied to an intelligent manufacturing system for
demonstration, which has several automatic machines including a stereo warehouse, a gantry
manipulator, two CNC (Computer Numeric Control) lathes, an industrial robot, a CNC milling
machine, etc. Figure 8 shows the main equipment and Figure 9 plots the sequential movement process
of the workpieces in the system. Rough parts are stored in the stereo warehouse initially, and they
are carried by the mobile robot to the lathe machining area. A gantry manipulator moves them from
the machining places to the worktable in order. After that, a FANUC industrial robot transports them to
the CNC milling machine. When the machining operation is terminated, the mobile robot carries the
finished products back to the stereo warehouse.

Figure 8. The main equipment of the intelligent manufacturing system. (a) Stereo warehouse; (b) CNC
lathes and gantry manipulator; (c) industrial robot and milling machine.

Figure 9. The sequential movement process of the workpieces in the system.

system.

It is assumed that the working area of the mobile robot is an ideal horizontal two-dimensional
environment. The robot pose at the k-th sampling instance is denoted as the status vector
x(k) = (xk, yk, ψk)^T. Here, the first two variables determine the central position and ψk is the heading
angle of the robot in the global coordinate frame xOy. Therefore, x(k) determines the pose
transformation of the robot coordinate frame xR OR yR (see Figure 7) relative to xOy.
To estimate the robot pose incrementally, an extended Kalman filter (EKF) algorithm [33,34] is
used recursively by integrating the data from the wheel encoders and the Kinect sensor. In each
recursion, both the status prediction and the following measurement update form the frame of
estimation. The frequency of the pose estimation is 30 Hz, which is decided by the Kinect sensor.

4.1. Status Prediction

Once the mobile robot's motion function f(·) is given accurately, the new pose at the time
instance k + 1 can be represented as

x(k + 1) = f(x(k), u(k + 1), w(k + 1))    (3)

where u(k + 1) and w(k + 1) are the control input and the process noise, respectively. u(k + 1) is
determined by the rotating angles of the wheels. Thus, u(k) = (dθ1, dθ2, dθ3, dθ4)^T. Here, dθi means
the angular increment of the i-th wheel in the sampling interval dt, and it is measured by the i-th
wheel encoder.
The predicted value of x(k + 1) is
$$
\hat{\mathbf{x}}^-(k+1) = \hat{\mathbf{x}}(k) +
\begin{bmatrix}
\cos\hat{\psi}_k & -\sin\hat{\psi}_k & 0 \\
\sin\hat{\psi}_k & \cos\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} v_x \\ v_y \\ w \end{bmatrix} dt, \qquad (4)
$$

where the hat ˆ above the status vector denotes its estimated value and the superscript − denotes
the predicted value.
According to Equation (1), if the coefficient matrix on the right side of the forward kinematics
formula is replaced by a coefficient matrix J, then

$$
\begin{bmatrix} v_x \\ v_y \\ w \end{bmatrix} = \mathbf{J}\,
\begin{bmatrix} \dot{\theta}_1 \\ \dot{\theta}_2 \\ \dot{\theta}_3 \\ \dot{\theta}_4 \end{bmatrix},
\qquad
\mathbf{J} = \frac{\pi r}{720}
\begin{bmatrix}
1 & 1 & 1 & 1 \\
1 & -1 & -1 & 1 \\
-1/d & 1/d & -1/d & 1/d
\end{bmatrix}.
$$

Equation (4) can be represented as

$$
\hat{\mathbf{x}}^-(k+1) = \hat{\mathbf{x}}(k) +
\begin{bmatrix}
\cos\hat{\psi}_k & -\sin\hat{\psi}_k & 0 \\
\sin\hat{\psi}_k & \cos\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}
\mathbf{J}\,\mathbf{u}(k). \qquad (5)
$$

The prior covariance matrix P− (k + 1) of the predicted value of x(k + 1) is

$$
\mathbf{P}^-(k+1) = \nabla f_x\,\mathbf{P}(k)\,\nabla f_x^T + \nabla f_w\,\mathbf{Q}(k+1)\,\nabla f_w^T, \qquad (6)
$$

where P(k) and Q(k + 1) are the covariance matrices of the pose x(k) and the process noise w(k + 1),
respectively. The initial value of P(0) and the values of Q(k) are commonly designated as diagonal
matrices. The superscript T denotes the matrix transposition. The Jacobian matrices ∇fx and ∇fw are

$$
\nabla f_x =
\begin{bmatrix}
1 & 0 & M_1 \\
0 & 1 & M_2 \\
0 & 0 & 1 + M_3
\end{bmatrix}, \qquad
\nabla f_w =
\begin{bmatrix}
\cos\hat{\psi}_k & -\sin\hat{\psi}_k & 0 \\
\sin\hat{\psi}_k & \cos\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}
\mathbf{J}\,\mathbf{I}_4,
$$

where I4 is a four-order identity matrix. The intermediate elements Mi (i = 1, 2, 3) in the matrix ∇fx
are calculated as

$$
\begin{bmatrix} M_1 \\ M_2 \\ M_3 \end{bmatrix} =
\begin{bmatrix}
-\sin\hat{\psi}_k & -\cos\hat{\psi}_k & 0 \\
\cos\hat{\psi}_k & -\sin\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}
\mathbf{J}\,\mathbf{u}(k).
$$
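
Gathering Equations (5) and (6), one prediction step can be written compactly. The following NumPy sketch is our reading of the equations above (with J from Equation (1)), not code released by the authors.

```python
import numpy as np

def ekf_predict(x_hat, P, u, Q, J):
    """One EKF prediction step, Equations (5) and (6).

    x_hat : posterior pose estimate (x, y, psi) at step k
    P     : 3x3 pose covariance at step k
    u     : wheel angle increments (d_theta_1..4) over one sampling interval [deg]
    Q     : 4x4 process-noise covariance (diagonal, as in the text)
    J     : 3x4 forward-kinematics coefficient matrix from Equation (1)
    """
    c, s = np.cos(x_hat[2]), np.sin(x_hat[2])
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])

    x_pred = x_hat + R @ J @ u                  # Equation (5)

    dR = np.array([[-s, -c, 0.0],
                   [ c, -s, 0.0],
                   [0.0, 0.0, 1.0]])
    M = dR @ J @ u                              # intermediate terms M1..M3
    F_x = np.array([[1.0, 0.0, M[0]],
                    [0.0, 1.0, M[1]],
                    [0.0, 0.0, 1.0 + M[2]]])
    F_w = R @ J                                 # Jacobian w.r.t. the process noise
    P_pred = F_x @ P @ F_x.T + F_w @ Q @ F_w.T  # Equation (6)
    return x_pred, P_pred
```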

4.2. Measurement Update


In this stage, the on-board exteroceptive sensors are commonly used to perceive the environment
and then regulate the robot pose according to the residual of measurement. The relationship between
the new measurement vector z(k + 1) and the measurement function h(·) is represented as

z(k + 1) = h(x(k + 1), v(k + 1)) (7)

where v(k + 1) is the measurement noise. Equation (7) establishes the connections between x(k + 1) and
z(k + 1).
A Kinect sensor is used to provide a color image and a depth image. The two images at the
same time instance constitute an RGB-D image. By using the API functions supplied by OpenCV,
visual feature extraction and matching are easy to carry out. First, each RGB-D image is processed
to acquire feature points according to the SURF (Speeded Up Robust Features) algorithm [35]. Then,

the visual odometry method is used to realize relative pose estimation between adjacent image frames.
This method includes three steps, i.e., feature matching by using FLANN (Fast Library for Approximate
Nearest Neighbors) [36], relative pose determination by combining PnP (Perspective-n-Point) [37]
with the ICP (Iterative Closest Point) algorithm [38], and pose optimization by using g2o (General Graph
Optimization) [39]. Finally, the three-dimensional position increment vector and the three-dimensional
posture increment vector could be calculated. Since the work site is horizontal, only the increments
along the x, y directions and the rotating angle around the z-axis are needed. Owing to the difference
between the installing position of the Kinect sensor and the geometric center of the mobile robot,
the increments should be changed into the values corresponding to the robot coordinates. Now,
the measurement vector can be described as z(k + 1) = (dxR , dyR , dψR )T , which is made up of the
changes of the position and the heading angle of the mobile robot measured by the Kinect sensor.
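
For readers who want to reproduce the front end of this pipeline, a condensed OpenCV sketch of the SURF + FLANN + PnP(RANSAC) stages is given below. The camera matrix is a placeholder, SURF requires the opencv-contrib build, and the ICP refinement and g2o graph optimization stages described above are omitted for brevity.

```python
import cv2
import numpy as np

# Placeholder intrinsics; a real system would use the calibrated
# Kinect color-camera matrix.
K = np.array([[525.0,   0.0, 319.5],
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # opencv-contrib
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))

def relative_pose(gray1, gray2, point3d_of):
    """Relative pose between two RGB-D frames.

    point3d_of maps a pixel in frame 1 to its 3-D point back-projected
    from the depth image, or returns None where depth is missing.
    """
    kp1, des1 = surf.detectAndCompute(gray1, None)
    kp2, des2 = surf.detectAndCompute(gray2, None)

    obj, img = [], []
    for pair in flann.knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
            p3 = point3d_of(kp1[pair[0].queryIdx].pt)  # passed Lowe ratio test
            if p3 is not None:
                obj.append(p3)
                img.append(kp2[pair[0].trainIdx].pt)

    # PnP with RANSAC: rotation/translation of frame 2 w.r.t. frame 1.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.float32(obj), np.float32(img), K, None)
    return ok, rvec, tvec
```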
According to the EKF algorithm, the predicted measurement vector is

$$
\hat{\mathbf{z}}(k+1) =
\begin{bmatrix}
\cos\hat{\psi}_k & \sin\hat{\psi}_k & 0 \\
-\sin\hat{\psi}_k & \cos\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}
[\hat{\mathbf{x}}^-(k+1) - \hat{\mathbf{x}}(k)]. \qquad (8)
$$

Referring to Equation (4), the result is

$$
\hat{\mathbf{z}}(k+1) = \begin{bmatrix} v_x \\ v_y \\ w \end{bmatrix} dt.
$$

The residual of measurement, i.e., ν(k + 1), is the difference between the measurement vector
and the predicted measurement value. It should be noted that the right side of Equation (8) involves
the posterior estimation of x(k). In fact, it is not an exact value and has a probabilistic distribution.
An approximation is used to simplify the calculation. The residual of measurement ν(k + 1) and
its approximate covariance matrix are expressed as follows:

ν(k + 1) = z(k + 1) − ẑ(k + 1). (9)

$$
\mathbf{S}(k+1) \approx \nabla h_x\,\mathbf{P}^-(k+1)\,\nabla h_x^T + \mathbf{R}(k+1). \qquad (10)
$$

Here, the Jacobian matrix ∇hx is

$$
\nabla h_x =
\begin{bmatrix}
\cos\hat{\psi}_k & \sin\hat{\psi}_k & 0 \\
-\sin\hat{\psi}_k & \cos\hat{\psi}_k & 0 \\
0 & 0 & 1
\end{bmatrix}.
$$

The covariance matrix R(k + 1) corresponds to the measurement noise v(k + 1), which relates to
the error of the ICP algorithm and the performance of the Kinect sensor. R(k + 1) is also designated as
a diagonal matrix.
The Kalman gain matrix is calculated as

$$
\mathbf{K}(k+1) = \mathbf{P}^-(k+1)\,\nabla h_x^T\,\mathbf{S}(k+1)^{-1}, \qquad (11)
$$

where the superscript −1 represents the matrix inverse operation.


Then, the posterior estimation of the robot pose after correction and its covariance matrix are

x̂(k + 1) = x̂− (k + 1) + K (k + 1)ν(k + 1), (12)

P(k + 1) = (I3 − K(k + 1)∇hx )P− (k + 1) (13)

where I3 is a three-order identity matrix.
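
The complete measurement update of Equations (8)–(13) then takes the following compact form; again, this is a NumPy sketch of our own that mirrors the notation above.

```python
import numpy as np

def ekf_update(x_prev, x_pred, P_pred, z, R_meas):
    """One EKF measurement update, Equations (8)-(13).

    x_prev : posterior pose estimate at step k
    x_pred : predicted pose at step k+1 (from the prediction step)
    z      : measured increments (dxR, dyR, dpsiR) from Kinect visual odometry
    R_meas : 3x3 measurement-noise covariance (diagonal, as in the text)
    """
    c, s = np.cos(x_prev[2]), np.sin(x_prev[2])  # linearized at psi_k
    H = np.array([[  c,   s, 0.0],
                  [ -s,   c, 0.0],
                  [0.0, 0.0, 1.0]])              # Jacobian of h(.)

    z_pred = H @ (x_pred - x_prev)               # Equation (8)
    nu = z - z_pred                              # residual, Equation (9)
    S = H @ P_pred @ H.T + R_meas                # Equation (10)
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain, Equation (11)

    x_post = x_pred + K @ nu                     # Equation (12)
    P_post = (np.eye(3) - K @ H) @ P_pred        # Equation (13)
    return x_post, P_post
```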



According to the EKF algorithm described above and the package of GMapping supplied by
ROS, the incremental map of the environment that the mobile robot has explored could be built with
the representation of a grid map. Figure 10 shows the flow chart of the mobile robot localization
and mapping.
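
As one possible glue layer between the EKF and GMapping, the sketch below publishes the fused pose as the odom -> base_link transform and an Odometry message, which is the input slam_gmapping conventionally expects. The frame and topic names are ROS defaults assumed by us; they are not specified in the paper.

```python
import rospy
import tf
from nav_msgs.msg import Odometry

rospy.init_node("ekf_odom_bridge")
broadcaster = tf.TransformBroadcaster()
odom_pub = rospy.Publisher("odom", Odometry, queue_size=10)

def publish_pose(x, y, psi):
    """Broadcast the EKF pose estimate for slam_gmapping to consume."""
    now = rospy.Time.now()
    q = tf.transformations.quaternion_from_euler(0.0, 0.0, psi)
    broadcaster.sendTransform((x, y, 0.0), q, now, "base_link", "odom")

    odom = Odometry()
    odom.header.stamp = now
    odom.header.frame_id = "odom"
    odom.child_frame_id = "base_link"
    odom.pose.pose.position.x = x
    odom.pose.pose.position.y = y
    (odom.pose.pose.orientation.x, odom.pose.pose.orientation.y,
     odom.pose.pose.orientation.z, odom.pose.pose.orientation.w) = q
    odom_pub.publish(odom)
```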
Sensors 2017, 17, 2073 11 of 15

EKF for Localization ROS based Mapping


Sensors 2017, 17, 2073Wheel Control input Status 11 of 15
encoders u(k+1) prediction

Estimation of
Kinect EKF for Localization
Estimation of ROS
x(k+1) based Mapping
sensor x(k) Gmapping Map
Wheel Control input Status package building
encoders u(k+1) prediction Kinect
Visual Measurement Measurement depth image
odometry vector z(k+1) update Estimation of
Kinect Estimation of x(k+1)
sensor x(k) Gmapping Map
package building

FigureFigure 10. Flow


10. Flow chart
chart ofof themobile
the Kinect
mobile robot
robot localization andand
localization mapping.
mapping.
Visual Measurement Measurement depth image
odometry vector z(k+1) update
5. Experiments and Discussion
5. Experiments and Discussion
FigureTests
5.1. Mobility and Stability 10. Flow chart of the mobile robot localization and mapping.
5.1. Mobility and Stability Tests
The mobileand
5. Experiments robot RedwallBot-1 was tested on a smooth marble floor. The linear movements
Discussion
Thewere
mobiletested robot RedwallBot-1
first. Once the robot moves was tested on a smooth
an additional marble
1 m, the error floor.
of the linearThe linear
moving movements
distance
was
were tested manually
first. Once
5.1. Mobility measured.
the robot
and Stability Several test results along different directions
Tests moves an additional 1 m, the error of the linear moving show that the means anddistance
standard
was manually deviations of the errors are between 1 and 4 mm, respectively. Therefore, the robot can run
Themeasured.
in a straightmobile
line. robot
Several
The rotational
test results
RedwallBot-1
movements was tested alonga different
were on smooth
also tested
directions
to marble floor.
detect the Theshow
slippage.
that
linear the means
movements
The robot rotated
and
standardwere
deviations of the
tested first. Once errors are between 1 and 4 mm, respectively. Therefore, the robot can run in
at a longitudinal linearthe robot and
velocity moves an additional
an angular velocity1 m,of the
0.25error
m/s of
andthe linear
0.25 rad/smoving distance
simultaneously.
a straight
wasline.
After theThe
robotrotational
manually rotated oncemovements
measured. Several test
and returned to were
results also tested
along
the starting different to detect
point, the directions the slippage.
show
robot slipped that the The robot
means
70~80 mm towards the rotated
and
standard
outside ofdeviations
at a longitudinal linear
the circle. ofThis
the errors
velocity andare
occurred between
anbecause
angular 1 and
the 4 mm,
velocity
wheels on respectively.
of 0.25
the m/s Therefore,
outside and
of the0.25 the
mobile robot
rad/s
robot can run
simultaneously.
rotated
in a straight line. The rotational movements were alsothe
tested to detect
thethe slippage.
floorThe robot rotated
After thefaster
robot than the
rotated wheels
onceonand the returned
other side and because
to the starting surface
point,ofthe marble
robot slipped was70~80
too slick.
mm towards
at a longitudinal
The serial linear
movements velocity
with and
linear an angular
motion and velocity
rotation of on0.25
them/s
spotand
were0.25 rad/s simultaneously.
implemented to trace a
the outside
After ofthethe circle.
robot rotatedThis
onceoccurred because the wheels on
thethe outside of themm mobile robot
the rotated
square with a side length ofand
4 m.returned
The linearto the starting
velocity andpoint,
angular robot slipped
velocity were70~80
0.3 m/s and towards
0.6 rad/s,
faster than the wheels
outside on the
of theResults
circle. This other sidebecause
occurred and because the onsurface of theofmarble floor was too slick.
respectively. are shown in Figure 11.the wheels
When the robotthe moved
outside from thePoint
mobileS torobot
Pointrotated
E, the
Thepositional
serial movements
faster than errors were with
the wheels on the
100 mm linear
other andmotion
side150 mmand
and because rotation
at two the surface
different ondirections.
the spot were
of the marble floorimplemented
The errors was wereslick.
too mainly to trace a
square with
causedThe byserial
a side movements
length
slippages of
when4 m. with
The linear
rotating linearmotion
at the and rotation
velocity
corners. and angularon the spot were implemented
velocity were 0.3 m/s to and
trace 0.6
a rad/s,
square with a side length of 4 m. The linear velocity and angular velocity were 0.3 m/s and 0.6 rad/s,
respectively. Results are shown in Figure 11. When the robot moved from Point S to Point E, the
respectively. Results are shown in Figure 11. When the robot moved from Point S to Point E, the
positional errors were 100 mm and4 150 mm at two different directions. The errors were mainly caused
positional errors were 100 mm and 150 mm at two different directions. The errors were mainly
Expected Trajectory
by slippages
causedwhen rotating
by slippages at the
when corners.
rotating at the corners. Real Trajectory
3

4
y(m)

2 Expected Trajectory
Real Trajectory
3
1
y(m)

2
E
0 S
-5 -4 -3 -2 -1 0 1
1 x(m)

Figure 11. Motion trajectory error when the robot moves


E along a square.
0 S
-5 -4 -3
In order to test the vibration and stability of-2the -1 0 1
x(m)mobile chassis, a 2-axial accelerometer (ADI
ADXL203) was installed on the mobile robot to detect the vertical vibration. The test site was an
FigureFigure
indoor environment 11.aMotion
11.with
Motion smooth trajectory
marble
trajectory error when
floor.
error the
Figure
when therobot
12 moves
shows
robot along
the
moves a square.
vibration
along acceleration curves
a square.
when the robot moved in different directions with designated velocities. According to comparisons,
In order toafter
the vibrations test installing
the vibration and stability
hydraulic buffersofwerethe mobile
explicitlychassis, a 2-axial
restrained whenaccelerometer (ADI
the robot moved
In order
ADXL203)to test
was the vibration
installed on the and stability
mobile robot of
to the
detect mobile
the chassis,
vertical a
vibration.2-axial
The
linearly or turned on the spot. The corresponding vertical slippage distance of each wheel along the
accelerometer
test site was an (ADI
ADXL203)indoor
cylinder environment
was slides
installed with
on the
was within a smooth
10mobile marble
mm. There robot floor. Figure
toadetect
is still 12 shows
the vertical
high-frequency the vibration acceleration
vibration.
vibration, which can The curves
test site was an
be weakened
when thepassively
indoor environment
by using robot with
moved a in different
smooth
compliant directions
marble
connections withFigure
floor.
between designated velocities.
12 shows
the chassis and the the According
layers to
vibration
control comparisons,
inacceleration
the next step. curves
the vibrations after installing hydraulic buffers were explicitly restrained when the robot moved
when the robot moved in different directions with designated velocities. According to comparisons,
linearly or turned on the spot. The corresponding vertical slippage distance of each wheel along the
the vibrations
cylinderafter
slides installing
was within 10 hydraulic
mm. Therebuffers
is still awere explicitly
high-frequency restrained
vibration, whichwhen
can bethe robot moved
weakened
by using passively compliant connections between the chassis and the control layers in the next step.
Sensors 2017, 17, 2073 12 of 15

linearly or turned on the spot. The corresponding vertical slippage distance of each wheel along the
cylinder slides was within 10 mm. There is still a high-frequency vibration, which can be weakened by
using passively compliant connections between the chassis and the control layers in the12next
Sensors 2017, 17, 2073 of 15
step.

Sensors 2017, 17, 2073 3 12 of 15


1
Without Hydraulic Buffers
Without Hydraulic Buffers
With Hydraulic Buffers
With Hydraulic Buffers 3 Robot Augular Velocity

)
2
1 Robot Linear Velocity
Vibration acceleration(m/s )

Movement speed(rad/s)
Vibration acceleration(m/s
Without Hydraulic Buffers
2

0.3

speed(m/s)
Without Hydraulic Buffers
2 With Hydraulic Buffers
0.5 With Hydraulic Buffers
Robot Augular Velocity 3

)
2
Robot Linear Velocity
Vibration acceleration(m/s )

Movement speed(rad/s)
Vibration acceleration(m/s
2

0.3

speed(m/s)
0.2 2

MovementMovement
0.5
2 3
0 1
0.2
0.1
1 2
0 1

-0.5 0.1
0 0 0
0 5 10 15 20 25 30 35 0 20 40 60 80 100 120 1
Time(s) Time(s)

-0.5
(a) 0 0
(b) 0
0 5 10 15 20 25 30 35 0 20 40 60 80 100 120
Figure 12. under different moving modes. (a) Longitudinal Vibration
Time(s)curves
Time(s) movement; (b)
Vibration
Figure rotational
12. (a)curves under different moving modes. (a) (b)
movement. Longitudinal movement;
(b) rotational movement.
Figure 12. Vibration curves under different moving modes. (a) Longitudinal movement; (b)
5.2. Localization and Mapping Experiments
rotational movement.
5.2. Localization andthe
To test Mapping Experiments
performance of localization and mapping, the mobile robot was placed in the
5.2.intelligent
Localization and Mappingproduction
manufacturing Experiments
To test the performance of localization and mapping, the mobile robot was placed in the intelligent manufacturing production line (see Figure 9). This area was full of equipment and parts. The length and width of the area were about 30 m and 15 m, respectively. The ground was covered by cement with white pebbles. According to the positioning method described in Section 4, the mobile robot used the data of a Kinect sensor and four wheel encoders for localization and mapping. The estimated trace of the mobile robot and the incrementally generated grid map are shown in Figure 13. The mobile robot started from the zero point, moved clockwise along an approximately rectangular trace, and finally returned to the start point. Several estimated points and their corresponding practical positions were compared with each other, as shown in Figure 14. The average deviation of the distances between each estimated and real point is 207 mm. Since the Kinect sensor has a limited sensing distance of 4 m and a small field of view, it cannot supply sufficient measurements when the environment is open and empty, especially when the robot turns around. In future research, a laser range finder, an inertial measurement sensor, and a magnetic compass will be integrated to provide more measurement data.
Figure 13. Localization and mapping in the production line.
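The filter of Section 4 is not restated here, but its predict/update cycle can be illustrated with a minimal sketch. Everything below is an assumption made for exposition, not the on-board implementation: the state layout [x, y, theta], the noise matrices Q and R, the variable names, and the reduction of the Kinect data to a 2-D position measurement.

    import numpy as np

    def ekf_step(x, P, u, z, Q, R):
        """One EKF cycle for a planar pose x = [px, py, theta] (illustrative sketch).

        u = (dd, dth): traveled distance and heading increment from the wheel encoders.
        z = (zx, zy):  position measurement derived from the Kinect data.
        """
        dd, dth = u
        px, py, th = x

        # Predict: dead-reckoning motion model driven by the encoder odometry.
        x_pred = np.array([px + dd * np.cos(th),
                           py + dd * np.sin(th),
                           th + dth])
        F = np.array([[1.0, 0.0, -dd * np.sin(th)],   # Jacobian of the motion model
                      [0.0, 1.0,  dd * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        P_pred = F @ P @ F.T + Q

        # Update: the visual measurement observes position only.
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        innovation = np.asarray(z) - H @ x_pred
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
        return x_pred + K @ innovation, (np.eye(3) - K @ H) @ P_pred

    # Illustrative use with hypothetical noise settings:
    x, P = np.zeros(3), 0.1 * np.eye(3)
    Q, R = np.diag([1e-3, 1e-3, 1e-4]), np.diag([2e-2, 2e-2])
    x, P = ekf_step(x, P, u=(0.05, 0.01), z=(0.04, 0.01), Q=Q, R=R)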
[Plot: estimated points versus ground-truth positions; both axes in meters (x from 0 to 12, y from -1 to 4).]
Figure 14. The comparison between estimated and real positions.
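For reference, the 207 mm figure is the mean Euclidean distance over the paired points of Figure 14. A minimal sketch of that computation, with made-up coordinates in place of the measured data, is:

    import numpy as np

    def mean_deviation_mm(estimated, ground_truth):
        """Mean Euclidean distance in millimeters between paired 2-D points given in meters."""
        est = np.asarray(estimated, dtype=float)
        gt = np.asarray(ground_truth, dtype=float)
        return 1000.0 * np.linalg.norm(est - gt, axis=1).mean()

    # Hypothetical sample points (meters), not the measured data:
    estimated    = [(0.15, -0.05), (4.10, 0.20), (8.05, 2.85), (0.10, 2.95)]
    ground_truth = [(0.00,  0.00), (4.00, 0.00), (8.00, 3.00), (0.00, 3.00)]
    print(round(mean_deviation_mm(estimated, ground_truth)), "mm")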
6. Conclusions
An omni-directional mobile robot RedwallBot-1 with four Mecanum wheels was developed and tested in this study. The three-layer mechanical design and two-level control system ensure flexible applications and further reorganization. Data from the wheel encoders and a Kinect sensor were fused using a simplified extended Kalman filter to localize the mobile robot, because only relative measurements could be obtained. RedwallBot-1 was integrated with an intelligent manufacturing system to convey materials. The preliminary experimental results show that the autonomous localization and mapping method is feasible in industrial fields, with small estimation errors. Future work will focus on improving the accuracy and reliability of navigation using more on-board sensors. This mobile robot can be used as autonomous conveying equipment as well as a mobile platform for research.

Acknowledgments: This project was supported by the National Natural Science Foundation of China (Grant No. 51205104, 51575150 and 51505114) and the Anhui Provincial Natural Science Foundation (Grant No. 1608085QE116).
Author Contributions: Bin Zi designed the intelligent manufacturing system integrated with the mobile robot; Dan Zhang designed the experiments; Yangang Ma performed the experiments, and Daoming Wang analyzed the data; Jun Qian developed the omni-directional mobile robot and wrote the paper.

Conflicts of Interest: The authors declare no conflict of interest.

References
1. Erico, G. Three Engineers, Hundreds of Robots, One Warehouse. IEEE Spectr. 2008, 45, 26–34.
2. Bøgh, S.; Schou, C.; Rühr, T. Integration and Assessment of Multiple Mobile Manipulators in a Real-World Industrial Production Facility. In Proceedings of the 45th International Symposium on Robotics, Munich, Germany, 2–3 June 2014; pp. 1–8.
3. Bourne, D.; Choset, H.; Hu, H.; Kantor, G.; Niessl, C.; Rubinstein, Z.; Simmons, R.; Smith, S. Mobile Manufacturing of Large Structures. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 1565–1572.
4. Kraetzschmar, G.K.; Hochgeschwender, N.; Nowak, W. RoboCup@Work: Competing for the Factory of the Future. Lect. Notes Comput. Sci. 2014, 8992, 171–182.
5. Ullrich, G. Automated Guided Vehicle Systems: A Primer with Practical Applications; Springer: Berlin/Heidelberg, Germany, 2015.
6. Zeng, J.J.; Yang, R.Q.; Zhang, W.J.; Weng, X.H.; Qian, J. Research on Semi-automatic Bomb Fetching for an EOD Robot. Int. J. Adv. Robot. Syst. 2007, 4, 247–252.
7. Shneier, M.; Bostelman, R. Literature Review of Mobile Robots for Manufacturing; NISTIR 8022; National Institute of Standards and Technology: Gaithersburg, MD, USA, 2015. [CrossRef]
8. Batlle, J.A.; Barjau, A. Holonomy in mobile robots. Robot. Auton. Syst. 2009, 57, 433–440. [CrossRef]
9. Oftadeh, R.; Aref, M.M.; Ghabcheloo, R.; Mattila, J. System Integration for Real-time Mobile Manipulation.
Int. J. Adv. Robot. Syst. 2014, 11. [CrossRef]
10. Indiveri, G. Swedish Wheeled Omnidirectional Mobile Robots: Kinematics Analysis and Control.
IEEE Trans. Robot. 2009, 25, 164–171. [CrossRef]
11. Kanjanawanishkul, K. Omnidirectional Wheeled Mobile Robots: Wheel Types and Practical Applications.
Int. J. Adv. Mechatron. Syst. 2015, 6, 289–302. [CrossRef]
12. Han, K.L.; Kim, H.; Lee, J.S. The sources of position errors of omni-directional mobile robot with Mecanum
wheel. In Proceedings of the 2010 IEEE International Conference on Systems Man and Cybernetics, Istanbul,
Turkey, 10–13 October 2010; pp. 581–586.
13. Herrero-Pérez, D.; Alcaraz-Jiménez, J.J.; Martínez-Barberá, H. An accurate and robust flexible guidance
system for indoor industrial environments. Int. J. Adv. Robot. Syst. 2013, 10. [CrossRef]
14. Yoon, S.W.; Park, S.B.; Kim, J.S. Kalman Filter Sensor Fusion for Mecanum Wheeled Automated Guided
Vehicle Localization. J. Sens. 2015, 2015, 1–7. [CrossRef]
15. Röhrig, C.; Heller, A.; Heß, D.; Kuenemund, F. Global Localization and Position Tracking of Automatic
Guided Vehicles using Passive RFID Technology. In Proceedings of the Joint 45th International Symposium
on Robotics, Munich, Germany, 2–3 June 2014; pp. 401–408.
16. Kim, J.; Woo, S.; Kim, J.; Do, J.; Kim, S. Inertial Navigation System for an Automatic Guided Vehicle with
Mecanum Wheels. Int. J. Precis. Eng. Manuf. 2012, 13, 379–386. [CrossRef]
17. Brunetto, N.; Salti, S.; Fioraio, N.; Cavallari, T.; Di Stefano, L. Fusion of Inertial and Visual Measurements for
RGB-D SLAM on Mobile Devices. In Proceedings of the 2015 IEEE International Conference on Computer
Vision Workshop, Santiago, Chile, 7–13 December 2015; pp. 148–156.
18. Tsai, C.C.; Tai, F.C.; Wang, Y.C. Global Localization using Dead-reckoning and Kinect Sensors for Robots
with Omnidirectional Mecanum Wheels. J. Mar. Sci. Technol. 2014, 22, 321–330.
19. Rowekamper, J.; Sprunk, C.; Tipaldi, G.D.; Stachniss, C.; Pfaff, P.; Burgard, W. On the Position Accuracy
of Mobile Robot Localization based on Particle Filters Combined with Scan Matching. In Proceedings
of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal,
7–12 October 2012; pp. 3158–3164.
20. Thoben, K.D.; Wiesner, S.; Wuest, T. “Industrie 4.0” and Smart Manufacturing—A Review of Research Issues
and Application Examples. Int. J. Autom. Technol. 2017, 11, 4–17. [CrossRef]
21. Puerto, M.J.; Salle, D.; Outon, J.L.; Herrero, H.; Lizuain, Z. Towards a Flexible Production—System
Environment Server Implementation. In Proceedings of the 16th International Conference on Computer as a
Tool, Salamanca, Spain, 8–11 September 2015. [CrossRef]
22. Baglio, S.; Castorina, S.; Fortuna, L. Modeling and Design of Novel
Photo-thermo-mechanical Microactuators. Sens. Actuators A Phys. 2002, 101, 185–193. [CrossRef]
23. Peng, T.R.; Qian, J.; Zi, B.; Liu, J.; Wang, X. Mechanical Design and Control System of an Omni-directional
Mobile Robot for Material Conveying. Procedia CIRP 2016, 56, 412–415. [CrossRef]
24. Bae, J.J.; Kang, N. Design Optimization of a Mecanum Wheel to Reduce Vertical Vibrations by the
Consideration of Equivalent Stiffness. Shock Vib. 2016, 2016, 1–8. [CrossRef]
25. Roh, S.G.; Lim, B. Flexible Suspension Mechanism for Stable Driving of a Differential Drive Mobile Robot.
In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo,
Japan, 3–7 November 2013; pp. 5518–5523.
26. Zi, B.; Lin, J.; Qian, S. Localization, Obstacle Avoidance Planning and Control of Cooperative Cable Parallel
Robots for Multiple Mobile Cranes. Robot. Comput. Integr. Manuf. 2015, 34, 105–123. [CrossRef]
27. Zi, B.; Duan, B.Y.; Du, J.L.; Bao, H. Dynamic Modeling and Active Control of a Cable-Suspended Parallel
Robot. Mechatronics 2008, 18, 1–12. [CrossRef]
28. Zi, B.; Sun, H.H.; Zhang, D. Design, Analysis and Control of a Winding Hybrid-Driven Cable Parallel
Manipulator. Robot. Comput. Integr. Manuf. 2017, 48, 196–208.
29. Heß, D.; Künemund, F.; Röhrig, C. Linux Based Control Framework for Mecanum Based Omnidirectional
Automated Guided Vehicles. In Proceedings of the World Congress on Engineering and Computer Science
2013, San Francisco, CA, USA, 23–25 October 2013; pp. 395–400.
30. Quigley, M.; Gerkey, B.; Smart, W.D. Programming Robots with ROS: A Practical Introduction to the Robot
Operating System; O’Reilly Media: Sebastopol, CA, USA, 2015.
31. Unified Architecture. Available online: https://opcfoundation.org/about/opc-technologies/opc-ua (accessed on 1 January 2017).
32. Ros_Opcua_Communication. Available online: http://wiki.ros.org/ros_opcua_communication (accessed on
19 December 2016).
33. Welch, G.; Bishop, G. An Introduction to the Kalman Filter; TR95-041; ACM: New York, NY, USA, 2001.
34. Qian, J.; Zhu, H.B.; Wang, S.W.; Zeng, Y.S. Fast Reconstruction of an Unmanned Engineering Vehicle and its
Application to Carrying Rocket. IET J. Eng. 2014. [CrossRef]
35. Bay, H.; Tuytelaars, T.; Van Gool, L. SURF: Speeded up Robust Features. Lect. Notes Comput. Sci. 2006, 3591,
404–417.
36. Muja, M.; Lowe, D.G. Fast Approximate Nearest Neighbors with Automatic Algorithm Configuration.
In Proceedings of the 4th International Conference on Computer Vision Theory and Applications, Lisboa,
Portugal, 5–8 February 2009; pp. 331–340.
37. Li, S.Q.; Xu, C.; Xie, M. A Robust O(n) Solution to the Perspective-n-Point Problem. IEEE Trans. Pattern Anal.
Mach. Intell. 2012, 34, 1444–1450. [CrossRef] [PubMed]
38. Besl, P.J.; McKay, H.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992,
14, 239–256. [CrossRef]
39. Kummerle, R.; Grisetti, G.; Strasdat, H.; Konolige, K.; Burgard, W. g2o: A General Framework for Graph
Optimization. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation,
Shanghai, China, 9–13 May 2011; pp. 3607–3613.
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).