
Research Article

International Journal of Advanced Robotic Systems
May-June 2022: 1–21
© The Author(s) 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/17298806221103710
journals.sagepub.com/home/arx

Development stages of a semi-autonomous underwater vehicle experiment platform

Serhat YILMAZ

Abstract
The design of unmanned underwater vehicles is an interdisciplinary study that spans several fields, such as computational fluid dynamics, modeling and control of systems, robotics, image processing, and electronic card design. The operational cost of such vehicles is high because their behavior depends on variable fluid properties such as salinity, and their mobility must withstand demanding environmental conditions such as high undersea pressure. The study describes an operating platform, called Lucky Fin, on which students can develop various control algorithms and can test and extract hydrodynamic parameters of the underwater vehicle. The platform consists of an underwater vehicle and two testing tanks. The control card, the user control program interface, and a manipulator arm are designed to be used for a series of control applications such as depth, heading, target tracking, and capturing. The results of several tests are illustrated in this article.

Keywords
Control card, user interface, depth control, heading control, robotic vision, CFD, manipulator, underwater vehicle

Date received: 07 July 2021; accepted: 09 May 2022

Topic: Mobile Robots and Multi-Robot Systems


Topic Editor: YangQuan Chen
Associate Editor: Ke-Cai Cao

Introduction

Underwater vehicles (UVs) perform several tasks, such as diving to the desired depth, docking under demanding conditions, and visual inspection of specific underwater structures,1 while keeping the heading angle tangent to the desired trajectory. An experimental environment can be set up to facilitate the development of different control methods that yield reasonable performance. For this purpose, Miorim et al.2 developed an educational tool for remotely operated vehicles (ROVs). They extracted the requirements of a control center for educational purposes, which supports different high-sea ROV operations. Fletcher and Harris3 developed a virtual environment (VE) system for training piloting skills on ROVs. The VE system contains mission operations including maneuvering, integration of sensor data, and situational awareness. Eng et al.4 prepared a method for online identification of autonomous underwater vehicle (AUV) dynamics in field experiments. They use the models to estimate the maneuvering radius of the AUV at altered speeds and to design an optimized-gain controller.

On the other hand, Wang et al.5 developed an AUV platform that carries a laser line scanner to maintain an accurate and stable altitude. To do this, the vertical thrusters are operated around the zero point, and their direction is altered frequently. There are several competitions for ROVs and AUVs in which groups of students participate. These competitions encourage

Department of Electronics and Telecommunication Engineering, Faculty of Engineering, Kocaeli University, Kocaeli, Turkey

Corresponding author:
Serhat YILMAZ, Department of Electronics and Telecommunication Engineering, Faculty of Engineering, Kocaeli University, Kocaeli 41001, Turkey.
Email: yilmazserhat@gmail.com

Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License
(https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without
further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/
open-access-at-sage).

students to develop and share new skills in underwater technology and related applications. For this purpose, a depth control system has been developed for a microprocessor-based UV.6 The depth information obtained by the pressure sensor is compared with the desired depth, and the error is eliminated by means of controlling the vertical motors. DeBitetto7 used fuzzy controllers instead of traditional control methods in a similar study. The pitch angle is controlled using the depth error information and its derivative. In the mentioned study, the error and its derivative are described by verbal variables, and each variable is separated into fuzzy membership functions (MFs).

Using the sliding mode observer and Kalman filter, Kim and Shin8 determined the hydrodynamic coefficients of a vehicle with six degrees of freedom (DOF). They extracted the block diagram model of the vehicle and simulated its behavior for various reference depth and direction values. In these studies, various models related to the applications are used as UV models. Some of these UVs are cable-connected ROVs, while the others are AUVs. The vehicles in the latter group usually communicate with the surface system by means of acoustic modems. Both traditional9 and novel10,11 control techniques are applied to provide autonomous motion for UVs. Fernandes et al.12 managed to control a UV in four DOFs (yaw, sway, surge, and heave), and they designed a controller to track an appropriate and smooth reference path under unmodelled plant dynamics, varying UV parameters, and environmental noise and disturbances.

Image acquisition from UVs, acoustic transmission, and image processing are also significant subjects in this field. In a case study, images acquired from the UV were improved using Canny edge detection and Hue, Luma, and Saturation algorithms.13 AUVs are generally equipped with at least one optical camera to acquire visual information about underwater locations. It is also essential for the estimation of the AUV position based on targeted underwater objects.14 Object tracking algorithms are used in many different areas, such as security systems, autonomous vehicles, robots, and traffic cameras. The common point is defining and updating the position of an object across the image frames of a video.15

Traditional parameter estimation methods used to improve UV hydrodynamics are towing experiments and pool tests. Building and operating towing tests for real-size vehicles is quite expensive.9 These experiments give very realistic results when the vehicle sensors are sensitive. However, changing flow conditions in experimental studies make it difficult to get reasonable and precise results. Experimental studies on the determination of UV hydrodynamics take so long that research and development processes lose time. Computational fluid dynamics (CFD) is an effective computer simulation and design method for determining the hydrodynamic model parameters of UUVs. Therefore, the usage of computers and CFD has been emphasized. The hydrodynamic behavior of a vehicle can be extracted by fluid dynamic imaging and CFD simulations.16 In a case study, the mathematical model of a low-speed AUV was extracted by means of CFD analysis for two critical hydrodynamic parameters, the added mass (MA) and damping matrices.17

The mechanical, electronic, and software designs of a UV test platform (TP), which has been developed by the Kocaeli University Electronics and Telecommunications Department between 2013 and 2020 and is called the Lucky Fin Project, are introduced in this study. The platform is composed of a UV and lateral and vertical test tanks. Although the basic concepts are preserved, the sensors, cameras, battery types, microcontrollers, and card designs have been modified in line with the development stages. Basic movements, such as diving, keeping the desired depth level or position angle, and tracking and grasping a target object, are provided by control systems. To achieve this kind of robust autonomy, the kinematic and dynamic behaviors of the vehicle should also be determined and modeled. P (proportional), PD (proportional–derivative), PID (proportional–integral–derivative), and ANFIS (adaptive neuro-fuzzy inference system) based controllers, which can respond to the desired commands within certain performance criteria, have been developed and tested on the platform. Some actual system parameters are identified and adjusted to improve the design. As the platform serves as a testbed for a set of experiments, electronics and telecommunication engineering students can develop applications on a wide range of topics that they have learned during their lectures. Depth and heading control applications of the UV have been carried out and interpreted in the laboratory tests. In addition, image processing tests on target tracking and object capturing operations are also included. The UV measures some parameters, such as orientations and corresponding velocities, referring to the body frame. These values should be converted by the kinematic model and interpreted as Earth-fixed parameters on the shore computer. Some of the UV dynamic parameters are determined according to simulations and experimental tests.

The study considers the benefits and contributions of designing a framework, Lucky Fin, for educational-purpose UV development and testing. In doing so, it was inspired by some guiding works. Modest improvements and contributions are made to some of these studies in terms of equipment, design, and method, and the experimental setup emerged. An underwater robot experimental testbed was developed for depth control and temperature measurement.18 A piston actuator moves up and down to adjust the sinking level of the robot in a glass rectangular aquarium tank. The piston as an actuator has deadzone and deadband nonlinearities, which cause errors, oscillations, and instability. Accurate measurement takes 3 min, and a 60 cm depth is reached with 1.55 cm deviations. The Lucky Fin Project was influenced by the principle of this

work and aimed to improve it. The mechanical design of the passive diving robot, which consists of a tube and a piston, has been replaced by a UV with four DOFs, propellant motors, a camera, and sensory equipment. Heave–surge and roll–heading movements are applied in the cylinder-shaped horizontal and lateral tanks, respectively. More accurate depth control is achieved, as given in the "Discussions on UV control applications" section.

Various control methods have been proposed in the literature over the years. Sliding mode controllers (SMCs) are the common heading, depth, pitch, and yaw control methods applied to AUV systems.19,20 Non-linear characteristics of brushless thruster motors with a dead zone result in heading error. When the error goes beyond a definite rate, the controller treats it as a step disturbance, and overshoots in the transient response can reach around five degrees.21 Unmeasurable conditions, sensory noise, electromagnetic effects, and signal delays also degrade the stability and performance of AUV heading control.22 The high performance and high accelerations of brushless motors are of course indisputable. However, the motor driver design of Lucky Fin provides slower but precise and stable responses of DC motors within narrow angular ranges. A highly precise calibrated TCM3 electronic compass is insulated watertight and located at the top of the vehicle. This location is outside of the electronic control hull and away from the power layer and metal parts (Figure 9(b)). These adjustments make the vehicle less susceptible to electromagnetic disturbances and provide precise control of the Euler angles.

Vision-based positioning and tracking of an AUV have been used successfully in clean environments over restricted distances.23 Specifically, AUVs equipped with robotic arm manipulators can execute high-resolution computer vision-based tasks, such as detecting and capturing objects within reachable distances of the manipulator.24 Image processing algorithms are addressed in real-time underwater visual tracking operations. The Hough algorithm proposed for localization operations increases recognition capability on pipelines.25 Computational intelligence methods, such as artificial neural network based adaptive control systems, the fuzzy cerebellar model articulation controller,11 and self-adaptive recurrent neuro-fuzzy control,10 improve the tracking performance and dynamic motion response of AUVs. Lucky Fin's object tracking task is accomplished with vision-based direction and surge control. While PD and ANFIS algorithms perform the mentioned control operations, the Hough algorithm is used for object detection.

Autonomous systems consisting of a UV and a robot arm (RA) are designed to track underwater objects and detect their positions precisely. The object is detected by image processing, and the manipulator approaches the object with methods such as fuzzy control.26 Some manipulators are controlled manually,27 and in others, the object is captured with a fixed arm24 or by hovering over and vacuuming the object.28 In this study, image processing and axial control algorithms are prepared on the Raspberry Pi for detecting an object from the camera and capturing it with a manipulator. The mobility of the manipulator increases the DOF of Lucky Fin by two degrees and provides more flexible approach conditions to the object. The control card (CC) and software are original and open to development. CFD is a convenient method for parameter extraction of the UUV hydrodynamic model.29 Control of movements at low speeds, such as sway, heave, surge, and heading, can be precisely modeled and simulated in terms of hydrodynamic coefficients, such as added mass (MA), drag (CD), and lift (CL), and controller parameters. The combination of CFD simulation with free-stream turbulence models and experimental tests improves parameter identification.30 However, confirmation of the hydrodynamic parameters obtained in simulations with experimental tests is rare in the literature. Lucky Fin is a custom-designed modular UV with four DOF, and the relevant computer-aided design (CAD) model is extracted for the first time. Cost-effective confirmatory experimental methods that are worthy of reference in testing complex-shaped modular UVs are included in the study.

Overview of the UV TP

For the mentioned control purposes, a TP consisting of a UV, a CC, control program interface (CPI) software, and two simulation tanks was designed and implemented (Figure 1). Depth is measured by a high-precision pressure sensor, whereas heading is determined by a three-axis electronic compass. The depth and heading control applications are applied in the horizontal tank (Figure 1(a)) and lateral tank (Figure 1(b)), respectively.

The CC in the UV and the CPI on the computer communicate via the Universal Synchronous/Asynchronous Receiver Transmitter (USART) protocol. A CCD camera is set on the UV for monitoring and image processing. The CC drives two vertical and two horizontal thrusters and, optionally, the servomotors of a three-DOF manipulator. Principally, the CC is designed for both remote and autonomous control applications. Thus, the CC operates the vehicle as an open- or closed-loop control system. In open-loop mode, control commands sent by the operator via the CPI are transmitted to the vehicle's actuators, and data from the sensors are monitored on the CPI. The closed loop processes the sensory data and produces the control signals. Features of the vehicle are given in Table 1.

Methods used in the UV system design

Kinematic modeling of the UV

The position and orientation of the vehicle on three axes are calculated according to the kinematic model of the vehicle. Orientations measured by the TCM3 electronic compass in

Figure 1. (a) Depth and (b) heading control applications in the TP. TP: test platform.

Table 1. UV features.

Length (cm)   Width (cm)   Cross size (cm)   Height (cm)   Weight (kg)   Power of each motor (W)
46            42.5         47                30            13            30

UV: underwater vehicle.

the vehicle should be converted by the Jacobian matrix to be interpreted by the operator located next to the shore computer. The pulse width modulation (PWM) value of each motor is determined by referring to its target axes. These equations are utilized in three-dimensional model simulations by means of Matlab, and the equations are embedded into the CPI software for online orientation control applications. The most used kinematic representations in these applications are Euler angles and quaternion groups.31 With Euler angles, the origin of the UV coordinate system is rotated until it matches the earth coordinate system's origin (x, y, z) (Figure 2).

A compact representation of the Jacobian matrix for the kinematic equations is written as

J = \begin{bmatrix} J_1(\phi, \theta, \psi) & 0 \\ 0 & J_2(\phi, \theta, \psi) \end{bmatrix}  (1)

J1 and J2 are given in (2)

J = \begin{bmatrix}
c\theta\,c\psi & -s\psi\,c\phi + s\phi\,s\theta\,c\psi & s\psi\,s\phi + c\phi\,s\theta\,c\psi & 0 & 0 & 0 \\
c\theta\,s\psi & c\psi\,c\phi + s\phi\,s\theta\,s\psi & -c\psi\,s\phi + c\phi\,s\theta\,s\psi & 0 & 0 & 0 \\
-s\theta & s\phi\,c\theta & c\phi\,c\theta & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & s\phi\,t\theta & c\phi\,t\theta \\
0 & 0 & 0 & 0 & c\phi & -s\phi \\
0 & 0 & 0 & 0 & s\phi/c\theta & c\phi/c\theta
\end{bmatrix}  (2)

where c, s, and t are abbreviations of cosine, sine, and tangent, respectively. The Jacobian matrix matches the Cartesian position vector with the angular Euler vector, which is given in terms of SNAME notation32
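The block-diagonal transformation of equations (1) and (2) can be sketched numerically. This is a minimal illustration (not the project's CPI code); `euler_jacobian` is a hypothetical helper name:

```python
import numpy as np

def euler_jacobian(phi: float, theta: float, psi: float) -> np.ndarray:
    """Build the 6x6 Jacobian J of equation (2) from the Euler angles."""
    c, s, t = np.cos, np.sin, np.tan
    # J1: body-to-earth rotation acting on linear velocities (upper-left block)
    j1 = np.array([
        [c(theta)*c(psi), -s(psi)*c(phi) + s(phi)*s(theta)*c(psi),  s(psi)*s(phi) + c(phi)*s(theta)*c(psi)],
        [c(theta)*s(psi),  c(psi)*c(phi) + s(phi)*s(theta)*s(psi), -c(psi)*s(phi) + c(phi)*s(theta)*s(psi)],
        [-s(theta),        s(phi)*c(theta),                         c(phi)*c(theta)],
    ])
    # J2: maps body angular rates (p, q, r) to Euler angle rates (lower-right block)
    j2 = np.array([
        [1.0, s(phi)*t(theta),  c(phi)*t(theta)],
        [0.0, c(phi),          -s(phi)],
        [0.0, s(phi)/c(theta),  c(phi)/c(theta)],
    ])
    j = np.zeros((6, 6))
    j[:3, :3] = j1
    j[3:, 3:] = j2
    return j

# Pure surge at u = 0.12 m/s (the vehicle's top surge speed, Table 2)
body_vel = np.array([0.12, 0.0, 0.0, 0.0, 0.0, 0.0])
# With a 90-degree heading, surge in the body frame maps to the earth y-axis
earth_vel = euler_jacobian(0.0, 0.0, np.pi / 2) @ body_vel
```

At zero Euler angles the Jacobian reduces to the identity, so body and earth velocities coincide, which is a quick sanity check on the sign reconstruction above.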

Figure 2. Illustration of the Lucky Fin UV elementary motions and frames on the 3D model. UV: underwater vehicle; 3D: three-dimensional.

\begin{bmatrix} \dot{x}_1 \\ \dot{x}_2 \end{bmatrix} = \begin{bmatrix} J_1(x_2) & 0 \\ 0 & J_2(x_2) \end{bmatrix} \begin{bmatrix} \dot{q}_1 \\ \dot{q}_2 \end{bmatrix}, \qquad \dot{x} = J(x)\,\dot{q}  (3)

where x_1 = (x, y, z)^T is the position vector and x_2 = (\phi, \theta, \psi)^T is the Euler angle vector; together they form the inertial reference system. The body-fixed linear and angular velocity vectors of the vehicle are given as follows

\dot{q}_1 = (u, v, w)^T, \qquad \dot{q}_2 = (p, q, r)^T  (4)

The kinematic transformation of linear and angular velocities between the local and earth coordinate systems is given as

\begin{bmatrix} v_e \\ w_e \end{bmatrix} = J(\phi, \theta, \psi) \begin{bmatrix} v_o \\ w_o \end{bmatrix}  (5)

These variables are determined according to the earth coordinate system. In this case, the velocities of the UV are known and represented by the forward kinematic equation.

Dynamic modeling of the UV

Dynamic modeling deals with the relationship between the robot's resulting motion and the forces applied to the robot.33 UUV dynamics are derived from the Newton–Euler equation of a rigid body in a liquid

M\dot{v} + C(v)\,v + D(v)\,v + g(\eta) = \tau  (6)

where \tau is the force–torque vector of the thruster motors, M is the sum of the rigid-body and added mass matrices of the vehicle, and C(v), D(v), and g(\eta) are the total Coriolis, drag, and gravity–buoyancy matrices, respectively. The study is limited to the determination of the mass matrix for the UV: M consists of the rigid-body mass, M_RB, and added mass, M_A, matrices

M = M_{RB} + M_A  (7)

M_RB is defined as32

M_{RB} = \begin{bmatrix}
m & 0 & 0 & 0 & m z_G & -m y_G \\
0 & m & 0 & -m z_G & 0 & m x_G \\
0 & 0 & m & m y_G & -m x_G & 0 \\
0 & -m z_G & m y_G & I_{xx} & -I_{xy} & -I_{xz} \\
m z_G & 0 & -m x_G & -I_{yx} & I_{yy} & -I_{yz} \\
-m y_G & m x_G & 0 & -I_{zx} & -I_{zy} & I_{zz}
\end{bmatrix}  (8)

r_c is the distance vector between the center of the body coordinate frame and the UV's center of gravity

r_c = [\,x_G \;\; y_G \;\; z_G\,]^T  (9)

I_B is the inertia tensor of the vehicle body,34 which is composed of the I_{xx}, I_{yy}, and I_{zz} inertial terms occurring around the x-, y-, and z-axes, respectively (Eq. (10)). UUVs are generally symmetrical in the x–z plane. Hydrodynamic friction must be reduced so that vehicles can move faster in the x-direction. For this reason, the front and rear of the vehicles are generally designed to be balanced and very close to symmetry in the y–z plane with respect to the center. In addition, symmetry is assumed in modeling to facilitate operations. Although UUVs are not symmetrical in the x–y plane, they can be considered symmetrical because most vehicles operate at low speeds.35
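Equations (8)–(10) can be checked numerically. The sketch below is an illustration only: the 13 kg mass matches Table 1, but the center-of-gravity offset and inertia values are assumed example numbers, not identified parameters of the actual vehicle.

```python
import numpy as np

def skew(v: np.ndarray) -> np.ndarray:
    """Skew-symmetric matrix S(v), so that S(v) @ x equals the cross product v x x."""
    x, y, z = v
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])

def rigid_body_mass(m: float, r_c: np.ndarray, inertia: np.ndarray) -> np.ndarray:
    """Assemble the 6x6 rigid-body mass matrix of equation (8)."""
    m_rb = np.zeros((6, 6))
    m_rb[:3, :3] = m * np.eye(3)       # translational mass block
    m_rb[:3, 3:] = -m * skew(r_c)      # coupling from center-of-gravity offset
    m_rb[3:, :3] = m * skew(r_c)
    m_rb[3:, 3:] = inertia             # inertia tensor I_B
    return m_rb

# With r_c = 0 and a diagonal inertia tensor, (8) collapses to the diagonal form (10)
m_rb = rigid_body_mass(13.0, np.zeros(3), np.diag([0.5, 0.6, 0.7]))
```

Because S(r_c) is skew-symmetric, the assembled matrix stays symmetric even for a nonzero center-of-gravity offset, which is a useful consistency check on the sign pattern in (8).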

When the UUV is symmetrical in all planes and the origin of the body reference frame is located at the center of gravity, in other words r_c = [0 0 0]^T, (8) is reorganized as

M_{RB} = \begin{bmatrix}
m & 0 & 0 & 0 & 0 & 0 \\
0 & m & 0 & 0 & 0 & 0 \\
0 & 0 & m & 0 & 0 & 0 \\
0 & 0 & 0 & I_{xx} & 0 & 0 \\
0 & 0 & 0 & 0 & I_{yy} & 0 \\
0 & 0 & 0 & 0 & 0 & I_{zz}
\end{bmatrix}  (10)

M_A is given as follows

M_A = \begin{bmatrix}
X_{\dot u} & X_{\dot v} & X_{\dot w} & X_{\dot p} & X_{\dot q} & X_{\dot r} \\
Y_{\dot u} & Y_{\dot v} & Y_{\dot w} & Y_{\dot p} & Y_{\dot q} & Y_{\dot r} \\
Z_{\dot u} & Z_{\dot v} & Z_{\dot w} & Z_{\dot p} & Z_{\dot q} & Z_{\dot r} \\
K_{\dot u} & K_{\dot v} & K_{\dot w} & K_{\dot p} & K_{\dot q} & K_{\dot r} \\
M_{\dot u} & M_{\dot v} & M_{\dot w} & M_{\dot p} & M_{\dot q} & M_{\dot r} \\
N_{\dot u} & N_{\dot v} & N_{\dot w} & N_{\dot p} & N_{\dot q} & N_{\dot r}
\end{bmatrix}  (11)

When we apply forces on the x-, y-, and z-axes and moments around these axes to a vehicle, the water masses that need to be pushed resist the vehicle in these directions. Each element of the M_A matrix represents the added mass of water to be pushed in the related direction. Each added mass is calculated by Newton's second law of motion; for example, X_{\dot u} is the rate of change of the thruster force in the x-direction with respect to the UV acceleration32

X_{\dot u} = \frac{\partial X}{\partial \dot u}  (12)

The terms of M_A depend on the UUV's shape. If a UUV is symmetrical in all planes and the body reference frame is at the vehicle's center of gravity, the hydrodynamic M_A can be written in diagonal terms only. Thus, except for the main directions or orientations stated on the prime diagonal, the M_A terms can be accepted as negligible36

M_A = \begin{bmatrix}
X_{\dot u} & 0 & 0 & 0 & 0 & 0 \\
0 & Y_{\dot v} & 0 & 0 & 0 & 0 \\
0 & 0 & Z_{\dot w} & 0 & 0 & 0 \\
0 & 0 & 0 & K_{\dot p} & 0 & 0 \\
0 & 0 & 0 & 0 & M_{\dot q} & 0 \\
0 & 0 & 0 & 0 & 0 & N_{\dot r}
\end{bmatrix}  (13)

Control methods

PD and on–off methods. These methods are used for individual depth, roll, yaw, and heading control applications. The control output, u(t), of a PD controller is given as

u(t) = K_p\, e(t) + K_d\, \frac{de(t)}{dt}  (14)

where K_p and K_d are the proportional and derivative coefficients of the error, respectively.37 The PD control routine includes numerical representations of the error and its derivative

u(t) = K_p\, e(t) + K_d\, \frac{e(t) - e(t-1)}{\Delta t}  (15)

where (e(t) - e(t-1))/\Delta t is the numerical backward-difference approximation of the derivative and \Delta t is the sampling period of the CC.

On the other hand, on–off control is a basic feedback system that switches the actuator from fully closed to fully open according to the setpoint of the controlled variable. In such controllers, a dead band between the lower border (lb) and upper border (ub) can be placed to prevent chattering around the set-point

u(t) = \begin{cases} \text{OFF}\;(0), & e(t) > ub \\ \text{ON}\;(1), & e(t) < lb \end{cases}  (16)

ANFIS method. ANFIS is a structure that combines a Sugeno-type fuzzy system with neural learning capability (Figure 3). Such an approach makes fuzzy logic more systematic and less dependent on experience. ANFIS calculates inputs in a forward direction by means of fuzzification, production, normalization, and defuzzification layers and propagates the resulting error back to the MF parameters.38 The backpropagation method is utilized for adapting the MF parameters, which enables the ANFIS model to learn the input–output characteristics of a system.39

Four DOF Lucky Fin UV model

The Earth (E) fixed frame and the local axes along which the vehicle can move are shown in Figure 4. If the horizontal thrusters run at identical velocities, the UV moves in the x-direction. When the same thrusters operate at different velocities, the vehicle rotates around the z-axis with an angular velocity (r) and an angular displacement (ψ). When the vertical thrusters of the vehicle run at identical velocities, it dives in the z-direction. When they operate at different velocities or in opposite directions, the vehicle rotates around the x-axis with an angular velocity (p) and an angular displacement (φ). The highest velocities obtained by the Lucky Fin vehicle during the experimental studies are given in Table 2.

Image processing and target tracking methods for robot vision systems

After starting the image processing applications, the analog camera used in the early days of the UV design was replaced by the digital Raspberry Pi camera.

Figure 3. ANFIS model with three rules. ANFIS: adaptive neuro-fuzzy inference system.
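The layered computation in Figure 3 can be sketched as a forward pass. This is an illustrative sketch assuming three Gaussian membership functions on the error input and first-order Sugeno consequents; `anfis_forward` and all parameter values are hypothetical, not the platform's trained model.

```python
import numpy as np

def anfis_forward(e: float, centers, sigmas, p, r) -> float:
    """One input (error E) and three rules: fuzzify, fire, normalize, defuzzify."""
    centers, sigmas = np.asarray(centers, float), np.asarray(sigmas, float)
    p, r = np.asarray(p, float), np.asarray(r, float)
    # Fuzzification layer: Gaussian MF values; with a single input the
    # production layer reduces to the MF value itself (firing strength w_i)
    w = np.exp(-0.5 * ((e - centers) / sigmas) ** 2)
    w_bar = w / w.sum()                 # normalization layer
    f = p * e + r                       # first-order Sugeno consequents f_i
    return float(np.sum(w_bar * f))     # defuzzification and output layers

# With identical consequents the normalized weights cancel: the output is f for any E
out = anfis_forward(3.0, centers=[0, 10, 20], sigmas=[4, 4, 4], p=[0, 0, 0], r=[1, 1, 1])
```

In training, the backpropagation pass described above would adjust `centers`, `sigmas`, `p`, and `r` to fit recorded input–output data.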

OpenCV is an open-source computer vision library from Intel that provides high-level algorithms and functions for robot vision problems.40 Red, green, and blue (RGB) and hue-saturation-value (HSV) models that are suitable for the OpenCV format can be used. The minimum and maximum values of the color can be obtained during the conversion process from RGB space to HSV space. The Hough circle transform is a method widely used in image processing to detect circular shapes; many applications, such as eye recognition, license plate detection, and finding the ball on a football field, can be realized with it.41 With this method, shapes are determined as follows: the edges of the image are detected, and the image is converted into binary. Edge pixels vote in the accumulator. According to the result of the accumulator, the shapes with the highest votes are the most likely to be found in the image. The detected geometric shapes can be displayed as an output on the image. As an example, a green circle is drawn when a red object is detected (Figure 5).

Figure 4. The four DOF Lucky Fin UV: earth fixed frame and body fixed frame. DOF: degrees of freedom; UV: underwater vehicle.

Since circular shapes are expected to form in the image, the accumulator array consists of the center points (a, b) of the circle and the radius value (r). Equation (17) illustrates the central circle formula. Positions x and y are given in (18) and (19), respectively.

r^2 = (x - a)^2 + (y - b)^2  (17)

x = a + r\sin(\theta)  (18)

y = b + r\cos(\theta)  (19)

Once the angle is varied between 0 and 2\pi, it can be determined whether a point is on the circle or not. The circle with the highest number of votes in the accumulator is drawn. After the image has been processed, the thruster velocities should be adjusted proportionally according to the position of the object. If we consider the vehicle camera window where the image is shown as the x–y coordinate plane, the velocities of the thrusters should vary continuously with respect to the y-axis of the window. If the target object is on the right or on the left of the y-axis, the left or right thrusters should run faster, respectively (Figure 6).

In order to change the speed of the motors, the PWM signals generated on the controller card should be sent to the motors.42

UV and manipulator systems for object capturing operations

The integration of the UV and the RA (UV + RA) has facilitated operations such as repairing oil platforms, detecting and welding leaks in pipelines, collecting underwater objects, and investigating deeply submerged wrecks.32 Images of underwater objects can be sent to the operator on land.

Table 2. Motion axes of the four DOF Lucky Fin UV and the highest speeds obtained from the vehicle.

DOF   Definition of the motion          Linear displacements and Euler angles   Linear and angular velocities
1     Linear in the x-direction         x                                       u = 0.12 m/sec
2     Linear in the z-direction         z                                       w = 0.2 m/sec
3     Angular roll around the x-axis    φ                                       p: could not be measured
4     Angular yaw around the z-axis     ψ                                       r = 5°/sec

DOF: degrees of freedom; UV: underwater vehicle.
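The four-DOF motion description above (surge and yaw from the horizontal thruster pair, heave and roll from the vertical pair) implies a simple thruster-mixing rule, sketched here as an illustration; the signs, normalization, and `mix_thrusters` name are assumptions, not the vehicle's firmware.

```python
def mix_thrusters(surge: float, yaw: float, heave: float, roll: float):
    """Map four DOF commands (each in [-1, 1]) to the four thruster efforts.
    Identical horizontal efforts -> x motion; differential -> yaw (r, psi).
    Identical vertical efforts -> z motion; differential -> roll (p, phi)."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    left = clamp(surge + yaw)       # horizontal pair
    right = clamp(surge - yaw)
    vert_left = clamp(heave + roll) # vertical pair
    vert_right = clamp(heave - roll)
    return left, right, vert_left, vert_right

# Pure surge command: both horizontal thrusters equal, vertical pair idle
efforts = mix_thrusters(0.5, 0.0, 0.0, 0.0)
```

A pure yaw command produces equal and opposite horizontal efforts, matching the "different velocities rotate the vehicle around the z-axis" behavior described for the platform.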

The operator's arm movement can be detected by image processing methods, matched with the angle movements in the manipulator's joints, and consequently, the object is grasped.43 Seashells can be detected at a depth of 100 m using image processing methods and collected by the vacuum system of a fixed arm mounted on the vehicle.28 One of the educational experiment designs in this study deals with RA applications.44 Detecting an object from the camera on the UV + RA and capturing the target object are carried out by means of image processing and axial control algorithms. The UV + RA was also used in an ROV competition. The manipulator ideally provides two more DOF to the UV, and consequently, the conditions for approaching the object are more flexible (Figure 7).

Figure 5. Object detection by means of the Hough method.

Figure 6. Position of an object with respect to the y-axis of the UV. UV: underwater vehicle.

Electronic equipment and software designs

Controller card (CC) design

The developed UV CC includes two microcontrollers (μC): an ARM Cortex M4 based STM32F407VGT6 (STM32) μC and a BeagleBone Black development card. The BeagleBone hosts an ARM Cortex-A8-based μC. According to the application, these two microprocessors share the tasks (Figure 8). In the later studies, the BeagleBone Black was replaced with a Raspberry Pi minicomputer.45

The CC is designed to be powered by a 12 V battery pack. The supply voltage is reduced by switched-mode converters to the required levels of the microcontrollers and the sensors. A 12 V SEPIC battery charging unit is added to the card. The STM32 μC undertakes tasks such as the PD control process by means of the sensors and thrusters and the on–off switching commands of the lighting and the camera. A TCM3 electronic compass, a GY-80 inertial measurement unit (IMU), and a WIKA S10 pressure sensor are used to acquire direction, velocity, and depth data, respectively. The pressure sensor and the electronic compass are set on the lower and upper parts of the UV, respectively (Figure 9).

For instance, the calculation of the depth data by the CC is given as follows

\text{Depth} = \left( \text{ADC} \times \frac{3.3}{4095} - 0.4 \right) \times 625 \;\text{cm}  (20)

where ADC is the analog-to-digital conversion value of the measured pressure. In the later stages, the analog WIKA S10 sensor was replaced with a digital BAR 30 depth sensor. The μC uses the data from the compass and IMU sensors as feedback while it sends the processed data to either the BeagleBone Black card or the control interface directly.

The CC is designed to control up to 8 DC and 8 servo motors for thrusters and manipulators. Servo motors can be driven directly by the PWM outputs of the μC. The outputs reserved for servo motors are used for the electronic speed controllers of applications, such as brushless DC motors, which require high power. H-bridge drivers are designed to control the speed and direction of the DC motors.

The USART, I2C, and 1-Wire communication protocols are implemented for communication among the microcontrollers, sensors, and the interface.
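Equation (20) converts the raw 12-bit ADC reading into centimeters; the sketch below is a direct transcription of that equation (the 3.3 V reference, 0.4 V offset, and 625 cm/V scale come from the equation itself, not from the sensor datasheet).

```python
def depth_cm(adc: int) -> float:
    """Equation (20): 12-bit ADC count -> sensor voltage -> depth in cm.
    At 0.4 V the computed depth is zero, i.e. the sensor's zero-depth output."""
    voltage = adc * 3.3 / 4095.0   # 12-bit ADC with a 3.3 V reference
    return (voltage - 0.4) * 625.0

# Full-scale reading: (3.3 - 0.4) * 625 = 1812.5 cm of range above the offset
full_scale = depth_cm(4095)
```

The 625 cm/V slope implies roughly 0.5 cm of depth per ADC count, which matches the "high-precision" role the pressure sensor plays in the depth control loop.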

Figure 7. Underwater tests of the Lucky Fin UV + RA system, Kocaeli University Indoor Swimming Pool. (a) RA view from the UV indoor camera and (b) outdoor camera shots. UV: underwater vehicle; RA: robot arm.

Figure 8. Top view of the designed CC and locations of the main equipment. CC: control card.

The USART protocol allows long-distance communications at low speeds, whereas the I2C protocol allows fast data transmissions (Figure 10). The CC contains four USART lines, one of which is between the TCM3 and the STM32. The others are CPI–STM32, BeagleBone (BB)–STM32, and BB–CPI.

The CPI

The CPI, which is written in the C# programming language, is a computer user interface that interacts with the vehicle's hardware and allows monitoring and controlling of the UV in both manual and automatic modes. In the manual mode, the orientation of the vehicle is manipulated either

Figure 9. (a) Pressure sensor and (b) electronic compass.

Figure 10. General task schematic of the CC. CC: control card.
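The mixed USART/I2C traffic described above ultimately moves small telemetry records between the CC and the CPI. A hedged sketch of how one such record could be packed and parsed is given below; the field layout, byte order, and scaling factors are entirely assumptions for illustration, not the article's actual frame format.

```python
import struct

# Hedged sketch of one telemetry record of the kind the CC could send
# to the CPI over a USART line (pitch, yaw, depth, supply voltage).
# The little-endian layout and scaling factors are assumptions.

RECORD_FMT = "<hhhH"   # pitch, yaw (0.1 deg); depth (cm); voltage (mV)

def pack_record(pitch_deg, yaw_deg, depth_cm, supply_v):
    """Encode one record into bytes for transmission."""
    return struct.pack(RECORD_FMT,
                       int(round(pitch_deg * 10)),
                       int(round(yaw_deg * 10)),
                       int(round(depth_cm)),
                       int(round(supply_v * 1000)))

def unpack_record(frame):
    """Decode bytes back into engineering units."""
    p, y, d, v = struct.unpack(RECORD_FMT, frame)
    return p / 10.0, y / 10.0, d, v / 1000.0

frame = pack_record(-2.5, 51.3, 40, 12.0)
```

A fixed binary layout like this keeps the record small enough for the low-speed USART link while remaining trivial to log into the CPI database.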

by graphical CPI buttons or by a keyboard. The CPI can capture images and video, acquire data, and plot the status graphics of the UV (Figure 11).

The CPI supports USB cameras. However, because the cable distance between the computer and the UV is long, the image is transferred in analog form through the coaxial line of a non-buoyant cable. The received analog video image is converted to digital by means of a USB image capturing device. When the CPI initializes communication with the vehicle, it starts to record the vehicle data into a database file labeled with date and time information. The CPI includes options to monitor, graph, and report the recorded data, such as pitch, yaw, depth, heading, supply voltage, motor currents, and controller output.

Discussions on UV control applications

Some experiments are introduced in practice for educational purposes on the designed development platform. The on–off, PD, PID, and ANFIS control algorithms for the depth and heading control applications are prepared and uploaded to the STM32 mC.

Figure 11. The CPI of the UV. CPI: control program interface; UV: underwater vehicle.
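The algorithms above run as discrete loops on the STM32. A minimal sketch of such a discrete PD depth iteration is given below; the gains, sample time, and the ±100% PWM thruster mapping are illustrative assumptions, not the article's tuned values.

```python
# Minimal sketch of a discrete PD depth-control iteration of the kind
# run on the STM32. The gains (kp, kd), sample time, and the +/-100%
# PWM saturation are illustrative assumptions, not the article's values.

def pd_depth_step(setpoint_cm, depth_cm, prev_error, dt, kp=2.0, kd=0.8):
    """One PD iteration: returns (vertical thruster %PWM, current error)."""
    error = setpoint_cm - depth_cm          # positive error -> dive deeper
    derivative = (error - prev_error) / dt  # finite-difference D term
    u = kp * error + kd * derivative        # PD control law
    u = max(-100.0, min(100.0, u))          # saturate to +/-100% PWM
    return u, error

# Example: vehicle near the surface (10 cm), commanded to 40 cm
u, e = pd_depth_step(40.0, 10.0, prev_error=0.0, dt=0.1)
```

In the experiments the PD parameters were found by trial and error; in a sketch like this, kp and kd would be retuned the same way against the observed step response.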

Table 3. Transient-state analysis of the depth control response.

Depth control   %MO   tr (s)   ts (s)   ess (cm)
PD              17    6.7      12.3     About zero

PD: proportional–derivative; MO: maximum overshoot.

Depth control tests

The depth control application is completed by means of the PD method, whose parameters are tuned by trial and error. The step response of the UV to a 40 cm set value as the required depth is given in Figure 12. Because the thrusters' current load influences the supply voltage (12 V) of the analog pressure sensor, ripples are observed in the depth graphic; the supplies are separated in later CC versions. The UV dived from 10 cm (around the surface) to a 40 cm depth in 6.7 s. The percent maximum overshoot (%MO) is 17%. The settling time (ts) is around 12.3 s. A considerable steady-state error (ess) has not been observed (Table 3).

Figure 12. Step response of the UV depth control system.46 UV: underwater vehicle.

Heading control tests

The lateral pool is engaged to the TP for the heading (yaw) control application. The digital electronic compass (TCM3) and the horizontal thrusters are used for feedback and actuation, respectively. Within the heading control experiment, the UV is subjected to on–off and PD control tests (Figure 13).

In the initial phase of the test, the UV was positioned +50° clockwise from north. It is subjected to a shifting effect of around 10° to the left or right side and is expected to return to the set value. In the on–off control test, the UV oscillated (overshoots and undershoots) around the reference value and did not settle to it; for this reason, ts and ess could not be measured (NA). The response of the UV improved by fine-tuning the Kp and Kd

parameters. The procedure is applied likewise to the UV for PD heading control. Following three large ripples, the UV settled to the reference value in 52.5 s (Figure 13). A performance comparison of the two methods is given in Table 4. Consequently, the PD controller performed better than the on–off controller.

Figure 13. On–off and PD time responses of the heading control application.47 PD: proportional–derivative.

Table 4. Performance comparison for transient-state analysis of the heading control.

Heading control   MO (°)   tr (s)   ts (s)   ess (°)
On–off            12.2     2.7      NA       NA
PD                7.2      1.5      52.5     About zero

PD: proportional–derivative; MO: maximum overshoot.

Image processing for object tracking system

The analog camera and the BeagleBone Black minicomputer are replaced with a Raspberry Pi camera and a Raspberry Pi minicomputer, respectively. The UV control applications are carried out with the appropriate image processing methods: the Imutils, PiRGBArray, Deque, Argparse, RPi.GPIO, time, CV2, and NumPy libraries are utilized in a series of tasks, such as the camera interface, masking, object tracking, production of PWM data, image processing, various kinds of functions, and storing large amounts of position and RGB data in arrays. Within the scope of this application, the position changes of circular objects are determined by the Hough transform method, and the tracking process is implemented according to this position.41 The largest contour is determined after masking, enclosing the object center. A line is drawn between the origin of the UV camera and the central coordinates of the object. Then, the radius of the target is calculated by finding the inner side of the circle. If the diameter of this circle falls below a certain value, the horizontal thruster motors run proportionally. The control signal is a function of E in both the P and ANFIS methods:

U = f(E)   (21)

Proportional (P) control method. The control signal (U), proportional (Kp) to the error,37 adjusts both the surge velocity and the heading angle, where the error refers to the deviation (E) in pixel coordinates:

U = Kp E   (22)

The further the object moves from the origin, the larger the error becomes. According to this error, if the object is on the right side, the speed of the left horizontal thruster proportionally increases and the speed of the right horizontal thruster decreases; the opposite holds for the left side. E is determined as the difference between the central position coordinates of the object and the origin of the vehicle camera (Figure 14).

ANFIS control method. ANFIS has one input (E) and one output (U) with three rules, where E is the deviation from the origin of the Lucky Fin camera along the x-axis:

E = origin of UV camera (455 pixels) − origin of the object   (23)

Figure 14. Distance error calculated with respect to the center coordinates of the object.
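The proportional rule of equation (22) and the differential left/right thruster behavior described above can be sketched as follows; Kp, the base PWM, and the sign convention here are illustrative assumptions, not values from the article.

```python
# Sketch of the proportional heading rule U = Kp * E (equation (22))
# driving the two horizontal thrusters differentially. Kp, the base
# PWM, and the sign convention are illustrative assumptions.

def heading_thrust(error_px, kp=0.36, base_pwm=40.0):
    """Return (left, right) thruster %PWM for a pixel deviation E."""
    u = kp * error_px                     # U = Kp * E
    left = base_pwm + u                   # object to the right: left speeds up
    right = base_pwm - u                  # ... and the right thruster slows
    clamp = lambda v: max(-100.0, min(100.0, v))
    return clamp(left), clamp(right)

left, right = heading_thrust(50.0)        # object 50 pixels to the right
```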

Table 5. Fuzzy rules of ANFIS.

Rule number   Antecedent propositions (pixels)   Consequent propositions (% PWM)
1             IF E = Negative (N)                THEN U = Turn Left (TL)
2             IF E = Zero Grade (ZG)             THEN U = Stop (S)
3             IF E = Positive (P)                THEN U = Turn Right (R)

ANFIS: adaptive neuro-fuzzy inference system; PWM: pulse width modulation.
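The three rules of Table 5 can be sketched as a minimal Sugeno-style inference with Gaussian input MFs and singleton consequents; the centers, widths, and singleton %PWM values below are illustrative guesses, not the trained parameters of the article's model.

```python
import math

# Minimal Sugeno-style sketch of the three rules in Table 5: Gaussian
# membership functions on the pixel error E and singleton consequents,
# combined by a normalized weighted average. The centers, widths, and
# singleton %PWM values are illustrative guesses, not trained values.

def gauss(x, c, sigma):
    """Gaussian MF: exp(-(x - c)^2 / (2 * sigma^2))."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

RULES = [  # (MF center in pixels, MF width, singleton output in %PWM)
    (-275.0, 120.0, -100.0),   # E Negative   -> Turn Left
    (0.0,    120.0,    0.0),   # E Zero Grade -> Stop
    (275.0,  120.0,  100.0),   # E Positive   -> Turn Right
]

def anfis_output(e_px):
    """Normalized weighted-average inference for a pixel error E."""
    weights = [gauss(e_px, c, s) for c, s, _ in RULES]
    outputs = [u for _, _, u in RULES]
    return sum(w * u for w, u in zip(weights, outputs)) / sum(weights)
```

In a trained ANFIS, the centers and widths would be adjusted from the training data rather than fixed by hand as they are here.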

Figure 15. (a) Training and (b) testing responses for the trained FIS model for 10000 iterations. FIS: fuzzy inference system.

Figure 16. Trained input variable E.

While Table 5 is designed to demonstrate heading control, the basic rules for surge control are similar. The model is trained and tested with 37 training data and 20 testing data, respectively. The data indicate a linear relationship between the heading deviation [−275, 0, +275] and the motor drive PWM ratios [−100, 0, +100]. However, the camera lens renders the deviation larger at the center and smaller than it should be at the corners. In addition, the left motor stops under 20% PWM, and the torque of the right motor is slightly stronger (around 2%) for identical PWM values around zero grade. For this reason, the data set is improved by several trials and errors (Figure 15).

Negative outputs proportionally activate the left motor in the forward direction and the right motor in the reverse direction; in the opposite situation, the reverse operation applies. The trained MFs of E are given in Figure 16. The singleton MFs TL, S, and R are located at −275, 0, and +275 of the output variable U.

The related input is fuzzified using Gaussian MFs according to the trained parameters:

μA(x) = exp(−(x − c)²/(2σ²))   (24)

Here, c and σ are the parameters that determine the center and width of the Gaussian curve, respectively.48 The fuzzy inference system executes the fuzzification, production, normalization, and defuzzification processes (Figure 3). The ANFIS functions, which operate in each layer, can be extracted by programming codes step by

Figure 18. Heading control test results: PID and ANFIS control responses to the 20-pixel deflection (E). PID: proportional–integral–derivative; ANFIS: adaptive neuro-fuzzy inference system.

Figure 17. Object tracking in tests.
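The thresholds in the object-tracking algorithm of Figure A3 can be sketched as plain decision logic: the pixel offset of the detected circle center from the camera origin selects the heading command, and the circle radius selects the surge command. The 455-pixel origin and the ±2-pixel and 58/62-pixel bands come from the article and its flowchart; the sign convention here is an assumption.

```python
# Sketch of the decision logic in the object-tracking algorithm
# (Figure A3). The 455-pixel camera origin and the threshold bands are
# from the article; the sign convention is an assumption.

CAMERA_ORIGIN_X = 455       # x-coordinate of the UV camera origin (pixels)

def heading_command(center_x):
    e = center_x - CAMERA_ORIGIN_X      # deviation E in pixels
    if e > 2:
        return "turn_right"
    if e < -2:
        return "turn_left"
    return "hold"

def surge_command(radius_px):
    if radius_px > 62:                   # target too close -> back off
        return "go_backward"
    if radius_px < 58:                   # target too far -> approach
        return "go_forward"
    return "stop"
```

With the example output below (radius 44, center x = 455), this logic would hold the heading and drive forward until the radius grows into the 58–62 pixel band.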

Table 6. Performance comparison of visual-based P and ANFIS heading control.

Visual heading control   MO (pixels)   tr (s)   ts (s)   ess (pixels)
P                        14.65         0.9      3.2      0.05
ANFIS                    10.2          0.9      3        0.05

ANFIS: adaptive neuro-fuzzy inference system; MO: maximum overshoot.

step.38 The execution of the fuzzy rules for an example input, E = 146 pixels, is illustrated in Figure A1.

According to E, the motor direction and the PWM ratio applied to the motors are adjusted. Out-of-water tests validate the vehicle's response to the object with the appropriate PWM values. The values captured by the camera are transferred to the computer via Wi-Fi (Figure A2), and the underwater tests of the UV were successfully accomplished. The vehicle is computer-controlled and able to move autonomously by the CC. As a result of these operations, the image of the object in the camera is shown in Figure 17. The location information output by the program is as follows:

$ python3 object_tracker.py
RADIUS = 44
CENTER = [455, 478]

This output means that the radius of the target is reported to the frame interface as 44 pixels and that the center coordinates of the target are N(x, y) = (455, 478) (Figure 17). The related control algorithm is given in Figure A3. The object tracking program controls two parameters: heading control refers to the deflection E from the target, and surge control refers to the distance E from the target. The PWM rates of the motors reacting to the E signals are the same for both control procedures. The direction of rotation is identical for surge control but reversed for heading control. Experimental results of the heading control tests for the PID and ANFIS methods are given in Figure 18.49

Feedback control by image processing, in general, reduces E, and reasonable results have been achieved for the object tracking operations. The controller settles the UV toward the direction of a target in under 3.5 s for both methods (Table 6). The maximum overshoot in the ANFIS control is lower than in the PID control. In addition, settling is achieved earlier in the ANFIS control than in the PID control.

Image processing tests for the UV + manipulator object capturing system

OpenCV and real-time image processing libraries are installed on the Raspberry Pi. The application starts with the detection of the coordinate points of the object and the end effector in the image and the calculation of the distance between them. The distance is calculated on a pixel basis in the two-dimensional image space and is converted to cm. The image captured by the camera is in RGB space and is transformed into HSV space. The object and the end effector of the manipulator RA are labeled in blue and red, respectively. The threshold values of the red and blue colors in HSV color space have already been determined in the image. Original and thresholded images are shown in Figure 19. The coordinates of the end effector and the target object are determined and used to drive the motors. The pixel distance between the two points, |AB|, is accepted as an error that should be reduced by proportional control:

|AB| = √((x2 − x1)² + (y2 − y1)²)   (25)

Considering the x-axis, the coordinate of the end effector marked in red is x1 and the coordinate of the target marked in blue is x2, where y1 and y2 are the coordinates of the end effector and the target along the y-axis, respectively. The camera feeds |AB| back to the servomotors by (22), where Kp is selected as 0.7. In this application, the UV was not moved except for the manipulator. Details of the vision-based manipulator control on the UV are illustrated in Figure A.4. Experimental |AB| distance data versus time are shown in Figure 20. The manipulator approaches the

Figure 19. Positions of the object (blue) and end effector (red) in the original images and after the thresholding process.44
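The error of equation (25) and its proportional reduction with the article's Kp = 0.7 can be sketched as follows; the pixels-to-cm scale factor is an assumed value, since the calibration is not given in this excerpt.

```python
import math

# Sketch of the vision-based manipulator error of equation (25): the
# pixel distance |AB| between the red end-effector marker (x1, y1) and
# the blue object marker (x2, y2), reduced proportionally with the
# article's Kp = 0.7. The pixels-to-cm scale is an assumed value.

def ab_distance(p1, p2):
    """|AB| = sqrt((x2 - x1)^2 + (y2 - y1)^2), in pixels."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)

def servo_correction(p_effector, p_object, kp=0.7, cm_per_px=0.05):
    """Proportional servo command (cm) from the pixel error, as in (22)."""
    return kp * ab_distance(p_effector, p_object) * cm_per_px

d = ab_distance((100, 200), (103, 204))   # a 3-4-5 triangle: 5 pixels
```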

object in 20 s. However, the end effector could not capture the object in the underwater tests.

Figure 20. Approaching distance, |AB|, of the end effector to the target.44

Tests for estimation of some dynamic parameters

The hydrodynamic effects resulting from the movement of the Lucky Fin UV and the M matrix parameters of the UV dynamic mathematical model are examined in both simulation and experimental tests. The results are insufficient; however, they will lead to the construction of proper test setups for the future determination of the full dynamic parameters of the Lucky Fin. In this application, CFD analysis is used to determine the MRB parameters of the Lucky Fin, whereas experimental tests are applied to determine the MA parameters. The SolidWorks CAD model of the Lucky Fin is divided into finite small elements, and the k-ε turbulence model is extracted,50 which is used in the CFD analysis. The vehicle is analyzed for MRB and the hydrodynamic inertial moments in three axes by means of flow simulation and ANSYS Fluent CFD software (Figure 21). Referring to the simulation results, MRB is as follows34

Figure 21. (a) Vehicle body mesh structure and (b) example CFD analysis response of Lucky Fin to liquid flow in the x (surge) direction.
CFD: computational fluid dynamics.

MRB = diag(34.342, 34.342, 34.342, 6.17, 6.20, 0.76)

Some tests of the Lucky Fin were conducted on the TPs in the Electronics and Telecommunication Engineering Department of Kocaeli University. Two different forces (9 N and 4 N, respectively) are applied to the UV in the first and second halves of the surging movement. The corresponding accelerations are measured as 0.2 and 0.022 m/s², respectively (Figure 22). The observed velocities of the vehicle were 0.12 m/s and 0.06 m/s.

Figure 22. Xu̇ tests in the surge direction.

The Xu̇ added mass relates the change in UV acceleration to the change of thruster force in the x-direction, and it is determined as 0.281 kg (equation (12)). Similar test results are observed in the heave direction (Figure 23). The same forces are applied by the vertical motors, and the accelerations are observed as 0.002 and 0.08 m/s². The velocities are 0.11 m/s and 0.24 m/s. The Zẇ added mass is the change in UV acceleration with respect to the change of thruster force in the z-direction (equation (12)) and is calculated as 0.833 kg.

Figure 23. Zẇ tests in the heave direction.

Conclusion and future works

A UV platform is designed for research and development purposes in the Kocaeli University Engineering Faculty. Students have implemented their projects and combined theory with applications of the concepts taught in the Department of Electronics and Telecommunication Engineering. The modeling and design phases of the UV platform are given, followed by some experimental developments that illustrate the transition efforts toward autonomy.

However, the primary purpose of the experiments was not optimization. Students can develop novel methods on the platform. For instance, a student group from Kocaeli University developed a two-axis manipulator arm, which was attached to the UV. The UV with the manipulator system joined the MATE ROV 2016 International Competition. Another student group developed image processing applications in which the UV tracked an object using the ANFIS and PID heading control methods. In these studies, the performance of the ANFIS control method was higher than that of the PID control method.

In future works, the UV is planned to be put into practice in several semi-autonomous applications, such as line tracking, path planning and tracking, diving buddy (diving assistance), teleoperations, and the inspection of environmental pollution in receiving water bodies. By means of the TP, a series of force–moment tests could be applied to the UV. This will result in finding the hydrodynamic parameters of a vehicle. These parameters would allow simulations in CFD programs.16 The dynamic model of a low-speed UV could be obtained by means of CFD analysis.17 The UV measures some parameters, such as orientations and the corresponding velocities, referring to the body frame. These values should be converted by the kinematic model and interpreted into the Earth-fixed parameters on the shore computer. Some of the UV dynamic parameters are determined according to simulations and experimental tests.

Onboard CC design, algorithms developed for depth, heading, target tracking, grasping control and image

processing, kinematic and dynamic modeling for future simulation, and controller optimization studies bring about partial autonomy to conventional ROVs.

Acknowledgment

The author would like to thank Ayako Tajima for her contribution.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

ORCID iD

Serhat YILMAZ https://orcid.org/0000-0001-9765-7225

Supplemental material

Supplemental material for this article is available online.

References

1. Choi J, Lee Y, Kim T, et al. Development of a ROV for visual inspection of harbor structures. In: IEEE underwater technology (UT) conference, Busan, South Korea, 21–24 February 2017, pp. 1–4. New York: IEEE.
2. Miorim F, Rocha F, and De Tomi G. Mobile control center: a new approach to deploy ROV-oriented education and training. In: OCEANS'15 MTS/IEEE conference, Washington DC, USA, 19–22 October 2015, pp. 1–4. New York: IEEE.
3. Fletcher B and Harris S. Development of a virtual environment based training system for ROV pilots. In: OCEANS 96 MTS/IEEE conference proceedings, Fort Lauderdale, FL, USA, 23–26 September 1996, pp. 65–71. New York: IEEE.
4. Eng YH, Teo KM, Chitre M, et al. Online system identification of an autonomous underwater vehicle via in-field experiments. IEEE J Ocean Eng 2016; 41/1: 5–17.
5. Wang CC, Lin YR, and Chen HH. Accurate altitude control of autonomous underwater vehicle. In: IEEE international underwater technology symposium (UT), Tokyo, Japan, 5–8 March 2013, INSPEC Accession No. 13530811, pp. 1–3. New York: IEEE.
6. Inoue T, Shibuya K, and Nagano A. Underwater robot with a buoyancy control system based on the spermaceti oil hypothesis - development of the depth control system. In: IEEE/RSJ international conference on intelligent robots and systems, Taipei, Taiwan, 18–22 October 2010, pp. 1102–1107. New York: IEEE.
7. DeBitetto PA. Fuzzy logic for depth control of unmanned undersea vehicles. IEEE J Ocean Eng 1995; 20/3: 242–248.
8. Kim H and Shin Y. Design of adaptive fuzzy sliding mode controller using FBFE for UFV depth control. In: SICE-ICASE international joint conference, Bexco, Busan, South Korea, 18–21 October 2006, pp. 3100–3103. New York: IEEE.
9. Song F, An E, and Folleco A. Modeling and simulation of autonomous underwater vehicles: design and implementation. IEEE J Ocean Eng 2003; 28/2: 283–296.
10. Wang JS and Lee CSG. Self-adaptive recurrent neuro-fuzzy control of an autonomous underwater vehicle. IEEE Trans Rob Autom 2003; 19/2: 283–295.
11. Shi X, Xiong H, Wang C, et al. A new model of fuzzy CMAC network with application to the motion control of AUV. In: Proceedings of IEEE international conference on mechatronics and automation, Niagara Falls, ON, Canada, 29 July–1 August 2005, pp. 2173–2178. New York: IEEE.
12. Fernandes DA, Sørensen AJ, Pettersen KY, et al. Output feedback motion control system for observation class ROVs based on a high-gain state observer: theoretical and experimental results. Control Eng Pract 2015; 39: 90–102.
13. Manu DK and Karthik P. Development and implementation of AUV for data acquisition and image enhancement. IJEAT 2020; 9/4: 2088–2093.
14. Jung J, Lee Y, Kim D, et al. AUV SLAM using forward/downward looking cameras and artificial landmarks. In: Conference on IEEE underwater technology (UT), Busan, Korea (South), 21–24 February 2017, pp. 1–3. New York: IEEE.
15. Sonka M, Hlavac V, and Boyle R. Image processing, analysis, and machine vision. 3rd ed. Toronto: Thompson Learning, 2008.
16. Satria D, Wiryadinata R, Esiswitoyo DPA, et al. Hydrodynamic analysis of remotely operated vehicle (ROV) observation class using CFD. In: The international conference on aerospace and aviation, IOP conference series: materials science and engineering, 2019, p. 645.
17. Jain A, Chandra RN, and Kumar M. Design and development of an open-frame AUV: ANAHITA. In: IEEE/OES autonomous underwater vehicle workshop (AUV), Porto, Portugal, 6–9 November 2018, pp. 1–5. New York: IEEE.
18. Bokser V, Oberg C, Sukhatme GS, et al. A small submarine robot for experiments in underwater sensor networks. IFAC Proceedings Volumes 2004; 37(8): 239–244.
19. Hong EY, Soon HG, and Chitre M. Depth control of an autonomous underwater vehicle, STARFISH. In: OCEANS'10, Sydney, NSW, Australia, 24–27 May 2010. New York: IEEE.
20. Trebi-Ollennu A and White BA. A robust nonlinear control design for remotely operated vehicle depth control systems. In: UKACC international conference on control '96, Exeter, UK, Publ. No. 427, Volume 2, 2–5 September 1996, pp. 993–997. New York: IET.
21. Choi H-T and Lee Y. Highly accurate motion control system for omni-directional underwater robot. In: Oceans, Hampton Roads, VA, USA, 14–19 October 2012.
22. Yang R, Clement B, Mansour A, et al. Robust heading control and its application to Ciscrea underwater vehicle. In:

OCEANS 2015, Genova, Italy, 18–21 May 2015. New York: IEEE.
23. Palomeras N, Nagappa S, Ribas D, et al. Vision-based localization and mapping system for AUV intervention. In: 2013 MTS/IEEE OCEANS, Bergen, Norway, 10–13 June 2013.
24. Mangipudi CP and Li PY. Vision based passive arm localization approach for underwater ROVs using a least squares on SO(3) gradient algorithm. In: American control conference (ACC), Philadelphia, USA, 10–12 July 2019, pp. 5798–5803. New York: IEEE.
25. Zhou C, Liu Y, and Song Y. Detection and tracking of a UAV via Hough transform. In: 2016 CIE international conference on radar (RADAR), Guangzhou, China, 10–13 October 2016. New York: IEEE.
26. Cai M, Wang Y, Wang S, et al. Grasping marine products with hybrid-driven underwater vehicle-manipulator system. IEEE Transact Automat Sci Eng 2020; 17(3): 1443–1454.
27. Hu Z, Zhu X, Tu D, et al. Manipulator arm interactive control in unknown underwater environment. In: 2nd world conference on mechanical engineering and intelligent manufacturing (WCMEIM), Shanghai, China, 22–24 November 2019, pp. 494–497.
28. Nishida Y, Ahn J, Sonoda T, et al. Benthos sampling by autonomous underwater vehicle equipped a manipulator with suction device. In: 2019 IEEE underwater technology (UT), Kaohsiung, Taiwan, 16–19 April 2019, pp. 1–4. New York: IEEE.
29. Eidsvik OA. Identification of hydrodynamic parameters for remotely operated vehicles. Master Thesis, Norwegian University of Science and Technology, Department of Marine Technology, Norway, 2015.
30. Mitra A, Panda JP, and Warrior HV. The effects of free stream turbulence on the hydrodynamic characteristics of an AUV hull form. Ocean Eng 2019; 174: 148–158.
31. Siciliano B, Sciavicco L, and Villani L. Robotics: modelling, planning and control. 1st ed. London: Springer Verlag, 2009.
32. Antonelli A. Underwater robots, motion and force control of vehicle-manipulator systems. 2nd ed. Berlin Heidelberg: Springer Verlag, 2006.
33. Dinç M and Hajiyev C. Integration of navigation systems for autonomous underwater vehicles. J Mar Eng Technol 2015; 14: 32–43.
34. Kılcı SB. Dynamic modeling and simulation studies of four degrees of freedom underwater vehicle. Master Thesis, Kocaeli University Institute of Science, Turkey, 2020.
35. Dukan F. ROV motion control systems. PhD Thesis, Norwegian University of Science and Technology, Norway, 2014.
36. Vervoort JHAM. Modeling and control of an unmanned underwater vehicle. Master Thesis, University of Canterbury, UK, 2009.
37. Dorf RC and Bishop RH. Modern control systems. 10th ed. New York, NY: Pearson Prentice Hall, 2005.
38. Yilmaz S, Ilhan R, and Feyzullahoğlu E. Estimation of adhesive wear behavior of the glass fiber reinforced polyester composite materials using ANFIS model. J Elastom Plast 2021; 54: 86–110.
39. Mesbahi AH, Semnani D, and Khorasani SN. Performance prediction of a specific wear rate in epoxy nanocomposites with various composition content of polytetrafluoroethylene (PTFE), graphite, short carbon fibers (CF) and nano-TiO2 using adaptive neuro-fuzzy inference system (ANFIS). Compos Part B 2012; 43: 549–558.
40. Xu Z, Baojie X, and Guoxin W. Canny edge detection based on OpenCV. In: ICEMI'2017 - IEEE 13th international conference on electronic measurement & instruments, Yangzhou, China, 20–23 October 2017, pp. 53–56. New York: IEEE.
41. Illingworth J and Kittler J. A survey of the Hough transform. Comput Vis Graph Image Process 1988; 44/1: 87–116.
42. Lee YK. Commutation torque ripple minimization for three phase BLDC motor drive using a simple PWM scheme reliable to motor parameter variation. In: IEEE PES Asia-Pacific power and energy engineering conference (APPEEC), Macao, China, 1–4 December 2019, pp. 1–4. New York: IEEE.
43. Hu Z, Zhu X, Tu D, et al. Manipulator arm interactive control in unknown underwater environment. In: 2nd world conference on mechanical engineering and intelligent manufacturing (WCMEIM), Shanghai, China, 22–24 November 2019, pp. 494–497. New York: IEEE.
44. Yılmaz S and Kılcı SB. Image processing applications for 6 degrees of freedom (DOF) underwater vehicle and manipulator system. imctd 2020; 1: 63–79.
45. Yakut M, Yılmaz S, Ince S, et al. Derinlik ve yön kontrol uygulamaları için sualtı aracı tasarımı (underwater vehicle design for depth and direction control applications). GU J Sci Part C 2015; 3: 343–355.
46. Ergan AF. Design and implementation of hardware and user interface for underwater test platform. Master's Thesis, Kocaeli University Institute of Science, Turkey, 2014.
47. Yılmaz S, Yakut M, and Ergan AF. Sualtı deney platformu için kontrol kartı tasarımı (control board design for underwater experiment platform). In: Turkish national committee of automatic control (TOK) proceedings, Kocaeli, Turkey, 11–13 September 2014, paper no. 147, pp. 746–750. İstanbul: TOK.
48. Yilmaz S. Bulanık mantık ve mühendislik uygulamaları (fuzzy logic and its engineering applications). Kocaeli University Press, 2007. Pub. No: 289, ISBN: 978-605-4158-00-3.
49. Ofluoğlu AA. Object tracking in underwater vehicles with ANFIS method. Master Thesis, Kocaeli University Institute of Science, Turkey, 2021.
50. Yılmaz G and Yılmaz S. Used methods for determine hydrodynamic parameters in unmanned underwater vehicles (UUV). In: Tokyo Summit-III 3rd international conference on innovative studies of contemporary sciences, Tokyo, Japan, 19–21 February 2021, pp. 696–705.

APPENDIX

Figure A.1. Execution of fuzzy rules.

Figure A.2. Camera data sent to a computer by the Wi-Fi module of STM32.

[Figure A.3 flowchart: load the libraries and establish the configuration; initialize the camera interface; on each frame, detect the target with the Hough algorithm and define the enclosing circle and its center; heading control, U = f(E): turn right or left while the pixel distance error is outside a ±2-pixel band; surge control, U = f(E): go backward if the circle radius exceeds 62 pixels and go forward if it is below 58 pixels; stop the motors when the required distance to the target is reached; finalize the program when the ESC button is pressed.]

Figure A.3. Algorithm of UV object tracking control by image processing. UV: underwater vehicle.

[Figure A.4 flowchart: on the Raspberry Pi, OpenCV color filtering detects the end effector and the object; |AB| = end-effector position − object position (cm); once |AB| < 20 cm, Servo-1 progresses along the heading angle until |AB| reaches the shortest distance; Servo-2 then progresses so that the end effector approaches the object in the x direction; when x is within the desired distance, Servo-3 grasps; the grasp is stopped when the current sensor reading exceeds the current limit; servo commands are sent over UART.]

Figure A.4. Flowchart of vision-based manipulator control.44
