Serhat YILMAZ
Abstract
The design of underwater unmanned vehicles is an interdisciplinary study that includes several fields, such as computational fluid dynamics, modeling and control of systems, robotics, image processing, and electronic card design. The operational cost of such vehicles is high because they depend on variable fluid properties, such as salinity, and must remain mobile under harsh environmental conditions, such as high undersea pressure. The study describes an operating platform, called Lucky Fin, on which students can develop various control algorithms and can test and extract hydrodynamic parameters of the underwater vehicle. The platform consists of an underwater vehicle and two testing tanks. The control card, the user control program interface, and a manipulator arm are designed to be used for a series of control applications such as depth, heading, target tracking, and capturing. The results of several tests are illustrated in this article.
Keywords
Control card, user interface, depth control, heading control, robotic vision, CFD, manipulator, underwater vehicle
Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License
(https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without
further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/
open-access-at-sage).
2 International Journal of Advanced Robotic Systems
students to develop and share new skills in underwater technology and related applications. For this purpose, a depth control system has been developed for a microprocessor-based UV.6 The depth information obtained by the pressure sensor is compared with the desired depth, and the error is eliminated by means of controlling vertical motors. DeBitetto7 has used fuzzy controllers instead of traditional control methods in a similar study. The pitch angle is controlled using depth error information and its derivative. In the mentioned study, the error and its derivative are described by verbal variables, and each variable is separated into fuzzy membership functions (MFs).

Using the sliding mode observer and Kalman filter, Kim and Shin8 have determined the hydrodynamic coefficients of a vehicle with six degrees of freedom (DOF). They have extracted the block diagram model of the vehicle and have simulated its behavior for various reference depth and direction values. In these studies, various models related to the applications are used as UV models. Some of these UVs are cable-connected ROVs, while the others are AUVs. The vehicles in the latter group usually communicate with the surface system by means of acoustic modems. Both traditional9 and novel10,11 control techniques are applied to provide autonomous motion for UVs. Fernandes et al.12 have managed to control a UV in four DOFs (yaw, sway, surge, and heave), and they have designed a controller to track an appropriate and smooth reference path with unmodelled plant dynamics, varying UV parameters, and the existence of environmental noise and disturbances.

Image acquisition from UVs, acoustic transmission, and image processing are also significant subjects in this field. In a case study, images acquired from the UV are improved using Canny edge detection and Hue, Luma, and Saturation algorithms.13 AUVs are generally equipped with at least one optical camera to acquire visual information about underwater locations. It is also essential for the estimation of AUV position based on target underwater objects.14 Object tracking algorithms are used in many different areas, such as security systems, autonomous vehicles, robots, and traffic cameras. The common point is the defining and updating of the position of image frames in a video.15

Traditional parameter estimation methods, which have been used to improve UV hydrodynamics, are towing experiments and pool tests. Building and operating towing tests for real-size vehicles is quite an expensive method.9 These experiments give very realistic results when vehicle sensors are sensitive. However, changing flow conditions in experimental studies make it difficult to get reasonable and precise results. Experimental studies on the determination of UV hydrodynamics require such a long period that they delay research and development processes. Computational fluid dynamics (CFD) is an effective computer simulation and design method for determining the hydrodynamic model parameters of UUVs. Therefore, the usage of computers and CFD has been emphasized. The hydrodynamic behavior of a vehicle could be extracted by fluid dynamic imaging and CFD simulations.16 In a case study, the mathematical model of a low-speed AUV was extracted by means of CFD analysis for two critical hydrodynamic parameters, which are the added mass (MA) and damping matrices.17

The mechanical, electronic, and software designs of a UV test platform (TP), which has been developed by the Kocaeli University Electronics and Telecommunications Department between 2013 and 2020 and is called the Lucky Fin Project, are introduced in this study. The platform is composed of a UV and lateral and vertical test tanks. Although the basic concepts are preserved, sensors, cameras, battery types, microcontrollers, and card designs have been modified in line with the development stages. Basic movements, such as diving, keeping the desired depth level or position angle, and tracking–grasping a target object, are provided by control systems. To achieve this kind of robust autonomy, the kinematic and dynamic behaviors of the vehicle should also be determined and modeled. P (proportional), PD (proportional–derivative), PID (proportional–integral–derivative), and ANFIS (adaptive neuro-fuzzy inference system) based controllers, which can respond to the desired commands within certain performance criteria, have been developed and tested on the platform. Some actual system parameters are identified and adjusted to improve the design. As the platform serves as a testbed for a set of experiments, the electronics and telecommunication engineering students can develop applications on a wide range of topics that they have learned during the lectures. Depth and heading control applications of the UV have been carried out and interpreted in the laboratory tests. In addition, image processing tests on target tracking and object capturing operations are also included. The UV itself measures some parameters, such as orientations and corresponding velocities, referring to the body frame. These values should be converted by the kinematic model and interpreted into the Earth-fixed parameters of the shore computer. Some of the UV dynamic parameters are determined according to the simulations and experimental tests.

The study considers the benefits and contributions of designing a framework, Lucky Fin, for educational-purpose UV development and testing. In doing so, it was inspired by some guiding works. Modest improvements and contributions are made to some of these studies in terms of equipment, design, and method, and the experimental setup emerged. An underwater robot experimental testbed was developed for depth control and temperature measurement.18 A piston actuator moves up and down to adjust the sinking level of the robot in a glass rectangular aquarium tank. The piston as an actuator has deadzone and deadband nonlinearities, which cause errors, oscillations, and instability. Accurate measurement takes 3 min, and a 60 cm depth is reached with 1.55 cm deviations. The Lucky Fin Project was influenced by the principle of this
work and aimed to improve it. The mechanical design of the passive diving robot, which consists of a tube and a piston, has been replaced by a UV with four DOFs, propellant motors, a camera, and sensorial equipment. Heave-surge and roll-heading movements are applied in cylinder-shaped horizontal and lateral tanks, respectively. More accurate depth control is achieved, as given in the "Discussions on UV control applications" section.

Various control methods have been proposed in the literature over the years. Sliding mode controllers (SMCs) are the common heading, depth, pitch, and yaw control methods applied to AUV systems.19,20 Non-linear characteristics of thruster brushless motors with a dead zone result in heading error. When the error goes beyond a definite rate, the controller treats it as a step disturbance, and overshoots in the transient response could reach around five degrees.21 Unmeasurable conditions, sensory noise, electromagnetic effects, and signal delays also degrade the stability and performance of AUV heading control.22 The high performance and high accelerations of brushless motors are of course indisputable. However, the motor driver design of Lucky Fin provided slower but precise and stable responses of DC motors within narrow angular ranges. A highly precise calibrated TCM3 electronic compass is watertightly insulated and located at the top of the vehicle. This location is outside of the electronic control hull and is away from the power layer and metal parts (Figure 9(b)). These adjustments make the vehicle less susceptible to electromagnetic disturbances and provide precise control of Euler angles.

Vision-based positioning and tracking of an AUV have been used successfully in clean environments under restricted distances.23 Specifically, AUVs equipped with robotic arm manipulators could execute high-resolution computer vision-based tasks, such as detecting and capturing objects within reachable distances of the manipulator.24 Image processing algorithms are addressed in real-time underwater visual tracking operations. The Hough algorithm proposed for localization operations increases recognition capability on pipelines.25 Computational intelligence methods, such as artificial neural network based adaptive control systems, fuzzy cerebellar model articulation controller,11 and self-adaptive recurrent neuro-fuzzy control10 methods, improve the tracking performance and dynamic motion response of AUVs. Lucky Fin's object tracking task is accomplished with vision-based direction and surge control. While PD and ANFIS algorithms are performed for the mentioned control operations, the Hough algorithm is used for object detection. Autonomous systems consisting of a UV and a robot arm (RA) are designed to track underwater objects and detect their positions precisely. The object is detected by image processing, and the manipulator approaches the object with methods such as fuzzy control.26 Some manipulators are controlled manually,27 and in others, the object is captured with a fixed arm24 or by hovering and vacuuming the object.28 In this study, image processing and axial control algorithms are prepared on Raspberry Pi for detecting an object from the camera and capturing it with a manipulator. The mobility of the manipulator increases the DOF of Lucky Fin by two degrees and provides more flexible approach conditions to the object. The control card (CC) and software are original and open to development.

CFD is a convenient method for parameter extraction of the UUV hydrodynamic model.29 Control of movements at low speeds, such as sway, heave, surge, and heading, can be precisely modeled and simulated in terms of hydrodynamic coefficients, such as added mass (MA), drag (CD), and lifting (CL) coefficients, and controller parameters. The combination of CFD simulation with free streaming turbulence models and experimental tests improves parameter identification.30 However, confirmation of the hydrodynamic parameters obtained in the simulations with experimental tests is rare in the literature. Lucky Fin is a particularly designed modular type of UV that has four DOF, and the relevant computer-aided design (CAD) model is extracted for the first time. Cost-effective confirmatory experimental methods that are worthy of reference in testing complex-shaped modular UVs are included in the study.

Overview of the UV TP

For the mentioned control purposes, a TP consisting of a UV, a CC, a control program interface (CPI) software, and two simulation tanks was designed and implemented (Figure 1). Depth is measured by a high-precision pressure sensor, whereas heading is determined by a three-axis electronic compass. The depth and heading control applications are applied in the horizontal tank (Figure 1(a)) and lateral tank (Figure 1(b)), respectively.

The CC in the UV and the CPI on the computer utilize the Universal Synchronous/Asynchronous Receiver Transmitter (USART) communication protocol. A CCD camera is set on the UV for monitoring and image processing. The CC drives two vertical and two horizontal thrusters and, optionally, the servomotors of a three DOF manipulator. Principally, the CC is designed for both remote and autonomous control applications. Thus, the CC operates the vehicle as an open- or closed-loop control system. In open-loop mode, control commands sent by the operator via the CPI are transmitted to the vehicle's actuators, and data from the sensors are monitored on the CPI. The closed-loop mode processes the sensory data and produces the control signals. Features of the vehicle are given in Table 1.

Methods used in the UV system design

Kinematic modeling of the UV

The position and orientation of vehicles on three axes are calculated according to the kinematic model of the vehicle. Orientations measured by the TCM3 electronic compass in
Figure 1. (a) Depth and (b) heading control applications in the TP. TP: test platform.
Table 1. UV features.

Length (cm): 46
Width (cm): 42.5
Cross size (cm): 47
Height (cm): 30
Weight (kg): 13
Power of each motor (W): 30
the vehicle should be converted by the Jacobian matrix to be interpreted by the operator located next to the shore computer. The pulse width modulation (PWM) value of each motor is determined by referring to its target axes. These equations are utilized in three-dimensional model simulations by means of Matlab, and the equations are embedded into the CPI software for online orientation control applications. The most used kinematic representations in these applications are Euler angles and quaternions.31

Euler angles. The origin of the UV coordinate system is rotated until it is matched with the earth coordinate system's origin (x, y, z) (Figure 2). A compact representation of the Jacobian matrix for the kinematic equations is written as

J = \begin{bmatrix} J_1(\phi, \theta, \psi) & 0 \\ 0 & J_2(\phi, \theta, \psi) \end{bmatrix}   (1)

J1 and J2 are given in (2)

J = \begin{bmatrix}
c\theta c\psi & -s\psi c\phi + s\phi s\theta c\psi & s\psi s\phi + c\phi s\theta c\psi & 0 & 0 & 0 \\
c\theta s\psi & c\psi c\phi + s\phi s\theta s\psi & -c\psi s\phi + c\phi s\theta s\psi & 0 & 0 & 0 \\
-s\theta & s\phi c\theta & c\phi c\theta & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & s\phi t\theta & c\phi t\theta \\
0 & 0 & 0 & 0 & c\phi & -s\phi \\
0 & 0 & 0 & 0 & s\phi / c\theta & c\phi / c\theta
\end{bmatrix}   (2)

where c, s, and t are abbreviations of cosine, sine, and tangent, respectively. The Jacobian matrix matches the Cartesian position vector with the angular Euler vector, which is given in terms of SNAME32
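As a worked illustration, the block-diagonal Jacobian of (1) and (2) can be evaluated numerically. The following is a minimal NumPy sketch, assuming the ZYX Euler convention shown in (2); the function name and sample values are illustrative and not taken from the CPI source:

```python
import numpy as np

def jacobian(phi, theta, psi):
    """6x6 kinematic Jacobian of (1)-(2), ZYX Euler convention; maps the
    body-frame velocity vector [u, v, w, p, q, r] to Earth-fixed rates."""
    c, s, t = np.cos, np.sin, np.tan
    J1 = np.array([
        [c(theta)*c(psi), -s(psi)*c(phi) + s(phi)*s(theta)*c(psi),  s(psi)*s(phi) + c(phi)*s(theta)*c(psi)],
        [c(theta)*s(psi),  c(psi)*c(phi) + s(phi)*s(theta)*s(psi), -c(psi)*s(phi) + c(phi)*s(theta)*s(psi)],
        [-s(theta),        s(phi)*c(theta),                         c(phi)*c(theta)],
    ])
    J2 = np.array([
        [1.0, s(phi)*t(theta),  c(phi)*t(theta)],
        [0.0, c(phi),          -s(phi)],
        [0.0, s(phi)/c(theta),  c(phi)/c(theta)],
    ])
    J = np.zeros((6, 6))
    J[:3, :3] = J1   # linear velocities
    J[3:, 3:] = J2   # angular rates
    return J

# Surge at the measured 0.12 m/s while heading 90 degrees: the body-frame
# velocity appears as motion along the Earth-fixed y-axis.
eta_dot = jacobian(0.0, 0.0, np.pi / 2) @ np.array([0.12, 0, 0, 0, 0, 0])
```

At zero roll, pitch, and yaw the matrix reduces to the identity, which is a quick sanity check of the sign conventions in (2).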
Figure 2. Illustration of the Lucky Fin UV elementary motions and frames on the 3D model. UV: underwater vehicle; 3D: three-dimensional.
because most vehicles operate at low speeds.35 When the UUV is symmetrical in all planes and the origin of the body reference frame is located at the center of gravity, in other words, r_c = [0\; 0\; 0]^T, (8) would be reorganized as

M_{RB} = \begin{bmatrix} m & 0 & 0 & 0 & 0 & 0 \\ 0 & m & 0 & 0 & 0 & 0 \\ 0 & 0 & m & 0 & 0 & 0 \\ 0 & 0 & 0 & I_{xx} & 0 & 0 \\ 0 & 0 & 0 & 0 & I_{yy} & 0 \\ 0 & 0 & 0 & 0 & 0 & I_{zz} \end{bmatrix}   (10)

M_A is given as follows

M_A = \begin{bmatrix} X_{\dot{u}} & X_{\dot{v}} & X_{\dot{w}} & X_{\dot{p}} & X_{\dot{q}} & X_{\dot{r}} \\ Y_{\dot{u}} & Y_{\dot{v}} & Y_{\dot{w}} & Y_{\dot{p}} & Y_{\dot{q}} & Y_{\dot{r}} \\ Z_{\dot{u}} & Z_{\dot{v}} & Z_{\dot{w}} & Z_{\dot{p}} & Z_{\dot{q}} & Z_{\dot{r}} \\ K_{\dot{u}} & K_{\dot{v}} & K_{\dot{w}} & K_{\dot{p}} & K_{\dot{q}} & K_{\dot{r}} \\ M_{\dot{u}} & M_{\dot{v}} & M_{\dot{w}} & M_{\dot{p}} & M_{\dot{q}} & M_{\dot{r}} \\ N_{\dot{u}} & N_{\dot{v}} & N_{\dot{w}} & N_{\dot{p}} & N_{\dot{q}} & N_{\dot{r}} \end{bmatrix}   (11)

When we apply forces on the x-, y-, and z-axes and moments around these axes to a vehicle, the water masses that need to be pushed resist the vehicle in these directions. Each element of the M_A matrix represents the added mass of water to be pushed in the related direction. Each added mass is calculated by Newton's Second Law of Motion; for example, X_{\dot{u}} is the rate of change of the force X in the x-direction with respect to the change of the UV acceleration \dot{u}32

X_{\dot{u}} = \frac{\partial X}{\partial \dot{u}}   (12)

The terms of M_A depend on the UUV's shape. If a UUV is symmetrical in all planes and the body reference frame is at the vehicle's center of gravity, the hydrodynamic M_A can be written in diagonal terms only. Thus, except for the main directions or orientations stated on the prime diagonal, the M_A terms could be accepted as negligible36

M_A = \begin{bmatrix} X_{\dot{u}} & 0 & 0 & 0 & 0 & 0 \\ 0 & Y_{\dot{v}} & 0 & 0 & 0 & 0 \\ 0 & 0 & Z_{\dot{w}} & 0 & 0 & 0 \\ 0 & 0 & 0 & K_{\dot{p}} & 0 & 0 \\ 0 & 0 & 0 & 0 & M_{\dot{q}} & 0 \\ 0 & 0 & 0 & 0 & 0 & N_{\dot{r}} \end{bmatrix}   (13)

Control methods

PD and on–off methods. These methods are used for individual depth, roll, yaw, and heading control applications. The control output, u(t), of a PD controller is given as

u(t) = K_p e(t) + K_d \frac{de(t)}{dt}   (14)

where K_p and K_d are the proportional and derivative coefficients of the error, respectively.37 The PD control routine includes numerical representations of the error, its integral, and its derivative

u(t) = K_p e(t) + K_d \frac{e(t) - e(t-1)}{\Delta t}   (15)

where \frac{e(t) - e(t-1)}{\Delta t} is the numerical backward differentiation approximation of the derivative and \Delta t is the sampling period of the CC.

On the other hand, on–off control is a basic feedback system that switches the actuator from fully closed to fully open according to the setpoint of the variable being controlled. In such controllers, a dead band between the lower border (lb) and upper border (ub) can be placed to prevent chattering around the set-point

u(t) = \begin{cases} \text{OFF } (0), & e(t) > ub \\ \text{ON } (1), & e(t) < lb \end{cases}   (16)

ANFIS method. ANFIS is a structure that combines a Sugeno-type fuzzy system with neural learning capability (Figure 3). Such an approach makes fuzzy logic more systematic and less dependent on experience. ANFIS calculates inputs in the forward direction by means of fuzzification, production, normalization, and defuzzification layers and propagates the resultant error back to the MF parameters.38 The backpropagation method is utilized for adapting the MF parameters, which enables the ANFIS model to learn the input–output characteristics of a system.39

Four DOF Lucky Fin UV model

The Earth (E) fixed frame and the local axes along which the vehicle can move are shown in Figure 4. If the horizontal thrusters run at identical velocities, the UV will move in the x-direction. When the same thrusters are operated at different velocities, the vehicle rotates around the z-axis with an angular velocity (r) and an angular displacement (ψ). When the vertical thrusters of the vehicle run at identical velocities, it dives in the z-direction. When they operate at different velocities or in opposite directions, the vehicle rotates around the x-axis with an angular velocity (p) and an angular displacement (φ). The highest velocities obtained by the Lucky Fin vehicle during the experimental studies are given in Table 2.

Image processing and target tracking methods for robot vision systems

After starting image processing applications, the analog camera used in the early days of UV design was replaced by the digital Raspberry Pi Camera. OpenCV is a
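The discrete PD law of (15) and the dead-banded on–off law of (16) are simple enough to sketch directly. The helper names and the PWM saturation limit below are assumptions for illustration; the CC firmware itself is not listed in the article:

```python
def pd_step(e, e_prev, dt, kp, kd):
    """PD output of (15): proportional term plus a backward-difference derivative."""
    return kp * e + kd * (e - e_prev) / dt

def on_off(e, lb, ub, prev_state):
    """On-off output of (16); inside the dead band [lb, ub] the previous
    actuator state is held to prevent chattering around the set-point."""
    if e > ub:
        return 0  # OFF
    if e < lb:
        return 1  # ON
    return prev_state

def saturate_pwm(u, limit=100):
    """Clamp the controller output to an assumed +/-100% PWM range."""
    return max(-limit, min(limit, u))
```

For depth control the error would be e = z_ref − z from the pressure sensor, sampled every Δt seconds by the CC, and the PD output would be saturated before being written to the vertical thrusters.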
Figure 3. ANFIS model with three rules. ANFIS: adaptive neuro-fuzzy inference system.
r^2 = (x - a)^2 + (y - b)^2   (17)

x = a + r \sin(\theta)   (18)

y = b + r \cos(\theta)   (19)
Table 2. Motion axes of four DOF Lucky Fin UV and the highest speeds obtained from the vehicle.

DOF  Definition of the motion          Linear displacements and Euler angles  Linear and angular velocities
1    Linear in the x-direction         x                                      u = 0.12 m/s
2    Linear in the z-direction         z                                      w = 0.2 m/s
3    Angular roll around the x-axis    φ                                      p: could not be measured
4    Angular yaw around the z-axis     ψ                                      r = 5°/s

DOF: degrees of freedom; UV: underwater vehicle.
Figure 7. Underwater tests of the Lucky Fin UV + RA system, Kocaeli University Indoor Swimming Pool. (a) RA view from UV indoor camera and (b) outdoor camera shots. UV: underwater vehicle; RA: robot arm.
Figure 8. Top view of the designed CC and locations of the main equipment. CC: control card.
Figure 10. General task schematic of the CC. CC: control card.
by graphical CPI buttons or by a keyboard. The CPI can capture images and video, acquire data, and plot the status graphics of the UV (Figure 11).

The CPI supports USB cameras. However, as the cable distance between the computer and the UV is too long, the image is transferred in analog form through the coaxial line of a non-buoyant cable. The received analog video image is converted to digital by means of a USB image capturing device. Once the CPI initializes communication with the vehicle, it starts to record the vehicle data into a database file labeled with the date and time information. The CPI includes the option to monitor, graph, and report recorded data, such as pitch, yaw, depth, heading, supply voltage, motor currents, and controller output.

Discussions on UV control applications

Some experiments are introduced in practice for educational purposes in the designed development platform. The on–off, PD, PID, and ANFIS control algorithms for depth
Figure 11. The CPI of the UV. CPI: control program interface; UV: underwater vehicle.
parameters. The procedure is applied likewise to the UV for PD heading control. Following three large ripples, the UV settled to the reference value at 52.5 s (Figure 13). A performance comparison of the two methods is given in Table 4. Consequently, the PD controller performed better than the on–off controller.

Image processing for object tracking system

The analog camera and the BeagleBone Black minicomputer are replaced with the Raspberry Pi camera and the Raspberry Pi minicomputer, respectively. UV control applications are carried out by the appropriate image processing methods: the Imutils, PiRGBArray, Deque, Argparse, RPi.GPIO, time, CV2, and NumPy libraries are utilized in a series of tasks, such as the camera interface, masking, object tracking, production of PWM data, image processing, various kinds of functions, and storing huge sizes of position data and RGB data in arrays. Within the scope of this application, the position changes of circular objects are determined by using the "Hough transformation" method, and the tracking process is implemented according to this position.41 The largest contour is determined after masking and enclosing the object center. A line is drawn between the origin of the UV camera and the central coordinates of the object. Then, the radius of the target is calculated by finding the inner side of the circle. If the diameter of this circle falls below a certain value, the horizontal thruster motors run proportionally. The control signal is a function of E in both the P and ANFIS methods

U = f(E)   (21)
Figure 14. Distance error calculated with reference to the center coordinates of the object.
Figure 15. (a) Training and (b) testing responses for the trained FIS model for 10000 iterations. FIS: fuzzy inference system.
While Table 5 is designed to demonstrate heading control, the basic rules for surge control are similar.

The model is trained and tested referring to 37 training data and 20 testing data, respectively. The data indicate a linear relationship between the heading deviation [−275, 0, +275] and the motor drive PWM ratios [−100, 0, +100]. However, the camera lens monitors the deviation larger at the center and smaller than it should be at the corners. In addition, the left motor stops under 20% PWM, and the torque of the right motor is slightly stronger (around 2%) for identical PWM values around zero. For this reason, the data set is improved by several trials and errors (Figure 15).

Negative outputs proportionally activate the left motor in the forward direction and the right motor in the reverse direction. In the contrariwise situation, the opposite operation is true. Trained MFs of E are given in Figure 16. Singleton MFs TL, S, and R are located at −275, 0, and +275 of the output variable U.

The related input is fuzzified using Gaussian MFs according to the trained parameters

\mu_A(x) = e^{-\frac{(x - c)^2}{2\sigma^2}}   (24)

Here, c and σ are the parameters that determine the center and width of the Gaussian curve, respectively.48 The fuzzy inference system is executed to perform the fuzzification, production, normalization, and defuzzification processes (Figure 3). The ANFIS functions, which are operated in each layer, can be extracted by programming codes step by
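The Gaussian membership function of (24) is straightforward to evaluate. A small sketch, where the three membership-function centres follow the [−275, 0, +275] placement mentioned above but the common width value is an illustrative assumption:

```python
import math

def gauss_mf(x, c, sigma):
    """Gaussian membership degree of (24); c is the centre and sigma the width."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

# Three MFs over the heading-deviation range [-275, +275]; the centres follow
# the text, the common width of 120 is assumed for illustration.
mfs = {"L": (-275.0, 120.0), "S": (0.0, 120.0), "R": (275.0, 120.0)}

# Fuzzification of one sample deviation E = -40 (slightly left of centre)
degrees = {name: gauss_mf(-40.0, c, s) for name, (c, s) in mfs.items()}
```

A deviation of −40 fires the "straight" MF most strongly and the "right" MF hardly at all, which is the graded behaviour the rule layer then weighs.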
ate PWM values. The values captured by the camera are transferred to the computer via Wi-Fi (Figure A2), and the underwater tests of the UV were successfully accomplished. The vehicle is computer-controlled and able to move autonomously by the CC. As a result of these operations, the image of the object in the camera is shown in Figure 17. The location information output by the relevant line of code is as follows:

$ python3 object_tracker.py
RADIUS = 44
CENTER = [455, 478]

This output means that the radius of the target is reported as 44 pixels to the frame interface and the center coordinates of the target are N(x, y) = (455, 478) (Figure 17). The related control algorithm is given in Figure A3. The object tracking program controls two parameters: heading control refers to the deflection E from the target, and surge control refers to the distance E from the target. The PWM rates of the motors reacting to the E signals are the same for both control procedures. The direction of rotation is identical for surge control but reversed for heading control. Experimental results of the heading control tests for the PID and ANFIS methods are given in Figure 18.49

Feedback control by the image process, in general, reduces E, and reasonable results have been achieved for object tracking operations. The controller settles the UV to the direction of a target under 3.5 s for both methods (Table 6). The maximum overshoot in the ANFIS control is lower than in the PID control. In addition, the settling is achieved earlier in the ANFIS control than in the PID control.

Image processing tests for the UV + manipulator object capturing system

OpenCV and real-time image processing libraries are installed on the Raspberry Pi. The application is started with the detection of the coordinate points of the object and the end effector in the image and the calculation of the distance between them. The distance is calculated on a pixel basis in the two-dimensional image space and is converted to cm. The image captured by the camera is in RGB space and is transformed into HSV space. The object and the end effector of the manipulator RA are labeled in blue and red colors, respectively. Threshold values of the red and blue colors in HSV color space have already been determined in the image. Original and thresholded images are shown in Figure 19.

The coordinates of the end effector and the target object are determined and used to drive the motors. The pixel distance between the two points, |AB|, is accepted as an error that should be reduced by proportional control

|AB| = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}   (25)

Considering the x-axis, the coordinate of the end effector marked in red is x1 and the coordinate of the target marked in blue is x2, where y1 and y2 are the coordinates of the end effector and target along the y-axis, respectively. The camera feeds back |AB| to the servomotors by (22), where Kp is selected as 0.7. In this application, the UV was not moved except for the manipulator part. Details of the vision-based manipulator control on the UV are illustrated in Figure A4. Experimental |AB| distance data versus time are shown in Figure 20. The manipulator approaches to
Figure 19. Positions of the object (blue) and end effector (red) in the original images and after the thresholding process.44
Figure 21. (a) Vehicle body mesh structure and (b) example CFD analysis response of Lucky Fin to liquid flow in the x (surge) direction.
CFD: computational fluid dynamics.
processing, kinematic and dynamic modeling for future simulation, and controller optimization studies bring about partial autonomy to conventional ROVs.

Acknowledgment
The author would like to thank Ayako Tajima for her contribution.

Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.

ORCID iD
Serhat YILMAZ https://orcid.org/0000-0001-9765-7225

Supplemental material
Supplemental material for this article is available online.

References
1. Choi J, Lee Y, Kim T, et al. Development of a ROV for visual inspection of harbor structures. In: IEEE underwater technology (UT) conference, Busan, South Korea, 21–24 February 2017, pp. 1–4. New York: IEEE.
2. Miorim F, Rocha F, and De Tomi G. Mobile control center: a new approach to deploy ROV-oriented education and training. In: OCEANS'15 MTS/IEEE conference, Washington DC, USA, 19–22 October 2015, pp. 1–4. New York: IEEE.
3. Fletcher B and Harris S. Development of a virtual environment based training system for ROV pilots. In: OCEANS 96 MTS/IEEE conference proceedings, Fort Lauderdale, FL, USA, 23–26 September 1996, pp. 65–71. New York: IEEE.
4. Eng YH, Teo KM, Chitre M, et al. Online system identification of an autonomous underwater vehicle via in-field experiments. IEEE J Ocean Eng 2016; 41(1): 5–17.
5. Wang CC, Lin YR, and Chen HH. Accurate altitude control of autonomous underwater vehicle. In: IEEE international underwater technology symposium (UT), Tokyo, Japan, 5–8 March 2013, pp. 1–3. New York: IEEE.
6. Inoue T, Shibuya K, and Nagano A. Underwater robot with a buoyancy control system based on the spermaceti oil hypothesis - development of the depth control system. In: IEEE/RSJ international conference on intelligent robots and systems, Taipei, Taiwan, 18–22 October 2010, pp. 1102–1107. New York: IEEE.
7. DeBitetto PA. Fuzzy logic for depth control of unmanned undersea vehicles. IEEE J Ocean Eng 1995; 20(3): 242–248.
8. Kim H and Shin Y. Design of adaptive fuzzy sliding mode controller using FBFE for UFV depth control. In: SICE-ICASE international joint conference, Bexco, Busan, South Korea, 18–21 October 2006, pp. 3100–3103. New York: IEEE.
9. Song F, An E, and Folleco A. Modeling and simulation of autonomous underwater vehicles: design and implementation. IEEE J Ocean Eng 2003; 28(2): 283–296.
10. Wang JS and Lee CSG. Self-adaptive recurrent neuro-fuzzy control of an autonomous underwater vehicle. IEEE Trans Rob Autom 2003; 19(2): 283–295.
11. Shi X, Xiong H, Wang C, et al. A new model of fuzzy CMAC network with application to the motion control of AUV. In: Proceedings of IEEE international conference on mechatronics and automation, Niagara Falls, ON, Canada, 29 July–1 August 2005, pp. 2173–2178. New York: IEEE.
12. Fernandes DA, Sørensen AJ, Pettersen KY, et al. Output feedback motion control system for observation class ROVs based on a high-gain state observer: theoretical and experimental results. Control Eng Pract 2015; 39: 90–102.
13. Manu DK and Karthik P. Development and implementation of AUV for data acquisition and image enhancement. IJEAT 2020; 9(4): 2088–2093.
14. Jung J, Lee Y, Kim D, et al. AUV SLAM using forward/downward looking cameras and artificial landmarks. In: IEEE underwater technology (UT) conference, Busan, South Korea, 21–24 February 2017, pp. 1–3. New York: IEEE.
15. Sonka M, Hlavac V, and Boyle R. Image processing, analysis, and machine vision. 3rd ed. Toronto: Thompson Learning, 2008.
16. Satria D, Wiryadinata R, Esiswitoyo DPA, et al. Hydrodynamic analysis of Remotely Operated Vehicle (ROV) observation class using CFD. In: International conference on aerospace and aviation, IOP Conference Series: Materials Science and Engineering, 2019; 645.
17. Jain A, Chandra RN, and Kumar M. Design and development of an open-frame AUV: ANAHITA. In: IEEE/OES autonomous underwater vehicle workshop (AUV), Porto, Portugal, 6–9 November 2018, pp. 1–5. New York: IEEE.
18. Bokser V, Oberg C, Sukhatme GS, et al. A small submarine robot for experiments in underwater sensor networks. IFAC Proceedings Volumes 2004; 37(8): 239–244.
19. Hong EY, Soon HG, and Chitre M. Depth control of an autonomous underwater vehicle, STARFISH. In: OCEANS'10, Sydney, NSW, Australia, 24–27 May 2010. New York: IEEE.
20. Trebi-Ollennu A and White BA. A robust nonlinear control design for remotely operated vehicle depth control systems. In: UKACC international conference on control '96, Exeter, UK, Publ. No. 427, Vol. 2, 2–5 September 1996, pp. 993–997. New York: IET.
21. Choi H-T and Lee Y. Highly accurate motion control system for omni-directional underwater robot. In: Oceans, Hampton Roads, VA, USA, 14–19 October 2012.
22. Yang R, Clement B, Mansour A, et al. Robust heading control and its application to Ciscrea underwater vehicle. In: OCEANS 2015, Genova, Italy, 18–21 May 2015. New York: IEEE.
23. Palomeras N, Nagappa S, Ribas D, et al. Vision-based
38. Yilmaz S, Ilhan R, and Feyzullahoğlu E. Estimation of adhesive wear behavior of the glass fiber reinforced polyester composite materials using ANFIS model. J Elastom Plast
localization and mapping system for AUV intervention. 2021; 54: 86–110.
In: 2013 MTS/IEEE OCEANS, Bergen Norway, June 39. Mesbahi AH, Semnani D, and Khorasani SN. Performance
10–13 2013. prediction of a specific wear rate in epoxy nanocomposites
24. Mangipudi C P and Li P Y.Vision based passive arm locali- with various composition content of polytetrafluoroethylen
zation approach for underwater ROVs using a least squares (PTFE), graphite, short carbon fibers (CF) and nano-TiO2
on SO(3) gradient algorithm. In: American control confer- using adaptive neuro-fuzzy inference system (ANFIS). Com-
ence (ACC), pp. 5798–5803, Philadelphia-USA, July 10–12 pos Part B 2012; 43: 549–558.
2019, New York: IEEE. 40. Xu Z, Baojie X, and Guoxin W. Canny edge detection based
25. Zhou C, Liu Y, and Song Y.Detection and tracking of a UAV on Open CV. In: ICEMI’2017- IEEE 13th international con-
via hough transform, In: 2016 CIE international conference ference on electronic measurement & instruments, Yangz-
on radar (RADAR), Guangzhou, China, October 10–13 2016, hou, China, 20–23 October 2017, pp. 53–56. New York:
New York: IEEE. IEEE.
26. Cai M, Wang Y, Wang S, et al. Grasping marine products 41. Illingworth J and Kittler J. A survey of the Hough transform.
with hybrid-driven underwater vehicle-manipulator sys- Comput Vis Graph Image Process 1988; 44/1: 87–116.
tem. IEEE Transact Automat Sci Eng 2020; 17(3): 42. Lee YK. Commutation torque ripple minimization for three
1443–1454. phase BLDC motor drive using a simple PWM scheme reli-
27. Hu Z, Zhu X, Tu D, et al. Manipulator arm interactive control able to motor parameter variation. In: IEEE PES Asia-Pacific
power and energy engineering conference (APPEEC),
in unknown underwater environment. In: 2nd world confer-
Macao, China, 1–4 December 2019, pp.1–4. New York:
ence on mechanical engineering and intelligent manufactur-
IEEE.
ing (WCMEIM), 22–24 November 2019, pp.494–497,
43. Hu Z, Zhu X, Tu D, et al. Manipulator arm interactive control
Shanghai, China.
in unknown underwater environment, In: 2nd World confer-
28. Nishida Y, Ahn J, Sonoda T, et al. Benthos sampling by
ence on mechanical engineering and intelligent manufactur-
autonomous underwater vehicle equipped a manipulator with
ing (WCMEIM), Shanghai, China, 22–24 November 2019,
suction device. In: 2019 IEEE underwater technology (UT),
pp. 494–497. New York: IEEE.
Kaohsiung, Taiwan, 16–19 April 2019, pp. 1–4. New York:
44. Yılmaz S and Kılcı SB. Image processing applications for 6
IEEE.
degrees of freedom (DOF) underwater vehicle and manipu-
29. Eidsvik OA. Identification of hydrodynamic parameters for
lator system. imctd 2020; 1: 63–79.
remotely operated vehicles. Master Thesis, Norwegian Uni- _
45. Yakut M, Yılmaz S, Ince S, et al. Derinlik ve Yön Kontrol
versity of Science and Technology Department of Marine
uygulamalari için sualti araci tasarimi (underwater vehicle
Technology, Norway 2015. design for depth and direction control applications), GU J Sci
30. Mitra A, Panda JP, and Warrior HV. The effects of free Part: C 2015; 3: 343–355.
stream turbulence on the hydrodynamic characteristics of 46. Ergan AF.Design and implementation of hardware and user
an AUV hull form. Ocean Eng 2019; 174: 148–158. interface for underwater test platform, Master’s Thesis,
31. Siciliano B, Sciavicco L, and Villani L. Robotics: Modelling, Kocaeli University Institute of Science, Turkey, 2014.
Planning and Control. 1st ed. London: Springer Verlag, 47. Yılmaz S, Yakut M, and Ergan AF. Sualtı deney platformu
2009. için kontrol karti tasarimi (control board design for under-
32. Antonelli A. Underwater Robots, Motion and Force Control water experiment platform), In: Turkish National committee
of Vehicle-Manipulator Systems. 2nd ed. Berlin Heidelberg: of automatic control (TOK) proceedings, Kocaeli-Turkey,
Springer Verlag, 2006. _
11–13 September 2014, paper no.147, pp. 746–750. Istanbul:
33. Dinç M and Hajiyev C. Integration of navigation systems for TOK.
autonomous underwater vehicles. J Mar Eng Technol 2015; 48. Yilmaz S. Bulanık Mantık ve Mühendislik Uygulamaları
14: 32–43. (fuzzy logic and its’ engineering applications), Kocaeli Uni-
34. Kılcı SB.Dynamic modeling and simulation studies of four versity Press, 2007. Pub No:289, ISBN No: 978-605-4158-
degrees of freedom underwater vehicle. Master Thesis, 00-3.
Kocaeli University Institute of Science, Turkey, 2020. 49. Ofluoğlu AA. Object tracking in underwater vehicles with
35. Dukan F. ROVMotion control systems. Phd Thesis, Norwe- ANFIS method. Master Thesis, Kocaeli University Institute
gian University of Science and Technology, Norway, 2014. of Science, Turkey, 2021.
36. Vervoort JHAM.Modeling and control of an unmanned 50. Yılmaz G and Yılmaz S.Used methods for determine hydro-
underwater vehicle. Master Thesis, University of Canterbury, dynamic parameters in unmanned underwater vehicles
UK, 2009. (UUV). In: Tokyo Summit-III 3rd international conference
37. Dorf RC and Bishop RH. Modern Control Systems. 10th ed, on innovative studies of contemporary sciences, Tokyo,
New York, NY: Pearson Prentice Hall, 2005. Japan, 2021, pp.696–705, 19–21 February 2021.
APPENDIX
Figure A.2. Camera data sent to a computer by the Wi-Fi module of STM32.
(Flowchart: once the Hough algorithm detects the target, the circle enclosing the target and its center are defined; the vehicle turns right or left while the horizontal offset of the circle center exceeds 2 pixels, goes backward when the circle radius is above 62 pixels, goes forward when it is below 58 pixels, and stops the motors once the required distance to the target is reached.)
Figure A.3. Algorithm of UV object tracking control by image processing. UV: underwater vehicle.
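The decision logic of the Figure A.3 flowchart can be sketched as a small pure function. The circle center offset and radius would come from a circle detector such as a Hough transform (refs 40 and 41); only the 2-, 58- and 62-pixel thresholds are taken from the flowchart, while the function and command names are illustrative assumptions.

```python
# Thresholds taken from the Figure A.3 flowchart; the names below
# are illustrative, not the authors' actual firmware interface.
CENTER_TOL_PX = 2    # tolerated horizontal offset of the circle center
RADIUS_FAR_PX = 58   # radius below this -> target too far, go forward
RADIUS_NEAR_PX = 62  # radius above this -> target too close, go backward

def tracking_command(center_offset_px, radius_px):
    """Map one detected circle (e.g. from a Hough transform) to a command.

    center_offset_px: signed horizontal distance of the circle center
    from the image center; radius_px: radius of the enclosing circle.
    """
    if center_offset_px > CENTER_TOL_PX:
        return "turn_right"
    if center_offset_px < -CENTER_TOL_PX:
        return "turn_left"
    if radius_px > RADIUS_NEAR_PX:
        return "go_backward"
    if radius_px < RADIUS_FAR_PX:
        return "go_forward"
    return "stop_motors"  # required distance to the target reached
```

Because heading errors are tested before the radius, the vehicle first centers the target in the frame and only then adjusts its range, matching the order of the decision diamonds in the flowchart.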
(Flowchart of the manipulator grasping control executed on the Raspberry Pi: AB is the distance between the end effector position and the object position in cm; Servo-1 does not turn until AB < 20 cm, then turns until AB reaches the shortest distance; Servo-2 then progresses so that the end effector approaches the object in the x direction; once x is within the desired distance, Servo-3 grasps; the current sensor reading z is monitored and the motion stops when the current exceeds the current limit.)
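The grasping sequence in the appendix flowchart can likewise be sketched as a state machine. Only the 20 cm approach threshold comes from the flowchart; the state names, the shortest-distance test via `ab_min_cm`, and the current-limit value are assumptions made for illustration.

```python
AB_APPROACH_CM = 20.0   # from the flowchart: Servo-1 waits until AB < 20 cm
CURRENT_LIMIT_A = 1.0   # assumed gripper current limit; illustrative value

def grasp_step(state, ab_cm, ab_min_cm, x_cm, x_desired_cm, current_a):
    """Advance the grasping sequence by one control tick.

    States: 'wait'  - Servo-1 does not turn until AB < 20 cm,
            'align' - Servo-1 turns until AB reaches its shortest distance,
            'reach' - Servo-2 moves the end effector along x,
            'grasp' - Servo-3 closes until the current limit is exceeded,
            'stop'  - motors stopped.
    """
    if state == "wait":
        return "align" if ab_cm < AB_APPROACH_CM else "wait"
    if state == "align":
        # "did AB reach the shortest distance?" approximated by ab_min_cm
        return "reach" if ab_cm <= ab_min_cm else "align"
    if state == "reach":
        return "grasp" if x_cm <= x_desired_cm else "reach"
    if state == "grasp":
        # the current sensor reading stands in for the flowchart's z
        return "stop" if current_a > CURRENT_LIMIT_A else "grasp"
    return "stop"
```

Driving each tick from fresh sensor readings keeps the sequence reactive: if the object drifts away during the approach, AB grows again and the controller simply stays in its current state until the condition is met.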