ABSTRACT This paper proposes a novel monitoring system in which two cameras installed outside the water tank collaborate to photograph eel larvae for tracking and analysis. Long-term and periodic observation of fish can provide important information about their life. However, eel larvae have tiny, transparent bodies, which makes them difficult to observe and track optically. To address this problem, we propose a monitoring system that uses a fixed high-resolution camera to observe the entire breeding tank and a telecentric zoom camera mounted on a Cartesian robot to track and observe a single larva. In addition, a collaborative vision-based object search method helps the zoom camera capture photographs of the tiny larvae. We verified the method by placing several 3D plastic models resembling eel larvae in the water tank at predetermined locations. We then conducted an experiment with actual larvae and obtained videos of them. Subsequently, we analyzed the videos to estimate the population and to measure the shape and size of the larvae. As a result, we established a novel eel larva monitoring system and conducted actual larva-monitoring experiments.
INDEX TERMS Eel larvae, aquaculture, 3-D tracking, fish tracking, animal behavior, motion trajectory.
and cleaning processes. This automation research is valuable because it automates work that is currently performed by manual labor. However, for more effective breeding, a technology is required that supports automated breeding by monitoring and analyzing the environment and by monitoring the fish.

Breeding environment monitoring involves the accumulation of data obtained by measuring the temperature, humidity, and dissolved oxygen in the breeding water with sensors [9]. Moreover, by directly observing the animal, researchers can determine its size and condition via a sampling process [10]. However, if tiny larvae are observed by sampling, only their static appearance can be examined because the sampled larvae are dead. Sampling also causes a loss of larvae, and the observation interval is relatively long. Therefore, if direct observation can be performed within the water tank where breeding takes place, imaging can be carried out at any time over a short period, and we can obtain additional information, such as the movements in the water tank.

Many researchers have proposed photographing zebrafish directly in water tanks [11]–[15]. They modeled the fish's behavior to analyze its movement and suggested a solution for cases in which two fish overlap in an image. Systems for tracking and observing behavior in 3-D were subsequently proposed [16], [17], and zebrafish have also been used for neurobehavioral research [18]. Many studies have been conducted on sea cucumbers, an aquatic organism with a relatively slow speed and a unique body [19]–[21]. Along with the detection of sea cucumbers using vision technology, observational studies in the sea and in water tanks have been proposed [22], [23]. However, in these previous studies the animals could be observed by a far-view camera outside the water tank because the fish were relatively large.

In the case of zebrafish, larval monitoring studies have also been conducted. In most of them, larvae were transferred to a dish, rather than kept in a water tank, for monitoring and imaging [24]–[28]. Using a high-resolution camera, a zebrafish larva was placed on a 2-D plane, its trajectory was extracted from a series of photographs, and its behavior was analyzed. However, it was challenging to image larvae in a breeding water tank and to extend these methods to 3-D. Through larval monitoring, we can obtain the necessary information. Monitoring data can be used in various ways in the field of model-organism research [29]; results obtained through fluorescent staining can capture the appearance of internal organs [30], and in a fully enlarged still image of a larva, the heart's position can be detected [31]. These data were obtained by sampling larvae, fixing them to an imaging device, and observing them. If long-term observations can instead be performed on living larvae, more meaningful results can be obtained in the future.

In this study, we developed a collaborative vision-based precision monitoring system equipped with a fixed high-resolution camera that records the entire water tank of eel larvae and a telecentric zoom tracking camera that accurately moves in three axes and can track and photograph the small larvae. In addition, we propose a practical photographing and analysis method. We analyzed the fixed high-resolution camera image to detect the larvae in the water tank, and then proposed a collaborative algorithm that allows the tracking zoom camera to capture the larvae by finding the larvae's high-density area. We conducted a demonstrative model experiment by randomly placing plastic models of small eel larvae in a water tank. In this experiment, we confirmed that tiny larvae can be successfully observed using our method. Furthermore, the telecentric zoom camera was able to find and track actual eel larvae placed in the tank for effective imaging. Subsequently, we extracted quantitative values from the larva images using body-shape analysis. As a result, we successfully photographed eel larvae with the two collaborative cameras, and the quantitative values analyzed from the videos can be used as an evaluation index for larval eel farming.

The contributions of this paper lie in the following aspects:
1) A monitoring system for tiny eel larvae, designed for real-time observation and analysis from outside the water tank in aquaculture, was developed.
2) A collaborative vision-based tiny-object search method helps the zoom camera photograph the tiny larvae.
3) Video shooting, population estimation, trajectory extraction, and body-length measurement were performed through experiments with larva models and actual eel larvae.

The remainder of this paper is organized as follows. Section 2 explains the proposed monitoring system design in detail, the collaborative tiny-larvae shooting method, and the image-analysis methods for breeding management. In Section 3, we describe the experiments using the eel larva models for quantitative evaluation. In Section 4, we present the experimental results with actual eel larvae to evaluate the proposed methods. Finally, the conclusions are given in Section 5.

II. MATERIALS AND METHODS
A. AUTOMATED EEL LARVAE AQUACULTURE
The eel to be observed is Anguilla japonica. Recently, many researchers have conducted breeding experiments on the hatching of the larvae of this fish. Long-term breeding of eel larvae is a monotonous task: essentially, it is necessary to feed the larvae, clean the water tank, and control the seawater supply. We have developed an automated breeding system that performs these functions using a robotic arm and includes a breeding-water control system with a fine control valve, as shown in Fig. 1.

We used a robot servomotor and a linear rail motor to position the endpoint for feeding and cleaning the tank at the desired location. All frame structures are made of 3D-printed plastic, aluminum, and stainless steel so that they
were 10 mm or less, and breeding began in the small water tank. The tank was made of white acrylic, except for the side on which the fixed high-resolution camera was mounted, to keep the camera's shooting background constant. A constant background is essential for observing tiny larvae because the larvae have transparent bodies, and their appearance is strongly influenced by the background. Therefore, a white background facilitates better recognition and observation. The tank was cylindrical, and its top was cut off. The entire frame was made of aluminum profile, and wheels were attached to facilitate movement. The tank was fixed in the correct position so that tank shooting could be performed under the same conditions. Further, we installed the telecentric zoom tracking camera and the fixed high-resolution camera on the bare frame, which we attached to a Cartesian robot that can move in three axial directions [32].

1) Lighting Unit
Lighting is an essential element for capturing images. Because the area in focus changes depending on the aperture, increasing the brightness as much as possible provides a broad focus area. Because the water tank size is fixed when integrating the monitoring system, the lighting must provide constant brightness suitable for the size of the tank. Thus, we installed the lighting unit by mounting LEDs in the same direction as the fixed camera and set up the lighting to form uniform ambient light. The lighting unit was controlled by an electronic switch and turned on only during shooting to avoid affecting the larvae. The actual filming takes approximately 30 s, and because filming takes place in a long cycle, the impact on the life of the larvae is minimal.

aluminum profile at a distance that can be entirely focused. Fig. 3 (a) shows a fixed high-resolution camera image.

3) Telecentric Zoom Tracking Camera
A telecentric zoom tracking camera was used to zoom in on a single larva. The larva is difficult to observe in detail because it is smaller than 10 mm and moves around the tank. Therefore, we used a camera with a zoom lens to obtain an enlarged image at a close distance, tracking the larva with physical movement.

The camera is equipped with a 1" CMOS sensor and can shoot high-resolution images over a USB 3.1 interface at 42 FPS. The camera has a telecentric lens with a PMAG of 0.367×, a depth of field of 4 mm, and a working distance of 169 mm. This lens produces an image with the sense of perspective removed, so it is possible to measure the larva's size and volume using only the captured image [33]. Further, because the in-focus area is accurately fixed, knowing the location of the camera accurately determines the location of the captured larva. In other words, if the larva is tracked and observed with an appropriate focal point as the center, the camera's movement path and the movement path of the larva match. Fig. 3 (b) shows an image from the telecentric zoom tracking camera.

The telecentric zoom tracking camera is attached to a Cartesian robot that can move in three axes (0.1 mm resolution) to realize physical tracking. Because this robot minimizes vibration and can move the camera to the desired position, it is easy to find the larva by moving the camera's working field. The camera shoots while looking down from the top of the tank. However, owing to the limitation of its short focal area, it can image the larvae only in some areas, not the entire tank.
Figure 3. Example output images. (a) Fixed high-resolution camera image and (b) telecentric zoom tracking camera image.
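Because the telecentric lens removes perspective, one image pixel corresponds to a fixed physical distance on the object plane, so a larva's body length can be read directly from its pixel extent. The following is a minimal sketch of that conversion, not the authors' procedure: the PMAG of 0.367× is taken from the lens specification above, but the sensor pixel pitch (3.45 µm) and the use of morphological skeletonization to approximate the body midline are illustrative assumptions.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

# Optical constants. PMAG = 0.367x is stated for the telecentric lens;
# the pixel pitch is an assumed value for a 1" CMOS sensor.
PMAG = 0.367
PIXEL_PITCH_UM = 3.45
UM_PER_PIXEL = PIXEL_PITCH_UM / PMAG  # object-side size of one pixel

def estimate_body_length_mm(gray_image: np.ndarray) -> float:
    """Estimate larva body length (mm) from a telecentric zoom image."""
    # Segment the dark larva against the bright white background.
    _, mask = cv2.threshold(gray_image, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Keep the largest connected component to suppress noise blobs.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if num < 2:
        return 0.0
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    body = labels == largest

    # Reduce the body to a one-pixel-wide skeleton and use its pixel
    # count as a rough midline length, then convert to millimetres.
    skel = skeletonize(body)
    length_px = float(skel.sum())
    return length_px * UM_PER_PIXEL / 1000.0

# Example: length = estimate_body_length_mm(
#     cv2.imread("larva.png", cv2.IMREAD_GRAYSCALE))
```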
frame image, $i = \{1, 2, \cdots, N_t\}$, and $N_t$ is the estimated number of eel larvae in the $t$th frame.

$$\mathrm{CM}(R_{i,t}) = X_{i,t} \tag{2}$$

where $\mathrm{CM}$ is the center of mass of the contour region and $X_{i,t}$ denotes the $i$th eel larva's coordinate vector on the $t$th fixed high-resolution camera image plane.

Assuming that the larvae coordinates are sampled from a distribution with the density function $f_H(X_t)$, we can calculate the kernel density estimate at $X_t$ as

$$\hat{f}_H(X_t) = \frac{1}{N_t} \sum_{i=1}^{N_t} K_H(X_t - X_{i,t}) \tag{3}$$

where $H$ is the $2 \times 2$ smoothing matrix and $K$ is the kernel function. If we choose the kernel function to be the standard multivariate normal kernel, we obtain $K_H$ as follows:

$$K_H(X_t) = \frac{1}{2\pi\sqrt{|H|}} \exp\!\left(-\frac{1}{2} X_t^{\top} H^{-1} X_t\right) \tag{4}$$

where $H$ denotes the covariance matrix. Then, we can obtain the maximum of the density function $\hat{f}_H(X_t)$, which we call $M_t = \{x_{\max,t}, y_{\max,t}\}$.
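As a sketch of how Eqs. (2)–(4) can be evaluated in practice, the snippet below detects larva contours in a fixed-camera frame, takes each contour's center of mass as $X_{i,t}$, fits a Gaussian kernel density estimate to those coordinates, and returns the density maximum $M_t$ over a coarse image grid. This is a minimal illustration rather than the authors' implementation (the paper points to the ks package [35] for kernel density estimation); scipy's `gaussian_kde` selects its own bandwidth matrix, and the threshold and grid step are assumptions.

```python
import cv2
import numpy as np
from scipy.stats import gaussian_kde

def larva_centers(gray_frame: np.ndarray, min_area: float = 5.0) -> np.ndarray:
    """Return centers of mass X_{i,t} (Eq. 2) of detected larva contours."""
    # Larvae appear darker than the white tank background.
    _, mask = cv2.threshold(gray_frame, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > min_area:                    # drop tiny noise blobs
            centers.append((m["m10"] / m["m00"],   # x of center of mass
                            m["m01"] / m["m00"]))  # y of center of mass
    return np.asarray(centers)

def density_peak(centers: np.ndarray, grid_step: int = 8,
                 shape: tuple = (1080, 1920)) -> tuple:
    """Approximate M_t = argmax f_H(X_t) (Eqs. 3-4) on a coarse grid."""
    if len(centers) == 0:
        raise ValueError("no larvae detected")
    if len(centers) < 3:
        # Too few points for a stable KDE; fall back to the mean position.
        return tuple(centers.mean(axis=0))
    kde = gaussian_kde(centers.T)      # Gaussian kernels, automatic bandwidth
    xs = np.arange(0, shape[1], grid_step)
    ys = np.arange(0, shape[0], grid_step)
    gx, gy = np.meshgrid(xs, ys)
    vals = kde(np.vstack([gx.ravel(), gy.ravel()]))
    k = int(np.argmax(vals))
    return float(gx.ravel()[k]), float(gy.ravel()[k])  # (x_max, y_max)
```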
To obtain the relational expression for the fixed camera, this is modeled using a homography as

The x- and y-axes were controlled based on the center of mass of the contour.
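A minimal sketch of the homography step is given below: four reference points whose positions are known both in the fixed-camera image and in the Cartesian robot's XY plane are used to estimate the mapping, which then converts the high-density point $M_t$ into a robot target. The specific point coordinates are hypothetical, and the use of OpenCV's `findHomography` is an illustrative assumption rather than the paper's calibration procedure.

```python
import cv2
import numpy as np

# Hypothetical calibration data: the same four tank landmarks expressed in
# fixed-camera pixel coordinates and in the Cartesian robot's XY plane (mm).
pixel_pts = np.array([[210, 180], [1710, 195], [1695, 900], [225, 915]],
                     dtype=np.float32)
robot_pts = np.array([[0.0, 0.0], [300.0, 0.0], [300.0, 150.0], [0.0, 150.0]],
                     dtype=np.float32)

# Estimate the 3x3 homography mapping image pixels to robot coordinates.
H, _ = cv2.findHomography(pixel_pts, robot_pts)

def pixel_to_robot_xy(m_t: tuple) -> tuple:
    """Map the density peak M_t (pixels) to a robot XY target (mm)."""
    p = np.array([[m_t]], dtype=np.float32)   # shape (1, 1, 2)
    q = cv2.perspectiveTransform(p, H)[0, 0]
    return float(q[0]), float(q[1])

# Example: x_mm, y_mm = pixel_to_robot_xy((812.0, 446.0))
```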
Figure 9. Eel larva models with skeleton structure to emulate the eel larvae's position.

Figure 10. Image of eel larvae-like plastic models.

2) Shape Analysis of Eel Larva
(d) Eel larva model #45. (e) Eel larva model #46.

Figure 14. Scene of the experiment with the actual eel larvae.
Figure 16. Population estimation result of eel larvae.

Figure 17. Analysis result of the telecentric zoom camera image (larva #3).
in which the background is shaken owing to vibration during the process of placing the larvae, or in which there are temporary changes in external lighting. Moreover, such an error can occur when the larvae are excessively clustered. The above cases can be solved by shooting after sufficient time has elapsed, once the movement of the water in the tank is stable, and by controlling the surrounding brightness with a blackout curtain.

C. MOVEMENT ANALYSIS IN THE WATER TANK
As shown in Fig. 15, we determined the paths of the 11 larvae. The trajectories visualize the path of the 3-axis Cartesian robot, which corresponds to the movement path of each tracked larva.
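Because the camera keeps the tracked larva at its focal point, the robot's logged axis positions double as the larva's 3-D trajectory. The short sketch below plots such a trajectory; the CSV file name and column layout are hypothetical, chosen only to illustrate the idea.

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the 3-D projection

# Hypothetical log format: time [s] and robot x, y, z positions [mm],
# recorded while the telecentric camera was tracking one larva.
log = np.genfromtxt("robot_path_larva03.csv", delimiter=",",
                    names=["t", "x", "y", "z"], skip_header=1)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(log["x"], log["y"], log["z"], lw=1)
ax.set_xlabel("x [mm]")
ax.set_ylabel("y [mm]")
ax.set_zlabel("z [mm]")
ax.set_title("Larva trajectory inferred from the Cartesian robot path")
plt.show()
```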
Table 3. Length estimation results of telecentric zoom tracking camera images (Larva #1 - #11).

#            10     11
Length (mm)  6.94   6.88
We extracted the movement path of eel larvae in a water tank by using the Cartesian robot's movement path. As a result, we verified that the proposed monitoring system effectively collected objective information for breeding small eel larvae.

In future work, we expect to observe actual eel larvae over an extended period. In this study, we developed a system and performed observations of actual larvae; however, monitoring must be performed several times over a certain period. We will observe changes over a long period using our monitoring system to determine the changes that eel larvae undergo as they grow after hatching. As the eel larvae grow, their movements will change, and we will study this change using our observation device.

References
[1] Jules S Jaffe. Underwater optical imaging: the past, the present, and the prospects. IEEE Journal of Oceanic Engineering, 40(3):683–700, 2014.
[2] Katsumi Tsukamoto, Seinen Chow, Tsuguo Otake, Hiroaki Kurogi, Noritaka Mochioka, Michael J Miller, Jun Aoyama, Shingo Kimura, Shun Watanabe, Tatsuki Yoshinaga, et al. Oceanic spawning ecology of freshwater eels in the western North Pacific. Nature Communications, 2(1):1–9, 2011.
[3] Katsumi Tsukamoto, Tsuguo Otake, Noritaka Mochioka, Tae-Won Lee, Hans Fricke, Tadashi Inagaki, Jun Aoyama, Satoshi Ishikawa, Shingo Kimura, Michael J Miller, et al. Seamounts, new moon and eel spawning: the search for the spawning site of the Japanese eel. Environmental Biology of Fishes, 66(3):221–229, 2003.
[4] Hideki Tanaka. Progression in artificial seedling production of Japanese eel Anguilla japonica. Fisheries Science, 81(1):11–19, 2015.
[5] Peter Edwards. Aquaculture environment interactions: past, present and likely future trends. Aquaculture, 447:2–14, 2015.
[6] Martin Føre, Kevin Frank, Tomas Norton, Eirik Svendsen, Jo Arve Alfredsen, Tim Dempster, Harkaitz Eguiraun, Win Watson, Annette Stahl, Leif Magne Sunde, et al. Precision fish farming: A new framework to improve production in aquaculture. Biosystems Engineering, 173:176–193, 2018.
[7] Fernando D Von Borstel Luna, Edgar de la Rosa Aguilar, Jaime Suárez Naranjo, and Joaquín Gutiérrez Jagüey. Robotic system for automation of water quality monitoring and feeding in aquaculture shadehouse. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 47(7):1575–1589, 2016.
[8] Yousef Atoum, Steven Srivastava, and Xiaoming Liu. Automatic feeding control for dense aquaculture fish tanks. IEEE Signal Processing Letters, 22(8):1089–1093, 2014.
[9] Suresh Babu Chandanapalli, E Sreenivasa Reddy, D Rajya Lakshmi, et al. Design and deployment of aqua monitoring system using wireless sensor networks and iar-kick. Journal of Aquaculture Research and Development, 5(7), 2014.
[10] Mohammadmehdi Saberioon, Asa Gholizadeh, Petr Cisar, Aliaksandr Pautsina, and Jan Urban. Application of machine vision systems in aquaculture with emphasis on fish: state-of-the-art and key issues. Reviews in Aquaculture, 9(4):369–387, 2017.
[11] Ebraheem Fontaine, David Lentink, Sander Kranenbarg, Ulrike K Müller, Johan L van Leeuwen, Alan H Barr, and Joel W Burdick. Automated visual tracking for studying the ontogeny of zebrafish swimming. Journal of Experimental Biology, 211(8):1305–1316, 2008.
[12] Natasha Speedie and Robert Gerlai. Alarm substance induced behavioral responses in zebrafish (Danio rerio). Behavioural Brain Research, 188(1):168–177, 2008.
[13] Johann Delcourt, Christophe Becco, Nicolas Vandewalle, and Pascal Poncin. A video multitracking system for quantification of individual behavior in a large fish shoal: advantages and limits. Behavior Research Methods, 41(1):228–235, 2009.
[14] Radharani Benvenutti, Matheus Marcon, Matheus Gallas-Lopes, Anna Julie de Melo, Ana Paula Herrmann, and Angelo Piato. Swimming in the maze: An overview of maze apparatuses and protocols to assess zebrafish behavior. Neuroscience & Biobehavioral Reviews, 2021.
[15] Marta de Oliveira Barreiros, Felipe Gomes Barbosa, Diego de Oliveira Dantas, Daniel de Matos Luna Dos Santos, Sidarta Ribeiro, Giselle Cutrim de Oliveira Santos, and Allan Kardec Barros. Zebrafish automatic monitoring system for conditioning and behavioral analysis. Scientific Reports, 11(1):1–16, 2021.
[16] Jonathan Cachat, Adam Stewart, Eli Utterback, Peter Hart, Siddharth Gaikwad, Keith Wong, Evan Kyzar, Nadine Wu, and Allan V Kalueff. Three-dimensional neurophenotyping of adult zebrafish behavior. PLoS ONE, 6(3):e17597, 2011.
[17] Xiaoqing Liu, Yingying Yue, Meiling Shi, and Zhi-Ming Qian. 3-D video tracking of multiple fish in a water tank. IEEE Access, 7:145049–145059, 2019.
[18] Rupert J Egan, Carisa L Bergner, Peter C Hart, Jonathan M Cachat, Peter R Canavello, Marco F Elegante, Salem I Elkhayat, Brett K Bartels, Anna K Tien, David H Tien, et al. Understanding behavioral and physiological phenotypes of stress and anxiety in zebrafish. Behavioural Brain Research, 205(1):38–44, 2009.
[19] Honglei Wei, Danni Peng, Xufei Zhu, and Dongdong Wu. A target tracking algorithm for vision based sea cucumber capture. In 2016 2nd IEEE International Conference on Computer and Communications (ICCC), pages 401–404. IEEE, 2016.
[20] Luzhen Ge, Gaili Gao, and Zhilun Yang. Study on underwater sea cucumber rapid locating based on morphological opening reconstruction and max-entropy threshold algorithm. International Journal of Pattern Recognition and Artificial Intelligence, 32(07):1850022, 2018.
[21] Xi Qiao, Jianhua Bao, Lihua Zeng, Jian Zou, and Daoliang Li. An automatic active contour method for sea cucumber segmentation in natural underwater environments. Computers and Electronics in Agriculture, 135:134–142, 2017.
[22] Fang Peng, Zheng Miao, Fei Li, and Zhenbo Li. S-FPN: A shortcut feature pyramid network for sea cucumber detection in underwater images. Expert Systems with Applications, page 115306, 2021.
[23] Juan Li, Chen Xu, Lingxu Jiang, Ying Xiao, Limiao Deng, and Zhongzhi Han. Detection and analysis of behavior trajectory for sea cucumbers based on deep learning. IEEE Access, 8:18832–18840, 2019.
[24] Ethan Gahtan, Paul Tanger, and Herwig Baier. Visual prey capture in larval zebrafish is controlled by identified reticulospinal neurons downstream of the tectum. Journal of Neuroscience, 25(40):9294–9303, 2005.
[25] Sean D Pelkowski, Mrinal Kapoor, Holly A Richendrfer, Xingyue Wang, Ruth M Colwill, and Robbert Creton. A novel high-throughput imaging system for automated analyses of avoidance behavior in zebrafish larvae. Behavioural Brain Research, 223(1):135–144, 2011.
[26] Clinton L Cario, Thomas C Farrell, Chiara Milanese, and Edward A Burton. Automated measurement of zebrafish larval movement. The Journal of Physiology, 589(15):3703–3708, 2011.
[27] Pierre R Martineau and Philippe Mourrain. Tracking zebrafish larvae in group: status and perspectives. Methods, 62(3):292–303, 2013.
[28] Xiaoying Wang, Eva Cheng, Ian S Burnett, Yushi Huang, and Donald Wlodkowic. Automatic multiple zebrafish larvae tracking in unconstrained microscopic video conditions. Scientific Reports, 7(1):1–8, 2017.
[29] Chia-Yuan Chen and Chao-Min Cheng. Microfluidics expands the zebrafish potentials in pharmaceutically relevant screening. Advanced Healthcare Materials, 3(6):940–945, 2014.
[30] Kun Zhang, Hongbin Zhang, Huiyu Zhou, Danny Crookes, Ling Li, Yeqin Shao, and Dong Liu. Zebrafish embryo vessel segmentation using a novel dual ResUNet model. Computational Intelligence and Neuroscience, 2019, 2019.
[31] Mingyu Cui, Liying Su, Peng Zhang, Huipeng Zhang, Hongmiao Wei, Zhuo Zhang, and Xuping Zhang. Zebrafish larva heart localization using a video magnification algorithm. In 2020 IEEE International Conference on Mechatronics and Automation (ICMA), pages 1071–1076. IEEE, 2020.
[32] Frank L Lewis, Darren M Dawson, and Chaouki T Abdallah. Robot manipulator control: theory and practice. CRC Press, 2003.
[33] Masahiro Watanabe and Shree K Nayar. Telecentric optics for focus analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(12):1360–1365, 1997.
[34] Mohand Said Allili, Nizar Bouguila, and Djemel Ziou. Finite general Gaussian mixture modeling and application to image and video foreground segmentation. Journal of Electronic Imaging, 17(1):013005, 2008.
[35] Tarn Duong et al. ks: Kernel density estimation and kernel discriminant analysis for multivariate data in R. Journal of Statistical Software, 21(7):1–16, 2007.
JUHWAN KIM received his B.E. degree in Electrical Engineering from Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea, in 2015. Currently, he is a Ph.D. candidate in the Department of Convergence IT Engineering at POSTECH, Pohang, Republic of Korea. Mr. Kim is a member of the Hazardous and Extreme Environment Robotics (HERO) Lab at POSTECH. His research interests include aquaculture engineering, underwater robotics, and multi-agent underwater manipulation.

BAE-IK LEE received his B.S. degree from the Department of Aquaculture, Pukyong National University, Busan, Republic of Korea, in 1985, and his M.S. and Ph.D. degrees from Tokyo University of Marine Science and Technology, Tokyo, Japan, in 1988 and 1994, respectively. Since 1994, he has been working in the Division of Aquaculture Science at the National Institute of Fisheries Science, Busan, Republic of Korea.