
A DIGITAL CONTROL SYSTEM FOR THE TRITON UNDERSEA ROBOT

Oli Francis, Graeme Coakley, Christopher Kitts


Robotic Systems Laboratory, Santa Clara University, Santa Clara CA 95053

Abstract: Santa Clara University's Triton undersea robot is a small shallow-water vehicle used to support a variety of marine science, technology, and education objectives. Although the vehicle is built around a human-in-the-loop, analog-based control system, a microcontroller-based digital control layer has been developed as a class exercise in order to augment piloting with automatic control and to support networked operation of the vehicle. This paper presents the Triton system, describes the design of the digital control layer, and presents experimental results from the system during operation. Copyright © 2002 IFAC.
Keywords: Engineering Education, Robot Teleoperation, Microcomputer-based control.

1. INTRODUCTION
Since 1998, the Santa Clara University Robotic Systems Laboratory has engaged in the aggressive development of low-cost robotic devices in order to support scientific exploration, technology validation, and engineering education. These projects are typically developed in less than a year as part of the University's senior undergraduate design program. To date, the Laboratory has been involved in the design of more than twenty highly capable robotic systems to include spacecraft, airships, land rovers, undersea vehicles, and telescopic observatories. These projects have supported a wide range of field operations and have been financed through more than $1,250,000 of external funding from government agencies, industrial partners, and university collaborators.
The Triton undersea robot, shown in Figure 1, was developed in 1999 by a group of seven mechanical and electrical engineering undergraduates. Developed for shallow water (less than 1000 feet) tethered operations, Triton is a 270 pound vehicle powered by two hp horizontal thrusters (for horizontal plane motion) and two hp vertrans thrusters (for vertical and lateral motion). The vehicle includes a camera and lights to support both piloting and video-based science operations; in addition, the vehicle is able to support modular instrumentation for specific scientific objectives [Weast, et al., 1999a].

Fig. 1. The Triton undersea robot during a pool test.


A 750 foot tether provides power, command, and telemetry connectivity between Triton and a surface control console. The pilot controls the vehicle through the use of a video monitor displaying the realtime on-board video as well as a pilot box consisting of two joysticks (for maneuvering) and an array of switches (for light and camera control). As the pilot provides control inputs, analog control signals and power modulated by analog circuitry are transmitted through the tether to the vehicle; the overall component control architecture is depicted in Figure 2. This analog approach was originally adopted for several reasons, to include its common use in the marine industry, the expertise of design mentors from local industry, and the resources and knowledge available to the student team.
To date, Triton has been used to support geologic studies of Lake Tahoe by the Scripps Institution of Oceanography, ecological surveys of the Channel Islands region for the National Marine Sanctuary, and marine biology studies for the National Undersea Research Program. Triton has also been used as a technology testbed for undersea manipulators and stereo vision applications [Yoshida, et al., 2000; Weast, et al., 1999b].
Fig. 2. Simplified Triton system block diagram showing analog, digital, and extended control layers.

2. DIGITAL CONTROL SYSTEM


As part of a class project, the authors developed a
digital control system as an extension to the Triton
undersea robot. The team had several specific
objectives. First and foremost, this experience was
designed to provide hands-on exposure to issues in
teleoperated systems. Second, the team wished to
develop experience with microcontroller systems and
digital control techniques that were being considered
for use as the primary control system for Mantaris, a
second generation undersea robot being developed
within the Laboratory. Third, the resulting system
was intended to improve piloting by implementing
automatic depth and heading control. Fourth, the
digital control layer was intended to provide a natural
interface to a robotic teleoperation architecture being
developed as a means of supporting distributed
research and education.
In its original form, Triton had proven to be a highly
capable and robust system.
Because of the
experimental nature of the project, the predominant
constraint for the team was to maintain the fidelity of
the original human-in-the-loop analog control
system. For this reason, the digital control system
was implemented as a layer above the core analog
system; accordingly, existing interfaces and design
implementations were not open for redesign, and all
digital control elements were required to be located
within the surface control console.
2.1 Architecture of the Digital Control Layer
As shown in Figure 2, the Digital Layer interfaces with the existing Triton control console via the pilot box interface for control inputs and the existing audio/video and auxiliary ports for telemetry. This connectivity allows the digital control console to perform the following functions:
- Provide a direct pilot interface at the digital layer via a digital pilot box and a video monitor with digitally imposed telemetry data.
- Implement simple but effective automated depth and heading control.
- Provide an interface to additional command and telemetry processing/distribution layers via a serial port interface.
The Digital Layer is implemented with a network of Ubicom SX and PICmicro PIC microcontrollers and support circuitry, as shown in Figure 3. Three control modes are supported for the Triton vehicle:
- Pilot Only Control: Complete human-in-the-loop control is achieved by having the Joystick subsystem monitor the pilot inputs and compose the control directives.
- Pilot Assist Control: Human piloting is selectively combined with automated depth and/or heading control. Within the Hub, control directives from the Joystick are merged with computed directives from the Sensor subsystem.
- Extended Layer Control: Control and monitoring are implemented by human and/or automated systems at a higher layer in the control hierarchy. In this mode, the digital layer serves as a relay for the necessary data flows. Extended layers can be used to enable distributed pilot/observer connectivity, support more advanced controls/interfaces, and provide enhanced data processing.

Hub Processing. Given the control mode options, the Hub SX microcontroller is capable of receiving commands from three sources: the Joystick SX, the Sensor PIC, and processors in the Extended Layers. The Hub selects, occasionally merges (for Pilot Assist Control), and routes the control directives to the Control microcontroller based on the selected control mode.
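As an illustration of this selection and merging, the C sketch below shows one way the Hub's routing might be organized. The structure layout, mode names, and lock flags are illustrative assumptions, not the Triton firmware itself.

```c
#include <stdint.h>

/* Vehicle-level motion directive, mirroring the fields of the serial protocol. */
typedef struct {
    int8_t main_pct;    /* forward/backward, -100..100 % */
    int8_t turn_pct;    /* left/right turn,  -100..100 % */
    int8_t depth_pct;   /* surface/dive,     -100..100 % */
    int8_t crab_pct;    /* lateral crabbing, -100..100 % */
} directive_t;

typedef enum { MODE_PILOT_ONLY, MODE_PILOT_ASSIST, MODE_EXTENDED } control_mode_t;

/* Select or merge the three command sources according to the active mode.
 * In Pilot Assist, the pilot keeps any axis that is not locked, while the
 * Sensor subsystem's computed commands replace the locked axes. */
directive_t hub_route(control_mode_t mode,
                      directive_t joystick, directive_t sensor, directive_t extended,
                      int depth_lock, int heading_lock)
{
    directive_t out = joystick;

    switch (mode) {
    case MODE_PILOT_ONLY:
        break;                                   /* pass pilot inputs through */
    case MODE_PILOT_ASSIST:
        if (depth_lock)   out.depth_pct = sensor.depth_pct;
        if (heading_lock) out.turn_pct  = sensor.turn_pct;
        break;
    case MODE_EXTENDED:
        out = extended;                          /* relay the higher layer's commands */
        break;
    }
    return out;
}
```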

serial
port 1

serial
port 2

EXTENDED LAYER
DIGITAL LAYER
Pilot
Inputs

Joystick
SX

Hub
SX

Control
SX

Control
Drivers

Video
Monitor

Sensor
PIC

Text
Overlay

Digital Console

Sensor
Filters

ANALOG LAYER
joystick
port

auxiliary
port

video
port

Sensor Processing.
The Sensor microcontroller
accepts depth and heading telemetry from the Triton
Analog Layer. Depth data is provided as a 0-10V
analog signal, which must be attenuated and
digitized. Heading data is provided in a pulse width
format, which must be timed for interpretation.
Using a text overlay system, the Sensor
microcontroller superimposes depth and heading data
onto the realtime video, and the result is displayed on
a monitor for use by the pilot.
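A minimal sketch of the two conversions just described, written in C under assumed hardware details: an 8-bit ADC referenced to 5 V behind a 2:1 attenuator for the 0-10 V depth signal, and a heading pulse whose width maps linearly onto 0-360 degrees. The scale factors, including the full-scale depth, are placeholders rather than the vehicle's calibration.

```c
#include <stdint.h>

#define ADC_FULL_SCALE   255.0f   /* 8-bit ADC, assumed */
#define ATTENUATION      2.0f     /* 0-10 V telemetry divided down to 0-5 V, assumed */
#define ADC_REF_VOLTS    5.0f
#define DEPTH_FULL_SCALE 1000.0f  /* feet at 10 V; placeholder calibration */

/* Recover depth in feet from a raw ADC count of the attenuated 0-10 V signal. */
float depth_from_adc(uint8_t adc_count)
{
    float volts = (adc_count / ADC_FULL_SCALE) * ADC_REF_VOLTS * ATTENUATION;
    return (volts / 10.0f) * DEPTH_FULL_SCALE;
}

/* Recover heading in degrees from a timed pulse width, assuming a linear
 * mapping between a minimum and maximum pulse width (in microseconds). */
float heading_from_pulse(uint32_t pulse_us, uint32_t min_us, uint32_t max_us)
{
    if (pulse_us <= min_us) return 0.0f;
    if (pulse_us >= max_us) return 360.0f;
    return 360.0f * (float)(pulse_us - min_us) / (float)(max_us - min_us);
}
```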
Control Processing. With the appropriate commands routed to it by the Hub microcontroller, the Control microcontroller operates an interface circuit that buffers, filters, and differentially amplifies the microcontroller outputs in order to provide a +/-10V motor control signal to the Analog Layer.
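The conversion to the +/-10V command is performed by that external buffer, filter, and differential-amplifier circuit; the sketch below only illustrates the firmware side, mapping a signed thrust directive onto a PWM duty cycle that such a circuit could filter and amplify. The 8-bit duty register and the mid-scale-equals-zero convention are assumptions for illustration.

```c
#include <stdint.h>

/* Map a signed thrust directive (-100..100 %) onto an 8-bit PWM duty cycle.
 * 128 is taken as zero thrust; the external filter and differential amplifier
 * turn the filtered PWM into the +/-10 V signal expected by the Analog Layer. */
uint8_t thrust_to_pwm(int8_t thrust_pct)
{
    if (thrust_pct >  100) thrust_pct =  100;
    if (thrust_pct < -100) thrust_pct = -100;
    /* 0 % -> 128, +100 % -> 255, -100 % -> 1 (approximately) */
    return (uint8_t)(128 + ((int16_t)thrust_pct * 127) / 100);
}
```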
Joystick (Digital Pilot Box) Processing. Similar to
the analog pilot box, the digital pilot box consists of a
joystick and camera control switches. The Joystick
microcontroller continuously monitors these input
devices in order to compose its command directive to
the Hub microcontroller.
It is interesting to note that, in contrast to the analog pilot box, the development team chose a single low-cost joystick common in the personal computing industry; this component provided the necessary functionality and robustness for approximately 1/30th of the price. Furthermore, a flight joystick configuration was selected which fully supported pilot control with a single stick. The flexible nature of the digital configuration allows this joystick to be reconfigured with ease in order to support individual pilot preferences.
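One way this reconfiguration could be expressed is a small table that assigns each joystick axis to a vehicle behavior, with an optional inversion per pilot preference. The axis ordering, behavior names, and mapping structure are illustrative assumptions, not the project's actual configuration format.

```c
#include <stdint.h>

typedef enum { BEHAV_MAIN, BEHAV_TURN, BEHAV_DEPTH, BEHAV_CRAB } behavior_t;

/* One entry per joystick axis: which behavior it drives and whether to invert. */
typedef struct {
    behavior_t behavior;
    int8_t     invert;   /* 1 to flip the sense of the axis */
} axis_map_t;

/* Example mapping for a flight-style stick: X = turn, Y = main (pushing
 * forward drives the vehicle forward, hence the inversion), twist = crab,
 * throttle slider = depth.  Individual pilots can swap or invert entries. */
static const axis_map_t axis_map[4] = {
    { BEHAV_TURN,  0 },
    { BEHAV_MAIN,  1 },
    { BEHAV_CRAB,  0 },
    { BEHAV_DEPTH, 0 },
};

/* Apply the mapping to raw axis readings already scaled to -100..100 %. */
void map_axes(const int8_t raw[4], int8_t behavior_pct[4])
{
    for (int i = 0; i < 4; i++) {
        int8_t v = axis_map[i].invert ? (int8_t)(-raw[i]) : raw[i];
        behavior_pct[axis_map[i].behavior] = v;
    }
}
```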
2.2 Depth and Heading Control Implementation
In the Pilot Assist Control mode, the Sensor PIC exploits its direct access to vehicle telemetry and its underutilized computational ability in order to compute motor commands capable of implementing closed-loop depth and/or heading control. To enter this mode, the pilot simply presses the appropriate button in order to lock the current depth and/or heading. A proportional control strategy with an empirically tuned gain is used in order to achieve simple but effective control (see Section 3). This strategy is executed by determining the error in depth and/or heading, computing the proportional motor command, and applying deadbands and safety limits where appropriate.
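As a concrete illustration of this strategy, the sketch below computes a proportional thrust command from the depth error, with a deadband and a saturation limit. The gain, deadband, and limit values are placeholders, since the actual gains were tuned empirically during pool and tank testing.

```c
/* Proportional depth hold: error -> thrust command in -100..100 %.
 * Gain, deadband, and limit are illustrative values, not the tuned ones. */
float depth_hold_command(float depth_setpoint_ft, float depth_ft)
{
    const float KP          = 20.0f;  /* % thrust per foot of error, placeholder */
    const float DEADBAND_FT = 0.2f;   /* ignore small errors to avoid chatter */
    const float LIMIT_PCT   = 80.0f;  /* safety cap on automated thrust */

    float error = depth_setpoint_ft - depth_ft;
    if (error > -DEADBAND_FT && error < DEADBAND_FT)
        return 0.0f;

    float cmd = KP * error;
    if (cmd >  LIMIT_PCT) cmd =  LIMIT_PCT;
    if (cmd < -LIMIT_PCT) cmd = -LIMIT_PCT;
    return cmd;
}
```

Heading hold follows the same pattern, with the additional step of wrapping the heading error into the +/-180 degree range before the gain is applied.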
2.3 Data Handling

Network communication among the SX and PIC microcontrollers relies on a standardized serial data protocol, as summarized in Table 1. This protocol represents the range of all Triton commands supported by the Digital Layer.
Flow and control between the microcontrollers is
executed by virtue of a native communication
technique implemented within the SX and PIC chips.
Flow and control between the Hub microcontroller
and any PC in the Extended Layer is implemented
through a send/receive strategy in which the Hub
microcontroller serves as the master.
The first character in the protocol designates the
source of the command. The second character
nominally specifies normal operation; sensed
emergency conditions cause a control override, which
leads to thruster shut-down.
Characters 3-18 designate four vehicle-level motion behaviors: forward/backward motion, left/right turning, vertical motion, and crabbing motion (a side-slip motion implemented via the vertrans thrusters). Each of these behaviors is specified with a set of four characters; the first designates direction while the remaining three explicitly represent thruster power over a 0-100% range.
The final character specifies control of the on-board
camera. Values of this character allow adjustment of
camera zoom and focus.
Table 1 Serial Communication Protocol

Char #   Data        Notes
1        Source      Identifies processor sending string
2        Mode        Standard or Emergency
3        Main Dir.   Forward or Backward
4-6      Main %      0-100% Thruster Power
7        Turn Dir.   Left or Right
8-10     Turn %      0-100% Thruster Power
11       Depth Dir.  Surface or Dive
12-14    Depth %     0-100% Thruster Power
15       Crab Dir.   Left or Right
16-18    Crab %      0-100% Thruster Power
19       Camera      Zoom +/-, Focus +/-, Autofocus
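To make the field layout of Table 1 concrete, the sketch below composes a 19-character command string from a vehicle-level directive. The specific character codes ('J' for the Joystick source, 'S'/'E' for standard/emergency, and the direction letters) are illustrative assumptions, since the paper does not list the exact symbols used.

```c
#include <stdio.h>
#include <stdlib.h>

/* Compose the 19-character command string laid out in Table 1.
 * Signed percentages encode direction; magnitudes are written as three
 * zero-padded digits.  Character codes are assumed for illustration. */
void compose_command(char out[20],
                     char source, int emergency,
                     int main_pct, int turn_pct, int depth_pct, int crab_pct,
                     char camera)
{
    sprintf(out, "%c%c%c%03d%c%03d%c%03d%c%03d%c",
            source,
            emergency ? 'E' : 'S',
            main_pct  >= 0 ? 'F' : 'B', abs(main_pct),
            turn_pct  >= 0 ? 'R' : 'L', abs(turn_pct),
            depth_pct >= 0 ? 'D' : 'S', abs(depth_pct),
            crab_pct  >= 0 ? 'R' : 'L', abs(crab_pct),
            camera);
}

/* Example: half-speed forward with a gentle right turn, no camera action:
 *   char cmd[20];
 *   compose_command(cmd, 'J', 0, 50, 20, 0, 0, '0');
 *   cmd now holds the 19 characters "JSF050R020D000R0000".               */
```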

There are two interesting issues worth pointing out with respect to the aforementioned representation of motion control. First, the protocol specifies four vehicle-level behaviors, each of which involves the coordinated operation of two thrusters. For example, if the turn motion is commanded left, then the port and starboard horizontal thrusters are commanded to thrust backward and forward, respectively, in order to produce this motion. A more natural manner of specifying vehicle control would be to explicitly represent and command each thruster separately within the communication protocol. For this project, however, the vehicle-level behaviors were retained because the Analog Layer control interface accepts commands in that form.
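The coordinated-thruster point can be made concrete with a small mixing function: given the forward/backward and turn percentages, per-thruster commands would be derived roughly as below. This is only a sketch of the idea; on Triton the mixing is performed inside the Analog Layer, which accepts the vehicle-level behaviors directly.

```c
/* Derive per-thruster commands (-100..100 %) from the vehicle-level
 * forward/backward and turn behaviors.  A negative turn_pct is taken to
 * mean "turn left": the port thruster reverses while the starboard
 * thruster pushes forward, yawing the vehicle to port as described above. */
void mix_horizontal(int main_pct, int turn_pct, int *port_pct, int *stbd_pct)
{
    int port = main_pct + turn_pct;
    int stbd = main_pct - turn_pct;

    /* Clamp to the -100..100 % range used by the protocol. */
    if (port >  100) port =  100;
    if (port < -100) port = -100;
    if (stbd >  100) stbd =  100;
    if (stbd < -100) stbd = -100;

    *port_pct = port;
    *stbd_pct = stbd;
}
```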

Second, the overall number of bits used in the command string provides far more information capacity than is required. This inefficient data representation was chosen to make the string easier for human developers and operators to interpret. Given that control performance does not suffer from communication bandwidth limitations, this representation was judged to be appropriate for this project.
2.4 Extended Command & Telemetry Layers
In addition to providing a direct pilot interface, the
Digital Layer provides connectivity to additional,
extended layers of control and data distribution.
Although discussion of these layers is largely beyond
the scope of this paper, several have been
implemented.
First, a PC has been connected to the digital layer in
order to provide a computerized pilot interface.
This consists of a graphical user interface and a
flexible combination of control inputs to include both
the keyboard and joysticks. This computer interface
has also been used to route commands and data to
remote pilots and observers via combinations of radio
frequency and internet-based communications
systems. This capability is part of a significant SCU
effort to develop a robotic control architecture for
distributed educators and researchers.
2.5 Safety and Implementation Issues
With high voltage circuitry and rapidly rotating
propellers, Triton poses significant risks to students
in both the development and operational
environments.
The digital control layer was developed in order to implement a variety of features to improve safety. First, the overall design of the Triton console was reorganized into physically distinct high- and low-voltage circuitry. Second, an emergency stop function directly disables the motor driver circuitry when the emergency button is manually pressed, or automatically if the pilot box becomes disconnected. Third, in the event of microcontroller failure, the motor drive circuitry slowly discharges, thereby stopping the thrusters.
3. EXPERIMENTAL TESTING & RESULTS
To support iterative development, verify end-to-end
functionality, and characterize system performance,
the Triton digital control layer was tested in a variety
of settings.
For laboratory development, a simple hardware-in-the-loop simulator was developed to support component-level functional verification and testing. Several days of testing in two SCU pools provided the opportunity to refine digital control procedures, to observe system operation, and to empirically tune the control loop gains. These pool tests also allowed the team to demonstrate the digital layer's ability to support remote, internet-based piloting of the vehicle via an Extended Layer control system.
To begin a quantitative assessment of the
performance of the Triton digital control system, the
vehicle was evaluated in the test tank at the Monterey
Bay Aquarium Research Institute (MBARI) in
Monterey, California. The results presented here,
while not exhaustive, provide an initial indication of
the quality of the resulting system. Figures 4 and 5
show the dynamic response of the vehicle's depth and heading control loops for step inputs.
A more applied indication of performance involves
how these control loops directly enhance robot
piloting. For example, one essential piloting task is
to keep an object fairly stationary within the on-board
camera's field of view. In quantifying this objective,
the development team noted that the target a) should
move slowly enough to permit human scrutiny, and
b) should be far enough away from the edge of the
field of view to prevent its moving out of sight due to
environmental disturbances. In a comparative sense,
any proposed improvement should lead to tighter
target paths over time and to slower target velocities
across the screen.
To observe the Triton Digital Layer's ability to improve these two criteria, a pilot attempted to keep a light on the side of the MBARI test tank in the camera's field of view; this was done in both the Pilot Only and the Pilot Assist (with both depth and heading lock enabled) control modes. Figure 6 compares the target paths across the camera image plane over a 30 second period; this comparison clearly shows that the Pilot Assist mode enabled a tighter target path. Figure 7 charts the horizontal velocity of the target across the image plane during this same period; this comparison highlights a dramatic improvement in overall task performance. Although this is only an initial experiment with several uncontrolled parameters (such as distance from the target), it suggests that Triton's Pilot Assist capability will be beneficial during field operations.
Apart from these quantitative results, the development team also relied on qualitative feedback from experienced pilots in order to improve and ultimately characterize the performance of the system. During the team's visit to the MBARI test tank, MBARI pilots, a group that includes some of the most experienced and skilled operators of undersea robots in the world, also tested the system. All of the pilots provided positive comments regarding the stability of the vehicle under pilot control; one specifically characterized the robot and the Digital Layer as being "quick and nimble."


Fig. 4. Step response for automated depth control.

Fig. 5. Step response for automated heading control.

Fig. 6. Comparison of a 30 sec target path across the image plane for Pilot Only vs. Pilot Assist (with depth and heading lock) control modes.

Fig. 7. Comparison of horizontal target velocity for Pilot Only vs. Pilot Assist (with depth and heading lock) control modes.

Additional use of Triton during a science deployment in Lake Tahoe, California provided valuable opportunities to gain feedback from the ship's captain and a science team. The two SCU pilots for this mission had previous experience controlling Triton via its standard Analog Layer. These pilots were very impressed with the usefulness of the heading lock function during terrain-following tasks and in configurations where large tether disturbances would have otherwise spun the vehicle in an unwanted manner. In addition, both pilots praised the single joystick interface and its ability to be easily reconfigured for individual pilot preferences. These pilots were unable to notice any loss of responsiveness due to the Digital Layer's operation.
4. FUTURE WORK
The flexible nature of Triton's digital control interface will enable a variety of undergraduate and graduate-level engineering studies.

As an engineering system, the technical performance of the depth and heading control loops will be improved. While the presented work is an outstanding first step, its ad hoc and empirically tuned nature can be improved through formal analysis and improved testing. Future work in this area will include adjustment of the control gains, implementation of depth and heading control with arbitrary set-points (rather than locking only at the current value), the use of position sensors for automated navigation control, and more extensive system verification and validation.

The digital control layer's ability to support internet-based piloting and realtime distribution of science data will be used to provide science and technology validation services to a wide range of distributed collaborators. This capability has been demonstrated in test environments with acceptable performance; future work in this area will focus on quantitatively characterizing and validating its benefits as well as maturing the capability to support marine operations.

5. CONCLUSION

Development of the Triton digital control layer has proved to be a valuable technical exercise. The digital control layer has improved the ability to conduct Triton undersea operations by enhancing pilot operations via the introduction of autopilot functions, by enabling a flexible, customizable control interface, by incorporating diagnostics, and by providing an interface for distributed control. The project has also provided significant insight into the design and implementation of a distributed control architecture currently being developed for the Mantaris vehicle. For Mantaris, this system is the native system, with several microcontrollers deployed throughout the undersea and surface components and with combined Ethernet / RS485 networks supporting command and data handling.

Educationally, this class exercise provided unprecedented opportunities for experiencing engineering design, implementing control theory, understanding the challenges of marine technology, and conducting peer-reviewed engineering research. Finally, the use of a high-quality yet simple robotic vehicle provides an ideal opportunity for experiencing the benefits of project-based education, for building collaborations with local engineering organizations such as MBARI, and for establishing highly motivating learning opportunities.
ACKNOWLEDGEMENTS
The work described in this paper has been supported by the Santa Clara University Technology Steering Committee and the NOAA National Undersea Research Program. In addition, this material is based upon work supported by the National Science Foundation under Grant No. EIA-0079875; any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors thank the dozens of students who have participated in the development and operation of the Triton system. For their mentoring, special recognition is given to Chad Bulich, Jeff Ota, Dr. Geoff Wheat, and members of the SCU Engineering Alumni Board. Finally, appreciation is extended to MBARI and Deep Ocean Engineering for their engineering assistance and use of facilities. This work was performed in partial satisfaction of undergraduate and graduate studies at Santa Clara University.
REFERENCES


Weast, A., et al. (1999a). Triton: Design of an Underwater Remotely Operated Vehicle (ROV). Undergraduate Senior Thesis, Santa Clara University.

Weast, A., et al. (1999b). Integrating Digital Stereo Cameras with Mars Pathfinder Technology for 3D Regional Mapping Underwater. Proceedings of the 1999 IEEE Aerospace Conference, Snowmass, CO.

Yoshida, K., et al. (2000). Development of an adaptive multi-finger gripper system for an underwater ROV. Proceedings of the Advanced Robotics Conference.
