
Motion and Emotional Behavior Design

for Pet Robot Dog

Chi-Tai Cheng, Yu-Ting Yang, Shih-Heng Miao, and Ching-Chang Wong

Department of Electrical Engineering, Tamkang University


151, Ying-Chuan Rd. Tamsui, Taipei County, Taiwan 25137, R.O.C.
wong@ee.tku.edu.tw

Abstract. A pet robot dog with two ears, one mouth, one facial expression
plane, and one vision system is designed and implemented so that it can
perform some emotional behaviors. Three processors (an Intel® Pentium® M
1.0 GHz, an 8-bit 8051 processor, and an embedded soft-core NIOS processor)
are used to control the robot. One camera, one power detector, four touch
sensors, and one temperature detector are used to obtain information about the
environment. The designed robot with 20 DOF (degrees of freedom) is able to
accomplish the walking motion. A behavior system is built on the implemented
pet robot so that it is able to choose a suitable behavior for different
environmental situations. From the practical tests, we can see that the
implemented pet robot dog can engage in some emotional interaction with humans.

Keywords: Robot Dog, Emotional Behavior.

1 Introduction
In the past, pet robots with lovely behaviors have gradually entered human
life [1]. The toy "Furby" made pet robots more popular on the market, although
its processor was slow at the time. In recent years, pet robots with more
intelligence have begun to be presented to the public [2-5]. The Sony AIBO
[6,7] is the most advanced robot dog in this area.
A pet robot dog with 20 DOF (degrees of freedom) is designed and implemented
so that it can perform some emotional behaviors. In order to give the
implemented pet robot a high ability of environmental detection, a vision
system (one USB camera), three touch sensors, and a temperature sensor are
mounted on the body of the implemented robot to obtain information about the
environment and decide an appropriate action. Three processors (an Intel®
Pentium® M 1.0 GHz, an 8051, and a NIOS) are used to control the robot. The
Intel processor receives the vision images from the USB camera and acts as a
high-level controller that processes the high-level artificial intelligence.
The 8051 is an 8-bit processor used for the facial expressions. The NIOS is a
32-bit embedded soft-core processor used for the robot locomotion control; it
acts as a low-level controller that controls the walking and other actions. A
control board with an FPGA chip and a 64 Mb flash memory is mainly utilized to
control the robot. Many functions are implemented on this FPGA chip so that
the designed circuit is simple.

J.-H. Kim et al. (Eds.): FIRA 2009, LNCS 5744, pp. 13–22, 2009.
© Springer-Verlag Berlin Heidelberg 2009

2 Mechanical Structure Design


The main design concept of the pet robot is to make it light in weight and
compact in size. A pet robot dog with 20 DOF (degrees of freedom) is
described in this section. The frameworks of the pet robot are mainly
fabricated from aluminum alloy 5052 in order to realize the concepts of light
weight, wear resistance, high stiffness, and a wide movable range. Each
joint's actuator system consists of a high-torque servomotor and a gear.
Mechanical structure design is the first step in the pet robot dog design. The
robot has 20 degrees of freedom, described with respect to an inertial
coordinate system fixed on the ground. Four of the joints are passive, and the
other 16 degrees of freedom are active joints. There are 2 degrees of freedom
on the neck, 2 degrees of freedom on the ears, and 4 degrees of freedom on
each leg. A photograph of the pet robot dog is shown in Fig. 1. The details of
the development of the head and legs are described as follows:
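As a quick check, the joint allocation above can be tallied in a short sketch (illustrative only; the grouping into active and passive joints follows the text):

```python
# Joint layout of the pet robot dog as described in the text.
dof = {
    "neck": 2,      # turn right-and-left, turn up-and-down
    "ears": 2,      # one actuated joint per ear
    "legs": 4 * 4,  # 4 legs x (3 active servo joints + 1 passive joint)
}

total = sum(dof.values())
active = dof["neck"] + dof["ears"] + 4 * 3  # servo-driven joints
passive = 4 * 1                             # one passive joint per leg

print(total, active, passive)
```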

Fig. 1. Photograph of the implemented pet robot dog

Fig. 2. Head design of the pet robot dog



In the head design, there are four degrees of freedom. Two degrees of
freedom are adopted on the neck so that the head of the robot can turn
right-and-left and up-and-down. Another two degrees of freedom are adopted on
the ears. The head design is shown in Fig. 2. An LED dot-matrix with 8×16
LEDs, shown in Fig. 3, is designed and implemented in the center of the head
so that the robot can perform some emotional interaction.

Fig. 3. The LED dot-matrix entity
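Since the 8×16 dot-matrix is a binary bitmap, one facial expression can be stored as 8 rows of 16 bits each, which the 8051 would scan out row by row. The pattern below is a hypothetical example invented for illustration, not one of the six expressions actually designed for the robot:

```python
# One facial expression as an 8-row x 16-column bitmap (1 = LED on).
# The "two round eyes" pattern itself is invented for illustration.
EXPRESSION = [
    0b0000000000000000,
    0b0000000000000000,
    0b0011100000111000,
    0b0100010001000100,
    0b0011100000111000,
    0b0000000000000000,
    0b0000000000000000,
    0b0000000000000000,
]

def render(rows, width=16):
    """Return the bitmap as text, '#' for lit LEDs, '.' for dark ones."""
    lines = []
    for row in rows:
        lines.append("".join("#" if (row >> (width - 1 - c)) & 1 else "."
                             for c in range(width)))
    return "\n".join(lines)

print(render(EXPRESSION))
```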

(a) (b)

Fig. 4. Leg design of the pet robot dog

(a) (b) (c)

Fig. 5. Three states of the robot walks on an uneven floor: (a) general floor, (b) front rise, and
(c) back rise

In the leg design, there are 16 degrees of freedom on the four legs. Each leg
has 3 active degrees of freedom, which are driven by servomotors. The
remaining degree of freedom is a passive joint that is not driven by any
actuator. The leg design is shown in Fig. 4. The passive joint is used to keep
the end of the leg always parallel to the floor. Three states of the robot
walking on an uneven floor are shown in Fig. 5. We can see that the passive
joint makes the robot more stable.
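If the pitch angles along the leg are measured from the vertical, the foot stays parallel to the floor exactly when the angles along the pitch chain sum to zero, which is the configuration the unactuated joint settles into. A minimal sketch of that constraint (the joint names and sign convention are assumptions, not taken from the paper):

```python
import math

def passive_ankle_angle(hip_pitch, knee_pitch):
    """Angle (radians) at which the unactuated ankle joint settles so that
    the foot stays parallel to the floor: the pitch angles must cancel."""
    return -(hip_pitch + knee_pitch)

# Example: hip rotated 20 deg forward, knee bent back 35 deg.
hip = math.radians(20.0)
knee = math.radians(-35.0)
ankle = passive_ankle_angle(hip, knee)

# The accumulated pitch of the foot relative to the floor is then zero.
foot_pitch = hip + knee + ankle
print(math.degrees(ankle), foot_pitch)
```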

3 Electronic Structure Design


In the electronic design of the robot, the processing devices and the system
block diagram are described in Fig. 6 and Fig. 7, respectively. In order to
build a fully autonomous vision-based pet robot dog, a notebook is chosen to
process the vision image of the environment. The image of the field is
captured by the camera.
Three touch sensors, one power detector, and one temperature sensor are
mounted on the pet robot dog to obtain the touch state, power state, and
temperature state of the robot, respectively. The touch sensors are mounted on
the head to detect touch behavior from a human. The power detector is mounted
on the body to detect the power of the battery. The temperature sensor is
mounted on the joint of a front leg to detect the temperature of the
servomotor. These three kinds of detectors are used to represent the states of
the robot as quantitative data. By analyzing these data, the robot's emotion
can be simulated. The corresponding signals are captured by a System on a
Programmable Chip (SOPC), an FPGA chip in which a 32-bit embedded soft-core
processor named NIOS is used to process the data and control the robot.
The circuit block diagram is described in Fig. 8. It can be divided into three
parts: the LED dot-matrix circuit, the power detector circuit, and the
temperature detector circuit. The LED dot-matrix circuit is built from 128
LEDs and controlled by an 8051 processor; it can express the robot's feelings,
and six kinds of feelings are designed in the LED dot-matrix. The power
detector circuit is shown in Fig. 9, where an ADC0804 is used to convert the
analog signal into an 8-bit digital signal. The temperature detector circuit
is shown in Fig. 10, where a DS1821 sensor is used to detect the temperature
and output digital data.

Fig. 6. The device of pet robot dog

Fig. 7. System block diagram

Fig. 8. Circuit block diagram

Fig. 9. Power detector circuit

Fig. 10. Temperature detector circuit
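The ADC0804 returns an 8-bit code proportional to its input voltage, so the battery voltage can be recovered from the code once the reference voltage and any input divider are known. The reference and divider values below are assumed example values, not taken from the paper's circuit:

```python
def battery_voltage(adc_code, v_ref=5.0, divider_ratio=3.0):
    """Convert an 8-bit ADC0804 reading back to a battery voltage.

    v_ref and divider_ratio are assumed values: the battery voltage is
    scaled down by a resistor divider before reaching the ADC input.
    """
    if not 0 <= adc_code <= 255:
        raise ValueError("ADC0804 produces 8-bit codes (0-255)")
    v_in = adc_code * v_ref / 255.0   # voltage at the ADC input pin
    return v_in * divider_ratio       # undo the divider to get the battery

print(battery_voltage(255))  # full-scale reading
print(battery_voltage(127))
```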

4 Motion and Emotion Structure Design


A mechanical structure with 20 degrees of freedom is proposed to implement the
robot dog. Two move motions ("step forward" and "turn right/left"), six
non-move motions with the ears, and six emotional expressions are designed for
the robot's movements. In order to verify the effectiveness of the implemented
pet robot dog, three kinds of behaviors, Move Motions, Non-move Motions, and
Emotional Expressions, are carried out on a horizontal even plane and
described as follows:

(a) Move Motions


Move motions are designed to let the robot move. The implemented pet robot dog
can perform two kinds of move motions: "step forward" and "turn right/left".
Some snapshots of the step-forward walking of the pet robot are shown in Fig. 11.

(a) (b) (c) (d)

(e) (f) (g) (h)

Fig. 11. Step forward motion
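A statically stable quadruped "step forward" is commonly realized as a crawl gait, swinging one leg at a time while the other three support the body. The leg names and ordering below are a generic crawl sequence, not necessarily the one used on this robot:

```python
# Generic four-beat crawl gait: swing one leg, shift the body, repeat.
# The leg ordering is an illustrative assumption.
SWING_ORDER = ["left_front", "right_hind", "right_front", "left_hind"]

def gait_cycle(order):
    """Yield (swing_leg, support_legs) for one full gait cycle."""
    legs = set(order)
    for leg in order:
        yield leg, sorted(legs - {leg})

for swing, support in gait_cycle(SWING_ORDER):
    print(swing, "<- swinging; supporting:", ", ".join(support))
```

Keeping three legs on the ground at every instant is what lets a slow crawl stay stable without dynamic balance control.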

(b) Non-move Motions


Non-move motions are designed for interaction with humans. They are often used
together with the Emotional Expressions. Some snapshots of the stretch motion
and the ear motion are shown in Fig. 12 and Fig. 13, respectively.

(a) (b) (c) (d)

(e) (f) (g) (h)

Fig. 12. Stretch motion



(a) (b) (c)


Fig. 13. Ear motion

(c) Emotional Expressions


Emotional Expressions are designed to express the robot's emotions more
directly; they make the pet robot friendlier. Some snapshots of the Emotional
Expressions are shown in Fig. 14.

(a) (b) (c) (d) (e) (f)

Fig. 14. Emotional Expression

5 Behavior Design
There are four parts in the behavior design: (1) External Environment System, (2)
Physiology System, (3) Psychology System, and (4) Behavior System. They are
described as follows:
(1) External Environment System
External Environment System includes the "touch sensors", the "vision system",
and the "voice detector". The main idea of the External Environment System is
a natural interface between human and robot, which makes the robot more like a
real pet.
(2) Physiology System
Physiology System includes "seeing people", "being touched", "power quantity",
and "motor's temperature". The main idea of the Physiology System is the body
state of the robot: these quantitative data let the robot understand its own
body state.
(3) Psychology System
Psychology System includes "hungry quantity", "curious quantity", "tire
quantity", and "vitality". The main idea of the Psychology System is to
simulate the feelings of a real dog. The Psychology System is affected by the
Physiology System: the "hungry quantity" is affected by the "power quantity",
the "curious quantity" is affected by "being touched", and the "tire quantity"
is affected by the "motor's temperature". In order to design a realistic pet
robot dog, "vitality" is designed to affect the "curious quantity", which
gives the robot its personality.
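The couplings described above (power to hungry, touch to curious scaled by vitality, motor temperature to tire) can be sketched as a simple update rule. The ranges, gains, and linear form are all assumptions; the paper only states which quantity affects which:

```python
def update_psychology(power, touched, motor_temp, vitality):
    """Map physiological readings to psychological quantities.

    power:      remaining battery, 0.0 (empty) to 1.0 (full)
    touched:    1 if a touch sensor is pressed, else 0
    motor_temp: servo temperature in deg C
    vitality:   personality constant, 0.0 to 1.0
    Returns hungry/curious/tire quantities clamped to 0.0 .. 1.0.
    """
    def clamp(x):
        return max(0.0, min(1.0, x))

    hungry = clamp(1.0 - power)               # low battery -> hungry
    curious = clamp(touched * vitality)       # touch, scaled by vitality
    tire = clamp((motor_temp - 25.0) / 40.0)  # warm servos -> tired
    return {"hungry": hungry, "curious": curious, "tire": tire}

state = update_psychology(power=0.3, touched=1, motor_temp=55.0, vitality=0.8)
print(state)
```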
(4) Behavior System
Behavior System includes "eat", "play intense", "play gently", and "sleep".
The main idea of the Behavior System is motion and response. Because the
"vitality" may differ, the behavior may not always be the same: a robot with
higher "vitality" will play more with humans, while a robot with lower
"vitality" will appear emotionless.
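The behavior arbitration described above can be sketched as picking the behavior with the most urgent drive, with "vitality" deciding between intense and gentle play. The thresholds and priority order are illustrative assumptions:

```python
def choose_behavior(hungry, curious, tire, vitality):
    """Pick one of eat / sleep / play intense / play gently.
    Inputs are quantities in 0.0 .. 1.0; thresholds are assumed values."""
    if hungry > 0.8:     # an empty battery overrides everything
        return "eat"
    if tire > 0.7:       # hot servos: rest before playing
        return "sleep"
    if curious > 0.3:
        # Vitality is the personality knob: lively robots play harder.
        return "play intense" if vitality > 0.5 else "play gently"
    return "sleep"       # nothing to do: doze off

print(choose_behavior(hungry=0.9, curious=0.6, tire=0.2, vitality=0.9))
print(choose_behavior(hungry=0.2, curious=0.6, tire=0.2, vitality=0.9))
print(choose_behavior(hungry=0.2, curious=0.6, tire=0.2, vitality=0.3))
```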
Two experimental results of the robot, in the high "tire quantity" state and
the "sleep" state, are shown in Fig. 15. We can see that the pet robot dog can
display different emotions.

(a) (b) (c)


Fig. 15. The experiment results of the robot in the high “tire quantity” and the “sleep” state

6 Conclusions
A mechanical structure with 20 DOF is proposed so that the implemented pet
robot dog can perform some basic motion experiments. Furthermore, some
emotional behaviors are designed for it. In the future, a fuzzy system will be
considered to give the pet robot dog a behavior learning ability so that it is
able to choose a better behavior for different environmental situations.

Acknowledgement
This research was supported in part by the National Science Council (NSC) of the
Republic of China under contract NSC 97-2218-E-032-001.

References
1. Denavit, J., Hartenberg, R.S.: A kinematic notation for lower-pair mechanisms based on
matrices. Transactions of ASME, Journal of Applied Mechanics 22, 215–221 (1955)
2. Paul, R.P.: Robot Manipulators: Mathematics Programming and Control. MIT Press,
Cambridge (1981)
3. Hirose, S.: A study of design and control of a quadruped walking vehicle. International
Journal of Robotics Research 3 (1984)
4. Blumberg, B.M., Galyean, T.A.: Multi-level direction of autonomous creatures for real-time
virtual environments. In: Proceedings of ACM SIGGRAPH (1995)

5. Blumberg, B.M., Todd, P.T., Maes, P.: No Bad Dogs: Ethological Lessons for Learning in
Hamsterdam. In: Proceedings of International Conference on Simulated Adaptive Behavior,
pp. 295–304 (1996)
6. Kubota, N., Kojima, F., Fukuda, T.: Self-consciousness and Emotion for a Pet Robot with
Structured Intelligence. In: Proceedings of IEEE IFSA World Congress, pp. 1786–2791
(2001)
7. Terzopoulos, D., Tu, X., Grzeszczuk, R.: Artificial fishes: Autonomous locomotion,
perception, behavior, and learning in a simulated physical world. Journal of Artificial
Life 1(4), 327–351 (1994)