Saegusa 2011
DOI 10.1007/s00542-010-1219-1
TECHNICAL PAPER
Received: 7 September 2010 / Accepted: 30 December 2010 / Published online: 12 January 2011
© Springer-Verlag 2011
Abstract  The repletion rate of guide dogs for visually handicapped persons is roughly 10% nationwide. The reasons for this low rate are the long training period and the expense of obtaining a guide dog in Japan. Motivated by these two reasons, we are developing guide-dog robots. The major objective is to develop an intelligent human–robot interface. This paper describes two novel interface algorithms and a strategy for guiding a visually handicapped person. We developed a new leading-edge searching method, which uses a single laser range finder (LRF) to find the center of the corridor in an indoor environment. We also developed a new twin-cluster trace method that can recognize the led person's walking conditions as measured by the LRF. The algorithm allows the guide-dog robot to accurately estimate and anticipate the led person's next move. We experimentally verified these algorithms. The results show that the algorithms are reliable enough to enable the guide-dog robot and the led person to maneuver in a complex corridor environment.

S. Saegusa (✉)
Education and Venture Creation Division, Center for Collaborative Research and Community Cooperation, Hiroshima University, VBL Office 313, Kagamiyama 2-chome, Higashi-Hiroshima 739-8527, Japan
e-mail: shosaegu@hiroshima-u.ac.jp

Y. Yasuda
Yaskawa Electric Corporation, Kitakyushu, Japan

Y. Uratani
Yamaha Corporation, Shizuoka, Japan

E. Tanaka
Shibaura Institute of Technology, Saitama, Japan

T. Makino
Tokuyama National College of Technology, Tokuyama, Japan

J.-Y. Chang
Massey University, Albany, Auckland 0745, New Zealand

1 Introduction

The repletion rate of guide dogs for visually handicapped persons is roughly 10% in Japan. Considering the long training period and high cost of guide dogs, it seems hard to increase this rate significantly within a short period of time. For this reason, we have undertaken the development of a guide-dog robot. Our main goal is to develop an intelligent human–machine interface (HMI) between the visually handicapped person and the guide-dog robot, which is of engineering, social, and economic importance.

The tasks of a guide dog include obeying the owner, notifying the owner of turning points, leading the owner to the target, detecting and avoiding obstacles, and intelligent insubordination (Table 1).

The most important of these tasks is guiding the owner while being aware of his or her walking conditions (walking speed, etc.). Since the owner is visually impaired, his or her interaction with the dog is the key to successful guiding. Tate et al. (1978) investigated a control method to detect the distance between a human and a mobile robot using a hardcoded classical control algorithm. However, this method was not able to handle various walking conditions. Few studies have focused on this subject area, and hence we had to begin our development by studying a guiding system that could handle various user walking conditions. Yamada et al. (2003) proposed a steering-rope method using a force sensor measuring in three Cartesian axes and three Euler rotational axes to serve as the
1170 Microsyst Technol (2011) 17:1169–1174
Table 1

(1) Obey owner: the dog easily obeys the owner's commands
(2) Notify owner: the dog nudges the owner when to turn
(3) Lead owner to target: the dog leads the owner to the target
(4) Detect and avoid obstacles: the dog detects and avoids obstacles that may hinder or endanger the owner
(5) Intelligent insubordination: the dog does not obey orders that would lead into danger
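This excerpt names the leading-edge searching method but does not reproduce its details. As a hedged illustration of the underlying idea only (locating the corridor center from a single LRF scan), the following Python sketch estimates the robot's lateral offset from the centerline using the perpendicular distances to the nearest left and right walls. The function names, the ±30° wall windows, and the sign convention are our own assumptions, not the published algorithm.

```python
import math

def corridor_center_offset(angles_rad, ranges_m):
    """Estimate the robot's lateral offset from the corridor centerline.

    Hypothetical sketch (not the paper's algorithm): the left and right
    walls are assumed to produce the closest perpendicular returns in a
    +-30 degree window around +90 and -90 degrees of bearing.
    """
    def wall_distance(target):
        # perpendicular distance to the wall nearest the target bearing
        window = [(a, r) for a, r in zip(angles_rad, ranges_m)
                  if abs(a - target) < math.radians(30)]
        return min(r * abs(math.sin(a)) for a, r in window)

    left = wall_distance(math.pi / 2)    # wall on the robot's left
    right = wall_distance(-math.pi / 2)  # wall on the robot's right
    # positive result: robot is left of the centerline, so steer right
    return (right - left) / 2.0
```

A proportional steering command on this offset would drive the robot back toward the centerline, which is the role the abstract assigns to the corridor-center search.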
[Figure: experimental circumstances. Layout sketch showing the robot, the feet of the follower, the wall, and obstacles in a corridor area labeled 2.2 m wide and 3 m long.]
Fig. 8 Picture showing a person passing the guide-dog robot and human follower (to the left), and the corresponding LRF measurement result (to the right)
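The twin-cluster trace method is likewise only named in this excerpt. As a rough, hypothetical illustration of the idea behind measurements such as the one in Fig. 8 (tracking the follower's two feet as separate clusters in the LRF scan), the sketch below splits the foot returns at the largest bearing gap and estimates walking speed from the motion of the cluster centroids between scans. All names and the gap-splitting heuristic are assumptions, not the published method.

```python
import math

def split_foot_clusters(points):
    """Split LRF returns from the follower's legs into two clusters.

    Hypothetical sketch (not the paper's method): points (x, y) in the
    robot frame are sorted by bearing and split at the largest angular
    gap, giving one cluster per foot, each summarized by its centroid.
    """
    pts = sorted(points, key=lambda p: math.atan2(p[1], p[0]))
    gaps = [math.atan2(b[1], b[0]) - math.atan2(a[1], a[0])
            for a, b in zip(pts, pts[1:])]
    k = max(range(len(gaps)), key=gaps.__getitem__) + 1

    def centroid(cluster):
        n = len(cluster)
        return (sum(p[0] for p in cluster) / n,
                sum(p[1] for p in cluster) / n)

    return centroid(pts[:k]), centroid(pts[k:])

def walking_speed(prev_feet, curr_feet, dt):
    """Average speed of the two foot centroids between consecutive scans.

    Assumes the feet keep the same ordering from one scan to the next.
    """
    step = [math.hypot(p[0] - c[0], p[1] - c[1])
            for p, c in zip(prev_feet, curr_feet)]
    return sum(step) / (len(step) * dt)
```

Tracking the two centroids over successive scans would let the robot estimate the led person's walking conditions (speed, stride) and adapt its own velocity, which is the role the abstract assigns to this method.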
References