
CHAPTER-1

HUMANOID ROBOTS

1.1 INTRODUCTION

A humanoid robot, or anthropomorphic robot, is a robot whose overall appearance is based on that of the human body, allowing interaction with made-for-human tools or environments. In general, humanoid robots have a torso with a head, two arms and two legs, although some forms of humanoid robots may model only part of the body, for example from the waist up. Some humanoid robots may also have a 'face', with 'eyes' and a 'mouth'. Androids are humanoid robots built to aesthetically resemble a human, as shown in figure 1.1.

Fig 1.1: ASIMO Humanoid Robot

A humanoid robot is an autonomous robot, because it can adapt to changes in its environment or in itself and continue to reach its goal. This is the main difference between humanoid and other kinds of robots. In this context, some of the capacities of a humanoid robot may include, among others:

 self-maintenance (i.e. recharging itself)

 autonomous learning (learning or gaining new capabilities without outside assistance, adjusting strategies based on the surroundings and adapting to new situations)

 avoiding harmful situations to people, property, and itself

 safe interaction with human beings and the environment

Like other mechanical robots, humanoids rely on the same basic components: sensing, actuation, and planning and control. Since they try to simulate human structure and behavior and are autonomous systems, humanoid robots are generally more complex than other kinds of robots. This complexity affects all robotic scales (mechanical, spatial, time, power density, system and computational complexity), but it is most noticeable on the power-density and system-complexity scales. In the first place, most current humanoids are not strong enough even to jump, because their power-to-weight ratio is not as good as that of the human body. The dynamically balancing Dexter can jump, but only poorly so far. On the other hand, there are very good algorithms for the several areas of humanoid construction, but it is very difficult to merge all of them into one efficient system (the system complexity is very high). Nowadays, these are the main difficulties that humanoid robot development has to deal with.

Humanoid robots are created to imitate some of the same physical and mental tasks that humans perform daily. Scientists and specialists from many different fields, including engineering, cognitive science, and linguistics, combine their efforts to create a robot that is as human-like as possible. Their creators' goal is that one day the robot will be able to understand human intelligence and to reason and act like humans. If humanoids are able to do so, they could eventually work in cooperation with humans to create a more productive and higher-quality future. Another important benefit of developing androids is to understand the human body's biological and mental processes, from the seemingly simple act of walking to the concepts of consciousness and spirituality.

There are currently two ways to model a humanoid robot. The first models the robot as a set of rigid links connected by joints. This kind of structure is similar to the one found in industrial robots. Although this approach is used for most humanoid robots, a new one is emerging in some research works that draws on knowledge acquired in biomechanics. In this approach, the humanoid robot's structure essentially resembles the human skeleton. Robots by Feng Yuan are also created purely for entertainment; the Beat Magnum True Hero Robot, for example, is used by children to make small robots fight.

1.2 HISTORY

The concept of human-like automatons is nothing new. As early as the first century A.D., Hero of Alexandria constructed statues that could be animated by water, air and steam pressure. In 1495, Leonardo da Vinci designed and possibly built a mechanical device that looked like an armored knight. It was designed to sit up, wave its arms, and move its head via a flexible neck while opening and closing its jaw. By the eighteenth century, elaborate mechanical dolls were able to write short phrases, play musical instruments, and perform other simple, lifelike acts.

In 1921 the word robot was coined by Karel Čapek in his theatre play R.U.R. (Rossum's Universal Robots). The mechanical servant in the play had a humanoid appearance. The first humanoid robot to appear in the movies was Maria in the film Metropolis. Westinghouse Electric Corporation exhibited the tall motor man Elektro at the 1939 and 1940 World's Fairs. Humanoid in appearance, it could drive on wheels in its feet, play recorded speech, smoke cigarettes, blow up balloons, and move its head and arms. Elektro was controlled by 48 electrical relays and could respond to voice commands.

Humanoid robots were not only part of Western culture. In 1952, Osamu Tezuka created Astro Boy, the first and one of the world's most popular Japanese sci-fi robots. In 1973, the construction of a human-like robot was started at Waseda University in Tokyo. Wabot-1 was the first full-scale anthropomorphic robot able to walk on two legs. It could also communicate with a person in Japanese and was able to grip and transport objects with touch-sensitive hands. The group of Ichiro Kato also developed Wabot-2, which could read music and play an electronic organ. It was demonstrated at Expo 1985 in Tsukuba, Japan. Wabot-2 was equipped with a hierarchical system of 80 microprocessors. Its wire-driven arms and legs had 50 degrees of freedom.

Many researchers have also been inspired by the movie Star Wars, which featured the humanoid robot C-3PO, and by the TV series Star Trek, which featured the humanoid Data. In 1986, Honda began a robot research program with the goal that a robot "should cooperate and coexist with human beings, by doing what a person cannot do and by cultivating a new dimension in mobility to ultimately benefit society". After ten years of research, Honda introduced P2 to the public in 1996, the first self-contained full-body humanoid. It was able not only to walk on flat floors but also to climb stairs. It was followed in 1997 by P3 and in 2000 by Asimo.

In the U.S., Manny, a full-scale android body, was completed by the Pacific Northwest National Laboratory in 1989. Manny had 42 degrees of freedom, but no intelligence or autonomous mobility. Rodney Brooks and his team at MIT started in 1993 to construct the humanoid upper body Cog. It was designed and built to emulate human thought processes and experience the world as a human. Another milestone was the Sony Dream Robot, unveiled by Sony in the year 2000. The small humanoid robot, which was later called Qrio, was able to recognise faces, could express emotion through speech and body language, and could walk on flat as well as irregular surfaces. More recent examples of humanoid robot appearances in the movies include David from A.I. and NS-5 from I, Robot. The development of humanoid robots through time is shown in the table below.

Table 1.1: Developments through time

Year        Development

c. 250 BC   The Lie Zi described an automaton. [3]

c. 50 AD    Greek mathematician Hero of Alexandria described a machine to automatically pour wine for party guests. [4]

1206        Al-Jazari described a band made up of humanoid automata which, according to Charles B. Fowler, performed "more than fifty facial and body actions during each musical selection." [5] Al-Jazari also created hand-washing automata with automatic humanoid servants, [6] and an elephant clock incorporating an automatic humanoid mahout striking a cymbal on the half-hour. His programmable "castle clock" also featured five musician automata which automatically played music when moved by levers operated by a hidden camshaft attached to a water wheel. [7]

1495        Leonardo da Vinci designs a humanoid automaton that looks like an armored knight, known as Leonardo's robot. [8]

1738        Jacques de Vaucanson builds The Flute Player, a life-size figure of a shepherd that could play twelve songs on the flute, and The Tambourine Player, which played a flute and a drum or tambourine. [9]

1774        Pierre Jaquet-Droz and his son Henri-Louis created the Draughtsman, the Musicienne and the Writer, a figure of a boy that could write messages up to 40 characters long. [10]

1837        The story of the Golem of Prague, a humanoid artificial intelligence activated by inscribing Hebrew letters on its forehead, based on Jewish folklore, was created by Jewish German writer Berthold Auerbach for his novel Spinoza.

1921        Czech writer Karel Čapek introduced the word "robot" in his play R.U.R. (Rossum's Universal Robots). The word "robot" comes from the Czech word "robota", meaning "forced labour, drudgery". [11]

1927        The Maschinenmensch ("machine-human"), a gynoid humanoid robot, also called "Parody", "Futura", "Robotrix", or the "Maria impersonator" (played by German actress Brigitte Helm), perhaps the most memorable humanoid robot ever to appear on film, is depicted in Fritz Lang's film Metropolis.

1941-42     Isaac Asimov formulates the Three Laws of Robotics and, in the process of doing so, coins the word "robotics".

1948        Norbert Wiener formulates the principles of cybernetics, the basis of practical robotics.

1961        The first digitally operated and programmable non-humanoid robot, the Unimate, is installed on a General Motors assembly line to lift hot pieces of metal from a die-casting machine and stack them. It was created by George Devol and constructed by Unimation, the first robot manufacturing company.

1969        D.E. Whitney publishes his article "Resolved motion rate control of manipulators and human prostheses". [12]

1970        Miomir Vukobratović proposed the Zero Moment Point, a theoretical model to explain biped locomotion. [13]

1972        Miomir Vukobratović and his associates at the Mihajlo Pupin Institute build the first active anthropomorphic exoskeleton.

1973        At Waseda University, in Tokyo, Wabot-1 is built. It was able to communicate with a person in Japanese and to measure distances and directions to objects using external receptors, artificial ears and eyes, and an artificial mouth. [14]

1980        Marc Raibert established the MIT Leg Lab, which is dedicated to studying legged locomotion and building dynamic legged robots. [15]

1983        Using MB Associates arms, "Greenman" was developed by the Space and Naval Warfare Systems Center, San Diego. It had an exoskeletal master controller with kinematic equivalency and spatial correspondence of the torso, arms, and head. Its vision system consisted of two 525-line video cameras, each having a 35-degree field of view, and video camera eyepiece monitors mounted in an aviator's helmet. [16]

1984        At Waseda University, the Wabot-2 is created, a musician humanoid robot able to communicate with a person, read a normal musical score with its eyes and play tunes of average difficulty on an electronic organ. [17]

1985        Developed by Hitachi Ltd, WHL-11 is a biped robot capable of static walking on a flat surface at 13 seconds per step; it can also turn. [18]

1985        WASUBOT is another musician robot from Waseda University. It performed a concerto with the NHK Symphony Orchestra at the opening ceremony of the International Science and Technology Exposition.

1986        Honda developed seven biped robots designated E0 (Experimental Model 0) through E6. E0 was built in 1986, E1-E3 between 1987 and 1991, and E4-E6 between 1991 and 1993. [19]

1989        Manny was a full-scale anthropomorphic robot with 42 degrees of freedom developed at Battelle's Pacific Northwest Laboratories in Richland, Washington, for the US Army's Dugway Proving Ground in Utah. It could not walk on its own but it could crawl, and it had an artificial respiratory system to simulate breathing and sweating. [20]

1990        Tad McGeer showed that a biped mechanical structure with knees could walk passively down a sloping surface. [21]

1993        Honda developed P1 (Prototype Model 1) through P3, an evolution from the E series, with upper limbs. Developed until 1997. [22]

1995        Hadaly was developed at Waseda University to study human-robot communication. It has three subsystems: a head-eye subsystem, a voice control system for listening and speaking in Japanese, and a motion-control subsystem to use the arms to point toward campus destinations.

1995        Wabian is a human-size biped walking robot from Waseda University.

1996        Saika, a lightweight, human-size and low-cost humanoid robot, was developed at Tokyo University. Saika has a two-DOF neck, dual five-DOF upper arms, a torso and a head. Several types of hands and forearms were also developed. Developed until 1998. [23]

1997        Hadaly-2, developed at Waseda University, is a humanoid robot which realizes interactive communication with humans. It communicates not only informationally, but also physically.

2000        Honda creates its 11th bipedal humanoid robot, ASIMO. [24]

2001        Sony unveils small humanoid entertainment robots, dubbed the Sony Dream Robot (SDR). Renamed Qrio in 2003.

2001        Fujitsu realized its first commercial humanoid robot, named HOAP-1. Its successors HOAP-2 and HOAP-3 were announced in 2003 and 2005, respectively. HOAP is designed for a broad range of applications in the R&D of robot technologies. [25]

2003        JOHNNIE, an autonomous biped walking robot built at the Technical University of Munich. The main objective was to realize an anthropomorphic walking machine with a human-like, dynamically stable gait. [26]

2003        Actroid, a robot with realistic silicone "skin", developed by Osaka University in conjunction with Kokoro Company Ltd. [27]

2004        Persia, Iran's first humanoid robot, was developed using realistic simulation by researchers of Isfahan University of Technology in conjunction with ISTT. [28]

2004        KHR-1, a programmable bipedal humanoid robot, introduced in June 2004 by the Japanese company Kondo Kagaku.

2005        The PKD Android, a conversational humanoid robot made in the likeness of science fiction novelist Philip K. Dick, was developed as a collaboration between Hanson Robotics, the FedEx Institute of Technology, and the University of Memphis. [29]

2005        Wakamaru, a Japanese domestic robot made by Mitsubishi Heavy Industries, primarily intended to provide companionship to elderly and disabled people. [30]

2006        REEM-A, a biped humanoid robot designed to play chess with the Hydra chess engine. The first robot developed by PAL Robotics, it was also used as a walking, manipulation, speech and vision development platform. [31]

2007        TOPIO, a ping-pong-playing robot developed by TOSY Robotics JSC. [32]

2008        Justin, a humanoid robot developed by the German Space Agency (DLR). [33]

2008        KT-X, the first international humanoid robot developed as a collaboration between the five-time consecutive RoboCup champions, Team Osaka, and KumoTek Robotics. [34]

2008        Nexi, the first mobile, dexterous and social robot, makes its public debut as one of TIME magazine's top inventions of the year. [35] The robot was built through a collaboration between the MIT Media Lab Personal Robots Group, [36] Xitome Design, [37] UMass Amherst and Meka Robotics. [38][39]

2008        REEM-B, the second biped humanoid robot developed by PAL Robotics. It has the ability to autonomously learn its environment using various sensors and to carry 20% of its own weight. [40]

2009        HRP-4C, a Japanese domestic robot made by the National Institute of Advanced Industrial Science and Technology, shows human characteristics in addition to bipedal walking.

2009        Turkey's first dynamically walking humanoid robot, SURALP, is developed by Sabanci University in conjunction with Tubitak. [41]

2010        NASA and General Motors revealed Robonaut 2, a very advanced humanoid robot. It was part of the payload of Shuttle Discovery on the successful launch of February 24, 2010. It is intended to do spacewalks for NASA. [42]

2010        Students at the University of Tehran, Iran, unveil the Surena II. It was unveiled by President Mahmoud Ahmadinejad. [43]

2010        Researchers at Japan's National Institute of Advanced Industrial Science and Technology demonstrate their humanoid robot HRP-4C singing and dancing along with human dancers. [44]

2010        In September, the National Institute of Advanced Industrial Science and Technology also demonstrates the humanoid robot HRP-4. The HRP-4 resembles the HRP-4C in some regards but is called "athletic" and is not a gynoid.

2010        REEM, a humanoid service robot with a wheeled mobile base. Developed by PAL Robotics, it can perform autonomous navigation in various surroundings and has voice and face recognition capabilities. [45]

2011        In November, Honda unveiled its second-generation Asimo robot. The all-new Asimo is the first version of the robot with semi-autonomous capabilities.

1.3 PURPOSE

Humanoid robots are used as a research tool in several scientific areas. Researchers need to understand the human body's structure and behavior (biomechanics) to build and study humanoid robots. On the other hand, the attempt to simulate the human body leads to a better understanding of it. Human cognition is a field of study focused on how humans learn from sensory information in order to acquire perceptual and motor skills. This knowledge is used to develop computational models of human behavior, and it has been improving over time. It has been suggested that very advanced robotics will facilitate the enhancement of ordinary humans. Although the initial aim of humanoid research was to build better orthoses and prostheses for human beings, knowledge has been transferred between both disciplines. A few examples are powered leg prostheses for the neuromuscularly impaired, ankle-foot orthoses, biologically realistic leg prostheses and forearm prostheses.

Fig 1.2: Enon, a robot created for personal assistance

Besides research, humanoid robots are being developed to perform human tasks like personal assistance, where they should be able to assist the sick and elderly, and dirty or dangerous jobs. Regular jobs like being a receptionist or a worker on an automotive manufacturing line are also suitable for humanoids.

Fig 1.3: Nao, a robot created for companionship

In essence, since they can use tools and operate equipment and vehicles designed for the human form, humanoids could theoretically perform any task a human being can, so long as they have the proper software. However, the complexity of doing so is deceptively great. They are also becoming increasingly popular for providing entertainment. For example, Ursula, a female robot, sings, dances, and speaks to her audiences at Universal Studios. Several Disney attractions employ animatronics, robots that look, move, and speak much like human beings, in some of their theme park shows. These animatronics look so realistic that it can be hard to tell from a distance whether or not they are actually human. Although they have a realistic look, they have no cognition or physical autonomy. Various humanoid robots and their possible applications in daily life are featured in an independent documentary film called Plug & Pray, which was released in 2010.

Fig 1.4: A Robonaut

Humanoid robots, especially those with artificial intelligence algorithms, could be useful for future dangerous and/or distant space exploration missions, without the need to return to Earth once the mission is completed.

CHAPTER-2

FEATURES AND TECHNOLOGY

2.1 INTRODUCTION

In order to make a humanoid work as desired, certain features and technologies are used to make it more efficient in real-time applications. This is done by using more advanced circuits like sensors, actuators, stacked dies, a processor, a signal converter and also memory. The computer that controls the humanoid robot's movement is housed in the robot's waist area and can be controlled by a PC, a wireless controller, or voice commands. This is shown in figure 2.1.

Fig 2.1: Humanoid robot connected to a PC with wires

2.2 SENSORS

A sensor is a device that measures some attribute of the world. Being one of the three primitives of robotics (besides planning and control), sensing plays an important role in robotic paradigms. Sensors can be classified according to the physical process with which they work or according to the type of measurement information that they give as output. Here, the second approach is used.

2.2.1 Proprioceptive Sensors


Proprioceptive sensors sense the position, the orientation and the speed of the humanoid's body and joints. In human beings, the inner ear is used to maintain balance and orientation. Humanoid robots use accelerometers to measure acceleration, from which velocity can be calculated by integration; tilt sensors to measure inclination; force sensors placed in the robot's hands and feet to measure contact force with the environment; position sensors, which indicate the actual position of the robot (from which the velocity can be calculated by differentiation); or even speed sensors. A small sketch of these two velocity estimates follows this paragraph.
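The following minimal Python sketch illustrates the two relationships mentioned above: velocity obtained by integrating accelerometer readings, and velocity obtained by differentiating position readings. The sample values, sampling rate and variable names are illustrative assumptions, not data from any particular robot.

```python
# Minimal sketch: two ways of estimating velocity from proprioceptive sensors.
# The 100 Hz sampling rate and the sample readings are assumptions.

DT = 0.01  # sampling period in seconds (100 Hz, assumed)

# Hypothetical accelerometer readings along one axis (m/s^2)
accel = [0.0, 0.2, 0.5, 0.4, 0.1]

# Hypothetical position readings along the same axis (m)
position = [0.000, 0.001, 0.004, 0.008, 0.011]

# Velocity by integrating acceleration (rectangular rule)
vel_from_accel = [0.0]
for a in accel:
    vel_from_accel.append(vel_from_accel[-1] + a * DT)

# Velocity by differentiating position (finite differences)
vel_from_pos = [(position[i + 1] - position[i]) / DT
                for i in range(len(position) - 1)]

print("velocity from accelerometer:", vel_from_accel[1:])
print("velocity from position sensor:", vel_from_pos)
```

In practice the two estimates would typically be fused (for example with a complementary or Kalman filter), since integration drifts over time and differentiation amplifies sensor noise.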

2.2.2 Exteroceptive Sensors

Arrays of tactels can be used to provide data on what has been touched. The Shadow Hand uses an array of 34 tactels arranged beneath its polyurethane skin on each fingertip. [2] Tactile sensors also provide information about forces and torques transferred between the robot and other objects. Vision refers to processing data from any modality which uses the electromagnetic spectrum to produce an image. In humanoid robots it is used to recognize objects and determine their properties. Vision sensors work most similarly to the eyes of human beings, and most humanoid robots use CCD cameras as vision sensors. Sound sensors allow humanoid robots to hear speech and environmental sounds, performing the role of the ears of the human being. Microphones are usually used for this task.

2.3 ACTUATORS

Actuators are the motors responsible for motion in the robot. Humanoid robots are constructed in such a way that they mimic the human body, so they use actuators that perform like muscles and joints, though with a different structure. To achieve the same effect as human motion, humanoid robots mainly use rotary actuators. They can be electric, pneumatic, hydraulic, piezoelectric or ultrasonic. Hydraulic and electric actuators have a very rigid behavior and can only be made to act in a compliant manner through the use of relatively complex feedback control strategies. While electric coreless motor actuators are better suited for high-speed, low-load applications, hydraulic ones operate well at low speed and high load. Piezoelectric actuators generate a small movement with a high force capability when voltage is applied. They can be used for ultra-precise positioning and for generating and handling high forces or pressures in static or dynamic situations. Ultrasonic actuators are designed to produce movements on a micrometer scale at ultrasonic frequencies (over 20 kHz). They are useful for controlling vibration, positioning applications and quick switching. Pneumatic actuators operate on the basis of gas compressibility. As they are inflated, they expand along the axis, and as they deflate, they contract. If one end is fixed, the other will move in a linear trajectory. These actuators are intended for low-speed, low/medium-load applications. Among pneumatic actuators there are cylinders, bellows, pneumatic engines, pneumatic stepper motors and pneumatic artificial muscles.

Fig 2.2: An artificial hand holding a light bulb

2.4 PLANNING AND CONTROL

In planning and control, the essential difference between humanoids and other kinds of robots (like industrial ones) is that the movement of the robot has to be human-like, using legged locomotion, especially biped gait. The ideal planning for humanoid movements during normal walking should result in minimum energy consumption, as it does in the human body. For this reason, studies on the dynamics and control of these kinds of structures have become more and more important.

To maintain dynamic balance during the walk, a robot needs information about contact forces and about its current and desired motion. The solution to this problem relies on a major concept, the Zero Moment Point (ZMP). Another characteristic of humanoid robots is that they move, gather information (using sensors) on the "real world" and interact with it. They do not stay still like factory manipulators and other robots that work in highly structured environments. To allow humanoids to move in complex environments, planning and control must focus on self-collision detection, path planning and obstacle avoidance. Humanoids do not yet have some features of the human body. These include structures with variable flexibility, which provide safety (to the robot itself and to people), and redundancy of movements, i.e. more degrees of freedom and therefore wider task availability. Although these characteristics are desirable in humanoid robots, they bring more complexity and new problems to planning and control.

2.5 STATE-OF-THE-ART

Although, from the above, it may seem that the most important issues in the construction and control of humanoid robots have been solved, this is not at all the case. The capabilities of current humanoid robots are rather limited when compared to humans.

2.5.1 Bipedal Locomotion

The distinctive feature of full-body humanoids is bipedal locomotion. Walking and running on two legs may seem simple, but humanoid robots still have serious difficulties with it. There are two opposing approaches to bipedal walking. The first one is based on the zero-moment-point (ZMP) theory, introduced by Vukobratović. The ZMP is defined as the point on the ground about which the sum of the moments of all the active forces equals zero. If the ZMP is within the convex hull (support polygon) of all contact points between the feet and the ground, a bipedal robot is dynamically stable (a small sketch of this test follows this paragraph). The use of the ZMP to judge stability was a major advance over the center-of-mass projection criterion, which describes static stability. Prominent robots which rely on ZMP-based control include Honda Asimo and Sony Qrio. Asimo was shown in 2006 to be capable of running at 6 km/h. However, its gait with bent knees does not look human-like. It does not recycle energy stored in elastic elements the way humans do and, hence, it is not energy-efficient. Furthermore, Asimo requires flat, stable ground for walking and running and can only climb certain stairs.
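The following minimal Python sketch illustrates the ZMP criterion described above: a cart-table (linearized inverted pendulum) approximation of the ZMP position, followed by a test of whether it lies inside the support polygon. The rectangular foot geometry and the sample values are illustrative assumptions; real controllers use the full convex hull of all contact points and more complete dynamics.

```python
# Minimal sketch of the ZMP stability test described above.
# Foot geometry and sample values are illustrative assumptions.

G = 9.81  # gravitational acceleration (m/s^2)

def zmp_from_com(com_x, com_acc_x, com_height):
    """ZMP x-coordinate under the common linearized (cart-table) model:
    x_zmp = x_com - (z_com / g) * x_com_acceleration."""
    return com_x - (com_height / G) * com_acc_x

def zmp_inside_support(zmp_x, zmp_y, x_min, x_max, y_min, y_max):
    """Dynamically stable if the ZMP lies inside this (here rectangular)
    approximation of the support polygon."""
    return x_min <= zmp_x <= x_max and y_min <= zmp_y <= y_max

# Hypothetical single-support polygon: one foot, 24 cm long and 10 cm wide,
# centered at the origin of the ground frame.
foot = dict(x_min=-0.12, x_max=0.12, y_min=-0.05, y_max=0.05)

zmp_x = zmp_from_com(com_x=0.02, com_acc_x=0.5, com_height=0.8)
print(round(zmp_x, 3))                          # -> -0.021
print(zmp_inside_support(zmp_x, 0.0, **foot))   # True: dynamically stable
print(zmp_inside_support(0.20, 0.0, **foot))    # False: risk of tipping
```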

Fig 2.3: Bipedal Locomotion

A completely different approach to walking is to utilize the robot's dynamics. In 1990, McGeer showed that planar walking down a slope is possible without actuators and control. Based on his ideas of passive dynamic walking, actuated machines have recently been built. These machines are able to walk on level ground. Because their actuators only support the inherent machine dynamics, they are very energy-efficient. They are easy to control, e.g. by relying on foot-contact sensors. However, because they use round feet, these machines cannot stand still. So far, these machines also cannot start or stop walking and are not able to change speed or direction. What is missing in current humanoid robots is the ability to walk on difficult terrain and to reject major disturbances, like pushes. Such capabilities were demonstrated by the quadruped BigDog. This robot, however, is not suited for indoor use due to its combustion engine and hydraulic actuators. First steps towards bipedal push recovery have been made in simulation using Pratt's concept of the capture point. It is difficult to transfer these simulation results to physical robots, partly due to the lack of suitable actuators. Although hydraulic actuators (e.g. the Sarcos biped used at ATR and CMU) and pneumatic actuators (e.g. Lucy, designed in Brussels) have been used in bipeds to implement compliant joints, their walking performance is still not convincing.

2.5.2 Perception
Humanoid robots must perceive their own state and the state of their environment in order to act successfully. For proprioception, the robots measure the state of their joints using encoders, force sensors, or potentiometers. Important for balance is the estimation of the robot's attitude. This is done using accelerometers and gyroscopes. Many humanoid robots also measure ground reaction forces or forces at the hands and fingers. Some humanoid robots are covered with force-sensitive skin. One example of such a robot is CB2, developed at Osaka University. Although some humanoid robots use super-human senses, such as laser rangefinders or ultrasonic distance sensors, the most important modalities for humanoid robots are vision and audition. Many robots are equipped with two movable cameras. These cameras are used as an active vision system, allowing the robots to focus their attention towards relevant objects in their environment. Movable cameras make depth estimation from disparity more difficult, however. For this reason, fixed calibrated cameras are used for stereo (a small sketch of the underlying relation follows this paragraph). Most humanoid robots are equipped with onboard computers for image interpretation. Interpreting real-world image sequences is not a solved problem, though. Hence, many humanoid vision systems work well only in a simplified environment. Frequently, key objects are color-coded to make their perception easier. Similar difficulties arise when interpreting the audio signals captured by onboard microphones. One major problem is the separation of the sound source of interest (e.g. a human communication partner) from other sound sources and noise. Turning the microphones towards the source of interest and beam-forming in microphone arrays are means of active hearing. While they improve the signal-to-noise ratio, the interpretation of the audio signal is still difficult. Even the most advanced speech recognition systems have substantial word error rates. Due to the described difficulties in perception, some humanoid projects resort to teleoperation, where the signals captured by the robot are interpreted by a human. Examples of teleoperated humanoids include the Geminoid developed by Ishiguro, the Robonaut developed by NASA, and the PR1 developed at Stanford.
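As a rough illustration of how a fixed, calibrated stereo pair recovers depth, the Python sketch below applies the standard pinhole relation depth = focal length x baseline / disparity. The focal length, baseline and disparity values are illustrative assumptions, not parameters of any particular robot's cameras.

```python
# Minimal sketch: depth from disparity for a fixed, calibrated stereo pair.
# Focal length (pixels), baseline (meters) and disparities are assumptions.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.09):
    """Standard pinhole stereo relation Z = f * B / d, where d is the
    horizontal pixel offset of the same point in the left and right image."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# A distant object produces a small disparity, a nearby object a large one.
for d in (5.0, 20.0, 60.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
```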

2.5.3 Human-Robot Interaction


Many humanoid research projects focus on human-robot interaction. The general idea here is that the efficient techniques which evolved in our culture for human-human communication also allow for intuitive human-machine communication. This includes multiple modalities like speech, eye gaze, facial expressions, gestures with arms and hands, body language, etc. These modalities are easy to interpret for the human sensory system. Because we have practiced them since early childhood, face recognition, gesture interpretation, etc. seem to be hard-wired in our brains. A smile from a robot does not need much explanation. In order to address these modalities, communication robots are equipped with expressive animated heads.

Fig 2.4: Janet robot interacting with human

Examples include Kismet and Leonardo, developed at MIT, and WE-4RII, developed at Waseda. Movable eyes, head, and chest communicate where the robot focuses its attention. When the robot looks at the interaction partner, the partner feels addressed. Some robots animate their mouth while generating speech. This helps the listener to detect voice activity. Some robots have an emotional display. By moving eyebrows, eyelids, the mouth, and possibly other parts of the face, a number of basic emotions can be expressed. The expression of the emotional state can be supported by adapting the pitch, loudness, and speed of the synthesized speech. Robots with anthropomorphic arms and hands can be used to generate gestures. At least four joints per arm are needed. One example of an upper-body robot used to generate a variety of gestures is Joy, developed at KAIST, Korea. The gestures generated by humanoids include symbolic gestures, such as greeting and waving, batonic gestures, which emphasize accompanying speech, and pointing gestures, which indicate a direction or reference an object. The size of objects can also be indicated with arms and hands. The robot head can be used for pointing, nodding and shaking as well. Robots with articulated fingers like Hubo, also developed at KAIST, may even be used to generate sign language.

Full-body humanoids can use their entire body for communication using body language. Wabian-RII, for example, was programmed to generate emotional walking styles. Another example is HRP-2, which reproduced a Japanese dance captured from a human dancer. The most extreme form of communication robots are androids and gynoids, which aim for a photorealistic human-like appearance. Their faces are covered with silicone skin, they have human-like hair, and they are dressed like humans. Some of these robots are modeled after living persons, such as Repliee Q2, developed in Osaka, and the copy of Zou Ren Ti, developed at XSM, China. These robots, however, suffer heavily from the uncanny valley effect: attractiveness does not increase monotonically as robots become more human-like, but drops suddenly close to perfect human-likeness. While the synthesis part of multimodal interaction works reasonably well, the insufficient perception performance of computer vision and audition systems and the lack of true meaning in dialogue systems so far prevent humanoid robots from engaging in truly intuitive multimodal interactions with humans.

2.5.4 Dexterous Manipulation
Another key human capability is dexterous manipulation. The human hand has about thirty degrees of freedom. It is not easy to reproduce its strength, flexibility, and sensitivity. Among the most advanced robotic hands are the Shadow hand, which is driven by 40 air muscles, and the four-finger hand developed by DLR and HIT. Dexterous manipulation requires not only capable hands, but also hand-arm coordination and the coordination of the two hands and the vision system. Due to the high number of joints involved, controlling grasping and manipulation is challenging. Three examples of manipulation-oriented humanoid robots are the Robonaut, which uses the space tools designed for humans, Justin, for which DLR developed an impedance-based control scheme, and Twendy-One, which is equipped with passive impedances in the actuators. While the performance of these robots is impressive, they cannot grasp and manipulate unknown objects. This is mainly due to deficits in the perception of grasping affordances. The interpretation of the touch and force sensors integrated in the hands must also be improved in order to allow for blind adjustments of the grip, the way humans do it.

Fig 2.5: Humanoid robot with dexterous manipulation

2.5.5 Learning and Adaptive Behaviour


To be useful in everyday environments, humanoid robots must be able to adapt existing capabilities and to cope with changes. They are also required to quickly learn new skills. Fortunately, humanoid robots have the unique possibility to learn from capable teachers, the humans in their environment. This is called imitation learning or programming by demonstration. Imitation learning has been applied, for example, to complex motions like swinging a tennis racket or generating gestures, and to manipulation tasks. One difficulty of imitation learning is the perception of the teacher. Frequently, external motion capture systems relying on special markers or attached motion sensors are used to sense the motion of the teacher. Another difficulty is the mapping between the human body and the robot's body. Some human motions may not be possible for the robot, e.g. due to a lack of joints, limited joint angles or dynamic constraints (a small sketch of this retargeting step follows this paragraph). On the other hand, the robot might have degrees of freedom that are not constrained by the captured motion. These must be controlled by optimizing secondary criteria such as energy use. A frequently used possibility to simplify imitation is to rely on kinesthetic teaching, where the teacher directly moves the limbs of the robot. While the imitation of human motion patterns greatly assists in the generation of humanoid motion, it does not suffice. Because of the differences between the teacher and the robot, for true imitation the robot must infer the intentions of the human teacher and come up with its own strategy to accomplish the same goals.
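The Python sketch below illustrates one small part of the body-mapping problem mentioned above: captured human joint angles are clamped to the robot's narrower joint limits, and joints the robot lacks are dropped. The joint names, limits and captured values are invented for illustration and do not describe any specific robot or motion-capture format.

```python
# Minimal sketch of one retargeting step in imitation learning:
# mapping captured human joint angles onto a robot with narrower limits.
# Joint names and limits (in radians) are illustrative assumptions.

ROBOT_JOINT_LIMITS = {
    "shoulder_pitch": (-1.5, 1.5),
    "elbow": (0.0, 2.0),
    "hip_pitch": (-0.8, 0.8),
}

def retarget(human_angles):
    """Clamp each captured angle to the robot's joint range; joints the
    robot does not have (e.g. spine joints) are simply dropped."""
    robot_angles = {}
    for joint, angle in human_angles.items():
        if joint not in ROBOT_JOINT_LIMITS:
            continue
        lo, hi = ROBOT_JOINT_LIMITS[joint]
        robot_angles[joint] = min(max(angle, lo), hi)
    return robot_angles

captured = {"shoulder_pitch": 1.9, "elbow": 1.2, "spine_yaw": 0.3}
print(retarget(captured))  # {'shoulder_pitch': 1.5, 'elbow': 1.2}
```

A real retargeting pipeline would also handle the unconstrained robot degrees of freedom mentioned above, for example by optimizing secondary criteria such as energy use.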

Fig 2.6: Robot serving food to human

Another possibility to optimize the behavior of humanoid robots is reinforcement learning. Here, the robot receives rewards or punishments while interacting with the environment. The problem is then to find the policy of actions which maximizes the discounted future reward (a small sketch of this quantity follows this paragraph). Many techniques for reinforcement learning exist. Among the most promising are stochastic gradient techniques, which have been used to optimize bipedal walking patterns, and natural actor-critic learning, which has been shown to learn hitting a ball with a bat. One general difficulty with reinforcement learning is the generation of the rewards. It cannot be assumed that the environment generates a reward structure sufficient for the learning of complex tasks. Rather, intermediate rewards must be generated by the robot itself when successfully accomplishing subtasks or when meeting intrinsic needs. On the other hand, the robot must generate negative rewards when violating intrinsic constraints, for example when falling. Especially in such situations it is crucial to learn from few examples. In addition to the learning of behavior, learning and adaptation are also important for perception. Computer vision and speech recognition, for example, must adapt to changing lighting conditions and varying auditory backgrounds. The robots are required to familiarize themselves with new objects in their environment and also need to learn new words and names. For navigation, some humanoid robots construct maps of their surroundings. One example of successful mapping with a humanoid has been presented by Gutmann et al. using Sony Qrio.
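The discounted future reward mentioned above can be made concrete in a few lines of Python; the sketch below computes the discounted return G = sum over t of gamma^t * r_t for a reward sequence. The reward values and the discount factor 0.9 are illustrative assumptions.

```python
# Minimal sketch: the discounted return that a reinforcement-learning
# policy tries to maximize, G = sum_t gamma^t * r_t.
# The reward sequence and the discount factor are illustrative assumptions.

def discounted_return(rewards, gamma=0.9):
    """Sum of rewards, each weighted by gamma raised to its time step."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

# Example: small intermediate rewards for accomplishing subtasks,
# followed by a large negative reward for falling.
rewards = [1.0, 1.0, 1.0, -10.0]
print(round(discounted_return(rewards), 2))  # 1.0 + 0.9 + 0.81 - 7.29 = -4.58
```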

CHAPTER-3
APPLICATIONS

3.1 INTRODUCTION

Because the capabilities of humanoid robots are rather limited, there are few real-world
applications for them so far. The most visible use of humanoid robots is technology
demonstration.

3.1.1 Technology Demonstration


Famous humanoid robots like the Honda Asimo or the Toyota Partner Robots do not accomplish any useful work. They are, however, presented to the media and demonstrate their capabilities, like walking, running, climbing stairs, playing musical instruments or conducting orchestras, on stage and during exhibitions. Such a showcase of corporate technology attracts public attention and strengthens the brand of the car manufacturers. Hence, the huge development costs of these advanced humanoids might be covered by their marketing budgets.

Fig 3.1: Asimo playing football

3.1.2 Space Missions
Another area where money is not much of an issue is missions to space. Since human life support in space is costly and space missions are dangerous, there is a need to complement or replace humans in space with human-like robots. The two prominent projects in this area are the NASA Robonaut and DLR's Justin. Both use a humanoid torso mounted on a wheeled base. The humanoid appearance of the robots is justified, because they can keep using space-certified tools which have been designed for humans and because the humanoid body makes teleoperation by humans easier.

Fig 3.2: Robonauts

3.1.3 Manufacturing
While the robot arms used in industrial mass production are not anthropomorphic at all, the Japanese company Yaskawa sees a market for human-like dual-arm robots in manufacturing. It recently announced the Motoman-SDA10 robot, which consists of two 7-DOF arms on a torso that has an additional rotational joint. Each arm has a payload of 10 kg. Yaskawa aims to directly replace humans on production lines. The robot is able to hold a part with one arm while using a tool with the other arm. It can also pass a part from one arm to the other without setting it down. The sales target for the SDA10 is 3,000 units per year.

3.1.4 Household
An obvious domain for the use of humanoid robots is the household. Some humanoid projects explicitly address this domain. They include the Armar series of robots developed in Karlsruhe, Twendy-One developed at Waseda University, and the personal robot PR1 developed at Stanford. While these robots demonstrate impressive isolated skills needed in a household environment, they are far from autonomous operation in an unmodified household.

Fig 3.3: Humanoid helping a human

3.1.5 Robot Competitions


A currently more viable application for humanoid robots is robot competitions. RoboCup and FIRA, for example, feature competitions for humanoid soccer robots. These robots are fully autonomous and play together as a team. When they fall, they get up by themselves and continue playing. The participating research groups either construct their own robots or use the commercial humanoid robot kits available, e.g., from Robotis and Kondo. RoboCup also selected the Aldebaran Nao humanoid robot as the successor of the Sony Aibo in the Standard Platform League. Another popular competition for humanoid robots is Robo-One, where teleoperated robots engage in martial arts.

Fig 3.4: Robowar

There are also competitions for robots in human-populated environments, like the AAAI mobile robot competition, where the robots are supposed to attend a conference, and RoboCup@Home, where the robots are supposed to do useful work in a home environment. Because they provide a standardized test bed, such robot competitions serve as benchmarks for AI and robotics.

REFERENCES

1. en.wikipedia.org/wiki/Humanoid_robot

2. www.inel.gov/adaptiverobotics/humanoidrobotics/

3. www.ai.mit.edu/projects/humanoid-robotics-group

4. www.kuffner.org/james/humanoid

