
BACHELOR OF ELECTRICAL ENGINEERING (SKEE)

PROFESSIONAL ENGINEERING PRACTICE

GROUP PROJECT
BLUE EYES TECHNOLOGY

LECTURER: DR. IR. MOKHTAR BIN HARUN

PREPARED BY:
1. MUHAMMAD NAQIUDDIN BIN ZULKEFLI
-SX160590KEES04
2. MUHAMMAD FAREED AMIERUL BIN ROSMADI
-SX160584KEES04
3. MUHAMMAD HAZIQ BIN ABD KARIM
-SX160653KEES04
4. MOHD ZULFADLI BIN AZIZ
-SX160571KEES04
ABSTRACT
Is it possible to create a computer that can interact with us the way we interact
with each other? Imagine that one fine morning you walk into your computer room
and switch on your computer, and it tells you, "Hey friend, good morning, you
seem to be in a bad mood today." It then opens your mailbox, shows you some of
your mail, and tries to cheer you up. This may sound like fiction, but it will be the
life led by "BLUE EYES" in the very near future. The basic idea behind this
technology is to give computers human-like perceptual power. We all have
perceptual abilities: we can understand each other's feelings, for example by
analyzing a person's facial expression to judge his emotional state. Adding these
human perceptual abilities to computers would enable computers to work together
with human beings as intimate partners. The "BLUE EYES" technology aims at
creating computational machines that have perceptual and sensory abilities like
those of human beings. How can we make computers "see" and "feel"? Blue Eyes
uses sensing technology to identify a user's actions and to extract key information.
This information is then analyzed to determine the user's physical, emotional, or
informational state, which in turn can be used to help make the user more
productive by performing expected actions or by providing expected information.
For example, in the future a Blue Eyes-enabled television could become active
when the user makes eye contact, at which point the user could tell the television to
"turn on". This paper discusses the hardware, software, benefits, and
interconnection of the various parts involved in the Blue Eyes technology.
INTRODUCTION TO BLUE EYES TECHNOLOGY

Imagine a world where humans interact naturally with computers.


You are sitting in front of a personal computer that can listen, talk, or even
scream aloud. It has the ability to gather information about you and interact with you
through special techniques like facial recognition and speech recognition. It can
even understand your emotions at the touch of the mouse. It verifies your identity,
feels your presence, and starts interacting with you.
You ask the computer to call your friend at his office. It realizes the
urgency of the situation through the mouse, dials your friend at his office, and
establishes a connection. The BLUE EYES technology aims at creating computational
machines that have perceptual and sensory abilities like those of human beings. It
employs the most modern video cameras and microphones to identify the user's
actions through the use of imparted sensory abilities. The machine can understand
what a user wants, where he is looking, and even his physical or emotional
state.
The U.S. computer giant IBM has been conducting research on the
Blue Eyes technology at its Almaden Research Center (ARC) in San Jose, California,
since 1997. The ARC is IBM's main laboratory for basic research. The primary
objective of the research is to give a computer the ability of the human being to assess
a situation by using the senses of sight, hearing and touch. Animal survival depends
on highly developed sensory abilities. Likewise, human cognition depends on highly
developed abilities to perceive, integrate, and interpret visual, auditory, and touch
information. Without a doubt, computers would be much more powerful if they had
even a small fraction of the perceptual ability of animals or humans. Adding such
perceptual abilities to computers would enable computers and humans to work
together more as partners. Toward this end, the Blue Eyes project aims at creating
computational devices with the sort of perceptual abilities that people take for
granted. Thus Blue Eyes is a technology that lets computers sense and understand
human behavior and feelings and react in the proper way.
AIMS
1) To design smarter devices
2) To create devices with emotional intelligence
3) To create computational devices with perceptual abilities
The idea of giving computers personality or, more accurately, "emotional
intelligence" may seem creepy, but technologists say such machines would offer
important advantages.
Despite their lightning speed and awesome powers of computation,
today's PCs are essentially deaf, dumb, and blind. They can't see you, they can't hear
you, and they certainly don't care a whit how you feel. Every computer user knows
the frustration of nonsensical error messages, buggy software, and abrupt system
crashes. We might berate the computer as if it were an unruly child, but, of course, the
machine can't respond. "It's ironic that people feel like dummies in front of their
computers, when in fact the computer is the dummy," says Rosalind Picard, a
computer science professor at the MIT Media Lab in Cambridge.
A computer endowed with emotional intelligence, on the other hand,
could recognize when its operator is feeling angry or frustrated and try to respond in
an appropriate fashion. Such a computer might slow down or replay a tutorial
program for a confused student, or recognize when a designer is burned out and
suggest he take a break. It could even play a recording of Beethoven's "Moonlight
Sonata" if it sensed anxiety or serve up a rousing Springsteen anthem if it detected
lethargy. The possible applications of "emotion technology" extend far beyond the
desktop.
A car equipped with an affective computing system could recognize
when a driver is feeling drowsy and advise her to pull over, or it might sense when a
stressed-out motorist is about to explode and warn him to slow down and cool off.
Human cognition depends primarily on the ability to perceive,
interpret, and integrate audio, visual, and sensory information. Adding extraordinary
perceptual abilities to computers would enable computers to work together with
human beings as intimate partners. Researchers are attempting to add more capabilities
to computers that will allow them to interact like humans: recognize human presence,
talk, listen, or even guess their feelings.

TRACKS USED
Our emotional changes are mostly reflected in our heart rate,
breathing rate, facial expressions, eye movements, voice, etc. Hence these are the
parameters on which the Blue Eyes technology is being developed.
To make computers "see" and "feel", Blue Eyes uses sensing technology to
identify a user's actions and to extract key information. This information is then
analyzed to determine the user's physical, emotional, or informational state, which in
turn can be used to help make the user more productive by performing expected
actions or by providing expected information.
Beyond making computers more user-friendly, researchers say there is another
compelling reason for giving machines emotional intelligence. Contrary to the
common wisdom that emotions contribute to irrational behavior, studies have shown
that feelings actually play a vital role in logical thought and decision-making.
Emotionally impaired people often find it difficult to make decisions because they fail
to recognize the subtle cues and signals (does this make me feel happy or sad,
excited or bored?) that help direct healthy thought processes. It stands to reason,
therefore, that computers that can emulate human emotions are more likely to behave
rationally, in a manner we can understand. Emotions are like the weather: we only
pay attention to them when there is a sudden outburst, like a tornado, but in fact they
are constantly operating in the background, helping to monitor and guide our
day-to-day activities.
Picard, who is also the author of the groundbreaking book Affective
Computing, argues that computers should operate under the same principle. "They
have tremendous mathematical abilities, but when it comes to interacting with people,
they are autistic," she says. "If we want computers to be genuinely intelligent and
interact naturally with us, we must give them the ability to recognize, understand, and
even to have and express emotions." Imagine the benefit of a computer that could
remember that a particular Internet search had resulted in a frustrating and futile
exploration of cyberspace. Next time, it might modify its investigation to improve the
chances of success when a similar request is made.
METHODS

1) AFFECT DETECTION
This is the method of detecting our emotional state from the
expressions on our face. Algorithms amenable to real-time implementation that extract
information from facial expressions and head gestures are being explored. Most of the
information is extracted from the position of the eyebrows and the corners of the
mouth.
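
As a rough illustration of this idea, the sketch below assumes an upstream face
tracker has already produced normalized eyebrow and mouth-corner displacements;
the thresholds and labels are invented for illustration, not values from the Blue
Eyes project.

```python
# Minimal sketch: map a few facial-landmark displacements to a coarse
# emotion label. The FaceLandmarks values are assumed to come from an
# upstream face tracker; the thresholds below are illustrative only.

from dataclasses import dataclass

@dataclass
class FaceLandmarks:
    eyebrow_raise: float      # eyebrow height above its neutral position
    mouth_corner_lift: float  # positive = corners pulled up, negative = down

def classify_expression(f: FaceLandmarks) -> str:
    """Assign a coarse emotional label from landmark displacements."""
    if f.eyebrow_raise > 0.15 and f.mouth_corner_lift > 0.05:
        return "surprise/joy"
    if f.mouth_corner_lift > 0.05:
        return "joy"
    if f.mouth_corner_lift < -0.05 and f.eyebrow_raise < 0.0:
        return "sadness/anger"
    return "neutral"

print(classify_expression(FaceLandmarks(eyebrow_raise=0.2, mouth_corner_lift=0.1)))
```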

2) MAGIC POINTING
MAGIC stands for Manual And Gaze Input Cascaded pointing.
A computer with this technology could move the cursor by following the
direction of the user's eyes. This type of technology will enable the computer to
automatically transmit information related to the screen area that the user is gazing at.
Also, it will enable the computer to determine, from the user's expression, whether he
or she understood the information on the screen, before automatically deciding to
proceed to the next program. Pointing is still done by hand, but the cursor always
appears at the right position as if by MAGIC. By cascading manual input with eye
tracking, we get MAGIC pointing.
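
The core mechanic can be shown in a short sketch. This is a minimal illustration
of the "warp the cursor to the gaze target once manual movement begins" idea, not
IBM's implementation: the gaze estimate is assumed to come from an eye tracker,
and the warp threshold is a made-up parameter.

```python
# Sketch of MAGIC pointing: when the user starts moving the mouse, jump
# the cursor to the vicinity of where they are looking, then let the hand
# do the fine positioning. Gaze and cursor are (x, y) screen coordinates.

import math

WARP_THRESHOLD = 120  # px; only warp when the gaze target is far away

def magic_warp(cursor, gaze, manual_motion_started: bool):
    """Return the new cursor position under MAGIC pointing."""
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if manual_motion_started and math.hypot(dx, dy) > WARP_THRESHOLD:
        return gaze   # coarse jump to the gaze target; hand refines from here
    return cursor     # nearby targets are handled purely by hand

print(magic_warp(cursor=(100, 100), gaze=(800, 450), manual_motion_started=True))
```

The threshold keeps short manual movements untouched, so the eye tracker's
limited accuracy never interferes with fine positioning.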

3) SUITOR
SUITOR stands for Simple User Interest Tracker. It implements
a method for putting computational devices in touch with their users' changing
moods. By watching which web page the user is currently browsing, SUITOR can
find additional information on that topic. The key is that the user simply interacts with
the computer as usual, and the computer infers user interest based on what it sees the
user do.
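
As a hedged illustration of this inference loop (not IBM's SUITOR code), the
sketch below hard-codes a tiny headline store and picks a topic keyword from the
page the user is reading; a real system would hook into the browser and a live
news feed.

```python
# Sketch of the SUITOR idea: infer the user's current interest from the
# text they are reading and surface a related item. HEADLINES stands in
# for a real news feed; the page text stands in for a browser hook.

from collections import Counter

HEADLINES = {
    "eye": "New eye-tracking interfaces announced",
    "emotion": "Affective computing moves into cars",
    "mouse": "Sensor-laden input devices hit the lab",
}

STOPWORDS = {"the", "a", "of", "and", "to", "is", "in", "as"}

def infer_interest(page_text: str) -> str:
    """Pick the most frequent non-stopword as a crude topic keyword."""
    words = [w for w in page_text.lower().split() if w not in STOPWORDS]
    return Counter(words).most_common(1)[0][0]

def suggest(page_text: str) -> str:
    return HEADLINES.get(infer_interest(page_text), "(no related items)")

print(suggest("the eye moves as the eye reads a page of text"))
```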

4) EMOTION MOUSE
This is a mouse embedded with sensors that can sense
physiological attributes such as temperature, body pressure, pulse rate, and touching
style. The computer can determine the user's emotional state from a single touch.
IBM is still performing research on this mouse, which is expected to be available in
the market within the next two or three years. The expected accuracy is 75%.
One goal of human-computer interaction (HCI) is to make an
adaptive, smart computer system. This type of project could possibly include gesture
recognition, facial recognition, eye tracking, speech recognition, etc. Another
non-invasive way to obtain information about a person is through touch. People use
their computers to obtain, store, and manipulate data.
In order to start creating smart computers, the computer must start
gaining information about the user. Our proposed method for gaining user information
through touch is via a computer input device, the mouse. From the physiological data
obtained from the user, an emotional state may be determined which would then be
related to the task the user is currently doing on the computer. Over a period of time, a
user model will be built in order to gain a sense of the user's personality.
The scope of the project is to have the computer adapt to the user in
order to create a better working environment where the user is more productive. The
first steps towards realizing this goal are described here.
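
To make the pipeline concrete, here is a minimal sketch, assuming the mouse
supplies GSR, temperature, and pulse readings and that per-emotion prototype
vectors have been learned beforehand; all numbers are invented for illustration.

```python
# Sketch of the Emotion Mouse pipeline: take physiological readings from
# the mouse sensors, subtract a per-user baseline, and assign the nearest
# emotion prototype. Prototype values are made-up numbers; a real system
# would learn them from labelled sessions.

import math

# (GSR, skin temperature, pulse) deviations from the user's baseline
PROTOTYPES = {
    "calm":       (0.0,  0.0,  0.0),
    "frustrated": (1.5, -0.5,  8.0),
    "excited":    (1.0,  0.5, 12.0),
}

def classify(reading, baseline):
    """Nearest-prototype match on baseline-corrected sensor values."""
    delta = tuple(r - b for r, b in zip(reading, baseline))
    return min(PROTOTYPES, key=lambda emo: math.dist(delta, PROTOTYPES[emo]))

baseline = (2.0, 33.0, 70.0)                  # resting GSR, temp (°C), pulse (bpm)
print(classify((3.4, 32.6, 79.0), baseline))  # -> "frustrated"
```

A nearest-prototype rule is deliberately simple here; the point is only that
baseline correction makes readings comparable across users.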

2.1. EMOTION AND COMPUTING


Rosalind Picard (1997) describes why emotions are important to the
computing community. There are two aspects of affective computing: giving the
computer the ability to detect emotions and giving the computer the ability to express
emotions. Not only are emotions crucial for rational decision making but emotion
detection is an important step towards an adaptive computer system. The goal of such
an adaptive, smart computer system has been driving our efforts to detect a person’s
emotional state.
By matching a person’s emotional state with the context of the
expressed emotion over a period of time, the person’s personality can be observed.
Therefore, by giving the computer a longitudinal understanding of the emotional state
of its user, the computer could adopt a working style which fits its user’s
personality. The result of this collaboration could be increased productivity for the user.
One way of gaining information from a user non-intrusively is by video. Cameras
have been used to detect a person’s emotional state. We have explored gaining
information through touch. One obvious place to put sensors is on the mouse.
Figure 2.1: Physiological signals sensed by the Emotion Mouse
THEORY
Based on Paul Ekman’s facial expression work, we see a correlation
between a person’s emotional state and a person’s physiological measurements.
Selected works from Ekman and others on measuring facial behaviors describe
Ekman’s Facial Action Coding System (Ekman and Rosenberg, 1997).
One of his experiments involved participants attached to devices to
record certain measurements, including pulse, galvanic skin response (GSR),
temperature, somatic movement, and blood pressure. He then recorded the
measurements as the participants were instructed to mimic facial expressions
corresponding to the six basic emotions. He defined the six basic emotions as anger,
fear, sadness, disgust, joy, and surprise. From this work, Dryer (1993) determined how
physiological measures could be used to distinguish various emotional states. The
measures taken were GSR, heart rate, skin temperature, and general somatic activity
(GSA). These data were then subjected to two analyses. For the first analysis, a
multidimensional scaling (MDS) procedure was used to determine the dimensionality
of the data.
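
As a small sketch of this first analysis step, the code below applies
scikit-learn's MDS to a fabricated score matrix; the study's actual data are not
reproduced here.

```python
# Sketch: multidimensional scaling (MDS) on physiological difference
# scores, to inspect how the six basic emotions separate in a low-
# dimensional space. The score matrix is fabricated for illustration.

import numpy as np
from sklearn.manifold import MDS

# rows: six basic emotions; columns: (GSA, GSR, pulse, skin temperature)
scores = np.array([
    [0.8, 1.2, 9.0, -0.4],   # anger
    [0.5, 1.0, 7.5, -0.6],   # fear
    [-0.2, 0.3, 1.0, -0.1],  # sadness
    [0.4, 0.9, 3.0, -0.2],   # disgust
    [0.6, 0.4, 5.0, 0.3],    # joy
    [0.9, 0.8, 8.0, 0.1],    # surprise
])

# Embed the six emotions in two dimensions; well-separated points suggest
# the physiological measures can distinguish the emotional states.
embedding = MDS(n_components=2, random_state=0).fit_transform(scores)
print(embedding.round(2))
```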

RESULT
The data for each subject consisted of scores for four physiological
assessments (GSA, GSR, pulse, and skin temperature) for each of the six emotions
(anger, disgust, fear, happiness, sadness, and surprise) across the five-minute baseline
and test sessions. GSA data was sampled 80 times per second, GSR and temperature
were reported approximately 3-4 times per second, and pulse was recorded as each
beat was detected, approximately once per second. To account for individual variance
in physiology, we calculated the difference between the baseline and test scores. Scores
that differed by more than one and a half standard deviations from the mean were
treated as missing. By this criterion, twelve scores were removed from the analysis.
The results show that the theory behind the Emotion Mouse is fundamentally sound.
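
The scoring procedure described above can be sketched in a few lines, assuming a
matrix of baseline-corrected difference scores; the numbers below are placeholders,
not the study's data.

```python
# Sketch of the scoring step: difference scores (test session minus
# baseline session), with values further than 1.5 standard deviations
# from the column mean treated as missing (NaN).

import numpy as np

# rows: subjects; columns: GSA, GSR, pulse, skin temperature
diff = np.array([
    [0.4, 1.1, 9.0, -0.3],
    [0.5, 0.9, 8.0, -0.4],
    [0.3, 1.0, 30.0, -0.2],   # the pulse value here is an outlier
    [0.6, 1.2, 7.0, -0.5],
])

mean, sd = diff.mean(axis=0), diff.std(axis=0)
cleaned = np.where(np.abs(diff - mean) > 1.5 * sd, np.nan, diff)
print(cleaned)   # only the outlying pulse score becomes NaN
```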

CONCLUSION:

The nineties witnessed quantum leaps in interface design for improved


man-machine interaction. The BLUE EYES technology ensures a convenient way of
simplifying life by providing more delicate and user-friendly facilities in
computing devices. Now that the method has been proven, the next step is to improve
the hardware. Instead of using cumbersome modules to gather information about the
user, it will be better to use smaller and less intrusive units. The day is not far when
this technology will push its way into your household, making you lazier. It may
even reach your handheld mobile device. Anyway, this is only a technological
forecast.
REFERENCES:

1. Levin, J. L., "An Eye-Controlled Computer."
2. Silbert, L. and Jacob, R., "The Advantage of Eye Gaze Interactions."
3. Bolt, R. A., "Eyes at the Interface."
