GROUP PROJECT
BLUE EYES TECHNOLOGY
PREPARED BY:
1. MUHAMMAD NAQIUDDIN BIN ZULKEFLI
-SX160590KEES04
2. MUHAMMAD FAREED AMIERUL BIN ROSMADI
-SX160584KEES04
3. MUHAMMAD HAZIQ BIN ABD KARIM
-SX160653KEES04
4. MOHD ZULFADLI BIN AZIZ
-SX160571KEES04
ABSTRACT
Is it possible to create a computer that can interact with us the way we interact with
each other? Imagine walking into your computer room one fine morning, switching on
your computer, and hearing it say, “Hey friend, good morning. You seem to be in a
bad mood today.” It then opens your mailbox, shows you some of your mails, and
tries to cheer you up. This may sound like fiction, but it will be everyday life with
“BLUE EYES” in the very near future. The basic idea behind this technology is to
give the computer human-like perceptual power. We all have perceptual abilities: we
can understand each other’s feelings, for example by reading a person’s emotional
state from their facial expression. Adding these human perceptual abilities to
computers would enable them to work together with human beings as intimate
partners. The “BLUE EYES” technology aims at creating computational machines
that have perceptual and sensory abilities like those of human beings. How can we
make computers "see" and "feel"? Blue Eyes uses sensing technology to identify a
user's actions and to extract key information. This information is then analyzed to
determine the user's physical, emotional, or informational state, which in turn can be
used to make the user more productive by performing expected actions or by
providing expected information. For example, a future Blue Eyes-enabled television
could become active when the user makes eye contact, at which point the user could
tell the television to "turn on". This paper describes the hardware, software, benefits,
and interconnection of the various parts involved in the Blue Eyes technology.
INTRODUCTION TO BLUE EYES TECHNOLOGY
TRACKS USED
Our emotional changes are mostly reflected in our heart pulse
rate, breathing rate, facial expressions, eye movements, voice, etc. These are
therefore the parameters on which Blue Eyes technology is being developed.
Making computers see and feel: Blue Eyes uses sensing technology to
identify a user's actions and to extract key information. This information is then
analyzed to determine the user's physical, emotional, or informational state, which in
turn can be used to make the user more productive by performing expected actions
or by providing expected information.
Beyond making computers more personable, researchers say there is another
compelling reason for giving machines emotional intelligence. Contrary to the
common wisdom that emotions contribute to irrational behavior, studies have shown
that feelings actually play a vital role in logical thought and decision-making.
Emotionally impaired people often find it difficult to make decisions because they fail
to recognize the subtle cues and signals (does this make me feel happy or sad,
excited or bored?) that help direct healthy thought processes. It stands to reason,
therefore, that computers that can emulate human emotions are more likely to behave
rationally, in a manner we can understand. Emotions are like the weather: we only
pay attention to them when there is a sudden outburst, like a tornado, but in fact they
are constantly operating in the background, helping to monitor and guide our
day-to-day activities.
Picard, who is also the author of the groundbreaking book Affective
Computing, argues that computers should operate under the same principle. "They
have tremendous mathematical abilities, but when it comes to interacting with people,
they are autistic," she says. "If we want computers to be genuinely intelligent and
interact naturally with us, we must give them the ability to recognize, understand, and
even to have and express emotions." Imagine the benefit of a computer that could
remember that a particular Internet search had resulted in a frustrating and futile
exploration of cyberspace. Next time, it might modify its investigation to improve the
chances of success when a similar request is made.
METHODS
1) AFFECT DETECTION
This is the method of detecting our emotional states from the
expressions on our face. Algorithms amenable to real-time implementation that extract
information from facial expressions and head gestures are being explored. Most of the
information is extracted from the position of the eyebrows and the corners of the
mouth.
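The idea above can be sketched as a toy classifier. This is only an illustrative heuristic, not the real-time algorithms the text refers to: the landmark names, thresholds, and emotion labels are all assumptions, chosen to show how eyebrow and mouth-corner positions alone can drive a decision.

```python
# Toy affect-detection heuristic (illustrative only): classify an
# expression from the eyebrow raise and the lift of the mouth corners,
# the two features the text says carry most of the information.

def detect_affect(brow_raise: float, mouth_corner_lift: float) -> str:
    """brow_raise and mouth_corner_lift are normalized offsets in [-1, 1]
    relative to a neutral face (positive = raised/lifted)."""
    if brow_raise > 0.5:
        return "surprise"       # strongly raised eyebrows
    if mouth_corner_lift > 0.3:
        return "happiness"      # mouth corners lifted into a smile
    if mouth_corner_lift < -0.3:
        return "sadness"        # mouth corners drawn down
    if brow_raise < -0.3:
        return "anger"          # eyebrows lowered into a frown
    return "neutral"

print(detect_affect(0.7, 0.0))   # raised brows -> surprise
print(detect_affect(0.0, 0.6))   # lifted mouth corners -> happiness
```

A real system would first locate these landmarks in video frames; the classification step at the end would look much like this, only learned from data rather than hand-set thresholds.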
2) MAGIC POINTING
MAGIC stands for Manual And Gaze Input Cascaded pointing.
A computer with this technology could move the cursor by following the
direction of the user's eyes. This type of technology will enable the computer to
automatically transmit information related to the screen that the user is gazing at.
Also, it will enable the computer to determine, from the user's expression, whether he
or she understood the information on the screen, before automatically deciding to
proceed to the next program. The pointing is still done by hand, but the cursor always
appears at the right position as if by magic. By cascading manual input with eye
tracking, we get MAGIC pointing.
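The cascade described above can be sketched in a few lines. This is a hedged sketch, not IBM's implementation: the function name, coordinate convention, and warp threshold are assumptions, chosen to show the division of labor where gaze does the coarse jump and the hand does the fine correction.

```python
# Sketch of MAGIC pointing: the cursor warps to the gaze point when
# the gaze lands far from it, and the hand always makes the small
# final correction. Threshold and coordinates are illustrative.

def magic_pointing(cursor, gaze, manual_delta, warp_threshold=200.0):
    """Return the new cursor position.
    cursor, gaze: (x, y) in pixels; manual_delta: (dx, dy) from the mouse."""
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if (dx * dx + dy * dy) ** 0.5 > warp_threshold:
        # Gaze has jumped to a new target: warp the cursor there.
        cursor = gaze
    # Hand motion always supplies the precise final adjustment.
    return (cursor[0] + manual_delta[0], cursor[1] + manual_delta[1])

print(magic_pointing((0, 0), (500, 400), (3, -2)))  # warps, then nudges: (503, 398)
```

The threshold matters: without it, eye-tracker jitter would make the cursor shake, so small gaze movements are left to the hand alone.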
3) SUITOR
SUITOR stands for Simple User Interest Tracker. It implements
a method for putting computational devices in touch with their users' changing
moods. By watching which web page the user is currently browsing, SUITOR can
find additional information on that topic. The key is that the user simply interacts with
the computer as usual, and the computer infers user interest based on what it sees the
user do.
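A minimal sketch of this inference step might look as follows. The keyword matching is a stand-in assumption, not SUITOR's actual mechanism; the topic lists and page text are invented for illustration.

```python
# Sketch of SUITOR-style interest tracking: rank candidate topics by
# how many of their keywords appear on the page the user is viewing,
# then surface the best-matching topics as "additional information".

from collections import Counter

def infer_interests(page_text, topics):
    """topics: mapping of topic name -> list of keywords.
    Returns topic names with at least one hit, best match first."""
    words = page_text.lower().split()
    counts = Counter()
    for topic, keywords in topics.items():
        counts[topic] = sum(words.count(k) for k in keywords)
    return [t for t, n in counts.most_common() if n > 0]

topics = {"eye tracking": ["gaze", "eye", "pupil"],
          "affective computing": ["emotion", "mood", "affect"]}
page = "The gaze tracker follows the eye to estimate the user's emotion"
print(infer_interests(page, topics))  # ['eye tracking', 'affective computing']
```

The point mirrors the text: the user does nothing special, and interest is inferred purely from observed behavior.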
4) EMOTION MOUSE
This is a mouse embedded with sensors that can sense
physiological attributes such as temperature, body pressure, pulse rate, and touching
style. The computer can determine the user's emotional state from a single touch.
IBM is still performing research on this mouse, and it is expected to be available in
the market within the next two or three years. The expected accuracy is 75%.
One goal of human-computer interaction (HCI) is to make an
adaptive, smart computer system. Such a project could include gesture
recognition, facial recognition, eye tracking, speech recognition, etc. Another
non-invasive way to obtain information about a person is through touch. People use
their computers to obtain, store, and manipulate data.
In order to start creating smart computers, the computer must start
gaining information about the user. Our proposed method for gaining user information
through touch is via a computer input device, the mouse. From the physiological data
obtained from the user, an emotional state may be determined which would then be
related to the task the user is currently doing on the computer. Over a period of time, a
user model will be built in order to gain a sense of the user's personality.
The scope of the project is to have the computer adapt to the user in
order to create a better working environment where the user is more productive. The
first steps towards realizing this goal are described here.
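The mapping from sensed physiological attributes to an emotional state can be sketched as nearest-profile matching. This is an assumption about how such a classifier might work, not IBM's method; the reference profiles and channel scales are invented for illustration, and in practice they would come from the per-user model the text describes.

```python
# Hedged sketch of the Emotion Mouse idea: match a sensed sample of
# (skin temperature, pulse, touch pressure) against per-emotion
# reference profiles and return the closest one.

def classify_emotion(sample, profiles):
    """sample: (temp_C, pulse_bpm, pressure); profiles: emotion -> tuple.
    Returns the emotion whose profile is nearest in scaled distance."""
    def dist(a, b):
        # Scale each channel so no single unit dominates the distance.
        scales = (1.0, 0.1, 10.0)
        return sum(((x - y) * s) ** 2 for x, y, s in zip(a, b, scales))
    return min(profiles, key=lambda e: dist(sample, profiles[e]))

profiles = {"anger":     (34.5, 95, 0.8),   # hot, fast pulse, hard grip
            "happiness": (33.0, 75, 0.4),
            "sadness":   (32.0, 65, 0.3)}
print(classify_emotion((33.2, 78, 0.45), profiles))  # 'happiness'
```

Building the profiles per user is what the baseline sessions in the next section are for: the same raw reading can mean different things for different people.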
RESULT
The data for each subject consisted of scores for four physiological
assessments (GSA, GSR, pulse, and skin temperature) for each of the six emotions
(anger, disgust, fear, happiness, sadness, and surprise) across the five-minute baseline
and test sessions. GSA data was sampled 80 times per second, GSR and temperature
were reported approximately 3-4 times per second, and pulse was recorded as each
beat was detected, approximately once per second. To account for individual variance
in physiology, we calculated the difference between the baseline and test scores.
Scores that differed by more than one and a half standard deviations from the mean
were treated as missing. By this criterion, twelve scores were removed from the
analysis. The results show that the theory behind the Emotion Mouse is
fundamentally sound.
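The two-step procedure described above (baseline differencing, then the 1.5-standard-deviation missing-data criterion) can be reproduced directly. The numbers below are made up for illustration; only the procedure follows the text.

```python
# Sketch of the analysis: take test-minus-baseline difference scores,
# then drop any score more than 1.5 standard deviations from the mean,
# treating it as missing (the paper's criterion).

from statistics import mean, stdev

def difference_scores(baseline, test):
    """Per-sample test-minus-baseline differences."""
    return [t - b for b, t in zip(baseline, test)]

def drop_outliers(scores, k=1.5):
    """Keep scores within k sample standard deviations of the mean."""
    m, s = mean(scores), stdev(scores)
    return [x for x in scores if abs(x - m) <= k * s]

base = [70, 68, 72, 71, 69, 70]
test = [75, 70, 74, 90, 71, 72]    # one exaggerated pulse reading
diffs = difference_scores(base, test)
print(diffs)                        # [5, 2, 2, 19, 2, 2]
print(drop_outliers(diffs))         # the 19 is treated as missing
```

Differencing against each subject's own baseline is what makes scores comparable across people, for the same reason noted with the Emotion Mouse: absolute physiological readings vary widely between individuals.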
CONCLUSION