

International Journal of Computer Science and Information Security (IJCSIS),
Vol. 15, No. 3, March 2017

A Comprehensive Survey on Techniques for


Facial Emotion Recognition
Renuka Deshmukh, M. E. Scholar, MIT, Pune
Vandana Jagtap, Professor, MIT, Pune

Abstract-Emotions, put simply, are what people feel. Emotional aspects have a huge impact on social intelligence, such as communication and decision making, and help in understanding the behavioral aspects of humans. Human faces provide rich information about emotions. According to psychological research, a person expresses emotions less through verbal talk and more through non-verbal body posture and gestures. Emotion recognition, or Affective Computing (AC), is the AI-related area that imparts to computers the intelligence to recognize human emotions. Emotion recognition has proved a popular research topic over the past few decades. The aim of this paper is to report an illustrative and comprehensive study of the most popular emotion recognition methods, which are generally used in emotion recognition problems. We are motivated by the lack of a detailed study of all possible techniques and implementations in the available literature. This paper provides an up-to-date comprehensive survey of techniques available for emotion recognition.

Keywords-emotions, images, emotion recognition, facial image, human computer interaction, facial emotion recognition.

I. INTRODUCTION
Emotions entail different components, such as subjective experience, cognitive processes, expressive behavior, psychophysiological changes, and behavior. These components of emotion are categorized in different ways depending on the academic discipline. In psychology and philosophy, emotion includes a subjective, conscious experience characterized by psychophysiological expressions, biological reactions, and mental states. Research on emotion has increased significantly over the past two decades, with contributions from many fields, including psychology, neuroscience, endocrinology, medicine, history, sociology, and computer science. Abundant theories attempt to explicate the origin, experience, and function of emotions, and these have fostered more intense research on the topic. Current areas of research on the concept of emotion include the development of materials that motivate and elicit emotion [1]. Charles Darwin's (1872/1965) book "The Expression of the Emotions in Man and Animals" has been highly influential for research on emotions. The book was intended to counteract the claim by Sir Charles Bell (1844) that certain muscles were created so as to give humans the ability to express their feelings. Darwin's basic message was that emotional expressions are evolved and adaptive; for Darwin, they not only originated as part of an emotion process but also had an important communicative function [2]. The cross-cultural studies conducted by Ekman and his collaborators and by Izard strongly suggested universality in interpreting facial expressions of emotion. These findings countered customary ideas of cultural relativism and suggested that the study of facial expression is relevant to central questions regarding human nature. Researchers then developed measures of facial emotion recognition, which some emotion researchers used to measure facial activity itself directly, rather than studying observers' judgments of the emotions they saw in an expression. Whereas facial activity was formerly measured via electromyography, that approach is far more invasive and less precise than scoring systems that measure changes in the appearance of the face.
The purpose of emotion recognition systems is the application of emotion-related knowledge in such a way that human-computer communication is enhanced and the user's experience becomes more satisfying. By enabling computers to sense the emotional state of the user and react accordingly, this communication can be transformed into a satisfying one. Refining communication with computers is not the only application of emotion recognition. Specialized systems can also be developed for more serious problems, such as medical applications including aggression detection, stress detection, frustration detection, and the assessment of autistic disorder, Asperger syndrome, and hepatolenticular degeneration.

219 https://sites.google.com/site/ijcsis/
ISSN 1947-5500

Fig. 1: Examples of expressions for the six basic emotions

TABLE I
FACIAL EXPRESSION DESCRIPTION OF SIX BASIC EMOTIONS
(SUMPENO ET AL., 2011)

Sr. No.  Emotion Class  Description of Facial Expression
1        Happy          The eyebrows are relaxed. The mouth is open and the mouth corners are upturned.
2        Sad            The inner eyebrows are bent upward. The eyes are slightly closed. The mouth is usually relaxed.
3        Fear           The eyebrows are raised and pulled together. The inner eyebrows are bent upward. The eyes are open and tense.
4        Anger          The inner eyebrows are pulled downward and together. The eyes are wide open. The lips are tightly closed or opened to expose the teeth.
5        Surprise       The eyebrows are raised. The upper eyelids and the eyes are wide open. The mouth is opened.
6        Disgust        The eyebrows and eyelids are relaxed. The upper lip is raised and curled, often asymmetrically.

A. Facial Emotion Recognition System


The common approach to facial emotion recognition consists of three steps: face detection and tracking, feature extraction, and expression classification. The face detection stage processes the facial images without human intervention to find the face region in the input images or sequences. Once the face is located, the next step is to extract the discriminative information produced by facial expressions. Expression classification is the last stage of the system: the facial changes can be identified either as prototypic emotions or as facial action units.
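The three stages above can be sketched as a minimal pipeline. The stage functions here are hypothetical stand-ins invented for illustration, not any particular system: a real detector (e.g. a Haar cascade) would crop the face region, a real feature extractor would compute geometric or appearance descriptors, and the classifier would be a trained model rather than a toy threshold rule.

```python
# Sketch of the three-stage FER pipeline: detection -> features -> classification.
# All three stage functions are illustrative stand-ins, not real implementations.

def detect_face(image):
    """Locate the face region; here we pretend the whole image is the face."""
    return image  # stand-in: a real detector returns a cropped face region

def extract_features(face):
    """Derive discriminative features; here, just per-row mean intensities."""
    return [sum(row) / len(row) for row in face]

def classify_expression(features):
    """Map features to an emotion label (toy threshold rule, for illustration)."""
    return "happy" if sum(features) / len(features) > 0.5 else "sad"

def recognize(image):
    face = detect_face(image)
    features = extract_features(face)
    return classify_expression(features)

print(recognize([[0.9, 0.8], [0.7, 0.9]]))  # -> happy
```

The point of the skeleton is the separation of concerns: each stage can be swapped independently, which is exactly how the surveyed papers differ from one another.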


Even though humans experience a wide range of emotions, modern psychology defines six basic facial expressions as universal: happiness, sadness, surprise, fear, disgust, and anger. Facial muscle movements help in identifying human emotions. The facial features are the key parameters to consider for recognizing emotions; these parameters include the eyebrows, mouth, nose, eyes, and cheeks.

Fig. 2: Facial Emotion Recognition System Flow

II. LITERATURE SURVEY

The paper titled "A Novel Approach for Face Expressions Recognition" focuses on a new method for facial expression recognition. Haar functions are used for face, eye, and mouth detection; an edge detection method extracts the eyes correctly, and Bezier curves are applied to approximate the extracted regions. A set of distances for varied face types is then extracted and serves as the training input for a multilayer neural network. The novel factor of this approach consists in applying Bezier curves to efficiently extract the distances between facial parts. Pre-classification is done using the K-means algorithm, and a two-layer feed-forward neural network is then used as the classifying tool for the input images. The consistency of the results is demonstrated by the median value. The performance achieved is 82%. The method is not able to handle situations where the eyes are closed, and strong illumination variations affect the results [3].
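The Bezier-curve approximation at the heart of [3] can be illustrated with de Casteljau's algorithm, the standard way to evaluate a Bezier curve from its control points. The control points below are made up for illustration (a hypothetical upper-eyelid contour), not taken from the paper.

```python
# Evaluate a Bezier curve by repeated linear interpolation (de Casteljau).

def bezier_point(ctrl, t):
    """Point on the Bezier curve with control points ctrl, at parameter t in [0, 1]."""
    pts = [p[:] for p in ctrl]
    while len(pts) > 1:
        # Interpolate each consecutive pair of points; one fewer point per pass.
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical control points for an upper-eyelid contour (cubic curve).
eyelid = [[0.0, 0.0], [1.0, 2.0], [3.0, 2.0], [4.0, 0.0]]
print(bezier_point(eyelid, 0.5))  # -> [2.0, 1.5], the curve's midpoint
```

Distances between such fitted curves (e.g. eyelid to mouth corner) are the kind of geometric features [3] feeds to its neural network.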

D. Drume et al. introduced and evaluated a multi-level classification framework for emotion classification. The framework includes three phases: face localization, facial feature extraction, and training and classification. It uses principal component analysis at level 1 and a support vector machine at level 2 for training and classification. Results show that this approach successfully recognizes facial emotion with a 93% recognition rate, suggesting that the method supports more accurate classification of emotion from images [4].
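The level-1 PCA step of [4] amounts to projecting face images onto their leading principal components ("eigenfaces"). A minimal numpy sketch of that projection follows; the 4x3 data matrix is synthetic, purely for illustration, and the level-2 SVM stage is omitted.

```python
# Minimal PCA projection: center the data, take the top eigenvectors of the
# feature covariance, and project onto them.
import numpy as np

def pca_project(X, n_components):
    """Project the rows of X onto the top n_components principal components."""
    Xc = X - X.mean(axis=0)            # center each feature
    cov = np.cov(Xc, rowvar=False)     # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return Xc @ top

# Synthetic "images": the first feature carries almost all the variance.
X = np.array([[2.0, 0.0, 1.0],
              [4.0, 0.1, 1.1],
              [6.0, 0.0, 0.9],
              [8.0, 0.1, 1.0]])
Z = pca_project(X, 1)
print(Z.shape)  # (4, 1): each sample reduced to one principal coordinate
```

In [4] the reduced coordinates would then be fed to the level-2 SVM instead of being used directly.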

M. Saaidia et al. use a neural network classifier to perform facial expression recognition, covering the six basic facial expressions plus the neutral one. A neural network trained with Zernike moments was applied to images from the Yale and JAFFE databases to perform face detection. The detected faces were then processed in a characterization phase that computes vectors of Zernike moments. Finally, a back-propagation neural network was trained to distinguish between the seven emotional states, and the performance of the method was evaluated on the JAFFE and Yale databases.

Fig. 3: Block diagram of the system [3].


The values of the parameters n and m influence not only the number of elements in the feature vector but also its ability to discriminate between various facial expressions. For low and increasing values of the pair (n, m), the performance of the method improves, a direct effect of the greater discriminative capacity of feature vectors as their size grows. Results for the JAFFE database are better than those obtained for the Yale database, owing to the homogeneity of the subjects and the greater number of training images [5].
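The Zernike moments used in [5] are built from radial polynomials R_n^m, which is where the parameters n and m discussed above enter. A small sketch of R_n^m(rho) from its standard closed form:

```python
# Zernike radial polynomial R_n^m(rho), the radial part of a Zernike moment.
from math import factorial

def zernike_radial(n, m, rho):
    """R_n^m(rho) for 0 <= |m| <= n; zero when n - |m| is odd."""
    m = abs(m)
    if (n - m) % 2:
        return 0.0
    total = 0.0
    for k in range((n - m) // 2 + 1):
        coef = ((-1) ** k * factorial(n - k)
                / (factorial(k)
                   * factorial((n + m) // 2 - k)
                   * factorial((n - m) // 2 - k)))
        total += coef * rho ** (n - 2 * k)
    return total

print(zernike_radial(2, 0, 0.5))  # R_2^0(rho) = 2*rho^2 - 1, so -> -0.5
```

Higher-order pairs (n, m) add more polynomials and hence longer feature vectors, matching the observation that performance rises with the size of the pair.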
Zhiding Yu et al. report an image-based static facial expression recognition method for emotion recognition. They focus on the sub-challenge of the SFEW 2.0 dataset, seeking to classify static images into 7 basic emotions without human intervention. The method contains a face detection module based on an ensemble of three state-of-the-art face detectors, followed by a classification module built on an ensemble of multiple deep convolutional neural networks (CNNs). To combine the multiple CNN models, the authors present two schemes for learning the ensemble weights of the network responses: minimizing the log-likelihood loss and minimizing the hinge loss. The method generates state-of-the-art results on the FER dataset, with the two ensemble frameworks achieving 60.75% and 61.29% accuracy, respectively [6].
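The combination step of [6] reduces, at inference time, to a weighted average of the networks' class-probability outputs followed by an argmax. The sketch below shows only that step; the probabilities and weights are made up for illustration, whereas in [6] the weights are learned by minimizing the log-likelihood or hinge loss.

```python
# Weighted-average ensemble of per-model class probabilities.

def ensemble_predict(prob_lists, weights):
    """Average each class's probability across models (weighted), then argmax."""
    n_classes = len(prob_lists[0])
    avg = [sum(w * p[c] for w, p in zip(weights, prob_lists))
           for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: avg[c])

# Three hypothetical CNNs scoring 7 emotion classes for one image.
model_probs = [
    [0.1, 0.6, 0.05, 0.05, 0.1, 0.05, 0.05],
    [0.2, 0.5, 0.05, 0.05, 0.1, 0.05, 0.05],
    [0.6, 0.2, 0.05, 0.05, 0.05, 0.03, 0.02],
]
weights = [0.4, 0.4, 0.2]  # illustrative; learned from data in [6]
print(ensemble_predict(model_probs, weights))  # -> 1
```

Here the two agreeing models outvote the dissenting one, which is the intended behavior of such an ensemble.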
V. D. Bharate et al. implemented an adaptive sub-layer compensation (ASLC) based facial emotion recognition method. The emotion class is recognized from the extracted features using the k-nearest neighbor algorithm, and recognition precision is calculated as the ratio of correctly classified input samples to the total number of input samples in the dataset. They modified the Marr-Hildreth algorithm using adaptive sub-layer compensation and hysteresis analysis to reduce the negative effects of the Laplacian of Gaussian (LoG), such as image degradation, unwanted details, and disconnected edges. Combining their feature extraction methods with ASLC, they achieved an overall accuracy of 86.5% with principal component analysis features and 85.1% with wavelet features [7].
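The classification and scoring used in [7] are both simple to state: k-nearest-neighbor voting, and accuracy as correct over total. A pure-Python sketch of both follows; the toy 2-D "feature vectors" and labels are invented for illustration.

```python
# k-nearest-neighbor classification plus the accuracy ratio defined in [7].

def knn_predict(train, labels, x, k=3):
    """Majority vote among the k training points nearest to x (squared distance)."""
    order = sorted(range(len(train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = [labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

def accuracy(preds, truth):
    """Correctly classified samples / total samples."""
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

train = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
labels = ["sad", "sad", "happy", "happy"]
test_pts = [(0.05, 0.1), (1.05, 0.95)]
preds = [knn_predict(train, labels, x, k=3) for x in test_pts]
print(preds, accuracy(preds, ["sad", "happy"]))
```

On this toy set both test points fall cleanly in their cluster, so the accuracy ratio is 1.0; on real features it yields figures like the 86.5% reported above.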
M. Aziz et al. recognize facial expressions from the images in the JAFFE database. They put forward a combination of three approaches: the Scale Invariant Feature Transform (SIFT), Gabor wavelets, and the Discrete Cosine Transform (DCT). Some pre-processing steps are applied before extracting the features, and a Support Vector Machine (SVM) with a radial basis kernel function is used for classifying the facial expressions. They evaluated the results on the JAFFE database, with experiments designed for both person-dependent and person-independent settings. With this method, some emotional states were misclassified as others: sad was misclassified as fear and surprise, and anger was misclassified as disgust and, to a small extent, as sad. Neutral is the only expression in the images that is never misclassified as any other expression [8].
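Of the three feature sets in [8], the DCT is the simplest to sketch. Below is the unnormalized 1-D DCT-II, whose low-frequency coefficients are commonly kept as compact features; the 8-sample signal is synthetic, standing in for a row of face pixels.

```python
# Unnormalized 1-D DCT-II: X_k = sum_n x_n * cos(pi/N * (n + 0.5) * k).
from math import cos, pi

def dct2_1d(signal):
    """All N DCT-II coefficients of a length-N signal."""
    N = len(signal)
    return [sum(x * cos(pi / N * (n + 0.5) * k)
                for n, x in enumerate(signal))
            for k in range(N)]

flat = [1.0] * 8  # a constant "row of pixels"
coeffs = dct2_1d(flat)
# A constant signal concentrates all its energy in the k = 0 (DC) coefficient.
print(round(coeffs[0], 6), round(coeffs[1], 6))
```

In a 2-D setting the transform is applied along rows and columns, and only a small top-left block of coefficients is retained as the feature vector.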
TABLE II
VARIOUS TECHNIQUES FOR FACIAL EMOTION RECOGNITION

Paper  Technique                                            Accuracy
[3]    Feed-Forward Neural Network                          82%
[4]    PCA and SVM                                          93%
[5]    Neural Network                                       -
[6]    Multiple Deep Convolutional Neural Networks (CNNs)   60%
[7]    k-Nearest Neighbor Algorithm                         84%
[8]    Support Vector Machine (SVM)                         95%
[9]    Relevance Vector Machines (RVM)                      90.84%

The research of D. Datcu et al. aims at using Relevance Vector Machines (RVM) as a novel classification technique for recognizing facial expressions in static images. The Cohn-Kanade Facial Expression Database was selected for testing. They report a 90.84% recognition rate for RVM across the six universal expressions; the error rate of RVM (9.16%) compares favorably with that of an SVM classifier (10.15%). An important aspect is that the RVM classifier retains fewer relevance vectors (156) than the SVM retains support vectors (276). This results in a decrease in the number of kernel functions and in model complexity, so this classification technique requires both less processing time and less memory [9].
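The practical payoff of RVM sparsity noted in [9] is that prediction cost is one kernel evaluation per retained vector, so 156 relevance vectors beat 276 support vectors directly. A sketch of the kernel expansion that both models evaluate at prediction time; the RBF kernel is standard, while the example vectors and coefficients are arbitrary.

```python
# A sparse kernel expansion: decision value = sum_i c_i * k(v_i, x).
from math import exp

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel exp(-gamma * ||x - y||^2)."""
    return exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_expansion(vectors, coeffs, x):
    """One kernel evaluation per retained vector: the cost [9] reduces."""
    return sum(c * rbf_kernel(v, x) for v, c in zip(vectors, coeffs))

rvm_evals, svm_evals = 156, 276  # vector counts reported in [9]
print(f"RVM saves {svm_evals - rvm_evals} kernel evaluations per prediction")
```

Since the expansion is a plain sum over retained vectors, both runtime and the memory needed to store the model scale linearly with that count, which is exactly the RVM advantage claimed.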
Intelligent facial expression recognition systems make visual interfaces easier and more helpful for human-computer interaction. Human-computer interfaces and robotics are not the only applications of facial expression recognition systems; they also have applications in several distinct areas such as video games, animation, psychiatry, educational software, emotion-sensitive music, and medical science. Artificial intelligence focused on emotion recognition is a new frontier that could have significant implications not only for advertising, but also for startups, wearables, and more.
III. CONCLUSION


Computers and computer interfaces are increasingly integrated into our lives, driving the need for computers to recognize and respond to human communication and the behavioral cues of emotions and mental states. The automated analysis of expressions is a challenging endeavor because of the uncertainty inherent in inferring hidden mental states from behavioral cues. As facial expression recognition systems become more robust and effective in communication, many other innovative applications and uses are yet to be seen. The objective of this paper is to give a brief overview of the process, the various techniques, and the applications of facial emotion recognition systems.
REFERENCES
[1] https://en.wikipedia.org/wiki/Emotion#Basic_emotions.
[2] U. Hess and P. Thibault, "Darwin and Emotion Expression", American Psychological Association, 2009, vol. 64, no. 2, pp. 120-128.
[3] S. M. Banu et al., "A Novel Approach for Face Expressions Recognition", IEEE 10th Jubilee International Symposium on Intelligent Systems and Informatics, 2012.
[4] D. Drume and A. S. Jalal, "A Multi-level Classification Approach for Facial Emotion Recognition", IEEE International Conference on Computational Intelligence and Computing Research, 2012.
[5] M. Saaidia et al., "Facial Expression Recognition Using Neural Network Trained with Zernike Moments", IEEE International Conference on Artificial Intelligence with Applications in Engineering and Technology, 2014.
[6] Zhiding Yu et al., "Image based Static Facial Expression Recognition with Multiple Deep Network Learning", ACM, 2015.
[7] V. D. Bharate et al., "Human Emotions Recognition using Adaptive Sublayer Compensation and various Feature Extraction Mechanism", IEEE WiSPNET, 2016.
[8] M. Aziz et al., "Facial Expression Recognition using Multiple Feature Sets", IEEE, 2015.
[9] D. Datcu and J. M. Rothkrantz, "Facial Expression Recognition with Relevance Vector Machines", Interactive Collaborative Information Systems (ICIS).
