Chandigarh University: Bachelor of Technology
A Project Synopsis
Submitted in partial fulfilment of the requirements for the degree
of
Bachelor of Technology
in
COMPUTER SCIENCE & ENGINEERING
Submitted by:
Prince Kumar Singh (22BCS11746)
Rashi Gupta (22BCS10062)
Ashwani Kumar Pandey (22BCS10860)
Nishchay (22BCS10648)
Parteek (22BCS10395)
CHANDIGARH UNIVERSITY
1. Project title:-
"Emotion Detection from Facial Expressions in Pictures"
2. Team member details:-
NAME                    Roll No.       E-mail
Prince Kumar Singh      22BCS11746     22BCS11746@cuchd.in
Rashi Gupta             22BCS10062     22BCS10062@cuchd.in
Ashwani Kumar Pandey    22BCS10860     22BCS10860@cuchd.in
Nishchay                22BCS10648     22BCS10648@cuchd.in
Parteek                 22BCS10395     22BCS10395@cuchd.in
3. Objective:-
The most informative channel for machine perception of emotions is
facial expression. Effective human-computer intelligent interaction
(HCII) requires the computer to detect emotions through facial
expressions. This project aims to develop an automatic emotion
detection system by evaluating machine learning algorithms for facial
expression recognition in pictures. The system will perform feature
selection on each frame to analyse the image, compare it against an
authentic database of natural emotions, and classify each frame into a
class of human emotion by harnessing facial expression dynamics.
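As a minimal illustration of the classification idea in the objective (comparing per-frame features against a labelled database), the following sketch uses nearest-neighbour matching. The prototype database and feature vectors here are hypothetical placeholders, not the project's actual data; a real system would extract features from images upstream.

```python
# Illustrative sketch only: classify a frame's feature vector by
# nearest-neighbour comparison against a small labelled reference database.
import math

# Hypothetical reference database: emotion label -> prototype feature vector.
REFERENCE_DB = {
    "happiness": [0.9, 0.1, 0.2],
    "sadness":   [0.1, 0.8, 0.3],
    "anger":     [0.2, 0.3, 0.9],
}

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_frame(features):
    """Return the emotion whose prototype is closest to the frame's features."""
    return min(REFERENCE_DB, key=lambda label: euclidean(features, REFERENCE_DB[label]))

print(classify_frame([0.85, 0.15, 0.25]))  # -> "happiness"
```

In practice the prototypes would be replaced by many labelled samples (or a trained classifier), but the compare-against-a-database step works the same way.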
4. Technical details:-
Facial expressions provide important cues about emotions, so several
approaches have been proposed to classify human affective states. The
features used are typically based on the local spatial position or
displacement of specific points and regions of the face, unlike
speech-based approaches, which use global statistics of acoustic
features.
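The displacement-based features described above can be made concrete with a small sketch: given landmark coordinates for a neutral frame and an expressive frame, the feature is simply the per-landmark offset. The landmark coordinates below are invented for illustration; a real system would obtain them from a landmark detector.

```python
# Illustrative sketch: displacement features as per-landmark (dx, dy) offsets
# between a neutral frame and an expressive frame.
def displacement_features(neutral, expressive):
    """Per-landmark (dx, dy) displacements between two frames."""
    return [(ex - nx, ey - ny) for (nx, ny), (ex, ey) in zip(neutral, expressive)]

neutral = [(30, 40), (70, 40), (50, 80)]   # e.g. left eye, right eye, mouth corner
smiling = [(30, 39), (70, 39), (50, 76)]   # mouth corner raised by 4 pixels

print(displacement_features(neutral, smiling))
# [(0, -1), (0, -1), (0, -4)]
```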
The facial emotion recognition process is divided into three key stages:
• Detection of faces and facial parts:- At this stage, an emotion
recognition (ER) solution treats the human face as an object. It detects
facial features (brows, eyes, mouth, etc.), notes their positions, and
observes their movements over a period of time (this period differs
between solutions). The detected expression is then typically classified
into one of the basic emotion classes:
1. Happiness
2. Surprise
3. Anger
4. Sadness
5. Fear
6. Disgust
7. Contempt
b. Feature size.
c. Skin colour.
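The skin-colour cue listed above is often used as a cheap prior for locating candidate face regions. A toy per-pixel RGB heuristic is sketched below; the thresholds are illustrative assumptions, not values from this synopsis, and real systems use more robust colour spaces and learned models.

```python
# Illustrative sketch: a very rough RGB skin-colour test. Skin pixels tend to
# have R > G > B with R clearly dominant. Thresholds are assumed, not tuned.
def is_skin_pixel(r, g, b):
    """Return True if (r, g, b) looks like a plausible skin tone."""
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - g) > 15

print(is_skin_pixel(200, 120, 90))  # True: plausible skin tone
print(is_skin_pixel(40, 90, 200))   # False: blue-ish pixel
```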
To improve the accuracy of feature identification, some researchers
implement a part-based model that divides facial landmarks into several
groups according to the physical structure of the face. The model then
feeds these parts into the network separately, with the relevant labels.
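The part-based split described above can be sketched as grouping a flat list of landmark indices by facial part, so each group can be fed to the network separately. The index groupings below are hypothetical and do not follow any specific landmark scheme.

```python
# Illustrative sketch: split a flat landmark list into per-part groups.
# Index assignments are assumed for illustration only.
FACIAL_PARTS = {
    "brows": [0, 1, 2, 3],
    "eyes":  [4, 5, 6, 7],
    "mouth": [8, 9, 10, 11],
}

def split_landmarks(landmarks):
    """Group a flat list of (x, y) landmarks into per-part sub-lists."""
    return {part: [landmarks[i] for i in idxs] for part, idxs in FACIAL_PARTS.items()}

landmarks = [(i, i * 2) for i in range(12)]   # 12 dummy (x, y) points
parts = split_landmarks(landmarks)
print(sorted(parts))       # ['brows', 'eyes', 'mouth']
print(parts["mouth"][0])   # (8, 16)
```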
4. Recognizing incomplete emotions:- Most algorithms focus on
recognizing the peak, high-intensity expression and ignore
lower-intensity expressions. This leads to inaccurate recognition of
emotion when analysing emotionally self-contained people from cultures
with traditions of emotional suppression.
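The intensity problem above can be made concrete: if a detector fires only when total landmark displacement exceeds a high threshold, subtle expressions are missed. The displacement values and threshold below are assumed purely for illustration.

```python
# Illustrative sketch: a crude expression-intensity score from landmark
# displacements, showing how a high threshold misses subtle expressions.
import math

def expression_intensity(displacements):
    """Total landmark displacement magnitude as a crude intensity score."""
    return sum(math.hypot(dx, dy) for dx, dy in displacements)

broad_smile  = [(0, -2), (0, -2), (0, -6)]   # strong expression
subtle_smile = [(0, 0), (0, 0), (0, -1)]     # low-intensity expression

PEAK_THRESHOLD = 5.0   # hypothetical: detector fires only on strong expressions
print(expression_intensity(broad_smile) > PEAK_THRESHOLD)    # True: detected
print(expression_intensity(subtle_smile) > PEAK_THRESHOLD)   # False: missed
```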
References
1. https://pypi.org/project/deepface/
2. https://medium.com/nerd-for-tech/deep-face-recognition-in-python-41522fb47028