They built a straightforward solution for emotion recognition based on the peaks in the EDA signal. The
success rates of their algorithm and of the SVM + GA were 64.66% and 90%, respectively. Zhang et al.
conducted an experiment in which 123 subjects were asked to wear smart bracelets with built-in
accelerometers. They attempted to identify emotions from walking data using the LibSVM algorithm. They achieved
classification accuracies of 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. sad). The
recognition rate across the three emotions (neutral, happy, and angry) was 81.2%. These results
e-ISSN: 2582-5208
International Research Journal of Modernization in Engineering Technology and Science
( Peer-Reviewed, Open Access, Fully Refereed International Journal )
Volume:04/Issue:03/March-2022 Impact Factor- 6.752 www.irjmets.com
demonstrated that emotion can be reflected to some extent in walking, and that wearable smart devices might be
used to recognize human emotions.
Average Classification Method
The average classification method compares the given heart rate with the average heart-rate value of each
emotion and suggests the emotion whose average heart rate is closest to the user's heart rate. For example, if the
user's heart rate is 110 bpm, the application identifies the user's emotion as angry.
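The method described above can be sketched in Python as follows; the per-emotion average heart-rate values below are illustrative assumptions for the sketch, not measurements reported in this study:

```python
# Average classification method: suggest the emotion whose average
# heart rate is closest to the measured heart rate.
# NOTE: the average bpm values below are illustrative assumptions.
EMOTION_AVERAGES = {
    "neutral": 70,
    "sad": 95,
    "angry": 110,
    "happy": 140,
}

def classify_emotion(heart_rate: float) -> str:
    """Return the emotion whose average heart rate is closest to heart_rate."""
    return min(EMOTION_AVERAGES,
               key=lambda emotion: abs(EMOTION_AVERAGES[emotion] - heart_rate))

print(classify_emotion(110))  # -> angry, as in the example above
```

With these assumed averages, a reading of 70 bpm maps to neutral and 142 bpm to happy, consistent with the sample values discussed in the next section.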
IV. RESULTS AND DISCUSSION
A. Dataset
This study proposes a music recommendation system that records the user's heartbeat, measured with the
help of a smartwatch worn by the user. Once the heartbeat is measured, it is sent to the database, where the
analysis is done; after the analysis is complete and the appropriate data has been fetched, the emotion is
recognized and a playlist is suggested.
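As a rough sketch of this pipeline, the snippet below stubs the database as an in-memory list; the playlist names and the heart-rate thresholds are hypothetical placeholders, not values prescribed by the system:

```python
# Sketch of the proposed flow: smartwatch reading -> database -> analysis -> playlist.
# The playlist names and heart-rate thresholds are illustrative assumptions.
PLAYLISTS = {
    "neutral": "Everyday Mix",
    "sad": "Sentimental Songs",
    "happy": "Feel-Good Hits",
}

database: list[dict] = []  # stand-in for the real database

def record_heartbeat(user_id: int, bpm: int) -> None:
    """Store a smartwatch heartbeat reading for later analysis."""
    database.append({"user": user_id, "bpm": bpm})

def recommend_playlist(user_id: int) -> str:
    """Analyse the latest stored reading for this user and suggest a playlist."""
    bpm = next(r["bpm"] for r in reversed(database) if r["user"] == user_id)
    emotion = "happy" if bpm >= 120 else "sad" if bpm >= 90 else "neutral"
    return PLAYLISTS[emotion]

record_heartbeat(1, 70)
print(recommend_playlist(1))  # -> Everyday Mix
```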
User    Heartbeat (bpm)    Emotion
1       70                 Neutral
2       142                Happy
3       99                 Sad
In the above scenario we took three users and showed each of them a different video. The first user was
shown a random science video, the second user a comedy video, and the third user a video of a war crisis.
B. Evaluation Results
The table above presents the users' heartbeat measurements after watching their respective videos.
The first user's heart rate is 70 bpm, which corresponds to a neutral state, so a normal list of songs and
playlists is recommended. The second user's heart rate is measured as 142 bpm, corresponding to a high level of
happiness, so a happy list of songs and playlists is recommended. The third user's heart rate is 99 bpm,
corresponding to sadness, so a sentimental list of songs and playlists is recommended.
V. CONCLUSION
For analysing a person's mood or emotions, heart rate is considered preferable: it detects fear, sadness,
aggression, and excitement more reliably than existing systems that use facial expressions, since facial
expressions can vary far too much. Unlike facial-expression-based systems, our system is less likely to lose
accuracy in the emotion-detection process, because emotional states are directly reflected in heart rate, and
estimating a person's feelings from how many beats per minute the heart is pumping can therefore be
consistently accurate. We therefore proposed a method that uses heart-rate data, recorded by a wearable
smartwatch, to identify human emotions. The experiments performed above show that using heart rate for
emotion detection is a promising direction for future work. It can also support analysis of the user's mental
health, letting the user see how much fluctuation his or her heart rate goes through. The approach is easy to
deploy on wearable consumer electronic devices, especially smartwatches, and will help promote the
application and development of wearable devices for monitoring human emotional moods in static or
quasi-static states. It will be a great advantage to users looking for music that matches their mood and
feelings, and it will reduce the time spent searching for songs, making the system easier to use and improving
its overall accuracy and efficiency.