
e-ISSN: 2582-5208

International Research Journal of Modernization in Engineering Technology and Science


(Peer-Reviewed, Open Access, Fully Refereed International Journal)
Volume: 04 / Issue: 03 / March-2022    Impact Factor: 6.752    www.irjmets.com
MUSIC PLAYER BASED ON HEART RATE AND MOOD
Rohit Milind Oturkar*1, Komal Sanjay Narke*2, Shravani Devichand Ohal*3,
Shital Nagane*4
*1,2,3Student, Department Of Information Technology, J.S. Polytechnic, Pune, Maharashtra, India.
*4Guide, Department Of Information Technology, J.S. Polytechnic, Pune, Maharashtra, India.
ABSTRACT
In today's generation, people tend to experience high levels of stress because of economic conditions, political conditions, and family or personal reasons. Listening to music can prove helpful for stress management. However, it is more helpful when the music suits the listener's current level of stress or emotion. The application therefore returns songs that have the same mood as the user's emotion. Experimental results showed that the system was able to distinctly differentiate between happy, sad, angry, and neutral emotions. Emotion recognition and monitoring based on commonly used wearable devices can play a vital role in psychological health monitoring and human-computer interaction. However, existing methods cannot rely on common smart bracelets or watches for emotion monitoring. To address this issue, our study proposes a method for emotion recognition using heart rate data from a wearable smart bracelet. The convenient form factor of smart wearables makes it easy for users to carry them and obtain recommended music on the go.
Keywords: Emotion, Stress, Songs, Wearables.
I. INTRODUCTION
In the past few years, the percentage of people around the world with emotion-management issues has risen, and people under stress exhibit higher heart rates. This is due to several reasons such as debt, rising product prices, a weak economy, high living expenses, and COVID-19 being a major factor. A 2017 survey found that people's phone usage had increased and their daily call logs had grown.
Stress can be relieved in various ways: for example, workouts, watching movies, meditation, or simply doing anything you like; listening to music is one such way. Many studies state that music can help people reduce stress and be more focused. A major criterion, however, is finding songs that best suit you at that moment. Thus, to reduce stress, music with the proper mood should be chosen. Furthermore, although there are many music player applications, there is no application that is able to select songs based on the user's emotion with the help of heart rate.
To overcome these limitations, this paper proposes a mobile music player application that recommends songs based on the user's feelings. To classify the user's emotion, the proposed application uses heart rate. When the application receives the user's heart rate from a smart band or smartwatch, it processes the data and concludes what the user's emotion is. The user and song emotions in this paper are divided into four types: neutral, happy, sad, and angry.
This technology can be a breakthrough for handicapped people who have minimal mobility in their arms or fingers, or who are visually impaired and cannot see which songs to select on the device; our system suggests songs based on mood. Devices from Apple, Huawei, Fitbit, and Xiaomi provide a solid platform for wearable emotion recognition based on heart rate.
II. METHODOLOGY
Music Emotion Classification: This section presents the prevailing methods for classifying emotion from songs. Some relevant approaches are as follows.
• Robert E. Thayer applies rhythm, tempo, intensity, pitch, and timbre to distinguish music emotions. This research divides music emotions into eight types: exuberance, anxious/frantic, contentment, depression, calm, energetic, happy, and anxious/sad.
• Y. Song et al. apply an SVM-based approach to distinguish music emotions based on tags from the Last.FM website.
The neutral emotion has the most stable heart rate (60-80 bpm), while the happy emotion has the widest variation (70-140 bpm), which depends on the kind of happiness. The sad emotion has the second-highest variation (80-100 bpm). The heart rate of the angry emotion is within the same range

as the happy emotion, but it will not fall below 100 bpm. To differentiate among the three emotional moods (happy, sad, and neutral), all are evaluated on the basis of the real-time pulse; the difference between the signals is directly associated with the different moods: Neutral = 60-80 bpm, Happy = 70-140 bpm, Sad = 80-100 bpm.
Existing System
Existing Music Player Applications: To develop an appropriate application, the prevailing music player applications were investigated.
System Requirements: The emotion-based music player has two components, namely admin and user.
• Admin: the person who can upload music files to the database. The admin is unable to interfere with any of the users' preferences and playlists.
• User: the person who can access almost all features of the application, i.e. playlist creation, user emotion detection, song search, and suggestions.
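The admin/user split described above can be sketched as a minimal permission model. This is a hypothetical illustration; the class and method names are our own assumptions, not the paper's API.

```python
# Minimal sketch of the admin/user components described above.
# All names here are illustrative assumptions, not the paper's design.

class MusicDatabase:
    def __init__(self):
        self.songs = []       # uploaded by the admin
        self.playlists = {}   # user-owned: {username: [song, ...]}

class Admin:
    """Can upload songs, but cannot touch user playlists."""
    def upload_song(self, db, song):
        db.songs.append(song)

class User:
    """Can create playlists, search songs, and receive suggestions."""
    def __init__(self, name):
        self.name = name

    def create_playlist(self, db, songs):
        db.playlists[self.name] = list(songs)

    def search(self, db, keyword):
        return [s for s in db.songs if keyword.lower() in s.lower()]

db = MusicDatabase()
Admin().upload_song(db, "Happy Tune")
user = User("rohit")
user.create_playlist(db, ["Happy Tune"])
print(user.search(db, "happy"))
```

Note that the `Admin` class deliberately has no method touching `db.playlists`, mirroring the constraint stated above.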
Although these features fulfil the user's basic requirements, the user still faces the task of manually browsing through the playlist and choosing songs based on his current mood and behaviour. For that, we should first gather the required data and knowledge.
Proposed System
Once you know exactly what you want and the equipment is in hand, the first real step of machine learning follows: gathering data. This step is crucial because the quality and quantity of the data gathered will directly determine how good the predictive model turns out to be.
• The system aims to supply user-preferred music with emotion awareness. In the existing system, the user had to manually select songs, randomly played music might not match the user's mood, and the user had to classify the songs into various categories himself. To meet the needs of an individual who periodically suffers through the chore of browsing his playlist to match his mood and emotions, we should first gather the required data.
User Emotion Classification Methods: The emotion-based music player applies the user's heart-rate signal and signal-based data to identify the user's emotion.
• Heart rate-based method. a) Exact Classification Method: The user emotion in this paper is divided into four types, namely angry, sad, happy, and neutral. The exact classification method is a simple method that classifies the user's heart rate based on the ranges provided in Quazi's research. The ranges of heart rates for each emotion are presented in Table II. In case the user's heart rate falls in multiple ranges, the system will identify that the user has multiple emotions. For example, if the user's heart rate is 110 bpm, the application will conclude that the user is in both the angry and happy emotions.
System Design: An experiment designed and conducted by Ekman et al. proved that physiological behaviour responds differently to different emotions. The heart rate increased significantly when people were angry or scared but decreased significantly in a state of disgust. Britton's research showed that the heart rate in a cheerful mood was below that in a neutral mood. Valderas showed that the effects of relaxation and fear on heart rate were significantly different, and the average pulse rate during happiness was lower than that in a sad state. Using the IBPSO algorithm, Xu et al. collected ECG and pulse signals for emotion recognition, in which the highest recognition rate of sadness and joy was 92.10%. Quiroz et al. used walking acceleration sensor data and pulse data from a smart watch to predict the mood of the subject.
Table II. Heart rate ranges for each emotion

Emotion   Heart Rate (Lowest)   Heart Rate (Highest)
Angry     83 bpm                135 bpm
Happy     72 bpm                140+ bpm
Neutral   56 bpm                72 bpm
Sad       62 bpm                100 bpm

Table III. Average heart rate for each emotion

Emotion   Average Heart Rate
Angry     109 bpm
Happy     106 bpm
Neutral   64 bpm
Sad       82 bpm
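The exact classification method can be sketched from the heart-rate ranges tabulated above. This is a minimal sketch under our own encoding; the function name and the open-ended upper bound for "140+" are assumptions, not the paper's implementation.

```python
# Exact classification: return every emotion whose range (Table II)
# contains the measured heart rate; multiple matches are allowed.
RANGES = {
    "angry":   (83, 135),
    "happy":   (72, float("inf")),  # "140+": open-ended upper bound (assumption)
    "neutral": (56, 72),
    "sad":     (62, 100),
}

def exact_classify(bpm):
    """Return a sorted list of all emotions whose range contains bpm."""
    return sorted(e for e, (lo, hi) in RANGES.items() if lo <= bpm <= hi)

print(exact_classify(110))  # 110 bpm falls in both the angry and happy ranges
```

This reproduces the paper's example: a reading of 110 bpm lies inside both the angry (83-135) and happy (72-140+) ranges, so both emotions are reported.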


Significance
1. Reduction of stress due to adaptive music.
2. Detects your mood and reduces the effort of curating a playlist.
3. Saves time and is easy to access.
III. MODELING AND ANALYSIS
• Heart rate data-based method: This method applies the Face detection API of Microsoft Azure to identify the user's emotion from the given heart-rate-based data. Based on the analysis, the Face detection API is in a position to classify the data into eight different emotions: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. As mentioned above, this paper focuses on only four basic emotions: angry, happy, sad, and neutral. If the user's emotion does not match the four basic emotions, the application groups it into neutral.
B. Song and Emotion Classification
• In addition to the research on the two emotional classifications, this paper analysed the classification performance of three different emotions (happy, neutral, and sad).
• Category of neutral and happy emotions: the recognition results of neutral and happy emotions are given in Figure 8.
• Categories of neutral and sad emotions.
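The grouping rule above (any detected emotion outside the four basic ones becomes neutral) can be sketched as a small lookup. The dictionary encoding is our own assumption; the eight labels are those listed for the API above.

```python
# Map the eight detected emotion labels onto the four basic
# emotions used in this paper; everything else falls back to neutral.
BASIC = {
    "anger":     "angry",
    "happiness": "happy",
    "sadness":   "sad",
    "neutral":   "neutral",
}

def to_basic_emotion(label):
    # contempt, disgust, fear, and surprise all map to neutral
    return BASIC.get(label, "neutral")

print(to_basic_emotion("happiness"))  # happy
print(to_basic_emotion("surprise"))   # neutral
```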

They built a straightforward solution for emotion recognition based on the peaks in the EDA signal. The success rates of their algorithm and the SVM + GA approach were 64.66% and 90%, respectively. Zhang et al. conducted an experiment in which 123 subjects were asked to wear smart bracelets with built-in accelerometers. They attempted to identify emotions from walking data using the LibSVM algorithm. They achieved classification accuracies of 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. sad). The recognition rate of the three emotions (neutral, happy, and angry) achieved an accuracy of 81.2%. These results
demonstrated that emotion can be reflected to some extent in walking, and that wearable smart devices might be used to recognize human emotions.
Average Classification Method: The average classification method compares the given heart rate with the average heart rate value of each emotion. The method suggests the emotion whose average heart rate is closest to the user's heart rate. For example, if the user's heart rate is 110 bpm, the application will identify that the user is in an angry emotion.
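Using the average heart rate of each emotion from the table above (angry 109, happy 106, neutral 64, sad 82 bpm), the average classification method can be sketched as follows; the function name is our own.

```python
# Average classification: pick the single emotion whose average
# heart rate is closest to the measured heart rate.
AVERAGES = {"angry": 109, "happy": 106, "neutral": 64, "sad": 82}

def average_classify(bpm):
    """Return the emotion with the nearest average heart rate."""
    return min(AVERAGES, key=lambda e: abs(AVERAGES[e] - bpm))

print(average_classify(110))  # nearest average is 109 -> angry
```

Unlike the exact method, this always yields a single emotion: at 110 bpm the distance to the angry average (109) is 1, smaller than the distance to the happy average (106), matching the example above.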
IV. RESULTS AND DISCUSSION
A. Dataset
This study proposes a music recommendation system which records the heartbeat of the user, measured with the help of a smartwatch worn by the user. Once the heartbeat is measured, it is sent to the database, where the analysis is done; after completing the analysis and fetching the appropriate data, the emotion is recognized and a playlist is suggested.

User   Heartbeat (bpm)   Emotion
1.     70                Neutral
2.     142               Happy
3.     99                Sad

In the above scenario we took three users and showed each of them a different video. To the first user we showed a random science video, to the second user a comedy video, and to the third user a war-crisis video.
B. Evaluation Results
The above table presents the measurements of their heartbeats after watching the respective videos. The first user's heart rate is 70, which is neutral, so a normal list of songs and playlists is recommended. The second user's heart rate is measured as 142, the highest level of happiness, so a happy list of songs and playlists is recommended. The third user's heart rate is 99, which is sad, so a sentimental list of songs and playlists is recommended.
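The recommendation step for an already-classified emotion can be sketched as a simple emotion-to-playlist mapping, mirroring the three-user scenario above. The playlist names are our own illustrative assumptions, not the paper's.

```python
# Suggest a playlist for a classified emotion, following the rule that
# the suggested songs share the user's mood. Playlist names are
# illustrative assumptions.
PLAYLISTS = {
    "neutral": "Normal Mix",
    "happy":   "Happy Hits",
    "sad":     "Sentimental Songs",
    "angry":   "Intense Tracks",
}

def suggest_playlist(emotion):
    """Return the mood-matched playlist; fall back to the normal mix."""
    return PLAYLISTS.get(emotion, "Normal Mix")

print(suggest_playlist("sad"))  # Sentimental Songs
```

Combined with a heart-rate classifier, this completes the pipeline described above: smartwatch reading, emotion recognition, then playlist suggestion.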
V. CONCLUSION
For analysing a person's mood or emotions, heart rate is considered preferable and gives more accurate detection of fear, sadness, aggression, and excitement than existing systems that use facial expressions, since facial expressions can vary too much. Unlike systems based on facial expressions, our system faces less difficulty with accuracy in the emotion detection process, because the heart rate in beats per minute is a direct physiological measure of a person's state. Therefore, we proposed a method of using heart rate data to identify human emotions. The heart rate data is measured by a wearable smart watch. The experiments performed above show that using heart rate for emotion detection is a promising direction for future work. It will also help analyse the user's mental health, as the user will see how much fluctuation his or her heart goes through. This approach is easy to deploy on wearable consumer electronic devices, especially smart watches, and will help promote the application and development of wearable devices for monitoring human emotional moods in static or quasi-static states. It will be a great advantage to users looking for music based on their mood and feelings, and it will reduce the search time for songs, thereby making the system easier and more preferable to use while increasing its overall accuracy and efficiency.
VI. REFERENCES
[1] Viola, P., and Jones, M., "Rapid object detection using a boosted cascade of simple features," Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 511-518, IEEE, 2001.
[2] H. Immanuel James, J. James Anto Arnold, J. Maria Masilla Ruban, M. Tamilarasan, R. Saranya, "Emotion Based Music Recommendation System," IRJET, p-ISSN: 2395-0072, 2019.
[3] Hafeez Kabani, Sharik Khan, Omar Khan, Shabana Tadvi, "Emotion Based Music Player," International Journal of Engineering Research and General Science, vol. 3, issue 1, January-February 2015.
[4] Shlok Gikla, Husain Zafar, Chuntan Soni, Kshitija Waghurdekar, "Smart Music Integrating and Music Mood Recommendation," 2017 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET).
[5] T.-H. Wang and J.-J. J. Lien, "Facial Expression Recognition System Based on Rigid and Non-Rigid Motion Separation and 3D Pose Estimation," J. Pattern Recognition, vol. 42, no. 5, pp. 962-977, 2009.
[6] Srushti Sawant, Shraddha Patil, Shubhangi Biradar, "Emotion Based Music System," International Journal of Innovations & Advancement in Computer Science, IJIACS, ISSN 2347-8616, vol. 7, issue 3, March 2018.
[7] Sudha Veluswamy, Hariprasad Kanna, Balasubramanian Anand, Anshul Sharma, "Method and Apparatus for Recognizing an Emotion of an Individual Based on Facial Action Units," US 2012/0101735 A1.
[8] Markus Mans Folke Andreasson, "Generating Music Playlist Based on Facial Expression," US 8094891 B2.
[9] Mutasem K. Alsmadi, "Facial Expression Recognition," US 2018/0211102 A1.
