He, J.-X. et al.

Paper:

∗3 School of Automation, China University of Geosciences
No.518 Changxiang West Avenue, College Park, Zhenjiang City, Jiangsu 212000, China
In recent years, with further breakthroughs in artificial intelligence theory and technology, as well as the continued expansion of the Internet, the recognition of human emotions and the necessity of satisfying human psychological needs have been highlighted as future directions of artificial intelligence development, in addition to physical task accomplishment. Musical emotion classification is an important research topic in artificial intelligence. The key premise of realizing music emotion classification is to construct a musical emotion model that conforms to the characteristics of music emotion recognition. Currently, three types of music emotion classification models are available: discrete category, continuous dimensional, and music emotion-specific models. The pleasure-arousal music emotion fuzzy model, which covers a wide range of emotions compared with other models, is selected as the emotional classification system in this study to investigate the influencing factors of musical emotion classification. Two representative emotional attributes, i.e., speed and strength, are used as variables. Based on test experiments involving music and non-music majors combined with questionnaire results, the relationship between music properties and emotional changes under the pleasure-arousal model is revealed quantitatively.

Keywords: music emotion, classification, affective computing, fuzzy model, pleasure-arousal emotion space

1. Introduction

Marvin Minsky, one of the founders of artificial intelligence, mentioned the following in The Society of Mind as early as 1985: "The question is not whether intelligent machines can have emotions, but whether machines can be intelligent without emotions." In 1997, Picard of the Massachusetts Institute of Technology proposed affective computing [1], by which emotions can not only be measured and analyzed but also be influenced. The introduction of affective computing opened a new field of computer science and resulted in a breakthrough in research regarding accessible human–robot interaction technology.

Musical emotion classification is an important component of emotional classification. Music is an essential form of the artistic expression of human emotions. Through the recognition and classification of musical emotions, the construction of emotional models, and reasoning optimization, the digital expression patterns of human music emotions can be obtained, thereby improving the smoothness of human–robot interaction technology.

1.1. Musical Emotion Model

An important prerequisite for realizing music emotion classification is to construct a musical emotion model that conforms to the characteristics of emotional cognition. Constructing a musical emotion model requires the extraction of different musical cue features from musical and acoustic symbols, such as pitch, timbre, strength, modulation, rhythm, and melody, followed by emotional classification integration. Music emotion models can be categorized into continuous dimensional, discrete category, and music emotion-specific models. Currently, the continuous dimensional and discrete category models are used most frequently.

1.1.1. Continuous Dimensional Model

A continuous dimensional model represents a human emotional state as a point in a two-dimensional (2D)
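The pleasure-arousal representation described above can be sketched in code. The quadrant labels, the normalization to [-1, 1], and the example coordinates below are illustrative assumptions for exposition, not the paper's taxonomy:

```python
from dataclasses import dataclass

@dataclass
class EmotionPoint:
    """A musical emotion as a point in pleasure-arousal (PA) space.

    Both coordinates are assumed to be normalized to [-1, 1];
    the quadrant labels below are illustrative, not the paper's model.
    """
    pleasure: float  # valence: unpleasant (negative) to pleasant (positive)
    arousal: float   # activation: calm (negative) to excited (positive)

    def quadrant(self) -> str:
        # Map the four PA quadrants to coarse emotion labels.
        if self.pleasure >= 0:
            return "happy/excited" if self.arousal >= 0 else "serene/relaxed"
        return "angry/tense" if self.arousal >= 0 else "sad/depressed"

# A fast, strong piece tends toward high pleasure and high arousal,
# while a slow, soft piece tends toward the calm quadrants.
fast_strong = EmotionPoint(pleasure=0.7, arousal=0.8)
slow_soft = EmotionPoint(pleasure=0.3, arousal=-0.6)
print(fast_strong.quadrant())  # happy/excited
print(slow_soft.quadrant())    # serene/relaxed
```

Representing emotions as continuous coordinates, rather than discrete labels, is what lets the model express gradual emotional transitions and fuzzy category boundaries.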
… strength) × 2 (music type: happy, serene) experimental design was adopted.

Fig. 5. Sample coincidence rate.

Source   SS       df   MS       F          Sig.
A        902.16   2    220.75   1034.95a   0.000
B        184.65   2    9.31     190.40a    0.000
C        817.88   1    2.35     1482.86a   0.000
AB       88.20    4    30.69    47.32a     0.000
AC       82.97    2    117.05   99.78a     0.000
BC       3.11     2    5.55     3.56a      0.032
ABC      150.96   4    13.51    131.36a    0.000

a: Exact statistic. In the table, A represents speed, B strength, and C music type.

Source   SS       df   MS       F         Sig.
A        441.50   2    220.75   431.11a   0.000
B        18.62    2    9.31     22.64a    0.000
C        2.346    1    2.35     3.83a     0.053
AB       122.76   4    30.69    78.53a    0.000
AC       234.10   2    117.05   179.07a   0.000
BC       11.11    2    5.55     12.98a    0.000
ABC      54.16    4    13.54    40.36a    0.000
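The SS, df, MS, F, and Sig. columns in the tables above come from a factorial analysis of variance. As a simplified illustration of where such an F statistic comes from, the sketch below computes a one-way ANOVA over three speed levels in pure Python; the ratings are invented for the example, and the full study actually uses a 3 × 3 × 2 three-factor design with interaction terms:

```python
from itertools import chain

def one_way_anova_f(groups):
    """Compute the one-way ANOVA F statistic for k groups.

    Returns (SS_between, SS_within, df_between, df_within, F).
    A simplified illustration; the study itself uses a 3x3x2
    factorial design with interaction terms.
    """
    all_scores = list(chain.from_iterable(groups))
    grand_mean = sum(all_scores) / len(all_scores)

    # Between-group SS: variation of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group SS: variation of scores around their own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    # F = MS_between / MS_within, where MS = SS / df.
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, df_between, df_within, f_stat

# Hypothetical arousal ratings for slow, medium, and fast music (levels 1-3).
slow = [2.1, 2.4, 1.9, 2.2]
medium = [3.0, 3.3, 2.8, 3.1]
fast = [4.2, 4.0, 4.4, 4.1]
*_, f = one_way_anova_f([slow, medium, fast])
print(round(f, 1))
```

A large F (group differences dwarfing within-group noise) yields the near-zero Sig. values seen in the speed and strength rows of the tables.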
(I)  (J)  Mean diff. (I−J)  Std. error  Sig.a  95% CI for differencea (lower, upper)
(I)A (J)A
1    2    −0.412*           0.035       0.000  (−0.482, −0.342)
1    3    −1.211*           0.041       0.000  (−1.293, −1.130)
2    1    0.412*            0.035       0.000  (0.342, 0.482)
2    3    −0.799*           0.040       0.000  (−0.879, −0.719)
3    1    1.211*            0.041       0.000  (1.130, 1.293)
3    2    0.799*            0.040       0.000  (0.719, 0.879)
(I)B (J)B
1    2    −0.220*           0.037       0.000  (−0.293, −0.147)
1    3    −0.218*           0.040       0.000  (−0.298, −0.139)
2    1    0.220*            0.037       0.000  (0.147, 0.293)
2    3    0.002             0.041       0.966  (−0.079, 0.082)
3    1    0.218*            0.040       0.000  (0.139, 0.298)
3    2    −0.002            0.041       0.966  (−0.082, 0.079)
(I)C (J)C
1    2    −0.073            0.037       0.053  (−0.148, 0.001)
2    1    0.073             0.037       0.053  (−0.001, 0.148)

Based on estimated marginal means.
*: The mean difference is significant at the 0.05 level.
a: Adjustment for multiple comparisons: least significant difference (equivalent to no adjustments).

Fig. 6. ABC pleasure degree simple effect test interaction diagram.

… (P < 0.001), i.e., the arousal of the fast music was significantly better than that of the medium-speed music. The medium-speed music was significantly better than the slow music.

ii. A significant difference existed between levels 1 and 2 and between levels 1 and 3 of the strength factor (P < 0.001); no significant difference was observed between levels 2 and 3 (P = 0.09 > 0.05), i.e., the arousal of the medium-strength music was significantly better than that of the soft music. The arousal of the strong music was significantly better than that of the soft-strength music, and the difference in arousal between the medium-strength and strong music was insignificant.

iii. A significant difference existed between levels 1 and 2 of the music type (P < 0.001), i.e., the arousal of music type "happy" was significantly better than that of music type "serene."
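The pairwise (I−J) mean differences reported above follow the least-significant-difference (LSD) procedure: each pair of factor-level means is compared using a pooled standard error from the omnibus ANOVA. The sketch below is an illustrative simplification with invented ratings and MSE; it flags |difference| > 2 × SE as significant, whereas a real LSD test compares against the t distribution's critical value:

```python
import math
from itertools import combinations

def lsd_pairwise(groups, mse, df_error):
    """Pairwise mean differences (I - J) with a pooled standard error,
    in the spirit of LSD multiple comparisons.

    `mse` and `df_error` come from the omnibus ANOVA. As a rough
    illustration, we flag |difference| > 2 * SE; a real LSD test
    uses t(0.975, df_error) * SE as the cutoff.
    """
    results = []
    means = [sum(g) / len(g) for g in groups]
    for i, j in combinations(range(len(groups)), 2):
        diff = means[i] - means[j]
        se = math.sqrt(mse * (1 / len(groups[i]) + 1 / len(groups[j])))
        results.append((i + 1, j + 1, round(diff, 3), round(se, 3), abs(diff) > 2 * se))
    return results

# Hypothetical pleasure ratings for strength levels 1 (soft) to 3 (strong).
soft = [2.0, 2.2, 1.8, 2.1]
medium = [2.9, 3.1, 3.0, 2.8]
strong = [3.0, 2.9, 3.1, 3.2]
mse = 0.02  # illustrative mean squared error from the omnibus ANOVA
for i, j, diff, se, significant in lsd_pairwise([soft, medium, strong], mse, df_error=9):
    print(f"level {i} - level {j}: diff={diff}, SE={se}, significant={significant}")
```

With these invented numbers, levels 1 vs. 2 and 1 vs. 3 come out significant while 2 vs. 3 does not, mirroring the pattern in the table, where medium-strength and strong music do not differ reliably.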
Additionally, because of the significant interaction among the three factors, the simple effect analysis revealed that, at any strength regardless of the music type, the arousal of the fast and medium-speed music was significantly better than that of the slow music. Furthermore, at any speed regardless of the music type, the arousal of the strong and medium-strength music was significantly better than that of the soft-strength music, but the difference in arousal between the strong and medium-strength music was insignificant; at any speed and strength, the arousal of music type "happy" was significantly better than that of music type "serene." Fig. 6 shows a simple effect test interaction diagram of BC (A: speed, B: strength, C: music type) pleasure when A = level 3.

3.2.2. Awakening Dimension Score Statistics

As shown in Table 7, in the dimension of pleasure, the main effects of speed and strength were extremely significant (P < 0.001), and the main effect of music type was insignificant (P = 0.053 > 0.05). The interaction effects between speed and strength, strength and music type, and speed and music type were extremely significant (P < 0.001), and the third-order interaction among speed, strength, and music type was significant (P < 0.001).

As shown in Table 8, the results of multiple analyses based on the degree of pleasure indicate the following:

i. A significant difference existed between levels 1, 2, and 3 of the speed factor (P < 0.001); level 3 was better than level 2, and level 2 was better than level 1 (P < 0.001), i.e., the pleasure of fast music was significantly better than that of the medium-speed music. The medium-speed music was significantly better than the slow music.

ii. A significant difference existed between levels 1 and 2, and between levels 1 and 3, of the strength factor (P < 0.001); no significant difference was observed between levels 2 and 3 (P = 0.966 > 0.05), i.e., the pleasure of the medium-strength music was significantly better than that of the soft music. The pleasure of the strong music was significantly better than that of the soft-strength music. The medium-strength and strong music did not differ significantly in terms of music pleasure.

In addition, owing to the significant interaction among the three factors, the analysis of simple effects revealed that, at any strength regardless of the music type, the pleasure of fast music was significantly better than that of slow music. At any speed, regardless of the music type, the pleasure of the strong and medium-strength music was significantly better than that of the soft-strength music, but the pleasure of the strong and medium-strength music did not differ significantly. When the speed was low or medium, regardless of the strength, the pleasure of music type "serene" was significantly better than that of music type "happy"; when the speed was high, regardless of the strength, the pleasure of music type "happy" was significantly better than that of music type "serene." Fig. 7 shows a simple effect test interaction diagram of AB (A: speed, B: strength, C: music type) arousal when C = level 1.

Fig. 7. ABC awakening degree simple effect test interaction diagram.

4. Discussion

Based on the results above, it was discovered that the speed of music significantly affected the emotion induced by music. The music strength affected the emotion induced by music, whereas the music type exerted a more significant effect on the degree of awakening of music. These results were consistent with those of many published studies. For example, Hou and Liu [26] used EEG to investigate the EEG characteristics of speed- and tone-induced emotional activity. The results indicated that slow

5. Conclusion

Based on music emotion classification experiments, it was discovered that the statistical results of the experimental data were consistent with those of the constructed PA model music emotion classification fuzzy model, proving the feasibility of the PA model.

Furthermore, results regarding the effects of different music attributes on music emotions using this model indicated that the speed of music affected music emotion significantly. The strength of subjective emotional experience induced by fast music was higher than that of slow music. Fast music was more likely to induce positive emotions. The faster the speed, the stronger the pleasure and arousal of the music. Furthermore, the strength of music affected music emotions. The strength of subjective emotions induced by strong music was higher than that of soft music, but the distinction between strong music and the strength of emotional experience induced by medium-strength music was insignificant. In the dimension of pleasure, strong music was more likely to induce negative emotions; in the dimension of arousal, strong music was more likely to induce positive emotions. The stronger the music, the more active it was, whereas the softer the music, the calmer it was. The type of music exerted a more pronounced effect on the awakening of music; music type "happy" had a higher effect than music type "serene."

The results of this experiment can provide theoretical guidance for composers and performers when creating music as well as facilitate the construction of relevant music emotion recognition models.

Finally, although some of the results were obtained through experiments, certain deficiencies were discovered. For example, many musical factors affect music emotions. In this study, we selected two of the most influential factors, i.e., strength and speed, to conduct the experiments. In future studies, we plan to add other music attributes to our study, including rhythm, beat, mode, loudness, and reverberation, to investigate their effects on musical emotions. The results of these studies are expected to facilitate the field of intelligent composition.

Acknowledgements
This study was supported in part by the China Hubei Province Natural Science Foundation for the Surface of the Project: Based on Emotion Calculation of the Music Robot Intelligent Composition and Performance of Key Technology Research No.2019CFB581, and the China Ministry of Education Humanities and Social Sciences Fund General Project: Dulcimer Music Robot Intelligent Knowledge Spectrum and Playing Key Technology Research No.16YJAZH080.

References:
[1] R. W. Picard, "Affective Computing," MIT Press, 1997.
[2] J. A. Russell, "A circumplex model of affect," J. of Personality and Social Psychology, Vol.39, No.6, pp. 1161-1178, 1980.
[3] J. Posner, J. A. Russell, and B. S. Peterson, "The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology," Development and Psychopathology, Vol.17, No.3, pp. 715-734, 2005.
[4] P. Gilbert, "The Biopsychology of Mood and Arousal. Robert E. Thayer," The Quarterly Review of Biology, Vol.67, No.3, p. 406, 1992.
[5] A. Mehrabian, "Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament," Current Psychology, Vol.14, No.4, pp. 261-292, 1996.
[6] A. Bittermann, K. Kuhnlenz, and M. Buss, "On the Evaluation of Emotion Expressing Robots," Proc. of 2007 IEEE Int. Conf. on Robotics and Automation, pp. 2138-2143, 2007.
[7] K. Hevner, "Experimental studies of the elements of expression in music," American J. of Psychology, Vol.48, No.2, pp. 246-268, 1936.
[8] S. Sun et al., "Study on Linguistic Computing for Music Emotion," J. of Beijing University of Posts and Telecommunications, No.z2, pp. 35-40, 2006.
[9] M. Zentner, D. Grandjean, and K. R. Scherer, "Emotions evoked by the sound of music: characterization, classification, and measurement," Emotion, Vol.8, No.4, pp. 494-521, 2008.
[10] S. Huang, L. Zhou, Z. Liu, S. Ni, and J. He, "Empirical Research on a Fuzzy Model of Music Emotion Classification Based on Pleasure-Arousal Model," The 37th Chinese Control Conf., pp. 3239-3244, 2018.
[11] A. Gabrielsson, "Emotions in strong experiences with music," P. Juslin and J. Sloboda (Eds.), "Music and Emotion: Theory and Research," pp. 431-449, Oxford University Press, 2001.
[12] K. R. Scherer and M. R. Zentner, "Emotional effects of music: Production rules," P. N. Juslin and J. A. Sloboda (Eds.), "Music and Emotion: Theory and Research," pp. 361-392, Oxford University Press, 2001.
[13] X. Zhou, "An Empirical Study on the Influence of Melody Forms on Emotion Induction," Southwest University, 2019.
[14] L. Zhang and F. Pan, "The role of speed and mode in inducing emotional response: Evidence from traditional Chinese and Western music," Exploratory Psychology, Vol.37, No.6, pp. 549-554, 2017.
[15] W. Huang, "An Empirical Study of the Influence of Classical Music on College Students' Emotions," Hunan Normal University, 2007.
[16] X. Fan, "An Empirical Study of the Influence of Guqin Music on the Emotion of College Students in Music Colleges," Wuhan Conservatory of Music, 2013.
[17] E. Altenmüller et al., "Hits to the left, flops to the right: different emotions during listening to music are reflected in cortical lateralisation patterns," Neuropsychologia, Vol.40, No.13, 2002.
[18] R. Mo, R. H. Y. So, and A. B. Horner, "How Does Parametric Reverberation Change the Space of Instrument Emotional Characteristics?," 43rd Int. Computer Music Conf., 2017.
[19] D. Västfjäll, P. Larsson, and M. Kleiner, "Emotion and the auditory virtual environment: Affect-Based judgment of virtual reverberation times," Cyberpsychology and Behavior, Vol.5, No.1, pp. 19-32, 2002.
[20] D. Zhang, "The influence of store background music on consumer purchase approach behavior," Master Thesis, Dongbei University of Finance and Economics, 2013 (in Chinese).
[21] P. N. Juslin and D. Västfjäll, "Emotional responses to music: The need to consider underlying mechanisms: Erratum," Behavioral and Brain Sciences, Vol.31, No.6, p. 751, 2008.
[22] R. He, "An Empirical Study on the Influence of Musical Melody Expectation on Emotional Experience," Master Thesis, Southwest University, 2019 (in Chinese).
[23] M. Li, "An Empirical Study of the Influence of Rhythm Structure on Emotion," Master Thesis, Southwest University, 2015 (in Chinese).
[24] H. Ma et al., "The influence of music culture experience on music emotional processing," Chinese Science Bulletin, Vol.20, pp. 2287-2300, 2017.
[25] S. Garrido and E. Schubert, "Individual differences in the enjoyment of negative emotion in music: A literature review and experiment," Music Perception, Vol.28, No.3, pp. 279-296, 2011.
[26] J. Hou and C. Liu, "The Research of Brain Waves about Emotional Activity Induced by Different Musical Conformations Composing of Mode and Tempo: A Combined Representational Character of Musical Training Experience and Gender," Psychological Exploration, Vol.30, No.6, pp. 86-93, 2010.
[27] W. F. Thompson and G. Ilie, "A Comparison of Acoustic Cues in Music and Speech for Three Dimensions of Affect," Music Perception, Vol.23, No.4, pp. 319-330, 2006.
[28] Y. Lu, "The influence of music factors, personality and situation on music emotions," Master Thesis, Southwest University, 2014 (in Chinese).

Name:
Jing-Xian He

Affiliation:
School of Arts and Communication, China University of Geosciences
School of Danyang Normal, Zhenjiang College

Address:
388 Lumo Road, Hongshan District, Wuhan, Hubei 430074, China
No.518 Changxiang West Avenue, College Park, Zhenjiang City, Jiangsu 212000, China

Brief Biographical History:
2017 Received bachelor's degree from China University of Geosciences
2020 Received MFA from China University of Geosciences
2020- Teacher, Zhenjiang College

Main Works:
• "Empirical Research on a Fuzzy Model of Music Emotion Classification Based on Pleasure-Arousal Model," The 37th Chinese Control Conf., pp. 3239-3244, 2018.

Membership in Academic Societies:
• Chinese Association for Artificial Intelligence (CAAI)
• Zhenjiang Musicians Association
Name:
Li Zhou

Affiliation:
Arts and Communication School, China University of Geosciences

Address:
388 Lumo Road, Hongshan District, Wuhan, Hubei 430074, China

Brief Biographical History:
2004 Received M.A. from Wuhan University
2005 Certificate of Computer Music Composition, Central Conservatory of Music
2004-2006 Assistant Professor, China University of Geosciences
2006- Associate Professor, China University of Geosciences
2013-2015 Visiting Scholar, Center of Music Technology of Georgia Institute of Technology

Main Works:
• "Study on Adaptive Chord Allocation Algorithm Based on Dynamic Programming," J. of Fudan University (Natural Science), Vol.58, No.3, pp. 393-400, 2019.
• "Design of Mechanical Arm of Auto-Play Dulcimer Robot," Machinery Design & Manufacture, No.1, pp. 251-253, 2018.
• "Gait planning for biped dancing robot," Engineering J. of Wuhan University, Vol.49, No.6, pp. 949-954+960, 2016.

Membership in Academic Societies:
• Electronic Music Association, Chinese Musicians' Association (CMA), Senior Member
• Chinese Association for Artificial Intelligence (CAAI)

Name:
Xin-Yue Hu

Affiliation:
School of Arts and Communication, China University of Geosciences

Address:
388 Lumo Road, Hongshan District, Wuhan, Hubei 430074, China

Brief Biographical History:
2016 Received M.E. from China University of Geosciences

Main Works:
• X. Hu and Z. Li, "The Creative Features and Enlightenment of Joe Hisaishi's Animation Film Music," Art Evaluation, CNKI:SUN:YSPN.0.2019-07-069, 2019 (in Chinese).
Name:
Zhen-Tao Liu

Affiliation:
School of Automation, China University of Geosciences

Address:
388 Lumo Road, Hongshan District, Wuhan, Hubei 430074, China

Brief Biographical History:
2008 Received M.E. from Central South University
2013 Received Dr.Eng. from Tokyo Institute of Technology
2013-2014 Lecturer, Central South University
2014-2018 Lecturer, China University of Geosciences
2018- Associate Professor, China University of Geosciences

Main Works:
• "Speech Personality Recognition Based on Annotation Classification Using Log-likelihood Distance and Extraction of Essential Audio Features," IEEE Trans. on Multimedia, doi: 10.1109/TMM.2020.3025108, 2020.
• "Electroencephalogram Emotion Recognition Based on Empirical Mode Decomposition and Optimal Feature Selection," IEEE Trans. on Cognitive and Developmental Systems, Vol.11, No.4, pp. 517-526, 2019.
• "Speech Emotion Recognition Based on Feature Selection and Extreme Learning Machine Decision Tree," Neurocomputing, Vol.273, pp. 271-280, 2018.

Membership in Academic Societies:
• The Institute of Electrical and Electronics Engineers (IEEE)
• Chinese Association for Artificial Intelligence (CAAI)