
Cognition, 44 (1992) 227-240

Department of Brain and Cognitive Sciences, MIT, Cambridge, MA 02139, USA

Received November 4, 1991; final revision accepted February 5, 1992

Etcoff, N.L., and Magee, J.J., 1992. Categorical perception of facial expressions. Cognition, 44: 227-240.

People universally recognize facial expressions of happiness, sadness, fear, anger, disgust, and perhaps, surprise, suggesting a perceptual mechanism tuned to the facial configuration displaying each emotion. Sets of drawings were generated by computer, each consisting of a series of faces differing by constant physical amounts, running from one emotional expression to another (or from one emotional expression to a neutral face). Subjects discriminated pairs of faces, then, in a separate task, categorized the emotion displayed by each. Faces within a category were discriminated more poorly than faces in different categories that differed by an equal physical amount. Thus emotional expressions, like colors and speech sounds, are perceived categorically, not as a direct reflection of their continuous physical properties.

Facial expressions provide perhaps the most effective means of communicating emotion. Not entirely under the control of the emoter, they make emotional states transparent in a way that thoughts and beliefs can never be. A century of research has shown that the same facial movements are used to express happiness, sadness, fear, anger, disgust, and surprise universally, though

Correspondence to: Nancy L. Etcoff, Department of Brain and Cognitive Sciences, E10-237, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
*Supported by NIH grant DC00565 to the first author. We thank Susan Carey, Paul Ekman, Steven Pinker, and Gregg Solomon for comments, and Brian Kellner for technical assistance. Parts of this research were presented at the Fifth Conference of the International Society for Research on Emotion, New Brunswick, New Jersey, July, 1990.

0010-0277/92/$4.70 © 1992 - Elsevier Science Publishers B.V. All rights reserved.



the rules for when it is appropriate to display them may vary from culture to culture (Ekman & Friesen, 1971). Darwin (1872) further argued that facial expressions are not arbitrary, but have their origins in basic acts of self-preservation common to humans and animals, and are sensibly related to the emotional states they express; expressions as we now know them are these actions manqué, stylized remnants of the bodily movements associated with the original acts.

There is evidence that the perceiver of facial expressions is biologically prepared to react to them, and that the mechanisms for recognizing expressions are distinct from those underlying the recognition of facial identity (Bruyer et al., 1983; Etcoff, 1984; Campbell, Landis, & Regard, 1986). Recordings from monkeys have found neurons in the temporal cortex responsive to faces and to expressions of emotion (Perrett et al., 1984; Hasselmo, Rolls, & Baylis, 1989). Monkeys reared in isolation showed fear when confronted for the first time with pictures of threatening faces (Sackett, 1966), and in a conditioning experiment with humans, electrodermal responses conditioned to angry, but not happy, faces showed increased resistance to extinction (Ohman & Dimberg, 1978).

Finally, there is evidence that very young infants respond appropriately to facial expressions. Infants imitate facial movements (Meltzoff & Moore, 1977), recognize different versions of the same expression, and match expressions to the corresponding tone of voice (Caron, Caron, & Myers, 1982; Walker-Andrews, 1986). By the end of the first year, facial expressions provide a powerful source of social information: one-year-olds will not crawl over a "visual cliff" if their mother shows a fearful face, but will if she is smiling (Sorce, Emde, Campos, & Klinnert, 1985).

What is still an open question, however, is how the perceiver represents facial expressions.

One possibility is that the perceptual mechanism is specifically tuned to each configuration of facial features that represents a particular emotion, and categorizes perceived expressions as exemplifying the nearest emotion. This would predict a qualitative difference in the way similar expressions actually appear to the perceiver, depending on whether they are in the same category. Alternatively, people could perceive facial configurations continuously, directly reflecting the piecemeal occurrence of different emotion-related positions and shapes of facial muscles. Category membership would then be assigned by later conceptual and linguistic systems, but the actual perception of the configuration would proceed with category membership having no real effect on it.

The experiments reported here are designed to test whether the actual perception of facial expressions is categorical. The logic and methodology are as follows. If expressions are perceived categorically, then within a series of facial configurations separated by equal physical increments between two different emotional expressions (or between an emotional expression and a neutral state), the probability of identifying an expression as a particular emotion should not vary linearly across the series but should change relatively quickly at some boundary. Crucially, pairs of faces differing by a given physical amount should be discriminated more accurately when that difference straddles the category boundary.
This methodology has been applied successfully to many perceptual domains, including color and speech. For example, we perceive the wavelength continuum as divided into a few basic color categories, and we perceive the acoustic continuum of voice-onset time as perceptually divided into the consonants ba and pa. A pair of bas sound more like one another, and are harder to discriminate, than a ba and a pa, even if the latter are more similar acoustically; two greens are harder to discriminate than a green and a yellow, even when they differ more in wavelength (see Liberman, Harris, Hoffman, & Griffith, 1957; Bornstein, 1973; reviewed in Harnad, 1987).
Because a discrimination task requires only that physical differences be attended to, categorical effects would suggest that perceptual assignment of a facial configuration to an emotion category is mandatory. People cannot help but see the face as showing one or another kind of emotion; discriminating facial expressions is easier for different emotions because it can be directed by the qualitatively distinct internal category representations.

Subjects

MIT undergraduates served as subjects. Each participated in one of eight conditions, each corresponding to a single pair of facial expressions.

Materials

Series of pictures of facial expressions with identical physical differences between successive members were generated by computer. Photographs of models displaying happiness, sadness, anger, disgust, fear, surprise, and neutral expressions were taken from a standardized set (Ekman & Friesen, 1975). Line drawings of these faces were created using a computer system that records the positions of 169 landmark points identified on each photograph as line segments - a format designed to present a face with the fewest possible lines (Brennan, 1982). The resulting drawings preserve the eyebrows, eyelids, and mouth, all crucial to emotional expression.

Continua containing 11 evenly spaced faces were created by a program that computes the arithmetic mean of the values of the coordinates of the facial landmarks from a set of face drawings. The endpoints of each continuum (e.g., angry and afraid) corresponded to the original drawings; intermediate faces were created by averaging 10 angry faces and 1 afraid face, then 9 angry faces and 2 afraid faces, and so on.
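The averaging scheme for generating a continuum can be sketched as follows; this is a minimal illustration under the assumption that each drawing is stored as an array of landmark coordinates (the array contents and function name here are hypothetical, not the original software):

```python
import numpy as np

def morph_continuum(face_a, face_b, n_faces=11):
    """Return n_faces evenly spaced faces running from face_a to face_b.

    Each face is an array of landmark coordinates; weighted averages of
    the two endpoint faces give the intermediate members.
    """
    weights = np.linspace(0.0, 1.0, n_faces)
    return [(1.0 - w) * face_a + w * face_b for w in weights]

# Hypothetical landmark sets: 169 (x, y) points per face
angry = np.zeros((169, 2))
afraid = np.ones((169, 2))
continuum = morph_continuum(angry, afraid)
```

Successive members of such a list differ by a constant physical amount, which is the property the experimental design requires.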
Three continua were created from emotions that are easily discriminated: happy-sad, angry-sad, and angry-afraid. These were chosen not only for their discriminability but because each pair represents opposite ends of proposed emotional "dimensions" (e.g., Ekman, Friesen, & Ellsworth, 1982). If so, one might expect faces to be perceived continuously along these scales and not show categorical perception. Other continua paired emotions that are commonly confused with one another (angry-disgusted and surprised-afraid), to test whether the confusion occurred at the perceptual level; the remaining continua (happy-neutral, sad-neutral, and happy-surprised) ran from an emotional expression to a neutral face or paired surprise with an emotion with which it is seldom confused.

Models were chosen such that the faces at intermediate points in each continuum were still recognizable by pilot subjects, and such that corresponding features did not collide as they were averaged (e.g., raised and contracted lips merged). This yielded two models for the continua anchored by an angry face and three each for the other continua.

Figure 1. Examples of stimuli from each emotion continuum. From left to right: angry-sad; angry-afraid; angry-disgusted; happy-sad; happy-neutral; sad-neutral; happy-surprised; and surprised-afraid.

Because the stimuli consist of line drawings, some may argue that the findings are compromised because judgments were made on artificial stimuli. Note, however, that artificial stimuli are unavoidable: the design of a categorical perception experiment requires a series of stimuli with equal physical differences separating them, and such stimuli are not found in nature. Likewise, research on the categorical perception of speech, for example, relies on artificial synthesized speech. It is also important to note that the drawings produced by the digitization program are quite detailed, and drawings of famous individuals have been found to be recognizable; pilot subjects confirmed that the drawings used here were clear exemplars of facial expressions. Recent work by Benson and Perrett (1991) confirms the adequacy of this kind of representation: using photographic-quality images, they replicated results from experiments that had just used line drawings (Rhodes et al., 1987).

Procedure

In the discrimination task, subjects judged whether pairs of faces, presented sequentially and separated by 1-s intervals, were the same or different; each pair was presented several times for each subject. In the identification task, subjects were shown the faces one at a time in random order, eight presentations of each face, and labeled the emotion each displayed.¹ The category boundaries in each continuum were then estimated from the identification data.

¹Pilot testing revealed that some subjects treated angry and disgusted as synonyms, so the terms were distinguished for subjects by defining anger in terms of "injustice" and disgust as "repulsion."

The identification data suggest that the expressions were perceived categorically. In the identification task, subjects sharply sorted the continua into separate categories, with a clear boundary between them (Figure 2). To determine whether the pattern in the identification task reflects a category boundary (rather than being a linear trend), the following trend analysis was performed. First, a linear contrast was computed using equally spaced weights for the 11 faces; this was, of course, significant in all the conditions. Then the deviation from the linear trend was tested; interestingly, this was significant at p < .001 for all continua, showing that identification probability did not simply increase linearly with the physical continuum. Finally, a contrast orthogonal to the linear trend was computed, corresponding to the nonlinear component of an ideal pattern of categorical perception, in which the first five faces would be given a label with probability 0, the last five faces would be given the label with probability 1, and the middle face with probability .5 (note that the contrast is only sensitive to a category boundary that happens to fall exactly in the middle of the continuum). This contrast was significant at p < .001 for all continua. These analyses must be interpreted with caution, because the identification probabilities may not linearly reflect the subjects' underlying degree of certainty. Although they suggest that faces are seen as falling into one or the other category, they do not show that category differences, not just proportion of mixture, are affecting the judgments. The most straightforward test of categorical perception comes from the discrimination data to be discussed.
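The two contrast weightings can be sketched as follows; this is a minimal numerical illustration of the scheme described above, not the original analysis code:

```python
import numpy as np

n = 11  # faces per continuum

# Linear contrast: equally spaced weights, centered so they sum to zero
linear = np.arange(n, dtype=float)
linear -= linear.mean()

# Ideal categorical identification pattern: the first five faces get the
# label with probability 0, the middle face with .5, the last five with 1
ideal = np.array([0.0] * 5 + [0.5] + [1.0] * 5)

# Center the ideal pattern and project out its linear component, leaving
# the nonlinear "categorical" contrast, orthogonal to the linear trend
categorical = ideal - ideal.mean()
categorical -= (categorical @ linear) / (linear @ linear) * linear

# A contrast score for observed identification probabilities is the dot
# product of the weights with the probabilities; it is positive here
# because the ideal pattern itself is perfectly categorical
score = categorical @ ideal
```

Because the categorical contrast is constructed to be orthogonal to the linear trend, a significant score reflects the boundary-like shape of the identification curve, not its overall slope.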
These identification findings are consistent with several cross-cultural studies showing strong agreement in labeling an expression even across cultures that differ greatly in other respects (Ekman & Friesen, 1971; Ekman et al., 1987). Identification of the expressions at the endpoints was at or above 90% for all continua, suggesting that the endpoint emotions were easily recognized. The pattern of confusion is also consistent with cross-cultural studies: when people are asked to identify photographs of facial expressions, surprise and fear are the two emotions most likely to be confused with one another (Ekman, Friesen, & Ellsworth, 1982).
To analyze the discrimination task, we defined two clear categories in each continuum: the faces identified as one emotion more than 66% of the time, and the faces identified as the other emotion more than 66% of the time. Categorical perception would correspond to higher accuracies for discriminating between-category pairs than within-category pairs of equal physical separation, not merely to chance accuracy for within-category pairs. Either 2-step or 3-step discriminations were analyzed, depending on whether one or two faces were identified at frequencies between 33% and 66%.
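The pair classification can be sketched as follows; the identification proportions and function names are illustrative assumptions, not the experiment's data:

```python
def classify_faces(p_label_a):
    """Assign each face to a clear category ('A' or 'B') or to neither.

    p_label_a[i] is the proportion of trials on which face i was given
    the first emotion label of the continuum.
    """
    cats = []
    for p in p_label_a:
        if p > 0.66:
            cats.append("A")
        elif p < 0.34:  # labeled with the other emotion more than 66%
            cats.append("B")
        else:
            cats.append(None)  # boundary face, in neither clear category
    return cats

def pair_type(cats, i, j):
    """Label a pair as between-category, within-category, or boundary."""
    if None in (cats[i], cats[j]):
        return "boundary"
    return "between" if cats[i] != cats[j] else "within"

# Illustrative identification proportions for the 11 faces of one continuum
p = [1.0, 0.98, 0.95, 0.90, 0.80, 0.50, 0.20, 0.10, 0.05, 0.02, 0.0]
cats = classify_faces(p)
# One face between 33% and 66% -> analyze 2-step pairs; two -> 3-step
step = 2 if cats.count(None) == 1 else 3
```

With one boundary face, as here, the between-category comparison is the 2-step pair that skips over it (faces at indices 4 and 6), matched against 2-step pairs lying entirely within a category.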
For each continuum except those involving surprise, the discrimination data
Figure 2. For each emotion pair, the percentage of trials a face was identified with a given emotion label (top), and the percentage of trials a pair of faces was successfully discriminated (bottom). The dashed vertical lines show the category boundaries, estimated from the identification percentages.

fell into the pattern corresponding to categorical perception: faces straddling a category boundary were distinguished more accurately than faces within a category, holding physical similarity constant. The means are shown in Figure 2. Planned contrasts comparing the single between-category pair against the within-category pairs were statistically significant (all ps < .03), and the residual variance in the main effect was not significant (all ps > .10). The results of the contrasts are summarized as follows:² happy-sad, F(1,49) = 14.87, p < .001; angry-afraid, F(1,96) = 11.91, p < .001; angry-sad, F(1,88) = 12.58, p < .001; angry-disgusted, F(1,56) = 9.44, p < .01; happy-neutral, F(1,77) = 11.24, p < .01; sad-neutral, F(1,88) = 5.84, p < .02; afraid-surprised, F(1,88) < 1; happy-surprised, F(1,70) < 1.
Note that for pairs within a category, the discrimination rates were significantly
lower than for the pair at the boundary, but they were still above chance. A
skeptic might argue that these results do not indicate true categorical perception
since information about the physical properties of faces is not totally unavailable
to the perceiver. Early operational definitions of categorical perception from
studies in speech perception included at-chance discrimination within a category
as one criterion (e.g., Studdert-Kennedy, Liberman, Harris, & Cooper, 1970).
However, even in speech research, this claim has been disproved: people are able to differentiate within-category patterns with considerable accuracy (e.g., Barclay, 1972; MacMillan, Kaplan, & Creelman, 1977; reviewed in Harnad, 1987).
As Eimas, Miller, and Jusczyk (1987) argue, this does not weaken the basic
concept of categorical perception. Rather, it suggests that categorical perception
is not simply the result of an inability to discriminate the low-level physical
properties of the stimuli, but is the result of further processing whose effects sum
with the results of the lower-level discriminations. Thus our results are entirely
consistent with current definitions of categorical perception as a discontinuity in
discrimination at the category boundary of a continuum, with greater difficulties
in discriminating members of the same category than members of different
categories even though the amount of physical differences between both pairs is
the same (e.g., Cutting & Rosner, 1974; Harnad, 1987).
Interestingly, categorical boundaries are perceived not just between emotions
²The different degrees of freedom associated with the F-tests reflect the fact that some analyses were done on 2-step and some on 3-step discriminations, and that slightly different numbers of subjects discriminated the different continua: 13 subjects discriminated the angry-afraid and the angry-sad continua, 12 subjects discriminated the happy-neutral, sad-neutral, afraid-surprised, and happy-surprised continua, 9 subjects discriminated the anger-disgust continuum, and 8 subjects discriminated the happy-sad continuum.
³Note that some of the within-category pairs near the boundary involved faces in the no-man's-land between clear categories. If they were seen by some subjects in some trials as belonging to one category, those trials would have effectively been between-category comparisons. The contrasts indicating categorical perception come out despite this diluting effect. When only within-category pairs with members in each of the clear categories are analyzed, the between-within contrasts are even stronger for all pairs except those involving surprised, which remain nonsignificant.

but between emotions and non-emotions (neutral faces). This suggests that an
emotionally neutral face does not just correspond to a low degree of emotionality
but is perceived as its own category, and the point where an emotion becomes too
weak to have signal value is sharply perceived.
Although subjects could clearly label the surprised faces in the identification
task, they did not show categorical perception when discriminating them from
happy or afraid faces: pairs straddling the category boundaries were no easier to
discriminate than pairs within a category. The noncategoricality is due to surprise itself; happiness and fear were categorically discriminated from other emotions such as neutrality, sadness, and anger. Although surprised faces are physically similar to afraid ones and are commonly confused with them, this does not explain the noncategoricality, because it also occurs when surprise is discriminated from happiness, an expression which is physically dissimilar to, and seldom confused with, surprise. Moreover, anger and disgust are commonly confused but are discriminated categorically. The behavior of the surprise expression suggests that it is perceived differently from the other emotions, combining with an emotion in perception rather than defining a category that is mutually exclusive with it. This finding is consistent with the lack of unequivocal evidence that surprise expressions are universally labeled as such, and with arguments that surprise may not be an emotion at all but a cognitive state that easily combines with true, valenced emotions such as fear (see Ekman, 1984; Oatley & Johnson-Laird, 1987). Lazarus captures this relationship in describing surprise as a "pre-emotion" (Lazarus, 1991).

Note that it is unlikely that the categorical discrimination observed was mediated by subjects' use of verbal labels. Faces at the category boundaries were harder to name and showed much more variation in the labels given, yet pairs straddling them were discriminated more accurately. Surprise was as easy to label as any other emotion and showed the same sharp pattern in the identification task, yet the pair straddling the boundary was no easier to discriminate. The tendency to see expressions categorically was striking: no subject realized that each series contained gradations between expressions, and in the free naming task, descriptions in which two emotions were mentioned were virtually nonexistent, even though use of multiple labels of the subjects' choosing was encouraged.

The results of this experiment show that, by standard criteria, facial expressions of happiness, fear, anger, disgust, sadness, and neutrality are perceived categorically: small physical differences in facial features at boundaries between assigned emotion categories are perceived more accurately than equal physical differences between faces assigned to the same emotion category. This suggests

that continuous information about the physical configurations of facial features is not the salient representation on which people base judgments about facial expressions. Rather, such continuous information is obligatorily transformed into categorical information corresponding to the nearest emotion, with the result that even though the same-different physical discrimination task had nothing to do with categories, and would have been optimally performed by attending to the physical features directly, subjects could not help but base their discriminations in large part on the categories.
Why should emotions be perceived categorically? In speech perception, phonemes are said to be perceived categorically because the goal of the perceptual system is to recover the motoric command program that the speaker used to utter the sound; these commands correspond to the speaker's intention of which abstract phoneme to articulate (e.g., Liberman et al., 1957). The perception is forced into one or another category because it would not make sense to perceive it as having an intermediate value: there is no intermediate state in a speaker corresponding to intending to articulate something between a p and a b.
At first glance, it would seem that this logic would not carry over to facial
expressions, because the person showing the expression can be in, and perhaps
express, a mixture of emotional states, and it would seem helpful for the perceiver
to see mixtures as mixtures. “Mixed emotions,” though, may actually not be a
mixture of two states but an alternation between them that is resolved into one or
another before motivating an overt behavior. Though people surely can experience a variety of emotions in response to one event, it is an open question
whether these emotions actually co-occur in time, or whether they unfold serially
or in alternation (see Lazarus, 1991). However, even if multiple emotions can
coexist, the state relevant to the perceiver may still be categorical. A person
experiencing both anger and fear may flee the situation or fight, but would not do
something halfway in between. If the perceiver is built to detect states in others
that can motivate behavior, it might be best to see the face in terms of the single
most likely underlying state, rather than some mixture.
Do the current data bear on the nature of the emotions themselves that
underlie facial expressions? There is a longstanding controversy over whether
emotions are best conceptualized as falling into a small set of discrete categories
such as sadness and anger - one for each basic facial expression of emotion (see
Ekman, 1984; Izard, 1971, 1977; Tomkins, 1962/1963) - or whether emotional
experience is fundamentally continuous, each emotion being a point in a multi-
dimensional space defined by dimensions such as approach-avoidance and pleasantness (e.g., Russell, 1980; Watson & Tellegen, 1985). The finding that facial expressions are perceived categorically is a direct prediction of the theory that the emotions themselves are categorical. Moreover, it refutes suggestions, motivated by the continuous-experience theories, that facial expressions do not provide enough information to define category distinctions, that people do not

naturally perceive expressions as falling into categories unless forced to by


multiple-choice experimental procedures, and that in perceiving expressions we map continuous aspects of facial configurations onto continuous dimensions of emotion (e.g., overall degree of tension or contraction of facial muscles is perceived as degree of arousal).
The data, however, do not directly refute the continuous-experience theory
itself, because it is not logically necessary that each internal emotional state be
equated with one of its behavioral concomitants. A range of emotional ex-
periences could, in principle, be mapped onto a smaller set of categories of
expressions, and the perceptual system could be tuned to detect them, and use
this information in combination with other sources to infer the emotional state.
According to this way of reconciling the continuous-experience theory of emotion
with the categorical nature of the perception of emotional expressions, expres-
sions would be an autonomous intermediate level of mental categories, not directly corresponding either to single underlying states, on one side, or to physical facial configurations, on the other. Expressions would be a kind of
communicative symbol, analogous to phonemes or syllables, which also cannot be
identified either with particular acoustic waveforms or with particular meanings,
but constitute an intermediate level of analysis that the listener uses in mapping
from one to the other. Thus, while the current results cannot distinguish between
the categorical and continuous theories of emotion, they are somewhat more
easily accommodated by the categorical theory, and speak against a strong version of the continuous theory whereby facial expressions directly reflect continuous underlying dimensions of emotional experience.

A final implication of the categorical perception of expressions concerns the puzzling finding that people are poor at detecting facial deception, in which one expression is deliberately used to mask another, even though such blends are physically distinguishable from displays of the masking emotion alone (Ekman & Friesen, 1974; Ekman, Friesen, & O'Sullivan, 1988). The blend may not be perceived as such but as an exemplar of the masking emotion; indeed, this form of facial deception might be used just because it exploits the categorical nature of ordinary emotion perception.

References

Barclay, J.R. (1972). Noncategorical perception of a voiced stop: A replication. Perception and Psychophysics, 11, 269-273.
Benson, P.J., & Perrett, D.I. (1991). Perception and recognition of photographic quality facial
caricatures: Implications for the recognition of natural images. European Journal of Cognitive
Psychology, 3, 105-135.
Bornstein, M.H. (1973). Color vision and color naming: A psychophysiological hypothesis of cultural
difference. Psychological Bulletin, 80, 257-285.

Brennan, S.E. (1982). Caricature generator: Dynamic exaggeration of faces by computer. Leonardo, 18, 170-178.
Bruyer, R., Laterre, C., Seron, X., Feyereisen, P., Strypstein, E., Pierrard, E., & Rectem, D. (1983). A case of prosopagnosia with some preserved covert remembrance of familiar faces. Brain and Cognition, 2, 257-284.
Campbell, R., Landis, T., & Regard, M. (1986). Face recognition and lipreading: A neurological
dissociation. Brain, 109, 509-521.
Caron, R.F., Caron, A.J., & Myers, R.S. (1982). Abstraction of invariant face expressions in infancy.
Child Development, 53, 1009-1015.
Cutting, J.E., & Rosner, B.S. (1974). Categories and boundaries in speech and music. Perception and
Psychophysics, 16, 564-570.
Darwin, C. (1872). The expression of the emotions in man and animals. London: Murray. Reprinted
1965, Chicago: University of Chicago Press.
Dimberg, U. (1990). Facial electromyography and emotional reactions. Psychophysiology, 27, 481-494.
Eimas, P.D., Miller, J.L., & Jusczyk, P.W. (1987). Infant speech perception. In S. Harnad (Ed.), Categorical perception. Cambridge, UK: Cambridge University Press.
Ekman, P. (1984). Expression and the nature of emotion. In K. Scherer & P. Ekman (Eds.),
Approaches to emotion (pp. 319-343). Hillsdale, NJ: Erlbaum.
Ekman, P., & Friesen, W.V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124-129.
Ekman, P., & Friesen, W.V. (1974). Detecting deception from the body or face. Journal of Personality
and Social Psychology, 29, 288-298.
Ekman, P., & Friesen, W.V. (1975). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists
Press.
Ekman, P., Friesen, W.V., & Ellsworth, P. (1982). What emotion categories or dimensions can observers judge from facial behavior? In P. Ekman (Ed.), Emotion in the human face (pp. 39-55). Cambridge, UK: Cambridge University Press.
Ekman, P., Friesen, W.V., & O'Sullivan, M. (1988). Smiles when lying. Journal of Personality and Social Psychology, 54, 414-420.
Ekman, P., Friesen, W.V., O'Sullivan, M., Chan, A., Diacoyanni-Tarlatzis, I., Heider, K., Krause,
R., Lecompte, W.A., Pitcairn, T., Ricci-Bitti, P.E., Scherer, K.R., Tomita, M., & Tzavaras,
A. (1987). Universals and cultural differences in the judgments of facial expressions of
emotion. Journal of Personality and Social Psychology, 53, 712-717.
Etcoff, N.L. (1984). Selective attention to facial identity and facial emotion. Neuropsychologia, 22, 281-295.
Etcoff, N.L. (1989). Asymmetries in recognition of emotion. In F. Boller & J. Grafman (Eds.), Handbook of neuropsychology (Vol. 3, pp. 363-382). Amsterdam: Elsevier.
Fried, I., Mateer, C., Ojemann, G., Wohns, R., & Fedio, P. (1982). Organization of visuospatial
functions in the human cortex. Brain, 105, 349-371.
Harnad, S. (1987). Categorical perception. Cambridge, UK: Cambridge University Press.
Hasselmo, M.E., Rolls, E.T., & Baylis, G.C. (1989). The role of expression and identity in the
face-selective responses of neurons in the temporal visual cortex of the monkey. Behavioural
Brain Research, 32, 203-218.
Hochschild, A.R. (1983). The managed heart: Commercialization of human feeling. Berkeley, CA:
University of California Press.
Izard, C.E. (1971). The face of emotion. New York: Appleton-Century-Crofts.
Izard, C.E. (1977). Human emotions. New York: Plenum.
Lazarus, R.S. (1991). Emotion and adaptation. Oxford: Oxford University Press.
Liberman, A.M., Harris, K.S., Hoffman, H.S., & Griffith, B.C. (1957). The discrimination of speech sounds within and across phoneme boundaries. Journal of Experimental Psychology, 54, 358-368.
MacMillan, N.A., Kaplan, H.L., & Creelman, C.D. (1977). The psychophysics of categorical
perception. Psychological Review, 84, 452-471.

Meltzoff, A.N., & Moore, M.K. (1977). Imitation of facial and manual gestures in human neonates.
Science, 198, 75-78.
Oatley, K., & Johnson-Laird, P.N. (1987). Towards a cognitive theory of emotion. Cognition and
Emotion, 1, 29-50.
Ohman, A., & Dimberg, U. (1978). Facial expressions as conditioned stimuli for electrodermal responses: A case of "preparedness"? Journal of Personality and Social Psychology, 36, 1251-1258.
Perrett, D.I., Smith, P.A.J., Potter, D.D., Mistlin, A.J., Head, A.S., Milner, A.D., & Jeeves, M.A.
(1984). Neurones responsive to faces in the temporal cortex: Studies of functional organization,
sensitivity to identity and relation to perception. Human Neurobiology, 3, 197-208.
Rhodes, G., Brennan, S.E., & Carey, S. (1987). Identification and ratings of caricatures: Implications
for mental representations of faces. Cognitive Psychology, 19, 473-497.
Russell, J.A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39,
1161-1178.
Sackett, G.P. (1966). Monkeys reared in isolation with pictures as visual input: Evidence for an innate
releasing mechanism. Science, 154, 1468-1473.
Sorce, J.F., Emde, R.N., Campos, J.J., & Klinnert, M.D. (1985). Maternal emotional signaling: Its effect on the visual cliff behavior of 1-year-olds. Developmental Psychology, 21, 195-200.
Studdert-Kennedy, M., Liberman, A.M., Harris, K.S., & Cooper, F.S. (1970). Motor theory of
speech perception: A reply to Lane’s critical review. Psychological Review, 77, 234-249.
Tomkins, S.S. (1962/1963). Affect, imagery, consciousness (Vols. 1 and 2). New York: Springer.
Walker-Andrews, A.S. (1986). Intermodal perception of expressive behaviors: Relation of eye and
voice? Developmental Psychology, 22, 373-377.
Watson, D., & Tellegen, A. (1985). Toward a consensual structure of mood. Psychological Bulletin,
98, 219-235.
