To cite this article: Sumin Sun, Karl Schweizer & Xuezhu Ren (2019): Item-Position Effect in
Raven’s Matrices: A Developmental Perspective, Journal of Cognition and Development, DOI:
10.1080/15248372.2019.1581205
ABSTRACT
This study examined whether there is a developmental difference in
the emergence of an item-position effect in intelligence testing. The
item-position effect describes the dependency of an item's psychometric characteristics on its position within the test and is explained by learning.
Data on fluid intelligence measured by Raven’s Standard Progressive
Matrices (SPM) and data on working memory tasks were collected
from both primary school age children (7–8 years old) and secondary
school age adolescents (12–13 years old). The item-position effect of
SPM was represented and separated from the ability component by
the fixed-links model. The results indicated a clear age difference:
whereas the item-position effect was observed in the adolescents, it
was not found in the primary school children. In addition, separating
the item-position effect detected in the adolescents from the ability
component led to a larger correlation with working memory than
otherwise. These results suggest that age differences in intelligence
test performance may not only reflect differences in the general
ability but also in the sources of the item-position effect.
Introduction
The concept of fluid intelligence has found its way into almost all major models of
intelligence, as, for example, the Cattell–Horn–Carroll theory of cognitive abilities.
Furthermore, research reveals that there is a close relationship between fluid intelligence
and general intelligence (Kvist & Gustafsson, 2008). Because of this property, fluid
intelligence tests such as Raven’s Matrices are frequently used as one of the main measures
of general intelligence and to estimate the status of cognitive development.
A number of studies have shown that fluid intelligence develops from childhood to
young adulthood (e.g., Fry & Hale, 2000; Molnár, Greiff, & Csapó, 2013). Investigations of
the cognitive underpinnings of intelligence development showed that speed of informa-
tion processing, executive functions, and working memory are main cognitive correlates of
intelligence test scores (Brydges, Reid, Fox, & Anderson, 2012). However, there is the
concern that intelligence scores reflect not only abilities, but also method effects/variance
that may distort the estimation of intelligence development and its relationships with
cognitive constructs (Carlstedt, Gustafsson, & Ullstadius, 2000; Lozano, 2015). One
particular source is the item-position effect. It refers to the dependency of the response
CONTACT Xuezhu Ren renxz@hust.edu.cn School of Education, Huazhong University of Science & Technology,
Luoyu Road 1037, Wuhan 430074, China
Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/hjcd.
© 2019 Taylor & Francis
to an item in a sequence of homogeneous items on the item's position within the sequence (Schweizer, Schreiner, & Gold, 2009).
The development of fluid intelligence has been linked to elementary cognitive functions (Brydges et al., 2012; Fry & Hale, 2000; Tourva, Spanoudis, & Demetriou, 2016). For example, Tourva et al. (2016) reported age-related changes in working memory capacity that led directly to developmental changes in fluid intelligence. Taken
together, these studies suggest that elementary cognitive processes may contribute to
the development of fluid intelligence.
Developmental differences in complex rule learning also suggest that young children may not be able to detect and apply the rules characterizing the items, a process that may play a role in the emergence of the item-position effect in intelligence testing. The ability to learn complex
rules develops as the result of physiological maturation of the structural networks of the
prefrontal and medial temporal cortex and the anterior cingulate cortex (Ashby, Alfonso-
Reese, Turken, & Waldron, 1998). Given that children do not show the cognitive maturation
of adults, they may find it difficult to detect abstract rules. There is indeed research showing
that the ability to acquire rules increases with age (Rabi & Minda, 2014; Reetzke, Maddox, & Chandrasekaran, 2016). Because of the developmental differences in intelligence performance and learning described above, we hypothesized that there may be no
or only a slight item-position effect when testing children due to their inefficiency in rule
learning, whereas it may be observable in adolescents.
Investigating the item-position effect in children is important for developmental
research concerning intelligence testing, as well as for a better understanding of the
development of fluid intelligence. Research on the item-position effect in adults has
revealed inhomogeneity of the intelligence tests. If the same effect is observed in children,
developmental researchers using intelligence tests to assess fluid intelligence need to be
aware that they measure not only fluid intelligence as an ability but also an additional position-related component. In
addition, there is support for the hypothesis that the main source of the item-position
effect is complex rule learning (Ren et al., 2014; Verguts & De Boeck, 2000). Detecting and
acquiring rules underlying the items requires test takers to process test items efficiently. If
no item-position effect is found in children, differences in intelligence performance
between children and adults may not only be due to the general ability but also to the
source of the item-position effect.
Method
Participants
Children and adolescents from three primary and secondary schools in a large urban
district in South China participated in this study. The majority of participants were from
a middle-class background. A total of 377 written consent forms were obtained from
participants’ parents. There were 159 primary school age children (81 males) and 218
secondary school age adolescents (107 males). The average age of primary school children
was 7.85 (SD = .31), and that of adolescents was 12.91 (SD = .39).
Measures
Raven’s Standard Progressive Matrices (SPM, Raven, Raven, & Court, 1998)
Due to testing-time restrictions, only half of the SPM items (i.e., all the odd-numbered items) were used. The resulting short form consisted of five sets of six items each, with items within a set arranged in order of increasing difficulty. Each item comprises a series of geometrical elements, one of which is missing; the participant's task was to choose the most appropriate completion from multiple alternatives. Responses to each item were recorded as binary data, with a correct response coded as 1 and an incorrect response as 0. Participants
had 8 min to complete all items. This time limit was chosen based on pilot testing so as to
make sure that all participants had sufficient time to attempt all items of the test.
Procedures
The SPM data were collected via a paper-and-pencil test. The working memory measures were implemented in E-Prime. Participants were tested individually in a quiet room. The
measures were administered in the following order: SPM, VWM, and RMS.
Statistical analysis
The representation of the item-position effect required the employment of the fixed-links
model (Zeller, Krampen, Reiß, & Schweizer, 2017). A characteristic of fixed-links models is that factor loadings are constrained according to theory-based expectations so that the variances of the manifest variables are decomposed into subcomponents. The consideration of the position effect led to a fixed-links model including two independent latent
variables: one for representing the position component of the intelligence data and one
for the ability component. The item scores served as manifest variables of the model. The
factor loadings on the ability component were kept constant. The factor loadings on the
position component were determined by a quadratic function that has proven effective in modeling the item-position effect in intelligence test data (Zeller et al., 2017). This model is referred to as the ability-position model. In addition, a congeneric one-factor model that did not represent the item-position effect was examined for comparison.
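The two fixed loading patterns can be sketched as follows. The paper does not report the exact constants or scaling used in LISREL, so the unit ability loadings and the normalization of the quadratic position loadings below are illustrative assumptions:

```python
# Sketch of the fixed loading patterns in the ability-position model.
# The constant ability loading (1.0) and the scaling of the quadratic
# position loadings are assumptions for illustration; the paper does
# not report the exact constants used in LISREL.

n_items = 30  # the odd-numbered half of the 60 SPM items

# Ability component: every item loads equally on the ability factor
ability_loadings = [1.0] * n_items

# Position component: loadings grow as a quadratic function of the
# item's position, here normalized so the last loading equals 1.0
position_loadings = [(pos / n_items) ** 2 for pos in range(1, n_items + 1)]
```

Because both loading vectors are fixed rather than estimated, only the variances of the two latent variables are free, which is what allows the position component to be separated from the ability component.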
LISREL 8.8 was used for the modeling analyses. Since the SPM item scores were binary, tetrachoric correlations were computed from the item scores and served as input to the CFA. Computing tetrachoric correlations entails treating the binary responses as discretized continuous variables (Muthén, 1984). Since the ability-position model with
constrained loadings and the one-factor congeneric model were not nested, comparisons
between the models were performed by means of the Akaike’s information criterion
(AIC). A model with a smaller AIC indicated a better fit.
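To illustrate the kind of correlation used as input: a tetrachoric correlation estimates the correlation of two latent continuous variables assumed to underlie a pair of binary item scores. PRELIS/LISREL estimate it by maximum likelihood under a bivariate normal model; the classical cosine-pi formula below is only a rough closed-form approximation, shown here for intuition:

```python
import math

def tetrachoric_cosine_pi(a, b, c, d):
    """Approximate the tetrachoric correlation from a 2x2 table of
    counts [[a, b], [c, d]] with the classical cosine-pi formula.
    This is an approximation only; PRELIS/LISREL use full maximum
    likelihood under a bivariate normal model."""
    return math.cos(math.pi / (1.0 + math.sqrt((a * d) / (b * c))))

# Two items that most test takers pass or fail together yield a high
# tetrachoric correlation:
r = tetrachoric_cosine_pi(40, 10, 10, 40)  # approx .81
```

With a perfectly balanced table (equal counts in all four cells) the approximation correctly returns a correlation of zero.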
Results
Table 1 presents the descriptive results for all measures in each age group. Independent
sample t-tests were conducted to examine age differences in the raw scores of the
intelligence and working memory measures between the primary school children and
adolescents. The results showed that adolescents performed significantly better than
primary school children with respect to the raw scores of all measures (SPM, t = 18.64,
p < .01; VWM, t = 14.12, p < .01; RMS, t = 25.05, p < .01). Table 1 also presents the correlations between the variables; all reached significance at the .01 level.
Testing whether the item-position effect affects the relationship with working memory
Since the item-position effect was observed in adolescents only, data from this age group were further investigated to determine whether the item-position effect affected the relationship of
Table 2. Fit statistics of the measurement models for each age group of participants.

Type of model                        χ2     df   RMSEA   CFI   GFI   NNFI     AIC
Primary school age children
  One-factor model                 127.58  135   .000   1.00   .92   1.01   199.58
  Ability-position model           151.77  151   .006    .95   .90    .95   191.77
Secondary school age adolescents
  One-factor model                  71.65   65   .022    .95   .95    .94   123.65
  Ability-position model            81.92   76   .019    .95   .95    .95   111.92

Note. N = 159 for primary school age children; N = 218 for secondary school age adolescents.
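The AIC values in Table 2 are consistent with the convention LISREL reports, AIC = χ2 + 2q, where q is the number of free parameters; under that reading the one-factor model for the children has (199.58 − 127.58)/2 = 36 free parameters. A minimal check, assuming this convention:

```python
def sem_aic(chi_square, n_free_params):
    """Model AIC as reported by LISREL: chi-square plus twice the
    number of free parameters (assumed convention; it reproduces all
    four AIC values in Table 2)."""
    return chi_square + 2 * n_free_params

# Children, one-factor model: chi2 = 127.58 with 36 free parameters
# reproduces the tabled AIC of 199.58. The model with the smaller AIC
# (here the ability-position model, 191.77) is preferred.
```

Note that the ability-position model has fewer free parameters than the one-factor model, since its loadings are fixed rather than estimated, which is why it can win the AIC comparison despite a larger chi-square.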
Figure 1. Illustration of the correlation models representing the relationship between measures of fluid
intelligence and working memory. Figure A shows the correlation between the single-ability factor and
working memory. Figure B shows the correlations of the ability and position components of the
intelligence measure with working memory. VWM = Visuospatial working memory task;
RMS = Running memory span task (**p < .01).
SPM with working memory. First, the one-factor model was extended to a correlation model
by adding a latent variable representing working memory with loadings of the sub-scores of
VWM and RMS. This correlation model (see Figure 1a) showed an acceptable fit, χ2
(89) = 100.57, χ2/df = 1.13, RMSEA = .024, CFI = .95, GFI = .94, NNFI = .94. The single-
ability factor showed a moderate correlation with working memory, r = 0.58, t = 5.29,
p < .001. Second, the ability-position model was also extended to a correlation model (see
Figure 1b) by including sub-scores of working memory. The fit of this correlation model was
also acceptable, χ2 (99) = 109.99, χ2/df = 1.11, RMSEA = .023, CFI = .95, GFI = .94, NNFI = .94.
The ability component showed a relatively strong correlation with working memory, r = 0.71,
t = 3.56, p < 0.01. The magnitude of this correlation was significantly higher than the
correlation between the single-ability factor and working memory, Zdifference = 2.33, p < .05.
In contrast, the position component showed a weak, nonsignificant correlation with working memory, r = .07, t = 0.33, p > .05.
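The reported Zdifference of 2.33 is reproduced exactly by the standard Fisher r-to-z test for the difference between two correlations with n = 218 in each model, treating the two correlations as independent (an assumption, since the paper does not spell out its procedure):

```python
import math

def fisher_z_difference(r1, r2, n1, n2):
    """Fisher r-to-z test for the difference between two correlations,
    treating them as coming from independent samples (assumption; the
    paper does not describe its exact procedure)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z2 - z1) / se

# r = .58 (single-ability factor) vs. r = .71 (ability component),
# n = 218 adolescents in both models:
z = fisher_z_difference(0.58, 0.71, 218, 218)  # approx 2.33
```

A z of 2.33 exceeds the 1.96 cutoff for a two-tailed test at the .05 level, matching the reported p < .05.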
Discussion
The starting point of the current study is the observation of the item-position effect in
intelligence testing. A number of studies have observed the item-position effect in data
of fluid intelligence collected from adults (e.g., Lozano, 2015; Ren et al., 2014; Zeller
et al., 2017). The ability to learn complex rules has further been proposed and demonstrated as the main source of the item-position effect (Carlstedt et al., 2000; Ren et al., 2014; Verguts & De Boeck, 2000). The present study went a step further to examine
whether the item-position effect can be extended to intelligence testing on children,
and whether an age difference can be found regarding the item-position effect. The
results of the study indicated a clear age difference: whereas the item-position effect
was observed in secondary school age adolescents, it was not observed in the primary
school age children. In addition, separating the item-position effect detected in the
adolescent group from the ability part of the intelligence test gave rise to a higher
correlation with measures of working memory than the correlation without consider-
ing the item-position effect.
The main finding was that the younger children and the adolescents differed in their susceptibility to the item-position effect when completing the SPM. Unlike the adolescents' data, data from the primary school children
failed to reveal the item-position effect: in the children's data, the latent variance of the position component in the ability-position model did not reach significance. These results were in accordance with our hypothesis that
children may not be able to infer and detect those rules underlying intelligence items due
to the insufficient development of their learning abilities (Rabi & Minda, 2014; Reetzke
et al., 2016). As demonstrated by Bui and Birney (2014), rule learning contributes to
individual differences in intelligence performance, suggesting that those with high learning
abilities may be able to take advantage of learning opportunities (i.e., successful solutions) when solving intelligence problems. Results from the present study suggest that younger children are inefficient at utilizing, or even unable to utilize, such learning opportunities to improve their intelligence performance.
It should be noted that the absence of the item-position effect in the primary school
children does not mean that intelligence testing is more appropriate for children than for
adults. As stated above, the occurrence of the item-position effect is mainly due to test-
takers’ rule-learning processes (Carlstedt et al., 2000; Ren et al., 2014; Verguts & De Boeck,
2000). Detecting and inferring the rules during testing means that children must process
items efficiently. Young children at this age may not be as efficient as adults in acquiring
complex rules. In addition, children may be less likely to spontaneously learn complex
rules embedded in a task (Bjorklund, Miller, Coyle, & Slawinski, 1997). Therefore, a lack
of efficiency in rule learning may explain the absence of the item-position effect in the primary school children.
The observation of the item-position effect in the adolescents’ group was consistent
with previous findings that reported the item-position effect in other intelligence test data
using different testing materials (Ren et al., 2015). This suggests the generality of the item-position effect in testing 12–13-year-old adolescents. Adolescents of this age group
have already finished their primary education, and have learned to infer rules for complex
problems and use them to solve new problems. Therefore, their learning abilities have
demonstrated a certain degree of improvement. As previously mentioned, complex learn-
ing ability serves as the main source of the item-position effect, which may account for the
observation of this effect in the adolescents’ group. In addition, it was further found that
considering the item-position effect revealed a significantly higher correlation between the
intelligence score and working memory. This indicates that the item-position effect
reflecting part of the method variance (Carlstedt et al., 2000) may influence the valid
representation of human intelligence and its relationship with other constructs.
Finally, we would like to highlight that the major contribution of the present study is the demonstration of significant age differences in the item-position effect. The item-position effect was
not found in intelligence testing of primary school children, although it has been pre-
viously observed in adults or adolescents. The absence of the item-position effect in the
younger school children is likely due to their inefficiency in learning complex rules. Thus,
one important difference between children and older adolescents in intelligence test
performance may lie in the sources of the item-position effect. In addition, the present
study suggests that the item-position effect may vary when exploring the interplay of fluid
intelligence and other constructs.
Disclosure statement
No potential conflict of interest was reported by the authors.
Funding
This work was supported by the Social Science Foundation of China [grant number CBA150153].
References
Ashby, F. G., Alfonso-Reese, L. A., Turken, A. U., & Waldron, E. M. (1998). A neuropsychological
theory of multiple systems in category learning. Psychological Review, 105, 442–481.
Baddeley, A. (2012). Working memory: Theories, models, and controversies. Annual Review of
Psychology, 63, 1–29. doi:10.1146/annurev-psych-120710-100422
Bjorklund, D. F., Miller, P. H., Coyle, T. R., & Slawinski, J. L. (1997). Instructing children to use
memory strategies: Evidence of utilization deficiencies in memory training studies.
Developmental Review, 17, 411–441. doi:10.1006/drev.1997.0440
Brydges, C. R., Reid, C. L., Fox, A. M., & Anderson, M. (2012). A unitary executive function
predicts intelligence in children. Intelligence, 40, 458–469. doi:10.1016/j.intell.2012.05.006
Bui, M., & Birney, D. P. (2014). Learning and individual differences in Gf processes and Raven’s.
Learning and Individual Differences, 32, 104–113. doi:10.1016/j.lindif.2014.03.008
Carlstedt, B., Gustafsson, J. E., & Ullstadius, E. (2000). Item sequencing effects on the measurement
of fluid intelligence. Intelligence, 28, 145–160. doi:10.1016/S0160-2896(00)00034-9
Carpenter, P. A., Just, M. A., & Shell, P. (1990). What one intelligence test measures: A theoretical
account of processing in the Raven progressive matrices test. Psychological Review, 97, 404–431.
Debeer, D., & Janssen, R. (2013). Modeling item-position effects within an IRT framework. Journal
of Educational Measurement, 50, 164–185. doi:10.1111/jedm.2013.50.issue-2
Fry, A. F., & Hale, S. (2000). Relationships among processing speed, working memory, and fluid
intelligence in children. Biological Psychology, 54, 1–34.
Hamilton, J. C., & Shuminsky, T. R. (1990). Self-awareness mediates the relationship between serial
position and item reliability. Journal of Personality and Social Psychology, 59, 1301–1307.
doi:10.1037/0022-3514.59.6.1301
Knowles, E. S. (1988). Item context effects on personality scales: Measuring changes the measure.
Journal of Personality and Social Psychology, 55, 312–320.
Knowles, E. S., & Byers, B. (1996). Reliability shifts in measurement reactivity: Driven by content
engagement or self-engagement? Journal of Personality and Social Psychology, 70, 1080–1090.
doi:10.1037/0022-3514.70.5.1080
Kubinger, K. D. (2008). On the revival of the Rasch model-based LLTM: From constructing tests
using item generating rules to measuring item administration effects. Psychology Science
Quarterly, 50, 311–327.
Kvist, A. V., & Gustafsson, J. E. (2008). The relation of fluid intelligence and the general factor as
a function of cultural background: A test of Cattell’s investment theory. Intelligence, 36, 422−436.
Lozano, J. H. (2015). Are impulsivity and intelligence truly related constructs? Evidence based on
the fixed-links model. Personality and Individual Differences, 85, 192–198. doi:10.1016/j.
paid.2015.04.049
Molnár, G., Greiff, S., & Csapó, B. (2013). Inductive reasoning, domain specific and complex
problem solving: Relations and development. Thinking Skills and Creativity, 9, 35–45.
doi:10.1016/j.tsc.2013.03.002
Muthén, B. (1984). A general structural equation model with dichotomous, ordered categorical, and
continuous latent variable indicators. Psychometrika, 49, 115–132. doi:10.1007/BF02294210
Rabi, R., & Minda, J. P. (2014). Rule-based category learning in children: The role of age and
executive functioning. PLoS One, 9, e85316. doi:10.1371/journal.pone.0085316
Raven, J., Raven, J. C., & Court, J. H. (1998). Manual for Raven’s progressive matrices and vocabulary
scales. Section 3, The standard progressive matrices. Oxford, UK: Oxford Psychologists Press.
Reetzke, R., Maddox, W. T., & Chandrasekaran, B. (2016). The role of age and executive function in
auditory category learning. Journal of Experimental Child Psychology, 142, 48–65. doi:10.1016/j.
jecp.2015.09.018
Ren, X., Schweizer, K., Wang, T., & Xu, F. (2015). The prediction of students’ academic perfor-
mance with fluid intelligence in giving special consideration to the contribution of learning.
Advances in Cognitive Psychology, 11, 97–105. doi:10.5709/acp-0175-z
Ren, X., Wang, T., Altmeyer, M., & Schweizer, K. (2014). A learning-based account of fluid
intelligence from the perspective of the position effect. Learning and Individual Differences, 31,
30–35. doi:10.1016/j.lindif.2014.01.002
Schweizer, K., Schreiner, M., & Gold, A. (2009). The confirmatory investigation of APM items with
loadings as a function of the position and easiness of items: A two-dimensional model of APM.
Psychology Science Quarterly, 51, 47–64.
Schweizer, K., Troche, S. J., & Rammsayer, T. H. (2011). On the special relationship between fluid
and general intelligence: New evidence obtained by considering the position effect. Personality
and Individual Differences, 50, 1249–1254. doi:10.1016/j.paid.2011.02.019
Tourva, A., Spanoudis, G., & Demetriou, A. (2016). Cognitive correlates of developing intelligence:
The contribution of working memory, processing speed and attention. Intelligence, 54, 136–146.
doi:10.1016/j.intell.2015.12.001
Unsworth, N., & Spillers, G. J. (2010). Working memory capacity: Attention, memory, or both?
A direct test of the dual component model. Journal of Memory and Language, 62, 392–406.
doi:10.1016/j.jml.2010.02.001
Van der Sluis, S., De Jong, P. F., & Van der Leij, A. (2007). Executive functioning in children, and
its relations with reasoning, reading, and arithmetic. Intelligence, 35, 427–449. doi:10.1016/j.
intell.2006.09.001
Verguts, T., & De Boeck, P. (2000). A Rasch model for detecting learning while solving an
intelligence test. Applied Psychological Measurement, 24, 151–162. doi:10.1177/
01466210022031589
Zeller, F., Krampen, D., Reiß, S., & Schweizer, K. (2017). Do adaptive representations of the
item-position effect in APM improve model fit? A simulation study. Educational and
Psychological Measurement, 77, 743–765. doi:10.1177/0013164416654946