
Journal of Cognition and Development

ISSN: 1524-8372 (Print) 1532-7647 (Online) Journal homepage: https://www.tandfonline.com/loi/hjcd20

Item-Position Effect in Raven’s Matrices: A Developmental Perspective

Sumin Sun, Karl Schweizer & Xuezhu Ren

To cite this article: Sumin Sun, Karl Schweizer & Xuezhu Ren (2019): Item-Position Effect in
Raven’s Matrices: A Developmental Perspective, Journal of Cognition and Development, DOI:
10.1080/15248372.2019.1581205

To link to this article: https://doi.org/10.1080/15248372.2019.1581205

Published online: 01 Mar 2019.

Sumin Sun (Huazhong University of Science & Technology, China), Karl Schweizer (Goethe University Frankfurt, Germany), and Xuezhu Ren (Huazhong University of Science & Technology, China)

ABSTRACT
This study examined whether there is a developmental difference in
the emergence of an item-position effect in intelligence testing. The
item-position effect describes the dependency of the item’s characteristics on the positions of the items and is explained by learning.
Data on fluid intelligence measured by Raven’s Standard Progressive
Matrices (SPM) and data on working memory tasks were collected
from both primary school age children (7–8 years old) and secondary
school age adolescents (12–13 years old). The item-position effect of
SPM was represented and separated from the ability component by
the fixed-links model. The results indicated a clear age difference:
whereas the item-position effect was observed in the adolescents, it
was not found in the primary school children. In addition, separating
the item-position effect detected in the adolescents from the ability
component led to a larger correlation with working memory than
otherwise. These results suggest that age differences in intelligence
test performance may not only reflect differences in the general
ability but also in the sources of the item-position effect.

Introduction
The concept of fluid intelligence has found its way into almost all major models of
intelligence, such as the Cattell–Horn–Carroll theory of cognitive abilities.
Furthermore, research reveals that there is a close relationship between fluid intelligence
and general intelligence (Kvist & Gustafsson, 2008). Because of this property, fluid
intelligence tests such as Raven’s Matrices are frequently used as one of the main measures
of general intelligence and to estimate the status of cognitive development.
A number of studies have shown that fluid intelligence develops from childhood to
young adulthood (e.g., Fry & Hale, 2000; Molnár, Greiff, & Csapó, 2013). Investigations of
the cognitive underpinnings of intelligence development showed that speed of information processing, executive functions, and working memory are main cognitive correlates of
intelligence test scores (Brydges, Reid, Fox, & Anderson, 2012). However, there is the
concern that intelligence scores reflect not only abilities, but also method effects/variance
that may distort the estimation of intelligence development and its relationships with
cognitive constructs (Carlstedt, Gustafsson, & Ullstadius, 2000; Lozano, 2015). One
particular source is the item-position effect. It refers to the dependency of the response to an item within a sequence of homogeneous items on the position of that item in the sequence (Schweizer, Schreiner, & Gold, 2009).

CONTACT Xuezhu Ren renxz@hust.edu.cn School of Education, Huazhong University of Science & Technology, Luoyu Road 1037, Wuhan 430074, China
Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/hjcd.
© 2019 Taylor & Francis

The item-position effect in intelligence testing


The item-position effect was first found in experimental settings (e.g., Hamilton & Shuminsky,
1990; Knowles, 1988; Knowles & Byers, 1996). Knowles (1988) reported that the reliabilities of
items in a personality scale increased as a function of the items’ serial positions: items presented later in the test were more reliable than earlier ones. This position-related change was also observed in ability tests (e.g., Debeer & Janssen, 2013; Kubinger, 2008). The item-position effect in ability tests has been investigated with methods based on item-
response theory. For example, Kubinger (2008) applied the linear logistic test model to
investigate the item-position effect in tests of problem solving and mathematical reasoning.
More recent investigations have applied confirmatory factor analysis (CFA) models to
represent the item-position effect of intelligence tests. A special CFA model, referred to as a fixed-links model, has been used to decompose the variance of intelligence data into
a position component reflecting systematic variance associated with the item-position effect,
and an ability component representing the ability part of intelligence (e.g., Lozano, 2015;
Schweizer et al., 2009). The ability component has been shown to correlate almost perfectly
with general intelligence, or g (Schweizer, Troche, & Rammsayer, 2011). The position
component showed a rather small correlation with g.
Rule learning has been considered as the main source of the item-position effect in
intelligence testing. Since items of many fluid intelligence tests appear to be constructed by
using only a few rules (Carpenter, Just, & Shell, 1990), test-takers are likely to infer the rules
and improve their abilities as testing continues. Previous work has indicated that individuals
actually infer the rules when completing Raven’s Matrices and become more efficient in
repeated applications (Verguts & De Boeck, 2000). Furthermore, according to an empirical
study on adults by Ren, Wang, Altmeyer, and Schweizer (2014), complex rule learning is
closely related to the position component but not to the ability component of Raven’s
Advanced Progressive Matrices (APM). In addition, a more recent study based on a large
sample of adolescents showed that the position component of fluid intelligence contributed
more to the prediction of math and verbal learning than the ability component (Ren,
Schweizer, Wang, & Xu, 2015). These studies suggest that the position component of
intelligence test data is associated with the process of learning complex rules.

The developmental difference in intelligence and learning


While the item-position effect in intelligence testing has been mostly investigated in
studies on adults and adolescents, there is a shortage of investigations in younger
children. Since developmental research demonstrates that younger children’s fluid
intelligence and learning differ markedly from those of adults, a developmental difference regarding the item-position effect appears possible. The age range from 7 to 12 years is a critical period in which an age-related increase of fluid intelligence
occurs at a rapid rate (Molnár et al., 2013). Furthermore, the development of fluid
intelligence is often compared with the development of other cognitive processes, such
as, for example, speed of information processing, working memory, and executive
functions (Brydges et al., 2012; Fry & Hale, 2000; Tourva, Spanoudis, & Demetriou,
2016). For example, Tourva et al. (2016) reported that age-related changes in working memory capacity lead directly to developmental changes in fluid intelligence. Taken
together, these studies suggest that elementary cognitive processes may contribute to
the development of fluid intelligence.
The developmental difference in complex rule learning also suggests that young children
may not be able to detect and apply the rules characterizing the items, which may play a role
in the emergence of the item-position effect in intelligence testing. The ability to learn complex
rules develops as the result of physiological maturation of the structural networks of the
prefrontal and medial temporal cortex and the anterior cingulate cortex (Ashby, Alfonso-
Reese, Turken, & Waldron, 1998). Given that children do not show the cognitive maturation
of adults, they may find it difficult to detect abstract rules. There is indeed research showing
that the ability to acquire rules increases with age (Rabi & Minda, 2014; Reetzke, Maddox, & Chandrasekaran, 2016). Because of the developmental differences in intelligence performance and learning described above, we hypothesized that there may be no
or only a slight item-position effect when testing children due to their inefficiency in rule
learning, whereas it may be observable in adolescents.
Investigating the item-position effect in children is important for developmental
research concerning intelligence testing, as well as for a better understanding of the
development of fluid intelligence. Research on the item-position effect in adults has
revealed inhomogeneity of the intelligence tests. If the same effect is observed in children,
developmental researchers using intelligence tests to assess fluid intelligence need to be
aware that they not only measure fluid intelligence as ability but also something else. In
addition, there is support for the hypothesis that the main source of the item-position
effect is complex rule learning (Ren et al., 2014; Verguts & De Boeck, 2000). Detecting and
acquiring rules underlying the items requires test takers to process test items efficiently. If
no item-position effect is found in children, differences in intelligence performance
between children and adults may not only be due to the general ability but also to the
source of the item-position effect.

The aim of the present study


The primary aim of this study was to investigate whether there was a developmental
difference regarding the item-position effect in intelligence testing. To achieve this aim,
intelligence data from primary school age children (7–8 years old) and secondary
school age adolescents (12–13 years old) were collected and investigated for the
presence of the item-position effect. We selected Raven’s Standard Progressive
Matrices for the assessment of intelligence, since it is considered the most prominent indicator of fluid intelligence (Carpenter et al., 1990). To further investigate
whether the item-position effect, in case it is observed, affects the validity of the
intelligence data, we collected data on two working memory measures so as to examine
whether the item-position effect influences the relationship between working memory
capacity (WMC) and intelligence. Working memory measures were employed since
a number of studies have shown that WMC is a well-established predictor of fluid
intelligence (Baddeley, 2012).

Method
Participants
Children and adolescents from three primary and secondary schools in a large urban
district in South China participated in this study. The majority of participants were from
a middle-class background. A total of 377 written consent forms were obtained from
participants’ parents. There were 159 primary school age children (81 males) and 218
secondary school age adolescents (107 males). The average age of primary school children
was 7.85 (SD = .31), and that of adolescents was 12.91 (SD = .39).

Measures
Raven’s Standard Progressive Matrices (SPM, Raven, Raven, & Court, 1998)
Due to testing-time restrictions, only half of the SPM items (i.e., all the odd-numbered
items) were used. The shortened test consisted of five sets, each including six items, with items
within a set arranged in order of increasing difficulty. Each item comprises a series of geometrical
elements, one of which is missing. The task of the participant was to choose the most
appropriate option from multiple alternatives. Responses to each item were recorded as
binary data: a correct response was coded as 1 and an incorrect response as 0. Participants
had 8 min to complete all items. This time limit was chosen based on pilot testing so as to
make sure that all participants had sufficient time to attempt all items of the test.

Visuospatial working memory task (VWM)


The visuospatial working memory task was a variant of the symmetry span task
(Unsworth & Spillers, 2010). In the experimental task, a 4 × 4 matrix was presented on
the computer screen. Red squares, serving as the visual stimuli, appeared successively for 750 ms each, in a randomly chosen cell of the matrix. The interval between presentations of squares
was 500 ms. An empty matrix was presented at the end of each trial. Participants were
asked to recall the positions of the red squares by clicking on the cells of the empty matrix.
There were three trials of each set-size with four, five, six, and seven squares. The order of
trials with different set-sizes was random. We used the number of correctly recalled
positions of squares as the dependent variable, as was suggested in previous research
(e.g., Unsworth & Spillers, 2010).

Running memory span task (RMS)


This working memory task was adapted from Van der Sluis, De Jong, and Van der Leij
(2007). In this task, a sequence of five, seven, nine, or 11 digits appeared successively on the computer screen. Participants were asked to constantly recall the last three digits presented to them. To ensure continuous updating,
participants were asked to say aloud the last three digits they had seen, regardless of
where they were in the sequence. This was achieved by asking them to add the latest
digit to the mental list of three digits and drop the oldest one, and then to pronounce
the three digits of the new cluster. Each of the four lengths consisted of three
sequences, resulting in 12 sequences to be recalled. The dependent variable was the
number of correctly recalled sequences.
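The updating demand described above amounts to keeping a sliding window of the three most recent digits. The following is an illustrative sketch only, not the authors' scoring code; the function name and structure are hypothetical:

```python
from collections import deque

def last_three_digits(sequence, k=3):
    """Return the k most recent digits after the whole sequence is presented.

    Models the updating demand of the running memory span task: after each
    new digit, only the k most recent digits must be retained.
    """
    window = deque(maxlen=k)      # the oldest digit drops out automatically
    for digit in sequence:
        window.append(digit)      # add the latest digit to the "mental list"
    return list(window)           # the correct final report

# e.g., last_three_digits([4, 1, 7, 7, 2, 9, 5]) returns [2, 9, 5]
```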

Procedures
The SPM data were collected with a paper-and-pencil test. The working memory measures
were administered via E-Prime. Participants were tested individually in a quiet room. The
measures were administered in the following order: SPM, VWM, and RMS.

Statistical analysis
The representation of the item-position effect required the employment of the fixed-links
model (Zeller, Krampen, Reiß, & Schweizer, 2017). A characteristic of fixed-links models
is that factor loadings are constrained according to theory-based expectations so that the
variances of the manifest variables can be decomposed into subcomponents. The consideration
of the position effect led to a fixed-links model including two independent latent
variables: one representing the position component of the intelligence data and one
representing the ability component. The item scores served as manifest variables of the model. The
factor loadings on the ability component were kept constant. The factor loadings on the
position component were determined by a quadratic function that has proven effective in
modeling the item-position effect in intelligence test data (Zeller et al., 2017).
This model is referred to as the ability-position model. In addition, a congeneric one-factor
model that did not represent the item-position effect was also examined for comparison.
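As a sketch of this loading pattern, fixed quadratic loadings on the position component might be generated as follows. The rescaling of item positions to [0, 1] is an assumption for illustration; the published model fixes the shape of the loadings, and the exact scaling is not reported here:

```python
import numpy as np

def quadratic_position_loadings(n_items):
    """Fixed (not estimated) loadings for the position factor.

    Item positions are rescaled to [0, 1] and squared, so the position
    component contributes nothing to the first item and most to the last
    (assumed scaling for illustration).
    """
    positions = np.arange(n_items, dtype=float)
    return (positions / (n_items - 1)) ** 2

# Ability loadings are constant across items, e.g. all ones;
# 13 corresponds to the SPM items analyzed in the adolescent sample.
ability_loadings = np.ones(13)
position_loadings = quadratic_position_loadings(13)
```

The monotonically increasing loadings are what let the position factor absorb variance that grows with the item's serial position, while the flat ability loadings capture position-independent variance.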
LISREL 8.8 was used for the modeling analyses. Since scores of the SPM items were binary,
tetrachoric correlations were computed from the item scores, and these correlations
served as input to the CFA. The computation of tetrachoric correlations accounts for the
switch from a binary to a continuous scale (Muthén, 1984). Since the ability-position model with
constrained loadings and the one-factor congeneric model were not nested, comparisons
between the models were performed by means of the Akaike information criterion
(AIC); a model with a smaller AIC indicates a better fit.
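For illustration, a tetrachoric correlation between two binary items can be estimated by finding the latent bivariate-normal correlation that reproduces the observed "both correct" cell proportion, and the AIC comparison reduces to χ² plus twice the number of free parameters. This is a minimal sketch under those standard definitions, not the LISREL procedure:

```python
import numpy as np
from scipy import optimize
from scipy.stats import norm, multivariate_normal

def tetrachoric(x, y):
    """Estimate the latent correlation between two dichotomized normal variables."""
    t1 = norm.ppf(1 - x.mean())          # threshold above which x == 1
    t2 = norm.ppf(1 - y.mean())
    p11 = np.mean((x == 1) & (y == 1))   # observed joint "both correct" proportion

    def implied_p11(rho):
        # P(X > t1, Y > t2) under a standard bivariate normal with correlation rho
        mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
        return 1 - norm.cdf(t1) - norm.cdf(t2) + mvn.cdf([t1, t2])

    # Solve for the rho whose implied cell proportion matches the observed one
    return optimize.brentq(lambda r: implied_p11(r) - p11, -0.99, 0.99)

def aic(chi_square, n_free_params):
    """AIC used here to compare the non-nested CFA models (smaller is better)."""
    return chi_square + 2 * n_free_params
```

With the fit values reported later in Table 2, `aic(127.58, 36)` yields the reported 199.58 for the children's one-factor model, which is consistent with 36 free parameters (18 loadings and 18 error variances).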

Results
Table 1 presents the descriptive results for all measures in each age group. Independent-samples t tests were conducted to examine age differences in the raw scores of the
intelligence and working memory measures between the primary school children and
adolescents. The results showed that adolescents performed significantly better than
primary school children with respect to the raw scores of all measures (SPM, t = 18.64,
p < 0.01; VWM, t = 14.12, p < 0.01; RMS, t = 25.05, p < 0.01). Table 1 also presents the
correlations between the variables; all correlations reached significance at the 0.01 level.

Table 1. Descriptive results and correlations between the measures.

Measures   M      SD    Skew    Kurtosis   VWM      RMS
Primary school age children
VWM        28.89  8.77   0.16   −0.31      –
RMS         5.67  2.59   0.24   −0.39      0.35**   –
SPM        19.84  3.14  −0.28    0.05      0.29**   0.38**
Secondary school age adolescents
VWM        42.57  9.54  −0.52    0.36      –
RMS         9.68  1.95  −0.92    0.77      0.31**   –
SPM        25.40  2.42  −0.64    0.30      0.28**   0.28**

Note. N = 159 for primary school age children; N = 218 for secondary school age adolescents; VWM = Visuospatial working memory; RMS = Running memory span; SPM = Raven’s Standard Progressive Matrices. ** p < 0.01.
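The reported t values can be approximately reproduced from the summary statistics in Table 1. A Welch (unequal-variances) test closely matches the reported SPM value; the article does not state which t-test variant was used, so this choice is an assumption:

```python
from scipy import stats

# Summary statistics for SPM taken from Table 1
t, p = stats.ttest_ind_from_stats(
    mean1=25.40, std1=2.42, nobs1=218,   # adolescents
    mean2=19.84, std2=3.14, nobs2=159,   # primary school children
    equal_var=False,                     # Welch's t test (assumed variant)
)
# t comes out near the reported 18.64, and p is far below 0.01
```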

Attempts to represent the item-position effect in both age groups


The item-position effect was first examined in the younger children’s group. The first 12
items of the SPM were so easy that they were completed almost faultlessly by children
of this age group. Therefore, these 12 items were excluded and only items 13–30
were used for the modeling analyses. Judging from previous studies on adults (e.g.,
Schweizer et al., 2011), the remaining number of items (18 items for the primary school
sample and 13 items for the secondary school sample) was appropriate for modeling the
item-position effect. The score (M = 9.25, SD = 2.59) on items 13–30 correlated strongly
(r = .93, p < .001) with the score based on all 30 items.
The upper part of Table 2 presents the fit results for the models applied to the data of
the younger group. Both the one-factor model and the ability-position model indicated an
acceptable fit, and the AIC of the latter model was slightly superior to that of the former.
However, the variance of the latent variable representing the item-position effect did not
reach significance, Z = −0.09, p = .46, whereas the variance of the latent variable representing
ability was significant, Z = 5.44, p < .001. The nonsignificant variance suggested that there was no
item-position effect and invalidated the ability-position model.
In the next step, the item-position effect was examined in the data of the adolescents. The
first 16 items and the 19th item of the SPM were also too easy, being answered correctly
by over 98% of this age group. Therefore, data on the last 14 items of the SPM, excluding
the 19th item, were used for the modeling analyses. The score (M = 8.78, SD = 2.12) on these
items correlated strongly (r = .96, p < .001) with the total score based on all 30 items of the SPM.
The lower part of Table 2 presents the fit results for the models based on the data of
adolescents. Both the one-factor model and the ability-position model showed a good fit.
However, comparisons of the AICs indicated that the ability-position model was superior
to the one-factor model. The variances of the latent variables of this model reached
statistical significance (ability: Z = 3.68, p < .001; position: Z = 3.52, p < .001), suggesting
that both components represented important sources of variance.

Table 2. Fit statistics of the measurement models for each age group of participants.

Type of model            χ²       df    RMSEA   CFI    GFI    NNFI   AIC
Primary school age children
One-factor model         127.58   135   .000    1.00   .92    1.01   199.58
Ability-position model   151.77   151   .006    .95    .90    .95    191.77
Secondary school age adolescents
One-factor model         71.65    65    .022    .95    .95    .94    123.65
Ability-position model   81.92    76    .019    .95    .95    .95    111.92

Note. N = 159 for primary school age children; N = 218 for secondary school age adolescents.

Testing whether the item-position effect affects the relationship with working memory

Figure 1. Illustration of the correlation models representing the relationship between measures of fluid intelligence and working memory. Figure A shows the correlation between the single-ability factor and working memory. Figure B shows the correlations of the ability and position components of the intelligence measure with working memory. VWM = Visuospatial working memory task; RMS = Running memory span task (**p < .01).

Since the item-position effect was observed in adolescents only, data from this age group were further investigated to find out whether the item-position effect affected the relationship of SPM with working memory. First, the one-factor model was extended to a correlation model
by adding a latent variable representing working memory with loadings of the sub-scores of
VWM and RMS. This correlation model (see Figure 1a) showed an acceptable fit, χ2
(89) = 100.57, χ2/df = 1.13, RMSEA = .024, CFI = .95, GFI = .94, NNFI = .94. The single-
ability factor showed a moderate correlation with working memory, r = 0.58, t = 5.29,
p < .001. Second, the ability-position model was also extended to a correlation model (see
Figure 1b) by including sub-scores of working memory. The fit of this correlation model was
also acceptable, χ2 (99) = 109.99, χ2/df = 1.11, RMSEA = .023, CFI = .95, GFI = .94, NNFI = .94.
The ability component showed a relatively strong correlation with working memory, r = 0.71,
t = 3.56, p < 0.01. The magnitude of this correlation was significantly higher than the
correlation between the single-ability factor and working memory, Zdifference = 2.33, p < .05.
In contrast, the position component showed a weak and nonsignificant correlation with
working memory, r = 0.07, t = 0.33, p > .05.
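The test of the difference between the two correlations (r = .71 vs. r = .58) is numerically consistent with a Fisher r-to-z comparison. The sketch below treats the two correlations as coming from independent samples of the same size, which is an assumption; the article does not report the exact method used:

```python
import numpy as np
from scipy import stats

def fisher_z_difference(r1, n1, r2, n2):
    """z test for the difference between two correlations via Fisher's r-to-z."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # r-to-z transform
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of the difference
    z = (z1 - z2) / se
    p = 2 * stats.norm.sf(abs(z))                  # two-sided p value
    return z, p

z, p = fisher_z_difference(0.71, 218, 0.58, 218)
# z comes out near the reported Z_difference of 2.33, with p < .05
```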

Discussion
The starting point of the current study was the observation of the item-position effect in
intelligence testing. A number of studies have observed the item-position effect in fluid
intelligence data collected from adults (e.g., Lozano, 2015; Ren et al., 2014; Zeller
et al., 2017). The ability to learn complex rules has further been proposed and demonstrated
as the main source of the item-position effect (Carlstedt et al., 2000; Ren et al.,
2014; Verguts & De Boeck, 2000). The present study went a step further to examine
whether the item-position effect can be extended to intelligence testing on children,
and whether an age difference can be found regarding the item-position effect. The
results of the study indicated a clear age difference: whereas the item-position effect
was observed in secondary school age adolescents, it was not observed in the primary
school age children. In addition, separating the item-position effect detected in the
adolescent group from the ability part of the intelligence test gave rise to a higher
correlation with measures of working memory than the correlation without considering
the item-position effect.
The main finding was that the younger children and adolescents showed an important
difference with respect to their susceptibilities to the source of the item-position effect in
completing the SPM. Unlike the data on adolescents, the data on the primary school children
failed to reveal the item-position effect: the latent variance of the position component in
the ability-position model did not reach significance. These results were in accordance with our hypothesis that
children may not be able to infer and detect those rules underlying intelligence items due
to the insufficient development of their learning abilities (Rabi & Minda, 2014; Reetzke
et al., 2016). As demonstrated by Bui and Birney (2014), rule learning contributes to
individual differences in intelligence performance, suggesting that those with high learning
abilities may be able to take advantage of learning opportunities (i.e., successful
solutions) when solving intelligence problems. Results from the present study suggest that
younger children are inefficient at, or even unable to, utilizing such learning opportunities
to improve their intelligence performance.
It should be noted that the absence of the item-position effect in the primary school
children does not mean that intelligence testing is more appropriate for children than for
adults. As stated above, the occurrence of the item-position effect is mainly due to test-
takers’ rule-learning processes (Carlstedt et al., 2000; Ren et al., 2014; Verguts & De Boeck,
2000). Detecting and inferring the rules during testing means that children must process
items efficiently. Young children at this age may not be as efficient as adults in acquiring
complex rules. In addition, children may be less likely to spontaneously learn complex
rules embedded in a task (Bjorklund, Miller, Coyle, & Slawinski, 1997). Therefore, the
primary school children’s inefficiency in rule learning may explain the absence of the
item-position effect.
The observation of the item-position effect in the adolescents’ group was consistent
with previous findings that reported the item-position effect in other intelligence test data
using different testing materials (Ren et al., 2015). This suggests the universality of the
item-position effect in testing 12–13-year-old adolescents. Adolescents of this age group
have already finished their primary education, and have learned to infer rules for complex
problems and use them to solve new problems. Their learning abilities have therefore
improved to a certain degree. As previously mentioned, complex learning
ability serves as the main source of the item-position effect, which may account for the
observation of this effect in the adolescents’ group. In addition, it was further found that
considering the item-position effect revealed a significantly higher correlation between the
intelligence score and working memory. This indicates that the item-position effect
reflecting part of the method variance (Carlstedt et al., 2000) may influence the valid
representation of human intelligence and its relationship with other constructs.
Finally, we would like to highlight that the major contribution of the present study is that the
item-position effect demonstrates significant age differences. The item-position effect was
not found in intelligence testing of primary school children, although it has been previously
observed in adults and adolescents. The absence of the item-position effect in the
younger school children is likely due to their inefficiency in learning complex rules. Thus,
one important difference between children and older adolescents in intelligence test
performance may lie in the sources of the item-position effect. In addition, the present
study suggests that the item-position effect may vary when exploring the interplay of fluid
intelligence and other constructs.

Disclosure statement
No potential conflict of interest was reported by the authors.

Funding
This work was supported by the Social Science Foundation of China [grant Number: CBA150153].

References
Ashby, F. G., Alfonso-Reese, L. A., Turken, A. U., & Waldron, E. M. (1998). A neuropsychological
theory of multiple systems in category learning. Psychological Review, 105, 442–481.
Baddeley, A. (2012). Working memory: Theories, models, and controversies. Annual Review of
Psychology, 63, 1–29. doi:10.1146/annurev-psych-120710-100422
Bjorklund, D. F., Miller, P. H., Coyle, T. R., & Slawinski, J. L. (1997). Instructing children to use
memory strategies: Evidence of utilization deficiencies in memory training studies.
Developmental Review, 17, 411–441. doi:10.1006/drev.1997.0440
Brydges, C. R., Reid, C. L., Fox, A. M., & Anderson, M. (2012). A unitary executive function
predicts intelligence in children. Intelligence, 40, 458–469. doi:10.1016/j.intell.2012.05.006
Bui, M., & Birney, D. P. (2014). Learning and individual differences in Gf processes and Raven’s.
Learning and Individual Differences, 32, 104–113. doi:10.1016/j.lindif.2014.03.008
Carlstedt, B., Gustafsson, J. E., & Ullstadius, E. (2000). Item sequencing effects on the measurement
of fluid intelligence. Intelligence, 28, 145–160. doi:10.1016/S0160-2896(00)00034-9
Carpenter, P. A., Just, M. A., & Shell, P. (1990). What one intelligence test measures: A theoretical
account of processing in the Raven progressive matrices test. Psychological Review, 97, 404–431.
Debeer, D., & Janssen, R. (2013). Modeling item-position effects within an IRT framework. Journal
of Educational Measurement, 50, 164–185. doi:10.1111/jedm.2013.50.issue-2
Fry, A. F., & Hale, S. (2000). Relationships among processing speed, working memory, and fluid
intelligence in children. Biological Psychology, 54, 1–34.
Hamilton, J. C., & Shuminsky, T. R. (1990). Self-awareness mediates the relationship between serial
position and item reliability. Journal of Personality and Social Psychology, 59, 1301–1307.
doi:10.1037/0022-3514.59.6.1301
Knowles, E. S. (1988). Item context effects on personality scales: Measuring changes the measure.
Journal of Personality and Social Psychology, 55, 312–320.
Knowles, E. S., & Byers, B. (1996). Reliability shifts in measurement reactivity: Driven by content
engagement or self-engagement? Journal of Personality and Social Psychology, 70, 1080–1090.
doi:10.1037/0022-3514.70.5.1080
Kubinger, K. D. (2008). On the revival of the Rasch model-based LLTM: From constructing tests
using item generating rules to measuring item administration effects. Psychology Science
Quarterly, 50, 311–327.
Kvist, A. V., & Gustafsson, J. E. (2008). The relation of fluid intelligence and the general factor as
a function of cultural background: A test of Cattell’s investment theory. Intelligence, 36, 422−436.
Lozano, J. H. (2015). Are impulsivity and intelligence truly related constructs? Evidence based on
the fixed-links model. Personality and Individual Differences, 85, 192–198. doi:10.1016/j.
paid.2015.04.049
Molnár, G., Greiff, S., & Csapó, B. (2013). Inductive reasoning, domain specific and complex
problem solving: Relations and development. Thinking Skills and Creativity, 9, 35–45.
doi:10.1016/j.tsc.2013.03.002
Muthén, B. (1984). A general structural equation model with dichotomous, ordered categorical, and
continuous latent variable indicators. Psychometrika, 49, 115–132. doi:10.1007/BF02294210
Rabi, R., & Minda, J. P. (2014). Rule-based category learning in children: The role of age and
executive functioning. PLoS One, 9, e85316. doi:10.1371/journal.pone.0085316
Raven, J., Raven, J. C., & Court, J. H. (1998). Manual for Raven’s progressive matrices and vocabulary
scales. Section 3, The standard progressive matrices. Oxford, UK: Oxford Psychologists Press.
Reetzke, R., Maddox, W. T., & Chandrasekaran, B. (2016). The role of age and executive function in
auditory category learning. Journal of Experimental Child Psychology, 142, 48–65. doi:10.1016/j.
jecp.2015.09.018
Ren, X., Schweizer, K., Wang, T., & Xu, F. (2015). The prediction of students’ academic perfor-
mance with fluid intelligence in giving special consideration to the contribution of learning.
Advances in Cognitive Psychology, 11, 97–105. doi:10.5709/acp-0175-z
Ren, X., Wang, T., Altmeyer, M., & Schweizer, K. (2014). A learning-based account of fluid
intelligence from the perspective of the position effect. Learning and Individual Differences, 31,
30–35. doi:10.1016/j.lindif.2014.01.002
Schweizer, K., Schreiner, M., & Gold, A. (2009). The confirmatory investigation of APM items with
loadings as a function of the position and easiness of items: A two-dimensional model of APM.
Psychology Science Quarterly, 51, 47–64.
Schweizer, K., Troche, S. J., & Rammsayer, T. H. (2011). On the special relationship between fluid
and general intelligence: New evidence obtained by considering the position effect. Personality
and Individual Differences, 50, 1249–1254. doi:10.1016/j.paid.2011.02.019
Tourva, A., Spanoudis, G., & Demetriou, A. (2016). Cognitive correlates of developing intelligence:
The contribution of working memory, processing speed and attention. Intelligence, 54, 136–146.
doi:10.1016/j.intell.2015.12.001
Unsworth, N., & Spillers, G. J. (2010). Working memory capacity: Attention, memory, or both?
A direct test of the dual component model. Journal of Memory and Language, 62, 392–406.
doi:10.1016/j.jml.2010.02.001
Van der Sluis, S., De Jong, P. F., & Van der Leij, A. (2007). Executive functioning in children, and
its relations with reasoning, reading, and arithmetic. Intelligence, 35, 427–449. doi:10.1016/j.
intell.2006.09.001
Verguts, T., & De Boeck, P. (2000). A Rasch model for detecting learning while solving an
intelligence test. Applied Psychological Measurement, 24, 151–162. doi:10.1177/
01466210022031589
Zeller, F., Krampen, D., Reiß, S., & Schweizer, K. (2017). Do adaptive representations of the
item-position effect in APM improve model fit? A simulation study. Educational and
Psychological Measurement, 77, 743–765. doi:10.1177/0013164416654946
