Teodoro, A. (Ed.). Critical Perspectives on PISA as a Means of Global Governance: Risks, Limitations, and Humanistic Alternatives. Routledge.
Despite a vast critical literature (see the review by Zhao, 2020),
PISA has become one of the most influential forces in global education.
Its sway far exceeds that of other international large-scale assessments
(ILSAs) that are chronologically older and significantly closer to
students’ learning in the different countries, such as those undertaken
by the International Association for the Evaluation of Educational
Achievement (IEA).
The fact that it stems from an international organization that has become
the main think tank in different areas at the world level (Martens &
Jakobi, 2010), playing an influential role as soft power within the
framework of the global education reform movement (GERM), is enabling
PISA to take on the features of a “big science” project, in the sense the
physicists Alvin M. Weinberg and Derek de Solla Price gave the phrase in
the 1960s: that of a project whose concern is to create large-dimension
complexes,
4 António Teodoro
Notes
1 See Romuald Normand, Chapter 3, and Appendix.
2 See Leimgruber & Schmelzer, 2017.
3 See Teodoro, 2003.
4 See Teodoro, 2020.
5 Schleicher describes how PISA emerged at the OECD thus:
I remember my first meeting of senior education officials at the OECD in
1995. There were representatives from 28 countries seated around a table
in Paris. Some of them were boasting that they had the world’s best school
system – perhaps because it was the one they knew best. When I proposed
a global test that would allow countries to compare the achievements of
their school systems with those of other countries, most said this couldn’t
be done, shouldn’t be done, or wasn’t the business of international
organisations. I had 30 seconds to decide whether to cut our losses or give
it one more try. In the end, I handed my boss, Thomas J. Alexander, then
director of the OECD Education, Employment, Labour and Social Affairs
Directorate, a yellow post-it note saying: “Acknowledge that we haven’t
yet achieved complete consensus on this project, but ask countries if we
can try a pilot”. The idea of PISA was born – and Tom became its most
enthusiastic promoter.
(Schleicher, 2018, pp. 17–18)
6 See the development of this idea in Chapter 8.
7 From 2019 to 2022, the editor of this book is conducting a project
titled “A Success Story? Portugal and the PISA (2000–2015)”, funded
by the Portuguese Foundation for Science and Technology (FCT) (Ref.
PTDC/CED-EDG/30084/2017). Some authors of this book participate as
researchers in this project (see Contributors). Starting from the
Portuguese “success case”, the researchers question the assumptions of
this exercise in international comparison, at once so powerful and so
fragile. See the website: http://pisa.ceied.ulusofona.pt/en/.
8 See the full list of authors in About the Editor and Contributors.
References
Bottani, N., & Walberg, H. J. (1992). À quoi servent les indicateurs internationaux
de l’enseignement? In CERI (Ed.), L’OCDE et les indicateurs internationaux
de l’enseignement: Un cadre d’analyse (pp. 7–13). OCDE.
Centre for Educational Research and Innovation (CERI). (1996). Regards sur
l’Éducation: Les indicateurs de l’OCDE. OCDE.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation
and education policy. Pergamon, Elsevier.
Leimgruber, M., & Schmelzer, M. (Eds.). (2017). The OECD and the international
political economy since 1948. Springer Berlin Heidelberg.
Magalhães, A. (2017). A OCDE, o AHELO e a governação do ensino superior
[The OECD, the AHELO, and the Higher Education Governance]. II Congresso
Internacional Os Desafios da Qualidade em Instituições de Ensino, Escola
Superior de Enfermagem, Coimbra, 17–20 October 2017.
Martens, K., & Jakobi, A. P. (2010). Introduction. The OECD as an actor in
international politics. In K. Martens & A. P. Jakobi (Eds.), Mechanisms of
OECD governance. International incentives for national policy-making?
Oxford University Press.
OECD. (2005). The definition and selection of key competencies [Executive
summary]. www.oecd.org/pisa/35070367.pdf
Price, D. S. (1963). Little science, big science . . . and beyond. Columbia University
Press.
Schleicher, A. (2018). World class: How to build a 21st-century school system.
OECD Publishing. doi:10.1787/9789264300002-en
Teodoro, A. (2003). Educational policies and new ways of governance in a
transnationalization period. In C. A. Torres & A. Antikainen (Eds.), The
international handbook on the sociology of education: An international
assessment of new research and theory (pp. 183–210). Rowman & Littlefield.
Teodoro, A. (2020). Contesting the global development of sustainable and
inclusive education. Education reform and the challenges of neoliberal
globalization. Routledge.
Weinberg, A. (1961). Impact of large-scale science on the United States. Science,
134, 161–164.
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266. https://doi.org/10.1007/
s10833-019-09367-x
DOI: 10.4324/9781003255215-2
12 Camilla Addey
changed the way people across the world talk to each other while
also strengthening a layer of influential intermediate authority oper-
ating in between the market and the state. The strategic indetermi-
nacy of these standards, offering fluid goals to a global audience, is
politically shrewd, demonstrating the power of disposition or pure
activity divorced from content. Quality is a practice that is doing
something as it habituates, and saying something as it avoids contro-
versial political stance.
(2014, p. 202)
As ISO 9000 and PISA strengthen a layer that sits between the market
and the state, they both habituate without ever engaging in a public
political dialogue. ISO 9000 and PISA transform the world, cloaked in
their technical procedures.
Secretive nature and power distribution. Both ISO 9000 and PISA
are developed behind closed doors, which renders invisible the different
positions that are heatedly debated in the making of the infrastructure.
ISO members (known as subscribers and corresponding members) have
differing levels of power and send large numbers of experts to participate
in the meetings at which the standards are developed. Beyond the members
of ISO, anyone can decide to buy the expensive draft version of ISO
standards (when standards are in the process of development) and share
their comments with the ISO secretariat. So although the doors
Invisible struggles, encoded fantasies and ritualized incantations 17
year 2000, it was mainly European nations that subscribed to ISO 9000.
Uptake then increased hugely in China and beyond Europe. To reach lower-
and middle-income contexts, ISO set up DEVCO. Starting in 2013, the
OECD created PISA for Development to cater for lower- and middle-income
nations, whilst at the same time stating that the PISA metric could
become the universal metric for learning outcomes (OECD, 2013).
Lack of content and the halo effect. The ISO 9000 standards are used to
measure whether an organization has met its self-established objectives,
based on the evaluation of a quality specialist who is hired to carry out
the process. The ISO 9000 standards are not a set of checks that are
the same for all: each client establishes how they will use the standards.
Whatever their evaluation, the client can expose the ISO 9000 certificate.
Easterling argues that ISO 9000’s popularity may be related to this very
lack of content: “over 170 countries have been ISO 9000 certified, yet no
one can say what ISO 9000 actually is” (2014, p. 176). Easterling claims
that the drive to habituate without specific content makes ISO 9000
powerful: “While lacking any specific content or binding requirement,
ISO is a perfect conduit of undeclared activities and intentions with
potentially dangerous consequences” (p. 19). Easterling claims that ISO
9000 standards have spread to every endeavour and are perceived as
“the answer to any problem in the field” (2014, p. 198). This resonates
with PISA: although PISA is a set of definite test items and skills, PISA
has acquired the aura of international standard in education that no
one can actually define (for a discussion on the use of “international
standards” in education, see Steiner-Khamsi & Stolpe, 2006). Similarly,
PISA is described as polyglot and polyvalent: it speaks the language of
whoever adopts it, and is given contextualized meanings (Addey & Sellar,
2018; Steiner-Khamsi, 2017). Like ISO 9000, PISA has also become the
panacea of all educational problems. This relates closely to the halo effect
of PISA, which has come to distribute prestige to anyone who claims to
draw on PISA data or to be associated with its instruments.
Double uses. Easterling argues that ISO 9000 exerts soft power through
compliance between members of the ISO 9000 club. She describes ISO
9000 as inoculating “organizations against regulation while developing
more expensive and opaque bureaucracies” (2014, p. 173). PISA has
also come to exert soft power on educational systems, as described above
(e.g., Grek, 2009; Martens, 2007). Easterling states that in the same way
as ISO 9000 promotes a change of behaviour towards greater quality, it
can also inhibit such change and act as “another proxy in disguise”.
Kimerling argues that certifications allow organizations or nations to
hide their practices whilst showing off “enigmatic standards” that are
beyond the legislation of the state. Kimerling’s work showed how oil
companies in the Ecuadorian Amazon used “the cloak of international
Spatial software can mix remote abstract values together with the
values of a complex local context, without requiring that all parties
conform to a single universal principle. After all, institutions that
do so often develop elaborate rituals to demonstrate that they
are adhering to such principles when in fact they are departing or
decoupling from them.
(2014, p. 203)
The work of Star also describes how infrastructure means different things
locally (1999, p. 382). The diversity of rationales for participation and
the cherry-picking of PISA data to further agendas show how PISA, with
its tests and rigid protocols to ensure standardized implementation, also
takes on different meanings locally.
Easterling’s work helps analyse the ways in which infrastructure
transforms our worlds in mundane ways that are hard to see. On the one
side, the declared intent of the infrastructure is hidden by the message;
on the other, the struggles that occur in the making of the infrastructure
remain invisible to the user of the infrastructure. As Busch has stated,
it is this hidden, seemingly anonymous nature that gives infrastructures
power. Easterling adds to this by suggesting we consider how
infrastructures encode fantasies and ritualize incantations which hold a
strong grip in a world that is “more receptive to influential fictions and
beliefs” (Easterling, 2014, p. 169). Although rationalized infrastructures
carry universal stories and “moments of planetary integration” (p. 162),
they hold within them world aspirations and dreams and “harbour some
of the most elaborate irrationalities” (p. 168). Easterling states that in
order to cloak infrastructure in the rationality of science, infrastructure-
making-processes convene many internationally distinguished actors who
offer their expertise – a process that is also central to PISA.
Andreas Schleicher, the OECD Director for Education and Skills, also
known as the “father” of PISA, described PISA as his dream to create
a universal language
for educators around the world and as a way to demonstrate what can
be achieved in education (Addey, 2016). Starting from this fantasy and
aspiration, and going on to assemble diverse fantasies and aspirations
across the many participating countries, PISA now carries multiple
aspirations (how we can best educate new generations for today’s society
and global market) and shared dreams (what top-performing countries
have achieved, what is possible for all, the idea that high skills lead to
greater economic competitiveness). Although these aspirations are based
on studies that demonstrate the scientific nature of such claims (i.e., what
Conclusions
Drawing on Easterling’s work, this chapter has explored how undeclared
politics are inscribed into PISA, politics that often diverge from the
declared intent of the infrastructure. These politics can be uncovered
by looking at what the infrastructure is “doing”: how it determines the
way objects and content are organized and circulate; by seeking out the
stories that are ossified in the infrastructure; and by examining how
domestic and transnational forms of sovereignty act in partnership,
Notes
1 Most approaches would seek to theorize how PISA shapes education
through the data and analysis it produces.
2 “Mundane to the point of boredom” (p. 377), states Star (1999), who
dedicates her foundational infrastructure studies paper to the members
of the Society of People Interested in Boring Things.
3 Dreams of universal rationality may sponsor their own special forms of irra-
tionality. Well-rehearsed theories, like those related to Capital and neoliberal-
ism continue to send us to the same places to search for dangers while other
concentrations of authoritarian power escape scrutiny.
(p. 22)
4 Those acquainted with the history of ILSAs will be reminded of how CERI
(the cradle of PISA, as described by Tröhler, 2013) and PISA came to be
developed under the pressure of the USA and France (Martens, 2007).
5 For further information on the privatization of educational assessment, see the
Facebook page of the ILSAINC research project: “ILSAINC. The ILSA industry”.
6 In line with STS, this understanding highlights how infrastructure “generates
de facto forms of polity faster than even quasi-official forms of governance
can legislate them” (p. 15).
7 Vietnam performed highly in PISA 2015 by excluding many students from
the sample.
8 For example, New Literacy Studies would claim that skills cannot be meas-
ured in such an “autonomous” (Street, 1984, 2013) and decontextualized
manner, nor can any result be interpreted universally without understanding
how such skills are deeply embedded in social, cultural, and institutional con-
text of practice (Barton et al., 2000; Hamilton, 2001).
References
Addey, C. (2015). International literacy assessments in Lao PDR and Mongolia:
A global ritual of belonging. In M. Hamilton, B. Maddox, & C. Addey
(Eds.), Literacy as numbers: Researching the politics and practices of
international literacy assessment (pp. 147–164). Cambridge University Press.
Addey, C. (2016). Camilla Addey interviews Andreas Schleicher on
PISA and the OECD. Laboratory of International Assessment Studies.
Retrieved April 10, 2021, from http://internationalassessments.org/
camilla-addeyinterviews-andreas-schleicher-on-pisa-and-pisa-for-development
Addey, C. (2019). The appeal of PISA for development in Ecuador and Paraguay:
Theorising and applying the global ritual of belonging. Compare: A Journal of
Comparative and International Education, 50(8), 1159–1174. https://doi.org/
10.1080/03057925.2019.1623653
Addey, C., & Gorur, R. (2020). Translating PISA, translating the world. Com-
parative Education, 56(4), 547–564. https://doi.org/10.1080/03050068.2020.
1771873
Addey, C., Maddox, B., & Zumbo, B. (2020). Assembled validity: Rethinking
Kane’s argument-based approach in the context of international large-scale
assessments (ILSAs). Assessment in Education: Principles, Policy & Practice,
27(6), 588–606. https://doi.org/10.1080/0969594X.2020.1843136
Addey, C., & Sellar, S. (2018). Why do countries participate in PISA?
Understanding the role of international large-scale assessments in global
education policy. In A. Verger, M. Novelli, & H. K. Altinyelken (Eds.),
Global education policy and international development (pp. 97–117).
Bloomsbury.
Addey, C., & Sellar, S. (2019). Is it worth it? Rationales for (Non)participation
in international large-scale learning assessments. Education Research and
Foresight Working Papers Series, N.° 24. UNESCO. Retrieved April 11, 2021,
from https://en.unesco.org/node/268820
Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B. & Verger, A. (2017). The rise of
international large-scale assessments and rationales for participation. Compare:
A Journal of Comparative and International Education, 47(3), 1–20. https://
doi.org/10.1080/03057925.2017.1301399
Addey, C., & Verger, A. (2019). ‘An attempt at the first ‘Davos of education’:
Dissonances in the OECD-forum for world education on the future of edu-
cation. Invited blog Education International, Unite for Quality Education
Campaign. Retrieved April 10, 2021, from www.unite4education.org/global-
response/an-attempt-at-the-first-davos-of-education-dissonances-in-the-oecd-
forum-for-world-education-on-the-future-of-education/
Barton, D., Hamilton, M., & Ivanic, R. (Eds.) (2000). Situated literacies. Reading
and writing in context. Routledge.
Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative
effects of international benchmarking in school system performance. OECD
education working papers, N.° 71. OECD Publishing.
Busch, L. (2011). Standards: Recipes for reality. MIT Press.
Easterling, K. (2014). Extrastatecraft: The power of infrastructure space.
Verso Books.
Gorur, R., & Addey, C. (2021). Capacity building as the ‘third translation’: The
story of PISA-D in Cambodia. In S. Grek, C. Maroy, & A. Verger (Eds.), World
yearbook of education 2021: Accountability and datafication in the governance
of education (pp. 94–109). Routledge.
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37.
Hamilton, M. (2001). Privileged literacies: Policy, institutional process and
the life of the IALS. Language and Education, 15(2–3), 178–196. doi:
10.1080/09500780108666809
Introduction
This chapter is a bibliometric analysis of the PISA international study
carried out every three years by the OECD. Bibliometric analysis is a
technique for presenting a macroscopic view of an extensive body of
academic literature. The number of different journals publishing on a
specific topic and the subject categories allocated to publications can
show the variety of research topics and the multidisciplinary character
of a research domain (van Nunen et al., 2018).
In bibliometric analysis, the content of articles is examined to summarize
what is known about a certain subject, mapping the scientific field by
analysing its literature to discover patterns, trends and relationships
(Mascarenhas et al., 2018). Bibliometric methods are very useful for
summarizing the academic research of a scientific field, identifying the
major trends in terms of publications, citations, authors, keywords and
institutions (Martínez-López, 2018).
The VOSviewer software (van Eck & Waltman, 2010) was used. This
tool is helpful for quickly viewing connections in large networks
(up to ten thousand items) and may particularly interest those seeking a
replacement for the discontinued citation map tool in Web of Science
(Wong, 2018). The software makes use of the VOS technique, an acronym
for Visualisation of Similarities. For a more detailed reflection on the
advantages of the VOS technique compared with multidimensional scaling,
refer to van Eck et al. (2010).
VOS aims to provide a low-dimensional visualization in which objects
are located so that the distance between any pair of objects reflects their
similarity as accurately as possible (van Eck & Waltman, 2007). In this
technique,
does not mean that they are not valid, or that PISA has improved. It
simply means that the criticism has been largely ignored (Zhao, 2020).
Table 2.1 lists analogous works that reveal the relevance of this theme.
The present study stands out for its more comprehensive analysis, adding
topics such as the volume of funding per country, the analysis of
scientific production by journal, keyword analysis and cluster analysis.
It also provides more up-to-date information than the study by
Domínguez et al. (2012), which is the closest and most complete.
This chapter comprises four sections, starting with the introduction
and followed by methodology, analysis and discussion of results, and
conclusion.
Methodology
Data were collected on 15 January 2021 from the Web of Science (WoS)
website. This database was chosen for its extensive coverage of docu-
ments. A comparison of WoS and Scopus shows that WoS has stronger
coverage, dating back to 1990, and that most of its journals are written
in English (Chadegani et al., 2013).
28 Carlos Décio Cordeiro and Vítor Duarte Teodoro
A search for the term “PISA OECD” was performed to avoid results about
the “University of Pisa” and projects about the city of Pisa. The search
returned 596 documents. Among these, there were 478 papers and 522
documents written in English. After the combined filtering of these two
characteristics, 411 results were obtained. Further detailed filtering
then checked that all articles fit the scope: three articles that
referred both to the OECD and to the “University of Pisa”, all from
the WoS category “Nuclear Engineering and Design” and clearly out of
line with the purpose of this chapter, were removed. A total of 408
results was obtained.
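The filtering sequence just described (search hits, then the combined article-type and language filter, then topical screening) can be sketched as a simple pipeline. The records and field names below are invented for illustration; they are not the actual Web of Science export schema.

```python
# Illustrative sketch of the chapter's filtering steps on hypothetical
# WoS-style records; field names are assumptions for this example.
records = [
    {"title": "PISA and education policy", "type": "Article",
     "language": "English", "category": "Education & Educational Research"},
    {"title": "Reactor modelling at the University of Pisa", "type": "Article",
     "language": "English", "category": "Nuclear Engineering and Design"},
    {"title": "PISA und Bildungspolitik", "type": "Article",
     "language": "German", "category": "Education & Educational Research"},
    {"title": "Editorial on PISA rankings", "type": "Editorial",
     "language": "English", "category": "Education & Educational Research"},
]

# Combined filter: keep only papers ("Article") written in English.
articles_en = [r for r in records
               if r["type"] == "Article" and r["language"] == "English"]

# Screening step: drop off-topic items from the nuclear-engineering category.
final = [r for r in articles_en
         if r["category"] != "Nuclear Engineering and Design"]

print(len(final))  # one on-topic English-language article remains
```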
Once the database was determined, some problems of data uniformity
were found. The publication rules of the various journals differ, which
causes the name of the same author to appear in different forms in
different documents. It was necessary to standardize the 722 authors
present in the database to ensure correct link strengths. It was also
necessary to standardize the 21,097 references cited by the authors
throughout the 408 articles under study. An additional problem appeared
when standardizing the cited references, as they give only the name of
the first author. For this reason, it was not possible to perform a
cluster analysis for the authors cited in the references, as it would
be biased.
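The kind of name standardization involved can be sketched as below. The heuristic is an assumption about the general approach (collapsing surname/initial variants to one key), not the authors’ actual procedure.

```python
def normalize_author(name: str) -> str:
    """Collapse common variants ("Lingard, Bob", "Lingard, B.",
    "B. Lingard") into a single "Surname, I." key. Heuristic sketch only."""
    name = name.strip().rstrip(".")
    if "," in name:
        surname, given = [p.strip() for p in name.split(",", 1)]
    else:
        # No comma: assume the last token is the surname.
        parts = name.split()
        surname, given = parts[-1], " ".join(parts[:-1])
    initial = given[:1].upper() if given else ""
    return f"{surname.title()}, {initial}."

variants = ["Lingard, Bob", "Lingard, B.", "B. Lingard", "LINGARD, B"]
keys = {normalize_author(v) for v in variants}
print(keys)  # all four variants collapse to a single key
```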
Year
The first result in our search dates back to 1999, followed by a
one-year gap until 2001, when a new article related to PISA was
published; this is understandable, since it was in that year that the
first PISA results were released. The increase in publications remained
slight until 2011, when the number of publications exceeded ten, as
shown in Table 2.2. From that date onwards, this number took off.
As seen in Figure 2.1, the growth curve is close to exponential.
Given that the cumulative publication curve presents a smoother
trend, a simple linear regression in the form of a time series was
performed with the Ordinary Least Squares (OLS) method. To avoid
heteroscedasticity problems, the cumulative publication variable was
transformed with the natural logarithm. An R² = 0.978 (p-value = 0.00)
was obtained, passing the tests for heteroscedasticity (Breusch–Pagan,
p-value = 0.17), autocorrelation of residuals (Durbin–Watson,
p-value = 0.35) and normality of residuals (p-value = 0.19).
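The regression just described can be reproduced in outline as follows. The yearly counts are invented placeholders (the chapter’s Table 2.2 is not reproduced here), so the fitted values will not match the reported R² = 0.978, and the Breusch–Pagan and Durbin–Watson diagnostics are omitted for brevity.

```python
import math

# Hypothetical yearly publication counts, 1999-2020 (placeholders only).
counts = [1, 0, 1, 2, 2, 3, 4, 5, 6, 7, 8, 9, 10,
          14, 18, 24, 30, 38, 48, 60, 75, 63]

# Cumulative publications, then the natural-log transform the chapter
# uses to avoid heteroscedasticity before fitting OLS.
cumulative = []
total = 0
for c in counts:
    total += c
    cumulative.append(total)
y = [math.log(v) for v in cumulative]
x = list(range(len(y)))  # years indexed from 1999

# Ordinary Least Squares for y = a + b*x (closed-form estimates).
n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
     / sum((xi - mean_x) ** 2 for xi in x))
a = mean_y - b * mean_x

# Coefficient of determination R^2.
ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - mean_y) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(round(b, 3), round(r2, 3))
```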
How PISA is present in the scientific production 29
Table 2.3 Origin of funding and volume of publications by country

Funding (origin)             n     Publications (country)      n
Spain                       39     USA                         65
Australia                   13     Australia                   56
United Kingdom              13     United Kingdom              51
China                       12     Spain                       42
Global                       9     Germany                     41
United States of America     8     Italy                       37
European Union               6     China                       21
Germany                      5     Turkey                      20
Finland                      5     Finland                     16
Others                      31     Others                     173
Countries
Table 2.3 shows the volume of publications by country, according to the
first author’s correspondence address. Among the five countries with the
highest frequency (for this study, the United Kingdom was treated as one
country), three are native English-speaking countries (the United States
of America, Australia and the United Kingdom), which reflects one
criterion of our search. The remaining ranks are filled by Spain and
Germany.
Table 2.3 also shows the frequency and origin of funding per country.
It should be noted that an article may receive more than one grant.
Two results stand out due to their disparity from the others: funding
by Spanish institutions, with a funding/scientific output ratio (39/42)
of 93%, mostly from public funds, and China, with a funding/scientific
output ratio (12/21) of 57%. The remaining countries have an average
ratio of 26% (91/345).
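The quoted percentages follow directly from the table figures:

```python
# Funding grants vs. publications, using the figures discussed in the text.
funding_vs_output = {
    "Spain": (39, 42),
    "China": (12, 21),
    "Remaining countries": (91, 345),
}
for name, (grants, papers) in funding_vs_output.items():
    print(f"{name}: {grants}/{papers} = {round(100 * grants / papers)}%")
# Spain 93%, China 57%, remaining countries 26%, as reported.
```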
Keywords
Considering that some articles do not present keywords, a simple analysis
of author keywords could bias our study. We therefore used keywords plus,
the keywords created by the Web of Science to catalogue its articles by
terms. Following this criterion, the 50 most used keywords plus were
selected. The three most used appear in over 50 articles each:
“education” (54), “PISA” (54) and “achievement” (52).
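The selection of the most frequent keywords plus amounts to pooling the keyword lists of all records and ranking by frequency. The toy data below are invented for illustration; the real study draws on 408 WoS records and keeps the top 50 terms.

```python
from collections import Counter

# Hypothetical keywords-plus lists for a handful of articles.
articles_keywords = [
    ["education", "PISA", "achievement"],
    ["education", "PISA", "governance"],
    ["achievement", "performance", "education"],
    ["PISA", "performance"],
]

# Pool all keywords across articles and keep the most frequent terms
# (the chapter keeps the 50 most used keywords plus).
counts = Counter(kw for kws in articles_keywords for kw in kws)
top = counts.most_common(3)
print(top)
```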
Figure 2.2 shows a map of bibliographic networks for keywords plus.
According to this criterion, three clusters represent three major themes. The
diameter of the circles represents the number of occurrences of each term.
Cluster A, whose most used words are “achievement”, “schools”,
“impact”, “quality” and “inequality”, represents the thematic axis of
authors concerned with inequalities and the quality of schools and
teachers, in analyses carried out at the local level. Cluster B, whose
most used words are “performance”, “students”, “mathematics”, “school”
and “science”, represents the thematic axis of authors who work at the
micro level, analysing the student and his or her performance. It is
worth noting the difference between the use of the words “school” and
“schools” in different clusters, which arises precisely from the
perspective from which the authors approach the topic. Finally,
Cluster C, whose most used words are “education”, “PISA”, “policy”,
“politics” and “governance”, focuses on a more macro level, revolving
around the implications of the PISA project at the national level, as
well as the political decisions implemented in response to the test
results.
Journals
This section details the journals that obtained the most citations
(Table 2.4) and that published the most articles related to our research
(Table 2.5). Two metrics were used to assess their quality: quartile and
H-index.
Table 2.4 Ranking of 10 journals with the highest number of citations
Table 2.5 Ranking of 10 journals with the most articles related to the theme
References
This section highlights the publications cited in the 408 articles
selected for this study, according to the criteria defined in the
methodology. A total of 21,097 cited references were obtained, and the
ten most frequently cited articles are detailed as follows:

Of these, three were published by the OECD. The remaining articles are
all related to the topic of education policies. Sellar and Lingard
appear in two of these papers. The share of the 408 analysed articles
citing each of these ten selected works ranges from 6% (26 articles)
to 16% (67 articles).
Authors
For the analysis of the authors, 20 were selected. The criterion was to
have two or more documents in the set of articles, selecting the 20
authors with the greatest total link strength between them. We note that
69 articles were published by these authors, representing 17% of the
total number of articles; the 20 authors represent 2.8% of the 722
authors in the database. Of these 69 articles, 35% received some type
of funding.

The authors were divided into clusters according to their research
themes, as detailed in the introduction. Five clusters were obtained,
as seen in Table 2.6 and Figure 2.3. Each cluster is detailed below
with a review of the literature related to it.
Table 2.6 20 authors with the highest link strength and their citations
Cluster 1
This cluster comprises five authors: the group that makes a critical
analysis of the PISA project. In this group of articles we find only
documents from 2017 to 2020. There are 16 papers, of which five were
funded (about 31%).
One of the pressing criticisms concerns the implementation of the PISA
for Development project, also known as PISA-D. This project is seen as a
strategy to legitimize participating states more than to assess educa-
tion (Auld et al., 2020). Some argue that having a relationship with the
OECD, and appearing in its reports and databases, presents itself as a
way of legitimizing states as modern and accountable (Addey, 2020).
Questions are raised about governance in the post-2015 era (Auld et al.,
2019), with these changes made by the OECD in the education landscape
presenting only a narrow picture of the values of education, failing to
recognize the complexity of education in all its aspects (Addey, 2020).
For this reason, the adaptation effect emerges: educational systems try
to adapt to the PISA context, and this new paradigm has implied changes
by decision-makers in their interaction with the educational system
(Addey & Gorur, 2020). It is then
Cluster 2
This cluster is composed of five authors. It is the group of authors who
analyse the importance of PISA in national education policies and their
the field of education and beyond (Niemann & Martens, 2018) leading
to a need for greater involvement of non-OECD members in the PISA
study and scale development (Niemann et al., 2017).
Competitive comparison in education has deepened through the detail-
ing of ILSA data to new scales beyond the nation state (Engel & Frizzell,
2015). This context provides the basis for a discussion of how school-
based international assessment can operate as a governance tool, ena-
bling international organizations to have a greater influence on local
education policy formation and implementation (Rutkowski, 2015).
Cluster 3
This cluster comprises four authors: the group that analyses the
application of PISA and its derivatives, PISA for Schools and PISA4U.
In this group of articles we find only papers from 2011 to 2020. There
are 18 papers, of which six were funded (about 33%).
Lewis et al. (2016) suggest that PISA for Schools provides an
exemplary demonstration of heterarchical governance, in which vertical
policy mechanisms open up horizontal spaces for new policy actors. It
also creates proportional spaces of comparison and governance, allowing
the OECD to “reach” spaces at the school level and directly influence
local educational practices. This is described as a “convergence of policy
method” in widely different policy contexts, where rapid policies and
methods of promoting such policies appear to dominate over potentially
more thoughtful policies and contextual and applied approaches
(Lewis & Hogan, 2019).
PISA for Schools reflects contradictory logics within the European
Schools system, where the inherently context-based goal of "becoming
European" is juxtaposed with the desire to employ de-contextualized
international evidence. The perceived need for such data, coupled with
the overall authority of the OECD, can produce a problematic focus on
data-driven practice (Lewis, 2020a). It follows,
then, that treating PISA for Schools and other similar education services
as a product results in a potentially dangerous conflation of public and
private benefits, with the potential that (private) profit may end up
trumping (public) education (Lewis, 2017b). These facts position the
OECD as the global expert on education policy, and now, with PISA
for Schools, the local expert on “what works and what doesn’t work in
education” (Lewis, 2017a).
Lewis (2014) argues that the early stages of test production by inter-
national organizations are significant sites in which the global govern-
ance of education is legitimized and enacted. These re-articulations are
set against the extension of economic-social and neo-social rationalities
in all domains of life and the topological production of new spaces of
politics and power (Lingard et al., 2014). PISA and the OECD's education
work more broadly have facilitated new epistemological and
infrastructural modes of global governance for the OECD in education
(Sellar & Lingard, 2014), and the way they mobilize arguments and the
media has expanded the definitions of educational quality and equity,
enriching our vision of education and our debates about it
(Anagnostopoulos et al., 2016).
Also problematically, PISA4U allows the OECD to consolidate its status
as a global expert on education by providing a technical and discursive
platform from which to speak to the teaching profession, which risks
displacing more professionally oriented forms of teacher knowledge and
experience (Lewis, 2020b).
Cluster 4
This cluster is composed of three authors, who analyse PISA-based
effectiveness, efficiency and performance. This group contains only
papers from 2011 to 2020: 16 in total, of which six (about 38%) were
funded.
A careful analysis of the PISA data shows that, at the school level, the
average efficiency of schools is close to 70%, meaning that achievement
could be increased by about 30% through more effective use of available
resources. Practices related to teacher accountability, engagement and
professional development, as well as extracurricular activities, are also
positively associated with higher levels of efficiency (Agasisti & Zoido,
2019). Results show that students whose teachers focus on a limited set
of teaching practices achieve better results than those whose teachers
resort to many different classroom activities (Cordero & Gil-Izquierdo,
2018): traditional teaching methods have a positive influence on students'
proficiency in mathematics, while more innovative active-learning
strategies seem to have a negative impact on performance (Cordero &
Gil-Izquierdo, 2018b). This picture is reinforced by the finding that
schools attended by resilient students offer more extracurricular activities
and are characterized by a more positive school climate (Agasisti &
Longobardi, 2017).
Teacher salaries and Internet use (as a proxy for technological literacy)
play a positive role in educational achievement (Agasisti, 2014). Despite
this, the results suggest that a more cautious approach should be taken
to the widespread use of digital innovation to support students’ work
outside of school (Agasisti et al., 2020).
While individual-level characteristics play a role in outcomes, some
school factors (e.g., extracurricular activities and school leadership) are
also involved, suggesting implications for school management policies.
40 Carlos Décio Cordeiro and Vítor Duarte Teodoro
Cluster 5
This cluster is composed of three authors, who analyse the OECD
agenda through PISA, with special emphasis on the "regression to the
mean" in the Finnish results. This group contains only papers from 2009
to 2021: four in total, of which three (75%) were funded.
Rautalin et al. (2021) analysed OECD economic studies between 1995
and 2015. The results show that, while the reports were portrayed as
“scientific” as early as the 1960s, in the 2000s the reports clearly shifted
from the language of economics to a more popularized consultancy lan-
guage. The authors argue that these changes happened because of the
OECD’s reactions to transformations in the broader institutional envi-
ronment and were motivated by its efforts to appear as a significant actor
in knowledge-based policy-making.
Conclusions
There is broad coverage of the themes made possible by the PISA study,
with particular emphasis on its implications for policy decisions.
No articles in this batch analyse the constructs
created by the OECD in its context questionnaires. Similarly, the issue
of teachers is only weakly addressed, despite the introduction of specific
teacher questionnaires; this may be related to the co-existence
of the TALIS project, also conducted by the OECD. Moreover,
very little work cross-references PISA data with other
OECD studies such as TALIS, PIAAC and ESonline, or with other
international studies such as TIMSS, PIRLS, ePIRLS and ICILS.
This chapter aimed to fill a gap in the literature by quantifying and
cataloguing the extensive repertoire of documents relevant to this area
of study.
References
Addey, C. (2020). The appeal of PISA for development in Ecuador and Paraguay:
Theorising and applying the global ritual of belonging. Compare: A Journal of
Comparative and International Education. http://doi.org/10.1080/03057925.
2019.1623653
Addey, C., & Gorur, R. (2020). Translating PISA, translating the world. Com-
parative Education. http://doi.org/10.1080/03050068.2020.1771873
Addey, C., Maddox, B., & Zumbo, B. (2020). Assembled validity: Rethinking
Kane’s argument-based approach in the context of international large-scale
assessments (ILSAs). Assessment in Education-Principles Policy & Practice.
http://doi.org/10.1080/0969594X.2020.1843136
Agasisti, T. (2014). The efficiency of public spending on education: An empiri-
cal comparison of EU countries. European Journal of Education. http://doi.
org/10.1111/ejed.12069
Agasisti, T., Gil-Izquierdo, M., & Han, S. (2020). ICT use at home for school-
related tasks: What is the effect on a student’s achievement? Empirical evidence
from OECD PISA data. Education Economics. http://doi.org/10.1080/09645
292.2020.1822787
Agasisti, T., & Longobardi, S. (2014). Inequality in education: Can Italian dis-
advantaged students close the gap? Journal of Behavioral and Experimental
Economics. http://doi.org/10.1016/j.socec.2014.05.002
Agasisti, T., & Longobardi, S. (2017). Equality of educational opportunities,
schools’ characteristics and resilient students: An empirical study of EU-15
countries using OECD-PISA 2009 data. Social Indicators Research. http://doi.
org/10.1007/s11205-016-1464-5
Agasisti, T., Longobardi, S., & Regoli, A. (2017). A cross-country panel approach
to exploring the determinants of educational equity through PISA data. Qual-
ity & Quantity. http://doi.org/10.1007/s11135-016-0328-z
Agasisti, T., & Murtinu, S. (2012). ‘Perceived’ competition and performance in
Italian secondary schools: New evidence from OECD-PISA 2006. British Edu-
cational Research Journal. http://doi.org/10.1080/01411926.2011.588314
Agasisti, T., & Zoido, P. (2018). Comparing the efficiency of schools through inter-
national benchmarking: Results from an empirical analysis of OECD PISA 2012
data. Educational Researcher. http://doi.org/10.3102/0013189X18777495
Agasisti, T., & Zoido, P. (2019). The efficiency of schools in developing countries,
analysed through PISA 2012 data. Socio-Economic Planning Sciences. http://
doi.org/10.1016/j.seps.2019.05.002
Anagnostopoulos, D., Lingard, B., & Sellar, S. (2016). Argumentation in educa-
tional policy disputes: Competing visions of quality and equity. Theory into
Practice. http://doi.org/10.1080/00405841.2016.1208071
Aparicio, J., Cordero, J., Gonzalez, M., & Lopez-Espin, J. (2018). Using non-
radial DEA to assess school efficiency in a cross-country perspective: An empir-
ical analysis of OECD countries. Omega-International Journal of Management
Science. http://doi.org/10.1016/j.omega.2017.07.004
Asan, A., & Aslan, A. (2020). Quartile scores of scientific journals: Meaning,
importance and usage. Acta Medica Alanya, 4(1), 102–108.
Auld, E., Li, X., & Morris, P. (2020). Piloting PISA for development to suc-
cess: An analysis of its findings, framework and recommendations. Compare:
A Journal of Comparative and International Education. http://doi.org/10.
1080/03057925.2020.1852914
Auld, E., & Morris, P. (2016). PISA, policy and persuasion: Translating com-
plex conditions into education “best practice.” Comparative Education, 52(2),
202–229. doi:10.1080/03050068.2016.1143278
Auld, E., & Morris, P. (2019a). The OECD and IELS: Redefining early childhood
education for the 21st century. Policy Futures in Education, 17(1), 11–26.
https://doi.org/10.1177/1478210318823949
Auld, E., & Morris, P. (2019b). Science by streetlight and the OECD’s measure of
global competence: A new yardstick for internationalisation? Policy Futures in
Education, 17(6), 677–698. https://doi.org/10.1177/1478210318819246
Auld, E., Rappleye, J., & Morris, P. (2019). PISA for development: How the
OECD and World Bank shaped education governance post-2015. Comparative
Education. http://doi.org/10.1080/03050068.2018.1538635
Aydin, A., Erdag, C., & Tas, N. (2011). A comparative evaluation of Pisa 2003–
2006 results in reading literacy skills: An example of top-five OECD countries
and Turkey. Kuram ve Uygulamada Egitim Bilimleri, 11, 651–673.
Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in edu-
cation? Lessons from Switzerland and the US. European Journal of Education.
http://doi.org/10.1111/j.1465-3435.2010.01462.x
Castro, S., & Sevillano, M. (2019). Análisis bibliométrico de la investigación
educativa sobre desventaja sociocultural/socieducativa en el periodo 2015 a
2019. Enseñanza & Teaching: Revista Interuniversitaria de Didáctica, 37(2),
147–164. https://doi.org/10.14201/et2019372147164
Chadegani, A., Salehi, H., Yunus, M., Farhadi, H., Fooladi, M., Farhadi, M., &
Ebrahim, N. (2013). A comparison between two main academic literature col-
lections: Web of Science and Scopus databases. Asian Social Science, 9(5).
Cordero, J., & Gil-Izquierdo, M. (2018). The effect of teaching strategies on stu-
dent achievement: An analysis using TALIS-PISA-link. Journal of Policy Mod-
eling. http://doi.org/10.1016/j.jpolmod.2018.04.003
Cordero, J., Polo, C., Santin, D., & Simancas, R. (2018). Efficiency measure-
ment and cross-country differences among schools: A robust conditional
nonparametric analysis. Economic Modelling. http://doi.org/10.1016/j.
econmod.2018.05.001
Dobbins, M., & Martens, K. (2012). Towards an education approach à la Finlan-
daise? French education policy after PISA. Journal of Education Policy. http://
doi.org/10.1080/02680939.2011.622413
Domínguez, M., Vieira, M., & Vidal, J. (2012). The impact of the programme
for international student assessment on academic journals. Assessment in
Education: Principles, Policy & Practice, 19(4), 393–409. doi:10.1080/0969
594X.2012.659175
Engel, L. (2015). Steering the national: Exploring the education policy uses of
PISA in Spain. European Education. http://doi.org/10.1080/10564934.2015.
1033913
Engel, L., & Frizzell, M. (2015). Competitive comparison and PISA brag-
ging rights: Sub-national uses of the OECD’s PISA in Canada and the USA.
Discourse-Studies in the Cultural Politics of Education. http://doi.org/10.1080/
01596306.2015.1017446
Engel, L., & Rutkowski, D. (2020). Pay to play: What does PISA participation
cost in the US? Discourse-Studies in the Cultural Politics of Education. http://
doi.org/10.1080/01596306.2018.1503591
Engel, L., Rutkowski, D., & Thompson, G. (2019). Toward an international
measure of global competence? A critical look at the PISA 2018 framework.
Globalisation Societies and Education. http://doi.org/10.1080/14767724.
2019.1642183
Ertl, H. (2006). Educational standards and the changing discourse on educa-
tion: The reception and consequences of the PISA study in Germany. Oxford
Review of Education, 32(5), 619–634. doi:10.1080/03054980600976320
Flis, I., & van Eck, N. (2017). Framing psychology as a discipline (1950–1999):
A large-scale term co-occurrence analysis of scientific literature in psychology.
History of Psychology, 21(4). http://doi.org/10.1037/hop0000067
Grek, S. (2009). Governing by numbers: The PISA “effect” in Europe. Journal of
Education Policy, 24(1), 23–37. doi:10.1080/02680930802412669
Grey, S., & Morris, P. (2018). PISA: Multiple 'truths' and mediatised global
governance. Comparative Education. http://doi.org/10.1080/03050068.2018.
1425243
Gumus, S., & Atalmis, E. (2011). Exploring the relationship between purpose
of computer usage and reading skills of Turkish students: Evidence from PISA
2006. Turkish Online Journal of Educational Technology, 10(3), 129–140.
Hirsch, J. (2005). An index to quantify an individual's scientific research output.
Proceedings of the National Academy of Sciences of the United States of
America, 102(46), 16569–16572. https://doi.org/10.1073/pnas.0507655102
Hopfenbeck, T., Lenkeit, J., Masri, Y., Cantrell, K., Ryan, J., & Baird, J. (2018).
Lessons learned from PISA: A systematic review of peer-reviewed articles on
the programme for international student assessment. Scandinavian Journal of
Educational Research, 62(3). https://doi.org/10.1080/00313831.2016.1258726
Komatsu, H., & Rappleye, J. (2017a). A new global policy regime founded on
invalid statistics? Hanushek, Woessmann, PISA, and economic growth. Com-
parative Education. http://doi.org/10.1080/03050068.2017.1300008
Komatsu, H., & Rappleye, J. (2017b). A PISA paradox? An alternative theory
of learning as a possible solution for variations in PISA scores. Comparative
Education Review. http://doi.org/10.1086/690809
Komatsu, H., & Rappleye, J. (2019). Refuting the OECD-World Bank develop-
ment narrative: Was East Asia’s ‘economic miracle’ primarily driven by educa-
tion quality and cognitive skills? Globalisation Societies and Education. http://
doi.org/10.1080/14767724.2019.1577718
Lewis, S. (2014). The OECD, PISA and educational governance: A call to critical
engagement. Discourse-Studies in the Cultural Politics of Education. http://
doi.org/10.1080/01596306.2014.899833
Lewis, S. (2017a). Governing schooling through ‘what works’: The OECD’s PISA
for Schools. Journal of Education Policy. http://doi.org/10.1080/02680939.
2016.1252855
Lewis, S. (2017b). Policy, philanthropy and profit: The OECD’s PISA for Schools
and new modes of heterarchical educational governance. Comparative Educa-
tion. http://doi.org/10.1080/03050068.2017.1327246
Lewis, S. (2020a). ‘Becoming European’? Respatialising the European schools
system through PISA for Schools. International Studies in Sociology of Educa-
tion. http://doi.org/10.1080/09620214.2019.1624593
Lewis, S. (2020b). Providing a platform for ‘what works’: Platform-based gov-
ernance and the reshaping of teacher learning through the OECD’s PISA4U.
Comparative Education. http://doi.org/10.1080/03050068.2020.1769926
Lewis, S., & Hogan, A. (2019). Reform first and ask questions later? The impli-
cations of (fast) schooling policy and ‘silver bullet’ solutions. Critical Studies in
Education. http://doi.org/10.1080/17508487.2016.1219961
Lewis, S., Sellar, S., & Lingard, B. (2016). PISA for schools: Topological rational-
ity and new spaces of the OECD’s global educational governance. Comparative
Education Review, 60(1), 27–57. http://doi.org/10.1086/684458
Lingard, B., Sellar, S., & Savage, G. (2014). Re-articulating social justice as equity
in schooling policy: The effects of testing and data infrastructures. British Jour-
nal of Sociology of Education. http://doi.org/10.1080/01425692.2014.919846
Longobardi, S., Pagliuca, M., & Regoli, A. (2017). Family background and
financial literacy of Italian students: The mediating role of attitudes and moti-
vations. Economics Bulletin, 37(4).
Longobardi, S., Pagliuca, M., & Regoli, A. (2018). Can problem-solving atti-
tudes explain the gender gap in financial literacy? Evidence from Italian stu-
dents’ data. Quality & Quantity. http://doi.org/10.1007/s11135-017-0545-0
Martens, K., & Niemann, D. (2013). When do numbers count? The differential
impact of the PISA rating and ranking on education policy in Germany and the
US. German Politics. http://doi.org/10.1080/09644008.2013.794455
Martínez-López, F., Merigó, J., Valenzuela-Fernández, L., & Nicolás, C. (2018).
Fifty years of the European journal of marketing: A bibliometric analysis.
European Journal of Marketing, 52(1/2), 439–468.
Mascarenhas, C., Ferreira, J. J., & Marques, C. (2018). University – industry
cooperation: A systematic literature review and research agenda. Science and
Public Policy. http://doi.org/10.1093/scipol/scy003/4829714
Meyer, H., & Benavot, A. (Eds.). (2013). PISA, power, and policy: The emer-
gence of global educational governance. Symposium Books.
Mølstad, C. E., Pettersson, D., & Forsberg, E. (2017). A game of thrones: Organ-
ising and legitimising knowledge through PISA research. European Educational
Research Journal, 16(6), 869–884. https://doi.org/10.1177/1474904117715835
Morgan, C. (2011). Constructing the OECD PISA. In M. Pereyra (Ed.), PISA
under examination: Changing knowledge, changing tests, and changing
schools. Sense Publishers.
Niemann, D., Hartong, S., & Martens, K. (2018). Observing local dynamics of
ILSA projections in federal systems: A comparison between Germany and the
United States. Globalisation Societies and Education. http://doi.org/10.1080/
14767724.2018.1531237
Niemann, D., & Martens, K. (2018). Soft governance by hard fact? The OECD
as a knowledge broker in education policy. Global Social Policy. http://doi.
org/10.1177/1468018118794076
Niemann, D., Martens, K., & Teltemann, J. (2017). PISA and its consequences:
Shaping education policies through international comparisons. European Jour-
nal of Education. http://doi.org/10.1111/ejed.12220
OECD. (2001). Knowledge and skills for life: First results from PISA 2000.
OECD Publishing. https://doi.org/10.1787/9789264195905-en
OECD. (2004). Learning for tomorrow’s world: First results from PISA 2003.
OECD Publishing. https://doi.org/10.1787/9789264006416-en
OECD. (2014). PISA 2012 technical report. OECD Publishing.
Palmblad, M., & van Eck, N. (2018). Bibliometric analyses reveal patterns of
collaboration between ASMS members. Journal of The American Society for
Mass Spectrometry, 29(3), 447–454.
Pons, X. (2017). Fifteen years of research on PISA effects on education govern-
ance: A critical review. European Journal of Education, 52(2), 131–144.
Rappleye, J., & Komatsu, H. (2020a). Is knowledge capital theory degenerate?
PIAAC, PISA, and economic growth. Compare: A Journal of Comparative and
International Education. http://doi.org/10.1080/03057925.2019.1612233
Rappleye, J., & Komatsu, H. (2020b). Is shadow education the driver of East
Asia’s high performance on comparative learning assessments? Education Pol-
icy Analysis Archives. http://doi.org/10.14507/epaa.28.4990
Rappleye, J., Komatsu, H., Uchida, Y., Krys, K., & Markus, H. (2020). ‘Better
policies for better lives’? Constructive critique of the OECD’s (mis)measure
of student well-being. Journal of Education Policy. http://doi.org/10.1080/
02680939.2019.1576923
Rautalin, M. (2018). PISA and the criticism of Finnish education: Justifications
used in the national media debate. Studies in Higher Education. http://doi.org/
10.1080/03075079.2018.1526773
Rautalin, M., & Alasuutari, P. (2009). The uses of the national PISA results by
Finnish officials in central government. Journal of Education Policy. http://doi.
org/10.1080/02680930903131267
Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education
policies: Does PISA have an effect? Journal of Education Policy. http://doi.org/
10.1080/02680939.2018.1462890
Rautalin, M., Syväterä, J., & Vento, E. (2021). International organizations establish-
ing their scientific authority: Periodizing the legitimation of policy advice by the
OECD. International Sociology. http://doi.org/10.1177/0268580920947871
Rutkowski, D. (2015). The OECD and the local: PISA-based Test for Schools in
the USA. Discourse-Studies in the Cultural Politics of Education. http://doi.org
/10.1080/01596306.2014.943157
Sellar, S., & Lingard, B. (2013a). Looking east: Shanghai, PISA 2009 and the
reconstitution of reference societies in the global education policy field.
Comparative Education. http://doi.org/10.1080/03050068.2013.770943
Sellar, S., & Lingard, B. (2013b). The OECD and global governance in education.
Journal of Education Policy. http://doi.org/10.1080/02680939.2013.779791
Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New
global modes of governance in education. British Educational Research Jour-
nal. http://doi.org/10.1002/berj.3120
Takayama, K. (2008). The politics of international league tables: PISA in Japan’s
achievement crisis debate. Comparative Education, 44(4), 387–407. http://doi.
org/10.1080/03050060802481413
Teltemann, J., & Jude, N. (2019). Assessments and accountability in secondary
education: International trends. Research in Comparative and International
Education. http://doi.org/10.1177/1745499919846174
Teltemann, J., & Schunck, R. (2016). Education systems, school segregation, and
second-generation immigrants’ educational success: Evidence from a country-
fixed effects approach using three waves of PISA. International Journal of
Comparative Sociology. http://doi.org/10.1177/0020715216687348
Teltemann, J., & Windzio, M. (2019). The impact of marketisation and spa-
tial proximity on reading performance: International results from PISA 2012.
Compare: A Journal of Comparative and International Education. http://doi.
org/10.1080/03057925.2018.1458597
van Eck, N. J., & Waltman, L. (2007). VOS: A new method for visualizing simi-
larities between objects. In H.-J. Lenz & R. Decker (Eds.), Advances in data
analysis: Proceedings of the 30th annual conference of the German Classifica-
tion Society (pp. 299–306). Springer.
van Eck, N. J., & Waltman, L. (2010). VOSviewer: Visualizing scientific land-
scapes [Software]. www.vosviewer.com
van Eck, N. J., Waltman, L., Dekker, R., & Van den Berg, J. (2010). A compari-
son of two techniques for bibliometric mapping: Multidimensional scaling and
VOS. Journal of the American Society for Information Science and Technol-
ogy, 61(12), 2405–2416.
van Nunen, K., Li, J., Reniers, G., & Ponnet, K. (2018). Bibliometric analysis of
safety culture research. Safety Science, 108, 248–258.
Wong, D. (2018). VOSviewer. Technical Services Quarterly, 35(2), 219–220.
Xiaomin, L., & Auld, E. (2020). A historical perspective on the OECD's 'humani-
tarian turn’: PISA for development and the learning framework 2030. Com-
parative Education. https://doi.org/10.1080/03050068.2020.1781397
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266.
3
PISA as epistemic governance within the European political arithmetic
Romuald Normand
Over the past two decades, PISA has become a key element in guid-
ing national policies around the world and it has raised a great deal
of scientific and public debate (Meyer & Benavot, 2013). In the first
part of this chapter, I report on this issue without being exhaustive,
but rather showing how PISA has given rise to an international space
of circulation between science, expertise and policy despite increas-
ing criticism (Sellar & Lingard, 2014). After first shedding light on the
genesis of the survey and its contribution to worldwide governing by
numbers, and on the networks of experts and policymakers that contributed
to its legitimization and dissemination, sociological research has
moved towards a more precise scrutiny of its structuring effects on
education policies (Grek, 2014). Sociologists are particularly studying
processes of mediation and translation in national education systems
as evidence-based accountability policies are developed.
This chapter contributes to this emerging field of research by illustrat-
ing and situating the French case. However, it should be emphasized
that this national case can only be understood if it is contextualized
in a broader European space. Indeed, PISA is a key component of the
European lifelong learning strategy to which many elements of French
policy are related, as in other European countries. To avoid methodological
nationalism, it is therefore important to consider how PISA
is embedded in a European political arithmetic of inequalities, before
tracing the dissemination of this survey within the French context.
I will then show how an epistemic governance has been built around
the survey, based on knowledge produced by an association between
experts and policymakers and on a new formulation of the French ideal
of equal opportunities in education.
DOI: 10.4324/9781003255215-4
PISA as epistemic governance within the European political arithmetic 49
While there are still concerns about limiting the repetition rate and
improving access rates to higher education, measurements by economists,
also relayed by psychologists, are now geared towards limiting school
dropout rates and improving cognitive skills, extended to the social and
emotional domain.
Measuring the match between training and employment has been
replaced by randomized controlled trials based on targeted experiments
and evaluations carried out in the name of the new "experimental
economics", which takes the social and educational fields as a vast
laboratory. Economists
have changed their epistemological and methodological toolbox, but
the idea of investment in human capital, albeit criticized, remains the
dominant orthodoxy. It is not surprising that some of their leading figures,
such as Eric Hanushek and Ludger Woessmann (2010), have taken an
interest in the PISA data, considering it to be an excellent measure of the
quality and efficiency of education systems, and a benchmark for judging
how well countries are investing in their human capital.
These conceptions are taken up by welfare theorists such as Esping-
Andersen and his colleagues (Esping-Andersen et al., 2001). According to
them, two reasons justify some expectations regarding the level of skills
and human capital among children. The first is demographic: because
of low fertility, future, much smaller cohorts of young people will have to
support a large and rapidly growing elderly population. It is therefore
necessary to invest as early as possible in the productivity of youth to
ensure a sustainable Welfare State in Europe over the coming decades.
The other explanation lies in the rapidly expanding skillset required by
the knowledge economy. Reforms in European countries need to target
early school leavers, who face higher unemployment rates. These low-skilled
people are unlikely to obtain high retirement pensions, and poverty
threatens them at the end of their lives. Cognitive (and non-cognitive)
skills are therefore considered essential for a good career path.
To the extent that acquired skills affect school success and lifelong
learning, it is important that all children get a good start to maximize
the “return on investment”. On this point, if we follow Esping-Andersen,
the situation of states is revealed by the PISA indicators, where a nation’s
superiority is explained by institutional factors generating differences
in performance. But social inheritance mechanisms are present in early
childhood. This is therefore where the Welfare State's efforts must focus
to create more equality and increase the productivity of the workforce.
The aim is to combat child poverty while compensating for inequalities
in parental resources in the acquisition of human capital. To achieve this,
it is necessary to reduce the economic insecurity of mothers at the bottom
of the income scale by supporting their inclusion into employment. The
other mechanism is to support parents’ investment in their children’s
cognitive development. Interventions should take the form of targeted
measures for “at-risk” children identified from early childhood.
Conclusion
More space would be needed to describe and analyse in depth the epis-
temic work that has been established around the PISA paradigm in
France. This chapter has shown that the PISA survey, in addition to
being a boundary object giving rise to multiple meanings and mobiliza-
tions at national and global scales, is based on an economic reasoning
that relativizes its uses to promote equality of opportunities. This inter-
national survey is undoubtedly a political technology and an instrument
of public action that legitimizes educational reforms towards greater
accountability and evidence-based education. From this point of view,
the PISA paradigm has a restructuring effect similar to the one that occurred
with the comprehensive school during the 1960s and 1970s. This explains
the importance of studying these transformations over a long period
of time, through a history of the present, to highlight the genealogy
of the survey and its extension from the United States. The role of
major International Organizations in the mechanisms of PISA policy
travelling and transfer has also long been established. PISA is often
presented as a neo-liberal vanguard that contributes to introducing
privatization and private interests in public education. It is also a pow-
erful means of institutionalizing a new state epistemic governance on
behalf of the human capital theory. Beyond its instrumentalization by
indicators and data, and the production of expert knowledge within
and outside Ministries of Education, this epistemic work is based on
authority and imagery that are imposed on the public debate and the
media through a soft infusion, like an opium that puts the most vigilant
minds to sleep. As the Chinese general Sun Tzu put it in his Art of War,
the PISA paradigm wins the epistemic battle before the fight has even
been engaged. This is why sociology, as Pierre Bourdieu argued, drawing
on critical reflexive thinking about international assessments, must
remain a martial art rather than become a governing science.
Notes
1 Programme Incitatif de Recherche en Education et Formation.
2 Conseil National d’Evaluation Scolaire.
References
Addey, C. (2015). Participating in international literacy assessments in Lao PDR
and Mongolia: A global ritual of belonging. In M. Hamilton, B. Maddox, &
C. Addey (Eds.), Literacy as numbers: Researching the politics and practices
of international literacy assessment. Cambridge University Press.
Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B., & Verger, A. (2017). The
rise of international large-scale assessments and rationales for participation.
Compare: A Journal of Comparative and International Education, 47(3),
434–452.
66 Romuald Normand
Alasuutari, P., & Qadir, A. (2019). Epistemic governance: Social change in the
modern world. Springer Nature.
Alexiadou, N., Fink-Hafner, D., & Lange, B. (2010). Education policy conver-
gence through the open method of coordination: Theoretical reflections and
implementation in ‘old’ and ‘new’ national contexts. European Educational
Research Journal, 9(3), 345–358.
Baudelot, C., & Establet, R. (2009). L’élitisme républicain: l’école française à
l’épreuve des comparaisons internationales. Seuil.
Berliner, D. C. (2020). Implications of understanding that PISA is simply another
standardized achievement test. In G. Fan & T. S. Popkewitz (Eds.), Handbook
of education policy studies (pp. 239–258). Springer.
Berliner, D. C., & Biddle, B. J. (1995). The manufactured crisis: Myth, fraud, and
the attack on America’s public schools. Longman Publishers.
Biesta, G. (2007). Why “what works” won’t work: Evidence-based practice and
the democratic deficit in educational research. Educational Theory, 57(1),
1–22.
Carvalho, L. M. (2020). Revisiting the fabrication of PISA. In G. Fan & T.
S. Popkewitz (Eds.), Handbook of education policy studies (pp. 259–273).
Springer.
Carvalho, L. M., & Costa, E. (2015). Seeing education with one’s own’ eyes and
through PISA lenses. Discourse: Studies in the Cultural Politics of Education,
36(5), 638–646.
Carvalho, L. M., & Costa, E. (2016). The praise of mutual – surveillance in
Europe. In R. Normand & J.-L. Derouet (Eds.), A European politics of educa-
tion? (pp. 53–72). Routledge.
Desrosières, A. (1998). The politics of large numbers: A history of statistical
reasoning. Harvard University Press.
Dubet, F. (2004). L’école des chances. Qu’est-ce qu’une école juste ? Éditions du
Seuil et La République des Idées
Dubet, F., Duru-Bellat, M., & Vérétout, A. (2010). Les sociétés et leur école.
Emprise du diplôme et cohésion sociale. Le Seuil.
Esping-Andersen, G., Gallie, D., Hemerijck, A., & Myles, J. (2001). A new
welfare architecture for Europe. Report submitted to the Belgian Presidency
of the European Union.
Felouzis, G., & Charmillot, S. (2012). Les enquêtes PISA. PUF (col. Que sais-je?).
Fenwick, T., & Edwards, R. (2010). Actor-network theory in education.
Routledge.
Gorur, R. (2011). Policy as assemblage. European Educational Research Journal,
10(4), 611–622.
Gorur, R. (2014). Towards a sociology of measurement in education policy.
European Educational Research Journal, 13(1), 58–72.
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37.
Grek, S. (2013). Expert moves: International comparative testing and the rise of
expertocracy. Journal of Education Policy, 28(5), 695–709.
Grek, S. (2014). OECD as a site of co-production: European education govern-
ance and the new politics of ‘policy mobilization’. Critical Policy Studies, 8(3),
266–281.
PISA as epistemic governance within the European political arithmetic 67
Tuijnman, A. (2003). Measuring lifelong learning for the new economy. Com-
pare: A Journal of Comparative and International Education, 33(4), 471–482.
Xingguo, Z., & Normand, R. (2021). Accountability policies in Chinese basic
education: The long March towards quality and evidence. In S. Grek, C.
Maroy, & T. Verger (Eds.). World yearbook of education accountability and
datafication in the governance of education. Routledge.
4
PISA and curricular reforms in Brazil
João Luiz Horta Neto
Introduction
This section presents the structure of education in Brazil and the
development of the external assessments promoted by the federal
authorities, as well as the influence exerted by PISA, an instrument
developed by the Organization for Economic Cooperation and Development
(OECD), over the educational initiatives of the Ministry of Education (MEC).
Brazil, a federal country with a population of 210 million people and
an area of 8.5 million km2, began its constitution as a nation with the
arrival of the Portuguese colonists in 1500. It proclaimed its independence
from Portugal in 1822, under a monarchy, and became a republic in 1889.
Since 1996, Brazilian education has been organized in two levels: higher
education (ISCED 6 to 8) and basic education (ISCED 0 to 3). Basic
education is in turn divided into three stages of general education: Early
Childhood Education (ISCED 0), Core Education (ISCED 1 and 2) and
Middle Schooling (ISCED 3) (OECD, 2020a). Mandatory education in
Brazil covers ages 4 to 17, and 98% of the population in this age group
attends school. Basic education comprises 181,900 schools and 48.5
million enrolments, 82% of which are offered by the public school
networks funded by the state and municipal authorities (INEP, 2019a).
Brazil is a very unequal country, both socially and regionally, and this
is reflected in the problems faced by its education system (Weller & Horta
Neto, 2021). One of these is the high rate of grade retention, which
translates into a high proportion of students older than expected for a
particular school grade. In the first five years of Core Education, this
proportion reaches 11.2% of enrolments, rising to 24.7% in the final four
years. In the four years that follow, the proportion is even higher, jumping
to 28.2% of enrolments. Worsening the problem at this stage of education,
nearly 12% of young people between the ages of 15 and 17 are outside
the school system (INEP, 2019a).
DOI: 10.4324/9781003255215-5
Pisa and curricular reforms in Brazil 71
Table 4.1 Comparison of Brazil's performance in PISA tests and average
performance of the OECD member countries: 2000 to 2015
Reading
The area of Reading has shown little variation in the definition of its
construct across the cycles. In 2000, the first PISA cycle, when Reading
was the core domain, it was defined thus:
Table 4.2 Description of the scale for Reading Proficiency in PISA 2018
The results of Reading Literacy are presented in Table 4.2, divided into
nine levels, from 6 down to 2 and then levels "1a", "1b", "1c" and "Below
1c" (OECD, 2019b, pp. 87–88). The space of this chapter is not enough
to analyse the whole table. For this reason, passages were selected from
the descriptions that present common tasks described at different levels.
Other passages could equally have been chosen without affecting the
quality and validity of the analysis. For this study, the chosen passages
were those referring to the ability to make inferences, one of the tasks
required by the language test. In building the table, some of the levels
were also selected so as to exemplify the range of this skill, the minimum
proficiency needed for each task and the percentage of OECD and
Brazilian students classified at each level in 2018.
To start with, it is necessary to comment on some imprecisions in the way
PISA presents its results. As stated in the introduction to this chapter,
proficiency is calculated using the IRT model, which expresses a probability
78 João Luiz Horta Neto
that the task described has been carried out, not a certainty that all the
students classified at a particular level performed it. Yet the language
used in the description of the scale creates the idea that the students
classified at a particular level managed to perform all the tasks described
in it, which may not correspond to the actual situation. Besides, like any
measurement, it carries an associated error; in other words, the
proficiency value indicated varies within a margin of error. These
limitations are not included in the description of the scale, thereby
creating the notion that what is expressed in these tables of task
descriptions is exact information, when actually it is not.
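The probabilistic character of the measure can be illustrated with the one-parameter logistic (Rasch) model, the IRT formulation on which PISA's earlier scaling was based; the expression below is an illustrative sketch, not the exact model of every cycle. The probability that a student with proficiency θ succeeds on an item of difficulty b_i is:

```latex
P(X_i = 1 \mid \theta, b_i) = \frac{\exp(\theta - b_i)}{1 + \exp(\theta - b_i)}
```

Even a student located well above an item's difficulty therefore has only a high probability, never a certainty, of answering it correctly; conversely, students below a level's cut score may still succeed on some of its tasks.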
Regarding the tasks involving the ability to make inferences described in
Table 4.2, when the different levels of the scale are compared, only one
of them, level 1b, makes no reference to any task pertaining to that skill.
Consequently, the report implies that the students classified at that level
were not capable of making inferences. But what inferences is PISA
alluding to? Since access is provided to only part of the items, one of the
limitations of the use of IRT discussed in the introduction to this chapter,
it is not possible to know all the tasks the students were subjected to, and
thus to grasp why they could not make the inferences required of them.
In the Brazilian case, Table 4.2 shows that 17.7% of students were
classified at level 1b and therefore, supposedly, had not developed the
ability to make inferences. This may be true for the texts and the tasks
required by the items on the test. In no way is it possible to assume that
these students are incapable of making inferences in other situations.
However, the manner in which the information is presented to society,
including to policymakers and the media, highlights a failure of the
school system that leaves 17.7% of Brazilians unable to make inferences.
If, on the one hand, there is indeed a considerable difference in
performance between the students of OECD countries and those of
Brazil, owing to the distinct realities in which they live, on the other
hand, it is pedagogically hard to believe that such a large section of
Brazilian students is incapable of making any type of inference. The
issue raised here is immensely important, since it relates to the uses made
of the results obtained. If the limits of the measurements are not made
clear, countless conclusions may be drawn, and discourses defending
so-called ideal models may spread, casting them as silver bullets capable
of overcoming even social inequalities and ensuring learning
improvements, without ever clearly defining what these would be.
Another item of data from Table 4.2, for level 4, indicating that students
"are able to compare perspectives and extract inferences based on
multiple sources", provides very limited information on the actual
dimension of the required task. This is the problem of external
evaluations that use IRT to analyse results. As the number itself means
nothing, the attempt to describe what students can do based on the tasks
presented by the items is quite limited and may lead to multiple
interpretations. Thus, the data in Table 4.2 indicate that the inference-
making skill can be measured through different tasks with varying
degrees of difficulty and complexity. No more than that. This being the
case, it is very difficult to formulate education policies to improve
students' reading skills solely on the basis of information like this. On
the other hand, the tables with the descriptions of the levels of proficiency
give a clear picture of the differences between nations and economies,
both in terms of their global position in the ranking and in the percentage
of students classified at each level. And this is quite enough to foster
competition for the best results in each PISA cycle.
Finally, when we consider all the tasks the students were able to perform
in PISA's reading tests, not only those indicated in Table 4.2, we observe
that, although the definition of the construct is quite broad and refers to
a model of an ideal society, the students were subjected to very specific
tasks. Nevertheless, the conclusions presented refer to the broadest sense
of the construct. In this way, the impression is reinforced that the
instrument is sufficiently precise to guide any education system. The
same problem arises in the reports on the two other fields of knowledge.
Mathematics
In 2000, the definition of the construct used by PISA to measure this
domain read as follows:
The text states that the traditional knowledge and skills defined at school
are not the focus of PISA, and that the emphasis is on mathematical
knowledge put to use in the different contexts that require reflection and
good judgement. At the same time, it defines the following skills as those
of interest to PISA: mathematical thinking, arguing and modelling,
problem-solving and communication. To prepare the tests, these skills
are deployed along three dimensions: processes, focusing on the students'
ability to analyse, reason and communicate ideas effectively by presenting,
formulating and solving mathematical problems; content, using
knowledge about change and growth, space and shape, probabilities,
quantitative reasoning and relations of uncertainty and dependence; and
context, emphasizing doing and using mathematics in situations of
personal and school life, work and sport, the local community and
society (OECD, 1999).
In 2012, the second time Mathematics was the main domain of PISA
(the first had been in 2003), the document describing the framework of
the test for that cycle remarks that "an understanding of mathematics is
central to a young person's preparedness for life in modern society"
(OECD, 2013, p. 24), bringing to the table a utilitarian perspective on
education, without clearly defining which life and which modern society
are being considered. The document also presents changes to the concept
of literacy, seeking to further clarify the relation between competencies
and what the items would require, something that was not clear in the
original formulation. Thus:
it is important for both policy makers and those engaged more closely
in the day-to-day education of students to know how effectively
students are able to engage in each of these processes.
(OECD, 2013, p. 28)
Table 4.3 Description of the scale for Mathematics Proficiency in PISA 2018

Level 6 (669 points). OECD: 2.4%; Brazil: 0.1%. They can reflect on
their actions, and can formulate and precisely communicate their actions
and reflections regarding their findings, interpretations, arguments; they
can also explain why these are appropriate to the original situation.

Level 4 (545 points). OECD: 18.5%; Brazil: 3.4%. They can utilize their
limited range of skills and can reason with some insight in straightforward
contexts. They can construct and communicate explanations and
arguments based on their interpretations, arguments and actions.

Level 2 (420 points). OECD: 22.2%; Brazil: 18.2%. They can extract
relevant information from a single source and make use of a single
representational mode. They are capable of making literal interpretations
of results.

Level 1 (358 points). OECD: 14.8%; Brazil: 27.1%. They are able to
identify information and carry out routine procedures according to
direct instructions in explicit situations. They can perform actions that
are almost always obvious and follow immediately from the given stimuli.

Below 1. OECD: 9.1%; Brazil: 41.0%. No skills are identified.

Source: Adapted by the author from OECD (2019b).
The first observation to make when analysing Table 4.3 is the indication
that 41% of Brazilian students were classified at level "below 1" and
that no skills were identified at this level. For an expert in the field of
psychometrics, the latter information would indicate the need to prepare
items capable of measuring skills at that level of proficiency, and
therefore a technical problem to be solved. However, to the general
public, and the media in particular, the information conveyed is that
41% of Brazilian students have no skills in Mathematics; in other words,
they did not learn Mathematics and are therefore not prepared to become
21st-century citizens. When the percentages of level 1 and level "below
1" are added, the total comes to 68.1% of Brazilian students classified
below level 2, which is considered the basic level, that which all students
should minimally achieve, according to the OECD. This piece of
information is reflected in the headline published by a renowned Brazilian
news site: "Pisa 2018: two thirds of 15-year-old Brazilians know less
than basic Mathematics" (Moreno, 2019).
Regarding the characteristics of the tasks described in Table 4.3, the
passages at levels 4 and 6 refer to interpreting and arguing skills. What
would be the gradation between these levels when we compare the two
passages "[based on interpretations and arguments they make, they] can
explain why these are appropriate to the original situation" and "[based
on interpretations and arguments they make, they] can construct and
communicate explanations"? In the text, the two abilities refer to the
task of explaining something and, at first sight, they seem to concern
similar skills. However, these two levels are 124 points apart on the
scale. In the same way, for levels 1 and 2, the formulations "They can
extract relevant information from a single source and make use of a
single representational mode" and "They are able to identify information
and carry out routine procedures according to direct instructions in
explicit situations" also seem to identify very similar tasks, although the
levels are 62 points apart on the scale.
The passages quoted above refer to information on the tasks that students
are supposed to be able to perform, with the potential to signal pathways
for the improvement of curricula and pedagogic practices. Yet this
information is ineffective, since no explanation is provided for possible
differences in students' proficiency levels. Conversely, by pointing in one
direction for the teaching of Mathematics, based on the discussion of the
construct proposed for the test, it indicates paths for improved
performance in PISA. This paradox ultimately reinforces the fight for
better positions in the ranking. It also imposes practices to be followed
in the classroom through the item models used in the test, which are, it
must be said, very well prepared and quite creative. Still, these items are
developed by specialists with several years of practice, who dedicated
long hours to the preparation of each item and received much feedback
from other specialists who helped improve them until they reached the
level of excellence required by PISA. They are therefore distant from the
reality of most schools and, above all, from the teaching work in the
classroom. Another paradox.
Science
The field of Science underwent the most change throughout the PISA
cycles. In the 2000 cycle, Scientific Literacy was defined as
The document stresses that the test to measure Scientific Literacy must
be prepared on the basis of scientific procedures, adjusting them to the
tasks required by the test, to the knowledge to be mobilized and to the
context in which the tasks are presented. Each of these aspects is detailed
in the OECD text.
In the 2006 cycle, when Science was the major domain, a new definition
of Literacy was presented, including attitudinal aspects drawn from the
students' answers to questions of scientific and technological relevance.
Thus, Scientific Literacy refers to an individual's:
With the new definition, the test to measure Scientific Literacy in 2006
was composed of several item units, each pertaining to a particular
context. The units contained items to measure cognitive aspects, as well
as a final item, at the end of the unit, to measure the students' engagement
with science within the presented context. Examples of these item units
can be found in Annex A of the document (OECD, 2006). Besides the
questions included in the cognitive test, other questions were included in
the questionnaires to measure students' engagement with Science.
In 2015, when Science was once again PISA's major domain, the previous
formulation was improved upon.
This text does not allow for a more in-depth analysis; suffice it to say
that the new formulation comes closer to the one used for the Mathematics
area, especially in specifying the mode of reasoning: scientific in this
context, mathematical in the previous one. Another change concerned
the questions presented at the end of the item units, which were discarded;
the items built to measure engagement with Science are now part of the
questionnaires. According to the text, there were two problems with the
previous formulation: the questions reduced the space for cognitive
items, and a mismatch was observed between the answers given in the
cognitive test and those given in the questionnaires.
A proposal is already being discussed within PISA to introduce further
changes in the formulation of Scientific Literacy for the 2024 cycle. The
proposal suggests three new areas of knowledge: informatics, competencies
for using scientific knowledge in action and decision-making, and
probabilistic reasoning. Another dimension, entitled scientific identity,
would also be added (OECD, 2020b).
Science findings are presented in eight different levels, from 6 down to 2
plus levels "1a" and "1b" (OECD, 2019b, p. 113). To discuss the 2018
results, Table 4.4 presents passages of the text describing the features of
the tasks students managed to perform at each proficiency level in the
2018 report, with level "below 1b" added since that information is
discussed in the text. The selected passages refer to the ability to interpret
data.
Table 4.4 Description of the scale for Science proficiency in Pisa 2018
As was done for the two previous areas, the goal of the analysis is to
point out the weaknesses of the information provided, which should in
principle guide policymakers and indicate paths for teaching. As noted
earlier, the information provided is limited and hinders a clear
understanding of the characteristics of the tasks that students are capable
of performing.
At level 6, it is stated that students can draw on knowledge from
outside the school curriculum. However, as PISA does not reference the
school curricula of the participating countries and economies, and no
comparative study of them was made, it is not possible to identify what
this knowledge would be, since it could pertain to a myriad of possible
topics. Even though the task was performed by merely 0.8% of OECD
students and none from Brazil, the question must be asked: what is the
practical meaning of this information?
raised that the term "expectations" might produce a Base defining
formulations that favoured its use to measure students' performance in
external assessments. For this reason, the term was replaced by "Rights",
so as to reinforce the idea of education as a right and, as such, accessible
to all and removed from any measure of performance.
In 2013, MEC submitted to the National Education Council (CNE)2
a proposal of Rights to Learning for the students of the first three years
of basic education and pledged to complete the remaining Rights in the
first semester of the following year. Once the rights were defined, the
National Common Curriculum Base would be completed and delivered
to society for new debates and further improvement. The proposal was
to be presented in 2014 but was not completed, owing to changes in the
top positions of MEC. Even so, the group involved in its construction
released the preliminary text that was under discussion (Bonini, 2018).
Also in 2013, the Movement for the Base3 began to emerge, a still-active
group gathering several leaders funded by corporations with the goal of
influencing the construction of a national curriculum, taking international
experiences as its basis, especially the Common Core State Standards
Initiative in the USA. The great majority of the people and organizations
gathered around the Movement have exerted strong influence on the
discussions on education policies and on the paths of MEC since the
2000s. Some of these people would come to take on leading positions in
the structure of the Ministry and would prove fundamental to the
creation of the BNCC (Avelar & Ball, 2019).
In 2014, four years after MEC's project had been sent, the PNE was
passed by the National Congress. As in MEC's original proposal, the
Plan uses IDEB to define targets, but it increases the number of strategies
to achieve them. One of the new targets establishes the implementation
of the learning rights and goals that would constitute the common
national curricular base, by 2016.
In mid-2015, during Dilma Rousseff's second term, the discussions
around the president's impeachment escalated, forcing a rearrangement
of forces in the government and leading to new changes in the leadership
of MEC. In the wake of this, the command of SEB was given to the
founder of a social organization that markets and applies standardized
tests for several education secretariats all over Brazil. As a result of these
changes, a new commission composed of 116 professionals was created
to prepare a National Curricular Base. The mandate was to prepare a
proposal along the lines of a national curriculum, something different
from what had gradually been built in the previous years. The document,
completed in September, defined as its goal to signal learning and
development pathways throughout basic education capable of ensuring
that students develop the 12 general skills and others specific to each
area and education stage. Skills, entitled learning objectives, were defined
for each grade, each with its own identification code. This type of
formulation is very similar to the definitions used in the frameworks of
large-scale tests used in Brazil.
This early official version of the BNCC was opened for public consultation
on the Internet between October 2015 and March 2016. According to
MEC, there were over 12 million contributions to the text but, analysing
the data, researchers claim that no more than 150,000 people contributed,
some of them making several quite piecemeal submissions (Cassio, 2019).
In April, MEC concluded the second version of the document. This
version was forwarded to CNE, which would hold conferences in the
states, with the entities aggregating state and municipal education
secretaries, to debate the document and collect suggestions (Cassio, 2019).
A month later, in May, President Rousseff was removed from the
presidency to face the impeachment process, and Vice President Michel
Temer took over. With his ascent to power, the MEC structure was
changed again, and the same happened to the composition of CNE, with
the repeal of the decree, signed by the ousted President, nominating 12
of its 24 members. A body with crucial technical functions would become
the stage of an intense political dispute over the BNCC's approval.
The new Minister, Rossieli Soares da Silva, in one of his first initiatives,
created the Managing Committee of the Common National Curricular
Base and the Reform of Middle Education, in which only MEC managers
participated, presided over by Maria Helena Castro, Executive Secretary
of MEC, the same person who in 1999 had advocated education grounded
in competencies and skills and the definition of education standards. In
September 2016, Consed and Undime, bodies representing the state and
municipal education leaders, sent the Minister of Education and the
Managing Committee a report containing criticism and recommendations
for a revision of the BNCC. According to Castro (2020), the document
stated the leaders' preference for a competencies- and skills-based BNCC,
and on the grounds of this preference the Managing Committee started
drafting a new version of the BNCC. According to the author, competency
"is a way of mediating the right to learn and know how, so that they can
be followed by the teacher, by the school, by the family and by the
system" (p. 106).
When the third version arrived at the CNE, in April 2017, still incomplete,
since it covered only two of the three stages of basic education, one of
the counsellors claimed that
the Union must define, in collaboration with the States, the Federal
District and the Municipalities, competencies and guidelines for
childhood education, basic education and middle education, which
will guide curricula and their minimum contents, so as to ensure
common basic schooling.
(Law 9.394/96)
It can be understood from this passage that the discussions on the BNCC
also mobilized the energies of those who favoured the expansion of
external evaluations.
The discussion on the BNCC was influenced both by the troubled
national context and by the debate involving two opposing models: one
advanced by the forces that since 1995 had been expanding their
influence on MEC, and another that defended the proposals built by
MEC over the course of more than ten years. This is reflected in the
Resolution which approved the BNCC (MEC/CNE, 2017). Its text states
that the phrase "competencies and skills" must be considered equivalent
to the phrase "learning rights and goals" present in the Law of the PNE.
precedes the first version of the BNCC. For the other stages of basic
education, no reference is made to any right, except for vague mentions
of the right to learn.
The propositions for Fundamental Education are divided into five areas
of knowledge: Languages, covering the curricular components Portuguese,
Art, English and Physical Education; Mathematics; Natural Sciences;
Human Sciences, including History and Geography; and Religious
Education. In the particular case of the area of Languages, placing side
by side such different curricular components as Portuguese, Art and
Physical Education creates the false idea that these components bear
some proximity to one another. This is artificial, since the types of
knowledge each of them structures are quite distinct.
The text of BNCC mentions conceptions of Literacy, as does PISA,
but uses the concept in a superficial manner. In the area of Languages,
for instance, it refers to “Literacies”, “Literacies of the letter and of
the printed word” and “new essentially digital literacies” (MEC, 2019,
p. 69) without defining them.
Regarding Mathematics, it refers to Mathematical Literacy as being
Despite indicating the concept used by PISA as the basis for this definition,
the text does not discuss how the formulation used in the BNCC was
developed, nor its proximity to PISA or its fit with the reality of Brazilian
education.
The notion of Literacy used in the area of Sciences is not very robust
either. Despite defining it as “the ability to understand and interpret the
world (natural, social and technological), but also to transform it based
on the theoretical and procedural input from the sciences” (MEC, 2019,
p. 321), it does not discuss the implications of this formulation nor does
it indicate the path taken to arrive at this conception. As was the case
with Mathematics, the text refers to the skills that must be developed on
the basis of the competencies defined for this area, but does not clarify
the connection between the Literacy announced at the beginning of the
text and these two other concepts.
Each area of knowledge and each curricular component define a set of
specific competencies. Each curricular component, based on the defined
specific competencies, lists a set of skills to be developed in each school
grade. Comparing the skills of different areas, we observe that a learning
progression is sought from three elements: cognitive processes
Final remarks
PISA has become an instrument with so much credibility that it influences
education policies all over the world. Its cognitive tests and questionnaires
are well prepared and technically quite solid, and they have been receiving
input from leading world experts for their continuous improvement, a
clear sign of the concern to add innovations and perfect the measurements
carried out. Many of the changes it makes are debated and announced
several years in advance, in a clear demonstration of care to allow
countries and economies to prepare their education systems for the
following PISA cycles. Besides the necessary transparency that such a
powerful tool must have, the concern to avoid sudden changes in the
relative positions of the countries and economies is evident, mainly
owing to the importance that the ranking has in the strategies used to
disseminate its results.
Its constructs (the definitions of what will be measured) are presented on
a solid theoretical basis and end up indicating what should be taught and
how curricula should be structured. PISA also indicates to the countries the
pathways to be trodden to improve the results of forthcoming cycles.
The discussion on education policies is the essence of PISA, the element
that mobilizes politicians and the media, and which exerts influence on
societies. Having a good position in the ranking means being on the
right track to educate the citizen for the 21st century, as if there were a
single model for this or as if the conditions or starting points for all the
schools and their students were similar. This whole rationale gains an
Notes
1 In an OECD publication on Brazil (OECD, 2019c), this fact is discussed,
albeit quite superficially. In OECD (2016, p. 181), a brief discussion is presented
on the variations of performance among different countries and a trend curve
is presented for the countries’ performance in the various cycles, although the
variations are not subject to in-depth discussion (see Figure I.5.3, available at
http://dx.doi.org/10.1787/888933432623).
2 CNE is a regulatory body linked to MEC, responsible for issuing norms
based on its interpretation of the LDB. It is made up of 24 members,
nominated by civil society entities and appointed by the country's President
for a four-year term.
3 The Movement for the Base (Movimento pela Base, in Portuguese) defines its
mission thus:
The BNCC defines the learning and development rights of all Brazilian
children and youth. We work to ensure that these rights are fulfilled,
supporting the quality implementation of the BNCC and the New Secondary
Education in all networks and public schools in the country. We monitor
and give visibility to the progress of the implementation on various
fronts. We advocate for the alignment of policies and programs (curricular,
teacher education, teaching materials and assessments) with
References
Addey, C. (2016). Pisa for development and the sacrifice of policy-relevant
data. Educação e Sociedade, 37(136), 685–706. https://doi.org/10.1590/
es0101-73302016166001
Aguiar, M., & Tuttman, M. (2020a). Breve histórico do processo de elaboração
da Base Nacional Comum Curricular no Brasil. In A. Santos & M. Ferreira
(Eds.), Base Nacional Comum Curricular, qualidade da educação e autonomia
docente (pp. 95–102). Em Aberto, 33(107). INEP.
Aguiar, M., & Tuttman, M. (2020b). Políticas educacionais no Brasil e a Base
Nacional Comum Curricular: disputas de projetos. In A. Santos & M. Ferreira
(Eds.), Base Nacional Comum Curricular, qualidade da educação e autonomia
docente (pp. 69–94). Em Aberto, 33(107). INEP.
Avelar, M., & Ball, S. (2019). Mapping new philanthropy and the heterarchi-
cal state: The Mobilization for the national learning standards in Brazil.
International Journal of Educational Development, 64, 63–73. https://doi.
org/10.1016/j.ijedudev.2017.09.007
Bauer, A., Pimenta, C., Souza, S., & Horta Neto, J. L. (2015). Avaliação em
larga escala em municípios brasileiros: o que dizem os números? Estudos
de Avaliação Educacional, 26(62), 326–352. http://dx.doi.org/10.18222/
eae266203207
Bonini, A., Druck, I., & Barra, E. (2018). Direitos à aprendizagem e ao desenvolvi-
mento na educação básica: subsídios ao currículo nacional. UFPR. Retrieved
August 20, 2020, from https://acervodigital.ufpr.br/bitstream/handle/1884/55911/
direitos_a_aprendizagem_e_ao_desenvolvimento_na_educacao_basica_sub
sidios_ao_curriculo_nacional-preprint.pdf?sequence=1&isAllowed=y
Carvalho, L. (2009). Governando a educação pelo espelho do perito: uma análise
do Pisa como instrumento de regulação. Educação e Sociedade, 30(109),
1009–1036.
Carvalho, L. M. (2020). Revisiting the fabrications of PISA. In G. Fan & T. S.
Popkewitz (Eds.), Handbook of education policy studies: School/university,
curriculum, and assessment, vol. 2 (pp. 259–273). Springer.
Cassio, F. (2019). Existe vida fora da BNCC? In F. Cássio & R. Catelli Jr. (Eds.),
Educação é a Base? 23 educadores discutem a BNCC (pp. 13–39). Ação
Educativa.
Pisa and curricular reforms in Brazil 101
The PISA Programme uses tests (for students aged 15) and questionnaires (for
students, teachers, principals and parents). PISA questionnaires are public,
but the large majority of test items are kept confidential for reuse and
scaling across years. Critics of PISA argue that this makes it difficult to
analyse the validity of PISA tests, particularly across different countries
and cultures since the items are initially written in English or French. This
chapter describes a study about how students from different age groups,
education levels and courses answer PISA items and how they evalu-
ate different aspects of the items. We hypothesized that PISA items are
too difficult for the majority of students aged 15 and that they are more
appropriate for older students. We administered a booklet with two sets
of publicly available Mathematics items (from PISA 2012) and Science
items (from PISA 2015) to a non-representative sample of 839 Portuguese
students from Basic, Secondary and Higher Education (approximately
50% were over 18 years old), from different types of courses,
schools, polytechnics and universities, public or private. Students were
asked to answer the items and to evaluate different aspects of each item
(e.g. “understanding” the question and assessing its “difficulty”).
Comparing the scores on our student sample and the scores on the
same items in PISA tests (in 2012 and 2015), for both Portugal and
OECD countries (students aged 15), we found similar results across all
age groups of our sample, in both Mathematics and Science. This sug-
gests that PISA tests can target older students and that knowledge and
skills at age 15 are globally similar to knowledge and skills at older ages
(at least in Mathematics and Science). We also found that item facility
has a significant positive correlation with students' assessment of their
comprehension of the item text, with their certainty that their answer was
correct, and with their assessment of the item's difficulty. On the other
hand, item facility has no significant correlation with whether students
had studied the content of the item, as reported by the students themselves.
DOI: 10.4324/9781003255215-6
Testing PISA tests 105
Regarding the test items and what they measure, Hopmann (2008)
argues that there is no research showing that PISA covers enough to be
representative of the school subjects involved and UNESCO (2019) says
that their items have focused too much on a narrow set of subjects and
fail to capture what is important to education systems. Hopfenbeck et al.
(2018) state that PISA may be measuring different abilities in students of
different languages because there is a high degree of differential functioning
in relation to language, to which Zhao (2020) adds the problem that in some
languages the texts are much longer than in others, even though the time
given to administer the test is the same for all countries. Sjøberg (2015,
2019) states that the OECD's attempt to decontextualize the items so as not
to disadvantage some countries over others goes against the recommendations
of educators and against the choices of national leaders who want science
to be relevant, interesting and linked
to a context. Araújo et al. (2017) also question whether the multidimen-
sionality of the items and the abstract skills that PISA intends to measure
fit a response model that ultimately boils down to a single score and say
that “it is less than clear in PISA which dimensions are being measured
and which suppressed” (p. 4).
This study is based on a Question Booklet with mathematics and sci-
ence items, from 2012 and 2015, respectively, released by PISA. In addi-
tion to these items, for each item, we included four questions “Did you
fully understand the text?”, “Have you ever studied something related to
the subject of the question?”, “How sure are you of your answer?” and
“How do you rate the difficulty of this question?”, to be answered by
students from elementary, secondary and higher levels on a four-level Likert
scale. We compare the results, by age, area and education level, to find
out if they are similar or not to those obtained by Portuguese and OECD
students who took the 2012 and 2015 PISA tests, against the background
of the criticism to PISA, namely those related to its items.
2 Methods
In this study, we use two types of methodological analysis: quantitative
and qualitative. In the first, we assume that the values presented in the
PISA reports are reliable and that the samples referred to and used in the
studies are representative. In the second, we used an interpretative quali-
tative approach, following the perspectives of Rémond (2006), Mullis
et al. (2009) and Rosa et al. (2020), among others. We resorted to the
reports and databases produced by the OECD. Since the first cycle of
PISA, in 2000, Portugal has registered a significant improvement in the
results obtained in the different literacies. In the 2018 edition of PISA,
in a ranked list of 79 participants, Portugal was ranked 24th in scientific
literacy, 24th in reading literacy and 22nd in mathematical literacy, with
492 points in each domain, above the OECD average in all domains.
For this study, nine groups of mathematics and science items were
selected from the 2012 and 2015 PISA tests. In 2012, PISA focused the
assessment on mathematics and in 2015 on science. Within the criteria
for choosing the items, we considered their different degrees of difficulty.
To score the answers given, we followed the classification assigned by the
OECD (2013): one or two points for full credit, depending on the question;
one point for partial credit, depending on the question; and zero for a
wrong answer. With these items, a Question Booklet was designed to be
administered to 15-year-old students, the target audience of PISA, and to
students attending higher education, allowing the statistical comparison
needed to draw safe conclusions and to test our starting hypothesis about
the unsuitability of the items.
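The scoring rule just described can be sketched as follows; the function and the answer encoding are our own illustration of the OECD (2013) classification, not its actual coding guide.

```python
def score_answer(max_points: int, credit: str) -> int:
    """Score one item. max_points is 1 or 2, depending on the question;
    credit is 'full', 'partial' or 'wrong' (our own encoding)."""
    if credit == "full":
        return max_points  # one or two points, full credit
    if credit == "partial":
        # We assume partial credit is only defined for two-point items.
        return 1 if max_points == 2 else 0
    return 0  # wrong answers score zero

# A booklet score is the sum over its items (hypothetical answers):
answers = [(2, "full"), (1, "full"), (2, "partial"), (1, "wrong")]
total = sum(score_answer(m, c) for m, c in answers)
print(total)  # → 4
```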
2.2 Instrument
The Question Booklet comprises 36 pages, divided into three parts. Part
0 with one group of Mathematics items (from PISA 2012) and one group
of Science items (from PISA 2015); Part 1 with four groups of Mathematics
items; and Part 2 with three groups of Science items. It also includes,
before Part 0, sociodemographic background questions for students
(name of institution, school year and course attended, date of birth, gen-
der, attendance of pre-school education and grade repetition) and for
parents/carers (professional activity and academic qualifications). Some
items, applied by PISA in digital format, were slightly adapted to paper
format. For each item, we also added questions to assess four aspects of
the items using a four-point Likert scale.
The students who agreed to participate in the study were asked to fill in
Part 0 (common to all) and then only Part 1 or Part 2. When the time allowed
for completing the Question Booklet (approximately 60 minutes) permitted,
some students answered all parts.
handed in, in particular a copy of the Question Booklet and the letter/
form with the request for authorization from parents/guardians. Once
the authorizations had been obtained, a day and time was then sched-
uled for the application of the Question Booklet. In higher education,
the teachers of different degrees, bachelor’s and master’s, were contacted
directly. To ensure the proper application of this evaluation instrument,
a member of the research team was always present. The application
of the Question Booklet took place between 7 October 2019 and 3
March 2020, having been interrupted with the pandemic crisis (SARS-
CoV-2) and the closure of educational establishments at national level.
The sampling process was by convenience. A total of 839 valid book-
lets were collected:
Table 5.1 Students who answered all the items of the booklet on the PISA test
OECD Portugal
Mathematics items, 2012 79,577 (from 295,416, 26.9%) 1,759 (from 5,722, 30.7%)
Science items, 2015 6,669 (from 248,620, 2.7%) 187 (from 7,325, 2.6%)
110 Vítor Duarte Teodoro et al.
Table 5.2 and Figure 5.1 also show the difference between each group
of our sample and the means for the OECD and for Portuguese students
who answered the same items on the PISA 2012 study, as well as the
statistical significance of these differences.
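The significance columns in Table 5.2 can be approximated from the reported group means and standard errors. A minimal sketch, assuming a two-sample z-test on independent means (our reading of the comparison; the chapter does not specify the exact test used):

```python
import math
from statistics import NormalDist

def mean_diff_significance(mean_a, sem_a, mean_b, sem_b):
    """Two-sided p-value for the difference between two group means,
    treating the groups as independent and using the reported SEMs."""
    z = (mean_a - mean_b) / math.hypot(sem_a, sem_b)
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Class Sup_B_1 (mean 8.20, SEM 0.258) vs. the Portuguese PISA 2012
# sample (mean 5.33, SEM 0.069): difference 2.87, reported as 0.000.
p_sup_b = mean_diff_significance(8.20, 0.258, 5.33, 0.069)

# Class Bas_A_1 (mean 5.14, SEM 0.672): difference -0.19, not significant.
p_bas_a = mean_diff_significance(5.14, 0.672, 5.33, 0.069)
```

With these numbers the first comparison yields p < 0.001 and the second p ≈ 0.78, matching the pattern of reported significance values.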
We conclude that:
Table 5.2 Sum of scores of the Mathematics items, by school class, age and
education level
School code / Age / Education level | N | Mean | Median | SD | SEM | Diff to mean PISA PT 2012 | Sign. | Diff to mean OECD 2012 | Sign.
Bas_A_1 14 5.14 5.0 2.515 0.672 -0.19 -0.68
Bas_B_1 14 5.00 4.8 1.629 0.435 -0.33 -0.82
Bas_C_1 15 4.23 3.5 1.963 0.507 -1.10 -1.59 0.038
Sec_A_1 20 5.80 6.0 2.256 0.504 0.47 -0.02
Sec_A_2 9 5.44 6.0 2.709 0.903 0.11 -0.38
Sec_B_1 9 3.67 3.5 1.521 0.507 -1.67 -2.16 0.029
Sec_C_1 14 5.46 5.5 1.562 0.418 0.13 -0.36
Sec_D_1 22 6.93 7.0 2.295 0.489 1.60 0.010 1.11
Sec_E_1 23 4.91 4.5 1.992 0.415 -0.42 -0.91
Sec_F_1 9 5.94 6.0 1.878 0.626 0.61 0.12
Sec_G_1 10 5.95 6.0 2.386 0.754 0.62 0.13
Sec_H_1 10 4.85 4.8 2.122 0.671 -0.48 -0.97
Sec_I_1 16 3.59 3.3 1.666 0.416 -1.74 0.017 -2.23 0.003
Sup_A_1 35 5.54 5.5 1.888 0.319 0.21 -0.28
Sup_B_1 45 8.20 8.5 1.733 0.258 2.87 0.000 2.38 0.000
Sup_C_1 32 5.75 5.8 2.275 0.402 0.42 -0.07
Sup_D_1 9 5.17 5.5 1.500 0.500 -0.17 -0.66
Sup_D_2 48 7.18 7.5 2.294 0.331 1.84 0.000 1.35 0.002
Sup_D_3 14 5.86 6.0 2.499 0.668 0.52 0.03
Sup_D_4 39 5.87 6.0 2.022 0.324 0.54 0.05
Sup_E_1 20 5.60 5.5 2.204 0.493 0.27 -0.22
Sup_F_1 14 4.89 5.0 1.767 0.472 -0.44 -0.93
Sup_G_1 26 4.42 4.5 1.798 0.353 -0.91 -1.40 0.016
Sup_G_2 22 4.95 5.0 2.149 0.458 -0.38 -0.87
Sup_G_3 35 4.63 4.5 2.217 0.375 -0.71 -1.20 0.017
15 years 48 5.16 5 2.307 0.333 -0.18 -0.67
16 years 98 5.61 6 2.149 0.217 0.28 -0.21
17 years 28 4.59 4 2.135 0.403 -0.74 -1.24 0.028
18 years 53 5.28 5 2.046 0.281 -0.05 -0.54
19 years 65 5.94 6 2.168 0.269 0.60 0.11
20 years 51 6.00 6 2.133 0.299 0.67 0.18
21 years 36 5.76 6 2.427 0.405 0.43 -0.06
22 years 58 6.68 7 2.696 0.354 1.35 0.001 0.86 0.028
23 years 26 6.46 8 2.687 0.527 1.13 0.050 0.64
> 23 years 46 5.51 5 2.323 0.342 0.18 -0.31
Basic 43 4.78 4.5 2.057 0.314 -0.56 -1.05 0.021
Secondary 142 5.35 5.3 2.249 0.189 0.01 -0.48
Higher Education 339 5.95 6.0 2.351 0.128 0.61 0.000 0.12
Total 524 5.69 5.5 2.328 0.102 0.35 0.011 -0.14
PISAPT 1759 5.33 5.0 2.908 0.069 -0.49 0.000
OECD 79577 5.82 6.0 2.972 0.011
Figure 5.1 Sum of scores of the Mathematics items, by age, box plot and frequency curve
Table 5.3 and Figure 5.2 also show the difference between each group of our
sample and the means for the OECD and for Portuguese students who
answered the same items on the PISA 2015 study, as well as the statistical
significance of these differences.
It is possible to conclude that:
• The mean for the 187 Portuguese students who answered in PISA
2015 and answered all the items of the booklet is 6.25, and is not
significantly different from the mean 6.19 of all 6,669 OECD students
(Portugal 2015 score on PISA Science Literacy was 501, significantly
higher than 493, the OECD’s score).
• The mean of the 496 students of our sample, 6.54, is not statistically
different from the mean of the 187 Portuguese students who
answered in PISA 2015 (6.25) and is significantly higher than the
mean of the 6,669 students of the OECD sample (6.19).
• Of the 25 school classes of our sample, seven have a mean that is
statistically different from the 6.25 mean of the 187 Portuguese
students who participated in the PISA 2015 (four have a lower value
and the other three have a higher value).
• By school level, only the Basic Education group (38 students) has
a significantly lower mean than the 187 Portuguese students who
participated in PISA 2015.
• Of the age groups, only students aged 22 or 23 have a mean signifi-
cantly higher than the 187 Portuguese students who answered in the
PISA 2015; these students are from two classes of engineering course
students (Sup_B_1 and Sup_D_2), who are expected to have better
skills in science and mathematics.
As a global conclusion, we also have evidence that our sample of 496
students answered the Science items in a similar way to the 187 Portuguese
students who answered in PISA 2015, across all ages.
Table 5.3 Sum of scores of the science items, by school class, age and educa-
tion level
School code / Age / Education level | N | Mean | Median | SD | SEM | Diff to mean PISA PT 2015 | Sign. | Diff to mean OECD 2015 | Sign.
Bas_A_1 14 6.14 6.5 2.742 0.733 -0.11 0.32
Bas_B_1 12 4.42 4.0 1.782 0.514 -1.83 0.020 -1.41 0.020
Bas_C_1 12 4.58 4.5 1.621 0.468 -1.67 0.033 -1.24 0.036
Sec_A_1 14 8.71 9.0 2.268 0.606 2.46 0.001 2.89 0.000
Sec_A_2 8 8.75 9.5 1.581 0.559 2.50 0.009 2.93 0.006
Sec_B_1 8 5.38 5.5 2.066 0.730 -0.88 -0.45
Sec_C_1 15 6.87 7.0 2.416 0.624 0.62 1.04
Sec_D_1 18 6.89 7.0 1.937 0.457 0.64 1.06
Sec_E_1 21 5.86 6.0 1.797 0.392 -0.39 0.03
Sec_F_1 9 7.89 8.0 1.269 0.423 1.64 2.06
Sec_G_1 9 6.11 5.0 2.205 0.735 -0.14 0.29
Sec_H_1 9 7.89 8.0 1.537 0.512 1.64 2.06
Sec_I_1 20 4.20 4.0 3.037 0.679 -2.05 0.001 -1.62 0.001
Sup_A_1 35 6.80 7.0 1.967 0.333 0.55 0.98
Sup_B_1 44 9.18 9.0 1.660 0.250 2.93 0.000 3.36 0.000
Sup_C_1 32 5.66 6.0 2.610 0.461 -0.60 -0.17
Sup_D_1 7 6.57 7.0 2.225 0.841 0.32 0.75
Sup_D_2 35 7.06 7.0 2.287 0.387 0.81 1.23
Sup_D_3 14 6.79 7.0 1.718 0.459 0.53 0.96
Sup_D_4 38 6.16 6.0 2.150 0.349 -0.09 0.33
Sup_E_1 20 6.90 7.0 1.744 0.390 0.65 1.08
Sup_F_1 12 6.58 6.5 1.832 0.529 0.33 0.76
Sup_G_1 24 6.38 7.0 1.974 0.403 0.12 0.55
Sup_G_2 24 4.58 5.0 2.062 0.421 -1.67 0.003 -1.24 0.003
Sup_G_3 28 6.18 6.0 2.245 0.424 -0.07 0.35
15 years 37 6.38 6 2.822 0.464 0.13 0.55
16 years 96 6.36 7 2.318 0.237 0.11 0.54
17 years 26 6.62 6 2.499 0.490 0.36 0.79
18 years 54 6.54 7 2.337 0.318 0.29 0.71
19 years 71 6.44 6 2.034 0.241 0.19 0.61
20 years 52 6.77 7 1.986 0.275 0.52 0.94
21 years 31 6.35 7 2.715 0.488 0.10 0.53
22 years 48 7.25 8 2.957 0.427 1.00 0.024 1.43 0.005
23 years 26 7.81 9 2.593 0.508 1.56 0.006 1.98 0.002
> 23 years 41 5.59 6 2.280 0.356 -0.67 -0.24
Basic 38 5.11 4.5 2.240 0.363 -1.15 0.014 -0.72 0.012
Secondary 133 6.62 7.0 2.566 0.223 0.37 0.80
Higher Education 325 6.67 7.0 2.352 0.130 0.42 0.85 0.001
Total 496 6.54 7.0 2.433 0.109 0.29 0.35 0.004
PISAPT 187 6.25 6.0 2.661 0.195 0.07
OECD 6669 6.19 6.0 2.638 0.032
Figure 5.2 Sum of scores of the Science items, by age, box plot and frequency curve
Figure 5.3 Item facility, correlations by education level, PISA PT, PISA OECD
Figure 5.4 Item discrimination, correlations by education level, PISA PT, PISA OECD
For each scale, there are four grades. Figure 5.5 shows the bar charts,
by education level, for the means in each scale for each item. This figure
also shows all item facilities. In an overall analysis, we see that students’
assessment of items is similar in the three education levels, in the four
scales, with the exception of some more difficult items for Basic Educa-
tion students.
Figure 5.6 shows scattergrams and Pearson’s coefficient of the correla-
tions between item facility and the scales used by students to assess the
items, by education level.
The data show positive linear correlations, with statistical
significance (Pearson's coefficients from 0.198 to 0.755), between each
scale and item facility, for the total of students in our sample and for
each education level (14 out of 16 cases). There are only two exceptions:
the scale “Have you ever studied something related to the subject of the
question?” has no significant correlation for the total of students in
our sample or for the higher education level.
In synthesis, item facility has a significant positive correlation
with students' assessment of:
Figure 5.5 Item assessment by students, by education level, mean for each scale
We can see that there is always a positive linear correlation, with sta-
tistical significance (Pearson’s coefficient from 0.205 to 0.955), between
each pair of scales (a total of six cases). Note that the lowest value of the
correlation is between the “Have you ever studied something related to
the subject of the question?” and “How do you rate the difficulty of this
question?” scales. This result reinforces our earlier conclusion about
one of the most important PISA assumptions: that PISA items are independent
of the school curriculum.
4 Conclusions
PISA provides comparative data on 15-year-old students who are at least
in the 7th grade. The study aims to see whether the schools of participating
countries/economies prepare students to play the role of informed
citizens. PISA does not intend to assess the school curriculum, but the
skills that students have acquired for active life. In that sense, it aims to
assess how well students mobilize their skills in three domains of literacy:
reading, mathematics and science. More recently, it has also looked at
collaborative problem-solving and financial literacy.
PISA uses questionnaires (for students, teachers, principals and par-
ents) and tests (for students). The questionnaires are public, but the large
majority of test items are kept confidential to allow comparability across
participating countries/economies over the various cycles.
Using a set of mathematics and science items released by the OECD,
we conducted a national study, applying a booklet with some items to
students from different educational levels (Basic, Secondary, Higher
Education), aged from 15 to over 23 years. We had put forward the
hypothesis that PISA items are too difficult for most students aged
15 and that they are more appropriate for older students. We administered
a booklet with two sets of items: Mathematics (PISA 2012) and Science
(PISA 2015). The sample used was by convenience. We administered 839
booklets to students from different types of courses, schools and higher
education establishments, public or private. They were asked to answer
the items and to assess different aspects of each item with four Likert
scales: “Have you fully understood the text?”; “Have you ever studied
anything related to the subject of the question?”; “How sure are you of
your answer?”; and “How do you rate the difficulty of this question?”.
We compared the scores of our sample with the scores of the items in
PISA (in 2012 and 2015), both for Portugal and for OECD countries
(students aged 15), and we found that:
• Results are similar in all age groups in the two literacies under analy-
sis; this means that PISA tests may have a broader purpose, involving
a higher age group, which allows us to accept our starting hypoth-
esis. The knowledge and skills of 15-year-olds are similar to the
knowledge and skills of older age groups.
• Results also allow us to conclude that item facility has a significant
positive correlation with students' evaluation of the items: with their
evaluation of how well they understood the item text, with their
certainty about their answer, and with their evaluation of the item's
difficulty. It is also possible to ascertain that item facility has no
significant correlation with whether students had studied the item's
content.
References
Araújo, L., Saltelli, A., & Schnepf, S. (2017). Do PISA data justify PISA-based
education policy? International Journal of Comparative Education and Devel-
opment, 19(1), 1–17. https://doi.org/10.1108/IJCED-12-2016-0023
Carvalho, L. M., & Costa, E. (2009). Production of OECD’s ‘programme for
international student assessment’: Final report. Project KNOWandPOL, WP
11, March. www.knowandpol.eu/IMG/pdf/o31.pisa.fabrication.pdf
Carvalho, L. M., Costa, E., & Sant’Ovaia, C. (2020). Depicting the faces of
results-oriented regulatory processes in Portugal: National testing in policy
texts. European Educational Research Journal, 19(2), 125–141. https://doi.
org/10.1177/1474904119858799
Conselho Nacional de Educação. (2013). Avaliações internacionais e desem-
penho dos alunos portugueses. Conselho Nacional de Educação. www.
cnedu.pt/content/edicoes/seminarios_e_coloquios/LIVRO_Avaliacoes_
internacionais.pdf
Hopfenbeck, T., Lenkeit, J., Masri, Y., Cantrell, K., Ryan, J., & Baird, J.-A.
(2018). Lessons learned from PISA: A systematic review of peer-reviewed
articles on the programme for international student assessment. Scandinavian
Journal of Educational Research, 62(3), 333–353. https://doi.org/10.1080/00
313831.2016.1258726
Hopmann, S. T. (2008). No child, no school, no state left behind: Schooling in the
age of accountability. Journal of Curriculum Studies, 40(4), 417–456. https://
doi.org/10.1080/00220270801989818
Kreiner, S., & Christensen, K. B. (2014). Analyses of model fit and robust-
ness. A new look at the PISA scaling model underlying ranking of countries
according to reading literacy. Psychometrika, 79(2), 210–231. https://doi.
org/10.1007/s11336-013-9347-z
Marôco, J. (2020). International large-scale assessments: Trends and effects on
the Portuguese public education system. In H. Harju-Luukkainen, N. McEl-
vany, & J. Stang (Eds.), Monitoring student achievement in the 21st cen-
tury. European policy perspectives and assessment strategies (pp. 207–222).
Springer. https://doi.org/10.1007/978-3-030-38969-7_17
Mullis, I., Martin, M., Kennedy, A., Trong, K., & Sainsbury, M. (2009). PIRLS
2011 assessment framework. TIMSS & PIRLS International Study Center,
Lynch School of Education, Boston College. https://timssandpirls.bc.edu/
pirls2011/framework.html
Organisation for Economic Co-operation and Development. (2009). PISA
data analysis manual SAS second edition. OECD. https://doi.org/10.1787/
9789264056251-en
Organisation for Economic Co-operation and Development. (2013). PISA 2012 –
Released mathematics items. www.oecd.org/pisa/pisaproducts/pisa2012-
2006-rel-items-maths-ENG.pdf
International large-scale assessment
Issues from Portugal's participation in TIMSS, PIRLS and ICILS
Vítor Rosa
Introduction
International large-scale assessment studies produce information
and indicators on the knowledge and skills of students from differ-
ent education systems. These evaluations, in the field of education,
have acquired great importance in recent decades. Governments from
various political quarters started to use the results of these studies,
with the aim of improving investments and achieving better school
performance.
Although international large-scale assessments (ILSAs) are currently
considered by many to be a regular feature of the education landscape,
they are a relatively recent phenomenon. Their origins can be traced
back to the pilot survey of the International Association for the Evalua-
tion of Educational Achievement (IEA) regarding student performance
assessment conducted in the 1960s (Rosine & Postlethwaite, 1994).
Since then, there have been significant developments. The number
of organizations responsible for the development, management and
administration of ILSAs has grown from one to seven major actors:
IEA, the Conference of Ministers of Education of French-Speaking
Countries (CONFEMEN), the Inter-American Development Bank
(IDB), the Organization for Economic Cooperation and Development
(OECD), the Southern and Eastern Africa Consortium for Monitor-
ing Educational Quality (SACMEQ), the United Nations Educational,
Scientific and Cultural Organization (UNESCO), and the World Bank
(Lietz et al., 2017; Wagemaker, 2014). Currently, it is estimated that
about 70% of the world’s countries participate in ILSAs (Lietz et al.,
2017).
Measurement methodologies have evolved considerably. The ILSAs
Programme for International Student Assessment (PISA), Progress in
International Reading Literacy Study (PIRLS) and Trends in International
Mathematics and Science Study (TIMSS) have their roots in the long-term
trend assessment methodologies of the National Assessment
DOI: 10.4324/9781003255215-7
International large-scale assessment 127
and science, which, in the view of Marôco et al. (2016a), are considered
domains or literacies that are
essential in the training of students who opt for education paths asso-
ciated with professional areas internationally known as STEM (Sci-
ence, Technology, Engineering and Mathematics).
(pp. 5–6)
Table 6.1 Distribution of science items (content area, cognitive dimension and item type), TIMSS 2015
and reading to acquire and use information. Literary texts are complete
stories accompanied by illustrations; they should familiarize students
with plot, events, actions, character motivations, etc. Informational texts
enable one to address aspects of the real world, covering a wide variety of
subjects. They are accompanied by structuring and illustrative elements,
such as diagrams, letters, illustrations, photographs, lists, and tables,
among others.
In PIRLS, the population studied does not correspond to a generation
of students of a defined age, as is the case with the OECD's PISA Programme.
The goal is to measure the performance of the set of students at
a particular level of education, regardless of their age, their path and the
organization of the education system.
The countries/regions that participate in the study identify the strengths
and weaknesses of students’ reading skills. As mentioned earlier, Portugal
participated in 2011 and then in 2016 (3rd and 4th editions). In the 2016
edition, the study integrated a new dimension in the evaluation of read-
ing literacy: online (ePIRLS), involving 14 countries, including Portugal.
In the PIRLS/ePIRLS studies, through paper and digital media, infor-
mation about students, their families, teachers and schools is collected,
contextualizing “the reading learning opportunities as well as identifying
factors that influence these opportunities” (IAVE, 2017, p. 3). Within the
framework of PIRLS, the assessment protocol (texts and items) is based
on the intersection of comprehension processes and reading objectives
(Campbell et al., 2001).
The use of reading serves two purposes: literary experience and the
acquisition and use of information.
Ferreira and Gonçalves (2013) emphasize that
Rémond (2007) states that “the results of PIRLS lead us to think that the
tasks proposed to the students by the school are very limited” (p. 67).
Students feel “bewildered” by the complex and successive questions,
which must be answered without the teacher’s help (Rémond, 2007).
Regarding the Portuguese reality, and in a more comprehensive manner,
Benavente (2016) highlights several constraints at school level: lecture-
based classes, high number of students per class, needs dissociated
from the reality of the children and young people, economic competition
between schools, dismissal of thousands of teachers and support staff,
curriculum and syllabus reformulation, long and unsuitable syllabus,
obstacles to the integration of children and young people with specific
When compared to other studies (PISA, TIMSS and PIRLS), the number
of participating countries/regions is lower. In 2018, 12 countries par-
ticipated in the CIL evaluation (Chile, Denmark, United States of Amer-
ica, Finland, France, Germany, Italy, Kazakhstan, Republic of Korea,
Luxembourg, Portugal and Uruguay) and two regions in benchmarking
(Moscow – Russian Federation, and North Rhineland-Westphalia – Ger-
many) and eight countries (Denmark, Finland, France, Germany, Portu-
gal, Luxembourg, United States, Republic of Korea) and 1 region (North
Rhine-Westphalia – Germany) participated in the CT assessment.4
134 Vítor Rosa
In total, the ICILS study gathered information from 46,561 8th graders and
26,530 teachers from 2,226 schools.
ICILS follows the same structure as TIMSS and PIRLS with regard to the numerical scale, which ranges from 0 to 1,000 points, with a fixed central point at 500 points (average performance) and a standard deviation of 100 points.
ICILS is structured within a conceptual framework of reference, where
the analysis dimensions and content areas assessed in the two domains
under consideration (CIL and CT) are defined. The test5 consists of dif-
ferent levels of difficulty of tasks, as well as levels of performance pro-
ficiency. Proficiency level 1 is between 407 and 491 points, proficiency
level 2 is between 492 and 576 points, proficiency level 3 is between 577
and 661 points and proficiency level 4 is higher than 661 points.6
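These scale conventions and cut-scores can be summarized in a small sketch; the function names here are ours (purely illustrative, not part of any IEA tooling), and the cut-points are the ones quoted above, with scores below 407 treated as falling below level 1.

```python
def iea_scale(z: float) -> float:
    """Map a standardized result (z-score) onto the IEA reporting scale:
    fixed centre at 500 points, standard deviation of 100 points."""
    return 500 + 100 * z

def cil_proficiency_level(score: float) -> str:
    """Classify an ICILS CIL score into the proficiency bands quoted above:
    level 1: 407-491, level 2: 492-576, level 3: 577-661, level 4: above 661."""
    if score > 661:
        return "level 4"
    if score >= 577:
        return "level 3"
    if score >= 492:
        return "level 2"
    if score >= 407:
        return "level 1"
    return "below level 1"

print(iea_scale(0.5))              # half a standard deviation above the centre
print(cil_proficiency_level(550))  # falls in the 492-576 band, i.e. level 2
```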
According to the IEA, CIL “refers to an individual’s ability to use
computers to research, create and communicate, in order to actively
participate in contemporary societies, whether at home, at school, in the
workplace and in community and educational contexts” (IAVE, 2019,
p. 23).
As far as CT is concerned, the IEA defines it as follows:
Table 6.2 Dimensions and areas of dimensions, content areas (CIL and CT) and percentages
1) Students from families with high family capital (this indicator integrates parents' level of education and professional qualification, the books available at home, with an emphasis on children's books, and study support materials)8 perform better than students from
families with fewer socio-economic resources. This issue does not
only concern TIMSS and PIRLS. PISA 2018 and TIMSS 2015 also
revealed that socio-economic status is a strong predictor of Portu-
guese students’ performance.
2) The higher the mastery of basic literacy (and numeracy) skills before
schooling, the more likely students are to perform well in the 4th
grade.
3) Continued attendance of education programmes and care regard-
ing early childhood development are important for students from
families with lower socio-economic resources. In the case of Por-
tugal, the authors report that attendance of three or more years
represents a significant increase in Reading performance for stu-
dents with “Few or some resources”, but does not have a sta-
tistically significant result for the group with “Many resources”
(p. 11).
4) Students who trust their skills more are the ones who achieve the best
results in the three main literacy domains.
5) Of the different European countries, Portugal has the highest percentage of students from schools in disadvantaged backgrounds who achieve, in all areas, scores above the international average. The
results of Portuguese students, when compared with those of other
countries, “suggest a good capacity of the education system to
reduce the differences derived from diverse socio-economic contexts”
(p. 12).
6) Schools that are more focused on school success allow their stu-
dents to perform better. In fact, school climate is an important
predictor.
7) Students attending so-called very safe and organized schools are
more represented in affluent socio-economic backgrounds. In this
respect, the indicator “Discipline Problems” is a good predictor of
reading performances.
The ICILS studies (2013 and 2018) highlight that being born in a digital
world does not necessarily mean that someone is digitally competent.
Contrary to the common view that today's young generation is a generation of “digital natives”, the findings of the first two cycles of ICILS indicate that young people do not spontaneously develop sophisticated digital skills; only their use of digital devices grows. On the other hand, there is great
variation between countries regarding the achievement of information
literacy. The focus should be not only on young people with low socio-
economic resources, but also on those with higher levels of proficiency
in digital competence. There is also gender differentiation in the use of
ICTs. Girls perform better than boys at CIL, but this differentiation is
less evident in CT assessment. The results of ICILS also suggest the need
for a holistic approach to the pedagogical use of ICTs in schools. Provid-
ing students and teachers with ICT equipment is not enough to improve
their digital skills. They should be encouraged and supported in the use
of digital tools.
In a recent article, we tried to compare PISA, TIMSS and PIRLS in
Portugal (Rosa et al., 2020). The article aimed to extend the information
scope on these evaluations and to ascertain if it is possible to compare
them, taking into account their objectives. It drew on the known global
data of the participating countries, with a particular emphasis on the
data concerning Portugal. It was found that the results obtained by the
students are not the same all over the country, with differences between
the various regions. The domains assessed (e.g. reading, mathematics, science and physics) do not seem to determine the results of each study. The irrelevance of the domain tells us that regions obtain good or poor results regardless of the domain concerned, while the relevance of the assessment object indicates that competence at school may have little to do with life skills.
Conclusion
In the last decades, large-scale evaluation has acquired great importance
in the field of education. The main goals of the ILSAs, especially those
undertaken in the school context, are to seek to improve the quality and
equity of education, as well as to respond to the growing global demand
regarding investments made in the educational offer. In general, ILSAs
share common objectives that explicitly or implicitly include one or more
of the following elements:
Notes
1 In a recently published article, we sought to compare PISA, TIMSS and PIRLS
(Rosa et al., 2020).
2 The IEA publicly disclosed three reading assessment units that were part of
the PIRLS test and two assessment units of the ePIRLS test administered in
2016. In 2011, a set of items had also been released by the consortium. IAVE
compiled the information of the Portuguese version in two documents (IAVE,
2018). Context questionnaires for the 2011 PIRLS are available on the IAVE
website: www.iave.pt/index.php/estudos-internacionais/pirls/instrumentos-
de-avaliacao (accessed on 09/09/2020).
3 In Portugal, ICT is a mandatory subject for students from the 5th to 9th
grade. Curricular skills are organized in four fields: 1) digital citizenship; 2)
investigating and researching; 3) communicating and collaborating; 4) creat-
ing and innovating. The Directorate General for Education (DGE) promoted, over several school years (2015–2017), introduction-to-programming initiatives aimed at students of the 3rd and 4th grades. In 2017/2018, this
subject received the name “Probótica” (Programming and Robotics).
References
Addey, C., & Sellar, S. (2019). Cela en vaut-il la peine? Raisons de participa-
tion (ou non) aux évaluations internationales à grande échelle des apprent-
issages. Recherche et Prospective en Éducation, 24. https://unesdoc.unesco.
org/ark:/48223/pf0000368421_spa
Araújo, L., Costa, P., & Folgado, C. (2016). Avaliação da leitura: PIRLS 2011.
In F. Azevedo & Â. Balça (Coord.), Leitura e educação literária (pp. 15–30).
Pactor – Edições de Ciências Sociais, Forenses e da Educação.
Ávila, P. (2005). A literacia de adultos: competências-chave na sociedade do con-
hecimento. Tese de doutoramento em Sociologia. ISCTE.
Beaton, A., Martin, M., Mullis, I., Gonzalez, E., Smith, T., & Kelly, D. (1996).
Science achievement in the middle school years. Boston College/International
Association for the Evaluation of Educational Achievement (IEA).
Benavente, A. (2016, outubro). O ‘dia’ seguinte: O que a Troika fez à escola. Le
Monde Diplomatique – edição portuguesa, 8–9.
Benavente, A. (Coord.), Rosa, A., Costa, A., & Ávila, P. (1996). A literacia em
Portugal – resultados de uma pesquisa extensiva e monográfica. Fundação
Calouste Gulbenkian/Conselho Nacional de Educação.
Blurton, C. (1999). New directions in education. In M. Tawfik (Org.), The world
communication and information (pp. 46–61). UNESCO.
Bodin, A., & Grapin, N. (2018). Un regard didactique sur les évaluations du
PISA et de la TIMSS: mieux les comprendre pour mieux les exploiter. Mesure
et évaluation en éducation, 41(1), 67–96. https://doi.org/10.7202/1055897ar
Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.). Handbook of
theory and research for the sociology of education (pp. 241–258). Greenwood.
Campbell, J., Kelly, D., Mullis, I., Martin, M., & Sainsbury, M. (2001). Frame-
work and specifications for PIRLS assessment 2001: Progress in international
reading literacy study. Boston College/IEA – International Association for the
Evaluation of Educational Achievement.
Carvalho, J., Amaro, G., Reis, P., & Neres, F. (1996). Terceiro estudo internac-
ional em matemática e ciências (TIMSS): semelhanças num contexto de difer-
enças. Instituto de Inovação Educacional.
Drent, M., Martina, M., & Fabienne, K. (2013). The contribution of TIMSS
to the link between school and classroom factors and student achievement.
Rosa, V., Maia, J., Daniela, M., & Teodoro, A. (2020). PISA, TIMSS e PIRLS em
Portugal: uma análise comparativa. Revista Portuguesa de Educação, 33(1),
94–120.
Rosine, L., & Postlethwaite, N. (1994). Les études internationales de l’IEA.
Revue internationale d’éducation de Sèvres, 1, 19–26. https://doi.org/10.4000/
ries.4294
Rutkowski, L., Gonzalez, E., Joncas, M., & von Davier, M. (2010). International
large-scale assessment data: Issues in secondary analysis and reporting. Educa-
tional Researcher, 39(2), 142–151.
Saraiva, L. (2017). A aprendizagem das ciências em Portugal: uma leitura a partir
dos resultados do TIMSS e do PISA. Medi@ções, 5(2), 4–18. http://mediacoes.
ese.ips.pt/index.php/mediacoesonline/article/view/164/pdf_1
Schmidt, W., MacKnight, C., Valverde, G., Houang, R., & Wiley, D. (Eds.).
(1997). Many visions, many aims: A cross-national investigation of curricular
intentions in school mathematics, volume 1. Kluwer Academic Publishers.
Sim-Sim, I. (2013). Os resultados dos alunos portugueses no PIRLS, em leitura, e
as suas implicações para o ensino, para a formação de professores e para o sis-
tema educativo. In Conselho Nacional de Educação (Ed.), Avaliações internac-
ionais e desempenho dos alunos portugueses [Textos do Seminário realizado
no CNE a 25 de março de 2013] (pp. 69–90). CNE.
Spanhel, D. (2008). La importancia de las nuevas tecnologías en el sector edu-
cativo. In M. L. Sevillano (Coord.), Nuevas tecnologías en Educación Social
(pp. 29–52). McGraw-Hill.
Vanda, L., Nunes, A., Amaral, A., Gonçalves, C., Mota, M., & Mendes, R.
(2019). ICILS 2018 – PORTUGAL. Literacia em Tecnologias da Informação
e da Comunicação. IAVE.
Wagemaker, H. (2014). International large-scale assessments: From research to
policy. In L. Rutkowski, M. von Davier, & D. Rutkowski (Eds.), Statistics
in the social and behavioral sciences series. Handbook of international large-
scale assessment. Background, technical issues, and methods of data analy-
sis (pp. 11–36). CRC Press.
7
Introduction
The object of this study is the critical exploration of the media represen-
tation of the Program for International Student Assessment (PISA) in the
Portuguese newspaper Público. We sought to ascertain and understand,
within the Portuguese context, how a national reference daily general-
interest newspaper refers to the process of Portugal’s participation in
PISA, to the results of PISA and their respective implications, both in
news discourse and in opinion discourses, in the period from 2001 to
2018.
The interest in the media exploration of PISA derives from acknowl-
edging that the mass media are an element of the global information
society, both by making information available and steering the attention
of the target audiences, and by contributing to shape their beliefs and
value systems, their representation and attribution of importance to the
different current events (Castells, 2007; McCombs, 2005). As stated by
Coe and Kuttner (2018, p. 1), the information press plays “a significant
role in the education policy arena, informing the public about pressing
issues and influencing how such issues are prioritized and understood”.
Therefore, it matters to education research to identify the issues prior-
itized by the media, as well as the omitted ones, and understand how
the former are represented and conveyed to the public debate, with the
possibility of influencing the understanding that media consumers have
of education policies (Gerstl-Pepin, 2007). Nowadays and in our political
context, PISA is a recurring topic in the debates on the policies and the
state of education at a national and global scale, and it is important to
inquire into its presence in and coverage by the media.
PISA is a programme undertaken by the Organization for Economic
Cooperation and Development (OECD), with global ambition in the field
of evaluation and design of education policies (OECD, 2006). The pro-
gramme, which is carried out every three years, had its first edition in
2000, and since then has registered, with small variations, a significant
DOI: 10.4324/9781003255215-8
PISA in media discourse 143
In this edition (PISA, 2000), Portugal took the second last position
among the members of the OECD, in reading literacy and science
literacy. . . . In mathematics, Portugal’s average . . . place the country
four places from the base in all domains of the test. Since then,
Portugal has managed to evolve positively in all the domains of PISA,
with significant increases from 2000 to 2003, from 2006 to 2009
and, lastly, from 2012 to 2015.
(p. 9)
while PISA results were initially covered in the news as news items
around the triennial release of results, as the idea of PISA became
a taken-for-granted measure of educational excellence in the public
consciousness, it was referenced more consistently throughout the
years.
(p. 36)
The studies that have been cited validate our inquiry into the evolution
of the coverage and prominence given to PISA by the newspaper Público, assessed both by the number of news items throughout the various PISA editions and by how each item is featured: whether it makes the front page or the first pages, the size of the item, the use of information graphics when covering the issues, and the addition of opinion columns on the topic, namely the editorial. Moreover, another element
also prompted us to research the matter, namely, the conclusions reached
by Saraisky (2015) on the media coverage of PISA beyond the month
of the first release of results, even in news items that do not feature the programme as main topic, and on what that signals regarding the progressive impact and credibility of PISA in the media and in society.
146 Ana Carita, Teresa Teixeira Lopo and Vítor Duarte Teodoro
Thus, on the basis of the studies listed earlier, it would be expected to
witness a progressive evolution in the media coverage of PISA along the
time span under study, translated into the rise of news items published
by year and PISA cycle and, both in the items that feature the programme
as main topic and in those that mention it in passing, an increase that signals the growing importance, influence and credibility of the programme
within society and the media agenda (Hypothesis 1: H1). Moreover,
in those news items where PISA features as the main topic, it is also
expected to see progressive intensification of attention and care in the
media coverage of PISA, translated into the increase in front page place-
ments, of editorials on the topic, of the size of the articles and recourse
to relevant graphics, elements that mark the mounting importance, influ-
ence and credibility of the programme within society and in the media
agenda (Hypothesis 2: H2).
Some of the studies already cited also explicitly explore the more or
less positive or negative tone in which PISA or the national results are
presented in the analysed articles. The assessment criteria and modes used
differ, which reinforces the already stated comparative constraints. We
will now consider some of the conclusions of studies that focused on the
British, Norwegian, Canadian, Shanghai and Portuguese participations
in PISA.
Dixon et al. (2013) look into the presence of “negativity” in news
items published between June 2007 and May 2008, in Germany,
Finland, France and Britain. They observed the presence of negativity
in the four countries, even in those that achieved good results. Although
the proportion of negative pieces was significantly larger in France and
Britain – countries with worse PISA performance – than in Germany
and Finland, the authors still concluded that there was evidence of some
negativity bias in the case of Finland, a country with excellent results.
Studies focusing on the coverage of results of several PISA editions by
the British press underlined their negative tone: Grey and Morris (2018),
on the 2012 PISA edition, Baird et al. (2016), on editions from 2009
to 2012. These studies highlight both some of the negative words and
phrases used to characterize the results – for example decline, stagnation,
failure – and the association of result dissemination with a fierce “football
championship, the championship of the PISA league”, with its winners
and losers: “much angst has been expressed in England by politicians and
the media about slippage over time down the PISA performance league
table” (Baird et al., 2016, p. 129).
In other countries, the presence of the negative tone is highlighted, but
also the alignment between the tone of the news items and the direction
of the PISA results in their respective countries. Thus, in Norway, on
the one hand, Baird et al. (2011) concluded that the media echoed the
“shock” and disappointment felt by society at the country’s results in
the 2000 and 2003 PISA cycles, providing a simplistic portrayal of the
country as “a loser”; on the other hand, Hopfenbeck and Görgen (2017),
on PISA 2015, concluded that the tone of most headlines of Norwegian
newspapers followed the positive evolution of results: more positive, then, than the headlines in 2001. This alignment between
the media tone and the PISA results is also noted by Baird et al. (2016)
in the Canadian press, comparing the representation of 2012 results “on
the scale of a national emergency” (p. 126) and the representation of
the more positive results of previous editions. Baird et al. (2016), in the same study, also report the celebratory tone of the Shanghai press, although it focuses not so much on the region's clearly higher position in the world ranking as on what that reveals about the existing equity among the city's schools and districts.
Regarding the reaction of the Portuguese press to PISA, Lemos and
Serrão (2015), who analysed the media impact of PISA in Portugal in
two newspapers (Diário de Notícias and Expresso) and a newsmagazine
(Visão), from 2001 to 2012, considered, among other aspects, the gen-
eral tone of the articles that addressed this programme. They concluded
that the tone of the press shows a balance of positive, negative and neu-
tral approaches, with some variation among the media analysed and also
some variation associated with the general sense of the PISA results in
the country: 2009, better results, more items with a positive tone; 2003,
worse results, more items with a negative tone. Lemos and Serrão (2015)
emphasize that both the headlines of the articles and their bodies showed
that the country’s PISA results to a large extent set the tone of the news
items, even when these results are not the main topic, as becomes evident
by the considerable weight of positivity in the articles involving the 2009
cycle.
In short, the reported studies enable us to conclude that, regardless
of the presence of negativity, some balance prevails in the tone of news
items covering PISA, with the tone following the evolution of results in
their respective countries. In most studies, the results seem to be the criterion that ultimately most determines the tone of the articles, more
than the education systems whose efficiency and equity they supposedly
reveal, or the governments and their respective policies, or even the pro-
gramme itself.
The studies we have been citing thus validate our inquiry into the prevailing tone of the articles in Público that have PISA as their main topic. The goal is to identify the more or less positive sense of the PISA agenda in the newspaper, to speculate on the underlying reasons, and to consider the impact of the media tone on the representations and debates potentially fostered in society. Also in alignment with the studies alluded
to, it is to be expected that our results, covering the six PISA cycles (2000–2015), will also point, especially in informative items, to a progressively less pronounced negativity trend, considering the positive evolution of
Portugal’s results in the programme (Hypothesis 3: H3). Concerning the
opinion items, punctuated in general by a more critical and evaluative
tone determined by the author’s perspective and social stance, it is to be
expected that, when compared to informative items, they will show a
stronger presence of negativity and will focus not just on the results but
also on the education system and policies, or on governmental action and
rulers (Hypothesis 4: H4).
Lastly, regarding the inquiry into the voices which in the written
press have the power to speak up on social phenomena, recreating them
through their discourses, there is a line of questioning which is aimed at
the authors of the texts, particularly opinion pieces (e.g. Boto, 2011):
whom does the newspaper credit to state their opinions and thus influence the public space regarding the social phenomena it includes in its agenda? And what meanings can be inferred, namely in terms of
the published plurality of opinions? The answer to these questions points
to issues of power, dominance and hegemony as proposed, for instance,
by the Critical Discourse Analysis theory (Dijk, 2015), in the analysis
of the press, even if only addressing the more superficial features of its
products.
Within this framework, we asked ourselves about the authorship of the published opinion: upon whom, and with what variety and plurality, does the newspaper confer the power and credit to voice opinions on PISA throughout its editions? It is not so much a matter of exploring here what these
voices say but rather “who they are and what their place is in the social
structure” (Kadushin, 1968, p. 685), and whether this situation some-
how constitutes a handover of this influencing power to an elite, that is
to a minority of individuals – “who are said to have caused more outputs,
or more important outputs” (p. 688).
Among the studies we have been citing, Saraisky (2015) pays particu-
lar attention to the inquiry into who has authorial voice in the press, to
whom this legitimacy and power is awarded to speak up in the public
space on matters of education policy and, in this way, exert influence on
the public debate and policies:
The theoretical literature suggests that elites play a key role in the
policy process. . . . and the analysis of speech acts bore this out.
At the speaker level, the discourse is guided by a mere handful of
elites: the data show that six speakers provide almost 30 percent
of the commentary on PISA in the US. There is virtually no public
voice in the discourse, despite the fact that education is one of the
most public of issues. . . . Teachers, parents and students are almost
Method
This study consists of a descriptive, longitudinal and quantitative analysis
of the surface characteristics of 184 newspaper articles of the Portuguese
newspaper Público, published between 4 December 2001 and 31 Janu-
ary 2018. This time frame considered the newspaper articles published
on the day the OECD published the reports/volumes of results relative to
each one of the PISA cycles between 2000 and 2015 and for the following
two complete months. PISA results, let us recall, are published one year after the assessment is carried out, usually in December, and are reported in one or more volumes, the electronic versions of which are made available on the OECD website (Table 7.1).
The newspaper articles were obtained by researching the archive of
the print edition in PDF format (main section and supplements) of the
daily Público (1,153 documents). In each issue, the research was carried
out combining the terms PISA and OCDE. As a complement, the documental collection of the Hemeroteca Municipal de Lisboa (Lisbon media library) was consulted to bridge any gaps in the digital editions or to correct any item uploaded with errors. One hundred and six issues were
considered. In those issues, 171 articles were identified featuring PISA in
Portugal.
Table 7.1 PISA cycles, dates of publication of results by the OECD, and time frames for article collection
Table 7.2 Fields of analysis, their articulation with the hypotheses and respective categories
lead of the article and also on its body, whenever ambiguities persisted
on those bases.
When categorizing the articles regarding their type or journalistic
genre, following Ricardo (2004) and Lopes (2010), consideration was
given to the groups that journalistic genres are organized into: informa-
tive (news story, interview) and opinion (chronicle/opinion pieces, edito-
rial, letter to the editor), and also the cartoon. It should be noted that
none of the genres exists in editorial practices in a pure state; in other
words, many journalistic texts integrate features from the different gen-
res. It is then the analyst’s task to determine, in each case, which is the
dominant genre (Lopes, 2010). This situation is relevant for the case of
chronicles and opinion pieces, and for this reason, an option was made to
categorize these two genres jointly (Gradim, 2000). The category “other”
was also included for collected articles that did not fit any of the
journalistic genres. In the corpus of 184 articles, the more frequent jour-
nalistic genres were news stories (90, 48.9%), followed by chronicles/
opinion pieces (55, 29.9%; the two genres, together, represent 78.8% of
the total articles).
Results
To present the results, we will follow a rationale aligned with the fields,
objectives and hypotheses of the study. Thus, we will see the results
regarding the evolution of the media coverage and prominence of PISA,
the prevailing tone of the articles and, finally, the leading authorial voices
in the discourses.
When presenting the results according to their time distribution,
we shall consider, in the conditions defined by the sample options, the
monthly and annual distribution, when deemed expedient, and, always,
the distribution by PISA cycles and, finally, comparing two PISA periods:
• In the first period, we aggregate the three initial cycles (2000, 2003,
2006).
• In the second period, the following three cycles (2009, 2012, 2015).
[Table 7.3: distribution of the articles by PISA cycle, aggregated into the first period (2000–2003–2006 cycles) and the second period (2009–2012–2015 cycles), with counts, percentages and the change between the two periods.]
Figure 7.1 Frequency of the journalistic genres of the 112 articles with PISA as the main subject, by year (in the sample considered) and by PISA cycle
5th and 6th cycles, respectively). There is a very strong contrast between
the first period (2000–2003–2006 cycles) and the second period (2009–
2012–2015 cycles): 2 versus 27. The chronicles/opinion pieces emerge, in
general, in the 1st year of dissemination of the results of the respective
PISA cycles.
Let us now consider the results on the prominence attributed to PISA by Público and its evolution. To this end, we examined, for the articles in which PISA is the main topic, their presence in editorials and in frontpage teasers, their recourse to technical illustration, and their size.
Thus, in the set of 112 articles:
• Seven editorials (6.3%) were published in the year the first results
were disclosed. There are editorials in all cycles except for the third.
In the first period there are three editorials and in the second, four.
• There are 29 frontpage teasers (25.9%), and this situation always
occurred in the year the first results were published. With the excep-
tion of the 3rd cycle, with no frontpage teasers, in all the other cycles
this kind of attention occurred, especially in the 1st and 4th cycles.
In the second period there is a higher number of frontpage teasers
vis-à-vis the first period (16 and 13, respectively).
• Eighteen articles resort to graphs and tables to illustrate PISA results
(16.1%), on both national and international data. Articles with this
resource feature mostly in the first year of their respective dissemi-
nation of results. The distribution of this resource is slightly more
expressive in the 3rd and 4th cycles, with four articles each. The first
period shows a higher number of articles with graphs and tables: ten
versus eight in the second period.
• There are 55 articles (49.1%) of the smallest size, under half a page. Regarding the other sizes, half a page stands out, with 36
articles (32.1%) and one page, or approximately, with 19 (17%). In
the distribution per cycles, it can be noted that in all of them there is
at least one full page article except for the 3rd, which has none. Still,
it is in the last three cycles that we can observe the highest number of
one-page articles: 4th cycle with four (21.1% of the total number of
articles of this size), 5th with five (26.3%) and 6th cycle with seven
(36.8%). Thus, the frequency of longer articles is higher in the sec-
ond period (16 versus three in the first period, 84.2% versus 15.8%
of the articles with the size in question). Two- or three-page articles
are scarcely represented – two and three, respectively, in the total set
of 112 articles – and can only be found in the second period.
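The period comparison running through these bullet points can be sketched as follows, using the counts reported above for three prominence indicators; the dictionary layout and helper name are ours, purely for illustration.

```python
# Prominence indicators per period, as reported above:
# first period = cycles 2000-2003-2006, second period = cycles 2009-2012-2015.
prominence = {
    "editorials":        {"first": 3,  "second": 4},
    "frontpage teasers": {"first": 13, "second": 16},
    "graphs and tables": {"first": 10, "second": 8},
}

def period_change(indicator: str) -> int:
    """Absolute change in an indicator from the first to the second period."""
    counts = prominence[indicator]
    return counts["second"] - counts["first"]

for name in prominence:
    print(f"{name}: {period_change(name):+d}")
# editorials rise by 1, frontpage teasers by 3; graphs and tables fall by 2
```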
Table 7.4 Predominant tone (positive, neutral, negative) of the 112 articles in which PISA is the main topic (N and %, by line and by column)
cycle featured four articles (14.8% of the positive articles, 23.5% of the
respective cycle).
The negative tone is the most frequent in the 1st cycle, with 12 articles
(28.6% of negative articles, over half the articles in the respective cycle,
66.7%). Close to these figures is the 4th cycle: 10 articles (23.8% of negative articles, but with a within-cycle percentage well below the former, 25.0%). In the remaining four cycles, the presence of a negative tone
varies between 5 and 6 articles (26.3% in the 6th cycle and 57.1% in the
3rd cycle).
The neutral tone is present in all the cycles, and it is clearly dominant
in the 4th with 19 articles (44.2% of articles with a neutral tone, about
half the articles of the cycle, 47.5%). In the remaining cycles, even if
prevalent, the presence of the neutral tone is lower, varying between three
and six articles (15.8% in the 6th cycle and 54.5% in the 2nd cycle).
The analysis of the tone of the articles in the first and second periods
shows a significant difference in the positive tone: one article versus 26
(3.7% and 96.3% of the total number of articles with a positive tone, and
2.8% and 34.2% at cycle level, respectively). The first period thus stands
out for the near absence of articles with a positive tone, while the second
stands out for the balance in the tone of its articles, with neutral items
coming first (38.9%), closely followed by positive items (34.2%) and
finally negative ones (27.6%).
The distribution of tone across the two predominant journalistic
genres shows that in news stories the neutral tone is the most frequent,
with 27 items (45.0% of all news stories; 62.8% of all articles with a
neutral tone). News stories with a positive tone are the least represented
among news stories, with 13 articles (21.7%), although they account for
about half the set of positive articles (48.1%). Among chronicles/opinion
pieces, the negative tone is the most frequent, with 13 items (44.8% of
all chronicles/opinion pieces; 31.0% of the negative articles). Next come
articles with a neutral tone, with 10 items (34.5% of chronicles/opinion
pieces, 23.3% of the pieces with this tone). In this genre, too, the positive
tone, with six articles, is the least represented (20.7% of chronicles/
opinion pieces; 22.2% of the articles with this tone).
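The "% line" and "% column" figures reported above (and used throughout Tables 7.4 and 7.5) can be recomputed from the stated counts. The Python sketch below does so for the two predominant genres; the news-story negative count of 20 is inferred from the stated totals (60 news stories minus 27 neutral and 13 positive), and the variable names are ours:

```python
# Row ("% line") and column ("% column") percentages, as used in
# Tables 7.4 and 7.5. Counts come from the chapter's text; the
# news-story negative count (20) is inferred as 60 - 27 - 13.

counts = {
    "news story":        {"positive": 13, "neutral": 27, "negative": 20},
    "chronicle/opinion": {"positive": 6,  "neutral": 10, "negative": 13},
}
# Column totals over all 112 articles, as reported in the text.
column_totals = {"positive": 27, "neutral": 43, "negative": 42}

def pct(n, total):
    """Percentage rounded to one decimal, matching the tables."""
    return round(100 * n / total, 1)

for genre, row in counts.items():
    row_total = sum(row.values())
    for tone, n in row.items():
        line_pct = pct(n, row_total)           # share within the genre
        col_pct = pct(n, column_totals[tone])  # share within the tone
        print(f"{genre:18} {tone:9} n={n:2} "
              f"line={line_pct}% column={col_pct}%")
```

Running it reproduces, for instance, 45.0% (27 of 60 news stories are neutral) and 48.1% (13 of the 27 positive articles are news stories).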
Table 7.5 Profession of the authors of the 112 articles in which PISA is
the core topic (columns: Journalist, Politician, Lecturer Higher Ed./
researcher, School teacher/School manager, Teacher/Union leader, Other
or Not identified; each cell reports N with percentages by line and by
column)
Female and Male (112 pieces, 16 pieces with unidentified gender author)
1st Cycle (2001-2003) 13 72.2 1 5.6 4 22.2 18 100.0
17.3 10.0 19.0 16.1
2nd Cycle (2004-2006) 9 81.8 2 18.2 11 100.0
12.0 9.5 9.8
3rd Cycle (2007-2009) 4 57.1 1 14.3 2 28.6 7 100.0
5.3 10.0 9.5 6.3
4th Cycle (2010-2012) 27 67.5 3 7.5 1 2.5 1 2.5 8 20.0 40 100.0
36.0 30.0 100.0 100.0 38.1 35.7
5th Cycle (2013-2015) 9 52.9 2 11.8 4 23.5 2 11.8 17 100.0
12.0 50.0 40.0 9.5 15.2
6th Cycle (2016-2018) 13 68.4 2 10.5 1 5.3 3 15.8 19 100.0
17.3 50.0 10.0 14.3 17.0
Female
1st Cycle (2001-2003) 12 100.0 12 100.0
21.1 18.8
2nd Cycle (2004-2006) 7 100.0 7 100.0
12.3 10.9
3rd Cycle (2007-2009) 4 80.0 1 20.0 5 100.0
7.0 50.0 7.8
4th Cycle (2010-2012) 23 92.0 2 8.0 25 100.0
40.4 50.0 39.1
5th Cycle (2013-2015) 4 80.0 1 20.0 5 100.0
7.0 50.0 7.8
6th Cycle (2016-2018) 7 70.0 1 10.0 2 20.0 10 100.0
12.3 100.0 50.0 15.6
Male
1st Cycle (2001-2003) 1 33.3 1 33.3 1 33.3 3 100.0
9.1 12.5 12.5 9.4
2nd Cycle (2004-2006) 2 100.0 2 100.0
18.2 6.3
3rd Cycle (2007-2009) 1 100.0 1 100.0
12.5 3.1
4th Cycle (2010-2012) 1 9.1 3 27.3 1 9.1 1 9.1 5 45.5 11 100.0
9.1 37.5 100.0 100.0 62.5 34.4
5th Cycle (2013-2015) 2 25.0 2 25.0 3 37.5 1 12.5 8 100.0
18.2 66.7 37.5 12.5 25.0
6th Cycle (2016-2018) 5 71.4 1 14.3 1 14.3 7 100.0
45.5 33.3 12.5 21.9
Discussion of results
Regarding the first editions of PISA, we inquired into their representation
in Público, one of the main reference daily newspapers in Portugal. The
initial approach to the material that is the object of this chapter sought
to find meanings in seven fields: (i) intensity of coverage and (ii)
prominence of PISA in the newspaper's agenda; (iii) prevailing tone of
the articles and (iv) the authors' voices that stand out; to which we added
the cross-sectional fields of (v) the evolution of the meanings surveyed,
(vi) the centrality of PISA in the articles and (vii) the typology of the
articles. On these fields we formulated seven hypotheses, which we
consider below.
Concerning the centrality of PISA: of the 184 articles, 112, somewhat
over three-fifths, focused on PISA as their core topic; in the remaining
articles PISA was a secondary topic or merely a passing reference. Both
results shed light on the value the newspaper gives this topic, both by
taking it as the core topic in a considerable number of pieces and by
bringing PISA together with other information or debates in the field of
education, thereby suggesting that in this newspaper, too, PISA has
become an unavoidable reference in debates on education and its policies
(Saraisky, 2015).
When we consider the evolution of the coverage of PISA in Público
through articles that address the programme as their main topic, and
regardless of the multicausality of the PISA effect on media agendas, the
findings show that in this newspaper the effect is apparent and has been
growing. Our expectation that the media coverage of PISA would grow
steadily over the study's time frame was thus confirmed (H1), reflected in
the increase in the number of articles published with PISA as the main
topic, both per PISA cycle and across the two aggregated periods. This
increase can be read as signalling the growing importance of the
programme in the media agenda and in society, albeit not in the
orientation of national education policy. This finding aligns with the
results of studies on other countries, which highlighted the media's
relevant contribution to informing society about PISA and associated its
gradual media coverage with the growing receptivity and political impact
of the programme (Saraisky, 2015; Baroutsis & Lingard, 2016; González-
Mayorga et al., 2017). It is also in line with the findings of Lemos and
Serrão (2015) for the Portuguese case.
Two aspects of the evolution of the coverage of PISA in articles where
the programme is the central topic should also be emphasized. The first
is the concentration of articles in the years in which the results of each
edition are first disseminated; the same can be said of the months of
December, since the first results are regularly released in that month. It
seems reasonable to conclude that the topic of PISA does not overcome
the media preference for novelty,
162 Ana Carita, Teresa Teixeira Lopo and Vítor Duarte Teodoro
for current events, therefore producing little echo in the public debate,
whether through information or published opinion, as was also found in
other studies (Dixon et al., 2013). The press appears particularly
receptive to figures, to rankings, to the choreography of the
“championship of the PISA league”, which peaks in the December of the
year in which each edition's first results are released.
And yet, since the 2006 edition, the OECD has been disseminating
successive thematic studies expanding the analysis of results. These
studies take a politically more explicit approach to what the results
reveal about each country from social and educational perspectives. The
newspaper's reaction to these reports, however, is of little relevance
compared to its reaction to the first reports of each cycle, focused on the
results of the literacy tests, on the comparisons between countries, on the
rankings. Although media culture in general, even in the reference press,
helps us understand the preference for the more dramatic and exuberant
dimensions of social phenomena, often of a more superficial nature
(Coe & Kuttner, 2018), we also question the efficacy of the OECD in
disseminating the in-depth reports. Perhaps the OECD's promotional
strategy for these reports does not benefit from an investment similar
to that allocated to the dissemination of the early reports, in the terms
revealed by Lingard's research (2016).
The second aspect concerns the irregularity of the annual flow of
articles, notwithstanding a clear quantitative increase in the newspaper's
coverage of PISA. Note, for instance, the case of 2001, the third year
with the most articles, yet atypical in the context of the years closest to
it, which are among those in which PISA is least represented in Público's
agenda. The novelty that PISA constituted in 2001 helps explain the
attention Público paid to the topic, even if that attention is not in keeping
with PISA's low impact that year in other Portuguese publications such
as Diário de Notícias, Expresso and Visão (Lemos & Serrão, 2015). The
“tragedy” of the Portuguese results in the 2000 edition may also explain
its strong presence in Público's agenda. The country's feeble position
lingered in the two following editions, but the strong impact on society
and politics of other, internal social phenomena may explain the weaker
presence of PISA in the newspaper's agenda. Conversely, 2010, the first
year of result dissemination for the 4th PISA cycle, was a significant
turning point in the newspaper's interest in the matter, translated into a
rise in the number of articles that followed the noteworthy improvement
in Portugal's performance in PISA. In the following two cycles, albeit
with oscillations, the number of articles remained high, maintaining the
greater frequency of articles in the second period analysed. This result is
in line with the situation
PISA in media discourse 163
References
Baird, J.-A., Isaacs, T., Johnson, S., Stobart, G., Yu, G., Sprague, T., & Daugherty, R.
(2011). Policy effects of PISA. Oxford University Centre for Educational Assess-
ment. http://oucea.education.ox.ac.uk/wordpress/wp-content/uploads/2011/
10/Policy-Effects-of-PISA-OUCEA.pdf
Baird, J.-A., Johnson, S., Hopfenbeck, T. N., Isaacs, T., Sprague, T., Stobart,
G., & Yu, G. (2016). On the supranational spell of PISA in policy. Educational
Research, 58(2), 121–138. https://doi.org/10.1080/00131881.2016.1165410
Baroutsis, A., & Lingard, B. (2016). Counting and comparing school perfor-
mance: An analysis of media coverage of PISA in Australia, 2000–2014.
Journal of Education Policy, 32(4), 432–449. https://doi.org/10.1080/02680939.
2016.1252856
Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in edu-
cation? Lessons from Switzerland and the US. European Journal of Education,
46(1), 101–116. https://doi.org/10.1111/j.1465-3435.2010.01462.x
Boto, A. P. C. N. de B. (2011). Entre os problemas públicos e a agenda política:
O papel dos opinion makers em torno do novo modelo de avaliação de desem-
penho docente (2007–2009) [Dissertação de Mestrado, Instituto Superior de
Ciências Sociais e Políticas] http://hdl.handle.net/10400.5/3537
Carita, A., & Teodoro, V. D. (in press). A indisciplina escolar na imprensa: O
jornal Público entre 2011 e 2015. Education Policy Analysis Archives.
Castells, M. (2007). A era da informação: Economia, sociedade e cultura.
A sociedade em rede. Fundação Calouste Gulbenkian
Coe, K., & Kuttner, P. J. (2018). Education coverage in television news: A typol-
ogy and analysis of 35 years of topics. AERA Open, 4(1), 1–13. https://doi.
org/10.1177/2332858417751694
Coleman, R., McCombs, M., Shaw, D., & Weaver, D. (2009). Agenda setting. In
K. Wahl-Jorgensen & T. Hanitzsch (Eds.), The handbook of journalism studies
(pp. 147–160). Routledge.
Dijk, T. A. van. (2015). Critical discourse analysis. In D. Tannen, H. E. Hamil-
ton, & D. Schiffrin (Eds.), The handbook of discourse analysis (2nd ed., pp. 465–
485). John Wiley & Sons, Inc. https://doi.org/10.1002/9781118584194.ch22
Dixon, R., Arndt, C., Mullers, M., Vakkuri, J., Engblom-Pelkkala, K., & Hood,
C. (2013). A lever for improvement or a magnet for blame? Press and political
responses to international educational rankings in four EU countries. Public
Administration, 91(2), 484–505. https://doi.org/10.1111/padm.12013
Figazzolo, L. (2009, March). PISA: Is testing dangerous? Education International
(29). www.ei-ie.org/en/detail/4079/pisa-is-testing-dangerous
Gerstl-Pepin, C. I. (2007). Introduction to the special issue on the media, democ-
racy, and the politics of education. Peabody Journal of Education, 82(1), 1–9.
https://doi.org/10.1080/01619560709336534
González-Mayorga, H., Vidal, J., & Vieira, M.-J. (2017). El impacto del informe
PISA en la sociedad española: El caso de la prensa escrita. RELIEVE – Revista
Electrónica de Investigación y Evaluación Educativa, 23(1). https://doi.
org/10.7203/relieve.23.1.9015
Gradim, A. (2000). Manual de jornalismo. Universidade da Beira Interior. http://
labcom.ubi.pt/livro/64
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37. https://doi.org/10.1080/02680930802412669
Grey, S., & Morris, P. (2018). PISA: Multiple ‘truths’ and mediatised global gov-
ernance. Comparative Education, 2(54), 109–131. https://doi.org/10.1080/03
050068.2018.1425243
Hopfenbeck, T. N., & Görgen, K. (2017). The politics of PISA: The media, policy
and public responses in Norway and England. European Journal of Education,
52(2), 192–205. https://doi.org/10.1111/ejed.12219
Kadushin, C. (1968). Power, influence and social circles: A new methodology for
studying opinion makers. American Sociological Review, 33(5), 685. https://
doi.org/10.2307/2092880
Lemos, V., & Serrão, A. (2015). O impacto do PISA em Portugal através dos
media. Sociologia, Problemas e Práticas, 78, 87–104. https://doi.org/10.7458/
spp2015783310
Lingard, B. (2016). PISA: Fundamentações para participar e acolhimento político.
Educação, Sociedade, Campinas, 37(136), 609–627. https://doi.org/10.1590/
es0101-73302016166670
Lopes, P. C. (2010). Géneros literários e géneros jornalísticos. Uma revisão
teórica de conceitos. Universidade da Beira Interior. www.bocc.ubi.pt/pag/
bocc-generos-lopes.pdf
Marôco, J., Goncalves, C., Lourenço, V., & Mendes, R. (2016). PISA 2015 – Por-
tugal Volume I: Literacia científica, literacia de leitura & literacia matemática.
IAVE Instituto de Avaliação Educativa I.P. www.iave.pt/images/FicheirosPDF/
Estudos_Internacionais/Relatorio_PISA2015.pdf
Martens, K., & Niemann, D. (2010). Governance by comparison: How rat-
ings & rankings impact national policy-making in education. Univer-
sität Bremen Collaborative Research Center. www.econstor.eu/bitstream/
10419/41595/1/639011268.pdf
McCombs, M. E. (2005). A look at agenda-setting: Past, present and future. Jour-
nalism Studies, 6(4), 543–557. https://doi.org/10.1080/14616700500250438
McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of
mass media. Public Opinion Quarterly, 36(2), 176–187. https://doi.
org/10.1086/267990
McCombs, M. E., & Shaw, D. L. (1993). The evolution of agenda-setting
research: Twenty-five years in the marketplace of ideas. Journal of Commu-
nication, 43(2), 58–67. https://doi.org/10.1111/j.1460-2466.1993.tb01262.x
Nóvoa, A., & Yariv-Mashal, T. (2003). Comparative research in education:
A mode of governance or a historical journey? Comparative Education, 39(4),
423–438. https://dx.doi.org/10.1080/0305006032000162002
OECD. (2006). Assessing, scientific, reading and mathematical literacy: A frame-
work for PISA 2006. https://doi.org/10.1787/19963777
Ricardo, D. (2004). Ainda bem que me pergunta. Manual de escrita jornalística.
Editorial Notícias.
Saraisky, N. G. (2015). Analyzing public discourse: Using media content analysis
to understand the policy process. Current Issues in Comparative Education,
18(1), 26–41. www.tc.columbia.edu/cice/pdf/03_Green-Saraisky-CICE-18.pdf
8 How PISA is becoming a “big science” project
Introduction
The Organization for Economic Cooperation and Development (OECD),
which currently has 36 member countries, aims to promote policies that
foster prosperity, equal opportunities and well-being for all. It draws on
over 60 years of accumulated knowledge with a view to preparing
tomorrow's societies. In close cooperation with public authorities,
economic agents and representatives of civil society, the organization
seeks to establish international rules and proposes data-based solutions:
improving economic performance, creating jobs, promoting effective
education systems and fighting international tax evasion, among others.
Developed by the OECD, the Programme for International Student
Assessment (PISA) was designed to evaluate whether 15-year-old students
are capable of mobilizing their Reading, Mathematics or Science skills to
solve everyday situations. On an optional basis, collaborative problem-
solving and financial literacy can also be evaluated. The aim, therefore, is
not to know whether students can reproduce the knowledge acquired in
those fields. The PISA tests are designed on the basis of a framework
common to all the countries. The tests are applied in three-year cycles,
and in each cycle one of the literacy areas mentioned earlier is evaluated
in greater depth: Reading (2000, 2009, 2018); Mathematics (2003,
2012); Science (2006, 2015); Collaborative Problem-Solving (2015).1
Portugal has not yet participated in the evaluation of Financial Literacy
(scheduled for 2021), but it has taken part in all the PISA cycles so far
(2000 to 2018). Students are selected through a two-stage sampling
process. In the first stage, a stratified random sample of schools is
constituted. In the second stage, all the students eligible to take the tests
in the selected schools are identified (15-year-old students attending at
least the 7th grade), and the international Consortium then randomly
selects the students in each participating school.
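The two-stage design described above can be sketched in a few lines of Python. The strata, school counts and the within-school cap of 35 students below are illustrative assumptions, not PISA's actual sampling parameters:

```python
# A minimal sketch of a two-stage sample: first a stratified random
# sample of schools, then a random draw of eligible students within
# each selected school. All names and sizes are illustrative.
import random

random.seed(42)  # reproducible illustration

# Stage 1: schools grouped into strata (e.g. by region and school type).
strata = {
    "north/public":  [f"school_N{i}" for i in range(40)],
    "south/public":  [f"school_S{i}" for i in range(30)],
    "north/private": [f"school_P{i}" for i in range(10)],
}

def sample_schools(strata, fraction=0.25):
    """Draw a simple random sample of schools within each stratum."""
    selected = []
    for schools in strata.values():
        k = max(1, round(fraction * len(schools)))
        selected.extend(random.sample(schools, k))
    return selected

def sample_students(school, n_eligible, n_draw=35):
    """Stage 2: list the eligible students, then select up to n_draw."""
    eligible = [f"{school}/student_{j}" for j in range(n_eligible)]
    return random.sample(eligible, min(n_draw, n_eligible))

schools = sample_schools(strata)
assessed = {s: sample_students(s, n_eligible=60) for s in schools}
```

Stratifying the first stage guarantees that every region and school type is represented before students are drawn, which is what makes the resulting estimates comparable across strata.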
DOI: 10.4324/9781003255215-9
170 Vítor Rosa and Ana Lourdes Araújo
Methodological framework
As big data studies emerged and revolutionized the field, social scientists
have been confronted with new challenges in the attempt to find the best
way to understand phenomena and to choose the conceptual and
methodological approaches that can be used (Robertson, 2019). For our
inquiry we adopted a qualitative research methodology, based on
documental and bibliographical analysis (papers, reports, statistics,
press, interviews, images, among others). We are aware that each of these
data-collection techniques has its strengths and weaknesses. In the social
sciences, the use of qualitative data is associated with the different
paradigms that attempt to assemble a view of social reality (Koivu &
Damman, 2015; Uher, 2018). Qualitative research produces
How PISA is becoming a “big science” project 171
Number of participating institutions per cycle (2000 to 2018): 4, 6, 6,
10, 14, 14 and 11. Institutions, with the number of PISA cycles in which
each participated: ACER, Australia (7); WESTAT, USA (7); ETS, USA
(6); NIER, Japan (4); CITOGROEP, the Netherlands (4); aSPe, Belgium
(4); CApStAn, Belgium (4); DIPF, Germany (4); HallStat SPRL, Australia
(3); University of Heidelberg, Germany (3); University of Luxembourg,
Luxembourg (3); Pearson, UK (2); Statistics Canada, Canada (2); CET,
Israel (1); Achieve Inc, USA (1); DEPP, France (1); GESIS, Germany (1);
ILS, Norway (1); IPN, Germany (1); LIST, Luxembourg (1); University
of Jyväskylä, Finland (1); University of Melbourne, Australia (1);
University of Twente, the Netherlands (1).
Figure 8.1 “Big science” and human development (in abstract terms)
Source: PISA Technical Reports (2000 to 2018)
Conclusion
Throughout the 1960s and 1970s, many OECD countries reformulated
the contents and methods of their schooling. Throughout the 1980s, the
guideline was to respond to the needs of society and to the
acknowledgement of economic hardship and social problems. In the
1990s, indicator projects were fostered (Indicators of Education
Systems – INES, for instance) to monitor the standard of learning and of
education systems.3 The issues addressed at present in the OECD's
reports concern declining birth rates, the evolution and change of family
structures, demographic ageing, women's participation in the labour
market, migrations, multiculturalism and technological progress.
Education, as mentioned earlier, is also among its concerns, and to this
end the OECD has been implementing several projects. One megaproject,
which can be classed as big science, is PISA, launched in 2000 with a
view to comparing several literacy domains (the common core)
internationally, its target audience being 15-year-old students. Based on
the results, the OECD issues recommendations to improve education
systems. The PISA programme is seen by political decision-makers and
international organizations as a tool to compare school systems and
reveal their strengths and weaknesses. The media coverage and
instrumentalization of results also contribute to turning PISA into a
vector of international competition, whose most symptomatic
manifestation is the classification of countries according to their mean
performances.
The implementation of PISA is subject to a complex set of tender
specifications. Data collection and processing must meet a very
demanding list of standards regarding sampling, design, translation and
correction of the items, the conditions of test delivery and data
management, so as to avoid fraud. One of the most important standards
is the minimum response rate required. As with all survey data, PISA
data are inevitably imperfect, especially as they are based on sampling
and estimated figures. Historical and cultural contexts are seldom
integrated into the analysis of findings.
Large international surveys, namely PISA, seek to provide the evidence
for governmental political action and relegate to the background the
contextualization of learning processes and the political dimensions of
education (Teodoro, 2015). PISA does not allow for “a
Notes
1 On the reach of the literacy fields, a topic to be further expanded below, see
OECD (2018, pp. 14–15).
2 Turner (2003) emphasizes that, when Price (1963) wrote the book, a para-
digm change was underway among sociologists of science. Merton (1938,
1942, 1957) had already remarked on the emerging interest in the status of
science, the recognition of scientific discoveries, scientific priorities, and sci-
ence and technology.
3 On the history of the creation of INES, see Teodoro (2015, 2016).
References
Aspers, P., & Corte, U. (2019). What is qualitative in qualitative research. Quali-
tative Sociology, 42, 139–160. https://doi.org/10.1007/s11133-019-9413-7
Bart, D., & Daunay, B. (2016). Les blagues à PISA, le discours sur l’école d’une
institution internationale. Éditions du Croquant.
Bloem, S. (2015). The OECD directorate for education as an independent knowl-
edge producer through PISA. In H. G. Kotthoff & E. Klerides (Eds.), Govern-
ing educational spaces (pp. 169–185). Sense Publishers.
Breakspear, S. (2012). The policy impact of PISA: An exploration of the nor-
mative effects of international benchmarking in school system performance.
OECD Publishing.
Carvalho, L. M., & Costa, E. (2017). The praise of mutual surveillance in Europe.
In R. Normand & J.-L. Derouet (Eds.), A European politics of education: Per-
spectives from sociology, policy studies and politics (pp. 53–72). Routledge.
Centeno, V. G. (2017). The OECD’s educational agendas: Framed from above,
fed from below, determined in interaction. A study on the recurrent education
agenda. Peter Lang.
Elias, N. (1993). Qu’est-ce que la sociologie? Agora Pocket.
Galison, P., & Hevly, B. (1992). Big science: The growth of large-scale research.
Stanford University Press.
Gastro, M., & Oppelt, T. (2018). Big science and human development – what
is the connection? South African Journal of Science, 114(11/12). https://doi.
org/10.17159/sajs.2018/5182
Hallonsten, O. (2016). Use and productivity of contemporary, multidisciplinary
big science. Research Evaluation, 25(4), 486–495.
Hallonsten, O., & Christensson, O. (2017). Collaborative technological innova-
tion in an academic, user-oriented big science facility. Industry and Higher
Education, 31(6), 399–408. https://doi.org/10.1177/0950422217729284
Josephson, P. R., & Klanovicz, J. (2016). Big science e tecnologia no século
XX. Fronteiras: Revista Catarinense de História, 27, 149–168. https://doi.
org/10.36661/2238-9717.2016n27.8051
Koivu, K., & Damman, E. (2015). Qualitative variations: The sources of diver-
gent qualitative methodological approaches. Quality & Quantity: Interna-
tional Journal of Methodology, 49(6), 2617–2632.
Lingard, B. (2016). Rationales for and reception of the OECD’s PISA.
Educação & Sociedade, 37(136), 609–627. https://doi.org/10.1590/
es0101-73302016166670
Merton, R. (1938). Science, technology and society in seventeenth century
England. Saint Catherine Press.
Merton, R. (1942). Science and technology in a democratic order. Journal of
Legal and Political Sociology, 1, 115–126.
Merton, R. (1957). Priorities in scientific discovery. American Sociological
Review, 22, 635–659.
Niemann, D., & Martens, K. (2018). Soft governance by hard fact? The OECD
as a knowledge broker in education policy. Global Social Policy, 18(3),
267–283. https://doi.org/10.1177/1468018118794076
Nye, M. (1996). Before big science: The pursuit of modern chemistry and physics
1800–1840. Harvard University Press.
OECD. (2000). PISA 2000 Technical report. OECD.
OECD. (2003). PISA 2003 Technical report. OECD.
OECD. (2006). PISA 2006 Technical report. OECD.
OECD. (2009). PISA 2009 Technical report. OECD.
OECD. (2010). The high cost of low educational performance: The long-run
economic impact of improving PISA outcomes. OECD. https://www.oecd.org/
pisa/44417824.pdf
OECD. (2012a). PISA 2012 Technical report. OECD. https://www.oecd.org/
pisa/pisaproducts/PISA-2012-technical-report-final.pdf
OECD. (2012b). Better skills, better jobs, better lives: A strategic approach to
skills policies. OECD. https://www.oecd.org/education/imhe/IMHEinfos_
Jult12_EN%20-%20web.pdf
OECD. (2015). PISA 2015 Technical report. OECD.
OECD. (2018). PISA 2018 Assessment and analytical framework. OECD.
Price, D. S. (1963). Little science, big science . . . and beyond. Columbia Univer-
sity Press.
1962). This already included other pioneering studies, namely what Gary
Becker would publish on “Human Capital” (Becker, 1964), which has
since served as the locus classicus of the topic. The theory of human
capital became ubiquitous in the works of the OECD, assuming the role
of scientific (and economic) legitimation of the climate of euphoria, to
use Husén's term (Husén, 1979), which would shape the expansion of
education systems in the 1960s and 1970s.
As Vera Centeno remarked, the OECD
Additionally, vis-à-vis the OEEC, which was able to issue binding deci-
sions, the OECD was equipped with weaker legal instruments. Regarding
its goals, there is no explicit reference to education in the OECD
Convention, though “there was always an ‘inferred role’, derived from early
human capital formulations of links between economic productivity and
educational investment” (Rizvi & Lingard, 2009, p. 438). Besides, as
Papadopoulos observed
Gone are the days when education did not come within the purview of
the OECD (and of its predecessor, the OEEC). At the end of the second
decade of the 21st century, the OECD's Secretary-General, Angel Gurría,
calls for the centrality of education in development processes. For this
reason, within the framework of its powers, the OECD assumes the
priority of offering its member (and associate) countries a “powerful
tool” that will enable them “to fine-tune their education policies”,
constructed from “the world's most comprehensive and reliable indicator
of students' capabilities”. This is the ambitious role given to PISA, a
programme launched in 2000 and repeated every three years since.
Conclusion 183
The magic of this argument does not lie simply in allowing an
understanding of the past, by finding a constant in the relation between
economic growth and knowledge capital,6 but especially in estimating
projections for the future. This forecasting endeavour was undertaken by
Hanushek and Woessmann in a study on the “Asian miracle” (Hanushek &
Woessmann, 2016), but also in a more focused study, on the Economic
Benefits of Improving Educational Achievement in the European Union
(Hanushek & Woessmann, 2019).
In this latter study, carried out at the request of the European
Commission, E. Hanushek and L. Woessmann quantify the economic
benefits of educational improvement for each of the EU countries, from
an analysis centred on the relationship between educational achievement
(as measured by PISA) and the long-run growth of nations. The report
incorporates projections of the dynamics of educational reform – that it
takes time for student improvements to appear and for better-skilled
workers to become a noticeable proportion of the workforce – and
models several educational improvement scenarios.
Table C.1 The economic benefits of improving educational achievement
in the European Union. The value of each reform is given at present
costs (in billions), as a percentage of present GDP and as a percentage of
future GDP (discounted):
• Increasing average performance (25 points): 71 027 billion; 340 % of
present GDP; 7.3 % of future GDP
• Achieving universal basic skills: 37 898 billion; 188 %; 3.9 %
• At most 15 percent low achievers: 5 223 billion; 25 %; 0.5 %
• Enhancing skills of early school leavers: 7 097 billion; 34 %; 0.7 %
• At most 10 percent early leavers: 1 144 billion; 6 %; 0.1 %
• Increasing top performance: 4 615 billion; 22 %; 0.5 %
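The projection logic behind estimates of this kind can be made concrete with a stylized calculation: a reform raises annual growth once better-skilled cohorts phase into the workforce, and the resulting stream of GDP gains is discounted back to the present. All parameter values in the Python sketch below are illustrative, not those used by Hanushek and Woessmann (2019):

```python
# A stylized sketch of the projection logic: a reform adds a growth
# bonus that phases in as reformed cohorts replace the pre-reform
# workforce, and the GDP gains are discounted to a present value.
# Every parameter value here is illustrative, not H&W's.

def reform_present_value(gdp0, base_growth, extra_growth,
                         phase_in_years, horizon, discount_rate):
    """Present value of the GDP gains from a reform, in units of gdp0."""
    pv_gain = 0.0
    baseline = reformed = gdp0
    for year in range(1, horizon + 1):
        baseline *= 1 + base_growth
        # The growth bonus ramps up during the phase-in period,
        # then applies in full.
        ramp = min(1.0, year / phase_in_years)
        reformed *= 1 + base_growth + ramp * extra_growth
        pv_gain += (reformed - baseline) / (1 + discount_rate) ** year
    return pv_gain

# Example: 1.5% baseline growth, a 0.2-point growth bonus phasing in
# over 40 years, evaluated over an 80-year horizon at a 3% discount rate.
value = reform_present_value(gdp0=1.0, base_growth=0.015,
                             extra_growth=0.002, phase_in_years=40,
                             horizon=80, discount_rate=0.03)
```

Even this toy version shows why the reported values can exceed several multiples of present GDP: a small permanent growth bonus, compounded over many decades, dwarfs the economy's current size despite discounting.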
Our focus has been the internal validity of the H&W statistical claims.
By focusing on extending data up through the present (2014), and
matching test score change with subsequent economic growth – all
logical moves for pursuing the overarching question – the claims of
H&W are rendered invalid. As stated previously, this is not because
we endorse Human Capital premises but because invalidation based
on the same dataset and methodology, obtained after successfully
reproducing H&W’s (2015a) findings, is more powerful than other
genres of critique. Utilizing the same data and methods renders our
critique conclusive.
(Komatsu & Rappleye, 2017, p. 20)
190 António Teodoro
Komatsu and Rappleye insist that their purpose is not to contend that
education plays no role in economic growth or that the quality of
learning has no bearing on economic success. What they aim to
demonstrate is that setting learning goals centred on skills that generate
economic gain can ultimately prove harmful to students' learning
(Komatsu & Rappleye, 2017). One possible outcome of this approach is
that political and administrative authorities, teachers, families and
students formulate strategies that seek the easier route, that of finding
immediate reward (good results in the tests), giving up on more complex
and meaningful learning capable of creating not only “better jobs” and
“better lives” but also “better and fair societies”. And these scholars end
by pointing out what may be an extreme irony: the political claims that
assume (and foster) PISA results as the prime indicators of the reality of
education systems are, after all, the cause of the decline and stagnation
of the quality of students' learning in many of those countries and
economies.
The OECD seems to get along well with antidemocratic liberalism (in
the People's Republic of China, but also in Chile, Hungary or Poland).9
As Yascha Mounk states in his remarkable essay The People vs.
Democracy: Why Our Freedom Is in Danger and How to Save It
(Mounk, 2018),
In this context, the OECD seems to continue to care more about liberal-
ism and much less about democracy. And, in education, that reversal of
priorities constitutes a capital sin.
The OECD's proposals are also questionable from a humanistic and critical perspective of cosmopolitanism, which assumes the universality of the human condition and the equal dignity of human beings. The
OECD comes forward today as an international organization that
works to build “better policies for better lives”, “to shape policies that
foster prosperity, equality, opportunity and well-being for all”.10 In the
field of education, two new buzz phrases are added: “better skills” and
“better jobs”.
The narrative developed in the OECD’s documents rests on an indi-
vidualistic view of the world, and of the economic and cultural relations
that human beings establish among one another, in search of better jobs
and a better life. It is a world of free consumers who fight for better
jobs, accumulating better skills at school and throughout life, in an iso-
lated journey, in constant competition for survival in a hostile world
threatened by unemployment. In consumption, no solidarity is estab-
lished; one competes for better prices and better social positions. It is
in production, in labour, that the values of solidarity are consolidated
as a starting point for the construction of societies guided by social jus-
tice and citizenship engaged with the dignity of all human beings. This dimension, which has profound implications for the organization of schools and for the modes of learning and teaching, and which is clearly depicted in the history of modern pedagogy by Adolphe Ferrière, Jean Piaget, John Dewey, Célestin Freinet and Paulo Freire, regrettably falls outside the OECD's concerns and proposals.
"In the dark, all schools and education systems look the same" is the title Schleicher and Zoido (2016, p. 374) give to a section of their chapter in the Handbook of Global Education Policy (Mundy et al., 2016). The question is not the commendable effort to "illuminate" education systems with relevant information. The key issue is that the manner in which one "illuminates" derives from options of a political nature that must be framed and debated in the public space. These are not merely technical issues, and the indicators are not neutral. Social justice and the polis are concepts absent from the new OECDism.11 As Iveta Silova, Komatsu and Rappleye challenge us:

The reality is that we stand on the precipice of a planetary cliff, with two options laid out before humanity. One side is the continued expansion of democracy, the further extension of human rights and
freedoms, and concerted efforts to address the growing threats and reali-
ties of global climate change. On the other is the dismantling of democ-
racy to be replaced by a populist, authoritarian rule; increased attacks on
the marginalized, oppressed and exploited populations of the world; and
acceleration of the degradation of planet Earth. We need international
organizations to be fully capable of confronting the challenges of the
post-truth, radically sceptical world we live in today, which has led to the
rise in atavistic, xenophobic neopopulist movements. This can be accom-
plished by (a) bridging knowledge production between universities (and
research) and the public, (b) supporting the revitalization of the public
spheres in old and new forms, (c) facilitating discourses that challenge the
dominant ideologies of today, (d) training the next generation of public
intellectuals and (e) serving as public intellectuals ourselves, intervening
in the public spheres to reaffirm our pursuit of social justice, democracy
and truth itself.12
Notes
1 See Eide, 1990.
2 My position, which I have been arguing since the publication of the article Organizações Internacionais e políticas educativas nacionais. A emergência de novas formas de regulação transnacional ou uma globalização de baixa intensidade [International Organizations and National Education Policies: The Emergence of New Forms of Transnational Regulation, or a Low-Intensity Globalization] (Teodoro, 2001).
Conclusion 195
References
Becker, G. S. (1964). Human capital: A theoretical and empirical analysis, with special reference to education. The University of Chicago Press.
Centeno, V. G. (2017). The OECD’s educational agendas – framed from above,
fed from below, determined in interaction. A study on the recurrent education
agenda. Peter Lang. https://doi.org/10.3726/b12774
Dale, R. (2008). Brief critical commentary on CWEC and GSAE 8 years on. Paper presented at the 52nd Annual Conference of the Comparative and International Education Society (CIES), Teachers College, Columbia University, New York, March 17–21.
Eide, K. (1990). 30 years of international collaboration in the OECD. International Congress 'Planning and Management of Educational Development', Mexico, March 26–30, 1990. UNESCO ED-90/CPA.401/DP.1/11.
Elfert, M. (2019). The OECD, American power and the rise of the “economics
of education” in the 1960s. In C. Ydesen (Ed.), The OECD’s historical rise
in education, global histories of education (pp. 39–61). Palgrave Macmillan.
https://doi.org/10.1007/978-3-030-33799-5_3
Hanushek, E., & Woessmann, L. (2015a). The knowledge capital of nations:
Education and the economics of growth. MIT Press.
Hanushek, E., & Woessmann, L. (2015b). Universal basic skills: What countries
stand to gain. OECD.
Hanushek, E., & Woessmann, L. (2016). Knowledge capital, growth, and the East Asian miracle. Science, 351(6271), 344–345. doi:10.1126/science.aad7796
Hanushek, E., & Woessmann, L. (2019). Economic benefits of improving educational achievement in the European Union: An update and extension. EENEE Analytical Report No. 39. European Commission.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation
and education policy. Pergamon, Elsevier.
Husén, T. (1979). L'école en question. Pierre Mardaga.
Komatsu, H., & Rappleye, J. (2017). A PISA paradox? An alternative theory
of learning as a possible solution for variations in PISA scores. Comparative
Education Review, 61(2). https://doi.org/10.1086/690809
Leimgruber, M., & Schmelzer, M. (Eds.). (2017). The OECD and the interna-
tional political economy since 1948. Springer Berlin Heidelberg.
Levin, B., & Fullan, M. (2008). Learning about system renewal. Educational
Management, Administration and Leadership, 36(2), 289–303.
Miranda, S. de. (1981). Portugal e o ocdeísmo. Análise Psicológica, 1(II), 25–38.
Mounk, Y. (2018). The people vs. democracy: Why our freedom is in danger and how to save it. Harvard University Press.
Mundy, K., Green, A., Lingard, B., & Verger, A. (Eds.). (2016). The handbook of global education policy. Wiley Blackwell.
Nóvoa, A. (1998). Histoire & Comparaison (Essais sur l’Éducation). Educa.
OECD. (1989). Education and the economy in a changing society. OECD.
Papadopoulos, G. (1994). Education 1960–1990: The OECD perspective.
OECD.
Rappleye, J., & Komatsu, H. (2019). Is knowledge capital theory degenerate?
PIAAC, PISA, and economic growth. Compare: A Journal of Comparative and
International Education. doi:10.1080/03057925.2019.1612233
Rizvi, F., & Lingard, B. (2009). The OECD and global shifts in education policy.
In R. Cowen & A. M. Kazamias (Eds.), International handbook of compara-
tive education. Springer. https://doi.org/10.1007/978-1-4020-6403-6_28
Rubenson, K. (2008). OECD education policies and world hegemony. In R.
Mahon & S. McBride (Eds.), The OECD and transnational governance
(pp. 241–259). UBC Press.
Sahlberg, P. (2016). The global educational reform movement and its impact on schooling. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The handbook of global education policy (pp. 128–144). Wiley Blackwell.
Saltman, K. J., & Means, A. J. (Eds.). (2019). The Wiley handbook of global educational reform. Wiley Blackwell.
Schleicher, A. (2018). World class: How to build a 21st-century school system. OECD Publishing. doi:10.1787/9789264300002-en
Schleicher, A., & Zoido, P. (2016). The policies that shaped PISA, and the policies that PISA shaped. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The handbook of global education policy (pp. 374–384). Wiley Blackwell.
Schmelzer, M. (2016). The hegemony of growth: The OECD and the making of
the economic growth paradigm. Cambridge University Press.
Schultz, T. (1962). Investment in human beings. Journal of Political Economy,
70(5), 1–8.
Sedgwick, M. J. (Ed.). (2019). Key thinkers of the radical right. Behind the new
threat to liberal democracy. Oxford University Press.
Silova, I., Rappleye, J., & Komatsu, H. (2019). Measuring what really matters: Education and large-scale assessments in the time of climate crisis. ECNU Review of Education, 2(3), 342–346. doi:10.1177/2096531119878897
Stromquist, N. P. (2016). Using regression analysis to predict countries’ economic
growth: Illusion and fact in education policy. Real-World Economics Review,
76, 65–74.
Teodoro, A. (2001). Organizações internacionais e políticas educativas nacionais:
A emergência de novas formas de regulação transnacional ou uma globalização
de baixa intensidade. In S. R. Stoer, L. Cortesão, & J. A. Correia (Orgs.), Da
Crise da Educação à “Educação” da Crise: Educação e a Transnacionalização
dos Mecanismos de Regulação Social (pp. 125–161). Edições Afrontamento.
Teodoro, A. (2003). Educational policies and new ways of governance in a transnationalization period. In C. A. Torres & A. Antikainen (Eds.), The international handbook on the sociology of education: An international assessment of new research and theory (pp. 183–210). Rowman & Littlefield.
Teodoro, A. (2019). The end of isolationism: Examining the OECD influence in
Portuguese education policies, 1955–1974. Paedagogica Historica. doi:10.1080/00309230.2019.1606022
Teodoro, A. (2020). Contesting the global development of sustainable and inclu-
sive education. Education reform and the challenges of neoliberal globaliza-
tion. Routledge.
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266. https://doi.org/10.1007/
s10833-019-09367-x
Index
accountability 14, 19, 48, 49, 50, 52, 54, 59, 60, 61, 62, 63, 64, 65, 69, 88, 94, 99
active form 11, 12, 13
Addey, C. 4, 5, 11, 12, 14, 16, 17, 18, 19, 20, 34–35, 98, 138, 139
Agasiti, T. 34–35
Agenda Setting Theory 144
Aguiar, M. 91, 93
Alasuutari, P. 34–35, 41, 59, 66
Application Protocol 108
Araújo, L. 105–106
Aspers, P. 171, 177
assemblages 54
Assessment of Higher Education Learning Outcomes (AHELO) project 4
assessments 16, 19; market 16; methodologies 17; products 16
Auld, E. 26, 34
Australia 145
Avelar, M. 89
Ávila, P. 130, 139
Baird, J.-A. 146–147, 164
Ball, S. 89
Baroutsis, A. 145, 161
basic skills 60, 62, 63, 64
Baudelot, C. 62
Bauer, A. 73
Becker, G. 55, 181
Benavente, A. 132, 139
Benavot, A. 33
benchmarks 51, 54, 58; benchmarking 57
Berliner, D. C. 49
Bieber, T. 143
big data 170, 171, 179
big science 3, 8, 169, 170, 171, 172, 173, 174, 176
BNCC (of Brazil curricular base) 6, 86, 88, 89, 90, 92, 93, 94, 95, 96, 97, 99
Bodin, A. 128, 139
Bottani, N. 2
Bourdieu, P. 65, 139
Brazil 4, 6, 183, 185
Breakspear, S. 14
Busch, L. 11, 20
calculability 55
Canada 146–147; Canadian Standards Association 15
Carvalho, L. M. 74, 105, 170, 177
Cassio, F. 86, 90, 93
Castells, M. 142
Castro, M. H. 87, 88, 90
CeiED (of Lusofona University) 4, 144
Centeno, V. 170, 177, 181
centralization 63
CERI (of the OECD) 1, 4, 52
Christensen, K. B. 106
Christensson, O. 173, 178
citations 31–33
CNE (from Brazil) 90, 91, 92, 93, 96
Coe, K. 142, 162
Coleman, R. 144
competencies 74, 80, 84, 90, 91, 92, 94, 95, 96, 99
competition 50, 53
CONFENEM 126
Conselho Nacional de Educação (from Portugal) 105
constructs 75, 86, 97
Cordero, J. 34–35, 39–40
Sellar, S. 13, 14, 17, 18, 19, 33–35, 39
Serrão, A. 143, 145, 147, 161–164
Shanghai 53, 68, 146–147
Shaw, D. L. 144
Silova, I. 194
Sim-Sim, I. 130, 141
Sjøberg, S. 106
skills 74, 75, 79, 80, 81, 82, 86, 89, 90, 92, 94, 95, 96, 99; cognitive 55, 56
Soares, F. 91, 92, 93
soft power 18
Spain 145
Star, S. L. 11, 12, 14, 20
statistics 52, 55, 57
Steiner-Khamsi, G. 14, 18, 19, 105
Stolpe, I. 18
storytelling 52, 54
Stromquist, N. 188, 189
Takayama, K. 33
TALIS 1, 4, 191
Teltemann, J. 34–35, 37
Teodoro, A. 1, 8, 74, 170, 176, 177, 179
tests 49, 50, 51, 54, 55, 62, 71, 73, 74, 75, 76, 77, 78, 80, 82, 83, 98; low-impact 105
Thatcher government 183
TIMSS 1, 7, 74, 126, 127, 128, 129, 130, 132, 133, 134, 135, 136, 137, 138, 186
tone of a newspaper article 156
Tuinjmann, A. 57
Tuttman, M. 91, 93
UK 145–146; National Standards Body (BSI) 15
UNESCO 49, 74, 92, 106, 126, 186, 188
USA 145
van Eck, N. J. 25–26
Vento, E. 34–35
Verger, A. 14, 16, 19, 170, 179
Visão (weekly newspaper) 147, 162
Visualisation of Similarities (VOS) 25
Volante, L. 170, 179
Waldow, F. 105
Weinberg, A. M. 3, 4
Weller, V. 70
WESTAT 175, 176
Wiseman, A. 17
Woessmann, L. 186, 187, 188, 189
World Bank (The) 57, 186, 188
World War II 192
Yariv-Mashal, T. 14, 19, 143
Zhao, Y. 3, 8, 105–106, 183
Zumbo, B. 17