
Critical Perspectives on PISA as
a Means of Global Governance

This volume offers a critical examination of the Programme for
International Student Assessment (PISA), focusing on its origins
and implementation, relationship to other international large-scale
assessments, and its impacts on educational policy and reform at national
and cross-national levels.
Using empirical data gathered from a research project carried out by
the CeiED at Lusofona University, Lisbon, the text highlights connections
between PISA and emergent issues including the international circulation
of big science, expertise and policy, and identifies its conceptual and
methodological limits as a global governance project. The volume
ultimately provides a novel framework for understanding how OECD
priorities are manifested through a regulatory instrument based on
Human and Knowledge Capital Theory, and so makes a powerful case
for the search for new humanistic approaches.
This text will benefit researchers, academics and educators with an
interest in education policy and politics, international and comparative
education, and the sociology of education more broadly. Those interested
in the history of education will also benefit from this volume.

António Teodoro is Professor of Sociology of Education and Comparative
Education at Lusofona University, Portugal. He is also Director of
the Interdisciplinary Research Centre for Education and Development
(CeiED).
Routledge Research in Education Policy and Politics

The Routledge Research in Education Policy and Politics series aims to


enhance our understanding of key challenges and facilitate on-going
academic debate within the influential and growing field of Education
Policy and Politics.
Books in the series include:

Governing the School under Three Decades of Neoliberal Reform


From Educracy to the Education-Industrial Complex
Richard Münch

Nancy Fraser and Participatory Parity


Reframing Social Justice in South African Higher Education
Edited by Vivienne Bozalek, Dorothee Hölscher and Michalinos
Zembylas

Educating the Neoliberal Whole Child


A Genealogical Approach
Bronwen Jones

Education Policy and the Political Right


The Burning Fuse beneath Schooling in the US, UK and Australia
Grant Rodwell

Critical Perspectives on PISA as a Means of Global Governance
Risks, Limitations, and Humanistic Alternatives
Edited by António Teodoro

For more information about this series, please visit: www.routledge.


com/Routledge-Research-in-Education-Policy-and-Politics/book-series/
RREPP
Critical Perspectives on
PISA as a Means of Global
Governance

Risks, Limitations, and Humanistic Alternatives

Edited by António Teodoro


First published 2022
by Routledge
605 Third Avenue, New York, NY 10158
and by Routledge
4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa
business
© 2022 selection and editorial matter, António Teodoro;
individual chapters, the contributors
The right of António Teodoro to be identified as the author of
the editorial material, and of the authors for their individual
chapters, has been asserted in accordance with sections 77
and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted
or reproduced or utilised in any form or by any electronic,
mechanical, or other means, now known or hereafter
invented, including photocopying and recording, or in any
information storage or retrieval system, without permission in
writing from the publishers.
Trademark notice: Product or corporate names may be
trademarks or registered trademarks, and are used only for
identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this book has been requested
ISBN: 978-1-032-18577-4 (hbk)
ISBN: 978-1-032-18581-1 (pbk)
ISBN: 978-1-003-25521-5 (ebk)
DOI: 10.4324/9781003255215
Typeset in Sabon
by Apex CoVantage, LLC
Contents

List of figures vii


List of tables viii
About the editor and contributors x
Acknowledgements xv

Introduction: a success story? Critical perspectives on


PISA as a means of global governance 1
ANTÓNIO TEODORO

1 Invisible struggles, encoded fantasies and ritualized


incantations: a critical infrastructure studies analysis of PISA 11
CAMILLA ADDEY

2 How PISA is present in the scientific production: a


bibliometric review 25
CARLOS DÉCIO CORDEIRO AND VÍTOR DUARTE TEODORO

3 PISA as epistemic governance within the European


political arithmetic of inequalities: a sociological
perspective illustrating the French case 48
ROMUALD NORMAND

4 PISA and curricular reforms in Brazil: the influence of a


powerful regulatory instrument 70
JOÃO LUIZ HORTA NETO

5 Testing PISA tests: a study about how secondary and


college students answer PISA items in mathematics and science 104
VÍTOR DUARTE TEODORO, VÍTOR ROSA, JOÃO SAMPAIO MAIA
AND DANIELA MASCARENHAS

6 International large-scale assessment: issues from


Portugal’s participation in TIMSS, PIRLS and ICILS 126
VÍTOR ROSA

7 PISA in media discourse: prominence, tone, voices and


meanings 142
ANA CARITA, TERESA TEIXEIRA LOPO AND VÍTOR DUARTE
TEODORO

8 OECD and education: How PISA is becoming a “big


science” project 169
VÍTOR ROSA AND ANA LOURDES ARAÚJO

Conclusion: limitations and risks of an OECD global


governance project 180
ANTÓNIO TEODORO

Index 198
Figures

2.1 Cumulative publication and regression output 29


2.2 Bibliographic network map for keywords plus 31
2.3 Author clusters 35
4.1 IDEB variation from 2005 to 2017 72
5.1 Sum of scores of the Mathematics items, by age, box plot
and frequency curve 112
5.2 Sum of scores of the Science items, by age, box plot and
frequency curve 115
5.3 Item facility, correlations by education level, PISA PT,
PISA OECD 117
5.4 Item discrimination, correlations by education level,
PISA PT, PISA OECD 118
5.5 Item assessment by students, by education level, mean
for each scale 120
5.6 Item assessment by students, correlations between
facility and scales, by education level 121
5.7 Item assessment by students, correlations between scales 122
7.1 Frequency of the journalistic genres of the 112
articles with PISA as the main subject, by year (in the
sample considered), by PISA cycle 155
8.1 “Big science” and human development (in abstract terms) 174
8.2 PISA Advisory Group (Countries), from 2000 to 2018 175
Tables

2.1 Analogue studies 27


2.2 Publication per year 29
2.3 Origin of funding and publication by country 30
2.4 Ranking of 10 journals with the highest number of
citations 32
2.5 Ranking of 10 journals with the most articles related to
the theme 32
2.6 20 authors with the highest link strength and their
citations 34
4.1 Comparison of Brazil’s performance in PISA tests and
average performance of the OECD member countries:
2000 to 2015 73
4.2 Description of the scale for Reading Proficiency in
PISA 2018 77
4.3 Description of the scale for Mathematics Proficiency in
PISA 2018 81
4.4 Description of the scale for Science Proficiency in PISA 2018 85
5.1 Students who answered all the items of the booklet on
the PISA test 109
5.2 Sum of scores of the Mathematics items, by school class,
age and education level 111
5.3 Sum of scores of the science items, by school class, age
and education level 114
6.1 Distribution of science items (content area, cognitive
dimension and item type), TIMSS 2015 129
6.2 Dimensions and areas of dimensions, content areas
(CIL and CT) and percentages 135
7.1 Time frame of the selected articles 150
7.2 Fields of analysis, their articulation with the hypotheses
and respective categories 151
7.3 Features of the 112 articles where PISA is the main topic 154
7.4 Predominant tone of the 112 articles in which PISA is the
main topic 157
7.5 Profession of the authors of the 112 articles in which
PISA is the core topic 160
C.1 The economic benefits of improving educational
achievement in the European Union 188
About the editor and contributors

António Teodoro is Professor of Sociology of Education and Comparative
Education at Lusofona University, Lisbon, and Director of the
Interdisciplinary Research Centre for Education and Development
(CeiED). A founder of free teacher trade unionism in Portugal after the
Portuguese Revolution of April 1974, he was the first General Secretary
of the National Teachers Federation (FENPROF), the most representative
Portuguese teacher union (1983–1994). In the recent past, Teodoro
was also chief inspector of primary education (1974–1975), a member
of the National Council of Education (1988–1994) and Advisor to
the Portuguese Council of Ministers for Education, Science, Culture
and Employment (1995–1999). António Teodoro is a cofounder of
the Portuguese Paulo Freire Institute and a member of the board and
vice president of the Research Committee on Sociology of Education
of the International Sociological Association (ISA; 2006–2014).
He is Founder and chair of the Portuguese Society of Comparative
Education (SPCE-SEC), and member-at-large of the Executive
Committee and chair of the Constitutional Standing Committee of
World Council of Comparative Education Societies (WCCES). He is
also Editor and founder of Revista Lusófona de Educação (Lusofona
Journal of Education) and member of the editorial board of dozens
of journals in Portugal, Brazil, France, Spain, Czech Republic, Greece
and Chile.
Recent book with Routledge: Contesting the Global Development of
Sustainable and Inclusive Education. Education Reform and the
Challenges of Neoliberal Globalization (2020). Other books in
English: Editor (with Carlos Alberto Torres) Critique and Utopia.
New Developments in the Sociology of Education in the Twenty-
First Century (Rowman & Littlefield, 2007), and (with Manuela
Guilherme) European and Latin American Higher Education Between
Mirrors. Conceptual Frameworks and Policies of Equity and Social
Cohesion (Sense Publishers, 2014).
Camilla Addey is a Marie Curie Fellow at GEPS, the Globalisation,
Education and Social Policies research centre at the Department
of Sociology of the Autonomous University of Barcelona, Spain.
Formerly, Camilla was lecturer in Comparative and International
Education at Teachers College, Columbia University (USA), and a
researcher at Humboldt University in Berlin (Germany). Previously,
Camilla worked in education at UNESCO. She is interested in
international large-scale assessments in lower- and middle-income
contexts, global educational policy and education privatization. Her
current research project ILAINC, on the privatization of international
education assessment, can be followed on Facebook at “ILSA Inc. The
ILSA industry”. Camilla has published in Comparative Education;
Globalization, Societies and Education; Compare: A Journal of
Comparative and International Education; Critical Studies in
Education; and Assessment in Education: Principles, Policy and
Practice. Her latest edited book is Intimate accounts of education
policy research: the practice of methods, edited with Nelli Piattoeva
and published by Routledge (2021). Camilla is also a director of the
Laboratory of International Assessment Studies (http://international-assessments.org).
Ana Lourdes Araújo is currently a doctoral student at Lusofona
University in Lisbon; she is also a researcher on the project “A success
story? Portugal and PISA (2000–2015)” at the Interdisciplinary Research
Centre for Education and Development (CeiED) at the same
university. She has a master’s degree in educational policies from the
Federal University of Maranhão (UFMA), Brazil. She is a teacher and
pedagogical coordinator in state and federal public schools in
São Luís do Maranhão, Brazil, and a researcher at the UFMA Scientific
Dissemination Laboratory. Her research focuses on public educational
policies, public science and academic literacies.
Ana Carita is a researcher at the Interdisciplinary Research Centre for
Education and Development (CeiED). Currently retired, she worked
at the Psychology and Guidance Services and developed teaching
activities at ISPA in the area of Educational Psychology, between 1989
and 2010, and at Lusofona University between 2010 and 2018, in
graduate and postgraduate training for psychologists and teachers.
She collaborated with other institutions in the continuous and post-
graduate training of teachers and educational psychologists, namely
with Lisbon University. She has developed research especially in the
fields of interpersonal conflict, discipline and citizenship in the school
context, bullying, ethics, areas in which she has authored several
publications. In recent years, she has developed research on the media
representation of some issues in the field of education.
Carlos Décio Cordeiro is an economist by training. He is concerned with
issues related to public administration and financial literacy. He has
been a banker for more than 10 years, a dedicated client manager at
one of the biggest Portuguese banks. He specializes in quantitative
analysis, having obtained a mark of 20 in the quantitative methods
course of his master’s degree. He collaborates on the FCT-funded
project “A success story? Portugal and the PISA (2000–2015)”.
Teresa Teixeira Lopo is a Research Fellow at the Interdisciplinary Research
Centre for Education and Development (CeiED) of Lusófona
University. She is a sociologist, has a master’s degree, a DEA and a
PhD in Educational Sciences from the School of Social Sciences and
Humanities of NOVA University of Lisbon. She is the author or
co-author of several publications and has worked in various national
and international consultancy teams and research projects addressing
the themes of policy and politics of education.
João Sampaio Maia is a researcher at the Interdisciplinary Research
Centre for Education and Development (CeiED), and Invited Associate
Professor at the Lusofona University of Porto, Visiting Professor at the
University of St. Joseph, in Macau, Collaborating Professor at the School
of Education Paula Frassinetti, in Porto, and Professor (retired) at the
School of Education of Porto, lecturing in teacher training programmes
and quantitative research methods courses. He holds degrees in
mathematics and mechanical engineering from the University of
Porto and has a master’s and a doctoral degree in education from the
University of Minho, Portugal. He participated in seven international
educational projects, in European, Asian and African countries and is
author of several scientific and pedagogical books and papers in the
areas of education and mathematics.
Daniela Mascarenhas is an adjunct professor at the School of Education
of the Polytechnic Institute of Porto. She is involved in the training
of early childhood educators and teachers of the first and second
cycles of basic education. She has published several articles and is
a collaborating researcher in several research projects allocated to
the research centres CeiED and inED. She is an integrated researcher
in inED centre. She is a trainer in several continuing education
programmes for teachers and educators in Portugal and São Tomé
and Príncipe. She holds a post-doctorate in Education Sciences, in the
specialty of Pedagogical Supervision, from the University of Minho
(2019), and a PhD in Education and Mathematics Didactics from the
Faculty of Education Sciences of the University of Granada (2011),
validated by the University of Porto. She obtained the Diploma of
Advanced Studies (DEA) in Education and Didactics of Mathematics
from the Faculty of Educational Sciences of the University of Granada
(2010). She graduated in Mathematics (via teaching) from the
University of Minho (2003).
João Luiz Horta Neto is a researcher at the Brazilian National Institute
of Studies and Educational Research (INEP). He holds a PhD in Social
Policy and a master’s degree in Education from the University of Brasília
and is experienced in educational assessment (systems, institutions, plans
and educational programmes), working mainly on the following topics:
public policies, educational evaluation, systems evaluation, large-scale
evaluation and the development of measurement instruments for
large-scale educational assessments (cognitive tests and contextual
questionnaires).
Romuald Normand is Professor of Sociology at the University of Stras-
bourg, Faculty of Social Sciences, France (Research Unit CNRS SAGE:
Societies, Actors and Government of Europe) and Associate Visiting
Professor at the Beijing Normal University and Aarhus University,
Copenhagen. He works on comparative and European education
policies and politics. Romuald Normand is Convenor of the network
28 “Sociologies of European Education” at the European Educational
Research Association. He is a member of the editorial board
of the British Journal of Sociology of Education and Advisory Board
member of the Springer series Educational Governance Research.
Recent publications: with David Dolowitz and Magdalena Hadjiisky
(eds), Shaping Policy Agendas: The Micro-politics of Economic
International Organizations (Edward Elgar Publishing, 2020). With Liu
Min, Luís Miguel Carvalho, Dalila D. Oliveira, and Louis Levasseur
(eds), Education Policies and the Restructuring of the Educational
Profession. Global and Comparative Perspectives (Springer, 2018).
Vítor Rosa holds a PhD in Physical Education and Sports from
Lusofona University and a postdoc in Sociology from ISCTE-IUL. He
was a professor and researcher at Université Paris-Ouest Nanterre La
Défense, France (2015–2016), Académie de Versailles (2017–2018),
Lusofona University (2013–2017) and University of Évora (2010–
2011). He collaborates as a scientific and editorial member of several
international scientific journals and other publications. He has also
collaborated in several research projects. Currently, he is Research
Fellow at CeiED (Interdisciplinary Research Centre for Education and
Development) in the PISA project – “A success story? Portugal and
PISA (2000–2015)” (ref.ª PTDC/CED-EDG/30084/2017).
Vítor Duarte Teodoro, for over 40 years, has been involved in teacher
education (at primary, secondary and tertiary levels), teaching
physics, science education, computer science and mathematics
(secondary and tertiary levels), research and supervision of research
in physics education, mathematics education, educational technology
and research in examinations. He has a long-standing experience
in publishing physics textbooks and educational software. Other
activities include the coordination of teacher networks (both national
and international) and educational consultancy for governmental
bodies (regarding exams, curricula, schools laboratories, digital
educational resources and teacher education).
Acknowledgements

This book derives from a research project undertaken by the
Interdisciplinary Research Centre for Education and Development
(CeiED), at Lusofona University, in Lisbon, titled “A Success Story?
Portugal and the PISA (2000–2015)”, funded by the Portuguese Foundation
for Science and Technology (FCT) (Ref. PTDC/CED-EDG/30084/2017).
We gratefully acknowledge this financial support, both for the research
and for the translation of the chapters originally written in a language
other than English.
The researchers from CeiED were joined by other researchers working
in France, Spain and Brazil, who, at different times, participated in
some of its activities. The contributions from these researchers (Camilla
Addey, Romuald Normand and João Luiz Horta Neto) have enabled
us to understand that the issues analysed by the Portuguese team, with
Portugal as the centre of empirical analysis, show similarities with the
situation in countries with such distinct realities as France or Brazil. To
them all, our deepest appreciation.
Many chapters of the book were conceived and written in Portuguese.
Their publication in English was only possible due to the remarkable
work carried out by Isabel Canhoto, who, more than a competent
translator, was a watchful adviser concerned with conceptual rigour
and linguistic accuracy. To her, too, we give our deepest appreciation.
A final thanks to Elsbeth Wright and AnnaMary Goodall for the
professionalism and dedication put into the production of the book. Our
thanks are extended to all those at Routledge who make possible the
dissemination of the research carried out in Academia.
Introduction
A success story? Critical perspectives on
PISA as a means of global governance
António Teodoro

Major international studies, such as TIMSS, PISA, PIRLS or TALIS,
have nowadays become among the main governance technologies in the
fields of education and training. Among these studies, the Programme for
International Student Assessment (PISA) is surely the one which exerts
the greatest influence on political decision-makers, school administrators
or the media. PISA is a programme undertaken by the Organization for
Economic Cooperation and Development (OECD), with global ambition
in the field of evaluation and design of education policies (Schleicher,
2018).
PISA originated at the end of the 1980s, when there was an intense
debate in this organization on how to influence the course of the education
policies of member states.1 Although the OECD was created in a very
particular context (the reconstruction of post-war Europe pursued by
the Marshall Plan)2 and a very different one from the organizations that
belonged to the United Nations system, its action followed the prevailing
patterns: on the one hand, technical assistance, sought by national
authorities, above all as a means to legitimize the internal options taken in
the meantime3; on the other hand, the regular organization of initiatives
(seminars, conferences, workshops), studies and publications which
not only established a “global agenda” but also defined how education
problems should be addressed, equated and solved (Teodoro, 2020).
In the context of a strong impetus towards new forms of hegemonic
globalization, namely neoliberal globalization,4 there was a decisive
change in the relations between international governmental organizations
and the policies developed within the scope of national states. These relations
now have their nerve centre in international large-scale statistical surveys
and, in particular, the INES project (Indicators of Education Systems),
carried out by the Centre for Educational Research and Innovation
(CERI) of the OECD. In these statistical projects, the choice of indicators
constitutes the decisive issue in setting a global agenda, with significant
impact on the education policies both of the central countries and of
those other countries located in the semi-periphery of central spaces.
DOI: 10.4324/9781003255215-1

At the outset, the INES programme was marked by strong controversy
and broad opposition from within the OECD itself (Henry et al.,
2001). Its best-known public expression would become the annual flagship
publication Education at a Glance. This OECD undertaking was
decided following a conference held in Washington in 1987, by the
initiative and at the invitation of the US Administration and the OECD
Secretary General, a conference in which representatives of 22 countries
participated, as well as several guest experts and observers. The main
item on the OECD agenda in the field of education at the time was the
quality of teaching, which was used as a starting point for launching
INES, possibly this organization’s most significant activity
throughout the 1990s.
Acknowledging that the more complex issue was not so much
calculating valid indicators but rather the classification of the concepts,
the representatives of the OECD’s member countries and the guest experts
analysed a set of more than 50 possible national indicators, which they
eventually divided into four categories: (i) input indicators, (ii) output
indicators (results), (iii) process indicators and (iv) financial and human
resources indicators (Bottani & Walberg, 1992).
Implementing this programme enabled the OECD to amass an
important database of national education indicators, which have been
feeding Education at a Glance since 1992. In these Glances, besides
the traditional indicators, be they the different schooling rates and the
various indexes of access to education, or the expenses with education
and the qualifications of the teaching staff, there is a set of new indicators,
which have had profound consequences, upstream, for the formulation
of education policies at national level. These new indicators are presented
by the OECD in a particularly meaningful manner:

To respond to the growing interest of public opinion and public
powers regarding the results of teaching, over one third of the
indicators presented in this edition address results, both on the
personal level and regarding the labour market, and the assessment
of school efficiency. The indicators which derive inspiration from
the first International Adult Literacy Survey give an idea of the level
of proficiency of adults’ base competences and of the ties between
these competences and some key features of education systems.
The publication also comprehends a complete series of indicators
on the results in Mathematics and Sciences, which covers nearly
all the OECD countries and are inspired in the Third International
Mathematics and Sciences Study. Moreover, the indicators taken from
the first survey on the schools of the INES programme contribute to
expand the base of available knowledge on school efficiency.
(CERI, 1996, p. 10)
PISA emerges from this intention to measure school efficiency through
students’ learning. To this end, it was necessary to find a measure of
comparison, which could overcome an issue so far considered almost
unsurmountable: how to build large-scale standardized tests that would
take into account the curricular diversity of participating countries, while
at the same time respecting their cultural idiosyncrasies. The answer to
this question was found in a very popular concept since the 1970s in
the sociology of professions and adult education, the concept of literacy,
defined later as the “capacity of students to analyze, reason and
communicate effectively as they pose, solve and interpret problems in a variety
of subject matter” (OECD, 2005, p. 3).
Overcoming this problem enabled the OECD to present PISA as a
new instrument, different from those it had previously used. The core
of this OECD narrative is that education policies need to be informed
by scientific knowledge. Flying in the face of the historic contributions
of scientific research in the field of education (and pedagogy), Andreas
Schleicher, head of the Directorate of Education and Skills of the OECD,
argued that relevant knowledge is that which derives from large statistical
surveys built upon new indicators: “It was the idea to apply the rigours
of scientific research to education policy that nudged the OECD to create
PISA in the late 1990s” (Schleicher, 2018, p. 17). Schleicher, who assumes
PISA as his brainchild,5 stresses that it was this survey that started a new
generation of education policies informed by scientific research:

Of course, the OECD had already published numerous comparisons
on education outcomes by that time, but they were mainly based
on years of schooling, which isn’t always a good indicator of what
people are actually able to do with the education they have acquired.
(Schleicher, 2018, p. 18)

Despite vast critical literature (see the literature review by Zhao, 2020),
PISA has become one of the most influential forces in global education.
Its sway far exceeds that of other international large-scale assessments
(ILSA), chronologically older and significantly closer to the learning
of students in the different countries, such as those undertaken by the
International Association for the Evaluation of Educational Achievement
(IEA).
The fact that it stems from an international organization that has become
the main think tank in different areas at world level (Martens & Jakobi,
2010), playing an influential role as soft power within the framework of
the global education reform movement (GERM), is enabling PISA to take
on the features of a “big science” project, in the sense that, in the 1960s,
physicists Alvin M. Weinberg and Derek de Solla Price gave the phrase:
that of a project whose concern is to create large-dimension complexes,
requiring enormous resources, diverted from other projects and priorities,
with connections to the business and financial world (in some cases, to
the military-industrial complex), employing a growing number of people
and companies, and whose overriding concern is to grow and feed itself
on a continuous stream of new projects (Price, 1963; Weinberg, 1961).6
The “new PISAs” are significant: a first one for the so-called developing
countries (PISA for Development); a second for schools individually
considered (PISA for Schools); a third for adults, the Programme for the
International Assessment of Adult Competences (PIAAC); a fourth for
working conditions in schools, the Teaching and Learning International
Survey (TALIS); and the latest, PISA Baby, Early Childhood Education
and Care. For higher education, too, the OECD has designed a PISA, the
Assessment of Higher Education Learning Outcomes (AHELO) project.
Announced in 2009, the programme was developed between 2010 and
2013, involving 17 countries from different regions. The instruments
were administered to about 4,900 scholars and around 23,000 students
close to their graduation (bachelor’s degree). The learning outcomes
that the feasibility study aimed to test were generic skills, such as critical
thinking, analytical thinking, problem-solving and specific discipline
skills in economics and engineering. Furthermore, questionnaires were
applied to students, lecturers and administrative staff, as regards context,
with a view to linking learning outcomes to students’ origins and paths
and their learning contexts. The final report showed that participating
countries, with few exceptions, were strongly critical of the initiative
and proposed it be abandoned (Magalhães, 2017). Because this route
was blocked, CERI-OECD submitted a new programme, “Fostering
and Assessing Students Creative and Critical Thinking Skills in Higher
Education and Teacher Education” (2018–2020), and the publication
of its international report is scheduled for 2021.
This book takes as its centre PISA and the OECD project of establishing
the “scientific bases”, to use the phrase advanced by Schleicher (2018),
for a global education governance. It derives from a research project
undertaken by the Interdisciplinary Research Centre for Education and
Development (CeiED), of Lusofona University, in Lisbon.7 The researchers
from CeiED were joined by other researchers working in France, Spain
and Brazil, who, at different times, participated in some of its activities.8
The contributions from these researchers (Camilla Addey, Romuald
Normand and João Luiz Horta Neto) have enabled us to understand that
the issues analysed by the Portuguese team, with Portugal as the centre
of empirical analysis, show similarities with the situation in countries
with such distinct realities as France or Brazil. To them all, our deepest
appreciation.
***
Chapter 1 is titled “Ritualized incantations, encoded fantasies, and
invisible struggles: a critical infrastructure studies analysis of PISA”. Its
author, Camilla Addey, raises the question: What does it mean to
understand PISA as infrastructure? The chapter draws on Keller Easterling’s
book “Extrastatecraft: The Power of Infrastructure” to analyse how the
concepts of active form, stories and extrastatecraft help us understand how
PISA is transforming education hidden in plain sight, and to analyse the
striking parallels between ISO 9000 (the quality management standards)
and PISA. Easterling’s theory distinguishes between the declared intent
of infrastructure and its underlying disposition, thus highlighting PISA’s
inherent agency as opposed to its declared content. The chapter shows
how the politics written into the global infrastructure space diverges
from the declared intent, often acting as an essential partner for the state
whilst also being beyond the reach of state jurisdiction.
In Chapter 2, Carlos Décio Cordeiro and Vítor Duarte Teodoro
present a bibliometric review on “How PISA is present in the scientific
production”. The PISA study has revolutionized the policy debate on
education. This paradigm shift has caused fierce debates, detailed in
the literature reviewed, in South East Asia, Central Europe and
North America. Given the relevance of the topic, it was necessary to
understand the knowledge produced around the study conducted by
the OECD. This chapter makes a quantitative analysis of the scientific
production related to the term “PISA OECD”. Four hundred and eight
articles were selected from the Web of Science database and analysed.
The study analyses production by year of publication, country and
funding. Bibliographical references and keywords are also analysed. The
most cited authors were grouped into clusters and their scientific
production was dissected. The analysis found sustained growth in
publications and research funding in this area. The most relevant
thematic area is public policies, and it should be noted that schools and
students also occupy a prominent place in some of the literature.
Chapter 3 is titled “PISA as an epistemic governance within the
European political arithmetic of inequalities. A sociological perspective
illustrating the French case”. In the first part of the chapter, the author,
Romuald Normand, highlights how PISA has given rise to an international
space of circulation between science, expertise and policy despite
increasing criticism. After shedding light on the genesis of the survey
and its contribution to a worldwide government by numbers, legitimized
and disseminated by networks of experts and policy-makers, sociological
research has moved towards a more precise scrutiny of the survey’s
structuring effects on education policies. The chapter
contributes to this emerging field of research by illustrating and situating
the French case. However, it must be noted that this national case can
only be understood if it is contextualized in a broader European space.
Indeed, the PISA survey is a key component of the European lifelong
learning strategy to which many elements of French policy are related,
as in other European countries. To avoid methodological nationalism, it
is therefore important to consider how PISA takes place in a European
political arithmetic of inequalities, before specifying the dissemination of
this survey within the French context. The author then shows how an
epistemic governance has been built around the survey, from knowledge
produced by an association between experts and policy-makers and
through a new formulation of the French ideal of equal opportunities in
education.
In Chapter 4, “Pisa and the curricular reforms in Brazil: The influence
of a powerful regulatory instrument”, João Luiz Horta Neto, based on
documental and bibliographical research, discusses PISA and its influence
on Brazilian public policies on education through an analysis of the
country’s main curricular document: the National Common Curricular
Base, BNCC (in the Portuguese acronym), approved in 2018. To this
end, the core conceptions which structure the three PISA tests, and
their changes from 2000 to 2020, are presented and discussed. Some
of the weaknesses of its results are also discussed on the basis of the
descriptions of the proficiency scales, one of the ways used to disseminate
and explain its results. As an example of the influence PISA has over
initiatives carried out in several countries, the process of formulation of
the BNCC, from 1995 to 2018, is discussed, as well as the main
disputes that arose during this process. Moreover, the conceptions of
PISA present in its formulation are discussed, as well as the emphasis given
to external assessment as a way to implement the BNCC. The analysis
shows that the two instruments, albeit different in nature, are framed as
inducers of improvement in education standards, operating through test
results and the incentive to compete for better results.
Chapter 5 is titled “Testing PISA Tests: A study about how secondary
and college students answer PISA items in Mathematics and Science”.
The authors, Vítor Duarte Teodoro, Vítor Rosa, João Sampaio Maia
and Daniela Mascarenhas, describe a study about how students from
different age groups, education levels and courses answer PISA items and
how they evaluate different aspects of these items. The hypothesis was
that PISA items are too difficult for the majority of students aged 15
and are more appropriate for older students. A handout with
two sets of publicly available Mathematics items (from PISA 2012) and
Science items (from PISA 2015) was administered to a non-representative
sample of 839 Portuguese students from Basic, Secondary and Higher
Education (approximately 50% were over 18 years of age), from
different types of courses, schools, polytechnics and universities, public
and private. Students were asked to answer the items and to evaluate
different aspects of each item (e.g. “understanding” the question and
assessing its “difficulty”). Comparing the scores of this student sample
and the scores on the same items in the PISA tests (2012 and 2015), for
both Portugal and the OECD countries (students aged 15), the study
found similar results across all age groups of the study’s sample, in both
Mathematics and Science. This suggests that PISA tests could target a
wider age range, and that knowledge and skills at age 15 are broadly
similar to knowledge and skills at older ages (at least in Mathematics
and Science). The authors also found that item facility has a significant
positive correlation with students’ assessment of the comprehension of
the text of the item, with their certainty that their answer was correct, and
with their assessment of the difficulty of the item. On the other hand, item
facility has no significant correlation with whether students reported
having studied the content of the item.
Chapter 6 is titled “The International Large-Scale Assessment: Issues
from the Portugal participation in TIMSS, PIRLS and ICILS”. The author,
Vítor Rosa, underlines that international large-scale assessment (ILSA)
studies have acquired great importance in recent decades. Governments
from various political sectors have started to use the results of these
studies with the aim of improving investments and achieving better school
performance. Although ILSAs are currently considered by many to be
a regular feature of the education landscape, they are a relatively recent
phenomenon. Their origins can be traced back to the pilot survey of the
International Association for the Evaluation of Educational Achievement
(IEA) on student performance assessment, conducted in the 1960s.
Since then, there have been significant developments. Portugal has
participated in several large-scale international studies (TIMSS, PIRLS,
ICILS and also PISA), allowing it to obtain information about the
education system and the socio-economic context of the students’ families.
These data inform the definition and implementation of educational
policies. The analysis of the general results shows that Portugal has been
improving its results (mathematics, reading, sciences) in ILSAs.
Chapter 7 is titled “PISA in Portuguese media discourse: Prominence,
Tone, Voices and Meanings”. The authors, Ana Carita, Teresa Teixeira
Lopo and Vítor Duarte Teodoro, present a study in which they undertake
a critical exploration of the media representation of PISA in the Portuguese
newspaper Público. They sought to ascertain and understand, within
the Portuguese context, how a leading national general-interest daily
newspaper refers to the process of Portugal’s participation in PISA, to the
results of PISA and their respective implications, both in news discourse
and in opinion discourse, in the period from 2001 to 2018. The interest
in the media exploration of PISA derives from acknowledging that the
mass media are an element of the global information society, both by
making information available and steering the attention of the target
audiences, and by contributing to shaping their beliefs and value systems,
and their representation of, and attribution of importance to, current
events. The study shows, first, a progressive evolution of the coverage
and prominence attributed by the newspaper to PISA, a sign of its growing
credibility and political importance in society and in the media agenda;
second, the growing positive tone of the titles of the pieces, signalling
recognition of the favourable evolution of the Portuguese results in PISA,
especially evident in the news stories, in contrast to a greater negativity
in the chronicles/opinion pieces; and, finally, the near absence of the
“public voice”, in particular of the direct protagonists in the schools,
signalling the newspaper’s limited openness to socially plural opinion.
These seem to be three important conclusions that can be drawn from the
analysis of the surface features of Público’s production on PISA.
In Chapter 8, “OECD and Education: How PISA is becoming a ‘Big
Science’ project”, the authors, Vítor Rosa and Ana Lourdes Araújo,
recall that the concept of “big science” was first used in the
United States in 1961, in reference to large-scale scientific research with
significant budgets, sizeable teams, large instruments and laboratories,
producing a copious amount of data, albeit often of little relevance. The
aim of the chapter is to explore the hypothesis that PISA, carried out
by the OECD, can be subsumed under the concept of “big science”. To
this end, the authors resorted to a qualitative and quantitative
methodology, based on the documents and technical reports produced by
this organization. The findings indicate that PISA has emerged as a highly
representative programme of this way of doing science, in the field of the
social and human sciences. PISA involves an increasing number of
countries and, a fortiori, more institutions, social actors and consumable
educational products in national programmes.
In the conclusion, the editor of this book (and lead investigator of the
research project that lies at its root and from which many of the chapters
derive) seeks to show the “limitations and the risks of a global govern-
ance project” in the field of education policies carried out by the OECD.
Recalling the origins of this organization, he shares Yong Zhao’s baffle-
ment at how a “non-novelty” such as PISA, afflicted by serious conceptual
and methodological frailties, and preceded by other, more solid, surveys,
could have become such a mighty regulatory instrument. The answer can
only be given from an analysis of how a small organization like the
OECD became the key organization in the legitimation of the global edu-
cational reform movement (GERM), which has fed the education policies
of countries and economies since the late 20th century. The source of
this power – this is the hypothesis put forward – lies in the way in which
the OECD was able to use Human Capital Theory, and its update as
Knowledge Capital Theory, in the assertion of neoliberal globalization
as the hegemonic answer, not only in the economic but also in the social and
cultural fields. The rationale of the updated Knowledge Capital Theory
is very simple (and appealing): the levels of cognitive development of a
given country make it possible to estimate the cognitive levels
of the labour force of that country; and, in turn, it is the quality of that
labour force that will determine the levels of economic growth. The
question of how to operationalize knowledge of those cognitive levels
of the labour force is then addressed by an equally simple device: know-
ing the results obtained by the students of that country in PISA. Besides
resting on extremely fragile methodological assumptions, this simplifica-
tion may be leading to an extremely ironic situation: the political claims
that assume (and foster) PISA results as the prime indicators of
the reality of education systems are, after all, the cause of the decline and
stagnation of the quality of students’ learning in many of those countries
and economies. And this makes the limitations of the OECD approach
and the required humanistic alternative all the more evident.

Notes
1 See Romuald Normand, Chapter 3, and Appendix.
2 See Leimgruber & Schmelzer, 2017.
3 See Teodoro, 2003.
4 See Teodoro, 2020.
5 Schleicher describes how PISA emerged in the OECD thus:
I remember my first meeting of senior education officials at the OECD in
1995. There were representatives from 28 countries seated around a table
in Paris. Some of them were boasting that they had the world’s best school
system – perhaps because it was the one they knew best. When I proposed
a global test that would allow countries to compare the achievements of
their school systems with those of other countries, most said this couldn’t
be done, shouldn’t be done, or wasn’t the business of international
organisations. I had 30 seconds to decide whether to cut our losses or give
it one more try. In the end, I handed my boss, Thomas J. Alexander, then
director of the OECD Education, Employment, Labour and Social Affairs
Directorate, a yellow post-it note saying: “Acknowledge that we haven’t
yet achieved complete consensus on this project, but ask countries if we
can try a pilot”. The idea of PISA was born – and Tom became its most
enthusiastic promoter.
(Schleicher, 2018, pp. 17–18)
6 See the development of this idea in Chapter 8.
7 The editor of this book is conducting, from 2019 to 2022, a project titled
“A Success Story? Portugal and the Pisa (2000–2015)”, funded by the Por-
tuguese Foundation for Science and Technology (FCT) (Ref. PTDC/CED-
EDG/30084/2017). Some authors of this book participate as researchers
in this project (see Contributors). In it, starting from the Portuguese “success
case”, the researchers question the assumptions of this at once so powerful
and so fragile exercise in international comparison. See the website:
http://pisa.ceied.ulusofona.pt/en/.
8 See the full list of authors in About the Editor and Contributors.
References
Bottani, N., & Walberg, H. J. (1992). À quoi servent les indicateurs internationaux
de l’enseignement? In CERI (Ed.), L’OCDE et les indicateurs internationaux
de l’enseignement: Un cadre d’analyse (pp. 7–13). OCDE.
Centre for Educational Research and Innovation (CERI). (1996). Regards sur
l’Éducation: Les indicateurs de l’OCDE. OCDE.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation
and education policy. Pergamon, Elsevier.
Leimgruber, M., & Schmelzer, M. (Eds.). (2017). The OECD and the international
political economy since 1948. Springer Berlin Heidelberg.
Magalhães, A. (2017). A OCDE, o AHELO e a governação do ensino superior
[The OECD, the AHELO, and the Higher Education Governance]. II Congresso
Internacional Os Desafios da Qualidade em Instituições de Ensino, Escola
Superior de Enfermagem, Coimbra, 17–20 October 2017.
Martens, K., & Jakobi, A. P. (2010). Introduction. The OECD as an actor in
international politics. In K. Martens & A. P. Jakobi (Eds.), Mechanisms of
OECD governance. International incentives for national policy-making?
Oxford University Press.
OECD. (2005). The definition and selection of key competencies [Executive
summary]. www.oecd.org/pisa/35070367.pdf
Price, D. S. (1963). Little science, big science . . . and beyond. Columbia University
Press.
Schleicher, A. (2018). World class: How to build a 21st-century school system.
OECD Publishing. https://doi.org/10.1787/9789264300002-en
Teodoro, A. (2003). Educational policies and new ways of governance in a
transnationalization period. In C. A. Torres & A. Antikainen (Eds.), The
international handbook on the sociology of education: An international
assessment of new research and theory (pp. 183–210). Rowman & Littlefield.
Teodoro, A. (2020). Contesting the global development of sustainable and
inclusive education. Education reform and the challenges of neoliberal
globalization. Routledge.
Weinberg, A. (1961). Impact of large-scale science on the United States. Science,
134, 161–164.
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266. https://doi.org/10.1007/
s10833-019-09367-x
1
Invisible struggles, encoded fantasies and ritualized incantations
A critical infrastructure studies analysis of PISA
Camilla Addey
DOI: 10.4324/9781003255215-2

Introduction: the PISA infrastructure hidden in plain sight
What does it mean to understand PISA as infrastructure? This chap-
ter draws on Keller Easterling’s book “Extrastatecraft: The Power
of Infrastructure” to understand how Easterling’s concepts of active
form, stories and extrastatecraft help analyse how PISA is trans-
forming education hidden in plain sight, and to examine the strik-
ing parallels between ISO 9000 (quality management standards)
and PISA. Easterling’s theory distinguishes between the declared
intent of infrastructure and its underlying disposition – hence PISA’s
inherent agency as opposed to its declared content.1 The chapter
shows how the politics written into the global infrastructure space
diverge from the declared intent, often acting as an essential partner
for the state (strengthening it by serving as a proxy or camouflage)
whilst also being beyond the reach of state jurisdiction.
When people hear the word infrastructure, most think of mate-
rial structures like cables, plumbing, roads, electrical power plants
or even school buildings. Current forms of infrastructure tend to
be repeatable formulas (no longer uniquely imagined spaces)
and, as such, constitute an infrastructural technology. But
Critical Infrastructure Studies use the concept of infrastructures to
think about ideas and ideologies, standards and software. Easter-
ling invites us to also consider “shared standards and ideas that
control everything” (2014, p. 21) as infrastructure. It is a hidden
substrate that is everywhere – so ubiquitous, mundane2 and seem-
ingly innocuous that it is hardly perceived; its “doings” and effects
hardly considered. Star (1999) claims that infrastructure is invis-
ible by definition and becomes visible only upon breakdown. This
hidden, anonymous nature of operating, Busch (2011) argues, is the
way in which infrastructure acquires power. Star (1999) quotes her
teacher Anselm Strauss when describing the aim of CIS as studying
the unstudied (p. 379) and “valorising previously neglected peo-
ple and things” (p. 139). She claims that however mundane and boring
infrastructure like standards, wires and settings of an information system
may first appear, to neglect them is to miss “equally essential aspects
of aesthetics, justice and change” (p. 139). To hack into infrastructure,
Star suggests going backstage and looking for traces left behind by those
building the infrastructure. In other words, we need to recover “the mess
obscured by the boring sameness of the information represented” (p. 380).
Inspired by Easterling’s application of the concept of global
infrastructure to the ISO quality management standards as “ritualized
incantations” of something called “quality” (p. 19), I first apply her lens to
PISA and then look at the parallels between Easterling’s analysis of ISO 9000
and PISA. The chapter then suggests that PISA encodes fantasies and ritualizes
incantations. The chapter draws on data gathered in multiple research
projects carried out by the author on PISA and PISA for Development,
and on scholarship published on PISA. The research projects the chapter
draws on are the “WhyJoin”, “PISA for Development for Policy” and
“ILSAINC, the ILSA Industry” research projects, carried out by the author
between 2012 and 2020 (for more information, see Addey, 2015, 2019).

PISA’s active form, its story and extrastatecraft
Infrastructure is understood as an operating system that shapes – it is doing
something through its “soupy matrix of details and repeatable formulas”
(Easterling, 2014, p. 11). This is what Easterling calls the “active form”.
Another way of saying this is that the medium is the message, but it escapes
detection because it is obscured by the information the infrastructure
generates. Reading the medium (as opposed to the message) allows us
to read the undisclosed politics written into infrastructure which tend
to diverge from what is the declared intent. Easterling (2014) states that
infrastructure is an operating system: “the information resides in the
invisible, powerful activities that determine how objects and content are
organized and circulated” (p. 13), and that “in information infrastructure,
every conceivable form of variation in practice, culture and norm is
inscribed at the deepest level of design” (p. 389). The main point that
is relevant to PISA here is that the information that the infrastructure
generates prevents us from seeing the medium as an operator. This
invites us to ask what PISA’s medium is doing and what messages are
preventing us from seeing this. How is PISA determining in undeclared
ways how education is understood and acted upon? How are PISA’s data
and analysis hiding what the infrastructure is doing? In a similar way to
Easterling’s “undeclared intent of infrastructure”, Star (1999) invites us
to identify the paradoxes of infrastructure, whereby anomalies or small
obstacles become barriers. This occurs because there are two processes
at work (what the infrastructure is doing and what it is saying), but only
one is visible to the user (what the infrastructure is saying).
Applying Easterling’s idea of infrastructure as an operating system to data
and their management systems, Sellar (2017) describes data infrastruc-
ture as “an active and changing platform for storing, sharing and con-
suming data across networked technologies” (p. 345). In similar ways,
I apply this understanding to PISA, reading it as a soupy matrix
of details and repeatable formulas that are doing something. PISA as a medium
of information (not to be confused with the data that PISA produces)
organizes and circulates content in undeclared ways as it hides behind its
triennial data reports. So although PISA reports may state that students
in a certain country achieve a certain PISA average with a certain number
of years of schooling, we should shift our attention from the meaning of
such a statement in relation to the educational system, to focus on how
factual averages, age cohorts and school years are constructed and how
they affect such dimensions. How does the PISA infrastructure transform
real-life, messy practices that mean different things in over 80 PISA con-
texts/educational systems into commensurable facts and how does this
act upon these messy practices? What is PISA doing by rendering these
practices measurable and comparable across different educational sys-
tems? A good example would be how equity is measured in PISA. The
PISA reports analyse levels of inequality and publish data on inequality
in educational systems. This is the message that tends to obscure what
the infrastructure is doing. However, there is a paradox in PISA’s equity
discourse. If we look into the PISA sampling guidelines on who can be
excluded from the PISA sample in each country and if we look at PISA
implementation practices where the aim is often a shortcut to demon-
strating better data/rankings, what emerges is that those who are most
excluded from the educational system are most likely to become further
invisible through PISA, thus putting profound constraints on educational
equity and quality.3 This highlights the power of those who shape infra-
structure; and the dissonance between PISA’s message and its medium.
Easterling states that stories are attached to infrastructure. They inflect
the active form of infrastructure and “can maintain an inescapable grip
on the disposition of infrastructure space” (p. 138). She claims that we
attach stories to infrastructures and that these can

become enshrined or ossified as ingrained expectations. Stories may


evolve beyond fluid scripts for shaping a technology into ideologies
that dictate the disposition of an organization. However immaterial,
these ideological stories have the power to buckle concrete and bend
steel, and they can often be difficult to escape.
(2014, p. 93)
In seeking to hack into infrastructure, Star (1999) also suggests
identifying the master narratives or the single voice (the “stories”
that Easterling describes) embedded in infrastructure. One may seek
to do this by identifying the narrative “that has been made other, or
unnamed” (Easterling, 2014, p. 385). If we apply this to PISA, an
example of a story or single narrative is PISA’s concept of performance
that is used to indicate how much learning has been acquired by
students. The PISA reports analyse students’ average outcomes as a
proxy for educational quality, a message so ossified that it has
transformed how we speak about educational quality and, as a
consequence, shaped what we understand as the aim of education. Again,
there is a paradox here. Although PISA claims to measure educational
quality, the narrow focus on a student’s ability to complete its tests
obscures all other dimensions of educational quality. In other words,
PISA’s infrastructure story is rendering educational quality invisible.
In a similar way, the many assumptions about learning, educational
systems – including what counts as a teacher and student – that are
inscribed in PISA leave no space for alternative education stories to be
told, valued and acted upon.
Easterling describes infrastructure as a site of extrastatecraft
where multiple forms of domestic and transnational sovereignty act
in partnership, outside or in addition to statecraft. Infrastructure can
become an essential partner for the state, serving and strengthening it
as a proxy or a camouflage. The concept of extrastatecraft is used to
highlight how “multiple, overlapping, or nested forms of sovereignty,
domestic and transnational jurisdictions collide” (p. 15). Easterling
adds that infrastructure generates “de facto forms of polity faster than
even quasi-official forms of governance can legislate them” (p. 15).
Applied to PISA, this concept suggests that PISA both acts as a partner
for the state and bypasses it, as it establishes new sets of priorities,
acts as a global accountability eye (Novoa & Yariv-Mashal, 2003),
transcends differences between educational systems and seeks to
train teachers (e.g., with initiatives like PISA for You). PISA reaches
beyond state jurisdictions, whilst at the same time being “an essential
partner for the state” (p. 16). Rationales for participation in PISA
(Addey, 2019; Addey & Sellar, 2018; Addey et al., 2017) substantiate
this claim, showing how governments’ educational priorities are shaped
by PISA at the same time as governments use their participation in
PISA to forward their agenda – even where this conflicts with the basic
assumptions encoded in PISA. PISA also creates de facto forms of polity
as it puts pressure on educational systems to conform, without having
to wait for policy to be informed by PISA data (Breakspear, 2012; Grek,
2009; Martens, 2007).
ISO 9000 and the parallels with PISA
The parallels between Easterling’s study of ISO 9000 and PISA are striking,
so much so that those well acquainted with PISA who read Easterling’s
analysis of ISO 9000 could easily mistake it for a study of PISA. These
parallels also highlight some differences in the ways in which PISA acts
vis-à-vis ISO 9000.
ISO, the International Organization for Standardization, is a private
non-governmental organization that was established in 1947. Easterling
describes it as “a private, voluntary, nongovernmental organization –
a business that sells its standards, protects its clients and maintains no
public archive” (2014, pp. 173–174). With international organizations (IOs)
thriving on standards and rituals – their universal language and practices –
ISO is the organization of standards and rituals par excellence (Ibid.,
p. 207). Although ISO is regarded as “a model of rational activity” (Ibid.,
p. 206) and Mendel describes the organization as a mundane and “silent
force of social rationalization across the globe” (2006, p. 162), Easterling
describes ISO standards and practices as irrational, empty vessels that give
even greater power to the organization, to its infrastructures, and to those
who adopt them.
The organization presides over the establishment not only of
technical standards but also of standards for less tangible infrastructure,
like quality management standards. By 2020, ISO had published
over 23,000 standards and its network included 164 national
standards bodies. In the same way as the OECD is best known for its
PISA programme in education, ISO is best known for ISO 9000 – its
quality management standards, which we are all accustomed to seeing
sported as a label by companies and organizations of all kinds. ISO 9000,
first published in 1987, was developed under the pressure of the German
Institute for Standardization (DIN), UK National Standards Body
(BSI) and the Canadian Standards Association.4 ISO 9000 works as a
“supposedly rationalized set of practices” (2014, p. 173) that companies
and organizations can buy in the form of a certification that signals
quality management. However, Easterling states that around the world,
“most companies sport ISO 9000 certification as a shibboleth or seal of
approval” (2014, p. 173) that serves as an “ideal vessel of irrationality”.
Its irrationality is its most instrumental and inspiring attribute, with
companies weaving ISO 9000 into their rhetoric, casting “their spell
with incantatory slogans and mantras” (Ibid., p. 192). Issues that are of
relevance to PISA in Easterling’s study are ISO 9000’s relationship with
the private sector, its secretive nature, its unequal power distribution,
its participation patterns with global reach and local meanings, its
double uses, its lack of content and halo effect and the alternatives it
has triggered.
Blurred boundaries between public and private. Although the OECD
is not a private institution like ISO, both organizations have a solid
relationship with the private sector, and access to their products and
services requires financial resources. The ISO is a private organization
which brings together not only private standard-making bodies but also
national government standards bodies. The OECD is a publicly funded
organization whose members are governments. The OECD is not private
like the ISO; however, it has a growing relationship with the private
sector as it stimulates the private sector’s interest in education (e.g.,
see Addey & Verger, 2019). For example, in the case of International
Large-Scale Assessments like PISA, the OECD contracts expertise to
generate PISA data, thus stimulating the education assessment market,
creating new assessment products, enabling a network for the education
assessment market, creating a global advertising stage and providing a
halo effect to contractors who work on PISA.5 As with ISO 9000, participation in PISA requires paying an international fee (a set fee plus a national overhead calculated as a percentage of national GDP), suggesting that PISA can be seen as a “product” much like ISO 9000 (Easterling, 2014, p. 177).
Easterling argues that

technical and management standards have changed the way people across the world talk to each other while also strengthening a layer of influential intermediate authority operating in between the market and the state. The strategic indeterminacy of these standards, offering fluid goals to a global audience, is politically shrewd, demonstrating the power of disposition or pure activity divorced from content. Quality is a practice that is doing something as it habituates, and saying something as it avoids controversial political stance.
(2014, p. 202)

As ISO 9000 and PISA strengthen a layer that sits between the market
and the state, they both habituate without ever engaging in a public
political dialogue. ISO 9000 and PISA transform the world, cloaked in
their technical procedures.
Secretive nature and power distribution. Both ISO 9000 and PISA
are developed behind closed doors which render invisible the different
positions that are heatedly debated in the making of the infrastructure.
ISO members (known as subscriber and correspondent members) have
differing levels of power and send large numbers of experts to participate
in the meetings at which the standards are developed. Beyond the members of ISO, anyone can buy the expensive draft version of an ISO standard while it is under development and share their comments with the ISO secretariat. So although the doors to the ISO standards’ development can be opened, the process is not in the public domain, the standards are copyrighted and their negotiations
are not available in a public archive. Easterling describes ISO 9000
as a form of private global governance as it “asserts authority in the
public arena” (2014, p. 174) and points out the paradox of the broad
consensus that ISO 9000 has achieved although it “does not originate
in a public and political dialogue” (Ibid., p. 208). PISA is also developed
behind closed doors: the OECD PISA secretariat opens these doors only
to its contractors, government representatives and the experts it selects.
However, not all PISA countries have the power to participate in the board
meetings (most non-OECD members have observer roles). The unequal
distribution of power in PISA is further reinforced by what are presented as structural constraints (deadlines, financial limitations, etc.), which allow some actors (e.g., main contractors, more influential member states) to impose their methodologies and choices on the entire infrastructure (see Addey et al., 2020). In PISA meetings, those who define and
construct PISA are “doing” education without making transparent their
metaphysical assumptions whilst cloaking PISA with the rationality of
science.6 Educational stakeholders and lay people are not admitted to these
meetings (Addey & Gorur, 2020). This is justified by the sophisticated nature of the assessment methodologies and discussions, by the need to maintain the secrecy of the tests and thus their validity in the long run, and by the need to embargo the data until the release date. Joining the discussions or
commenting on the tests and procedures is not an option, not even upon
the payment of a fee like ISO offers.
Participation patterns with global reach and local meanings. Easterling
claims that however innocuous ISO 9000 may appear

it has universal ambitions. ISO hopes to certify more and more companies and draw an even larger section of the global community into the management habit. Once this population of players is listening and committed to continually renewing its certification, the management protocol becomes a means of reconditioning any number of organizations with a new message or an inflection of the old one.
(2014, p. 195)

The OECD has similar universal ambitions for PISA (Addey & Sellar, 2019; Gorur & Addey, 2021; Sellar & Lingard, 2014; Wiseman, 2013) and seeks to ensure participation continues every three years. Like ISO 9000, PISA is constantly seeking to innovate (e.g., by adding a different area to be tested in each round, such as financial literacy, global competencies or creativity). ISO also faced the challenges that PISA now faces: in its early days and until the year 2000, it was mainly European nations that subscribed to ISO 9000; uptake then grew hugely in China and beyond Europe. To reach lower- and middle-income contexts, ISO set up DEVCO. Starting in 2013, the OECD created PISA for Development to cater for lower- and middle-income nations, whilst at the same time stating that the PISA
metric could become the universal metric for learning outcomes (OECD,
2013).
Lack of content and the halo effect. The ISO 9000 standards are used to
measure whether an organization has met its self-established objectives,
based on the evaluation of a quality specialist who is hired to carry out
the process. The ISO 9000 standards are not a set of checks that are the same for all: each client establishes how they will use the standards. Whatever the evaluation, the client can display the ISO 9000 certificate.
Easterling argues that ISO 9000’s popularity may be related to this very
lack of content: “over 170 countries have been ISO 9000 certified, yet no
one can say what ISO 9000 actually is” (2014, p. 176). Easterling claims
that the drive to habituate without specific content makes ISO 9000
powerful: “While lacking any specific content or binding requirement,
ISO is a perfect conduit of undeclared activities and intentions with
potentially dangerous consequences” (p. 19). Easterling claims that ISO
9000 standards have spread to every endeavour and are perceived as
“the answer to any problem in the field” (2014, p. 198). This resonates
with PISA: although PISA is a set of definite test items and skills, it has acquired the aura of an international standard in education that no
one can actually define (for a discussion on the use of “international
standards” in education, see Steiner-Khamsi & Stolpe, 2006). Similarly,
PISA is described as polyglot and polyvalent: it speaks the language of
whoever adopts it, and is given contextualized meanings (Addey & Sellar,
2018; Steiner-Khamsi, 2017). Like ISO 9000, PISA has also become a panacea for all educational problems. This relates closely to the halo effect
of PISA, which has come to distribute prestige to anyone who claims to
draw on PISA data or to be associated with its instruments.
Double uses. Easterling argues that ISO 9000 exerts soft power through compliance between members of the ISO 9000 club. She describes ISO
9000 as inoculating “organizations against regulation while developing
more expensive and opaque bureaucracies” (2014, p. 173). PISA has also come to exert soft power on educational systems, as has been described (e.g., see Grek, 2009; Martens, 2007). Easterling states that in the same way
as ISO 9000 promotes change of behaviour towards greater quality, it
can also inhibit such change and act as “another proxy in disguise”.
Kimerling argues that certifications allow organizations or nations to
hide their practices whilst showing off “enigmatic standards” that are
beyond the legislation of the state. Kimerling’s work showed how oil
companies in the Ecuadorian Amazon used “the cloak of international standards and corporate responsibility to wrap operations in a veneer of environmental excellence and social responsibility” (Kimerling, 2001,
p. 394). This highlights the double use of certifications and standards: they are used not only to promote positive change but also to inoculate against it, and even to project positive images that hide unwelcome, even illegal, practices. This resonates with PISA, as governments use the PISA label
in similar ways, either by stretching the procedures (excluding the most
marginalized students) or by using PISA to show off educational systems
that marginalize (thus opposing the very aims of PISA to increase quality
and equity) as was the case with countries like Vietnam.7 Another way
in which PISA inoculates is by colonizing the meaning of education with
learning outcomes, as suggested earlier.
Alternatives. Another interesting similarity is the emergence of
alternative national and international certificates of quality that ISO
9000 triggered. These alternatives replicate, or bear great similarities to, the ISO 9000 jargon and metrics. Easterling states that companies
display all these logos to show they comply with universal principles “all
in lieu of adhering to the laws of a state. Compliance is voluntary, and the
seal of approval may be self-constructed or internal to the corporation”
(2014, p. 198). It is interesting that PISA has also triggered the growth
of alternative assessments; in some way both ISO 9000 and PISA have
generated a new trend (quality certification and educational assessment,
respectively). There have been alternative assessment movements like
the Pratham (a citizen-led assessment) but also the growth of national
assessments. Participation in multiple assessments, international, regional
and national, has come to act as a seal of approval that responds to the pressure of the “global eye” of accountability (Novoa & Yariv-Mashal,
2003) but also internal accountability pressures (Addey et al., 2017;
Grek, 2009; Ozga, 2020).

Invisible struggles, encoded fantasies and ritualized incantations
Easterling describes ISO as a missionary organization bringing “the
secret signals of a capital market to the not-yet-initiated” (2014, p. 196).
Easterling adds that “ISO 9000, the most universal standard, is not based
on technical compliance but on emotional, motivational belief systems”
(2014, p. 209). What we see here is an elaborate form of irrationality. The
secret signals of the accomplished market that ISO 9000 provides resonate with the “secret signal” of a quality educational system that governments seek through PISA participation (see Addey, 2019). The empirical study of
rationales for participation in ILSAs (including PISA) by Addey and Sellar
(2018, 2019) demonstrates similar compliance mechanisms that relate
to emotional and motivational belief systems, leading to similar rituals.
Easterling goes on to state that local meaning is given to the adoption of global trends:

Spatial software can mix remote abstract values together with the
values of a complex local context, without requiring that all parties
conform to a single universal principle. After all, institutions that
do so often develop elaborate rituals to demonstrate that they
are adhering to such principles when in fact they are departing or
decoupling from them.
(2014, p. 203)

The work of Star also describes how infrastructure means different things
locally (1999, p. 382). The diversity of rationales for participation and
the cherry-picking of PISA data to further agendas show how PISA, with
its tests and rigid protocols to ensure standardized implementation, also
takes on different meanings locally.
Easterling’s work helps analyse the ways in which infrastructure
transforms our worlds in mundane ways that are hard to see. On one side, what the infrastructure is doing is hidden by its declared message; on the other, the struggles that occur in the making of the infrastructure
remain invisible to the user of the infrastructure. As Busch has stated,
it is this hidden, seemingly anonymous nature that gives infrastructures
power. Easterling adds to this by suggesting we consider how
infrastructures encode fantasies and ritualize incantations which hold a
strong grip in a world that is “more receptive to influential fictions and
beliefs” (Easterling, 2014, p. 169). Although rationalized infrastructures
carry universal stories and “moments of planetary integration” (p. 162),
they hold within them world aspirations and dreams and “harbour some
of the most elaborate irrationalities” (p. 168). Easterling states that, in order to cloak infrastructure in the rationality of science, infrastructure-making processes convene many internationally distinguished actors who offer their expertise, a process that is also central to PISA. This analysis
resonates with PISA. Andreas Schleicher, the OECD’s Director for Education and Skills, also known as the “father” of PISA, described PISA as his dream to create a universal language
for educators around the world and as a way to demonstrate what can
be achieved in education (Addey, 2016). Starting from this fantasy and
aspiration, and going on to assemble diverse fantasies and aspirations
across the many participating countries, PISA now carries multiple
aspirations (how we can best educate new generations for today’s society
and global market) and shared dreams (what top-performing countries
have achieved, what is possible for all, the idea that high skills lead to
greater economic competitiveness). Although these aspirations are based
on studies that demonstrate the scientific nature of such claims (e.g., what economic potential their performance equates to), these claims can be described as irrational if seen from different ontological stances.8
Larkin’s poetics of infrastructure adds to the irrationalities described earlier. Distinguishing between the technical functioning and the poetic mode of infrastructure, Larkin draws on anthropological approaches to
infrastructure to unpack how “the political can be constituted through
different means” (2013, p. 329). He argues that infrastructure needs
to be analysed as concrete aesthetic vehicles that “emerge out of and
store within them forms of desire and fantasy” (p. 329) and that at
times these can be “wholly autonomous from their technical function”
(p. 329). Larkin’s work invites a sensibility to the form, the sense of
desire and possibility, and the “collective fantasy of society”, suggesting, in other words, that infrastructures encode and transmit dreams and fantasies. Infrastructure thus affects our feelings deeply, leading to bodily
reactions to infrastructure, reminding us of the “deeply affectual relation
people have to infrastructures, the senses of awe and fascination they
stimulate” (p. 34) as an important part of their political affect. Thus,
even the body apprehends what it means to be “modern, mutable, and
progressive” (p. 337). Larkin draws on scholars who argue that, since the Enlightenment, infrastructure has been intimately conceived as “shaping modern society and realizing the future” (p. 332). Drawing on Larkin,
and furthering the idea of planetary integration and world aspirations,
we can see how the PISA infrastructure also delivers collective desires
and a sense of possibilities, and fantasies about what can be achieved.
It could be suggested that PISA encodes aesthetic promises of going up
the league tables, of round numbers, of high averages and of factual
data on the messy practice of learning. These aesthetic promises affect
our feelings, lead to bodily reactions and stimulate feelings of awe and
fascination. Thus PISA is reinforced and moved through bodily reactions,
as PISA users respond to the aesthetics of PISA infrastructure and adopt
“a common visual and conceptual paradigm of what it means to be
modern” (Ibid.). Might we go as far as saying that, through frequent PISA implementations, PISA data launches, competitive league tables and growing numbers of participants, PISA has become a ritualized incantation?

Conclusions
Drawing on Easterling’s work, this chapter has explored how undeclared
politics are inscribed into PISA, politics that often diverge from the
declared intent of the infrastructure. The politics can be uncovered
by looking at the way in which the infrastructure is “doing” – how it
determines how objects and content are organized and circulate; seeking
out the stories that are ossified in the infrastructure; and examining how it acts as a form of domestic and transnational sovereignty in partnership with, outside or in addition to statecraft. The parallels and differences between ISO 9000 and PISA highlight how infrastructures encode universal and
local fantasies and aspirations and ritualized incantations. This chapter
has also brought up the bodily reactions of awe and fascination that
influence the reception and effects of infrastructure. What does this mean
for our understanding of PISA? This chapter has sought to argue that we need to look beyond what infrastructure says and understand how it does politics in mundane and invisible ways, including by working through our affective and bodily responses to infrastructure.

Notes
1 Most approaches would seek to theorize how PISA shapes education through the data and analysis it produces.
2 “Mundane to the point of boredom” (p. 377), states Star (1999), who dedicates her foundational Critical Infrastructure Studies paper to the members of the Society of People Interested in Boring Things.
3 Dreams of universal rationality may sponsor their own special forms of irrationality. Well-rehearsed theories, like those related to Capital and neoliberalism, continue to send us to the same places to search for dangers while other concentrations of authoritarian power escape scrutiny. (p. 22)
4 Those acquainted with the history of ILSAs will be reminded of how CERI (the cradle of PISA, as described by Tröhler, 2013) and PISA came to be developed under pressure from the USA and France (Martens, 2007).
5 For further information on the privatization of educational assessment, see the
Facebook page of the ILSAINC research project: “ILSAINC. The ILSA industry”.
6 In line with STS, this understanding highlights how infrastructure “generates
de facto forms of polity faster than even quasi-official forms of governance
can legislate them” (p. 15).
7 Vietnam performed highly in PISA 2015 by excluding many students from
the sample.
8 For example, New Literacy Studies would claim that skills cannot be measured in such an “autonomous” (Street, 1984, 2013) and decontextualized manner, nor can any result be interpreted universally without understanding how such skills are deeply embedded in the social, cultural and institutional context of practice (Barton et al., 2000; Hamilton, 2001).

References
Addey, C. (2015). International literacy assessments in Lao PDR and Mongolia: A global ritual of belonging. In M. Hamilton, B. Maddox, & C. Addey (Eds.), Literacy as numbers: Researching the politics and practices of international literacy assessment (pp. 147–164). Cambridge University Press.
Addey, C. (2016). Camilla Addey interviews Andreas Schleicher on PISA and the OECD. Laboratory of International Assessment Studies. Retrieved April 10, 2021, from http://internationalassessments.org/camilla-addeyinterviews-andreas-schleicher-on-pisa-and-pisa-for-development
Invisible struggles, encoded fantasies and ritualized incantations 23

Addey, C. (2019). The appeal of PISA for development in Ecuador and Paraguay:
Theorising and applying the global ritual of belonging. Compare: A Journal of
Comparative and International Education, 50(8), 1159–1174. https://doi.org/
10.1080/03057925.2019.1623653
Addey, C., & Gorur, R. (2020). Translating PISA, translating the world. Com-
parative Education, 56(4), 547–564. https://doi.org/10.1080/03050068.2020.
1771873
Addey, C., Maddox, B., & Zumbo, B. (2020). Assembled validity: Rethinking
Kane’s argument-based approach in the context of international large-scale
assessments (ILSAs). Assessment in Education: Principles, Policy & Practice,
27(6), 588–606. https://doi.org/10.1080/0969594X.2020.1843136
Addey, C., & Sellar, S. (2018). Why do countries participate in PISA?
Understanding the role of international large-scale assessments in global
education policy. In A. Verger, M. Novelli, & H. K. Altinyelken (Eds.),
Global education policy and international development (pp. 97–117).
Bloomsbury.
Addey, C., & Sellar, S. (2019). Is it worth it? Rationales for (Non)participation
in international large-scale learning assessments. Education Research and
Foresight Working Papers Series, N.° 24. UNESCO. Retrieved April 11, 2021,
from https://en.unesco.org/node/268820
Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B. & Verger, A. (2017). The rise of
international large-scale assessments and rationales for participation. Compare:
A Journal of Comparative and International Education, 47(3), 1–20. https://
doi.org/10.1080/03057925.2017.1301399
Addey, C., & Verger, A. (2019). ‘An attempt at the first ‘Davos of education’:
Dissonances in the OECD-forum for world education on the future of edu-
cation. Invited blog Education International, Unite for Quality Education
Campaign. Retrieved April 10, 2021, from www.unite4education.org/global-
response/an-attempt-at-the-first-davos-of-education-dissonances-in-the-oecd-
forum-for-world-education-on-the-future-of-education/
Barton, D., Hamilton, M., & Ivanic, R. (Eds.) (2000). Situated literacies. Reading
and writing in context. Routledge.
Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative
effects of international benchmarking in school system performance. OECD
education working papers, N.° 71. OECD Publishing.
Busch, L. (2011). Standards: Recipes for reality. MIT Press.
Easterling, K. (2014). Extrastatecraft: The power of infrastructure space. Verso Books.
Gorur, R. & Addey, C. (2021). Capacity building as the ‘Third Translation: The
story of PISA-D in Cambodia. In S. Grek, C. Maroy, & A. Verger (Eds.), World
yearbook of education 2021: Accountability and datafication in the governance
of education (pp. 94–109). Routledge.
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37.
Hamilton, M. (2001). Privileged literacies: Policy, institutional process and
the life of the IALS. Language and Education, 15(2–3), 178–196. doi:
10.1080/09500780108666809
24 Camilla Addey

Kimerling, J. (2001). Corporate ethics in the era of globalization: The promise


and peril of international environmental standards. Journal of Agricultural and
Environmental Ethics, 14, 425–455.
Larkin, B. (2013). The politics and poetics of infrastructure. Annual Review
Anthropology, 42, 327–343.
Martens, K. (2007). How to become an influential actor – the ‘comparative turn’
in OECD education policy. In K. Martens, A. Rusconi, & K. Leuze (Eds.), New
arenas in education governance. The impact of international organizations and
markets on educational policy making (pp. 40–56). Palgrave Macmillan.
Mendel, P. J. (2006). The making and expansion of international management
standards: The global diffusion of ISO 9000 quality management certificates.
Oxford University Press.
Novoa, A., & Yariv-Mashal, T. (2003). Comparative research in education:
A mode of governance or a historical journey? Comparative Education, 39(4),
423–438.
OECD. (2013). The OECD’s contribution on education to the post-2015
framework: PISA for Development. OECD and post-2015 reflection series.
OECD Publishing.
Ozga, J. (2020). The politics of accountability. Journal of Educational Change, 21,
19–35. https://doi.org/10.1007/s10833-019-09354-2
Sellar, S. (2017). Making network markets in education: The development of data
infrastructure in Australian schooling. Globalisation, Societies and Education,
15(3), 341–351. https://doi.org/10.1080/14767724.2017.1330137
Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New
modes of global governance in education. British Educational Research
Journal, 40(6), 917–936.
Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scien-
tist, 43(3), 377–391. https://doi.org/10.1177/00027649921955326
Steiner-Khamsi, G. (Ed.) (2017). The global politics of educational borrowing
and lending. Teacher College Press. E-book.
Steiner-Khamsi, G., & Stolpe, I. (2006). Educational import, local encounters
with global forces in Mongolia. Palgrave Macmillan.
Street, B. (1984). Literacy in theory and practice. Cambridge University Press.
Street, B. (2013). Relationships of policy, theory and research in the literacy field.
Centre for Literacy Summer School. Montreal, Canada.
Tröhler, D. (2013). The OECD and cold war culture: Thinking historically about PISA. In H.-D. Meyer & A. Benavot (Eds.), PISA, power, and policy: The emergence of global educational governance (pp. 141–161). Symposium Books.
Wiseman, A. (2013). Policy responses to PISA in comparative perspective. In H.-D. Meyer & A. Benavot (Eds.), PISA, power, and policy: The emergence of global educational governance (pp. 303–322). Symposium Books.
2

How PISA is present in the scientific production
A bibliometric review

Carlos Décio Cordeiro and Vítor Duarte Teodoro

Introduction
This chapter is a bibliometric analysis of the PISA international study carried out every three years by the OECD. Bibliometric analysis is a technique that allows one to present a macroscopic view of an extensive body of academic literature. The number of different journals publishing
on a specific topic and the subject categories allocated to publications can
show the variety of research topics and the multidisciplinary character of
a research domain (van Nunen et al., 2018).
In bibliometric analysis, the content of articles is examined to summarize
what is known about a certain subject, mapping the scientific field by
analysing its literature to discover patterns, trends and relationships
(Mascarenhas et al., 2018). Bibliometric methods are very useful for
summarizing the academic research of a scientific field, identifying the
major trends in terms of publications, citations, authors, keywords and
institutions (Martínez-López, 2018).
The VOSviewer software (van Eck & Waltman, 2010) was used. This tool can be helpful for quickly viewing connections in large networks (up to ten thousand items) and may particularly interest those seeking a replacement for the discontinued citation map tool in Web of Science (Wong, 2018). The software makes use of the VOS technique, an acronym
for Visualisation of Similarities. For a more detailed reflection on the
advantages of the VOS technique compared to multidimensional
mapping, refer to van Eck et al. (2010).
VOS aims to provide a low-dimensional visualization in which objects
are located so that the distance between any pair of objects reflects their
similarity as accurately as possible (van Eck & Waltman, 2007). In this
technique,

The co-occurrence frequency of two terms is obtained by counting the number of articles in the relevant time period in which the two terms both occur (in the title or abstract). We then used the co-occurrence frequencies of the terms as input for the VOSviewer software. Based on these frequencies, the VOSviewer software constructed a term map in which the distance between any pair of terms provides an approximate indication of the relatedness of the terms as measured by co-occurrences. Each term in a term map also has a colour. Colours are used to show the grouping or clustering of terms into topics. Terms that have the same colour belong to the same cluster and tend to be more closely related than terms having different colours. In other words, terms that have the same colour tend to co-occur with each other more frequently than terms having different colours.
(Flis & van Eck, 2017, p. 14)

DOI: 10.4324/9781003255215-3

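The mapping step that this counting feeds into can be stated compactly. The formulation below paraphrases the VOS method of van Eck and Waltman (2007) and is not taken from this chapter: given similarities \(s_{ij}\) between items \(i\) and \(j\) (e.g., co-occurrence counts \(c_{ij}\) normalized by the items' total numbers of occurrences), VOS chooses locations \(x_1, \dots, x_n\) in a low-dimensional space to solve

```latex
\min_{x_1,\dots,x_n} \; \sum_{i<j} s_{ij} \,\lVert x_i - x_j \rVert^{2}
\quad \text{subject to} \quad
\frac{2}{n(n-1)} \sum_{i<j} \lVert x_i - x_j \rVert = 1 ,
```

so that strongly related items are pulled close together, while the constraint on the average distance prevents the trivial solution of collapsing all items into a single point.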
In the term co-occurrence network visualization, the size of a term reflects the number of publications in which the term occurs, and the distance between two terms provides a rough indication of their relatedness. The relatedness of the terms was determined on the basis of the number of co-occurrences. Thus, the greater the number of publications in which
both terms occur, the stronger the relationship between the terms and the
smaller, on average, the distance between the terms in the visualization
(Palmblad & van Eck, 2018). The details of the selection criteria will be
explored in the following section.
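The counting procedure described in the quoted passage can be sketched in a few lines of Python. The documents and term list below are illustrative, not drawn from the actual corpus:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(documents, terms):
    """Count, for each term and each unordered pair of terms, the number
    of documents (title plus abstract) in which they occur."""
    term_counts = Counter()
    pair_counts = Counter()
    for doc in documents:
        text = doc.lower()
        present = sorted(t for t in terms if t in text)
        term_counts.update(present)
        # each co-occurring pair counts once per document
        for pair in combinations(present, 2):
            pair_counts[pair] += 1
    return term_counts, pair_counts

docs = [
    "PISA and educational governance in the OECD",
    "Governance by numbers: PISA data and policy",
    "Literacy assessment and the OECD",
]
terms = ["pisa", "oecd", "governance", "literacy"]
occ, co = cooccurrence_counts(docs, terms)
print(co[("governance", "pisa")])  # 2: both terms occur in the first two documents
```

In VOSviewer, such pair frequencies are the link strengths from which the term map and its clusters are computed.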
PISA has introduced important changes in the governance of educa-
tion around the world. Driven by soft power strategies and new policy
transfers, this governance is based on data and measurement tools that
redefine the scales of education policies (Pons, 2017). Several recent com-
parative studies present statistical claims that improvements in ILSAs
(International Large-Scale Assessments), such as PISA, will lead to higher
GDP growth rates (Komatsu & Rappleye, 2017a).
PISA, as an instrument of educational governance, can be used by edu-
cational policymakers to legitimize what counts as knowledge through
coding and measuring an object called “literacy”. “Literacy” is central to
how the OECD targets the skills and competencies of the future worker
in a knowledge-based economy (Morgan, 2011).
Studies based on the PISA dataset have led to progress in educational
research while pointing to the need for caution when using this research
to inform educational policy (Hopfenbeck et al., 2018). The current
period is striking in that the OECD emphasizes uncertainty about the
future at the same time that it seeks to define it and reshape education
systems (Xiaomin & Auld, 2020).
However, not all is well: in the view of several authors there are agendas of their own, fallacies in the methodology used, and other “problems”, as can be seen in the discussion of results section. Yet such criticism often appears only faintly in the literature. The lack of impact of the criticisms
does not mean that they are not valid, or that PISA has improved. It simply means that the criticism has been largely ignored (Zhao, 2020).

Table 2.1 Analogue studies

Mølstad et al. (2017). Scope: ILSA, with specific focus on PISA. Range: until 2017. Database: Scopus. Focus: most cited articles; field of study; country of researchers.

Castro and Sevillano (2019). Scope: socio-cultural/socio-educational disadvantage. Range: 2015–2019. Databases: WoS, Dialnet. Focus: country of researchers; most cited authors.

Domínguez et al. (2012). Scope: PISA. Range: 2002–2010. Databases: ERIC, EBSCOhost, ISI (now WoS). Focus: country of researchers and affiliation; field of study; publications per year; literacy domains; studied countries.

Hopfenbeck et al. (2018). Scope: PISA. Range: 1999–2015. Databases: ERIC, PsycINFO, Scopus, WoS, Zetoc. Focus: frequency of publications according to journal, country and scientific discipline.
Table 2.1 lists analogous studies that attest to the relevance of this theme. The present study stands out for a more comprehensive analysis, adding topics such as the volume of funding per country, analysis of scientific production by journal, keyword analysis and cluster analysis. It also provides more up-to-date information than the study by Domínguez et al. (2012), which is the closest and most complete.
This chapter comprises four sections, starting with the introduction
and followed by methodology, analysis and discussion of results, and
conclusion.

Methodology
Data were collected on 15 January 2021, from the Web of Science (WoS)
website. This database was chosen given its extensive coverage of docu-
ments. Comparison of WoS and Scopus shows that WoS has a stronger
coverage, dating back to 1990, and most of its journals are written in
English (Chadegani et al., 2013).
28 Carlos Décio Cordeiro and Vítor Duarte Teodoro

A search for the term "PISA OECD" was performed to avoid results
about the "University of Pisa" and projects about the city of Pisa. This
search returned 596 documents, among which 478 were papers and 522
were written in English. After the combined filtering of these two
characteristics, 411 results were obtained. This was followed by further
detailed filtering to check that all articles fit: three articles referring to
both the OECD and the "University of Pisa", all from the WoS category
"Nuclear Engineering and Design" and clearly out of line with the
purpose of this chapter, were removed. A total of 408 results was
obtained.
Once the database was determined, some problems of data uni-
formity were found. The publication rules of the various journals dif-
fer, which causes the name of the same author to appear in different
ways in different documents. It was necessary to standardize the 722
authors present in the database to ensure a correct link strength. It was
also necessary to standardize 21,097 references cited by the authors
throughout the 408 articles under study. An additional problem was
found when standardizing the cited references, as they only present the
name of the first author. For this reason, it was not possible to perform
a cluster analysis for the authors cited in the references, as it would be
biased.
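As an illustration of the kind of standardization involved, the sketch below collapses common variants of an author's name (accents, case, "Surname, Given" versus "G. Surname") into a single key. This is a hypothetical sketch, not the procedure actually used for the 722 authors; the helper `normalize_author` is our own illustration.

```python
import unicodedata

def normalize_author(name):
    """Collapse name variants into a single 'surname, initial.' key.
    Hypothetical sketch, not the chapter's actual procedure."""
    # Strip accents, then lowercase and trim
    name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    name = name.lower().strip()
    if "," in name:                      # 'surname, given' form
        surname, given = (p.strip() for p in name.split(",", 1))
    else:                                # 'given surname' form
        parts = name.split()
        surname, given = parts[-1], " ".join(parts[:-1])
    initial = given[:1] if given else ""
    return f"{surname}, {initial}."

# All three variants map to the same key:
print(normalize_author("Teodoro, António"))  # teodoro, a.
print(normalize_author("A. Teodoro"))        # teodoro, a.
print(normalize_author("TEODORO, A."))       # teodoro, a.
```

Only after such normalization can link strengths between authors be computed reliably.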

Analysis and discussion of results


This chapter explores the selected bibliography by year, nationality of the
first author, keywords, journals, references cited and authors.

Year
The first result in our search dates back to 1999, followed by a
one-year interregnum until 2001, when a new article related to PISA
was published; this is understandable, since the first PISA results were
released that year. The increase in publications remained slight until
2011, when the number of publications first exceeded ten, as shown in
Table 2.2. From that date onwards, publication numbers took off. As
seen in Figure 2.1, the growth curve is close to an exponential curve.
Given that the cumulative publication curve presents a smoother
trend, a simple time-series linear regression was estimated by the
Ordinary Least Squares (OLS) method. To avoid heteroscedasticity
problems, the cumulative publication variable was entered as its
Napierian (natural) logarithm. An R² = 0.978 (p-value = 0.00) was
obtained, and the model passed the tests for heteroscedasticity
(Breusch-Pagan, p-value = 0.17), autocorrelation of residuals
(Durbin-Watson, p-value = 0.35) and normality of residuals
(p-value = 0.19).

Table 2.2 Publication per year


Year Number of publications
2021 1
2020 68
2019 59
2018 48
2017 38
2016 32
2015 37
2014 35
2013 18
2012 13
2011 17
2010 9
2009 7
2008 3
2007 9
2006 4
2005 3
2004 1
2003 3
2002 1
2001 1
2000 0
1999 1

Figure 2.1 Cumulative publication and regression output



Table 2.3 Origin of funding and publication by country

Origin of funding | Number of grants | Country of main author | Number of publications
Spain | 39 | USA | 65
Australia | 13 | Australia | 56
United Kingdom | 13 | United Kingdom | 51
China | 12 | Spain | 42
Global | 9 | Germany | 41
United States of America | 8 | Italy | 37
European Union | 6 | China | 21
Germany | 5 | Turkey | 20
Finland | 5 | Finland | 16
Others | 31 | Others | 173

As this is a logarithmic analysis, the coefficients must be interpreted
in percentage form. It is expected that, on average, cumulative
publications increase by about 28.7% for each year that passes. As seen
in Table 2.2 and Figure 2.1, the number of cumulative publications
doubles on average every 2 years and 9 months (2.74 years).
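The percentage interpretation and the doubling time can be checked in a few lines. The slope is back-derived here from the reported 28.7% annual growth; this is an assumption, since the chapter reports only the aggregate regression output.

```python
import math

# Slope of ln(cumulative publications) on year, back-derived from the
# reported 28.7% annual growth (an assumption made for illustration).
slope = math.log(1.287)

annual_growth = math.exp(slope) - 1   # fraction added each year: 0.287
doubling_time = math.log(2) / slope   # years for the cumulative count to double

print(f"annual growth: {annual_growth:.1%}")     # annual growth: 28.7%
print(f"doubling time: {doubling_time:.2f} yr")  # doubling time: 2.75 yr
```

The small difference from the 2.74 years quoted in the text comes from rounding the growth rate to one decimal place.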

Countries
Table 2.3 shows the volume of publications by country, according to the
first author's correspondence address. Of the five countries with the
highest frequency (for this study, the United Kingdom was treated as a
single country), three are native English-speaking (the United States of
America, Australia and the United Kingdom), which reflects one
criterion of our search. The ranking is completed by Spain and Germany.
Table 2.3 also shows the number and origin of funding instances per
country. It should be noted that an article may receive more than one
grant. Two results stand out for their disparity from the others: funding
by Spanish institutions, with a funding-to-output ratio (39/42) of 93%,
mostly from public funds, and China, with a ratio (12/21) of 57%. The
remaining countries have an average ratio of 26% (91/345).
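As a quick arithmetic check, the two outlier ratios are simply grants divided by publications for the same country, with the values taken from Table 2.3:

```python
# Funding-to-output ratios for the two outliers in Table 2.3.
# An article may receive more than one grant, so ratios can exceed 100%.
funding = {"Spain": 39, "China": 12}        # number of grants
publications = {"Spain": 42, "China": 21}   # publications by country of main author

for country in funding:
    ratio = funding[country] / publications[country]
    print(f"{country}: {funding[country]}/{publications[country]} = {ratio:.0%}")
# Spain: 39/42 = 93%
# China: 12/21 = 57%
```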

Keywords
Considering that some articles do not present keywords, a simple
keyword analysis could bias our study. We therefore used Keywords
Plus, the index terms created by the Web of Science to catalogue its
articles. Following this first criterion, the 50 most used Keywords Plus
terms were selected. The three most used appear in over 50 articles
each: "education" (54), "PISA" (54) and "achievement" (52).

Figure 2.2 Bibliographic network map for keywords plus

Figure 2.2 shows a map of bibliographic networks for Keywords Plus.
According to this criterion, three clusters represent three major themes.
The diameter of each circle represents the number of occurrences of the
term.
Cluster A, whose most used words are "achievement", "schools",
"impact", "quality" and "inequality", represents the thematic axis of
authors concerned with inequalities and the quality of schools and
teachers, in analyses carried out at the local level. Cluster B, whose most
used words are "performance", "students", "mathematics", "school"
and "science", represents the thematic axis of authors who work at the
micro level, that of the individual student and his or her performance.
It is worth noting the difference between the use of "school" and
"schools" in the different clusters, which arises precisely from the
perspective from which the author approaches the topic. Finally,
Cluster C, whose most used words are "education", "PISA", "policy",
"politics" and "governance", focuses on the macro level, revolving
around the implications of the PISA project at the national level as well
as the implementation of new political decisions prompted by the test
results.

Journals
This section details the journals that received the most citations
(Table 2.4) and that published the most articles related to our research
(Table 2.5). Two metrics were used to assess their quality: quartile and
H-index.
Table 2.4 Ranking of 10 journals with the highest number of citations

Table 2.5 Ranking of 10 journals with the most articles related to the theme

A journal has index h if h of its Np articles have at least h citations each
and the remaining (Np – h) articles have ≤ h citations each (Hirsch,
2005); thus, the higher the H-index, the better. Quartile scores, on the
other hand, vary by scientific category: a journal's Q-score indicates
which 25% quantile it falls into when the journals of a category are
divided into quarters (Asan & Aslan, 2020). Quartile rankings run from
Q1 to Q4, with Q1 representing the best value in the ordering.
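Hirsch's definition translates directly into code. The short function below is a generic implementation, not tied to any particular bibliometric database:

```python
def h_index(citations):
    """Largest h such that h articles have at least h citations each (Hirsch, 2005)."""
    h = 0
    # Walk the citation counts from highest to lowest; position i is the
    # number of articles considered so far.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four articles with at least 4 citations each
print(h_index([25, 8, 5, 3, 3]))  # 3
```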
In both rankings, the journals "Journal of Education Policy" and
"Comparative Education" occupy the first two places, in inverted order,
standing out for both the quantity and the quality of publications related
to the theme. The publisher that publishes most on this topic is
undoubtedly Routledge, which accounts for an absolute majority in both
rankings. Noteworthy is the large number of Q1 journals (2019 criteria):
70% in Table 2.4 and 80% in Table 2.5. Likewise, the H-index (2019
criteria) is above the 2019 averages for journals in education (h = 23)
and in economics (h = 29).

References
This section highlights the publications cited in the 408 articles selected
for this study, according to the criteria defined in the methodology.
A total of 21,097 results were obtained, and the ten articles with the
most repeated citations are detailed as follows:

1. Grek, S. (2009). Governing by numbers: the PISA "effect" in Europe.
Journal of Education Policy (67 citations).
2. OECD (2014). PISA 2012 technical report. PISA, OECD Publishing,
Paris (38 citations).
3. Sellar, S. and Lingard, B. (2013b). The OECD and the expansion of
PISA: new global modes of governance in education. British Educa-
tional Research Journal (34 citations).
4. Meyer, H. and Benavot, A. (eds.) (2013). PISA, Power and Policy:
the emergence of global educational governance. Oxford: Sympo-
sium Books (31 citations).
5. Sellar, S. and Lingard, B. (2013a). Looking East: Shanghai, PISA
2009 and the reconstitution of reference societies in the global edu-
cation policy field. Comparative Education (28 citations).
6. Takayama, K. (2008). The politics of international league tables: PISA
in Japan’s achievement crisis debate. Comparative Education (28
citations).
7. OECD (2001), Knowledge and Skills for Life: First Results from
PISA 2000, PISA, OECD Publishing, Paris (27 citations).
8. OECD (2004), Learning for Tomorrow’s World: First Results from
PISA 2003, PISA, OECD Publishing, Paris (27 citations).

9. Auld, E. and Morris, P. (2016). PISA, policy and persuasion: translating
complex conditions into education "best practice". Comparative
Education (26 citations).
10. Ertl, H. (2006). Educational standards and the changing discourse
on education: the reception and consequences of the PISA study in
Germany. Oxford Review of Education (26 citations).

Of these ten, three were published by the OECD. The remaining articles
all relate to the topic of education policies, and Sellar and Lingard
appear in two of them. The presence of each of these ten articles ranges
from 6% (26 of the 408 articles) to 16% (67 of the 408 articles) of the
total number of articles analysed.

Authors
For this analysis, 20 authors were selected. The criterion was having two
or more documents in the set of articles, with the 20 authors chosen by
the greatest total link strength between them. These authors published
69 articles, representing 17% of the total number of articles, and they
represent 2.8% of the 722 authors. Of these 69 articles, 35% received
some type of funding.
The authors were divided into clusters according to their writ-
ing themes, as detailed in the introduction section. Five clusters were
obtained as seen in Table 2.6 and Figure 2.3. Each cluster will be detailed
later with a review of the literature related to each one of them.

Table 2.6 20 authors with the highest link strength and their citations

Figure 2.3 Author clusters

Cluster 1
This cluster is composed of five authors. It is the group of authors who
make a critical analysis of the PISA project. In this group of articles, we
find only documents from 2017 to 2020. There are 16 papers, of which
five were funded, that is about 31%.
One pressing criticism concerns the implementation of the PISA for
Development project, also known as PISA-D. This project is seen as a
strategy to legitimize participating states more than to assess education
(Auld et al., 2020). Some argue that having a relationship with the
OECD, and appearing in its reports and databases, presents itself as a
way of legitimizing states as modern and accountable (Addey, 2020).
Questions are raised about governance in the post-2015 era (Auld et al.,
2019), with the changes made by the OECD in the education landscape
presenting only a narrow picture of the values of education and failing
to recognize the complexity of education in all its aspects (Addey, 2020).
For this reason, the adaptation effect emerges, which comprises the
fact that the educational systems try to adapt to the PISA context, and
this new paradigm has implied changes by decision-makers in their inter-
action with the educational system (Addey & Gorur, 2020). It is then

stated that the challenge of ILSAs is not to establish a single argument,
but to create a democratic space in which legitimately diverse arguments
and intentions can be acknowledged, considered, brought together and
displayed (Addey et al., 2020).
Similarly, critics have drawn attention to the implementation of the
IELS programme, based on PISA but aimed at 5-year-olds, which is
presented as an attempt by the OECD to establish itself as a regulator of
early childhood skills and competencies (Auld & Morris, 2019a).
Because it is so mediatized, governance shapes and limits the overall
framework within which results are debated and has a powerful
influence on how local politicians represent PISA results and advocate
for their own policies (Grey & Morris, 2018).
Auld and Morris (2019b) argue that the official conception of “global
competence” adopted was heavily influenced by the organization’s
demand to position itself as the agency responsible for monitoring pro-
gress on the Sustainable Development Goals. It was then altered to match
what could be easily measured, and although the organization presents
its global competence through a humanitarian discourse, it is framed by
its economic mission.
While the OECD, according to the aforementioned authors, is positioned
as a major player in international regulation, its theoretical premise,
namely that countries' economic output, usually measured by GDP, is
underpinned by increased cognitive outcomes as measured by ILSAs, is
called into question.
The findings refute the close link between cognitive levels and per capita
GDP growth predicted by knowledge capital proponents, mainly the
OECD and the World Bank. These results suggest that the theory of
knowledge capital is now degenerate (Rappleye & Komatsu, 2020a).
Attention is then drawn to the fact that the new global policy regime is
based on misleading statistics (Komatsu & Rappleye, 2017a). In short,
one finds meagre evidence to support the relationship between ILSA and
economic growth (Komatsu & Rappleye, 2019).
Komatsu and Rappleye do not only limit themselves to criticizing the
theoretical basis of human capital theory, but also criticize the theories
used by the OECD to support the happiness measurement scale (Rap-
pleye et al., 2020), the importance of national exams as a major factor
of success in some countries (Rappleye & Komatsu, 2020b) as well as
learner-centred and teacher-centred theories, giving the PISA results in
science literacy as an example (Komatsu & Rappleye, 2017b).

Cluster 2
This cluster is composed of five authors. It is the group of authors who
analyse the importance of PISA in national education policies and their

performance. In this group of articles, we find only documents from 2011
to 2020. There are 15 papers, of which four were funded, that is about
27%.
In this cluster, the recurring theme is the standardization of education
systems based on the soft governance tools present in the PISA study.
An assessment of the different reactions of two countries, Germany and
the United States, to the PISA results shows that the German secondary
school system was strongly affected by international comparison,
starting with the first PISA study in 2000, and underwent comprehensive
changes, whereas the USA responded noticeably in public and political
discourse to its below-average ranking only in 2010 (Martens &
Niemann, 2013). In Switzerland, on the other hand, PISA's transnational
communication platform enabled policy learning at the expert level, leading to
a rather high policy convergence. This was not the case in the United
States, where PISA was considered only one of the many studies assessing
the performance of education systems (Bieber & Martens, 2011).
When analysing the French system, noted in past centuries as one of
the most evolved, it was found that the “information-based market”
institutional regime shows high levels of student performance, but the
liberal “laissez-faire” type performs even better (Teltemann & Windzio,
2019). It becomes apparent that country-specific dependencies and policy
legacies, such as different systems of devolution of power, testing tradi-
tions and also the influence of non-governmental actors, also moderate
the impact of ILSAs (Niemann et al., 2018), as it is not only transnational
pressures that are crucial determinants of the fate of possible reform
measures, but also the capacity of the state to transform its education
system and take corrective action (Dobbins & Martens, 2012).
There are groups of countries that are mainly distinguished by dif-
ferent levels of prevalence of assessment, accountability and evaluation
practices (Teltemann & Jude, 2019). As an example, some students ben-
efited from the lower standardization of educational input because per-
formance gaps were smaller when a country’s educational resources were
unevenly distributed (Teltemann & Schunck, 2016).
These reflections leave open the different ways in which international
assessments are used to guide education policy in national spaces and
the role of the OECD as a transnational policy steering agent (Engel,
2015). Of note are the gaps opened by the lack of literature on cost-
benefit frameworks that may be useful for ongoing policy deliberations
on participation in PISA and other large-scale international assessments
(Engel & Rutkowski, 2020), encouraging the OECD to be transparent in
releasing results and educational stakeholders to be cautious interpreters
of upcoming results and rankings (Engel et al., 2019).
The OECD and its distinct approach to soft governance through hard
facts can become a model for other international organizations, both in

the field of education and beyond (Niemann & Martens, 2018) leading
to a need for greater involvement of non-OECD members in the PISA
study and scale development (Niemann et al., 2017).
Competitive comparison in education has deepened through the detail-
ing of ILSA data to new scales beyond the nation state (Engel & Frizzell,
2015). This context provides the basis for a discussion of how school-
based international assessment can operate as a governance tool, ena-
bling international organizations to have a greater influence on local
education policy formation and implementation (Rutkowski, 2015).

Cluster 3
This cluster is composed of four authors. It is the group of authors who
analyse the application of PISA and its derivatives, PISA for Schools and
PISA4U. In this group of articles, we find only papers from 2011 to 2020.
There are 18 papers, of which six were funded, that is about 33%.
Lewis et al. (2016) suggest that PISA for Schools provides an
exemplary demonstration of heterarchical governance, in which vertical
policy mechanisms open up horizontal spaces for new policy actors. It
also creates proportional spaces of comparison and governance, allowing
the OECD to “reach” spaces at the school level and directly influence
local educational practices. This is described as a “convergence of policy
method” in widely different policy contexts, where rapid policies and
methods of promoting such policies appear to dominate over potentially
more thoughtful policies and contextual and applied approaches
(Lewis & Hogan, 2019).
PISA for Schools reflects contradictory logics within the European
School System, where the inherently context-based goal of “becoming
European” is juxtaposed with the desire to employ de-contextualized
international evidence, giving rise to a perceived need for such data,
coupled with the overall authority of the OECD, which can produce a
problematic focus on data-driven practice (Lewis, 2020a). It follows,
then, that treating PISA for Schools and other similar education services
as a product results in a potentially dangerous conflation of public and
private benefits, with the potential that (private) profit may end up
trumping (public) education (Lewis, 2017b). These facts position the
OECD as the global expert on education policy, and now, with PISA
for Schools, the local expert on “what works and what doesn’t work in
education” (Lewis, 2017a).
Lewis (2014) argues that the early stages of test production by inter-
national organizations are significant sites in which the global govern-
ance of education is legitimized and enacted. These re-articulations are
set against the extension of economic-social and neo-social rationalities
in all domains of life and the topological production of new spaces of

politics and power (Lingard et al., 2014). PISA and the OECD’s edu-
cation work, more broadly, have facilitated new epistemological and
infrastructural modes of global governance for the OECD in education
(Sellar & Lingard, 2014) and the manner in which they can mobilize
arguments and the media to expand the definition of educational qual-
ity and equity to enrich our vision of education and our debates about it
(Anagnostopoulos et al., 2016).
Also problematically, PISA4U allows the OECD to consolidate its status
as a global expert on education by providing a technical and discursive
platform from which to speak to the teaching profession, which risks
displacing more professionally oriented forms of teacher knowledge and
experience (Lewis, 2020b).

Cluster 4
This cluster is composed of three authors. It is the group of authors who
make an analysis of PISA-based effectiveness, efficiency and performance.
In this group of articles, we found only papers from 2011 to 2020. There
are 16 papers, of which six were funded, that is about 38%.
Following a careful analysis of the PISA data regarding schools, the
results reveal that the average efficiency of schools is close to 70%, so
achievement could be increased by 30% through more effective use of
available resources. In addition, some practices related to teacher
accountability, engagement and professional development, as well as
extracurricular activities, are also positively associated with higher
levels of efficiency (Agasisti & Zoido, 2019). Results show that
students whose teachers focus on some teaching practices achieve better
results than those who have teachers who resort to many different activi-
ties in the classroom (Cordero & Gil-Izquierdo, 2018), considering that
traditional teaching methods have a positive influence on students’ profi-
ciency in mathematics, while implementing more innovative active learn-
ing strategies seems to have a negative impact on students’ performance
(Cordero & Gil-Izquierdo, 2018b). This analysis is reinforced by schools
attended by resilient students, which offer more extracurricular activi-
ties and are characterized by a more positive school climate (Agasisti &
Longobardi, 2017).
Teacher salaries and Internet use (as a proxy for technological literacy)
play a positive role in educational achievement (Agasisti, 2014). Despite
this, the results suggest that a more cautious approach should be taken
to the widespread use of digital innovation to support students’ work
outside of school (Agasisti et al., 2020).
While individual-level characteristics play a role in outcomes, some
school factors (i.e., extracurricular activities and school leadership) are
also involved, suggesting implications related to management policies

(Agasisti & Longobardi, 2014). However, educational funding can help
disadvantaged students get the opportunities they would not otherwise
have. This effect appears to be heterogeneous and mainly driven by coun-
tries whose economic development (in terms of GDP per capita) is lower
(Agasisti et al., 2017).
The results also show that most schools in OECD countries tend to be
less efficient in reading than in mathematics (Aparicio et al., 2018).
In the personal and family aspect, the direct impact of the family con-
text on financial literacy is real, given the mediating effect of students’
attitudes and motivations. Focusing on the gender gap, the process of
skills acquisition does not show significant gender differences (Longob-
ardi et al., 2017). A decomposition exercise of the gender gap in financial
literacy confirms the role played by motivational and behavioural factors
and, at the same time, highlights that putting men and women on an
equal footing regarding personal characteristics is not enough to close
this gap (Longobardi et al., 2018).
Finally, at the national and cross-national levels, it is found that
efficiency scores vary considerably between and within countries
(Agasisti & Zoido, 2018). There is greater heterogeneity across coun-
tries than across schools. In particular, differences in efficiency estimates
are mainly explained by economic indicators and cultural values. In
contrast, some factors previously identified as potential determinants
of student performance, such as the existence of tracking or central
examinations, do not seem to significantly affect the efficiency of sec-
ondary schools (Cordero et al., 2018). The results show a positive effect
of competition on school performance. This is relevant for policymak-
ing because competition seems to affect school performance (Agasisti &
Murtinu, 2012).

Cluster 5
This cluster is composed of three authors. It is the group of authors who
analyse the OECD agenda through PISA with special emphasis on an
analysis of the “regression to the mean” in Finnish results. In this group
of articles, we find only papers from 2009 to 2021. There are four papers,
of which three were funded, a total of 75%.
Rautalin et al. (2021) analysed OECD economic studies between 1995
and 2015. The results show that, while the reports were portrayed as
“scientific” as early as the 1960s, in the 2000s the reports clearly shifted
from the language of economics to a more popularized consultancy lan-
guage. The authors argue that these changes happened because of the
OECD’s reactions to transformations in the broader institutional envi-
ronment and were motivated by its efforts to appear as a significant actor
in knowledge-based policy-making.

This analysis shows that education policy debates feature in an
increasingly global discourse, in which organizations such as the OECD play
an authoritative role (Rautalin et al., 2019). As an example, we note
the obvious decline in PISA scores by Finland and the inability of the
previously so well-positioned political elite to manage the public debate
on changes in PISA scores by fuelling a critical discussion on education
that was rhetorically much more challenging in the previous publicity
around PISA (Rautalin, 2018). Thus, the analysis shows that conclusions
drawn from PISA results in texts representing central government views
are biased and justify its political agenda (Rautalin & Alasuutari, 2009).

Conclusions
There is broad coverage of the themes made possible by the PISA study,
with a more detailed emphasis on its implications for policy decisions.
No articles in this batch analyse the constructs created by the OECD in
its context questionnaires. Similarly, the issue of teachers is only weakly
addressed, despite the introduction of specific teacher questionnaires;
we estimate that this may be related to the co-existence of the TALIS
project, also conducted by the OECD. Moreover, there is very limited
cross-referencing of PISA data with other OECD studies such as TALIS,
PIAAC and ESonline, or with other international studies such as TIMSS,
PIRLS, ePIRLS and ICILS.
This chapter aimed to fill a gap found in the literature related to the
quantification and cataloguing of the extensive repertoire of documents
relevant to the study area.

References
Addey, C. (2020). The appeal of PISA for development in Ecuador and Paraguay:
Theorising and applying the global ritual of belonging. Compare: A Journal of
Comparative and International Education. http://doi.org/10.1080/03057925.
2019.1623653
Addey, C., & Gorur, R. (2020). Translating PISA, translating the world. Com-
parative Education. http://doi.org/10.1080/03050068.2020.1771873
Addey, C., Maddox, B., & Zumbo, B. (2020). Assembled validity: Rethinking
Kane’s argument-based approach in the context of international large-scale
assessments (ILSAs). Assessment in Education-Principles Policy & Practice.
http://doi.org/10.1080/0969594X.2020.1843136
Agasisti, T. (2014). The efficiency of public spending on education: An empiri-
cal comparison of EU countries. European Journal of Education. http://doi.
org/10.1111/ejed.12069
Agasisti, T., Gil-Izquierdo, M., & Han, S. (2020). ICT use at home for school-
related tasks: What is the effect on a student’s achievement? Empirical evidence
from OECD PISA data. Education Economics. http://doi.org/10.1080/09645
292.2020.1822787

Agasisti, T., & Longobardi, S. (2014). Inequality in education: Can Italian dis-
advantaged students close the gap? Journal of Behavioral and Experimental
Economics. http://doi.org/10.1016/j.socec.2014.05.002
Agasisti, T., & Longobardi, S. (2017). Equality of educational opportunities,
schools’ characteristics and resilient students: An empirical study of EU-15
countries using OECD-PISA 2009 data. Social Indicators Research. http://doi.
org/10.1007/s11205-016-1464-5
Agasisti, T., Longobardi, S., & Regoli, A. (2017). A cross-country panel approach
to exploring the determinants of educational equity through PISA data. Qual-
ity & Quantity. http://doi.org/10.1007/s11135-016-0328-z
Agasisti, T., & Murtinu, S. (2012). ‘Perceived’ competition and performance in
Italian secondary schools: New evidence from OECD-PISA 2006. British Edu-
cational Research Journal. http://doi.org/10.1080/01411926.2011.588314
Agasisti, T., & Zoido, P. (2018). Comparing the efficiency of schools through inter-
national benchmarking: Results from an empirical analysis of OECD PISA 2012
data. Educational Researcher. http://doi.org/10.3102/0013189X18777495
Agasisti, T., & Zoido, P. (2019). The efficiency of schools in developing countries,
analysed through PISA 2012 data. Socio-Economic Planning Sciences. http://
doi.org/10.1016/j.seps.2019.05.002
Anagnostopoulos, D., Lingard, B., & Sellar, S. (2016). Argumentation in educa-
tional policy disputes: Competing visions of quality and equity. Theory into
Practice. http://doi.org/10.1080/00405841.2016.1208071
Aparicio, J., Cordero, J., Gonzalez, M., & Lopez-Espin, J. (2018). Using non-
radial DEA to assess school efficiency in a cross-country perspective: An empir-
ical analysis of OECD countries. Omega-International Journal of Management
Science. http://doi.org/10.1016/j.omega.2017.07.004
Asan, A., & Aslan, A. (2020). Quartile scores of scientific journals: Meaning,
importance and usage. Acta Medica Alanya, 4(1), 102–108.
Auld, E., Li, X., & Morris, P. (2020). Piloting PISA for development to suc-
cess: An analysis of its findings, framework and recommendations. Compare:
A Journal of Comparative and International Education. http://doi.org/10.
1080/03057925.2020.1852914
Auld, E., & Morris, P. (2016). PISA, policy and persuasion: Translating com-
plex conditions into education “best practice.” Comparative Education, 52(2),
202–229. doi:10.1080/03050068.2016.1143278
Auld, E., & Morris, P. (2019a). The OECD and IELS: Redefining early childhood
education for the 21st century. Policy Futures in Education, 17(1), 11–26.
https://doi.org/10.1177/1478210318823949
Auld, E., & Morris, P. (2019b). Science by streetlight and the OECD’s measure of
global competence: A new yardstick for internationalisation? Policy Futures in
Education, 17(6), 677–698. https://doi.org/10.1177/1478210318819246
Auld, E., Rappleye, J., & Morris, P. (2019). PISA for development: How the
OECD and World Bank shaped education governance post-2015. Comparative
Education. http://doi.org/10.1080/03050068.2018.1538635
Aydin, A., Erdag, C., & Tas, N. (2011). A comparative evaluation of PISA 2003–
2006 results in reading literacy skills: An example of top-five OECD countries
and Turkey. Kuram ve Uygulamada Egitim Bilimleri, 11, 651–673.
How PISA is present in the scientific production 43

Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in edu-
cation? Lessons from Switzerland and the US. European Journal of Education.
http://doi.org/10.1111/j.1465-3435.2010.01462.x
Castro, S., & Sevillano, M. (2019). Análisis bibliométrico de la investigación
educativa sobre desventaja sociocultural/socieducativa en el periodo 2015 a
2019. Enseñanza & Teaching: Revista Interuniversitaria de Didáctica, 37(2),
147–164. https://doi.org/10.14201/et2019372147164
Chadegani, A., Salehi, H., Yunus, M., Farhadi, H., Fooladi, M., Farhadi, M., &
Ebrahim, N. (2013). A comparison between two main academic literature col-
lections: Web of Science and Scopus databases. Asian Social Science, 9(5).
Cordero, J., & Gil-Izquierdo, M. (2018). The effect of teaching strategies on stu-
dent achievement: An analysis using TALIS-PISA-link. Journal of Policy Mod-
eling. http://doi.org/10.1016/j.jpolmod.2018.04.003
Cordero, J., Polo, C., Santin, D., & Simancas, R. (2018). Efficiency measure-
ment and cross-country differences among schools: A robust conditional
nonparametric analysis. Economic Modelling. http://doi.org/10.1016/j.
econmod.2018.05.001
Dobbins, M., & Martens, K. (2012). Towards an education approach à la finlandaise? French education policy after PISA. Journal of Education Policy. http://doi.org/10.1080/02680939.2011.622413
Domínguez, M., Vieira, M., & Vidal, J. (2012). The impact of the programme
for international student assessment on academic journals. Assessment in
Education: Principles, Policy & Practice, 19(4), 393–409. doi:10.1080/0969
594X.2012.659175
Engel, L. (2015). Steering the national: Exploring the education policy uses of
PISA in Spain. European Education. http://doi.org/10.1080/10564934.2015.
1033913
Engel, L., & Frizzell, M. (2015). Competitive comparison and PISA brag-
ging rights: Sub-national uses of the OECD’s PISA in Canada and the USA.
Discourse-Studies in the Cultural Politics of Education. http://doi.org/10.1080/
01596306.2015.1017446
Engel, L., & Rutkowski, D. (2020). Pay to play: What does PISA participation
cost in the US? Discourse-Studies in the Cultural Politics of Education. http://
doi.org/10.1080/01596306.2018.1503591
Engel, L., Rutkowski, D., & Thompson, G. (2019). Toward an international
measure of global competence? A critical look at the PISA 2018 framework.
Globalisation Societies and Education. http://doi.org/10.1080/14767724.
2019.1642183
Ertl, H. (2006). Educational standards and the changing discourse on educa-
tion: The reception and consequences of the PISA study in Germany. Oxford
Review of Education, 32(5), 619–634. doi:10.1080/03054980600976320
Flis, I., & van Eck, N. (2017). Framing psychology as a discipline (1950–1999):
A large-scale term co-occurrence analysis of scientific literature in psychology.
History of Psychology, 21(4). http://doi.org/10.1037/hop0000067
Grek, S. (2009). Governing by numbers: The PISA “effect” in Europe. Journal of
Education Policy, 24(1), 23–37. doi:10.1080/02680930802412669
44 Carlos Décio Cordeiro and Vítor Duarte Teodoro

Grey, S., & Morris, P. (2018). PISA: Multiple ‘truths’ and mediatised global
governance. Comparative Education. http://doi.org/10.1080/03050068.2018.
1425243
Gumus, S., & Atalmis, E. (2011). Exploring the relationship between purpose
of computer usage and reading skills of Turkish students: Evidence from PISA
2006. Turkish Online Journal of Educational Technology, 10(3), 129–140.
Hirsch, J. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569–16572. https://doi.org/10.1073/pnas.0507655102
Hopfenbeck, T., Lenkeit, J., Masri, Y., Cantrell, K., Ryan, J., & Baird, J. (2018).
Lessons learned from PISA: A systematic review of peer-reviewed articles on
the programme for international student assessment. Scandinavian Journal of
Educational Research, 62(3). https://doi.org/10.1080/00313831.2016.1258726
Komatsu, H., & Rappleye, J. (2017a). A new global policy regime founded on
invalid statistics? Hanushek, Woessmann, PISA, and economic growth. Com-
parative Education. http://doi.org/10.1080/03050068.2017.1300008
Komatsu, H., & Rappleye, J. (2017b). A PISA paradox? An alternative theory
of learning as a possible solution for variations in PISA scores. Comparative
Education Review. http://doi.org/10.1086/690809
Komatsu, H., & Rappleye, J. (2019). Refuting the OECD-World Bank develop-
ment narrative: Was East Asia’s ‘economic miracle’ primarily driven by educa-
tion quality and cognitive skills? Globalisation Societies and Education. http://
doi.org/10.1080/14767724.2019.1577718
Lewis, S. (2014). The OECD, PISA and educational governance: A call to critical
engagement. Discourse-Studies in the Cultural Politics of Education. http://
doi.org/10.1080/01596306.2014.899833
Lewis, S. (2017a). Governing schooling through ‘what works’: The OECD’s PISA
for Schools. Journal of Education Policy. http://doi.org/10.1080/02680939.
2016.1252855
Lewis, S. (2017b). Policy, philanthropy and profit: The OECD’s PISA for Schools
and new modes of heterarchical educational governance. Comparative Educa-
tion. http://doi.org/10.1080/03050068.2017.1327246
Lewis, S. (2020a). ‘Becoming European’? Respatialising the European schools
system through PISA for Schools. International Studies in Sociology of Educa-
tion. http://doi.org/10.1080/09620214.2019.1624593
Lewis, S. (2020b). Providing a platform for ‘what works’: Platform-based gov-
ernance and the reshaping of teacher learning through the OECD’s PISA4U.
Comparative Education. http://doi.org/10.1080/03050068.2020.1769926
Lewis, S., & Hogan, A. (2019). Reform first and ask questions later? The impli-
cations of (fast) schooling policy and ‘silver bullet’ solutions. Critical Studies in
Education. http://doi.org/10.1080/17508487.2016.1219961
Lewis, S., Sellar, S., & Lingard, B. (2016). PISA for schools: Topological rational-
ity and new spaces of the OECD’s global educational governance. Comparative
Education Review, 60(1), 27–57. http://doi.org/10.1086/684458
Lingard, B., Sellar, S., & Savage, G. (2014). Re-articulating social justice as equity
in schooling policy: The effects of testing and data infrastructures. British Jour-
nal of Sociology of Education. http://doi.org/10.1080/01425692.2014.919846
Longobardi, S., Pagliuca, M., & Regoli, A. (2017). Family background and
financial literacy of Italian students: The mediating role of attitudes and moti-
vations. Economics Bulletin, 37(4).
Longobardi, S., Pagliuca, M., & Regoli, A. (2018). Can problem-solving atti-
tudes explain the gender gap in financial literacy? Evidence from Italian stu-
dents’ data. Quality & Quantity. http://doi.org/10.1007/s11135-017-0545-0
Martens, K., & Niemann, D. (2013). When do numbers count? The differential
impact of the PISA rating and ranking on education policy in Germany and the
US. German Politics. http://doi.org/10.1080/09644008.2013.794455
Martínez-López, F., Merigó, J., Valenzuela-Fernández, L., & Nicolás, C. (2018).
Fifty years of the European journal of marketing: A bibliometric analysis.
European Journal of Marketing, 52(1/2), 439–468.
Mascarenhas, C., Ferreira, J. J., & Marques, C. (2018). University – industry
cooperation: A systematic literature review and research agenda. Science and
Public Policy. http://doi.org/10.1093/scipol/scy003/4829714
Meyer, H., & Benavot, A. (Eds.). (2013). PISA, power, and policy: The emer-
gence of global educational governance. Symposium Books.
Mølstad, C. E., Pettersson, D., & Forsberg, E. (2017). A game of thrones: Organ-
ising and legitimising knowledge through PISA research. European Educational
Research Journal, 16(6), 869–884. https://doi.org/10.1177/1474904117715835
Morgan, C. (2011). Constructing the OECD PISA. In M. Pereya (Ed.), PISA
under examination: Changing knowledge, changing tests, and changing
schools. Sense Publishers.
Niemann, D., Hartong, S., & Martens, K. (2018). Observing local dynamics of
ILSA projections in federal systems: A comparison between Germany and the
United States. Globalisation Societies and Education. http://doi.org/10.1080/
14767724.2018.1531237
Niemann, D., & Martens, K. (2018). Soft governance by hard fact? The OECD
as a knowledge broker in education policy. Global Social Policy. http://doi.
org/10.1177/1468018118794076
Niemann, D., Martens, K., & Teltemann, J. (2017). PISA and its consequences:
Shaping education policies through international comparisons. European Jour-
nal of Education. http://doi.org/10.1111/ejed.12220
OECD. (2001). Knowledge and skills for life: First results from PISA 2000.
OECD Publishing. https://doi.org/10.1787/9789264195905-en
OECD. (2004). Learning for tomorrow’s world: First results from PISA 2003.
OECD Publishing. https://doi.org/10.1787/9789264006416-en
OECD. (2014). PISA 2012 technical report. OECD Publishing.
Palmblad, M., & van Eck, N. (2018). Bibliometric analyses reveal patterns of
collaboration between ASMS members. Journal of The American Society for
Mass Spectrometry, 29(3), 447–454.
Pons, X. (2017). Fifteen years of research on PISA effects on education govern-
ance: A critical review. European Journal of Education, 52(2), 131–144.
Rappleye, J., & Komatsu, H. (2020a). Is knowledge capital theory degenerate?
PIAAC, PISA, and economic growth. Compare: A Journal of Comparative and
International Education. http://doi.org/10.1080/03057925.2019.1612233
Rappleye, J., & Komatsu, H. (2020b). Is shadow education the driver of East
Asia’s high performance on comparative learning assessments? Education Pol-
icy Analysis Archives. http://doi.org/10.14507/epaa.28.4990
Rappleye, J., Komatsu, H., Uchida, Y., Krys, K., & Markus, H. (2020). ‘Better
policies for better lives’? Constructive critique of the OECD’s (mis)measure
of student well-being. Journal of Education Policy. http://doi.org/10.1080/
02680939.2019.1576923
Rautalin, M. (2018). PISA and the criticism of Finnish education: Justifications
used in the national media debate. Studies in Higher Education. http://doi.org/
10.1080/03075079.2018.1526773
Rautalin, M., & Alasuutari, P. (2009). The uses of the national PISA results by
Finnish officials in central government. Journal of Education Policy. http://doi.
org/10.1080/02680930903131267
Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education
policies: Does PISA have an effect? Journal of Education Policy. http://doi.org/
10.1080/02680939.2018.1462890
Rautalin, M., Syväterä, J., & Vento, E. (2021). International organizations establish-
ing their scientific authority: Periodizing the legitimation of policy advice by the
OECD. International Sociology. http://doi.org/10.1177/0268580920947871
Rutkowski, D. (2015). The OECD and the local: PISA-based Test for Schools in
the USA. Discourse-Studies in the Cultural Politics of Education. http://doi.org
/10.1080/01596306.2014.943157
Sellar, S., & Lingard, B. (2013a). Looking east: Shanghai, PISA 2009 and the
reconstitution of reference societies in the global education policy field.
Comparative Education. http://doi.org/10.1080/03050068.2013.770943
Sellar, S., & Lingard, B. (2013b). The OECD and global governance in education.
Journal of Education Policy. http://doi.org/10.1080/02680939.2013.779791
Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New
global modes of governance in education. British Educational Research Jour-
nal. http://doi.org/10.1002/berj.3120
Takayama, K. (2008). The politics of international league tables: PISA in Japan’s
achievement crisis debate. Comparative Education, 44(4), 387–407. http://doi.
org/10.1080/03050060802481413
Teltemann, J., & Jude, N. (2019). Assessments and accountability in secondary
education: International trends. Research in Comparative and International
Education. http://doi.org/10.1177/1745499919846174
Teltemann, J., & Schunck, R. (2016). Education systems, school segregation, and
second-generation immigrants’ educational success: Evidence from a country-
fixed effects approach using three waves of PISA. International Journal of
Comparative Sociology. http://doi.org/10.1177/0020715216687348
Teltemann, J., & Windzio, M. (2019). The impact of marketisation and spa-
tial proximity on reading performance: International results from PISA 2012.
Compare: A Journal of Comparative and International Education. http://doi.
org/10.1080/03057925.2018.1458597
van Eck, N. J., & Waltman, L. (2007). VOS: A new method for visualizing simi-
larities between objects. In H.-J. Lenz & R. Decker (Eds.), Advances in data
analysis: Proceedings of the 30th annual conference of the German Classifica-
tion Society (pp. 299–306). Springer.
van Eck, N. J., & Waltman, L. (2010). VOSViewer: Visualizing scientific land-
scapes [Software]. www.vosviewer.com
van Eck, N. J., Waltman, L., Dekker, R., & Van den Berg, J. (2010). A compari-
son of two techniques for bibliometric mapping: Multidimensional scaling and
VOS. Journal of the American Society for Information Science and Technol-
ogy, 61(12), 2405–2416.
van Nunen, K., Li, J., Reniers, G., & Ponnet, K. (2018). Bibliometric analysis of
safety culture research. Safety Science, 108, 248–258.
Wong, D. (2018). VOSviewer. Technical Services Quarterly, 35(2), 219–220.
Xiaomin, L., & Auld, E. (2020). A historical perspective on the OECD’s ‘humani-
tarian turn’: PISA for development and the learning framework 2030. Com-
parative Education. https://doi.org/10.1080/03050068.2020.1781397
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266.
3 PISA as epistemic governance within the European political arithmetic of inequalities
A sociological perspective illustrating the French case
Romuald Normand

Over the past two decades, PISA has become a key element in guid-
ing national policies around the world and it has raised a great deal
of scientific and public debate (Meyer & Benavot, 2013). In the first
part of this chapter, I review this debate, not exhaustively, but by
showing how PISA has given rise to an international space
of circulation between science, expertise and policy despite increas-
ing criticism (Sellar & Lingard, 2014). After shedding light on the
genesis of the survey and its contribution to worldwide governing by
numbers, and on the networks of experts and policymakers that contributed
to its legitimization and dissemination, sociological research has moved
towards a more precise scrutiny of its structuring effects on education
policies (Grek, 2014). Sociologists are studying in particular the processes
of mediation and translation at work in national education systems as
evidence-based accountability policies are developed.
This chapter contributes to this emerging field of research by illustrat-
ing and situating the French case. However, it should be emphasized
that this national case can only be understood if it is contextualized
in a broader European space. Indeed, PISA is a key component of the
European lifelong learning strategy to which many elements of French
policy are related, as in other European countries. To avoid methodo-
logical nationalism, it is therefore important to consider how PISA
takes place in a European political arithmetic of inequalities, before
specifying the dissemination of this survey within the French context.
Then, I will show how an epistemic governance has been built around
the survey from knowledge produced by an association between experts
and policymakers and through a new formulation of the French ideal
of equal opportunities in education.

DOI: 10.4324/9781003255215-4
PISA as epistemic governance within the European political arithmetic 49

PISA between science, expertise and policy: a space of international circulation
Much research has explored the roots of the PISA survey since the First
International Mathematics Study implemented by the International
Association for the Evaluation of Educational Achievement (IEA) (Pet-
tersson, 2014). The survey was taken over by UNESCO and the
OECD, and it became a global undertaking involving an increasing num-
ber of states, while the United States wanted to export its accountability
methodology as an international standard (Sellar & Lingard, 2013a). At
least three fields of study devoted to PISA have been developed over the
last two decades: a careful examination of the epistemology of the sur-
vey and its metrology which have raised many criticisms regarding the
measurement of student skills; studies on the political conditions regard-
ing its implementation and its reception in many countries; an analysis
of the international circulation space based on links between epistemic
communities and expert networks that institutionalized PISA as global
governing by numbers.

The epistemology of PISA and its deep criticism


From an epistemological and methodological stance, psychologist David
C. Berliner has formulated solid criticism in summarizing previous
research findings (Berliner, 2020). Berliner and his colleagues first showed
how measurement could be used to produce biased evidence to
justify policy choices in the US context (Berliner & Biddle, 1995; Nich-
ols & Berliner, 2007). He then convincingly demonstrates that items
built for the international survey raise questions about establishing
equivalences across languages, particularly in the field of read-
ing. The context in which tests are administered and taken by students
explains some differences in interpreting instructions and exercises, as
well as expectations regarding success. But the problem also concerns the
way tests are methodologically designed and based on ranking students’
skills. The use of the Rasch model raises doubts about the psychometric
reliability of PISA, its sampling, and its test validity in terms of content,
design, consequences and predictability; these features remain open to
criticism even by the standards psychometrics itself has adopted.
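To make concrete what is at stake in this criticism, the dichotomous Rasch model (the simplest form of the item response models used in PISA scaling) expresses the probability that student $i$ answers item $j$ correctly through a single ability parameter $\theta_i$ and a single item difficulty parameter $b_j$:

\[
P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}
\]

The assumption that one latent dimension and one difficulty value per item hold identically across languages and school cultures is precisely what the criticisms summarized above call into question.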
Beyond this psychological and methodological criticism, there is a
cultural and social one (Sellar et al., 2017). Firstly, it concerns students’
preparation for PISA tests. The understanding of the stakes depends
on whether students are more or less familiar with testing, whether
items are more or less close to their native language and whether or
not pedagogy throughout the year is geared towards performance. Sec-
ondly, teachers’ and parents’ attitudes can influence conditions under
which the tests are taken. For example, in Asian countries, the attach-
ment to education, linked to Confucian ethics, common to parents and
teachers, puts strong pressure on student achievement. This success
refers to a logic of honour not only for the family but also for the coun-
try, and often comes with a nationalist feeling that pushes for the best
results. By contrast, in other countries, such as the United States,
these tests are not “high-stakes”: they have no bearing on students’
futures in terms of grade repetition or graduation, so they are not given
the same importance as tests like the SAT, which determine access to
universities. Thirdly, PISA tests do not take into account
the extreme social diversity of students and inequalities in schooling
that prevent some of them from participating, such as those from dis-
advantaged and remote rural areas in Asian countries. The shadow-education
industry, which subjects some pupils to a double school day from
early childhood, creates significant bias in scores as a result of teaching
to the test.
Another criticism relates to PISA’s policy and politics (Addey et al.,
2017). Some countries are adopting this international survey to dem-
onstrate their internationality (Steiner-Khamsi & Waldow, 2018) and
to present themselves as modern and responsible states vis-à-vis the
international community. Governments and organizations administer-
ing PISA justify their participation also in terms of evidence for policy
development and an opportunity to acquire an innovative and sophis-
ticated statistical methodology (e.g. item response theory) and to build
national capacity to develop large-scale assessments. Several academics
have studied country participation in international surveys from this per-
spective. Grek (2009) has shown that countries want to measure their
performance relative to OECD countries and to assess the skill gap but
also to “be part of the picture”. Addey (2015) argues that participa-
tion is justified as technical capacity building but PISA also allows such
countries to join in a “global ritual of belonging”, to institutionalize a
national culture of accountability, and to increase the evaluative pressure
on their national context.
Finally, PISA is considered a key measure of human capital and a
proxy indicator of global economic competitiveness, a view that the
OECD supports by suggesting a causal link between education
and economic performance. Countries then adopt a rationalist perspec-
tive of normative emulation or competition between national policies
(Addey, 2015). By adapting curricula to the needs of their economy and
international assessments, they create an isomorphism of skills that takes
place at a global scale.
The foundations of PISA as international governing by numbers
Another way of examining PISA is to look into the international survey
as a new governmentality in education (Lascoumes & Le Galès, 2007).
PISA is an instrument of public action that brings together not only
sophisticated techniques but also values and interpretations of the effi-
ciency and quality of education systems (Carvalho, 2020). To understand
the foundations of PISA in education policies, long before its internation-
alization, it is useful to consider the archaeology of international assess-
ments by showing how the US metric was globalized (Normand, 2020a).
Indeed, from 1964 to 1968, the ECAPE project (Exploratory Committee
on Assessing the Progress of Education) brought together members of
the US Congress, interest groups (notably the Carnegie and Ford foun-
dations) and representatives of US states to design and develop the first
federal policy on standards-based assessments (Lemann, 1999). The tests
were to cover reading, English, mathematics and science, and to diagnose
the strengths and weaknesses of the US education system. In 1968, the
provisional committee became the National Assessment of Educational
Progress (NAEP), and the first federal assessments were launched.
However, it was only after the publication of the report A Nation at
Risk (published in 1983) that the federal government took a renewed
interest in the NAEP. Its political and technical structures were com-
pletely overhauled, and the US Congress appointed a committee (the
National Assessment Governing Board [NAGB]) to develop standards
for student achievement, design tests, publish scores and ensure their dis-
semination at federal level. Since then, NAEP assessments have become
benchmarks for assessing US student skills, particularly after the “No
Child Left Behind” Act (2001). The NAGB has benefited from the exper-
tise of the Educational Testing Service (ETS), an agency specialized in test
design that administers the SAT (the admission test to US universities).
During the 1980s, as the United States increasingly pressured the OECD
to develop and expand international assessments, the NAEP served as a
benchmark for the revision of the first IEA surveys on mathematics. The
International Assessment of Mathematics and Sciences (IAEP) reused the
NAEP items, while the Education Testing Service gradually imposed its
expertise on the design of the PISA project. While PISA used the meth-
odological elements of NAEP, the IEA and ETS created a consortium (the
IERI or IEA-ETS Research Institute) to develop research and analysis on
international assessments, to train researchers and experts on these issues
and to disseminate their standards worldwide (Grek, 2013).
PISA then became an instrument of international trans-governance
that was developed in the context of building international education
indicators, the development of a comparative turn adopted by the
OECD’s CERI, with a view, under US pressure, to developing account-
ability and performance policies that would make it possible to compare
states (Normand, 2009). This story is now well known and documented
(Henry et al., 2001). With the political mobilization of the OECD
countries, but also the strengthening of statistical methods for data com-
pilation and analysis, an aggregation effect emerged at international
level resulting in the production of a vision and a discourse on this
international ranking and benchmark for national education policies
(Carvalho & Costa, 2015; Lindblad et al., 2018). The OECD secretariat
has been very active in bringing together and converging heterogene-
ous actors and activities towards a common framework for interpreting
numeracy and literacy issues on the basis of highly developed expertise
in statistics, psychometrics, comparative assessment, research on effec-
tive schools and the economics of human capital (Carvalho & Costa,
2016). As different types of international comparisons were harmonized
and data collection expanded, links were established with the European
Commission and major stories were fabricated for the media (Grek,
2014; Lawn & Grek, 2012).

PISA between storytelling, “Chinese miracle” and evidence-based education
In Europe, this PISA paradigm was imposed by the action of European
networks of experts and policymakers at the same time as the European
education strategy was being developed after the Lisbon conference
(Normand, 2010). Countries such as Finland were taken as models of
successful education policies, without really being able to explain these
success stories, except in large decontextualized accounts (Rinne et al.,
2004; Sahlberg, 2011; Simola, 2014). This institutional isomorphism
was completed by the OECD’s activism in organizing major interna-
tional forums dedicated to justifying reforms of education systems on
the basis of PISA. This was recently the case at the Davos forum, where
Andreas Schleicher, the OECD’s representative, called for a mobilization of global
business and entrepreneurship to redefine education systems according
to a neo-liberal agenda while praising the success of the Asian coun-
tries with the highest scores. Dissonance was emerging between mem-
ber countries calling for more testing and evidence-based policies, and
those in favour of more flexible systems of accountability (as in Finland
and New Zealand). However, proposals were made to strengthen and
broaden PISA by including social and emotional skills. Still, if we look
closely at the situation of Asian countries, notably China, in the survey
results, some questions remain concerning this success and its related
storytelling.
Several explanations are put forward by researchers to account for
the superiority of Chinese students. Mathematics is a good illustration.
The most common explanation is simply “culture”. Students’ intensive
work but also their commitment to excellence, as well as their parents’
or teachers’, is a powerful driving force. In Singapore, academic content
is highly demanding. Chinese students, compared to their for-
eign peers, have a better knowledge of numbers and basic arithmetic,
while some mathematical concepts are introduced much earlier than in
the West. They are also very quiet in the classroom and they only inter-
vene if the teacher asks them to do so. Much of the learning takes place
in silence with the idea that one can learn without speaking while being
strongly engaged in the activity. Teaching is strongly
teacher-directed.
“Olympiads” are regularly organized to select and reward the best students
in mathematics. In addition to parents’ attachment to excellence, school
competition is exacerbated by evening classes. The heritage of the Confu-
cian culture also explains principles of conduct and attitudes adopted by
students and teachers. They share a common belief in effort and persever-
ance, and in modesty and humility at work; respect for the authority of
teachers, parents and elders structures relationships as well as learning and
teaching activities. Finally, the Chinese language lends itself to logical thinking and
spatial representation. Indeed, Chinese characters, which are drawn from
a very early age, help children to develop their sense of geometry, while
modes of classification in the Chinese language make it easier to adopt
logical modes of reasoning.
Participation in the PISA survey is a challenge for China, as it is for
other countries, and Chinese policymakers are strongly attached to
international rankings as a means of gaining prestige and recognition
in the international arena (Sellar & Lingard, 2013b). This explains why
Shanghai’s success in the PISA rankings, even more than Singapore’s,
has strong symbolic and political connotations (Tan, 2017). However,
the country also faces major challenges (Xingguo & Normand, 2021).
The underdevelopment of Western China (lack of buildings, unqualified
teachers, scarcity of public funds etc.) is coupled with social and health
issues that relativize the country’s success in education beyond the main
Eastern cities. State action is also thwarted by the size of the school mar-
ket which, despite regulatory attempts, increases competition between
schools and puts strong pressure for success on the gāokǎo (the university
entrance exam). Despite the development of national assessments and
data management technologies, significant gaps remain between schools
and students at local level while even cities like Beijing and Shanghai are
confronted with social issues in education (screening of migrant students,
discrimination and poverty in some districts, inequalities of resources
depending on families, selective schools from early childhood).
The global storytelling about PISA nevertheless corresponds to a neo-
positivism and scientism that legitimizes the development of evidence-
based education technologies, even in China (Krejsler, 2013; Normand,
2016a, 2016b). There seems to be a fascination among some experts
and policymakers with medicine and with borrowing its analytical
and methodological tools. They believe that the transfer of these
methods to the area of education, such as systematic reviews of research
literature and randomized controlled trials, will make it possible to find
“robust” and “irrefutable” solutions to the “ills” of an education system
that needs to be “cured” of its “pathologies”. Evidence-based education
also facilitates a new alliance between economists seeking to strengthen
their positivist requirements in extending their data collection to new
social fields and the evaluation of educational policies, and psychologists
trying to perfect their psychometric tests and their treatment of larger
samples. These technologies reassure policymakers by apparently
providing short-term “solutions that work” and being well adjusted to
their policy agendas, especially since the formula is easily transposable to
the media and public space. In doing so, they contribute to discrediting
educational sciences accused of being biased, useless, complicated or
ideological. This epistemological orientation, which emerged in the
United States under the action of the New Right, before being taken up
by the OECD and the European Commission to make it an international
strategy for reconfiguring educational research, is now very much active
in Northern European countries, although it has been debated and
criticized by educational researchers (Biesta, 2007; Hammersley, 2007).
Evidence-based policies are legitimized and strengthened in a climate of
“fake news” where they are considered a bulwark against the excesses of
social networks, even as they serve as a rational justification for States
developing their accountability systems. This trend also shows the eternal
return in educational research, at a time of big data and neurosciences, to
neo-positivist and neo-behaviourist assumptions.

PISA and the European political arithmetic of inequalities
The PISA survey can also be considered a political assemblage whose
rationality and coherence are at first glance beyond the observer’s grasp
(Gorur, 2011). I have shown how these assemblages have been under
the responsibility of different epistemic communities, networks of experts
and policymakers associated with design indicators and benchmarks
for the European Open Method of Coordination including PISA data
(Normand, 2009, 2010). Actor-Network Theory (ANT) is particularly
fruitful not only in accounting for these assemblages but also in analysing
alliances, empowerment, circulation and translation within these different
spaces and calculating centres (Latour, 2005; Fenwick & Edwards,
2010). Studying PISA also calls for the development of a sociology of
measurement (Gorur, 2014) to explain the principles and components of
this international and European calculability as “instrumentalism” and
“political technology”. Rather than thinking of it in terms of principles
of equity, efficiency or quality, the challenge is to access the statistical
reasoning that explains the integration of PISA into the European lifelong
learning strategy and its Open Method of Coordination, by opening
up some perspectives in terms of demography, welfare and statistics
(Desrosières, 1998; Normand, 2013; Porter, 1996).

Political arithmetic and the European demographic challenge
While PISA is a new measure of inequalities in education, based
on differences in student achievement from psychometric tests, the
international survey is part of continuous transformations in issues related
to equal opportunity and its metrics since the 1920s. I have attempted
to highlight these epistemic times by characterizing the measurement of
inequalities from the first US IQ tests to the development of the first
major international survey by the IEA (Normand, 2011). In Europe, the
invention of the measurement of inequalities in education was inseparable
from a quest for welfare and a demographic challenge to improve the
quality and quantity of the population, against a background of eugenicist
ideology. The development of the comprehensive school, built on the
opposition to psychologists’ IQ tests, appears as a reformist achievement
not devoid of economic concerns about improving the stock of human
capital long before it was theorized and modelled by Gary Becker and his
followers. The demographic challenge has also been on the horizon of
educational policies with issues about social reproduction and mobility,
which fed the early development of the sociology of education, even from
a critical perspective. The definition of selection, merit and reproduction
criteria has been the subject of fierce debate and methodological as well
as epistemic changes, but it has formed the backbone of a state policy
aimed at qualifying an elite on behalf of a certain conception of equal
opportunities.
The definition of “human capital” itself has evolved. Initially focused on
“degeneration” and “deficiency” of the “unfit” through adjusted tests, it
has gradually taken a more positive turn in the valuation of the “reserve of
talents”, the promotion of the “most talented” that opened up the access
to secondary and higher education, towards today’s more predictive
conception of childhood development based on the measurement of basic
cognitive skills that would facilitate entry into the labour market
of the knowledge economy. While during the 1980s there were
still concerns about limiting the repetition rate and improving access
rates to higher education, measurements by economists, also relayed by
psychologists, are now geared towards limiting school dropout rates and
improving cognitive skills extended to the social and emotional area.
Measuring the adequacy between training and employment has been
replaced by randomized controlled trials based on targeted experiments
and evaluations on behalf of the new “experimental economics”, which
take the social and educational fields as a vast laboratory. Economists
have changed their epistemological and methodological toolbox, but
the idea of investment in human capital, albeit criticized, remains the
dominant orthodoxy. It is not surprising that some of their leaders, such
as Eric Hanushek and Ludger Woessmann, have taken an interest in the
PISA data, considering it to be an excellent measure of the quality and
efficiency of education systems, and a benchmark for judging how well
countries are investing in their human capital (Hanushek & Woessmann, 2010).
These conceptions are taken up by welfare theorists such as Esping-
Andersen and his colleagues (Esping-Andersen et al., 2001). According to
them, two reasons justify some expectations regarding the level of skills
and human capital among children. The first is demographic. Because
of low fertility, future cohorts of young people, modest in size, will have to
support a large and rapidly growing elderly population. It is therefore
necessary to invest as early as possible in the productivity of youth to
ensure a sustainable Welfare State in Europe over the coming decades.
The other explanation lies in the rapidly expanding skillset required by
the knowledge economy. Reforms in European countries need to target
early school leavers with higher unemployment rates. These low-skilled
people are unlikely to obtain high retirement pensions and poverty
threatens them at the end of their lives. Cognitive (and non-cognitive)
skills are therefore considered essential for a good career path.
To the extent that acquired skills affect school success and lifelong
learning, it is important that all children get a good start to maximize
the “return on investment”. On this point, if we follow Esping-Andersen,
the situation of states is revealed by the PISA indicators, where a nation’s
superiority is explained by institutional factors generating differences
in performance. But social inheritance mechanisms are present in early
childhood. This is therefore where the Welfare State’s efforts must focus
on to create more equality and increase the productivity of the workforce.
The aim is to combat child poverty without compensating for inequalities
in parental resources in the acquisition of human capital. To achieve this,
it is necessary to reduce the economic insecurity of mothers at the bottom
of the income scale by supporting their inclusion into employment. The
other mechanism is to support parents’ investment in their children’s
cognitive development. Interventions should take the form of targeted
measures for “at-risk” children identified from early childhood.
PISA as epistemic governance within the European political arithmetic 57

Lifelong learning, human capital and statistical reasoning
It is easier to understand why PISA and other international assessments
have developed so much in recent years, under the aegis of the OECD
and Eurostat. They are a measurement of investment in human capi-
tal, of the reduction of inequalities in student outcomes, of the perfor-
mance and efficiency in the governance of education systems and, more
generally, judging from Esping-Andersen’s reflections, of the quality of
Welfare State reforms. A new political arithmetic of inequalities is there-
fore proposed by major international organizations, including the Euro-
pean Commission, to reformulate the statistical framework developed in
national contexts and to devise new measurement instruments.
This statistical reasoning was already formulated by Albert Tuijn-
man (2003), who led the OECD’s INES project. As a disciple of Torsten
Husén, he also worked as an expert for the World Bank, on adult basic
skills for major international surveys, and as an economist at the Euro-
pean Investment Bank. He advocated the development of a skills-based
approach at the core of the European Statistical System to make progress
in the European lifelong learning policy. For him, the development of
skills could be represented by a production function in education, corre-
sponding to a mathematical expression linking inputs (physical, financial
and human capital) to outputs (measurement of success in different skills,
values and attitudes). Lifelong learning was seen as an “insurance policy”
to minimize “market risks” related to the uncertainty about the costs and
risks associated with the investment in human capital. Building on the
categories of non-formal and informal learning valued by the European
Commission, Tuijnman ensured the collaboration between the INES
project and Eurostat services to build up statistics on lifelong learning.
His wishes have since been fulfilled in the development of a statistical
apparatus supporting the Open Method of Coordination in education.
This statistical argument also shows how European
statistics formulate a neo-liberal agency for the individualization of paths
and careers throughout people’s lives, in line with a certification of skills
that opens up an area of mobility and flexibility into the European labour
market (Normand & Pacheco, 2014).
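Tuijnman’s production-function framing, as described above, can be rendered schematically. The notation below is an illustrative sketch of such an education production function, not his exact specification:

```latex
% Schematic education production function (illustrative notation only):
% O_i : measured outputs for unit i (success in skills, values and attitudes)
% K^{phys}, K^{fin}, K^{hum} : physical, financial and human capital inputs
\[
  O_i = f\bigl(K_i^{\mathrm{phys}},\; K_i^{\mathrm{fin}},\; K_i^{\mathrm{hum}}\bigr)
\]
```

On this reading, lifelong learning policies act mainly on the human-capital input, while the “insurance policy” metaphor concerns the uncertainty of the returns to that investment.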

PISA, human capital and the European open method of coordination
The aim of the OMC was to construct quality and benchmarking indica-
tors that could help to monitor education systems on the basis of data
considered objective (Alexiadou et al., 2010). The design of a statistical
system for lifelong learning aimed to facilitate the recognition of learning
activities outside the formal education system (self-learning, on-the-job
training) and to enhance the value of individual investment in training
by developing tailored tools. The demographic challenge was also impor-
tant for the European Commission. It was still related to improving the
quantity and quality of the population, but through lifelong learning as
an alternative to the model of compulsory schooling developed during the 20th
century. The OMC indicators and benchmarks also aimed to measure
“school drop-outs” and “secondary school completion rates”, invest-
ment in education and training and its returns, employability and pro-
ductivity of older people.
If we examine in detail the measurement tools adopted under the
Open Method of Coordination in its early formulations, it is relevant
to link them to the demographic and employment issues that guide the
European lifelong learning strategy. It is then possible to reformulate a
certain number of equivalences by drawing comparisons with human
capital theory.
For example, participation in pre-school education or early school-
leaving (an equity domain) and the rate of completion of secondary
education (a lifelong learning domain) can be related to concerns
associated with the productivity of the labour force from youth to adulthood.
Similarly, investment in education and training (a modernizing higher
education domain) and returns on education and training (an employ-
ability domain) are concepts promoted by human capital economists.
The participation of adults in lifelong learning (lifelong learning domain)
and their skills (lifelong learning and employability domains) correspond
to the objectives of maintaining the productivity of seniors. The level
achieved by the population (employability domain), higher education
graduation rates and student mobility between countries (modernizing
higher education domain) reflect the European Commission’s concern
over building and strengthening the “reserve of talents” to compete glob-
ally with the United States, China and South-East Asian countries. Indi-
cators of skills in reading, mathematics, science and ICTs (mastery of key
competencies domain) are strongly linked to international comparisons
of results like PISA, which economists consider a good measurement of
the productivity and quality of education systems.
PISA data were therefore gradually integrated as OMC (Open Method
of Coordination) indicators while human capital economists were
networking to advise the European Commission on education policies.
Its Directorate General for Education and Culture (DGEAC) has indeed
created EENEE (European Expert Network on Economics of Education),
a network that presents itself as a “Think Tank” aiming to improve
decision-making and policy-making in the field of education and training
in Europe. It provides advice and support to the European Commission
in analysing the economic aspects of education policies and reforms.
The EENEE works as a European platform of exchange for education
economists, and a source of information for policymakers, media and
those interested in the economics of education in Europe to monitor the
OMC progress.

PISA as epistemic governance: an illustration from the French case
Beyond the role of international and transnational actors, the PISA survey
gives rise to re-problematizations and multiple translations according
to national contexts, in France as in other European countries (Krejsler
et al., 2014). There are varied political and institutional trajectories
showing diverse mobilized actors at the crossroads of legitimation and
politicization. In the French context, mediation characterizes PISA as a
boundary object (Normand, 2014) with multiple interpretations based
on heterogeneous principles of justice. It corresponds to epistemic
governance that can be analysed from the theoretical framework
developed by Pertti Alasuutari and Ali Qadir (2019). Indeed, French
social movements, interest groups and the political elite are not passive
in the face of the knowledge produced by PISA, which is also recycled
by some research institutions and think tanks. They seek to guide and
shape the thinking of others in the public and media space to change
their conception of educational reality and produce new meanings. Thus,
narratives and imageries in education are used in epistemic work that
calls as much for authority as for evidence or scientific reasoning that
escapes most educators. By acting on their desires, hopes and anxieties,
different individual and collective actors frame interpretations according
to values and beliefs based on the knowledge produced in the national
space around PISA. In France, this epistemic background is largely built
on the ideal of equal opportunities and the republican imagery.

PISA and the French republican imagery


The approach to PISA and accountability policies in France, from an insti-
tutionalist perspective, even if it focuses on the actors, too often adopts
the point of view of political decision-makers in a kind of “hypocriti-
cal conformism” that, following Pertti Alasuutari and Ali Qadir (2019),
does not take into account the imagery and arguments used to persuade
educators about the benefit of PISA. Epistemic cultures specific to meas-
urement in education are not only rooted in a positivist tradition that
has largely influenced the great republican narrative, from Condorcet to
Auguste Comte, but they also rely on metaphorical language. For exam-
ple, the idea of a necessary “culture of evaluation” has been forged as a
political argument while masking the stakes of metrics deployed by the
Ministry of Education during the last decades. Imagery in the rhetoric
used by policymakers or scripts that are developed by the French Minis-
try of Education provide opportunities for framing educational policies
and for conforming them to PISA and the Open Method of Coordination,
recently under the watchword of “re-founding the basic-skills school
system on a return to the legacy of the Republican founders” or “building a
trusting school system on basic skills”. Moreover, the “Republican pact”
appears to be the unassailable horizon for reforming the education sys-
tem on the basis of accountability and basic skills, and it is sometimes
even recognized as a “necessary fiction” by sociologists (Dubet, 2004).
This neo-republican imagery is characterized by a methodological
nationalism that turns the nation state into the natural and modern
form of the French education system and its organizing principle. Its
transformation and modernization are therefore the responsibility
of the State, governing by laws of the Republic and calling upon the
authority of science to fight against unequal opportunities. The idea
of educational progress is naturalized as a necessary step not only to
perpetuate the Republican heritage but also to avoid lagging behind
countries with the best PISA rankings. Finland has long served as an
idealized model to justify reform proposals because it seemed to
reconcile the implementation of basic skills with the maintenance of
an egalitarian system, in contrast to the marketization and neo-liberal
regimes in Anglo-Saxon countries, which are rejected by most French
policy-makers.
This permanent call for a “national leap”, largely embodied in
programming laws over several decades, as the last chance to reduce
inequalities between students, has remained a constant discourse shared
by ministers since the early 1980s. The public interest motif serves to
mask the variety of interest groups and “legitimate players”, who are
called upon to play a role in successive reforms. Beyond the recurrent
war between the Ministry and the more or less reformist trade unions,
there are power games and negotiations with other interest groups:
parents, secular or religious associations, political parties and members
of Parliament, and professional bodies within the public education
service. With the development of the media sphere and social networks,
the political rhetoric has evolved to challenge the media directly and
disseminate “political grammars” preparing mindsets for reforms that
were discussed upstream in think tanks and reformist circles in the
shadow of “invisible colleges” (Stone, 2008).
For lay people, the popular imagery is that of a Ministry of Education
organized by hierarchical levels and possessing a chain of command that
would implement a rational, bureaucratic top-down policy. The idea of a
minister as a charismatic authority who decides on everything is another
narrative, but it greatly simplifies the spatial and temporal ordering of
decisions and the implementation of reforms related to PISA. However,
this collective imagery maintains the idea of a power that “masters
and controls” policy-making with the required “moral authority” and
“legitimacy” to do so. In fact, this decision-making world is divided into
different antagonistic camps where a diversity of educational ideologies
and visions are expressed. It leads the State’s epistemic governance to
forge compromises on the PISA paradigm in dedicated spaces (major
debates, consensus conferences, parliamentary committees, high councils)
to prepare and legitimate its decision while at the same time trying to
affirm its authority.

The PISA paradigm and the construction of an epistemic authority
The PISA paradigm has been progressively embedded by the Ministry
of Education within its Department of Evaluation, Foresight and Per-
formance (Direction de l’Evaluation, de la Prospective et de la Perfor-
mance – DEPP), by establishing a regular “state of the school system” report for steering
the education system and promoting accountability. Assuming this
monopoly on governing by numbers, the Ministry delegates expertise to
its ministerial directorate which, coupled with reports from the General
Inspectorate, has a capacity-based authority (Normand, 2020b). It thus
gives the impression of objective measurement detached from reform-
ing issues taken on by the Minister and the Cabinet. This expertise is
strengthened by the regular creation of national high councils to assess
the education system and address recommendations, with members
appointed by the Minister. These councils take PISA data as the basis
for their assessments and conclusions. The delegation of the epistemic
authority to statistical and metrological objects, as the sociology of sci-
ence and technology has shown, is also a way of building evidence under
the guise of realism and neutrality. But this internal expertise would not
be enough to win the battle of opinions on the PISA paradigm among the
public and the media.
Another type of “ontological” authority is necessary. It is based on
the appeal to recognized external actors. This “ontological authority”,
which takes the form of expert knowledge, through expert opinion and
official reports, legitimizes decision-making by claiming relative neutral-
ity and impartiality. It refers to people, texts and institutions outside
the State that represent the state of the world expected by the PISA
paradigm. Through their speeches and writings, these experts contribute
to making PISA a reality in politics. This is why the Ministry was
able to call on experts who were involved in the INES project, such
as Norberto Bottani, to write reports justifying the development of
indicators and the more sustained use of international comparisons in
French education policy-making. It also took advantage of the French
presidency of the European Union to organize an international confer-
ence on international education indicators. This forum, which brought
together members of the Ministry, representatives of the European
Commission, the IEA and the OECD, legitimized the alignment of the
French policy with the Open Method of Coordination, and the adop-
tion of PISA as a benchmark for reforms to be carried out. From then
on, the OECD, through its representative Eric Charbonnier, regularly
informed the media about France’s position in the international rank-
ing, and made recommendations on measures to be adopted to reduce
inequality of opportunities.
Although expert knowledge was built largely on the expertise of the
DEPP, it also drew on contributions from human capital economists (rep-
resented by the Paris School of Economics), and psychologists specialized
in the metrology of tests (many of them coming from French-speaking
Belgium, such as Marc Demeuse or Marcel Crahay). Some French edu-
cational sociologists also helped to legitimize the validity of PISA with
policymakers and the media by developing a rhetoric of equal opportu-
nities. Christian Baudelot, originally a Marxist sociologist, meanwhile
converted to human capital theory, wrote a small book to justify the
methodological and epistemological robustness of the PISA survey and its
contribution to a better understanding of inequality of opportunities and
to the fight against the French selective meritocracy (Baudelot & Estab-
let, 2009). Some sociologists, such as François Dubet and Marie Duru-
Bellat, also close to reformist left-wing circles, sought to demonstrate the
value of the survey data by integrating them into a comparative study of
education systems in terms of equity and social cohesion, celebrating the
“Finnish miracle” while including these results in a rhetoric of justice on
equal opportunities (Dubet et al., 2010). Among the next generation,
Georges Felouzis, who had disseminated and publicized research find-
ings on school effectiveness, wrote a small manual that, while consider-
ing some limitations of the survey, praised the quality of the data and
their contribution to a better understanding of French students’ skills and
achievement gaps (Felouzis & Charmillot, 2012). Nathalie Mons also
wrote a book reusing OECD and PISA data to explain that France was
at a crossroads while calling for a change in educational policy, challeng-
ing the neo-liberal policies of free choice, high stakes testing and decen-
tralization, and justifying the necessary maintenance of the central state
to implement soft accountability and basic skills (Mons, 2015). These
proposals were in line with the positions of reformers concerned with
maintaining the moral authority of the French State while being deeply
attached to the republican imagery. Mons was later appointed head of
the CNESCO (National Council for the Evaluation of the School System) by a
Leftist minister (Vincent Peillon), in charge of organizing consensus conferences
as well as conducting studies to support ongoing education policies
implemented by the Ministry.
While part of the sociology of education was converted into a science
of government, joining reformist proposals related to the implementation
of basic skills, PISA was seen as naturally filling a void in the French
education landscape, without giving rise to much debate or controversy.
Through these cross-representations, based on public and invisible
interactions and exchanges, the paradigm acquired reality and legitimacy.
The PISA paradigm gradually became
the source of a national commitment in which some actors and interest
groups ended up attributing causes and responsibilities in the production
of school problems, identifying those responsible, particularly teachers,
proposing reform solutions and expert knowledge while participating in
the construction of a great national narrative on the reduction of inequal-
ities. To do this, they managed to master a set of rhetorical rules, between
science, expertise and politics, enabling their arguments to be acceptable,
relevant and legitimate from a moral or scientific perspective around the
same reformist interests, paving the way to a French-Third Way in educa-
tion and a new accountability policy.

The PISA paradigm and the French basic skills/accountability policy
Beyond the monopoly of expert knowledge, the imagery and rhetoric
developed by policymakers and experts in the media space, the adoption
of the PISA paradigm and its progressive integration into the French
accountability system can be explained by the relative efficiency of the
chain of command from the Ministry to the schools. The survival of
the Napoleonic model has much to do with this bureaucratic and legalistic
implementation of a “culture of evaluation” along the various strata of the
administration. Intermediate professional bodies are generally concerned
with respecting hierarchical orders and applying ministerial decisions in
accordance with laws and regulations. Centralization concentrates power
in a few officials and professional bodies, at the same time that it brings
into play a revolving-door dynamic at the top of the Ministry. However,
appointment to the highest positions is increasingly the result of
a narrow politicization, to the detriment of recognizing and rewarding
seniority and experience in the public service. Nevertheless, this strengthens
the moral authority of governmentality behind the great republican
principles that are always called upon, as well as the “force of law”, to
justify decision-making.
The integration of the PISA paradigm into the French education
system was achieved in three ways. First, it gradually became a rhetorical
justification for reforms carried out by successive ministers pointing
to the maintenance of strong inequalities compared to other countries
as measured by the PISA data. It then justified the introduction of
public management principles, with PISA data being included into the
measurement of the French education system outcomes (effectiveness)
alongside cost data (economy) and student enrolment and paths data
(efficiency). PISA also helped to legitimize the development of national
assessments and the connection between basic skills and the national
curriculum, which has undergone several successive reforms.
However, the French PISA paradigm must be situated in the broader
system of the Open Method of Coordination, for which the Ministry of
Education has sought to be a “good European student”. This explains in
particular the interest of French policymakers in limiting school dropouts
and implementing a proactive policy in this area. In addition to providing
a measurement of human capital investment, the inclusion of this OMC
indicator in the French context is presented as a welfare measure for
equal opportunities targeting the most disadvantaged students and the
development of skills for early childhood.
More recently, the French accountability policy has taken a further
step in developing evidence-based education that complements the PISA
paradigm. Initiated in the early 2000s, consensus conferences were first
established to validate expert recommendations on student literacy skills
before being used more widely to legitimize a national strategy for literacy
and numeracy. On each occasion, specific institutions bringing together
experts, political decision-makers and representatives of Parliament
and civil society were created by the government to legitimize this basic
skills policy, the PIREF 2002–20031 and the CNESCO from 2013.2
Human capital economists from the Paris School of Economics have
also regularly conducted experiments commissioned by the Ministry of
Education largely based on randomized controlled trials and focused on
class sizes, school dropouts, school rhythms and literacy. More recently,
the legitimization of cognitive psychology and neurosciences by the
Minister himself has led to the implementation of a large-scale national
programme focused on student skills in early childhood supervised by the
newly appointed National Scientific Council of Education. By promoting
evidence-based education, this national council hopes to disseminate
evidence-based recommendations and best practices to practitioners and
contribute to improving student achievement. This programme, according
to its leaders, coupled with the design of new national assessments, should
raise French students’ scores to higher positions in the PISA rankings.

Conclusion
More space would be needed to describe and analyse in depth the epis-
temic work that has been established around the PISA paradigm in
France. This chapter has shown that the PISA survey, in addition to
being a boundary object giving rise to multiple meanings and mobiliza-
tions at national and global scales, is based on an economic reasoning
that relativizes its uses to promote equality of opportunities. This inter-
national survey is undoubtedly a political technology and an instrument
of public action that legitimizes educational reforms towards greater
accountability and evidence-based education. From this point of view,
the PISA paradigm has a restructuring effect similar to that of
the comprehensive school during the 1960s and 1970s. This explains
the importance of studying these transformations over a long period
of time, through a history of the present, to highlight the genealogy
of the survey and its extension from the United States. The role of
major International Organizations in the mechanisms of PISA policy
travelling and transfer has also been long established. PISA is often
presented as a neo-liberal vanguard that contributes to introducing
privatization and private interests in public education. It is also a pow-
erful means of institutionalizing a new state epistemic governance on
behalf of the human capital theory. Beyond its instrumentalization by
indicators and data, and the production of expert knowledge within
and outside Ministries of Education, this epistemic work is based on
authority and imagery that are imposed on the public debate and the
media through a soft infusion, like an opium that puts the most vigilant
minds to sleep. To borrow from the Chinese general Sun Tzu’s Art of War,
the PISA paradigm is efficient enough to win the epistemic battle before
the fight has even been engaged. This is why sociology, as Pierre Bourdieu
argued, must remain a martial art: grounded in critical reflexive thinking
on international assessments, it should resist becoming a governing science.

Notes
1 Programme Incitatif de Recherche en Education et Formation.
2 Conseil National d’Évaluation du Système Scolaire.

References
Addey, C. (2015). Participating in international literacy assessments in Lao PDR
and Mongolia: A global ritual of belonging. In M. Hamilton, B. Maddox, &
C. Addey (Eds.), Literacy as numbers: Researching the politics and practices
of international literacy assessment. Cambridge University Press.
Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B., & Verger, A. (2017). The
rise of international large-scale assessments and rationales for participation.
Compare: A Journal of Comparative and International Education, 47(3),
434–452.
66 Romuald Normand

Alasuutari, P., & Qadir, A. (2019). Epistemic governance: Social change in the
modern world. Springer Nature.
Alexiadou, N., Fink-Hafner, D., & Lange, B. (2010). Education policy conver-
gence through the open method of coordination: Theoretical reflections and
implementation in ‘old’ and ‘new’ national contexts. European Educational
Research Journal, 9(3), 345–358.
Baudelot, C., & Establet, R. (2009). L’élitisme républicain: l’école française à
l’épreuve des comparaisons internationales. Seuil.
Berliner, D. C. (2020). Implications of understanding that PISA is simply another
standardized achievement test. In G. Fan & T. S. Popkewitz (Eds.), Handbook
of education policy studies (pp. 239–258). Springer.
Berliner, D. C., & Biddle, B. J. (1995). The manufactured crisis: Myth, fraud, and
the attack on America’s public schools. Longman Publishers.
Biesta, G. (2007). Why “what works” won’t work: Evidence-based practice and
the democratic deficit in educational research. Educational Theory, 57(1),
1–22.
Carvalho, L. M. (2020). Revisiting the fabrication of PISA. In G. Fan & T.
S. Popkewitz (Eds.), Handbook of education policy studies (pp. 259–273).
Springer.
Carvalho, L. M., & Costa, E. (2015). Seeing education with one’s own’ eyes and
through PISA lenses. Discourse: Studies in the Cultural Politics of Education,
36(5), 638–646.
Carvalho, L. M., & Costa, E. (2016). The praise of mutual – surveillance in
Europe. In R. Normand & J.-L. Derouet (Eds.), A European politics of educa-
tion? (pp. 53–72). Routledge.
Desrosières, A. (1998). The politics of large numbers: A history of statistical
reasoning. Harvard University Press.
Dubet, F. (2004). L’école des chances. Qu’est-ce qu’une école juste ? Éditions du
Seuil et La République des Idées.
Dubet, F., Duru-Bellat, M., & Vérétout, A. (2010). Les sociétés et leur école.
Emprise du diplôme et cohésion sociale. Le Seuil.
Esping-Andersen, G., Gallie, D., Hemerijck, A., & Myles, J. (2001). A new
welfare architecture for Europe. Report submitted to the Belgian Presidency
of the European Union.
Felouzis, G., & Charmillot, S. (2012). Les enquêtes PISA. PUF (col. Que sais-je?).
Fenwick, T., & Edwards, R. (2010). Actor-network theory in education.
Routledge.
Gorur, R. (2011). Policy as assemblage. European Educational Research Journal,
10(4), 611–622.
Gorur, R. (2014). Towards a sociology of measurement in education policy.
European Educational Research Journal, 13(1), 58–72.
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37.
Grek, S. (2013). Expert moves: International comparative testing and the rise of
expertocracy. Journal of Education Policy, 28(5), 695–709.
Grek, S. (2014). OECD as a site of co-production: European education govern-
ance and the new politics of ‘policy mobilization’. Critical Policy Studies, 8(3),
266–281.
Hammersley, M. (Ed.). (2007). Educational research and evidence-based prac-
tice. Sage. (In collaboration with Open University Press).
Hanushek, E. A., & Woessmann, L. (2010). How much do educational outcomes
matter in OECD countries? Economic Policy, 26(67), 429–491.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation
and education policy. IAU Press & Elsevier.
Krejsler, J. B. (2013). ‘What Works’ in Education and social welfare? A mapping
of the evidence discourse and reflections upon consequences for professionals.
Scandinavian Journal of Educational Research, 57(1), 16–32.
Krejsler, J. B., Olsson, U., & Petersson, K. (2014). The transnational grip on
Scandinavian Education reforms: The open method of coordination challenging
national policy-making. Nordic Studies in Education, 34(3), 172–186.
Lascoumes, P., & Le Galès, P. (2007). Introduction: Understanding public policy
through its instruments – From the nature of instruments to the sociology of
public policy instrumentation. Governance, 20(1), 1–21.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-
theory. Oxford University Press.
Lawn, M., & Grek, S. (2012). Europeanizing education: Governing a new policy
space. Symposium Books.
Lemann, N. (1999). The big test: The secret history of the American meritocracy.
Farrar, Straus & Giroux.
Lindblad, S., Pettersson, D., & Popkewitz, T. (Eds.). (2018). Education by the
numbers and the making of society: The expertise of international assessments.
Routledge.
Meyer, H.-D., & Benavot, A. E. (Eds.). (2013). PISA, power, and policy: The
emergence of global educational governance. Symposium Books.
Mons, N. (2015). Les nouvelles politiques éducatives: La France fait-elle les bons
choix? Presses Universitaires de France.
Nichols, S. L., & Berliner, D. (2007). Collateral damage: How high-stakes testing
corrupts America’s schools. Harvard Education Press.
Normand, R. (2009). Expert measurement in the government of lifelong learn-
ing. In E. Mangenot & J. Rowell (Coord.), What Europe constructs: New
sociological perspectives in European studies (pp. 225–242). Manchester Uni-
versity Press.
Normand, R. (2010). Expertise, networks and tools of government: The
fabrication of European policy in education. European Educational Research
Journal, 9(3), 408–423.
Normand, R. (2011). Gouverner la réussite scolaire: une arithmétique politique
des inégalités. Peter Lang/Presses de l’Ecole Normale Supérieure.
Normand, R. (2013). Governing population: The emergence of a political
arithmetic of inequalities in education. A comparison between the United
Kingdom and France. In M. Lawn (Ed.), The rise of data, historical perspectives.
Symposium Books.
Normand, R. (2014). The French pinnacle of PISA. In M. Lawn & R. Nor-
mand (Eds.), Shaping of European education. Interdisciplinary approaches
(pp. 32–49). Routledge.
Normand, R. (2016a). ‘What works?’ The shaping of the European politics of evi-
dence. In R. Normand (Ed.), Towards a New Europeanus Homo Academicus?
The changing epistemic governance of European education (pp. 95–125).
Springer International Publishing.
Normand, R. (2016b). ‘What works?’: From health to education, the shaping of
the European policy of evidence. In K. Trimmer (Ed.), Political pressures on
educational and social research (pp. 39–54). Routledge.
Normand, R. (2020a). The politics of metrics in education: A contribution to the
history of the present. In G. Fan & T. Popkewitz (Eds.), Handbook of educa-
tion policy studies. Springer.
Normand, R. (2020b). France: The French state and its typical “agencies” in edu-
cation. Policy transfer and ownership in the implementation of reforms. In H.
Ärlestig & O. Johansson (Eds.), Educational authorities and the schools (pp. 151–
168). Springer.
Normand, R., & Pacheco, R. (2014). Constructing the lifelong learning self:
European policies and the sense of justice. In M. Milana & J. Holford (Eds.),
Adult education policy and the European union: Theoretical and methodo-
logical issues, ESREA Book Series: Research on the Education and Learning of
Adults. Sense Publishers.
Pettersson, D. (2014). The development of the IEA: The rise of large-scale test-
ing. In A. Nordin & D. Sundberg (Eds.), Transnational policy-flows in Euro-
pean education: Conceptualizing and governing knowledge. Oxford Studies in
Comparative Education/Symposium Books.
Porter, T. M. (1996). Trust in numbers: The pursuit of objectivity in science and
public life. Princeton University Press.
Rinne, R., Kallo, J., & Hokka, S. (2004). Too eager to comply? OECD education
policies and the Finnish response. European Educational Research Journal,
3(2), 454–486.
Sahlberg, P. (2011). PISA in Finland: An education miracle or an obstacle to
change? Center for Educational Policy Studies Journal, 1(3), 119–140.
Sellar, S., & Lingard, B. (2013a). The OECD and global governance in education.
Journal of Education Policy, 28(5), 710–725.
Sellar, S., & Lingard, B. (2013b). Looking east: Shanghai, PISA 2009 and the
reconstitution of reference societies in the global education policy field. Com-
parative Education, 49(4), 464–485.
Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New
global modes of governance in education. British Educational Research Jour-
nal, 40(6), 917–936.
Sellar, S., Thompson, G., & Rutkowski, D. (2017). The global education race:
Taking the measure of PISA and international testing. Brush Education.
Simola, H. (2014). The Finnish education mystery: Historical and sociological
essays on schooling in Finland. Routledge.
Steiner-Khamsi, G., & Waldow, F. (2018). PISA for scandalisation, PISA for pro-
jection: The use of international large-scale assessments in education policy
making – an introduction. Globalisation, Societies and Education, 16(5),
557–565.
Stone, D. (2008). Global public policy, transnational policy communities, and
their networks. Policy Studies Journal, 36(1), 19–38.
Tan, C. (2017). Chinese responses to Shanghai’s performance in PISA. Compara-
tive Education, 53(2), 209–223.
Tuijnman, A. (2003). Measuring lifelong learning for the new economy. Com-
pare: A Journal of Comparative and International Education, 33(4), 471–482.
Xingguo, Z., & Normand, R. (2021). Accountability policies in Chinese basic
education: The long march towards quality and evidence. In S. Grek, C.
Maroy, & A. Verger (Eds.), World yearbook of education 2021: Accountability and
datafication in the governance of education. Routledge.
4 Pisa and curricular reforms in Brazil
The influence of a powerful regulatory instrument
João Luiz Horta Neto

Introduction
In this section, the structure of education in Brazil and the development
of external assessment promoted by the federal authorities will be pre-
sented, as well as the influence exerted by Pisa, an instrument devel-
oped by the Organisation for Economic Co-operation and Development
(OECD), over educational initiatives of the Ministry of Education (MEC).
Brazil, a federal country with a population of 210 million people and
an area of 8.5 million km2, began its constitution as a nation with the
arrival of the Portuguese colonists in 1500. It proclaimed its
independence from Portugal in 1822, initially as a monarchy, and became
a republic in 1889.
Since 1996, Brazilian education has been organized in two levels: higher
education (ISCED 6 to 8) and basic education (ISCED 0 to 3). Basic
education is then divided into three stages of general education: Early
Childhood Education (ISCED 0), Core Education (ISCED 1 and 2) and
Middle Schooling (ISCED 3) (OECD, 2020a). Mandatory education in
Brazil goes from four to 17 years of age. In this age group, 98% of the
population attends school. Basic education comprises 181.9 thousand
schools and 48.5 million enrolments, 82% of which are offered by the
public school networks funded by the state and municipal authorities
(INEP, 2019a).
Brazil is a very unequal country, both socially and regionally, and this
is reflected in the problems faced by education (Weller & Horta Neto,
2021). One of these is the high rate of grade retention, which translates
into a high proportion of individuals older than expected for a particular
school grade. In the first five years of Core Education, this proportion of
people older than the expected school age rises to 11.2% of enrolments
and reaches 24.7% in the final four years. In the four following years, this
proportion is even higher, jumping to 28.2% of enrolments. To worsen the
problem at this stage of education, nearly 12% of young people between
the ages of 15 and 17 are outside the school system (INEP, 2019a).
DOI: 10.4324/9781003255215-5
With the objective of gathering data to propose education policies that
will improve the Brazilian education situation, the federal government
created, and has been improving for the past 30 years, a sturdy, consistent
set of measuring tools to evaluate Brazilian education. These
efforts were launched in the period when the country returned to democ-
racy, in the late 1980s, after more than two decades of military dicta-
torship, continue today and have involved governments with different
ideological profiles. The Basic Education Assessment System (SAEB, in
the Portuguese acronym) was created in 1988 as a pilot project and had
its first application cycle two years later. It has been applied every two
years since 1993.
The first two cycles envisaged using a series of tools, among them
cognitive tests, to generate three groups of indicators which, together,
would enable the assessment of the quality of Brazilian education. Most
of these instruments had already been under development for some years, and
together they formed a logical and harmonious set. At the centre of the
proposal was the notion that education is a complex public endeavour
and that assessing it would require measures that captured some of the
dimensions of the educational process (Horta Neto, 2018).
Some of the results were scattered, others weren’t. One of these was the
“Study of the Student”, which involved applying multiple-choice and open-
answer cognitive tests to a sample of students enrolled in the 2nd, 4th,
6th and 8th grades of basic education, considered to be fundamental in
the students’ learning process. To construct a reference for the prepara-
tion of the items, since there was no national curriculum, research was
conducted with the schools and specialists to obtain information on the
implemented curriculum. The results of the tests were analysed using the
Classical Test Theory, so as to ensure that teachers would understand
their results.
From 1995 onwards, the original model was put aside, and the SAEB
focused only on cognitive tests and on the questionnaires applied to a
sample of students, teachers and school directors. The tested school grades
shifted to the 5th, 9th and 12th grades, the final grades of the basic education
stages. Starting that year, too, results were calculated using a probabilistic
model known as Item Response Theory (IRT), a methodology used by
most of the large-scale tests including PISA, which would be applied for
the first time five years later. Its use enabled the students’ performance in
the different SAEB cycles to be compared but created problems for under-
standing its results. First, because the figure which expresses the measure
bears no direct relation to the number of correct answers in the test,
something which is typical of classroom activities. Second, because some
of the items applied in one SAEB cycle must be reapplied in the follow-
ing cycle to ensure that the results of the different cycles follow the same
scale. Given this, not all items of the test can be disseminated, which
makes it difficult to understand the activities that were presented to the
students, an important element in the education process, since it helps to
contextualize the obtained result.

Figure 4.1 IDEB variation from 2005 to 2017
Source: INEP, 2018a
Note: The end of the line indicates the target to be achieved in 2017.
From 2005, the SAEB became a census-type survey for the students
enrolled in the public school networks and a sample-based one for the students
in private schools. This allowed the creation of the Basic Education Devel-
opment Index (IDEB, in the Portuguese acronym), an indicator which
ranges from zero to ten, nationally recognized as the indicator that meas-
ures the quality of school education. The IDEB is calculated from students’
performance in the SAEB tests and the school flow data collected by the
School Census, and is parameterized in relation to the average of the OECD
countries that participated in PISA. This parameterization, which aims to
compare the Brazilian advances with the international reality (Fernandes,
2017), shows how influential Pisa is in the Brazilian education policies.
Starting with the creation of the IDEB, targets were set for all schools
so that by 2022, the year of the bicentennial of the independence of Brazil,
all schools would reach an index of 6 in the IDEB for the 5th grade, equivalent
to the average performance of OECD countries in Pisa in 2003. The IDEB
variation between 2005 and 2017, the last year available for this indicator,
can be seen in Figure 4.1.
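The composition of the IDEB described above (a normalized SAEB performance score combined with school-flow data) can be sketched in code. This is a hedged simplification: the actual normalization constants and the flow component are defined by INEP, and the function name and figures below are purely illustrative, not the official formula.

```python
# Hypothetical sketch of an IDEB-style indicator: a performance score
# normalized to a 0-10 range, multiplied by an approval (pass) rate.
# The real constants and the flow calculation are defined by INEP;
# the values here are illustrative only.

def ideb_sketch(mean_saeb_score, approval_rate,
                score_min=0.0, score_max=500.0):
    """Combine normalized performance (0-10) with school flow (0-1):
    high test scores alone cannot offset high grade retention."""
    normalized = 10.0 * (mean_saeb_score - score_min) / (score_max - score_min)
    return normalized * approval_rate

# Two hypothetical schools with identical scores but different retention:
print(ideb_sketch(250.0, 0.95))  # -> 4.75
print(ideb_sketch(250.0, 0.70))  # -> 3.5
```

The multiplicative design is the point of the sketch: a school cannot reach a high index through test scores while retaining (failing) a large share of its students.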
The federal government has announced for 2021 a change in the SAEB,
envisaging the application of tests to all 45 million students enrolled
in grades from the 2nd to the 12th, and of questionnaires to assess the
schooling offerings of basic education, also on a census basis. This proposal was
Table 4.1 Comparison of Brazil’s performance in PISA tests and average
performance of the OECD member countries: 2000 to 2018

        Reading          Mathematics      Sciences
Year    Brazil   OECD    Brazil   OECD    Brazil   OECD
2000    396      496     -        -       -        -
2003    403      497     356      499     -        -
2006    393      490     370      497     390      498
2009    412      496     386      495     405      501
2012    407      498     389      496     402      501
2015    407      493     377      490     401      493
2018    413      487     384      489     404      489

Source: INEP, 2019

heavily criticized (Freitas, 2020) and it is not possible to predict whether it
will actually be implemented, due not only to the cost involved but also to
the difficulty of operationalizing this proposal in a country of continental
dimensions and of compiling and analysing the results on a yearly basis.
Just like the federal government, the local authorities have also been
developing their instruments to measure students’ performance, to the
extent that 30% of Brazilian municipalities have their own assessment
systems, in which the main tool is the cognitive test (Bauer et al., 2015).
A survey also finds that 21 of the 27 states possess their own assessment
systems.
Besides national assessment, Brazil also participates in international
surveys, including PISA. For the federal government, this participation
aims to obtain an international benchmark on the standard of Brazilian
education and appropriate education assessment methodologies capable
of assisting the improvement of national assessment. Table 4.1 shows
Brazil’s proficiency average in PISA, in comparison with the proficiency
average of OECD countries throughout the years.
As shown in Table 4.1, the country’s performance is distant from that
of OECD countries. The high rate of grade retention is one explanation
for this, since a significant proportion of 15-year-old students attend school
grades in which the knowledge covered in the test has not been addressed
in sufficient depth.
During the 2012 Pisa cycle, although Brazil still scored below the
OECD average, the Ministry of Education (MEC) highlighted the fact
that the country had improved the most in proficiency in Mathematics
in the period between 2003 and 2012, a fact also emphasized in
the Pisa Report. It also claimed that the country is a “model to
be followed by countries with considerable school lag and that still
face the challenge of including students in the school system” (INEP,
2015, p. 14). In the following cycle, in 2015, Brazil’s performance
fell behind that of 2009, six years before, with no comment from the
federal government.1 This is one of the indications of the political uses
made of Pisa results, highlighting the advances made while silencing
the setbacks.
Brazil has also been participating in studies conducted by UNESCO
within the scope of the Latin American Laboratory for Assessment of
the Quality of Education, LLECE, since 1997 (UNESCO, 2001, 2008,
2016); from 2019, it has signed cooperation agreements to participate in
the PIRLS and ICCS surveys, promoted by the International Association
for the Evaluation of Educational Achievement, IEA, and it is considering
its participation in TIMSS, also promoted by this entity.
In the case of PISA, this assessment has been taking on the role of a
knowledge-based regulation instrument (Carvalho, 2020), as shaper of
education policies, as agent in the transnationalization of these policies,
and as knowledge metre (Teodoro, 2020). In this way, it has influenced
the development of education policies on a worldwide scale. One con-
sequence of this process is the homogenization of education processes
as well as the political discourse regarding the more suitable paths to
improve education with a view to preparing citizens for the challenges of
the 21st century, a future conceived by the OECD. PISA also influences
the national instruments which, through tests of cognitive performance,
seek to regulate the education offer and the key results of the education
process. As briefly stated, this happens in Brazil, the first non-OECD
member country to participate in Pisa when this survey was still in its
formulation stage, in 1998.
In order to better understand one way in which Pisa influences national
education systems, one form of result dissemination will be presented
and discussed, and the flaws of this information, as well as the possible
effects of its dissemination, will be analysed.

Performance indicators in PISA


Each three-year cycle assigns one area of knowledge (known as domain
in the PISA reports) to be the core area of the assessment, with a higher
number of items in the test, while two other areas are subject to more
limited testing. Thus, Reading, having been the core area of the first cycle,
in 2000, became the core area again in the 2009 cycle, since in 2003 the
core area was Mathematics and in 2006 it was Sciences.
One of the essential steps in the development of any measuring tool is
a clear definition of what it aims to measure. In the case of PISA, reading
the reports produced between the 2000 and 2008 cycles has yielded four
interwoven concepts: knowledge, skills, competencies and literacy. Based
on these concepts, each area of knowledge develops its constructs, defini-
tions of what each test is measuring. These are quite broad formulations
which, besides being a reference for Pisa, eventually influence several cur-
ricular proposals all over the world. In the case of Brazil, this will be
covered in the following sections.
To disseminate the results and provide a pedagogical interpretation of
the proficiency measured by the test, that is, of the figures that express
the results, PISA puts forward interpretations of the proficiency scales
that describe the tasks which students were able to carry out in the
applied tests. These descriptions are made for proficiency
intervals, known as levels, within the Pisa scale. Since proficiency is
comparable throughout the different PISA cycles, as time goes by, new
skills are described, broadening the knowledge of the features of the
tasks that students are able to develop on each proficiency level. Thus,
the descriptions disseminated in the 2018 cycle are more extensive, or add
new features, compared with those presented in the previous cycles. In the
same way, the descriptions of the next cycles tend to produce new
information for different items of the scale, broadening the knowledge
on the skills developed by the students. To produce the description of the
scale, a group of specialists examines the items that were part of the test
and analyses the tasks required and the measured proficiency needed to
solve them.
Regarding levels, level 2 is considered basic: the minimum to be expected
in terms of performance in the test, reflecting the skills which, according
to the OECD, should have been developed by 15-year-old students.
Conversely, those at level 1, or below it, fall beneath the expected skills,
while those at level 6 are considered excellent, a tier in which only a
small percentage of students are classified.
The definitions of the constructs for each area of knowledge, together
with the variations they showed throughout the various Pisa cycles, and
the results of the 2018 cycle, the latest to be disseminated, are discussed
as follows:

Reading
The area of Reading has shown little variation as regards the definition
of its construct throughout the cycles. In 2000, the first PISA cycle,
in which Reading was the core theme, the construct was defined thus:

Reading Literacy is understanding, using, and reflecting on written
texts, in order to achieve one’s goals, to develop one’s knowledge and
potential, and to participate in society.
(OECD, 1999, p. 20)
This definition carries with it the vision of a society idealized by the
OECD, in which global citizens would participate: a fluid and imprecise
concept.
This construct suffered a single change in the 2009 cycle, when Read-
ing was again the major domain theme. In this cycle, the definition of
Reading Literacy included a short passage to refer to a measure regarding
student engagement with reading. The new definition, with the inclusion
highlighted in bold, read as follows: “Reading literacy is understanding,
using, reflecting on and engaging with written texts, in order to achieve
one’s goals, to develop one’s knowledge and potential, and to participate
in society” (OECD, 2009, p. 23).
The change reflected the inclusion of the third pillar of the concept
of competence, defined as knowledge, skill and attitude. For PISA,
attitude becomes engagement with Reading, measured through some
items in the student, the teacher and the school questionnaires which
are then analysed together with the items of the cognitive test. The
main goal of inserting engagement with Reading is to stress the contex-
tual factors and their influence on the measured proficiencies, with
a view to complementing the information obtained in the cognitive
test. Thus, “individual Reading Engagement refers to the motivational
attributes and behavioral characteristics of student’s reading” (OECD,
2009, p. 70).
The motivational attributes refer to how reading is fostered by the
teacher, by the classroom activities and by the school. Behavioural charac-
teristics, related to the student, refer to the amount and intensity of their
reading activities.
In the 2018 cycle, when Reading was once again the major domain, the
results emphasized its dependence on reader factors, on textual
factors (formats, complexity of the language used) and on the task to be
carried out (reading for pleasure, for in-depth comprehension or for scan-
ning and locating) (OECD, 2019a). It was also stated that in the selection
of what was used in the test, several types of texts read by students and
the goals of those readings should be ensured, at school and outside it,
so as to naturally represent a range of difficulties involving both the texts
and the tasks to be carried out. As can be understood, the description
presented is quite wide and raises doubts regarding the cultural realities
of each of the countries participating in PISA and the texts produced in
each society, each of which has its own specificities. Choosing texts that
cut across the different cultures is no easy task, and calls into question
the actual possibility of comparing the results of students that experience
such diverse reading practices and cultural patterns, although the sta-
tistical techniques used do not show significant differences between the
measures of different countries.
Table 4.2 Description of the scale for Reading Proficiency in PISA 2018

Level  Lower score limit  Percentage of students at level  Characteristics of the tasks

6      698                OECD: 1.3%; Brazil: 0.2%    They can compare, contrast and integrate
                                                      information representing multiple and
                                                      potentially conflicting perspectives, using
                                                      multiple criteria and generating inferences
                                                      across distant pieces of information to
                                                      determine how the information may be used.

4      553                OECD: 18.9%; Brazil: 7.4%   They interpret the meaning of nuances of
                                                      language in a section of text by taking into
                                                      account the text as a whole. In other
                                                      interpretative tasks, students demonstrate
                                                      understanding and application of ad hoc
                                                      categories. They can compare perspectives and
                                                      draw inferences based on multiple sources.

2      407                OECD: 23.7%; Brazil: 24.5%  They can understand relationships or construe
                                                      meaning within a limited part of the text when
                                                      the information is not prominent, by producing
                                                      basic inferences, and/or when the text(s)
                                                      include some distracting information.

1b     262                OECD: 6.2%; Brazil: 17.7%   They can also interpret the literal meaning of
                                                      texts by making simple connections between
                                                      adjacent pieces of information in the question
                                                      and/or the text.

Source: Adapted by the author from OECD (2019b).
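Operationally, assigning a student to one of the levels in Table 4.2 is a simple binning of the proficiency score by the lower score limits. A minimal sketch, using only the four limits reproduced in the table (the full PISA scale also defines levels 5, 3, 1a, 1c and "below 1c", omitted here as in the table):

```python
# Classify a reading proficiency score using the lower score limits shown
# in Table 4.2. Only the four levels reproduced there are used; the full
# PISA scale also defines levels 5, 3, 1a, 1c and "below 1c".
LOWER_LIMITS = [("6", 698), ("4", 553), ("2", 407), ("1b", 262)]

def reading_level(proficiency):
    """Return the highest listed level whose lower limit the score reaches."""
    for level, lower_limit in LOWER_LIMITS:  # ordered from the highest level down
        if proficiency >= lower_limit:
            return level
    return "below 1b"

print(reading_level(413))  # Brazil's 2018 reading average -> "2"
print(reading_level(487))  # OECD 2018 reading average -> "2"
```

With the intermediate levels omitted, this sketch is deliberately coarse: every score between 407 and 553 falls in the level-2 band here, whereas on the full scale, which includes a level 3 beginning at roughly 480, the two 2018 averages above would not share a level.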

The results of Reading Literacy are presented in Table 4.2. The scale is
divided into nine levels, from 6 down to 2 and then levels “1a”, “1b”, “1c”
and “Below 1c” (OECD, 2019b, pp. 87–88). The space of this chapter would not be
enough to analyse the whole table. For this reason, passages from the
descriptions were selected which present common tasks described at dif-
ferent levels. Thus, just as those presented in Table 4.2 were selected, oth-
ers might have been chosen with no impact on the quality and validity of
the analysis done. For this study the chosen passages were the ones that
referred to the ability to make inferences, one of the tasks required by the
language test. To make the table, some of the levels were also selected so
as to exemplify the extension of this skill, the minimum proficiency to do
each task and the percentage of OECD and Brazilian students classified
at each level in 2018.
To start with, it is necessary to comment on some imprecisions in the way
PISA presents its results. As stated in the introduction to this chapter,
calculating proficiency uses the IRT model, which expresses a probability
that the task described has been carried out, not a certainty that all the
students classified at a particular level did so. Yet the language used in
the description of the scale creates the idea that the students classified
at a particular level managed to perform all the tasks described in it,
which may not correspond to the actual situation. Besides, like any
measurement, an error is associated with it; in other words, the proficiency
value indicated varies within a margin of error. These limitations are not
included in the description of the scale, thereby creating the notion that
what is expressed in these tables of task descriptions is exact information,
when in fact it is not.
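The probabilistic nature of these classifications can be made concrete. The sketch below assumes a two-parameter logistic (2PL) model, one common member of the IRT family; the discrimination and difficulty values are invented for illustration and are not taken from PISA:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: the probability that a student with
    proficiency theta answers an item correctly, given the item's
    discrimination a and difficulty b (all on the same latent scale)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A hypothetical item of difficulty b = 0.5: the model never says a
# student *will* perform the task, only how likely success is.
for theta in (-1.0, 0.5, 2.0):
    print(f"theta = {theta:+.1f} -> P(correct) = {p_correct(theta, a=1.2, b=0.5):.2f}")
# theta = -1.0 -> P(correct) = 0.14
# theta = +0.5 -> P(correct) = 0.50
# theta = +2.0 -> P(correct) = 0.86
```

Even the student well above the item’s difficulty has only an 86% chance of success in this illustration, which is why a level description asserting that students “can” perform a task overstates what the model actually delivers.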
Regarding the tasks involving the ability to make inferences described
in Table 4.2, when the different levels of the scale are compared, only
one of them, level 1b, makes no reference to any task pertaining to that
skill. Consequently, the report implies that the students classified at
that level were not capable of making inferences. But what inferences is
PISA alluding to? Since access is provided to only part of the items, one
of the limitations of the use of IRT discussed in the introduction to this
chapter, it is not possible to know all the tasks the students were
subjected to and thus to grasp why they could not make the inferences
required of them.
In the Brazilian case, Table 4.2 shows that 17.7% of students were
classified at level 1b and, therefore, demonstrated not having developed
the ability to make inferences. This may be true for the texts and the
tasks required by the items on the test; in no way can it be assumed
that these students are incapable of making inferences in other situations.
However, the manner in which the information is presented to society,
including to policymakers and the media, highlights the failure of a
school system that leaves 17.7% of Brazilians unable to make inferences.
If, on the one hand, there is indeed a considerable difference in
performance between the students of OECD countries and those of Brazil,
owing to the distinct realities in which they live, on the other hand it
is pedagogically hard to believe that such a large section of Brazilian
students is incapable of making any type of inference. The issue raised
here is of immense importance, since it concerns the uses made of the
results obtained. If the limits of the measurements are not made clear,
countless conclusions may be drawn and discourses defending so-called
ideal models may spread, casting them as silver bullets capable of
overcoming even social inequalities and ensuring learning improvements,
without ever clearly defining what these would be.
Another piece of data from Table 4.2, for level 4, indicating that
students “are able to compare perspectives and extract inferences
based on multiple sources”, provides very limited information on the
actual dimension of the required task. This is a problem of external
evaluations that use IRT to analyse results. As the figure itself means
nothing, the attempt to describe what students can do based on the
Pisa and curricular reforms in Brazil 79

tasks presented by the items is quite limited and open to multiple
interpretations. Thus, the data in Table 4.2 indicate that the
inference-making skill can be measured through different tasks with
varying degrees of difficulty and complexity, and no more than that. This
being the case, it is very difficult to formulate education policies to
improve students’ reading skills solely on the basis of such information.
On the other hand, the tables with the descriptions of the proficiency
levels give a clear picture of the differences between nations and
economies, both in terms of their global position in the ranking and in
the percentage of students classified at each level. And this is quite
enough to foster competition for the best results in each PISA cycle.
Finally, when we consider all the tasks the students were able to perform
in PISA’s reading tests, not only those indicated in Table 4.2, we observe
that, although the definition of the construct is quite broad and refers
to a model of an ideal society, the students were subjected to very
specific tasks. Nevertheless, the conclusions presented refer to the
broadest sense of the construct. In this way, the impression that the
instrument is sufficiently precise to guide any education system is
reinforced. The same problem arises in the reports on the two other
fields of knowledge.

Mathematics
In 2000, the definition of the construct used by PISA to measure this
domain read as follows:

Mathematical Literacy is an individual’s capacity to identify and
understand the role that mathematics plays in the world, to make
well-founded mathematical judgements and to engage in mathemat-
ics, in ways that meet the needs of that individual’s current and future
life as a constructive, concerned and reflective citizen.
(OECD, 1999, p. 41)

The text states that the traditional knowledge and skills defined at
school are not the focus of PISA, and that the emphasis is on mathematical
knowledge put to use in the different contexts that require reflection and
good judgement. At the same time, it defines the skills of interest to
PISA as mathematical thinking, arguing and modelling, problem-solving and
communication. To prepare the tests, those skills are deployed across
three dimensions: processes, focusing on the students’ ability to analyse,
reason and communicate ideas effectively when presenting, formulating and
solving mathematics problems; content, using knowledge about change and
growth, space and shape, probabilities, quantitative reasoning and
relations of uncertainty and dependence; and context, emphasizing doing
and using mathematics in situations of personal and school life, work
and sports, the local community and society (OECD, 1999).

In 2012, the second time the field of Mathematics was the main
domain of PISA (the first had been in 2003), the document describing the
framework of the test for that cycle remarks that “an understanding
of mathematics is central to a young person’s preparedness for life in
modern society” (OECD, 2013, p. 24), bringing to the table a utilitarian
perspective on education, without a clear definition of what life is being
considered or what modern society that would be. The document also
presents changes to the concept of literacy, seeking to further clarify the
relation between competencies and what would be required by the items,
something that was not clear in the original formulation. Thus:

Mathematical Literacy is an individual’s capacity to formulate,
employ, and interpret mathematics in a variety of contexts. It
includes reasoning mathematically and using mathematical concepts,
procedures, facts and tools to describe, explain and predict phenomena.
It assists individuals to recognize the role that mathematics plays in
the world and to make the well-founded judgments and decisions
needed by constructive, engaged and reflective citizens.
(OECD, 2013, p. 25)

Student engagement in Mathematics is measured indirectly, from the
students’ results on the tasks required by the items, based on the
mathematical processes defined by PISA. These processes, as highlighted
in the definition of Mathematical Literacy, assess the students’ ability
to formulate situations mathematically, use mathematical procedures and
concepts, and interpret and evaluate mathematical results. The document
underlines the importance given to that student engagement when it
states that

it is important for both policy makers and those engaged more closely
in the day-to-day education of students to know how effectively stu-
dents are able to engage in each of these processes.
(OECD, 2013, p. 28)

To enable a better understanding of the new concept, the document
presents a graphic form to explain Mathematical Literacy (OECD, 2013, p. 26).

In the 2021 cycle, when Mathematics is once again the main domain, the
concept of Literacy remains the same, but the graphic form of expressing
its model has become more sophisticated (OECD, 2018, p. 10). It highlights
21st-century skills, within what is known as the “challenges in
real-world contexts”: skills which were defined by a group of experts as
those which “mathematical literacy both relies on and develops”
(OECD, 2018, p. 11).
As has already been discussed when Reading was addressed, the con-
struct of Mathematics, used to guide the preparation of the items and

Table 4.3 Description of the scale for Mathematics Proficiency in PISA 2018

Level 6 (lower score limit: 669; students at level: OECD 2.4%, Brazil 0.1%):
They can reflect on their actions, and can formulate and precisely
communicate their actions and reflections regarding their findings,
interpretations, arguments; they can also explain why these are
appropriate to the original situation.

Level 4 (lower score limit: 545; students at level: OECD 18.5%, Brazil 3.4%):
They can utilize their limited range of skills and can reason with some
insight in straightforward contexts. They can construct and communicate
explanations and arguments based on their interpretations, arguments
and actions.

Level 2 (lower score limit: 420; students at level: OECD 22.2%, Brazil 18.2%):
They can extract relevant information from a single source and make use
of a single representational mode. They are capable of making literal
interpretations of results.

Level 1 (lower score limit: 358; students at level: OECD 14.8%, Brazil 27.1%):
They are able to identify information and carry out routine procedures
according to direct instructions in explicit situations. They can perform
actions that are almost always obvious and follow immediately from the
given stimuli.

Below level 1 (students at level: OECD 9.1%, Brazil 41.0%):
No skills are identified.

Source: Adapted by the author from OECD (2019b).

measure the students’ performance, ends up, because of its
comprehensiveness, providing directions as to how school systems should
organize schooling. It is experts (Carvalho, 2009) who define how schooling
should be developed around the world, without the wider and extended
debates that are vital in any democratic process to define the pathways to
be taken in each society. These directives then become the paradigm to be
followed; not following them may result in poorer results in the
following cycles.
The Mathematics results are presented divided into seven levels, from 6
to 1 plus “below 1” (OECD, 2019b, p. 105). As was done for Reading, some
passages of the description of the Mathematics scale were selected to
create Table 4.3.

The first observation to make when analysing Table 4.3 is the indication
that 41% of Brazilian students were classified at level “below 1” and
that no skills were identified for this level. For an expert in the field
of psychometrics, the latter information would indicate the need to
prepare items capable of measuring skills at that level of proficiency,
that is, a technical problem to be solved. To the general public, however,
and to the media in particular, the information conveyed is that 41% of
Brazilian students have no skills in Mathematics; in other words, that
they did not learn Mathematics and are therefore not prepared to become
21st-century citizens. When the percentages of level 1 and level “below
1” are added, the total comes to 68.1% of Brazilian students classified
below level 2, which the OECD considers the basic level that all students
should minimally achieve. This piece of information is reflected in the
headline published by a renowned Brazilian news site: “Pisa 2018: two
thirds of 15-year-old Brazilians know less than basic Mathematics”
(Moreno, 2019).
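The arithmetic behind such headlines can be reproduced directly from Table 4.3. The sketch below is purely illustrative: it uses only the lower score limits shown in the table (the cut-offs for the omitted levels 5 and 3 are not modelled), and the percentages are those reported for Brazil in 2018:

```python
# Selected Mathematics levels from Table 4.3: (level, lower score limit).
# "Below 1" has no lower limit and catches everything else.
LEVELS = [("6", 669), ("4", 545), ("2", 420), ("1", 358), ("below 1", None)]

def classify(score, levels=LEVELS):
    """Return the first level whose lower score limit the score reaches."""
    for name, lower in levels:
        if lower is None or score >= lower:
            return name

# Percentage of Brazilian students at level 1 and below level 1 (Table 4.3):
brazil_below_basic = 27.1 + 41.0  # students classified below level 2

print(classify(430))                 # a score of 430 falls at level 2
print(f"{brazil_below_basic:.1f}%")  # 68.1%, the figure behind the headline
```

The “two thirds” in the headline is exactly this sum of the level 1 and “below 1” percentages, with level 2 taken as the OECD’s basic threshold.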
Regarding the characteristics of the tasks described in Table 4.3, the
passages highlighted in bold, at levels 4 and 6, refer to interpreting
and arguing skills. What would be the gradation between these levels when
we compare the two highlighted passages, “[based on the interpretations
and arguments they make, they] can explain why these are appropriate to
the original situation” and “[based on the interpretations and arguments
they make, they] can construct and communicate explanations”? In the
text, the two abilities refer to the task of explaining something and, at
first sight, they seem to concern similar skills. Yet these two levels
are 124 points apart on the scale. In the same way, for levels 1 and 2,
the formulations “They can extract relevant information from a single
source and make use of a single representational mode” and “They are able
to identify information and carry out routine procedures according to
direct instructions in explicit situations” also seem to identify very
similar tasks, although the levels are 62 points apart on the scale.
The highlighted passages refer to tasks that students are supposed to be
able to perform, and could in principle signal pathways for the
improvement of curricula and pedagogic practices. Yet this information is
ineffective, since no explanation is provided for the differences between
students’ proficiency levels. Conversely, by pointing to one direction
for the teaching of Mathematics, based on the discussion of the construct
proposed for the test, it indicates paths for improved performance in
PISA. This paradox ultimately reinforces the fight for better positions
in the ranking. It also imposes practices to be followed in the classroom
through the item models used in the test, which are, it must be said,
very well prepared and quite creative. Still, these items are developed
by specialists with several years of practice, who dedicate long hours to
the preparation of each item and receive extensive feedback from other
specialists, who help improve the items until they reach the level of
excellence required by PISA. They are therefore distant from the reality
of most schools and, above all, from the teaching work done in
classrooms. Another paradox.

Science
The field of Science is the one that underwent the most change across
the PISA cycles. In the 2000 cycle, Scientific Literacy was defined as

the capacity to use scientific knowledge, to identify questions and to
draw evidence-based conclusions in order to understand and help
make decisions about the natural world and the changes made to it
through human activity.
(OECD, 1999, p. 60)

The document stresses that the test to measure Scientific Literacy must
be prepared on the basis of scientific procedures, adjusting them to the
tasks required by the test, to the knowledge to be mobilized and to the
contexts in which the tasks are presented. Each of these aspects is
detailed in the OECD text.
In the 2006 cycle, when Science was the major domain, a new definition
of Literacy was presented, incorporating attitudinal aspects drawn from
the students’ answers to questions of scientific and technological
relevance. Thus, Scientific Literacy refers to an individual’s:

• scientific knowledge and use of that knowledge to identify questions,
acquire new knowledge, explain scientific phenomena and draw
evidence-based conclusions about science-related issues;
• understanding of the characteristic features of science as a form of
human knowledge and enquiry;
• awareness of how science and technology shape our material,
intellectual and cultural environments;
• willingness to engage in science-related issues and with the ideas of
science, as a reflective citizen. (OECD, 2006, p. 23)

With the new definition, the test to measure Scientific Literacy in 2006
consisted of several item units, each pertaining to a particular context.
The units contained items measuring cognitive aspects, as well as a final
item, at the end of the unit, measuring the students’ engagement with
science within the presented context. Examples of these item units can be
found in Annex A of the document (OECD, 2006). Besides the questions
included in the cognitive test, further questions were included in the
questionnaires to measure students’ engagement with Science.

In 2015, when Science was once again PISA’s major domain, the previ-
ous formulation was improved upon.

Scientific literacy is the ability to engage with science-related issues,
and with the ideas of science, as a reflective citizen.
A scientifically literate person is willing to engage in reasoned dis-
course about science and technology, which requires the competen-
cies to:

• explain phenomena scientifically – recognize, offer and evaluate
explanations for a range of natural and technological
phenomena;
• evaluate and design scientific enquiry – describe and appraise
scientific investigations and propose ways of addressing ques-
tions scientifically;
• interpret data and evidence scientifically – analyse and evalu-
ate data, claims and arguments in a variety of representations
and draw appropriate scientific conclusions.

There is no space here for a more in-depth analysis; suffice it to say
that the new formulation comes closer to the one used for the Mathematics
area, especially regarding the specification of the way of reasoning:
scientific in this context, mathematical in the previous one. Another
change concerned the questions presented at the end of the item units,
which were discarded; the items built to measure engagement with Science
are now part of the questionnaires. According to the text, there were two
problems with the previous formulation: the questions reduced the space
available for cognitive items, and a mismatch was observed between the
answers given in the cognitive test and those given in the
questionnaires.
A proposal is already being discussed within PISA for the 2024 cycle to
introduce new changes to the formulation of Scientific Literacy. The
proposal suggests three new areas of knowledge: informatics, competencies
in using scientific knowledge for action and decision-making, and the use
of probabilistic reasoning. Another dimension, entitled scientific
identity, would also be added (OECD, 2020b).
The Science findings are presented in eight levels, from 6 to 2 plus
levels “1a” and “1b” (OECD, 2019b, p. 113). To discuss the 2018 results,
Table 4.4 presents passages of the text describing the features of the
tasks students managed to perform at the proficiency levels reported in
2018, adding level “below 1b” since that information is discussed in the
text. The selected passages refer to the ability to interpret data.

Table 4.4 Description of the scale for Science proficiency in PISA 2018

Level 6 (lower score limit: 708; students at level: OECD 0.8%, Brazil 0.0%):
In interpreting data and evidence, they are able to discriminate between
relevant and irrelevant information and can draw on knowledge external
to the normal school curriculum.

Level 4 (lower score limit: 559; students at level: OECD 18.1%, Brazil 4.6%):
They can interpret data drawn from a moderately complex data set or less
familiar context, draw appropriate conclusions that go beyond the data
and provide justifications for their choices.

Level 2 (lower score limit: 410; students at level: OECD 25.8%, Brazil 25.3%):
Students are able to draw on everyday content knowledge and basic
procedural knowledge to identify an appropriate scientific explanation,
interpret data and identify the question being addressed in a simple
experimental design.

Level 1a (lower score limit: 335; students at level: OECD 16.0%, Brazil 31.4%):
With support, they can undertake structured scientific enquiries with no
more than two variables. They are able to identify simple causal or
correlational relationships and interpret graphical and visual data that
require a low level of cognitive demand.

Below level 1b (students at level: OECD 0.7%, Brazil 4.0%):
The OECD does not specify the skills developed.

Source: Adapted by the author from OECD (2019b).

Just as was done for the two previous areas, the goal of the analysis is
to point out the weaknesses of the information provided, which should in
principle guide policymakers and indicate paths for teaching. As
commented earlier, the information provided is limited and hinders a
clear understanding of the characteristics of the tasks that students
are capable of performing.
At level 6, it is stated that students can draw on knowledge from outside
the school curriculum. However, as PISA does not reference the school
curricula of the participating countries and economies, and no
comparative study of them was made, it is not possible to identify what
this knowledge would be, since it could pertain to a myriad of possible
topics. Even if the task was performed by merely 0.8% of OECD students
and by none from Brazil, the question must be asked: what is the
practical meaning of this information?

Regarding the next three levels presented in Table 4.4, although a
gradation can be observed in the highlighted passages with regard to the
tasks that students can probably perform (data drawn from a moderately
complex data set, at level 4; basic procedural knowledge, at level 2; a
low level of cognitive demand, at level 1a), it is not possible to know
what type of task the descriptions refer to. The possibilities are
countless, all the more so when they involve scientific knowledge used to
explain phenomena. Another example is the description for level “1a”,
indicating that “With support, [students] can undertake structured
scientific enquiries with no more than two variables”. Again, this is
rather vague information, with little pedagogic usefulness. Despite this
imprecision, the information that more than 31% of Brazilian students are
classified at this level ends up conditioning the debate on the quality
of Brazilian education.
The analyses of the information provided by the three areas tested by
PISA, regarding the characteristics of the tasks that students showed
themselves able to perform, indicate its weakness: vague and imprecise
statements that hinder any governmental action on education systems other
than seeking strategies to improve placement in the ranking. On the other
hand, the complex models adopted by each area to define its constructs
come close to prescriptions meant to ensure the development of the
spurious skills of the 21st-century individual, globally influencing the
fate of education.
This influence can be felt in Brazil. The following sections discuss the
process of debating the BNCC, a document of curricular policy (Cassio,
2019) that comes close to a national curriculum. Also discussed are the
improvements made to the SAEB framework in order to adjust it to the
BNCC, which introduced some of the elements used by PISA.

The dispute to define a national curriculum for basic education in
Brazil and the references to the PISA model
The dispute over whether to define a national curriculum for Brazil in
the period after the 1988 Constitution is a long one and will be
addressed here only briefly. The focus of the analysis is the preparation
of the National Common Curricular Base (BNCC) in 2018.
The National Constitution determined that minimum contents be defined so
as to ensure common basic schooling and respect for national and regional
cultural and artistic values. After the Constitution was ratified, a
dispute began between several groups over whether the term “minimum
contents” implied the definition of a national curriculum, since the
Constitution did not explicitly mention one. This dispute was carried
into the debate on the Law on Education Directives and Bases (LDB),
approved in 1996, which regulated the education issues defined in the
Constitution. One of its articles provided that the federal government,
in collaboration with the States, the Federal District and the
Municipalities, should establish competencies and directives for basic
education, which would guide the curricula and their minimum contents, so
as to ensure a common basic schooling. Therefore, neither the
Constitution nor the LDB makes direct reference to a national curriculum,
but rather to “curricula”, given the power of the states and
municipalities to offer basic education concurrently, and to “minimum
contents”, none of which are of a national nature. To ensure national
unity, the LDB stipulates that curricula and contents be guided by common
directives discussed by the three entities of the federation for
Portuguese Language, Mathematics, and knowledge of the physical and
natural world and of social and political reality. Over time, the LDB
received amendments that added Art, Physical Education, Religious
Education and a modern Foreign Language as new areas for which guidelines
should be defined.
During the debates to approve the new LDB, the federal government,
headed by Fernando Henrique Cardoso (1995–1998 and 1999–2002),
started preparing the National Curricular Parameters (PCN). This
preparation was restricted to the core teams of the education
secretariats and consulting university professors, with no participation
in the discussion by school managers and teachers. At the end of 1995,
MEC forwarded the text for consultation and assessment. The final
proposal was published in 1997, one year after the approval of the LDB,
and submitted a series of principles responding to the “need for
benchmarks based on which the education system of the country could be
organized” (MEC, 1997, p. 13). Immediately after its publication, the
federal government developed a massive nationwide programme of
dissemination and training for its implementation, with enough
capillarity to reach even the schools. Moreover, it carried out a
comprehensive programme to purchase textbooks based on the PCNs for the
schools, effectively transforming them into the curriculum practised by
the schools.
In 1999, an evaluation of the actions taken by Fernando Henrique’s
government regarding education policies in force, carried out by Maria
Helena Castro, one of the top MEC officials at the time, highlighted that:

The reforms launched by MEC, in accordance with the new LDB,
introduce changes in the proposed curricula in order to reduce the
emphasis on contents that are unnecessary for the general schooling
in basic education and to foster a pedagogic approach more ori-
ented to problem solving and to the development of general abilities
and skills.
(Castro, 1999, p. 34)

Therefore, although no curriculum was prescribed, an attempt was made to
organize the different curricula within each directorate, since the
results of SAEB “confirm the limited effectiveness of the proposed or
indicated curriculum, showing that it is not being learned in a
satisfactory manner” (Castro, 1999, p. 30). The same official defends the
establishment of education standards, arguing that they could impact
students’ learning, and criticizes those who censured them on the grounds
that they could narrow the school curriculum. Among the benefits of
standards, she highlights the fact that they provide references for
“curricular development, didactical books and pedagogic materials and
materials related to teaching methods . . . and the demand for the
accountability of the various education agents” (Castro, 1999, p. 36).
Nearly 20 years later, these were the notions that shaped the BNCC, as
will be discussed next.
Following the election of President Luiz Inácio Lula da Silva (2003–2006
and 2007–2010), MEC put aside the promotion of the PCN. The Secretariat
for Basic Education (SEB), the ministerial body in charge of the debate
on the National Base, fostered a national discussion involving education
researchers and specialists on curricula, conducted broadly and not
restricted to a national curriculum. These discussions grew and led to
the organization of the seminar National Curriculum under Discussion in
2006. Its 1,500 participants, from municipal and state education
secretariats and national bodies, debated notions of curriculum and its
construction on the basis of five texts prepared by MEC, entitled
Enquiries into the Curriculum (MEC, 2006a, 2006b, 2006c, 2006d, 2006e).
From 2008 to 2009, 750,000 copies of these texts were sent to basic
education schools and education secretariats for discussion. A National
Meeting to discuss a curriculum proposal for basic education was planned
for 2009.
In 2010, MEC forwarded Draft Law 8,035/2010 to Congress, addressing the
National Education Plan (PNE) for the 2011–2020 ten-year period. The
Plan, a law setting educational targets and strategies for achieving
them, is a constitutional obligation. Among the Plan’s targets were IDEB
values the country was supposed to reach by 2020, with 25 strategies
defined to make this possible. Among the strategies, it laid down the aim
“to establish pedagogic directives for basic education and common
national curricular parameters, while respecting regional, state and
local diversity”.
In the following years, during Dilma Rousseff’s terms (2011–2014 and
2015–2016), the debates within MEC continued, involving professionals
working in schools and in education management bodies across Brazil. In
2011, a taskforce, Learning and Development Expectations, was created
with the aim of submitting a version of the National Common Base in 2012.
During the group’s discussions, the fear was raised that the term
“expectations” might produce a Base whose formulations favoured its use
to measure students’ performance in external assessments. For this
reason, the term was replaced by “Rights”, so as to reinforce the idea of
education as a right and, as such, accessible to all and removed from any
measure of performance.
In 2013, MEC submitted to the National Education Council (CNE),2
a proposal of Rights to Learning for students in the first three years
of basic education and pledged to complete the remaining Rights in the
first semester of the following year. Once the rights were defined, the
National Common Curricular Base would be completed and delivered to
society for new debates and further improvement. The proposal was to be
presented in 2014 but was not completed, owing to changes in the top
positions at MEC. Even so, the group involved in its construction
presented the preliminary text that was then under discussion
(Bonini, 2018).
Also in 2013, the Movement for the Base3 began to emerge, a still-active
group which gathers several leaders funded by corporations with the goal
of influencing the construction of a national curriculum, taking
international experiences, especially the Common Core State Standards
Initiative in the USA, as its basis. The great majority of the people and
organizations grouped around the Movement have exerted strong influence
on the discussions of education policies and on the paths taken by MEC
since the 2000s. Some of these people would come to take on leading
positions in the structure of the Ministry and would prove fundamental to
the creation of the BNCC (Avelar & Ball, 2019).
In 2014, four years after MEC’s project had been sent, the PNE was
passed by the National Congress. As in MEC’s original proposal, the Plan
uses IDEB to define targets, but it increases the number of strategies
for achieving them. One of the new targets establishes the implementation
of the learning rights and goals that would configure the common national
curricular base by 2016.
In mid-2015, during Dilma Rousseff's second term, the discussions
around the president's impeachment escalated, forcing a rearrangement
of forces in the government, which led to new changes in the leadership
of MEC. In the wake of this, the command of SEB was given to the
founder of a social organization that markets and applies standardized
tests for several education secretariats all over Brazil. As a result of these
changes, a new commission, composed of 116 professionals, was created
to prepare a National Curricular Base. The mandate was to prepare a
proposal along the lines of a national curriculum, something different
from what had gradually been built in the previous years. The document,
completed in September, set as its goal to signal learning and development
pathways for students throughout basic education, capable of ensuring
that they develop the 12 general skills, along with others specific to
each area and education stage.
90 João Luiz Horta Neto

Skills were defined for each grade, under the heading of learning
objectives, each with its own identification code. This type of formulation
is very similar to the definitions used in the frameworks of the
large-scale tests applied in Brazil.
This early official version of the BNCC was placed in public consultation
on the Internet between October 2015 and March 2016. According to
MEC, there were over 12 million contributions to the text; analysing the
data, however, researchers claim that no more than 150,000 people
contributed, some of whom made several quite piecemeal submissions
(Cassio, 2019). In April, MEC concluded the second version of the
document. This version was forwarded to CNE, which would hold
conferences in the states, with the entities aggregating state and municipal
education secretaries, to debate the document and collect suggestions
(Cassio, 2019).
A month later, in May, President Rousseff was removed from the
presidency to face the impeachment process, and Vice President
Michel Temer took over. With his ascent to power, the MEC structure
was changed again, and so was the composition of CNE, with the
repeal of the decree, signed by the ousted President, nominating 12 of
its 24 members. A body with crucial technical functions would become
the stage of an intense political dispute over the BNCC's approval.
The new Minister, Rossieli Soares da Silva, in one of his first initiatives,
created the Managing Committee of the Common National Curricular
Base and the Reform of Middle Education, in which only MEC managers
participated, presided over by Maria Helena Castro, Executive Secretary
of MEC, the same person who in 1999 had advocated education
grounded in competencies and skills and in the definition of education
standards. In September 2016, Consed and Undime, bodies representing
the state and municipal education leaders, sent the Minister of Education
and the Managing Committee a report containing criticism and
recommendations for a revision of the BNCC. According to Castro (2020),
the document stated the leaders' preference for a competencies- and
skills-based BNCC, and on the grounds of this preference the Managing
Committee started drafting a new version of the BNCC. According to the
author, competency “is a way of mediating the right to learn and know
how, so that they can be followed by the teacher, by the school, by the
family and by the system” (p. 106).
Pisa and curricular reforms in Brazil 91

When the third version arrived at the CNE, in April 2017, still incomplete,
since it covered only two of the three stages of basic education, one of
the counsellors claimed that

it was immediately noticed that the project of the new government
had chosen to use as the core structure of the Base the concept of
competency, not that of rights to learning, as laid down by the PNE
Law. As justification, the text argued that “LDB implicitly adopts
a competency-based focus”. This statement was grounded on an
interpretation, hardly consensual, of Article 9 of LDB, in which some
referred to it as pedagogic (government) while others viewed it as
administrative (opposition).
(Soares, 2019, p. 73)

The text of LDB the counsellor was referring to is paragraph IV of
Article 9, which determines that

the Union must define, in collaboration with the States, the Federal
District and the Municipalities, competencies and guidelines for
childhood education, basic education and middle education, which
will guide curricula and their minimum contents, so as to ensure
common basic schooling.
(Law 9.394/96)

In this passage, LDB clearly uses “competencies” to refer to the
attributions of the bodies of the federation, which must act collaboratively
on the provision of school education, not to the “competencies” discussed
in the education field. This is a true paradox, but one the counsellor
treated as a minor issue. In fact, an intense dispute flared within the
CNE, which sought to go beyond its role of interpreting LDB and to
introduce definitions that contradicted the spirit of the Law.
This is what two CNE counsellors stated, as they highlighted that the
differences were very profound. They involved

disputes on two perspectives: one that places quality education for
all within the context of a country with extreme social inequality and
which defends dramatic social and economic changes for the sake of
a fair society; and another which prioritizes training for work on a
market rationale, favoring managerialism, the establishment of
competencies and the culture of performativity.
(Aguiar & Tuttman, 2020b, p. 72)

Soares, who apparently handled the term “competency” as a minor issue
in the debates, confirms the inspiration that guided the new version of the
BNCC by claiming:

Within the scope of the BNCC, competency is defined as the
mobilization of knowledge (concepts and procedures), abilities
(practical, cognitive and socioemotional), attitudes and values, to
solve complex demands of daily life, of the full exercise of citizenship
and the world of work.

This definition, essentially the same used by the OECD and by
UNESCO in recent documents, enshrines concepts which would
allow the issue to be overcome. On the one hand, it emphasizes the
idea that the knowledge mobilized by the competency is essential
to the concept. It is not possible, therefore, to speak of competency
without the necessary emphasis on its components. The DeSeCo
Report of the OECD, written for the preparatory studies for PISA,
is adamant when it claims: neither the cognitive components nor the
motivational aspects in isolation constitute a competency.
(Soares, 2019, p. 29)

In this passage, the importance of the OECD is justified by presenting it
as an organization that shapes knowledge and disseminates it throughout
the world. No reference is made to the various aspects of the debate on
competencies; instead, a single reference of high international visibility is
used to lend an aura of science and credibility to a proposal that defends
a political world view. Rationales were sought to support the BNCC as
the document that would define the final behaviours expected of students,
in opposition to those who defended it as a starting point for the
construction of basic knowledge (Aguiar & Tuttman, 2020a, 2020b).
Regarding external evaluation, another CNE counsellor, connected to
a social organization involved in external evaluations, pointed to the
possibility of the BNCC expanding SAEB's coverage even further.
According to her:

[The evaluation conducted by the SAEB] may incorporate other
series [school grades] and curricular areas. As mentioned before, it
focused on the ends of the cycles for lack of a clearly defined BNCC
regarding the learning rights and goals, series by series, which would
also include the other curricular areas such as Natural Sciences and
Human Sciences, and not only Portuguese and Mathematics.
(Fontanive, 2019, p. 110)

It can be understood from this passage that the discussions on the BNCC
also mobilized the energies of those who favoured the expansion of exter-
nal evaluations.
The discussion on the BNCC was influenced both by the troubled
national context and by the debate between two opposing models:
one brought by the forces which since 1995 had been expanding their
influence on MEC, and another which defended the proposals built by
MEC over the course of more than 10 years. This is reflected in the
Resolution which approved the BNCC (MEC/CNE, 2017). Its text states
that the phrase “competencies and skills” must be considered equivalent
to the phrase “learning rights and goals” present in the Law of the PNE.
This is clearly another conceptual absurdity perpetrated within the scope
of CNE, given the background presented in this text of how the PNE built
a consensus on “learning rights and goals”. This interpretation sought to
legitimize a process that was being imposed, one different from what had
been discussed in the National Congress during the four years of debate
on the PNE. Since there was no time to change the PNE Law, the CNE
itself, within the scope of its legal powers, proposed an interpretation to
adjust what the legislation provided and, in this way, break the deadlock.
This short history of the construction of the BNCC is important to
contextualize the great dispute that took place until it was approved.
Whereas the debate which gave rise to the learning rights and goals took
eight years to consolidate (2006 to 2014), the BNCC took only two
years (2015 to 2017) to be redrafted and passed. It is important to
highlight that this third version of the BNCC, completely different from
the preceding ones, was drawn up in merely 11 months, between May
2016, when the Vice-President assumed the presidency on an interim
basis, and April 2017, when the BNCC was submitted to CNE. This
expediency was possible thanks to the support and resources obtained
from the Movement for the Base, where the foundations for the BNCC
had been laid even before President Rousseff's impeachment. This is
one of the reasons that enable us to characterize the BNCC as the
conclusion of a project begun in the 1990s.
The fourth and last version of the BNCC, the one concerning Middle
Education, was submitted to and approved by CNE in 2018. New clashes
played out, once again pitting the two dominant visions against each
other (Cassio, 2019; Soares, 2019; Tuttman & Aguiar, 2019).
Rounding off the changes in curricular policies, the National Curricular
Guidelines for Basic Education Teacher Training (DCN) (MEC/CNE,
2019b), as well as SAEB's framework (INEP, 2019b), were restructured
in 2019.

The organization of the BNCC and its articulation with external evaluation
The fourth and last version of the BNCC, a 600-page document, claims
to be a reference for the formulation of curricula by the federated entities
and for the pedagogic proposals of schools (MEC, 2019). Although the
preamble of the document addresses such topics as knowledge integration
and the relations between the students' socioemotional4 and sociocultural
dimensions, these topics are treated only superficially throughout the
text. One would expect a curricular document to present a solid
theoretical base, capable of supporting the propositions presented.
Nevertheless, the focus of the document falls on the definition of learning
goals connected to syllabuses, without stipulating even minimum
articulations (including between the different fields of knowledge) among
the specific, the pedagogic and the historic dimensions.
The most substantial part of the document is a long list of competencies
and skills to be developed by students in each of the school grades. In
total, the BNCC lists 1,183 skills for Fundamental Education and 221
for Middle Education. Each of these totals is distributed in varying
ways by curricular component and school grade. Considering the
number of skills listed in the BNCC, there is little room left for students
to work on other types of knowledge locally. Consequently, although
it advocates flexibility and the development of critical and creative
thinking, the BNCC may contribute to compartmentalized and
standardized education. Besides, given the strong influence of external
evaluations on the education process, it may further deepen
accountability in education in Brazil.
The whole theoretical foundation of the BNCC is summarized in a
single page. In it, competency is defined as the mobilization of
“knowledge (concepts and procedures), skills (practical, cognitive and
socioemotional), attitudes and values to solve complex demands of daily
life” (MEC, 2019, p. 8), without any reference or discussion added to
justify these options. And despite presenting this conceptualization,
throughout the text competency is addressed as if it were merely a set
of skills.
The text presents ten general competencies, to be developed throughout
the whole of basic education, which merge, without further explanation,
with the learning and development rights defined in the PNE. Each of
these ten competencies is broken down into several other competencies,
specific to each field of knowledge, which are then broken down into
skills. Skills “express essential learning which must be assured the
students in the different school contexts” (MEC, 2019, p. 29). Each skill
is expressed in a statement that includes three components: a concept
connected to the field of knowledge, or a procedure students are
expected to master, known as knowledge objects; ranked cognitive
operations, represented by action verbs indicating the tasks to be
performed; and settings or contexts, which can be general for some
areas or limited for others. The document also states that the chosen
focus is the same one adopted by international assessments, not making
direct reference to PISA but clearly using it as a model. In this way,
orientations, the basis of any curriculum, end up being confused with
measuring, the goal of standardized evaluations.
Despite referring to learning and development rights, the BNCC only
makes them explicit when discussing Childhood Education. On page
38, the text indicates six rights of children (to socialize, play, participate,
explore, express themselves and know themselves) and elaborates on
each one of them. This way of stating rights comes quite close to the
formulation that preceded the first version of the BNCC. For the other
stages of basic education, no reference is made to any right, except for
vague mentions of the right to learn.
The propositions for Fundamental Education are divided into five
areas of knowledge: Languages, covering the curricular components
Portuguese, Art, English and Physical Education; Mathematics; Natural
Sciences; Human Sciences, including History and Geography; and
Religious Education. In the particular case of the area of Languages,
placing side by side such different curricular components as Portuguese,
Art and Physical Education creates the false idea that these components
bear some proximity. This is artificial, since the types of knowledge each
of them structures are quite distinct.
The text of the BNCC mentions conceptions of Literacy, as does PISA,
but uses the concept in a superficial manner. In the area of Languages,
for instance, it refers to “Literacies”, “Literacies of the letter and of
the printed word” and “new essentially digital literacies” (MEC, 2019,
p. 69) without defining them.
Regarding Mathematics, it refers to Mathematical Literacy as being

competencies and skills of reasoning, representing, communicating
and arguing mathematically, so as to foster the establishment of
conjectures, problem formulating and problem solving in a range of
contexts, using mathematical concepts, procedures, facts and tools.
(MEC, 2019, p. 266)

Despite indicating the concept used by PISA as the basis for this
definition, the text does not discuss how the BNCC's formulation was
developed, nor its proximity to PISA, nor its fit with the reality of
Brazilian education.
The notion of Literacy used in the area of Sciences is not very robust
either. Despite defining it as “the ability to understand and interpret the
world (natural, social and technological), but also to transform it based
on the theoretical and procedural input from the sciences” (MEC, 2019,
p. 321), it does not discuss the implications of this formulation nor does
it indicate the path taken to arrive at this conception. As was the case
with Mathematics, the text refers to the skills that must be developed on
the basis of the competencies defined for this area, but does not clarify
the connection between the Literacy announced at the beginning of the
text and these two other concepts.
Each area of knowledge and each curricular component defines a set of
specific competencies. Each curricular component, based on these
specific competencies, lists a set of skills to be developed in each school
grade. Comparing the skills across areas, we observe that a learning
progression is sought from three elements: the cognitive processes
involved, stated using verbs that indicate increasingly active or demanding
processes; the mobilized knowledge objects, which also grow in
sophistication or complexity; and the contexts they relate to (MEC, 2019,
p. 31).
Each of these skills is given a code which encodes the stage of basic
education, the school grade, the curricular component and the order of
the skill within the BNCC. This type of construction is very useful for
the preparation of items for the cognitive tests used in external
assessments, coming close, for example, to the measuring tools generally
used by SAEB. Thus, at the same time that it defines the knowledge to be
taught, the level of complexity and the context, it provides a precise
direction for how each skill should be assessed. This is one more reason
to reassert that the BNCC can only be categorized as a national
curriculum, one that uses external assessments as one of the means of
implementing it.
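To make the coding scheme concrete, the sketch below parses a BNCC-style skill code such as “EF05MA01”. The four-part layout (stage, grade, curricular component, sequence number) follows the pattern of published BNCC codes, but the function and pattern names are this author's illustration, and the sketch deliberately ignores variants such as multi-year or Middle Education codes:

```python
# Illustrative parser for a BNCC-style skill code such as "EF05MA01".
# The four-part format is an assumption based on published BNCC examples;
# multi-year and Middle Education variants are not covered by this sketch.
import re

CODE_PATTERN = re.compile(
    r"^(?P<stage>EI|EF|EM)"      # stage of basic education
    r"(?P<grade>\d{2})"          # school grade
    r"(?P<component>[A-Z]{2})"   # curricular component, e.g. MA, LP
    r"(?P<number>\d{2})$"        # order of the skill within the component
)

def parse_skill_code(code: str) -> dict:
    """Split a skill code into its four components."""
    match = CODE_PATTERN.match(code)
    if match is None:
        raise ValueError(f"unrecognized skill code: {code!r}")
    return match.groupdict()

print(parse_skill_code("EF05MA01"))
# {'stage': 'EF', 'grade': '05', 'component': 'MA', 'number': '01'}
```

The point of the illustration is that such a code pins a skill to one grade, one component and one position, which is exactly the granularity a test item bank requires.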
The text on Middle Education follows the same pattern as the text on
Fundamental Education. It also refers to essential learning, without
stating the criteria used to select it, without making it explicit, and
without indicating the competencies and skills to be developed.
Regarding the areas of knowledge, it defines four: Languages and their
technologies; Mathematics and its technologies; Natural Sciences and
their technologies; Applied Human and Social Sciences.
The use of the phrase “and their/its technologies” alongside each field of
knowledge emerged during discussions in MEC on implementing a
reform of Middle Education, at the time when the passing of LDB was
under discussion, in 1996. The aim was to bring that stage of education
closer to the world of work, since it was considered that Middle
Education was structured in a way that favoured basic contents, very
distant from their applications to daily life. In 1998 this debate was
taken to CNE, which, in tune with the then heads of MEC, produced an
expert opinion advocating that technology be seen as a process that is
part of young people's education, connecting acquired knowledge to its
applications in the productive sector and preparing youths for the
labour market. Consequently, it grouped the different curricular
components into four areas and added the phrase “and their/its
technologies” to all of them. In 2017, the area of Human Sciences and
their technologies became Applied Human and Social Sciences.
As the BNCC uses the term “essential learning” but does not define what
it means, CNE defined it as “those [types of learning] which develop
competencies and skills . . . to solve complex demands of daily life, of the
practice of citizenship and of performance in the world of work”
(MEC/CNE, 2018). As can be seen, this is a generic definition which still
does not clarify the term's meaning, making it difficult to place such
learning at the service of the curriculum.
Regarding the changes to SAEB's framework: as soon as the third
version of the BNCC, covering Fundamental Education, was approved
by CNE towards the end of 2017, INEP's team began discussions to
adapt SAEB's cognitive tests to the BNCC. Several initiatives were
carried out involving different groups of experts, including some of
those responsible for writing the BNCC (INEP, 2019b). The end result
was a new design for the frameworks of Languages, Mathematics,
Human Sciences and Natural Sciences for the 5th and 9th grades of
Fundamental Education, besides Mathematics and Portuguese for the
2nd grade.
In this new framework design, each skill is presented as the crossing of
two kinds of axes: knowledge axes (covering the types of knowledge
defined in the BNCC that can be measured using large-scale tests) and
cognitive axes (indicating the cognitive processes deemed essential in the
BNCC). This presentation is the same for all areas, with variations in the
number of cognitive and knowledge axes to address the specificities of
each area. To guide the preparation of the tests, the percentage
distributions of items per knowledge axis and per cognitive axis are also
presented for each knowledge area.
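A crossing of axes of this kind behaves like a classic test blueprint. The sketch below is purely illustrative: the axis names and percentages are invented, not taken from the SAEB framework; it only shows how percentage targets per cell translate into item counts for one test form:

```python
# Illustrative test blueprint crossing knowledge axes with cognitive axes.
# Axis names and percentages are invented for illustration, not SAEB's.
knowledge_axes = ["Numbers", "Geometry", "Quantities"]
cognitive_axes = ["Recognize", "Apply", "Analyse"]

# Target percentage of items per (knowledge axis, cognitive axis) cell.
blueprint = {
    ("Numbers", "Recognize"): 15, ("Numbers", "Apply"): 15, ("Numbers", "Analyse"): 10,
    ("Geometry", "Recognize"): 10, ("Geometry", "Apply"): 10, ("Geometry", "Analyse"): 10,
    ("Quantities", "Recognize"): 10, ("Quantities", "Apply"): 10, ("Quantities", "Analyse"): 10,
}
assert sum(blueprint.values()) == 100  # a blueprint must cover the whole test

def items_per_cell(total_items: int) -> dict:
    """Translate percentage targets into item counts for one test form."""
    return {cell: round(total_items * pct / 100) for cell, pct in blueprint.items()}

counts = items_per_cell(40)
print(counts[("Numbers", "Recognize")])  # 6
```

The design choice the sketch makes visible is that, once such a matrix exists, writing a test form becomes a matter of filling quotas cell by cell.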

Final remarks
PISA has become an instrument with a great deal of credibility, to the
extent that it influences education policies all over the world. Its cognitive
tests and questionnaires are well prepared, technically quite solid, and
have been receiving input from leading world experts for their continuous
improvement, a clear sign of the concern to add innovations and perfect
the measurements carried out. Many of the improvements are debated
and announced several years in advance, in a clear demonstration of care
to allow countries and economies to prepare their education systems for
the following PISA cycles. Besides the necessary transparency that such
a powerful tool must have, there is an evident concern to avoid sudden
changes in the relative positions of countries and economies, mainly due
to the importance that ranking has in the strategies used to disseminate
its results.
Its constructs, the definitions of what will be measured, are presented on
a solid theoretical basis and end up indicating what should be taught and
how curricula should be structured. PISA also indicates to countries the
pathways to be trodden to improve their results in forthcoming cycles.
The discussion on education policies is the essence of PISA, the element
that mobilizes politicians and the media and that exerts influence on
societies. Having a good position in the ranking means being on the
right track to educating the citizen for the 21st century, as if there were
a single model for this, or as if the conditions and starting points of all
schools and their students were similar. This whole rationale gains an
appearance of solid scientific grounding, of irrefutable results, which are
then complemented by sophisticated statistical studies based not only on
the questionnaires applied to students, teachers and school directors but
also on other databases.
However, when we examine the information that seeks to explain the
results of the cognitive tests, the frailties of many of these explanations
become apparent. In this chapter it was pointed out that the descriptions
of the scales, when analysed closely, convey little information about
what students were actually able to do in the test, and sometimes convey
it incorrectly. This is surprising, since the use of cognitive tests as tools
to monitor students' performance was introduced as the great
advancement of comparative education studies, able to inform on what
and how much students were learning at school. The criticism made
throughout this text bears no relation to possible methodological
problems of the instrument; it concerns, rather, the rationale used to
disseminate its results. If using IRT has the advantage of allowing
reliable comparisons (in some respects, not all) over time, it does so
through a concept that is incomprehensible to the education world:
proficiency, reflected in a figure expressed on a scale totally different
from the one used by the school. Moreover, even if one has access to the
set of items applied in the tests, these only describe the tasks presented
to students in each PISA cycle, not everything that teachers and school
systems must organize to provide students with meaningful learning.
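For readers unfamiliar with the notion of proficiency criticized here, the sketch below shows the simplest member of the IRT family, the one-parameter (Rasch) model, from which scales of this kind derive; the numbers are illustrative and are not taken from SAEB or PISA:

```python
# Minimal sketch of the one-parameter (Rasch) IRT model: the probability
# that a student with proficiency theta answers an item of difficulty b
# correctly. Both parameters live on a logit scale, which is precisely the
# scale the chapter calls foreign to school grading. Values are illustrative.
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student exactly at the item's difficulty has a 50% chance of success...
print(round(p_correct(0.0, 0.0), 2))   # 0.5
# ...and a student one full logit above it, roughly a 73% chance.
print(round(p_correct(1.0, 0.0), 2))   # 0.73
```

Reported proficiencies are then linear rescalings of theta (for instance onto a mean-500 scale), which is why the resulting figure bears no resemblance to a classroom mark.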
The influence exerted by the OECD on nations has been increasing.
Examples of this are two of its products: PISA for Development, devised
for the poorer countries (Addey, 2016), and PISA for Schools, aimed at
schools that seek to compare their students' performance with that of
schools around the world (Lewis, 2017). The OECD also disseminates
the instrument known as Education GPS, a global positioning system of
sorts, which shows each education system its position vis-à-vis the other
countries. With this, the organization exercises world leadership in the
Big Data movement in education. And this may expand even further
with the need to keep up with the Sustainable Development Goals, in
particular the target that defines the search for Quality Education. This
goal involves mobilizing all governments to ensure that their students
achieve certain learning outcomes, so that these can be quantified and
compared internationally. Leaving aside the debate on whether such a
goal can actually be achieved, PISA can prove a useful tool for that
purpose, all the more so because it already defines minimum levels of
literacy to be attained.
The power of the OECD to influence education policies also reaches
Brazil. This chapter described the dispute over the implementation of a
national curriculum, which ended with the writing of the National
Common Curricular Base (BNCC). The curricular document is
structured around such concepts as literacy, competencies and skills,
just like PISA. However, its theoretical basis is presented on a single
page and focuses on defining learning goals in formats that look more
like frameworks for the SAEB tests.
SAEB, with its 30 years of existence, has shown that students' proficiency
has been improving, albeit slowly. This improvement has been more
marked since 2007, the year when performance targets were defined
for IDEB, acknowledged as the quality indicator of Brazilian education,
one of whose parameters is the proficiencies calculated by PISA, as
detailed earlier. It is also from 2007 onwards that there was an increase
in the number of subnational entities developing their own external
evaluation instruments. Thus, about 30% of Brazilian municipalities,
from the poorest to the wealthiest, and 80% of states apply external tests
to their students, most of them based on SAEB's instruments. One of the
reasons for this high number of states and municipalities conducting
external evaluations may be connected with preparing students for
SAEB, which in turn raises IDEB (Horta Neto, 2018).
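The coupling between SAEB proficiency and IDEB can be illustrated with a sketch. The product form below (a standardized proficiency score multiplied by a pass-rate indicator) follows INEP's published definition of IDEB, but the function name and all the numbers are invented for illustration:

```python
# Illustrative sketch of an IDEB-style indicator: standardized mean
# proficiency (rescaled to 0-10) multiplied by a pass-rate indicator (0-1).
# The product form follows INEP's published definition; values are invented.
def ideb(mean_proficiency_0_10: float, pass_rate: float) -> float:
    """IDEB-style score on a 0-10 scale."""
    return round(mean_proficiency_0_10 * pass_rate, 1)

print(ideb(6.0, 0.9))  # 5.4
print(ideb(6.0, 1.0))  # 6.0
```

The multiplicative structure is what makes test preparation attractive to school networks: raising either factor, proficiency or approval rates, lifts the headline indicator.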
Everything points to the BNCC serving to further broaden account-
ability policies in Brazil. In this way, the influence of external evaluation
on the education process increases and with it the pressure on the school
and the teacher for results. Consequently, despite preaching flexibility
and the development of critical and creative thinking, the BNCC may
contribute to compartmentalized and standardized education.

Notes
1 In an OECD publication on Brazil (OECD, 2019c), this fact is discussed,
quite superficially. In OECD (2016, p. 181), a brief discussion is presented
on the variations of performance among different countries and a trend curve
is presented for the countries’ performance in the various cycles, although the
variations are not subject to in-depth discussion (see Figure I.5.3, available at
http://dx.doi.org/10.1787/888933432623).
2 CNE is a regulatory body linked to MEC, responsible for defining norms
based on the interpretation of LDB. It is composed of 24 members,
indicated by civil society entities and appointed by the country's President
for a four-year term.
3 The Movement for the Base (Movimento pela Base, in Portuguese) defines its
mission thus:
BNCC defines the learning and development rights for all Brazilian chil-
dren and youth. We work to ensure that these rights are fulfilled, sup-
porting the implementation of the quality of BNCC and the New Middle
Education in all networks and public schools in the country. We monitor
and provide visibility to the progress of the implementation on various
fronts. We articulate for the alignment of policies and programs – cur-
ricular, of teacher education, didactical materials and assessments – with
the BNCC, always in pursuit of the coherence of the education system. In
partnership with national and international organizations, we collect evi-
dence and best practices to ensure the quality and legitimacy of processes.
Together with the education secretariats, we build and disseminate con-
sensuses and technical orientations for the construction of curricula and
teacher training. And we take to the wider society the debate on learning
that is more meaningful and connected with students’ lives.
Available at <http://movimentopelabase.org.br/quem-somos/>. Retrieved
on 07/08/2020.
4 In recent years a discussion has been taking place on socio-emotional skills.
PISA, for example, refers to them as one of the skills measured in “Global
Competences” (OECD, 2019a, pp. 165–208).

References
Addey, C. (2016). Pisa for development and the sacrifice of policy-relevant
data. Educação e Sociedade, 37(136), 685–706. https://doi.org/10.1590/
es0101-73302016166001
Aguiar, M., & Tuttman, M. (2020a). Breve histórico do processo de elaboração
da Base Nacional Comum Curricular no Brasil. In A. Santos & M. Ferreira
(Eds.), Base Nacional Comum Curricular, qualidade da educação e autonomia
docente (pp. 95–102). Em Aberto, 33(107). INEP.
Aguiar, M., & Tuttman, M. (2020b). Políticas educacionais no Brasil e a Base
Nacional Comum Curricular: disputas de projetos. In A. Santos & M. Ferreira
(Eds.), Base Nacional Comum Curricular, qualidade da educação e autonomia
docente (pp. 69–94). Em Aberto, 33(107). INEP.
Avelar, M., & Ball, S. (2019). Mapping new philanthropy and the heterarchi-
cal state: The Mobilization for the national learning standards in Brazil.
International Journal of Educational Development, 64, 63–73. https://doi.
org/10.1016/j.ijedudev.2017.09.007
Bauer, A., Pimenta, C., Souza, S., & Horta Neto, J. L. (2015). Avaliação em
larga escala em municípios brasileiros: o que dizem os números? Estudos
de Avaliação Educacional, 26(62), 326–352. http://dx.doi.org/10.18222/
eae266203207
Bonini, A., Druck, I., & Barra, E. (2018). Direitos à aprendizagem e ao desenvolvi-
mento na educação básica: subsídios ao currículo nacional. UFPR. Retrieved
August 20, 2020, from https://acervodigital.ufpr.br/bitstream/handle/1884/55911/
direitos_a_aprendizagem_e_ao_desenvolvimento_na_educacao_basica_sub
sidios_ao_curriculo_nacional-preprint.pdf?sequence=1&isAllowed=y
Carvalho, L. (2009). Governando a educação pelo espelho do perito: uma análise
do Pisa como instrumento de regulação. Educação e Sociedade, 30(109),
1009–1036.
Carvalho, L. M. (2020). Revisiting the fabrications of PISA. In G. Fan & T. S.
Popkewitz (Eds.), Handbook of education policy studies: School/university,
curriculum, and assessment, vol. 2 (pp. 259–273). Springer.
Cassio, F. (2019). Existe vida fora da BNCC? In F. Cássio & R. Catelli Jr. (Orgs.),
Educação é a Base? 23 educadores discutem a BNCC (pp. 13–39). Ação
Educativa.
Castro, M. (1999). A Educação para o século XXI: o desafio da qualidade e da
equidade. INEP. Retrieved August 18, 2020, from www.dominiopublico.gov.
br/download/texto/me000106.pdf
Fernandes, R. (2017). A universalização da avaliação e a criação do Ideb: pres-
supostos e perspectivas. In J. L. Horta Neto & R. Junqueira (Eds.), Sistema de
Avaliação da Educação Básica (SAEB): 25 anos. Em Aberto, 29(96), 177–193.
Fontanive, N. (2019). As avaliações nacionais dos sistemas escolares e a BNCC.
In I. Siqueira (Ed.), BNCC: educação infantil e ensino fundamental – proces-
sos e demandas no CNE (pp. 95–115). Fundação Santillana.
Freitas, L. C. (2020). Insanidade meritocrática torna o SAEB anual. Blog
do Freitas. Retrieved July 30, 2020, from https://avaliacaoeducacional.
com/2020/05/06/insanidade-meritocratica-torna-o-saeb-anual/
Horta Neto, J. L. (2018). Avaliação educacional no Brasil para além dos testes
cognitivos. Revista de Educação PUC-Campinas, 23(1), 37–53. https://doi.
org/10.24220/2318-0870v23n1a3990
INEP. (2015). Relatório nacional Pisa 2012: resultados brasileiros. INEP. Retrieved August 17, 2020, from https://download.inep.gov.br/acoes_internacionais/pisa/resultados/2014/relatorio_nacional_pisa_2012_resultados_brasileiros.pdf
INEP. (2018a). RESUMO TÉCNICO: Resultados do índice de desenvolvimento
da educação básica. INEP.
INEP. (2019a). Sinopse Estatística da Educação Básica 2019. INEP. Retrieved
August 15, 2020, from http://download.inep.gov.br/informacoes_estatisticas/
sinopses_estatisticas/sinopses_educacao_basica/sinopse_estatistica_educacao_
basica_2019.zip
INEP. (2019b). Sistema de Avaliação da Educação Básica: documentos de refer-
ência versão 1.0. INEP. Retrieved August 17, 2020, from http://portal.inep.gov.
br/informacao-da-publicacao/-/asset_publisher/6JYIsGMAMkW1/document/
id/6898204
Lewis, S. (2017). Governing schooling through ‘what works’: The OECD’s PISA
for schools. Journal of Education Policy, 32(3), 281–302. https://doi.org/10.1
080/02680939.2016.1252855
MEC. (1997). Parâmetros curriculares nacionais: introdução aos parâmetros
curriculares nacionais. MEC.
MEC. (2006a). Indagações sobre Currículo: currículo e desenvolvimento humano.
Retrieved August 05, 2020, from http://portal.mec.gov.br/seb/arquivos/pdf/
Ensfund/indag1.pdf
MEC. (2006b). Indagações sobre Currículo: educandos e educadores – seus
direitos e o currículo. Retrieved August 05, 2020, from http://portal.mec.gov.
br/seb/arquivos/pdf/Ensfund/indag2.pdf
MEC. (2006c). Indagações sobre Currículo: currículo, conhecimento e cultura.
Retrieved August 05, 2020, from http://portal.mec.gov.br/seb/arquivos/pdf/
Ensfund/indag3.pdf
MEC. (2006d). Indagações sobre Currículo: diversidade e currículo. Retrieved
August 05, 2020, from http://portal.mec.gov.br/seb/arquivos/pdf/Ensfund/
indag4.pdf
MEC. (2006e). Indagações sobre Currículo: currículo e avaliação. Retrieved
August 05, 2020, from http://portal.mec.gov.br/seb/arquivos/pdf/Ensfund/
indag5.pdf.
102 João Luiz Horta Neto

MEC. (2019). Base Nacional Comum Curricular. MEC. Retrieved August 15, 2020, from http://basenacionalcomum.mec.gov.br/images/BNCC_EI_EF_110518_versaofinal_site.pdf
MEC/CNE. (2017). Resolução CNE/CP n° 2 de dezembro de 2017. MEC/
CNE. Retrieved August 21, 2020, from http://portal.mec.gov.br/index.
php?option=com_docman&view=download&alias=79631-rcp002-17-
pdf&category_slug=dezembro-2017-pdf&Itemid=30192
MEC/CNE. (2018). Parecer CNE/CEB N° 3 de 8 de novembro de 2018. Retrieved
August 15, 2020, from http://portal.mec.gov.br/index.php?option=com_
docman&view=download&alias=102311-pceb003-18&category_slug=
novembro-2018-pdf&Itemid=30192
MEC/CNE. (2019b). Resolução CNE/CP No 2, de 20 de dezembro de 2019.
MEC/CNE. Retrieved August 13, 2020, from http://portal.mec.gov.br/docman/
dezembro-2019-pdf/135951-rcp002-19/file
Moreno, A. C. (2019). Pisa 2018: dois terços dos brasileiros de 15 anos sabem
menos que o básico de Matemática. G1. Retrieved August 08, 2020, from
https://g1.globo.com/educacao/noticia/2019/12/03/Pisa-2018-dois-tercos-dos-
brasileiros-de-15-anos-sabem-menos-que-o-basico-de-matematica.ghtml
OECD. (1999). Measuring student knowledge and skills – a new framework for
assessment. OECD.
OECD. (2006). Assessing scientific, reading and mathematical literacy – a frame-
work for Pisa 2006. OECD.
OECD. (2009). PISA 2009 assessment framework key competencies in reading,
mathematics and science: Key competencies in reading, mathematics and sci-
ence. OECD.
OECD. (2013). Pisa 2012 assessment and analytical framework: Mathematics,
reading, science, problem solving and financial literacy. OECD.
OECD. (2015). Beyond Pisa 2015: A longer-term strategy of Pisa. OECD
Publishing.
OECD. (2016). PISA 2015 results (Volume I): Excellence and equity in educa-
tion. OECD.
OECD. (2017). PISA 2015 assessment and analytical framework. OECD.
OECD. (2018). Pisa 2021 mathematics framework (draft). OECD. Retrieved
August 13, 2020, from https://pisa2021-maths.oecd.org/files/Pisa%20
2021%20Mathematics%20Framework%20Draft.pdf
OECD. (2019a). Pisa 2018 assessment and analytical framework. OECD
Publishing.
OECD. (2019b). Pisa 2018 results (Volume I): What students know and can do.
OECD Publishing.
OECD. (2019c). Brazil – Country Note – PISA 2018 Results. OECD Publishing.
Retrieved July 25, 2020, from www.oecd.org/pisa/publications/PISA2018_
CN_BRA.pdf
OECD. (2020a). Education GPS. OECD Publishing. Retrieved July 23, 2020,
from https://gpseducation.oecd.org/CountryProfile.
OECD. (2020b). Pisa 2024 strategic vision and direction for science. OECD
Publishing. Retrieved August 07, 2020, from www.oecd.org/Pisa/publications/
Pisa-2024-Science-Strategic-Vision-Proposal.pdf
Soares, F. (2019). Pontos do debate para a construção da BNCC. In I. Siqueira (Ed.), BNCC: educação infantil e ensino fundamental – processos e demandas no CNE (pp. 67–80). Fundação Santillana.
Teodoro, A. (2020). A OCDE e o sonho de uma governação mundial da educação:
pressupostos e análise crítica. In M. González-Delgado, M. Ferraz Lorenzo, &
C. Machado-Trujillo (Coord.), Transferencia, transnacionalización y transfor-
mación de las políticas educativas (1945–2018) (pp. 283–292). FahrenHouse.
Tuttman, M., & Aguiar, M. (2019). A construção da BNCC da Educação Infantil e do Ensino Fundamental: uma visão crítica. In I. Siqueira (Ed.), BNCC: educação infantil e ensino fundamental – processos e demandas no CNE (pp. 81–94). Fundação Santillana.
UNESCO. (2001). Informe Técnico del Primer Estudio Internacional Compara-
tivo sobre Lenguaje, Matemática y Factores Asociados para alumnos de Tercer
y Cuarto grado de la Educación Básica. UNESCO, Santiago.
UNESCO. (2008). Los Aprendizajes de los estudiantes de América Latina y el
Caribe: Primer reporte de los resultados del Segundo Estudio Regional Com-
parativo y Explicativo (SERCE). UNESCO, Santiago.
UNESCO. (2016). Comparación de resultados del segundo y tercer estudio
regional comparativo y explicativo – Serce y Terce 2006–2013. UNESCO,
Santiago.
Weller, W., & Horta Neto, J. L. (2021). The Brazilian education system: An over-
view of history and politics. In S. Jornitz & M. Parreira do Amaral (Eds.), The
education systems of the Americas. Springer.
5 Testing PISA tests

A study about how secondary and college students answer PISA items in mathematics and science

Vítor Duarte Teodoro, Vítor Rosa, João Sampaio Maia and Daniela Mascarenhas

The PISA programme uses tests (for students aged 15) and questionnaires (for students, teachers, principals and parents). PISA questionnaires are public, but the large majority of test items are kept confidential so they can be reused and scaled across years. Critics of PISA argue that this makes it difficult to analyse the validity of PISA tests, particularly across different countries and cultures, since the items are initially written in English or French. This
chapter describes a study about how students from different age groups,
education levels and courses answer PISA items and how they evalu-
ate different aspects of the items. We hypothesized that PISA items are
too difficult for the majority of students aged 15 and more appropriate for older students. We administered a booklet with two sets
of publicly available Mathematics items (from PISA 2012) and Science
items (from PISA 2015) to a non-representative sample of 839 Portuguese students from Basic, Secondary and Higher Education (approximately 50% older than 18), from different types of courses, schools, polytechnics and universities, public or private. Students were
asked to answer the items and to evaluate different aspects of each item
(e.g. “understanding” the question and assessing its “difficulty”).
Comparing the scores of our student sample with the scores on the same items in the PISA tests (in 2012 and 2015), for both Portugal and the OECD countries (students aged 15), we found similar results across all age groups of our sample, in both Mathematics and Science. This suggests that PISA tests can target older students and that knowledge and skills at age 15 are globally similar to knowledge and skills at older ages (at least in Mathematics and Science). We also found that item facility has a significant positive correlation with students' assessment of their comprehension of the item text, with their certainty that their answer is correct, and with their assessment of the difficulty of the item. On the other hand, item facility has no significant correlation with whether students report having studied the content of the item.
DOI: 10.4324/9781003255215-6
Testing PISA tests 105

1 Introduction and context


International Large-Scale Assessments (ILSAs) are currently a major
force in education policymaking in Portugal and many other countries
(Carvalho et al., 2020). Reports and studies based on ILSAs tests and
questionnaires allow researchers to empirically test hypotheses about
which factors contribute to improving education, despite much criticism,
mainly due to the fact that the tests are “low-impact tests” (students can
answer them without practical consequences on their school assessment)
and that the items may have a strong “cultural component” that is not
shared by students from different countries and backgrounds (see, e.g.,
Waldow & Steiner-Khamsi, 2019; Zhao, 2020).
One such study, which covers different scientific fields, includes various levels of education and is a reference for governance in the field of education, is the Programme for International Student Assessment (PISA), conducted by the Organisation for Economic Co-operation and Development (OECD), in which 15-year-old students attending at least the 7th grade participate. The programme began in 2000 and takes place every three years, with the participating countries varying from edition to edition.
The main purpose of PISA is to monitor the levels of skills acquired
by students in three literacy domains (reading, mathematics and science).
More recently, collaborative problem-solving and financial literacy have
been included. The results obtained are expressed on a normalized scale
of 0 to 1,000 points (mean 500 and standard deviation 100). By analysing the results obtained in PISA, the aim is to check whether schools in the different countries prepare their young people for life in society, that is, whether students have competences that make them fit for active life. By comparing results between countries, one can verify whether, for example, Portugal is at the same level as other countries.
Portugal has participated in all seven editions of PISA (2000, 2003, 2006, 2009, 2012, 2015 and 2018) and has registered a trend of improving results in the three domains analysed. Although there are several publications on PISA in general, there are few on the participation of Portugal, and their scope is very restricted (Carvalho & Costa, 2009; Conselho Nacional de Educação, 2013; Marôco, 2020; Rosa et al., 2020). PISA applies the Rasch model for item scaling (OECD, 2009), creating an initial version that serves as a reference for translations. The test booklet is completed in a rotating system in which each student fills in only one part.
Several authors have criticized PISA on different aspects, from cultural
and economic development issues, to translation problems and the type
and characteristics of the items. Zhao (2020) considers that PISA has
three fundamental weaknesses: the existence of an underlying vision of
106 Vítor Duarte Teodoro et al.

education, the way it is implemented and the type of interpretation that is made, with its consequent global impact on education. We will not touch on the last of these weaknesses; we will briefly touch on the first and focus on the second, but only as far as the domains of mathematics and science are concerned.
As part of their critique of PISA's underlying view of education, Araújo et al. (2017) point out that the differences between countries in terms of culture and economics call into question the legitimacy of ranking them in a single table. Zhao (2020) states that this underlying view is based on the perspective of the OECD countries, which, for the most part, are the most economically advanced countries in the world and do not represent the cultural diversity of the approximately 200 countries in the world.
Regarding the way it is implemented, Kreiner and Christensen (2014) address PISA's use of the Rasch model and say that PISA does not meet the requirements for this model to work properly and that, therefore, country rankings are not robust. Araújo et al. (2017) report that the fact that not all items are made available to researchers prevents them from checking whether the published results depend on the choices made by the survey authors, and that there is a lack of documentation clarifying

the choices made in terms of the dimensions measured, the items selected as well as those discarded, the item response models chosen and the resulting impact of all these choices on the ranking of countries.
(p. 5)

Regarding the test items and what they measure, Hopmann (2008) argues that there is no research showing that PISA covers enough to be representative of the school subjects involved, and UNESCO (2019) says that its items have focused too much on a narrow set of subjects and fail to capture what is important to education systems. Hopfenbeck et al. (2018) state that PISA may be measuring different abilities in students of different languages, because there is a high degree of differential functioning in relation to language, to which Zhao (2020) adds the problem that in some languages the text is much longer than in others, even though the time given to administer the test is the same for all countries. Sjøberg (2015, 2019) states that the OECD's attempt to decontextualize the items so as not to disadvantage some countries over others goes against the recommendations of educators and against the choices of national leaders who want science to be relevant, interesting and linked to a context.
to a context. Araújo et al. (2017) also question whether the multidimen-
sionality of the items and the abstract skills that PISA intends to measure
Testing PISA tests 107

fit a response model that ultimately boils down to a single score and say
that “it is less than clear in PISA which dimensions are being measured
and which suppressed” (p. 4).
This study is based on a Question Booklet with mathematics and science items, from 2012 and 2015, respectively, released by PISA. In addition, for each item we included four questions ("Did you fully understand the text?", "Have you ever studied something related to the subject of the question?", "How sure are you of your answer?" and "How do you rate the difficulty of this question?"), to be answered by students from basic, secondary and higher education levels on a four-level Likert scale. We compare the results, by age, area and education level, to find out whether they are similar to those obtained by the Portuguese and OECD students who took the 2012 and 2015 PISA tests, against the background of the criticisms of PISA, namely those related to its items.

2 Methods
In this study, we use two types of methodological analysis: quantitative
and qualitative. In the first, we assume that the values presented in the
PISA reports are reliable and that the samples referred to and used in the
studies are representative. In the second, we used an interpretative quali-
tative approach, following the perspectives of Rémond (2006), Mullis
et al. (2009) and Rosa et al. (2020), among others. We resorted to the
reports and databases produced by the OECD. Since the first cycle of
PISA, in 2000, Portugal has registered a significant improvement in the
results obtained in the different literacies. In the 2018 edition of PISA,
in a ranked list of 79 participants, Portugal was ranked 24th in scien-
tific literacy, 24th in reading literacy and 22nd in mathematical literacy, with 492 points in each domain, above the OECD average in all domains.

2.1 Item selection


PISA items are originally produced in English and French and then
translated into other languages. As a general rule, the translation is done
by two translators. The versions produced are reconciled by a mediator,
with the collaboration of the scientific coordinator of the domain. The
items that are used in the various cycles of the study aim at identify-
ing trends in student performance. For this reason, they are not public.
However, whenever an assessment domain is the main one, new items
are designed and some of the items used in previous cycles are made
available to the general public, illustrating the type of questions students
are asked.
For this study, nine groups of mathematics and science items were selected from the 2012 and 2015 PISA tests. In 2012, PISA focused its assessment on mathematics and in 2015 on science. Among the criteria for choosing the items, we considered their different degrees of difficulty, and in scoring the answers we followed the score/classification assigned by the OECD (2013): 1 or 2 points for full credit, depending on the question; 1 point for partial credit, where applicable; and 0 for a wrong answer. With these items, a Question Booklet was designed to be administered to 15-year-old students, the target audience of PISA, and to students attending higher education, seeking the statistical comparison that is essential to draw safe conclusions and to test our starting hypothesis about the unsuitability of the items.

2.2 Instrument
The Question Booklet comprises 36 pages, divided into three parts: Part 0, with one group of Mathematics items (from PISA 2012) and one group of Science items (from PISA 2015); Part 1, with four groups of Mathematics items; and Part 2, with three groups of Science items. It also includes, before Part 0, sociodemographic background questions about the students (name of institution, school year and course attended, date of birth, gender, attendance of pre-school education and grade repetition) and about their parents/carers (professional activity and academic qualifications). Some items, applied by PISA in digital format, were slightly adapted to paper format. For each item, we also added questions to assess four aspects of the item using a four-point Likert scale.
The students who agreed to participate in the study were asked to fill in Part 0 (common to all) and then only Part 1 or Part 2. When the time allowed for the application of the Question Booklet (approximately 60 minutes) permitted, some students answered all parts.

2.3 Administration and participants


An Application Protocol (distribution of the students in the classroom,
distribution of the Booklets, information prior to distribution, proce-
dures to be followed, records, non-use of mobile phones and calculators,
etc.) was defined and the authorization for Monitoring School Surveys
was obtained from the Directorate General for Education (DGE). The
procedures prior to applying the Question Booklet were different in pri-
mary, secondary and higher education schools, both public and private.
In the case of primary and secondary education, the School Directors and/or Coordinators were contacted, with the aim of explaining the study and obtaining the consent of the educational establishment.
After obtaining the necessary authorizations, the documentation was
handed in, in particular a copy of the Question Booklet and the letter/
form with the request for authorization from parents/guardians. Once
the authorizations had been obtained, a day and time were then scheduled for the application of the Question Booklet. In higher education, the teachers of different degree programmes, bachelor's and master's, were contacted directly. To ensure the proper application of this evaluation instrument, a member of the research team was always present. The application of the Question Booklet took place between 7 October 2019 and 3 March 2020, having been interrupted by the pandemic crisis (SARS-CoV-2) and the nationwide closure of educational establishments. The sampling process was by convenience. A total of 839 valid booklets were collected:

• 78 (9.3%) from Basic Education, 204 (24.3%) from Secondary Education and 430 (51.3%) from Higher Education;
• 74 aged 15 years, the PISA “age” (8.8% of the sample);
• 747 aged less than 23 years (89.0% of the sample).

3 Data and data analysis


The PISA databases contain the students' responses to each item. We identified all students in the OECD (and in Portugal) who responded to all items in the booklet (Table 5.1). Their responses are compared to the answers of the 839 students in our sample. The analysis was made by age, by school class and by education level.
PISA estimates student’s individual proficiency using Plausible Val-
ues (PVs). “PVs are a selection of likely proficiencies for students that
attained each score” (OECD, 2014, p. 146). PISA databases include PVs
for each student (in 2012, five PVs; in 2015, ten PVs). We computed the
average PVs for each student who answered all items of the booklet, for
all OECD countries and for Portugal. For 2015, in Science, these average
PVs are practically the same for all Portuguese students and for the Por-
tuguese students who answered all items of the booklet (489 versus 490)
and the same for the equal groups of OECD students (495). For 2012,
in Mathematics, OECD students that answered all items of the booklet
have a higher average PVs (498 versus 488) and Portuguese students have
a lower average PVs (480 versus 485).
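The averaging of plausible values described above can be sketched in plain Python. The record layout and the PV column names (`PV1MATH` … `PV5MATH`, five values as in the PISA 2012 student database; ten in 2015) are assumptions for illustration, not the authors' actual code:

```python
def mean_pv(student: dict, domain: str = "MATH", n_pvs: int = 5) -> float:
    """Average the plausible values of one student for one domain."""
    return sum(student[f"PV{i}{domain}"] for i in range(1, n_pvs + 1)) / n_pvs

def group_mean_pv(students: list, domain: str = "MATH", n_pvs: int = 5) -> float:
    """Average PV of a group, e.g. students who answered all booklet items."""
    return sum(mean_pv(s, domain, n_pvs) for s in students) / len(students)

student = {"PV1MATH": 480, "PV2MATH": 500, "PV3MATH": 490,
           "PV4MATH": 470, "PV5MATH": 510}
print(mean_pv(student))  # 490.0
```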

Table 5.1 Students who answered all the items of the booklet on the PISA test

OECD Portugal

Mathematics items, 2012 79,577 (from 295,416, 26.9%) 1,759 (from 5,722, 30.7%)
Science items, 2015 6,669 (from 248,620, 2.7%) 187 (from 7,325, 2.6%)
3.1 Mathematics items


Table 5.2 and Figure 5.1 describe the sum of scores of the Mathematics
items for:

• each class (25 classes, identified by a letter that represents a school and a number that identifies a class);
• each age (10 age groups, from 15 years old to over 23 years);
• each school level (3 levels, Basic, Secondary, Higher Education);
• the total of our sample (524 students who answered all Mathematics
items of the booklet).

Table 5.2 and Figure 5.1 also show the difference between each group of our sample and the means for the OECD and for the Portuguese students who answered the same items in the PISA 2012 study, as well as the statistical significance of these differences.
We conclude that:

• The mean for the 1,759 Portuguese students who took PISA 2012 and answered all the items of the booklet is 5.33, significantly lower than the mean of 5.82 for all OECD students (Portugal's 2012 score on PISA Mathematics Literacy was 487, lower than the OECD's score of 494, a difference with no statistical significance).
• The mean of the 524 students of our sample, 5.69, is significantly higher than the mean of the 1,759 Portuguese students who took PISA 2012 (5.33) and does not differ significantly from the mean of the 79,577 students of the OECD sample (5.82).
• Of the 25 school classes of our sample, four have a mean that is statistically different from the 5.33 mean of the 1,759 Portuguese students who took PISA 2012 (one has a lower value and the other three have higher values).
• By school level, only the Higher Education group (339 students) has a mean significantly higher than that of the 1,759 Portuguese students who took PISA 2012; this difference is due to two classes of engineering students (Sup_B_1 and Sup_D_2), most of them aged 22 or 23.

As a global conclusion, we have evidence that our sample of 524 students answered the Mathematics items in a similar way to the 1,759 Portuguese students who took PISA 2012, across all ages.
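The chapter does not state which significance test produced the values in the "Sign" column; a plausible reconstruction, using only the group means and standard errors reported in Table 5.2 with a normal approximation, is:

```python
import math

def compare_means(mean1: float, sem1: float, mean2: float, sem2: float):
    """Difference between two group means and a two-sided p-value,
    using a normal approximation with the groups' standard errors."""
    diff = mean1 - mean2
    se = math.sqrt(sem1 ** 2 + sem2 ** 2)
    z = diff / se
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return diff, p

# Class Sup_B_1 (mean 8.20, SEM 0.258) versus the 1,759 Portuguese
# PISA 2012 students (mean 5.33, SEM 0.069), values taken from Table 5.2:
diff, p = compare_means(8.20, 0.258, 5.33, 0.069)
print(round(diff, 2), p < 0.001)  # 2.87 True
```

The computed difference (2.87) and its significance (p < 0.001) match the table's entry for class Sup_B_1.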
Table 5.2 Sum of scores of the Mathematics items, by school class, age and education level

School code | Age | School level   N   Mean   Median   SD   SEM   Diff to mean PISA PT 2012   Sig.   Diff to mean OECD 2012   Sig.
Bas_A_1 14 5.14 5.0 2.515 0.672 -0.19 -0.68
Bas_B_1 14 5.00 4.8 1.629 0.435 -0.33 -0.82
Bas_C_1 15 4.23 3.5 1.963 0.507 -1.10 -1.59 0.038
Sec_A_1 20 5.80 6.0 2.256 0.504 0.47 -0.02
Sec_A_2 9 5.44 6.0 2.709 0.903 0.11 -0.38
Sec_B_1 9 3.67 3.5 1.521 0.507 -1.67 -2.16 0.029
Sec_C_1 14 5.46 5.5 1.562 0.418 0.13 -0.36
Sec_D_1 22 6.93 7.0 2.295 0.489 1.60 0.010 1.11
Sec_E_1 23 4.91 4.5 1.992 0.415 -0.42 -0.91
Sec_F_1 9 5.94 6.0 1.878 0.626 0.61 0.12
Sec_G_1 10 5.95 6.0 2.386 0.754 0.62 0.13
Sec_H_1 10 4.85 4.8 2.122 0.671 -0.48 -0.97
Sec_I_1 16 3.59 3.3 1.666 0.416 -1.74 0.017 -2.23 0.003
Sup_A_1 35 5.54 5.5 1.888 0.319 0.21 -0.28
Sup_B_1 45 8.20 8.5 1.733 0.258 2.87 0.000 2.38 0.000
Sup_C_1 32 5.75 5.8 2.275 0.402 0.42 -0.07
Sup_D_1 9 5.17 5.5 1.500 0.500 -0.17 -0.66
Sup_D_2 48 7.18 7.5 2.294 0.331 1.84 0.000 1.35 0.002
Sup_D_3 14 5.86 6.0 2.499 0.668 0.52 0.03
Sup_D_4 39 5.87 6.0 2.022 0.324 0.54 0.05
Sup_E_1 20 5.60 5.5 2.204 0.493 0.27 -0.22
Sup_F_1 14 4.89 5.0 1.767 0.472 -0.44 -0.93
Sup_G_1 26 4.42 4.5 1.798 0.353 -0.91 -1.40 0.016
Sup_G_2 22 4.95 5.0 2.149 0.458 -0.38 -0.87
Sup_G_3 35 4.63 4.5 2.217 0.375 -0.71 -1.20 0.017
15 years 48 5.16 5 2.307 0.333 -0.18 -0.67
16 years 98 5.61 6 2.149 0.217 0.28 -0.21
17 years 28 4.59 4 2.135 0.403 -0.74 -1.24 0.028
18 years 53 5.28 5 2.046 0.281 -0.05 -0.54
19 years 65 5.94 6 2.168 0.269 0.60 0.11
20 years 51 6.00 6 2.133 0.299 0.67 0.18
21 years 36 5.76 6 2.427 0.405 0.43 -0.06
22 years 58 6.68 7 2.696 0.354 1.35 0.001 0.86 0.028
23 years 26 6.46 8 2.687 0.527 1.13 0.050 0.64
> 23 years 46 5.51 5 2.323 0.342 0.18 -0.31
Basic 43 4.78 4.5 2.057 0.314 -0.56 -1.05 0.021
Secondary 142 5.35 5.3 2.249 0.189 0.01 -0.48
Higher Education 339 5.95 6.0 2.351 0.128 0.61 0.000 0.12
Total 524 5.69 5.5 2.328 0.102 0.35 0.011 -0.14
PISAPT 1759 5.33 5.0 2.908 0.069 -0.49 0.000
OECD 79577 5.82 6.0 2.972 0.011
Figure 5.1 Sum of scores of the Mathematics items, by age, box plot and frequency curve
3.2 Science items


Table 5.3 and Figure 5.2 describe the sum of scores of the Science items
for:

• each class (25 classes, identified by a letter that represents a school and a number that identifies a class);
• each age (10 age groups, from 15 years old to over 23 years);
• each school level (3 levels, Basic, Secondary, Higher Education);
• the total of our sample (496 students who answered all Science items
of the booklet).

Table 5.3 and Figure 5.2 also show the difference between each group of our sample and the means for the OECD and for the Portuguese students who answered the same items in the PISA 2015 study, as well as the statistical significance of these differences.
It is possible to conclude that:

• The mean for the 187 Portuguese students who took PISA 2015 and answered all the items of the booklet is 6.25, not significantly different from the mean of 6.19 for all 6,669 OECD students (Portugal's 2015 score on PISA Science Literacy was 501, significantly higher than the OECD's score of 493).
• The mean of the 496 students of our sample, 6.54, is not statistically different from the mean of the 187 Portuguese students who took PISA 2015 (6.25) and is significantly higher than the mean of the 6,669 students of the OECD sample (6.19).
• Of the 25 school classes of our sample, seven have a mean that is statistically different from the 6.25 mean of the 187 Portuguese students who participated in PISA 2015 (four have lower values and the other three have higher values).
• By school level, only the Basic Education group (38 students) has a mean significantly lower than that of the 187 Portuguese students who participated in PISA 2015.
• Of the age groups, only students aged 22 or 23 have a mean significantly higher than that of the 187 Portuguese students who took PISA 2015; these students are from two classes of engineering courses (Sup_B_1 and Sup_D_2), who are expected to have better skills in science and mathematics.

As a global conclusion, we also have evidence that our sample of 496 students answered the Science items in a similar way to the 187 Portuguese students who took PISA 2015, across all ages.
Table 5.3 Sum of scores of the Science items, by school class, age and education level

School code | Age | School level   N   Mean   Median   SD   SEM   Diff to mean PISA PT 2015   Sig.   Diff to mean OECD 2015   Sig.
Bas_A_1 14 6.14 6.5 2.742 0.733 -0.11 0.32
Bas_B_1 12 4.42 4.0 1.782 0.514 -1.83 0.020 -1.41 0.020
Bas_C_1 12 4.58 4.5 1.621 0.468 -1.67 0.033 -1.24 0.036
Sec_A_1 14 8.71 9.0 2.268 0.606 2.46 0.001 2.89 0.000
Sec_A_2 8 8.75 9.5 1.581 0.559 2.50 0.009 2.93 0.006
Sec_B_1 8 5.38 5.5 2.066 0.730 -0.88 -0.45
Sec_C_1 15 6.87 7.0 2.416 0.624 0.62 1.04
Sec_D_1 18 6.89 7.0 1.937 0.457 0.64 1.06
Sec_E_1 21 5.86 6.0 1.797 0.392 -0.39 0.03
Sec_F_1 9 7.89 8.0 1.269 0.423 1.64 2.06
Sec_G_1 9 6.11 5.0 2.205 0.735 -0.14 0.29
Sec_H_1 9 7.89 8.0 1.537 0.512 1.64 2.06
Sec_I_1 20 4.20 4.0 3.037 0.679 -2.05 0.001 -1.62 0.001
Sup_A_1 35 6.80 7.0 1.967 0.333 0.55 0.98
Sup_B_1 44 9.18 9.0 1.660 0.250 2.93 0.000 3.36 0.000
Sup_C_1 32 5.66 6.0 2.610 0.461 -0.60 -0.17
Sup_D_1 7 6.57 7.0 2.225 0.841 0.32 0.75
Sup_D_2 35 7.06 7.0 2.287 0.387 0.81 1.23
Sup_D_3 14 6.79 7.0 1.718 0.459 0.53 0.96
Sup_D_4 38 6.16 6.0 2.150 0.349 -0.09 0.33
Sup_E_1 20 6.90 7.0 1.744 0.390 0.65 1.08
Sup_F_1 12 6.58 6.5 1.832 0.529 0.33 0.76
Sup_G_1 24 6.38 7.0 1.974 0.403 0.12 0.55
Sup_G_2 24 4.58 5.0 2.062 0.421 -1.67 0.003 -1.24 0.003
Sup_G_3 28 6.18 6.0 2.245 0.424 -0.07 0.35
15 years 37 6.38 6 2.822 0.464 0.13 0.55
16 years 96 6.36 7 2.318 0.237 0.11 0.54
17 years 26 6.62 6 2.499 0.490 0.36 0.79
18 years 54 6.54 7 2.337 0.318 0.29 0.71
19 years 71 6.44 6 2.034 0.241 0.19 0.61
20 years 52 6.77 7 1.986 0.275 0.52 0.94
21 years 31 6.35 7 2.715 0.488 0.10 0.53
22 years 48 7.25 8 2.957 0.427 1.00 0.024 1.43 0.005
23 years 26 7.81 9 2.593 0.508 1.56 0.006 1.98 0.002
> 23 years 41 5.59 6 2.280 0.356 -0.67 -0.24
Basic 38 5.11 4.5 2.240 0.363 -1.15 0.014 -0.72 0.012
Secondary 133 6.62 7.0 2.566 0.223 0.37 0.80
Higher Education 325 6.67 7.0 2.352 0.130 0.42 0.85 0.001
Total 496 6.54 7.0 2.433 0.109 0.29 0.35 0.004
PISAPT 187 6.25 6.0 2.661 0.195 0.07
OECD 6669 6.19 6.0 2.638 0.032
Figure 5.2 Sum of scores of the Science items, by age, box plot and frequency curve
3.3 Item facility


We obtained the item facility for each item by computing average scores on a scale of 0 to 1, for our sample and for the students who answered the same items in PISA 2012 (Mathematics) and PISA 2015 (Science). An item facility of 0 means that all students had a score of 0 points on the item; an item facility of 1 means that all students obtained the full score. The item facility can thus be seen as the proportion of the maximum score attained on average: for example, 0.4 means that the average score is 40% of the maximum score of the item.
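As a sketch, the item facility computation described above amounts to dividing the average score by the item's maximum score (a minimal illustration; names are ours):

```python
def item_facility(scores: list, max_score: int) -> float:
    """Item facility: average score divided by the maximum score.
    0 means every student scored 0; 1 means every student got full credit."""
    return sum(scores) / (len(scores) * max_score)

# Ten students on a 2-point item: the average score is 40% of the maximum.
print(item_facility([0, 0, 0, 1, 1, 1, 1, 2, 2, 0], max_score=2))  # 0.4
```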
Figure 5.3 shows the scattergrams and Pearson's coefficients of correlation between the item facilities for the different education levels of our full sample and the item facilities for students who answered the same set of items in PISA 2012 (Mathematics) and PISA 2015 (Science), both in Portugal and in all OECD countries. All the correlations are very high and highly significant (the minimum correlation is 0.907).
From these correlations, we got evidence that our sample of students
from different education levels (Basic, Secondary, Higher Education) and
ages (15 to greater than 23) had similar results on the item facility of the
students who participated on the PISA studies, all aged 15.

3.4 Item discrimination


We obtained the item discrimination index for each item by computing
Pearson's correlation coefficient between the score obtained by each stu-
dent on that Mathematics (or Science) item and the same student's sum
of scores in Mathematics (or Science). This discrimination index was
obtained for the students in our sample and for the students in the
PISA database (Portugal and OECD) who answered the same items.
As for all correlation coefficients, 0 means no correlation, 1 means a
perfect positive linear correlation, and –1 means a perfect negative linear
correlation.
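The index described here can be sketched as follows, with invented scores (PISA's operational analyses rely on IRT scaling rather than this classical index alone):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equally long sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Invented item scores for five students on three items.
scores = [
    [0, 0, 1],
    [1, 0, 1],
    [1, 1, 2],
    [0, 1, 0],
    [1, 1, 2],
]

# Total score per student over the whole domain.
totals = [sum(row) for row in scores]

# Discrimination index of each item: correlation of the item's scores
# with the total scores. (The item itself contributes to the total;
# some analyses use a corrected index that excludes it.)
discrimination = [
    pearson([row[j] for row in scores], totals) for j in range(3)
]
```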
Figure 5.4 shows the scattergrams and Pearson's correlation coefficients
between the item discrimination for the different education levels in our
full sample and the item discrimination for students who answered the
same set of items in PISA 2012 (Mathematics) and PISA 2015 (Science),
both in Portugal and across all OECD countries. Discrimination indices
from our Basic Education sample have no significant correlation with dis-
crimination indices from the secondary and higher education levels, but
they do correlate significantly with Portugal's PISA 2012 and the OECD's
2012 item discriminations. This suggests that the items are less suitable
for the Basic Education level.

Figure 5.3 Item facility, correlations by education level, PISA PT, PISA OECD

Figure 5.4 Item discrimination, correlations by education level, PISA PT, PISA OECD

Taking all students in our sample, the discrimination indices correlate
positively and significantly with those of the PISA sample, both in
Portugal and in the other OECD countries. These correlations provide
evidence that our sample as a whole, drawn from different education
levels (Basic, Secondary, Higher Education) and ages (from 15 years old
to over 23 years old), obtained item discrimination results similar to
those of the students who participated in the PISA studies, all aged 15.

3.5 Item assessment, by students


We asked the students in our sample to assess each item on four four-
level Likert scales:

• “Have you fully understood the text?”
• “Have you ever studied anything related to the subject of the
question?”
• “How sure are you of your answer?”
• “How do you rate the difficulty of this question?”

For each scale, there are four grades. Figure 5.5 shows bar charts,
by education level, of the mean on each scale for each item; the figure
also shows all item facilities. In an overall analysis, we see that students'
assessment of the items is similar across the three education levels on all
four scales, with the exception of some items that are more difficult for
Basic Education students.
Figure 5.6 shows scattergrams and Pearson's coefficients for the correla-
tions between item facility and the scales used by students to assess the
items, by education level.
The data show positive linear correlations, with statistical significance
(Pearson's coefficients from 0.198 to 0.755), between each scale and the
item facility, both for the total sample and for each education level
(14 out of 16 cases). There are only two exceptions: the scale "Have
you ever studied anything related to the subject of the question?" has no
significant correlation with item facility for the total sample or for the
higher education level.
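The kind of significance check behind these statements can be sketched with invented data: compute r, then assess it with the t statistic for a correlation coefficient (the study itself reports exact p-values; the facility and certainty values below are purely illustrative):

```python
import math

# Invented data: item facility and mean "How sure are you of your
# answer?" rating for eight items.
facility = [0.21, 0.35, 0.40, 0.48, 0.55, 0.63, 0.72, 0.80]
certainty = [1.9, 2.1, 2.4, 2.3, 2.8, 2.9, 3.2, 3.5]

n = len(facility)
mx, my = sum(facility) / n, sum(certainty) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(facility, certainty))
sxx = sum((x - mx) ** 2 for x in facility)
syy = sum((y - my) ** 2 for y in certainty)
r = sxy / math.sqrt(sxx * syy)

# Significance of r via the t statistic with n - 2 degrees of freedom:
# |t| above the two-sided 5% critical value (2.447 for df = 6)
# indicates a statistically significant correlation.
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
```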
In summary, we see that item facility has a significant positive cor-
relation with students' assessment of:

• their comprehension of the item's text;
• their certainty that the answer is correct;
• the difficulty of the item.

Figure 5.5 Item assessment by students, by education level, mean for each scale

Figure 5.6 Item assessment by students, correlations between facility and scales, by education level

We also see that item facility has no significant correlation with
students' prior study of the item's content. Students' assessment of the
items, concerning whether they had ever studied the item's subject,
empirically supports the PISA assumption that the tests are designed to
assess "skills for life" rather than the school curriculum.
Figure 5.7 shows scattergrams and Pearson's coefficients for the correla-
tions between the scales used by students to assess the items.

Figure 5.7 Item assessment by students, correlations between scales

We can see that there is always a positive linear correlation, with sta-
tistical significance (Pearson's coefficients from 0.205 to 0.955), between
each pair of scales (six cases in total). Note that the lowest correlation
is between the "Have you ever studied anything related to the subject
of the question?" and "How do you rate the difficulty of this question?"
scales. This result reinforces our earlier conclusion about one of the
most important PISA assumptions: that PISA items are independent of
the school curriculum.

4 Conclusions
PISA provides comparative data on 15-year-old students who are at
least in the 7th grade. The study aims to determine whether the schools
of participating countries/economies prepare students to play the role of
informed citizens. PISA does not intend to assess the school curriculum,
but rather the skills that students have acquired for active life. In that
sense, it aims to assess how well students mobilize their skills in three
domains of literacy: reading, mathematics and science. More recently, it
has also looked at collaborative problem-solving and financial literacy.
PISA uses questionnaires (for students, teachers, principals and par-
ents) and tests (for students). The questionnaires are public, but the large
majority of test items are kept confidential to allow comparability across
participating countries/economies over the various cycles.
Using a set of mathematics and science items released by the OECD,
we conducted a national study, administering a booklet of items to
students from different education levels (Basic, Secondary, Higher
Education), aged from 15 to over 23 years. Our hypothesis was that the
PISA items are very difficult for most students aged 15 and are more
appropriate for older students. The booklet contained two sets of items:
Mathematics (PISA 2012) and Science (PISA 2015). We used a conveni-
ence sample, administering 839 booklets to students from different types
of courses, schools and higher education establishments, public and
private. Students were asked to answer the items and to assess different
aspects of each item on four Likert scales: "Have you fully understood
the text?"; "Have you ever studied anything related to the subject of
the question?"; "How sure are you of your answer?"; and "How do you
rate the difficulty of this question?".
We compared the scores of our sample with the scores of the items in
PISA (in 2012 and 2015), both for Portugal and for OECD countries
(students aged 15), and we found that:

• Results are similar in all age groups in the two literacies under analy-
sis; this means that PISA tests may serve a broader purpose, covering
higher age groups, which allows us to accept our starting hypothesis:
the knowledge and skills of 15-year-olds are similar to the knowledge
and skills of older age groups.
• Results also allow us to conclude that item facility has a significant
positive correlation with students' evaluation of the items: with their
understanding of the item text, with their certainty about the answer,
and with their evaluation of the item's difficulty. It is also possible to
ascertain that item facility has no significant correlation with students'
prior study of the item's content.

These results seem to support two important assumptions of the PISA
tests: that they assess what 15-year-old students can do with what they
have learned in school, and that the assessment is not restricted to the
curriculum.

References
Araújo, L., Saltelli, A., & Schnepf, S. (2017). Do PISA data justify PISA-based
education policy? International Journal of Comparative Education and Devel-
opment, 19(1), 1–17. https://doi.org/10.1108/IJCED-12-2016-0023
Carvalho, L. M., & Costa, E. (2009). Production of OECD’s ‘programme for
international student assessment’: Final report. Project KNOWandPOL, WP
11, March. www.knowandpol.eu/IMG/pdf/o31.pisa.fabrication.pdf
Carvalho, L. M., Costa, E., & Sant’Ovaia, C. (2020). Depicting the faces of
results-oriented regulatory processes in Portugal: National testing in policy
texts. European Educational Research Journal, 19(2), 125–141. https://doi.
org/10.1177/1474904119858799
Conselho Nacional de Educação. (2013). Avaliações internacionais e desem-
penho dos alunos portugueses. Conselho Nacional de Educação. www.
cnedu.pt/content/edicoes/seminarios_e_coloquios/LIVRO_Avaliacoes_
internacionais.pdf
Hopfenbeck, T., Lenkeit, J., Masri, Y., Cantrell, K., Ryan, J., & Baird, J.-A.
(2018). Lessons learned from PISA: A systematic review of peer-reviewed
articles on the programme for international student assessment. Scandinavian
Journal of Educational Research, 62(3), 333–353. https://doi.org/10.1080/00
313831.2016.1258726
Hopmann, S. T. (2008). No child, no school, no state left behind: Schooling in the
age of accountability. Journal of Curriculum Studies, 40(4), 417–456. https://
doi.org/10.1080/00220270801989818
Kreiner, S., & Christensen, K. B. (2014). Analyses of model fit and robust-
ness. A new look at the PISA scaling model underlying ranking of countries
according to reading literacy. Psychometrika, 79(2), 210–231. https://doi.
org/10.1007/s11336-013-9347-z
Marôco, J. (2020). International large-scale assessments: Trends and effects on
the Portuguese public education system. In H. Harju-Luukkainen, N. McEl-
vany, & J. Stang (Eds.), Monitoring student achievement in the 21st cen-
tury. European policy perspectives and assessment strategies (pp. 207–222).
Springer. https://doi.org/10.1007/978-3-030-38969-7_17
Mullis, I., Martin, M., Kennedy, A., Trong, K., & Sainsbury, M. (2009). PIRLS
2011 assessment framework. TIMSS & PIRLS International Study Center,
Lynch School of Education, Boston College. https://timssandpirls.bc.edu/
pirls2011/framework.html
Organisation for Economic Co-operation and Development. (2009). PISA
data analysis manual SAS second edition. OECD. https://doi.org/10.1787/
9789264056251-en
Organisation for Economic Co-operation and Development. (2013). PISA 2012 –
Released mathematics items. www.oecd.org/pisa/pisaproducts/pisa2012-
2006-rel-items-maths-ENG.pdf

Organisation for Economic Co-operation and Development. (2014). PISA 2012
technical report. OECD.
Rémond, M. (2006). Éclairages des évaluations internationales PIRLS et PISA
sur les élèves français. Revue Française de Pédagogie, 157, 71–84. https://doi.
org/10.4000/rfp.433
Rosa, V., Maia, J. S., Mascarenhas, D., & Teodoro, A. (2020). PISA, TIMSS
e PIRLS em Portugal: uma análise comparativa. Revista Portuguesa de Edu-
cação, 33(1), 94–120. http://doi.org/10.21814/rpe.18380
Sjøberg, S. (2015). PISA and global educational governance – A critique of
the project, its uses and implications. Eurasia Journal of Mathematics, Sci-
ence & Technology Education, 11(1), 111–127. http://doi.org/10.12973/
eurasia.2015.1310a
Sjøberg, S. (2019). The PISA syndrome – How the OECD has hijacked the way
we perceive pupils, schools and education. Confero, 7(1), 12–65. https://doi.
org/10.3384/confero.2001-4562.190125
UNESCO. (2019). The promise of large-scale learning assessments: Acknowl-
edging limits to unlock opportunities. UNESCO. https://unesdoc.unesco.
org/ark:/48223/pf0000369697?posInSet=1&queryId=590023c7-2ff8-
4ba5-bd73-8f3ccc0242f4
Waldow, F., & Steiner-Khamsi, G. (Eds.). (2019). Understanding PISA’s attrac-
tiveness: Critical analyses in comparative policy studies. Bloomsbury Aca-
demic. https://doi.org/10.1080/02680939.2020.1759494
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21(2), 245–266. https://doi.org/10.1007/
s10833-019-09367-x
6 International large-scale assessment
Issues from Portugal's participation in TIMSS, PIRLS and ICILS
Vítor Rosa

Introduction
International large-scale assessment studies produce information
and indicators on the knowledge and skills of students from differ-
ent education systems. These evaluations, in the field of education,
have acquired great importance in recent decades. Governments from
various political quarters started to use the results of these studies,
with the aim of improving investments and achieving better school
performance.
Although international large-scale assessments (ILSAs) are currently
considered by many to be a regular feature of the education landscape,
they are a relatively recent phenomenon. Their origins can be traced
back to the pilot survey of the International Association for the Evalua-
tion of Educational Achievement (IEA) regarding student performance
assessment conducted in the 1960s (Rosine & Postlethwaite, 1994).
Since then, there have been significant developments. The number
of organizations responsible for the development, management and
administration of ILSAs has grown from one to seven major actors:
IEA, the Conference of Ministers of Education of French-Speaking
Countries (CONFEMEN), the Inter-American Development Bank
(IDB), the Organization for Economic Cooperation and Development
(OECD), the Southern and Eastern Africa Consortium for Monitor-
ing Educational Quality (SACMEQ), the United Nations Educational,
Scientific and Cultural Organization (UNESCO), and the World Bank
(Lietz et al., 2017; Wagemaker, 2014). Currently, it is estimated that
about 70% of the world’s countries participate in ILSAs (Lietz et al.,
2017).
Measurement methodologies have evolved considerably. In the ILSAs
Programme for International Student Assessment (PISA), Progress in
International Reading Literacy Study (PIRLS) and Trends in Interna-
tional Mathematics and Science Study (TIMSS), the roots are grounded
on long-term trend assessment methodologies in the National Assessment
DOI: 10.4324/9781003255215-7
of Educational Progress (NAEP) in the United States of America (USA).
Its methodologies were adapted and expanded to meet the challenges
of these and other studies that assess educational performance beyond
national borders, ensuring, for example, the comparability of test items
when operating with an increasing number of participating countries and
languages (Rutkowski et al., 2010).
Education reform efforts have long focused on the challenge of ensuring
equity in schooling. However, with greater
recognition of the effects of globalization and economic competi-
tiveness and a greater concern for the fairness of learning outcomes
(e.g. in terms of what students know and can do), ILSAs are increas-
ingly seen as a necessary condition for monitoring and understand-
ing the results of the significant investments that all nations make in
education.
Over the years, the organizations that administer the ILSAs have devel-
oped technical standards with minimum requirements, and result reports
for the study cycles are usually accompanied by comprehensive technical
documentation, which provides guidance for data interpretation and the
implementation of secondary analyses.
Portugal has participated in some of these studies, which are promoted
by several international organizations: TIMSS since 1995, PISA since
2000, PIRLS since 2011 and the International Computer and Informa-
tion Literacy Study (ICILS) since 2013. In addition to Portugal’s posi-
tion on the international scene, participating in these studies allows the
gathering of information about the education system. The data collected
then have a bearing in the definition and implementation of education
policies.
Promoted by the OECD, PISA is perhaps the best known of these studies
and the most prominent in the media. However, there are other large-
scale studies, namely those promoted by the IEA, founded in the 1950s,
whose influence on the evolution of education systems is well documented.
These studies are not only instruments that influence political decision-
making in the education sector of several countries; they also provide
unique data that prompt researchers around the world to develop
secondary analyses (Olsen & Lie, 2006).
This chapter seeks to appraise, in general terms, three of them, in
which Portugal participates1: TIMSS, PIRLS and ICILS.

Trends in International Mathematics and Science Study (TIMSS)
TIMSS is a comparative international study that seeks to assess the level
of school knowledge of students of the 4th and 8th grades in mathematics
and science, which, in the view of Marôco et al. (2016a), are considered
domains or literacies that are

essential in the training of students who opt for education paths asso-
ciated with professional areas internationally known as STEM (Sci-
ence, Technology, Engineering and Mathematics).
(pp. 5–6)

In fact, mathematics is objective and universal, going beyond national
and cultural differences. What students know, however, is school
mathematics, and school mathematics is deeply cultural (Schmidt et al.,
1997). Science, in turn, refers to knowledge acquired through the
scientific method.
The scope and complexity of TIMSS are enormous. The success of
this study depends on the collaborative effort between the research cen-
tres of each country responsible for the management of tasks, the sam-
pling procedures of schools and students, and the performance of the
various steps necessary for data processing and analysis (Beaton et al.,
1996). According to Bodin and Grapin (2018), TIMSS is a research pro-
ject that “seeks to clarify the links between the programs and the official
instructions (desired curriculum), the teaching practices (implemented
curriculum) and the competencies of the students (achieved curricu-
lum)” (p. 71).
It is, as seen by Fernandes (2008), a “study influenced by research-
ers interested in understanding relationships between the curriculum, the
contexts in which it is developed and students’ learning” (p. 282). In each
country, students are expected to learn

in accordance with the education policies, the organization of the
education system and the cultural aspects. The organization of cur-
ricular plans, syllabuses, learning objectives and evaluation pro-
cesses, state the results expected by each education system.
(Carvalho et al., 1996, p. 2)

The reference framework for mathematics is articulated around two
domains: contents (numbers, geometric shapes and measurements, pres-
entation of data) and cognitive skills (knowing, applying, reasoning). In
general terms, in TIMSS the design of the cognitive dimensions is simi-
lar to math and science literacies, but the identification of related skills
for each is specific to the domains evaluated. The “knowing” dimension
refers to the facts, concepts and procedures that the student must know.
The “applying” dimension encompasses the student’s ability to employ
their knowledge and understanding of concepts to solve problems or
answer questions. The “reasoning” dimension concerns complex situ-
ations and problems that require several steps until a solution is found,
Table 6.1 Distribution of science items (content area, cognitive dimension
and item type), TIMSS 2015

                                Selection  Construction  Total  % of score
                                items      items
Content     Life Science        39         40            79     46
Area        Physical Sciences   36         28            64     35
            Earth Sciences      23         10            33     19
            Total               98         78            176
            % of score          52         48                   100
Cognitive   Knowing             47         25            72     41
Dimension   Applying            32         35            67     38
            Reasoning           19         18            37     21
            Total               98         78            176
            % of score          52         48                   100

Source: Marôco et al. (2016b)

which involves logical and systematic thinking. The evaluation of the
sciences focuses on the “life sciences”, the “physical sciences” and the
“earth sciences” (Marôco et al., 2016b). On the other hand, the evalua-
tion is also organized according to the typology of the items and by the
distribution of content areas and cognitive dimensions, as presented in
Table 6.1.
In a total of 176 items, we note that there is a slight prevalence of selec-
tion items (which are multiple choice) in relation to construction items
(which require a short response), 98 and 78, respectively. These data
also reveal that the area of earth sciences assumes a smaller number of
items (33), corresponding to 19% of the total percentage of scores. This
aspect shows the lower status that is attributed to this area of knowledge
compared to the areas of life sciences and physical sciences, as already
mentioned in 1995 by Carvalho et al. (1996). This distribution, in the
opinion of Saraiva (2017), “is mirrored by the official programs in Por-
tugal” (p. 9).
Established by the IEA, TIMSS aims to interpret the differences
between education systems, seeking to improve the teaching and learning
of students from participating countries (Drent et al., 2013). Since 1995,
it has been taking place every four years. Portugal participated in the first
edition and then in 2011, 2015 and 2019; regarding the latest edition, the
main results are not yet publicly known.
In addition to TIMSS, the IEA carries out TIMSS Advanced, which
seeks to evaluate the performance of students attending the terminal year
(12th grade) of secondary education in advanced mathematics and
physics. The number of countries in the various TIMSS cycles has been
increasing.
This international large-scale study uses a multi-stage random sampling
process, ensuring that the sample is representative of the target popula-
tion. Knowledge and competencies are estimated by item response theory
models, associating plausible values on a scale from 0 to 1,000 points,
with an average of 500 points and the standard deviation of 100 points.
Each student answers a test, which combines several items of mathemat-
ics and science, and has a duration of about 90 minutes. The items are
confidential, allowing the consortium to compare the results throughout
the study editions and identify the global result trends. At each cycle the
IEA publicly discloses some of these items, which no longer integrate
future tests, so that it is possible to know the questions that students are
asked (IAVE, n.d., 2019).
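The reporting metric described above amounts to a linear transformation of a standardised ability estimate. A minimal sketch (the theta values are invented; operational TIMSS scaling draws several plausible values per student and links cycles over time):

```python
# TIMSS-style reporting scale: centred on 500, standard deviation 100.
SCALE_MEAN = 500.0
SCALE_SD = 100.0

def to_reporting_scale(theta: float) -> float:
    """Map a standardised ability estimate (mean 0, SD 1) to the scale."""
    return SCALE_MEAN + SCALE_SD * theta

# Invented ability estimates for five students.
thetas = [-1.2, -0.3, 0.0, 0.8, 1.5]
print([to_reporting_scale(t) for t in thetas])
# [380.0, 470.0, 500.0, 580.0, 650.0]
```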
Portugal was involved in several TIMSS editions: 1995 (with 4th- and
8th-grade students), 2011, 2015 and 2019 (with 4th-grade students). In
2015, Portuguese students in the 12th grade participated in TIMSS
Advanced, with evaluation in mathematics and physics. The results have
been positive: from 442 points (in 1995) to 525 (in 2019). Boys have been
obtaining a higher average score than girls: boys, 444 in 1995 and 533 in
2019; girls, 440 in 1995 and 516 in 2019.

Progress in International Reading Literacy Study (PIRLS)
PIRLS is an extensive international assessment of the knowledge as well
as curricular and school skills of students around the world attending
the 4th grade in the field of reading literacy. The reason for this choice,
according to Sim-Sim (2013), has “to do with the stage of schooling in
which all basic mechanisms of learning to read must be consolidated”
(pp. 72–73).
The concept of literacy emerged in Portugal in the 1990s, more specifi-
cally with the study by Benavente et al. (1996). It is related to the abilities
pertaining to each individual’s use and interpretation of written informa-
tion. Therefore, it refers to daily practices and competences, regardless of
school levels. Currently, the notion of literacy is used very broadly, refer-
ring to different fields (scientific literacy, computer literacy, sports lit-
eracy, among others) (Ávila, 2005). In the case of PIRLS, reading literacy
defined by the IEA is understood as “the ability to understand and use
the written language forms required by society and/or valued by the indi-
vidual” (IAVE, 2017, p. 3). The framework of this international survey
focuses on the diversity of reading experiences that students may experi-
ence, be it at school or in their daily lives, building meaning regarding a
wide variety of texts. As understood in PIRLS, reading comprehension
combines two reading objectives: reading to have literary experience,
and reading to acquire and use information. Literary texts are complete
narratives accompanied by illustrations. These texts should familiarize students
with plot, events, actions, character motivations, etc. Informational texts
enable one to address aspects of the real world, covering a wide variety of
subjects. They are accompanied by structuring and illustrative elements,
such as diagrams, letters, illustrations, photographs, lists, and tables,
among others.
In PIRLS, the population studied does not correspond to a generation
of students of a defined age, as is the case of the OECD’s PISA Programme.
The goal is to measure the performances of the set of students present at
a particular level of education, regardless of their age, their path and the
organization of the education system.
The countries/regions that participate in the study identify the strengths
and weaknesses of students’ reading skills. As mentioned earlier, Portugal
participated in 2011 and then in 2016 (3rd and 4th editions). In the 2016
edition, the study integrated a new dimension in the evaluation of read-
ing literacy: online (ePIRLS), involving 14 countries, including Portugal.
In the PIRLS/ePIRLS studies, through paper and digital media, infor-
mation about students, their families, teachers and schools is collected,
contextualizing “the reading learning opportunities as well as identifying
factors that influence these opportunities” (IAVE, 2017, p. 3). Within the
framework of PIRLS, the assessment protocol (texts and items) is based
on the intersection of comprehension processes and reading objectives
(Campbell et al., 2001).
The use of reading serves two purposes: literary experience and the
acquisition and use of information.
Ferreira and Gonçalves (2013) emphasize that

PIRLS’s assessment of reading literacy is based on a comprehensive
notion of what it is to know how to read, a notion that includes the
ability to reflect on what is read and to make use of it to achieve indi-
vidual and social goals. Thus, the conceptual assessment framework
was designed to contemplate reading comprehension purposes and
processes that give meaning to this concept of reading literacy.
(p. 12)

The PIRLS 2016 evaluation integrated four types of reading comprehension
processes within each of the purposes: to locate and extract explicit
information; to make direct inferences; to interpret and integrate ideas
and information; to analyse and assess content and textual elements
(IAVE, 2017). PIRLS protocols are based on long, integral texts (eight
in total in PIRLS 2011), each followed by about a dozen questions. The
exercises are short, since several factors must be articulated to derive
the items corresponding to the defined characteristics (Rémond, 2006,
2007).2 A complex methodology
conditions the construction and organization of materials, elaborated in
English, to form the “source version”, with the common basis being used
for translations in other languages (Araújo et al., 2016; Marôco, 2018;
Rémond, 2007).
Although with different objectives, PIRLS has “many similarities,
both in design and in methodologies” (Rémond, 2006, p. 71) with other
international studies, namely PISA, carried out by the OECD. As with
the other studies, at the end there is a scale of comprehension results
elaborated from item response models (IRM), whose international
average is 500 and the standard deviation is 100. In this way, participating
countries can be classified on the basis of a common dimension. Unlike
PISA, in PIRLS (and in TIMSS) the assessment of skills is distributed
across four performance levels: Low level (from 400 to 474
points), Intermediate level (from 475 to 549 points), High level (from
550 to 624 points) and Advanced level (625 points or more). Since the
percentages are cumulative, students who have reached the Advanced
level, for example, have also reached previous levels. It should be noted
that, with this rating, students who score less than 400 do not even reach
the Low level.
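The benchmark bands just described can be expressed as a small lookup, sketched here from the cut-off points above (the "Below Low" label is ours, for scores under 400):

```python
# PIRLS performance benchmarks, highest cut-off first.
BENCHMARKS = [
    (625, "Advanced"),
    (550, "High"),
    (475, "Intermediate"),
    (400, "Low"),
]

def pirls_benchmark(score: float) -> str:
    """Return the highest benchmark reached; scores under 400 reach none."""
    for cutoff, label in BENCHMARKS:
        if score >= cutoff:
            return label
    return "Below Low"

print(pirls_benchmark(541))  # Portugal's 2011 mean score: 'Intermediate'
print(pirls_benchmark(625))  # 'Advanced'
```

Because the bands are cumulative, a student classified at a given level has, by construction, also reached every lower level.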
A study by Lafontaine (2008) allows us to point out some pedagogical
characteristics that distinguish countries with lower scores:

• The tendency to teach rather than to promote the ability to understand;
• The teaching of comprehension strategies is far from the practice of
the daily life of classes;
• The little time devoted to formal teaching of reading in the 4th grade;
• Children’s books are not considered essential teaching material;
• The irregularity of reading more extensive books;
• The predominance of traditional assessment methods using multiple-
choice questionnaires and open questions that require short written
answers.

Rémond (2007) states that “the results of PIRLS lead us to think that the
tasks proposed to the students by the school are very limited” (p. 67).
Students feel “bewildered” by the complex and successive questions,
which must be answered without the teacher’s help (Rémond, 2007).
Regarding the Portuguese reality, and in a more comprehensive manner,
Benavente (2016) highlights several constraints at school level: lecture-
based classes, high number of students per class, needs dissociated
from the reality of the children and young people, economic competition
between schools, dismissal of thousands of teachers and support staff,
curriculum and syllabus reformulation, long and unsuitable syllabus,
obstacles to the integration of children and young people with specific
educational needs, marginalization of other subjects (sports, artistic
education, civic education, environmental education). Policies and practices
are irregular and change according to governments, educational actors
and schools.
Portugal participated in this study for the first time in 2011 (the third
edition of PIRLS) and its latest participation was in 2016. Comparing the
overall results of this study in these two years, we can observe that there
was a decline in the assessment of the performance of Portuguese students
in the 4th grade. Compared to 2011, the score represents a 13-point drop
(from 541 to 528). In the ordered results scale, between the two years
in question, Portugal went from 19th to the 30th place of the ranking.
Regarding the distribution of PIRLS results by gender, we find that girls
have better results than boys in reading in 2011 and 2016 (boys: 534 in
2011; 527 in 2016; girls: 548 in 2011; 529 in 2016).

International Computer and Information Literacy Study (ICILS)
Carried out every five years, ICILS assesses the competencies of
8th-grade students in ICTs.3 It stems from a question: are students
well prepared to study, work and live in the digital world? The study
focuses on two key areas: Computer and Information Literacy (CIL) and
Computational Thinking (CT). In addition to influencing political
decision-making, the conclusions are expected to have an impact on the
work carried out by schools, consequently improving students'
educational success (Fraillon et al., 2020). From the perspective of Vanda
et al. (2019):

This is a study that assesses domains considered to be very important
for the development of students, both from the point of view
of contributing to curriculum development, of knowledge and
disciplinary knowledge, and from that of the socio-cognitive and
metacognitive development.
(p. 7)

When compared to other studies (PISA, TIMSS and PIRLS), the number
of participating countries/regions is lower. In 2018, 12 countries
participated in the CIL evaluation (Chile, Denmark, the United States of
America, Finland, France, Germany, Italy, Kazakhstan, the Republic of
Korea, Luxembourg, Portugal and Uruguay), along with two benchmarking
regions (Moscow, Russian Federation, and North Rhine-Westphalia,
Germany). Eight countries (Denmark, Finland, France, Germany, Portugal,
Luxembourg, the United States and the Republic of Korea) and one region
(North Rhine-Westphalia, Germany) participated in the CT assessment.4 In
total, the ICILS study gathered information from 46,561 8th graders and
26,530 teachers in 2,226 schools.
In relation to the numerical scale, the IEA follows the same structure
as in TIMSS and PIRLS: scores range between 0 and 1,000 points, with a
fixed central point at 500 points (average performance) and a standard
deviation of 100 points.
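The actual scaling is produced through item response theory modelling; purely as an illustration of the reporting convention described above (the function name and the clipping to the published 0–1,000 range are our own assumptions, not the IEA's procedure), the fixed-point scale can be sketched as:

```python
def to_iea_scale(z_score: float) -> float:
    """Illustrative conversion of a standardized score (mean 0, SD 1)
    to the IEA reporting scale used by TIMSS, PIRLS and ICILS:
    centre point 500, standard deviation 100, bounded to 0-1,000."""
    return min(1000.0, max(0.0, 500.0 + 100.0 * z_score))
```

On this reading, a student one standard deviation above the international mean would be reported at 600 points.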
ICILS is structured within a conceptual frame of reference, in which
the analysis dimensions and content areas assessed in the two domains
under consideration (CIL and CT) are defined. The test5 consists of
tasks of different levels of difficulty, as well as levels of performance
proficiency: proficiency level 1 covers scores between 407 and 491
points, level 2 between 492 and 576 points, level 3 between 577 and 661
points, and level 4 scores higher than 661 points.6
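These bands amount to a simple threshold lookup. A minimal sketch, using the boundaries reported by IAVE (2019) (the function name and the treatment of scores below 407, to which the report assigns no level, are our own choices):

```python
def cil_proficiency_level(score: float) -> int:
    """Map an ICILS scale score to a CIL proficiency level (1-4),
    using the band boundaries reported by IAVE (2019).
    Returns 0 for scores below 407, which fall below level 1."""
    if score > 661:
        return 4
    if score >= 577:
        return 3
    if score >= 492:
        return 2
    if score >= 407:
        return 1
    return 0  # below level 1 (no level assigned in the report)
```

On this reading, Portugal's national CIL average of 516 points, reported below, falls within proficiency level 2.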
According to the IEA, CIL “refers to an individual’s ability to use
computers to research, create and communicate, in order to actively
participate in contemporary societies, whether at home, at school, in the
workplace and in community and educational contexts” (IAVE, 2019,
p. 23).
As far as CT is concerned, the IEA defines it as follows:

It refers to an individual's ability to recognize aspects of real-world
problems appropriate for computational formulation, as well as
their ability to evaluate and develop algorithmic solutions to these
problems, which can be operationalized on a computer.
(IAVE, 2019, p. 25)

Table 6.2 provides more detailed information on the dimensions and
content areas (CIL and CT).
There are several definitions of ICT. Ricoy and Couto (2012) note
that the term ICT emerged in the late 1990s and that ICTs "are constituted
by technical means to manipulate information and promote communica-
tion, including hardware and software" (p. 244), associated with com-
puter networks. ICTs are also linked to telecommunications as a means
of disseminating communication. For Blurton (1999), ICTs consist of a
diversity of technological tools and resources used to communicate,
create, disseminate and obtain information. Spanhel (2008), in turn,
clarifies that ICTs are technological or electronic means based on the
principles of digitization and networking; in the educational sector,
references to these devices concern new information and communication
techniques.
In ICILS, Portugal recorded an average score of 516 points in CIL,
placing it above the international average of ICILS 2018 (496 points).
In the case of CT, at national level, 482 points were obtained, 18 points
below the international average. As in TIMSS and PIRLS, in ICILS there
Table 6.2 Dimensions and content areas (CIL and CT) and percentages

CIL dimensions             %    CIL content areas                            %

Understanding              14   Learning the fundamentals of computer use     2
  computer use                  Knowing the conventions of computer use      12
Collecting information     25   Accessing and evaluating information         15
                                Managing information                         10
Producing information      50   Transforming information                     20
                                Creating information                         30
Communicating digitally    11   Sharing information                           8
                                Using information responsibly and securely    3

CT dimensions              %    CT content areas                             %

Conceptualizing problems   41   Knowing about and understanding              18
                                  digital systems
                                Formulating and analysing problems           10
                                Collecting and representing relevant data    13
Operationalizing           59   Planning and evaluating solutions            33
  solutions                     Developing algorithms, programs              26
                                  and interfaces

Source: IAVE (2019)
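As a quick consistency check on Table 6.2 (the percentage figures are those reported above; the dictionary structure is ours and the labels are lightly normalized), the content-area weights within each domain sum to 100%:

```python
# Content-area weights from Table 6.2 (IAVE, 2019).
cil_content_areas = {
    "Learning the fundamentals of computer use": 2,
    "Knowing the conventions of computer use": 12,
    "Accessing and evaluating information": 15,
    "Managing information": 10,
    "Transforming information": 20,
    "Creating information": 30,
    "Sharing information": 8,
    "Using information responsibly and securely": 3,
}
ct_content_areas = {
    "Knowing about and understanding digital systems": 18,
    "Formulating and analysing problems": 10,
    "Collecting and representing relevant data": 13,
    "Planning and evaluating solutions": 33,
    "Developing algorithms, programs and interfaces": 26,
}

# Each domain's content areas account for the full assessment weight.
assert sum(cil_content_areas.values()) == 100
assert sum(ct_content_areas.values()) == 100
```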

is gender differentiation too. In all countries, girls had better average
results than boys in CIL. Portugal follows the international trend, with
girls obtaining 522 points and boys 511 points. In the CT evaluation,
the international trend was inverted, with boys achieving the higher
average results. At national level, boys achieved an average score of
490 points, 16 points above the score obtained by girls.

Explanatory factors of the results of TIMSS, PIRLS and ICILS in Portugal
A study recently published by the Portuguese National Education Council
(CNE) seeks to identify some of the factors that explain Portuguese
students' performance in the three main literacies (reading, mathematics
and science), with reference to TIMSS 2015 and PIRLS 2016. The authors
of this study (Félix et al., 2020) investigate how these factors can
promote equal opportunities in access to education, and how they explain
differences in students' PIRLS performance and, consequently, in their
school performance. To this end, they compared the results of several
European countries (Finland, Norway, the Netherlands, Poland, Germany,
Slovakia, Spain, Italy, France, Ireland and Portugal). Compared to these
other European countries, Portugal has a longer period of compulsory
education.7 The following are some of the explanatory factors:

1) Students from families with high family capital (this indicator inte-
grates the level of education, the professional qualification of par-
ents, the books available at home, with an emphasis on children’s
books, study support materials)8 perform better than students from
families with fewer socio-economic resources. This issue does not
only concern TIMSS and PIRLS. PISA 2018 and TIMSS 2015 also
revealed that socio-economic status is a strong predictor of Portu-
guese students’ performance.
2) The greater students' mastery of basic literacy (and numeracy) before
entering school, the more likely they are to perform well in the 4th
grade.
3) Sustained attendance of early childhood education and care
programmes is important for students from families with lower
socio-economic resources. In the case of Portugal, the authors report
that attendance for three or more years represents a significant
increase in Reading performance for students with "Few or some
resources", but does not have a statistically significant effect for
the group with "Many resources" (p. 11).
4) Students who trust their skills more are the ones who achieve the best
results in the three main literacy domains.
5) Of the different European countries, Portugal has the highest per-
centage of students from schools from disadvantaged backgrounds,
achieving, in all areas, scores above the international average. The
results of Portuguese students, when compared with those of other
countries, “suggest a good capacity of the education system to
reduce the differences derived from diverse socio-economic contexts”
(p. 12).
6) Schools that are more focused on school success allow their stu-
dents to perform better. In fact, school climate is an important
predictor.
7) Students attending so-called very safe and organized schools are
more represented in affluent socio-economic backgrounds. In this
respect, the indicator “Discipline Problems” is a good predictor of
reading performances.
The ICILS studies (2013 and 2018) highlight that being born into a digital
world does not necessarily mean that someone is digitally competent.
Contrary to the common view that today's young generation is a
generation of "digital natives", the findings of the first two cycles of
ICILS indicate that young people do not develop sophisticated digital
skills; only their use of digital devices grows. On the other hand, there
is great variation between countries regarding the achievement of
information literacy. The focus should be not only on young people with
low socio-economic resources, but also on those with higher levels of
proficiency in digital competence. There is also gender differentiation
in the use of ICTs: girls perform better than boys in CIL, but this
differentiation is less evident in the CT assessment. The results of
ICILS also suggest the need for a holistic approach to the pedagogical
use of ICTs in schools. Providing students and teachers with ICT
equipment is not enough to improve their digital skills; they should be
encouraged and supported in the use of digital tools.
In a recent article, we compared PISA, TIMSS and PIRLS in Portugal
(Rosa et al., 2020). The article aimed to extend the information
available on these evaluations and to ascertain whether it is possible
to compare them, taking into account their objectives. It drew on the
known global data of the participating countries, with a particular
emphasis on the data concerning Portugal. It was found that the results
obtained by students are not the same all over the country, with
differences between the various regions. The domains that are assessed
(e.g. reading, mathematics, science and physics) do not seem to be
determinant of the results of each study. The irrelevance of the domain
tells us that regions have good or poor results regardless of the domain
concerned, and the relevance of the assessment object indicates that
competence at school may have little to do with life skills.

Conclusion
In recent decades, large-scale evaluation has acquired great importance
in the field of education. The main goals of ILSAs, especially those
undertaken in the school context, are to improve the quality and equity
of education and to respond to growing global demands regarding the
investments made in educational provision. In general, ILSAs share
common objectives that explicitly or implicitly include one or more of
the following elements:

• Provision of high-quality data to improve policymakers' understanding
of the main school and non-school factors that influence teaching and
learning;
• Provision of high-quality data as a resource to identify areas of
concern and action, and to prepare and evaluate education reforms;
• Development and improvement of the capacity of education systems
to commit to national strategies of educational monitoring and
improvement.

ILSAs can be divided into two categories: assessments based on school
programmes and competency-based assessments. The former assess the
extent to which students have grasped the syllabus, while the latter
assess the extent to which students can put it into practice in real life
(Addey & Sellar, 2019). Several studies have been implemented, leading
to an increasing number of countries and regions/economies joining
them. Large-scale assessments strive to contribute to improving the
quality of education by seeking to measure competencies and skills, but
they are also continuous monitoring tools that enable problematic
conditions to be identified. When the constraints are identified, new
pedagogical possibilities are proposed in the school. The results of the
tests indicate possible flaws in the process that result in the
non-mastery of certain competencies and skills that must be developed
within the school. Several specialized journals and works echo these
studies, insisting on their objectives, the methods used and their
results. The media disclose the results through rankings, but provide
very little clarification regarding the processes of these studies.
Portugal has been participating in several international large-scale
studies, enabling information to be collected about the education system
as well as the socio-economic context of the families and the personal
context of the students. These data then influence the definition and
implementation of education policies.

Notes
1 In a recently published article, we sought to compare PISA, TIMSS and PIRLS
(Rosa et al., 2020).
2 The IEA publicly disclosed three reading assessment units that were part of
the PIRLS test and two assessment units of the ePIRLS test administered in
2016. For 2011, the consortium also released a set of items. IAVE
compiled the information of the Portuguese version in two documents (IAVE,
2018). Context questionnaires for the 2011 PIRLS are available on the IAVE
website: www.iave.pt/index.php/estudos-internacionais/pirls/instrumentos-
de-avaliacao (accessed on 09/09/2020).
3 In Portugal, ICT is a mandatory subject for students from the 5th to 9th
grade. Curricular skills are organized in four fields: 1) digital citizenship; 2)
investigating and researching; 3) communicating and collaborating; 4) creat-
ing and innovating. The Directorate General for Education (DGE) has been
promoting in several school years (2015–2017) introduction to programming
initiatives, aimed at students of the 3rd and 4th grades. In 2017/2018, this
subject received the name “Probótica” (Programming and Robotics).
4 In ICILS 2018, CT was considered an optional domain.
5 The test consists of five modules (with questions and tasks), each
lasting 30 minutes. The CIL test takes 60 minutes. The CT evaluation
was organized in two modules of 25 minutes each.
6 For more detailed information on the proficiency levels, scale intervals, level
characteristics and examples, see the IAVE report (2019, pp. 27–30).
7 In Portugal, Law 85/2009, of 27 August, defined the extension of the age of
compliance with compulsory schooling to 18, and enshrined the universality
of pre-school education for children from the age of five.
8 It should be noted that “capital”, in Bourdieu’s theory (1986), is synonymous
with power, consisting of economic, cultural and social assets that reproduce
and foster social mobility.

References
Addey, C., & Sellar, S. (2019). Cela en vaut-il la peine? Raisons de participa-
tion (ou non) aux évaluations internationales à grande échelle des apprent-
issages. Recherche et Prospective en Éducation, 24. https://unesdoc.unesco.
org/ark:/48223/pf0000368421_spa
Araújo, L., Costa, P., & Folgado, C. (2016). Avaliação da leitura: PIRLS 2011.
In F. Azevedo & Â. Balça (Coord.), Leitura e educação literária (pp. 15–30).
Pactor – Edições de Ciências Sociais, Forenses e da Educação.
Ávila, P. (2005). A literacia de adultos: competências-chave na sociedade do con-
hecimento. Tese de doutoramento em Sociologia. ISCTE.
Beaton, A., Martin, M., Mullis, I., Gonzalez, E., Smith, T., & Kelly, D. (1996).
Science achievement in the middle school years. Boston College/International
Association for the Evaluation of Educational Achievement (IEA).
Benavente, A. (2016, outubro). O ‘dia’ seguinte: O que a Troika fez à escola. Le
Monde Diplomatique – edição portuguesa, 8–9.
Benavente, A. (Coord.), Rosa, A., Costa, A., & Ávila, P. (1996). A literacia em
Portugal – resultados de uma pesquisa extensiva e monográfica. Fundação
Calouste Gulbenkian/Conselho Nacional de Educação.
Blurton, C. (1999). New directions in education. In M. Tawfik (Org.), The world
communication and information (pp. 46–61). UNESCO.
Bodin, A., & Grapin, N. (2018). Un regard didactique sur les évaluations du
PISA et de la TIMSS: mieux les comprendre pour mieux les exploiter. Mesure
et évaluation en éducation, 41(1), 67–96. https://doi.org/10.7202/1055897ar
Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.). Handbook of
theory and research for the sociology of education (pp. 241–258). Greenwood.
Campbell, J., Kelly, D., Mullis, I., Martin, M., & Sainsbury, M. (2001). Frame-
work and specifications for PIRLS assessment 2001: Progress in international
reading literacy study. Boston College/IEA – International Association for the
Evaluation of Educational Achievement.
Carvalho, J., Amaro, G., Reis, P., & Neres, F. (1996). Terceiro estudo internac-
ional em matemática e ciências (TIMSS): semelhanças num contexto de difer-
enças. Instituto de Inovação Educacional.
Drent, M., Martina, M., & Fabienne, K. (2013). The contribution of TIMSS
to the link between school and classroom factors and student achievement.
Journal of Curriculum Studies, 45(2), 198–224. https://doi.org/10.1080/00220272.2012.727872
Félix, P., Perdigão, R., & Lourenço, V. (2020). Desempenho e equidade: uma
análise comparada a partir dos estudos internacionais TIMSS e PIRLS. Con-
selho Nacional de Educação (CNE).
Fernandes, D. (2008). Algumas reflexões acerca dos saberes dos alunos em Por-
tugal. Educ. Soc., Campinas, 29(102), 275–296. https://www.scielo.br/j/es/a/
WPX3N4SV7y7SBD5pqnTKkJP/?lang=pt&format=pdf
Ferreira, A., & Gonçalves, C. (2013). TIMSS & PIRLS 2011 – Relações entre
os desempenhos em leitura, matemática e ciências, 4.° ano. ProjAVI Grupo de
Projeto para a Avaliação Internacional de Alunos.
Fraillon, J., Ainley, J., Wolfram, S., Friedman, T., & Duckworth, D. (2020). Pre-
paring for life in a digital world: IEA international computer and information
literacy study 2018 international report. Springer.
Instituto de Avaliação Educativa (IAVE). (2017). Resultados Globais PIRLS
2016 – ePIRLS 2016 – Portugal. Literacia de leitura & literacia de leitura
online. IAVE.
Instituto de Avaliação Educativa (IAVE). (2018). PIRLS 2016 – ePIRLS2016 –
Literacia de leitura & literacia de leitura online. Unidades de avaliação. IAVE.
Instituto de Avaliação Educativa (IAVE). (s./d.). TIMSS 2011 – Trends in Inter-
national Mathematics and Science Study – itens de matemática – 4.° ano, dis-
ponibilizados ao público. IAVE. https://iave.pt/wp-content/uploads/2019/08/
TIMSS_2011_MAT_Itens_Libertos.pdf
Lafontaine, A. (2008). PIRLS 2006. Progress in reading literacy study. Note de
synthèse. Université de Liège.
Lietz, P., Cresswell, J. C., Rust, K. F., & Adams, R. J. (2017). Implementation of
large-scale education assessments. In P. Lietz, J. C. Cresswell, K. F. Rust, & R.
J. Adams (Eds.), Wiley series in survey methodology. Implementation of large-
scale education assessments (pp. 1–25). John Wiley & Sons.
Marôco, J. (2018). O bom leitor: Preditores da literacia de leitura dos alunos por-
tugueses no PIRLS 2016. Revista Portuguesa de Educação, 31(2), 115–131.
https://doi.org/10.21814/rpe.13768
Marôco, J. (Coord.), Lourenço, V., Mendes, R., & Gonçalves, C. (2016a). TIMSS
Advanced 2015 Portugal – Volume I, Desempenhos em matemática e em física.
IAVE.
Marôco, J. (Coord.), Lourenço, V., Mendes, R., & Gonçalves, C. (2016b). TIMSS
2015 – Portugal – Volume I, desempenhos em matemática e em ciências. IAVE.
Olsen, R., & Lie, S. (2006). Les évaluations internationales et la recherche en éduca-
tion: principaux objectifs et perspectives. Revue française de pédagogie, 157, 11–26.
Rémond, M. (2006). Éclairages des évaluations internationales PIRLS et PISA sur
les élèves français. Revue française de pédagogie, 157, 71–84.
Rémond, M. (2007). Que nous apprend PIRLS sur la compréhension des élèves
français de 10 ans? Repères, recherches en didactique du français langue mater-
nelle, 35, 53–72.
Ricoy, M., & Couto, M. (2012). Os recursos educativos e a utilização das TIC
no ensino secundário na matemática. Revista de Educação Portuguesa, 25(2),
241–262.
Rosa, V., Maia, J., Daniela, M., & Teodoro, A. (2020). PISA, TIMSS e PIRLS em
Portugal: uma análise comparativa. Revista Portuguesa de Educação, 33(1),
94–120.
Rosine, L., & Postlethwaite, N. (1994). Les études internationales de l’IEA.
Revue internationale d’éducation de Sèvres, 1, 19–26. https://doi.org/10.4000/
ries.4294
Rutkowski, L., Gonzalez, E., Joncas, M., & von Davier, M. (2010). International
large-scale assessment data: Issues in secondary analysis and reporting. Educa-
tional Researcher, 39(2), 142–151.
Saraiva, L. (2017). A aprendizagem das ciências em Portugal: uma leitura a partir
dos resultados do TIMSS e do PISA. Medi@ções, 5(2), 4–18. http://mediacoes.
ese.ips.pt/index.php/mediacoesonline/article/view/164/pdf_1
Schmidt, W., MacKnight, C., Valverde, G., Houang, R., & Wiley, D. (Eds.).
(1997). Many visions, many aims: A cross-national investigation of curricular
intentions in school mathematics, volume 1. Kluwer Academic Publishers.
Sim-Sim, I. (2013). Os resultados dos alunos portugueses no PIRLS, em leitura, e
as suas implicações para o ensino, para a formação de professores e para o sis-
tema educativo. In Conselho Nacional de Educação (Ed.), Avaliações internac-
ionais e desempenho dos alunos portugueses [Textos do Seminário realizado
no CNE a 25 de março de 2013] (pp. 69–90). CNE.
Spanhel, D. (2008). La importancia de las nuevas tecnologías en el sector edu-
cativo. In M. L. Sevillano (Coord.), Nuevas tecnologías en Educación Social
(pp. 29–52). McGraw-Hill.
Vanda, L., Nunes, A., Amaral, A., Gonçalves, C., Mota, M., & Mendes, R.
(2019). ICILS 2018 – PORTUGAL. Literacia em Tecnologias da Informação
e da Comunicação. IAVE.
Wagemaker, H. (2014). International large-scale assessments: From research to
policy. In L. Rutkowski, M. von Davier, & D. Rutkowski (Eds.), Statistics
in the social and behavioral sciences series. Handbook of international large-
scale assessment. Background, technical issues, and methods of data analy-
sis (pp. 11–36). CRC Press.
7

PISA in media discourse
Prominence, tone, voices and meanings

Ana Carita, Teresa Teixeira Lopo and Vítor Duarte Teodoro

Introduction
The object of this study is a critical exploration of the media
representation of the Programme for International Student Assessment
(PISA) in the Portuguese newspaper Público. We sought to ascertain and
understand, within the Portuguese context, how a national
general-interest daily newspaper of reference covers the process of
Portugal's participation in PISA, the results of PISA and their
respective implications, both in news discourse and in opinion
discourse, in the period from 2001 to 2018.
The interest in the media exploration of PISA derives from acknowl-
edging that the mass media are an element of the global information
society, both by making information available and steering the attention
of target audiences, and by contributing to shaping their beliefs and
value systems and their representation and attribution of importance to the
different current events (Castells, 2007; McCombs, 2005). As stated by
Coe and Kuttner (2018, p. 1), the information press plays “a significant
role in the education policy arena, informing the public about pressing
issues and influencing how such issues are prioritized and understood”.
Therefore, it matters to education research to identify the issues prior-
itized by the media, as well as the omitted ones, and understand how
the former are represented and conveyed to the public debate, with the
possibility of influencing the understanding that media consumers have
of education policies (Gerstl-Pepin, 2007). Nowadays and in our political
context, PISA is a recurring topic in the debates on the policies and the
state of education at a national and global scale, and it is important to
inquire into its presence in and coverage by the media.
PISA is a programme undertaken by the Organization for Economic
Cooperation and Development (OECD), with global ambition in the field
of evaluation and design of education policies (OECD, 2006). The pro-
gramme, which is carried out every three years, had its first edition in
2000, and since then has registered, with small variations, a significant
DOI: 10.4324/9781003255215-8
increase in participating countries. The main objective of PISA is to
measure student performance in the main areas of competencies – reading,
mathematics and science literacy – using representative samples of
randomly selected 15-year-old students from each participating country.
The information and between-country comparisons thus obtained on the
quality and equity of education systems are seen as indicators of the
countries' competitiveness in the global economy, and as powerful
references for governance in education.
Portugal is one of the countries that has participated in PISA since its
first edition in 2000, with results that initially situated the country
well below the OECD average and then progressively improved, as
summarized by Marôco et al. (2016):

In this edition (PISA, 2000), Portugal took the second last position
among the members of the OECD, in reading literacy and science
literacy. . . . In mathematics, Portugal’s average . . . place the country
four places from the base in all domains of the test. Since then,
Portugal has managed to evolve positively in all the domains of PISA,
with significant increases from 2000 to 2003, from 2006 to 2009
and, lastly, from 2012 to 2015.
(p. 9)

Several researchers have addressed the global political effect of PISA
(e.g. Nóvoa & Yariv-Mashal, 2003; Grek, 2009; Martens & Niemann,
2010; Bieber & Martens, 2011). Regardless of any doubts regarding such
policy normalizing effect (e.g. Baird et al., 2016), today it is not possible
to overlook the reference to PISA when considering the direction of the
education policies of participating countries. The voluminous research
that PISA gives rise to in the field of education is, therefore,
understandable (Figazzolo, 2009). Still, studies of how the media communicate
PISA are scant, despite their contribution to the dissemination of the
programme in society (Grey & Morris, 2018; González-Mayorga et al.,
2017; Lemos & Serrão, 2015). Besides, communication with the press is
a topic that deserves particular attention and investment from those in
charge of the programme (Lingard, 2016).
In this way, given the press's contribution to steering society's
attention and representations, PISA's potential political impact in the
field of education, and the limited research that the combination of
these two phenomena – PISA and the media – has so far warranted, it
seemed relevant to us to deepen the research on this relation in one
country, Portugal, and in one newspaper, Público, which, from the
perspective we now adopt, has not yet been the object of research. The
study reported here corresponds to a first level of addressing the
critical exploration of the media representation of PISA,
as it is showcased in Público. At this level of analysis, the main focus was
on the more superficial elements of the articles, with a view to making a
descriptive survey of those features and ascertaining their connection to
issues of meaning.
These general goals materialized in four fields, throughout the six PISA
cycles: (i) the intensity of coverage, (ii) the prominence given to PISA, (iii)
the prevailing tone of the articles and (iv) the authorship of the articles,
that is, the voices credited by the newspaper to inform or comment on
PISA. Additionally, these goals were pursued in three fields that cut
across the previous ones: (v) evolution over time, (vi)
the centrality of PISA in newspaper articles and (vii) the typology of these
articles.
The study is part of a broader project entitled A Success Story? Portugal
and PISA (2000–2015), which is being pursued at the Interdisciplinary
Research Centre for Education and Development (CeiED), of Lusofona
University, funded by the Portuguese Foundation for Science and
Technology (FCT).

PISA in the media: study hypotheses
The studies on the mass media have produced several theories and
constructs regarding the relations of influence and interinfluence between
media, public and political agendas. Among others, the following stand
out: the Agenda Setting Theory and its successive reformulations and
expansions, and the constructs of priming and framing, as dominant
strategies through which media agendas exert their influence in society (Coleman
et al., 2009; McCombs, 2005; McCombs & Shaw, 1993, 1972).
Considering the aforementioned theories and constructs, it makes
sense to question the press on the place PISA has had in their respective
agendas, to what extent and how this presence has manifested itself, and
to disclose the meanings that can be associated with that situation. More
specifically, it makes sense to question the intensity of media coverage
and prominence of PISA and their evolution, as well as to ascertain
the journalistic genres the topic seems to be associated with, or the
permeability of other topics of the media agenda to PISA, or moreover on
the voices that are heard on the subject, or even the more or less positive
tone given to them.
This set of questions guided us in the search for empirical studies that
had considered them as well. The comparison that can be done between
the studies that addressed them is imperfect, since the PISA editions
and the years covered vary, as do the number and nature of the publications,
the size of the sample, the criteria for analysis and the very
journalistic culture of each country. Comparisons between these studies
should, therefore, be made with some degree of caution.
Let us start with the issues of coverage and prominence given to PISA in
the written press throughout the various editions of the programme. The
empirical literature highlights, on the one hand, the existence of
fluctuations in the media coverage of PISA across editions and, sometimes
simultaneously, its progressive increase: on Australia, 2000–2014
(Baroutsis & Lingard, 2016); on Spain, 2001–2014 (González-Mayorga
et al., 2017); on the USA, 2000–2012 (Saraisky, 2015); on Portugal,
2001–2012 (Lemos & Serrão, 2015); and on Germany, Finland, France and
the United Kingdom, 2007–2008 (Dixon et al., 2013).
The fluctuations may be interpreted in light of the coexistence of dis-
semination of PISA results with events that mobilize the national political
and media agenda in some countries more than in others. For instance,
Lemos and Serrão (2015) underline this when discussing the poor cover-
age of the 2003 edition of PISA. The progressive increase in media cover-
age of the programme is, in turn, associated in the cited studies with such
indicators as society’s growing receptivity to PISA, reinforced and framed
by the media agenda, the increasing political impact of the programme
and the impact of the PISA results and rankings on each country’s
image of itself, of the quality and equity of its education system, and the
image it projects. Some studies also highlight the month of December
as the time when publications peak, coinciding with the month the
first reports of each PISA edition are released (e.g. Dixon et al., 2013).
Set against this strong association between the number of news items
and their highest concentration in the month when the first results are
released, Saraisky (2015) reports the progressive presence of PISA in the
months leading up to the release of results:

while PISA results were initially covered in the news as news items
around the triennial release of results, as the idea of PISA became
a taken-for-granted measure of educational excellence in the public
consciousness, it was referenced more consistently throughout the
years.
(p. 36)

The studies that have been cited validate our inquiry into the evolution
of the coverage and prominence given to PISA by the newspaper Público,
assessed both by the number of news items throughout the various PISA
editions and by how each edition is featured: whether it makes the front
page or the first pages, the size of the item, the use of information
graphics when covering the issues, and the addition of opinion columns
on the topic, namely the editorial.
also prompted us to research the matter, namely, the conclusions reached
by Saraisky (2015) on the media coverage of PISA beyond the month
of the first release of results, even in news items that do not feature the
146 Ana Carita, Teresa Teixeira Lopo and Vítor Duarte Teodoro

programme as main topic, and on what that signals regarding the pro-
gressive impact and credibility of PISA in the media and in society.
Thus, on the basis of the studies listed earlier, we would expect to witness a progressive evolution in the media coverage of PISA over the time span under study, translated into a rise in the news items published per year and per PISA cycle, both in items that feature the programme as their main topic and in those that mention it in passing, an increase signalling the growing importance, influence and credibility of the programme within society and on the media agenda (Hypothesis 1: H1). Moreover, in the news items where PISA features as the main topic, we would also expect a progressive intensification of the attention and care given to the media coverage of PISA, translated into an increase in front-page placements, in editorials on the topic, in the size of the articles and in the recourse to relevant graphics, elements that mark the mounting importance, influence and credibility of the programme within society and on the media agenda (Hypothesis 2: H2).
Some of the studies already cited also explicitly explore the more or less positive or negative tone in which PISA or the national results are presented in the analysed articles. The assessment criteria and methods used differ, which reinforces the comparability constraints already noted. We will now consider some of the conclusions of studies that focused on the British, Norwegian, Canadian, Shanghai and Portuguese participations in PISA.
Dixon et al. (2013) look into the presence of “negativity” in news
items published between June 2007 and May 2008, in Germany,
Finland, France and Britain. They observed the presence of negativity
in the four countries, even in those that achieved good results. Although
the proportion of negative pieces was significantly larger in France and
Britain – countries with worse PISA performance – than in Germany
and Finland, the authors still concluded that there was evidence of some
negativity bias in the case of Finland, a country with excellent results.
Studies focusing on the coverage of the results of several PISA editions by the British press underlined their negative tone: Grey and Morris (2018) on the 2012 PISA edition; Baird et al. (2016) on the editions from 2009 to 2012. These studies highlight both some of the negative words and phrases used to characterize the results – for example decline, stagnation, failure – and the association of the dissemination of results with a fierce "football championship, the championship of the PISA league", with its winners and losers: "much angst has been expressed in England by politicians and the media about slippage over time down the PISA performance league table" (Baird et al., 2016, p. 129).
In other countries, the presence of a negative tone is also highlighted, as is the alignment between the tone of the news items and the direction of the PISA results in the respective countries.

PISA in media discourse 147

Thus, in Norway, on the one hand, Baird et al. (2011) concluded that the media echoed the "shock" and disappointment felt by society at the country's results in the 2000 and 2003 PISA cycles, providing a simplistic portrayal of the country as "a loser"; on the other hand, Hopfenbeck and Görgen (2017), on PISA 2015, concluded that the tone of most headlines in Norwegian newspapers followed the positive evolution of the results: more positive, then, than the headlines in 2001. This alignment between the media tone and the PISA results is also noted by Baird et al. (2016) in the Canadian press, comparing the representation of the 2012 results "on the scale of a national emergency" (p. 126) with the representation of the more positive results of previous editions. Baird et al. (2016), in the same study, also report the celebratory tone of the Shanghai press, although it focuses not so much on the region's clearly higher position in the world ranking as on what it indicates about the equity existing among the city's schools and districts.
Regarding the reaction of the Portuguese press to PISA, Lemos and
Serrão (2015), who analysed the media impact of PISA in Portugal in
two newspapers (Diário de Notícias and Expresso) and a newsmagazine
(Visão), from 2001 to 2012, considered, among other aspects, the gen-
eral tone of the articles that addressed this programme. They concluded
that the tone of the press shows a balance of positive, negative and neu-
tral approaches, with some variation among the media analysed and also
some variation associated with the general sense of the PISA results in
the country: 2009, better results, more items with a positive tone; 2003,
worse results, more items with a negative tone. Lemos and Serrão (2015) emphasize that both the headlines and the bodies of the articles showed that the country's PISA results to a large extent set the tone of the news items, even when those results are not the main topic, as is evident from the considerable weight of positivity in the articles involving the 2009 cycle.
In short, the studies reported enable us to conclude that, notwithstanding the presence of negativity, some balance prevails in the tone of news items covering PISA, with the tone following the evolution of results in the respective countries. In most studies, the results seem to be the criterion that ultimately determines the tone of the articles, more than the education systems whose efficiency and equity they supposedly reveal, the governments and their respective policies, or even the programme itself.
The studies we have been citing thus validate our inquiry into the prevailing tone of the articles in Público that have PISA as their main topic. The goal is to identify the more or less positive sense given to the PISA agenda in the newspaper, to speculate on the underlying reasons, and to consider the impact of the media tone on the representations and debates potentially fostered in society. Also in line with the studies alluded to, it is to be expected that our results, covering the six PISA cycles (2000–2015), will point, especially in informational items, to a progressively less pronounced negativity, given the positive evolution of Portugal's results in the programme (Hypothesis 3: H3). As for the opinion items, marked in general by a more critical and evaluative tone determined by the author's perspective and social stance, it is to be expected that, compared with informative items, they will show a stronger presence of negativity and will focus not just on the results but also on the education system and policies, or on governmental action and rulers (Hypothesis 4: H4).
Lastly, regarding the inquiry into the voices which, in the written press, have the power to speak up on social phenomena, recreating them through their discourses, there is a line of questioning aimed at the authors of the texts, particularly of opinion pieces (e.g. Boto, 2011): to whom does the newspaper give credit to state their opinions and thereby influence the public space regarding the social phenomena on its agenda? And what meanings can be inferred, namely in terms of the plurality of the opinions published? The answer to these questions points to issues of power, dominance and hegemony as proposed, for instance, by Critical Discourse Analysis (Dijk, 2015) in the analysis of the press, even when only addressing the more superficial features of its products.
Within this framework, we asked ourselves about the authorship of the opinion published: upon whom, and with what variety and plurality, does the newspaper confer the power and credit to voice opinions on PISA throughout its editions? The point here is not so much to explore what these voices say but rather "who they are and what their place is in the social structure" (Kadushin, 1968, p. 685), and whether this situation somehow constitutes a handover of this influencing power to an elite, that is, to a minority of individuals "who are said to have caused more outputs, or more important outputs" (p. 688).
Among the studies we have been citing, Saraisky (2015) pays particular attention to who has an authorial voice in the press, that is, to whom the legitimacy and power are awarded to speak up in the public space on matters of education policy and, in this way, to influence public debate and policies:

The theoretical literature suggests that elites play a key role in the
policy process. . . . and the analysis of speech acts bore this out.
At the speaker level, the discourse is guided by a mere handful of
elites: the data show that six speakers provide almost 30 percent
of the commentary on PISA in the US. There is virtually no public
voice in the discourse, despite the fact that education is one of the
most public of issues. . . . Teachers, parents and students are almost
non-existent in the discussion. Instead, a small, highly elite group of
policy analysts and researchers drive the discourse (Kingdon, 2011).
(p. 37)

Mobilizing this contribution, it seems only natural to expect that, throughout the time span under study, and particularly in the opinion discourse, the authorial voice on PISA will be concentrated: (i) in a small and stable number of authors, an elite, a situation indicative of little plurality and conducive to conformity in the framing given to the topic over time (Hypothesis 5: H5); (ii) in male authors, with an underrepresentation of the female voice, in contrast to what can be seen in the informational discourse (Hypothesis 6: H6); (iii) and in a representation in which the public voice, particularly that of teachers, students and families or the organizations that represent them, has little authorial presence in the opinion discourse (Hypothesis 7: H7).
It was within the context of the problem presented above that the study of the news items on PISA featured in Público was conducted. It was implemented through the descriptive exploration of, and inquiry into the meanings of, the fields concerning the coverage and prominence the topic achieved on the newspaper's agenda, the tone of that coverage, and the most conspicuous authorial voices.

Method
This study consists of a descriptive, longitudinal and quantitative analysis of the surface characteristics of 184 articles from the Portuguese newspaper Público, published between 4 December 2001 and 31 January 2018. The time frame comprised the articles published from the day the OECD released the reports/volumes of results for each of the PISA cycles between 2000 and 2015 through the two following complete months. PISA results, let us recall, are published one year after the assessment is carried out, usually in December, and are reported in one or more volumes, the electronic versions of which are made available on the OECD website (Table 7.1).
The newspaper articles were obtained by searching the archive of the print edition in PDF format (main section and supplements) of the daily Público (1,153 documents). In each issue, the search combined the terms PISA and OCDE. As a complement, the documental collection of the Hemeroteca Municipal de Lisboa (Lisbon media library) was consulted to bridge any gaps in the digital editions or to replace any faulty files. One hundred and six issues were considered; in those issues, 171 articles featuring PISA in Portugal were identified.

Table 7.1 Time frame of the selected articles

PISA cycle | Date of publication of results by the OECD | Time frame for article collection
2000 | 4 December 2001 | Start: 4 December 2001; Finish: 28 February 2002
2003 | 7 December 2004 | Start: 7 December 2004; Finish: 28 February 2005
2006 | 4 December 2007 (Vol. 1) | Start: 4 December 2007; Finish: 29 February 2008
2006 | 15 September 2008 (Vol. 2) | Start: 15 September 2008; Finish: 30 November 2008
2009 | 7 December 2010 (Vol. 1) | Start: 7 December 2010; Finish: 28 February 2011
2009 | 28 June 2011 (Vol. 2) | Start: 28 June 2011; Finish: 31 August 2011
2012 | 3 December 2013 (Vols. 1–4) | Start: 3 December 2013; Finish: 28 February 2014
2012 | 11 February 2014 (revision of Vol. 1) | Start: 11 February 2014; Finish: 30 April 2014
2012 | 1 April 2014 (Vol. 5) | Start: 1 April 2014; Finish: 30 June 2014
2012 | 9 July 2014 (Vol. 6) | Start: 9 July 2014; Finish: 30 September 2014
2015 | 6 December 2016 (Vols. 1–2) | Start: 6 December 2016; Finish: 28 February 2017
2015 | 19 April 2017 (Vol. 3) | Start: 19 April 2017; Finish: 30 June 2017
2015 | 24 May 2017 (Vol. 4) | Start: 24 May 2017; Finish: 31 July 2017
2015 | 21 November 2017 (Vol. 5) | Start: 21 November 2017; Finish: 31 January 2018

In a second stage, all the print editions (main section and supplements) within the time frame defined for collecting the articles were downloaded. For each edition, an internal search was carried out with the FoxTrot Professional Search software, using the same combination of terms as in the first stage. In these editions, 146 articles on PISA in Portugal and in other participating countries were identified.
The consolidation of the two searches, which also included the splitting and/or retention of articles (e.g. meta news, texts on PISA in other countries) that had not been integrated in the initial mapping, yielded a total of 184 articles.
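As a rough illustration of this consolidation step, the two result sets can be merged and deduplicated on a composite key. A minimal sketch in Python; the record fields, sample headlines and the `consolidate` helper are invented for the example and do not represent the authors' actual tooling or data:

```python
# Illustrative sketch: merging two archive searches into a single corpus.
# Records are deduplicated on (date, page, headline); all names are assumed.

def consolidate(first_search, second_search):
    """Merge two lists of article records, keeping each article once."""
    corpus = {}
    for record in first_search + second_search:
        key = (record["date"], record["page"], record["headline"])
        corpus.setdefault(key, record)  # keep the first occurrence only
    return list(corpus.values())

stage1 = [
    {"date": "2001-12-05", "page": 1, "headline": "Resultados do PISA divulgados"},
]
stage2 = [
    {"date": "2001-12-05", "page": 1, "headline": "Resultados do PISA divulgados"},
    {"date": "2001-12-06", "page": 8, "headline": "OCDE publica relatório"},
]

merged = consolidate(stage1, stage2)
print(len(merged))  # the overlapping article is counted once
```

In the study itself, consolidation also involved manually splitting or retaining borderline items, a judgement that a mechanical merge like this cannot make.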
The 184 collected newspaper articles were analysed in terms of the following variables: time placement; authorship; centrality of PISA; journalistic genre; size; tone; use of tables and charts. The aim was thus to address the study's objectives and hypotheses, inscribed in the fields of analysis: coverage and prominence of PISA in
Table 7.2 Fields of analysis, their articulation with the hypotheses and respective categories

Fields of analysis and hypotheses | Variables and categories
Evolution of PISA coverage (no. of articles with PISA as main topic and with PISA as secondary topic, and by type); H1 | Time placement of the articles: month, year, cycle, period of publication. Centrality of PISA: main topic; secondary topic. Type or journalistic genre of the articles: news story; chronicle/opinion piece; interview; editorial; letter to the editor; other
Evolution of the prominence given to PISA in the articles with PISA as main topic, and by type; H2 | Type or journalistic genre of the articles, with special emphasis on the presence in editorials. Frontpage teasers. Size of the articles. Visual elements: tables and charts
Tone of the articles with PISA as main topic, and by type; H3, H4 | Tone: positive, neutral, negative. Analysis of headlines and leads of the articles. Type or journalistic genre of the articles
Voice of the articles with PISA as main topic, and by type; H5, H6, H7 | Authorship: gender and occupation of the 1st author. Type or journalistic genre of the articles

the newspaper's agenda, the predominant tone of the articles, their authorial voice and, cutting across these, the fields of time distribution of the articles, their types and the centrality of PISA, as presented in Table 7.2. After examining the coverage of PISA in the newspaper's agenda, the study focused on the articles that addressed PISA as their main topic.
The recording in Excel registered the presence/absence of each category of the different variables. The data were analysed by frequencies and percentages across the entire time frame, across the six three-year PISA cycles and, aggregating these, across two nine-year periods, as described in Table 7.2.
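The frequency analysis just described boils down to tallying articles by cycle and aggregating the cycles into the two nine-year periods. A minimal Python sketch, using a hypothetical handful of coded articles (the study itself recorded its data in Excel):

```python
from collections import Counter

# Hypothetical codes: (PISA cycle number, centrality of PISA in the article).
articles = [
    (1, "main"), (1, "secondary"), (4, "main"), (4, "main"), (6, "main"),
]

# Frequency of articles with PISA as main topic, per cycle.
by_cycle = Counter(cycle for cycle, centrality in articles if centrality == "main")

# Aggregate the six three-year cycles into two nine-year periods.
period1 = sum(by_cycle[c] for c in (1, 2, 3))  # 2000, 2003 and 2006 cycles
period2 = sum(by_cycle[c] for c in (4, 5, 6))  # 2009, 2012 and 2015 cycles

total = period1 + period2
print(period1, period2, round(100 * period2 / total, 1))
```

The same tallies, computed over the real 184-article corpus, yield the frequencies and percentages reported in the Results section.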
Of the 184 newspaper articles that constitute the corpus, there are 112
in which PISA is the main topic (60.9%) – when PISA or its results are
clearly a highlighted topic of the article, even if, sometimes, another topic
or other topics may also be relevant – and 72 in which PISA is a second-
ary topic (39.1%) – when the text addresses another topic, including
PISA as a secondary subtopic or merely mentioning it. The articles with
PISA as main topic constitute, in this study, the main focus of analysis. Classification was based on the analysis of the headline and lead of each article, and also on its body whenever ambiguities persisted on those bases.
When categorizing the articles regarding their type or journalistic
genre, following Ricardo (2004) and Lopes (2010), consideration was
given to the groups that journalistic genres are organized into: informa-
tive (news story, interview) and opinion (chronicle/opinion pieces, edito-
rial, letter to the editor), and also the cartoon. It should be noted that
none of the genres exists in editorial practices in a pure state; in other
words, many journalistic texts integrate features from the different gen-
res. It is then the analyst’s task to determine, in each case, which is the
dominant genre (Lopes, 2010). This situation is relevant for the case of
chronicles and opinion pieces, and for this reason, an option was made to
categorize these two genres jointly (Gradim, 2000). The category "other" was also included, for collected articles that did not fit any of the journalistic genres. In the corpus of 184 articles, the most frequent journalistic genres were news stories (90, 48.9%), followed by chronicles/opinion pieces (55, 29.9%); together, these two genres represent 78.8% of the total articles.

Results
To present the results, we will follow a rationale aligned with the fields,
objectives and hypotheses of the study. Thus, we will see the results
regarding the evolution of the media coverage and prominence of PISA,
the prevailing tone of the articles and, finally, the leading authorial voices
in the discourses.
When presenting the results according to their time distribution, we shall consider, within the conditions defined by the sampling options, the monthly and annual distribution when deemed expedient and, always, the distribution by PISA cycle, finally comparing two PISA periods:

• In the first period, we aggregate the three initial cycles (2000, 2003,
2006).
• In the second period, the following three cycles (2009, 2012, 2015).

Media coverage and prominence of PISA in Público


We shall start this section with the evolution of the coverage of PISA by Público throughout the first six PISA cycles, considering first the articles with PISA as the central topic and then those with PISA as a secondary topic. We shall examine the general evolution of the coverage and, afterwards, its evolution by type of article. Finally, to deepen the issues regarding the prominence of PISA in Público, we will consider, within the universe of the articles with PISA as the main topic, their distribution according to their presence in editorials and frontpage teasers, the recourse to infographics, and their size.
It is in December that the highest number of articles is published (97 out of 112, 86.6%). The month with the next highest number is January (7 out of 112, 6.3% of the total). In the remaining months, no more than one or two articles were recorded. This profile is detected in all PISA cycles.
The highest number of articles was published in 2010 (34 out of 112, 30.4%), followed by 2001 and 2016, with about half that number (16 and 17 articles, representing 14.3% and 15.2%, respectively) – see Table 7.3. These three years correspond to years of dissemination of the first results of the respective PISA editions (the 2009, 2000 and 2015 editions, respectively). In all the other PISA editions, the year in which the first results were published is also always the most expressive in number of articles, with 2007, the first year of dissemination of the results of the 2006 edition, presenting the lowest number (6 articles, 5.4% of the total).
The 4th cycle, 2009–2011, shows the highest number of articles (40, 35.7%), while the 2nd and 3rd cycles have the lowest (11 and 7, representing 9.8% and 6.3% of the total, respectively). The remaining cycles (1st, 5th and 6th) present similar numbers of articles (18, 17 and 19, representing 16.1%, 15.2% and 17.0% of all the articles, respectively).
The second period (2009, 2012 and 2015 cycles) gathered over two-thirds of the articles with PISA at the core (76, representing 67.9% of the 112 articles, vis-à-vis 36 in the first period (2000, 2003 and 2006 cycles), 32.1% of the total).
The most frequent journalistic genres were news stories, followed by chronicles/opinion pieces, while the remaining genres had little expression – see Figure 7.1. There were 60 news stories and 29 opinion pieces (53.6% and 25.9%, respectively); jointly, these two genres represented over three-quarters of the 112 articles focusing on PISA as their core subject (79.5%).
In every year of the sample considered, with the exception of 2008, there are news stories on PISA, albeit irregularly distributed. The years in which the results of the PISA editions were published have the highest numbers of news stories, with 2010 taking the lead with 15 articles. Regarding the distribution by three-year cycles, the 4th and 1st cycles stand out (with 20 and 14 news stories, respectively). Considering the two periods, there was a higher number of news stories in the second one: 34, compared with 26 in the first period.
In the first three cycles, chronicles/opinion pieces were absent or their presence was insignificant. In the 4th cycle their frequency increased (12), and it remained expressive in the following cycles (nine and six, in the
Table 7.3 Features of the 112 articles where PISA is the main topic (counts by year, cycle and period)

Cycle (articles per year) | Total | News | Chronicle/Opinion | Interview | Editorial | Letter to the editor | Other | Frontpage teaser | Graphs and/or tables | 1 page | 2 pages | 3 pages | 1/2 page | Less than 1/2 page
1st cycle (2001: 16; 2002: 2) | 18 | 14 | 1 | 0 | 1 | 2 | 0 | 9 | 3 | 1 | 0 | 0 | 5 | 12
2nd cycle (2004: 10; 2005: 1) | 11 | 8 | 0 | 0 | 2 | 0 | 1 | 4 | 3 | 2 | 0 | 0 | 4 | 5
3rd cycle (2007: 6; 2008: 1) | 7 | 4 | 1 | 0 | 0 | 1 | 1 | 0 | 4 | 0 | 0 | 0 | 2 | 5
1st period (2000–2003–2006 cycles) | 36 | 26 | 2 | 0 | 3 | 3 | 2 | 13 | 10 | 3 | 0 | 0 | 11 | 22
4th cycle (2010: 34; 2011: 6) | 40 | 20 | 12 | 2 | 2 | 3 | 1 | 7 | 4 | 4 | 1 | 0 | 18 | 17
5th cycle (2013: 13; 2014: 4) | 17 | 6 | 9 | 0 | 1 | 0 | 1 | 5 | 1 | 5 | 0 | 2 | 6 | 6
6th cycle (2016: 17; 2017: 2) | 19 | 8 | 6 | 1 | 1 | 3 | 0 | 4 | 3 | 7 | 1 | 1 | 1 | 10
2nd period (2009–2012–2015 cycles) | 76 | 34 | 27 | 3 | 4 | 6 | 2 | 16 | 8 | 16 | 2 | 3 | 25 | 33
Total | 112 | 60 | 29 | 3 | 7 | 9 | 4 | 29 | 18 | 19 | 2 | 3 | 36 | 55
Change from 1st to 2nd period | +40 | +8 | +25 | +3 | +1 | +3 | 0 | +3 | −2 | +13 | +2 | +3 | +14 | +11

Figure 7.1 Frequency of the journalistic genres of the 112 articles with PISA as the main subject, by year (in the sample considered) and by PISA cycle

5th and 6th cycles, respectively). There is a very strong contrast between
the first period (2000–2003–2006 cycles) and the second period (2009–
2012–2015 cycles): 2 versus 27. The chronicles/opinion pieces emerge, in
general, in the 1st year of dissemination of the results of the respective
PISA cycles.
Let us now consider the results on the prominence attributed to PISA by Público and its evolution. To this end, we examined, among the articles in which PISA is the main topic, their presence in editorials and in frontpage teasers, their recourse to infographics, and their size.
Thus, in the set of 112 articles:

• Seven editorials (6.3%) were published in the year the first results
were disclosed. There are editorials in all cycles except for the third.
In the first period there are three editorials and in the second, four.
• There are 29 frontpage teasers (25.9%), always occurring in the year the first results were published. With the exception of the 3rd cycle, which had no frontpage teasers, this kind of attention occurred in all cycles, especially the 1st and 4th. The second period has a higher number of frontpage teasers than the first (16 and 13, respectively).
• Eighteen articles (16.1%) resort to graphs and tables to illustrate PISA results, on both national and international data. Articles with this resource feature mostly in the first year of dissemination of the respective results. The distribution of this resource is slightly more expressive in the 3rd and 4th cycles, with four articles each. The first period shows a higher number of articles with graphs and tables: ten, versus eight in the second period.
• There are 55 articles (49.1%) of the smallest size, less than half a page. Among the other sizes, half a page stands out, with 36 articles (32.1%), followed by one page, or approximately so, with 19 (17%). In the distribution by cycle, all cycles include at least one full-page article except the 3rd, which has none. Still, it is in the last three cycles that we observe the highest numbers of one-page articles: the 4th cycle with four (21.1% of the total number of articles of this size), the 5th with five (26.3%) and the 6th with seven (36.8%). Thus, the frequency of longer articles is higher in the second period (16 versus three in the first period, 84.2% versus 15.8% of the articles of the size in question). Two- or three-page articles are scarcely represented – two and three, respectively, in the total set of 112 articles – and can only be found in the second period.

As for the 72 articles where PISA is a secondary topic, we can observe a gradual increase in their number, translated into a sharp contrast between the first and second periods (from 21 to 51 articles), both in news stories (from ten to 20) and in chronicles/opinion pieces (from six to 20). Special note must be made of their dispersion over time, reflected, for instance, in their presence beyond the 1st year of dissemination of results in all PISA cycles except the first.

Prevailing tone of the 112 articles in which PISA is the main topic
In line with the objectives of the study, the analysis of tone focused on the articles with PISA as the main topic, regardless of their journalistic genre. Tone, as inferred from the headline and lead of the articles, was categorized as positive or negative – depending on the favourable or unfavourable bias towards the respective subject – or as neutral, when the information does not allow the orientation towards the object to be clearly perceived. The neutral category covers both more descriptive information, distanced from its object, and information that combines opposing perspectives – see Table 7.4.
Across the 112 articles, the positive tone has the lowest frequency (27 articles, 24.1%); the neutral and negative tones have almost the same frequency: 43 and 42 articles (38.4% and 37.5%, respectively). The positive tone is barely present in the 1st cycle (one article), absent in the following two cycles, and very frequent in the 4th and 6th cycles: 11 positive articles in each (40.7% of the total positive articles, albeit with a higher within-cycle incidence in the 6th). The 5th

Table 7.4 Predominant tone of the 112 articles in which PISA is the main topic

Cycle / genre | Positive | Neutral | Negative | Total
1st cycle (2001–2003): N (%, line) | 1 (5.6) | 5 (27.8) | 12 (66.7) | 18 (100.0)
 (%, column) | 3.7 | 11.6 | 28.6 | 16.1
 News | 1 | 5 | 8 | 14
 Chronicle/Opinion | 0 | 0 | 1 | 1
 Editorial | 0 | 0 | 1 | 1
 Letter to the editor | 0 | 0 | 2 | 2
2nd cycle (2004–2006): N (%, line) | 0 | 6 (54.5) | 5 (45.5) | 11 (100.0)
 (%, column) | 0.0 | 14.0 | 11.9 | 9.8
 News | 0 | 5 | 3 | 8
 Editorial | 0 | 0 | 2 | 2
 Other | 0 | 1 | 0 | 1
3rd cycle (2007–2009): N (%, line) | 0 | 3 (42.9) | 4 (57.1) | 7 (100.0)
 (%, column) | 0.0 | 7.0 | 9.5 | 6.3
 News | 0 | 1 | 3 | 4
 Chronicle/Opinion | 0 | 0 | 1 | 1
 Letter to the editor | 0 | 1 | 0 | 1
 Other | 0 | 1 | 0 | 1
4th cycle (2010–2012): N (%, line) | 11 (27.5) | 19 (47.5) | 10 (25.0) | 40 (100.0)
 (%, column) | 40.7 | 44.2 | 23.8 | 35.7
 News | 5 | 12 | 3 | 20
 Chronicle/Opinion | 2 | 4 | 6 | 12
 Interview | 1 | 1 | 0 | 2
 Editorial | 1 | 1 | 0 | 2
 Letter to the editor | 2 | 1 | 0 | 3
 Other | 0 | 0 | 1 | 1
5th cycle (2013–2015): N (%, line) | 4 (23.5) | 7 (41.2) | 6 (35.3) | 17 (100.0)
 (%, column) | 14.8 | 16.3 | 14.3 | 15.2
 News | 3 | 2 | 1 | 6
 Chronicle/Opinion | 0 | 5 | 4 | 9
 Editorial | 1 | 0 | 0 | 1
 Other | 0 | 0 | 1 | 1
6th cycle (2016–2018): N (%, line) | 11 (57.9) | 3 (15.8) | 5 (26.3) | 19 (100.0)
 (%, column) | 40.7 | 7.0 | 11.9 | 17.0
 News | 4 | 2 | 2 | 8
 Chronicle/Opinion | 4 | 1 | 1 | 6
 Interview | 1 | 0 | 0 | 1
 Editorial | 1 | 0 | 0 | 1
 Letter to the editor | 1 | 0 | 2 | 3
Period 2001–2009: N (%, line) | 1 (2.8) | 14 (38.9) | 21 (58.3) | 36 (100.0)
 (%, column) | 3.7 | 32.6 | 50.0 | 32.1
 News | 1 | 11 | 14 | 26
 Chronicle/Opinion | 0 | 0 | 2 | 2
 Editorial | 0 | 0 | 3 | 3
 Letter to the editor | 0 | 1 | 2 | 3
 Other | 0 | 2 | 0 | 2
Period 2010–2018: N (%, line) | 26 (34.2) | 29 (38.2) | 21 (27.6) | 76 (100.0)
 (%, column) | 96.3 | 67.4 | 50.0 | 67.9
 News | 12 | 16 | 6 | 34
 Chronicle/Opinion | 6 | 10 | 11 | 27
 Interview | 2 | 1 | 0 | 3
 Editorial | 3 | 1 | 0 | 4
 Letter to the editor | 3 | 1 | 2 | 6
 Other | 0 | 0 | 2 | 2
Total | 27 (24.1) | 43 (38.4) | 42 (37.5) | 112 (100.0)
Change from 1st period (2000–2003–2006 cycles) to 2nd period (2009–2012–2015 cycles) | +25 (+31.4; +92.6) | +15 (−0.7; +34.9) | 0 (−30.7; 0.0) | +40 (+35.7)
 News | +11 | +5 | −8 | +8
 Chronicle/Opinion | +6 | +10 | +9 | +25
 Interview | +2 | +1 | 0 | +3
 Editorial | +3 | +1 | −3 | +1
 Letter to the editor | +3 | 0 | 0 | +3
 Other | 0 | −2 | +2 | 0

cycle featured four articles (14.8% of the positive articles, 23.5% of the
respective cycle).
The negative tone is most frequent in the 1st cycle, with 12 articles (28.6% of the negative articles, two-thirds of the articles in the respective cycle, 66.7%). Close to these figures is the 4th cycle: 10 articles (23.8% of the negative articles, but with a within-cycle percentage well below the former, 25.0%). In the remaining four cycles, the presence of the negative tone varies between four and six articles (26.3% in the 6th cycle and 57.1% in the 3rd cycle).
The neutral tone is present in all the cycles, and it is clearly dominant
in the 4th, with 19 articles (44.2% of articles with a neutral tone, about
half the articles of the cycle, 47.5%). In the remaining cycles, even where
prevalent, the presence of the neutral tone is lower, varying between
three and six articles (15.8% in the 6th cycle and 54.5% in the 2nd cycle).
The analysis of the tone of the articles in the first and second periods
shows a significant difference in the positive tone: one article versus 26
(3.7% and 96.3% of the total number of articles with a positive tone,
and 2.8% and 34.2% within each period, respectively). The first period
thus stands out for the near absence of articles with a positive tone, while
the second stands out for the balance in the tone of its articles, with
neutral items coming first (38.2%), closely followed by positive items
(34.2%) and finally negative ones (27.6%).
The distribution of the tone of the articles in the two predominant
journalistic genres shows that in news stories the neutral tone is more
frequent, with 27 items (45.0% of the total of news stories; 62.8% of
the total number of articles with neutral tone). News stories with a posi-
tive tone are the least represented in the total set of news stories, with 13
articles (21.7%), although they contribute to about half the set of posi-
tive articles (48.1%). In the set of chronicles/opinion pieces, the negative
tone is the most frequent, with 13 items (44.8% of the total number of
chronicles/opinion pieces; 31.0% of negative articles). Next come arti-
cles with a neutral tone, with 10 items (34.5% of the total number of
chronicles/opinion pieces, 23.3% of pieces with this tone). In this genre,
too, the positive tone, with six articles, is the least represented (20.7% of
chronicles/opinion pieces; 22.2% of the articles with this tone).
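The pair of percentages attached to each cell of this cross-tabulation – the share within the journalistic genre and the share within the tone – can be sketched as follows. This is an illustrative fragment, not part of the study's instruments: the counts are read from the figures reported in this chapter, and the genre keys and function names are our own.

```python
# Illustrative sketch (not code from the study): how the two percentages
# quoted for each cell of the tone-by-genre cross-tabulation are obtained.

counts = {
    "news":              {"positive": 13, "neutral": 27, "negative": 20},
    "chronicle_opinion": {"positive": 6,  "neutral": 10, "negative": 13},
    "interview":         {"positive": 2,  "neutral": 0,  "negative": 1},
    "editorial":         {"positive": 3,  "neutral": 1,  "negative": 3},
    "letter_to_editor":  {"positive": 3,  "neutral": 2,  "negative": 4},
    "other":             {"positive": 0,  "neutral": 3,  "negative": 1},
}

def row_pct(genre: str, tone: str) -> float:
    """Share of a tone within one genre, e.g. neutral news among all news."""
    return round(100 * counts[genre][tone] / sum(counts[genre].values()), 1)

def col_pct(genre: str, tone: str) -> float:
    """Share of a genre within one tone, e.g. neutral news among all neutral articles."""
    return round(100 * counts[genre][tone] / sum(g[tone] for g in counts.values()), 1)

print(row_pct("news", "neutral"))                # 45.0 (27 of 60 news stories)
print(col_pct("news", "neutral"))                # 62.8 (27 of 43 neutral articles)
print(row_pct("chronicle_opinion", "negative"))  # 44.8
print(col_pct("chronicle_opinion", "negative"))  # 31.0
```

The same cell thus receives two different readings depending on whether it is divided by its row total or by its column total, which is why the text reports both figures.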

The voice of authors in the articles in which PISA is the main topic
The global distribution of authorship by gender shows the predominant pres-
ence of women as first authors: 64 women (57.1%) and 32 men (28.6%). The
percentage of women remains in the majority in both periods (24, 66.7% in
the first period, and 40, 52.6% in the second), although the male percentage
rises (six, 16.7% in the first period versus 26, 34.2% in the second period).
PISA in media discourse 159

The gender distribution of the first author by type of articles shows
the very large presence of women in news stories (51 women, 85.0% of
total news stories; two men, 3.3%), and the overwhelming presence of
men in chronicles/opinion pieces (22 men, 75.9% of the whole; five
women, 17.2%). In the editorials there are no women: four men and
three unidentified authors. Still in the field of opinion, the nine letters to
the editor show, conversely, a more expressive presence of women (five),
alongside two male and two unidentified authorships.
The global distribution of authorship by occupation of the first author,
based on the first information used in the by-line of the article, highlights
the prominent presence of journalists in 75 of the pieces (67.0%); fol-
lowed, in 16 pieces (14.3%), by the group of unidentified occupation,
where we can probably find, among others, trainee journalists. From the
groups with identified occupation, higher education faculty/researchers
stand out, in a distant second place, with eight pieces (7.1%). The per-
centage of journalists remains the largest in both periods, although it
declines slightly in the second one (26, 72.2% in the first period, ver-
sus 49, 64.5% in the second period). All other professions, notably
higher education faculty/researchers, increase in the second
period, albeit remaining in very low numbers.
Regarding the profession of the first author by type of the articles,
we found the almost exclusive presence of journalists in news sto-
ries (55 journalists, 91.7% of news stories) and a majority presence in
chronicles/opinion pieces (ten journalists, 34.5% of chronicles/opinion
pieces). Journalists feature exclusively in editorials and interviews (7 and
3, respectively). Higher education lecturers/researchers are the second
group present in chronicles/opinion pieces (seven articles, 24.1%), fol-
lowed by politicians (4, 13.8%). With this same number we also have
the category other (4, 13.8%), where such professions as lawyer, psy-
chiatrist, publisher or postgraduate student can be found. Finally, basic/
secondary education teachers appear with only one author, in a letter to
the editor, and union leaders also with one, as does a school manager,
the latter two in chronicles/opinion pieces.
The evolution of authorship per gender shows that the percentage of
women remains the largest in both periods (24, 66.7% in the first period
versus 40, 52.6% in the second), although there is an increase in the male
percentage (six, 16.7% in the first period, compared to 26, 34.2% in the
second). Per profession, the percentage of journalists remains the largest
in both periods, in spite of a slight decrease in the second (26, 72.2%
in the first period, versus 49, 64.5% in the second). All other profes-
sions, in particular higher education faculty/researchers and politicians
increased in number and percentage in the second period, albeit in very
low numbers.

Table 7.5 Profession of the authors of the 112 articles in which PISA is the
core topic (each cell: N; %, line; %, column)

Journ. = Journalist; Polit. = Politician; Lect. = Lecturer Higher Ed./researcher;
Sch. = School teacher/School manager; Union = Teacher/Union leader;
Other = Other or Not Identified

Female and Male (112 pieces, 16 pieces with author of unidentified gender)
1st Cycle (2001-2003): Journ. 13, 72.2, 17.3 | Lect. 1, 5.6, 10.0 | Other 4, 22.2, 19.0 | Total 18, 100.0, 16.1
2nd Cycle (2004-2006): Journ. 9, 81.8, 12.0 | Other 2, 18.2, 9.5 | Total 11, 100.0, 9.8
3rd Cycle (2007-2009): Journ. 4, 57.1, 5.3 | Lect. 1, 14.3, 10.0 | Other 2, 28.6, 9.5 | Total 7, 100.0, 6.3
4th Cycle (2010-2012): Journ. 27, 67.5, 36.0 | Lect. 3, 7.5, 30.0 | Sch. 1, 2.5, 100.0 | Union 1, 2.5, 100.0 | Other 8, 20.0, 38.1 | Total 40, 100.0, 35.7
5th Cycle (2013-2015): Journ. 9, 52.9, 12.0 | Polit. 2, 11.8, 50.0 | Lect. 4, 23.5, 40.0 | Other 2, 11.8, 9.5 | Total 17, 100.0, 15.2
6th Cycle (2016-2018): Journ. 13, 68.4, 17.3 | Polit. 2, 10.5, 50.0 | Lect. 1, 5.3, 10.0 | Other 3, 15.8, 14.3 | Total 19, 100.0, 17.0
Period 2001-2009: Journ. 26, 72.2, 34.7 | Lect. 2, 5.6, 20.0 | Other 8, 22.2, 38.1 | Total 36, 100.0, 32.1
Period 2010-2018: Journ. 49, 64.5, 65.3 | Polit. 4, 5.3, 100.0 | Lect. 8, 10.5, 80.0 | Sch. 1, 1.3, 100.0 | Union 1, 1.3, 100.0 | Other 13, 17.1, 61.9 | Total 76, 100.0, 67.9
Total: Journ. 75, 67.0, 100.0 | Polit. 4, 3.6, 100.0 | Lect. 10, 8.9, 100.0 | Sch. 1, 0.9, 100.0 | Union 1, 0.9, 100.0 | Other 21, 18.8, 100.0 | Total 112, 100.0, 100.0
Change from the period 2001-2009 to the period 2010-2018: Journ. +23, -7.7, +30.7 | Polit. +4, +5.3, +100.0 | Lect. +6, +5.0, +60.0 | Sch. +1, +1.3, +100.0 | Union +1, +1.3, +100.0 | Other +5, -5.1, +23.8 | Total +40, –, +35.7

Female (64 pieces)
1st Cycle (2001-2003): Journ. 12, 100.0, 21.1 | Total 12, 100.0, 18.8
2nd Cycle (2004-2006): Journ. 7, 100.0, 12.3 | Total 7, 100.0, 10.9
3rd Cycle (2007-2009): Journ. 4, 80.0, 7.0 | Lect. 1, 20.0, 50.0 | Total 5, 100.0, 7.8
4th Cycle (2010-2012): Journ. 23, 92.0, 40.4 | Other 2, 8.0, 50.0 | Total 25, 100.0, 39.1
5th Cycle (2013-2015): Journ. 4, 80.0, 7.0 | Lect. 1, 20.0, 50.0 | Total 5, 100.0, 7.8
6th Cycle (2016-2018): Journ. 7, 70.0, 12.3 | Polit. 1, 10.0, 100.0 | Other 2, 20.0, 50.0 | Total 10, 100.0, 15.6
Period 2001-2009: Journ. 23, 95.8, 40.4 | Lect. 1, 4.2, 50.0 | Total 24, 100.0, 37.5
Period 2010-2018: Journ. 34, 85.0, 59.6 | Polit. 1, 2.5, 100.0 | Lect. 1, 2.5, 50.0 | Other 4, 10.0, 100.0 | Total 40, 100.0, 62.5
Total: Journ. 57, 89.1, 100.0 | Polit. 1, 1.6, 100.0 | Lect. 2, 3.1, 100.0 | Other 4, 6.3, 100.0 | Total 64, 100.0, 100.0
Change from the period 2001-2009 to the period 2010-2018: Journ. +11, -10.8, +19.3 | Polit. +1, +2.5, +100.0 | Lect. 0, -1.7, 0 | Other +4, +10.0, +100.0 | Total +16, –, +25.0

Male (32 pieces)
1st Cycle (2001-2003): Journ. 1, 33.3, 9.1 | Lect. 1, 33.3, 12.5 | Other 1, 33.3, 12.5 | Total 3, 100.0, 9.4
2nd Cycle (2004-2006): Journ. 2, 100.0, 18.2 | Total 2, 100.0, 6.3
3rd Cycle (2007-2009): Other 1, 100.0, 12.5 | Total 1, 100.0, 3.1
4th Cycle (2010-2012): Journ. 1, 9.1, 9.1 | Lect. 3, 27.3, 37.5 | Sch. 1, 9.1, 100.0 | Union 1, 9.1, 100.0 | Other 5, 45.5, 62.5 | Total 11, 100.0, 34.4
5th Cycle (2013-2015): Journ. 2, 25.0, 18.2 | Polit. 2, 25.0, 66.7 | Lect. 3, 37.5, 37.5 | Other 1, 12.5, 12.5 | Total 8, 100.0, 25.0
6th Cycle (2016-2018): Journ. 5, 71.4, 45.5 | Polit. 1, 14.3, 33.3 | Lect. 1, 14.3, 12.5 | Total 7, 100.0, 21.9
Period 2001-2009: Journ. 3, 50.0, 27.3 | Lect. 1, 16.7, 12.5 | Other 2, 33.3, 25.0 | Total 6, 100.0, 18.8
Period 2010-2018: Journ. 8, 30.8, 72.7 | Polit. 3, 11.5, 100.0 | Lect. 7, 26.9, 87.5 | Sch. 1, 3.8, 100.0 | Union 1, 3.8, 100.0 | Other 6, 23.1, 75.0 | Total 26, 100.0, 81.3
Total: Journ. 11, 34.4, 100.0 | Polit. 3, 9.4, 100.0 | Lect. 8, 25.0, 100.0 | Sch. 1, 3.1, 100.0 | Union 1, 3.1, 100.0 | Other 8, 25.0, 100.0 | Total 32, 100.0, 100.0
Change from the period 2001-2009 to the period 2010-2018: Journ. +5, -19.2, +45.5 | Polit. +3, +11.5, +100.0 | Lect. +6, +10.3, +75.0 | Sch. +1, +3.8, +100.0 | Union +1, +3.8, +100.0 | Other +4, -10.3, +50.0 | Total +20, –, +62.5

Discussion of results
Regarding the first editions of PISA, we inquired into their representation
in Público, one of the main reference daily newspapers in Portugal. The
initial approach to the material examined in this chapter sought to find
meanings in seven fields: (i) the intensity of coverage and (ii) the
prominence of PISA in the newspaper's agenda; (iii) the prevailing tone
of the articles and (iv) the authors' voices that stand out; to these we
added three cross-sectional fields: (v) the evolution of the meanings
surveyed; (vi) the centrality of PISA in the articles; and (vii) the typology
of the articles. On these fields we formulated seven hypotheses, which
we consider below.
Concerning the issue of the centrality of PISA, of the 184 articles, 112 –
some three-fifths – focused on PISA as core topic; in the remainder,
PISA was a secondary topic or merely a passing reference. We can say
that both results shed light on the value that the news-
paper gives this topic, both by taking it as the core topic in a considerable
number of pieces and by bringing together PISA and other information
or debates in the field of education, thereby suggesting that in this news-
paper, too, the topic of PISA has become an unavoidable reference in the
debates on education and its policies (Saraisky, 2015).
When we consider the evolution of the coverage of PISA in Público in
articles which address the programme as their main topic, and regardless
of the multicausality of the PISA effect on media agendas, findings show
that in this newspaper, the effect is apparent and has been growing. Our
expectation that the media coverage of PISA would steadily evolve in
the study's time frame was thus confirmed, reflected in the increase in
the number of articles published with PISA as the main topic, per PISA
cycle and in the context of the aggregation in two periods (H1). This
increase can be understood as signalling the growing importance of the
programme in the media agenda and in society, albeit not in the orienta-
tion of the national education policy. This finding aligns with the results
of studies on other countries which highlighted the relevant contribution
of the media to inform society on PISA, and associated its gradual media
coverage with the growing receptivity and political impact of the pro-
gramme (Saraisky, 2015; Baroutsis & Lingard, 2016; González-Mayorga
et al., 2017). This result is also in line with the findings of Lemos and
Serrão (2015) for the Portuguese situation.
Two aspects on the evolution of the coverage of PISA in articles where
the programme is the central topic should also be emphasized. The first
refers to the high incidence of a larger number of articles in the years of
the first dissemination of results for each edition; besides, the same can
be said of the months of December, since the first results are regularly dis-
seminated in this month. It seems reasonable to conclude that the topic of
PISA does not overcome the barrier of the media preference for novelty,

for current events, therefore producing little echo in the public debate,
be it through information or published opinion, as was ascertained in
other studies as well (Dixon et al., 2013). Apparently, the press is par-
ticularly receptive to figures, to rankings, to the choreography of the
“championship of the PISA league”, which has its peak in the month of
December of the year of the first dissemination of results of each edition.
And yet, from the 2006 edition, the OECD has been disseminating suc-
cessive thematic studies expanding the analysis of results. These studies
take a politically more explicit approach to what the results reveal
about each country from social and educational perspectives. The newspa-
per's reaction to these reports, however, is weaker than
the reaction to the first reports on each cycle, focused on the results of
the literacy tests, on the comparisons between countries, on the rank-
ings. Meanwhile, although media culture, in general, even in the refer-
ence press, helps us understand the preference for the more dramatic and
exuberant dimensions of social phenomena, often of a more superficial
nature (Coe & Kuttner, 2018), we also question the efficacy of the OECD
in disseminating the in-depth reports. Perhaps the promotion of these
reports does not benefit from an investment comparable to that allocated
to the dissemination of the early reports, as revealed by Lingard's (2016)
research.
The second aspect to be highlighted regards the irregularity of the
annual distribution flow of the articles, without prejudice to a clear
quantitative increase in the coverage of PISA by the newspaper. Let us
note, for instance, the case of 2001, the third year with the most
articles, yet atypical among its neighbouring years, which are among
those in which PISA is least represented in Público's agenda.
The novelty that PISA constituted in 2001 helps us understand the atten-
tion Público paid to the topic, even if that attention contrasts with
PISA’s low impact in other Portuguese publications such as Diário de
Notícias, Expresso and Visão in that year (Lemos & Serrão, 2015). The
“tragedy” of the Portuguese results in the 2000 edition may also explain
its strong presence in Público’s agenda. The country’s feeble position lin-
gered in the two following editions, but the strong impact in society and
politics of other, domestic social phenomena may explain the
weaker presence of PISA in the newspaper’s agenda. Conversely, the year
2010, the first year of result dissemination of the 4th PISA cycle, was a
very significant turning point as regards the newspaper’s interest in the
matter, translated into a rise in the number of articles, which follows
the noteworthy improvement in Portugal’s performance in PISA. In the
following two cycles, albeit with oscillations, the significant number of
articles remains steady, thus maintaining the greater frequency of articles
in the second period analysed. This result is in line with the situation
verified by Lemos and Serrão (2015) in the newspapers and magazines
they researched. In short, we found irregularity in the yearly distribution
of the articles, but it is clear that the second period contains the cycle
with the highest number of articles (the 4th) and over two-thirds of all
the articles featuring PISA as core topic published by the newspaper.
The increase in the media coverage of PISA should also be highlighted,
not only in general terms but also within the two main journalistic
genres: more news stories and more opinion pieces characterize the
second period, in other words, the set of the 4th, 5th and 6th PISA
cycles, an evolution which reinforces H1, regarding the strengthening of
PISA in the newspaper’s agenda. In particular, chronicles/opinion pieces
go up very significantly from the 4th PISA cycle, attesting, albeit errati-
cally, to a very strong difference between the two periods, since the opin-
ion on PISA is all but absent from the newspaper in the first three cycles
of the programme.
This evolution in the coverage of PISA by Público does not stray from
what was observed in other studies: volatility coexisting with the trend
for a progressively more significant presence of PISA in the media agenda.
This reinforcement seems to reflect, if not the progressive impact of
PISA on policy, whose agenda the newspaper would be following – a
risk of inference against which Lemos and Serrão (2015) warn – at least
the relevance of its presence in the public debate on education, or, we
would suggest, a media agenda strategy both vigilant of the development of
Portuguese performance and aligned with the programme’s progressive
credibility and impact. We believe that the growing media attention on
PISA may also be related to the OECD/PISA’s concern with the efficacy of
their communication strategy with the press (Lingard, 2016).
The gradual increase in articles where PISA is a secondary topic, and
their dispersion throughout the time frame of the study, reinforce these
conclusions and attest to the progressive potential of PISA to be interwo-
ven with other information and debates in the education field. Besides,
this dispersion suggests that “the idea of PISA became a taken-for-
granted measure of educational excellence in the public consciousness, it
was referenced more consistently throughout the years” (Saraisky, 2015,
p. 36); it signals the ballast left by the programme in the public debate on
policies, one which the articles with PISA as central topic do not make
as evident.
In H2, we stated the expectation that in those articles where PISA is
the core topic, there would be a steady intensification of the attention and
care given to the media coverage of PISA, reflected in the rise of frontpage
teasers, editorials on the subject, larger articles and the recourse to
relevant infographics, all elements that signal the programme’s mounting
importance, influence and credibility in the media agenda and in society.

Regarding three of the criteria, we observed that increase, especially when
the analysis focuses on the two larger periods; still, the differences vary
in expression and are sometimes of little relevance: a slightly higher
number of editorials and frontpage teasers, and a considerably higher
number of full-page or near full-page articles in the second period.
The first period differs positively from the second only in the use of
relevant infographics. We may thus conclude that the evolution of the
prominence given to PISA – the evolution in the sophistication of its
coverage – despite some differences between the periods, mostly
favourable to the second, is not very striking, with the exception of
article size, which makes it hard to clearly verify H2.
On the contrary, we must underline what appears to be, given its
consistency, a strategy of careful attention to PISA in the agenda of
Público throughout the cycles. In fact, all cycles, with the exception of
the 3rd, have PISA editorials and articles with frontpage teasers and
occupying a full page or thereabouts, and all the cycles without excep-
tion have articles featuring infographics. In other words, despite our
observation regarding the quantitative evolution of PISA coverage by
Público, both in news stories and in opinion pieces there is an intensity
in the attention paid to the programme that has been constant since its
first edition and which, with the exception of the isolated case of the 3rd
cycle, is manifest in several of the major indicators we have been mobi-
lizing. This sustained prominence given to the articles has no parallel in
the empirical studies we have been drawing upon.
Regarding the tone of the articles with PISA as central topic, and consid-
ering the positive evolution of Portugal's performance in the programme,
we expected the negative tone to become gradually less prevalent (H3).
This expectation was fulfilled, confirming the trend observed in studies
on the press of other countries (e.g. Hopfenbeck & Görgen, 2017; Baird
et al., 2016; Dixon et al., 2013) and also in Portugal (Lemos & Serrão,
2015), where, despite the persistent negativity, the tone of most news-
paper headlines follows the improvement of pupils' results. Indeed, we
verified, on the one hand, that negativity, albeit a constant in all cycles
and periods, carries in the first period over double the relative weight it
has in the second; on the other hand, positivity is a strong singularity of
the second period, following the turning point in the results of Portu-
guese pupils in PISA from the 4th cycle onwards: it is this greater posi-
tivity, moreover, that marks the second period, comparatively speaking,
more than its smaller negativity. In any case, and here differing from
the conclusion stated by Lemos and Serrão (2015), the general conclu-
sion does not indicate balance in the distribution of the three tones in
the headlines of the articles, given both the near absence of positivity in
the first period and the sharp difference registered between the two peri-
ods in the order of neutral and positive tones. Now, if we focus solely on the
second period, we can stand by the conclusion of some balance in the
tone of the articles.
As far as the distribution of the tone of the articles per type is con-
cerned, and comparatively with the news stories, we put forward the
expectation of a stronger presence of negativity in the chronicles/opin-
ion pieces, focusing not just on results but also, in their wake, on other
aspects of education (H4). This expectation was grounded on the nature
of these articles, characterized by a more personal, critical and evalua-
tive perspective on the part of their authors, than one would expect of
news stories. Our findings confirmed this expectation, showing negativity
as the most expressive tone of chronicles/opinion pieces. Moreover,
this tone is concentrated in the second period, since in the first the
presence of this type of article is negligible. News stories, in turn, are
mostly defined by the neutral tone; it is therefore not really positivity
that separates the two journalistic genres.
Regarding the authorship of the articles, we inquired into the power
of published opinion to influence the public debate on PISA: we expected
the authorship of the chronicles/opinion pieces to be concentrated in a
small, stable number of individuals, specialists in intellectual and scien-
tific activities (H5), to be predominantly male (H6), and to include only
a reduced presence of the voices of teachers, students and families or of
the organizations that represent them (H7). All these expectations were
fulfilled.
In another context, this finding confirms that

the media discourse analysed reinforce and generate inequalities by
overrating some voices and perspectives to the detriment of others,
confirming the creation of unequal power regarding the broadcast
representation of the phenomenon being studied.
(Carita & Teodoro, in press, p. 15)

In fact, it should be emphasized that in chronicles/opinion pieces, all
of the following could be observed: limited access to female voices – in
striking contrast to their presence in news stories – the absence of the
voice of basic and secondary education teachers and the reduced pres-
ence of their representatives, as well as the concentration of authorship
in a limited number of commentators. These situations signal the low
plurality of inputs to which the newspaper gives voice, as well as the low
power of those stakeholders to influence the media agenda (Dijk, 2015).
The just resistance to this inequality, by conquering access to public and
published opinion, is a challenge which Dijk also notes and one which,
moreover, would give society access to a more plural representation of
the PISA phenomenon, closer to the view of the agents who live the daily
experience of schools.

To sum up, within the journalistic framework of Público on PISA – and
marking the importance of this OECD programme in the media agenda
and maybe its impact in the agendas of politicians and newspaper read-
ers – we note the significant increase in the coverage of PISA from the
first period (2000, 2003, 2006 cycles) to the second period (2009, 2012,
2015 cycles), especially pronounced in the field of chronicles/opinion
pieces. This increase is accompanied by editing choices that highlight PISA
through editorials and frontpage teasers throughout the whole time frame
under consideration. Still, this coverage does not ensure a significant
permanence of PISA as core topic beyond the month and year when
the results of each cycle are disseminated, although the subject does
leave some ballast in articles focusing on other issues. We note a general
tone of the articles on PISA which, in keeping with the improvement in
the Portuguese pupils’ results, is characterized by a growing positivity
when the two periods are compared. This feature, however, does not
blur the differences in tone between the news stories and the chronicles/
opinion pieces, the former with the predominance of a neutral tone and
the latter with the prevalence of the negative tone, thus marking the style
of each of the genres in question. The low social plurality in the opinion
published by the newspaper on PISA must also be noted, with particu-
lar reference to the near absence of the school environment – teachers,
pupils, families or the organizations that represent them – as well as
the near absence of women in opinion pieces, in contrast with their strong
presence in the field of news stories. Thus, the credibility and political
importance given to the programme, a prevailing tone that highlights the
awareness of the evolution of Portuguese results, particularly in news
stories, in contrast to the greater negativity in chronicles/opinion pieces,
and finally the limited openness of the newspaper to a socially plural
opinion seem to be three important conclusions that can be drawn from
the analysis of the surface features of Público’s production on PISA.

References
Baird, J.-A., Isaacs, T., Johnson, S., Stobart, G., Yu, G., Sprague, T., & Daugherty, R.
(2011). Policy effects of PISA. Oxford University Centre for Educational Assess-
ment. http://oucea.education.ox.ac.uk/wordpress/wp-content/uploads/2011/
10/Policy-Effects-of-PISA-OUCEA.pdf
Baird, J.-A., Johnson, S., Hopfenbeck, T. N., Isaacs, T., Sprague, T., Stobart,
G., & Yu, G. (2016). On the supranational spell of PISA in policy. Educational
Research, 58(2), 121–138. https://doi.org/10.1080/00131881.2016.1165410
Baroutsis, A., & Lingard, B. (2016). Counting and comparing school perfor-
mance: An analysis of media coverage of PISA in Australia, 2000–2014.
Journal of Education Policy, 32(4), 432–449. https://doi.org/10.1080/02680939.
2016.1252856
Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in edu-
cation? Lessons from Switzerland and the US. European Journal of Education,
46(1), 101–116. https://doi.org/10.1111/j.1465-3435.2010.01462.x
Boto, A. P. C. N. de B. (2011). Entre os problemas públicos e a agenda política:
O papel dos opinion makers em torno do novo modelo de avaliação de desem-
penho docente (2007–2009) [Dissertação de Mestrado, Instituto Superior de
Ciências Sociais e Políticas] http://hdl.handle.net/10400.5/3537
Carita, A., & Teodoro, V. D. (in press). A indisciplina escolar na imprensa: O
jornal Público entre 2011 e 2015. Education Policy Analysis Archives.
Castells, M. (2007). A era da informação: Economia, sociedade e cultura.
A sociedade em rede. Fundação Calouste Gulbenkian.
Coe, K., & Kuttner, P. J. (2018). Education coverage in television news: A typol-
ogy and analysis of 35 years of topics. AERA Open, 4(1), 1–13. https://doi.
org/10.1177/2332858417751694
Coleman, R., McCombs, M., Shaw, D., & Weaver, D. (2009). Agenda setting. In
K. Wahl-Jorgensen & T. Hanitzsch (Eds.), The handbook of journalism studies
(pp. 147–160). Routledge.
Dijk, T. A. van. (2015). Critical discourse analysis. In D. Tannen, H. E. Hamil-
ton, & D. Schiffrin (Eds.), The handbook of discourse analysis (2nd ed., pp. 465–
485). John Wiley & Sons, Inc. https://doi.org/10.1002/9781118584194.ch22
Dixon, R., Arndt, C., Mullers, M., Vakkuri, J., Engblom-Pelkkala, K., & Hood,
C. (2013). A lever for improvement or a magnet for blame? Press and political
responses to international educational rankings in four EU countries. Public
Administration, 91(2), 484–505. https://doi.org/10.1111/padm.12013
Figazzolo, L. (2009, March). PISA: Is testing dangerous? Education International
(29). www.ei-ie.org/en/detail/4079/pisa-is-testing-dangerous
Gerstl-Pepin, C. I. (2007). Introduction to the special issue on the media, democ-
racy, and the politics of education. Peabody Journal of Education, 82(1), 1–9.
https://doi.org/10.1080/01619560709336534
González-Mayorga, H., Vidal, J., & Vieira, M.-J. (2017). El impacto del informe
PISA en la sociedad española: El caso de la prensa escrita. RELIEVE – Revista
Electrónica de Investigación y Evaluación Educativa, 23(1). https://doi.
org/10.7203/relieve.23.1.9015
Gradim, A. (2000). Manual de jornalismo. Universidade da Beira Interior. http://
labcom.ubi.pt/livro/64
Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of
Education Policy, 24(1), 23–37. https://doi.org/10.1080/02680930802412669
Grey, S., & Morris, P. (2018). PISA: Multiple ‘truths’ and mediatised global gov-
ernance. Comparative Education, 2(54), 109–131. https://doi.org/10.1080/03
050068.2018.1425243
Hopfenbeck, T. N., & Görgen, K. (2017). The politics of PISA: The media, policy
and public responses in Norway and England. European Journal of Education,
52(2), 192–205. https://doi.org/10.1111/ejed.12219
Kadushin, C. (1968). Power, influence and social circles: A new methodology for
studying opinion makers. American Sociological Review, 33(5), 685. https://
doi.org/10.2307/2092880
Lemos, V., & Serrão, A. (2015). O impacto do PISA em Portugal através dos
media. Sociologia, Problemas e Práticas, 78, 87–104. https://doi.org/10.7458/
spp2015783310
Lingard, B. (2016). PISA: Fundamentações para participar e acolhimento político.
Educação, Sociedade, Campinas, 37(136), 609–627. https://doi.org/10.1590/
es0101-73302016166670
Lopes, P. C. (2010). Géneros literários e géneros jornalísticos. Uma revisão
teórica de conceitos. Universidade da Beira Interior. www.bocc.ubi.pt/pag/
bocc-generos-lopes.pdf
Marôco, J., Goncalves, C., Lourenço, V., & Mendes, R. (2016). PISA 2015 – Por-
tugal Volume I: Literacia científica, literacia de leitura & literacia matemática.
IAVE Instituto de Avaliação Educativa I.P. www.iave.pt/images/FicheirosPDF/
Estudos_Internacionais/Relatorio_PISA2015.pdf
Martens, K., & Niemann, D. (2010). Governance by comparison: How rat-
ings & rankings impact national policy-making in education. Univer-
sität Bremen Collaborative Research Center. www.econstor.eu/bitstream/
10419/41595/1/639011268.pdf
McCombs, M. E. (2005). A look at agenda-setting: Past, present and future. Jour-
nalism Studies, 6(4), 543–557. https://doi.org/10.1080/14616700500250438
McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of
mass media. Public Opinion Quarterly, 36(2), 176–187. https://doi.
org/10.1086/267990
McCombs, M. E., & Shaw, D. L. (1993). The evolution of agenda-setting
research: Twenty-five years in the marketplace of ideas. Journal of Commu-
nication, 43(2), 58–67. https://doi.org/10.1111/j.1460-2466.1993.tb01262.x
Nóvoa, A., & Yariv-Mashal, T. (2003). Comparative research in education:
A mode of governance or a historical journey? Comparative Education, 39(4),
423–438. https://dx.doi.org/10.1080/0305006032000162002
OECD. (2006). Assessing, scientific, reading and mathematical literacy: A frame-
work for PISA 2006. https://doi.org/10.1787/19963777
Ricardo, D. (2004). Ainda bem que me pergunta. Manual de escrita jornalística.
Editorial Notícias.
Saraisky, N. G. (2015). Analyzing public discourse: Using media content analysis
to understand the policy process. Current Issues in Comparative Education,
18(1), 26–41. www.tc.columbia.edu/cice/pdf/03_Green-Saraisky-CICE-18.pdf
8

OECD and education


How PISA is becoming a
“big science” project
Vítor Rosa and Ana Lourdes Araújo

Introduction
The Organization for Economic Cooperation and Development (OECD),
which currently has 36 member countries, aims to promote policies that
foster prosperity, equal opportunities and well-being for all. It draws on
over 60 years of accumulated knowledge with a view to preparing the
societies of tomorrow. In close cooperation with public authorities,
economic agents and the representatives of civil society, the organization
seeks to establish international rules and proposes data-based solutions:
improving economic performance, creating jobs, promoting effective
education systems and fighting international tax evasion, among others.
Developed by the OECD, the Programme for International Student
Assessment (PISA) was designed to evaluate whether 15-year-old students
are capable of mobilizing their Reading, Mathematics or Science skills to
solve everyday problems. On an optional basis, collaborative problem-
solving and financial literacy can also be evaluated. The aim is therefore
not to determine whether students can reproduce the knowledge acquired
in those fields. The PISA tests are designed on the basis of a framework
that is common to all the countries. The tests are administered in
three-year cycles, and in each cycle one of the literacy areas mentioned
earlier is evaluated in more depth: Reading (2000, 2009, 2018);
Mathematics (2003, 2012); Science (2006, 2015); Collaborative
Problem-Solving (2015).1 Portugal has not yet participated in the
evaluation of Financial Literacy (which is scheduled for 2021), but it has
participated in all the PISA cycles so far (from 2000 to 2018). Students
are selected through a two-stage sampling process. In the first stage, a
stratified random sample of schools is constituted. In the second stage, all
the students eligible to take the tests in the selected schools are identified
(15-year-old students attending at least the 7th grade). Subsequently, the
international Consortium randomly selects the students in each partici-
pating school.
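
The two-stage design just described can be sketched in code. This is a minimal illustration under simplified assumptions: the function name `two_stage_sample`, the strata labels and the fixed per-school quota are hypothetical, and the actual PISA design uses probability-proportional-to-size school selection and sampling weights defined in the technical reports.

```python
import random

def two_stage_sample(frame, schools_per_stratum, students_per_school, seed=42):
    """Illustrative two-stage sample: a stratified random draw of schools,
    then a simple random draw of eligible students within each chosen school."""
    rng = random.Random(seed)
    sampled = {}
    for stratum, schools in frame.items():
        # Stage 1: random sample of schools within each stratum
        chosen = rng.sample(sorted(schools), min(schools_per_stratum, len(schools)))
        for school in chosen:
            # Stage 2: random sample of eligible students (15-year-olds
            # attending at least the 7th grade) in each selected school
            eligible = frame[stratum][school]
            sampled[school] = rng.sample(eligible, min(students_per_school, len(eligible)))
    return sampled

# Hypothetical sampling frame: stratum -> {school: [eligible student IDs]}
frame = {
    "public/urban":  {"S1": list(range(100)), "S2": list(range(80))},
    "private/rural": {"S3": list(range(60))},
}
sample = two_stage_sample(frame, schools_per_stratum=1, students_per_school=30)
```

In the survey itself, school selection probabilities are proportional to enrolment size and sampling weights are applied in the analysis; the sketch above conveys only the two-stage logic.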
DOI: 10.4324/9781003255215-9
PISA was designed to provide three broad types of indicators: basic
indicators, which describe students' general profile of competencies in the
fields evaluated; contextual indicators, which show how those competencies
are linked to demographic, social, economic and educational variables
describing students, schools and education systems; and time-series
indicators, which, since data collection is cyclical, show the evolution of
competency levels and of their connection to contexts.
Following data processing, the OECD formulates recommendations
(related to equity, quality and efficiency), based on the good practices
of the countries in the top positions of the PISA ranking. The goal is to
help participating countries improve their education systems. In this way,
it seems indisputable that the OECD has gone from being a collector of
information to being a producer of data on education systems, with PISA
taking pride of place in this process (Lingard, 2016).
The acknowledgement of its standing, in view of the technical legitimacy
given by the production of these data, has reinforced its capacity for
recommendation and political influence (Bloem, 2015; Breakspear, 2012;
Carvalho & Costa, 2017; Centeno, 2017; Niemann & Martens, 2018;
Teodoro, 2015, 2016, 2020; Volante, 2017; Verger et al., 2019). By
making the data collected through PISA available, the OECD highlights,
from an industrialization perspective, the factors of production, such as
teachers, equipment, facilities and certification, while the data enable a
depiction of the raw material: students, schools, syllabi, methods,
pedagogic techniques, teachers and parents or guardians.
In this chapter, we seek to analyse whether PISA, given its specificities
as an international megaproject, the amount of statistical data produced,
the participation of various organizations, and the mobilization of human
and financial resources, belongs to so-called big science, a concept which
emerged in the United States of America (USA) in 1961.

Methodological framework
As big data studies have emerged and revolutionized the field, social
scientists have been confronted with new challenges in finding the best
way to understand the phenomena involved, as well as the conceptual and
methodological approaches that can be used (Robertson, 2019). For
our inquiry, we adopted a qualitative research methodology based on
documentary and bibliographical analysis (papers, reports, statistics,
press, interviews, images, among others). We are aware that each of
these data-collection techniques has its strengths and weaknesses. In
the social sciences, the use of qualitative data is associated with different
paradigms attempting to assemble a view of social reality (Koivu &
Damman, 2015; Uher, 2018). Qualitative research produces
and analyses descriptive data, such as the written or spoken words, or
people's observable behaviour. It points to a research method interested
in the meaning of observed social phenomena (Aspers & Corte,
2019; Schut, 2019). With this in mind, we sought to process data that
can hardly be quantified, though figures and statistics were not shunned.
In the documentary analysis, we bore in mind four criteria: authenticity,
credibility, representativeness and significance.
On the one hand, we seek to demonstrate that this international survey
fits what is known as big science; on the other, we aim to analyse the
reception of the concept of big science by the scientific community in
Portugal, characterizing the interpretations it has been given and how the
effects of big science and big data on education research (continuities
and breaks) and on the researcher's work have been questioned. We will
endeavour to reflect on this international megaproject and the huge
quantity of data (big data) it generates as a configurational process, that
is, one in which the actors involved in this international survey become
increasingly dependent on one another.
The notion of "configuration", for Elias (1993), implies the
interdependence of individuals and also integrates a theory of power.
Power, however, is not the attribute of those who hold it, but rather the
product of a specific interdependence which gives one social actor the
capacity to act upon another (power as a relation between individuals,
often asymmetrical, contingent and dialectical). To put it differently,
power is not a property (something held by a stronger individual to
dominate a weaker one); it is a particularity of all human relations. The
knowledge invoked in public action is seen as a more or less lasting
product of interdependences, determining which speakers have the right
to invoke PISA as socially acceptable content.
These configurations may have four dimensions, which are sources of
possible interdependence among the actors in the public debate: a
political dimension (the topicality of education policy); an institutional
dimension (the institutional features of the actors, the organizations and
the public debate); a professional dimension (the interests, identities
and modes of legitimation of the professional groups involved); and a
cognitive dimension (the state of current knowledge, academic traditions,
and the actors' sources and notions for evaluation).

Discussion: the configurations of big science


The literature review shows that the dissemination of the phrase big
science is attributed to a paper published in the journal Science in 1961
by the American nuclear physicist Alvin Martin Weinberg (1915–2006),
then director of Oak Ridge National Laboratory in the USA. Entitled
"Impact of Large-Scale Science on the United States", the paper was a
response to
the farewell speech of Dwight David Eisenhower (1890–1969), in which
the American President (from 1953 to 1961) warned of the dangers of
the military-industrial complex and of scholars' potential submission to
governments' funding allocations. The article describes big science as
part of the political economy of science that emerged from World War II
(1939–1945), with the development of radar and the construction of the
atomic bomb. For Weinberg (1961), big science was not an ideal outcome
in the history of the sciences: he argued that the enormous sums of money
spent on high-energy physics research contributed little to human
development.
Shortly after the publication of Weinberg's article, the English physicist
Derek de Solla Price (1922–1983) gave four lectures over a two-week
period at Brookhaven National Laboratory, New York, in June 1962.
These were later published in book form as Little Science, Big
Science . . . and Beyond (Price, 1963). The work describes the historical
and sociological transition from little science to big science and the
qualitative differences between the two.2 The book had great impact and
would inspire new perspectives on large-scale science and other fields
(Turner, 2003). Big science cannot survive isolated from non-science
and other realms of society: for Galison and Hevly (1992, p. 17), it has
become an "economic, political and sociological entity in its own right",
with cultural differences in its production.
The concept has evolved since then. It now underpins particle
accelerators, networks of telescopes and other observatories, including
those in orbit (new telescopes, whether on the surface of the Earth or in
space, allow the universe to be observed at earlier stages of its
development), with their colossal instruments, and the large
state-supervised laboratories that probe the origin of the universe and
foster research, namely in physics and astronomy (Nye, 1996). As
Josephson and Klanovicz (2016) argue, big science and research
technology in applied chemistry, agronomy, hydroelectricity, atomic
bomb projects and genomics became a research paradigm of the 20th
century. Whereas scientists once tended to work alone, in small
self-funded laboratories, in close-knit university groups or occasionally
on small-scale projects connected with the science academies established
at the turn of the 19th century, from around the 1920s they became part
of wider groups of specialists, reflecting the aggregation of interests
from governments, engineering, the financial and scientific worlds and,
often, new interests and organizations.
Scientific activity and material investment on this scale cause deep
changes in the conditions of the research effort and raise very broad
logistical, technical, financial and, consequently, political problems. One
of the more relevant aspects of this big science, within modern science, is
that it has concentrated resources in a few, albeit considerable, research
centres, as opposed to the traditional laboratory of modern science, with
little technological equipment, relying on individual, geographically
dispersed work. Big science presupposes the following characteristics:
substantial budgets, large teams, and large-scale instruments and
laboratories (Galison & Hevly, 1992; Gastro & Oppelt, 2018;
Hallonsten, 2016; Hallonsten & Christensson, 2017). In this sense, and
following an Eliasian perspective, one cannot think big science without
inserting it into a social network.
Since World War II, politicians and society in general, in their constant
search for knowledge, have agreed to fund pharaonic scientific projects
despite their exorbitant costs. Big science makes people dream: it feeds
the desire for knowledge and demonstrates the ability to question the
world in its extremes. As Gastro and Oppelt (2018, p. 1) remark,

the knowledge and technology developed within big science facilities
not only advance our understanding of the universe, but generate new
classes of products and services that disrupt markets and change lives.

It is believed that big science, in a global world, is necessary, not merely
as a matter of national prestige, but to solve problems which cannot
be solved individually. Only at this scale can one respond to the chal-
lenges of our time. This, however, requires Big Politics and Big Decisions.
Josephson and Klanovicz (2016, p. 166) stress that large institutions have
acquired great "power and authority". They argue:

The large scientific research corporations have learnt to expand their
core projects and their focuses of intervention with the aim of ensur-
ing that their programmes have generous funding with promises of
crucial results for national defence, for public health, for industrial
innovation, for medicine and for agriculture.
(pp. 166–167)

As Weinberg (1961) had already observed, the same can be said of the
large scientific research corporations.
In line with this reality, every three years the presentation of the results
of a PISA cycle gives rise to a large number of papers, books, debates
and speeches in political, professional, scientific and media circles all
over the world. This broad dissemination, matched by no other
international assessment, is one of the signs of the programme's success,
and PISA is presented as a reference in terms of comparative surveys on
students' performance. While it is true that the number of participating
countries has grown (in the first cycle, in 2000,
Cycle                                    2000  2003  2006  2009  2012  2015  2018
No. of participating organizations          4     6     6    10    14    14    11

Organization                                             No. of cycles
ACER, Australia                                                7
WESTAT, USA                                                    7
ETS, USA                                                       6
NIER, Japan                                                    4
CITOGROEP, the Netherlands                                     4
aSPe, Belgium                                                  4
CApStAn, Belgium                                               4
DIPF, Germany                                                  4
HallStat SPRL, Australia                                       3
University of Heidelberg, Germany                              3
University of Luxembourg, Luxembourg                           3
Pearson, UK                                                    2
Statistics Canada, Canada                                      2
CET, Israel                                                    1
Achieve Inc, USA                                               1
DEPP, France                                                   1
GESIS, Germany                                                 1
ILS, Norway                                                    1
IPN, Germany                                                   1
LIST, Luxembourg                                               1
University of Jyväskylä, Finland                               1
University of Melbourne, Australia                             1
University of Twente, the Netherlands                          1

Figure 8.1 Organizations in the PISA Consortium and their number of participations, by cycle (2000 to 2018)
Source: PISA Technical Reports (2000 to 2018)

43 countries participated; in the latest cycle, in 2018, this number rose
to 81), it is equally true that its periodicity ensures an almost permanent
presence on the world political-scientific agenda (Bart & Daunay, 2016).
It is important to emphasize that the large teams of scientists, from
several countries and leading universities, with expertise in strategic
areas, are formally organized by companies, ranging from small firms to
sizeable corporations specialized in providing services and marketing
educational products, aggregated in the PISA Consortium, as can be seen
in Figure 8.1. The figure demonstrates that this "big science" has been
growing throughout the various editions: it has gone from four
organizations (in 2000) to 11 (in 2018), with a clear predominance of the
USA and Australia. Also noteworthy is how the field of scientific and
business power of the PISA industry has been preserved.
Figure 8.2 PISA Advisory Group (Countries), from 2000 to 2018
Source: PISA Technical Reports (2000 to 2018)

In this approach, it is interesting to highlight that three large
organizations, the Australian Council for Educational Research (ACER),
WESTAT ("Improving Lives Through Research") and the Educational
Testing Service (ETS), the first from Australia and the other two from
the USA, controlled the production of the PISA tests from 2000 to 2018.
Regarding the PISA Advisory Group, from 2000 to 2018, we note that it
was centred in 11 countries, notably the USA, the Netherlands and
Australia, as can be seen in Figure 8.2.
In 2015, the Pearson corporation joined the group of PISA's lead
contractors, and it gained prominence in PISA 2018 after winning a
competitive bid of the Organization for Economic Cooperation and
Development (OECD) to develop the "Frameworks".
Another aspect that stands out in the context of PISA's production
concerns the dynamic roles companies can play in this structure: one
organization can belong to the main group of external contractors ("lead
contractors") and simultaneously act as a subcontractor, as is the case of
WESTAT. In PISA 2018, the Educational Testing Service (ETS), of the
USA, played the role of lead contractor and was responsible for the
general management and implementation of contracts and tasks, which
required the cooperation of subcontractors to be carried out.
Besides the economic and financial reverberations that these
international studies generate, there is also the idea of "industrializing"
education, in other words, the will to adapt teaching to the needs of
society in general and the economy in particular. On the other hand, it is
also associated with acknowledging the inefficiency of teaching as it
stands, whether from the quantitative point of view (number of
graduates) or the qualitative point of view (choice of syllabi and
pedagogical methods). For Schleicher (2018, pp. 13–14), "without the right education,
people will languish on the margins of society, countries will not be able
to benefit from technological advances and those advances will not trans-
late into social progress”.

Conclusion
Throughout the 1960s and 1970s, many OECD countries reformulated
the contents and methods of their schooling. Throughout the 1980s, the
guideline was to respond to the needs of society, amid the
acknowledgement of economic hardship and social problems. In the
1990s, indicator projects were fostered (Indicators of Education
Systems – INES, for instance) to monitor the standard of learning and of
education systems.3 The issues addressed at present in the reports that
the OECD produces include the decline in birth rates, the evolution and
change of family structures, demographic ageing, women's participation
in the labour market, migration, multiculturalism and technological
progress. Education, as mentioned earlier, is also part of its concerns
and, to this end, the
OECD has been implementing several projects. One megaproject, which
can be integrated into big science, is PISA, launched in 2000 with a view
to comparing several literacy domains internationally (on the basis of a
common core), with 15-year-old students as the target audience. Based
on the results, the OECD issues recommendations to improve education
systems. The PISA programme is seen by political decision-makers and
international organizations as a tool to compare school systems and
reveal their strengths and weaknesses. The media coverage and
instrumentalization of results also help to turn PISA into a vector of
international competition, whose most symptomatic manifestation is the
classification of countries according to their mean performances.
The implementation of PISA is subject to a complex set of tender
specifications. Data collection and processing must respond to a very
demanding list of standards regarding sampling, design, translation and
correction of the items, the conditions for test delivery and data manage-
ment, so as to avoid fraud. One of the more important standards is the
minimum response rate required. As is the case with all survey data, PISA
data are, inevitably, imperfect, especially when they are based on
sampling and estimated figures. Historical and cultural contexts are
seldom integrated into the analysis of findings.
Large international surveys, namely PISA, seek to provide evidence for
governmental political action and relegate to the background the
contextualization of learning processes and the political dimensions of
education (Teodoro, 2015). PISA does not allow for "a
direct cause-effect link to be established between the education practices
and policies of the different countries and students' results" (Serrão,
2014, p. 270). Nevertheless, its results have earned a presence in the
public debate, especially in the mass media, where education policy
arguments, proposals and measures clash.
We argue that PISA, due to the characteristics set out above
(megaproject, funding, international participation of countries and
specialists, substantial amount of data produced, interdisciplinary
articulation between different fields of knowledge, and number of people
involved), is part of so-called big science. Big science is not carried out
without resistance and criticism, and the same is true of PISA. Science's
change in scale requires scientists to align and/or "conform" (to use the
term proposed by Elias, 1993) their activities to broader elements of
society. Large-scale science researchers (un)consciously draw on the
resources of their societies.

Notes
1 On the reach of the literacy fields, a topic to be further expanded below, see
OECD (2018, pp. 14–15).
2 Turner (2003) emphasizes that, when Price (1963) wrote the book, a para-
digm change was underway according to sociologists of science. Merton
(1938, 1942, 1957) had already remarked on the emerging interest in the
status of science, the recognition of scientific discoveries, scientific priorities,
and science and technology.
3 On the history of the creation of INES, see Teodoro (2015, 2016).

References
Aspers, P., & Corte, U. (2019). What is qualitative in qualitative research. Quali-
tative Sociology, 42, 139–160. https://doi.org/10.1007/s11133-019-9413-7
Bart, D., & Daunay, B. (2016). Les blagues à PISA, le discours sur l’école d’une
institution internationale. Éditions du Croquant.
Bloem, S. (2015). The OECD directorate for education as an independent knowl-
edge producer through PISA. In H. G. Kotthoff & E. Klerides (Eds.), Govern-
ing educational spaces (pp. 169–185). Sense Publishers.
Breakspear, S. (2012). The policy impact of PISA: An exploration of the nor-
mative effects of international benchmarking in school system performance.
OECD Publishing.
Carvalho, L. M., & Costa, E. (2017). The praise of mutual surveillance in Europe.
In R. Normand & J.-L. Derouet (Eds.), A European politics of education: Per-
spectives from sociology, policy studies and politics (pp. 53–72). Routledge.
Centeno, V. G. (2017). The OECD’s educational agendas: Framed from above,
fed from below, determined in interaction. A study on the recurrent education
agenda. Peter Lang.
Elias, N. (1993). Qu’est-ce que la sociologie? Agora Pocket.
Galison, P., & Hevly, B. (1992). Big science: The growth of large-scale research.
Stanford University Press.
Gastro, M., & Oppelt, T. (2018). Big science and human development – what
is the connection? South African Journal of Science, 114(11/12). https://doi.
org/10.17159/sajs.2018/5182
Hallonsten, O. (2016). Use and productivity of contemporary, multidisciplinary
big science. Research Evaluation, 25(4), 486–495.
Hallonsten, O., & Christensson, O. (2017). Collaborative technological innova-
tion in an academic, user-oriented big science facility. Industry and Higher
Education, 31(6), 399–408. https://doi.org/10.1177/0950422217729284
Josephson, P. R., & Klanovicz, J. (2016). Big science e tecnologia no século
XX. Fronteiras: Revista Catarinense de História, 27, 149–168. https://doi.
org/10.36661/2238-9717.2016n27.8051
Koivu, K., & Damman, E. (2015). Qualitative variations: The sources of diver-
gent qualitative methodological approaches. Quality & Quantity: Interna-
tional Journal of Methodology, 49(6), 2617–2632.
Lingard, B. (2016). Rationales for and reception of the OECD’s PISA.
Educação & Sociedade, 37(136), 609–627. https://doi.org/10.1590/
es0101-73302016166670
Merton, R. (1938). Science, technology and society in seventeenth century
England. Saint Catherine Press.
Merton, R. (1942). Science and technology in a democratic order. Journal of
Legal and Political Sociology, 1, 115–126.
Merton, R. (1957). Priorities in scientific discovery. American Sociological
Review, 22, 635–659.
Niemann, D., & Martens, K. (2018). Soft governance by hard fact? The OECD
as a knowledge broker in education policy. Global Social Policy, 18(3),
267–283. https://doi.org/10.1177/1468018118794076
Nye, M. (1996). Before big science: The pursuit of modern chemistry and physics
1800–1840. Harvard University Press.
OECD. (2000). PISA 2000 Technical report. OECD.
OECD. (2003). PISA 2003 Technical report. OECD.
OECD. (2006). PISA 2006 Technical report. OECD.
OECD. (2009). PISA 2009 Technical report. OECD.
OECD. (2010). The high cost of low educational performance: The long-run
economic impact of improving PISA outcomes. OECD. https://www.oecd.org/
pisa/44417824.pdf
OECD. (2012a). PISA 2012 Technical report. OECD. https://www.oecd.org/
pisa/pisaproducts/PISA-2012-technical-report-final.pdf
OECD. (2012b). Better skills, better jobs, better lives: A strategic approach to
skills policies. OECD. https://www.oecd.org/education/imhe/IMHEinfos_
Jult12_EN%20-%20web.pdf
OECD. (2015). PISA 2015 Technical report. OECD.
OECD. (2018). PISA 2018 Assessment and analytical framework. OECD.
Price, D. S. (1963). Little science, big science . . . and beyond. Columbia Univer-
sity Press.
Robertson, S. (2019). Comparing platforms and the new value economy in
the academy. In R. Gorur, S. Sellar, & G. Steiner-Khamsi (Eds.), Compara-
tive methodology in the era of big data and global networks (pp. 169–186).
Routledge.
Schleicher, A. (2018). World class: How to build a 21st-century school system,
strong performers and successful reformers in education. OECD.
Schut, R. K. (2019). Investigating the social world: The process and practice of
research. SAGE Publications.
Serrão, A. (2014). PISA: A avaliação e a definição de políticas educativas [Assess-
ment and definition of educational policies]. In M. L. Rodrigues (Ed.), 40 Anos
de políticas de educação em Portugal (Vol. 1, pp. 269–291). Edições Almedina.
Teodoro, A. (2015). A construção da educação mundial ou o lugar da Educação
Comparada no estudo das políticas (e práticas) de educação. [The construc-
tion of the global education or place of comparative education in the study
of education policies (and practices)]. Revista Brasileira de Pós-Graduação,
29(12), 859–877.
Teodoro, A. (2016). Governando por números: Os grandes inquéritos estatísticos
internacionais e a construção de uma agenda global nas políticas de educação
[Governing by numbers: The major international statistical surveys and the
construction of a global agenda in education policies]. Em Aberto, 96(29),
41–52.
Teodoro, A. (2020). Contesting the global development of sustainable and inclu-
sive education. Education reform and the challenges of neoliberal globaliza-
tion. Routledge.
Turner, J. (2003). Little book, big book: Before and after little science, big science:
A review article. Journal of Librarianship and Information Science, 32(2),
115–125.
Uher, J. (2018). Data generation methods across the empirical sciences: Differ-
ences in the study phenomena’s accessibility and the processes of data encoding.
Quality & Quantity. International Journal of Methodology, 53(1), 221–246.
Verger, A., Fontdevila, C., & Parcerisa, L. (2019). Constructing school autonomy
with accountability as a global policy model: A focus on OECD’s governance
mechanisms. In C. Ydesen (Ed.), The OECD’s historical rise in education.
Global histories of education (pp. 219–243). Palgrave Macmillan.
Volante, L. (2017). The PISA effect on global educational governance. Routledge.
Weinberg, A. (1961). Impact of large-scale science on the United States. Science,
134(3473), 161–164.
Conclusion
Limitations and risks of an
OECD global governance project
António Teodoro

Introduction: the origins of the OECD


The concerns of the Organisation for European Economic Cooperation
(OEEC), the forerunner of the Organisation for Economic Cooperation
and Development (OECD), with education derive directly from the eco-
nomic sphere (Leimgruber & Schmelzer, 2017; Schmelzer, 2016). The
countries that signed the convention which created the OEEC in 1948
strove, primarily, to reach an understanding on maximizing national
capabilities and potentials, to increase their production, develop and
modernize their agricultural and industrial equipment, foster trade by
reducing barriers to mutual trade, encourage full employment and
restore, or maintain, the stability of their economies, including trust in
their national currencies.
In addition, the countries that signed the convention stipulated that the
parties would use the available manpower fully and rationally. Giving
content to this clause implied that, shortly afterwards, in 1953, the
European Productivity Agency was created within the framework of the
OEEC, followed in 1958 by the Office for Scientific and Technical
Personnel (OSTP). In 1970, still under the impact of the launching of the
first artificial satellite, Sputnik, by the USSR, the present-day
Education Committee of the OECD was formed, as a result of merging
several bodies connected to science and training of scientific and technical
staff. At the core of these decisions lay the conviction that science is the
driving force of progress, and that overcoming the shortage of qualified
researchers and engineers would have long-term consequences in the edu-
cation systems, producing considerable changes not only in higher educa-
tion but especially in general education at basic and secondary levels.
The emergence within the OEEC/OECD of education as a priority
and as a decisive issue for economic growth follows the emergence and
later dissemination of the Human Capital Theory, formulated in 1960
by Theodore Schultz and refined two years later in the supplement of the
Journal of Political Economy, “Investment in Human Beings” (Schultz,
DOI: 10.4324/9781003255215-10
1962). The supplement already included other pioneering studies,
anticipating what Gary Becker would publish as "Human Capital"
(Becker, 1964), which has since served as the locus classicus of the topic.
The theory of human capital became ubiquitous in the work of the
OECD, assuming the role of scientific (and economic) legitimation of the
climate of euphoria, to use Husén's term (Husén, 1979), which would
shape the expansion of education systems in the 1960s and 1970s.
As Vera Centeno remarked, the OECD

was designed as a place . . . where likeminded countries could build
a common economic but also cognitive transnational space . . .
that moved from being territorially centred to become conceptually
centred.
(Centeno, 2017, p. 28)

Additionally, vis-à-vis the OEEC, which was able to issue binding deci-
sions, the OECD was equipped with weaker legal instruments. Regarding
its goals, there is no explicit reference to education in the OECD Con-
vention though “there was always an ‘inferred role’, derived from early
human capital formulations of links between economic productivity and
educational investment” (Rizvi & Lingard, 2009, p. 438). Besides, as
Papadopoulos observed

the nearest it comes to getting such a reference is in Article 2(b), on
policies designed to promote the development of Member countries’
resources, in science and technology, to encourage research and to
promote vocational training.
(Papadopoulos, 1994, p. 11)

Deriving inspiration from the practice set up early on in the economic
policy sphere, in 1958–1959, the OEEC/OECD started conducting yearly
examinations with the goal of assessing the general situation of scien-
tific and technical education, the prime concern at the time, as well as
other specific problems each member country faced. The technique used
consisted of sending to each country a small team of experts to meet
with administrative officials and representatives of other interested sec-
tors. From these interviews, the team of experts produced a report, which
was studied in a confrontation meeting, held at the OECD headquar-
ters, where high-level officials of the country under review answered the
various questions submitted by the examiners and by the members of the
OECD’s Steering Committee.
However, it was in the 1980s, with the rise of neoliberalism, that a
turning point occurred in the OECD’s procedures, creating new and more
complex modes of regulation for education policies. Project INES enabled the
OECD to develop strong expertise in “output statistics” (Eide, 1990), that
is statistics directed at measuring “school efficiency” by assessing young
students’ learning. The problem lay in how to do this at the international
level, given the great diversity of cultural contexts and national
curricula guiding students’ training. Resorting to the concept of “literacy”,
which originated in the field of recurrent adult education and was then widely
used in some of the OECD’s member countries (specifically, the Nordic
countries, in Europe, and Canada),1 enabled the OECD to overcome this
limitation and build its most effective regulatory instrument, PISA.
In this context, the current power of the OECD goes beyond the
already important role of setting the global agenda for education. Draw-
ing on an analogy with the distinction that Basil Bernstein made between
recognition and realization, Roger Dale argues that the influence of an
international organization like OECD – the main world think tank of
hegemonic globalization in the field of education2 – can be found not
merely in Steven Lukes’s second dimension of power – “power as set-
ting agenda” – but, especially, in its third dimension – “the power to
set and control the rules of the game, and to shape preferences” (Dale,
2008, p. 3). Hence, Roger Dale’s conviction that the role of this interna-
tional organization has been changing, increasingly becoming “problem
definer” rather than “solution provider”.

PISA as the basis for the global educational reform movement
PISA is not only the world’s most comprehensive and reliable indicator
of students’ capabilities, it is also a powerful tool that countries and
economies can use to fine-tune their education policies. . . . That is
why the OECD produces this triennial report on the state of education
around the globe: to share evidence of the best policies and practices,
and to offer our timely and targeted support to help countries provide
the best education possible for all of their students.3

Gone are the days when education did not come within the purview of
the OECD (and of its predecessor, the OEEC). At the end of the second
decade of the 21st century, the OECD’s Secretary-General, Angel Gurría,
proclaims the centrality of education in development processes.
For this reason, within the framework of its powers, the OECD assumes
the priority of offering its member (and associate) countries a “powerful
tool”, which will enable them “to fine-tune their education policies”,
constructed from “the world’s most comprehensive and reliable indica-
tor of students’ capabilities”. This is the ambitious role given to PISA, a
programme launched in 2000 and since then repeated every three years.
Chinese-born US social scientist Yong Zhao wonders how a “non-novelty”
such as PISA, afflicted by serious conceptual and methodological
frailties, and preceded by other, more solid, surveys, could have become
this mighty regulatory instrument (Zhao, 2020). The answer can only
come from an analysis of how a small organization like the OECD,
which did not belong to the United Nations system, became the
key organization in legitimating the global educational reform
movement (GERM), which has fed the education policies of countries and
economies since the late 20th century.
Finnish educator Pasi Sahlberg coined the term GERM to designate the
process based on the idea that schools had mediocre outcomes and that
a global reform was required. To this end, it became necessary, first, to make
schools more efficient and bring them closer to the needs of the techno-
logical development generated by the revolution of the new ICT; second,
adapt them to the new configuration of the economic competition among
states and regions as a result of global and regional integration processes
(in Europe, in particular, but also in regions such as North America and
Southwest Asia).
The process began in the (then) centre of the world system with the
large-scale reforms of the Reagan administration in the United States
(A Nation at Risk, 1983) or of Margaret Thatcher’s government in
England (Education Reform Act, 1988), after being tried out in a
periphery that served as an actual laboratory for many of these solutions,
namely post-coup Chile under General Pinochet. Many countries,
located in various parts of the world system, began structural reforms in
their school systems involving duration of schooling, curricular organi-
zation, evaluation mechanisms, school management and administration,
teacher training and career regulation. This was the case of countries
from southern and northern Europe (Spain, Portugal and Sweden) but
also of Australia, New Zealand, the United States, Brazil and countries
from the Asian-Pacific region.
This process of transferring education policies from the strictly
national sphere to beyond the borders eventually became a global move-
ment based on four “common senses” that Levin and Fullan (2008) sys-
tematized as follows:

1. Competition among schools leads to better outcomes for students.
2. School autonomy is the most adequate means to bring about this
   competition.
3. Parents should be free to choose the school they want for their
children.
4. There should be a single national curriculum and a regular compari-
son system of outcomes among schools to allow informed choice.
Sahlberg identified five of the main common features of the education
policies generated by this global movement from the 1990s onwards, gen-
erally presented by national authorities as requirements to improve the
quality of their respective school systems (Sahlberg, 2016, pp. 133–136).
The first, and arguably the most powerful, was the creation of
competition mechanisms based on student enrolment, allowing families to
choose the school for their children. In other words, the goal was to create an
education market, be it by supporting and expanding different forms
of private education (both religious and lay in nature) capable of
competing with the traditional public education systems, or by new forms
of managing and distributing public resources, such as the voucher
system in Chile (and the United States), free schools in Sweden, or charter
schools in the United States. In those places where this was not possible,
or had limited acceptance, the creation of (quasi-)markets generating
competition among schools (public schools, in the vast majority) was
done by creating school rankings (school league tables) drawn up on the
single criterion of the scores obtained by their students in standardized
exams and university entrance tests. The simple idea underpinning these
policies was to bring into the education field the competition principles of
the economic market, absent from many of the public education systems
where the state historically took on the role of main provider of public
education. The belief was that this competition would lead to an increase
in school quality, this sacrosanct concept disseminated by the OECD in
the early 1990s, and that constitutes the leitmotiv of (nearly) all public
policy interventions of the last 30 years.
The second feature was the standardization of teaching and learning
in schools. The definition of national standards, the rapid development
of forms of external (standardized) evaluations of school learning and
the movement towards school evaluation (from these standards) led
to a decrease in teacher autonomy in their use of innovative teaching
methods that did not comply with the traditional school grammar of
“teaching many as if they were only one”. The professional culture of
teachers, who in several countries enjoyed considerable pedagogic
autonomy – forged through highly critical teacher training and broad
participation in pedagogic and professional movements – was gradually
replaced by a pragmatism aimed at responding efficaciously to nationally
defined learning standards. This standardization movement, rather than
generating differentiation in methods and educational responses, led to a
decline in participation in the pedagogical renewal movements that had
been crucial to educational responses in the period before the
democratization of schooling.
The third feature was the rise of a core curriculum, with emphasis
placed on reading, mathematics and science literacy among school
knowledges. The common-sense assumption that these disciplines
constitute the more relevant (and structured) subjects in school learning was
legitimized (and reinforced) by the OECD when it adopted the PISA, a
test of limited validity applied to 15-year-olds in these three domains,
as the basis for a league table of the education systems of participat-
ing countries (and regions). Different countries (from North America to
Europe and from Latin America to Asia) adopted curricular reforms that
undervalued other educational fields (humanistic training, artistic
education, and physical and sports education), concentrating the main
teaching time instead on areas considered to be relevant to the
international comparison of school systems.
The fourth feature that was globally observable in educational reforms
was the “borrowing of change models from the corporate world” as
the main means of improvement (Sahlberg, 2016, p. 135). The development
of this feature was facilitated by the spread of the new public management
approach across the public administrations of different countries, even if
its introduction into education systems with a strong bureaucratic,
centralized emphasis took hybrid forms, often confined to a symbolic and
discursive level.
cept of business efficacy, focused on outputs and devaluing processes, to
the educational field became the key element in the approach to assess-
ment methods of schools and, often, of teachers themselves.
The fifth and last feature (or trend) pointed out by Sahlberg was the
link between “test-based accountability policies” of schools and teachers
and students’ achievements. This trend, sometimes unfulfilled due to the
teachers’ (and their unions’) strong opposition, generated policies for the
assessment and development of teaching careers in countries as distinct
as Brazil, the United States and England, based on rewards and
punishments according to a “merit” measured by the place their respective
schools occupied in rankings built from nationally organized exams and
standardized tests.4
The OECD progressively assumed a central role in this global reform
movement, becoming its key think tank, especially after the launch of
project INES in the 1990s. Still, it was PISA, the ultimate embodiment
of these “output statistics” (Eide, 1990), that consecrated the OECD’s
position as provider of the legitimizing elements underpinning that
movement. As Maren Elfert (2019) recalls, this happened after
the publication, in 1989, of the influential report Education and the
Economy in a Changing Society (OECD, 1989). The book presents
the results of a conference held in Paris the previous year, at which the
justification for a tighter connection between education systems and the
world economy, marked at the time by the powerful rise of neoliberalism,
was defined (and reinforced).5 Education, more than a common
good, became, in the OECD’s hegemonic discourse, a tool for global
competition (Rubenson, 2008).
The same theoretical source: from human capital theory to knowledge capital theory
Four decades later, the updating of Human Capital Theory put forward
by the American economist Eric A. Hanushek and the German economist
Ludger Woessmann under the name of Knowledge Capital Theory
(Hanushek & Woessmann, 2015a, 2015b) enabled the OECD (and the
World Bank) to revamp the theoretical framework for their analyses and
proposals, fostering a narrative that links the economic growth of
countries and economies to the results of students’ learning, as measured
by the large surveys conducted by the OECD (and the IEA), among
which PISA features prominently.
The rationale of the Knowledge Capital Theory is very simple (and
appealing): the levels of cognitive development of a given country make
it possible to know approximately the cognitive levels of the labour
force of that country; and, in turn, it is the quality of that labour force
which will determine the levels of economic growth. Then the question
arises: how to operationalize the knowledge of those cognitive levels of
the labour force? By means of an equally simple device: knowing the
results obtained by the students of that country in PISA (and in TIMSS,
at first).
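The core of this rationale can be written as a schematic growth regression of the kind Hanushek and Woessmann estimate (an illustrative form only; their published specifications include further controls):

```latex
% Schematic knowledge-capital growth regression (illustrative form)
g_i = \alpha + \beta \, T_i + \gamma \, \ln(y_{i,0}) + \varepsilon_i
```

where \(g_i\) is country \(i\)'s average annual growth of GDP per capita over the period, \(T_i\) its mean score on international tests such as PISA (or TIMSS), \(y_{i,0}\) its initial GDP per capita, and \(\beta\) the estimated effect of "knowledge capital" on growth.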
This argument is presented and developed in the report Universal Basic
Skills: What Countries Stand to Gain, prepared for the OECD in 2015 by
Hanushek and Woessmann (2015b). Satisfaction with the implications
of this argument led the OECD’s Director for Education and Skills,
Andreas Schleicher (accompanied by UNESCO Assistant Director-General
Qian Tang), to sign an editorial opening the report that presents 2015 as a
date opening up a new future: “Education post-2015: Knowledge and skills
transform lives and societies”.

Everywhere, skills transform lives, generate prosperity and promote
social inclusion. If there is one lesson we have learned from the global
economy over the past few years, it is that we cannot simply bail our-
selves out of an economic crisis, we cannot solely stimulate ourselves
out of an economic crisis, and we cannot just print money to ease
our way out of an economic crisis. We can only grow ourselves out
of bad economic conditions and, in the long run, that depends more
than anything on equipping more people with better skills to collabo-
rate, compete and connect in ways that drive our societies forward –
and on using those skills productively. Ensuring that all people have
a solid foundation of knowledge and skills must therefore be the
central aim of the post-2015 education agenda.
(Schleicher and Tang, in Hanushek &
Woessmann, 2015b, p. 9)
The magic of this argument does not simply lie in allowing an under-
standing of the past, finding a constant in the relation between economic
growth and knowledge capital,6 but, especially, in estimating projections
for the future. This forecasting endeavour was undertaken by Hanushek
and Woessmann in a study about the “Asian miracle” (Hanushek &
Woessmann, 2016), but also in a more focused study, on the Economic
Benefits of Improving Educational Achievement in the European Union
(Hanushek & Woessmann, 2019).
In this latter study, carried out at the request of the European Commis-
sion, E. Hanushek and L. Woessmann quantify the economic benefits of
educational improvement for each of the EU countries, from an analysis
centred on the relationship between educational achievement (as measured
by PISA) and the long-run growth of nations. The report incorporates
the dynamics of educational reform into its projections – it takes time
for student improvements to appear and for better-skilled workers to
become a noticeable proportion of the workforce – and models four
educational improvement scenarios.

The first scenario considers an increase in student achievement of 25
PISA points. This reform, shown possible by several EU countries,
would add €71 trillion to EU GDP over the status quo. This amounts
to an aggregate EU gain of almost 3.5 times current levels of GDP and
an average GDP that is seven percent higher for the remainder of the
century.
The second scenario brings all low-performing students up to basic
skill requirements for competing in today’s economy (PISA level 2).
Achieving this goal would boost average GDP over the 21st century
by nearly four percent. The more limited goal of the Strategic Framework
for European cooperation in education and training (ET 2020)
to reduce low achievement to 15 percent by country would have only
about one-seventh the impact.
The third scenario matches the goal of ET 2020 calling for reduc-
tions in early school leaving. Enhancing the skills of all potential early
school leavers is projected to raise average GDP by 0.7 percent. Just
reaching the specific ET 2020 goal of no more than 10 percent early
leavers in each EU country has significantly less impact (0.1 percent).
The fourth scenario focuses on top performers, ensuring that at
least 15 percent of students in each country achieve PISA level 5.
While having minimal effect on currently high-achieving countries,
average GDP across EU countries would be 0.5 percent higher over
the remainder of the century.
(Hanushek & Woessmann, 2019, p. 3)
Table C.1 The economic benefits of improving educational achievement in the
European Union

Policy Scenarios                          Value of Reform
                                      At present        As % of        As % of future
                                      costs             present GDP    GDP (discounted)
Increasing average performance
  (25 points)                         71 027 billion    340 %          7.3 %
Achieving universal basic skills      37 898 billion    188 %          3.9 %
At most 15 percent low achievers       5 223 billion     25 %          0.5 %
Enhancing skills of early school
  leavers                              7 097 billion     34 %          0.7 %
At most 10 percent early leavers       1 144 billion      6 %          0.1 %
Increasing top performance             4 615 billion     22 %          0.5 %

Source: Hanushek & Woessmann, 2019, p. 5

And the authors submit to the European Commission a quantification
of the scenarios presented (see Table C.1). The inference of the report’s
authors is obvious and direct:

The most fundamental conclusion must be a recognition of the value
of improved educational performance. It is essential that countries
realize that their future is highly dependent on the quality of their
schools.
(Hanushek & Woessmann, 2019, p. 33)
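The projection logic that produces figures like those in Table C.1 can be illustrated with a deliberately stylized calculation. All parameter values below are invented for illustration; the report's calibrated model is far more detailed:

```python
# Stylized sketch of a Hanushek-Woessmann-type projection. All parameters
# are illustrative placeholders, not the report's calibrated values.
# Logic: a skill gain raises the growth rate only as reform-educated
# cohorts phase into the workforce; the resulting GDP uplift relative to
# the no-reform path is then discounted back to a present value.

def projected_gain(score_gain_sd=0.25,  # skill gain in std. deviations (~25 PISA points)
                   growth_per_sd=0.02,  # assumed growth-rate effect per sd of skills
                   reform_years=15,     # years until school leavers reach the higher skills
                   work_life=40,        # years a cohort remains in the workforce
                   horizon=80,          # projection horizon (rest of the century)
                   discount=0.03):      # discount rate applied to future gains
    path = 1.0      # GDP on the reform path, relative to the no-reform path
    pv_gain = 0.0   # present value of the gains, in units of current GDP
    for t in range(1, horizon + 1):
        # share of the workforce already educated under the reform
        phased_in = min(t / (reform_years + work_life), 1.0)
        path *= 1 + growth_per_sd * score_gain_sd * phased_in
        pv_gain += (path - 1) / (1 + discount) ** t
    return pv_gain

# Larger skill gains, and lower discount rates, yield larger present values.
gain = projected_gain()
```

The long reform and workforce-turnover lags explain why even large percentage-of-GDP figures translate into modest annual effects: most of the gain accrues decades out and is heavily discounted.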

The simplicity of the argument is touching: there is a measurable
relation between achievements in educational improvement and the “wealth
of nations” (European nations, in this case). This “simplicity” is
paralleled only by its conceptual frailty, as several authors have been
demonstrating, albeit, it must be acknowledged, with little influence on the
“soft power” developed by organizations like the OECD, the World Bank
or even UNESCO and presented as deriving from academic research.
American social scientist Nelly Stromquist emphasizes that Knowl-
edge Capital Theory, as formulated by Hanushek and Woessmann, rests
on two theoretical premises: (i) the quality of knowledge is the main
determinant of wealth generation and (ii) a country’s economic output is
determined primarily by internal/endogenous factors (Stromquist, 2016).
The second premise combines neoclassical theory and endogenous
growth theory; the former holds that economic growth is mainly
determined by the intensive production of capital, which leads to higher
worker productivity; the latter upholds that national investments
in human capital and innovation are the main factors in fostering eco-
nomic growth. Even if the second premise, highly contested within eco-
nomic theories, is accepted, more detail in the analysis of the former
premise is required: the quality of knowledge is the main determinant of
wealth creation, and it can be measured by the results of the PISA tests
of a particular country or economy.
Yet, as Stromquist (2016) asks, what is education quality? For econo-
mists like Hanushek and Woessmann – and for organizations like the
OECD – the answer is limited to the acquisition of knowledge and skills
needed for future workers (“better jobs”, in the OECD’s leitmotif). This
is a very limited view, as many other social scientists have been insisting.
“Education quality” is measured by its contribution to an “action project
for social and cognitive justice”,7 in other words, by an action which
produces and supports positive transformations in gender, class, race and
ethnic relations. To claim that the “knowledge quality” of a particular
generation of workers can be measured by reading, mathematics and
science tests applied to 15-year-olds undoubtedly amounts to an extreme
(and gross) distortion of reality – starting with the tests’ own distortions:

International testing programs introduce distortions of their own,
one of the most salient being student motivation to perform well
in those tests. While such motivation might be strong in countries
that aggressively seek to present an advanced national face, other
countries might view these international tests as relatively useless,
nationally embarrassing, and/or expensive exercises, which might
not promote student motivation to perform well.
(Stromquist, 2016, p. 67)

Hikaru Komatsu and Jeremy Rappleye, two researchers from Kyoto
University, present an even more radical criticism: the new global policy
regime implemented by the OECD and other international organizations
on the basis of the studies by Hanushek and Woessmann, which associate
the results of PISA and other ILSAs with economic growth, is founded
on invalid statistics (Komatsu & Rappleye, 2017; see also Rappleye &
Komatsu, 2019).

Our focus has been the internal validity of the H&W statistical claims.
By focusing on extending data up through the present (2014), and
matching test score change with subsequent economic growth – all
logical moves for pursuing the overarching question – the claims of
H&W are rendered invalid. As stated previously, this is not because
we endorse Human Capital premises but because invalidation based
on the same dataset and methodology, obtained after successfully
reproducing H&W’s (2015a) findings, is more powerful than other
genres of critique. Utilizing the same data and methods renders our
critique conclusive.
(Komatsu & Rappleye, 2017, p. 20)
Komatsu and Rappleye insist that their purpose is not to contend that
education plays no role in economic growth or that the quality of learn-
ing has no bearing on economic success. What they aim to demonstrate
is that setting learning goals centred on skills that generate economic gain
can ultimately prove harmful to students’ learning (Komatsu &
Rappleye, 2017). One possible outcome of this approach is that political
and administrative authorities, teachers, families and students formulate
strategies that seek the easier route – that of finding immediate reward
(good results in the tests) – and give up on more complex and meaningful
learning, capable of creating “better jobs” and “better lives”, but also
“better and fairer societies”. And these scholars end by pointing out what
may be extremely ironic: the political claims that assume (and foster)
PISA results as the prime indicators of the reality of education systems
are, after all, the cause of the decline and stagnation of the quality of
students’ learning in many of those countries and economies.

The limitations of the OECD approach and the required humanistic alternative
In 2018, the influential Director for Education and Skills at the OECD,
Andreas Schleicher, published World Class: How to Build a 21st Century
School System (Schleicher, 2018). Schleicher’s book is important not for
advancing conceptual innovation but for systematizing and grounding
recent options taken by the OECD in the field of education, as Michael
Fullan emphasizes on its back cover, when he writes that the author
“grasps all the key issues, and does so through keeping his ear to the
ground and by working solutions”. The book builds towards the climax
of its two final chapters: “Make Education Reform Happen” (Chapter 5)
and “What to Do Now” (Chapter 6). The assumption is that
“the gap between what education systems provide and what our societies
demand is likely to widen further”. He immediately adds: “There is a
risk that education becomes our next steel industry, and schools a relic of
the past” (p. 223). Hence, the more common challenges “are not about
designing reforms, but how reforms can be put into practice successfully”
(p. 204).
In the book, Schleicher positions himself as the father and main mentor
of PISA and other international instruments of evaluation and analysis
within the framework of the OECD. Acknowledging that having studied
physics and worked for some years in the medical industry led him to
view education “through the eyes of a scientist” (Chapter 1), he pro-
poses that the approach to educational matters become “not less of an
art, but more of a science” (p. 16). His past in the hard sciences, of a
markedly experimental nature, associated with his having worked with
three distinguished scholars in the field of comparative and international
education – Torsten Husén, John Keeves and Neville Postlethwaite, all
belonging to the same discourse community, one that António
Nóvoa calls the modernization approach (Nóvoa, 1998, pp. 72–73) –
have undoubtedly marked the outlook and the proposals of the main
ideologue (and political-administrative official) of the OECD in the field
of education (and educational reforms) for the past three decades.
The OECD narrative, systematized in Schleicher’s book, is powerful
and, at a time when public debate on education in national and
local spaces has all but disappeared, or is colonized by professional
commentators from outside the field of education, coming from politics
or economics, has allowed several governments (and the European Union
itself, which delegated to the OECD all reflection on education matters)
to find specific formulations and legitimacy for many of their policies.
The core of this OECD narrative is that education policies need to be
informed by scientific knowledge. And for Schleicher, who completely
overlooks the historic contributions of scientific research in the fields of
education (and of pedagogy), relevant knowledge is that which derives
from large statistical surveys built on the basis of new indicators: “It
was the idea to apply the rigours of scientific research to education pol-
icy that nudged the OECD to create PISA in the late 1990s” (Schleicher,
2018, p. 17). Andreas Schleicher stresses that it was this survey that
started a new generation of education policies informed by scientific
research:

Of course, the OECD had already published numerous comparisons
on education outcomes by that time, but they were mainly based
on years of schooling, which isn’t always a good indicator of what
people are actually able to do with the education they have acquired.
(Schleicher, 2018, p. 18)

The main misconception of this narrative stems from the identification
of scientific research applied to education policy with PISA (as
well as other subsequent surveys, such as TALIS, PIRLS or PIAAC,
and the new PISA for Development, PISA for Schools and PISA Baby).
Those who work in the social sciences and humanities, and do not
merely identify science with its positivist paradigm, know that it is not
possible to turn complex realities into simple, easily measurable things
designed to establish hierarchies. Given the conceptual frailty of
the OECD (and of Andreas Schleicher in World Class), this international
organization’s ambition to assert itself as the central “think tank” of
education reforms worldwide can do nothing but impoverish the
much-needed reflection on how education may participate in the
co-construction of freer, fairer and more supportive societies.
The OECD’s proposals, which are inscribed in what I have named
neoliberal cosmopolitism,8 have been strongly questioned by the rise of
populist nationalism and authoritarian neofascist movements in different
countries and regions of the world, some of them influential members of
the OECD. To these movements, this type of organization derived from
the international order created in the aftermath of World War II is
unnecessary and counterproductive to the assertion of national
sovereignty. In this framework, education can only be considered a
question of unshared national sovereignty, in which a cosmopolitical view
of education policies (and practices) makes little sense, or is even
counterproductive to the assertion of the community’s superiority over
the other: the foreigner, the migrant, the refugee, or simply the believer in
another religion or the person with a skin colour different from the
nationally dominant one.
Besides, these movements represent a fierce criticism of liberal
democracy (Sedgwick, 2019), the paradigm that underpins Schleicher’s
book and proposals already discussed, as well as many of the OECD’s
documents, although for this organization the issue of democracy has
never been an admission requirement. As a result of its origin – support
of the Marshall Plan in the context of the Cold War, which continued
in Europe after World War II – the first concern of the OECD at its
creation was “to contribute to the expansion of world trade on a
multilateral, non-discriminatory basis in accordance with international
obligations”, according to paragraph (c) of Article 1 of the convention
signed in Paris on 14 December 1960.
the member states could be found exclusively in the economic field,
as the name denotes. Article 2 of the convention consolidates this ori-
entation by establishing that “the Members agree that they will, both
individually and jointly”:

1. Promote the efficient use of their economic resources.
2. In the scientific and technological field, promote the development of
their resources, encourage research and promote vocational training.
3. Pursue policies designed to achieve economic growth and internal
and external financial stability and to avoid developments that might
endanger their economies or those of other countries.
4. Pursue their efforts to reduce or abolish obstacles to the exchange of
goods and services and current payments and maintain and extend
the liberalization of capital movements.
5. Contribute to the economic development of both member and
non-member countries in the process of economic development
by appropriate means and, in particular, by the flow of capital to
those countries, having regard to the importance to their economies
of receiving technical assistance and of securing expanding export
markets.
The OECD seems to get along well with antidemocratic liberalism (in
the People’s Republic of China, but also in Chile, Hungary, or Poland).9
As Yascha Mounk states in his remarkable essay The People vs. Democracy:
Why Our Freedom Is in Danger and How to Save It (Mounk, 2018),

the first big assumption of the postwar era appears to be wrong:
liberalism and democracy do not go together nearly as naturally as
most citizens – and many scholars – have assumed.
(p. 97)

In this context, the OECD seems to continue to care more about liberal-
ism and much less about democracy. And, in education, that reversal of
priorities constitutes a capital sin.
The OECD's proposals are also questionable from a humanistic and
critical perspective of cosmopolitism, which assumes the universality
of the human condition and the equal dignity of human beings. The
OECD comes forward today as an international organization that
works to build “better policies for better lives”, “to shape policies that
foster prosperity, equality, opportunity and well-being for all”.10 In the
field of education, two new buzz phrases are added: “better skills” and
“better jobs”.
The narrative developed in the OECD’s documents rests on an indi-
vidualistic view of the world, and of the economic and cultural relations
that human beings establish among one another, in search of better jobs
and a better life. It is a world of free consumers who fight for better
jobs, accumulating better skills at school and throughout life, in an iso-
lated journey, in constant competition for survival in a hostile world
threatened by unemployment. In consumption, no solidarity is estab-
lished; one competes for better prices and better social positions. It is
in production, in labour, that the values of solidarity are consolidated
as a starting point for the construction of societies guided by social jus-
tice and citizenship engaged with the dignity of all human beings. This
dimension, with profound implications in the organization of schools
and the modes of learning and teaching, clearly depicted in the history
of modern pedagogy, by Adolphe Ferrière, Jean Piaget, John Dewey,
Célestin Freinet or Paulo Freire, falls regrettably outside the OECD's
concerns and proposals.
"In the dark, all schools and education systems look the same" is the
title given by A. Schleicher (Schleicher & Zoido, 2016, p. 374) to a
section of a chapter published in The Handbook of Global Education
Policy (edited by Mundy et al., 2016). The question is not the
commendable effort to "illuminate" education systems with relevant
information. The key issue is that the manner in which one "illuminates"
derives from options of a political nature that must be weighed and
debated in the public space. These are political choices, not technical
issues or neutral indicators. As Iveta Silova, Komatsu and Rappleye
challenge us:

In short, we propose to initiate a different sort of conversation than
the one currently surrounding ILSAs – one that helps education
researchers, practitioners, and policy makers alike to imagine some-
thing beyond the current education paradigm – and gives the next
generation a chance of shifting off our current trajectory of environ-
mental catastrophe. While LSA data itself presents multiple dilem-
mas, it is the parochial economic and cultural logics underpinning
the analysis of this data that is far more problematic. So we whole-
heartedly agree that it is time to focus on measuring what really mat-
ters in education, but this means moving away from a myopic focus
on technology, economic growth, and Western cultural scripts as the
standard of the real.
(Silova et al., 2019, p. 346)

Social justice and polis are concepts that are absent from the new
OECDism.11 The reality is that we stand on the precipice of a planetary
cliff, with two options laid out before humanity. One side is the contin-
ued expansion of democracy, the further extension of human rights and
freedoms, and concerted efforts to address the growing threats and reali-
ties of global climate change. On the other is the dismantling of democ-
racy to be replaced by populist, authoritarian rule; increased attacks on
the marginalized, oppressed and exploited populations of the world; and
acceleration of the degradation of planet Earth. We need international
organizations to be fully capable of confronting the challenges of the
post-truth, radically sceptical world we live in today, which has led to the
rise in atavistic, xenophobic neopopulist movements. This can be accom-
plished by (a) bridging knowledge production between universities (and
research) and the public, (b) supporting the revitalization of the public
spheres in old and new forms, (c) facilitating discourses that challenge the
dominant ideologies of today, (d) training the next generation of public
intellectuals and (e) serving as public intellectuals ourselves, intervening
in the public spheres to reaffirm our pursuit of social justice, democracy
and truth itself.12

Notes
1 See Eide, 1990.
2 My position, which I have been arguing since the publication of the article
Organizações internacionais e políticas educativas nacionais: A emergência
de novas formas de regulação transnacional ou uma globalização de baixa
intensidade [International Organizations and National Education Policies:
The Emergence of New Forms of Transnational Regulation or a Low-Intensity
Globalization] (Teodoro, 2001; see also Teodoro, 2003), is corroborated by
Henry et al. (2001).
3 Angel Gurría, opening citation of the brochure on PISA 2018, prepared by
Andreas Schleicher. Available at: www.oecd.org/pisa/PISA%202018%20
Insights%20and%20Interpretations%20FINAL%20PDF.pdf
4 For the development of this GERM concept, please see Mundy et al. (2016),
and Saltman and Means (2019).
5 See Teodoro (2020), Chapter 2 (pp. 33–48).
6 See Figure 2.1, Hanushek & Woessmann, 2015b, p. 26.
7 See the development of this argument in Teodoro, 2020, p. 107 and ff.
8 Keynote at the Second International Conference of Comparative Education,
Portuguese Society of Comparative Education (SPCE-SEC), University of
Madeira, Funchal, 30 January 2018.
9 It should be noted that the OECD does not raise the issue of democracy as
one of the requirements for admission to the organization. Portugal joined
the organization during Salazar’s dictatorship (4 August 1961).
10 See the OECD website: www.oecd.org/about/ (accessed on 16.08.2021).
11 Concept coined by Portuguese sociologist Sacuntala de Miranda in the
1980s, referring to the educational ideology developed by the OECD since its
foundation and, particularly, after the Mediterranean Regional Project. See
Miranda (1981) and Teodoro (2019).
12 These arguments are the result of joint reflection with Carlos Alberto Tor-
res (UCLA, US), José Beltrán (U. Valencia, Spain) and Régis Malet (U. Bor-
deaux, France) while preparing a research project submitted to the European
Research Council (ERC) in November 2019 on teacher praxis, teacher Bil-
dung and global citizenship education.

References
Becker, G. S. (1964). Human capital: A theoretical and empirical analysis, with
special reference to education. The University of Chicago Press.
Centeno, V. G. (2017). The OECD’s educational agendas – framed from above,
fed from below, determined in interaction. A study on the recurrent education
agenda. Peter Lang. https://doi.org/10.3726/b12774
Dale, R. (2008). Brief critical commentary on CWEC and GSAE 8 years on. Paper
presented at the 52nd Conference of the Comparative and International Education
Society (CIES), Teachers College, Columbia University, New York, March 17–21.
Eide, K. (1990). 30 years of international collaboration in the OECD. Inter-
national Congress ‘Planning and Management of Educational Development,
Mexico, March 26–30, 1990. UNESCO ED-90/CPA.401/DP.1/11
Elfert, M. (2019). The OECD, American power and the rise of the “economics
of education" in the 1960s. In C. Ydesen (Ed.), The OECD's historical rise
in education: The formation of a global governing complex (pp. 39–61).
Palgrave Macmillan.
https://doi.org/10.1007/978-3-030-33799-5_3
Hanushek, E., & Woessmann, L. (2015a). The knowledge capital of nations:
Education and the economics of growth. MIT Press.
Hanushek, E., & Woessmann, L. (2015b). Universal basic skills: What countries
stand to gain. OECD.
Hanushek, E., & Woessmann, L. (2016). Knowledge capital, growth, and the East
Asian miracle. Science, 351(6172), 344–345. doi:10.1126/science.aad7796
Hanushek, E., & Woessmann, L. (2019). Economic benefits of improving educa-
tional achievement in the European Union: An update and extension. EENEE
Analytical Report No. 39. European Commission.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation
and education policy. Pergamon, Elsevier.
Husén, T. (1979). L'école en question. Pierre Mardaga.
Komatsu, H., & Rappleye, J. (2017). A PISA paradox? An alternative theory
of learning as a possible solution for variations in PISA scores. Comparative
Education Review, 61(2). https://doi.org/10.1086/690809
Leimgruber, M., & Schmelzer, M. (Eds.). (2017). The OECD and the interna-
tional political economy since 1948. Springer Berlin Heidelberg.
Levin, B., & Fullan, M. (2008). Learning about system renewal. Educational
Management, Administration and Leadership, 36(2), 289–303.
Miranda, S. de. (1981). Portugal e o ocdeísmo. Análise Psicológica, 1(II), 25–38.
Mounk, Y. (2018). The people vs. democracy: Why our freedom is in danger and
how to save it. Harvard University Press.
Mundy, K., Green, A., Lingard, B., & Verger, A. (Eds.). (2016). The handbook of
global education policy. Wiley Blackwell.
Nóvoa, A. (1998). Histoire & Comparaison (Essais sur l’Éducation). Educa.
OECD. (1989). Education and the economy in a changing society. OECD.
Papadopoulos, G. (1994). Education 1960–1990: The OECD perspective.
OECD.
Rappleye, J., & Komatsu, H. (2019). Is knowledge capital theory degenerate?
PIAAC, PISA, and economic growth. Compare: A Journal of Comparative and
International Education. doi:10.1080/03057925.2019.1612233
Rizvi, F., & Lingard, B. (2009). The OECD and global shifts in education policy.
In R. Cowen & A. M. Kazamias (Eds.), International handbook of compara-
tive education. Springer. https://doi.org/10.1007/978-1-4020-6403-6_28
Rubenson, K. (2008). OECD education policies and world hegemony. In R.
Mahon & S. McBride (Eds.), The OECD and transnational governance
(pp. 241–259). UBC Press.
Sahlberg, P. (2016). The global educational reform movement and its impact on
schooling. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The handbook
of global education policy (pp. 128–144). Wiley Blackwell.
Saltman, K. J., & Means, A. J. (Eds.). (2019). The Wiley handbook of global
educational reform. Wiley Blackwell.
Schleicher, A. (2018). World class: How to build a 21st-century school system.
OECD Publishing. doi:10.1787/9789264300002-en
Schleicher, A., & Zoido, P. (2016). The policies that shaped PISA, and the policies
that PISA shaped. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The
handbook of global education policy (pp. 374–384). Wiley Blackwell.
Schmelzer, M. (2016). The hegemony of growth: The OECD and the making of
the economic growth paradigm. Cambridge University Press.
Schultz, T. (1962). Investment in human beings. Journal of Political Economy,
70(5), 1–8.
Sedgwick, M. J. (Ed.). (2019). Key thinkers of the radical right. Behind the new
threat to liberal democracy. Oxford University Press.
Silova, I., Rappleye, J., & Komatsu, H. (2019). Measuring what really matters:
Education and large-scale assessments in the time of climate crisis. ECNU
Review of Education, 2(3), 342–346. doi:10.1177/2096531119878897
Stromquist, N. P. (2016). Using regression analysis to predict countries’ economic
growth: Illusion and fact in education policy. Real-World Economics Review,
76, 65–74.
Teodoro, A. (2001). Organizações internacionais e políticas educativas nacionais:
A emergência de novas formas de regulação transnacional ou uma globalização
de baixa intensidade. In S. R. Stoer, L. Cortesão, & J. A. Correia (Orgs.), Da
Crise da Educação à “Educação” da Crise: Educação e a Transnacionalização
dos Mecanismos de Regulação Social (pp. 125–161). Edições Afrontamento.
Teodoro, A. (2003). Educational policies and new ways of governance in a
transnationalization period. In C. A. Torres & A. Antikainen (Eds.), The interna-
tional handbook on the sociology of education: An international assessment of
new research and theory (pp. 183–210). Rowman & Littlefield.
Teodoro, A. (2019). The end of isolationism: Examining the OECD influence in
Portuguese education policies, 1955–1974. Paedagogica Historica. doi:10.108
0/00309230.2019.1606022
Teodoro, A. (2020). Contesting the global development of sustainable and inclu-
sive education. Education reform and the challenges of neoliberal globaliza-
tion. Routledge.
Zhao, Y. (2020). Two decades of havoc: A synthesis of criticism against PISA.
Journal of Educational Change, 21, 245–266. https://doi.org/10.1007/
s10833-019-09367-x
Index

accountability 14, 19, 48, 49, 50, 52, 54, 59, 60, 61, 62, 63, 64, 65, 69, 88, 94, 99
active form 11, 12, 13
Addey, C. 4, 5, 11, 12, 14, 16, 17, 18, 19, 20, 34–35, 98, 138, 139
Agasisti, T. 34–35
Agenda Setting Theory 144
Aguiar, M. 91, 93
Alasuutari, P. 34–35, 41, 59, 66
Application Protocol 108
Araújo, L. 105–106
Aspers, P. 171, 177
assemblages 54
Assessment of Higher Education Learning Outcomes (AHELO) project 4
assessments 16, 19; market 16; methodologies 17; products 16
Auld, E. 26, 34
Australia 145
Avelar, M. 89
Ávila, P. 130, 139
Baird, J.-A. 146–147, 164
Ball, S. 89
Baroutsis, A. 145, 161
basic skills 60, 62, 63, 64
Baudelot, C. 62
Bauer, A. 73
Becker, G. 55, 181
Benavente, A. 132, 139
Benavot, A. 33
benchmarks 51, 54, 58; benchmarking 57
Berliner, D. C. 49
Bieber, T. 143
big data 170, 171, 179
big science 3, 8, 169, 170, 171, 172, 173, 174, 176
BNCC (of Brazil curricular base) 6, 86, 88, 89, 90, 92, 93, 94, 95, 96, 97, 99
Bodin, A. 128, 139
Bottani, N. 2
Bourdieu, P. 65, 139
Brazil 4, 6, 183, 185
Breakspear, S. 14
Busch, L. 11, 20
calculability 55
Canada 146–147; Canadian Standards Association 15
Carvalho, L. M. 74, 105, 170, 177
Cassio, F. 86, 90, 93
Castells, M. 142
Castro, M. H. 87, 88, 90
CeiED (of Lusofona University) 4, 144
Centeno, V. 170, 177, 181
centralization 63
CERI (of the OECD) 1, 4, 52
Christensen, K. B. 106
Christensson, O. 173, 178
citations 31–33
CNE (from Brazil) 90, 91, 92, 93, 96
Coe, K. 142, 162
Coleman, R. 144
competencies 74, 80, 84, 90, 91, 92, 94, 95, 96, 99
competition 50, 53
CONFENEM 126
Conselho Nacional de Educação (from Portugal) 105
constructs 75, 86, 97
Cordero, J. 34–35, 39–40
Corte, U. 171, 177
Costa, E. 105, 170, 177
Crahay, M. 62
Critical Discourse Analysis theory 148
Critical Infrastructure Studies (CIS) 11
Dale, R. 182
Demeuse, M. 62
demographic challenge 55, 58
DEVCO 18
Diário de Notícias (newspaper) 147, 162
Dijk, T. A. van 148, 165
Dixon, R. 145–146, 162, 164
Dubet, F. 60, 62, 66
Duru-Bellat, M. 62
early childhood 50, 53, 56, 64
Easterling, K. 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21
Education at a Glance 2
Education Testing Service (ETS) 51, 175
educational equity 13, 19
educational quality 14
Eide, K. 182, 185
Elfert, M. 185
Elias, N. 171, 177
employment 56, 58
Engel, L. 34–35, 37–38
ePIRLS 131
epistemic governance 48, 49, 51, 53, 55, 57, 59, 61, 63, 65, 66, 67, 68, 69
epistemic work 59, 65
epistemology 49
Ertl, H. 34
Esping-Andersen, G. 56, 57, 66
European Commission 54, 57, 58, 187, 188
European Statistical System 57
evidence-based education 52, 54, 64, 65
evidence-based policies 52
Excel 151
experimental economics 56
expertise 48, 49, 51, 52, 61, 62, 63, 67
expert knowledge 61, 62, 63
expert networks 49
experts 48, 51, 52, 54, 61, 63, 64, 80, 81, 97
Expresso (weekly newspaper) 147, 162
extrastatecraft 11, 12, 14
FCT (Portuguese Foundation for Science and Technology) 9n7, 144
Felouzis, G. 62
Fernandes, D. 128, 140
Fernandes, R. 72
Ferreira, A. 131, 140
Figazzolo, L. 143
Finland 145–146
Fontanive, N. 92
FoxTrot professional search software 150
framing 144
France 4, 5, 145–146; French Third Way 63
Freitas, L. C. 73
Fullan, M. 183, 190
funding 30
Galison, P. 172, 173, 178
Gastro, M. 173, 178
Germany 145–146; German Institute for Standardization (DIN) 15
Gerstl-Pepin, C. I. 142
Global Education Reform Movement (GERM) 3, 8, 182–185
Gonçalves, C. 131, 140
González-Mayorga, H. 143, 145
Görgen, K. 147, 164
Gorur, R. 17
governance 14, 17; governing by numbers 48, 49, 51, 53, 61, 65, 66, 67, 68
Gradim, A. 152
Grapin, N. 128, 139
Grek, S. 14, 18, 19, 33, 143
Grey, S. 143, 146
Gurría, A. 182
Hallonsten, O. 173, 178
Halo effect (the) 18
Hanushek, E. A. 186, 187, 188, 189
Hemeroteca Municipal de Lisboa (Lisbon media library) 149
Henry, M. 2
Hevly, B. 172, 173, 178
Hopfenbeck, T. 106, 147, 164
Hopmann, S. T. 106
Horta Neto, J. L. 4, 6, 70, 71, 99
Human Capital (Theory) 8, 50, 52, 55, 56, 57, 58, 62, 64, 65, 180, 181, 186
Husén, T. 57, 181, 191
IAVE (from Portugal) 130, 131, 134, 135, 136, 137
ICCS 74
ICILS 7, 126, 127, 133, 134, 137, 139
IDB 126
IDEB (from Brazil) 72, 88, 99
IEA 3, 7, 49, 51, 55, 62, 68, 74, 126, 127, 129, 130, 134, 138, 186
IERI 51
ILSA 1, 3, 7, 105, 126, 127, 137, 138, 189, 194
ILSAINC, The ILSA Industry Project 12, 22
imageries 59
indicators 54, 56, 57, 58, 60, 62, 65, 71, 74
INEP (from Brazil) 70, 93, 97
inequality(ies) 48, 50, 53, 54, 55, 56, 57, 60, 63, 64, 65; in educational systems 13
INES project 1, 2, 181, 182, 185
infrastructure 11, 12, 13, 14, 15, 16, 17, 20, 21, 22
International Organizations (IO) 65
international survey 49, 50, 51, 55, 57, 65
ISCED 70
ISO (International Organization for Standardization) 15; ISO 9000 (quality management standards) 5, 11, 12, 15, 16, 17, 18, 19, 22; ISO secretariat 16
item discrimination 116
item facility 116
Item Response Theory (IRT) 71, 77, 78, 98
Josephson, P. 172, 173, 178
journalistic genre 152
Kadushin, C. 148
Keeves, J. 191
keywords plus 31
Kimerling, J. 18, 19
Klanovicz, J. 172, 173, 178
Knowledge Capital Theory 8, 186, 188
Komatsu, H. 26, 34–36, 189, 190, 194
Kreiner, S. 106
Kuttner, P. J. 142, 162
Lafontaine, D. 132, 140
Larkin, B. 21
Lemos, V. 143, 145, 147, 161–164
Lewis, S. 34–35, 38–39, 98
lifelong learning 48, 57, 58, 67, 68, 69
Likert scale 119
Lingard, B. 14, 17, 19, 33–35, 143, 145, 161–163, 170, 178, 181
literacy 3, 26, 74, 80, 83, 95, 98, 99, 182; mathematical 79, 80; reading 75, 77; scientific 83, 84
LLCE 74
Longobardi, S. 34–35, 39–40
Lopes, P. C. 152
Lusofona University 144
Magalhães, A. 4
Marôco, J. 105, 128, 129, 140, 143
Marshall Plan 1, 192
Martens, K. 3, 14, 18, 34–35, 37–38, 143, 170, 178
Mathematics 80, 81, 82, 84, 95
McCombs, M. E. 142, 144
measurement 49, 55, 56, 57, 58, 59, 61, 64, 66, 67
MEC (from Brazil) 70, 73, 86, 88, 89, 90, 92, 93, 94, 95, 96
media consumers 142
Mendel, P. J. 15
methodological nationalism 48
methodology 49, 50
metrology 49, 62
Meyer, H. 33
Mons, N. 62
Moreno, A. C. 82
Morgan, C. 26, 34–35
Morris, P. 34, 143, 146
Mounk, Y. 193
Mullis, I. 107
NAEP 51, 126
Napoleonic model 63
narrative(s) 59, 60, 63
national assessments 53, 64, 65
neoliberal cosmopolitism 192
neoliberal globalization 8
New Right 54
Niemann, D. 34–35, 37–38, 143, 170, 178
Normand, R. 4, 5
Norway 146–147
Nóvoa, A. 14, 19, 143, 191
Nye, M.-J. 172, 178
OMC 57, 58, 59, 64
Open Method of Coordination 54, 55, 57, 58, 60, 62, 64, 66, 67
Oppelt, T. 173, 178
Organization for Economic Cooperation and Development (OECD) 1, 2, 3, 4, 5, 6, 7, 8, 9, 15, 16, 17, 18, 20, 49, 50, 52, 54, 57, 62, 66, 67, 68, 70, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 83, 84, 85, 92, 98, 105, 108–109, 126, 127, 131, 132, 142, 169, 170, 175, 176, 180, 181, 182, 183, 185, 186, 188, 189, 190, 191, 192, 193; Directorate of Education and Skills 20; new OECDism 194; Organisation for European Economic Cooperation (OEEC) 180, 181, 182
Ozga, J. 19
Papadopoulos, G. S. 181
PCN (from Brazil) 87, 88
performance 50, 52, 56, 57, 61, 68, 71, 72, 73, 75, 78, 81, 82, 89, 96, 98, 99
PIAAC 4, 191
PIREF 64
PIRLS 1, 7, 74, 126, 127, 130, 131, 132, 133, 135, 136, 137, 138, 191
PISA 1, 3, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 71, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 92, 94, 95, 97, 98, 105–106, 126, 127, 132, 133, 136, 137, 143, 169, 170, 171, 173, 174, 182, 183, 185, 189, 190, 191; PISA baby 4, 191; PISA for Development 4, 18, 191; PISA paradigm 52, 61, 63, 64, 65; PISA schools 191
Plausible Values (PVs) 109
policies 48, 50, 51, 52, 54, 55, 58, 59, 60, 63, 68, 69, 71, 72, 74, 79, 87, 89, 93, 97, 98, 99
policy borrowing 54
policymakers 48, 52, 53, 54, 59, 60, 62, 63, 64
political arithmetic (of inequalities) 48, 57
Portugal 7, 8, 183
Postlethwaite, N. 191
power distribution 15, 16, 17
Price, D. de S. 3, 4
priming 144
productivity 56, 58
proficiency 75
psychometrics 49, 52
Público (newspaper) 7, 142
public service 63
Qadir, Ali 59, 66
quality 56, 57, 58, 62, 69, 71, 72, 74, 77, 86, 91, 98, 99; certification 19; educational system 19; management standards 11, 12, 15; of teaching 2
quantity 55, 58
randomized controlled trials 54, 56, 64
ranking(s) 13, 53, 60, 64, 79, 82, 86, 97
Rappleye, J. 26, 34–36, 189, 190, 194
Rasch model 105–106
Rautalin, M. 34–35, 40–41
reading 75, 76, 80, 81
Reagan administration 183
reception 49
reforms 52, 56, 57, 58, 60, 61, 62, 63, 64, 65, 67, 68
Rémond, M. 107, 132, 140
Ricardo, D. 152
Robertson, S. 170, 179
Rosa, V. 105, 107, 137, 138, 140
Rutkowski, L. 34–35, 37–38, 127, 141
SACMEQ 126
SAEB (from Brazil) 70, 92, 96, 97, 99
Sahlberg, P. 183, 184, 185
Saraisky, N. G. 145, 148, 161
Saraiva, L. 129, 141
Schleicher, A. 1, 3, 4, 20, 186, 190, 191, 192, 193
Schultz, T. 180
Schut, R. 171, 179
science 48, 49, 51, 58, 60, 61, 63, 65, 68, 83, 84, 85; of government 63
Sedgwick, M. J. 92
Sellar, S. 13, 14, 17, 18, 19, 33–35, 39
Serrão, A. 143, 145, 147, 161–164
Shanghai 53, 68, 146–147
Shaw, D. L. 144
Silova, I. 194
Sim-Sim, I. 130, 141
Sjøberg, S. 106
skills 74, 75, 79, 80, 81, 82, 86, 89, 90, 92, 94, 95, 96, 99; cognitive 55, 56
Soares, F. 91, 92, 93
soft power 18
Spain 145
Star, S. L. 11, 12, 14, 20
statistics 52, 55, 57
Steiner-Khamsi, G. 14, 18, 19, 105
Stolpe, I. 18
storytelling 52, 54
Stromquist, N. 188, 189
Takayama, K. 33
TALIS 1, 4, 191
Teltemann, J. 34–35, 37
Teodoro, A. 1, 8, 74, 170, 176, 177, 179
tests 49, 50, 51, 54, 55, 62, 71, 73, 74, 75, 76, 77, 78, 80, 82, 83, 98; low-impact 105
Thatcher government 183
TIMSS 1, 7, 74, 126, 127, 128, 129, 130, 132, 133, 134, 135, 136, 137, 138, 186
tone of a newspaper article 156
Tuijnman, A. 57
Tuttman, M. 91, 93
UK 145–146; National Standards Body (BSI) 15
UNESCO 49, 74, 92, 106, 126, 186, 188
USA 145
van Eck, N. J. 25–26
Vento, E. 34–35
Verger, A. 14, 16, 19, 170, 179
Visão (weekly newspaper) 147, 162
Visualisation of Similarities (VOS) 25
Volante, L. 170, 179
Waldow, F. 105
Weinberg, A. M. 3, 4
Weller, V. 70
WESTAT 175, 176
Wiseman, A. 17
Woessmann, L. 186, 187, 188, 189
World Bank (The) 57, 186, 188
World War II 192
Yariv-Mashal, T. 14, 19, 143
Zhao, Y. 3, 8, 105–106, 183
Zumbo, B. 17
