
Guidance about avoiding bias in Cambridge International qualifications in June 2020


Information for heads of centre, heads of department and teachers on objectivity in predicting grades and deciding on rank orders

The importance of objectivity

In these unprecedented circumstances, schools are best placed to judge the likely performance of their students if teaching and learning, and exams, had continued as planned. Centres usually know their students well and will have regularly assessed their performance throughout the course of study.

We are providing the following extra information on objectivity in predicting grades and deciding on rank orders to help schools play their role in ensuring this year’s results are as fair as possible. This is based on existing research and analysis about how centres can assess candidates as objectively as possible.

Objectivity in grading and ranking decisions

Each predicted grade should be a holistic professional judgement, balancing different sources of evidence and data. It is important that the centre’s predicted grades and rank order judgements are objective; they should only take account of existing records and available evidence of a student’s knowledge, skills and abilities in relation to the subject.

This evidence should inform teachers’ professional judgements about each candidate’s likely performance at the time of the exam. Other factors should not affect this judgement, including characteristics such as a candidate’s gender, race, religion/belief or disability. Similarly, judgements should not be affected by a candidate’s behaviour (both good and poor), character, appearance or social background, or the performance of their siblings.

Unconscious effects on objectivity

To avoid unconscious bias, teachers are urged to reflect on and question whether they may have any preconceptions about each student’s performance and whether their perception of the evidence might be affected by any irrelevant factors. Teachers should be aware of:

• confirmation bias, for example noticing only evidence about a candidate that fits with pre-existing views about them

• masking or halo effects, for example a particular view about an aspect of a candidate hides, or overly accentuates, their actual knowledge, skills and abilities

• recency effects, for example giving undue weight to the most recent interaction with a candidate or the most recent piece of work done by a candidate

• primacy effects, for example giving undue weight to ‘first impressions’ of a candidate

• selective perceptions, for example giving undue weight to a candidate’s performance on a particular part of the content of the syllabus, rather than considering performance across the whole syllabus for Cambridge IGCSE, O Level, International AS & A Level, Pre-U and IPQ

• contrast effects, for example over- or under-estimating a candidate’s likely performance having first considered a large number of students who are all working at a different standard.

Information from previous data

The effects described above may not be consistently seen across different centres or individuals. To understand more about possible effects in a particular centre, a centre could look back at previous years’ data, for example, over the past two to five years, where this is available. Considering data in this way is unlikely to identify all possible effects and may prove inconclusive. Contextual information is likely to be important in considering what weight to give any such data. For example, significant personnel changes may mean that effects in previous years may not be assumed to carry forward, or may reduce the benefits of aggregating data between different years.

A centre could use such data to identify whether there may be any indications of systematic under- or over-prediction for different groups of students, for example, those from particular ethnic, social or religious groups. For example, a centre may find that it has routinely under-estimated forecast Cambridge International A Level maths grades compared to grades actually achieved for students from particular groups; or routinely over-estimated forecast Cambridge IGCSE grades compared to grades actually achieved for students from particular groups. The centre could use any such findings as it checks whether its proposed predicted grades for June 2020 might have been influenced by preconceptions or irrelevant factors (an illustrative sketch of such a comparison appears at the end of this guidance).

In doing any such analysis, centres should be aware of and take into account contextual factors. Awareness of the limitations of data and the context in which it was generated may help centres to consider which data is relevant, which is not, and what conclusions may and may not be supported.

Reviewing judgements

Having considered possible unconscious effects on objectivity and any information from available data from previous years, centres are asked to use this information to reflect carefully on their predicted grades and rank orders. Dialogue between heads of department, teachers and the head of centre can support such reflection and review.

Where any possible unconscious effects, or previous systematic under- or over-prediction for particular groups, have been identified, careful consideration would be needed to ensure, for example, that this was not over-compensated for.

Nonetheless, analysing information, reflection and dialogue as outlined above could help a centre to assure itself that it has effectively fulfilled its duties to avoid discrimination, and to assure itself that it has maximised objectivity and fairness in the judgements that it has made.
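
For centres that hold their historical forecast and achieved grades electronically, the comparison described under ‘Information from previous data’ could be sketched roughly as follows. This is a minimal illustration only and not part of the formal guidance: the column names, the student grouping and the grade-to-points scale are all assumptions that would need to match a centre’s own records, and small cohorts will make any averages unreliable.

    import pandas as pd

    # Hypothetical historical records: one row per candidate per exam series,
    # with the grade the centre forecast and the grade actually achieved.
    records = pd.DataFrame({
        "series":   ["June 2018", "June 2018", "June 2019", "June 2019"],
        "group":    ["Group A", "Group B", "Group A", "Group B"],
        "forecast": ["B", "C", "A", "C"],
        "achieved": ["A", "C", "A", "B"],
    })

    # Assumed mapping of letter grades onto points so differences can be averaged.
    points = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "U": 0}
    records["gap"] = records["forecast"].map(points) - records["achieved"].map(points)

    # A persistently negative mean gap for a group would suggest under-prediction,
    # a persistently positive one over-prediction; the candidate count is kept
    # alongside the mean because small groups give noisy averages.
    summary = records.groupby("group")["gap"].agg(["mean", "count"])
    print(summary)

Any pattern flagged in this way would still need to be weighed against the contextual factors described above, such as staffing changes, before any conclusion is drawn.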

This document includes information contained in Guidance issued by Ofqual, the UK regulator for exams, and therefore contains public sector information issued under the Open Government Licence v.3.0.
