
What does language assessment literacy mean to teachers?


Vivien Berry, Susan Sheehan, and Sonia Munro

Language assessment literacy has been discussed by language assessment
specialists since Davies’ seminal 2008 article. Much of this discussion has
focused on experts’ opinions of what teachers know, do not know, and should
know about assessment. This paper reports on a project which aimed to put
teachers at the centre of the debate by exploring their attitudes to assessment
and their assessment practices. Interviews, classroom observations with
follow-up interviews, and focus group discussions were held with teachers based
in the United Kingdom, France, and Spain. The teachers discussed their training
in assessment and their attitudes towards it. Although many of them expressed
a lack of confidence in their knowledge of the subject, the teachers we observed
in fact successfully deployed a wide range of assessment techniques. However,
they tended to characterize these as teaching activities and, therefore, part of
good teaching practice. Testing and grading were viewed negatively.

Introduction
In her influential article on testing and assessment, Clapham (2000: 150)
comments that: ‘[A]ssessment is used both as a general umbrella term to
cover all methods of testing and assessment and as a term to distinguish
alternative assessment from testing.’ Clapham further proposes a
distinction between ‘testers’ who design and deliver reliable and valid high-
stakes tests and ‘assessors’ who prepare real-life communicative tasks for
their students, even though she acknowledges that the two terms are often
used interchangeably by experts, herself included. If we accept Clapham’s
distinctions, classroom teachers fit more obviously into the category of
assessors concerned with assessment than of testers concerned with testing.
Effective assessment can support and promote learning, and therefore
a teacher’s ability to engage with a range of teaching, learning, and
assessment practices is essential. As Crusan, Plakans, and Gebril (2016)
suggest, it is the students who lose out if assessment practices are poor.
However, concerns have been expressed about the level and quality of
teacher training in assessment (Fulcher 2012; Crusan et al. 2016). In
general education, the term assessment literacy has been used to describe
the knowledge teachers should have about assessment. The term has been
adapted and adopted by experts in language assessment, with Malone
proposing the following definition of language assessment literacy:

Assessment literacy is an understanding of the measurement basics
related directly to classroom learning; language assessment literacy
extends this definition to issues specific to language classrooms.
(Malone 2011: online)

This paper describes a project which sought to bring the voice of the
teacher into the debate concerning what teachers actually know, and
what they should know, about assessment. After a brief discussion of
how surveys and classroom observations have been used to research
assessment, this paper describes qualitative data collection and findings
on teachers’ voices concerning assessment.

Survey research
The survey has been the most commonly used research method when
investigating teachers’ knowledge of assessment (Fulcher 2012; Crusan
et al. 2016). These surveys are generally created by expert assessment
researchers, often leading to an emphasis on possible gaps in teacher
knowledge. In the studies cited, a range of assessment-related topics
was presented to teachers with questions which asked them to state
their current level of knowledge about the topic and their interest
in learning more about it. The results of these surveys suggest that
teachers lack knowledge and need extensive training to raise their level
of understanding of these areas of assessment. Survey studies, however,
have limitations. When faced with questions which ask if more knowledge
is wanted on a topic, people might respond positively, motivated by
simple curiosity or the fear of being perceived as unprofessional by not
expressing an interest in receiving more information.

Classroom research
Stoynoff (2012: 531) claims that ‘survey results need to be complemented
with other empirical evidence of the effect of teacher characteristics on
assessment practices’. Several studies have attempted to address this
concern, either using mixed methods or through classroom observation.
One example is Yin (2010), who conducted a series of observations and
interviews with two tutors of English for Academic Purposes. He found
that teachers had two sets of beliefs about classroom assessment. One set,
strategic cognitions, related to planning classroom activities. These were
based on the individual teacher’s beliefs about teaching and institutional
conditions such as the course syllabus. The other set, interactive cognitions,
were drawn on in the classroom. These were based on knowledge of level
and on experiences of working with particular groups of students. Although
Yin’s project did not focus on language assessment literacy as such, the
discussion of the teachers’ assessment practices suggests that they deployed
a range of assessment techniques (for thorough reviews of the literature on
the effect of teachers’ identity and beliefs on their assessment practices, see
Xu and Brown 2016; Looney, Cumming, van Der Kleij, and Harris 2017).

Methodology

Aims of the study
The aims of the study were twofold: (1) to gain a greater understanding of
teachers’ knowledge of assessment through actual observation of classroom
assessment practices and through focus group discussions; (2) to use the
knowledge gained from the observations and discussions to develop training
materials which meet teachers’ actual stated needs. The major focus of
the remainder of this article will be on the first aim, with the second aim
addressed only briefly. Presentation and discussion of the training materials
ultimately developed for teachers is beyond the scope of this article. For further
information about the materials see Berry, Sheehan, and Munro (2018).

Participants
Participants in the study were teachers who were based in Europe at the
time of the project, but many of them talked about work and training
experiences from beyond Europe, in both state education and private
language schools. A total of 54 teachers participated in the study, 28 of
them female and 26 male, with ages ranging from 25 to 60 years. The
teachers had come into the profession through various routes and had
from 3 to 30 years’ experience (average approximately 13 years). Some had
followed a CELTA course whereas others had PGCEs. A minority of them
had a degree in English Language Teaching. Many of the participants
reported that they had obtained the DELTA or similar qualification.
The teachers were chosen, from amongst those who had volunteered to
participate in the study, to reflect a range of routes into teaching and to
include teachers working in different contexts.

Research questions
The following research questions (RQs) informed our project:
RQ1 What are teacher attitudes to assessment?
RQ2 How confident do teachers feel about engaging in assessment
activities?
RQ3 How confident do teachers feel about engaging in testing?
RQ4 What impact does assessment have on the classroom?

Data collection and analysis

Data collection
We investigated teachers’ practices, beliefs, and attitudes towards
assessment through a range of qualitatively orientated research methods,
which included individual interviews, classroom observations with
follow-up interviews, and focus group discussions. There were three
stages of data collection.
Stage 1
Stage 1 consisted of semi-structured interviews with three teachers
working in a UK university, which explored what assessment training they
had received and the impact of testing and assessment on their teaching
practice. The teachers were selected for their range of backgrounds and
experience (see Table 1).
Stage 2
Stage 2 comprised classroom observations and follow-up interviews with
three different teachers working at a study centre in a UK university to
determine what assessment practices teachers actually used in the classroom.
Although all the teachers who participated in this stage were British, their
ELT work experience differed considerably (see Table 2). An observation
schedule was used which was inspired by one created by Colby-Kelly and
Turner (2007). The schedule listed 16 types of assessment activities. We
ticked each time we observed any of the activities within a three-minute
period. Every three minutes a fresh observation sheet was started. The
observation sheet also contained space for notes so, in addition to recording
ticks, we could also note questions or queries about what we were observing.
These questions and comments were used as the basis for follow-up
interviews with the teachers. The interviews also included questions about
attitudes to assessment and the effects of testing on the classroom and a
discussion of teachers’ training needs. Examples of the questions include:
1 In the lesson you used both peer and self-assessment. What’s the aim of
using these types of assessment?
2 Why did you use self-assessment with the students? Is this something
you often use?
3 You shared grading sheets with students. Can you explain your thinking
behind this?
Table 1: Participants in baseline interviews

| Teacher    | Nationality | TEFL qualifications                                                               | TEFL work experience                                                                                                       | TEFL years of practice |
|------------|-------------|-----------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------|------------------------|
| 1 (female) | Polish      | BA English Language Teaching; CELTA; DELTA; Master's in TESOL from UK university   | Secondary school teaching in Poland; EFL teaching in Middle East; ESOL teaching in the UK; EAP teaching in UK universities   | 12                     |
| 2 (male)   | British     | Secondary PGCE; CELTA                                                              | VSO; EFL teaching in Japan                                                                                                   | 7                      |
| 3 (male)   | Hungarian   | BA English Language Teaching; CELTA; Master's in TESOL from UK university          | Secondary school teaching in Hungary; EFL teaching in Middle East; EAP teaching in UK universities                           | 20                     |

Table 2: Participants in classroom observations

| Teacher    | Nationality | TEFL qualifications                                 | TEFL work experience                                              | TEFL years of practice |
|------------|-------------|-----------------------------------------------------|-------------------------------------------------------------------|------------------------|
| 1 (female) | British     | CELTA; DELTA; Master's in TESOL from UK university  | EFL teaching in Turkey; EAP teaching in UK universities           | 22                     |
| 2 (male)   | British     | CELTA; DELTA                                        | EFL teaching in South East Asia; EAP teaching in UK universities  | 8                      |
| 3 (female) | British     | Trinity Certificate; Trinity Diploma                | ESOL teaching in the UK; EAP teaching in UK universities          | 9                      |


Stage 3
The third stage of data collection consisted of five focus group discussions
with 48 EFL teachers who worked for a large international organization that
recruits and trains teachers. The teachers were based in either Madrid or Paris
at the time of the study, but their experience of working in other countries
covered every continent, and every level and type of language learner, from
complete beginner to advanced, and from kindergarten to EAP and ESP within
companies. They were all either native speakers of English from English-
speaking countries or bilingual Spanish–English or French–English speakers.
The main purpose of this stage was to confirm that the comments from
stages 1 and 2 were typical of a much broader range of English language
teachers and to gain additional insights into what activities should be included
in a set of online training materials. We engaged with teachers who had
worked in a variety of different countries and who could be considered as
representing different sections of the ELT teaching community.
Each focus group consisted of six to eight participants and was facilitated
by a researcher who opened the discussion with a prompt consisting of
a comment from the stage 2 teachers’ responses to the initial questions.
Participants were invited to discuss their opinions of the prompt as a
group. This procedure was followed for approximately 90 minutes with
each group, thus allowing most of the stage 2 teachers’ responses to be
covered. Examples of prompts offered for discussion include:
1 The primary aim of the certificate course was to get us ready for the
classroom, so assessment wasn’t included.
2 I trained to be a KET and PET examiner. The training was fit for
purpose as a standardization session, but it should have included a
broader session on assessment.
3 The idea of grading someone wasn’t that important.

Data analysis
In addition to the observation sheets, the final data from the three stages of
data collection consisted of 16 hours of oral discussions. The oral data were
coded by two experienced researchers using the software package Atlas.ti,
version 7 (http://atlasti.com/). The framework used for data analysis was
Davies’ (2008: 335–41) components of assessment literacy: Skills (including
item writing, statistics, test analysis, and using software programmes for test
delivery and analysis) + Knowledge (including issues in measurement and
language description, different models of language learning, teaching, and
testing) + Principles (including the proper use of language tests, their fairness
and impact). A deductive approach was taken to data analysis with each
researcher coding independently, after which the two sets of codings were
compared. Any disagreements were discussed, and a consensus reached.

Results and analysis
In this section we will first discuss the data from our classroom
observations and focus group discussions. Using these data, we will then
relate them to each of the four research questions.

Classroom practices
Many teachers reported they had received little or no formal training
in assessment, so we were interested in exploring exactly how teachers
developed their assessment practices. One teacher stated: ‘You bring
conceptions of how you were tested at school and you apply them to
the language classroom’, suggesting that personal school experiences
influence assessment practices. This supports the findings of Smith, Hill,
Cowie, and Gilmore (2014), who conclude that the assessment beliefs
of pre-service teachers had been shaped by past personal experiences.
Vogt and Tsagari (2014) make the analogy between ‘teaching as you
were taught’ and ‘testing as you were tested’. They characterize this as
a brake on innovation and consider that teachers need more formal
training in assessment rather than relying on past experiences or
staffroom knowledge sharing. One of the teachers made the point that
the development of assessment practices relies on teaching experience.
She noted that: ‘You build up your own ideas of assessment just through
experience of what your students are capable of doing.’ This suggests
that understandings of level and appropriate assessment activities come
with experience and interactions with students, which is plausible as it is
difficult to get a sense of level in isolation.
We also observed that the teachers used a range of assessment techniques,
including self- and peer-assessment, in the classroom. They shared
marking criteria with the students and discussed features of a good
piece of work. For example, the focus of one of the observed lessons was
giving a presentation. The teacher encouraged the students to look at
the marking criteria and discuss the key features of a good presentation.
In another lesson we saw the teacher using a class test as a stimulus
for student reflection. She encouraged the students to shift their focus
from the test score to how they could develop their language in a more
holistic sense. In another of the lessons, the teacher simplified an activity
which the students seemed to be struggling with. When we raised this
section of the lesson with him he categorized this as good teaching. He
stated that he continually monitored his students and adjusted his lesson
plans in response to student feedback, a change of activity described by
McKay (2006) as ‘on-the-fly’ assessment. This provides further evidence
supporting the idea that teachers tend to include assessment practices
within their teaching practice and therefore do not consider assessment,
as such, to be part of their teaching role. It is possible that they associate
assessment with tests or exams, and not with classroom practice
techniques such as monitoring and giving feedback.
On the evidence of the observation data, and contrary to Vogt and
Tsagari’s (2014) contention, informal routes to developing assessment
practices do not appear to stifle or limit practice. However, in the
follow-up interviews all the teachers categorized their assessment activities
as part of good teaching and not as assessment. It would seem, therefore,
that there is something of a divide in teachers’ minds between teaching
and assessment. This may have developed during initial teacher training
and the effect of this may have been to privilege teaching over assessment.
Another possible explanation (see Rea-Dickins 2004) may be that
teachers are often ethically torn over both teaching students and assessing
them, and therefore tend to shy away from the latter. It is more likely,
however, that teachers’ terminological confusion between assessment
and testing, described by Clapham (2000) as common even amongst
testing/assessment experts, has created the divide in their minds. As we have
seen, the teachers very successfully used a range of assessment activities
in their classes, while at the same time insisting that what they were doing
was not assessment but simply good teaching practice.
In terms of the first research question (RQ1 What are teacher attitudes
to assessment?), the teachers we spoke to could be described as having
somewhat contradictory attitudes towards assessment. Assessment itself was
viewed negatively, but good assessment practices were considered
part of good teaching, and being a good teacher was, as might be expected,
important to the participants.

Testing and assessment experiences
One of the teachers interviewed at the start of the project remarked:
‘None of my experiences of teaching had any focus on any kind of
qualification at the end of it.’ A focus group participant told us that: ‘I
felt blindfolded when trying to create assessment tasks.’ This is a strong
statement and shows that this teacher found it very difficult to engage
in the assessment process. Another focus group participant expressed a
lack of confidence in her ability to make assessments, stating: ‘When you
make speaking assessments you guess the level and give a mark like a 7
or a 9.’ Her use of the word ‘guess’ suggests that she is not able to draw
on her experiences of teaching or her training when grading her students.
An indicator of this lack of confidence in creating assessments was the
willingness to assign responsibility for assessment to outside agencies.
One focus group participant said that he felt ‘unconfident about creating
test materials and so we defer to Cambridge’. Another focus group
participant described Cambridge as ‘a crutch to lean on’, lending further
support to the idea that the teachers associate assessment with tests (and/
or exams) rather than classroom assessment activities. These comments
show clearly that testing was more often the focus of the discussion
rather than assessment, although, in fact, all our prompts used the term
‘assessment’ rather than ‘testing’.
The answer to the second research question (RQ2 How confident do
teachers feel about engaging in assessment activities?) and the third research
question (RQ3 How confident do teachers feel about engaging in testing?)
would seem to be that many of the participants in the project felt
unconfident about engaging in either testing or assessment.
Comments from teachers in all phases of the project suggest that
the influence of tests in the classroom is all pervasive. A focus group
participant mentioned the importance for students of obtaining a
certificate of English language proficiency in economically difficult times.
One teacher stated: ‘Everything I do in class I’m conscious of how it
will help them when they are tested, and I always mention that to them
as well.’ Another teacher suggested that the students only focus on test
scores and that this is detrimental to language learning as it may promote
a superficial approach to learning. She commented: ‘You give exams
out and all they are bothered about is the score … they just fixate on the
numbers and they’re not looking at what they’ve done.’
In terms of our fourth research question (RQ4 What impact does assessment
have on the classroom?), it is clear that assessment has a huge influence on
the classroom, but the influence may not always be positive. In some ways
assessment has become synonymous with testing, and testing could be
perceived as having a negative impact on language learning.

Training experiences
Participants in this study had come into the teaching profession through
a variety of routes. Some had undertaken a degree in English language
teaching, whereas others had taken shorter courses such as CELTA. Most
said that assessment had not been included in their initial training but were
happy with the training they had received. One typical comment was: ‘We were
not planning and designing assessments, we were planning and delivering
lessons.’ They were satisfied with the classroom focus of their
courses and considered them to be appropriate. Participants did not think
of ‘assessment’ as an integral part of teaching, despite successfully using a
range of assessment techniques in their own classrooms, but rather thought
of it as a synonym of ‘testing’ and hence did not believe their teacher
training had included any aspects of assessment.

Training materials
Throughout the data-collection process, we asked teachers for their views
on training materials and what they considered to be relevant content for
a course designed to meet their needs. The responses we received focused
on practice-orientated training; theoretically orientated training was not
requested. One focus group participant stated: ‘I would have liked more
practical elements in my training on assessment—more situation based.’
Another commented: ‘We’d like clear criteria for marking speaking and
writing.’ This indicates that teachers wanted activities they could readily apply
to their teaching. As teachers are busy, the appeal of ready-made activities
is obvious. These requests also support the notion that teachers have little
confidence in their ability to create what they understand assessments to be
and prefer to give responsibility to someone else or to an external agency.

Conclusions and recommendations
The relationship between teaching, assessment, and testing as practised by
the participants in our study can be seen in Figure 1, where teaching is the
overall reason for teachers to be in the classroom, assessment in its many
forms is part of that teaching, and testing is only one facet of assessment,
although this is not necessarily how the teachers themselves understand it.

[Figure 1: Relationship between teaching, assessment, and testing (testing nested within assessment, which is itself nested within teaching).]

Testing and assessment have been found to have considerable impact
on the classroom. The participants in this project were observed to
successfully deploy a range of assessment techniques, although they did
not characterize them as assessment activities but considered them to be
part of good teaching practice. When they did discuss assessment, they
referred to it as ‘testing’ or, in some cases, as ‘exams’. They also expressed
concern that focusing on tests led to a superficial approach to learning.
The desire was expressed for ready-made materials and the reliance
on external testing agencies seems to be symptomatic of a lack of
confidence. This may be due to the lack of coverage of issues relating
to assessment in initial teacher training, or it may reflect an ethical
conflict between teaching and testing learners. There is evidence
to suggest that teachers develop their assessment practices through
engagement with a variety of teaching and learning experiences. These
include experiences as a schoolchild and experience of teaching learners
at a variety of levels. The teachers, in general, reported that they had
received limited training in assessment but felt comfortable with this
situation. They had found alternative sources of assessment knowledge
such as colleagues and past experiences and this meant they did not
perceive their lack of training as problematic. They were confident in the
teaching practices they used. Although the teachers, overall, did not want
training in the theoretical underpinnings of assessment, they did request
assessment materials that could be used straight away. This may reflect
the busy nature of teaching and the desire to have time-efficient solutions
to problems. It may also be an indication of the separation of teaching
and assessment. The teachers saw assessments (or more properly ‘tests’)
as being created by other people and not by the teachers themselves.
The lack of training in assessment may have led to teachers feeling
unconfident about creating assessment tasks and therefore wanting to
deploy ready-made solutions.
Teachers’ attitudes towards assessment can, therefore, perhaps best be
described as complex. On the one hand, there is ambivalence towards the
topic and a lack of confidence. On the other hand, there is a conflation of
assessment with testing, and hostility towards testing, even though teachers
in this study used a wide range of assessment activities very successfully in
the classroom.
Several recommendations for initial teacher training and practice have
emerged from this project:
▪ To foster teachers’ awareness of the relationship between good teaching
practice and good assessment practice, explicit links should be made
during initial teacher training.
▪ During initial teacher training, teachers should be encouraged to reflect
on their own experiences of assessment and to look ahead to how
they will be expected to assess their students.

▪ Schools should perhaps consider providing opportunities for teachers to
meet and exchange ideas relating to best practice in assessment.

There has been little research to date that relates teachers’ attitudes
to assessment to their actual classroom practice. We would argue that
more classroom-based research should be undertaken to gain a better
understanding of assessment practices and how teachers develop them.
It is clear from our research that teachers are continually engaging
in assessment activities and this should receive more recognition. In
Clapham’s (2000) terms, the teachers who participated in our study are
not ‘testers’ engaged in testing, but they most certainly are ‘assessors’
involved in assessment, even if they do not characterize themselves as
such. The voice of the teacher therefore needs to feature much more
prominently in the assessment literacy debate.
Final version received July 2018

References
Berry, V., S. Sheehan, and S. Munro. 2018. ‘Beyond surveys: an approach to understanding effective classroom assessment practices’ in Conference Selections, IATEFL TEASIG—CRELLA Conference, University of Bedfordshire, Luton, November 2018, http://edition.pagesuite-professional.co.uk/Launch.aspx?PBID=933218b7-7b10-49d6-99cb-f7410f299641.
Clapham, C. 2000. ‘Assessment and testing’. Annual Review of Applied Linguistics 20: 147–61.
Colby-Kelly, C. and C. Turner. 2007. ‘AFL research in the L2 classroom and evidence of usefulness: taking formative assessment to the next level’. Canadian Modern Language Review 64/1: 9–37.
Crusan, D., L. Plakans, and A. Gebril. 2016. ‘Writing assessment literacy: surveying second language teachers’ knowledge, beliefs, and practices’. Assessing Writing 28: 43–56.
Davies, A. 2008. ‘Textbook trends in teaching language testing’. Language Testing 25/3: 327–47.
Fulcher, G. 2012. ‘Assessment literacy for the language classroom’. Language Assessment Quarterly 9/2: 113–32.
Looney, A., J. Cumming, F. van Der Kleij, and K. Harris. 2017. ‘Reconceptualising the role of teachers as assessors: teacher assessment identity’. Assessment in Education: Principles, Policy & Practice 22/1: 161–71.
Malone, M. E. 2011. ‘Assessment literacy for language educators’. CAL Digest October 2011. Available at www.cal.org.
McKay, P. 2006. Assessing Young Language Learners. Cambridge: Cambridge University Press.
Rea-Dickins, P. 2004. ‘Understanding teachers as agents of assessment’. Language Testing 21/3: 249–58.
Smith, L. F., M. F. Hill, B. Cowie, and A. Gilmore. 2014. ‘Preparing teachers to use the enabling power of assessment’ in C. Wyatt-Smith, V. Klenowski, and P. Colbert (eds.). Designing Assessment for Quality Learning. Dordrecht: Springer, 303–23.
Stoynoff, S. 2012. ‘Looking backward and forward at classroom-based language assessment’. ELT Journal 66/4: 523–32.
Vogt, K. and D. Tsagari. 2014. ‘Assessment literacy of foreign language teachers: findings of a European study’. Language Assessment Quarterly 11/4: 374–402.
Xu, Y. and G. Brown. 2016. ‘Teacher assessment literacy in practice: a reconceptualization’. Teaching and Teacher Education 58/1: 149–62.
Yin, M. 2010. ‘Understanding classroom language assessment through teacher thinking research’. Language Assessment Quarterly 7/2: 175–94.

The authors
Vivien Berry is Senior Researcher, English Language Assessment at the British Council, where she leads a project to promote understanding of basic issues in language assessment, including the development of a series of video animations. She has recently completed a major study investigating the comparability of the speaking construct in face-to-face and videoconferencing speaking tests. Vivien has published extensively on many aspects of language assessment and regularly presents research findings at international conferences.
Email: Vivien.Berry@britishcouncil.org

Susan Sheehan is a Senior Lecturer in TESOL at the University of Huddersfield. She researches issues related to English language assessment and teacher education. She has presented on these subjects at various national and international conferences. Susan teaches on both undergraduate and postgraduate courses and supervises doctoral candidates. Susan worked with Sonia Munro to create a set of training materials for teachers on topics related to assessment. These have been published by the British Council.
Email: S.Sheehan@hud.ac.uk

Sonia Munro is a Senior Lecturer at the University of Huddersfield, where she is course leader for the MA TESOL. Her research interests include teacher cognition and teacher education.
Email: S.Munro@hud.ac.uk
