
Teacher Development

ISSN: 1366-4530 (Print) 1747-5120 (Online) Journal homepage: https://www.tandfonline.com/loi/rtde20

Learning about assessment for learning: a framework for discourse about classroom practice[1]

Ann-marie Brandom, Patrick Carmichael & Bethan Marshall

To cite this article: Ann-marie Brandom, Patrick Carmichael & Bethan Marshall (2005) Learning about assessment for learning: a framework for discourse about classroom practice[1], Teacher Development, 9:2, 201-217, DOI: 10.1080/13664530500200248

To link to this article: https://doi.org/10.1080/13664530500200248

Published online: 20 Dec 2006.

Teacher Development, Volume 9, Number 2, 2005

Learning about Assessment for Learning: a framework for discourse about classroom practice[1]

ANN-MARIE BRANDOM
King’s College London, United Kingdom
PATRICK CARMICHAEL
University of Cambridge, United Kingdom
BETHAN MARSHALL
King’s College London, United Kingdom

ABSTRACT ‘Learning how to Learn – in classrooms, schools and networks’ is a four-year, multi-site project funded under the Economic and Social Research Council
Teaching and Learning Research Programme and concerned with the development of
formative assessment practice. This article describes how analysis of a large double-scale
questionnaire administered across project schools provided not only information about
trends in teacher values and practices, but also a basis for observation, analysis and
discourse about classroom practice. A group of secondary Postgraduate Certificate of
Education trainees were provided with a semi-structured observational framework
derived from factor analysis of questionnaire responses in order to guide their
observation of classroom video and analysis of teacher interviews. While all trainees were
able to identify effective practice and to relate it to their own evolving craft knowledge,
for a minority the framework also provided a focus for enquiry and dialogue about the
relationships and tensions between the values and practices of the featured teachers and
of the participating trainees themselves.

Introduction
In this article we describe how a group of secondary Postgraduate Certificate
of Education (PGCE) trainees, close to the completion of their course,
explored the relationship between teachers’ values and practices. While trainee
teachers and, increasingly, those engaged in other school-based evaluation
processes undertake classroom observation, such observation varies in its scope
and form. It was a welcome opportunity, therefore, on the PGCE course at King’s
College London, to question, research and explore critically the underpinning

pedagogic and philosophical rationales employed by teachers in their design of
learning sequences or the use of specific classroom strategies. During this
college-based research task the trainees were working with the theory and
observing the practice of assessment for learning, critically engaging in the
analysis of classroom learning through the use of video and questionnaires and
exploring the relationship between teachers’ values and beliefs.
Twenty-one trainee teachers from a range of subject specialist areas were
involved in a four-day project during which they were introduced to the
research context (outlined below), engaged with video evidence gathered in
classrooms and critically evaluated transcripts of interviews with the teachers
featured in the videos. They were aided in their analysis of both video and
transcript data by the provision of a novel analytical scheme derived from the
‘Learning how to Learn’ large-scale survey of teachers’ beliefs and practices
(James & Pedder, 2005 [submitted]). This survey addressed the opportunities
afforded to teachers to reconcile the similarities and differences between their
own beliefs and practices in the light of assessment for learning. The trainees
reported their findings and were encouraged to reflect upon how the data,
which they had reviewed, provided evidence not only of teachers’ practices
but also of their underlying beliefs about learners and learning. In so doing the
trainees were invited to address their own beliefs and values and challenge
their own conceptions of pedagogy.

The Research Context: the Learning how to Learn project


Learning how to Learn is a four-year project involving four universities
(Cambridge, King’s College London, the Open University and Reading
University) working in partnership with 40 schools in five local education
authorities, and a Virtual Education Action Zone (VEAZ) in the United
Kingdom (UK). The project was established in order to build upon work in
two areas: the first of these was the use of formative assessment or ‘assessment
for learning’. The systematic review of research into formative assessment
work of Black & Wiliam (1998a) and subsequent work by the Assessment
Reform Group (1999, 2002), together with the work undertaken as part of the
King’s–Medway–Oxfordshire Formative Assessment Project (KMOFAP) (Black
et al, 2002, 2003), have contributed to a wide recognition of the role of
formative assessment in raising achievement. Accordingly, there has been a
wide acceptance of the ideas and practices associated with it by both
practitioners and policy makers.
The Learning how to Learn project was set up to take forward the
classroom-based work begun by Black & Wiliam (1998a, b) and to link it to a
second area, school improvement and the professional development of
teachers, and specifically to explore the kinds of professional development and
school culture which allow changes of the kind advocated by supporters
of assessment reform. In this respect, the project can be interpreted as
addressing the call from, amongst others, MacBeath & Mortimore (2001,

pp. 20-21) and Wrigley (2003) for the school improvement debate to address
issues of pedagogical practice and to ‘examine the cultural messages of
classrooms which are dominated by the teacher’s voice, closed questions and
the rituals of transmission of superior wisdom’ (Wrigley, 2003, p. 36). A third
area for research recognized that schools are increasingly acting as elements of
both real-world and electronic networks and that ideas and information
relevant to classroom practice arrive in schools from a variety of sources.
Accordingly, the role of networks in supporting professional development and
classroom practice was another aspect of the project’s work.
The project design (described in full in James et al, 2003b) involved a
sequence of data collection episodes, interventions and support designed both
to collect evidence of change in classroom and school-wide practice and to
enable the advancement of practice. This expansive design with its dual
commitment to research and development locates the project in a broad
tradition of ‘action research’ as conceived of by Lewin (1948). It also has much
in common with ‘design experiments’ (Brown, 1992; Cobb et al, 2003) in that it
is iterative, interventionist and multi-leveled (with its focus on classrooms,
schools and the networks to which they belong) while also being theory
driven. It is another characteristic of design experiments, namely the
combination of both qualitative and quantitative data within broadly narrative
accounts, that has provided us with opportunities to develop the professional
development activity reported in this article (Gorard et al, 2004).

The Staff Questionnaire: design and analysis


A key element of the research design was a staff questionnaire comprising 83
items, the major feature of which was the differentiation between teachers’
practices and their beliefs. Of these, 30 statements, grouped into a discrete
section, related to teachers’ classroom assessment practices and reflected a
set of principles derived directly from the Assessment Reform Group, based on
Black & Wiliam’s (1998a) systematic review of the literature on formative
assessment. On these principles, formative assessment is an integral part of
planning and of routine classroom practice; children are supported in self and
peer assessment and are encouraged to reflect on their own learning processes
(James et al, 2003a). Other sections of the
questionnaire were concerned with teachers’ professional learning and with
management systems and practice.
The questionnaire was constructed with double scales, similar to those
used in the Improving School Effectiveness Project conducted in Scotland
(Robertson et al, 2001) and elsewhere (Stoll & Fink, 1996). This design elicits
staff perceptions of current practices (‘Scale X’) alongside their values
(‘Scale Y’) for each of the 30 items, allowing staff members to identify gaps
between where they perceive their practice to ‘be’ now and where they would
‘like it to be’. Based on 558 returned questionnaires from primary
and secondary teachers, 21 of the items were found to load, on the basis of the

‘practices’ response, onto three robust and orthogonal factors; the other nine
items did not load exclusively onto any one factor, although this did not
preclude analysis of differences on a per-item basis.
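The kind of analysis described above can be illustrated with a minimal, simulated sketch. Everything below beyond the figures stated in the text (558 returns, 30 items, three factors) is invented: the item groupings are hypothetical, and principal-component loadings from the item correlation matrix stand in for the full factor analysis actually used.

```python
# Illustrative sketch only: simulated 'practices' (Scale X) responses with
# three latent dimensions, plus unstructured items that should not load.
import numpy as np

rng = np.random.default_rng(42)
n_teachers, n_items = 558, 30            # 558 returns, 30 practice items

# Hypothetical assignment of items to three latent dimensions; the last
# nine items carry no signal, mirroring items that load on no factor.
latent = rng.normal(size=(n_teachers, 3))
pattern = np.zeros((3, n_items))
pattern[0, :10] = 1.0    # e.g. a 'Making Learning Explicit'-like dimension
pattern[1, 10:15] = 1.0  # e.g. a 'Promoting Learning Autonomy'-like dimension
pattern[2, 15:21] = 1.0  # e.g. a 'Performance Orientation'-like dimension
X = latent @ pattern + rng.normal(scale=0.4, size=(n_teachers, n_items))

# Principal-component loadings from the item correlation matrix
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)     # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1][:3]    # take the three largest
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])   # (30 items x 3)

for k in range(3):
    strong = np.where(np.abs(loadings[:, k]) > 0.4)[0]   # common cut-off
    print(f"Factor {k + 1}: {len(strong)} items load at |loading| > 0.4")
```

In practice a dedicated factor-analysis routine with rotation would be used rather than raw principal components; the sketch is only meant to show how items cluster onto a small number of dimensions.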
The factor structure was initially intended to describe trends in the
questionnaire data across staff at different levels of each school, and across
schools in the sample, while at the same time presenting the data in such a
way that feedback derived from them resonated with teachers and was
suggestive of opportunities for development on an individual, group and
school-wide basis. The three factors (listed in full in the Appendix) were:
• ‘Making Learning Explicit’ (onto which loaded 10 items related to sharing
criteria, formative feedback, treating errors as learning opportunities and the
use of questions to elicit reasons and opinions);
• ‘Promoting Learning Autonomy’ (onto which loaded five items related to
self-assessment and peer assessment and the involvement of learners in
setting their own learning objectives);
• ‘Performance Orientation’ (onto which loaded six items related to planning
and assessment against prescribed criteria, summative feedback and the use
of questions to elicit factual information).
While a full analysis of the results of the questionnaire and the patterns of
trends within and between schools will be reported elsewhere, a number of
key findings are of particular interest. In the majority of cases, the practices
associated with ‘Making Learning Explicit’ were not only valued but were
reported as being ‘practiced’, although there were some instances of a ‘values–
practice gap’ with teachers unable, for whatever reason, to use these
classroom strategies as much as they would like. This gap was more evident in
the case of ‘Promoting Learning Autonomy’, where it was noted in the
ensuing analysis that teachers’ practice fell well short of the values they
ascribed to these strategies. The third factor, ‘Performance Orientation’,
showed the reverse pattern, with the values attached to these strategies being
less than their reported practice. In the analysis it was noted that in about one-
fifth of cases, teacher responses which reported high levels of commitment to
all three factors suggested little or no values–practice gap.
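The per-factor gap patterns reported above can be sketched as follows. This is a hypothetical illustration, not the project's analysis: the item indices, rating scale and gap sizes are invented, and the simulated offsets are chosen simply to reproduce the direction of the reported findings (values outstripping practice for autonomy, the reverse for performance orientation).

```python
# Hypothetical sketch: each item is answered twice, once for practice
# (Scale X) and once for values (Scale Y); a per-factor gap is the mean
# of (value - practice) over that factor's items.
import numpy as np

rng = np.random.default_rng(1)
n_teachers = 558

# Invented assignment of the 21 loading items to the three factors
factors = {
    "Making Learning Explicit":    list(range(0, 10)),
    "Promoting Learning Autonomy": list(range(10, 15)),
    "Performance Orientation":     list(range(15, 21)),
}

# Simulated 1-5 ratings; offsets mimic the reported directions of the gaps
scale_x = rng.integers(2, 5, size=(n_teachers, 21)).astype(float)  # practices
gap_signal = np.zeros(21)
gap_signal[10:15] = 1.0    # autonomy: values exceed practice
gap_signal[15:21] = -1.0   # performance orientation: practice exceeds values
scale_y = np.clip(scale_x + gap_signal
                  + rng.normal(scale=0.3, size=(n_teachers, 21)), 1, 5)

gaps = {}
for name, items in factors.items():
    gaps[name] = float((scale_y[:, items] - scale_x[:, items]).mean())
    print(f"{name}: mean values-practice gap = {gaps[name]:+.2f}")
```

A positive gap means values outstrip reported practice; a negative gap means the strategies are practised more than they are valued.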
When we have subsequently discussed these data with teachers both in
project schools and more widely, a common interpretation they offer is that in
many schools, it is commitment to a ‘performance orientation’ that prevents
the wider deployment of strategies designed to ‘promote learning autonomy’.
The rather more straightforward procedures related to ‘making learning
explicit’, however, are easier to integrate into existing classroom practice and
can also be employed in pursuit of enhanced performance without necessarily
involving independent self-assessment; for example, the sharing of ‘success
criteria’ can be related explicitly to performance in public examinations.

A Framework for Observation and Analysis
In addition to the teacher questionnaire described above, a range of data was
collected from participating schools (see James et al, 2003b, for an overview of
these). In 20 schools, ‘focal teachers’ were identified and these teachers and
their classes were then involved in small-scale ‘embedded’ case studies (Yin,
2002) within the larger, multi-method ‘design experiment’. In the course of
these they took part in a number of interviews about their beliefs about
learning, their experience of professional development, and specific learning
sequences and lessons, which were also observed and videotaped. Small groups
of children were also interviewed following the lesson observations, and
documentary evidence was collected.
In a small number of cases, structured time-interval observation of the
video recordings was used in order to identify patterns in these focal teachers’
use of practices and procedures associated with formative practice. This
provides a very detailed picture of the role which these approaches have
within classroom practice and the conditions under which teachers draw upon
them in order to enable and enhance learning activities, much in the way that
Galton (1987) and Tunstall & Gipps (1996), for example, have documented the
‘fine-grained’ structure of classroom discourse and the effects on learning.
Our aim, prompted by the data, was, however, different. We were
conscious that valuable data were available in the various interviews and
documentation, which might both triangulate and illuminate observational
data. We were also aware that the data suggested that while a fifth of the
teachers did not believe their values to be compromised by the demands made
upon them, the rest did. Such findings correspond with more ethnographic
studies undertaken on the affordances and constraints that teachers perceive in
their working practices (Hammersley, 1999). However, neither the fine-
grained structured schedules nor more narrative accounts characteristic of
ethnographic studies fitted our purpose. We were interested in developing a
semi-structured instrument precisely so that teachers themselves might be
aided in evaluating and reflecting upon their own practice, and identifying the
constraints that prevented their developing a closer match between values and
practices.

The Trainee Research Task


As part of their PGCE course at King’s College London, trainees are required
to undertake a four-day research task in which they work alongside members
of staff on activities related to a current research project within the
department. This is undertaken in the summer term, after the completion of
the second of the students’ two blocks of school-based experience. A group of
21 such students were involved in a research task in which they were first
provided with an introduction to the principles of formative assessment
(drawing on the work of both the Assessment Reform Group and the
KMOFAP project) and the work of the Learning how to Learn project,

including a detailed description of the staff questionnaire and the analysis of
the results from the section of the questionnaire dealing with classroom
assessment practice. They were then provided with a semi-structured data
analysis instrument listing the factors and factor items derived from the staff
questionnaire.
Trainees then watched selected excerpts from two videos of classroom
activity; these were videos collected as data for analysis, rather than being
exemplary ‘best practice’. The first showed a secondary English classroom in
which the teacher was encouraging learners to undertake a structured peer-
assessment activity involving the reading of pre-1900 poetry. The teacher used
a range of classroom practices associated with formative assessment – most
notably sharing success criteria – and modeled peer-assessment approaches
with a Learning Support Assistant and with groups of learners. The second
was collected in a primary school classroom in which a ‘numeracy hour’ was
taking place; following a teacher-led activity, the children were involved in a
practical task in which they constructed three-dimensional shapes. The
trainees were also provided with transcripts of semi-structured interviews,
collected prior to the observed lessons, which were designed to explore the
teachers’ beliefs about their own learning and that of the children in their classes;
the school structures and procedures which helped them to learn; the
classroom strategies they used to encourage and enable learning; and key
assessment strategies they used.
Working with project members, trainees were able to use the interview
transcripts to identify episodes in which teachers expressed commitment to the
use of particular strategies or to more general principles. They also were able
to identify instances in which teachers cited barriers to learning, or aspects of
school practice at odds with their own commitments. Subsequently, trainees
spent time in groups observing other video recordings of classrooms and
analyzing the transcripts of interviews with the teachers featured. They then
compiled a short report and presented their results to the rest of the group. In
the majority of cases, this involved their showing excerpts from the video and
citing elements of the interview transcript; all the students were involved not
only in the task but in the reporting and discussion of their observations, and
while some were initially diffident and unwilling to be ‘critical’ of teachers’
classroom practice, the existence of a framework focusing attention on values
and practices, and on barriers to effective practice rather than on a more
general notion of ‘competence’, provided students with a basis for constructive
but critical commentary.
The research task is always evaluated by the PGCE trainees at the end of
the week’s activity. This evaluation is designed to act not only as a quality
assurance mechanism but also as a tool for reflection on the part of the
students. Tutors also use the evaluation formatively in order to see the extent
to which trainees have engaged with the research task and to develop or
modify the task for the following year. We were interested not only in the
trainees’ engagement with the data and their interpretations of the values and

practices of the teachers, but also in their assessment as to the efficacy of the
development exercise as a whole and of the use of the analytical scheme
derived from the factor analysis as a basis for developing their understanding
of formative practice.
Three questions in the evaluation form were particularly valuable in our
analysis of the research task. These were:
• Which areas has the research task allowed you to develop?
• What has been the most useful/significant learning for you – personally and
as a teacher?
• How could the task have helped you to develop further?
Before we consider the responses in more detail, it should be noted that all of
the trainees, without exception, felt that the activity had enhanced their
understanding of formative assessment, including one student who wrote:
‘Spot on. Very accessible, well explained, exciting, interesting’. We will return
to the perceived value of the activity later, but for now it is worth delving
beneath the surface of a general endorsement of the task to specific patterns
evident in student responses indicative of the way in which the activity
prompted thought about formative practice.
Three main themes emerged from the trainee responses. The first two,
perhaps not surprisingly, reflected the factors we had discussed. The first area
was broadly to do with values and practice and the second with pupil learning.
The third, self-reflection, was possibly attributable to the evaluative design of
the form, but also to the task itself, the focus on teachers’ beliefs and values
stimulating some students to question critically their own practice during
their recent school experience.
Each of the three questions stimulated responses in each of the three
themes; in other words there was no direct correlation between the question
and the area of response it provoked, and some students returned to the same
theme in more than one of their responses; the theme of ‘values and practices’
was raised by some students in response to each of the questions listed above.
To this extent the questions do appear to have acted more as a prompt to what
the students wanted to say rather than inviting stock responses, and many of
the responses have a discursive and exploratory character.

Values and Practices


In the course of the research task, the trainees worked collaboratively and all
contributed both to discussion and reporting back. On the basis of their
responses to the questions listed above, however, it was possible to identify
two broadly defined groups, which we will call Group A and Group B. Group
A (eight students) characteristically described their emerging awareness of the
tension between values and practices highlighted by the factor analysis, while
Group B (15 students) tended to focus solely on the practices of formative
assessment.

Group A articulated the tensions they had identified between values and
practice in a number of ways. It appeared that some of the students had simply
been struck by the practice–values gap as identified through the analysis. For
one student, who came back to the issue twice (in response to questions one
and three) the difference was broadly descriptive:
Ideas of formative assessment values in practice ... an insight into how
other teachers implement and theorize their practice ... it has made me
more aware of the discrepancies between the philosophies of a teacher
and the practices they undertake
Another observed:
That there is a tension between the ideal world of formative assessment
and the real pressure of the summative assessment system
This last comment is interesting as it is the only one, in any of the responses,
that explicitly juxtaposes formative practice with ‘performance orientation’.
Another wrote:
There can be a significant difference between your learning and your
practice.
The choice of the word ‘learning’ here is interesting in that it implies a disparity in
the student’s mind between how they view their learning and what they value
about the learning of others.
For others, however, the apparent gap appeared to act more as a
challenge:
Making values and practice work together.

The use of formative assessment as well as trying to match practice and
values during a lesson.

Looking at the purposes and procedures of formative assessment.

More confident and clearer aims exactly what formative assessment is –
aim – practice.
These last two comments, written by the same student, while they do not
specifically mention the tension between values and practice, seem to imply
it. This is because the student appears to be making a distinction between
the outworking of formative assessment in the classroom – its surface
‘practice’ or ‘procedures’ – and an underlying rationale – the ‘purpose’ or ‘aim’
of formative assessment. We will return to the idea of the ‘purpose’ of
formative assessment as conceived by this and other students in this group
later.
Others more explicitly related the research task to their own practice:

Highlighting mistakes I’ve made but am aware of in terms of formative
assessment. Good practice/examples and strategies of formative
assessment procedures. Analyzing practice/values.

It made me reflect on the way I teach and on the values I have. I think I’m
going to analyze if my values match my practice.
Group B, on the other hand, made little or no mention of the values–practice
gap, which is perhaps surprising given the presentation of the factor and
cluster analysis. What they seemed to have derived from the task might all,
with the exception of one student, come under the general heading of
‘practices’. This one student, who commented that ‘It was a very useful task
that has provided me with an insight into formative assessment’, perhaps
needed more support in translating this general statement into practical
strategies. This interpretation of his response is evident in his answer to
another question where he writes that what he wanted was ‘a summary of the
main issues linked to formative assessment’.
For the rest of Group B, however, the notion of practices fell into two
broad divisions. For some it meant the range of practices – embodied in terms
such as technique, activity, practice, strategies and methods:
Developing a range of formative strategies

Has helped me identify and explore different formative practices

Awareness of formative techniques

Methods of formative assessment


For others it was more about implementation:
How to put formative assessment into practice

It has given me ideas about formative assessment in practice

I will put some practices observed into my teaching

Methods for putting formative assessment into practice


Some responses were not directly concerned with formative assessment but
suggest that the task prompted reflection on practice in general:
To be able to pinpoint and pull out important aspects of teaching a topic

Observing different teaching styles and how resources are used

Pupil Learning
Similar patterns emerged in responses to questions dealing with pupil learning.
On the whole, it was Group A students who referred to either pupil learning,
or more specifically, to pupil autonomy. The task itself had clearly encouraged
students to critically engage with issues of pupil learning and, in particular,
autonomous learning. One student is quite explicit:
I shall never be able to watch a video again without wondering where
‘promoting learning autonomy’ is taking place.
Others make the connection through their choice of language suggestive of
the importance of the observational process. They speak of being able to ‘see’
and ‘recognize’ learning in pupils and describe how the classroom activity they
have observed is relevant to, and may inform, their own practice:
Knowledge and ways of learning can be assessed and being able to see
whether a child has learned something

Recognizing a pupil who has learned to learn


For some, however, the task encouraged them to think more about their own
role as teachers in the process of developing pupils’ learning:
Ideas for pupil autonomy and self and peer assessment

Promoting learning autonomy


With others the task more generally raised their awareness of learning in
pupils:
That learning isn’t a passive process. Pupils need to engage in the activities
in order to learn

What students will learn and how they learn best


One student mentioned that the task had enabled her to ‘focus on the benefits
of formative assessment’. While she does not specifically mention the notion
of learning, it seems implicit in her use of the word ‘benefit’.
The one student in Group B who wrote about student learning
stated that the activity gave him a ‘greater understanding of what
formative assessment is and the benefits for student learning’. He also added
that the task had given him an ‘awareness of formative techniques. How they
help students develop as learners and how to facilitate their use in the
classroom.’

Self-reflection
The last theme again highlights the differences between the two groups.
Returning to the last statement quoted above, it suggests that this (Group B) student is inclined

to see the relationship between formative assessment and pupil learning in
terms of ‘techniques’. This is characteristic of Group B students in that their
reflection is framed by what they perceive themselves to have ‘learned’
through the task and how it will impact on their own teaching.
The majority of Group A, in contrast, related reflections on their practice
to either ‘values’ or pupil learning. For a number, as we quoted earlier, the
activity prompted them to consider whether they practiced what they valued:
Highlighting mistakes I’ve made but am aware of in terms of formative
assessment ... good practice/examples and strategies of formative
assessment procedures ... analyzing practice/values.

It made me reflect on the way I teach and the values I have. I think I’m
going to analyze if my values match my practice

Do my teaching values meet my teaching practice? What problems are
there with meeting values in practice?
One student reflected less on their own values and more on their own learning
and the way in which they taught: ‘It made me think how I learn best and use
this to teach.’ For others, the research task prompted specific reflections on
their desire to promote pupils’ learning autonomy:
Now that I’m aware of the key factors I will be developing my own style
according to some/all of the relevant areas ... promoting learning
autonomy

[through] analysis of others’ teaching I have become more aware of how
to implement formative assessment and encourage pupil autonomy
Evident in all the foregoing comments are expressions of the benefit of lesson
observation as an aid to reflection, rather than simply in demonstrating
desirable practice and adding to the students’ repertoire of classroom strategies.
Our Group B students, on the contrary, emphasized the practical benefits
of the research task in terms of classroom repertoire. The following comments
are more characteristic of Group B in their concentration on the ‘techniques’
of formative assessment, in relationship to practice:
This has really helped me think about how I assess and given me terms for
what I do in the classroom automatically

This task has given me lots to think about and lots of ideas to help me
improve my assessment in my teaching

It gave me ideas to try in the classroom. Made me reflect on my own
teaching practice

For all the students, what appears to have been particularly helpful is the
opportunity to undertake semi-structured observation of other people’s
classroom practice. The use of the words ‘critical’ and ‘careful’ in the following
responses is particularly illuminating in this respect:
The ability to be critical of other people’s method of teaching and to be
able to think of ways I would teach what I am observing

The potential improvement of my own teaching by careful observation of
another teacher’s lesson

Review
Evident in these student responses is the link between peer assessment and
self-assessment reported in Black et al (2003). For many students (those in both
our Group A and Group B) processes of peer observation prompted reflection
and self-assessment.
The use of video evidence to stimulate reflective practice amongst
practitioners is not new (see, for example, Hu et al, 2000; Cunningham, 2002).
In this research task, however, students, in critically evaluating the practice of
others in a research role, were able to relate their own values and practice and
those of others to the theoretical underpinnings of formative assessment. In
this respect the use of the video can be located in a broader ‘image-based’
approach to qualitative research (Prosser, 1998).
It is evident, however, that the video data alone would have been
insufficient to illuminate the values–practice gap, and that the use of the
interview data, for around a third of the students, added an additional ‘frame’
for the students to understand what they were seeing. The benefits of the
iteration between the interview and the video data were further supported by
the additional ‘lens’ of the factor structure. It is this that appears to have given
them a vocabulary for what one trainee described as ‘what I do in the
classroom automatically’. This phrase in itself is significant in that it indicates
the efficacy of the instrument in enabling trainees to get beyond the
‘automatic’ to the critically reflective (see Schön, 1983), even if they only
extended this reflection to their own repertoire rather than the pupils’
autonomy in learning.
What the evidence we have analysed does not tell us is the trainees’ own starting points. We have no data on the trainees’ beliefs about learning, and it would be interesting to know the extent to which, as with the teachers, their own values affected the way they approached the task. To understand fully the impact of this semi-structured instrument in developing trainees’ and teachers’ engagement with formative assessment, we would need to know what their opinions were based upon.
It is possible that those trainees who spoke most of pupil autonomy
already valued this factor and had begun to experience some of the tension
highlighted in the factors and the teacher interviews. Even if this were so, it is

evident that all these trainees were still prompted to further reflection by the task, not least because, as we have already observed, they were given a means of structuring that reflection in ways which might enhance their practice.

Conclusion
What is clear from the analysis of the trainees’ responses to the data is that all
the trainees were able to reflect upon the issues surrounding values and
practice with particular emphasis upon performance orientation and pupil
autonomy. We are cautious, however, about drawing inferences from this analysis, since reconciling the demands of procedural practice with ideas about autonomy will require more detailed study.
In Group B some of the trainees focused on the effectiveness of the
procedures at the end of a training course which privileges the acquisition and
demonstration of predefined competences. Group A responded differently and
highlighted the tensions between performance orientation and pupil
autonomy. It is possible that the nature of their reflections was prompted either by their own experience of the gap between their values and their practice on teaching placement, or by a commitment to pupil autonomy with which they arrived on the PGCE course.
This echoes the findings of James & Pedder (2005 [submitted]), in which
20% of teachers surveyed in the Learning how to Learn project valued pupil
autonomy over and above performance criteria. Such evidence suggests that
the challenge presented for initial teacher training courses is to unravel trainee
beliefs about pupil autonomy and to engage them in the resolution of the
beliefs and values gap. Moreover, this understanding may or may not align constructively with what they are being asked to teach and the way in which they are asked to teach it, and this potential alignment may vary according to subject discipline (Entwistle, 2003). What our small-scale study indicates is that it is important to give trainees the discourse with which to begin to identify and discuss the parameters of the values–practice gap; whether these tensions are reconciled will depend upon the trainees’ understanding of performance orientation and the need for pupil autonomy.

Acknowledgements
The assistance of the Learning how to Learn team, particularly Dave Pedder
and Richard Procter (University of Cambridge), is gratefully acknowledged, as
is that of the student researchers from King’s College London.


Correspondence
Ann-Marie Brandom, Department of Education & Professional Studies,
King’s College London, Franklin-Wilkins Building (Waterloo Bridge Wing),
Waterloo Road, London SE1 9NH, United Kingdom
(ann-marie.brandom@kcl.ac.uk).

Note
[1] This article is based on the work of ‘Learning how to Learn – in classrooms, schools and
networks’. This is a four-year development and research project funded from January
2001 to March 2005 by the UK Economic and Social Research Council (ref: L139 25
1020) as part of Phase II of the Teaching and Learning Research Programme (see
www.tlrp.org). The Project is directed by Mary James (University of Cambridge) and
co-directed by Robert McCormick (Open University). Other members of the team are:
Carmel Burgess, Patrick Carmichael, David Frost, John MacBeath, David Pedder,
Richard Procter and Sue Swaffield (University of Cambridge), Paul Black, Bethan
Marshall and Joanna Swann (King’s College London), Leslie Honour (University of
Reading) and Alison Fox (Open University). Past members of the team are Geoff
Southworth, University of Reading (until March 2002), Colin Conner, University of
Cambridge (until April 2003) and Dylan Wiliam, King’s College London (until August
2003). Further details are available at: www.learntolearn.ac.uk

References
Assessment Reform Group (1999) Assessment for Learning: beyond the black box. Cambridge:
Cambridge University School of Education.
Assessment Reform Group (2002) Assessment for Learning: 10 principles. Cambridge: Cambridge
University Faculty of Education.
Black, P. & Wiliam, D. (1998a) Assessment and Classroom Learning, Assessment in Education, 5,
pp. 7-73.
Black, P. & Wiliam, D. (1998b) Inside the Black Box. London: King’s College School of Education.
Black, P., Harrison, C., Lee, C., Marshall, B. & Wiliam, D. (2002) Working Inside the Black Box:
assessment for learning in the classroom. London: King’s College School of Education.

Black, P., Harrison, C., Lee, C., Marshall, B. & Wiliam, D. (2003) Assessment for Learning.
Buckingham: Open University Press.
Brown, A. (1992) Design Experiments: theoretical and methodological challenges in creating
complex interventions in classroom settings, The Journal of the Learning Sciences, 2,
pp. 141-178.
Cobb, P., Confrey, J., diSessa, A., Lehrer, R. & Schauble, L. (2003) Design Experiments in
Educational Research, Educational Researcher, 32, pp. 9-13.
Cunningham, A. (2002) Using Digital Video Tools to Promote Reflective Practice, in Proceedings
of International Conference of Society for Information Technology and Teacher Education 2002(1),
pp. 551-553. Available at: http://dl.aace.org/10816
Entwistle, N. (2003) Concepts and Conceptual Frameworks Underpinning the ETL Project. Occasional
Report. Available at: www.edac/ETL/docs/docs3.pdf
Galton, M. (1987) An ORACLE Chronicle: a decade of classroom research, Teaching and Teacher
Education, 3, pp. 299-313.
Gorard, S., Roberts, K. & Taylor, C. (2004) What Kind of Creature is a Design Experiment?,
British Educational Research Journal, 30(4), pp. 575-590.
Hammersley, M. (1999) (Ed.) Researching School Experience: ethnographic studies of teaching and
learning. Lewes: Falmer Press.
Hu, C., Sharpe, L., Crawford, L. et al (2000) Using Lesson Video Clips Via Multipoint Desktop
Video Conferencing to Facilitate Reflective Practice, Journal of Information Technology for
Teacher Education, 9, pp. 377-388.
James, M. & Pedder, D. (2005) Beyond Method: differences in teachers’ stated assessment and
learning values and practices (submitted).
James, M., Pedder, D. & Swaffield, S. (2003a) A Servant of Two Masters: designing research to
advance knowledge and practice, paper presented at the American Educational Research
Association, Chicago, 21-25 April.
James, M., Pedder, D., Black, P., Wiliam, D. & McCormick, R. (2003b) The Design of the ESRC
TLRP ‘Learning how to Learn’ Project, paper presented at British Educational Research
Association Conference, Heriot-Watt University, Edinburgh, 10-13 September.
Lewin, K. (1948) Resolving Social Conflicts. Selected Papers on Group Dynamics, ed. Gertrude
W. Lewin. New York: Harper & Row.
MacBeath, J. & Mortimore, P. (2001) School Effectiveness and Improvement: the story so far, in
J. MacBeath & P. Mortimore (Eds) Improving School Effectiveness. Buckingham: Open
University Press.
Prosser, J. (1998) (Ed.) Image-based Research: a sourcebook for qualitative researchers. London:
Falmer Press.
Robertson, P., Sammons, P., Thomas, S. & Mortimore, P. (2001) The Research Design and
Methods, in J. MacBeath & P. Mortimore (Eds) Improving School Effectiveness. Buckingham:
Open University Press.
Schön, D.A. (1983) The Reflective Practitioner: how professionals think in action. New York: Basic
Books.
Stoll, L. & Fink, D. (1996) Changing our Schools. Buckingham: Open University Press.
Tunstall, P. & Gipps, C. (1996) Teacher Feedback to Young Children in Formative Assessment:
a typology, British Educational Research Journal, 22, pp. 389-404.

Wrigley, T. (2003) Schools of Hope: a new agenda for school improvement. Stoke-on-Trent: Trentham
Books.
Yin, R.K. (2002) Case Study Research. Thousand Oaks: Sage.

APPENDIX

The Factors and Items

The factors and items derived from the Staff Questionnaire and employed in the students’ analytical framework were as follows.

Factor 1: Making Learning Explicit

Assessment provides me with useful evidence of students’ understandings which I use to plan subsequent lessons.
Students are told how well they have done in relation to their own previous
performance.
Students’ learning objectives are discussed with students in ways they
understand.
I identify students’ strengths and advise them on how to develop them further.
Students are helped to find ways of addressing problems they have in their
learning.
Students are encouraged to view mistakes as valuable learning opportunities.
I use questions mainly to elicit reasons and explanations from my students.
Students’ errors are valued for the insights they reveal about how students are
thinking.
Students are helped to understand the learning purposes of each lesson or
series of lessons.
Pupil effort is seen as important when assessing their learning.

Factor 2: Promoting Learning Autonomy

Students are given opportunities to decide their own learning objectives.
I provide guidance to help students assess their own work.
I provide guidance to help students to assess one another’s work.
I provide guidance to help students assess their own learning.
Students are given opportunities to assess one another’s work.

Factor 3: Performance Orientation
The next lesson is determined more by the prescribed curriculum than by how
well students did in the last lesson.
The main emphasis in my assessments is on whether students know,
understand or can do prescribed elements of the curriculum.
I use questions mainly to elicit factual knowledge from my students.
I consider the most worthwhile assessment to be assessment that is
undertaken by the teacher.
Assessment of students’ work consists primarily of marks and grades.
Students’ learning objectives are determined mainly by the prescribed
curriculum.
