
King’s College London

School of Education, Communication & Society


Coversheet for submission of coursework
(Undergraduate & Taught Postgraduate)
Complete all sections of this form and ensure it is the first page of the document you submit.

Failure to attach the coversheet as required may result in your work not being accepted for
assessment.

DECLARATION BY STUDENT
This assignment is entirely my own work. Quotations from secondary literature are indicated by the
use of inverted commas around ALL such quotations AND by reference in the text or notes to the
author concerned. ALL primary and secondary literature used in this piece of work is indicated in the
bibliography placed at the end, and dependence upon ANY source used is indicated at the
appropriate point in the text. I confirm that no sources have been used other than those stated.
I understand what is meant by plagiarism and have signed at enrolment the declaration
concerning the avoidance of plagiarism.
I understand that plagiarism is a serious examinations offence that may result in disciplinary
action being taken.
I understand that I must submit work BEFORE the deadline, and that failure to do so will result in
capped marks.

Student no. 2 3 1 2 2 9 8 7 (Student ID card number, NOT your K-number)

Module Title: Studying Policy and Professional Practice

Module Code: 7SSEP001 (e.g. 7SSEM031)

Assignment title: How is students’ knowledge assessed in humanities and STEM subjects at my SE1 placement school? (6,000-word SER)
Deadline: Tuesday 30th January 2024 by 5pm

Date Submitted: Tuesday 30th January 2024

Word Count: 6,023 (For details of what is included in the word count, and penalties incurred by exceeding the word count limit, please consult the handbook.)

Your assignment may be used as an example of good practice for other students to refer to in future.
If selected, your assignment will be presented anonymously and will not include feedback comments
or the specific grade awarded. Participation is optional and will not affect your grade.
Do you consent to your assignment being used in this way? Please tick the appropriate box below.

YES NO

How is students’ knowledge assessed in humanities and STEM subjects at my SE1 placement school?

Table of Contents
Introduction and School Context
Literature Review and Government Policy Analysis
Ethics
Data Collection Methodology
Findings, Analysis and Discussion
Conclusion
References
Appendices (Appendix I – Appendix X)

Introduction and School Context
This School Experience Report aims to compare assessment for learning (AFL) in humanities and STEM subjects at School X, which is located in an inner metropolitan borough in London. School X is a mixed-sex academy-trust school with approximately 1,900 students on roll, and in its most recent Ofsted inspection (29 and 30 September 2022) it was judged ‘good’. According to the Ofsted report (2022), the primary and secondary phases at School X collaborate so that pupils develop their knowledge gradually. Another positive aspect of School X is the regular training its staff receive. In this way, students can not only work on the areas where they need improvement but also familiarise themselves with new concepts at a later stage of learning.

There was no assessment or marking policy at the time of data collection; investigating how AFL differs between STEM and humanities subjects in the absence of a school policy was the key factor in choosing this topic. Moreover, since AFL covers all subjects and all ability levels, I believe that, as a PGCE teacher, a better understanding of it would be beneficial.

Literature Review and Government Policy Analysis


Defining assessment for learning

Until some five decades ago, the significance of what the learner already knows was not widely recognised in teaching. It was Ausubel (1968) who reached this conclusion, arguing that teachers need to ascertain what students already know and ensure that they have learnt what they are meant to learn. AFL is the process of providing feedback through teaching and learning with a view to enhancing students’ performance (Cambridge Community, 2015). As Fautley and Savage (2008, p. 4) highlight, sound assessment skills are essential for effective teaching and learning in schools. Florez and Sammons (2013) maintain that ten principles comprise the key aspects of any AFL practice: effective planning, sound classroom practice, and understanding goals and criteria all play a vital role. In addition, AFL should be sensitive and constructive, foster motivation, and recognise all educational achievements (Florez and Sammons, 2013). Furthermore, AFL is not only a key professional skill; it also encompasses focusing on the learning process, helping pupils know how to improve, and developing the capacity for peer and self-assessment (Florez and Sammons, 2013). Differentiation is undeniably one of the most significant features of effective planning. As Bartlett (2016, p. 1) suggests, what is primarily meant by differentiation is that ‘all pupils make progress’ irrespective of their differences in ability, learning styles, or life and learning experiences. It follows that there is no ‘one-size-fits-all’ approach to teaching, as there may be instances where the teacher needs to change style entirely, for example, from conservative to ‘all singing, all dancing’ (Bartlett, 2016, p. 1).

Setting clear goals and criteria is also crucial to the success of a lesson. Nevertheless, coming up with the right goals and criteria is far from easy. It is therefore important to define goals that are not only worthwhile but also attainable within the right timeframe (Oregon, 2016). Consider the following examples from science, as illustrated by Oregon (2016):

a) “Students understand that all matter has characteristic chemical and physical properties and functions;
b) Students know that water is H2O; and
c) Students understand that energy can be transformed from one form to another.”

We can see that Learning Goal (c) is not only substantial enough to be worth learning but also feasible to accomplish within a lesson. Goal (a) would not be appropriate, because its breadth and complexity would require a much longer timeframe than a single lesson. Goal (b) provides a discrete fact for students to learn, but it lacks a substantial goal for a lesson. Success criteria, on the other hand, are the means by which teachers verify whether, and to what extent, learners have accomplished the learning goals (AITSL, 2017). AITSL (2017) also notes that success criteria are useful when they are concrete and specific; if they are not specific enough, they may not be meaningful. In addition, success criteria matter for students too: if they are aware of what they will learn and why, they can become autonomous learners, developing the skills necessary to initiate learning on their own (TeachingChannel, 2021).

While there are three types of assessment (diagnostic, formative, and summative), AFL encompasses only the first two (Miles, 2022). Diagnostic assessments take place before the learning activity and help identify challenging aspects to attend to during teaching, whereas formative assessments take place during the learning activity itself (Miles, 2022). Miles (2022) also notes that summative assessments are not considered AFL, since their outcomes are a summary of learning that has already happened. Multiple-choice questions (MCQs) are regarded as the best type of diagnostic question because they are quick, intensely focused, and very powerful (B.E.S.T., 2011). They are quick, since students do not need to produce an answer on their own. They are highly focused, because carefully prepared MCQs can be used to tackle possible misunderstandings stemming from previous lessons. They are powerful because teachers can use them in almost all year groups and subjects. Interestingly, Barton (2018, p. 35) points out that closed questions are not considered effective in Maths, however hard he has tried to implement them in class, because they do not require great depth of thought. Barton (2018) therefore concluded that open questions are more productive than closed ones, since they begin with interrogative words like “Why” and “How”, which demand a higher level of understanding to answer. Nevertheless, there are cases where closed questions can be quite valuable. For instance, closed questions are more suitable for low-ability students, because they give them the opportunity to answer accurately, which builds confidence (Doherty, 2017). Closed questions are also more time-efficient than open ones: according to Cohen et al. (2004), students answer closed questions within three to five seconds, whereas it can take them up to 15 seconds to answer open ones. In addition, teachers can prepare “hinge” questions before the lesson begins so that they can assess all students’ understanding and thinking at specific points (Searle, 2023). These questions can help teachers figure out whether they need to reteach the relevant stages of the lesson before moving on (Searle, 2023).
Searle (2023) also suggests that STEM subjects are particularly suitable for hinge questions, because unlike in other subjects there are no grey areas or answers based on students’ perspective: answers can only be right or wrong. It would therefore not be inaccurate to assume that there is no universal approach to questioning; there are question types, such as “fill in the blank”, which may be more suitable for humanities subjects instead. Hence, it is worth noting the role that individual requirements play in learning and in designing questions. As Benjamin (2023) indicates, mini whiteboards, problem pairs, think-pair-share, and metacognitive prompts are among the most striking examples of formative assessment. Mini whiteboards are especially notable because they are considered “low stakes”: students’ responses are not documented anywhere, so they can take more risks in answering questions than when writing them down in their books (Benjamin, 2023). For the same reason, MCQs are useful in formative assessment: whether they are based on a traffic-light system or on the letters A, B, C, and D, students only need to identify the correct answer rather than produce it on their own. ‘Problem pairs’, the practice of writing two similar worked examples or questions on the board so that the teacher can begin working through them before students take over and finish them, is helpful because the teacher can use students’ responses to ascertain whether they have learned the topic well or are still struggling (Benjamin, 2023). The ‘think, pair, share’ approach is a valuable way of encouraging students to offer answers; it bears fruit with lower-ability students or those lacking confidence, as they do not need to share their answers until they have first discussed them in a pair (Benjamin, 2023). Johnson (2023) also argues that this approach can be useful across all stages of the lesson (beginning, middle, and end): the teacher can check students’ prior knowledge before moving on to the main part, explain the key elements there, and finally evaluate whether students have grasped the concept. Prior knowledge deserves special attention, as it constitutes the basis of subsequent knowledge; ensuring that students have reached a sufficient level before beginning the main topic is therefore rather important (Hailikari et al., 2015). Historically, it was only the teacher who assessed; nowadays, pupils’ self-assessment is perceived as equally significant (Johnson, 2023). In this way students can gauge their own understanding before, during, and at the end of the lesson, helping them acquire metacognitive skills (Johnson, 2023). Metacognitive prompts, in turn, are helpful for gathering students’ feedback, so that the teacher knows whether the learning objectives have been met and how to improve strategies to enhance learning in subsequent lessons (Benjamin, 2023). Metacognition is regarded as one of the most influential factors in education, because it develops the skills students need to build up knowledge on their own, making them autonomous learners at later stages. Although each of these methods has advantages and disadvantages, what matters most is that feedback is constructive. Consequently, what teachers most need is to ensure that all their students have grasped the required knowledge and developed their skills. Regular assessment of students constitutes another key element in education, as it offers students and parents useful feedback concerning students’ progress (Wolf and Patrick, 2007). Gathering real-time feedback is particularly beneficial, as it offers many advantages, such as improvements in teaching and in understanding students’ learning behaviour, which can lead to more effective communication between teachers and students (Altrabsheh et al., 2014).

Government Policy

According to Poet et al. (2018), the following are the most salient points of government policy regarding AFL:

 The removal of national curriculum levels in 2014 was expected to bring about positive changes in AFL, such as improvements in pedagogy, motivation, and engagement, and more effective use of formative assessment in the classroom.
 The purpose of the Commission on Assessment without Levels (CAWL) was to increase the time available for in-depth teaching and formative assessment while decreasing the time teachers spend recording and tracking progress against numerical targets.
 It is now possible to pay more attention to formative assessment than before, so teachers have the opportunity to modify activities and adjust their planning and resources more efficiently. A few teachers mention that this has helped pupils to understand topics better, since it is no longer necessary to progress without first covering the corresponding curriculum unit.

 It is generally acknowledged that non-statutory assessments were more curriculum-oriented prior to the introduction of assessment without levels.
 The majority of teachers stated that there is no great difference in the time spent on assessment before and after the change. To make matters worse, several teachers say that their workload has increased because of the obligations resulting from it.
 While schools’ non-statutory assessment comprises four strategies, namely formative, summative, moderation and tracking, and reporting systems, secondary schools are inclined to use external tests and moderation less than primary ones.
 In respect of non-statutory assessment, primary schools seem to focus more on the core subjects of Maths, English and Science, unlike secondary schools, which apply a comparable strategy across all subjects.
 Some teachers indicated that, owing to the variety of strategies, they find it hard to interpret the non-statutory assessment data they receive when parents register their children at their school.

In light of these points, what can be concluded is that despite some improvements, including the quality of feedback and communication with pupils, assessment without levels has brought about confusion across schools (Poet et al., 2018). Statutory national assessment still shapes schools’ non-statutory assessment; hence, it remains the principal approach not only for formative but also for summative assessment (Poet et al., 2018). Additionally, assessment levels were said to have restricted teachers and pupils, which was the major reason for their withdrawal (DfE, 2011). Nonetheless, as there has yet to be a consensus among educators, it is still hard to determine whether assessment without levels has improved AFL practices.

Notwithstanding the fact that there is no policy document covering AFL at School X, I have noticed that teachers in both STEM and humanities subjects use similar approaches: cold calling, mini whiteboards, traffic-light systems, and circulation. As far as marking is concerned, the most commonly used practice is self-assessment, as students are asked to check their answers on their own and mark the correct ones with a green pen.

Ethics
The data were collected as part of this School Experience Report in accordance with King’s College London and BERA ethics guidelines. All participants were informed in advance that their participation was voluntary and that they could withdraw at any point during the process. They were also aware that permission from the PCM (Professional Coordinating Mentor) had been obtained to conduct this research, observe lessons and hold interviews. They were also notified that the data collected would be anonymised after collection so that they could not be identified, and that the data would not be shared with third parties and would be deleted once the assignment had been completed.

Data Collection Methodology


Since there were no policy documents relating to AFL at School X, it was necessary to observe lessons and interview teachers in both STEM and humanities subjects. I therefore observed two lessons in Maths, one in R.E. and one in English, and interviewed Teacher A (R.E.), Teacher B (English), and Teacher C (Maths). The relevant data were gathered through semi-structured observations and interviews. This method is effective because a semi-structured observation sets out the issues of interest beforehand without requiring a prearranged or systematic approach to gathering data (Cohen et al., 2017). Observation also enabled me to perceive students’ reactions and performance first-hand without having to interview them. This was rather significant: not only did it eliminate any pressure pupils might feel to give explicit, definite answers, but it also allowed me to see how they responded to their teachers’ questions. It was therefore possible to observe aspects that had not previously been factored in.

Given the nature of this research as a small-scale study, semi-structured interviews were appropriate: while the topic and questions are defined before the interviews, the questions are open-ended, and the order and wording can be adjusted so that each interviewee can be given prompts or asked follow-up questions to avoid ambiguity (Cohen et al., 2017). As Cohen et al. (2017) note, open-ended questions have many advantages, including flexibility, the possibility of probing questions and answers more deeply, encouragement of collaboration, and a more accurate assessment of respondents’ views. Unforeseen answers can also emerge from open-ended situations which the interviewer would not have anticipated (Cohen et al., 2017). It is therefore my conviction that it is more natural for teachers to answer questions entirely in their own words than to be asked questions requiring them to choose between preset options. Since School X allowed me to record the interviews, there was no difficulty in producing transcriptions. This is quite important, as it allowed me to minimise the risk of “massive data loss, distortion and the reduction of complexity” (Cohen et al., 2017). However, since the interviews were based on verbal discussion, “the visual and non-verbal aspects of the interview” were neglected (Mishler, 1986, cited in Cohen et al., 2017). Some loss of data from the original interview is bound to happen, because transcription converts an oral, interpersonal exchange into a written record (Cohen et al., 2017). The transcript acts as “an opaque screen between the researcher and the original live interview situation” (Kvale, 1996, p. 167, cited in Cohen et al., 2017). As a consequence of such challenges, it is hard to define what a ‘correct’ transcription is (Cohen et al., 2017).

Findings, Analysis and Discussion


Interviews

In all three interviews, the teachers highlighted the importance of prior knowledge, which makes them aware of their students’ current level. The role of prior knowledge is invaluable: if students’ prior knowledge has not reached the required level, there will be no tangible progress in their learning. This notion is also supported by Hailikari et al. (2015), as discussed above. Another point on which all interviewees agreed is the effectiveness of MCQs. They all mentioned that students find it easier to identify the correct answer than to produce one on their own. MCQs also give teachers the opportunity to assess students’ knowledge quickly and relatively effortlessly, which reflects B.E.S.T.’s (2011) view of this question type. I noticed that all three interviewees asked students to write their answers to MCQs on mini whiteboards. They use mini whiteboards not only because they are practical but also because, as Benjamin (2023) observes, students’ answers are not recorded anywhere, so they are willing to take “risks” in answering.

Turning to differences among the three interviewees, Teacher A (R.E.) mentioned that she applies a combination of assessments, including checkpoints via mini whiteboards (MWBs). Teacher A also uses memory retrieval and quotations. Through these techniques she ensures that students remember and retain the relevant information before moving on to new topics; she argues that this retrieval strength is what students need in R.E. As far as the frequency of assessment is concerned, she opts to assess students’ knowledge every 20 minutes. Checking students’ knowledge frequently reflects Wolf and Patrick’s (2007) view, as in this way students can monitor their progress regularly through ongoing feedback. Specifically, she reactivates their memories in preparation for new knowledge. Besides, Teacher A assesses chunks of learning every five weeks and then gives feedback so as to gain a better understanding of students’ long-term strengths and shortcomings. According to Teacher A, students with additional support needs (ASN) need more time and scaffolding than others, which conforms to Oregon’s (2016) statement that a successful lesson defines the right goals within the right timeframe. What we can understand from “right goals” and “right timeframe” is that we need to adjust resources, as well as our teaching style, according to students’ skills and capacity, as not everyone is the same or learns at the same pace. Hence, scaffolding can greatly benefit both teachers and students. For these reasons, Teacher A gives constant prompts and cues to ASN students.
Teacher B (English) circulates the classroom to elicit participation. He also uses techniques such as cold calling and checks students’ substantive and disciplinary knowledge. Substantive knowledge is the content taught as established fact, whether a standard practice, a concept, or a warranted account of reality, while disciplinary knowledge is a curricular term for how pupils learn about the ways that knowledge is established, the degree of certainty it can attain, and how scholars, artists, or professional practice continue to revise it (Counsell, 2018). This practice is also in line with Wolf and Patrick’s (2007) argument. Teacher B uses MCQs and hinge questions to check students’ substantive knowledge, so that he can check their understanding at certain points in the lesson, a concept also supported by Searle (2023). In contrast, to evaluate students’ disciplinary knowledge he uses techniques such as highlighting key elements, like adjectives and adverbs, in specific colours, because he suggests this may help students develop their critical thinking skills. Nonetheless, he also heeds the fact that students first need to have acquired substantive knowledge in order to develop their disciplinary knowledge, mentioning that success in the latter is based on the former. As regards what students find challenging, he points to literary criticism, the study used to investigate, evaluate, and interpret works of literature (Dickinson, 2024). Teacher B stated that knowledge related to literary criticism is hard to retain because it requires a higher level of abstraction and well-developed critical thinking skills. Conversely, students find standard rules, for instance prepositions, easy to follow. This is understandable given that memorising rules requires no in-depth analysis, just practice.

As far as STEM subjects are concerned, the Maths teacher, Teacher C, also stresses how crucial students’ prior knowledge is, as it allows her to determine what they know before introducing a new topic. She also drew attention to giving real-time feedback while circulating the classroom. In line with Altrabsheh et al. (2014), Teacher C gives real-time feedback so that she can develop her teaching and gain insight into how students learn and the gaps they need to cover. Teacher C differs from the other teachers at the end of the lesson, where she asks two questions, a “spot the mistake” question and an “exam-style” question, to check whether students have understood the topic. She also maintained that she constantly checks students’ prior knowledge by referring back to lessons already taught, depending on the difficulty of the topics; she revisits those lessons anywhere from twice a month to a couple of times per school year. What students find difficult is making connections between topics: once they have moved on, they may forget a topic they covered a while ago. Another useful technique she uses is modelling, a method that engages students through a demonstration broken down into steps with a rationale, exposing them to both visual and verbal examples of the relevant topic (Louisville, 2012). She also uses pair-share activities so that students can correct and learn from each other, and she noted that the more simplified the questions, the more easily students answer them.

Lesson Observations

I first observed a Year 7 Maths class of 27 students. Teacher C started the lesson with equation problems and asked students to work out the missing numbers in calculations; they needed some hints to find the answers first, like “You’ll first need to find the value of As.” Some students seemed to struggle with open questions owing to a lack of clear instructions and methods. What I noticed is that students find answers more easily when they are asked guiding questions, which are designed to motivate students to ponder the topic being taught (Cornell, 2024), and given relevant examples. Teacher C asks students to refer back to their previous learning if they find a question hard. As I realised, reminding students of the process for different modules helps a great deal, since, as Wolf and Patrick (2007) suggest, it provides them with useful feedback about their progress. Another noteworthy aspect of this lesson was the effective use of visual diagrams. Although students initially struggled to answer the question “Between which days was the temperature higher in Plymouth than Belfast?” (Appendix IV), most of them answered it correctly once Teacher C showed them the relevant graphs. Conversely, only a select few students succeeded in answering the question “Explain the mistake” (Appendix IV), which was ambiguous because the teacher had not mentioned anything about the “mistake”. Since the answer was “The graph only talks about days and not times of the day.” (Appendix IV), had Teacher C first given a few hints to make the question slightly more guiding, such as “There is a mistake about times in this graph. Can you identify it?”, it would not be inaccurate to assume that more students would have given the correct answer. I also observed a Year 11 Maths lesson with 26 students. Teacher C started with the factorisation of numbers, and then gave students two options from which to choose the correct one. What came to my attention here was that, after the teacher simplified the examples of quadratic factorisation, most students answered the question correctly, because guiding questions let students at least know what kind of answers are expected of them. Subsequently, she gave further instructions for future reference, saying that “the length of x can only be a positive number” (Appendix V). Such remarks also help solidify students’ learning, as they can build on their current knowledge. Teacher C also used the “think-pair-share” strategy to enable students to work together to discuss and address problems. Contrary to what Benjamin (2023) claims, in the light of my observations the “think-pair-share” approach is not always useful: when working together, several students tend to disengage and make noise instead of working on their tasks. Consequently, unless the teacher is well acquainted with the students, who also need to have developed a certain level of collaboration, this approach is unlikely to be effective.

Secondly, I observed the Year 9 English class, comprising 26 students. Because it was a language lesson, it gave Teacher B more room to ask open questions. When assessing students’ knowledge in humanities subjects, teachers usually rely on subjective items, which allow students to prepare an original answer such as an extended-response essay, unlike objective items, which do not require students to produce answers on their own but rather to select the correct response from specified alternatives (The University of Illinois, 2009). Having said that, the students surprisingly still struggled to answer them. I had not anticipated this because I had assumed that languages, unlike STEM subjects, did not have clear-cut rules, and I therefore believed that most students would be able to produce and expand their answers using their common sense and expressing their own views. As was the case in Maths, what they found easy was MCQs. I also observed that the assessment with jigsaw reading questions was quite different from the questions used in Maths. Since all the students had to do was identify the order in which the paragraphs were written, most of them, as expected, responded quickly. Linguistic elements such as keywords, conjunctions, and interrogative sentences were useful prompts for putting the paragraphs in the correct order. Another useful technique Teacher B used was highlighting different parts of speech in texts on the big whiteboard. From what I observed, this helped students learn the differences between parts of speech such as adjectives, adverbs, nouns, and verbs. The most conspicuous AFL concept I observed was exposing students to prior knowledge, or giving them guidance, before the assessment; otherwise only the most able students can tackle problems when left to their own devices.

Lastly, I observed the Year 9 R.E. class with 20 students. Teacher A first used fill-in-the-blank sentences with MCQs. Unsurprisingly, the vast majority of students answered the questions correctly. The teacher also asked the students to highlight the key words. Underlining or highlighting key words is helpful for two reasons: it urges students to search for fundamental information while reading, which improves their concentration on the essential parts, and it simplifies the analysis of the information they have read, since they then need to reread only the points they have marked rather than the whole text (Pressbooks, 2022). The teacher also used charts of key words, from which students learn better than from a text-heavy document. Teacher A asked semi-open questions whose answers were based on the text, such as “Two reasons why the ascension is important?” (Appendix VII). Most students supplied one reason; a more guided question, “Where does the ascension show Jesus is?” (Appendix VII), was needed for them to find the second. The teacher also asked a closed question, “How many instructions did Jesus give to his disciples before his ascension?” (Appendix VII); after the guided question, all the students answered it correctly. Teacher A also used a differentiated technique: after asking the students to answer an MCQ, she asked why the other options were incorrect, to check their understanding. She also told students to “explain two ways in which a belief in ascension influences Christians today” and “explain two Christian teachings about ascension.” Since these were also semi-closed questions, most students found the correct answers. Another feature that drew my attention was that the teacher asked students not to rub out their answers. She asked them to leave their incorrect answers as they were, so that they could compare them with the correct ones and refer back to them to remember why they had first answered incorrectly. This approach also reinforces Johnson’s (2023) view, as it can be used to boost students’ metacognitive skills. Although the other teachers used mini whiteboards too, I noticed that Teacher A used them more regularly. This also reinforces Wolf and Patrick’s (2007) view, as it makes it possible to assess students’ knowledge on a regular basis. Another thing Teacher A did slightly differently was setting and enforcing strict time limits: she gave students ten seconds to answer open questions on their mini whiteboards and three seconds for closed ones. This seems a wise way to prepare students for exam conditions and to develop the much-needed ability to answer questions quickly without panicking.

Overall, what can be deduced from my observations is that all the teachers used similar approaches, regardless of whether they were teaching humanities or STEM subjects. Contrary to my predictions, there were no noticeable differences in the frequency or the points of use of different question types such as open and closed questions, MCQs, and so forth. I had anticipated that the learning environment for English and R.E. students would be more interactive than for Maths, as those students would not need any specialised or technical knowledge, unlike in STEM, to develop and discuss their personal beliefs.

Conclusion
Taking everything into consideration, the obvious conclusion to be drawn is that even though the government guidance was vague and the school had no documentation specifically for AFL, what the interviewees said and what I observed in lessons reflect the influence of formative assessment on students’ learning. All the interviewees also underlined the importance of applying similar assessment types in their lessons. All components of the data collection indicated precise use of the pedagogical principles the teachers had acquired during their training sessions at School X. While the findings may suggest that assessment techniques are implemented in an identical way across School X’s classrooms, it is also worth noting the major role that teachers’ subject knowledge has played in their disciplines. This was particularly apparent in the way the guided questions were designed, the feedback given, and the well-adjusted tasks observed in the lessons. The efficiency of AFL is therefore based primarily on a sound familiarity with the content being taught rather than on the methodology used, since no matter how well assessment tasks are designed, if teachers lack the knowledge to convey the topic well, students will be unable to enhance their own knowledge. It is the teacher who is seen as a source of knowledge to promote students’ learning. For this reason, it would be advantageous for the authorities to include elements of subject knowledge enhancement when designing AFL policies, to ensure that all teachers are capable of teaching the required content. Another point worth discussing is the way students assess their own knowledge. Although students at School X are usually asked to assess their own knowledge at the end of each task, while observing the lessons I could not see how teachers ensured that students had actually checked all their answers. The use of mini whiteboards was partially successful in this regard, but a more reliable form of assessment, such as in-class tests at the end of each topic, would have been more beneficial.

When it comes to limitations, given the scope of the School Experience Report (SER), although I observed lessons in both Key Stage 3 (Year 7 and Year 9) and Key Stage 4 (Year 11), the number of students observed and teachers interviewed was limited; I would have gained a better understanding of how School X implements formative assessment across all classes and year groups had I interviewed more teachers and observed more lessons. In view of these factors, this research has been chiefly qualitative. Since I could not observe how formative assessment takes place in Key Stage 5, I was only able to give a general overview of how students’ knowledge is assessed in very specific classes and year groups. While Cohen et al. (2018) suggest that even small data sets gathered in research can be trustworthy, given the restricted coverage of this research it remains hard to ascertain whether the data collected are entirely adequate, valid, and reliable as evidence of how School X implements formative assessment across the whole school. The reliability and validity of this research would have been greater had its scope been larger; more data collection would have produced a more tangible outcome. Further research, interviews, and lesson observations would also be valuable for understanding how children with special educational needs respond to questions in terms of AFL.

Last but not least, the introduction of assessment without levels (AWL) has somewhat improved formative assessment, as it offers teachers the opportunity to provide pupils with feedback that clarifies their strengths as well as the areas where gaps remain. On the other hand, the large number of students in classrooms and the occasional absence of teaching assistants are major issues that need to be resolved so that teachers can assess their students’ knowledge more effectively.

References
Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3-14. [Online] Available from: https://www.sciencedirect.com/science/article/pii/S0191491X11000149 [Accessed 11th December 2023]

Cambridge International Education Teaching and Learning Team (2015). Getting started with Assessment for Learning. [Online] Available from: https://cambridge-community.org.uk/professional-development/gswafl/index.html [Accessed 11th December 2023]

Fautley, M. and Savage, J. (2008). Assessment for Learning and Teaching in Secondary Schools. [Online] Available from: https://books.google.co.uk/books/about/Assessment_for_Learning_and_Teaching_in.html?id=ijodqGTe2zoC&redir_esc=y [Accessed 12th December 2023]

Bartlett, J. (2015). Outstanding Assessment for Learning in the Classroom. [Online] Available from: https://www.routledge.com/Outstanding-Assessment-for-Learning-in-the-Classroom/Bartlett/p/book/9781138824508 [Accessed 12th December 2023]

Oregon Department of Education (2016). Chapter 2: Learning Goals and Success Criteria. [Online] Available from: https://www.oregon.gov/ode/educator-resources/assessment/Documents/learning_goals_success_criteria.pdf [Accessed 13th December 2023]

Australian Institute for Teaching and School Leadership (2017). Learning intentions and success criteria. [Online] Available from: aitsl-learning-intentions-and-success-criteria-strategy.pdf [Accessed 13th December 2023]

Freibrun, M. (2021). Using Success Criteria to Spark Motivation in Your Students. [Online] Available from: https://www.teachingchannel.com/k12-hub/blog/success-criteria/ [Accessed 14th December 2023]

Miles, J. (2022). The 3 Different Types of Assessment in Education. [Online] Available from: https://www.hmhco.com/blog/different-types-of-assessment-in-education [Accessed 15th December 2023]

BEST (2018). Approaches: Diagnostic questions. [Online] Available from: https://www.stem.org.uk/sites/default/files/pages/downloads/BEST_Approaches_Diagnostic%20questions.pdf [Accessed 16th December 2023]

Barton, C. (2018). On Formative Assessment in Math. [Online] Available from: https://files.eric.ed.gov/fulltext/EJ1182085.pdf [Accessed 17th December 2023]

Doherty, J. (2017). Skilful questioning: The beating heart of good pedagogy. [Online] Available from:
https://my.chartered.college/impact_article/skilful-questioning-the-beating-heart-of-good-pedagogy/
[Accessed 17th December 2023]

Benjamin, Z. (2023). 17 Easy To Use Formative Assessment Examples To Develop Your Approach To Teaching and Learning. [Online] Available from: https://thirdspacelearning.com/blog/formative-assessment-examples/ [Accessed 18th December 2023]

Johnson, E. (2023). 25 Assessment For Learning Examples To Use With Your Students Today. [Online] Available from: https://thirdspacelearning.com/blog/assessment-for-learning-examples/ [Accessed 19th December 2023]

Department for Education (2018). Assessment without levels: qualitative research. [Online] Available from: https://assets.publishing.service.gov.uk/media/5f6385f48fa8f510677710cd/NFER_AWL_report.pdf [Accessed 20th December 2023]
Cohen, L., Manion, L. and Morrison, K. (2018). Research Methods in Education. [Online] Available from: https://ebookcentral.proquest.com/lib/kcl/reader.action?docID=5103697 [Accessed 21st December 2023]

Hailikari, T., Katajavuori, N. and Lindblom-Ylanne, S. (2008). The Relevance of Prior Knowledge in Learning and Instructional Design. [Online] Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2630138/ [Accessed 22nd December 2023]

Wolf, J. and Patrick, J. (2007). Academic Improvement through Regular Assessment. [Online] Available from: https://eric.ed.gov/?id=EJ780700 [Accessed 23rd December 2023]

Counsell, C. (2018). Taking curriculum seriously. [Online] Available from: https://my.chartered.college/impact_article/taking-curriculum-seriously/ [Accessed 24th December 2023]

Dickinson College (2024). Criticism: Literature, Film & Drama: Literature Criticism. [Online] Available from: https://libguides.dickinson.edu/criticism [Accessed 16th January 2024]

Altrabsheh, N., Cocea, M. and Fallahkhair, S. (2014). Learning Sentiment from Students’ Feedback
for Real-Time Interventions in Classrooms. [Online] Available from:
https://link.springer.com/chapter/10.1007/978-3-319-11298-5_5 [Accessed 17th January 2024]
University of Louisville (2012). Modeling - Elementary School. [Online] Available from: https://louisville.edu/education/abri/primarylevel/modeling/elementary [Accessed 18th January 2024]

Cornell, D. (2024). 18 Guiding Questions Examples. HelpfulProfessor. [Online] Available from: https://helpfulprofessor.com/guiding-questions-examples/ [Accessed 19th January 2024]

Valencia College (2020). Choosing between Objective and Subjective Test Items. [Online] Available from: https://valenciacollege.edu/academics/academic-affairs/learning-assessment/documents/handout4-yellow-quiz-choosingbetweenobjectiveandsubjectivetestitems.pdf [Accessed 20th January 2024]

Pressbooks (2022). Chapter 9: Underlining and Highlighting Key Words and Phrases. [Online] Available from: https://pressbooks.pub/irwlevel1/chapter/underlining-and-highlighting-key-words-and-phrases/ [Accessed 21st January 2024]

Appendices
Appendix I: Interview with the Maths Teacher C

Appendix II: Interview with the English Teacher B

Appendix III: Interview with the R.E. Teacher A

Appendix IV: AFL Lesson Observation 16/11/2023 – Maths, Teacher C, Year 7, class size 27.

Appendix V: AFL Lesson Observation 13/12/2023 – Maths, Teacher C, Year 11, class size 26.

Appendix VI: AFL Lesson Observation 24/11/2023 – English, Teacher B, Year 9, class size 26.

Appendix VII: AFL Lesson Observation 17/11/2023 – R.E., Teacher A, Year 9, class size 20.

Appendix VIII: Teacher of Mathematics Ethics Consent Form

Appendix IX: Teacher of English Ethics Consent Form

Appendix X: Teacher of R.E. Ethics Consent Form

