ASSESSMENT & EVALUATION IN HIGHER EDUCATION
https://doi.org/10.1080/02602938.2020.1842852

Enabling the feedback process in work-based learning: an evaluation of the 5 minute feedback form

Emer O’Malley (a), Anne-Maria Scanlon (b), Lucy Alpine (c) and Sinéad McMahon (d)

(a) Physiotherapy Department, St. Columcille’s Hospital, Dublin, Ireland; (b) Physiotherapy Department, Tallaght University Hospital, Dublin, Ireland; (c) Discipline of Physiotherapy, Trinity College Dublin, Dublin, Ireland; (d) UCD School of Public Health, Physiotherapy and Sport Science, University College Dublin, Dublin, Ireland

CONTACT Emer O’Malley emer.omalley1@hse.ie
© 2020 Informa UK Limited, trading as Taylor & Francis Group

ABSTRACT
Feedback is the key ingredient from which we grow. This paper contributes to the literature on student feedback in higher education, specifically within work-based learning. It provides an overview of the usability, utility and impact of a freely available feedback tool, the 5 min feedback form (5MFF). The 5MFF was constructed for use within physiotherapy and is being utilised across clinical placement sites in Ireland. The generalisability of its structure allows for transferability across healthcare professions and other work-based disciplines. A multi-site cross-sectional study utilising convenience sampling across a diverse range of placement settings was conducted. Physiotherapy students, practice educators and practice tutors were surveyed on their experience and perceptions of the 5MFF. Findings indicate strong agreement in relation to ease of use, speed of completion and the form’s ability to structure and provide timely feedback. Results support its capacity to encourage feedback prioritisation, identify strengths and weaknesses, direct learning, modify behaviours and facilitate the achievement of goals and objectives. Through facilitating collaborative dialogue, the 5MFF supports the feedback conversation by raising awareness of student challenges or practice educator and tutor concerns. Utilisation of the 5MFF during physiotherapy clinical education supports effective feedback delivery and enables a positive feedback process.

KEYWORDS: Work-based learning; clinical education; student feedback; feedback tool

Introduction
High-quality feedback is recognised as an essential component of student learning in higher
education (Mulliner and Tucker 2017). Feedback is defined as ‘an informed, non-evaluative, and
objective appraisal of past performance that is aimed at improving future performance’ (Ende
1983, 3). It is necessary to lessen the gap between current capability and future achievement;
however, there is high variability in its frequency and effectiveness (Hattie and Yates 2014).
Feedback is considered to be one of the most influential and impactful elements in supporting
learning and ultimately student development (Hattie and Timperley 2007). As a result, failure to
receive feedback can be extremely discouraging and negatively impact student progression
(Cantillon and Sargeant 2008). The provision of written feedback over verbal feedback alone is
advocated (Page, Gardner, and Booth 2020), due to its potential to allow for reflection, consolida-
tion and application (Carless 2006). Sargeant et al. (2009) identified the process of reflection as
being a key element in feedback acceptance and utilisation. The importance of developing self-reflection skills across all levels and professions within healthcare education is emphasised as part of professional development (Mann, Gordon, and MacLeod 2009).

Theoretical underpinning
The concept of feedback is deeply rooted in pedagogical theories of learning. It originates in
behaviourism and remains a central aspect of learning at all levels (Mulliner and Tucker 2017).
Behaviourism adopts a linear process whereby students are required to reproduce information or
discipline-specific skills based on feedback which can be modified through praise or punishment
(Atkinson, Atkinson, and Hilgard 1983). The ongoing evolution of learning theories through cogni-
tivism, social cultural theory, meta-cognitivism and social constructivism has seen a greater
emphasis placed on feedback (Thurlings et al. 2013). Active student participation in a collaborative
cyclical feedback process, whereby students are encouraged to utilise prior learning and where
educators participate in the feedback process, is advocated (Jonassen 1991). Reflective practice as
a strategy to facilitate the achievement of professional competence has become a cornerstone of
healthcare education (Mann, Gordon, and MacLeod 2009). Boud (1999) outlined the importance of
good quality practices in student self-assessment and reflection, emphasising the value of guided
construction and self-evaluation of the profession-specific criteria to which they aspire.

Characteristics of effective feedback


Effective feedback has been described as the delivery of first-hand, timely and specific feedback
that utilises non-evaluative phrasing and focuses on action rather than personality (Hesketh and
Laidlaw 2002; Shute 2008). It should be frequent, formal and informal, constructive, reflected on
by the learner and utilised to achieve positive progression (Hesketh and Laidlaw 2002; Archer
2010). Formative feedback can have a powerful impact on achievement by providing helpful
comments on strengths and areas for development with guidance on strategies for improvement
(Nicol and Macfarlane-Dick 2006). Hesketh and Laidlaw (2002) emphasise the importance of rein-
forcing desired practice and providing an opportunity to correct or modify behaviour through
increasing self-awareness. Feedback can empower students and promote individual progression
by supporting the acquisition of self-assessment skills in order to develop self-regulated learning
(Nicol and Macfarlane-Dick 2006). Effective feedback must demonstrate an effect; a change in
behaviour or action must be exhibited in order to close the feedback loop (Dawson et al. 2019;
Carless 2019). Taras (2001) provides a helpful example of enabling student self-assessment skills
by involving students in appraising their own work, guided by an educator prior to the delivery
of a grade. Feedback content and delivery should therefore be carefully considered as they can
have a significant impact on student learning and engagement (Hattie and Timperley 2007).

Work-based learning
Work-based learning, also referred to as clinical or practice education, is an essential component of all healthcare professional programmes. The authentic learning it supports is vital to developing clinical competence and increasing graduate employability worldwide (Little and Harvey
2006; National Forum 2017). Through the dynamic process of experiential learning, students develop the profession-specific therapeutic and clinical reasoning skills required (Lisko and O’Dell 2010), and feedback plays a vital role in its efficacy (Cantillon and Sargeant 2008). Achieving a
successful balance between assessment and feedback requires a clear focus, support for learners,
adequate organisational processes and a commitment to learning rather than achievement alone
(Watling and Ginsburg 2019). An appreciation of self-assessment and reflection as a valued
component of learning is vital to the development of autonomous practitioners (Boud 1999).


Formative feedback provides a stimulus for ongoing student development; however, it places a perceived workload burden on clinical supervisors, who have the onerous task of being both mentor and assessor (Trede et al. 2015). Barriers to helpful feedback delivery include inadequate processes,
insufficient depth and quantity and poor communication between students and educators (Boud
and Molloy 2013a). The availability of sustainable effective feedback strategies for busy supervising
clinicians is therefore a key factor in achieving positive learning outcomes (Mutch et al. 2018).

Feedback tools
A move from historical educator-driven to student-led feedback processes has instigated the
development of work-based feedback models and tools (Boud and Molloy 2013b). A number of
models exist including the ‘feedback sandwich’ (Docheff 1990), which provides reinforcing and cor-
rective feedback, the ‘Pendleton model’ (Pendleton et al. 1984), which utilises a structured feed-
back conversation, and the ‘reflective feedback conversation’, which focuses on the goals of
feedback (Cantillon and Sargeant 2008). Structured feedback tools have developed within health-
care education from educator-led tools such as the ‘RIME-based’ method (DeWitt et al. 2008) to
broader multi-modal performance feedback tools (Dearnley et al. 2013). A recently evaluated student-led tool is ‘The Daily Feedback Tool’, advocated to increase feedback frequency, develop trusting relationships, encourage feedback-seeking and improve overall performance
(Allen and Molloy 2017). A potential limitation identified in relation to usability was the repetitive
nature of this tool. Others such as the ‘FEEDBK’ tool, provide a structure for students to reflect on
development goals, align with curriculum objectives and receive guidance from educators (Hall
et al. 2020). Although received positively by students, the utility of the ‘FEEDBK tool’ among educa-
tors has not been determined. These tools represent a significant progression towards student-led
self-directed learning. The importance of feedback design to ensure effective feedback utilisation,
by identifying whether planned goals have been achieved (i.e. closing the feedback loop) has been
emphasised (Carless 2019; Molloy, Boud, and Henderson 2020).

Development of feedback tools


The 5 min feedback form (5MFF) was developed within physiotherapy programmes in Ireland to
enhance feedback practices during clinical placement. The 5MFF is a structured tool, utilised to
increase feedback frequency, provide written evidence of weekly feedback and to standardise feed-
back practices. Despite use in routine physiotherapy clinical education, there is currently no available evidence as to whether the 5MFF supports feedback processes. This paper therefore sought student,
practice educator and practice tutor experiences of the usability, utility and impact of the 5MFF.

Aim
The aim of this study was to evaluate student, practice educator and practice tutor perceptions
of a weekly feedback form, utilised during clinical placement. The objective was to establish the
form’s usability and utility and to estimate the impact of using the 5MFF in supporting a structured feedback process.

Materials and methods


Clinical education
Clinical education is a significant component of higher education programmes for healthcare graduates. Physiotherapy students in Ireland must undertake 1000 hours of supervised clinical education in placement sites which are reflective of current practice and the demands of the profession (CORU 2018). A competency-based common assessment form (CAF) is utilised to provide formative and summative assessment of student progress at midway and at placement completion (Coote et al. 2007). Competency-based assessments are utilised across many healthcare disciplines; however, feedback processes are inconsistent and do not occur routinely within medical education in Ireland. The 5MFF was created as an adjunct to support effective feedback and facilitate the achievement of clinical competence (see Figure 1). It is a novel feedback tool utilised in physiotherapy clinical education to stimulate weekly student-led formative feedback conversations. Prior to its introduction, feedback was delivered in an ad hoc, inconsistent manner and often reserved only for midway and final feedback.

Figure 1. Five minute feedback form.

Study population and setting


A multi-site cross-sectional study utilising convenience sampling to survey participants on their
experience and perceptions of the 5MFF was conducted. Prior to undertaking this study, the questionnaire was piloted across 20% (n = 4 of 20) of Trinity College Dublin physiotherapy placement sites (O’Malley, Scanlon, and Alpine 2019). A range of sites, including urban, rural, primary, secondary and tertiary care settings in tutor and non-tutor placement sites, were contacted and invited to
participate. Third and fourth year physiotherapy students from University College Dublin (UCD)
and Trinity College Dublin (TCD) and affiliated practice educators and tutors were recruited
between October and December 2019. All students were familiar with placement processes, having previously completed between two and five placements. Placement duration was between
five and six weeks. While on placement students were supported by the practice education team.
The practice education team consists of university-based practice education co-ordinators,
regional placement facilitators who oversee multiple placement sites, practice tutors who provide
a link between placement sites and higher education institutions and clinically-based practice
educators whose role it is to supervise, guide and determine clinical and professional compe-
tence in their speciality area. All students have an assigned practice educator with some place-
ment sites having additional support from practice tutors who are primarily responsible for
placement organisation and tutorial delivery.

5 minute feedback form


The 5MFF was developed through reference to guiding literature and the cross-collaboration of
clinical and academic practice education staff to structure feedback practices within work-based
clinical placements. The tool was initially introduced to placement sites in 2013 and has been
developed through an iterative process. Students and supervising staff connected with UCD are encouraged to utilise the 5MFF weekly, whereas the ‘TCD Student Pathway’ requires weekly use. Student-led feedback is emphasised, and feedback delivery that is timely and brief (i.e. 5 min) is highlighted. Students document three aspects of their performance that went well, three areas for
future development, identify any challenging situations that arose during the week and agree a
structured plan to achieve performance goals. Educators provide feedback via brief written com-
ments to guide students on current performance and competency development. Comments are
utilised to reinforce desired behaviour, to alert students to areas that require development and to
provide guidance on strategies to bridge competency gaps. The 5MFF provides written evidence
of feedback on competency development and reinforces summative assessment decision making.
The inclusion of the text box ‘did any challenging situations arise this week?’ for students and
‘did any areas of concern arise this week?’ for educators differs from other available feedback
tools. Allen and Molloy (2017) advocated the development of trusting relationships to empower
students to seek feedback. Encouraging students to discuss experiences and feelings may
increase understanding, and requiring educators to explicitly identify areas of concern may facili-
tate increased self-awareness. On completion of the form a review date is set to determine
achievement of performance objectives and ultimately clinical competence as summatively
assessed by the CAF, thereby closing the feedback loop. The structure of the 5MFF facilitates
self-directed learning by encouraging students to take ownership of the feedback process by
engaging in self-reflection and self-assessment. It also facilitates the early detection of an under-
performing student. The identification of concerns triggers the utilisation of the TCD Student
Pathway, providing a structured remediation pathway.

Ethical approval
Ethical approval was granted by the Human Research Ethics Committee of University College
Dublin (LS-E-19-126-OMalley-McMahon) and Trinity College Dublin (Application no: 20190905).
Prior to initiation of the study placement sites were contacted and invited to participate. An
anonymous questionnaire was utilised at the end of clinical placement to gather feedback on
student, practice educator and tutor perceptions of the form’s usability, utility and impact.
Participation in the study was voluntary, and completion and return of the questionnaire implied consent. Data
gathered from the pilot study informed the content of the study questionnaire. Two question-
naires were utilised, one for students and the other for practice educators and tutors. None of
the groups were required to have used the 5MFF prior to completion of the questionnaire.

Study questionnaire
The questionnaires were constructed to capture data based on desired feedback characteristics,
using an education-based theoretical framework, similar to that used by Bohnacker-Bruce (2013).
A number of iterations of the questionnaire were evaluated and modifications were performed
6 E. O’MALLEY ET AL.

following the pilot study, consultation with students, practice educators, practice tutors, educa-
tional researchers and review of the literature, to support content validity. Demographic data, place-
ment or supervisory experience and familiarity with the 5MFF were gathered. For questions 1–6,
students were asked to indicate gender, age range, educational year, prior use of the 5MFF, the
number of placements it was used on previously and frequency of use on the preceding placement.
For questions 1–8, practice educators and tutors were asked to indicate gender, age range, educa-
tional role, supervision history, frequency of 5MFF use and previous attendance at practice educa-
tion training. A 5-point Likert scale from 5 ‘strongly agree’ to 1 ‘strongly disagree’ was used to
gather data on the 5MFF’s ease of use, completion time, ability to support feedback structure, time-
liness, prioritisation and the identification of strengths and weaknesses, learning guidance and goals,
desired performance modifications, student challenges and practice educator/tutor concerns. Finally,
an open response section requesting participants to ‘Please comment on the impact using the 5MFF
has e.g. support, feedback opportunities, time, structure, engagement, progression’ was included.

Analysis
All responses were imported to IBM SPSS for Windows version 24 (Armonk, NY) by the lead investigator, where all statistical analyses were performed. Practice educator and practice tutor data were combined and analysed as one group. Continuous variables are reported as means and standard deviations, and categorical variables are reported as frequencies and percentage prevalence. Non-parametric testing (the Mann–Whitney U test) was used to determine any associations between student and practice educator/tutor groups. The Pearson chi-square test was utilised to determine any associations between frequency of use and Likert scale and open response
classification. A statistical significance level of p < 0.05 was set for analyses. A dualistic technique
of deductive and inductive thematic analysis was applied to the coding of the open question
responses (Fereday and Muir-Cochrane 2006). A preliminary codebook was created with data-
driven coding based on our pilot study findings, the research question and literature review. In
addition to the a priori themes of ‘usability, utility and impact’ the application of an inductive
approach allowed for additional or unexpected themes to emerge.
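To make the quantitative comparisons above concrete, the short sketch below shows how the same two tests could be run. It is illustrative only: it uses Python's scipy rather than the SPSS workflow described in the paper, and the Likert responses and contingency counts in it are hypothetical placeholders, not the study data.

```python
# Illustrative sketch only: a Mann-Whitney U comparison of two groups' Likert
# responses and a Pearson chi-square test of association, as described above.
# All numbers below are hypothetical placeholders, not the study data.
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert responses (5 = strongly agree) for one questionnaire item
student_scores = np.array([5, 4, 5, 4, 3, 5, 4, 4, 5, 4])
educator_tutor_scores = np.array([5, 5, 4, 5, 4, 5, 5, 4, 5, 5])

# Non-parametric comparison of the two independent groups
u_stat, p_mwu = stats.mannwhitneyu(student_scores, educator_tutor_scores,
                                   alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mwu:.3f}")

# Hypothetical contingency table: rows are frequency of 5MFF use
# (never/once, 2-3 times, weekly); columns are open-comment classification
# (positive vs not positive)
contingency = np.array([
    [3, 5],
    [14, 10],
    [25, 10],
])

# Pearson chi-square test of association between frequency of use and classification
chi2, p_chi, dof, expected = stats.chi2_contingency(contingency)
print(f"Chi-square = {chi2:.2f}, df = {dof}, p = {p_chi:.3f}")

# Significance judged against the p < 0.05 threshold used in the study
alpha = 0.05
print("Mann-Whitney significant:", p_mwu < alpha)
print("Chi-square significant:", p_chi < alpha)
```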
Each data set (student and practice educator/tutor group) was analysed by two researchers.
The primary researcher (EOM) analysed both data sets and a separate researcher analysed the stu-
dent (LA) and practice educator/tutor groups (AMS) independently. The coding framework was fur-
ther developed and agreed through iterative testing. Where disagreement regarding coding
occurred, a fourth researcher (SMcM) was consulted to achieve consensus. Coding categories with
representative examples are presented in Table 1. Usability was defined as the degree to which the 5MFF was fit to be used; utility related to the tool being useful, beneficial or worthwhile; and impact identified text related to having a marked effect or influence. ‘Other’ was assigned where statements did not fall under any of the other three themes. Student examples are recorded as ‘S’
and educator/tutor examples are recorded as ‘ET’ followed by a number to identify individually
coded data. Responses were coded as positive, negative, mixed, neutral or none.

Results
A response rate of 51% for students was achieved. It was not possible to calculate this for the
practice educator/tutor group due to the anonymous nature of the study and inability to identify
the number of participants who had access to the questionnaire. A total of 147 participants com-
pleted the anonymous questionnaire. This included 68 students, 65 practice educators and 14
practice tutors. Seventy-five percent of students were in 3rd year and 25% were in 4th year. The
majority of placements (74.6%) were based in secondary care with 11.6% in primary care and
13.8% in tertiary care. Student and practice educator/tutor demographics, experience with use of the 5MFF and feedback training history are presented in Table 2. A higher incidence of females was noted in both groups, as is commonplace in many healthcare disciplines. The majority of students (86.8%) and practice educators/tutors (82.3%) used it two to three times during the placement or weekly, with no significant difference between groups for prior use (p = 0.40) or frequency of placement use (p = 0.35). Twenty-six (32.9%) of the practice educator/tutor group had not attended formal training on feedback delivery.

Table 1. Coding categories and representative examples.

Usability. Definition: the degree to which something is able or fit to be used. Description: text related to ease of use, time investment, design feedback and its capacity to help structure and support the delivery of timely feedback. Examples: S12: I found the 5MFF easy to use. ET79: Quick and easy to use.

Utility. Definition: the state of being useful, profitable, or beneficial through being able to perform several functions. Description: text agreeing or disagreeing with the form being useful, beneficial, worthwhile or valuable. Examples: S17: I found it to be useful to identify the areas that I was both doing well with and that needed work. ET20: Extremely useful tool for educating students and assisting with getting students to achieve overall placement objectives.

Impact. Definition: a marked effect or influence; having a strong effect on someone or something. Description: text mentioning direct influence of the form, e.g. on participation, insight/self-reflection, action planning, achievement, flagging areas of concern and prioritisation. Examples: S19: It has a great impact on my education. ET75: Increased engagement regarding student learning.

Other. Definition: statements that do not fall under any of the above themes. Description: text relating to future use, form ownership, requirement for prompting, educator value and training need. Examples: S20: I feel that educators need more education around the importance of the form. ET: I maybe need more reminders to use it from student and practice tutor.

None. Definition: no comment. Description: comment section blank. Examples: N/A.

Student and practice educator/tutor responses to the questionnaire are presented in Table 3.
The majority of respondents agreed or strongly agreed that the form was usable, useful,
impacted positively on student learning and that the time to complete was appropriate. The
5MFF was positively perceived, with 76.4% to 95.9% of study participants agreeing that it helped with feedback timeliness, directing learning, identifying strengths and weaknesses, modifying behaviour, improving performance and achieving goals. There was no significant difference between student and practice educator/tutor responses, apart from the impact on encouraging feedback prioritisation, where a greater number of the practice educator/tutor group strongly agreed. Frequency of use was significantly associated with perceived ability to direct learning (p = 0.003) and to modify behaviours and improve performance (p = 0.005) among students, and with the achievement of goals and objectives within the educator/tutor group (p = 0.039).
Of the student open comment responses, 61.8% (n = 42) were classified as positive, 2.9% (n = 2) negative, 14.7% (n = 10) mixed, 2.9% (n = 2) neutral and 17.6% (n = 12) did not respond. Within the practice educator/tutor group, 60.8% (n = 48) were classified as positive, 3.8% (n = 3) negative, 8.9% (n = 7) mixed, 6.3% (n = 5) neutral and 20.3% (n = 16) did not respond. Frequency of use and a positive open comment response were significantly associated for both the student (p = 0.015) and practice educator/tutor (p = 0.001) groups. Of the twenty-six practice educators who had not attended formal feedback training, all indicated agreement with its ease of use.

Table 2. Student and practice educator/tutor demographics, 5MFF experience and feedback training.

Gender, % (n): male, students 33.8 (23), educators/tutors 20.3 (16); female, students 66.2 (45), educators/tutors 79.7 (63).
Age, % (n): students, <25 years 85.3 (58), ≥25 years 14.7 (10); educators/tutors, 20-39 years 77.2 (61), 40-59 years 22.8 (18).
Prior use, % (n): yes, students 85.3 (58), educators/tutors 89.9 (71); no, students 14.7 (10), educators/tutors 10.1 (8).
Prior placement use (students), mean (±SD): 2.4 (±1.6).
Frequency of prior use (educators/tutors), % (n): 1-5 times 39.9 (31); 6-10 times 13.9 (11); 10+ times 46.8 (37).
Frequency of use during placement, % (n): never, students 8.8 (6), educators/tutors 12.7 (10); once, students 2.9 (2), educators/tutors 5.1 (4); 2 or 3 times, students 35.3 (24), educators/tutors 38.0 (30); every week, students 51.5 (35), educators/tutors 44.3 (35); not scored, students 1.5 (1), educators/tutors 0 (0).
Supervision experience (educators/tutors), % (n): <5 years 48.1 (38); ≥5 years 51.9 (41).
Feedback training (educators/tutors), % (n): attended 64.6 (51); expressed interest 22.8 (18); never attended 10.1 (8); not scored 2.5 (2).

Table 3. Questionnaire Likert scale responses (‘I found the 5MFF form …’). Percentages are the proportion responding agree or strongly agree; IQR = interquartile range.

Easy to use: students n = 68, 98.5%, median 5 (IQR 4–5); educators/tutors n = 74, 100%, median 5 (4–5); p = 0.19.
Helped to structure feedback: students n = 68, 92.6%, median 5 (4–5); educators/tutors n = 74, 100%, median 5 (4–5); p = 0.23.
Helped with timely feedback: students n = 68, 83.8%, median 4 (4–5); educators/tutors n = 74, 87.8%, median 5 (4–5); p = 0.27.
Encouraged prioritisation of feedback: students n = 68, 79.4%, median 4 (4–5); educators/tutors n = 73, 85%, median 5 (4–5); p = 0.02.
Took too much time to complete: students n = 68, 14.7%, median 2 (2–2); educators/tutors n = 73, 4.1%, median 2 (1–2); p = 0.53.
Helped identify strengths and weaknesses: students n = 68, 92.6%, median 4 (4–5); educators/tutors n = 74, 90.6%, median 4 (4–5); p = 0.74.
Helped to direct learning and give clear areas to focus on: students n = 68, 85.3%, median 4 (4–5); educators/tutors n = 73, 95.9%, median 4 (4–5); p = 0.40.
Helped to modify behaviours and improve performance: students n = 68, 86.7%, median 4 (4–5); educators/tutors n = 73, 94.5%, median 4 (4–5); p = 0.51.
Helped achieve goals and objectives: students n = 68, 76.4%, median 4 (4–5); educators/tutors n = 74, 87.9%, median 4 (4–5); p = 0.22.
Helped bring up any student challenges: students n = 68, 80.9%, median 4 (4–5); educators/tutors n = 74, 85.1%, median 4 (4–5); p = 0.43.
Helped highlight any practice educator/tutor concerns: students n = 68, 91.1%, median 4 (4–5); educators/tutors n = 74, 91.8%, median 4 (4–5); p = 0.50.

The usability of the 5MFF


Approximately one fifth of the student and practice educator/tutor groups commented on the
usability of the 5MFF in relation to its ease of use, time investment, design and capacity to help
structure and support the delivery of timely feedback. Both groups responded positively on
these aspects, noting that the tool is:

S38: Brief and to the point. Time orientated.


S32: It provides a structure for feedback.
ET8: Really quick and easy to use.

ET48: I liked that this is a rapid, structured & student led approach to feedback.

Some students gave design feedback suggestions while practice educators/tutors highlighted
the time element involved.
S14: Maybe there shouldn’t be an emphasis on the number of things that you improved/found difficult
because at times it’s hard to find 3 of each.
ET69: Good opportunity for PE to give quick feedback in structured manner – it does take more than five
minutes though!

The utility of the 5MFF


Over a third of the student and practice educator/tutor groups identified the 5MFF as being a
valuable, useful and helpful tool for enabling the feedback process. Comments highlighted an
improvement in formative feedback, clear goal setting and an increased opportunity for self-
reflection and interaction.
S53: Good for receiving constructive feedback & setting goals and in identifying any issues.
ET19: Useful for self-reflection from student, useful to highlight areas to improve, useful reminder in busy
environment for continued feedback, useful to start conversation on areas for development/areas of strength.

The 5MFF provides an opportunity for collaboration, supports ongoing monitoring and pro-
gression and is helpful in documenting concerns:
S21: Helped me facilitate a good rapport with tutor … overall was a good bridge week to week to ensure
continuous line of contact and review.

ET24: Useful to flag up any areas of concern in a timely manner & formally through writing it down.

Findings suggest that overall both groups have found the tool helpful, with only one student
identifying its utilisation as a burden:
S41: I think the 5MFF is a nice way for me to check in with my supervisor & share my thoughts on the past
week. Overall I find it very helpful.
ET6: I really like it, very helpful, should be standard everywhere, very helpful to progress and students
like it.
S8: I personally did not find the 5MFF helpful throughout my placement … it was more of a task than an aid.

The impact of the 5MFF


A quarter of all participants commented on the 5MFF as having a marked influence on learning.
Feedback prioritisation was highlighted by both groups and the impact of early identification of
challenges and emerging problems was also emphasised.
ET3: Enabled me to prioritise more formal feedback on a weekly basis. Easier to monitor student’s objectives
and placement plan.
S31: Good way to make sure feedback is actually given.
ET67: I feel it is valuable in terms of identifying concerns early and gives students an opportunity to
highlight any difficulties which they otherwise might not do.

The practice educator/tutor group also noted increased student reflection, insight, motivation
and self-directed learning while students indicated that setting expectations positively impacts
their future progress.

ET51: I think it encourages students to do a reflection without having to be verbally prompted by me which
demonstrates some independence. Also if students don’t ask to do it, this can demonstrate poor self-
directed learning.
S57: Helped identify learning needs and goals as well as improve my overall grades both at midway and at
the end of placement.

S46: Guidance of what is expected from me & how I can progress every week.

Other
Statements which did not fall under the above headings were coded as ‘other’. One identified theme related to form ownership and the requirement for reminders.
ET74: Didn’t use it on this placement. I maybe need more reminders to use it from student and
practice tutor.
ET61: Often prioritised out by educator on a Friday. Students need to continually prompt educators to fill
out and go through.

Both groups identified future use, with students also highlighting the importance of educator value and training.
ET15: I’ve never used it but from looking at it I can imagine it would help greatly with giving feedback.
S5: I maybe should have used it more … chased my educator.
S1: Depends on educator’s willingness to give feedback.
S20: I feel that educators need more education around the importance of the form.

Discussion
Clinical education is an integral part of healthcare curricula as it provides students with the
authentic learning crucial to becoming a healthcare professional (Trede et al. 2015). Work-based
learning increases opportunities for individual feedback which can enhance student engagement
and progression within a clinical environment (Cantillon and Sargeant 2008). Facilitating a collab-
orative learning environment where feedback is valued as a two-way process is central to achiev-
ing desired placement outcomes (Omer and Abdularhim 2017). Strategies and tools that enable
constructive and effective feedback processes are therefore essential. The 5MFF has demonstrated high usability and utility and has positively impacted the feedback process. This is important as
research suggests that if feedback mechanisms are not easy to use, detailed, personalised and
do not support active student engagement then the feedback process will not be successful
(Dawson et al. 2019). The 5MFF facilitates feedback on student perceptions of areas of strength
and for development. Its success, however, has the potential to be limited by student insight,
clinical experience and professional skill in guiding learning. Discussing areas of concern may
evoke reflective responses as described by Sargeant et al. (2009) who found that negative feed-
back or feedback inconsistent with self-assessment was vital to decisions around feedback
acceptance and use.
Effective feedback has been described as the provision of frequent, formal and constructive
guidance that promotes self-reflection and is utilised to achieve positive progression (Hesketh and
Laidlaw 2002; Archer 2010). Over half of students reported using the 5MFF weekly. This figure may
have been negatively impacted by the non-compulsory guidance on its use within one university, in addition to the use of the CAF to provide feedback at midway and at placement completion. Our results sug-
gest increased 5MFF use impacts learning direction, performance and student achievement.
Utilising the 5MFF has encouraged students to seek out feedback – an approach endorsed by
Hesketh and Laidlaw (2002) to enable feedback to work best. Results suggest that the 5MFF has
promoted open communication which can help students to discuss challenges, facilitate practice
educators/tutors in highlighting concerns and ultimately increase feedback effectiveness. A lack of
feedback engagement was, however, highlighted by both groups. Some students noted that 5MFF
utilisation was determined by educator willingness while some educators identified student failure
to seek out feedback. The student-led nature of the 5MFF can overcome some barriers; however, responsibility-sharing is advocated since, regardless of the quality of feedback delivered, students must engage in the process in order for it to be impactful (Nash and Winstone 2017).
Access to easy-to-use and time-efficient tools that align with learning objectives, support stu-
dent reflection and facilitate experiential learning is vital (Hall et al. 2020). The structure and guid-
ance for use of the 5MFF emphasise the need for the feedback tool to be easy to implement and
brief. Both student and practice educator/tutor groups rate the usability of the 5MFF very highly.
Importantly, findings suggest an encouraging relationship between increased frequency of use and
a positive appraisal of the 5MFF. Greater feedback retention through writing comments down was
also reported. Hall et al. (2020) emphasised the value of written feedback in providing the oppor-
tunity to review and track progress. Interestingly, within our data, some students gave feedback
tool design suggestions and although these differed between individuals they highlight the need
for feedback to remain flexible and individualised to the learner (Sadler 2010). Among many bar-
riers, a lack of educator training in the delivery of feedback has been reported (Hesketh and
Laidlaw 2002). Despite a third of our study population not having received feedback training, this
did not appear to be a difficulty, with many who had not previously used it anticipating that the
5MFF would be easy to use and would support the feedback process.
To date, disagreement around feedback engagement, preference and student satisfaction has
been a significant challenge across higher education (Mulliner and Tucker 2017). Students report
receiving limited feedback whereas educators report providing regular and sufficient feedback
(Gigante, Dell, and Sharkey 2011; Barrett, Belton, and Alpine 2019). Given this disconnect, struc-
turing feedback delivery has the potential to positively impact work-based learning. Our findings
suggest that the 5MFF has high utility with students and practice educators/tutors reporting the
tool to be useful, beneficial and worthwhile. Both groups agreed that it provides a valuable
opportunity to formalise feedback in order to reinforce good practice and support a collaborative
approach to tackling areas for improvement. The 5MFF requirement to document feedback and the utilisation of a weekly review date support its efficacy. In line with Dawson et al. (2019), the
5MFF promotes active participation, timeliness and self-reflection through encouraging students
to explore challenges and evaluate planned learning strategies. This process of guided self-reflec-
tion and active learning is key to the empowerment of students as self-regulated learners (Nicol
and Macfarlane-Dick 2006). 5MFF utilisation to validate student progression may support profes-
sional competence achievement; however, performative evaluation of reflective practice should
not discourage open and honest communication (Hargreaves 2004).
Study findings suggest the 5MFF has impacted positively on student self-awareness, has
helped initiate feedback conversations, has facilitated the achievement of goals through action
planning and can flag concerns earlier. This is encouraging given that timely constructive feed-
back can help build student confidence and independence (Bradshaw and Lowenstein 2014).
Practice educators and tutors have found that the 5MFF has encouraged a more structured and
constructive delivery of feedback while also helping to motivate students. Feedback discussion is
important in clarifying understanding and setting performance expectations (Perera et al. 2008).
Importantly, both groups have found that the 5MFF has helped encourage feedback prioritisa-
tion and better communication. Having a structured method of addressing placement challenges
can be particularly beneficial for students who may be struggling. Bearman et al. (2013) identi-
fied a need for system and programme developments in supporting underperforming students
on clinical placement. Student ownership of learning and structured feedback opportunities for educators were emphasised. We feel that the 5MFF, in addition to the student pathway, addresses
these recommendations. A small number of participants felt the 5MFF may be perceived as a
burden in busy clinical environments. Given that time is a significant challenge for many,
the brief nature of feedback was emphasised when naming this tool.
5MFF utilisation challenges that emerged during inductive analysis included form ownership,
requirement for prompting, educator value and training need. Although the 5MFF is a student-led
tool, comments relating to the requirement for prompting for both students and practice educa-
tors were noted. Some students also reported that utilisation was dependent on the perceived value held by the practice educator and on the consistency of their own requests. A change in practice to mandatory utilisation
may support increased frequency of use and reduce ownership disagreement. This paper provides
a strong argument for the integration of a feedback-specific tool in work-based learning. Outlining
roles and setting expectations for stakeholders may improve utilisation within clinical education.
Given the barriers to feedback, having a tool with high usability, utility and impact that reduces
clinician burden is important. The 5MFF is utilised during physiotherapy placements; however, we
feel that its utilisation across healthcare disciplines could prove equally useful. The universality of
the 5MFF structure provides the opportunity for self-directed and individualised feedback and, importantly, equity. We also contend that the 5MFF could be easily modified for utilisation with
peers and inter-professional colleagues to further avail of feedback opportunities.

Study limitations
Study results indicate a range of student, practice educator and tutor experiences of utilising the
5MFF. It provides a snapshot of a feedback tool used for clinical education but is unable to give
greater depth on the overall feedback experience. Due to the brevity of the questionnaire we
cannot be sure if all relevant topics, challenges and theories were captured. Although both 3rd and 4th year students were recruited, the cross-sectional nature of the study design does not allow for interpretation of the influence of increased placement experience and use of the 5MFF. The
opt-in aspect of the study may also influence the representative nature of the sample recruited.
A broader research approach such as individual interviews or focus groups would provide
greater validation of the inferences made within the results. Although this paper sought to
establish perceptions of the 5MFF, future research to determine attainment of goals and educa-
tional achievement utilising CAF scores would provide further insight into its impact.

Conclusion
Healthcare professionals are the gatekeepers for their professions. They are tasked with the respon-
sibility of assessing student clinical competence prior to graduation. The provision of an effective
feedback tool is therefore an essential component in facilitating the development of competent
healthcare professionals. Use of the 5MFF was found to positively impact the feedback process in
physiotherapy clinical education. The vast majority of students, practice educators and practice
tutors positively evaluated the 5MFF in relation to usability, utility and impact on learning. Findings
support its ability to enable the feedback process through open collaborative dialogue, timely guid-
ance on current performance and constructing a plan for future development. The 5MFF has
encouraged feedback prioritisation, enabled student self-reflection and provided an opportunity for
self-directed learning. Its key contribution is in reducing feedback burden, developing self-regulated
learners and facilitating overall progression in work-based learning. This paper provides educators
with a convenient and adaptable student-centred tool to support the feedback process.

Acknowledgements
We would like to sincerely thank all of the students, practice educators and practice tutors who took the time to
complete our questionnaire. Their comments provided really valuable insight into the usability, utility and impact
of the 5MFF as a feedback tool.

Disclosure statement
The authors report no declarations of interest.

Notes on contributors
Emer O’Malley is a practice tutor and senior Physiotherapist in St. Columcille’s Hospital, Loughlinstown, Dublin, Ireland, and an adjunct lecturer with University College Dublin, Ireland. Her research to date has centred on the feedback process, creating positive learning environments, telehealth and the student pathway within the clinical environment. She has also published widely in the specialist area of obesity assessment and management.

Anne-Maria Scanlon works in Tallaght University Hospital, Dublin, Ireland as a practice tutor in Physiotherapy. Her research interests centre on the impact of positive and supportive environments on student learning, in particular utilising strategies to support those living with physical and mental health disabilities. Her clinical interests include in-patient rehabilitation, care of older adults and persons with cognitive impairments.

Lucy Alpine is the Physiotherapy practice education coordinator in the Discipline of Physiotherapy, Trinity College
Dublin, Ireland. She is passionate about clinical supervision models, placement assessment, innovative strategies
that support student learning and factors that motivate healthcare professionals as practice educators. Her research
interests also include hip fracture and orthopaedic surgery outcomes.

Sinead McMahon is the Physiotherapy practice education coordinator in the UCD School of Public Health,
Physiotherapy and Sports Science, Dublin, Ireland. Her programme of research centres on education, curriculum
design and development. Her main research area focuses on student physiotherapist work-based learning, having
extensively investigated the cycle of competence model, core competencies requirements and clinical placement
specialisation and location.

ORCID
Emer O’Malley http://orcid.org/0000-0002-2745-4526

References
Allen, L., and E. Molloy. 2017. “The Influence of a Preceptor-Student ’Daily Feedback Tool’ on Clinical Feedback
Practices in Nursing Education: A Qualitative Study.” Nurse Education Today 49: 57–62. doi:10.1016/j.nedt.
2016.11.009.
Archer, J. C. 2010. “State of the Science in Health Professional Education: Effective Feedback.” Medical Education 44
(1): 101–108. doi:10.1111/j.1365-2923.2009.03546.x.
Atkinson, R. L., R. C. Atkinson, and E. R. Hilgard. 1983. Introduction to Psychology. New York: Harcourt, Brace &
Jovanovich, Inc.
Barrett, E. M., A. Belton, and L. M. Alpine. 2019. “Supervision Models in Physiotherapy Practice Education: Student
and Practice Educator Evaluations.” Physiotherapy Theory and Practice, 29: 1–14. doi:10.1080/09593985.2019.
1692393.
Bearman, M., E. Molloy, R. Ajjawi, and J. Keating. 2013. “‘Is There a Plan B?’: Clinical Educators Supporting
Underperforming Students in Practice Settings.” Teaching in Higher Education 18 (5): 531–544.
Bohnacker-Bruce, S. 2013. “Effective Feedback: The Student Perspective.” Capture 4 (1): 25–36.
Boud, D. 1999. “Avoiding the Traps: Seeking Good Practice in the Use of Self Assessment and Reflection in
Professional Courses.” Social Work Education 18 (2): 121–132.
Boud, D., and E. Molloy. 2013a. Feedback in Higher and Professional Education: Understanding It and Doing It Well.
London: Routledge.
Boud, D., and E. Molloy. 2013b. “Rethinking Models of Feedback for Learning: The Challenge of Design.” Assessment
& Evaluation in Higher Education 38 (6): 698–712.
Bradshaw, M. J., and A. L. Lowenstein. 2014. Innovative Teaching Strategies in Nursing and Related Health Professions.
Burlington, MA: Jones & Bartlett Learning.
Cantillon, P., and J. Sargeant. 2008. “Giving Feedback in Clinical Settings.” BMJ 337: a1961. doi:10.1136/bmj.a1961.
Carless, D. 2006. “Differing Perceptions in the Feedback Process.” Studies in Higher Education 31 (2): 219–233. doi:
10.1080/03075070600572132.
Carless, D. 2019. “Feedback Loops and the Longer-Term: Towards Feedback Spirals.” Assessment & Evaluation in
Higher Education 44 (5): 705–714. doi: 10.1080/02602938.2018.1531108.
Coote, S., L. Alpine, C. Cassidy, M. Loughnane, S. McMahon, D. Meldrum, A. O’Connor, and M. O’Mahoney. 2007.
“The Development and Evaluation of a Common Assessment Form for Physiotherapy Practice Education in
Ireland.” Physiotherapy Ireland 28 (2): 5–10.
CORU. 2018. “Physiotherapists Registration Board. Criteria for Education and Training Programmes,” edited by CORU
Regulating Health Social Care Professionals. Ireland.
Dawson, P., M. Henderson, P. Mahoney, M. Phillips, T. Ryan, D. Boud, and E. Molloy. 2019. “What Makes for Effective
Feedback: Staff and Student Perspectives.” Assessment & Evaluation in Higher Education 44 (1): 25–36. doi: 10.
1080/02602938.2018.1467877.
Dearnley, C. A., J. D. Taylor, J. C. Laxton, S. Rinomhota, and I. Nkosana-Nyawata. 2013. “The Student Experience of
Piloting Multi-Modal Performance Feedback Tools in Health and Social Care Practice (Work)-Based Settings.”
Assessment & Evaluation in Higher Education 38 (4): 436–450. doi: 10.1080/02602938.2011.645014.
DeWitt, D., J. Carline, D. Paauw, and L. Pangaro. 2008. “Pilot Study of a ‘RIME’-Based Tool for Giving Feedback in a
Multi-Specialty Longitudinal Clerkship.” Medical Education 42 (12): 1205–1209. doi:10.1111/j.1365-2923.2008.
03229.x.
Docheff, D. M. 1990. “The Feedback Sandwich.” Journal of Physical Education, Recreation & Dance 61 (9): 17–18.
doi:10.1080/07303084.1990.10604618.
Ende, J. 1983. “Feedback in Clinical Medical Education.” JAMA: The Journal of the American Medical Association 250
(6): 777–781.
Fereday, J., and E. Muir-Cochrane. 2006. “Demonstrating Rigor Using Thematic Analysis: A Hybrid Approach of
Inductive and Deductive Coding and Theme Development.” International Journal of Qualitative Methods 5 (1):
80–92.
Gigante, J., M. Dell, and A. Sharkey. 2011. “Getting beyond "Good Job": How to Give Effective Feedback.” Pediatrics
127 (2): 205–207. doi:10.1542/peds.2010-3351.
Hall, C., E. Peleva, R. H. Vithlani, S. Shah, M. Bashyam, M. Ramadas, J. Horsburgh, and A. H. Sam. 2020. “FEEDBK: A
Novel Approach for Providing Feedback.” The Clinical Teacher 17 (1): 76–80. doi:10.1111/tct.13026.
Hargreaves, J. 2004. “So How Do You Feel about That? Assessing Reflective Practice.” Nurse Education Today 24 (3):
196–201. doi:10.1016/j.nedt.2003.11.008.
Hattie, J., and H. Timperley. 2007. “The Power of Feedback.” Review of Educational Research 77 (1): 81–112.
Hattie, J. A. C., and G. C. R. Yates. 2014. “Using Feedback to Promote Learning.” In Applying Science of Learning in
Education: Infusing Psychological Science into the Curriculum, edited by V. A. Benassi, C. E. Overson, and C. M. Hakala,
45–58. Washington, DC: Society for the Teaching of Psychology.
Hesketh, E. A., and J. M. Laidlaw. 2002. “Developing the Teaching Instinct, 1: Feedback.” Medical Teacher 24 (3):
245–248. doi:10.1080/014215902201409911.
Jonassen, D. H. 1991. “Evaluating Constructivistic Learning.” Educational Technology 31 (9): 28–33.
Lisko, S. A., and V. O’Dell. 2010. “Integration of Theory and Practice: Experiential Learning Theory and Nursing
Education.” Nursing Education Perspectives 31 (2): 106–108.
Little, B., and L. Harvey. 2006. “Learning Through Work Placements and Beyond. A Report for the Higher Education
Careers Services Unit and the Higher Education Academy’s Work Placements Organisation Forum.” Retrieved
from Prospects http://ww2.prospects.ac.uk/downloads/documents/hecsu/reports/workplacement_little_harvey.pdf
Mann, K., J. Gordon, and A. MacLeod. 2009. “Reflection and Reflective Practice in Health Professions Education: A
Systematic Review.” Advances in Health Sciences Education 14 (4): 595–621. doi:10.1007/s10459-007-9090-2.
Molloy, E., D. Boud, and M. Henderson. 2020. “Developing a Learning-Centred Framework for Feedback Literacy.”
Assessment & Evaluation in Higher Education 45 (4): 527–514. doi:10.1080/02602938.2019.1667955.
Mulliner, E., and M. Tucker. 2017. “Feedback on Feedback Practice: Perceptions of Students and Academics.”
Assessment & Evaluation in Higher Education 42 (2): 266–288.
Mutch, A., C. Young, T. Davey, and L. Fitzgerald. 2018. “A Journey towards Sustainable Feedback.” Assessment &
Evaluation in Higher Education 43 (2): 248–259.
Nash, R. A., and N. E. Winstone. 2017. “Responsibility-Sharing in the Giving and Receiving of Assessment Feedback.”
Frontiers in Psychology 8 (1519): 1519. doi:10.3389/fpsyg.2017.01519.
National Forum. 2017. “National Forum for the Enhancement of Teaching and Learning in Higher Education: Forum
Insight.” In Work-Based Assessment of/for/as Learning: Context, Purposes and Methods, 1–3. Dublin: National
Forum for the Enhancement of Teaching and Learning in Higher Education.
Nicol, D. J., and D. Macfarlane-Dick. 2006. “Formative Assessment and Self-Regulated Learning: A Model and Seven
Principles of Good Feedback Practice.” Studies in Higher Education 31 (2): 199–218.
O’Malley, E., A.-M. Scanlon, and L. Alpine. 2019. ““The 5 Minute Feedback Form”: A Pilot Study of a Weekly
Feedback Form for Clinical Placement.” In Health and Social Care Professions 2nd National Conference on
Practice Education, Dublin, Ireland.
Omer, A. A. A., and M. E. Abdularhim. 2017. “The Criteria of Constructive Feedback: The Feedback That Counts.”
Journal of Health Specialties 5 (1): 45.
Page, M., J. Gardner, and J. Booth. 2020. “Validating Written Feedback in Clinical Formative Assessment.”
Assessment & Evaluation in Higher Education 45 (5): 697–617. doi: 10.1080/02602938.2019.1691974.
Pendleton, D., T. Schofield, P. Tate, and P. Havelock. 1984. The Consultation: An Approach to Learning and Teaching.
Oxford: Oxford University Press.
Perera, J., N. Lee, K. Win, J. Perera, and L. Wijesuriya. 2008. “Formative Feedback to Students: The Mismatch
between Faculty Perceptions and Student Expectations.” Medical Teacher 30 (4): 395–399. doi:10.1080/
01421590801949966.
Sadler, D. R. 2010. Beyond Feedback: Developing Student Capability in Complex Appraisal, 535–550. Abingdon:
Routledge.
Sargeant, J. M., K. V. Mann, C. P. van der Vleuten, and J. F. Metsemakers. 2009. “Reflection: A Link between
Receiving and Using Assessment Feedback.” Advances in Health Sciences Education 14 (3): 399–410. doi:10.1007/
s10459-008-9124-4.
Shute, V. J. 2008. “Focus on Formative Feedback.” Review of Educational Research 78 (1): 153–189.
Taras, M. 2001. “The Use of Tutor Feedback and Student Self-Assessment in Summative Assessment Tasks: Toward
Transparency for Students and Tutors.” Assessment & Evaluation in Higher Education 26 (6): 605–614. doi:10.1080/
02602930120093922.
Thurlings, M., M. Vermeulen, T. Bastiaens, and S. Stijnen. 2013. “Understanding Feedback: A Learning Theory
Perspective.” Educational Research Review 9: 1–15. doi:10.1016/j.edurev.2012.11.004.
Trede, F., M. Mischo-Kelling, E. M. Gasser, and S. Pulcini. 2015. “Assessment Experiences in the Workplace: A
Comparative Study between Clinical Educators’ and Their Students’ Perceptions.” Assessment & Evaluation in
Higher Education 40 (7): 1002–1016.
Watling, C. J., and S. Ginsburg. 2019. “Assessment, Feedback and the Alchemy of Learning.” Medical Education 53
(1): 76–85. doi:10.1111/medu.13645.
