
Workplace-based Assessment (WPBA)
Yohannes Molla
Sr. Technical Advisor
HWIP/Jhpiego

March 14, 2023


Outline
Introduction
Definition
Purpose
Common types
Psychometry of WPBA
Practical uses of WPBA in anesthesia program
Introduction
 Support learning
 Decide readiness
 Assessment in the workplace setting is the closest approximation of actual performance
 But:
o How appropriate are our pass/fail decisions?
o How confident/happy are you with the overall decisions?
 Lesson: Curriculum goals ↔ Assessment + Fairness
Introduction…
 Rich experience exists in using written exams, OSCEs, and short/long cases
 OSCEs (with real or simulated patients) happen in a controlled setting
[Figure: Miller's pyramid of clinical competence (Miller, 1990)]
 Concern that trainees are seldom observed, assessed, and given feedback during their workplace-based education (Norcini et al., 2007)
Objectives
 Discuss the purposes of WPBA
 Describe commonly used WPBA tools
 Discuss psychometrics of WPBA
 Share practical ways to implement WPBA
Definition
 Workplace-based assessment (WPBA) refers to a group of assessment modalities that assess learners' performance in actual clinical settings
 Alongside other assessments!
 Hallmarks of WPBA:
o Performance observation
o Real workplace environment
o Standardized rubrics
o Feedback
 A few points about key principles…
Definition…
 Employing 'good' assessment method(s) depends on specifying:
o Purpose (Why): formative vs summative (Norcini et al., 2018)
o Construct (What = framework): analytic, synthetic, developmental (Pangaro & ten Cate, AMEE Guide No. 78)
Definition…
 Analytic vs Synthetic
Definition…
 Synthetic
[Figure: synthetic framework — Task, Task Management, Contingency Management, Environment]
Purposes of WPBA
 Quite simply, you could watch a learner with a patient, say it was good or bad, and move on
 But the validity and reliability of a single encounter (whether satisfactory or not) are very low (Burch et al., 2019)
 The use of rubrics increased error detection from 32% to 64% (Noel et al., 1992)
 Fairness is another concern
 This has increased interest in a variety of standardized assessment methods built on observation and feedback
Purposes…
 Assess a wide range of skills in an authentic setting – a broader, richer, more realistic, and more complex set of challenges
 Directly observe a trainee's performance in the actual setting
 Ask colleagues about the trainee's performance
 Ask patients about the trainee's care
 Ask other HCPs about the trainee's performance
 Discuss performance with a trainee using authentic data
 Provide feedback
 A more direct impact on patient safety than other methods
Common types
 Variations exist in their names and practice
 The four basic and common versions are:
o Direct Observation of Procedural Skills (DOPS)
o Mini Clinical Evaluation Exercise (Mini-CEX)
o Case-based Discussion (CbD)
o Multisource Feedback (MSF)
 Different names, BUT all are based on an actual patient encounter with a form of direct observation in the workplace
 More specialized tools exist for particular departments
#01: DOPS
 Aimed at assessing learners when performing a specific practical procedure
 Directly observed and scored by a trained observer
 Generic assessment tool
 E.g.
o Suturing
o Aspiration of an effusion
 Perhaps our short exams?
[Figure: competency wheel — Reporter, Interpreter, Manager, Educator × Task, Task Management, Contingency Management, Environment]
#01: DOPS…
 Usually takes 10–20 minutes
 Trainee is assessed when ready
 Multiple assessors on multiple occasions per trainee, continuously
 Four to six cases are enough
 Mandatory vs optional procedures
 Royal College of General Practitioners (RCGP): 8 mandatory & 11 optional procedures
www.rcgp.org.uk/mrcgp-exams/wpba
Source: www.hcat.nhs.uk
#02: Mini-CEX
 A more holistic judgment made about trainees in general practice
 It includes seven rated items:
o History taking
o Physical examination skills
o Communication skills
o Clinical judgment
o Professionalism
o Organization and efficiency
o Overall clinical care
 Perhaps our long case?
#02: Mini-CEX…
 Direct observation using rubrics
 Multiple assessors on different cases
 Usually takes 20 minutes
 10–14 encounters are needed (4–6 may be acceptable)
 Factors for a successful Mini-CEX:
o Clinical experience
o Familiarity with the trainee's developmental level
o Familiarity with rating descriptors
o Ability to grade case complexity
o Experience in using the mini-CEX
o Training in giving feedback
Source: www.hcat.nhs.uk
#03: CbD
 aka stimulated chart recall
 Assesses the learner's clinical reasoning and decision-making
 Should be based on an actual written record
 Either the learner or the faculty can choose the cases
 Our viva and long case?
#03: CbD…
 Assessment using standard rubrics
 Direct observation
 Four to six cases are enough to provide a reliable and valid assessment
 Beyond CDMs
 Process: choose a case → frame questions → make a judgment → discuss the judgment
Source: www.hcat.nhs.uk
#04: MSF
 aka 360° evaluation
 Anonymous feedback from those around the trainee
 Particularly useful for assessing areas that are difficult to assess with other WPBA tools:
o Communication
o Professionalism
 Perhaps our internship assessment?
#04: MSF…
 Involves multiple raters:
o Self-assessment
o Peer assessment
o Patient assessment
o Other HCPs' assessment
o Faculty assessment
 Compare results with self-assessment
 Use six to ten raters/cases
 Reflection is key to its effectiveness
 Raters' training is important
Source: www.hcat.nhs.uk
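The "compare results with self-assessment" step can be sketched in code. This is a hypothetical illustration: the domain names, the 1–5 rating scale, and the rounding are assumptions, not prescribed by these slides.

```python
# Hypothetical MSF aggregation: average external raters' scores per
# domain and compare with the trainee's self-assessment (1-5 scale).
from statistics import mean

def msf_summary(ratings, self_scores):
    """ratings: {domain: [scores from 6-10 external raters]};
    self_scores: {domain: trainee's own score}.
    Returns {domain: (raters' average, self-minus-raters gap)}."""
    summary = {}
    for domain, scores in ratings.items():
        avg = mean(scores)
        # A positive gap means the trainee rated themselves higher
        # than external raters did - a prompt for reflection.
        summary[domain] = (round(avg, 2), round(self_scores[domain] - avg, 2))
    return summary

ratings = {
    "communication": [4, 3, 4, 5, 4, 3],
    "professionalism": [5, 4, 5, 4, 5, 5],
}
self_scores = {"communication": 5, "professionalism": 4}
print(msf_summary(ratings, self_scores))
```

A real implementation would also anonymize rater identities before feedback, since anonymity is central to MSF.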
Collectively
 DOPS
 Mini-CEX
 CbD
 MSF
 Existing practice can be standardized
 Message: WPBA can assess every dimension
[Figure: the four tools mapped onto the Reporter, Interpreter, Manager, Educator × Task, Task Management, Contingency Management, Environment grid]
Psychometry
 In real life (uncontrolled), there may be good reasons for not demonstrating a behavior at a given time
 Thus, trainees need to be assessed more than once to increase confidence – validity (breadth)
 Reliability increases when there are more assessors across different tasks, rather than more assessors per task (depth)
 This also means richer, triangulated feedback for learners
 There is growing evidence of good triangulation between WPBA and other assessment methods (triangulation)
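The gain in reliability from sampling more encounters can be illustrated with the Spearman–Brown prophecy formula, a standard psychometric result (not cited in these slides). If a single observed encounter has reliability \(R_1\), the average over \(k\) comparable encounters has reliability

\[
R_k = \frac{k\,R_1}{1 + (k-1)\,R_1}
\]

For example, with \(R_1 = 0.3\), ten encounters give \(R_{10} = 3.0/3.7 \approx 0.81\), consistent with the 10–14 encounters suggested earlier for the mini-CEX.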
Psychometry…
 Note that the validity and reliability of WPBA depend to a large degree on the faculty who serve as assessors
 Well-trained and motivated faculty → positive results
 However, if faculty feel that WPBA takes too much time or has been imposed on them, they may undermine its reliability and validity
(Burch et al., 2019)


Challenges of WPBAs
 Massive need for faculty development and change management
Practical experience
 We started with blueprint development in 2014/15
 Suggested cases plus a minimum number of assessments – 10–12 per module
 Mandatory requirement – students must complete a prescribed number of mini-CEXs, DOPS, and CbDs to pass a module
 Faculty were oriented
 Students were briefed and a (paper-based) portfolio was shared
 All faculty share an equal number of WPBAs for each student
 Leading the process is up to the student
 Progress is followed regularly by the coordinator – continuously
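The mandatory-requirement rule above can be sketched as a simple portfolio completeness check. The per-tool minimums below are hypothetical: the slides prescribe 10–12 assessments per module but do not give a per-tool split.

```python
# Hypothetical portfolio check: has a student logged the prescribed
# minimum number of each WPBA type for a module?
REQUIRED = {"mini-CEX": 4, "DOPS": 4, "CbD": 3}  # assumed split

def module_complete(logged):
    """logged: {tool: number of signed-off assessments}.
    Returns (complete?, {tool: shortfall} for any unmet minimum)."""
    missing = {tool: need - logged.get(tool, 0)
               for tool, need in REQUIRED.items()
               if logged.get(tool, 0) < need}
    return (len(missing) == 0, missing)

ok, missing = module_complete({"mini-CEX": 4, "DOPS": 2, "CbD": 3})
print(ok, missing)  # DOPS falls short of the assumed minimum
```

A coordinator could run such a check each semester before signing off the portfolio.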
Practical…
 All 20+ anesthesia departments are implementing WPBAs
 Quality of implementation varies
 Students must submit such a signed portfolio at the end of each semester in order to move on to the next
 WPBAs account for 40%
WPBA in your setting
 A new workplace-based educational strategy is always a challenge
 Have clear reasons for doing so – this lessens resistance to change
 Vanessa Burch and Adrian Freeman recommend six steps:
1. Design a blueprint – what to assess (mandatory vs optional – frame)
2. Map the individual tools – how to assess (standardization)
3. Determine the emphasis and importance of different areas
4. Consider the issue of feasibility – start small and engage others
5. Decide how the evidence will be accumulated and presented
6. Decide the purposes of the assessment
Summary
 When implemented properly, WPBA can help students drive their learning effectively
 WPBA can assess all competency dimensions – ↑↑↑ defensibility of decisions
 WPBA with adequate breadth, depth, and triangulation is valid and reliable
 Introducing WPBA requires systematic planning and faculty commitment
 While not perfect, we have at least started to ensure that anesthesia trainees are observed on core tasks multiple times by multiple faculty members
 A little standardization can facilitate change
References
 Norcini, J., et al., 2018. Consensus framework for good assessment. Medical Teacher, 40(11), pp.1102-1109.
 Norcini, J. and Burch, V., 2007. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher, 29(9-10), pp.855-871.
 Pangaro, L. and ten Cate, O., 2013. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Medical Teacher, 35(6), pp.e1197-e1210.
 Burch, V. and Freeman, A., 2019. Workplace-based assessment. FAIMER Keele Master's in Health Professions Education: Accreditation and Assessment. Module 1, Unit 7. FAIMER Centre for Distance Learning, CenMEDIC, London.
 Norcini, J.J. and McKinley, D.W., 2007. Assessment methods in medical education. Teaching and Teacher Education, 23(3), pp.239-250.
 Herbers, J.E., Noel, G.L., Pangaro, L.N., et al., 1992. How accurate are faculty evaluations of clinical competence? Journal of General Internal Medicine, 4, pp.202-208.
Thank You!

Addis Ababa | Ethiopia
ema@gmail.com
+251 912 34 56 78
ethiopiamedicalassociation.com
