
Prospective, Randomized Trial Comparing Simulator-based versus Traditional Teaching of Direct Ophthalmoscopy for Medical Students

GRANT L. HOWELL, GERMÁN CHÁVEZ, COLIN A. MCCANNEL, PETER A. QUIROS, SABA AL-HASHIMI, FEI YU,
SIMON FUNG, CHRISTOPHER M. DEGIORGIO, YUE MING HUANG, BRADLEY R. STRAATSMA,
CLARENCE H. BRADDOCK, AND GARY N. HOLLAND

• OBJECTIVE: To compare results of simulator-based vs traditional training of medical students in direct ophthalmoscopy.

• DESIGN: Randomized controlled trial.

• METHODS: First-year medical student volunteers completed 1 hour of didactic instruction regarding direct ophthalmoscopes, fundus anatomy, and signs of disease. Students were randomized to an additional hour of training on a direct ophthalmoscope simulator (n = 17) or supervised practice examining classmates (traditional method, n = 16). After 1 week of independent student practice using assigned training methods, masked ophthalmologist observers assessed student ophthalmoscopy skills (technique, efficiency, and global performance) during examination of 5 patient volunteers, using 5-point Likert scales. Students recorded findings and lesion location for each patient. Two masked ophthalmologists graded answer sheets independently using 3-point scales. Students completed surveys before randomization and after assessments. Training groups were compared for grades, observer- and patient-assigned scores, and survey responses.

• RESULTS: The simulator group reported longer practice times than the traditional group (P = .002). Observers assigned higher technique scores to the simulator group after adjustment for practice time (P = .034). Combined grades (maximum points = 20) were higher for the simulator group (median: 5.0, range: 0.0-11.0) than for the traditional group (median: 4.0, range: 0.0-9.0), although the difference was not significant. The simulator group was less likely to mistake the location of a macular scar in 1 patient (odds ratio: 0.28, 95% confidence interval: 0.056-1.35, P = .013).

• CONCLUSIONS: Direct ophthalmoscopy is difficult, regardless of training technique, but simulator-based training has apparent advantages, including improved technique, the ability to localize fundus lesions, and a fostering of interest in learning ophthalmoscopy, reflected by increased practice time. (Am J Ophthalmol 2022;238:187–196. © 2021 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY-NC-ND license, http://creativecommons.org/licenses/by-nc-nd/4.0/.)

Supplemental Material available at AJO.com. Accepted for publication November 9, 2021. https://doi.org/10.1016/j.ajo.2021.11.016
UCLA Stein Eye Institute, David Geffen School of Medicine at UCLA, University of California, Los Angeles; Department of Ophthalmology, David Geffen School of Medicine at UCLA, University of California, Los Angeles; UCLA Simulation Center, David Geffen School of Medicine at UCLA, University of California, Los Angeles; Doheny Eye Institute, Pasadena; Departments of Neurology, Cardiology, and Neurobiology, David Geffen School of Medicine at UCLA, University of California, Los Angeles, California, USA; Department of Anesthesiology and Perioperative Medicine, David Geffen School of Medicine at UCLA, University of California, Los Angeles, California, USA; Office of the Dean, David Geffen School of Medicine at UCLA, University of California, Los Angeles, California, USA; Department of Biostatistics, UCLA Jonathan and Karin Fielding School of Public Health, Los Angeles.
Inquiries to Gary N. Holland, UCLA Stein Eye Institute, 100 Stein Plaza, UCLA, Los Angeles, CA 90095-7000, USA; e-mail: uveitis@jsei.ucla.edu

Direct ophthalmoscopy allows clinicians, regardless of specialty, to visualize fundus findings, including diabetic retinopathy and papilledema, that are critical for the evaluation of patients with systemic disorders. The Association of University Professors of Ophthalmology recommends that all medical students learn to appreciate anatomic landmarks and signs of disease through direct ophthalmoscopy.1 Nevertheless, studies report that medical students, nonophthalmology residents, and practicing physicians lack confidence in their ophthalmoscopy skills.2-4 Both educators and trainees attribute the lack of skill, in part, to inadequate training during medical school.5,6 In a survey of residency program directors in primary care specialties, respondents felt that medical schools were not training students adequately to perform direct ophthalmoscopy once they are in residencies.6

Traditionally, teaching direct ophthalmoscopy skills involves student-on-student (or student-on-patient) practice; however, such instruction involves many challenges that limit its educational value, including an inability to provide students with accurate feedback about what they have seen in the fundi of examinees.7 Studies have evaluated alternative teaching techniques, including use of "teaching" ophthalmoscopes, which provide instructors with images of what students see; fundus photography; and direct ophthalmoscope simulators.8-12 Simulation has been shown to improve surgical skills among residents,13-15 but evidence is limited regarding the value of simulation for teaching direct ophthalmoscopy to students.12 In this article, we describe the results of a randomized controlled trial that compared simulator-based vs traditional teaching of direct ophthalmoscopy to first-year medical students.
METHODS

This prospective, randomized, controlled trial was conducted at the David Geffen School of Medicine at UCLA. First-year medical student volunteers were recruited in May 2019 to participate in the study via e-mail and social media announcements. Students were not eligible for participation in the trial if they had had more than an introductory demonstration of the direct ophthalmoscope or had practiced use of a direct ophthalmoscope independently. Meals were provided at study sessions, and students were given $50 Amazon gift cards on study completion. Five patient volunteers with various fundus findings were recruited from faculty practices to participate in the assessment phase of the study. Patients were given $100 Amazon gift cards for participation. Seven ophthalmologists with experience in medical student instruction volunteered to assess student performance; 5 evaluated students as they examined patients (see Acknowledgments) and 2 (P.A.Q., S.A.-H.) graded answer sheets on which students recorded their findings. Fundus imaging of the study eye of each patient had been performed with photography or optical coherence tomography to assist in grading student answer sheets (Figure 1). All assessments were masked to each student's training group. The study was approved by the UCLA Institutional Review Board before participant recruitment and data collection.

• DATA COLLECTION AND STUDY PROCEDURES: The study was conducted in 3 phases: training, practice, and skills assessment. Students completed a pretraining survey that collected the following information: prior direct ophthalmoscopy experience (yes vs no), prior study of eye disease (yes vs no), and the presence or absence of pre-existing eye conditions (refractive error corrected by glasses or contact lenses, color-blindness, amblyopia). Investigators not involved in training or assessments randomized (F.Y.) and enrolled (G.C.) student volunteers.

Training was conducted in 3 identical sessions on consecutive days. The session in which each student could participate was dictated by other school responsibilities. Training began with a 1-hour didactic session on fundus anatomy, signs of retinal and optic disc disease, and basic principles of direct ophthalmoscopy (ophthalmoscope controls, proper technique, and common mistakes made by students learning to use the ophthalmoscope). Signs of retinal disease included examples similar to those in study eyes of patient volunteers, but were not limited to those signs; examples included papilledema, large optic disc cups, chorioretinal scars, drusen, signs of vasculopathies and ischemia (hemorrhage, cotton-wool spots, exudates), and nevi. Students in each session had been randomized 1:1 into simulator and traditional training groups using block randomization, but assignments for each day were revealed after didactic instruction. All participants on a given day attended the same didactic session, regardless of randomization assignment. Didactic sessions were conducted by the same instructor each day (G.N.H.). Training groups ranged in size from 4 to 8 students.

The simulator group was introduced to the Eyesi Direct Ophthalmoscope Simulator (Model EDO491 #03 × 0127, Platform 2.1, Software v1.8.0.113443; VRmagic GmbH, Mannheim, Germany) during an additional hour by 1 faculty instructor (C.A.M.) and a manufacturer's representative. The simulator contains 4 "modules," each with numerous "cases" that are scored on the operator's ability to complete tasks.16 The first 2 modules focus on finding geometric shapes superimposed on a simulated fundus. The simulator registers whether the operator successfully locates the shape. A cross hair indicating the position of gaze must be held over the shape for 1 second for the simulator to register it as having been "seen." The simulator also records and displays a map of the total fundus area that has been viewed, thereby promoting efficient scanning of the entire posterior pole. Each student in the simulator group used the first module during the training hour. The third and fourth modules display simulated pathologic retinal and optic disc lesions. The fourth (examination) module evaluates the operator's ability to identify lesions and was not available to students during the study. The simulator also addresses some aspects of the direct ophthalmoscopy technique. Some cases require operators to adjust "lenses" within the ophthalmoscope hand piece to obtain a clear view of the simulated fundus; another penalizes operators unless the fundus is viewed through simulation of an undilated pupil.

During the same hour, the traditional training group examined fellow students with their personal direct ophthalmoscopes under the guidance of 2 faculty instructors (S.F., G.N.H.). Pupillary dilation of examined students was optional. After training, all students were encouraged to practice ophthalmoscopy independently, using only their assigned training method (practice on the simulator vs practice on family members, friends, or fellow students).
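
The 1:1 block randomization within each training session described above can be sketched in a few lines of code. This is an illustrative sketch only: the block size of 2 and the student identifiers are assumptions for the example, not details reported in the article.

```python
import random

def block_randomize(student_ids, block_size=2, seed=None):
    """Assign students 1:1 to 'simulator' or 'traditional' arms using
    permuted blocks, so group sizes stay balanced within a session."""
    rng = random.Random(seed)
    arms = ["simulator", "traditional"]
    assignments = {}
    for start in range(0, len(student_ids), block_size):
        block = student_ids[start:start + block_size]
        # Each block contains an equal number of each arm, in random order.
        labels = arms * (block_size // 2)
        rng.shuffle(labels)
        for sid, arm in zip(block, labels[:len(block)]):
            assignments[sid] = arm
    return assignments

# Hypothetical session of 6 volunteers (training groups ranged from 4 to 8).
print(block_randomize([f"S{i}" for i in range(1, 7)], block_size=2, seed=42))
```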



FIGURE 1. Clinical images illustrating pertinent findings in the fundi of 4 patient volunteers who were examined by first-year
medical students randomized to simulator-based vs traditional training of direct ophthalmoscopy. A. Patient 1: enlarged optic disc
cup, left eye. B. Patient 2: toxoplasmic retinochoroidal scar inferotemporal to the fovea, right eye. C. Patient 4: red-free image of a
large macular disciform scar, left eye. D. Patient 5: age-related macular degeneration with central pigmentary changes and drusen,
left eye. An additional patient volunteer (patient 3) had no retinal or optic disc lesions and was considered to have a normal fundus.

All students were assessed at a single session, 4 days after the final training day. Students were placed in random order to begin assessments. Patient volunteers were placed in 5 examination lanes. Study eyes were dilated with tropicamide 1% and phenylephrine 2.5%. Before examinations, students were oriented to an answer sheet on which they were to record (1) abnormal findings in the posterior pole and (2) location of those findings on fundus diagrams printed on the answer sheet. Students were asked to describe lesion characteristics (size, color, and shape) in their own words. Diagnoses were not required. Two additional options were "normal fundus," if they found no abnormalities, and "could not visualize fundus."

Students examined all 5 patient volunteers in the same order; after the first student examined the first patient and moved to the second patient, the second student in line examined the first patient, and so forth until each patient had been examined by every student. A single masked ophthalmologist observer was assigned to each examination lane for the entire assessment period. On entry into the examination lane, students were told which eye was to be examined, but no further instructions were given. Students were told not to reveal their training technique to observers or patients, and patients had been told not to reveal their eye problems to students. Both observers and patients were told not to provide students with feedback during the examination. Students had 1 minute to perform each examination, during which observers assigned them 3 scores (technique, efficiency, and global, as described below). After each examination, students exited the examination lane and had 1 minute to complete answer sheets before moving to the next patient. During the same minute, patients assigned a single global score for student performance, as described below. Students were asked not to discuss their answers with other students until all examinations were completed and answer sheets submitted. At the end of the assessment session, observers confirmed that no students had revealed their training techniques.



After examining the fifth patient and submitting the answer sheet, each student completed a standard postassessment survey. The following information was collected: independent practice time (none, <30 minutes, 30-59 minutes, 1-3 hours, >3 hours), perceived advantages and disadvantages of the training method to which the student was assigned (open-field question), and confidence in direct ophthalmoscopy (5 categories ranging from "not at all comfortable" to "very comfortable").

The graders (S.A.-H., P.A.Q.) reviewed all examination answer sheets and independently assigned 1 of 3 grades (0, 1, or 2 points) for answers to the 2 questions for each study eye (location, description of findings); grading criteria are described below. After grading, they met to adjudicate discrepant results and achieve a single, final grade for each question. The graders had had no contact with students during the course of the study, and students' training assignments were not listed on the answer sheets.

• CONVENTIONS AND DEFINITIONS: Rubrics were created prospectively to guide evaluators (ophthalmologist observers, patient volunteers) in assigning scores using 5-point Likert scales (Table 1). The global score assigned by an ophthalmologist observer was based on a gestalt of the student's overall performance. The single score assigned by patients was based on their perceptions of a student's performance in 3 areas: professionalism, technique, and confidence. Ophthalmologist graders were also provided with guidelines for assigning grades to answer sheets (Table 1).

• DATA ANALYSIS AND STATISTICAL TECHNIQUES: Primary outcome measures were examination grades and ophthalmologist-observer scores. Secondary outcomes included patient-reported scores, student-reported practice time, and perceived advantages/disadvantages of each training method. Pretraining survey results were compared between training groups to identify any differences despite randomization. Each component of the examination grades (location, clinical findings) and ophthalmologist-observer scores (technique, efficiency, global) was compared between groups for each patient individually. The sum of each student's 3 observer scores for all 5 patients and the sum of each student's 2 examination grades for all 5 patients were compared between groups. Postassessment survey results were compared between groups.

Statistical analysis was performed using SAS software version 9.3 (SAS Inc, Cary, North Carolina, USA). The Fisher exact test was used to calculate differences in pretraining and postassessment survey results and to compare proportions of specific grades assigned to individual patients. Summary grades and scores were compared using the Kruskal-Wallis test. Group differences after adjusting for practice time were calculated using multivariable linear regression models. P values of <.05 were considered statistically significant. We evaluated agreement between grades assigned by the ophthalmologist graders using a simple kappa statistic. Values of <0.4 were considered to indicate no or mild agreement, 0.41-0.6 moderate agreement, and >0.6 good or excellent agreement.
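
As a rough illustration of the comparisons described in this subsection, the sketch below reproduces the general workflow (Kruskal-Wallis test for summary grades, Fisher exact test for categorical responses, and a linear model adjusting for practice time) in Python with SciPy and statsmodels. The study itself used SAS 9.3; the data frame, column names, and values here are hypothetical placeholders, not study data.

```python
import pandas as pd
from scipy.stats import kruskal, fisher_exact
import statsmodels.formula.api as smf

# Hypothetical per-student records (the real analysis used SAS 9.3).
df = pd.DataFrame({
    "group": ["simulator"] * 4 + ["traditional"] * 4,
    "total_grade": [5, 7, 3, 6, 4, 2, 5, 3],      # combined grade, out of 20 points
    "practice_ge_1h": [1, 1, 1, 0, 0, 0, 1, 0],   # practice time >=1 hour vs <1 hour
})

# Kruskal-Wallis test comparing summary grades between groups.
sim = df.loc[df.group == "simulator", "total_grade"]
trad = df.loc[df.group == "traditional", "total_grade"]
print(kruskal(sim, trad))

# Fisher exact test for a dichotomous survey response (hypothetical 2x2 counts).
print(fisher_exact([[14, 3], [4, 12]]))

# Linear regression adjusting the group comparison for practice time.
model = smf.ols("total_grade ~ C(group) + practice_ge_1h", data=df).fit()
print(model.summary().tables[1])
```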
RESULTS

The study adhered to guidelines of the International Network for Simulation-based Pediatric Innovation, Research, and Education.17 A total of 39 first-year medical students volunteered to participate and were randomized to simulator (n = 20) or traditional (n = 19) training groups (Supplemental Figure). Before training and announcement of randomization assignments, 4 students (1 in the simulator group; 3 in the traditional group) decided not to participate. An additional 2 students in the simulator group did not attend the assessment session because of conflicts with other student activities. Table 2 describes characteristics of the 33 students (17 in the simulator group; 16 in the traditional group) who completed training, assessment, and surveys. No significant differences were found between the groups. Only 1 student noted limited prior exposure to direct ophthalmoscopy. Among 17 students in the simulator group, 16 (94.1%) accessed the third module with simulated pathologic fundus lesions while practicing; 7 completed the entire module.

Table 3 summarizes grades assigned to students in each group. Grades for both groups were low overall, but the simulator group had higher grades than the traditional group for all measures, although the differences were small and not statistically significant. In addition, differences were not significant when adjusted for practice time. The highest total grade was 11 of 20 (55%) points, assigned to a student in the simulator group.

When grades were considered on an individual patient basis, the simulator group had significantly higher location and clinical finding grades for patient 4, who had a large disciform scar (Table 4; Figure 1, C). Differences remained significant when adjusted for practice time. A mistake common to both groups was to label the scar as a large pale optic disc; however, simulator-trained students were less likely to mistake the location of the lesion (odds ratio: 0.28, 95% confidence interval: 0.056-1.35, P = .013).

When all 330 grades assigned to students (33 students × 5 patients per student × 2 questions about each patient) were compared, there was good agreement between the 2 graders (270 [81.8%] of the 330 assigned grades were identical between graders; simple kappa estimate 0.613 [95% confidence interval: 0.532-0.694]). Agreement was similar when comparisons were made between grades for location only and for findings only (data not shown).



TABLE 1. Prospective Rubrics to Guide Evaluators in Assigning Scores and Grades to Medical Student Participants During a
Randomized Controlled Trial Comparing Simulator-based vs Traditional Training Techniques for Direct Ophthalmoscopy.

Score or Grade Factors Related to Student Performance That Were Appropriate for Assigning Specific Scores and Grades

For ophthalmologist observersa , b


Technique
Score 1 (poor) Fumbles with the direct ophthalmoscope
Positions incorrectly (too far from the eye, wrong angle to the patient, uses the left eye to examine the patient’s
right eye or vice versa)
Has difficulty keeping light directed into the eye
Constantly shifts position
Unable to find the optic disc
Score 5 (excellent) Holds direct ophthalmoscope correctly
Properly distances from the patient
Initially positions 15 degrees to the patient
Remains stable (ophthalmoscope against the brow, hand on shoulder or forehead, and does not lose alignment)
Scans the macula correctly
Efficiency
Score 1 (poor) Is hesitant during the entire examination
Needs to “start over” many times
Does not finish the examination
Score 5 (excellent) Displays confidence
Proceeds methodically throughout the examination
Finishes the examination in the allotted time
For patient volunteersa , c
Score 1 (poor) Professionalism characterized by roughness and heavy-handling, poor communication, and use of an
uncomfortable level of light
Technique characterized by fumbling with the ophthalmoscope, inability to keep the light directed into the eye,
and constant shifting of position
Lack of confidence characterized by hesitancy and appearance of being unsure of examination procedures
Failure to complete the examination
Score 5 (excellent) Professionalism characterized by a gentle touch, demonstrated consideration of the patient’s comfort, and
awareness of the patient’s tolerance of light intensity
Technique characterized by being methodical, steady, and efficient
Confidence characterized by appearance of knowing correct examination procedures
Finished without being told to stop
For ophthalmologist gradersd
Location
Grade 0 No response or incorrect fundus region
Grade 1 A generally correct region of the fundus, but no detail regarding the relationship of lesions to fundus landmarks
Grade 2 Provides detailed information regarding correct lesion location
Clinical findings
Grade 0 Student could not visualize the fundus; or
Noted a “normal” fundus despite the presence of abnormalities; or
Listed incorrect findings
Grade 1 Answers described a lesion in general terms, but with no specific details (eg, “white spot” or “weird disc”)
Grade 2e A lesion was described accurately with greater detail than for grade 1 (eg, “white scar with surrounding dark
pigment” or “increased cup-to-disc ratio”); or
A nonspecific description was provided with correct diagnosis (eg, “lesion consistent with ocular
toxoplasmosis”)
a Ophthalmologist observers and patient volunteers assigned scores using Likert scales with values from 1 through 5; for all scales, scores were defined as follows, for the factor being evaluated: 1 = poor performance; 2 = below average performance; 3 = average performance (neither good nor bad); 4 = above average performance; and 5 = excellent performance.
b Guidelines were distributed to ophthalmologist observers only for scores 1 and 5. Scores could be assigned for any of the factors listed, a combination of those factors, or similar factors that, in the opinion of the evaluator, warranted a given score.
c A single score was based on the patient volunteer's perception of a student's performance in 3 areas: professionalism, technique, and confidence.
d Ophthalmologist graders assigned separate grades of 0, 1, or 2 to students' written description of lesion location and to students' written description of the characteristics of the fundus lesions that they observed, guided by the listed factors.
e A grade of 2 points was also assigned to students who correctly identified the normal fundus of patient volunteer 3.



TABLE 2. Characteristics of 33 First-Year Medical Student Participants in a Randomized Controlled Trial Comparing Simulator-based
vs Traditional Training Techniques for Direct Ophthalmoscopy.

Characteristic Simulator Group (n = 17) Traditional Group (n = 16) P Valuea

Prior ophthalmoscopy experience, n (%) .485


No 17 (100) 15 (93.75)
Yes 0 1 (6.25)
Prior study of eye disease, n (%) .656
No 15 (88.24) 13 (81.25)
Yes 2 (11.76) 3 (18.75)
Pre-existing eye condition, n (%) .728
None 8 (47.06) 6 (37.50)
Routine use of glassesb 5 (29.41) 3 (18.75)
Routine use of contactsb 4 (23.53) 5 (31.25)
Otherc 0 2 (12.50)
a Fisher exact test.
b Students were asked to identify the presence of refractive errors and the correction they would wear during direct ophthalmoscopy assessment.
c One participant with strabismus since childhood, with best corrected visual acuity less than 20/20, both eyes; another participant alternated between glasses and contact lenses.

TABLE 3. Summary Grades Assigned by Masked Ophthalmologist Graders to 33 First-Year Medical Student Participants in a
Randomized Controlled Trial of Training Techniques for Learning Direct Ophthalmoscopy, Based on Examination of 5 Patient
Volunteers.

Grade Simulator Group (n = 17) Traditional Group (n = 16) P Valuea Adjusted P Valueb

Locationc .768 .692


Mean ± SD 2.5 ± 1.7 2.2 ± 1.2
Median (range) 2.0 (0.0-6.0) 2.0 (0.0-4.0)
IQR 1.0, 3.0 2.0, 3.0
Clinical findingsc .386 .407
Mean ± SD 2.9 ± 1.7 2.4 ± 1.3
Median (range) 3.0 (0.0-5.0) 2.0 (0.0-5.0)
IQR 2.0, 4.0 2.0, 3.0
Totald .513 .524
Mean ± SD 5.4 ± 3.3 4.6 ± 2.4
Median (range) 5.0 (0.0-11.0) 4.0 (0.0-9.0)
IQR 3.0, 7.0 4.0, 6.0

IQR = interquartile range, SD = standard deviation.
a Kruskal-Wallis test.
b Multivariable linear regression models, adjusting for practice time (≥1 hour vs <1 hour).
c A grade of 0, 1, or 2 was assigned to each student for each patient volunteer examined (maximum grade = 10 points).
d A combination of Location and Clinical Findings grades (maximum grade = 20 points).
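
The maximum grades in Table 3 follow directly from the grading scheme: each of the 5 patient examinations contributes up to 2 points for location and 2 points for clinical findings, giving component maxima of 10 and a combined maximum of 20, while the observer scores summarized later (Table 5) sum five 5-point Likert ratings to a maximum of 25. A minimal sketch of that aggregation, using made-up per-patient values, is shown below.

```python
# Hypothetical per-patient results for one student (5 patient volunteers).
location_grades = [2, 1, 0, 2, 1]   # 0-2 points per patient -> max 10
finding_grades = [1, 1, 0, 2, 2]    # 0-2 points per patient -> max 10
technique_scores = [4, 3, 3, 5, 4]  # 1-5 Likert per patient -> max 25

total_grade = sum(location_grades) + sum(finding_grades)  # combined grade, out of 20
technique_total = sum(technique_scores)                   # observer score, out of 25
print(total_grade, technique_total)  # -> 12 19
```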

Table 5 shows ophthalmologist-observer and patient-volunteer scores assigned to students. Observer scores assigned to simulator-trained students were consistently higher than scores assigned to traditionally trained students for technique (P = .011), efficiency (P = .007), and global performance (P = .004). Technique (P = .034) and global performance (P = .025) scores remained statistically significant after adjustment for practice time. Patients assigned higher scores to students in the simulator group than to those in the traditional group, although the difference was not statistically significant (P = .178). Scores did not improve in either group as students progressed from patient to patient (data not included).

Table 6 summarizes student responses in the postassessment survey. The simulator group reported longer independent practice time than the traditional group (P = .002). With respect to students' perceptions of their assigned training method vs their understanding of the alternative technique, the number of students within each group who felt that their method was strictly advantageous was similar to the number who felt it was strictly disadvantageous; however, substantially more students in the simulator group specifically felt that there were both advantages and disadvantages associated with simulator training (9 students [52.94%] vs 2 students [12.5%] in the traditional group, P = .026, Fisher exact test).



TABLE 4. Proportions of 33 First-Year Medical Student Participants Assigned Each Grade Level During Examination of a Patient With
a Macular Disciform Scar in a Randomized Controlled Trial of Training Techniques for Learning Direct Ophthalmoscopy, With
Comparison Between Training Groups.

Gradea Simulator Group (n = 17) Traditional Group (n = 16) P Valueb Adjusted P Valuec

Location, n (%) .013 .027


0 5 (29.41) 13 (81.25)
1 7 (41.18) 2 (12.50)
2 5 (29.41) 1 (6.25)
Clinical findings, n (%) .018 .044
0 5 (29.41) 12 (75.00)
1 4 (23.53) 3 (18.75)
2 8 (47.06) 1 (6.25)
a A grade of 0, 1, or 2 was assigned to each student for location and for clinical findings.
b Fisher exact test.
c Multivariable logistic regression models, adjusting for practice time (≥1 hour vs <1 hour).
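
To make the comparison in Table 4 concrete, the sketch below runs a Fisher exact test on the location grades after collapsing them to a 2 × 2 table (grade 0 vs grade ≥1), using the counts shown above. The dichotomization is ours for illustration; the odds ratio of 0.28 reported in the text came from the study's own analysis and is not expected to match the crude value computed here.

```python
from scipy.stats import fisher_exact

# Location grades for patient 4 (counts taken from Table 4).
simulator = {"grade0": 5, "grade1_or_2": 7 + 5}      # n = 17
traditional = {"grade0": 13, "grade1_or_2": 2 + 1}   # n = 16

table = [
    [simulator["grade0"], simulator["grade1_or_2"]],
    [traditional["grade0"], traditional["grade1_or_2"]],
]
odds_ratio, p_value = fisher_exact(table)
print(f"crude odds ratio = {odds_ratio:.3f}, P = {p_value:.4f}")
```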

TABLE 5. Summary Scores Assigned by Masked Ophthalmologist Observers to 33 First-Year Medical Student Participants, Based
on Examination of 5 Patient Volunteers, and a Summary Score Assigned to the Students by Those Patient Volunteers in a
Randomized Controlled Trial Comparing Simulator-based vs Traditional Training of Direct Ophthalmoscopy.

Scoresa Simulator Group (n = 16)b Traditional Group (n = 16) P Valuec Adjusted P Valued

Ophthalmologist-observer scores
Technique .011 .034
Mean ± SD 19.1 ± 2.0 16.4 ± 2.8
Median (range) 19.0 (16.0-23.0) 17.0 (11.0-21.0)
IQR 17.5, 21.0 14.0, 18.0
Efficiency .007 .066
Mean ± SD 19.8 ± 2.1 17.1 ± 3.0
Median (range) 20.0 (16.0-22.0) 17.5 (12.5-23.0)
IQR 18.5, 22.0 15.0, 18.5
Global performance .004 .025
Mean ± SD 19.8 ± 1.8 17.2 ± 2.4
Median (range) 20.0 (17.0-23.0) 18.0 (12.5-22.0)
IQR 18.3, 21.0 15.8, 18.8
Patient-volunteer score
Global performance .178 .207
Mean ± SD 21.7 ± 1.3 20.9 ± 1.8
Median (range) 22.0 (19.0-24.0) 21.0 (18.0-24.0)
IQR 21.0, 23.0 19.0, 22.0

IQR = interquartile range, SD = standard deviation.
a Each score based on a 5-point Likert scale. Maximum possible points assigned to each student for each score was 25 (5 points for each of 5 patient volunteers examined).
b One participant did not receive a score from an ophthalmologist observer and was not included in analysis.
c Kruskal-Wallis test.
d Multivariable linear regression models, adjusting for practice time (≥1 hour vs <1 hour).

On the basis of open-field responses, major themes among stated advantages of simulator training by students in the simulator group were the ability to visualize simulated fundus lesions (10 [76.9%] of 13 respondents) and the ability to practice without bothering a patient or classmate (4 respondents [30.8%]). Students in the traditional group viewed the lack of these factors as disadvantages while practicing on students with normal fundi or patients who might be bothered by extended examinations. Major themes among stated disadvantages of simulator training by students in the simulator group were the unrealistic training environment (5 [38.4%] of 13 respondents), lack of experience in positioning relative to, and interacting with, a live person (3 respondents [23.1%]), and unfamiliarity with actual direct ophthalmoscope controls before examination of patients (8 respondents [61.5%]). Conversely, students in the traditional group considered working with live individuals, using a real ophthalmoscope, to be an advantage. Lack of real-time feedback was viewed as a disadvantage by 1 student in the traditional group.



TABLE 6. Results of a Postassessment Survey of 33 First-Year Medical Student Participants in a Randomized Controlled Trial
Comparing Simulator-based vs Traditional Training of Direct Ophthalmoscopy.

Survey Responses Simulator Group (n = 17) Traditional Group (n = 16) P Valuea

Self-reported estimate of independent practice timeb , n (%) .002


None 1 (5.88) 1 (6.25)
<30 min 0 3 (18.75)
30-59 min 2 (11.76) 8 (50.00)
1-3 h 14 (82.35) 4 (25.00)
Perception of assigned training techniquec , n (%) .128
Advantageous 4 (23.53) 6 (37.50)
Disadvantageous 4 (23.53) 7 (43.75)
Neither an advantage nor a disadvantage 0 1 (6.25)
Both advantages and disadvantages 9 (52.94) 2 (12.50)
Confidence in ability to perform direct ophthalmoscopy, n (%) .516
Not at all comfortable 1 (5.88) 2 (12.50)
Somewhat uncomfortable 5 (29.41) 8 (50.00)
Uncertain 5 (29.41) 2 (12.50)
Somewhat comfortable 6 (35.29) 4 (25.00)
Very comfortable 0 0
a Fisher exact test.
b No student reported practice time more than 3 hours.
c The survey included an open-field option to provide further explanation of perceptions.

With regard to comfort in performing direct ophthalmoscopy during assessments, no student in either training group felt "very comfortable." When training groups were compared, more students in the simulator group felt "somewhat comfortable," whereas more students in the traditional group felt "somewhat uncomfortable" or "not at all comfortable," although the differences were not statistically significant.

DISCUSSION

Direct ophthalmoscopy is a difficult skill for many medical students to learn. This study sought to determine the effect of a direct ophthalmoscope simulator on the early stages of learning ophthalmoscopy. We identified better performance among students in the simulator-trained group on 3 measures: independent practice time, technique (ability to handle a direct ophthalmoscope and conduct an examination), and, based on the results of 1 patient examination, the ability to locate fundus lesions accurately.

Longer practice time among students in the simulator group is likely attributable to the novelty of the device and the fact that students did not have to depend on others to practice. The exercises are engaging, and the simulator provides immediate feedback. One can hypothesize that longer practice time will ultimately result in better skills, but practice time alone did not explain the better technique scores or ability to localize lesions in the simulator group.

With regard to technique, handling of the direct ophthalmoscope was taught to both groups in the didactic session, but proper technique is reinforced by the simulator; for example, correct positioning of the hand piece is required to view the simulated fundus. By mapping the portion of the fundus viewed by the operator, the simulator reinforced didactic instructions about how to scan the posterior pole efficiently, which might explain why the simulator group was better able to identify the location of the lesion in 1 patient's eye.

In a computerized literature search using PubMed, we found only 1 previous study that investigated the effect of training on the Eyesi Direct Ophthalmoscope Simulator. Boden and associates12 randomized 34 German medical students to "classic" and "simulator" training groups during an ophthalmology rotation. All students received the same 5-minute introduction, after which the classic group practiced on dilated students and looked at pictures of fundus lesions, whereas the simulator group looked at representations of fundus lesions in the simulator. After the 45-minute training session, a course instructor assessed each student using a 23-point scale based on a combination of examination milestones (ability to see various fundus landmarks) and examination procedures not specifically related to the ophthalmoscope itself. During the assessment, students used either a direct ophthalmoscope or the simulator, according to each student's randomization group. Examination milestones were self-reported by students in the classic group and reported objectively by the simulator for students who used it. On the basis of the assessment scale, the reported "learning success" for the simulator group (91%) was higher than for the classic group (78%, statistical comparison not reported). A student survey about the learning experience yielded favorable responses, with no statistical differences between the 2 groups. Students in the simulator group also responded positively to 3 additional questions about the simulator.

In our study, the overall ability of students to perform direct ophthalmoscopy on patients was poor in both groups, with the majority in both groups reporting low confidence in ophthalmoscopy skills. Lack of confidence during ophthalmoscopy has been described previously at all levels of medical training.2-4 Our purpose was not to achieve high levels of proficiency in direct ophthalmoscopy among students, but to identify any influence of simulator training during the early phase of learning. Students in the traditional group most likely practiced on people whose pupils were not dilated, whereas students in the simulator group had the option of looking through simulations of either dilated or undilated pupils. Although looking through undilated pupils requires examiners to position carefully, we believe that, during early training, it is easier for students to develop a proper direct ophthalmoscope technique and procedures (such as scanning of the macula) while examining eyes with dilated pupils. This distinction would favor practice on the simulator.

There are many advantages of using simulation in medical training in general.18 Simulation allows the learner to develop proper motor skills before performing those skills on patients. Simulation facilitates learning through repetition with real-time feedback and objective tracking of performance. Independent practice, spaced out over time, results in more permanent memory encoding of motor skills; however, operators receive no feedback from human examinees, and simulation does not fully duplicate the positioning of clinicians in relation to patients' bodies. With regard to student perceptions, those in the simulator group recognized both advantages and disadvantages of simulator training. The most commonly reported disadvantage was lack of exposure to a real ophthalmoscope before use on a patient. Thus, simulation is likely most appropriate as a complement to traditional training, not as a replacement. We propose that learning is best achieved by a combination of didactic instruction and initial supervised practice, both on people and on a simulator, followed by independent practice and refinement of skills with the simulator. Initial supervision can reinforce techniques and clinician-patient interactions that are not specifically addressed by the simulator.

Strengths of our study include the use of masked assessments and the fact that all students underwent the same assessment. Evaluation of performance on real patients helps to determine the transferability of skills from the simulator to people. Limitations of our study include use of a nonvalidated scoring system and lack of a reference standard against which medical student skills could be compared (eg, performance of ophthalmology residents on the same assessment). The use of open-field questions required grader interpretation of student responses, which might limit the accuracy with which skills were assessed; we addressed this limitation by comparing grades assigned by 2 graders and adjudicating discrepancies. Despite this potential problem, we chose not to use multiple-choice answer sheets, to reduce the risk of guessing by students. The power of the study to detect differences was limited by the relatively small number of student and patient volunteers for the trial. It is unlikely, however, that patients would have tolerated a substantially larger number of examinations. We did not perform a statistical correction for numerous comparisons during analyses, as we considered our study to be exploratory, looking for various ways that simulation influences the learning of ophthalmoscopy. Although we may have identified false relationships because of the many questions asked, a correction for numerous comparisons might also have prevented us from identifying factors that should be investigated in larger, future studies. We sensed that many students were overwhelmed by the assessment experience, which may have led to poor performance. The long-term effect of simulation-based training on skills is unknown, but will be the subject of a future study.

In summary, simulation appears to enhance early teaching of direct ophthalmoscopy. Students who practiced on the simulator demonstrated a better technique using a direct ophthalmoscope during patient examinations. Simulation aims to train operators to scan the fundus completely and locate lesions accurately. The simulator also appeared to motivate students to practice longer, but duration of practice did not account for other training benefits in the timeframe of this study. We propose that simulation will likely be most useful as a complement to traditional, supervised training methods.



Acknowledgments: Marshall Dial, North American representative of VRmagic GmbH, helped to instruct medical students who had been randomized to
the simulator group about how to operate the simulator; he had no further role in the conduct of the randomized controlled trial or in the analysis and reporting of
data. The following ophthalmologists from the UCLA Department of Ophthalmology served as observers during student examinations of patients: JoAnn
A. Giaconi, MD, Lynn K. Gordon, MD, PhD, Ralph D. Levinson, MD, Tania Onclinx, MD, and Sandip Suresh, MD.
All authors have completed and submitted the ICMJE form for disclosure of potential conflicts of interest.
Funding/Support: David Geffen School of Medicine at UCLA Office of the Vice Dean for Education (C.H.B.); unrestricted funds from the UCLA Stein
Eye Institute (B.R.S.); the Skirball Foundation (New York, NY) (G.N.H.); and Research to Prevent Blindness, Inc (New York, NY) through an unrestricted
grant to the UCLA Stein Eye Institute for research. Funding organizations had no role in the design or conduct of the study.
Financial Disclosures: C.A.M. is an unpaid consultant to VRmagic GmbH (Mannheim, Germany), manufacturer of the Eyesi Direct Ophthalmoscope
Simulator; VRmagic GmbH had no role in the design or conduct of this research. The other authors have no conflict of interest to disclose. All authors
attest that they meet the current ICMJE criteria for authorship.

REFERENCES

1. Graubart EB, Waxman EL, Forster SH, et al. Ophthalmology objectives for medical students: revisiting what every graduating medical student should know. Ophthalmology. 2018;125:1842–1843.
2. Gupta RR, Lam WC. Medical students' self-confidence in performing direct ophthalmoscopy in clinical training. Can J Ophthalmol. 2006;41:169–174.
3. Mottow-Lippa L. Ophthalmology in the medical school curriculum: reestablishing our value and effecting change. Ophthalmology. 2009;116:1235–1236, 1236.e1.
4. Wu EH, Fagan MJ, Reinert SE, Diaz JA. Self-confidence in and perceived utility of the physical examination: a comparison of medical students, residents, and faculty internists. J Gen Intern Med. 2007;22:1725–1730.
5. Shuttleworth GN, Marsh GW. How effective is undergraduate and postgraduate teaching in ophthalmology? Eye (Lond). 1997;11(Pt 5):744–750.
6. Stern GA. Teaching ophthalmology to primary care physicians. The Association of University Professors of Ophthalmology Education Committee. Arch Ophthalmol. 1995;113:722–724.
7. Mackay DD, Garza PS, Bruce BB, Newman NJ, Biousse V. The demise of direct ophthalmoscopy: a modern clinical challenge. Neurol Clin Pract. 2015;5:150–157.
8. Kelly LP, Garza PS, Bruce BB, Graubart EB, Newman NJ, Biousse V. Teaching ophthalmoscopy to medical students (the TOTeMS study). Am J Ophthalmol. 2013;156:1056–1061.e10.
9. Akaishi Y, Otaki J, Takahashi O, et al. Validity of direct ophthalmoscopy skill evaluation with ocular fundus examination simulators. Can J Ophthalmol. 2014;49:377–381.
10. Ricci LH, Ferraz CA. Ophthalmoscopy simulation: advances in training and practice for medical students and young ophthalmologists. Adv Med Educ Pract. 2017;8:435–439.
11. de Souza PHL, Nunes GMN, de Alencar Bastos JMG, Fonseca TM, Cortizo V. A new model for teaching ophthalmoscopy to medical students. MedEdPublish. 2017;28. doi:10.15694/mep.2017.000197.
12. Boden KT, Rickmann A, Fries FN, et al. [Evaluation of a virtual reality simulator for learning direct ophthalmoscopy in student teaching]. Ophthalmologe. 2020;117:44–49 [in German].
13. Ferris JD, Donachie PH, Johnston RL, Barnes B, Olaitan M, Sparrow JM. Royal College of Ophthalmologists' National Ophthalmology Database study of cataract surgery: report 6. The impact of EyeSi virtual reality training on complications rates of cataract surgery performed by first and second year trainees. Br J Ophthalmol. 2020;104:324–329.
14. Jacobsen MF, Konge L, Bach-Holm D, et al. Correlation of virtual reality performance with real-life cataract surgery performance. J Cataract Refract Surg. 2019;45:1246–1251.
15. McCannel CA, Reed DC, Goldman DR. Ophthalmic surgery simulator training improves resident performance of capsulorhexis in the operating room. Ophthalmology. 2013;120:2456–2461.
16. Borgersen NJ, Skou Thomsen AS, Konge L, Sørensen TL, Subhi Y. Virtual reality-based proficiency test in direct ophthalmoscopy. Acta Ophthalmol. 2018;96:e259–e261.
17. Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Simul Healthc. 2016;11:238–248.
18. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10–28.

