Kayla A. McKean
Abstract
This applied research project focused on Vision of You (VOY) and its alignment with the
instructional design model, ADDIE. VOY is an online sexual health education program
developed by the VPREIS team at James Madison University. Strengths, limitations, and
recommendations for how the creation of the program aligned with the analysis, design,
development, implementation, and evaluation phases were all considered. These observations
indicated a breach from the design phase and quick fixes in the development phase that may have
resulted in program elements less suitable for online learners. Time spent on program lessons was
compared with answers to survey questions related to those topics to gauge how learner engagement
with the program (the time spent in it) may relate to performance on content questions. The
researcher conducted a series of t-tests to compare the means of time spent in three VOY units
and the corresponding post-program survey question answers. Results indicated no statistically
significant relationship between time spent in the unit and survey outcomes, though Cohen’s d
computed for each test showed a small to medium effect size. On average, participants spent less
time in the program units than was intended by the developers, possibly indicating a lack of
engaging content, challenging content appropriate for the audience, or technology issues
within the online program or management system. Future research will consider additional
components of intervention dosage including the number of sessions and duration across time.
Keywords: instructional design, ADDIE, online sexual health education, time spent
Chapter One
James Madison University (JMU) was awarded a Personal Responsibility Education Innovative
Strategies (PREIS) grant in 2016 to implement and evaluate innovative strategies for the prevention
of teen pregnancy and sexually transmitted infections among youth ages 10-19. JMU established the
Virginia PREIS (VPREIS) project team to
innovate the existing Vision of You curriculum, implement the adapted curriculum with
youth, and rigorously evaluate it. The PREIS grant was awarded to IIHHS at JMU through
the Family and Youth Services Bureau (FYSB), a branch of the Administration for
Children & Families under the U.S. Department of Health and Human Services. Title V of
the Social Security Act was amended to include PREIS on March 23, 2010. According to the
Personal Responsibility Education Program Innovative Strategies Fact Sheet published on the
HHS website (2016):
Every PREIS project conducts their own independent evaluation, supported by Federal
training and technical assistance. These rigorous evaluations are designed to meet high
evidentiary standards and are expected to generate lessons learned so that others can benefit
from these strategies and innovative approaches. Projects must carefully document the
intervention for possible replication, including:
Program Delivery: Fidelity to the program model or adaptation of the program model
for the target population;
Program Dosage: The number of youth served and hours of service delivery; and
At the writing of this paper, the VPREIS project is in its fourth year of funding; October of
2020 will begin the fifth year of funding for the project. Tables 1 through 3 detail the timeline
proposed for the project.
Table 1
Work Plan and Timetable – VPREIS Year 1 (JMU, 2015, pp. 37-39)
Project Goal: To reduce pregnancies, births, and STIs including HIV/AIDS among
high-risk youth populations in rural Virginia through the online, interactive VOY
program designed to reduce the frequency of sexual activity, reduce the number of sexual
partners, and increase contraceptive use among youth participants.
Process Objective: VPREIS will have Phase 1 project activities operating within at least
60 days and completed within nine months of receiving the Cooperative Agreement.
Activity, Timeframe, Responsible Staff
PI=Principal Investigator; PD=Project Director; PC=Project Coordinator; APC=Assistant
Project Coordinator; DC=Data Coordinator; CDS=Curriculum Development Specialist;
E=Evaluator; HEDG=Health Education Design Group
Ongoing
Conduct monthly telephone calls with FYSB to consult on project and evaluation
design: PI, PD, PC
Conduct weekly VPREIS staff meetings to ensure quality and compliance with
project objectives and timeline: All staff
Participate in technical assistance and training as needed: All staff
By October 31, 2016
Participate in the Grantee Orientation Webinar: PI, PD, PC, DC, E
Begin documentation of innovative strategy/approach to be evaluated, program
implementation and service delivery: PI, PD, PC
Begin VOY curriculum revisions and video script creation: PC, APC, CDS, HEDG
By November 30, 2016
All vacant staff positions are filled: PI, PD, PC
All VPREIS staff are trained in a trauma-informed approach by a certified trainer: All staff
Submit all video scripts and storyboards to FYSB for medical accuracy review and
approval: PD, PC, HEDG
Phase 1 activities are operating within 60 days following Notice of Award: All staff
By December 31, 2016
Video filming has begun: HEDG
Research methodology, data management and reporting systems designs are
finalized: PI, PD, PC, DC, E
Partnership agreements with juvenile detention centers, school divisions, alternative
education programs, and CSBs are finalized and executed: PD, PC, APC
All materials for Learning Modules 1 & 2 are submitted to FYSB for medical
accuracy and age appropriateness review: PD, PC, CDS, HEDG
By January 31, 2017
Sustainability plan is finalized and submitted to FYSB for review: PI, PD
Meetings and communication with key staff at partner organizations are taking place
to determine logistics for VOY implementation and evaluation with treatment and
control groups: PD, PC, APC, DC, E
Learning Modules 1 & 2 are finalized, and pilot tested: PD, PC, CDS, HEDG
By February 28, 2017
VOY Training and Fidelity Guide materials are updated and finalized: PC, APC,
CDS
All materials for Learning Modules 3 & 4 are submitted to FYSB for medical
accuracy and age appropriateness review: PD, PC, CDS, HEDG
By March 31, 2017
Data collection instruments and protocols are finalized: E, DC, PI, PD
Evaluation plans are finalized: E, DC, PI, PD
IRB protocol is submitted to JMU’s IRB including all PREP performance measures
and measures associated with the rigorous evaluation: E, PI, PD
Health and Support Services Referral list is created for each community that will
have VOY participants: APC, CDS
Learning Modules 3 & 4 are finalized, and pilot tested: PD, PC, CDS, HEDG
By April 30, 2017
Institutional Review Board approval is received: E, PI, PD
VOY Training and Fidelity Guide materials are submitted to FYSB: PC, APC, CDS
All materials for Learning Modules 5 & 6 are submitted to FYSB for medical
accuracy and age appropriateness review: PD, PC, CDS, HEDG
Performance Progress Report and Financial Status Report submitted to FYSB: PI, PD
Additional partners and implementation sites have been recruited and partnership
agreements are finalized: PD, PC, APC
By May 30, 2017
Scheduling and logistical details are finalized with all partners and implementation
sites for Year 2: PC, APC, DC
Learning Modules 5 & 6 are finalized, and pilot tested: PD, PC, CDS, HEDG
All materials for Learning Modules 7 & 8 are submitted to FYSB for medical
accuracy and age appropriateness review: PD, PC, CDS, HEDG
Evaluation plan is submitted to FYSB for review/approval: E, PI, PD, DC
By June 30, 2017
Learning Modules 7 & 8 are finalized, and pilot tested: PD, PC, CDS, HEDG
All VOY program materials, including training and fidelity guide have been
submitted to FYSB for review: PD, PC, CDS, HEDG
All staff have been trained in the use of the training and fidelity guide: PC, APC, DC
All partners and implementation sites have identified/recruited participants: PC, APC
All Phase 1 activities are completed within 9 months of Notice of Award: All staff
By September 30, 2017
Two to three key staff attend the National TPP Grantee Conference in Washington,
DC area: PI, PD, PC
Key staff attend Topical Trainings provided by FYSB: PI, PD, PC, E
All revisions and recommendations based on FYSB’s review of materials are
completed prior to implementation: PI, PD, PC, APC, DC, E
Implementation and control assignment and scheduling is complete: E, PC, APC, DC
All components are ready for Year 2 implementation: PI, PD, PC, APC, DC, E
Table 2
Work Plan and Timetable – VPREIS Years 2 - 4 (JMU, 2015, pp. 39-40)
Project Goal: To reduce pregnancies, births, and STIs including HIV/AIDS among
high-risk youth populations in rural Virginia through the online, interactive VOY
program designed to reduce the frequency of sexual activity, reduce the number of sexual
partners, and increase contraceptive use among youth participants.
Process Objective: Target youth populations that are at highest risk of teen pregnancy to
prevent adolescent pregnancy and STIs including HIV/AIDS and rigorously evaluate the
online interactive, self-paced VOY program using a randomized controlled design in
Phase 2 of the VPREIS project.
Activity, Timeframe, Measures of Success, Responsible Staff
PI=Principal Investigator; PD=Project Director; PC=Project Coordinator; APC=Assistant
Project Coordinator; DC=Data Coordinator; DCS=Data Collection Specialists;
CDS=Curriculum Development Specialist; E=Evaluator; HEDG=Health Ed Design
Group
Ongoing
Conduct monthly telephone calls with FYSB to consult on project and evaluation
design: PI, PD, PC
Conduct weekly VPREIS staff meetings to ensure quality and compliance with
project objectives and timeline: All staff
Participate in technical assistance and training as needed: All staff
By October 31, 2017, 2018, 2019
Obtain consent and assent forms from parents and participants for the program and
evaluation: PC, APC, DC, partners
Begin implementation and evaluation of VOY with intervention and control groups:
PD, PC, APC, DC, E, partners
Begin performance measure and rigorous evaluation measure collection: DC
Submit 6-month Performance Progress Report and FSR to FYSB: PI, PD
Data Collection Specialist is hired (Year 2 only): PI, PD, PC, DC
By December 31, 2017, 2018, 2019
Staff begin conducting fidelity monitoring site visits at each site: PD, PC, APC
Begin retention, tracking, and follow-up activities for 3-month post-intervention
survey: PD, PC, APC, DC, DCS
VPREIS leadership reviews and makes revisions to the sustainability plan and begins
sustainability activities: PI, PD, PC
By April 30, 2018, 2019, 2020
VPREIS staff conduct fidelity monitoring site visits at each partner site: PD, PC,
APC
Submit 6-month Performance Progress Report and FSR to FYSB: PI, PD
By June 30, 2018, 2019, 2020
A total of at least 160 high-risk youth have participated in the VOY program each
year (a total of at least 480 in Phase 2): PI, PD, PC, APC, partners
A total of at least 160 high-risk youth have participated as the control group each
year (a total of at least 480 in Phase 2): PI, PD, PC, APC, partners
Retention, tracking, and follow-up activities for 9-month post-intervention survey:
PD, PC, APC, DC, DCS
Staff and points of contact at partner sites complete feedback activities to assess areas
for continuous quality improvement of delivery: PC, APC, CDS, DC, DCS, partners
All data submitted to Evaluation Team for analysis and reporting: DC, DCS
By September 30, 2018, 2019, 2020
Any necessary revisions to implementation plan are submitted to FYSB: PI, PD, PC,
APC
Evaluation Team presents findings from previous year results to VPREIS staff for
continuous quality improvement efforts: All staff
Continuous quality improvement efforts are put into place for the following year: All staff
2-3 Key staff attend annual TPP Grantee Conference in Washington, DC: PI, PD, PC
Present findings from evaluation, innovative strategy approach and program
implementation to other grantees through poster and panel presentations: PI, PD, PC,
E
Key staff attend Topical Trainings hosted by FYSB, as applicable: PI, PD, PC, E
All logistics are in place for implementation and evaluation for the next year: All staff
Sustainability planning and activities are implemented: PI, PD, PC
Table 3
Work Plan and Timetable – VPREIS Year 5 (JMU, 2015, pp. 40-41)
Project Goal: To reduce pregnancies, births, and STIs including HIV/AIDS among
high-risk youth populations in rural Virginia through the online, interactive VOY
program designed to reduce the frequency of sexual activity, reduce the number of sexual
partners, and increase contraceptive use among youth participants.
Process Objective: Manualize and package the VOY program; and disseminate lessons
learned, best practices, and relevant findings to further teen pregnancy prevention and
STI prevention efforts among high-risk youth populations.
Activity, Timeframe, Measures of Success, Responsible Staff
PI=Principal Investigator; PD=Project Director; PC=Project Coordinator; APC=Assistant
Project Coordinator; DC=Data Coordinator; DCS=Data Collection Specialists;
CDS=Curriculum Development Specialist; E=Evaluator; HEDG=Health Ed Design
Group
Ongoing
Conduct monthly telephone calls with FYSB to consult on project and evaluation
design: PI, PD, PC
Conduct weekly VPREIS staff meetings to ensure quality and compliance with
project objectives and timeline: All staff
Participate in technical assistance and training as needed: All staff
By October 31, 2020
Obtain consent and assent forms (parents and participants): PC, APC, DC, partners
Continue implementation of VOY with control groups: PD, PC, APC, DC, E,
partners
Submit 6-month Performance Progress Report and FSR to FYSB: PI, PD
By December 31, 2020
Staff conducts fidelity monitoring site visits at each partner site: PD, PC, APC
Retention, tracking, and follow-up activities for 3-month and 9-month post-
intervention surveys are completed for remaining cohorts: PD, PC, APC, DC, DCS
Sustainability planning and activities are implemented: PI, PD, PC
Outcome evaluation data is analyzed: E
By April 30, 2021
VPREIS staff conduct fidelity monitoring site visits at each partner site: PD, PC,
APC
Submit 6-month Performance Progress Report and FSR to FYSB: PI, PD
Evaluation Team presents outcome evaluation findings to VPREIS staff: E
Manuscript submitted to peer-reviewed journals for publication and contribution to
research: PI, PD, E
By June 30, 2021
VOY program and training materials are manualized and packaged: PI, PD, PC, APC
Sustainability activities are expanded: leveraging funding from Virginia, securing
funding from private foundation/corporate sponsors, establishing an affordable fee
structure so that organizations across the US can implement VOY with high-risk
youth, and disseminating outreach and communication materials to professionals in
the field via digital advertising, presentations, workshops, and vendor demonstrations
at conferences: PI, PD, PC, APC
By September 30, 2021
Final evaluation report submitted to ACF/FYSB: E, PI, PD
Final evaluation findings submitted to HHS TPP Evidence Review: PI, PD, E
2-3 Key staff attend annual TPP Grantee Conference in Washington, DC: PI, PD, PC
VPREIS staff presents findings from evaluation and innovative strategy approach and
program implementation to other grantees at Grantee Conference through poster and
panel presentations: PI, PD, PC, E
Key staff attend Topical Trainings hosted by FYSB, as applicable: PI, PD, PC, E
VPREIS leadership and staff disseminate information related to program design,
theory of change, implementation and early findings via journals, press releases,
conferences, workshops, and other methods: PI, PD, PC, E
Sustainability activities are in place to ensure continuation: PI, PD, PC
These timelines portray the intended plan for the VPREIS team in 2015. Over the four years
the project has been operating, changes to the original timelines have been made. For
example, all Vision of You program units were to be completed and pilot tested by June of
2017 in order to begin implementation by October of 2017, but the program was only partially
completed and piloted in July of 2017 and was not ready for implementation until April of
2018. Though not outlined in the original timeline or plan, a recruitment specialist was hired in
the spring of 2019 to assist the team in recruiting study participants, as recruitment at that point
was critically low (under 100 participants) relative to the number of identified participants
needed for the study.
Context of Learners
Youth residing in areas of Virginia with high teen birth rates who demonstrate elevated
risk factors for experiencing or causing a teen pregnancy and contracting sexually transmitted
infections (STIs), including HIV/AIDS, were chosen as the population for the VPREIS
research. Participants for the project include the following vulnerable high-school-aged youth:
1) youth involved in the juvenile justice system; 2) youth attending alternative education and/or
night school programs; and 3) youth referred to third party service providers (JMU, 2015).
Rationale for targeting two of these populations is
outlined within the VPREIS research proposal (as cited in JMU, 2015) as follows:
Youth involved in the juvenile justice system: While the rate of juvenile detention has
been declining since 1999, almost 55,000 youth were detained in residential placements
in 2013 across the US with males and racial/ethnic minorities being heavily
overrepresented (Child Trends, 2015). In 2013, there were 1,563 youth under the age of 21
detained in Virginia, and the rate of juvenile incarceration was 188 per 100,000. When
examining by race/ethnicity,
black youth were incarcerated at a rate of 506 per 100,000, Hispanic youth at a rate of
114 per 100,000, and white youth at a rate of 93 per 100,000 (Kids Count, 2015). Youth
committed to the Department of Juvenile Justice (DJJ) have participated in certain sexual
behaviors at higher rates than youth in the general population. According to the 2013
Youth Risk Behavior Survey, 47% of high school students nationwide reported ever
having sex, and 15% reported having sex with four or more partners during their lifetime.
Of those youth who reported having sex, 59% nationwide reported having sex without a
condom. Additionally, only 6% of high school students reported having had sex before
age 13. In sharp contrast, 87% of all youth admitted to the DJJ reported ever having sex,
and 57% reported having sex with four or more partners in the previous three years. Of
youth admitted in FY2013 who reported having sex, 77% reported having sex without a
condom, and 25% had sex before age 13 (VDJJ, 2015). There is a clear, significant need
for addressing risky sexual behavior among youth in the DJJ system. These youth also
have other risk factors including mental health concerns, substance use, higher rates of
domestic violence, abuse and neglect, and sexual abuse than other youth, necessitating a
trauma-informed approach to programming.
Youth attending alternative education and night school programs: In 1993, the Virginia
General Assembly established regional alternative education programs to provide an
educational alternative for certain students. These regional programs currently serve students
who no longer have access to traditional school programs or are returning from juvenile
detention centers. Specifically targeted are students who have a pending violation of a
school board policy, have been expelled or suspended long term, or have been released
from a juvenile detention center. The number of students enrolled in Virginia’s regional
programs increased from 217 students in 1993-1994 to 4,085 students in 2008-2009.
According to a 2010 report, students served by these programs were 52.61% White,
41.15% African American, 5.04% Hispanic, and 1.2% Other. Examining gender,
71.16% were male and 28.84% of students were female. Nearly 70% of students were
in grades 9-12 (2,866 students). Students are typically assigned to these regional
alternative education programs because they have received long-term suspensions, have been
expelled, or are otherwise deemed best served by these programs. Due to the nature of the
reasons for enrollment, these programs serve a concentrated high-risk population.
In their original narrative, JMU proposed conducting an intervention with students who
received home-bound instruction due to medical conditions that prohibit school attendance.
This population was later dropped by the VPREIS study due to the difficulty of establishing
relationships with partner staff who worked with home-bound students. Instead, third party
service providers
including community services boards, foster care services, and after school programming were
added as organizations from which to recruit study participants. Students within these settings
show some of the same risk factors as youth in alternative education and juvenile detention
centers. For instance, youth who have been involved in the child welfare system have an
increased likelihood of being involved in the juvenile justice system (Abbott & Barnett, 2016).
Some term youth who have been involved in both systems as “crossover youth” (Herz et al.,
2012, p. 3). In addition, afterschool programs are often used to provide activities to youth that
are intended to reduce delinquent behavior during the time of day (2 pm to 6 pm) when
juvenile crime is at a peak (Gottfredson et al., 2004). Programs like the Boys and Girls Clubs
of America receive funding from the Office of Juvenile Justice and Delinquency Prevention to
reduce juvenile delinquency, drug abuse, truancy and other high-risk behaviors that could
result in the detainment of youth (U.S. Department of Justice Office of Justice Programs,
2019). For these reasons, this population of youth was added to the target population for the
VPREIS study.
Context of Team
As outlined in Tables 1-3 the VPREIS team was originally meant to consist of the
Principal Investigator, the Project Director, the Project Coordinator, the Assistant Project
Coordinator, the Data Collection Specialist, the Curriculum Development Specialist, the
Evaluator, and the Health Education Design Group (HEDG). Members of the VPREIS team
brought experience in sexual health education and youth programming. Several members had
experience in evaluating program effectiveness, and the
Principal Investigator had overseen grant funded programs for over 15 years. Upon award of
the PREIS grant, all team members responsible for research completed Human Subjects
Research training. The Curriculum Development Specialist (CDS) had experience in curriculum
design for Farm to Table programs and several years of experience facilitating sexual health
curriculums in middle school classrooms. They were also invested in LGBTQ+ youth
programming and social justice education that focused on the school-to-prison pipeline. The
CDS was responsible for updating the Vision of You curriculum from a face-to-face,
abstinence-only education program to a comprehensive sexual health program that would be
delivered in a
self-paced format online. As the CDS developed program content it was passed to the HEDG
team.
The HEDG team was responsible for developing the learning content management
system (LCMS) that would facilitate the self-paced active learning environment through a web
browser on a computer or mobile device. The LCMS was to be comprised of the Student
Engagement Profile (student platform for engaging in the learning activities), Learning Module
Series (all Vision of You units’ topics and activities), Moderator Interaction and Support
Forum (where students could ask questions of program moderators), Facility and Center
Administrative Control Panel (to be used by schools or partners of the VPREIS team), System
Administrative Control Panel (used by the VPREIS team for adding study participants to the
intervention program), and a Controlled Student Population Test (this system would connect
participants from the Vision of You program to the follow-up survey in Qualtrics). Building the
LCMS required determining and contracting server hardware and bandwidth requirements. The
HEDG team was also responsible for developing the program activities for the Learning Module
Series as outlined by the CDS.
The Evaluator for the VPREIS project is the CEO of an independent company that
offers psychometric and evaluation services. She was contracted through James Madison
University to design the evaluation tools for the VPREIS project and will serve the project
through year five when final results are analyzed and disseminated.
Process of Program
In designing the content for the VOY program, the VPREIS team relied on the SIECUS
Guidelines for Comprehensive Sexuality Education (GCSE) (SIECUS, 2004). Students receive
nine 45- to 60-minute lessons that address
all the key concepts and the majority of the topics for Level 4, high school aged youth as
outlined in the third edition of the GCSE (SIECUS, 2004). Figure 1 displays the Guidelines for
Comprehensive Sexuality Education: Key Concepts and Topics that were used in the design of
the Vision of You program content.
Figure 1
VOY is delivered in nine sessions over the period of two to four weeks, depending on
student availability and scheduling. Two lessons, Identity and Anatomy, must be completed
before students can access subsequent lessons. Once the two foundational lessons are completed,
students are able to determine the order in which they would like to complete the remaining seven
lessons. Table 4 outlines objectives for each lesson, the GCSE concepts and topics covered, as
well as each unit’s connection to the adult preparation subjects outlined by the Patient Protection
and Affordable Care Act. Programs funded through PREP, of which VPREIS is an extension,
are required to include at least three of the six adulthood preparation subjects: adolescent
development, educational and career success, financial literacy, healthy life skills, healthy
relationships, and parent-child communication.
Table 4
VOY Program Content, Alignment with SIECUS, and Adulthood Preparation Subjects
Addressed (Adapted from JMU, 2015, pp. 25-26)
Participants assigned to the intervention in the VPREIS project log in to the web-based
program on a desktop or laptop computer using an assigned username and password. Users
must access the program through the Google Chrome browser. For the majority of partner sites
in the VPREIS project, participants are able to work on the VOY program without other
participants around them. In alternative school settings three to four students may work on the
program in a group, but each logs in individually on their own device. Students are provided
headphones so that audio remains private.
When a student first logs in to their account in Vision of You they will see their name as
it has been entered in the Administrative Control Panel (referred to by the VPREIS team as the
VOY Management System) on the program landing page. The landing page shows the student’s
chosen character avatar, which is selected in the first unit, a display of badges, a board which
contains the infographics unlocked in each unit, a graphical representation of how the choices the
student has made in activities have affected their character avatar’s goals/interests, and a list of
the program units.
Participants are able to see all the units on their homepage, as well as their individual
progress through each specific unit (e.g. Completed, In Progress with an indication of the
percentage completed, Not Started, and Locked). Students have a degree of control over the
order of lessons with which they engage. Some lessons are locked until a prerequisite lesson is
completed. The order of engagement with other lessons is up to the student. The content of all
lessons is divided into sections that contain one or more of the following: videos, short reading
passages, and interactive activities. Each lesson displays a progress bar at the bottom of the page
that indicates how much of the lesson remains. Students can exit any lesson at any time and will
be returned to the section where they left off.
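The progress states and resume behavior described above can be sketched roughly as follows. This is a minimal illustration only, not the VPREIS team's actual code; all function and field names are hypothetical, assuming each unit simply tracks its total sections and how many the student has finished.

```javascript
// Hypothetical sketch of the unit status display (Locked, Not Started,
// In Progress with a percentage, Completed) and resume-where-left-off logic.

function unitStatus(unit) {
  // A unit gated behind an incomplete prerequisite lesson shows as "Locked".
  if (unit.locked) return "Locked";
  if (unit.sectionsDone === 0) return "Not Started";
  if (unit.sectionsDone >= unit.sectionsTotal) return "Completed";
  // Otherwise report partial progress as a percentage.
  const pct = Math.round((unit.sectionsDone / unit.sectionsTotal) * 100);
  return `In Progress (${pct}%)`;
}

function resumeSection(unit) {
  // A returning student picks up at the first unfinished section.
  return Math.min(unit.sectionsDone, unit.sectionsTotal - 1);
}
```

Storing only a section-level bookmark, as sketched here, is enough to return a student to where they left off after exiting at any time.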
Lessons are divided into two categories: Foundation Content (FC) and Consequential
Content (CC). FC focuses on providing students with medically accurate information and tests
their knowledge with short quizzes and interactive activities. An FC lesson is not listed as
“Complete,” and thus will not unlock any linked subsequent lessons, until the student achieves
a pre-determined score on the lesson’s final evaluation, ensuring that participants engage
with the content. Students are not penalized for wrong answers; instead, if a student fails the final
evaluation, they are redirected to the relevant sections corresponding to the questions they
answered incorrectly before being able to retake the evaluation. A bank of relevant questions
ensures the evaluation is never exactly the same twice, thus reducing the effectiveness of simply
trying all the different answers. CC focuses on showing situations and stories, along with
possible outcomes and resolutions for the character avatar chosen in the beginning of the
program. Students are asked to think about what they would do in similar situations and give
advice. Advice options are categorized as assertive, avoidant, aggressive, or passive. After
completing the scenario questions, a results display indicates how the chosen answers will affect,
both positively and negatively, their character avatar’s goals and interests.
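The Foundation Content evaluation flow described above can be sketched roughly as follows. This is a hypothetical illustration under stated assumptions (a simple pass threshold and a per-question link back to the section it came from), not the actual VOY implementation; all names are invented.

```javascript
// Illustrative sketch of an FC final evaluation: draw questions from a bank
// so no two evaluations are exactly alike, grade against a pass threshold,
// and send a failing student back to the sections they missed before a retake.

function drawEvaluation(questionBank, count, rng = Math.random) {
  // Random sampling from the bank reduces the payoff of brute-force retakes.
  const shuffled = [...questionBank].sort(() => rng() - 0.5);
  return shuffled.slice(0, count);
}

function gradeEvaluation(questions, answers, passScore) {
  const missed = questions.filter((q, i) => answers[i] !== q.correct);
  const score = questions.length - missed.length;
  return {
    passed: score >= passScore,
    // Deduplicated list of sections to review, one entry per missed topic.
    reviewSections: [...new Set(missed.map((q) => q.section))],
  };
}
```

A failing result would redirect the student through each entry in `reviewSections` before `drawEvaluation` is called again for the retake.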
The original VOY interface was developed in React. According to the React (n.d.)
website they offer “a declarative, efficient, and flexible JavaScript library for building user
interfaces” (Tutorial section, para. 3). React is open source, meaning it is free to use and
modify. Developed and used by the social media company Facebook, React is popular for
developing single-page applications and mobile apps, but can also be used to build more
complex apps when combined with other libraries. Two programmers
on the HEDG team were responsible for building the LCMS for Vision of You. One programmer
was assigned to building the Student Engagement Profile and the Learning Module Series and
the other programmer was assigned to building the Moderator Interaction and Support Forum,
the Facility and Center Administrative Control Panel, and the System Administrative Control
Panel.
Current Program
The intervention being used by the VPREIS project is the online sexual health education
program, Vision of You. Vision of You consists of nine units containing games, videos, and
interactive activities all accessible from one website. One unit is dedicated to each of the
following topics: Identity, Healthy Relationships, Talking with Adults, Consent, Anatomy, STIs,
Methods of Protection, Clinic Visit, and Thinking Forward. Throughout VOY, gamification
principles are utilized to maximize student engagement, such as encouraging sound effects and
animations, digital badging to provide incentive and motivation, and points scored for
completing lessons and answering questions correctly. Program units also contain smaller gaming
elements that can be returned to at any time in the program. Games featuring puzzles, matching,
and fall-and-catch elements are used to reinforce content knowledge in anatomy, consent, and
STI lessons.
As students navigate through the program, they are introduced to twelve characters
representing a diverse group of young people shown in Figure 2. Lacey, Sofia, Noor, Hunter,
Dakoda, Bella, Keisha, Luis, Jannette, Tyler, Nick and Brianna navigate healthy and unhealthy
relationships, asking for and receiving consent, communicating with adults in their lives,
understanding their identity, accessing health care services, and thinking about their future goals
both short term and long term. Students in Vision of You pick from these characters and learn
more about their experiences throughout the program while answering scenario questions to help
guide their chosen character’s decisions.
The VPREIS team strived to represent a diverse group of young people that VOY users
can identify with. Evidence-based programs often fall short in recognizing the impact that
intersecting identities including race, sexual orientation, disability, gender, class, and religion
have on youth and their sexual health. Vision of You represents youth that have experienced
homelessness and incarceration, youth that have been ostracized for their sexual orientation, and
youth that are learning to be parents. The program navigates the user through definitions of
gender identity and sexual orientation and shows LGBTQ+ youth in examples of healthy
relationships. Interactive comic-style activities, like the one shown in Figure 3, allow the user to
step through these characters’ stories.
After completing a unit, VOY users can still access the activities, games, and videos they
engaged with in that unit. Units on identity, consent, sexually transmitted infections, and
methods of protection automatically generate infographics that can be easily accessed from the
VOY homepage should students wish to review that content. Before moving on to new content,
students complete a Gateway Quiz containing three to four questions that they must answer
correctly to show they understand key concepts within that unit. Games throughout the units
reinforce content in a fun and engaging way. Figure 4 shows the STI Eliminator game where
users put their knowledge about bacterial, parasitic, and viral STIs to the test in a classic style
game.
Figure 4
Video segments were seamlessly integrated into each unit of VOY. Stylized animation is
used in the lessons addressing STI Prevention. These videos/animations effectively demonstrate
key concepts from the lesson objectives in a memorable way. The HEDG team (Health
Education Design Group) at JMU developed and scripted high-quality engaging segments from a
dramatic narrative approach, depicting relatable, real-life situations. Some of the characters
appear in multiple videos in the course and are represented as character avatars, creating a more
cohesive presentation. The HEDG creative team utilized several types of video, ranging
from energetic docu-drama web series to cutting-edge animations. This innovative approach to
learning and skill development makes sexuality education relevant to teens by presenting
information in a format and delivery method that teens use daily. Vision of You begins with a
focus on the user through the Identity unit and ends with a focus on the user’s future in the
Thinking Forward unit. In this final unit students meet the reluctant substitute health teacher, Ed,
who helps the class think about their short- and long-term goals moving forward. Figure 5 shows
Ed in front of the health class as students prepare to remind him of what they have been learning.
Figure 5
The evaluator assigned to the VPREIS team designed post-program surveys to assess
participant outcomes at four points in time. A review of literature and technical
manuals was conducted with attention paid to the psychometric properties (reliability and
validity) of similar scales. All survey items, excluding delivery quality, were mapped
directly to the VOY curriculum. Many items measure participant behaviors, as change in
health behavior is the primary outcome goal for participants; knowledge gains, expected to
appear on the immediate survey outcomes, are a secondary outcome goal for
the participants. VOY instruments were submitted to FYSB and Mathematica for review
before being approved by the JMU Institutional Review Board. Table 5 outlines the
instruments used, the variables assessed, number of items, and connection to content or
Table 5

Instrument | Primary / Secondary Variables Assessed | # Items | Content Outline
VOY Student Knowledge Scale | Curriculum Knowledge | 15 | Questions are mapped to the content topics of the VOY curriculum covering identity (sexual orientation, gender, sex), consent, anatomy, methods of protection, and clinics.
VOY Knowledge Scale | Adulthood Preparation Subjects | 9 | Questions are mapped to content topics of the VOY curriculum covering the adult preparation subjects of healthy relationships, parent-child communication, healthy life skills, and adolescent development.
VOY Behavior Scale | Sexual Risk Behavior | 23 | Behavioral questions about frequency of sexual activity, number of partners, contraceptive use, and other behaviors related to sexual activity.
Program Delivery Quality | Likability Compared to Others | 9 | Questions meant to gauge student interest in the program compared to other programs they have taken, favorite activities, and missing topics.
Performance Measures | Demographics | 10 | Questions required by FYSB (federal funder) which cover race, age, language, etc.
Participants begin by taking a baseline survey and are then randomly assigned to either the
intervention group, which completes the Vision of You program, or the control group, which
can complete an optional nutrition program. Participants are evaluated first by completing an
immediate post-program survey, then the same survey three months after program
completion, and finally at nine months post completion. Surveys are confidential and
collected through the web-based service, Qualtrics. Participants are assigned a tracking number
to use for each survey. Partner staff at alternative schools, detention centers, and third-party
service providers deliver the survey to participants along with their tracking number if the
participant is still attending the partner site. If participants have graduated, withdrawn from
school, have been released from detention or left the services of the third-party provider the
VPREIS Data Collection Specialist contacts the participant directly via cell phone, home
phone, email, mailing address, or social media. It is expected that response rates to the surveys
will decrease slightly over the four survey points as participants leave the placements where
they began the intervention. Participants are offered a monetary incentive for completing each
of the post program surveys to encourage follow-through with the project and avoid high
attrition rates.
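The assignment-and-tracking workflow described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the VPREIS team's actual procedure; the function name, the tracking-ID format, and the even 50/50 assignment are all assumptions.

```python
import random
import secrets

def assign_participants(participants, seed=None):
    """Randomly assign each participant to the intervention arm (Vision of
    You) or the control arm (optional nutrition program) and issue a
    confidential tracking number reused across all four survey points."""
    rng = random.Random(seed)
    roster = {}
    for participant in participants:
        roster[participant] = {
            # token_hex stands in for whatever ID scheme the project uses
            "tracking_id": secrets.token_hex(4),
            "arm": rng.choice(["intervention", "control"]),
        }
    return roster

roster = assign_participants(["P001", "P002", "P003", "P004"], seed=42)
# Survey responses are then keyed by tracking_id rather than by name,
# keeping them confidential while still linkable across the four waves.
```

Keying responses to a tracking number rather than a name is what lets partner staff or the Data Collection Specialist deliver later surveys without exposing identities in the survey data itself.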
The following statement details the data analysis that will be used by the evaluator to
analyze collected data in year five of the VPREIS project once all survey points have been
collected:
The statistical modeling method for this research design is a doubly multivariate analysis:
the within-subjects variance (across time) modeled within each between-subjects level
(programming). The RCT utilized in this study will create an Intervention and a Control
Group, which represent two distinct levels of programming: The Control Group with no
VOY curriculum, and the Intervention Group that will participate in VOY curriculum.
subjects, but if statistical differences appear, they will be controlled for in the subsequent
analysis.
A doubly multivariate analysis can be implemented when multiple dependent
variables (DVs) are measured at multiple times (Tabachnick and Fidell, 2001). The four
dependent variables will be measured at four points in time. To measure these variables four times for one group / level of an
Independent Variable (IV) would require a MANOVA, with time being the within-groups
IV. To measure the DVs across the two levels of the treatment IV leads to a doubly
multivariate occurrence, with one between-subjects IV (program level) and one within-
subjects IV (time). Each Primary and Secondary Research Question will be addressed by
conducting this analysis. For each setting, a doubly multivariate analysis will include
examination of the four dependent variables at four points in time for the control group
and the treatment group. It is an anticipated challenge that the four dependent variables
may be correlated with each other. Experts suggest that a doubly multivariate approach is
wasteful if the DVs are correlated at greater than 0.6 (Tabachnick and Fidell, 2001). That is,
the amount of variance accounted for by the most significant DV will overlap with that of
other DVs, rendering its contribution to the model meaningless. If, at any point, it is
discovered that the limit of multicollinearity is exceeded, then two MANOVAs (with a
(pp. 51-52)
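The multicollinearity screen the quoted plan describes — checking whether any pair of DVs correlates above 0.6 before committing to a single doubly multivariate model — can be sketched as a simple pre-analysis step. This is a hypothetical sketch using synthetic data; the variable names and the data are not from the study.

```python
import numpy as np

def flag_collinear_dvs(scores, limit=0.6):
    """Return DV pairs whose Pearson correlation exceeds `limit` -- the
    point past which Tabachnick and Fidell (2001) call a doubly
    multivariate analysis wasteful."""
    names = list(scores)
    flagged = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = float(np.corrcoef(scores[names[i]], scores[names[j]])[0, 1])
            if abs(r) > limit:
                flagged.append((names[i], names[j], r))
    return flagged

# Synthetic scores standing in for three hypothetical DV scales.
rng = np.random.default_rng(0)
knowledge = rng.normal(50, 10, 200)
behavior = 0.9 * knowledge + rng.normal(0, 3, 200)  # deliberately entangled
attitudes = rng.normal(0, 1, 200)                   # unrelated

pairs = flag_collinear_dvs(
    {"knowledge": knowledge, "behavior": behavior, "attitudes": attitudes}
)
# `pairs` flags only the knowledge/behavior pair, the signal that two
# separate MANOVAs may be preferable to one doubly multivariate model.
```

Running a check like this before model fitting makes the "two MANOVAs" fallback a planned decision rather than a post hoc one.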
Though the VPREIS team expects to exceed the sample size needed to conduct the
proposed statistical analysis, if that sample size is not met some variables will be analyzed via
Researcher Profile
The researcher taught as a sexual health and positive youth development educator for
three years in middle school and high school classrooms prior to being hired full time as the
Data Collector for the VPREIS project in the fall of 2017. Due to the underestimated amount of
time needed to complete the design and development of the online Vision of You program, the
researcher also assisted the Curriculum Development Specialist from 2016 to 2017 and served
as a content expert during filming of program videos with the HEDG team. She was the
VPREIS team member responsible for recruiting participants and conducting the pilot study before
program implementation. At the start of the program’s implementation the researcher worked
with partner sites to recruit and retain participants. Her responsibilities included collecting
parent/guardian consent, participant assent, collecting and validating tracking information for
contacting participants over the course of the study, tracking participant progress in the Vision
of You program, collecting data for all four survey points, and working closely with the project
evaluator to assure low attrition and continued updates to survey measures as outlined by
federal funders. She is also responsible for organizing collected data to be reported to the
Problem Statement
In the final year of the VPREIS project the team will disseminate study results as well
as lessons learned in regard to the design, development, and implementation of the Vision of
You program and research study. Processes for creating program activities, engaging site
partners, recruiting and retaining study participants and collecting data will all be examined by
the team and shared at relevant conferences and with the federal funding agency. This paper
will begin an evaluation of the processes the VPREIS team took in the first four years of the
project. This evaluation will be conducted by analyzing the project processes through the
ADDIE Framework as described by Allen (2006) for developing training programs, as well as
by examining the time spent in three units of the program. The researcher will evaluate the
VPREIS project processes through the lens of the ADDIE Framework including the Analysis
phase, Design phase, Development phase, Implementation phase, and the Evaluation phase
(ADDIE). The researcher was hired to join the VPREIS team in the Fall of 2017, one year after
the project began, but was a team member of TPP prior to receiving the grant award. She had
roles in each part of the PREIS grant process including the initial writing of the grant proposal
and the beginning phases of implementing the grant processes, so she is intimately acquainted
with the project. In 2018 the researcher began the Educational Technology Graduate program at
James Madison University. This research will not only meet the requirements of the graduate
program for the researcher but will also assist her and her team in the evaluation of the VPREIS
project.
Significance
This research will not only contribute significantly to the VPREIS project by providing
a detailed analysis of the project’s processes, but also broadly to the fields of sexual health
education, online learning, and research implementation. This information will be prioritized in
the fifth and final year of the VPREIS project when it will be disseminated to federal funders,
potential new grantees, and project partners. Many processes affect program cost, staffing,
replication, and scale-up (Wasik et al., 2013) and should be carefully recorded to be shared
with others. Analyzing the strengths and limitations of the project design and delivery may
inform others who wish to replicate the project or those looking to begin the design and
Research Questions
In what ways did the analysis, design, development, implementation, and evaluation
of the web-based program, Vision of You, by JMU’s VPREIS team align or diverge from
best practices of the instructional design phases outlined by the ADDIE framework?
What are the strengths of the VPREIS project curriculum and survey design and
delivery?
What are the limitations of the VPREIS project curriculum and survey design and
delivery?
What recommendations are made for future projects similar to the VPREIS project?
What impact does the amount of time spent in the self-paced online program, Vision of
You, have on student performance on the immediate post-program survey questions regarding
Definitions of Terms
2. Unit – the smaller components containing the learning activities that make up the
Vision of You curriculum. VOY contains nine units on various sexual health topics.
implementation, evaluation, and research performed by the VPREIS team and funded
time and the intervals at which it should be administered for a specified period
6. Participants – the high school aged youth originally recruited to the VPREIS study
8. ADDIE framework – the acronym outlining the framework for instructional design:
analysis, design, development, implementation, and evaluation
Conceptual Framework
The Vision of You program was modeled after the theories presented in Table 6
(Brindis et al., 2005, p. 21) and has strong connections to the Social Cognitive Theory.
Table 6
participate in programs to prevent and detect disease. Over the years, it has commonly been
used with adolescents and young adults in the US to study the relationship between the
model’s constructs and risky sexual behaviors (Champion & Skinner, 2008). According to
the model, if a young person perceives that they are at risk for contracting a sexually
transmitted infection and they have the needed skills to protect themselves, they are more
Researchers Jackson et al. (2016) used constructs and concepts from the Theory of
Reasoned Action as well as the Trans-Theoretical Model of Behavior Change in the creation of
their health education application. They included comparison statistics on peer norms and
attitudes, subjective norms for condom and contraceptive use, and skill-building exercises as
part of their intervention to align with their theoretical framework. The Theory of Reasoned
Action is popular in the field of sexual health. It suggests that “an individual’s intention to
perform a specific behavior is a linear function of his or her effective response to performing
the behavior (attitudes) and perceived social norms about the behavior” (Baker et al., 1996,
p. 529). This can be a helpful model in assessing whether youth will be more or less likely to
engage in risky sexual behavior based on their reported attitudes toward that behavior.
Using the Theory of Reasoned Action, Wulfert and Wan (1995) asked research
participants to indicate how they felt using a condom every time they engaged in sex from
extremely favorable to extremely unfavorable on a seven-point Likert scale. They also asked
participants to answer the same question, but from a social norms perspective. They found that
condom use attitude was closely linked to the intention to use condoms. Muñoz-Silva et al.
(2007) studied gender differences in condom use prediction using the Theory of Reasoned
Action and the Theory of Planned Behavior. Similar to Wulfert and Wan, they asked participants to
indicate their attitude toward condom use as well as perceived social norms of condom use using
a Likert scale. They found that male participants' intended condom use aligned with their
perceptions of social norms, while females' intended behavior related more closely to their
own attitudes toward condom use (Muñoz-Silva et al., 2007).
Attribution Theory assumes that people work to understand the reasons an event
happened to them and they attribute emotional and behavioral consequences to those reasons.
These attributions make their world more predictable and controllable (Brindis, 2017). In
sexual health education with adolescents, attribution theory could, for example, focus on the
those feelings to the action of not using contraception, attribution theory says the adolescent
will be less likely to repeat that same behavior to which they attribute negative emotions.
Protective Motivation Theory states that an individual is more likely to take any action
to protect themselves if they believe the event is of enough magnitude to harm them and they
can adopt a new behavior to protect themselves from the threat (Brindis, 2017). In regard to
sexual health education with adolescents, this theoretical framework would help a young
person focus on the potential consequences of becoming a teen parent or contracting a sexually
transmitted infection. This framework would then use strategies to encourage a young person
expectancies and incentives. Expectancies can be about how events are connected, consequences
of one’s actions, and self-efficacy, or the belief in one’s own ability to behave in a way to
influence desired outcomes. Incentives are the values of the perceived effects of changed
behaviors (Rosenstock et al., 1988). This educational theory is closely related to the Health
Belief Model. For sexual health education, this framework can be used to help youth consider the
actions of others and consequences and how those might feel for them as well as the potential
Time is needed to accomplish the broad approach to decreasing sexual risk behavior and
increasing intent to make healthier choices. Young people need ample time to begin to grasp
and understand their own susceptibility to illness and the benefits of and barriers to protection; to form
attitudes and beliefs; to gain motivation through understanding consequences; and to engage in future
planning. Fidelity to the sexual health program content is vital to ensure youth can explore these
topics, but according to researchers, Shegog et al. (2017), sexual health education is often
compromised by other academic priorities in a school setting. For students who are truant or
learning in non-traditional settings, like the youth the VPREIS team is working with, that could
mean next to no sexual health education exposure. Shegog et al. (2017) suggest that
understanding what program exposure (or “dose”) and time-on-task (number of lesson hours) are
effective in delaying sexual initiation creates a model for fitting sexual health into the school
schedule. Time-on-task refers to the amount of time students spend attending to school-related
tasks (Prater, 1992). This can also be called “intervention dose” in public health education
programs (Legrand et al., 2012, p.2). Though focused on linguistics education, according to
Rossell and Baker (1996) time on task theory states that the amount of time spent learning a
adequate time for youth to understand the potential consequences they may face when
encouraging youth to think about potential consequences to their individual lives, as well as
the feelings and reactions of their peers, family, and community. The Vision of You program
uses interactive games, videos, and practice communication activities to guide adolescents
through their understanding of potential threats, consequences, and benefits of making healthy
decisions. Finally, the program offers a self-paced model that allows youth to work on the
program at their own pace for learning and understanding. The program does not require a
classroom facilitator, so it is accessible to youth who may have missed the valuable (though
Overview
The following review includes professional studies or reviews that focus on the impacts
of the dosage of sexual health education and online sexual health programs. Fully online
programs for sexual health education are still new and under evaluation. Relevant outcomes for
sexual health interventions are reviewed as a basis for the importance of continued analysis and
improvements of these programs. Recommendations for the amount of time or dosage are also
included, as Vision of You is a self-paced program, which is a new approach for how most sexual
Recommendations for evaluating the instructional design process for creating a training
or educational program are also outlined. This will provide relevant background needed for
conducting the evaluation of the VPREIS project processes. Literature will primarily focus on
Literature Review
For this review, only peer-reviewed articles in English, published between 2000 and
2020, and available in full text were reviewed. The search terms “dosage of online education”,
“impact of time spent in the online program”, “online sexual health education”, “time on task”,
“health education dosage”, and “duration or length of time” were all used to find studies
relevant to the design and creation of the Vision of You program. Search terms “instructional
design”, “ADDIE model”, “ADDIE framework”, and “best practices for instructional design”
were all used to collect relevant literature for evaluating the VPREIS project. JMU library
databases, ERIC and Education Research Complete, were used to search for the articles
The Vision of You program includes gamified learning elements like avatars, a point
system, goals, and trophies. It also utilizes videos, games, and interactive activities for learning.
While its effectiveness is still being tested, other similar intervention strategies have shown
positive outcomes for youth. Through a scoping review of digital intervention (defined as
programs that provide sexual health information) for sexual health, Mann and Bailey (2015)
found that online programs not only reached people who were less likely to engage with
mainstream services, but they allowed participants to access information when it was
convenient to them.
In another study, Jackson et al. (2016) developed an online app to decrease sexual
risk behaviors among young college students. While they did not find a significant change
in students’ intention to reduce sexual risk behaviors, students did have a significant
curriculum for 18 and 19-year-olds. As compared to the control group, the intervention
group in this study reported higher levels of using dental dams and communicating about
sex as well as higher levels of using protection for oral sex. Participants in the intervention
comparing the media norm to the social norm, searching for the missing messages in media,
and comparing the effectiveness of birth control and condoms that are reported in media
versus what is medically accurate. Results showed that instances of oral, vaginal, and anal
sex were fewer in intervention participants as well as instances of sex under the influence
of drugs or alcohol. Acceptance of rape and rape myths was also lower for students in the
intervention group as compared to those in the control group (Scull et al., 2018). In a study
done by McGinn and Arnedillo-Sánchez (2015), researchers noted increased attention that
they attributed to the novelty of an online course or the familiarity of using applications
similar to those that adolescent students already used. The anonymity of the application put
the user more at ease, which made them feel more comfortable to engage with the sensitive
subject matter. Furthermore, online sexual health curriculums ensure fidelity of the content
delivered to students, they are easier to update with current medical findings, and they offer
Online sexual health programs can be an engaging way to reach adolescent learners. A
review of online sexual health education programs found that youth highly value privacy while
taking these courses online and look for easy access to information online from home, school,
or any other location (Holstrom, 2015). Similarly, a study done in South Africa with 16
randomly selected secondary schools found that students enjoyed sexual health lessons that
were delivered informally, as in an online format, and required minimal effort (Tucker et al.,
2015). The value of programs that engage young people in their health is very powerful.
Further research of programs to understand what elements make them not only engaging but
Effective interventions for addressing adolescent sexual risk behaviors are important
as adolescents are more likely than other age groups to engage in risky behaviors and endure
the consequences. A three-year review of electronic medical records for Ohio youth in
custody revealed that the most common risky sexual behaviors were inconsistent condom
use, having sex for the first time before the age of 16, and experiencing an unintended
pregnancy (Beal et al., 2018). Vision of You has been implemented with youth who, for a
variety of reasons, experience a higher risk than their peers of engaging in sexual risk
behaviors.
Understanding the impact of dosage, or time spent, will help educators, researchers,
practitioners, and policymakers design programs that ensure the best outcomes for learners.
Research indicates that one-time or limited exposure may show increased knowledge on an
immediate evaluation, but over time learners tend to lose the information they learned (All,
Nuñez Castellar, & Van, 2016; Maeda et al., 2018). Maeda et al. (2018) found that after a one-
time exposure to slides with information about fertility, participants lost most of their newly
acquired knowledge after two years. This could imply that learners need longer exposure to
learning materials, or materials taught in context or across time, to retain the information in
long-term memory.
While some studies have shown that limited time in an intervention yields fewer
positive outcomes for participants, Bull et al. (2012) found that the sexual health
intervention Just/Us, delivered via social media, was as effective as other online
interventions even though participants were not required to spend a specific amount of time
with the material. This could imply that time spent on content is not as crucial for positive
outcomes, but also indicates that further specific research is needed. A review of 64 internet-
based health interventions concluded that with such a diversity of intervention methods within
the reviewed studies, more specific research is needed to determine the true impact of factors
programs for teaching (McKimm et al. 2003) researchers Cook et al. (2010) found that the time
for internet-based learning was about the same as time required for non-computer learning
activities. When they explored studies that compared instead different internet-based learning
formats to each other, they found that in nearly all cases interventions designed to enhance
learning took more time (Cook et al., 2010). In a correlational analysis they found that “time
explained about one-fourth the variance in knowledge outcomes across studies” concluding that
“the longer one studies, the more one learns” (Cook et al., 2010, p. 765). In a study
investigating the effect of student time allocation on the average grade of undergraduate
students, Grave (2010) found that time spent attending courses was positively associated with
grades for females, high-ability students, and students of social sciences and engineering. Their
research supported findings from Fredrick and Walberg (1980) who found that time predicted
learning outcomes at modest levels and evidence was strengthened by content-specific outcome
Durlak and DuPre (2008) reviewed results from over 500 studies to understand the
impact of implementation on program outcomes. They found that when studies assessed dosage
and fidelity, programs had higher levels of implementation and better outcomes. Studies have
shown that increased dosage shows positive outcomes for participants including retention of
information as well as adherence to the program (Zaslow et al., 2016). A randomized controlled
trial evaluating online healthy eating programs found that more visits to the online
intervention corresponded with an increase in fruit and vegetable servings (Alexander et al.,
2010). Results also showed participants were more likely to recommend the program to others
at the 6- and 12-month post surveys. As adolescents learn a lot from their peers, a finding like
this could help inform online sexual health programs. In a study of whether intervention
exposure, or time spent in an online weight loss program, impacted retention at the follow-up
evaluation surveys, researchers found that the likelihood of retention increased with each
session participants viewed, and that more minutes spent in sessions also correlated
with better retention (Wilson et al., 2018). Fuhr et al. (2018) found a weak but statistically
significant correlation between adherence (defined as the number of sessions and usage
duration) to an online intervention for depression and symptom reduction after twelve weeks.
Other studies show little impact of dosage or a need for further research. The online
tailored intervention, MeFirst, was designed for college-aged females not previously
vaccinated for HPV. Participants could use the program as often as they wanted. Results
showed that overall participants in the intervention group as well as the control group that was
only shown a factsheet showed an increase in knowledge about HPV. There was no significant
increase in intention to be vaccinated for the intervention group (Bennett et al., 2015). A series
of videos were used as an intervention to address the behaviors of gay and bisexual men who
have sex with other men living with HIV. Researchers Hirshfield et al. (2019) noted that there
were likely only short-term effects in reducing risky behaviors and more research was needed
Vision of You is a self-paced curriculum, meaning that students move through the units and
activities on their own without a facilitator or teacher keeping the pace. Studying the impact of
dosage will help the VOY developers understand if students are spending enough meaningful
time with the content to receive positive outcomes. Self-paced curriculums have major pros and
cons for students and educators. In a self-paced course, Tang et al. (2019) note that students
can dedicate as much time as they need and can learn and reflect at their own pace. While the
freedom to move at their own pace could be a strong indicator of increased retention, it should
also be noted that time-independent formats can cause learners to procrastinate or even drop
out of the program (Michinov et al., 2011). While increased intervention time does not
guarantee better outcomes, Cheng and Chau (2016) found that
learners who spent more time in online activities showed increased achievement and greater
satisfaction.
their solutions.” (Lohr & Ursyn, 2010, p. 427). Well-designed instruction gets the attention of the
learner, orients them to interacting with content, zeros in on the most important objectives,
connects to previously learned material, and sets a framework for applying new knowledge and
information (Larson & Lockee, 2014). Instructional design is systemic, meaning that actions in
one process or component impact every other component in some way (Edmonds et al., 1994).
Dozens of instructional design models exist for professionals to use to guide the design of the
instruction. Effective models provide steps or guidelines, help the designer to effectively
facilitate learning, and allow for both formative and summative evaluation and assessment of the
design process as well as the learner outcomes (Nichols Hess & Greer, 2016). Despite the
availability of a variety of instructional design models, the most popular model, and the one
on which many others are based, is the ADDIE model (Allen, 2006). ADDIE is an acronym for
analysis, design, development, implementation, and evaluation. The ADDIE model was originally
used by the United States Air Force and has seen several changes over the years but has always
incorporated interaction among the phases to allow for continuous improvements (Allen, 2006).
Larson and Lockee (2014) refer to the ADDIE phases instead as “activities” (p. 8) which
they note are carried out “repeatedly, or iteratively, throughout the life of an instructional
product” (p. 8). The following table outlines the components of each of the ADDIE activities
Table 7
ADDIE Model Activities and Major Components as Outlined by Larson and Lockee (2014)
As indicated in Table 7, the ADDIE Model has several phases or activities and multiple
components within each activity. Many factors, like the details of the learning environment or
space and the learning theory used to frame instruction can be analyzed to determine the best
In the systematic review previously mentioned in this literature review, Cook et al.
(2010) suggest that developers of internet-based learning focus first on instructional designs
based on sound theoretical and empirical support for effectiveness and “continue to measure
time as an important outcome [to] understand and improve instructional design in internet-based
learning” (p. 767). The twenty studies that Cook et al. (2010) explored each reviewed internet-based instructional design and found that audio narration, short video clips, three-dimensional models, animations, and discussions were all associated with longer time spent learning and higher knowledge test scores (Spickard et al., 2004; Schittek Janda et al., 2005; Nicholson et al., 2006, as reviewed by Cook et al., 2010). Audio narration used to illustrate a
process is better for learning than using on-screen text (Mayer, 2009) and people tend to learn
more deeply when graphics are explained by audio narration alone rather than narration along
with on-screen text (Clark & Lyons, 2011). Moreno (2009) found that when showing short video
clips to new teachers, better learning occurred when a virtual agent elaborated on what was
presented in the video. For all elements, if the visuals or text used are only topically related to the lesson and extraneous to the learning goal, they are likely to negatively impact learning (Harp & Mayer, 1998).
Sexual health education is still most commonly delivered face to face, and further research is needed on sexual health programs that are online. Research on sexual health
education, especially for young people, can be difficult to access. A variety of factors make
studying sexual health education difficult, but with adolescent participants, there are additional
hurdles with obtaining parent consent and getting buy-in from school systems and educators.
Many of the studies considered for this review did not consider a dosage of intervention
on its own in their research but noted its importance along with other factors in an intervention.
As Cook et al. (2010) suggest, a true experimental design on factors of time spent would give a
better understanding of its significance. Often the impact of dosage was considered after the
intervention was completed rather than throughout the implementation. For studies that
considered visits to a website, the amount of time in minutes or seconds was not determined, so
little could be understood about the impact of how long participants engaged with the learning
material. While no studies reviewed used eye-tracking or clicks to determine what participants
engaged with and for how long, further research to show the importance of time spent engaged
in the material could open the doors for interest and funding of those advanced evaluative
measures.
This study seeks to add to the limited research on adolescents’ sexual health education. Understanding the impact that instructional design has on
the dosage, or time spent in curriculum activities and resulting student outcomes will inform
educators, parents, and other key stakeholders of best practices for sexual health education
online programs.
Chapter Three
Methodology
The researcher conducted a non-experimental study which examined the strengths
and limitations in instructional design of the Vision of You program. The researcher used
the ADDIE Model as a framework with which to evaluate the creation of the Vision of You program. The strengths and limitations of the instructional design process for the analysis, design, development, implementation, and evaluation phases were examined.
Because the reviewed literature suggests that programs with instructional design elements
like audio, animation, three-dimensional models, and discussions tend to take a greater
amount of time than programs without those elements and are correlated with increased knowledge scores, the researcher then examined time spent by students in the VOY program. VOY student performance on three key variables from the immediate post-program survey was analyzed. This research utilized secondary data (or data that was
previously collected in prior research) from the original VPREIS study taking place at James
Madison University.
The researcher chose to focus this research on Units 4, 6, and 7, as these units contain content that directly connects to the post-program survey and is intended to impact behavioral change as well as knowledge gains for the participant. Data previously collected by
the VPREIS team between 2017 and 2019 was utilized to conduct the research. For this
academic calendar, only participant data completed by February 3, 2020 was considered. Only participants who completed lessons corresponding to the post-program survey were
considered. This research was conducted during the spring semester of 2020. The amount of
time (measured in seconds) spent in each of the Vision of You lessons is recorded by the
Management System built for the VOY program. Time-spent data was exported to an Excel
document from the VOY Management System for each of the three program lessons that
were examined. Time spent data in VOY has not been previously viewed or studied by the
VPREIS team.
Data on knowledge gained from the STI, Methods of Protection, and Consent lessons was collected from
Qualtrics, the online survey platform used by the VPREIS team to collect participant
surveys. Survey responses were separated into two groups: Group 1 consisted of those that
answered the question correctly, while Group 2 consisted of those that answered
incorrectly. To determine whether age and gender should be included as variables in this study, a bivariate analysis was conducted to test whether age or gender was correlated with time spent. Preliminary evidence showed that they were not correlated, so they were not included as variables in the following analysis. A t-test was conducted to examine the group mean differences on time spent in VOY for each of the three survey questions.
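The comparison described above can be sketched in plain Python. This is a minimal stand-alone illustration of a pooled-variance independent samples t-test (the original analysis was run in SPSS), using hypothetical time-spent values rather than the study data:

```python
import math
from statistics import mean, variance

def independent_t_test(group1, group2):
    """Pooled-variance independent samples t-test (equal variances assumed)."""
    n1, n2 = len(group1), len(group2)
    # Pooled variance weights each group's sample variance by its degrees of freedom.
    pooled_var = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    standard_error = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    t = (mean(group1) - mean(group2)) / standard_error
    degrees_of_freedom = n1 + n2 - 2
    return t, degrees_of_freedom

# Hypothetical seconds spent by the correct-answer and incorrect-answer groups.
t, df = independent_t_test([1450, 1630, 1200, 1510], [1210, 1340, 1190, 1020])
```

The resulting t statistic is then compared against the critical value for the computed degrees of freedom to obtain the p-value, which SPSS reports directly.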
Due to the multiple variables being examined, the statistical test needed to be
adjusted using a Bonferroni Correction in order to control for a Type I error. A Type I error
occurs when the researcher has rejected a true null hypothesis (Field, 2009). Rejecting a true
null hypothesis would mean that the researcher reported that their findings are significant when
they actually only occurred by chance (McLeod, 2019a). The chance of making a Type I error, or believing that statistically significant results have been found when they have not, increases as the researcher conducts more tests. For this research a Bonferroni Correction was calculated by
dividing the standard for statistical significance (p < 0.05) by the number of tests being
conducted (3), which led to a significance level of only 0.017. Applying this correction to control for Type I error does result in a “loss of statistical power and the probability of rejecting an effect that actually does exist” (Field, 2009), which would be defined as making a Type II error. Due to a sample size of approximately 75 participants per group after cleaning
data for incomplete responses and the lowered significance threshold, it was unlikely
that the research would lead to statistically significant results. Therefore, a measure of practical significance was also determined to control for a Type II error, or accepting a false null hypothesis, which would mean the researcher concluded that there is no significant effect when one actually exists. A lack of statistically significant differences does not always indicate that the findings are not significant to the population
(Spurlock, 2019). Unfortunately, a lack of statistically significant findings has resulted in many
studies not being reported or published and many other studies with statistically significant
findings being replicated with poor results because published research reports significance that
is not actually seen in the population (Replicability-Index, 2015). Jacob Cohen suggested that, to account for these Type II errors, researchers determine effect size, “a standardized measure of the magnitude of an observed effect” (Field, 2009, p. 785). While a p-value can tell the
researcher that an intervention works, an effect size tells the researcher how much the
intervention works (McLeod, 2019b). Cohen’s d is commonly used to accompany the reporting
of t-tests and is calculated by subtracting the mean of one group from the mean of the other
group before dividing by the standard deviation of the population from which the groups were
sampled (McLeod, 2019b). Cohen suggested that d = 0.2 be considered a small effect size, 0.5
a medium effect size, and 0.8 a large effect size (Cohen, 1988). This means that, regardless of statistical significance, if two groups’ means do not differ by at least 0.2 standard deviations, the effect is considered trivial. Cohen’s d, the standardized difference between two means (Cohen, 1988), was calculated for each variable for this research.
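The two calculations described in this paragraph, the Bonferroni-adjusted significance level and Cohen's d, can be expressed directly. This is a minimal sketch that pools the standard deviation weighted by degrees of freedom, one common convention; the exact pooling formula used in the original analysis is not stated:

```python
import math
from statistics import mean, stdev

# Bonferroni correction: divide the standard alpha by the number of tests.
alpha = 0.05
number_of_tests = 3
adjusted_alpha = alpha / number_of_tests  # about 0.017, as used in this research

def cohens_d(group1, group2):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    pooled_sd = math.sqrt(
        ((n1 - 1) * stdev(group1) ** 2 + (n2 - 1) * stdev(group2) ** 2) / (n1 + n2 - 2)
    )
    return (mean(group1) - mean(group2)) / pooled_sd
```

The absolute value of d is then read against Cohen's 0.2, 0.5, and 0.8 thresholds for small, medium, and large effects.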
Participants
Participants in the VPREIS study were recruited within alternative schools, juvenile
detention centers, and a community service board within the state of Virginia. In order to
participate in the study participants needed to be able to read and write in English at a 5th
grade level, be of high school age, and have not participated in the study previously. VPREIS
chose this study group due to the population’s high risk of contracting sexually transmitted
infections and/or experiencing an unplanned pregnancy. Students in these settings are more
likely to be absent from traditional educational settings and therefore may be absent for sexual
health education.
At the time of writing this proposal, 595 participants were enrolled in the VPREIS study,
with 304 randomly assigned to the intervention group. Participants were between the ages of 14
and 21, identified as male, female, and gender nonconforming, and were white/Caucasian,
black/African American, Hispanic, Asian, and Native American. Some participants were still in
high school while others had completed a GED or were still pursuing a GED at the time of their
participation. While the number of youths assigned to the intervention was over 300, only youth
that had completed the intervention and the post-program survey were eligible for this research.
Many of the intervention participants were still in the process of completing VOY or had not yet begun participation in the VPREIS study. Participants 18 or older consented to participation and those under 18
signed student informed assent forms. Participants were randomly assigned to either a control
group or the intervention group. The intervention group would complete the nine-unit sexual
health program, Vision of You. The control group had the option of completing the online
nutrition program, Eat Move Win. Parents and guardians were offered a $10 incentive to return
a consent form regardless of whether they opted to give consent. Study participants were
offered incentives for completing post-program surveys. All data that was examined for this
research was originally collected under James Madison University’s IRB Approval #20-1486.
Collection and examination of the original VPREIS data for this research was approved by the James Madison University Institutional Review Board.
The Principal Investigator of the VPREIS study gave approval for questionnaire data
collected in Qualtrics to be used for this research as well as the time log data recorded within the
Management System for Vision of You. The researcher works on the VPREIS team as a Data
Specialist but does not have access to the data used for this research as part of her job function.
Instruments
Instruments used for this research included three post-program survey questions
presented to participants in the VPREIS study after completing the VOY program. The surveys
used were developed by the researchers of the VPREIS study and intended to gauge knowledge
gained from the VOY program. Figures 6, 7 and 8 show the survey questions used in this
research. Time-spent data collected in the VOY Management System for each intervention
participant was used and was presented in seconds spent in each activity within the program
units. Roughly 1000 lines of time spent data per participant per unit were recorded. The
researcher did not consider data for this study that was incomplete. If a student only completed a portion of the unit, they did not engage with all of the intended content, and their time-spent data was therefore excluded.
Figure 6
Figure 7
Data Analysis
Data for this research was collected through Qualtrics and the VOY Management
System as well as the JMU grant proposal from 2015. Survey data had no identifying
information when given to the researcher. Data for each survey question being studied was
coded by “1” for correct responses and “0” for incorrect responses. The researcher used SPSS
(Statistical Package for Social Sciences) software to perform the statistical analysis. To
counteract the issues of a small sample size and multiple variables, and to further explore the
possible relationship between time spent and knowledge gained, Cohen’s d was calculated for each test. Participants were able to skip questions on the post-program survey. Due to the limited amount of data currently available, this research also only considered data collected from the immediate post-program survey and not from the three- and nine-month follow-up surveys that participants are also asked to complete.
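A minimal sketch of the coding and grouping step described above, using hypothetical records and field names (the real data came from Qualtrics and the VOY Management System, whose export formats are not shown here):

```python
# Hypothetical response records for illustration only.
responses = [
    {"participant": "A", "answered_correctly": True,  "time_spent_sec": 1450},
    {"participant": "B", "answered_correctly": False, "time_spent_sec": 1210},
    {"participant": "C", "answered_correctly": True,  "time_spent_sec": 1630},
]

# Code correct responses as 1 and incorrect responses as 0.
for record in responses:
    record["code"] = 1 if record["answered_correctly"] else 0

# Split time-spent values into the two groups compared by each t-test.
group_correct = [r["time_spent_sec"] for r in responses if r["code"] == 1]
group_incorrect = [r["time_spent_sec"] for r in responses if r["code"] == 0]
```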
Time spent data for this research is a collection of the amount of time spent in minutes
and seconds that participants were active in the online program. This data is somewhat limited
in showing the engagement level of participants, as it does not track eye or mouse movement
and was not collected by an in-person researcher. Best practices for instructional design can be
used to predict how impactful the unit activities should have been for the participants but
because there is no detailed account of the instructional design, this research is limited by what can be observed. The primary research question asked: How does the creation of the online program, Vision of You, by JMU’s VPREIS team align or diverge from best practices of the instructional design
phases outlined by the ADDIE framework? What are the strengths of the VPREIS project
curriculum and survey design and delivery? What are the limitations of the VPREIS project
curriculum and survey design and delivery? What recommendations are made for future
projects similar to the VPREIS project? What impact does the amount of time spent in the
self-paced online program, Vision of You, have on student performance on the immediate post-program survey questions related to STIs, methods of protection, and consent? Sub-questions considered the impact on post-program survey responses for each of the three units studied.
Analysis. According to the ADDIE Model the beginning phase or activity of analysis
should consist of defining the problem, expectations and needs, identifying goals, resources,
and constraints, identifying the learners and their contexts, as well as establishing the overall project plan.
The VPREIS team carefully outlined the need for sexual health education in Virginia.
Their proposal made the case that rates of teen pregnancy in rural areas were still higher than those of the state overall and that youth involved in the juvenile justice system and alternative education presented a higher risk than their peers. Because the identified learners were more transient than their peers, the team identified a need for a self-paced curriculum that
would not require a facilitator or instructor. They aligned their project needs with a standard
for comprehensive sexual health education and determined what scaffolding needed to occur
to best support the learner. The VPREIS team identified their subject matter experts and
support from agencies like Mathematica for assistance with evaluation and medical accuracy
testing. They also established the HEDG team as their resource for the development phase of
the project and planned to hire a curriculum writer. Prior to beginning any design activities,
the team established partners within their identified population with whom they signed memorandums of understanding. Each of these activities aligns with the activities outlined in the analysis phase of the ADDIE Model.
The VPREIS team outlined few potential constraints for their project. Most
constraints focused on the evaluation phase of the project and possible barriers to collecting
survey data from participants. A thorough analysis of the capabilities of the development and
design team was not outlined. In addition, there was no plan or list of actions to be taken should constraints arise.
Design. Learning objectives, assessments, and strategies, message, media, the delivery system, and an evaluation plan for
the Vision of You program were all outlined early in the project as the design components
needed to be reviewed by the federal funding agency before the grant for the project was
awarded. The delivery method for the program was designed to be online with program
elements including videos, interactive games, and a discussion forum. Relevant research
about gamified learning was used to justify the design of Vision of You which was a strength
of the design process. There was a plan to review prototypes of the program throughout the
development process so that design changes could be made appropriately, which aligns with the iterative nature of the ADDIE Model. Because there is no federal standard for sexual health education, and because the state of Virginia aligns with abstinence-only based education and
at the time did not explicitly require programs to include consent education or materials to be
medically accurate, the team chose not to align with only VA standards even though the
project would take place only in Virginia. The team instead aligned the curriculum with the SIECUS guidelines for comprehensive sexuality education.
A design of the program delivery model was laid out broadly, but not in a detailed
format for the team to review. For example, there was no information provided for how the
designed program elements would be created for the online program. This was a limitation in
design, as it did not give the team an outline to refer back to throughout the process.
The design of the evaluation plan was made based on the recommendations of several
health behavior change theories. The primary outcomes to be evaluated for the program were
to reduce the number of sexual partners, reduce frequency of sexual activity, and increase
contraceptive use. Secondary outcomes for the program were to increase knowledge in
sexual health topics and adulthood preparation subjects. These outcomes were designed to be
evaluated by the administration of a post program survey, a three month follow up survey,
and a nine month follow up survey. This design of summative evaluation was a strength of
the design as it would allow the researchers to understand knowledge retention over time and
not just immediately after program completion. A formative evaluation design was also
proposed through Gateway Quizzes throughout the curriculum that would ensure users had understood the content before advancing.
Nine questions were used in the summative evaluation of the program to evaluate the
program’s delivery. These questions focused on how the user felt the VOY program
compared to other sexual health programs, what topics they wanted to know more about,
what topics were missing and how engaged they felt. The questions were not specific to the
elements of the program such as the videos, games, or interactive elements and did not ask
the user to answer how well they felt they understood the navigation of the program. This is
a limitation in the design of the evaluation as it will not provide specific feedback for the
VPREIS team to use in updating the delivery and usability of the program.
Development. During the development phase of the Vision of You program, project management tools, as recommended in the ADDIE Model, were used to determine a timeline
and budget for development. Prototypes were expected to be viewed by the subject matter
experts before being included into the program. For all of the videos being filmed, a subject
matter expert was present to ensure information was medically accurate and represented the
image and message that the VPREIS team wanted. Videos were viewed by the team before
final edits to sound and coloring were made so as not to waste valuable time and resources
repeating the process should the team need a scene changed. All included images and graphics were reviewed by the VPREIS team as well. These were major strengths of the development process, as they allowed for an iterative process to flow between the design and
development phases. Items could be changed as needed before they got too far along in their
development.
Prototypes of the program elements were not included like the prototypes of videos
were. The process for creating program elements relied entirely on trust that development was taking place in accordance with the timeline for delivery. The VPREIS team was not
able to see how they were designed or functioning until a final deadline was established for
conducting a pilot study. This was a huge limitation for the team in regard to best practices for
instructional design. Not being able to see the development of the program meant that
changes in design could not take place and that it was impossible to judge how closely the
development followed the original design. In addition, an implementation guide could not be
created as the team was unaware of how the program functioned. This made the timeline
very tight for beginning implementation and the delivery of a final product.
Implementation. The first phase of implementation began with a pilot study. Twenty
high school aged youth were recruited to participate in the pilot. Conducting a pilot was a
strength in the process as it would allow for the first set of feedback from the intended
audience. Though high school aged youth were recruited for the pilot, these learners were
technically not representative of the audience the curriculum was intended for. This was a
limitation in the implementation process as it did not give the team an understanding of how
the program would be received by the intended audience, so necessary changes or additions could not be identified in advance.
The high school youth that did complete the pilot found several technical issues that
they regarded as frustrating or impossible to work through. The developer for the program
was present during the pilot to note needed changes as he heard them come up. Students
noted that they really enjoyed the characters that were depicted in the program and would be
interested in learning more about their stories. Because an avatar element had originally been
written into the design of the program but was not included by the time the pilot was conducted, it was decided that one of the needed additions was to develop character avatars
for learners to engage with throughout the curriculum. It was discovered during the pilot that
many program elements initially designed for the program were not included and this was
noted by the VPREIS team. All other technical issues including interactive activities that did
not function, questions that were being marked as incorrect even though they were correct,
“next” buttons that did not advance the user, and pixelated images that were too difficult for
the user to read were other noted changes and fixes identified during the pilot. This process would allow the team to return to the design and development phases briefly before full implementation began.
Aside from the pilot study and the program completion by staff members of the
VPREIS team, the VOY program went through no additional evaluations for functionality
and usability and no testing of beginning behavioral or knowledge outcomes. A detailed list
of errors within the program was identified by the staff, but program elements were not
aligned with best practices for online programs. The list of errors that were still present in the
program were not addressed before the program started to be implemented with new groups
of students. Some errors were not addressed because addressing them was deemed to interfere with the beginning of the implementation. Missing program elements were replaced with graphics
and additional videos as the team discovered there would not be enough time or money to
continue extending the development phase. The VPREIS team hired a different company to
develop interactive games for the curriculum to fill some of the program gaps as well. Once
implementation began, responsibility for the development of the program still rested with a single developer.
Evaluation. While formative evaluation of the program videos was ongoing and productive throughout the creation of VOY, there was no consistent form of formative evaluation for the other program elements aside from the pilot study, which was the first time the VPREIS team saw a version of the final product. This lack of testing and observations was a
huge limitation for this project. A review of the program before it needed to be implemented
would have given the team insight about the workload and capabilities of the developer. This
also meant that there was little time and budget left for revisions which is noted as an
important activity in the evaluation phase of the ADDIE Model. No examination of the teamwork and instructional design process was conducted either, which left no real party accountable for the process.
First data collections from the program participants noted positive outcomes for
behavior change, but seemingly poor results for knowledge gains on the post program
survey. Because these survey results are what is reported to the federal funder it is important
that the team can show the program outcomes here. Knowledge tests throughout the
curriculum have not been evaluated by the team to understand where students may need additional support because, at the moment, they cannot be accessed due to the way the VOY Management System stores them.
Time Spent
Unit 4
Some understanding of how engaged students were with the Vision of You program
could come from examining the amount of time spent within the units. According to the
reviewed literature, program elements that took users longer to work through tended to result in
better knowledge results on test scores. This research does not compare program elements to
each other, but rather observes what if any relationship occurs between the amount of time
spent and the student performance on post program survey questions. In order to compare the
post-program survey question results and the amount of time spent in Unit 4, an independent
samples t-test was conducted. The participants who completed Unit 4 and answered the post-
program survey question incorrectly (n=19) spent slightly more time in the unit (M = 1503.95,
SD = 913.583) compared to the participants (n=50) who completed the same unit and answered
correctly (M = 1420.40 seconds, SD = 631.814). This test, however, did not reach statistical
significance, t(67) = .431, p = .667. The effect size for this analysis (d = 0.106) fell just below Cohen’s (1988) convention for a small effect size (d = 0.2), indicating very little or trivial practical significance of time spent on participant outcomes.
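As a check, Cohen's d can be recomputed from the group summary statistics reported above. This sketch pools the standard deviation weighted by each group's degrees of freedom, so the result (about 0.116 under this convention) differs slightly from the reported d = 0.106, which was presumably computed with a different pooling formula:

```python
import math

def cohens_d_from_stats(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d from reported group means, standard deviations, and sizes,
    using a pooled SD weighted by each group's degrees of freedom."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    )
    return (m1 - m2) / pooled_sd

# Reported Unit 4 values: incorrect group (n = 19) vs. correct group (n = 50).
d = cohens_d_from_stats(1503.95, 913.583, 19, 1420.40, 631.814, 50)
```

Either way of pooling leaves d below the 0.2 small-effect threshold, consistent with the interpretation above.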
The longest amount of time spent in Unit 4 was 4179 seconds, or 69 minutes and the
shortest amount of time 794 seconds, or about 13 minutes. Table 1 breaks down the amount of
time spent into 10-minute differences with sections for less than 15 minutes and over 60
minutes. The number of participants that spent each corresponding amount of time are noted in
the bottom row of the table. Of the 70 participants who completed the Unit, the majority spent
less than 35 minutes, with the average participant spending 23 minutes in the Unit.
Table 8
Time Spent in Unit 4 (in minutes) and Number of Participants
<15 | 15–25 | 25–35 | 35–45 | 45–60 | >60
14 | 35 | 10 | 5 | 5 | 1
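The counts in Table 8 can be tallied from the raw per-participant seconds with a simple binning routine. The exact bin edges are assumptions based on the description in the text (10-minute ranges with catch-all bins under 15 and over 60 minutes), not confirmed from the original data:

```python
def bin_time_spent(seconds_list):
    """Tally participants into minute-range bins; the edges are assumed
    from the table description, not confirmed from the original data."""
    labels = ["<15", "15-25", "25-35", "35-45", "45-60", ">60"]
    upper_bounds = [15, 25, 35, 45, 60]  # upper edges in minutes
    counts = {label: 0 for label in labels}
    for seconds in seconds_list:
        minutes = seconds / 60
        for label, upper in zip(labels, upper_bounds):
            if minutes < upper:
                counts[label] += 1
                break
        else:
            counts[">60"] += 1  # no upper bound matched
    return counts

# Hypothetical raw values; 4179 s and 794 s are the reported Unit 4 extremes.
tally = bin_time_spent([794, 1200, 2400, 4179])
```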
There is no statistical evidence to indicate that the amount of time spent in Unit 4 had any
significant impact on participant survey outcomes. Many participants in this sample spent less time in the Unit (an average of 23 minutes) than the developers intended (45 – 60 minutes). This may indicate that the Unit lacks sufficient content or that the activities were below the level at which the average participant would be meaningfully challenged or engaged.
Unit 6
The second independent t-test was conducted for the (n=73) participants who
completed Unit 6 and answered the corresponding post-program survey question. The
participants who completed Unit 6 and answered the post-program survey question incorrectly
(n=40) spent slightly less time in the unit (M = 2650.98 seconds, SD = 2542.957) compared to
the participants who completed the same unit and answered correctly (n=33) (M = 2730.58, SD
= 1660.432).
While a quick glance at these results may seem to indicate support of the original stated
hypothesis (participants who spend longer in the unit will be more likely to get the post-
program survey question correct), this test did not reach statistical significance, t(71) = -.155, p = .878, and so does not indicate that time spent had any significant impact on how participants
answered the survey question. The effect size for this analysis (d = 0.0370) fell far below
Cohen’s (1988) convention for indicating a small effect size (d = 0.2), thus indicating no practical significance.
The greatest amount of time spent in Unit 6 was just over four hours and the next greatest
amount of time just over two hours. While there could be several reasons that these times are not
entirely accurate, a little over half of the participants spent over 35 minutes in the program.
Table 9
Time Spent in Unit 6 (in minutes) and Number of Participants
<15 | 15–25 | 25–35 | 35–45 | 45–60 | >60
2 | 13 | 20 | 13 | 10 | 15
This unit took participants, on average, 45 minutes to complete, but with the outliers removed, the average was closer to 39 minutes. This is considerably longer than the previously
studied Unit 4 (23 minutes) and could be appropriate as Unit 6 may be considered a more
advanced subject, sexually transmitted infections. This average was also the amount of time
the program developers expected participants to take. While only two participants spent less
than 15 minutes in the unit, that time spent is a concern to the researcher as the VOY
Management System indicates that the lesson was completed in its entirety by the participants.
This unit’s content could not be completed in under 15 minutes, because the total amount of timed content in activities and videos exceeds 15 minutes. By design, users should not be able to advance past videos and activities that have an advancing button with a time lock. This may indicate that participants were able to advance through the unit without completing the activities or that a glitch in the VOY Management System recorded the unit as completed when it was not.
Additionally, since the number of participants answering the post-survey question incorrectly is similar to the number answering correctly, it could be inferred that the survey question or the related content was not well understood by participants.
Unit 7
The final independent t-test was conducted for the (n=73) participants who completed
Unit 7 and answered the corresponding post-program survey question. The participants who
completed Unit 7 and answered the post-program survey question incorrectly (n=36) spent
noticeably less time in the unit (M = 1524.53 seconds, SD = 703.860) compared to the
participants who completed the same unit and answered correctly (n=37) (M = 2126.16, SD =
1721.393). Similar to the results for Unit 6, these results seem to indicate support of the
original stated hypothesis (participants who spend longer in the unit will be more likely to get
the post-program survey question correct). Preliminary results indicate that statistical
significance was much closer to being achieved than the previous two units, t (71) = - 1.945, p
= .056. However, conducting multiple tests increases the likelihood of finding statistically significant results by chance alone. For this reason, a Bonferroni adjustment (Dunn, 1961) was made by dividing the standard significance level (p < 0.05) by the number of tests being conducted (3), leaving a significance level of only 0.017, which
indicates that statistical significance was not found. The effect size for this analysis (d = 0.457), however, approaches Cohen’s (1988) convention for a medium effect size (d = 0.5), thus indicating some practical significance. This practical significance could indicate that the content in Unit 7 is more appropriately challenging for the user, is already familiar to the user, or was presented in a way that participants felt more
engaged with. These results could indicate that the amount of time spent has some impact on
participant learning outcomes. These results are also similar to the results noted from Bull et al. (2012) and Brouwer (2011) that indicated support of increased dosage for positive learning
outcomes in internet-based interventions, but a need for future research to fully understand
time spent factors.
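The pooled t statistic, Cohen's d, and the Bonferroni-adjusted threshold described above can be checked directly from the reported Unit 7 summary statistics. The following is a minimal Python sketch, not the original analysis, offered only to make the arithmetic concrete:

```python
import math

def pooled_t_and_d(m1, s1, n1, m2, s2, n2):
    """Pooled two-sample t statistic and Cohen's d from summary statistics."""
    # Pooled variance across the two groups
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    sp = math.sqrt(sp2)
    # Student's t for independent samples, equal variances assumed
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    # Cohen's d: standardized mean difference
    d = abs(m1 - m2) / sp
    return t, d

# Unit 7 summary statistics: incorrect (n = 36) vs. correct (n = 37) responders
t, d = pooled_t_and_d(1524.53, 703.860, 36, 2126.16, 1721.393, 37)

# Bonferroni adjustment: standard alpha divided by the three tests conducted
alpha_bonferroni = 0.05 / 3  # ~0.0167
```

Running this yields t ≈ -1.94 and d ≈ 0.46, matching the reported t(71) = -1.945 and d = 0.457 within rounding.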
Most participants spent between 15 and 35 minutes in the unit, which is less than the developers intended. This could, as in Unit 4, indicate that the content was not at the appropriate level to challenge the participants or that not enough content was available.
Table 10
12 25 18 10 3 5
The greatest amount of time spent in Unit 7 was 2.5 hours. A total of 12 participants spent less
than 15 minutes in the unit. These times may indicate that some participants were able to
navigate past required activities, while others may have gotten stuck or spent more time engaged with the content.
The instructional design review revealed both strengths and several limitations, as described in the results section. It is the researcher's recommendation
after observing the instructional design process through the ADDIE Model that the VPREIS
team consider usability testing in the future and collect evaluation data from the VOY program
directly to understand where the curriculum may need to be redesigned or better supported.
Though the VPREIS team did an extensive review of the needs of youth in Virginia, particularly
rural youth and youth in the juvenile justice system or alternative education, they did not pilot
the curriculum with the youth who would eventually be involved in the study. There was also no
representation of knowledgeable individuals who work with youth in these settings who could have informed the design team about what elements might work best for the youth.
The development phase is where the creation of Vision of You struggled most, and because in the ADDIE Model one phase impacts another, it could be said that the analysis and design phases were not strong enough, which is why the development phase faltered. The creation of certain program elements was delayed by over a year, and some elements were never developed at all, replaced instead by lower-tech, more cost-effective options. The entire project was out of money and time, with only a partially completed program, by the time implementation should have started. A more thorough analysis of the amount of work the project would entail and of the skills the team had available to accomplish it could have benefited this project.
Finally, it is the recommendation of the researcher that should a project like this be
attempted again that the team consider a learning management system that is already developed
in order to concentrate more on the program elements rather than the building of the system
which took considerable time and money. The researcher also recommends that when possible,
there should never be just one individual responsible for the majority of one part of the project.
Because there was one sole coder for the Vision of You program itself and that individual left the
project when implementation began, the team had no one to turn to with technical issues. In
addition, the amount of responsibility that individual had was likely overwhelming and resulted in an unsustainable workload.
Though statistical significance was not found for any of the conducted t-tests, results
from this research will still prove useful to the researcher and the VPREIS team. For Units 4
and 7, participants spent far less time on average than the developers intended. According to
the literature reviewed, programs with elements that take the learner longer to engage with
tend to show better test scores. The results may indicate that the program elements could be
enhanced so that the learner is more engaged by them. This finding is supported by the evidence collected in the instructional design analysis, which noted that program elements were abandoned due to a lack of time, money, and staff capability and replaced by elements that were more cost effective but less engaging for the learner. The average times spent in the
curriculum are important to the VPREIS team for communicating to teachers the expected
amount of class time this curriculum would use. Regardless of the statistically significant
findings, knowing the average amount of time spent is important to educators for planning
purposes as well as identifying students that may need additional support with the program or
content.
Many participants spent less than 15 minutes in the program units. This could indicate that participants were not meaningfully engaged with the content, that the VOY Management System has an operating flaw, or that participants were able to navigate through activities without completing them entirely. By design, the last of these should not be possible: mechanisms are in place that prevent the user from skipping a video before it has finished playing or moving beyond an activity before answering required questions. More research
should be conducted by the VPREIS team to ensure that the curriculum is operating as it should, as this testing was not performed prior to implementation.
Even though the average amount of time spent in Unit 6 was closer to the intended
amount by the developers, it did not appear to positively impact participant outcomes on
the post-program survey. In fact, participants appeared to perform worse on the post-
program survey question for this unit than on the other two units' questions. This will be discussed further in the paragraphs that follow.
The results of this research may indicate a need to improve the program survey
questions, especially related to knowledge gains. While it could be true that participants did
not know the correct response, the number of participants that answered incorrectly could also
be due to a confusing question format or wording, unclear instructions, or the placement of the
question within the survey (participants may feel fatigued by the time they get to the question
and feel less motivated), as the survey takes around 30 minutes to complete due to the number of questions it contains.
Questions on the post-program survey are also not written in a way that draws participants
back to the activities they participated in. It was previously stated that of the units studied,
participants spent the longest amount of time in Unit 6, even with outliers excluded.
Unit 6 covers content related to sexually transmitted infections and does so through a series of
videos where young actors meet anthropomorphized versions of the eight most common
sexually transmitted infections. The VPREIS recruitment specialist stated, “I can always tell
when students get to the STI Unit because they are laughing out loud and start pointing out
their favorite moments,” (Jo Benjamin, personal communication, February 18, 2020). This
anecdotal evidence could indicate that participants spent more time in this unit because of how
much they enjoy it and are entertained by it. Moyer-Gusé (2008) defines edutainment as content designed to entertain while it educates; presenting content within this framework may result in more positive survey responses. The user's enjoyment or degree to
which they felt entertained by the content or unit is not an indicator for assuming they learned
something from it. The survey results for Unit 6 may indicate that the content was presented in
a fun way, but not in an effective way for learning. Because there are already knowledge-based
questions built into the Vision of You program, it would be helpful for the researchers to
examine student responses to those questions to understand what knowledge is being gained
from the content and what knowledge remains by the time the post program survey is given.
For Unit 4, most participants answered the post-program survey correctly, though this
was the unit participants, on average, spent the least amount of time in. The methodology for
this research did not include a comparison to participants’ responses on the baseline surveys.
Especially for Unit 4 (since this unit contained content that was less related to a hard science
than Units 6 and 7 and may be more common knowledge) comparing the baseline results to
the post survey results would have helped the researcher understand what knowledge was
truly gained as a result of completing the VOY program. This indicates a limitation in the
research design.
Practical Significance
None of the studied units were shown to have statistical significance, but two were determined to have small to medium effect sizes, indicating that the results are relevant to the population despite the lack of statistical significance. This information can guide the
VPREIS team as they begin to disseminate results of their study and plan for improvements to
the program and survey tools and create guidelines for the use of the VOY program. Time
spent is an important factor for educators especially in lesson planning and feasibility of
completing a program, but in this study, it did not necessarily indicate a strong impact on
student success. Even in a self-paced program, if students range from taking thirteen minutes to complete a unit all the way to four hours, that variability can be a major hurdle for planning and assessments. A look at time spent and survey results may also help the VPREIS team in
determining where new content could be added based on adolescents' needs and educational standards.
Limitations
This research was limited in several ways. First, the amount of data considered was
relatively small. At the time the research was conducted, only participants who had successfully completed the VOY program and the post-program survey could be considered, which also kept the sample size small. Vision of You covers sensitive topics, and many of the
post-program survey questions require the participants to consider their own behaviors and
attitudes around these topics. For some participants, answering these questions could be
traumatic. For that reason, none of the questions on the survey, including content-related questions, forces a response from the participant. While this protects the participant, it also
limits data.
Time spent data for this research was collected from the VOY Management System that
is used to track student progress, among other things. Data was exported to a very large Excel sheet containing about 10,000 rows of data per participant. There is a chance of human error in summing the total amount of time spent for each participant, but because Excel's built-in functions were used, the results should be reliable. While participants are automatically logged out after a short period of inactivity, those moments of inactivity could still add up. This management system is also not able to indicate if and at what point a student may have experienced a technical error, or whether the time clock was still counting during the error.
As previously mentioned, time spent data was also exported in a format that is not user
friendly. For example, the VPREIS team is not able to see an automatically calculated total of
the time each user spent within the program from the VOY Management System. Instead, user
data is exported to an Excel sheet which lists the time in seconds per each unit function. User
data is listed in order by the first user to log in to the program. The users’ ID number is listed
with the record of time spent in each program unit, but there is no additional organization to the
data. In order to find the time spent for each unit the researcher needed to use a sum function in
Microsoft Excel to add up the time for each unit for each participant. Each unit contained about 1,000 rows of time spent data (roughly 9,000 to 10,000 rows of data per participant). While this process was necessary for the completion of this research, it was an arduous task that could be streamlined in future work by improving the data export.
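The manual summation described above could also be scripted. Below is a minimal Python sketch, assuming a hypothetical three-column export layout (participant ID, unit, seconds per unit function), since the real VOY Management System schema is not documented here:

```python
from collections import defaultdict

# Hypothetical export rows; the column names and layout are assumptions,
# not the real VOY Management System schema.
rows = [
    ("P001", "Unit 4", 120.0),
    ("P001", "Unit 4", 95.5),
    ("P001", "Unit 6", 300.0),
    ("P002", "Unit 4", 210.0),
]

# Sum seconds per (participant, unit), replacing the per-unit SUM step in Excel
totals = defaultdict(float)
for participant_id, unit, seconds in rows:
    totals[(participant_id, unit)] += seconds
```

An approach like this would remove the per-participant manual step and reduce the chance of the human error noted above.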
Finally, the time spent in each unit per participant was not matched across units and
participant’s baseline surveys were not matched to their post surveys. By not matching the time
spent data across units, the researcher cannot infer whether some participants consistently took
more time or less time, which may indicate a student’s ability to navigate the problem and could
potentially highlight accessibility issues. By not matching the baseline surveys with the post-
program surveys it cannot be determined if the researcher saw knowledge gains due to the VOY
program, or if participants had prior content knowledge. Prior content knowledge could have
Personal Reflection
What I Brought
My work requires a person to wear multiple hats. I was familiar with the role of educator from my
social work background which is what led me to working with and educating youth in sexual
health. That role led me to assisting in the creation of sexual health education programs and
eventually to a brand-new role of researching and evaluating a program. These roles led me to the Educational Technology program at JMU.
One of the first things I learned in the field of Social Work was the steps of intervention, which guide the practitioner through learning about an individual, group, or community; assessing their needs; planning an intervention; implementing the intervention; evaluating for effectiveness; and ending the relationship. What I discovered in the Educational Technology program is that a very
similar process, called the ADDIE Model, exists to guide professionals in much the same way.
While I had plenty of practice applying a similar model to groups of people, I had not thought
about using it to inform our design of an entire curriculum. What I had also learned in social work was how to apply theory both to the people I worked with and to the interventions I chose.
Education Technology brought on the same expectations, though with new theories to learn and
apply.
Having been in the field of sexual health education for several years I felt confident
about my knowledge in what young people want and need in their sexual health education. In
starting the Ed Tech program at JMU I had already spent two years helping to develop an
online sexual health program which, for the sexual health field and largely rural areas, is
incredibly innovative. I quickly learned that while I had valuable skills and knowledge about the needs of young people, I really knew nothing about instructional design, or at least not
enough to be effective at it. While this could have been an unsettling realization for someone
that had just created instruction for a multimillion-dollar project, I chose instead to focus on
where to go from there. What I brought to the VPREIS team at JMU before starting the Ed
Tech program were interpersonal skills with young people and knowledge about people’s
needs. Those same skills led me to the Ed Tech program. In the past two years of the Ed Tech
program, I feel I have been able to bring my team the understanding of a process that will help
meet the needs of the people we serve in a new way, as well as ongoing questions about how to improve and continually evaluate the programs we teach and create so that they fit education standards.
This project focused on secondary data, which to a Social Worker and an educator can
feel rather cold and detached from the living breathing humans the data is about. To a researcher
this data is fun, and its analysis and manipulation is interesting and exciting. These two roles can
feel at odds with one another. On the surface one of these roles focuses on the human and the
other on the spreadsheet, but both are important and more complicated than they appear. Both roles
are present in the fields I know, social work and education, and a professional should know how
to be both. What I have tried to bring to this applied research are professional skills from each of
these roles. I chose to conduct research that focused primarily on numbers and what they mean,
but by understanding them, I will better understand people and be able to ask new questions.
Through this research process, and much of my work in the Ed Tech program, I have
learned how connected I am to my work. I not only enjoy what I do, but I hope to continually
grow and become better at it. I chose the applied research project I did and the methods I used
largely to support the work my office does and the program I have been working closely with
for the last three years. This meant stepping out of my comfort zone a bit and working closely
with the VPREIS team’s program evaluator to become more familiar with statistics and
statistical analysis. Through this process I have been able to brush the dust off some of the
research courses I have taken in the past and use the knowledge I have gained in this program.
I believe where I truly see the benefit of the Ed Tech program in this applied research
project is in the next steps. With the results I have gathered I will be able to further assist my
team in the evaluation of VOY and provide ideas and frameworks for new projects to come.
Prior to the Ed Tech course, I may have been satisfied with the results of the project and left
well enough alone, but instead I have learned to evaluate many aspects of design and
instruction. There is truly not an end to this project as the questions that have come from it will
need to be answered and more questions will likely come from those answers.
Taking a critical look back at my project I know that there were limitations and moving
forward I would like to be able to address them in new research. Once more points of data are
available, I would like to conduct similar tests with more participants. I would also like to be
able to test time spent in Vision of You as an observing researcher and take note of where
participants lag or struggle. Further studies will also need to determine what accessibility
issues are present in the curriculum, as looking at time spent alone cannot tell us why a participant took longer or shorter than average. Where this is not possible, I would
like to provide recommendations for upgrades to the management system and what it tracks.
This research had several important findings for the VPREIS team. An understanding
of the instructional design strengths and limitations as well as recommendations for the future
can prepare the team for new projects or provide a framework to work from if the VOY
program undergoes more studies. For the statistical analysis portion of this study it was found
that on average, participants spent less time on program units than expected. The VPREIS
team will need to explore any technical issues present in the VOY program or VOY Management System, such as allowing users to navigate past required activities, causing participants to get stuck and unable to advance, or inaccurately capturing the time spent. A review of how many meaningfully engaging activities
are present will also be important if findings continue to indicate a lack of knowledge gains.
Finally, a look at other dimensions of time spent, including the duration, or how many sessions
over a span of time, could be studied alongside time spent in minutes within the program units. Usability testing is also recommended to determine how user friendly the VOY program is for young people. Gamification, videos, and
games are what make the Vision of You program unique and innovative compared to other
sexual health education programs. A study to determine how various populations respond to
these program elements (for instance, reactions across age, sex, and ethnicity) would also
be a point of interest for the researcher and the VPREIS team as well. Before engaging in those
points of research though, results from this study indicate that further testing of accessibility
and usability should be conducted prior to offering the program to a broader audience.
References
Alexander, G. L., Mcclure, J. B., Calvi, J. H., Divine, G. W., Stopponi, M. A., Rolnick, S. J.,
All, A., Nuñez Castellar, E. P., & Van Looy, J. (2016). Assessing the effectiveness of
digital game-based learning: Best practices. Computers & Education, 92–93, 90–
103. https://doi.org/10.1016/j.compedu.2015.10.007
Allen, W.C. (2006). Overview and evolution of the ADDIE training system. Advances in
Baker, S.A., Morrison, D.M., Carter, W.B., & Verdon, M.S. (1996). Using the theory of
reasoned action (TRA) to understand the decision to use condoms in an STD clinic
Blackmore, C., Tantam, D., & Van Deurzen, E. (2006). The role of the eTutor—evaluating
Beal, S. J., Nause, K., Crosby, I., & Greiner, M. (2018). Understanding health risks for
Bennett, A. T., Patel, D. A., Carlos, R. C., Zochowski, M. K., Pennewell, S. M., Chi, A. M.,
10.1089/jwh.2015.5251
Brindis, C., Sattley, D., Mamo, L. (2017). Advancing the field of teenage pregnancy
Brouwer, W., Kroeze, W., Crutzen, R., Nooijer, J. D., Vries, N. K. D., Brug, J., & Oenema,
Bull, S. S., Levine, D. K., Black, S. R., Schmiege, S. J., & Santelli, J. (2012). Social media–
Champion, V. L., & Skinner, C. S. (2008). The health belief model. In K. Glanz, B. K.
Cheng, G., & Chau, J. (2014). Exploring the relationships between learning styles, online
doi: 10.1111/bjet.12243
Cohen, J. (1988). Statistical power analysis for the behavioral sciences. L. Erlbaum Associates.
Cook, D. A., Levinson, A. J., & Garside, S. (2010). Time and learning in internet-based
Deshler, D. A., Morishige, K., Johns, C. (2008, July 1). Fidelity! Fidelity! Fidelity! --
blog/entry/1/12
Dunn, O.J., (1961). Multiple comparisons among means. Journal of the American
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the
doi: 10.1007/s10464-008-9165-0
https://doi.org/10.1080/00220671.1980.10885233
Fuhr, K., Schröder, J., Berger, T., Moritz, S., Meyer, B., Lutz, W., … Klein, J. P. (2018). The
Gottfredson, D. C., Gerstenblith, S. A., Soule, D. A., Womer, S. C., & Lu, S. (2004). Do after
10.1023/B:PREV.0000045359.41696.02
Harp, S.F., & Mayer, R.E. (1998). How seductive details do their damage: A theory of cognitive
Herz, D., Lee, P., Lutz, L., Stewart, M., Tuell, J., Wiig, J., … Kelley, E. (2012). Addressing the
needs of multi-system youth: Strengthening the connection between child welfare and
content/uploads/2015/03/MultiSystemYouth_March2012.pdf
Hirshfield, S., Downing, M. J., Chiasson, M. A., Yoon, I. S., Houang, S. T., Teran, R. A.,
men living with HIV. AIDS and Behavior, 23(11), 3103–3118. doi:
10.1007/s10461-019- 02498-5
Holstrom, A. M. (2015). Sexuality education goes viral: What we know about online
Jackson, D. D., Ingram, L. A., Boyer, C. B., Robillard, A., & Huhns, M. N. (2016). Can
technology decrease sexual risk behaviors among young people? Results of a pilot
study examining the effectiveness of a mobile application intervention. American
Legrand, K., Bonsergent, E., Latarche, C., Empereur, F., Collin, J. F., Lecomte, E., …
A framework and a tool. Application to the diet and physical activity promotion
2288-12-146
Lohr, L., & Ursyn, A. (2010). Visualizing the instructional design process: Seven usability
1874/CGP/v04i02/37869
McLeod, S. A. (2019a, July 04). What are type I and type II errors? Simply psychology:
https://www.simplypsychology.org/type_I_and_type_II_errors.html
McLeod, S. A. (2019b, July 10). What does effect size tell you? Simply psychology:
https://www.simplypsychology.org/effect-size.html
Maeda, E., Boivin, J., Toyokawa, S., Murata, K., & Saito, H. (2018). Two-year follow-up of
10.1093/humrep/dey293
Mann, S., & Bailey, J.V. (2015). Implementation of digital interventions for sexual health
for young people. In 2nd Behaviour Change Conference: Digital Health and
Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press.
McKimm, J., Jollie, C., & Cantillon, P. (2003, April 19). Web based learning. Retrieved from
https://www.bmj.com/content/326/7394/870
Michinov, N., Brunot, S., Bohec, O. L., Juhel, J., & Delaval, M. (2011).
10.1016/j.compedu.2010.07.025
Moreno, R. (2009). Learning from animated classroom exemplars: The case for
Muñoz-Silva, A., Sánchez-García, M., Nunes, C., & Martins, A. (2007). Gender differences in
condom use prediction with Theory of Reasoned Action and Planned Behaviour: The role
10.1080/09540120701402772
Nichols Hess, A. K., & Greer, K. (2016). Designing for engagement: Using the ADDIE model to
Nicholson, D. T., Chalk, C., Funnell, W. R. J., & Daniel, S. J. (2006). Can virtual reality improve
the amount of time learners spend in on-task behaviors. Intervention in School and
Personal responsibility education program innovative strategies fact sheet. (n.d.). Retrieved from
https://www.acf.hhs.gov/fysb/resource/preis-fact-sheet
https://replicationindex.com/tag/cohens-d/
Rosenstock, I. M., Strecher, V. J., & Becker, M. H. (1988). Social learning theory and the
10.1177/109019818801500203
Rossell, C. H., & Baker, K. (1996). The educational effectiveness of bilingual education.
Schittek Janda, M., Tani Botticelli, A., Mattheos, N., Nebel, D., Wagner, A., Nattestad, A., et al.
Scull, T. M., Kupersmidt, J. B., Malik, C. V., & Keefe, E. M. (2018). Examining the efficacy of
an mHealth media literacy education program for sexual health promotion in older
adolescents attending community college. Journal of American College Health, 66(3),
165–177.
Shegog, R., Baumler, E., Addy, R. C., Peskin, M., Thiel, M. A., Tortolero, S. R., & Markham,
C. (2017). Sexual Health Education for Behavior Change: How Much Is Enough?
Spickard, A. III, Alrajeh, N., Cordray, D., & Gigante, J. (2002). Learning about screening using
540–545.
Spurlock, D. R., & Spurlock, D. (2019). Defining practical significance is hard, but we should
https://doi.org/10.3928/01484834-20191021-02
Tang, H., Xing, W., & Pei, B. (2018). Time Really Matters: Understanding the temporal
10.1177/0735633118784705
Tucker, L. A., George, G., Reardon, C., & Panday, S. (2015). ‘Learning the basics’:
Tunuguntla, R., Rodriguez, O., Ruiz, J. G., Qadri, S. S., Mintzer, M. J., Van Zuilen, M.
early childhood care and education: It's complicated. Administration for Children
Widman, L., Golin, C. E., Kamke, K., Burnette, J. L., & Prinstein, M. J. (2018). Sexual
Wilson, D. K., Sweeney, A. M., Law, L. H., Kitzman-Ulrich, H., & Resnicow, K. (2018). Web-
based program exposure and retention in the families improving together for weight loss
Wulfert, E. & Wan, C.K., (1995). Safer sex intentions and condom use viewed from a health
belief, reasoned action, and social cognitive perspective. The Journal of Sex
Zaslow, M., Anderson, R., Redd, Z., Wessel, J., Daneri, P., Green, K., … Martinez, B. I. (2016).
I. quality thresholds, features, and dosage in early care and education: Introduction and
literature review. Monographs of the Society for Research in Child Development, 81(2),
7–26. https://doi.org/10.1111/mono.12