
Running head: EVALUATION VR MODULE

Evaluation of the Virtual Reality for Instructional Design Module

Justin Baffico and Carlotta Rhea Clark

California State University Monterey Bay

IST 622 - Assessment and Evaluation

Dr. Bude Su

July 25, 2017



Table of Contents

Introduction
Methodology
    Prototype
    Learners
    Tryout Conditions
    Process
Results
    Entry Conditions
    Instruction
    Outcomes
Recommendations
Summary
Appendix A
    Pretest Description
    Posttest Description
    Pretest and Posttest Questions
Appendix B
    Conclusion and Feedback Text Description
    Conclusion and Feedback Survey Questions



Introduction

VR (Virtual Reality) for Instructional Design is a training course that aims to prepare instructional designers to identify when a training scenario should consider virtual reality as a medium for training. Virtual reality is becoming increasingly popular, both for entertainment and in industry. Instructional designers need to be able to identify when VR should be considered for training purposes. The VR for Instructional Design course has the potential to make instructional designers relevant and viable assets to organizations as technology advances into the virtual domain.

Methodology

Prototype

The prototype training course, VR for Instructional Design, was created in IST 522 - Instructional Design at California State University Monterey Bay in the fall of 2016 by ourselves and several other MIST students (see appendix). The course is designed to teach learners how to identify when virtual reality should be considered for training scenarios. The course is contained within a single Google Form with multiple pages. Data is collected after the user submits the form and automatically updates a Google Sheet to be used for data analysis. The course objective and descriptions are outlined below.

Main Objective. After completing this course, participants will be able to identify when

VR technology should be considered for instructional design purposes with 100 percent

accuracy.

Agenda and Descriptions. The course consists of seven parts. Each part is sequential and must be completed before the participant can view the next part. The parts and a description of their contents are listed below:

1. Agenda and Objectives. On this page of the Google Form, participants are presented with an agenda for the learning module, as well as the objective. The participant verifies that they have at least twenty minutes to dedicate to the course. Their email is also recorded for paired t-test analysis of their pretest and posttest.

2. Pretest. Five multiple choice questions that examine the participant's knowledge of virtual reality in training scenarios. The same five questions are asked in the posttest assessment.

3. Introduction. This section introduces the participant to the topic of virtual reality in instructional design. It includes a brief overview of the concept, along with three YouTube videos embedded on the page. The participant is instructed to view at least one of the videos. Data is collected for this section.

4. Direct Instruction. The participant views an embedded YouTube video. Below the presentation are four questions that check for understanding of information written on the slides. The questions must be answered correctly before the participant can move on to the next section.

5. Guided Practice. The participants read a sample training scenario and work through the qualifying questions to determine if virtual reality is suitable for that scenario.

6. Assessment. This section of the lesson assesses the learning objective through a series of multiple choice questions. The questions are identical to the pretest questions.



7. Feedback and Conclusion. The final section gives the user a chance to reflect on the course and confirms that the course is complete.
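Because each Google Form submission appends one row to the linked Google Sheet, the resulting data can be exported and analyzed with standard tools. The following is a minimal Python sketch of that step, assuming the sheet is exported as a CSV; the file name and column headers ("Email", "Pretest Score", "Posttest Score") are hypothetical placeholders rather than the actual sheet layout.

    import pandas as pd

    # Load the exported Google Sheet (CSV export of the form responses).
    # The file name and column headers are hypothetical placeholders.
    responses = pd.read_csv("vr_course_responses.csv")

    # Each submission is one row, so pretest and posttest answers are already
    # paired; the recorded email can be used to confirm one row per learner.
    assert responses["Email"].is_unique

    # Quick descriptive look at the paired scores before any significance test.
    print(responses[["Pretest Score", "Posttest Score"]].describe())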

Learners

The target learning population for this course is instructional designers (including teachers) who are not experts in virtual reality instruction. Our experimental population included members of the CSUMB (California State University Monterey Bay) MIST (Masters of Instructional Science and Technology) program, as well as classroom teachers in California public or charter schools. Usability test participants included both sub-populations.

It was critical that our learners have a vested interest in instructional design. Learners outside of this population could have skewed assessment scores due to the lack of impact on their respective fields. Users within the target population can use information learned in the course to influence decisions about virtual reality and training, and thereby make an impact in the area of instructional design.

Twenty-eight learners took the course under the tryout conditions, including five under

usability test conditions. Eleven of the participants were MIST program students, and twenty

participants were classroom teachers at the time the course was taken - see table 1.1.

Tryout Conditions

Since the prototype allows for asynchronous completion of the course, most participants completed the course under their own conditions. All participants noted before starting that they were able to dedicate a continuous twenty minutes to completing the course. Five of the participants were observed: three in person and two over a screencast session with an examiner. All observed sessions included only the participant and one examiner.

Process

The tryout process consisted of three parts: 1) a pretest and posttest to measure learning, 2) observations, and 3) a feedback questionnaire.

Pretest and Posttest. The pretest and posttest are present to measure learning of the targeted objective. The pretest is given before instruction and is used to determine participants' knowledge of VR and instructional design before learning tasks are presented. The pretest consists of five multiple choice questions asking whether VR should be considered for a certain training scenario. The posttest consists of the same five questions, allowing the experiment to use repeated measurements.

Observations. Five of the twenty-eight participants were observed by one of two examiners for the purpose of a usability test. Examiners would ask clarifying questions to identify what the participant was looking for on the page. The examiners would also ask the participants to read aloud whenever they reread a portion of the page. The observations were video recorded to document statements and keep track of the time spent on each portion of the course.

Feedback Questionnaire. A feedback questionnaire was administered immediately following the posttest and was required for the course to be completed. The questionnaire asked scaled questions about self-reported knowledge acquisition and course recommendation, and included a general feedback comments section.

Results

Entry Conditions

Intended and observed entry conditions were similar. Only individuals who had a stake in learning about virtual reality and instructional design (stakeholders) were asked to participate. Examiners compiled a list of stakeholders, including K-12 educators and MIST students. The course was directed at individuals who did not have prior experience identifying when virtual reality should be used in training scenarios. Our feedback questionnaire indicated that participants self-reported an average initial understanding quotient of 2.07 (1 being little to no prior knowledge, 5 being a great deal of prior knowledge).

Instruction

Instruction was evaluated through the feedback questionnaire and observation. The questionnaire allowed participants to reflect on what they liked about the course and what they found troubling or difficult.

Intended Instruction. Instruction was intended to be easy to navigate. Google Forms is not often used to deliver a course or training material; it is more commonly used to collect feedback or assess instruction. The examiners wanted to know how the instruction was received, as well as whether it could be considered effective. The designers indicated that the course should take the learner no longer than twenty minutes, which the examiners were able to test through observation.

Observed Instruction. Examiners determined that the average instructional time for the five people observed was 18.9 minutes. One participant went over the twenty-minute time allotted to take the course. We hypothesize that the time observed participants spent on the course could be inflated because they had to read aloud and clarify different parts of the course. Participants spent the longest time on the direct instruction portion of the course - see table 2.1 below.

Table 2.1. Time spent by observed participants on each portion of the course.

Comments in the feedback portion indicated that three participants thought the instruction needed no improvements. Two participants indicated that they would have liked to see more guided practice. One participant thought the course could have been more engaging. One participant thought the posttest questions should be different from the pretest questions - see figure 2.2.

Figure 2.2. Participant feedback comments on the instruction.

Outcomes

Intended Outcomes. Our team hypothesized that completion of the prototype would increase participants' test scores. The null hypothesis is that there would be no significant difference between pretest and posttest scores. The research hypothesis is that there would be a statistically significant increase between the pretest and posttest scores of the same participants.

Observed Outcomes. Mean scores increased from 4.14 on the pretest to 4.5 on the posttest. Since all participants completed both a pretest and a posttest assessment, we used a paired two-sample t-test for dependent samples to test for statistical significance. The test statistic was 2.56. Since our hypothesis was directional, we compared it to the one-tailed critical t value. Because our test statistic was greater than the one-tailed critical value, we reject the null hypothesis and accept the research hypothesis - see table 2.3 below. The course was a statistically significant factor in participants' increase from pretest to posttest scores.

Table 2.3. Paired two-sample t-test results for pretest and posttest scores.
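For readers who want to reproduce this type of analysis, below is a minimal Python sketch of a paired (dependent-samples), one-tailed t-test using SciPy. The score lists are illustrative placeholders, not the study's raw data, and the alternative argument requires SciPy 1.6 or later.

    from scipy import stats

    # Illustrative paired scores (one pretest and one posttest value per
    # participant); these are placeholders, not the study's raw data.
    pretest = [4, 5, 3, 4, 5, 4, 3, 5]
    posttest = [5, 5, 4, 4, 5, 5, 4, 5]

    # One-tailed paired t-test: is the posttest mean significantly greater
    # than the pretest mean for the same participants?
    result = stats.ttest_rel(posttest, pretest, alternative="greater")
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")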

Learners were tested on only one learning objective. Since the test showed a statistically significant impact on learning gains, we consider the learning outcomes successful. In the feedback portion of the course, learners were asked about their understanding of VR and instructional design before the course and their level of understanding after the course - see table 2.4. Users self-reported an average post-course understanding quotient of 4.25, an increase over the average initial understanding quotient of 2.07 (1 being little to no prior knowledge, 5 being a great deal of prior knowledge).



Table 2.4. Self-reported understanding of VR and instructional design before and after the course.

Recommendations

After observing several learners take the module and gathering feedback from others, we have a few recommendations to address before we launch this module again.

We want to add more guided practice so that learners have a better understanding of VR before they continue on their own.

We want to avoid case-sensitive answer checks so that learners don't become frustrated and unable to move past a question.

We want to make the course more engaging for all learners.

While Google Forms is not a typical learning management platform, it can be incredibly useful; more graphics and explanations in general would be helpful to a variety of learners.

Summary

Upon prototype examination, our team hypothesized that there would be a statistically significant increase between pretest and posttest scores after taking the Virtual Reality and Instructional Design module. Pretest and posttest data analysis showed evidence that learning indeed occurred to a significant degree. We also aimed to explore the reception and usability of the course. Through our post-course survey, we received valuable feedback about the prototype from learners. Much of this feedback resulted in recommendations that will improve the course. Our observations also revealed aspects of the product that were not self-reported in the feedback. We recommend that more research be done to evaluate the usability and effectiveness of the course medium (Google Forms) as a viable means for instruction or training.

Appendix A

Pretest and Posttest

The pretest and posttest for the module were identical. Users encountered the questions once before the training material and once again after it, before the feedback and conclusion page. Both the pretest and posttest were preceded by a respective title and text description. An asterisk designates the correct answer.

Pretest Description

Before we get started, we would like to evaluate your understanding of when VR should be considered for instruction. Please answer the following questions to the best of your ability.

Posttest Description

It's time to test your ability to identify if a situation could utilize VR to assist in

instructional design needs.



Pretest and Posttest Questions

1. Scenario: Medical school students need training to locate the liver without puncturing any

other organs. Cadavers are expensive and are needed later for more in depth procedures.

Should VR be considered?

a. Yes*

b. No

2. Scenario: School district staff members need training on how to schedule meetings using

their online calendars. Should VR be considered?

a. Yes

b. No*

3. Scenario: S.W.A.T. team members need active shooter training for various building

layouts (banks, schools, offices). Organizations are hesitant to allow such training on

their premises. Should VR be considered?

a. Yes*

b. No

4. Scenario: First grade teachers are asked to train students on proper pedestrian safety

procedures, including looking both ways. Teachers are concerned about taking their

students to busy intersections for training. Should VR be considered?

a. Yes*

b. No

5. Scenario: A preschool teacher wants to teach a lesson on how to tie shoes. All of her students have shoes with laces. Last year, her students were fully engaged in this lesson, and results were successful. Should VR be considered?

a. Yes

b. No*
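As an illustration of how responses to these five scenarios could be scored against the asterisked key, the short Python sketch below compares a participant's answers to an answer-key dictionary. The response format and helper function are hypothetical and not part of the actual Google Form configuration.

    # Answer key for the five scenario questions (asterisked choices above).
    ANSWER_KEY = {1: "Yes", 2: "No", 3: "Yes", 4: "Yes", 5: "No"}

    def score_responses(responses):
        """Return the number of scenario questions answered correctly (0-5)."""
        return sum(1 for q, correct in ANSWER_KEY.items()
                   if responses.get(q) == correct)

    # Hypothetical participant who misses question 4.
    example = {1: "Yes", 2: "No", 3: "Yes", 4: "No", 5: "No"}
    print(score_responses(example))  # prints 4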

Appendix B

Conclusion and Feedback

The conclusion and feedback page was given after the assessment had been taken.

Conclusion and Feedback Text Description

Thank you for taking our course on Virtual Reality and Instructional Design. We hope

that the information gained from taking this course will influence your decisions in

regards to using VR in your instructional design. Your feedback of the course is greatly

appreciated.

Conclusion and Feedback Survey Questions



Link to Course

https://drive.google.com/open?id=1o_7MXzb6e_xdQ-YIEyTVhM_gV0YUeaqtf5GTspwqPy8
