
Information Sciences 264 (2014) 61–74


Safe science classrooms: Teacher training through serious educational games

Leonard Annetta a,*, Richard Lamb b,1, James Minogue c,2, Elizabeth Folta d,3, Shawn Holmes e, David Vallett f,4, Rebecca Cheng a,5

a George Mason University, 4400 University Dr., Fairfax, VA 22032, United States
b Washington State University, Cleveland Hall, Room 327, P.O. Box 642132, Pullman, WA 99164, United States
c North Carolina State University, 317E Poe Hall, Raleigh, NC 27695, United States
d State University of New York College of Environmental Science and Forestry, 356 Illick Hall, 1 Forestry Dr., Syracuse, NY 13210, United States
e 6891 Woodward Dr., Brentwood Bay, BC, Canada V8M 1A9
f University of Nevada Las Vegas, Department of Teaching and Learning, United States

Article history: Received 7 October 2012; Received in revised form 9 September 2013; Accepted 22 October 2013; Available online 30 October 2013

Keywords: Serious educational game; Science teacher education; Educational simulation; Video game

Abstract

STIMULATE (Science Training Immersive Modules for University Learning Around Teacher Education) is a Serious Educational Game (SEG) designed to advance science teacher preparation and development by creating a laboratory safety module that immerses teachers in scenarios previously taught using only hypothetical case studies. This study employed a two-phase design-based methodology. The first phase was a cognitive task analysis of a convenience sample (n = 10) of preservice and in-service science teachers in which they described their key issues and concerns regarding chemical laboratory safety planning, response, management, and assessment. Phase 2 examined the usability and effectiveness of STIMULATE's initial build on 31 preservice teachers. The t-test for equality of means demonstrates a statistically significant difference between pretest and posttest scores, t(30) = 14.79, p < .001, d = 2.56 (large). Overall, results suggest positive learning gains for the preservice science teachers who engaged in the STIMULATE program.

© 2013 Elsevier Inc. All rights reserved.

1. Introduction

Science education in the United States is undergoing a major shift toward pedagogy that embeds technology as a vehicle for understanding both practice and fundamental concerns within science classrooms; yet preservice science teacher learning approaches and supporting tools that would allow viable scale-up of this shift are absent from much of teacher education.

* Corresponding author. Tel.: +1 703 993 5249.
E-mail addresses: lannetta@gmu.edu (L. Annetta), richard.lamb@wsu.edu (R. Lamb), james_minogue@ncsu.edu (J. Minogue), efoa@esf.edu (E. Folta), surloc@gmail.com (S. Holmes), david.vallett@unlv.edu (D. Vallett), rebecca.jovi@gmail.com (R. Cheng).
1 Tel.: +1 509 335 5025. 2 Tel.: +1 919 513 3317. 3 Tel.: +1 315 470 4938. 4 Tel.: +1 910 352 8275. 5 Tel.: +1 703 993 5249.

0020-0255/$ - see front matter © 2013 Elsevier Inc. All rights reserved.
http://dx.doi.org/10.1016/j.ins.2013.10.028

1.1. Game scenario and explanation

The following vignette is designed to introduce what a teacher playing the STIMULATE Serious Educational Game will encounter upon login. The teacher takes the role of Samantha, a first-year teacher who will be responsible for teaching high school chemistry. It is the summer before she is to start teaching at Patriot Lake High School, and she is eager to see her assigned classroom. Samantha is taken aback by the lackluster conditions in Room 361 as she opens the door for her first look at her classroom. What must once have been a "state of the art" chemistry classroom in the 1980s is now a neglected and potentially dangerous workplace. Samantha's somewhat idealized visions of 25 young scientist-students engaged in exciting chemistry laboratories are quickly replaced with the reality of dilapidated laboratory tables, drooping fume hoods, and a dirty, disorganized chemical storage room in desperate need of attention and order.
This fictional scenario closely mirrors what many of today's high school science teachers face. Some questions that quickly come to mind on seeing the condition of her room are: How does Samantha ensure the safety of her students? Does she know how to conduct a safety audit? Does she know what Material Safety Data Sheets (MSDS) are? Is she able to sift through and organize the chemical storage room? Will she have the practical, hands-on experience with laboratory safety that will enable her to deal with emergencies that may arise?
Many teacher education programs do not offer training for, or answers to, the myriad problems and questions raised in this vignette. This lack of substantive guidance is compounded by the lack of focus in secondary science teacher programs on how an entry-level teacher would address the problems of an inadequate or unsafe laboratory. Teacher education programs often leave it to local districts to train their new teachers in laboratory safety and maintenance [24].

1.2. Purpose and research questions

The intent of the STIMULATE project is to provide preservice teachers with an authentic, scenario-driven training environment simulating an initial licensure science teacher's first day at school handling an unsafe laboratory environment. All of the tasks and problem-solving approaches are within the confines of a Serious Educational Game (SEG), offering a soft-failure, low-stakes environment for teacher training. The aim of the project is to give preservice teachers in science education teacher preparation programs exposure and practice in solving real-life laboratory safety concerns in this real-time, responsive virtual environment. The eventual desired outcome is the transfer of knowledge, practice, and heuristics to actual classroom practice during the preservice teachers' first year of teaching. Training based in the STIMULATE SEG fills a critical gap in science teacher education by providing instruction and practice related to secondary science (grades 9–12) laboratory maintenance and safety procedures, which are not otherwise addressed in science-teacher preparation programs.
This article describes the iterative design and testing process of a Serious Educational Game (SEG) [2] called STIMULATE (Science Training Immersive Modules for University Learning Around Teacher Education).6 The authors explore the effectiveness of gaming scenarios for teaching secondary school preservice science teachers practical knowledge of chemical safety. Along with this exploration, the authors crystallize some of the preservice science teachers' concerns related to laboratory safety and response. The authors also discuss usability and implementation issues that will feed future design and test cycles related to STIMULATE.
To this end, the study investigated the following research questions:

1. Do preservice science teachers learn safe classroom practices from playing STIMULATE in terms of safety knowledge and
emergency responses?
2. Do male and female preservice science teachers' learning experiences, as measured by pretest and posttest outcomes, differ significantly while using the STIMULATE SEG?
3. What are preservice science teacher concerns about lab safety and how do they inform future iterations of STIMULATE?

1.3. Background

1.3.1. The importance of science safety


Laboratory safety is a critical component of K-12 science instruction. With no primary literature on the topic within the science education realm from which schools and teachers can develop best practices, the National Science Teachers' Association (NSTA) has put forth declarations that speak to the importance of laboratory safety within the K-12 classroom (NSTA, n.d.). For example, NSTA recommends that school districts and teachers adopt written safety standards, hazardous material management and disposal procedures for chemical and biological waste, and share the responsibility of establishing and maintaining safety standards. NSTA goes on to suggest that all science teachers be involved in an established, ongoing safety-training program tied to the established safety procedures and updated on an annual basis. Further, Standard 9 of the NSTA Standards for Science Teacher Preparation (NSTA-SSTP) [28] suggests that science-teacher preparation programs provide initial licensure science teachers with the knowledge and skills to understand and successfully engage students in a safe

6 This material is based upon work supported by the National Science Foundation under Grant No. XXXXX. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

and ethical manner, though little is being done to meet this requirement within teacher preparation programs. The most recent article on preservice teacher education related to laboratory safety is Tamir [42]. The purpose of this NSTA standard on safety is to encourage teacher preparation programs to prepare teachers for the unique safety and legal issues involved in providing learning environments that can include hazardous materials and equipment; however, many teacher preparation programs do not adequately train today's science teachers to deal with the safety challenges they might face in the science classroom [9]. New technological tools may help bridge the gap between the need for safety and the need for laboratory experiences [17].
STIMULATE has the potential to advance science teacher preparation and bridge this gap by creating gaming modules that immerse the teacher in scenarios once deliverable only through hypothetical case studies. This results from the ability of SEG designers to develop virtual versions of dangerous environments in which teachers can practice without exposure to actual danger. SEG modules are appropriate for training content delivery because they can provide realistic environments and scenarios for training purposes, with the additional benefit of aiding transfer due to the authenticity of the environment [29,41]. The realism of the environments and scenarios arises from the increased processing power associated with computer simulations, further increasing transfer [43]; thus, when confronted with similar situations in real life, teachers will have some knowledge and capability to respond. This study targets four core areas of lab safety: planning, response, management, and assessment. Fig. 1 illustrates the four areas used to discuss safety within the science classroom.
Safety and health considerations are arguably as important as the content taught in the science classroom. Occupational injury data from industry studies indicate that the injury rate is highest during the initial period of employment and decreases with experience7 [35]. Similarly, in a high school laboratory setting where students experience new activities, the likelihood of incidents, injury, and damage is high. Teachers play the most important role in ensuring a safe and healthy learning environment for their students [6]. The ideal time to impress upon students the need for caution and preparation is before and while they are working with chemicals in science laboratories. No current federal law requires safety and health programs to protect students in schools. However, the Occupational Safety and Health Act of 1970 requires employers to provide safety and health protection for teachers and other school system employees. Some states require school systems to abide by state regulations similar to the OSHA Laboratory Standard (29 CFR 1910.1450).
The National Science Standards, AAAS Project 2061, the NSTA Scope, Sequence, and Coordination Project, and state and local science curriculum initiatives all agree on one issue: students need to "do" science. At the middle and high school levels, the need for formal laboratory facilities is increasing because of growing student populations, the adoption of additional science graduation requirements, and the national priority placed on science education [45]. Science teachers and school leadership are challenged to meet both academic and safety concerns in the renovation of existing laboratories and the construction of new science laboratories. Items to be addressed on the safety agenda for newly built K-12 science laboratories include engineering controls (e.g., acid shower, eyewash), work practice and administrative controls (e.g., laboratory standard operating procedures, scheduling), and personal protective equipment (e.g., splash goggles, gloves, aprons) [13]. Many building administrators, central office administrators, and even some science educators do not seem to understand the safety issues involved in laboratory exploration and are unable to take steps to address them.
NFPA-45 (National Fire Protection Association, Standard on Fire Protection for Laboratories Using Chemicals, 1996) defines a laboratory as "a room or space for testing, analyzing, research, instruction, or similar activities that involve the use of chemicals". OSHA's Laboratory Safety Standard [40] defines the term "laboratory" as "a facility where the 'laboratory use of hazardous chemicals' occurs. It is a workplace where relatively small quantities of hazardous chemicals are used on a non-production basis". Most, if not all, secondary school and university science laboratories are classified under these and other applicable safety standards. Science educators must remember not to confuse the term "science laboratory" with "science classroom". A science classroom is for lecture and discussion, the talking about science [44]. The laboratory is for doing the science, and this is where the safety standards are most applicable. Moreover, most secondary science classrooms, and even university settings, combine lecture and laboratory into one space [25].

1.3.2. How serious educational games can help


Video games have been used in educational settings for a significant period, and numerous empirical studies suggest there is significant educational value in their use [2,3,21,26]. SEGs present the learner with complex representations of real-world problems within educational environments, representations a student could not otherwise interact with in the real world [12]. For example, it is very unlikely that a P-16 student would have access to, or engage in learning within, a Level 4 biosafety laboratory. Within these environments, the learner is exposed to complex representations that often require specific tasks to be completed in order to move the game toward its objective and, by extension, promote learning [19]. Through task completion within the game, knowledge construction takes place, with the video game acting as the mediator. The construction of learning in a virtual environment is analogous to construction within other environments, because humans construct and use knowledge to identify and understand critical processes regardless of the environment; thus, this construction is common while designing SEGs [16]. The student develops concepts associated with learning through the generation and use of internal representations

7 http://www.bls.gov/iif/oshcfoi1.htm.

Fig. 1. Core lab safety components within the STIMULATE framework.

of concrete objects in the real world while using the virtual equivalent [30]; there is, therefore, a tendency to focus on the faculties that develop recognition of the significant objects within a problem and to solve for those objects (i.e., inferential and critical reasoning). Based on this understanding, one can propose that computer game designers would need highly organized cognitive structures to facilitate internal representation. It is therefore plausible that these internal representations would be necessary in order to use science knowledge when confronted with "game situations". Studies suggest that video game designers, and by extension SEG designers, need to encode explicit information presented in the game for later use in task-based problem solving, thereby potentially transferring awareness and knowledge application to similar environments in the real world [27]. This explicit encoding, or knowledge construction and deployment, is the key feature for the measurement of cognitive attribute sets; in other words, task completion is a key consideration when assessing cognitive attributes [15].
Preparing scientifically literate and technologically perceptive teachers is a critical first step in better preparing and training teachers to deal with science teaching and safety issues [38]. To address challenges faced by science teachers, the authors have developed an SEG [2] built upon the training principles found in Serious Games used for training military, corporate, and medical personnel. The intent is to simulate laboratory scenarios for teachers to train in, as it is often too expensive, resource intensive, and dangerous to train in actual science classrooms. Adapting SEGs into secondary science-teacher education programs has enormous pedagogical and cost-saving potential. In adult training settings, simulations have taught people to set up refugee camps in troubled areas, orchestrate disaster relief, negotiate environmental treaties more effectively, make better health policy choices, and handle complex air traffic [32]. Unfortunately, most games are not educational by design [36]. In fact, there are few, if any, existing substantive SEGs for initial licensure or practicing science teachers that are interactive, fun, and, most importantly, promote meaningful learning. Well-designed games have properties associated with the most effective instructional approaches: they are experiential and inquiry-based, and they provide continuous user feedback while promoting understanding, science interest and efficacy, goal setting, and team learning [4,20]. Further, educational gaming improves student motivation for and engagement in learning and changes attitudes of efficacy toward science [3]. STIMULATE is a first-person adventure game based on the real world of K-12 science classrooms and the underlying pedagogy behind their operations. This module has two main components:

(a) The user engages in a virtual "Safety Audit" and performs basic tasks in conducting safety inspections in a virtual science classroom, with assistance from a mentor avatar as needed.
(b) The module tests the users' responses to several emergencies and gives trainees direct experiences intended to improve their decision-making and reasoning skills while coping with these different situations, as well as gaining knowledge of safety issues.

SEGs are games designed for educational or training purposes using specific pedagogical approaches [3]. The inclusion of these pedagogical approaches specifically differentiates SEGs from other forms of computer-based learning. STIMULATE provides an immersive environment within a problem-based pedagogy for learning. Thus, through exposure to several emergencies and safety audits, participants can engage in the open-ended task completion found in real classrooms, allowing for greater transfer from the game/simulation to an actual science classroom [21]. Translation of actual tasks from the science classroom to the virtual environment occurred through realistic approximation of the environments, materials, and non-player character interactions.

2. Material and methods

Methodologists suggest that design studies are "test-beds for innovation" whose intent is to "investigate the possibilities for educational improvement by bringing about new forms of learning in order to study them" [34]. The researchers guided this cross-disciplinary effort by the principles and attributes of design-based research and design-experiment methodologies [5,7,10,14]. That is, it is pragmatic (mimicking real-world problems in real classrooms), grounded (in both theory and context), iterative (involving design, testing, and redesigning), and integrative (employing a mixed-methods, multilevel assessment).

Our study unfolded in two phases. The first phase leveraged cognitive task analysis [34] and learner-centered design [39] to inform the building of a testable version of STIMULATE. For the cognitive task analysis, a convenience sample (n = 10) of preservice and in-service science teachers was interviewed. The participants described their key issues and concerns regarding chemical laboratory safety planning, response planning, execution of management, and assessment of outcomes. These aspects, combined in an interview-and-probe approach and a cognitive demands review, provide a means to establish some understanding of participant decision making [37]. Phase two examined the usability and effectiveness of the initial build and employed feedback from phase one results.
Learner-centered design (LCD), inspired by constructivist learning theory and articulated by Soloway et al. [39], focused design efforts on the needs of learners. LCD shifts attention away from ease-of-use issues and makes clear the distinction between "users" using technology and "learners" learning with technology.
This design paradigm is mindful of the simple fact that, in most cases, "learners" are developing expertise in new and unknown domains. Its proponents [31,39] suggest that learners often do not possess the same domain-specific expertise as users. Moreover, they remind us that learners are often heterogeneous and do not necessarily share a common work culture, level of motivation, background, or level of understanding. They urge designers of computer-based instruction to consider these factors and build interfaces that support learners' needs, serving to create and sustain motivation.

2.1. STIMULATE: serious educational game module description

STIMULATE meets the formal requirements for classification as an SEG through its inclusion of art, story development, task authenticity, and open-ended problem-solving approaches, and most importantly through the addition of content and pedagogy [2]. Specifically, it is the addition of pedagogy to the game, through problem-based learning, that allows the authors to classify STIMULATE as an SEG. Gamification of the STIMULATE SEG draws on players' natural desire for achievement through point scoring for successful completion of laboratory safety tasks and through meaningful choice in how one prepares the laboratory for instruction, resulting in beneficial or negative outcomes for the class. Additional actions that add to the game feel of the STIMULATE intervention are interactions with "mentor" avatars and a narrative storyline.
The STIMULATE module consists of a real-time SEG designed around the topic of laboratory safety from an initial licensure teacher's perspective, not a grade 9–12 science student's perspective. It is this teacher-oriented perspective that makes STIMULATE innovative and usable within science-teacher preparation programs. Within the module, the "teacher" completes multiple tasks related to the setup and preparation of her classroom science laboratory. This module is an initial prototype designed to inform the construction of more refined SEGs based on the same topics (laboratory safety and setup).
Within a science teacher preparation classroom at a local university, one could imagine the following scenario: science teachers (both preservice and in-service) enter a virtual three-dimensional learning space (STIMULATE) and assume the role of Samantha (the beginning teacher introduced earlier). She first conducts a "Laboratory Safety Audit" of her classroom. Fig. 2 illustrates the realistic classroom setting used to initiate safety training for the preservice science teachers.
In this SEG, she searches for the existence of safety signage; electricity, gas, and water supply; fire extinguishers; and first aid kits. She also examines the condition of such things as ventilation systems, eyewash stations, and safety showers. Through an integrated heads-up display, the user can see the score, time, and game functions. Samantha addresses signage and makes notes as to the condition and potential risks of the aforementioned areas of the classroom.
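The audit mechanics described in this walkthrough can be sketched as a simple checklist-and-score model. This is an illustrative sketch only: the item names, the pass rule, and the scoring are assumptions for the sake of the example, not the actual STIMULATE implementation.

```python
# Hypothetical sketch of the safety-audit mechanics described above:
# each checklist item records whether it is present plus any condition
# note, and the heads-up-display score counts the items that pass.
# Item names, pass rule, and scoring are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AuditItem:
    name: str
    present: bool
    condition_note: str = ""  # empty string means no deficiency noted

    def passed(self) -> bool:
        # An item passes only if it exists and has no recorded risk.
        return self.present and not self.condition_note

def audit_score(items):
    """Return (points_earned, total_points) for a completed audit."""
    return sum(item.passed() for item in items), len(items)

checklist = [
    AuditItem("safety signage", True),
    AuditItem("fire extinguisher", True),
    AuditItem("eyewash station", True, "flow below standard"),
    AuditItem("safety shower", False),
    AuditItem("fume hood", True, "drooping sash"),
]
earned, total = audit_score(checklist)
print(f"Audit score: {earned}/{total}")
```

Here only the signage and extinguisher items pass, so the audit scores 2/5; the condition notes play the role of the risk annotations Samantha records in-game.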
Samantha next turns her attention to the chemical storage room as she completes the daunting task of inventorying, properly labeling, reorganizing, and disposing of unfit chemicals that have accumulated over the years. Fig. 3 displays a portion of the laboratory stockroom used to train preservice teachers how to properly audit and store chemicals. This requires a working knowledge of chemistry and familiarity with the MSDS sheets for each chemical present. The MSDS sheets embedded in the SEG pop up in windows when Samantha clicks specific chemical bottles.

Fig. 2. The virtual STIMULATE classroom used to introduce safety procedures.



Fig. 3. An image of Samantha's chemical storage closet used to train chemical storage.

By the time Room 361 is in working order for the first day of class, it is filled with 32 boisterous 10th graders. Unfortunately, there are only 20 desks; the remaining 12 students sit at a laboratory bench. Samantha's new challenge is to prevent and respond to any accidents that may arise as the students complete their study of chemistry and use the chemistry laboratory. The overcrowding experienced in this SEG is often a reality in many of today's schools and something all initial licensure teachers must deal with [1]. In particular, the initially licensed teacher must assess the safety impact of such a large class. In addition to the overcrowding, the SEG scenarios present the player (Samantha) with several "Accident Response Scenarios". These scenarios test her ability to deal with corrosive or reactive chemicals, toxic hazards, and insidious hazards, all within an overcrowded classroom. Each of the listed scenarios, and others, teaches concepts related to various aspects of laboratory safety.
For example, students are conducting an activity in which they are using thermometers. One student accidentally drops a thermometer onto the lab table, breaking its mercury-filled bulb. The teacher must act accordingly, because mercury can emit toxic vapor over a long period, and Samantha must clean up the mercury in a particular manner. Several dangerous situations, such as fires, student injuries, and spills, test Samantha's decision-making skills, pedagogical awareness, and knowledge of chemistry. Other in-game examples of response scenarios include a long-haired student working around an open flame without pulling back his hair, a student adding water to an acid and splashing it in her eyes, and a student plugging in a hot plate with a frayed power cord.
Numerous "in-game" support mechanisms from which Samantha can seek help are available throughout her journey. Such resources come from a variety of places, including a Science Safety Resource Manual, MSDS sheets, and the NSTA safety website. Additionally, several "in-game" expert avatars can provide additional support. These characters include representatives from the local Fire Commissioner's Office, a Hazardous Waste Management team, the American Chemical Society, and The Laboratory Safety Institute.

2.2. Research design

2.2.1. Participants
Researchers obtained data from 31 participants enrolled in a teacher preparation program at a university in the southeastern United States. Members of the class took part in the intervention as a component of their regular course instruction. Approximately half of the participants (14, or 45%) were at the undergraduate level and 17 (54%) were at the graduate level. Fourteen participants (45%) were enrolled in the middle school science education program of study and the remaining 17 (54%) in a high school science education program of study. Twenty of the students (64%) were male and 11 (35%) were female. The intervention lasted approximately 3 h, with participants taking the pretest at the start of the class and the posttest immediately after the intervention.
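The counts and percentages above follow directly from n = 31; a quick recomputation (purely illustrative) shows the reported figures truncate rather than round the decimals (e.g., 17/31 ≈ 54.8%, reported as 54%).

```python
# Recompute the participant breakdown reported above from the raw
# counts (n = 31), to one decimal place.
n = 31
groups = {
    "undergraduate": 14, "graduate": 17,
    "middle school track": 14, "high school track": 17,
    "male": 20, "female": 11,
}
percents = {label: 100 * count / n for label, count in groups.items()}
for label, pct in percents.items():
    print(f"{label}: {groups[label]} ({pct:.1f}%)")
```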

2.2.2. Instrumentation and research design


The research design for this study is a one-group, non-randomized, pretest–posttest, quasi-experimental design. The study group acts as an intact unit with no assignment to control or experimental groups. The lack of a control (comparison) group does not allow for a sufficient number of parameters for use of an ANCOVA. In addition, the authors are primarily concerned with quantifying within-group variation due to the effect of the intervention, not with comparing between groups. As this was a one-group design, all participants received the intervention. Data collection occurred pre-intervention in an attempt to ascertain starting levels of safety knowledge, video game experience, and safety response. Post-intervention data collection measured gains in understanding of content only. This design type controls for differential selection bias and experimental mortality (loss of participants). The design is subject to validity threats due to carryover effects and repetition of survey and test questions. It is also difficult to isolate gains that arise solely from the intervention, as there is no control group.
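The pre/post comparison this design supports is the paired t-test with Cohen's d reported in the abstract. The sketch below shows that computation with hypothetical placeholder scores (not the study's data), using the paired-differences convention for d.

```python
# Illustrative paired pre/post analysis of the kind reported in the
# abstract (t-test for equality of means plus Cohen's d). The score
# arrays are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

pre = np.array([4, 5, 3, 6, 4, 5, 2, 6, 3, 4], dtype=float)    # pretest
post = np.array([8, 9, 7, 9, 8, 9, 6, 10, 7, 8], dtype=float)  # posttest

# Paired-samples t-test on the within-person differences (df = n - 1).
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired designs: mean difference / SD of differences.
# (Other conventions divide by the pooled or pretest SD instead.)
diff = post - pre
d = diff.mean() / diff.std(ddof=1)

print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {d:.2f}")
```

With n = 31 as in the study, the same two lines of SciPy would yield the t(30) statistic quoted in the abstract.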
While a Randomized Control Trial (RCT) would have allowed for greater estimation of intervention effects, ethical and ecological considerations did not allow for its use. Ethically, it is difficult to justify denying an intervention to students based on experimental need. Second, assignment to a control or experimental group would have created excessive burden for the instructors involved and destroyed the natural ecology of the classroom. Lastly, the sample size does not allow for a true randomized control trial.
Data collection occurred at two points (pre- and post-intervention), with all measures given consecutively on the same day as the pre-intervention. The study authors developed each of the surveys used in this study to assess the key research questions. The use of the same test pre and post (the Chemical Safety test) does create some sensitization and carryover; however, given the exploratory nature of this study, sensitization and carryover are not of concern.
The primary investigator created the study instruments specifically for this study, as no preexisting instruments captured the specific content measured within the study. Previous psychometric data are unavailable for these measures, as they are validated on this data set only. The authors report the psychometric properties of these instruments below. The first survey instrument used to collect data for this study is the Perceptions Survey (PS) (see Appendix A). The PS provides information regarding the participants' general perceptions related to playing video games and chemical safety response.
The study also used a second survey, the Course, Curriculum, and Laboratory Instruction survey (CCLI), the survey portion of which is not included in Appendix B. The authors used the CCLI to collect additional information, including demographics, video game experience, and emergency response experience. The survey is a 20-question instrument with open-ended responses and a 10-point semantic rating scale. This survey also served as the initial means of reporting the frequencies of endorsement of safety-related concerns. The scale is semantic in nature, as it does not provide a neutral anchor point at the mid-range. The use of a 10-point scale avoids a ceiling effect and allows for greater discrimination of participant experience ratings. In addition, analysis of threshold disordering indicates a monotonic advancement within the ordering; thus, it is not necessary to collapse the scale. Additional analysis shows category widths, in logits, are within the 1.4–5.0 logit tolerance for the step calibrations [23]. This indicates appropriate scale development and functioning, along with increased discrimination compared to a five-point scale. The rating scale ranges from (1) Novice through (5) Intermediate to (10) Very Experienced, with gradations in between. Participants were asked to endorse the responses that best described their levels of experience with video game use and emergency response. The researchers gave the survey questions to the study participants at the onset of the study. The second portion of the CCLI is a Content Test (CT). The content portion of the CCLI, shown in Appendix B, is a ten-item open-response chemical-safety content assessment, measuring participants' relative knowledge of emergency response scenarios. Content knowledge assessment used a one-group, intact pretest–posttest design, with pretests given at the start of the intervention to establish a baseline of participants' content knowledge. Upon completion of the intervention (i.e., completion of the game), the researchers gave posttest assessments to track content knowledge change and to garner information related to video game playability. Scoring of the content portion of the CCLI used a dichotomous approach, with "1" for correct and "0" for incorrect, allowing reliability analysis using KR-20. The remaining scales were assessed using the scales associated with the measure; for reliability analysis, these (affective) scales were converted to "1" for agreement and "0" for disagreement with the proposed statement, again allowing KR-20 analysis. Table 1 shows the experimental and assessment design for the study.
Internal reliability was established using Kuder–Richardson 20 (KR-20), which is indicated for dichotomously coded results such as those found on the CCLI and PS. KR-20 coefficients for the binary items suggest an adequate level of internal reliability for each measure, with the content measure showing very high reliability (coefficient = .949). The safety response measure's internal reliability coefficient is slightly below the threshold requirement, which suggests the presence of latent traits in the construct not intended within the measure. However, a coefficient of .693 is still adequate, as the construct of safety response is quite complex. In addition, when the measure is taken as a whole (all components together) and a corrected alpha coefficient is employed, the overall internal reliability stabilizes at .821. The video game experience survey (Perceptions Survey) shows an adequate level of internal reliability, with a KR-20 coefficient of .764.
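As a point of reference, KR-20 for k dichotomously scored items is (k/(k − 1))(1 − Σ p_j q_j / σ²), where p_j is the proportion answering item j correctly, q_j = 1 − p_j, and σ² is the variance of the total scores. A minimal sketch follows; the response matrix is illustrative only, not the CCLI data:

```python
def kr20(item_scores):
    """Kuder-Richardson 20 for dichotomous (0/1) item scores.

    item_scores: list of respondent rows, each a list of 0/1 item scores.
    """
    k = len(item_scores[0])                      # number of items
    n = len(item_scores)                         # number of respondents
    # Sum of p*q across items, where p is the proportion correct per item.
    pq_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_scores) / n
        pq_sum += p * (1 - p)
    # Variance of total scores (population form, as is conventional for KR-20).
    totals = [sum(row) for row in item_scores]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_t)

# Illustrative responses: 5 respondents, 4 items (not the study data).
scores = [
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
reliability = kr20(scores)
```

As with coefficient alpha, values approaching 1 indicate that items covary strongly with the total score.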
Construct validity is defined as the degree to which a scale measures the theoretical psychological construct (θ) it is proposed to measure [20]. When a test measures a trait that is difficult to define, as in an affective test measuring the constructs of video game experience, safety response, and safety content, multiple expert reviewers may rate the relevance of individual items to the construct [19].
Analysis of reviewer agreement on relevance shows that 89% of items have strong relevance to safety content as rated by expert reviewers. Reviewers rated safety-response item relevance at 60%. The final measure, video game perceptions, showed an agreement level of 77%. This percentage relevance corresponds to an average construct-validity coefficient of .753, a level of construct validity that is adequate for an affective measure. Items not rated as strongly relevant show mixed relevance; i.e., one reviewer rated the item as strongly relevant and one as weakly relevant, or both reviewers rated the item as weakly relevant. This does not necessarily mean the experts feel such an item measures no aspect of θ, only that it does not measure as much of θ as the strongly relevant items. Selection of expert reviewers was based upon their unique understanding of the measured areas of video games and

Table 1
STIMULATE study design.

Treatment condition      Group    Pretest    Treatment    Posttest    Assessment type

Quasi-experimental       QE       O1         X            O2          Content/survey

safety as it relates to science education. Each of these experts holds a Ph.D. in Science Education with a focus in Educational Psychology.

2.2.3. Statistical analysis


Research questions 1 and 2 were answered using a mean difference analysis via a dependent t-test. Pretest–posttest outcomes were calculated as a gain score (posttest score minus pretest score). The mean comparison quantified differences due to the intervention effect (participant content knowledge growth). The dependent t-test provides a method to compare mean differences on the content measures at an alpha of .05. Effect size was calculated using Cohen's d with pooled variance, indicating the magnitude of the mean difference [33,8].
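The gain-score comparison described above can be sketched as follows. The paired scores here are hypothetical, and the pooled-SD form of Cohen's d is one of several conventions for paired designs:

```python
import math

def paired_t_and_d(pre, post):
    """Dependent t-test on gain scores plus Cohen's d with pooled variance."""
    n = len(pre)
    gains = [b - a for a, b in zip(pre, post)]
    mean_g = sum(gains) / n
    # Sample SD of the gains (ddof = 1) for the t statistic, df = n - 1.
    sd_g = math.sqrt(sum((g - mean_g) ** 2 for g in gains) / (n - 1))
    t = mean_g / (sd_g / math.sqrt(n))

    # Cohen's d using the pooled SD of the pre and post score distributions.
    def sd(xs):
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

    pooled = math.sqrt((sd(pre) ** 2 + sd(post) ** 2) / 2)
    d = mean_g / pooled
    return t, d

# Hypothetical paired scores for eight participants (not the study data).
pre = [2, 3, 2, 1, 3, 2, 2, 3]
post = [6, 7, 5, 6, 7, 6, 5, 7]
t, d = paired_t_and_d(pre, post)
```

The resulting t is compared against the critical value for n − 1 degrees of freedom at alpha = .05.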
The researchers answered research question 3 via analysis of within-group variance for differing levels of single categorical variables (demographics). Specifically, the researchers used a multifactor Analysis of Variance (ANOVA) to analyze the main effects on gain scores across variables for each criterion. Content as measured by the CCLI content portion, video game perceptions as measured by the PS, safety assessment capacity as measured by the CCLI, and demographic variables (age, ethnicity, years in school, major, and gender) together provide an understanding of differential learning occurring due to effects associated with these moderator variables.
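In the single-factor case, the main-effect computation reduces to the familiar F ratio and η² = SS_between/SS_total. A sketch with hypothetical gain scores for two groups (labels and values are illustrative, not the study data):

```python
def oneway_f_eta(groups):
    """One-way ANOVA F statistic and eta-squared for lists of group scores."""
    all_scores = [x for g in groups for x in g]
    n = len(all_scores)
    grand = sum(all_scores) / n
    # Between-groups and within-groups sums of squares.
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_b = len(groups) - 1
    df_w = n - len(groups)
    f = (ss_between / df_b) / (ss_within / df_w)
    eta_sq = ss_between / (ss_between + ss_within)   # proportion of variance
    return f, eta_sq

# Hypothetical gain scores for two demographic groups.
group_a = [3, 4, 5, 4, 3, 5]
group_b = [6, 5, 7, 6, 7, 5]
f_stat, eta_sq = oneway_f_eta([group_a, group_b])
```

A multifactor design partitions SS_between further across factors and interactions, but the η² interpretation (share of total variance explained) is the same.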

3. Results

3.1. Survey results

Figs. 4–6 show the frequencies of category endorsement prior to the intervention for video game experience (Fig. 4), chemistry safety knowledge (Fig. 5), and emergency response preparedness (Fig. 6). Each rating scale is a 1–10 scale, with 1 indicating a novice level and 10 a very experienced rating. The remaining affective variables, safety response and video game experience, were dichotomized using "1" for yes and "0" for no with respect to experience in each area within the measure. Histograms were selected for each figure as they allow a focus on specific answers in the context of overall trends for categorical variables, which do not lend themselves to display using mean, median, and standard deviation.
Fig. 4 illustrates a nearly flat histogram for the video game experience ratings (M = 4.87, SD = 2.91). This suggests that experience ratings are distributed roughly uniformly in the population represented by the sample; however, the sample size is insufficient to confirm this [22].
Fig. 5 shows a normal distribution, based upon the Shapiro–Wilk W test, for the chemistry safety ratings (W = .907, p = .059, M = 5.06, SD = 1.84). This suggests that a majority of the study participants fell within the intermediate range of safety knowledge and that the resultant gains are meaningful to participants within this knowledge range. Results also suggest that the measure does not exhibit a ceiling effect, which provides the ability to measure both increases and decreases in chemistry safety knowledge; knowledge prior to the intervention varied greatly.
Fig. 6 shows a normal distribution based upon the Shapiro–Wilk W test, for the rating of emergency response capability
(W = .94, p = .124, M = 5.81, SD = 1.93). This distribution highlights the perceived level of emergency response capability
among the participants post intervention.
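These normality checks can be reproduced with any standard Shapiro–Wilk implementation; a sketch using SciPy's `stats.shapiro` on hypothetical 1–10 ratings (not the study data):

```python
from scipy import stats

# Hypothetical 1-10 safety ratings for 20 respondents, not the study data.
ratings = [4, 5, 6, 5, 4, 7, 5, 6, 3, 5, 6, 4, 7, 5, 6, 5, 4, 6, 5, 7]

# shapiro returns the W statistic and a p value; at alpha = .05,
# p > .05 fails to reject normality (cf. W = .907, p = .059 for Fig. 5).
w_stat, p_value = stats.shapiro(ratings)
normal = p_value > 0.05
```

Note that Shapiro–Wilk is sensitive to ties in coarse rating-scale data, so borderline p values (such as the .059 reported above) warrant visual inspection of the histogram as well.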

3.2. Video game use profile results

The majority of participants (61.29%) endorsed a mixed model of video game usage (Table 2). Participants' perceived mode of play (the most often played mode, measured via the PS) varied across multiple types of games, as indicated in Table 2, with arcade and sports games the most endorsed, though not by much more than other modes. This suggests that participants did not play one game type exclusively, which is indicative of an ability to accept the SEG format (first person) for learning purposes; that is, the format of the SEG did not interfere with participants' ability to use the game for learning. This is of significance, as the SEG used in the intervention did not appeal to one player type over another. Endorsement of multiple game types provides evidence that participants' endorsements are not tied to the specific genre of the SEG or to underlying conceptions. For example, sports games have little in common with trivia games. However, there is one commonality among the top three endorsed game types (arcade style, sports, and first person shooter): each has a large social component. The social component seems to play a role in

Fig. 4. Videogame play experience taken from question 6 on the PS measure.



Fig. 5. Chemistry safety knowledge-ratings taken from question 8 on the PS measure.

Fig. 6. Participant emergency response capability taken from question 11 on the PS measure.

Table 2
Participant video game type preference.

Game type Percentage of total


Arcade style 14.05
Sports 14.05
First person shooter 10.74
Adventure 9.92
Simulation 9.92
Fighting 8.26
Trivia 8.26
Action 7.44
Turn-based 5.79
Fantasy 5.79
Online role play 4.13
Role playing 1.65

each of the highly endorsed game types. This result suggests that video game play within the study has both individual and collaborative (social) components; that is, participants seemed to prefer games containing a social aspect. The study authors examined game play as a variable to establish the role that prior experience with playing games has on student learning via an immersive SEG environment. Each of these game genres supports significant social interaction between players. Online role-playing is the one exception to this trend in the data; its lack of endorsement may be due to blending of the game type with other similar types, such as simulation or adventure, while engaged in online game play.
Data collection for research question 3 occurred using portions of the existing instruments, specifically questions 11 and 12 of the CCLI. Table 3 shows the top four concerns expressed by the study participants. The top concern expressed (51%) is that of a chemical mishap occurring within the classroom. The second most expressed concern is how to manage a classroom emergency.
The t-test for equality of means shows that there is a statistically significant difference between pretest (M = 2.29, SD = .90) and posttest (M = 6.29, SD = 1.24) scores, t(30) = 14.79, p < .001, d = 2.56 (large). The 95% confidence interval for the difference μ2 − μ1 is (3.45, 4.55), so μ1 < μ2 by at least 3.45 but by no more than 4.55 [11]. Thus, posttest content scores are significantly higher than pretest scores. The effect size for the sample, obtained using Cohen's d with pooled standard deviation [33,8], is very large at 3.29. The large effect size suggests very little variance within the sample and that the differences reflect true differences in learning outcomes.
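As a consistency check, the reported interval can be reconstructed from the published statistics alone, since SE = (mean difference)/t and the bounds are the difference ± t_crit × SE (t_crit ≈ 2.042 for df = 30, alpha = .05, two-tailed):

```python
# Reconstructing the reported 95% CI from the published statistics.
mean_diff = 6.29 - 2.29        # posttest mean minus pretest mean
t_stat = 14.79                 # reported t(30)
t_crit = 2.042                 # two-tailed critical t for df = 30, alpha = .05

se = mean_diff / t_stat        # standard error of the mean difference
lower = mean_diff - t_crit * se
upper = mean_diff + t_crit * se
# lower rounds to 3.45 and upper to 4.55, matching the reported interval.
```

That the reconstruction matches to two decimals supports the internal consistency of the reported t statistic and interval.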

Table 3
Percentage of respondents classified by safety concern type.

Concern type Percentage of total


Chemicals 51
Emergency management 29
Safety 16
Classroom management 3

The ANOVA F-test results illustrate a statistically significant difference in gain scores on the safety assessment across the main effect of gender: females (M = 3.95, SD = 1.46), males (M = 6.09, SD = 1.64), F(8, 30) = 11.54, p = .002, η² = 0.27, which is large by Cohen's guidelines for interpreting η². The 95% confidence interval for the difference shows that males scored between 1.79 and 2.31 points higher than females, for a mean difference of 2.14 points. The use of confidence intervals conveys the uncertainty in the measurement and provides upper and lower limits on the change, information that is masked when one reports only a gain score; in this context, the confidence interval is not the same as a mean difference. The remaining factors of ethnicity (M = 4.68, SD = .94, F(4, 30) = 0.356, p = .837), years in school (M = 1.29, SD = .46, F(1, 30) = .048, p = .828), major (M = 1.54, SD = .50, F(1, 30) = .314, p = .581), and experience (M = 4.87, SD = 2.92, F(1, 30) = 2.94, p = .101) did not produce statistically significant results on the main effect of intervention gain score.
Results suggest that, prior to participating in the study, initial licensure science teachers had an intermediate level of understanding regarding laboratory safety and how to provide a safe classroom laboratory environment, as illustrated by pretest data (M = 4.31, SD = .30, F(1, 30) = 4.31, p = .046, η² = 0.12).

4. Discussion

Statistically significant mean differences and effect size results suggest that the laboratory safety game is able to increase learner understanding of safety content in a science classroom and laboratory setting. The change between pretest and posttest for the initially licensed teachers and experienced teachers suggests that significant learning is occurring. However, these results should be interpreted with caution; carry-over effects due to the study design may have played a role in the gains. As this is a one-group design, the use of ANCOVA is not indicated for analysis of gains. Gain scores were not normalized, as normalization is most often indicated when comparing different measures, and the scale scores here were neither large nor cumbersome.
Research question 3 asked: what are preservice science teachers' concerns about lab safety, and how do they inform future iterations of STIMULATE? The majority of participants indicated that their biggest concerns were whether they would know what to do in the event of a chemical emergency and whether they would be able to handle the situation in order to minimize the risk of injury to students. Cognitive task analysis (CTA) provided a framework for the teachers to investigate processes and plan appropriate responses. Individual teachers completed open-ended surveys and cognitive demand assessments in keeping with the cognitive task analysis. The open-ended survey allowed teachers to consider events, actions, assessments, cues, and errors. Results of the CTA provide a means to identify the concerns of the participants within the classroom and laboratory environment. Within the survey data shown in Table 3, 51% of the participants expressed concerns related to chemical exposure. The other main concerns of participants focused on student misbehavior that could lead to safety issues, such as students not following directions, engaging in horseplay, and not listening to instructions, also shown in Table 3. Other concerns included being able to deal with emergencies when a student is injured and having chemicals spill in the classroom. Two participants indicated that they did not have any concerns. In both cases, the SEG was used once and was not scalable enough to be played outside of class; thus, the intended repetitive learning did not occur. Many of these concerns were addressed in the STIMULATE scenarios, but additional scenarios could be added to the game to allow for more experience in these particular areas.

5. Conclusions

Video games play a substantial role in our culture: almost 70% of Americans have played some sort of video game and do so regularly [18]. Studies of video game play suggest that games can dramatically enhance and alter a wide range of learning outcomes. The research literature suggests that inquiry teaching and learning is paramount within science-teacher education programs, and SEGs are a means to infuse inquiry and problem-based teaching approaches within such programs. Inclusion of these two approaches via SEGs would allow for greater authenticity and, by extension, transfer to actual classroom environments. However, based on our analysis of the current initial licensure students (teachers) and practicing science teachers, the use of inquiry teaching around topics is not the problem. The problem stems from the lack of training related to laboratory safety and emergency response, as seen within the survey results. Teachers reported this problem (lack of training) to the researchers within this study. Other problems identified within the context of this study and observed through classroom observations are: overcrowded classrooms, the inability to effectively use CTA, lack of training in using chemicals or potentially dangerous equipment, lack of response training, and lack of storage training. Through simulations (SEGs) within the STIMULATE framework, teachers are able to tackle these incredibly complex, open-ended, and ill-defined problems in a low-stakes, soft-failure environment. Teachers also have the opportunity to interact with simulated environments as many times as they would like in order to gain mastery. The open-ended, dynamic, and sometimes volatile aspect of the science classroom/laboratory is what we are simulating in this project.

5.1. Implications

The NSTA position on professional development urges programs to develop high-quality, transformative materials and learning experiences to prepare the next generation of professional development providers to affect their deeply held beliefs and habits of practice. Further, these materials should be widely distributed so that professional development providers are using the best standards- and research-based tools and strategies in their work.

5.2. Future iterations of STIMULATE

Participant feedback, survey results, and gain score differences provide a means to establish an iterative design process, allowing for innovation within the development of future teacher education safety modules using an SEG framework. Innovations within teacher education are often contextualized and developed within environments controlled by the researcher. STIMULATE allows for direct integration of target-audience (preservice teacher) feedback about an uncontrolled environment (the secondary classroom) via a controlled environment (the SEG). This paper explores the direct effects of the new materials derived from player experience. Scaling up the STIMULATE concept to incorporate the feedback collected would occur using a model conceived around the interactive, open-ended nature of the SEG. While training using the SEG itself would be intuitive and require little preparation other than access, priming around the issues of science laboratory safety would be a reasonable addition, specifically around the safety concerns exhibited by the preservice teachers in this study. More to this point, content and practice are found to be key considerations when advancing science teacher training in the university setting. A key modification of the STIMULATE concept would allow for an increased amount of both practice and content in a unique real-life context.
An additional key consideration within the STIMULATE concept would be the development of fidelity-of-implementation protocols. Results suggest a need to provide additional structure to increase actual delivery of modular content, increased attention to ascertaining minimal and maximal training times for effect, and possible programmatic differentiation addressing more specific individual differences within the data, such as video game play experience or training related to laboratory incident response. Finally, future iterations should identify and quantify student–avatar interactions, task completion, and content knowledge as a means to understand student learning.

Appendix A.

Perceptions Survey
1. Your age? ______

2. Your Gender: ___Female ___Male

3. Your ethnicity?
___ American Indian or Alaskan Native ___ Hispanic
___ Asian or Pacific Islander? ___ White, not of Hispanic origin
___ Black, not of Hispanic origin ___ Other or Unknown
4. Year in School:

5. Major/s: _______________________

VIDEO GAME EXPERIENCE:

6. Please rate your level of experience with video games?


1 2 3 4 5 6 7 8 9 10
Novice Intermediate Very experienced

7. If you have experience with video games:

7a. What types of video games do you play most? (check all that apply)

____Online role playing ____First person shooter ____Arcade style ___ Fantasy
____Sports ____Role playing ____Action____Adventure ____Simulations____Fighting
____Turn-based____Trivia
Other: ________________________________________________

7b. What are your three favorite video games?


______________________________________________________
______________________________________________________

7c. List the main types of game systems you use:


______________________________________________________
______________________________________________________

7d. Do you play most often by yourself or with others?


____Individual _____Collaborative ____ Both

7e. Do you play games online or on your computer more often?


____Online _____Computer ____ Both

7f. Where do you play most often?


____Home _____Friend’s House ____ School____Work
Other: ________________________________________________

CHEMICAL SAFETY EXPERIENCE:

8. How would you rate your knowledge of chemistry safety in the classroom?

1 2 3 4 5 6 7 8 9 10
Novice Intermediate Very experienced

9. Have you taken chemistry safety classes before? Yes _____ No _____

9b. If yes, when did you take this course/s of training? _________
Please describe the type of classes or training:_________________
______________________________________________________
______________________________________________________
______________________________________________________

10. Do you have any training or experience in any of the following areas?
___ Hazardous materials ___ Firefighter ___ EMT ___ Police
Other:____________________________________________________________

If yes, please describe:


________________________________________________________________________
________________________________________________________________________

11. How prepared do you feel to respond to real life safety issues in the chemistry classroom?

1 2 3 4 5 6 7 8 9 10
Not confident Intermediate Very confident

12. What are your biggest concerns with regard to chemical safety in the classroom?

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

13. Do you have other comments that you would like to share?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

Thank you for your responses!



Appendix B.

1. What four items of information should be included on the chemical labels?


a. ____________________________________________________________
b. ____________________________________________________________
c. ____________________________________________________________
d. ____________________________________________________________
2. Where should Material Safety Data Sheets be located?
____________________________________________________________
____________________________________________________________
3. What do you do in case of a gas leak?
a. ____________________________________________________________
b. ____________________________________________________________
c. ____________________________________________________________
4. What do you do if a student ingests a chemical?
____________________________________________________________
____________________________________________________________
5. What three steps do you follow if a student spills acid on himself or herself?
a. ____________________________________________________________
b. ____________________________________________________________
c. ____________________________________________________________

6. How can you keep the chemical stock room safe?


____________________________________________________________
____________________________________________________________
7. In which of the following situations should a fire blanket be used instead of a fire
extinguisher? (circle all that apply)
a. Chemical fire
b. Clothing fire
c. Desk fire
d. Hair fire
e. Paper fire

8. List the key steps for completing a safety audit in the chemistry classroom
a. ____________________________________________________________

b. ____________________________________________________________
c. ____________________________________________________________
d. ____________________________________________________________
e. ____________________________________________________________
f. ____________________________________________________________
g. ____________________________________________________________
h. ____________________________________________________________

9. How do I safely dispose of chemicals?


a. Consult another chemistry teacher to get his or her advice
b. Locate information on chemical disposal on the Internet
c. Read the MSDS sheet and follow directions
d. Ask the assistant principal for advice

10. What are the six strategies you should use to categorize and organize chemicals in the
chemistry safety closet?
a. ____________________________________________________________
b. ____________________________________________________________
c. ____________________________________________________________
d. ____________________________________________________________
e. ____________________________________________________________
f. ____________________________________________________________

References

[1] Author, Science teacher training through serious educational games. in: Society for Information Technology & Teacher Education International
Conference, vol. 2011(1), 2011 March, pp. 2025–2033.
[2] Author, Serious Educational Games: From Theory to Practice, Sense Publishers, Amsterdam, The Netherlands, 2008, p. 83.
[3] Author, Bridging reality to virtual reality: Investigating gender effect and student engagement on learning through video game play in an elementary
school classroom. International Journal of Science Education 31(8) (2009) 1091–1113.
[4] J.D. Bransford, A.L. Brown, R.R. Cocking (Eds.), How People Learn: Brain, Mind, Experience, and School, National Academy Press, Washington DC, 1999.
[5] A. Brown, Design experiments: theoretical and methodological challenges in creating complex interventions in classroom settings, The Journal of the
Learning Sciences 2 (2) (1992) 141–178.
[6] H. Borko, Professional development and teacher learning: Mapping the terrain, Educational Researcher 33 (8) (2004) 3–15.
[7] P. Cobb, J. Confrey, A. deSessa, R. Lehrer, L. Schauble, Design experiments in educational research, Educational Researcher 32 (1) (2003) 9–13.
[8] J. Cohen, Statistical Power Analysis for the Behavioral Sciences, second ed., Lawrence Earlbaum Associates, Hillsdale, NJ, 1988.
[9] L. Darling-Hammond, J. Bransford (Eds.), Preparing Teachers for a Changing World: What Teachers Should Learn and be able to do, Wiley.com, 2007.
[10] Design-Based Research Collective, Design-based research: an emerging paradigm for educational inquiry, Educational Researcher 32 (1) (2003) 5–8.
[11] D. Dimitrov, Quantitative Research in Education: Intermediate & Advanced Methods, Whittier Publications, New York, NY, 2010.
[12] M. Dondinger, Educational video game design: A review of the literature, Computer and Information Science 4 (1) (2007) 21–31.
[13] A.K. Furr, CRC Handbook of Laboratory Safety, CRC Press, 2010.
[14] J. Green, G. Camilli, P. Elmore (Eds.), Handbook of Complementary Methods in Education Research, LEA (AERA), Mahwah, NJ, 2006.
[15] A. Hadwin, P. Winne, J. Nesbit, Roles for software technologies in advancing research and theory in educational psychology, British Journal of
Educational Psychology 75 (1) (2005) 1–24.
[16] A. Jamaludin, Y. Chee, C. Mei Lin Ho, Fostering argumentative knowledge construction through enactive role-play in Second Life, Computers & Education 53 (2) (2009) 317–329.
[17] S. Judge, J. Bobzien, A. Maydosz, S. Gear, P. Katsioloudis, The use of visual-based simulated environments in teacher preparation, Journal of Education
and Training Studies 1 (1) (2013) 88–97.
[18] R.F. Kenny, R. McDaniel, The role teachers’ expectations and value assessments of video games play in their adopting and integrating them into their
classrooms, British Journal of Educational Technology 42 (2) (2011) 197–213.
[19] R. Lamb, L. Annetta, D. Vallett, T. Sadler, Cognitive diagnostic-like approaches using neural network analysis of serious educational games, Computers
& Education 70 (1) (2014) 92–104.
[20] R. Lamb, L. Annetta, J. Meldrum, D. Vallett, Measuring science interest: Rasch validation of the science interest survey, International Journal of Science
and Mathematics Education 10 (3) (2012) 643–668.
[21] R. Lamb, L. Annetta, The use of online modules and the effect on student outcomes in a high school chemistry class, Journal of Science Education and
Technology (2012), http://dx.doi.org/10.1007/s10956-012-9417-5 (online publication).
[22] J. Levman, J. Alierzaie, G. Khan, Perfectly flat histogram equalization, in: Proceedings from the International Conference Signal Processing, Pattern
Recognition and Applications, 2003, pp. 38–42.
[23] J.M. Linacre, Understanding Rasch measurement: Optimizing rating scale category effectiveness, Journal of Applied Measurement 3 (1) (2002) 85–106.
[24] G.L. Long, C.A. Bailey, B.B. Bunn, C. Slebodnick, M.R. Johnson, S. Derozier, J.R. Grady, Chemistry outreach project to high schools using a mobile chemistry laboratory, ChemKits, and teacher workshops, Journal of Chemical Education 89 (10) (2012) 1249–1258.
[25] R.L. Matz, E.D. Rothman, J.S. Krajcik, M.M. Banaszak Holl, Concurrent enrollment in lecture and laboratory enhances student performance and
retention, Journal of Research in Science Teaching 49 (5) (2012) 659–682.
[26] A. Mitchell, C. Savill-Smith, The Use of Computer and Video Games for Learning: A Review of the Literature, 2004. Retrieved from the Learning and Skills Development Agency website: <http://dera.ioe.ac.uk/5270/1/041529.pdf>.
[27] P. Moreno-Ger, D. Burgos, I. Martinez-Ortiz, J.L. Sierra, B. Fernandez-Manjon, Educational game design for online education, Computers in Human
Behavior 24 (2008) 2530–2540.
[28] National Science Teachers Association, Standards for Science Teacher Preparation, 2003. <http://www.nsta.org/pdfs/NSTAstandards2003.pdf>
(retrieved 27.07.11).
[29] G. Ozogul, R. Borden, B. Clark, Preservice teacher professionalism game: how to evaluate effectiveness and transfer, in: Society for Information
Technology & Teacher Education International Conference, vol. 2013 (1), March 2013, pp. 2349–2353.
[30] L. Perlovsky, Language and cognition, Neural Networks 22 (3) (2009) 247–257.
[31] C. Quintana, J. Krajcik, E. Soloway, Exploring a structured definition for learner-centered design, in: B. Fishman, S. O’Connor-Divelbiss (Eds.), Fourth
International Conference of the Learning Sciences, Erlbaum, Mahwah, NJ, 2000, pp. 256–263.
[32] D. Rejeski, Gaming Our Way to a Better Future, 2002. <http://www.avault.com/developer/getarticle.asp?name=drejeski1> (retrieved 22.02.04).
[33] R.L. Rosnow, R. Rosenthal, Computing contrasts, effect sizes, and counternulls on other people’s published data: general procedures for research
consumers, Psychological Methods 1 (1996) 331–340.
[34] J.M.C. Schraagen, S.F. Chipman, V.L. Shalin (Eds.), Cognitive Task Analysis, Lawrence Erlbaum Associates, Mahwah, NJ, 2000.
[35] N.V. Schwatka, L.M. Butler, J.R. Rosecrance, An aging workforce and injury in the construction industry, Epidemiologic Reviews 34 (1) (2012) 156–167.
[36] J.L. Sherry, Formative research for STEM educational games, Zeitschrift für Psychologie 221 (2) (2013) 90–97.
[37] N.M. Shryane, S.J. Westerman, C.M. Crawshaw, G.R.J. Hockey, J. Sauer, Task analysis for the investigation of human error in safety-critical software
design: a convergent methods approach, Ergonomics 41 (11) (1998) 1719–1736.
[38] L.K. Smetana, R.L. Bell, Computer simulations to support science instruction and learning: a critical review of the literature, International Journal of
Science Education 34 (9) (2012) 1337–1370.
[39] E. Soloway, M. Guzdial, K.E. Hay, Learner-centered design: the challenge for HCI in the 21st century, Interactions 1 (2) (1994).
[40] Occupational Safety & Health Administration, Occupational Exposure to Hazardous Chemicals in Laboratories, Standard 29 CFR 1910.1450, US
Department of Labor, 1990.
[41] T. Strobach, P.A. Frensch, T. Schubert, Video game practice optimizes executive control skills in dual-task and task switching situations, Acta
Psychologica 140 (1) (2012) 13–24.
[42] P. Tamir, Training teachers to teach effectively in the laboratory, Science Education 73 (1) (1989) 59–69.
[43] T. Tsiatsos, A. Konstantinidis, Utilizing multiplayer video game design principles to enhance the educational experience in 3D virtual computer
supported collaborative learning environments, in: 2012 IEEE 12th International Conference on Advanced Learning Technologies (ICALT), IEEE, 2012,
pp. 621–623.
[44] E.H. van Zee, H. Jansen, K. Winograd, M. Crowl, A. Devitt, Integrating physics and literacy learning in a physics course for prospective elementary and
middle school teachers, Journal of Science Teacher Education (2013) 1–27.
[45] A. Zusman, Challenges facing higher education in the twenty-first century, American Higher Education in the Twenty-First Century: Social, Political,
and Economic Challenges 2 (2005) 115–160.