The Role of Computers in Aphasia Rehabilitation


Volkbert M. Roth, Richard C. Katz, in Handbook of Neurolinguistics, 1998
2.3.3 Simulation
Simulation tasks are also called "microworlds." They present a structured environment
with an intentional problem. The patient must often use elements within the simulated
environment to solve the problem. Microworlds provide the opportunity to design
divergent therapy tasks with various solutions to real-life problems. Although
microworlds require complex programming, the skills patients acquire from using them
seem more likely to generalize to real-world situations than gains achieved through
more traditional formats, because microworlds more closely approximate the variables
found in the real world. The focus of simulations is to develop an effective (problem-solving)
strategy (Crerar, Ellis, & Dean, 1996).
Technology Supports for Acquiring Mathematics
M.J. Nathan, in International Encyclopedia of Education (Third Edition), 2010
Logo
Logo (Papert, 1980) is both a computer programming language and a microworld, a
designed learning environment to promote mathematical reasoning and problem-
solving skills through an innovative process of directing the actions of a mathematical
creature called the Logo turtle. The turtle can move forward or backward, rotate left
or right, stop, and raise or lower its pen in response to programmed
commands. Although the original turtle was a physical robot that ran along the floor or
paper, in later versions it was replaced by a graphical turtle on a computer screen.
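The turtle's behavior can be illustrated with a minimal sketch in Python (the class, its method names, and the coordinate conventions below are illustrative assumptions, not Logo's actual implementation): position and heading are updated with elementary trigonometry, which is exactly the "turtle geometry" children reason about.

```python
import math

class Turtle:
    """Minimal Logo-style turtle: tracks position, heading, and pen state."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 90.0             # degrees; 90 = facing "up", as in Logo
        self.pen_down = True
        self.path = [(self.x, self.y)]  # points visited while the pen is down

    def forward(self, dist):
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)
        if self.pen_down:
            self.path.append((self.x, self.y))

    def back(self, dist):
        self.forward(-dist)

    def right(self, angle):
        self.heading -= angle

    def left(self, angle):
        self.heading += angle

# The classic first Logo program, REPEAT 4 [FORWARD 50 RIGHT 90], draws a square.
t = Turtle()
for _ in range(4):
    t.forward(50)
    t.right(90)

# After four sides the turtle is back at (or numerically very near) the origin.
print(math.isclose(t.x, 0, abs_tol=1e-9), math.isclose(t.y, 0, abs_tol=1e-9))  # → True True
```

The pedagogical point is that the child's own body knowledge ("walk forward, turn right") maps directly onto the commands, making the geometry of the square tangible.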
The Logo environment offers a way for the child to externalize mathematical ideas and
procedures and project them onto the actions and properties of the turtle and the Logo
programming language (Eisenberg, 2003). Yet, it also becomes an object-to-think-with
(Resnick et al., 1996) and has been used to conceptualize many areas of mathematics,
including modern algebra and group theory, computer science, cybernetics, as well as
Euclidean and non-Euclidean geometry (Abelson and diSessa, 1981).
Logo has long been a tool for doing mathematics and mathematics instruction, and
there is a large collection of empirical studies investigating its impact on mathematics
learning, teaching, and discovery. For example, fourth graders familiar with Logo
programming were better able to apply what they learned and elaborate on their
procedural interpretations of geometry concepts than those taught from an inquiry-
based approach (Lehrer et al., 1989). In other studies, Logo improved students’ use of
geometric models in other areas of mathematics, generalization and abstraction of
geometric operations, and improved complex reasoning along with more general
cognitive skills (Battista and Clements, 1991; Clements and Battista, 1991, 1992; Lehrer
and Littlefield, 1993).
As is the case with educational technology more generally, the effects of Logo have as
much to do with the teaching and the engagement of the students as with the technology
itself (Kozma, 1991, 1994; though also see Clark, 1983, 1994).
The essential ideas conveyed in Papert’s (1980) original work, Mindstorms, inspired a
broad range of technological designs for learning and instruction, including: StarLogo,
which uses concepts of parallel computation to introduce participants to the
computational and cognitive aspects of modeling complex, dynamic systems
(e.g., Colella et al., 1999); and the NetLogo Project (reviewed below) at Northwestern
University and The University of Texas at Austin (Wilensky, 1999; Wilensky and Stroup,
1999), which supports distributed computing.
Transfer of Learning, Cognitive Psychology of
G. Steiner, in International Encyclopedia of the Social & Behavioral Sciences, 2001
6.2 About Future Transfer Research
A great deal of admirable experimental work has been done within transfer research, but
there is a strong bias toward artificial microworlds in the tasks to be learned, which
diminishes the usefulness of the results with regard to broader, less experimental settings. As far as
the ‘analogues’ are concerned, the nonexperimental reality often looks different: at the
beginning of many problem-solving processes there are no analogues whatsoever in the
learner's mind. It would be promising for future transfer research to focus on the
subprocesses at the outset of transfer when individuals try to find and define the mental
gaps to be filled in their learning or problem-solving tasks or when the retrieval
subprocesses are set in motion. There is, furthermore, a definite need for more research
focusing on both metacognitive and noncognitive aspects for explaining the lack of
transfer so often observed.

Discovery worlds and simulations


Another product of ITS development throughout the 1980s was discovery worlds (Ahuja
and Sille, 2013) or simulations. A simulation is a model of scientific (or social)
phenomena that represents the domain-specific properties and conceptual
representations of those phenomena. A simulation may emulate some features of real
phenomena but not others. Such environments were originally referred to as
microworlds by Papert, who described them as "… subset[s] of reality or a constructed
reality so … as to allow a human learner to exercise particular powerful ideas or
intellectual skills" (Papert, 1980, p. 204). For example, digital simulations enable
learners to change aspects of phenomena, e.g., to speed up, slow down, resize, or
simplify aspects of real-world phenomena. Learners can explore phenomena that are
impossible to explore in real life due to their size and/or time scale (Gobert and
Clement, 1999), or that are unsafe to engage with directly.
The goal with simulations is for students to use a computer simulation environment to
learn content and skills on their own (Shute and Psotka, 1994). The notion of student-
led discovery, however, is at odds with an ITS, since the goal of an ITS is to provide
feedback to guide students' learning. For example, White (1984) created a system that
allowed students to explore Newton's laws of motion in a discovery world (a simulation)
by controlling a spaceship and navigating it toward a target or through a maze. While
these types of systems allow for more freedom on the part of the student to guide their
own learning processes, they do demand some knowledge and skill to operate within the
microworlds. Students must know what to manipulate and how to do so, e.g., design an
experiment or support a hypothesis (Shute and Psotka, 1994).
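White's system itself is not described here in enough detail to reproduce, but its underlying mechanic, a craft that obeys Newton's laws and that the student steers with discrete thrusts, can be sketched roughly as follows (all names and parameters are assumed for illustration):

```python
# Minimal Newtonian "spaceship" microworld: the craft drifts at constant
# velocity unless a thruster fires (Newton's first law); a thrust changes
# velocity in proportion to force over mass (Newton's second law, a = F/m).
def step(state, thrust=(0.0, 0.0), dt=1.0, mass=1.0):
    """Advance the ship one time step using simple Euler integration."""
    x, y, vx, vy = state
    ax, ay = thrust[0] / mass, thrust[1] / mass
    vx, vy = vx + ax * dt, vy + ay * dt
    x, y = x + vx * dt, y + vy * dt
    return (x, y, vx, vy)

ship = (0.0, 0.0, 1.0, 0.0)           # drifting rightward at unit speed
ship = step(ship)                      # no thrust: velocity is unchanged
ship = step(ship, thrust=(0.0, 1.0))   # fire the "up" thruster once
ship = step(ship)                      # ship now drifts diagonally
print(ship)  # → (3.0, 2.0, 1.0, 1.0)
```

The instructive surprise for students is the second line of output: after the thruster stops firing, the sideways velocity persists, contradicting the naive "motion requires force" intuition that such discovery worlds are designed to confront.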

Gastrointestinal Diseases: Psycho-social Aspects


Antonina Mikocka-Walus, Lesley Graff, in International Encyclopedia of the Social &
Behavioral Sciences (Second Edition), 2015
The GI Tract and the Brain–Gut Axis
The human GI tract extends from the mouth to the anus, and includes the esophagus,
stomach, approximately 13 feet of small intestine, and 7 feet of large intestine. The GI
tract has a crucial role in survival, as it is responsible for breaking down food, extracting
nutrients, exchanging fluids, and expelling waste. It is a dynamic organ that actively
moves contents through the tract using coordinated ripple-type muscle contractions.
A microworld of bacteria, known as the gut flora or gut microbiota, exists throughout
the GI tract, and facilitates the breakdown and processing of foodstuffs. The GI tract is
the most highly innervated organ in the body, and has the largest concentration of
neurotransmitters outside the brain, including approximately 90% of the body's
serotonin, supporting the view of a highly developed brain–gut communication
pathway. The GI tract is regulated through its own internal nervous system, the enteric
nervous system, and the central nervous system, via a complex interaction of the neuro–
endocrine–immune systems, with the autonomic nervous system playing a central role
in brain–gut communication (Bonaz and Bernstein, 2013).
This bidirectional connection system is known as the brain–gut axis. While gut action
typically occurs without much awareness or focus, visceral afferent information from the
GI tract has been shown to be more available to conscious awareness than other body
systems (e.g., cardiovascular). Actions including gastric emptying, transit time, and
small and large intestine contractions can be altered by psychological processes such as
emotion and stress (Mayer et al., 2001). Certainly, gut upset (e.g., loose stool, nausea) is
a common experience for many when stressed.

Comprehension, Cognitive Psychology of


G. Hatano, in International Encyclopedia of the Social & Behavioral Sciences, 2001
3.2 Discourse Comprehension
Comprehension of discourse (including both narrative and expository text) is one of the
most popular topics in the study of understanding. Here, the input information is in the
form of a set of spoken or written sentences, and the comprehender's job is to form a
coherent whole from individual sentences. Versions of schema theory (e.g., Schank and
Abelson 1977) were predominant in earlier studies. However, as pointed out by Kintsch
(1998), such top-down views have limitations: Discourse comprehension is interactive if
not bottom-up, and it is much more flexible and context-sensitive than predicted by
schema theory.
Most contemporary investigators assume that to understand a story (i.e., a narrative
text) is to build a microworld in which the described events are likely to occur. Kintsch
(1998) calls this microworld the ‘situation model.’ In addition to the textbase that
consists of those elements and relations that are explicitly described in the text itself, the
situation model involves pieces of information that make the textbase coherent,
complete, and interpretable in relation to prior knowledge. These pieces of information
are inferred from both the text and prior knowledge. Multiple situation models may be
constructed from the same text (e.g., differing in the coverage or the mode of
representation).
When people understand a story, they make inferences that are not given in the text.
They sometimes infer that an event occurred, even though it is not written explicitly
(e.g., from the sentence, ‘He took a subway downtown,’ the reader infers that the subject
bought a ticket); other times, the reader infers how an event occurred, although the text
does not give any concrete details. (From the sentence, ‘He bought a ticket with coins,’
one infers, ‘He used a vending machine.’) In other situations, readers connect two
adjacent propositions. (From the two sentences, ‘Taro wanted to buy some tuna for
sashimi at a discounted price. He went to the fish-market,’ one infers, ‘Taro went to the
fish-market to buy tuna there.’) How many inferences are generated spontaneously may
depend on the readers, texts, and modes of reading. Graesser et al. (1994) claim that the
inferences that are needed to explain why given events occur and to establish the
coherence of the text tend to be induced spontaneously as the text is being processed.
How many inferences are based on conscious and deliberate attempts to build
consistent and detailed situation models also depends on several factors. Kintsch (1998)
proposes that a coherent situation model is built based on these presented and inferred
propositions through spreading activation. However, studies using experimentally
designed defective texts (e.g., in Collins et al. 1980) have revealed that text
comprehension may require a number of effortful attempts to instantiate, coordinate,
and even insert pieces of information.
Almost everyone is very experienced in discourse comprehension, because it is a major
medium of human communication. However, knowledge about the topic of the
discourse still makes a difference, especially in the generation of inferences. As a result,
more knowledgeable people can build a richer situation model than less knowledgeable
ones. They learn more, especially when the text is less coherent and thus requires
comprehension activity on the part of the comprehenders. Schneider et al. (1989)
demonstrated that students who knew a lot about soccer not only remembered the
details of a given story about soccer better, but also made more inferences and
recognized contradictions in the text more often than their age-mates who knew little
about soccer.

Quantitative Research and Educational Measurement


Björn Nicolay, ... Samuel Greiff, in International Encyclopedia of Education (Fourth
Edition), 2023
Applied computer-based process data analysis in educational measurement:
the example of complex problem solving
At this point, having presented a general overview of the current state of computer-
based process data analysis in educational measurement, we would like to illustrate how
computer-based process data analysis can be used in practice to fill knowledge gaps
regarding how students apply a skill that has been shown to be particularly relevant for
educational success and beyond: the domain-general 21st century skill complex problem
solving (CPS).
CPS skills can be defined as the ability to solve problems that possess unique features,
such as hidden relationships between variables, multiple goals to be reached
simultaneously, and the potential for variable values to change dynamically at any given
time during the solution process (Greiff et al., 2013a,b). One example of a complex
problem could be trying to use a recently updated self-service ticket machine to
purchase a train ticket, an example that has already been used in PISA (OECD, 2014;
here CPS was labeled as creative problem solving). In order to ultimately make the
correct purchase, it is crucial to apply a strategic approach, deliberately manipulating
one element of the machine at a time (i.e., the vary-one-thing-at-a-time or VOTAT
strategy, also referred to as the control of variables or CVS strategy; Schwichow
et al., 2016), to build up one's knowledge instead of performing random clicks.
Computer-based microworlds have become the current standard for assessing students'
CPS skills (e.g., Stadler et al., 2015). These microworlds involve different hypothetical
scenarios as complex problems that must be solved, such as creating a lemonade using
different ingredients with arbitrary labels or evaluating the impact of changing the
dosage of several unknown types of medication on different body functions (Greiff et al.,
2013a,b). In CPS assessment, students can explore the initially hidden relationships
among the variables within a given item by moving sliders and clicking different
buttons. These actions, as well as others, can be used for process data analysis.
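The logic of VOTAT can be sketched with a toy microworld (the hidden weights, function names, and two-slider setup below are invented for illustration; real CPS items are richer): holding all but one input constant makes each input's effect on the output directly observable.

```python
# Hypothetical CPS microworld: two sliders drive one output through hidden
# linear weights that the learner must discover.
HIDDEN = {"a": 2.0, "b": -1.0}   # unknown to the learner

def run_trial(a, b):
    """The microworld: return the observable output for given slider settings."""
    return HIDDEN["a"] * a + HIDDEN["b"] * b

def votat():
    """Infer each weight by varying one slider while holding the other at zero."""
    baseline = run_trial(0, 0)
    effect_a = run_trial(1, 0) - baseline   # only 'a' varied
    effect_b = run_trial(0, 1) - baseline   # only 'b' varied
    return {"a": effect_a, "b": effect_b}

print(votat())  # → {'a': 2.0, 'b': -1.0}
```

A learner who instead moved both sliders at once would see only the confounded sum of the two effects, which is precisely why unsystematic exploration fails on such items.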
See Fig. 2 below for an exemplary CPS item.

Fig. 2. A complex problem microworld involving multiple variables with hidden
relationships (see also Fig. 1 in The role of computer-based process data in
educational measurement section for a corresponding log file containing process
data). After the variable relationships have been discovered in the problem space
(top part) by applying strategies such as VOTAT, the student is asked to indicate
each present relationship in the model space (bottom part).

Importantly, computer-based process data are collected and stored in log files while
students work on the complex problems within such a microworld. These log files
capture all relevant components of a student's interaction with a given problem,
including time spent on an item, all variable manipulations and their corresponding
time stamps, and whether the student achieved or at least approximated the predefined
goals (Xu et al., 2018). A common approach to extracting valuable information from this
computer-based process data after CPS assessment is to write an automated program
that iterates through all log files and parses the relevant data points for subsequent
statistical analyses. For instance, if data are stored in XML format, this can be
achieved using so-called XML parsers (i.e., programs that automatically extract the
desired information from raw log files; e.g., Applen and McDaniel, 2009).
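As a rough sketch of such a parser (the log-file schema, element names, and attributes below are invented for illustration; real assessment platforms define their own formats), Python's standard library is sufficient to pull variable manipulations and time stamps out of an XML log:

```python
import xml.etree.ElementTree as ET

# Hypothetical log for one student on one CPS item; the schema is assumed.
LOG = """
<log student="S01" item="lemonade">
  <event t="3.2" action="move_slider" variable="sugar" value="2"/>
  <event t="7.9" action="move_slider" variable="lemon" value="1"/>
  <event t="12.4" action="apply"/>
</log>
"""

root = ET.fromstring(LOG)

# Extract (time stamp, variable) for every slider manipulation.
manipulations = [
    (float(e.get("t")), e.get("variable"))
    for e in root.iter("event")
    if e.get("action") == "move_slider"
]

# Total time on the item: time stamp of the final logged event.
total_time = float(root.findall("event")[-1].get("t"))

print(manipulations)  # → [(3.2, 'sugar'), (7.9, 'lemon')]
print(total_time)     # → 12.4
```

In practice such a script would loop over a directory of log files and write the extracted features (manipulation counts, latencies, goal attainment) to a table for statistical analysis.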
The importance of CPS, understood as the ability to manage novel and dynamic
situations in a systematic way, for both educational and career success has been
discussed in multiple studies (Schweizer et al., 2013; Wüstenberg et al., 2012). Thus,
it does not come as a surprise
that a large body of research on CPS using computer-based process data from large-
scale assessments, including PISA, has accumulated in recent years (Greiff et al.,
2015; Han et al., 2019; Liu et al., 2020). For instance, research has established the
particular benefit of applying strategies such as the aforementioned VOTAT strategy
alone or in concert with other related strategies for successful CPS performance (Molnár
and Csapó, 2018). Additionally, the time students spend on individual complex
problems has been shown to influence their probability of solving the problem
successfully or unsuccessfully (Scherer et al., 2015). A study by Ren et al.
(2019) uncovered how students can successfully balance multiple goals to be achieved
simultaneously in CPS. Additional research has sought to identify different levels of CPS
proficiency in students, for example based on how they approach a CPS assessment test
(Greiff et al., 2015; Stadler et al., 2020). Eichmann et al. (2020) used PISA computer-
based process data to uncover differences in students' CPS performance based on their
sex and ethnic background: the higher CPS success rates found among boys compared
to girls were attributable to differences in exploration behavior between the sexes,
whereas the behavioral differences investigated were unable to account for the
performance differences between students with versus without migration backgrounds.
In summary, as demonstrated by the selected research endeavors described above,
computer-based process data analysis has greatly supported the investigation of
students' competence levels in crucial skills such as CPS (for an overview of computer-
based process data use in CPS, see also Herde et al., 2016). At the same time, computer-
based process data analysis has helped to uncover opportunities to improve students'
skills in these areas by means of training programs in digital learning environments
(e.g., Azevedo, 2007). Moreover, as computers and digital learning and assessment tools
become increasingly prevalent in the educational context, we can expect further
advancements based on computer-based process data analysis in the near future.
Consequently, we will now discuss potential challenges and future focus areas for
computer-based process data analysis, after providing a broad summary of its current
state of application in the educational measurement domain.

Flexible Learning in Higher Education


S. Alexander, in International Encyclopedia of Education (Third Edition), 2010
Information and Communication Technologies
The development and release in the early 1990s of multimedia-capable computers and
the Internet came at the very time when the crises of higher education were being widely
discussed and thus became the catalyst for many of the changes that ensued. University
administrators and academics alike were attracted, as many commentators noted, like
moths to a flame to the new technologies, which were seen as a panacea for the
contemporary issues they faced.
In one of the largest initiatives of its kind in the world in higher education, the UK
government provided significant funding for the Teaching and Learning Technology
Programme (TLTP) with the aim of “making teaching and learning more productive and
efficient by harnessing modern technology” (Teaching and Learning Technology
Program, 1996). In Australia, successive national government committees, such as the
Committee for the Advancement of University Teaching (CAUT) and the Committee for
University Teaching and Staff Development (CUTSD), offered grants to institutions for
the development of projects to improve teaching and learning. Although not specifically
targeting the use of new information and communication technologies, the reality was
that a significant proportion of the projects submitted involved the use of the new
technologies.
Early developments in the use of new technologies for flexible learning included the
development of CD-ROMs (compact disc read-only memory) containing course
materials in the form of lecture notes, databases, animations, simulations, and
collections of still and moving images and sounds. Some of the learning materials were
said to be interactive when there was a degree of learner choice through, for example,
following hypertext or hypermedia links, and through multiple-choice selections on
screen. Other programs used more sophisticated interactive multimedia software to
provide that user choice. Although still in use today, especially in locations where the
Internet is unavailable, slow, or unreliable, CD-ROMs have largely been replaced by the
Internet as a delivery mechanism.
From the late 1990s until the present day much of the focus of higher education's use of
the Internet has been through the deployment of specialized software variously referred
to in different countries as Course Management Systems, Virtual Learning
Environments, and Learning Management Systems. These tools have become almost
ubiquitous in higher education institutions in First World countries. Although this
software had its origins in developments in computer-mediated conferencing in the
mid-1970s, it was rarely used outside conventional distance-education higher education
institutions until the mid-1990s. Early commercially available software included
FirstClass, Lotus Notes, Top Class, Blackboard, and WebCT. More recently, there has
been a significant move toward the adoption of open-source software such as Moodle
and SAKAI.
Those tools have been supplemented by more recent technological developments such
as the mobile phone (including the iPhone), Personal Digital Assistants (PDAs), and
Personal Access Devices (PAD), which, when used with ubiquitous wireless
communications with high bandwidth, enable what is now known as m-learning or
mobile learning. Students may use these devices to access new mobile versions of
Learning Management Systems, and to download and listen to or watch podcasts (audio
recordings) and vodcasts (video recordings) of lectures available on Internet sites such
as iTunesU and YouTube, thus facilitating access to content at a time and location
convenient to them.
Some universities have opened up the content of their courses to anyone with Internet
access. The most famous of these is the Massachusetts Institute of Technology (MIT)
OpenCourseWare initiative which, at the time of writing, has 1900 courses available
online complete with syllabi, course materials, assignments, and in some cases, videos
of lectures.
The degree of interactivity afforded by the earlier technologies was minimal compared
to that available in the late 2000s. The rapid adoption of social networking tools such as
Facebook, LinkedIn, Twitter, and MySpace have provided opportunities for the
proliferation of user-controlled networks of ‘friends’ sharing photos, videos, blogs, and
personal profiles (Mason and Rennie, 2008). Third-party tools, such as Flickr (photos)
and del.icio.us (social bookmarking), can be integrated with personal profiles. Students
form special-interest groups (which share common pages and message boards and may
be private or public) within social networking sites which, in the higher education
context, are used for diverse activities such as group homework problem solving, and
general academic support. However tempting it is for universities to make use of these
sites, students express a strong preference to be left alone within them (Aleman and
Wartman, 2009). Each of these tools does, however, support learning which is time and
location independent.
In addition to the growing use of these Web 2.0 tools, there has been the increased
sophistication of virtual worlds software (such as Second Life) which enables learners to
inhabit computer-generated worlds using a computer-generated self known as an
avatar. While some higher education institutions have reproduced real-life classrooms
in which lectures take place, others have taken the opportunity to design qualitatively
different learning experiences for students.
Effectiveness of Information and Communication Technologies for Flexible Delivery
Early evaluations of the effectiveness of these innovations found some evidence of
benefits of the use of these new technologies to higher education. In a large-scale
national study of 104 projects, the majority of which had developed computer-based
learning/interactive multimedia products, Alexander and McKenzie (1998) found that
for students there was some evidence in some projects of:

improved quality of learning;
improved productivity of learning;
improved access to learning; and
improved student attitudes to learning.

Examples of positive learning outcomes which resulted from students' use of
information-technology (IT)-enabled projects included:

opportunities for students to interact with others to gain a more sophisticated and
global understanding of complex international political issues;
improved understanding of concepts which students are known to have difficulty with
in a range of disciplines, through the use of interactive multimedia animations,
simulations, and microworlds;
enhanced communication between part-time students and their lecturer, through the
use of the Internet;
the acquisition of information, such as language learning, where a high component of
factual recall is required; and
the facility for students to assess their own learning of concepts, through
computer-based qualitative and quantitative assessment modules.
The benefits for faculty involved in the projects were found to include: job satisfaction
flowing from the improved learning of their students; an improved understanding of
student learning, student needs, and difficulties; an improved understanding of their
own discipline area; enhanced enthusiasm for teaching; and, for some, an increase in
their own personal profile.
For departments or faculties, the major benefit was the faculty development
opportunities afforded by faculty member's participation in the projects. This
sometimes led to significant change in teaching approach in areas other than the
designated project, as faculty developed enhanced understanding of learning and
teaching. For some departments, the project outcome helped faculty to cope with
decreased resources, without a commensurate decrease in the quality of teaching.
Finally, the teaching profile of some departments was raised as a result of external
recognition of the innovation.
The study emphasizes the fact that it is not the presence of information and
communication technologies by itself that accounts for enhanced learning, rather it is
the design of the learning experience, the support of the learners undertaking that
experience, and the students' perception of the context in which they learn.
In a more recent study of lecture recording technologies used across four
universities, Gosper et al. (2008) report that students perceive the following benefits of
lecture recording technologies in improving their learning: picking up on things they
missed in class; revising for exams; revisiting complex material, ideas, and concepts;
and working through material at their own pace.
Faculty have expressed two main concerns around the use of these lecture recording
technologies: that students would not pace their use of these technologies, resulting in
the need to cram learning at the end of the semester, and/or that they would stop
coming to lectures. This fear was somewhat confirmed by the Gosper et al. study in their
report that only approximately 50% of students listened to the recordings on a regular
basis and almost 40% listened to several weeks' worth of recordings at one time. Of the students who no
longer attended lectures, as reported in the study, 68% said they had chosen not to
attend because they could learn as well from the recorded lectures as they could from
the face-to-face lectures.
Critique of Flexible Delivery
Alongside the benefits ascribed to flexible delivery using the information and
communication technologies discussed above, there are also significant challenges. First
and foremost, there remain, despite the rhetoric about the high levels of IT literacy
of generations X and Y, whole cohorts of students who do not have sufficient levels of IT
literacy to access the course materials. Adequate technical support is essential to these
learners in achieving successful learning outcomes.
A second factor is that of students reporting insufficient time to devote to the
course. Mason (2001) famously proclaimed that “time is the new distance,” since lack of
time, rather than long distance, has become one of the primary reasons that students
withdraw from courses. The self-discipline required to undertake a course that has been
delivered to the door is far greater than that required by students whose attendance
requirement at face-to-face lectures serves as a time-management strategy. It is often
only the experienced learners who have the time-management skills to undertake such
self-directed study, yet the majority of students in the higher education system are
undergraduates, who have come directly from high school, and who are studying full
time.
A third critique has centered around the degree to which information and
communication technologies are in fact innovative. Many authors have pointed to the
long-term existence of books and papers which have for many years facilitated learners'
access to the content of learning in a way that is time and location independent. These
authors also point to the fact that learners have control over the order in which they
read printed material, thus facilitating learner control over the content of learning that
is held up as a unique feature of information and communication technologies. Others,
such as Alexander and Boud (2001), argue that the potential for the use of information
and communication technologies to enhance learning has been lost because faculty
have, by and large, simply used the new technologies to automate existing didactic
practices. Much of what passes for innovation in learning is little more than lectures
turned into podcasts and vodcasts, for example, or textbooks repurposed as websites
with electronic page turning.
The view of flexible learning described above which equates flexible learning with the
use of technologies for flexible delivery of teaching/learning has been criticized for being
a particularly narrow view of flexible learning. It does, nonetheless, remain a prominent
discourse in both everyday discussions and promotional literature.
There are, however, other views of flexible learning which are discussed below.

The Development of Early Childhood Mathematics Education
Arthur J. Baroody, in Advances in Child Development and Behavior, 2017
4 Is There Evidence That Concrete Experiences Work?
Some empirical evidence indicates that, for example, concrete experiences are useful in
extending existing informal knowledge by providing young children an opportunity to
discover and apply a mathematical regularity or devise and practice an informal strategy
(Boggan, Harper, &amp; Whitmire, 2010; Clements &amp; Sarama, 2012) and that games
(Bright, Harvey, & Wheeler, 1985), including computer games (Baroody, Purpura,
Eiland, & Reid, 2015; Obersteiner, Reiss, & Ufer, 2013; Shin, Sutherland, Norris, &
Soloway, 2012) can be valuable in promoting mathematical learning. However, in light
of the preceding discussions on instructional strategies for ensuring effective use of
concrete experiences and Dewey's (1963) interaction principle, it should not be
surprising that research on the effectiveness of concrete experiences is mixed (see, e.g.,
reviews by Mix, 2010; Uttal, 2003). In explaining the mixed results of Thompson's
(1992) use of the Blocks Microworld program and research on the effectiveness of
manipulatives in general, Mix (2010) concluded that whether a model might or might
not work depends on such factors as how manipulatives are used, the outcome measure,
and the characteristics of the learner.
For instance, Walker, Mickes, Bajic, Nailon, and Rickard (2013) evaluated the relative
efficacy of using a conceptual approach (fact triangles) and a drill approach (answer-
production [AP] training) to promote fluency with subtraction combinations with grade
1–6 students. Fact triangles (e.g., see Fig. 10) are widely used to help children see that
subtraction is related to addition and that known sums can be used to reason out
unknown differences (e.g., if 3 + 4 = 7, then 7 – 3 = 4). Walker et al. (2013) found that
AP training was significantly more efficacious in promoting subtraction fluency with
practiced combinations than the fact-triangle intervention but that neither approach
promoted transfer of fluency to unpracticed subtraction combinations. These
researchers concluded that fact triangles “are not an effective vehicle for fluency training
or for establishing flexibly applicable arithmetic skill” and curricula should
“deemphasize fact-triangle exercises in favor of more AP training” (p. 30).
Fig. 10. Example of a fact triangle using 3–4–7.
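The reasoning a fact triangle encodes, one known addition fact yielding its related subtraction facts, can be sketched as a small "fact family" generator. The snippet below is purely illustrative; the function name is invented, not taken from the Walker et al. (2013) study:

```python
# A fact triangle links one addition fact to its related subtraction
# facts (the "fact family"). Given the triangle 3-4-7, knowing
# 3 + 4 = 7 yields 7 - 3 = 4 and 7 - 4 = 3.
def fact_family(a, b):
    """Return the fact family for the triangle (a, b, a + b)."""
    total = a + b
    return [
        (a, '+', b, total),
        (b, '+', a, total),
        (total, '-', a, b),
        (total, '-', b, a),
    ]

for left, op, right, result in fact_family(3, 4):
    print(f"{left} {op} {right} = {result}")
# prints:
# 3 + 4 = 7
# 4 + 3 = 7
# 7 - 3 = 4
# 7 - 4 = 3
```

The point of the triangle format is that all four facts are derivable from a single memorized relation, which is the conceptual link the intervention was meant to teach.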

However, the fact-triangle intervention in the Walker et al. (2013) study may have failed
because of the ineffective manner in which the training was implemented. The model
used was (a) perhaps only semiactive (Section 3.3, Point 2), (b) not designed to prompt
reflection (Point 3), (c) not particularly meaningful (Point 4), (d) without explicit
connections between procedures and concepts (Point 5), (e) relatively short in duration
(Point 6), (f) dependent on a single (not multiple) representation (Point 7), and (g) not
purposeful and engaging (Point 8). In regard to Point 4, their fact-triangle training
involved only two of seven steps in a HLT for fostering the meaningful memorization of
subtraction combinations (Baroody, 2016a). In contrast, virtual concrete experiences
designed to be consistent with Points 2–4 and 6–8 were significantly efficacious in
promoting fluency with unpracticed subtraction combinations (transfer), which is a
primary goal of education (Baroody, Purpura, Eiland, & Reid, 2014; Baroody, Purpura,
Eiland, Reid, & Paliwal, 2016). Moreover, although Walker et al. did attempt to gauge
transfer, they did not measure which method was more effective in fostering the
conceptual understanding that addition and subtraction are related operations. Finally,
aside from a fluency pretest, these researchers did not assess internal factors such as developmental readiness to benefit from fact-triangle instruction.
Mix (2010) concluded that manipulatives “play different roles in different situations” (p.
41) and the key question is not “do such educational tools work” but “do these materials
used in this particular way activate this particular mechanism in this particular
learner?” Moreover, citing Ginsburg and Golbeck (2004), she noted, “almost no
research has addressed how or why these materials might help” (p. 41). To address such
a question effectively, researchers need to spell out the role manipulatives are presumed
to play—a factor that dictates the outcome measures. Importantly, both the theoretical
model and intervention effort need to take into account Dewey's (1963) principle of
interaction—how external factors are intended to mesh with internal factors.
Evaluations of manipulatives involving interventions that violate this principle (e.g.,
attempt to impose a manipulative procedure via direct instruction quickly) confound
instructional approaches with the potential value of the manipulative in a learning
environment that honors the principle (e.g., encourages a child to use their existing
knowledge to reflect on how to use the manipulative). Furthermore, a fair evaluation of
manipulatives should include assessing the developmental readiness of participants
along a HLT. To date, such internal factors have been largely overlooked when
researchers construct hypotheses regarding why and how manipulatives work and
evaluate the impact of manipulatives.
GIS Methods and Techniques
Brian Tomaszewski, ... Jacob Hartz, in Comprehensive Geographic Information Systems, 2018
1.25.2.1 Disaster Management and Serious Gaming
In the field of disaster management, serious games are being introduced as a means of addressing the shortcomings of traditional training tools such as simulated drills. In addition to lacking realism, these drills often require large investments of time and money to arrange and execute, which makes them difficult to repeat at short intervals. The recent introduction of serious games into disaster management training allows upcoming first responders to achieve effective results with a smaller investment of time and money. The following reviews several released serious games that include either a mapping tool or a GIS component, as well as common mechanics seen within these games.
Many existing serious games in disaster management include a GIS component as a
means of providing spatial awareness in the gaming scenario. The presentation of maps,
context for the scenario, specific locations, and the ability to interact with the
environment are all examples of spatial awareness as presented in a gaming context. A
gaming scenario that presents this exceptionally well is C3Fire, a microworld simulation for emergency management (Granlund, 2001). The game relies heavily on the graphical interface a GIS can provide to enhance communication between players, serving both as a means for the leader to communicate with personnel and as a way for personnel to keep a record of their findings. Granlund (2001) found that participants who chose to use the GIS and mapping tools identified fires more accurately than those who relied only on the diary and standard communication tools. He also noted that data from the GIS tool were much more beneficial for debriefing the game, since they provided instructors with quantitative data rather than just qualitative feedback.
Several other disaster management games effectively provide spatial awareness without necessarily including real GIS functions. BreakAway Ltd. (2012) presents Incident Commander (Fig. 2), a game created in conjunction with the US Department of Justice, which also incorporates spatial context by giving users a map of the area surrounding the disaster. As players work through the situation, they can reflect on the context of the emergency and make decisions based on what is present in the area.
Fig. 2. Screen shot of Incident Commander from https://youtu.be/Gc1CnfQKkZc.

Hazmat Hotline uses maps in a slightly different way, still giving context to users but at a much more local level (Schell et al., 2005). By showing the locations of victims, of the source of the hazardous material, and of the crew, the game lets users think about how best to handle the situation given where everything is located relative to everything else.
Although GIS and spatial components are critical components of disaster training
serious games, there are other factors seen within released games that strengthen a
game’s viability as a training tool. One of these factors is the inclusion of a stress
component to portray the reality of the situation at hand. There are several ways to
address the stress component within a gaming context, one of which is using time as a
key game mechanic. Haferkamp et al. (2011) demonstrate how each is portrayed within the game DREAD-ED, which is built around limited time for team discussion and decision-making, giving the team between 30 and 45 min to reach a decision. To introduce a stress component as well, the game displays four scales to the players that change based on the decisions they make, providing real-time feedback after every move; both poor and wise decisions come with feedback. The tactical decision games created by Crichton and Flin (2001) incorporate similar components, allowing participants only an hour and a half to work through the game completely and introducing contingencies throughout its duration. The time component emphasizes the need
for emergency responders to act quickly in light of a disaster. In addition to a time limit,
stress is also factored into serious games through the inclusion of an unpredictable
factor. Created in conjunction with VSTEP and several agencies around
Europe, RescueSim is a flexible gaming environment that is controlled strictly through
an instructor toolbox (VSTEP B.V., n.d.). The instructor not only creates the original
scenario that will be presented to the players but also is capable of changing the weather
in real time, showing the progression of an incident as it would look in real life, and
introducing secondary events that stem from the primary one. None of these changes can be known or predicted by the players. SPOEL, a virtual environment created for managing and training mass evacuations, portrays stress in a slightly different manner, using changes in human behavior along with resource distribution and management as its primary sources of stress (Kolen et al., 2011). Victims within the game can change their opinions and actions based on media coverage and the decisions of the emergency crews. Road systems are also a limited resource: they can degrade within the game or become too congested to serve as viable evacuation routes.
Another component present in released disaster management games is the use of news
stories or information recaps within the game scenario. Information provided to the players throughout the scenario is another crucial piece of their ability to fully understand what is going on as the incident evolves around them. IMACSIM
provides this through use of waypoints (Benjamins and Rothkrantz, 2007). As the users
make their way through the simulated environment they are able to visit numerous
waypoints which provide information on the current state. These waypoints are flexible
with scenarios, meaning that they can fit a variety of different conditions and
emergencies, and they are also able to accurately reflect any changes that occur
throughout game play. Disaster in my Backyard also takes advantage of the opportunity
to introduce information throughout the game, using QR codes and victims as the
information source (Meesters and van de Walle, 2013). Set up as a live walk-through
game, this scenario is much more hands-on in its information presentation. As
players make their way through from start to finish, they are able to interact with actors
who are playing victims within the game, receiving various amounts of information as
they interact with them. Similarly, participants are also given an app which allows them
to interact with QR codes that are placed throughout the game environment. These QR
codes contain relevant information and allow communication between people as the
game plays out.
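The scenario-flexible waypoints described for IMACSIM can be thought of as a simple data structure that is populated per scenario and updated as the incident evolves. The sketch below is purely illustrative; the class and field names are invented, not taken from IMACSIM:

```python
# Hypothetical sketch of scenario-flexible waypoints: each waypoint
# reports the current state of the incident at its location and can
# be updated as the scenario changes during game play.
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    location: str
    # Scenario-specific information shown when a player visits.
    info: dict = field(default_factory=dict)

    def update(self, **changes):
        """Reflect changes that occur during game play."""
        self.info.update(changes)

    def visit(self):
        """Return the current state of the incident at this waypoint."""
        return dict(self.info)

wp = Waypoint("bridge", {"hazard": "flooding", "severity": "low"})
wp.update(severity="high")  # the incident evolves mid-game
print(wp.visit())           # prints {'hazard': 'flooding', 'severity': 'high'}
```

Because the information lives in a generic mapping rather than being hard-coded per emergency type, the same waypoint mechanism fits a variety of conditions, which is the flexibility the passage above attributes to IMACSIM.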