Lecture Notes in Educational Technology

Terje Väljataga
Mart Laanpere
Editors

Digital Turn in Schools—Research, Policy, Practice
Proceedings of ICEM 2018 Conference
Lecture Notes in Educational Technology
Series Editors
Ronghuai Huang, Smart Learning Institute, Beijing Normal University, Beijing,
China
Kinshuk, College of Information, University of North Texas, Denton, TX, USA
Mohamed Jemni, University of Tunis, Tunis, Tunisia
Nian-Shing Chen, National Yunlin University of Science and Technology, Douliu,
Taiwan
J. Michael Spector, University of North Texas, Denton, TX, USA
The series Lecture Notes in Educational Technology (LNET) has established itself
as a medium for the publication of new developments in the research and practice of
educational policy, pedagogy, learning science, learning environments, learning
resources, etc. in the information and knowledge age—quickly, informally, and at a
high level.
Abstracted/Indexed in:
Scopus, Web of Science Book Citation Index
Editors

Terje Väljataga
School of Educational Sciences
Tallinn University
Tallinn, Estonia

Mart Laanpere
School of Digital Technologies
Tallinn University
Tallinn, Estonia
This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd.
The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721,
Singapore
1 Introduction
Winquist and Carlson (2014) reported that while student evaluations of teaching were
high for the standard lecture-based approach, students showed poor long-term
retention of the material. Onwuegbuzie and Wilson (2003) also found
that a major impediment to learning in exam-based statistics courses is students’
(mathematical or statistical) anxiety. In another study, Loveland and Schneiter
(2014) found that an activity-based teaching method resulted in higher student
comprehension of statistical concepts and a greater ability to apply statistical procedures.
In view of these reported problems, the implication for statistics teachers is to
provide activities anchored in cognitive principles to develop the desired
performance, and to incorporate appropriate assessment into the learning journey so
that both teachers and students can determine whether learning outcomes are achieved,
in time to rectify shortcomings. Various strategies have thus been explored,
such as experiential learning that involves working with real data, solving real
problems, and improving real processes (Calderwood 2002); collaborative
learning through group assignments on the computation and interpretation of
data and descriptive statistics (Delucchi 2006) or the ‘jigsaw technique’ for completing an
entire worksheet sequentially (Perkins and Saris 2001); and the integration of ICT and
software, especially tools in demand in the market (Peiris 2002).
A congruent factor in every effective learning and teaching technique is motivation.
Learners have to be motivated both to begin learning and to continue the
process. Games, defined as ‘a form of participatory, or interactive, entertainment’
(Rollings and Adams 2003), seem able to provide the self-direction and autonomy
needed to sustain motivation and engagement in this participatory process. This
‘play as you learn’ concept was also evaluated by Glover (2013), who found that
gamification provided additional motivation for learners to fully complete
activities and, with careful consideration of the implementation, can encourage
‘good’ behaviour and discourage ‘bad’ behaviour. Better academic performance
on practical assignments was observed when a gamification plugin was deployed in
a learning management system, compared with traditional e-learning (de-Marcos
et al. 2017). On the other hand, Hanus and Fox (2015) opined that students can be
averse to the idea of ‘mandatory fun’ (earning badges, among other activities),
while Mollick and Rothbard (2014) reported that gamification was effective
only when individuals could choose whether or not to participate. Gåsland (2011)
analysed the impact of game elements using the game ‘World of Warcraft’, noting that the
tabard served as a badge other players could recognise and admire. Such status symbols
motivate players to excel.
Project-Based Learning for Statistical Literacy: A Gamification … 5
Biostatistics, a 60-h statistical literacy core module for the Diploma in Biomedical
Engineering (BME), was redesigned for the October 2017 semester to incorporate
a project-based learning approach. The implementation of project-based learning
required students to conduct a statistical study, in an area of their own interest, through
the design of research instruments, the collection and analysis of data, and the writing
of a report. Gamification elements drawn from popular gaming genres were infused to
better motivate and engage the students. The purpose of this qualitative study was to
determine the impact of infusing gamification elements into a project-based learning
approach by examining the following questions:
1. How did the gamification elements improve the students’ motivation level in a
group?
2. How did these elements help the students acquire the desired learning outcomes
as a group?
2 Methodology
2.1 Participants
A total of 28 BME second-year students from one class participated in this study.
They consisted of 16 males and 12 females, with a mean age of 19 years. This
biostatistics module was conducted between October 2017 and February 2018. There
were five summative assessment components, including a heavily weighted group
project component.
The participants worked on their group project over 15 weeks, in groups of two or
three. They were required to conduct a statistical study in an area of their choice,
such as how sleeping habits affect academic performance, and to write a formal report
on it. In the report, they had to describe their proposed course of action, carry out
statistical experiments, justify them with appropriate statistical techniques and analysis,
and explain the benefits or improvements to be expected. The quality of work was
assessed based on the description of the issues, depth of research, appropriateness of
statistical methods, analysis and interpretations, proposed benefits or improvements,
referencing, and on the overall organization and flow of the report. This project was
facilitated mainly through the BlackBoard Learning Management System (LMS).
Figure 1 shows the recommended timeline for major milestones.
6 H. Zhang and L. Fang
the most important component—it was the key tool to collect responses and data
(ammunition in the game sense) to gain accurate trends and insights of the topic or
problem statement. Hence, participants were required to put in effort at the Biostats
Stop to formulate appropriate survey/interview questions. Corresponding to the
‘Pokémon Gym’, where Pokémon trainers go to sharpen their skills and battle their
Pokémons against other players’ Pokémons, in Stage 5, the ‘Biostats Gym’, a team’s
research instrument was critiqued by another team using the rubrics provided, in
a round-robin manner. The critiquing team awarded one of three available
Pokémons, each indicating a respective ‘power’ (‘More work needed’, ‘Almost
there’ and ‘Great to go’), as their overall assessment of the quality of the work.
Upon completion of these five stages in ‘Level 1’, participants earned their second
badge but could not proceed to the next level until the green light was given, as
indicated on the leader board.
After having earned their second badge, teams were presented with the first
‘Bonus’ chance (to be elaborated in the next section). To clear ‘Level 2: Getting
more serious’, the teams watched an instructional video on using statistical software.
They had to use it to analyse the data collected through administering the
research instrument that had been refined in ‘Level 1’. The
teams were then automatically awarded the third badge and presented with the
second ‘Bonus’ chance.
To complete this game of ‘Biostatistics GO’, the team just needed to submit a
formal report that consolidated the processes and outcomes, broadly following the
universal guidelines of including ‘Introduction’, ‘Literature Review’, ‘Methodol-
ogy’, ‘Findings’, ‘Discussions’, ‘Conclusion’ and ‘References’. See Fig. 2 for an
overview of the game design.
Reward mechanisms. Apart from the badges to be collected, a leader board,
typically used in competitive activities to rank players according to
their achievement, was used here as a motivator. In this three-tiered ranking leader
board, every team could view their position according to the tutor’s assessment (based
on the same rubric provided to them during Level 1 Stage 5) of their performance in
Stage 4: how well the research instrument was drafted. Each team was awarded
one of three stars (gold, silver or bronze) according to merit, in descending
order. Teams were also ranked on their contributions in Stage 5, that is, to what
extent the critique was objective, actionable and constructive, with one of three
types of emojis: grinning, slightly smiley and unhappy (in descending order). Teams
could also view the rating provided by their critiquers. The signal to proceed to the
first ‘Bonus’ chance, termed ‘Level up’, was indicated on the leader board with a
flag (see Fig. 3).
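As a sketch only, the three-tiered leader board described above might be modelled as follows; the team identifiers and rank values are invented for illustration and this is not the actual BlackBoard implementation:

```python
# Hypothetical sketch of the three-tiered leader board.
# Tier 1: tutor's star (gold/silver/bronze) for Stage 4 instrument quality.
# Tier 2: peer emoji (grinning/slightly smiley/unhappy) for Stage 5 critique quality.
# Tier 3: 'Level up' flag signalling the first 'Bonus' chance.

STARS = ["gold", "silver", "bronze"]               # descending merit
EMOJIS = ["grinning", "slightly smiley", "unhappy"]  # descending merit

def leaderboard_row(team, tutor_rank, critique_rank, level_up):
    """tutor_rank / critique_rank run from 0 (best) to 2 (weakest)."""
    return {
        "team": team,
        "instrument_star": STARS[tutor_rank],      # Stage 4 assessment
        "critique_emoji": EMOJIS[critique_rank],   # Stage 5 assessment
        "level_up": level_up,                      # flag for first 'Bonus' chance
    }

board = [
    leaderboard_row("G1", 0, 1, True),
    leaderboard_row("G2", 2, 0, False),
]
```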
There were two ‘Bonus’ chances. In each, a team earned 20% of the total project
marks by submitting a progress review report that summarised their progress during
‘Level 1’ and after ‘Level 2’, respectively. As Tishkovskaya and Lancaster (2012)
noted, to make learning more efficient, the mental load students must carry needs to
decrease. By implementing this reward of 40% of the final mark, the scheme served,
first, as an extrinsic motivation for completing the game and, second, reduced the
cognitive load and stress of managing such complex statistical concepts and procedures.
Progress tracking. Tracking progress towards goals in games parallels the significance
of tracking learning processes, that is, identifying the tasks that remain in order to
win in the game context, or to achieve the desired learning outcomes in the instructional
context. The progress tracking tool used here was inferred from the reward
mechanisms. Using the leader board, participants could track whether tasks had
been completed. In addition, participants received a form of formative
assessment by looking at the tutor’s and peers’ ratings reflected on the leader board,
and could work out how to improve.
The levelling-up mechanism, along with the collection of badges in the game, also
provided a platform for participants to self-monitor and pace their progress, checking
whether they were on target with respect to the recommended timeline for completing
the project.
3 Evaluation
Using the framework of Kirkpatrick’s levels of learning (1994), the effectiveness
of these gamification elements, in terms of learner engagement and motivation
and whether they led to better project performance and the attainment of
statistical literacy, was investigated.
3.1 Instruments
For Level 1 (Reaction), an online survey consisting of 19 questions was designed.
The questions fell into two groups: those probing students’ difficulties and
self-perception of their formal writing competency, and those probing their levels of
engagement and motivation with respect to the different gamification elements.
Responses were scored on a 5-point Likert scale, ranging from 1 (strongly disagree)
to 5 (strongly agree). The survey was administered, with informed consent obtained
from all 28 participants, in January 2018, one week after they had submitted their
second progress review report.
For Level 2 (Learning), 12 participants from six teams of varying project
performance were selected for interview, to find out why their self-ratings
reflected an increase or decrease of one point. The interviews sought the most and
least significant online experiences for the project, as well as whether
the online set-up helped participants prepare for their project work in terms of teamwork,
sharing of knowledge, motivation, quality of work, meeting deadlines, etc.
For Level 1 (Reaction), the survey data were used to provide simple descriptive
statistics on responses to the gamification activities. For Level 2 (Learning),
three sources of data were used. The self-reporting section of the survey, on
competence in report writing, provided data on whether the participants perceived that
they were learning. To gain further insights, their responses were also compared with
their group report marks. Those whose rating increased or decreased by at least one point
were selected for an interview. This triangulation enabled a deeper understanding of
how learning was shaped by the online activities.
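The simple descriptive statistics for a 5-point Likert item can be sketched in a few lines; the responses below are invented for illustration and are not the study’s data:

```python
from collections import Counter
from statistics import mean

# Invented responses to one 5-point Likert item
# (1 = strongly disagree ... 5 = strongly agree).
responses = [5, 5, 4, 4, 4, 3, 5, 2, 4, 5]

counts = Counter(responses)  # frequency of each scale point
summary = {
    "n": len(responses),
    "mean": round(mean(responses), 2),
    # counts for every scale point 1..5, including zeros
    "counts": {point: counts.get(point, 0) for point in range(1, 6)},
}
```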
4 Results
4.1 Engagement
Participants’ responses to the survey showed that they were, overall, engaged and
motivated by the gamification elements. The breakdown of their responses is shown in
the series of charts in Fig. 4. Overall, the participants perceived gamification
elements such as the ‘Briefing Room’ and ‘Retrieve Mission’ as enjoyable; these
elements made the requirements of the work simpler, more understandable and more
appealing, and the ability to play back content allowed them to revisit the information
as and when they needed it. The only negative comment from a respondent about these
elements was that some doubts about the project still needed face-to-face clarification
with the tutor.
Regarding the first three stages of ‘Level 1’, participants found the gamification
design an interesting approach that encouraged them to complete the tasks, assisted
their thought process and served as a useful tracking tool, although one respondent
disagreed, finding the process too tedious.
Participants especially liked the ‘Biostats Stop’ and ‘Biostats Gym’, as they were
able to recognise their own errors and mistakes when critiquing their peers,
understood how to enhance their own research instrument by adopting the perspective
of a survey participant viewing another team’s questionnaire for the first time,
and appreciated the constructive feedback provided by their peers.
Not everyone enjoyed the leader board. While some students found they were
more motivated by the ranking and the competitive environment created, and
were driven to improve their project because of their own ranking, other students
found it discouraging that every team could see how the other teams fared. The
Pokémon references also failed to appeal to one South African-raised student who
lacked the contextual understanding of the evolution of the Pokémons, from
‘Dratini’ to ‘Dragonair’ and eventually ‘Dragonite’, as an analogy for the quality
of the current work.
4.2 Learning
All participants passed their project report. The teams scored relatively well. Four
teams were awarded an A grade, five a B grade and one a C grade:
A grade
Team 1 (G1)—90 marks
Team 2 (G2)—88 marks
Team 3 (G3)—85 marks
Team 4 (G4)—83 marks
[Fig. 4 panels: survey items with their five response counts as extracted from the charts:
(a) Did you enjoy the ‘Briefing Room’ & ‘Retrieve Mission’? — 17, 8, 3, 0, 0
(b) Did you find the ‘Briefing Room’ & ‘Retrieve Mission’ helpful in your project implementation? — 18, 4, 5, 1, 0
(c) Did you enjoy ‘Gather your team’, ‘What’s your mission’ & ‘Foolproof plan’? — 16, 6, 4, 2, 0
(d) Did you find ‘Gather your team’, ‘What’s your mission’ & ‘Foolproof plan’ helpful in your project implementation? — 16, 6, 5, 1, 0
(e) Did you enjoy the ‘Biostats Stop’ & ‘Biostats Gym’? — 17, 5, 5, 1, 0
(f) Did you find the ‘Biostats Stop’ & ‘Biostats Gym’ helpful in your project implementation? — 18, 4, 6, 0, 0
(g) (question text not recovered; per the caption, on liking the ‘Leaderboard’) — 12, 6, 6, 4, 0
(h) Did you find the ‘Leaderboard’ helpful in your project implementation? — 17, 3, 6, 2, 0]
Fig. 4 a, c, e and g depict to what extent the respondents liked the different gamification elements,
while b, d, f and h depict the extent to which the respondents found these elements helpful in their
learning and execution of the project
Fig. 5 Change in the participant’s self-rating of perceived competency in formal report writing
B grade
Teams 5 (G5) and 6 (G6)—79 marks
Team 7 (G7)—77 marks
Team 8 (G8)—73 marks
Team 9 (G9)—70 marks
C grade
Team 10 (G10)—62 marks
No team received a D (59%) or F (below 50%) grade.
Data from their self-ratings of perceived competency in writing a formal report,
before and after the group project, showed that some participants rated themselves
higher after the project, while others rated themselves lower (see Fig. 5).
Based on the self-scoring system, the changes were as follows (see Table 1):
Each participant’s change in self-rating was cross-tabulated against their project
ranking (see Fig. 6).
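Such a cross-tabulation can be sketched with a frequency counter; the records below are invented for illustration and are not the study’s data:

```python
from collections import Counter

# Invented records: (change in self-rating, team grade band).
records = [(+1, "B"), (+1, "B"), (0, "A"), (0, "B"), (-1, "A"), (0, "C")]

# Cross-tabulate rating change against grade band:
# each cell counts participants with that (change, grade) pair.
crosstab = Counter(records)

# Margins recover the per-change and per-grade totals.
by_change = Counter(change for change, _ in records)
by_grade = Counter(grade for _, grade in records)
```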
The 16 participants who indicated no change in their rating came from all groups
except group 5. An interesting trend was observed: the seven who indicated an
improved rating of +1 were from the lower-scoring groups (G4, G5, G7, G8 and
G9). In comparison, the five who indicated a lowered rating of −1 came
from the higher-scoring groups (G2, G3 and G6). The participants who indicated the
‘−1’ rating had a better cumulative grade point average (cGPA) score (on average
3.73 out of 4.0, with a median of 3.78) than the participants who indicated the ‘+1’
rating (on average 2.07, with a median of 3.32).
Those who indicated a change in their perceived competence level were invited
for an interview. The data in the following section came from the 11 of the 12 invited
participants who attended the interview.
For those who rated themselves one competence point lower (‘−1’) after the project,
the online experience made them more aware that they could have done better, and
of the reasons why. They cited the following:
• Time constraint
… there is more room for improvement, and that the group was capable of better
work; however, we were constrained by time … (Participants 1 and 2 from G3,
rating from average to below average)
• Shortcomings
… I had a false expectancy at the start and realised that I was not as competent at
the end. However, I feel that I have improved in report writing and did learn …
(Participant 2 from G6, rating from Competent to Average)
… I am aware of my shortcomings, as I require some help from friends and the
tutor … (Participant 1 from G2, rating from average to below average)
… there can be room for improvement, as some parts were not completed suf-
ficiently, and at times lacked knowledge … (Participant 1 from G6, rating from
Very Competent to Competent)
Those who rated themselves one competence point higher (‘+1’) were the ones who
highlighted that they had had many problems with report writing before the start of the
project. They cited the following reasons for their positive self-scoring:
• Enabled learning
… online project was useful for learning … (Participant 1 from G9, rating from
Average to Competent)
… I was able to understand the different statistical tests and apply them for the
project and the whole subject … I appreciated placing lessons online as they were
clear … (Participant 2 from G4, rating from Average to Competent)
• Guiding steps
… while other subject reports helped with the format, however, it was the steps that
provided guidance for the project … (Participant 1 from G5, rating from Average
to Competent)
… improvement came from everyone, including classmates … (Participant 2 from
G7, rating from Average to Competent)
Overall, the online experience helped the 12 participants learn better because
of the way learning unfolded. Several elements of the design were highlighted. For
instance, the levels helped participants in their journey; for one participant, they
provided a goal to work towards. Participants also learnt from their group mates as
well as from peer feedback, which influenced one participant to review the group
hypothesis and change the survey questions. The leader board provided a useful
signal as to how they were doing, identifying their gaps as well as providing the
standards they had to meet. In addition, having Pokémon characters made it more
fun than regular leader boards.
However, participants wished they had had more time to reflect on the feedback,
while those with the lowest scores felt a little ashamed to have them publicly
announced. Tapping into the idea of Pokémon characters was a positive move: the
participants were more motivated. The different powers of the Pokémon characters
symbolised the quality of the work produced, which made the leader board more
interesting and motivating to the students, as being awarded the more powerful
‘Dragonite’ or ‘Dragonair’ was an achievement status (Gåsland 2011).
Evidently, it was the design of the game that helped learning, that is, the levels and
the opportunity for feedback, in agreement with Gåsland (2011) and Lovett and
Greenhouse (2000). With these ‘Levels’ in place, the students were able to dissect
what was required and work towards the goals. The feedback helped them to improve
their work further.
What was unexpected was the way the experience made some participants aware
of their shortcomings, and of the potential room for improvement.
6 Conclusion
Using game-like elements certainly added much excitement and depth to learning.
With the goal-focused activities, reward mechanisms and progress tracking incorpo-
rated in this approach, results gathered suggest that participants were overall more
motivated and engaged to begin and continue with the learning. This could in part
be related to the features of the goal-focused activities, that have helped to sequence
and breakdown the project into bite-sized chunks (in the form of ‘Level’s) for easier
implementation. Implementing the reward mechanism using the leader board also
proved its merits as a tool to increase and sustain the motivational levels; encourag-
ing students to complete the project timely, and with a sense of achievement. Having
said so, the context in the game should be relevant to the participants (players) to
achieve the said effectiveness.
In further work, we intend to analyse and evaluate the current model of the
gamified approach using Temasek Polytechnic’s Self-Directed Learning Framework
(Learning Academy 2016), comprising the four phases ‘Plan’, ‘Perform’, ‘Monitor’
and ‘Reflect’, to determine whether learners are able to diagnose their learning needs,
formulate learning goals, identify resources for learning, select and implement learning
strategies, and evaluate learning outcomes (Knowles 1975).
References
Association for Achievement and Improvement through Assessment: How can attainment and
progress be recorded and tracked. https://www.aaia.org.uk/assessing-without-levels/how-can-
attainment-and-progress-be-recorded-and-tracked/ (2019). Accessed 2019/04/01
Calderwood, K.A.: Incorporating multiple epistemologies into teaching statistics to social work
students. J. Teach. Social Work 22(1–2), 17–32 (2002)
Delucchi, M.: The efficacy of collaborative learning groups in an undergraduate statistics course.
Coll. Teach. 54(2), 244–248 (2006)
de-Marcos, L., Domínguez, A., Saenz-de-Navarrete, J., Pagés, C.: An empirical study comparing
gamification and social networking on e-learning. Comput. Educ. 75, 82–91 (2017)
Gal, I.: Adults’ statistical literacy: meanings, components, responsibilities. Int. Stat. Rev. 70(1),
1–51 (2002)
Garfield, J.: The challenge of developing statistical reasoning. J. Stat. Educ. [Online] 10(3) (2002)
Gåsland, M.: Game mechanic based e-learning. Master Thesis, Science and Technology (2011).
Retrieved from https://daim.idi.ntnu.no/masteroppgaver/006/6099/masteroppgave.pdf. Accessed
2018/04/04
Glover, I.: Play as you learn: gamification as a technique for motivating learners. In: Herrington,
J., Couros, A., Irvine, V. (eds.) Proceedings of World Conference on Educational Multimedia,
Hypermedia and Telecommunications, pp. 1999–2008. AACE, Chesapeake, VA (2013)
Hanus, M.D., Fox, J.: Assessing the effects of gamification in the classroom: a longitudinal study on
intrinsic motivation, social comparison, satisfaction, effort, and academic performance. Comput.
Educ. 152–161 (2015)
Kirkpatrick, D.L.: Evaluating Training Programs: The Four Levels. Berrett-Koehler, San Francisco
(1994)
Knowles, M.S.: Self-Directed Learning: A Guide for Learners and Teachers. Englewood Cliffs,
Prentice Hall/Cambridge (1975)
Learning Academy. Temasek Polytechnic. http://www.tp.edu.sg/centres/learning-academy (2016).
Accessed 2018/04/04
Loveland, J., Schneiter, K.: Teaching statistics with lectures or activities: a comparative study. In:
Makar, K., de Sousa, B., Gould, R. (eds.) Sustainability in Statistics Education. Proceedings of
the Ninth International Conference on Teaching Statistics, pp. 1–4. ISI, Flagstaff, AZ (2014)
Lovett, M.C., Greenhouse, J.B.: Applying cognitive theory to statistics instruction. Am. Stat. 54(3),
196–206 (2000)
Mollick, E.R., Rothbard, N.: Mandatory fun: consent, gamification and the impact of games at
work. In: The Wharton School Research Paper Series (2014)
Onwuegbuzie, A.J., Wilson, V.A.: Statistics anxiety: nature, etiology, antecedents, effects, and
treatments—a comprehensive review of the literature. Teach. High. Educ. 8(2), 195–209 (2003)
Peiris, M.S.: A way of teaching statistics: an approach flexible to learning. CAL-laborate 9, 13–15
(2002)
Perkins, D.V., Saris, R.N.: A “jigsaw classroom” technique for undergraduate statistics courses.
Teach. Psychol. 28(2), 111–113 (2001)
Rollings, A., Adams, E.: Andrew Rollings and Ernest Adams on Game Design. New Riders, Indi-
anapolis (2003)
Tishkovskaya, S., Lancaster, G.A.: Statistical education in the 21st century: a review of challenges,
teaching innovations and strategies for reform. J. Stat. Educ. 20(2), 1–55 (2012)
Winquist, J.R., Carlson, K.A.: Flipped statistics class results: better performance than lecture over
one year later. J. Stat. Educ. [Online] 22(3) (2014)
User Expectations and Experiences in Using Location-Based Game in Educational Context
1 Introduction
Location-based games are a promising way to enable students to engage in varied
activities inside and outside the school building. In particular, smart phones with GPS
capability and fourth-generation broadband cellular network technology, growing
computing power, the development of the app economy, and overall affordability have
played a key role in making location-based mobile technology available, reliable and
robust, and the technology is increasingly likely to be found in every student’s pocket.
Incorporating mobile technology and pervasive learning can leverage the effectiveness
and accessibility of learning activities (Shuib et al. 2015). Using mobile devices
to support learning has also been associated with stronger motivation and
engagement (Hsu and Ching 2013; Martin and Ertzberger 2013). The
outcomes of mobile devices harnessed for location-based activities are encouraging.
Perhaps the best-known mobile application using location technology so far
is Pokémon GO. There is evidence that Pokémon GO has a positive effect on
physical activity and life expectancy, and perhaps increases social connectedness and
improves mood (Althoff et al. 2016; Howe et al. 2016). Schools could provide an
interesting context for applications and games with immersive location-based
technology for various reasons. School classrooms are known for high volumes of
sitting (Ridgers et al. 2012), which has been linked to rates of metabolic syndrome,
type 2 diabetes, obesity and cardiovascular diseases (Hamilton et al. 2007). A reduction
in sedentary time may have a significant impact on diabetes prevention (Wilmot et al.
2012). Beyond possible health benefits, using mobile devices for learning adds a new
pedagogical approach for setting up learning activities and expanding traditional
learning environments. Sung et al. (2016) performed a meta-analysis and research
synthesis of the effects of mobile device integration into learning and teaching, finding that there is
The Finnish comprehensive schools participating in our study were provided with a
location-based serious game, administered through a web browser environment for
building and generating activities. The actual game runs in a smart phone application
that uses the smart phone’s capabilities, such as GPS and the camera. The application
runs in both Android and iOS environments.
To start activities, the facilitator (in this case the teacher) creates a “track” or “trail”
on a virtual map layer (e.g., Google Maps) that consists of several GPS checkpoints,
each with a specific interactive task. These interactive tasks can be quizzes, multiple-choice
questions, text and hyperlinks, YouTube videos, pictures, instructions, or taking
pictures or a video. Feedback can be given immediately or afterwards. The facilitator
sets a specific geographical area for the game play, so that if a user leaves the area,
the application alerts them to return to the game area. The facilitator can
also interact with users by sending messages, and can track the users’ locations during
the game.
After the learning trail is set, users log into the game by scanning a QR code with
their smart phone’s camera. After logging in, users start the game by moving to the
start area defined by the facilitator. Users then navigate to activities by following
a compass in the application that shows the direction and distance to the activity (see
Fig. 1). When a user reaches the correct GPS checkpoint (a geographical point or
area), the activity appears in their smart phone application. After accomplishing the
task, the user receives the next GPS checkpoint; checkpoints can be randomised or
presented in the same order for every user. The facilitator can also set a time limit and
modify the diameters of the GPS checkpoints. The game ends at the finish area defined
by the facilitator. The facilitator receives the users’ answers, the pictures and videos
taken during the trail, and metadata (i.e., time), which can then be analysed for further
use and feedback. These learning trails can be stand-alone, as the facilitator can print
the QR code for users to start the game whenever and wherever they please, or the
game can be a one-time activity.
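The compass behaviour described above, showing the direction and distance to the next checkpoint and triggering the activity inside the checkpoint’s diameter, can be sketched with the standard haversine and initial-bearing formulas. This is an illustration only, not the application’s actual implementation:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees)
    from the user's position to the next GPS checkpoint."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)

    # Haversine distance
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing, normalised to 0-360 degrees
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def inside_checkpoint(dist_m, diameter_m):
    """The activity triggers once the user is within the checkpoint
    diameter set by the facilitator."""
    return dist_m <= diameter_m / 2
```

Moving one degree of longitude due east along the equator, for example, gives a bearing of 90° and a distance of roughly 111 km.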
2 Methodology
The System Usability Scale questionnaire was included to capture subjective user
perceptions of the system’s usability.
Described as “quick and dirty,” the System Usability Scale (SUS), originally intro-
duced by Brooke (1996), is a simple ten-item scale for measuring the subjective usability of
a system. The statements of the SUS are rated on a five-step scale ranging from
“Strongly disagree” (1) to “Strongly agree” (5). The main analysis scores the users’
responses to the individual statements, leading to a sum between 0 and 100 that represents
the overall usability of the system under evaluation. SUS was originally developed
as a universal scale to measure the usability of varying systems and has since
been applied in different contexts. SUS remains a valid approach to measuring
system usability (Mclellan et al. 2012), although there is debate over whether the scale
produces a unidimensional factor (Borsci et al. 2009). SUS has since been developed
further to add more interpretation to the overall SUS score (Bangor et al. 2008) and to
use a seven-step scale instead of five (Finstad 2010).
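The scoring rule summarized above follows Brooke’s standard procedure: odd-numbered items are positively worded and contribute (response − 1), even-numbered items are negatively worded and contribute (5 − response), and the sum (0–40) is scaled by 2.5 to give a 0–100 score. A minimal sketch (the function name is my own):

```python
def sus_score(responses):
    """Compute the 0-100 SUS score from ten 1-5 Likert responses.

    Odd-numbered items (index 0, 2, ...) contribute (response - 1);
    even-numbered items (index 1, 3, ...) contribute (5 - response).
    The summed contributions (0-40) are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

For example, the most favourable response pattern (5 on every positive item, 1 on every negative item) yields the maximum score of 100, while neutral 3s on every item yield 50.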
According to Neuendorf (2002), content analysis is “a summarizing, quantitative analysis
of messages that relies on scientific method (including attention to objectivity–intersubjectivity,
a priori design, reliability, validity, generalizability, replicability, and
hypothesis testing) and is not limited as to the types of variables that may be measured
or the context in which the messages are created or presented.” The
objective is to attain a concise picture of the phenomenon, resulting in concepts or cate-
gories that describe the phenomenon under examination. There are numerous strategies
for using content analysis as part of a research methodology. Here, inductive con-
tent analysis was used to organize the data (quotations), which led to open coding. In open
coding, notes and headings are written in the text while reviewing it. After
a series of reviews, the codes are listed into categories, which are then grouped again in
order to reduce the number of categories and to find similarities or dissimilarities among
the codes (Elo and Kyngäs 2008). Overall, categories are patterns or themes that are
directly expressed in the data and/or derived from it through analysis (Hsieh and
Shannon 2005). The results of the content analysis are presented in the Results section.
The coding was executed with the Atlas.ti qualitative data analysis
software.
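The coding in this study was carried out in Atlas.ti; purely to illustrate the grouping step described above, the following sketch collects open codes under higher-level categories and tallies their frequencies. The codes and categories shown are invented examples, not data from the study:

```python
from collections import Counter

# Hypothetical open codes attached to quotations during review
coded_quotations = [
    ("easy to navigate", "usability"),
    ("app crashed", "technical problems"),
    ("fun with friends", "engagement"),
    ("confusing compass", "usability"),
    ("battery drained", "technical problems"),
]

def group_codes(coded):
    """Group open codes under their category and count category frequencies."""
    categories = {}
    for code, category in coded:
        categories.setdefault(category, []).append(code)
    counts = Counter(category for _, category in coded)
    return categories, counts
```

Inspecting the category counts is one simple way to see which themes dominate the data before reducing or merging categories further.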
The location-based serious game (henceforth “the system”) was used in Finnish comprehensive
school grades 3–9 (age 8–15) during the school year 2017–2018 (see Table 1). The system
was used by 408 students (n = 408) representing 25 classes from ten different com-
prehensive schools. Before using the system, the students were introduced to its main
features. A researcher introduced the system by reading a description of its
general purpose and features and gave examples of possible activities. After the
introduction, the students completed a pretest questionnaire measuring their expectations
toward the system. Teachers had a one-hour training on how to operate the system,
how to create activities, and how to administer the system in a web browser environment.