
Entertainment Computing 4 (2013) 143–155


Review

Visualization-based analysis of gameplay data – A review of literature


G. Wallner a,*, S. Kriglstein b
a Institute of Art and Technology, University of Applied Arts Vienna, Oskar Kokoschka Platz 2, 1010 Vienna, Austria
b Institute for Design and Assessment of Technology, Vienna University of Technology, Argentinierstrasse 8, 1040 Vienna, Austria

Article history: Received 24 July 2012; Revised 24 January 2013; Accepted 20 February 2013; Available online 27 February 2013.

Keywords: Games; Evaluation; Gameplay analysis; Visualization

Abstract: As video games are becoming more and more complex and are reaching a broader audience, there is an increasing interest in procedures to analyze player behavior and the impact of design decisions. Game companies traditionally relied on user-testing methods, like playtesting, surveys or videotaping, to obtain player feedback. However, these qualitative methods for data collection are time-consuming and the obtained data is often incomplete or subjective. Therefore, instrumentation became popular in recent years to unobtrusively obtain the detailed data required to thoroughly evaluate player behavior. To make sense of the large amount of data, appropriate tools and visualizations have been developed.

This article reviews literature on visualization-based analysis of game metric data in order to give an overview of the current state of this emerging field of research. We discuss issues related to gameplay analysis, propose a broad categorization of visualization techniques and discuss their characteristics. Furthermore, we point out open problems to promote future research in this area.

© 2013 International Federation for Information Processing. Published by Elsevier B.V. All rights reserved.

Contents

1. Introduction
2. Application areas
3. Gameplay visualization classification
4. Target audience
   4.1. Game developers
   4.2. Players
5. Field of application
6. Data
7. Representation
   7.1. Charts and diagrams
   7.2. Heatmaps
   7.3. Movement visualizations
   7.4. Self-organizing maps
   7.5. Node-link representations
8. Discussion
9. Conclusions
Acknowledgements
References

This paper has been recommended for acceptance by Pierre Jouvelot.
* Corresponding author. Tel.: +43 1 71133 2382; fax: +43 1 71133 2089.
E-mail addresses: guenter.wallner@uni-ak.ac.at (G. Wallner), kriglstein@cvast.tuwien.ac.at (S. Kriglstein).

1875-9521/$ - see front matter © 2013 International Federation for Information Processing. Published by Elsevier B.V. All rights reserved.
http://dx.doi.org/10.1016/j.entcom.2013.02.002

1. Introduction

Video games have become more and more complex over the years, not only in regard to technical issues, like game engines, but also in regard to the amount of interaction possibilities available to the players [1–4]. The rich amount of choices in contemporary video games makes it hard to properly balance them and to anticipate player behavior. For these reasons there exists an increasing need to understand player behavior (e.g., identifying frequently accessed parts of a game), to detect patterns and to establish reliable player-oriented game testing. Developers have to ensure not only a good playability of the game but also that the game provides a satisfying and fun experience for the player [1,3,5–7]. Gameplay analysis has the goal to observe how players play a game and to analyze recorded game-related data in order to understand the impact of design decisions and player behavior [2,4,7–10]. Such information can help to adapt game content and design as well as optimize the interactions of players within a game [7,10].

For player-oriented game testing traditional methods like observational studies of players, interviews or thinking aloud for usability testing and playability testing are often used (see, e.g., [11–18]). Such methods are useful for collecting qualitative data and survey-based information [2,3,5,19]. For example, they are helpful to get feedback from players to identify design problems (e.g., problems such as player disorientation or difficulties in wayfinding) [2,3]. However, they are often limited, because testing each player individually takes time, which makes such methods unsuitable for large user groups [1–3,5,19]. Furthermore, it is often difficult to get an overview of how players play a game just from the players' perspective, and the results might be biased because of personal opinions or because players have problems explaining their impressions and activities [1,5]. This can easily lead to misinterpretations of the data.

Another possibility is to automatically collect instrumentation data by logging user-initiated events (events that occur when a player interacts with a game), which reduces error and saves time [2]. In the context of video games instrumentation data is usually referred to as gameplay metrics. Gameplay metrics are numerical data about players' behavior and interactions with the game and have become a valuable source for the analysis of how a game is played [3,4,19]. Gameplay metrics allow user-research professionals to evaluate players' behavior more objectively in contrast to qualitative player feedback which can be influenced by players' perceptions and preferences [3,4,19]. Moreover, the large amount of collected data makes it possible to detect and establish player behavior patterns more easily [3,5]. In addition, telemetry over the Internet allows developers to stay constantly connected to their customers and supply them with a constant stream of gameplay information that allows them to frequently update a game after its release with bugfixes, gameplay changes, new features and extra content in order to extend its life. In recent years, some developers like Valve [20] have therefore started to view games as a service rather than as a product.

Even though gameplay metrics analysis has many advantages, it does not provide reliable information about why a player is doing something [2–4,21,22]. In other words, game metrics fail to provide context, e.g., whether a player is having fun or not. However, as Lynn [22] points out, why can sometimes be the most important question. Canossa and Cheong [23] also emphasize that the recorded behavior is not necessarily an expression of the player's personality and intention. Several authors (e.g., [2,4,21]) therefore recommend combining telemetry analysis with qualitative user research methods to get the best of both worlds.

Gameplay metrics are usually stored in textual form in game logs or relational database systems which need to be processed and analyzed in some way. Traditionally, purely statistical approaches (see, e.g., [24–26]) were used [2,5]. However, the interpretation of the data can be difficult, e.g., because of missing expertise with statistical methods. Therefore additional approaches are necessary to support the analysis of gameplay data. The literature review showed us that there is an increasing interest in visualizations because they can help developers to analyze large amounts of multi-dimensional data and enable them to gain valuable knowledge in order to understand player behavior (e.g., by visualizing player traces) and to discover patterns of what players are doing over time (e.g., by analyzing occurrence, context and frequency). Graphical representations of gameplay data make things visible or present things in a new light of which users were not aware before, and therefore support them in their decisions to improve the game [1,27]. Several visualization tools (e.g., Data Cracker [9], Lithium [28], PlayerViz [1] or Playtracer [5]) have been developed in the last years with the goal to assist the analytical process. Such visualization tools range from simple representations like bar charts of a single variable (e.g., level completion times) to complex systems that combine different visualization techniques (e.g., maps in combination with player path representation).

Based on these observations we want to give researchers and practitioners a state-of-the-art overview about different aspects for the analysis and particularly for the visualization of gameplay data. In the following sections we shortly discuss the various application areas of game metrics, present our attempt to categorize visual methods for analyzing gameplay data, provide detailed descriptions and examples for each category and discuss open problems and topics for possible future work.

2. Application areas

In recent years, instrumentation of games to automatically collect game metrics has become an important aspect throughout the development cycle, for beta-tests, and even after the game has been released. While the overall goal of gameplay metrics can be summarized as providing detailed information about player behavior, the reasons for collecting them are numerous.

To begin with, metrics can be used to uncover design issues (e.g., [3,29]), to fine-tune level design (e.g., [3,30]), to predict player churn [31], and to understand player movement in complex virtual environments (e.g., [28,30,32]). One of the biggest benefits of telemetry is that it enables the analysis of long-term player behavior [2,21,33], which is particularly useful for ensuring play balance [34]. Balancing issues may only become apparent after several weeks or months and are therefore not easily detected during short play sessions. However, gathering data from released games has other benefits as well. As pointed out by Hullett et al. [34], long-term data can help to plan the release of additional content for a game to counteract fading interest. For example, Hullett et al. [34] analyzed long-term play data from Project Gotham Racing 4 to identify unused features in the game. Results of the study revealed valuable information on how to reduce costs for asset creation in future developments. Weber et al. [35] collected data from Madden NFL 11 after the game had been released to identify the most influential features on player retention.

Other authors identified game metrics as a source for inspiration [36] and to advise creators of the next game in a specific brand or even other games [30,33]. Zoeller [37] mentioned game telemetry as a tool to measure the stability of the software (e.g., by logging client crashes). Kennerly [38] stressed the value of game metrics to catch cheaters.

Cheating, as discussed by Kennerly [38], can be caused by players adopting dominant strategies not intended by the designer (i.e., imbalances in the game design) or by modifying the game itself. Detecting cheaters is especially important in multiplayer games, where a cheater will not only cheat himself but also negatively affect the experience of the other players. For example, Mitterhofer et al. [39] used logs of character movement to detect botting, a form of cheating where players use bots (i.e., artificial agents) to play for them instead.

Beside these uses, metrics can be leveraged to facilitate community building [23,40]. Two examples of providing in-game statistics to the player community are the game developer Valve, which provides aggregated statistics of gameplay data for many of their games, including Half-Life 2: Episode Two [41], and the Call of Duty: Elite online service [42]. Moreover, gameplay metrics possess monetary value. They are helpful to plan the release of expansions and downloadable content [30], to increase the potential for subsequent purchases and in-game purchases [35] – especially important for free-to-play games where revenues are earned by selling in-game items – to increase customer renewal [38], and to cut production costs [38].

While most of the literature on game metrics focuses on games for entertainment, game telemetry is also a valuable asset for the design of educational games. Designers of such games face the challenge that the game must not only be entertaining and fun to play but educational as well. Game telemetry allows continual monitoring of learning as a process [43] and enables unobtrusive tracking of assessment information [44] which can be used by teachers [45,46] and students alike. This makes it especially important that the data is presented in a clear and understandable manner.

Although game metrics are nowadays applied to the development of educational games or the assessment of learning (see [46–50] for some examples), analysis of the data is mostly restricted to descriptive statistics (e.g., average time spent on a task, average number of attempts, gaming score) while literature about applying visualizations (apart from various charts) to the analysis of learning games is still sparse. Among the few examples are Liu et al. [51] who applied a node-link visualization (see Section 7.5) to the analysis of an educational game about fractions and the protein folding game Foldit, Scarlatos and Scarlatos [26] who used a variation of parallel coordinates to analyze a game about global warming and energy use, and Wallner and Kriglstein [52] who also used a node-link visualization to analyze playing behavior in a game about transformation geometry.

Beside the above use cases, gameplay metrics also play an important role for adaptive gameplay mechanics, whose goal is to change the game in light of the player's ongoing interactions with the video game [53]. One particular way is dynamic difficulty adjustment (DDA) which aims to adapt the challenge level to the player's abilities to avoid boredom or frustration and to keep the player engaged.

For example, Hunicke and Chapman [54] tracked various statistics (like damage a player takes over time and current location) to decide when and how to intervene by, e.g., supplying the player with ammunition or health or by reducing the strength of attacks by enemies. Other work in this area includes, for example, procedural level generation for platform games [55,56] or personalized track generation for racing games [57].

Examples of commercial games where the details of their DDA systems have been described are the third-person shooter Max Payne from Remedy Entertainment and SiN Episodes, a first-person shooter from Ritual Entertainment. The former dynamically changes the difficulty level by increasing the number of enemies, based on certain statistics, like average health or kills per level (see [58]). The latter, as described by Kazemi [59], uses a system where artificial agents called advisors make – based on collected game metrics – recommendations on how to adjust certain attributes, like damage done by enemies or the tendency of throwing grenades. One of the recent commercial games which uses adaptive mechanics is Left 4 Dead, a cooperative zombie survival shooter, which algorithmically adjusts the game pacing by estimating the emotional intensity of a player based on certain in-game variables (e.g., damage taken) to create drama and to avoid battle fatigue (cf. [60]).

Recordings of gameplay data can also be used to train bots or for the creation of believable non-player characters. For example, Tastan and Sukthankar [61] gathered data from human Unreal Tournament players to teach bots policies for attack, exploration, and targeting. Bauckhage et al. [62] and Thurau et al. [63] used data from Quake II to create bots that imitate human movement and strategic behavior. Reeder et al. [64] were concerned with developing intelligent bots which can partake in virtual economies (as found in many massively multiplayer online role-playing games). Data recorded from the EVE Online market was used as training set. Goal recognition in games (e.g., [65]), that is, inferring the players' current high-level goals based on their low-level actions, also depends heavily on telemetry data.

3. Gameplay visualization classification

For this literature review various conference and journal databases (like IEEE Computer Society, ACM Digital Library, ScienceDirect, SpringerLink and Google Scholar) were searched. For the purpose of this literature review we restrict ourselves to visualization approaches which are specifically intended for the analysis of data internal to gameplay sessions – with a strong focus on visualizations for development purposes. Analytic tools, like, for example, those provided by Mochibot [66] and Nonoba [67], that do not allow tracking of internal variables (although allowing to track certain game-related statistics like traffic or number of games played per day) are therefore not considered. Furthermore, even though there exist many visualization approaches for analyzing user behavior in virtual environments (e.g., [68–71]) that are highly applicable to the games domain as well, they are omitted from this review to keep the scope manageable and focused. Finally, we should stress that we did not consider literature about visualizations to convey information within games themselves (e.g., [72,73]). This topic has already been covered by Zammitto [74], Medler and Magerko [75] and in the very recent survey of Bowman et al. [40].

We found 42 papers – published between 2004 and 2012 – which either describe (novel) visualizations and visualization systems for gameplay analysis or applied visualizations to understand, analyze or communicate game metrics. All these papers are covered in the following sections. Beside these papers we also included several websites that are concerned with gameplay visualization.

Devising a classification of the different aspects for the visualization of gameplay data is not an easy task because a too general view would not be helpful for answering specific questions and a very detailed categorization would hardly be distinctive. For the purpose of this review the aspects for visualization of gameplay data are therefore classified into the following four groups:

Target audience: Most of the gameplay data visualizations are used by game developers to analyze recorded data in order to identify design and interaction problems and to understand player behavior. However, we also observe an increasing tendency to make the data available to the players themselves. This allows them to analyze their past behavior and to compare their data with other players. In this category we discuss the different motivations and tasks depending on the different user groups.

Field of application: As the literature review revealed, existing visualization tools can be categorized into tools that are developed only for the analysis of a specific game and tools that are applicable across games or genres.

Data: Every interaction of a player within a game can be logged and stored for later analysis. The choice which events or information should be tracked is by far not a trivial task. The type of data can range from player-specific data to gameplay data. In this category we discuss how data and what kind of data (e.g., spatial data, temporal data) can be logged.

Representation: For the representation of gameplay data, a variety of visual approaches are available which range from charts and diagrams (e.g., to present the frequency of occurrence of a specific event) to spatiotemporal visualization approaches. In this category we present different representation methods along with their strengths and weaknesses. The goal is to give an overview of which of these approaches are suitable for which kind of information.

The field of application and the target audience influence which data and representations are suitable for the analysis of gameplay data. For example, the gameplay analysis of a puzzle game will focus on different aspects than the evaluation of a third-person shooter. Collected data can be represented graphically in different ways (e.g., heatmaps can be used to depict players' positions) to support the target audience in generating insights about the game which in turn can influence their further decisions (e.g., developers can detect game design problems and players can compare their achievements with other players).

In the following sections we will provide detailed explanations of each category (including sub-categories). Each category will be illustrated with examples from the literature.

4. Target audience

A major factor for the success of a visualization is its acceptance by the user. Therefore, purpose and intended users have to be considered in the development of visualization tools. In this section we will distinguish between visualizations for game developers and players.

4.1. Game developers

For game developers it is interesting to evaluate the game not only during the development process but also after the game has been released to understand the effects of design decisions and to get information regarding players' actions within games [19,29]. After the game has been released, instrumentation provides valuable and detailed information about long-term player behavior to developers which was previously not accessible to them. Game developers have therefore stressed the importance of effective visualizations that facilitate aggregate analysis of data from thousands to millions of players [30,76]. On the other hand, visualizations are necessary that allow developers to drill down to finer levels of detail to get deep insights into the specifics of how players play a game. This is, for example, valuable for data gathered in playtests during development. In either case, the gained knowledge can support game developers to react to gameplay issues and to improve the game.

For this reason visualization approaches for the analysis of gameplay data became popular among industry professionals and academic researchers (e.g., [28,29]). Visualizations of gameplay data can range from the representation of the frequency of occurrence of a specific event to the representation of players' movements on a game map. In the following we present a few examples of tools used by industry professionals for the analysis of gameplay-related data.

Data Cracker [9] is a visual game analytic tool which was developed at Electronic Arts. The tool allows game developers to monitor player behavior by tracking and organizing gameplay data from the game Dead Space 2. For this reason the tool includes (a) summary graphs to give an overview of the collected gameplay data (like number of tracked players, how many matches have been played or winning percentages for both teams), (b) a timeline table to define the time period to be analyzed and (c) graphs to present information such as number of rounds played and won, kill/death ratios, experience points gained by players and weapon statistics [9]. Moreover, the entire game team was involved in the development to increase acceptance and effective team communication [9,75].

Fig. 1. SkyNet [37], BioWare's telemetry system, provides several ways to visualize the collected data, like an aggregated spatial view of player movement. (Image courtesy of G. Zoeller, reproduced with permission.)

SkyNet [37] is a visual analytic tool which is used by the game developer BioWare. Beside providing different kinds of visualizations like color-coded lists, charts, diagrams and aggregated spatial visualizations of, e.g., player movement (see Fig. 1), it allows game developers to track the number of bugs that have been identified or solved and to stay in touch with their co-workers (cf. [9,75]). By way of example, heatmaps of client crashes are used to identify problem areas (cf. [37]). Moreover, SkyNet shows which team members fix the most game bugs or test the game most frequently to encourage friendly competition among them [9,75].

The Unreal Master Control Program [77] supports data collection and includes a set of tools to visualize the gathered gameplay data via charts, graphs (like trends in weapon usage over time) or heatmaps to display specific player activity (like kills with specific weapons).

Flying Lab Software [78] used metric collection for their MMO Pirates of the Burning Sea. Various events within the game are logged and stored in a database on which queries can be executed. Bar charts were used to display the aggregated data, which includes the number of characters per level, ship deaths per level and unique logins per day. Volition [22] utilizes the Games Data Service telemetry solution from developer and publisher THQ. Ubisoft uses a set of tools called DNA to examine telemetry data gathered from games of the publisher, e.g., the Assassin's Creed series as described in [30]. Several different types of visualizations like charts (e.g., to plot average final score against the number of kills), heatmaps (e.g., of failures) or 3D visualizations of player traces – similar to the approach of Dixit and Youngblood [79] – are supported by the system.

Playtomic [80] provides analytics for developers of Adobe Flash games. In addition to rather general metrics, like views or playtime, custom metrics can be defined to track different kinds of variables in a game. Heatmaps of certain activities within the game can also be generated.

4.2. Players

In the last years several visual game analytic systems for players have been developed to make their personal gaming history more transparent. Making gameplay data accessible to players gives them the possibility to track their progress and to analyze their past gameplay behavior [75,81]. Representations of gameplay data can motivate players to optimize their achievements (e.g., solving a puzzle with fewer trials) and allow them to compare their data with others, which encourages friendly competition between players (e.g., who earns more trophies) [81]. Visualizations for players range from simple visualizations of statistical data (e.g., [82,41]) over heatmaps (e.g., [83,41]) to replay analyzers (e.g., [84]) and summary visualizations for spectator modes (e.g., [85,86]). Medler and Magerko [75] already discussed different visualization approaches that players can use for exploring their data. We will therefore restrict ourselves to three illustrative examples that have not been covered by Medler and Magerko.

The Rockstar Games Social Club online service [87] provides game statistics and friend comparison features for various games from the publisher, including the recent L.A. Noire. For the open-world game Red Dead Redemption the website offers, for instance, a map showing visited and unexplored locations.

Cheong et al. [85] and Halper and Masuch [86] describe methods for generating summary visualizations of gameplay sessions. Whereas Cheong et al. [85] extract interesting events based on cognitive models of summarization, Halper and Masuch [86] use evaluation functions that define how interesting a particular moment in time is. Such summaries can be used to review a game after playing or for spectator modes that allow players to view games in progress.

Sc2gears [84] is a StarCraft II utility that provides different views for management and analysis of individual replays and multi-replay statistics. For example, the tool includes a map preview, statistical information (e.g., game length, game speed, game type, and information about the individual players themselves) as well as charts and diagrams to show information such as how many actions per minute were performed or which hotkeys were used.

5. Field of application

The literature review showed us that in many cases visualization tools are developed for a specific game or genre. For example, Hoobler et al. [28] presented a visualization approach – called Lithium – to analyze behavior patterns of players for the team-based first-person perspective game Return to Castle Wolfenstein: Enemy Territory. The gameplay data is presented from two perspectives: a 'local' and a 'global' visualization. The 'local' visualization allows analysts to examine the positions of the players on the map through color-coding and icons (e.g., the color of the player reflects which team the player belongs to) [28]. Contrarily, the 'global' visualization focuses on the representation of statistical information (e.g., the amount of combat) which should help to understand high-level trends and behaviors of the players and teams [28].

Other examples are Sc2gears [84] and Data Cracker [9]. Data Cracker also uses color schemes, certain symbols and artworks from the game to brand it as a Dead Space 2 tool to increase interest and acceptance among the game development team.

Tools that are developed with a specific game or genre in mind have the advantage that they are tailored to game-specific tasks. However, the development process is often very time- and cost-consuming and therefore game analytic systems were developed which are applicable to a broad range of games and genres. For example, Kim et al. [21] describe a system – called Tracking Real-Time User Experience (TRUE) – which combines behavioral instrumentation with human-computer interaction methods to gain deeper insights into players' behavior. TRUE logs time stamps for each event, which allows analysts to analyze sequences of events and to collect contextual information related to the event [21]. Other examples are SkyNet [37] and the gamer community website Giant Bomb [82] which visualizes gameplay data from a number of different games.

The published examples are still quite sparse and it is often not easy to get access to gameplay data, probably because such data is normally treated as confidential by game companies, as noted by Drachen and Canossa [3]. Therefore, it is common in the game research community to show the applicability of a visualization approach by means of a self-developed game and to discuss the generalizability to games with similar structures (see, e.g., [5,52,88]).

6. Data

Game analytics is a domain that encompasses all aspects of collecting and analyzing game-related data [9,89] and would therefore be a huge area to cover. We will therefore focus on data related to gameplay analysis. Although the term gameplay lacks any precise definition [90] because it consists of many contributing and interplaying elements, it is usually used to refer to the interactive aspects of game design. Gameplay-related data can be mainly gathered by qualitative methods (usability testing, playability testing) or via instrumentation, i.e., by logging user interactions with the game [2–4,19,21]. Typical metrics are, for example, time to complete a level, probability to succeed, number of kills or locations of players' deaths.

Gameplay metrics can be gathered in large numbers [4,19], are precise [19] and objective [4] and can be collected unobtrusively [6,91] since logging the data happens without user intervention. In contrast to qualitative methods, metrics are not influenced by individual perceptions and preferences of players (and evaluators) [4,5] and collecting detailed data is much less time-consuming [2,5]. However, gameplay metrics are not always able to inform why players behave in certain ways or about the subjective experience of players (e.g., is it fun to play?). Data from qualitative user feedback provides context and information about the players' motivation and attitude. These are important reasons why both methods should be used together to be most effective. For example, the instrumentation solution from Microsoft Game Studios [21] allows capturing videos of users interacting with the game and synchronizing the video with the timestamps of recorded events.

Metrics can be recorded in different ways, but mostly authors differentiate between frequently recorded metrics (such as the location of a player) and event-triggered metrics (such as firing a weapon or collecting an item), although the terms to describe these types vary in the literature (cf. [2,8,19,92]). Another distinction can be made regarding the type of data. Different authors proposed different categorizations (e.g., [19,26]) but the most predominant distinction is perhaps between spatial (e.g., player position) and non-spatial gameplay metrics (e.g., health, money, talking to a non-player character). In either instance, the data often has a temporal component since in many cases the order in which players perform their actions is very important. Further, since instrumentation can only record information from the game itself, such data should be accompanied with attitudinal, demographic, and contextual data. Fig. 2 shows a block diagram of the involved kinds of data.
kind of data.
Games are very complex systems and the number of variables which can be recorded is enormous. Which metrics should be tracked depends on the questions the evaluator wishes to answer and on the game itself. However, choosing the right metrics to record is not easy and may not always be obvious at the beginning because – as stated by Andersen et al. [5] – the evaluator may not know the right questions or patterns to look for in the first place. Knowing which metrics to record is important because tracking a lot of variables can significantly impact performance and result in huge storage requirements. For example, Marselas [93] reports the use of more powerful computers to compensate for performance issues caused by unoptimized code and logging functions during in-house play-testing of Age of Empires II. Zoeller [37] reports that they gathered 250 GB of data on Dragon Age: Origins. Sony Online Entertainment granted the research group around Dmitri Williams [94] access to 60 TB of server logs from the massively multiplayer online role-playing game EverQuest 2 [95]. Schoenblum [77] discusses scalability and performance issues in respect to the Unreal Master Control Program.

In production environments instrumentation data is usually recorded on the client side and sent to a dedicated server [9,21,37] where the data is aggregated and stored for further usage. Therefore bandwidth, latency and other factors in network performance have to be considered as well. For instance, SkyNet [37] uses non-blocking UDP sockets instead of TCP connections to increase speed. The collected data is usually stored in text files or relational database systems and has to be analyzed in some way. Adequate visualizations are therefore necessary to make sense of the vast amount of data.

Fig. 2. An overview of the different kinds of gameplay-related data. Gameplay data can be gathered via qualitative methods and instrumentation. Whereas instrumentation happens automatically within the game, qualitative methods gather data by observing or asking the player.

7. Representation

In this section we discuss different techniques for visualizing game metric data which we classified into five subcategories: charts and diagrams, heatmaps, movement visualizations, self-organizing maps, and node-link approaches. Each category is useful for different kinds of analysis tasks. Table 1 lists these types of visualizations along with examples from the literature.

Table 1. Overview of the most common representation techniques along with references for each category.

Representation              Examples
Charts and diagrams         [9,24–27,37,77,78,21,89,91,79,96–101]
Heatmaps                    [1,2,28,37,77,22,98,102]
Movement visualizations     [1,3,7,10,28,29,37,91,79,99,103,32,104,39,105]
Self-organizing maps        [6,106]
Node-link representations   [5,7,52,88,107,51]

7.1. Charts and diagrams

Charts and diagrams are useful if specific questions have to be answered (e.g., how many players solved the level) but they are not suited for exploratory data analysis (e.g., detecting common player behavior). They are used in almost every gameplay analysis tool to present quantitative data in one form or another. We will therefore concentrate on three illustrative examples. Further examples of using charts can be found, e.g., in [9,21,37,79,100].

Scarlatos and Scarlatos [26] introduced the concept of Action Shapes, a variation on parallel coordinates [108]. The shape is constructed by connecting the values of the parallel coordinate axes, which represent the different choices available to the player. If the parallel axes are ordered in a meaningful way (e.g., by clustering positive choices in the center as suggested in [26]), beneficial shapes can be quickly distinguished from unfavorable shapes, as depicted in Fig. 3.

Fig. 3. Action Shapes [26] for the game Energy Choices. By choosing a meaningful arrangement of the parallel axes, positive choices (left) can be easily distinguished from unfavorable ones (right) by differences in shape. (Image courtesy of L.L. Scarlatos, reproduced with permission.)
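As a rough illustration of the underlying idea – not a reimplementation of Action Shapes – the following sketch draws per-player choice profiles as parallel coordinates with pandas; the choice variables and outcome labels are invented for the example.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Hypothetical per-player choice counts; the axis order groups "positive" choices
# together so that beneficial profiles form visually similar shapes.
data = pd.DataFrame({
    "recycled":      [5, 1, 4, 0],
    "solar_built":   [3, 0, 2, 1],
    "coal_built":    [0, 4, 1, 5],
    "energy_wasted": [1, 6, 2, 7],
    "outcome":       ["good", "poor", "good", "poor"],  # class column used for coloring
})

parallel_coordinates(data, class_column="outcome", colormap="coolwarm")
plt.title("Per-player choice profiles (Action-Shape-like view)")
plt.ylabel("count")
plt.show()
```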

Milam and El Nasr [27] proposed a set of design patterns for level design and used charts to visualize occurrence, context and frequency of these patterns in four different games.

Mirza-Babaei et al. [96,97] use what they termed Biometric Storyboards to visualize relationships between a player's physiological changes and game events. As pointed out by the authors [97], a current drawback is the use of individual graphs for each player instead of a composite graph in order to show trends among players. An example of a Biometric Storyboard is depicted in Fig. 4.

Fig. 4. Biometric Storyboards [96,97] use diagrams to show the connection between behavior (labels along the x-axis) and associated player experience (y-axis), based on players' physiological arousal signals and the player's self-drawn diagram of his gameplay experience. Red/green dots highlight positive/negative player experiences. (Image courtesy of P. Mirza-Babaei, reproduced with permission.) (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

7.2. Heatmaps

Heatmaps are commonly used for visualizing gameplay metrics that can be mapped to a specific coordinate. A heatmap is a two-dimensional map which uses color-coded gradients to indicate the frequency of occurrence of a variable at a particular location. On the plus side, heatmaps are easy to create and are well suited to recognize patterns of behavior. For 3D games heatmaps are usually created from a top-down perspective, therefore losing information about the third dimension. In games with multi-level architectural environments, heatmaps may therefore provide false or incomplete information. It should also be emphasized that most heatmaps require an adequate sample size to be meaningful. However, not all game studios can rely on a large number of playtesters and user-research people during development. Furthermore, heatmaps do not shed light on the emotional experience of the player. Fig. 5 shows a heatmap of player deaths for the map Dustbowl from the game Team Fortress 2.

Fig. 5. Heatmap of player deaths on the map Dustbowl from Team Fortress 2. The map uses a color gradient from blue to red, where red areas indicate places where most deaths occur. (Image © Valve, reproduced with permission.) (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
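A heatmap of this kind can be sketched in a few lines by binning logged positions into a 2D grid and mapping cell counts to a color gradient. The example below uses synthetic death positions and hypothetical map coordinates, not data from any of the cited games.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical logged death positions (x, y) in map coordinates.
rng = np.random.default_rng(0)
deaths = rng.normal(loc=[60, 40], scale=[15, 8], size=(5000, 2))

# Bin the positions into a 2D grid; the count per cell drives the color gradient.
counts, xedges, yedges = np.histogram2d(deaths[:, 0], deaths[:, 1], bins=64)

plt.imshow(counts.T, origin="lower", cmap="jet",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
# In practice the heatmap would be drawn semi-transparently on top of the level
# map, e.g. an imshow() of the map image followed by imshow(..., alpha=0.6).
plt.colorbar(label="number of deaths per cell")
plt.title("Death heatmap (synthetic data)")
plt.show()
```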
During development of Halo 3, Bungie used heatmaps showing player deaths to balance multiplayer maps [102]. Valve Corporation offers heatmaps of player deaths for Half-Life 2: Episode Two [41]. Psychostats [83] is a tool that generates heatmaps for Half-Life 1- and 2-based game servers. It supports the creation of animated hourly heatmaps or heatmaps for particular weapons, players, or teams. Drachen and Canossa [2] used grid-based heatmaps to analyze patterns of death in Tomb Raider: Underworld. Since a single heatmap only visualizes a single aggregated gameplay metric (e.g., deaths), it does not provide further contextual information (e.g., cause of death). Therefore, Drachen and Canossa [2] overlaid different layers, each containing the distribution of one cause of death. Volition [22] used heatmaps that do not rely on a simple color gradient but instead use symbols to represent player deaths, where additional information was encoded in the shape of the symbol.

Although heatmaps are most commonly used to monitor the exact locations of kills and deaths, they can be used for other variables as well. To give two examples: Ashton and Verbrugge [98] analyzed gameplay pacing in World of Warcraft and plotted the time spent by a group in different areas of the dungeon as a heatmap. The visual analytic tool SkyNet [37] uses heatmaps to show where crashes or warnings occur to help developers to quickly identify problem areas.

7.3. Movement visualizations

Movement is an integral and fundamental part of the gameplay mechanics of many 2D and 3D video games. When creating a game, designers will assume that players will interact with the game in certain ways. If these assumptions fail for various reasons (e.g., players getting lost or dying repeatedly) it is important to understand why this is the case. But since position and orientation are constantly changing they need to be tracked frequently, which usually results in large amounts of collected data. Therefore visualizations are necessary to present the data in a clear and meaningful manner. Fig. 6 shows a visualization of player paths from Hoobler et al.'s [28] Lithium system.

Fig. 6. A visualization of paths taken by different players in the game Return to Castle Wolfenstein: Enemy Territory, as generated by the Lithium system [28]. The thickness of the path represents the elapsed time: the thicker the line, the more recent in time. The color of the path reflects the team the players belong to and the glyphs represent the player classes. (Image courtesy of N. Hoobler, reproduced with permission.)

Usually the path of each player is plotted individually by connecting the logged positions as lines. Such a visualization was, for example, used by Penumarthy and Börner [91] to visualize social diffusion patterns in an online game. Miller and Crowcroft [32] measured avatar movement in World of Warcraft battlegrounds and characterized each avatar as belonging to one of three movement categories. Trace maps with different colors for each trace were used to visualize the movement.

In a subsequent paper, Miller and Crowcroft [104] rendered avatar movement as point clouds and automatically identified waypoints in the movement data by using the approach of Mitterhofer et al. [39] (see below). However, displaying a large number of individual trails will result in overlapping and visual clutter. Therefore appropriate aggregation techniques are necessary for supporting visual exploration of movement data (see [109] for a good overview). Dixit and Youngblood [1] developed a tool which generates a collection of webpages to present a large number of player paths in an organized fashion. Mitterhofer et al. [39] – concerned with bot detection in massively multiplayer online games – proposed an algorithm to simplify movement data by using line simplification, extracting waypoints from the coordinate dots of all simplified paths by means of clustering, and finally connecting these waypoints with straight lines.

To highlight the temporal component of movement, different visual properties like color cycling [1] or thickness [28] have been used. Coulton et al. [99] borrowed the concept of space-time paths from the field of geography to visualize the spatial and temporal information. Space-time paths reduce the environment to a two-dimensional plane and plot time on the third axis. As with other 2D visualizations this does not work well for games with multiple floors or which allow free movement in 3D. An example of a 3D visualization is the PlayerViz tool of Dixit and Youngblood [1].

Usually the movement data is annotated with further information such as orientation, health, or interactions with non-player characters. Dixit and Youngblood [1] used oriented lines protruding from the displayed positions to show the player's orientation at that point. Others superimposed icons and glyphs [28,91] or color-coded the position points [3]. Further work concentrated on extracting and displaying features from the movement data. Thawonmas et al. [10] used cellular automata and Hilditch thinning [110] to highlight frequently visited areas. Dixit and Youngblood [1] used their visualization to search for interesting patterns of behavior, from which mathematical models were derived to automatically detect these phenomena in the dataset.
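A minimal version of the individual-trace style described above simply draws each player's logged positions as a polyline, here colored by team. The trace data below is synthetic and the team assignment is invented.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical traces: {player_id: (team_color, array of (x, y) samples ordered by time)}.
rng = np.random.default_rng(1)
traces = {f"p{i}": ("red" if i % 2 else "blue",
                    np.cumsum(rng.normal(size=(200, 2)), axis=0))
          for i in range(6)}

for player, (team, path) in traces.items():
    plt.plot(path[:, 0], path[:, 1], color=team, alpha=0.6, linewidth=1)
    plt.scatter(*path[-1], color=team, marker="x")  # mark the last known position

plt.title("Individual player traces, colored by team (synthetic)")
plt.axis("equal")
plt.show()
```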
7.4. Self-organizing maps

Self-organizing maps (SOMs) [111] are a type of artificial neural network used to produce a low-dimensional (typically two-dimensional), discretized visualization of a high-dimensional input space by grouping similar data items together, akin to multidimensional scaling. A SOM consists of components called nodes which are usually displayed in a rectangular or hexagonal grid. While small SOMs resemble k-means clustering [112], large SOMs emphasize the topological properties of the input space and help to identify clusters in the low-dimensional embedding. For a thorough description the reader may refer to [111].

Drachen et al. [6] used emergent SOMs to identify different player types in Tomb Raider: Underworld. The high-dimensional input vector consisted of features relevant for the core game design, like completion time, cause of death, and the number of times help was requested. Clusters of player types can then be identified by visualizing the highest-performing SOM. By visualizing the component planes (the relative distribution of a component of the input vector projected onto the SOM) they were able to infer the characteristics of each identified cluster.

Similarly, Thawonmas et al. [106] used SOMs to cluster players based on their movement patterns in an online game. Players with similar movement patterns may share similar interests and therefore such information can help to adapt the game to different player types. As input vector they used transition probabilities between landmarks in the virtual environment, where the transition probability between two landmarks was defined as a function of how many times a player moved between the two landmarks in question. Again, a visualization of the resulting SOM helped to identify clusters of player behavior.

7.5. Node-link representations

Node-link representations have been mainly used for abstract or high-dimensional data which cannot be visualized in spatial relationship to the virtual environment. Many researchers [5,7,52,88] used classical multidimensional scaling (CMDS) [113], an exploratory technique used to visualize similarities of high-dimensional data in a low-dimensional space (usually 2D or 3D). However, CMDS requires a dissimilarity matrix that specifies the dissimilarity between every pair of input objects. Choosing an inappropriate dissimilarity function to relate the objects can therefore hinder the analysis.

Thawonmas et al. [7] presented a method which uses CMDS to locate clusters of players with respect to their movement patterns. Similar to their above-mentioned work [88], a matrix describing the transition probabilities between every pair of landmarks is calculated for each player. The similarity between two players is then derived by calculating the Euclidean distance between the respective transition probability matrices. For the visualization of the similarities they either apply CMDS or use a force-directed graph layout (see, e.g., [114] for an overview) in which case a node represents a player and a link between two nodes indicates a connection between them. The ideal edge length of such a connection is proportional to the aforementioned Euclidean distance. To avoid displaying a complete graph, the authors do not consider edges whose length is greater than a certain threshold. Therefore similar players will be arranged closer together in the resulting graph layout. Thawonmas and Iizuka [88] also describe an approach where MDS is used to locate clusters of players who behave similarly and then use KeyGraph1 [115] to visualize the player behavior inside these clusters.

1 KeyGraph was originally developed to extract keywords from a document. The output is a co-occurrence graph where nodes represent terms and links represent the co-occurrence.
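The following sketch illustrates this general recipe – Euclidean distances between per-player transition probability matrices, embedded with classical MDS – on synthetic data. The landmark count and matrices are invented, and the code is a stand-in, not the authors' implementation.

```python
import numpy as np

def classical_mds(dist, dim=2):
    """Classical (Torgerson) MDS: embed a distance matrix into `dim` dimensions."""
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J               # double-centered squared distances
    eigval, eigvec = np.linalg.eigh(B)
    order = np.argsort(eigval)[::-1][:dim]
    return eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0))

# Hypothetical per-player transition probability matrices between 5 landmarks:
# entry [i, j] = probability of moving from landmark i to landmark j.
rng = np.random.default_rng(3)
raw = rng.random((30, 5, 5))
transitions = raw / raw.sum(axis=2, keepdims=True)   # each row sums to one

# Player-to-player dissimilarity = Euclidean distance between flattened matrices.
flat = transitions.reshape(len(transitions), -1)
dist = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)

layout = classical_mds(dist)   # 2D coordinates; nearby points = similar movement
```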

Andersen et al. [5] used a node-link representation to depict player progressions through a game. Nodes represent game states and the player movements are directed edges. The major advantage is that their approach can also be applied to games where it is not possible to visualize gameplay in relation to the spatial environment, like in abstract puzzle games. CMDS is applied to obtain the layout of the graph. Different visual properties like size and color are used to emphasize highly visited game states or to reflect the probability that a player who reached a state eventually completed the level successfully. Such graphs are valuable for many research questions because they allow the user to observe sequences of actions.
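A state-transition graph of this kind can be assembled directly from logged state sequences, as the following sketch shows; the session data and state names are hypothetical, and the layout/rendering step (e.g., CMDS or a graph-drawing library) is omitted.

```python
from collections import Counter, defaultdict

# Hypothetical logged state sequences, one list of abstract game states per play session.
sessions = [
    ["start", "puzzle_1", "puzzle_2", "solved"],
    ["start", "puzzle_1", "puzzle_1_failed", "puzzle_1", "puzzle_2", "solved"],
    ["start", "puzzle_1", "puzzle_1_failed", "quit"],
]

# Nodes = states, directed edges = observed transitions, weighted by frequency.
transitions = defaultdict(Counter)
visits = Counter()
for states in sessions:
    visits.update(states)
    for src, dst in zip(states, states[1:]):
        transitions[src][dst] += 1

# Edge weights and visit counts can then drive edge thickness and node size in a
# node-link drawing (e.g. with networkx or Graphviz).
for src, outgoing in transitions.items():
    for dst, count in outgoing.items():
        print(f"{src} -> {dst}: {count} (visits to {src}: {visits[src]})")
```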
Fig. 7. A node-link diagram of gameplay data from an educational puzzle game as
generated by the software system of Wallner and Kriglstein [107]. Color-coded [21,22,105]), most of the visualizations (with a few exceptions, like
nodes and edges are used to depict states and actions performed by the players. The [21,97]) only focus on displaying game-metrics without including
player icons visualize the locations of players as a function of time. qualitative data about attitude, emotional experience or motiva-
Both above-mentioned approaches suffer from problems if the number of states is very large or the state space is continuous, in which case the output becomes cluttered and unintelligible. In a follow-up paper to Andersen et al. [5], Liu et al. [51] address this issue through feature-based aggregation of states (see Fig. 8); for instance, they propose to discretize the state space into a set of designer-specified key regions. This, however, requires knowledge of these key regions beforehand and may therefore influence the analysis.

Fig. 8. Playtracer [5,51] uses a node-link representation to visualize aggregated player behavior where nodes correspond to game states and edges depict transitions between states. Since the output can become unintelligible for large numbers of states (left), feature-based aggregation of states was introduced to reduce the visual complexity (right). (Images courtesy of E. Andersen and Y.E. Liu, reproduced with permission.)
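The general idea of such a key-region discretization can be sketched as follows. The key regions, coordinates and trace format below are hypothetical and only illustrate the aggregation step; they are not taken from Liu et al. [51].

# Designer-specified key regions of a hypothetical level, given as axis-aligned
# bounding boxes in world coordinates: name -> (xmin, ymin, xmax, ymax).
KEY_REGIONS = {
    "spawn":    (0, 0, 10, 10),
    "bridge":   (10, 4, 20, 6),
    "treasure": (20, 0, 30, 10),
}

def region_of(x, y):
    """Map a continuous position to a key region (or a catch-all bucket)."""
    for name, (xmin, ymin, xmax, ymax) in KEY_REGIONS.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return "other"

def aggregate(trace):
    """Collapse a raw positional trace into a sequence of key regions,
    dropping consecutive duplicates so that edges only mark region changes."""
    regions = [region_of(x, y) for x, y in trace]
    collapsed = regions[:1]
    for r in regions[1:]:
        if r != collapsed[-1]:
            collapsed.append(r)
    return collapsed

print(aggregate([(1, 1), (2, 3), (12, 5), (25, 5)]))  # ['spawn', 'bridge', 'treasure']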
8. Discussion

While in-game metrics provide huge amounts of highly valuable quantitative data to analyze gaming habits, they do not provide reliable information about why a player is engaging in specific behaviors or not engaging in others [22]. But, as Lynn [22] emphasizes, the why can sometimes be the most important question. As in many areas of human-computer or user-experience research, the best results can be obtained by using a combination of different methods. Although the importance of mixed-methods approaches has been recognized by many researchers and industry professionals (e.g., [21,22,105]), most of the visualizations (with a few exceptions, like [21,97]) focus only on displaying game metrics, without including qualitative data about the attitude, emotional experience or motivation of the users. In this sense, it would be beneficial to develop visualizations that include such kinds of information directly in the graphical depiction of the data.

As pointed out by Zimmermann et al. [101], game developers are now routinely instrumenting games to collect large amounts of data. This is also reflected in the increasing number of papers and presentations from industry professionals on game analytics (e.g., [22,37,77,78,102]). However, tools developed and data collected by the industry are rarely available to the public and are treated as confidential. Therefore, one major problem is still the lack of available gameplay data. Researchers (e.g., [5,7,26,51,52]) therefore often use their self-developed games to present and demonstrate the value of their analysis tools. Although establishing research cooperation between academia and industry may take time because of opposing goals, different pacing, or legal issues, it can – if properly set up – be to the benefit of both, as thoroughly discussed in [116]. In this context, conducting a study on the financial implications of game data acquisition and visualization could be another interesting topic for future research and could make a strong argument for companies to collaborate with academics. It would also be advisable to encourage developers to make datasets from already published games available to the research community. Although the insights gained that way will not be directly applicable to the development of the game in question, the results can prove valuable for other games and can inform and advance the field in general. Large game companies, like Microsoft, are able to mine enormous amounts of data not only for specific games but also across games via services such as Xbox Live (see, e.g., [101]). This is something that is usually not possible for independent researchers. Fortunately, some game developers have started to provide public application programming interfaces (APIs) to access in-game telemetry data. Examples include the API for the game Spore [117,118] from Maxis, the BF3 Stats API [119] to access statistics from Battlefield 3, and the World of Warcraft API [120]. While Bowman et al. [40] highlight the benefits of such APIs for community building, the latter API in particular has become a favorite of the research community, as evidenced by the number of publications on World of Warcraft ([32,98,100] are a few examples which are also concerned with visualization).
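For illustration, the snippet below shows the typical shape of a client for such a public telemetry API: a plain HTTP request whose JSON response can then be analyzed further. The endpoint URL and response fields are purely hypothetical placeholders; actual URLs, parameters, authentication requirements and terms of use differ for each of the services mentioned above and must be taken from their respective documentation.

import json
import urllib.request

# Hypothetical endpoint and response fields; consult the documentation of the
# respective service (e.g., the BF3 Stats or World of Warcraft APIs above) for
# the real URLs, parameters and terms of use.
URL = "https://api.example-game-stats.com/v1/players/SomePlayer/stats"

def fetch_player_stats(url):
    # Issue a plain HTTP GET request and decode the JSON payload.
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    stats = fetch_player_stats(URL)
    # Placeholder keys; actual field names depend on the service.
    print(stats.get("kills"), stats.get("deaths"), stats.get("play_time"))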
However, while the automatic collection of game metrics attracts increasing attention among game developers, issues of privacy are rarely discussed. This is especially of concern if the data is not collected during in-house play sessions but remotely. Players may not even be aware that data about their in-game behavior (or hardware) is collected and transmitted to the game's developer. Mostly, players give their consent by accepting an end-user license agreement (EULA) from the publisher of the game. For example, the privacy policies of Electronic Arts [121] and Rockstar Games [122] contain a special paragraph on the use of analytic metric tools and the collection of gameplay data, respectively. However, as a recent study [123] with 80,000 users of an online privacy tool has shown, users tend to blindly accept the terms of EULAs, with 50% of the study's participants taking less than 8 seconds to read the entire notice. Other games, in turn, like Mass Effect 3 or Battlefield 1943, offer an option inside their settings menu with which players can choose for themselves whether they want to allow telemetry or not. Yet, despite its importance, we are not aware of any academic article or study on the implications of game metric collection for the privacy of players.

In the following we briefly discuss open problems and future directions for research, which mainly revolve around six broad areas:

Data selection:
Instrumentation makes it possible to track every action a player performs while interacting with the game, resulting in huge amounts of data. However, effective analysis requires knowing which data should be tracked and how it should be analyzed. Statistical techniques or aggregated data can be useful if specific questions need to be answered. If the patterns in the data are not known a priori, techniques such as clustering, SOMs or graphs may be better suited (a minimal clustering sketch is given after this list). Drachen and Canossa [3] therefore argue to also develop methods to decide which data to track and how to analyze it. In another paper [2] they state that there is little knowledge on how metrics vary across games. In this sense, frameworks or heuristics may prove useful. Andersen et al. [5], for instance, propose to classify games for which a graph-based approach is appropriate or not.

Large-scale data:
The amount of data involved in analyzing gameplay is usually very large, and therefore the process of transforming, cleaning, analyzing and visualizing the data can be challenging, as pointed out by Drachen and Canossa [2]. The issues in this category are of course not specific to gameplay analysis but rather arise in all areas where large-scale data has to be analyzed. This includes performance-related considerations but also ways to reduce the visual complexity of the visualizations, e.g., by means of clustering as suggested by several authors [5,124].

Context:
Although gameplay metrics inform what players are doing, they usually do not reveal the motivation behind the player's behavior. For example, Coulton et al. [99] published an evaluation of a location-based game where the recorded path of one player suggested that the player was running around fairly aimlessly. Only after the player was interviewed did it become apparent that his strategy was to ambush another player. To avoid misinterpretations of the data, the analysis should include contextual data [2,21,125] or should be combined with user research methods like playtesting [2], video capture [2,21], or thinking-aloud protocols [126].

Automatic analysis:
Several researchers have argued for the automatic detection of patterns in the data to assist human analysis. This includes the automatic detection of landmarks in virtual environments [10] or of movement patterns, the automatic identification of roles that avatars take in a group [91], and the development of unsupervised and supervised machine-learning systems (e.g., [90]).

Integration:
Although many game teams collect game metrics, they never analyze them, as stated in [9]. First and foremost, game analytic tools have to be used by the developers and must be integrated well into the development process in order to be truly effective. Unfortunately, published case studies and best practices on how to integrate analytic tools into the development cycle are almost non-existent, Medler et al. [9] being a notable exception.

Causal relationships:
Cause-effect relationships cannot be observed in heatmaps or statistical diagrams. Yet, understanding causal relationships is important in order to enhance playability: for example, if a player does not collect item A, what will happen at point B? However, to our knowledge this area has remained largely unexplored in gameplay visualization, except for a few examples, like the recent work of Moura et al. [29].
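As a small, concrete instance of the exploratory route mentioned under Data selection (patterns not known a priori), the following sketch clusters per-player aggregate metrics with k-means using scikit-learn. The chosen metrics and values are invented for illustration, and the number of clusters would in practice have to be determined from the data.

import numpy as np
from sklearn.cluster import KMeans

# Per-player aggregate metrics (invented values): one row per player with
# [total playtime in hours, deaths per hour, quests completed].
metrics = np.array([
    [12.0, 1.5, 30],
    [80.0, 0.4, 210],
    [3.5, 4.2, 5],
    [95.0, 0.6, 250],
    [10.0, 2.0, 25],
    [4.0, 3.8, 8],
])

# Standardize each metric so that no single scale dominates the distance measure.
standardized = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)

# Group players into three behavioral clusters; the resulting labels can then be
# visualized or cross-referenced with other gameplay and survey data.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(standardized)
print(labels)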

Lastly, it should be emphasized that, contrary to productivity applications, the primary purpose of games is challenge and entertainment (see Pagulayan et al. [127] for an in-depth discussion of how games differ from productivity applications). Traditional evaluation methods used in general human-computer interaction have been adopted for the assessment of games, but their standard metrics like time-on-task, error rate or efficiency in task completion – although certainly relevant – are not sufficient to adequately assess the process of play (cf. [2,127–129]). Deciding which metrics to track is one challenge, deciding how best to analyze (and visualize) them is another. While some issues may be unique to game data analysis, other challenges may be similar, or perhaps even the same, in other areas dealing with (large-scale) data analysis or data mining (see, e.g., Keim [130] for a review and categorization of commonly used visualization techniques in visual data mining). It needs to be explored further how techniques used in other areas apply or can be adapted to the analysis of gameplay data.

9. Conclusions

Game analytics is an area that encompasses many different aspects, from financial issues to playability and game usability. Of all these aspects we particularly focused on analyzing and visualizing instrumentation data, i.e., data that is automatically logged within the game. Our discussion was based on four categories: target audience, field of application, data and representation. For each category we presented examples from the literature.

There exists no single solution to visualize all kinds of gameplay data. Mainly five types of visualizations are used: (i) charts and diagrams, (ii) heatmaps, (iii) different types of movement visualizations, (iv) SOMs and (v) node-link representations.

Analysis tasks can differ substantially from game to game and therefore custom solutions are usually developed. Although for some tasks specialized solutions will always be necessary, it would be helpful to identify common tasks (over different kinds of games and genres) and to provide guidelines on how these tasks are best approached. This would improve reusability, cut development costs and reduce training time.

Further, we identified possible directions for future research. Gameplay metrics analysis offers a rich and interesting field for research which goes beyond the mere application of statistical methods.

Acknowledgements

We sincerely thank all the researchers and companies who were kind enough to grant permission to use images of their work in this survey. We would also like to thank the reviewers for their valuable comments, which helped improve this paper.

References

[1] P.N. Dixit, G.M. Youngblood, Understanding playtest data through visual data mining in interactive 3D environments, in: Proc. CGAMES 2008.
[2] A. Drachen, A. Canossa, Analyzing spatial user behavior in computer games using geographic information systems, in: Proc. MindTrek 2009, ACM Press, 2009, pp. 182–189.
[3] A. Drachen, A. Canossa, Towards gameplay analysis via gameplay metrics, in: Proc. MindTrek 2009, ACM Press, 2009, pp. 202–209.
[4] L.E. Nacke, A. Drachen, K. Kuikkaniemi, J. Niesenhaus, H.J. Korhonen, W.M. van den Hoogen, K. Poels, W.A. IJsselsteijn, Y.A.W. de Kort, Playability and player experience research (panel abstracts), in: A. Barry, K. Helen, K. Tanya (Eds.), Breaking New Ground: Innovation in Games, Play, Practice and Theory: Proc. DiGRA 2009 Conference, Brunel University, 2009.
[5] E. Andersen, Y.-E. Liu, E. Apter, F. Boucher-Genesse, Z. Popović, Gameplay analysis through state projection, in: Proc. FDG 2010, ACM Press, 2010, pp. 1–8.
[6] A. Drachen, A. Canossa, G.N. Yannakakis, Player modeling using self-organization in Tomb Raider: Underworld, in: Proc. CIG 2009, IEEE Press, 2009, pp. 1–8.
[7] R. Thawonmas, M. Kurashige, K.-T. Chen, Detection of landmarks for clustering of online-game players, International Journal of Virtual Reality 6 (2007) 11–16.
[8] B. Medler, Generations of game analytics, achievements and high scores, Eludamos, Journal for Computer Game Culture 3 (2009).
[9] B. Medler, M. John, J. Lane, Data Cracker: developing a visual game analytic tool for analyzing online gameplay, in: Proc. CHI 2011, ACM Press, 2011, pp. 2365–2374.
[10] R. Thawonmas, M. Hirano, M. Kurashige, Cellular automata and Hilditch thinning for extraction of user paths in online games, in: Proc. NetGames 2006, ACM Press, 2006.
[11] H. Desurvire, M. Caplan, J.A. Toth, Using heuristics to evaluate the playability of games, in: Ext. Abstracts CHI 2004, ACM Press, 2004, pp. 1509–1512.
[12] H. Desurvire, C. Wiberg, Game usability heuristics (play) for evaluating and designing better games: the next iteration, in: Proc. OCSC 2009, Springer, 2009, pp. 557–566.
[13] H. Desurvire, C. Wiberg, User experience design for inexperienced gamers: gap game approachability principles, in: R. Bernhaupt (Ed.), Evaluating User Experience in Games, Human–Computer Interaction Series, Springer, 2010, pp. 131–147.
[14] A. Febretti, F. Garzotto, Usability, playability, and long-term engagement in computer games, in: Ext. Abstracts CHI 2009, ACM Press, 2009, pp. 4063–4068.
[15] F. Garzotto, Investigating the educational effectiveness of multiplayer online games for children, in: Proc. IDC 2007, ACM Press, 2007, pp. 29–36.
[16] K. Isbister, N. Schaffer, Game Usability: Advancing the Player Experience, Morgan Kaufmann, 2008.
[17] D. Pinelle, N. Wong, T. Stach, Heuristic evaluation for games: usability principles for video game design, in: Proc. CHI 2008, ACM Press, 2008, pp. 1453–1462.
[18] P. Sweetser, P. Wyeth, Gameflow: a model for evaluating player enjoyment in games, Computer Entertainment 3 (2005) 1–24.
[19] A. Tychsen, A. Canossa, Defining personas in games using metrics, in: Proc. Future Play 2008, ACM Press, 2008, pp. 73–80.
[20] R. Crossley, Gabe Newell on Valve, 2011. Available at: <http://www.develop-online.net/features/1192/Gabe-Newell-on-Valve> (accessed November 2012).
[21] J.H. Kim, D.V. Gunn, E. Schuh, B. Phillips, R.J. Pagulayan, D. Wixon, Tracking real-time user experience (TRUE): a comprehensive instrumentation solution for complex systems, in: Proc. CHI 2008, ACM Press, 2008, pp. 443–452.
[22] J. Lynn, Data metrics and user experience testing, in: CHI 2012 Workshop on Game User Research, 2012.
[23] A. Canossa, Y.-G. Cheong, Between intention and improvisation: limits of gameplay metrics analysis and phenomenological debugging, in: Proc. DIGRA 2011.
[24] K.-T. Chen, L.-W. Hong, User identification based on game-play activity patterns, in: Proc. NetGames 2007, ACM Press, 2007, pp. 7–12.
[25] P. DeRosa, Tracking player feedback to improve game design, Gamasutra (2007). Available at: <http://www.gamasutra.com/view/feature/1546/> (accessed July 2012).
[26] L. Scarlatos, T. Scarlatos, Visualizations for the assessment of learning in computer games, in: Proc. CEWIT 2010.
[27] D. Milam, M.S. El Nasr, Design patterns to guide player movement in 3D games, in: Proc. SIGGRAPH Sandbox 2010, ACM Press, 2010, pp. 37–42.
[28] N. Hoobler, G. Humphreys, M. Agrawala, Visualizing competitive behaviors in multi-user virtual environments, in: Proc. VIS 2004, IEEE Computer Society, 2004, pp. 163–170.
[29] D. Moura, M.S. El Nasr, C.D. Shaw, Visualizing and understanding players' behavior in video games: discovering patterns and supporting aggregation and comparison, in: Proc. SIGGRAPH Sandbox 2011, ACM Press, 2011, pp. 11–15.
[30] J. Dankoff, Game Telemetry with Playtest DNA on Assassin's Creed, 2012. Available at: <http://engineroom.ubi.com/game-telemetry-with-playtest-dna-on-assassins-creed-part-3/> (accessed November 2012).
[31] D. Nozhnin, Predicting churn: data-mining your game, Gamasutra (2012). Available at: <http://www.gamasutra.com/view/feature/170472/predicting_churn_datamining_your_.php> (accessed November 2012).
[32] J.L. Miller, J. Crowcroft, Avatar movement in World of Warcraft battlegrounds, in: Proc. NetGames 2009, pp. 1:1–1:6.
[33] B.G. Weber, M. John, M. Mateas, A. Jhala, Using data mining to model player experience, in: FDG Workshop on Evaluating Player Experience in Games, ACM Press, 2011.
[34] K. Hullett, N. Nagappan, E. Schuh, J. Hopson, Data analytics for game development (NIER track), in: Proc. ICSE 2011, ACM Press, 2011, pp. 940–943.
[35] B.G. Weber, M. John, M. Mateas, A. Jhala, Modeling player retention in Madden NFL 11, in: Proc. IAAI 2011, AAAI Press, 2011.
[36] B. Hillier, Interview with Randy Pitchford, 2012. Available at: <http://www.vg247.com/2012/08/23/borderlands-2-interview-randy-pitchford-and-the-500-game/> (accessed November 2012).
[37] G. Zoeller, Development telemetry in video games projects, in: Game Developer Conference 2010.
[38] D. Kennerly, Better game design through data mining, Gamasutra (2003). Available at: <http://www.gamasutra.com/view/feature/2816/better_game_design_through_data_.php> (accessed November 2012).
[39] S. Mitterhofer, C. Kruegel, E. Kirda, C. Platzer, Server-side BOT detection in massively multiplayer online games, IEEE Security and Privacy 7 (2009) 29–36.
[40] B. Bowman, N. Elmqvist, T. Jankun-Kelly, Toward visualization for games: theory, design space, and patterns, IEEE Transactions on Visualization and Computer Graphics 99 (2012).
[41] Valve Corporation, Half-Life 2: Episode Two Stats. Available at: <http://www.steampowered.com/status/ep2/ep2_stats.php> (accessed June 2012).
[42] Activision Publishing, Inc., Call of Duty Elite. Available at: <http://www.callofduty.com/elite> (accessed November 2012).
[43] J.L. Plass, M. Biles, J. Frye, T.-T. Huang, Games & the Future of Learning, 2012. Available at: <http://www.nyu.edu/about/news-publications/publications/connect-information-technology/2012/04/30/video-games-and-the-future-of-learning.html> (accessed November 2012).
[44] S.B. Linek, G. Öttl, D. Albert, Non-invasive data tracking in educational games: combination of logfiles and natural language processing, in: Proc. INTED 2010.
[45] D. Michael, S. Chen, Proof of learning: assessment in serious games, Gamasutra (2005). Available at: <http://www.gamasutra.com/view/feature/2433/proof_of_learning_assessment_in_.php> (accessed November 2012).
[46] A. del Blanco, J. Torrente, E.J. Marchiori, I. Martínez-Ortiz, P. Moreno-Ger, B. Fernández-Manjón, Easing assessment of game-based learning with <e-Adventure> and LAMS, in: Proc. MTDL 2010, ACM Press, 2010, pp. 25–30.
[47] A. Chaffin, K. Doran, D. Hicks, T. Barnes, Experimental evaluation of teaching recursion in a video game, in: Proc. Sandbox 2009, ACM Press, 2009, pp. 79–86.

[48] F. Ke, A case study of computer gaming for math: engaged learning from gameplay?, Computers and Education 51 (2008) 1609–1620.
[49] A. Serrano-Laguna, J. Torrente, P. Moreno-Ger, B. Fernández-Manjón, Tracing a little for big improvements: application of learning analytics and videogames for student assessment, in: Proc. VS-Games 2012.
[50] M.J. Habgood, S.E. Ainsworth, Motivating children to learn effectively: exploring the value of intrinsic integration in educational games, Journal of the Learning Sciences 20 (2011) 169–206.
[51] Y.-E. Liu, E. Andersen, R. Snider, S. Cooper, Z. Popović, Feature-based projections for effective playtrace analysis, in: Proc. FDG 2011, ACM Press, 2011, pp. 69–76.
[52] G. Wallner, S. Kriglstein, Design and evaluation of the educational game DOGeometry - a case study, in: Proc. ACE 2011, ACM Press, 2011, pp. 14:1–14:8.
[53] K.M. Gilleade, A. Dix, Using frustration in the design of adaptive videogames, in: Proc. ACE 2004, ACM Press, 2004, pp. 228–232.
[54] R. Hunicke, V. Chapman, AI for dynamic difficulty adjustment in games, in: Proc. AIIDE 2004, AAAI Press, 2004.
[55] M. Jennings-Teats, G. Smith, N. Wardrip-Fruin, Polymorph: dynamic difficulty adjustment through level generation, in: Proc. PCGames 2010, ACM Press, 2010, pp. 11:1–11:4.
[56] N. Shaker, G. Yannakakis, J. Togelius, Towards automatic personalized content generation for platform games, in: Proc. AIIDE 2010, AAAI Press, 2010.
[57] J. Togelius, R.D. Nardi, S.M. Lucas, Towards automatic personalised content creation in racing games, in: Proc. CIG 2007, pp. 252–259.
[58] S. Miller, Auto-dynamic Difficulty, 2004. Available at: <http://dukenukem.typepad.com/game_matters/2004/01/autoadjusting_g.html> (accessed November 2012).
[59] D. Kazemi, Metrics and dynamic difficulty in Ritual's Sin Episodes (Part 1), 2008. Available at: <http://orbusgameworks.com/blog/article/70/metrics-anddynamic-difficulty-in-rituals-sin-episodes-part-1> (accessed November 2012).
[60] M. Booth, The AI systems of Left 4 Dead, in: Artificial Intelligence and Interactive Digital Entertainment Conference, Stanford, USA.
[61] B. Tastan, G.R. Sukthankar, Learning policies for first person shooter games using inverse reinforcement learning, in: Proc. AIIDE 2011, AAAI Press, 2011, pp. 85–90.
[62] C. Bauckhage, B. Gorman, C. Thurau, M. Humphrys, Learning human behavior from analyzing activities in virtual environments, MMI Interaktiv-Human 1 (2007) 3–17.
[63] C. Thurau, C. Bauckhage, G. Sagerer, Learning human-like movement behavior for computer games, in: Proc. SAB 2004, MIT Press, 2004, pp. 315–323.
[64] J. Reeder, G. Sukthankar, M. Georgiopoulos, G. Anagnostopoulos, Intelligent trading agents for massively multi-player game economies, in: Proc. AIIDE 2008, AAAI Press, 2008, pp. 102–107.
[65] E. Ha, J.P. Rowe, B.W. Mott, J.C. Lester, Goal recognition with Markov logic networks for player-adaptive games, in: Proc. AIIDE 2011, AAAI Press, 2011.
[66] Mochibot. Available at: <https://www.mochibot.com> (accessed June 2012).
[67] Nonoba. Available at: <http://nonoba.com/developers/statistics> (accessed June 2012).
[68] K. Börner, G.J. Lee, S. Penumarthy, R.J. Jones, Visualizing the VLearn3D 2002 conference in space and time, in: Visualization and Data Analysis, vol. 5295, SPIE-IS&T, 2004, pp. 24–32.
[69] C. Chen, K. Börner, From spatial proximity to semantic coherence: a quantitative approach to the study of group dynamics in collaborative virtual environments, Presence: Teleoperators and Virtual Environments 14 (2005) 81–103.
[70] S. Stellmach, L. Nacke, R. Dachselt, 3D attentional maps: aggregated gaze visualizations in three-dimensional virtual environments, in: Proc. AVI 2010, ACM Press, 2010, pp. 345–348.
[71] C.A. Zanbaka, B.C. Lok, S.V. Babu, A.C. Ulinski, L.F. Hodges, Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment, IEEE Transactions on Visualization and Computer Graphics 11 (2005) 694–705.
[72] R. Haworth, S.S.T. Bostani, K. Sedig, Visualizing decision trees in games to support children's analytic reasoning: any negative effects on gameplay?, International Journal of Computer Games Technology (2010) 1–11.
[73] C. Macklin, J. Wargaski, M. Edwards, K.Y. Li, Dataplay: mapping game mechanics to traditional data visualization, in: A. Barry, K. Helen, K. Tanya (Eds.), Breaking New Ground: Innovation in Games, Play, Practice and Theory: Proc. DiGRA 2009 Conference, Brunel University, 2009.
[74] V. Zammitto, Visualization techniques in video games, in: Proc. EVA 2008, pp. 267–276.
[75] B. Medler, B. Magerko, Analytics of play: using information visualization and gameplay practices for visualizing video game data, Parsons Journal for Information Mapping 3 (2011).
[76] C. Pruett, Hot failure: tuning gameplay with simple player metrics, Gamasutra (2010). Available at: <http://www.gamasutra.com/view/feature/6155/hot_failure_tuning_gameplay_with_.php> (accessed November 2012).
[77] D. Schoenblum, Zero to millions: building an XLSP for Gears of War 2, in: Game Developer Conference, 2010.
[78] J. Ludwig, Flogging: data collection on the high seas, in: Game Developer Conference, 2007.
[79] P.N. Dixit, G.M. Youngblood, Understanding information observation in interactive 3D environments, in: Proc. SIGGRAPH Sandbox 2008, ACM Press, 2008, pp. 163–170.
[80] Playtomic. Available at: <http://playtomic.com/> (accessed June 2012).
[81] B. Medler, Player dossiers: analyzing gameplay data as a reward, The International Journal of Computer Game Research 11 (2011).
[82] GiantBomb. Available at: <http://www.giantbomb.com> (accessed June 2012).
[83] PsychoStats, Heatmaps. Available at: <http://www.psychostats.com/doc/Heatmaps> (accessed July 2012).
[84] A. Belicza, Sc2gears. Available at: <https://sites.google.com/site/sc2gears/> (accessed June 2012).
[85] Y.-G. Cheong, A. Jhala, B.-C. Bae, R.M. Young, Automatically generating summary visualizations from game logs, in: Proc. AIIDE 2008, AAAI Press, 2008.
[86] N. Halper, M. Masuch, Action summary for computer games: extracting action for spectator modes and summaries, in: Proc. ADCOG 2003, City University of Hong Kong, 2003, pp. 124–132.
[87] Rockstar Games, Rockstar Games Social Club. Available at: <http://socialclub.rockstargames.com> (accessed June 2012).
[88] R. Thawonmas, K. Iizuka, Visualization of online-game players based on their action behaviors, International Journal of Computer Games Technology 2008 (2008) 1–9.
[89] A. Iosup, Cameo: continuous analytics for massively multiplayer online games on cloud resources, in: Proc. Euro-Par 2009, Springer, 2010, pp. 289–299.
[90] S. Finnegan, R. Holte, G. Xiao, M. Trommelen, Machine learning for semi-automated gameplay analysis, in: Game Developers Conference 2005.
[91] S. Penumarthy, K. Börner, Analysis and visualization of social diffusion patterns in three-dimensional virtual worlds, in: R. Schroeder, A.-S. Axelsson (Eds.), Avatars at Work and Play: Collaboration and Interaction in Shared Virtual Environments (Computer Supported Cooperative Work), Springer, 2006, pp. 39–61.
[92] S. Joslin, R. Brown, P. Drennan, The gameplay visualization manifesto: a framework for logging and visualization of online gameplay data, Computer Entertainment 5 (2007).
[93] H. Marselas, Profiling, data analysis, scalability, and magic numbers: meeting the minimum requirements for Age of Empires II: The Age of Kings, Gamasutra (2000). Available at: <http://www.gamasutra.com/view/feature/3137/profiling_data_analysis_.php> (accessed July 2012).
[94] D. Williams, N. Yee, S.E. Caplan, Who plays, how much, and why? Debunking the stereotypical gamer profile, Journal of Computer-Mediated Communication 13 (2008) 993–1018.
[95] J. Timmer, Science gleans 60TB of behavior data from Everquest 2 logs, 2009. Available at: <http://arstechnica.com/gaming/2009/02/aaas-60tb-of-behavioral-data-the-everquest-2-server-logs/> (accessed July 2012).
[96] P. Mirza-Babaei, G. McAllister, Biometric storyboards to improve understanding of the player's gameplay experience, in: Proc. Videogame Cultures and the Future of Interactive Entertainment, 2011.
[97] P. Mirza-Babaei, L. Nacke, G. Fitzpatrick, G. White, G. McAllister, N. Collins, Biometric storyboards: visualising game user research data, in: Proc. CHI 2012 Extended Abstracts, ACM Press, 2012, pp. 2315–2320.
[98] M. Ashton, C. Verbrugge, Measuring cooperative gameplay pacing in World of Warcraft, in: Proc. FDG 2011, ACM Press, 2011, pp. 77–83.
[99] P. Coulton, W. Bamford, K. Cheverst, O. Rashid, 3D space-time visualization of player behavior in pervasive location-based games, International Journal of Computer Games Technology (2008) 1–5.
[100] C. Lewis, N. Wardrip-Fruin, Mining game statistics from web services: a World of Warcraft armory case study, in: Proc. FDG 2010, ACM Press, 2010, pp. 100–107.
[101] T. Zimmermann, B. Phillips, N. Nagappan, C. Harrison, Data-driven games user research, in: CHI 2012 Workshop on Game User Research, 2012.
[102] C. Thompson, Halo 3: how Microsoft Labs invented a new science of play, Wired (2007). Available at: <http://www.wired.com/gaming/virtualworlds/magazine/15-09/ff_halo> (accessed July 2012).
[103] A.R. Gagné, M.S. El-Nasr, C.D. Shaw, A deeper look at the use of telemetry for analysis of player behavior in RTS games, in: Proc. ICEC 2011, Springer, 2011, pp. 247–257.
[104] J.L. Miller, J. Crowcroft, Group movement in World of Warcraft battlegrounds, International Journal of Advanced Media and Communication 4 (2010) 387–404.
[105] A. Canossa, A. Drachen, J.R.M. Sørensen, Arrrgghh!!! – blending quantitative and qualitative methods to detect player frustration, in: Proc. FDG 2011, ACM Press, 2011, pp. 61–68.
[106] R. Thawonmas, M. Kurashige, K. Iizuka, M.M. Kantardzic, Clustering of online game users based on their trails using self-organizing map, in: Proc. ICEC 2006, LNCS, Springer, 2006, pp. 366–369.
[107] G. Wallner, S. Kriglstein, A spatiotemporal visualization approach for the analysis of gameplay data, in: Proc. CHI 2012, ACM Press, 2012, pp. 1115–1124.
[108] A. Inselberg, Parallel Coordinates: Visual Multidimensional Geometry and Its Applications, Springer, 2009.
[109] G. Andrienko, N. Andrienko, A general framework for using aggregation in visual exploration of movement data, The Cartographic Journal (2010) 22–40.
[110] C. Hilditch, Linear skeletons from square cupboards, Machine Intelligence 4 (1969) 403–420.
[111] T. Kohonen, Self-organizing Maps, Springer, 2001.
[112] J.B. MacQueen, Some methods for classification and analysis of multivariate observations, in: L.M.L. Cam, J. Neyman (Eds.), Proc. of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, 1967, pp. 281–297.
[113] J.B. Kruskal, M. Wish, Multidimensional Scaling, Sage Publications, 1978.

[114] I.G. Tollis, G. Di Battista, P. Eades, R. Tamassia, Graph Drawing: Algorithms for the Visualization of Graphs, Prentice Hall, 1998.
[115] Y. Ohsawa, N.E. Benson, M. Yachida, Keygraph: automatic indexing by co-occurrence graph based on building construction metaphor, in: Proc. ADL 1998, IEEE Computer Society, 1998, pp. 12–18.
[116] B.A. Lameman, M.S. El-Nasr, A. Drachen, W. Foster, D. Moura, B. Aghabeigi, User studies – a strategy towards a successful industry–academic game relationship, in: Proc. Futureplay 2010, ACM Press, 2010, pp. 134–142.
[117] D. Moskowitz, S. Shodhan, M. Twardos, Spore API: accessing a unique database of player creativity, in: SIGGRAPH 2009: Talks, ACM Press, 2009, p. 62:1.
[118] Electronic Arts Inc., Spore API. Available at: <http://www.spore.com/comm/developer/> (accessed July 2012).
[119] D. Herbst, BF3 Stats API. Available at: <http://bf3stats.com/api> (accessed July 2012).
[120] Blizzard Entertainment, World of Warcraft API. Available at: <http://www.wowwiki.com/World_of_Warcraft_API> (accessed July 2012).
[121] Electronic Arts, Privacy Policy. Available at: <http://www.ea.com/privacy-policy> (accessed November 2012).
[122] Rockstar Games, Rockstar Games Online Privacy Statement. Available at: <http://www.rockstargames.com/privacy> (accessed November 2012).
[123] R. Böhme, S. Köpsell, Trained to accept? A field experiment on consent dialogs, in: Proc. CHI 2010, ACM Press, 2010, pp. 2403–2406.
[124] L. Chittaro, R. Ranon, L. Ieronutti, VU-Flow: a visualization tool for analyzing navigation in virtual environments, IEEE Transactions on Visualization and Computer Graphics 12 (2006) 1475–1485.
[125] L. Nacke, C. Lindley, S. Stellmach, Log who's playing: psychophysiological analysis made easy through event logging, in: P. Markopoulos, B. de Ruyter, W. IJsselsteijn, D. Rowland (Eds.), Fun and Games 2008, LNCS, Springer, 2008, pp. 150–157.
[126] L. Chittaro, L. Ieronutti, A visual tool for tracing users' behavior in virtual environments, in: Proc. AVI 2004, ACM Press, 2004, pp. 40–47.
[127] R.J. Pagulayan, K. Keeker, D. Wixon, R.L. Romero, T. Fuller, The Human–Computer Interaction Handbook, L. Erlbaum Associates Inc., 2003, pp. 883–906.
[128] R.L. Mandryk, M.S. Atkins, K.M. Inkpen, A continuous and objective evaluation of emotional experience with interactive play environments, in: Proc. CHI 2006, ACM Press, 2006, pp. 1027–1036.
[129] L. Nacke, A. Drachen, S. Goebel, Methods for evaluating gameplay experience in a serious gaming context, International Journal of Computer Science in Sport 9 (2010).
[130] D.A. Keim, Information visualization and visual data mining, IEEE Transactions on Visualization and Computer Graphics 8 (2002) 1–8.
