2013 IEEE 10th International Conference on Ubiquitous Intelligence & Computing and 2013 IEEE 10th International Conference on Autonomic & Trusted Computing

Eye-Tracking Experiment Design for Extraction of Viewing Patterns


in Social Media
Guangyu PIAO1, Qun JIN1, Xiaokang ZHOU1, Shoji NISHIMURA1,
Kanoksak WATTANACHOTE2, Timothy K. SHIH2, and Neil Y. YEN3
1 Graduate School of Human Sciences, Waseda University,
Tokorozawa, Japan
{piao@fuji., jin@, xkzhou@ruri., kickaha@}waseda.jp
2 Department of Computer Science and Information Engineering,
National Central University, Taipei, Taiwan
kaskonak@hotmail.com, timothykshih@gmail.com
3 School of Computer Science and Engineering,
University of Aizu, Aizu-Wakamatsu, Japan
neilyyen@u-aizu.ac.jp

Abstract—Eye-tracking has recently been applied in a wide spectrum of fields, in both academic research and business. In this study, we concentrate on the analysis of instant (and often subconscious) information generated from interactions between an individual and devices such as a PC, laptop, or mobile phone. We present an experiment design to capture and extract viewing patterns in Twitter using eye-tracking technology. We show a set of experiment results based on the analysis of eye gaze data, in order to demonstrate how subjects look for specified keywords in the Twitter timeline, which can further contribute to the categorization of viewing patterns.

Keywords- browsing behavior, gaze patterns, eye-tracking, social media

I. INTRODUCTION

Eye-tracking is widely considered one of the most powerful experimental techniques for observing browsing behavior during the performance of tasks. With an increase in the availability of eye trackers and a reduction in their cost, eye movement studies have flourished not only in vision research [1, 2] but also in language, advertising, clinical, and animal research [3, 4]. Although these studies differ widely in their goals, one common characteristic among them is the techniques used to study and analyze eye movements. With the development of eye trackers, it is possible to measure the position and motion of eye gaze more accurately and precisely [5].

In our previous study, a mechanism was proposed to integrate SNS (such as Twitter) into the social learning process in order to mobilize the learning interaction, improve the teaching quality, and ultimately enhance the e-learning process [6]. Moreover, we proposed an integrated approach to analyze and extract gaze patterns in order to capture users' unconscious information behaviors, based on eye gaze movement data collected by an eye tracker and then processed by our proposed method [7].

In this paper, we present an experiment design to capture and extract viewing patterns in Twitter using an eye-tracker. We design an eye-tracking experiment to investigate the patterns that emerge while users browse content in a web browser, and we define viewing patterns to extract users' information behaviors on the SNS. When users are viewing content in the Twitter timeline that is related to the information they want, they may gaze at a point of the content for a while, change their gaze direction, or move on to related content.

The rest of this paper is organized as follows. We give a brief overview of related work on eye-tracking technology and experiment design in Section II. In Section III, we show the design of the eye-tracking experiment using an SNS to extract viewing patterns. In Section IV, we discuss the results of the eye-tracking experiment and the viewing patterns. Finally, we conclude this study and highlight future work in Section V.

II. RELATED WORK

Eye-tracking has been widely applied in research fields such as human-computer interaction and usability studies. By using an eye-tracker to detect a user's gaze behaviors, such as dwell time, duration of fixations, and number of fixations, Alt et al. developed a mechanism that allows web content to be customized based on the user's gaze behavior [8]. In the study presented in [9], the author reviewed statistical methods for the spatial and temporal analysis of eye movement data, in order to pin down their advantages and disadvantages. Cristino et al. proposed a method to bin the saccade sequence spatially and temporally, which creates a sequence of letters that retains fixation location, time, and order information [10]. Jarodzka et al. defined scan paths as a series of geometric vectors and proposed an approach that offers more multifaceted insights into how similar two scan paths are [11].

Goldberg and Helfman described and extended several compact visual scan path representations, which can provide additional insight into scanning strategies [12]. The same authors further proposed a method to identify scanning strategies by automatically aggregating groups of matching scan paths [13]. Given a gaze path obtained by an eye-tracker, Pallez et al. focused on automatically ordering solutions shown on a screen [14]. Iqbal and Bailey used eye gaze patterns to identify user tasks, and pointed out that these patterns can help in the design and evaluation of user interfaces [15], while Josephson and Holmes tested their hypothesis on web page revisitation with an eye-tracker [16]. Burmester and Mast investigated the eye movement sequences of users visiting web pages repeatedly [17]. In the study presented in [18], Underwood et al. showed that scan paths can indicate a user's knowledge background in the research domain. Zhao and Koch

978-1-4799-2481-3/13 $31.00 © 2013 IEEE
DOI 10.1109/UIC-ATC.2013.45

Authorized licensed use limited to: UNIVERSIDADE FEDERAL DE SANTA MARIA. Downloaded on January 14,2023 at 21:25:04 UTC from IEEE Xplore. Restrictions apply.
constructed a computational saliency model using fixated locations in natural scenes [19].

Last but not least, there are many applications in information searching. For example, Conati et al. used eye-tracking data to build a user model to understand and capture users' needs during interaction through an adaptive interface [20].

III. EXPERIMENT DESIGN

In this section, we first introduce our eye-tracking experiment environment, in which we organize messages from Twitter into the web browser. Based on this, we present the experiment design to capture eye gaze movement, analyze the eye-tracking data, and extract viewing patterns using an eye-tracking device.
A. Overview of the Experiment
We ask 32 subjects to take part in the experiments. They are instructed to view the Twitter pages, which contain about 200 lines per page. Their eye movements are recorded by a Tobii X120 eye-tracker connected to a PC with a 17-inch monitor (resolution of 1024x768). The eye-tracker has an accuracy of approximately 0.5 degrees and a sampling rate of 120 Hz, and uses infrared lenses to detect pupil centers and corneal reflection [21].

Figure 1 Overview of Extracting Viewing Patterns
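The raw output of an eye-tracker such as the Tobii X120 is a stream of timestamped gaze coordinates; fixations have to be derived from that stream before any pattern analysis. The paper does not state which fixation-detection procedure was used, so the following is only an illustrative Python sketch of a standard dispersion-threshold (I-DT) algorithm; the sample stream, thresholds, and function name are our own assumptions.

```python
# Illustrative dispersion-threshold (I-DT) fixation detection.
# The gaze stream and thresholds below are hypothetical; the paper does
# not specify how fixations were derived from the 120 Hz samples.

def detect_fixations(samples, max_dispersion=30, min_duration_ms=100, rate_hz=120):
    """samples: list of (x, y) gaze points in screen pixels, one per tick.
    Returns a list of (start_index, end_index, centroid) fixations."""
    min_len = int(min_duration_ms * rate_hz / 1000)  # samples in the minimum duration

    def dispersion(window):
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    i = 0
    while i + min_len <= len(samples):
        j = i + min_len
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while j < len(samples) and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            window = samples[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((i, j - 1, (cx, cy)))
            i = j
        else:
            i += 1
    return fixations

# A stable cluster of samples, a saccade-like jump, then another cluster:
stream = [(100, 100)] * 20 + [(400, 300)] * 5 + [(102, 101)] * 15
print(detect_fixations(stream))
```

In this toy stream the five transition samples are too dispersed and too brief to count, so only the two stable clusters are reported as fixations.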
Through the analysis of eye movements on the Twitter page, we extract viewing patterns. In this study, a viewing pattern is defined as the path along which a subject views the Twitter page, which is repeated in the experiment and shows a certain similarity and regularity. Viewing patterns are extracted based on the analysis of the gaze data collected by eye-tracking.
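A viewing pattern defined this way, as a repeated path with a certain similarity and regularity, has to be made comparable across trials. One common approach, in the spirit of the string-based scan-path methods cited in Section II [10], is to bin fixations into screen regions, code each region as a letter, and compare the resulting strings by edit distance. The grid size and example paths in this Python sketch are hypothetical, not the paper's own parameters.

```python
# Hypothetical sketch: encode a scan path as a string of grid-cell labels,
# then compare two paths by Levenshtein edit distance (in the spirit of
# the string-based scan-path methods cited in Section II).

def encode(path, cell=256, width=1024):
    """Map (x, y) fixations on a 1024x768 page to letter-coded grid cells."""
    cols = width // cell
    return "".join(chr(ord("A") + (y // cell) * cols + (x // cell)) for x, y in path)

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

p1 = encode([(50, 50), (300, 50), (50, 300), (300, 300)])  # zigzag from left
p2 = encode([(60, 40), (310, 60), (60, 290), (900, 700)])  # similar start, drifts
print(p1, p2, edit_distance(p1, p2))
```

A small edit distance between two letter-coded paths then indicates the kind of similarity and regularity the definition above requires.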
Figure 1 shows the overview of the experiment to extract the viewing patterns. We design the experiment in two parts, using two Japanese words, "Nobody" and "Business", inserted into the Twitter page. Each page contains 20 keywords. We ask our subjects to find the keywords within 45 seconds, browsing the Twitter page without hesitation
(1) from left to right, and
(2) around the center,
respectively, as shown in Figure 2. We analyze the eye-tracking data to extract the viewing patterns. The eye-tracking data are represented as scan paths.

Figure 2 Illustration of Twitter Browsing Path: (1) from left, (2) around center

B. Description of Viewing Patterns

Eye-tracking research on browsing behaviors generally focuses on the existence of repetition within sequences, as the browsing behavior may suggest underlying cognitive features. Measuring general similarities and/or regularities between all of the scan paths would reveal some features. However, each subject may perform a completely different scan path while seeking information.

In this study, we cluster and extract the viewing patterns from the zigzag scan path on the Twitter page. The density and shape of the zigzag on the Twitter page represent the browsing efficiency. Two basic classifications are conceived and proposed in this experiment: the concentrated viewing pattern and the sparse viewing pattern. The sparse viewing pattern takes a shorter time to browse the Twitter page, while the concentrated viewing pattern spends more time on the details of each line of the Twitter page.

IV. EXPERIMENT RESULT AND ANALYSIS

A. Experiment Result

Table 1 shows the results of the eye-tracking experiments. To summarize how the subjects seek hidden keywords in the Twitter timeline, which can further be extracted and categorized into viewing patterns, we specify two viewing patterns in this experiment: the high-efficiency viewing pattern and the low-efficiency viewing pattern. The numbers in the table are the numbers of keywords found in the limited time.
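The split into high- and low-efficiency patterns reported below rests on how many of the 20 inserted keywords a subject found within the 45-second limit: more than 12 is treated as high efficiency, fewer than 9 as low efficiency. A minimal Python sketch of that thresholding follows; the function name and the "intermediate" label for the unnamed middle band are our own, only the cut-offs come from the paper.

```python
# Minimal sketch of the keyword-count thresholding used in Section IV:
# over 12 of the 20 keywords found in 45 s -> high-efficiency,
# fewer than 9 -> low-efficiency. The "intermediate" label is ours;
# the paper only names the two extreme bands.

def classify(found, high=12, low=9):
    if found > high:
        return "high-efficiency"
    if found < low:
        return "low-efficiency"
    return "intermediate"

# First few "Nobody, from left" counts from Table 1:
counts = {1: 5, 2: 13, 3: 6, 4: 11, 5: 9}
labels = {subject: classify(n) for subject, n in counts.items()}
print(labels)
```

Applied to the full "Nobody, from left" column of Table 1, these cut-offs reproduce the counts quoted in the text (five high-efficiency and eight low-efficiency subjects).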


Figures 3 (a) and (b) show the subjects looking for the keyword "Nobody" in the high- and low-efficiency patterns from left. The example gaze plots show the scan path, and the scan paths in this experiment exhibit the zigzag feature. From Table 1, we can see that five of the subjects found over 12 keywords in the limited time, and the so-defined high-efficiency viewing pattern shows sparseness from left. On the other hand, eight subjects found fewer than nine keywords in the limited time, and in a similar way, the so-defined low-efficiency viewing pattern shows concentration from left.

Table 1 Result of Eye-Tracking Experiment

Subject    "Nobody"                  "Business"
           From Left  Around Center  From Left  Around Center
   1           5           9             8          13
   2          13          14            19          20
   3           6           4             8           8
   4          11          10            20          17
   5           9           7            14          16
   6          12          14            10           8
   7           9          12             8          11
   8          12          12             7           8
   9           9           8             8           6
  10          15          10            14           8
  11           8          11            18          20
  12          11          16            17          17
  13          13          14            18          18
  14          13          11            16          15
  15           6           9            17          17
  16           6          13            14          19
  17          12          12            17          18
  18           6          12            17          19
  19          10           9            15          15
  20          12          15            17          18
  21           7           5             9          10
  22           7          11            14          19
  23          14          16            14          15

Figures 3 (c) and (d) show the subjects looking for the keyword "Nobody" in the same high- and low-efficiency patterns around the center. In this zigzag gaze plot around the center, the viewing patterns demonstrate some features different from those of the experiment from left. From Table 1, we can see that seven of the subjects found over 12 keywords in the limited time, and the high-efficiency viewing pattern shows sparseness around the center. On the other hand, four of the subjects found fewer than nine keywords in the limited time; their gazes were intertwined with the details of each line.

Furthermore, Figures 4 (a) and (b) show the subjects looking for the keyword "Business" in the high- and low-efficiency patterns from left. Table 1 shows that 16 subjects found over 12 keywords in the limited time, and the high-efficiency viewing pattern shows sparseness from left. On the other hand, five subjects found fewer than nine keywords in the limited time, and the low-efficiency viewing pattern shows concentration from left.

Lastly, Figures 4 (c) and (d) show the subjects looking for the keyword "Business" around the center. We can see from Table 1 that 16 of the subjects found over 12 keywords in the limited time, and the high-efficiency viewing pattern shows sparseness around the center, while five subjects found fewer than nine keywords in the limited time. Their gaze plots present the same inclination.

B. Discussion

We have shown a set of experiment results of eye gazes on how the subjects seek specified keywords in the Twitter timeline, which can be further categorized into two viewing patterns. In this experiment, the high-efficiency viewing patterns embody the feature that the subjects browse the Twitter page comparatively fast, and consequently more keywords were found. The gaze movement of the subjects with the high-efficiency viewing patterns proceeds reasonably fast and without wavering in each line, which contributes to the keywords being found in a short time. By analyzing the eye-tracking data, we found that the gaze movements showing the high-efficiency viewing patterns did not stop at one line more than five times.

On the other hand, the low-efficiency viewing patterns embody the feature that the subjects browse the Twitter page in a comparatively unorganized way, and thus fewer keywords could be found. Some of the subjects did not even finish the experiment in the limited time. The gaze movements showing the low-efficiency viewing patterns stop at every line too often and spend extra time on the details of the tweets. By analyzing the eye-tracking data, we found that the gaze movements showing the low-efficiency viewing patterns stopped on a Twitter line more than five times, or did not stop around the keyword long enough to catch it. In some cases, subjects browsed down so sketchily that they did not notice the keywords inserted in the lines.

One more finding is that the subjects who could find "Nobody" with the high-efficiency viewing patterns performed the same way in the "Business" task. Similarly, the subjects who did not find many words in the "Nobody" experiment also performed with low efficiency in the "Business" task.
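The behavioral rule distinguishing the two patterns in the discussion, that high-efficiency gazes did not stop at one line more than five times, can be phrased as a simple per-line fixation count. The Python sketch below assumes hypothetical fixation records tagged with the timeline line they fell on; the paper does not publish its raw gaze data, so the data structures and function names here are our own.

```python
# Hedged sketch of the per-line stop rule from the discussion: a scan is
# treated as low-efficiency if any single timeline line attracted more
# than five fixation stops. The fixation records here are hypothetical.
from collections import Counter

def line_stop_profile(fixation_lines):
    """fixation_lines: sequence of timeline line indices, one per fixation."""
    return Counter(fixation_lines)

def is_low_efficiency(fixation_lines, max_stops_per_line=5):
    profile = line_stop_profile(fixation_lines)
    return any(count > max_stops_per_line for count in profile.values())

fast_scan = [0, 1, 1, 2, 3, 3, 4, 5, 6, 7]   # sweeps down the timeline steadily
stuck_scan = [0, 1, 1, 1, 1, 1, 1, 1, 2, 3]  # dwells repeatedly on line 1
print(is_low_efficiency(fast_scan), is_low_efficiency(stuck_scan))
```

Note that this rule captures only one side of the discussion's observation; the other failure mode, not stopping near a keyword long enough to catch it, would additionally require fixation durations.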


Figure 3 Experiment Result of Test "Nobody": (a), (b) from left; (c), (d) around center
Figure 4 Experiment Result of Test "Business": (a), (b) from left; (c), (d) around center
V. CONCLUSION

In this paper, we have presented the experiment design to capture and extract viewing patterns with eye-tracking, using the Twitter timeline as the experiment test bed. We have further shown a set of experiment results based on the analysis of eye gaze data, in order to demonstrate how the subjects seek specified keywords following the Twitter timeline, which can further contribute to the categorization and extraction of viewing patterns, and we have inferred features by summarizing the analysis results.

As we know, the scan paths recorded as gaze data by eye-tracking technology can provide basic data for further analysis. The viewing patterns discovered in this work could help us gain insights into users' browsing behaviors, and even into some unconscious behaviors. These data and meta-data (such as the viewing patterns) may represent a user's potential preferences, which can be further used to build a user model.

There are still some improvements to be made to obtain more objective eye-tracking data for the experiment. Thus, for future work, we plan to improve how the viewing patterns are extracted. For example, we will consider using a combination of multimedia contents for the subjects to browse. In addition, it is necessary to investigate how to construct more elaborate viewing patterns.

ACKNOWLEDGMENT

The work has been partly supported by 2013 Waseda University Grants for Special Research Project No. 2013B-207 and 2010-2012 Waseda University Advanced Research Center for Human Sciences Project "Distributed Collaborative Knowledge Information Sharing Systems for Growing Campus."

REFERENCES

[1] D. Parkhurst, K. Law and E. Niebur, "Modeling the role of salience in the allocation of overt visual attention," Vision Research, Vol.42, No.1, pp.107-123, 2002.
[2] B.W. Tatler, R.J. Baddeley and I.D. Gilchrist, "Visual correlates of fixation selection: effects of scale and time," Vision Research, Vol.45, No.5, pp.643-659, 2005.
[3] K. Rayner, "Eye movements in reading and information processing: 20 years of research," Psychological Bulletin, Vol.124, No.3, pp.372-422, 1998.
[4] A.T. Duchowski, "A Breadth-First Survey of Eye Tracking Applications," Behavior Research Methods, Instruments, & Computers, Vol.34, No.4, pp.455-470, 2002.
[5] A.T. Duchowski, Eye Tracking Methodology (2nd ed.), Springer, 2007.
[6] X. Zhou, G. Piao, Q. Jin and R. Huang, "Organizing Learning Stream Data by Eye-tracking in a Blended Learning Environment Integrated with Social Media," Proc. ITME2011 (2011 IEEE International Symposium on IT in Medicine & Education), Vol.2, pp.335-339, Guangzhou, China, Dec. 9-11, 2011.
[7] G.Y. Piao, X.K. Zhou, Q. Jin and S. Nishimura, "Analyzing and Extracting Gaze Patterns to Capture Unselfconscious Information Behavior via Eye Tracking Experiments," Proc. 2011 IEEE International Conference on Intelligent Computing and Integrated Systems, Guilin, China, Oct. 24-26, 2011.
[8] F. Alt, A.S. Shirazi, A. Schmidt and J. Mennenöh, "Increasing the user's attention on the web: using implicit interaction based on gaze behavior to tailor content," Proc. 7th Nordic Conference on Human-Computer Interaction, ACM, New York, USA, pp.544-553, 2012.
[9] M.I. Coco, "The statistical challenge of scan-path analysis," Proc. 2nd Conference on Human System Interactions, IEEE Press, Piscataway, NJ, USA, pp.369-372, 2009.
[10] F. Cristino, S. Mathôt, J. Theeuwes and I.D. Gilchrist, "ScanMatch: A novel method for comparing fixation sequences," Behavior Research Methods, Vol.42, No.3, pp.692-700, 2010.
[11] H. Jarodzka, K. Holmqvist and M. Nyström, "A vector-based, multidimensional scanpath similarity measure," Proc. 2010 Symposium on Eye-Tracking Research & Applications, ACM, New York, USA, pp.211-218, 2010.
[12] J.H. Goldberg and J.I. Helfman, "Visual scanpath representation," Proc. 2010 Symposium on Eye-Tracking Research & Applications, ACM, New York, USA, pp.203-210, 2010.
[13] J.H. Goldberg and J.I. Helfman, "Scanpath clustering and aggregation," Proc. 2010 Symposium on Eye-Tracking Research & Applications, ACM, New York, USA, pp.227-234, 2010.
[14] D. Pallez, M. Cremene, T. Baccino and O. Sabou, "Analyzing human gaze path during an interactive optimization task," Proc. 2010 Workshop on Eye Gaze in Intelligent Human Machine Interaction, pp.12-19, 2010.
[15] T. Iqbal and B. Bailey, "Using eye gaze patterns to identify user tasks," Proc. GHC2004 (The Grace Hopper Celebration of Women in Computing), 2004.
[16] S. Josephson and M.E. Holmes, "Attention to repeated images on the World-Wide Web: Another look at scanpath theory," Behavior Research Methods, Instruments, & Computers, Vol.34, No.4, pp.539-548, 2002.
[17] M. Burmester and M. Mast, "Repeated web page visits and the scanpath theory: A recurrent pattern detection approach," Journal of Eye Movement Research, Vol.3, No.4, pp.1-20, 2010.
[18] G. Underwood, K. Humphrey and T. Foulsham, "Knowledge-Based Patterns of Remembering: Eye Movement Scanpaths Reflect Domain Experience," Proc. 4th Symposium of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society on HCI and Usability for Education and Work (A. Holzinger, ed.), Springer-Verlag, Berlin, Heidelberg, pp.125-144, 2008.
[19] Q. Zhao and C. Koch, "Learning a saliency map using fixated locations in natural scenes," Journal of Vision, Vol.11, No.3, pp.1-15, 2011.
[20] C. Conati, C. Merten, S. Amershi and K. Muldner, "Using Eye-tracking Data for High-Level User Modeling in Adaptive," Proc. AAAI2007 (22nd National Conference on Artificial Intelligence), pp.1614-1617, 2007.
[21] Tobii Technology, Tobii X60 & X120 User Manual, http://www.tobii.com/Global/Analysis/Downloads/User_Manuals_and_Guides/Tobii_X60_X120_UserManual.pdf (accessed on July 12, 2013)
