Maps of Our Lives: Sensing People and Objects Together in the Home

Ryan Aipperspach, Allison Woodruff, Ken Anderson, Ben Hooker

Electrical Engineering and Computer Sciences
University of California at Berkeley

Technical Report No. UCB/EECS-2005-22
http://www.eecs.berkeley.edu/Pubs/TechRpts/2005/EECS-2005-22.html

November 30, 2005
Copyright © 2005, by the author(s). All rights reserved.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission.
Maps of Our Lives: Sensing People and Objects Together in the Home
Ryan Aipperspach, Allison Woodruff, Ken Anderson, and Ben Hooker
{ryanaip@cs.berkeley.edu, woodruff@acm.org, ken.anderson@intel.com, email@benh.net}
Intel Research Berkeley, 2150 Shattuck Avenue, #1300, Berkeley, CA 94704 USA
Intel Research PaPR, 2111 NE 25th Ave., Hillsboro, OR 97124 USA
Computer Science Division, University of California, Berkeley, Berkeley, CA 94720 USA
ABSTRACT

The proliferation of portable electronic devices in the home creates the opportunity for increasingly complex interactions between household residents and their devices. We present a study of these interactions, focusing on laptop computers in homes with wireless networks, describing the technical infrastructure for the study, and exploring a range of findings about home life. We also present several design implications of this work. We collected highly accurate position and device usage data about residents and their wireless laptop computers, and used visualizations of the data to motivate discussion during interviews. This data collection and interviewing method is a novel and promising alternative to other methods such as diaries or self-report surveys.
Author Keywords

Home life, home technology, wireless laptops, mobility.

ACM Classification Keywords

H.5.2 [Information Interfaces and Presentation]: User Interfaces.
INTRODUCTION

Portable, wireless devices are rapidly becoming ubiquitous. Wireless networks are being deployed with increasing frequency, both in the home and elsewhere. Similarly, in May 2005, laptop computers outsold desktops for the first time ever [28], and many households now have several laptops, along with an array of other electronic devices. Devices such as wireless laptops provide a broader range of possibilities for interaction with the home environment than do stationary desktop computers. However, little is known about emergent patterns of use for mobile technologies in the home and how they are integrated into daily life. Understanding these patterns would be valuable for the future design of both devices and the architectural spaces that support them.

This paper assumes that objects are unavoidably embedded in activity [15]. In his collection The Social Life of Things, Appadurai posits that commodities, like persons, have social lives [1]. To grasp the life of objects, he suggests we need to embrace things-in-motion as disclosing agents in studies of the social-material world. However, it is difficult to capture the full context of objects-in-motion as they circulate in the social-physical environment. We have developed a set of tools to help us move beyond an understanding of objects as mere commodity or instrument as we develop new technologies. Specifically, we incorporate other objects, architectural features, and social relationships. Accordingly, we have chosen to study the use of portable computing devices in the context of their location in the architectural layout of the home, as well as in the context of the presence or absence of household members.

In order to study the use of computing devices in these contexts, it is useful to collect a range of detailed information about people's practices in the home and their use of devices. As we discuss further below, many existing methods for studying practices in the home have limitations. For example, contextual inquiry and self-report are challenging because people typically find it difficult to report mundane activities accurately, and video is challenging because many people are understandably reluctant to have cameras deployed throughout their homes. We propose a new method in which we use a sensor-based visual record of the physical movement of people and devices to facilitate more accurate and in-depth discussion during interviews. Previous work has explored the utility of different media in diary studies [4]. Our work offers a rich alternative medium (visualization of activity) that does not require participants to capture or record events. Additionally, recently available sensing technology [30] supports accuracy on the scale of centimeters and meters, which is highly appropriate for understanding local movements within and between rooms. This technology has made it technically feasible to track the locations and see the spatial relationships of people and devices in real-world deployments. In addition to providing useful traces for discussion in interviews, this technology offers the ability to automatically collect valuable quantitative data.
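As a concrete illustration of the kind of quantitative data such a deployment yields, the sketch below computes how long a tagged person and a tagged device were co-located. This is not the paper's actual analysis pipeline; it is a minimal sketch assuming synchronized position samples of the sort a centimeter-scale ultra-wideband tracking system could provide, and all names (Sample, colocated_seconds, the tag identifiers) are hypothetical.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Sample:
    t: float      # seconds since the start of logging
    entity: str   # tag id, e.g. "person:anna" or "device:laptop1"
    x: float      # position in meters, in room coordinates
    y: float

def colocated_seconds(samples, a, b, radius=2.0):
    """Total time entities a and b spent within `radius` meters.

    Assumes both tags are sampled at the same timestamps (as a
    synchronized tracking system would provide); each gap between
    consecutive timestamps is credited if the pair was close at
    the start of the gap.
    """
    # Index positions by timestamp, then by entity.
    pos = {}
    for s in samples:
        pos.setdefault(s.t, {})[s.entity] = (s.x, s.y)
    times = sorted(pos)
    total = 0.0
    for t0, t1 in zip(times, times[1:]):
        pa, pb = pos[t0].get(a), pos[t0].get(b)
        if pa and pb and hypot(pa[0] - pb[0], pa[1] - pb[1]) <= radius:
            total += t1 - t0
    return total
```

Summaries like this (co-presence, room-by-room dwell times) can be rendered as the visual traces used to ground interview discussion.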
