The Cost of Reading Privacy Policies
Aleecia M. McDonald and Lorrie Faith Cranor 
Carnegie Mellon University 
Revised September 26, 2008 for the Telecommunications Policy Research Conference
Abstract
Companies collect personally identifiable information that website visitors are not always comfortable sharing. One proposed remedy is to use economics rather than legislation to address privacy risks, creating a marketplace for privacy in which website visitors would choose to accept or reject offers of small payments in exchange for loss of privacy. The notion of micropayments for privacy has not been realized in practice, perhaps because, as Simson Garfinkel points out, advertisers might be willing to pay a penny per name and address, yet few people would sell their contact information for only a penny (Garfinkel, 2001). In this paper we contend that the time to read privacy policies is, in and of itself, a form of payment. However, instead of receiving payments to reveal information, website visitors must pay with their time to research policies in order to retain their privacy. We pose the question: if website users were to read the privacy policy for each site they visit just once a year, what would the loss of their time be worth?

Studies show privacy policies are hard to read, read infrequently, and do not support rational decision making. We calculated the average time to read privacy policies in two ways. First, we used a list of the 75 most popular websites and assumed an average reading rate of 250 words per minute to find an average reading time of 10 minutes per policy. Second, we conducted an online study of 93 participants to measure the time to skim online privacy policies and respond to simple comprehension questions, with an average time of 6 minutes per policy. We then used data from Nielsen/Net Ratings to estimate the number of unique websites the average Internet user visits annually, with a lower bound of 119 sites. We estimated the total number of Americans online based on Pew Internet & American Life data and Census data. Finally, we estimated the value of time as 25% of average hourly salary for leisure and twice wages for time at work. We present a range of values, and find the nationwide cost for just the time to read policies is on the order of $365 billion. Additional time for comparing policies between multiple sites in order to make informed decisions about privacy brings the social cost well above the market for online advertising. Given that web users also place some value on their privacy on top of the time it takes to read policies, this suggests that under the current self-regulation framework, targeted online advertising may have negative social utility.
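The headline figure follows from a short chain of multiplications. The sketch below, in Python, lays out the structure of that calculation using the quantities stated above; the online population, average wage, and work/leisure split are labeled as assumed placeholders rather than the inputs used in this paper, so its output will not reproduce the $365 billion figure exactly.

# Structure of the back-of-envelope national estimate. Values marked
# "assumed" are illustrative placeholders, not this paper's inputs.
SITES_PER_YEAR = 119        # lower-bound unique sites visited annually
MINUTES_PER_POLICY = 10     # word-count-based reading time per policy
LEISURE_WEIGHT = 0.25       # leisure time valued at 25% of wages
WORK_WEIGHT = 2.0           # work time valued at twice wages

ONLINE_POPULATION = 200e6   # assumed: placeholder for the Pew/Census estimate
HOURLY_WAGE = 20.0          # assumed: placeholder average hourly salary (USD)
WORK_SHARE = 0.5            # assumed: fraction of policy reading done at work

hours_per_person = SITES_PER_YEAR * MINUTES_PER_POLICY / 60
hourly_value = HOURLY_WAGE * (WORK_SHARE * WORK_WEIGHT
                              + (1 - WORK_SHARE) * LEISURE_WEIGHT)
cost_per_person = hours_per_person * hourly_value
national_cost = cost_per_person * ONLINE_POPULATION

print(f"{hours_per_person:.1f} hours per person per year")
print(f"${cost_per_person:,.0f} per person; ${national_cost / 1e9:,.1f} billion nationally")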
1 Introduction
The Federal Trade Commission (FTC) supports industry self-regulation for online privacy (FTC, 2000). In the late 1990s, the FTC decided that the Internet was evolving very quickly and new legislation could stifle growth. In particular, there were concerns that it was premature to legislate to protect privacy before other mechanisms evolved, especially when business was expected to offer more effective and efficient responses than FTC staff could devise. The Internet was still young, commerce on the Internet was very new, and legislators and regulators adopted a hands-off approach rather than risk stifling innovation. However, concerns remained about data privacy in general and on the Internet in particular. For example, the FTC recommended legislation to protect children's privacy, which led to the Children's Online Privacy Protection Act (COPPA) in 1998 (FTC, 1999).

Prior to COPPA, the FTC adopted Fair Information Principles (FIPs), a set of ideals around data use. The notion of FIPs predates the Internet; several nations adopted differing FIPs in response to concerns about credit databases on mainframes in the 1970s (Laudon, 1996). While FIPs do not themselves carry the force of law, they provide a set of principles for legislation and government oversight. In this way they are similar to the Universal Declaration of Human Rights, which in Article 12 states the principle that “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks,” but leaves the specific legal implementations of those ideals in the hands of individual nations (UDHR, 1948). The five FIPs the FTC adopted in 1973 – notice/awareness, choice/consent, access/participation, integrity/security, and enforcement/redress – are a subset of the eight protections ensconced in the Organization for Economic Co-operation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (OECD, 1980).

The FIP of notice underlies the notion of privacy policies, which are a mechanism for companies to disclose their practices. In 1998, the FTC commissioned a report that found while 92% of US commercial websites collected some type of data, only 14% provided comprehensive notice of their practices (FTC, May 2000). The FTC was concerned that the FIP of notice/awareness was not faring well on the new Internet: consumers did not know where their data went, or what it might be used for. While the FTC did not recommend new legislation or regulation for adults' data, the FTC retained the threat of regulatory action to help coax large companies into voluntarily disclosing their data practices via online privacy policies (FTC, May 2000).

Voluntary disclosure formed the basis of an industry self-regulation approach to notice. Because privacy policies were voluntary, there were no requirements for the existence of a policy, let alone restrictions as to the format, length, readability, or content of any given privacy policy. However, in addition to the threat of regulatory action to spur voluntary disclosure, the FTC also used deceptive practices and fraud actions to hold companies to whatever content they did publish. In essence, while a company was not strictly required to post a policy, once published, the policy became enforceable. In one case the FTC brought action even without a privacy policy. When Cartmanager surreptitiously rented their customer lists, the FTC advanced a legal theory of unfairness rather than fraud (Stampley, 2005). Cartmanager provided online shopping cart software and worked with clients who promised not to sell customer data. The FTC argued that even though Cartmanager did not have a privacy policy of their own to violate, they still violated the policies of their clients (FTC, March 2005).

The FTC initiated a series of studies of hundreds of commercial websites to determine how well industry self-regulation worked, in what became known as Internet sweeps. Year over year, the number of companies offering privacy policies increased. By that metric it appeared the FTC was successful. However, multiple studies also showed people were reluctant to shop online because they had privacy concerns (FTC, May 2000). Recall that the FTC's charter is largely financial – barriers to new markets and commerce are a serious issue. The FTC turned to two innovative approaches, rather than legislation or regulatory action. First, it expressed great hope for online privacy seals (FTC, 1999). Two seal providers, TRUSTe and the Better Business Bureau (through BBBOnline), began certifying websites' privacy policies. TRUSTe requires companies to follow some basic privacy standards and document their own practices. TRUSTe also investigates consumer allegations that licensees are not abiding by their policies (TRUSTe, 2008). However, TRUSTe has come under criticism for not requiring more rigorous privacy standards (McCarthy, 1999). In fact, one study showed that companies with TRUSTe seals typically offer less privacy-protective policies than those without TRUSTe seals (Krishnamurthy, 2002). Second, the FTC expressed hope that privacy enhancing technologies (PETs) would put greater control directly into the hands of consumers, and created a task force to encourage PETs (FTC, 1999). PETs include encryption, anonymity tools, and other software-based approaches. One particularly intriguing approach came from the Platform for Privacy Preferences (P3P) standard, which used privacy policies coded in standardized machine-readable formats to determine for customers whether a given website provided an acceptable privacy policy (Cranor, 2006). Although P3P support is integrated into popular web browsers, most users unfortunately remain unfamiliar with the technology (Jensen, 2005).
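To make the P3P approach concrete, the following minimal sketch (an illustration under stated assumptions, not part of the original analysis) shows in Python the kind of check a P3P-aware user agent can perform: it reads a site's machine-readable compact policy from the P3P HTTP response header and rejects the site if the policy contains tokens the user has ruled out. The example blocklist and URL are hypothetical; deployed P3P agents evaluate full XML policies against much richer user preferences.

import re
import urllib.request

# Hypothetical user preference: refuse P3P recipient tokens that signal
# data flowing beyond the site itself in the compact-policy vocabulary.
UNACCEPTABLE = {"SAM", "OTR", "UNR", "PUB"}

def compact_policy(url):
    # Fetch the page and read the P3P header, which carries a compact
    # policy of the form: P3P: CP="NOI DSP COR ..."
    with urllib.request.urlopen(url) as response:
        header = response.headers.get("P3P")
    if header is None:
        return None  # site publishes no machine-readable policy
    match = re.search(r'CP="([^"]*)"', header)
    return set(match.group(1).split()) if match else None

def acceptable(url):
    tokens = compact_policy(url)
    if tokens is None:
        return False  # conservative default: no policy, no trust
    return not (tokens & UNACCEPTABLE)

# Hypothetical usage: print(acceptable("http://www.example.com/"))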
1.1 Economic Theories of Privacy Policies
The FTC started with a set of principles, almost akin to a framework of rights, and encouraged companies to protect these rights by adopting privacy policies. Economists also see utility in privacy policies, but from an entirely different basis.

Advertising economics looks at ways to turn a commodity (e.g., water) into a bundle of marketable attributes (e.g., from mountain springs). There are three types of attributes. Search goods are things readily evaluated in advance, for example color. Experience goods are only evaluated after purchase or use, for example the claims of a hair care product. Credence attributes cannot be determined even after use, for example the nutrition content of a food. One argument for mandatory nutrition labels on food is that labeling converts nutrition information from a credence attribute to a search attribute: consumers can read the label prior to purchase (Drichoutis, 2006). This argument applies equally well to online privacy. Without a privacy policy, consumers do not know if a company will send spam until after they have made the decision to provide their email address. With a privacy policy, consumers can check to make sure privacy protections are sufficient prior to engaging in business with the site.