The End of Privacy as We Know It?:
The Ethics of Privacy on
Online Social Networks
Cristina Cordova
Adviser: Professor Rob Reich
Reader: Professor BJ Fogg
Reader: Dr. Allegra McLeod
Program in Ethics in Society
Stanford University
May 10, 2010

The purpose of this thesis is to determine the ethical responsibilities of online social networks to protect privacy. Although the social norms of online privacy are in flux, online social networks must employ fair information practices by notifying users when private information is shared. They must give users the opportunity to refuse consent to share information and only use information for its intended purpose. Online social networks have slowly eliminated user control without receiving meaningful user consent to do so, thereby violating the user-service provider relationship. Online social networks have also used choice architecture and design against users to push them toward sharing more information than they otherwise would. By eliminating user control, online social networks have slowly destroyed privacy through unethical practices.
Table of Contents
Introduction
Chapter 1: Privacy: Information Privacy and Ethical Challenges
Chapter 2: Privacy on Email Services
Chapter 3: Privacy on E-Commerce Websites
Chapter 4: The Case of Facebook
Chapter 5: Choice Architecture
Chapter 6: Perceived Privacy
Chapter 7: Trust
Chapter 8: User Interface Design
Chapter 9: Ethical Information Privacy Practices
Conclusion
Bibliography
Introduction: Information Privacy and Ethical Challenges
"Privacy is dead, deal with it," Sun Microsystems CEO Scott McNealy is widely reported to have declared in 2001 (Hamblen). In the early 1900s, some were apprehensive about using telephones knowing that phone companies had the capacity to listen to phone calls. When Gmail launched in 2004, many privacy advocates objected to its advertising model, which involved Google's "robot eyes" scanning each e-mail for keywords and displaying contextual advertisements next to a user's inbox (Calore). Yet the vast majority of people soon accepted the telephone and later Gmail into their lives as convenient technological innovations. Many accepted the trade-off of losing a little privacy for the ease of communication. Admittedly, the privacy concerns with telephones and email were drastically different. With the amount of technology available today, it would be extremely difficult to simply drop off the grid. While some privacy advocates claim that a piece of privacy dies with each and every technological innovation, many still take action to protect their privacy. Protection of one's information is still important. One does not give out his home phone number or home address to someone he does not know. How much of a problem does the erosion of privacy with the introduction of new technologies pose from an ethical standpoint? Online social networks have become ubiquitous, and the implications of the recent erosion of privacy due to technology are especially pressing in this context.
Today, if you have a real-life social network, chances are you have an online social network (OSN) as well. What makes OSNs different from every other technological innovation of the past century? Facebook's rise has been the fastest: over 400 million users in six years. Additionally, a typical user shares information about himself with friends, family and any other random person who happens to land on his profile, rather than simply directing emails or phone calls to specific individuals. Facebook transformed from what was simply a fun tool for communication between college students at a few universities to a requisite for most Internet users. In 2004, when OSNs were less prevalent, no one felt compelled to use them. Today, OSNs have differentiated their services from other forms of communication and induced millions to check the service daily for updates on the social lives of others. OSNs of varying kinds are popular everywhere you can imagine, from Orkut in Latin America, to Myspace in the United States, to Friendster in Asia-Pacific and Bebo in Europe. Facebook is the dominant force in online social networking around the world, as it is the most visited website on the Internet.
When the users of OSNs care about privacy, they seek to join private networks limited to friends and family. The developers of OSNs have eliminated those private networks, however, demanding that users make some of their information publicly available to everyone on the Internet. Additionally, they have eliminated the privacy settings users were given when they first signed up for the service. Facebook's CEO Mark Zuckerberg told users "we decided that these would be the social norms now and we just went for it" (Kirkpatrick). With a quick adjustment of privacy settings, Facebook transformed the way users viewed their own personal privacy on the service. The norms continued to shift, changing the perception of privacy on Facebook and having similar effects across the entire web.
Privacy norms are in flux. Many are now comfortable with sharing information they perceived as private just ten years ago. In a world where one day users can make their OSN profile pictures private, and the next the OSN has told users that they must keep them public, privacy concerns and practices are constantly evolving. Told that they must constantly innovate, OSNs have adopted a mentality of acting now and apologizing later. In this world, OSN corporations have even more power than they would otherwise, as their policies are also in flux. Many have the ability to unilaterally change privacy norms. Users must have an anchor to ensure that when privacy policies, corporations and norms are all changing, they have some semblance of privacy to hold onto.
In this thesis, I will explore the ethics of privacy as it relates to OSNs. I argue that privacy is control. Ethical behavior of an OSN is determined by a trusting relationship in which the user has control. When users have control of their information, they can protect their privacy. Control can be achieved through specific fair information practices including notification, choice, use and security. When users lose control of their information by fair information practice standards, they no longer have a trusting relationship with the OSN and privacy is compromised. OSNs such as Facebook do not act in the user's best interest and, therefore, do not instill trust in the user-to-OSN relationship. When OSNs do not ensure that users opt to share information with meaningful consent, and when they refuse access to users who want to share less, they do not demonstrate ethical intentions. Privacy is deeply ingrained in the law, but I will take only a limited look at legal implications, as I find the ethical implications to be more indicative of a user's trust in an OSN. Much of my thesis is focused on Facebook as it has the largest global reach and is the most persuasive and pervasive technology to date, yet much of my analysis applies to other OSNs as well.
As the Internet continues to grow, the proliferation of OSNs presents significant information privacy issues. In Chapter 1, I will thoroughly explore privacy and its definition, applications, values and costs. Before we explore the ethical dimensions of privacy, we must determine what it is, how to apply it to OSNs and why it is necessary and important to the public.
Chapter 1: Privacy
1.1 Defining the Value of Privacy
The attention given to privacy has increased with the expansion of the digital age and the amount of personal information available through OSNs (Electronic Privacy Information Center). Yet most have come to little agreement on what privacy actually means and why it matters. To understand the ethics of privacy, we must first define privacy itself.
Alan F. Westin in Privacy and Freedom presents the most applicable definition of privacy for technological inquiry. Westin defines privacy as "the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others" (4). Throughout this thesis, when I refer to "privacy," I am using Westin's definition. I chose this particular meaning because it focuses on the control of information in reference to individuals. This is important as it ensures that when users have control, they are able to enter trusting relationships with OSNs and freely communicate through their services.
Additionally, this definition is extremely broad. I am not seeking a legal definition with which to assert claims for information privacy, as this definition encompasses too much for that purpose. I am choosing to look at privacy as more than a legal concept in order to be able to assess legal rights to privacy. The broad definition suits a wide variety of claims for protection in an ethical sense. This definition also provides neutrality, in that the information individuals, institutions and groups choose to convey can be true, false, important or inconsequential. Information privacy does not depend on the content or merit of the information at issue. Information privacy on the Internet applies not only to what is meaningful, but to information that is misleading or untrue as well. As a result, protecting the privacy of others can have great costs when it shields the spread of misleading information, and it serves critical social and political value when it enables the spread of important information.
I have defined the value of privacy, but why is it important? Why should any society recognize a right of individuals, groups or institutions to determine for themselves when, how and to what extent information about them is communicated to others? This question is particularly important in light of the fact that privacy is not an end in itself but rather an instrument for achieving other values. Privacy is a tool needed to achieve some result, namely autonomy; release from publicity; self-evaluation and individual decision-making; and limited and protected communication. These are four values applicable to individuals and organizations that informational privacy serves. These values provide a framework for identifying the benefits served by privacy.
1. Autonomy: Individuals and institutions require some degree of personal or organizational autonomy in order to function. Westin suggests privacy is critical because "deliberate penetration of the individual's protective shell, his psychological armor, would leave him naked to ridicule and shame and would put him under the control of those who knew his secrets" (12). In this respect, privacy is vital to the development of the individual's sense of self. There exists a need for private space to develop and reflect on ideas and opinions within a democracy. Organizations often require privacy for independence and integrity, what Westin describes as "organizational autonomy." This includes, for example, the need for tech companies to keep secrets from one another in a competitive market. This is even required by anti-trust laws and protected by trade secret laws and patents to support the collective privacy of an organization. Even government institutions in the U.S. are permitted and sometimes required to conduct particular activities in private despite the fact that they are subject to extensive openness and disclosure. Private institutions need privacy just as much as the individual, and today's capitalistic market requires this secrecy.
2. Release from Publicity: Privacy is necessary in order to release emotions. It provides the opportunity to be out of the public eye and express emotion without the fear of consequences to further communication. Denying this kind of privacy can have several consequences, including increased anxiety and the suppression of thought. Groups and organizations also require a release from publicity to allow for the experimentation of ideas and the facilitation of risk-taking by individuals within the organization. Privacy provides a release from constant image control. Without organizational privacy, individuals would feel watched and unable to concede or compromise. For example, if legislative discussions were all made public, fewer deals to pass legislation would be made for fear of public knowledge. People are typically averse to changing their minds or backing down publicly. Release from publicity differs from autonomy in that autonomy allows organizations and individuals to self-govern and work independently. Release from publicity allows organizations and individuals to operate in the same way without the influence of the public eye to hold them back from acting in certain ways (either actively or passively).
3. Self-Evaluation and Individual Decision-making: Privacy allows individuals to seize the opportunity for self-evaluation. The personal privacy resulting from solitude provides reflection essential to creativity. Without the space and time to process the information that confronts individuals, especially in the information age with its constant bombardment, it is difficult for one to assess the considerable amount of information received. Privacy also allows one to move from the private reflection of information to the public sphere as he chooses. Privacy is necessary for group evaluation within organizations and institutions. This value is closely related to the previous two. For example, if every conversation among a business's leaders, every draft of an email and every contract were made available to the public, discussion and decision-making would be repressed. Moreover, the law allows for the secrecy of jury deliberations until a final decision is reached. The disclosure of thoughts and suggestions concerning a case would hinder the accuracy and the outcome of a decision. The effects of public scrutiny can be devastating. Privacy is necessary to ensure dependable decision-making. As controversy and discussion abound on the Internet, everyone has an opinion with regard to individuals and institutions. It becomes astonishingly more difficult to make decisions independently and to evaluate what one does without the opinions others offer.
4. Limited and Protected Communication: Westin identifies another value of individual privacy, limited and protected communication, needed to avoid "the situation in which each individual was utterly candid in his communications with others, saying exactly what he knew or felt at all times" (18). This value of privacy identifies the idea that individuals need opportunities to share personal information with friends and family. This kind of privacy is needed on an individual level because the information may be valuable to the person it is disclosed to and is also important to the original speaker. Organizations need limited and protected communications to receive the services of lawyers, accountants and advisers. This value of information privacy applies particularly well to organizations. In order to give information to clients and customers, organizations need limited and protected communications. This interest extends beyond that of the individual, as many organizations would be unable to function without access to this relevant information. For example, a bank requires limited and protected communication of its financial records because it is necessary for the bank's activities. Customers would also be unwilling to provide their information unless they were assured that it was protected against disclosure. This is an example of why most states exempt communications to law enforcement officials, courts, or business associates from their defamation laws.
These are the types of information privacy society generally wishes to encourage. Protecting the privacy of the organizations and individuals involved facilitates limited and protected communication. Defining privacy as control is most applicable to OSNs and allows for autonomy; release from publicity; self-evaluation and individual decision-making; and limited and protected communication. Privacy as control provides several values necessary for enhancing the relationship between a user and an OSN.
1.2 Two Ways Privacy is Often Applied to Social Networks
While no single definition of privacy applies in all circumstances, Westin's definition applies best when discussing OSNs due to its wide scope. Although users often claim that they want more privacy on OSNs, what they are often really asking for is more control over their personal information. They want to choose who can see their information and have the ability to create, change and delete it. Control over this information allows the user to keep certain information private and make other information public. This is the key to developing one's identity, as one can choose when and how others may perceive, observe or interact with him. Around the world, control and autonomy have become the theoretical foundation for many privacy laws and policies.
Others defend privacy as a matter of human dignity, and much has been written in defense of privacy in this way. Edward Bloustein grounds his theory of privacy here, arguing that all interests in privacy share a value of "respect for individual dignity, integrity, and independence" (39). When one's privacy is violated, his personal information is exposed and open to public investigation. His relationships may also be compromised, and he no longer has the ability to form and nourish them. Those who view privacy as dignity often believe that they are not able to truly develop their own personality and distinct life without it.
1.3 Costs of Privacy
Although privacy protects important values, it also imposes costs on individuals as well as institutions. Many of these costs surround the opportunity to mislead, which is inherent in legal protection for "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others" (4). Privacy can facilitate the dissemination of false information, for example, when information about someone is posted anonymously on a social networking blog without repercussions for the person who spread the false information. Privacy can also protect the withholding of true information, for example, failing to disclose relevant criminal history on a job application. Privacy can interfere with the collection, organization and storage of information that businesses and individuals might otherwise use to make decisions. This suggests that the costs of privacy remain high, including the costs incurred by information users who wish to determine the accuracy and completeness of the information they receive, and the risk of future losses resulting from incorrect information. Privacy can reduce productivity and lead to unnecessary costs.
The price of privacy is social, ethical, psychological and economic. One is allowed to control the perception of others by concealing or distorting information. For those who seek privacy regarding crimes committed in the past, the safety of those who surround criminal offenders is disturbed, creating possible public danger. What man would not want to know if his child's babysitter had a criminal history of molestation? What man would not want to know if his potential sexual partner had contracted a sexually transmitted disease? Privacy protects this interest in non-disclosure, which results in questionable outcomes.
There exists, even more so on the Internet, a popular interest in the lives of other people. Tabloid press and gossip blogs feed the hunger for stories about other people, whether or not these stories are even true. Much of this motivation is called "casual prying" or "idle curiosity," but the information the press yields on these topics appeals to the ordinary person making consumption and other decisions. The everyday individual is now made into a celebrity whom others can constantly watch and observe on social networks. The motivations for doing so are informational, and people are motivated to a greater extent than they may even realize. The protection of privacy interferes with the acquisition of this information, whether it is sought for significant purposes or for voyeuristic curiosity.
Privacy recognizes the right of the individual to reveal what he chooses about himself, which conflicts with other societal values, including free expression, prevention of crime, protection of property, and conducting government operations effectively. There are countless benefits that result from the expense of some extent of privacy, yet privacy remains an important concern for many OSN users. While giving up some privacy can result in overall gains, when norms shift toward the release of most privacy, autonomy and control are eliminated. When norms shift with the change of privacy policies every few months, users are constantly giving without the assurance of a gain. Before discussing privacy on OSNs, I will discuss privacy on other areas of the Internet for the purpose of comparing and contrasting. Email, e-commerce and OSNs are commonly used on the Internet, but all have very different implications and effects on users.
Chapter 2: Privacy on Email Services
Some believe that privacy on OSNs is similar to privacy on other areas of the Internet. Just as it is your responsibility to ensure that you don't give away your email password, it is also your responsibility to ensure your photos are not viewable to the public if you do not want someone seeing them. One might go as far as saying that if you're worried about privacy, you should not belong to an OSN and should delete your account. Would someone respond similarly to a privacy inquiry about snail mail? After all, the postman or a neighbor could easily sift through your personal letters. It's unlikely that someone would tell you to stop using the postal service as a form of communication because you're worried about privacy. What makes privacy on OSNs different? OSNs are not a "necessary" form of communication for all, some might say. Yet Facebook, one social network alone, has over 400 million users (Facebook Statistics). Many have found OSNs to be a necessity that they use many times a day to communicate more easily than through email. Views on email and OSNs are very different. There is little use in researching privacy on OSNs if we're unable to compare it to privacy in other realms of the Internet. I will explore email, as it is something that most people use on a daily basis, in order to contrast it later with OSNs.
When a user opens an email, he expects that he is the only one who can read it. In Google's own words, "In personal email communications, there has always been, and always should be, an expectation of privacy between the sender and the intended recipients of a message, enabling open communication with friends, colleagues, family, and others" (About Gmail). Users expect that personal information (name, email address, billing information) is not shared with other users unless they explicitly send it to them. There is no "send this email to everyone I know" button that can share all of your information with the world at the click of a mouse. If a user accidentally clicks "Reply All" instead of the "Reply" button in response to an email, he would see it as a personal mistake rather than one he should blame his service provider for. The number of contacts is limited, and sending emails requires some precise thought, as users must fill out email addresses one at a time in the address field.
While the email relationship structure from user to user is fairly clear across all services, it is less so when deciphering the relationship structure of user to service provider. Email services log information users provide in order to make other useful products or aspects of their email services known to users. Users share personal private information through these services and expect it to remain private. This information is not shared with anyone outside of the service provider unless the user gives the company permission with opt-in consent, the service provider is required to share it by law, or the company believes the user is trying to harm the property or safety of the service provider itself. These privacy policies are standard across the most popular email service providers.
As most email service providers provide their products for free, they make profits through advertising within email (and often through search advertising if they have a search product). Advertisements, like the one below in Yahoo! Mail, are targeted to users.
Fig. 1 - Yahoo Email Advertisement (Tech-Ex)
On the right of the email window, a company advertises a 5.75% variable interest rate to a user. This ad is targeted to users to whom it might be of interest, rather than to all users, in order to get better click-through rates. Services do this by showing advertisers non-personal aggregated information, such as the total number of Yahoo users who searched for the term "new interest rate" or how many people clicked on the ad above. None of this information compromises any information about any particular individual. Many of these advertisements can also be helpful, as one who happens to be looking for a new interest rate can find a deal through the advertisement that he would not have found otherwise.
Web-based mail servers of any kind parse messages and reformat them to make them more suitable for display through a browser. Ad servers often scan through emails to provide advertisements, using software similar to what is used to detect spam. Rather than identifying keywords that are likely to be spam, the software finds words that may have advertising potential. All email service providers scan through emails, but this is not done by an employee looking through the emails you compose to a friend. A computer crawler, not a human, scans emails for useful words or phrases that advertisers can target. Additionally, companies scan emails in order to provide some of the very necessary features that users ask for. Email services scan emails to protect against viruses, filter spam and auto-respond to emails. The scanning does not save or aggregate personal information or share it with anyone else within the service provider company or outside of it. Greg Rae, a former Google engineer who worked on Gmail, reported, "There's major potential for something that would be a horrible invasion of privacy: storing data about the subject of people's e-mails to build up a profile on what sorts of topics they're receiving e-mail on. But Google doesn't do that" (IMHO). Service providers share relationships with users without compromising privacy, as personal information is not shared or stored. As no human sees a user's personal information, the user maintains privacy. Users have control of information, as the information is used in the way that they perceive and as is written in privacy policies and terms of use. These policies also rarely change, enabling a user to always expect the same level of privacy.
3.1 Perceived Privacy
Both consumers and advertisers perceive privacy issues in terms of information control. Milne and Rohm propose two dimensions of user privacy states (238). Much of this is based on whether the user is aware that their information is being collected and whether the user can control the use of their information. For example, if a user knows that names are removed when companies search through emails, the user is aware of the data collection. Milne and Rohm illustrate the consumer privacy states below.
Fig. 2 - Consumer Privacy States (Milne and Rohm 238)
Spam is a serious privacy issue on the Internet, as users are often not informed of the disclosure of their information for data collection and tracking purposes. Even when consumers are informed that their information is being disclosed, they can still receive spam if their control is low and they don't have options to stop it. If a user's control is high, they can willingly participate in opt-in and opt-out programs if they wish to receive the advertisements they do want.
Users have a perceived privacy of email that may differ from the privacy protections they are actually receiving. Users send and receive private information through email, and some take issue with email services displaying ads linked to information from emails, even if it is done by a computer. Some users fear third parties looking through their personal private information even if crawlers technically bypass it. Suggesting that users who are worried about what could be done with personal information should not use webmail services avoids addressing the privacy fears that some users legitimately have. For example, if one goes into the hospital only to find that he has a fatal heart condition, he understands that a nurse may have that information, but still expects it to be protected. When a third-party computer learns the same information from a person who emails a family member to say he has a fatal heart condition, users are less comfortable. These fears affect the way one views his own privacy and acts upon it.
Many of these fears hinge on perceived privacy. If a user views his email like a letter, he expects no one other than the receiver to view it. If a user views her email like a postcard, she expects that in transit it could be viewed by those other than the receiver. In the same example, if one perceives his email to be more like a letter, he surely won't expect the postman to open the letter up, browse through it for target words like "football" and give him the letter along with a Sports Illustrated advertisement. Users of email service providers may find targeted advertisements alongside their email to be a similar invasion of privacy. As privacy hinges on the idea that users can decide what is and is not shared, we must explore whether users are aware of the automatic scanning of their email. Cassidy, Kyle and Berman, in Can You Trust Your Email?, found that 32% of the web-based mail users in their survey were unaware that its contents were scanned for advertisements (82). The automatic scanning of emails for good (virus protection) and bad (obtrusive advertisements) is standard on the most widely used service providers and does not share any user's personal information. Although most users perceive that the crawling exists, some users are unaware of this use of their email despite terms of service agreements and privacy policies. It is the ethical responsibility of the service provider to notify users of how their information is being used. Email services are very clear about automatic scanning to fulfill this obligation.
In the same survey, users were asked whether or not they believed they owned their email; 83% of users responded yes (82). Yet terms of service agreements across the board state that the service provider owns the email. Hence, when the federal government serves Yahoo with a warrant to view a user's email in a criminal case, Yahoo is forced to give the email over to the government. Why are users often misinformed about what occurs with the privacy of their email? Often, the only clue to how their information is being shared is in the terms of service agreement and privacy policy. To ensure that users are aware of these guidelines, users must agree to the policies in something similar to the figure below.
Fig. 3 - Terms of Service Agreement (Yahoo)
Not all users read through each and every page of the policies they encounter, and as a consequence, many are uninformed. Service providers could do a better job of ensuring that users fully read and understand their policies, but at the same time, it is a clear step that users time and time again purposefully ignore. Often this is because many don't care about the privacy of the numerous accounts they sign up for online. Yet the implications of a privacy breach on an OSN and a breach of a nytimes.com account are vastly different; some online services hold little personal information at all.
Although email service providers often make themselves clear, perceived privacy and actual privacy do not regularly align. Users whose perceived privacy closely matches the actual privacy will have positive experiences with privacy protection on email services, while those who are unaware will come to distrust the service provider. When most are unaware of how their information is being used, it is the ethical responsibility of the service provider to better inform users to the best of its ability.
2.2 Trust
The perceived privacy that users have contributes directly to the trust they have in the email service they use. Chellappa and Pavlou conducted an empirical study linking perceived privacy to trust in information security areas (358). Trust in the user-service provider relationship can be thought of as the user's belief that the service provider will be honest and act in the user's interest. Trust can also be thought of as the user's intention to depend on the service provider even when the user cannot control it. Although many users depend on Gmail to send their emails to their intended recipients, one user alone cannot ensure that Gmail does this. While there is little control, a trusting relationship between the user and the service provider exists. To illustrate how important this trust is, the Culnan-Milne survey on consumers and online privacy notices reported that 65% of users had decided not to use a website because they were unsure of how their personal information would be used (23). When users have a trusting relationship, they intend to purchase and participate in services more often.
3.3 User Interface Design
The design of a website plays a key role in determining whether users will give their personal information to the online service provider. Convenience factors and enjoyment often contribute to user decisions to purchase a product or sign up for a service. In Trustworthiness in Electronic Commerce, Belanger, Hiller and Smith suggest that website attributes are often a deciding factor when users consider giving up their information (245). When email services inundate users with advertisements to the point that it affects their usage of the service, many users trust the service provider less. Other studies have similarly found that website design can become a critical factor not only in the user's decision whether to disclose personal information, but also in whether to return to the same site in the future for a similar service.
The website below features several advertisements: a banner at the top of the page, a banner on the right and two advertisements in the center of the page.
Fig. 4 - Website Design and Advertisements (Morgan)
To get to the content of the website, users would have to scroll past the entire first view of the screen. Often, when advertisements are obtrusive, users feel as if their privacy has been violated. If users do not feel that a website is pleasurable or convenient, their attitude can affect whether it is used in the future. The simplicity of Google's homepage below contributed to its early success as a company due to users' attitudes toward the service.
Fig. 5 - The Google Homepage (The J7 Network)
Google's simple search homepage was pleasurable to look at and convenient to use. Consequently, users felt safe using the service and continued to return to the site.
2.4 Consequences
Perceived privacy, trust and user interface design all affect whether users will decide to use an online service. These factors have consequences for users and can affect their behavior when using the service. In the case of email, one may very well change her behavior by not sending a salacious email to her husband knowing that someone else could be reading it or scanning it to load up advertisements. Behavior change in email is often the result of perceived privacy. Legally, it has taken a while for courts to catch up to privacy in email. In Warshak v. USA (2007), the U.S. Court of Appeals for the Sixth Circuit ruled that the government must move closer to viewing email like a letter rather than a postcard. For the years before this case, according to the Electronic Frontier Foundation, the Stored Communications Act (SCA) allowed the government "to secretly obtain stored email without a warrant" (Electronic Frontier Foundation). The court ruled that this warrantless search and seizure of email was illegal. Legally, users have the reasonable expectation that no other human being from an internet service provider is going to read their email. Users do not, on the other hand, have the reasonable expectation that a computer is not going to look through an email to filter it for spam or send a targeted ad their way.
The legalities of email ownership are simply defined within the letter of the law. Legislation has not strengthened the enforcement of property rights in relation to email; most of the claims that users have are based on their perceived privacy rather than law. Current law states that an "e-mail address is technically the property of the owner of the domain name to which it is directed--the @whatever in one's e-mail" (Phillips). This means that if you use an employer's domain name, the employer owns your email. The law works similarly for Internet service providers. Although email services, by law, own the email addresses on their domains, most do not claim ownership of the actual content of the emails. Google states that any material that one transmits or stores in his Google account will not be used "for any purpose except to provide you with the Service" (Google). Most email services have similar statements about intellectual property rights.
The legalities of email ownership leave other questions of email ownership unaddressed. When answering the question of who owns a specific email message, some might say the person who wrote it, the person who received it, or others depending on the legal relationship between the sender and receiver. While some have the expectation that an email will remain private between the sender and receiver, the disclosure of a private message could easily be an invasion of privacy. An unexpected forwarding of a message could interfere with a business relationship or bring unwanted exposure of its contents. In these cases, if the consequences of such an invasion of privacy are severe, one could pursue action through a civil case.
I will use the understanding of perceived privacy, trust, user interface design and consequences from email privacy to explore the differences and similarities between email and OSNs. Although many individuals have similar privacy concerns with email and OSNs, there is no single button on your email interface that would allow you to send an email to everyone you know. OSNs, contrastingly, can share personal information at a rapid speed with everyone a user may have a personal connection to. Whereas many are comfortable sending emails casually to friends and co-workers, concerns about privacy are decidedly heightened when users consider sending their financial information through the Internet. Many e-commerce services better understand the ethical obligation they have to their users. For, if they did not, users would not trust the service and would refuse to use it. As users, wary of fraud, are more concerned with the security of their financial information than with their email, analyzing the perceived privacy and trust of e-commerce websites will explore a different user perspective.
Chapter 3: Privacy on E-Commerce Websites
Privacy is an indispensable concern in the realm of electronic commerce. One cannot complete a transaction without divulging some of his personal information, such as a shipping address, credit card information or brand preference. Users who are uncomfortable with sharing this information on an e-commerce website may choose to not purchase anything at all. There is significant data on the magnitude of privacy concerns when purchasing online. While some may be more willing to share a list of their favorite movies on an OSN, users are extremely hesitant about sharing their financial information. Therefore, analyzing e-commerce privacy issues will allow one to compare a privacy issue about which most are more sensitive.
Whereas it is harder to distinguish between the types of information shared on email services and OSNs, the type of information shared on e-commerce websites is much more defined. It is important to consider online privacy as it relates to different types of information. Below is a chart comparing how comfortable users are giving out certain types of personal information.
Fig. 6 - Respondents who are always or usually comfortable providing information (Ackerman)
There are significant differences in comfort level across types of information. Whereas most feel comfortable giving out their email address, only a handful are comfortable giving out their credit card number online, and less than half are comfortable giving out their postal address. Users are much more comfortable giving out information that is typically seen on an OSN, including age, email address and favorite television show. The most sensitive types of personal information are often given out on e-commerce websites. Users have a significantly different perceived privacy on e-commerce websites compared to online social networks.
3.1 Perceived Privacy
Many have access to the business platform of e-commerce websites, and the perceived risk is different from the perceived risks on OSNs. Users are concerned both by the possible loss of money and by divulging personal information. As perceived privacy is often found to be the anticipation of how one's information is being collected, used and transmitted, some may think that the perceived privacy on e-commerce websites is fairly clear: e-commerce websites accept credit, shipping and billing information in order to send a user their purchases. Often, however, e-commerce websites hold much more information than what is typed into a browser after filling up an online shopping cart. "Data and web mining technologies allow online vendors to even distinguish between items that were simply looked at versus those that were included in the shopping cart without actually being purchased" (Chellappa). E-commerce websites are able to track which pages users go to and what they choose to purchase or pass on. Often, this information is used to deliver items the websites recommend based on items a user has browsed or purchased previously. Below is an example of how Amazon takes the data from what users browse to offer recommendations.
Fig. 7 - Amazon's User Recommendations (Amazon Homepage)
As I have previously browsed computer equipment on Amazon, I was given a list of similar recommendations. Although one might be aware that these recommendations exist, some users may not be aware of how Amazon stores or uses the same data in other ways. Amazon's privacy notice states, "We receive and store any information you enter on our Web site or give us in any other way" (Amazon Privacy Notice). Examples of this run from one's Amazon profile to wish lists, posts on discussion boards, special occasion reminders and address book entries. Amazon can also share this information with affiliated businesses it does not control and with third-party service providers. Additionally, it can use one's personal information to send promotional offers on behalf of other businesses. Despite the fact that users are free to look at Amazon's privacy notice, many do not and are therefore unaware of the disclosure of their personal information.
Whereas any prominent e-commerce website posts a notice of its privacy policy, users have a choice of whether or not to read it. TNS and TRUSTe published a consumer privacy index report finding that even when users were making attempts to reduce the amount of personally identifiable information they revealed when visiting websites, only 52% read privacy policies (12). Because users often do not take the time to read privacy policies even when they are making extended attempts to protect their privacy, many users are unaware of the way their information is used on e-commerce websites. Despite the fact that the perceived privacy of those using e-commerce websites is quite different from reality, trust is often very high for the best-performing online retailers.
3.2 Trust
Trust on e-commerce websites falls into the category of a consumer-vendor relationship. Many do not consider OSNs and email service providers to be in this relationship category because users are not paying for these services, and users view services they pay for and services they do not pay for differently in terms of trust level. Additionally, users who make purchases online have certain perceptions of a vendor's trustworthiness. Often this differs from offline consumer-vendor relationships: users who go to a physical Ikea store to purchase furniture may not trust purchasing furniture from Ikea.com in the same way. This difference may be due only to the medium of the transaction or platform, but it exists nonetheless.
This difference between media is reinforced by the various media reports that publicize hacking or the loss of credit card and other personally identifiable information. If users view the Internet as an uncertain environment, they are less likely to trust the components of their transactions compared to transactions in a real physical store. To continue the previous example, a physical Ikea store would have to attach cameras to every single person who walked into the store and process each item they picked up to browse along the way in order to garner the same information about consumers as it does online. Yet large online retailers like Amazon garner much consumer trust. Below are figures of consumer trust in various organizations.
Fig. 8 - Consumer Trust in Various Organizations (Truste 9)
Consumers view large online retailers like Amazon or eBay as just as trustworthy as offline credit card companies and even more trustworthy than large physical retail stores. Some studies credit this to the fact that many of the largest online retailers are trusted brands that were built from the ground up as consumer-focused (Chellappa 21). Unfortunately, this is not the case with all e-commerce sites, as many of the surveyed users reported using the most popular e-commerce websites. There are some e-commerce sites that consumers do not trust as much as the most popular retailers, and consumers are less likely to purchase from these other retailers when they have no experience with the service and know no one who does. When this occurs, users often rely on the design of the e-commerce website.
3.3 User Interface Design
The aesthetic design of an e-commerce website is integral to whether users will make a purchase. Design is a significant factor that can damage the trust consumers feel on e-commerce websites. Correct, timely and professional site design is the first thing most users notice about an e-commerce website. If a user happened to visit the website below, he would likely lose trust in the site due to its cluttered, ad-filled design.
Fig. 9 - LingsCars.com
Although this website does attempt to provide a personal touch, it misses the mark on many of the user-friendly features of more conventional e-commerce websites. Contrastingly, popular and successful e-commerce websites are decidedly simple, allowing easy browsing from product to product, like the Asos website below.
Fig. 10 - Asos
The lack of obtrusive advertisements allows a reader to trust that the website exists to sell its items, not advertisements, much like one would experience in a physical store. E-commerce websites can make positive, uncluttered user interface design changes to actively increase trust and brand connection with users.
Beyond design and advertisements, when websites have outdated information or do not list contact information, users are wary that a trustworthy individual is not behind the website. Users want to feel as if content is fresh and that a business owner is maintaining operations. If a website has spelling errors or links that do not lead anywhere, users often question the professionalism of the business. Moreover, many users make it a habit to only buy from websites that offer secure purchases. When users are not notified of a secure server, like the one below, they may decide to purchase elsewhere.
Fig. 11 - Amazon Secure Server Sign-In Page
Virtually all of the top online retailers have secure servers with secure checkout systems. The retailers that are not secure risk site visits, purchases and customer loyalty. There is much that users can be cautious about when purchasing online. Users can very easily choose to shop from Amazon to ensure a secure purchase rather than going elsewhere and being left worried about the unintentional disclosure of their financial information. Many choose to purchase from reputable online retailers to ensure the safety of their personal and financial information. Users can question the ethical intentions of online retailers based on as little as interface design. When we move forward to looking at OSNs, design will similarly have a critical role in how much control users have.
3.4 Consequences
Several of the factors discussed above have very real consequences for online retailers. Some users choose not to purchase items based on whether their information will be used appropriately. Many online retailers have distinct divisions focused on email marketing, and although some consumers appreciate informational emails, some online retailers have prioritized profits and taken email marketing too far. In early December 2009, online retailer Topbuy was formally warned by the Australian Communications and Media Authority (ACMA) following a breach of the Spam Act (Digital Media): after several users asked to be removed from its email marketing lists, Topbuy continued to send the users advertisements. Because the more legitimate online retailers are in the limelight, users often trust them to follow the letter of the law more than secondary retailers.
While trust is thought to be a prominent factor in e-commerce participation, the question of whether trust affects users' decisions to purchase or register for a service is less well defined. A study of young consumers found that "E-trust remained a significant predictor of intentions to participate in e-commerce" (Yang, Alicea and Clark 6). Among all kinds of users, "E-commerce scholars argue that consumers' lack of trust is one of the most important reasons why they would not shop on the Internet" (Yang, Alicea and Clark 6). When users distrust a retailer or are simply hesitant, they often choose not to participate. Specific aspects of the information provided by the retailer contribute to whether a user finds an online retailer trustworthy. Hesitation most frequently results from fear of financial loss from online shopping, a website that is not secure and fear of unwanted spam. This is exacerbated if the website does not list contact information and has old content, spelling mistakes, obtrusive advertisements or an insecure checkout system.
Although the most popular e-commerce websites hold a significant amount of consumer trust, websites beyond Amazon and Sears still face user scrutiny. For online retailers that have neither companion physical stores nor wide recognition online, user hesitation persists. As the most popular e-commerce websites are some of the most trusted organizations for keeping personal information secure, perceived privacy and trust will differ greatly when compared to OSNs. In the next chapter, I will delve into privacy on the most used social network, Facebook. There, privacy concerns are rampant, and users are not offered the wide variety of choice among social networks that they have among retailers to purchase items from. If all of one's friends have accounts on an online social network, it is much harder to switch to one that is more secure if the connections to friends no longer exist on the new network. Privacy concerns on OSNs are especially unique and differ widely from email and e-commerce worries. Issues on e-commerce websites are mostly focused on the security of financial information, which can create short-term problems while a consumer waits for financial security to be restored and a new credit card to arrive in the mail. Contrastingly, issues on OSNs can be brought on by over-sharing and the unintentional exposure of personal information. This can have long-term effects, such as harming one's personal relationships, making overexposure on OSNs a more significant problem.
Chapter 4: The Case of Facebook
Social networks, more so than e-mail or e-commerce websites, have very limited privacy, most of which users can control and some of which they have no control over. Some OSNs, as is the case with Facebook, have consistently eliminated aspects of control while claiming to give users more control than they had before. I will begin my analysis of OSNs by delving into one of the most popular websites on earth: Facebook. Facebook leads social networking in almost every country and is at the forefront of changes to privacy. By exploring Facebook's privacy policy, we can look at an example of the first establishment of trust between the user and the service provider.
Fair information practices call for OSNs to notify users of how their information will be used. OSNs use privacy policies to satisfy this requirement and to emphasize the importance of privacy to the company. This is the first legal step that many companies take when debuting a social product. Many have even employed the services of TRUSTe, which claims to "protect privacy and build customer confidence to work, shop and play online" (Truste). It hands out seals of approval for websites that protect privacy and promote online safety. Often, these seals are placed front and center on lower-ranking social networks, which also happen to be the slowest growing. As a result, the mentions of privacy on social networks remain just that - private. As I stated earlier, the rates of reading privacy policies and terms of service agreements are low, even when users are making an attempt to protect their privacy. If a user made an extraordinary attempt to protect his privacy, he might read a privacy policy for a social network to understand how his information might be used. OSNs cannot force users to read privacy policies, and it is the responsibility of the user to read and understand them. Yet Facebook has become notorious for changing its privacy policies roughly every nine months. If a user was a Harvard student who signed up for the service at its start, he would see a far different service today. By frequently changing the terms of the relationship between a user and the service, Facebook makes it more difficult for a user to be appropriately informed.
Facebook has made significant changes to its privacy policies and terms of service within the last few years. In January 2009, Facebook's terms of service stated that each user owned any content he posted and that Facebook would no longer have control over it if he deleted his account. In February, Facebook made a sweeping change to its terms of service, claiming that anything a user uploaded to Facebook could be used in any way, depending on its privacy setting, for however long Facebook wanted, even if the user deleted his account. Facebook changed its closed and strict network into one that claimed external rights going far beyond the rights of email service providers and e-commerce websites. In Facebook's privacy policy, the company is very clear about the aspects of user profiles made public. Users were once able to make everything on Facebook private, as it used to be a private network limited to users by university attended. Today, a user is no longer able to hide his name, profile photo, list of friends, pages he is a fan of, gender, geographic location and networks he belongs to. These facets of a user profile are considered publicly available to everyone. There are no privacy settings that can block disclosure of this information, and therefore users have little control.
Other information on Facebook is available to a range of contacts, from everyone on the Internet to just the user himself. When relationships between users on social networks deteriorate, OSNs avoid any and all responsibility for the actions of their users. Facebook's policies state, "You are solely responsible for your interactions with other users of the Application. We will not be responsible for any damage or harm resulting from your interactions with other users of the Application. We reserve the right, but have no obligation, to monitor interactions between you and other users" (Facebook). OSNs are filled with disclaimers, limitations of liability, privacy policies, terms of service agreements and contracts. Users can report violations of these policies, but there is often no formal process for resolving issues between users. If online social networks hold very little responsibility, then the protection of privacy is very closely related to the control of personal information. Users can expect next to nothing from social networks beyond the anonymization of some user information. Facebook claims to give its users what they need to control the information they want to share. Yet Facebook has continued to remove features of control that users were accustomed to having. I argue that when privacy is viewed as control, more often than not, a privacy policy change comes at a loss of user control, not a gain. Analyzing recent privacy changes to profiles, applications and search will better illustrate this recent loss of control.
4.1 Profile Privacv
A Facebook user creates a profile, which is displayed as his online identity. He might include photos of himself and his friends, music and activity interests, status updates on his life, location and notes. A user adds friends and interacts with them on a basis of his choosing. Social networks hope that users connect with as many other users as possible, as the more connections a user has, the more time he is likely to spend on the site interacting with friends. OSNs have relied on user control in order to execute policy and eliminate responsibility. OSNs often explain to users that they should have control over their personal information and that they should have access to the information that other users would like to share. How do these policies work on an actual OSN? Social networks expect users to control their information in order to protect their own privacy. Unfortunately, this task is extremely complicated, especially for the user who is just beginning. Below are the privacy settings one sees for the first time on Facebook.
Fig. 12 - New Facebook Profile Privacy Settings (Facebook Jan 2010)
There are many settings to go through and lists to make of friends who can see certain information. It is not a simple process. Facebook admittedly views privacy as control, as it often begins describing its own privacy settings with the word "control". In 2006, Zuckerberg stated, "This is the same reason we have built extensive privacy settings - to give you even more control over who you share your information with" (The Facebook Blog). Yet the additional settings have made it more difficult for users to execute this control. One has to actively change his privacy settings if he wants more privacy. If a user does not want anyone to see photos of himself, he has to change the "photos and videos of me", "photo albums" and "posts of me" settings. When trying to change each setting so that only certain people can see particular photos, it can become very difficult to ensure that each "friend" sees a user in the way that he wants. Social networking privacy controls have become so notoriously complicated that websites have featured how-tos to inform unaware users of what to do to protect aspects of their profiles. The table below shows which setting options a user must control for various people to access information.
Fig. 13 - Facebook Privacy Options
Even if users decide to limit access only to friends, some Facebook applications without access to one's personal information can still see some information that should remain fairly private. Users can make information available to everyone or apply custom settings. When one selects information to be available to "Everyone", that information becomes publicly available information for the world to see. Search engines can also index the publicly available information, and it could come up as a result when one searches for a name. If a user selects the "Custom" option, he can share information with specific friends or groups of friends. Unless the same user also changes his application settings, his friends and any applications they use will also have access to the information. This allows one user to make the information of others public, even if they have privacy settings which state otherwise.
The old privacy settings that preceded those in Fig. 12, in use before December 2009, are pictured below in Fig. 14.
Fig. 14 - Old Facebook Profile Privacy Settings (Facebook Oct 2009)
The most noticeable change in the privacy settings between Fig. 12 and Fig. 14 is what Facebook defaults to. All information was once available only to friends, as in the privacy settings above. Some privacy controls were later eliminated, and Facebook claimed that control increased for users. Personal, family, relationship, education and work information and posts now default to public, for everyone who has access to the Internet to see. Several controls were removed from the old settings to the new. Users were once able to control what groups of people could see their profile pages; anyone outside of the permitted groups would see a limited profile including a name, networks, and other information set by the search privacy settings. With this control removed, users must go to another set of privacy controls to specify who can see which specific kinds of information. This presents another barrier to user control. Facebook also removed the friends control, which allowed the user to decide who could see his friends list. The friends list was declared publicly available information by Facebook, making one's friend connections available to everyone on the Internet. When Facebook claims that control of privacy increases, it most often means that privacy controls increase in the number of buttons or settings. Actual privacy from a user's perspective often deteriorates with added settings and buttons, as the increased number of settings and pages makes them more difficult to navigate.
4.2 Application Privacy
An additional privacy matter for Facebook revolves around application privacy settings. Facebook runs various types of applications in addition to the social network, including games, quizzes and more. Many users are unaware that if a user runs an application on Facebook, such as the ever-popular Farmville, that application has access to data from the profiles of anyone that user is friends with. The Application Privacy Page aims to give users control over the information that applications run by others have access to. The recent privacy changes have altered this front as well. Previously, one would see the Application Settings below.
Fig. 15 - Old Privacy Application Settings (Facebook Oct 2009)
There are a few major differences between the old privacy settings and the new privacy settings.
Fig. 16 - New Privacy Application Settings (Facebook Jan 2010)
The Iirst maior diIIerence is that a user no longer has the option to opt out oI applications
entirely. Users once had a blanket easy option oI "Do not share any inIormation about me
through the Facebook API" (Facebook Privacy Settings). All users are now Iorced to share
inIormation with any applications their Iriends use. unless they take speciIic steps to block
particular inIormation. With proIile inIormation. users are immediately sent and notiIied oI the
privacy settings. The application settings are not immediately brought to the attention oI new
users. ThereIore. new users could easily unknowingly give out personal inIormation to
application developers.
The second major difference is that Facebook has increased the information that is always public. With the old settings, a user was able to uncheck all of the boxes and would share only his name, networks and list of friends. With the new settings, if a user unchecks every box, an application a friend uses still has access to the user's name, profile picture, gender, current city, networks, friend list and pages the user is a fan of. Additionally, with Facebook Connect, users were able to log in to an external website and see which of their friends had also connected to the website previously. Users were able to track the movements of their friends across the entire web, not just Facebook alone. In April 2010, Facebook introduced the new "like" button across the entire web. When a user "liked" a webpage or blog post on the Internet, the new feature would report this to his Facebook profile. Users no longer needed to connect; Facebook did all the work for them. The lack of user control in application privacy increases with every update to the privacy policies. If a privacy policy is updated, a user will be sharing more than he was previously, and not necessarily with his explicit permission. Application privacy settings change without the meaningful consent of users, as most are not actively agreeing to share their information with application developers.
4.3 Search Privacy
When one searches for someone on a search engine like Google, a Facebook profile often comes up on the first page of the search results, making it easy to find. Facebook recently made changes to the way that profile information appears in Internet searches. Below are the old and new search privacy settings.
Fig. 17 - Old Search Privacy Settings
Fig. 18 - New Search Privacy Settings (Facebook Jan 2010)
The most important difference between the old and new privacy options for search is that the new settings allow for search indexing as the default option. People are able to see any publicly available information as it is listed on Facebook when searching on Google or Bing. While this alone isn't new, the fact that Facebook now indexes a host of personal information such as wall posts and status updates is new. Previously, this public information was not indexed for search. Both Bing and Google announced in late 2009 that their search results would include public wall posts and status updates in real time. As soon as someone makes a public profile change, it is updated and available for a non-Facebook user to find. The change in information that is now made public is illustrated below.
Fig. 19 - Old Public Profile View (DOT Rights)
Fig. 20 - New Public Profile View (DOT Rights)
Facebook now allows anyone to see the personally identifiable information of anyone who has registered for the social network. Effectively, this has made part of what was once available only to close friends available to the world. Many of Facebook's recent changes have made it more difficult to maintain privacy, especially if one joined Facebook with the impression of joining a private network. Through the privacy settings outlined in this chapter, it is clear that Facebook is eliminating control rather than making it more transparent and available. After describing and analyzing the limited privacy settings of the foremost social network in the world, I will now move on to look at how users on social networks are pushed in the direction of revealing more information than they might otherwise.
Chapter 5: Choice Architecture
Facebook operates with an opt-out system for privacy control, which works with choice architecture to persuade users to share more information. Choice architecture describes how decisions are influenced by the way choices are presented. The term was originally coined by Cass Sunstein and Richard Thaler in Nudge: Improving Decisions about Health, Wealth, and Happiness. Sunstein and Thaler suggest that the organization presenting a choice can arrange the default outcome to be the outcome that it desires. For example, systems of organ donation in countries such as Britain have opt-in policies. Britain's system yields about 30 kidney donations per million population. In Austria, with an opt-out policy, about 53 kidneys per million population are donated (Taylor). Austria assumes that people want their organs to be donated unless they have previously objected. This yields a positive outcome of more lives saved due to increased donations. If a government sought to increase organ donation, it would likely create an opt-out policy to achieve its goals. Sunstein and Thaler often speak of how choice architecture influences people to act on certain behaviors that have positive effects. For example, a school cafeteria could push its students to choose healthier foods by putting fruits and vegetables at the front. Yet there are plenty of organizations that aim to use choice architecture to get people to spend more, click more and, in the case of online social networks, reveal more about themselves. In the previous chapter, I outlined the many changes Facebook made to its choice architecture. These changes decreased overall user privacy and control, having negative effects on users.
Unlike eating healthier foods or donating organs after death, sharing more isn't always in the best interest of the user. A user could easily share more than he expects when Facebook builds barriers to user control. OSN choice architecture presents ethical concerns, as the service provider is rarely pushing the user toward what he wants or what is best for him. There are many elements of choice architecture that can be applied to OSNs. The most prominent elements in the case of OSNs are design, paths and environment. In this chapter, I will analyze how OSNs use choice architecture to persuade users to share more by forcing them to opt out rather than opt in. Pushing users in a direction that they would not choose otherwise, and increasing the likelihood of a negative outcome without a user's meaningful consent, is an ethical concern.
5.1 Choice Design
Choice design has an integral role in how users make choices. Thaler and Sunstein claim the default option is key when users are selecting among several options. Whatever an organization chooses as the default option has the highest likelihood of being selected. Just as organ donation is high in a country that defaults to donating, an OSN that defaults to public is much more likely to have users who remain public. Fast food restaurants have managed to make a profit off of similar choice design. It is very easy to stick with the menu and order a combo or value meal, but if a customer decides to have just a burger, he must explicitly ask for it. Customers, in force, would rather stick with what's easier and order combo number 5. The choice design of Facebook's privacy settings made it clear what it wanted its users to do - keep information public. Anything else would require many extra steps, with separate pages followed by still more steps. Additionally, when Facebook took away many of its privacy settings, so that a user could no longer hide his name or profile photo, it changed the design by eliminating the choices users once had. While users were required to go through several steps to protect their privacy, it was often too much for them to keep track of.
The addition of Facebook's "instant personalization" feature across the web, in April 2010, is an opt-out feature that is very difficult to opt out of. When Facebook launched the feature, it claimed "You can easily opt-out of experiencing this on these sites by clicking here" (Axon). Users were taken to their privacy settings and told to unclick "Instant Personalization" in the settings below.
Fig. 21 - Instant Personalization Settings (Axon)
When a user unclicked the box, the pop-up below asked the user to confirm this decision and told him that he might miss out on a "richer experience" when browsing the web.
Fig. 22 - Are You Sure? (Axon)
Furthermore, it stated "that if you opt out, your friends may still share public Facebook information about you to personalize their experience on these partner sites unless you block the application." Therefore, to block friends from sharing information, one would have to block each and every application that wanted to personalize one's experiences on the web. To find out how to do this, the user would then have to go elsewhere to "Learn More" and finally go to each application and block it. This was not an easy endeavor for users. The more tasks and windows a user must go through to execute control, the less likely he is to do it. An opt-out system that makes it more difficult for users to keep application developers, friends and everyone else from seeing their personal information is one of the worst practices for enabling users to protect their information.
5.2 Choice Pathways
The sequence and placement of choices highly affects which decision users will make. In addition to the default settings, when one decides to take the step of changing a default setting from "Everyone" to a choice that may be more private, he is given the pathway below.
Fig. 23 - About Me Privacy Pathway (Facebook Jan 2010)
Each choice below the default setting is only slightly less public than the one before it. A user who wants to be significantly private would have to choose the privacy setting of "Only Friends", second from the bottom. Additionally, when a user clicks on his privacy settings, he is directed to the menu below. This then leads the user to five other privacy setting pages, depending on the area he wants to control. Just to edit the privacy settings on a photo album, one would have to click "Privacy Settings", then "Profile Information", then "Edit Settings" under "Photo Albums", and finally find the album and edit the settings.
Fig. 24 - Privacy Settings Privacy Pathway (Facebook Jan 2010)
Each step required of the user makes it more difficult to decide on the changes he wants to make. Facebook describes its privacy settings with the word "control" under every pathway. Although Facebook claims to give users the tools to control their privacy, users are consistently losing control over time and facing higher barriers to execute it.
5.3 Choice Environment
The surrounding environment and the cues that users see when considering choices highly affect their outcomes as well. When patrons at a restaurant open a menu, their eyes typically first go to the top of the page on the right. This is often where menu designers place the items with the highest profitability, as these items are more likely to be chosen (National Restaurant Association). Similarly, there are certain hotspots where people view what's on a website. Below are various screenshots showing the hotspots where users read web content. Users typically view webpages in an F-shaped pattern, with data results fairly consistent across various types of websites.
Fig. 25 - How Users Read Web Content (Nielson)
Facebook's choice environment for privacy is fairly hidden on the website. In fact, when you first create an account, you are never told that privacy settings even exist or where to find them. The choice environment does not alert the user to privacy issues, which are an important part of determining how others view this user. Currently, a user must do some poking around and click through to "Account" at the top right of the page to get to any privacy settings. To "unlike" a page or remove a user from one's friends, the user must go all the way to the bottom left of the page. This area is not as prominent as the top center of the page. Even when going through the privacy settings to control aspects of a user profile, some privacy settings are still harder to find. When Facebook updated its privacy settings in late 2009, many did not understand how online activity with friends was posted on the news feeds of other users. Soon after, there were many questions from users asking how they could control this new notification. Facebook had to post the notice below to its help page to inform users that this control no longer existed.
Fig. 26 - Information About Recent Activity (Facebook Help Center)
I argue that through the choice architecture described above, Facebook has persuaded users to do what may not have been in their best interests. The data on choice architecture shows that users may lose privacy they would otherwise have had, often without being aware of it. Furthermore, by making it more difficult for users to execute control through various opt-out procedures, users are not given the amount of control the provider promised when they entered the user to service provider relationship. Many users who are never alerted to check their privacy settings or change them in any way remain with the default setting. Even when users decide to change privacy settings, they are bombarded by more settings and pages to wade through until they find what they need. When users have been given little opportunity to refuse consent to share, they have no choice. Without choice, OSNs are not executing fair information practices, leaving users with little chance to protect themselves. In the next section, I will discuss how real users perceive their privacy on OSNs, to determine if they are harmed by a perceived privacy that differs from their actual privacy.
Chapter 6: Perceived Privacy
While some users perceive that their privacy is more limited than it once was, others are left unaware or simply confused. Given that social networking privacy controls are complex, what expectations should users have of their privacy, knowing that they're entering into agreements that OSNs can change whenever they feel like sharing more of their users' information? Some believe that users on OSNs are highly aware that personal information is made public for friends, family, employers and law enforcement to see. Others, such as Canada's Privacy Commissioner, the American Civil Liberties Union and the Electronic Frontier Foundation, disagree. They found Facebook's changes to be confusing and intrusive (Bankston). Chad Skelton, a reporter for The Vancouver Sun, was surprised that Facebook violated the terms of service that he originally agreed to, when Facebook was a much safer place. Skelton claimed, "When that safe, private place is violated -- by, for example, the owner unilaterally showing your kids' photos to any creep that walks through the door -- it makes you pretty damn angry." Perceived privacy will be different for every user on every OSN, as there is no standard privacy line, but there are some expectations users can legitimately have based on information provided by the OSN.
I previously defined perceived privacy as the anticipation of how one's information is being collected. Despite whatever confusion or outrage users might feel when they enter into an agreement by clicking "yes" to a privacy policy or terms of use, the expectation of privacy should be as you make it. If users set all or part of their privacy settings to public, they should expect that friends and the public see just that. I'll use the issue of law enforcement perusing social networks to illustrate a more extreme example. If one has a public profile, what he says is available for law enforcement to look at. Just as when someone commits a crime in broad daylight for the world to see, the public must have the same expectations of a law enforcement response if they post something to a public profile. If something questionable is private, users should have the expectation that law enforcement will acquire a warrant. This would be a similar response to someone committing an illegal act in the privacy of his own home. Moreover, if a user posts a questionable photo of himself and has made it available for a prospective employer to view, he cannot expect an employer not to look at it. Employers have used social networking websites to filter candidates for jobs. Whether or not this is ethical, it is an expectation that users with public photos should have.
Users are responsible for and have control over what they show to the public, but users can make information or content about someone else available as well. If a friend has the ability to post a questionable photo of you, what do you do to prevent this? Across the board, websites that educate users on appropriate user privacy controls agree on the solution: the best way to prevent questionable content from appearing on your OSN profile is to ensure that you do not make bad judgements in your personal life. In a world of constant publicity, the best solution for changing the way one appears on OSNs is often to change how one appears in real live social networks. When users opt into private networks, they have a legitimate expectation of privacy. As more and more social networks default to public, users should have a limited expectation of privacy if they sign up for a public OSN.
While users should have certain expectations of privacy, these differ drastically from the actual expectations of privacy. In a 2009 study conducted simultaneously in Canada, at Ryerson University, and in the United States, at the University of Miami, social networking users responded to questions regarding expectations of privacy and the protection of personal information placed online (Levin and Sanchez). The graph below shows that most users do adjust their own privacy settings online, but a significant number remain on the default setting.
Fig. 27 - Users Adjusting or Defaulting on Privacy Settings (Levin and Sanchez)
It's very possible that users may legitimately want to keep the privacy settings that developers give them. OSN users are often very technologically savvy and understand how to adjust their privacy settings. On the other hand, as OSNs broaden their reach, many users who have a limited understanding of the implications of the lack of user privacy are creating accounts for the first time. Some of these users are likely under the false impression that they are protected when they truly aren't. The graph below shows users who responded to the statement "I see myself as someone who takes appropriate steps to limit who has access to my profile."
Fig. 28 - Views on Access to Profile (Levin and Sanchez)
This shows that many users may interact on social networks without understanding who could be looking at what. These users are unknowingly leaving their profiles open to an unintended audience and are likely to be defaulting on public or leaving most of their settings open to everyone. In the following graph, users responded to the statement "I believe I am able to take appropriate steps to control what is posted about me on my Online Social Network."
Fig. 29 - Control (Levin and Sanchez)
Only 38% of users believe they can take control to protect their privacy. Some may argue that the previous data only showed that users were careless and actively chose not to change their settings despite having the ability to do so. Although social networks seemingly place all the controls in the hands of the users, some users clearly don't fully understand the controls and leave themselves open to violations. Facebook, LinkedIn and Twitter boast of tools to help you protect your privacy, but the more tools some users have, the more confused they become. On Facebook alone, a user can control the photos he posts to his wall, the full albums he posts and the photos others post of him. There are three separate privacy tools for just one feature. "Fundamentally, privacy is about having control over how information flows," said Danah Boyd, a social media researcher at Microsoft Research New England, in March 2010. "When they feel as though control has been taken away from them or when they lack the control they need to do the right thing, they scream privacy foul" (Kang). Many users are unsure of the risks of limited privacy and truly question their ability to control it. Beyond that, there is little evidence that the developers of OSNs have a strong concern for user privacy, as many continue to push the envelope of making more information publicly available. OSNs have a strong interest in eliminating user privacy: the more information people share, the more content there is for users to explore. As users browse content for longer periods of time, OSNs can serve more advertisements and generate more ad revenue. More information equals more money for OSNs, which creates a serious conflict of interest between the user and the OSN. A user may want to share less to remain private, yet an OSN wants more time on site for added revenue.
The conflict of interest became more apparent when Facebook proclaimed it was taking a few steps forward by updating its privacy policy to make more information public. In a soon to be released study for the Pew Center's Internet & American Life Project, the center found "most people said they cared greatly about online privacy but they didn't do much about it" (Kang). As the choice architecture of many online social networks suggests, developers are making it very clear that they want users to share more, often narrowing the perceived privacy and control that users have in order to further company interests. As users have consequently been losing privacy and control, many developers have slowly eroded consumer trust along the way.
Chapter 7: Trust
Perceived privacy contributes directly to trust for online social networks in a similar fashion as for e-commerce and email services. When a user trusts an OSN, he believes that the developer will be honest and act in his best interest. Even when a user cannot control what an OSN does, a trusting relationship assumes that the user intends to depend on the OSN.
In an attempt to gain user trust, Facebook announced Facebook Governance in February 2009, an effort to involve users in developing its terms of service. In April of 2009, Facebook Governance allowed users to vote on its new Terms of Service, hailing that it had received over 3,000 comments from users on the changes. Unfortunately, Facebook claimed that it would only hold a vote on a proposed change if at least 7,000 people commented. Furthermore, the vote would only be "advisory" unless 30% of its active users participated in the voting (Axten). Considering Facebook had over 200 million users at the time, it would take 60 million votes and 7,000 comments to effect any change whatsoever (Facebook Statistics). When the voting period was complete, the Los Angeles Times called the Facebook governance vote a homework assignment no one did (Sarno). Only 0.32% of Facebook's users voted, far from the 30% needed to enact any change. While Facebook attempted to be democratic with a high voting threshold, it's hard to have a democracy without participation. Most services don't try to run in a democratic manner. Services respond when users scream, write blog posts about their dissatisfaction or shower a company with feedback emails.
In February 2009, Mark Zuckerberg responded to privacy concerns raised by The Consumerist. Facebook's newest terms of service made it seem as if the service could do anything with content one uploaded, even if the account was later deleted (Walters). Zuckerberg responded, "In reality, we wouldn't share your information in a way you wouldn't want. The trust you place in us as a safe place to share information is the most important part of what makes Facebook work" (Sarno). Soon after, Facebook began to survey users about how "open" they were. Were they open to sharing their information, or did they only share information with friends? The survey below then continued to ask users more detailed questions about their openness.
Fig. 30 - Facebook Openness Survey (Axon)
In March 2010, Facebook introduced a new set of privacy policy changes it wanted users to comment on. In Facebook's redline document of its newly proposed changes, it began to make room for its newest, yet-to-be-released privacy settings by eliminating more control. In Fig. 31, much of Facebook's privacy protection language has been scrapped.
Fig. 31 - Facebook Redline Privacy Changes - 2010 (Facebook Site Governance Documents)
Users were told to comment on the proposed policies within seven days of them going public. Although Facebook has grown to over 400 million users since its last privacy policy change, it received only 4,000 comments, compared to the 3,000 comments it had received when it was half the size. Why are users not voting? Are they placing their trust in the company? Hardly. First, Facebook doesn't make the changes easy to find. Below, Fig. 32 shows that users would have to go to their messages and then to their updates to find any alert that Facebook was changing anything whatsoever. Users are not alerted to the changes with an interstitial when they enter the site, or even with a notification. For users with many updates and messages, the Facebook update could easily be buried beneath them all, unchecked, by the time the seven-day period was over.
Fig. 32 - Facebook Privacy Policy Alert (Facebook Site Governance Documents)
Second, most users don't have an interest in reading the redline documents and proposed changes, which can take a surprisingly long time to get through. The barrier to reading the documents in full, understanding them and then following up with a constructive comment on the Facebook Governance site is high. Third, users don't fully comprehend the consequences of their actions. As Facebook continues to change its privacy policies and terms of use, users begin to see the changes as white noise. The more changes there are, the less likely users are to find or respond to them. Erick Schonfeld, a reporter for TechCrunch, wrote, "It is difficult to trust a company that is stripping users of rights they've become accustomed to. If I upload a picture which I later regret uploading, why shouldn't I be able to erase it from Facebook forever, even if some of my friends have already seen it?" Because Facebook once gave users options to control and make everything private, it erodes trust when those controls disappear despite company assurances. I argue that this is another violation of the practice of choice, as users are not given the opportunity to refuse consent to share information. If one wants to take a photo off Facebook, it should be eliminated from the servers in an appropriate amount of time. When a user ends a relationship with an OSN, his information should be deleted rather than stored for an extended period of time, as the user is no longer giving the OSN consent to share. Furthermore, although users are given the opportunity to effect change through voting, the constant policy updates make it more difficult for users to realize they can have a real effect. Users become apathetic to the act of voting when change is seemingly impossible. The high threshold to effect change is also nearly impossible to reach, discouraging users from participating in the "democratic" OSN whatsoever. Whether or not a user trusts an OSN is affected by the user's attitude toward the site. In the next chapter, I will cover user interface design, which contributes to the user's experience and actions in executing his privacy.
Chapter 8: User Interface Design
8.1 Design Principles
The attitudes users have about OSNs are heavily rooted in its design. Facebook has several
design principles which guide its product design. user interIace. user experience and
communication to users. The principles are outlined below.
Fig. 33 - Facebook Design Principles (Facebook Design)
While Facebook claims that its design must be transparent. clear and upIront. many oI the
consequences oI Facebook's design make it more diIIicult Ior a user to Iigure out how to share
less or become more private. Facebook claims to streamline its design and Ior much oI the user
interIace this is true. Yet. I argue that privacy settings are most oIten conIusing. limiting the
ability Ior the user to have control. Without control. users cannot protect their privacy. The more
buttons Facebook adds to the privacy settings page. the less control users have as they must wade
through endless options. Much oI Facebook's design is Iocused on simplicity to make it easy to
share. In contrast. privacy are settings are complicated. purposeIully making to it diIIicult to opt-
8.2 Profile Design
Facebook's clean and consistent profile design is one of the core elements separating it from Myspace in the dying battle for the social networker. Going to both sites brings you to a similar sign-up page. Yet when you view a profile on Myspace, the personality and design of an individual's layout is obvious. Below are two Myspace profiles side by side, each very different and indicative of the user's attitude and personal preferences.
Fig. 34 - Myspace User Profiles (Google Images)
The Facebook profile design, in contrast, is similar across all user profiles. The design, as seen below, is cleaner and doesn't leave as much room for the user to experiment.
Fig. 35 - Facebook User Profile (Facebook)
Additionally, the kind and amount of information a user shares is different. A user can share vastly more information on Myspace than on Facebook. The kinds of information shared most by people on OSNs are below.
Fig. 36 - Extent of Information Included in Online Profile (Levin and Sanchez)
Why are some types of information shared more often than others? OSNs ask for certain sets of information when one sets up a profile, making it more likely that users will contribute that information over other kinds. Facebook asks for users' real names, whereas on Myspace it is much more common to create a profile with any nickname or fake name that a user wants. High school, hometown and profile photo are asked for on both Myspace and Facebook, and most users share these as identifying factors that make it easier to connect with others. OSNs also give design cues to users, asking for the information they want them to share with others. When you are about to upload your profile picture to Facebook for the first time, the website suggests that you upload a real photo of yourself so that others can find you. Much of the information asked for is highly personal, but that also makes it more likely that others will want to connect - to find out more personal information about their friends.
8.3 Advertising Design
In 2007, Facebook launched Beacon, an advertising system that sent data from external webpages to Facebook. When users purchased an item from Overstock.com, for example, a pop-up like the one below would appear saying that Overstock.com was sending their purchase information to Facebook.
Fig. 37 - Overstock.com Notification (Zuckerman)
The user would have about ten seconds before the window would disappear. The information would then appear on the user's Facebook wall, telling friends what the user purchased and where he purchased it from. There was no purchase history opt-out feature that would block all purchase history information from being linked to a user's Facebook profile. This information was highly sought by advertising companies. Any company that sells products wanted to know more information about its customers. Through Facebook, advertisers could find the age, sex, sexual orientation, friends, groups and social habits of their customers via Beacon. When users wanted to stop companies from posting their information, they had to go through hard-to-find settings that required the user to click "See More", then "Edit Settings", and then tell Facebook that he did not want to post information from Overstock.com to Facebook.
The Beacon program was impeccably designed. At every step, the motivations of users were pinpointed. The screens that popped up made the default action seem like the best one. It was much easier for a user to just let Overstock post the information, rather than go through the steps to ensure that each and every company a user purchased from did not post to his profile. Users fought back fiercely against Beacon, claiming that Facebook had stated it would never sell user information. User demographics, to many Facebook users, are user information. Many users claimed that age, interests and other things about themselves that they freely posted for friends to see made them who they were. Facebook was selling this personal information whether or not it had a user's name attached to it. Facebook implemented Beacon as an opt-out program only. In August 2008, a class action lawsuit was filed against Facebook and the third party companies who participated in the program, as personal information about users was released without the users' permission.
Similar design strategies from the 2007 Beacon snafu have allowed quiet privacy changes to
receive little attention. In 2007, Beacon was ahead of its time; users were not ready to share
that much information. By comparison, Facebook profiles then had robust privacy settings, allowing
even the most private person to use the site without concern that someone unwanted could find
him or see his profile picture. In 2009, much more information became publicly available, and
users were not given the ability to hide it. Advertisers are now given similar aggregated
demographic information about the users who click on Facebook advertisements. While
Facebook is still giving some of this demographic information to advertisers, users do not
notice, because clicking an ad produces no pop-up screen notifying
them of where their information is going. I argue that when this is the case, there is little for
users to fight back with, since they do not know what information is being shared with advertisers.
Facebook is slowly taking away features of the profile privacy settings in order to make room for
Beacon-like advertisements, unbeknownst to the user, and has limited its disclosure and
transparency on the sharing of user information.
Facebook's Beacon-like privacy violations seemed all too familiar with the announcement of
the "Instant Personalization" feature. This was an opt-out feature that users could only fully
remove themselves from by going through several steps and blocking individual partners. If one
decided to connect with one of the partner websites and then remove himself, the application would be
able to hold onto his information forever. This drastically expands the powers of third-party
advertisers, when users did not necessarily sign up to interact with these other websites but
only with Facebook itself. These slow but progressive design changes, limiting a user's exposure to
pertinent notification about where his information goes, make it much easier for OSNs to share
user information without receiving user backlash. When OSNs fail to appropriately notify users
of how their information is being shared with the service provider and third parties, it is an added
slight to user control.
Chapter 9: Ethical Information Privacy Practices
9.1 Legalities
Although I do not intend this to be a review of the legal ramifications of OSNs, users are often
uncomfortable with services that violate active privacy laws or find ways to skirt the law. This
feeling can often affect one's ethical view of an OSN. Facebook's Beacon advertising program
made users look at the site with a very critical and legal eye. "The bottom line," MoveOn
spokesman Adam Green said in an interview with CNET News.com, "is that no Facebook user
should have their private purchases online posted for the entire world to see without their explicit
opted-in permission" (McCarthy). In 2007, many organizations made it clear that "opt-out" was
not the ethical or legal setting for Beacon advertisements. Sensitive purchases could easily show
up in one's news feed for all of a user's friends to see. Proponents of Beacon claimed that the
privacy settings could remove any purchases from a user's wall and the news feeds of others. Yet
unknowing users claimed that the opt-out was hidden and difficult to find. Furthermore, the lack
of a universal opt-out meant one had to opt out again with every e-commerce site one used
that partnered with Beacon; one would have to continuously jump through several hoops to
ensure his privacy. Facebook eventually eliminated the Beacon program due to these concerns
and paid $9.5 million to settle the lawsuit. Yet many of the Beacon features that
privacy proponents despised are being implemented in other Facebook advertising programs.
The "Instant Personalization" feature implemented in April 2010 received similar complaints for
how Facebook automatically opted users in to sharing their information with Yelp, Pandora, and
Docs.com. Other Beacon-like settings have been implemented in the profile settings. Below are
the Facebook privacy settings that are automatically made public when a user creates his
profile. If one chooses to be more private, he must opt out of the public setting.
Fig. 38 - Privacy Settings Opt-Out Status
Another similarity to the Beacon advertising program is how far a user must go in order to
change his privacy settings. In 2007, privacy advocates were horrified that it took two or three
steps to opt out. When changing the privacy settings today, it may take five or six steps to
update a single setting.

Additionally, just as with Beacon, a user cannot universally opt out of all privacy features. For
example, when a user writes on a friend's wall (or socially interacts with others in many other
ways), that behavior is recorded on both the friend's wall and the user's own wall. Below is an
example of how the behavior appears on the user's wall.
Fig. 39 - Recent Activity on Facebook (Facebook Jan 2009)
Facebook posts all of a user's social interactions across the site on the user's wall, meaning that
someone can easily track the user's interactions with others, the photos he has commented on,
the events he is planning on attending, and so on. If a user wants to remove this behavior from his wall,
he has to remove each activity individually by clicking the "remove" button below.
Fig. 40 - Remove Status Comment (O'Neill)
Additionally, even when one clicks "remove" and clearly intends not to share this
information, Facebook claims it may still show up in the news feeds of other users for them to easily
find. The lack of a universal privacy setting makes it very difficult for a user to share
information only with those he intended. This requires active privacy patrolling from the user
multiple times a day, which is neither convenient nor sensible for a user who cares about privacy.

To some, the comparison of Facebook's Beacon advertising program to the latest privacy
changes may be a stretch. Yet Facebook faced numerous legal issues with the Beacon program
for being opt-out only, for making the privacy settings difficult to change, and for revealing information that
some may not have wanted to share. Facebook has implemented the same strategies today to limit a
user's ability to control his information. Others have also scrutinized the privacy settings
Facebook offers. Canada's Privacy Commissioner called out the service for having "privacy gaps
in the way the site operates," in that there is a clear way of deactivating one's account but no way
to actually delete it (McCarthy). Facebook can retain data from deactivated accounts for an
indefinite period of time, which violates Canadian privacy law. Facebook took some steps to
better inform users of the existing privacy settings, but the privacy settings themselves did not
receive a makeover. The legal issues faced by OSNs will continue to come into question as more
personal information is made public. OSNs will continue to change privacy policies and terms
of use, making later privacy setting updates legal. The legalities of privacy are cut and dried and
vary by country or state. The more interesting concern is whether OSNs are handling privacy
ethically, rather than legally.
9.2 The Abuse of Ethical Information Privacy Practices
I have explained the several ways in which Facebook has made it more difficult for a user to
monitor and change his privacy settings. By requiring users to opt out of many of their privacy
settings, Facebook has created a barrier to users having and exercising control over their personal
information. Earlier, I defined privacy as control; as Facebook users have consistently lost
control, they have also lost privacy. Whether OSNs behave ethically or not should be
determined by the relationship of trust that the user has with the OSN. When Facebook
communicates the rules of the relationship through its terms of service and privacy policy, it
gives users the opportunity to decide whether or not to trust it. When one has privacy, he has the
ability to control the flow of his own personal information. I argue that control can be achieved
through ethical information privacy practices. These practices are often enacted into law, which
continues to evolve and change with the expansion of OSNs. There are four criteria that must be
satisfied in ethical information privacy practices. These criteria are derived from various privacy
directives and acts and include the following:
1. Notification (an OSN must notify a user when it collects the user's private information)
2. Choice (the OSN must offer a user the opportunity to refuse consent to share information)
3. Use (the OSN may only use the information for the purpose for which it gathered it)
4. Security (the OSN must ensure that the criteria above exist to protect the user)
By analyzing the design and policy changes Facebook has made against the criteria above, we
can determine whether the OSN meets basic ethical privacy standards.
1. Notification and Use
When Facebook began, it was a private network with a privacy policy ensuring that all information
could remain private at the choice of the user. Its privacy policy had elaborate protections, under
which Facebook gained much of its user base. Facebook later released several new privacy
policies that removed many of those protections, after much of its user base was invested in the
service and the company had beaten out most of its competition. Users see the privacy policy and
terms of use as a promise; they explicitly outline how an OSN will use one's information. By
constantly changing privacy policies and terms of use, the endless notifications become white
noise. Additionally, at times Facebook neglects to notify users at all when they are
automatically opted in to new services and user data is compromised. Notifying users of how
their information will be used after it has already been exposed to third parties (as in the
case of applications and the Instant Personalization feature) does not meet the notification
standard. Furthermore, by using information that users did not intend to share with third parties,
Facebook has shared information beyond the purpose for which it was originally gathered and has
also violated the use standard.
2. Choice
When privacy policies are changed, it has a significant impact on the ability of OSNs to
make believable promises. When users consent to these new privacy policies, it is not
necessarily meaningful consent. If an OSN requires that a user share information and refuses to
serve him if consent is not given, the consent is not meaningful. When privacy policies are
changed without the meaningful consent of the user, it is destructive to the relationship of trust a
user has with an OSN.
When Facebook releases a privacy update or website design change, a limited number of users
often protest for various reasons. When OSNs receive considerable backlash for behavior that
may not be ethical, they take action. Mark Zuckerberg claimed in February 2009 that users trust Facebook as a
"safe place to share information" (Walters). When Facebook updated its
privacy settings just ten months later, Zuckerberg had a very different view of his own service.
"The 25-year-old said that, in the seven years since he started the company, people have really
gotten comfortable not only sharing more information and different kinds, but more openly and
with more people - and that social norm is just something that has evolved over time"
(McCullagh). Zuckerberg defended the company's decision to push users to share more, saying,
"we decided that these would be the social norms now and we just went for it" (McCullagh).
Facebook pushed even further by eliminating the ability, and creating barriers, to privatize personal
information. Below is a description of how the barriers to user control I have previously described have
several negative consequences, making it difficult for users to have a choice in matters of their
privacy on an OSN.
Fig. 41 - Barriers to User Control
Facebook may claim that it is making it easier to share information with other users. Yet the
intentions behind the above design barriers are ethically questionable. The conflict of interest between
the OSN and the user explains why Facebook would make these changes, as more information
equals more money for OSNs. Whereas a user wants to control his information through a
straightforward privacy-setting system, Facebook does not prioritize the best experience for the
user. By placing several barriers in the path of its own users, Facebook demonstrates unethical
intentions, actively using choice architecture against the best interests of its own users. The
design implementation disregards the best interests of the users, who end up confused and
powerless, wading through endless changes and limitations.

In the few years that OSNs have existed, some have argued that users have become more
comfortable with giving up privacy. Is this because users no longer value privacy, or because
developers take user privacy away and there is little users can do about it? Participating in
various online services, such as YouTube and Flickr, means we make a choice to give up some
privacy in a trade-off. This trade-off is expanded widely with OSNs. If one decided that he
did not want to use Flickr because the service had limited privacy settings, he could easily find
another photo-sharing website with more restrictions. Yet this is simply not true of
Facebook. There is nowhere else one can go to maintain as wide a social network of friends.
One would have to convince his friends to transfer all of their information and connections to a
new site, which is too high a barrier for the 400 million active Facebook users. Facebook has
established itself as the leader in the OSN world, and this is why it has become more and more
difficult to change the behavior of a company that controls so many online interactions. If a
retailer faced ethical concerns from the public, consumers could easily find another place to shop.
With Facebook, users are increasingly forced to give up additional privacy without a similar
alternative. Users are unable to quit using the service when they have ethical concerns about
how their information is being shared, because there is no alternative.
3. Security
When OSNs invest in privacy, it is a demonstration of ethical intent. Simply asserting
trustworthiness, as Facebook's Mark Zuckerberg has done, is not enough to instill trust in today's
users. By not meeting the three ethical privacy standards above, Facebook has certainly not ensured that
users are protected or secure. Security is difficult to achieve when it is not a primary concern for
the OSN. The ethical social network would do more to ensure either that users accepted policies
with meaningful consent or that users were not refused access to their social network if they wanted
to share less information with the public. By blocking users' ability to control information with
several barriers, Facebook has eliminated much of the choice the user has in controlling his
privacy. When there is no similar alternative, as in the case of Facebook, it cannot be
assumed that users are willingly accepting an environment of limited privacy. Facebook has
failed to meet its responsibilities in the user-to-OSN relationship by not meeting basic ethical
privacy standards.
Although more users have recently become cautious of OSNs, social networks will likely push
users in the direction of sharing more information rather than less. Facebook is a major agent of
social change. While the service claims to be changing with the times, it is doing most of the
work in changing society's perception of privacy. Whether or not the privacy controls of the
future and changes in culture result in societal good, the ways in which Facebook will likely
implement them continue to be ethically concerning. Will an ethically responsible OSN be
formed as a response to the Facebook behemoth? It is unlikely, as free and private OSNs are
often not scalable or profitable. Rather than finding a suitable alternative, society will likely
have to challenge the service provider to adhere to basic ethical standards.
In determining the ethical responsibilities of OSNs to protect privacy, I have found that while
users are accountable for much of what they say and do, OSNs have several ethical
responsibilities that they must fulfill. Not fulfilling these responsibilities forces users to act as if
every social encounter on an OSN is public, which warrants one to ask: have we reached the
point of no return? Are OSNs the end of privacy as we know it? We have not come to the point
where people uniquely identify others by their Facebook usernames and Big Brother watches
over all of our social interactions. Yet many have opened their eyes to the recent Facebook
privacy fiascos. In late April 2010, four senators expressed concern that Facebook was not doing
enough to protect the privacy of users. The following May, Facebook accidentally exposed
private chats to those who previewed others' profiles. Dan Yoder wrote an article detailing ten
reasons why he was deleting his Facebook account (eight had to do with privacy concerns),
which received over 250,000 page views in six days. Concerns about OSNs are rising as
Facebook shapes its services around what is best for advertising partners rather than users. Most
users signed up for Facebook under the impression that it would be a private network to connect with
friends; Facebook then updated its policies, claiming that users were changing and wanted to be more
public (McCullagh). When OSNs falsely represent user interests in order to advance their own,
we must challenge them and hold them ethically accountable.
Bibliography
Ackerman, Mark S. Privacy in E-Commerce: Examining User Scenarios and Privacy Preferences. Published in the ACM Conference on Electronic Commerce, 1999. 1-8. Print.
"Amazon Homepage." Amazon. Web. 20 Sept. 2010. <https://www.amazon.com/gp/
"Amazon.com Help: Privacy Notice." Amazon. Web. 20 Sept. 2010.
ASOS.com. Web. 08 Feb. 2010. <http://www.asos.com/>.
Axon, Samuel. "How To: Disable Facebook's "Instant Personalization"" Mashable. 26 Apr. 2010. Web. 28 Apr. 2010. <http://mashable.com/2010/04/25/disable-facebook-instant-
Axten, Simon. "Next Steps on Facebook Governance." The Facebook Blog. 3 Apr. 2009. Web. 15 Jan. 2010. <http://blog.facebook.com/blog.php?post=70896562130>.
Bankston, Kevin. "Facebook's New Privacy Changes: The Good, The Bad, and The Ugly." Electronic Frontier Foundation. 9 Dec. 2009. Web. 07 Feb. 2010.
Belanger, F., J.S. Hiller, and W.J. Smith. Trustworthiness in Electronic Commerce: The Role of Privacy, Security, and Site Attributes: 245-70. Web. 11 Feb. 2009.
Bloustein, Edward J. Privacy as an Aspect of Human Dignity: An Answer to Dean Prosser. New York: New York University, School of Law, 1964. 39. Print.
Calore, Michael. "Gmail Hits Webmail G-Spot." Wired News. 1 Apr. 2004. Web. 11 Oct. 2009. <http://www.wired.com/science/discoveries/news/2009/03/
Cassidy, Kyle, and A. Michael Berman. Can You Trust Your Email? New Rochelle, 2006.
Chellappa, Ramnath K. Consumers' Trust in Electronic Commerce Transactions: The Role of Perceived Privacy and Perceived Security. Emory University. Print.
Chellappa, R.K., and P.A. Pavlou. Perceived Information Security, Financial Liability and Consumer Trust in Electronic Commerce Transactions. 2002. 358-68. Print.
"Consumer Privacy Index Q4 2004: Consumer Behaviors and Attitudes about Privacy." Truste. 2004. Web. 12 Sept. 2009.
"Cristina Cordova Profile." Facebook. Web. 28 Mar. 2010.
Culnan, M.J., and G.R. Milne. "The Culnan-Milne Survey on Consumers & Online Privacy Notices: Summary of Responses." Print. Rpt. in Federal Trade Commission, 2001. 23.
"Facebook Design." Facebook. Web. 13 Mar. 2010.
"Facebook Help Center." Facebook. Web. 12 Jan. 2010. <http://www.facebook.com/help/
"Facebook Privacy." Electronic Privacy Information Center. 07 Aug. 2009. Web. 10 May 2010. <http://epic.org/privacy/facebook/>.
"Facebook Privacy Settings." Facebook. Web. 10 Jan. 2010. <http://www.facebook.com/
"Facebook Privacy Settings." Facebook. Web. 23 Oct. 2009. <http://www.facebook.com/
"Facebook Site Governance Documents." Facebook. Web. 1 Apr. 2010.
Facebook Statistics. 03 Mar. 2010. Web. 10 May 2010. <http://www.facebook.com/press/
"Gmail: Legal Notices." Google. Web. 10 Sept. 2010. <http://www.google.com/mail/help/
"Google and Privacy." IMHO In My Humble Opinion. 21 July 2008. Web. 21 Sept. 2009.
"Google Forum." The V7 Network. Web. 10 Jan. 2010. <http://www.v7n.com/forums/>.
"Google Image Results." Google Images. Web. 15 Mar. 2010. <http://images.google.com/
"Hackers Break into Palin's Yahoo! Email." Tech-Ex. 18 Sept. 2008. Web. 12 Oct. 2009.
Hamblen, Matt. "McNealy Calls for Smart Cards to Help Security." Computerworld. 12 Oct. 2001. Web. 05 Oct. 2009. <http://www.computerworld.com/s/article/64729/
Kang, Cecilia. "Is Internet Privacy Dead? No, Just More Complicated." The Washington Post. 15 Mar. 2010. Web. 21 Mar. 2010. <http://voices.washingtonpost.com/posttech/2010/
Kirkpatrick, Marshall. "Facebook's Zuckerberg Says The Age of Privacy Is Over." ReadWriteWeb. 9 Jan. 2010. Web. 21 Feb. 2010. <http://www.readwriteweb.com/archives/
Levin, Avner, and Patricia Sanchez Abril. "Two Notions of Privacy Online." Vanderbilt Journal of Entertainment & Technology Law 2009: 1001-051. Print.
Ling's Cars. Web. 03 Feb. 2010. <http://www.lingscars.com/>.
McCarthy, Caroline. "MoveOn.org Takes on Facebook's 'Beacon' Ads." CNET News. 20 Nov. 2007. Web. 08 Jan. 2010. <http://news.cnet.com/
McCullagh, Declan. "Why No One Cares about Privacy Anymore." CNET News. 12 Mar. 2010. Web. 18 Mar. 2010. <http://news.cnet.com/8301-135783-20000336-38.html>.
Milne, G.R., and A.J. Rohm. "Consumer Privacy and Name Removal across Direct Marketing Channels: Exploring Opt-in and Opt-out Alternatives." Journal of Public Policy and Marketing (2000): 238-49. Web.
"More on Gmail and Privacy." About Gmail. Web. 20 Jan. 2010. <http://mail.google.com/
Morgan, KC. "On-Site Advertising: How Much Is Too Much?" Dev Mechanic. 8 Sept. 29. Web. 10 Jan. 2010. <http://tools.devshed.com/c/a/Website-Advertising/OnSite-
Nielsen, Jakob. "F-Shaped Pattern For Reading Web Content." UseIt.com. 17 Apr. 2006. Web. 03 Jan. 2010. <http://www.useit.com/alertbox/reading_pattern.html>.
O'Neill, Nick. "How To Stop Facebook From Publishing Recent Activity To The News Feed." All Facebook. 6 Jan. 2010. Web. 10 Jan. 2010.
"Online Retailer Topbuy Warned over Spam." Digital Media. 1 Dec. 2009. Web. 23 Jan. 2010. <http://www.digital-media.net.au/article/Online-retailer-Topbuy-warned-over-spam/
"An Open Letter from Mark Zuckerberg." The Facebook Blog. 8 Sept. 2006. Web. 10 Jan. 2010. <http://blog.facebook.com/blog.php?post=2208562130>.
Panitz, Beth. "Reading Between the Lines: The Psychology of Menu Design." National Restaurant Association. Aug. 2000. Web. 20 Feb. 2010. <http://www.restaurant.org/tools/
Phillips, Mark. "Who Owns Your E-mail Address?" CNET. 19 May 2004. Web. 20 Oct. 2009. <http://news.cnet.com/Who-owns-your-e-mail-address/
Sarno, David. "Facebook Governance Vote Is a Homework Assignment No One Did." Los Angeles Times. 23 Apr. 2009. Web. 10 Jan. 2010. <http://latimesblogs.latimes.com/
Schonfeld, Erick. "Zuckerberg On Who Owns User Data On Facebook: It's Complicated." TechCrunch. 16 Feb. 2009. Web. 20 Sept. 2009. <http://techcrunch.com/2009/02/16/
Skelton, Chad. "New Facebook Privacy Settings Make Your Private Photos Public." The Vancouver Sun. 10 Dec. 2009. Web. 08 Jan. 2010. <http://communities.canada.com/
Taylor, RMR. "Opting in or out of Organ Donation." 305 (1992): 1380. Print.
"Terms of Service." Facebook. Web. 29 Jan. 2010. <http://apps.facebook.com/predictions-
Thaler, Richard H., and Cass R. Sunstein. Nudge: Improving Decisions about Health, Wealth and Happiness. London: Penguin, 2009. Print.
TRUSTe. Web. 10 Jan. 2010. <http://www.truste.com/>.
Walters, Chris. "Facebook's New Terms Of Service: "We Can Do Anything We Want With Your Content. Forever."" The Consumerist. 15 Feb. 2009. Web. 10 Jan. 2010.
"Warshak v. USA." Electronic Frontier Foundation. 18 June 2007. Web. 17 Sept. 2009.
Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967. Print.
"What Does Facebook's Privacy Transition Mean For You." DOT Rights. Web. 21 Feb. 2010. <http://www.dotrights.org/what-does-facebooks-privacy-transition-mean-you>.
"Yahoo! Registration." Yahoo. Web. 11 Jan. 2010. <https://edit.yahoo.com/
Yang, M., B.J. Alicea, and C. Clark. "Does E-Trust Matter? A Social Cognitive Theory of Online Shopping Behavior." All Academic Inc. 25 May 2009. Web. 20 Oct. 2009.
Yoder, Dan. "10 Reasons To Delete Your Facebook Account." Business Insider. 3 May 2010. Web. 3 May 2010. <http://www.businessinsider.com/10-reasons-to-delete-your-
Zuckerman, Ethan. "Facebook Changes the Norms for Web Purchasing and Privacy." My Heart's in Accra. 15 Nov. 2007. Web. 15 Feb. 2010. <http://www.ethanzuckerman.com/