
Online Consumer Protection:
Theories of Human Relativism


Kuanchin Chen
Western Michigan University, USA
Adam Fadlalla
Cleveland State University, USA

Information Science Reference


Hershey New York

Director of Editorial Content: Kristin Klinger
Senior Managing Editor: Jennifer Neidig
Managing Editor: Jamie Snavely
Managing Development Editor: Kristin M. Roth
Assistant Managing Editor: Carole Coulson
Typesetter: Chris Hrobak
Editorial Assistant: Rebecca Beistline
Copy Editor: Joy Langel
Cover Design: Lisa Tosheff
Printed at: Yurchak Printing Inc.

Published in the United States of America by


Information Science Reference (an imprint of IGI Global)
701 E. Chocolate Avenue, Suite 200
Hershey PA 17033
Tel: 717-533-8845
Fax: 717-533-8661
E-mail: cust@igi-global.com
Web site: http://www.igi-global.com
and in the United Kingdom by
Information Science Reference (an imprint of IGI Global)
3 Henrietta Street
Covent Garden
London WC2E 8LU
Tel: 44 20 7240 0856
Fax: 44 20 7379 0609
Web site: http://www.eurospanbookstore.com
Copyright © 2009 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by
any means, electronic or mechanical, including photocopying, without written permission from the publisher.
Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does
not indicate a claim of ownership by IGI Global of the trademark or registered trademark.
Library of Congress Cataloging-in-Publication Data
Online consumer protection : theories of human relativism / Kuanchin Chen and Adam Fadlalla, editors.
p. cm.
Summary: "This book is designed to offer readers a comprehensive way to understand the nature of online threats, consumer concerns, and
techniques for online privacy protection"--Provided by publisher.
Includes bibliographical references and index.
ISBN 978-1-60566-012-7 (hardcover) -- ISBN 978-1-60566-013-4 (ebook)
1. Consumer protection. 2. Ethical relativism. 3. Privacy, Right of. 4. Electronic commerce--Security measures. 5. Electronic
information resources--Access control. 6. Disclosure of information. 7. Computer crimes. I. Chen, Kuanchin. II. Fadlalla, Adam.
HC79.C63O54 2009
381.3'402854678--dc22
2008010313
British Cataloguing in Publication Data
A Cataloguing in Publication record for this book is available from the British Library.
All work contributed to this book set is original material. The views expressed in this book are those of the authors, but not necessarily of
the publisher.
If a library purchased a print copy of this publication, please go to http://www.igi-global.com/agreement for information on activating
the library's complimentary electronic access to this publication.

Table of Contents

Preface ................................................................................................................................................ xiv


Acknowledgment ................................................................................................................................ xx

Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective............................................... 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA

Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark

Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ...................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA

Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan

Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA

Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan

Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA

Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA

Compilation of References .............................................................................................................. 388


About the Contributors ................................................................................................................... 429
Index ................................................................................................................................................ 436

Detailed Table of Contents

Preface ................................................................................................................................................ xiv


Acknowledgment ................................................................................................................................ xx

Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Search engines can log and stamp each search made by end-users and use that collected data for an
assortment of business advantages. In a world where technology gives users many conveniences, one
must weigh the benefits of those conveniences against the potential intrusions of personal privacy.
With Google's eyes on moving into radio, television, print, and other technologies, one must back up
and examine the potential privacy risks associated with the technological conveniences being provided
by Google. This chapter gives an overview of Google's services and how they relate to personal
privacy online.
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
This chapter offers a review of online privacy issues, particularly in the areas of consumer online privacy
legislation and litigation, the relationships among privacy issues, legal protections, and the risks of privacy
violations. A survey of the privacy literature provides insights on privacy protection and privacy concerns. Results show a need for stronger intervention by the government and the business community.
Consumers' privacy awareness is also key to successful protection online. The chapter concludes
with a call for consumer privacy education to promote privacy awareness, and for timely responses by government and businesses to privacy violations.

Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective............................................... 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA
Victims of online threats are not necessarily just individual consumers anymore. Management must be
ready to neutralize, reduce, and prevent these threats if the organization is going to maintain its viability in today's business environment. This chapter is designed to give managers the overview needed to
have a working knowledge of online privacy, vulnerabilities, and threats. The chapter also highlights
techniques that are commonly used to impede attacks and protect the privacy of the organization, its
customers, and employees.

Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark
This chapter proposes a privacy assessment model called the operational privacy assessment model
that includes organizational, operational, and technical factors for the protection of personal data stored
in an IT system. The factors can be evaluated on a simple scale so that not only can the resulting graphical
depiction be easily created for an IT system, but graphical comparisons across multiple IT systems
are also possible. Examples of factors displayed in a Kiviat graph are also provided. This assessment
tool may be used to standardize privacy assessment criteria, making it less painful for management
to assess privacy risks in their systems.
Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Trust is an essential ingredient for a successful interaction or collaboration among different parties. Trust
is also built upon the belief that the privacy of the involved parties is protected before, during, and after
the interaction. This chapter presents different trust models, the interplay between trust and privacy, and
the metrics for these two related concepts. In particular, it shows how one's degree of privacy can be
traded for a gain in the level of trust perceived by the interaction partner. The idea and mechanisms of
trading privacy for trust are also explored.

Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ...................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Australia uses regulation/legislation, guidelines, codes of practice, and activities of consumer associations
and the private sector to enhance protection of consumers' privacy. This chapter is designed to report
Australia's experience in privacy protection. In particular, the four main areas of protection outlined
above are analyzed to draw implications. Recommendations address the coverage of legislation, the
uniformity of regulations, the relationships among guidelines and legislation, and consumer awareness.
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA
This chapter proposes an integrated framework to model online privacy protection behavior. Factors
in this framework are drawn from recent Internet and online privacy studies. Although many possible
factors can be included in the framework, the authors took a very conservative approach to include in
their framework only those factors that were formally studied in the academic literature. This framework
serves as the basis for future extensions or empirical assessments.

Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
One main reason that online users are wary of providing personal information is that they lack trust
in e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control as a way to protect their personal data online. This chapter presents survey results of
how the two genders differ in their ways to control their private data on the Internet. Findings provide
guidelines for e-businesses to adjust their privacy policies and practices to increase information and
transactional exchanges.

Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Research about hackers is scarce, but the impact on privacy that hackers bring to the Internet world should
not be underestimated. This chapter looks at the demographics, psychological predispositions, and social/
behavioral patterns of computer hackers to better understand the harms that can be caused. Results show that
online breaches and online concerns regarding privacy, security, and trust will require much more complex
solutions than currently exist. Teams of experts in fields such as psychology, criminology, law, and information technology security need to collaborate to bring about more effective solutions for the virtual world.
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan
This chapter introduces a situational paradigm to study online privacy. Online privacy concerns and
practices are examined within two contexts: technology platforms and users' motivations. Results show
a distinctive staging phenomenon under the theory of uses and gratifications and an a priori theoretical framework. The diffused audience was less concerned about privacy, but its members did not disclose their
personal information any more than the other groups. Users may act differently in diverse platforms or
environments, implying that treating Internet users as a homogeneous group or considering them to act
the same way across different environments is a problematic assumption.

Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA
The business values of using the Internet for the delivery of soft media may be hampered when the owners
risk losing control of their intellectual property. Any business that wishes to control access to and use of
its intellectual property is a potential user of digital rights management (DRM) technologies. Managing,
preserving, and distributing digital content through DRM is not without problems. This chapter offers
a critical review of DRM and issues surrounding its use.

Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Certain marketing practices may sometimes cause privacy conflicts between businesses and consumers.
This chapter offers insights into privacy concerns from today's marketing practices on the Internet. Specifically, areas of focus include current privacy issues, the use of spyware and cookies, word-of-mouth
marketing, online marketing to children, and the use of social networks. Related privacy practices,
concerns, and recommendations are presented from the perspectives of Internet users, marketers, and
government agencies.
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
This chapter examines the current status of online privacy policies of Fortune 100 companies. Results
show that 94% of the surveyed companies have posted an online privacy policy and 82% collect personal
information from consumers. Additionally, the majority of the companies only partially follow the four
principles (notice, choice, access, and security) of fair information practices. Few organizations have
obtained third-party privacy seals including TRUSTe, BBBOnline Privacy, and Safe Harbor.
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan
This chapter studies concerns of Internet privacy across multiple cultures. Students from several countries were recruited to participate in a focus group study in order to discover the differences in their
privacy concerns. Collectivistic cultures appear to be less sensitive to the violation of personal privacy,
while individualistic cultures are found to be more proactive in privacy protection. Implications are
provided.

Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA

This chapter provides an overview of biometric controls to protect individual privacy. Although much of
the discussion targets protection of physical privacy, some may also apply to online consumer privacy.
Discussion is focused on four main areas: technological soundness, economic values, business applications, and legal/ethical concerns. Further insights are provided.
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
This chapter focuses on the issues surrounding the federal Freedom of Information Act and associated
state and local laws for their implications on personal privacy. Despite the good intentions of these laws
to enable openness in government, confidential business information and private personal information
may be vulnerable when data are in government hands. This chapter offers readers a better understanding of several trends regarding the statutes and their interpretations.
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
This chapter discusses the legal framework and the law of the European Union (EU) for consumer and
data protection. The creation of legal frameworks in Europe aims to secure the protection of consumers
while simultaneously facilitating economic growth in the European Union. This chapter outlines the main
sources of privacy protection law and critically analyzes the important provisions in these sources of law.
Gaps and deficiencies in the legal structures for consumer and data protection are also discussed.
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA
This chapter provides an overview of law relating to Internet medical practice, data protection, and consumer information privacy. It provides a comprehensive overview of federal (HIPAA) and state privacy
laws. Readers are given advice on the legal and data protection problems consumers will encounter in
purchasing medical and health services on the Internet. Furthermore, actual case studies and expert advice
are provided to offer a safer online experience. The authors also advocate that the United States must
enact more federal protection for the consumer in order to deter privacy violations and punish criminal,
negligent, and willful violations of personal consumer privacy.

Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA
This chapter reports the current status and practices of online privacy protection in Japan. It offers a
perspective of an eastern culture regarding the concept of privacy, its current practices, and how it is
protected. Following the discussion of the Japanese privacy law, the Act on the Protection of Personal
Information, Japan's privacy protection mechanisms to support and implement the new act are examined.
The authors also offer a four-stage privacy protection solution model as well as two case studies to show
readers the problems, dilemmas, and solutions for privacy protection from Japan's experience.

Compilation of References .............................................................................................................. 388


About the Contributors ................................................................................................................... 429
Index ................................................................................................................................................ 436


Preface

Privacy, the right to be left alone, is a fundamental human right. Risks of the contrary, privacy invasion, have increased significantly in a world that is increasingly turning online. In today's networked
world, a fast-growing number of users are hopping on and off the Internet superhighways, multiple times
every day, more so than they hop on and off physical expressways. Internet users are also doing more
diverse activities online, including browsing, shopping, communicating, chatting, gaming, and even
working. With so much online presence, users find themselves, in many situations, divulging information
that they otherwise might not due to privacy concerns. Users may even be wary of getting online
because of fear of possible privacy invasion from the many prying eyes on the Internet. The issue is
not whether privacy should be protected, but rather how it should be protected in the vast
online world where information can be intercepted, stolen, quickly transported, shared without the
user's knowledge, or even sold for profit. Compared to an offline environment, the Internet enables collection of
more information from users cost-effectively, sometimes even without their consent. Thus, the Internet
poses a greater privacy threat to users, as their personal information is transmitted over the Internet if an
organization does not have a good security mechanism in place. Furthermore, the connectivity of the
Internet allows capturing, building, and linking of electronic profiles and behaviors of users.
Online privacy is a multidimensional concept and thus has been addressed in research from a multiplicity of angles, albeit not equally thoroughly. Much research effort has focused on addressing privacy
as a technological factor and hence proposed technical solutions to privacy protection. Although this is
an important dimension of online privacy, there are equally, if not more, important dimensions, such as
context, culture, perceptions, and legislation. Such softer (non-technological) aspects of privacy cannot be understood by only looking at the technological aspects of privacy. The human dimension is as
complex and as important for getting a more complete understanding of privacy. At a micro level, not
only do individuals have varying requirements for privacy, but the same individual's requirements
may change over time or between situational contexts. Responses to privacy invasion may be very different between individuals and situational contexts. There may also be a gap between individuals'
desired and actual behaviors in relation to their privacy concerns. Individuals may have more stringent
privacy requirements than what their actual online practice reflects. Online privacy researchers have offered
less coverage to these human factors, but understanding these factors, and many more, is key to gaining
a better understanding of online privacy, hence the human relativism.
At a macro level, privacy requirements and response to privacy invasion may vary across cultures,
societies, and business situations. Organizational practices of privacy policies and responses to incidents of privacy invasion affect people's perceptions of the current state of privacy, and consequently
affect their trust in the organization. People are generally concerned about how personal information is
collected, used, and distributed beyond its original purpose and beyond the parties originally involved.
Breaches of how their information is collected, used, or shared, and responses to such breaches, directly
impact their privacy concerns and their trust. There is still not sufficient empirical evidence to answer
many privacy questions at these macro levels, and many human aspects of online privacy in some social
and cultural settings have not yet received enough research attention. Consequently, our understanding
of the relationships between online privacy and dimensions such as culture, user characteristics, business
context, technology use, and education is still limited.
The world is increasingly turning online and there will be no reversal of this trend. To protect the
privacy of online users and to consequently achieve the full potential of transacting in an online world,
the issue of online privacy needs to be understood from multiple facets. The challenge is to minimize
constraints of online dealings without compromising users' privacy. Such delicate balancing cannot
be achieved without a broad understanding of online privacy. This book is an attempt to provide such
understanding by offering a comprehensive and balanced coverage of the various dimensions of online
privacy. Many previously published books either treat privacy as a sub-topic under a broader topic of
end-user computing or information systems or focus primarily on technical issues or managerial strategies. Many others focus on end users and offer only introductory material or general guidelines to
enhance personal online security and privacy. While this treatment of these important topics of privacy
is appropriate for their intended use and audience, it does not allow for a broader and a more extensive
examination of online privacy and how it guides practice.
Furthermore, many gaps in privacy, threats, and fraud theories have not yet been filled. The most
prominent such gaps include linking privacy theories to other established theories and frameworks in
information technology or related disciplines. For example, cultural, social, and behavioral issues in
privacy have not received enough attention. Research on human aspects as well as empirical assessments of privacy issues is lacking. Research on linking privacy considerations to business practices,
educational curriculum development/assessment, and legislative impacts is also scarce. Many studies
have focused on technological advancements, such as security protection and cryptography to offer
technical tools for privacy protection and for assessing risks of privacy invasion. Although such focus
is a must to protect users from these risks, technology is not equivalent to protection. For a protection
scheme to work well, both technical and human aspects have to work in harmony. A major goal of this
book is to provide a view of privacy that integrates the technical, human, cultural, and legal aspects of
online privacy protection as well as risks and threats to privacy invasion.
The book aims at (a) promoting research and practice in various areas of online privacy, threat assessment, and privacy invasion prevention, (b) offering a better understanding of human issues in these
areas, and (c) furthering the development of online privacy education and legislation. The book goes
beyond introductory coverage and includes contemporary research on the various dimensions of online
privacy. It aims to be a reference for professionals, academics, researchers, and practitioners interested
in online privacy protection, threats, and prevention mechanisms. The book is the result of research
efforts from content experts, and thus it is an essential reference for graduate courses and professional
seminars.
There are 19 great chapters in the book, grouped into five sections: (1) background, (2) frameworks
and models, (3) empirical assessments, (4) consumer privacy in business, and (5) policies, techniques,
and laws for protection.
The background section provides an overview of privacy for those who prefer a short introduction
to the subject. In Chapter I, Pauxtis and White point out the serious privacy implications of online
searches. Search engines can log and stamp each search made by end-users and use that collected data
for an assortment of business advantages. In a world where technology gives users many conveniences,
one must weigh the benefits of those conveniences against the potential intrusions of personal privacy.
Nevertheless, end-users will always use search engines. They will always Google something on their
mind. The authors conclude that while the vast majority of casual Internet users either do not know
Google's data collection policies, or simply do not care, at the end of the day it comes down to the
simple fact that we as a society must put our trust into the technological innovations that have become
commonplace conveniences.
In Chapter II, Secor and Tarn brought to the forefront the importance of legal protection and
privacy awareness and presented a taxonomic view to explore the relationship of the issues, legal protections, and the remedies and risks for not complying with the legal requirements. The authors used two
survey studies to reinforce the vital need for a stronger role by the government and business community
as well as the privacy awareness from online consumers themselves. The chapter is concluded with a vital
call for consumer privacy education and awareness, government and legislators' attention, and timely
responses with legislation that protects consumers against those who would misuse the technology.
In Chapter III, Sockel and Falk highlighted the gravity of vulnerabilities to privacy in that it is not
uncommon for employees to work offsite, at home, or out of a hotel room, often using less-than-secure
Internet connections: dial-up, cable, Internet cafés, libraries, and wireless. The chapter highlights the
relationship between vulnerability, threats, and action in what the authors termed the "risk triangle." It
delves into techniques that are commonly used to thwart attacks and protect individuals' privacy, and
discussed how in the age of unrest and terrorism, privacy has grown even more important, as freedoms
are compromised for security. The chapter provides an overview of the various vulnerabilities, threats,
and actions to ameliorate them.
Section II consists of four chapters that offer frameworks or models to study various privacy issues.
In Chapter IV, Jansen, Peen, and Jensen turn the attention to the claim that "Most of the current work
has focused on technical solutions to anonymous communications and pseudonymous interactions, but,
in reality, the majority of privacy violations involve careless management of government IT-systems,
inadequate procedures or insecure data storage." The authors introduced a privacy assessment model,
called the Operational Privacy Assessment Model that includes organizational, operational, and technical
factors. The factors can be evaluated in a simple scale so that not only the resulting graphical depiction
can be easily created for an IT system, but graphical comparisons across multiple IT systems are also
possible. Although their method has been developed in the context of government IT-systems in Europe, they believe that it may also apply to other government systems, non-governmental organisations
(NGOs), and large private companies.
In Chapter V, Lilien and Bhargava underline the strong relationship between privacy and trust. The
authors contend that the role of trust and privacy is as fundamental in computing environments as it is
in social systems. The chapter presents this role in online interactions, emphasizing the close relationship between trust and privacy, and shows how one's degree of privacy can be traded for a gain in the
level of trust perceived by one's interaction partner. The chapter explores in detail the mechanisms of
this core theme of trading privacy for trust. It also presents different trust models, the interplay between
trust and privacy, and the metrics for these two related concepts.
In Chapter VI, Ha, Coghill, and Maharaj offer an Australian perspective on measures to protect e-consumers' privacy, the current state of e-consumer privacy protection, and discuss policy implications
for the protection of e-consumers' privacy. The authors suggest that although privacy protection measures
in the form of legislation, guidelines, and codes of practice are available, their effectiveness is limited in
alleviating consumers' privacy and security concerns. The authors contend that protection of consumers'
personal information also depends on how e-retailers exercise their corporate social responsibility to
provide protection to e-consumers.
In Chapter VII, Gurung and Jain review the existing literature and analyze the existing online privacy theories, frameworks, and models to understand the variables that are used in the context of online
privacy protection. The authors developed an integrative framework to encapsulate the antecedents to
online privacy protection behavior.
Section III includes research studies that report empirical findings on various privacy topics. One
main reason that online users are wary of providing personal information is that they lack trust in
e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control as a way to protect their personal data online. In Chapter VIII, Rea and Chen report
survey results of how the two genders differ in their ways to control their private data on the Internet.
Findings provide guidelines for e-businesses to adjust their privacy policies and practices to increase
information and transactional exchanges.
Discussion on privacy is incomplete without a glimpse into hackers and crackers, the elite corps
of computer designers and programmers, according to Schell and Holt in Chapter IX. Schell and Holt
argue that it is vital that researchers understand the psychological and behavioral composition of network attackers and the social dynamics that they operate within. This understanding can improve our
knowledge of cyber intruders and aid in the development of effective techniques and best practices
to stop them in their tracks. Such techniques can minimize damage to consumer confidence, privacy,
and security in e-commerce Web sites and general information-sharing within and across organizations.
The authors discuss known demographic and behavioral profiles of hackers and crackers, psychological
myths, and truths about those in the computer underground, and how present strategies for dealing with
online privacy, security, and trust issues need to be improved.
In Chapter X, Hsu adds a perspective from communications to the ongoing debate on online privacy.
She examines why online privacy researchers have failed to explain why users who assert higher privacy
concerns still disclose sensitive information. The author argues that this is due to ignoring the social
context (what the author terms situational paradigm) in the research on online privacy. The author tries to
offer more support for the argument of the situational paradigm from the newly-emerging phenomenon
of online photo album Web sites in Taiwan.
Section IV focuses on consumer privacy in business and consists of four chapters. In Chapter XI,
Chan, Collins, and Movafaghi tackle the issue of online consumer privacy and digital rights management
(DRM) systems of protecting digitally stored content. This protection may be accomplished through
different strategies or combinations of strategies including: identifying authorized users, identifying
genuine content, verifying proof of ownership and purchase, uniquely identifying each copy of the
content, preventing content copying, tracking content usage and distribution, and hiding content from
unauthorized users. The authors argue that DRM systems may change the business model from a traditional buy-and-own to a pay-per-use, but caution that this may pose great risks to consumers and society
as DRM technologies may weaken the rights to privacy and fair use, and threaten the freedom of expression.
The chapter discusses the conflict between the rights of content owners and the privacy rights of content
users, and explores several DRM techniques and how their use could affect consumer privacy.
In Chapter XII, Parker offers views on online privacy from a marketing perspective in the context
of consumer marketing. The chapter provides insights into the ways that online privacy has become a
balancing act in which the needs of businesses are oftentimes balanced against the needs of consumers.
A number of privacy issues that affect the marketing of products and services are presented, along with
recommended best practices. The issues discussed include: (1) consumer, marketer, and government
perspectives on data collection, ownership and dissemination; (2) online advertising and the use of
cookies and spyware; (3) word-of-mouth marketing and the use of blogs, sponsored chat, and bulletin
boards; (4) marketing online to children; and (5) privacy issues in social networks and online communities. The chapter represents one of the first analyses of online marketing practices and their associated
privacy issues.


In Chapter XIII, Li and Zhang offer analysis of online privacy policies of Fortune 100 companies
within the context of the four principles (notice, choice, access, and security) of fair information practices. The authors found that 94% of the surveyed companies posted an online privacy policy and 82%
of them collect personal information from consumers. The majority of the companies only partially
follow the four principles of fair information practices. In particular, organizations fall short in security
requirements; only 19% mention that they have taken steps to provide security for information both
during transmission and after their sites have received the information. The authors conclude that a well-designed privacy policy by itself is not adequate to guarantee privacy protection; effective implementation is just as important. Consumer education and awareness are also essential for privacy protection.
In Chapter XIV, Chiou, Chen, and Bisset focus attention on the important question of online privacy
across cultures by analyzing cultural perceptions on privacy in the United States, Vietnam, Indonesia,
and Taiwan. The authors point out clear differences between how personal information is viewed in the
United States and Asia. For example, an American in Taiwan might feel suspicious if asked to provide
his passport number by a community Web site, while a Taiwanese in the United States might be puzzled
and alienated by the fierceness with which people guard their private lives. The authors argue that such
differences should be considered in cross-culture online privacy research and legislation. Furthermore,
due to the various cultural differences and backgrounds that form privacy perceptions, great care and
sensitivity should be taken into consideration when conducting privacy studies across cultures.
Section V deals with policies, techniques, and laws for privacy protection. In Chapter XV, Lancaster and Yen focus on the important linkage between biometric controls and privacy. Biometrics is an
application of technology to authenticate users' identities through the measurement of physiological or
behavioral patterns, and thus does not suffer from the shortcomings of external authentication techniques that
rely on items that can be lost, forgotten, stolen, or duplicated. The authors conclude that, with adequate
communication, users are likely to appreciate systems that allow them the ease of use and convenience
that biometric systems offer, and hence their use will continue to grow in the future.
In Chapter XVI, Erickson discusses the important issue of the tension between openness in government and personal privacy. The trend in the federal legislature has been to continually strengthen the
FOIA and openness by reaffirming a presumption that government records should be released unless
there is a compelling reason not to. Alternatively, the trend in agency practice and the courts has been
toward more privacy, allowing use of certain exemptions in the FOIA to deny records to individuals or
organizations seeking them. This balance has been clarified somewhat by legislation on electronic records,
agency practice, and a number of court cases suggesting agencies can limit releases to central purpose
activities and records not including individually identifiable information. The author also considers the
status and vulnerability of confidential business information passed on to governments and the status
and vulnerability of government databases concerning individual citizens. The main conclusion of the
chapter is that matters remain in flux in the legal aspects of privacy, and regardless of which way the
balance tips (openness vs. privacy), more certainty will help government, organizations, and individuals
better plan how and when to share their own information resources.
In Chapter XVII, O'Mahony and Flaherty discuss the legal framework for consumer and data protection in Europe, which seeks to secure the protection of consumers while simultaneously facilitating
economic growth in the European Union. The chapter outlines the main sources of law which protect
consumers and their privacy, the important provisions in these sources of law and critically analyzes
them, and points out the gaps and deficiencies in the consumer and data protection legal structures. The
authors argue that the creation of these legal rights and legal protections will only stem the misuse of
personal data if people know about the law and their rights and know how to access legal protections.
Thus, more needs to be done to ensure that citizens of the European Union are equipped with the necessary knowledge to ensure that their personal data is treated with respect and in accordance with law.
The authors conclude that more focus needs to be put on ensuring greater compliance with the law,
particularly from businesses who have benefited from the free flow of data.
In Chapter XVIII, Mika and Tyler provide an overview of the law relating to cybermedicine and
telemedicine in terms of data protection and other legal complications related to licensing and a conflict of
state laws. The authors examine the laws applicable to Web sites where medical diagnosis or the purchase
of medical services (including prescriptions) is available. They discuss how the new methodology of
acquiring medical care is at odds with traditional notions of state regulation and how current laws, both
federal and state, leave many gaps related to any consumer protections or potential causes of action when
privacy is compromised. The authors offer some expert advice for consumers regarding using Web sites
for medical purposes as well as protecting their own privacy. Lastly, the authors advocate a federal law
more punitive than HIPAA, one that regulates and protects patient information, medical transactions,
and interactions on the Internet and deters violations of patient privacy by mandating significant fines
and imprisonment for negligent or criminal and willful violations of that privacy.
In Chapter XIX, Tarn and Hamamoto emphasized trans-border differences in the concepts of privacy; namely, that the concept of privacy in Japan is different from that in Western countries. They
explained how, after more and more privacy-related problems were revealed by the media, consumers
began to pay attention to the protection of their private information, and, in response, the Japanese government enacted legislation to protect consumers and regulate companies' business activities associated
with customers' private information. This exposed many weaknesses in companies' privacy protection
systems and revealed unethical uses of private data.
We cannot claim that this book perfectly covers online privacy, a broad and multidimensional concept.
Nevertheless, we believe it fills a major gap in the coverage of privacy by providing a comprehensive
treatment of the topic. Thus, it provides a single integrated source of information on a multitude of privacy dimensions including technical, human, cultural, personal, and legal aspects. Research on privacy
is still evolving, and the varied and broad coverage presented in this book makes it a valuable reference for
researchers, practitioners, professionals, and students.


Acknowledgment

We are grateful to numerous individuals whose assistance and contributions to the development of this
scholarly book either made this book possible or helped to make it better.
First, we would like to thank all chapter reviewers for their invaluable comments, which helped
ensure the intellectual value of this book. We would also like to express gratitude to our chapter authors
for their excellent contributions to this book.
Special thanks are due to the publishing team at IGI Global, in particular to our Managing Development Editor, Ms. Kristin Roth, who allowed her staff to provide invaluable support to keep the project on
schedule and of high quality, and to Dr. Mehdi Khosrow-Pour, whose vision motivated the development
of this pioneering project. This project would not have been successful without Ross Miller, Deborah
Yahnke, and Rebecca Beistline, who tirelessly offered their professional assistance during the development of this project.
Finally, we would like to give our heartfelt thanks to Kuanchin's wife, Jiajiun, and Adam's family
for their understanding and encouragement during the development of this book.
Kuanchin Chen and Adam Fadlalla

Section I

Background

Chapter I

Google:
Technological Convenience vs. Technological Intrusion
Andrew Pauxtis
Quinnipiac University, USA
Bruce White
Quinnipiac University, USA

Abstract
What began as simple homepages that listed favorite Web sites in the early 1990s has grown into some
of the most sophisticated, enormous collections of searchable, organized data in history. These Web
sites are search engines, the golden gateways to the Internet, and they are used by virtually everyone.
Search engines, particularly Google, log and stamp each and every search made by end-users and use
that collected data for their own purposes. The data is used for an assortment of business advantages,
some of which the general population is not privy to, and most of which the casual end-user is typically
unfamiliar with. In a world where technology gives users many conveniences, one must weigh the benefits of those conveniences against the potential intrusions of personal privacy. Google's main stream of
revenue is their content-targeted AdWords program. AdWords, while not a direct instance of a personal
privacy breach, marks a growing trend in invading personal space in order to deliver personalized
content. Gmail, Google's free Web-based e-mail service, marked a new evolution in these procedures,
scanning personal e-mail messages to deliver targeted advertisements. Google has an appetite for data,
and their hundreds of millions of users deliver that every week. With their eyes on moving into radio,
television, print, establishing an Internet service provider, further advancing the technology of AdWords, as
well as creating and furthering technology in many other ventures, one must back up and examine the
potential privacy and intrusion risks associated with the technological conveniences being provided.



Introduction: The World of Search Engines
Now more than ever, the casual consumer is letting
their guard down on the Internet because of the
level of comfort gained over the past decade. The
Internet has become a norm of society and a staple
of culture. Many end-users accept the potential
risks of unveiling their credit card number online,
even at the most respected of retailers. While
having a credit card number compromised could
certainly cause a headache, the future of privacy
on the Internet does not have much to do with
those 16 magic digits. Instead, privacy, or lack
thereof, on the Internet has to do with something
all Internet users employ in their daily lives: the
search engine.
Privacy and general consumer protection on
the Internet is no longer exclusively limited to the
safeguarding of personal financial information
such as credit card numbers and bank accounts.
Other personal information is being given out
each and every day simply by using any major
search engine. Google, for instance, logs much
of what their users search for and then uses that
information to their advantage. With hundreds
of millions of logged searches each day, a search
engine like Google can analyze everything from
cultural and economic trends right on down to
what a given user is thinking or feeling based on
their search queries. This collection of information is a smoking stockpile of marketing data that
can then be utilized to build or better render other
personalized, content-targeted services.
Search engines provide the enormous service
of indexing billions of pages of data so that the
end-user can mine for a given query. To end-users,
this indexing and search service is the ultimate
convenience put out by the major search engine
companies. It allows us to locate documents,
images, videos, and more among billions of Web
pages in a matter of milliseconds. An Internet
without search engines would be an unorganized,
uncharted, unmeasured wilderness of Web pages.

Rather than having to shuffle through a floor full


of crumpled up, torn notebook pages, search engines put everything into finely labeled, organized
notebooks, an invaluable service no end-user
would ever sacrifice.
Web sites are typically archived, or indexed,
using advanced Web crawling bots or spiders that run off of servers and seek out new
Web pages or recently updated pages. A search
engine's business is built entirely on the practice
of collecting data, as much of it as possible.
Search engines began as simple, small listings
of useful Web sites in the 1990s. One decade
later, these simple listings have turned into one
the most phenomenal collections of organized
data in history. Google, for instance, claims to
have over 10 billion pages of content indexed in
their search engine, with millions of more pages
being added each day.
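To make the indexing process just described more concrete, here is a minimal, hypothetical sketch of a breadth-first crawler loop in Python. The seed URL, page limit, politeness delay, and in-memory "index" are illustrative assumptions only; they are not a description of how Google's production crawlers work.

# Minimal illustrative crawler: fetch a page, record its content, follow links.
# All details (seed URL, depth limit, in-memory index) are hypothetical.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href attributes from anchor tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10, delay_seconds=1.0):
    """Breadth-first crawl starting at seed_url; returns {url: raw_html}."""
    index = {}
    frontier = [seed_url]
    while frontier and len(index) < max_pages:
        url = frontier.pop(0)
        if url in index:
            continue  # already archived
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages
        index[url] = html  # "archive" the page content
        parser = LinkParser()
        parser.feed(html)
        # Queue newly discovered pages for later visits.
        frontier.extend(urljoin(url, link) for link in parser.links)
        time.sleep(delay_seconds)  # be polite to the server
    return index

if __name__ == "__main__":
    pages = crawl("https://example.com")
    print(f"Archived {len(pages)} pages")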
It is because of the search engines' easy access
to requested information that they have become
second-nature to Web users. People flock to
search engines without thinking twice. Google
has become a part of everyday society and a verb
in modern linguistics. When someone needs to
find something online, they simply Google it.
End-users enter names, addresses, phone numbers,
interests, health ailments, questions, fantasies,
and virtually anything imaginable into search
boxes. Every search is logged and saved. Every
user has a search engine fingerprint trail. The
data that search engines such as Google amass
from logging search queries is astronomical, and
the uses for such data are endless.
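As an illustration of what logging and "stamping" a search could look like in practice, the following sketch shows one plausible shape for a logged query record. The field names and values are entirely hypothetical and do not reflect any real search engine's log schema.

# Hypothetical example of a single logged search record; field names are
# illustrative only and do not reflect any search engine's real schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SearchLogEntry:
    timestamp: datetime      # when the query was "stamped"
    client_ip: str           # network address of the searcher
    cookie_id: str           # persistent identifier tying searches together
    query: str               # the text the user typed
    results_clicked: list    # which results the user followed

entry = SearchLogEntry(
    timestamp=datetime(2008, 3, 14, 9, 26, 53, tzinfo=timezone.utc),
    client_ip="203.0.113.42",  # documentation-range address
    cookie_id="a1b2c3d4e5f6",
    query="back pain home remedies",
    results_clicked=["example-health-site.com/back-pain"],
)

# A sequence of such entries sharing the same cookie_id is the
# "fingerprint trail" described in this chapter.
print(entry.query, entry.cookie_id)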
The value alone for such precise data to be sold
to advertisers is priceless. Imagine an advertiser
who obtained the search data, in its entirety, that
Google has. They could immediately reconfigure
their marketing efforts with pinpoint precision.
Google's data reservoir is the Holy Bible of the
marketing universe. All the same, one could call
these piles of data some of the most dangerous
weapons in the world: identifying, damning,
incriminating search queries are logged by the
millions each and every day. Nevertheless, end-users will always use search engines. End-users
will always Google something on their mind.
Search engines are quick, convenient, and always
yield a precise or near-precise result. Who would
want to give that up?

Gateways to the Internet


Google is no longer just a search engine; it is a
portal to the Internet. Over the past several years
an architecture shift within the search engine
world has been occurring. Google, Yahoo!, and
an army of others, are providing much more than
just search results. Items like free e-mail and
Web site hosting have always been among the
traditional extensions of a search engine, but in
more recent years the bigger search engines have
launched impressive additions to their communities. Google, for instance, has recently launched a
suite of Web applications which rivals Microsoft
Office, complete with word processing and spreadsheets, all right on your Web browserpair that
with Google Maps, Google Talk, Google Calendar,
Google News, Google Checkout, Google Notebook, Google Groups, YouTube, Google Earth,
and Google Desktop, and one could argue never
having to leave the Google domain for all your
Internet needs! After all, the bigger the audience they can attract (and provide a technological convenience or solution to), the better.
Why, exactly, does Google go through the
hassle of launching new services? It is simple:
search engines have the ultimate goal of finding
new ways to provide conveniences to the end-user. The Google services listed, along with the dozens of others provided by Google, are just that: conveniences. The more services Google has to attract and retain their viewers, the better off
they will be. Search engines live and die by the
amount of traffic that goes through their massive
networks each day. This traffic is their audience,
to which advertisements can be displayed, much like commercials on television. Positioned on the top and sides of each and every traditional search engine result page, or SERP, are blocks of advertisements which directly correlate to the end-user's search query.
On Google, these aforementioned blocks of
advertisements are called AdWords. AdWords
is one of the main support beams of the Google
business infrastructure, and is the company's main stream of revenue. It allows end-users, from individuals to companies to international organizations, to purchase advertising space on the SERPs as well as other areas of Google's massive advertising network. Prices for an advertising campaign are determined solely by who searches for what queries, and how frequently a given query is searched for. Google is not alone in this advertising architecture: Yahoo! runs its own program, Yahoo! Search Marketing, and MSN runs adCenter. Google AdWords, however, has set the bar for this type of pay-per-click (PPC) advertising, with Yahoo! and MSN usually following suit with any major changes that Google makes to its program. AdWords, while not a direct instance of personal privacy breach, marks a growing trend of invading personal space in order to deliver personalized content.
One must also see things as Google does. This
is their company. When an end-user enters one
of Google's many Web sites, we are in essence walking through the door of their store. Much like any responsible store owner may observe his or her patrons or inventory logs to see what sells and what does not, Google is ultimately doing the same, but on a much larger scale. Google is free of charge, and if the company is not earning money through some sort of subscription-based or other e-commerce model, then displaying advertisements is the only sustainable, and sensible, way to go. Google
CEO, Eric Schmidt, on what it takes to run and
improve the advertisements on Google:
More computers, basically, and better algorithms. And more information about you. The more personal information you're willing to give us (and you have to choose to give it to us) the more we can target. The standard example is: When you say "hot dog," are you referring to the food, or is your dog hot? So the more personalized the information, the better the targeting. We also have done extensive engineering work with Google Analytics to understand why people click on ads. That way we can actually look at the purchase and go back and see what buyers did to get there. That is the holy grail in advertising, because advertisers don't advertise just to advertise, they advertise to sell something. Google CEO Eric Schmidt (Fred Vogelstein, Wired Magazine, 4/9/2007)

THE FINE LINE BETWEEN PRIVACY AND PUBLIC DOMAIN
While Google may certainly be well within its
rights to collect some degree of data from their
users, one needs to determine where public domain
ends and personal privacy begins. In order for a
degree of privacy to exist on the World Wide Web,
one first must chart its boundaries. Definitions of
Internet privacy come on many different levels
and in many different flavors. Ali Salehnia, author
of Ethical Issues of Information Systems (2002),
defines Internet privacy as "the seclusion and freedom from unauthorized intrusion." Salehnia goes on to emphasize the word "unauthorized,"
explaining that the casual end-user understands
that his or her personal information is constantly
being collected by data centers around the world
to some degree. The point of unauthorized intrusion occurs as soon as the end-user is no longer
aware that this data collection is happening.
Unauthorized intrusion comes on several levels. On the highest level, an unauthorized intrusion of personal information occurs when the perpetrator directly accesses a computer, stealing as much or as little information as they please. This could be achieved in a variety of ways, from the perpetrator physically breaching personal data by sitting down and simply using a computer, to slipping onto a system due to a lack of network security or other vulnerabilities. But most privacy breaches are a lot more subtle than that, and occur much more often than Internet users would like to think.
Putting Salehnia's concept in a real-world example, a user who enters their credit card on Amazon.com typically understands that they are giving that merchant quite a bit of information about themselves, much the same as if that user were to walk into a Macy's and make a similar purchase. On the other end of the spectrum, a casual end-user who utilizes Google several times a day may come to find out three years later that every search they have ever made was logged, time-stamped, and potentially used for Google's own business benefit. The latter, of course, is an illustration of unauthorized intrusion on the Internet, and it is happening millions of times a day.

A BRIEF HISTORY OF CONTEMPORARY SEARCH ENGINES
End users were not always as comfortable with the
Internet as they are today. Shopping on Amazon.
com, running a Google search, and online banking
are second-nature to most Americans, but back
in the 1990s, they were not. In a 1992 Equifax
study, 79% of Americans noted that they were
concerned about their personal privacy on the
Internet. Moreover, 55% of Americans felt that
privacy and the security of personal information
would get worse in the new millennium (Salehnia,
2002). "A March 1999 Federal Trade Commission (FTC) survey of 361 Web sites revealed that 92.8% of the sites were collecting at least one type of identifying information," explains Salehnia. Examining this statement more closely, more than nine out of every ten Web sites not only detected but stored a visitor's name, address, IP, or something else that is both unique and identifying. That was seven years ago. Technology has become much more precise and invasive, under the rationalization that some of these intrusions are necessary for convenience and the advancement of technology.
It was in the mid-1990s when several of the world's most well-known search engines began
appearing online. Yahoo! was one of the first big
search engines. Debuting in 1995, Yahoo! was
the result of Stanford University students David
Filo and Jerry Yang creating a simple homepage
that listed their own favorite Web sites. Next to
each URL, they included the site's full name
and a short description. As they began manually
indexing more sites, Yahoo! blasted off. It gained
a large fan base, which in turn led to major funding. Also making their debut in the mid-1990s were Excite, Lycos, and AltaVista, all three of which are still around presently.
In 1996, two other Stanford University graduate students, Larry Page and Sergey Brin, launched
BackRub. It was a system that organized Web
sites using a proprietary algorithm designed by
Page and Brin. This algorithm measured how many Web sites link to a given Web site. The more backlinks a page has, the higher its presumed quality; and the higher the quality, the higher up on the SERPs the page appears. Page and Brin opened shop in a garage, turned BackRub into "Googol" several months later, and ultimately ended up with "Google" thanks to a misspelling on their first check from a venture capitalist. Google is presently one of the most viewed, most utilized Web site portals on the Internet, and is growing in virtually every way every single day. It is the big brother of the World Wide Web, and any change it makes affects virtually every Web site in existence. Google grows by the day, but not only in terms of features and pages indexed. Google's data reservoir from its users' searches is one of the largest collections of data in history.
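The backlink idea behind BackRub became the basis of PageRank. As a minimal sketch, assuming a toy link graph and the conventional damping-factor formulation (not Google's production ranking, which weighs many additional signals), the core iteration can be written in a few lines of Python:

```python
# A minimal PageRank sketch: a page's score is built from the scores of
# the pages that link to it, so "more (and better) backlinks" means a
# higher rank. The damping factor and iteration count are conventional choices.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new_rank
    return rank

# Hypothetical link graph: each page maps to the pages it links out to.
graph = {"a": {"b", "c"}, "b": {"c"}, "c": {"a"}, "d": {"c"}}
print(pagerank(graph))  # "c" ends up ranked highest: it has the most backlinks
```

In this toy graph, page "c" receives the most inbound links and therefore the highest score, which is the intuition the chapter describes: backlinks act as votes of quality.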

SURVIVAL OF THE FITTEST
Companies, and Web sites in general, need search engines to survive. Search engines cannot be ignored or avoided. Professional Web sites are constructed so that they are search engine optimized (SEO), thereby improving their chances of a timely and deep indexing
by the popular search engines. Typical search
engine optimization trends are defined by the
major search engines. Google, for instance, has
largely done away with scanning HTML Meta
tags, which used to define Web site keywords
and details right in the HTML code, in favor of
scanning a given Web site for concise keywords,
themes, design style, inbound and outbound
links, even a visible privacy policy. In the age of
information, when Google hiccups, every major
Web developer hears it.
Search engines bring targeted traffic, which in turn converts to sales, which creates a client base. Many Web sites would flounder in the darkness of cyberspace without search engines. Lawrence M. Hinman (2006), director of the Values Institute at the University of San Diego, explains the philosophy of the search engine best: "Search engines have become, in effect, the gatekeepers of knowledge. If you don't show up in Google, you don't exist; indeed, if you don't show up on the first page of Google, you hardly exist."
Because of the reliance on search engines, on both personal and business levels, Internet users will never stray from using them. Alexa.com, one of the Web's most respected traffic pattern analysis companies, reports that the average Internet user visits google.com seven times a day. According to Alexa, google.com receives 75-100 million hits a day, and that is not to mention any visits to its foreign mirrors or other services. Search engines are here to stay, and a level of vigilance must be maintained as to just how big they become and how they utilize the sheer amounts of data they collect.


WHAT'S THE HARM?
What is the harm in having Google log 150 million searches a day? In Elinor Mills' CNET news article "Google Balances Privacy, Reach" (2005), she explains some of the extreme, but plausible, scenarios that could arise: "The fear, of course, is that hackers, zealous government investigators, or even a Google insider who falls short of the company's ethics standards could abuse [this] information. Google, some worry, is amassing a tempting record of personal information, and the onus is on the Mountain View, California, company to keep that information under wraps." Furthermore, there is no way a user can delete their data from Google's records, nor is there any sort of clear explanation of what the information could potentially be used for in the future.
According to GoogleWatch, a non-profit organization that has as one of its goals the spreading of awareness of search engine privacy, there are several main points regarding the potential privacy exploits and other harm Google could wreak. The first point it brings up is Google's mysterious 30-year cookie. Google places a cookie with an expiration date of January 17, 2038, 2:14:05 PM on the hard drive of any user that makes a search or visits a Web site containing AdSense advertisements (Google's publisher network). The cookie contains a uniquely identifying serial number which can be used to correlate with other searches. In essence, it is another IP address, a second layer of identification Google can use to string together a user's search queries or site viewing trends.
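For illustration only, a long-lived, uniquely identifying cookie of the kind described above can be sketched with Python's standard library; the cookie name, value format, and domain below are hypothetical and are not Google's actual cookie contents:

```python
from http import cookies
import secrets

# Build an illustrative long-lived preference cookie carrying a unique ID.
cookie = cookies.SimpleCookie()
cookie["PREF"] = f"ID={secrets.token_hex(8)}"                # random serial number (hypothetical format)
cookie["PREF"]["domain"] = ".example-search.com"             # hypothetical domain
cookie["PREF"]["path"] = "/"
cookie["PREF"]["expires"] = "Sun, 17 Jan 2038 14:14:05 GMT"  # far-future expiry, as described above

# This prints the Set-Cookie header a server would send with its response.
print(cookie.output())
```

Because the browser returns this cookie with every subsequent request to the same domain, the unique ID lets a server tie together searches made from the same machine even when the IP address changes.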
The second breach of privacy, GoogleWatch argues, is the increasing reliance on IP addresses. As seen in the server log, Google includes a user's IP address, among other things, in each stored search query. IP addresses reveal not just a number but a location as well. This is called IP delivery based on geolocation and can serve several purposes. For one, when a user searches Google, the engine knows that user's country, state, and town. Advertisements are then tailored to these settings.
Finally, GoogleWatch alludes to some of Google's side projects. A notable project is the Google Toolbar, which was released in March 2002 and continues to stir a great deal of controversy regarding what Google captures and logs. Google captures a user's IP and sends a cookie when its search engine is used or when the user lands on a page with AdSense ads. The Google Toolbar, on the other hand, logs, records, and beams back to Google every site a user visits, regardless of whether or not that site is in any way affiliated with Google.
The Google Toolbar is just another example of modern convenience. The convenience to the end-user of having the most powerful search tool in the world in front of them at all times outweighs the possible privacy invasions it may pose. According to PC Magazine writer Cade Metz, "One of the advanced Toolbar features is a service called PageRank. With this activated, when you visit a Web site, a small PageRank icon in the toolbar gives you a rough indication of how popular the site is. If the site is particularly popular, you'll see a long green line. If not, you'll see a short green line. When this service is turned on, Google keeps a complete record of every Web site you visit." The Google Toolbar comes
set with the page-tracking feature turned on. The Google Toolbar privacy policy explains that "by knowing which web page you are viewing, the PageRank feature of Google Toolbar can show you Google's ranking of that web page." PageRank is part of the search algorithm patent developed by Sergey Brin and Larry Page that decides which pages appear higher than others in the SERPs.
Many users simply leave the feature turned on, while those in Web production or similar fields choose to leave it on as a convenient resource. This resource allows Web developers and Web aficionados to see in real time which Web sites have high PageRank and which do not. It is an invaluable resource to those in the field. The price paid, of course, is privacy. As a side note, out of the billions of Web pages out there, not many rank at the coveted PageRank 10. A few examples are: Google.com, Adobe.com, NASA.gov, Real.com, MIT.edu, and NSF.gov (November 2006).
Another clash of convenience and privacy within the Google realm comes under its Google Earth application. Google's popular driving directions and maps Web site launched one of the company's most controversial and privacy-threatening features to date in June of 2007. Dubbed "Street View," it is a feature that allows end-users to view 360-degree, panoramic, zoomable images from street level. While the images are not live, they still captured people in action at a certain point in time. Some of the images are clear enough to easily identify individuals. Google was bombarded with complaints by privacy advocates, and has since instituted a policy of blacking out images of people upon request. The Street View feature took loss of privacy on the World Wide Web to a new, much more tangible level.

SEARCH ENGINE LOGGING
Google certainly was not the first search engine, but it is the one that has made the largest impact on the Internet. Any change made by Google is felt not only by the Webmasters of the Internet, but by virtually the entire World Wide Web. Google's reach is far and strong. Google engineers are constantly improving indexing technologies to make search results more accurate. This in turn brings in more users, which brings more traffic, which equals more data to be collected and utilized. The more traffic, of course, also means a higher flow of clicks on Google's advertising network. The more clicks, the higher the likelihood of Google selling users its advertisers' products via AdWords.
Google tracks it all: every search, every query, right down to each and every letter and punctuation mark. Google stores upwards of 150 million queries a day from all over the world. What exactly
does this data look like? In its most primitive form, a search for "colleges" would yield the following piece of data in Google's logs:
[IP address] - [day]/Nov/[year] [time] - http://www.google.com/search?q=colleges - Firefox [version]; Windows XP SP[n] - [unique cookie ID]

Clearly marked in this simple string of information is the end-user's Internet protocol address, date and time of access, the URL complete with keywords (search?q=colleges), browser and browser version, operating system, and unique user-identifying cookie. Google generates and stores over one billion of these strings a week, which equals approximately fifty billion a year. If Google has not deleted any of these queries since it began logging, then one could estimate that it currently stores upwards of half a trillion of these strings within its servers.
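As a sketch of what handling such records might involve, assuming a simplified, hypothetical log format with the fields just listed separated by " - " (every value below is made up for illustration), one can parse a record and reproduce the volume arithmetic:

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical log record with the fields described above,
# separated by " - " (not Google's actual log format).
record = ("123.45.67.89 - 25/Nov/2006 10:15:39 - "
          "http://www.google.com/search?q=colleges - "
          "Firefox 1.0.7; Windows XP SP2 - 740674ce2123e969")

ip, timestamp, url, agent, cookie_id = record.split(" - ")
query = parse_qs(urlparse(url).query)["q"][0]   # -> "colleges"
print(ip, timestamp, query, agent, cookie_id)

# Back-of-the-envelope volume estimate from the figures in the text:
# one billion records per week is roughly fifty-two billion per year.
records_per_week = 1_000_000_000
print(records_per_week * 52)        # ~52,000,000,000 records a year
```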
Google provides a rather by-the-book explanation as to why it logs every search. In a CNET news article, Elinor Mills asked Google executive Nicole Wong about the need for such a collection of data. The answer: Google uses the log information to analyze traffic in order to prevent people from rigging search results, to block denial-of-service attacks, and to improve search services. This data is then stored on many servers, for an unknown period of time. According to David F. Carr of Baseline Magazine, Google is believed to have anywhere from 150,000 to 450,000 servers around the world, with the majority being in the United States.
Google's privacy policy, which clearly outlines all of its data collection practices, is a mere two clicks away from any site on its entire network. The search engine policy is divided up into nine clear, simple, easy-to-understand sections that span no more than two screens and clock in at approximately 980 words. The policy makes it clear that no personally identifiable information is ever released without a court warrant. However, anything typed into a search box is certainly fair game and very much becomes the property of Google.
Google's privacy policy clearly states that Google "may share information about you with advertisers, business partners, sponsors, and other third parties." According to its own policy, Google shares statistical data regarding search trends with groups of advertisers and affiliates. With information like this, an advertiser could potentially know how best to optimize their own Web sites to increase traffic from Google or to appear significantly higher in the SERPs. Obviously, the more traffic a given site receives, the greater the probability of a conversion, sale, or lead.

GOOGLE KNOWS IT ALL
Users type in what they need to find. Typically, this has much to do with what they are thinking, planning on doing, or something that is simply a part of their life. For instance, Google may log IP 123.45.67.89 as having 17 out of 20 searches on a given day having to do with high blood pressure and treatments. Clearly, Google now knows that IP 123.45.67.89 has a user who either has this ailment or is researching it on behalf of a friend or relative that does. Kevin Bankston, staff attorney for the Electronic Frontier Foundation, one of the Internet's biggest privacy advocates, explains that data like this is practically "a printout of what's going on in your brain: What you are thinking of buying, who you talk to, what you talk about."
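A hedged sketch of how such an inference could be drawn from raw query logs follows; the log entries, topic keywords, and threshold are hypothetical and are not Google's actual profiling method:

```python
from collections import Counter

# Hypothetical logged queries, keyed by the IP address that issued them.
queries_by_ip = {
    "123.45.67.89": [
        "high blood pressure symptoms", "blood pressure treatments",
        "low sodium recipes", "weather", "blood pressure medication side effects",
    ],
}

HEALTH_TERMS = {"blood pressure", "hypertension", "medication"}

def infer_interests(queries):
    """Count how many of a user's queries touch a sensitive topic."""
    hits = Counter()
    for q in queries:
        if any(term in q.lower() for term in HEALTH_TERMS):
            hits["health"] += 1
    return hits

for ip, queries in queries_by_ip.items():
    hits = infer_interests(queries)
    if hits["health"] / len(queries) > 0.5:   # crude threshold for illustration
        print(f"{ip}: likely researching a health condition "
              f"({hits['health']}/{len(queries)} queries)")
```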
Taking the example about high blood pressure one step further, Google is now able to potentially make money from user 123.45.67.89. First, Google will make money through its AdWords program. Google uses the search term to show advertisements that are as close to the search query as possible on the top and right sides of each SERP. While this is by no means a direct breach of privacy, the notion of scanning what an end-user enters to deliver targeted content is beginning to evolve. Take Gmail, for instance. Gmail does not scan a search query; it scans private e-mail messages. The process is automated, of course. That is to say, there is not a Google employee sitting in a cubicle somewhere reading the personal e-mails of Gmail users. Messages are scanned by a bot. Gmail took content-targeted, personalized ad delivery to a whole new level, a level which is simply the precursor for even more invasive advertising yet to come.

AdWords
The value of the sheer amount of data Google has amassed is priceless. It is data that can prove trends, display top-ranking search queries, illustrate which region searches for which item more; the list is long, but not as long as the list of marketing groups or advertisers that would do anything to have such data. Should the data Google has amassed ever be released, the damage could be devastating. In the summer of 2006, AOL released the search queries of 528,000 of the users of AOL's generic search engine. While not nearly as detrimental as it could have been had it been Google releasing data, the damage was already done. People were openly identified by their search terms; several were even convicted of crimes and sentenced to prison, as their search queries either supported evidence of their wrongdoings or put them in the spotlight and exposed them for committing a crime.
Google uses much of its data to sculpt its pride and joy: AdWords. AdWords accounts for nearly $3.5 billion in yearly revenue, and is the main stream of Google's income. Google utilizes its users' search queries to aggressively and competitively price keywords. Advertisers then bid on these keywords for positions on the top and right side of the SERPs. The more relevant the ad, the more likely a targeted searcher will click on it. While this is hardly an invasion of end-user privacy, the concept of AdWords utilizing search trends and scanning text given by the end-user is the precursor to more invasive and privacy-exploiting advertising practices.
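As a simplified illustration of the bidding dynamic described above (real AdWords auctions also involve quality scores, budgets, and pricing rules not covered here), ads for a keyword might be ordered by bid weighted by expected relevance:

```python
# Hypothetical advertisers bidding on the keyword "colleges".
# Ads are ordered by bid weighted by estimated click-through rate,
# so a relevant ad can outrank a higher bid from a less relevant one.
ads = [
    {"advertiser": "A", "bid": 1.50, "est_ctr": 0.010},
    {"advertiser": "B", "bid": 0.90, "est_ctr": 0.030},
    {"advertiser": "C", "bid": 2.00, "est_ctr": 0.005},
]

ranked = sorted(ads, key=lambda ad: ad["bid"] * ad["est_ctr"], reverse=True)
for position, ad in enumerate(ranked, start=1):
    print(position, ad["advertiser"], round(ad["bid"] * ad["est_ctr"], 4))
# Advertiser B wins the top slot despite the lowest bid, because its ad
# is the one searchers are most likely to click.
```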

Gmail: AdWords Evolve
Gmail, Google's free Web-based e-mail service, gives users who sign up their own Google e-mail address and nearly three gigabytes (and counting) of storage. It has a lightning-fast interface and some of the most advanced features ever seen in Web-based e-mail. But all of the flashy features are there for a reason: to once again drive people in. Once more, Google is freely handing out convenience for the small price of highly personalized, content-targeted advertising.
This time around, Google is scanning e-mail messages for certain keywords. For instance, if Google sees the words "ice cream" in a private e-mail, advertisements for ice cream will appear on the right side of the user's e-mail application. According to the Gmail privacy policy: "Google maintains and processes your Gmail account and its contents to provide the Gmail service to you and to improve our services. The Gmail service includes relevant advertising and related links based on the IP address, content of messages and other information related to your use of Gmail."
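The keyword scanning just described can be sketched in a few lines of Python; the trigger keywords and ad inventory below are hypothetical, and Google's actual content analysis is far more sophisticated:

```python
# Hypothetical inventory mapping trigger keywords to ads.
AD_INVENTORY = {
    "ice cream": "Gelato World: two scoops for the price of one",
    "mortgage":  "LowRate Lending: refinance today",
}

def match_ads(message_text):
    """Return ads whose trigger keywords appear in the message body."""
    text = message_text.lower()
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in text]

email_body = "Let's pick up an ice cream cake for Sam's birthday on Friday."
print(match_ads(email_body))   # -> ["Gelato World: two scoops for the price of one"]
```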
Elinor Mills of CNET explains the other unsettling fact about Gmail: that Google takes its time to erase trashed messages from its servers. According to Mills, Gmail users can delete messages, "but the process isn't intuitive. Deletion takes multiple steps to accomplish and it takes an undetermined period of time to delete the messages from all the Google servers that may have a copy of it."
Since Gmail launched in 2004, its automated scanning of private e-mail messages has stirred a bit of controversy. The GoogleWatch organization argues that Gmail may be compiling a database of which e-mail addresses use which keywords. What further complicates Gmail's e-mail privacy is the fact that people who do not even subscribe to the Gmail service are having their e-mails scanned as well. If a Gmail user receives an e-mail from a Comcast e-mail address, the person with the Comcast e-mail is having their message scanned for advertising opportunities by Google. The sender does not agree to Google's invasive scanning, nor do they agree to have their e-mail infiltrated with ads pertaining to what they wrote in private. Gmail furthers the notion of invasive and targeted advertising, and merely builds some of the framework for the forms of advertising Google may be capable of in the future.

THE FUTURE OF THE SEARCH ENGINE INDUSTRY
Predicting the future trends of search engines
is like predicting the future trends of any other
industry: it's purely hypothetical. Search engines
are businesses. From time to time, like any other
business, they drop hints of their future plans.
Newspapers carry the latest headlines and trade
journals chart their latest strategic business moves.
The best we can do is to connect the dots using
this information. Things such as acquisitions,
however, may give the best glimpse into the future
of a particular search engine.
Take, for instance, Google's acquisition of YouTube. When Google acquired YouTube in October of 2006 for $1.65 billion, it sent ripples through the industry. Google, worth almost $170 billion in stock, swallowed YouTube whole. After all, in the grand scheme of things, YouTube is to media what Google is to information. Aside from being one of the largest technology acquisitions ever, this showcased Google's might. But above all, this acquisition dropped a hint of where Google is going.
From Google's press release: "The acquisition combines one of the largest and fastest growing online video entertainment communities with Google's expertise in organizing information and creating new models for advertising on the Internet. The combined companies will focus on providing a better, more comprehensive experience for users interested in uploading, watching and sharing videos, and will offer new opportunities for professional content owners to distribute their work to reach a vast new audience." (Source: Google press release, October 9, 2006)
One of the main motivations behind Google's acquisition of YouTube is laid out right in its press release: "creating new models for advertising on the Internet." The past few years have given birth to a new hybrid form of Internet-based advertising: viral marketing. Although viral marketing has been around for a decade, it was never a household term or a fully utilized marketing method until recently. Viral marketing encompasses any advertising effort that quickly spreads (typically over the Internet) through social networks and becomes the talk of the town. Viral videos and multimedia usually depict unique, comical, or risqué material that has exceedingly unusual appeal to certain niche audiences. A successfully executed viral campaign costs next to nothing, but can create the same results as paid print or media advertising.
What makes the opportunity of viral marketing so enticing to Google? With millions of hosted videos, and upwards of 70,000 new videos added per day, YouTube controls a large portion of the Web's video-viewing audience. Combined with Google's analytical and marketing arms, the Google/YouTube combo makes for a serious advertising platform. If a popular viral niche is found by Google, it can ride that niche by inserting its own advertising campaign or that of one of its advertisers via AdWords. As viral videos spread like the plague through social networks, Google may be able to control what messages or advertisements those videos hold.
Viral marketing aside, imagine for a moment just running a basic search for a particular car model. Instead of Google just returning the usual search results and accompanying AdWords, the search engine can now return streaming commercials targeted precisely to the make and model. Just as logs are kept of how many searches a particular user makes, logs of which commercials play more often can also be kept and utilized in a similar capacity. But above all, when end-users watch one of these advertisements they have actively sought it out, rather than passively watching a random commercial on TV. These Internet searchers are more targeted consumers than the passive couch potato.
Google acquiring YouTube is only one of the dozens of major ventures either recently completed or in the works by major search engines. Search engines are expanding their presence online as well as off-line. The future of search engines may be somewhat ironic. During the dot-com bubble of the 1990s, businesses were doing everything possible to get online and to expand their presence there. Now, we see well-established online entities wanting to move off-line. For the online powerhouses, the off-line world is an uncharted wilderness of opportunity and new audiences.
Take Google, which has paved the way for Internet advertising and is now researching something foreign to it but commonplace to traditional marketers: radio and television. Google AdWords makes it extremely simple for a marketing novice to launch an advertising campaign online. Through its simple interface, all one must do is create an ad, choose a target audience, and pay using a credit card. Google uses its stockpile of end-user search query data to run the advertisements as precisely as possible, which in turn gives the advertiser sales or conversions.

This degree of simplicity is something that never existed for marketers wishing to easily advertise on TV or radio. To advertise in one of these mediums, one must traditionally go through the complicated steps of calling the cable company, creating an advertisement with a third-party media company, and working with another third-party marketer to properly target the ads. If Google can properly take the complexity out of running advertisements on radio and television for the average person (especially in the age of Webcams and YouTube), it could potentially reshape the landscape of the TV and radio advertising industry.
Why does the future of search engines, especially if their main focus is the off-line world, matter? Simply put: because the data they begin collecting off-line will be migrated into the online world. If search engines can one day record the shows we watch, the songs we listen to, and the advertisements we actively (as opposed to passively) watch, there will be no end in sight for the types of overly invasive advertising methods used.

THE FUTURE OF GOOGLE ADVERTISING
Today, the vast majority of our revenue is in text ads correlated with searches. In the last couple of years, we have developed what are called display ad products, including banner ads, video ads, click-to-call ads, and things like that. And I've also said that we are pursuing the possibility of television advertising. By that I mean traditional television advertising. And we bought dMarc Broadcasting to do radio ads.
So let's rank the probability of them being affected by targeted ads. There's search: That's 100 percent affected. What about radio? Is it possible to get a targeted ad right to your car right now? Not yet, because we can't target the individual receiver in your car. If two cars are next to each other, the same radio station cannot have two different ads. However, if it's at a regional level we can do it to the zip code level. So let's call that partial targeting.
Now, let's look at television. Every one of the next generation of cable set-top boxes is going to get upgraded to an IP-addressable set-top box. So all of a sudden, that set-top box is a computer that we can talk to. We can't tell whether it's the daughter or the son or the husband or the wife in a household. All we know is we're just talking to the television. But that's pretty targetable because family buying patterns are pretty predictable, and you can see what programs they're watching. And if you're watching a YouTube video, we know you're watching that video.
My point of going through this little treatise is to say, if the total available market is $600 billion to $800 billion, we won't be able to target all $800 billion. It will not be 100 percent perfectly targetable, straight into your brain, but we should be able to offer a material improvement (in response rates) to many businesses. Eric Schmidt, Google CEO (Fred Vogelstein, Wired Magazine, 4/9/2007).
AdWords is not only migrating into other services in Google's network, but it is evolving technologically as well. During the summer of 2006, Google launched two new services which further enhanced AdWords. First, it launched its own payment processor, dubbed Google Checkout. Advertisers that elect to use Google Checkout have a small shopping cart icon next to their advertisement. Google Checkout brings AdWords even closer to one-click purchasing. A Google search engine user that makes a purchase through Google Checkout surrenders their name, address, and credit card number to Google to make that purchase. While financial information is no doubt safe, Google now has the ability to connect a user's searches to their buying account, if they elect to, of course. Also launched in the summer of 2006 was AdWords click-to-call. With the click of a button, a user may now talk to a salesman or representative on the other side of the advertisement.
A second advancement in Google's technology came with the 2005 launch of Google Desktop, which only began seeing stable releases in late 2006. Google Desktop took Google's indexing of files one step further. Like the Google Toolbar, this service was a downloadable software package. Essentially, Google Desktop records a user's e-mail, photos, music, and all other files so that they are easily searchable. Once more, Google drew in a fan base for this product, as the search capability it engineered proved far more powerful than the built-in Windows search. Google Desktop lets Google have full access to a computer, and to other computers that system may be networked with, unless a particular setting in the program configuration is turned off.
In February of 2006, the Electronic Frontier Foundation distributed a press release expressing its outrage against Google's latest venture. The release stated: "Google announced a new feature of its Google Desktop software that greatly increases the risk to consumer privacy. If a consumer chooses to use it, the new 'Search Across Computers' feature will store copies of the user's Word documents, PDFs, spreadsheets, and other text-based documents on Google's own servers, to enable searching from any one of the user's computers." The release of, and technology driving, Google Desktop is yet another precursor to the potential future plans Google has in store for data collection and targeted, intrusive advertising.
As if Google's reach were not far and impressive enough, the next stop for Google might be the airwaves. Google has been known for some time to be toying with the idea of becoming a full-blown Internet service provider, in the form of a massive WiMax or Wi-Fi network. This would give it more leverage to deliver further targeted advertising and appeal to an even larger base. The notion came about early in 2005, when the city of San Francisco sent out a request for information about a free city-wide high-speed wireless network. Google originally showed much interest.
PC World Magazine describes Google's pitch to San Francisco as a system that would deliver both free access and the ability to upgrade to faster access. PC World writer Stephen Lawson explains: "Google proposed to build an IEEE 802.11b/g Wi-Fi mesh network that delivers more than 1 megabit per second of capacity throughout the city. Anyone in the city could get access free at speeds as high as 300 kilobits per second, and Google or third parties could sell access at higher speeds, possibly as high as 3 Mbps."
By entering the world of Wi-Fi and WiMax, Google secures itself a new stream of potential income, while showing off its might to other media conglomerates. By having a free WiMax network, Google also can get into more homes and places that otherwise may never have been able to afford Internet access. The keyword, of course, is "free." Jeff Chester, writer for The Nation Magazine, explains that "the costs of operating the free service would be offset by Google's plans to use the network to promote its interactive advertising services." In other words, Google generates even more (free) traffic that in turn uses its services, generates more data, and makes Google more money. Google is currently up against other big providers such as EarthLink, which also wants to develop a massive Wi-Fi network in San Francisco.
Along with the potential implementation of city-wide wireless networks comes the ability for mobile phones to use these networks instead of the traditional cellular networks. Imagine, for a moment, placing a phone call to a friend some years down the road. The mobile phone is any one of your typical brand-name phones, but it is not one that belongs exclusively to any network like Verizon, Sprint, or Cingular. Instead, it uses voice over IP technology over a nationwide, ubiquitous Google Wi-Fi network. The phone call is going over the Internet, much the same as it can today. Presently, VoIP-based phones must be attached to a computer or used wirelessly within close proximity of an access point. The ability to freely roam does not exist. In the future, the phone in this example may be used at the beach, in a restaurant, or in a department store, much like an ordinary cell phone. With high-speed WiMax and city-wide Wi-Fi on the horizon, cell phones that rely on VoIP have the potential to become a popular commodity.
Now imagine for a moment a conversation taking place over this network. The conversation has to do with getting an ice cream cake for a friend's upcoming birthday. If the Wi-Fi or VoIP network the phone is hooked into is run by Google, for instance, it may have similar technology to scan spoken words in the same fashion as typed words on Gmail. As soon as "ice cream" is said in the conversation, a small tone beeps, lighting up the phone's screen. On the screen is the nearest location of an ice cream parlor. If Google can scan what users type in their personal e-mails, why not move into audible conversations? By offering such telecommunications services for free (or significantly cheaper than the cellular alternative), Google again increases its customer base and exposes more users to its network of advertisements. The sacrifice of privacy for convenience will evolve with the technology.

IN GOOGLE WE TRUST
In a world where technology keeps creating conveniences, we must sometimes hit the brakes so we can make sure nothing is getting missed. Google is amassing one of the largest collections of search data in history. It remains to be seen whose hands this data will end up in, or how Google will use it to its maximum advantage. Services like AdWords and Gmail are slowly pushing the notions of invasive and extremely targeted advertising, and future innovations are sure to bring even more invasive advertising. While the vast majority of casual Internet users either do not know Google's data collection policies or simply do not care, at the end of the day it comes down to the simple fact that we as a society must put our trust in the technological innovations that have become commonplace conveniences.
We live in an interesting time. Wires are disappearing and high-speed Internet will be in the air all around us. Media conglomerates will go up against each other for control of the wireless Internet airwaves. Online giants will migrate off-line, and more off-line entities will move online. Privacy will always be a question, but smart surfing and awareness can help to reduce unwanted intrusion. Stephen O'Grady, the blogger behind the Tecosystems blog, was quoted by PC Magazine writer Cade Metz: "Google is nearing a crossroads in determining its future path. They can take the Microsoft fork, and face the same scrutiny Microsoft does, or they can learn what the folks from Redmond have: Trust is hard to earn, easy to lose and nearly impossible to win back."

REFERENCES
Bankston, K. (2006, February 9). Press releases: February 2006. Electronic Frontier Foundation. Retrieved November 11, 2006, from http://www.eff.org/news/archives/2006_02.php

Brandt, D. (n.d.). Google as big brother. Retrieved November 11, 2006, from http://www.googlewatch.org

Carr, D. (2006, July 6). How Google works. Retrieved November 17, 2006, from http://www.baselinemag.com/article2/0,1397,1985040,00.asp

Chester, J. (2006, March 26). Google's Wi-Fi privacy ploy. Retrieved November 14, 2006, from http://www.thenation.com/doc/20060410/chester

Gage, D. (2005, March 7). Shadowcrew: Web mobs timeline: Cybercrime. Retrieved November 1, 2006, from http://www.baselinemag.com/article2/0,1397,1774786,00.asp

Garfinkel, S. (2002). Web security, privacy & commerce (2nd ed.). Sebastopol, CA: O'Reilly & Associates.

Google corporate information: Google milestones. (n.d.). Retrieved November 25, 2006, from http://www.google.com/corporate/history.html

Google privacy center: Privacy policy. (n.d.). Retrieved November 10, 2006, from http://www.google.com/privacy.html

Google's advertising footprint. (2007, June 14). Retrieved July 21, 2007, from http://www.eweek.com/slideshow/0,1206,pg=0&s=26782&a=209549,00.asp

Hinman, L. (2006, March 16). Why Google matters. Retrieved November 7, 2006, from http://www.signonsandiego.com/uniontrib/20060316/news_lz1e16hinman.html

Internet usage world stats: Internet and population. (n.d.). Retrieved November 12, 2006, from http://www.Internetworldstats.com

Lawson, S. (2006, November 29). Google describes its Wi-Fi pitch. Retrieved December 1, 2006, from http://www.pcworld.com/article/id,123157-page,1/article.html

Metz, C. (2003, February 27). Is Google invading your privacy? Retrieved December 2, 2006, from http://www.pcmag.com/article2/0,4149,904096,00.asp

Mills, E. (2005, July 14). Google balances privacy, reach. CNET News.com. Retrieved November 7, 2006, from news.com.com/Google+balances+privacy,+reach/2100-1032_3

Report to Congressional Requestors. (2005). Information security: Emerging cybersecurity issues threaten federal information systems. Retrieved December 12, 2006, from http://www.gao.gov/new.items/d05231.pdf

Salehnia, A. (2002). Ethical issues of information systems. Hershey, PA: Idea Group Incorporated.

Thompson, B. (2003, February 21). Is Google too powerful? Retrieved December 12, 2006, from http://news.bbc.co.uk/2/hi/technology/2786761.stm

Traffic details for Google.com. (n.d.). Retrieved November 11, 2006, from http://www.alexa.com/data/details/traffic_details?q=www.google.com&url=google.com

Vogelstein, F. (2007, April 9). Text of Wired's interview with Google CEO Eric Schmidt. Retrieved July 15, 2007, from http://www.wired.com/techbiz/people/news/2007/04/mag_schmidt_trans?curren

ADDITIONAL READING
Battelle, J. (2005). The search: How Google and its rivals rewrote the rules of business and transformed our culture. Portfolio Hardcover. (ISBN 1-59184-088-0)

Brin, S., Motwani, R., Page, L., & Winograd, T. (1999). The PageRank citation ranking: Bringing order to the Web. Retrieved from http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf&compression=

Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Retrieved from http://dbpubs.stanford.edu:8090/pub/1998-8

Electronic Frontier Foundation (EFF): http://www.eff.org/issues/privacy

Google PageRank Patent: http://patft.uspto.gov/netacgi/nph-Parser?patentnumber=7058628

Google Privacy Policy: http://www.google.com/privacy.html

Search Engine Land Blog: http://searchengineland.com/

Google-Watch.org: http://www.google-watch.org

Search Engine Watch: http://searchenginewatch.com/

Malseed, M., & Vise, D. (2005). The Google story. Delacorte Press.

World Privacy Forum: Search Engine Tips: http://www.worldprivacyforum.org/searchengineprivacytips.html





Chapter II

A Taxonomic View of
Consumer Online Privacy Legal
Issues, Legislation, and
Litigation
Angelena M. Secor
Western Michigan University, USA
J. Michael Tarn
Western Michigan University, USA

ABSTRACT
In this chapter, consumer online privacy legal issues are identified and discussed. Following a literature review of consumer online privacy legislation and litigation, a relational model is presented to explore the relationship of the issues, legal protections, and the remedies and risks for not complying with the legal requirements. Two survey studies are used to reinforce the vital need for a stronger role by the government and business community, as well as privacy awareness on the part of online consumers themselves. The chapter concludes with a vital call for consumer privacy education and awareness and for government and legislators' attention and timely responses with legislation that protects consumers against those who would misuse the technology.



INTRODUCTION
Information privacy is defined as the right of individuals to control information about themselves (Richards, 2006). As the Internet becomes more popular and more people use it as a daily means of communication, information sharing, entertainment, and commerce, there are more opportunities for breaches of privacy and malicious attacks. There have been numerous bills introduced in the House of Representatives and the Senate in recent years attempting to legislate protections for consumers regarding online privacy. Many of these attempts at legislation fail to become laws. This study aims to examine consumer online privacy legal issues, recent litigation topics, and the present active legislation. The topic will be of interest because some of the legislation, such as the USA Patriot Act and the Homeland Security Act enacted after the terrorist attacks of September 11, 2001, does not provide more consumer protection but instead takes away consumer privacy. These laws give government more access to private information instead of providing consumers with increased protections.
Some relevant privacy issues are underage consumer protections, health information privacy, lack of consumer control over information stored in databases, information security breaches, and identity theft. Recent litigation in the United States in the information security area has been over the lack of protection of the information gathered from consumers and stored by companies. The Federal Trade Commission (FTC) has initiated lawsuits against companies not providing the level of information protection they should. The FTC charged Petco with Web site security flaws that allowed a structured query language (SQL) injection attacker to gain consumer credit card information (FTC File No. 032 3221, 2004). The FTC also charged BJ's Wholesale Club with failing to secure credit card magnetic stripe information appropriately (FTC v. BJ's Wholesale Club, Inc., 2005). There was also a class action suit filed on behalf of Banknorth, N.A. (Visa and MasterCard) charging BJ's Wholesale Club after hackers gained access to credit card information of cardholders and used the information fraudulently (FTC v. BJ's Wholesale Club, Inc., 2005). These instances are examples of companies failing to take proper measures to secure consumer information. The stolen personal information could have been gathered through an online interaction or a personal visit to the company. These examples show that it does not matter how a consumer interacts with a company, whether on the Web, in person, or on the phone; the company stores the information it gathers in databases on its systems, and all of that information is a target.
Current laws relating to consumer online privacy and protections are the U.S. Safe Web Act of 2006, the USA Patriot Act of 2001, the Homeland Security Act: Cyber Security Enhancement Act of 2001, the Federal Privacy Act of 1974, the Children's Online Privacy Protection Act of 2000, and the Health Insurance Portability and Accountability Act of 1996. The points pertaining to consumer privacy and protection will be included; not all parts of these laws may be applicable to the subject of this chapter.
In the following section, consumer online privacy legal issues are identified and discussed. Following a literature review of consumer online privacy legislation and litigation, the authors present a relational model to explore the relationship of the issues, legal protections, and the remedies and risks for not complying with the legal requirements. Two survey studies are used to reinforce the vital need for a stronger role by the government and business community, as well as privacy awareness on the part of online consumers themselves. The chapter concludes with a vital call for consumer privacy education and awareness and for government and legislators' attention and timely responses with legislation that protects consumers against those who would misuse the technology.




Table 1. Research in consumer online privacy legal issues (columns: Year, Author, Issue, Contribution)

2006 | Swartz, Nikki | Health Information Privacy
A survey of 1,117 hospitals and health systems conducted in January by the American Health Information Management Association (AHIMA) found that compliance with the three-year-old federal rules governing the privacy of patients' medical records declined in the past year. The survey results also suggest that patients are becoming more concerned about the privacy of their medical records: according to 30% of respondents, more patients are asking questions.

2006 | Vasek, S. | Information Privacy
Sensitive, confidential information found in court and probate records, deeds, mortgages, death certificates, and other civic records is posted online. The information is available because the records containing it are, by law, public records filed according to provisions of state statutes and maintained at taxpayer expense. In many instances these records contain health information, social security and Medicare numbers, birth dates, bank account information, prescription numbers, and information that could be used to steal individuals' identities.

2004 | Milne et al. | Information Security Breaches
Directly hacking into company databases and stealing personal or financial data, such as consumer credit card or social security information.

2004 | Milne et al. | Identity Theft
Using techniques such as IP spoofing and page jacking; using e-mail and a bogus Web page to gain access to individuals' credit card data and steal thousands of dollars from consumers; cyber-thieves who were able to access tens of thousands of personal credit reports online.

2004 | Milne et al. | Spyware, Malware, Viruses & SPAM
The installation of spyware distributed as viruses attached to e-mail makes it possible for third parties to view the content of a consumer's hard drive and track movement through the Internet; cookies that allow others to track clickstream history.

2003 | Bagner et al. | Underage Consumer Protection
The lack of supervision while online exacerbates a child's vulnerability to online violations of privacy. A 1998 survey revealed that 36% of parents admitted that they never supervised their child's use of or access to the Internet.

CONSUMER ONLINE PRIVACY LEGAL ISSUES
Table 1 summarizes the major research studies in consumer online privacy issues, which fall into the following six categories: information security breaches, information privacy breaches, identity theft and pre-texting, health information privacy, underage consumer protection, and spyware, malware, viruses, cookies, and SPAM. The following subsections discuss these six categories.

Information Security Breaches
Information security breaches include events such as hacker or SQL injection attacks on business or institutional networks resulting in stolen personal, financial, or medical information, interceptions of e-mail by unauthorized parties, and any breach of security not withstood by current security practices. Each of these security breaches can be disastrous for consumers. In the instance of FTC v. Petco, an unauthorized person outside the company was able to successfully break into the Petco information system through its Web site using structured query language (SQL) injection attacks and gained access to customer personal and credit card information that was being stored on the system in an unencrypted format (FTC File No. 032 3221, 2004). This security breach resulted in the compromise of customer information in the custody of the business that was not reasonably and appropriately protected against external threats.
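To illustrate the class of flaw at issue, here is a generic sketch (not the actual Petco code) of how an SQL injection works and how a parameterized query avoids it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, card_number TEXT)")
conn.execute("INSERT INTO customers VALUES ('Alice', '4111111111111111')")

user_input = "x' OR '1'='1"   # a classic injection payload entered into a Web form

# VULNERABLE: the input is concatenated into the SQL text, so the payload
# rewrites the WHERE clause and returns every customer's card number.
rows = conn.execute(
    "SELECT card_number FROM customers WHERE name = '" + user_input + "'"
).fetchall()
print("vulnerable query returned:", rows)

# SAFER: a parameterized query treats the input purely as data.
rows = conn.execute(
    "SELECT card_number FROM customers WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query returned:", rows)   # -> [] (no such customer)
```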
Besides information being threatened by external entities, it can be threatened by internal entities as well. Security breaches by employees or agents of companies retaining consumer information occur all too often. Although there are no good statistics to quantify the amount of loss, this is still a threat to information security. Businesses need to studiously monitor employee access to sensitive information and have enforceable privacy policies in effect. Policies should specify what information access is necessary for which job positions, how the information is handled and disposed of if necessary (shredded), and removal of access immediately upon employee dismissal. Policies may also include minimum password security measures.
Also included in information security breaches is the interception of e-mail or unencrypted files. According to AnonIC.org, e-mail "can be intercepted at each step along the way. (The) E-mail message is stored on two servers on its way at least: on sender ISP mail server and on recipient ISP mail server. When traveling through the MX hosts, message is stored on each of MX hosts" (E-mail Security and Anonymity, 2004). Depending on where the e-mail message is going, it could be stored on many servers along the way. There is opportunity at many points for an experienced attacker to gain access to e-mail messages and the files attached to them.

Information Privacy Breaches
Information privacy breaches include any release of information by a business or institution that is not compliant with privacy practices defined by law or within the business's privacy statement. Such occurrences could be purely accidental, but they highlight the need for businesses and institutions to pay careful attention to the laws and to make sure that harm to consumers does not result from their actions.
An example of an information privacy breach would be the selling of customer information to third-party vendors without notification to the consumer, through a privacy statement, that such a release of information could occur. In 2003 the FTC brought charges against CartManager, Inc., a Web-based shopping cart software company, for selling to third parties the personal information of nearly one million consumers. The FTC asserted that consumers reasonably expected that the merchants' privacy policies, rather than CartManager's, covered information consumers provided on the shopping cart and checkout pages of the merchants' Web sites (Ambrose & Gelb, 2006).

Identity Theft and Pre-Texting
Identity theft is when someone uses another's personally identifying information, like names, social security numbers, or credit card numbers, without permission, to commit fraud or other crimes (ftc.gov, 2006). Online consumer privacy issues such as identity theft can occur through many channels. One example is phishing, where consumers receive unsolicited e-mails requesting information, and they unwittingly provide private information, which is then used maliciously. The theft can also occur when a company does not secure electronic information appropriately or when employees steal the information. Identity theft can also occur in an off-line situation where documents containing personal information are not disposed of properly by either a business or the consumer themselves. Many businesses have policies regarding disposal of confidential or sensitive information, specifying that the documents must be shredded, but there are many instances in the recent past where businesses have been caught not complying with their policies, and documents containing personal information were not shredded.
Consumers themselves must take steps to
prevent identity theft (see suggestions on FTC.gov
Web site). In addition to common ways thieves
steal identities, there is an activity called pre-texting. According to the FTC (FTC.gov, 2006), pre-texting is "the practice of getting your personal information under false pretenses," which uses a variety of tactics to obtain personal information.
For example, a pre-texter may call, claim he is
from a research firm, and ask you for your name,
address, birth date, and social security number.
When the pre-texter has the information he wants,
he uses it to call your financial institution. He
pretends to be you or someone with authorized
access to your account. He might claim that he
has forgotten his checkbook and needs information about his account. In this way, the pre-texter
may be able to obtain other personal information
about you such as your bank and credit card account numbers, information in your credit report,
and the existence and size of your savings and
investment portfolios (FTC.gov, 2006).

Health Information Privacy


Health information privacy is of great concern to many consumers. HIPAA, the Health Insurance Portability and Accountability Act of 1996, requires healthcare providers to adopt standards "designed to facilitate the development of a uniform, computer-based healthcare information system, while protecting the security of the information and the privacy of its subjects" (Michael & Pritchett, 2001). The concern comes in having personal
health information available in a computer-based
healthcare information system. If the information
is housed in a computer network, then there are
concerns over security of the information. The
legislation demands security of the information,
but even with great security, information can still
be misused, stored improperly, accessed inappropriately, and disposed of incorrectly as with
any electronic information.
According to OCR (2005), "The Privacy Rule protects all individually identifiable health information held or transmitted by a covered entity or its business associate, in any form or media, whether electronic, paper, or oral. The Privacy Rule calls this information protected health information (PHI)." HIPAA also includes provisions setting standards for electronic transactions and standards for the privacy of individually identifiable health information (Michael & Pritchett, 2001). The HIPAA legislation demands
that providers use electronic transactions in addition to computer-based information systems, which essentially covers all facets of healthcare interactions within their organizations and with external entities. In addition, HIPAA regulates
the use and disclosure of personally identifiable
information and demands patients be given a
privacy statement detailing how the provider or
institution will use their information.

Underage Consumer Protection


The growth of computer use in school systems is
educating children in the use of computers and the
Internet. Children then have knowledge of how
to browse the Internet and interact with online
entities. In addition, many children have access
to computers at home and can use the Internet
from there as well. School computers have content-blocking software installed so that the school can filter the content returned by online searches and block access to Web sites known to contain inappropriate content, for the safety of the children; but the same content-filtering software may not be installed on every computer accessible by children. In addition to access concerns, there
are concerns over how the online entities with
which children interact behave. According to the FTC (FTC.gov, 2007), Congress enacted COPPA (the Children's Online Privacy Protection Act) in 1998 "to address privacy and security risks created when children under 13 years of age are online." COPPA "imposes requirements on operators of Web sites and online services directed to children, as well as other operators with actual knowledge that they have collected personal information from children."
The primary goal of COPPA and the Rule is
to place parents in control over what information
is collected from their young children online. The Rule was designed to protect children under age 13 while accounting for the dynamic nature of
the Internet. The Rule applies to operators of
commercial websites and online services directed
to children under 13 that collect, use, or disclose
personal information from children, and operators
of general audience Web sites or online services
with actual knowledge that they are collecting,
using, or disclosing personal information from
children under 13 (FTC.gov, 2007).
In addition, the Rule prohibits operators from
conditioning a child's participation in an online activity on the child's providing more information
than is reasonably necessary to participate in that
activity (FTC.gov, 2007).

Spyware, Malware, Viruses, Cookies, Phishing, and Spam
Spyware is software installed on a computer without the owner's consent that monitors or controls the owner's computer use to send pop-up ads, redirect the computer to Web sites, monitor Internet surfing, or record keystrokes, which, in turn, could lead to identity theft (FTC.gov Spyware, 2007). There are several ways for a computer to get infected with spyware. Some of the most frequent are downloading free software that contains the spyware, unauthorized downloads that occur without the consumer's knowledge when Internet security is set to less than medium, clicking links within pop-up ads that contain spyware, and opening a spam e-mail or attachment that downloads the spyware (FTC.gov Spyware, 2007).
Malware includes adware, hijackers, toolbars,
and dialers. Baratz and McLaughlin (2004) define
malware as follows:

Adware is the class of programs that place advertisements on a computer screen. These may be in the form of pop-ups, pop-unders, advertisements embedded in programs, advertisements placed on top of ads in Web sites, or any other way the authors can think of showing the victim an ad. The pop-ups generally will not be stopped by pop-up stoppers, and often are not dependent on the victim having Internet Explorer open. They may show up when the victim is playing a game, writing a document, listening to music, or anything else. Should the victim be surfing, the advertisements will often be related to the Web page he or she is viewing.

Hijackers take control of various parts of victims' Web browsers, including their home page, search pages, and search bar. A hijacker may also redirect a victim to certain sites should the victim mistype an address, or prevent him or her from going to a Web site the hijacker would rather the victim not visit, such as sites that combat malware. Some will even redirect a victim to a hijacker's own search engine when the victim attempts a search.

Toolbars plug into Internet Explorer and provide additional functionality such as search forms or pop-up blockers. The Google and Yahoo! toolbars are probably the most common legitimate examples, and malware toolbars often attempt to emulate their functionality and look. Malware toolbars almost always include characteristics of the other malware categories. Any toolbar that is installed through underhanded means falls into the category of malware.

Dialers are programs that set up a victim's modem connection to connect to a 1-900 number. This provides the number's owner with revenue while leaving the victim with a large phone bill. Most dialers are installed quietly and attempt to do their dirty work without being detected.

As defined by the National Consumers League (NCL), phishing is "using the Internet to fraudulently gather personal data about a consumer" (Fraud.org). According to a report published by the NCL, the bait is an e-mail claiming to be from a trusted organization, such as a bank or
an online retailer. The e-mail often claims that
the consumer must urgently take action, or else a bad thing will occur, such as closure of the account. Once the phisher "sends spam with bait," the next step is that the e-mail provider "delivers bait to consumer." Next, the user "reads bait." A user might respond directly to the e-mail, shown as "user enters info." More often, the user "clicks on spoofed link." The link is typically to a Web site controlled by the phisher. The Web site is designed to seem like the site of the trusted company. The consumer then enters personal information, such as an account number, password, or social security number. When the user "enters info on spoofed site," the phishing attack has succeeded at its first goal: to gather personal information fraudulently. Next the personal information is used to harm the consumer, when the "bad guy selects victims and attempts fraud." Important examples of fraud are if the phisher commits bank fraud, such as by hijacking the consumer's account, or credit card fraud, by using the personal information to purchase goods fraudulently (Fraud.org).
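As a simple illustration of the "spoofed link" step (our example, not the NCL's), a mail client or filter can compare the domain a link claims to point to with the domain its underlying href actually targets; the domains below are hypothetical.

```python
# Illustrative heuristic only: flag a link whose visible text names one host
# while the underlying href points somewhere else.
from urllib.parse import urlparse

def looks_spoofed(visible_text: str, href: str) -> bool:
    """Return True when the link text names a different host than the href."""
    shown = urlparse(visible_text if "://" in visible_text else "https://" + visible_text)
    actual = urlparse(href)
    return shown.hostname is not None and shown.hostname != actual.hostname

# The text shows a trusted bank, but the href leads to the phisher's server.
print(looks_spoofed("www.examplebank.com", "http://198.51.100.23/login"))    # True
print(looks_spoofed("www.examplebank.com", "https://www.examplebank.com/"))  # False
```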
Cookie: when a consumer visits a site, a notation may be fed to a file known as a "cookie" in his or her computer for future reference. If the person revisits the site, the cookie file allows the Web site to identify him or her as a return guest and offer products tailored to his or her interests or tastes. The consumer can set his or her online preferences to limit cookies, or to be notified about cookies that a Web site places on his or her computer (FTC.gov Spyware, 2007). Marketers want to customize a consumer's experience at their Internet store, so they collect personal information from the consumer's computer cookies when the consumer enters their Web space. They use this information to target their marketing specifically to the individual. The companies collect this information, store it in their databases, and can even sell the information to other companies.
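A minimal sketch of the mechanism just described, using Python's standard http.cookies module; the cookie name and value are illustrative only.

```python
# First visit: the site hands the browser a small identifier in its response.
from http.cookies import SimpleCookie

response_cookie = SimpleCookie()
response_cookie["visitor_id"] = "abc123"                 # hypothetical identifier
response_cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365
print(response_cookie.output())   # e.g. "Set-Cookie: visitor_id=abc123; Max-Age=31536000"

# Later visit: the browser sends the cookie back, and the site recognizes the
# returning consumer and can look up stored preferences or purchase history.
request_header = "visitor_id=abc123"
returning = SimpleCookie()
returning.load(request_header)
print("Returning visitor:", returning["visitor_id"].value)
```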



Spam is the receipt of unsolicited e-mail. The CAN-SPAM Act of 2003 (Controlling the Assault
of Non-Solicited Pornography and Marketing Act)
establishes requirements for those who send commercial e-mail, spells out penalties for spammers
and companies whose products are advertised in
spam if they violate the law, and gives consumers
the right to ask e-mailers to stop spamming them.
The law, which became effective January 1, 2004,
covers e-mail whose primary purpose is advertising or promoting a commercial product or service,
including content on a Web site. A transactional or relationship message (e-mail that facilitates an agreed-upon transaction or updates a customer in an existing business relationship) may not contain false or misleading routing information, but otherwise is exempt from most provisions of
the CAN-SPAM Act (FTC.gov, 2004).

Consumer Online Privacy Issues, Legislation, Litigation, and Their Inter-Relationships
As the six legal categories in consumer online privacy were examined, the solutions to these issues were further investigated in terms of legislation and litigation. The results of this literature review are summarized in Table 2 and Table 3.
Based on the literature on the legal issues, legislation, and litigation of consumer online privacy, the relational model of consumer online privacy shown in Figure 2 was developed. It exhibits the relationships between the issues consumers have with online privacy, the protections provided by legislation that address those issues, and the remedies for consumers and the risks to companies for not complying with the legal requirements. Some of the legislative works cover more than one consumer online privacy category. For instance, the Federal Privacy Act and the Fair and Accurate Credit Transactions Act, together with the Federal Information Security Management Act, could be used to litigate a breach in information security.

Table 2. Consumer online privacy legislation

2006: U.S. SAFE WEB Act
Application: Undertaking Spam, Spyware, And Fraud Enforcement With Enforcers Beyond Borders Act of 2005, or the U.S. SAFE WEB Act of 2005.
Contribution (includes but is not limited to):
- Amends the Federal Trade Commission Act to include within the definition of unfair or deceptive acts or practices those acts or practices involving foreign commerce.
- Authorizes the FTC to disclose certain privileged or confidential information to foreign law enforcement agencies.

2003: Fair and Accurate Credit Transactions Act
Application: Amends the Fair Credit Reporting Act to prevent identity theft, improve resolution of consumer disputes, improve the accuracy of consumer records, and make improvements in the use of, and consumer access to, credit information.
Contribution (includes but is not limited to):
- Identity theft prevention.
- Protection and restoration of identity theft victims' credit history.
- Improvements in the use of, and consumer access to, credit information.
- Enhancing the accuracy of credit information.
- Limiting the use and sharing of medical information in the financial system.
- Financial literacy and education improvement funds.

2003: CAN-SPAM Act
Application: Controlling the Assault of Non-Solicited Pornography and Marketing Act.
Contribution (includes but is not limited to):
- Bans false or misleading header information.
- Prohibits deceptive subject lines.
- Requires that the e-mail provide an opt-out method.
- Requires that commercial e-mail be identified as an advertisement and include the sender's valid physical mailing address.

2002: Federal Information Security Management Act
Application: Bolsters computer and network security within the Federal Government and affiliated parties (such as government contractors) by mandating yearly audits.
Contribution (includes but is not limited to):
- FISMA imposes a mandatory set of processes that must be followed for all information systems used or operated by a U.S. federal government agency or by a contractor or other organization on behalf of a U.S. government agency. These processes must follow a combination of Federal Information Processing Standards (FIPS) documents and the Special Publication 800 series issued by NIST.

2001: USA PATRIOT Act
Application: Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism.
Contribution (includes but is not limited to):
- Expands the scope of subpoenas for records of electronic communications to include the length and types of service utilized, temporarily assigned network addresses, and the means and source of payment (including any credit card or bank account number).
- Permits electronic communication and remote computing service providers to make emergency disclosures of customer electronic communications to a governmental entity to protect life and limb.
- Makes it lawful to intercept the wire or electronic communication of a computer trespasser in certain circumstances.
- Provides for nationwide service of search warrants for electronic evidence.

2001: Homeland Security Act, Cyber Security Enhancement Act
Application: Enhanced cyber security and protections for companies assisting in investigations.
Contribution (includes but is not limited to):
- Directs the Attorney General to establish and maintain a National Infrastructure Protection Center to serve as a national focal point for threat assessment, warning, investigation, and response to attacks on the Nation's critical infrastructure, both physical and cyber.
- Prohibits the distribution of advertisements of illegal interception devices through the Internet as well as by other, specified media.
- Broadens the offense of, and increases the penalties for, illegally intercepting cell-phone conversations or invading the privacy of another person's stored communications. States that a law enforcement officer need not be present for a warrant to be served or executed under the Electronic Communications Privacy Act.

2000: Children's Online Privacy Protection Act
Application: Specifically protects the privacy of children under the age of 13 by requesting parental consent for the collection or use of any personal information of the users.
Contribution (includes but is not limited to):
- Incorporation of a detailed privacy policy that describes the information collected from its users.
- Acquisition of verifiable parental consent prior to collection of personal information from a child under the age of 13.
- Disclosure to parents of any information collected on their children by the Web site.
- A right to revoke consent and have information deleted.
- Limited collection of personal information when a child participates in online games and contests.
- A general requirement to protect the confidentiality, security, and integrity of any personal information that is collected online from children.

1996: Health Insurance Portability and Accountability Act
Application: Establishes, for the first time, a set of national standards for the protection of certain health information.
Contribution (includes but is not limited to):
- The Privacy Rule protects all individually identifiable health information held or transmitted by a covered entity or its business associate, in any form or media, whether electronic, paper, or oral. The Privacy Rule calls this information protected health information (PHI). Individually identifiable health information is information, including demographic data, that relates to the individual's past, present, or future physical or mental health or condition; the provision of health care to the individual; or the past, present, or future payment for the provision of health care to the individual, and that identifies the individual or for which there is a reasonable basis to believe it can be used to identify the individual. Individually identifiable health information includes many common identifiers (e.g., name, address, birth date, social security number).

1974: Federal Privacy Act
Application: Created in response to concerns about how the creation and use of computerized databases might impact individuals' privacy rights.
Contribution (includes but is not limited to):
- Safeguards privacy through creating four procedural and substantive rights in personal data.
- It requires government agencies to show an individual any records kept on him or her.
- It requires agencies to follow certain principles, called fair information practices, when gathering and handling personal data.
- It places restrictions on how agencies can share an individual's data with other people and agencies.
- It lets individuals sue the government for violating its provisions.

Table 3. Consumer online privacy litigation

2007: FTC v. Albert
Issues: Spyware; the code interfered with the functioning of the computer and was difficult for consumers to uninstall or remove. In addition, the code tracked consumers' Internet activity, changed their home page settings, inserted new toolbars onto their browsers, inserted a large side frame or window onto browser windows that in turn displayed ads, and displayed pop-up ads, even when consumers' Internet browsers were not activated.
Contribution:
- Permanently bars him from interfering with consumers' computer use, including distributing software code that tracks consumers' Internet activity or collects other personal information, changes their preferred homepage or other browser settings, inserts new toolbars onto their browsers, installs dialer programs, inserts advertising hyperlinks into third-party Web pages, or installs other advertising software.
- Also prohibits him from making false or misleading representations; prohibits him from distributing advertising software and spyware; and requires that he perform substantial due diligence and monitoring if he is to participate in any affiliate program.

2006: FTC v. Petco
Issues: Web site security flaws allowed SQL injection attacks to gain consumer credit card information.
Contribution:
- Must construct a system-wide security plan that will adequately protect consumers from security breaches via the company's Web site.
- Will be subject to a biennial security audit from a third-party auditor to validate its effectiveness.
- Prevents Petco from misrepresenting its security system to consumers in the future and mandates record-keeping provisions to permit the FTC to monitor compliance.

2006: FTC v. BJ's Wholesale Club
Issues: Collection of magnetic stripe information from consumers and failing to secure the information appropriately.
Contribution:
- Must implement and maintain an information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from consumers.
- The program must assess internal and external threats to security and design reasonable safeguards to control the risks identified.
- Must obtain an independent assessment and report of the security program within 180 days of the order and biennially for 20 years.

2006: Banknorth, N.A. v. BJ's Wholesale Club
Issues: Unauthorized parties (hackers) gained access to credit card information of cardholders and used the information fraudulently.
Contribution:
- Establish and implement, and thereafter maintain, a comprehensive information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from or about consumers.
- Obtain an assessment and report (an "Assessment") from a qualified, objective, independent third-party professional, using procedures and standards generally accepted in the profession, within one hundred and eighty (180) days after service of the order, and biennially thereafter for twenty (20) years after service.
- Maintain, and upon request make available to the Federal Trade Commission for inspection and copying, a print or electronic copy of each document relating to compliance.

2005: FTC v. CartManager International
Issues: Engaged in unfair trade practices by selling to third parties the personal information of nearly one million customers. Its Web pages collect consumer name, billing and shipping address, phone number, e-mail address, and credit card information. Violated merchants' privacy policies, which provided that the personal information collected through their Web sites would not be sold, traded, or rented to third parties.
Contribution:
- Cannot sell, rent, or disclose to any third party for marketing purposes any personally identifiable information collected from consumers prior to the date of the order.
- Can sell, rent, or disclose such information after the date of the order provided there is clear and conspicuous written notice and a written certification that the merchant's privacy policy allows the sale, rental, or disclosure, OR, prior to collecting personal data, the merchant discloses that the consumer is leaving the merchant's Web site and is subject to CartManager's privacy policy.

In addition, many of the privacy categories are closely related to, or result from, one another. An attacker could use spyware to gain information about a consumer that may lead to identity theft, or identity theft could be the result of an attacker gaining access to consumer information through an information security breach at a business. Identity theft could result from most of the categories, but it could also occur without any interaction with an online entity, through the non-shredding of personal documents; therefore, it is a category in and of itself.
The model illustrates a top-down view of
the consumer online privacy protection flow.




[Figure 1. Consumer online privacy relational model. The figure maps each consumer online privacy category (health information privacy; identity theft and pre-texting; underage consumer protection; information privacy; information security breaches; and spyware, malware, viruses, cookies, and spam) to the legislation that protects it (Health Insurance Portability and Accountability Act, Fair and Accurate Credit Transactions Act, Children's Online Privacy Protection Act, Federal Privacy Act, Federal Information Security Management Act, Homeland Security Act, CAN-SPAM Act, and U.S. SAFE WEB Act), to representative litigation (FTC v. Petco, 2006; FTC v. BJ's Wholesale Club, 2006; FTC v. Albert, 2007; FTC v. Xanga.com; and a felony HIPAA conviction for selling patient information), and to the actions available to consumers (registering a complaint at ftc.gov, notifying law enforcement, and notifying Health and Human Services).]

The following two survey studies examine consumer online behaviors and privacy concerns,
reinforcing the vital need for a stronger role by
the government and business community, as well as greater privacy awareness on the part of online consumers themselves.

Survey Studies

Survey Study I: Consumers' Protection of Online Privacy and Identity
The study by Milne, Rohm, and Bahl (2004) examined results from three consumer surveys in which attitudinal, behavioral, and demographic antecedents that predict the tendency to protect one's privacy and identity online were explored. The research looks at online behaviors that increase or reduce the risk of online identity theft and indicates that the propensity to protect oneself from online identity theft varies across the population. The survey findings are summarized as follows:

There was a significant positive relationship between privacy concern and active resistance.
Those who had bought online, provided an e-mail address, and registered for a Web site had higher rates of protection and a higher number of hours on the Web.
Males were more likely to protect their information online than females.
Protection behavior increased with years of schooling.
Younger online adults were more vigilant than older adults in protecting information online.

Overall, the study found that consumers are not protecting themselves adequately, and therefore there is a need for a stronger role by the government and business community to combat identity theft. The researchers suggest that the government enact new laws "to more effectively monitor business practices" (Milne et al., 2004), that businesses take responsibility for security of sensitive customer information "in the off-line as well as online context" (Milne et al., 2004), and that they implement technological solutions "including digital certificates and signatures, biometrics, and other authentication approaches" (Milne et al., 2004). In addition, they call for the public sector to expand educational programs for consumers.
Furthermore, the study covers areas such as identity theft and online privacy statements, and discusses updating virus protection often and clearing cookies and the computer cache after browsing the Internet, as would be expected because these were surveys completed by various online users. There are also other privacy concerns for consumers besides those covered in the survey, such as security breaches from hacking and SQL injection attacks on their information stored within business networks, unscrupulous online Web sites collecting information from underage online users, and the inappropriate release of information not related to online transactions by businesses or their agents.
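To illustrate the kind of coding flaw behind such SQL injection breaches (for example, the FTC v. Petco case in Table 3), the sketch below contrasts a vulnerable query with a parameterized one; the table and data are hypothetical.

```python
# Illustration only: why string-built SQL leaks data and parameterized SQL does not.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT, card_number TEXT)")
conn.execute("INSERT INTO customers VALUES ('a@example.com', '4111-1111-1111-1111')")

user_input = "' OR '1'='1"   # a classic injection payload typed into a form field

# Vulnerable pattern: the input is pasted into the SQL text, so the attacker's
# quote characters rewrite the query and dump every stored card number.
vulnerable = f"SELECT * FROM customers WHERE email = '{user_input}'"
print(conn.execute(vulnerable).fetchall())            # returns all rows

# Safer pattern: a parameterized query treats the input strictly as data.
safe = "SELECT * FROM customers WHERE email = ?"
print(conn.execute(safe, (user_input,)).fetchall())   # returns nothing
```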

Survey Study II: User Privacy Concerns and Control Techniques
Chen and Rea (2004) examine the relationship
between two types of user privacy concerns
and how users control personal information.
The two types of privacy concerns addressed
are control of personal information and trust
of online businesses. The three privacy control
factors include:

Falsification of private information, which


includes altering one's personal information
and removing browser cookies to obtain access to online resources that require some
form of registration.




Passive reaction, which includes ignoring


or deleting the unwanted presence.
Identity modification, which includes changing one's personal identity, using gender-neutral IDs, and maintaining multiple identities.

The study found that consumers do try to protect themselves, but there is much yet to be studied regarding online privacy. The fact is that many consumers do not demonstrate sufficient awareness of their online privacy, which calls for better privacy controls.

Analysis and Implications


Consumers enjoy surfing the Internet to learn about everything, to buy anything, and to communicate with anyone who has a computer connected to the Internet. But how many net visitors are concerned about the personal information that may drift across the Internet while they are online, and about who may be looking at that personal information?
The study by Milne et al. (2004) suggests
that consumers are more aware of the dangers in
providing sensitive information online. They are,
however, not taking adequate technical protections and are often unable to differentiate between a safe site and an unsafe one. Carefree online buying behavior and ignorance of the protection tools provided by the browser, the operating system, and the open source community are some of the reasons behind the lack of online consumer privacy protection. Chen and Rea (2004) state that consumers appear to use three types of privacy controls, that is, falsification, passive reaction, and identity modification, when (a) there is a possibility that others might observe and use the information (unauthorized secondary use) and (b) they are concerned about giving out personal information. The study also discusses privacy
protections such as encryption and anonymity.
In many cases, these tools are either not available
or simply infeasible for users.


The results of these two survey studies point out that online consumers need to be educated and encouraged to follow protective measures in three areas: technical, educational, and behavioral.
However, there are responsibilities for the governments and the businesses as well. What is required
is a concurrent effort among all stakeholders to
protect privacy of all entities involved. The surveys
point out that the online privacy issue is global
rather than local. The researchers suggest the
government enact new laws to effectively monitor
business practices and businesses take responsibility for the protection of online consumers. Along
with laws and standards, available technological
solutions including digital certificates/signatures,
biometrics, and other authentication approaches
can be used.
Online privacy concerns started with the inception of the Internet. These concerns are amplified tremendously by powerful computing capabilities for gathering and processing vast amounts of data, coupled with the Internet's capacity for dispensing information instantly around the globe. In
short, success or failure of tackling this problem
depends on the willingness and joint effort of all
parties involved.

Future Trends
The growth of the online community does not
seem to be slowing. There are new and inventive
ways to use the Internet and the degree to which
consumers can interact online is also growing.
There are now ways to collaborate online that
did not exist just a few short years ago. This new
interaction online allows anyone to voice opinions
or work together. The success of the auction Web site eBay has been due mostly to the ability of
consumers to voice their opinion by rating their
online transactions. The rating system allows positive, negative, or neutral comments to be posted
regarding specific transactions. The rating and
comments are viewable by anyone and are used

A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation

to build trust with the online entity. These rating


systems are used by many other online entities
as a way for consumers to give other consumers
their opinions. This trend in allowing consumers
to rate their interactions seems to be a successful
model and will continue to grow in popularity.
Consumers are now demanding more control over
their online interactions and the rating systems
give a controlled venue for opinions. Consumers
are now relying on other consumers to determine
trust in an online business instead of relying solely
on the business.

Legislators need to be aware of what is happening within the technology area and respond
with legislation that protects consumers against
those who would misuse the technology. Each
year many proposals are introduced to further
define consumer privacy and protections, but
most do not become enacted into laws. Consumer
groups need to remain vigilant in their education
of legislators on what misuses of information are
occurring and why it is imperative they act.

Conclusion
Online privacy concerns are present in everyday
life for most consumers in the United States.
Each day there are millions of interactions
with businesses that store and use our personal
information. Each day businesses are trying to
find new ways to interact with consumers and
get ahead in their business. Each day there are
unsavory and unethical people trying to steal consumers' personal information. The balance
between convenience and safety in interacting
with an online entity is a struggle. Consumers
want to provide information to the online entity to
conveniently interact for their own purposes, but
want to control how the entity then uses, stores,
recycles, and disposes of the information as well.
The government is there enacting legislation to
try to define those gray areas, but there are still
so many instances where businesses are not complying with the laws. Consumers are left with the
consequences when a business fails to properly
handle their personal information. The call for
more consumer education is a necessary step in
trying to battle the privacy issues. Consumers
need to be aware of how a business is using their
information and be able to exert pressures on a
business to conform to the required standards of
society in an information age.

Future Research Directions

Consumer online privacy is an important subject and will remain so as long as there are businesses collecting consumer information, online transactions taking place, and hackers trying to gain access to that information. Future investigation could include the following:

How can consumers protect their information? Finding ways to make opt-in and opt-out easier and giving consumers options on how a business stores and handles their information. Should consumers be able to more specifically control the storage time and be informed exactly what information is being stored?
Investigating a new model for personal information online: creating a secure personal profile with your information, so that when interacting with online businesses you provide an encrypted transaction number that authenticates to your profile and allows a transaction to be created and stored under your profile. You are identified to the online business by the transaction number and not by a name. While the business is still storing the transaction, its records do not include personally identifiable information (see the sketch following this list).
Gaps in legislative protections for consumers: recommendations for legislative actions on how to fill the gaps in the current legislation to bring it up to current business model expectations. HIPAA, for example, was created before offshore outsourcing began its upswing in healthcare, and there are currently no provisions in the rules for sending consumer information beyond our borders.
Investigation within different age groups
on specific concerns about online privacy.
Younger generations have concerns, but
still interact online because it is part of their
culture. Other generations choose not to
interact online for a couple of reasons. They
may lack understanding of the technology
and/or their judgments are usually based
more on personal interactions to create trust.
How can they trust an online entity when
there is no personal interaction? What would
it take for other generations to trust online
transactions?
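The sketch below is one possible reading of the second research direction above: the consumer (or a profile service acting for them) derives a one-time transaction token from a secret profile key, so the merchant stores only the token. The key handling and the lookup service are assumptions made for illustration, not a proposal from this chapter.

```python
# Rough sketch of a pseudonymous transaction token; everything here is hypothetical.
import hashlib
import hmac
import secrets

profile_key = secrets.token_bytes(32)    # held by the consumer / profile service

def transaction_token(key: bytes, merchant_id: str) -> str:
    """Derive a per-transaction pseudonym that can authenticate to the profile."""
    nonce = secrets.token_hex(8)                          # fresh value per transaction
    digest = hmac.new(key, f"{merchant_id}:{nonce}".encode(), hashlib.sha256)
    return f"{nonce}.{digest.hexdigest()}"

token = transaction_token(profile_key, "merchant-042")
print("Merchant stores only:", token)    # no name, address, or card number

# The profile service, which also holds profile_key, can recompute the HMAC to
# link the token back to the consumer's profile if a dispute or return arises.
```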

References
Ambrose, S., & Gelb, J. (2006). Consumer privacy
litigation and enforcement actions in the United
States. The Business Lawyer, 61, 2.
Ashworth, L., & Free, C. (2006). Marketing dataveillance and digital privacy: Using theories of justice to understand consumers' online privacy concerns. Journal of Business Ethics, 67, 107-123.
Baratz, A., & McLaughlin, C. (2004). Malware:
what it is and how to prevent it. Retrieved November 11, from http://arstechnica.com/articles/
paedia/malware.ars
BJ's Wholesale Club settles FTC charges. (2005). Retrieved from http://www.ftc.gov/opa/2005/06/bjswholesale.htm
Brooke, J., & Robbins, C. (2007). Programmer gives up all the money he made distributing spyware. Retrieved from http://www.ftc.gov/opa/2007/02/enternet.htm
Chen, K., & Rea, A. (2004). Protecting personal
information online: A survey of user privacy
concerns and control techniques. The Journal of
Computer Information Systems, 44(4), 85-93.
Children's Online Privacy Protection Act. (2000). Enacted April 22, 2000. Retrieved from http://www.epic.org/privacy/kids/
Congress Passes Safe Web Act 2006. (2007).
Retrieved January 31, 2007, from http://www.
epic.org
COPPA protects children but challenges lie
ahead. (2007). Retrieved from http://www.ftc.
gov/opa/2007/02/copparpt.htm
COPPA FAQs. (n.d.) Retrieved from http://www.
ftc.gov/privacy/coppafaqs.htm
Email Security and Anonymity. (2004). Retrieved
from http://www.anonic.org/email-security.
html
Federal Privacy Act of 1974. (n.d.). Retrieved from
http://www.usdoj.gov/foia/privstat.htm
Federal Information Security Management Act of 2002. (2002). Retrieved from http://en.wikipedia.org/wiki/Federal_Information_Security_Management_Act_of_2002
Fraud.org. (n.d.). Retrieved from http://www.
fraud.org/tips/internet/phishing.htm, http://www.
phishinginfo.org/
FTC v. BJ's Wholesale Club, Inc. (2005). Filing ordered May 17, 2005. Retrieved from http://www.ftc.gov/os/caselist/0423160/050616agree0423160.pdf
FTC.gov. (2006). Retrieved from http://www.ftc.
gov/bcp/edu/microsites/idtheft/consumers/aboutidentity-theft.html#Whatisidentitytheft


FTC File No. 032 3221. (2004). Petco settles FTC charges. Retrieved from http://www.ftc.gov/opa/2004/11/petco.htm

The CAN-SPAM Act: Requirements for Commercial Emailers. (2004). Retrieved from http://www.
ftc.gov/bcp/conline/pubs/buspubs/canspam.htm

FTC File No. 062-3073. (2006). Xanga.com to pay $1 million for violating children's online privacy protection rule. Retrieved from http://www.ftc.gov/opa/2006/09/xanga.htm

Top Ten Ways to Protect Online Privacy. (2003). Retrieved from http://www.cdt.org/privacy/guide/basic/topten.html

FTC.gov Spyware. (2007). Retrieved from http://www.ftc.gov/bcp/conline/pubs/alerts/spywarealrt.htm
Health Insurance Portability and Accountability Act of 1996, HIPAA (Kennedy-Kassebaum Act). (n.d.). Retrieved from http://aspe.hhs.gov/admnsimp/pl104191.htm
Homeland Security Act, Cyber Security Enhancement Act enacted December 13, 2001. (2002).
Retrieved from http://www.govtrack.us/congress/
bill.xpd?bill=h107-3482
Medlaw.com. (2006). Retrieved from http://www.
medlaw.com/healthlaw/Medical_Records/8_4/
woman-pleads-guilty-to-se.shtml
Michael, P., & Pritchett, E. (2001). The impact of HIPAA electronic transmissions and health information privacy standards. Journal of the American Dietetic Association, 101(5), 524-528.
Milne, G., Rohm, A., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-233.
Richards, N. M. (2006). Reviewing the digital
person: Privacy and technology in the information age by Daniel J. Solove. Georgetown Law
Journal, 94, 4.
Summary of the HIPAA Privacy Rule by HIPAA Compliance Assistance. (2005). Health & Human Services, May.
Swartz, N. (2006). HIPAA compliance declines, survey says. Information Management Journal, 40(4), 16.

USA Patriot Act of 2001, enacted October 23, 2001. (2001). Retrieved from http://www.govtrack.us/congress/bill.xpd?bill=h107-3162
US Safe Web Act of 2006 enacted by 109th Congress March 16, 2006. (2006). Retrieved from
http://www.govtrack.us/congress/bill.xpd?tab=
summary&bill=s109-1608
Vasek, S. (2006). When the right to know and
right to privacy collide. Information Management
Journal, 40(5), 76-81.

Additional Readings

37 States give consumers the right to freeze credit files to prevent identity theft; Consumers Union offers online guide on how to take advantage of new state security freeze laws. (2007). PR Newswire, July 16.

Anthony, B. D. (2007). Protecting consumer information. Document Processing Technology, 15(4), 7.

Carlson, C. Poll reveals data safety fears. eWeek, 22(50), 29.

Chellappa, R., & Sin, R. (2005). Personalization versus privacy: An empirical examination of the online consumer's dilemma. Information Technology and Management, 6(2-3), 181-202.

de Kervenoael, R., Soopramanien, D., Hallsworth, A., & Elms, J. Personal privacy as a positive experience of shopping: An illustration through the case of online grocery shopping. International Journal of Retail & Distribution Management, 35(7), 583.

DeMarco, D. A. (2006). Understanding consumer information privacy in the realm of Internet commerce: Personhood and pragmatism, pop-tarts and six-packs. Texas Law Review, 84(4), 1013-1064.

Erickson, K., & Howard, P. N. (2007). A case of mistaken identity? News accounts of hacker, consumer, and organizational responsibility for compromised digital records. Journal of Computer-Mediated Communication, 12(4), 1229-1247.

Kelly, E. P., & Erickson, G. S. (2004). Legal and privacy issues surrounding customer databases and e-merchant bankruptcies: Reflections on Toysmart.com. Industrial Management + Data Systems, 104(3/4), 209.

Kobsa, A. (2007). Privacy-enhanced personalization. Communications of the ACM, 50(8), 24.

Lauer, T. W., & Deng, X. (2007). Building online trust through privacy practices. International Journal of Information Security, 6(5), 323.

New Trend Micro Internet security products strengthen personal information protection and deliver enhanced performance. (2007). Al Bawaba.

Online privacy policies: An empirical perspective on self-regulatory practices. (2005). Journal of Electronic Commerce in Organizations, 3, 61-74.

Pan, Y., & Zinkhan, G. M. (2006). Exploring the impact of online privacy disclosures on consumer trust. Journal of Retailing, 82(4), 331-338.

Roman, S. (2007). The ethics of online retailing: A scale development and validation from the consumers' perspective. Journal of Business Ethics, 72(2), 131.

Smith, A. D. (2004). Cybercriminal impacts on online business and consumer confidence. Online Information Review, 28(3), 224-234.



Chapter III

Online Privacy, Vulnerabilities, and Threats:
A Manager's Perspective
Hy Sockel
DIKW Management Group, USA
Louis K. Falk
University of Texas at Brownsville, USA

Abstract
There are many potential threats that come with conducting business in an online environment. Management must find a way to neutralize or at least reduce these threats if the organization is going to maintain viability. This chapter is designed to give managers an understanding of, and the vocabulary needed to have a working knowledge of, online privacy, vulnerabilities, and threats. The chapter also highlights techniques that are commonly used to impede attacks and protect the privacy of the organization, its customers, and its employees. With the advancements in computing technology, any and all conceivable steps should be taken to protect an organization's data from outside and inside threats.

Introduction
The Internet provides organizations unparalleled
opportunities to perform research and conduct
business beyond their physical borders. It has
proven to be a vital medium for worldwide commerce. Even small organizations now rely on

Internet connectivity to communicate with their


customers, suppliers, and partners. Today, employees routinely work from areas beyond their office's physical boundaries. They regularly transport sensitive information on notebook computers,
personal digital assistants (PDAs), smartphones,
and a variety of storage media: thumb drives, CDs,


DVDs, and even on floppies. It is not uncommon for employees to work offsite, at home, or out of a
hotel room. Outside the office, they often use less
than secure Internet connections: dial-up, cable, Internet cafés, libraries, and wireless.
Organizations often employ portals to share information with their stakeholders; however, these portals are not always secure from would-be attackers. In order to protect the organization from vicious and malicious attacks, management needs to understand what it is up against. Even if the organization does not conduct any business on the Internet, it is still not out of harm's way. Viruses, Trojans, and spyware can come from multiple sources: floppy discs, CDs, thumb drives, and even mobile phones. To complicate the matter even more, the information technology (IT) environment at many organizations has become obscure, partially due to new regulations and industry standards. The standard has changed: it is no longer enough to be secure and protect the business's assets; organizations need to be able to demonstrate that they are compliant and that security is an ongoing concern. Failure to do so could leave them facing stiff penalties (Forescout, 2007).
The purpose of this chapter is to address
some of the potential threats that come with
conducting business in an online environment.
The chapter highlights the relationship between
privacy and vulnerability and threats. It delves
into techniques that are commonly used to thwart attacks and protect individuals' privacy. In the age
of unrest and terrorism, privacy has grown even
more important, as freedoms are compromised
for security.
The news is loaded with stories about security
breaches. For example:
In May of 2007, the news of the TJ Maxx security breach shook up the banking and retail industry. At first it was estimated that hackers had downloaded at least 45.7 million credit- and debit-card numbers; however, court filings indicated that the number was closer to 96 million. Estimates of the damage range from $216 million to $4.5 billion. The breach was blamed on extensive cyber-thief activity within TJ Maxx's network from 2003 through June 2004 and then again from mid-May 2006 through mid-December 2006 (Schuman, 2007). However, others blame the breach on weak wireless security; Ou (2007) revealed that the retailer's wireless network had less security than many people have on their home networks.
Another example is:
On April 5, 2002, hackers exploited vulnerabilities in a server holding a database of personnel information on California's 265,000 state employees. The state responded, and the world listened. California is one of the largest economies in the world, bigger than most countries. The attack's victims included the then governor, Gray Davis, and 120 state legislators. The breach compromised names, social security numbers, and payroll information. In response, the state legislature enacted a security breach notification law, Senate Bill (SB) 1386.
To put this in perspective, if online privacy is described in terms of a risk triangle, the three corners are vulnerabilities, threats, and actions, where actions represent anything the organization can (and should) do to mitigate attacks. Applications, like ships, are not designed and built to sit in a safe harbor; they are meant to be used in churning, chaotic waters. It is important to understand threats and vulnerabilities well enough to have a good idea of what to expect, so that strategies and tools can be put in place to mitigate the consequences (Bumgarner & Borg, 2007).

Vulnerability
Software vulnerabilities are not going away; in fact, they are increasing. According to the Coordination Center at Carnegie Mellon University (CERT, 2007), there was an average of over 10 vulnerabilities discovered every day in 2003 (3,784 in total). This number jumped to over 5,500 in the first nine months of 2007.
Software flaws have become the vulnerabilities of choice for attackers. Flaws cut across the entire enterprise application stack, including Web and application servers, databases, and operating systems. Dr. (Rear Admiral) Grace Hopper (1906-1992), a highly respected and accomplished computer scientist, indicated that all software has problems and that it is impossible to have a perfect system. She articulated this point using the following example: if the probability of an individual module having an error in the code was just one in a hundred (1%), and the system had several hundred modules, then the net probability of an error somewhere in that system would approach 100%. This observation is particularly relevant in that most commercial software developers use complex software development toolkits (SDKs) to improve their productivity and effectiveness.
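A quick back-of-the-envelope check of Hopper's point (assuming, purely for illustration, that module errors are independent):

```python
# Probability that a system contains at least one error, given a 1% per-module rate.
p_module_error = 0.01
for n_modules in (100, 300, 500):
    p_system_error = 1 - (1 - p_module_error) ** n_modules
    print(f"{n_modules} modules -> {p_system_error:.1%}")
# 100 modules -> ~63%, 300 -> ~95%, 500 -> ~99%: with several hundred modules,
# the chance of at least one error is effectively 100%.
```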
Qualys (2006), a security vendor, studied over 40 months of data scans (September 8, 2002 to January 31, 2006) and identified nearly 1,600 unique critical vulnerabilities out of a total infestation of more than 45 million vulnerabilities. The data scans showed that more than 60% of critical vulnerabilities were in client applications such as Web browsers, backup software, media players, antivirus, and Flash.
The Third Brigade found that vulnerabilities
generally fall into one of the following categories
(Aberdeen Group, 2005):

Vulnerabilities caused by incorrectly configured systems
Failure to turn off factory defaults, guest accounts, and outdated software
Failure to maintain anti-virus and spam updates
Failure to change default values, leaving holes
Well-known bugs in system utilities
Poor or ignorant policy decisions
Unapplied vendor security patches; Aberdeen states that 95% of attacks are against known vulnerabilities for which patches are available

Vulnerabilities do not have to be broken program code; Norman (1983) indicated that errors in system designs, which provoke erroneous entries by users, can also be considered vulnerabilities that can be intentionally exploited by attackers.
Individually and collectively, vulnerabilities can create major risks for organizations. Weak policies and protection can result in the release of personally identifiable information (PII). The release of PII is not the only problem. Another issue is that hackers can obtain important data and modify it. Suddenly, there are additional names on the preferred lists, payroll, and accounts payable, and outsiders could be given authority or consideration that they are not entitled to. An organization's strategic plans could be compromised. Additionally, the release of PII can weaken the public's confidence in the organization and subject the organization to litigation, large fines and reparation costs, and rigorous investigations, as well as oversight.

Threats
Threats are not the same as vulnerabilities; threats are things that can take advantage of vulnerabilities. Security threats, broadly, can directly or indirectly lead to system vulnerabilities (Im & Baskerville, 2005). An analogy might be an army fort surrounded by the enemy where someone accidentally left the fort's front gate wide open. The open gate is a vulnerability, and the threat is the opposing force. Translating this analogy to data and information, the vulnerability would be a poorly protected system, and the threat is the criminal hacker community. In this case, "poorly protected" could be construed to be any of a number of things, including absolutely no protection, software that is not updated, inappropriately defined security rules, and weak passwords.
In general, it is important to ensure that sensitive information and systems are protected from
all threats, both internal and external. Typically,
this is done by separating the systems from the
networks. However, this is not always possible;
with the advent of e-business there is a need for
organizations to share information.
For example: an organization gives its partners (A)
and (B) permission to look at its online schedule
(instead of calling a clerk as they had in the past).
This can create the opportunity for partner A to
look at (or modify) partner B's data. If the data
is of a personal type, say medical, several laws
could easily be violated. If it is indicated in the
privacy policy that data/information is not shared,
the individual whose data is released may have
rightful cause to institute litigation.
Clearswift, a leading provider of content
security products, has categorized five major
message traffic threat types as: asset theft, disruption, repudiation, content abuse, and denial
of service.
Asset theft happens via spoofing or social engineering, when an outsider pretends to be an authorized user and requests information not available to an unauthorized user. More commonly, however, it is the inadvertent sending of sensitive information, or its sending by disaffected insiders.
Disruption is a common threat, which includes anything that keeps users (and services, i.e., e-mail, fax, etc.) from doing what they are supposed to do. Other workplace disruption can include the dissemination of personal, pornographic, or non-business information.
Repudiation (denial) is concerned with either party (sender or receiver) being able to declare that an event did not happen. Cryptographic techniques such as the Diffie-Hellman key exchange and digital signatures provide assurance that the message was actually sent and/or received by the intended parties. Digital signatures are accepted as evidence in a court of law. This is critical because oftentimes parties involved in transactions do not know each other.
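As a concrete, purely illustrative example of the non-repudiation idea above, the sketch below signs and verifies a message with an Ed25519 key pair using the third-party Python cryptography package; it is a generic demonstration, not a depiction of any product or protocol discussed in this chapter.

```python
# Illustration of digital signing/verification; requires the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()     # kept secret by the sender
public_key = private_key.public_key()          # shared with the receiving party

message = b"Ship 500 units to warehouse 7"
signature = private_key.sign(message)          # sender cannot later deny signing this

try:
    public_key.verify(signature, message)      # raises if message or signature changed
    print("Signature valid: the stated sender signed exactly this message.")
except InvalidSignature:
    print("Signature invalid: the message was altered or not signed by this key.")
```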
Content abuse is similar in scope to repudiation, but is focused on the content of the message and not on whether it was sent or received. It deals with issues between the sending and receiving parties over what was sent and what was received.
Denial of service (DoS) and distributed DoS (DDoS) attacks result when a party is bombarded with more messages than it can handle, causing the system to use all its resources to handle non-legitimate traffic. This can happen by bombarding the victim's machine with thousands to millions of messages so that it cannot respond to legitimate requests or responds so slowly that it is effectively unavailable. DoS attacks are considered violations of the Internet Architecture Board's (IAB) Internet proper use policy concerning Internet ethics passed in January 1989 (often referred to as RFC 1087; see http://tools.ietf.org/html/rfc1087). In the U.S. (and many countries), DoS is a serious federal crime under the National Information Infrastructure Protection Act of 1996, with penalties that can include fines and imprisonment.
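To make the flooding idea concrete, the sketch below counts requests per source address over a sliding one-minute window and flags any source whose rate looks abusive. The window length, threshold, and sample address are invented assumptions for illustration; this is not a prescribed defense against DoS.

# Illustrative request-rate monitor that flags possible flooding sources.
# The 60-second window and 1,000-request threshold are arbitrary assumptions.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 1000                       # requests per window considered abusive

recent = defaultdict(deque)            # source address -> timestamps of recent requests

def record_request(source_ip, now=None):
    """Record one request; return True if the source looks like a flood."""
    now = time.time() if now is None else now
    window = recent[source_ip]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()               # drop timestamps outside the sliding window
    return len(window) > THRESHOLD

# Example: a single source sending 1,500 requests within one second is flagged.
for i in range(1500):
    flooded = record_request("203.0.113.9", now=1000.0 + i / 1500.0)
print("Flood suspected:", flooded)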

Social Engineering

Compromise is not always a technical issue: a perpetrator can use chicanery and/or persuasion to manipulate unsuspecting people into either revealing sensitive information (such as a logon and password) or compromising perimeter defenses by installing inappropriate software or portable storage devices (seeded with malware) on computer networks. For example, one phishing approach is to ask a user to fill out a simple fake online form. The form itself asks for almost no personal information; instead, it pre-fills the form with some of the information that the sender already knows about the victim. It then asks the person to make up a login name and a password. Criminal hackers know that most people suffer from password overload and tend to reuse the same passwords over and over again. Figure 1 (ploy to capture personal information) is a representative sample (taken off the net November 4, 2007).

Figure 1. Ploy to capture personal information

Risk

Risk is involved in everything, every process, and every system. Operational risk is often defined as the risk of loss resulting from inadequate or failed internal processes, people, and systems, or from external events. Risk is one of those things that no one can escape, and it is hard to define. In general, risk is the probability of a negative outcome because some form of threat will be able to exploit vulnerabilities against an asset. Many define the value of a risk attack as the value of an asset, times the probability of a threat, times the probability of an undiscovered vulnerability, times some impact factor (representing reparations), times the possibility of the event. While the formula (see Equation 1, risk dollar value) is straightforward, coming up with the values and probabilities is not. The important issue is not the derived dollar value, but what the asset really means to the organization and how it is going to be used.

Equation 1. Risk dollar value:
Risk $ = Asset value * Threat * Vulnerability * Impact * Likelihood * Uncertainty

Elimination of risk is categorically impossible; the best that can be hoped for is to get it under control. Even if it were possible, the cost and scalability issues of risk avoidance have to be weighed against the cost of the probable losses resulting from having accepted rather than having eliminated risk (Pai & Basu, 2007).
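As a hedged illustration of Equation 1, the short Python sketch below simply multiplies the factors to produce a risk dollar figure. Every input value is invented for demonstration; in practice each probability and multiplier would have to be estimated for the specific asset, which is exactly the hard part noted above.

def risk_dollar_value(asset_value, threat, vulnerability, impact, likelihood, uncertainty):
    """Equation 1: Risk $ = Asset value * Threat * Vulnerability * Impact * Likelihood * Uncertainty."""
    return asset_value * threat * vulnerability * impact * likelihood * uncertainty

# Invented example: a $250,000 customer database with rough factor estimates.
exposure = risk_dollar_value(
    asset_value=250_000,   # value of the asset to the organization
    threat=0.30,           # probability that a relevant threat materializes
    vulnerability=0.20,    # probability of an undiscovered, exploitable vulnerability
    impact=1.50,           # impact/reparations multiplier if the event occurs
    likelihood=0.50,       # possibility that the event actually unfolds
    uncertainty=1.10,      # allowance for estimation error
)
print(f"Estimated risk exposure: ${exposure:,.2f}")   # roughly $12,375 with these inputs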
Qualys, Inc. (2006) analyzed a global data pool of more than 40 million IP scans with their product QualysGuard. Data analysis revealed six axioms of vulnerabilities. These axioms are important because they help management understand the nature of possible attacks and why and how their data could be at risk of being compromised. Qualys, Inc. (2006) believes that understanding the behavior of vulnerabilities is essential to setting effective security strategy and proactively implementing security solutions.

The axioms of Qualys's research are:
1. Half-life: The average time it takes an organization to patch (or apply a fix to) half of the most dangerous vulnerabilities. The 2006 findings indicate a decrease in the half-life to 19 days (down from 30 in 2003) on external systems. They found that the exposure of unpatched systems continues during the significantly long period of half-life dissipation and increases as the severity decreases.
2. Prevalence: The degree to which a vulnerability poses a significant threat. They found that half of the most prevalent critical vulnerabilities are replaced by new vulnerabilities each year. This means there is ongoing change in the most important threats to our networks and systems.
3. Persistence: The life spans of some vulnerabilities are unlimited; as soon as the current infection is addressed, a variant may appear. In one day Sophos found over 300 variants of the Stration virus. The risk of re-infection can arise during deployment of machines with a faulty, unpatched operating system.
4. Focus: The 2006 study data revealed that 90% of vulnerability exposure is caused by 10% of critical vulnerabilities.
5. Exposure: The time-to-exploit cycle is shrinking faster than the remediation cycle. Eighty percent of critical vulnerability exploits are available within the first half-life after their appearance. Since the duration of vulnerability announcement-to-exploit-availability is dramatically shrinking, organizations must eliminate vulnerabilities faster.
6. Exploitation: Nearly all damage from automated attacks occurs during the first 15 days of the outbreak. Automated attacks pose a special hazard to network security because they inflict damage swiftly, with little time for reaction.

Cannon and Kessler (2007) believe that the rapid increase in breaches and incidents can be directly related to technology. They indicate that the increases in (1) computer processing power and data storage capacity and (2) data transmission bandwidth have exacerbated the problem. This, in conjunction with the massive connectivity of information systems afforded by the Internet and World Wide Web, allows for the mass collection and misuse of sensitive personal data.

Electronic Risk Management

There is a large group of people who believe that, in the final analysis of security breaches, most problems should not be blamed on hackers or malicious employees; instead they should be blamed on a lack of common sense. To them, the vast majority of breaches can be classified under the title of carelessness: people not paying attention to what they are doing, such as putting a letter in the wrong envelope, adding a copy to an e-mail, or losing equipment or hardware. The real culprit is a failure to follow procedures (Padilla, 2007).
However, regardless of how breaches are caused, by ignorance, carelessness, inside users, or criminal hackers, there are a lot of them. The Privacy Rights Clearinghouse (2007) indicates that more than 48 million records containing sensitive personal information were involved in some form of security breach in January 2007 alone. Cannon and Kessler (2007) define a data breach as the unauthorized access to and acquisition of data, in any form or format, containing sensitive information that compromises the security or confidentiality of such information and creates a reasonable risk of its misuse.


IDC indicates that strong corporate governance is the foundation of successful protection
of corporate assets from a wide variety of threats
(CNET, 2004). To that end, organizations need to
establish, educate, and enforce their policies to
effectively ensure the protection they need.
1. Establish: Establish clearly written policies and procedures for all employee communications. The rules must deal with acceptable and unacceptable behavior for the Internet, P2P (peer-to-peer), e-mail, IM (instant messaging), and blogging.
2. Educate: Educate and support written rules and policies with company-wide training. Employees need to understand that the policy is a living document (it will mutate as new threats and issues arise) but that compliance is mandatory. This can be a critical issue for an organization because misuse (whether deliberate or accidental) can result in the organization being held responsible under the legal principle of vicarious liability.
3. Enforce: Enforce written rules and policies with a combination of disciplinary action and software. If there is any doubt about employee willingness to adhere to the organization's usage and content rules, consider applying a technological solution to the people problem. Tools can help with the installation of hardware, software, and/or appliances, and enforce established policies. The organization can block access to inappropriate sites and stay on top of employees' online activity. Failure to discipline employees for e-mail-related misconduct may encourage other employees to abuse the system and could create liability concerns for the organization. It is important to communicate the policies and to adhere to them. The American Management Association (2005) Electronic Monitoring & Surveillance Survey found that most companies monitor employee Web site usage (76%) and use filters to block access to inappropriate Web sites (65%). Slightly more than a quarter (26%) of responding organizations indicated they went further than admonishing individuals: they terminated them for misuse of e-mail or the Internet.
The World Bank indicates that to reduce e-security risk, day-to-day augmentation of e-security internal monitoring and processes is needed. It indicates that proper risk management is achieved through a comprehensive checklist covering the cyber-risks that affect the network as a whole, and it has refined a technology risk checklist based upon standards set by ISO 17799 (Glaessner, Kellermann, & McNevin, 2004).

Malware

The term malware (malicious software) is typically used as a catch-all to refer to a variety of forms of hostile, intrusive, or annoying software designed to infiltrate or interrupt services from a single computer, server, or computer network without the owner's informed consent. The term malware includes all types of troublemakers, such as viruses, worms, kiddie scripts, Trojan horses, and macro (script context) viruses. Malware seeks to exploit existing vulnerabilities on systems. Malware can utilize communication tools to spread, and oftentimes it goes unnoticed. McAfee Avert Labs (Bernard, 2006) has recorded more than 225,000 unique computer/network threats. In just 10 months between January and November of 2006, they found 50,000 new threats. Google researchers (as part of the Ghost in the Browser research) warned that one in 10 Web pages is hiding embedded malware (Provos, McNamee, Mavrommatis, Wang, & Modadugu, 2007).
The term malware is often associated with the characteristic attributes of a virus: self-replicating, something that embeds itself into other programs, which in turn can infect other programs. The notion of a self-replicating program is not new; it dates back to John von Neumann's 1949 lectures. Neumann postulated the theory that a program could reproduce itself. Nearly 35 years later, in November 1983, Professor Fred Cohen substantiated Neumann's work by creating and demonstrating the first computer virus in a computer security seminar. The name virus was provided by Len Adleman (the A in RSA); see http://en.wikipedia.org/wiki/Malware and http://all.net/books/virus/part5.html.
In 1989, John McAfee (of McAfee Avert Labs) defined a virus as a computer program created to infect other programs with copies of itself. It is a program that has the ability to clone itself, so that it can multiply and constantly seek new host environments (McAfee & Haynes, 1989). Today, not all computer viruses inject themselves into their victims, nor is cloning considered mandatory. Researchers now make a distinction between different malware varieties based on whether the malware is considered viral or non-viral (Cafarchio, 2004):

• Viral malware typically replicates rapidly and fairly indiscriminately, and its behavior has a very visible impact. Viral infections might be used as part of a distributed denial-of-service attack; worms like Code Red are able to spread worldwide in a matter of hours.
• Non-viral malicious software does not replicate. It is planted by hackers, or unknowingly downloaded by unsuspecting users, or foisted on systems as part of a software package to track the user's behavior and/or software usage. Non-viral malicious software is designed to be inconspicuous and stealthy. These types of infections can go undetected for long periods of time.
• There are some virus types of malware that are designed merely to harass users and not to intentionally damage files or the operating system. Malware like the Bearded Trojan is of this style. The Bearded Trojan displays a nude female, and while it is potentially offensive or embarrassing, it often makes people realize that they are vulnerable and could have been infected with a virus that acts as a key logger or a Web bot (Harley, Slade, & Gattiker, 2001).
• Another example of a non-viral virus is the ANSI bomb; thankfully, these are not common and they do not reproduce. An ANSI bomb is a sequence of characters that is meant to redefine key(s) on a keyboard. Thus, when the user presses a key, the normally assigned characters for that key are not sent to the terminal or computer, but rather the redefined string. This string may contain any ASCII characters, including <RETURN> and multiple commands. A function key or even the space bar could be assigned a string that invokes a program to do something the user does not want to happen: copy, delete, or send material (Harley et al., 2001).

Adware/Spyware

A particularly annoying and dangerous form of malware is adware/spyware. The terms are commonly used interchangeably. The goal of this technology is to gather information without the target person's knowledge or permission. This type of software is used to watch and record which Web sites and items on the Internet the user visits, in hopes of developing a behavioral profile of the user that can later be exploited. The slight difference between the two terms is the intent of the software agent. Adware has an advertising aspect in the information it collects, while spyware tracks and records user behavior (in the traditional sense of the word spy).
The problem with spyware is that users typically store all sorts of sensitive and personal information on their machines that should not be made public, some of it protected by law, such as trade secrets and financial data. The loss of personnel and customer information could wreak havoc for the organization. Additionally, the theft of stored information such as bank account numbers, credit card numbers, social security numbers, and pictures could also devastate the individual.
Another thing that makes adware/spyware so pernicious is that antivirus software and firewalls are not very effective against them. While a good antivirus program (AV) is absolutely essential for any machine, even those that do not connect to a network (especially if the machine accepts removable media), it is not enough. AV software will not protect user machines from spyware. Viruses and spyware have different properties; because spyware does not propagate itself like other forms of malware, it is not likely to be detected by traditional AV methods.

Botnets

Vint Cerf, one of the founding fathers of the Internet, believes that one in four computers (approximately 150 million out of 600 million) connected to the Internet are compromised and likely to be unwilling members of a botnet (Fielding, 2007). These machines are often used as proxies for illegal activities like spamming and credit card fraud. Botnets have been a growing problem on the Internet since at least 2002. A bot (short for robot) is a software agent released onto a computer connected to the Internet. The bot can download malicious binary code that compromises the host, turning it into a zombie machine. The collection of zombies is called a botnet. The servers hosting the bot binaries are usually located in countries unfriendly to the United States. The bots are transparent and run in the background. Bots can open a channel to a bot controller machine, which is the device used by the perpetrator (the bot herder) to issue commands to the bots (Baylor & Brown, 2006).
Bot herders typically use bot controllers to harvest user accounts via screen-capture, packet-sniffing, and key-logging techniques. They are routinely used for phishing (some estimate that over 70% of Internet spam is due to them), click fraud, and malware distribution.
Botnets can be used for attacking various Web sites by unleashing a barrage of requests against a site, so that the victim site spends more time processing the requests coming at it than it does doing the job it was intended for. These attacks employ a technique known as a distributed denial of service (DDoS) attack. The idea behind the DDoS is to flood a machine, server, or cluster with requests faster than it can respond to them. DDoS attacks chew up bandwidth and resources, effectively shutting down the attacked network. In this manner a large botnet can wield an amazing amount of power. If several large botnets were allowed to join together, they could literally threaten the national infrastructure of most countries. On April 27, 2007, a series of DDoS attacks crippled financial and academic Web sites in Estonia (Kirk, 2007).
Botnets are no longer just annoying, spam-pumping factories; they are big business for criminals. This shift has awakened large businesses, which historically have either looked the other way or been in denial about bots infiltrating their organizations (Higgins, 2007).
The Anatomy of a Typical Botnet Attack

Step 1: The bot herder loads remote exploit code onto an attack machine that might be dedicated to this purpose or an already compromised machine. Many bots use file-sharing and remote procedure call (RPC) ports to spread.
Step 2: Attack machines scan for unpatched (not current with updates) victim machines to launch attacks against.
Steps 3 & 4: The victim machine is ordered to download files (binaries) from another server (frequently a compromised Web or FTP server).
Step 5: These binaries are run on the victim machine and convert it to a bot. The victim machine connects to the bot controller and reports for duty.
Step 6: The bot controller issues commands to the victim to download new modules, steal account details, install spyware, attack other machines, and relay spam.
Step 7: The bot herder controls all bots by issuing commands via the bot controller(s).

The World's Biggest Botnets

Storm

There is a new threat, that of the super botnet. While few agree on the actual size of these botnets, they are huge; the number of active members per 24-hour period (not just attached zombies) can be in the hundreds of thousands. Currently, the largest of the new breed of botnets is Storm. Storm broke away from the mold and uses decentralized peer-to-peer (P2P) communication instead of the traditional centralized Internet relay chat (IRC) model. The P2P approach makes it tough to track and tougher to kill; you cannot render it mute by disabling one or two central control machines.
Storm uses a complex combination of malware, which includes worms, rootkits, spam relays, and Trojans. It propagates via a worm or when a user visits an infected site or clicks on a link to one. It is very stealthy: it employs a balanced use approach and fast-flux. The purpose of fast-flux is to circumvent the IP-based black list technique (see black listing). It does this by rapidly rotating DNS records to prevent discovery (Higgins, 2007).
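A rough way to observe fast-flux behavior is to resolve the same host name repeatedly and note how quickly the returned addresses churn. The sketch below uses only the Python standard library; the domain shown is a placeholder, not a real Storm host, and the sampling interval is an arbitrary assumption.

# Resolve a host name several times and report how many distinct A records appear.
# "suspect-domain.example" is a placeholder; fast-flux domains rotate answers rapidly.
import socket
import time

def observe_addresses(hostname, samples=5, delay=2.0):
    seen = set()
    for _ in range(samples):
        try:
            _, _, addresses = socket.gethostbyname_ex(hostname)
            seen.update(addresses)
        except socket.gaierror:
            pass                        # failed lookups are common for such domains
        time.sleep(delay)
    return seen

addresses = observe_addresses("suspect-domain.example")
print(f"Distinct addresses observed: {len(addresses)}")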

Rbot

Rbot is generally considered the second largest botnet. It employs an old-style communication structure using Internet relay chat. Because it uses an IRC approach, it does not scale very well and is unlikely to reach Storm's size. Rbot's underlying malware uses a backdoor to gain control of the infected machine, installing keyloggers and viruses, and even stealing files from the infected machine, as well as performing the usual spam and DDoS attacks. The really scary part is that Rbot malware is readily available to anyone who wants to try some kind of criminal activity in the bot arena (Higgins, 2007).

Whose Fault Is It?

The answer to this question depends on who you ask. It can easily be argued that it is the user's fault. If users keep their antivirus up to date and stay away from the traditional types of sites that harbor malware (celebrity sites, for example), the problem should be lessened. However, variants of viruses have been tracked in the hundreds per day; it is hard to keep current on protection when there is a whole industry working against you.
Since it may not necessarily be the user, then it must be the developers, or the publisher, for not creating a product that cannot be usurped. Unfortunately, there are highly skilled, university-trained hackers who strive to develop the right code. After all, there is really only one reason for botnets: to make money. Some people blame law enforcement or government for not taking prompt and decisive action. However, many of the bot herders are in countries in which the U.S. does not have jurisdiction. Politicians can pass laws but never be in the position to have them enforced.
To that end, in 2007, Senators Orrin Hatch (R-Utah) and Joseph Biden, Jr. (D-Delaware) introduced the Cybercrime Act to update existing laws and close what they say are loopholes that online criminals can exploit. The bill takes a multifaceted approach: it lowers the threshold of evidence, and it addresses not only damage to computers but also harm to individuals. It prohibits the creation of botnets that could be used in online attacks, and it makes the threat of revealing (extortion) confidential information illegally obtained from computers a crime (Savage, 2007).

Botnets: FBI Operation Bot Roast


In the second week of November 2007, John Schiefer of Los Angeles, California agreed to plead guilty to felony charges for building and using a botnet of as many as 250,000 nodes to steal personal identifying information (PII). The botnet was used to invade individuals' privacy by intercepting electronic communications being sent over the Internet from the zombie computers to PayPal and other Web sites. Later, data mining techniques were used to garner PII such as usernames and passwords. With the usernames and passwords, the perpetrators accessed bank accounts to make purchases without the consent of the true owners. The botnet was also used to defraud a Dutch advertising company. This was the first U.S. prosecution under the U.S. federal wiretap statute for conduct related to botnets (Wilson, 2007).
The FBI and Department of Justice, in an anti-botnet sweep labeled Operation Bot Roast, have arrested three individuals for assembling botnets. They are charged with felonies. One of the three arrested is alleged to have used a large botnet network to send tens of millions of unsolicited e-mail messages. Another is charged with infecting more than 10,000 computers worldwide, including two Chicago hospitals. The bots caused the infected computers to, among other things, repeatedly freeze or reboot, causing significant delays in the provision of medical services. It took the hospitals more than 1,000 man-hours to clean up after the infections (Keizer, 2007; Albanesius, 2007).
The government is working in conjunction with industry partners to uncover these schemes. These partners include the CERT Coordination Center at Carnegie Mellon University as well as Microsoft and the Botnet Task Force (a low-profile organization initiated by Microsoft in 2004 that acts as a means of building awareness and providing training for law enforcement).
In the process, the FBI has identified more than 1 million hijacked personal computers. The majority of victims are not aware that their computers have been compromised or their personal information exploited. The FBI said that because of the widely distributed capabilities of botnets, they not only harm individuals but are now considered a threat to national security, as well as to the information infrastructure and the economy.

Search Engines

A problem with most search engines is that they are indifferent to content permissions. Certain individuals (such as the head of payroll) may have permission to view all of the company's information, while other individuals (such as the head of personnel) are limited in the types of data they are allowed to see. An employee may be given permission to see their own information but not that of the person working next to them. There may also be certain individuals who are not allowed to see any information at all. Because search engines typically cannot take data ownership and coordinate it with user permissions, problems can arise when responding to a request.
When implemented carelessly, search engines have the potential to uncover flaws in existing security frameworks and can expose either restricted content itself or verify the existence of hidden information to unauthorized users (Vivisimo, 2006). In this regard, poorly implemented search engines could release large amounts of personal identification information. Imagine typing the name of the CEO into a search engine and receiving a page that lists his or her personal phone number, salary, and home address.
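One common mitigation is to filter results against an access-control list before they are returned, so that the index neither shows restricted documents nor confirms their existence to users who lack permission. The sketch below is a simplified illustration; the document names, roles, and ACL are invented and do not reflect any particular search product.

# Illustrative post-filtering of search results by user permission.
# Documents, roles, and the ACL are invented for demonstration.
DOCUMENT_ACL = {
    "payroll_2008.xls": {"head_of_payroll"},
    "strategic_plan.doc": {"executive", "head_of_payroll"},
    "cafeteria_menu.txt": {"all_employees"},
}

def authorized_results(raw_hits, user_roles):
    """Return only the hits the user may see; unauthorized hits are silently dropped."""
    return [doc for doc in raw_hits if DOCUMENT_ACL.get(doc, set()) & user_roles]

hits = ["payroll_2008.xls", "strategic_plan.doc", "cafeteria_menu.txt"]
print(authorized_results(hits, user_roles={"head_of_personnel", "all_employees"}))
# Only the cafeteria menu is returned; the existence of the other files is never confirmed.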
Wireless Media

Organizations may think their mobile workers are safe with their new wireless notebooks, but recent WLAN tracking at the RSA security conference showed a multitude of vulnerabilities. Some common faults were that many users were using hotspots but had no idea who was sponsoring the ports. In some cases, it was discovered that the users were actually talking to other local computers that also had their connections active (Shaw & Rushing, 2007).
Wireless devices often remember the last good site they were connected to and attempt to use it first. This means that if the user did not shut down the port (disconnect from a hotspot correctly), the computer will look for that spot first, even if there is a more secure connection available. Another issue is that the port will continue to actively search for a signal. A critical situation can arise if the user forgets to disable the wireless card and then plugs his/her device into a wired network. A couple of things could happen: the network will see the other port and might adjust its routing information to accommodate it, in the process possibly bypassing firewalls and border security. The device might also connect to another device via the wireless port, again bypassing some security, but elevating the permissions and authority of the newly connected user to that of the legitimate user. In either case, the result is a huge hole in security (Shaw & Rushing, 2007).
Organizations are paying a very high price for wireless management. The Aberdeen Group estimates that it costs nearly 10 times more to manage wireless services and devices compared to wired lines (Basili, 2007). In spite of that, Aberdeen found that 80% of respondents were planning increases in mobile wireless access.
The RSA Conference is an event that draws thousands of computer users. Many of them bring their wireless laptops (and other devices). AirDefense (2005), a wireless security company credited by many as the founder of the wireless security industry, found that more than half of the 347 wireless devices it monitored during the conference were susceptible to attack. What is truly amazing is not that it happened once, but that just 2 years later it happened again at another RSA conference. AirDefense once again found that more than half of the wireless devices on the conference network were themselves unsecured and vulnerable to attack, leading to the conclusion that the people responsible for protecting enterprise data were not doing a very good job of protecting their own assets (Cox, 2007).

Telephones

Wireless telephones with computer-enabled features (such as e-mail and Internet access) have been compromised; Trend Micro Inc. announced it had found security flaws in MS Windows Mobile, a popular operating system used in smartphones. Many individuals who use these devices are executives who routinely access sensitive information. In this case, the main risk is not malware, but the risk of lost devices.

Mobile Encryption

The news regularly reports that laptops with thousands of sensitive records on customers or employees are lost or stolen each month. Organizations know the risks and the threats. These threats are easy to understand, but most organizations do not allocate the resources necessary to protect themselves. Encryption is an effective safeguard for most mobile devices, and one that will relieve some of the legislative pressures. However, it is far from being fully adopted; a survey by Credant (see McGillicuddy, 2006) asked respondents to list reasons why their companies had not adopted encryption for mobile devices:

• 56% indicated it was due to a lack of funding;
• 51% said encryption was not a priority; and
• 50% said there were limited IT resources.

In other words: no one wants to pay for it.

Mobile devices are often seen as low-powered, low-capacity corporate tools, so there is considerable fear that encryption will add little but will, in the end, slow them down. Critics note that the idea behind mobile devices is to make the user more productive through added convenience; anything that slows down the devices would ultimately detract from the user's productivity. Additionally, encrypted devices are harder to diagnose, repair, and recover. However, these concerns are more applicable to older, less powerful devices (McGillicuddy, 2006).
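As a sketch of what file-level encryption on a mobile device can look like, the example below encrypts and decrypts a file with the Fernet recipe from the third-party cryptography package. The file name is a placeholder, and in practice the key would be kept in a hardware-backed key store or derived from a passphrase rather than generated next to the data.

# Minimal file-encryption sketch using the "cryptography" package (pip install cryptography).
# "customer_records.csv" is a placeholder; keep the key off the device's file system.
from cryptography.fernet import Fernet

key = Fernet.generate_key()             # in practice: protect this in a key store
cipher = Fernet(key)

with open("customer_records.csv", "rb") as f:
    plaintext = f.read()

with open("customer_records.csv.enc", "wb") as f:
    f.write(cipher.encrypt(plaintext))  # a lost device exposes only ciphertext

# Recovery on an authorized machine that holds the key:
with open("customer_records.csv.enc", "rb") as f:
    restored = cipher.decrypt(f.read())
assert restored == plaintext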

Data

Organizations accumulate a wide breadth of data that, if stolen, could potentially hurt the enterprise. Loss or theft of confidential information, such as blueprints and engineering plans, tenders, budgets, client lists, e-mails and price lists, credit card and other financial information, medical or other confidential personally identifiable records, classified, restricted or personal information, scripts, storyboards, source code, database schemas, or proprietary trade secrets, can severely impact the integrity and profitability of a corporation. This risk is amplified by the prevalence of portable computing devices as a part of normal business activities and by the increasing levels of online transactions that occur routinely (GFI-2, 2007).
Fundamentally, there are two types of security.
The first type is concerned with the integrity of the
data. In this case the modification of the records
is strictly controlled. The second type of security
is the protection of the information content from
inappropriate visibility. Names, addresses, phone
numbers, and credit card details are good examples

of this type of data. Unlike the protection from


updates, this type of security requires that access
to the information content is controlled in every
environment.
The Internet makes it easy for organizations
to collect personal identifying information, such
as: names, addresses, social security numbers,
credit card numbers, or other identifiers (Shimeall,
2001). If this information were disclosed inappropriately, it would put these individuals at risk
for identity theft (Wang, Lee, & Wang, 1998). To
guard against such an outcome, laws worldwide
have been passed to aid in data protection.

The Threat from Within

Within the U.S., the Gartner Group estimates that 70% of all unauthorized access to information systems is committed by employees. The CSI/FBI survey found that 68% of respondents claimed losses due to security breaches originating from insiders (Gordon, Loeb, Lucyshyn, & Richardson, 2006). Of course, the magnitude of insider malfeasance depends somewhat on how one slices and dices the numbers. The U.K. Scotland Yard Computer Crime Research Center (2005) found that 98% of all crimes committed against companies in the U.K. had an insider connection. In the USA, surveys conducted by the U.S. Secret Service and the CERT Coordination Center concluded that respondents identified current or former employees and contractors as the second greatest cyber security threat, preceded only by hackers (Keeney, Kowalski, Cappelli, Moore, Shimeall, & Rogers, 2005).

Endpoint (Perimeter-Based) Security

The term endpoint, as its name implies, refers to any place where a device can interact with another device. Generally speaking, an endpoint is an individual computer system or device that acts as a network client and serves as a workstation or personal computing device. Endpoints are often mobile and intermittently connected, and in the mobile society they are becoming indistinguishable (Forescout, 2007; Endpointsecurity, 2004).
Laptops have become so popular that they have almost caught up with desktop machines in office use (40% to 45%; CSI/FBI survey, 2006). Because laptops are not tethered to the desk, they are routinely outside the protection of the organization's network. Additionally, if removable media (floppies, CDs, DVDs, flash drives) are used on laptops or office machines, they are an easy entry point for malware. A further security concern is the practice of engineering devices for easy maintenance. These easy-maintenance devices can allow a person to literally remove the internal hard drive from a laptop in less than a minute and make off with all of the private data that is in the machine.
Endpoint security is the total set of measures taken to secure the sending and receiving of data. These measures include assessing the risk to the clients' antivirus and personal firewalls, as well as protecting the network from the endpoints themselves. Endpoint security logically extends to the management and administration of these security measures. It also deals with risk, reporting, and knowledge management of the state and results of these measures (Positive Networks, Endpoint security).

Endpoint Components
Firewalls
In general terms, a firewall is software or a
hardware device that controls the flow of traffic
between two networks or entities. A packet filter
firewall works by inspecting the contents of each
network packet header and determining whether
it is allowed to traverse the network. There are
basically three types of firewalls: packet filter,
stateful inspection, and application proxy.



In the case of a personal firewall, it controls the network traffic between a computer on one side and the Internet or corporate network on the other side. A firewall is a network (hardware and software) node that isolates a private network from a public network. The firewall's job is to keep unwelcome traffic from the Internet out of the computer, and also to keep in the traffic that you do not want leaving the computer. To that end, organizations may have several firewalls to create barriers around different layers of their infrastructure. Firewalls are often compared to a bouncer at a nightclub: they are located at the point of entry; they enforce rules to determine who gets in (and out); and they inspect all that passes through the doors they are guarding. With a layered approach, it is possible for a firewall to ensure that even if a password is compromised, an intruder will only have restricted access to the network.
However, firewalls are neither the first nor the last word in endpoint components. Hardware and software firewalls have a serious flaw in that they typically do not look at the contents of a packet; they only look at its headers. As noted earlier, antivirus software is not very effective against spyware; the same is true of a firewall.
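To show what inspecting only the headers means in practice, the sketch below evaluates a packet's header fields against an ordered rule list, the way a simple packet-filter firewall does. The rule set, addresses, and packet structure are invented for illustration; note that nothing in the payload is ever examined.

# Simplified packet-filter logic: decisions come from header fields only, never the payload.
# Rules and addresses are illustrative; the final rule is a default deny.
RULES = [
    {"action": "allow", "protocol": "tcp", "dst_port": 443},                       # HTTPS
    {"action": "allow", "protocol": "tcp", "dst_port": 25, "src": "10.0.0.5"},     # mail relay only
    {"action": "deny",  "protocol": "any"},                                        # default deny
]

def filter_packet(header):
    """Return 'allow' or 'deny' based on the first matching rule."""
    for rule in RULES:
        if rule["protocol"] not in ("any", header["protocol"]):
            continue
        if "dst_port" in rule and rule["dst_port"] != header["dst_port"]:
            continue
        if "src" in rule and rule["src"] != header["src"]:
            continue
        return rule["action"]
    return "deny"

print(filter_packet({"protocol": "tcp", "src": "198.51.100.7", "dst_port": 443}))  # allow
print(filter_packet({"protocol": "udp", "src": "198.51.100.7", "dst_port": 53}))   # deny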

Preventive Measures

The open nature of PCs in most organizations has resulted in users installing a wide variety of applications that they use to get through their day, and several that they should not. Some IT managers attempt to prohibit the use of unauthorized peripherals (removable media) and applications in the hope that this process will shut out malware. The usage of portable devices at work could impact corporate network security through the intentional or unintentional introduction of viruses, malware, or crimeware that can bring down the corporate network and/or disrupt business activity.

Even with the tightest security net, it is possible for a destructive breach to occur. Failure to implement a security audit process to meet government regulatory requirements can result in significant fines, in addition to the possibility of imprisonment. The risks are real and affect businesses on a daily basis (Juniper Research, 2006).
Further, devices are a threat not only to data and machine integrity, but also to worker productivity. An employee can use company hardware and software to enhance digital photos, play computer games, or work on freelance projects. Control of USB (universal serial bus) ports can limit unauthorized use and prevent intentional or accidental attacks against a company's network (Muscat, 2007). Control of the USB ports can be exercised either programmatically or by physically locking and blocking them (PC Guardian).
Additionally, there are emerging technologies that monitor the movement of data and enforce actions on the data based on company policies. These products, from vendors such as Orchestria and Vericept, work at the network and desktop levels and can monitor movement, as well as prevent data from being copied from the originating application to external sources, such as USB drives.
Another approach relies on detecting the departure of an authorized user. A wireless USB PC lock will lock and unlock a PC based on the user's proximity to the machine. A small transmitter is carried by the user; if s/he is more than two meters away, the machine's screen is programmatically locked, and once the user returns in range the screen is unlocked.
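The logic of such a proximity lock is easy to express; the sketch below uses simulated distance readings because the actual measurement comes from the vendor's wireless token, and it only prints the decision (on Windows, the real action would be a call such as ctypes.windll.user32.LockWorkStation()).

# Proximity-lock sketch with simulated distance readings. A real product would read
# its own wireless token and invoke the operating system's screen-lock call.
import time

LOCK_DISTANCE_METERS = 2.0

def monitor(distance_readings, poll_seconds=0.0):
    for meters in distance_readings:
        action = "lock screen" if meters > LOCK_DISTANCE_METERS else "leave unlocked"
        print(f"user at {meters:.1f} m -> {action}")
        time.sleep(poll_seconds)

# Simulated samples: the user walks away after the third reading.
monitor([0.5, 0.8, 1.2, 3.4])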

The End User

While this chapter is aimed at management, we would be remiss if we did not describe some things that the end user can do. This list is far from complete, and some may argue about the order in which items are presented. They might also point out that important suggestions have been omitted. The caveat is that this list is not for corporate users; it is for the home user. For the home user, the advice is simple:

1. Get a good antivirus package and keep it up to date.
2. Let your system download system updates (patches) from a trusted site.
3. Deactivate ActiveX components.
4. Do not install items from unknown sources.
5. Do not open e-mails from people or organizations that you do not know.
6. Never click on an embedded e-mail link; copy it or use a bookmark.
7. Be extremely careful about what sites you visit.
8. Strangers that send you mail want something!
9. You will not win something if you did not enter.

In an organizational environment, the above still applies. However, the user is usually burdened by user names and passwords. The number one suggestion is to pick a strong password and not share it with anyone for any reason. If you need to have multiple sign-ons, tailor the passwords for each application. For example, your password for accounts payable may begin with AP. The easiest way to pick strong passwords is to create an acronym out of your favorite song lyrics: take the first letter of each of the first 12 words, and add your application code and some important number, like the middle digits of your first home address.
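The song-lyric acronym rule can be mechanized directly, as the sketch below shows. The lyric, application code, and digits are placeholders to be replaced by the user's own choices; the sketch only restates the advice above, it is not a password policy.

# Build a password per the acronym advice: the first letter of each of the first
# 12 words of a lyric, plus an application code and a memorable number.
def acronym_password(lyric, app_code, digits, words=12):
    initials = "".join(word[0] for word in lyric.split()[:words])
    return f"{app_code}{initials}{digits}"

# Placeholder inputs; substitute your own lyric, code, and digits.
print(acronym_password(
    lyric="somewhere over the rainbow way up high there's a land that I heard",
    app_code="AP",            # e.g., accounts payable
    digits="47",              # e.g., the middle digits of a first home address
))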
The Human in the Equation

According to CompTIA's IT security survey, human error, either alone or in combination with a technical malfunction, was blamed for 74% of IT security breaches (Cochetti, 2007). Human involvement in systems is not limited to making errors; during the day, users often take breaks to surf the Web, e-mail, or IM their friends.
However, Web surfing can do more than relieve stress and waste time; it can expose users and organizations to dangerous Web sites, data leakage, and e-mails with inappropriate or dangerous content. Further, it can lead to the installation of non-authorized software, which, besides prompting civil and criminal investigations, can introduce privacy-robbing malware. This type of publicity has a negative impact on the bottom line. To protect themselves, organizations should abide by a strong user access policy (Shinder, 2007).
Instant messaging (IM) has begun to be embraced by organizations because it provides a cost-effective means to communicate electronically, both synchronously and nearly instantaneously. IM presence awareness and permission-based lists give the perception of a low risk of receiving spam or other unwanted messages. However, the rapid adoption of public IM services (such as AOL, MSN, and Yahoo) has raised serious concerns about security risks and compliance with regulatory requirements. IM and e-mail can be used as tools for deliberately disseminating private information, or they may provide a channel that could inadvertently admit spyware, worms, or viruses. Since instant messaging involves free electronic communication with internal employees and anyone designated as a trusted colleague, unauthorized information dissemination may proliferate when it goes unmonitored (Webex, 2006).
Roger J. Cochetti, group director of CompTIA U.S. Public Policy, states that security assurance continues to depend on human actions and knowledge as much as, if not more than, it does on technological advances. He indicates that failure to follow security procedures (human error) was blamed by more than 55% of the organizations as the factor that contributed the most to security breaches (Cochetti, 2007).

Listing: White, Black, and Gray

Listing is a response to malware's continuous mutation of its signatures, which results in a continuous flow of zero-day attacks. The basic idea is to restrict execution of programs based on a list. Listing comes in three distinct styles: white, black, and gray.
White listing basically consists of allowing users/workstations to run only software that has been pre-approved by the organization. Implementing this approach requires conducting an exhaustive inventory of all applications in use as well as their versions. Once the inventory is completed, each application must be reviewed to ensure it is required. After the review, the software implementations and versions need to be made consistent across the protected network segments.
Black listing is the opposite of white listing. Workstations are prevented from running applications or visiting Web sites that are specifically listed. Thus, sites that are found to be perpetrators of malware and spam are banned from user activity. While this may seem a viable approach to business managers, it is weak and can be very risky if not supported by additional controls. A missed module can be disastrous. Further, new malicious or highly vulnerable applications are created or identified faster than they can be placed on a blacklist.
Gray listing is a conditional blacklist, and it has a high risk of false positives (blacklisting someone by mistake).
Hybrid listing combines the features of white, black, and gray listing. It is designed so that management can approve some software and ban other software that is not needed or wanted, thus preventing the first execution of any new, unknown software. Because the hybrid approach prevents the first execution, not the installation, the approval/authorization process can be centrally managed in real time.
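A minimal sketch of the listing idea follows: a white list of approved programs, a black list of banned ones, and a default decision for anything unknown that depends on whether the organization runs in white-list mode (block unknown software) or black-list mode (allow it). The program names are invented stand-ins for what would normally be cryptographic hashes or signed publisher identities.

# Illustrative execution-control check. Program names stand in for hashes/signatures.
WHITE_LIST = {"winword.exe", "excel.exe", "erp_client.exe"}
BLACK_LIST = {"keygen.exe", "coolscreensaver.exe"}

def may_execute(program, mode="white"):
    """mode='white': only approved software runs; mode='black': only banned software is stopped."""
    if program in BLACK_LIST:
        return False
    if program in WHITE_LIST:
        return True
    # Unknown software: the white-list/hybrid stance blocks it pending review;
    # the pure black-list stance lets it run.
    return mode != "white"

for prog in ("excel.exe", "keygen.exe", "new_unknown_tool.exe"):
    print(prog, "->", "run" if may_execute(prog, mode="white") else "blocked")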

Browser-based listing relies on a modern browser to check that the site a user is going to is not a forgery. One option downloads a list of known Web forgeries (see Figure 1, ploy to capture personal information), but this technique only offers real protection for a few moments after the list is downloaded. Another technique is to have the browser check with an authority (such as Google) each time a URL or site is entered.
Mozilla indicates that users can protect themselves from Web forgeries by:

• not following links from an e-mail to banks or online commerce sites; instead, always either type the Web page address in manually or rely on a bookmark;
• using a password manager to remember passwords instead of entering them manually; and
• using an e-mail product that will detect and alert the user about suspect Web sites.

Least Privilege Authority

In an organizational environment, the information systems/information technology group struggles to give users the access they need and want while attempting to ensure that security is not sacrificed. Programs that perform useful functions of work are known as applications. Applications need certain capabilities to create, read, update, and delete data; these privileges often go by the acronym CRUD. Applications also need access to certain services that are only granted through the operating system or the system administrators, such as scheduling new tasks, sending information across applications, and verifying passwords. In order for that to work, the application/user needs to be at a high enough level of trust (permissions/privileges/authority) to carry out what it is doing.
With the principle of least privilege, the goal is to give users only the minimal access and privileges they need to complete the task at hand. In most cases this will be for the entire logon session, from the time they log on in the morning till they leave for the night. The principle of least privilege is a prophylactic, a kind of safety belt: if the machine is not attacked by malware, it is not necessary and does no harm; but if it is, it is an extra layer of protection. Therefore, the construct of least privilege is becoming a common phrase as organizations scramble to protect network assets and resources.
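The sketch below shows one way the principle could be enforced inside an application: each role is mapped to the minimal CRUD operations it needs on each resource, and every access is checked against that grant. The role names, resources, and permissions are invented examples, not a recommended schema.

# Least-privilege check: each role receives only the CRUD rights it needs.
ROLE_GRANTS = {
    "payroll_clerk":  {"payroll": {"read", "update"}},
    "hr_assistant":   {"personnel": {"read"}},
    "backup_service": {"payroll": {"read"}, "personnel": {"read"}},
}

def authorize(role, resource, operation):
    """Allow the operation only if the role was explicitly granted it."""
    return operation in ROLE_GRANTS.get(role, {}).get(resource, set())

print(authorize("payroll_clerk", "payroll", "update"))   # True: needed for the job
print(authorize("payroll_clerk", "payroll", "delete"))   # False: never granted
print(authorize("hr_assistant", "payroll", "read"))      # False: outside the role's scope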


Vulnerability Management

The process of patch management can be complex and difficult, and it is often sacrificed when an organization is in crisis mode. If shortcuts are taken, they will almost always come back to haunt the organization. Patching in programming has long been defined as trading an error that is known for one that is unknown; it is not something to rush through. Vendors spend considerable time researching vulnerabilities and devising repairs or work-arounds. Many of the repairs depend on earlier updates having already been applied. Failure to stay current on updates is one of the main reasons that enterprises struggle with bot infections (Symantec).
Patching is a trade-off between the time required to repair a problem responsibly and completely and the hacker's window of opportunity to exploit a specific vulnerability. Vulnerability management has become a critical aspect of managing application security. Patching vulnerabilities (depending on the severity) can be a time-consuming job. To do it safely, the patches should be applied and tested in an isolated environment against a copy of the system.

• New components of distributed architectures: Standardization and plug-and-play are not always positive; they come with a price. Standardized code makes it easier for all involved, the developer and the criminal hacker alike. Each module represents a unique addressable attack point, a target at which criminal hackers can aim their exploits.
• Multiplying network access points can act much like an open wound; if one is not careful, it will allow in all sorts of viruses and the like. With organizations opening their networks to suppliers, clients, customers, employees, and contractors, security has become a mandate. Multiple entry points have raised the importance of controlling the traffic that comes and goes through the network. In this regard, firewalls and antivirus products are important parts of an effective security program.
• Wireless network access points bring their own set of issues. With wireless, perimeter (endpoint) security is critical. It is important to have an IDS (intrusion detection system) and to monitor all traffic.
• Simply relying upon firewalls and antivirus is not an effective strategy. Understanding the network and understanding its weaknesses (vulnerabilities) can provide insight on how to manage and protect critical data.

Conclusion

No matter how hardened a network perimeter is, there are a number of weaknesses that can allow breaches to occur. It is usually recommended that a layered defense approach be adopted to strengthen protection. However, care needs to be taken that additional layers actually add protection instead of just protecting against the exact same vulnerabilities or threats. Reckless implementation or selection of software may not produce the desired outcome. A layered approach may be more like buying overlapping warranty coverage. The harm is that businesses may mistake this approach for real security. Ultimately, they could end up spending more money and resources on implementing the wrong security mechanisms without gaining complete security (Ou, 2007).
Remember, the organization is responsible for maintaining the privacy of its stakeholders and consumers while also preserving a harassment-free, discrimination-free, crime-free, and civil business environment. The development, implementation, and enforcement of a comprehensive Internet policy can help toward that goal. Whether employees intentionally violate Internet policy or accidentally surf to an objectionable Web site, under the legal principle known as vicarious liability the employer can be held responsible for the misconduct of the organization's employees, even if the employer is completely unaware that there is a problem.
Simply following security best practice by limiting access rights may be a good first step, but it is just a step. No single approach is going to be totally viable against all malware and protect privacy. The best protection comes from using a layered approach. In addition to using technology, it is important to:

• Create and enforce policies and procedures
• Educate and train
• Monitor the network and the systems
• Require penetration testing
• Ban inappropriate sites and prohibit wasted resources and productivity

Aberdeen Group's (2005) research shows that technology by itself is not the primary indicator of success; this was true despite differences in technology usage, loss rates, and firm sizes. They also found that organizations performing as best-in-class leaders focus on managing four areas to maximize results for the money being spent on security:

1. Sharing of data and knowledge to improve results
2. Processes in place for executing against objectives
3. Organizational structure and strategy to manage to results
4. A security technology maturity that influences results

Of the four, they indicate that the most important focus area is the managing of data and
knowledge to improve results.
This chapter presented an overview of the concerns that organizations must address while working within the Internet community. It was meant to inform management of the potential threats and pitfalls that must be addressed to be a viable player within the Internet realm. While there are many technical areas that need to be attended to, nothing is more important than ensuring that users' confidentiality, integrity, and authenticity (CIA) are maintained. Hackers and con artists are devising clever and inventive techniques to violate a user's privacy for the purpose of committing illegal activities. If left unchecked, these issues threaten the viability of e-commerce and e-business.

Future Research Directions

This chapter lays out some of the issues that must be concentrated on, with the most emphasis placed upon a strong organizational Internet privacy and security policy, followed by education and training of users and stakeholders.
Future research should focus on how large and small organizations create, maintain, and monitor privacy and security policies. Because organizations are of differing sizes and have different resources available, research should investigate how large and small organizations vary in their approaches and implementation. Future research should also focus on how existing protections can be expanded to protect tomorrow's technology. Finally, research needs to be conducted on how to protect portable storage devices from misuse, as this type of media is bound to proliferate.

References
Aberdeen Group. (2005). Third Brigade business value research series: Most important security action: Limiting access to corporate and customer data. Whitepaper. Retrieved October 2007, from http://www.thirdbrigade.com/uploadedFiles/Company/Resources/Aberdeen%20White%20Paper%20--%20Limiting%20Access%20to%20Data.pdf
Air Defense Press Release. (2005, February


17). AirDefense monitors wireless airwaves at
RSA 2005 conference. Retrieved October 2007,
from http://airdefense.net/newsandpress/02_07_
05.shtm
American Management Association. (2005).
Electronic monitoring & surveillance survey.
Retrieved October 2007, from http://www.amanet.
org/research/pdfs/EMS_summary05.pdf
Basili, J., Sahir, A., Baroudi, C., & Bartolini,
A. (2007, January). The real cost of enterprise
wireless mobility (Abridged ed.). The Aberdeen
Group. Retrieved October 2007, from http://www.
aberdeen.com/summary/report/benchmark/Mobility_Management_JB_3822.asp
Baylor, K. (2006, October 26). Killing botnets. McAfee. Retrieved March 2007, from http://blogs.techrepublic.com.com/networking/?cat=2
Bernard, A. (2006). McAfee's top ten security threats for 2007. Retrieved October 2007, from http://www.cioupdate.com/print.php/3646826
Bumgarner, J., & Borg, S. (2007). The US-CCU
cyber security check list. Retrieved November
2007, from http://www.usccu.us/documents/USCCU%20Cyber-Security%20Check%20List%
202007.pdf
Cafarchio, P. (2004). The challenge of non-viral malware! TISC Insight Newsletter, 4(12).
Retrieved October 2007, from www.pestpatrol.
com/Whitepapers/NonViralMalware0902.asp
Cannon, D. M., & Kessler, L. (2007). Danger: Corporate data breach! Journal of Corporate Accounting & Finance, 18(5), 41-49.
CERT. (2007). Vulnerability remediation statistics. Retrieved November 2007, from http://www.cert.org/stats/vulnerability_remediation.html
Clearswift. (2006, October). Simplifying content security: Ensuring best-practice e-mail and Web use. The need for advanced, certified email protection. Retrieved October 2007, from http://whitepapers.zdnet.com/whitepaper.aspx?&scid=280&docid=271750
CNET Staff. (2004, September). Spam volume
keeps rising. Retrieved September 2007, from
http://news.com.com/2114-1032-5339257.html
Cochetti, R. J. (2007, June). Testimony of the Computing Technology Industry Association (CompTIA), before the House Small Business Committee, Subcommittee on Finance and Tax, data security: Small business perspectives. Retrieved October 2007, from www.house.gov/SMBiz/hearings/hearing-06-06-07-sub-data/testimony-06-06-07-compTIA.pdf
Computing Technology Industry Association.
(2004). Annual study. Retrieved October 2007,
from http://www.joiningdots.net/library/Research/statistics.html
Cox, J. (2007, February 9). RSA: Attendees drop ball on Wi-Fi security: Many IT security experts at conference used unsecured devices. Network World. Retrieved October 2007, from http://www.networkworld.com/news/2007/020907-rsa-wifisecurity.html
Endpointsecurity. (2004). What is endpoint security? Retrieved October 2007, from http://www.
endpointsecurity.org/Documents/What_is_endpointsecurity.pdf
Fielding, J. (2007, January 28). 25% of all computers on Botnets. Retrieved http://blogs.techrepublic.
com.com/networking/?cat=2
Flynn, N. (2005). E-policy best practices a business guide to compliant & secure internet, instant
messaging (IM), peer-to-peer (P2P) and email
communications. The ePolicy Institute; Executive
Director, St. Bernard Software. Retrieved http://
www.securitytechnet.com/resource/security/application/iPrism_ePolicy_Handbook.pdf


Forescout. (2007). NAC enforcement and the role of the client. Infonetics Research, Inc. Retrieved July 2007, from www.Forescout.com/downloads/whitepapers/Infonetics-NAC-Enforcement-and-the-Role-of-the-Client.pdf

GFI. (2007). The threats posed by portable storage devices. Whitepaper. Retrieved July 2007, from http://www.gfi.com/whitepapers/threat-posedby-portable-storage-devices.pdf

Glaessner, T. C., Kellermann, T., & McNevin, V. (2004). Electronic safety and soundness: Securing finance in a new age (World Bank Working Paper No. 26). Washington, DC. Retrieved from http://siteresources.worldbank.org/DEC/Resources/abstracts_current_studies_2004.pdf

Gordon, L. A., Loeb, M. P., Lucyshyn, W., & Richardson, R. (2006). CSI/FBI computer crime and security survey. Computer Security Institute. Retrieved November 2007, from http://www.cse.msu.edu/~cse429/readings06/FBI2006.pdf

Harley, D., Slade, R., & Gattiker, U. (2001). Viruses revealed: Understanding and countering malicious software. New York: McGraw-Hill/Osborne.

Higgins, K. (2007, November 9). The world's biggest botnets. Retrieved November 2007, from http://www.darkreading.com/document.asp?doc_id=138610

Im, G. P., & Baskerville, R. L. (2005, Fall). A longitudinal study of information system threat categories: The enduring problem of human error. The DATA BASE for Advances in Information Systems, 36(4), 68-79.

Juniper Research. (2006, February). Security information & event management. Retrieved from http://www.juniper.net/solutions/literature/solutionbriefs/351178.pdf

Keeney, M., Kowalski, E., Cappelli, D., Moore, A., Shimeall, T., & Rogers, S. (2005). Insider threat study: Computer system sabotage in critical infrastructure sectors. U.S. Secret Service and CERT Coordination Center/SEI. Retrieved November 2007, from http://www.cert.org/archive/pdf/insidercross051105.pdf

Kirk, J. (2007, May 17). Estonia recovers from massive denial-of-service attack. InfoWorld, IDG News Service. Retrieved November 2007, from http://www.infoworld.com/article/07/05/17/estonia-denial-of-service-attack_1.html

McAfee, J., & Haynes, C. (1989). Computer viruses, worms, data diddlers, killer programs, and other threats to your system. New York: St. Martin's Press.

McGillicuddy, S. (2006, November 1). Encrypting mobile devices: A best practice no one uses. SearchSMB.com. Retrieved from http://searchSMB.techtarget.com/originalContent/0,289142,sid44_gci1227295,00.html?asrc=SS_CLA_300336&psrc=CLT_44

Muscat, A. (2007, January 17). Perils of portable storage. Computer Reseller News. Retrieved from http://www.gfi.com/documents/32686_crn_eprint.pdf

Norman, D. (1983). Design rules based on analysis of human error. Communications of the ACM, 26(4), 254-258.

Osterman Research Inc. (2003). The impact of regulations on email archiving requirements. ORI white paper sponsored by Information Management Research. Retrieved October 2007, from http://www.Ostermanresearch.com/whitepapers/or_imr01.pdf

Ou, G. (2007). Wireless LAN security myths that will not die. ZDNet. Retrieved July 2007, from http://blogs.zdnet.com/Ou/?p=454

Padilla, R. (2007). Root out data breach dangers by first implementing common sense. TechRepublic. Retrieved July 2007, from http://blogs.techrepublic.com.com/tech-manager/?p=312

Pai, A. K., & Basu, S. (2007). Offshore technology outsourcing: Overview of management and legal issues. Business Process Management Journal, 13(1), 21-46.

Privacy Rights Clearinghouse. (2007). A chronology of data breaches. Retrieved October 2007, from http://www.privacyrights.org/ar/ChronDataBreaches.htm

Provos, N., McNamee, D., Mavrommatis, P., Wang, K., & Modadugu, N. (2007). The ghost in the browser: Analysis of web-based malware. Google, Inc. Retrieved from http://www.usenix.org/events/hotbots07/tech/full_papers/Provos/Provos.pdf

Qualys. (2006). The laws of vulnerabilities: Six axioms for understanding risk. Retrieved October 2007, from http://developertutorials-whitepapers.tradepub.com/free/w_qa02/pf/w_qa02.pdf

Savage, M. (2007, October 23). Proposed legislation would strengthen cybercrime laws. Retrieved November 2007, from http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1278341,00.html?track=sy160

Schuman, E. (2007, November 14). TJMaxx's projected breach costs increase to $216M. eWEEK. Retrieved November 2007, from http://fe42.news.sp1.yahoo.com/s/zd/20071114/tc_zd/219495

Shaw, K., & Rushing, R. (2007). Podcast: Keith Shaw (Network World) talks with Richard Rushing, chief security officer at ... data; listen to this podcast. Retrieved October 2007, from http://www.networkingsmallbusiness.com/podcasts/panorama/2007/022807pan-airdefense.html?zb&rc=wireless_sec

Shimeall, T. (2001, August 23). Internet fraud. Testimony of Timothy J. Shimeall, Ph.D., CERT Analysis Center, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, before the Pennsylvania House Committee on Commerce and Economic Development, Subcommittee on Economic Development. Retrieved October 2007, from http://www.cert.org/congressional_testimony/Shimeall_testimony_Aug23.html

Shinder, D. (2007, February 9). How SMBs can enforce user access policies. Retrieved April 2007, from http://articles.techrepublic.com.com/51001009_11-6157054.html?tag=nl.e101

Symantec. (2006, September 19). Symantec finds firms recognize importance of application security, yet lack commitment in development process. News release. Retrieved from http://www.symantec.com/about/news/release/article.jsp?prid=20060919_01

Vivisimo. (2006). Restricted access: Is your enterprise search solution revealing too much? Retrieved October 2007, via http://Vivisimo.com/ or http://www.webbuyersguide.com/bguide/whitepaper/wpDetails.asp_Q_wpId_E_NzYyMQ

Wang, H., Lee, M., & Wang, C. (1998, March). Consumer privacy concerns about internet marketing. Communications of the ACM, 41(3), 63-70.

Webex. (2006). On-demand vs. on-premise instant messaging. Webex Communications, Ease of Communications: On Demand EIM Solutions. Retrieved October 2007, from http://www.webbuyersguide.com/bguide/Whitepaper/WpDetails.asp?wpId=Nzc4MQ&hidrestypeid=1&category=

Wilson, T. (2007, November 12). ID thief admits using botnets to steal data. Retrieved November 2007, from http://www.darkreading.com/document.asp?doc_id=138856

Yank, G. C. (2004, December 21). Canning spam: Consumer protection or a lid on free speech? Retrieved October 2007, from http://www.law.duke.edu/journals/dltr/articles/2004dltr0016.html

Additional Reading
Bächer, P., Holz, T., Kötter, M., & Wicherski, G. (2005). Know your enemy: Tracking botnets; using honeynets to learn more about bots. The Honeynet Project & Research Alliance. Retrieved October 2007, from http://www.honeynet.org/papers/bots/

Cohen, F. (1984). Experiments with computer viruses. Fred Cohen & Associates. Retrieved October 2007, from http://all.net/books/virus/part5.html

Commission of the European Communities. (2000). Proposal for a directive of the European Parliament and of the Council concerning the processing of personal data and the protection of privacy in the electronic communications sector. Retrieved October 2007, from http://europa.eu.int/information_society/topics/telecoms/regulatory/new_rf/documents/com2000-385en.pdf

Computer Crime Research Center. (2005). Security issues: Find the enemy within. Retrieved October 2007, from http://www.crime-research.org/analytics/security-insider/

Endicott-Popovsky, B., & Frincke, D. (2006). Adding the fourth R: A systems approach to solving the hacker's arms race. In Proceedings of the 39th Hawaii International Conference on System Sciences. Retrieved October 2007, from http://www.itl.nist.gov/iaui/vvrg/hicss39/4_r_s_rev_3_HICSS_2006.doc

European Parliament and the Council of the European Union. (2003). Annex 11: Computerised systems. Labcompliance. Retrieved October 2007, from http://www.labcompliance.com/documents/europe/h-213-eu-gmp-annex11.pdf

Federal Trade Commission. (1999). Gramm-Leach-Bliley act. Retrieved October 2007, from http://www.ftc.gov/privacy/privacyinitiatives/glbact.html

Federal Trade Commission. (2006). ChoicePoint settles data security breach charges; to pay $10 million in civil penalties, $5 million for consumer redress. Retrieved October 2007, from http://www.ftc.gov/opa/2006/01/choicepoint.htm

Henry, P. A. (2007, June). Did you GET the memo? Getting you from Web 1.0 to Web 2.0 security (In today's risky Web 2.0 world, are you protected?). Secure Computing Corporation.

King, S. T., Chen, P. M., Wang, Y., Verbowski, C., Wang, H., & Lorch, J. R. (2006). SubVirt: Implementing malware with virtual machines. Retrieved October 2007, from http://www.eecs.umich.edu/virtual/papers/king06.pdf

MessageLabs. (2007). Effectively securing small businesses from online threats. Retrieved October 2007, from http://www.messagelabs.com/white_papers/secure_smb

SANS Institute. (1999, May). Management errors. In Proceedings of the Federal Computer Security Conferences held in Baltimore. Retrieved October 2007, from http://www.sans.org/resources/errors.php

Sarbanes-Oxley. (2002). Sarbanes-Oxley act of 2002. Retrieved October 2005, from http://www.sarbanes-oxley.com/section.php?level=1&pub_id=Sarbanes-Oxley

Shinder, D. (2002). Scene of the cybercrime (Computer forensics handbook). Rockland, MA: Syngress Publishing.

United Kingdom Parliament. (2000). Freedom of information act 2000. Retrieved October 2007, from http://www.opsi.gov.uk/ACTS/acts2000/20000036.htm

U.S.A. Department of Health & Human Services. (1996). Health insurance portability and accountability act of 1996. Retrieved October 2007, from http://aspe.hhs.gov/admnsimp/pl104191.htm

U.S.A. Federal Trade Commission. (2002). How to comply with the privacy of consumer financial information rule of the Gramm-Leach-Bliley act. Retrieved July 2002, from http://www.ftc.gov/bcp/conline/pubs/buspubs/glblong.shtm



Section II

Frameworks and Models



Chapter IV

Practical Privacy Assessments


Thejs Willem Jansen
Technical University of Denmark, Denmark
Søren Peen
Technical University of Denmark, Denmark
Christian Damsgaard Jensen
Technical University of Denmark, Denmark

Abstract
Governments and large companies are increasingly relying on information technology to provide enhanced services to the citizens and customers and reduce their operational costs. This means that an
increasing amount of information about ordinary citizens is collected in a growing number of databases.
As the amount of collected information grows and the ability to correlate information from many different databases increases, the risk that some or all of this information is disclosed to unauthorised
third parties grows as well. Although most people appear unaware or unconcerned about this risk, both
governments and large companies have started to worry about the dangers of privacy violations on a
major scale. In this chapter, we present a new method of assessing the privacy protection offered by a
specific IT system. The operational privacy assessment model, presented here, is based on an evaluation
of all the organisational, operational and technical factors that are relevant to the protection of personal
data stored and managed in an IT system. The different factors are measured on a simple scale and the
results presented in a simple graphical form, which makes it easy to compare two systems to each other
or to identify the factors that would benefit most from improved privacy enhancing technologies. A standardised assessment of the privacy protection offered by a particular IT system serves to help system owners understand the privacy risks in their IT system, as well as to help individuals, whose data is being processed, understand their personal privacy situation. This will facilitate the development and procurement of IT systems with acceptable privacy levels, but the simple standard assessment result may also provide the basis for a certification scheme, which may help raise confidence in the IT system's ability to protect the privacy of the data stored and processed in the system.

Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.


Introduction
Existing research into privacy enhancing technology (PET) has provided few answers to many of
the real questions that governments and large
companies are facing when they try to protect the
privacy of their citizens or customers. Most of the
current work has focused on technical solutions
to anonymous communications and pseudonymous interactions, but, in reality, the majority of
privacy violations involve careless management
of government IT systems, inadequate procedures
or insecure data storage. In this chapter, we introduce a method that helps system developers and
managers to assess the level of privacy protection offered by their system and to identify areas
where privacy should be improved. The method
has been developed in the context of government
IT systems in Europe, which has relatively strict
privacy legislation, but we believe that the method
may also apply to other government systems, nongovernmental organisations (NGOs) and large
private companies. With the privatisation of many
state monopolies, such as telecommunications
and railroads, in many countries and the increasing number of public/private partnerships, the
distinction between the public and private sector
has grown increasingly fuzzy.1 For the purpose
of clarity in our discussions, however, we have
decided to use the vocabulary from government
systems, so we discuss the relationships between
governments and citizens instead of companies
and customers.
Governments are increasingly relying on information technology to provide enhanced services
to the citizens and reduce the costs of the public
sector. This means that an increasing amount of
information about ordinary citizens is collected in
an increasing number of government databases.
As the amount of collected information grows and
the ability to correlate information from many
different databases increases, the risk that some
or all of this information is disclosed to unauthorised third parties grows as well. Although most


citizens appear unaware or unconcerned about


this risk, governments have started to worry
about the dangers of privacy violations on a major
scale. If the government is not seen to be able
to treat information about its citizens securely,
these citizens will be reluctant to provide timely
and accurate information to the government in
the future. Many of the same factors are relevant
in the relationship between companies and their
customers, so both governments and large companies have realised that protecting the privacy
of their citizens and customers is necessary if
they are to reap the benefits of the information
society in the future.
The benefit of collecting and storing information about citizens in electronic databases is an
increasing level of efficiency in administrative
systems and convenience for the citizens, because
it provides government agencies with faster and
easier access to relevant data and improves their
ability to combine sets of personal data from different systems. This allows improved services at
reduced costs, for example, the Inland Revenue
Service in Denmark has reduced the number of
employees from around 14,000 to around 9,000
in the past decade, while the amount of information that is being processed about each citizen has
increased. The sheer volume of data collected by
different government IT systems, however, makes
it increasingly difficult for anyone to obtain an
accurate picture of all the personal information
that may be available in government databases.
Moreover, it also makes it difficult to determine
which persons, institutions or private companies
have access to the data.
There have been an increasing number of
incidents, where personal information has been
released to unauthorised third parties, either
through carelessness or through negligent administrative procedures. According to Financial News Online U.S. (2007), JPMorgan Chase recently lost a container with a backup tape that included the account information and social security numbers of some 47,000 of its Chicago-area clients. In another high profile press story, personal information including birth dates and social security numbers of 26.5 million U.S. veterans was recently compromised when a laptop containing the information was stolen from the house of an employee, who had taken the information home on his laptop without authorisation, according to Sandoval (2006). Similar examples of employees who lose sensitive personal data of clients and customers have become commonplace, which led Smith (2006) to declare 2006 "the year of the stolen laptop", and the Privacy Rights Clearinghouse (2007) has compiled a chronology of data breaches, which includes a total of almost 155 million records containing sensitive personal information involved in security breaches from January 2005 until May 2007.
These examples illustrate that privacy enhancing
technologies alone cannot protect the privacy of
citizens; effective enforcement of security and privacy policies and awareness among the users and managers of IT systems are also necessary.
The many press reports about privacy violations and the increasing risks associated with
identity theft mean that privacy is becoming a
major factor for both system architects and end-users. Such concerns lead to analysis and discussions
with varying results depending on the parties'
individual privacy concerns and the type of system involved. Privacy enhancing initiatives are
often focused on technical solutions to anonymous
communications and pseudonymous interactions
thus forgetting that privacy is not exclusively a
technical issue. The resulting ad-hoc approach to
privacy protection means that there are varying
degrees of privacy protection in government IT
systems. For the individual citizen, this means
that there is very little insight into the amount
of private data being handled in the government
sector, as well as how, why, and to which officials access to this information may be given. Asking the citizen's permission every time personal data
is collected ensures that the citizen knows what
data has been collected, but it does not ensure that

the citizen remembers what data has been collected and what information can be inferred from
the data. Informed consent, where the individual
citizen agrees to the collection of private data for
a specific purpose, is one of the most important
instruments in data protection legislation, but it is
not a realistic solution to this problem. Moreover,
informed consent only addresses the collection
and authorised use of private information; it does
little to inform the citizens about the way that
their data is stored and what procedures are in
place to keep this data safe.
In this chapter we present an operational privacy model that helps focus the privacy discussions on areas with the most significant privacy
risks. The model defines a method that can be used
to analyse the privacy properties of IT systems,
which are different in their design, functionality,
and the data that they process. The method helps
identify privacy risks in such systems, so that
they may be the subject of further analysis and
discussion. Moreover, the method helps assess the
magnitude of different privacy problems, so that
developers may decide which problem to address
first and citizens may decide whether they wish to
trust the system with their personal data. While
the method may be used to suggest areas of possible privacy enhancements, it does not seek to
provide specific solutions to privacy problems, but
leaves it to the owners of the system to develop
specific solutions for the individual system. In
other words, the goal of the model is to identify
possible privacy risks and provide the basis for
valuable privacy improvements. The assessment
of privacy plays an important role in online
systems because a standard privacy assessment
scheme may provide the basis for a certification
program that will allow individuals to decide
whether they wish to interact with a particular
service provider. The operational privacy assessment model proposed in this chapter is based on
factual answers to a few simple questions, which
makes it ideally suited for a certification scheme.
The results of a privacy assessment are presented

in a simple graphical form, which allows a layman to determine the overall privacy protection
in the examined system and identify areas with
particularly high privacy risks.

Privacy in Government IT-Systems
There have been a number of proposals for a
definition of privacy, but although some have
been widely used, none have been able to meet
the demands of new technologies and variations
in cultures. This is why the concept today is
somewhat diffuse and the definition depends
highly on the context in which it is used.
Generally, privacy can be described as a form
of knowledge about our existence which we may
wish to control as it is something others potentially
can use against us. This knowledge can be further
divided into three categories: knowledge about
our person, knowledge about our relationships,
and knowledge about our behavior.
In this classification, "knowledge about our person" covers information regarding physical
factors, such as information about our health,
financial situation, consumer profile, current
location, and recent movements. For practical
reasons, it is common for individuals to share
chosen parts of information from this category
with selected persons or institutions. For example,
health information is often shared with doctors
and the health insurance agency, but this does not
automatically confer a wish to share this information with friends or co-workers.
The category "knowledge about our relationships" covers information about relationships
with other persons or institutions. Examples of
such relationships could be political and religious
convictions as well as family, sexual, work related, and other relationships with friends. Even
though most of these relationships involve at
least one other party, the knowledge about their
existence can still be sensitive and private for
the individual.

The final category "knowledge about our behavior" covers information about us, which is
not necessarily based on something physical. This
can be information about our interests, habits,
priorities, life-plans, and general preferences.
While this information is not directly measurable, it can often be deduced from the actions of
the individual, such as shopping profiles, library
visits,2 and social circles.
These three categories cover most aspects that
define the identity of a person: the relation of this
person to others and the actions of the person in
this world, which allows a complete compilation
of information about an individual; information
that the person is likely to want to exercise some
control over.
Another reason why privacy is such a diffuse
and widely discussed term is that it covers a large
area. In order to work with the term without risking
that it becomes too vague, it has to be narrowed
down to what is relevant for the context of the
specific task.
The information which government systems require and use is usually information about different physical entities or attributes; this is different from the private sector, which also uses
behavioral information to compile customer profiles and promote sales. The government systems
manage a lot of information which falls under the
category "knowledge about our person". Even
though the amount of information is extensive,
it is still only a subset of the total amount of data
covered by this category.
By limiting the definition of privacy for the
specific task of evaluating the privacy risks in a
government IT system, we only consider privacy
issues that are relevant in the context of administrative IT systems, so important privacy issues,
such as physical surveillance through CCTV, are
not addressed in this chapter.
The government sector uses the personal information about an individual for identification
and interaction purposes, for example, name,
date of birth, location of birth, current address,


and so forth. For this reason, these data are extremely sensitive. The consequences of a breach
in the security of these data can be negative for
both the individual and for the government institutions that are responsible for the data. Since
these data serve as the basis of identification of
an individual, a breach can, in the worst case, lead to
extensive problems with identity theft.
A common feature of personal data stored in government systems is that they are gathered for a specific
purpose. Whether this purpose is fair or is in
itself a breach of privacy is a completely different
discussion concerning the merits of the specific
system versus the limitations to the privacy of
the individual. This chapter does not enter this
discussion beyond looking at whether a system
has a given purpose for collecting private data or
not. The goal must be to limit access, use, and
processing of personal data as much as possible,
while still acknowledging the purpose for which
the data has been gathered.
With this focus on personal data in government
systems, we can finally make a task-specific definition of privacy. This definition is only applicable
for use in the model described in this chapter and
as such is not an attempt to make a complete and
general definition of privacy.
Privacy is the influence on and knowledge about
the existence and use of personal data in government digital records.
The requirement of knowledge about the
information that is stored about the individual,
gives the individual an important tool to know
exactly what personal information the government has registered as well as the purpose of
this registration. The requirement of influence
addresses the important goal of being able to
control who gets access to personal data. We use
the term influence, instead of the stronger term
control, because government systems usually

have a purpose that rarely allows the individual


to completely opt-out.
This chapter does not aim to propose solutions or recommendations of technical tools for
protecting privacy in specific systems or generally. Instead, we define a method for evaluating
systems from a privacy point of view, the result
of which can be used to focus further work on
improving privacy in a given system.

Privacy Protection
With the growth of online systems and the digitalisation of administrative procedures, the interest
in the topic of user privacy is rapidly increasing.
Much of the work being done in securing privacy
is being focused on defining and developing
technical tools and standards for the protection of
private information according to Pfitzmann and
Köhntopp (2000) and the European Parliament
(1995). These tools seek to maximise privacy in
systems by minimising the amount of and restricting the access to personally identifiable data as
well as the amount of information that can be
gathered from the interactions between private
parties and services. However, these methods
are often hampered or outright prevented by the
functional requirements of the systems they seek
to secure. For example, many financial systems
require the real names of their users in order to meet
accountability requirements, so these requirements prevent the system from implementing
privacy through anonymity or pseudonymity.3
When the requirements of the individual systems
prevent the implementation of technical privacy,
other means are needed to ensure the best possible
level of privacy protection. While the technical
privacy enhancing technologies may be readily
available, their full effect on the privacy of a
system is often hard to measure, especially in
the cases where these tools cannot be used to
their full effect.




Technical Privacy Issues


According to Pfitzmann and Köhntopp (2000),
contemporary privacy research, tools, and general
public interest are largely focused on the technical
issues of anonymity, pseudonymity, unobservability, and unlinkability. These four concepts also
define the privacy class of the common criteria
security evaluation described by International
Organization for Standardization (1999). The four
concepts define different levels of anonymity,
from pseudonymity, which provides some level
of accountability at the risk of allowing profiling,
to unlinkability, where two transactions issued by
the same anonymous user cannot be linked to each
other. The technical tools for protecting privacy
are very efficient, but they often impose serious
restrictions on the applications and generally introduce a considerable performance penalty. This
means that there is often a direct conflict between
privacy concerns and the desired functionality of
a given system.

Operational Privacy Issues


In cases where functional requirements limit the
applicability of privacy enhancing technologies,
there should be other safeguards in place to ensure the best possible level of privacy protection.
Usually, such safeguards consist of legislation
and operational procedures that ensure that data
is handled with care. Legislation and operational
procedures cannot directly enforce privacy policies, so they cannot guarantee that privacy risks
are eliminated. The work presented in this chapter
addresses systems where technical tools alone
cannot secure privacy. Working with practical
privacy protection starts in the system's design
phase where the required amount and structure
of personal data is determined and where input
from laws and interest groups is essential. The
legislation must be considered to ensure that the
system conforms to the law; both with respect to



what information should be collected and with


respect to what information must not be collected.
External interest groups often have useful perspectives on the scope of information that should
be collected and the necessary procedures for
auditing the use of data. Once the system is up
and running, it is still possible to improve privacy
by ensuring that only properly trained personnel
have access to personal identifying data, as well as
improving the amount and quality of information
available to the registered individuals. Properly
defined operational procedures are therefore an
important element in the protection of privacy,
because they instruct personnel what data should
be collected and how that data should be handled
so that legislation is followed.

Influence on Privacy
The level of privacy protection that is required for
a specific system depends on the type of system
and the type of personal identifiable information
that is managed in the system. There are three
primary categories that influence privacy by defining demands that the system must meet.

Actors
Actors are individuals and institutions, such as
political parties, interest groups, and industry
lobbyists, whose work directly or indirectly influences public opinion about privacy. Their
mutual interests and conflicts can often influence
the level of privacy protection implemented in
a government IT system. Actors that influence
privacy rarely aim to violate privacy directly.
Often, it is the negative side effects of suggestions to change other parts of the system, which
end up influencing privacy in a negative way, for
example, proposals to increase surveillance as
part of anti-terror legislations are meant to save
innocent lives, but they are obviously detrimental
to privacy. Privacy activists, on the other hand,


often work directly with privacy with the aim to


protect the private sphere.
There are three main types of actors:

Government actors are identified by owning


or controlling one or more public systems.
The actor's relationship with personal data is
often a wish to reduce the workload through
use of IT systems, which often includes
processing of personal data.
Commercial actors are identified by being
driven by a commercial interest. An actor
can be an industry association or a private
company. The operation of a government IT
system can sometimes be outsourced to a
commercial actor. Many commercial actors
have interest in the data they collect, which
may be used for marketing purposes.
Non-governmental organisations (NGOs) are
a large group of different organisations.
These are identified by being motivated
by the members' political or social goals.
NGOs are not motivated by commercial or
government interests.4

Common to all actors is that their willingness to


work for a solution to a specific problem depends
on how important the problem is within the scope
of the actor. Another factor is how controversial
a given system is, because actors often want to
set their mark on high profile cases.

Legislation
Although laws are the result of the work of politicians, they are still a decisive factor in government systems, because they are more consistent
and stable than the minds of politicians and other
actors.
An example of such legislation is the Data
Protection Directive from the European Parliament (1995), which is designed to protect personal
data. The goal of this directive is to harmonise

the privacy laws in the individual member states.


All member states of the EU have to follow the
directive, so there is a relatively high level of
privacy protection in the European Union and
the directive ensures that privacy is protected
when new legislation is introduced in any member
state of EU.
This legislation protects personal information
in both commercial and government systems.
The legislation also prevents data from inside the
EU from being transferred to countries outside
the EU, unless the country guarantees the same
level of privacy protection as countries in the EU.
It protects individuals, giving them some basic
rights, for example, to access data, to know the
origin of the data, and to withhold permission to
the use of data in some cases.
The advantage of defining privacy rights in
legislation is that public institutions do not always
have an obvious interest in the protection of privacy, but they always have an interest in following
the law. Without legislation, the developers will
design the systems according to their own ideas,
but the law ensures some basic rules for privacy
protection.

Culture
Culture is an important factor in the perception of
privacy, and it often decides when privacy issues
are of interest. Cultures with a relaxed attitude towards the protection of privacy are also less likely
to protest against the development of systems to
process personal data. Even within the EU, there
are large variations between the citizens' privacy
concerns in the different countries. According to
a privacy poll performed by the European Opinion Research Group (2003), countries such as
Denmark, Spain, and Portugal have populations
where only 13% are very concerned about their
privacy, but the populations of countries such as
Greece and Sweden are much more concerned
about privacy with 58% and 54% of their respective
populations stating that they are very concerned




about their privacy. It is interesting to note that


these large differences occur in spite of similar
data protection laws within the European Union,
which suggests to us that the culture of the individual country may play an important role in the
privacy concerns of its citizens. The result of this
variation is that systems, which in some countries
would be faced with severe public concern, might
be more easily introduced in other, less privacy
concerned, countries.
There is a huge difference in how the three categories (actors, legislation, and culture) influence privacy in government systems. Actors drive new initiatives. Legislation, on the other hand, limits the development of systems. Culture is also an important factor, since
it influences the values that can limit the use and
development of new systems.

Privacy Maintenance

Privacy is not clearly defined and the composition of the three categories of actors in the privacy field tends to change over time, so the privacy risks also change over time. This is important to keep in mind when designing new systems or legislation, because both will have to be updated at regular intervals. There are two main areas that serve to maintain the privacy of citizens: legislation based protection and technology based protection. The risk of placing protection in legislation is that it can be changed. What is legal today might not be legal next year. Therefore, there are different factors that have to be considered:

The development of technology can open for new ways to use data. In this case the legislation could limit the influence of this factor.
The environment of the system (the world around the system); a good example of this is the anti-terror legislation. Privacy may here be protected by technology protection, even though this is conflicting with legislation.
Political change can change the foundation for new privacy legislation. Even though the change might not be introduced to limit privacy, optimisation or detection of benefit fraud can introduce circumstances where privacy is violated beyond the people committing the fraud. In this case, both technology and previous legislation help identify risks of the change.

These examples all show that both technology and legislation in different ways influence the development of privacy, and protect the individual from harm of future change.

Privacy Enhancing Technologies

Privacy enhancing technologies (PET) are the


technical tools used to protect privacy. The term
does not cover all privacy enhancing methods
as there are many means to protect privacy that
do not involve technology. This means that for
a given system, which processes personal data,
there is a whole range of elements that influence
privacy. Again, these elements are not limited to
the technical aspects of data security, but also concern
practical aspects such as the privacy awareness of
the system builder, the system operators, and the
individual user. Another aspect is the availability
of contact points for the registered individual. As
such, interaction between government and private
persons presents a range of potential privacy-risk areas: communication, data-storage, and data-processing:

Communication: This privacy risk area


concerns the risk of someone eavesdropping
when private information is sent or received
across any communications channel. This
could be a person sending an online request
to the local library or someone filling in


their tax return online. This area concerns


using encryption schemes such as public
key encryption described by Rivest, Shamir,
and Adleman (1978) and the International
Organization for Standardization (2006), but
also hiding the sender and receiver of the
messages as developed by Chaum (1988).
Data-storage: This concerns the risks
associated with storing private data electronically for later use. Later use could be
report production or data-mining. Stored
data could be the loan records stored in the
library systems or the tax returns. Stanton
(2004) describes how both physical and
cryptographic mechanisms exist to prevent
intruders from accessing stored data either
remotely or through common theft of the
storage media.
Data-processing: This covers the risks that
arise when personal information is somehow
used to make decisions. This concerns both
human processing where a person evaluates
the data, and automated processing where a
computer makes the evaluation. An example
of a process like this could be a librarian
registering the return of a book or a tax
auditor examining a tax return. The area
covers employees' access, such as role-based access control described by Ferraiolo and Kuhn (1992) and interaction with data as well as the registered individual's control
(Blaze, Feigenbaum, & Lacy, 1996) and
information procedures.

These three terms cover the spectrum of privacy risks in government systems, but they are
far too generic to base an exact evaluation on.
Therefore these three areas are decomposed into
their privacy relevant components.

Communication

Privacy in communication is very much a matter of securing the transmissions. But it also extends into hiding the very fact that a communication is going on, since the fact that you are talking to someone may be private information. For
example, an employer that monitors outgoing
telephone calls may discover that an employee
has called a recruitment agency, which breaks the
privacy of the employee, although the contents of
the conversations are unknown. There are many
technical tools that protect the communication
channels, such as SSL5 and Pretty Good Privacy
created by Zimmermann (1995). Pfitzmann and
Köhntopp (2000) found that there are many issues
that have to be addressed in secure communications, but most of these can be generalized into
three main areas:

Anonymity: Defined as being unable to


identify sender and receiver within a space
of possible senders and receivers
Unlinkability: Defined as being unable to
relate separate messages even with knowledge
of the channels being used. This ensures that
a third party cannot link repeated communications with the same or different resources,
back to a user.6
Unobservability: Defined as not being able
to distinguish messages sent between two
points; they will look like any other message
or random noise.

Pseudonymity is defined by Pfitzmann and


Köhntopp (2000) as the use of pseudonym IDs;
this is worth mentioning here, because it is often
the best solution in systems where total anonymity
is impossible or impractical. While pseudonymity is an improvement over directly identifiable
information, it cannot match the privacy of anonymization of data as long as it is reversible.

Data-Storage

Storing data securely is also a task that relates to IT security. The task here is that data must be protected against both theft and loss through
negligence. While a locked vault can guarantee


protection against theft from outsiders, it does not
protect against employees losing their laptop with
copies of private data on it. This means that not
only must the physical protection be considered,
but also whether the data itself is protected, for
example encrypted, on disk using something like
an encrypted file system as described by Zadok,
Badulescu, and Shender (1998).
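As a minimal sketch of the kind of at-rest protection referred to here, and assuming the widely used Python cryptography package, a record can be encrypted before it is written to disk; the record contents and file name are hypothetical, and key management, which is the hard part in practice, is left out.

from cryptography.fernet import Fernet

# Hypothetical sketch: encrypt a record before writing it to disk, so a stolen
# laptop or backup tape exposes only ciphertext rather than readable data.
key = Fernet.generate_key()  # in practice the key must be stored and protected separately
cipher = Fernet(key)

record = b"name=Jane Doe; citizen_id=010203-1234; case=housing benefit"
with open("citizen_record.bin", "wb") as f:
    f.write(cipher.encrypt(record))

# Authorised access: decrypt with the same key.
with open("citizen_record.bin", "rb") as f:
    print(cipher.decrypt(f.read()))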
In the context of privacy, the sensitivity of data
being stored is also important, that is, the more
sensitive the data being stored is, the larger the
risks and consequences of theft will be. Therefore,
it is important to assess the degree of sensitivity
of the data stored in the system and the possible consequences of a privacy violation (e.g.,
caused by a security breach). This highlights the
importance of storing as little sensitive data as
is possible.
Data-storage also concerns providing access
to the stored data to authorised personnel. This
raises the problems of who should be authorised
to see the information and how much information
an individual employee should be allowed to see.
The answers to these questions differ a lot from
system to system but also within a single system,
as there can be large variations in the level of access required depending on the person's function.
For a government system, it is possible to identify
five separate interest groups relating to a system's
data. The first group is management who has an
interest in the proper treatment of data, but who
is also interested in increasing efficiency and thus
reducing costs. The second group is the employees
who, idle curiosity aside, primarily are concerned
with getting the data they need to do their job. The
third group is the society at large, which has an
interest in the data's use in statistical reports and scientific research, but also in ensuring that current laws and
regulations are upheld. The fourth group consists
of commercial interest groups that may have interests in using private data for targeted marketing
and product development; generally this interest
extends to getting access to as much data as pos-



sible. Finally, the registered users themselves have


an obvious interest in their private data being treated responsibly and well protected.
With the large variation in the protection of
stored data, it is important to also look at the privacy issues in access control, because it is important to restrict the access to only those who really
need it, and only when they need it. In a report
published by the European Parliament (2006) it is
found that many security breaches today are made
by employees or others with legitimate access to
data, and it is important to consider this problem as well when protecting privacy.
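A role-based restriction of the kind cited earlier (Ferraiolo & Kuhn, 1992) can be sketched as follows; the roles, fields, and access log are hypothetical illustrations of limiting what an individual employee can see, not a prescription from the chapter.

from datetime import datetime, timezone

# Hypothetical sketch: each role sees only the fields it needs to do its job,
# and every access is logged so that insider misuse can at least be detected.
ROLE_FIELDS = {
    "case_worker": {"name", "address", "case_status"},
    "statistician": {"postcode", "age_group"},  # no directly identifying fields
    "auditor": {"case_status", "last_modified"},
}

access_log = []

def read_record(user, role, record):
    allowed = ROLE_FIELDS.get(role, set())
    view = {field: value for field, value in record.items() if field in allowed}
    access_log.append((datetime.now(timezone.utc).isoformat(), user, role, sorted(view)))
    return view

record = {"name": "Jane Doe", "address": "Anker Engelunds Vej 1", "postcode": "2800",
          "age_group": "30-39", "case_status": "open", "last_modified": "2008-01-15"}
print(read_record("emp42", "statistician", record))  # identifying fields filtered out
print(access_log)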

Data-Processing
A very privacy sensitive person may argue that
privacy is broken whenever personal data is being
processed. However, in government systems there
is often a reason for the processing of personal
information, which just has to be accepted. The
discussion of whether the purpose of a specific
system in itself is a breach of privacy, such as the
current transfer of airline passenger data between
the European Union and the United States of America
(2007), is another discussion. Here we focus on
the three components of data-processing that are
important in a privacy evaluation.

Confidentiality is important when private


data is processed. Any persons with access
must know and follow the requirements of
confidentiality. Furthermore, it is important
that only the data directly relevant to the
process is made available, so that peripheral
information does not affect the results of the
process.
Completeness means that the data used
in the process must be both complete and
correct to ensure that faulty decisions are
not made based on the data. Erroneous processing could also lead to further breaches
of privacy.


Insight is important for the people whose


personal data is being processed. Insight
means access to information about what
data has been used in what processes and
what the conclusions are as well as who did
the processing. This allows citizens to either
accept the use of personal data or to raise
objections if they feel that their privacy is
being violated.

A further development of ensuring confidentiality, completeness, and insight would be


to allow a person lesser or greater control over
data-processing involving his or her personal
data. While this quickly becomes complicated
when considering large systems, it is still desirable from a privacy point of view.
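A simple way to picture the insight requirement is a processing log that the registered individual can query; the sketch below is a hypothetical illustration, with invented field names, of recording which data was used, in which process, by whom, and with what conclusion.

from datetime import date

# Hypothetical sketch: every use of personal data is recorded, and the
# registered individual can retrieve the entries that concern them.
processing_log = []

def log_processing(citizen_id, fields_used, process, case_worker, conclusion):
    processing_log.append({
        "date": date.today().isoformat(),
        "citizen": citizen_id,
        "fields_used": fields_used,
        "process": process,
        "case_worker": case_worker,
        "conclusion": conclusion,
    })

def insight_for(citizen_id):
    """All recorded processing of this citizen's personal data."""
    return [entry for entry in processing_log if entry["citizen"] == citizen_id]

log_processing("010203-1234", ["income", "address"], "housing benefit assessment",
               "case worker #17", "benefit granted")
for entry in insight_for("010203-1234"):
    print(entry)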

Frameworks for Privacy Evaluation


There have been some initial efforts to define a
common framework for discourse about privacy.
Lederer, Mankoff, and Dey (2003) have done
similar work on dividing the different privacy
problems into more general categories in order to
improve the clarity of discussions. Their work aims
to improve discourse on the subject of privacy in
regards to different technical tools and protocols.
Their work is, however, not immediately applicable when evaluating a single system which may
include several privacy enhancing technologies.
Their work allows for comparisons on the effects
of different privacy enhancing tools but does not
extend to comparing systems.
Yu and Cysneiros (2002) have presented a
framework to help design systems while satisfying the sometimes contradictory demands of
privacy, usability, and functionality. Their work
establishes a method for engineers to evaluate
the effects that different design decisions have
on the privacy and general usability of the system. The framework is, however, very detailed
and does not naturally extend into non-technical

privacy improvements, which limits its use for


non-engineers.

Operational Privacy Assessment Model
In the following, we present a method to assess
the level of privacy protection offered by government IT systems that contain sensitive data. The
model defines a method to identify and highlight
privacy issues either during the development of the
system or as an evaluation of an existing system.
The method gives a standardised evaluation of
privacy, which is not influenced by the current
state of technology or current moral standards.
This evaluation can then be used to apply more
focused solutions for upgrading privacy, solutions
which are tailored to the design of the specific
system and its functional requirements.
Applying the model to a system in its design
phase allows the system designer to evaluate the
privacy issues of the current design and the effects that different design choices may have on
privacy. Focusing on these issues already, before
implementation, makes it significantly easier to
ensure minimal privacy risks in the final system.
The method does not prescribe or recommend
specific solutions to these privacy issues, but only
highlights areas that could benefit from additional
privacy protection. The method also produces
results in a simple graphical form that can be used
to compare two possible design alternatives for
the current system or two systems with similar
functionality, which can be useful when having
to choose between systems.
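To illustrate what such a comparison might look like, the sketch below scores two hypothetical design alternatives on the seven categories introduced in the next section and prints a crude textual profile; the 0-4 scale and the scores are invented for illustration and are not part of the model itself.

# Hypothetical sketch: comparing two design alternatives on a simple 0-4 scale
# per assessment category. The category names follow the chapter; the scale
# and the scores are purely illustrative.
CATEGORIES = ["Data protection", "Sensitivity", "Environment", "Surveillance",
              "Ordinary use", "Transparency", "Control"]

design_a = {"Data protection": 3, "Sensitivity": 2, "Environment": 4,
            "Surveillance": 1, "Ordinary use": 2, "Transparency": 3, "Control": 2}
design_b = {"Data protection": 4, "Sensitivity": 2, "Environment": 3,
            "Surveillance": 3, "Ordinary use": 3, "Transparency": 2, "Control": 3}

def bar(score):
    return "#" * score + "." * (4 - score)

largest_gap = max(abs(design_a[c] - design_b[c]) for c in CATEGORIES)
for category in CATEGORIES:
    a, b = design_a[category], design_b[category]
    note = "  <-- largest difference" if abs(a - b) == largest_gap else ""
    print(f"{category:16s} A: {bar(a)}   B: {bar(b)}{note}")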

Identifying Privacy Factors


The model divides the factors that influence privacy into different categories, where each category
describes an element of privacy protection. Using
the component breakdown in the fourth section,




we have defined a range of privacy factors, which


are divided into seven main categories that are
explained below. The division into categories helps
keep the model simple to work with. Each category
has its distinct privacy focus, which decides the
elements it contains. It is important that the factors cover all the categories of privacy identified
in the second section as well as the problem areas
described in the fourth section, to ensure that the
result of the model gives a complete image of the
privacy situation. Each category may be further
divided into subcategories that cover more specific
fields of privacy, but are still within the scope of
the main category. The different categories and
subcategories do not only focus on technical
solutions, but also on practical aspects relating
to the operation of the system, such as "how is data stored?", "who has access to the data?", and "what personal data is recorded?" The categories
are defined as follows:
Data protection: This category covers the
risks concerning data being leaked or stolen. This
is the area of privacy that prevents an outsider
from gaining direct access to the private data,
either where it is stored or in transit over the
network. The category is to a high degree based
on traditional IT security tools. The major areas
of concern are:

Storage of private data. For private data, both the physical security of the storage media is important, but also that the data is encrypted should a theft occur.
Communication of private data. This covers the security of communication channels carrying private data between users of a system. In order to ensure privacy, it is important to consider both the security of the data packets and the identity of the communicating parties, that is, hiding the identities of the sender and the receiver to prevent traffic analysis.

The focus of this category is very technical and is considered a traditional topic in IT security. Encryption and physical security are well known and widely used security mechanisms, and though they are not designed as specific privacy tools, the effects they achieve amount to the same. A system failing to get a decent score in this category not only has problems with privacy, but problems with the general security of the system as well.

Sensitivity: This category covers the sensitivity of data stored and managed by the system. The sensitivity is an expression of the amount of damage a privacy breach could do. The category identifies privacy risks in a given system, so it must include the techniques that are used to protect data in case it is leaked or stolen, such as separation of sensitive data from less sensitive data. This means that the category works to accomplish a measurement of the sensitivity of the data, as well as whether this sensitivity has been lowered by separating identifiable information from the rest of the data. The focus areas of this category are:

Risk profile, which is a measure of the sensitivity of the data in a system. The measure ranges from items of low sensitivity, for example, phone numbers, to items of high sensitivity such as medical records.
ID-separation, which is a measure of the degree to which identification data has been separated from the operational data in a system. The use of pseudonyms, instead of real names, would provide a low separation, while use of completely anonymized data would provide a high degree of separation.

The possible score of this category is highly


dependent on the context and the function of the system; for example, a medical journaling system for an emergency room will probably never be
able to get a perfect score.
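The ID-separation factor above can be pictured with a small sketch in which identifying data and operational data are held in separate stores linked only by an opaque key; the store names and record fields are hypothetical.

import uuid

# Hypothetical sketch of ID-separation: a leak of the operational store alone
# reveals case data but no directly identifying information.
identity_store = {}     # opaque key -> identifying data (tightly protected)
operational_store = {}  # opaque key -> operational data (used day to day)

def register(name, citizen_id, case_data):
    key = uuid.uuid4().hex
    identity_store[key] = {"name": name, "citizen_id": citizen_id}
    operational_store[key] = case_data
    return key

key = register("Jane Doe", "010203-1234", {"benefit": "housing", "status": "open"})
print(operational_store[key])  # no name or citizen ID in the operational record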


Environment: This category covers the


entire environment that surrounds the system.
The environment includes all aspects relating
to privacy, from the ability of interested parties
to comment on and influence the development
process, to the production environment of the
system. The category seeks to assess the work
that has been done to ensure that the system has
been subjected to examination and comments have
been received by special interest groups and the
like. It also examines whether the system is seen
to conform to applicable laws and to what degree
parts or the whole of the system is outsourced
to other companies with different interests or to
other nations with different cultures. The major
areas of concern for this category are:

Influence concerns whether the concerns


and comments of public interest groups have
been sought and whether they have come to
a common understanding.
Law and regulation must be followed and
it is important to note whether there has
been a professional to check the system's
compliance.
Outsourcing can lead to problems arising
from differences in culture and interests.
Outsourcing covers not only outsourcing to
other countries but also outsourcing from
government agencies to private companies.

This category is mostly important during the


development phase of the system. Its purpose
is to prevent closed and secretive development
of systems handling private data by emphasising the values of open and publicly scrutinised
systems.
Surveillance: This category covers the surveillance capabilities that the system provides to
the operators of the system. This includes who has
interest in the data, and how easy it is to link data
from this system to other datasets. The category
establishes to what degree the system limits its

gathering of data, as well as how much use it makes of


common identifiers that can easily be linked to
other datasets. The focus areas are:

Limiting data is about ensuring that only


the necessary data are collected and processed in a system. In order to minimize
privacy issues a system should resist the
lure to collect more data than needed, that
is, the system should not collect data which
is nice-to-have, but should limit itself to only
collect data that it needs-to-have.
Common identifiers concern the use of
identifiers which are used in other databases
and therefore allow for data to be joined
from one or more separate databases. Identifiers such as social security numbers are
especially dangerous in this regard.

While data-mining may solve a number of


important problems for governments and large
organisations, it is a very large threat to an
individual's privacy. While data contained in
individual databases may not present a serious
privacy threat in itself, the combination of data
from many databases may provide the ability to
infer large amounts of private information about
an individual.
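The linkage risk created by common identifiers can be illustrated with a toy join across two otherwise separate registers; the data and register names below are invented for the example.

# Hypothetical illustration: two separate registers that share a common
# identifier can be combined into a far more revealing profile than either
# register provides on its own.
tax_register = {
    "010203-1234": {"income": 410000, "municipality": "Lyngby"},
}
health_register = {
    "010203-1234": {"diagnosis": "asthma", "last_visit": "2007-11-02"},
}

combined = {
    citizen_id: {**tax_register[citizen_id], **health_register.get(citizen_id, {})}
    for citizen_id in tax_register
}
print(combined)  # income, residence, and health data linked through one key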
Ordinary use: This category covers how data
is processed when it is used on a daily basis. The
category examines how risks of data leaks are
handled, and what is done to reduce individual
users' access to sensitive data. This important
category establishes how the operators of the
system are trained to handle such data and to
what degree they are restricted in their data access. Finally, the category examines the operators'
ability to remove the private data from its security
framework by, for instance, printing the data. The
category focuses on the following areas:

Education covers the training received by the personnel, such as case workers, that are actively working with people's private data on a daily basis. Basically, this establishes whether the employees have been informed or educated about privacy and the data they have access to.
Access control concerns to what degree the system restricts individual employees' access to data. Generally, tighter restrictions on data access are better for privacy.
User storage is generally about the ability of the system's users to remove private data from the security framework, for example, by taking data home on their laptops or by printing cases.

Often within IT security, the greatest threats to a system come from the employees. This is no different for privacy, where it may be even more common, as something as casual as curiosity can be difficult to detect or prevent.
Transparency: This category covers the level of information that the system provides to its users about privacy risks in the system, for example, whether the system is able to tell the user why and for what purpose the data is collected. The category also examines to what degree the users are able to influence the way that data is processed in the system. More specifically, the category examines how well the system manages to inform the registered people about the purpose of the registration and the system, contact points, the registered persons' rights, and exactly what data are processed. The major areas of concern in this category are:


Purpose concerns whether the system has a well-defined purpose, which motivates the collection of private data, and whether the registered individuals are informed about this purpose.
Contacts concerns how easy it is for individuals to get in touch with representatives of the system, making it possible to get information or make complaints about it.
Information of rights covers to what degree a system informs registered individuals of their rights, if any, relating to the processing of their private data.
Information of data covers the amount of information that registered individuals receive from a system about what data has been collected. This also extends to any extra information the individuals receive about the processing of their data, for example, the names of the case workers involved.

This category of privacy is less about preventing privacy breaches and more about ensuring
that every person whose private data is collected
and processed is informed of this and given the
necessary knowledge and opportunity to ask
questions and raise objections.
Control: This category covers the controls that the system implements to check the use and storage of the data. This includes both the users' control of data correctness and external audits of the system. The category examines the users' own level of control over data in the system, but also the level of outside impartial controls that ensure that the system is only doing what it is supposed to do. Finally, the category examines the system's own functions to ensure that data is correct and coherent. The major focus areas of the category are:

Registered individuals' control concerns the level of control an individual is given in the processing of his or her data. Such controls could include the ability to decide who gets access to the data and who should be denied access to the data.
Data correctness concerns the quality of the private data. This examines whether a system double-checks data with other data sources and/or whether individuals are encouraged to check and correct errors.

Audit covers the controls of the system and its functions. A higher score is given for audits performed by independent sources, but even internal audits are better than no control.

The category examines the controls that ensure


a running system is performing according to its
purpose. With private information being held in
the system, the registered individuals have an
interest in being able to check that this information is not being misused. This also helps to build
trust between the system and the users. Audits
may be introduced to ensure that the system performs correctly, even when the users are failing
to check the system or when they do not know
how to check the system. Finally, auditors may
also be allowed to check parts of the system that
are not visible to the users.

Privacy and Privacy Factors


The privacy factors are derived from a compilation of several sources. The model emphasises the need to cover not only the traditional privacy protection in the form of classic IT security and related technical tools, but also the practical handling of private data. The traditional security factors are, according to Jansen and Peen (2007), partly identified by the Common Criteria security evaluation scheme, which has a category covering privacy. This set is further expanded by the focus areas of privacy enhancing technologies as described in the fourth section. A state-of-the-art analysis of current privacy enhancing technologies is used to identify the possibilities in privacy protection. It is this set of possible privacy tools that defines the technical factors in the privacy model and the score of each factor (as each factor's score is based on how well privacy is protected in relation to how well privacy can be protected). The rest of the factors concern the more practical privacy protection or risk reduction. These factors are the ones focusing more on the operational privacy issues covered in the third section. They are defined so that they cover the privacy risks arising from the different interests and motivations of the actors in a system. They also cover legislation and cultural issues. Combined, this aims to cover the definition of privacy in context as defined in the second section.
Some overlap does exist between the technical
and the practical factors as a result of similarities
in the cause of the specific privacy risk. In these
cases, the factors have been kept separate to assess
the technical and practical solutions separately,
as they are not mutually exclusive.

Measuring Privacy Factors


The results of the evaluation of each category will finally be used to sum up a final result. To produce a result set which can be compared to other sets, it is important that the results are treated uniformly. The operational privacy assessment model defines a series of questions designed to determine the level of privacy protection that the system provides within each category. The specific questions each address one of the issues covered by the category. The questions are designed to result in a percentage score ranging from 0% in cases of the highest privacy risk, to 100% in cases of the lowest possible privacy risk. Each possible answer to a question results in a score that reflects the steps taken (or the lack of steps taken) to reduce the privacy risks. Note that the score approaches 100% as the system approaches the privacy-protecting steps that are practically possible, not just theoretically possible. These questions are compiled into a questionnaire that should be answered by someone who knows about the design and operation of the system within the different areas covered by the questionnaire. Typically, the head of IT services will be able to answer most of these questions, but she may delegate some questions to someone who is more familiar with a particular topic. The answers to the questions within each sub-category are aggregated into a single score for each of the seven categories, which may then be plotted on a chart that shows seven percentage values, one for each category. The score for each sub-category is calculated by simply reading off the recorded answer to each question, which is always a percentage value. If there are several valid answers to a particular question, the assessment method selects the one that gives the lowest score. This enforces the policy that no chain is stronger than its weakest link. The result of each category is then calculated as the average of the questions in each sub-category. The averaging produces seven discrete results, one for each of the above categories.
Example: Consider a system that contains sensitive personal data. An evaluation of the category Sensitivity requires an evaluation of two subcategories: ID-Separation and Risk Profile, which may receive the following scores: ID-Separation = 0% because the person is completely identified (i.e., the highest possible privacy risk), and Risk Profile = 50% because the sensitivity of the personal data is medium (i.e., a medium privacy risk). With these results, the final result is calculated as a simple average, which gives 25%. This category score is then used as one of the seven scores in the final result.
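
As a concrete illustration of this aggregation, the short sketch below (written in Python; the function and variable names are our own illustrative assumptions, since the model itself prescribes no implementation) applies the weakest-link rule to each question and then averages the question scores, reproducing the Sensitivity example above.

def category_score(question_answers):
    """question_answers maps each question in a category to the list of valid
    answer scores (percentages) recorded for it in the questionnaire."""
    # Weakest-link rule: if several valid answers apply to a question, keep the lowest score.
    question_scores = [min(scores) for scores in question_answers.values()]
    # The category result is the average of the question scores.
    return sum(question_scores) / len(question_scores)

# The Sensitivity example above: ID-Separation = 0%, Risk Profile = 50%.
print(category_score({"ID-Separation": [0], "Risk Profile": [50]}))  # prints 25.0
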

Privacy Assessment Questionnaire


As previously mentioned, the practical assessment of privacy is based on a series of questions answered by a person who is familiar with the system that is being evaluated. These questions are presented in the form of a questionnaire, where each question has a few standard answers that are associated with a numerical value. The scale of these values is equidistant, which means that there is no way to differentiate between systems that offer the same type of protection with different degrees of security; that is, there is no difference in the value of a server that is locked in a normal server room and a server locked in a highly secured area, such as a bunker. This is done because precise weights for this type of issue depend on too many factors that differ from system to system, so we have decided on a unified scale, so that differences in security only become evident through a combination of questions. The underlying categories for each question are identified in the section on identifying privacy factors, and the way we associate numerical values with each answer is presented in the section on measuring privacy factors. The first page of a standard questionnaire is presented in Figure 1.
In the following, we present the questions for each category and the numerical values associated with each of the possible standard answers that we have identified.
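
To make the structure of such a question concrete, the following sketch (Python; the dictionary layout and names are illustrative assumptions, not part of the model) encodes the storage protection classes from Table 1 as a question with equidistant standard answers and looks up the value of a recorded answer.

# Standard answers for the storage protection question (values from Table 1).
storage_protection = {
    "No protection": 0,
    "Protected against physical theft or encrypted": 50,
    "Protected against physical theft and encrypted": 100,
}

recorded_answer = "Protected against physical theft or encrypted"  # hypothetical recorded answer
print(storage_protection[recorded_answer])  # prints 50 (percent)
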
Data protection: This category considers
the protection of data that are being stored in the
system as well as data being transmitted between
components in the system. The data protection
classes are shown in Table 1.
Sensitivity: This category describes the sensitivity of the data managed in the system. This includes a risk profile based on the inherent sensitivity of the data and ID-separation, which describes how easy it is to link data to a physical (real world) identity. The sensitivity classes are shown in Table 2.
Environment: The questions pertaining to the environment focus primarily on factors relating to the development of the system. Questions include whether relevant external organisations have been consulted in the development process, whether competent advice has been sought regarding the legal aspects, and whether parts of the development have been outsourced to another country, where the privacy legislation or the culture of the community may be different. The standard environment classes are shown in Table 3.
Surveillance: This category addresses the surveillance capabilities that the system offers to the operators of the system. These capabilities are normally controlled by limiting the amount of data collected and preventing linkability through the use of common identifiers. The standard surveillance classes are shown in Table 4.
Ordinary use: This category focuses on problems relating to the day-to-day operation of the system. This includes the education of staff with respect to privacy issues, the implementation of the necessary access controls to enforce the need-to-know principle, and the ability to make off-line copies of data that are not subject to the controls implemented by the system. The standard ordinary use classes are shown in Table 5.
Transparency: This category focuses on the registered persons' ability to ascertain the purpose for registering personal information, to contact the agents who operate the system, and to find out what rights they have with respect to the registered data, as well as the actual rights that registered people have with respect to the private data managed in the system. The standard transparency classes are shown in Table 6.
Control: The final category addresses the external controls imposed on the system by the registered users and auditors. These controls include the procedures put in place to ensure the correctness of data managed by the system. The standard control classes are shown in Table 7.

Table 1. Standard data protection classes

Storage
1. No protection: 0%
2. Protected against physical theft or encrypted: 50%
3. Protected against physical theft and encrypted: 100%

Communication
1. No protection: 0%
2. Protected against wire tapping but not against identification: 50%
3. Protected against wire tapping and identification: 100%

Table 2. Standard sensitivity classes

Risk profile
1. Highly sensitive data: 0%
2. Sensitive data: 50%
3. Normal, publicly accessible data: 100%

ID-separation
1. No protection (identification is possible): 0%
2. Responsible pseudonyms: 33%
3. Pseudonymised data: 66%
4. Full protection (anonymised data): 100%

Table 3. Standard environment classes

Influence
1. Developed without help from external organizations or users: 0%
2. Developed with a few external organizations or users: 33%
3. Developed with external organizations and/or users: 66%
4. Developed with many external organizations and users, with high public attention: 100%

Law and regulation
1. Developed without legal advice: 0%
2. Developed with legal advice: 50%
3. Approved by the authorities: 100%

Outsourcing
1. All development outsourced to a foreign country: 0%
2. Some or all of the development outsourced to a company that is covered by the same laws: 50%
3. Not outsourced: 100%



Table 4. Standard surveillance classes

Limiting data
1. Collection of all possible data, for possible later data processing: 0%
2. Collection of some extra data: 50%
3. Limited data collection: 100%

Common identifiers
1. Social security numbers used for identification: 0%
2. Some system-dependent identifiers and social security numbers: 50%
3. System-dependent identifiers and no social security numbers: 100%
Table 5. Standard ordinary use classes

Education
1. No information about privacy issues: 0%
2. Warning that the system contains sensitive data: 33%
3. Informed about how to handle sensitive data: 66%
4. Educated in handling sensitive data: 100%

Access control
1. Open system; full access to data: 0%
2. Closed system where some users have access to all data: 33%
3. Some segregation of duties, where access is granted depending on job description: 66%
4. Full segregation of duties, including geographical segregation: 100%

User storage
1. Extraction of data is possible without a requirement for encryption: 0%
2. Extraction of data is possible with a requirement for encryption: 50%
3. No extraction of data is possible: 100%

Table 6. Standard transparency classes

Purpose
1. No information is given about the purpose of the system: 0%
2. No information is given about the purpose of the system, but there is access to some of the data: 25%
3. No information is given about the purpose of the system, but there is access to all of the data: 50%
4. Information is given about the purpose of the system, and there is access to some of the data: 75%
5. Information is given about the purpose of the system, and there is access to all of the data: 100%

Contacts
1. No official way to contact the system owner: 0%
2. It is possible to contact the system owner, but this is not public knowledge: 50%
3. Official and publicly known ways to contact the system owner: 100%

Information of rights
1. No information about the rights: 0%
2. The simple rights are available: 50%
3. The rights are available and explained in a decent manner: 100%

Information of data
1. It is not possible to see registered data or know how it is processed: 0%
2. It is possible to see registered data, but not to know how it is processed: 33%
3. It is possible to see registered data, and to know some of how it is processed: 66%
4. It is possible to see registered data, and to know how it is processed: 100%

Figure 1. First page in a privacy assessment questionnaire

Table 7. Standard control classes

Registered individuals' control
1. No control of the data: 0%
2. The process is described but there is no control: 33%
3. Parts of the process are controllable: 66%
4. All parts of the process are controllable: 100%

Data correctness
1. Not possible to check: 0%
2. Only some administrators can see and correct the data: 33%
3. All administrators can see and correct the data: 66%
4. Individuals can see and correct the data: 100%

Audit
1. No audit: 0%
2. Occasional audit: 33%
3. Internal audit by official schedule: 66%
4. Independent audit by official schedule: 100%

PRIVACY ASSESSMENT

The assessed system's score with respect to these different classes is determined through a questionnaire, which is completed by a person who is familiar with the different aspects of the development and operation of the system. An example of the first page of a questionnaire is shown in Figure 1.
To present an intuitive overview of the operational privacy assessment results calculated by our method, and to enable a comparison of the results of the different categories, we suggest displaying the results using a Kiviat graph, as defined by Kolence and Kiviat (1973), where each category is represented by an axis. The chart displays the strengths and weaknesses of the system tested. High privacy protection is placed at the edge of the chart and low privacy protection is placed close to the centre of the chart. When the scores are connected, a heptagon is formed, and the area of the graph depends on the total score, with a small area showing high privacy risk and a larger area showing lesser privacy risk.

Figure 2. Example of a Kiviat graph from a fictitious system

Figure 2 shows a fictional government IT system. It shows that the privacy protection of the system is strong in the categories environment, ordinary use, and control; the system has average scores in surveillance and data protection; and the areas sensitivity and transparency have weak privacy protection. These final two areas would possibly benefit from further work to improve their privacy level.
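
As an illustration of how such a Kiviat graph could be produced, the sketch below plots seven category percentages on a radar chart. The use of matplotlib and the sample scores are assumptions made for illustration; the chapter itself does not prescribe a tool.

import math
import matplotlib.pyplot as plt

categories = ["Data protection", "Sensitivity", "Environment", "Surveillance",
              "Ordinary use", "Transparency", "Control"]
scores = [50, 20, 70, 40, 80, 30, 75]   # hypothetical percentages; higher = less privacy risk

angles = [2 * math.pi * i / len(categories) for i in range(len(categories))]
angles_closed = angles + angles[:1]     # repeat the first point to close the heptagon
scores_closed = scores + scores[:1]

ax = plt.subplot(111, polar=True)
ax.plot(angles_closed, scores_closed)
ax.fill(angles_closed, scores_closed, alpha=0.2)
ax.set_xticks(angles)
ax.set_xticklabels(categories)
ax.set_ylim(0, 100)                     # 100% at the edge = high protection, 0% at the centre = high risk
plt.show()
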

PRIVACY ASSESSMENT EVALUATION

When the privacy protection instruments of a given system are evaluated, it is important to examine both the quality of the privacy enhancing technologies employed in the system and the operational environment of the system, which includes privacy legislation, system operation, and education of the managers and end-users of the system. The privacy assessment method presented in this chapter includes all of these factors.

PRIVACY ASSESSMENT METHODOLOGY

Producing the assessment result using the model means that, for each of the seven categories in the model, the components must be assigned a percentage value as described in the fifth section. The specific values are found by establishing the amount of work done to protect privacy on a scale between no work and all that is possible. Such values can be found by setting a number of possible levels of work and assigning each level a value. For example, to determine the amount of work done to protect privacy through audits, in the category control, four possible levels can be used (see Box 1).

Box 1. Privacy through audit
1. No audits: 0%
2. Occasional audit: 33%
3. Internal audits by official schedule: 66%
4. Independent audits by official schedule: 100%

When determining which level is achieved by the system, the lowest denominator takes precedence; for example, an occasional independent audit will only score 33%. Completing these evaluations for all categories and their components produces the values required to calculate the category scores and the Kiviat graph as described in the fifth section.
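
The lowest-denominator rule can be made concrete with a small sketch (Python; the function and its boolean inputs are illustrative assumptions, not part of the model) that maps a system's audit practice onto the Box 1 levels.

def audit_score(has_audits, on_official_schedule, independent):
    # Lowest denominator takes precedence: any missing requirement caps the score.
    if not has_audits:
        return 0      # no audits
    if not on_official_schedule:
        return 33     # occasional audit, even if it happens to be independent
    if not independent:
        return 66     # internal audits by official schedule
    return 100        # independent audits by official schedule

# The example from the text: an occasional independent audit still scores only 33%.
print(audit_score(has_audits=True, on_official_schedule=False, independent=True))  # prints 33
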
Realizing the full benefit of a completed evaluation is an important two-step process, where the
first step identifies the areas of potential privacy
improvements and the second step further analyzes privacy risks and identifies more specific
privacy improvements.
Initially, an evaluation using the model will result in a Kiviat graph visualising privacy protection in a given system, as illustrated by Figure 2. In this visualization, the dimensions with the lowest scores are the most immediately interesting, as they represent the largest risks and therefore the largest potential areas for improvements. It is unlikely that all privacy risk areas can be improved, because the system's functional requirements may prevent some privacy improvements. For example, a system requiring large amounts of very sensitive data will receive a low score in sensitivity, representing the associated high risk, and will be unable to improve on that score. The discovery of such areas is in itself valuable information that allows the system owners to counter this risk using other precautions and/or raising the demands in other areas. The results of the privacy assessment must therefore be interpreted in the context of the functional requirements of the system and the limits these requirements impose. The model does not seek to conclude on the question of whether the system's benefits are sufficient for the associated privacy risks to be accepted; it merely seeks to determine and clarify these privacy risks.
Once the overall assessment has been completed, more specific improvements can be found by looking at the individual dimensions. The dimensions with the lowest scores represent the greatest areas of privacy threats, which are often the most efficient areas to start with, but are not necessarily the easiest, which is why the overall assessment in step one is important. For each dimension, the specific issues that cause the low score can be easily identified. It is for these specific issues that the system owners and developers should try to identify and implement possible improvements. While the evaluation presents the specific privacy issues in a system, it does not provide the specific solutions. The method of improving the individual solutions is highly dependent on the type and structure of the specific system. Any solution must therefore be tailored to the specific system using industry-approved methods and tools. This means that there is a certain amount of work to be done in finding an optimal solution for each of the specific privacy issues. Once all the issues of a dimension have been examined, and improved if possible, the work can continue to the next dimension with a low score. This continues as long as time or budgets allow, or until all dimensions have been adequately analyzed.

PRIVACY ASSESSMENT SCENARIOS

In order to demonstrate the feasibility of the proposed privacy assessment model, we have defined two application scenarios, which illustrate how our model may be used to assess the privacy properties of these scenarios. In each scenario, we provide a description of the setting and the immediate results of an assessment for each category of the model. Also, the resulting Kiviat graph is presented. After the two scenarios we discuss a comparison of the results.

Scenario 1

The first scenario defines a simple medical journaling system used by doctors and small clinics. The data stored in the system is personal medical information, which is by nature very sensitive. The journaling system is integrated with the doctor's or small clinic's existing IT system, which also stores the data. The system does not encrypt data, but the door to the office is locked after opening hours. The system is developed by a small development company and is only used by a handful of doctors and clinics. The system uses social security numbers to identify patients and stores any personal data that the doctors find relevant. The clerks at the medical clinic have no special training regarding the management of personal information, and they have the same access to the system as the doctor. The system allows doctors to print individual patients' journals so that they can bring them on a house call or take them home. Patients' data are entered into the system the first time they visit the clinic and updated at subsequent visits. The data in the system does not expire and there is no external audit of the system. The patients can, upon request, receive a copy of their data, but they are not otherwise informed about the contents or functions of the system, and the patient has no direct control of the stored data.
Assessment results: Using our model, we get a specific result for each category which indicates the level of privacy risks in that category.
Data protection: While the overall score in this category is relatively high, the model shows an evident opportunity for improvement by encrypting the stored private data. The system scores 25%.
Sensitivity: The category notes some very large risks inherent in a system with this sort of data. Also, the system is too small to properly separate the identification data from the rest of the set. A low score in the category warns that extra care should be taken. The system scores 12%.
Environment: As a result of the closed development, the system is unable to achieve a top score in this category. The system scores 33%.
Surveillance: The system scores low in this category as a result of the unrestricted amount of personal data, the high sensitivity of the data, as well as the use of common identifiers such as social security numbers. The system scores 0%.
Ordinary use: A bottom score in this category reveals a lot of room for improvement. The clerk has no training in handling personal data, and there are no levels of access to differentiate between what data the different users can access. Also, any security in the system is lost when the data is printed or otherwise copied out of the system. The system scores 8%.
Transparency: While the system is not a secret, it does little to actively inform the registered persons. This makes things harder for the registered persons and thus lowers the score of the category. The system scores 46%.
Control: The score reflects a lack of auditing of the system, as well as the registered individuals having very little say in how and why their data is processed. The system scores 22%.

Scenario 2

This scenario is similar to Scenario 1, but with a different system. The journaling system in this scenario is developed by the IT branch of a large medical company, which also hosts and serves the data from its central server farms. These server farms securely store the data and communicate with the clinics using encrypted channels. The system is able to share data with hospitals and provides online access for the patients to their data. The system uses social security numbers to identify patients and stores any personal data the doctors find relevant. The clerks and doctors using the system attend an introductory course which gives information about the management of personal data. The system has different levels of access for the doctors and the clerks. The system synchronises items such as addresses and phone numbers with government databases. Through the online access, patients are able to browse their personal data and contact the system owners. Also, the system is regularly audited by government auditors.
Assessment results: Using our model, we get
a specific result for each category which indicates
the level of privacy risks in each category.
Data protection: The system has protection
of data both during storage and transport. While
this serves to reduce privacy risks, the model does
present options for additional improvements. The
system scores 50%.
Sensitivity: Due to the sensitive nature of the
data, there are significant privacy risks involved
in the system. The score of this category reflects
this fact. The system scores 16%.
Environment: This system has been developed with more input from external interest groups, as well as having been checked for compliance with current law. However, the possible conflicts of interest from the private firm hosting the data subtract from the final score. The system scores 44%.
Surveillance: As with the system described in Scenario 1, the risks that stem from the amounts of data and the common identifiers are considerable. The system scores 0%.
Ordinary use: The extra educational effort combined with the role-based access limits goes a long way towards reducing the risks in this category. However, the possibility of removing data from the security of the system still exists. The system scores 58%.
Transparency: The system allows registered
people to access their data and contact their doctor or the system owner if something is wrong.
This gives the individual better information about
his or her privacy situation with respect to this
system and the data that it contains. The system
scores 75%.
Control: While the registered persons have no
control of the processing of their personal data,
the system still improves on privacy by ensuring
correctness of data as well as submitting to regular
official audits. The system scores 66%.

PRIVACY ASSESSMENT RESULTS

The privacy assessments of the two scenarios allow us to compare the privacy properties of the two systems by plotting the scores in a Kiviat graph. Figure 3 and Figure 4 show the plotted results of the privacy evaluation. The interpretation of the Kiviat graph is presented in the fifth section.

Figure 3. Privacy assessment of Scenario 1

Figure 4. Privacy assessment of Scenario 2

While the sensitivity, which is inherent in medical systems, is a large factor in the large overall level of privacy risks, there are important differences between the two systems. The categories of sensitivity and surveillance can be difficult to improve directly, which explains the similar scores, but it is not impossible and it is worth considering for both systems.
The system in Scenario 2 is somewhat more professional in design and operation. The extra effort results in considerably improved privacy protection compared to Scenario 1 in the categories of data protection, control, transparency, and ordinary use. While some of these improvements are technical solutions, such as encryption and role-based access control, many are the result of improved policies, practices and awareness. This includes improved training, regular audits, and the effort to inform the registered individuals of what is going on. While these improvements can be more difficult to assess than technical solutions, such as the use of role-based access control, they may prove equally efficient and help to significantly reduce privacy risks when technical solutions are not possible.
The comparison reveals that Scenario 2 is in most areas superior to Scenario 1 from a privacy perspective. While some of this is based on technical solutions that may not be easily adaptable for Scenario 1, many of the more practical improvements are. Transparency, ordinary use, and control are areas where Scenario 1 could benefit by learning from Scenario 2.
While the system described in Scenario 1 is likely to be cheaper, it comes with the extra cost of poor privacy protection.
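
The per-category scores reported for the two scenarios can also be compared directly; the sketch below (Python, illustrative only) tabulates the numbers given above and prints the difference for each category.

# Category scores reported in the two scenario assessments above (percentages).
scenario_1 = {"Data protection": 25, "Sensitivity": 12, "Environment": 33,
              "Surveillance": 0, "Ordinary use": 8, "Transparency": 46, "Control": 22}
scenario_2 = {"Data protection": 50, "Sensitivity": 16, "Environment": 44,
              "Surveillance": 0, "Ordinary use": 58, "Transparency": 75, "Control": 66}

for category, s1 in scenario_1.items():
    s2 = scenario_2[category]
    print(f"{category:15s}  Scenario 1: {s1:3d}%  Scenario 2: {s2:3d}%  difference: {s2 - s1:+d}%")
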

FUTURE TRENDS

Most people are, in one way or another, concerned about their privacy; this is especially true amongst Internet users, as indicated by Reips (2007). This concern, however, is not easily translated into specific demands for privacy protection until after major privacy breaches become known. Therefore, we do not see demands for privacy coming from the users of government systems, but from politicians and companies that feel a need to avoid scandals caused by the loss of personal data. The reason for this is probably that most people do not see the risks, because they lack an understanding of the privacy issues and their risks. Most people only know about privacy risks and loss of personal data from the newspapers, but once data breaches are known, demands for privacy enhancing technologies emerge.
The threats against privacy are also likely to increase. This is probably best illustrated by the latest anti-terror legislation by the Congress of the United States (2001) and the English Parliament (2006). The fear of terror has also prompted legislators to introduce new biometric passports and to require passenger information, including credit card details, for all passengers on commercial airlines. Finally, the European Parliament has published a report by Walton (2006) on the existence of a global system for the interception of private and commercial communications (the ECHELON interception system), which also increases surveillance and thereby the threats against privacy.
Not all new privacy enhancing technologies
are easily implemented in government systems,
but the method defined by our operational privacy
assessment model gives an overview of the current privacy risks and security level. This can be
used to highlight the issues relating to privacy in
the future development of government systems,
which may help to make them more privacy
friendly. Furthermore, it may form the foundation for further research into what privacy risks
are common and should be the focus for further
privacy research.


CONCLUSION
Privacy is a subjective concept, so its definition
and importance will vary from person to person.
The model presented in this chapter helps to standardise the work of securing privacy in electronic
systems. The model contains seven categories
that together cover all aspects of privacy. Each
category clarifies a range of questions concerning privacy and the model produces a simple
objective result in the form of a score system.
The score system makes it possible to assess the
overall privacy level in a system and to compare
the system to other similar systems. The score in
each category indicates the level of privacy risk
within that category, which helps the developers
and administrators of government IT systems
to identify the privacy factors that should be
addressed first. The score relates to the current
state of privacy in the system, but it may also
help determine how well the system tested may
address future privacy problems, for example, if
the system has a low score in sensitivity because
the information that it manages is highly sensitive,
there is little hope for a better score in the future.
The standardisation and the overview of privacy
risks provided by the model serve to help system owners understand the privacy risks in their
systems as well as help the individuals, whose
private data is being processed, to understand
their personal privacy situation. Furthermore, the
model addresses all the technical and operational
aspects which influence privacy in a system. The
model has been evaluated in a few real systems
by Jansen and Peen (2007), but we would like
to analyse a whole sector within a government
administration, in order to demonstrate the
general applicability of our model. This analysis
would also provide valuable information about
the general state of privacy in the government
sector. The proposed model focuses on government IT systems, which are governed by privacy legislation, and there are few direct motives for civil servants to ignore privacy laws. The privacy issues in the private sector are slightly different because private companies have financial interests in the personal data that they collect, so further studies are needed to determine whether it is possible to include this aspect in the model without compromising its usability.

FUTURE RESEARCH DIRECTIONS


The privacy model described in this chapter defines a basis for a systematic approach to working
with privacy. The current definition of the model
is very general and it may not capture all aspects
of privacy in a given application. However, working with the model in practice and, in particular,
performing assessments of existing government
IT systems will help refine the model so that it
may serve as the basis of a privacy certification
scheme. Ideally, we would hope for a single general
scheme, but we believe that it is more likely that
a number of schemes may emerge, which would
be limited to certain genres of systems, such as
e-commerce. The narrower the genre of system
examined with the model, the easier it may be to
find an appropriate definition of privacy in that
context. An interesting idea would be to attempt
to extend common criteria certifications with elements of the model described in this chapter. As
the current model is already based on the structure
of the common criteria, this would be an obvious
and valuable extension of the work presented in
this chapter. The advantage of certification is that systems with the label "privacy friendly" would help focus the public's attention on privacy.
A more general problem that needs to be addressed is that privacy has different meanings
for different people. Moreover, the definition of
privacy seems to change when people are forced
to consider the issues involved, so the meaning
of privacy may actually change when observed.
It would therefore be interesting to solve this
problem and come up with a generally acceptable
definition of the privacy concept. Questionnaires and interviews might help determine the value that people intuitively put on privacy and how
this value changes when they are asked to think
about problems relating to privacy. This approach
could also be used to determine the ability and
willingness of people to manage their own privacy,
which determines whether an approach based on
informed consent is actually useful. This study
would also help develop new tools that allow
people to control the keys to their own data in
a non-intrusive and intuitive way. There are a
lot of questions to answer in this area, which is
typical when it comes to working with privacy,
but it could be interesting to see if these questions
could lead to people defining their own ideas of
privacy and to define a framework for building
IT systems that incorporate these personal definitions of privacy when choosing how to work
with private data.

REFERENCES

Blaze, M., Feigenbaum, J., & Lacy, J. (1996). Decentralized trust management. In Proceedings of the 1996 IEEE Symposium on Security and Privacy (pp. 164-173).

Chaum, D. (1988). The dining cryptographers problem: Unconditional sender and recipient untraceability. Journal of Cryptology, 1(1), 65-75.

Congress of the United States. (2001). USA PATRIOT Act of 2001. Retrieved July 16, 2007, from http://thomas.loc.gov/cgi-bin/bdquery/z?d107:H.R.3162

English Parliament. (2006). Terrorism Act 2006. Queen's Printer of Acts of Parliament, UK.

European Opinion Research Group. (2003). Data protection. Special Eurobarometer 196, Wave 60.0.

European Parliament. (1995). Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

European Parliament. (2001). Report on the existence of a global system for the interception of private and commercial communications (ECHELON interception system) (2001/2098(INI)).

European Union and United States of America. (2007). Agreement between the European Union and the United States of America on the processing and transfer of passenger name record (PNR) data by air carriers to the United States Department of Homeland Security.

Ferraiolo, D., & Kuhn, R. (1992). Role-based access controls. In Proceedings of the 15th NIST-NCSC National Computer Security Conference (pp. 554-563).

Financial News Online U.S. (2007). JPMorgan client data loss. Story attributed to the Wall Street Journal, reported on Financial News Online U.S. on May 1, 2007.

International Organization for Standardization. (1999). Common criteria for information technology security evaluation (ISO IS 15408). Retrieved July 12, 2007, from http://www.commoncriteriaportal.org/

International Organization for Standardization. (2006). Information technology - Security techniques - Encryption algorithms - Part 2: Asymmetric ciphers (ISO/IEC 18033-2).

Jansen, T. W., & Peen, S. (2007). Privacy i offentlige systemer [Privacy in public systems]. Master's thesis, Informatics and Mathematical Modelling, Technical University of Denmark (in Danish).

Kolence, K. W., & Kiviat, P. J. (1973). Software unit profiles & Kiviat figures. ACM SIGMETRICS Performance Evaluation Review, 1973(2), 2-12.

Lederer, S., Mankoff, J., & Dey, A. (2003). Towards a deconstruction of the privacy space. In Proceedings of the Ubicomp communities: Privacy as boundary negotiation workshop at Ubicomp 2003.

Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A proposal for terminology. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability. Springer Verlag.

Privacy Rights Clearinghouse. (2007). A chronology of data breaches. Retrieved May 14, 2007, from http://www.privacyrights.org/ar/ChronDataBreaches.htm

Reips, U.-D. (2007). Internet users' perceptions of privacy concerns and privacy actions. International Journal of Human-Computer Studies, 65(6), 526-536.

Rivest, R. L., Shamir, A., & Adleman, L. A. (1978). A method for obtaining digital signatures and public-key cryptosystems. Communications of the ACM, 21(2), 120-126.

Sandoval, G. (2006). Veterans' data swiped in theft. CNET News.com. Story last modified Mon May 22 16:55:51 PDT 2006.

Smith, R. E. (2006). Laptop hall of shame. Forbes.com. Commentary on Forbes.com, September 7, 2006.

Stanton, P. (2004). Securing data in storage: A review of current research. CoRR, cs.OS/0409034.

Walton, R. (2006). Balancing the insider and outsider threat. Computer Fraud and Security, 2006(11), 8-11.

Yu, E., & Cysneiros, L. (2002). Designing for privacy and other competing requirements. In Proceedings of the 2nd Symposium on Requirements Engineering for Information Security (SREIS-02).

Zadok, E., Badulescu, I., & Shender, A. (1998). Cryptfs: A stackable vnode level encryption file system (Technical Report CUCS-021-98). Columbia University, Computer Science Department.

Zimmermann, P. R. (1995). The official PGP user's guide. Cambridge, MA: The MIT Press.
Additional Reading

American Civil Liberties Union (ACLU). (2007). Privacy section of their Web site. Retrieved July 12, 2007, from http://www.aclu.org/privacy

Anderson, R. (1996). The eternity service. In Proceedings of the 1st International Conference on the Theory and Applications of Cryptology, PRAGOCRYPT'96.

Brands, S. (2000). Rethinking public key infrastructures and digital certificates. Cambridge, MA: The MIT Press.

Chaum, D. (1981). Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2), 84-88.

Chaum, D. (1985). Security without identification: Transaction systems to make big brother obsolete. Communications of the ACM, 28(10), 1030-1044.

Chaum, D., Fiat, A., & Naor, M. (1990). Untraceable electronic cash. In S. Goldwasser (Ed.), Advances in Cryptology, CRYPTO'88 (pp. 319-327). Springer Verlag.

Clarke, I., Sandberg, O., Wiley, B., & Hong, T. W. (2000). Freenet: A distributed anonymous information storage and retrieval system. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability (pp. 46-66). Springer Verlag.

Dingledine, R., Freedman, M. J., & Molnar, D. (2000). The free haven project: Distributed anonymous storage service. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability (pp. 67-95). Springer Verlag.

Electronic Frontier Foundation. (2007). Privacy section of their Web site. Retrieved July 12, 2007, from http://www.eff.org/Privacy

Electronic Privacy Information Center (EPIC). (2007). Privacy Web site. Retrieved July 12, 2007, from http://epic.org

European Union. (2006). Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC.

Goldberg, I. (2000). A pseudonymous communications infrastructure for the Internet. Ph.D. thesis, University of California, Berkeley.

Goldberg, I. (2003). Privacy-enhancing technologies for the Internet, II: Five years later. In G. Goos, J. Hartmanis, & J. van Leeuwen (Eds.), Second International Workshop on Privacy Enhancing Technologies (PET 2002) (pp. 209-213). Springer Verlag.

Goldberg, I., Wagner, D., & Brewer, E. (1997). Privacy-enhancing technologies for the Internet. In Proceedings of IEEE COMPCON'97 (pp. 103-110). IEEE Computer Society Press.

Privacy Commissioner of Canada. (2000). The Personal Information Protection and Electronic Documents Act.

Reiter, M., & Rubin, A. (1999). Anonymous web transactions with crowds. Communications of the ACM, 42(2), 32-48.

Stajano, F., & Anderson, R. (1999). The cocaine auction protocol: On the power of anonymous broadcast. In A. Pfitzmann (Ed.), Information Hiding Workshop 1999 (pp. 434-447). Springer Verlag.

Waldman, M., Rubin, A., & Cranor, L. F. (2000). Publius: A robust, tamper-evident, censorship-resistant and source-anonymous web publishing system. In Proceedings of the 9th Usenix Security Symposium (pp. 59-72).

ENDNOTES

1. Healthcare is an example of a service which has primarily been offered by the public sector, but where the private sector is playing an increasing role in many European countries.
2. It is interesting to note that library records of ordinary citizens are among the types of data monitored in the hope of identifying and apprehending potential terrorists before they commit any terrorism.
3. A numbered Swiss bank account is an example of a pseudonymous financial service, but most governments wish to limit the availability of such services because they can be used by criminals to hide the proceeds of their crime.
4. Examples of NGOs that are active in the field of privacy are: the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF), the Electronic Privacy Information Center (EPIC), and Privacy International.
5. Secure Sockets Layer, RFC 2246.
6. Definition from the Common Criteria issued by the International Organization for Standardization (1999).

Chapter V

Privacy and Trust in Online Interactions

Leszek Lilien
Western Michigan University, USA

Bharat Bhargava
Purdue University, USA

ABSTRACT

Any interaction, from a simple transaction to a complex collaboration, requires an adequate level of trust between interacting parties. Trust includes a conviction that one's privacy is protected by the other partner. This is as true in online transactions as in social systems. The recognition of the importance of privacy is growing, since privacy guarantees are absolutely essential for realizing the goal of pervasive computing. This chapter presents the role of trust and privacy in interactions, emphasizing their interplay. In particular, it shows how one's degree of privacy can be traded for a gain in the level of trust perceived by the interaction partner. After a brief overview of related research, the idea and mechanisms of trading privacy for trust are explored. Conclusions and future trends in dealing with privacy and trust problems complement the chapter.

INTRODUCTION

Any interaction, from a simple transaction to a complex collaboration, can be successful only if an adequate level of trust exists between interacting entities. One of the more important components of trust of an entity in its interaction partner is its reliance that the partner is both willing and able to protect the entity's privacy. This is as true in cyberspace as in social systems.
The need for privacy is broadly recognized by individuals, businesses, the government, the computer industry, and academic researchers.


Examples are shown in Table 1. The growing recognition of the importance of privacy is motivated not only by users' sensitivity about their personal data. Other factors include business losses due to privacy violations, and enactments of federal and state privacy laws. Even more important, the quest for the promised land of pervasive computing will fail if adequate privacy guarantees are not provided.
The role of trust and privacy is fundamental in social systems as well as in computing environments. The objective of this chapter is to present this role in online interactions, emphasizing the close relationship between trust and privacy. In particular, we show how one's degree of privacy can be traded for a gain in the level of trust perceived by one's interaction partner.
We begin with a brief overview in the next
section, presenting the background for research on

trust, privacy, and related issues. First, we define


trust and privacy, and then discuss their fundamental characteristics. Selecting the most relevant
aspects of trust and privacy to be employed in a
given application and computing environment is
in and by itself a significant challenge. The reason
is that both trust and privacy are very complex,
multifaceted concepts.
Privacy and trust in computing environments are as closely related and as interesting in
various aspects of their interplay as they are in
social systems (Bhargava, Lilien, Rosenthal, &
Winslett, 2004). On the one hand, a high level of
trust can be very advantageous. For example, an
online seller might reward a highly trusted customer with special benefits, such as discounted
prices and better quality of services. To gain
trust, a customer can reveal private digital credentials: certificates, recommendations, or past
interaction histories. On the other hand, a mere

Table 1. Recognition of the need for privacy by different entities


Recognition of the need for privacy by individuals (Cranor, Reagle, & Ackerman, 1999)
99% unwilling to reveal their SSN
18% unwilling to reveal their favorite TV show
Recognition of the need for privacy by businesses
Online consumers worrying about revealing personal data held back $15 billion in online revenue in 2001 (Kelley, 2001)
Recognition of the need for privacy by the federal government
Privacy Act of 1974 for federal agencies (Privacy Act, 2004)
Health Insurance Portability and Accountability Act of 1996 (HIPAA) (Summary HIPAA, 2003; Mercuri, 2004)
Recognition of the need for privacy by computer industry research (examples)
IBM, including the Privacy Research Institute (IBM Privacy, 2007)
Topics include: pseudonymity for e-commerce, EPA and EPAL (enterprise privacy architecture and language), RFID privacy,
privacy-preserving video surveillance, federated identity management (for enterprise federations), privacy-preserving data mining
and privacy-preserving mining of association rules, hippocratic (privacy-preserving) databases, online privacy monitoring
Microsoft Research, including the Trustworthy Computing Initiative (Trustworthy Computing, 2003)
The biggest research challenges: reliability/security/privacy/business integrity
Topics include: DRM (digital rights management, incl. watermarking surviving photo editing attacks), software rights protection,
intellectual property and content protection, database privacy and privacy-preserving data mining, anonymous e-cash, anti-spyware
Recognition of the need for privacy by academic researchers (examples)
Trust negotiation with controlled release of private credentials, privacy-trust tradeoff
Trust negotiation languages
Privacy metrics
Anonymity and k-anonymity
Privacy-preserving data mining and privacy-preserving database testing
Privacy-preserving data dissemination
Preserving location privacy in pervasive computing, and privacy-preserving location-based routing and services in networks
Genomic privacy



perception of a privacy threat from a collaborator may result in a substantial lowering of trust. In particular, any sharing of an entity's private information depends on satisfactory limits on its further dissemination, such as a partner's solid privacy policies. Even just a privacy threat can impede the sharing of sensitive data among the interacting entities, which results in reduced effectiveness of the interaction and, in extreme cases, even in termination of the interaction. For instance, a user who learns that an Internet service provider (ISP) has carelessly revealed any customer's e-mail will look for another ISP.
The idea and mechanisms of trading privacy
for trust, the main topic of this chapter, are explored in the following section. It categorizes
types of privacy-for-trust tradeoff, and shows
how to exchange one's privacy for trust in an
optimal way. The remaining sections, in turn,
present our view of future trends in research on
privacy and trust; include conclusions; present
future research directions for privacy and trust
in computing; include references; and suggest
additional reading material that can supplement
the topics of this chapter.

BACKGROUND: TRUST, PRIVACY, AND RELATED WORK
The notions of trust and privacy require an in-depth discussion of their background, including
their interplay. It is provided in this section.

Trust and Its Characteristics


Definition of Trust
We define trust as "reliance on the integrity, ability, or character of a person or thing" (The American, 2000). Use of trust is often implicit. Frequently, it is gained offline (Bhargava et al., 2004). A user who downloads a file from an unfamiliar Web site trusts it implicitly by not even considering

trust in a conscious way. A user who decides to


buy an Internet service from an Internet service
provider may build her trust offline by asking her
friends for recommendations.
Each entity E should have a good reason to
expect that its interaction partner is both willing
and able to protect E's privacy. This indicates
that dimensions of trust include integrity and
competence of a trusted party. That is, the integrity dimension of trust is a belief by a truster
that a trustee is honest and acts in favor of the
truster, and the competence dimension of trust is a
belief in a trustee's ability or expertise to perform
certain tasks in a specific situation. Predictability
can be attached as a secondary measure to both
an integrity belief and a competence belief (Zhong,
Lu, Bhargava, & Lilien, 2006).

Implicit and Explicit Trust


Trust is a powerful paradigm, truly ubiquitous and beneficial in social systems, that enables smooth operation of such systems, even under conditions of uncertainty or incomplete information. Trust has been comprehensively used and well tested. The need for trust exists in all interactions, irrespective of whether the parties involved are individuals, institutions, or artifacts. For example, trust is constantly, if often unconsciously, applied in interactions between people and animals (e.g., a guide dog), or people and artifacts (e.g., "Can I rely on my car for this long trip?").
Trust has to be approached differently in closed
and open systems. In the former, trustworthiness
of an interaction partner is known to the initiator of an interaction before the interaction starts,
and in the latter it is not known. An example of a
closed social system is a small village where people
know each other (or at least know each other's reputations). Trust is used implicitly, since each villager has a rather good idea of what to expect of everybody else. In short, X feels how much to trust Y. An example of an open social system is a large city, where trust must be used explicitly to avoid unpleasant surprises (such as being harmed by a dishonest or incompetent car mechanic or dentist). A city dweller needs to ask around to find a trusted entity she needs, inquiring with friends, office mates, and so forth. She can also consult professional reputation databases, such as the Better Business Bureau (BBB) or the AAA's Approved Auto Repair Network.
Trust has proven its usefulness in social systems. We need similarly ubiquitous, efficient, and effective trust mechanisms in cyberspace. We have both closed computing systems, such as a LAN serving a research lab, and open computing environments, such as the World
Wide Web or WiFi hot spots. Only the latter include users who are not known in advance to the
system. For example, an access control subsystem
for a WiFi hot spot must determine the permitted actions for each user, including a new and completely unknown user.
We believe that many users or computer systems err by not considering the trust issue at all. They do not assume trust implicitly. They simply ignore
the issue of trust. Without even knowing it, they
trust blindly, without any evidence, justification,
or verification. Such a mistake is made also by
any operating system that trusts all application
programs, allowing any program, including malware, to run. As another example, too many users do not even know that they show a naïve trust by accessing an unknown Web site, which can
harm them or their computers.
Still, closed computing systems, analogous to a small village, have been working well without applying the notion of trust, at least explicitly. However, it becomes more and more difficult to handle open computing systems, analogous to a big city, without the assistance of the powerful trust paradigm. In the security area, for example, the confidentiality-integrity-availability (CIA) paradigm has served
sufficiently well in closed systems but it has to be
replaced or augmented with trust-based solutions
in open environments. Using the trust paradigm


can simplify security problems by reducing complexity of interactions among system components,
both human and artificial ones.

Selected Trust Characteristics


Trust is a very complex and multifaceted notion.
A researcher wishing to use trust in computing
systems must cope with the challenging choice
of the optimal subset of trust characteristics. A
vast variety of different trust-based systems can
result from selecting different subsets. Only some
of the choices will make systems based on them
effective and efficient.
Some of the choices for trust characteristics
include the following:
1. Symmetric vs. asymmetric trust: Symmetric trust assumes that "X trusts Y" implies "Y trusts X," which is not true in general. Asymmetric trust does not assume such an implication and is, therefore, more general. Symmetric trust can be viewed as its special case, which can be chosen only in very special circumstances or applications.
2. Gradual vs. binary trust: The former allows for degrees of trust, and can be defined on a multilevel or a continuous scale. The latter is an all-or-nothing proposition, which forces one to specify a single trust threshold above which full trust can be assumed. Binary trust, as a special case of gradual trust, is insufficient in general. It can be assumed only for very special and limited settings.
3. Explicit vs. implicit trust: Implicit trust is used by either ignorant or naïve interaction parties. For instance, a user who downloads a file from an unfamiliar Web site trusts it implicitly by not even considering trust in a conscious way. The consequences might include penetration by malware. Explicit trust allows for its clear specification, assuring that trust considerations are not ignored. Given X's need for determining trustworthiness of Y, only explicit trust allows for determination of the party that vouches for trustworthiness of Y, and assumes risks when this trust is breached. It could, but does not have to, be the case that Y vouches for its own trustworthiness (e.g., via its behavior in earlier interactions with X). Explicit trust might be gained offline. For instance, a person who decides to buy an Internet service from an Internet service provider (ISP) may build her trust offline by asking her friends for trustworthy ISPs.
4. Direct vs. indirect trust: Direct trust between X and Y, when X trusts Y, is limited to cases when X has gained a degree of trust in Y from previous interactions. (This may, but does not have to, mean that Y gained any degree of trust in X.) It is obvious that the domain of trust can be significantly extended by relying not only on direct trust but also on indirect trust. For indirect trust, X does not need to trust Y to be willing to interact with it. It is sufficient that X finds an intermediary Z such that X has a sufficient degree of trust in Z and Z trusts Y. (To be more precise, in this case X needs to trust to a sufficient degree in Z's recommendations about trustworthiness of Y.) Z becomes a trusted third party (TTP). A TTP can be any entity accepted by X; in particular, it can be an institution set up to provide indirect trust, also on a commercial basis. Examples of such institutions are certification bodies of all kinds, including providers of digital certificates.
5. Type of trusted entities: Should trust be lavished only on humans? We believe that the answer should be no. We trust our cars, refrigerators, cellphones, PDAs, or RFID tags in stores. As is the case with humans, this trust can be breached if the devices are disloyal, or loyal to parties other than their owners or primary users. Loyalty decides who the entrusted party works for. For example, sensors and recorders in a car can work not for the driver but for an insurer, a browser can work for a commercial advertiser, and a sensor network in one's house can be hijacked by a nosy neighbor or, in the worst case, by the Big Brother.
6. Number of trusted entities: The most critical distinction is between trusting somebody and trusting nobody. The latter leads to paranoid behaviors, with extremely negative consequences on system performance, including exorbitant costs. We believe that "you cannot trust everybody but you have to trust somebody." Trusting more partners improves performance as long as trust is not abused. Any breach of trust causes some performance penalties. An optimal number of trusted entities should be determined.
7. Responsibility for a breach of trust: If no TTP is involved, is the truster or the trustee responsible for deciding on the degree of trust required to offer or accept a service? As a consequence, is the truster or the trustee ultimately responsible for bearing the consequences of possible breaches of trust? In commercial relationships, most often a buyer determines whether the seller is trustworthy enough and then, at least once the warranty period is over, bears the possible costs of broken trust. There are, however, cases when it is the seller who pays for abuses by the buyer (as in the case when terrorists are not prevented from boarding a plane). If a TTP is involved in a trust relationship, it may be held responsible to the extent allowed by its legal obligations.


Caveats

A few words of caution are in order (Bhargava et al., 2004). First, using a trust model too complex for an application domain (i.e., including superfluous trust aspects) hurts flexibility or performance. Second, excessive demands for evidence or credentials result in laborious and uncomfortable trust-based interactions, while insufficient requirements make them too lax. (In the latter case, who wants to befriend someone who befriends crooks and thieves?) Third, exaggerating the need for explicit trust relationships hurts performance. For example, modules in a well-integrated (hence, closed) system should rely on implicit trust, just as villagers do. Also, in a crowd of entities, only some communicate directly, so only they need to use trust, and even they do not all need to use trust explicitly.

Privacy and Its Characteristics

Definition of Privacy

We define privacy as "the right of an entity (normally a person), acting in its own behalf, to determine the degree to which it will interact with its environment, including the degree to which the entity is willing to share information about itself with others" (Internet Security, 2007). We fully embrace the possibility, indicated by the words "an entity (normally a person)," to extend the scope of the notion of privacy from a person to an entity. The latter may be an organization, an artifact (software in particular), and so forth. The extension is consistent with the use of the notion of trust also in relationship to artifacts (The American, 2000), and with the common practice of anthropomorphization of intelligent system components (such as objects and agents) in computer science. The extension is useful for discussion of privacy not only for humans but also for artificial entities (acting, more or less directly, on behalf of humans).

Selected Privacy Characteristics

Privacy has three dimensions: (a) personal privacy of an entity, demanding protection of the entity against undue interference (such as physical searches) and against information that violates the moral sense of the entity; (b) territorial privacy, calling for protection of the area surrounding the entity, such as laws on trespassing; and (c) informational privacy, requiring protection of the gathering, compilation, and dissemination of information (Fischer-Hübner, 2001).

Any interaction involves exchange of data. It is hard to find any data that (at least in conjunction with other data, including offline data) does not carry any private information on its sender. Hence, informational privacy is endangered by each interaction involving release or dissemination of data.

The release of sensitive data can be controlled in various degrees: from none to full control. It can also be categorized as voluntary, pseudo-voluntary, or mandatory (incl. the case of information release as required by law). Pseudo-voluntary data dissemination is particularly deceitful since it appears to give a user the freedom to decline sharing his private information, but only at the cost of denying the user access to a desirable service. As a simple example, a person who refuses for privacy reasons (including fears of receiving more spam) to enter his e-mail address on a Web site can be denied the site's services. Quite often, in the name of a real need or just a convenience, the user is forced or pressured to provide private data. (This is a tradeoff between privacy and convenience that should be studied.)

The amount of privacy lost by disclosing a piece of information is affected by the identity of the recipients of this information, possible uses of this information, and related private information disclosed in the past. First, the recipients of private information include not only direct but also all indirect recipients, who receive some of this private information from entities other than the user. For example, a doctor, the direct recipient of private patient information, passes some of this information to an insurer, an indirect recipient. Any indirect recipient can disseminate information further. In our example, the insurer can pass some information to the user's employer. Second, possible uses of information vary from completely benevolent to the most malicious ones, with the latter including the most painful case of identity theft. Third, related private information disclosed in the past has a life of its own, like a genie let out of the bottle. At best, it is limited only by the controls that its owner was able to impose on its dissemination, for example, asking a company not to sell it to or share it with other businesses. At worst, it can be retrieved and combined with all pieces of information about the owner, destroying much of the owner's privacy.

Threats to Privacy

Threats to privacy can be classified into four categories (Fischer-Hübner, 2003):

1. Threats to privacy at the application level
2. Threats to privacy at the communication level
3. Threats to privacy at the system level
4. Threats to privacy in audit trails

In the first category, threats to privacy at the application level are due to collection and transmission of large quantities of sensitive data. Prominent examples of these types of threats are large projects for the information highway, including large peer-to-peer systems (Can, 2007) or implementations of applications for public administration networks, health networks, research networks, electronic commerce, teleworking, distance learning, and so forth. In the second category, threats to privacy at the communication level include risks to anonymity of communication, such as: (i) threats to anonymity of sender, forwarder, or receiver; (ii) threats to anonymity of service provider; and (iii) threats to privacy of communication (e.g., via monitoring, logging, and storage of transactional data). In the third category, threats to privacy at the system level are due to attacks on the system in order to gain access to its data. For example, attacks on access controls can allow the attacker to break into confidential databases. Finally, in the fourth category, threats to privacy in audit trails are due to the wealth of information included in system logs and audit trails. Special attention should be directed to logs and trails that gained an independent life, away from the system from which they were derived.

Another view of threats to privacy (Fischer-Hübner, 2003) categorizes the threats as:

1. Threats to aggregation and data mining
2. Threats due to poor system security
3. Government-related threats. They arise in part because the government holds a lot of people's most private data (incl. data on taxes, homeland security, etc.) and it is difficult to strike the right balance between people's privacy on the one hand and homeland security concerns on the other hand.
4. Threats due to use of the Internet, for example, intercepting of unencrypted e-mail, recording of visited Web sites, and attacks via the Internet.
5. Threats due to corporate rights and business practices. For instance, companies in the United States may collect data that even the federal government is not allowed to gather.
6. Threats due to the many traps of "privacy for sale," that is, temptations to sell out one's privacy. Too often, online offers that seem to be free are not really free, since they require providing the benefactor with one's private data. An example is providing one's data for a free frequent-buyer card.


Escalation of Threats to Privacy in Pervasive Computing
Pervasive computing exacerbates the privacy
problem (Bhargava et al., 2004). People will be
submerged in an ocean of zillions of computing
devices of all kinds, sizes, and aptitudes (Sensor
Nation, 2004). Most of them will have limited or
even rudimentary capabilities and will be quite
small, such as RFID tags and smart dust. Most
will be embedded in artifacts for everyday use,
or even within human bodies, with possibilities
for both beneficial and apocalyptic consequences.
Unless privacy is adequately protected, the progress of pervasive computing will be slowed down
or derailed altogether.
Pervasive devices with inherent communication capabilities might even self-organize into
huge, opportunistic sensor networks (Lilien,
Kamal, Bhuse, & Gupta, 2006; Lilien, Gupta, &
Yang, 2007) able to spy anywhere, anytime, on
everybody and everything within their midst.
Without proper means of detection and neutralization, no one will be able to tell which and how
many snoops are active, what data they collect,
and to whom they are loyal. Questions such as "Can I trust my refrigerator?" will not be jokes; the refrigerator will be able to snitch on its owner's dietary misbehavior to the owner's doctor.
We might ask these serious questions, notwithstanding their humorous appearance. Will
pervasive computing force us to abandon all hope
for privacy? Will a cyberfly, with high-resolution
camera eyes and supersensitive microphone ears,
end privacy as we know it?1 Should a cyberfly be
too clever to end up in the soup, the only hope
might be to develop cyberspiders. But cyberbirds
might eat those up. So, we will build a cybercat.
And so on and so forth.
Radically changed reality demands new approaches to computer security and privacy. We
believe that we should talk about a new privacy
category, namely, privacy of artificial entities.
We think that socially-based paradigms, such as trust-based approaches, will play a big role in pervasive computing. As in social settings, solutions will vary from heavyweight ones for entities of high intelligence and capabilities, such as humans and intelligent systems, interacting
in complex and important matters, to lightweight
ones for less intelligent and less capable entities
interacting in simpler matters of lesser consequence.

Interplay of Privacy and Trust


Privacy and trust can be in a symbiotic or in an
adversarial relationship. We concentrate here on
the latter, when users in interactions with businesses and institutions face tradeoffs between a
loss of their privacy and the corresponding gain of
trust by their partners. An example of a symbiotic
relationship is the situation when better privacy provided by a commercial Web site results in its customers' higher degree of trust.
Users entering an online interaction want to
gain a certain level of trust with the least loss
of their privacy. This is the level of trust that is
required by an interaction partner, for example,
a Web site, to provide a needed service, for example, an online purchase of a gift. The interaction
partner will ask a user for sensitive information
such as certain credentials, for example, the user's credit card number and other cardholder information. These credentials, when provided online, are indeed digital credentials, despite the fact that non-digital credentials, such as a plastic credit card, are their basis.
This simple scenario shows how privacy and
trust are intertwined. The digital credentials are
used to build trust, while providing the credentials
reduces the degree of privacy of their owner. It
should be noted that in a closed environment a user could receive a certain service while revealing much
less private information. For example, a student
can order free educational software just by logging into a password-protected account, without
any need for providing his credit card information.


Obviously, entering only one's login and password


reveals less sensitive data than providing one's
credit card information.
We cannot expect that privacy and trust are provided for free or traded for free, under any cost measures. Only in an ideal world would we never lose our privacy in any interaction, would
be fully trusted at the same time, and would be
provided these benefits at no cost. In reality, we can
only approach this optimum by providing minimal
privacy disclosures, ones that are absolutely
necessary to gain a level of trust required by the
interaction partners. The mechanisms providing
minimal privacy disclosures and trust carry costs,
including costs of computation, communication,
storage, and so forth.
It is obvious that gaining a higher level of trust
may require a larger loss of privacy. It should also
be obvious that revealing more sensitive information beyond a certain point will produce no more
trust gains, or at least, no more useful trust gains.
For example, a student wishing to enter a tavern
must show a proof of his age, exchanging a loss
of privacy for a trust gain. Showing his driver's license is entirely sufficient, and showing his
passport and birth certificate produces no more
trust gains.
It should also be obvious that for each required
level of trust we can determine (at least in theory)
the minimal loss of privacy required to produce
this level of trust. This means that users can (and usually want to) build a certain level of trust with this minimal loss of privacy. We want to automate the process of finding this optimal privacy-for-trust tradeoff, including automatic evaluation
of a privacy loss and a trust gain. To this end,
we must first provide appropriate measures of
privacy and trust, and then quantify the tradeoff
between privacy and trust. This quantification
will assist a user in deciding whether or not to
trade her privacy for the potential benefits gained
from trust establishment. A number of questions,
including the following, must be answered. How
much privacy is lost by disclosing a specific

piece of information? How much trust is gained


by disclosing given data? How much does a user
benefit by having a given trust gain? How much privacy is a user willing to sacrifice for a certain amount of trust gain? Only after answering these questions can we design algorithms and mechanisms that will assist users in making rational privacy-for-trust decisions. Proper mechanisms can empower a user's decision-making process, or
even automate it based on policies or preferences
predefined by the user.

Related Work

Related Work on Privacy
Many conferences and journals, not only in
the area of computer science or other technical
disciplines, focus on privacy. We can mention
only a few publications that affected our search
for the privacy-for-trust solution presented in this
chapter.
Reiter and Rubin (1999) use the size of the
anonymity set to measure the degree of anonymity for senders or receivers. The anonymity set
contains all the potential subjects that might have
sent or received data. The size of the anonymity
set does not capture the fact that not all senders
in the set have an equal probability of sending a
message. This may help the attacker in reducing
the size of the set of potential senders. Therefore,
the size of the anonymity set may be a misleading measure, showing a higher degree of privacy than actually exists.
Another approach (Diaz, Seys, Claessens, &
Preneel, 2003; Serjantov & Danezis, 2003) uses
entropy to measure the level of privacy that a
system achieves. Differential entropy is used by
Agrawal and Aggarwal (2001) to quantify the
closeness of an attribute value, as estimated by
an attacker, to its original value. These papers
assume a static model of an attacker, in the sense
that the attacker does not accumulate information
by watching the system over time.


The Scrub system (Sweeney, 1996) can be used to de-identify personal patient information. Privacy is ensured by filtering identifying information out of data exchanged between applications. The system searches through prescriptions, physician letters, and notes written by clinicians to replace information identifying patients, such as their names, phone numbers, and addresses, with generic data. A database of personally identifying information, such as first and last names, addresses, phones, social security numbers, employers, and birth dates, is used to detect the occurrences of such information. In addition, the system constructs templates for different information formats, for example, different formats for phone numbers and dates. These templates are used to detect variants of personal information.
Collecting pieces of information from different
sources and putting them together to reveal personal information is termed data fusion (Sweeney,
2001a). Data fusion is more and more invasive
due to the tremendous growth of information on
individuals being electronically gathered (Sweeney, 2001b). The Scrub system does not provide
a sufficient protection against data fusion, that
is, it does not assure complete anonymity. The
Datafly system (Sweeney, 1998; Sweeney, 2002b)
maintains anonymity even if data are linked with
other information sources. While maintaining a
practical use of data, Datafly automatically aggregates, substitutes, and removes information
to maintain data privacy. Datafly achieves data
privacy by employing the k-anonymity algorithm
(Sweeney, 2002b), which provides a formal guarantee that an individual cannot be distinguished from at least k - 1 other individuals.
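To make the k-anonymity guarantee concrete, the short Python sketch below (our own illustration, not the Datafly algorithm itself) checks whether a released table satisfies k-anonymity over a chosen set of quasi-identifier attributes; the sample records and attribute names are hypothetical.

from collections import Counter

def satisfies_k_anonymity(rows, quasi_identifiers, k):
    """True if every quasi-identifier combination appears in at least k rows."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

released = [
    {"zip": "490**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "490**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "441**", "age": "40-49", "diagnosis": "flu"},
]
# False: the ("441**", "40-49") group contains only one record, so k = 2 is not met.
print(satisfies_k_anonymity(released, ["zip", "age"], k=2))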
The Platform for Privacy Preferences (P3P) is the best-known protocol and tool suite for specifying privacy policies of a Web site and preferences of Web users (Cranor, 2003). P3P is not intended to be a comprehensive privacy solution that would address all principles of Fair Information Practices (Trade Commission, 1998). AT&T Privacy Bird is a prominent implementation of P3P (Privacy Bird, 2004). It is a tool that can be added to a Web browser to keep its users aware of the privacy policies of the visited Web sites.
We do not discuss here general security solutions that contribute to privacy protection. Examples include protecting software, mobile objects, or agents from many types of attacks by either: (i) running them only on dedicated and tamper-resistant platforms, for example, on secure coprocessors (Tygar & Yee, 1994); or (ii) providing security on commodity hardware, for example, partitioning a single hardware platform into many isolated virtual machines or "closed boxes" (Garfinkel, 2003). Examples also include protection of a software client (code) from a malicious host by obfuscating, tamper-proofing, or watermarking the code (Collberg & Thomborson, 2000).

Related Work on Trust


The problem of establishing and maintaining trust
in dynamic settings has attracted many researchers. One of the first formalized models of trust in
computer science (Marsh, 1994) introduced the
concepts widely used by other researchers, such
as context and situational trust.
A comprehensive social trust model, based on
surveying more than 60 papers across a wide range
of disciplines, has been proposed by McKnight
and Chervany (2001). It has been validated via an
empirical experimental study (McKnight, Choudhury, & Kacmar, 2002). The model defines five
conceptual trust elements: trusting behavior, trusting intention, trusting belief, institution-based
trust, and disposition to trust (cf. Cofta, 2006).
First, trusting behavior is an action that increases a truster's risk or makes the truster vulnerable to the
trustee. Second, trusting intention indicates that a
truster is willing to engage in trusting behaviors
with the trustee. A trusting intention implies a
trust decision and leads to trusting behaviors. Two
subtypes of trusting intention are: (i) willingness


to depend, that is, the volitional preparedness to


make oneself vulnerable to the trustee; and (ii)
subjective probability of depending, that is, the
likelihood that a truster will depend on a trustee.
Third, trusting belief is a truster's subjective belief
in the fact that a trustee has attributes beneficial to
the truster. The following are the four attributes
used most often: (i) competence: a trustee has the
ability or expertness to perform certain tasks; (ii)
benevolence: a trustee cares about a trusters interests; (iii) integrity: a trustee is honest and keeps
commitments; and (iv) predictability: a trustee's actions are sufficiently consistent, so future actions can be predicted based on the knowledge of
previous behavior. Fourth, institution-based trust
is the belief that proper structural conditions are
in place to enhance the probability of achieving
a successful outcome. Two subtypes of institution-based trust are: (i) structural assurance: the
belief that deployed structures promote positive
outcomes, where structures include guarantees,
regulations, promises and so forth; and (ii)
situational normality: the belief that the properly ordered environments facilitate successful
outcomes. Fifth, disposition to trust characterizes a trusters general propensity to depend on
others across a broad spectrum of situations. Two
subtypes of disposition to trust are: (i) faith in
humanity: the general assumptions about trustees' integrity, competence, and benevolence, that is,
a priori trusting beliefs; and (ii) trusting stance:
a preference for the default trust-based strategy
in relationships.
Zacharia and Maes (2000) proposed two reputation systems, SPORAS and HISTOS. Reputations in SPORAS are global, that is, a principal's
reputation is the same from the perspective of any
querier. HISTOS has the notion of personalized
reputation, that is, different queriers may get different reputation values about the same principal.
In addition to the reputation value, a reputation
deviation is provided to measure the reliability of
the value. Discarding a notorious identity, used
in many systems by dishonest parties to shed

their unfavorable reputations, is unprofitable in


SPORAS and HISTOS, because a newcomer starts
with the lowest reputation value. Carbo, Molina,
and Davila (2003) propose a trust management
approach using fuzzy reputation. The basic idea
is similar to that of SPORAS.
A distributed personalized reputation management approach for e-commerce is proposed by Yu and Singh (2002a, 2002b). The authors adopt ideas from the Dempster-Shafer theory of evidence to represent and evaluate reputation. If two principals a and b have direct interactions, b evaluates a's reputation based on the ratings of these interactions. This reputation is called a local belief. Otherwise, b queries a so-called TrustNet for other principals' local beliefs about a. The reputation of a is computed based on the gathered local beliefs using the Dempster-Shafer theory. How to build and maintain the TrustNet is not mentioned in the papers. Aberer and Despotovic (2001) simplify this model and apply it to manage trust in a P2P system.
Sabater and Sierra (2002) propose a reputation model for gregarious societies called the Regret system. The authors assume that a principal owns
a set of sociograms describing the social relations
in the environment. The Regret system structure
has three dimensions. The individual dimension
models the direct experience between two principals. The social dimension models the information coming from other principals. The ontology
dimension models how to combine reputations
on different aspects. Different reputations are
defined: witness reputation, neighborhood reputation, and system reputation. The performance of
this approach highly depends on the underlying
sociograms. The paper does not discuss how to
build sociograms.
A Bayesian analysis approach to model reputation and trust is used by Mui (2002) and Mui, Mohtashemi, and Halberstadt (2002). Many reputation models and security mechanisms assume the existence of a social network (Barnes & Cerrito, 1998). Pujol, Sangüesa, and Delgado (2002) propose an approach to extract reputation from the social network topology that encodes reputation information. Morinaga, Yamanishi, Tateishi, and Fukushima (2002) propose an approach to mining product reputations on the Web.

Related Work on Privacy-Trust Optimization
Yu, Winslett, and Seamons (2003) investigated
automated trust negotiation (ATN) considering
the issues of iteratively exchanging credentials
between two entities to incrementally establish
trust. This approach considers the tradeoff between the length of the negotiation, the amount
of information disclosed, and the computation
effort. The major difference between ATN and the
proposed research is that we focus on the tradeoff
between privacy and trust. Our research leads to
a method for estimating the privacy loss due to
disclosing a piece of information, and ways for
making rational decisions.
Wagealla, Carbone, English, Terzis, Lowe, and Nixon (2003) present a formal model for trust-based decision making. An approach is provided
to manage the trust lifecycle with considerations of both trust and risk assessments. This approach, and our research on trust and evidence formalization (Bhargava & Zhong, 2002; Zhong, 2005), can be
extended to use trustworthiness of an information receiver to decide whether or not to disclose
private information to him.
Seigneur and Jensen (2004) propose an approach to trade minimal privacy for the required
trust. Privacy is based on a multiple-to-one linkability of pieces of evidence to a pseudonym, and
is measured by nymity (Goldberg, 2000). The
authors assume the presence of a partial order of
nymity levels for the measurement of privacy. Our
research approach employs multiple-to-multiple
relationships between pieces of evidence and
private attributes.

TRADING PRIVACY FOR TRUST

Problems in Trading Privacy for Trust

To gain trust, a user must reveal private digital credentials: certificates, recommendations, or past interaction histories. She is faced with a number of tough questions:

• How much privacy is lost by disclosing a specific credential? To make the answer even more difficult, the amount of privacy loss is affected by credentials and information disclosed in the past.
• How many credentials should a user reveal? If alternative credentials are available (e.g., both a driver's license and a passport indicate birth data), which one or ones should be revealed?
• What is the trust gain obtained by disclosing a given credential? Also, what is the minimal degree of privacy that must be sacrificed to obtain a required trust gain? Which credentials should be presented to satisfy this minimum requirement?
• How much does a user benefit by having a given trust gain?
• How much privacy is a user willing to sacrifice for a certain amount of trust gain?

These questions alone show how complex and difficult the optimization of the privacy-for-trust exchange is. Obtaining an optimal solution without technical support is practically impossible. There is only a small chance that intuitive approaches to this process would result in outcomes close to the optimal results.
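As a rough illustration of the kind of technical support called for here, the brute-force Python sketch below searches for the subset of credentials that reaches a required trust gain at the smallest total privacy loss. It is our own sketch, not the chapter's mechanism: the credential names, the per-credential loss and gain numbers, and the assumption that losses and gains add up are all hypothetical simplifications.

from itertools import combinations

# Hypothetical per-credential (privacy_loss, trust_gain) estimates.
CREDENTIALS = {
    "login": (1.0, 10.0),
    "email": (2.0, 15.0),
    "driver_license": (5.0, 40.0),
    "credit_card": (8.0, 60.0),
    "passport": (9.0, 45.0),
}

def cheapest_disclosure(required_trust_gain):
    """Return (privacy_loss, credentials) meeting the trust requirement at minimal loss."""
    best = None
    names = list(CREDENTIALS)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            loss = sum(CREDENTIALS[c][0] for c in subset)
            gain = sum(CREDENTIALS[c][1] for c in subset)
            if gain >= required_trust_gain and (best is None or loss < best[0]):
                best = (loss, subset)
    return best

print(cheapest_disclosure(50.0))  # (6.0, ('login', 'driver_license')) with these numbers

A practical tool would also have to model how credentials already disclosed in the past change the privacy loss of each new disclosure, which this toy search ignores.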

A Solution for Trading Privacy for Trust

This section presents our proposed solution facilitating privacy-for-trust trading and enabling

optimal outcomes of this process. It discusses


proposed approaches for building and verifying
trust, protecting privacy, and trading privacy
for trust.

Building and Verifying Trust


We focus on methods of building trust in open and dynamic computing environments, which are more challenging than closed and static settings. Digital credentials are common means of
building trust in open environments. Credentials
include certificates, recommendations, or past
transaction histories (Farrell & Housley, 2002;
Fujimura & Nishihara, 2003). Since credentials
contain private information, presenting them in
hopes of gaining trust means privacy losses. We
need to consider problems with credentials, including their imperfect and non-uniform trustworthiness. Since no credentials are perfect, means to
verify trust gained by showing them are necessary.
We present basic ways of verifying trust.

A. Trust Metrics
Trust cannot be built or verified without having
measures of trust or trust gain. We propose a
three-step method for defining a trust gain metric.
In the first step, we determine multilevel trust
metrics with n trust levels, measured on a numeric
scale from 1 to n, where n could be an arbitrarily
large number. Such a metric is generic, applicable to
a broad range of applications, with the value of n
determined for a particular application or a set of
applications. The case of n = 2 reduces multilevel
trust to the simplistic case of binary trust (it might
still be useful in simple trust-based applications),
with trust levels named, perhaps, full_service and
no_service. Selecting, for example, n = 5 results
in having five trust levels that could be named:
no_service, minimal_service, limited_service,
full_service, and privileged_service, going from
the lowest to the highest level.

Trust levels could be defined by a service


provider, the owner of a Web site on which it
resides (who might be the same or different from
the service provider), or any other entity that is
an intermediary between the service provider and
the customer. The number of levels n could be increased when the site outgrows its old trust metric,
or when the user becomes more sophisticated and
needs or wants to use more trust levels.
In the second step, a trust benefit function
B(ti), associated with each trust level ti, needs to be
defined. The default trust benefit function for a
service can be defined by the same party that
defined trust levels in the preceding step. An
optional trust benefit function, overriding the
default one, can also be defined by an individual
customer, allowing for a more user-specific benefit metric.
In the third step, trust gain, denoted by G(t2,
t1), can be calculated based on the benefit function. G(t2, t1) indicates how much a user gains
if the user's trust level, as seen by the user's
interaction partner, increases from t1 to t2. The
following simple formula is used to compute the
trust gain:
trust_gain = G(new_trust_level, old_trust_level) =
B(new_trust_level) - B(old_trust_level)
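To make the three-step construction concrete, here is a minimal Python sketch. The five level names follow the example given above, but the benefit values are hypothetical placeholders that a service provider or customer would define for a particular application.

# Step 1: a multilevel trust metric with n = 5 named levels (1 = lowest).
TRUST_LEVELS = {
    "no_service": 1,
    "minimal_service": 2,
    "limited_service": 3,
    "full_service": 4,
    "privileged_service": 5,
}

# Step 2: a default trust benefit function B(t_i); a customer may override it.
DEFAULT_BENEFIT = {1: 0.0, 2: 10.0, 3: 25.0, 4: 60.0, 5: 100.0}

def trust_gain(new_level: int, old_level: int, benefit=DEFAULT_BENEFIT) -> float:
    """Step 3: G(t2, t1) = B(t2) - B(t1)."""
    return benefit[new_level] - benefit[old_level]

# Example: moving from limited_service to full_service yields a gain of 35.0.
print(trust_gain(TRUST_LEVELS["full_service"], TRUST_LEVELS["limited_service"]))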

B. Methods for Building Trust


Some of many generic means of building trust are
listed in Table 2. They include familiarity with
the entity to be trusted or its affiliation with a
familiar entity, as well as building trust by first-hand experience or second-hand reputation.
Rather than looking at ways of building trust
in general, we differentiate them depending on
the relative strengths of the interacting parties.
The strength of a party P1 participating in an
interaction with another party, P2, is defined by
P1's capability to demand private information from P2, and by P1's means available in case P2 refuses to comply. As a simple example,


Table 2. Basic means of building trust among partners

Building trust by familiarity with X
  - Person: face, voice, handwriting, and so forth.
  - Institution: company name, image, good will, and so forth.
  - Artifact: manufacturer name, perceived quality, and so forth.
Building trust by affiliation of X with person/institution/artifact Y
  - Trust or distrust towards Y rubs off on X.
Building trust by first-hand experience with X's activities/performance
  - Good or bad experience (trust or distrust grows).
Building trust by second-hand reputation of X determined by evidence or credentials
  - Reputation databases (e.g., BBB, industry organizations, etc.) with good evidence or a lack of bad evidence.
  - Credentials: driver's license, library card, credit card.

a bank is stronger than a customer requesting a mortgage loan. As another example, two small businesses negotiating a contract are, in most cases, equally strong.
We concentrate on different-strength trust relationships, in which one party is stronger and the other weaker, for example, trust relationships between individuals and institutions, or between small businesses and large businesses. We ignore trust relationships with same-strength partners, such as individual-to-individual interactions and most B2B interactions. We will interchangeably use the terms "a weaker partner" and "a customer," as well as "a stronger partner" and "a company."
Example means of building trust by a company in a customer include receiving a cash payment for a service provided, or checking the partner's records in the eBay reputation databases. Example means of building trust by a customer in a company include asking friends about the company's reputation, or checking its reputation in Better Business Bureau databases.
Multiple means of building trust by a stronger partner in the weaker partner are shown in Table 3. They can assist a company in a fight against fraud attempts by its dishonest customers. All these means can be divided into means preserving privacy of the weaker partner, and means not preserving privacy. Only the first item listed in Table 3 (Ask partner for an anonymous payment for goods or services) belongs to the privacy-preserving means, by virtue of preserving the customer's anonymity. All others compromise the customer's privacy and result in disclosing private information. This indicates that, much more often than not, successful interactions with a stronger party require that a weaker party trade its privacy loss for a trust gain required by this stronger principal.
There are also multiple means of building trust
by a weaker partner in the stronger partner, with
some of them shown in Table 4. All these means
can assist a customer in a fight against fraud attempts by a company. It is clear that the customer's
major weapon is information about the company
and its reputation.

C. Methods for Verifying Trust


Since no credentials are perfect, means to verify
trust are necessary. This is as true in computing
as in social life.2 The basic ways of verifying trust
are shown in Table 5.
Verification must be careful, not based on mere
appearances of trustworthiness, which could be
easily exploited by fraudsters. Cyberspace can
facilitate more careful verification than is the case
in the offline world, in which such verification
might be too costly or too inconvenient. Quite


Table 3. Means of building trust by a stronger partner in her weaker partner

Ask partner for an anonymous payment for goods or services
  - Cash / digital cash / other
---------- above: privacy-preserving; below: privacy-revealing ----------
Ask partner for a non-anonymous payment for goods or services
  - Credit card / traveler's checks / other
Ask partner for specific private information
  - Check partner's credit history
Computer authorization subsystem observes partner's behavior
  - Trustworthy or not, stable or not, ...
  - Problem: needs time for a fair judgment
Computerized trading system checks partner's records in reputation databases
  - eBay, PayPal, ...
Computer system verifies partner's digital credentials
  - Passwords, magnetic and chip cards, biometrics, ...
Business protects itself against partner's misbehavior
  - Trusted third party, security deposit, prepayment, buying insurance, ...

often a business order sent from Company A to


Company B is processed without verification. The
reasons, in addition to costs and convenience,
include the following factors: (i) implicit trust
prevails in business; (ii) risk of fraud is low among
reputable businesses; and (iii) Company B might
be insured against being cheated by its business
partners, that is, a trusted third-party intermediary might assume transaction risk (for example, a buyer's bank could guarantee a transaction).

Protecting Privacy
Protecting privacy requires defining privacy
metrics as a prerequisite. Privacy measures are
discussed first, and methods for protecting privacy,
relying on metrics, are presented next.

A. Privacy Metrics
We cannot protect privacy if we do not know
how to measure it. This indicates the importance
of privacy metrics. More specifically, we need
a privacy metric to determine what degree of
data and communication privacy is provided by
given protection methods. The metric has to work

in any existing or future combination of users,


techniques, and systems. It has to support or deny
claims made by any such combination that a certain level of privacy will be maintained by it.
This gives rise to at least two heterogeneity-related challenges. First, different privacy-preserving techniques or systems claim different
degrees of data privacy. These claims are usually
verified using ad hoc methods customized for each
technique and system. While this approach can
indicate the privacy level for each technique or
system, it does not allow comparisons of diverse
techniques or systems.
Second, privacy metrics themselves are usually ad hoc and customized for a user model and
for a specific technique or system.
Requirements for good privacy metrics call
for unified and comprehensive privacy measures
to provide quantitative assessments of degrees
of privacy achieved by a broad range of privacy-preserving techniques. A good privacy metric has
to compare different techniques/systems confidently. It also has to account for: (i) operation of
a broad range of privacy-preserving techniques;
(ii) dynamics of legitimate users, such as how users interact with the system and awareness that


Table 4. Means of building trust by a weaker partner in his stronger partner

Ask around
  - Family, friends, co-workers, ...
Check partner's history and stated philosophy
  - Accomplishments, failures and associated recoveries, ...
  - Mission, goals, policies (incl. privacy policies), ...
Observe partner's behavior
  - Trustworthy or not, stable or not, ...
  - Problem: needs time for a fair judgment
Check reputation databases
  - Better Business Bureau, consumer advocacy groups, ...
Verify partner's credentials
  - Certificates and awards, memberships in trust-building organizations (e.g., BBB), ...
Protect yourself against partner's misbehavior
  - Trusted third party, security deposit, prepayment, buying insurance, ...

repeated patterns of data access can leak information to a violator; (iii) dynamics of violators, such as how much information a violator may gain by watching the system for some time; and (iv) costs associated with a metric implementation, such as injected traffic, delays, CPU cycles, and storage use.
We proposed two metrics for assessing the
privacy achieved by a given system: an anonymity
set size metric and an information-theoretic metric, also known as an entropy-based metric. The first
metric can provide a quick estimate of privacy,
while the second gives a more detailed insight into
the privacy aspects of the system it measures.

Table 5. Basic ways of verifying trust toward entity X

Verify own experience with X
  - Check own notes about X's activities / performance
Verify reputation evidence / credentials for X
  - Call back to verify phone number
  - Check online user feedback about quality of an artifact
  - Check reputation database (e.g., consumer reports, BBB)
Verify affiliation of X
  - Check with employer if X is still employed
  - Check reputation of Y with which X is affiliated


A.1. Effective Anonymity Set Size Metric: Since anonymity is defined as "the state of being indistinguishable within a set of subjects" (Pfitzmann & Köhntopp, 2000), we can use the size of the anonymity set as a privacy metric. The basic idea is that of "hiding in a crowd." As illustrated in Figure 1, hiding among n entities provides more privacy than hiding among 4 entities (for n >> 4). Clearly, the larger the set of indistinguishable entities, the lower the probability of identifying any particular one. This approach can be generalized to anonymize not only identities of entities but also the values of their attributes: a selected attribute value is hidden within the domain of all its possible values.
We need to present this metric in more detail. The set of subjects, or values, known as the anonymity set, is denoted by A. Using the size of the anonymity set directly may indicate stronger privacy than actually exists. The probability distribution that the violator can assign to individual subjects of the set should be considered. To illustrate this problem, consider a system claiming that a subject receiving data cannot be distinguished from |A| other subjects belonging to the anonymity set A. Suppose that a violator


Figure 1. Hiding in a crowd underlies the metrics based on the anonymity set size (hiding among 4 entities: less anonymous, 1/4; hiding among n entities: more anonymous, 1/n)

has noticed that half of the nodes in A rarely receive messages. Then, he assigns to these nodes a very low probability of receiving a data item. The violator has effectively reduced the anonymity set size to |A|/2. To counter this problem, we define the anonymity set as A = {(s1, p1), (s2, p2), ..., (sn, pn)}, where pi represents the probability assigned to subject si. Thus, we can determine the effective anonymity set size as:
L = |A| Σ_{i=1..|A|} min(pi, 1/|A|)     (1)

Note that the maximum value for L is |A|. L equals |A| when all entities in A are equally likely to access data, that is, pi = 1/|A|, 1 ≤ i ≤ n. Equation (1) captures the fact that the anonymity set size is effectively reduced when the probability distribution is skewed, that is, when some entities have a higher probability of accessing data than the others.
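A minimal Python sketch of Equation (1), assuming the violator's probability assignment over the anonymity set is given as a list (the example distributions are hypothetical):

def effective_anonymity_set_size(probabilities):
    """Equation (1): L = |A| * sum_i min(p_i, 1/|A|)."""
    n = len(probabilities)
    return n * sum(min(p, 1.0 / n) for p in probabilities)

# Uniform distribution over 4 subjects: maximum anonymity, L = |A| = 4.
print(effective_anonymity_set_size([0.25] * 4))                  # 4.0
# Skewed distribution (two subjects almost never access data): L shrinks to about |A|/2.
print(effective_anonymity_set_size([0.49, 0.49, 0.01, 0.01]))    # 2.08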

A.2. Information-theoretic (entropy-based) metric: Entropy measures the randomness in a system, and therefore, it measures the uncertainty that one has about that system (Cover & Thomas, 1991). Building on this notion, we propose to use entropy to measure the level of privacy that a system achieves at a given moment. The idea is that when an attacker gains more information about the system, the uncertainty about subjects that send or receive data, and thus their entropy, is decreased. By comparing a current entropy value with the maximum possible entropy value, we can learn how much information the attacker has gained about the system. Therefore, the privacy of a system can be measured based on how much of its private information was revealed.
a.) Entropy calculation example: Privacy loss D(A, t) at time t, when a subset of attribute values A might have been disclosed, is given by:

D(A, t) = H*(A) - H(A, t)

where H*(A) is the maximum entropy (computed when the probability distribution of the pi's is uniform), and H(A, t) is the entropy at time t, given by:

H(A, t) = Σ_{j=1..|A|} wj (-Σ_i pi log2 pi)     (2)

with wj denoting weights that capture the relative privacy value of the attributes.

Consider a private phone number: (a1 a2 a3) a4 a5 a6 a7 a8 a9 a10, where the first three digits constitute the area code. Assume that each digit is stored as a value of a separate attribute. Assume further that the range of values for each attribute is [0-9], and that all attributes are equally important, that is, for each j in [1-10], wj = 1.
The maximum entropy exists when an attacker has no information about the probability distribution of the values of the attributes. In such a case, the attacker must assign a uniform probability distribution to the attribute values. Thus, aj = i with pi = 0.1 for each j and for each i, and we get:

H*(A) = Σ_{j=1..10} wj (-Σ_{i=0..9} 0.1 log2 0.1) = 33.3

Suppose that after time t, the attacker can figure out the state in which the phone number is located. This may allow the attacker to learn the three leftmost digits (at least for states with a single area code). Entropy at time t is given by:

H(A, t) = 0 + Σ_{j=4..10} wj (-Σ_{i=0..9} 0.1 log2 0.1) = 23.3

Note that attributes a1, a2, and a3 contribute 0 to the entropy value because the attacker knows their correct values. Information loss at time t is:

D(A, t) = H*(A) - H(A, t) = 10.0
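The weighted-entropy computation of Equation (2) and the phone-number example can be reproduced with a short Python sketch of ours; the printed values match the figures above up to rounding (exact evaluation gives about 33.2 rather than 33.3 for the maximum entropy).

import math

def weighted_entropy(attribute_distributions, weights=None):
    """Equation (2): H(A, t) = sum_j w_j * ( -sum_i p_i log2 p_i )."""
    if weights is None:
        weights = [1.0] * len(attribute_distributions)
    return sum(w * -sum(p * math.log2(p) for p in dist if p > 0)
               for w, dist in zip(weights, attribute_distributions))

uniform_digit = [0.1] * 10    # attacker knows nothing about this digit
known_digit = [1.0]           # attacker knows this digit exactly

h_max = weighted_entropy([uniform_digit] * 10)                      # all 10 digits unknown
h_t = weighted_entropy([known_digit] * 3 + [uniform_digit] * 7)     # area code leaked
print(round(h_max, 1), round(h_t, 1), round(h_max - h_t, 1))        # 33.2 23.3 10.0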
b.) Decrease of system entropy with attribute disclosures: Decrease of system entropy with attribute disclosures is illustrated in Figure 2. The lighter circles indicate the size of the attribute set for private data under consideration, the smaller darker circles within them indicate the sizes of subsets of disclosed attributes, the vertical lines to the left of the lighter circles indicate the maximum entropy H*, and the vertical bars to the left of the lighter circles (superimposed on the H* lines) indicate the current entropy level. Let us first consider cases (a)-(c), in which we assume a fixed size of private data. This fixed size of private data explains why the lighter circles in these three cases have the same size. When the entropy of data falls to a certain higher threshold value H2, as shown in case (b), a controlled data distortion method, increasing the entropy of these data, must be invoked to protect their privacy. Examples of distortion mechanisms include generalization and suppression (Sweeney, 2002a) or data evaporation (Lilien & Bhargava, 2006). When the entropy of data drops below a certain lower threshold level H1, as shown in case (c), data destruction must be triggered to destroy all private data. Data destruction is the ultimate way of preventing their disclosure.
Let us add a bit more detail to this example. Consider private data that consist of three attributes: a name, a social security number, and a zip code. Each attribute has a domain of values. The owner of the private data first computes the maximum entropy H* for all three attributes. The owner also determines the two entropy values mentioned above: the higher value H2 (the threshold for triggering controlled data distortions), and the lower value H1 (the threshold for triggering data destruction). Each time private data is shared or transferred from one entity to another, the new value of entropy, Hnew, is calculated using Equation (2). If Hnew stays above H2, no privacy-preserving actions are needed. If Hnew drops below H2 but stays above H1 (H2 > Hnew > H1), a controlled data distortion method is invoked to increase data entropy. Finally, if Hnew drops below H1 (H1 > Hnew), data destruction must be invoked to protect data privacy.
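The threshold logic just described can be sketched in a few lines of Python; the function name, action labels, and example numbers are placeholders of our own, not part of a published implementation.

def privacy_action(h_new: float, h2: float, h1: float) -> str:
    """Decide what to do after private data is shared, based on its new entropy.

    h2: higher threshold, controlled data distortion is triggered below it
    h1: lower threshold, data destruction is triggered below it (h2 > h1)
    """
    if h_new >= h2:
        return "no_action"    # enough uncertainty remains
    if h_new >= h1:
        return "distort"      # e.g., generalization, suppression, evaporation
    return "destroy"          # destroy all private data

# Hypothetical entropy values against thresholds H2 = 20.0 and H1 = 10.0.
print(privacy_action(25.0, h2=20.0, h1=10.0))  # no_action
print(privacy_action(15.0, h2=20.0, h1=10.0))  # distort
print(privacy_action(5.0,  h2=20.0, h1=10.0))  # destroy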
The example assumed that the size of the
private data attribute set is fixed. The entropy
metric can be extended to cases when the private
data attribute set is allowed to grow or shrink.
Case (d) in Figure 2, when compared with case
(c), illustrates the situation in which the private data attribute set grew. This growth is indicated by the
larger diameter of the lighter circle, indicating a
larger attribute set for sensitive data. The sizes
of subsets of disclosed attributes, indicated by
the darker circles, are identical in cases (d) and
(c); do not be fooled by the optical illusion that
the darker circle in (d) is smaller than the darker
circle in (c). As a result, entropy for case (d) is
higher than for case (c), as indicated by a higher
vertical bar for case (d). This utilizes the principle
of "hiding in the (larger) crowd."
Entropy can be increased not only by increasing the size of the private data attribute set, as
just demonstrated, but also by making its subset
of disclosed attributes less valuable. For example,


Figure 2. Dynamics of entropy

suppose that a bank releases the current account


balance of a customer to an insurer. This balance
is valid for a specific period of time. After this
period, the value of knowing this private piece
of information decreases, because the customer
could have changed her balance. In computing the
new value of entropy, the balance is assumed to
be private again. This leads to a gradual increase
in entropy. In another example, a bank can increase entropy rapidly: to make stolen credit card
numbers useless, it quickly changes credit card
numbers for all compromised accounts.

B. Methods for Protecting Privacy


Privacy controls for sensitive data are necessary. Without them, many interaction opportunities are lost. Examples are patients' symptoms hidden from doctors, abandoned business transactions, lost research collaborations, and rejected social contacts.
Privacy can be supported by technical or legal controls. Examples of legal controls are the EPA privacy act (Privacy Act, 2004) and the HIPAA Privacy Rule (Summary HIPAA, 2003; Mercuri, 2004), intended to protect the privacy of individuals. Yet, there are many examples of privacy violations even by federal agencies. The sharing of travelers' data by JetBlue Airways with the federal government was one such incident (Privacy Act, 2004).
Technical controls for facilitating or enabling
privacy controls must complement legal controls.

Such privacy protection mechanisms should


empower users (peers, nodes, etc.) to protect
user identity, privacy of user location and movement, as well as privacy in collaborations, data
warehousing, and data dissemination. Each party that obtained another party's sensitive data through an interaction must protect the privacy of these data. Forwarding them to other entities without proper privacy guarantees could endanger the partner's privacy.
Both the stronger and the weaker party should be assisted with technical solutions. On the one hand, the responsibility of the stronger partner for protecting privacy is larger. The reason is that the stronger partner obtains more sensitive data about her counterpart than a weaker partner does. In many cases, the stronger partner might be the only party that obtains private data. On the other hand, the weaker partner should not rely entirely on the integrity of the stronger counterpart. He needs mechanisms to protect sensitive data released by him even, or especially, when they are out of his hands.
This means that at least the following two
kinds of mechanisms are needed. The first
one must assist in minimizing the amount of
private information that is disclosed. A system
for privacy-for-trust exchange, presented in this
chapter, is an example of a mechanism of this
kind. Mechanisms of the second kind provide
protection for further dissemination of sensitive
data that are already disclosed, setting clear and
well-defined limits for such dissemination. They
assure that data disclosed to a stronger party are
not freely disseminated by her to other entities.
For example, a mechanism of this kind assures
that only some of the data revealed by a patient to his
doctor are forwarded by the doctor to an insurer
or a nurse, and that the most sensitive data are never
forwarded to anybody.3 An example of this kind
is the solution named P2D2 (privacy-preserving
data dissemination) (Lilien & Bhargava, 2006),
which enables control of further dissemination
of sensitive data by integrating privacy protection
mechanisms with the data they guard. P2D2
relies on the ideas of: (i) bundling sensitive data
with metadata specifying privacy preferences
and policies; (ii) an apoptosis, that is, a clean
self-destruction (Tschudin, 1999), of endangered
bundles; and (iii) an adaptive evaporation of
bundles in suspect environments.
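
The toy Python sketch below is not the published P2D2 design; it only illustrates, under deliberately simplified and hypothetical assumptions, the three ideas just listed: bundling data with privacy metadata, apoptosis of an endangered bundle, and evaporation in a suspect environment. All field, host, and policy names are invented for the example.

class Bundle:
    # Toy illustration of a P2D2-style bundle: sensitive data travel
    # together with metadata describing privacy preferences and policies.

    def __init__(self, data, policy):
        self.data = dict(data)        # sensitive attributes (hypothetical)
        self.policy = policy          # e.g., {"suspect_hosts": {...}}

    def apoptosis(self):
        # Clean self-destruction when the bundle is endangered.
        self.data.clear()

    def evaporate(self, fraction):
        # Adaptive evaporation: drop a fraction of the attributes, most
        # sensitive first (sensitivity order is assumed to be the
        # insertion order here, purely for illustration).
        to_drop = round(len(self.data) * fraction)
        for key in list(self.data)[:to_drop]:
            del self.data[key]

    def visit(self, host):
        # Apply the bundled policy when the bundle reaches a new host.
        if host in self.policy.get("forbidden_hosts", set()):
            self.apoptosis()
        elif host in self.policy.get("suspect_hosts", set()):
            self.evaporate(0.5)

bundle = Bundle({"ssn": "...", "diagnosis": "...", "zip": "..."},
                {"suspect_hosts": {"ad-server"}, "forbidden_hosts": {"unknown"}})
bundle.visit("ad-server")   # some attributes evaporate
bundle.visit("unknown")     # the rest self-destruct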
B.1. Technical privacy controls: Technical privacy controls, also known as privacy-enhancing
technologies (PETs), fall into the following categories (Fischer-Hübner, 2001):

Protecting user identities
Protecting usee identities
Protecting confidentiality and integrity of personal data

We take a look at each of these categories of technologies in turn.


a. Protecting user identities (Fischer-Hübner,
2001): User identities can be protected via
anonymity, unobservability, unlinkability, and
pseudonymity of users. First, anonymity ensures
that a user may use a resource or service without
disclosing the user's identity (Class FPR, 1999).
The special cases of anonymity are sender
and receiver anonymity, ensuring that a user is
anonymous in the role of a sender or a receiver,
respectively, of a message.
We can define the following six degrees of
sender anonymity, from the one fully protecting
the sender to the one exposing the sender
(Fischer-Hübner, 2001): (i) absolute privacy,
when the sender enjoys full privacy with respect
to being considered as the sender of a message;
(ii) beyond suspicion, when the sender appears
to be no more likely to be the originator of a
message than any other potential sender in the
system; (iii) probable innocence, when the sender
appears to be no more likely to be the originator
of a message than not to be its originator; (iv)
possible innocence, when there is a nontrivial
probability that the real sender is someone else;
(v) exposed, when the sender is highly likely to
be the originator of a message; and (vi) provably
exposed, when the sender is identified beyond any
doubt as the originator of a message.
Second, unobservability ensures that a user
may use a resource or service without others being able to observe that the resource or service is
being used (Class FPR, 1999). Third, unlinkability
ensures that a user may make use of resources and
services without others being able to link these
uses together (ibid). Its special case is unlinkability
of a sender and a recipient, when a sender and a
recipient cannot be identified as communicating
with each other. Fourth, pseudonymity ensures
that a user acting under a pseudonym may use a
resource or service without disclosing his identity
(ibid).
b. Protecting usee identities (Fischer-Hübner,
2001): In this case, the protected entity is the usee,
that is, the subject described by the data. Usee identities can be protected, for example, via depersonalization, providing anonymity or pseudonymity
of data subjects.
Depersonalization (anonymization) of data
subjects can be classified as perfect depersonalization, when data are rendered anonymous in
such a way that the usee (data subject) is no longer
identifiable, or practical depersonalization,
when personal data are modified in such a way that
information concerning personal or material circumstances can either no longer be attributed to
an identified or identifiable individual, or can be
so attributed only with a disproportionate amount
of time, expense, and labor. Attackers attempt to
circumvent depersonalization by reidentification
attacks.
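
As a concrete illustration of practical depersonalization, the following Python sketch generalizes quasi-identifiers in a single (hypothetical) record; the field names and generalization rules are invented for the example and do not constitute a complete anonymization scheme.

def depersonalize(record):
    # Practical depersonalization of a single record by generalizing
    # quasi-identifiers (hypothetical fields and rules, for illustration).
    anonymized = dict(record)
    anonymized.pop("name", None)                  # drop the direct identifier
    anonymized["zip"] = record["zip"][:3] + "**"  # coarsen the location
    decade = (record["age"] // 10) * 10
    anonymized["age"] = f"{decade}-{decade + 9}"  # 10-year age band
    return anonymized

record = {"name": "A. Smith", "zip": "49008", "age": 37, "diagnosis": "flu"}
print(depersonalize(record))
# {'zip': '490**', 'age': '30-39', 'diagnosis': 'flu'}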
c. Protecting confidentiality and integrity of
personal data (Fischer-Hübner, 2001): Confidentiality and integrity of personal data can be
protected by a number of methods and technologies, including privacy-enhanced identity
management, access controls, enterprise
privacy policies, making data inaccessible with
cryptography or steganography (e.g., hiding a
message in an image), and utilizing specific tools
(such as the Platform for Privacy Preferences, or P3P;
Marchiori, 2002).
B.2. Legal privacy controls: For completeness
of our presentation, we will take a look, albeit a
quick one, at legal privacy controls.
Despite the fact that definitions of privacy vary
according to context and environment, the belief
that privacy is a fundamental human right, even
one of the most important rights of the modern
age, is internationally recognized. At a minimum,
individual countries protect inviolability of the
home and secrecy of communications (Green,
2003).
There are two types of privacy laws in various
countries (ibid): comprehensive laws and sectoral
laws. Comprehensive laws are general laws that
govern the collection, use and dissemination
of personal information by public and private
sectors. They are enforced by commissioners or
independent enforcement bodies. The disadvantages of this approach include a lack of resources
for oversight and enforcement agencies, as well
as governmental control over these agencies.
Comprehensive privacy laws are used in Australia, Canada, the European Union, and the United
Kingdom.
Sectoral laws focus on specific sectors and
avoid general laws. They benefit from being
able to use a variety of specialized enforcement
mechanisms, selecting the ones best suited for
the sector they apply to. Their disadvantage is the
need for new legislation for each new sectoral
technology. Sectoral privacy laws are used in the
United States.
Disparate national privacy laws require international agreements to bridge different privacy
approaches. An example is the Safe Harbor Agreement, reached in July 2000 between the United
States and the European Union (Welcome Safe,
2007). It has been criticized by privacy advocates
and consumer groups in both the United States
and the European Union for inadequate enforcement
and for relying too much on mere promises of participating companies.

Trading Privacy for Trust


An interacting entity can choose to trade its
privacy for a corresponding gain in its partner's
trust in it (Zhong & Bhargava, 2004). We believe
that the scope of a privacy disclosure should be
proportional to the benefit expected by the entity
that discloses its private information. For example,
a customer applying for a mortgage must, and
is willing to, reveal much more personal data
than a customer buying a book.
Having measures of trust and privacy defined
above allows for precise observation of these two
quantities, and precise answers to questions such
as: (i) What degree of privacy is lost by disclosing given data? (ii) How much trust is gained
by disclosing given data? and (iii) What degree
of privacy must be sacrificed to obtain a certain
amount of trust gain?

A. Same-Strength and
Different-Strength Trust
As defined earlier, the strength of a party participating in the relationship is defined by the
party's capability to demand private information
from the other party, and by the means available in
case the other party refuses to comply. As a
simple example, a bank is stronger than a customer
requesting a mortgage loan. As mentioned, trust
relationships can be divided into same-strength
or different-strength.

B. Same-Strength and
Different-Strength Privacy-for-Trust
Negotiations
To realize a privacy-for-trust tradeoff, two
interacting parties, P1 and P2, must negotiate
how much privacy needs to be revealed for trust.
We categorize such negotiations as either: (1)
same-strength, when both parties are of similar
strength; or (2) different-strength, when one
party's position is stronger vis-à-vis the other's. In
turn, same-strength privacy-for-trust negotiations
can be either: (1a) privacy-revealing negotiations,
in which parties disclose their certificates or policies; or (1b) privacy-preserving negotiations, in
which parties preserve privacy of their certificates
and policies.
We compare all three kinds of privacy-for-trust
negotiations, that is, (1a), (1b), and (2), in terms
of their behavior during the negotiations. This
behavior includes defining the trust level necessary
to enter negotiations, the growth of the trust level during
negotiations, and the final trust level sufficient
for getting a service.
Same-strength negotiations are very popular
in the research literature. Different-strength
negotiations, to the best of our knowledge, have
been defined explicitly by us.
B.1. Trust growth in same-strength trust
negotiations: Same-strength trust negotiations
involve partners of similar strength.
a. Trust growth in privacy-revealing same-strength trust negotiations: Negotiations of this
type can start only if an initial degree of trust
exists between the parties. They must trust each
other enough to reveal to each other some certificates and policies right away. From this point on,
their mutual trust grows in a stepwise manner as
more private information is revealed by each party.
Negotiations succeed when a sufficient mutual
trust is established by the time the negotiation ends; it is sufficient for the requestor
to obtain the desired service. This procedure is
summarized in Table 6.
b. Trust growth in privacy-preserving same-strength trust negotiations: In contrast to privacy-revealing same-strength trust negotiations,
negotiations of this type can start without any
initial trust, since they involve no risk related to
revealing one's privacy. There are no intermediate
degrees of trust established during the negotiations. They continue without mutual trust up to the
moment when they succeed or fail. They succeed
when a sufficient mutual trust is established by the
time the negotiation ends. This procedure
is summarized in Table 7.
B.2. Trust growth in different-strength trust
negotiations: Negotiations of this type can start
only if, at their start, the weaker partner has
sufficient trust in the stronger partner. This trust
is sufficient when the weaker party is ready
to reveal the private information required to
start gaining the stronger party's trust necessary for
obtaining its service. As negotiations continue,
the weaker partner trades a degree of privacy
loss for a trust gain as perceived by the stronger
Table 6. Trust growth in privacy-revealing same-strength trust negotiations

An initial degree of trust necessary
Must trust enough to reveal (some) certificates / policies right away
Stepwise trust growth in each other as more (possibly private) information about each other revealed
Proportional to the number of certificates revealed to each other
Succeed if sufficient mutual trust established when negotiations completed
Sufficient for the task at hand

Table 7. Trust growth in privacy-preserving same-strength trust negotiations

Initial distrust
No one wants to reveal any information to the partner
No intermediate degrees of trust established
From distrust to trust
Succeed if sufficient mutual trust established when negotiations completed
Sufficient for the task at hand

partner. It should be clear that the former loses
a next degree of privacy when revealing the next
private certificate to the latter. (The only exception to privacy loss is the no-privacy-loss case
in the anonymity-preserving example of the stronger party
building trust in the weaker one, shown in Table 3.)
Negotiations succeed when, by the time
the different-strength trust negotiations end, the
stronger party gains sufficient trust in the weaker
party to provide it the requested service. This
procedure is summarized in Table 8.

Table 8. Trust growth in privacy-preserving different-strength trust negotiations

Initially, Weaker has a sufficient trust in Stronger
Weaker must trust Stronger sufficiently to start revealing private information required to gain Stronger's sufficient trust
Weaker trades a degree of privacy loss for a trust gain as perceived by Stronger
A next degree of privacy lost when a next certificate revealed to Stronger
Sufficient trust of Stronger in Weaker established when negotiations completed
Sufficient for the task at hand
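
The different-strength procedure summarized in Table 8 can be sketched in a few lines of Python. The credentials, and the trust gains the Stronger party assigns to them, are hypothetical; how the Weaker party should decide which credential to reveal next is refined by the optimization described in Section C.

def negotiate(trust_gains, required_trust):
    # Weaker reveals credentials until Stronger's trust in Weaker reaches
    # required_trust.  trust_gains maps each unrevealed credential to the
    # trust gain Stronger assigns to it (hypothetical values).
    revealed, trust = [], 0.0        # Stronger starts with no trust in Weaker
    remaining = dict(trust_gains)
    while trust < required_trust and remaining:
        # Here the credential with the largest trust gain is revealed first;
        # which credential to pick is refined by the optimization in Section C.
        name = max(remaining, key=remaining.get)
        trust += remaining.pop(name)
        revealed.append(name)
    return trust >= required_trust, revealed, trust

ok, revealed, trust = negotiate(
    {"driver_license": 0.4, "passport": 0.5, "student_id": 0.2},
    required_trust=0.8)
print(ok, revealed, trust)   # True ['passport', 'driver_license'] 0.9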

B.3. Summary of privacy-for-trust trading in same-strength and different-strength trust negotiations: Table 9 summarizes trust growth in same-strength and different-strength trust negotiations.

Table 9. Summary of trust growth in same-strength and different-strength trust negotiations

Trust growth in same-strength disclosing trust negotiations
Initial degree of trust / stepwise trust growth / establishes mutual full trust
Trades information for trust (information is private or not)

Trust growth in same-strength preserving trust negotiations (from distrust to trust)
Initial distrust / no stepwise trust growth / establishes mutual full trust
No trading of information for trust (information is private or not)

Trust growth in different-strength trust negotiations
Initial full trust of Weaker in Stronger and no trust of Stronger in Weaker / stepwise trust growth / establishes full trust of Stronger in Weaker
Trades private information for trust

C. Privacy-for-Trust Optimization in
Different-Strength Trust Negotiations

The optimization procedure for trading privacy for trust in different-strength trust negotiations presented here follows our approach (Zhong & Bhargava, 2004; Zhong, 2005). It includes four steps:

1. Formalizing the privacy-trust tradeoff problem
2. Measuring privacy loss due to disclosing a private credential set
3. Measuring trust gain obtained by disclosing a private credential set
4. Developing a system that minimizes privacy loss for a required trust gain

We distinguish two forms of privacy-for-trust
optimization. The first one minimizes the loss
of privacy by the weaker partner necessary for
obtaining, in the eyes of the stronger partner, a
certain trust level required to get a service. This
is the form discussed in more detail below. The
second form of optimization finds the degree of
privacy disclosure by the weaker partner necessary
for maximizing the trust level obtained from the
stronger partner. We do not discuss this form,
noting only that it is needed in situations when
the weaker partner's benefits obtained from the
stronger partner are proportional to the trust level
attained in the eyes of the stronger partner.
We assume that a user has multiple choices
in selecting sensitive information for disclosure.
For example, in response to an age query, a user
can show a driver license, a passport, or a birth
certificate.
C.1. Formulating the tradeoff problem: Suppose
that the private attributes we want to conceal are
a1, a2, ..., am. A user has a set of credentials {c1,
c2, ..., cn}. A credential set can be partitioned by
a service provider into revealed and unrevealed
credential subsets, denoted as R(s) and U(s),
respectively, where s is the identity of a service
provider.
The tradeoff problem can now be formulated
as follows: choose from U(s) the next credential
nc to be revealed in a way that minimizes privacy
loss while satisfying trust requirements. More
formally, this can be expressed as:

min {PrivacyLoss(nc ∪ R(s)) − PrivacyLoss(R(s)) | nc satisfies trust requirements}
This problem can be investigated in two
scenarios:

1. Service providers never collude to discover customers' private information. An individual version R(s) is maintained for each service provider and privacy loss is computed based on it.
2. Some service providers collude to discover customers' private information. A global version Rg that consists of all credentials disclosed to any colluding service provider is maintained. Since the difference between R(s) and Rg is transparent to the procedure for evaluation of privacy loss and trust gain, both are denoted as R in further considerations.

The tradeoff problem changes to a multivariate problem if multiple attributes are taken into
consideration. It is possible that selecting nc1 is
better than nc2 for a1 but worse for a2. We assume
the existence of an m-dimensional weight vector [w1, w2, ..., wm] associated with these private
attributes. The vector determines the protection
priority for the private attributes a1, a2, ..., am,
respectively. We can minimize either: (a) the
weighted sum of privacy losses for all attributes
or (b) the privacy loss of the attribute with the
highest protection weight.
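
The following Python sketch shows one way to implement the selection step just formulated: among the unrevealed credentials that satisfy the trust requirement, choose the one minimizing the weighted sum of per-attribute privacy losses. The per-attribute losses and the trust test are supplied by the caller (for example, computed with the methods of Section C.2); all concrete names and numbers below are hypothetical.

def choose_next_credential(unrevealed, revealed, privacy_loss, weights,
                           satisfies_trust):
    # Pick nc from `unrevealed` minimizing the weighted sum of per-attribute
    # privacy losses, subject to the trust requirement.
    # privacy_loss(attr, credential, revealed) -> marginal loss for one attribute
    # weights: attribute -> protection priority
    # satisfies_trust(credential) -> bool
    # All three are assumptions supplied by the caller.
    def weighted_loss(credential):
        return sum(weights[a] * privacy_loss(a, credential, revealed)
                   for a in weights)
    candidates = [c for c in unrevealed if satisfies_trust(c)]
    if not candidates:
        return None
    return min(candidates, key=weighted_loss)

# Hypothetical per-attribute losses (in bits) for each candidate credential.
losses = {("age", "passport"): 2.0, ("address", "passport"): 3.0,
          ("age", "student_id"): 2.5, ("address", "student_id"): 0.5}
nc = choose_next_credential(
    unrevealed={"passport", "student_id"},
    revealed=set(),
    privacy_loss=lambda a, c, r: losses[(a, c)],
    weights={"age": 1.0, "address": 2.0},
    satisfies_trust=lambda c: True)
print(nc)   # student_id: weighted loss 3.5 versus 8.0 for the passport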
Another factor affecting the tradeoff decision is
the purpose of data collection. It can be specified
in the service provider's privacy policy statement,
for instance, by using P3P (Marchiori, 2002).
Pseudonymous analysis and individual decision
are two data collection purposes defined in P3P.
The former states that collected information will
not be used in any attempt to identify specific
individuals. The latter indicates that information may
be used to determine the habits, interests, or
other characteristics of individuals. A user could
make different negotiation decisions based on the
stated purpose of data collection. Furthermore,
the service provider's trustworthiness to fulfill
the declared privacy commitment can be taken
into consideration.
C.2. Estimating privacy loss: We distinguish
two types of privacy losses: the query-dependent
and query-independent ones. Query-dependent
privacy loss for a credential nc is defined as the
amount of information that nc provides in answering a specific query. The following example
illustrates a query-dependent privacy loss for a
credential. Suppose that a users age is a private
attribute. The first query asks: Are you older
than 15? The second query tests the condition
for joining a silver insurance plan, and asks:
Are you older than 50? If the user has already
presented a valid driver license, we are 100%
sure that the answer to the first query is yes but
the probability of answering yes to the second

Privacy and Trust in Online Interactions

query by a person with a driver license is, say,


40%. Privacy loss for a revealed driver license is
here query-dependent since it varies for different
queries: it is a full privacy loss (100%) for the first
query, and only a partial (probabilistic) privacy
loss (40%) for the second query. This example
also makes it clear that the value of revealed
information (such as a driver license) can vary
for different queries.
Let us consider now an example illustrating a
query-independent privacy loss for a credential.
Suppose that a user has already presented her
driver license, which implies that she is older than
16. If she uses her Purdue undergraduate student
ID as the next piece of evidence, a high query-independent privacy loss ensues, since this credential
greatly reduces the probable range of her age. Let
us consider a third query asking: "Are you an
elementary school student?" The student's ID is
redundant as a credential for this query, because
the revealed driver license has already excluded
this possibility. This shows that a credential having
a high query-independent privacy loss may not
necessarily be useful to answer a specific query
(a large privacy loss with no trust gain).
Two types of methods can be used to measure
a privacy loss: probabilistic methods and the
predefined lattice method.
a. Probabilistic methods for estimating privacy
losses: There are two probabilistic methods: one
for evaluating query-dependent privacy losses,
and another for evaluating query-independent
privacy losses. More specifically, the first probabilistic method evaluates the query-independent
privacy loss for disclosing a credential ci with
respect to one attribute aj that has a finite domain
{v1, v2, ..., vk}. The probability of aj = vi before
disclosing the credential is Prob(aj = vi | R). The
probability of aj = vi with a given credential ci
disclosed is Prob(aj = vi | R ∪ ci). The privacy
loss is measured as the difference between entropy
values (Young, 1971):

PrivacyLoss_aj(ci | R) = − Σ_{i=1..k} Pi log2(Pi) + Σ_{i=1..k} Pi* log2(Pi*)

where Pi = Prob(aj = vi | R) and Pi* = Prob(aj = vi | R ∪ ci).
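
One way to compute this query-independent measure in Python is shown below, with hypothetical probability distributions for an age attribute: the privacy loss is the drop in the entropy of the attribute's distribution caused by disclosing the credential.

import math

def entropy(dist):
    # Shannon entropy (bits) of a distribution given as value -> probability.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def privacy_loss(before, after):
    # Query-independent privacy loss for one attribute: the entropy of
    # Prob(aj | R) minus the entropy of Prob(aj | R and ci).
    return entropy(before) - entropy(after)

# Hypothetical age distributions before and after showing a student ID.
before = {"16-25": 0.25, "26-35": 0.25, "36-45": 0.25, "46+": 0.25}
after  = {"16-25": 0.90, "26-35": 0.10, "36-45": 0.0,  "46+": 0.0}
print(round(privacy_loss(before, after), 3))   # about 1.531 bits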
The second probabilistic method evaluates the
query-dependent privacy loss based on the knowledge of a complete set of potential queries. Let q1,
q2, ..., qn denote n queries. Let pri be the probability that qi is asked, and wi be the corresponding
weight indicating the protection priority for this
query. We can now evaluate the privacy loss for
disclosing a credential ci in response to a query
qk. Suppose that there are r possible answers to
the query. The domain of an attribute aj is divided
into r subsets {qv1, qv2, ..., qvr} based on the
query answer set. The privacy loss with respect
to attribute aj and query qk is computed, analogously, as the difference between entropy values:

PrivacyLoss_aj,qk(ci | R) = − Σ_{i=1..r} Pi log2(Pi) + Σ_{i=1..r} Pi* log2(Pi*)

where Pi = Prob(aj ∈ qvi | R) and Pi* = Prob(aj ∈ qvi | R ∪ ci).

The query-dependent privacy loss with respect to attribute aj is evaluated by the following formula:

PrivacyLoss_aj(ci | R) = Σ_{k=1..n} (PrivacyLoss_aj,qk * prk * wk)

Bayes networks (Jensen, 1996) and kernel density estimation can be used for conditional probability estimation.
b. The predefined lattice method for estimating
privacy losses: The second type of method that
can be used for measuring a privacy loss is the predefined lattice method. This
method assumes that each credential is associated with a tag indicating its privacy level with
respect to attribute aj. The tag set is organized as
a lattice (Donnellan, 1968) in advance. Tags are
assigned to each subset of credentials as follows.
Suppose that TB and TA are two tags and TB ≤ TA.
TA and TB are assigned to credentials cA and cB,
respectively, if the information inferred from cA
is more precise than what can be inferred from
cB. cA determines a possible value set VA for aj,
and cB determines another set VB. The formula to
compute the privacy loss is shown in Box 1, where lub
is the least upper bound operator (ibid).
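
The sketch below illustrates the lattice idea in Python using a deliberately simple, totally ordered tag set (a special case of a lattice), in which the least upper bound reduces to taking the maximum; the tag names and their assignment to credentials are hypothetical.

# Privacy-level tags ordered from least to most revealing; in this totally
# ordered (hypothetical) tag set the least upper bound is simply the maximum.
LEVELS = ["public", "coarse", "precise", "exact"]
RANK = {tag: i for i, tag in enumerate(LEVELS)}

def lub(tag_a, tag_b):
    # Least upper bound of two tags in the totally ordered tag set.
    return tag_a if RANK[tag_a] >= RANK[tag_b] else tag_b

def privacy_loss(credential_tag, revealed_tag):
    # Sketch of the lattice-based loss: after disclosing a credential, the
    # loss is the lub of its tag and the tag of the already revealed set
    # (cf. Box 1).
    return lub(credential_tag, revealed_tag)

# Hypothetical tags: a driver license bounds the age coarsely, a birth
# certificate reveals it exactly.
print(privacy_loss("exact", "coarse"))    # exact
print(privacy_loss("coarse", "public"))   # coarse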
C.3. Estimating trust gain: We have already
shown in the section on Trust Metrics the way
to compute a trust gain. It requires defining a
trust benefit function B(ti) associated with each
trust level ti. Then, the trust gain G is calculated
as follows:
trust_gain =
G(new_trust_level, old_trust_level) =
B(new_trust_level) - B(old_trust_level)
C.4. PRETTY, a system minimizing privacy loss
for a required trust gain: A prototype system
named PRETTY (private and trusted systems)
was developed in the Raid Laboratory at Purdue
University (Zhong & Bhargava, 2004). PRETTY,
shown in Figure 3, utilizes a client/server architecture. It uses as one of its components the
existing TERA (trust-enhanced role assignment)
prototype, which was also developed in the Raid
Lab (Zhong, Lu, & Bhargava, 2004). TERA evaluates the trust level of a user based on the user's
behavior. It decides whether the user is authorized
for an operation based on the policies, the credentials, and the level of trust. The user's trust value
is dynamically updated when more data on the
user's behavior become available.
The server component consists of the server
application, the TERA server, the privacy negotiator, the set of privacy policies, the database,
and the data disseminator. The client component
of PRETTY consists of the user application, the
credential manager, the evaluator of trust gain
and privacy loss, the privacy negotiator, and a set
of privacy policies. The evaluator of trust gain
and privacy loss implements the mechanisms
presented in this chapter. It prompts a user to enter
the utility function for the ongoing interaction. The
evaluator either automatically makes the tradeoff
decision or provides the user with the evaluation
results for privacy loss and trust gain.
A typical interaction is illustrated in Figure 3
with numbered arrows. An arrow number (1 to 4,
some with letter suffixes) corresponds to an item
number in the interaction description.
A number in parentheses in the figure denotes
an unconditional path (e.g., (1)), that is, a path
followed by all interactions. A number in brackets
denotes a conditional path (e.g., [2a]), that is,
a path followed only by some interactions. An
interaction takes place as follows:
1. The user application sends a query to the server application.
2. The server application sends user information to the TERA server for trust evaluation and role assignment. (Examples of roles are student role and faculty role, the latter granting its actor broader access rights.)
2a. If a higher trust level is required for the query, the TERA server sends a request for more user credentials to the local privacy negotiator.
2b. Based on the server's privacy policies and credential requirements, the server's privacy negotiator interacts with the client's privacy negotiator to build a higher level of trust.
2c. The trust gain and privacy loss evaluator of the client selects credentials that will increase trust to the required level with the minimal privacy loss. The calculations consider credential requirements and credentials disclosed in previous interactions. (This item includes two actions: [2c1] and [2c2].)
2d. According to the client's privacy policies and the calculated privacy loss, the client's privacy negotiator decides whether or not to supply credentials to the server.
3. Once the achieved trust level meets the minimum requirements, the appropriate roles are assigned to the user for execution of the user's query.
4. Based on the query results, the user's trust level, and privacy policies, the data disseminator determines: (i) whether to distort data for protecting its privacy and, if so, to what degree; and (ii) what privacy enforcement metadata (e.g., policies) should be associated with it.

Box 1.
PrivacyLoss_aj(ci | ∅) = Tag(ci)
PrivacyLoss_aj(ci | R) = lub(PrivacyLoss_aj(ci | ∅), PrivacyLoss_aj(R | ∅))

Figure 3. Architecture of PRETTY

Future Trends in Privacy- and Trust-Related Solutions
Technical privacy-related and trust-related solutions will continue their strong impact on online
consumer protection. The future trends related to
privacy will be determined, among others, by the
following challenges (Bhargava, Farkas, Lilien,
& Makedon, 2003; Bhargava, 2006):
1. Defining and measuring privacy and privacy policies: How to define privacy? How to define privacy metrics? How to best define privacy policies? How to best perform privacy requirement analysis and stakeholder analysis? How to analyze and manage privacy policies?
2. Determining technologies that endanger privacy in computing environments: What technologies or system components endanger privacy in computing environments, and how to prevent this? As an example, how to prevent pervasive computing from illegitimately monitoring and controlling people? How to assure anonymity in pervasive computing environments? How to balance anonymity with accountability under these circumstances?
3. Finding new privacy-enhancing technologies: What technologies can be utilized or exploited to provide privacy, and how to use them to this end? What are the best ways of privacy-preserving data mining and querying? How to monitor Web privacy and prevent privacy invasions by undesirable inferences? How to address the issue of monitoring the monitor, including identification and prevention of situations when incorrect monitor data result in personal harm?
The future trends related to trust will be determined, among others, by the following challenges (Bhargava et al., 2003):

1. Improving initiation and building of trust: How to create formal models of trust, addressing the issues of different types of trust (e.g., trust towards data subjects, or data users)? How to define trust metrics able to compare different trust models? How should trust models assist in selecting and accommodating trust characteristics? How should the models of trust handle both direct evidence and second-hand recommendations? How can TTPs be used to initiate and build trust? How do timeliness, precision, and accuracy affect the process of trust building?
2. Maintaining and evaluating trust: How to collect and maintain trust data (e.g., credentials, recommendations)? How and when to evaluate trust data? How to discover betrayal of trust, and how to enforce accountability for damaging trust? How to prevent trust abuse, for example, by preventive revocation of access rights? How to motivate users to be good citizens and to contribute to trust maintenance?
3. Constructing practical trust solutions: How to scale up trust models and solutions? What is the impact of trust solutions on system performance and economics? How to guarantee performance and economy of trust solutions? How and what economic incentives and penalties can be used for trust solutions?
4. Engineering trust-based applications and systems: How to experiment with and implement trust-based applications and systems for e-government, e-commerce, and other applications? How to enhance system performance, security, economics, and so forth, with trust-based ideas (such as enhancing role-based access control with trust-based mappings)?

Conclusion

Providing tools for privacy-for-trust exchange is
critical to further developments of online interactions. Without privacy guarantees, there can
be no trust, and without at least some trust no
interactions can even commence, unless a party
is totally oblivious to the dangers of privacy loss,
up to the point of identity theft. Normally, people
will avoid any negotiations if their privacy is
threatened by a prospective negotiation partner.
Without trust-building negotiations, no trust can
be established.
The stakes are becoming higher since privacy
guarantees are becoming absolutely essential as
we progress towards pervasive computing. The more
pervasive the devices, the higher the potential for
violating privacy. Unless adequate technical
privacy controls and privacy-for-trust support
are provided, the possibility of huge privacy losses
will scare people off, crippling the promise of
pervasive computing.
The objective of this chapter was to present
an approach and a tool for protecting privacy in
privacy-for-trust exchanges. We presented the notions of privacy and trust, their characteristics, and
their role in online interactions, emphasizing the
tradeoff between these two phenomena. An overview of problems facing a person wishing to trade
privacy for trust was followed by a description of
our proposed solution. It started with a look at
trust metrics and means for building and verifying
trust, and continued with a presentation of two
privacy metrics: an effective anonymity set size
and an entropy-based metric. We then discussed
technical means for protecting privacy.
We categorized the processes of trading
privacy for trust into same-strength privacy-for-trust negotiations and different-strength privacy-for-trust negotiations, dividing the former into
privacy-revealing and privacy-preserving subcategories. The described privacy-for-trust solution
is intended for optimization in different-strength
trust negotiations. It involves four steps: formulating the tradeoff problem, estimating privacy loss,
estimating trust gain, and minimizing privacy
loss for a required trust gain. We provided a brief
description of PRETTY, a system minimizing
privacy loss for a required trust gain.

Future Research Directions for Privacy and Trust Research
We have shown that privacy and trust enable
and facilitate collaboration and communication.
We indicated their growing role in open and
dynamic environments. To increase the benefits
of privacy-related and trust-related solutions, a
number of research directions should be pursued
(Bhargava et al., 2003). For privacy-related solutions, the following research problems should be
addressed (ibid):
1. Privacy metrics: Issues of privacy of users or applications, on the one hand, and privacy (secrecy, confidentiality) of data, on the other hand, intertwine. Metrics for personal and confidential data usage should be developed to measure who accesses data, what data are accessed, and how they are accessed. Metrics and methods for measurements of privacy-related aspects of data quality should be provided. Accuracy in information extraction should be measured, since inaccurate information can obstruct accountability or harm privacy (e.g., in the case of a wrongly identified individual).
2. Privacy policy monitoring and validation: We need to better understand how to monitor and validate privacy policies, and develop technologies that ensure correct enforcement of privacy policies. Researchers should address monitoring and validating privacy aspects of data integration, separation, warehousing, and aggregation. An interesting issue is licensing of personal data for specific uses by their owners (an example is Ms. Smith agreeing to receive house-for-sale advertising by licensing her e-mail rights to a real estate advertiser).
3. Information hiding, obfuscation, anonymity, and accountability: Researchers should address different ways of assuring anonymity via information hiding and obfuscation, ranging from steganography through location security and hiding message source and destination from intermediate nodes to approaches used for digital elections. At the same time, for accountability, we need to investigate how to prevent illegitimate or improper information hiding. We need models that support accountable anonymity without depending on a trusted third party. As an example, accountability suffers when data provenance obfuscation or user anonymity hinder intruder identification.
4. New privacy-enabling and privacy-disabling technologies: The impact of emerging technologies on privacy should be investigated. In particular, broad research on privacy for pervasive computing is an urgent necessity. Unless proper access control is provided, less restricted data. Another important issue is privacy-preserving data mining on massive datasets.
5. Interdisciplinary privacy research: Interdisciplinary research should propose comprehensive and rich privacy models based on social and ethical privacy paradigms. Another direction is considering public acceptance of privacy requirements and rules, and their enforcement.

In turn, for trust-related solutions, the following research problems should be addressed
(ibid):
1. A better utilization of the social paradigm of trust: Utilization of the powerful social paradigm of trust, based on the analogies to uses of the notion of trust in social systems, should be explored in many ways. Finding out what makes trust work in existing social systems, and transferring this to a computing world, is a big challenge. This work calls for a strong cooperation with social scientists.
2. Liability of trust: We need to provide methods, algorithms, and tools to identify which components and processes of the system depend on trust. We also need to find out to which extent and how security of a system may be compromised if any of these trust-dependent components fails. As an example, the role of data provenance explanations in trust-based systems needs to be investigated.
3. Scalable and adaptable trust infrastructure: A high priority should be given to building scalable and adaptable trust infrastructures, including support for trust management and trust-based negotiations. In particular, support should be made available for gaining insight from different applications, for exploring the issue of dynamic trust, for building interoperable tools for the trust infrastructure, for developing flexible and extensible standards, and for investigating trust-based negotiations.
4. Benchmarks, testbeds, and development of trust-based applications: We need benchmarks and testbeds for experimenting with diverse roles of trust in computing systems. The experiments should form a strong basis for the development of diverse prototype trust-based applications. Trust-based solutions for new and emerging technologies should be studied. An example is ensuring data integrity and privacy in sensor networks deployed in trustless environments.
5. Interdisciplinary trust research: There is a strong need for trust-related interdisciplinary research outside of the realm of computer science and engineering. In addition to the already-mentioned interdisciplinary work on the social paradigm of trust, it should include research on ethical, social, and legal issues, both human-centered and system-centered. Another important direction is work on economic incentives for building trust, and disincentives and penalties for committing fraud.

Trust and privacy are strongly related to security. Therefore, in addition to the separate research
directions for privacy and trust specified, we can
also indicate threads of research common not only
to them, but also to security. This means research
on intersecting aspects of trust, privacy, and
security (TPS) (Bhargava et al., 2003). The first
common thread includes the tradeoffs, including
not only the tradeoff between privacy and trust,
but also performance vs. TPS, cost and functionality vs. TPS, and data monitoring and mining
vs. TPS. The second common thread contains
policies, regulations, and technologies for TPS.
This includes creation of flexible TPS policies,
appropriate TPS data management (including
collection, usage, dissemination, and sharing of
TPS data), and development of domain- and application-specific TPS approaches (such as TPS
solutions for commercial, government, medical,
and e-commerce fields). The third and the fourth

Privacy and Trust in Online Interactions

threads are a development of economic models


for TPS, and investigation of legal and social
TPS aspects.

Acknowledgment
This research was supported in part by the NSF
Grants IIS-0242840, IIS-0209059, ANI-0219110,
and NCCR-0001788. The authors thank Dr. Yuhui
Zhong and Dr. Yi Lu for contributing input for
the subsections on Related Work, Trust Metrics,
Privacy Metrics, and Privacy-for-trust Optimization in Different-strength Trust Negotiations; as
well as Dr. Mohamed Hefeeda for contributing
Figure 2. Any opinions, findings, conclusions,
or recommendations expressed in the chapter are
those of the authors and do not necessarily reflect
the views of the funding agencies or institutions
with which the authors are affiliated.

References
Aberer, K., & Despotovic, Z. (2001). Managing
trust in a peer-2-peer information system. In
Proceedings of the 2001 ACM CIKM International Conference on Information and Knowledge
Management, Atlanta, Georgia (pp. 310-317).
New York: ACM.
Agrawal, D., & Aggarwal, C. (2001). On the
design and quantification of privacy preserving data mining algorithms. In Proceedings of
the Twentieth ACM SIGMOD-SIGACT-SIGART
Symposium on Principles of Database Systems,
PODS01, Santa Barbara, California (pp. 247-255).
New York: ACM.
The American Heritage Dictionary of the English
Language (4th ed.). (2000). Boston: Houghton
Mifflin.
Barnes, G. R., Cerrito, P. B., & Levi, I. (1998). A
mathematical model for interpersonal relation-

ships in social networks. Social Networks, 20(2),


179-196.
Bhargava, B. (2006, September). Innovative ideas
in privacy research (Keynote talk). In Proceedings of the Seventeenth International Workshop
on Database and Expert Systems Applications
DEXA'06, Kraków, Poland (pp. 677-681). Los
Alamitos, CA: IEEE Computer Society.
Bhargava, B., Farkas, C., Lilien, L., & Makedon, F.
(2003, September 14-16). Trust, privacy, and security: summary of a workshop breakout session
at the national science foundation information
and data management (IDM) workshop held in
Seattle, Washington (Tech. Rep. No. 2003-34).
West Lafayette, IN: Purdue University, Center
for Education and Research in Information Assurance and Security (CERIAS).
Bhargava, B., Lilien, L., Rosenthal, A., &
Winslett, M. (2004). Pervasive trust. IEEE Intelligent Systems, 19(5), 74-77.
Bhargava, B., & Zhong, Y. (2002). Authorization
based on evidence and trust. In Y. Kambayashi, W.
Winiwarter, & M. Arikawa (Eds.), Proceedings of
4th International Conference on Data Warehousing and Knowledge Discovery (DaWaK 2002)
Lecture Notes in Computer Science (Vol. 2454,
pp. 94-103). Heidelberg, Germany: Springer.
Can, A. B. (2007). Trust and anonymity in peer-to-peer systems. Ph.D. thesis, West Lafayette,
Indiana: Purdue University.
Carbo, J., Molina, J., & Davila, J. (2003). Trust
management through fuzzy reputation. International Journal of Cooperative Information
Systems, 12(1), 135-155.
Class FPR: Privacy. (1999). Common criteria
for information technology security evaluation.
Part 2: Security functional requirements. Version
2.1. (Report CCIMB-99-032) (pp.109-118). Ft.
Meade, MD: National Information Assurance
Partnership (NIAP). Retrieved June 5, 2007, from
http://www.niap-ccevs.org/cc-scheme/cc_docs/
cc_v21_part2.pdf
Collberg, C., & Thomborson, C. (2002). Watermarking, tamper-proofing, and obfuscation-tools
for software protection. IEEE Transactions on
Software Engineering, 28(8), 735-746.
Cofta, P. (2006). Impact of convergence on trust
in ecommerce. BT Technology Journal, 24(2),
214-218.
Cover, T., & Thomas, J. (1991). Elements of
information theory. Hoboken, NJ: John Wiley
& Sons.
Cranor, L. F. (2003). P3P: making privacy policies more useful. IEEE Security and Privacy,
1(6), 50-55.
Cranor, L. F., Reagle, J., & Ackerman, M. S.
(1999). Beyond concern: Understanding net users
attitudes about online privacy (Tech. Rep. No. TR
99.4.3). Middletown, NJ: AT&T Labs-Research.
Retrieved June 5, 2007, from http://citeseer.ist.
psu.edu/cranor99beyond.html
Diaz, C., Seys, S., Claessens, J., & Preneel, B.
(2003, April). Towards measuring anonymity. In R. Dingledine & P. F. Syverson, (Eds.),
Proceedings of the 2nd International Workshop
on Privacy Enhancing Technologies PET 2002,
San Francisco, CA. Lecture Notes in Computer
Science (Vol. 2482, pp. 184-188). Heidelberg,
Germany: Springer.
Donnellan, T. (1968). Lattice theory. Oxford, NY:
Pergamon Press.
Farrell, S., & Housley, R. (2002). RFC3281: An
Internet attribute certificate profile for authorization. The Internet Society. Network Working
Group. Retrieved June 5, 2007, from http://www.
ietf.org/rfc/rfc3281.txt
Fischer-Hübner, S. (2001). IT-security and privacy: Design and use of privacy-enhancing security
mechanisms. Lecture Notes in Computer Science,
1958. Heidelberg, Germany: Springer.



Fischer-Hübner, S. (2003). Privacy enhancing


technologies. Session 1 and 2, Ph.D. course,
Winter/Spring 2003. Karlstad, Sweden: Karlstad
University. Retrieved June 5, 2007, from http://
www.cs.kau.se/~simone/kau-phd-course.htm
Fujimura, K., & Nishihara, T. (2003). Reputation
rating system based on past behavior of evaluators. In Proceedings of the 4th ACM Conference
on Electronic Commerce, San Diego, CA, USA
(pp. 246-247). New York: ACM Press.
Garfinkel, T., Pfaff, B., Chow, J., Rosenblum, M.,
& Boneh, D. (2003). Terra: A virtual machinebased platform for trusted computing. In Proceedings of 19th ACM Symposium on Operating
Systems Principles, SOSP 2003, Bolton Landing,
New York (pp. 193-206). New York: ACM Press.
Retrieved June 5, 2007, from http://citeseer.ist.
psu.edu/667929.html
Goldberg, I. (2000). A pseudonymous communications infrastructure for the Internet. Ph.D. thesis,
University of California at Berkeley. Retrieved
June 5, 2007, from http://www.isaac.cs.berkeley.
edu/ iang/thesis-final.pdf
Green, A. M. (2003). International privacy laws.
Sensitive information in a wired world (Tech. Rep.
No. CS 457). New Haven, CT: Yale University.
IBM Privacy Research Institute. (2007). Armonk,
NY: IBM. Retrieved June 5, 2007, from http://
www.research.ibm.com/privacy/
Internet Security Glossary. (2007). The Internet
Society. Retrieved June 5, 2007, from www.faqs.
org/rfcs/rfc2828.html
Jensen, F.V. (1996). An introduction to Bayesian
networks. London: UCL Press.
Kelley, C. M., Denton, A., & Broadbent, R. (2001).
Privacy concerns cost ecommerce $15 billion.
Cambridge, MA: Forrester Research.
Lilien, L., & Bhargava, B. (2006). A scheme for
privacy-preserving data dissemination. IEEE

Privacy and Trust in Online Interactions

Transactions on Systems, Man and Cybernetics,


Part A: Systems and Humans, 36(3), 503-506.
Lilien, L., Gupta, A., & Yang, Z. (2007). Opportunistic networks for emergency applications
and their standard implementation framework. In
Proceedings of the First International Workshop
on Next Generation Networks for First Responders and Critical Infrastructure, NetCri07, New
Orleans, LA (pp. 588-593).
Lilien, L., Kamal, Z. H., Bhuse, V., & Gupta, A.
(2006). Opportunistic networks: The concept and
research challenges in privacy and security. In P.
Reiher, K. Makki, & S. Makki (Eds.), Proceedings of the International Workshop on Research
Challenges in Security and Privacy for Mobile
and Wireless Networks (WSPWN06), Miami, FL
(pp. 134-147).
Marchiori, M. (2002). The platform for privacy
preferences 1.0 (P3P1.0) specification. W3C
recommendation. W3C. Retrieved June 5, 2007,
from http://www.w3.org/TR/P3P/
Marsh, S. (1994). Formalizing trust as a computational concept. Ph.D. thesis, University of
Stirling, U.K.
McKnight, D. H., & Chervany, N. L. (2001). Conceptualizing trust: A typology and e-commerce
customer relationships model. In Proceedings of
the 34th Annual Hawaii International Conference
on System Sciences, HICSS-34, Island of Maui,
Hawaii (Volume 7, 10 pages). Washington, D.C.:
IEEE Computer Society. Retrieved June 5, 2007,
from http://csdl2.computer.org/comp/proceedings/hicss/2001/0981/07/09817022.pdf
McKnight, D., Choudhury, V., & Kacmar, C.
(2002). Developing and validating trust measures
for e-commerce: An integrative topology. Information Systems Research, 13(3), 334-359.
Mercuri, R. T. (2004). The HIPAA-potamus in
health care data security. Communications of the
ACM, 47(7), 25-28.

Morinaga, S., Yamanishi, K., Tateishi, K., &


Fukushima, T. (2002). Mining product reputations on the Web. In Proceedings of the 8th ACM
SIGKDD International Conference on Knowledge
Discovery and Data Mining (pp. 341-349). New
York: ACM Press. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/morinaga02mining.html
Mui, L. (2002). Computational models of trust
and reputation: Agents, evolutionary games,
and social networks. Ph.D. thesis, Massachusetts
Institute of Technology.
Mui, L., Mohtashemi, M., & Halberstadt, A.
(2002). A computational model of trust and
reputation for e-businesses. In Proceedings of
the 35th Annual Hawaii International Conference
on System Sciences, HICSS02, Track 7, Island
of Hawaii, Hawaii (9 pages). Washington, D.C.:
IEEE Computer Society. Retrieved June 5, 2007,
from http://csdl2.computer.org/comp/proceedings/hicss/2002/1435/07/14350188.pdf
Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A
proposal for terminology. In H. Federrath (Ed.),
Designing privacy enhancing technologies,
Proceedings of the Workshop on Design Issues
in Anonymity and Observability, Berkeley, California. Lecture Notes in Computer Science (Vol.
2009, pp. 1-9). Heidelberg, Germany: Springer.
Privacy Act. (2004). Washington, D.C.: U.S. Environmental Protection Agency. Retrieved June
5, 2007, from http://www.epa.gov/privacy/
Privacy Bird Tour. (2007). Retrieved June 5, 2007,
from http://www.privacybird.org/tour/1_3_beta/
tour.html
Privacy online: A report to congress. (1998).
Washington, D.C.: U.S. Federal Trade Commission.
Pujol, J. M., Sangüesa, R., & Delgado, J. (2002).
Extracting reputation in multi agent systems by
means of social network topology. In Proceed-




ings of the First International Joint Conference


on Autonomous Agents and Multiagent Systems,
AAMAS 02, Bologna, Italy (pp. 467-474). New
York: ACM Press. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/pujol02extracting.html
Reiter, M., & Rubin, A. (1999). Crowds: Anonymity for Web transactions. Communications of the
ACM, 42(2), 32-48.
Ross, R. (2007, July 19). Robotic insect takes
off. Researchers have created a robotic fly for
covert surveillance. Technology Review, July
19. Retrieved July 20, 2007, from http://www.
technologyreview.com/Infotech/19068/
Sabater, J., & Sierra, C. (2002). Social ReGreT, a
reputation model based on social relations. ACM
SIGecom Exchanges, 3(1), 44-56.
Seigneur, J.-M., & Jensen, C. D. (2004). Trading
privacy for trust. In T. Dimitrakos (Ed.), Proceedings of the Second International Conference on
Trust Management, iTrust 2004, Oxford, UK.
Lecture Notes in Computer Science (Vol. 2995,
pp. 93-107). Heidelberg, Germany: Springer.
Sensor Nation. (2004). [Special Report]. IEEE
Spectrum, 41(7).
Serjantov, A., & Danezis, G. (2003). Towards an
information theoretic metric for anonymity. In R.
Dingledine & P. F. Syverson. (Eds.), Proceedings
of the 2nd International Workshop on Privacy Enhancing Technologies, PET 2002, San Francisco,
California. Lecture Notes in Computer Science
(Vol. 2482, pp. 259-263). Heidelberg, Germany:
Springer.
Summary of the HIPAA Privacy Rule. (2003).
Washington, D.C.: The U.S. Department of Health
and Human Services.
Sweeney, L. (1996). Replacing personally-identifying information in medical records, the Scrub
system. In J. J. Cimino (Ed.), Proceedings of the
American Medical Informatics Association (pp.
333-337). Washington, D.C.: Hanley & Belfus.


Retrieved June 5, 2007, from http://privacy.cs.cmu.


edu/people/sweeney/scrubAMIA1.pdf
Sweeney, L. (1998). Datafly: A system for providing anonymity in medical data. In T. Y. Lin &
S. Qian (Eds.), Database security XI: Status and
prospects. IFIP TC11 WG11.3 Eleventh International Conference on Database Security, Lake
Tahoe, California (pp. 356-381). Amsterdam, The
Netherlands: Elsevier Science.
Sweeney, L. (2001a). Computational disclosure
control: A primer on data privacy protection.
Ph.D. thesis, Massachusetts Institute of Technology.
Sweeney, L. (2001b). Information explosion. In L.
Zayatz, P. Doyle, J. Theeuwes, & J. Lane (Eds.),
Confidentiality, disclosure, and data access:
Theory and practical applications for statistical
agencies (26 pages). Washington, D.C.: Urban Institute. Retrieved June 5, 2007, from http://privacy.
cs.cmu.edu/people/sweeney/explosion2.pdf
Sweeney, L. (2002a). Achieving k-anonymity
privacy protection using generalization and suppression. International Journal on Uncertainty,
Fuzziness and Knowledge-based Systems, 10(5),
571-588.
Sweeney, L. (2002b). k-anonymity: a model for
protecting privacy. International Journal on
Uncertainty, Fuzziness and Knowledge-based
Systems, 10(5), 557-570.
Trustworthy Computing White Paper. (2003).
Redmond, Washington: Microsoft. Retrieved
June 5, 2007, from http://www.microsoft.com/
mscorp/twc/twc_whitepaper.mspx
Tschudin, C. (1999). Apoptosisthe programmed
death of distributed services. In J. Vitek & C.
D. Jensen (Eds.), Secure Internet programming.
Security issues for mobile and distributed objects.
Lecture Notes in Computer Science (Vol. 1603,
pp. 253-260). Heidelberg, Germany: Springer.


Tygar, J. D., & Yee, B. (1994). Dyad: A system


for using physically secure coprocessors. In Proceedings of the Joint Harvard-MIT Workshop
Technological Strategies for Protecting Intellectual Property in the Networked Multimedia
Environment. Annapolis, MD: Interactive Multimedia Association. Retrieved July 20, 2007, from
http://www.cni.org/docs/ima.ip-workshop/Tygar.
Yee.html
Wagealla, W., Carbone, M., English, C., Terzis,
S., Lowe, H. & Nixon, P. (2003, September). A
formal model for trust lifecycle management. In
Proceedings of the 1st International Workshop
on Formal Aspects of Security and Trust, FAST
2003, Pisa, Italy, (pp. 181-192). Retrieved July 20,
2007, from http://www.iit.cnr.it/FAST2003/fastproc-final.pdf
Welcome to the Safe Harbor. (2007). Trade Information Center. Retrieved June 5, 2007, from
http://www.export.gov/safeharbor/
Young, J. F. (1971). Information theory. New York:
Wiley Interscience.
Yu, B., & Singh, M. P. (2002a). Distributed
reputation management for electronic commerce.
Computational Intelligence, 18(4), 535-549.
Yu, B., & Singh, M. P. (2002b). An evidential
model of distributed reputation management.
In Proceedings of the First International Joint
Conference on Autonomous Agents and Multiagent Systems, AAMAS 02, Bologna, Italy, (pp.
294-301). New York: ACM Press.
Yu, T., Winslett, M., & Seamons, K. E. (2003).
Supporting structured credentials and sensitive
policies through interoperable strategies for automated trust negotiation. ACM Transactions on
Information and System Security, 6(1), 1-42.
Zacharia, G., & Maes, P. (2000). Trust management through reputation mechanisms. Applied
Artificial Intelligence, 14, 881-907.

Zhong, Y. (2005). Formalization of trust. Ph.D.


thesis, West Lafayette, IN: Purdue University.
Zhong, Y., & Bhargava, B. (2004, September).
Using entropy to trade privacy for trust. In Proceedings of the NSF/NSA/AFRL Conference on
Secure Knowledge Management, SKM 2004,
Amherst, NY (6 pages).
Zhong, Y., Lu, Y., & Bhargava, B. (2004). TERA:
An authorization framework based on uncertain
evidence and dynamic trust (Tech. Rep. No.
CSD-TR 04-009). West Lafayette, IN: Purdue
University.
Zhong, Y., Lu, Y., Bhargava, B., & Lilien, L.
(2006). A computational dynamic trust model
for user authorization (Working Paper). West
Lafayette, IN: Purdue University.

Additional Reading in Privacy and Trust
Antoniou, G., Wilson, C., & Geneiatakis, D.
(2006, October). PPINAa forensic investigation protocol for privacy enhancing technologies.
In H. Leitold & E. Leitold (Eds.), Proceedings
of the 10th IFIP International Conference on
Communications and Multimedia Security, CMS
2006, Heraklion, Crete, Greece. Lecture Notes
in Computer Science (Vol. 4237, pp. 185-195).
Heidelberg, Germany: Springer
Bauer, K., McCoy, D., Grunwald, D., Kohno,
T., & Sicker D. (2007). Low-resource routing
attacks against anonymous systems (Tech. Rep.
No. CU-CS-1025-07). Boulder, CO: University of
Colorado at Boulder. Retrieved June 5, 2007, from
http://www.cs.colorado.edu/department/publications/reports/docs/CU-CS-1025-07.pdf
Bearly, T., & Kumar, V. (2004). Expanding trust
beyond reputation in peer-to-peer systems. In
Proceedings of the 15th International Database


and Expert Systems Applications, DEXA04,


Zaragoza, Spain (pp. 966-970).
Bouguettaya, A. R. A., & Eltoweissy, M. Y. (2003).
Privacy on the Web: facts, challenges, and solutions. IEEE Security and Privacy, 1(6), 40-49.
Cahill, V., Gray, E., Seigneur, J.-M., Jensen, C.D.,
Chen, Y., English, C., et al. (2003). Using trust for
secure collaboration in uncertain environments.
IEEE Pervasive Computing, 2(3), 52-61.
Capra, L. (2004, November). Engineering human
trust in mobile system collaborations. In Proceedings of the 12th International Symposium on the
Foundations of Software Engineering, SIGSOFT
2004, Newport Beach, California, (pp. 107-116).
Washington, D.C.: IEEE Computer Society.
Chaum, D. (1981). Untraceable electronic mail,
return addresses, and digital pseudonyms. Communications of the ACM, 24(2), 84-90. Retrieved
June 5, 2007, from http://world.std.com/~franl/
crypto/chaum-acm-1981.html
Frosch-Wilke, D. (2001, June). Are e-privacy and
e-commerce a contradiction in terms? An economic examination. In Proceedings of Informing
Science Challenges to Informing Clients: A Transdisciplinary Approach, Krakow, Poland, (pp. 191-197). Retrieved June 5, 2007, from http://www.
informingscience.org/proceedings/IS2001Proceedings/pdf/FroschWilkeEBKAreEP.pdf
Grandison, T., & Sloman, M. (2000). A survey of
trust in Internet applications. IEEE Communications Surveys, 3(4), 2-16.
Hubaux, J.-P., Capkun, S., & Luo, J. (2004). The
security and privacy of smart vehicles. IEEE
Security and Privacy Magazine, 2(3), 49-55.
Kagal, L., Finin, T., & Joshi, A. (2001). Trust-based
security in pervasive computing environments.
IEEE Computer, 34(12), 154-157.
Kesdogan, D., Agrawal, D., & Penz, S. (2002).
Limits of anonymity in open environments. In


F. A. P. Petitcolas (Ed.), Proceedings of the 5th


International Workshop on Information Hiding
(IH 2002), Noordwijkerhout, The Netherlands.
Lecture Notes in Computer Science (Vol. 2578, pp.
53-69). Heidelberg: Germany: Springer. Retrieved
June 5, 2007, from http://www-i4.informatik.
rwth-aachen.de/sap/publications/15.pdf
Kpsell, S., Wendolsky, R., & Federrath, H.
(2006). Revocable anonymity. In Proceedings of
the 2006 International Conference on Emerging
Trends in Information and Communication Security, ETRICS, Freiburg, Germany. Lecture Notes
in Computer Science (Vol. 3995, pp. 206-220).
Heidelberg: Germany: Springer
Kuntze, N., & Schmidt, A. U . (2006). Transitive trust in mobile scenarios. In Proceedings of
the 2006 International Conference on Emerging Trends in Information and Communication
Security, ETRICS, Freiburg, Germany, Lecture
Notes in Computer Science (Vol. 3995, pp. 73-85).
Heidelberg: Germany: Springer.
Lam, S. K., Frankowski, D., & Riedl, J. (2006,
June). Do you trust your recommendations? An
exploration of security and privacy issues in recommender systems. In Proceedings of the 2006
International Conference on Emerging Trends
in Information and Communication Security,
ETRICS, Freiburg, Germany, Lecture Notes in
Computer Science (Vol. 3995, pp. 14-29). Heidelberg: Germany: Springer.
Langheinrich, M. (2001). Privacy by design
principles for privacy-aware ubiquitous systems.
In G. D. Abowd, B. Brumitt, & S. Shafer (Eds.),
Proceedings of the 3rd International Conference on Ubiquitous Computing, UbiComp 2001,
Atlanta, Georgia, Lecture Notes in Computer
Science (Vol. 2201, pp. 273-291). Heidelberg,
Germany: Springer.
Lu, Y., Wang, W., Bhargava, B., & Xu, D. (2006).
Trust-based privacy preservation for peer-to-peer
data sharing. IEEE Transactions on Systems, Man
and Cybernetics, Part A, 36(3), 498-502.

Privacy and Trust in Online Interactions

Manchala, D. W. (2000). E-commerce trust


metrics and models. IEEE Internet Computing,
4(2), 36-44.
Martin, D. M., Jr., Smith, R. M., Brittain, M.,
Fetch, I., & Wu, H. (2001). The privacy practices
of Web browser extensions. Communications of
the ACM, 44(2), 45-50.
Nielson, S. Crosby, S., & Wallach, D. (2005). A
taxonomy of rational attacks. In M. Castro & R.
Renesse (Eds.), Proceedings of the 4th International Workshop on Peer-to-Peer Systems, IPTPS
05, Ithaca, New York, Lecture Notes in Computer
Science, (Vol. 3640, pp. 36-46). Heidelberg, Germany: Springer.
Resnick, P., Kuwabara, K., Zeckhauser, R. &
Friedman, E. (2000). Reputation systems. Communications of the ACM, 43(12), 45-48.
Richardson, M., Agrawal, R., & Domingos, P.
(2003). Trust management for the Semantic Web.
In Proceedings of the 2nd International Semantic
Web Conference. Lecture Notes in Computer
Science, (Vol. 2870, pp. 351368). Heidelberg,
Germany: Springer.

Rousseau, D. M., Sitkin, S. B., Burt, R. S., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23(3), 393-404.

Sandberg, C. K. (2002). Privacy and customer relationship management: Can they peacefully co-exist? William Mitchell Law Review, 28(3), 1147-1162.

Skogsrud, H., Benatallah, B., & Casati, F. (2003). Model-driven trust negotiation for Web services. IEEE Internet Computing, 7(6), 45-52.

Squicciarini, A. C., Bertino, E., Ferrari, E., & Ray, I. (2006). Achieving privacy in trust negotiations with an ontology-based approach. IEEE Transactions on Dependable and Secure Computing, 3(1), 13-30.

Theodorakopoulos, G., & Baras, J. S. (2006). On trust models and trust evaluation metrics for ad hoc networks. IEEE Journal on Selected Areas in Communications, 24(2), 318-328.

Thuraisingham, B. (2007, June). Confidentiality, privacy and trust policy enforcement for the Semantic Web. In Proceedings of the Eighth IEEE International Workshop on Policies for Distributed Systems and Networks, POLICY '07, Bologna, Italy (pp. 8-11). Washington, DC: IEEE Computer Society.

Virendra, M., Jadliwala, M., Chandrasekaran, M., & Upadhyaya, S. (2005, April). Quantifying trust in mobile ad-hoc networks. In Proceedings of the IEEE International Conference on Integration of Knowledge Intensive Multi-Agent Systems, KIMAS 2005, Boston, MA (pp. 65-70). Washington, DC: IEEE Computer Society.

Westin, A. (1967). Privacy and freedom. New York: Atheneum.

Westin, A. (2003). Social and political dimensions of privacy. Journal of Social Issues, 59(2), 431-453.

Xiao, L., Xu, Z., & Zhang, X. (2003). Low-cost and reliable mutual anonymity protocols in peer-to-peer networks. IEEE Transactions on Parallel and Distributed Systems, 14(9), 829-840.

Yao, D., Frikken, K., Atallah, M., & Tamassia, R. (2006, December). Point-based trust: Define how much privacy is worth. In P. Ning, S. Qing, & N. Li (Eds.), Proceedings of the Eighth International Conference on Information and Communications Security, ICICS 06, Raleigh, North Carolina. Lecture Notes in Computer Science (Vol. 4307, pp. 190-209). Heidelberg, Germany: Springer.

Zhang, N., & Todd, C. (2006, October). A privacy agent in context-aware ubiquitous computing environments. In H. Leitold & E. Leitold (Eds.), Proceedings of the 10th IFIP International Conference on Communications and Multimedia Security, CMS 2006, Heraklion, Crete, Greece. Lecture Notes in Computer Science (Vol. 4237, pp. 196-205). Heidelberg, Germany: Springer.

Zhu, H., Bao, F., & Liu, J. (2006). Computing of trust in ad-hoc networks. In H. Leitold & E. Leitold (Eds.), Proceedings of the 10th IFIP International Conference on Communications and Multimedia Security, CMS 2006, Heraklion, Crete, Greece. Lecture Notes in Computer Science (Vol. 4237, pp. 1-11). Heidelberg, Germany: Springer.

ENDNOTES

1. A successful construction of a cyberfly, or the first robot to achieve liftoff that's modeled on a fly and built on such a small scale, was just reported (Ross, 2007).

2. This includes politics. A Russian proverb, "Trust but verify," was made famous in the mid 1980s by President Reagan, at the start of the historic negotiations with General Secretary Gorbachev.

3. A special case of protection of this kind is preventing any forwarding of disseminated data by any party that received it directly from their original source.



Chapter VI

Current Measures to Protect E-Consumers' Privacy in Australia

Huong Ha
Monash University, Australia

Ken Coghill
Monash University, Australia

Elizabeth Ann Maharaj
Monash University, Australia

ABSTRACT

The current measures to protect e-consumers' privacy in Australia include (i) regulation/legislation, (ii) guidelines, (iii) codes of practice, and (iv) activities of consumer associations and the private sector. However, information about the outcomes of such measures has not been sufficiently reported, whereas privacy incidents have increased. Some policy implications for e-consumer protection are drawn from the analysis. Firstly, national privacy legislation should widen its coverage. Secondly, uniform regulations and guidelines could contribute to providing equal protection to e-consumers. Thirdly, guidelines and codes of practice need to be supported by legislation and a proper compliance regime. Corporate social responsibility by e-retailers is also required for effective adoption of self-regulatory measures. Fourthly, consumer education is important to enhance consumer awareness of online privacy risks and their ability to deal with such incidents. Finally, a combination of legal frameworks, technological, and human-behaviour related measures is more likely to address online privacy issues effectively.

INTRODUCTION
E-retailing has generated many benefits to both
e-retailers and e-consumers. At the same time,
it has also raised serious problems for the operation of the online market, especially consumer
protection. Among several problems with online
shopping, privacy concerns are key factors which
discourage consumers from shopping online
(Stoney & Stoney, 2003).
These concerns have been addressed by a
number of measures at both the international and
national levels. However, insufficient information
about these measures and the outcomes of such
measures has been reported.
This chapter examines the current measures to
protect consumers' privacy in the online market,
using Australia as a case study; examines the
current state of e-consumer protection regarding
privacy; and discusses policy implications for the
protection of e-consumers' privacy.
This chapter consists of four main sections.
The first section introduces three main privacy
issues, namely data security, spam/spim, and
spyware. The second section examines several
measures implemented at the international and
national levels to address privacy issues. In Australia, these measures include (i) legislation, (ii)
guidelines, (iii) codes of practice, (iv) initiatives by
the private sector, and (v) activities by consumer
associations. The effectiveness of the current
measures to address privacy concerns has been
examined in the third section by analysing the
current state of e-consumer protection in terms of
privacy. This section also discusses a case study,
using Dell as a subject of investigation. The final
section discusses the policy implications.
The findings suggest that although legislation, guidelines, and codes of practice are available, the effectiveness of these measures is limited. Consumers are not confident about shopping online due to privacy and security concerns. Also, the protection of consumers' personal information depends on how e-retailers exercise their corporate social responsibility to provide protection to e-consumers.

The chapter aims to contribute to the development of theoretical understanding relating to regulations, guidelines, industry codes of conduct, and initiatives by the private sector to protect e-consumers' privacy. It also provides an insight into measures addressing privacy concerns and how these measures could be improved to enhance consumer confidence in the online market.

BACKGROUND

This section first discusses three sub-issues of concern in the protection of e-consumers' privacy. It then introduces the concept of consumer rights, and discusses justification for e-consumer protection. It also analyses the current framework for e-consumer protection regarding privacy.

Privacy Issues
Privacy is one of the key issues in e-consumer protection (Stoney & Stoney, 2003; Consumers International, 2001; Jackson, 2003; Kehoe, Pitkow, Sutton, Aggarwal, & Rogers, 1999). Internet users are very concerned about how their personal and financial data and medical history are collected, used, and disseminated (Consumers International, 2001; Jackson, 2003; Kehoe, Pitkow, Sutton, Aggarwal, & Rogers, 1999; Moghe, 2003). Many consumers are very reluctant to reveal their particulars because they do not want e-retailers to misuse their personal information. However, by adopting advanced technology, e-retailers can easily collect personal and financial details of e-consumers (Lynch, 1997). In addition, many Web sites request e-shoppers to register or accept cookies, which can help in tracking their Internet itinerary (Yianakos, 2002). Privacy risks can become a greater danger when e-retailers share common databases (Egger, 2002). To make things worse, many e-retailers have not published privacy policies on their Web sites (Consumer Affairs Victoria, 2003; Consumer Affairs Victoria, 2004). For example, only 28% of the Web sites investigated in the Internet Sweep Day 2001 had privacy policies (Australian Competition and Consumer Commission, 2003).
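To make the cookie-based tracking mentioned above concrete, the following minimal sketch (an illustration written for this chapter rather than a description of any particular e-retailer's system) shows how a persistent identifier cookie lets a Web server link successive page views by the same visitor into a browsing itinerary. The Flask framework, the cookie name visitor_id, and the page names are assumptions made purely for demonstration.

# Illustrative sketch only: a persistent cookie turns anonymous page views
# into a per-visitor browsing history (an "Internet itinerary").
import uuid
from collections import defaultdict

from flask import Flask, request, make_response  # assumes Flask is installed

app = Flask(__name__)
itineraries = defaultdict(list)  # maps visitor_id -> list of pages viewed


@app.route("/<page>")
def view(page):
    # Reuse the visitor's identifier if the browser sends it back;
    # otherwise mint a new one and ask the browser to store it.
    visitor_id = request.cookies.get("visitor_id") or uuid.uuid4().hex
    itineraries[visitor_id].append(page)  # the privacy-sensitive step

    resp = make_response(f"Pages viewed so far: {itineraries[visitor_id]}")
    resp.set_cookie("visitor_id", visitor_id)  # persists across requests
    return resp


if __name__ == "__main__":
    app.run()

Because the identifier is returned by the browser on every request, the server can accumulate a profile of the visitor without the visitor ever registering, which is precisely the kind of privacy risk described above.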
This chapter focuses on three sub-issues, namely data security, spam/spim, and spyware, affecting privacy due to their relevance to e-consumer protection. Firstly, data security refers to the security of personal and financial information during the collection, usage, transmission, and retention stages of e-transactions. Personal identity includes name, residential and postal address, driving license, date of birth, social security number, health card number, passport number, birth certificate, contact number, contact and place of employment, mother's maiden name, employee or student identification number, and e-mail address of the workplace (Lawson & Lawford, 2003; Milne, 2003). Identity can also be a username, password, cryptographic keys, physical devices such as dongles, swipe cards, or even biometric recognition (Marshall & Tompsett, 2005). Financial information includes bank account number, credit card number, password, personal identification number (PIN), and tax file number (Lawson & Lawford, 2003; Milne, 2003). Identity theft occurs when a customer's financial and personal information is illegally collected and used by unscrupulous retailers or unauthorised persons in order to impersonate another for personal gain or to commit a fraud (Grabosky, Smith, & Dempsey, 2001; Lawson & Lawford, 2003; Marshall & Tompsett, 2005; Milne, Rohm, & Bahl, 2004; Smith, 2004). It has been noted that more cases of identity theft have been committed via electronic means (Grabosky, Smith, & Dempsey, 2001; Ha, 2005; Metz, 2005).

Information can be unlawfully obtained during the transmission process. Employees, independent hackers, criminal individuals and organised crime rings, business competitors, saboteurs, and cyber terrorists are possible intruders (Crime and Misconduct Commission Queensland, 2004; Ha, 2005; Ha, 2006). Information can also be attained from a dishonest e-retailer or an online bank officer who illegally shares customers' information with others (Egger, 2002). For instance, the Financial Institution Online Fraud Survey in 2004 reported that 40% of bank officers shared passwords (Cyota, 2005). Information can also be illegally acquired during the retention stage, especially when the storage of data is connected to the Internet (Milloy, Fink, & Morris, 2002).

Other ways to unlawfully obtain information online are the use of computer viruses/worms, spyware, phreaking, smurfing, and account harvesting. Phreaking refers to the hacking of a telecommunication system to obtain free phone services (U.S. Senate Permanent Subcommittee on Investigations, 1986). Hacking refers to the illegal breaking into a computer system for various malicious purposes (Krone, 2005). Smurfing refers to the use of a smurf program to use Internet protocol and Internet control message protocol to send a request using a packet Internet gopher to an Internet host to test its response (Krone, 2005). Account harvesting refers to the collection of e-mail accounts from information in the public domain or by using spybots to search for e-mail addresses stored locally on a computer (Krone, 2005). However, this chapter does not examine these heavily technical forms of computer offences (Ha, 2007, unpublished).
Secondly, spam occurs as a result of personal e-mail addresses being divulged, and is thus a subset of privacy concerns. In this case, users' e-mail addresses are either stolen randomly from the Internet or are provided to other parties by e-retailers or authorised persons in their companies, and are then used for unsolicited commercial e-mails (UCE) or unsolicited bulk e-mails (UBE) (Quo, 2004).

Spam usually promotes prohibited or undesirable content which involves scams, hoaxes, deception, and advertisements of financial products, sexual sites, and unaccredited educational programs (Cheng, 2004; James & Murray, 2003; OECD, 2006). Spam takes a lot of computer resources (i.e., memory), and thus reduces productivity and annoys Internet users with unsolicited e-mails and pop-up advertisements. Spam spreads computer viruses and worms and wastes users' time to filter and delete junk messages (Australian Direct Marketing Association (ADMA), 2005; Cheng, 2004), causing huge financial and productivity losses to industry (Ha, 2007; MacRae, 2003). A new version of spam, called spim, targets instant messaging (IM) services and spreads undesirable content such as pornography (Australian Institute of Criminology, 2006).
Finally, spyware refers to software that is used to steal information related to identification, passwords, and PIN numbers during a transaction process (Australian Federal Police (AFP), 2007). It can monitor and gather sensitive financial or medical information from Internet users. It can also change users' browser settings or home pages, which may direct users to other Web sites (Majoras, Swindle, Leary, Harbour, & Leibowitz, 2005). However, many consumers do not have adequate knowledge about it (Zhang, 2005). Only 45.6% of respondents in a survey by Zhang (2005) in 2005 were aware that their PCs could be infected by spyware. Therefore, there is an urgent need to raise e-consumers' awareness of spyware and of how to deal with online privacy risks, and for societies and groups to establish and define new norms of behaviour (Kaufman, Edlund, Ford, & Powers, 2005).
Overall, e-consumers are not confident about e-retailers' practices regarding privacy protection. A survey by Roy Morgan Research (2004) reported that 81% of the respondents believed that customer details of a business are often transferred or sold to another business. Consumers only want to shop online and provide their personal information to e-retailers when they trust such e-retailers (Grabner-Kraeuter, 2002; Kim, Williams, & Lee, 2003; Mansoorian, 2006; Saarenpää & Tiainen, 2003; Walczuch, Seelen, & Lundgren, 2001). Hence, e-retailers need to keep customers' information confidential, and they are, by law, not allowed to use customers' personal information for other promotion activities without customers' prior consent (Ha, 2007, unpublished). The analysis shows that privacy is a problem of both technology and human behaviour (Yianakos, 2002). Thus, the priority task is to find the best set of motivators and inducements to deal with simple challenges which are made more complicated by the networked world (Yianakos, 2002).

Consumer Rights and Consumer Protection

The four basic consumer rights, introduced in 1962 by the then USA President John Kennedy, aim to empower consumers in commercial, social, and economic activities (Akenji, 2004; Consumers International, 2006). They are the rights to (i) safety, (ii) information, (iii) choice, and (iv) representation (Akenji, 2004; Consumers International, 2006).
These rights were adopted by international consumer organisations, and another four rights were added as a result of the development of the consumer movement led by Consumers International (USA) (Consumers International, 2006). The United Nations adopted the eight basic consumer rights in 1985, including the right to: (i) safety, (ii) be informed, (iii) choose, (iv) be heard, (v) satisfaction of basic needs, (vi) redress, (vii) consumer education, and (viii) a healthy environment (Akenji, 2004; Choice, 2006; Consumers International, 2006; Federal Bureau of Consumer Affairs (Australia), 1993; Huffmann, 2004; Mansor, 2003; NSW Office of Fair Trading, 2003; Singh, 2002).

These rights have been the basic foundation for countries to establish their own principles for consumer protection. However, these rights apply to consumers in both the online and off-line markets without distinction for e-consumers. According to these rights, e-consumers are entitled to be further protected against unsolicited communication ... invasion of privacy (Huffmann, 2004).

Therefore, consumer protection aims to protect the interest(s) of consumers in trade and commerce (Quirk & Forder, 2003). Consumer protection is one of the tools to implement the eight universal basic consumer rights. It refers to activities which police the market against acts and practices that distort the manner in which consumers make decisions in the marketplace (Muris, 2002).
The development of the online global marketplace has entailed a number of consumer protection issues in Australia which have been uniquely e-related, that is, some issues only occur in the online market (Round & Tustin, 2004). Consumer protection is important in e-retailing for a number of reasons. Firstly, there has been a lack of respect for [consumer] rights and their safety online, and many e-consumers do not know what kind of protection they have been entitled to receive, and how and by whom they have been protected (Scottish Consumer Council, 2001). This means the first consumer right, to safety, may not be met. Secondly, e-consumers seem to be disadvantaged in commercial transactions in terms of lack of choice in contracts, lack of understanding of terms and conditions, and lack of knowledge about alternative choices in an e-contract (Petty & Hamilton, 2004). The second and third consumer rights, to receive information and to choose, may be violated if these issues are not properly addressed. Consumers have to accept all terms and conditions, known as bundle consent, which usually includes the option for receiving further marketing information, if they want to proceed with an online purchase (Ha, 2007, unpublished). Thirdly, consumers have to take more risks when conducting online transactions because customers cannot contact e-retailers physically before making decisions to buy online. They also have to pay in advance and provide their personal and financial information (Huffmann, 2004). Nevertheless, they cannot be assured that their privacy is sufficiently protected. Consumers' rights to receive adequate protection in terms of privacy are not respected in practice.

The Current Measures to Protect E-Consumers' Privacy

Currently, privacy issues associated with consumer protection in e-retailing have been addressed by a mixture of measures at the international and national levels. This section briefly discusses legislation and guidelines introduced by national and supra-national organisations to protect consumers' privacy. It also analyses other measures which are employed by different organisations in both the public and private sectors to address privacy issues in e-retailing.

Table 1. Summary of international and national guidelines and legislation for consumer protection in terms of privacy (based on information in Harland, 1987; North American Consumer Project on Electronic Commerce, 2006)

Organisation: Guidelines and legislation
UN: Universal Declaration of Human Rights 1948; Guidelines for Consumer Protection 1985 (expanded in 1999)
OECD: Guidelines on the Protection of Privacy and Trans-border Flows of Personal Data 1980; Guidelines for Consumer Protection in the Context of Electronic Commerce 1999
EU: Data Protection Directive (EU) (95/46/EC)
APEC: APEC Voluntary Online Consumer Protection Guidelines 2002

Sources: Summarised from North American Consumer Project on Electronic Commerce (NACPEC). (2006). Internet Consumer Protection Policy Issues. Geneva: The Internet Governance Forum (IGF); and Harland, D. (1987). The United Nations Guidelines for Consumer Protection. Journal of Consumer Policy, 10(2), 245-266.



International Level

The following section discusses legislation and guidelines relating to consumer protection by the UN, the OECD, the EU, and APEC (Table 1).

The United Nations (UN)


According to Article 12 in the Universal Declaration of Human Rights proclaimed by the General
Assembly of the United Nations in 1948, everyone
has the right to be protected against interference
with his privacy (Australian Privacy Foundation
Inc, 2006; United Nations, 1948). This document
acknowledges the universal right of human beings
regarding privacy.
The UN Guidelines for Consumer Protection
1985, adopted on April 9, 1985 (Harland, 1987),
are an extension of the UN basic consumer rights
(see Appendix 1). These provide guidance for
governments of member countries to develop,
strengthen, and modify (if necessary) the legal
framework and policies related to consumer
protection in their countries. However, provision
for privacy protection is not mentioned in this
set of guidelines.

The Organisation for Economic Co-Operation and Development (OECD)
The OECD introduced the Guidelines Governing the Protection of Privacy and Trans-border
Flows of Personal Data in September 1980. This
set of guidelines provides a foundation for member countries to review their privacy legislation.
The OECD guidelines include five parts and 22
provisions which provide directions to stakeholders regarding the collection, use, transmission,
and retention of individual information (OECD,
2001b). The OECD guidelines also seek strong
support among all stakeholders within and among
countries to adopt these guidelines, and promote international co-operation. This aims to develop a healthy online market which can facilitate the production and distribution of goods and services globally, whereas individuals' privacy can
still be protected. OECD member countries are
encouraged to employ a wide range of measures
to deal with privacy incidents such as using (i)
market-based incentives and punishment (e.g.,
trust-marks and privacy seals) to encourage compliance with standards, (ii) technical measures,
(iii) self-regulatory approach (e.g., online privacy
policies), and (iv) online privacy-related dispute
resolution (OECD, 2003).
The OECD Guidelines for Consumer Protection in the Context of Electronic Commerce
were approved by member countries in 1999. This document covers many areas such as information disclosure, confirmation process and conclusion of contracts, security, privacy, dispute resolution, and redress, as shown in Appendix 1 (OECD, 2000; Smith, 2004). The seventh OECD guideline for e-consumer protection aims to protect e-consumers' privacy in accordance with the OECD
Guidelines Governing the Protection of Privacy
and Transborder Flow of Personal Data (1980).
This document also advises member countries
to improve measures to protect e-consumers
regarding privacy and other aspects, taking into
consideration the cross-border flow of information (OECD, 2000).
Similarly to the UN guidelines, both of these
OECD guidelines encourage private sector initiatives and call for strong and fruitful co-operation
among all stakeholders to achieve the common
objectives (OECD, 2000). These documents
emphasise consumer protection in general, and
privacy protection is only a sub-set of consumer
protection.

The European Union (EU)


The key legislation on e-consumer privacy protection in the European Union (EU) is the Data Protection Directive (Huffmann, 2004). This directive and self-regulatory privacy protection schemes aim to deal with spam
and other privacy incidents (Cheng, 2004). The
EU also acknowledged that sufficient consumer
protection in terms of privacy constitutes a fundamental right to consumers, and new areas of
protection had to be addressed so that the full
potential of e-retailing could be realised, and
both e-consumers and e-retailers could take full
advantage of the benefits which e-retailers could
offer (Buning, Hondius, Prins, & Vries, 2001).
However, Appendix 1 shows that EU Principles
of Consumer Protection do not cover the protection of privacy.

Asia Pacific Economic Co-Operation (APEC)
APEC introduced APEC Voluntary Online
Consumer Protection Guidelines in 2002 (APEC
Electronic Commerce Steering Group, 2004).
These guidelines state that consumers must
receive the same level of protection no matter
which forms of commerce they engage in. The
APEC guidelines include ten provisions relating
to international cooperation, education and awareness, private sector leadership, online advertising
and marketing, online information disclosure to
consumers, confirmation process, resolution of
consumer disputes, privacy, security, and choice
of law and jurisdiction as summarised in Appendix 1 (APEC Electronic Commerce Steering
Group, 2004; Consumer Protection Commission,
n.d.). The content of these guidelines is similar to
the UN and the OECD guidelines, and generally
consistent with the eight basic consumer rights.
In brief, a number of guidelines and legislation have been introduced by the UN, the OECD,
the EU, and APEC to create a favourable online
environment, and to protect consumers in general, and their privacy in particular. Yet, these
guidelines have not been compulsory, and there
have been no clear powers or mechanisms to
enforce them. These guidelines aim to complement existing national legal frameworks for e-consumer protection rather than overriding or
replacing them (Ha, 2006). Although the level
of compliance may vary among individuals and
organisations, depending on what roles they play,
the ultimate objective of these documents is to
create a safe and secure online environment for
all players. At the international level, there has
been no measure to deal with member countries
which have not complied with these guidelines
for whatever reasons. Non-uniform regulations
and standards in different countries have made the
protection of e-consumers' privacy more difficult
and challenging due to ambiguous jurisdictional
applications and difficulties in enforcement. Thus,
different countries can adopt these guidelines to
review and develop their current privacy policy
on e-consumer protection (Harland, 1999).

Australia
In Australia, privacy issues had already been a concern of the public and relevant authorities even before the introduction of e-retailing. Privacy concerns have been tackled by several measures, including (i) legislation, (ii) guidelines, (iii) codes of practice, (iv) initiatives by the private sector, and (v) activities by consumer associations, as summarised in Table 2.

This section discusses the current policy framework for protecting e-consumers' privacy and the institutional arrangements in Australia.

Legislation

Privacy protection is based on two main mechanisms: (i) general laws that regulate the collection, use, and dissemination of personal data by both the public and private sectors, and (ii) different acts (Moulinos, Iliadis, & Tsoumas, 2004).
Table 2. The current regulatory framework to address privacy concerns in Australia (based on information in Ha, 2007, unpublished thesis)

Policy framework: Activities
Legislation: Privacy Act 1988 (public sector) (Cth = Commonwealth); Privacy Amendment (Private Sector) Act 2000 (Cth); Telecommunications (Interception and Access) Act 1979 (Cth); Spam Act 2003 (Cth)
Guidelines: Scams and Spam booklet and fact-sheets published by the ACCC and the CAV, respectively
Industry codes of practice: Codes by the Australian Direct Marketing Association (ADMA) and the Australian Internet Industry Association (IIA)
Initiatives by the private sector: Use of privacy and digital seals and trust marks provided by TRUSTe, WebTrust, BBBOnline, and BetterWeb; use of the Platform for Privacy Preferences
Activities by consumer associations: Activities by the Australian Consumers' Association (ACA) and the Australian Privacy Foundation (APF)

Source: Ha, H. (2007). Governance to Address Consumer Protection in E-retailing (unpublished thesis). Department of Management, Monash University.

Table 2 shows that there are three main acts regarding privacy: the Privacy Act, the Telecommunications Act, and the Spam Act. Firstly, the Privacy
Act 1988 (Cth) is applicable to the public sector
regarding handling personal information (Curtis,
2005b; Jackson, 2003). This Act requires government agencies to gather personal information in a
legitimate and proper way. These agencies must
transparently disclose to the recipients about
what will be collected and how the information
will be used (Jackson, 2003; Vasiu, Warren, &
Mackay, 2002). Government bodies must ensure
that personal information is recorded accurately
and stored securely. They have to explain to individuals about the nature of the information
and why the information has been collected as
well as allow individuals to make changes if the
information is not correct (Jackson, 2003).
However, this Act has been criticised by the EU
on four main grounds. Firstly, it excludes many
organisations such as the private sector from the
coverage of the Act. Secondly, it introduces an
opt out scheme which permits organisations
to use personal data for direct marketing without obtaining the prior consent of the recipient.
Thirdly, it only covers data collected and used for
other purposes rather than the primary purpose.
An organisation that collects and uses personal data for the primary purpose for which it was collected is not regulated under this Act (Vasiu, Warren, & Mackay, 2002). Finally, the Act only
protects the privacy of information collected from
Australian citizens or permanent residents, not
from foreigners even if they are residing in Australia at the time of the offence (Vasiu, Warren,
& Mackay, 2002).
The Privacy Act 1988 (Cth), amended in 2000,
became the Privacy Amendment (Private Sector) Act 2000 (Cth) (Expert Group on Electronic
Commerce (Australia), 2003). This amended Act
extended to the private sector and privatised government corporations (Jackson, 2003). Schedule
3 in the Privacy Amendment (Private Sector) Act
2000 (Cth) includes the ten National Principles
for the Fair Handling of Personal Information
(NPPs) which do not apply to sub-national public
sector or government business enterprises (GBEs)
that perform substantially core government
functions (Information Law Branch, n.d.). The
10 national privacy principles are: (i) collection,
(ii) use and disclosure, (iii) data quality, (iv) data
security, (v) openness, (vi) access and correction,
(vii) identifiers, (viii) anonymity, (ix) transborder
data flows, and (x) sensitive information (Privacy
Commissioner (Australia), 2000). At the minimum
level, businesses must adhere to these 10 NPPs
(Jackson, 2003; Moghe, 2003; Privacy Commissioner (Australia), 2000). Paradoxically, the right of individuals to privacy is said to obstruct the investigation of activities of cyber criminals such
as hackers, money launderers, drug traffickers,
terrorists, and fraudsters, and thus also hinder law
enforcement (Armstrong & Forde, 2003).
Nevertheless, the amended Act is not applicable
to small businesses with annual revenue less than
$3 million, the media industry or political parties and representatives (Jackson, 2003). These
organisations can choose to adopt a privacy code
established by an industry which must be approved
by the privacy commissioner (Moghe, 2003).
Small firms are not covered by the Act, and thus
many e-retailers may avoid liability for the misuse
of personal information or the abuse of customers'
particulars. Neither do these national privacy acts
regulate sub-national agencies, with an exception
for the Australian Capital Territory (Clark, 1989;
Privacy Commissioner (Australia), 2003; Privacy
Commissioner (Australia), 2003, n.d.).
The current legislation may not be effective in
tackling privacy concerns due to legal loopholes.
The Privacy Act 1988 (Cth) and the Privacy
Amendment Act 2000 (Cth) are not applicable to
overseas spammers and international e-retailers
that do not buy and sell personal information. An
e-mail address without an individual's name may
not be regarded as personal information under the
Privacy Act. If a spammer gives the receivers a
chance to opt out or to comply with such request,
he/she is not under any legal liability (National
Office for the Information Economy (Australia),
2003). Furthermore, it is not easy to track the
senders due to the anonymous and global nature of
electronic messages. Although national legislation
is in place, these acts do not cover all individuals
and groups. This allows some e-retailers to avoid protecting their customers' privacy, given the loopholes in the current privacy legislation.
Regarding institutional arrangements, the Federal Office of Privacy Commissioner administers
the Privacy Act 1988 (Cth). Each sub-national
jurisdiction also has an office of the Privacy
Commissioner. For example, the Office of the Victorian Privacy Commissioner (Privacy Victoria, or OVPC), established in 2001 by the Information
Privacy Act 2000 (Vic), performs various functions including advising the attorney-general and
relevant agencies on the privacy implications of
proposals for policy and legislation, revising the
Guidelines to the information privacy principles
(IPPs), investigating and enforcing breaches of
the IPPs (Privacy Commissioner (Australia),
2006a).
The second relevant Australian Act is the
Telecommunications (Interception and Access)
Act 1979 (Cth), amended in 2006, which specifies
the responsibilities of Internet service providers
in terms of the usage and disclosure of customers' information. This Act encourages relevant
industries to develop codes of practice and other
internal schemes in order to address consumer
protection regarding privacy (Moghe, 2003).
Another act is the Australian Spam Act 2003
(Cth) which prohibits the sending of unsolicited
commercial electronic messages which have an
Australian link and the use of address-harvesting
software (Quo, 2004). It is unlawful to send any
spam, either by mobile phones or by e-mail, from
Australia and/or to an Australian address from
overseas (Williams, 2004). This act extends to
every external territory and also applies to matters outside Australia if there is no contrary intention (see Spam Act 2003 (Cth) ss). Yet, messages
from government agencies, registered political
parties, charities, and religious organisations are
exempted from this act. There are two conditions
of the exemption: (i) messages can be sent in
mass but these messages must communicate the
information about goods or services and (ii) the
senders must be the providers of such products
(Ha, 2007, unpublished). Messages with merely
factual content and with no commercial content
are also exempted, but these messages must contain precise identifying information (Australian
Communications and Media Authority, 2005).
Also, messages sent by e-retailers to consumers with whom they have existing relationships will not be classified as spam (Cheng, 2004). This Act introduces an opt-in principle which allows the
consent to be implicit or explicit (Vaile, 2004).
This Act does not cover non-electronic messages
and, therefore, it has a narrower application
than legislation by the UK (Cheng, 2004). On the
other hand, the Spam Act 2003 (Cth) has a wider application than the EU Privacy Directive and the
US Can-Spam Act 2003 (US) because this Act
is applicable to spam originating from outside
Australia (Cheng, 2004).

Guidelines
The Treasury (Australia) published the Australian
Guidelines for Electronic Commerce (AGEC) in
March 2006, which has become one of the most
important documents promoting best practice
in e-retailing (The Australian Guidelines for
Electronic Commerce (AGEC) 2006 replaces the
Australian Best Practice Model (BPM) introduced
in 2000). The AGEC consists of 14 main provisions (see Table 3).

Table 3. The Australian guidelines for electronic commerce 2006 (AGEC) (based on information in Treasury (Australia), 2006)

1. Fair business practices
2. Accessibility
3. Disability access
4. Advertising and marketing
5. Engaging with minors
6. Information - identification of the business
7. Information - contractual
8. Conclusion of contract
9. Privacy
10. Payment
11. Security and authentication
12. Internal complaint-handling
13. External dispute resolution
14. Applicable law and forum

Source: Summarised from Treasury (Australia) (2006). The Australian Guidelines for Electronic Commerce (March 2006). Canberra: Treasury (Australia).



Guideline 9, Items 37 and 38 of the AGEC indicate that consumers' privacy must be respected
and protected. The AGEC encourages small
businesses, which are not under the scope of the
Privacy Act 1988 (Cth), to comply with privacy
legislation so that they can enhance consumer
trust and confidence (Treasury (Australia), 2006).
Nevertheless, some e-retailers do not want to adopt
government guidelines (Ha, 2007).
In 2004, the Australian Competition and Consumer Commission published a Scams and Spam
booklet and other educational material to inform
consumers about the types and the adverse effect
of scams and spam, and how to avoid scams and
spam (Australian Competition and Consumer
Commission, 2004). National enforcement of
consumer protection laws is undertaken by the
ACCC. Established in 1995, it acts independently
of ministerial direction as a national statutory
body to administer the implementation of the Trade Practices Act (TPA). The main function of the ACCC is to advance the interests of Australian consumers by
promoting fair, dynamic and lawful competition
among all kinds of businesses. The ACCC has
continuously advocated consumer rights and has
conciliated many complaints related to online
purchase (Graeme, 2005; OECD, 2001a).
Other countries, such as the USA, have enacted
a number of pieces of legislation to deal with
privacy issues in general. For example, the U.S.
Can-Spam Act deals with offences relating to
spam e-mails (Cheng, 2004; OECD, 2003). The
state of California (U.S.) has enacted the Anti-Phishing Act of 2005 (Business and Professions
Code sections 22948-22948.3) and Computer
Spyware 2004 (Business and Professions Code
section 22947) legislation, which prohibit
phishing activities and illegal installation or
provision of software that can collect personal
information of the recipients without their knowledge and/or consent (State of California, 2007).
The Online Privacy Protection Act of 2003
(Business and Professions Code sections 22575-22579), which came into effect in July 2004, requires e-retailers to post a privacy policy on the site and to comply with its policy (State of
California, 2007).
However, only a few countries such as Canada
have designed principles especially for consumer
protection in e-retailing. Canada introduced eight
principles of e-consumer protection in August
1999 (Working Group on Electronic Commerce
and Consumers (Canada), 1999) as shown in Appendix 2. Principles 3, 4, and 7 require e-retailers
to respect customers' privacy and ensure that e-transactions are secure. E-retailers should not send commercial e-mails to consumers without consumers' prior consent. This principle aims to
address the first basic consumer right to safety.
In 2004, Canada developed the Canadian Code
of Practice for Consumer Protection in Electronic Commerce, based on the eight principles
of consumer protection. The Code has been
warmly welcomed by other stakeholders (Working
Group on Electronic Commerce and Consumers
(Canada), 2004).

Industry Codes of Practice


Consumers' privacy is also protected by industry
codes of practice. The two best known industry
associations in Australia are the Australian Direct
Marketing Association (ADMA) (a non-profit organisation), and the Australian Internet Industry
Association (IIA) (a national industry organisation) (Australian Direct Marketing Association
(ADMA), 2005; Coonan, 2005; Quo, 2004). Both
devote their activities to advance the interests of
their members and the community as well as to
reinforce consumer confidence (Australian Direct
Marketing Association (ADMA), 2005; Internet
Industry Association (Australian), 2006). These
two organisations have established codes regarding privacy and spam, standards, and mechanisms
to ensure that their members comply with the laws
regarding consumer protection (Clarke, 1998;
Privacy Commissioner (Australia), 2006b).

Firstly, the ADMA has developed a code of practice which established a self-regulatory mechanism for its members to comply with privacy
policies and to avoid spam, after consultation with
the relevant authorities, the ACCC and consumer
and business groups (Quo, 2004; Scott, 2004). The
ADMA has also appointed an independent code
authority to monitor the compliance of its members
with the Code of Practice. The code authority has
the authority to sanction offenders and the penalty
may be the termination of membership.
The second code of practice (2000), introduced by the Australian Internet Industry Association (IIA), discourages its members and code subscribers from using spam as one of their marketing tools, with an exception in the case of pre-existing relationships (acquaintance spam) (Quo, 2004). Both sets of codes of practice mainly
apply to their members and code subscribers, but
non-member organisations are welcome to adopt
these codes.
Different from many countries which have
not had government-endorsed codes of conduct,
some industry associations in Australia have
lodged codes of practice with the relevant authorities
for approval (Chan, 2003; OECD, 2003). For instance, the ADMA received conditional authorisation of its direct marketing code of practice from
the ACCC (Australian Competition and Consumer
Commission, 1999), whereas the IIA Content
Regulation Code of Practice (Version 10.4) has
been registered with and administered by the
Australian Communications and Media Authority (Forder, 1999; Internet Industry Association
(Australia), 2006). Yet, insufficient information
about how these codes improve the protection of
consumers' privacy has been reported.

Initiatives from the Private Sector


The private sector has adopted a self-regulatory approach to address the protection of consumers' privacy by engaging other firms providing services related to audits of privacy policies and privacy seals, which could assure that the participating companies adhere to their policies (Egger, 2002).
Some examples of such services are TRUSTe or
WebTrust, BBBOnline Privacy Program, and
BetterWeb (Egger, 2002; Milloy, Fink, & Morris,
2002; Moulinos et al., 2004). The use of security
locks, security and privacy statements, and certificates can increase e-consumer confidence which
can, in turn, increase their purchase intention
(Milloy, Fink, & Morris, 2002).
However, Moulinos et al. (2004) argued that
there have been many factors affecting the efficiency and acceptability of digital seals used
by companies. These factors include the technology related to security, the brand name of the
companies issuing the digital seal, and the legal
framework surrounding online privacy. Yet, these
privacy seals do not offer any legal protection
because they only measure the extent to which
e-retailers conform to their promises. In addition, some seals are recognised in one country
but not in other countries, while others may not have any value even in the same country (Egger, 2002). In many cases, consumers place their trust more in a seal of their local Consumers' Association than a seal endorsed by other organisations (Egger, 2002). Nevertheless, less than half of the respondents in a survey conducted by Moores (2005) could recognise a privacy seal, and even those who could recognise a privacy seal might not know
how to identify whether a seal was genuine or
counterfeit.
New technologies developed by the private sector also offer alternative solutions. The Platform for Privacy Preferences (P3P) allows users to compare a Web site's privacy policies with their own preferences (Yianakos, 2002). This enables users to select the Web sites which match their expectations, and thus they should only do business with such Web sites. The P3P is supplementary to legislative and self-regulatory mechanisms to help in the review and enforcement of Web site policies (Moghe, 2003). Other means to counter privacy incidents include the use of digital cash (cyber cash or Internet cash, which does not require users to reveal their personal and financial information), authentication, filters, and anti-phishing and anti-pharming systems (Milloy, Fink, & Morris, 2002; OECD, 2006).
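To illustrate the kind of automated comparison that P3P is meant to support, the short sketch below (an illustration written for this chapter, not a description of any particular P3P implementation) fetches the compact policy tokens that a Web server may advertise in its P3P HTTP response header and checks them against a user's own list of unacceptable tokens. The example URL and the chosen tokens are assumptions for demonstration purposes; token meanings are defined in the W3C P3P compact-policy vocabulary.

# Illustrative sketch only: compare a site's P3P compact-policy tokens
# (sent in the "P3P" HTTP response header) with a user's own preferences.
import urllib.request

# Hypothetical user preference: reject sites whose compact policy contains
# any of these recipient tokens (meanings per the W3C compact-policy set).
UNACCEPTABLE_TOKENS = {"SAM", "OTR", "UNR"}


def site_matches_preferences(url: str) -> bool:
    """Return True if the site's P3P compact policy contains no unacceptable tokens."""
    with urllib.request.urlopen(url) as response:
        p3p_header = response.headers.get("P3P", "")
    # A typical header looks like: P3P: CP="CAO PSA OUR"
    # (simplified parsing; a real user agent would parse the header properly).
    tokens = set(p3p_header.replace('CP="', "").replace('"', "").split())
    return not (tokens & UNACCEPTABLE_TOKENS)


if __name__ == "__main__":
    # Example URL is an assumption; many sites no longer send P3P headers.
    print(site_matches_preferences("http://www.example.com/"))

A user agent performing such a check could warn the user, or decline to accept cookies, whenever a site's declared practices conflict with the user's stated preferences.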

Activities by Consumer Associations


The Australian Privacy Foundation (APF), established in 1987, is a non-government voluntary
organisation (Australian Privacy Foundation,
2006). The APF claims that its main objective is to
protect privacy rights of Australians via different
means such as provision of education to increase
public awareness, and advocacy of new legislation
and codes of practice regarding privacy (Australian Privacy Foundation, 2005). One of the main
activities of the APF is to organise campaigns
against privacy threats caused by the adoption
of the ID card scheme (Davies, 2004).
The Australian Consumers' Association
(ACA) is a non-profit and key consumer association (Brown, 1996). It advocates consumer
rights in both the online and offline markets. It
provides the public with advice on different kinds
of goods and services and represents consumers
(Federal Bureau of Consumer Affairs (Australia),
1995). The ACA has also advocated a review of
privacy legislation and other legislation regarding consumer protection (Australian Consumers
Association, 2004).
Overall, although several measures described
above have been used to address e-consumer protection regarding privacy, insufficient information
about formal evaluation of such measures has
been published. The following section provides statistics relating to privacy incidents.

MAIN THRUST: THE CURRENT STATE OF E-CONSUMER PROTECTION REGARDING PRIVACY

Privacy Incidents

International Level
About 62 billion spam e-mails are sent every day
worldwide (Sullivan, 2007). Spam accounted for
93% of all e-mail traffic which was monitored by
Postini, an Internet security firm, in February
2007 (Haymarket Media, 2007).
A study by Teo (2002) in 2002 reported that
43.9% of Singaporean respondents showed concerns about privacy when they shopped online,
whereas Udo's study (2001) reported that 55.1% of U.S.
respondents ranked privacy concerns number one.
According to PC Magazine (2005), 20% of the
respondents in a recent survey were victims of
identity theft, and 64% of U.S. consumers would
not buy online because of concern over personal
information.
Consumers International and the Trans Atlantic Consumer Dialogue (TACD) conducted
an international online survey in 2004 in which
21,000 people from more than 36 countries were
asked about spam. Forty-two percent of them replied that spam e-mails accounted for more than
50% of their e-mails, whereas 84% of respondents
welcomed the prohibition of all unsolicited e-mails
(Consumers International, 2004).

National Level
In Australia, 62% of Australian respondents in a survey conducted by Roy Morgan Research in 2004 worried about privacy concerns (Roy Morgan Research, 2004). Australia loses more than $1.1 billion per year to identity fraud (The Age, 2006). The National Australia Bank loses $1 million per month due to Internet fraud, and this amount is expected to reach $30 million per year by 2008 (Lekakis, 2005).
The findings from surveys conducted by Coalition against Unsolicited Bulk E-mail (Australia)
(2002) revealed that the average number of spam
per e-mail address was 118 and 140 in 1999 and
2001 respectively, an increase of 18.6% in Australia. Spam costs Australia about $2 billion a
year (Quo, 2004).
The highest number of computer-related offences reported to Victoria Police was in 1998-1999, and the number of offences decreased from
2000-2001 to 2002-2003. The most common
offences reported to Victoria Police related to
illegal access to computer systems which can be
classified as security and/or privacy incidents.
Official statistics might not include all the cases
occurring because the police could only deal
with matters that were reported to them (Drugs
and Crime Prevention Committee (Parliament of
Victoria), 2004). In addition, many consumers
did not bother to complain as the cost of redress
might outweigh the claim value (Donahey, 2003;
Patel & Lindley, 2001). Most e-consumers have
been very reluctant to report their problems. This
is evidenced by a low rate of reporting attacks on
computers to police since up to 65% of victims
of e-attacks did not report to law enforcement in
Australia in 2005 (Krone, 2006).
Generally, the statistics about complaints are
fragmented. Some e-consumers have lodged a
complaint with the ACCC or Consumer Affairs
Victoria, whereas others might report their cases
to the police. However, these figures can provide
an overview picture of the current state of e-consumer protection regarding privacy in Australia,
and worldwide, that is, the number of complaints
relating to online shopping has increased.

Consumers' Attitudes towards Online Privacy Issues

A recent study by Ha (2007, unpublished) provides an insight into the current state of e-consumer protection regarding privacy. According to this study, the majority of the respondents (80%) were
aware of different devices and means used on the
Internet, such as cookies, computer bugs, viruses,
spyware, and adware, which could reveal their
personal data. Nearly three-quarters of them (73%)
were worried about the amount of spam they received. This is consistent with a survey conducted
by the NOIE in 2000 which reported that 77% of
the respondents would not shop online because
of privacy concerns (Consumer Affairs Victoria,
2004). The Australian National Office for the Information Economy (NOIE) was replaced by the
Australian Government Information Management
Office (AGIMO) in 2004 (Williams, 2004).
Only 49% of these respondents agreed on the sufficiency, accuracy, and ease of locating information about privacy on commercial Web sites. As discussed previously, the current Privacy Act (Australia) does not apply to small businesses with annual revenue of less than A$3 million (Privacy Commissioner
(Australia), n.d.). Thus, small companies are not
required to post any information about privacy.
Most e-retailers are small and new, with limited security skills and budgets (Centeno, 2002), and this may explain the low percentage of the respondents who agreed on the adequacy and precision of information about privacy. In this case, the lack of privacy regulation applying to
small businesses is one of the weaknesses of the
current policy framework for consumer protection in e-retailing.
Finally, only 54% of the respondents agreed
that they knew how to handle issues relating to
privacy. This means nearly half of them might not
know how to deal with online privacy incidents
(Ha, 2007, unpublished).

How Does a Well-Known E-Retailer Protect the Privacy of Its Customers?

This section illustrates how a well-known e-retailer provides protection to its customers by discussing and evaluating Dell's policies and practices regarding privacy. Dell has been chosen because of its unique direct e-business model and its success as a computer e-retailer operating in many countries, including Australia (Bangeman, 2006).

Policies and Practices


Dell has posted a privacy policy on its Web site, as required by the privacy legislation in Australia. Dell has also employed several self-regulatory measures to protect the privacy of its customers. One of Dell's activities is the launch of an online essential portal which aims to educate consumers to protect their own privacy. This portal has been developed by Dell in conjunction with the National Consumers League (National Consumers League (USA), n.d.). Dell has also worked with other organisations to launch several campaigns to enhance public awareness of issues associated with online shopping. For instance, Dell and the Internet Education Foundation (www.neted.org) jointly launched the consumer spyware initiative (CSI) public awareness campaign in 2004 (Claburn, 2004).

Dell's Web site provides information about what types of personal information will be collected, and how customers' particulars will be used and stored (Dell Inc. Australia, 2004). Dell's privacy policy assures customers that their personal data will not be revealed to third parties without their prior written consent (Dell Inc. Australia, 2005). According to Dell's policy, customers have the right to opt out from the marketing list. Dell also requests its employees to protect the confidentiality of information about the relationship between Dell and its customers and other stakeholders. Consumers can update or correct their personal information online or by contacting Dell (Dell Inc. Australia, 2004). Customers can find a physical address and an online feedback platform on Dell's Web site, and they can make queries to Dell about privacy issues.


Dell is a member of the U.S.-based Word of Mouth Marketing Association (WOMMA), and was the first corporate subscriber to have publicly committed itself to the code of ethics introduced by WOMMA in 2006 (Word of Mouth Marketing Association, 2006). In addition, Dell is a member of the Australian Internet Industry Association (IIA), the Electronic Industry Code of Conduct (USA), and the BBB OnLine Privacy Program (BBB Online, 2003). Dell's employees must provide accurate and sufficient information to customers, and protect the privacy of both internal and external customers. Dell's employees who do not act in accordance with Dell's policy are liable to discipline and/or civil and/or criminal penalties (Dell Inc. Australia, 2007).

Evaluation
Dell does not fully comply with the regulation in that it does not provide sufficient information regarding identifiers, anonymity, cross-border data flows, and sensitive information as required by the NPPs (see the second section). This implies that Dell has not fully respected consumers' safety (the first basic consumer right) regarding privacy (see the second section). Also, the contact number and the physical address through which consumers can communicate any privacy concern to Dell are in Singapore, and the name of an individual in charge of privacy at Dell is not indicated. This shows that Dell fails to meet the accepted standard of information disclosure (Clayton, 2000; Ha, 2007, unpublished).
Finally, there has been insufficient formal evaluation of how Dell's codes of conduct improve the level of privacy protection. Also, most of Dell's collaboration with industry and consumer associations has taken place in the USA, not in Australia. Furthermore, insufficient information about how Dell has worked with industry and consumer associations in Australia has been reported.
Generally, companies registered in Australia, except for small businesses, have to comply with the current privacy legislation. In addition, guidelines and industry codes of practice are only advisory, not compulsory, whilst the activities of industry and consumer associations are limited. Thus, the protection of consumers' privacy depends much on the willingness and ability of companies to practice corporate social responsibility (CSR) and adopt self-regulatory measures to protect their customers' privacy.
The data presented above demonstrate that the current measures to address privacy issues associated with online shopping may not be effective without the willingness and ability of e-retailers to protect consumers' personal data. The case study also shows that even a well-known e-retailer does not fully comply with the prevailing privacy legislation to protect its customers' privacy, much less voluntarily go beyond the minimum requirements of the laws.

POLICY IMPLICATIONS FOR E-CONSUMER PROTECTION IN TERMS OF PRIVACY

There are a number of policy implications for the protection of e-consumers regarding privacy.
The first implication refers to the coverage of
the current national legislation regarding small
businesses. Most e-retailers are small and they
do not post sufficient information about privacy
on their Web sites. The failure to provide sufficient information regarding privacy is difficult
to reconcile with good standards of consumer
protection. Thus, the current privacy legislation
will be reviewed in order to widen its coverage
and ensure all e-retailers receive equal treatment
in terms of compliance with privacy legislation
(Ruddock, 2006).
The second implication refers to the harmonisation of regulations, standards, and guidelines. Most international guidelines call for the voluntary adoption of good business practice by e-retailers to protect e-consumers' privacy, given the special nature of the online market (Curtis, 2005a; Dudley, 2002; Gilliams, 2003; Lahey, 2005). However, e-consumers perceive that self-regulation means no standards; that is, e-consumers will receive different levels of protection from different e-retailers. Also, different laws and standards across jurisdictions regarding privacy may adversely affect the compliance of e-retailers and the effectiveness of law enforcement. Thus, uniform regulations, guidelines, and CSR can contribute to addressing jurisdiction concerns and enabling consumers to receive an equal level of privacy protection.
The third implication refers to the review of the enforcement mechanisms of both legislation and guidelines regarding privacy. The AGEC and industry codes of practice are only advisory, not mandatory. Such guidelines and codes cannot be effective unless backed by legislation and a compliance regime, as there is little incentive for e-retailers to voluntarily comply with guidelines and/or codes of conduct (Ha, 2007, unpublished; Mayer, 2002).
The fourth implication refers to consumer education. Given consumers' lack of awareness regarding identity theft and spyware, educational programs provided by government agencies, industry and consumer associations would increase e-consumers' awareness of the importance of keeping personal information confidential. Such education programs could aim to equip the public with knowledge about how to recognise online threats and risks as well as how to avoid online privacy incidents. In addition, some e-retailers are willing to comply with regulations and guidelines, but they do not have sufficient means to do so (Ha, 2007, unpublished). Such e-retailers may not know what information should be posted, and how much information would be sufficient. Thus, education would be helpful to such e-retailers, and industry associations could be the most appropriate candidates to provide it.
Another implication refers to the use of a combination of legal, human, and technical measures to address privacy issues more effectively, as Yianakos (2002) comments that privacy is a problem of both technology and behaviour. Thus, a combination of a legal framework (e.g., legislation, guidelines, and codes of practice), technological measures (e.g., digital seals and certificates), and human behaviour related measures (e.g., education) is desirable to improve the protection of consumers' privacy in the online market.
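By way of a concrete illustration of the technological strand of such a combined approach, the minimal sketch below (a hypothetical example using Python's standard library, not a tool described in this chapter) retrieves and inspects a Web site's TLS certificate, one of the building blocks behind the digital certificates and trust seals mentioned above. The host name is a placeholder.

import socket
import ssl

# Placeholder host name; not a retailer discussed in the chapter.
host = "www.example.com"

# create_default_context() verifies the certificate chain against the system's
# trusted certificate authorities and checks the host name.
context = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

# getpeercert() returns subject/issuer fields as nested tuples; flatten them.
subject = dict(item for pair in cert["subject"] for item in pair)
issuer = dict(item for pair in cert["issuer"] for item in pair)
print("Issued to:", subject.get("commonName"))
print("Issued by:", issuer.get("commonName"))
print("Valid until:", cert["notAfter"])

A consumer-education program could, for instance, explain that this is roughly what a browser does before displaying the padlock icon that many trust seals rely on.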
The final implication refers to the effect of acts enacted by other countries, which have great impacts on global business and privacy protection. For example, the U.S. Fair Information Practice Principles lay the foundation for the formulation of privacy laws by supra-national organisations and many countries (State of California, 2007). The Safe Harbor Privacy Framework, which was approved by the EU in 2000, facilitates the compliance of U.S. companies with the European Directive on Data Protection (1998) and other relevant European privacy laws. Other countries which have a low benchmark for the adequacy of privacy policy may have to improve their privacy standards to meet the EU's standards in order to prevent any interruptions in commercial transactions, for example when Australian companies do business in EU countries (Allens Arthur Robinson, 2007; Greenleaf, 2000b, p. 1). However, the Attorney General (Australia) argued that Australian privacy legislation has gone significantly further than the US Safe Harbor Agreement (Allens Arthur Robinson, 2007), although there have still been loopholes in the amended privacy act (Greenleaf, 2000a). The impacts of privacy acts enacted by supra-national organisations and other countries on international trade should be a subject of further research.

CONCLUSION

This chapter has examined privacy issues associated with consumer protection in the online market, including data security, spam/spim, and spyware, and the policy framework for addressing online privacy incidents at both international and national levels.
Although international and national policies
to address privacy issues are in place, the effectiveness of the current policy framework has not
been formally evaluated. However, the number of
online privacy incidents has steadily increased,
and most consumers do not know how to deal with
such incidents. The findings reveal that a single
organisation or a single measure is not adequate
to address the complicated and challenging issues
associated with online privacy. A joint effort of
all stakeholders and adoption of a combination of
different measures would be desirable to protect
e-consumers' privacy more effectively.
Given the lack of studies on online consumer
protection in terms of privacy, further research
on online privacy will certainly contribute to
the development of a theoretical framework and
practical approaches to solving persistent problems
with e-consumer protection regarding privacy.

The exposure of loopholes in the current privacy legislation has led the Australian government
to review it (Ruddock, 2006). This confirms the
potential for more research on the extent to which
new legislation could deter undesirable behaviour
relating to privacy.
The cross-border and transient nature of e-retailing justifies more research on how legislation and guidelines could be adopted at a supra-national level to more effectively prevent the abuse or illegal use of e-customers' particulars.
In addition, the limited formal evaluation of privacy protection measures in e-retailing suggests that they should be further investigated.
Finally, security and privacy issues are interrelated, because a lack of security measures may lead to the unwanted disclosure of customers' personal and financial information. Addressing any one of these issues separately is insufficient to ensure that consumers' interests are fully protected. Thus, they must be investigated as an integrated problem and addressed simultaneously.

FUTURE RESEARCH DIRECTIONS

Further research could usefully focus on the motivations and behaviour of consumers in protecting themselves against privacy incidents in the online marketplace, and on technical measures which can contribute to addressing online privacy concerns.
Privacy issues associated with e-retailing are caused by a combination of legality, technology, and human behaviour, requiring different measures by different groups of stakeholders to address them effectively. Thus, future research could also focus on whether cooperation among all stakeholders at all levels (international, national, and sub-national) could address online privacy incidents more effectively, and on whether greater consumer vigilance and self-help efforts could contribute to addressing privacy concerns.

REFERENCES

Akenji, L. (2004). The eight basic consumer rights.


Retrieved November 8, 2006, from http://www.
tudatosvasarlo.hu/english/article/print/254
Allens Arthur Robinson. (2007). International
data flow. Retrieved October 2, 2007, from www.
aar.com.au/privacy/over/data.htm
APEC Electronic Commerce Steering Group.
(2004). APEC voluntary online consumer protection guidelines. Retrieved March 1, 2005, from
http://www.export.gov/apececommerce/cp/guidelines.htm
Armstrong, H. L., & Forde, P. G. (2003). Internet
anonymity practices in computer crime. Information Management & Computer Security, 11(5),
209-215.


Australian Communications and Media Authority. (2005). Anti-spam - fighting spam in Australia: Consumer information. Retrieved July 22, 2005,
from http://www.acma.gov.au/ACMAINTER.2163012:STANDARD:848603301:pc=PC_
1965#anti%20spam%20law
Australian Competition and Consumer Commission. (1999). ACCC conditionally authorises
ADMA code of practice. Retrieved March 27,
2007, from http://www.accc.gov.au/content/index.
phtml/itemId/322914/fromItemId/621589
Australian Competition and Consumer Commission. (2003). Review of building consumer
sovereignty in electronic commerce (best practice
model). Retrieved November 11, 2006, from http://
www.ecommerce.treasury.gov.au/bpmreview/
content/_download/submissions/accc.rtf
Australian Competition and Consumer Commission. (2004). Annual report 2003-2004 - fostering competitive, efficient, fair and informed
Australian markets. Canberra, ACT: Australian
Competition and Consumer Commission.
Australian Consumers Association. (2004).
Submission to the review of the private sector
provisions of the privacy act 1988 (Cth) (the
privacy act). Sydney, NSW: Office of the Privacy
Commissioner (Australia).
Australian Direct Marketing Association
(ADMA). (2005). ADMA profile. Retrieved
August 17, 2005, from http://www.adma.com.
au/asp/index.asp?pgid=2026
Australian Federal Police (AFP). (2007). Internet
fraud. Retrieved March 16, 2007, from http://www.
afp.gov.au/national/e-crime/internet_scams
Australian Institute of Criminology. (2006). More
malware - adware, spyware, spam and spim. High
Tech Crime Brief, 1(2006), 1-2.
Australian Privacy Foundation. (2005). Rule 3 - objectives and purposes. Retrieved April 2, 2007, from http://www.privacy.org.au/About/Objectives.html


Australian Privacy Foundation. (2006a). Identity
checks for pre-paid mobile phones. Retrieved
April 2, 2007, from http://www.acma.gov.au/
webwr/_assets/main/lib100696/apf.pdf
Australian Privacy Foundation. (2006b). International instruments relating to privacy law.
Retrieved February 23, 2007, from http://www.
privacy.org.au/Resources/PLawsIntl.html
Bangeman, E. (2006). Dell growth rate slips
behind market. Retrieved July 20, 2006, from
http://arstechnica.com/news.ars/post/200604206640.html
BBB Online. (2003). Dispute resolution. Retrieved
July 19, 2006, from http://www.bbbonline.org/
privacy/dr.asp
Brown, J. (1996). Australia and the modern consumer movement. A History of the Australian
Consumer Movement (pp. 1-6). Braddon, ACT:
Consumers Federation of Australia.
Buning, M. D. C., Hondius, E., Prins, C., & Vries, M. D. (2001). Consumer@Protection.EU.
An analysis of European consumer legislation in the information society. Journal of Consumer Policy,
24(3/4), 287-338.
Centeno, C. (2002). Building security and consumer trust in internet payments: The potential
of soft measures. Seville, Spain: Institute for
Prospective Technological Studies.
Chan, P. (2003, September). The practical effect
of privacy laws on the global business and global
consumer. Paper presented at the 25th International Conference of Data Protection and Privacy
Commissioners, Sydney, NSW.
Cheng, T. S. L. (2004). Spam regulation: Recent
international attempts to can spam. Computer
Law & Security Report, 20(6), 472-479.


Choice. (2006). The eight basic consumer rights.


Retrieved November 5, 2006, from http://www.
choice.com.au/viewArticle.aspx?id=100736&cat
Id=100528&tid=100008&p=1&title=The+eight+
basic+consumer+rights

Consumer Protection Commission, E. Y. T.


(undated). E-commerce: APEC voluntary online
consumer protection guidelines. Retrieved April
3, 2007, from http://www.cpc.gov.tw/en/index.
asp?pagenumber=25

Claburn, T. (2004). Dell believes education is


best way to fight spyware. InformationWeek,
October 20. Retrieved September 30, from
http://www.informationweek.com/showArticle.
jhtml;jsessionid=GHVMAU4IX1LXGQSNDLO
SKHSCJUNN2JVN?articleID=50900097&quer
yText=Dell+Believes+Education+Is+Best+Way+
To+Fight+Spyware

Consumers International. (2001). Should I buy?


Shopping online 2001: An international comparative study of electronic commerce. London:
Consumers International.

Clarke, R. (1989). The Australian privacy act 1988


as an implementation of the OECD data protection guidelines. Retrieved March 27, 2007, from
http://www.anu.edu.au/people/Roger.Clarke/DV/
PActOECD.html
Clarke, R. (1998). Direct marketing and privacy.
Retrieved March 24, 2007, from http://www.anu.
edu.au/people/Roger.Clarke/DV/DirectMkting.
html
Clayton, G. (2000). Privacy evaluation: Dell.
Retrieved July 20, 2006, from http://www.informationweek.com/privacy/dell.htm
Coalition Against Unsolicited Bulk Email (Australia). (2002). Spam volume statistics. Retrieved
June 2, 2007, from http://www.caube.org.au/spamstats.html
Consumer Affairs Victoria. (2003). Commonwealth website guidelines ignored. Retrieved
November 16, 2006, from http://www.consumer.
vic.gov.au/CA256F2B00231FE5/Print/C3DCD
CFFC3DBD8EECA256F54000412C4?OpenD
ocument
Consumer Affairs Victoria. (2004). Online shopping and consumer protection. Discussion paper,
Melbourne, Victoria: Standing Committee of
Officials of Consumer Affairs - E-commerce
Working Party, Consumer Affairs Victoria.

Consumers International. (2004). Annual report


2004. London: Consumers International.
Consumers International. (2006). World consumer
rights day. Retrieved November 7, 2006, from
http://www.consumersinternational.org/Templates/Internal.asp?NodeID=95043&int1stParent
NodeID=89651&int2ndParentNodeID=90145
Coonan, H. (2005, February). 10 years on. 10 years
strong. The internet in Australia. Paper presented
at the 2005 Internet Industry Association Annual
Dinner, Sydney, NSW.
Crime and Misconduct Commission Queensland.
(2004). Cyber traps - an overview of crime,
misconduct and security risks in the cyber environment. Queensland: Crime and Misconduct
Commission.
Curtis, K. (2005a, September). The importance of
self-regulation in the implementation of data protection principles: the Australian private sector
experience. Paper presented at the 27th International Conference of Data Protection and Privacy
Commissioners, Montreux, Switzerland.
Curtis, K. (2005b, March). Privacy in practice.
Paper presented at the Centre for Continuing Legal
Education, University of NSW, Sydney.
Cyota. (2005). Cyota online fraud survey. Retrieved April 7, 2006, from http://www.cyota.
com/press-releases.asp?id=78
Davies, S. (2004, February). The loose cannon: An overview of campaigns of opposition to national identity card proposals. Paper presented at the


Unisys seminar: e-ID: Securing the mobility
of citizens and commerce in a Greater Europe,
Nice.
Dell Inc. Australia. (2004). Dell's privacy policy.
Retrieved March 2, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/
en/privacy?c=au&l=en&s=gen
Dell Inc. Australia. (2005). Dell's online policies.
Retrieved March 28, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/
en/au/termsau?c=au&l=en&s=gen
Dell Inc. Australia. (2007). Online communication
policy. Retrieved June 5, 2007, from http://www.
dell.com/content/topics/global.aspx/corp/governance/en/online_comm?c=us&l=en&s=corp
Department of Economic and Social Affairs (UN).
(2003). United nations guidelines for consumer
protection (as expanded in 1999). New York:
United Nations.
Donahey, M. S. (2003). The UDRP model applied
to online consumer transactions. Journal of International Arbitration, 20(5), 475-491.
Drugs and Crime Prevention Committee (Parliament of Victoria). (2004). Inquiry into fraud and
electronic commerce - final report. Melbourne,
Victoria: Parliament of Victoria.
Egger, F. N. (2002). Consumer trust in e-commerce: From psychology to interaction design. In J.
E. J. Prins, P. M. A. Ribbers, H. C. A. van Tilborg,
A. F. L. Veth, & J. G. L. van der Wees (Eds.), Trust
in electronic commerce: The role of trust from a legal, an organizational and a technical point of view (pp. 11-43). The Hague/London/New York:
Kluwer Law International.
European Commission. (2005). Consumer protection in the European Union: Ten basic principles.
Brussels: European Commission.



Expert Group on Electronic Commerce (Australia). (2003). Review of building consumer sovereignty in electronic commerce: A best practice
model for business. Canberra, ACT: Treasury
(Australia).
Federal Bureau of Consumer Affairs (Australia).
(1993). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Federal Bureau of Consumer Affairs (Australia).
(1995). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Forder, J. (1999). The IIA code of practice: Co-regulation of the internet starts here. Retrieved
March 31, 2007, from http://epublications.bond.
edu.au/law pubs/38
Fraud costing Australia $1.1b a Year. (2006, April
7). The Age.
Gilliams, H. (2003, October). Self regulation by
liberal professions and the competition rules.
Paper presented at the Regulation of Professional
Services Conference organized by the European
Commission, Brussels.
Grabner-Kraeuter, S. (2002). The role of consumers' trust in online shopping. Journal of Business
Ethics, 39(1-2), 43-50.
Grabosky, P., Smith, R. G., & Dempsey, G. (2001).
Electronic theft: Unlawful acquisition in cyberspace. Cambridge and New York: Cambridge
University Press.
Graeme, S. (2005). 30 years of protecting consumers and promoting competition. Keeping Good
Companies, 57(1 (Feb)), 38-41.
Greenleaf, G. (2000a). Private sector bill amendments ignore EU problems. Retrieved October
2007, from http://www.austlii.edu.au/au/journals/
PLPR/2000/30.html


Greenleaf, G. (2000b). Safe harbor's low benchmark for adequacy: EU sells out privacy for
U.S.$. Retrieved October 2, 2007, from www.
austlii.edu.au/au/journals/PLPR/2000/32.html
Ha, H. (2005, October). Consumer protection in
business-to-consumer e-commerce in Victoria,
Australia. Paper presented at the CACS 2005
Oceania Conference, Perth, WA.
Ha, H. (2006, September-October). Security issues
and consumer protection in business-to-consumer
e-commerce in Australia. Paper presented at
the 2nd Australasian Business and Behavioural
Sciences Association International Conference:
Industry, Market, and Regions, Adelaide.
Ha, H. (2007). Governance to address consumer
protection in e-retailing. Unpublished doctoral
thesis, Monash University, Melbourne, Victoria.
Harland, D. (1987). The United Nations guidelines
for consumer protection. Journal of Consumer
Policy, 10(2), 245-266.
Harland, D. (1999). The consumer in the globalised
information society - the impact of the international organisations. Australian Competition and
Consumer Law Journal, 7(1999), 23.
Haymarket Media. (2007). Spam hits record levels in February. Retrieved March
20, 2007, from http://www.crn.com.au/story.
aspx?CIID=75798&r=rss
Huffmann, H. (2004). Consumer protection in
e-commerce. University of Cape Town, Cape
Town.
Information Law Branch. (undated). Information
paper on the introduction of the privacy amendment (private sector) bill 2000. Barton, ACT:
Attorney General's Department (Australia).
Internet Industry Association (Australia). (2006a).
Content code. Retrieved March 31, 2007, from
http://www.iia.net.au/index.php?option=com_c

ontent&task=category&sectionid=3&id=19&It
emid=33
Internet Industry Association (Australia).
(2006b). About the IIA. Retrieved March
24, 2007, from http://www.iia.net.au/index.
php?option=com_content&task=section&id=7
&Itemid=38
Jackson, M. (2003). Internet privacy. Telecommunications Journal of Australia, 53(2), 21-31.
James, M. L., & Murray, B. E. (2003). Computer
crime and compromised commerce (Research
Note No. 6). Canberra, ACT: Department of the
Parliamentary Library.
Kaufman, J. H., Edlund, S., Ford, D. A., & Powers,
C. (2005). The social contract core. Electronic
Commerce Research, 5(1), 141-165.
Kehoe, C., Pitkow, J., Sutton, K., Aggarwal, G., &
Rogers, J. D. (1999). Results of GVU's tenth world
wide web user survey. Retrieved November 16,
2006, from http://www.gvu.gatech.edu/user_surveys/survey-1998-10/tenthreport.html
Kim, S., Williams, R., & Lee, Y. (2003). Attitude
toward online shopping and retail website quality: A comparison of US and Korean consumers.
Journal of International Consumer Marketing,
16(1), 89-111.
Krone, T. (2005). Concepts and terms. Canberra:
The Australian High Tech Crime Centre.
Krone, T. (2006). Gaps in cyberspace can leave
us vulnerable. Platypus Magazine, 90 (March
2006), 31-36.
Lahey, K. (2005, August 30). Red tape on a roll...
and it must stop. The Age, 8.
Lawson, P., & Lawford, J. (2003). Identity theft:
The need for better consumer protection. Ottawa:
The Public Interest Advocacy Centre.
Lekakis, G. (2005). Computer crime: The Australian facts and figures. Retrieved April 7, 2007, from http://www.crime-research.org/news/19.07.2005/1373/
Lynch, E. (1997). Protecting consumers in the
cybermarket. OECD Observer, 208(Oct/Nov),
11-15.
MacRae, P. (2003). Avoiding eternal spamnation.
Chatswood, NSW: Australian Telecommunications Users Group Limited (ATUG).
Majoras, D. P., Swindle, O., Leary, T. B., Harbour,
P. J., & Leibowitz, J. (2005). The US SAFE WEB
Act: Protecting consumers from spam, spyware,
and fraud. A legislative recommendation to
congress. Washington D.C.: Federal Trade Commission (US).
Mansoorian, A. (2006). Measuring factors for
increasing trust of people in e-transactions. Lulea,
Sweden: Lulea University of Technology.
Mansor, P. (2003, April-May). Consumer interests
in global standards. Paper presented at the Global
Standards Collaboration (GSC) 8 - User Working
Group Session, Ottawa.
Marshall, A. M., & Tompsett, B. C. (2005). Identity theft in an online world. Computer Law &
Security Report, 21, 128-137.
Mayer, R. N. (2002). Shopping from a list: International studies of consumer online experiences.
Journal of Consumer Affairs, 36(1), 115-126.
Metz, C. (2005, August 23). Identity theft is out
of control. PC Magazine, 87-88.
Milloy, M., Fink, D., & Morris, R. (2002, June).
Modelling online security and privacy to increase
consumer purchasing intent. Paper presented at
the Informing Science + IT Education Conference, Ireland.
Milne, G. R. (2003). How well do consumers protect themselves from identity theft? The Journal
of Consumer Affairs, 37(2), 388-402.



Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-232.
Moghe, V. (2003). Privacy management - a new
era in the Australian business environment. Information Management & Computer Security,
11(2), 60-66.
Moores, T. (2005). Do consumers understand the
role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.
Moulinos, K., Iliadis, J., & Tsoumas, V. (2004).
Towards secure sealing of privacy policies. Information Management & Computer Security,
12(4), 350-361.
Muris, T. J. M. (2002, October). The interface of
competition and consumer protection. Paper presented at the Fordham Corporate Law Institutes
Twenty-Ninth Annual Conference on International Antitrust Law and Policy, New York.
National Consumers League (USA). (n.d.). Essentials for online privacy. Retrieved June 25,
2007, from http://www.nclnet.org/technology/essentials/privacy.html
National Office for the Information Economy
(Australia). (2003). Spam - final report of the
NOIE review of the spam problem and how it
can be countered. Canberra, ACT: Department
of Communication, Information Technology and
the Arts.
North American Consumer Project on Electronic
Commerce (NACPEC). (2006). Internet consumer
protection policy issues. Geneva: The Internet
Governance Forum (IGF).
NSW Office of Fair Trading. (2003). International
consumer rights: The world view on international
consumer rights. Retrieved November 15, 2006,
from http://www.fairtrading.nsw.gov.au/shopping/shoppingtips/internationalconsumerrights.
html


OECD. (2000). OECD Guidelines for consumer


protection in the context of electronic commerce.
Paris: OECD.
OECD. (2001a). Australia - annual report on
consumer policy development 2001. Retrieved
March 11, 2005, from http://www.oecd.org/dataoecd/33/45/1955404.pdf
OECD. (2001b). OECD guidelines on the protection of privacy and transborder flows of personal
data. Paris: OECD.
OECD. (2003). Report on compliance with, and
enforcement of, privacy protection online. Paris:
OECD.
OECD. (2006). Protecting consumers from cyberfraud. Paris: OECD.
Patel, A., & Lindley, A. (2001). Resolving online
disputes: Not worth the bother? Consumer Policy
Review, 11(1 (Jan/Feb)), 2-5.
PC Magazine. (2005). The perils of online shopping. PC Magazine, 24(14), 23.
Petty, R. D., & Hamilton, J. (2004). Seeking a
single policy for contractual fairness to consumers: A comparison of U.S. and E.U efforts. The
Journal of Consumer Affairs, 38(1), 146-166.
Privacy Commissioner (Australia), O. o. (2000).
National privacy principles (extracted from the
privacy amendment (private sector) act 2000).
Retrieved July 19, 2006, from http://www.privacy.
gov.au/publications/npps01.html
Privacy Commissioner (Australia), O. o. (2003).
National privacy principle 7 - identifiers in the health sector. Sydney: Office of the Privacy Commissioner (Australia).
Privacy Commissioner (Australia), O. o. (2006a).
Annual report 2005-2006. Melbourne, Victoria:
Office of the Victorian Privacy Commissioner.
Privacy Commissioner (Australia), O. o. (2006b).
Industry standard for the making of telemarket-

ing calls. Sydney, NSW: Office of the Privacy


Commissioner (Australia).
Privacy Commissioner (Australia), O. o. (n.d.).
State & territory privacy laws. Retrieved August
2, 2005, from http://www.privacy.gov.au/privacy_rights/laws/index.html#2
Quirk, P., & Forder, J. (2003). Electronic commerce and the law (2nd ed.). Queensland: John
Wiley & Sons Australia, Ltd.
Quo, S. (2004). Spam: Private and legislative
responses to unsolicited electronic mail in Australia and the United States. ELawMurchdoch
University, 11(1).
Round, D. K., & Tustin, J. (2004, September).
Consumers as international traders: Some potential information issues for consumer protection
regulators. Paper presented at the International
Trade Law Conference, Attorney-General's Department, Canberra, ACT.
Roy Morgan Research. (2004). Community attitudes towards privacy 2004. Sydney, NSW:
The Office of the Federal Privacy Commissioner
(Australia).
Ruddock, P. (2006). Australian law reform commission to review privacy act. Retrieved June 7,
2007, from http://www.ag.gov.au/agd/WWW/
MinisterRuddockHome.nsf/Page/Media_Releases_2006_First_Quarter_31_January_2006__Australian_Law_Reform_Commission_to_review_Privacy_Act_-_0062006#
Saarenpää, T., & Tiainen, T. (2003). Consumers and e-commerce in information system studies. In M. Hannula, A.-M. Järvelin, & M. Seppä (Eds.),
Frontiers of e-business research: 2003 (pp. 62-76).
Tampere: Tampere University of Technology and
University of Tampere.
Scott, C. (2004). Regulatory innovation and the
online consumer. Law & Policy, 26(3-4), 477-506.




Scottish Consumer Council. (2001). E-commerce


and consumer protection: Consumers' real
needs in a virtual world. Glasgow: Scottish
Consumer Council.
Singh, B. (2002). Consumer education on consumer rights and responsibilities, code of conduct
for ethical business, importance of product labelling. Kuala Lumpur: Consumers International.
Smith, L. (2004). Global online shopping: How
well protected is the Australian consumer? Australian Competition and Consumer Law Journal,
12(2), 163-190.
State of California. (2007). Privacy laws. Retrieved September 26, 2007, from http://www.
privacy.ca.gov/lawenforcement/laws.htm
Stoney, M. A. S., & Stoney, S. (2003). The problems
of jurisdiction to e-commerce - some suggested
strategies. Logistics Information Management,
16(1), 74-80.
Sullivan, B. (2007). Spam is back, and worse
than ever. Retrieved March 20, 2007, from
http://redtape.msnbc.com/2007/01/spam_is_
back_an.html
Teo, T. S. H. (2002). Attitudes toward online shopping and the internet. Behaviour & Information
Technology, 21(4), 259-271.
Treasury (Australia). (2006). The Australian
guidelines for electronic commerce (March 2006).
Canberra, ACT: Treasury (Australia)
Udo, G. J. (2001). Privacy and security concerns
as major barriers for e-commerce: A survey study.
Information Management & Computer Security,
9(4), 165-174.
United Nations. (1948). Universal declaration of
human rights. New York: United Nations.
U.S. Senate Permanent Subcommittee on Investigations. (1996). Security in cyberspace. Retrieved
April 3, 2007, from http://www.fas.org/irp/congress/1996_hr/s960605t.htm



Vaile, D. (2004). Spam canned - new laws for


Australia. Internet Law Bulletin, 6(9), 113-115.
Vasiu, L., Warren, M., & Mackay, D. (2002, December). Personal information privacy issues in
B2C e-commerce: a theoretical framework. Paper
presented at the 7th Annual CollECTeR Conference on Electronic Commerce (CollECTeR02),
Melbourne, Victoria
Walczuch, R., Seelen, J., & Lundgren, H. (2001,
September). Psychological determinants for
consumer trust in e-retailing. Paper presented
at the Eighth Research Symposium on Emerging
Electronic Markets (RSEEM01), Maastricht, The
Netherlands.
Williams, D. (2004, 27 Feb). Business guides to
combat spam. Retrieved August 2, 2005, from
http://www.agimo.gov.au/media/2004/02/12070.
html
Williams, D. (2004, 10 March). Maximising the
benefits of the information economy. Retrieved
August 2, 2005, from http://www.agimo.gov.
au/media/2004/03/21377.html
Word of Mouth Marketing Association. (2006,
13 Nov). Dell makes public commitment to word
of mouth ethics. Retrieved January 5, 2007, from
http://www.womma.org/womnibus/007895.php
Working Group on Electronic Commerce and
Consumers (Canada). (1999). Principles of consumer protection for electronic commerce - a
Canadian framework. Ottawa: Canada Bankers
Association.
Working Group on Electronic Commerce and
Consumers (Canada). (2004). Canadian code of
practice for consumer protection in electronic
commerce. Ottawa: Office of Consumer Affairs,
Industry Canada.
Yianakos, C. (2002). Nameless in cyberspace - protecting online privacy. B+FS, 116(6), 48-49.


Zhang, X. (2005). What do consumers really


know? Communications of the ACM, 48(8), 44-48.

Acts
Anti-Phishing Act 2005 (California)
CAN-SPAM Act 2003 (USA)
Computer Spyware Act 2004 (California)
Online Privacy Protection Act 2003 (California)
Privacy Act 1988 (Public sector) (Cth)
Privacy Amendment (Private Sector) Act 2000
(Cth) sch 3
Spam Act 2003 (Cth)
Telecommunications (Interception and Access)
Act 1979 (Cth)

ADDITIONAL READING
Iannuzzi, A., Jr. (2002). Industry self-regulation
and voluntary environmental compliance. Boca
Raton: Lewis Publishers.
Ang, P. H. (2001). The role of self-regulation of
privacy and the internet. Journal of Interactive
Advertising, 1(2), 1-11.
Australian Privacy Foundation. (2006). Identity
checks for pre-paid mobile phones. Retrieved
April 2, 2007, from http://www.acma.gov.au/
webwr/_assets/main/lib100696/apf.pdf
Business for Social Responsibility. (2005, April). Privacy (consumer and employee).
Retrieved December 8, 2006, from http://
www.bsr.org/CSRResources/IssueBriefDetail.
cfm?DocumentID=50970
Chung, W. C., & Paynter, J. (2002, January).
Privacy issues on the internet. Paper presented
at The 35th Hawaii International Conference on
System Sciences, Hawaii.

Clayton, G. (2000). Privacy evaluation: Dell.


Retrieved July 20, 2006, from http://www.informationweek.com/privacy/dell.htm
Coonan, H. (2005, November). Self-regulation and
the challenge of new technologies. Paper presented
at the AMTA Annual general Meeting Luncheon,
Sheraton on the Park, Sydney, NSW.
Drake, W. J. (2004). ICT global governance and the
public interest: Transactions and content issues.
Geneva, Switzerland: Computer Professionals for
Social Responsibility.
Drezner, D. W. (2004). The global governance of
the internet: Bringing the state back in. Political
Science Quarterly, 119(3), 477-498.
Gunningham, N., & Rees, J. (1997). Industry
self-regulation: An institutional perspective. Law
& Policy, 19(4), 363-414.
Ha, H. (2006). Regulation. In Encyclopedia of world poverty (Vol. 2, pp. 903-906). Thousand Oaks, London, New Delhi: Sage Publications.
Lacey, D., & Cuganesan, S. (2004). The role of
organizations in identity theft response: The organization-individual victim dynamic. Journal
of Consumer Affairs, 38(2), 244-261.
Linnhoff, S., & Langenderfer, J. (2004). Identity
theft legislation: The fair and accurate credit
transactions act of 2003 and the road not taken.
Journal of Consumer Affairs, 38(2), 204-216.
Marshall, A. M., & Tompsett, B. C. (2005). Identity theft in an online world. Computer Law &
Security Report, 21, 128-137.
OECD. (2001). OECD guidelines on the protection of privacy and transborder flows of personal
data. Paris: OECD.
OECD. (2006). Report on the implementation of
the 2003 OECD guidelines for protecting consumers from fraudulent and deceptive commercial
practices across borders. Paris: OECD.




Office of Communications (UK). (2006). Online


protection: A survey of consumer, industry and
regulatory mechanisms and systems. London:
Office of Communications (UK).
O'Neill, B. (2001). Online shopping: Consumer
protection and regulation. Consumer Interests
Annual, 47, 1-5.
Price, M. E., & Verhulst, S. G. (2004). Self regulation and the internet. Alphen aan den Rijn, The
Netherlands: Kluwer Law International.
Schwartz, P. M. (1999). Privacy and democracy in cyberspace. Vanderbilt Law Review, 52, 1609-1702.
Simpson, S., & Wilkinson, R. (2003, September). Governing e-commerce: Prospects and problems. Paper presented at the 31st Telecommunications Policy Research Conference, Communication, Information and Internet Policy, National Centre for Technology and Law, George Mason University, School of Law, Arlington, VA.
Stafford, M. R. (2004). Identity theft: Laws,
crimes, and victims. Journal of Consumer Affairs,
38(2), 201-203.
Sylvan, L. (2002, September). Self-regulation - who's in charge here? Paper presented at the
Australian Institute of Criminology Conference
on Current Issues in Regulation: Enforcement
and Compliance, Melbourne, Victoria.
Sylvan, L. (2004, September). Issues for consumers in global trading. Paper presented at the 26th
International Trade Law Conference, Rydges
Lakeside, Canberra, ACT.

APPENDIX A

Principles and Guidelines of Consumer Protection by the United Nations (UN), Organisation for Economic Co-operation and Development (OECD), European Union (EU), and Asia Pacific Economic Cooperation (APEC) (based on information in Department of Economic and Social Affairs (UN), 2003; OECD, 2000; European Commission, 2005; Consumer Protection Commission, E. Y. T., n.d.)
Table 1A.

No. | UN (a) | OECD (b) | EU (c) | APEC (d)
1 | Physical safety | Transparent and effective protection | Buy what you want, where you want | International cooperation
2 | Promotion and protection of consumers' economic interests | Fair business, advertising, and marketing practices | If it does not work, send it back | Education and awareness
3 | Standards for the safety and quality of consumer goods and services | Online disclosures: information about the business, the goods or services, the transaction | High safety standards for food and other consumer goods | Private sector leadership
4 | Distribution facilities for essential consumer goods and services | Confirmation process | Know what you are eating | Online advertising and marketing
5 | Measures enabling consumers to obtain redress | Payment | Contracts should be fair to consumers | Online information disclosure to consumers
6 | Education and information programs | Dispute resolution and redress | Sometimes consumers can change their mind | Confirmation process
7 | Promotion of sustainable consumption | Privacy | Making it easier to compare prices | Resolution of consumer disputes
8 | Measures relating to specific areas | Education and awareness | Consumers should not be misled | Privacy
9 | - | - | Protection while you are on holiday | Security
10 | - | - | Effective redress for cross-border disputes | Choice of law and jurisdiction

Sources: (a) Department of Economic and Social Affairs (UN). (2003). United Nations Guidelines for Consumer Protection
(as expanded in 1999). New York: United Nations.
(b) OECD. (2000). Guidelines for Consumer Protection in the Context of Electronic Commerce. Paris: OECD.
(c) European Commission. (2005). Consumer Protection in the European Union: Ten Basic Principles. Brussels: European
Commission.
(d) Consumer Protection Commission, E. Y. T. (undated). E-Commerce: APEC Voluntary Online Consumer Protection Guidelines.
Consumer Protection Commission, Executive Yuan (Taiwan). Retrieved April 3, 2007, from http://www.cpc.gov.tw/en/index.
asp?Pagenumber=25

APPENDIX B

Summary of Eight Principles of Consumer Protection in Canada (based on information in Working Group on Electronic Commerce and Consumers (Canada), 1999)

Table 2B.

No. | Principles
1 | Information provision
2 | Contract formation
3 | Privacy
4 | Security of payment and personal information
5 | Redress
6 | Liability
7 | Unsolicited commercial e-mail
8 | Consumer awareness

Sources: Working Group on Electronic Commerce and Consumers (Canada). (1999). Principles of Consumer Protection for Electronic Commerce - A Canadian Framework. Ottawa: Canada Bankers Association.




Chapter VII

Antecedents of Online Privacy Protection Behavior:
Towards an Integrative Model

Anil Gurung
Neumann College, USA
Anurag Jain
Salem State College, USA

ABSTRACT
Individuals are generally concerned about their privacy and may withhold from disclosing their personal
information while interacting with online vendors. Withholding personal information can prevent online
vendors from developing profiles to match needs and wants. Through a literature review of research on
online privacy, we develop an integrative framework of online privacy protection.

INTRODUCTION
The latest report on e-commerce by the U.S. Census Bureau (2007) shows that although there has been an increase in online purchasing by individuals, the share of consumer e-commerce (online sales) in total retail sales is far smaller than the share of electronic business-to-business sales in total business-to-business sales. One of the factors that may be influencing this online consumer behavior is the privacy concerns that consumers have regarding the personal data collection procedures used by online companies. An individual's trust in online companies and their data collection procedures has been the major factor hindering the growth of electronic commerce (Belanger, Hiller, & Smith, 2002; Liu, Marchewka, Lu, & Yu, 2004).
Companies use consumer data to study consumer preferences so that they can build effective strategies to expand their customer base. Emergent technologies and organizational practices in gathering data raise privacy concerns. Such technologies include the use of cookies, authentication programs, spyware, and adware. The growth of technologies to collect information about consumers may only fuel consumers' privacy concerns. Companies have realized that protecting consumers' private information is an essential component in winning the trust of consumers and is a must in facilitating business transactions (Belanger et al., 2002; McKnight & Chervany, 2001). Privacy policies that inform the consumer about how the collected information will be used are usually posted on Web sites. However, there is not enough evidence to prove whether or not these policies are effective in alleviating consumers' privacy concerns. In the absence of any strong mechanisms, technologies, or policies that ensure information privacy, consumers adopt different strategies for their privacy protection. Such strategies may include, for instance, abstaining from purchasing, falsifying information, and adjusting security and privacy settings in their Web browsers (Chen & Rea, 2004).
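To make the data-collection mechanism concrete, the short sketch below (a hypothetical illustration, not drawn from the studies reviewed in this chapter) parses the kind of persistent tracking cookie an online vendor might set and shows the programmatic counterpart of one protection strategy, refusing to retain long-lived identifiers; the cookie name, value, and domain are invented.

from http.cookies import SimpleCookie

# Hypothetical Set-Cookie header an online vendor might send to recognise a
# returning visitor and build a profile across visits.
raw_header = "visitor_id=a1b2c3d4; Max-Age=31536000; Path=/; Domain=.example-shop.com"

cookie = SimpleCookie()
cookie.load(raw_header)

for name, morsel in cookie.items():
    # A Max-Age of one year makes this a persistent identifier rather than a
    # session cookie; Domain scopes it to every page on the vendor's site.
    print(name, "persists for", morsel["max-age"], "seconds on", morsel["domain"])

# One self-help strategy discussed above: keep only session cookies and drop
# anything with a long lifetime, as a restrictive browser setting would.
session_only = {n: m for n, m in cookie.items() if not m["max-age"]}
print("Cookies retained under a session-only policy:", list(session_only) or "none")

Adjusting browser privacy settings, falsifying profile data, or declining to purchase are behavioral analogues of the same idea: limiting the persistent identifiers and personal details a vendor can accumulate.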
In this chapter, we review the existing literature
and analyze the existing online privacy theories,
frameworks, and models. Through the analysis
of the literature, we aim to understand existing
privacy frameworks and variables that are used in
the context of online privacy protection. Finally,
based on the review, we develop an integrative
framework to encapsulate the antecedents to
online privacy protection behavior.
The motivation for this study is to understand
the factors that are related to online privacy
protection. Although this topic has been studied



in other disciplines, such as marketing (e.g., Sheehan & Hoy, 1999), the literature review
shows that research on privacy is fragmented.
The proposed integrative framework aims to
integrate these fragmented yet related constructs
under one overarching concept. This will help us
in expanding our understanding of the various
issues involved in online privacy. Specifically,
we focus on what has been done in privacy protection and how future studies in this area can
proceed forward.

BACKGROUND
Research has shown that privacy concerns act as
a hindrance to the growth of electronic commerce
(Hoffman, Novak, & Peralta, 1999; Miyazaki &
Fernandez, 2001). In countering privacy concerns,
the Federal Trade Commission has primarily relied
upon fair information practices to guide privacy
regulation in the United States (Milne, 2000).
Fair information practices include the following:
notice of the firm's information practices regarding what personal information will be collected
and how the collected information will be used;
choice or consent regarding the secondary use
of the information; accessibility of users to view
their own data collected by companies; security
of the collected data; and enforcement to ensure
that companies comply with fair information
practices.
Research shows that fair information practices have not been effective in alleviating the
privacy concerns of consumers (Culnan, 2000).
In the absence of stricter laws to ensure privacy,
consumers adopt differing strategies to protect
their identity online, for instance, falsification,
passive reaction, and identity modification (e.g.,
Sheehan & Hoy, 1999). For the purpose of this
chapter, the strategies adopted by consumers to
protect their identity are defined under the general
term of privacy protection behavior in an
online environment.

Main Research Questions

How do information
privacy concerns affect the
growth and development
of consumer-oriented
commercial activity on the
internet?

What actions are taken by


online consumers in response
to their privacy concerns?

What is the extent of online


retailer disclosures of various
privacy and security related
practices?

What is the extent to


which consumer-oriented
commercial Web sites post
disclosures that describe their
information practices and
whether these disclosures
reflect fair information
practices?

Develop a privacy research


framework that highlights
key dimensions of the
information interaction
between marketers and
consumers.

What are the underlying


factors of online privacy
concerns?

Authors &
Year

(Hoffman et
al., 1999)

(Sheehan &
Hoy, 1999)

(Miyazaki &
Fernandez,
2000)

(Culnan,
2000)

(Milne, 2000)

(Sheehan &
Hoy, 2000)
Survey

Conceptual

Survey

Survey

889

361

128

Awareness
of collection,
information
usage, sensitivity
of information,
familiarity, and
compensation

Marketer information
strategy

Personal identifying
information

Privacy related
statements,
security-related
statements, consumer
perceptions

Privacy concerns,
situational contexts

889

Survey

IV

Information privacy,
environmental
control, and
secondary use of
information control

Conceptual

Method

Consumer
information
behavior

Information practices

Online disclosures and


information practices

Fair information practices

Online disclosures

Purchase
likelihood

Information
practice
and privacy
policy
disclosures

There is some correlation between privacy


concerns and consumer complaining
behavior such as flaming, complaining,
or abstaining from participating in online
activities. The most frequently adopted
complaining behavior was providing
incomplete information when registering
for Web sites.

Consumer complaining
behavior

Consumer
behavior
(i.e.,
falsifying
information,
reading
unsolicited
e-mail

Three important factors are: control over


collection and usage of information;
short-term transactional relationship; and
established long term relationship.

The main types of interactions are


information requests/disclosures,
information provision, information
capturing without consent, and information
practices.

The most of the Web sites surveyed notified


about their information practices but did
not fully disclose fair information practices.
Not having a fully-agreed definition for fair
information practices pose challenges in
assessing online disclosures.

A positive relationship exists between the


percentage of privacy and security related
statements on Web sites for particular
online shopping categories and consumers
online purchase likelihoods.

The opt-in, informed consent policies


are beneficial for online businesses. The
most effective way for commercial Web
providers to develop profitable exchange
relationships is gaining consumer trust.

Relationship exchange

Policy,
Protection

Findings

Theoretical Framework

DV


Table 1. A review of online information privacy literature






What are the important


features of business to
consumer Web sites?

What are the differences in


privacy concerns of online
users?

What is the impact of


customer perceptions of
security control on ecommerce acceptance?

(Ranganathan
& Ganapathy,
2002)

(Sheehan,
2002)

(Suh & Han,


2003)
Survey

Survey

Survey

502

889

214

100

Authentication,
nonrepudiation,
confidentiality,
privacy protection,
data integrity, trust,
attitude, behavioral
intention

Awareness,
usage, sensitivity,
familiarity, and
compensation

Information content,
design, security, and
privacy

Choice, access,
security, and notice

Study the content of online


privacy notices to inform
public policy

(Milne &
Culnan, 2002)

Survey

Trustworthiness, site
quality, privacy, and
security features

Experiment

What is the importance


of third party privacy
seals, privacy statements,
third party security seals,
and security features on
purchasing behavior of
consumers? What role does
trustworthiness play in
consumer behavior?

(Belanger et
al., 2002)

140

Disposition to trust,
institution-based
trust, trusting beliefs,
trusting intentions,
and Web vendor
interventions (i.e.,
third party seals,
privacy policy)

Conceptual

What are different typologies


of trust and how do they
relate with e-commerce
consumer behavior?

Internet experience,
purchasing method,
risk concerns

(McKnight
& Chervany,
2001)

160

Survey

How do risk perceptions vary


with Internet experience?
What is the effect of risk
perceptions on online
shopping activity?

(Miyazaki &
Fernandez,
2001)

Actual use

Purchase
intent

Information
disclosure

Technology acceptance
model

Information practices

Information practices

Fair information practices

Information practices

Customer perceived strength of


nonrepudiation, privacy protection, and data
integrity was important for determining
e-commerce acceptance.

The privacy concerns of consumers


vary depending upon the situation. The
contextual nature of online privacy makes
it difficult to predict how online users will
react to specific online situations.

Security is the best predictor of online


purchase intent followed by privacy, design
and information content.

Effective privacy notice is the first step


towards privacy protection. The amount of
Web sites that posted privacy notice grew
from 1998 to 2001.

Security features are more important than


privacy and security seals. Trustworthiness
of Web merchants is important.

A trust model is presented which helps to


study consumer trust at levels of personal,
institutional, and interpersonal.

Theory of reasoned action

Trust related
Internet
behavior

Intention to
purchase

This study indicates that higher levels of


Internet experience may lead to lower risk
perceptions regarding online shopping and
fewer specific concerns regarding system
security and online retailer fraud, yet more
privacy concerns.

Information practices

Online
purchasing
rate



23

Interview

What is the perception of


Internet users regarding
privacy? What are the
implications of gathering
information by offering
financial benefits?

What is the relationship


between privacy risk
beliefs and confidence and
enticement beliefs that
influence the intention to
disclose information?

Do consumers value privacy


statements and privacy seals?
If so, do these statements
and seals affect consumer
disclosure of personal
information?

Do information transparency
features, which provide
knowledge of information
and procedures, affect
willingness for information
disclosure?

(Olivero &
Lunt, 2004)

(Dinev &
Hart, 2006)

(Hui, Teo, &


Lee, 2007)

(Awad &
Krishnan,
2006)
Survey

Experiment

Survey

293
&
449

Survey and
experiment

What is the nature and


dimensions of Internet
users information privacy
concerns?

(Malhotra,
Kim, &
Agarwal,
2004)

401

109

369

212

Experiment

Do privacy seals ease


privacy concerns of online
customers?

(Liu et al.,
2004)

102

Survey

What types of privacy


control techniques are used
in an online context?

(Chen & Rea,


2004)

Information
transparency, privacy
concern, privacy
policy, and previous
privacy invasion

Privacy statement,
privacy seal,
monetary incentive,
sensitivity of
information

Privacy concerns,
trust, privacy risk,
personal interest

Attitude toward
privacy, control,
perceived risk,
and awareness
of information
collection

Collection, control,
awareness, type of
information, trust
beliefs, risk beliefs

Notice, access,
choice, security, and
trust

Concerns of
unauthorized
use Concerns of
giving out personal
information

Willingness
to be
profiled

Information
disclosure

Willingness
to disclose
information

Information practices

Contemporary choice
theory

Privacy calculus

Online disclosures

Social contract theory

Behavioral
intention

Willingness
to disclose
information

Theory of reasoned action

Information practices

Behavioral
intention to
purchase

Privacy
controls

Customers who desire greater information


transparency are less willing to be profiled.

The existence of privacy statement was


effective for information disclosure while
that of privacy seal was not. Monetary
incentive was positive influence on
disclosure. Information request had a
negative influence on disclosure.

Privacy concerns inhibit e-commerce


transactions. Trust and personal interest
outweigh privacy risk perceptions
while deciding on personal information
disclosure.

Perceived risk and awareness of


information collection are related with a
shift in concerns from trust issues to control
issues. Risk awareness reduced the level of
trust and increased the demand for control.

The second order Internet users


information privacy concerns scale is
developed with dimensions of collection,
control, and awareness. Privacy concerns
will have negative influence the willingness
to have relationships with online
companies.

Privacy concerns have strong influence


on whether an individual will trust an
electronic commerce business. Trust
will influence the behavioral intention to
purchase online.

Passive control was related to the concern


of unauthorized use of personal information
and identity modification was related to the
concern of giving out personal information.




REVIEW OF FINDINGS
The methodology followed for this chapter was a literature review. In this conceptual study, the existing privacy and related literature was analyzed to identify existing frameworks and variables related to online privacy. In the review of the literature, we retained the studies in which privacy was examined in an online context and the unit of analysis was the individual and/or online consumers. The results of the literature review are presented in Table 1. The research articles that were considered for the review were published from 1999 onwards. This was necessary, since the popular media has been rife with news coverage of heightened privacy concerns of consumers since that time. Most of the research studies included in the review used a survey methodology, while experiments were the second most frequently used methodology.
Our review of the literature on privacy revealed
that most of the research studied the consumers
willingness to disclose information in light of
their privacy concerns (Dinev & Hart, 2006;
Hui et al., 2007; Malhotra et al., 2004; Milne &
Culnan, 2002; Olivero & Lunt, 2004). There were
other group of literature that studied the consumers willingness to purchase in light of privacy
concerns (Belanger et al., 2002; Miyazaki &
Fernandez, 2001; Suh & Han, 2003). There were
very few studies that actually studied privacy
protection behavior (Chen & Rea, 2004). The
review of current research shows that privacy
concerns affect the disclosure of information
or purchase intent of consumers (Belanger et
al., 2002; Malhotra et al., 2004). The reviewed
literature gives us insights into how the privacy
construct is used with other related constructs
from different perspective. Therefore, we feel
it is necessary that an integrative framework of
privacy be proposed. This framework would be
helpful to study in more completeness, the impact
of privacy on consumer behavior. As outlined
in the beginning, the proposed framework will



attempt to explain the antecedents that lead to


privacy protection behavior. Since only one
study specifically examined the privacy protection behavior, we feel that a discussion on how
privacy concerns will affect consumer behavior is
relevant before outlining a framework on privacy
protection behavior. Therefore, we first proceed
to discuss the existing typologies of privacy concerns. These typologies explain both the states
and types of privacy concerns in an individual.
A mixed typology is put forth that combines both
the states and types of privacy concerns.

Typology of Privacy Concerns

Several privacy typologies have been suggested, such as privacy aware, privacy active, and privacy suspicious (Drennan, Mort, & Previte, 2006). Privacy aware refers to being knowledgeable and sensitive about risks associated with sharing personal information online. The privacy aware factor consists of selectivity about information provision, awareness of the sensitivity of one's mother's maiden name, and perceptions that online companies require an excessive amount of personal information. The privacy active factor refers to active behaviors adopted by consumers in response to their privacy concerns. This factor consists of seeking detailed information about online purchasing, requesting that companies not share collected personal information, and regularly changing passwords to protect one's privacy. The privacy suspicious factor refers to concerns about company behavior regarding privacy practices. This factor consists of awareness of companies' plans to share collected personal information, the belief that company privacy policies are hard to find on their Web sites, and checking to make sure that e-mail addresses and phone numbers are provided online before transactions. In summation, these typologies seem to be related to the state or degree of privacy concern that exists in individuals.
In addition to the privacy typologies described, other typologies have also been suggested in the literature.

Table 2. Mixed typology of privacy concerns

Fundamentalists & privacy aware    Fundamentalists & privacy active    Fundamentalists & privacy suspicious
Unconcerned & privacy aware        Unconcerned only                    Unconcerned only
Pragmatists & privacy aware        Pragmatists & privacy active        Pragmatists & privacy suspicious

For instance, the report by the Federal Trade Commission (1996) categorizes consumers into groups such as fundamentalists, unconcerned, and pragmatists, as suggested by Westin (1967). Fundamentalist individuals prefer privacy controls over consumer benefits and comprise one fourth of the population. They are unlikely to partake in any activities that will compromise their privacy. The unconcerned individuals fall at the other extreme and also comprise one fourth of the population. They are willing to forego their privacy if they can enjoy consumer benefits. Such individuals are most likely to join reward programs and are more willing to divulge their personal information in order to get discounts. The other half of the population is comprised of pragmatists, who weigh the advantages of various consumer benefits against the degree of personal information sought by companies. Building upon Westin's typology, Sheehan (2002) suggested unconcerned, circumspect, wary, and alarmed as privacy typologies. The unconcerned users have the most minimal privacy concern. They are willing to provide accurate information to online companies. The circumspect have minimal privacy concerns; however, they are more likely than the unconcerned to provide incomplete information to online companies during registration. The wary have a moderate privacy concern and are likely to provide incomplete information during registration. The alarmed users are highly concerned. Even if they register for Web sites, they are more likely to provide incomplete or inaccurate information.
In our analysis, Westin's and Sheehan's privacy typologies relate to the state or degree of privacy concerns. Some consumers are highly concerned while some are not concerned at all, with the rest of the population falling somewhere between the extremes. On the other hand, the typology suggested by Drennan et al. (2006) is behavior-centric, as it refers to behavior in response to privacy concerns. Rather than being mutually exclusive, these two suggested typologies are related. This relatedness is illustrated in the three-by-three matrix in Table 2. In the second row of the table, the second and third cells have only unconcerned. We believe that if consumers are unconcerned about their privacy, they are less likely to be privacy active or privacy suspicious, although they may be aware of privacy issues. For our degree-of-privacy-concern typology, we have followed Westin's (1967) typology instead of Sheehan's, for its conciseness. In order to form a mixed typology, we combined Westin's typology with the typology suggested by Drennan et al. (2006).

Privacy Protection

Approaches taken by individuals to protect their personal information online may be passive or active. Passive protection may involve depending upon external entities, such as government or private institutions, and not adopting any privacy protection measures oneself. As the name suggests, active protection involves using different measures for privacy protection. Some of these privacy protection strategies are as follows: use personal firewalls, withhold information from a Web site, remove one's name and address from mailing lists, inform Web sites not to share information, avoid using a Web site, disable cookies, use anti-spyware tools, and provide false or incomplete information when registering on a Web site. Privacy protections can be viewed from three perspectives: prerogative, objective, and subjective (Yao, 2005). The prerogative privacy is enforced at such a broad level by the government that it is hard to link it to beliefs, attitudes, and behaviors of individuals. The objective privacy focuses on the effectiveness of privacy protection strategies, such as the ones set by the Federal Trade Commission. The subjective privacy can be addressed by specific human efforts taken to protect privacy online. Table 3 shows the different perspectives on privacy protection.
Since there are no means to help users determine for themselves what information to share, with whom to share it, and how to control the dissemination of that information, consumers have resorted to other methods in order to protect their personal information and still receive goods and services from online vendors (Chen & Rea, 2004). Chen and Rea (2004) developed three factors, described as privacy controls, that relate to different behaviors adopted by consumers to protect their personal information online. The three factors are falsification of private information, passive reaction, and identity modification. Privacy controls are defined as consumers' ability to hold control over an unwanted presence in the environment (Goodwin, 1991).

Table 3. Perspectives on online privacy protection

Prerogative: a political or legal issue that can be addressed by philosophical, political, or legal debates
Objective: can be addressed by measuring the effectiveness of privacy protection strategies
Subjective: can be addressed by focusing on factors that determine the adoption of specific privacy protection strategies

Framework

In this section, we propose an integrative framework for online privacy protection behavior. The proposed framework, as shown in Figure 1, builds upon prior research and integrates research findings to provide insight into the phenomenon of online privacy protection behavior.

Dependent Variable

Several differing factors that contribute to the overall behavior of individuals to protect their privacy have been discussed in the literature (Chen & Rea, 2004). The first factor, falsification, refers to altering one's personal information and removing browser cookies when registering for online Web sites. The second factor, passive reaction, refers to simply ignoring or deleting the intrusion of others. The third factor, identity modification, refers to changing one's personal identity by using gender-neutral identities or multiple identities when registering for online services.

Independent Variables

Our literature analysis showed that a wide range of variables have been used to predict online privacy protection behavior. These predictor variables can be classified as privacy concerns, Internet experience, demographics, and awareness of privacy issues, as shown in Figure 1.

Privacy Concerns

Privacy concerns arise from the fear that the faith that consumers put in online companies will be violated. When companies gather personal information from consumers, there is an implied social contract that companies will act upon the collected information as they have agreed (Phelps, Nowak, & Ferrell, 2000). The implied social contract is


violated if information is collected without consumers' awareness, if the collected information is used for purposes other than those of which the consumer has been informed, or if the collected information is shared with third parties without consumers' consent (Phelps, D'Souza, & Nowak, 2001; Smith, Milberg, & Burke, 1996). Because of privacy concerns, consumers are unwilling to disclose personal information to online companies. The consumers' unwillingness to disclose can be attributed to a perceived lack of environmental control and information control (Goodwin, 1991). The perceived lack of information control is related to privacy concerns and is central to the issue of this chapter, while environmental control is related to security concerns (Gurung, 2006).
One of the few studies that examined the relationship between online privacy concerns and online behavior found some significant correlations (Sheehan & Hoy, 1999). As privacy concerns increased, consumers were less likely to register for Web sites, more likely to provide incomplete information to Web sites, more likely to report spam, more likely to request removal from mailing lists, and more likely to send highly negative messages or flames to those sending unsolicited e-mail (Sheehan & Hoy, 1999). Privacy concerns regarding unauthorized use and concerns about giving out personal information were found to be significantly related to privacy protection behavior (Chen & Rea, 2004).

Internet Experience

As consumers become more experienced in using the Internet, they are likely to become familiar with privacy protection strategies. The relationship between an individual's Internet experience and their adoption of privacy protection strategies has been suggested in the literature, since Internet experience helps to increase behavioral control, which is considered significant in the prediction of privacy protection behavior (Yao, 2005). Internet experience has also been linked to the use of privacy protection (Yao, 2005). Such past behavior only helps to reinforce the behavioral control that one has over privacy protection behavior, and thus acts as a predictor of future privacy protection behavior.

Demographics

Among the demographic variables of age, gender, and race, Chen and Rea (2004) found that gender and race are significant factors in privacy protection behavior. Their findings suggest that male consumers are more likely to falsify personal information than are female users. Their results further implied that the data quality of personal information collected online may vary among racial groups. Phelps et al. (2000) found that, among demographic variables such as gender, marital status, age, education, employment status, and income, only education was significantly related to privacy concerns. They reported that respondents who had vocational or some college education were associated with the highest levels of privacy concern. This further supports the contention that demographic variables may be related to privacy protection behavior.

Awareness
Consumers may be more likely to adopt privacy
protection behavior if they are aware of the
malpractices of online companies and the extent
and the severity of privacy violations that could
occur. In their research about anti-spyware tools,
Hu and Dinev (2005) found that awareness was a
key predictor of anti-spyware adoption behavior.
Users were likely to run anti-spyware tools only
when they became aware that their personal computers were infected with spyware and when they
were aware of the negative consequences posed
by spyware. The concept of awareness has been
defined as the initial stage in the innovation diffusion process model (Rogers, 1995). Dinev and Hu (2007) suggest that awareness may be related to situational awareness and the problem-solving process, which includes identifying the problem, raising consciousness, and resolving the problem. In their research on protective information technologies, Dinev and Hu (2007) found that awareness is a significant predictor of the adoption of anti-spyware tools. Therefore, prior research findings suggest that awareness may be related to online privacy protection behavior.

Figure 1. Framework for the antecedents of online privacy protection behavior (privacy concerns, Internet experience, demographics, and awareness modeled as antecedents of online privacy protection behavior)
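Because the framework in Figure 1 relates a set of antecedents to a single outcome, one straightforward way to examine it empirically is a multiple regression on survey data. The sketch below is illustrative only: the data file, the column names (privacy_concerns, internet_experience, gender, age, awareness, protection_behavior), and the choice of ordinary least squares are assumptions made for demonstration, not the chapter's own analysis.

    # Minimal sketch: regressing online privacy protection behavior on the
    # antecedents proposed in Figure 1. Data and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical survey data: one row per respondent, scale scores already computed.
    df = pd.read_csv("privacy_survey.csv")  # assumed file

    model = smf.ols(
        "protection_behavior ~ privacy_concerns + internet_experience"
        " + awareness + C(gender) + age",
        data=df,
    ).fit()

    print(model.summary())  # coefficients and p-values for each antecedent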

Discussion

The proposed framework provides a comprehensive approach for studying the online privacy protection behavior of consumers. There has been much literature assessing consumers' concerns about privacy, and it is well established that consumers are concerned about their privacy in general. What we need to understand is how consumers are dealing with these concerns. The framework proposes that, apart from privacy concerns, Internet experience, demographics, and awareness are also important antecedents in the prediction of online privacy protection behavior. Only those variables that have been researched in the past have been included, and these variables should not be viewed as an exhaustive list of important predictors of protection behavior. There may be other important factors that could further contribute to the prediction of online privacy protection behavior; one such variable is self-efficacy. The framework developed in this study can be used to formulate and test various hypotheses related to the adoption of privacy protection behavior. The variables defined in the framework may also be applicable to the development of research models of e-commerce adoption and information disclosure.
Identification of factors that are related to consumers' privacy behavior would help companies formulate policies and strategies that could influence consumer satisfaction and thereby increase consumer confidence. Consumers disclose information based on their personal assessment of the risks and the benefits. Having a better idea of the privacy-related variables will help companies focus on significant areas where they can foster their relationships with consumers. By understanding the mechanisms used by consumers for privacy protection, companies can develop policies to raise consumer confidence and to increase the probability of disclosure of personal information.


Possible Future Research Directions

Future research can be undertaken to empirically validate the proposed framework. Either surveys or experiments can be conducted to collect the empirical data needed to test the model. One can also expand the model by adding other relevant variables, such as self-efficacy, previous privacy violation, and personality traits. Privacy concerns of consumers may also vary based on their culture. Cultural differences regarding privacy concerns, and how these differences affect privacy protection behavior, can be another avenue for furthering this research.

Conclusion

This research was undertaken with the objective of investigating how online privacy has been studied. A framework for online privacy protection behavior was proposed based on the literature review. The proposed framework provides a roadmap to further analyze and understand the privacy concerns of consumers and the consequent strategies taken by consumers to protect their privacy online. Implications for research and practice were discussed. Further, we hope that the proposed research directions will help to encourage more research in this exciting area.

References
Awad, N. F., & Krishnan, M. S. (2006). The
personalization privacy paradox: An empirical
evaluation of information transparency and the
willingness to be profiled online for personalization. MIS Quarterly, 30(1), 13-28.
Belanger, F., Hiller, J. S., & Smith, W. J. (2002).
Trustworthiness in electronic commerce: The role
of privacy, security, and site attributes. Journal

of Strategic Information Systems, 11(3-4), 245270.


Chen, K., & Rea, A. I. J. (2004). Protecting
personal information online: A survey of user
privacy concerns and control techniques. Journal
of Computer Information Systems, 44(4), 85-92.
Culnan, M. (2000). Protecting privacy online: Is
self-regulation working? Journal of Public Policy
& Marketing, 19(1), 20-26.
Dinev, T., & Hart, P. (2006). An extended privacy
calculus model for e-commerce transactions.
Information Systems Research, 17(1), 61-80.
Dinev, T., & Hu, Q. (2007). The centrality of
awareness in the formation of user behavioral
intention toward protective information technologies. Journal of the Association for Information
Systems, 8(7), 386-408.
Drennan, J., Mort, G. S., & Previte, J. (2006).
Privacy, risk perception, and expert online behavior: An exploratory study of household end
users. Journal of Organizational and End User
Computing, 18(1), 1-22.
Federal Trade Commission. (1996). Consumer
information privacy hearings.
Goodwin, C. (1991). Privacy: Recognition of
a consumer right. Journal of Public Policy &
Marketing, 10(1), 149-166.
Gurung, A. (2006). Empirical investigation of the
relationship of privacy, security and trust with
behavioral intention to transact in e-commerce.
Unpublished Dissertation, University of Texas at
Arlington, Arlington.
Hoffman, D. L., Novak, T. P., & Peralta, M. A.
(1999). Information privacy in the marketspace:
Implications for the commercial uses of anonymity on the web. The Information Society, 15(4),
129-139.
Hu, Q., & Dinev, T. (2005). Is spyware an internet
nuisance or public menace? Communications of
the ACM, 48(8), 61-66.



Hui, K.-L., Teo, H. H., & Lee, S.-Y. T. (2007). The


value of privacy assurance: An exploratory field
experiment. MIS Quarterly, 31(1), 19-33.
Liu, C., Marchewka, J., Lu, J., & Yu, C. (2004).
Beyond concern: A privacy-trust-behavioral
intention model of electronic commerce. Information & Management, 42(1), 127-142.
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004).
Internet users' information privacy concerns
(IUIPC): The construct, the scale, and a causal
model. Information Systems Research, 15(4),
336-355.
McKnight, D. H., & Chervany, N. L. (2001). What
trust means in e-commerce customer relationships: An interdisciplinary conceptual typology.
International Journal of Electronic Commerce,
6(2), 35-59.
Milne, G. R. (2000). Privacy and ethical issues
in database/interactive marketing and public
policy: A research framework and overview of
the special issue. Journal of Public Policy &
Marketing, 19(1), 1-6.
Milne, G. R., & Culnan, M. J. (2002). Using the
content of online privacy notices to inform public
policy: A longitudinal analysis of the 1998-2001
U.S. web surveys. The Information Society, 18(5),
345-359.
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online
retailer disclosures. Journal of Public Policy &
Marketing, 19(1), 54-61.
Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. The Journal of Consumer
Affairs, 35(1), 27-44.
Olivero, N., & Lunt, P. (2004). Privacy versus
willingness to disclose in e-commerce exchanges:
The effect of risk awareness on the relative role of
trust and control. Journal of Economic Psychology, 25(2), 243-262.



Phelps, J. E., D'Souza, G., & Nowak, G. J. (2001).


Antecedents and consequences of consumer
privacy concerns: An empirical investigation.
Journal of Interactive Marketing, 15(4), 2-17.
Phelps, J. E., Nowak, G. J., & Ferrell, E. (2000).
Privacy concerns and consumer willingness to
provide personal information. Journal of Public
Policy & Marketing, 19(1), 27-41.
Ranganathan, C., & Ganapathy, S. (2002). Key
dimensions of business-to-consumer web sites. Information & Management, 39(6), 457-465.
Rogers, E. M. (1995). Diffusion of innovations
(4th ed.). New York: Free Press.
Sheehan, K. B. (2002). Toward a typology of
internet users and online privacy concerns. The
Information Society, 18(1), 21-32.
Sheehan, K. B., & Hoy, M. G. (1999). Flaming,
complaining, abstaining: How online users respond to privacy concerns. Journal of Advertising,
28(3), 37-51.
Sheehan, K. B., & Hoy, M. G. (2000). Dimensions
of privacy concern among online consumers.
Journal of Public Policy & Marketing, 19(1),
62-73.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.
Suh, B., & Han, I. (2003). The impact of customer
trust and perception of security control on the acceptance of electronic commerce. International
Journal of Electronic Commerce, 7(3), 135-161.
U.S. Census Bureau. (2007). The census bureau
of the department of commerce report on retail
e-commerce sales.


Westin, A. (1967). Privacy and freedom. New


York: Atheneum.
Yao, M. Z. (2005). Predicting the adoption of
self-protections of online privacy: A test of an
expanded theory of planned behavior model. Unpublished dissertation, University of California,
Santa Barbara.

Additional Reading
Culnan, M. J. (1993). How did they get my
name? An exploratory investigation of consumer
attitudes toward secondary information use. MIS
Quarterly, 17(3), 341-361.
Culnan, M., & Armstrong, P. (1999). Information privacy concerns, procedural fairness, and
impersonal trust: An empirical investigation.
Organization Science, 10(1), 104-115.
Greenaway, K. E., & Chan, Y. E. (2005). Theoretical explanations for firms' information privacy
behaviors. Journal of Association for Information
Systems, 6(6), 171-198.

Luo, X. (2002). Trust production and privacy


concerns on the internet: A framework based
on relationship marketing and social exchange
theory. Industrial Marketing Management, 31(2),
111-118.
Milne, G. R., & Culnan, M. J. (2002). Using the
content of online privacy notices to inform public
policy: A longitudinal analysis of the 1998-2001
U.S. web surveys. The Information Society, 18(5),
345-359.
Stewart, K. A., & Segars, A. H. (2002). An empirical examination of the concern for information privacy instrument. Information Systems
Research, 13(1), 36-49.
Udo, G. J. (2001). Privacy and security concerns
as major barriers for e-commerce: A survey study.
Information Management & Computer Security,
9(4), 165-174.
Wang, H., Lee, M. K., & Wang, C. (1998). Consumer privacy concerns about internet marketing.
Communications of the ACM, 41(3), 63-70.
Warren, S. D., & Brandeis, L. D. (1890). The right
to privacy. Harvard Law Review, 4(5), 193-220.

Henderson, S. C., & Snyder, C. A. (1999). Personal information privacy: Implications for MIS managers. Information & Management, 36(4), 213-220.



Section III

Empirical Assessments



Chapter VIII

Privacy Control and Assurance:

Does Gender Influence Online Information


Exchange?
Alan Rea
Western Michigan University, USA
Kuanchin Chen
Western Michigan University, USA

Abstract

Protecting personal information while Web surfing has become a struggle. This is especially the case when transactions require a modicum of trust to be successfully completed. E-businesses argue that they need personal information so they can create viable data to tailor user interactions and provide targeted marketing. However, users are wary of providing personal information because they lack trust in e-businesses' personal information policies and practices. E-businesses have attempted to mitigate user apprehension and build a relationship base in B2C transactions to facilitate the sharing of personal information. Some efforts have been successful. This chapter presents survey results that suggest a relationship between gender and how users control personal information. The findings suggest that e-businesses should modify information and privacy policies to increase information and transactional exchanges.

Introduction

In the past few years we have witnessed the competing interests of technological convenience, personal privacy, and e-business needs. Consumers are finding that e-businesses are asking for, or taking, more personal information than they may be willing to give in order to utilize goods and


services. E-businesses counter that they only take


necessary information to complete transactions
and perform effective marketing and customization of their products and services.
Consumers want to actively control how much
personal information they disclose depending
on the level of trust inherent in each e-business relationship. A consumer using a familiar
Web site with privacy policies she trusts will be
more willing to divulge crucial information an
e-business needs, such as demographic data and
shopping preferences. E-businesses want to create an atmosphere that will foster this trust and
information sharing.
However, there is a palpable tension between
consumers and e-businesses at the start of a partnership. This tension exists because of a lack of
trust between users and e-businesses. This mistrust is not unfounded. E-businesses have a poor
record when it comes to protecting consumers' privacy online.

Privacy and the Consumer


The popular Apple iTunes software is no stranger
to privacy indiscretions. In early 2006, Apple
released iTunes version 6.0.2 which included a
new feature called the MiniStore (Borland, 2006).
The MiniStore enabled iTunes to offer customized
user recommendations based on past browsing
and purchases. Granted, this customizable feature
offered a means to enable users to find more personalized selections. However, computer experts
found that in addition to the song selection, unique
data about each user was sent back to Apple via
the MiniStore (McElhearn, 2006). Once this information was found, the software's user agreement
was analyzed by experts who found no mention of
this particular MiniStore functionality (Borland,
2006). Apple soon recanted and explained to users
how to turn off this feature. In all new versions of
iTunes, MiniStore functionality must be enabled
by users (Apple, 2007).



However, Apple's iTunes is once again in the


privacy spotlight. In 2007, researchers discovered
that all DRM-free music purchased via iTunes
embeds each user's personal information in the
file (Fisher, 2007). Additional research found that
all iTunes purchases include this information with
no explanation from Apple.
Apple is not the only organization tracking
users' information without their knowledge. Microsoft Windows Media Player stores data about all users' media files that they watch either online or on DVDs. The Media Player encodes
all selections with a Windows Media Player ID
number specific to each user (Festa, 2002; Smith,
2002a). This information is then sent to an online
Microsoft database. These "SuperCookies" can
be used to track user viewing habits, Web surfing preferences, and other personal information.
While Microsoft denies any plans to use this data
and provides instructions on how to disable this
feature on its support pages (Smith, 2002b), the
feature is on by default until a user completes a
series of steps hidden within a detailed privacy
statement (Microsoft, 2003).
Other companies have also amassed consumers' data without their knowledge. In 1999, researchers learned that Comet Cursor was tracking
the clickstreams of over 16 million people who
downloaded the free software (Oakes, 1999). Other
companies that have tracked, or are tracking,
online user movements include RealNetworks,
DoubleClick, HitBox, and X10. Some companies,
such as DoubleClick, had discontinued tracking in
favor of consumer privacy because of lawsuits and
user complaints (Krill, 2002). However, Google's
pending acquisition of DoubleClick raises new
concerns (EPIC, 2007).
Ultimately, while many of these e-businesses
have either changed data collection practices
or written the procedures into privacy policies,
users still are not always aware of the privacy
implications. Moreover, much amassing of data is
conducted without users' awareness. Companies
such as WebTrends specialize in offering e-busi-


nesses detailed user data and Web site usage in


order to analyze what users do at the e-business
site (WebTrends, 2007). Users are not usually made
aware of the clickstream tracking, page views, or
other data being collected about them with each
click and view of a Web page within a site.

Lack of Implemented Privacy Techniques

As early as 1997, the Electronic Privacy Information Center (EPIC) released a study noting that Web users (a.k.a. surfers) needed to be aware of Web site usage and privacy policies (EPIC, 1997). The examples offered above illustrate that, even when privacy policies exist and users should be aware of them, e-businesses can use technology to take personal information without asking, or at least without intentionally informing consumers.
However, many e-businesses are working to simultaneously acquire necessary information and protect consumers' personal privacy. In early 2000, some e-businesses began implementing P3P (Platform for Privacy Preferences) on their Web sites. P3P is an XML-based language that enables e-businesses to code their Web privacy statements with standardized markup syntax (W3C, 2007). Using this technology would allow all Web browsers to find and read e-businesses' privacy policies. Consumers would then be able to set acceptable personal privacy criteria and would be notified by their Web browsers if a Web site meets these criteria (Levy & Gutwin, 2005). P3P promised much to consumers because it would allow them to decide not only if they wanted to share personal information with a Web site but also what information to share (Radcliff, 2001).
Unfortunately, the World Wide Web Consortium (W3C) suspended work on the P3P platform in late 2006 because of a lack of support from Web browser developers (W3C, 2006). Interestingly enough, the W3C P3P group notes that the standard is ready for implementation even though Web browsers currently do not support it. Perhaps future Web browser versions will.
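As an illustration of how a site might advertise a machine-readable privacy policy, the sketch below adds a P3P policy reference and compact-policy header to HTTP responses. The server framework (Python's built-in http.server), the policy file location, and the compact-policy token string are illustrative assumptions rather than a vetted policy, and this is not part of the chapter's study.

    # Minimal sketch: exposing a P3P policy reference and compact policy via HTTP headers.
    # The token string and file locations are illustrative only.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class P3PHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Point browsers at the full XML policy and supply a compact policy.
            self.send_header(
                "P3P",
                'policyref="/w3c/p3p.xml", CP="NOI DSP COR CUR ADM OUR"',
            )
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Example page</body></html>")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), P3PHandler).serve_forever()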

Because there are no massively-deployed


technologies to assist users who want to protect
personal data, most use passive and active measures during online transactions. Consumers
may not give out information (passive) or supply
false information (active) in order to complete
a transaction. As a result, e-businesses suffer
from lost or incomplete transactions, or partial
or incorrect data. Without assurances about Web site security (factoring in privacy and trust considerations), consumers are unwilling to supply the necessary personal information in order to complete an online purchase. A recent Gartner study estimates that over 2 billion dollars were lost in 2006 due to consumers' security fears over e-business transactions (Schuman, 2006).

Tension Between Desire to Use and Desire to Share

Consumers using active and passive measures underscore the tension between their desire to use technology and online services, and the level of information they are willing to provide. Invasive technology, such as the Media Player and Comet Cursor, takes personal data without a user's permission and not only causes user concern but also hinders trust between a user and an e-business.
On the other hand, P3P and related technologies
offer the control users need to decide whether or
not to share personal information. In other words,
the more control users have over their personal
information, the more trust invoked between the
entities and the less user concern. P3P offers users
the chance to negotiate with each e-business as
to how much personal information they want to
share (W3C, 2007).
In order to understand how e-businesses can
foster better relationships with consumers, we
studied the correlation matrix among the criteria
of trust, concern, and control as they relate to consumers' online privacy. In our research, we have found significant differences in gender tendencies and privacy concerns (Chen & Rea, 2004).




This discussion supports and extends the factor


of gender as crucial to relationship negotiation
between consumer and e-business. Using this
information, we can suggest when consumers
might more readily share personal information
with an e-business. We also delve into the relationships between gender and two categories of
privacy constructs: the ability to control (actively or passively) the unwanted presence of others, and Internet privacy concerns. Finally, we look
at how we came to these findings, and explore the
implications for future research in this area.

Background
Negotiating trust is especially important in this
study because all online interactions require
some level of trust between the consumer and
the e-business. From a simple hyperlink click
to a complex e-commerce purchase, trust must
be negotiated before users are willing to share
personal data in exchange for goods, services,
or information. E-businesses must understand
how to establish and maintain this trust in order
to be successful. A crucial enabling component
for trust is privacy.

Privacy
Privacy takes many forms. Some researchers view
it as a moral, legal, or consumer right (Goodwin,
1991; Papazafeiropoulou & Pouloudi, 2001; Han
& Maclaurin, 2002). Others view it within the
context of a social power struggle (Campbell
& Carlson, 2002), economic theory (Hemphill,
2002), or commitment-trust theory (Mukherjee
& Nath, 2007). Some view it as simply a need
to sustain personal space (Gumpert & Drucker,
1998; Clarke, 1999) or a necessary psychological
condition (Yao, Rice, & Wallis, 2007).
This study does not address the privacy needs
of personal space in terms of biometrics, video
monitoring, or workplace surveillance. Instead,


we measure consumers' need for online privacy, or information privacy:


Information privacy refers to the claims of individuals that data about themselves should generally not be available to other individuals and
organizations, and that, where data is possessed
by another party, the individual must be able to
exercise a substantial degree of control over that
data and its use. (Clarke, 1999)
Key components can be extracted from this definition. First, we see users' concerns that personal data should not be taken without their knowledge. Consumers raged at Apple's iTunes, Windows Media Player, Comet Cursor, and DoubleClick for acquiring information without active consent. Consumer concern over the uninformed harvest of personal information is high. This concern quickly correlates with a lack of trust between users and e-businesses, since consumers experience a lack of control over what information Web sites have collected about them (Hoffman, Novak, & Peralta, 1999). Numerous studies discuss the importance of consumers' control over personal information as the basis for establishing online trust (Wang, Lee, & Wang, 1998; Tavani & Moor, 2001; Han & Maclaurin, 2002; Hemphill, 2002; Roussos & Moussouri, 2004; Ashrafi & Kuilboer, 2005; Metzger, 2006) and promoting consumer privacy. While e-businesses argue that the collected data will not be used without consent, acquiring the data without explicitly asking does not sit well with users and privacy advocates (Yousafzai, Pallister, & Foxall, 2003; Duffy, 2005; Flavin & Guinalu, 2006; Pan & Zinkhan, 2006). Conversely, asking for users' consent can be regarded as a shift of information control to the users (Eastlick, Lotz, & Warrington, 2006; Van Dyke, Midha, & Nemati, 2007). Many e-businesses are concerned that this control shift will limit their access to crucial information and are wary of ceding this power to users.


It follows then that one of the best means of


addressing users' concerns and building trust is
for e-businesses to allow users to control their
personal information. However, it is not always
feasible in a business context to permit complete
control; therefore, e-businesses should inform
users via online privacy policies how the collected information will be used. This allows for
informed users to decide whether they should provide personal information in exchange for goods
and services (Han & Maclaurin, 2002; Ashrafi
& Kuilboer, 2005; Flavin & Guinalu, 2006;
Pan & Zinkhan, 2006; Shalhoub, 2006; Lauer &
Deng, 2007). Researchers note that companies
that inform users how their information will be
used begin to build online relationships crucial
for success:
In some respects, the lack of other means in cyberspace of establishing customer relationships
and trust based on reputation and personal contact demands that firms reveal their policies on
information disclosure, informed consent, and
handling disputes. (Schoder & Yin, 2000)
It would seem that a key measurement of how
much personal information users are willing to
share hinges on the trust level in the relationship between users and e-businesses (Roman,
2007).

trust
Before users choose to enter into a relationship
with a business, they must first be convinced that
it is in their best interest. Consumers look at discounts, reputation, and other factors before they
decide they will enter a physical store to conduct
business (So & Sculli, 2002; Duffy, 2005; Eastlick et al., 2006; Chen & Barnes, 2007; Roman,
2007). There must also be some factor of initial
trust before a user will enter into an e-business
transaction:

The initial trust model, which assumes that parties


barely know each other, also seems appropriate
for the distant, impersonal relationships that characterize most Web vendor/customer relationships.
(McKnight, Choudhury, & Kacmar, 2000)
Many studies have looked at methods through
which e-businesses can gain users' trust (Hoffman et al., 1999; McKnight et al., 2000; Phelps,
Nowak, & Ferrell, 2000; Schoder & Yin, 2000;
Riegelsberger, Sasse, & McCarthy, 2003; Aiken
& Boush, 2006; Hui, Tan, & Goh, 2006; Pan &
Zinkhan, 2006). Without the means to establish
the trust relationship, there can be no viable
interaction.
Initial trust becomes the challenge for e-businesses as each Web site occupies a single clickspace for users. Current studies look at various
factors e-business can use to foster this initial
trust, such as third-party trustmarks (Noteberg,
Christiaanse, & Wallage, 2003; Hu, Lin, & Zhang,
2003; Kim, Steinfield, & Lai, 2004; Patton &
Josang, 2004; Moores, 2005; Aiken & Boush,
2006), privacy policy statements (Han & Maclaurin, 2002; Ashrafi & Kuilboer, 2005; Flavin
& Guinalu, 2006; Pan & Zinkhan, 2006), brand
reputation (So & Sculli, 2002; Yousafzai et al.,
2003; Duffy, 2005; Aiken & Boush, 2006; Eastlick
et al., 2006; Metzger, 2006; Roman, 2007), and
Web site design and content (Janda, Trocchia, &
Gwinner, 2002; Chen & Barnes, 2007). However,
it is still ultimately the trust factor or lack of
faith noted almost 10 years ago by Hoffman,
Novak, and Peralta (1999) that plays one of the
most crucial roles in e-business/consumer relational success.
Without trust, users will not enter into relationships with e-businesses, and they will not
provide any personal information to e-business
sites: "Almost 95% of Web users have declined to provide personal information to Web sites at one time or another when asked" (Hoffman et
al., 1999). More recent studies share statistics
similar to Hoffman et al. (1999), suggesting that


not much has changed in terms of the consumer/


e-business trust relationship (Moores, 2005; Pan
& Zinkhan, 2006).
Trust must be established and maintained in
order to foster and encourage a relationship in the
e-business sphere. Trust is a construct that must
be negotiated between a user and an e-business.
What makes trust difficult is that this negotiation
factor differs not only by the type of e-business
or user but also by each relationship. Studies have
looked at users' disposition to trust (McKnight
et al., 2000), relationship exchanges (Hoffman
et al., 1999), and consumer perceptions regarding
the ethics of online retailers (CPEO) (Roman,
2007) to explain how these relationship negotiations can be accomplished.
Another factor that must be considered in
this trust negotiation is gender. Researchers have
found significant differences in gender tendencies
toward Internet usage (Teo & Lim, 1997; Allen,
2000; Nachimias, Mioduser, & Shelma, 2001;
Roach, 2001; Sexton, Johnson, & Hignite, 2002;
Ono & Zavodny, 2005; Chesley, 2006; Kang &
Yang, 2006; Fraser & Henry, 2007), e-business
(Papazafeiropoulou & Pouloudi, 2001; Sexton
et al., 2002; Garbarino & Strahilevitz, 2004;
Ha & Stoel, 2004; Fraser & Henry, 2007), and
exploratory work concerning privacy (Hupfer &
Detlor, 2007; Yao et al., 2007). Our study supports
and extends the factor of gender as crucial to the
relationship negotiation and users attitudes as to
whenor if they will share personal information with an e-business.

Main Thrust of the Chapter


Concerns about giving out personal information
in the e-commerce environment cause low user
involvement in online transactions (Miyazaki &
Fernandez, 2001; Aiken & Boush, 2006). Users
almost always find it unacceptable for marketers to
use their personal information for purposes other
than current, and sometimes future, transactions


that require personal information whether it occurs


within traditional marketing methods (Goodwin,
1991; Nowak & Phelps, 1992; Wang & Petrison,
1993) or online (Tavani & Moor, 2001; Flavin
& Guinalu, 2006; Roman, 2007). Both shoppers
and non-shoppers worry about issues of acquisition and dissemination of consumer data (Rohm
& Milne, 1998; Tavani, 1999; Berendt, Gnther,
& Spiekermann, 2005; Mukerjee & Nath, 2007).
These concerns are usually triggered by more than
one catalyst, such as age (Han & Maclaurin, 2002)
or education (Ono & Zavodny, 2005; Flavin &
Guinalu, 2006). However, in many cases, a major
catalyst is linked to gender (Sexton et al., 2002;
Garbarino & Strahilevitz, 2004; Ha & Stoel, 2004;
Chesley, 2006; Fraser & Henry, 2007).

Gender-Linked Privacy Questions


Male users have been reported to use the Internet
more frequently and for a greater number of tasks
than female users (Teo & Lim, 1997; Teo, Lim,
& Lai, 1999; Papazafeiropoulou & Pouloudi,
2001; Nachimias et al., 2001; Sexton et al., 2002;
Flavin & Guinalu, 2006). Although Internet
usage patterns are shifting toward a more equal
gender balance (Roach, 2001; Ono & Zavodny,
2003; Ono & Zavodny, 2005), e-business transactions remain a male-dominated realm (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Fraser
& Henry, 2007). The increased level of Internet
experience and utilization of online tasks has
been negatively related to concerns about online
privacy (Hoffman et al., 1999; Flavin & Guinalu,
2006). However, other studies indicate that online
privacy concerns are more readily attributed to
educational level (Phelps et al., 2000; Flavin &
Guinalu, 2006), or other demographic characteristics, such as marital status (Chesley, 2006), age
(Han & Maclaurin, 2002), or employment (Ono
& Zavodny, 2005).
Ultimately, there is a strong indication that
gender is a major factor relative to privacy concerns. Statistics from an early study show that


87% of female Internet users were very concerned


about threats to their personal privacy while only
76% of male Internet users were very concerned.
Furthermore, women registered higher levels of
concern on every privacy-related issue about
which they were questioned (Ackerman & Lorrie,
1999). More recently, Friedman, Kahn, Hagman,
Severson, and Gill (2006) report in a study, The
Watcher and the Watched, that females were
more concerned than males about all aspects of
privacy. In this study, researchers recorded more
than 900 people passing through a public space.
Afterward, those filmed were asked their reactions to whether these images should be publically
viewed in real-time at the location, in real-time at
another location (anywhere in the world), or saved
and distributed online. Both males and females
had some concerns over one of the three options.
However, in almost all cases, females expressed
privacy concerns over displaying the video and
images no matter what the situational context
or if they were in the role of the watcher or the
watched. Ultimately, females were more aware
of the implications of private information within
public spaces (Friedman et al., 2006).
This awareness can be linked to how women
communicate on the basis of network-oriented and
collaborative tasks while men, on the other hand,
communicate to elevate their social hierarchy
(Kilbourne & Weeks, 1997; Teo & Lim, 1997).
Female users consider e-mail to have a higher
social presence (Gefen & Straub, 1997), and tend
to focus more on messaging and community formation (Teo & Lim, 1997). Men's use of electronic
media involves exchanging messages to disseminate information (Brunner, 1991). Men also focus
more on searching, downloading, and purchasing
(Teo & Lim, 1997; Ha & Stoel, 2004).
Women's use of electronic media for collaborative purposes may involve exchanging
messages that contain personal information,
views, and other information not meant for public
disclosures. Moreover, studies have shown most
women rely on recommendations from social

networks to make purchasing decisions (Garbarino & Strahilevitz, 2004). Studies of women in
female-only discussion groups show women to
be more focused on self-disclosure and individual
opinions; women respond directly to others in
the discussion groups. Conversely, men in male-only discussion groups do not self-disclose, and
instead argue to win discussions (Savicki, Kelley,
& Lingenfelter, 1996).
Because the content of communication messages largely affects the participants' willingness
to share the messages with others, it is likely that
women would prefer more privacy protection
than men. Therefore, we conjecture the following hypothesis:
H1: Female users are more concerned with online
privacy practices than male users.
Consumers can be very uncomfortable sharing their personal data with others because they
are sensitive to disclosing such data (Phelps et
al., 2000; Han & Maclaurin, 2002; So & Sculli,
2002; Duffy, 2005; Aiken & Boush, 2006; Flavin & Guinalu, 2006). The term information
sensitivity refers to the level of privacy concern
an individual feels for a type of data in a specific
situation (Weible, 1993). Despite concerns about
online privacy, consumers do realize that personal
information is important to online marketers.
Consequently, they are willing to provide such
information when Web sites provide privacy
statements explaining how the collected information would be used (Hoffman et al., 1999; Han
& Maclaurin, 2002; Ashrafi & Kuilboer, 2005;
Flavin & Guinalu, 2006; Pan & Zinkhan, 2006).
The disclosure of such privacy statements and
similar constructs, such as trustmarks, is related
to higher levels of involvement in e-commerce
(Miyazaki & Fernandez, 2000; Noteberg et al.,
2003; Hu et al., 2003; Kim et al., 2004; Patton &
Josang, 2004; Moores, 2005; Aiken & Boush,
2006). Thus, online privacy issues are also related
to (1) an individual's willingness to share personal




data and (2) the privacy disclosure statements of


online vendors.
Although some exploratory studies (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Flavin
& Guinalu, 2006; Fraser & Henry, 2007) indicate
a relationship between gender and privacy issues,
they have not demonstrated a direct relationship
between gender and these two types of privacy
issues (willingness and disclosure). Nonetheless,
women have been found to process information
in more detail and thus are more aware of, and
sensitive to, changes in their environments (Meyers-Levy & Maheswaran, 1991). Women are more
likely to be irritated than men by ambient factors
(e.g., background conditions), store design factors
(e.g., aesthetic and functional aspects), and social
factors (e.g., customers in the same environment)
(DAstous, 2000; Garbarino & Strahilevitz, 2004;
Ha & Stoel, 2004).
Even though most of the indicated studies focused on women's sensitivity to changes in their physical environments, it is likely this sensitivity would continue in the virtual environment. For these reasons, we hypothesize that gender differences may be significant in the context of online privacy:
H2: Female users are less likely to share their
personal data with online vendors.
H3: Female users are more concerned with the
disclosure of privacy statements on Web
sites.
Consumers' ability to control their interactions
within Web sites affects their perceptions of online
security and privacy practices (Hoffman et al.,
1999; Han & Maclaurin, 2002; Nam, Song, Lee,
& Park, 2005; Pan & Zinkhan, 2006). Goodwin
(1991) defines this type of control as control over
unwanted presence in the environment. However,
in some environments, such as the Internet, users
cannot easily control the unwanted presence of
others. For example, users can delete unsolicited



e-mail thus controlling the unwanted presence in


a passive way. Alternatively, users can configure
their e-mail software to filter out unwanted future
e-mail, respond to the sender to opt out of mailing
lists, or report the offense to a third party service,
thereby branding the senders as spammers. In
instant messaging, users can block others from
their friend lists or simply ignore them. Therefore,
even though intrusion into users' environments due to the publicly accessible nature of the Internet is unavoidable, users' reactions to such intrusions
can be of two types: active control (block) and
passive control (ignore).
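The distinction drawn here between passive control (deleting or ignoring) and active control (blocking or filtering) can be made concrete with a small mail-filtering example. The sketch below is purely illustrative: the mailbox file names and the blocked-sender list are hypothetical, and it shows only the "active" step of filtering out unwanted senders rather than deleting messages by hand.

    # Minimal sketch: "active control" over unwanted e-mail by filtering blocked senders.
    # Mailbox paths and blocked addresses are hypothetical.
    import mailbox

    BLOCKED_SENDERS = {"offers@spam.example", "tracker@adnetwork.example"}

    inbox = mailbox.mbox("inbox.mbox")          # assumed local mailbox file
    kept = mailbox.mbox("inbox.filtered.mbox")  # filtered copy

    for message in inbox:
        sender = message.get("From", "")
        if not any(addr in sender for addr in BLOCKED_SENDERS):
            kept.add(message)  # keep only mail from senders that are not blocked

    kept.flush()
    print(f"Kept {len(kept)} of {len(inbox)} messages")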
While women consider electronic communications useful, it is men who are more comfortable
actively using software tools for such Internet
activities (Arbaugh, 2000; Sexton et al., 2002).
Existing studies report that the literature discussing the relationship between computer anxiety and
gender is relatively inconclusive (Chua, Chen, &
Wong, 1999; King, Bond, & Blandford, 2002).
However, in terms of online shopping, women have
reported difficulty locating Internet merchandise
and navigating through modern Web interfaces
(Fram & Grady, 1997; Garbarino & Strahilevitz,
2004; Ha & Stoel, 2004). Compounded with fewer
interests in computer-related activities, female
users are less likely to have a favorable attitude
toward computer usage (Qutami & Abu-Jaber,
1997; Nachimias et al., 2001).
In studying gender differences in Web searching, researchers found that men are more active
online and explored more hyperlinks than women
(Teo & Lim, 1997; Large, Beheshti, & Rahman,
2002). Male online users also take control of the
bandwidth by sending longer messages (Herring,
1992) and tend to use the computer more hours
per week than women (Sexton et al., 2002). Thus,
it is likely that women will be engaged in passive
controls to block the unwanted presence of others. Conversely, men, with their strong computer
interests and skills, are more likely to take active
control over any unwanted presence:

Privacy Control and Assurance

H4: Female users are likely to be engaged in


passive control over unwanted presence of
others.
H5: Male users are more likely to actively control
unwanted presence of others.
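Hypotheses H1 through H5 compare the two gender groups on privacy-related scale scores, so a simple way to examine them (before any multivariate modeling) is an independent-samples comparison of group means. The sketch below is illustrative: the data file, the column names, and the use of Welch's t-test are assumptions for demonstration, not the chapter's reported analysis.

    # Minimal sketch: comparing privacy-related scale scores between gender groups.
    # Data and column names are hypothetical.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("privacy_survey.csv")  # assumed file, one row per respondent

    for scale in ["privacy_practice_concern", "privacy_disclosure_concern",
                  "personal_data_concern", "active_control", "passive_control"]:
        female = df.loc[df["gender"] == "F", scale].dropna()
        male = df.loc[df["gender"] == "M", scale].dropna()
        t, p = stats.ttest_ind(female, male, equal_var=False)  # Welch's t-test
        print(f"{scale}: t = {t:.2f}, p = {p:.3f}")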

Study Procedures

The authors developed a survey (Table 4) based on existing privacy literature. To the best of our knowledge, there is no existing study that empirically covers online privacy issues focusing on users' preferences to control the unwanted presence of others (either actively or passively), and concerns about privacy disclosure, revealing personal data, and online privacy practices. In specifying the domain of individual privacy concerns and users' ability to control, the authors used the literature review (detailed in a previous section) as the starting point for this instrument. Questions for three types of privacy concerns (revealing personal data, privacy practice, and disclosure of privacy policy) were modified from Smith, Milberg, and Burke (1996) and others to suit the context of the current study. The user controls over others' presence were derived from Goodwin (1991) and others. The survey instrument was internally distributed to faculty members who specialize in e-commerce and online privacy to ensure its content validity. A pilot study followed and the instrument was further fine-tuned before implementation.

Survey Structure

The survey begins with a series of questions that assess the respondent's concerns over various privacy issues and his or her ability to control personal data and the presence of others. To avoid the likelihood of associating each participant's personal demographic data with real individuals, questions regarding identifiable information were separated from the main part of the survey. This survey is part of a larger study (Chen & Rea, 2004), and Table 4 shows only the survey questions related to the current study. Each privacy-related question was measured with a seven-point Likert scale anchored by strongly agree (1) to strongly disagree (7).

Survey Procedure

The survey was administered to undergraduate


students taking a Web site architecture/design
course. Several prerequisite courses had prepared
participants for basic computer literacy, such as
computer software, hardware, and the Internet,
before they were eligible for this junior-senior level
course. Participants were awarded extra points
in class for completing the survey. To encourage
involvement, the participants were promised
anonymity and that their responses would be used
solely for research purposes.
As other studies note, a study sample of more
advanced users from a younger demographic is
useful because it functions as an indicator of
Internet and e-business usage among more Web-savvy individuals (Nachimias et al., 2001; Kim et
al., 2004; Moores, 2005; Aiken & Boush, 2006).
With the increase of Web usage in almost all
countries, the study group offers a glimpse into
the future of online consumption. However, an
application of the study to a larger, more general
Web-using population would also prove useful
for measuring our study of controls, and also
indicate shifts in approaches as more users partake in online commerce and general Web usage.
Other studies have demonstrated that even with
a larger demographic, gender tends to play a role
in Internet usage (Teo & Lim, 1997; Roach, 2001;
Sexton et al., 2002; Ono & Zavodny, 2003; Ono &
Zavodny, 2005); in some cases, gender may also
be a factor in e-business participation (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Chesley,
2006; Fraser & Henry, 2007).
Out of a possible 160 students, 107 elected to
participate in the survey, of which 105 valid responses were returned. Of the 105 valid responses,




Table 1. Factor analysis (factor loadings by variable)

Factor 1: Concern about privacy practice (variance explained: 24.07%; Cronbach's alpha: .90)
    (A)  .86
    (B)  .83
    (C)  .83
    (D)  .82
    (E)  .82
    (F)  .67 (.39 cross-loading)

Factor 2: Concern about privacy disclosure (variance explained: 17.89%; Cronbach's alpha: .92)
    (G)  .90
    (H)  .88
    (I)  .85
    (J)  .85
    (K)  .82
    (L)  .73

Factor 3: Concern about giving out personal data (variance explained: 10.02%; Cronbach's alpha: .84)
    (M)  .84
    (N)  .74
    (O)  .74
    (P)  .70

Factor 4: Active control (variance explained: 7.85%; Cronbach's alpha: .70)
    (Q)  .86
    (R)  .84
    (S)  .60
    (T)  .58

Factor 5: Passive control (variance explained: 6.90%; Cronbach's alpha: .62)
    (U)  .78
    (V)  .77
    (W)  .56
    (X)  .49


eight did not indicate gender in the demographics


section and have been removed for the purposes
of this study. The 97 remaining responses were
entered in the database for further analyses, and
are the basis for this study and ensuing discussion.
The response rate was 66.88%.

Analysis and Results


Of the 97 responses, 71 participants were male
(73.2%) and 26 participants were female (26.8%).
Most respondents were between the ages of 16
and 24. Of them, 55.2% were whites, 21.9%
were Asians, 6.7% were African-Americans, and
4.8% were Hispanics. Furthermore, 53.3% of the
participants were full-time students, 38.1% were
employed, and 1% was retired.
Questions regarding privacy concerns and the users' ability to control the environment were factor analyzed with principal component extraction. The exploratory factor analysis with an orthogonal (varimax) rotation yielded factors with multiple loadings. With eigenvalues greater than 1.0 and the scree plot as the criteria, five factors were identified. Items with factor loadings less than .30 were excluded from further analyses. The results of the factor analysis are shown in Table 1.
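For readers who wish to replicate this kind of exploratory analysis, the following sketch (not part of the original study) shows how principal component extraction with a varimax rotation could be run in Python. The data file name, the item labels (A)-(X), and the use of the third-party factor_analyzer package are assumptions made purely for illustration.

import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed third-party package

# Hypothetical data layout: one row per respondent, columns A-X holding
# the seven-point Likert responses to the survey items in Table 4.
items = pd.read_csv("privacy_survey.csv")[list("ABCDEFGHIJKLMNOPQRSTUVWX")]

# Principal component extraction with an orthogonal (varimax) rotation,
# mirroring the procedure described above.
fa = FactorAnalyzer(n_factors=5, method="principal", rotation="varimax")
fa.fit(items)

# Kaiser criterion (eigenvalues > 1.0) and the scree plot guide factor
# retention; loadings below .30 are suppressed, as in the study.
eigenvalues, _ = fa.get_eigenvalues()
loadings = pd.DataFrame(fa.loadings_, index=items.columns).round(2)
print("Factors with eigenvalue > 1.0:", int((eigenvalues > 1.0).sum()))
print(loadings.where(loadings.abs() >= 0.30))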
Factors one and three were split from a larger
section of questions in which respondents were
asked how concerned they were about (1) current
privacy practice on the Internet and (2) giving
out personal information. The results of factor
analysis show two separate factors: concern about
privacy practice (variance explained: 24.07%;
Cronbach's alpha: .90) and concern about giving
out personal data (variance explained: 10.02%;
Cronbach's alpha: .84).
The second factor reflects the respondents'
feelings about giving their personal data when
e-commerce sites are supplemented with privacy
components, such as privacy statements, privacy
logos, and privacy endorsements from third parties. Therefore, the factor is named concern about
privacy disclosure. This factor explains 17.89%
of the variance and its Cronbach's alpha is .92.

The fourth factor is comprised of items such as


deleting browser cookies, abandoning spammed
e-mail accounts, and faking personal information
for online registration. Although these items are
related to dealing with the unwanted presence
of others, they exhibit the users' intention to be
actively involved in preventing intrusion into
personal online privacy. Users combat such
intrusion with active actions, such as information falsification and multiple identities. Thus,
the factor is named active control. This factor
explains 7.85% of the variance and its Cronbach's
alpha is .70.
Four items loaded on the last factor clearly represent users' reactions to the unwanted presence of others. In publicly accessible media, such as e-mail and chat rooms, it is difficult for users to prevent others from contacting them or automatically collecting personal information. One good strategy for blocking such an unwanted presence is simply to ignore it. This factor is named passive control because the items clustered into it relate to user control exercised in a passive way. This factor explains 6.90% of the variance. Cronbach's alpha for this factor is .62. Traditionally, an instrument is considered sufficiently reliable when its Cronbach's alpha is .70 or above. However, Hair, Anderson, Tatham, and Black (1998) indicate that Cronbach's alpha may decrease to .60 in exploratory studies.
Because of the exploratory nature of the current
study, this factor is considered valid for further
analysis. Factor scores for these five factors were
saved for later analyses.
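As a companion to the reliability figures reported above, the short helper below (illustrative only; the item names are assumptions) computes Cronbach's alpha from raw item scores using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale).

import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: respondents x items array of Likert ratings for one factor."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                              # number of items in the factor
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g. alpha for the passive control items (U)-(X):
# cronbach_alpha(items[["U", "V", "W", "X"]].to_numpy())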
ANOVA was conducted to examine the differences between the two genders in their reactions
to the five privacy factors. The results in Table 3
show that two variables (concern about privacy
disclosure and active control) were significantly
related to the gender variable. Both Table 2
and Table 3 indicate that male users were more
concerned about active control of the unwanted
presence of others, while female users were more
concerned about privacy disclosure. Gender differences were not found for the remaining variables: passive control, concern about giving out personal data, and concern about privacy practice.
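The gender comparison itself is a one-way ANOVA on each saved factor score. A minimal sketch, assuming a data frame with a gender column and one column per factor score (both names are hypothetical), is shown below; the F statistic and significance value it returns are the quantities reported in Table 3.

from scipy.stats import f_oneway

def gender_anova(df, factor_column):
    """One-way ANOVA comparing female and male respondents on one factor score."""
    female_scores = df.loc[df["gender"] == "female", factor_column]
    male_scores = df.loc[df["gender"] == "male", factor_column]
    return f_oneway(female_scores, male_scores)  # returns (F statistic, p-value)

# e.g. gender_anova(scores, "active_control") for Factor 4,
# reported in Table 3 as F = 4.45, p = .04.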




Table 2. Means and S.D. for the five privacy factors

Columns: N; Mean; Std. Deviation; Std. Error; 95% confidence interval for mean (lower bound, upper bound); Min.; Max.

(1) Concern about privacy practice
    Female   26   -.08   1.20   .24   -.57   .40   -4.22   1.30
    Male     71    .02    .93   .11   -.20   .24   -3.56   1.11
    Total    97   -.05   1.00   .10   -.21   .20   -4.23   1.30
(2) Concern about privacy disclosure
    Female   26    .38    .75   .15    .08   .69   -1.30   1.36
    Male     71   -.15   1.05   .12   -.40   .10   -2.22   1.48
    Total    97   -.08   1.00   .10   -.21   .19   -2.22   1.48
(3) Concern about giving out personal data
    Female   26   -.21   1.11   .22   -.66   .24   -2.64   1.31
    Male     71    .07    .96   .11   -.16   .30   -2.60   1.65
    Total    97   -.05   1.00   .10   -.21   .20   -2.64   1.65
(4) Active control
    Female   26   -.33    .96   .19   -.72   .05   -1.82   1.51
    Male     71    .14    .99   .12   -.09   .37   -2.15   1.90
    Total    97    .01    .99   .10   -.19   .21   -2.15   1.90
(5) Passive control
    Female   26    .25    .77   .15   -.06   .56   -1.48   1.50
    Male     71   -.06   1.03   .12   -.31   .18   -3.91   1.81
    Total    97    .02    .98   .10   -.18   .22   -3.91   1.81

Table 3. ANOVA for the five privacy factors

Columns: Sum of Squares; df; Mean Square; F; Sig.

(1) Concern about privacy practice
    Between groups     .22    1    .22    .22   .64
    Within groups    96.55   95   1.02
    Total            96.77   96
(2) Concern about privacy disclosure
    Between groups    5.45    1   5.45   5.69   .02
    Within groups    90.91   95    .96
    Total            96.36   96
(3) Concern about giving out personal data
    Between groups    1.47    1   1.47   1.47   .23
    Within groups    95.28   95   1.00
    Total            96.75   96
(4) Active control
    Between groups    4.28    1   4.28   4.45   .04
    Within groups    91.30   95    .96
    Total            95.58   96
(5) Passive control
    Between groups    1.86    1   1.86   1.95   .17
    Within groups    90.59   95    .95
    Total            92.45   96


Discussion and Recommendations


The first hypothesis predicts that females are
more concerned about privacy practice online
(Factor 1). With a p-value greater than .05, this
is not supported in the current study (p = .64).
The second hypothesis conjectures that females
are less likely to share their personal data with
online vendors (Factor 3). The result of ANOVA
suggests this hypothesis should also be rejected (p
= .23). There was no statistical difference between
the two gender groups in their concerns about
giving out personal data. However, the means of
variables clustered into the same factor range from
5.59 to 6.19 for females, and from 5.73 to 6.41 for
males, indicating that both gender groups were
very concerned about giving out their personal
data on the Internet. E-businesses should take this
concern into account and work toward alleviating
this condition through privacy assurances.
Concerns about disclosure of privacy procedures (Factor 2) were conjectured in the third
hypothesis. The results from Table 3 support this
hypothesis (p = .02). Females appear to be more
concerned about whether e-businesses disclose
their privacy procedures. As the results suggest,
females pay more attention to online privacy than
males. The "watcher and the watched" study also found similar
gender-related privacy concerns (Friedman et al.,
2006). E-businesses that disclose their privacy
practices in various modes, such as showing a
privacy statement, displaying a third-party privacy
logo, and creating alliances with other well-known
sites (Hu et al., 2003; Noteberg et al., 2003; Kim et
al., 2004; Aiken & Boush, 2005), are very likely
to draw attention from female users. It is likely
that privacy issues and practices are beginning to
emerge as a strong influencer of e-commerce
success. Privacy is an important consideration for those who are sensitive to their physical or virtual
environments. Females are known to possess a

high environmental sensitivity (Meyers-Levy &


Maheswaran, 1991) and will be the first to spot
the importance of privacy practices within their
virtual environments.
A lack of trust in e-privacy practices has been cited as
one cause of low consumer involvement in business-to-consumer electronic commerce (Hoffman
et al., 1999; Phelps et al., 2000; Eastlick et al.,
2006; Roman, 2007). Our present study supports
this assertion because female users are most likely
to be influenced by various privacy procedure
disclosures. Therefore, to foster trust and enhance
consumer perception of privacy protection, e-businesses are advised to make their privacy
procedures visible and understandable.
Hypotheses four and five predict that females
are more likely to adopt passive control mechanisms (Factor 5) to block the unwanted presence
of others, while males are more likely to take a
proactive approach (Factor 4) to prevent such a
presence. Hypothesis four was not supported (p =
.17), but hypothesis five was (p = .04). As evident
in existing literature, males are likely to exercise
control over messages and the communication medium. Unsolicited marketing e-mail and Web user
tracking components, such as cookies and Web
bugs, can be ineffective because male users will
control and filter out these unwanted messages.
As a result, business intelligence resulting from
mining the data collected from such user-tracking
software can be biased and distort online marketing decisions. E-businesses should be cautioned
to carefully interpret the mined data, which may
be subverted by male users' online behavior.

Limitations of the Study


Our study empirically assessed an Internet user's
ability to control his or her private information
and its relationship with the five types of privacy
assurance techniques. However, due to its early
assessment of privacy controls and its empirical
nature, this study is subject to limitations.




Table 4. Survey instrument

(Respondents rated each item on a seven-point scale: strongly agree, agree, somewhat agree, neutral, somewhat disagree, disagree, strongly disagree.)

Factor 1: Concern about privacy practice
(A) Computer databases that contain personal information should be protected from unauthorized access.
(B) E-businesses should never share personal information with other companies without proper authorization.
(C) E-businesses should never sell the personal information in their computer database to other companies.
(D) E-businesses should take more steps to make sure that unauthorized people cannot access personal information in their computers.
(E) E-businesses should devote more time and effort to preventing unauthorized access to personal information.
(F) When people give personal information to an e-business for some reason, the e-business should never use the information for any other reason.

Factor 2: Concern about privacy disclosure
(G) I feel more comfortable submitting personal information to sites that have established themselves as good e-commerce sites.
(H) I feel more comfortable submitting personal information to sites that have a privacy statement.
(I) I feel more comfortable submitting personal information to sites that have a privacy logo endorsed by another site.
(J) I feel more comfortable submitting personal information to sites that have a brick-and-mortar counterpart that I can shop in.
(K) I feel more comfortable submitting personal information to sites that have established themselves with other well-known companies.
(L) I feel more comfortable submitting personal information to sites that have a statement that guarantees protection of personal information.

Factor 3: Concern about giving out personal data
(M) It usually bothers me when e-businesses ask me for personal information.
(N) When e-businesses ask me for personal information, I sometimes think twice before providing it.
(O) It bothers me to give personal information to so many e-businesses.
(P) I am concerned that e-businesses are collecting too much personal information about me.

Factor 4: Active control
(Q) When downloading software from a Web site that requires registration, I fake my personal information to obtain the software.
(R) When visiting a Web page that requires registration, I fake my personal information to obtain access.
(S) I use several email accounts for privacy reasons.
(T) I know how to delete cookies from my computer.

Factor 5: Passive control
(U) I do not answer unsolicited telemarketing calls.
(V) I do not respond to unsolicited email.
(W) I block Web browsers from receiving cookies.
(X) I ignore chat requests from people that I don't know.


As noted previously, the current study sample


may not be entirely representative since students
were recruited to participate in the survey. However, the responses of these Web-savvy users may
indicate that much needs to be done to facilitate
effective e-business transactions. The sampling
issue may also influence the generalizability of
the findings. However, the differences found in
the relationship between privacy controls and user
concerns from this somewhat homogeneous group
of respondents may suggest that more dramatic differences could be expected from a broader sample
involving randomly selected participants.
Finally, this study does not address psychological and attitudinal factors, which may be avenues
for future exploration within the online privacy
context. Even with these constraints, the current study nonetheless provides preliminary, yet useful, insights into the body of privacy research. Moreover, it can be situated alongside the work of researchers from diverse disciplines who are currently examining related factors in other exploratory studies (Garbarino & Strahilevitz, 2004; Royal,
2005; Chesley, 2006; Kang & Yang, 2006; Fraser
& Henry, 2007; Hupfer & Detlor, 2007; Yao et
al., 2007).

Future Trends
The ongoing tension between the e-business need
for personal information and user willingness
to provide personal information in exchange for
goods and services will increasingly demand our
attention. As more information becomes available
online for individual use, more information is
collected by those offering the services. Consider
Internet behemoths, such as Google, that offer a
powerful search engine, e-mail, map searches
(including actual street views), word processing
software, and various other tools (Google, 2007a),
all online for free access and use by anyone.
However, these features may come at a price of
which most users are unaware: privacy. Whether


a user remains logged into her Google account or


keeps her Google Toolbar activated in her Web
browser (Google, 2007b), information is collected
at granular levels and saved for months or years on
Google's systems (O'Brien & Crampton, 2007).
Subverting this process requires extensive technical knowledge and will limit many of the proffered
Google features. Here we have the crux of the
issue: features versus privacy. To compound the
issue, experts have labeled Google as the worst
offender in terms of its privacy practices (CNN,
2007). Coupled with a pending acquisition of
DoubleClick by Google, we could see serious
privacy infringements via the amount and type
of personal information collected online (EPIC,
2007).
Google is not the only e-business in this
scenario of services versus privacy, but it is one
of the most prevalent. As more commodities are
offered online, more will be asked of users. Both
e-businesses and consumers must work together
to find the middle ground between how much
information can be collected, retained, and used,
and what users are willing to provide in exchange
for the goods and services.
One avenue towards this middle ground may
already be available to e-businesses: a rebirth
and reintegration of the W3C's P3P specification.
Although P3P is currently suspended, it would
take little effort to incorporate this specification
into current Web browsers via plug-in technology.
Additional initiatives based on P3P are already in
the works, such as the Policy Aware Web (2006)
and Prime (2007).
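To illustrate what such a reintegration could build on, the sketch below shows one server-side half of the P3P mechanism: advertising a machine-readable policy through the P3P HTTP response header. The policy location and the compact-policy tokens are illustrative placeholders, not a recommended or validated policy.

from http.server import BaseHTTPRequestHandler, HTTPServer

class P3PHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Point user agents at the full P3P policy and provide a compact summary.
        self.send_header("P3P", 'policyref="/w3c/p3p.xml", CP="NOI DSP COR NID"')
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Privacy-aware response</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), P3PHandler).serve_forever()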
No matter what the solution, without a dialogue about these issues, users will continue to
find both active and passive means to subvert
data collection techniques. As a result, e-businesses will continue to work to find new ways to
collect data, sometimes without users' consent.
We suggest the resources and time necessary for
an acceptable compromise would be better spent
looking for solutions together rather than fighting
for control of information.


Conclusion
Our study suggests that both female and male
users implement controls when utilizing online
services in the current e-business environment.
Males use multiple identities and techniques to
actively thwart data collection, whereas females
passively ignore information requests or altogether forgo participation. Whatever the case, if
e-businesses want to collect viable data in order
to improve online offerings and remain competitive, they must (1) implement an accessible and
easy-to-read privacy statement and (2) obtain
endorsement from well-known privacy groups
such as the BBBOnLine (BBBOnLine, 2007) and
TRUSTe (TRUSTe, 2007), as well as prominently
display the resulting certification logo. These two
items are the foundation upon which an e-business
can begin to form the initial trusting relationship
between itself and users.
Without trust between e-business and consumer, there can be no productive relationship
(Duffy, 2005; Flavián & Guinalíu, 2006; Roman,
2007). Consumers will share the required information if they understand and agree with the privacy
policies, as well as trust that an e-business will
only use their personal data to better the available personalized offerings. In the information
age, most consumers are bombarded with useless
information on a daily basis; customized offerings are a most welcome change. However, if the
trust is breached, an e-business will experience an
uphill battle to regain relationships with existing
customers and acquire new consumers.

Future Research Directions


Our study linking gender to attitudes toward online privacy is promising. E-businesses should take
note that female users tend to build relationships
with organizations that disclose their privacy policies (hypothesis three). E-businesses, particularly
those seeking to maximize their female customer

base, should consider meeting the requirements


of privacy and business organizations, such as
TRUSTe (TRUSTe, 2007) and the BBBOnLine
(BBBOnLine, 2007), thereby enabling the display
of the third-party trustmark logo on their Web
sites. Future research should look to help e-businesses determine the correct course of action to
maximize initial relationship formations.
Male users taking an active approach to privacy
protection (hypothesis five) suggests another track
of research. E-businesses that rely solely on Web
bugs, cookies, banner ads, and click streams are
losing valuable data because of the active control
of information male users employ during their
Web surfing forays. As a result, mined data might
be skewed in directions unfavorable to effective
marketing applications.
In recent years, Internet computing has experienced a major interaction shift with the influx
of social computing applications, such as social
networking sites like Facebook (Facebook, 2007)
and MySpace (MySpace, 2007), and the influx of
Web 2.0 services, such as Google Docs (Google,
2007c). Current calls for cross-disciplinary research (Parameswaran & Whinston, 2007) must
be heeded as we look to the changing social dynamics within online social networks (Jenkins &
Boyd, 2006) and how they affect gender-related
perceptions of privacy. For example, recent self-documented Facebook incidents (Chang, 2006; Leitzsey, 2006; Stapleton-Paff, 2007; Stokes, 2007)
of collegiate underage drinking and criminal
mischief are no longer uncommon. Both male
and female students share more of what used
to be considered personal within these public
spaces. Researchers must examine if this shift
translates into changing attitudes (gender-related or otherwise) toward all forms of control
of personal information, especially within the
e-business realm.
Our current study is designed to focus on
gender differences in several privacy issues. It
is likely that further insights can be discovered
when other user characteristics, such as computer


and Internet experience, culture, learning styles


and other types of online preferences, and demographic data, such as age, job type, and education,
are included. When these other factors are taken
into consideration, gender may or may not be the
sole factor that explains user privacy concerns.
Ultimately, our study suggests there are rich veins
of research yet to be mined. Online privacy issues
are in their infancy. As technology advances and
more people use the Internet, demographic studies concerning users' views of online privacy and
how e-business can remain effective, yet maintain
trusting relationships, are essential.

References

Ackerman, M. S., & Cranor, L. F. (1999). Privacy critics: UI components to safeguard users' privacy. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 99) (pp. 258-259). Pittsburgh, PA.

Aiken, K., & Boush, D. (2006). Trustmarks, objective-source ratings, and implied investments in advertising: Investigating online trust and the context-specific nature of internet signals. Academy of Marketing Science Journal, 34(3), 308-323.

Allen, A. (2000). Gender and privacy in cyberspace. Stanford Law Review, 52(5), 1175-1200.

Apple. (2007). iTunes [software]. Retrieved December 9, 2007, from http://www.apple.com/itunes/

Arbaugh, J. B. (2000). An exploratory study of effects of gender on student learning and class participation in an internet-based MBA course. Management Learning, 31(4), 503-519.

Ashrafi, N., & Kuilboer, J. (2005). Online privacy policies: An empirical perspective on self-regulatory practices. Journal of Electronic Commerce in Organizations, 3(4), 61-74.

BBBOnLine. (2007). Retrieved December 9, 2007, from http://www.bbbonline.org/

Berendt, B., Günther, O., & Spiekermann, S. (2005). Privacy in e-commerce: Stated preferences vs. actual behavior. Communications of the ACM, 48(4), 101-106.

Borland, J. (2006). Apple's iTunes raises privacy concerns. CNet News. Retrieved July 7, 2007, from http://news.com.com/Apples+iTunes+raises+privacy+concerns/2100-1029_3-6026542.html

Brunner, C. (1991). Gender and distance learning. Annals of the American Academy of Political and Social Science, 133-145.

Campbell, J., & Carlson, M. (2002). Panopticon.com: Online surveillance and the commodification of privacy. Journal of Broadcasting & Electronic Media, 46(4), 586-606.

Chang, J. (2006). Is Facebook private? Northwestern Chronicle. Retrieved October 2, 2007,


from http://www.chron.org/tools/viewart.
php?artid=1346
Chen, K., & Rea, A. (2004). Protecting personal
information online: A survey of user privacy concerns and control techniques. Journal of Computer
Information Systems, 44(4), 85-92.
Chen, Y., & Barnes, S. (2007). Initial trust and
online buyer behaviour. Industrial Management
& Data Systems, 107(1), 21-36.
Chesley, N. (2006). Families in a high-tech age:
Technology usage patterns, work and family
correlates, and gender. Journal of Family Issues,
27(5), 587-608.
Chua, S. L., Chen, D.T., & Wong, A. F. L. (1999).
Computer anxiety and its correlates: A metaanalysis. Computers in Human Behaviors, 15,
609-623.
Clarke, R. (1999). Internet privacy concerns confirm the case for intervention. Communications
of the ACM, 42(2), 60-67.


CNN. (2007). Google privacy Worst on the


Web. CNN.com. Retrieved July 10, 2007, from
http://www.cnn.com/2007/TECH/internet/06/11/
google.privacy.ap/
D'Astous, A. (2000). Irritating aspects of the shopping environment. Journal of Business Research,
49, 149-156.
Duffy, D. (2005). The evolution of customer loyalty strategy. The Journal of Consumer Marketing,
22(4/5), 284-286.
Eastlick, M., Lotz, S., & Warrington, P. (2006).
Understanding online B-to-C relationships: An
integrated model of privacy concerns, trust, and
commitment. Journal of Business Research,
59(8), 877-886.
Electronic Privacy Information Center (EPIC).
(1997). Surfer beware: Personal privacy and the
internet. Washington, D.C.: Electronic Privacy
Information Center. Retrieved July 7, 2007, from
http://www.epic.org/reports/surfer-beware.html
Electronic Privacy Information Center (EPIC).
(2007). Proposed Google/DoubleClick deal.
Washington, D.C.: Electronic Privacy Information
Center. Retrieved July 7, 2007, from http://www.
epic.org/privacy/ftc/google/
Facebook. (2007). Retrieved December 9, 2007,
from http://www.facebook.com/
Festa, P. (2002). Windows media aware of DVDs
watched. CNet News. Retrieved May 28, 2007,
from http://news.com.com/2100-1023-841766.
html
Fischer, K. (2007). Apple hides account info in
DRM-free music, too. Ars Technica. Retrieved
July 8, 2007, from http://arstechnica.com/news.
ars/post/20070530-apple-hides-account-info-indrm-free-music-too.html
Flavián, C., & Guinalíu, M. (2006). Consumer
trust, perceived security and privacy policy: Three

basic elements of loyalty to a web site. Industrial


Management & Data Systems, 106(5), 601-620.
Fram, E. H., & Grady, D. B. (1997). Internet
shoppers: Is there a surfer gender gap? Direct
Marketing, January, 46-50.
Fraser, S., & Henry, L. (2007). An exploratory
study of residential internet shopping in Barbados. Journal of Eastern Caribbean Studies,
32(1), 1-20, 93.
Friedman, B., Kahn, P., Hagman, J., Severson, R.,
& Gill, B. (2006). The watcher and the watched:
Social judgments about privacy in a public place.
Human-Computer Interaction, 21(2), 235-272.
Garbarino, E., & Strahilevitz, M. (2004). Gender
differences in the perceived risk of buying online
and the effects of receiving a site recommendation.
Journal of Business Research, 57(7), 768-775.
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An
extension to the technology acceptance model.
MIS Quarterly, 21(4), 389-400.
Goodwin, C. (1991). Privacy: Recognition of
a consumer right. Journal of Public Policy &
Marketing, 10(1), 149-166.
Google. (2007a). Retrieved December 9, 2007,
from http://www.google.com/
Google. (2007b). Retrieved December 9, 2007,
from http://toolbar.google.com/
Google. (2007c). Retrieved December 9, 2007,
from http://documents.google.com/
Gumpert, G., & Drucker, S. (1998). The demise
of privacy in a private world: From front porches
to chat rooms. Communication Theory, 8(4),
408-425.
Ha, Y., & Stoel, L. (2004). Internet apparel
shopping behaviors: The influence of general innovativeness. International Journal of Retail &
Distribution Management, 32(8/9), 377-385.


Hair, J. F., Jr., Anderson, R. E., Tatham, R. L., &


Black, W. C. (1998). Multivariate data analysis
(5th ed.). UpperSaddle River, NJ: Prentice Hall.
Han, P., & Maclaurin, A. (2002). Do consumers really care about online privacy? Marketing
Management, 11(1), 35-38.
Hemphill, T. (2002). Electronic commerce and
consumer privacy: Establishing online trust in
the U.S. digital economy. Business and Society
Review, 107(2), 221-239.
Herring, S. C. (1992). Gender and participation
in computer-mediated linguistic discourse. Washington, DC: ERIC Clearinghouse on Languages
and Linguistics, Document no. ED 345552.
Hoffman, D. L., Novak, T. P., & Peralta, M. (1999).
Building consumer trust online. Communications
of the ACM, 42(4), 80-85.
Hu, X., Lin, Z., & Zhang, H. (2003). Trust promoting seals in electronic markets: An exploratory study of their effectiveness for online sales
promotion. Journal of Promotion Management,
9(1/2), 163-180.
Hui, K. L., Tan, B., & Goh, C. Y. (2006). Online
information disclosure: Motivators and measurements. ACM Transactions on Internet Technology,
6(4), 415-441.
Hupfer, M., & Detlor, B. (2007). Beyond gender
differences: Self-concept orientation and relationship-building applications on the internet. Journal
of Business Research, 60(6), 613-619.
Janda, S., Trocchia, P., & Gwinner, K. (2002).
Consumer perceptions of internet retail service
quality. International Journal of Service Industry
Management, 13(5), 412-431.
Jenkins, H., & Boyd, D. (2006). MySpace and
deleting online predators act (DOPA). MIT Tech
Talk. Retrieved October 12, 2007, from http://web.
mit.edu/cms/People/henry3/MySpace.pdf


Kang, H., & Yang, H. (2006). The visual characteristics of avatars in computer-mediated communication: Comparison of internet relay chat
and instant messenger as of 2003. International
Journal of Human-Computer Studies, 64(12),
1173-1183.
Kilbourne, W., & Weeks, S. (1997). A socio-economic perspective on gender bias in technology.
Journal of Socio-Economics, 26(1), 243-260.
Kim, D. J., Steinfield, C., & Lai, Y. (2004). Revisiting the role of web assurance seals in consumer
trust. In M. Janssen, H. G. Sol, & R. W. Wagenaar (Eds.), Proceedings of the 6th International
Conference on Electronic Commerce (ICEC 04,
60) (pp. 280-287). Delft, The Netherlands: ACM
Press.
King, J., Bond, T., & Blandford, S. (2002). An
investigation of computer anxiety by gender
and grade. Computers in Human Behavior, 18,
69-84.
Krill, P. (2002). DoubleClick discontinues web
tracking service. ComputerWorld. Retrieved February 24, 2002, from http://www.computerworld.
com/storyba/0,4125,NAV47_STO67262,00.html
Large, A., Beheshti, J., & Rahman, T. (2002).
Gender differences in collaborative web searching
behavior: An elementary school study. Information Processing & Management, 38, 427-443.
Lauer, T., & Deng, X. (2007). Building online trust
through privacy practices. International Journal
of Information Security, 6(5), 323-331.
Leitzsey, M. (2006). Facebook can cause problems
for students. Online PacerTimes. Retrieved October 2, 2007, from http://media.www.pacertimes.
com/media/storage/paper795/news/2006/01/31/
News/Facebook.Can.Cause.Problems.For.Students-1545539.shtml
Levy, S., & Gutwin, C. (2005). Security through
the eyes of users. In Proceedings of the 14th In-


ternational Conference on the World Wide Web


(pp. 480-488). Chiba, Japan.
McElhearn, K. (2006). iSpy: More on the iTunes
MiniStore and privacy. Kirkville. Retrieved July
7, 2007, from http://www.mcelhearn.com/article.
php?story=20060112175208864

Nachimias, R., Mioduser, D., & Shelma, A. (2001).


Information and communication technologies
usage by students in an Israeli high school: Equity, gender, and inside/outside school learning
issues. Education and Information Technologies,
6(1), 43-53.

McKnight, D. H., Choudhury, V., & Kacmar, C.


(2000). Trust in e-commerce vendors: A two-stage
model. In Proceedings of the Association for Information Systems (pp. 532-536). Atlanta, GA.

Nam, C., Song, C., Lee, E., & Park, C. (2005).


Consumers' privacy concerns and willingness
to provide marketing-related personal information online. Advances in Consumer Research,
33, 212-217.

Metzger, M. (2006). Effects of site, vendor, and


consumer characteristics on web site trust and
disclosure. Communication Research, 33(3),
155-179.

Noteberg, A., Christiaanse, E., & Wallage, P.


(2003). Consumer trust in electronic channels.
E-Service Journal, 2(2), 46-67.

Meyers-Levy, J., & Maheswaran, D. (1991). Exploring differences in males' and females' processing strategies. Journal of Consumer Research,
18(June), 63-70.
Microsoft. (2003). Windows media player 9 series
privacy settings. Retrieved July 7, 2007, from
http://www.microsoft.com/windows/windowsmedia/player/9series/privacy.aspx
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online
retailer disclosures. Journal of Public Policy &
Marketing, 19(Spring), 54-61.

Nowak, G. J., & Phelps, J. (1992). Understanding


privacy concerns: An assessment of consumers'
information related knowledge and beliefs. Journal of Direct Marketing, 6(Autumn), 28-39.
O'Brien, K., & Crampton, T. (2007). EU asks Google to explain data retention policies. International Herald Tribune. Retrieved July 10, 2007,
from http://www.iht.com/articles/2007/05/25/
business/google.php
Oakes, C. (1999). Mouse pointer records clicks.
Wired News. Retrieved June 28, 2006, from http://
www.wired.com/news/print/0,1294,32788,00.
html

Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. Journal of Consumer Affairs,
35(1), 27-44.

Ono, H., & Zavodny, M. (2003). Gender and


the internet. Social Science Quarterly, 84(1),
111-121.

Moores, T. (2005). Do consumers understand the


role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.

Ono, H., & Zavodny, M. (2005). Gender differences in information technology usage: A U.S.Japan comparison. Sociological Perspectives,
48(1), 105-133.

Mukherjee, A., & Nath, P. (2007). Role of electronic trust in online retailing: A re-examination of
the commitment-trust theory. European Journal
of Marketing, 41(9/10), 1173-1202.

Pan, Y., & Zinkhan, G. (2006). Exploring the


impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.

MySpace. (2007). Retrieved December 9, 2007,


from http://www.myspace.com/

Papazafeiropoulou, A., & Pouloudi, A. (2001).


Social issues in electronic commerce: Implica-


tions for policy makers. Information Resources


Management Journal, 14(4), 24-32.
Parameswaran, M., & Whinston, A. (2007).
Research issues in social computing. Journal of
the Association for Information Systems, 8(6),
336-350.
Patton, M., & Josang, A. (2004). Technologies for
trust in electronic commerce. Electronic Commerce Research, 4(1-2), 9-21.
Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy
concerns and consumer willingness to provide
personal information. Journal of Public Policy
& Marketing, 19(1), 27-41.
Policy Aware Web. (2006). Retrieved July 10,
2007, from http://www.policyawareweb.org/
Privacy and Identity Management for Europe.
(Prime). (2007). Retrieved July 10, 2007, from
https://www.prime-project.eu/

Roman, S. (2007). The ethics of online retailing: A scale development and validation from
the consumer's perspective. Journal of Business
Ethics, 72(2), 131-148.
Roussous, G., & Moussouri, T. (2004). Consumer perceptions of privacy, security and trust
in ubiquitous commerce. Pers Ubiquit Comput,
8, 416-429.
Royal, C. (2005). A meta-analysis of journal articles intersecting issues of internet and gender.
Journal of Technical Writing and Communication,
35(4), 403-429.
Savicki, V., Kelley, M., & Lingenfelter, D. (1996).
Gender and small task group activity using computer-mediated communication. Computers in
Human Behavior, 12, 209-224.

Qutami, Y., & Abu-Jaber, M. (1997). Students'


self-efficacy in computer skills as a function of
gender and cognitive learning style at Sultan
Qaboos University. International Journal of
Instructional Media, 24(1), 63-74.

Schoder, D., & Yin, P. (2000). Building firm trust


online. Communications of the ACM, 43(12),
73-79.

Radcliff, D. (2001). Giving users back their


privacy. ComputerWorld. Retrieved February
28, 2002, from http://www.computerworld.com/
storyba/0,4125,NAV47_STO61981,00.html

Schuman, E. (2006). Gartner: $2 billion in ecommerce sales lost because of security fears.
eWeek.com, November 27. Retrieved October
15, 2007, from http://www.eweek.com/article2/0,1895,2063979,00.asp

Riegelsberger, J., Sasse, M., & McCarthy, J. (2003).


Shiny happy people building trust?: Photos on
e-commerce websites and consumer trust. In
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (pp. 121-128). Ft.
Lauderdale, FL.
Roach, R. (2001). Internet usage reflects gender
breakdown. Black Issues in Higher Education,
18(11), 119.
Rohm, A. J., & Milne, G. R. (1998). Emerging marketing and policy issues in electronic commerce: Attitudes and beliefs of internet users. In A. Andreasen, A. Simonson, & N. C. Smith (Eds.), Proceedings of Marketing and Public Policy (Vol. 8, pp. 73-79). Chicago: American Marketing Association.


Sexton, R., Johnson, R., & Hignite, M. (2002).


Predicting internet/e-commerce use. Internet
Research, 12(5), 402-410.
Shalhoub, Z. (2006). Trust, privacy, and security in electronic business: The case of the GCC
countries. Information Management & Computer
Security, 14(3), 270-283.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996).
Information privacy: Measuring individuals'
concerns about organizational practices. MIS
Quarterly, June, 167-196.


Smith, R. (2002a). Serious privacy problems in


windows media player for Windows XP. Retrieved
July 7, 2007, from http://www.computerbytesman.
com/privacy/wmp8dvd.htm
Smith, R. (2002b). Microsoft response to the windows media player 8 privacy advisory. Retrieved
July 7, 2007, from http://www.computerbytesman.
com/privacy/wmp8response.htm
So, M., & Sculli, D. (2002). The role of trust,
quality, value and risk in conducting e-business.
Industrial Management & Data Systems, 102(8/9),
503-512.
Stapleton-Paff, K. (2007). Facebook poses
privacy issues for students. The Daily of the
University of Washington. Retrieved October
2, 2007, from http://www.thedaily.washington.
edu/article/2007/4/25/facebookPosesPrivacyIssuesForStudents
Stokes, M. (2007). Police charge three female
students for sign stealing. Western Herald. Retrieved October 22, 2007 from http://media.www.
westernherald.com/media/storage/paper881/
news/2007/10/22/News/Police.Charge.Three.Female.Students.For.Sign.Stealing-3046592.shtml
Tavani, H. (1999). Privacy online. Computers and
Society, 29(4), 11-19.
Tavani, H., & Moor, J. (2001). Privacy protection,
control of information, and privacy-enhancing
technologies. Computers and Society, 31(1), 611.

TRUSTe. (2007). Retrieved December 9, 2007,


from http://www.truste.org/
Van Dyke, T., Midha, V., & Nemati, H. (2007).
The effect of consumer privacy empowerment
on trust and privacy concerns in e-commerce.
Electronic Markets, 17(1), 68-81.
Wang, H., Lee, M., & Wang, C. (1998). Consumer
privacy concerns about internet marketing. Communications of the ACM, 41(3), 63-70.
Wang, P., & Petrison, L. A. (1993). Direct marketing activities and personal privacy. Journal of
Direct Marketing, 7(Winter), 7-19.
WebTrends. (2007). Retrieved December 9, 2007,
from http://www.webtrends.com/
Weible, R. J. (1993). Privacy and data: An empirical study of the influence and types and data
and situational context upon privacy perceptions.
(Doctoral Dissertation, Department of Business
Administration, Mississippi State University).
Dissertation Abstracts International, 54(06).
World Wide Web Consortium (W3C). (2006).
The platform for privacy preferences 1.1 (P3P1.1)
specification. World Wide Web Consortium.
Retrieved July 7, 2007, from http://www.w3.org/
TR/P3P11/
World Wide Web Consortium (W3C). (2007). The
platform for privacy preferences (P3P) project.
World Wide Web Consortium. Retrieved July 5,
2007, from http://www.w3.org/P3P/

Teo, T., & Lim, V. (1997). Usage patterns and


perceptions of the internet: The gender gap. Equal
Opportunities International, 16(6/7), 1-8.

Yao, M., Rice, R., & Wallis, K. (2007). Predicting user concerns about online privacy. Journal
of the American Society for Information Science
and Technology, 58(5), 710-722.

Teo, T. S. H., Lim, V. K. G., & Lai, R. Y. C.


(1999). Intrinsic and extrinsic motivation in internet usage. OMEGA: International Journal of
Management Science, 27, 25-37.

Yousafzai, S., Pallister, J., & Foxall, G. (2003). A


proposed model of e-trust for electronic banking.
Technovation, 23(11), 847-860.


Additional Reading
Antón, A., Bertino, E., Li, N., & Yu, T. (2007). A
roadmap for online privacy policy management.
Communications of the ACM, 50(7), 109-116.
Bonini, S., McKillop, K., & Mendonca, L. (2007).
The trust gap between consumers and corporations. The McKinsey Quarterly, 2, 7-10.
Boyens, C., Günther, O., & Teltzrow, M. (2002).
Privacy conflicts in CRM services for online
shops: A case study. In Proceedings of the IEEE
International Conference on Privacy, Security,
and Data Mining Maebashi City, Japan, (Vol.
14, pp. 27-35).
Curtin, M. (2002). Developing trust: Online privacy and security. New York: Springer-Verlag.
Cvrcek, D., Kumpost, M., Matyas, V., & Danezis,
G. (2006). A study on the value of location privacy. In Proceedings of the 5th ACM Workshop
on Privacy in Electronic Society (pp. 109-118).
Alexandria, VA.
Drennan, J., Sullivan, G., & Previte, J. (2006).
Privacy, risk perception, and expert online behavior: An exploratory study of household end
users. Journal of Organizational and End User
Computing, 18(1), 1-22.
Earp, J., & Baumer, D. (2003). Innovative web
use to learn about consumer behavior and online
privacy. Communications of the ACM, 46(4),
81-83.
Electronic Frontier Foundation (EFF). (2007).
Privacy issues. Retrieved July 7, 2007, from
http://www.eff.org/Privacy/
Electronic Privacy Information Center (EPIC).
(2007). EPIC online guide to practical privacy
tools. Retrieved July 7, 2007, from http://www.
epic.org/privacy/tools.html
Frackman, A., Ray, C, & Martin, R. (2002). Internet and online privacy: A legal and business
guide. New York: ALM Publishing.

Freeman, L., & Peace, A. (Eds.). (2005). Information ethics: Privacy and intellectual property.
Hershey, PA: Information Science Publishing.
Golle, P. (2006). Revisiting the uniqueness of
simple demographics in the U.S. population. In
Workshop on Privacy in the Electronic Society:
Proceedings of the 5th ACM Workshop on Privacy
in the Electronic Society (pp. 77-80). Alexandria,
VA.
Gross, R., Acquisti, A., & Heinz, H. (2005).
Information revelation and privacy in online
social networks. In Workshop on Privacy in The
Electronic Society: Proceedings of the 2005 ACM
Workshop on Privacy in the Electronic Society
(pp. 71-80). Alexandria, VA.
Kauppinen, K., Kivimäki, A., Era, T., & Robinson, M. (1998). Producing identity in collaborative virtual environments. In Proceedings of the
ACM Symposium on Virtual Reality Software and
Technology (pp. 35-42). Taipei, Taiwan.
Khalil, A., & Connelly, K. (2006). Context-aware
telephony: Privacy preferences and sharing patterns. In Proceedings of the 2006 20th Anniversary
Conference on Computer Supported Cooperative
Work (pp. 469-478). Banff, Alberta, Canada.
Landesberg, M., Levin, T., Curtain G, & Lev,
O. (1998). Privacy online: A report to congress.
Federal Trade Commission. Retrieved July 7,
2007, from http://www.ftc.gov/reports/privacy3/
toc.shtm
Lumsden, J., & MacKay, L. (2006). How does
personality affect trust in B2C e-commerce?
In ACM International Conference Proceeding
Series, Vol. 156: Proceedings of the 8th International Conference on Electronic Commerce: The
New E-Commerce: Innovations for Conquering
Current Barriers, Obstacles and Limitations to
Conducting Successful Business on the Internet.
(pp. 471-481). Fredericton, New Brunswick,
Canada.


McCloskey, D. W. (2006). The importance of ease


of use, usefulness, and trust to online consumers:
An examination of the technology acceptance
model with older customers. Journal of Organizational and End User Computing, 18(3), 47-65.
Novak, J., Raghavan, P., & Tomkins, A. (2004).
Anti-aliasing on the web. In Proceedings of the
13th International Conference on World Wide
Web (pp. 30-39). New York.
Peters, T. (1999). Computerized monitoring and
online privacy. Jefferson, NC: McFarland &
Co.

Reagle, J., & Cranor, L. F. (1999). The platform


for privacy preferences. Communications of the
ACM, 42(2), 48-55
Tan, F. B., & Sutherland, P. (2004). Online consumer trust: A multi-dimensional model. Journal
of Electronic Commerce in Organizations, 2(3),
40-58.
Walters, G. J. (2001). Privacy and security: An
ethical analysis. SIGCAS Computer Soc., 31(2),
8-23.
Yee, G. (2006). Privacy protection for e-services.
Hershey, PA: IGI Publishing.

Ratnasingam, P. (2003). Inter-organizational trust


for business to business e-commerce. Hershey,
PA: IRM Press.


Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns of Computer Hacker Insiders and Outsiders
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA

Abstract

This chapter looks at the literature, including myths and realities, surrounding the demographics, psychological predispositions, and social/behavioral patterns of computer hackers, to better understand the harms that can be caused to targeted persons and property by online breaches. The authors suggest that a number of prevailing theories regarding those in the computer underground (CU), such as those espoused by the psychosexual theorists, may be less accurate than theories based on gender role socialization, given recent empirical studies designed to better understand those in the CU and why they engage in hacking and cracking activities. The authors conclude the chapter by maintaining that online breaches and online concerns regarding privacy, security, and trust will require much more complex solutions than currently exist, and that teams of experts in psychology, criminology, law, and information technology security need to collaborate to bring about more effective real-world solutions for the virtual world.



Introduction
Hackers are the elite corps of computer designers
and programmers. They like to see themselves
as the wizards and warriors of tech. Designing
software and inventing algorithms can involve
bravura intellection, and tinkering with them is as
much fun as fiddling with engines. Hackers have
their own culture, their own language. And in the
off-hours, they can turn their ingenuity to sparring
with enemies on the Net, or to the midnight stroll
through systems you should not be able to enter,
were you not so very clever. Dark-side hackers, or
crackers, slip into systems for the smash-and-grab,
but most hackers are in it for the virtuoso ingress.
It is a high-stress life, but it can be amazing fun.
Imagine being paid, well paid, to play forever with the toys you love. Imagine. (St. Jude, Mondo 2000: User's Guide to the New Edge)
Since its appearance in the United States in
the second part of the twentieth century, the Internet has been the topic of arduous study from
a number of academic disciplines, including the
social sciences and criminology, business, law,
computer science, and political science. In recent
decades, as the Internet has expanded at unprecedented rates, and with different socio-economic
interests becoming increasingly involved, the
Internets impact on global citizens daily lives
has been profound. The Internet has become one
of the most important ways of communicating internationally in real time (such as is the case with online activism, known in the information technology field as hacktivism). Also, the complex infrastructure of the Internet has, on the positive side, facilitated a number of common activities, such as e-commerce, Internet banking, online gaming, and online voting, and has provided a more level political and economic playing field for citizens residing in both developed and developing nations, particularly in
China, India, Russia, and Pakistan.

Moreover, in recent years in developed nations,


professionals have been able to broaden their
returns to society by adopting Internet-related
technologies. For example, using hand-held devices, doctors have been able to access patients' health histories and diagnostic records over the Internet without having to rely on snail mail courier services, and high-tech billionaires such as those who started the Google search engine (with a November, 2005, market cap of US$120 billion) have pushed the online entrepreneurial envelope to a whole new higher and societally beneficial plane (Schell, 2007).
However, with the growth of and diversity
in Internet traffic, a dark side has surfaced, particularly since the late 1980s as more and more
citizens have become able to afford personal
computers (PCs) and online accounts. Thus, techsavvy criminals have increasingly made use of
the Internet to perpetrate online crimes, causing an increase in incidences of online child exploitation, identity theft, intellectual property theft, worm and virus infestations of business and home computers, and online fraud involving its many presentations (e-commerce, voting, and gaming). Consequently, Internet-connected citizens worldwide have become increasingly fearful that their privacy (including personal health histories, banking transactions, social security numbers, and online voting preferences) would be vulnerable to destruction or alteration by mal-inclined computer hackers (known as crackers). Too,
business leaders have become concerned that not
only will their computer networks be tampered by
tech-savvy outsiders but also by insider employees
determined to destroy critical business data when,
say, they leave the firm under less than happy
circumstances (Schell, 2007).
In the last decade, in particular, with the
growth of the Internet and electronic or e-commerce, the amount of personal information that
can potentially be collected about individuals by
corporations, financial and medical institutions,
and governments has also increased. Such data


collection, along with usage tracking and the


sharing of data with third parties (especially in light of the reality that such actions can easily be accomplished through high-speed links and high-capacity storage devices without the consumer's expressed knowledge or consent) has raised a multitude of issues among Internet users about privacy (by definition, the state of being free from unauthorized access), security (by definition, being protected from adversaries, particularly from those who would do harm, even unintentionally, to property or to a person or persons), and trust (by definition, the element present in business relationships when one partner willingly depends on an exchanging partner in whom one has confidence, including online business exchanges)
(Grami & Schell, 2004).
Computer systems are regularly attacked to do
harm or for personal gain in today's wired and
wireless world (see Taylor, Caeti, Loper, Fritsch, &
Liederback, 2006). Such mal-inclined attacks are
often referred to as hacks but in the computer
underground (CU), they are known as cracks.
Both the positively-motivated, authorized hackers and the negatively-motivated, non-authorized
crackers have a profound interest in computers
and technology, and they like to use their knowledge to access computer systems (Schell, Dodge,
& Moutsatsos, 2002). Many in the general public
identify hackers as a primary threat to computer
security, and there is significant media attention
given to dramatic computer crimes attributed to
them (Furnell, 2002).
From a technical perspective, privacy issues
in the security sense regarding cracks of computer systems include digital rights management,
spam deterrence, anonymity maintenance, and
disclosure rule adequacy. Over the past few
years, a number of recorded cyber intruders into
companies and institutions computer networks
have violated the privacy rights of employees
and online registrants. In 2005, for example,
the media publicized evidence suggesting that
the Internet information brokerage industry


is poorly regulated, for on or about March 10, cyber criminals stole passwords belonging to as many as 32,000 legitimate online American users in a database owned by the renowned LexisNexis
Group. Similar computer network breaches occurred at about that time at ChoicePoint, Inc., and
at the Bank of America, prompting calls for the
U.S. federal government oversight through the
General Services Administration to investigate
the matter. Recommendations were to follow
about providing adequate protection for the safety
of federal employees information, and fears of
identity theft surfaced on a broad scale. In short,
the present-day reality is that regardless of how
well-intentioned management is about protecting employees' privacy rights as they navigate online, valuable personal information can be collected by hidden online tools such as cookies (small bits of data transmitted from a Web server to a Web browser that personalize a Web site for users) and Web bugs, and then that information can be
shared with third parties for marketing purposes
or surveillance.
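For concreteness, the toy fragment below (not drawn from the chapter) shows the two tracking mechanisms just mentioned: a persistent cookie issued in a response header and a one-pixel "Web bug" image whose request reports the page view to a third-party host. The identifier and tracker address are hypothetical.

from http import cookies

# Server side: issue an identifying cookie that persists for a year.
jar = cookies.SimpleCookie()
jar["visitor_id"] = "abc123"                      # hypothetical identifier
jar["visitor_id"]["max-age"] = 60 * 60 * 24 * 365
print(jar.output())  # emits a Set-Cookie header for the HTTP response

# Page side: an invisible image whose request discloses the visit to a tracker.
web_bug = '<img src="https://tracker.example/pixel.gif?page=checkout" width="1" height="1" alt="">'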
Moreover, with the increased usage in recent
years of cellular phones and hand-held computers, users have become vulnerable to security
breaches, for wireless communications rely on
open and public transmission media (over the
air). Thus, the mobile security challenges relate
to the user's mobile device, the wireless access
network, the wired-line backbone network, and
mobile commerce software applications. All of
these challenges must be addressed by the companies releasing such products into the marketplace,
including providing fixes for software vulnerabilities. Moreover, unlike wire-line networks, the
uniqueness of wireless networks poses a number
of complex challenges for security experts, such
as vulnerability of the air interface, an open peer-to-peer (P2P) network architecture (in mobile
and ad hoc networks), a shared wireless medium,
the limited computing power of mobile devices,
a highly dynamic network topology, and the low
data rates and frequent disconnects of wireless


communications. The media have often reported


that coping with these wireless devices' vulnerabilities costs companies considerable money;
for example, the chargeback rate for credit card
transactions using wireless devices is about 15
times higher than that for in-store point-of-sale
credit card transactions (Grami & Schell, 2004).
Mobile services, in general, are prone to two
types of risks as a result of security vulnerabilities: subscription fraud (more commonly known
as identity theft) and device theft (i.e., stolen
devices).
In 2003, the notion of the vulnerability of
wireless computer systems came to the forefront
in an interesting case involving a U.S. computer
security analyst named Stefan Puffer. After he
determined that the Harris County district clerks
wireless computer network was vulnerable to
crack attacks, he warned the clerks office that
anyone with a wireless network card could gain
access to their sensitive data. Though Puffer was
charged by police for cracking the network, he
was later acquitted by a Texas jury. Moreover,
on September 8, 2003, while many students were
returning to school, the media reported that a
young hacker from Massachusetts pleaded guilty
to cracking Paris Hilton's T-Mobile cellular phone
and dumped personal information of hers on the
Internet for millions around the world to see. He
also admitted to cracking other computer systems
a year earlier and to stealing personal information
without authorization. But his exploits did not stop
there; the youth said he also sent bomb threats
through the Internet to high schools in Florida
and Massachusetts. The authorities estimated that the harm caused to persons and to property by these exploits totaled about US$1 million, and
like other juveniles who have cracked systems,
he was sentenced to eleven months in a detention
center (Schell, 2007).
Finally, the essence of any business transaction, whether online, in industry, or in retail outlets, depends on trust, commonly expressed in laws, contracts, regulations, policies, and personal reputations. Recent evidence has indicated that
though consumers would tend not to initially
trust someone knocking on a house door trying
to sell expensive goods, many consumers using
the Internet to purchase goods or to communicate
with others seem to be overly trusting. Numerous consumers fall victim to spammers (those
taking advantage of users' e-mail accounts by
swamping them with unwanted advertising using
false but legitimate-looking headers), download
virus-infected software, or engage in online
chat rooms with strangers, sometimes sharing
personal information which can later cause them
or their loved ones harm. Though many efforts
in the United States have been made to eradicate
spam, including the creation of filters to stop it
from getting through the network and the passage
of laws like the U.S. CAN-SPAM Act of 2003, to
date, these remedies have proven not to be fully
effective.
For example, on April 8, 2005, a landmark legal
case concluded that spammer Jeremy Jaynes of
Raleigh, North Carolina, who went by the name "Gaven Stubberfield," was guilty of massive
spamming and was sentenced to nine years in a U.S. prison for violating the U.S. CAN-SPAM
Act. Described by prosecutors as being among
the top 10 spammers in the world, this case is
considered to be important because it was the
United States' first successful felony prosecution
for transmitting spam over the Internet. Jaynes
apparently transmitted 10 million e-mails a day
using 16 high-speed lines. For these mal-inclined
exploits and breaches of trust, he allegedly earned
as much as $750,000 a month on his spamming
operation (Schell & Martin, 2006).
Another interesting cyber case of breach of
online trust occurred in March, 2005, when the
Harvard Business School administration said that
because of unauthorized intrusions, they decided
to reject almost 120 applicants who had followed
a hacker's instructions on how to break into the university's admissions Web site to learn
if they had been admitted to this prestigious
university. The administrators argued that these
intrusions were unethical and that the actions
of the potential students breached trust. Other
universities took similar punitive approaches to
similar cracking incidents, including Carnegie
Mellon University's School of Business. The common thread involved in these breaches was the
use of the Apply Yourself online application and
notification software (Associated Press, 2005).
Breaking into a computer typically involves
discovering vulnerabilities and then creating an
exploit (a program or set of instructions to be followed) to take advantage of the vulnerabilities.
These vulnerabilities, as well as their related
exploit programs, if released into the public
domain, can be used by many other individuals,
regardless of their inclinations (Rescorla, 2005).
For example, system administrators tend to use
them to test their systems, and good-natured computer hackers maintain that they capitalize on the
vulnerabilities just to have a good time.
There are also the malicious crackers who scan
systems to determine which have vulnerabilities
and then plan an attack. Crackers typically aim to
get revenge on some perceived enemy or to make
a profit from the cyber attack. It is not uncommon,
in fact, for crackers to verify the success of their
attacks, for it brings them considerable pleasure
(Meinel, 2006).
While detailing how hackers break into computers is beyond the scope of this chapter, readers can refer to Appendix A, "How Do Hackers Break into Computers?" by Carolyn Meinel, cited in Schell and Martin (2006), to get a better idea of how exploits are completed.
Considering the significant amount of valuable
information contained in business, government,
and personal computers worldwide, it is necessary to consider the risks and prevalence of attacks against computer networks. According to
recent estimates, the costs to victims of malicious
computer attacks have totaled more than $10 billion since 2000, including recent cellular phone
exploits (IBM Research, 2006).
The CSI/FBI (Computer Security Institute/
Federal Bureau of Investigation) Computer Crime
and Security Survey has annually attempted to
assess the costs of computer attacks to industry,
medical and educational institutions, and government agencies for the past 12 years by asking those
involved in the IT security role to respond to the
questions posed. The most recent 2007 version
is simply known as the CSI survey, designed solely by
the Computer Security Institute.
The 2007 survey findings revealed that the
average annual loss reported in this past year
rose to $350,424 from $168,000 the previous
year. Not since the 2004 report have the average
losses been this high. Moreover, about one-fifth
of the IT security respondents said that their firms
tended to suffer a "targeted attack," meaning
that the mal-ware attack was aimed exclusively
at their organization or those within a given
subset. Also, financial fraud in 2007 overtook virus attacks, the major source of financial loss in 2006. Another significant cause of loss was
system intrusion by outsiders. Insider abuse of
network access or e-mail (including trafficking
in child exploitation pornography or software
pirating) also edged out virus incidents as the most prevalent form of security problem, with
59% and 52% of the respondents, respectively,
reporting these (Richardson, 2007).
The 2006 CSI/FBI survey findings further
revealed that most IT respondents perceived that
their organizations' major cyber losses resulted
from system breaches by outsiders (i.e., those
not employed by the company). However, about
33% of the respondents believed that insider
threats accounted for at least one-third of their
sizable network abuse problems (Gordon, Loeb,
Lucyshyn, & Richardson, 2006).
The 2006 survey results suggested that within
the enterprise security perimeter, the news is
good, for the survey respondents maintained that
they are keeping their cyber crime losses lower.
At the same time, in the developed and developing world, our economic reliance on computers
and technology is growing, and so are criminal
threats. Because criminal threats are becoming
more sophisticated, IT security experts around
the globe should not overestimate these recent
gains (Cincu & Richardson, 2006).
It is vital that researchers in a number of fields
better understand the psychological and behavioral
composition of network attackers and the social
dynamics that they operate within. Knowledge of
the personalities, behaviors, and communication
patterns of computer attackers can help security
researchers and practitioners to prepare for future
exploits and to reduce costs due to electronic intrusion, alteration, and theft. Better preparation can
also minimize damage to consumer confidence,
privacy, and security in e-commerce Web sites
and general information-sharing within and across
companies (Cincu & Richardson, 2006).

PURPOSE OF THIS CHAPTER
The purpose of this chapter is to summarize what
is known in the literature about the demographic,
psychological, and social/behavioral patterns of
computer hackers and crackers. This information
can improve our knowledge of cyber intruders
and aid in the development of effective techniques
and best practices to stop them in their tracks.
There is little question among IT security experts
that when it comes to privacy issues, hackers and
crackers are often ignored. As a matter of fact, if
crackers do attack an international database containing high degrees of personal and/or homeland
security information (with the 2005 LexisNexis
database exploit serving as a smaller-scale case in
point), this large-scale exploit could cause massive
disasters affecting citizens across multitudes of
jurisdictions, including a critical infrastructure
failure. This chapter intends to assist in shedding
light on what is known about how hackers and
crackers generally tend to think and behave.
Specific topics covered in this chapter include:

• Background: A briefing on basic hacking/cracking vocabulary, the coincidence of the four critical elements constituting a cyber crime, and the common types of unauthorized use
• Issue and controversy #1: Known demographic and behavioral profiles of hackers and crackers: behavioral misfits or seemingly normals?
• Issue and controversy #2: Psychological myths and truths about those in the computer underground (CU): do they tend to be disease-prone or self-healer types?
• Future trends: How present strategies for dealing with online privacy, security, and trust issues need to be improved

BACKGROUND
As noted, though the words hacker and cracker
are regularly used interchangeably by the media
and the public, these two terms have distinct meanings within the CU (Furnell, 2002; Holt, 2007).
The word hacker typically refers to a person who
enjoys learning the details of computer systems
and how to stretch their capabilities (Furnell,
2002). Crackers tend to be malicious meddlers
trying to discover information by deception or
illegal means, often with the intent to do harm to
another person or to another's property for revenge
or personal gain (Furnell, 2002).
There are also variations of hackers within
the CU, based on their motives and actions while
hacking (Holt, 2007). For example, White Hat
hackers are individuals who use their skills to
benefit or protect computer systems. The term
Black Hat hacker often refers to those hackers
who maliciously damage or harm networks. In
this context, a hack to gain knowledge or serve
as a warning to security personnel in a company
that a computer system is not properly protected
may be defined by members of the CU as good
and positively motivated. However, if the hacking
event occurs because of a perpetrator's need for
revenge, sabotage, blackmail, or greed, this action
may be labeled as wrong, and possibly criminal
in nature (Schell et al., 2002).
While the public and the media suggest that
the bulk of computer crack attacks, or exploits, are completed by sophisticated hackers
with a detailed knowledge of computer systems
and how they work, the reality is somewhat different. Evidence suggests that most perpetrators
who attack networks are younger than age 30,
are bored, and, often, are in need of peer recognition and acceptance (Schell et al., 2002). In
fact, most industrial and government computer
system invaders are not sophisticated hackers;
they are often teenagers out to be challenged and
to be recognized for their exploits by their peers.
The less skilled individuals who engage in these
sorts of attacks are typically referred to as script
kiddies (Furnell, 2002; Schell et al., 2002; Holt,
2007). This term is often used derisively within
the CU, as it recognizes a person's dependence
on pre-made scripts that facilitate hacks. Using
a program to hack suggests that the individual
does not have significant computer knowledge
and is not truly a hacker (Furnell, 2002; Schell
et al., 2002; Holt, 2007).
There are several other terms used to describe
hackers and computer attackers, particularly those
interested in politically-motivated cyber attacks.
For example, the term cyber terrorist refers to
individuals who use hacking techniques to attack
networks, systems, or data under the motivation
of a particular social or political agenda (Furnell,
2002), and hacktivists are those who break into
computer systems to promote an activist agenda,
often defacing Web sites to express an opinion
(Furnell, 2002).
While this list is by no means an exhaustive
one of the different labels applied to those in the
CU or their activities, it nonetheless demonstrates
the remarkable variation present in motives, thinking, and behavior within the CU.

Cyber Crime Defined
The growth and spread of the Internet across the
world over the last two decades has fostered the
growth of a variety of online crimes as well as
laws aimed at curbing these behaviors. In fact,
a unique debate has developed concerning the
definition of online crimes, using both cyber crime
and computer crime. Cyber crimes typically occur
because the individual uses special knowledge
of cyberspace, while computer crimes involve
special knowledge of computer technology. The
interrelated nature of these behaviors complicates
this definition process, and many individuals use
the terms interchangeably. As a consequence, the
term cyber crime will be used in this chapter
to refer to any crime completed either on or with
a computer (Furnell, 2002).
Cyber crime generally includes electronic
commerce (e-commerce) theft, intellectual
property rights (IPR) or copyright infringement,
privacy rights infringement, and identity theft.
Also, cyber crime involves such activities as child
exploitation and pornography; credit card fraud;
cyberstalking; defaming or threatening another
user online; gaining unauthorized access to computer networks; ignoring or abusing copyright,
software licensing, and trademark protection;
overriding encryption to make illegal copies of
software; software piracy; and stealing anothers
identity to conduct criminal acts. Although the parameters constituting these unlawful acts, as well as the penalties corresponding to the infringements, may vary from one jurisdiction
to another worldwide, this list is relevant and
pertinent (Schell & Martin, 2006).
Taken from a broad perspective, cyber crime is
not all that different from the more conventional
real-world crime. In fact, one of the most well
known cyber crime typology classifies behavior
along similar lines of traditional crime, including trespass, theft, obscenity, and violence (Wall,
2001). While this framework identifies multiple
forms of potentially criminal cyber behaviors,
the criminal acts of trespass (defined as entering
unlawfully into an area to commit an offense)
and theft (an act occurring when someone takes,
or exercises illegal control over, the property of
another to deprive that owner of the asset) tend
to garner the lion's share of attention from the
media (Furnell, 2002).
Types of cyber trespass and theft commonly
include but are not limited to (Schell & Martin,
2004):

• Flooding: a form of cyberspace vandalism resulting in denial of service (DoS) to authorized users of a Web site or computer system;
• Virus and worm production and release: a form of cyberspace vandalism causing corruption and, possibly, the erasing of data;
• Spoofing: the cyberspace appropriation of an authentic user's identity by non-authentic users, causing fraud or attempted fraud and commonly known as identity theft;
• Phreaking: a form of cyberspace theft and/or fraud consisting of the use of technology to make free telephone calls; and
• Infringing intellectual property rights (IPR) and copyright: a form of cyberspace theft involving the copying of a target's information or software without getting their consent.

The Four Critical Elements of Cyber Crimes
Harm resulting from cyber crimes, as in conventional crimes, can be to property, to persons,
or to both. As in the conventional world, in the
cyber world, there are politically-motivated cyber
crimes, controversial crimes, and technical non-offenses. For a cyber crime and a conventional
crime to exist in U.S. jurisdictions, four elements
must be present (Brenner, 2001):

• Actus reus (the prohibited act, or failing to act when one is under a duty to do so)
• Mens rea (a culpable mental state)
• Attendant circumstances (the presence of certain necessary conditions)
• Harm (to either persons or property, or both)

Here is an example illustrating the four
elements for a property cyber crime involving
criminal trespass, whereby the perpetrator intends
to steal information from another. A cyber perpetrator gains entry into a computer and unlawfully
takes control of the property, the information of
another user (actus reus). He or she enters with the
intent to commit an offense by law and acts with
the intent of depriving the lawful owner of data
(mens rea). By society's norms, the perpetrator
has no legal right to enter the computer network
(i.e., is not authorized to do so) or to gain control of
the targeted software (attendant circumstances).
Consequently, the cyber perpetrator is liable for
his or her unlawful acts, for he or she unlawfully
entered the computer (that is, criminal trespass)
to commit an offense once access was gained
(i.e., theft). In the end, the targeted user was not
able to access data, resulting in harm to the target
(Schell & Martin, 2004).

The Changing Nature of Cyber Crime and the Need for Emerging Legislation
As the nature of cyber crime has evolved, so have
the legal structures to prosecute and punish these
behaviors. Most newsworthy cyber crime cases
have been prosecuted in the United States under
the computer crime statute 18 U.S.C. subsection
1030. The primary federal statute criminalizing
cracking was originally the Computer Fraud and
Abuse Act (CFAA), which was modified in 1996
by the National Information Infrastructure Protection Act and codified at 18 U.S.C. subsection
1030, "Fraud and Related Activity in Connection with Computers."
If caught in the United States, crackers are
often charged with intentionally causing damage
without authorization to a protected computer. A
first offender typically faces up to 5 years in prison
and fines up to $250,000 per count, or twice the loss
suffered by the targets. The U.S. federal sentencing guidelines for cracking have been expanded in recent years to provide longer sentences, ranging from 20 years behind bars to life sentences, if exploits
lead to injury or death of online citizens (Krebs,
2003). The targets of cyber crimes can also seek
civil penalties (Evans & McKenna, 2000).
It should be noted that while finding a cracker
may not be an easy task for law enforcement because of the rather anonymous nature of the cyber
environment, successfully prosecuting a cracker
can be even tougher, for enforcement ability falls
to any jurisdiction that has suffered the effects of
the crack. If, for example, the corporate targets
of the attack are in the United States, then U.S.
laws would apply. Practically speaking, only
those jurisdictions with the cracker physically
in their locale will be able to enforce their laws.
Therefore, though the United States or Canada
may attempt to apply their country's laws to any
given cracking incident, the perpetrator needs to
be physically in their jurisdiction to enforce the
law (Walton, 2000).
After the September 11, 2001 terrorist attacks
on the World Trade Center, the U.S. government
became increasingly concerned about terrorist
attacks of various natures and homeland security
protection. To this end, the U.S. passed a series of
laws aimed at halting computer criminals, including the 2002 Homeland Security Act, with section
225 known as the Cyber Security Enhancement
Act of 2002. In 2003, the Prosecutorial Remedies
and Tools against the Exploitation of Children
Today Act (PROTECT Act) was passed to assist
law enforcement agents in their efforts to track
and identify pedophiles using the Internet for
child exploitation purposes. Also in 2003, the
CAN-SPAM Act was passed by the United States
Senate, aimed at decreasing the issues raised by
commercial e-mailers and spammers. Its longer
title was the Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003,
a title accurately reflecting its purpose.
Other countries have enacted similar anti-intrusion legislation. For example, section 342.1
of the Canadian Criminal Code is aimed at a
number of potential harms, including theft of
computer services, invasion of privacy, trading
in computer passwords, or cracking encryption
systems. Charges for violations are made according to the sections of the Criminal Code dealing
with theft, fraud, computer abuse, data abuse,
and the interception of communications (Schell
& Martin, 2004). In fact, the Global Cyber Law
Survey of 50 countries around the world found
that 70% of countries had legislation against
unauthorized computer access, as well as data
tampering, sabotage, mal-ware or malicious
software usage, and fraud.

ISSUE AND CONTROVERSY #1: KNOWN DEMOGRAPHIC AND BEHAVIORAL PROFILES OF HACKERS AND CRACKERS: BEHAVIORAL MISFITS OR SEEMINGLY NORMALS?
In light of the significant agreement on the potential harm caused by unauthorized computer
intrusions and malicious cracks, it is necessary to
consider the demographic, behavioral, and social
composition of the CU. In short, who is likely to be
responsible for these exploits, and are they really
all that different from individuals in mainstream
society? While the clothes that hackers wear seem
to have shifted a bit from the 1960s (when long hair
and sandals were the norm) through the present
(where backpacks and black t-shirts are the norm),
hackers and crackers in the experimental phase
still seem to be predominately males under age
30 (Gilboa, 1996; Jordan & Taylor, 1998; Schell
et al., 2002). But why is this?

Theoretical Framework
From its very beginning as an all-male Tech Model Railroad Club in the 1960s at MIT, where the geeks had an insatiable curiosity about how things (and particularly how a slow-moving hunk of metal called the PDP-1) worked, the CU has attracted predominantly men to its fold to this day. Back then, because of the PDP-1's turtle-like
pace, the smarter computer programmers at MIT
created what they called hacks, or programming
shortcuts, to complete their computing tasks more
efficiently. In fact, the club's adoption of the term "hacker" to describe themselves as well as their
acts indicated a creative individual who could
push the envelope around what computers
were designed to do. The club's talented hackers became the seed of MIT's Artificial Intelligence (AI) Lab, the world's prime center of AI research.
In 1969, the AI lab's fame and influence spread fast; that was the year in which the Advanced Research Projects Agency Network, or ARPANET, was formed: the first transcontinental, high-speed computer network, created by the U.S. Defense Department as an experiment in digital communications (Schell et al., 2002).
It is interesting to note that the positive, creative
reputation associated with those in the CU has
over the years taken on a negative connotation.
Since the 1980s, the media seem to have focused
on the darker side, frequently reporting the costs
due to property and personal harm as a result of
computer exploits by those in the CU. Moreover,
this rather less than positive picture has also been
painted by theorists trying to understand this
rather unique population.
One school of thought posited by psychosexual
theorists argues that hacking can be viewed as a
way for young men to fulfill their erotic desires
through electronic means (Taylor, 2003). This
notion is generated, in part, by stereotypical conceptions of hackers as introverted, socially inept,
or awkward males who have difficulty relating to
others (Furnell, 2002; Taylor et al., 2006). Certain
technologically-gifted young males' inability to
connect in meaningful ways with other people,
especially women, these psychosexual theorists
argue, drives them to spend more of their time
with computers and technology (Taylor, 2003).
Through very focused and solitary involvement
with their computers, young men in the CU
become intimately connected with technology
and with hacking. The shouting and swearing in
arcades, the fixation on war and sports games, the
focus on speed, and the presence of primarily men
on software packages and computer games are all
characteristics of this broader-based computer
culture that is stereotypically male (Cohoon &
Aspray, 2006).
These psychosexual theorists further maintain
that the knowledge base young hackers and crackers develop allows them to take out their sexual
frustrations through their computer hardware and
software. The effort taken to bend a computer
system to an individual's will can provide a sense of
physical gratification that directly mirrors physical
masturbation (Keller, 1988; Ullman, 1997). The
destructive or vandalism-based hacks of young
male script kiddies, in particular, are similar to
masturbation in that they usually have no necessary objective or goal aside from the pleasure of
the act itself (Taylor, 2003). These activities may,
however, have little appeal for women, especially
when coupled with the difficulty of communicating with self-focused male hackers (Jordan &
Taylor, 2004).
A second and more inclusive explanation of
the male stereotype of those in the CU relates
to gender role socialization on-line and off-line
(Jordan & Taylor, 2004; Taylor, 2003). Wajcman
(1991) suggests that gender and technology influence and shape one another, such that technology is a source of and consequence of gender
relationships. For example, there are gender
differences in the way that humans interact with
and approach technology (Turkle, 1984). Males
tend to use "hard mastery" techniques involving
a distinct and decisive plan to impose their will

over the device, whereas women tend to practice "soft mastery" techniques based on interaction
with the device and responding to its processes
(Turkle, 1984). Moreover, these theorists argue,
over recent decades, gender role socialization
has resulted in an ongoing interest by males in
technology, to the near exclusion of females.
The creators of early computers, it is posited,
were mostly men who used analytical and rational
thinking to develop technology, especially the
command-line interfaces of the early systems
(Levy, 1984). These traits are in direct contrast to
the prevalent role-specific traits purported to exist in females (nurturing, caring, and protecting).
The gender role socialization theorists argue that
given this societal bias, since the 1960s, young
females have not been steered by male mentors
to become enamored with technology, primarily
because it runs counter to females' supposed
natural personality traits and softer approaches
to technology.

Present-Day Myths and Reality
If asked to describe the demographic, behavioral,
and social composition of those in the CU, most
people in mainstream society would probably
suggest that hackers have a low tolerance for
business suits and business attire, preferring to
wear clothes optimizing comfort, function, and
minimal maintenance. Onlookers would also
likely add that hackers are obviously connected
to their computers, perhaps even addicted to them.
Also, mainstream citizens would likely suggest
that besides their computers, hackers as a group
would seem to like music, chess, war games,
and intellectual games of all kinds. Moreover,
assert onlookers in the mainstream, if hackers
were to engage in sports, they would probably
choose those that are self-competitive rather than
team-oriented, as afforded by the martial arts,
bicycling, auto racing, hiking, rock climbing,
aviation, and juggling. In terms of religion and
self-control, most mainstream onlookers might
suggest that hackers would probably describe
themselves as agnostics, atheists, or followers of
Zen Buddhism or Taoism, and they would probably
tend to avoid substances that would make them
become cognitively stupid (such as alcohol), but
they would tend to ingest high-caffeine drinks
as an aid to staying awake long hours so that
they could hack. And when communicating with
one another online, many mainstream onlookers
would probably add that hackers would tend to use
monikers (like Mafiaboy) rather than their own
names. It is interesting to note that given these
rather superficial attributes commonly assigned
to hackers by mainstream citizens, many of the
hackers, when asked, would tend to concur with
these observations. As a matter of fact, this is a
subset of descriptors for hackers appearing from
1996 through 2002 at the popular hacking Web
site http://project.cyberpunk.ru/links.html.
Aside from these appearance descriptors,
little else but myths about those in the CU existed until just a few years ago. Contrary to some
commonly-held beliefs among the public and the
media that hackers are behavioral misfits with
odd sexual relationships (whereby bi-sexual and
trans-sexual relationships outnumber those in the
adult population), are largely unemployed, and
have strange sleeping patterns (with a propensity
to hack through the night), a comprehensive behavioral and psychological assessment study by
Schell et al. (2002) on 210 hacker attendees of the
Hackers on Planet Earth (HOPE) 2000 gathering
in New York City and of the DefCon gathering
in Las Vegas in July, 2000, found that these three
behavioral myths about those in the CUand
other mythswere generally unfounded.
Of the respondents in this Schell et al. (2002)
hacker study, 91% were males and 9% were females, representative, the authors argued, of the
predominance of males in the CU population. The
mean age of respondents was 25 years, with the
youngest respondent being 14 years and with the
eldest being 61 years.
Regarding sexual relationship propensity, a
significant 79% of the hacker survey respondents
(males and females) claimed to be monogamous
heterosexual, and those approaching or exceeding
age 30 were likely to be gainfully employed, with
the mean salary for males placing at about $57,000
(n = 190) and with the mean salary for females
placing at about $50,000 (n = 18). The largest
reported income was $700,000. So from a sexual
proclivity and employment standpoint, males and
females in the CU seem to be quite similar to those
in mainstream society and are generally sound
contributors to the economic wealth of society.
Moreover, from a financial angle, a t-test on the
findings revealed that female hackers seem to be
as financially rewarded as their male peers.
The hacker study authors also affirmed that
there were some statistically significant findings
of demographic and behavioral mean score differences between the White Hats and the Black
Hats in the study. The White Hats were designated
as those who reported being motivated primarily by achievement or by societal/organizational
gains (such as advancing networks or software
and computer capabilities and hacking to expose
weaknesses in organizations computer systems
or in their products, but for the overall good of
society), whereas the Black Hats were designated
as those who reported being motivated by narcissistic needs such as hacking for personal financial
gain without regard to the personal and property
costs to others or for enhancing their reputation
within the hacker community/world without
regard to the costs to others.
Using these definitions, Schell et al. (2002)
classified 12 of the respondents as Black Hats.
Demographically, these individuals were, on
average, 27 years old (1 year younger, on average, than their White Hat counterparts), male
(but not exclusively), and American. The Black
Hats earned significantly more than their White
Hat counterparts, almost double. In fact, the
Black Hats (male and female) earned, on average,
$98,000 annually, while the White Hats (male

and female) earned, on average, $54,000. Moreover, the Black Hat and White Hat males tended
to work in large companies (with an average of
5,673 employees) and were generally not charged
with hacking-related crimes (but not exclusively),
whereas the females (White Hat and Black Hat)
tended to prefer working in smaller companies
with an average of about 1,400 employees.
Interestingly and consistent with previous
study findings and with the myth about hackers
that they value information and activities that
make them smarter, both the Black Hat and the
White Hat hackers in the Schell et al. (2002) study
tended to be self- and other-taught (like the founding hackers at MIT) and were quite well educated,
with at least a community college education. The
female White Hats and Black Hats, consistent with
the gender role socialization theory, admitted to
generally learning their computer skills later in life
at college or university, largely because they were
not steered in the technology direction by parents,
teachers, or career counselors. Also, consistent
with Meyer's (1989) earlier study suggesting that
neophyte hackers are drawn to computers from
an early age and tinker with them on their own
time, the most frequent response (39%, n = 83) to
the item asking the hackers (male and female) how
they learned their computer skills was that they
were self-taught. The next largest pocket (7% of
the respondents) said that they were self-taught,
completed formal courses and courses on the job,
and learned from friends and relatives.
Regarding the myth in the public domain about
odd sleeping patterns of hackers in the CU, a significant 79% of the hacker respondents (males and
females) said that they sleep sometime during the
night from 12 midnight through 8 A.M., similar
to individuals sleeping patterns in mainstream
culture. Thus, Schell et al. (2002) noted that this
myth about odd sleeping patterns was unfounded
by their study findings.
Regarding prevailing culture within the CU,
Thomas (2002) has suggested, as earlier noted, that
those in the CU operate within a predominately
competitive culture centered on demonstrations of
mastery, wherein talented individuals show their
ability to dominate their social and physical environment using their computers and their brains.
Those in the CU taunt and challenge the ability
of others while online, with the ultimate goal of
gaining complete control of another individual's computer system, often referred to as "taking root" or "0wning a box." Sometimes this taunting can
get out-of-hand, moving into the adverse impact
areas of cyber harassment and cyber stalking.
According to popular myth and consistent with
the assertions of Thomas (2002), the Schell et al.
(2002) study findings revealed that the male and
female hackers adopt online monikers to protect
their anonymity and privacy while online and to
decrease the chances of being cyber harassed.
The majority, 63% of the respondents, said that they typically use a net handle or moniker
to identify themselves online, with only 10% of
the respondents admitting to using their birth name
alone, and with 27% claiming that they use a
combination of their birth names and net handles
while online. Out in the real world, the hacker
respondents affirmed that they tended to use their
birth names only, indicating that when they are
operating in mainstream society, they take on a
rather mainstream persona. Again, a significant
56% of the respondents said that they used their
net handles specifically for hacking activities.
In this 2002 study, as suggested more recently
by Holt (2007), the female hackers confirmed
that they may adopt male monikers or handles to
obfuscate their gender, to increase the likelihood
that they are accepted by others in the CU, and
to reduce the fear that they may be exposed to
high levels of harassment via "flaming" (the situation where individuals direct online obscene or
abusive messages to another online user to upset
that individual and to provoke distress).
Finally, contrary to the stereotypical communication patterns posited by the psychosexual
theorists regarding the stereotypical conceptions
of hackers as introverted, socially inept, or awk-

0

ward males who have difficulty relating to others


(Furnell, 2002; Taylor et al., 2006), the Schell et al.
(2002) study findings paint a somewhat different
picture. While the psychosexual theorists tend to
agree with the prevailing myth about hackers that
they communicate only with their computers and
not with other people and that they are loners, the
just-cited 2002 study found that, as Meyer's earlier
1989 study reported, hackers spend considerable
time during the week communicating with their
colleagues, about 25%. Moreover, while 57%
of the respondents said they like to hack solo,
the remaining 43% of the respondents (male and
female) said that they prefer to collaborate with
others when hacking.
In summary, the literature seems to suggest
that as more becomes known about those in the
CU, the picture that emerges is quite different from
the dark-side palette that prevails in the minds
of the public, the media, and the psychosexual
theorists. More support appears to be accumulating that paints a clearer picture of the positive
sides of those in the hacker community, debunks
a number of demographic and behavioral myths
prevailing about individuals in the CU, and points
to the accuracy of the gender role socialization
theory. Considering the concern of industry and
society about cyber crimes, and the huge costs to society regarding breaches of privacy, security, and trust, are there some psychological indicators that may point to a proclivity to cause harm
to property and persons by those in the computer
underground?

ISSUE AND CONTROVERSY #2: PSYCHOLOGICAL MYTHS AND TRUTHS ABOUT THOSE IN THE COMPUTER UNDERGROUND (CU): DO THEY TEND TO BE DISEASE-PRONE OR SELF-HEALER TYPES?
In light of the significant agreement on the potential harm caused by unauthorized computer intrusions and malicious cracks by law enforcement
agents, the public, industry, and those in the CU, it
is necessary to consider the psychological makeup of those operating daily in the CU. In short,
who is likely to be responsible for these exploits,
and are these hackers really all that different from
individuals in mainstream society?

Theoretical Framework
At the start of this chapter, we addressed a number
of concerns that industry and the general public
have about privacy, security, and trust. However,
in recent months, even those in the CU have
expressed concerns about harmful episodes that
have jeopardized their psychological safety and
could interfere over the longer term with their
personal safety. For example, in March, 2007,
anonymous online death threats were levied
against Kathy Sierra, a popular Web developer
within the information technology community,
author, and blogger who encourages companies
to consider human behavior when designing their
technological products. While many bloggers rallied to her support, online and off-line, a number
of women and men got online to talk about their
incidents of online bullying, harassment, and
stalking (Fost, 2007).
By definition, online bullying entails verbally
abusing targets by threatening to cause harm to
one's reputation; cyber harassment uses cyber
space to harass a targeted individual; and cyber
stalking occurs when individuals repeatedly
deliver online unwanted, threatening, and offensive e-mail or other personal communications
to targeted individuals, including death threats
(Schell & Martin, 2006). All of these threats are
intended to cause psychological damage to others,
and some of these exploits may actually result in
death to the targets.
One of the most well known cyber stalking
cases reported in the popular media involved a
young cracker named Eric Burns (a.k.a. Zyklon).
Eric Burns' claim to fame is that he attacked the

Web pages of about 80 businesses and government offices whose pages were hosted by Laser.Net in Fairfax, Virginia. Burns, a creative individual, designed a program called "Web bandit"
to identify computers on the Internet that were
vulnerable to attack. He then used the vulnerable systems to advertise his proclamations of
love for a young classmate named Crystal. These
computer exploits by Burns became his way of
advertising worldwide his unrelenting love for [or, more accurately, his desire to get the attention of, and then to take control or "take root" of] Crystal. He
hoped that by proclaiming his love in the cyber world, he would get her attention, if
not her long-term commitment. This real-world
case ended with the 19-year-old male pleading
guilty to attacking the Web pages for NATO and
Vice President Al Gore. In November, 1999, the
judge hearing the case ruled that Burns should
serve 15 months in federal prison for his cracking
exploits, pay $36,240 in restitution, and not be
allowed to touch a computer for 3 years after his
release. The irony in this case is that the young
woman named Crystal attended the same high
school as Eric but hardly knew him. In the end,
she assisted the authorities in his capture; however, Eric Burns' role as a cyber stalker did not make the media headlines, just the fact that he cracked Web sites (Reuters, 1999).
Mental health experts who assessed Eric Burns
declared that he likely felt more comfortable communicating online than in person, he had difficulty
overcoming his fear of rejection by people in the
real world, he lacked the social skills to repair
relationships, and he was mocking authority by
saying something like "I can put my favorite girl's name on your Web site for everyone to see, but you can't get me." In short, Eric Burns had a
number of pre-existing mental health issues that
he acted out online (Schell et al., 2002).
The case of Eric Burns is symbolic on a number of planes, including that most of what the
literature reports about the psychological profiles
of those in the CU has been gleaned from legal
documents following convicted crackers' arrests,
both insiders and outsiders. Moreover, Eric Burns
as a cyber stalker is likely not too different from
real-world stalkers who have been charged and
convicted. Stalking experts tend to agree that
those who commit these crimes in the virtual
world are likely not much different psychologically from those who commit such crimes in the
mainstream (Schell & Lanteigne, 2000).
So when these crackers are caught, convicted,
and analyzed by mental health experts, how are
they classified? In general, mental health experts
tend to describe the overall profile of working adults as being either predominantly self-healing in nature (i.e., they recover well from stressful life episodes, and they are generally psychologically and physically well individuals over the longer term) or predominantly disease-prone (i.e.,
they tend not to recover well from stressful life
episodes, they often suffer from bouts of depression in the short-term, and in the longer term,
they often suffer from diseases such as premature
cardiovascular disease or cancer) (Schell, 1997).
As noted, Eric Burns would likely be placed in
the category of disease-proneness.
Consistent with some earlier reported offender profile findings developed on computer
science students and information systems (IS)
employees, Shaw, Post, and Ruby in 1999 said that
insider computer criminals, in particular, tend
to have eight traits that are more disease-prone
than self-healing. In particular, insider crackers
who cause damage to company computers and/or
to individuals within these organizations are: (1)
introverted; (2) they have a history of significant
family problems in early childhood, leaving
them with negative attitudes toward authority;
(3) they have an online computer dependency
that significantly interferes with or replaces direct
social interactions in adulthood; (4) they have an
ethical flexibility that allows them to justify their
violations, even if they get caught; (5) they have
a stronger loyalty to their computer specialty
than to their employers; (6) they have a sense of
entitlement, thinking that they are special and
are thus owed the corresponding recognition; (7)
they tend to have a lack of empathy, preferring to
disregard the adverse impact of their actions on
others; and (8) because of their introverted natures, they are less likely to seek direct assistance
from their supervisors or from their company's
employee assistance program (EAP) when they
are distressed.
Earlier, in 1997, Ehud Avner constructed what he called a "personality analysis," which he completed in several countries on mal-inclined information systems employees. His results found that the prototypical insider IT criminal rarely gets caught,
because he or she tends to be a manager or a high-ranking clerk without a criminal record, he or she
commits the crack in the course of normal and
legal system operations, and he or she seems on
the surface to be a bright, thorough, highly-motivated, diligent, and trustworthy employee, until
caught. When caught, he or she tends to say that
he or she did not intend to hurt anyone, that banks
steal more than he or she did, or that he or she only
tried to prove to their employers that it is possible
to crack the vulnerable system. It is important to
remember that the probable reason for their not
getting caught earlier is that these cyber criminals
frequently have a considerable number of White
Hat traits, as well as Black Hat ones, which generally keep them out of investigators' eyes
(Schell et al., 2002).
Considering this growth in knowledge about
insider and outsider convicted crackers, noted
Schell et al. in 2002, what was still sorely missing from the literature on those in the CU as of
2000 was a psychological profile of the White Hat
hackers in the CU. What were they like psychologically, and did they differ substantially from
the Black Hats or those charged and convicted
of cyber crimes?

Present-Day Myths and Reality
While a number of researchers have increasingly
reported that disease-prone types and self-healing
types likely exist in the CU, few have been able
to describe the relative proportion of each. Accepting this observation, researchers have become
curious about the supposed lack of bias and fair
play in the CU, as in recent times, anecdotal
evidence (such as that of Sierra in March, 2007)
seems to indicate that some degree of rejection
or adverse attention-seeking by online peers is a
real concern and needs to be addressed as a major
online privacy, security, and trust issue within the
CU (Gilboa, 1996; Holt, 2006; Jordan & Taylor,
1998; Sagan, 2000; Schell, 2007).
As noted in the demographic section, the media
have focused on the disease-prone side of those in
the CU. Myths surfacing in the headlines since the
1980s, some founded and some not, include the
notion that hackers seem to have troubled childhoods, marked by a history of alcoholic parents,
abandonment by one or more parents, and parental
discord (Shaw et al., 1999), and that this could
be a primary reason for their apparent need to
rebel against authority figures. The Canadian
cracker known as Mafiaboy, who in 2000 raised
concerns when he cracked Internet servers and
used them as launching pads for denial of service
(DoS) attacks on the Web sites of Amazon, eBay,
and Yahoo, is a poster boy for a cracker with a
troubled childhood.
Another prevailing myth about hackers is
that because they have the predisposition and the capability to multi-task (Meyer, 1989) and appear to be computer-addicted (Young, 1996), they
are likely stressed-out in the short-term and are
likely to be cardiovascular-prone at early ages (i.e.,
disease-prone Type A) over the longer term. Thus, argue those who cite this myth, hackers in the CU should be labeled as predominantly disease-prone.
Finally, if there is a bright light to shine in
terms of prevailing hacker myths, it is that hackers are generally perceived by the public and
the media to be creative individuals, a definite
self-healing trait.
As noted, the Schell et al. 2002 study presents
the most comprehensive picture, to date, of those
inhabiting the CU, including their psychological
make-up.
Regarding the first myth about childhood
trauma for those in the CU, while 28% of the hacker
convention respondents in this 2002 study said
that they had experienced childhood trauma or
significant personal losses, the majority of hacker
respondents did not make such claims. However,
supportive of this myth, of those hackers who
reported having a troubled childhood, the majority, 61%, said that they knew that these events
had a long-term adverse impact on their thoughts
and behaviors. Moreover, a t-test analysis on the
findings revealed that female hackers were more
likely to admit experiencing childhood trauma
than their male counterparts, but there was no
significant difference in the reporting of childhood trauma for those charged and not charged with crimes, or for those hackers under age 30 and
those over age 30 (Schell et al., 2002).
Though many mental health experts would
seem to suggest those in the CU are predominantly
task-obsessed and perfectionist Type A individuals who are cardiovascular self-destructors,
Schell et al. (2002) reported that those individuals
attending hacker conventions actually tend to be more moderate, self-healing Type B individuals in nature.
Moreover, a current study being undertaken by
the authors of this chapter seems to indicate that
those attending hacker conventions may actually
possess relatively high degrees of Asperger's Syndrome, or the "Nerd Syndrome," and that this
constellation of traits may, in fact, protect hackers
in the CU from becoming too stressed-out by the
highly competitive and, at times, very aggressive
nature of this virtual environment (Fitzgerald &
Corvin, 2001). The ability to screen out distractions is a geeky trait that can be extremely useful
to computer programmers, in particular, and many
of these traits in milder forms seem to be descriptive of computer hackers (Nash, 2002).
Regarding the third bright light myth, Schell
et al. (2002) corroborated with their study findings
that those in the CU are creative individuals who
report using predominantly complex analytical
and conceptual styles of problem-solving and
decision-making. These researchers reported
that on the 20-item Creative Personality Test
of Dubrin, for example, the majority of hacker
respondents, 62%, had scores meeting or
exceeding the critical creative level score of
15, thus supporting this myth. Further analysis
revealed no significant differences in the creativity mean scores for the males and the females,
for those charged and not charged with crimes in
the CU, and for those under age 30 and over age
30. Moreover, using the 20-item Decision Style
Inventory III of Rowe and colleagues, these
researchers found that the highest mean scores
for decision-making and problem-solving styles
of hackers placed in the analytic and conceptual
styles, supporting the myth that those in the CU
are cognitively complex and creative in their
thinking predispositions.
In closing, in an effort to try to determine who
might have high degrees of disease-proneness and
the propensity to cause harm to persons and to
property in the CU, Schell et al. (2002) completed
a number of analyses on the Black Hat segment
that admitted to being motivated to take revenge
on targets. The traits for these individuals were
that they tended to be under age 30, they had less
formal education than their White Hat peers, they
reported significantly higher hostility and anger
stress symptoms in the short term than their White
Hat peers, they reported higher depression stress
symptoms and more longer-lasting (and addictive)
hacking sessions than their comparative group,
and they had higher psychopathic and highly
narcissistic personality predispositions than their
self-healing counterparts. Thus, the researchers concluded that the group of hackers in the
CU most at risk for committing self- and other-destructive acts appeared to be under age 30,
narcissistic, angry, obsessive individuals suffering from repeat bouts of depression. The authors
suggested that the percentage of high-risk hackers
ready to cause harm to persons, in particular, may
be as high as 7%.

FUTURE TRENDS: HOW PRESENT STRATEGIES FOR DEALING WITH ONLINE PRIVACY, SECURITY, AND TRUST ISSUES NEED TO BE IMPROVED
This chapter has discussed, at length, how
evolving cyber crime legislation and an increased
knowledge of the demographic, psychological, and
social/behavioral propensities of those in the CU
may, combined, lead to better methods of not only
curbing cyber crime but of better understanding
who may commit it, how, and why.
However, other solutions to deal with the privacy, security, and trust issues in cyber space have also been developed and need to be discussed.

Solutions for Online Privacy
Recent public surveys have shown that a number
of consumers are still afraid to buy goods and services online, because they fear that their personal
information (particularly credit card and social
security numbers) will be used by someone else.
Moreover, despite assurances from credit card
companies that they will not hold consumers accountable for any false charges, in recent times
trust seals and increased government regulation
have become two main ways of promoting improved privacy disclosures on the Internet.
Trust seals nowadays appear on e-business
Web sites, including green TRUSTe images, the
BBBOnLine (Better Business Bureau OnLine)
padlocks, and a host of other privacy and security
seals. In fact, some companies are paying up to
$13,000 annually to display these logos on their
Web sites in the hopes of having consumers relate
positively to their efforts to provide online privacy.
In fact, almost half of the Fortune 100 companies
display such logos, and of the fourteen information
technology Web sites in the Fortune 100 companies, 10 have such seals (Cline, 2003).
The question, however, remains about whether
trust seals really work. Although they are intended
to advance privacy for consumers through legislation, primarily through business self-regulation, critics often suggest that these trust seals
are more of a privacy advocate for corporations
than for consumers. But those supporting the
usefulness of trust seals note that if companies
display them on their Web sites, they need to
follow the trust standards, provide online clients
with a means of opting out of direct marketing
and having their personal information sold to
third parties, and give consumers a way to access
the companys information and file complaints.
Also on the positive side, ScanAlert, an emerging
security seal provider, argues that Internet sales
are reported to increase by 10% to 30% if the
trust seals are displayed (Cline, 2003).
Based on consumers' submissions, a study conducted by Flinn and Lumsden (2005) indicated
that 42% of the consumers surveyed in their study
said that they were more likely to trust a Web site
that displays a trust mark than those not having
the display, and 49% of the respondents said that
they are likely to trust a Web site only if they are
able to recognize the trust mark program.
Furthermore, while government regulations
in North America are increasing to advance the
privacy of citizens, by passing laws like the
Canadian Personal Information Protection and
Electronic Documents Act (PIPEDA) and the U.S.
Health Insurance Portability and Accountability
Act (HIPAA) of 1996, citizens are commonly
uneasy with such pieces of legislation, for consumers dislike having their telecommunications
traffic monitored by government agents. As a
result of these privacy concerns, the field of information ethics (IE) has emerged to deal with
issues arising from connecting technology with
concepts such as privacy, intellectual property
rights (IPR), information access, and intellectual
freedom. Although IE issues have been raised as
early as 1980, nowadays the field is more complex, spurred on by the concerns of a number of
academic disciplines regarding Internet abuses.
According to IE, information itself, in some form
or role, is recognized to have intrinsic moral value.
Thus, theoreticians have formulated a number of
complex mathematical solutions for providing
better information protection over the Internet.
This is a field that is sure to bring more comprehensive solutions to online privacy concerns in
future years.

Solutions for Online Security

Generally, businesses and government agencies take two kinds of approaches to prevent security breaches: proactive approaches, such as preventing crackers from launching attacks in the first place (typically through various cryptographic techniques), and reactive approaches, which detect security threats after the fact and apply appropriate fixes. The two combined allow for comprehensive network solutions. In technical circles, securing Web sites generally refers to the use of SSL (secure sockets layer) technology for encrypting and authenticating HTTP (hypertext transfer protocol) connections (Flinn & Lumsden, 2005).
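As a concrete illustration of that last point, the minimal sketch below (not taken from the chapter; the host name is only a placeholder) shows how a client wraps a plain TCP connection in SSL/TLS so that the subsequent HTTP exchange is encrypted and the server is authenticated by its certificate.

import socket
import ssl

hostname = "www.example.com"                      # placeholder merchant site
context = ssl.create_default_context()            # loads the trusted CA certificates

with socket.create_connection((hostname, 443)) as raw_sock:
    # The TLS handshake authenticates the server via its certificate and
    # negotiates keys used to encrypt everything sent afterwards.
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        print("negotiated protocol:", tls_sock.version())
        print("server certificate subject:", tls_sock.getpeercert()["subject"])
        request = "HEAD / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % hostname
        tls_sock.sendall(request.encode())         # travels encrypted over the wire
        print(tls_sock.recv(200).decode(errors="replace"))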
Moreover, because network security is a chain, it is only as secure as its weakest link. Although enhanced network security features are desirable, they cost money, a key consideration for companies and institutions, especially smaller ones that are often reluctant to apply an enhanced security solution because of prohibitive costs. These costs are associated with, for example, additional overhead (such as increased bandwidth), increased complexity (requiring specialized security personnel), and information processing delays, which can, in turn, degrade network performance (Grami & Schell, 2004). Thus, if smaller companies in the chain cannot afford appropriate security features, the whole chain is vulnerable to attack.
Also, industry has come forward in recent years with some innovative commercial tools to assist network administrators in preventing network intrusions. One such tool, designed by Dan Farmer and Wietse Venema in 1995, is known as SATAN (security administrator tool for analyzing networks). This tool works by procuring as much data as possible about system and network services and, upon discovering vulnerabilities, it gives rather limited data to network administrators to assist them in fixing the problem. Its successor, SAINT, is also on the market to assist in this regard.
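The minimal sketch below (an illustration of the general idea, not SATAN or SAINT themselves) shows the kind of check such scanners automate: probing a host for listening TCP services so an administrator can spot unexpected ones. It should only be run against hosts one administers.

import socket

def open_ports(host, ports, timeout=0.5):
    # Return the subset of ports on which the host accepts TCP connections.
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                found.append(port)
    return found

print(open_ports("127.0.0.1", [22, 25, 80, 443, 3306]))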
Other improvements are on the way to help improve online security. From a standards perspective, the emerging IEEE 802.11i standard will improve wireless security in particular and turn wireless networking into a more trusted medium for all users, including by preventing the denial-of-service (DoS) problems caused when an entire network is jammed. (The attack could be against the client's wireless device or against the network's access point.) Jamming has, to date, been difficult to stop, largely because most wireless local area networking technologies use unlicensed frequencies and are subject to interference from a variety of sources (Grami & Schell, 2004).

Solutions for Online Trust

As noted earlier in this chapter, a major barrier to the success of online commerce has been a fundamental lack of faith between business and consumer partners. This lack of trust on the consumers' side is largely caused by their having to provide detailed personal and confidential information to companies on request. Consumers also fear that their credit card numbers could be used for purposes other than those for which permission was given. From the business partner's vantage point, the company cannot be sure that the credit card number the consumer gives is genuine, is in good credit standing, and actually belongs to the consumer trying to complete the transaction.
In short, communicating with unknowns
through the Internet elicits two crucial sets of
questions that require reflection: One, what is
the real identity of other person(s) on the Internet
and can their identities be authenticated? Two,
how reliable are other persons on the Internet,
and is it safe to interact with them? (Jonczy &
Haenni, 2005).
Consumer and business trust online is based on such authentication issues, and in recent years a number of products have been developed to assist in the authentication process, including the following (Schell, 2007):

Biometrics, which assess users' signatures, facial features, and other biological identifiers;
Smart cards, which have microprocessor chips that run cryptographic algorithms and store a private key;
Digital certificates, which contain public or private keys, the values needed to encrypt or decrypt a message; and
SecurID, a commercial product that uses a key and the current time to generate a random number stream verifiable by a server, thus ensuring that a potential user enters a verifiable number from the card within a set amount of time (typically 5 or 10 seconds); a generic sketch of this idea follows the list.
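The minimal sketch below illustrates the general idea of such time-based tokens with a generic one-time password derived from a shared secret and the current time window; it is not RSA's proprietary SecurID algorithm, and the shared secret is only a placeholder.

import hashlib
import hmac
import struct
import time

def time_based_code(secret, at, step=60, digits=6):
    # Derive a short numeric code from the shared secret and the current time window.
    counter = int(at // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # dynamic truncation, as in RFC 4226
    value = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % (10 ** digits)).zfill(digits)

shared_secret = b"demo-shared-secret"                    # placeholder provisioning value
now = time.time()
code = time_based_code(shared_secret, now)
print("code shown on the token:", code)

# The server, holding the same secret, recomputes the code for the same window;
# a code from an old window would no longer match.
assert time_based_code(shared_secret, now) == code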

Experts agree that trusted authentication management in a distributed network like the Internet (the import of the second question) is not easy, for under a decentralized authority every user is also a potential issuer of credentials. Furthermore, a given set of credentials, perhaps issued by many different users, forms a credential network. Therefore, a "web of trust" model and solution was introduced by Pretty Good Privacy (PGP), a popular application for e-mail authentication. Without getting too technical, PGP organizes public keys and their corresponding certificates in local key rings. The owner of the key ring obtains a web of trust by assigning trust values to issuers. This web of trust then acts as the basis for a qualitative evaluation of the authenticity of the public keys involved. In PGP, the evaluation of a web of trust is founded on three rules and produces one of two outcomes: valid or invalid (Jonczy & Haenni, 2005).
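A toy sketch of the web-of-trust idea follows (it is not PGP's actual three-rule evaluation; the names, trust values, and threshold are invented for illustration): the key ring owner assigns trust values to issuers, and a public key is treated as valid once the accumulated trust of its certifiers reaches a threshold.

issuer_trust = {            # trust the key ring owner assigns to certificate issuers (0.0-1.0)
    "alice": 1.0,           # fully trusted introducer
    "bob": 0.5,             # marginally trusted
    "mallory": 0.0,         # untrusted
}

certifications = {          # which issuers have signed which public keys
    "carol_key": {"alice"},
    "dave_key": {"bob", "mallory"},
    "eve_key": {"mallory"},
}

def key_is_valid(key_id, threshold=1.0):
    # Qualitative evaluation: sum the trust of the key's certifiers.
    score = sum(issuer_trust.get(issuer, 0.0) for issuer in certifications.get(key_id, set()))
    return score >= threshold

for key in sorted(certifications):
    print(key, "->", "valid" if key_is_valid(key) else "invalid")
# carol_key is valid; dave_key and eve_key fall below the threshold and are invalid.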
Unfortunately, to date, authentication systems like PGP have failed to gain wide acceptance and to solve real-world trust problems such as spam, primarily because they suffer from a number of deployment and usability issues, as well as trust management issues. For example, in "web of trust"-style systems, Internet users must validate keys out-of-band, which is a laborious task. Better solutions need to be developed that minimize these shortcomings.
Finally, it is important to recognize that in recent years there has been some effort by experts to set standards and indicators for capturing, in a more systematic and coordinated fashion, the trustworthiness state of a particular information technology infrastructure, including the Internet. Such indicators would reflect the assurance that the IT infrastructure can reliably transfer information (including security, quality of service, and availability of service), thus increasing consumers' trust in the network. These indicators could then be used to identify areas of the infrastructure requiring attention and be utilized by an IT organization to assess the return on investment (ROI) of improved IT infrastructure equipment purchases. Unfortunately, despite the existing work in progress, there is still no standard or widely accepted method of assessing the assurance levels associated with IT infrastructures, including end-hosts, servers, applications, routers, firewalls, and the network permitting these subsystems to communicate. Clearly, this is an area on which academics and security experts need to focus to find more effective trust solutions (Schell, 2007; Seddigh, Pieda, Matrawy, Nandy, Lombadaris, & Hatfield, 2004).

Closing

This chapter has discussed a number of privacy, security, and trust issues affecting online consumers. Clearly, understanding these issues and finding solutions for them, whether legislative, technological, sociological, or psychological, is a complex task requiring experts in multiple fields, including law, information technology, security, business, and the social sciences. Likely the only way forward in finding more comprehensive solutions is to adopt a team approach that yields hybrid solutions outside any one silo of expertise.

References
Associated Press. (2005). Business schools: Harvard to bar 119 applicants who hacked admissions
site. The Globe and Mail, March 9, B12.
Brenner, S. (2001). Is there such a thing as virtual crime? Retrieved February 1, 2006, from http://www.crime-research.org/library/Susan.htm
Cincu, J., & Richardson, R. (2006). Virus attacks named leading culprit of financial loss by U.S. companies in 2006 CSI/FBI computer crime and security survey. Retrieved July 13, 2006, from http://www.gocsi.com/press/20060712.jhtml
Cline, J. (2003). The ROI of privacy seals. Retrieved October 12, 2007, from http://www.computerworld.com/developmentopics/websitemgmt/story/0,10801,81633,00.html
Cohoon, J. M., & Aspray, W. (2006). A critical review of the research on women's participation in postsecondary computing education. In J. M. Cohoon & W. Aspray (Eds.), Women and information technology: Research on underrepresentation (pp. 137-180). Cambridge, MA: MIT Press.
Evans, M., & McKenna, B. (2000). Dragnet targets
Internet vandals. The Globe and Mail, February
10, A1, A10.

Fitzgerald, M., & Corvin, A. (2001). Diagnosis and differential diagnosis of Asperger syndrome. Advances in Psychiatric Treatment, 7(4), 310-318.
Flinn, S., & Lumsden, J. (2005). User perceptions of privacy and security on the web. Retrieved October 12, 2007, from http://www.lib.unb.ca/Texts/PST/2005/pdf/flinn.pdf
Jonczy, J., & Haenni, R. (2005). Credential networks: A general model for distributed trust and authenticity management. Retrieved October 10, 2007, from http://www.lib.unb.ca/Texts/PST/2005/pdf/jonczy.pdf
Jordan, T., & Taylor, P. (1998). A sociology of hackers. The Sociological Review, 46(4), 757-780.

Fost, D. (2007). The technology chronicles: The attack on Kathy Sierra. Retrieved March 27, 2007, from http://www.sfgate.com/cgi-bin/blogs/sfgate/detail?blogid=19&century_id=14783
Keller, L. S. (1988, July). Machismo and the hacker mentality: Some personal observations and speculations. Paper presented to the WiC (Women in Computing) Conference.
Furnell, S. (2002). Cybercrime: Vandalizing the information society. Boston, MA: Addison-Wesley.
Krebs, B. (2003). Hackers to face tougher sentences. Washington Post. Retrieved February 4, 2004, from http://www.washingtonpost.com/ac2/wp-dyn/A35261-2003Oct2?language=printer
Gilboa, W. (1996). Elites, lamers, narcs, and whores: Exploring the computer underground. In L. Cherny & E. R. Weise (Eds.), Wired women: Gender and new realities in cyberspace (pp. 98-113). Seattle, WA: Seal Press.
Gordon, L. A., Loeb, M. P., Lucyshyn, W., &
Richardson, R. (2007). 2006 CSI/FBI computer
crime survey. Retrieved March 3, 2007, from
http://www.GoCSI.com
Grami, A., & Schell, B. (2004). Future trends
in mobile commerce: Service offerings, technological advances, and security challenges.
Retrieved October 13, 2004, from http://dev.hil.
unb.ca/Texts/PST/pdf/grami.pdf
Holt, T.J. (2006, June). Gender and hacking. Paper
presented at the CarolinaCon 06 Convention,
Raleigh, NC.
Holt, T. J. (2007). Subcultural evolution? Examining the influence of on- and off-line experiences on deviant subcultures. Deviant Behavior, 28(2), 171-198.
IBM Research. (2006). Global security analysis
lab: Fact sheet. IBM Research. Retrieved January
16, 2006, from http://domino.research.ibm.com/
comm/pr.nsf/pages/rsc.gsal.html


Levy, S. (1984). Hackers: Heroes of the computer revolution. New York: Dell.
Meinel, C. (2006). Appendix A: How do hackers break into computers? In B. Schell & C. Martin (Eds.), Webster's new world hacker dictionary (pp. 373-380). Indianapolis, IN: Wiley Publishing, Inc.
Meyer, G. R. (1989). The social organization of
the computer underworld. Retrieved October
13, 2007, from http://bak.spc.org/dms/archive/
hackorg.html
Nash, J. M. (2002). The geek syndrome. Retrieved
July 5, 2007, from http://www.time.com/time/covers/1101020506/scaspergers.html
Rescorla, E. (2005). Is finding security holes a good idea? IEEE Security and Privacy, 3(1), 14-19.
Reuters. (1999). Teen pleads guilty to government hacks. Retrieved October 13, 2007, from http://scout.wisc.edu/Projects/PastProjects/netnews/99-11/99-11-23/0007.html
Richardson, R. (2007). 2007 CSI Computer Crime
and Security Survey. Retrieved October 12, 2007,
from http://www.GoCSI.com

Sagan, S. (2000). Hacker women are few but strong. Retrieved August 5, 2004, from http://abcnews.go.com/sections/tech/DailyNews/hackerwomen000602.html#top
Schell, B.H. (2007). Contemporary world issues:
The internet and society. Santa Barbara, CA:
ABC-CLIO.
Schell, B. H. (1997). A self-diagnostic approach to understanding organizational and personal stressors: The C-O-P-E model for stress reduction. Westport, CT: Quorum Books.
Schell, B. H., Dodge, J. L., & Moutsatos, S. S. (2002). The hacking of America: Who's doing it, why, and how. Westport, CT: Quorum Books.
Schell, B. H., & Lanteigne, N. (2000). Stalking,
harassment, and murder in the workplace: Guidelines for protection and prevention. Westport,
CT: Quorum Books.
Schell, B. H., & Martin, C. (2006). Webster's new world hacker dictionary. Indianapolis, IN: Wiley Publishing, Inc.
Schell, B. H., & Martin, C. (2004). Contemporary world issues: Cybercrime. Santa Barbara, CA: ABC-CLIO.
Seddigh, N., Pieda, P., Matrawy, A., Nandy, B.,
Lombadaris, J., & Hatfield, A. (2004). Current
trends and advances in information assurance
metrics. Retrieved October 10, 2007, from http://
dev.hil.unb.ca/Texts/PST/pdf/seddigh.pdf
Shaw, E., Post, J., & Ruby, K. (1999). Inside
the mind of the insider. Security Management,
43(12), 34-44.
Taylor, P. A. (2003). Maestros or misogynists?
Gender and the social construction of hacking. In
Y. Jewkes (Ed.), Dot.cons: Crime, deviance and
identity on the Internet (pp. 125-145). Portland,
OR: Willan Publishing.

Taylor, R. W., Caeti, T. J., Loper, D. K., Fritsch, E. J., & Liederback, J. (2006). Digital crime and digital terrorism. Upper Saddle River, NJ: Pearson Prentice Hall.
Thomas, D. (2002). Hacker culture. Minneapolis,
MN: University of Minnesota Press.
Turkle, S. (1984). The second self: Computers and the human spirit. New York: Simon and Schuster.
Ullman, E. (1997). Close to the machine: Technophilia and its discontents. San Francisco: City
Lights Books.
Wajcman, J. (1991). Feminism confronts technology. University Park, PA: Pennsylvania State
University Press.
Wall, D. S. (2001). Cybercrimes and the Internet.
In D.S. Wall (Ed.), Crime and the Internet (pp.
1-17). New York: Routledge.
Walton, D. (2000). Hackers tough to prosecute, FBI
says. The Globe and Mail, February 10, B5.
Young, K. S. (1996). Psychology of computer use:
XL. Addictive use of the Internet: A case that
breaks the stereotype. Psychological Reports,
79(3), 899-902.

Additional Readings
The Cybercrime Blackmarket. (2007). Retrieved
July 11, 2007, from http://www.symantec.com/avcenter/cybercrime/index_page5.html
Florio, E. (2005). When malware meets rootkits. Retrieved July 11, 2007, from http://www.symantec.com/avcenter/reference/when.malware.meets.rootkits.pdf
Grabowski, P., & Smith, R. (2001). Telecommunications fraud in the digital age: The convergence
of technologies. In D. S. Wall (Ed.), Crime and the
Internet (pp. 29-43). New York: Routledge.


Herring, S. C. (2004). Computer-mediated discourse analysis: An approach to researching online behavior. In S. A. Barab, R. Kling, & J. H. Gray (Eds.), Designing for virtual communities in the service of learning (pp. 338-376). New York: Cambridge University Press.
Holt, T. J. (2003). Examining a transnational problem: An analysis of computer crime victimization
in eight countries from 1999 to 2001. International
Journal of Comparative and Applied Criminal
Justice, 27(2), 199-220.
Holt, T. J., & Graves, D. C. (2007). A qualitative
analysis of advanced fee fraud schemes. The
International Journal of Cyber-Criminology,
1(1), 137-154.
The Honeynet Project. (2001). Know your enemy:
Learning about security threats. Boston, MA:
Addison-Wesley.
James, L. (2006). Trojans & botnets & malware, oh my! Presentation at ShmooCon 2006. Retrieved July 11, 2007, from http://www.shmoocon.org/2006/presentations.html
Kleen, L. J. (2001). Malicious hackers: A framework for analysis and case study. Master's thesis, Air Force Institute of Technology. Retrieved January 3, 2004, from http://www.iwar.org.uk/iwar/resources/usaf/maxwell/students/2001/afitgor-ens-01m-09.pdf
Littman, J. (1997). The watchman: The twisted
life and crimes of serial hacker Kevin Poulsen.
New York: Little Brown.
Mann, D., & Sutton, M. (1998). Netcrime: More
change in the organization of thieving. British
Journal of Criminology, 38(2), 201-229.
Noblett, M. G., Pollitt, M. M., & Presley, L. A.
(2000). Recovering and examining computer
forensic evidence. Forensic Science Communications 2(4). Retrieved February 4, 2005, from
http://www.fbi.gov/hq/lab/fsc/backissu/oct2000/
computer.htm



Norman, P. (2001). Policing high tech crime within the global context: The role of transnational policy networks. In D. S. Wall (Ed.), Crime and the Internet (pp. 184-194). New York: Routledge.
Ollmann, G. (2004). The phishing guide: understanding and preventing phishing attacks.
Retrieved July 11, 2007, from http://www.ngssoftware.com/papers/NISRWP-Phishing.pdf
Parizo, E. B. (2005). Busted: The inside story of operation firewall. Retrieved July 9, 2007, from http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1146949,00.html
Savona, E. U., & Mignone, M. (2004). The fox and the hunters: How IC technologies change the crime race. European Journal on Criminal Policy and Research, 10(1), 3-26.
Schell, B. H. (2007). Contemporary world issues: The internet and society. Santa Barbara,
CA: ABC-CLIO.
Sterling, B. (1992). The hacker crackdown: Law
and disorder on the electronic frontier. New
York: Bantam.
Taylor, P. A. (1999). Hackers: Crime in the digital
sublime. London: Routledge.
Taylor, P. A. (2003). Maestros or misogynists?
Gender and the social construction of hacking. In
Y. Jewkes (Ed.), Dot.cons: Crime, deviance and
identity on the Internet (pp. 125-145). Portland,
OR: Willan Publishing.
Taylor, R. W., Caeti, T. J., Loper, D. K., Fritsch,
E. J., & Liederback, J. (2006). Digital crime and
digital terrorism. Upper Saddle River, NJ: Pearson
Prentice Hall.
Thomas, D., & Loader, B. D. (2000). Introduction: Cybercrime: Law enforcement, security, and surveillance in the information age. In D. Thomas & B. D. Loader (Eds.), Cybercrime: Law enforcement, security and surveillance in the information age (pp. 1-14). New York: Routledge.


Thomas, R., & Martin, J. (2006). The underground economy: Priceless. Login, 31(6), 7-6.

Wuest, C. (2005). Phishing in the middle of the stream: Today's threats to on-line banking. Retrieved July 11, 2007, from http://www.symantec.com/avcenter/reference/phishing.in.the.middle.of.the.stream.pdf

Wall, D. S. (1999). Cybercrimes: New wine, no bottles? In P. Davies, V. Jupp, & P. Francis (Eds.), Invisible crimes. London: Macmillan.





Chapter X
Privacy or Performance Matters on the Internet:
Revisiting Privacy Toward a Situational Paradigm

Chiung-wen (Julia) Hsu
National Cheng Chi University, Taiwan

Abstract

This chapter introduces a situational paradigm as a means of studying online privacy. It argues that data subjects are not always adversarial to data users; they judge contexts before disclosing information. The chapter supports this argument by examining online privacy concerns and practices within two contexts: technology platforms and users' motivations. It explores the gratifications of online photo album users in Taiwan and finds a distinctive "staging" phenomenon under the theory of uses and gratifications and an a priori theoretical framework, the spectacle/performance paradigm. Users with diffused-audience gratifications are less concerned about privacy but do not disclose more of their information. Furthermore, the chapter finds that users act differently on different platforms, implying that studying the Internet as a whole is problematic. The author proposes that studying online privacy through a situational paradigm will lead to better research designs for studying privacy and assist in understanding users' behaviors across technology platforms.

Introduction

The common assumptions of the online privacy concerns literature are that net users who have higher privacy concerns disclose less information and that data subjects are always adversarial to data users. Researchers holding these assumptions ignore online environments, treat privacy concerns as privacy practices, and follow the off-line literature in studying what kinds of persons (demographic variables) are more concerned about their privacy. This is called the adversarial paradigm, which does not take social contexts into account (Hine & Eve, 1998; Hsu, 2006; Raab & Bennett, 1998).
What goes wrong for online privacy research under an adversarial paradigm? Researchers fail to explain why users who claim higher privacy concerns still disclose sensitive information, and they fail to verify claims that particular demographic groups are more concerned about privacy, which is not always the case across studies. Researchers instead have to identify the social contexts that are essential to users' privacy concerns and practices on the Internet and to study what makes users disclose more of their information, the so-called situational paradigm (Hsu, 2006; Raab & Bennett, 1998).
In this study, the author seeks further evidence for the main argument of the situational paradigm, whose underlying assumption is human relativism, a new approach for examining online privacy, especially for newly emerging phenomena. What are the newly emerging phenomena on the Internet? The author takes online photo album Web sites as an example. Online photo album Web sites were originally started for sharing digital memories with friends and relatives, a trend encouraged by commercial online photo album Web sites that provide free or fee-based space. In Taiwan, online photo albums (usually with additional blog functions) are also popular among Internet users.
As a communication scholar, the author holds that communication is a post-discipline in which the rigid walls of disciplinarity are replaced with bridges (Streeter, 1995). Online privacy is such an interdisciplinary topic, one to which communication can contribute alongside other fields. Given that the Internet is a mass medium (Morris & Ogan, 1996), the author assumes that uses and gratifications theory may pave the way for a situational paradigm in online privacy research. The approach suggests that media use is motivated by needs and goals defined by audiences themselves and that active participation in the communication process may assist, limit, or influence the gratifications associated with exposure. Different goals thus lead to diverse participation and gratifications. Current online privacy research seldom takes users' motivations for their Internet behaviors into account. How do these different uses, motivations, and gratifications influence online privacy concerns and privacy practices? This is a necessary subject to investigate.
In addition to normal usage of online photo albums, there is a distinct "staging" or "performing" phenomenon in Taiwan. For example, I-Ren Wang, a now-famous celebrity, was recruited as an anchor by TVBS, a cable news network, because of her incredible popularity on the largest online photo album site, Wretch (Jhu & Yung, 2006). Other girls, such as "Cutiecherry" and "Fing," were invited to participate in noted TV programs and turned into commercial stars.
The majority of Internet users who have not become celebrities may still enjoy having a reputation among users, getting onto a popular list, or being discussed on online chat systems and BBSs. Online photo album Web sites thus seem to have developed into a stage for those who want to become stars and celebrities. This implies that the motivations of some online photo album users (a new media-use context) are quite different from those of the net users in previous studies.
Online photo album users are more like diffused audiences, a concept from the spectacle/performance paradigm (SPP) (Abercrombie & Longhurst, 1998). Adapting the diffused audience cycle to the online photo album context, some users are drenched in mediascapes, integrate what they learn from the mediascapes into everyday life, and perform it for their own visibility. Others are drenched in mediascapes that facilitate discussion and help them attach themselves to particular idols. Whatever purpose users hold, after they achieve a certain level of narcissism they turn their attention to further media drenching and performance.
Performing or staging on the Internet means that users usually disclose a large amount of their personal information. Are those users aware of possibly losing privacy and becoming transparent on the Internet? Are staging users less concerned about their privacy, or do they simply have no idea of the consequences of their behavior? There are two possible results: one is that staging users do care about their privacy; the other is that they do not. The first outcome parallels the situational argument: users' concerns do not map directly onto their behaviors. The second result is even more interesting, because it differs so much from previous studies claiming that concerns over privacy protection on the Internet or Web sites might become an obstacle to the diffusion of the Internet and the future growth of e-commerce (Culnan, 1999a, 1999b; Federal Trade Commission, 1998, 1999, 2000; Milne, Culnan, & Greene, 2006). Why do these users ignore their privacy? What are their bottom lines and their disclosure strategies? What are the implications for privacy studies and online business?
Online photo album Web sites have a special characteristic: unlike other Internet technology platforms, they provide abundant visual cues, making visual anonymity impossible. This is another context worth studying. How does a visual function on the Internet influence users' privacy concerns and practices? Investigating this question will help us better understand how users perceive privacy while using a visual function and how users' different motivations for using it influence their privacy.
The staging phenomenon (observed through uses and gratifications and SPP theories) and the Internet technology platform (online photo album Web sites) are the two contexts developed in this study to validate the situational paradigm. The study attempts to show that these two contexts influence users' privacy concerns and practices, which in turn verifies that human relativism is an appropriate approach to studying online privacy.

Background

Studying Privacy with the Situational Paradigm

The major flaw of the current definition of privacy is that it assumes people are vulnerable without considering situations; therefore, privacy risks are always deemed dangerous (Raab & Bennett, 1998). According to the previous discussion, studying the nature of privacy, privacy risks, and privacy protection with the adversarial paradigm means one is always coping with new privacy infringements. As Moor (1997) puts it, the privacy concept has developed chronologically, and in the current computer age privacy has become very "informationally enriched." As such, an updated approach to studying privacy is needed.
Moor, Raab, and Bennett separately study the nature of privacy, privacy risks, and privacy protection, moving from an adversarial paradigm toward a situational paradigm, especially for Internet settings. However, little research recognizes that privacy concerns studies are still trapped in the adversarial paradigm. Online privacy concerns studies today mostly adopt the off-line literature and try to find what kinds of Internet users care more about their privacy by using demographics as independent variables (Hoffman, Novak, & Peralta, 1999; Kate, 1998; Milne & Rohm, 2000; O'Neil, 2001; Sheehan, 2002). Nevertheless, the findings of the privacy concerns literature focusing on demographics are usually in conflict with each other, implying that privacy concerns are not static but vary with contexts.


Contexts are not a breakthrough idea. Most research studies are aware that understanding data subjects' demographic variables is not enough to explain and predict their behavior; contexts also determine subjects' privacy practices and concerns. Hine and Eve (1998) raise the idea of situated privacy concerns by examining different situations qualitatively. They find that there is no particular kind of information that is always privacy sensitive in all contexts.
Unlike the adversarial paradigm, researching privacy concerns with the situational paradigm requires taking two things into consideration. One is the context of privacy risks and data subjects. The other is a fundamental problem with studying privacy concerns: the need to distinguish privacy concerns from privacy practices. When researchers ask about net users' online privacy concerns, people tend to think back to serious privacy infringements they have experienced or heard of and thus rate their privacy concerns higher than what they actually practice on the Internet. However, when researchers ask about net users' Internet usage regarding online privacy, users' daily practices show that they might not be as concerned about privacy as they report (GVU, 1998; Culnan & Armstrong, 1999).
This study will examine privacy concerns and privacy practices separately, together with the contexts of data subjects. Contexts might be technology (Sixsmith & Murray, 2001), Web sites' performance (Hsu, 2002; Culnan & Armstrong, 1999), privacy regulations (Bennett, 1992), political system (Plichtova & Brozmanova, 1997), culture/country, and so on. (In the adversarial paradigm, the country/culture variable is usually treated as a demographic; privacy concern/practice research under a situational paradigm instead treats subjects' culture as a social context, along with social group.)
As mentioned earlier, online photo album Web sites are a new technology platform that should be taken as a context. How do users' privacy concerns differ across technology platforms? In the early age of the Internet, there were several technology platforms: MUDs, BBSs, newsgroups, mailing lists, the World Wide Web, e-mail, instant messaging (IM), and so forth. For e-mail posts and archives, Sixsmith and Murray (2001) point out that, due to the absence of visual, aural, and other elements of face-to-face communication (i.e., in listservs, bulletin board discussion groups, and chat rooms), users of e-mail posts and archives are less aware of their audience and believe they are interacting with only a limited circle of subscribed, interested members. If researchers analyzed their message content, or if outsiders used the same words, they would feel their privacy had been invaded. When people use e-mail posts and archives, they might not care about their online contact information, which is already listed on the listserv or discussion board; however, they might be more concerned about the message contents they have been posting.
Online photo albums and blogs are very popular among users nowadays (Spector, 2003; Trammell, Williams, Postelinicu, & Landreville, 2006). These unprecedented services provide opportunities for users to display their own personal information, pictures, and video clips on the Internet. This means users make visual cues of themselves or their acquaintances public on the Internet, where they might be identified by other users. Visual cues are very much related to privacy. Some research studies have investigated anonymity in CMC (computer-mediated communication) to find the differences between visual and discursive anonymity (Barreto & Ellemers, 2002; Lea, Spears, & de Groot, 2001; Postmes, Spears, Sakhel, & de Groot, 2001).
Visual anonymity refers to the lack of any visual representation of users, such as pictures or video clips. Discursive anonymity is a bit more complex: users' postings might reveal, to a certain degree, information about the message source. Anonymity may be viewed as fostering a sense of deindividuation, which in turn contributes to self-disclosure (McKenna & Bargh, 2000). A visual function, by contrast, may be viewed as fostering a sense of individuation, which in turn contributes to privacy concerns. However, the situation seems to be different for Taiwanese online photo album users' staging behaviors.
Under these circumstances, it is necessary to examine whether a visual function heightens users' privacy concerns on the Internet and whether different motivations for using a visual function influence users' privacy concerns. In order to know users' motivations, this study first adopts the uses and gratifications approach.

Internet Uses and Gratifications


As people integrate a new technology into their lives, they often do not view it as just an updated version of a fixed function, but assign it special values. The cell phone, for example, enhances the fixed-line telephone as a means of social connectedness (Wei & Lo, 2006). The media literature shows the same pattern. Ever since the Internet became popular, a number of researchers have treated it as a mass medium (Morris & Ogan, 1996) and have started to identify its users' gratifications, but its special values are often ignored, a point articulated later.
The uses and gratifications approach provides an ample structure for studying new technology. The approach basically assumes that individual differences lead each user to seek out diverse media and to employ the media in different ways. It is a user-centered perspective and implies that users are active in three ways: utility, intentionality, and selectivity (Katz, Gurevitch, & Hass, 1973; Katz, Blumler, & Gurevitch, 1974; Palmgreen, Wenner, & Rosengren, 1985).
Some studies research the Internet as a whole, while others explore specific forms of Internet technology, such as homepages, search engines, and BBSs. They try to find users' motivations and identify new factors when new forms come out. In summary, net users attempt to fulfill needs such as information seeking, interaction, social avoidance, socialization, economics, entertainment, hobbies, relationship maintenance, coolness, life convenience, and so on (Charney & Greenberg, 2002; Hoffman & Novak, 1996; Korgaonkar & Wolin, 1999; LaRose, Mastro, & Eastin, 2001; Parker & Plank, 2000). The motivations for using conventional media and the Internet are essentially the same, but they are not absolutely identical.
Song, LaRose, Eastin, and Lin (2004) identify a new gratification found only in Internet settings, virtual community, which suggests a new self-image and social life on the Internet that improves on real life. In addition, self-image, self-identity, and self-presentation have been further recognized by researchers as unique to the Internet (Döring, 2002). Net users have become information producers instead of mere consumers, creating their own personal homepages or blogs to disseminate and control individual information. Net users establish a new self-image through the Internet, which might be totally different from their true ego (Dominick, 1999; O'Sullivan, 2000; Papacharissi, 2002).
The motivations for using online photo albums account for most of the cases mentioned. In the case of Taiwan, however, the uses and gratifications approach is not sufficient to explain online photo album users' staging behaviors. First, the traditional uses and gratifications approach was developed for media usage, which differs greatly from Internet users' behaviors today. As Song et al. (2004) claim, over-reliance on the set of gratifications developed from television studies leads to a failure to capture new gratifications. Second, net users actively shape their presentation to interact and communicate with others adequately in everyday life in order to perform in this "performative society" (Kershaw, 1994), something, such as staging or performing, that has not been elaborated. This is the special value that technology users often assign but that researchers ignore when studying technology only through the uses and gratifications approach.


As a result, this study suggests that studying net users' motivations must consider that users are not only the audience of the uses and gratifications approach, but also the diffused audience of the spectacle/performance paradigm. This part of the framework is used to construct a questionnaire to identify new factors.

Diffused Audiences

As Abercrombie and Longhurst (1998) argue, audience research has to take account of the changing nature of audiences and social processes, which current research ignores. The common uses and gratifications approach (and the effects literature) is categorized as a behavioral paradigm. According to Hall (1980), this paradigm has a number of problems, such as its lack of attention to power relations, the textual nature of media messages, and the understanding of social life. Hall's critical approach to the study of the media is categorized as the incorporation/resistance paradigm (IRP), which concerns how social structuring and social location influence the decoding of media texts. The key argument is the extent to which audiences resist, or are incorporated by, media texts in ideological terms. However, overemphasizing the coherence of the response to different texts is problematic.
Abercrombie and Longhurst (1998) argue that the spectacle/performance paradigm (SPP) is much better at understanding the changing audience and conceptualizations of the audience and at recognizing the audience's identity formation and reformation in everyday life. They propose that there are three different types of audiences: simple, mass, and diffused. All of them co-exist.
The simple audience involves direct communication from performers to audience. The mass audience reflects more mediated forms of communication. The diffused audience implies that everyone becomes an audience all the time, which entails people spending increasing amounts of time in media consumption. Thus, the audience interacts with mediascapes rather than with media messages or texts per se. Being a member of an audience is part of everyday life under two social processes: first, the social world is more of a spectacle nowadays; second, the individual is more narcissistic. Furthermore, the nature of the audience is changing, not static. In terms of skills, audiences can be located on a continuum from the consumer, fan, cultist, and enthusiast to the petty producer, in ascending order. Those who try to make money by means of their online photo albums are more like petty producers.
Abercrombie and Longhurst offer a cycle to explain the interaction of media and the diffused audience, taking football as an example. Four processes form the cycle. First, media drenching: the audience increases football consumption across various media. Second, everyday life: media drenching facilitates interaction and discussion, as well as the emotions engaged. Third, performance: the audience gains more knowledge about football, increases attachment to a favorite team, and is identified as a football fan/follower. Fourth, spectacle/narcissism: the audience desires increased visibility and knowledge as a basis for performance and is also constructed by football and by the display of logos, photos, and clothing. To obtain more knowledge, the audience is drenched in a massive amount of mediascapes again.
The SPP has proved adequate in empirical research. Longhurst, Bagnall, and Savage (2004) use it to analyze the museum audience, connecting the ordinariness of museums as part of the audience processes of everyday life to wider themes of spectacle and performance. Museum visiting proves to be a salient part of the identity of the parent.
Online photo album users are not a typical simple or mass audience. They are simultaneously information consumers and producers. They continually get information from fused communication and a huge amount of mediascapes, and they can also be categorized as consumers, fans, cultists, enthusiasts, and petty producers. Without considering the assumptions of the diffused audience and exploring them under the SPP, online photo album users are simply taken as a mass audience and their different behaviors are deemed merely another gratification. However, the role of information producer, the phenomenon of becoming noted, and the constitution of everyday life are then left unexplained.

Main Thrust of the Chapter

Issues, Controversies, Problems

Research Questions and Hypotheses
Based on the review of privacy, the major gratifications of the Internet, and diffused audiences from the SPP, we find that privacy concerns and privacy practices are situated and cannot be examined without considering contexts. In this study, the contexts are users' gratifications and visual cues. The research questions and hypotheses are listed below.
RQ1: What is the relationship between privacy concerns and practices?
H1a: The more information users disclose, the lower the privacy concerns they hold.
H1b: The more visual cues users disclose, the lower the privacy concerns they hold.
H1c: The more visual cues users disclose (the more pictures they post), the more information they disclose.
RQ2: What is the difference in privacy concerns between online photo album haves and have-nots?
H2: Those who have online photo albums are less concerned about their privacy.
RQ3: What is the difference in privacy practices between online photo album haves and have-nots?
H3: Those who have online photo albums disclose more information.
RQ4: What are online photo album users' gratifications?
RQ5: What is the relationship between gratifications and privacy concerns?
H4: Those whose gratifications are more like diffused audiences' are less concerned about privacy.
RQ6: What is the relationship between gratifications and posting pictures?
H5: Those whose gratifications are more like diffused audiences' post more pictures on the Internet.
RQ7: What is the relationship between gratifications and privacy practices?
H6: Those whose gratifications are more like diffused audiences' disclose more of their information.

Data Collection

The survey sample was drawn from volunteers recruited from the biggest online photo album Web site in Taiwan, Wretch (http://www.wretch.cc). The advertisement for recruiting volunteers was put on the homepage with a hyperlink to a Web survey interface. A sample of 893 users participated in this Web survey. The sample consisted of 91.3% of users (815) who have at least one online photo album and 8.6% of users (77) who do not have any. The average time spent on the Internet was reported to be 5.64 hours per week. The most frequently visited album types are relatives' (56.2%), gorgeous persons' (48.0%), celebrities' (19.4%), and others.

Instrument Construction

This study draws on motives identified in previous studies of uses and gratifications of mass media and the Internet, adds the virtual community factor found by Song and his colleagues, and develops new items based upon Abercrombie and Longhurst's diffused audience cycle. The diffused audience questions were developed by conducting in-depth interviews with 10 users; different levels of users were chosen in terms of audience skills, and the wordings were slightly changed to fit the online photo album context. After a pilot study, this study uses a list of 49 gratification items with Likert scales from 1 to 5. The study also revises the privacy concerns questionnaire adopted from previous research (Smith, Milberg, & Burke, 1996).
This study also examines online photo album users' privacy practices, that is, how often they disclose their information and pictures. There are two questions about pictures: "clear picture of you" and "clear picture of friends and relatives." The scope of personal information in this study is revised from the most frequently asked information lists compiled by GVU (1998) and the sensitive information emphasized by privacy advocates, including demographic information, contact information (address, telephone numbers), online contact information (e-mail address), other family members' information, and details of everyday life, such as where to go, where to dine, and so forth.

Data Analysis

The results are presented in the order of the research questions and hypotheses. All analyses were done using the SPSS 14.0 statistical program. A principal component solution with varimax rotation was adopted to find gratification groupings. This chapter retains items with a major loading of 0.5 or higher and a secondary loading no less than 0.40, and each factor is extracted with an eigenvalue greater than 1.0 and a minimum reliability of more than 0.60. All hypotheses were tested with Pearson product-moment correlations. A regression analysis was employed to evaluate the relationship between privacy concerns (dependent variable) and Internet gratifications (independent variables).
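To make the item-screening logic above concrete, the following minimal sketch (written in Python rather than SPSS, with random placeholder data standing in for the survey responses) illustrates the same kinds of steps: counting components with eigenvalues above 1.0, retaining items whose loading reaches .50, and checking scale reliability with Cronbach's alpha. The item names and loadings are hypothetical.

import numpy as np
import pandas as pd

# Random placeholder responses standing in for gratification items.
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.normal(size=(200, 4)), columns=["q1", "q2", "q3", "q4"])

# Eigenvalues of the item correlation matrix; components with eigenvalues
# greater than 1.0 would be retained, echoing the extraction criterion above.
eigenvalues = np.linalg.eigvalsh(items.corr().to_numpy())
n_factors = int((eigenvalues > 1.0).sum())

def cronbach_alpha(df):
    # Classic reliability estimate for a scale built from the given items.
    k = df.shape[1]
    item_variances = df.var(axis=0, ddof=1).sum()
    total_variance = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical rotated loadings for one factor; keep items loading at .50 or above.
loadings = pd.Series({"q1": 0.72, "q2": 0.55, "q3": 0.31, "q4": 0.64})
retained = loadings[loadings >= 0.50].index.tolist()

print("components with eigenvalue > 1.0:", n_factors)
print("items retained for this factor:", retained)
print("alpha of the retained items: %.2f" % cronbach_alpha(items[retained]))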

Results

RQ1: What is the relationship between privacy concerns and practices?
H1a: The more information users disclose, the lower the privacy concerns they hold.
H1b: The more visual cues users disclose, the lower the privacy concerns they hold.
H1c: The more visual cues users disclose (the more pictures they post), the more information they disclose.
It seems that the respondents are not greatly concerned about privacy, as the means are all below 2 (see Table 1). However, respondents differ considerably in their concern, as the standard deviations are all higher than .600, and for questions 1, 5, 7, and 12 the standard deviations are over .800. To examine the details, all 12 questions were summed as general privacy concerns and further analyzed against gratifications and other variables.
The correlations between the frequency of information disclosure and general privacy concerns show some critical points. First, privacy practices do not parallel privacy concerns; the coefficients are quite weak and contradictory (see Table 2). Those who post more clear pictures of themselves (-.130**) or clear pictures of their friends or relatives (-.124**) hold lower privacy concerns. Second, users who usually disclose visual cues on the Internet are less concerned about privacy. However, although other family members' information, contact information, and demographic information are somewhat more sensitive than everyday life and online contact information, respondents with higher privacy concerns unexpectedly disclose more of the sensitive types of information. To clarify this contradiction, it is necessary to look at the differences between users who have online photo albums and those who do not.
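A minimal sketch of the Pearson product-moment correlation behind Table 2 follows, using simulated responses rather than the survey data; the variable construction and effect size are placeholders.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
concern = rng.normal(loc=1.6, scale=0.5, size=765)        # composite privacy concern score (placeholder)
# Simulate picture-posting frequency with a slight negative dependence on concern.
pictures = np.clip(4.0 - 0.3 * concern + rng.normal(scale=1.2, size=765), 1, 5)

r, p = stats.pearsonr(pictures, concern)
print("r = %.3f, p = %.4f" % (r, p))   # a weak negative coefficient, as in the -.130** pattern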




Table 1. Descriptive statistics of privacy concerns (Mean; SD)

1. It usually bothers me when a Web site asks me for personal information. (1.85; .887)
2. When a Web site asks me for personal information, I sometimes think twice before providing it. (1.77; .789)
3. Web sites should take more steps to make sure that unauthorized people cannot access personal information in their computers. (1.48; .687)
4. When people give personal information to a Web site for some reason, the Web site should never use the information for any other reason. (1.34; .637)
5. Web sites should have better procedures to correct errors in personal information. (1.69; .815)
6. Computer databases that contain personal information should be protected from unauthorized access, no matter how much it costs. (1.35; .638)
7. Some Web sites ask me to register or submit personal information before using them. It bothers me to give my personal information to so many Web sites. (1.82; .910)
8. Web sites should never sell the personal information in their databases to other companies. (1.26; .605)
9. Web sites should never share personal information with other Web sites unless it has been authorized by the individual who provided the information. (1.31; .627)
10. I am concerned that Web sites are collecting too much personal information about me. (1.60; .799)
11. The best way to protect personal privacy on the Internet would be through strong laws. (1.50; .781)
12. The best way to protect personal privacy on the Internet would be through corporate policies, which the corporations develop themselves. (1.63; .908)

RQ2: What is the difference in privacy concerns between online photo album haves and have-nots?
H2: Those who have online photo albums are less concerned about their privacy.
RQ3: What is the difference in privacy practices between online photo album haves and have-nots?
Table 2. Correlations of privacy practices with general privacy concerns

Clear pictures of you: -.130**
Clear pictures of your friends or relatives: -.124**
Demographic information: .126**
Contact information: .183***
Online contact information: .080*
Other family members' information: .199***
Everyday life: .028

*** Correlation is significant at the 0.001 level (2-tailed). ** Correlation is significant at the 0.01 level (2-tailed). * Correlation is significant at the 0.05 level (2-tailed).



H3: Those who have online photo albums disclose more information.
To answer Hypotheses 2 and 3, t tests are employed to see whether users with online photo albums have lower privacy concerns and more privacy practices (see Table 3). Those who have online photo albums perceive less privacy concern than the have-nots, which parallels Hypothesis 2. As for privacy practices, those who have online photo albums do disclose more clear pictures of themselves and of their friends or relatives than the have-nots, which is not surprising. What is unexpected is that those who have no online photo albums disclose more sensitive information, such as demographic information, contact information, and other family members' information, than those who have online photo albums.
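The following minimal sketch shows the kind of independent-samples t test reported in Table 3, run on simulated groups whose sizes, means, and standard deviations echo the general privacy concerns row; it is written in Python (here using Welch's unequal-variance test) rather than SPSS, and the simulated scores are placeholders.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
haves = rng.normal(loc=1.53, scale=0.46, size=702)        # simulated album owners ("haves")
have_nots = rng.normal(loc=1.84, scale=0.87, size=63)     # simulated non-owners ("have-nots")

t_stat, p_value = stats.ttest_ind(haves, have_nots, equal_var=False)
print("mean difference = %.2f" % (haves.mean() - have_nots.mean()))
print("t = %.3f, p = %.4f" % (t_stat, p_value))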
The findings show that privacy concerns do not always reflect on users' practices. Taking the contexts into consideration, online photo album users post many pictures online, but they may hold back the most sensitive information to prevent cyber stalking and other consequences. That is why insensitive information, such as online contact information and everyday life, does not show any significant differences between the haves and have-nots: apparently, there is no harm in others knowing the visual cues and insensitive information. The results show that the first context, the technology platform, supports the claim that privacy practices and concerns are situated.


Table 3. Effect of having an online photo album or not on privacy concerns and information disclosure (n; Mean; SD per group, followed by t and mean difference)

General privacy concerns: YES (702; 1.53; .463), NO (63; 1.84; .873); t = -4.544a; mean difference = -.31
Clear pictures of you (PCU): YES (746; 3.72; 1.253), NO (11; 2.00; 1.342); t = 4.526a; mean difference = 1.724
Clear pictures of your friends or relatives (PFR): YES (744; 3.63; 1.256), NO (11; 2.27; 1.618); t = 3.549a; mean difference = 1.360
Demographic information (DEM): YES (734; 2.21; 1.159), NO (10; 3.00; 1.886); t = -2.109c; mean difference = -.786
Contact information (CON): YES (741; 1.49; .912), NO (10; 2.80; 1.687); t = -4.445a; mean difference = -1.309
Online contact information (OCO): YES (741; 2.66; 1.348), NO (10; 2.70; 1.767); t = -.093; mean difference = -.040
Other family members' information (FAM): YES (743; 1.49; .832), NO (10; 2.70; 1.767); t = -4.475a; mean difference = -1.210
Everyday life (LIF): YES (743; 3.53; 1.124), NO (10; 3.20; 1.687); t = .916; mean difference = .330

a Significant at the 0.001 level (2-tailed). b Significant at the 0.01 level (2-tailed). c Significant at the 0.05 level (2-tailed).

RQ4: What are online photo album users' gratifications?

This research extracts 10 factors with eigenvalues above 1.0, together accounting for 67.39% of total variance, from the original set of Internet gratifications. The first gratification, named information seeking, contains seven items (α = 0.89), including learning about local community events, getting useful information, finding bargains, and getting up to date with new technology (see Table 4). Factor two is characterized as media drenching (α = 0.88); this factor indicates that respondents increase their usage of online photo albums and get updates from some of their favorite albums.
The third gratification, named diversion (α = 0.85), results from the pleasurable experience of content. The fourth and fifth gratifications, named respectively performance and narcissism, are unprecedented in prior research. The factor performance (α = 0.83) refers to users' media drenching facilitating discussion of particular persons or things; users then perform their identity as fans/followers and also show their attachment and knowledge. The factor narcissism (α = 0.82) indicates that users desire increased visibility and knowledge as a basis for performance and show their special identities, not only on the Internet but also outside it.
The sixth and seventh gratifications are relationship maintenance (α = 0.90) and aesthetic experience (α = 0.88). These findings are comparable to those of Song et al. Unlike virtual community, the factor relationship maintenance focuses on existing acquaintances, not new friends on the Internet. Aesthetic experience fits the need for aesthetic pleasure. The eighth gratification is virtual community (α = 0.82), which is similar to the finding of Song et al.; users try to establish a new social life online.
The ninth and tenth gratifications are named function (α = 0.60) and reference (α = 0.82). These two are unprecedented. Saving files and finding albums that are easy to navigate are goal-oriented examples of function. For reference, users take the information released by album owners as a reference and plan to do the same things; they do not pursue pleasurable experiences only, but put what they learn into practice in the real world.
Which gratifications are most like those a diffused audience may have? According to Abercrombie and Longhurst (1998), narcissism, performance, and media drenching can undoubtedly be categorized as diffused audiences' gratifications. Virtual community, judging by its name alone, seems to have no connection with a diffused audience; if we look into its meaning, however, we find that the purpose of virtual community is finding companionship and meeting new friends, and that the way of finding companionship is to establish special identities that appeal to new friends (Papacharissi, 2004; Song et al., 2004). This is quite like the argument of the SPP, whereby audiences form their identities to perform and to be narcissistic.
Respondents show 10 gratifications while using online photo albums in this study. How do the gratifications influence users' privacy concerns and practices? Do the diffused audiences' gratifications make any difference compared with other gratifications? Do respondents with diffused audiences' gratifications care less about privacy and disclose more information, or anything else? This is articulated in the next section.

Table 4. Online photo album gratification factor loadings

Factor 1: Information seeking (IS) (eigenvalue 5.31, variance 10.83%, α = 0.89)
  Learn about local community events                              .547
  Get fashion information                                         .738
  Get travel information                                          .851
  Get gourmet food information                                    .860
  Get useful information about products or services               .837
  Find bargains on products and services                          .513
  Get up to date with new technology                              .587

Factor 2: Media drenching (MD) (eigenvalue 4.03, variance 8.232%, α = 0.88)
  Spend lots of time checking online photo albums without awareness   .730
  Checking albums is a part of my life                            .757
  Have my favorite albums                                         .787
  Check if those albums are updated regularly                     .794
  Expect new photos from people I like                            .705

Factor 3: Diversion (DV) (eigenvalue 3.57, variance 7.292%, α = 0.85)
  Have fun                                                        .575
  Feel excited                                                    .520
  Feel entertained                                                .754
  Feel relaxed                                                    .738
  Kill time                                                       .655

Factor 4: Performance (PF) (eigenvalue 3.52, variance 7.190%, α = 0.83)
  Download favorite persons' pictures                             .568
  Discuss particular persons from albums with friends             .576
  Discuss particular persons from albums in BBS                   .711
  Find relative information of particular persons from albums     .644
  Let others know I am a fan of particular persons from albums    .686
  Enjoy browsing online photo albums                              .608

Factor 5: Narcissism (NA) (eigenvalue 3.52, variance 7.179%, α = 0.82)
  Develop a romantic relationship                                 .574
  Get people to think I am cool                                   .641
  Improve my standing in the world                                .700
  Feel like I belong to a group                                   .507
  Find ways to make more money                                    .545
  Get noted                                                       .610

Factor 6: Relationship maintenance (RM) (eigenvalue 3.47, variance 7.088%, α = 0.90)
  Get in touch with people I know                                 .744
  Keep posted about relatives and friends                         .879
  Want to see relatives' and friends' recent pictures             .885
  Want to see relatives' and friends' social life                 .798

Factor 7: Aesthetic experience (AE) (eigenvalue 3.21, variance 6.559%, α = 0.88)
  Find cool new albums                                            .645
  See albums with pleasing designs                                .659
  Find some gorgeous persons' pictures                            .636
  Find attractive graphics                                        .660
  See albums with pleasing color schemes                          .568

Factor 8: Virtual community (VC) (eigenvalue 3.01, variance 6.153%, α = 0.82)
  Find more interesting people than in real life                  .637
  Meet someone in person who I met on the Internet                .535
  Find companionship                                              .755
  Meet new friends                                                .674

Factor 9: Function (FN) (eigenvalue 1.75, variance 3.565%, α = 0.59)
  Save picture files                                              .777
  Find albums that are easy to navigate                           .519

Factor 10: Reference (RE) (eigenvalue 1.62, variance 3.301%, α = 0.82)
  Check where particular persons from albums have been            .600
  Go to the same place where particular persons from albums went  .719

Total variance explained = 67.39%
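The α values reported above and in Table 4 are Cronbach's alpha reliability coefficients for the items loading on each factor. As a minimal sketch of how such a coefficient could be computed from respondent-level data (the DataFrame and column names below are assumed for illustration, not taken from the study's materials), one might write:

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    k = items.shape[1]                              # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: 'responses' holds the survey answers, and the listed
# columns are assumed names for the five media-drenching items in Table 4.
# alpha_md = cronbach_alpha(responses[["md1", "md2", "md3", "md4", "md5"]])

Applied to the media-drenching items, a value near the reported 0.88 would indicate high internal consistency for that factor.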




How do these gratifications influence users' privacy concerns and practices? Do the diffused audiences' gratifications make any difference compared with the other gratifications? Do respondents with diffused audiences' gratifications care less about privacy and disclose more information? This is articulated in the next section.
RQ5: What is the relationship between gratifications and privacy concerns?
H4: Those whose gratifications are more like diffused audiences' are less concerned about privacy.
A correlation coefficient may be difficult to assess just by examining the coefficient itself. In this research, the correlations between the factors, the privacy-concern items, and general privacy concerns are considered low (see Table 5). Nevertheless, the researcher should consider the situation when interpreting correlation coefficient sizes. As Jaeger (1990, cited in Reinard, 2006) points out, to judge whether a correlation coefficient is large or small, you have to know what is typical. Thus, the study pays more attention to comparing correlation coefficients between factors and privacy concerns in order to see how different gratifications influence respondents' privacy concerns.
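As a rough illustration of the comparison described here (a sketch only, with assumed variable names rather than the study's own analysis files), a factor-by-question correlation matrix of the kind reported in Table 5 could be produced as follows:

import pandas as pd
from scipy.stats import pearsonr

def sig_flag(p):
    """Mirror Table 5's footnote: a = .001, b = .01, c = .05 (two-tailed)."""
    return "a" if p < .001 else "b" if p < .01 else "c" if p < .05 else ""

def correlation_table(factors, concerns):
    """Pearson r between every factor score and every privacy-concern item."""
    table = {}
    for q in concerns.columns:          # e.g. "Q1" ... "Q12", "Gen"
        row = {}
        for f in factors.columns:       # e.g. "IS", "MD", ..., "RE"
            r, p = pearsonr(factors[f], concerns[q])
            row[f] = f"{r:.3f}{sig_flag(p)}"
        table[q] = row
    return pd.DataFrame(table).T        # questions as rows, factors as columns

# Hypothetical usage, assuming 'factor_scores' and 'privacy_items' are
# respondent-level DataFrames built from the survey:
# print(correlation_table(factor_scores, privacy_items))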
The descending order of correlation coefficients between gratifications and general privacy concerns is diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism. All are significant, but if we look at each question closely, users with different gratifications have diverse aspects of privacy concerns.
There are six gratifications which do not correlate with Q1 significantly; the coefficients of media drenching, diversion, relationship maintenance, and reference are significant. Among the 12 questions, Q1 and Q7 (Q7 will be discussed later) seem not to irritate users' privacy concerns as much as the others. Why do some respondents not feel bothered when a Web site asks them for personal information? The context should be considered. Respondents with the information seeking, performance, narcissism, aesthetic experience, virtual community, and function gratifications have no choice but to register with sites and provide information in order to use their services. Therefore, these six correlation coefficients are insignificant.
As for Q2, respondents with performance, narcissism, and virtual community gratifications seem not to think twice before providing personal information "when a Web site asks me for it." Unlike with Q2, virtual community gratification has a significant correlation with Q3. Why do respondents with the virtual community gratification think that Web sites should take more steps to make sure that unauthorized people cannot access personal information in their computers? Net users usually have several identities across several communities. Without prevention of unauthorized access, users would be more easily recognized by cross-referencing. Performance, narcissism, and function have insignificant correlations with Q3; users with these gratifications care less about unauthorized people accessing their information.
Information seeking, performance, narcissism, virtual community, and function do not correlate with Q4. Those with these gratifications do not worry about their personal information being used for any other reason by the Web sites to which they give information. In addition, all gratifications correlate with Q5, which means the respondents all care about data error. As for Q6, only respondents with performance, narcissism, and virtual community care less about the database being protected from unauthorized access "no matter how much it costs" (no significant correlation). Comparing Q3 and Q6, it seems that users with virtual community only care about unauthorized people accessing information and do not care about protecting databases in a broad sense.
The correlation of Question 7 with the gratifications is worth deliberating. Only four gratifications (diversion, relationship maintenance, aesthetic experience, and reference) have a significant correlation with Q7. Taking the research situation into consideration, when users use online photo albums for diversion, aesthetic experience, and reference, they usually do these activities alone and do not expect to have any interaction with others online. That is why they feel particularly bothered when Web sites ask them to register or submit personal information before using them. As for relationship maintenance, this gratification could be seen as an in-group need: users tend to interact with their acquaintances online. Following Taylor, Franke, and Maynard's (2000) argument, people with more in-group privacy are more sensitive to differentiating between people inside and outside their private world.
Q8 (data transferring) seems not to be a big deal for respondents with narcissism, performance, and virtual community, and this makes sense under the SPP paradigm. Those who are more interested in being identified and noted are not concerned about data transferring, which ironically might satisfy their need to catch more people's eyes. Interestingly, Q9 does not correlate significantly with information seeking and function, which usually have higher correlation coefficients. The reason might be that users with these two gratifications, to some extent, have to count on the Web sites' services to fulfill their tasks. By the same token, information seeking and function do not correlate significantly with Q1, Q4, Q7, and Q9.
Although all gratifications significantly correlate with Q10, users with relationship maintenance, function, reference, and diversion worry more about information being collected. On the other hand, respondents with narcissism, virtual community, and performance gratifications are willing to expose themselves to the Internet; correspondingly, they are not concerned that Web sites are collecting too much personal information about them.
As for Q11 (protecting online privacy through strong laws), performance and narcissism do not have a significant correlation with it, but all 10 gratifications significantly correlate with Q12 (through corporate policies). In general, in descending order of privacy concern, respondents use online photo albums with the diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism gratifications.

Table 5. Correlation between factors and privacy concerns

        IS      MD      DV      PF      NA      RM      AE      VC      FN      RE
1.    -.001    .082    .127    .031    .003    .096    .061   -.030    .021    .080c
2.     .116b   .127a   .137a   .066    .049    .142a   .136a   .061    .071c   .123a
3.     .093b   .131a   .205a   .060    .049    .170a   .173a   .097b   .063    .154a
4.     .050    .137    .204    .037    .009    .203    .138    .036    .063    .102b
5.     .139a   .218a   .223a   .177a   .124b   .164a   .229a   .172a   .201a   .200a
6.     .089    .141    .169    .050    .008    .179    .169    .060    .088    .128a
7.     .039    .054    .160a   .040    .037    .089c   .103b  -.013    .057    .084c
8.     .093b   .153a   .198a   .056    .016    .214a   .166a   .064    .089c   .126a
9.     .019    .115    .164    .040   -.027    .186    .144    .016    .064    .079c
10.    .119b   .147a   .153a   .093b   .108b   .177a   .145a   .108b   .167c   .164a
11.    .132a   .168a   .158a   .029    .053    .170a   .168a   .071c   .186a   .165a
12.    .169    .127b   .163    .177    .101    .139    .147    .163    .130    .180
Gen.   .157a   .195a   .247a   .107b   .083c   .236a   .219b   .109b   .156a   .206a

a Correlation is significant at the 0.001 level (2-tailed). b Correlation is significant at the 0.01 level (2-tailed). c Correlation is significant at the 0.05 level (2-tailed).



Hypothesis H4 is sustained: those whose gratifications are more like diffused audiences' (performance, narcissism, and virtual community) are less concerned about privacy. However, media drenching does not seem to be a gratification that can be categorized with performance, narcissism, and virtual community as a diffused audiences' gratification; it only has an insignificant correlation with Q7.
How much do the surveyed gratifications influence privacy concerns? The regression analysis with the enter method (see Table 6) illustrates that information seeking, performance, narcissism, and virtual community are all negative and insignificant predictors of privacy concerns, which corresponds with the SPP argument of this study. Media drenching, aesthetic experience, and function are positive but not significant predictors. Diversion, relationship maintenance, and reference are all positive and significant independent variables. The 10 gratifications account for 12.0% of the variance in privacy concerns.
The regression analysis with the stepwise method demonstrates more details of these three significant gratifications (see Table 7). The strongest variable is relationship maintenance, followed by diversion and reference. The three gratifications account for 10.4% of the variance in privacy concerns. This suggests that uses and gratifications should be an important predictor of privacy concerns, which is unprecedented.
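A minimal sketch of the two procedures, assuming a respondent-level DataFrame with a summed privacy-concern score and one column per factor score (all names below are hypothetical), could look like this, with the enter model fitted directly and the stepwise result approximated by tracking the R-squared change as predictors are added:

import pandas as pd
import statsmodels.api as sm

# 'data' is a hypothetical respondent-level DataFrame: 'privacy' is the summed
# privacy-concern score, and the remaining columns hold the ten factor scores.
FACTORS = ["IS", "MD", "DV", "PF", "NA", "RM", "AE", "VC", "FN", "RE"]

def fit_ols(data, predictors):
    """Ordinary least squares of privacy concerns on the given predictors."""
    X = sm.add_constant(data[predictors])
    return sm.OLS(data["privacy"], X).fit()

# Enter method: all ten gratifications entered at once (cf. Table 6, R^2 = .120).
# enter_model = fit_ols(data, FACTORS)
# print(enter_model.rsquared)
# print(enter_model.params, enter_model.tvalues)

# Stepwise-style build-up: add the retained predictors one at a time and track
# the R^2 change (cf. Table 7, where RM, DV, and RE together reach R^2 = .104).
# previous_r2 = 0.0
# for step in range(1, 4):
#     model = fit_ols(data, ["RM", "DV", "RE"][:step])
#     print(round(model.rsquared, 3), round(model.rsquared - previous_r2, 3))
#     previous_r2 = model.rsquared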

RQ6: What is the relationship between gratifications and posting pictures?
H5: Those whose gratifications are more like diffused audiences' post more pictures on the Internet.
RQ7: What is the relationship between gratifications and privacy practices?
H6: Those whose gratifications are more like diffused audiences' disclose more of their information.

Table 6. Regression analysis with the enter method

                             Beta        t
(Constant)                              10.707
Information seeking         -.085       -1.621
Media drenching              .064        1.247
Diversion                    .134        3.097b
Performance                 -.072       -1.419
Narcissism                  -.050        -.934
Relationship maintenance     .192        3.464
Aesthetic experience         .094        1.701
Virtual community           -.032        -.581
Function                     .029         .661
Reference                    .119        2.535
R2 = .120; F = 8.886a

Table 7. Regression analysis with the stepwise method

Model 1                      Beta        t
(Constant)                              22.850
Relationship maintenance     .256        6.805a
R2 = .066; R2 change = .066; F = 46.303

Model 2                      Beta        t
(Constant)                              14.341
Relationship maintenance     .191        4.799a
Diversion                    .180        4.527a
R2 = .094; R2 change = .028; F = 34.086

Model 3                      Beta        t
(Constant)                              12.263
Relationship maintenance     .151        3.584a
Diversion                    .159        3.952
Reference                    .114        2.760a
R2 = .104; R2 change = .010; F = 25.492a


In order to inspect RQ6, RQ7, H5, and H6, this research adopts Pearson correlation analysis (see Table 8). For those whose gratifications are more like diffused audiences', including performance, narcissism, and virtual community, there is no significant correlation between these three gratifications and posting clear pictures of themselves on the Internet. For those whose gratifications are media drenching, diversion, relationship maintenance, and reference, there is a significant negative correlation, which means they tend not to post clear pictures of themselves. Those four gratifications are so private that users will usually lurk and not be known. There is also no significant correlation between the three diffused audience gratifications (performance, narcissism, and virtual community) and posting clear pictures of friends and relatives on the Internet. In addition, there is a significant negative correlation between media drenching, relationship maintenance, and reference and posting clear pictures of friends and relatives on the Internet.
This study cannot directly prove that those whose gratifications are more like diffused audiences' will post more pictures on the Internet. However, it does find that those whose gratifications are more private disclose fewer pictures of themselves, friends, and relatives. Thus, H5 is rejected and should be revised: those whose gratifications are more private post fewer pictures on the Internet.
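A minimal sketch of this kind of check, with the posting behavior coded as a 0/1 indicator (the DataFrame and column names are assumed for illustration; a Pearson correlation with a binary variable is equivalent to a point-biserial correlation), might look like this:

import pandas as pd
from scipy.stats import pearsonr

def posting_correlations(factors: pd.DataFrame, posted: pd.Series) -> pd.DataFrame:
    """Correlate each gratification factor score with a 0/1 posting indicator."""
    rows = []
    for name in factors.columns:
        r, p = pearsonr(factors[name], posted)
        rows.append({"factor": name, "r": round(r, 3), "p": round(p, 3)})
    return pd.DataFrame(rows)

# Hypothetical usage: survey["posts_clear_self"] is 1 for respondents who post
# clear pictures of themselves and 0 otherwise.
# print(posting_correlations(factor_scores, survey["posts_clear_self"]))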
Privacy practices, interestingly enough, have significant correlations with the diffused audience gratifications, but, conflicting with Hypothesis 6, those who have performance, narcissism, and virtual community gratifications do not disclose more. On the contrary, the narcissism gratification has a significant negative correlation with disclosing contact information, online contact information, other family members' information, and everyday life. The performance gratification has a significant negative correlation with online contact information, other family members' information, and everyday life. The virtual community gratification has a significant negative correlation with disclosing contact information, online contact information, and other family members' information.
If we look more into the contexts, it makes sense again. There is a difference between performance and narcissism, although they are grouped as diffused audience gratifications. Those who have the performance gratification are cultists, enthusiasts, or petty producers around some particular themes or persons. They are willing to disclose information to others who are not as professional as they are; they even enjoy being a pro in the field. People who have the narcissism gratification are not crazy about a particular theme or person; their performance subjects are themselves. Thus, they already post a lot of information and hesitate to give away contact information, online contact information, and other family members' information, which might ruin the identities they have created online.
For performance, users see online contact information as more sensitive than contact information. This is quite different from previous research. Although people with the performance gratification are willing to be famous for their expertise, do they really feel no harm in being reached in the real world? Why do they hold back their online contact information rather than their contact information? The reason might be that homepages, blogs, and other self-disclosure tools on the Internet are categorized as online contact information, which might reveal more about them. Additionally, users are known to some extent as having either performance or narcissism gratifications; thus, they withhold other family members' information and everyday life.
As for virtual community, those who want to meet people online hold back their real-life information. Therefore, they do not want to disclose their contact information, online contact information, or other family members' information, which could be traced back to them. As long as they remain anonymous in their real life, it is fine for them to give away information about their everyday life.
This study cannot directly prove that those whose gratifications are more like diffused audiences' disclose more on the Internet. Thus, H6 is rejected, but there is more worth examining. In contrast, this study finds that users with performance, narcissism, and virtual community have some particular considerations because they disclose many pictures. As for the correlations of the other gratifications with privacy practices, information seeking, diversion, aesthetic experience, and function have a significant negative correlation with online contact information. Nevertheless, there is no significant correlation of any of these gratifications with contact information.
Why does the less sensitive type of information have a significant correlation with the more private gratifications? The reason might be that those with more private gratifications take securing sensitive information online for granted and have never thought about giving it away, which makes the statistical results irrelevant. However, they do think twice about the consequences of giving away the less sensitive information. Take relationship maintenance as an example. Demography and everyday life have significant negative correlations with it. Why not other information types? If we consider the in-group privacy characteristics of relationship maintenance, the answer is obvious: in-group members do not want to share information about the group with others.
The media drenching gratification only has a significant negative correlation with everyday life. We do not have a clear idea why media drenching has no correlation with the others. It is logical to presume that media drenching users love to show their expertise about some celebrities or things, but they may not love to have it known how they did so. Everyday life is somewhat sensitive for media drenching users. Indeed, the simple statistics cannot tell more about the relationship between them, but they reveal that contexts play very important roles in users' privacy practices.

Solutions and Recommendations

Each of the significant relationships is provocative and invites further thinking and research on the associations between privacy and contexts which the adversarial paradigm neglects. Although users who have online photo albums are, in general, less concerned about privacy than the have-nots, it does not always mean the haves will disclose more information than the have-nots. This might come about because of two important contexts. The first context is the common operation of online photo albums: online photo album users post many pictures online, and they might hold back the most sensitive information to prevent cyberstalking and other consequences.
The second context comes from patrons' gratifications. Based upon the interpretation above, we have seen how users' gratifications in using a technology platform impact their privacy concerns and practices.

Table 8. Correlation between factors and privacy practices

(Pearson correlations between the ten gratification factors (IS, MD, DV, PF, NA, RM, AE, VC, FN, RE) and the privacy practices examined: posting clear pictures of oneself (PCU), posting clear pictures of friends and relatives (PFR), and disclosing demographic information (DEM), contact information (CON), online contact information (OCO), other family members' information (FAM), and everyday life (LIF).)


H5 is rejected; the study instead finds that those whose gratifications are more private disclose fewer pictures of themselves, friends, and relatives. H6 is also rejected: those who already post many pictures, or are more diffused audience oriented, might think about possible consequences and hold back information.
As for the relationships between privacy concerns and gratifications, the descending order of correlation coefficients between gratifications and privacy concerns is diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism. From the standpoint of uses and gratifications and the SPP, this study's findings suggest that the gratifications can be categorized into three groups.
The first group includes performance, narcissism, and virtual community. Respondents who use online photo albums with these three gratifications like to expose themselves to the Internet world in order to meet people and even become noted. The second group consists of information seeking, function, and media drenching. Respondents with these three gratifications are goal-oriented: they adopt online photo albums for particular purposes, such as seeking information, storing files, and checking their favorite albums, and they do not want to meet people or become famous. The third group comprises four gratifications: diversion, relationship maintenance, aesthetic experience, and reference. These respondents are not as goal-oriented as those with the second group's gratifications; their purposes are more like seeking personal pleasure, which is usually kept secret or shared with some in-group persons only.
Media drenching, originally categorized as a diffused audiences' gratification, is an interesting one. Users with this gratification do not like to put clear pictures of themselves, relatives, and friends on the Internet. However, it only has a significant negative correlation with everyday life (privacy practices) and a non-significant correlation with Q7 (privacy concerns). By looking at the questions composing the factor, the contextual influences are revealed. Media drenching users enjoy being a pro in their favorite field and make efforts to maintain their expertise, but they do not want to reveal their efforts and resources. Therefore, media drenching is instead categorized into the second group, which often does things secretly.
The diverse aspects of privacy concerns caused by different gratifications do show up in this study's findings. For example, information seeking does not have a significant correlation with Q1, Q4, Q7, and Q9. Those questions relate to collection and unauthorized secondary use (Smith et al., 1996). This study finds that users with information seeking are not as concerned about information being used by other Web sites, but are concerned about their information being used by unauthorized individuals. The message here is that users who want to use Web site services to seek information have to sacrifice their privacy in some sense, but that does not mean access by unauthorized individuals is acceptable.
Function does not have a significant correlation with Q1, Q3, Q4, Q7, and Q9 either. Those questions relate to the improper use of information and to collection, as defined in Hsu's (2006) study. As long as Web sites protect data well, users with the function gratification do not worry about collection and secondary use. They need safe spaces for file storage more than they need privacy.
The more diffused audience gratifications, including performance, narcissism, and virtual community, only have significant correlations with Q5, Q10, and Q12. This reveals that users with these three gratifications only care about data error, the collection of too much personal information about them, and protecting privacy through corporate policies. They do not care about Web sites' collection of registration information, secondary uses, or unauthorized access (except virtual community). Users with virtual community have two more aspects of privacy concerns: unauthorized access and protecting privacy through strong laws. Accordingly, it is quite reasonable that users manage purely online relationships with special caution against being cheated or having their real-world identities divulged.
This in turn begs the question: which gratifications might lead users to care more about their privacy? Looking at both the correlation and regression analyses, the relationship maintenance, diversion, and reference gratifications predict privacy concerns, accounting for 10.4% of the variance. These three gratifications are gained from using online photo albums secretly or from sharing with some in-group persons only. Thus, these users care about collection, data error, unauthorized access, and secondary uses, and hope that privacy can be protected by both strong laws and corporate policies. The other gratifications are not significant predictors at all. Moreover, relationship maintenance raises an interesting challenge for understanding international and intercultural differences in privacy (Taylor et al., 2000). Does the in-group privacy phenomenon only happen in Taiwan or other Asian countries? Could it apply to other countries and cultures?
By doing this research, the author tries to demonstrate that current online privacy theories, frameworks, and models are not sufficiently thoughtful. The situational paradigm could be a solution for online privacy research. Speaking of the situational paradigm, two common questions are brought up most often. First, when it comes to contexts, there are no concrete answers to questions; it is not possible to depict privacy, because it depends on the contexts. Second, how can we account for every kind of context? The situational paradigm does not mean contexts or situations are omnipotent. We need to keep in mind that users' behavior patterns and environmental risks are not static but dynamic, especially in Internet settings. Thus, it would be unguarded and arbitrary to conclude what types of persons care more about their privacy and disclose less information. The findings also give a positive answer to the main question of this book: there are no absolute truths about online consumer protection, but instead theories of human relativism.
This study sheds new light on how users' gratifications influence their privacy concerns and practices. This is unprecedented. Internet
users us