Title: Generating Standards for Privacy Audits: Theoretical Bases from Two Disciplines
Author: Alan Toy
EAP Date (approved for print): 14 November 2017

Note to users: Articles in the ‘Epubs ahead of print’ (EAP) section are peer
reviewed accepted articles to be published in this journal. Please be aware
that although EAPs do not have all bibliographic details available yet, they
can be cited using the year of online publication and the Digital Object
Identifier (DOI) as follows: Author(s), ‘Article Title’, Journal (Year),
Volume(Issue), EAP (page #).

The EAP page number will be retained in the bottom margin of the printed
version of this article when it is collated in a print issue.

Collated print versions of the article will contain an additional volumetric
page number. Both page citations will be relevant, but any EAP reference
must continue to be preceded by the letters EAP.

ISSN-0729-1485
Copyright © 2017 University of Tasmania
All rights reserved. Subject to the law of copyright no part of this publication
may be reproduced, stored in a retrieval system or transmitted in any form or
by any means electronic, mechanical, photocopying, recording or otherwise,
without the permission of the owner of the copyright. All enquiries seeking
permission to reproduce any part of this publication should be addressed in
the first instance to:
The Editor, Journal of Law, Information and Science, Private Bag 89, Hobart,
Tasmania 7001, Australia.
editor@jlisjournal.org

http://www.jlisjournal.org/

Generating Standards for Privacy Audits: Theoretical
Bases from Two Disciplines

ALAN TOY*

Abstract

Privacy auditing is performed by a range of professionals, including those within the
disciplines of accounting/auditing and law. However, the range of services that may be
called privacy audits is broad, and different organisations are approaching privacy
audits in different ways. It is not possible to identify a set of standards for a privacy
audit that the majority of privacy auditors would agree upon. This paper suggests that
accountants and lawyers may reach agreement on a common theoretical basis and that
this could produce standards for privacy audits that are capable of providing assurance
to organisations that operate internationally, and to consumers in different countries.

Critical Theory provides a lens through which the practice of privacy auditing may be
viewed. This allows for a study of privacy auditing that emphasises areas in which the
practice may have room for improvement. It is suggested that privacy audits may be
improved by the use of standards that come closer to harmonisation. This would provide
the additional benefit of updating the standards to more modern criteria than are
currently contained within the national information privacy laws.

Introduction

Privacy audits are now firmly on the agenda of professional services firms in
the disciplines of both accounting/auditing and law. Some well-publicised
privacy audits have targeted high-profile organisations.1 Audits can help to
address the concerns of individual citizens, particularly where concerns are
amplified by the sweeping nature of privacy violations,2 which can affect large
groups of citizens at once.

* Senior Lecturer in Commercial Law, The University of Auckland Business School.
1 Kashmir Hill, So, What Are These Privacy Audits That Google And Facebook Have To Do
For The Next 20 Years? (30 November 2011) Forbes
<http://www.forbes.com/sites/kashmirhill/2011/11/30/so-what-are-these-
privacy-audits-that-google-and-facebook-have-to-do-for-the-next-20-years/>.
2 The Target data breach affected 70 million customers: Pamela Prah, Target’s data
breach highlights state role in privacy (16 January 2014) USA Today
<http://www.usatoday.com/story/news/nation/2014/01/16/target-data-
breach-states-privacy/4509749/>; In New Zealand the Accident Compensation
Corporation privacy breach in 2012 involved the unauthorised disclosure of the

EAP 1
Journal of Law, Information and Science Vol 25 2017

However, privacy audits do not fall under one universally agreed definition.
The range of services that may be called privacy audits is very wide, and
different organisations are approaching privacy audits in different ways. It is
currently impossible to identify a set of standards for a privacy audit that the
majority of privacy auditors would agree upon. This variation of standards
impedes the development of privacy audits as an assurance service.

This article argues that accountants and lawyers may have some common
ground when creating standards for privacy audits. Further, agreement
between the disciplines on a common theoretical basis could allow for the
production of standards for privacy audits that are capable of providing
assurance to organisations that operate internationally, and to other
stakeholders such as consumers who might reside in a different country from
the one in which the audit report is produced.

As privacy audits develop, new types of expertise and institutions might arise to conduct them. Audit firms may develop specialised privacy audit teams, and the practice may extend beyond private auditors altogether. Multidisciplinary teams may also benefit from common agreement on the basis of standards for privacy audits.

1 An Increasing Need for Privacy Auditing

More and more information is being collected about people in society: ‘[t]he
scope of surveillance and social control in contemporary society is at an
unprecedented high’.3 The focus must now turn from preventing collection of
personal information to overseeing its uses. The power of those who control
this data is increasing and this must be matched by a corresponding increase
in their responsibility. Accountability of data controllers can be facilitated by
the rise of privacy auditing. Until recently it has not been necessary for privacy auditing to assume a greater role, but changes in technology now make this imperative.

The advent of Big Data4 has important implications for privacy and will result
in increasing risks of breaches of privacy. The benefits of Big Data are myriad,

personal information of 6,748 individuals: KPMG and Information Integrity
Solutions, Independent Review of ACC’s Privacy and Security of Information (Report,
KPMG and Information Integrity Solutions, 22 August 2012)
<https://privacy.org.nz/assets/Files/Media-Releases/22-August-2012-ACC-
Independent-Review-FINAL-REPORT.pdf>.
3 Lee Bygrave, Data Protection Law: Approaching Its Rationale, Logic and Limits (Kluwer
Law International, 2002) 100.
4 The term ‘Big Data’ has no formal definition, but it is generally understood that ‘big
data refers to things one can do at a large scale that cannot be done at a smaller one,
to extract new insights or create new forms of value, in ways that change markets,

EAP 2

but the danger is that people could be subject to profiling and decisions could
be made without the subject individuals knowing the reasons for such
decisions. Privacy audits may be a method of addressing these concerns. To
guard against the possibility of loss of autonomy and individual liberty, ‘big
data will require monitoring and transparency, which in turn will require new
types of expertise and institutions’.5 For example, some well-resourced tertiary institutions in the United States use the social media history of individual applicants as a screening tool when deciding whether to admit a student. The potential students are sometimes not informed that their information has been used in this way.6 Privacy concerns of consumers appear to be significant,7 and in
research by the Federal Trade Commission (‘FTC’), ‘a nationwide survey
indicated that 57 per cent of all app users have either uninstalled an app over
concerns about having to share their personal information, or declined to install
an app in the first place for similar reasons’.8

Privacy audits are one way of increasing privacy protections in the age of Big
Data and social networking. These audits investigate the flows of personal
information within an organisation and determine whether the organisation
implements appropriate privacy principles in its management of these data
flows. The scope of a privacy audit relates to personal information, whether or
not it is stored in an IT system. A privacy audit is therefore different from an
IT audit, which does not focus on implementation of appropriate privacy
principles but instead focuses on the security of information. Even if an
organisation implements the best security controls available, it may still fail to
implement appropriate privacy principles.

organisations, the relationship between citizens and governments, and more’: Viktor
Mayer-Schonberger and Kenneth Cukier, Big Data: A Revolution That Will Transform
How We Live, Work and Think (Houghton Mifflin Harcourt, 2013) 6.
5 Mayer-Schonberger and Cukier, above n 4, 179.
6 Natasha Singer, ‘Toning Down the Tweets Just in Case Colleges Pry’ The New York
Times (online) 19 November 2014
<http://www.nytimes.com/2014/11/20/technology/college-applicants-sanitize-
online-profiles-as-college-
pry.html?hp&action=click&pgtype=Homepage&module=mini-moth&region=top-
stories-below&WT.nav=top-stories-below>.
7 Stephen Shelton, ‘The Case for Privacy Audits’ (2010) 67 Internal Auditor 23, 23.
8 Federal Trade Commission, ‘Mobile Privacy Disclosures: Building Trust Through
Transparency’ (Staff Report, Federal Trade Commission, February 2013) 3
<https://www.ftc.gov/sites/default/files/documents/reports/mobile-privacy-
disclosures-building-trust-through-transparency-federal-trade-commission-staff-
report/130201mobileprivacyreport.pdf>.

EAP 3

The earliest privacy audits appear to have taken place in West Germany in the
early 1980s.9 Early examples also took place in other countries in Europe and
in Canada.10 In Europe, ‘the number and frequency of [privacy] audits is
increasing’.11 However the number of privacy audits still varies widely ‘with
hundreds performed annually in some Member States, and just a few in
others’.12 Interestingly, there has been some use of consistent privacy audit
standards simultaneously in multiple states of the European Union (‘EU’). For
example, in 2006 the Article 29 Working Party began investigation of data
processing practices in the Private Health Insurance sector. This was a ‘co-
ordinated EU-wide investigation’ that used the same methodology across the
different countries.13

A small but increasing number of privacy audits have been required under orders by the FTC in the United States. These have generally targeted
large, dominant players in the online environment such as Google.14 Another
example is Snapchat, which has recently been issued with a consent order that
requires it to have biennial assessments that ‘certify that the privacy controls
are operating with sufficient effectiveness to provide reasonable assurance to
protect the privacy of covered information and that the controls have so
operated throughout the reporting period’.15

9 David Flaherty, Protecting Privacy in Surveillance Societies: The Federal Republic of Germany, Sweden, France, Canada, and the United States (University of North Carolina Press, 1989) 58.
10 Colin Bennett and Charles Raab, The Governance of Privacy: Policy Instruments in
Global Perspective (Ashgate Publishing, 2003) 110.
11 Christopher Kuner, European Data Protection Law: Corporate Compliance and Regulation
(Oxford University Press, 2nd ed, 2007) 51.
12 Ibid 52.
13 Article 29 Working Party, ‘The EU- Working Party for data protection is launching
an investigation into the processing of personal data in the private health insurance
sector early March 2006’ (Press Release, 13 March 2006) 1
<http://ec.europa.eu/justice/data-
protection/law/files/news/etf_press_release_final_13_11_06_en.pdf>.
14 Agreement Containing Consent Order with a service date of October 28 2011, between
Google Inc and the Federal Trade Commission (US) (28 October 2011) Federal Trade
Commission
<https://www.ftc.gov/sites/default/files/documents/cases/2011/03/110330goo
glebuzzagreeorder.pdf>.
15 Agreement Containing Consent Order with a service date of December 14 2014, between
Snapchat Inc and the Federal Trade Commission (US) (14 December 2014) Federal Trade
Commission 4
<https://www.ftc.gov/system/files/documents/cases/140508snapchatorder.pdf
>.

EAP 4

Privacy audits are related to other mechanisms of privacy governance such as
Privacy Impact Assessments (‘PIAs’)16 that can assess the effect on privacy
rights of the possible implementation of a new product or service. If a PIA is
completed at an early stage of development of a new product or service then a
privacy audit may be performed after implementation of the new product or
service to assess whether or not the PIA was implemented correctly, or as part
of a periodic review cycle (periodic review is a common requirement in a PIA).

Regulatory and self-regulatory policy instruments are also relevant to privacy governance, and these may, alongside privacy audits, assist in protecting the privacy of citizens. Examples include United Nations guidelines in 199017 and
Convention 108 of the Council of Europe,18 which may soon be modernised and
may become of increasing global importance.19 The International Organisation
for Standardisation (‘ISO’) sought to develop standards for information
privacy and it produced a set of standards in 2011,20 but these have not yet
gained widespread acceptance. Privacy seal organisations such as TRUSTe are
another example of self-regulatory attempts to protect information privacy,21
but these are not synonymous with privacy audits and therefore they will not
be discussed in detail in this paper. Privacy Enhancing Technologies (‘PETs’)
are technological mechanisms for privacy governance and they may also assist
to enhance the privacy protection of citizens.22 PETs exist to enable citizens to
implement their information privacy rights. These should be distinguished
from technologies that exist to protect security.23 It is possible that a privacy
audit may consider the use of PETs by an organisation.

Important revelations regarding the collection of personal data under the
auspices of the government of the United States have resulted in public anxiety

16 David Wright and Paul De Hert, Privacy Impact Assessment (Springer, 2012) 172.
17 Lee Bygrave, Data Privacy Law: An International Perspective (Oxford University Press,
2014) 51.
18 Convention for the Protection of Individuals with regard to Automatic Processing of
Personal Data, opened for signature 28 January 1981, CETS No 108 (entered into force
1 October 1985); Bennett and Raab, above n 10, 72.
19 Graham Greenleaf, ‘Renewing Convention 108: The CoE’s “GDPR Lite” initiatives’
(2016) 142 Privacy Laws & Business International Report 14, 17.
20 ISO/IEC 29100:2011 (2011–12) International Organisation for Standardisation
<http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnu
mber=45123>.
21 Bennett and Raab, above n 10, 129.
22 Bygrave, above n 17, 101.
23 Bennett and Raab, above n 10, 141.

EAP 5

and action, including legal action.24 There have also been official reports within the United States that recommend changes to official policies on data collection, because

there have been serious and persistent instances of noncompliance in the
Intelligence Community’s implementation of its authorities. Even if
unintentional, these instances of noncompliance raise serious concerns about the
intelligence community’s capacity to manage its authorities in an effective and
lawful manner.25

There is also a recommendation that ‘the US Government should follow the
model of the Department of Homeland Security and apply the Privacy Act of
1974 in the same way to both US persons and non-US persons’.26 This indicates
both an impetus for change, and a confirmation that information privacy is an
issue that affects people across national boundaries. Privacy audits could assist
to provide accountability in respect of personal information that has been
gathered under these programs.

It is unknown whether and to what extent President Trump will give effect to information privacy initiatives pursued by the White House under President Obama. For example, President Obama voiced privacy concerns in a speech delivered at the FTC,27 and his administration proposed the latest Consumer Privacy Bill of Rights.28 If passed, this Bill would allow enforcement by the FTC of privacy
rights for consumers. This document supports the approach suggested by some
research that is in favour of fundamental principles supplemented by industry
codes of conduct.29 However, it does not embrace the principles of
proportionality, legitimacy and privacy by design to the full extent that has
been recommended by the latest European proposals, and proposals by the

24 For example: Klayman v Obama 957 F Supp 2d 1 (D DC, 2013), which was brought
following the revelations by Edward Snowden.
25 Richard A Clarke et al, ‘Liberty and Security in a Changing World’ (Report,
President’s Review Group on Intelligence and Communication Technologies, 12
December 2013) 76
<https://obamawhitehouse.archives.gov/sites/default/files/docs/2013-12-
12_rg_final_report.pdf>.
26 Ibid 19.
27 Reed Freeman, President Obama Turns His Attention to Privacy (12 January 2015)
International Association of Privacy Professionals
<https://privacyassociation.org/news/a/president-obama-turns-his-attention-to-
privacy/>.
28 Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015 (2015) The
White House <https://www.scribd.com/document/257168595/Administration-
Discussion-Draft-Consumer-Privacy-Bill-of-Rights-Act-of-2015>.
29 Alan Toy, ‘Different Planets or Parallel Universes: Old and New Paradigms for
Information Privacy’ (2013) 25 New Zealand Universities Law Review 938.

EAP 6

FTC itself. Nevertheless, it would be an important step forward for the US to
enact a privacy bill of rights to cover the general privacy rights of consumers.

Despite the risks of disclosure of personal information, a ‘privacy paradox’
means that, ‘despite reported high privacy concerns, consumers still readily
submit their personal information in a number of circumstances’.30 Although privacy is valued as a right, it may still be assigned an economic value. This has given rise to the idea of privacy as a commodity. In essence, this means that
consumers trade their privacy for certain benefits.31 An example of such a
benefit may be free access to online social networking services. A report of the US government under the previous administration describes as flawed the speculation that privacy can no longer exist because it is inconsistent with ‘modern communications technologies’.32 According to this report, there is no factual basis for the claimed decline of privacy, and officials are justified in strengthening their responses to new threats to privacy.33 The
use of privacy audits is an important backstop protection in such an
environment.

The use of new technologies is also changing the balance between privacy and
other interests in the legal sphere. New judgments are striking a different
balance regarding privacy and interests such as law enforcement. For example,
in Riley v California34 the majority in the Supreme Court of the United States held that the balance struck for searches of digital data on cell-phones must differ from that struck for searches of other objects.35 In a concurring judgment, Justice Alito said that this
issue required a ‘new balancing of law enforcement and privacy interests’.36
This judgment prevents police officers from performing a warrantless search
of a cell-phone’s data even if the cell-phone is in the possession of a person who
has been arrested. The basis of this ruling is that the sheer quantity of data held
on a cell-phone requires different treatment from other items in a person’s
possession, confirming that changes in technology can result in increases in
legal privacy protections. Privacy audits are able to apply the new balance of

30 H Jeff Smith, Tamara Dinev and Heng Xu, ‘Information Privacy Research: An
Interdisciplinary Review’ (2011) 35 Management Information Systems Quarterly 989,
993.
31 Paul Pavlou, ‘State of the Information Privacy Literature: Where Are We Now and
Where Should We Go?’ (2011) 35 Management Information Systems Quarterly 977, 981.
32 Clarke et al, above n 25, 45.
33 Ibid 45–6.
34 Riley v California 573 US __ (2014).
35 Ibid 8 (Roberts CJ).
36 Ibid 3.

EAP 7

interests provided they are flexible and do not simply apply the information privacy laws in place in a single jurisdiction.

There have been calls for greater harmonisation of information privacy laws.
For example, one recent survey about personal information found that ’73 per
cent of respondents indicated that there should be a call for a global consumer
bill of rights and furthermore saw the United Nations as fostering that’.37
Furthermore, the FTC has been active in recommending new initiatives to
address the challenges to privacy that are presented by the rise of data broker
organisations. These are organisations that collect the personal information of consumers and then use that information or transfer it to others. There
are privacy risks with this business model. The FTC has stated that ‘[t]he
specific legislative recommendations made by the Commission reflect high-
level principles drawn from the findings of this study, the Commission’s
previous work in this area, and the ongoing public debate about data brokers’.38
These principles reflect best practices for privacy protection, such as ‘privacy
by design, which includes considering privacy issues at every stage of product
development’.39 These policy recommendations may have significant influence
on the practice of privacy auditing because the high-level principles that the
FTC is recommending reflect some aspects of best practice (such as privacy by
design) that are not currently part of information privacy laws in countries such
as Australia, Canada, Ireland, New Zealand or the United States. The FTC has
developed privacy audits as a key part of settlements,40 and it ‘will continue to
work with industry, consumer groups and lawmakers to further the goals of
increased transparency and consumer control’.41 This indicates that the

37 Cloud Security Alliance ‘Data Protection Heat Index Survey’ (Report, September
2014) 6 <https://cloudsecurityalliance.org/download/data-protection-heat-index-
survey-report/>.
38 Edith Ramirez et al, ‘Data Brokers: A Call for Transparency and Accountability’
(Report, Federal Trade Commission, May 2014) vii
<https://www.ftc.gov/system/files/documents/reports/data-brokers-call-
transparency-accountability-report-federal-trade-commission-may-
2014/140527databrokerreport.pdf>.
39 Ibid 54.
40 The Google privacy audit was a reasonable assurance attestation engagement: Alan
Toy and David Hay, ‘Privacy Auditing Standards’ (2015) 34 Auditing: A Journal of
Practice & Theory 181, 184.
41 Ramirez et al, above n 38, 57; The National Telecommunications and Information
Agency (NTIA) within the US Department of Commerce has begun to convene
meetings to craft codes of practice for privacy best practices in specific industries.
The ‘multi-stakeholder process to develop a code of conduct on mobile application
transparency’ began in July 2012 and there have been a number of subsequent
meetings: Federal Trade Commission, above n 8, iii; If the process results in the
development of strong codes, the FTC may refrain from exercising its law

EAP 8

direction of the United States may be one of increased leadership in
information privacy best practice. The goal of transparency is also echoed in
other statements of the US government, albeit under the previous
administration.42 Although it has been argued that audits required by the FTC are not of a sufficient standard to warrant the term ‘audit’, and are merely ‘assessments’,43 such an argument is misleading. The relevant standard for attestation engagements states that audits are for financial reports, and that for other types of information the proper term is ‘examination’,44 which may be at the level of reasonable assurance (that is, a high level of assurance, exactly the same level as an audit). ‘Assessment’ is not a term of art in accounting, and to claim otherwise is simply false.

2 Impetus for Privacy Audits

The concept of privacy audits has existed at least since Gelinas’ PhD research.45
However, it was not until the 1990s that publicly available privacy audits were
conducted. More recently, it has been suggested that, from a management perspective, it may be useful to have a privacy audit.46

Many privacy audits arise from complaints to regulators such as Privacy
Commissioners. Examples include:
• The audit of the Canadian Firearms Program, where it is stated that the
Privacy Commissioner of Canada ‘has received a number of inquiries
and complaints about the Program. … In part to assist our Office in
responding to these complaints and inquiries, we decided in September
1999 that it was an opportune time to review the Program’.47

enforcement powers against an organisation that adheres to such a code: Federal
Trade Commission, above n 8, 12.
42 Clarke et al, above n 25, 28.
43 Chris Hoofnagle, Federal Trade Commission Privacy Law and Policy (Cambridge University Press, 2016) 342–3.
44 AT-C Section 105: Concepts Common to All Attestation Engagements, Source: SSAE 18,
(2016) American Institute of Certified Public Accountants
<http://www.aicpa.org/RESEARCH/STANDARDS/AUDITATTEST/Download
ableDocuments/AT-C-00105.pdf>.
45 Ulric Gelinas, Privacy Audits and the Certified Public Accountant (PhD Thesis,
University of Massachusetts Amherst, 1978).
46 Kai-Lung Hui, Hock Hai Teo and Sang-Yong Tom Lee, ‘The Value of Privacy
Assurance: an Exploratory Field Experiment’ (2007) 31 Management Information
Systems Quarterly 19, 28.
47 George Radwanski, ‘Review of the Personal Information Handling Practices of the
Canadian Firearms Program’ (Final Report, Office of the Privacy Commissioner of

EAP 9

• Complaints made regarding Staples Business Depot. Between 2004 and 2008, Staples sold devices to consumers without properly wiping their electronic memory. Some of these devices contained personal information about Canadians, and the Canadian Privacy Commissioner commenced a privacy audit.48
• The well-known complaint-driven privacy audit of Facebook Ireland Ltd
in 2011.49 This audit was completed by the Office of the Data Protection
Commissioner of Ireland, following receipt of a complaint by Max
Schrems, an Austrian law student.50

Public concern, operating in a more general way than the complaints route, may also trigger a privacy authority to conduct an audit. This was the case with the privacy audit of the Canadian Border Services
Agency, which came about after findings in a 2004 study that ‘the Canadian
public is concerned about the trans-border flow of their personal information
to the United States’.51 Public concern regarding a number of breaches of data
security led to a privacy audit of the Office of the Revenue Commissioners by
the Office of the Data Protection Commissioner of Ireland.

Highly publicised privacy breaches have also necessitated privacy audits:
• The Department of Social and Family Affairs of Ireland ‘was scheduled
for priority audit in direct response to further media reports in October
2007 alleging a series of unlawful disclosures of personal data by an
employee of the Department who then used the information for criminal
purposes’.52 This was in the context of broader public concern over how
this Department was handling personal data.

Canada, 29 August 2001) 4 <https://www.priv.gc.ca/en/opc-actions-and-
decisions/audits/fr_010813/>.
48 Steven Morgan et al, ‘Staples Business Depot: Audit Report’ (Final Report, Office of
the Privacy Commissioner of Canada, 2011) 6
<https://www.priv.gc.ca/media/1150/ar-vr_staples_2011_e.pdf>.
49 Gary Davis, ‘Facebook Ireland Ltd: Report of Audit’ (Final Report, Office of the Data
Protection Commissioner of Ireland, 21 December 2011)
<https://www.dataprotection.ie/documents/facebook%20report/final%20report
/report.pdf>.
50 Pamela Duncan, ‘Unfriends of Facebook unite’ The Irish Times (Dublin) 15 October
2011, B3.
51 ‘Audit of the Personal Information Management Practices of the Canada Border
Services Agency Trans-border Data Flows’ (Report, Office of the Privacy
Commissioner of Canada, June 2006) 6
<https://www.priv.gc.ca/media/1166/cbsa_060620_e.pdf>.
52 Office of the Data Protection Commissioner of Ireland, ‘Data Protection in the
Department of Social & Family Affairs’ (Audit Report, 2008) 5

EAP 10

• In Canada in 1998, information collected by the federal government and
held by National Archives Canada was transferred to a contractor for
disposal. The contractor instead arranged to sell the intact paper files to
the highest bidder (the files contained information about thousands of
Canadians such as tax information and parole records). This led to an
audit by the Canadian Privacy Commissioner.53
• In 2008, hundreds of credit reports regarding Canadians were
downloaded from 14 mortgage brokers by an unauthorised person for
his own use. A privacy audit followed this breach.54
• The federal police force in Canada was found to be disclosing ‘details of
convictions, discharges or pardons to employers without the informed
consent of the prospective employee’ and this also provided justification
for a privacy audit.55
• Into this category also falls the Google privacy audit,56 which was done
pursuant to an agreement struck with the FTC.57
• In New Zealand, the Accident Compensation Corporation was audited
following a privacy breach,58 as was the Ministry of Social
Development.59

Government policies may provide reasons for privacy audits. This is
demonstrated with the 2007 privacy audit of nine institutions of the Canadian

<http://www.dataprotection.ie/viewdoc.asp?m=p&fn=/documents/AUDITS/A
uditReports.htm>.
53 Steven Morgan et al, ‘Personal Information Disposal Practices in Selected Federal
Institutions’ (Final Report, Office of the Privacy Commissioner of Canada, 2010) 4
<https://www.priv.gc.ca/media/1143/ar-vr_pidp_2010_e.pdf>.
54 Steven Morgan et al, ‘Audit of Selected Mortgage Brokers’ (Final Report, Office of
the Privacy Commissioner of Canada, 2010) 3
<https://www.priv.gc.ca/media/1140/ar-vr_mb_2010_e.pdf>.
55 Steven Morgan et al, ‘Audit of Selected RCMP Operational Databases’ (Final Report,
Office of the Privacy Commissioner of Canada, 2011) 3
<https://www.priv.gc.ca/media/1148/ar-vr_rcmp_2011_e.pdf>.
56 ‘Initial Assessment Report on Google’s Privacy Program’ (Report, PwC, 22 June
2012) <https://epic.org/privacy/ftc/googlebuzz/FTC-Initial-Assessment-09-26-
12.pdf>.
57 Agreement Containing Consent Order with a service date of October 28, 2011, between
Google Inc and the Federal Trade Commission (US), above n 14.
58 KPMG and Information Integrity Solutions, Independent Review of ACC’s Privacy and
Security of Information (Report, KPMG and Information Integrity Solutions, 22
August 2012) <https://privacy.org.nz/assets/Files/Media-Releases/22-August-
2012-ACC-Independent-Review-FINAL-REPORT.pdf>.
59 Deloitte, ‘Ministry of Social Development Independent Review of Information
Systems Security’ (Report, Deloitte, 30 November 2012)
<http://img.scoop.co.nz/media/pdfs/1212/deloittephase2finalreport.pdf>.

EAP 11
Journal of Law, Information and Science Vol 25 2017

government, to determine their compliance with the Privacy Impact
Assessment Policy that had been introduced by the government of Canada in
2002.60 A change to legislation requiring organisations to submit Federal
Annual Privacy Reports also provided a reason for the Canadian Privacy
Commissioner to assess compliance with this change.61 The Canadian Privacy
Commissioner also conducted a privacy audit in 2009 of the Passenger Protect
Program, an anti-terrorist initiative set up in 2007.62

Further impetus for privacy audits can arise from changes at public
institutions. An example is the sudden growth and change of Canadian
Passport Operations, where an unprecedented increase in staff numbers
potentially lowered compliance with privacy procedures because new staff
had not yet completed their privacy training before starting
work.63 Technology upgrades such as the provision of
smartphones to thousands of public servants have raised concerns regarding
the protection of data and have given rise to a privacy audit of wireless
environments in federal institutions.64 The introduction of ‘naked scanners’65
by the Canadian Air Transport Security Authority also justified a privacy
audit.66

60 Raymond D’Aoust et al, ‘Audit Report of the Privacy Commissioner of Canada:
Assessing the Privacy Impacts of Programs, Plans, and Policies’ (Report, Office of
the Privacy Commissioner of Canada, 2007) 3
<https://www.priv.gc.ca/media/1066/pia_200710_e.pdf>.
61 Trevor Shaw et al, ‘Audit of Federal Annual Privacy Reports’ (Report, Office of the
Privacy Commissioner of Canada, 2009) 1
<https://www.priv.gc.ca/media/1133/ar-vr_fapr_200910_e.pdf>.
62 Steven Morgan et al, ‘Audit of Passenger Protect Program, Transport Canada’
(Report, Office of the Privacy Commissioner of Canada, 2009) 2
<https://www.priv.gc.ca/media/1146/ar-vr_ppp_200910_e.pdf>.
63 Trevor R Shaw et al, ‘Privacy Audit of Canadian Passport Operations’ (Report,
Office of the Privacy Commissioner of Canada, 2008) 7
<https://www.priv.gc.ca/en/opc-actions-and-decisions/audits/pc_20081204/>.
64 Steven Morgan et al, ‘The Protection of Personal Information in Wireless
Environments: An Examination of Selected Federal Institutions’ (Final Report, Office
of the Privacy Commissioner of Canada, 2010) 1
<http://publications.gc.ca/collections/collection_2011/priv/IP54-33-2010-
eng.pdf>.
65 Sarah Schmidt, ‘Watchdog to look at “naked scanners”; Privacy of air travelers
under scrutiny. Reappointed privacy commissioner launches audit of agency in
charge of air-passenger screening’ The Gazette (Montreal, Quebec) 9 December 2010,
A12.
66 Steven Morgan et al, ‘Privacy and Aviation Security: An Examination of the
Canadian Air Transport Security Authority’ (Final Report, Office of the Privacy

Previous privacy audits or even mere investigations may also raise issues that
ought to be followed up in subsequent audits. This was the case with the re-
audit of Facebook Ireland.67 It is also demonstrated in a joint audit performed
by the Canadian Office of the Auditor General and the Office of the Privacy
Commissioner.68 This is the first evidence of collaboration in Canada between
these two offices in a privacy audit. An investigation of Veterans Affairs
Canada also led to a subsequent audit.69

Organisations that wish to transfer data across national boundaries may
encounter restrictions, such as the requirements of the European Union that
prevent data being transferred out of the EU to countries that do not
provide an adequate level of protection for that data.70 One way of
ensuring an adequate level of protection is to have Binding Corporate Rules (‘BCRs’) that
would allow data subjects to enforce rights against all entities that are part of a
multi-national organisation. It may be necessary for BCRs to provide for an
audit.71

Finally, organisations wishing to project a public image of respect for privacy
may voluntarily submit to the requirements of privacy seals. Organisations
such as TRUSTe provide privacy seals after verifying that an organisation
has met certain minimum privacy protection requirements. To ensure
compliance, the privacy seal organisation may require
‘Certified Public Accountant (‘CPA’) audits of privacy policies’.72 Privacy

Commissioner of Canada, 2011) 3 <https://www.priv.gc.ca/media/1127/ar-
vr_catsa_2011_e.pdf>.
67 Gary Davis, ‘Facebook Ireland Ltd: Report of Re-Audit’ (Report, Office of the Data
Protection Commissioner of Ireland, 21 September 2012) 3
<http://edepositireland.ie/handle/2262/81672>.
68 Trevor R Shaw et al, ‘Privacy Management Frameworks of Selected Federal
Institutions’ (Audit Report, Office of the Privacy Commissioner of Canada, 2009) 1
<https://www.priv.gc.ca/media/1201/pmf_20090212_e.pdf>.
69 Steven Morgan et al, ‘Veterans Affairs Canada’ (Final Report, Office of the Privacy
Commissioner of Canada. 2012) 7 <https://www.priv.gc.ca/en/opc-actions-and-
decisions/audits/ar-vr_vac_2012/>.
70 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on
the protection of individuals with regard to the processing of personal data and on the free
movement of such data [1995] OJ L 281/31, arts 25, 26.
71 David Bender and Larry Ponemon, ‘Binding Corporate Rules for Cross-Border Data
Transfer’ (2006) 3 Rutgers Journal of Law & Urban Policy 154, 158.
72 Robert LaRose and Nora Rifon, ‘Your privacy is assured – of being disturbed:
websites with and without privacy seals’ (2006) 8 New Media and Society 1009, 1014;
Karim Jamal, Michael Maier and Shyam Sunder, ‘Enforced Standards Versus
Evolution by General Acceptance: A Comparative Study of E-Commerce Privacy
Disclosure and Practice in the United States and the United Kingdom’ (2005) 43
Journal of Accounting Research 73, 94.

audits have been performed by members of the American Institute of Certified Public
Accountants for WebTrust, a privacy seal organisation.73 These audits are
private documents and were not available for the research in this paper.

3 A theoretical approach to privacy audit standards

As may be surmised from the preceding discussion, the drivers of the practice
of privacy auditing are diverse and it is therefore no surprise that the
theoretical basis of privacy auditing and the information privacy rights on
which the practice is based are underdeveloped.

Information privacy is a new and unsettled field of law, which underscores the
need for an enhanced theoretical basis. For example, in Google Spain SL v
Agencia Española de Protección de Datos,74 the Court of Justice of the European
Union decided that Google must remove links from its search results to certain
personal information of European citizens, a decision that caused an important and
immediate change in the way that Google operates.

Due to the current unsettled and inadequate theoretical basis of privacy
auditing, this paper argues that privacy auditing theory needs to be
transformed and enhanced. The concern is that privacy auditing has not yet
achieved a level of rigour that will enable it to be seen as useful and integral to
the operation of organisations. Society is changing, and our conception of
privacy is changing, but the law is not changing fast enough. Some privacy
auditors, especially those who are regulators, are focused on the standards
contained in national legislation. Other privacy auditors, such as the Big Four
professional services firms, are using standards in their privacy audits that are
more modern than those contained in national privacy laws. Privacy auditing,
generally speaking, should use more consistent and modern standards, such as
the principle of privacy by design. Positivist research about privacy auditing
would not have an emphasis on change and would instead focus on what the
practice of privacy auditing has involved up until now. However, the most
interesting questions about privacy auditing relate to its future, not to its past.

There is currently a large degree of divergence between the criteria used by
different privacy auditors. Arguably, this divergence should not merely mirror
the differences between national information privacy laws in different countries.
If privacy audits are to be seen as useful by stakeholders then such audits may
need to be of relevance to users in multiple countries, especially where a
privacy audit examines the activities of an organisation that operates across
national borders. Privacy issues are increasingly global in impact. There should

73 Hui, Teo and Lee, above n 46, 28.
74 Google Spain SL, Google Inc v Agencia Española de Protección de Datos (AEPD), Mario
Costeja González (C-131/12) [2014] ECR 317.

exist flexibility for the practice to improve as technological change poses ever
greater challenges to the information privacy rights of citizens.

Some jurisdictions have produced privacy audits that implement standards
that diverge significantly from those in other jurisdictions. Privacy audits
conducted under the mandatory audit powers in Australia have produced
audits with a low level of international comparability.75 In Ireland, the audits
of Facebook Ireland Ltd also demonstrate a focus on the legal regime of just
one jurisdiction. However, the Canadian Privacy Commissioner has made
some moves toward the use of international best practice. There is therefore a
gulf between the practices of some privacy auditors when compared across
different jurisdictions.

4 Privacy audits across national borders

The main question that is raised when an organisation operates in multiple
countries is whether it is subject to information privacy laws in all of those
countries, or only some or none of them. It appears at least arguable that a
single organisation may be subject to the information privacy laws of more than
one country.76 A recent decision of the Court of Justice of the European Union
holds that where an organisation is engaging in electronic commerce,
applicable data protection law includes that of the country in which the
relevant consumer is located.77 This decision was an interpretation of Article
4(1)(a) of Directive 95/46/EC of the European Parliament and of the Council of 24
October 1995 on the protection of individuals with regard to the processing of personal
data and on the free movement of such data.78 The application of the consumer’s
local data protection law is, however, dependent on the organisation having an
‘establishment’ in the consumer’s local Member State. It is not immediately
clear what constitutes such ‘establishment’ because the mere fact that the
organisation’s website is accessible there is not enough, but conversely it is not
necessary to show that a branch or subsidiary of the organisation exists in the
consumer’s local Member State. Some intermediate degree of presence must
therefore constitute ‘establishment’ before an organisation is obliged to comply
with the local data protection law.

75 Toy and Hay, above n 40, 192.
76 Alan Toy, ‘Cross-border and Extraterritorial Application of New Zealand Data
Protection Laws to Online Activity’ (2010) 24 New Zealand Universities Law Review
222.
77 Verein für Konsumenteninformation v Amazon EU Sàrl (C-191/15) [2016] ECR 612.
78 [1995] OJ L 281/31.

Although there have been calls for harmonisation of global standards for
information privacy law, this may raise ‘unrealistic expectations’.79 The
differences between information privacy laws of different countries may
therefore continue to exist for a considerable time. However, this need not
present a problem for privacy audits. Privacy audits need not be a mechanical
application of the information privacy laws within a single jurisdiction, but
may instead apply standards for privacy auditing that may have more in
common across different countries than information privacy laws do. This is
not a resolved issue, however, as some privacy audits do represent application
of the information privacy laws of just one country. For example, the privacy
audit of Facebook Ireland demonstrates this approach. On the other hand, the
privacy audit of Google demonstrates a departure from application of
information privacy laws, instead focusing on other standards for the audit.

Initiatives for international harmonisation of privacy auditing standards
commonly suggest that international standards may be useful,
for example the ISO privacy standards.80 Another example is that the Asia-
Pacific Economic Cooperation (‘APEC’) system of Cross Border Privacy Rules
(‘CBPRs’) provides for accountability agents,81 which can certify that the
privacy practices of an organisation are compliant with APEC’s rules.
However, international standards are yet to be widely accepted in the practice
of privacy auditing. This presents a challenge for privacy auditing that must be
overcome. Common agreement among privacy auditors on a theoretical basis
for privacy auditing may be an important step toward solving such problems.

5 Critical Theory

While both accounting theory and legal theory recognise Positivism, this is
unlikely to be the theory that is most useful for the development of privacy
auditing. Privacy auditing is in a state of change, as demonstrated by the
wide variety of standards used in current privacy audits.82
Positivist theory does not provide a way for privacy auditing to advance from
the current, unsatisfactory position in which it finds itself. Indeed, it has even
been suggested that there will be no overall advantage in achieving an

79 Christopher Kuner, Transborder Data Flows and Data Privacy Law (Oxford University
Press, 2013) 164.
80 ISO/IEC 29100:2011 (2011–12) International Organisation for Standardisation 19
<https://www.iso.org/standard/45123.html>.
81 Details of approved accountability agents can be found here: Cross Border Privacy
Rules System, For Accountability Agents (19 January 2016)
<http://www.cbprs.org/Agents/AgentDetails.aspx>.
82 Toy and Hay, above n 40, 181.

international agreement on information privacy rights83 and that, as an aspect
of constitutionalism, the Supreme Court in the United States relies on ‘statutory
protections of personal information as fundamental to the balancing of
interest[s]’.84

Interpretivism is also a theory that has roots in both accounting and law, and it
has traditionally been associated with the jurisprudential arguments of
Dworkin.85 However, Interpretivism may also fail to provide a sufficient
theoretical basis for privacy auditing. The essential nature of privacy auditing
is that it exists to protect privacy, and it has been argued that one possible way
to achieve more consistency for privacy audits, and therefore more
applicability across different countries, is to adopt a common set of principles
that may be balanced in different ways to achieve the best fit among different
cultures.86 However, the balancing of the principles is under constant and
increasing assault from technological considerations. We need a completely
new approach to privacy to deal with the threat of constant surveillance and
control. The balance is always tipping away from privacy, so we need a strong
rebalancing event. Critical Theory87 may provide this.

Critical Theory is a better basis for a theory of privacy auditing than either
Positivism or Interpretivism. It has been suggested that ‘[o]ne crucial

83 Stephen Schulhofer, ‘An international right to privacy? Be careful what you wish
for’ (2016) 14 International Journal of Constitutional Law 238, 261.
84 Caleb Seely, ‘Once more unto the breach: The constitutional right to informational
privacy and the Privacy Act’ (2016) 91 New York University Law Review 1355, 1364.
85 Nicos Stavropoulos, Legal Interpretivism (29 April 2014) The Stanford Encyclopedia
of Philosophy (Summer 2014 Edition)
<https://plato.stanford.edu/archives/sum2014/entries/law-interpretivist/>.
86 Toy and Hay, above n 40.
87 Critical Theorists seek ‘human emancipation in circumstances of domination and
oppression’: James Bohman, Critical Theory (8 March 2005) The Stanford
Encyclopedia of Philosophy (Fall 2016 Edition)
<http://plato.stanford.edu/archives/fall2016/entries/critical-theory/>; Critical
Theory developed at least in part in Frankfurt in the 1930s: Mats Alvesson and Hugh
Willmott, ‘Introduction’ in Mats Alvesson and Hugh Willmott (eds), Studying
Management Critically (Sage, 2003) 1, 2; It has motivations that include emancipation
and freedom of human beings in the ‘struggle for the future’: Bryan Turner, ‘Critical
Theory’ in Bryan Turner (ed) Cambridge Dictionary of Sociology (Cambridge
University Press, 2006)
<http://search.credoreference.com.ezproxy.auckland.ac.nz/content/entry/cupsoc
/critical_theory/0>; It has been suggested that ‘[t]hrough self-reflection one is freed
from past constraints (such as dominant ideology and traditional disciplinary
boundaries) and thus critical theory is emancipatory’: Michael Gaffikin, Accounting
Theory: Research, regulation and accounting practice (Pearson Education Australia,
2008) 151.

contribution of the critical project (however we define ‘critical’) has been to
help scholars learn to expose the implicit contours of their worldviews’.88 The
Critical perspective challenges the basis of the current practice of privacy
auditing with the aim of suggesting potential future solutions to any problems
identified.

It has been argued that ‘steering media such as accounting and the law do not
have a fixed position in the lifeworld-system complex and may be increasingly
subsumed and internalized within systemic imperatives’.89 The lifeworld is a
conception of everyday experience, while the system concept refers to
functional areas such as the economy as a whole. In accordance with this
argument, the basis of this paper is that social imperatives may influence the
actions of privacy auditors and that this may influence later changes in the law
to accord with modern practice. Especially in the area of information privacy
law, where it is difficult for legislators to predict the types of data flows that
will occur in the future, the law may need significant guidance from social
norms and ideals regarding the information privacy rights of citizens. Privacy
audits are an ideal mechanism for the recognition and propagation of practices
that are consistent with these social norms and ideals.

Postmodernism may also have relevance to privacy audits. Postmodernism
emphasises the relationship between power and knowledge, especially in
relation to information technology.90 This school of thought has been used in
the information systems literature to expound the concept of ‘Gaze’,91 which
may be, through ‘systems of surveillance’,92 a way of exercising power in
society. Gaze is a technique for controlling those gazed upon by influencing them
to self-police. This is done by imposing the threat of observation (even though
not every individual will actually be observed). The concept of Gaze has
analogies with the basis of information privacy rights in a legal sense, because
autonomy and liberty may be restricted merely by the threat of the Gaze.

88 Rob Gray and Markus Milne, ‘It’s not what you do, it’s the way that you do it? Of
method and madness’ (2015) Critical Perspectives on Accounting (forthcoming) 6.
89 Michael Power, Richard Laughlin and David Cooper, ‘Accounting and Critical
Theory’ in Mats Alvesson and Hugh Willmott (eds), Studying Management Critically
(Sage, 2003) 132, 142.
90 Stewart Clegg et al, The Sage Handbook of Organization Studies (Sage, 2006) 256.
91 Mei-Lien Young, Feng-Yang Kuo and Michael Myers, ‘To share or not to share: a
critical research perspective on knowledge management systems’ (2012) 21 European
Journal of Information Systems 496, 498.
92 Michel Foucault, ‘Afterword: The Subject and Power’ in Hubert Dreyfus and Paul
Rabinow (eds), Michel Foucault: Beyond Structuralism and Hermeneutics (The
University of Chicago Press, 1983) 208, 223.

There are some important differences between Postmodernism and Critical
Theory which demonstrate that the latter is the more appropriate perspective
for privacy auditing. Critical Theory ultimately aims to suggest improvements
to a practice such as privacy auditing. These suggestions are central to this
paper. Postmodernism does not share this ambition, and may suspect that any
new form of consensus may cause new elites to arise and new illusions to be
created.93

6 Non-Positivism

A Non-Positivist perspective offers solutions to problems identified in
previous privacy audits and suggests goals to which the practice of privacy
auditing might aspire. The Non-Positivist approach is useful because privacy
auditing may be assisted by the application of a set of fundamental principles,94
with the addition of principles drawn from the latest proposals for policy and
legislative reform regarding information privacy rights. The Non-Positivist
approach adopted is capable of grounding the theoretical basis of privacy
auditing in both the disciplines of accounting and law. It is vital for privacy
auditing to be underpinned by a theory that has influence in both of these
disciplines because privacy audits may be conducted equally by those with an
accounting/auditing or a legal background.

This paper argues that Critical Theory is consistent with at least one form of
Non-Positivism and that it is essential for privacy auditing to be informed by
Critical Theory in order to progress. It has been suggested that Non-Positivism
can be divided into several different types95 of which the strongest form is
inclusive Non-Positivism. ‘Inclusive non-positivism claims neither that moral
defects always undermine legal validity nor that they never do’.96 One form of
inclusive Non-Positivism employs the Radbruch formula, which holds that
‘extreme injustice is not law’.97 Therefore, if a law is immoral, it can still be a
law provided that it is not extremely immoral. This type of Non-Positivism may

93 Clegg et al, above n 90, 273.
94 Toy, ‘Different Planets or Parallel Universes’, above n 29; Toy and Hay, above n 40,
181.
95 These include ‘exclusive, inclusive, and super-inclusive non-positivism’: Robert
Alexy, ‘Scott J Shapiro between Positivism and Non-Positivism’ (2016) 7
Jurisprudence 299, 301; Robert Alexy, ‘Law, Morality, and the Existence of Human
Rights’ (2012) 25 Ratio Juris 2, 4–7.
96 Robert Alexy, ‘Scott J Shapiro between Positivism and Non-Positivism’ (2016) 7
Jurisprudence 299, 301.
97 Ibid.

also be thought of as hybrid Non-Positivism.98 This paper develops the
theoretical basis of privacy auditing by aligning it with Non-Positivism. It will
be shown that this is the way for privacy auditing to develop in a manner that
is consistent across the disciplines of accounting/auditing and law.

Previous research has suggested fundamental principles for privacy auditing
that may be balanced against each other and against other principles, in order
to assess which of the principles forms a reason that can amount to ‘law’ in a
particular case.99 In order to achieve this balancing of interests against each
other, each principle must have a dimension of ‘weight’ to assess its impact.
This idea is central to the fundamental principles because of the international
dimension. In different jurisdictions, such principles may have a different
‘weight’ that demands a different outcome when balanced against other
interests. The principles may be balanced against each other and also against
other values in society. For example, in the United States the principle of
freedom of speech is of great importance.100 In Europe, by contrast, this
principle carries less weight. A logical outcome may be seen in the area of
consent to infringements of privacy, where the US is prepared to permit some
uses of personal data in the absence of consent, while the EU is more cautious
about this. The FTC suggests the principle of Simplified Consumer Choice,101
which states that consent is not required for some dealings with data of citizens.
By contrast, the European Commission does not recognise the same exceptions,
although some permissions may exist.102 In this way, the fundamental
principles may apply across different countries, each of which may have
a different culture. They do not cease to be valid principles simply because the
culture in different countries may require a different balancing of interests
there.

Raz argues that:

Those accustomed to ‘balancing’ talk may think that the existence of a (morally)

98 Anthony Reeves, ‘Reasons of Law: Dworkin on the Legal Decision’ (2016) 7
Jurisprudence 210, 213.
99 These fundamental principles have been developed in previous research: Toy and
Hay, above n 40.
100 Alan Toy, ‘Cross-border and Extraterritorial Application of New Zealand Data
Protection Laws to Online Activity’ (2010) 24 New Zealand Universities Law Review
222, 227.
101 Federal Trade Commission, ‘Protecting Consumer Privacy in an Era of Rapid
Change: Recommendations for Businesses and Policymakers’ (Report, March 2012)
<http://www.ftc.gov/os/2012/03/120326privacyreport.pdf>.
102 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016
on the protection of natural persons with regard to the processing of personal data and on the
free movement of such data, and repealing Directive 95/46/EC [2016] OJ L 119/1, art
6(1)(a).

legitimate law establishing a duty to perform a certain action is a reason for it, to
be added to other reasons for that action and balanced against whatever reasons
there are against it. That is a very misleading and wrong-headed view.103

Positivism cannot refer back to morally legitimate reasons for laws. It can only
refer to the pedigree (sources) of laws as legitimate reasons for action.104
However, Positivism appears to be at odds with the legal reasoning of judges in
both the EU and the US. In both of those jurisdictions, ‘balancing’ talk is bound
into the reasons for decisions in information privacy cases.105 Positivism
therefore appears to be out of touch with the reality of judicial decision making
and, as such, cannot be a proper approach to the philosophy of law, at least
where information privacy is concerned.

Although they are necessarily linked in his theory, Dworkin still sees
distinctions between law and morality, and between law and justice. He states
the abstract idea that ‘legal rights are those flowing from past political decisions
according to the best interpretation of what this means’.106 He therefore
attempts to distance himself from the idea that law is a blueprint of morality.
While this argument is correct in its aims, there may be a less complicated way
to justify the distinction. Community morality is constantly changing. This can
be seen in the changing values over time relating to bankruptcy: the bankrupt
was once made a slave, later merely a criminal offender, later again subject
only to civil penalties, and now facts that would once have entailed
bankruptcy may no longer result in it at all.107

Changes to community morality occur before changes to the law, and changes
to the law may be very slow to respond to this. Law is created by humans with
limited resources, both legislative and judicial. These resources can do their
best, but cannot be entirely up to date with the community’s conception of
morality. This may result in the law not always being consistent with
community morality. Judges asked to interpret such a law may give it the most
up to date interpretation that they can, taking into account modern changes in
community morality, so far as this is possible. They may apply principles

103 Joseph Raz, Between Authority and Interpretation (Oxford University Press, 2009) 7.
104 H L A Hart, The Concept of Law (Oxford University Press, 3rd ed, 2012) 100.
105 For a detailed discussion of balancing information privacy rights in the EU: See Toy,
‘Different Planets or Parallel Universes’, above n 29. The practice of balancing
information privacy rights against other rights is less developed in the US but it is
present, such as in Riley v California 573 US __ (2014) where the Supreme Court held
that privacy interests must be balanced against law enforcement interests.
106 Ronald Dworkin, Law’s Empire (Fontana Press, 1986) 96.
107 An example is the ‘No Assets Procedure’: Insolvency Act 2006 (NZ) ss 361–377B. This
allows an alternative to bankruptcy for persons who meet certain criteria, and avoids
some of the consequences of bankruptcy.

according to the correct weight they would be given under the current
community morality. These principles may be reflected to a greater or lesser
extent in the wording of a statute. The closer the wording of a statute comes to
these principles, the easier it will be for citizens in society to adjust their
conduct in accordance with the law.

This argument regarding changes in community morality is central to the
research in this paper. In selecting the standards to be used in a privacy audit,
an auditor may need to make a decision to use either the information privacy
laws in place in a particular country or the latest developments in international
thought regarding best practice in information privacy. These may be found in
documents containing proposals for legislative and/or policy reform regarding
information privacy. The conception of community morality includes these
documents as indicators of what society as a whole regards as the latest
developments regarding information privacy rights. The latest developments
may not yet be reflected in legislation or case law. It is likely that this is the
situation in many countries because information privacy laws take
considerable time to enact and to change, but their subject matter is greatly
affected by rapid changes in technology.

Examples of the speed at which legislation in this area changes may be seen in
New Zealand’s review of its privacy law. The New Zealand Law Commission
began this review in October 2006 and completed it in July 2011. However, the
recommendations made have not yet been adopted in legislation by the New
Zealand government. The United States, too, has for some time been considering
federal privacy legislation to cover consumers generally.
The White House, under the Obama administration, introduced a draft
Consumer Privacy Bill of Rights in both 2012108 and 2015.109 Prior to that,
other consumer privacy bills had been proposed in the US, the most notable
of which is the bipartisan Commercial Privacy Bill of Rights proposed by
Senators Kerry and McCain.110 However, the US
has not yet adopted any of these proposals in legislation. Where information
privacy legislation is updated infrequently, there is a strong risk that it does
not reflect community morality. Privacy audits may bridge this gap.

108 ‘Consumer Data Privacy in a Networked World: A Framework for Protecting
Privacy and Promoting Innovation in the Global Digital Economy’ (Report, United
States White House Office, 2012)
<http://repository.cmu.edu/cgi/viewcontent.cgi?article=1096&context=jpc>.
109 Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015 (2015) The
White House <https://www.scribd.com/document/257168595/Administration-
Discussion-Draft-Consumer-Privacy-Bill-of-Rights-Act-of-2015>.
110 John McCain, ‘Kerry, McCain Introduce Commercial Privacy Bill of Rights’ (Press
Release, 12 April 2011) <http://www.mccain.senate.gov/public/index.cfm/press-
releases?ID=4a92a6f4-daf7-2f4a-84e7-3eb83276af23>.


Accountable organisations should have a process of continuous review of their
accountability mechanisms, including privacy auditing.111

Finally, from an ethical perspective, privacy auditors may wish to apply the
latest thinking about information privacy to their privacy audits. The ethical
influence is consistent with Critical Theory. Organisations that wish to act not
only within the letter of the law, or even within the spirit of the law, but also in
accordance with general ethical principles may find that it is necessary to apply
standards that go beyond those enacted in local legislation.

Conclusion

The practice of privacy auditing has been impeded by a lack of a commonly
accepted theoretical basis. Consistency of approach among those who perform
privacy audits may be assisted by agreement on a theory of privacy auditing.
However, developing privacy audits as an application of the information
privacy laws of a single jurisdiction is unlikely to provide consistency across
the various countries in which privacy audits have been produced. Information
privacy laws in individual jurisdictions may be outdated and may fail to
address the latest developments in information privacy thought. Privacy
auditing could be improved by greater flexibility and the ability to adapt to
changes in technology and to changing social norms regarding information privacy.

Privacy auditing is not yet fully developed and therefore there are limited
benefits in a traditional Positivist analysis of the practice. However,
Non-Positivist perspectives are present in the disciplines of accounting and
law. Critical Theory provides a theoretical perspective that can inform the
approach taken to the research in this paper. Furthermore, Critical Theory
provides the opportunity to suggest improvements to the practice of privacy
auditing. It is suggested that privacy audits may be improved by the use of
standards that come closer to harmonisation.112 Harmonised standards could
also be updated to reflect more modern criteria than those currently
contained within national information privacy laws.

111 Martin Abrams, ‘Accountability: A Compendium for Stakeholders’ (Report, Centre
for Information Policy Leadership, March 2011) 7
<http://informationaccountability.org/wp-content/uploads/Centre-
Accountability-Compendium.pdf>.
112 Toy, ‘Different Planets or Parallel Universes’, above n 29.
