Received July 18, 2019, accepted August 4, 2019, date of publication August 9, 2019, date of current version August 23, 2019.
Digital Object Identifier 10.1109/ACCESS.2019.2934226
ABSTRACT From an individual’s perspective, technological advancement has merits and demerits. Video
captured by surveillance cameras while a person goes about their daily life may improve their personal
safety but the images collected may also represent an invasion of their privacy. Because of the ease of digital
information sharing, there exists a need to protect that visual information from illegal utilization by untrusted
parties. The European parliament has ratified the General Data Protection Regulation (GDPR), which has
been effective since May 2018 with a view to ensuring the privacy of European Union (EU) citizens’
and visitors’ personal data. The regulation has introduced data safeguards through Pseudonymisation,
Encryption, and Data protection-by-design. However, the regulation does not assist with technical and
implementation procedures, such as video redaction, to establish those safeguards. This paper refers to the
GDPR term ‘‘personal data’’ as ‘‘visual personal data’’ and aims to discuss regulatory safeguards of visual
privacy, such as reversible protection, from the technological point-of-view. In the context of GDPR, the roles
of machine learning (i.e. within computer vision), image processing, cryptography, and blockchain are
explored as a way of deploying Data Protection-by-Design solutions for visual surveillance data. The paper
surveys the existing market-based data protection solutions and provides suggestions for the development
of GDPR-compliant Data Protection-by-Design video surveillance systems. The paper is also relevant for
those entities interacting with EU citizens from outside the EU and for those regions not currently covered
by such regulation that may soon implement similar provisions.
This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see http://creativecommons.org/licenses/by/4.0/
VOLUME 7, 2019 111709
M. N. Asghar et al.: Visual Surveillance Within the EU General Data Protection Regulation
the purpose of safety. Visual surveillance is the principal context of the current paper.

Security is an overarching concept that is also tied to an individual's personal data security. It is not simply the security offered to companies, other organizations, and state players as a service to protect their interests. For an individual, during each working day, the thought of being safe starts at the time of leaving home and ends upon returning home. Video surveillance cameras, such as those that are part of a Closed Circuit Television (CCTV) system, are utilized as safety tools by recording individuals' activities. People are very much exposed through video surveillance in the name of safety or security. Examples of video surveillance applications include: the monitoring of sensitive locations (such as embassies, airports, nuclear plants, military zones, and at border controls); intrusion detection (residential and retail monitoring); public safety installations (such as for traffic control, and at car parks and ATM machines); vehicle detection systems through license plate recognition; event detection (such as during child care and the care of the elderly); and marketing and statistics-gathering systems (such as for discovering customers' habits and behaviours, and recording the number of visitors). Illicit activities such as theft, assault, and shooting incidents are captured by the widespread deployment of surveillance cameras, which clearly benefits the majority of citizens. Nevertheless, with the advent of enhanced CCTV-based security, more challenges are presented in terms of people's privacy protection, and their dignity and free will, even when they are being monitored.

The invasion of privacy, due to the extensive use of surveillance cameras, has been widely and frequently reported. Particularly in the UK, some may say that there is an excessive number of CCTV cameras. For example, any citizen, as a daily average in London, is caught on about 300 CCTV cameras [2]. In fact, there are about 4 million CCTV cameras in the country as a whole. In respect to the abuse and misuse of surveillance data, there are reports [3] of various disturbing incidents. In 2005, according to British Broadcasting Corporation (BBC) news [4], a woman in Liverpool was spied on by four council workers misusing a street CCTV pan-tilt-zoom camera. The extremely intrusive use of CCTV surveillance by Nahid Akbar (in 2017) [6] in her neighbourhood was penalised by the Scottish court, which found that Ms. Akbar was in breach of the fifth Principle of Data Protection, resulting in £17,000 in damages being awarded to the affected Mr. and Mrs. Wooley. Outside the UK, an instance of illegal spying on celebrities and government officials occurred [5] when a museum's CCTV camera was misused by a security guard to spy on the private flat of the Chancellor of Germany (Angela Merkel). Such incidents can occur in numerous sensitive situations: such as when people are identified in surveillance video captured during a demonstration or rally, or when, in oppressive regimes, political opponents active against a sitting government can be threatened or harmed later.

The following quotation of Benjamin Franklin is well suited to today's secure society:

''Any society that will give up a little liberty to gain a little security will deserve neither and lose both.''

In response, it may be said that the European Union (EU) parliament has introduced the General Data Protection Regulation (GDPR) [7] into European law in respect to the protection of individuals' rights to freedom while their personal data, in the form of text, audio, and video, are processed. GDPR seeks to ensure data protection and privacy during the processing of personal data of all individuals within the EU and the European Economic Area (EEA). GDPR also covers the transfer of personal data outside the EU and EEA. In GDPR law, the data subject (the GDPR term for a natural person/individual) has complete control over the processing of their personal data. Without the consent of the data subject, the data cannot be processed and exported outside the EU. GDPR provides extensive guidelines to data collectors and processors in order to maintain the confidentiality of the data subject's information. The law covers the rights of individuals and instructs data controllers and supervisory authorities accordingly. If personal data collection companies do not provide GDPR-compliant solutions for their data subjects (customers and employees), heavy penalties could well result. A survey [8] shows that 80% of businesses in Europe know minimal details or almost nothing about the implementation of GDPR within their companies. In fact, GDPR is a global data protection regime, which according to the authors' analysis can be summarized as:

• Harmonization of privacy protection regulation in the EU.
• Freedom rights for EU citizens/visitors by providing safeguards for their identifiable data.
• Notification: Meaningful symbols must be placed by controllers/processors to notify people whenever CCTV recording is in progress.
• Access rights of individuals: Personal data should be collected by initiating a consent form. The regulation gives the right to individuals to gain access and request the erasure of their personal visual data in a reasonable time-frame.
• Privacy Safeguards: Data should not be stored and exported openly; there should be privacy safeguards to hide the originality and identification of data through automated techniques.
• Retention Period: Personal visual data should be kept only for a specific time-frame.

This paper overviews the relevant articles within the first four chapters of the GDPR regulation (see Section II.A) in terms of re-defining the concepts of personal data, pseudonymisation, encryption and Data Protection-by-Design in accordance with the visual protection of data subjects. Thus, the following research questions are posed for a GDPR-compliant video surveillance industry:
TABLE 1. GDPR descriptions for data and the authors’ definitions for visual data.
Q1: What is the visual personal data of the data subject? (Descriptions of data relevant to visual data are given in Table 1 and Section III describes visual data.)
Q2: Do all recorded visual data need to be protected? (Section II, specifically II.A)
Q3: In what circumstances are GDPR-compliant surveillance cameras required? (Section III.A)
Q4: What are the pseudonymisation vs. anonymisation strategies for visual data protection? (Section IV.A)
Q5: What is Data Protection-by-Design in respect to video surveillance companies? (Sections IV and V)

The purpose of this paper is to clarify the aforementioned questions with the following contributory points:

• A brief overview of the GDPR articles requiring clarification for visual data collection companies/controllers/processors;
• The role of technology in the implementation of Data Protection-by-Design solutions for CCTV data, besides the implementation of policies;
• A review of the existing solutions for GDPR-compliant video-redaction methods for surveillance applications; and
• Forthcoming real-time technologies for implementing Data Protection-by-Design solutions.

The remainder of the paper is organized as follows. Section II is an overview of the pertinent articles of the GDPR regulation. That section provides a clear picture of GDPR for the CCTV surveillance industry, along with GDPR safeguards and the distinguishable concepts of pseudonymisation vs. anonymisation. Section III presents identifiable visual personal data, identified premises and use-cases for mandatory GDPR-compliant surveillance. Section IV discusses the role of technology in GDPR-compliant Data Protection-by-Design solutions along with market-based solutions. Section V presents future Data Protection-by-Design solutions incorporating other technologies and, finally, Section VI concludes the paper with several observations and comments.
II. GENERAL DATA PROTECTION REGULATION
The Data Protection Directive 95/46/EC (Directive 95/46/EC, 1995) was introduced in October 1995 by the EU and came into force in December 1995. The purpose of this directive was to protect the individual's personal data. The directive was implemented in October 1998 as an important part of EU privacy and human rights law. That directive was superseded by the European General Data Protection Regulation (GDPR), adopted in April 2016 and becoming enforceable on May 25, 2018 in all EU member states. The aim of GDPR is to harmonize the degree of data protection across the EU countries. By introducing a single law across all EU states, the intention was to bring better transparency to the processing of an individual's data and boost the collective digital economy of the 28 participating EU states (possibly 27 states if the UK leaves the EU). Personal data-collecting organizations in these EU countries will probably be most affected by the GDPR.

Data protection, as defined by Joe Meade (Ireland's Data Protection Commissioner) [9], is as follows:

''Personal data protection applies to all our interactions with public and private sector organizations and thus applies to applications, purchases and transactions in State services, business and economic matters, in the social and medical areas, in the workplace and in the globalized technological arena.''

The Data Protection Acts 1988 and 2003 confer rights onto individuals and place responsibilities on the shoulders of those who process personal data. GDPR applies globally; thus companies outside the EU processing the personal data of an EU citizen with the aim of providing services, selling goods or monitoring the behavior of any EU citizen, e.g. any video surveillance companies [10] or health centers, are bound by the regulation. A Data Protection Officer (DPO) should be appointed by these companies. The appointed DPO will be in charge of the GDPR compliance of that organization. The stakes have been set high, as failure to comply will result in a substantial fine of 4% of the company's global revenue or 20 million Euros, whichever is higher. Due to the recent misuse of Facebook's customer data [11], all eyes are on the proper protection of customers' private information.

The resulting harmonization in law makes it easier for EU citizens to understand how their data is being used and how they can raise complaints, even if they are not located in an EU member country. GDPR does not apply to IT-related data alone; it has broad-sweeping implications for a whole company, such as for the handling of sales, marketing and human resources data.

A. OVERVIEW OF FOCUSED GDPR ARTICLES
The EU GDPR aims to increase the privacy of the EU's citizens and allows regulatory authorities to use maximum powers to take actions against any businesses/companies that breach the law. The regulation (GDPR) has 173 recitals (representing GDPR goals) and 11 chapters, which cover 99 articles [7]. However, for the implementation of GDPR-compliant surveillance systems, the first four chapters cover all the articles necessary for secure processing of personal data by the controllers/processors. For the ease of readers, a flowchart of the GDPR articles focussed on in this paper is presented in Fig. 1. Notice that the main focus of this paper is the technological aspects of GDPR. Though the authors have applied due care and consideration to the legal aspects of GDPR, readers should bear in mind the technological focus and perform due diligence in respect to any legal aspects, which are included for the reader's convenience and are not intended to be legally definitive.

1) GDPR DESCRIPTION AND DEFINITIONS
Article 4 of GDPR [7] covers the definitions of terms used throughout the regulation. Table 1 defines the term data. For CCTV controllers, this current paper also discusses some other important definitions to clarify the concepts. Overall, GDPR considers seven types for the term 'data' throughout the regulation and all are relevant for surveillance data collection companies. However, notice that according to GDPR (Article 9) ''Genetic Data'' and ''Biometric Data'' are included in the 'sensitive' data definition or, equally, the ''special categories'' of personal data definition. Thus, in terms of GDPR definitions, as opposed to descriptions, ''Genetic Data'' and ''Biometric Data'' appear together in the same definitions.

2) GDPR PRINCIPLES & LAWFULNESS OF PROCESSING
Article 5, clause 1 describes six (6) GDPR principles, lettered a–f, for ensuring privacy during the processing of personal data.

[Art. 5-1(a)] Lawfulness, fairness and transparency
Lawfulness: Processing must be in accordance with the GDPR criteria.
Fairness: Personal data should be processed in ways that the subjects would reasonably expect and not used in ways that have unjustified adverse effects on them.
Transparency: Explicit reasons for the collection and processing of personal data are required.

[Art. 5-1(b)] Purpose limitations
Personal data can only be obtained for ''specified, explicit and legitimate purposes''. The data subject must be aware of the reason for collecting data. There should not be any processing without further consent, although data related to public interest, research or statistical purposes have no obligations in respect to purpose limitation.

[Art. 5-1(c)] Data minimisation
Data collected specific to a subject should be ''adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed'', which means that the minimum amount of data should be collected and retained for specific processing.

[Art. 5-1(d)] Accuracy
The collected data must be ''accurate and where necessary kept up to date''. There should not be any alterations, by
way of ensuring good protection against identity theft. Data controllers must build editable data management systems to allow the subject to update their data.

[Art. 5-1(e)] Storage limitations
The regulator expects that personal data are ''kept in a form which permits identification of data subjects for no longer than necessary''. With proper safeguards, data related to public interest, research or statistical purposes can be stored for a long time. The data that are no longer required should be erased/deleted from the repositories of a controller.

[Art. 5-1(f)] Integrity and confidentiality
During processing, the collected data must maintain their integrity and confidentiality. The controllers should use appropriate technical or organizational measures to provide ''appropriate security of the personal data including protection against unlawful processing or accidental loss, destruction or damage''.

[Art. 5-2] Accountability
Controllers shall be accountable if the data processing is not compliant with the clause 5.1(a–f) principles.

Article 6 concerns the lawfulness of processing by providing a proper ''consent form'' for the data subject for the processing of their personal data. The processing should be done while keeping the legal obligations in mind. Article 6(4)(e) mentions the appropriate safeguards for personal data by using the procedures of encryption or pseudonymisation (Section II.C). Article 6(f) clarifies that legitimate processing of data by the controllers is allowed, except that this processing should not override the rights and freedom of the data subject, especially if the data subject is a child.

Article 7 is about the conditions for obtaining consent from the subject for the processing of their personal data. Articles 8 to 10 discuss the processing of personal data related to children, special categories, and criminal convictions and offences, which are out of the scope of this paper. Article 11 indicates that if the controller is processing personal data which no longer identify a particular data subject, then the controller is not obliged to maintain or acquire additional identifying information for the sole purpose of complying with the GDPR.

3) RIGHTS OF THE DATA SUBJECT
Chapter III-GDPR is an important chapter, which demands a proper focus from all companies dealing with the processing of personal data. This is a detailed chapter with five sub-sections. Section 1, Article 12 transcribes the transparency and modalities of the rights of data subjects. Section 2, including Articles 13-15, is about information and access to personal data. Article 13 deals with the information that should be provided to the data subject when their personal data is being acquired or processed. Article 13(2) defines the additional information provided by the controller to the data subject regarding personal data retention time, the right to request an erasure of personal data and to withdraw consent, the right to lodge a complaint with the supervisory authority, the statutory or contractual requirements, and the existence of automated decision making for profiling. Article 14 is about personal data that have not been directly obtained from the data subject. Article 15 concerns the right of access by the data subject. The controller should provide a copy of the personal data undergoing processing, which shall not adversely affect the rights and freedoms of others. Section 3, ranging from Articles 16 to 20, contains fundamentals that should be known by EU citizens. Articles 16 to 18 are about their rights of rectification, erasure, and restrictions on the processing of their personal data. In short, the following eight (8) rights are given to data subjects under the umbrella of GDPR:

1. Right to access (Article 15): Data subjects have the right to request access to their collected personal data
and can inquire of a company about the use of their data. The company is bound to provide a cost-free copy of personal data in an electronic format upon request.
2. Right to rectification (Article 16): This ensures that individuals can have their data updated in case of changes, or if the data are incomplete or incorrect.
3. Right to erasure (Article 17): Data collection companies must delete the data of subjects if they are no longer their customers, or if they withdraw their consent for a company to use their personal data.
4. Right to restrict processing (Article 18): Individuals can request that their data should not be used for processing. Their record can remain in place, without any use.
5. Right to be informed (Article 19): This specifies that before the collection of any data by companies, individuals must be informed. Consumers have to opt in for their data to be collected, and consent must be freely given rather than implied.
6. Right to data portability (Article 20): Individuals have a right to transfer their data from one service provider to another.
7. Right to object and stop the processing of their data for direct marketing (Article 21).
8. Right to be notified within 72 hours of a data breach (Article 34).

4) GDPR PRIVACY PRINCIPLES
Article 25 describes the key privacy principle for all businesses, i.e. Data Protection-by-Design and -by-Default. This article serves as the backbone for GDPR-compliant businesses by implementing privacy protection for the sensitive data of a subject.

• Data Protection-by-Design: This means that ''appropriate organizational and technical measures to ensure personal data security and privacy are embedded into the complete lifecycle of an organization's products, services, applications, and business and technical procedures''. The proposed technical measures are pseudonymisation and data minimization.
• Data Protection-by-Default: This means that: (i) only necessary personal data are collected, stored, or processed; and (ii) personal data are not accessible by unauthorized people.
• Certification: Compliance with Data Protection-by-Design and -by-Default requirements should be demonstrated through an approved certification (as stated in Article 42).

5) SECURITY OF PROCESSING
Article 32 is an important article in respect to providing secure processing for personal data. This article emphasizes that the controller and processors should implement appropriate technical and organizational measures to ensure an appropriate level of security to preserve the rights and freedoms of a data subject. Article 32(1) describes the following security measures:
(a) Pseudonymisation and encryption of personal data.
(b) Ensuring the procedures apply across the range of security services, i.e. ''confidentiality, integrity, availability and resilience''.
(c) In the event of an emergency, quickly restoring the availability of, and access to, personal data.
(d) Ensuring the regular evaluation and validation testing of the adopted security procedures for effective privacy protection.

Article 32(2) points out the risks of processing, i.e. accidental and unlawful destruction, loss, alteration, and unauthorized disclosure of processed and stored data. These risks should be assessed along with an assessment of the appropriateness of security levels.

6) DATA PROTECTION IMPACT ASSESSMENT
Article 35 describes the procedure of the Data Protection Impact Assessment (DPIA), whereby, if processing is done automatically through technologically-advanced means, then a DPIA should be performed. A DPIA should be carried out by the controller upon the advice of a designated Data Protection Officer (DPO). A DPIA must be performed on automated procedures, i.e. profiling, in which the legal impacts upon a data subject are assessed, and on processing where there exists a special category of data or personal data relating to criminal convictions and offences. A DPIA must contain a systematic description of the envisaged processing operation in a legitimate way. An assessment of the risks to the rights and freedoms of data subjects should be given. The measures to address the risks, i.e. safeguards, security measures and mechanisms to ensure the protection of personal data in compliance with the regulation, should also be present in the DPIA. Article 36 requires prior consultation of controllers with the supervisory authority when a DPIA under Article 35 indicates that processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk. Article 37 ensures the designation of a DPO with professional qualities and, in particular, expert knowledge of data protection law and practices and the ability to fulfil the tasks referred to in Article 39. Article 37(2) facilitates that by stating that, for a group of undertakings, controllers may appoint a vigilant and constantly available DPO. A DPO can also be a staff member of a controller's team. Articles 38 and 39 describe the position and duties of a DPO in detail.

B. GDPR SAFEGUARDS
From the previous section's overview of GDPR, it can be clearly perceived that EU citizens have control over their personal data and, without their will, companies cannot collect and process their personal information. GDPR relaxes the controllers/processors' need to retain the personal data of the subject by providing safeguards in the form of pseudonymised information and/or encryption.
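To make the pseudonymisation safeguard concrete, the following minimal Python sketch (not from the paper; the class name, token format and record fields are illustrative assumptions) replaces a direct identifier with a random token while keeping the token-to-identity table separate, so re-identification remains possible, but only for the controller holding that table:

```python
import secrets

class Pseudonymiser:
    """Illustrative pseudonymisation: swap identifiers for random tokens.

    The token-to-identity table is the 'additional information' that GDPR
    requires to be kept separately under technical and organizational
    safeguards; without it, the tokens alone do not identify anyone.
    """

    def __init__(self):
        self._table = {}    # token -> original identifier (held securely, apart)
        self._reverse = {}  # identifier -> token (so the same subject reuses one token)

    def pseudonymise(self, identifier: str) -> str:
        """Return a stable random token for an identifier."""
        if identifier in self._reverse:
            return self._reverse[identifier]
        token = secrets.token_hex(8)
        self._table[token] = identifier
        self._reverse[identifier] = token
        return token

    def reidentify(self, token: str) -> str:
        """Controller-side reversal using the separately stored table."""
        return self._table[token]

# Example: a surveillance log entry with the subject's name pseudonymised.
p = Pseudonymiser()
record = {"name": "Alice Example", "activity": "entered lobby 09:14"}
safe = {"name": p.pseudonymise(record["name"]), "activity": record["activity"]}
assert safe["name"] != record["name"]
assert p.reidentify(safe["name"]) == record["name"]
```

The design choice here mirrors the regulation's distinction: the pseudonymised record can circulate for processing, while the mapping table is a separately protected asset; destroying the table would move the data towards anonymisation.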
within images [23] usually works through state-of-the-art computer vision algorithms. Such computer vision algorithms are increasingly robust when performing object detection and tracking.

Object Detection is the procedure of finding real-world instances, i.e. objects such as faces, bodies, bicycles, buildings, and licence plates, in images or videos. Object-detection algorithms classically employ extracted features and learning algorithms to recognize instances of an object category. They then take advantage of object classification and localization techniques to achieve accuracy in their results. Thus, to make a secure Data Protection-by-Design solution for any CCTV system, computer vision works by ROI selection (manually) or by detection/tracking (automatically) through pattern recognition and learning techniques (including deep learning). Afterwards, visual privacy protection methods will be applied through video redaction.

Currently, there are frequently utilized pre-trained models for object detection, including: YOLO (You Only Look Once) [24], Regional CNN (RCNN), Fast RCNN, Mask RCNN [25], and Multibox [26].

Object Recognition is based on: template matching; color matching; shape-based matching; or facial landmark detection (by locating the facial key points (15 to 68 unique points)). In the case of faces, these key points mark important areas of the face, such as the eyes, corners of the mouth, and the nose.

2) ROLE OF IMAGE/VIDEO PROCESSING
Image/video processing is extensively used to provide visual protection to CCTV surveillance videos. Image processing is the manipulation of a digitized image to improve (or deteriorate) the visual quality of a given image. It comprises different types of manipulation. For example: (1) image segmentation is employed to identify the pixel color-based information from images; (2) geometric transformations (enlargement, reduction, and rotation) may take place; (3) there may be a combination or blending of images; (4) image editing may take place; and (5) there could be interpolation [27].

For quick safeguard methods, GDPR emphasizes the word ''Pseudonymisation'' fifteen (15) times in the regulation, separately from the term Encryption. The term 'anonymous data' occurs once in Recital 26 [28]; otherwise, the term anonymisation is not discussed within the whole regulation. For the convenience of applying pseudonymisation as a GDPR-compliant solution, companies apply masking to detected objects in the visual data. Thus, most CCTV controllers consider masking to be the pseudonymised solution, which is wrong in the light of the definition given in Section II.C.

It is worth mentioning here that irreversible schemes can only technically revert back to an original, or allow content to be identified/recognized again, if the originals are saved in storage for re-access. Most AI-based tools accessed over the Internet save originals for recovery of the manipulated image. Widely-used video redaction methods are classified in Table 2 and summarized after the Table. Fig. 5 shows the visual effects (produced by in-house experiments) of these methods applied to an image.

TABLE 2. Irreversible vs. reversible video redaction techniques.

(a) Blurring: This redaction method [29] is applied through image filters. The blurring filters can be applied to a complete video frame or to some specific region/object, such as a face, person, licence plate, or signage. Common blurring filters are: (1) the Mean filter; (2) the Weighted average filter; and (3) the Gaussian filter. For effective blurring, the Gaussian filter is utilized in most privacy-protection applications.
(b) Pixelation: The process of enlarging pixels within images to give them a blurred effect is known as pixelation [30]. It is also performed through pixel interpolation for high distortion effects. Interpolation works in two directions by using known data points to estimate the values at unknown points. Thus, image interpolation occurs in the distortion of an image from one pixel grid to another grid.
(c) Mosaic: Small blocks of pixels from different regions are combined to give an un-identifiable effect to the picture [31], e.g. a face is obscured through pixel blocks taken from different areas of an image.
(d) Cartooning: This is an image distortion technique which uses blurring filters and pixelation together [32] to give robust privacy to videos.
(e) Masking: The process of hiding visual information by replacing the original data with some unknown data.
(f) Warping: The process of distorting shapes in images by digitally manipulating them to hide the original shape. The warp is applied [33] by first transforming the location of pixels with a vector and then linearly interpolating the image.
(g) Morphing: This is a fluid transformation from one image to another. Images are morphed [33] by cross-dissolving cuts in images. Morphing is equivalent to warping (shape) plus cross-dissolving (colors). Morphing, warping, and face swapping are done through facial landmark detection.
(h) Visual Abstraction: The process of visual abstrac- also works well in those code systems that rely
tion [34] is the replacement of objects/persons appear- upon codebooks and which act to transform plaintext
ing in images by a visual model to protect the privacy into code-text. It is also widely used for securing
of the individual, while enabling that person’s activ- credit/debit card numbers. In visual data tokenization,
ity. Abstraction can be obtained in a variety of ways: swapping of image pixels randomly with other values,
Change the face of a person to some cartoon face; is known as scrambling [35]. While, for complete
apply a 3D avatar (Fig. 5) to the person’s whole body; object replacement, this can be implemented as a visual
use a silhouette or blind box; and so on. Complete abstraction [34] as discussed above.
removal of the object within an image is also consid- (k) False Colors: In this method [37], image privacy pro-
ered to be visual abstraction. This abstraction can be tection is achieved through mapping the original image
reversible if applied through tokenization (discussed to some other colour palette. It is a reversible tech-
below). nique and the original colours are reversed back to the
(i) Scrambling: Scrambling [35] refers to the permuta- original.
tion of data within an image. The formal definition (l) Hashing: This is an irreversible technique [38] for
of permutation is the re-ordering of data through spe- converting arbitrary sized data to fixed size data. It is
cific patterns or at random. If scrambling is done used for image indexing or finding the same images in
through defined patterns or with a seed value, then a database. However, it cannot obfuscate images. Many
this is called a reversible technique and data can be GDPR blogs discuss this as a data security process but,
retrieved by using pattern information and a seed in fact, it cannot be used for the redaction of visual
value. On the other hand, if scrambling is by a ran- data [39].
dom re-ordering then it is known as an irreversible
technique. 3) ROLE OF CRYPTOGRAPHY
(j) Tokenization: Tokenization [36] is basically a Cryptography is the practice and study of data securing
non-mathematical data security method and a reversible techniques [40]. Cryptography is a reversible process using
technique. The process of tokenization is the encryption and decryption methods. In fact, encryption is the
swapping-out of sensitive information and the replace- only reversible solution (through decryption) designated by
ment of it with random numbers. The original and the GDPR in its Articles. In the context of CCTV video,
mapped numbers are separately stored in tables. This encryption can act on all or part of the video material with
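The seed-driven, reversible form of scrambling described in item (i) can be sketched in a few lines of Python. This is a toy illustration on a flat list of pixel values, not any particular product's implementation; the function names are our own:

```python
import random

def scramble(pixels, seed):
    """Permute pixel values with a seed-driven shuffle (reversible scrambling)."""
    idx = list(range(len(pixels)))
    random.Random(seed).shuffle(idx)      # deterministic permutation from the seed
    return [pixels[i] for i in idx]

def unscramble(scrambled, seed):
    """Regenerate the same permutation from the seed and invert it."""
    idx = list(range(len(scrambled)))
    random.Random(seed).shuffle(idx)
    original = [0] * len(scrambled)
    for pos, src in enumerate(idx):
        original[src] = scrambled[pos]    # undo the re-ordering
    return original

pixels = [10, 20, 30, 40, 50, 60, 70, 80]
assert unscramble(scramble(pixels, seed=42), seed=42) == pixels
```

With the seed, the permutation is fully invertible; without it, an observer faces only a random re-ordering, which corresponds to the irreversible case described above.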
3) ROLE OF CRYPTOGRAPHY
Cryptography is the practice and study of data-securing techniques [40]. Cryptography is a reversible process using encryption and decryption methods. In fact, encryption is the only reversible solution (through decryption) designated by the GDPR in its Articles. In the context of CCTV video, encryption can be applied to all or part of the video material, with the intent of protecting the privacy of individuals captured by cameras and appearing within the video footage. Thus, encryption is the process of turning the original data (plaintext) into an unintelligible form (ciphertext) by means of encryption algorithms known as ciphers, using some specific random secret value(s) known as a key. For visual data protection, encryption is applied with the cipher and symmetric or asymmetric key(s) over the full or part of a CCTV video. The cipher, with the same or a different key, serves to decrypt the ciphertext in symmetric or asymmetric cryptography, respectively. The regions, or important information such as color pixels or motion in videos, can be identified by image processing and computer vision methods, and then encryption acts as a safeguard for reversible obfuscation of visual data. There are many forms of encryption [41] which can provide some measure of protection. For example, there are:

1) Naïve Encryption: Full video encryption with a suitable block cipher. The encrypted video is not playable through video recorders and cannot be watched without decryption.

2) Selective/Sufficient Encryption: Specific parts/objects in the videos are detected and then encrypted. In the literature, other terms, i.e. ROI-based, soft, lightweight and partial encryption, are also applied to this type of encryption. The videos are obfuscated in a way that they are viewable but not recognizable. The obfuscated videos may remain format compliant, and playable in the distorted form. The obfuscation is applied with strong ciphers, so that the adversary is not able to reconstruct the recognizable version through inference attacks [42].

3) Transparent/Perceptual Encryption: This is applied over multiple versions of a video [43]; a lower-quality version is free to view, but to view a full-quality version, the encryption key is required. Consequently, this method of encryption is applicable to the pay-per-view industry.

In cryptography, the robustness of the applied encryption is estimated through the robustness of the cipher and the length of the secret key(s), to avoid perceptual attacks on videos and brute-force attacks on the keys. Numerous state-of-the-art algorithms have been proposed and tested over the last three decades to provide confidentiality to data. The prevailing ciphers are DES, AES, RC4, and RSA, though DES has been largely replaced by AES because it can easily be broken by a brute-force attack (trying all possible keys) with current PCs, and RC4 has recently been broken by cryptographers [44]. For a secure key, a key space greater than 2^100 is resilient to brute-force attacks up to 2020 (by virtue of the time needed to guess every key possibility) [45]. Selective encryption of carefully chosen video syntax elements within a large volume of video data has proven to be strong and, hence, cannot easily be cryptanalyzed [46].

It is noted here that there is always a trade-off between security and computational complexity when applying cryptography to visual data. The block algorithm generally considered to be the strongest, the Advanced Encryption Standard (AES) [47], computes ten to fourteen rounds (depending on key length) over a 128-bit block, which takes almost four times as much time as when the video is encrypted with a lightweight cipher (see Fig. 11, [48]). Selective encryption is the most adoptable way to obfuscate visual data because it takes much less encryption time to secure the video than the naïve or full form of encryption.
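The selective (ROI-based) encryption idea above can be sketched as follows. This minimal Python illustration encrypts only a region of interest of a grayscale frame; a toy SHA-256 counter-mode keystream stands in for a real cipher such as AES-CTR, and all names are our own assumptions rather than any vendor's implementation:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (stand-in for a real cipher)."""
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor_roi(frame, roi, key, nonce):
    """Encrypt/decrypt only the region of interest (x, y, w, h) of a
    grayscale frame (list of rows of 0-255 ints); XOR makes the
    operation self-inverse, so the same call also decrypts."""
    x, y, w, h = roi
    flat = bytes(frame[r][c] for r in range(y, y + h) for c in range(x, x + w))
    ks = keystream(key, nonce, len(flat))
    out = [row[:] for row in frame]            # pixels outside the ROI are untouched
    it = iter(bytes(a ^ b for a, b in zip(flat, ks)))
    for r in range(y, y + h):
        for c in range(x, x + w):
            out[r][c] = next(it)
    return out
```

Applying the same call twice with the same key and nonce restores the original frame, while pixels outside the ROI are never modified, which is the reversible, format-friendly behavior that selective encryption aims for.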
B. MARKET-BASED DATA PROTECTION-BY-DESIGN SOLUTIONS
This section surveys current Data protection-by-design solutions operating globally within the commercial marketplace. Some provide reversible solutions with encryption, while the majority only use video redaction through irreversible techniques.

StratoKey [49] is a data protection-by-design solution for cloud and Software-as-a-Service (SaaS) for EU clients. This solution incorporates authentication and access controls for accessing personal data over a third-party cloud. StratoKey divides data into layers. This layering adds to the existing security, while providing a stringent form of additional security. StratoKey also provides rule-based security and policy enforcement tools such as real-time Data Loss Prevention, geographical access fencing, device profiling and other measures to ensure security and data privacy. Specifically, StratoKey supports 'best-in-class' Federal Information Processing Standard (FIPS) 140-2 validated AES encryption using a 256-bit key and Format-Preserving Encryption algorithms. In addition, StratoKey has in-built support for standard key rotations. StratoKey also supports locking encryption keys to individual applications and even individual groups of users. These encryption key management features are native features within the StratoKey product. StratoKey also supports third-party key management services via the Key Management Interoperability Protocol (KMIP). In fact, it provides both hardware and software security.

Smartcrypt [50] provides an on-the-spot encryption solution with Transparent Data Encryption (TDE) for the sensitive data of EU citizens. Smartcrypt encrypts sensitive data at its creation time and saves it in the encrypted form. This Smartcrypt encryption stays with the data even when it is moved and replicated to other user devices, file servers, PCs or external systems. Smartcrypt employs the AES cipher in Cipher-Block-Chaining (CBC) mode with a 256-bit key. For signing digital certificates, the RSA 2048 Probabilistic Signature Scheme (PSS) with preliminary SHA-512 hashing of the data is used. They (AES and RSA) have their own key storage and retrieval mechanisms and are implemented on existing Intel and IBM hardware accelerators.

Sighthound [51] offers GDPR-compliant video redaction software to EU clients for public surveillance. This software is available as a plug-in for cloud services, or as a stand-alone product for Windows- and Linux-based servers. The software is implemented through CV and Deep Learning models for automatic detection of people, faces and licence plates in real-time. It can be applied through manual selection to blur specific objects such as street signs or animals. The software is also able to detect people's age and gender, as well as their emotions. An irreversible blurring technique is applied to the data subject.

An Intelligent Video Analytics service [52] for public safety organizations provides advanced features of redaction and facial recognition to help investigations by security agencies. This is a complete security model to monitor suspicious activities through cameras. In this software, an ROI is extracted and blurring is applied to specific ROIs.

The GDPR-compliant securityRuntime (secRT) [53] is implemented with the AES cryptographic algorithm along with appropriate key management for the protection of textual data. This includes the tokenization of sensitive data, e.g. the digits of a credit card number. secRT applies encryption to the original digits before tokenization and then stores them in a database intended only for tokens.

Genetec software provides encryption and authentication for videos by means of the Omnicast Internet Protocol (IP) video management system [54]. Genetec offers cloud hosting, storage and sharing of video, while Identity Cloak is a local application of the software (see next).

Facit Data Systems Identity Cloak (https://ipvm.com/reports/cloak-identity) provides GDPR-compliant solutions for CCTV videos. It provides two modes of blurring: (a) standard, and (b) full body. In standard blurring, the face along with the upper body and arms is considered for blurring, while full-body blurring covers the entire body of a data subject. Identity Cloak's face detection performs more consistently than Genetec Clearance at higher angles of incidence, while also not losing faces as persons move away from the camera or across the field of view.

Pro Pixelated (https://www.propixelated.ie/) solutions use irreversible pixelation for static CCTV images. They provide the services of full face, complete image, number plate and other types of sensitive data blurring (pixelation) to EU clients.

Kinesense [55] offers video redaction services for CCTV videos. Irreversible masking and pixelation techniques are provided through annotation. An ROI is masked and pixelated through secure filters, and areas outside the ROI can be pixelated for the purpose of location hiding.

Redaction (https://www.redaction.ie/) uses the concept of blurring and pixelation for ROI-based video redaction. The software provides GDPR-compliant solutions for body-worn cameras and drone footage and also provides CCTV anonymization.

Face404 software [56] performs the blurring of selected ROIs, e.g. human faces in a video. Face404 (based on AI and CV) is a secure cloud-based solution, which meets the requirements of GDPR compliance for stored videos.

C. LIMITATIONS OF EXISTING SOLUTIONS
The majority of market-based solutions discussed above provide irreversible solutions, such as pixelation and blur, making that data no longer subject to GDPR. However, as discussed in Section II-C, this type of data storage is not useful for future purposes, even for legal use by supervisory authorities and stakeholders. Some surveyed market-based solutions provide encryption-based reversible and robust solutions by means of the AES cipher, but mostly these are applied to textual data, not videos. Thus, there is a need for reversible and reliable visual Data protection-by-design solutions in the future.

Surveillance data collection companies should understand that video redaction-based irreversible solutions are not sufficient in themselves. There is a pressing role for computer vision, image processing, and cryptography, along with secure key management solutions [57], to provide a complete privacy protection model for visual surveillance data. Therefore, all these information technology domains must be combined to provide a robust Data protection-by-design solution for the surveillance industry.

V. FORTHCOMING DATA PROTECTION-BY-DESIGN SOLUTIONS
Data protection-by-design solutions must ensure the four security services, i.e. confidentiality, integrity, availability and resiliency (Article 32(1)). For effective GDPR-compliant CCTV surveillance applications, the following basic requirements must be considered by future researchers and developers:

1) Perceptual Security: The footage is effectively visually distorted and proof against attacks to remove that distortion. Here, the term ''effectively'' means that the distortion level within the video is sufficient to make the video unwatchable. The video can be viewable but not understandable or pleasantly watchable.

2) Reversibility: The implemented safeguard should be reversible, so that the video can be used for legal purposes, if required.

3) Intelligibility: The activity level in the footage should not be obscured, so as not to prevent the identification of unethical/unlawful behaviors. However, notice that this requirement may be very difficult to implement because: (i) there is no common definition of ''unethical'' behaviors; and (ii) unlawful behaviors vary between different EU states.

4) Efficiency: Camera devices operated with reduced hardware specifications (such as the Raspberry Pi and CMOS sensors) need real-time and efficient privacy-protection solutions.

5) Format Compliance: Secured footage should be decodable within video players in an obscured/unintelligible form.
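Requirement 2) above presupposes sound key management [57]: reversible protection is only as strong as the handling of the keys. One common approach, sketched below purely as an illustration (the function and label names are our own assumptions), is to derive a fresh key per video segment from a single master key with an HMAC-SHA256 chain, so that a leaked segment key exposes neither the master key nor the other segment keys:

```python
import hmac
import hashlib

def derive_segment_keys(master_key: bytes, n_segments: int):
    """Derive one key per video segment from a master key via an HMAC chain.

    The chain value is ratcheted forward for each segment; the exported
    per-segment key is a separate HMAC output, so knowledge of one
    segment key does not reveal the chain or the master key."""
    keys = []
    chain = master_key
    for i in range(n_segments):
        # advance the chain deterministically, bound to the segment index
        chain = hmac.new(chain, b"ratchet" + i.to_bytes(4, "big"), hashlib.sha256).digest()
        # export a key derived from, but not equal to, the chain value
        keys.append(hmac.new(chain, b"segment-key", hashlib.sha256).digest())
    return keys

keys = derive_segment_keys(b"master-secret", 3)
assert len(set(keys)) == 3   # distinct key per segment
```

Under this scheme, a supervisory authority holding the master key can regenerate any segment key on demand, while day-to-day operators only ever see the per-segment keys.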
Thus, Data protection-by-design encompasses a complete technical solution, which may further include authentication, access rights, a digital evidence chain, video records storage, and a method of generating successive secret keys […], particularly when data are exported to a third party. The third party may need to hire trained cryptographers to manage and edit that information.

Data protection-by-design is still at an immature stage and requires reliable secure technologies for the privacy provision of digitized visual data in the context of the regulation. Article 25 serves as a backbone for secure technology-based solutions and Article 32 clarifies the security services provided by those solutions. For instance, Confidentiality requires that personal data, whether in storage or transit, can only be accessed by authenticated persons/systems, to avoid interception attacks. Integrity requires that any modification to personal data, whether in storage or transit, can only be performed by authenticated persons/systems with the consent of the data subject. Availability requires that certain hardware and software services can only be accessed by authorized persons/systems, a failure of which can be due to an interruption attack by which a system is spoiled or made unusable. Resilience ensures the robustness of the technology, which should restore itself to its original form after attacks.

To conclude, as visual privacy is of great importance, there is a vital need to deploy complete Data protection-by-design solutions for affected companies. The paper has discussed technological solutions provided by companies and it further suggests improvements. In the opinion of this paper's authors, cryptography and video redaction, acting together or in isolation, are not sufficient. Multiple information technology domains will need to work hand-in-hand to implement effective Data protection-by-design solutions for video surveillance systems. Clearly, an additional cost will ensue in order to mitigate the privacy risks arising from the use of visual data. However, this cost will result in a meaningful implementation of GDPR-compliance by video surveillance companies, which will have a positive impact upon EU businesses. Importantly, similar solutions will become necessary outside the EU, if not already implemented, and they are already necessary for those companies interacting with EU clients from outside the EU.

ACKNOWLEDGMENTS
We would like to thank the reviewers for their valuable comments in improving the quality of the article and in enhancing the readability and usefulness of our manuscript.

REFERENCES
[1] C. Tikkinen-Piri, A. Rohunen, and J. Markkula, "EU general data protection regulation: Changes and implications for personal data collecting companies," Comput. Law Secur. Rev., vol. 34, no. 1, pp. 134-153, Feb. 2018.
[2] G. Pillai. (2012). Caught on Camera: You are Filmed on CCTV 300 Times a Day in London. Accessed: May 2019. [Online]. Available: https://www.ibtimes.co.uk/britain-cctv-camera-surveillance-watch-london-big-312382
[3] A. Cavallaro, "Privacy in video surveillance," IEEE Signal Process. Mag., vol. 24, no. 2, pp. 166-168, Mar. 2007.
[4] Q. M. Rajpoot and C. D. Jensen, "Video surveillance: Privacy issues and legal compliance," in Promoting Social Change and Democracy Through Information Technology, V. Kumar and J. Svenson, Eds. Hershey, PA, USA: IGI Global, 2015, ch. 4, pp. 69-92.
[5] R. Furlong. (2006). BBC NEWS | Europe | Germans Probe Merkel Spy Camera. Accessed: May 2019. [Online]. Available: http://news.bbc.co.uk/2/hi/europe/4849806.stm
[6] Scottish Court. (2017). £17,000 Damages Awarded by Scottish Court in Neighbour CCTV Dispute | Brett Wilson LLP. Accessed: May 2019. [Online]. Available: https://www.brettwilson.co.uk/blog/17000-damages-awarded-scottish-courtwooley-wooley-v-nahid-akbar-akram-2017-sc-edin-7-neighbour-cctv-dispute/
[7] EU GDPR. (2016). EUR-Lex-32016R0679-EN-EUR-Lex. Accessed: May 2019. [Online]. Available: https://eur-lex.europa.eu/eli/reg/2016/679/oj
[8] C. Babel. (2017). Privacy and the EU GDPR: US and UK Privacy Professionals. [Online]. Available: http://www.corporatecomplianceinsights.com/wp-content/uploads/2017/11/TrustArc-Privacy-and-the-EU-GDPR-Research-Report-09.26.17.pdf
[9] G. Mainard. (2018). Data Protection Issues. Accessed: Jun. 2019. [Online]. Available: https://www.localenterprise.ie/DublinCity/Start-or-Grow-your-Business/Knowledge-Centre/eBusiness/Data-Protection-Issues/
[10] D. George, K. Reutimann, and A. Tamò-Larrieux, "GDPR bypass by design? Transient processing of data under the GDPR," SSRN Electron. J., Aug. 2018.
[11] G. Petro. (2018). Facebook's Scandal and GDPR are Creating New Opportunities for Retail. Accessed: Jul. 2019. [Online]. Available: https://www.forbes.com/sites/gregpetro/2018/05/27/facebooks-scandal-and-gdpr-are-creating-new-opportunities-for-retail#481601a6626c
[12] Data Protection Commission. (2018). Anonymisation and Pseudonymisation | Data Protection Commissioner. Accessed: Feb. 2019. [Online]. Available: https://www.dataprotection.ie/en/guidance-landing/anonymisation-and-pseudonymisation
[13] S. Stalla-Bourdillon and A. Knight, "Anonymous data v. personal data, a false debate: An EU perspective on anonymization, pseudonymization and personal data," Wisconsin Int. Law J., vol. 34, no. 2, pp. 284-322, 2017.
[14] S. Gilad-Gutnick, E. S. Harmatz, K. Tsourides, G. Yovel, and P. Sinha, "Recognizing facial slivers," J. Cogn. Neurosci., vol. 30, no. 7, pp. 951-962, Jul. 2018.
[15] I. Bouchrika and M. S. Nixon, "People detection and recognition using gait for automated visual surveillance," in Proc. IET Conf. Crime Secur., 2006, pp. 576-581.
[16] V. B. Semwal, N. Gaud, and G. C. Nandi, "Human gait state prediction using cellular automata and classification using ELM," in Proc. Mach. Intell. Signal Anal., 2019, pp. 135-145.
[17] Handbook No. 1: The Right to Respect for Private and Family Life. A Guide to the Implementation of Article 8 of the European Convention on Human Rights, European Court of Human Rights, Strasbourg, France, Dec. 2001.
[18] L. van Zoonen, "Privacy concerns in smart cities," Government Inf. Quart., vol. 33, no. 3, pp. 472-480, 2016.
[19] R. Buyya, S. T. Selvi, and X. Chu, Object Oriented Programming With Java: Essentials and Applications. New York, NY, USA: McGraw-Hill, 2009.
[20] Information Commissioner's Office (ICO). In the Picture: A Data Protection Code of Practice for Surveillance Cameras and Personal Information, Version 1.2. Accessed: Jul. 2019. [Online]. Available: https://ico.org.uk/for-organisations/guide-to-data-protection-1998/encryption/scenarios/cctv/
[21] A. Romanou, "The necessity of the implementation of Privacy by Design in sectors where data protection concerns arise," Comput. Law Secur. Rev., vol. 34, no. 1, pp. 99-110, 2018.
[22] S. Sah, A. Shringi, R. Ptucha, A. M. Burry, and R. P. Loce, "Video redaction: A survey and comparison of enabling technologies," J. Electron. Imag., vol. 26, no. 5, 2017, Art. no. 051406.
[23] P. Sinha, B. Balas, Y. Ostrovsky, and R. Russell, "Face recognition by humans: Nineteen results all computer vision researchers should know about," Proc. IEEE, vol. 94, no. 11, pp. 1948-1962, Nov. 2006.
[24] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: Unified, real-time object detection," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., Jun. 2016, pp. 779-788.
[25] K. He, G. Gkioxari, P. Dollár, and R. Girshick, "Object detection for dummies part 3: R-CNN family," Facebook AI Res., New York, NY, USA, Tech. Rep., 2017.
[26] Z. Li and F. Zhou, "FSSD: Feature fusion single shot multibox detector," Dec. 2017, arXiv:1712.00960. [Online]. Available: https://arxiv.org/abs/1712.00960
[27] J. R. Padilla-López, A. A. Chaaraoui, and F. Flórez-Revuelta, "Visual privacy protection methods: A survey," Expert Syst. Appl., vol. 42, no. 9, pp. 4177-4195, 2015.
[28] M. Mourby, E. Mackey, M. Elliot, H. Gowans, S. E. Wallace, J. Bell, H. Smith, S. Aidinlis, and J. Kaye, "Are 'pseudonymised' data always personal data? Implications of the GDPR for administrative data research in the UK," Comput. Law Secur. Rev., vol. 34, no. 2, pp. 222-233, 2018.
[29] J. Flusser, S. Farokhi, C. Höschl, T. Suk, B. Zitová, and M. Pedone, "Recognition of images degraded by Gaussian blur," IEEE Trans. Image Process., vol. 25, no. 2, pp. 790-806, Feb. 2016.
[30] T. Gerstner, D. DeCarlo, M. Alexa, A. Finkelstein, Y. Gingold, and A. Nealen, "Pixelated image abstraction with integrated user constraints," Comput. Graph., vol. 37, no. 5, pp. 333-347, 2013.
[31] Y. Kusama, H. Kang, and K. Iwamura, "Mosaic-based privacy-protection with reversible watermarking," in Proc. 12th Int. Joint Conf. e-Bus. Telecommun., Jul. 2015, pp. 98-103.
[32] A. Erdélyi, T. Barát, P. Valet, T. Winkler, and B. Rinner, "Adaptive cartooning for privacy protection in camera networks," in Proc. 11th IEEE Int. Conf. Adv. Video Signal Based Surveill., Aug. 2014, pp. 44-49.
[33] J. Gomes and L. Velho, "Warping and morphing," in Proc. Image Process. Comput. Graph., 1997, pp. 271-296.
[34] K. Chinomi, N. Nitta, Y. Ito, and N. Babaguchi, "PriSurv: Privacy protected video surveillance system using adaptive visual abstraction," in Advances in Multimedia Modeling—MMM (Lecture Notes in Computer Science), vol. 4903, S. Satoh, F. Nack, and M. Etoh, Eds. Berlin, Germany: Springer, 2008, pp. 144-154.
[35] F. Dufaux, "Video scrambling for privacy protection in video surveillance: Recent results and validation framework," Proc. SPIE, vol. 8063, May 2011, Art. no. 806302.
[36] T. Spies and R. T. Minner, "System for protecting sensitive data with distributed tokenization," U.S. Patent 0 046 853 A1, Feb. 13, 2014.
[37] S. Çiftçi, A. O. Akyüz, and T. Ebrahimi, "A reliable and reversible image privacy protection based on false colors," IEEE Trans. Multimedia, vol. 20, no. 1, pp. 68-81, Jan. 2018.
[38] V. E. Liong, J. Lu, Y.-P. Tan, and J. Zhou, "Deep video hashing," IEEE Trans. Multimedia, vol. 19, no. 6, pp. 1209-1219, Jun. 2017.
[39] A. Ewerlöf, "GDPR pseudonymization techniques," Medium Corp. (US), San Francisco, CA, USA, Tech. Rep., May 2018.
[40] D. R. Stinson, Cryptography: Theory and Practice, 3rd ed. London, U.K.: Chapman & Hall, 2005.
[41] H. Hofbauer and A. Uhl, "Identifying deficits of visual security metrics for images," Signal Process., Image Commun., vol. 46, pp. 60-75, Aug. 2016.
[42] M. N. Asghar, M. Ghanbari, M. Fleury, and M. J. Reed, "Sufficient encryption based on entropy coding syntax elements of H.264/SVC," Multimedia Tools Appl., vol. 74, no. 23, pp. 10215-10241, 2015.
[43] M. N. Asghar, R. Kousar, H. Majid, and M. Fleury, "Transparent encryption with scalable video communication: Lower-latency, CABAC-based schemes," J. Vis. Commun. Image Represent., vol. 45, pp. 122-136, May 2017.
[44] A. Popov, Prohibiting RC4 Cipher Suites, Standard RFC 7465, IETF, Fremont, CA, USA, Feb. 2015.
[45] G. Alvarez and S. Li, "Some basic cryptographic requirements for chaos-based cryptosystems," Int. J. Bifurcation Chaos, vol. 16, no. 8, pp. 2129-2151, 2006.
[46] M. N. Asghar, M. Ghanbari, M. Fleury, and M. J. Reed, "Confidentiality of a selectively encrypted H.264 coded video bit-stream," J. Vis. Commun. Image Represent., vol. 25, no. 2, pp. 487-498, 2014.
[47] J. Daemen and V. Rijmen, The Design of Rijndael: AES—The Advanced Encryption Standard. Berlin, Germany: Springer, 2002.
[48] A. Shifa, M. N. Asghar, S. Noor, N. Gohar, and M. Fleury, "Lightweight cipher for H.264 videos in the Internet of multimedia things with encryption space ratio diagnostics," Sensors, vol. 19, no. 5, p. 1228, 2019.
[49] StratoKey. (2018). Technical White Paper. Accessed: Feb. 18, 2019. [Online]. Available: https://www.stratokey.com/resources/stratokey-white-paper
[50] Pkware. (2019). Smartcrypt Transparent Data Encryption (TDE): Reliable Encryption for Structured and Unstructured Data. Accessed: May 15, 2019. [Online]. Available: https://www.pkware.com/smartcrypt-tde
[51] Sighthound. Computer Vision Software With Deeply-Learned AI Vision API & SDKs | Technology. Accessed: Feb. 1, 2019. [Online]. Available: https://www.sighthound.com/technology/
[52] IBM. IBM Intelligent Video Analytics—Overview—Ireland. Accessed: Jul. 7, 2019. [Online]. Available: https://www.ibm.com/analytics/ie/en/
[53] EPeri. (2019). Open Source—EPeri. Accessed: May 15, 2019. [Online]. Available: https://eperi.com/open-source/
[54] Clearance. (2019). Omnicast | Genetec. Accessed: May 15, 2019. [Online]. Available: https://www.genetec.com/solutions/all-products/omnicast
[55] Kinesense. Video Redaction for GDPR. Accessed: Feb. 1, 2019. [Online]. Available: https://www.kinesense-vca.com/2018/05/25/kinesense-tips-tricks-18-gdpr-addition/
[56] SeekLayer. SeekLayer—Fast, Affordable, GDPR-Compliant Video Redaction. Accessed: May 15, 2019. [Online]. Available: https://www.seeklayer.com#cctv_redaction_section
[57] S. Bellovin and R. Housley, Guidelines for Cryptographic Key Management, Standard RFC 4107, IETF, Fremont, CA, USA, Jun. 2005.
[58] A. Senior, S. Pankanti, A. Hampapur, L. Brown, Y.-L. Tian, A. Ekin, J. Connell, C. F. Shu, and M. Lu, "Enabling video privacy through computer vision," IEEE Security Privacy, vol. 3, no. 3, pp. 50-57, May 2005.
[59] T. Aste, P. Tasca, and T. D. Matteo, "Blockchain technologies: The foreseeable impact on society and industry," Computer, vol. 50, no. 9, pp. 18-28, Jan. 2017.
[60] S. Nakamoto. (2008). Bitcoin: A Peer-to-Peer Electronic Cash System. [Online]. Available: http://www.cryptovest.com
[61] M. Liu, J. Shang, P. Liu, Y. Shi, and M. Wang, "VideoChain: Trusted video surveillance based on blockchain for campus," in Cloud Computing and Security—ICCS (Lecture Notes in Computer Science), vol. 11066, X. Sun, Z. Pan, and E. Bertino, Eds. Cham, Switzerland: Springer, 2018, pp. 48-58.
[62] H. Vranken, "Sustainability of bitcoin and blockchains," Current Opinion Environ. Sustainability, vol. 28, pp. 1-9, Oct. 2017.
[63] H. Al-Khateeb, G. Epiphaniou, and H. Daly, "Blockchain for modern digital forensics: The chain-of-custody as a distributed ledger," in Blockchain and Clinical Trial (Advanced Sciences and Technologies for Security Applications), H. Jahankhani, S. Kendzierskyj, A. Jamal, G. Epiphaniou, and H. Al-Khateeb, Eds. Springer-Verlag, 2019.
[64] K. Wüst and A. Gervais, "Do you need a blockchain?" in Proc. Crypto Valley Conf. Blockchain Technol., Jun. 2018, pp. 45-54.

MAMOONA N. ASGHAR received the Ph.D. degree from the School of Computer Science and Electronic Engineering, University of Essex, Colchester, U.K., in 2013. She has been a Marie Sklodowska-Curie (MSC) Career-Fit Research Fellow with the Software Research Institute, Athlone Institute of Technology (AIT), Ireland, since June 2018. As an MSC Principal Investigator (PI), her research targets the proposal and implementation of technological solutions for General Data Protection Regulation (GDPR)-compliant surveillance systems. She is also a regular Faculty Member with the Department of Computer Science and Information Technology (DCS & IT), The Islamia University of Bahawalpur, Punjab, Pakistan, where she is currently on postdoctoral leave. She has more than 14 years of teaching and R&D experience. She has published several ISI-indexed journal articles along with numerous international conference papers. She is also actively involved in reviewing for renowned journals and conferences. Her research interests include security aspects of multimedia (image, audio and video), compression, visual privacy, encryption, steganography, secure transmission in future networks, video quality metrics, and key management schemes.
NADIA KANWAL received the M.Sc. and Ph.D. degrees in computer science from the University of Essex, Colchester, U.K., in 2009 and 2013, respectively. She is currently a Marie Sklodowska-Curie Career-Fit Postdoctoral Fellow with the Software Research Institute, Athlone Institute of Technology (AIT), Ireland. The primary objective of this fellowship is to propose technological solutions for the privacy protection of humans as per GDPR guidelines. She is also associated with the Lahore College for Women University, Pakistan, where she is an Associate Professor, currently on leave to pursue the Postdoctoral Fellowship. She is applying deep learning methods to improve the performance of vision algorithms for detection and matching tasks, which can help to develop robust solutions for different vision-related applications. Her research interests include machine learning, image/video processing, medical imaging, and privacy. She was a student member of the IEEE Computer Society, the Institution of Engineering and Technology, and the British Machine Vision Association. She has been actively involved in reviewing for reputed conferences and journals.

MARTIN FLEURY received the degree in modern history from Oxford University, U.K., the degree in maths/physics from the Open University, Milton Keynes, U.K., the M.Sc. degree in astrophysics from QMW College, University of London, U.K., in 1990, the M.Sc. degree in parallel computing systems from the University of South-West England, Bristol, in 1991, and the Ph.D. degree in parallel image-processing systems from the University of Essex, Colchester, U.K. He was a Senior Lecturer with the University of Essex, after which he became a Visiting Fellow. He is currently associated with the School of Engineering, Arts, Science and Technology (EAST), University of Suffolk, Ipswich, U.K. He is also a freelance consultant. He has authored or coauthored around 295 articles and book chapters on topics such as document and image compression algorithms, performance prediction of parallel systems, software engineering, reconfigurable hardware, and vision systems. He has published or edited books on high-performance computing for image processing and peer-to-peer streaming. His current research interests include video communication over wireless networks.
Title :
Visual Surveillance Within the EU General Data Protection Regulation: A Technology
Perspective
Name of Authors :
Mamoona N. Asghar, Nadia Kanwal, Brian Lee, Martin Fleury, Marco Herbst and Yuansong
Qiao
Abstract :
Background info of the article : The article explains on need to protect that visual information
from illegal utilization by untrusted parties. This is because, though technology has given many
benefits to people, it also comes with disadvantages whereby it could invade their privacy. As an
example, a surveillance camera captures videos of a person could improve their personal
safety, however, if the videos is used by untrusted parties, it could be a breach of privacy and
brings harm to the particular person.
Findings :
- The European General Data Protection Regulation (GDPR) was adopted in April 2016 and came into force in May 2018 in every EU member state.
- The purposes of the GDPR are to ensure transparency in the processing of personal data and to improve the collective digital economy of the EU states.
- Rights are conferred on individuals, with the corresponding responsibilities carried by those who process personal data.
- The GDPR has global reach: companies outside the EU that process the personal data of EU citizens for the purposes of selling goods, providing services, or tracking the actions of EU citizens are subject to the regulation. Such companies must appoint a Data Protection Officer, who bears the responsibility of ensuring GDPR compliance. Non-compliance can attract a substantial fine of up to 4% of the company's global annual turnover or 20 million Euros (whichever is greater).
Article 5(1) sets out the GDPR principles protecting privacy in the processing of personal data. Under Article 5(2), the controllers of the collected data are held accountable for any failure to comply.
- Article 5(1)(a): Lawfulness, fairness and transparency.
1) Lawfulness: the processing of personal data must comply with the GDPR requirements.
2) Fairness: personal data shall be processed in ways that the data subjects can reasonably expect; the data cannot be used in ways that have unjustified adverse effects on the subjects.
3) Transparency: the grounds for collecting and processing personal data must be explicitly disclosed.
- Article 5(1)(b): Purpose limitation. Personal data may be obtained only for specified, explicit and legitimate purposes. The subjects must be informed of the reasons for collecting their personal data, and their consent must be obtained.
- Article 5(1)(c): The personal data collected about a subject must be minimised to what is adequate, relevant and limited to what is necessary for the purpose of the processing.
- Article 5(1)(d): The data must be accurate, and unauthorised alteration of the data is prohibited. Editable data management systems must also be provided so that subjects can update their personal data.
- Article 5(1)(e): Personal data may be kept for no longer than necessary. Once the data are no longer needed, they must be removed from the controller's repositories.
- Article 5(1)(f): The data must be processed with integrity and confidentiality: controllers must employ appropriate organisational measures to ensure the security of the personal data, including protection against unlawful processing and against accidental loss, destruction or damage.
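The storage-limitation principle in Article 5(1)(e) lends itself to a simple automated check. The following is a minimal sketch only, not a compliance tool; the record layout and the 30-day retention period are illustrative assumptions, not values from the regulation or the article:

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention window (assumption): 30 days.
RETENTION = timedelta(days=30)

def purge_expired(records, now=None):
    """Return only the records still inside the retention window
    (GDPR Art. 5(1)(e): keep data no longer than necessary)."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["captured_at"] <= RETENTION]

# Hypothetical surveillance records: one recent, one 61 days old.
now = datetime(2019, 8, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "captured_at": datetime(2019, 7, 25, tzinfo=timezone.utc)},
    {"id": 2, "captured_at": datetime(2019, 6, 1, tzinfo=timezone.utc)},
]
kept = purge_expired(records, now=now)
print([r["id"] for r in kept])  # record 2 is past retention and is dropped
```

In a real system the purge would run against the controller's repositories on a schedule, with the deletion itself logged for accountability.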
Article 6
- Article 6(4)(e): Appropriate safeguards must be employed, such as encryption or pseudonymisation (replacing any information that could be used to identify an individual with a pseudonym, i.e. a value that does not allow the individual to be directly identified), to protect the personal data.
- Article 6(1)(f): Lawful processing of personal data is permitted provided that the processing does not override the rights and freedoms of the data subject (especially if the subject is a minor).
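The pseudonymisation safeguard in Article 6(4)(e) can be sketched in a few lines. This is an assumption-laden illustration, not the article's method: the hard-coded key and the in-memory lookup table stand in for a properly protected key vault and re-identification table, which would be held separately under strict access control.

```python
import hashlib
import hmac

SECRET_KEY = b"hold-me-in-a-key-vault"  # assumption: key kept securely, apart from data
_lookup = {}                            # pseudonym -> identity; access would be guarded

def pseudonymise(identity: str) -> str:
    """Replace a direct identifier with a keyed pseudonym that does not
    identify the person on its own."""
    token = hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()[:16]
    _lookup[token] = identity           # retained so authorised staff can reverse it
    return token

def reidentify(token: str) -> str:
    """Reverse the pseudonym; in practice an authorised, logged operation."""
    return _lookup[token]

p = pseudonymise("Alice Example")       # hypothetical data subject
assert p != "Alice Example" and reidentify(p) == "Alice Example"
print(p)
```

Because the mapping is keyed and reversible only via the protected table, this matches the GDPR's idea of pseudonymised (rather than anonymised) data: the data remain personal data, but direct identification is removed from the working dataset.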
The GDPR confers eight rights on data subjects (EU citizens):
1. Article 15: Right of access. Data subjects have the right to request access to the collected data. Companies are required to provide a free copy of the collected personal data in an electronic format.
2. Article 16: Right to rectification. Subjects have the right to correct and update their collected data.
3. Article 17: Right to erasure. Companies are required to remove or erase the collected and processed personal data of subjects when the data are no longer necessary or when the data subjects withdraw their consent.
4. Article 18: Right to restrict processing. Data subjects may request that their data be withheld from any processing; the data can then be kept by the controller but not used.
5. Article 19: Right to be informed. Data subjects must be informed by companies of the collection of their personal data, and their free consent must be obtained for the collection.
6. Article 20: Right to data portability. Data subjects have the right to transfer their personal data from one service provider to another.
7. Article 21: Right to object to the processing of their personal data for companies' direct marketing.
8. Article 34: Right to be notified of any breach involving their personal data (notification must be made within 72 hours).
3 Interesting Points
- Personal data may include a person's physical information (facial features, clothing information) and behaviour (facial gestures, physical actions).
- Recognising people from this information is an easy task for humans.
- Research on facial recognition has found that humans are good at recognising highly distorted faces.
- Gilad Gutnick et al. found that thinned and flattened celebrity faces could be recognised by humans accurately at up to 80% distortion. Beyond that, recognition starts to become less accurate; however, half of the celebrities were still identified correctly by the volunteers.
- Privacy protection should therefore be effected through methods that distort the video to a sufficient level.
- Data are said to be recorded in a personal space when the recording occurs in a domestic setting or within a household (Article 8 of the European Convention on Human Rights).
- Examples: a CCTV system installed in a home or in a playground to monitor children for safety reasons, or CCTV installed in a garage for vehicle safety.
- Video footage obtained in those circumstances does not require protection, since it is not recorded to be forwarded to any third party unless a serious incident relating to crime or tort occurs. Consequently, such CCTV footage (private data) is generally of no value to local authorities, city planners or other third parties, and GDPR compliance is not mandatory.
- Where the data are not linked to an individual, they cannot be used for surveillance purposes.
- Examples: monitoring of traffic flows within public transport, shopping malls, and crowds in roads or public parks.
- Since such data are used to measure flows of vehicles or impersonal crowds rather than individuals, GDPR compliance is not necessary. However, it is still necessary to protect the data from misuse.
- We find this point interesting and important to know because our faces and actions are being recorded by CCTV installed all around the premises we enter or associate with. It is therefore important for us to know our rights as data subjects and whether our privacy is protected by law.
- In Malaysia, CCTV recordings in public or private places are subject to the Personal Data Protection Act 2010 (Act 709).
- Section 8 states that, subject to section 39, no personal data shall, without the consent of the data subject, be disclosed for any purpose other than that allowed by the Act.
- Personal data processed only for the purposes of an individual's personal, family or household affairs, including recreational purposes, are exempted from the PDPA (similar to the personal-space visual data concept).
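The earlier point that privacy protection must distort video "at a sufficient level" can be illustrated with a minimal pixelation sketch. This is an illustration only, not a method from the article: a grayscale face region is represented as a plain list of lists, and averaging over larger b x b blocks gives a stronger, harder-to-reverse distortion.

```python
def pixelate(img, b):
    """Replace each b x b block of a grayscale image (list of lists of
    intensities) with the block's average intensity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(0, h, b):
        for x in range(0, w, b):
            # Gather the block (clipped at image edges) and average it.
            block = [img[j][i] for j in range(y, min(y + b, h))
                               for i in range(x, min(x + b, w))]
            avg = sum(block) // len(block)
            for j in range(y, min(y + b, h)):
                for i in range(x, min(x + b, w)):
                    out[j][i] = avg
    return out

# Toy 2x4 "image": each 2x2 block collapses to its average intensity.
img = [[10, 20, 30, 40],
       [50, 60, 70, 80]]
print(pixelate(img, 2))
```

Tuning `b` is exactly the "sufficient level" question: too small a block and faces remain recognisable (as the Gutnick et al. results suggest), while a reversible scheme would encrypt the original region rather than discard it.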
Recommendation
Conclusion
New York and New Jersey Make an Early Effort to Regulate Artificial Intelligence
Marc S. Martin, Charlyn L. Ho, and Michael A. Sherling
Artificial Intelligence and the Fair Housing Act: Algorithms Under Attack?
William T. Gordon, Katherine Kirkpatrick, and Katherine Mueller
The Convertible Debt Valuation Cap: The Trigger Financing Investor Perspective
Paul A. Jones
Everything Is Not Terminator: AI Issues Raised by the California Consumer Privacy Act
John Frank Weaver
FULL COURT PRESS®
RAIL The Journal of Robotics,
Artificial Intelligence & Law
Volume 3, No. 1 | January–February 2020
EDITOR
Victoria Prussen Spears
Senior Vice President, Meyerowitz Communications Inc.
BOARD OF EDITORS
Miranda Cole
Partner, Covington & Burling LLP
Kathryn DeBord
Partner & Chief Innovation Officer, Bryan Cave LLP
Paul B. Keller
Partner, Norton Rose Fulbright US LLP
Garry G. Mathiason
Shareholder, Littler Mendelson P.C.
Elaine D. Solomon
Partner, Blank Rome LLP
Linda J. Thayer
Partner, Finnegan, Henderson, Farabow, Garrett & Dunner LLP
Mercedes K. Tunstall
Partner, Pillsbury Winthrop Shaw Pittman LLP
Edward J. Walters
Chief Executive Officer, Fastcase Inc.
Publishing Staff
Publisher: Morgan Morrissette Wright
Journal Designer: Sharon D. Ray
Cover Art Design: Juan Bustamante
This publication is sold with the understanding that the publisher is not engaged
in rendering legal, accounting, or other professional services. If legal advice or
other expert assistance is required, the services of a competent professional should
be sought.
Editorial Office
711 D St. NW, Suite 200, Washington, D.C. 20004
https://www.fastcase.com/
For questions about the Editorial Content appearing in these volumes or reprint
permission, please call:
Customer Service
Available 8am–8pm Eastern Time
866.773.2782 (phone)
support@fastcase.com (email)
Sales
202.999.4777 (phone)
sales@fastcase.com (email)
ISSN 2575-5633 (print)
ISSN 2575-5617 (online)
U.S. AI Regulation Guide:
Legislative Overview and
Practical Considerations
Yoon Chae*
Robotics, Artificial Intelligence & Law / January–February 2020, Vol. 3, No. 1, pp. 17–40.
© 2020 Full Court Press. All rights reserved.
ISSN 2575-5633 (print) / ISSN 2575-5617 (online)
18 The Journal of Robotics, Artificial Intelligence & Law [3:17
legislative sessions ago (2015–2016), five bills during the last term
(2017–2018), and 13 bills for the current legislature (2019–2020).
Other than regulatory instruments on autonomous vehicles,5 which are beyond the scope of this article, most of the lawmaking on AI, automated systems, and their underlying technologies has been in the following areas:
Policy
The Executive Order and the NIST Plan
Algorithmic Accountability
Algorithmic Accountability Act
sponsors, Senator Blunt, to make sure “that people are given the
information and [] the control over how their data is shared” when
it comes to facial recognition.49
If enacted, the bill would generally prohibit covered entities
from using “facial recognition technology to collect facial recog-
nition data” of end users without providing notice and obtaining
their consent.50 “Covered entities” is broadly defined and would
include any non-governmental entity that “collects, stores, or pro-
cesses facial recognition data.”51 This definition would therefore
seemingly include entities like app operators that collect facial
data, businesses that use the technology on their premises, and
outside vendors that process such data for the original data collec-
tors.52 Facial recognition data is tied to personal information (i.e.,
“unique personal identification of a specific individual”), but may
also include pseudonymized information (i.e., where “a unique,
persistent identifier” is assigned).53
Covered entities would also be prohibited from using the tech-
nology to discriminate against the consumers, from repurposing the
facial data, or from sharing such data with third parties without
obtaining further consent from the end users.54 The covered entities
would also not be allowed to condition service of the technology
on the consumer’s consent to waive privacy rights when the use of
facial recognition technology is not necessary for that service.55 In
an effort to provide more oversight, the bill also requires the cov-
ered entities to conduct meaningful human review before making
any final decision based on the output of the technology if it can
result in a reasonably foreseeable harm or be “highly offensive”
to a reasonable end user.56 If the technology is made “available as
an online service,” then the covered entity would additionally be
required to make available an API to enable at least one third party
to conduct reasonable independent tests for accuracy and bias.57
The bill, however, contains some notable exceptions. For
example, the bill exempts “security applications” that use the
technology for “loss prevention” or “to detect or prevent criminal
activity.”58 The bill also exempts products or services “designed for
personal file management or photo or video sorting or storage if
the facial recognition technology is not used for unique personal
identification of a specific individual,” involves “identification of
public figures for journalistic media created for public interest,”
“involves identification of public figures in copyrighted material,”
or is used in an emergency.59
The law would not preempt state laws, except to the extent such
regulations are “inconsistent” with the provisions of the bill.60 If
passed, the act would thus likely not preempt stricter state laws,
such as Illinois’s Biometric Information Privacy Act (“BIPA”) or
other state biometric information privacy laws in Texas and Wash-
ington.61 The bill does not provide for a private right of action, and
would instead be enforced by the FTC or via civil suits brought by
the affected state’s attorney general.62
This bill is worth monitoring as it makes its way through Con-
gress given that it is a bipartisan bill63 with a narrow scope that
has received early conceptual support from several technology
companies.64 Again, like the Algorithmic Accountability Act, the
bill mirrors certain aspects of the GDPR, such as the differentiation
between processors and controllers.65 Numerous commentators
believe that this bill signals the types of impending laws on facial
recognition technology. In fact, in 2019, lawmakers in at least 10
states have introduced bills to ban or delay the use of facial recog-
nition technology by government agencies or businesses.66
There has been a wave of state and local laws and bills on the
procurement and use of facial recognition technology. For example,
California introduced a bill, A.B. 1215, on February 21, 2019, which
would prohibit law enforcement agencies and officials from using
any “biometric surveillance system,” including facial recognition
technology, in connection with an officer camera or data collected
by the camera.71 On the same day, another bill, A.B. 1281, was
introduced in California, which would require California businesses
that use facial recognition technology to disclose such usage on a
physical sign that is “clear and conspicuous at the entrance of every
location that uses facial recognition technology.”72
Senate Bill 1385 was introduced in Massachusetts on January 22,
2019, to establish a moratorium on the use of face recognition sys-
tems by state and local law enforcement, and Senate Bill 5687 was
introduced in New York on May 13, 2019, to propose a temporary
stop to the use of facial recognition technology in public schools.73
Similarly, companion bills, S.B. 5528 and H.B. 1654, were intro-
duced in Washington in January 2019, concerning the procurement
and use of facial recognition technology by government entities
and privacy rights relating to facial recognition technology.74
As for cities, San Francisco and Oakland, California, and Somer-
ville, Massachusetts, passed ordinances this summer to ban the use
of facial recognition software by the police and other government
agencies.75
On a related note, there has also been a resurgence of biomet-
ric privacy bills being introduced in state legislatures. Although
the Illinois Biometric Information Privacy Act and Texas’s law
on the Capture or Use of Biometric Identifier have been around
since 2008 and 2009,76 other states have recently started enact-
ing or introducing privacy laws on biometric identifiers, such as
Transparency
California’s Bolstering Online Transparency Act
External Resources
Technical Standards
Notes
* Yoon Chae is an associate in Baker McKenzie’s Dallas office, where he
advises on IP and technology matters. He also served as the firm’s first fellow
at the World Economic Forum’s Centre for the Fourth Industrial Revolution,
where he researched diverse policy issues arising from AI. This article reflects
his personal views and not the positions of the firm. He may be reached at
yoon.chae@bakermckenzie.com.
1. See The Law Library of Cong., Regulation of Artificial Intel-
ligence in Selected Jurisdictions 1 (2019).
2. Other examples include South Korea’s Mid- to Long-Term Master
Plan in Preparation for the Intelligent Information Society (December 2016),
the United Kingdom’s National Robotics and Autonomous Systems Strategy
(January 2017) and AI in the UK: ready, willing and able? report (April 2018),
Canada’s Pan-Canadian Artificial Intelligence Strategy (March 2017), France’s
National Artificial Intelligence Research Strategy (March 2017) and National
Strategy for AI (March 2019), China’s National AI Strategy (July 2017), United
Arab Emirates’ Strategy for Artificial Intelligence (October 2017), Germany’s
National Strategy for AI (November 2018), Singapore’s Proposed Model
Artificial Intelligence Governance Framework (January 2019), and Japan’s
AI Strategies 2019 (June 2019), among many others. Non-governmental
organizations have also been active in issuing guidelines and frameworks
on ethical development of AI. See, e.g., IEEE, Ethically Aligned Design
(2017); Microsoft, The Future Computed (2018).
3. Mina Hanna, We Don’t Need More Guidelines or Frameworks on
Ethical AI Use. It’s Time for Regulatory Action, BRINK News (July 25, 2019),
https://www.brinknews.com/we-dont-need-more-guidelines-or-frameworks
will favor a policy and standards governance approach over a more onerous
command-and-control-type regulatory agency rulemaking approach leading
to regulations (which the Trump administration often refers to as ‘barriers’)”).
17. Jory Heckman, NIST Sets AI Ground Rules for Agencies With-
out “Stifling Innovation,” Federal News Network (Aug. 22, 2019,
5:58 pm), https://federalnewsnetwork.com/artificial-intelligence/2019/08/
nist-sets-ai-ground-rules-for-agencies-without-getting-over-prescriptive.
18. Id.
19. H.R. Res. 153, 116th Cong. (2019). For an overview of the recent
developments in AI, see Kay Firth-Butterfield & Yoon Chae, Artificial Intel-
ligence Collides with Patent Law, World Economic Forum (2018), http://
www3.weforum.org/docs/WEF_48540_WP_End_of_Innovation_Protect
ing_Patent_Law.pdf.
20. Id.
21. Assemb. Con. Res. 215, 2018 Reg. Sess. (Cal. 2018) (enacted).
22. Id.
23. Asilomar AI Principles, Future of Life Institute, https://future
oflife.org/ai-principles (last visited Nov. 12, 2019).
24. Id.
25. Yoon Chae, Recent Trends in California’s AI Regulation Efforts,
Law360 (July 11, 2019), https://www.law360.com/articles/1176167/
recent-trends-in-california-s-ai-regulation-efforts.
26. Asilomar AI Principles, supra note 23.
27. Algorithmic Accountability Act of 2019, S. 1108, H.R. 2231, 116th
Cong. (2019); see also Yoon Chae, supra note 25.
28. Sneha Revanur, In a Historic Step Toward Safer AI, Democratic
Lawmakers Propose Algorithmic Accountability Act, Medium (Apr. 20,
2019), https://medium.com/@sneharevanur/in-a-historic-step-toward-
safer-ai-democratic-lawmakers-propose-algorithmic-accountability-act-
86c44ef5326d.
29. Algorithmic Accountability Act of 2019, supra note 27, at § 2(2) and
3(b) (emphases added). The bill would also require the covered entities to
conduct data protection impact assessments on their “high-risk information
systems.” See id. at § 2(6) and 3(b).
30. Id. at § 3(b)(1)(C).
31. Id. at § 3(b)(1)(D).
32. Id. at § 2(1) (emphases added). “High risk” automated decision
systems would include those that: (1) pose a significant risk to the privacy
or security of personal information of consumers, or of resulting in or con-
tributing to inaccurate, unfair, biased, or discriminatory decisions affecting
consumers; (2) make or facilitate human decision-making based on system-
atic and extensive evaluations of consumers (including attempts to analyze/
predict sensitive aspects of their lives) and alter legal rights of consumers or
otherwise affect consumers; (3) involve the personal information of consumers
https://www.wbur.org/bostonomix/2019/06/28/somerville-bans-govern
ment-use-of-facial-recognition-tech. For further discussion of state and local
regulatory development on facial recognition technology, see Gibson Dunn,
Artificial Intelligence and Autonomous Systems Legal Update
(2Q19) 5-6, July 23, 2019.
76. See Illinois Biometric Information Privacy Act, 740 ILCS 14/1; Tex.
Bus. & Com. Code § 503.001.
77. Wash. Rev. Code § 19.375 et seq.
78. CCPA will be codified at Cal. Civ. Code § 1798.100 to 1798.198.
79. An Act relative to consumer data privacy, S. 120, 191st Leg. (Mass.
2019).
80. Cal. Bus. & Prof. Code § 17940 et seq.
81. See Noam Cohen, Will California’s New Bot Law Strengthen
Democracy?, New Yorker (July 2, 2019), https://www.newyorker.com/
tech/annals-of-technology/will-californias-new-bot-law-strengthen-
democracy; Jamie Williams & Jeremy Gillula, Victory! Dangerous Ele-
ments Removed from California’s Bot-Labeling Bill, Electronic Frontier
Foundation (Oct. 5, 2018), https://www.eff.org/deeplinks/2018/10/
victory-dangerous-elements-removed-californias-bot-labeling-bill.
82. See Cal. Bus. & Prof. Code § 17941(a) (“It shall be unlawful for any
person to use a bot to communicate or interact with another person in Cali-
fornia online, with the intent to mislead the other person about its artificial
identity for the purpose of knowingly deceiving the person about the content
of the communication in order to incentivize a purchase or sale of goods or
services in a commercial transaction or to influence a vote in an election. A
person using a bot shall not be liable under this section if the person discloses
that it is a bot.”).
83. Id. at § 17940(a).
84. Id. at § 17941(a)-(b).
85. Id. at § 17941(b)-(c); see also Harry Valetk & Brian Hengesbaugh,
California Now Has a New Bot Disclosure Law—Will Other States Follow Suit?,
b:INFORM (July 26, 2019), http://www.bakerinform.com/home/2019/7/26/
california-now-has-a-new-bot-disclosure-law.
86. Cal. Bus. & Prof. Code § 17942(c) (“This chapter does not impose a
duty on service providers of online platforms, including, but not limited to,
Web hosting and Internet service providers.”).
87. See Harry Valetk & Brian Hengesbaugh, supra note 85 (“Violating
the B.O.T. Act’s disclosure requirements could constitute a violation of Cali-
fornia’s Unfair Competition Act (Cal. Code BPC § 17206), which includes a
civil penalty of up to USD 2,500 for each violation. Accordingly, each ‘like,’
‘share,’ or ‘re-tweet’ performed by a bot without a proper disclosure could
result in a USD 2,500 penalty”).
88. See Jamie Williams & Jeremy Gillula, supra note 81.
89. See id.
90. Anti-Eavesdropping Act, A.B. 1395, 2019 Reg. Sess. (Cal. 2019),
status available at https://leginfo.legislature.ca.gov/faces/billTextClient
.xhtml?bill_id=201920200AB1395 (last amended in Senate on June 26, 2019).
91. Id. at § 3 (proposing amendment of Cal. Bus. & Prof. Code
§ 22948.20(a)).
92. Id. at § 3 (proposing amendment of Cal. Bus. & Prof. Code
§ 22948.20(b)(1)-(3)).
93. Id. at § 3 (proposing amendment of Cal. Bus. & Prof. Code
§ 22948.20(d)).
94. Artificial Intelligence Video Interview Act, H.B. 2557, 101st Gen.
Assemb., 2019 Reg. Sess., Public Act 101-0260 (Ill. 2019).
95. Id. at § 5(1)-(3).
96. Id. at § 10 and 15.
97. Theodore F. Claypoole & Dominic Dhil Panakal, Illinois Law
Regulates Use of AI in Video Interviews, The National Law Review (Aug.
15, 2019), https://www.natlawreview.com/article/illinois-law-regulates-use-
ai-video-interviews; see also Justin Steffen & Heather Renée Adams, Hiring
ex Machina: Preparing for Illinois’s Artificial Intelligence Video Interview Act,
Legaltech News (Sept. 23, 2019, 7:00 AM), https://www.law.com/legaltech
news/2019/09/23/hiring-ex-machina-preparing-for-illinoiss-artificial-intel
ligence-video-interview-act (providing that the Act is silent on “whether job
candidates have a private right of action for violations of the Act”).
98. Theodore F. Claypoole & Dominic Dhil Panakal, supra note 97.
99. FUTURE of Artificial Intelligence Act of 2017, S. 2217, H.R. 4625,
115th Congress (2017).
100. Ben Kelly & Yoon Chae, supra note 44.
101. See GrAITR Act of 2019, H.R. 2202, 116th Congress (2019); AI-IA,
S. 1558, 116th Congress (2019).
102. GrAITR Act of 2019, supra note 101, at § 101(6)(F); AI-IA, supra
note 101, at § 101(6)(F).
103. GrAITR Act of 2019, supra note 101, at § 301(b); AI-IA, supra note
101, at § 301(b).
104. AI in Government Act of 2019, S. 1363, H.R. 2575, 116th Congress
(2019).
105. Id. at § 3(b)(5) and 4(a)(3).
106. Personal Data Prot. Comm’n Singapore, A Proposed Model
Artificial Intelligence Governance Framework 3.3 (2019).
107. Id. at 3.4.1-2.
108. World Econ. Forum, Draft Guidelines for AI Procurement
12 (2019).
109. See id. at 13.
110. AI Fairness 360, IBM Developer (Nov. 14, 2018), https://developer
.ibm.com/open/projects/ai-fairness-360; see also NIST Plan, supra note 13,
at App. III (AI-Related Tools).
1. Abstract
This article discusses the state of U.S. regulation of artificial intelligence (“AI”),
covering legislative instruments directed to algorithmic accountability, the use of
facial recognition technology and associated data, and promoting “transparency”
when using AI, as well as pending federal bills on general governance or
research issues for AI, among other things. The author also discusses practical
considerations when using AI for businesses and in-house counsel.
- Senate and House bills for the Algorithmic Accountability Act were
introduced in Congress.
- The intention behind the bill was to require “companies to regularly
evaluate their tools for accuracy, fairness, bias, and discrimination.”
- It is the first federal legislative effort to regulate AI across industries, with
broad proposed protections that mirror the General Data Protection
Regulation (“GDPR”) in several ways.
- Why did you find the points interesting? (Have they been addressed locally in Malaysia?)
Because it imposes a legal obligation on manufacturers to be more cautious and conscious of their products' algorithmic risks concerning accuracy, fairness, bias, discrimination, privacy, and security, instead of simply blaming flawed algorithms, which remain difficult to trace (the AI black-box phenomenon) when the systems produce errors.
- Yes. As the regulation of AI is still in its infancy and most existing law is too general, this bill would definitely benefit the legal realm because it regulates algorithmic accountability in AI with more specific guidelines for manufacturers to apply and be held accountable under.
- Why did you find the points interesting? (Have they been addressed locally in Malaysia?)
- These issues have not yet been addressed in Malaysia, but it would be good to adopt such laws here, since the local situation is different. In Malaysia, especially in applications such as Telegram, we often use bots to reduce search effort because they act on our commands, without realising that a bot could pose as a person, chat with us, and extract information for commercial or political purposes. Many AI products now also have features that can process, listen to, and record what we say, so these laws would protect our information from being used or disclosed for any purpose we have not consented to. They would also protect people under the new norm of job interviews conducted via recorded video sent to the relevant organisations; such a law would be useful because the recipients might use our information without consent, which is worrying given that this practice is now common and we have no laws to protect ourselves if problems arise.
- Definitely. All of these features, namely bots, smart speakers that identify and act on our commands, and systems that extract information from analysed video or pictures, are now used in a great deal of technology in Malaysia. If Malaysia adopted such laws, many of the associated risks and issues could be prevented, since the laws specify how they are to be applied and to what extent, and they cover most of the features now in use worldwide, including in Malaysia. These laws would therefore come in handy and be very useful.
- The bill would generally prohibit ‘covered entities’ from using “facial
recognition technology to collect facial recognition data” of end users
without providing notice and obtaining their consent.
- Covered entities would also be prohibited from using the technology to discriminate against consumers, from repurposing the facial data, and from sharing such data with third parties without obtaining further consent from the end users.
- Why did you find the points interesting? (Have they been addressed locally in Malaysia?)
- Because the bill provides detailed definitions: a "covered entity" includes any non-governmental entity that "collects, stores, or processes facial recognition data," such as app operators that collect facial data, businesses that use the technology on their premises, and outside vendors that process such data for the original data collectors.
- The bill also requires the covered entities to conduct meaningful human
review before making any final decision based on the output of the
technology if it can result in a reasonably foreseeable harm or be “highly
offensive” to a reasonable end user.
- Besides, the bill also contains some notable exceptions. The bill exempts
“security applications” that use the technology for “loss prevention” or “to
detect or prevent criminal activity.” The bill also exempts products or
services “designed for personal file management or photo or video sorting
or storage if the facial recognition technology is not used for unique
personal identification of a specific individual,” involves “identification of
public figures for journalistic media created for public interest,” “involves
identification of public figures in copyrighted material,” or is used in an
emergency.
- As far as this bill is concerned, Malaysia has not yet introduced any bill related to facial recognition, specifically regarding users' privacy. If Malaysia followed the development of the US by implementing such a law, our country would have a proper legal framework for tackling any issues arising from facial recognition.
3. Concluding Remarks
- Most of the laws discussed in this article could be applied and emulated in Malaysia, since most of the technology currently used in Malaysia has the same features as those discussed in the article. It is therefore possible for Malaysia to experience similar issues; in fact, such cases already exist, but because there is no law governing them, nothing can be done about the perpetrator or for the victim. By adopting such laws, Malaysia could prevent risks and issues that have already occurred in other countries from happening here. Moreover, with specific and certain regulations, those accountable for such systems may be held liable for any risk or issue arising from the AI.
THE DEPLOYMENT OF ARTIFICIAL INTELLIGENCE TOOLS IN THE
HEALTH SECTOR: PRIVACY CONCERNS AND REGULATORY ANSWERS
WITHIN THE GDPR
Mirko Forti
mirko.forti@santannapisa.it
TABLE OF CONTENTS
I. INTRODUCTION
However, opening the black box would mean showing the functioning
mechanism of the algorithm to market competitors, which could stifle
innovation unless more transparent forms of AI became eligible for patent
protection.7 Moreover, making the working processes of this software
detectable would be technically impossible: AI algorithms continuously
change their working routine to follow new patterns and perform tasks in an
ever more efficient way.8 Changing their computing patterns allows them to
produce more reliable diagnostic outcomes. Accordingly, a human observer
would not be able to understand the reasoning processes of AI tools and their
relationship to the data processed, even if these were visible. Ultimately, this
lack of interpretability9 can result in a lack of trust in the effective
5 Robin C. Feldman and others, 'Artificial Intelligence in the Health Care Space:
How We Can Trust What We Cannot Know' (2019) 30 Stanford Law and Policy
Review 399.
6 Yavar Bathaee, 'The Artificial Intelligence Black Box and the Failure of Intent
and Causation' (2018) 31 Harvard Journal of Law and Technology 889.
7 See Ana Ramalho, 'Patentability of AI-Generated Inventions: Is a Reform of the
Patent System Needed?' (2018) <https://papers.ssrn.com/sol3/papers.cfm?
abstract_id=3168703> accessed 19 November 2020.
8 Jenna Burrell, 'How the Machine "Thinks": Understanding Opacity in Machine
Learning Algorithms' (2016) 1 Big Data and Society 10.
9 Feldman (n 5); William J. Murdoch and others, 'Definitions, Methods and
Applications in Interpretable Machine Learning' (2019) 116 PNAS 22071.
32 European Journal of Legal Studies (Vol. 13 No. 1)
Can the current European legal framework adequately address the main
privacy-related issues that arise from the use of AI software for diagnostic
purposes? More specifically, can the European Regulation foster the
development of predictive medicine and, at the same time, protect the rights
of patients involved in the medical treatments? Starting from previous
scholarship, such as the work of Ann Cavoukian regarding the principle of
privacy by design,10 this article seeks to find normative solutions within the
GDPR to address the deployment of AI tools in the medical sector.
Furthermore, it attempts to find a point of balance between two opposing
interests: on the one hand, the privacy rights of every individual and, on the
other hand, the general interest to stimulate scientific and medical research
and, in more general terms, the right to public health.11
The article begins by addressing the most relevant norms of the GDPR to
highlight the privacy-related issues arising from the deployment of AI tools
in the medical sector. More specifically, it focuses on the main legal
uncertainties arising from incompatibilities between the use of AI tools for
predictive medicine and norms like the principles of transparency, fairness
and lawfulness, the issue of free, informed and specific consent, the right to
be forgotten, and the prohibition of automated decisions. The article then
provides a few reflections about the main privacy threats raised by the
development of predictive medicine and how to overcome them. The
inherent opaqueness of AI algorithms may present a challenge for the
transparent and lawful functioning of predictive medicine. However, the
concepts of privacy by design and privacy by default12 could represent the
The recent approval and entry into force of the GDPR established new
privacy standards for data protection at a European level.13 While in one
respect the GDPR actually reduces obligations for data controllers regarding
access to clinical data compared to previous legislation, it also limits
utilisation of health data without consent, regulates its secondary use (that is,
use of data for purposes beyond those originally planned),14 and requires data
processing activities in all fields, including predictive medicine, to comply
with certain fundamental principles. The principles of fairness, transparency
and lawfulness are the cornerstones of the current European privacy legal
framework.15 They form the main thread uniting all processing activities and
ensure the protection of the fundamental rights of people involved.
13 Mélanie Bourassa Forcier and others, 'Integrating Artificial Intelligence into
Health Care through Data Access: Can the GDPR Act as a Beacon for
Policymakers?' (2019) Journal of Law and the Biosciences 317.
14 William Lowrance, 'Learning from the Experience: Privacy and the Secondary
Use of Data in Health Research' (2003) 8 Journal of Health Services Research &
Policy 2.
15 GDPR, art 5.1.
16 European Union Agency for Fundamental Rights and Council of Europe,
Handbook on European Data Protection Law (Publications Office of the European
Union 2018) 118.
19 Gunnar B.J. Andersson and others, 'Do Not Harm: The Balance of "Beneficence"
and "Non-Maleficence"' (2010) 35(9S) Spine S2; Vittorio Bufacchi, 'Justice as
Non-Maleficence' (2020) 67(162) Theoria 1.
20 Melissa D McCradden and others, 'Ethical Limitations of Algorithmic Fairness
Solutions in Health Care Machine Learning' (2020) 2 The Lancet Digital Health
E221.
21 Ruha Benjamin, 'Assessing Risk, Automating Racism' (2019) 366(6464) Science
421.
22 McCradden (n 20).
23 Michael Marmot, 'Social Determinants of Health Inequalities' (2005) 365(9464)
The Lancet 1099.
24 Nanette K. Wenger, 'Cardiovascular Disease: The Female Heart Is Vulnerable. A
Call to Action from the 10Q Report' (2012) 35 Clinical Cardiology 134.
2021] Deployment of Artificial Intelligence Tools in the Health Sector 35
In the medical sector, the goal is to make the diagnostic routine more user-
centric, protecting the identity rights of the patients. New technological
approaches could help.27 For instance, so-called federated learning (a
machine learning technique in which a model is trained across decentralized
devices or servers, each assembling and keeping its own local dataset, so that
raw data never leaves its source) facilitates collaborations across
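The federated idea can be illustrated with a minimal sketch (assumptions: Python with NumPy, three simulated "hospitals", and a toy linear model standing in for a diagnostic model; all data and numbers are invented for the example):

```python
# Sketch of federated averaging (FedAvg): each simulated hospital trains a
# local linear model on its own private data, and only the model weights,
# never the raw records, are shared and averaged into a global model.
import numpy as np

rng = np.random.default_rng(0)

def local_train(X, y, w, lr=0.1, epochs=50):
    """Plain gradient descent on a local least-squares objective."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(datasets, w_global):
    """One round: broadcast the global weights, train locally, average."""
    local_weights = [local_train(X, y, w_global.copy()) for X, y in datasets]
    return np.mean(local_weights, axis=0)

# Three hospitals with private data drawn from the same underlying
# relationship y = 2*x1 - 1*x2 (invented for the example).
true_w = np.array([2.0, -1.0])
datasets = []
for _ in range(3):
    X = rng.normal(size=(40, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=40)
    datasets.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(datasets, w)
# w now approximates true_w without any site revealing its records
```

Only the locally trained weights are exchanged and averaged; the raw patient records never leave the simulated hospitals, which is the privacy property the article points to.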
34 GDPR, art 9.
35 GDPR, art 4.11; Mary Donnelly and Maeve McDonagh, 'Health Research, Consent
and the GDPR Exemption' (2019) 26 European Journal of Health Law 97.
36 European Data Protection Board, 'Guidelines 05/2020 on Consent under
Regulation 2016/679' (4 May 2020) <https://edpb.europa.eu/sites/edpb/files/files/
file1/edpb_guidelines_202005_consent_en.pdf> accessed 20 November 2020.
37 GDPR, art 7.
38 European Data Protection Board (n 36).
39 Article 29 Working Party, Guidelines on Transparency under Regulation
2016/679 (11 April 2018) 17/EN WP260 rev.01 <https://ec.europa.eu/newsroom/
article29/item-detail.cfm?item-id=623051> accessed 2 March 2020.
40 On the issue of capacity to consent, see Michelle Biros, 'Capacity, Vulnerability,
and Informed Consent for Research' (2018) 46 The Journal of Law, Medicine &
Ethics 72.
41 Chassang (n 32); Niamh Clarke and others, 'GDPR: An Impediment to Research?'
(2019) 188 Irish Journal of Medical Science 1129; Miranda Mourby and others,
'Governance of Academic Research Data under the GDPR - Lessons from the
UK' (2019) 9 International Data Privacy Law 192.
52 GDPR, recital 4.
53 GDPR, art 6.4.
54 Sandra Wachter and others, 'Counterfactual Explanations Without Opening the
Black Box' (2018) 31 Harvard Journal of Law & Technology 841.
In a time of new approaches to data protection, the GDPR remains the 'gold
standard' in the European framework. However, the GDPR reflects an era
when AI had not yet reached the current levels of technological development
and, more specifically, the field of predictive medicine was in its infancy. As
a result, the Regulation is not fully compatible with the inherent features of
machine learning tools. The resulting legal uncertainty could obstruct the
development of AI technologies, increasing costs and reducing benefits for
users and patients.61 The EU must act to bridge this gap and fully regulate the
use of AI tools in everyday life, including the medical sector, and achieve a
uniform and coherent policy approach regarding AI matters. Encouragingly,
the European Commission has acknowledged the threats to privacy posed by
machine learning applications and cleared the way for adjusting relevant EU
legislative frameworks.62
61 Ibid.
62 European Commission, 'White Paper on Artificial Intelligence - A European
Approach to Excellence and Trust' COM (2020) 65 final.
GDPR norms are often vague and undefined,63 which may be a normative
choice to keep pace with ongoing technological progress: specific
mandatory requirements aimed at AI tools could become rapidly obsolete. It
is necessary instead to create a trustworthy environment for developing AI
applications for healthcare purposes with patient privacy rights in mind. The
GDPR already provides valuable instruments, such as the privacy by design
principle,64 that could help reach this goal. Embedding data protection
features in diagnostic routines would help overcome the black-box barrier
of algorithms. Software providers would train their AI tools to respect
privacy rules from the very first phases of their working patterns, securing
lifecycle protection for the user during the entire duration of data processing
activities. The patient would become the focus of the diagnostic process.
Scientists and physicians would coordinate the work of AI algorithms.
Humans, not machines, would control the entire process. This would ensure
full respect for the human rights of every individual involved, resolving the
tension between the privacy rights of the individual and the public health
needs of society.
63 Ibid.
64 Cavoukian (n 10).
TUTORIAL WEEK 8 - ARTICLE READING
1. Background of Article
● TITLE
- THE DEPLOYMENT OF ARTIFICIAL INTELLIGENCE TOOLS IN THE HEALTH
SECTOR: PRIVACY CONCERNS AND REGULATORY ANSWERS WITHIN
THE GDPR
● JOURNAL
- European Journal of Legal Studies (Vol 13, No 1)
HEINONLINE
● NAME OF AUTHOR
- Mirko Forti
2. Abstract
● Background Information
- This article aims to examine the privacy and data security issues of
deploying machine learning algorithms in the medical sector.
● Objective
- Addressing privacy-related issues arising from the deployment of AI tools
in the medical sector
- Focusing on the main legal uncertainties arising from incompatibilities
between the use of AI tools for predictive medicine and norms such as the
principles of transparency, fairness and lawfulness; the issue of free,
informed and specific consent; the right to be forgotten; and the
prohibition of automated decisions
- Providing reflections about the main privacy threats raised by the
development of predictive medicine and how to overcome them.
● Principles of law/legislation
- General Data Protection Regulation (GDPR)
- Concerning privacy and data protection
hospital in Malaysia in order to provide faster COVID-19 diagnosis.
● The question is whether Malaysia's use of AI tools in the medical sector
addresses the principle of transparency discussed above. The answer is yes.
The World Health Organization (WHO) has issued formal guiding principles
for the proper design and use of AI in the healthcare industry.
● There are six guiding principles that form the basis for AI regulation and
governance for its member countries. One of them, Principle 3, concerns
ensuring transparency, explainability, and intelligibility. It states that
"transparency requires that sufficient information be published or documented
before the design or deployment of an AI technology. Such information must be
easily accessible and facilitate meaningful public consultation and debate on how
the technology is designed and how it should or should not be used."
● Moreover, the issue of protection of health data in Malaysia is also primarily
governed by the Personal Data Protection Act 2010 (PDPA 2010) and its
subsidiary legislation. The Act is designed to protect personal data by imposing
certain requirements on data users and conferring certain rights on the data
subject in relation to his personal data.
● It is worth noting that the PDPA 2010 does not define the terms "consent" or
"explicit consent", nor does it provide any minimum standard or formalities
for the consent that must be obtained. However, data users are reminded that
they must preserve a record of data subjects' consents. The consent
requirement that the PDPA 2010 imposes on health data demonstrates that our
present legislation is consistent with the WHO's AI Principle 1, which reads:
"Protecting human autonomy: In the context of health care, this means that
humans should remain in control of health-care systems and medical decisions;
privacy and confidentiality should be protected, and patients must give valid
informed consent through appropriate legal frameworks for data protection."
Point 2
● The other thing that caught our attention in this article is the right to be
forgotten: how can an AI algorithm forget its memory?
● Article 17 of the GDPR provides a right to be forgotten. The so-called right to be
forgotten was formulated by the Court of Justice of the EU in its judgment in the
Google Spain case.
● The right to be forgotten applies in different circumstances, such as when the
data are no longer needed for the original purposes or when data subjects
withdraw their consent. For example, if a patient wants his medical data to be
erased, the data controller must erase the data from the AI system's domain and
also remove any links related to it.
● Technically, AI algorithms need to be fed and trained with a huge amount of data
to improve their processing capabilities and produce better outcomes. For
example, physicians feed medical data to the algorithms to train the AI, which
acquires new information that increases the algorithms' overall knowledge.
● I would say that the provision stipulated under Article 17 of the GDPR, the right
to be forgotten, is a good thing and in line with the human right to privacy and
control over one's data. However, with regard to AI systems, it could radically
affect the algorithms' ability to produce outcomes, because AI needs such
datasets to make better decisions, and without this data patients may
potentially be harmed.
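The tension these notes describe can be made concrete with a small sketch. This is a naive baseline, not a real compliance mechanism: the controller simply drops the requesting patient's records and retrains on what remains (the patient IDs, records, and nearest-centroid "model" below are all invented; production systems may need dedicated machine-unlearning techniques):

```python
from statistics import mean

# Toy training set of (patient_id, feature, label) tuples; entirely invented.
records = [
    ("p1", 2.0, 1), ("p2", 3.5, 0), ("p1", 1.0, 1),
    ("p3", 4.0, 0), ("p2", 2.5, 1),
]

def train(data):
    """A 1-D nearest-centroid classifier: one mean feature per class."""
    pos = mean(x for _, x, y in data if y == 1)
    neg = mean(x for _, x, y in data if y == 0)
    return lambda x: 1 if abs(x - pos) <= abs(x - neg) else 0

def erase(data, patient_id):
    """Article 17 in its simplest reading: drop the subject's records."""
    return [r for r in data if r[0] != patient_id]

model_before = train(records)      # trained while p1's data was present
records = erase(records, "p1")     # p1 withdraws consent
model_after = train(records)       # retrain without p1's data
```

As the notes observe, the retrained model has less data to learn from, which is exactly the accuracy-versus-privacy trade-off that Article 17 imposes on AI systems.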
Point 3
● Even though the GDPR was introduced, there is still a gap to be filled: the
regulation is not fully compatible with the inherent features of various machine
learning tools. Therefore, the EU needs to update the current law and ensure its
uniformity so that it keeps pace with the development of technology and
provides protection for users.
● GDPR norms are often vague and undefined, which may be a normative choice
intended to keep pace with ongoing technological progress.
● It is important to ensure that, alongside the advancement of AI for healthcare
purposes, the healthcare system still respects human privacy.
This article will be beneficial to the legal realm, as it can serve as a guideline for
resolving the tension between protecting human rights, specifically data privacy,
and fulfilling the needs of society. It can also help ensure that the existing gaps
are filled.
TAKING AI PERSONALLY: HOW THE E.U. MUST LEARN
TO BALANCE THE INTERESTS OF PERSONAL DATA
PRIVACY & ARTIFICIAL INTELLIGENCE
Matthew Humerick†
† Matthew Humerick holds a J.D. from Michigan State University College of Law (class of 2018)
and will sit for the Massachusetts Bar in July of 2018.
394 SANTA CLARA HIGH TECH. L.J. [Vol. 34
INTRODUCTION
1. David Meyer, Vladimir Putin Says Whoever Leads in Artificial Intelligence Will Rule
the World, FORTUNE (Sept. 4, 2017), http://bit.do/Meyer_Putin (citing a recent quote by Vladimir
Putin when discussing the concept of artificial intelligence and its impact on the world).
2. See Maureen Dowd, Elon Musk's Billion-Dollar Crusade to Stop the AI Apocalypse,
VANITY FAIR (Apr. 2017), http://bit.do/DowdElon-Musk. Elon Musk, arguably one of the
greatest innovators of the 21st Century, has openly combatted the rise of AI along with other
highly-regarded minds, such as Stephen Hawking and Bill Gates. Id. This article discusses Musk's
concerns over the development of AI and how its ability to learn from humans, only to outthink
them, casts a grim outlook on humanity's future. Id.
3. GDPR Portal: Site Overview, EU GDPR, http://bit.do/EUGDPR (last visited Oct. 2,
2017) [hereinafter GDPR Portal].
4. See Jordan Novet, Everyone Keeps Talking About AI - Here's What It Really Is and
Why It's So Hot Now, CNBC (June 17, 2017), http://bit.do/Novet_AI (citing the first scholarly
definition of AI, as defined in 1956 by John McCarthy, a math professor at Dartmouth College,
who defined AI as "every aspect of learning or any other feature of intelligence [that] can in
principle be so precisely described that a machine can be made to simulate it"). Nowadays, AI
has developed to include terms, such as "machine learning" and "deep learning," which describe
the complexity of technological processes used to collect and utilize data. Id.
5. See Guide to the General Data Protection Regulation (GDPR), INFO. COMM'R'S
OFFICE, http://bit.do/ICO_Overview (last visited Oct. 4, 2017).
GDPR codifies several E.U. consumer rights in PII that cannot coexist
with AI, including: the requirement of explicit consent, the right of
erasure, the right to explanation of automated decisions, and data
portability rights.6 While these aforementioned rights are only afforded
to E.U. citizens, the territorial scope of the GDPR applies to those
processing this protected data, wherever they may be located or
established. As a result, business leaders and legislatures across the
globe must address these compliance issues to determine whether they
can lawfully sustain AI development under the GDPR. To remain
competitive within the AI field, the E.U. must find a way to balance its
interest in data privacy against its interests in the advancement of AI.
Part I of this Comment provides an overview of what AI is and
how it utilizes personal data to develop. Contained within Part I is also
a brief overview of the current international AI situation and how the
E.U.'s relevance is rapidly declining. Part II discusses data privacy and
AI law within the European Union, and how new regulations may
adversely impact sustained algorithmic development under the current
AI model. Part III proposes two courses of action on how the E.U.
should adapt its views on data privacy and protection to better facilitate
the use and development of AI.
At its most basic level, AI is "a system that can learn how to
learn."8 Humans write initial algorithms for a system that enables the
computer to subsequently write its own algorithms, without additional
human oversight or interaction.9 This process allows AI to
continuously learn from, and solve new problems within, an ever-
changing environment, based on its continuing collection of data.10
While AI may seem straightforward by this definition, there are
6. See Bernard Marr, New Report: Revealing the Secrets of AI or Killing Machine
Learning?, FORBES (Jan. 12, 2017), http://bit.do/Marr_New-Report; see also Rand Hindi, Will
Artificial Intelligence Be Illegal in Europe Next Year?, ENTREPRENEUR (Aug. 9, 2017),
http://bit.do/Hindi_AI-illegal.
7. See Regulation 2016/679, of the European Parliament and of the Council of 27 April
2016 on the protection of natural persons with regard to the processing of personal data and on
the free movement of such data, and repealing Directive 95/46/EC (General Data Protection
Regulation), arts. 1-3, 2016 O.J. (L 119) 32, 33 [hereinafter GDPR].
8. FRANCESCO COREA, ARTIFICIAL INTELLIGENCE AND EXPONENTIAL TECHNOLOGIES:
BUSINESS MODELS EVOLUTION AND NEW INVESTMENT OPPORTUNITIES 1-2 (2017).
9. Id. at 2.
10. Id. "While human being actions proceed from observing the physical world and
deriving underlying relationships that link cause and effect in natural phenomena, an artificial
intelligence is moved entirely by data and has no prior knowledge of the nature of the relationship
among those data." Id.
2018] TAKING AI PERSONALLY 397
11. Big Data, GARTNER: IT GLOSSARY, http://bit.do/Gartner_Big-Data (last visited Apr.
28, 2018).
12. See Big Data, Artificial Intelligence, Machine Learning, and Data Protection, INFO.
COMM'R'S OFFICE, 1, 6 [hereinafter Big Data]. But see Sean Jackson, Big Data in Big Numbers
- It's Time to Forget the "Three V's" and Look at Real-World Figures, COMPUTING (Feb. 18,
2016), http://bit.do/Jackson_3Vs (redefining "big data" in a way that emphasizes its presence here
and now, rather than as a number or size).
13. See Jacob Morgan, A Simple Explanation of 'The Internet of Things', FORBES (May
13, 2014), http://bit.do/Morgan_Simple-Explanation ("Simply put, this is the concept of basically
connecting any device with an on and off switch to the Internet (and/or to each other). This
includes everything from cellphones, coffee makers, washing machines, headphones, lamps,
wearable devices and almost anything else you can think of. This also applies to components of
machines, for example a jet engine of an airplane or the drill of an oil rig.").
14. See Ian Murphy, Could AI Lead to Breaches of GDPR?, ENTERPRISE TIMES (June 21,
2017), http://bit.do/Murphy_AI.
15. See Deb Miller Landau, Artificial Intelligence and Machine Learning: How Computers
Learn, IQ (Aug. 17, 2016), http://bit.do/Landau_AI. Through machine learning, AI undergoes a
three-step process: "[S]tep one is perceiving the world, using data to detect patterns. Step two is
recognizing those patterns, and step three is taking an action based on that recognition." Id. See
generally ETHEM ALPAYDIN, INTRODUCTION TO MACHINE LEARNING passim (Thomas Dietterich
et al. eds., 2d ed. 2010).
24. See Rishi Iyengar, These Three Countries Are Winning the Global Robot Race, CNN
(Aug. 21, 2017), http://bit.do/Iyengar_3-countries.
25. Id. China's government is perhaps the largest proponent of AI development, as
demonstrated by its plans to become the world leader of AI in 2030 through a $150 billion
plan. Id. See also Sherisse Pham, China Wants to Build a $150 Billion AI Industry, CNN (July 21,
2017), http://bit.do/Pham_China.
26. See European Commission Press Release IP/18/3362, Artificial Intelligence:
Commission Outlines a European Approach to Boost Investment and Set Ethical Guidelines (Apr.
25, 2018) (detailing a recent collaborative effort amongst E.U. Member States to invest "at least
€20 billion [in AI investments and research] by the end of 2020"); see also Tania Rabesandratana,
With €1.5 Billion for Artificial Intelligence Research, Europe Pins Hopes on Ethics, SCI. (Apr.
25, 2018, 12:35 PM), http://bit.do/Rabesandratana (commenting on the E.U.'s recent plans to
invest in AI and how various issues remain unaddressed).
27. See Simon Baker, Which Countries and Universities Are Leading on AI Research,
TIMES HIGHER EDUC. (May 22, 2017), http://bit.do/Baker_AI-research; Fabian, The European
Artificial Intelligence Landscape | More Than 400 AI Companies Built in Europe, ASGARD (July
31, 2017), http://bit.do/Fabian_European-AI (showing that the majority of AI startups in the E.U.
are located within London (97), compared to the next closest city, Berlin (30)).
28. See Fabian, supra note 27 (discussing how the U.K. has "by far the strongest AI
ecosystem" in Europe, including five of Europe's top ten funded AI companies). Since 2005,
Europe has submitted the third-most AI-related patent applications, trailing behind only the U.S.
and China. China May Match or Beat America in AI, ECONOMIST (July 15, 2017),
http://bit.do/Economist_China-match-beat. At the same time, Britain has more than double the
number of AI companies of any other European country, ranking third overall, followed by
Germany at seventh. Id.
29. Alex Hunt & Brian Wheeler, Brexit: All You Need to Know About the UK Leaving the
EU, BBC (Apr. 12, 2018), http://bit.do/Hunt_Brexit.
30. See Rachel Pells, UK and Europe 'Must Join Forces' on AI Research Despite Brexit,
TIMES HIGHER EDUC. (May 3, 2018), http://bit.do/Pells_Brexit (discussing the U.K.'s expertise
reasons for and against the U.K.'s decision to part from the E.U. (not
discussed in this Comment), a commissioned report to the U.K.
Parliament projects that AI could lead to a £630 billion boon to the
U.K.'s economy by 2035.31 Presumably, the U.K. will use its exit from
the E.U. to develop its own laws for data privacy and AI in a timelier
and more efficient fashion than the E.U., so as to seize this competitive
advantage.32 Not only will Brexit call into question the GDPR's
applicability and territorial scope, but the U.K. will also use less
restrictive provisions to better balance the interests of consumer data
privacy rights with AI.33 Similar to the actions of the United States,
China, and India, the U.K. plans to balance consumer interests with AI
interests rather than overregulating one or the other.34
While Brexit may ultimately lead to a boon in the U.K.'s AI
efforts, the E.U.'s AI industry will suffer greatly from the loss of the
U.K. The E.U. should encourage AI efforts so that it remains
competitive in the technological arms race. However, the provisions of
the GDPR may be the primary inhibiting factor in the development of
AI by remaining E.U. Member States and, if unaddressed, may lead to
future departures by Member States.35 Whereas the top AI countries
have minimally regulated AI and the use of PII in big data, the E.U.'s
GDPR creates heightened standards for PII.36 Ultimately, the E.U.,
through its efforts to become the world leader in consumer data
in AI research and why it is necessary for the E.U. and the U.K. to collaborate their efforts in
order to remain competitive in the AI sector).
31. See DAME WENDY HALL & JÉRÔME PESENTI, DEPARTMENT FOR DIGITAL, CULTURE,
MEDIA & SPORT AND DEPARTMENT FOR BUSINESS, ENERGY & INDUSTRIAL STRATEGY,
GROWING THE ARTIFICIAL INTELLIGENCE INDUSTRY IN THE UK, 2017, at 1-2; Sam Shead, A New
Report Tells the Government How It Can 'Supercharge' AI, BUS. INSIDER (Oct. 14, 2017, 7:01
PM), http://bit.do/Shead_Supercharge-AI.
32. See sources cited supra note 31.
33. See sources cited supra note 31.
34. See HALL & PESENTI, supra note 31, at 5, 66-74 (arguing that while regulation is
necessary, the report recommends that the industry and government work collaboratively to
establish a U.K. AI Council that oversees, rather than curbs, the development of AI).
35. While this is only speculative, support for the E.U. has declined in nearly all of its
Member States. See Rebecca Flood, REVEALED: Which Countries Could Be Next to Leave the
EU?, EXPRESS (last updated Oct. 2, 2016, 2:34 PM), http://bit.do/Flood_Next-to-leave (quoting
a joint statement issued after the Brexit vote by the foreign ministers of prominent Member States,
"We are aware that discontent with the functioning of the EU as it is today is manifest in parts of
our societies. ... We take this very seriously and are determined to make the EU work better for
all our citizens."). Were the E.U. to incidentally hinder the use and development of AI in
comparison to the rest of the developing world, it is possible that additional Member States would
contemplate exits so as not to fall behind.
36. See Nick Wallace & Daniel Castro, Center for Data Innovation, The Impact of the
EU's New Data Protection Regulation on AI 1-4, 25-27 (2018), http://bit.do/Wallace_Impact.
37. See Charter of Fundamental Rights of the European Union, 2012 O.J. (C 326)
[hereinafter Charter]. While Article 7 of the Charter of Fundamental Rights, id. art. 7, provides a
general right to respect for one's private communications, Article 8, id. art. 8, explicitly addresses
the protection of personal data.
38. Charter, supra note 37, art. 8.
39. See Directive 95/46/EC, of the European Parliament and of the Council of 24 October
1995 on the protection of individuals with regard to the processing of personal data and on the
free movement of such data, 1995 O.J. (L 281) 31 [hereinafter DPD]. As a Directive, the DPD
held less authority over the E.U. Member States, where each could "determine more precisely the
conditions under which the processing of personal data is lawful." Id. art. 5. However, because
the DPD is only a directive, its authority was limited to setting minimum standards rather than
prescribing a uniform standard of regulation the way the GDPR will. See Courtney M.
Bowman, A Primer on the GDPR: What You Need to Know, PROSKAUER (Dec. 23, 2015),
http://bit.do/Bowman_Primer; DANIEL J. SOLOVE & PAUL M. SCHWARTZ, INFORMATION
PRIVACY LAW 849, 1135 (Erwin Chemerinsky et al. eds., 5th ed. 2015).
40. GDPR Portal, supra note 3.
41. See Sven Jacobs & Christoph Ritzer, Data Privacy: AI and the GDPR, NORTON ROSE
FULBRIGHT: DATA PROTECTION BLOG (Nov. 2, 2017), http://bit.do/Jacobs_DataPrivacy.
42. European Parliament Press Release 20170110IPR57613, Robots: Legal Affairs
Committee Calls for EU-Wide Rules (Jan. 12, 2017). While there is recognition in the E.U. that it
must promulgate AI laws, discussions have mostly centered on robotics and its potential liabilities,
not on how to develop AI.
43. GDPR, supra note 7, art. 2. Subject to a few narrow exceptions in subsection (2) of
Article 2, the GDPR's material scope is "to the processing of personal data wholly or partly by
automated means and to the processing other than by automated means of personal data which
form part of a filing system or are intended to form part of a filing system." Id. art. 2(1).
44. Id. art. 3. According to the GDPR, its scope "applies to the processing of personal data
in the context of the activities of an establishment of a controller or a processor in the Union,
regardless of whether the processing takes place in the Union or not." Id. art. 3(1). In addition, the
rest of the article expands the GDPR's territorial scope to those controllers and processors not
established in the Union generally, and to those where Member State law applies through public
international law. Id. arts. 3(2)-(3).
45. Id. art. 4(1). Not elsewhere defined in the GDPR, a "data subject" is an "identified or
identifiable natural person." Id.
46. Id. art. 4(2).
Even prior to the GDPR, the European Court of Justice (ECJ), the
E.U.'s highest court, liberally read data privacy laws where personal
data may be processed or at risk through a convoluted process. For
example, on October 19, 2016, the ECJ heard a case that involved the
German government and its practice of capturing dynamic IP
addresses.47 Here, the ECJ broadly interpreted the DPD to protect
certain IP addresses because controllers could "likely reasonably"
compare their data with a third party's separate system, which contains
identifying information, to identify individual users.48 By broadly
holding that seemingly anonymous information can be identifying if
possibly used by a third party, the ECJ demonstrated a liberal
interpretation of data protection policies. Since this broad holding
occurred under the less exhaustive DPD, it is fair to speculate that there
will be almost a presumption of violation under the more restrictive, but
broadly defined, GDPR.
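The Breyer reasoning can be illustrated with a short sketch (all names, IP addresses, and timestamps below are invented): a log that stores only dynamic IP addresses looks anonymous on its face, but joining it against a third party's separate record mapping IPs to account holders re-identifies each individual.

```python
# Access log held by the website operator: timestamps and dynamic IPs only.
access_log = [
    ("2016-10-19T10:00", "203.0.113.7"),
    ("2016-10-19T10:05", "198.51.100.4"),
]

# Separate system held by a third party (e.g. the ISP) mapping IPs to users.
login_records = {
    "203.0.113.7": "alice",
    "198.51.100.4": "bob",
}

# The "likely reasonably" comparison: a simple join re-identifies everyone.
identified = [(ts, login_records[ip]) for ts, ip in access_log]
```

This is roughly why the court treated dynamic IP addresses as personal data: the identifying link exists so long as some party could lawfully perform the join.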
In addition to its material scope, the GPDR attempts to create
international law through an expansive territorial scope. 49 According to
Article 3, the GDPR applies to controllers or processors who process
the personal data of E.U. citizens, and who are: (1) established in the
E.U.;50 (2) not established in the E.U., but the activities concern data
subjects in the E.U.;51 and, (3) to those subject to Member State law.52
While Article 3's broad territorial scope is likely to face legal
challenges by non-E.U. companies and countries prosecuted under this
47. See Case C-582/14, Breyer v. Bundesrepublik Deutschland, 2016 E.C.R. 11-779. A
dynamic IP address is an IP address assigned to a device when using a public network or internet
service that changes each time the user reconnects. Under this practice, a single user receives a
new IP address each time they reconnect.
48. Id. Here, the German government collected reported PII when users logged into its
system while a separate system captured IP addresses. These captured IP addresses were not
logged as PII; however, because an in-depth comparison of the two systems could ultimately
identify a user, the ECJ found that consumer data could be traced back to an individual.
49. See GDPR, supra note 7, art. 3.
50. Id. art. 3(1). The first territorial component of Article 3 states that the GDPR "applies
to the processing of personal data in the context of the activities of an establishment of a controller
or a processor in the Union, regardless of whether the processing takes place in the Union or not."
Id.
51. Id. art. 3(2). The second territorial provision of Article 3 states that the GDPR
applies to the processing of personal data of data subjects who are in the Union by
a controller or processor not established in the Union, where the processing
activities are related to: (a) the offering of goods or services, irrespective of
whether a payment of the data subject is required, to such data subjects in the
Union; or (b) the monitoring of their behaviour as far as their behaviour takes place
within the Union.
Id.
52. Id. art. 3(3). The final territorial provision of Article 3 states that the GDPR "applies to
the processing of personal data by a controller not established in the Union, but in a place where
Member State law applies by virtue of public international law." Id.
404 SANTA CLARA HIGH TECH. L.J. [Vol. 34
1. Right to Consent
102 of the Treaty, which are cited within the GDPR, see 2012 O.J. (C 326) 88, 89. Based on its
use, it may be inferred to mean the actions of one or a collective group. See id.
61. See GDPR, supra note 7, arts. 7, 13(f), 14(g), 15(h), 17, 20-22.
62. Id. arts. 6-7.
63. "Consent" under the GDPR is defined as "any freely given, specific, informed and
unambiguous indication of the data subject's wishes by which he or she, by a statement or by a
clear affirmative action, signifies agreement to the processing of personal data relating to him or
her." Id. art. 4(11).
64. See id. art. 7 (listing conditions on consent, such as how controllers must prove that
consent was freely given).
65. Id. art. 8 (Under the standards of Article 8, consent for children refers to "the offer of
information society services directly to a child"). The GDPR cross-references the definition of the
term "information society services" to Directive (EU) 2015/1535 of the European Parliament and
of the Council, id. art. 4(25). This Directive defines the term broadly as "any service normally
provided for remuneration, at a distance, by electronic means and at the individual request of a
recipient of services." Directive 2015/1535, of the European Parliament and of the Council of 9
September 2015 laying down a procedure for the provision of information in the field of technical
regulations and of rules on Information Society services, art. 1(b), 2015 O.J. (L 241) 1, 3
[hereinafter Information Society].
66. See GDPR, supra note 7, art. 8(1). If a child is below the minimum age, only "the
holder of parental responsibility over the child" may authorize consent. Id.
67. Id. art. 8(2).
73. See GDPR, supra note 7, art. 17. In addition to a right to erasure, data subjects also
have a right to the restriction of processing, id. art. 18, which, while not as potentially catastrophic,
will lead to the same or similar cumbersome results.
74. Id. art. 17. A controller must erase personal data when any of the following applies:
(a) the personal data are no longer necessary in relation to the purposes for
which they were collected or otherwise processed;
(b) the data subject withdraws consent on which the processing is based
according to point (a) of Article 6(1), or point (a) of Article 9(2), and
where there is no other legal ground for the processing;
(c) the data subject objects to the processing pursuant to Article 21(1) and
there are no overriding legitimate grounds for the processing, or the data
subject objects to the processing pursuant to Article 21(2);
(d) the personal data have been unlawfully processed;
(e) the personal data have to be erased for compliance with a legal obligation
in Union or Member State law to which the controller is subject;
(f) the personal data have been collected in relation to the offer of
information society services referred to in Article 8(1).
Id. art. 17(1).
75. Id. art. 17(2).
(2) the right to then transmit this data to another controller, without
hindrance.80 These rights allow data subjects to facilitate the spread of
information, which may ease data collection efforts by smaller
controllers. On the other hand, the right to portability poses similar
problems to those inherent in the rights to consent and erasure.81
The right to portability requires that controllers maintain
processes to identify and isolate an individual's PII.82 This, and the
requirement to provide a structured report to the data subject, are fairly
simple tasks. The second right, embedded within Article 20, will
potentially cause issues for controllers: the right to transmit data to
another controller.83 In addition to the ability to exercise the right to be
forgotten, and all its associated issues,84 the data subject may now
require controllers to relinquish their competitive advantages.
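The "fairly simple" half of the portability obligation is often satisfied with a JSON export, since Article 20 requires only a "structured, commonly used and machine-readable format" without prescribing an implementation. A minimal sketch in Python; the record layout and function name are illustrative assumptions, not drawn from the regulation:

```python
import json

def export_subject_data(records, subject_id):
    """Isolate one data subject's records and serialize them in a
    structured, machine-readable format (here, JSON)."""
    subject_records = [r for r in records if r["subject_id"] == subject_id]
    return json.dumps({"subject_id": subject_id, "records": subject_records},
                      indent=2)

# Hypothetical controller store holding records for two data subjects
records = [
    {"subject_id": "u1", "field": "email", "value": "u1@example.com"},
    {"subject_id": "u2", "field": "email", "value": "u2@example.com"},
]

# A portability request from subject "u1" yields only u1's data
print(export_subject_data(records, "u1"))
```

The harder obligation the article identifies, transmitting the export directly to another controller, would have to be built on top of such a function.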
Development of AI under its current model hinges on the
collection of big data; large data sets provide a distinct competitive
advantage. Companies spend millions of dollars solely to improve their
data collection processes.85 By allowing a data subject to retrieve or
extract this data, smaller companies can collect comparable amounts of
PII without needing to spend as much on their collection processes.86
In theory, Article 20's requirements may lead to data parity, which
could in turn create more competitive AI markets that ultimately
benefit consumers. Companies would then have a less distinct
advantage, or disadvantage, based on the amount of data available to
them. While it is possible that this data parity may help with the overall
development of AI, it may also lead to greater risks for companies and
negatively impact AI.
Since consumers will be able to choose who retains their
information, companies will need to emphasize their public relations
and data security-otherwise, data subjects may not trust controllers
with their information. For example, when large scale data breaches
occur, consumers immediately feel violated. Here, the loss of trust and
desire for remedial action will likely result in consumers requiring
80. Id. art. 20(1). Upon request, a controller has an obligation to deliver the personal data
in a "structured, commonly used and machine-readable format." Id.
81. See discussion supra Sections II.A.1 - II.A.2.
82. See Article 29 Data Protection Working Party, Guidelines on the Right to Data
Portability, EUR. COMM'N DOC. WP 242 rev.01 (Apr. 5, 2017).
83. Id. at 4-5.
84. See discussion supra Section II.A.2.
85. Data & Analytics Survey: Big Data and Its Use Cases Keep Growing, IDG (2016),
http://bit.do/IDG Survey.
86. See Ruth Janal, Data Portability - A Tale of Two Concepts, 8 J. INTELL. PROP., INFO.
TECH. & E-COM. L. 59, 60-61 (2017).
87. See id. for further discussion on this theory.
4. Right to Explanation
88. GDPR, supra note 7, art. 22(1) ("The data subject shall have the right not to be subject
to a decision based solely on automated processing, including profiling, which produces legal
effects concerning him or her or similarly significantly affects him or her.").
89. Id. art. 22(3).
90. Id. art. 22(2). The right not to be subject to an automated decision does not apply if the
decision:
(a) is necessary for entering into, or performance of, a contract between
the data subject and a data controller;
(b) is authorised by Union or Member State law to which the controller is
subject and which also lays down suitable measures to safeguard the
data subject's rights and freedoms and legitimate interests; or
(c) is based on the data subject's explicit consent.
Id.
91. Id. art. 22(3) (When automated decisions are made either to perform a contract or with
explicit consent, as outlined in Article 22(2), a controller must still "implement suitable measures
to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to
obtain human intervention on the part of the controller, to express his or her point of view and to
contest the decision.").
92. "Profiling" is defined as:
[A]ny form of automated processing of personal data consisting of the use of personal data
to evaluate certain personal aspects relating to a natural person, in particular to analyse or
predict aspects concerning that natural person's performance at work, economic situation,
health, personal preferences, interests, reliability, behaviour, location or movements[.]
Id. art. 4(4), at 33.
93. Id. art. 15(1)(h), at 43. Under Article 15, data subjects are given a right of access to
their controlled personal data, including but not limited to, the purpose of the processing, the
categories of data, and the envisaged time period. See id. art. 15(1).
94. Id. art. 21(1), at 45 ("The data subject shall have the right to object, on grounds relating
to his or her particular situation, at any time to processing of personal data concerning him or her
which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions.
The controller shall no longer process the personal data unless the controller demonstrates
compelling legitimate grounds for the processing which override the interests, rights and freedoms
of the data subject or for the establishment, exercise or defence of legal claims."). Additionally,
the right to object to processing must be "explicitly brought to the attention of the data subject
and shall be presented clearly and separately from any other information." Id. art. 21(4), at 46.
Because of this, consumers know they can object to unfavorable decisions made using their
individual PII.
95. Id. art. 6(1)(e), (f), at 36. As referenced, the relevant portions of Article 6(1) referred
to by Article 21(1) include when:
(e) processing is necessary for the performance of a task carried out in the
public interest or in the exercise of official authority vested in the
controller; [and]
(f) processing is necessary for the purposes of the legitimate interests
pursued by the controller or by a third party, except where such interests
are overridden by the interests or fundamental rights and freedoms of the
data subject which require protection of personal data, in particular where
the data subject is a child.
Id.
96. Id. art. 21(1), at 45.
97. See id. art. 18(1), at 44.
104. Compare id. art. 21(1), with id. art. 21(2), (3) (while controllers may defend their
continued processing when lawfully conducted under Article 6(1)(e), (f), processing for the
purposes of direct marketing is objectionable without rebuttal).
105. See The Future of Robotics and Artificial Intelligence in Europe, EUR. COMM'N: DIG.
SINGLE MKT. BLOG (Feb. 16, 2017), http://bit.do/DSM-blog (discussing the E.U. Commission's
adoption of the Digitising European Industry Strategy in 2016, which recognizes robotics and AI
as cornerstone technologies). See also Rich Haridy, EU Move to Bring in AI Laws, but Reject
Robot Tax Proposal, NEWS ATLAS (Feb. 16, 2017), http://bit.do/Haridy EU-move. According to
this article, the E.U. Parliament voted 396 to 123, with 85 abstentions, to pass a resolution to
regulate the development of AI and robotics. Id. This resolution has since passed to the E.U.
Commission, the E.U.'s executive branch, to determine whether or not to accept or reject the
proposal to regulate. Id. The E.U. Parliament's decision to regulate was based on a report sent to
the E.U.'s Legal Affairs Committee, which called for regulation of AI and robotics. Id.
106. See Eur. Parl. Comm'n on Legal Affairs, Draft Report with Recommendations to the
Commission on Civil Law Rules on Robotics, 2015/2103(INL) (May 31, 2016),
http://bit.do/EuroParlInitial-report [hereinafter Draft Report].
107. See id. at 7.
108. Id. ("to ensure a timely and well-informed response to the new opportunities and
challenges arising from the technological development of robotics").
109. See id. at 10-12.
110. See id. at 8 (noting that regulation must pay particular attention to data privacy and
protection as it concerns technical integration into hardware and software, necessity and
proportionality standards, and the use of personal data as currency).
111. Big Data, supra note 12, at 19-56.
112. Id. (listing a multitude of liability and operation concerns for the use and development
of AI, including: fairness of processing, conditions for processing, and data minimization, among
others).
113. Id.
114. Id. at 95.
115. See, e.g., Draft Report, supra note 106 (emphasizing that privacy and data protection
guarantees must be embedded in the E.U.'s regulatory framework for robotics and AI). The E.U.'s
cautious approach to the development of AI is evident in its policy. Id. at 7 (noting that the
"potential for empowerment through the use of robotics is nuanced by a set of tensions or risks
relating to human safety, privacy, integrity, dignity, autonomy and data ownership"). In response
to the growing commercialization of the robotics and AI sector, the E.U. has stressed the
importance of consumer protection (pointing out that the "use of personal data as a 'currency'
with which services can be 'bought' . . . may not lead to a circumvention of the basic principles
governing the right to privacy and data protection"). Id. at 8.
B. E.U. Funding of Alternative AI Models
CONCLUSION
By Matthew Humerick
(Santa Clara High Technology Law Journal 34, no. 4 (May 2018): 393-418)
Abstract:
● Part II discusses data privacy and AI law within the European Union, and how new
regulations may adversely impact sustained algorithmic development under the
current AI model.
● Part III proposes two courses of action on how the E.U. should adapt its views on data
privacy and protection to better facilitate the use and development of Al.
● While issues one and two are more easily discernible, companies and organizations
continue to scramble to understand what the GDPR entails and whether they are in
compliance with it or not.
● However, organizations cannot guarantee compliance with the GDPR, especially for
AI, where the data encompassed within big data fields can often be traced back to
individuals.
● While the intent of the GDPR is to ease data processing through uniform data privacy
and protection standards, its rapid implementation, coupled with its high standards,
threatens to result in tremendous liability risks for data controllers and processors
worldwide.
● Specifically, based on the existing AI development models, machine learning and big
data will cause organizations worldwide to fall under the scope of the GDPR, whether
they know it or not.
● This potential for liability would be even greater in unsupervised AI models, which
inherently have minimal to no human oversight. Once deemed within its scope,
organizations will need to comply with the GDPR's provisions.
● Even if located outside of the E.U., Article 3 presumably may hold anyone liable,
especially those controllers interested in conducting business with companies either
within the E.U., or who trade with E.U.-based companies.
● If held liable, companies stand to face severe penalties in the form of administrative
fines up to 20,000,000 EUR, or in the case of an undertaking, up to 4% of the total
worldwide annual turnover of the preceding financial year, whichever is higher.
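The penalty ceiling in the bullet above is simply the greater of the two figures. A short arithmetic sketch of the Article 83(5) cap; the function name is illustrative:

```python
def max_administrative_fine(annual_turnover_eur):
    """GDPR art. 83(5) ceiling: the higher of EUR 20,000,000 or 4% of
    total worldwide annual turnover of the preceding financial year."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# An undertaking with EUR 1 billion turnover: the 4% prong (EUR 40M) governs
print(max_administrative_fine(1_000_000_000))  # 40000000.0
# An undertaking with EUR 100 million turnover: the EUR 20M floor governs
print(max_administrative_fine(100_000_000))
```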
Second Point:
● Under Article 7 of the GDPR, the burden is on the controller to prove that the data
subject unambiguously and freely consented, among other conditions, and that data
subjects retain the right to withdraw consent at any time.
● Both the need for consent and the right to withdraw consent threaten the development
of AI because they could limit the amount of data available to learn from.
● Data subjects, in certain situations, can exercise a right to restrict the processing of
their information.
● The current model of deep learning through neural networks demonstrates how AI's
development hinges on utilizing multitudes of data to continuously adapt to the
surrounding environment.
● Because the current AI model hinges on neural networks that interweave all sets of
data, isolating an individual's data is unlikely to occur. It is likely that the GDPR's
consent provision will result in either large-scale AI regression or continual liability
risks for those continuing to derive learnings from unlawfully processed information.
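One practical consequence of the consent bullets above: before each further round of learning, a controller would need to exclude records whose subjects have withdrawn consent. A minimal filtering sketch; the record shape and names are hypothetical:

```python
def consented_training_set(records, withdrawn_ids):
    """Drop every record belonging to a data subject who has withdrawn
    consent, so that later training runs see only lawfully held data."""
    return [r for r in records if r["subject_id"] not in withdrawn_ids]

records = [
    {"subject_id": "u1", "features": [0.2, 0.7]},
    {"subject_id": "u2", "features": [0.9, 0.1]},
]

# Subject u2 withdraws consent; only u1's record remains for training
print(consented_training_set(records, withdrawn_ids={"u2"}))
```

As the bullet observes, this isolation is straightforward in a flat record store but far harder once the data has already been interwoven into a trained network's weights.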
Third Point:
● Article 22 grants data subjects the right not to be subject to decisions based solely on
automated processing. Instead, the data subjects may, at the very least, exercise a
right to human intervention and explanation. Even if subject to automated
decision-making, individuals still have a right to know of its existence, including
profiling and, if requested, must be provided with "meaningful information about the
logic involved, as well as the significance and the envisaged consequences of such
processing for the data subject."
● Data subjects also have a right to object to automated processing when the processing
is carried out for public interest reasons, or when the data subject's fundamental
rights and freedoms outweigh the interests of the processing controller or third party.
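The Article 22 mechanics summarized above can be sketched as a simple decision gate. The flags and function name are illustrative assumptions, not GDPR terms:

```python
def requires_human_intervention(solely_automated, significant_effect,
                                member_state_authorised=False):
    """Article 22 sketch: a decision based solely on automated processing
    that produces legal or similarly significant effects triggers the
    human-intervention safeguard, unless Union or Member State law
    authorising the processing lays down its own safeguards
    (art. 22(2)(b)). Under the contract and explicit-consent exceptions
    (art. 22(2)(a), (c)), art. 22(3) still obliges the controller to
    offer human review on request."""
    if not (solely_automated and significant_effect):
        return False  # art. 22(1) is not triggered at all
    if member_state_authorised:
        return False  # safeguards are defined by the authorising law
    return True

print(requires_human_intervention(True, True))   # True: safeguard applies
print(requires_human_intervention(False, True))  # False: a human was involved
```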
Having said that, the introduction of the GDPR presents an opportunity for Malaysian
businesses, particularly organisations dealing with AI, to prepare and set a higher standard
of protection and procedure to ensure that their business processes are in sync with the
changes in both the local and global personal data protection regulatory regimes. Moving
forward, it is proposed that Malaysian companies and businesses (data controllers) conduct
a Data Protection Impact Assessment (DPIA) to understand the risks to the security and
privacy of the data processed by AI and ways to mitigate those risks. This would eventually
help to balance the interest of personal data privacy with the development of artificial
intelligence. The ratification of appropriate and necessary principles under the GDPR as part
of Malaysia's personal data protection laws is vital to ensure the growth of AI whilst
balancing the emerging need to improve data protection.