The EU Commission’s Guidance on Article 17 of the Copyright in the Digital Single Market Directive – A Guide to Virtue in Content Moderation by Digital Platforms?

Christophe Geiger & Bernd Justin Jütte*

“Unless virtue guides us, our choice must be wrong”
William Penn1

Abstract:

The Directive on Copyright in the Digital Single Market might fundamentally change
the way works and other subject matter protected by copyright will be used on online
platforms. Article 17 of the Directive obliges such platforms, based on information they
receive from rightholders, to ensure that copyright-infringing uploads made by their
users are prevented and/or removed by an internalized procedure over which platform
operators exercise control. Important enforcement mechanisms are thereby left to
private entities solely deciding what content can (or cannot) be available on their
platforms. More problematically, the immense volume of uploads seems manageable
only by automated AI-based filtering technologies incapable of properly
distinguishing between lawful and unlawful uses, which leads to a serious danger of
overblocking of perfectly legitimate content and thus to potentially important limitations
of fundamental rights such as the right to freedom of expression. At the same time,
Article 17 also forbids general monitoring of the content available on platforms and
clearly mandates the safeguarding of users’ rights protected by exceptions and limitations to
copyright such as quotation, criticism and review or caricature, parody or pastiche.
This obligation of result creates an unsolvable conflict and great difficulty for

*
Christophe Geiger is Professor of Law at the Centre for International Intellectual Property Studies
(CEIPI), University of Strasbourg (France); Affiliated Senior Researcher at the Max Planck Institute for
Innovation and Competition (Munich, Germany) and Spangenberg Fellow in Law & Technology at the
Spangenberg Center for Law, Technology & the Arts, Case Western Reserve University School of Law
(Cleveland, US); Bernd Justin Jütte is Assistant Professor in Intellectual Property Law, University
College Dublin, Sutherland School of Law (Ireland) and Senior Researcher at Vytautas Magnus
University, Faculty of Law (Kaunas, Lithuania).
1
William Penn, “Some Fruits of Solitude” (London: Headley Brothers 1905), p. 67.



Member States in implementing the provision in a fundamental rights-compatible way.
Moreover, the lack of clarity of the provision has led Member States to interpret it in
significantly diverging ways, jeopardizing the objective of harmonisation behind the
Directive. Therefore, the interpretation guidelines by the European Commission
foreseen by Article 17(10) to help Member States in their implementation effort were
impatiently awaited, in particular since the provision is facing an action for annulment
before the Court of Justice of the European Union for potentially violating the right to
freedom of expression.

The Guidance on Article 17, finally issued by the Commission in June 2021 after
several postponements, provides some useful clarifications and certain safeguards
for users of protected works, but unfortunately does not depart from a system based
on monitoring and automated filtering and is thus likely to fail to protect fundamental
rights in an appropriate manner. This paper analyses the main additions and
proposed interpretation tools that the Guidance brings to platforms’ content
moderation as mandated by Art. 17 CDSM. It argues that in order to establish a
virtuous content moderation system and to help Member States to implement
Article 17 in a balanced way, it would have been essential to address the more
fundamental concerns raised by Article 17, in particular the fact that privately
operated algorithmic tools, and not independent assessors applying copyright law’s
equilibrium, decide what content should be available online, and to acknowledge the
inherent limits and flaws of the technology. We conclude that, in the absence of a
truly independent arbiter between the interests of users, platforms and rightholders,
Article 17 of the Directive is likely not to comply with European fundamental rights
and the basic principles of EU law.



1. Introduction

The Directive on Copyright in the Digital Single Market2 (CDSM Directive) was adopted
in April 2019 and the European Union (EU) legislator left the Member States (MS)
a generous transposition period until 7 June 2021.3 Nevertheless, many MS
struggled to implement the Directive within this two-year period.4 One of the main
reasons for these difficulties is Article 17, which changes the liability rules for so-called
online content-sharing service providers (OCSSPs), a complicated provision with ten
rather lengthy paragraphs full of internal contradictions resulting from heated debates
and difficult last-minute compromises in the adoption process.5 Aware of this
complexity, the EU legislator foresaw in the provision’s last paragraph a stakeholder
dialogue, on the basis of which the Commission was to “issue guidance on the
application of this Article” in order to provide the needed assistance to MS in
transposing the provision.6 The stakeholder dialogue was concluded in 20207
but it took the Commission until 4 June 2021, the Friday before the transposition period
expired on a Monday, to finally publish, after several postponements, its guidance
based on this stakeholder dialogue.8 The Commission had given some indication

2
Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright
and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC,
[17.04.2019] OJ L 130, 17.5.2019, p. 92-125 (CDSM Directive). See for a global appreciation of the
Directive: Séverine Dusollier, ’The 2019 Directive on Copyright in the Digital Single Market: Some
progress, a few bad choices, and an overall failed ambition’ [2020] Common Market Law Review 979
and João Pedro Quintais, ’The New Copyright in the Digital Single Market Directive: A Critical Look’
[2020] European Intellectual Property Review 28.
3
Article 29(1) CDSM Directive.
4
By the time the deadline expired on 7 June 2021, only three MS (Germany, Hungary and the Netherlands)
had fully implemented the provisions of the CDSM Directive. See for the current transposition status of
the various provisions of the CDSM Directive: Communia, DSM Directive Implementation Portal,
available at: https://www.notion.so/DSM-Directive-Implementation-Portal-
97518afab71247cfa27f0ddeee770673, accessed: 26.06.2021.
5
For a detailed analysis see Christophe Geiger & Bernd Justin Jütte, ’Platform Liability Under Art. 17
of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An
Impossible Match’ [2021 forthcoming] GRUR International, available at:
SSRN: http://ssrn.com/abstract=3776267.
6
Article 17(10) CDSM Directive.
7
For an interim report on the stakeholder dialogues see e.g. Paul Keller, Article 17 stakeholder
dialogue: What we have learned so far – Part 1, available at:
http://copyrightblog.kluweriplaw.com/2020/01/13/article-17-stakeholder-dialogue-what-we-have-
learned-so-far-part-1/, accessed: 07.05.2020.
8
European Commission, Communication from the Commission to the European Parliament and the
Council. Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market,
COM(2021) 288 final, Brussels, European Commission, 04.06.2021.



as to the content of the guidance in a document published as part of a targeted
consultation in September 20209 but deviated from it on important points in the final
version of the guidance.

Shortly after the Directive had been adopted, the Polish government challenged certain
parts of Article 17 through an action for annulment before the Court of Justice of the
European Union (CJEU),10 claiming that an implied obligation to filter content would
violate the right to freedom of expression. In fact, the interplay of fundamental rights
within Article 17 is far more complicated, involving further fundamental rights, including
freedom to conduct a business, freedom to impart and receive information, freedom
of artistic creativity, the right to privacy, the right to a fair trial and the right to property.11

In this period of anticipation as to whether the Commission would propose a
fundamental rights-compliant interpretation of Article 17 and, as a result, pre-empt the
challenge to the provision, academics and NGOs raised serious doubts whether this
would be possible;12 the position taken by the European Commission in the hearing on the

9
See with a link to the document: Paul Keller, Commission consultation on Article 17 guidance: User
rights must be protected at upload, available at: https://www.communia-
association.org/2020/09/02/commission-consultation-article-17-guidance-user-rights-must-protected-
upload/, accessed: 11.06.2021.
10
CJEU, Action brought on 24.07.2019, Republic of Poland v. European Parliament and Council of the
European Union, Case C-401/19. On this challenge see Christophe Geiger & Bernd Justin Jütte, The
Challenge to Article 17 CDSM, an opportunity to establish a future fundamental rights-compliant liability
regime for online platforms, available at: http://copyrightblog.kluweriplaw.com/2021/02/11/the-
challenge-to-article-17-cdsm-an-opportunity-to-establish-a-future-fundamental-rights-compliant-
liability-regime-for-online-platforms/, accessed: 11.02.2021.
11
For a complete assessment of the fundamental rights at issue in Art. 17 CDSM, see Geiger & Jütte
(n 5), pp. 7-13. On the importance of fundamental rights in interpreting EU intellectual property law,
see e.g. Christophe Geiger, ’Constitutionalising Intellectual Property Law? The Influence of
Fundamental Rights on Intellectual Property in the European Union’ [2006] International Review of
Intellectual Property and Competition 371.
12
See for example the contributions by Geiger & Jütte (n 5) and Julia Reda, Joschka Selinger &
Michael Servatius, Article 17 of the Directive on Copyright in the Digital Single Market: a Fundamental
Rights Assessment (Study for Gesellschaft für Freiheitsrechte), available at:
https://freiheitsrechte.org/home/wp-content/uploads/2020/11/GFF_Article17_Fundamental_Rights.pdf,
accessed: 26.01.2020; specifically on the rights of users see João Pedro Quintais et al., ’Safeguarding
User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive:
Recommendations from European Academics’ [2019] Journal of Intellectual Property, Information
Technology and E-Commerce Law 277, and in the context of the ban on general monitoring and its
fundamental rights justifications, see Christina Angelopoulos & Martin Senftleben, ’An Endless
Odyssey? Content Moderation Without General Content Monitoring Obligations’ [2021] Available at
SSRN: https://ssrn.com/abstract=3871916. This is indeed a larger problem in the context of addressing
unlawful content on the internet which has also been discussed in the context of the proposed Digital
Services Act (European Commission, Proposal for a Regulation of the European Parliament and of the
Council on a Single Market For Digital Services (Digital Services Act) and amending Directive



Polish challenge gave some hope as the Commission seemed to rule out any ex ante
blocking of user uploads legitimated by limitations to copyright law.13 Adding to this
tension, and excitement for some, was the postponement of the delivery of the Opinion
of Advocate General (AG) Saugmandsgaard Øe in the action for annulment, which
had originally been scheduled for the end of April 2021 and is now expected for 15
July.14
In this context the guidance provided by the European Commission provides useful
pointers whether Article 17 can be implemented at national level in a way that is in
conformity with EU fundamental rights. In this contribution to the larger debate on
Article 17,15 we highlight some of the important aspects of the Guidance.

2. The main obligations under Article 17 CDSM Directive and the provision’s inherent contradictions

2000/31/EC, COM(2020) 825 final, Brussels, European Commission, 15.12.2020), see for that purpose
Giancarlo Frosio & Christophe Geiger, ’Taking Fundamental Rights Seriously in the Digital Services Act’s
Platform Liability Regime’ [2020] Available at SSRN: https://ssrn.com/abstract=3747756.
13
On the issues discussed during the hearing and the split between the European Institutions and some
MS, incl. Spain and France see Paul Keller, CJEU hearing in the Polish challenge to Article 17: Not
even the supporters of the provision agree on how it should work, available at:
http://copyrightblog.kluweriplaw.com/2020/11/11/cjeu-hearing-in-the-polish-challenge-to-article-17-
not-even-the-supporters-of-the-provision-agree-on-how-it-should-work/, accessed: 16.12.2020.
14
There are two reasons why the Opinion may have been postponed: first, the AG was waiting for the
publication of the Guidance; second, the AG wanted to await the outcome of CJEU, Judgment of
22.06.2021, YouTube and Cyando, Joined Cases C-682/18 and C-683/18, EU:C:2021:503.
15
See only the contributions of Matthias Leistner, ’European Copyright Licensing and Infringement
Liability Under Art. 17 DSM-Directive: Can We Make the New European System a Global Opportunity
Instead of a Local Challenge?’ [2020] Zeitschrift für Geistiges Eigentum/Intellectual Property Journal
123; Martin Senftleben & Christina Angelopoulos, ’The Odyssey of the Prohibition on General
Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce
Directive and Article 17 of the Directive on Copyright in the Digital Single Market’ [2020] Available at
SSRN: https://ssrn.com/abstract=3717022; Gerald Spindler, ’The Liability system of Art. 17 DSMD and
national implementation – contravening prohibition of general monitoring duties?’ [2019] Journal of
Intellectual Property, Information Technology and E-Commerce Law 344; Gerald Spindler, ’Art. 17
DSM-RL und dessen Vereinbarkeit mit primärem Europarecht. Zugleich ein Beitrag zu
Umsetzungsmöglichkeiten’ [2020] Gewerblicher Rechtsschutz und Urheberrecht 253; Gerald Spindler,
’Upload-Filter: Umsetzungsoptionen zu Art. 17 DSM-RL’ [2020] Computer und Recht 50; Martin
Husovec & João Pedro Quintais, ’How to License Article 17? Exploring the Implementation Options for
the New EU Rules on Content-Sharing Platforms’ [2019] Available at SSRN:
https://ssrn.com/abstract=3463011 or http://dx.doi.org/10.2139/ssrn.3463011.



In what seems to be a clear departure from the case-law of the CJEU on the right of
communication to the public under Article 3(1) of the Information Society Directive,16
Article 17(1) makes certain hosting platforms, so-called online content-sharing service
providers (OCSSPs),17 directly liable for content uploaded by their users. This means
that OCSSPs must themselves obtain the necessary authorization for such uploads.18
If OCSSPs do not succeed in obtaining the relevant authorizations, presumably by
concluding licenses with large rightholders and collecting societies, or when the
relevant rightholders refuse to grant such licenses, OCSSPs can still escape liability
by demonstrating that they have (a) made best efforts to obtain the necessary
authorisations,19 that they (b) ensure that works uploaded without prior authorization
and for which the rightholders have provided the relevant and necessary
information are made unavailable on their services20 and that they (c) remove works upon
notification by the rightholder in an expeditious manner and prevent their future
upload.21

16
Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the
harmonisation of certain aspects of copyright and related rights in the information society, [22.05.2001]
OJ L 167, 22.6.2001, p. 10-19 (InfoSoc Directive). See for example the recent ruling in Youtube/Cyando,
in which the Court held that “the operator of a video-sharing platform or a file-hosting and -sharing
platform, on which users can illegally make protected content available to the public, does not make a
‘communication to the public’ of that content, within the meaning of that provision, unless it contributes,
beyond merely making that platform available, to giving access to such content to the public in breach
of copyright” (CJEU, C-683/18 (YouTube and Cyando), para. 83); see also AG Saugmandsgaard Øe,
Opinion of 16.07.2020, YouTube, Joined Cases C-682/18 and C-683/18, EU:C:2020:586, paras. 53-
It is true that the right of communication to the public has been interpreted extensively but, through
the application of the ‘safe harbour’ regime of the e-Commerce Directive, EU law provides a liability
exemption to intermediaries such as hosting platforms which are not aware of infringement taking place
on their services, and further provides them with a remediating escape route if they remove, and to a
certain extent prevent, uploads of infringing content. The blocking and filtering obligations, on the other
hand, have been interpreted by the CJEU very restrictively and their legitimacy has been questioned
with regard to fundamental rights (see e.g. CJEU, Judgment of 16.02.2012, SABAM v Netlog, Case
C-360/10, EU:C:2012:85). The Commission Guidance clarifies the relation between Article 3(1) InfoSoc
Directive and Article 17(1) CDSM Directive by stating that “Article 17 is a lex specialis to Article 3 of
Directive 2001/29/EC and Article 14 of Directive 2000/31/EC. It does not introduce a new right in the
Union’s copyright law. Rather, it fully and specifically regulates the act of ‘communication to the public’
in the limited circumstances covered by this provision.” (COM(2021) 288 final, p. 2).
17
Article 2(6) CDSM Directive defines an OCSSP as “a provider of an information society service of
which the main or one of the main purposes is to store and give the public access to a large amount of
copyright-protected works or other protected subject matter uploaded by its users, which it organises
and promotes for profit-making purposes.”
18
Article 17(1), second sentence, CDSM Directive.
19
Article 17(4)(a) CDSM Directive; the Commission Guidance explains the substance of this obligation
in Chapter IV, but does not go beyond outlining the possible licensing solutions. To what extent
OCSSPs must act proactively to engage in negotiations with rightholders remains unclear.
20
Article 17(4)(b) CDSM Directive.
21
Article 17(4)(c) CDSM Directive.



Article 17(7) also provides that legitimate uses on such platforms should not be
prevented, meaning that users should be permitted to upload lawful content. Such
lawful content must at least include content for which users have obtained
authorization or for which authorization is not necessary because an upload is covered
by a copyright exception or limitation. Article 17(9) further requires that effective
dispute resolution mechanisms be available at various levels, including recourse to
the ordinary courts.

How exactly this calibration of ‘best efforts’ obligations arising out of Article 17(4) and
obligations of result22 guaranteed under Article 17(7) could be conducted remained
unclear; it was hoped that the Commission’s Guidance would bring some clarity.

3. The inevitability of automated filtering and its incapacity to appropriately secure the balance in copyright law

It is clear from the Commission’s Guidance that filtering technology will be an essential
element in the toolbox of OCSSPs to comply with their obligations under Article 17(4).
The Commission does not prescribe which technology OCSSPs should employ, but it
highlights content recognition as a commonly used technology while also recognizing
the effectiveness of other technologies.23 The Commission also admits that these
technologies cannot distinguish between lawful and unlawful uses.24 That means that
the use of automated filtering will certainly lead to ex-ante blocking of ‘legitimate uses’,
a possibility that the Commission and some MS had excluded prior to the publication

22
See COM(2021) 288 final, p. 26.
23
It has to be noted positively that the Commission lists a number of possible alternative technological
means that can be used, including hashing, watermarking, the use of metadata and keyword search,
or a combination of different technologies (COM(2021) 288 final, p. 11-12), showing that automated
filtering is one tool among others and that other tools could therefore be used if they are efficient and
automated filtering would impose a disproportionate burden, in particular on smaller platforms. As
the Commission notes, “in many cases, [it] is expected that service providers will rely (or continue to rely)
on different technological tools in order to comply with their obligation under Article 17(4)(b)”.
24
This goes both ways, see e.g. COM(2021) 288 final, pp. 13 & 15, but the Commission also stresses
that lawful content will, effectively, in many cases be blocked, particularly due to the limitations of the
technological solutions currently available (p. 20).



of the Guidance. In the absence of guidance by the Commission, some Member States
have tried to find solutions to avoid excessive ex ante filtering as much as possible.
For example, the German implementation of Article 17 relies on quantitative
standards to define uses “presumably authorized by law”.25 If uploaded content fulfils
the relevant standards, the upload cannot be blocked pre-emptively.26
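To make the mechanics of this approach concrete, the cumulative test summarised in note 26 can be expressed as a simple decision rule. The following sketch in Python is purely illustrative and rests on our own assumptions: the field names are hypothetical, and the actual statutory test (see notes 25 and 26) is more nuanced, for instance in how the “minor use” thresholds apply per type of work.

```python
from dataclasses import dataclass

# Illustrative sketch only: the German test for uses "presumably authorized
# by law", as summarised in note 26. Field names are hypothetical inputs
# that a platform's matching system might produce.

@dataclass
class Upload:
    matched_share_of_work: float       # portion of the protected work used (0..1)
    combined_with_other_content: bool  # parts of the work combined with other content
    audio_video_seconds: float         # length of the matched audio/video excerpt
    text_characters: int               # length of the matched text excerpt
    image_kilobytes: float             # size of the matched graphic material
    flagged_as_permitted: bool         # uploader flagged the use as permitted

def presumably_authorized(u: Upload) -> bool:
    """True if, on this reading, the upload may not be blocked pre-emptively."""
    minor_use = (u.audio_video_seconds <= 15    # up to 15 seconds of audio or video
                 or u.text_characters <= 160    # up to 160 characters of text
                 or u.image_kilobytes <= 125)   # up to 125 kB of graphics
    # The first two prongs are cumulative; "minor use" and uploader flagging
    # are alternatives for the third prong.
    return (u.matched_share_of_work < 0.5
            and u.combined_with_other_content
            and (minor_use or u.flagged_as_permitted))

# Example: a 10-second clip of a song embedded in a user's own commentary video
clip = Upload(0.1, True, 10.0, 0, 0.0, False)
print(presumably_authorized(clip))  # True: may not be blocked pre-emptively
```

The point of the sketch is that such a rule is purely quantitative: it creates a presumption against pre-emptive blocking, but it does not itself assess the lawfulness of the use.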

The concept of legitimate uses, which shall not be affected by the mechanism of Article
17(4), includes uses covered by copyright exceptions and limitations and uses for
which the user has obtained authorization, but also uses concerning content
not protected by copyright or for which the term of protection has expired.27 This
potential for over-blocking legitimate uses, and, as a result, a limitation of the right to
freedom of expression, lies at the heart of the Polish challenge to Article 17(4).

Instead of banning the filtering and blocking of user-uploaded content altogether, which
could have been a logical step to avoid over-blocking but might have reduced the
efficiency of Article 17, the Commission opted for a solution that cushions the negative
impact of the application of filtering technology to user uploads. It has therefore
devised a bifurcated solution. As a general rule, content recognition or similar
technologies should be required to filter and block only “manifestly infringing”
uploads.28 Exceptionally, other (non-“manifestly infringing”) earmarked content29
could be blocked subject to ex ante human review before the content initially becomes
available.30
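The resulting decision flow can be summarised schematically. The sketch below is our own hypothetical rendering, in Python, of the bifurcated approach as we read the Guidance, not a procedure the Guidance itself prescribes; how an upload comes to be classified as “manifestly infringing” or “earmarked” is precisely what remains contested.

```python
from enum import Enum, auto

class Decision(Enum):
    BLOCK = auto()                  # blocked automatically; the user is notified and may
                                    # contest the blocking under Article 17(9) redress
    HOLD_FOR_HUMAN_REVIEW = auto()  # earmarked content: rapid ex ante human review,
                                    # unavailable until cleared (or kept blocked)
    PUBLISH = auto()                # goes online; ex post notice and complaint remain possible

def moderate_upload(manifestly_infringing: bool, earmarked: bool) -> Decision:
    """Hypothetical sketch of the bifurcated approach described in the Guidance."""
    if manifestly_infringing:
        # General rule: only identical or near-identical matches to rightholder
        # reference files may be blocked automatically.
        return Decision.BLOCK
    if earmarked:
        # Exception: content earmarked by rightholders as carrying a high risk of
        # significant economic harm is not measured against the "manifestly
        # infringing" standard (see section 5 below).
        return Decision.HOLD_FOR_HUMAN_REVIEW
    return Decision.PUBLISH

print(moderate_upload(False, True))  # Decision.HOLD_FOR_HUMAN_REVIEW
```

As the sketch makes visible, everything turns on the two classification inputs, which in practice would be produced by the very content recognition technologies whose limits the Commission itself acknowledges.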

25
Entwurf eines Gesetzes zur Anpassung des Urheberrechts an die Erfordernisse des digitalen
Binnenmarktes – Drucksachen 19/27426, 19/28171, available at
https://www.bundesrat.de/SharedDocs/drucksachen/2021/0401-0500/428-
21.pdf?__blob=publicationFile&v=1, Artikel 3, §§ 9-10.
26
See Communia, German Article 17 implementation law sets the standard for protecting user rights
against overblocking, available at: https://www.communia-association.org/2021/05/20/german-article-
17-implementation-law-sets-the-standard-for-protecting-user-rights-against-overblocking/, accessed:
26.06.2021; under the German implementation law, an upload cannot be blocked pre-emptively if it
consists of less than 50% of an original work, combines parts of the work with other content, and the
use is either minor in nature (i.e. shorter than 15 seconds of audio or video, 160 characters of text, or
125 kB of graphics) or has been flagged by the uploader as permitted (e.g. covered by an exception).
27
COM(2021) 288 final, pp. 19-20.
28
COM(2021) 288 final, p. 20.
29
Content earmarked by the relevant rightholders as content whose availability could cause significant
harm to them (COM(2021) 288 final, p. 13).
30
COM(2021) 288 final, p. 22.



Limiting the scope of lawful filtering to “manifestly infringing” content can be traced
back to a position the Commission had already taken during the hearing on the action
for annulment, and which was supported by the Council and the Parliament.31 Although
the Commission Communication does not precisely define what “manifestly infringing”
means, it can be deduced from the Guidance and the examples the Commission
provides that it must refer to uploads that are identical or similar to a high degree to
protected works and other subject matter. For example, “[e]xact matches of entire
works or of significant proportions of a work should normally be considered manifestly
infringing”, but so should works which have been distorted in order to avoid recognition by
automated content recognition technology.32 On the contrary, uploads that only
partially match information provided by rightholders, because a user “has significantly
modified the work in a creative manner”, would not be considered manifestly infringing.
The likelihood of an infringing act based on a quantitative assessment of similarity
therefore seems to justify ex ante blocking.33

The Commission’s approach as reflected in the Guidance is a more nuanced


appreciation of user uploads but it will still result in the blocking of legitimate uses.
Indeed, a purely quantitative assessment of uploads in order to detect infringing
content cannot be equated to a legal analysis, which requires to consider context,
message and a variety of other factors.34 And there are also ample examples in which
the upload of complete works or other subject matter falls within the scope of

31
Paul Keller, CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the
provision agree on how it should work, available at:
http://copyrightblog.kluweriplaw.com/2020/11/11/cjeu-hearing-in-the-polish-challenge-to-article-17-
not-even-the-supporters-of-the-provision-agree-on-how-it-should-work/, accessed: 16.12.2020, see
also, with approval of a similar approach, Quintais et al. (n 12), p. 280.
32
COM(2021) 288 final, p. 21.
33
The Commission avoids the formulation “ex ante blocking” but refers to “ex ante human review” after
which “the service provider may block the upload or make it available”. In the interim period, the result
is, however, that content is temporarily blocked until it has been reviewed by a human.
34
See for example the judgment in CJEU, Judgment of 03.09.2014, Deckmyn, Case C-201/13,
EU:C:2014:2132, in which the CJEU stated that the interpretation of the parody exception of Article
5(3)(k) of the Information Society Directive requires that “all the circumstances of the case must be
taken into account” (para. 28). Such an analysis is not possible with currently available automated tools, see Evan
Engstrom & Nick Feamster, ’The Limits of Filtering: A Look at the Functionality & Shortcomings of
Content Detection Tools’ [2017] Available: https://www.engine.is/the-limits-of-filtering.



application of a copyright exception.35 A legal assessment of the lawfulness of an
upload cannot simply be replaced by automated checks based on quantitative
similarity, or, as the Commission put it: “The identification of manifestly infringing
content and other content by automated means does not represent a legal assessment
of the legitimacy of an upload, including whether it is covered by an exception.”36 In
short, the technology assesses whether the upload can be made available, not its legality.37

4. The limits of human review: in (desperate) search of the ‘super-human’ reviewer

This is precisely the problem of automated filtering for the right to freedom of
expression. The inability of automated systems to distinguish between lawful and
unlawful uses will inevitably result in the blocking of lawful speech without an initial
judicial determination. The Commission’s Guidance, albeit non-binding, seems to
condone this outcome. It exposes certain potentially lawful expression to mechanisms
with, at best, safeguards that take effect only after the expression has been blocked.
This is not to say that “manifestly infringing” is a wrong or inappropriate criterion;
indeed, it might be the only way to give some legitimacy to
Article 17 CDSM Directive.38 But the problems exposed here illustrate the complexity

35
See for example CJEU, Judgment of 29.07.2019, Funke Medien NRW, Case C-469/17,
EU:C:2019:623, in which the CJEU in principle agreed that the upload of entire military reports can fall
under the exception for quotation; similarly, the CJEU confirmed the application, in principle, of certain
exceptions to the upload of a complete manuscript in CJEU, Judgment of 29.07.2019, Spiegel Online,
Case C-516/17, EU:C:2019:625, on both cases see Bernd Justin Jütte & Giulia Priora, ’Leaking of
secret military reports qualifies as reporting of current events’ [2020] Journal of Intellectual Property
Law & Practice 681–682 and Giulia Priora & Bernd Justin Jütte, ’No copyright infringement for
publication by the press of politician’s controversial essay’ [2020] Journal of Intellectual Property Law
& Practice 583–584; on these decisions see Christophe Geiger & Elena Izyumenko, ’The
Constitutionalization of Intellectual Property Law in the EU and the Funke Medien, Pelham and Spiegel
Online Decisions of the CJEU: Progress, but Still Some Way to Go!’ [2020] International Review of
Intellectual Property and Competition 282.
36
COM(2021) 288 final, pp. 20-21.
37
For a critique, see Geiger & Jütte (n 5), p. 26 (with reference to Christophe Geiger, ’The answer to
the machine should not be the machine: safeguarding the private copy exception in the digital
environment’ [2008] European Intellectual Property Review 121: “The answer to the machine should
not be in the machine”, or in short: what is acceptable online and what is not needs to be decided
collectively and not by a few, and via independent mechanisms that duly safeguard fundamental
rights).
38
Using the “manifestly illegal” criterion in order to determine what content could be blocked as a result
of the “best efforts” obligation of Art. 17(4) has been proposed by several scholars, see Geiger & Jütte
(n 5), Frosio & Geiger (n 12) and Quintais et al. (n 12). However, what is to be considered as meeting



of assessing user uploads and distinguishing between lawful and unlawful uses, and the
importance of a thorough independent human review.39

Moreover, internal human review as a mechanism to assess borderline cases that
have been identified on the basis of purely quantitative assessments is hardly a
solution that will ensure a proper functioning of Article 17 as a mediator between the
various interests involved. It must be recalled that the relevant platforms that would
come within the scope of Article 17 operate globally and are accessible in all MS of
the EU. As a result, the publication of content will be subject to 27 different copyright
regimes. And while the relevant exclusive rights are fully harmonized in the EU,
exceptions and limitations are not.40 A human reviewer would have to assess the
lawfulness of an upload against all available copyright exceptions of 27 different MS.
This is a task that would require a ‘super-human’ reviewer or, more likely, a small army
of such highly trained individuals, who would have to possess legal knowledge that
cannot be expected from even the most senior legal professional or legal academic with a
particular passion for comparative European copyright law.41

this requirement is debated. In the view of the authors, it could typically be the role of an independent
regulatory authority to propose a definition of this concept and to provide examples.
39
Interestingly, the Guidance foresees that in the case of automated blocking of “manifestly illegal
content”, users need to be informed: “When manifestly infringing uploads are identified and are blocked
i.e. are not uploaded, users should be notified of this without undue delay and should still be able to
contest the blocking, by giving reasons for their request, under the redress mechanism provided for in
Article 17(9)” (COM(2021) 288 final, p. 23).
40
See Christophe Geiger & Franciska Schönherr, ’Defining the Scope of Protection of Copyright in the
EU: The Need to Reconsider the Acquis regarding Limitations and Exceptions’, in: Tatiana-Eleni
Synodinou (ed), Codification of European Copyright Law. Challenges and Perspectives (Kluwer Law
International 2012), 133; Christophe Geiger & Franciska Schönherr, ‘Limitations to Copyright in the
Digital Age’, in: Andrej Savin & Jan Trzaskowski (eds.), Research Handbook on EU Internet Law
(Edward Elgar, 2014), 110; Christophe Geiger and Franciska Schönherr, Frequently Asked Questions
(FAQ) of Consumers in relation to Copyright, Summary Report (EUIPO 2017)
<https://euipo.europa.eu/ohimportal/web/observatory/observatory-publications> (listing exceptions and
limitations to copyright as one of the areas of major divergence in national copyright law); Tito Rendas,
Exceptions in EU Copyright Law (Kluwer Law International, 2021), 153 sq.
41
On the limits of human review to fix AI-based enforcement processes, see Ben Green & Amba Kak,
The False Comfort of Human Oversight as an Antidote to A.I. Harm, available at:
https://slate.com/technology/2021/06/human-oversight-artificial-intelligence-laws.html, accessed:
28.06.2021: “Policymakers and companies eager to find a “regulatory fix” to harmful uses of technology
must acknowledge and engage with the limits of human oversight rather than presenting human
involvement—even “meaningful” human involvement—as an antidote to algorithmic harms. (…) We
also need to subject human oversight to greater research and scrutiny, further studying what human
oversight does and does not accomplish and how to structure human-algorithm interactions to facilitate
better collaborations. This requires preliminary testing of human oversight mechanisms before they are
enshrined in policy and monitoring human oversight behaviours as a standard feature of algorithmic



The inevitable conclusion is that human review cannot possibly be performed with the
quality and at the speed that would be required. This applies to reviews that would have to
safeguard that lawful uploads of users are not prevented or inhibited in any other way,
and also to reviews that would ensure that uploads flagged as lawful, but which are
not covered by any exception or otherwise permitted, are removed. The sensitivity
of these decisions in their temporal and substantive dimensions does not lend itself
to internal review by private platform operators.

Moreover, the standards for such human review are not set out in the Guidance, and
there is a good argument to be made, and it has indeed been made,42 in favor of
review by an independent institution or by a judicial or quasi-judicial body, to ensure that uploads to
online content-sharing services are properly assessed, and which could take
stewardship over developing fundamental rights-compliant approaches to automated
filtering, which, again, seems to be unavoidable in the context of the obligations created
by Article 17(4) CDSM Directive.

5. Privileging ‘earmarked’ content: toward the watering down of user rights?

Under Article 17, three interests collide, each representative of a different
fundamental right. On the one side, OCSSPs must demonstrate best efforts to
prevent infringing user uploads and remove infringing content upon notification, an
obligation qualified by the prohibition of general monitoring.43 Users, on the other side,

impact assessments and A.I. audits, which are becoming popular policy mechanisms to evaluate A.I.
systems.”
42
Geiger & Jütte (n 5) and more generally Frosio & Geiger (n 12).
43
Article 17(8) CDSM Directive. More detailed on this issue, see Angelopoulos & Senftleben (n 15).
With regard to the understanding of the notion of “general monitoring” in the CDSM Directive in
comparison with the same notion in Article 15 of the E-Commerce Directive, the Guidance provides
little insight. It is merely stated that “whilst the concept of general monitoring is not defined in Article
17(8), it is expressed therein in the same terms as in Article 15 of Directive 2000/31. However, when
applying Article 17(8) first paragraph the context, scheme and objective of Article 17 and the special
role of online content-sharing service providers under this provision should be taken into account”. This
seems to imply that the notion is to be interpreted in the spirit and light of the CDSM Directive and thus
to have a potentially slightly different scope than in Article 15. However, it is then also subject to the


are guaranteed that lawful uploads, including those covered by an exception or
limitation, must not be prevented by the best efforts undertaken by OCSSPs.44
Article 17, as a whole, serves to promote the interests of rightholders in receiving
remuneration for uploads of their works on online content-sharing platforms.45 In this
triangular relationship, the guidance provided by the Commission suggests a new,
more active role for rightholders, which forms part of an indispensable cooperation
between OCSSPs and rightholders, without which the effectiveness of Article 17(4)
cannot be ensured.46

The Guidance suggests that MS can foresee that rightholders may “earmark” certain
content. Earmarked content refers to “content whose availability could cause
significant harm”47 to rightholders, content which is “particularly time sensitive”, such
as pre-released music or films and other comparable subject matter. It is important to
point out that earmarked content and its blocking privilege are not subject to a
‘manifestly infringing’ standard.48 This means that even uploads containing only short
excerpts of earmarked content, or only small amounts of earmarked content relative
to the entirety of the uploaded material, will be subject to this specific regime.

As the second prong of its bifurcated approach to blocking, the Commission
suggests that specific earmarked content will, “when proportionate and where
possible”, be subject to “rapid ex ante human review by online content-sharing service
providers”.49 The content should be identified by rightholders, and earmarking specific

inherent contradictions of Article 17, and the difficult question remains how to implement the obligation
of Article 17(4) while, at the same time, safeguarding Article 17(7). In our understanding, in order to be
compatible with fundamental rights, it could only lead to filtering of very limited and targeted content
such as “manifestly infringing” material, understood in a very restrictive manner (see Geiger & Jütte (n
5)).
44
Article 17(7) CDSM Directive.
45
See e.g. Recital 61 CDSM Directive.
46
COM(2021) 288 final, p. 11.
47
COM(2021) 288 final, p. 13, the Commission adds that earmarking such content requires rightholders
to properly justify why this content carries “high risks of significant economic harm” (p. 22).
48
COM(2021) 288 final, p. 22.
49
COM(2021) 288 final, p. 22.



content should be “properly justified by rightholders.”50 In relation to earmarked
content, OCSSPs have to exercise “heightened care”, although the Commission
states expressly that this heightened level of care should not lead to a disproportionate
burden for OCSSPs, nor should it result in a general monitoring obligation.51 The
detection of earmarked content in uploaded content would initially result in the
unavailability of that content on a platform, and the content would only be cleared (or
remain blocked) after it has been reviewed. This new element introduced by the Guidance is
significant, as it changes the Commission’s stance on ex ante blocking and filtering of
content and, as a result, the Commission’s view on what constitutes a fair balance
between the fundamental rights involved.52

Whilst the rationale behind the solution proposed by the Commission is
understandable, and to a certain degree absolutely reasonable, it creates further legal
uncertainty and raises additional fundamental rights problems.
First, the effects of earmarking and its relation to the “manifestly infringing” standard
are not clear. The Guidance also does not determine, beyond a relatively vague
standard, how OCSSPs should treat user uploads which contain earmarked content. It
seems to suggest, however, that where possible and proportionate, such content
should be subject to expeditious ex ante human review. But what happens to such
content if an OCSSP is generally unable to perform such reviews? Indeed, the
Commission admits that, and not only in relation to earmarked content, “[t]here may also
be more complex cases where additional criteria or a combination of criteria may need
to be taken into account” to determine the lawfulness of uploaded content.53 There is
also the element of time: a rapid ex ante human review conducted by OCSSPs is
bound to produce mistakes for reasons already mentioned above, since no human can
determine with certainty whether content that is not manifestly infringing is lawful or
50
COM(2021) 288 final, p. 22.
51
COM(2021) 288 final, p. 22.
52
Not only does the introduction of a special carve-out from the ‘no-blocking’ principle further restrict
the rights of users, most notably the right to freedom of expression, but it also puts further moderation
obligations on OCSSPs and affects their right to conduct a business; see to that effect Julia Reda &
Joschka Selinger, Germany attempts to square the circle in its implementation of Article 17 CDSMD –
Part 2, available at: http://copyrightblog.kluweriplaw.com/2021/06/03/germany-attempts-to-square-the-
circle-in-its-implementation-of-article-17-cdsmd-part-2/, accessed: 11.06.2021 and Geiger & Jütte
(n 5), p. 24.
53
COM(2021) 288 final, p. 22.



unlawful in all MS of the EU. In a worst-case scenario, certain uploads will have to be
blocked for some MS while they must remain accessible in others. In any case, the
Guidance recognizes that moderation of content, earmarked or not, will be a process
of trial and error, which will have to be refined through cooperation
between rightholders and OCSSPs, taking into consideration feedback provided by
users.54

Earmarking content will inevitably lead to filtering and blocking. The standards as to
what can be earmarked must therefore be meticulously defined to avoid wholesale blocking of
any economically valuable content. If not, earmarking could easily lead to a
presumption on the part of platforms that the content is manifestly illegal, and thus potentially
to over-blocking of all earmarked content in order to avoid liability or litigation. The CJEU
underlined in YouTube/Cyando that monitoring information is only permitted in relation
to specific content.55 Admittedly, the Commission seems conscious that this additional
possibility open to rightholders can have a negative impact and therefore carefully
specifies that “this heightened care for earmarked content should be limited to cases
of high risks of significant economic harm, which ought to be properly justified by
rightholders. This mechanism should not lead to a disproportionate burden on service
providers nor to a general monitoring obligation”56. Moreover, the additional human
review for earmarked content may only be undertaken “when proportionate, possible
and practicable”57. However, if the legitimate concern was to prevent particularly
time-sensitive content from going online58 and thus causing significant harm to
rightholders, this should have been included in the assessment of what is manifestly
infringing and can be blocked automatically. In this scenario, earmarking would then

54
COM(2021) 288 final, p. 22.
55
CJEU, C-683/18 (YouTube and Cyando), para. 113.
56
COM(2021) 288 final, p. 22.
57
COM(2021) 288 final, p. 22.
58
Such as a pre-release of music or a film, which understandably needs to be avoided and where the
use of automated tools seems legitimate to safeguard the interests deriving from Art. 17(2) of the
Charter of Fundamental Rights of the EU. Another situation, however not listed in the Guidance, would
be when the content has been taken down after a court decision, in order to avoid that it is uploaded
again (in this spirit, the CJEU in its Glawischnig-Piesczek decision (CJEU, Judgment of 03.10.2019,
Glawischnig-Piesczek, Case C-18/18, EU:C:2019:821, para. 53) held that an injunction to remove
content identical or equivalent to content previously declared unlawful by a court would be compatible
with Article 15(1) E-Commerce Directive and would thus not fall under the ban on general monitoring).



only be used to provide information from the rightholders to the platform about
time-sensitive works, so that the platform can be particularly attentive to this content,
without attaching additional obligations or duties of care.

6. Assessment and outlook: Constructing a virtuous content moderation framework for digital platforms in the EU

The Guidance is what it is, a guidance and not hard law, but it can be argued that it
does not provide what it promises: helping Member States to implement Article 17
into their national law in a fundamental rights-compliant and harmonized manner and
thus to establish a virtuous legal framework for content moderation by platforms in
the EU. Nevertheless, the approach taken by the Commission should be seen as a
step in the right direction. It tries to keep restrictions on user uploads to a
minimum whilst also having regard to the interests of rightholders. Interestingly, it also
specifies that the assessment of whether an OCSSP has made ‘best efforts’ “should
be made on a case-by-case basis, according to the proportionality principle set out in
Article 17(5) and within the respect of Article 17(7) to (9).”59 However, the guidance
provided for OCSSPs is, unfortunately, often still too vague and imprecise to serve
as a guideline for performing their obligations with best efforts.

Second, an important role is assigned to OCSSPs in moderating content. For
earmarked content, OCSSPs have to conduct an ex ante human review and, if
necessary, reinstate an upload if the pre-emptively blocked or, to put it in a more
politically palatable formulation, ‘delayed’ content is found to be lawful. But also in other
circumstances, for example when a user protests against ex ante blocking, or when a
rightholder initiates an ex post complaint (with a potential counterclaim by the
concerned user), OCSSPs are the first instance of institutional adjudication. Although
Article 17(9) provides that users should have access to out-of-court redress

59
COM(2021) 288 final, p. 13. According to the Commission, “this means in practice that online content-
sharing service providers should not be expected to apply the most costly or sophisticated solutions if
this would be disproportionate in their specific case. This applies also in the case of content earmarked
by the relevant rightholders as content whose availability could cause significant harm to them (…).
Moreover, as explained in Recital 66, it cannot be excluded that in some cases unauthorised content
can be avoided only following a notification by rightholders”.



mechanisms and access to the courts of the Member States, crucial decisions will be
made by private operators and, in addition, at their expense. As a result, OCSSPs
incur a double burden: the primary liability to obtain authorization for uploads made by
their users, and the duty to mediate, technologically and quasi-judicially, between rightholders
and users.

The Guidance is problematic as it leaves some of the most fundamental questions
unanswered, and the gaps left by the Commission will have to be filled by the MS
during the transposition of the CDSM Directive. The Guidance was supposed to
provide MS with precisely that: guidance, so that the respective national transpositions
could achieve some form of harmonization in the EU. It must be stated that the
Guidance does not perform the function it should have had according to Article 17(10)
CDSM Directive. The core of this conundrum lies in the tension between the obligation
of best efforts and the obligation of result in, respectively, Article 17(4) and Article 17(7). The
Commission expressly establishes a hierarchy to the effect that the obligation to
ensure the availability of lawful content, of whichever nature, supersedes the
obligation to ensure the unavailability of unlawful content.60 Yet at least temporarily,
lawful content could be made unavailable if it contains elements of earmarked content,
which is problematic from a freedom of expression perspective.61

On the positive side, the Guidance provides important clarification in relation to the
mandatory nature of certain exceptions and limitations which are referred to in Article
17(7). According to this provision, MS must ensure that users uploading content to
online content-sharing services are able to rely on the exceptions for quotation,
criticism and review and for uses for the purpose of caricature, parody or pastiche.
Exceptions with a similar scope are already contained in the InfoSoc Directive, where
they are, however, only optional, meaning that MS are not obliged to implement them.
The Guidance now provides that, at least for uses falling within the scope of Article
17, these exceptions must be implemented into national law, and they should be
understood as autonomous notions of EU law.62 The now mandatory nature of these

60
COM(2021) 288 final, pp. 2-3.
61
Geiger & Jütte (n 5), pp. 23-24.
62
COM(2021) 288 final, p. 19.



selected exceptions and their autonomous nature will add some degree of legal
certainty. However, the Commission has missed an opportunity to make these
exceptions mandatory also for offline uses.63 It is hard to understand why MS should
be obliged to enable the making of a parody or a quotation online on a platform when
uploading content, but not obliged to enable the same acts in relation to the same work
offline. Such a differentiation makes no sense, and it is hoped that the issue of the
harmonisation of exceptions and limitations will soon be taken up by the legislator64 or
by the CJEU, which could declare all exceptions and limitations autonomous concepts
of EU law (at least those justified by fundamental rights).65

The Guidance also fails its mission for other reasons. First, it simply comes too late.
At a point in time – a weekend before the transposition period expired – when three
MS had already implemented Article 17 and others were in advanced stages of
transposition, such guidance does not serve to assist national legislatures in drafting
proper and systematically sound legislation. The result of this absence of direction,
before and after the publication of the Guidance, is that the transposition models in
the various MS already vary significantly.66 Therefore, it simply does not provide the
legal certainty that the wording of Article 17 itself already failed to provide, and, as a
result, the Directive will unfortunately not achieve the necessary harmonization.67 The

63
This is particularly remarkable considering that the CJEU has suggested that certain exceptions are
quasi-mandatory because of their importance for the exercise of fundamental rights, see CJEU,
Judgment of 29.07.2019, Pelham and others, Case C-476/17, EU:C:2019:624, para. 60; CJEU, C-
469/17 (Funke Medien NRW), para. 58; CJEU, C-516/17 (Spiegel Online), para. 43; see also Bernd
Justin Jütte & João Pedro Quintais, ’The Pelham Chronicles: sampling, copyright and fundamental
rights’ [2021] Journal of Intellectual Property Law & Practice 213–225, p. 223.
64
See in this sense e.g. Christophe Geiger, Giancarlo Frosio, & Oleksander Bulayenko, ’The EU
Commission’s Proposal to Reform Copyright Limitations: A Good but Far Too Timid Step in the Right
Direction’ [2018] European Intellectual Property Review 4.
65
Christophe Geiger, ’The Role of the Court of Justice of the European Union: Harmonizing, Creating
and sometimes Disrupting Copyright Law in the European Union’, in: Irini Stamatoudi (ed), New
Developments in EU and International Copyright Law (Kluwer Law International 2016).
66
See for example the transpositions of Germany and Finland, which are more considerate of the rights
of users and do not include special rightholder safeguards for earmarked content: see Paul Keller,
Finnish Article 17 implementation proposal prohibits the use of automated upload filters, available at:
http://copyrightblog.kluweriplaw.com/2020/12/23/finnish-article-17-implementation-proposal-prohibits-
the-use-of-automated-upload-filters/, accessed: 24.12.2020 and Julia Reda & Joschka Selinger,
Germany attempts to square the circle in its implementation of Article 17 CDSMD – Part 1, available at:
http://copyrightblog.kluweriplaw.com/2021/06/02/germany-attempts-to-square-the-circle-in-its-
implementation-of-article-17-cdsmd-part-1/, accessed: 11.06.2021.
67
Geiger & Jütte (n 5), p. 25. See also, critically, in relation to the failure to effectively harmonize
exceptions within the scope of Article 17 CDSM Directive: Bernd Justin Jütte & Giulia Priora, A further
step into a systematic distortion: The EC Guidance on Article 17 CDSM Directive further complicates



indications provided in the Guidance, but also the YouTube/Cyando judgment, demand
caution against too easily affirming that Article 17 is in compliance with fundamental rights and
general principles of EU law.

The Guidance does, however, come in time for, or has possibly even purposively
delayed, the Opinion of AG Saugmandsgaard Øe in Case C-401/19. The AG must now
assess, considering the ‘guidance’ provided and looking at the evidence presented by
the MS in the national transpositions, whether Article 17 can be implemented in a manner
that complies with fundamental rights. We have argued elsewhere that the Court
should not restrict its review only to the right to freedom of expression, because other
fundamental rights are inextricably linked to the question whether Article 17 obliges
OCSSPs to monitor and filter in a general manner.68 And other important questions
remain. The technological limitations and the anticipated amount of earmarked content,
and of user complaints as a result, cast doubt on whether the mechanisms required under
Article 17 are able to provide sufficient reassurance that the rights of users are
respected. Whether OCSSPs will realistically be able to implement technological
solutions that prevent the upload of earmarked content while at the same time
ensuring an ample level of human review is doubtful. One can only hope that all these
elements will be sufficient for the CJEU to annul a provision that, after causing so much
dissatisfaction, still carries so much uncertainty, and that a virtuous legal framework for
content moderation will be elaborated in the near future, possibly in the context of the
upcoming DSA regulation under discussion.69 The legitimacy and acceptability of the
copyright system would surely be enhanced as a result.70

copyright exceptions, available at: http://copyrightblog.kluweriplaw.com/2021/06/09/a-further-step-into-
a-systematic-distortion-the-ec-guidance-on-article-17-cdsm-directive-further-complicates-copyright-
exceptions/, accessed: 11.06.2021.
68
See Geiger & Jütte (n 5), p. 24.
69
The DSA also deals with the same issue of platform liability and content moderation, and faces
similar challenges with regard to fundamental rights. See Frosio & Geiger (n 12); Naomi Appelman,
João Pedro Quintais & Ronan Fahy, Article 12 DSA: Will platforms be required to apply EU fundamental
rights in content moderation decisions?, available at: https://dsa-observatory.eu/2021/05/31/article-12-
dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/,
accessed 28.06.2021.
70
It has to be noted, however, that as a result of the recent YouTube decisions, the Court of Justice
has already laid the ground for a quite elaborate OCSSP platform liability regime, which would be
applicable if Article 17 is annulled: CJEU, C-683/18 (YouTube and Cyando). Thus, in the transitory
period until a new (possibly horizontal) regime for content moderation is elaborated, a particular duty of
care for platforms will already be in place, as a platform will not be considered to communicate
the work to the public “unless it contributes, beyond merely making that platform available, to giving



access to such content to the public in breach of copyright. That is the case, inter alia, where that
operator has specific knowledge that protected content is available illegally on its platform and refrains
from expeditiously deleting it or blocking access to it, or where that operator, despite the fact that it
knows or ought to know, in a general sense, that users of its platform are making protected content
available to the public illegally via its platform, refrains from putting in place the appropriate
technological measures that can be expected from a reasonably diligent operator in its situation in order
to counter credibly and effectively copyright infringements on that platform, or where that operator
participates in selecting protected content illegally communicated to the public, provides tools on its
platform specifically intended for the illegal sharing of such content or knowingly promotes such sharing,
which may be attested by the fact that that operator has adopted a financial model that encourages
users of its platform illegally to communicate protected content to the public via that platform” (emphasis
added). Hopefully, the notion of “appropriate technological measures” to be implemented by platforms
does not refer to automated filtering, or is at least constrained to situations of “manifestly illegal” uploads
understood in a restrictive sense.
