Abstract:
The Directive on Copyright in the Digital Single Market might fundamentally change
the way works and other subject matter protected by copyright will be used on online
platforms. Article 17 of the Directive obliges such platforms, based on information they
receive from rightholders, to ensure that copyright infringing uploads made by their
users are prevented and/or removed by an internalized procedure over which platform
operators exercise control. Important enforcement mechanisms are thereby left to
private entities solely deciding what content can (or cannot) be available on their
platforms. More problematically, the immense mass of uploads seems to be manageable only by automated, AI-based filtering technologies that are incapable of properly distinguishing between lawful and unlawful uses, which creates a serious danger of overblocking perfectly legitimate content and thus of potentially significant limitations of fundamental rights such as the right to freedom of expression. At the same time, Article 17 also forbids general monitoring of the content available on platforms and clearly mandates safeguarding users’ rights protected by exceptions and limitations to copyright such as quotation, criticism and review, or caricature, parody or pastiche. This obligation of result leads to an unsolvable conflict and to great difficulty for
* Christophe Geiger is Professor of Law at the Centre for International Intellectual Property Studies (CEIPI), University of Strasbourg (France); Affiliated Senior Researcher at the Max Planck Institute for Innovation and Competition (Munich, Germany); and Spangenberg Fellow in Law & Technology at the Spangenberg Center for Law, Technology & the Arts, Case Western Reserve University School of Law (Cleveland, US). Bernd Justin Jütte is Assistant Professor in Intellectual Property Law, University College Dublin, Sutherland School of Law (Ireland) and Senior Researcher, Vytautas Magnus University, Faculty of Law (Kaunas, Lithuania).
1 William Penn, “Some Fruits of Solitude” (London: Headley Brothers 1905), p. 67.
The Guidance on Article 17, finally issued by the Commission in June 2021 after several postponements, provides some useful clarifications and certain safeguards for users of protected works, but unfortunately does not depart from a system based on monitoring and automated filtering and is thus likely to fail to protect fundamental rights in an appropriate manner. This paper analyses the main additions and proposed interpretation tools that the Guidance brings to platforms’ content moderation as mandated by Art. 17 CDSM. It argues that, in order to establish a virtuous content moderation system and to help Member States implement Article 17 in a balanced way, it would have been essential to address the more fundamental concerns raised by Article 17, in particular the fact that privately operated algorithmic tools, and not independent assessors applying copyright law’s equilibrium, decide what content should be available online, and to acknowledge the inherent limits and flaws of technology. We conclude that, in the absence of a truly independent arbiter between the interests of users, platforms and rightholders, Article 17 of the Directive is unlikely to comply with European fundamental rights and the basic principles of EU law.
The Directive on Copyright in the Digital Single Market2 (CDSM Directive) was adopted in May 2019, and the European Union (EU) legislator left the Member States (MS) a generous transposition period until 7 June 2021.3 Nevertheless, many MS struggled to implement the Directive within this two-year period.4 One of the main reasons for these difficulties is Article 17, which changes the liability rules for so-called online content-sharing service providers (OCSSPs), a complicated provision with ten rather lengthy paragraphs full of internal contradictions resulting from heated debates and difficult last-minute compromises in the adoption process.5 Aware of its complexity, the legislator foresaw in its last paragraph a stakeholder dialogue, on the basis of which the Commission should “issue guidance on the application of this Article” to provide the needed assistance to MS in transposing this provision.6 The stakeholder dialogue was concluded in 2020,7 but it took the Commission until 4 June 2021, the Friday before the transposition period expired on a Monday, to finally publish its guidance based on this stakeholder dialogue after several postponements.8 The Commission had given some indication
2 Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, [17.04.2019] OJ L 130, 17.5.2019, p. 92-125 (CDSM Directive). See for a global appreciation of the Directive: Séverine Dusollier, ’The 2019 Directive on Copyright in the Digital Single Market: Some progress, a few bad choices, and an overall failed ambition’ [2020] Common Market Law Review 979 and João Pedro Quintais, ’The New Copyright in the Digital Single Market Directive: A Critical Look’ [2020] European Intellectual Property Review 28.
3 Article 29(1) CDSM Directive.
4 Until the deadline expired on 7 June 2021, only three MS (Germany, Hungary and The Netherlands) had fully implemented the provisions of the CDSM Directive. See for the current transposition status of the various provisions of the CDSM Directive: Communia, DSM Directive Implementation Portal, available at: https://www.notion.so/DSM-Directive-Implementation-Portal-97518afab71247cfa27f0ddeee770673, accessed: 26.06.2021.
5 For a detailed analysis see Christophe Geiger & Bernd Justin Jütte, ’Platform Liability Under Art. 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match’ [2021 forthcoming] GRUR International, available at SSRN: http://ssrn.com/abstract=3776267.
6 Article 17(10) CDSM Directive.
7 For an interim report on the stakeholder dialogues see e.g. Paul Keller, Article 17 stakeholder dialogue: What we have learned so far – Part 1, available at: http://copyrightblog.kluweriplaw.com/2020/01/13/article-17-stakeholder-dialogue-what-we-have-learned-so-far-part-1/, accessed: 07.05.2020.
8 European Commission, Communication from the Commission to the European Parliament and the Council. Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market, COM(2021) 288 final, Brussels, European Commission, 04.06.2021.
Shortly after the Directive had been adopted, the Polish government challenged certain
parts of Article 17 through an action for annulment before the Court of Justice of the
European Union (CJEU),10 claiming that an implied obligation to filter content would
violate the right to freedom of expression. In fact, the interplay of fundamental rights
within Article 17 is far more complicated, involving further fundamental rights, including
freedom to conduct a business, freedom to impart and receive information, freedom
of artistic creativity, the right to privacy, the right to a fair trial and the right to property.11
9 See with a link to the document: Paul Keller, Commission consultation on Article 17 guidance: User rights must be protected at upload, available at: https://www.communia-association.org/2020/09/02/commission-consultation-article-17-guidance-user-rights-must-protected-upload/, accessed: 11.06.2021.
10 CJEU, Action brought on 24.07.2019, Republic of Poland v. European Parliament and Council of the European Union, Case C-401/19. On this challenge see Christophe Geiger & Bernd Justin Jütte, The Challenge to Article 17 CDSM, an opportunity to establish a future fundamental rights-compliant liability regime for online platforms, available at: http://copyrightblog.kluweriplaw.com/2021/02/11/the-challenge-to-article-17-cdsm-an-opportunity-to-establish-a-future-fundamental-rights-compliant-liability-regime-for-online-platforms/, accessed: 11.02.2021.
11 For a complete assessment of the fundamental rights at issue in Art. 17 CDSM, see Geiger & Jütte (n 5), pp. 7-13. On the importance of fundamental rights in interpreting EU intellectual property law, see e.g. Christophe Geiger, ‘“Constitutionalising” Intellectual Property Law? The Influence of Fundamental Rights on Intellectual Property in the European Union’ [2006] International Review of Intellectual Property and Competition Law 371.
12 See for example the contributions by Geiger & Jütte (n 5) and Julia Reda, Joschka Selinger & Michael Servatius, Article 17 of the Directive on Copyright in the Digital Single Market: a Fundamental Rights Assessment (Study for Gesellschaft für Freiheitsrechte), available at: https://freiheitsrechte.org/home/wp-content/uploads/2020/11/GFF_Article17_Fundamental_Rights.pdf, accessed: 26.01.2020; specifically on the rights of users see João Pedro Quintais et al., ’Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations from European Academics’ [2019] Journal of Intellectual Property, Information Technology and E-Commerce Law 277, and in the context of the ban on general monitoring and its fundamental rights justifications, see Christina Angelopoulos & Martin Senftleben, ’An Endless Odyssey? Content Moderation Without General Content Monitoring Obligations’ [2021] available at SSRN: https://ssrn.com/abstract=3871916. This is indeed a larger problem in the context of addressing unlawful content on the internet, which has also been discussed in the context of the proposed Digital Services Act (European Commission, Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final, Brussels, European Commission, 15.12.2020); see for that purpose Giancarlo Frosio & Christophe Geiger, ’Taking Fundamental Rights Seriously in the Digital Services Act’s Platform Liability Regime’ [2020] available at SSRN: https://ssrn.com/abstract=3747756.
13 On the issues discussed during the hearing and the split between the European Institutions and some MS, incl. Spain and France, see Paul Keller, CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work, available at: http://copyrightblog.kluweriplaw.com/2020/11/11/cjeu-hearing-in-the-polish-challenge-to-article-17-not-even-the-supporters-of-the-provision-agree-on-how-it-should-work/, accessed: 16.12.2020.
14 There are two reasons why the Opinion could have been postponed: first, the AG was waiting for the publication of the Guidance; second, the AG wanted to await the outcome of CJEU, Judgment of 22.06.2021, YouTube and Cyando, Joined Cases C-682/18 and C-683/18, EU:C:2021:503.
15 See only the contributions of Matthias Leistner, ’European Copyright Licensing and Infringement Liability Under Art. 17 DSM-Directive: Can We Make the New European System a Global Opportunity Instead of a Local Challenge?’ [2020] Zeitschrift für Geistiges Eigentum/Intellectual Property Journal 123; Martin Senftleben & Christina Angelopoulos, ’The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market’ [2020] available at SSRN: https://ssrn.com/abstract=3717022; Gerald Spindler, ’The Liability system of Art. 17 DSMD and national implementation – contravening prohibition of general monitoring duties?’ [2019] Journal of Intellectual Property, Information Technology and E-Commerce Law 344; Gerald Spindler, ’Art. 17 DSM-RL und dessen Vereinbarkeit mit primärem Europarecht. Zugleich ein Beitrag zu Umsetzungsmöglichkeiten’ [2020] Gewerblicher Rechtsschutz und Urheberrecht 253; Gerald Spindler, ’Upload-Filter: Umsetzungsoptionen zu Art. 17 DSM-RL’ [2020] Computer und Recht 50; Martin Husovec & João Pedro Quintais, ’How to License Article 17? Exploring the Implementation Options for the New EU Rules on Content-Sharing Platforms’ [2019] available at SSRN: https://ssrn.com/abstract=3463011 or http://dx.doi.org/10.2139/ssrn.3463011.
16 Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society, [22.05.2001] OJ L 167, 22.6.2001, p. 10-19 (InfoSoc Directive). See for example the recent ruling in YouTube/Cyando, in which the Court held that “the operator of a video-sharing platform or a file-hosting and -sharing platform, on which users can illegally make protected content available to the public, does not make a ‘communication to the public’ of that content, within the meaning of that provision, unless it contributes, beyond merely making that platform available, to giving access to such content to the public in breach of copyright” (CJEU, C-683/18 (YouTube and Cyando), para. 83); see also AG Saugmandsgaard Øe, Opinion of 16.07.2020, YouTube, Joined Cases C-682/18 and C-683/18, EU:C:2020:586, paras. 53-131. It is true that the right of communication to the public has been interpreted extensively, but, through the application of the ‘safe harbour’ regime of the e-Commerce Directive, EU law provides a liability exemption to intermediaries such as hosting platforms which are not aware of infringements taking place on their services, and further provides them with a remedial escape route if they remove, and to a certain extent prevent, uploads of infringing content. Blocking and filtering obligations, on the other hand, have been interpreted by the CJEU very restrictively, and their legitimacy has been questioned with regard to fundamental rights (see e.g. CJEU, Judgment of 16.02.2012, SABAM v Netlog, Case C-360/10, EU:C:2012:85). The Commission Guidance clarifies the relation between Article 3(1) InfoSoc Directive and Article 17(1) CDSM Directive by stating that “Article 17 is a lex specialis to Article 3 of Directive 2001/29/EC and Article 14 of Directive 2000/31/EC. It does not introduce a new right in the Union’s copyright law. Rather, it fully and specifically regulates the act of ‘communication to the public’ in the limited circumstances covered by this provision.” (COM(2021) 288 final, p. 2).
17 Article 2(6) CDSM Directive defines an OCSSP as “a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.”
18 Article 17(1), second sentence, CDSM Directive.
19 Article 17(4)(a) CDSM Directive. The Commission Guidance explains the substance of this obligation in Chapter IV, but does not go beyond outlining the possible licensing solutions. To what extent OCSSPs must act proactively to engage in negotiations with rightholders remains unclear.
20 Article 17(4)(b) CDSM Directive.
21 Article 17(4)(c) CDSM Directive.
How exactly this calibration of ‘best efforts’ obligations arising out of Article 17(4) and the obligations of result22 guaranteed under Article 17(7) could be conducted remained unclear; it was hoped that the Commission’s Guidance would bring some clarity.
It is clear from the Commission’s Guidance that filtering technology will be an essential element in the toolbox of OCSSPs to comply with their obligations under Article 17(4). The Commission does not prescribe which technology OCSSPs should employ, but it highlights content recognition as a commonly used technology while also recognizing the effectiveness of other technologies.23 The Commission also admits that these technologies cannot distinguish between lawful and unlawful uses.24 That means that the use of automated filtering will certainly lead to ex-ante blocking of ‘legitimate uses’, a possibility that the Commission and some MS had excluded prior to the publication
22 See COM(2021) 288 final, p. 26.
23 It has to be noted positively that the Commission lists a number of possible alternative technological means that can be used, including hashing, watermarking, the use of metadata and keyword search, or a combination of different technologies (COM(2021) 288 final, p. 11-12), showing that automated filtering is one tool among others and that other tools could therefore be used where they are efficient and automated filtering would be a disproportionate burden, in particular for smaller platforms. As the Commission notes, “in many cases, it is expected that service providers will rely (or continue to rely) on different technological tools in order to comply with their obligation under Article 17(4)(b)”.
24 This goes both ways, see e.g. COM(2021) 288 final, pp. 13 & 15, but the Commission also stresses that lawful content will, effectively, in many cases be blocked, particularly due to the limitations of the technological solutions currently available (p. 20).
The concept of legitimate uses, which shall not be affected by the mechanism of Article 17(4), includes uses covered by copyright exceptions and limitations and uses for which the user has obtained authorization, but also uses that concern content not protected by copyright or for which the term of protection has expired.27 This potential for over-blocking legitimate uses, and, as a result, for limiting the right to freedom of expression, lies at the heart of the Polish challenge to Article 17(4).
Instead of banning the filtering and blocking of user-uploaded content altogether, which could have been a logical step to avoid over-blocking but might have reduced the efficiency of Article 17, the Commission opted for a solution that cushions the negative impact of the application of filtering technology to user uploads. It has therefore devised a bifurcated solution. As a general rule, content recognition or similar technologies should be required to filter and block only “manifestly infringing” uploads.28 Exceptionally, other (non-“manifestly infringing”) earmarked content29 could be blocked subject to ex ante human review before the content initially becomes available.30
25 Entwurf eines Gesetzes zur Anpassung des Urheberrechts an die Erfordernisse des digitalen Binnenmarktes – Drucksachen 19/27426, 19/28171, available at: https://www.bundesrat.de/SharedDocs/drucksachen/2021/0401-0500/428-21.pdf?__blob=publicationFile&v=1, Artikel 3, §§ 9-10.
26 See Communia, German Article 17 implementation law sets the standard for protecting user rights against overblocking, available at: https://www.communia-association.org/2021/05/20/german-article-17-implementation-law-sets-the-standard-for-protecting-user-rights-against-overblocking/, accessed: 26.06.2021; under the German implementation law, an upload cannot be blocked if it uses less than 50% of an original work, combines parts of the work with other content, and if the use is either minor in nature (i.e. shorter than 15 seconds of audio or video, 160 characters of text, or 125 kb of graphics) or has been flagged by the uploader as permitted (e.g. covered by an exception).
27 COM(2021) 288 final, pp. 19-20.
28 COM(2021) 288 final, p. 20.
29 Content earmarked by the relevant rightholders as content whose availability could cause significant harm to them (COM(2021) 288 final, p. 13).
30 COM(2021) 288 final, p. 22.
31 Paul Keller, CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work, available at: http://copyrightblog.kluweriplaw.com/2020/11/11/cjeu-hearing-in-the-polish-challenge-to-article-17-not-even-the-supporters-of-the-provision-agree-on-how-it-should-work/, accessed: 16.12.2020; see also, with approval of a similar approach, Quintais et al. (n 12), p. 280.
32 COM(2021) 288 final, p. 21.
33 The Commission avoids the formulation “ex ante blocking” but refers to “ex ante human review” after which “the service provider may block the upload or make it available”. In the interim period, the result is, however, that content is temporarily blocked until it has been humanly reviewed.
34 See for example the judgment in CJEU, Judgment of 03.09.2014, Deckmyn, Case C-201/13, EU:C:2014:2132, in which the CJEU stated that interpreting the parody exception of Article 5(3)(k) of the Information Society Directive requires that “all the circumstances of the case must be taken into account” (para. 28). Such an analysis is not possible with currently available automated tools, see Evan Engstrom & Nick Feamster, ’The Limits of Filtering: A Look at the Functionality & Shortcomings of Content Detection Tools’ [2017] available at: https://www.engine.is/the-limits-of-filtering.
This is precisely the problem of automated filtering and the right to freedom of expression. The inability of automated systems to distinguish between lawful and unlawful uses will inevitably result in the blocking of lawful speech without an initial judicial determination. The Commission’s Guidance, albeit non-binding, seems to condone this outcome. It exposes certain potentially lawful expression to mechanisms without effective legal safeguards, or at least with safeguards that take effect only after the expression has been blocked. This is not to say that “manifestly infringing” is a wrong or inappropriate criterion; indeed, it might be the only way to give some legitimacy to Article 17 CDSM Directive.38 But the problems exposed here illustrate the complexity
35 See for example CJEU, Judgment of 29.07.2019, Funke Medien NRW, Case C-469/17, EU:C:2019:623, in which the CJEU in principle agreed that the upload of entire military reports can fall under the exception for quotation; similarly, the CJEU confirmed the application, in principle, of certain exceptions to the upload of a complete manuscript in CJEU, Judgment of 29.07.2019, Spiegel Online, Case C-516/17, EU:C:2019:625. On both cases see Bernd Justin Jütte & Giulia Priora, ’Leaking of secret military reports qualifies as reporting of current events’ [2020] Journal of Intellectual Property Law & Practice 681–682 and Giulia Priora & Bernd Justin Jütte, ’No copyright infringement for publication by the press of politician’s controversial essay’ [2020] Journal of Intellectual Property Law & Practice 583–584; on these decisions see Christophe Geiger & Elena Izyumenko, ’The Constitutionalization of Intellectual Property Law in the EU and the Funke Medien, Pelham and Spiegel Online Decisions of the CJEU: Progress, but Still Some Way to Go!’ [2020] International Review of Intellectual Property and Competition Law 282.
36 COM(2021) 288 final, pp. 20-21.
37 For a critique, see Geiger & Jütte (n 5), p. 26 (with reference to Christophe Geiger, ’The answer to the machine should not be the machine: safeguarding the private copy exception in the digital environment’ [2008] European Intellectual Property Review 121: “The answer to the machine should not be in the machine”, or, in short: what is acceptable online and what is not needs to be decided collectively and not by a few, and via independent mechanisms that duly safeguard fundamental rights).
38 Using the “manifestly illegal” criterion in order to determine what content could be blocked as a result of the “best efforts” obligation of Art. 17(4) has been proposed by several scholars, see Geiger & Jütte (n 5), Frosio & Geiger (n 12) and Quintais et al. (n 12). However, what is to be considered as meeting this requirement is debated. In the view of the authors, it could typically be the role of an independent regulatory authority to propose a definition of this concept and to provide examples.
39 Interestingly, the Guidance foresees that in the case of automated blocking of “manifestly illegal content”, users need to be informed: “When manifestly infringing uploads are identified and are blocked i.e. are not uploaded, users should be notified of this without undue delay and should still be able to contest the blocking, by giving reasons for their request, under the redress mechanism provided for in Article 17(9)” (COM(2021) 288 final, p. 23).
40 See Christophe Geiger & Franciska Schönherr, ’Defining the Scope of Protection of Copyright in the EU: The Need to Reconsider the Acquis regarding Limitations and Exceptions’, in: Tatiana-Eleni Synodinou (ed), Codification of European Copyright Law: Challenges and Perspectives (Kluwer Law International 2012), 133; Christophe Geiger & Franciska Schönherr, ‘Limitations to Copyright in the Digital Age’, in: Andrej Savin & Jan Trzaskowski (eds), Research Handbook on EU Internet Law (Edward Elgar 2014), 110; Christophe Geiger & Franciska Schönherr, Frequently Asked Questions (FAQ) of Consumers in relation to Copyright, Summary Report (EUIPO 2017) <https://euipo.europa.eu/ohimportal/web/observatory/observatory-publications> (listing exceptions and limitations to copyright as one of the areas of major divergence in national copyright law); Tito Rendas, Exceptions in EU Copyright Law (Kluwer Law International 2021), 153 sq.
41 On the limits of human review to fix AI-based enforcement processes, see Ben Green & Amba Kak, The False Comfort of Human Oversight as an Antidote to A.I. Harm, available at: https://slate.com/technology/2021/06/human-oversight-artificial-intelligence-laws.html, accessed: 28.06.2021: “Policymakers and companies eager to find a “regulatory fix” to harmful uses of technology must acknowledge and engage with the limits of human oversight rather than presenting human involvement—even “meaningful” human involvement—as an antidote to algorithmic harms. (…) We also need to subject human oversight to greater research and scrutiny, further studying what human oversight does and does not accomplish and how to structure human-algorithm interactions to facilitate better collaborations. This requires preliminary testing of human oversight mechanisms before they are enshrined in policy and monitoring human oversight behaviours as a standard feature of algorithmic impact assessments and A.I. audits, which are becoming popular policy mechanisms to evaluate A.I. systems.”
Moreover, the standards for such human review are not set out in the Guidance, and there is a good argument to be made, and it has indeed been made,42 in favor of review by an independent institution or by a judicial or quasi-judicial body to ensure that uploads to online content-sharing providers are properly assessed, and which could take stewardship over developing fundamental rights-compliant approaches to automated filtering, which, again, seems to be unavoidable in the context of the obligations created by Article 17(4) CDSM Directive.
Under Article 17, three interests collide, which are all representative of different fundamental rights. On the one side, OCSSPs must demonstrate best efforts to prevent infringing user uploads and remove infringing content upon notification, an obligation which is qualified by the prohibition of general monitoring.43 Users, on the other side,
42 Geiger & Jütte (n 5) and more generally Frosio & Geiger (n 12).
43 Article 17(8) CDSM Directive. More detailed on this issue, see Angelopoulos & Senftleben (n 15). With regard to the understanding of the notion of “general monitoring” in the CDSM Directive in comparison with the same notion in Article 15 of the E-Commerce Directive, the Guidance provides little insight. It is merely stated that “whilst the concept of general monitoring is not defined in Article 17(8), it is expressed therein in the same terms as in Article 15 of Directive 2000/31. However, when applying Article 17(8) first paragraph the context, scheme and objective of Article 17 and the special role of online content-sharing service providers under this provision should be taken into account”. This seems to imply that the notion is to be interpreted in the spirit and light of the CDSM Directive and thus to have a potentially slightly different scope than in Article 15. However, it is then also subject to the inherent contradictions of Article 17, and the difficult question remains how to implement the obligation of Article 17(4) while, at the same time, safeguarding Article 17(7). In our understanding, in order to be compatible with fundamental rights, it could only lead to the filtering of very limited and targeted content such as “manifestly infringing” material, understood in a very restrictive manner (see Geiger & Jütte (n 5)).
The Guidance suggests that MS can foresee that rightholders can “earmark” certain content. Earmarked content refers to “content whose availability could cause significant harm”47 to rightholders, content which is “particularly time sensitive”, such as pre-released music or films and other comparable subject matter. It is important to point out that earmarked content and its blocking privilege are not subject to the ‘manifestly infringing’ standard.48 This means that even uploads that contain only short excerpts of earmarked content, or only small amounts of earmarked content in relation to the entirety of the uploaded material, will be subject to this specific regime.
As the second prong of its bifurcated approach to blocking, the Commission suggests that specific earmarked content will, “when proportionate and where possible”, be subject to “rapid ex ante human review by online content-sharing service providers”.49 The content should be identified by rightholders and earmarking specific
44 Article 17(7) CDSM Directive.
45 See e.g. Recital 61 CDSM Directive.
46 COM(2021) 288 final, p. 11.
47 COM(2021) 288 final, p. 13; the Commission adds that earmarking such content requires rightholders to properly justify why this content carries “high risks of significant economic harm” (p. 22).
48 COM(2021) 288 final, p. 22.
49 COM(2021) 288 final, p. 22.
50 COM(2021) 288 final, p. 22.
51 COM(2021) 288 final, p. 22.
52 Not only does the introduction of a special carve-out from the ‘no-blocking’ principle further restrict the rights of users, most notably the right to freedom of expression, but it also puts further moderation obligations on OCSSPs and affects their right to conduct a business; see to that effect Julia Reda & Joschka Selinger, Germany attempts to square the circle in its implementation of Article 17 CDSMD – Part 2, available at: http://copyrightblog.kluweriplaw.com/2021/06/03/germany-attempts-to-square-the-circle-in-its-implementation-of-article-17-cdsmd-part-2/, accessed: 11.06.2021 and Geiger & Jütte (n 5), p. 24.
53 COM(2021) 288 final, p. 22.
Earmarking content will inevitably lead to filtering and blocking. The standards as to what can be earmarked must therefore be meticulously defined to avoid wholesale blocking of any economically valuable content. If they are not, earmarking could easily lead to a presumption on the part of platforms that the content is manifestly illegal, and thus potentially to over-blocking of all earmarked content to avoid liability or litigation. The CJEU underlined in YouTube/Cyando that monitoring of information is only permitted in relation to specific content.55 Admittedly, the Commission seems conscious that this additional possibility open to rightholders can have a negative impact and therefore carefully specifies that “this heightened care for earmarked content should be limited to cases of high risks of significant economic harm, which ought to be properly justified by rightholders. This mechanism should not lead to a disproportionate burden on service providers nor to a general monitoring obligation”56. Moreover, the additional human review for earmarked content may only be undertaken “when proportionate, possible and practicable”57. However, if the legitimate concern was to prevent particular time-sensitive content from going online58 and thus causing significant harm to rightholders, this should have been included in the assessment of what is manifestly infringing and can be blocked automatically. In this scenario, earmarking would then
54
COM(2021) 288 final, p. 22.
55
CJEU, C-683/18 (YouTube and Cyando), para. 113.
56
COM(2021) 288 final, p. 22.
57
COM(2021) 288 final, p. 22.
58
Such as a pre-release of music or a film, which understandably needs to be avoided and where the use of automated tools seems legitimate to safeguard the interests deriving from Art. 17(2) of the Charter of Fundamental Rights of the EU. Another situation, however not listed in the Guidance, would be when the content has been taken down following a court decision, in order to avoid that it is uploaded again (in this spirit, the CJEU in its Glawischnig-Piesczek decision (CJEU, Judgment of 03.10.2019, Glawischnig-Piesczek, Case C-18/18, EU:C:2019:821, para. 53) held that an injunction to remove content identical or equivalent to content previously declared unlawful by a court would be compatible with Article 15(1) of the E-Commerce Directive and thus not fall under the ban on general monitoring).
The Guidance is what it is, a guidance and not hard law, but it can be argued that it does not deliver what it promises: helping Member States to implement Article 17 into their national laws in a fundamental rights-compliant and harmonized manner, and thus to establish a virtuous legal framework for content moderation by platforms in the EU. Nevertheless, the approach taken by the Commission should be seen as a step in the right direction. It tries to keep restrictions on user uploads to a minimum whilst also having regard to the interests of rightholders. Interestingly, it also specifies that the assessment of whether an OCSSP has made ‘best efforts’ “should be made on a case-by-case basis, according to the proportionality principle set out in Article 17(5) and within the respect of Article 17(7) to (9).”59 However, the guidance provided for OCSSPs is, unfortunately, often still too vague and imprecise to serve as guidelines for performing their obligations with best efforts.
59
COM(2021) 288 final, p. 13. According to the Commission, “this means in practice that online content-
sharing service providers should not be expected to apply the most costly or sophisticated solutions if
this would be disproportionate in their specific case. This applies also in the case of content earmarked
by the relevant rightholders as content whose availability could cause significant harm to them (…).
Moreover, as explained in Recital 66, it cannot be excluded that in some cases unauthorised content
can be avoided only following a notification by rightholders”.
On the positive side, the Guidance provides important clarification in relation to the mandatory nature of certain exceptions and limitations referred to in Article 17(7). According to this provision, MS must ensure that users uploading content on online content-sharing services are able to rely on the exceptions for quotation, criticism and review and for uses for the purpose of caricature, parody or pastiche. Exceptions with a similar scope are already contained in the InfoSoc Directive, where they are, however, only optional, meaning that MS are not obliged to implement them. The Guidance now provides that, at least for uses falling within the scope of Article 17, these exceptions must be implemented into national law, and that they should be understood as autonomous notions of EU law.62 The now mandatory nature of these
60
COM(2021) 288 final, pp. 2-3.
61
Geiger & Jütte (n 5), pp. 23-24.
62
COM(2021) 288 final, p. 19.
The Guidance also fails its mission for other reasons. First, it simply comes too late. At a point in time – a weekend before the transposition period expired – when three MS had already implemented Article 17 and others were in advanced stages of transposition, such guidance does not serve to assist national legislatures in drafting proper and systematically sound legislation. The result of this absence of direction, before and after the publication of the Guidance, is that the transposition models in the various MS already vary significantly.66 Therefore, the Guidance simply does not provide the legal certainty that the wording of Article 17 itself already failed to provide, and, as a result, the Directive will unfortunately not achieve the necessary harmonization.67 The
63
This is particularly remarkable considering that the CJEU has suggested that certain exceptions are
quasi-mandatory because of their importance for the exercise of fundamental rights, see CJEU,
Judgment of 29.07.2019, Pelham and others, Case C-476/17, EU:C:2019:624, para. 60; CJEU, C-
469/17 (Funke Medien NRW), para. 58; CJEU, C-516/17 (Spiegel Online), para. 43; see also Bernd
Justin Jütte & Joao Pedro Quintais, ’The Pelham Chronicles: sampling, copyright and fundamental
rights’ [2021] Journal of Intellectual Property Law & Practice 213–225, p. 223.
64
See in this sense e.g. Christophe Geiger, Giancarlo Frosio, & Oleksander Bulayenko, ’The EU
Commission’s Proposal to Reform Copyright Limitations: A Good but Far Too Timid Step in the Right
Direction’ [2018] European Intellectual Property Review 4.
65
Christophe Geiger, ’The Role of the Court of Justice of the European Union: Harmonizing, Creating
and sometimes Disrupting Copyright Law in the European Union’, in: Irini Stamatoudi (ed), New
Developments in EU and International Copyright Law (Kluwer Law International 2016).
66
See for example the transpositions of Germany and Finland, which are more considerate of the rights
of users and do not include special rightholder safeguards for earmarked content: see Paul Keller,
Finnish Article 17 implementation proposal prohibits the use of automated upload filters, available at:
http://copyrightblog.kluweriplaw.com/2020/12/23/finnish-article-17-implementation-proposal-prohibits-
the-use-of-automated-upload-filters/, accessed: 24.12.2020 and Julia Reda & Joschka Selinger,
Germany attempts to square the circle in its implementation of Article 17 CDSMD – Part 1, available at:
http://copyrightblog.kluweriplaw.com/2021/06/02/germany-attempts-to-square-the-circle-in-its-
implementation-of-article-17-cdsmd-part-1/, accessed: 11.06.2021.
67
Geiger & Jütte (n 5), p. 25. See also, critical in relation to the failure to effectively harmonize exceptions within the scope of Article 17 CDSM Directive, Bernd Justin Jütte & Giulia Priora, A further step into a systematic distortion: The EC Guidance on Article 17 CDSM Directive further complicates
The Guidance does, however, come in time for, or has possibly even been purposely delayed until, the Opinion of AG Saugmandsgaard Øe in Case C-401/19. The AG must now assess, considering the ‘guidance’ provided and looking at the evidence presented by the MS in their national transpositions, whether Article 17 can be implemented in a manner that complies with fundamental rights. We have argued elsewhere that the Court should not restrict its review to the right to freedom of expression, because other fundamental rights are inextricably linked to the question whether Article 17 obliges OCSSPs to monitor and filter in a general manner.68 And other important questions remain. The technological limitations, and the anticipated amounts of earmarked content and resulting user complaints, cast doubt on whether the mechanisms required under Article 17 can provide sufficient reassurance that the rights of users are respected. Whether OCSSPs will realistically be able to implement technological solutions that prevent the upload of earmarked content while at the same time ensuring an ample level of human review is doubtful. One can only hope that all these elements will be sufficient for the CJEU to annul a provision that, after causing so much dissatisfaction, still carries so much uncertainty, and that a virtuous legal framework for content moderation will be elaborated in the near future, possibly in the context of the upcoming DSA Regulation currently under discussion.69 This would surely enhance the legitimacy and acceptability of the copyright system.70