
Before the

Library of Congress
U.S. Copyright Office
Washington, DC

In re
Section 512 Study: Notice and Request for
Public Comment

Docket No. 2015-7

Facebook, Inc. respectfully submits the following comments in response to the U.S.
Copyright Office’s Notice of Inquiry regarding section 512 of the DMCA, published on
December 31, 2015. U.S. Copyright Office, Section 512 Study: Notice and Request for Public
Comment, 80 Fed. Reg. 81862 (Dec. 31, 2015).
By way of background, Facebook is a service available online and as a mobile
application that is used by nearly 1.6 billion people around the world. Facebook’s mission is to
give people the power to share and to make the world more open and connected, allowing users
to stay in touch with friends and family, to discover what is going on in the world, and to share
and express what matters to them. Facebook has emerged as an important platform for speech
and a key arena for individuals and entities to disseminate and discover information, ideas,
opinions, photos, activities, videos, news, and so on. Facebook also provides businesses, brands,
organizations, celebrities, and others with new avenues for sharing their content and new
avenues for consumers to engage with that content.
General Effectiveness of Safe Harbors

Are the Section 512 safe harbors working as Congress intended?

Yes. Section 512 was intended to “[p]reserve[] strong incentives for service providers
and copyright owners to cooperate to detect and deal with copyright infringements that take
place in the digital networked environment,” S. Rep. No. 105-190, 105th Cong., 2d Sess. 40
(1998), with the goal of ensuring that “the efficiency of the Internet w[ould] continue to improve
and that the variety and quality of services on the Internet w[ould] expand.” Id. at 2. And as the
Notice acknowledges, the purpose of the section 512 safe harbors was also to provide “greater
certainty to service providers concerning their legal exposure for infringements that may occur in
the course of their activities.” Id. at 20.
These objectives have been realized. The success of section 512 is attributable to the fact
that it prescribes an appropriately shared and apportioned burden as between rights owners and
service providers to address infringing conduct by third parties. To this end, section 512 states
that service providers do not have damages liability for content posted by their users so long as
they remove any such infringing content upon obtaining actual or constructive knowledge of it.

In compliance with section 512, covered online service providers have implemented robust
notice-and-takedown programs to ensure that content reported to them as infringing is
expeditiously removed.
Facebook’s policies and procedures illustrate the efficacy of the balance of
responsibilities laid out in the DMCA. These begin with Facebook’s Statement of Rights and
Responsibilities as well as its Community Standards, which prohibit users from posting content
that infringes third parties’ intellectual property rights. In addition, Facebook allows intellectual
property rights owners (or their agents) to report content to Facebook through various means,
including its online reporting forms and its DMCA designated agent (registered with the
Copyright Office pursuant to section 512(c)(2)). For their part, rights owners frequently avail
themselves of these reporting mechanisms, and Facebook maintains a global notice-and-takedown team that removes content in response to valid reports of alleged infringement. Also
in keeping with section 512, Facebook has implemented a policy pursuant to which it disables
the accounts of repeat infringers in appropriate circumstances. Information about these measures
and additional topics relating to intellectual property is included in Facebook's Intellectual Property Help Center. In sum, these measures, which properly allocate responsibilities among the relevant stakeholders
and ensure that instances of online infringement are effectively addressed, demonstrate that
section 512 is working as intended.

Have courts properly construed the entities and activities covered by the section 512
safe harbors?

Yes. Courts have interpreted the definition of “online service provider” in section
512(k)(1)(A) as Congress intended. Consistent with the statutory definition, courts have
properly found that a wide variety of entities constitute online service providers, including
webhosting services, online discussion fora, photo- and video-sharing sites, celebrity-oriented
fan sites, and many others. These interpretations are in keeping with the deliberately broad
definition of “service provider” contained in the statute. Courts also have appropriately applied
the safe harbors to activities such as organizing, categorizing, formatting, and internal copying of
user-generated content, again consistent with the statute.
The courts’ interpretations of section 512 have achieved what Congress intended in
enacting section 512. As noted, section 512 was designed to provide “incentives for service
providers and copyright owners to cooperate to detect and deal with copyright infringements,” S.
Rep. No. 105-190 at 40, and to provide “greater certainty to service providers concerning their
legal exposure for infringements that may occur in the course of their activities.” Id. at 20. The
clarity with which policing responsibilities are allocated in section 512, and the certainty such
clarity has provided to services like Facebook, have contributed significantly to the growth of the
digital economy and to the dramatic increase in the availability of permissible user-generated
content in all media on platforms that rely on section 512.



How have section 512’s limitations on liability for online service providers impacted
the growth and development of online services?

Section 512 has played an important role in Facebook’s growth by creating reasonable
and effective means for stakeholders to address online infringement and by delineating these
stakeholders’ respective roles and responsibilities. Facebook’s history bears this out. While
many factors surely have played a part in Facebook's growth over the years, the predictability
afforded by section 512's limitations on liability has provided important assurance that
Facebook could expand its business around user-generated content so long as it complied with
section 512. The same appears to have held true for the many other U.S. online service providers
that have thrived since the enactment of section 512.

How have section 512's limitations on liability for online service providers impacted
the protection and value of copyrighted works, including licensing markets for such
works?
Section 512 has helped expand opportunities for copyright owners to enhance the value
of their works. The purpose of the statute was to foster the growth of the Internet, while also
ensuring that intellectual property rights are protected. To this end, Facebook has become a
prominent platform for the promotion and dissemination of copyrighted works and for the
discovery of those works by users around the globe. Content creators ranging from television
networks and movie studios to news and book publishers regularly use Facebook to distribute a
wide variety of creative content to Facebook’s community of some 1.6 billion users. Indeed,
some of the most engaging content on the platform originates from these rights owners,
including breaking news, complete episodes of television shows, full-length theatrical movies,
and more.

Do the section 512 safe harbors strike the correct balance between copyright owners
and online service providers?

Yes. The existing rights and responsibilities set forth in section 512 have broadly
achieved a balance that has (1) allowed the digital economy to grow by allowing intermediaries
to flexibly conduct their business; (2) enabled rights owners and others to address infringing or
otherwise illegal content, including by reporting content to intermediaries and working with
intermediaries on additional voluntary anti-infringement measures; and (3) protected the public’s
freedom of expression and access to information. In the first instance, the allocation of
responsibilities under the section 512 safe harbors appropriately reflects the fact that rights
owners are best suited to determine whether their rights are being infringed in a particular
circumstance. The burden then shifts to the intermediary to remove infringing content when
rights owners notify it of that content. In meeting this obligation, Facebook has invested
substantial human, financial, and technological resources in addressing online infringement.
These resources are described in further detail throughout these comments.


Notice-and-Takedown Process

How effective is section 512's notice-and-takedown process for addressing online infringement?

It is quite effective. As described in further detail in response to Question #1 above,
Facebook has implemented many measures aimed at protecting intellectual property rights on its
platform. These include a global notice-and-takedown team dedicated to responding to reports
of intellectual property infringement as well as a policy to terminate repeat infringers when
appropriate. These measures, and others discussed throughout these comments, ensure that
infringement by users of Facebook’s service is effectively addressed. Indeed, consistent with the
statute, reported content is quickly removed from Facebook’s service, and the accounts of repeat
infringers are disabled. The DMCA, which has encouraged cooperation between online services
and rights owners, has also allowed these stakeholders to develop voluntary measures aimed at
combatting online infringement (including those discussed in response to Questions #11 and #15
below). In sum, and as discussed further in response to Question #7, while the DMCA by
necessity imposes some burden on the respective parties, its procedures unquestionably result in
the effective and consistent removal of infringing content from the Internet.

How efficient or burdensome is section 512’s notice-and-takedown process for
addressing online infringement? Is it a workable solution over the long run?

Section 512’s notice-and-takedown process has proven to be not only a workable
solution, but also an efficient one, and it will remain so over the long run. To be sure,
administering its notice-and-takedown program requires a substantial devotion of resources by
Facebook. But Facebook takes intellectual property rights, as well as its obligations under
section 512, very seriously and invests heavily in the people and products needed to meet its
obligations. This includes global coverage, across multiple languages, in order to process
copyright and other intellectual property reports submitted by rights owners. While, as noted,
the section 512 notice-and-takedown process necessarily imposes some burdens on stakeholders,
those burdens are reasonable and appropriate to address online infringement.
The process is also efficient, with Facebook removing reported infringing content
expeditiously after the submission of a rights owner’s report. Section 512 has additional built-in
efficiencies, most notably its requirement that online service providers have adopted and
reasonably implemented a repeat infringer policy. This requirement means that a user who has
been reported multiple times for infringement will be unable to continue to engage in infringing
conduct on the platform because the online service provider is statutorily obligated to take action
against that user. Thus, rights owners will not need to continue reporting a user who has
repeatedly infringed copyright because the user will be terminated pursuant to the repeat
infringer policy.

In what ways does the process work differently for individuals, small-scale entities,
and/or large-scale entities that are sending and/or receiving takedown notices?

The flexibility afforded by the section 512 safe harbors has properly allowed for the
relevant stakeholders to adapt their practices in ways consistent with their resources, while also


staying in compliance with DMCA requirements. Indeed, even at the time the DMCA was
enacted, there was already an array of services operating at different scales. The DMCA
procedures worked effectively at that time and continue to do so today, even as “the variety and
quality of services on the Internet [has] expand[ed].” S. Rep. No. 105-190 at 2. To be sure,
Facebook likely has a greater capacity to develop and implement procedures for complying with
section 512 than many smaller entities, and what may be financially or technologically feasible
for Facebook very well may not be feasible for such smaller entities. But imposing greater
obligations on entities with more resources would assuredly stifle innovation, contrary to
Congress's intent in enacting the DMCA.

Please address the role of both "human" and automated notice-and-takedown
processes under section 512, including their respective feasibility, benefits, and
limitations.

There are two aspects of the notice-and-takedown process that implicate the
human/automated distinction. First, some rights owners submit reports under section 512
manually (i.e., by humans), while others use automated technologies to identify potentially
infringing content and/or to submit reports. Second, some online service providers, in certain
circumstances, may use automation to process reports received under section 512 and to remove
content identified in those reports.
On the reporting side, some degree of human review and rights owners’ analysis of
whether reported content is infringing may be necessary prior to submission of a report. In
Facebook’s experience, the exclusive use of automation by some rights owners has been
responsible for inaccurate reporting. This has included automatically generated reports directed
to what are obviously permissible uses. Facebook has found that, in many circumstances,
automation can make it difficult for rights owners to distinguish legitimate uses from infringing
ones, notwithstanding their obligation to take non-infringing purposes into account in submitting
reports under section 512. This includes, for example, reports targeted at obviously fair or
licensed uses. In fact, the use of automation (or at least automation that is unable to differentiate
between permissible and impermissible uses) may be inconsistent with a rights owner’s
obligation to have a good-faith basis for any reports submitted under section 512.
With respect to Facebook’s processing of reports submitted by rights owners under
section 512, Facebook does not currently rely on automation to remove reported content.
Instead, as noted elsewhere, Facebook employs a global team to review submitted reports and to
remove allegedly infringing content identified in those reports. While future circumstances may
necessitate a different approach, such human review, while imposing additional burdens, is
important now to help ensure that submitted reports are valid and complete and to guard against
reports that may be fraudulent or submitted in bad faith.

Does the notice-and-takedown process sufficiently address the reappearance of
infringing material previously removed by a service provider in response to a
notice? If not, what should be done to address this concern?

The problem of infringing material reappearing after being removed is best addressed by
diligently enforced repeat infringer policies. As stated in Facebook’s Intellectual Property Help


Center, Facebook has implemented a repeat infringer policy and, consistent with this policy,
disables the accounts of repeat infringers in appropriate circumstances, which prevents them
from continuing to engage in infringement. This practice, along with the regular submission of
reports by rights owners and expeditious removal of content in response to those reports, helps
provide the necessary safeguards against the potential reappearance of infringing material.
The alternative of requiring service providers to affirmatively prevent the reposting of
previously reported material after it has been removed may sound straightforward, but in fact it is
inconsistent with settled legal principles, results in a poor user experience, and is technologically
difficult to accomplish. Based on Facebook’s experience, implementing a “stay down” policy to
screen out identical content on an ongoing basis, without any subsequent reports, would result in
potentially large amounts of content being blocked even when that content may be perfectly
permissible in another context. A user’s upload of copyrighted content may be infringing in one
instance but not in others based on fair use, a licensing arrangement, or a host of other reasons.
To impose a “stay down” obligation on intermediaries would automatically eliminate
consideration of these factors and could significantly impede free expression, to the detriment of
the broader public.
In addition, even for sophisticated services like Facebook, it is technologically difficult, if
not impossible, to ensure that the same or similar content, once reported, will stay down
indefinitely. Slight variations in the content could result in its reappearance, and users intent on
gaming the system likely would find workarounds. Imposing liability under these circumstances
could be ruinous and, indeed, could be an insurmountable obstacle for small and/or start-up
services. Congress appeared to recognize these realities in enacting section 512(m), which
provides that the DMCA’s limitations on liability are not conditioned on a service provider
“monitoring its service or affirmatively seeking facts indicating infringing activity.” There is no
way to reconcile a “stay down” obligation with section 512(m). On its face, such an obligation
would require a service provider to affirmatively identify and block content across its platform
on an ongoing basis. Any revision of section 512 that would limit section 512(m) would conflict
with Congress’s well-considered judgment as to the proper allocation of responsibility for
policing infringing user-generated content online.

Are there technologies or processes that would improve the efficiency and/or
effectiveness of the notice-and-takedown process?

One of the strengths of section 512 is the flexibility it provides to online service providers
and rights owners to cooperate with each other to combat online infringement. The procedures
delineated in section 512 provide the baseline for this cooperation while preserving breathing
room for interested parties to build upon those procedures as needed. In Facebook’s experience,
this cooperation has resulted in numerous improvements and enhancements to its anti-infringement policies and practices. Some of these are necessarily confidential, but one that has
been publicly disclosed is Facebook’s new copyright management tool. This tool, announced in
August 2015, supplements Facebook’s other anti-infringement measures and is intended for
rights owners whose video content may be subject to infringement. Currently in beta, the tool
flags uploaded videos that match the rights owners’ content and allows those rights owners to
quickly and efficiently report the videos to Facebook for removal.


Such dynamic solutions would be less likely to emerge in a setting other than what currently exists
under section 512. For example, a statutory provision designating specific solutions would
almost certainly be outdated as soon as it was enacted. Moreover, any attempt to statutorily
require technological solutions would effectively impose greater obligations on service providers
and concomitantly expose them to greater potential liability if they fail to meet those heightened obligations.

Does the notice-and-takedown process sufficiently protect against fraudulent,
abusive or unfounded notices? If not, what should be done to address this concern?

By its terms, section 512(f) provides adequate protection against fraudulent, abusive, or
unfounded notices, but courts have occasionally construed those terms too narrowly. As the
DMCA makes clear, the initial burden to ensure that notices are not fraudulent, abusive, or
unfounded rests with rights owners, who should take steps to ensure that their notices are legally
valid and made in good faith. To encourage that practice, courts should more readily enforce
section 512(f) and impose appropriate sanctions in cases where rights owners are found to have
submitted invalid notices.

Has section 512(d), which addresses “information location tools,” been a useful
mechanism to address infringement that occurs as a result of a service provider’s
referring or linking to infringing content? If not, what should be done to address
this concern?
No response.


Have courts properly interpreted the meaning of “representative list” under section
512(c)(3)(A)(ii)? If not, what should be done to address this concern?

As a general matter, courts have properly held that takedown notices must identify with
specificity the nature of the infringement at issue so as not to put the service provider in the
position of having to make that determination on its own or to engage in monitoring, which
contravenes section 512(m). The courts’ holdings necessarily mean that rights owners must
specifically identify the copyrighted work claimed to have been infringed and the material that is
claimed to be infringing. The Notice indicates that some rights owners have questioned whether
this specificity is at odds with the authorization in section 512(c)(3)(A)(ii) of a “representative
list” of works infringed. This is not the case. To the extent rights owners seek broader
permission to use a “representative list” to relieve themselves of the burden of identifying
specific claimed infringements, any such burden-shifting would undermine the careful balance of
obligations codified in section 512. Indeed, in Facebook’s experience, a rights owner’s failure to
specifically identify all infringed works (and thus all instances of infringement) compels the very
proactive monitoring that section 512(m) expressly does not require. And perhaps more
importantly, without knowing each of the works that the rights owner believes is being infringed,
it is impossible for a service to take any meaningful action to identify and stop the alleged
infringement at issue.



Please describe, and assess the effectiveness of, voluntary measures and best
practices – including financial measures, content “filtering” and takedown
procedures – that have been undertaken by interested parties to supplement or
improve the efficacy of section 512’s notice-and-takedown process.

Facebook has implemented various voluntary measures and best practices to supplement
its notice-and-takedown process. Some of these are confidential, but it is a matter of public
record that for many years Facebook has used Audible Magic, which is a third-party service that
maintains a database of copyrighted audio and audiovisual content, including songs, movies, and
television programs. When a Facebook user attempts to upload a video that matches content in
the Audible Magic database, that system is designed to block the video from being uploaded,
subject to certain criteria such as the right of the user to appeal the block if the user has the rights
or is otherwise entitled to upload the video. (The Audible Magic functionality is different from
Facebook’s own copyright matching technology, discussed in response to Question #11 above,
because Audible Magic blocks matching videos from being uploaded to Facebook at the outset.
Facebook’s copyright matching technology, somewhat differently, identifies matching videos
that have been uploaded, but then allows rights owners to report those videos via a notice under
section 512.)
The flexibility of section 512, which does not require any proactive measures, let alone
require particular ones, has greatly enhanced Facebook’s ability to explore this terrain. If
Facebook were required to take proactive measures, at risk of liability, it would lessen its
appetite to experiment with tools that may or may not work (or not work as well as envisioned).
The voluntary nature of the current regime allows for, and encourages, experimentation and
cooperation with rights owners to continue to explore new solutions.
Counter Notifications

How effective is the counter-notification process for addressing false and mistaken
assertions of infringement?

In circumstances that qualify for the counter-notification procedures set forth in section
512(g), Facebook follows those procedures and has found them to be workable and effective.
From Facebook’s standpoint, the procedures are simple and not overly burdensome, and they
involve an appropriate interplay between rights owners and users concerning the legality of
particular content, facilitated by the platform. This process leads to greater precision in
identifying and remedying infringements and guards against false and mistaken assertions of infringement.

How efficient or burdensome is the counter-notification process for users and
service providers? Is it a workable solution in the long run?
See response to Question #16 above.


In what ways does the process work differently for individuals, small-scale entities,
and/or large-scale entities that are sending and/or receiving counter notifications?
No response.


Legal Standards

Assess courts’ interpretations of the “actual” and “red flag” knowledge standards
under the section 512 safe harbors, including the role of “willful blindness” and
section 512(m)(1) (limiting the duty of a service provider to monitor for infringing
activity) in such analyses. How are judicial interpretations impacting the
effectiveness of section 512?

Courts have correctly construed the “actual” and “red flag” knowledge standards when
they have interpreted these standards to require knowledge of specific, identifiable instances of
infringement. In the case of “red flag” knowledge, the standard appropriately has been held to
require subjective knowledge of facts that would have made the specific infringement obvious to
a reasonable person. In this regard, courts have correctly held that knowledge should not be
imputed to an online service provider merely because, without more, content on its service may
be commercial or well-known. Courts also have correctly held that the mere fact that an
employee viewed particular material is not enough to give rise to "red flag" knowledge of infringement.
With respect to willful blindness, courts have rightly recognized that this type of imputed
knowledge requires conscious avoidance of facts concerning specific infringements and have
made clear that willful blindness is not synonymous with failure to monitor infringement
proactively (even if there may be general awareness of infringing conduct). In other words,
courts have made clear that neither “red flag” knowledge nor “willful blindness” can be
construed to trigger a duty to investigate whether content is infringing, which would conflict
with section 512(m). Such interpretations of these alternative forms of knowledge are essential
to preserving the integrity of the DMCA safe harbors.

Assess courts’ interpretations of the “financial benefit” and “right and ability to
control” standards under the section 512 safe harbors. How are judicial
interpretations impacting the effectiveness of section 512?

Courts have properly interpreted the “right and ability to control” and “financial benefit
directly attributable to the infringing activity” standards as used in section 512(c)(1)(B). With
respect to the control prong, courts have correctly held that it must mean more than the reserved
right to block or remove access to material posted by users (which is required by section
512(c)(1)(C)). Likewise, courts have made clear that actions such as providing formatting and
general content instructions, inviting and prescreening user submissions, offering certain
assistance to users, and enhancing the presentation of content do not, without more, jeopardize
DMCA safe harbor protections. This should not change, as platforms need the latitude to
optimize the appearance and organization of user-generated content in a manner they deem
appropriate and that users expect.
With respect to the financial benefit prong, courts have appropriately followed the statute
and held that merely charging for an online service or selling advertisements against user-generated content that may be infringing does not constitute the receipt of a financial benefit
attributable to infringement. Rather, there must be a demonstrable causal relationship between


infringing content and a benefit to the service provider, and such a benefit is not established
merely by the fact that some infringing content appears on a platform that generates revenue.
Given the foregoing, Facebook does not believe there is any need to modify section
512(c)(1)(B) or section 512(c)(1)(C). Both provisions allow for necessary flexibility in how a
platform, operating in good faith, handles and monetizes user-generated content without
endangering the DMCA’s safe harbor protection.

Describe any other judicial interpretations of section 512 that impact its
effectiveness, and why.
No response.

Repeat Infringers

Describe and address the effectiveness of repeat infringer policies as referenced in
section 512(i)(A).

Section 512(i)(A) has proven to be effective at preventing users intent on repeatedly
infringing others’ copyrights from continuing to do so. As noted above, Facebook has adopted
and implemented (and informed users of) a repeat infringer policy that works as envisioned in
conjunction with Facebook’s notice-and-takedown process. Under this policy, Facebook
disables the accounts of repeat infringers, thereby preventing possible future infringements by
those users (even without any further action from rights owners). In short, repeat
infringer policies under the DMCA are effective at curbing online infringement.

Is there sufficient clarity in the law as to what constitutes a repeat infringer policy
for purposes of section 512’s safe harbors? If not, what should be done to address
this concern?

Yes, there is sufficient clarity. Section 512 wisely does not purport to prescribe any
particular repeat infringer policy. Instead, it merely requires that such a policy be “adopted and
reasonably implemented” and that its existence be disclosed to users. 17 U.S.C. § 512(i)(A).
The flexibility inherent in the statute allows service providers like Facebook to take action
against repeat infringers when appropriate. As just one example, a user who appears intent on
committing ongoing blatant infringements would be disabled sooner than a user who appears to
have committed isolated inadvertent infringements or who could respond well to user education.
(Requiring disclosure of the details of a repeat infringer policy is also problematic because it
would allow bad actors to potentially circumvent the policy once they know its details.)


Standard Technical Measures

Does section 512(i) concerning service providers’ accommodation of “standard
technical measures” (including the definition of such measures set forth in section
512(i)(2)) encourage or discourage the use of technologies to address online infringement?

Facebook’s position regarding “standard technical measures” is stated in its response to
Question #25 below.

Are there existing or emerging “standard technical measures” that could or should
apply to obtain the benefits of section 512’s safe harbors?

Regardless of whether there are existing or emerging “standard technical measures,”
legislating a list of technical measures to be employed by either rights owners or service
providers would be a mistake. Given the importance of rights owners and service providers
having the latitude to experiment and innovate with anti-infringement techniques, such a
prescriptive list would be unduly limiting. In particular, service providers should be allowed and
encouraged to cooperate with rights owners and to experiment with additional measures without
fear that doing so would either trigger imputed knowledge of infringing activity or impose a
legal obligation to continue using those measures. In any event, any attempt to devise a
prescriptive list of such measures would prove ineffectual, since that list would be outdated as
soon as it was adopted.

Is section 512(g)(2)(C), which requires a copyright owner to bring a federal lawsuit
within ten business days to keep allegedly infringing content offline – and a
counter-notifying party to defend any such lawsuit – a reasonable and effective
provision? If not, how might it be improved?

As noted in response to Question #16, the counter-notification process is working as
intended.

Is the limited injunctive relief available under section 512(j) a sufficient and
effective remedy to address the posting of infringing material?

Yes. Section 512(j) appropriately restricts relief to blocking access to the infringing
material and terminating responsible users while otherwise protecting the service provider from
liability, in keeping with the purpose of the safe harbors.

Are remedies for misrepresentation set forth in section 512(f) sufficient to deter and
address fraudulent or abusive notices and counter notifications?
Facebook’s position regarding section 512(f) is stated in its response to Question #12
above.



Other Issues

Please provide any statistical or economic reports or studies that demonstrate the
effectiveness, ineffectiveness, and/or impact of section 512’s safe harbors.

As discussed throughout these comments, the success of section 512 can be seen in the
thriving ecosystem of U.S. Internet companies that have developed since the law’s enactment. In
one recent report, Deloitte estimated
that in 2014 Facebook enabled around $227 billion of economic activity and 4.5 million jobs
globally. The U.S. captured the largest share of that economic activity, some $100 billion,
supporting more than one million jobs.

Please identify and describe any pertinent issues not referenced above that the
Copyright Office should consider in conducting its study.

To reiterate Facebook’s main points: Section 512 has played an important role in
Facebook’s growth and has created a balanced framework for intellectual property protection on
the Internet. The limitations on liability offered by section 512 have allowed Facebook to
provide a global platform for entities and individuals to exchange information of all kinds
without the potentially paralyzing risk of copyright infringement liability. At the same time,
Facebook’s robust notice-and-takedown program has effectively and efficiently allowed for
prompt resolution of section 512 notices. In addition, by eschewing specific prescribed
procedures or technologies, the statute has allowed for adaptation to evolving circumstances and
has given service providers and rights owners flexibility to develop voluntary protocols
consistent with those circumstances and relevant needs (including some of the protocols
discussed in these comments). Accordingly, section 512 is working as Congress intended.
Facebook notes the concerns raised by rights owners about whether the tremendous
increase in the volume of user-generated content since enactment of the DMCA, and the
accompanying increase in online infringement, has placed an unmanageable burden on rights
owners attempting to police their rights online. It is important to recognize that the existence
of infringing conduct online presents a problem and a challenge for Facebook as well as for
rights owners. However, for the reasons explained in these comments, section 512 as presently
constructed codifies the optimal balancing of responsibilities for addressing online infringement.
As noted above, rights owners are best suited to identify infringements of their rights, and online
services are obligated under the DMCA to remove instances of infringement reported by
those rights owners and take action against repeat infringers. This apportionment of burdens has
worked as intended to further Congress’s goal of fostering the development of a thriving
Internet, to the benefit of the general public.