
Neil Miller

SoundCloud Operations, Inc.
111 5th Avenue
New York, NY 10003

Before the
U.S. COPYRIGHT OFFICE
LIBRARY OF CONGRESS

In the Matter of
Section 512 Study

Docket No. 2015–7

COMMENTS OF
SOUNDCLOUD OPERATIONS, INC.
(on behalf of itself and its affiliates)

April 1, 2016


I. ORGANIZATION INFORMATION

SoundCloud Operations, Inc. and its affiliates (including SoundCloud Limited) operate the online
platform known as “SoundCloud,” which can be accessed via http://soundcloud.com and via native
mobile apps. The SoundCloud platform is a hosting platform for user-uploaded music and non-music
audio content, and is widely used by creators, including musicians, comedians, politicians, podcasters and
others to share their originally-created works with a global audience. The platform is also widely used by
companies that own or control copyrighted works, including record companies, music publishing
companies, audiobook publishers and others.

II. SUBJECTS OF INQUIRY

General Effectiveness of Safe Harbors
1. Are the section 512 safe harbors working as Congress intended?
The section 512 safe harbors are working as Congress intended. Congress was aware of the
potential of electronic commerce at the time it enacted section 512, and sought to promote its
development by limiting the risk of crippling liability that intermediaries would otherwise face without
these protections.
Companies such as Facebook, Amazon, eBay, Twitter, Instagram, Etsy, Snapchat and many more
benefit from, and in many cases depend upon, the section 512 safe harbors. These protections have
created new opportunities for commerce, creativity and communication on a scale never before seen. The
phenomenon of social media could not exist without the safe harbors, and whether one uses YouTube,
Facebook, SoundCloud or Twitter to follow President Obama1 or Grumpy Cat,2 there is no doubt that
these forms of communication are firmly established as part of contemporary culture.
More importantly, the people that are most empowered by these new services are individuals. If
you are a small manufacturer of handmade goods, you no longer have to pitch your goods to multiple
retailers in the hope that one will be persuaded to carry them—you can instead set up a store on Etsy or
Amazon Marketplace and have instant access to a global customer base. So-called “citizen journalism”
has provided the world with real-time, eyewitness reports on events of global significance via social
media3, and in the case of individual copyright owners, more creators are able to reach more consumers
than ever before without the need to contract with large global media companies. SoundCloud’s
community of 18 million creators reaches an audience of 175 million listeners every month. The
opportunity that this brings for creators is immense. No longer do aspiring artists need to post a demo
1 https://www.facebook.com/barackobama; https://twitter.com/barackobama?lang=en; https://www.youtube.com/user/whitehouse; https://soundcloud.com/whitehouse
2 http://www.businessinsider.com/grumpy-cat-has-earned-its-owner-nearly-100-million-in-just-2-years2014-12?IR=T
3 http://www.washington.edu/news/2011/09/12/new-study-quantifies-use-of-social-media-in-arab-spring/

cassette to a record company, and hope that someone might listen to it and offer them an exclusive
recording agreement. Now, artists can simply upload their works to SoundCloud and share them globally
via social media within seconds. They retain control of their copyrights, they can communicate directly
with their fans, build their own audience, control their own release schedules, decide what to share and
when, understand who’s listening to what, and know where their music is most popular and therefore
where to tour. For artists, this is incredibly powerful.
Furthermore, the system is democratic. Previously, the chances of unsigned artists getting their
albums into retail stores, or their songs played on the radio, were very slim, and even if they were
successful in doing so, their works would be unlikely to receive the same prominence, either in-store or in
terms of radio airplay, as albums or songs from established artists. On SoundCloud, however, every artist
has the same opportunity—every artist has a profile page, every track has a unique URL, and it is
SoundCloud’s community of creators and listeners that decides which tracks rise to prominence. Kansas
City rapper Rory Fresco recently signed a deal with Epic Records after SoundCloud users discovered his
track by means of SoundCloud’s “related tracks” algorithm.4 This kind of opportunity is only possible
with an open, democratic platform.
These services can also be very lucrative for this new breed of young creators. The minimum pre-tax income required to make it onto Forbes’ top 10 YouTube stars of 20155 was $2.5 million, with the
highest-grossing individual—26-year-old Swede Felix Kjellberg (known as PewDiePie to his 40 million
YouTube subscribers)—earning $12 million from his video game commentaries.
In the case of music, artists such as Arctic Monkeys, Calvin Harris and Lily Allen famously rose
to prominence on MySpace, one of the early social networks.6 Each has since forged a highly successful
career, with Arctic Monkeys headlining the prestigious Glastonbury Festival on two separate occasions,
and Calvin Harris earning a reported $66 million in 2015 alone, according to Forbes.7 New Zealand
singer-songwriter Lorde self-released her first EP via her SoundCloud page in 2012.8 Her debut album
subsequently went on to sell over 2 million copies in the U.S.; she has won multiple awards, including
two Grammys, two Billboard Music Awards, and an MTV Video Music Award; her album was certified
double-platinum by the RIAA in 2014 and is rated as Billboard’s 189th greatest album of all time (by way
of comparison, Simon & Garfunkel’s “Bridge Over Troubled Water” and The Beatles’ “Meet the Beatles”
are numbers 187 and 188 respectively).9
The positive impact of section 512 would be significantly diluted of course if pirate sites were
also able to benefit from the limitations of liability. However, that is not the case. The courts have
4 http://www.mtv.com/news/2752585/rory-fresco-soundcloud-kanye-west-epic-records/
5 http://www.forbes.com/sites/maddieberg/2015/10/14/the-worlds-highest-paid-youtube-stars2015/#545ff4b8542c
6 http://www.digitalspy.com/music/feature/a660773/what-happened-to-the-8-biggest-myspace-musicstars/
7 http://www.forbes.com/pictures/eeel45fgdkh/6-calvin-harris-66-mi/
8 http://www.billboard.com/articles/news/5687161/lorde-the-billboard-cover-story
9 http://www.billboard.com/charts/greatest-billboard-200-albums

consistently denied the protection of section 512 to services that fail to comply with its requirements
(such as reasonably implementing a repeat infringer policy), or that actively encourage infringement.
Peer-to-peer networks generally have been found ineligible for protection on technical grounds, while
pirate sites that promote infringement operate outside the protection of section 512’s safe harbor (even if
they claim to be protected by it) because these sites and services typically have knowledge of widespread
infringement and fail to take action, or actively induce infringement. For these and other reasons, pirate
sites or services such as the original Napster service, Aimster, and others have been held not to qualify for
the protection of section 512 and platforms that might otherwise have qualified have been denied
protection where they failed to reasonably implement a repeat infringer policy.10
Section 512 has enabled innovation. Services that encourage infringement are deprived of its
protection, while others that adhere to its requirements—including an ever growing list of new and
innovative services—are protected and afforded the opportunity to flourish. At the same time, copyright

10 See, e.g., Columbia Pictures Indus., Inc. v. Fung, 710 F.3d 1020, 1039-47 (9th Cir. 2013) (holding that
various providers of BitTorrent trackers that used a hybrid peer-to-peer file sharing protocol were
ineligible for section 512’s safe harbors because, among other things: (i) BitTorrent trackers are not
“service providers” for purposes of section 512(a), and (ii) they had actual knowledge and red flag
awareness of infringement, in a case where they were held liable for affirmatively inducing infringement);
In re Aimster Copyright Litig., 252 F. Supp. 2d 634 (N.D. Ill. 2002), aff’d on other grounds, 334 F.3d 643
(7th Cir. 2003) (holding that a peer-to-peer service could not benefit from section 512’s safe harbors in
part because it had failed to reasonably implement its repeat infringer policy and because material passed
between users was not transmitted “through” the system within the meaning of 17 U.S.C.A. §
512(b)(1)(B)); Capitol Records, LLC v. Escape Media Group, Inc., No. 12-cv-6646-AJN, 2015 WL
1402049, at *6-13, 44-58 (S.D.N.Y. Mar. 25, 2015) (entering summary judgment against Grooveshark
where the court found that Grooveshark had not reasonably implemented its repeat infringer policy);
Disney Enterprises, Inc. v. Hotfile Corp., Case No. 11-cv-20427, 2013 WL 6336286 (S.D. Fla. Sept. 20,
2013) (holding Hotfile ineligible for the DMCA safe harbor for material stored at the direction of a user
where it failed to reasonably implement its repeat infringer policy); Arista Records LLC v. Usenet.com,
Inc., 633 F. Supp. 2d 124, 153-54 (S.D.N.Y. 2009) (granting terminating sanctions and summary
judgment against a Usenet hosting service and its owner where the court found defendants knew or
should have known that their site was being used for infringement based on employee communications
and where the defendants had tools available which they used to block certain content and users but did
not employ those tools to block infringement); A&M Records, Inc. v. Napster, Inc., No. 99-cv-05183-MHP, 2000 WL 573136, at *18 (N.D. Cal. May 12, 2000) (holding that Napster, a peer-to-peer
network, was not eligible for the safe harbor created by section 512(a) for transitory digital network
communications because users exchanged infringing files directly—not through Napster’s servers); see
generally, e.g., Viacom Int’l, Inc. v. YouTube, Inc., 676 F.3d 19, 35 (2d Cir. 2012) (holding that
knowledge or awareness, which would deprive a site or service of DMCA protection, may be established
by evidence of willful blindness, which the court characterized as a deliberate effort to avoid guilty
knowledge); Fung, 710 F.3d at 1043 (explaining that “inducing actions”—or measures deemed to induce
copyright infringement—were relevant to the court’s determination that the defendant had red flag
awareness and therefore was not entitled to the DMCA safe harbors).

owners are afforded a quick, cheap, extra-judicial remedy in the event of user infringement. On this basis,
section 512 operates as Congress intended.11

2. Have courts properly construed the entities and activities covered by the section 512 safe
harbors?
SoundCloud believes that U.S. courts have generally interpreted section 512 consistent with the
terms of the statute and legislative intent, with the exception of the decision in Agence France Presse v.
Morel,12 which SoundCloud believes was wrongly decided.13
At the time of drafting section 512, Congress was aware of the technologies of the time, including
services that allowed users to host, store or post third party material (e.g. America Online, as it then was),
and search engines (e.g. Yahoo!). However, rather than apply a restrictive approach to its drafting based
on those technologies, Congress deliberately used very broad terms, including “service provider” and
“material stored at the direction of the user,” and even used terms that did not exist at the time such as
“information location tools” (rather than “search engine”), expressly to ensure that future technologies
and services would be covered, provided they met the requirements of the statute.
Since its enactment in 1998, a diverse range of service providers have been afforded the
protection of section 512, including online retailers (such as eBay14 and Amazon15), user uploaded video
11 The one exception to this statement is the operation of section 512(i)(2) dealing with “standard
technical measures” where there has been no adoption of standard industry protocols and technologies.
This is discussed further in our response to Question 24 below.
12 No. 10-cv-02730-AJN, 2013 WL 146035 (S.D.N.Y. Jan. 14, 2013).
13 In Morel, the court mistakenly applied a narrower definition of service provider than the broad
definition set forth in 512(k)(1)(B), which should be applied to all of the safe harbors other than the
transitory digital network communications safe harbor created by section 512(a), including the user
storage safe harbor created by section 512(c), which was the safe harbor at issue in that case. In Morel,
the court construed the definition of service provider in part by reference to sections 512(a)-(d) apparently
without realizing that section 512(k) makes clear that the definition of the term is narrower when applied
to section 512(a) than for the other safe harbors. Rather than properly applying section 512(k), the court
relied on dictionary definitions of “service” for the proposition that a service provider must “do
something useful,” concluding, somewhat inexplicably, that “licensing copyrighted material online more
closely resembles the mere sale of goods (albeit, in this case, intellectual property) than facilitating users’
activities online,” even though there is no statutory basis for excluding sites that license content or sell
products from the definition of “service provider” applicable to the user storage safe harbor. The court in
Morel seemed to draw a distinction between platforms where users may buy and sell products, such as
eBay and Amazon, and those that sell or license material directly, such as Getty Images, even if the
material offered for sale or license was stored at the direction of a user. The court did not adequately
explain why providing a platform for consumers to purchase products from third parties was “useful” but
directly selling third-party products or content to the public would not be so. More importantly, the
court’s novel focus on “usefulness,” and its own assumptions about whether sites that sell or license
goods or services are more or less useful, is divorced from the language of the statute and the focus on
material stored at the direction of a user, which has been broadly and inclusively defined.


services (such as YouTube16), photo sharing services (such as Photobucket17) and many others. All of
these services have been found to be “service providers” within the meaning of section 512. It is now well
established that section 512 applies broadly. As Judge Ronnie Abrams of the Southern District of New
York noted, “courts have consistently found that websites that provide services over and above the mere
storage of uploaded user content are service providers pursuant to . . . § 512(k)(1)(B)’s expansive
definition . . . .” He further explained that “a provider of online services that hosts and distributes user
material by permitting its users to upload, share and view videos . . .” qualifies as a service provider
“[e]ven though [the service provider’s] activities are not limited to such . . . .”18
Congress intended for section 512 to have broad application, and this intent has generally been
upheld by the courts.

3. How have section 512’s limitations on liability for online service providers impacted the
growth and development of online services?
As mentioned in our response to Question 1 above, the section 512 safe harbors have created an
open, global and democratic ecosystem for creators and consumers. In a world where media interests are
rapidly consolidating—where, for example, 80% of the global recorded music market is controlled by
three companies—these new services, enabled by section 512, ensure that talented, independent creators
have a voice. These services have already launched the careers of a number of successful artists, and
supported the careers of many thousands, if not millions, more, with SoundCloud plays, YouTube views,
Twitter followers and Facebook likes becoming a new form of social currency and validation.
The impact of the section 512 safe harbors on contemporary culture cannot be overstated—
without the section 512 safe harbors, many of the online services that we use every day simply would not
exist.

14 Hendrickson v. eBay, Inc., 165 F. Supp. 2d 1082, 1088 (C.D. Cal. 2001) (holding eBay to be a service
provider).
15 Corbis Corp. v. Amazon.com, Inc., 351 F. Supp. 2d 1090, 1100 (W.D. Wash. 2004) (“Amazon operates
websites, provides retail and third-party selling services to Internet users, and maintains computers to
govern access to its websites.”).
16 Viacom Int'l, Inc. v. YouTube, Inc., 676 F.3d 19, 28 (2d Cir. 2012) (holding YouTube to be a service
provider under the DMCA).
17 Wolk v. Kodak Imaging Network, Inc., 840 F. Supp. 2d 733, 744 (S.D.N.Y. 2012) (holding that
“[b]ecause Photobucket offers a site that hosts and allows online sharing of photos and videos at the
direction of users, Photobucket, like YouTube.com or Veoh.com, qualifies as a ‘service provider’ under §
512(k)(1)(B)” for purposes of the user storage safe harbor), aff’d mem., 569 F. App’x 51 (2d Cir. 2014).
18 Capitol Records, LLC v. Vimeo, LLC, 972 F. Supp. 2d 500, 511 (S.D.N.Y. 2013).


4. How have section 512’s limitations on liability for online service providers impacted the
protection and value of copyrighted works, including licensing markets for such works?
The Section 512 safe harbors have enabled many of the platforms and tools on which musicians,
filmmakers and other copyright owners now rely. SoundCloud, for example, is widely used by aspiring
and established recording artists to showcase and distribute their work, build an audience and connect
with their fans. SoundCloud is an essential tool for the music industry, and without the limitations of
liability afforded by section 512, SoundCloud would not be able to offer this service to artists; nor, we
suspect, would the social networks, blogs, or other parts of the web ecosystem to which these artists share
their work from SoundCloud, and on which they rely for global reach.
As mentioned in our response to Question 1 above, the opportunities that online services provide
for copyright owners—particularly individual copyright owners—are immense. Despite this, however,
much criticism has been leveled at online service providers by some copyright owners and their
representatives, who argue that these services that benefit artists so greatly are “unfairly claiming
protection”19 under the section 512 safe harbors, which were, allegedly, “not intended to apply to them,”20
leading to a so-called “value gap” (i.e., a shortfall between the minimum value that copyright owners
ascribe to their content, and the income that such content generates in a free market environment).
Record companies, recording artists and songwriters are among the most prolific users of the
SoundCloud platform, and have been for many years. They post content to SoundCloud on a daily basis.
They do so voluntarily and with no expectation of remuneration, knowing that SoundCloud’s global
reach, its vibrant community of highly-engaged users, and its rich data and statistics are essential tools in
the promotion of their work, and provide a level of control and transparency that they don’t receive
anywhere else. These copyright owners use SoundCloud and other platforms for strategic reasons—they
appreciate that no one is obliged to post content to these platforms, and as far as we are aware, these
copyright owners are not arguing that the content that they post is eroding the licensing market for that
content.
If one accepts that copyright owners should be entitled to post content that they own and that
services should be entitled to host this content at the direction of these copyright owners, the question
then becomes whether the presence of unauthorized content is eroding the licensing market. That is a
question of verification of rights ownership, i.e., how does one ensure that the person posting the content
is, in fact, the copyright owner?
As described in our response to Question 5 below, copyright owners are frequently the only ones
able to determine whether or not a particular use of their works is authorized. Nevertheless, in order to
protect the interests of creators—the core of its community—SoundCloud has invested millions of dollars
in technology, people and processes to minimize the risk of copyright infringement, including by the
implementation of content filtering technology (discussed further in our response to Question 15 below).
19 https://www.prsformusic.com/digitalfocus/approach/pages/digital-battle.aspx
20 http://www.ifpi.org/downloads/Digital-Music-Report-2015.pdf


As Congress recognized, the risk of abuse is ever-present, and should be balanced and shared.
SoundCloud would prefer to see greater cooperation from and between copyright owners and service
providers in identifying and preventing uses of copyrighted works that are unauthorized and damaging to
copyright owners’ interests, rather than copyright owners seeking to hold service providers solely
responsible for the actions of their users and/or demanding payment for material that these users make
available contrary to the service providers’ terms of use.

5. Do the section 512 safe harbors strike the correct balance between copyright owners and
online service providers?
In general terms, SoundCloud believes that the section 512 safe harbors do strike the right
balance between copyright owners and online service providers, although as discussed in our response to
Questions 16 to 18 below, SoundCloud believes that the counter-notification process may be burdensome
for copyright owners.
Section 512 places the responsibility on copyright owners to identify allegedly infringing
material, and to inform the service provider of the location of that material. While that may appear to
place a disproportionate burden on copyright owners with respect to online services operating at scale, the
fact is that, in many cases, copyright owners are the only ones able to determine whether or not a
particular use is authorized. This is particularly true in the music industry, where rights ownership is
complex and fragmented.
Section 512(c)(1)(A)(ii) requires a service provider to disable access to or remove files not simply
in response to takedown notices, but also where it has actual knowledge of infringement, and in
circumstances where infringing activity is otherwise apparent (commonly referred to as “red flag
awareness”). This is as it should be, but in practice, identifying “red flag” material causes difficulty for
service providers as it frequently involves subjective decision-making. Copyrighted sound recordings
may be uploaded by someone other than the copyright owner(s), but if the recording is uploaded by the
artist, his/her manager, producer or marketing agent, it is frequently impossible for a service provider to
determine—without direction from the copyright owner—whether or not that use is authorized.
In enacting section 512 in 1998, Congress sought to encourage the development of new
services, to encourage innovation and promote free speech and artistic expression, while also providing
copyright owners with a fast and inexpensive mechanism to protect against unauthorized distribution of
their works. This is exactly what section 512 has accomplished. The need for a fair balance between the
interests of copyright owners, service providers and—critically—users of these services, was identified
when section 512 was enacted, and the balance was struck at that time. The need for that balance still
exists today, just as it did at inception, and perhaps even more so now, as we have come to see the vital
role that online services play in today’s society, and particularly in the creative industries.


Notice-and-Takedown Process
6. How effective is section 512’s notice-and-takedown process for addressing online
infringement?
Prior to the enactment of section 512, a copyright owner wishing to have its content removed
from an online service would have had to hire a lawyer, send a cease and desist letter, and potentially file
suit for injunctive relief, all of which took a great deal of time to accomplish and was very costly.
Today, when dealing with responsible services that acknowledge and comply with their responsibilities
under section 512, a copyright owner can have allegedly infringing material removed simply by sending a
takedown notice, without the need to hire an attorney, and at limited expense.
Compared to the process prior to enactment of section 512, there can be no doubt that section
512’s notice-and-takedown requirements are highly effective at addressing online infringement.

7. How efficient or burdensome is section 512’s notice-and-takedown process for addressing
online infringement? Is it a workable solution over the long run?
Section 512 protects legitimate service providers, while leaving exposed those sites, services and
individuals that actively engage in infringement. As noted in response to Question 1, peer-to-peer sites
and services that actively promote infringement or fail to reasonably implement repeat infringer policies
fall outside the protection of section 512’s safe harbors.
Section 512’s notice-and-takedown system imposes a substantial burden on service providers in
devising, staffing, implementing, maintaining and improving the tools and processes necessary to act on
takedown notices and resulting counter-notifications, and creates a substantial burden on copyright
owners in providing notice of alleged infringement (due to the fact, as noted in our response to Question 5
above, that copyright owners, not service providers, are better able to determine whether or not particular
material allegedly infringes that copyright owner’s rights). Ultimately, however, this reflects the balance
struck by Congress to protect both copyright owners and service providers.
Where platforms are operating at scale, manual identification of individual instances of alleged
infringement is a burdensome process for copyright owners, but would be even more burdensome for
service providers given that service providers are unable to determine whether or not a particular use is
authorized by the copyright owner.
As mentioned above, and as discussed further in our response to Question 15 below, in order to
protect the interests of its creator community, SoundCloud has invested heavily in technology, people,
tools and processes, particularly content filtering technology, to protect against online infringement. In
the absence of content filtering technology or other similar measures, the notice-and-takedown process
may be more burdensome for copyright owners with respect to service providers operating at scale.


8. In what ways does the process work differently for individuals, small-scale entities,
and/or large-scale entities that are sending and/or receiving takedown notices?
In SoundCloud’s view, the process should be no different for individuals, small-scale entities or
large-scale entities. To differentiate the process based on the “size” of the copyright owner—however that
might be determined—risks creating a two (or more) tier system, whereby certain copyright owners’
rights would be deprioritized in favor of the rights of others. In SoundCloud’s view, this is not acceptable.

9. Please address the role of both ‘‘human’’ and automated notice-and-takedown processes
under section 512, including their respective feasibility, benefits, and limitations.
In SoundCloud’s experience, human notice-and-takedown processes generally—with some
exceptions—result in more accurate notices. They require appropriate consideration of the
issues at hand, and result in fewer “false positives” and counter-notifications when compared to
automated takedowns. They are, however, very difficult for copyright owners to manage with respect to
online services operating at scale.
Automated notice-and-takedown processes, on the other hand, are scalable, but are frequently
inaccurate and indiscriminate. They generally rely on computer scripts to auto-generate takedown notices
based on URLs that appear on webpages. There is no consideration of whether or not the content is
actually what the script believes it to be, and in the absence of any human review, it is difficult to see how
these notices can be considered valid, given the requirements of section 512(c)(3)(A)(v) for a statement,
made in good faith, that the material is not authorized by the copyright owner, its agent or the law, and
obligations on copyright owners following Lenz v. Universal Music Corp.21 to properly consider fair use.
Inaccurate takedown notices are problematic as they can result in material being removed
unnecessarily, which creates significant additional work for service providers and their users in
submitting and processing counter-notifications and reinstating content, as well as causing serious
damage to the service provider’s reputation with its users. More importantly, they have a chilling effect
on creativity and free expression if service providers feel compelled to act on all such notices.
SoundCloud believes that any takedown notice that is automatically generated must be subject to human
verification before it is submitted.
The best automated solution for tackling online copyright infringement at scale is the
implementation of content filtering technology. Content filtering solutions analyze the material itself, not
the metadata or URL associated with that material, and are therefore better able to determine the nature of
the material.
By blocking unauthorized material at the point of upload, rather than removing that content at a
later date, any possible damage resulting from unauthorized publication is reduced or eliminated.

21 801 F.3d 1126 (9th Cir. 2015).

However, as discussed in our response to Question 15 below, collaboration between service providers and
copyright owners is essential for the proper implementation of content filtering solutions.

10. Does the notice-and-takedown process sufficiently address the reappearance of
infringing material previously removed by a service provider in response to a notice? If not, what
should be done to address this concern?
There are two obvious ways in which allegedly infringing material previously removed in
response to a takedown notice can reappear on a service. The first is where material is reinstated
following submission of a valid counter-notification in response to which the copyright owner chooses
not to file suit against the alleged infringer. This is discussed further in SoundCloud’s response to
Questions 16 to 18 below.
The second is where allegedly infringing material is re-uploaded to the service, following
removal in response to a takedown notice. SoundCloud believes that technical solutions such as filtering
may reduce this risk in many instances.

11. Are there technologies or processes that would improve the efficiency and/or
effectiveness of the notice-and-takedown process?
There is technology that can prevent the unauthorized upload and publication of known
copyrighted material, in particular content filtering technology mentioned at various points throughout
this submission. As a responsible online service provider, SoundCloud has invested heavily in this
technology, and believes that other online services wishing to benefit from the section 512 safe harbors
should be encouraged to do likewise. This is discussed further in our response to Question 15 below.
However, given the complexity of rights ownership in the music and audio-visual industries,
content filtering technology should be seen as supplemental to a copyright owner’s obligation to review
and submit takedown notices, not as a replacement therefor.

12. Does the notice-and-takedown process sufficiently protect against fraudulent, abusive or
unfounded notices? If not, what should be done to address this concern?
SoundCloud is aware of the malicious use of takedown notices in other contexts,22 but fraudulent,
abusive or unfounded takedown notices have not, to date, been especially common on its own platform.
22 See, e.g., Automattic Inc. v. Steiner, 82 F. Supp. 3d 1011 (N.D. Cal. 2015) (entering a default judgment
under section 512(f)); Curtis v. Shinsachi Pharm. Inc., 45 F. Supp. 3d 1190 (C.D. Cal. 2014) (entering a
default judgment under section 512(f) where a seller alleged that between 2011 and 2013 defendants, who
were her competitors, submitted 30 false Notices of Claimed Infringement to eBay, resulting in the
removal of at least 140 listings and causing eBay to issue strikes against her selling account, as well as
allegedly false notices to Google, PayPal and Serversea); Tuteur v. Crosley-Corcoran, 961 F. Supp. 2d
333 (D. Mass. 2013) (holding that the defendant stated a claim under section 512(f)); Design Furnishings,
Inc. v. Zen Path, LLC, 97 U.S.P.Q.2d 1284, 2010 WL 5418893 (E.D. Cal. Dec. 23, 2010) (enjoining the
defendant, pursuant to section 512(f), from sending takedown requests to eBay directed at the plaintiff’s
wicker products—specifically “from notifying eBay that defendant has copyrights in the wicker patio
furniture offered for sale by plaintiff and that plaintiff’s sales violate those copyrights.”); Online Policy
Group v. Diebold, Inc., 337 F. Supp. 2d 1195 (N.D. Cal. 2004) (holding that the defendant was liable
under section 512(f) for sending notifications to service providers because the evidence “suggests
strongly that Diebold sought to use the DMCA’s safe harbor provisions—which were designed to protect
ISPs, not copyright holders—as a sword to suppress publication of embarrassing content rather than as a
shield to protect its intellectual property.”).

That said, SoundCloud is seeing a worrying trend as copyright owners increasingly use the notice-and-takedown process to try to have content removed for reasons other than copyright infringement—for
example, a major record company recently required the removal from SoundCloud of several thousand
user-uploaded recordings that did not infringe any copyright of the record company, but merely
referenced the name of an artist signed to that label (for example, in the context of user-uploaded cover
recordings paying tribute to the original artist).
Auto-generated takedown notices also present a problem. While we assume such notices are not
intended to be fraudulent, abusive or unfounded, the absence of any human verification means that these
notices can have this effect and result in removal of content without justification. In SoundCloud’s view,
all automatically generated takedown notices should be subject to human verification prior to submission.

13. Has section 512(d), which addresses ‘‘information location tools,’’ been a useful
mechanism to address infringement that occurs as a result of a service provider’s referring or
linking to infringing content? If not, what should be done to address this concern?
Section 512 provides a quick, simple and inexpensive means for copyright owners to disable links
to allegedly infringing material without having to file suit to do so.

14. Have courts properly interpreted the meaning of ‘‘representative list’’ under section
512(c)(3)(A)(ii)? If not, what should be done to address this concern?
As a general matter, from the perspective of a service provider, anything other than specific,
individual URLs makes the identification and removal of allegedly infringing content extremely difficult,
if not impossible. For example, when SoundCloud is provided with a URL for a particular track,
SoundCloud is able immediately to remove or disable access to that track. If SoundCloud is instead
provided with the name of a song, or even a representative list of allegedly infringing material,
SoundCloud would be required, not only to identify each instance of that song hosted on its platform,
which can frequently only be determined by listening to the track (of which there are over 120 million on
SoundCloud), but then to determine whether or not each individual version or use of that song is

infringing. Neither SoundCloud, nor any other service provider, is able to determine which uses are
authorized by the copyright owner and which are not.
A copyright owner, when submitting takedown notices, should be required to list URLs for all
material that it wants to have removed. Equally, service providers, when providing tools or mechanisms
for submission of takedown notices (such as online forms), should ensure that these tools allow for
multiple URLs to be included in the same notice.

15. Please describe, and assess the effectiveness or ineffectiveness of, voluntary measures
and best practices— including financial measures, content ‘‘filtering’’ and takedown procedures—
that have been undertaken by interested parties to supplement or improve the efficacy of section
512’s notice-and-takedown process.
As mentioned at various points throughout this submission, SoundCloud has invested heavily in
content filtering technology and in the people, tools and processes necessary to implement and operate
this technology at scale. Content filtering solutions represent a significant barrier to the most
opportunistic and egregious instances of copyright infringement—for example, users uploading
commercially released master sound recordings or motion pictures. Content filtering technology is
available from various third party providers. While filtering tools require a significant investment by
service providers, SoundCloud believes that this investment is necessary for any company that is serious
about minimizing instances of online infringement. SoundCloud voluntarily implemented content filtering
technology in 2010.
Effective implementation of content filtering solutions by service providers does, however,
require the cooperation of copyright owners. Content filtering technology depends on reference files and
data, which need to be submitted by copyright owners—unless copyright owners are prepared to deliver
reference files and ownership data to service providers for the purposes of content filtering, and are
prepared to actively manage that data (e.g. by resolving conflicts in ownership data), the technology
cannot meet its full potential. If, as SoundCloud recommends, service providers wishing to take
advantage of the section 512 safe harbors are encouraged to voluntarily adopt content filtering solutions,
copyright owners likewise should be encouraged to deliver reference files and ownership data to service
providers—and to do so on a timely basis, to keep this information accurate and up-to-date—and to
resolve ownership conflicts with users and other copyright owners.
Furthermore, it is important that content filtering technology be applied appropriately, to avoid
blocking content incorrectly. “False positives” can be damaging—at best, they are frustrating for users
and create a great deal of additional work for service providers in processing counter-notifications and
rebuilding trust with users; at worst, they have a negative impact on freedom of expression.
Despite the fact that content filtering technology has been commercially available for more than a
decade, many copyright owners still choose not to take advantage of these solutions. One owner of
several million valuable commercial sound recordings actively refused to deliver reference files to
SoundCloud until 3 months ago, despite having been aware of SoundCloud’s implementation of content
filtering technology for more than five years, and despite many requests from SoundCloud to do so in
order to ensure maximum protection for that copyright owner’s rights.

Counter Notifications
16. How effective is the counter-notification process for addressing false and mistaken
assertions of infringement?
It is critical, in SoundCloud’s view, given the complexity of rights and rights ownership in the
music and audio-visual industries, that users of online services have an opportunity to dispute the removal
of any of their uploads by service providers in response to a takedown notice received from a copyright
owner. Without a counter-notification process, the notice-and-takedown process is open to abuse.
Under the current regime, to benefit from section 512(g)’s exemption from liability for removing
user content in response to a takedown notice, service providers are required to reinstate content on
receipt of a counter-notification unless the copyright owner notifies the service provider that it has filed
suit against the alleged infringer. In SoundCloud’s view, this places too great a burden on the copyright
owner in many cases and undermines the value of the notice-and-takedown process, which has
implications for service providers and users.
The requirement to file suit may make sense in the case of occasional disputes between similarly
situated parties, but in most cases, the disputes are between copyright owners and individual end users,
and in SoundCloud’s experience, in relatively few of these cases are there genuine issues at stake. It is not
practical for copyright owners to take legal action against every alleged infringer in every case, yet unless
the copyright owner does so, there is a risk that the allegedly infringing material will need to be reinstated
by the service provider in order for the service provider to avoid liability to the user.23

17. How efficient or burdensome is the counter-notification process for users and service
providers? Is it a workable solution over the long run?
As stated above, it is SoundCloud’s view that a counter-notification process is essential in order
to prevent abuse of the takedown process, leading to censorship, and limits on creativity, free expression
and cultural diversity. In practice, however, the current process may be too burdensome for copyright
owners.

18. In what ways does the process work differently for individuals, small scale entities,
and/or large-scale entities that are sending and/or receiving counter notifications?

23 Assuming there is no separate contractual limitation of liability between the user and the service
provider.

The grounds for filing a counter-notification should be the same for all persons, whether they are
individuals, small-scale entities or large-scale entities. SoundCloud believes that this is the correct
approach—no entity should have a greater or lesser right to defend an allegation of infringement made
against it.

Legal Standards
19. Assess courts’ interpretations of the ‘‘actual’’ and ‘‘red flag’’ knowledge standards
under the section 512 safe harbors, including the role of ‘‘willful blindness’’ and section 512(m)(1)
(limiting the duty of a service provider to monitor for infringing activity) in such analyses. How are
judicial interpretations impacting the effectiveness of section 512?
As mentioned in our response to Question 1, there is a body of case law denying the protection of
section 512 to services that engage in activities that constitute willful blindness, or that otherwise choose
not to act when they acquire actual knowledge or red flag awareness. This is as it should be.
With respect to red flag awareness, SoundCloud would reiterate the point made in its response to
Question 5 above, that authorization from the copyright owner, and therefore red flag awareness, can be
difficult if not impossible to determine in many cases in the absence of notification or direction from the
copyright owner. The implementation of content filtering technology can help to reduce the instances
where judgment calls are required to be made, by enabling copyright owners to determine those sound
recordings that they wish to prevent users from uploading and any exceptions that should be made.

20. Assess courts’ interpretations of the ‘‘financial benefit’’ and ‘‘right and ability to
control’’ standards under the section 512 safe harbors. How are judicial interpretations impacting
the effectiveness of section 512?
In SoundCloud’s view, judicial interpretations of the “financial benefit” and “right and ability to
control” tests are generally consistent with Congressional intent to provide broad safe harbor protection
for service providers that do not actively encourage infringement.
In the context of section 512(c)(1)(B), courts have determined that “right and ability to control”
requires a higher standard than that necessary to establish common law vicarious liability—i.e., more than
merely the ability to block access or remove content. If this were not the case, there would exist a conflict
between section 512(c) (which requires service providers to remove or disable access to material in
response to notice, knowledge or awareness of infringing activity), and section 512(c)(1)(B) (which could
disqualify that service provider from protection on grounds that the service provider has a “right and
ability to control”).24
24 See Viacom Int’l, Inc. v. YouTube, Inc., 676 F.3d 19, 37-38 (2d Cir. 2012); CoStar Group, Inc. v.
LoopNet, Inc., 373 F.3d 544, 555 (4th Cir. 2004); UMG Recordings, Inc. v. Shelter Capital Partners LLC,
718 F.3d 1006, 1028-29 (9th Cir. 2013) (“Given Congress’ explicit intention to protect qualifying service
providers who would otherwise be subject to vicarious liability, it would be puzzling for Congress to
make § 512(c) entirely coextensive with the vicarious liability requirements, which would effectively
exclude all vicarious liability claims from the § 512(c) safe harbor. . . In addition, it is difficult to
envision, from a policy perspective, why Congress would have chosen to exclude vicarious infringement
from the safe harbors, but retain protection for contributory infringement. It is not apparent why the
former might be seen as somehow worse than the latter.”) (citing Edward Lee, Decoding the DMCA Safe
Harbors, 32 Colum. J.L. & Arts 233, 236-67 (2009) and Mark A. Lemley, Rationalizing Internet Safe
Harbors, 6 J. Telecomm. & High Tech. L. 101, 104 (2007)).

Courts have construed the “right and ability to control” provision as requiring “something more”
than merely the ability to remove or block access to materials posted on a service provider’s website.25
That “something more” requires the service provider to exert “substantial influence” on the activities of
users, which may include high levels of control over user activities or purposeful conduct.26
With respect to “financial benefit,” the better reasoned cases have held that a “financial benefit”
within the meaning of section 512(c)(1)(B) means that a site or service is a draw for infringement, not
simply that the site or service earns money, including from infringing files.27
All of this is consistent with the intent to provide broad protection under section 512 and foster
the development of online businesses.

21. Describe any other judicial interpretations of section 512 that impact its effectiveness,
and why.
SoundCloud has no comment to this question.

25 See, e.g., Viacom Int’l, Inc. v. YouTube, Inc., 676 F.3d 19, 38 (2d Cir. 2012), quoting Capitol Records,
Inc. v. MP3Tunes, LLC, 821 F. Supp. 2d 627, 645 (S.D.N.Y. 2011); UMG Recordings, Inc. v. Shelter
Capital Partners LLC, 718 F.3d 1006, 1029-30 (9th Cir. 2013).
26 See Viacom Int’l, 676 F.3d at 38; UMG Recordings, 718 F.3d at 1030; see also Io Group, Inc. v. Veoh
Networks, Inc., 586 F. Supp. 2d 1132, 1151 (N.D. Cal. 2008) (holding that “the plain language of section
512(c) indicates that the pertinent inquiry is not whether Veoh had the right and ability to control its
system, but rather, whether it has the right and ability to control the infringing activity. Under the facts
and circumstances presented here, the two are not one and the same.”).
27 See, e.g., Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1117–18 (9th Cir. 2007) (holding that the
financial interest prong requires a showing that “the infringing activity constitutes a draw for subscribers,
not just an added benefit.”); Wolk v. Kodak Imaging Network, Inc., 840 F. Supp. 2d 733, 748 (S.D.N.Y.
2012) (holding that the defendants did not have a financial interest directly attributable to infringing
activity within the meaning of section 512(c)(1)(B) because there was no evidence that either Photobucket
or the Kodak defendants “capitalizes specifically because a given image a user selects to print is
infringing . . ., [t]he Defendants’ profits are derived from the service they provide, not a particular
infringement . . . [and] Photobucket has no knowledge of which images users may select to send to the
Kodak Defendants to be printed, and, as such, Photobucket has no ability to control whether users request
that infringing material be printed.”), aff’d mem., 569 F. App’x 51 (2d Cir. 2014).


Repeat Infringers
22. Describe and address the effectiveness of repeat infringer policies as referenced in
section 512(i)(A).
In SoundCloud’s view, repeat infringer policies are generally effective, and are a critical part of
the process of preventing infringement and punishing willful infringers.
Users that are penalized most heavily by repeat infringer policies are those that have built a
significant online presence around their account and for whom losing that account could mean losing
many years of hard work. Obviously, if these users are engaging in outright piracy, there should be little
sympathy and their accounts should indeed be terminated. However, in SoundCloud’s experience, these
situations are often more nuanced. The complexity of rights ownership within the music industry can
mean that takedown notices are issued where there is clearly no intent to infringe by a user that is far from
a willful infringer—often, this includes artists themselves.28
In many cases, the copyright owner that issues the takedown notice does not necessarily wish for
its actions to result in account termination. However, as section 512 is conditioned upon service
providers having an effective repeat infringer policy in place, service providers may feel pressured to
apply their policies rigidly. A service provider has discretion according to the wording of section
512(i)(1)(A) not to terminate an account “in appropriate circumstances,” but the practical reality is that
there is little incentive for service providers to exercise that discretion, and consequently service providers
often err on the side of termination.
In SoundCloud’s view, repeat infringer policies are extremely important in the fight against
online piracy, but for the section 512 safe harbors to be entirely conditioned on these policies in all cases
is disproportionate. There should undoubtedly be clear obligations on service providers to terminate the
accounts of repeat infringers, but in SoundCloud’s view, there should be limited exceptions where
accounts are not terminated if all relevant parties agree that this is the fairer outcome—for example, if
copyright owners were encouraged to state, when filing the takedown notice, whether or not the takedown
should result in a “strike.”
In any event, in SoundCloud’s view, any pre-conditions to the application of section 512 should
focus on steps to prevent infringement occurring in the first place (e.g. through the implementation of
content filtering technology), not on steps to punish users after the damage has been done. SoundCloud
believes that this will better focus the attention of all parties on prevention rather than cure, and lead to
more effective measures.

28 http://thisisadynasty.tumblr.com/post/87945465547/brbdeleting-soundcloud


23. Is there sufficient clarity in the law as to what constitutes a repeat infringer policy for
purposes of section 512’s safe harbors? If not, what should be done to address this concern?
Section 512(i)(1)(A) allows flexibility for different services to formulate and reasonably implement
their repeat infringer policies. Among other things, courts have approved of a “3-strikes” rule.29
Flexibility in implementing repeat infringer policies is important to protect the rights of users and in light
of the fact that not all online infringement is willful or intentional.

Standard Technical Measures
24. Does section 512(i) concerning service providers’ accommodation of ‘‘standard technical
measures’’ (including the definition of such measures set forth in section 512(i)(2)) encourage or
discourage the use of technologies to address online infringement?
The legislative history of section 512 reveals an intent by Congress that copyright owners and
service providers would work together to come up with “standard technical measures,” but it is not clear
to SoundCloud that this ever happened. In discussing analogous “industry standard communications
protocols and technologies” in the context of the systems caching safe harbor, the House Report
accompanying the final bill states that Congress expected “that the Internet industry standards setting
organizations, such as the Internet Engineering Task Force and the World Wide Web Consortium, will act
promptly and without delay to establish these protocols.”30 Copyright owners and service providers
cooperated on the Secure Digital Music Initiative (SDMI),31 but were ultimately unable to achieve
consensus. While there have been some voluntary agreements—such as the User Generated Content
Principles, available at http://www.ugcprinciples.com—there still arguably are no standard technical
measures as defined and contemplated by Congress.
As noted by one court, “[t]here is no indication that the ‘strong urging’ of both the House and
Senate committees reporting on this bill has led to ‘all of the affected parties expeditiously [commencing]
voluntary, interindustry discussions to agree upon and implement the best technological solutions
available to achieve these goals.’”32

29 See, e.g., Capitol Records, LLC v. Vimeo, LLC, 972 F. Supp. 2d 500, 511-17 (S.D.N.Y. 2013); Viacom
Int’l Inc. v. YouTube, Inc., 718 F. Supp. 2d 514 (S.D.N.Y. 2010), aff’d in relevant part on other grounds,
676 F.3d 19, 40-41 (2d Cir. 2012); UMG Recordings, Inc. v. Veoh Networks Inc., 665 F. Supp. 2d 1099,
1118 (C.D. Cal. 2009), aff’d on other grounds sub nom. UMG Recordings, Inc. v. Shelter Capital
Partners LLC, 718 F.3d 1006 (9th Cir. 2013); Perfect 10, Inc. v. CCBill, LLC, 340 F. Supp. 2d 1077,
1094 n.12 (C.D. Cal. 2004), aff’d in part on other grounds, 488 F.3d 1102 (9th Cir.), cert. denied, 552
U.S. 1062 (2007).
30 H.R. Conf. Rep. No. 105-796 (1998).
31 https://en.wikipedia.org/wiki/Secure_Digital_Music_Initiative.
32 Perfect 10, Inc. v. Cybernet Ventures, Inc., 213 F. Supp. 2d 1146, 1174 (C.D. Cal. 2002), quoting H.R.
Rep. 105-551(II), at 61; S. Rep. at 52 (1998).


Despite the clear intention of Congress that they do so, copyright owners have so far
not, as far as SoundCloud is aware, taken advantage of this provision or otherwise sought to participate in
the development of technical solutions aimed at protecting their rights.

25. Are there any existing or emerging ‘‘standard technical measures’’ that could or should
apply to obtain the benefits of section 512’s safe harbors?
It is not clear that there are any “standard technical measures” within the meaning of section
512(i). When considered more generally, basic content filtering technology could, with appropriate cross-industry support, become a standard technical measure. However, this would require discussion and clear
consensus on numerous factors, which, in SoundCloud’s experience, has been difficult to obtain even
between copyright owners, let alone between copyright owners and service providers. Any obligation on
service providers to adopt content filtering must be subject to certain reciprocal obligations on the part of
copyright owners to provide data to make the tools more effective, as described in our response to
Question 15 above.

Remedies
26. Is section 512(g)(2)(C), which requires a copyright owner to bring a federal lawsuit
within ten business days to keep allegedly infringing content offline—and a counter-notifying party
to defend any such lawsuit—a reasonable and effective provision? If not, how might it be
improved?
As discussed in our response to Question 16 above, SoundCloud believes that this requirement
may be disproportionately burdensome to copyright owners.

27. Is the limited injunctive relief available under section 512(j) a sufficient and effective
remedy to address the posting of infringing material?
A well-functioning notice-and-takedown process is a more effective remedy against online
infringement than injunctive relief. Notice-and-takedown is quick, cheap and effective, certainly when
compared to the time and cost involved in applying for injunctive relief.
To obtain an injunction, a copyright owner would need to establish that: (1) it has suffered an
irreparable injury, (2) the remedies available at law such as monetary damages are inadequate to
compensate for that injury, (3) a remedy in equity is warranted considering the balance of hardships
between the plaintiff and defendant, and (4) the public interest would not be disserved by a permanent
injunction.33 Injunctive relief is an “extraordinary and drastic remedy” that “is never awarded as of
right.”34
Responsible service providers such as SoundCloud expeditiously disable access to or remove
content in response to valid takedown notices. Injunctive relief, therefore, will rarely be warranted in
cases involving legitimate service providers.

28. Are the remedies for misrepresentation set forth in section 512(f) sufficient to deter and
address fraudulent or abusive notices and counter notifications?
The remedies set forth in section 512(f) deter misconduct in some instances, but in many cases do
not. For example, SoundCloud believes that some users of online services may file counter-notifications
to have content reinstated knowing that their grounds are invalid, but knowing equally that copyright
owners are unlikely to file a suit to prevent the content from reappearing. SoundCloud does not believe
that these users are deterred by the indemnification provisions of section 512(f).
That said, SoundCloud does believe that 512(f) remedies are an important part of the process, and
are particularly important in ensuring fairness. In SoundCloud’s view, these remedies, and their deterrent
effect, might be more effective if they were given more prominence in the notice-and-takedown and
counter-notification process. SoundCloud recommends that an express indemnification statement be a
required part of any takedown notice or counter-notification.

Other Issues
29. Please provide any statistical or economic reports or studies that demonstrate the
effectiveness, ineffectiveness, and/or impact of section 512’s safe harbors.
In SoundCloud’s view, it is difficult to quantify the effectiveness, ineffectiveness or impact of the
section 512 safe harbors other than to point out again (as mentioned in response to Question 1) that
significant parts of the digital economy in the U.S. and worldwide would not exist without them, and that
many millions of individual creators and copyright owners would be significantly disadvantaged as a
result.
It is easy to “blame” the existence of section 512’s safe harbors for a decline in record sales, for
example, but that decline, while undeniable, obscures other trends.

33

Winter v. Natural Resources Defense Council, Inc., 555 U.S. 7, 20, 22 (2008); eBay Inc. v.
MercExchange, LLC, 547 U.S. 388 (2006).
34
Munaf v. Geren, 553 U.S. 674, 689-90 (2008).


According to a Nielsen study in 2014,35 93% of the U.S. population listens to music, spending
more than 25 hours listening each week, with 75% listening online. More people are choosing to listen to
music (75%) than watch TV (73%). On-demand audio streaming grew 83% to 145 billion streams in
2015,36 having grown 54% in the year before.37 In other words, consumers are as engaged as ever with
music, which is unsurprising as there is arguably more music available to be discovered and heard than
ever before. On SoundCloud alone, 11 million creators are heard every month, compared to the 2,000 to
3,000 artists heard on terrestrial radio.
According to Nielsen, radio continues to be the primary source of music discovery for U.S.
consumers. Terrestrial radio, it should be noted, is a medium that is free to the user and
advertising-supported, and from which record companies in the U.S. derive no revenue.
While 61% of respondents to the Nielsen study38 discover music via radio, nearly two-thirds of
teens—the future generation of media consumers, and a demographic that is increasingly turning away
from traditional media—discover their music through word-of-mouth recommendations. “Word of
mouth” is, of course, not to be taken literally: teens aren’t recommending music to each other on a
one-to-one basis; they are sharing links via social media to hundreds of their connections at a time. Platforms
such as SoundCloud, and social media in general, enable this to happen.
Furthermore, while consumers are spending less on CDs and cassettes, sales of vinyl continue to
enjoy a strong resurgence, increasing to more than 12 million units in 2015—the tenth consecutive year of
growth. Notably as well, of the money that consumers choose to spend on music—$109 per year on
average—more than half is spent on festivals and concerts. Consumers are prepared to spend more on live
experiences than on owning music.
Consumers are still spending money on music, but they are spending in different ways, and where
consumer behavior changes, the market must adapt. Consumers are prepared to spend heavily on
concerts, but they are unlikely to buy a ticket to see an artist whose music they have never heard, which
makes it more important than ever for artists to promote themselves and to distribute their music as
widely as possible. Platforms such as SoundCloud enable this to happen. Artists become entrepreneurs:
using online services, they are able to decide what to release, when, and how; they are able to collaborate
with other artists, and to connect directly with their fans via social media in a way that hasn’t previously
been possible; and they get to see, measure and understand—via the rich data that platforms are able to
provide—where and how their fans are engaging with their music.
Artists have more control than ever over their careers. That level of empowerment and the
cultural diversity it creates are only possible because of section 512’s safe harbors. Next year’s winner of
35

http://www.nielsen.com/us/en/insights/news/2015/everyone-listens-to-music-but-how-we-listen-is-changing.html
36
http://www.nielsen.com/us/en/insights/reports/2016/2015-music-us-year-end-report.html
37
http://www.nielsen.com/us/en/press-room/2015/2014-nielsen-music-report.html
38
http://www.nielsen.com/us/en/insights/news/2015/music-is-still-the-soundtrack-to-our-lives.html


the Grammy Award for Best New Artist likely already has an account on SoundCloud. He or she is
almost certainly making daily use of YouTube, Instagram, Snapchat, Facebook, Tumblr, Twitter or any
one of a number of other services that have been enabled by section 512. To limit that artist’s ability to use these
services to achieve the success that his or her talent deserves would be unfortunate. To do so in the interests of
entertainment industry executives would be a tragedy.

30. Please identify and describe any pertinent issues not referenced above that the
Copyright Office should consider in conducting its study.

In SoundCloud’s view, copyright owners should be required to submit a takedown notice as a
precondition to filing suit against a service provider for copyright infringement. Increasingly, copyright
trolls are bringing suits without any attempt to avail themselves of the tools, processes and remedies that
service providers, not to mention Congress, have provided for them. The cost of litigation in these cases,
especially for smaller service providers, can be prohibitive.
Respectfully submitted,

By: _____________________________
SoundCloud Operations, Inc.
on behalf of itself and its affiliates

Filed: April 1, 2016