Anyone who has spent time following debates about speech and privacy regulation comes to
recognize the striking parallels between these two policy arenas. In this paper we will highlight
the common rhetoric, proposals, and tactics that unite these regulatory movements.
Moreover, we will argue that, at root, what often animates calls for regulation of both speech
and privacy are two remarkably elitist beliefs:
1. People are too ignorant (or simply too busy) to be trusted to make wise decisions
for themselves (or their children); and/or,
2. All or most people share essentially the same values or concerns and, therefore,
“community standards” should trump household (or individual) standards.
While our use of the term “elitism” may unduly offend some understandably sensitive to
populist demagoguery, our aim here is not to launch a broadside against elitism as Time
magazine culture critic William A. Henry once defined it: “The willingness to assert unyieldingly
that one idea, contribution or attainment is better than another.”1 Rather, our aim here is to
critique that elitism which rises to the level of political condescension and legal sanction. We
attack not so much the beliefs of some leaders, activists, or intellectuals that they have a better
idea of what is in the public’s best interest than the public itself does, but rather the imposition
of those beliefs through coercive, top-down mandates.
That sort of elitism—elitism enforced by law—is often the objective of speech and privacy
regulatory advocates. Our goal is to identify the common themes that unite these regulatory
movements, explain why such political elitism is unwarranted, and make it clear how it
threatens individual liberty as well as the future of a free and open Internet. As an alternative to
this elitist vision, we advocate an empowerment agenda: fostering an environment in which
users have the tools and information they need to make decisions for themselves and their
families.
Adam Thierer (atherer@pff.org) is a Senior Fellow with The Progress & Freedom Foundation and the director of
its Center for Digital Media Freedom. Berin Szoka (bszoka@pff.org) is a Senior Fellow with PFF and the Director of
PFF’s Center for Internet Freedom. The views expressed here are their own, and are not necessarily the views of
the PFF board, other fellows or staff.
1. William A. Henry, In Defense of Elitism (1995) at 2-3.
2. See Adam Thierer, The Progress & Freedom Foundation, Congress, Content Regulation, and Child
Protection: The Expanding Legislative Agenda, Progress Snapshot 4.4, Feb. 2008, www.pff.org/issues-
pubs/ps/2008/ps4.4childprotection.html. Like American courts, we use the term “speech” as a broad catch-all for
communications, including both actual speaking and other forms of transmitting, as well as receiving,
information (“content”).
3. See generally Adam Thierer, Don’t Scapegoat Media, USA Today, Dec. 4, 2008, www.pff.org/issues-
pubs/ps/2008/ps4.24scapegoatmedia.html; Marjorie Heins, Not in Front of the Children: “Indecency,” Censorship,
and the Innocence of Youth (2001); Karen Sternheimer, It’s Not the Media: The Truth about Pop Culture’s Influence
on Children (2003); Karen Sternheimer, Kids These Days: Facts and Fictions about Today’s Youth (2006).
4. See Adam Thierer, The Progress & Freedom Foundation, FCC Violence Report Concludes that Parenting
Doesn’t Work, PFF Blog, Apr. 26, 2007, http://blog.pff.org/archives/2007/04/fcc_violence_re.html.
5. See Adam Thierer, The Progress & Freedom Foundation, Sen. Rockefeller Gives Up on Parenting at Senate
Violence Hearing, PFF Blog, June 26, 2007, http://blog.pff.org/archives/2007/06/sen_rockefeller_1.html.
6. Adam Thierer, Conservatives, Porn, and “Community Standards,” The Technology Liberation Front, March
2, 2009, http://techliberation.com/2009/03/02/conservatives-porn-and-community-standards.
Progress on Point 16.19 Page 3
those actions.7 Alternatively, regulatory advocates claim that advertising and marketing efforts
are inherently “manipulative” and that people do not realize they are being duped into
surrendering personal information or into buying products or services they supposedly don’t
need.8 Of course, those regulatory advocates rarely pause to explain to us how it is that they
were not also duped and manipulated by the same things—again revealing their deeply rooted
elitism! (As discussed below, this makes it clear how the psychological phenomenon known as
the “third-person effect” is driving much of this debate.)
“Protecting The Children” is also used as a rhetorical cover for regulation here, but not as often
as in debates over speech controls.9 Instead, regulatory advocates mostly focus on adults who are
presumed not to know what is in their own best interest—necessitating paternalistic
government intervention on their behalf.
7. Berin Szoka & Adam Thierer, The Progress & Freedom Foundation, Online Advertising & User Privacy:
Principles to Guide the Debate, Progress Snapshot 4.19, Sept. 2008, www.pff.org/issues-
pubs/ps/2008/ps4.19onlinetargeting.html.
8. Jeff Chester, for decades the great gadfly of American advertising, has decried “the system … developed
to track each and every one of us and our behavior for one-on-one marketing efforts” as “manipulative, intrusive
and un-democratic.” Wendy Melillo, Q&A: Chester Writes the Book on Privacy, Dec. 11, 2007,
www.gfem.org/node/227. For instance, Chester and other leading “privacy advocates” ridicule the idea of smart
phones as a “liberating technology” and insist that,
Despite the glowing words about customization and personalized service, what marketers and
advertisers are increasingly offering consumers is merely the illusion of free choice. Mobile
operators offer their various options and services, not on an individual basis, but preconfigured
according to segmented demographic profiles.
Center for Digital Democracy and U.S. Public Interest Research Group, Complaint and Request for Inquiry and
Injunctive Relief Concerning Unfair and Deceptive Mobile Marketing Practices, Jan. 13, 2009 (emphasis original),
www.democraticmedia.org/files/FTCmobile_complaint0109.pdf. See generally Berin Szoka & Adam Thierer, The
Progress & Freedom Foundation, Targeted Online Advertising: What’s the Harm & Where Are We Heading?,
Progress on Point 16.2, Feb. 2009, www.pff.org/issues-pubs/pops/2009/pop16.2targetonlinead.pdf.
9. Berin Szoka & Adam Thierer, The Progress & Freedom Foundation, COPPA 2.0: The New Battle over
Privacy, Age Verification, Online Safety & Free Speech, Progress on Point 16.11, May 2009, www.pff.org/issues-
pubs/pops/2009/pop16.11-COPPA-and-age-verification.pdf.
10. The Supreme Court has used a “right to privacy” to strike down laws against the use of contraception by
married couples, Griswold v. Connecticut, 381 U.S. 479 (1965), and abortion, Roe v. Wade, 410 U.S. 113 (1973).
Few on either side stop to consider the relationship between speech and privacy. In fact, they
are but two sides of the same coin. After all, what is your “right to privacy” but a right to stop
me from observing you and speaking about you?11 “Protecting privacy,” therefore, typically
means restricting speech rights in the process. Advocates of privacy regulation often insist that
the use, processing and collection of information are “conduct” unprotected by the First
Amendment, but in fact, the First Amendment broadly protects the gathering and distribution
of information as part of the process of communication (“speech”).12 Similarly, attempts to
“clean up” speech or “protect The Children” often require regulations that would betray the
privacy of adults by expanding the role of government and impose serious burdens on
businesses and markets—such as age verification mandates13 or extensive data retention
requirements.14
11. Eugene Volokh, Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop
People From Speaking About You, 52 Stanford L. Rev. 1049 (2000), available at www.pff.org/issues-
pubs/pops/pop7.15freedomofspeech.pdf.
12. See Amicus Brief for Association Of National Advertisers, Cato Institute, Coalition For Healthcare
Communication, Pacific Legal Foundation And The Progress & Freedom Foundation In Support Of Appellants, IMS
Health v. Sorrell, No. 09-1913-cv(L), 09-2056-cv(CON) (2nd Cir. 2009), available at www.pff.org/issues-
pubs/filings/2009/071309-Brief-Amici-Curiae-ANA-et-al-Second-Circuit-(09-1913-cv).pdf.
13. See Adam Thierer, The Progress & Freedom Foundation, Social Networking and Age Verification: Many
Hard Questions; No Easy Solutions, Progress on Point No. 14.5, March 2007, www.pff.org/issues-pubs/
pops/pop14.8ageverificationtranscript.pdf; Adam Thierer, The Progress & Freedom Foundation, Statement
Regarding the Internet Safety Technical Task Force’s Final Report to the Attorneys General, Jan. 14, 2008,
www.pff.org/issues-pubs/other/090114ISTTFthiererclosingstatement.pdf; Nancy Willard, Why Age and Identity
Verification Will Not Work—And is a Really Bad Idea, Jan. 26, 2009, www.csriu.org/PDFs/digitalidnot.pdf; Jeff
Schmidt, Online Child Safety: A Security Professional’s Take, The Guardian, Spring 2007,
www.jschmidt.org/AgeVerification/Gardian_JSchmidt.pdf.
14. Adam Thierer, The Progress & Freedom Foundation, Mandatory Data Retention: How Much is
Appropriate, PFF Blog, June 26, 2006, http://blog.pff.org/archives/2006/06/mandatory_data.html.
15. Adam Thierer, The Progress & Freedom Foundation, The Perils of Mandatory Parental Controls and
Restrictive Defaults, Progress on Point 14.4, Apr. 11, 2008, www.pff.org/issues-
pubs/pops/2008/pop15.4defaultdanger.pdf.
embedded or included (as proposed in Australia and with China’s “Green Dam” filter),16 and
possibly, (2) that such controls be defaulted to their most restrictive position—forcing users to
opt out of the controls later if they want to consume media rated above a certain threshold.
More sophisticated advocates of speech controls and privacy regulation will likely argue that
their paternalism is less elitist or intrusive because they merely want to “nudge” the public into
making “better” decisions. Economist Richard Thaler and legal scholar Cass Sunstein (administrator
of President Obama’s Office of Information and Regulatory Affairs, responsible for reviewing
most new federal regulations) popularized this approach with their 2008 book Nudge:
Improving Decisions about Health, Wealth, and Happiness. Based on behavioral economics
studies, they argue that both government and private actors must inevitably make decisions
about “choice architecture” and that, by setting defaults, incentives and rules smartly, “choice
architects” can and should improve decision-making without blocking, fencing off, or
significantly burdening choices.17
In this regard, Sunstein and Thaler’s approach parallels the work of Lawrence Lessig, one of the
most influential Internet policy thinkers. Lessig has argued that the “architecture” of “code”
(how software is written) “regulates” all online activities and requires government oversight
and intervention to keep in check. Otherwise, he warned ominously a decade ago, “Left to
itself, cyberspace will become a perfect tool of control.”18 Lessig’s hyper-pessimistic
predictions have proven unwarranted, however. Far from fostering a world of “perfect control,”
code and cyberspace have proven remarkably difficult to regulate, yet have
generally benefited consumers and citizens without centralized direction.19 Still, Lessig,
Sunstein, and others of this ilk persist in their advocacy of “nudges” of many varieties to impose
their will on cyberspace through mandates from above.
But while it might be possible to define “better decisions” and argue that poor choice
architecture leads people to choose things they clearly don’t want in contexts like investment
decisions and mortgages, how can elites know what other people really want in highly
subjective contexts like privacy and speech? Should they rely on opinion polls—the highly
subjective results of which depend heavily on the “choice architecture” of question-crafting—to
guess what the right default should be?20 Was the Chinese proposal to mandate deployment of
16. Adam Thierer, China’s Green Dam Filter and the Threat of Rising Global Censorship, PFF Blog, June 17,
2009, http://blog.pff.org/archives/2009/06/chinas_green_dam_filter_and_threat_of_rising_globa.html.
17. They define choice architecture as follows: “A structure designed by a choice architect(s) to improve the
quality of decisions made by homo sapiens. Often invisible, choice architecture is the specific user-friendly shape
of an organization's policy or physical building when homo sapiens come into contact with it. Examples of choice
architecture include a voter ballot, a procedure for handling well-meaning people who forget a deadline, or a
skyscraper.” Nudge Glossary of Terms, www.nudges.org/glossary.cfm.
18. Lawrence Lessig, Code and Other Laws of Cyberspace (1999) at 6.
19. See Adam Thierer, Code, Pessimism, and the Illusion of “Perfect Control,” Cato Unbound, May 2009,
www.cato-unbound.org/2009/05/08/adam-thierer/code-pessimism-and-the-illusion-of-perfect-control.
20. See Solveig Singleton & Jim Harper, With A Grain of Salt: What Consumer Privacy Surveys Don't Tell Us,
2001, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=299930.
“Green Dam” just a harmless “nudge” because users weren’t barred from uninstalling the
filtering software that must accompany their computers (i.e., “opting out”)? The problem
becomes even more difficult where trade-offs among competing values are inevitable. For
example, data collection about Internet users raises privacy concerns for some but benefits all,
creating more funding for “free” content (i.e., speech) and services users prefer by making
more valuable the advertising that supports online publishers. In short, regulations of speech
and privacy are likely to be pure paternalism, even when billed as “libertarian paternalism,” as
Thaler and Sunstein label their approach.21
What might be called “regulatory blackmail” is also a time-honored tradition among both
advocates of speech controls and privacy regulation. When censorship advocates have
previously been impeded by the First Amendment, they have worked behind the scenes with
lawmakers or regulatory agencies to use indirect pressure and strong-arming tactics to extract
“voluntary concessions” from companies or others.22 For example, in 2004, the FCC strong-
armed radio giant Clear Channel into agreeing to a “voluntary” consent decree that involved
taking Howard Stern off the air.23 Similarly, in 2008, XM and Sirius Satellite Radio finally agreed
to set aside 4% of their system capacity for use by politically favored racial minorities (a kind of
speech control) as a “voluntary condition” of their merger—after the FCC had sat on their
application for nearly 16 months.24 This race-based preference would have been
unconstitutional if the FCC had imposed it directly.25 While the FTC has been far less prone to
such abuse and actually plays a key role in holding companies to their promises, its current
Chairman, Jon Leibowitz, has hung the “regulatory sword of Damocles” over the heads of the
online advertising industry, threatening them with a “day of reckoning” if he doesn’t get what
he wants from industry self-regulatory efforts.26 The sword could actually fall if the FTC turns
self-regulation into the European model of “co-regulation,” where the government steers and
industry simply rows.27
21. As Cato Institute scholar Will Wilkinson has argued, the book’s “agreeably banal doctrine of choice-
preserving helpfulness” blurs the lines between paternalism and libertarianism, and thus “the thrust of the
conceptual renovation behind the term libertarian paternalism is to empower, not limit, political elites.” Why
Opting Out Is No "Third Way,” Reason, October 2008, www.reason.com/news/show/128916.html. See also Adam
Thierer, The Progress & Freedom Foundation, Sunstein’s “Libertarian Paternalism” is Really Just Paternalism, PFF
Blog, April 7, 2008, http://blog.pff.org/archives/2008/04/sunsteins_liber.html.
22. See Robert Corn-Revere, “’Voluntary’ Self-Regulation and the Triumph of Euphemism,” in Rationales &
Rationalizations: Regulating the Electronic Media (Robert Corn-Revere, ed., 1997), at 183-208.
23. Telecom Policy Report, Commission Settles Indecency Charges, But At What Cost?, June 30, 2004,
http://findarticles.com/p/articles/mi_m0PJR/is_25_2/ai_n6091525.
24. See Adam Thierer, XM-Sirius, Regulatory Blackmail, and Diversity, June 17, 2008,
http://blog.pff.org/archives/2008/06/xmsirius_regula.html.
25. See Comments of W. Kenneth Ferree on Implementation of Sirius-XM Merger Condition, The Progress &
Freedom Foundation, MB Docket No. 07-57, March 30, 2009, www.pff.org/issues-
pubs/filings/2009/033009siriusXMconditionfiling.pdf.
26. See Szoka & Thierer, supra note 8, at 3.
27. See id. at 2.
28. Thomas Sowell, The Vision of the Anointed: Self-Congratulation as a Basis for Social Policy (1995) at 5.
29. Alice Marwick, To Catch a Predator? The MySpace Moral Panic, First Monday, Vol. 13, No. 6-2, June 2008,
www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2152/1966; Wade Roush, The Moral Panic over
Social Networking Sites, Technology Review, Aug. 7, 2006, www.technologyreview.com/communications/17266;
Anne Collier, Why Techopanics are Bad, Net Family News, April 23, 2009, www.netfamilynews.org/2009/04/why-
technopanics-are-bad.html; Adam Thierer, Parents, Kids & Policymakers in the Digital Age: Safeguarding Against
‘Techno-Panics,’ Inside ALEC, July 2009, at 16-17, www.alec.org/am/pdf/Inside_July09.pdf; Adam Thierer, Progress
& Freedom Foundation, Technopanics and the Great Social Networking Scare, PFF Blog, June 10, 2008,
http://techliberation.com/2008/07/10/technopanics-and-the-great-social-networking-scare.
30. Supra note 13.
31. In the 109th Congress, former Rep. Michael Fitzpatrick (R-PA) introduced the Deleting Online Predators
Act (DOPA), which proposed a ban on social networking sites in public schools and libraries. DOPA passed the
House of Representatives shortly thereafter by a lopsided 410-15 vote, but failed to pass the Senate. The measure
was reintroduced just a few weeks into the 110th Congress by Senator Ted Stevens (R-AK), the ranking minority
3. A need for government to drastically curtail the dangerous behavior of the many [must
stop kids and adults from being online together on the same sites], in response to the
prescient conclusions of the few [some state Attorneys General].32
4. A disdainful dismissal of arguments to the contrary as either uninformed, irresponsible, or
motivated by unworthy purposes [child safety researchers and others are told that their
research is meaningless or offbase].33
We also see this model in play in other debates, such as efforts to regulate “excessively violent”
video games and television programming.34 And consider how this model plays out on the
privacy front:
1. Assertion of a great danger to the whole society [amorphous privacy violations], a danger
to which the masses of people are oblivious.
2. An urgent need for government action [“baseline federal privacy regulation”] to avert
impending catastrophe.
3. A need for government to drastically curtail the dangerous behavior of the many [anyone
who shares information online], in response to the prescient conclusions of the few [a
handful of privacy advocacy groups].
4. A disdainful dismissal of arguments to the contrary as either uninformed, irresponsible, or
motivated by unworthy purposes [any suggestion that privacy concerns are being
overblown and that most information-sharing is socially beneficial is dismissed out of
hand].
Worse yet, regulatory intervention in these cases simply begets more and more intervention to
correct the inevitable failures of, or dissatisfaction with, previous interventions.35 Thus, the
“crisis” cycle never ends.
member and former chairman of the Senate Commerce Committee. It was section 2 of a bill that Sen. Stevens
sponsored titled the “Protecting Children in the 21st Century Act” (S. 49), but the provision was later removed from the bill. See
Declan McCullagh, Chat Rooms Could Face Expulsion, CNet News.com, July 28, 2006, http://news.com.com/2100-
1028_3-6099414.html?part=rss&tag=6099414&subj=news.
32. See Emily Steel & Julia Angwin, MySpace Receives More Pressure to Limit Children’s Access to Site, Wall
Street Journal, June 23, 2006, online.wsj.com/public/article/SB115102268445288250-
YRxkt0rTsyyf1QiQf2EPBYSf7iU_20070624.html; Susan Haigh, Conn. Bill Would Force MySpace Age Check, Yahoo
News.com, March 7, 2007, www.msnbc.msn.com/id/17502005.
33. See, e.g., Letter of Henry McMaster, Attorney General, South Carolina to Attorney General Richard
Blumenthal and Attorney General Roy Cooper Regarding Internet Safety Task Force (“ISTTF”) Report, January 14,
2009, www.scag.gov/newsroom/pdf/2009/internetsafetyreport.pdf.
34. See Adam Thierer, The Progress & Freedom Foundation, Video Games and “Moral Panic,” PFF Blog, Jan.
23, 2009, http://blog.pff.org/archives/2009/01/video_games_and_moral_panic.html; Adam Thierer, The Progress
& Freedom Foundation, Fact and Fiction in the Debate over Video Game Regulation, Progress Snapshot 13.7,
March 2006, www.pff.org/issues-pubs/pops/pop13.7videogames.pdf.
35. “All varieties of interference with the market phenomena not only fail to achieve the ends aimed at by
their authors and supporters, but bring about a state of affairs which—from the point of view of their authors’ and
advocates’ valuations—is less desirable than the previous state of affairs which they were designed to alter. If one
wants to correct their manifest unsuitableness and preposterousness by supplementing the first acts of
intervention with more and more of such acts, one must go farther and farther until the market economy has been
entirely destroyed and socialism has been substituted for it.” Ludwig von Mises, Human Action, at 858 (3rd ed.
1963) (1949).
36. See generally Adam Thierer, The Progress & Freedom Foundation, Media Myths: Making Sense of the
Debate over Media Ownership (2005) at 119-123, www.pff.org/issues-pubs/books/050610mediamyths.pdf
(explaining how the third-person effect serves as a powerful explanation for the heated backlash that followed an
FCC effort to moderately liberalize media ownership rules in 2003-04).
37. W. Phillips Davison, The Third-Person Effect in Communication, 47 Public Opinion Quarterly 1, Spring 1983,
at 3.
38. For the best overview of third-person effect research, see Douglas M. McLeod, Benjamin H. Detenber, and
William P. Eveland, Jr., Behind the Third-Person Effect: Differentiating Perceptual Processes for Self and Other,
Journal of Communication, Vol. 51, No. 4, 2001, at 678-695.
39. Vincent Price, David H. Tewksbury & Li-Ning Huang, Third-person Effects of News Coverage: Orientations
Toward Media, Journalism & Mass Communications Quarterly, Vol. 74, at 525-540.
40. Douglas M. McLeod, William P. Eveland & Amy I. Nathanson, Support for Censorship of Violent and
Misogynic Rap Lyrics: An Analysis of the Third-Person Effect, Communication Research, Vol. 24, 1997, at 153-174.
41. Hernando Rojas, Dhavan V. Shah, and Ronald J. Faber, For the Good of Others: Censorship and the Third-
Person Effect, International Journal of Public Opinion Research, Vol. 8, 1996, at 163-186.
42. James D. Ivory, Addictive, But Not For Me: The Third-Person Effect and Electronic Game Players’ Views
Toward the Medium’s Potential for Dependency and Addiction, University of North Carolina at Chapel Hill, School
of Journalism and Mass Communication, Aug. 2002.
43. Albert C. Gunther, Overrating the X-rating: The Third-person Perception and Support for Censorship of
Pornography, Journal of Communication, Vol. 45, No. 1, 1995, at 27-38.
44. Supra note 37 at 14. Along these lines, a December 2004 Washington Post article documented the
process by which the Parents Television Council, a vociferous censorship advocacy group, screens various
television programming. One of the PTC screeners interviewed for the story talked about the societal dangers of
various broadcast and cable programs she rates, but then also noted how much she personally enjoys HBO’s “The
Sopranos” and “Sex and the City,” as well as ABC’s “Desperate Housewives.” Apparently, in her opinion, what’s
good for the goose is not good for the gander! See Bob Thompson, Fighting Indecency, One Bleep at a Time, The
Washington Post, Dec. 9, 2004, at C1, www.washingtonpost.com/wp-dyn/articles/A49907-2004Dec8.html.
than “dumb banner” ads previously used by other webmail providers.45 Self-appointed (or, to
extend Sowell’s framework, “self-anointed”) privacy advocates howled that Google was going
to “read users’ email,” and led a crusade to ban such algorithmic contextual targeting.46 Thierer
responded to these critics by pointing out that the service was purely voluntary and noted:
you don’t speak for me and a lot of other people in this world who will be more
than happy to cut this deal with Google. So do us a favor and don’t ask the
government to shut down a service just because you don't like it. Privacy is a
subjective condition and your value preferences are not representative of everyone
else's values in our diverse nation. Stop trying to coercively force your values and
choices on others. We can decide these things on our own, thank you very much.47
Interestingly, however, the frenzy of hysterical indignation about Gmail was followed by a
collective cyber-yawn: Users increasingly understood that algorithms, not humans, were doing
the “reading” and that, if they didn’t like it, they didn’t have to use it. Today, nearly 150 million
people around the world use Gmail, and it has a steadily growing share of the webmail
market. Even though cyber-consumers have embraced the service, some privacy advocates
persist in their effort to shut down Gmail. They appear determined to stop at nothing to
impose their will on others—the essence of political elitism—even if that means cutting off free
email service for 150 million people!48
A similar debate has played out more recently regarding targeted online advertising in general.
Advertising on search engines is, much like Gmail, targeted “contextually” based on search
terms entered by users and most advertising on other websites is based on the nature of
content on a site or page. But certain data is collected about users as they browse to make that
advertising more effective—by measuring its performance, reducing fraud, preventing over-
exposure, etc. Some privacy advocates have insisted that industry self-regulation of such
practices (even if enforced by the FTC) is inadequate and have called for preemptive regulation.
They are even more offended by “behavioral advertising,” which allows publishers whose
content offers little basis for contextual targeting on their own sites to compete for more
highly valued advertising by showing users ads based on other
sites they’ve visited. In both cases, data collection can increase the funding available to
publishers to produce more of the content and services preferred by users, thus conferring an
45. See Chris Anderson, Free: The Future of a Radical Price at 112-118 (2009).
46. See Letter from Chris Jay Hoofnagle, Electronic Privacy Information Center, Beth Givens, Privacy Rights
Clearinghouse, Pam Dixon, World Privacy Forum, to California Attorney General Lockyer, May 3, 2004,
http://epic.org/privacy/gmail/agltr5.3.04.html.
47. See email from Adam Thierer to Declan McCullaugh on Politech Email discussion group, April 30, 2004,
http://lists.jammed.com/politech/2004/04/0083.html (emphasis added).
48. See Complaint and Request for Injunction of the Electronic Privacy Information Center against Google,
Inc., March 17, 2009, http://epic.org/privacy/cloudcomputing/google/ftc031709.pdf; see also Ryan Radia, Should
the FTC Shut Down Gmail and Google Docs Because of an Already-Fixed Bug?, Technology Liberation Front Blog,
March 18, 2009, http://techliberation.com/2009/03/18/should-the-ftc-shut-down-gmail-and-google-docs-
because-of-an-already-fixed-bug/.
enormous indirect benefit on users, but also directly benefits users by increasing the relevance
of the advertising they see.49 For some of the more extreme advocates of privacy regulation,
however, there are no trade-offs, only absolutist “solutions”: to them, privacy is so obviously
desirable that they feel at ease in deciding what’s best for everyone else. Such absolutists often
respond with righteous indignation and conspiratorial fulmination when challenged to identify
the harm against which they’re protecting consumers, while disdainfully dismissing all talk of
the benefits of online advertising as self-serving industry propaganda.50
49. See Berin Szoka & Mark Adams, The Progress & Freedom Foundation, The Benefits of Online Advertising &
the Costs of Regulation, PFF Working Paper, forthcoming.
50. Anti-advertising crusader Jeff Chester often resorts to questioning the motives of those who question
whether his regulatory prescriptions would actually benefit consumers, see, e.g.,
http://techliberation.com/2009/06/17/behavioral-advertising-industry-practices-hearing-some-issues-that-need-
to-be-discussed/#comment-11698840. See generally Jeff Chester, Digital Destiny: New Media and the Future of
Democracy (2007).
51. “The only freedom which deserves the name is that of pursuing our own good in our own way, so long as
we do not attempt to deprive others of theirs or impede their efforts to obtain it. Each is the proper guardian of his
own health, whether bodily or mental and spiritual.” John Stuart Mill, On Liberty (Penguin Classics, 1859, 1986) at
72.
52. Adam Thierer, The Progress & Freedom Foundation, Parental Controls & Online Child Protection, Special
Report, Version 4.0, Summer 2009, www.pff.org/parentalcontrols.
53. Adam Thierer, Berin Szoka & Adam Marcus, The Progress & Freedom Foundation, Privacy Solutions, PFF
Blog, Ongoing Series, http://blog.pff.org/archives/ongoing_series/privacy_solutions.
them, these elitists skip right past user empowerment and channel their energies into
regulations that would impose a top-down, one-size-fits-all standard on all adults and families—
or even into trying to craft the perfect “nudge” that will help users make what elites believe to
be the “right” decisions. Of course, these tools can, and should, be improved. Those groups
worried about speech/content and privacy issues should focus on how we might drive such
protections from the bottom-up by empowering individuals instead of government
bureaucrats. The goal in both cases should be a “let-a-thousand-flowers-bloom” approach,
which offers diverse tools and strategies for our diverse citizenry.54 We need not accept “one-
size-fits-all” approaches, whether they be regulatory mandates or “nudges,” based on the
presumption that elites know best.
Finally, it is vital not to lose sight of what’s ultimately at stake here. If regulatory approaches
trump the empowerment agenda we have described, the future of a free and open Internet—
indeed, as technology converges, the future of all media—is at risk.55 By imposing
technological solutions from the top-down that can never keep pace with technological change,
regulation necessarily forecloses freedom and innovation.56 By contrast, individual
empowerment allows innovation to flourish. The better approach across the board is
education, not regulation.57 Empowerment, not elitism, is the path forward. The digital elite
should be leading this effort by developing and promoting technologies of empowerment, not
crafting regulatory mandates to force their will upon us.58
54. Comments of Adam Thierer, The Progress & Freedom Foundation, In the Matter of Implementation of the
Child Safe Viewing Act; Examination of Parental Control Technologies for Video or Audio Programming; MB Docket
No. 09-26, April 16, 2009, www.pff.org/issues-pubs/filings/2009/041509-%5bFCC-FILING%5d-Adam-Thierer-PFF-
re-FCC-Child-Safe-Viewing-Act-NOI-(MB-09-26).pdf.
55. See Adam Thierer, FCC v. Fox and the Future of the First Amendment in the Information Age, Engage, Feb.
20, 2009, www.fed-soc.org/doclib/20090216_ThiererEngage101.pdf.
56. “To act on the belief that we possess the knowledge and the power which enable us to shape the
processes of society entirely to our liking, knowledge which in fact we do not possess, is likely to make us do much
harm.” Friedrich von Hayek, “The Pretence of Knowledge,” in The Essence of Hayek, (Hoover Inst., 1984), at 276.
57. Adam Thierer, The Progress & Freedom Foundation, Two Sensible, Education-Based Legislative
Approaches to Online Child Safety, Progress Snapshot 3.10, Sept. 2007, www.pff.org/issues-
pubs/ps/2007/ps3.10safetyeducationbills.pdf.
58. See, e.g., Berin Szoka, Google, CDT, Online Advertising & Preserving Persistent User Choice Across Ad
Networks Through Plug-ins, Technology Liberation Front Blog, March 13, 2009, http://techliberation.com/2009/
03/13/google-cdt-online-advertising-preserving-persistent-user-choice-across-ad-networks-through-plug-ins/.
The Progress & Freedom Foundation is a market-oriented think tank that studies the digital revolution and its
implications for public policy. Its mission is to educate policymakers, opinion leaders and the public about issues
associated with technological change, based on a philosophy of limited government, free markets and civil liberties.
Established in 1993, PFF is a private, non-profit, non-partisan research organization supported by tax-deductible
donations from corporations, foundations and individuals. The views expressed here are those of the authors, and do not
necessarily represent the views of PFF, its Board of Directors, officers or staff.
The Progress & Freedom Foundation 1444 Eye Street, NW Suite 500 Washington, DC 20005
202-289-8928 mail@pff.org www.pff.org