The Inverse Relationship between Secrecy and Privacy
Author(s): Julie E. Cohen
Source: Social Research, Vol. 77, No. 3, Limiting Knowledge in a Democracy (Fall 2010), pp. 883-898. Published by: The New School. Stable URL: http://www.jstor.org/stable/40972297. Accessed: 06/08/2014.

THE INVERSE-RELATIONSHIP NARRATIVE

Within civil libertarian discourse, it is commonly held that there is an inverse relationship between government secrecy and the privacy of individual citizens. According to this inverse-relationship narrative, secrecy enables and perpetuates privacy invasion by shielding government prying from public scrutiny. Absent the secrecy, or so the story goes, the public would call government to account for its misdeeds, after which constitutional and statutory protections would kick in and the proper balance between public and private life would be restored. If we tell the inverse-relationship story often enough and indignantly enough, it can come to seem as though we might achieve sufficient protection for both privacy and democracy simply by limiting official secrecy. The inverse-relationship story of how privacy is lost and gained is an appealing one.
Stories that cast government as the greatest threat to individual welfare, and that envision individual welfare as protected precisely to the extent that government is restrained, have powerful cultural resonance in American public discourse. One might say that they exist in our political DNA - in the fundamentally liberal political philosophy that animates our politics and our markets.

Portions of this essay are adapted from my forthcoming book, Configuring the Networked Self: Copyright, Surveillance, and the Production of Networked Space (Yale University Press, forthcoming).

In the case of privacy, however, the story is wrong. Devaluation of privacy is bound up with our political economy and with our public discourse about information policy in important ways that have little or nothing to do with official conduct. This devaluation proceeds in two opposite but mutually reinforcing patterns: by valorizing private economic arrangements organized around trade secrecy and by elevating openness as an ultimate good. There is an inverse relationship between privacy and secrecy, but there is an equally powerful inverse relationship between openness and privacy that for ideological reasons we are inclined to resist discussing. And the very same liberal commitments that generate the inverse-relationship story prevent us from understanding what privacy ought to mean.

THE POWER OF SECRECY ACROSS THE PUBLIC/PRIVATE DIVIDE

In the emerging networked information economy, access to personal information about current and potential customers is considered the key ingredient in market success. The United States has become the center of a large and growing market for personal information, encompassing all kinds of data about individual attributes, activities, and preferences.
Trade in some information, such as financial and health information, is subject to legal restrictions, but most other types of information flow freely among participants, ranging from large financial institutions to search engines to divorce attorneys and private detectives. Flows of data are facilitated by corporate data brokers like ChoicePoint, Experian, and Acxiom (Hoofnagle 2004: 600-08). To help companies (and governments) make the most of the information they purchase, an industry devoted to "data mining" and "behavioral advertising" has arisen; firms in this industry compete with one another to develop more profitable methods of sorting and classifying individual consumers.

The driver of markets in personal information is a kind of privacy, but it is the privacy of private property. Information disclosed by individuals through their commercial relationships becomes the private property of providers of services and goods, and that property itself is bought, sold, and traded. The ultimate object of this trade is the creation of individualized economies of attention, in which we are known by our preferences and habits and captured by our loyalties. Personalization also plays a key role in the vision of the future of the Internet as a "semantic web" (Berners-Lee et al. 2001) that connects people, information, and things. The interactivity of the emerging semantic web is comprehensively mediated by information about individuals' preferences and transactional histories.

To be sure, government is an important customer of private sector data processors. In the United States, a number of federal agencies have awarded multimillion dollar contracts to corporate data brokers to supply them with personal information about both citizens and foreign nationals.
Privacy restrictions that limit the extent to which the government can itself collect personal information generally do not apply to such purchases at all (Hoofnagle 2004: 622-23). The government has deployed secrecy to great effect where these initiatives are concerned, with the result that we still understand too little about many of them. Legal regimes purporting to guarantee official transparency are in fact indeterminate on how much openness to require. For example, the federal Freedom of Information Act (FOIA) mandates far-reaching disclosure of information about government actions and processes, but exempts classified information and information about law enforcement techniques and procedures if such disclosure would "risk circumvention of the law" or create risks to life or physical safety (552(b)(7)).

Even so, most government uses of personal information, whether collected directly or acquired from private companies, ultimately are subject to transparency requirements, including those imposed by the FOIA, and they are subject to the supervision of courts. In the United States, the same requirements do not apply to most commercial data-processing operations. The guidelines on fair information practices adopted by the Organization for Economic Cooperation and Development (OECD) (1980) and enacted as a directive by the European Union (EU) (1995) require parties that collect personal information to provide disclosures specifying the purposes for which the information will be used and any potential recipients other than the original collector. They also must afford data subjects a meaningful opportunity to examine and correct the information.
In the United States, however, the OECD guidelines generally have not been applied to most private-sector uses and transfers of personal data; instead, such activities are regulated only by background prohibitions against unfair and deceptive trade practices. Most reputable firms that deal directly with consumers do disclose some information about their "privacy practices," but the incentive is to formulate disclosures about both purposes and potential recipients in the most general terms possible. This practice in turn shields secondary recipients of personal data, most of whom do not disclose information about their activities at all.

Even the highly granular purpose and recipient disclosures required under a strict interpretation of the OECD guidelines, moreover, would not necessarily shed light on the operational significance of collected information. Telling someone what pieces of information were considered for the purposes of making decisions about credit or medical coverage provides no information about how that information mattered. It reveals very little about the other assumptions used to construct the operational heuristic, nor does it indicate how different information would have changed the result.

Efforts to gain access to operational information about private-sector uses of personal information run into the first of the two discourses of information policy that I mentioned at the start of this essay: the discourse of economic secrecy. Economic regimes of trade secrecy have as their principal purposes the protection of innovation and competition. Such regimes reproduce as a matter of course many of the patterns of nondisclosure that we find so threatening when they manifest within government. Within trade secrecy law and practice, it is not only normal but also and more fundamentally desirable that information should be made available only to those authorized to know it.
Although we do not typically acknowledge this, trade secrecy and state secrecy are equally important aspects of our national information policy. Government disclosures typically are structured so as not to disrupt patterns of trade secrecy, and this is legally sanctioned: the FOIA (552(b)(4)) exempts trade secret information from disclosure in most cases.

The nexus between state secrecy and economic secrecy has not gone unnoticed in information policy debates. Scholars like Danielle Citron (2008) have pointed out that regimes of economic secrecy fortified by the FOIA trade secrecy exemption may operate to shield newly privatized, formerly public functions such as the design of electronic voting processes from public scrutiny. Citron argues that due process protections against arbitrary state action should extend across the public/private divide to reach the actions of the nominally private actors now performing such functions. The larger problem, however, goes beyond the transfer of public functions across the public/private divide. The more important question is why the public/private divide should presumptively insulate the information-processing practices of other private actors from public scrutiny. Regimes of secrecy fortified by intellectual property law operate to deny us access to large categories of decisions that have real and immediate effect on every facet of our day-to-day lives, ranging from decisions about access to credit and insurance to more mundane decisions about the information that we are shown. They are therefore a legitimate and urgent subject of public concern.

THE IDEOLOGY OF OPENNESS

The obvious remedy for too much secrecy, of course, is more openness.
So, for example, some have argued that the best way to equalize the power disparities resulting from regimes of state or corporate secrecy is to give everyone access to the same information that governments and corporations have (Brin 1999; Mann et al. 2003). If surveillance feeds and search strings alike were public property, or so the argument goes, their ability to underwrite public and private assertions of power would be greatly reduced.

Here we encounter the second of the two discourses of information policy that work to devalue privacy. If sunlight is the best disinfectant and free expression the foundation of our democracy, then it seems only logical to think more sunlight and more information will make our public discourse purer and more democratic. As for all of the inconvenient, embarrassing bits of information that are suddenly networked and searchable, we should all just learn to get past the awkwardness and enter a postprivacy era (for example, Zittrain 2008: 228-34). The alternative - making distinctions among the Internet's information flows and regulating some of them - would threaten cherished freedoms of speech and inquiry. On that reasoning, secrecy and openness are complementary halves of a binary that is thought to contain within it all of the possible responses to information policy problems. From a privacy perspective, neither argument follows.

First, the information policy discourse of openness is extraordinarily resistant to recognizing that the "openness" practiced by ordinary people, both online and off, is a matter of degree. The design of most networked information services mirrors this insensitivity.
When Facebook announced a commercial arrangement called the Beacon program, which would notify members of their friends' purchases, it assumed users would be delighted. When Google introduced its new networking service, Google Buzz, it automatically enrolled all Gmail customers and publicly listed their top Gmail correspondents as their "friends"; some wondered why anyone would object. The public backlash that followed each of these incidents, and many others, was entirely unsurprising. There are many reasons that one might prefer not to share information about all of one's purchases or all of one's private correspondence with all of one's friends. The designers of Facebook Beacon and Google Buzz betrayed a fatal insensitivity to the fine contextual distinctions that we make all the time in our interactions with the world, and to our reasons for making them.

The everyday practice of life involves the creation and management of boundaries between different activities and relationships. To an extent, these processes of boundary management are implicitly recognized in Alan Westin's pathbreaking and influential discussion of privacy interests, which identified "reserve" as a critical aspect of privacy (1967: 37-42). Ultimately, however, reserve is too one-dimensional a notion to be useful in characterizing the range of social processes that result from selective withholding and selective disclosure. A richer conceptualization of the differential control that social processes entail is social psychologist Irwin Altman's model of privacy as a dialectical process of boundary regulation (1975). While Westin presented a relatively static taxonomy of types of interpersonal separation, Altman crafted a dynamic model designed to encompass the range of processes by which privacy in its various forms is created and maintained.
Altman characterized privacy as "a central regulatory process by which a person (or group) makes himself more or less accessible and open to others," and identified "the concepts of personal space and territorial behavior" as the principal regulatory mechanisms in the process (1975: 3). He observed that the concepts of personal space and territorial behavior inform a range of privacy-regulating behaviors; together, those behaviors constitute a coherent system for personal boundary management that responds dynamically to changing circumstances, needs, and desires.

Importantly, while the term "privacy" carries with it specific cultural baggage, the processes described by Altman have a more universal character. Although different cultures have different conventions about personal space and territory, people in every culture use personal space and territory to manage interpersonal boundaries (Altman 1977). Those processes mediate human interaction both physically and conceptually; our understandings of selfhood are shaped by the habits of boundary management that we develop. Widespread, undifferentiated disclosures threaten our ability to manage our boundaries, with potentially drastic consequences for the processes by which we articulate our identities, define our beliefs, and formulate our politics. As Helen Nissenbaum (2009) explains, such disclosures destroy the contextual integrity to which we have become accustomed. And as Altman's model makes clear, we require some ability to manage contextual integrity in order to function in society.

Second, the information policy discourse of openness is almost willfully blind to the economies of desire that exist in information markets - economies that the ideology of openness itself helps to create.
Jodi Dean (2009) identifies a tension between secrets and publicity that exists at the heart of our political economy, within the core of a set of practices that she terms "communicative capitalism." Within communicative capitalism, the economic logics of information markets are fortified by a media culture that prizes exposure and an intellectual ethos that assigns that media culture independent normative value because of the greater "openness" it fosters. Building on Dean's framework, surveillance theorist Kirstie Ball (2009: 641-45) argues that voluntarily disclosed information circulates in twinned economies of authenticity and perversity; disclosures are called forth by manufactured norms of participation, but they also take on fetish value exactly because they represent slices of authentic reality.

Emerging practices of self-exposure align neatly with processes of personalization that operate in information markets, and that fuel the emerging semantic web. The point here is not that giant corporations extract information from us against our will or in ways that overtly telegraph economic or political subordination; it is precisely the opposite. The individualized economies of attention that characterize the emerging networked information society depend critically on our willing participation. In the networked information society, we are all in the personal-information-processing business. Basic network economics dictates that platforms like Facebook and Google have value only to the extent that enough of us voluntarily provide them with the raw material. The rub is that those activities have value to Facebook and Google only to the extent that they can be monetized.
Flows of information within the semantic web constitute an interlinked series of "surveillant assemblages" (Haggerty and Ericson 2000): heterogeneous, loosely coupled sets of institutions that seek to harness the raw power of information by fixing flows of information cognitively and spatially. Of critical importance within Haggerty and Ericson's framework, the surveillant assemblage operates upon its subjects not only by the "normalized soul training" of Foucauldian theory (614-15), but also by seduction. The surveillance society is not a grim dystopia; to the contrary, flows of information within the surveillant assemblage promise a cornucopia of benefits and pleasures, including price discounts and social status. In return for its pleasures, however, the surveillant assemblage demands full enrollment, which cultural and political norms of openness and "sunlight" help to elicit.

PRIVACY AND LIBERAL ANXIETIES

Why, though, should we think that any of this is a problem? After all, we have chosen it, or so the story goes, and we choose it again and again every time we buy music, or groceries, or airline tickets, and every time we share updates with our friends. This is the point at which the foundational commitments of liberalism get in the way. They tell us that the choices that individuals make about disclosing information are definitionally autonomous and therefore presumptively efficient, and that aggregated, accurate information promotes truth-discovery. One can imagine two reasons to be skeptical of these answers. One is that information processing is good for far less than we think. The other reason is that privacy is good for far more. Both possibilities warrant our careful, critical attention.

Let us begin with the first possibility: What exactly is information processing good for?
What social goods would protection for privacy prevent us from achieving? The conventional answer has two parts: information processing gives us what we want, and information processing advances the pursuit of knowledge and truth. We should see immediately that the first answer is question begging. Wants can be manufactured, and can be self-destructive. A hallmark of civilization is precisely the capacity for both individual and collective discipline in the face of excessive and potentially self-destructive wants.

Perhaps surprisingly, the account of information processing as inevitably truth-enhancing fares no better. That account, which I have labeled the "information-processing imperative" (Cohen forthcoming, chap. 3), comes to us directly from the Enlightenment; it is grounded in a view of information gathering as knowledge discovery along a fixed, linear trajectory of forward progress. Within the framework defined by the information-processing imperative, the interest in getting and using more complete information is presumptively rational and of the utmost importance. The truth value of the information is assumed and elevated to a level beyond ideology; as a result, the other work that information processing does goes unaddressed and usually unacknowledged. Information processing is not a neutral activity, however; it requires choices about categories and priorities that are open to interrogation (Bowker and Star 1999). History is rife with examples - ranging from genocide to invidious discrimination to banal tales of bureaucratic excess - of the ways that precise, granular information about individuals and groups can be turned to unjust and sometimes horrific ends (for example, Black 2001).
Imbued with the values of Enlightenment rationalism, we tend to regard these episodes as unfortunate anomalies, but we should not. As Frederick Schauer (2003) explains, opposition to entrenched societal discrimination is hard to reconcile with commitment to the truth value of information; the line between useful heuristics and invidious stereotypes is vanishingly thin. Sorting and discrimination are synonyms; the one entails the other (Gandy 2009: 55-74). Privacy theorists tend to think that the solution to problems of invidious discrimination is better (information-based) metrics for separating the invidious frameworks from the truthful ones. Thus, for example, Lior Strahilevitz (2008: 376-81) contrasts valuable "information" with wasteful "signals," and argues that privacy policy should encourage use of the former rather than the latter. That seems reasonable enough, but it assumes an ontological distinction between the two categories that does not exist.

Faith in the truth value of information reaches its zenith in processes of risk management, but the relationship between information processing and risk is much more complicated than the information-processing imperative acknowledges. Events in the post-9/11 world reveal a dialectical relationship between new technological methods of managing risks and risks that new technological methods create. Large-scale data mining and complex, automated systems for managing critical infrastructures and activities rely heavily on algorithms that align and systematize the meanings of data about people and events. Formally, such systems approximate the requirement of logical completeness, an approximation that becomes stronger as more and more data are collected.
Much evidence suggests, however, that relying on such techniques to the exclusion of human judgment does not eliminate the risk of system failure, but instead magnifies the probability that system failures will be large and catastrophic. So, for example, the U.S. government's development of a profile-based system for screening airline passengers inspired the "Carnival Booth" study (Chakrabarti and Strauss 2002), in which a pair of MIT-based researchers demonstrated how a terrorist group might defeat the screening system by hiding its agents within designated low-risk groups. The recent and still ongoing meltdown of the global financial system was precipitated by the toxic combination of reliance on automated, logically complete financial models and regulatory deference to those models (Bamberger 2010).

In debates about privacy and information processing, we would benefit from acknowledging that information processing is always-already the subject of someone's regulatory agenda. The logics of information processing require (and already receive) external discipline. Exercising that discipline with care for justice requires making normative decisions about the conduct of information processing and its appropriate limits.

What about the second possibility: What is privacy good for? We tend to think of privacy as sheltering the fixed, autonomous self against the vicissitudes of technological and social change. That view of the self derives from the tradition of liberal political theory, and it explains a great deal about the way U.S. legal scholars and policymakers respond to information privacy problems.
If one takes the autonomous, rational liberal self as the descriptive and normative baseline, it becomes very hard to understand what a generalized entitlement to privacy with respect to our everyday behaviors and transactions might accomplish. Except for information about a handful of concededly sensitive topics, it is hard to imagine mere disclosure altering trajectories of behavior that presumptively flow from our free will. Because the liberal self exists outside of any particular context, it is hard to understand why changing the context of a disclosure should change its privacy impact.

Some commentators argue that privacy serves a dignitary function - worth preserving not because it affects our decisions or actions, but because it spares our feelings. Within our political culture, however, dignitary interests are considered anemic relative to liberty interests. If the disclosures enabled by new technologies are thought to serve interests in market and expressive liberty, it is easy to conclude that liberty interests should prevail. Privacy comes to seem both unnecessary and vaguely retrograde, a doomed attempt to hold back the inexorable tide of progress.

What if, though, it is not the idea of privacy that is the problem? What if the problem, instead, is the idea of the autonomous, rational, decontextualized self that privacy theoretically protects? Although legal and policy discourse clings to it, as a descriptive matter the model of liberal selfhood is increasingly discredited in most other areas of contemporary thought, ranging from philosophy to sociology to cultural studies to cognitive theory.
For most contemporary thinkers, it makes far more sense to speak of an emergent, relational subjectivity that is continually shaped and reshaped by everything to which we are exposed. That understanding dovetails with Altman's model (1975) of privacy as a dialectical process of boundary regulation by which understandings of selfhood are constructed over time.

In general, U.S. privacy scholars are deeply resistant, even hostile, to the idea of the socially constructed self. As Jeffrey Rosen (2000: 166) puts it, "I'm free to think whatever I like even if the state or the phone company knows what I read." That argument is a product of the liberal conception of autonomy, pure and simple; it posits that choice negates social shaping and social shaping negates choice. That understanding of social shaping is far too binary, however; social shaping need not entail the negation of self. Other scholars conclude that "privacy" is itself an artifact of liberal political theory. According to Peter Galison and Martha Minow (2005: 277-84), rights of privacy are inseparably tied to the liberal conception of the autonomous, prepolitical self. They argue that privacy as we know it (in advanced Western societies) ultimately will not withstand the dissolution of the liberal self diagnosed by contemporary social and cultural theory. But the understanding of privacy as tied to autonomy represents only one possible conception of privacy's relation to selfhood.

If boundary regulation plays a critical role in processes of self-constitution, then the relationship between privacy and selfhood is more complex than either liberal optimism or liberal pessimism suggests.
I have argued that

    One can choose to understand the autonomous liberal self and the dominated postmodernist subject as irreconcilable opposites, or one can understand them as two (equally implausible) endpoints on a continuum along which social shaping and individual liberty combine in varying proportions. Taking the latter, more realistic perspective, moreover, it is possible to meld contemporary critiques of the origins and evolution of subjectivity with the more traditionally liberal concerns that have preoccupied American privacy theorists. Postmodernist social and cultural theory seeks to cultivate a critical stance toward claims to knowledge and self-knowledge. In a society committed at least to the desirability of the liberal ideal of self-determination, that perspective should be an appealing one (Cohen forthcoming, chap. 3).

It is precisely in the malleable, unfixed nature of our subjectivity that we can locate the possibilities for meaningful self-actualization and social "progress" that traditionally have been among liberalism's cardinal aspirations.

On this account of subjectivity, privacy is suddenly far more important than we as a society have been willing to admit. This is so not because privacy shelters fixed, autonomous selfhood from the pressures of change, but because it does exactly the opposite: it shelters emergent subjectivity from external efforts to render it orderly and predictable. By preventing dissolution of the boundaries that separate contexts and spaces from one another, privacy counteracts the informational and spatial logics of surveillance, which seek to impose a grid of fixed, stable meaning on human activity. Privacy widens the interstices among processes of social shaping, furnishing emergent subjectivity with room for play.
This enables the development of critical perspec- tive, and creates the conditions for both personal and social change. CONCLUSION Open access to information is an important underpinning of our politi- cal culture, but critical subjectivity also is a good that we cannot do without. If so, then privacy - and the necessary possibility of limits on knowledge - should not be lightly surrendered. The pursuit of our liberal aspirations requires that we do precisely that which our stron- gest liberal instincts forbid: interrogate regimes of secrecy that exist on both sides of the public-private divide, and scrutinize with equal rigor our ideology of openness. REFERENCES Altman, Irving. The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding. Monterey, Calif: Brooks/Cole Publishing, 1975.
Altman, Irwin. "Privacy Regulation: Culturally Universal or Culturally Specific?" Journal of Social Issues 33:3 (1977): 66-84.

Ball, Kirstie S. "Exposure: Exploring the Subject of Surveillance." Information, Communication, and Society 12:5 (March 2009): 639-657.

Bamberger, Kenneth A. "Technologies of Compliance: Risk and Regulation in a Digital Age." Texas Law Review 88:4 (March 2010): 669-739.

Berners-Lee, Tim, James Hendler, and Ora Lassila. "The Semantic Web." Scientific American (April 2001): 34-43.

Black, Edwin. IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America's Most Powerful Corporation. New York: Crown Books, 2001.

Bowker, Geoffrey C., and Susan Leigh Star. Sorting Things Out: Classification and Its Consequences. Cambridge: MIT Press, 1999.

Brin, David. The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom? New York: Basic Books, 1999.

Chakrabarti, Samidh, and Aaron Strauss. "Carnival Booth: An Algorithm for Defeating the Computer-Assisted Passenger Screening System." First Monday 7:10 (October 2002) <http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/992/913>.

Citron, Danielle Keats. "Technological Due Process." Washington University Law Review 85:6 (2008): 1249-1313.

Cohen, Julie E. Configuring the Networked Self: Copyright, Surveillance, and the Production of Networked Space. New Haven: Yale University Press, forthcoming.

Dean, Jodi. Publicity's Secret: How Technoculture Capitalizes on Democracy. Ithaca: Cornell University Press, 2002.

European Union. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data. 1995 O.J. (L 281): 31.

Freedom of Information Act. Public Law No. 89-554, 80 Stat.
383 (codified as amended at 5 U.S.C. 552(b)(1)-(7)).

Galison, Peter, and Martha Minow. "Our Privacy, Ourselves in an Age of Technological Intrusions." Human Rights in the "War on Terror." Ed. Richard Ashby Wilson. New York: Cambridge University Press, 2005: 258-294.

Gandy, Oscar H., Jr. Coming to Terms with Chance: Engaging Rational Discrimination and Cumulative Disadvantage. New York: Ashgate Publishing, 2009.

Haggerty, Kevin D., and Richard V. Ericson. "The Surveillant Assemblage." British Journal of Sociology 51:4 (2000): 605-622.

Hoofnagle, Chris Jay. "Big Brother's Little Helpers." North Carolina Journal of International Law and Commercial Regulation 29 (Summer 2004): 595-637.

Mann, Steve, Jason Nolan, and Barry Wellman. "Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments." Surveillance and Society 1:3 (2003): 331-355.

Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford University Press, 2009.

Organization for Economic Cooperation and Development (OECD). "OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data" (September 23, 1980) <http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html>.

Rosen, Jeffrey. The Unwanted Gaze: The Destruction of Privacy in America. New York: Random House, 2000.

Schauer, Frederick. Profiles, Probabilities, and Stereotypes. Cambridge: Harvard University Press/Belknap Press, 2003.

Strahilevitz, Lior Jacob. "Privacy Versus Antidiscrimination." University of Chicago Law Review 75:1 (Winter 2008): 363-381.

Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967.

Zittrain, Jonathan. The Future of the Internet—And How to Stop It. New Haven: Yale University Press, 2008.