
Text Out of Context: Situating Postmodernism Within an Information Society

Author(s): N. Katherine Hayles


Source: Discourse, Vol. 9, ON TECHNOLOGY (Cybernetics, Ecology, and the Postmodern
Imagination) (Spring-Summer 1987), pp. 24-36
Published by: Wayne State University Press
Stable URL: http://www.jstor.org/stable/41389085
Accessed: 11-06-2016 20:18 UTC


This content downloaded from 129.174.21.5 on Sat, 11 Jun 2016 20:18:54 UTC
All use subject to http://about.jstor.org/terms
Text Out of Context:
Situating Postmodernism Within an Information Society

N. Katherine Hayles

In retrospect it all seems so obvious. Of course Claude Shannon had to leave meaning out of the question; otherwise, how could he quantify the new concept that he called "information"?
As an engineer working for Bell Labs, Shannon was interested
not only in creating a scientific theory, but in paving the way for a
new technology. For both purposes, reliable quantification was
essential. But any adequate theory had to be able to account for
the information transmitted through natural language, and the
notorious capacity of words to mean different things in different
contexts seemed to pose an insurmountable barrier.
Shannon cut through this Gordian knot by declaring that the
quantity he called "information" had nothing to do with mean-
ing.1 Rather, he defined it as a function of probability, which
means that the information content of a message cannot be calcu-
lated in absolute terms, only with reference to other possible
messages that may have been sent. In effect, Shannon solved the problem of how to quantify information by defining it internally through relational differences between elements of a message ensemble, rather than externally through its relation to the context that invests it with a particular meaning. It is this inward-turning definition that allows the information content of a message to be always the same, regardless of the context into which it is inserted. Thus the first, and perhaps the most crucial, move in
the information revolution was to separate text from context.
Without this stratagem, information technology as we know it
could not have come into being.
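Shannon's relational definition can be illustrated with a minimal sketch. The message ensemble and its probabilities below are hypothetical, chosen only for illustration; the point is that the calculation refers exclusively to the relative likelihoods of the possible messages, never to what any message means:

```python
import math

def shannon_information(probabilities):
    """Average information (entropy) of a message ensemble, in bits.

    H = -sum(p * log2(p)): a function of the probabilities of the
    possible messages alone. Meaning and context never enter.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages: each carries 2 bits,
# whatever those messages happen to mean.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_information(uniform))  # 2.0

# A skewed ensemble carries less information per message,
# because the most likely message is less "surprising."
skewed = [0.9, 0.05, 0.03, 0.02]
print(shannon_information(skewed))
```

The same string of symbols drawn from a different ensemble would carry a different quantity of information; the quantity is fixed by the ensemble, not by any context of use.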
To define information as independent of meaning was an
extreme step, not taken without dissent. Donald MacKay, an-
other early researcher in information theory, argued that infor-
mation without meaning is not a message but nonsense.2 To show
how a theory that would incorporate meaning within its quantita-
tive structure might be developed, MacKay defined the informa-
tion content of a message as the change it makes in the recipient's
conditional readiness to act. By defining information through its
effect on the receiver, MacKay hoped to create a theory which
would recognize the embeddedness of text in context. However,
his proposal had the inadvertent effect of making clear just how
hopeless it would be to quantify information if one did not separate it from context. Quantification of his theory would have
required far more extensive knowledge of the central nervous
system than is available even today, much less so in the 1950s
when the issue was decided. Although some scientists did speak
out against separating text from context, then, they did not carry
the day because they could not come up with a quantifiable mea-
sure of information.
We have seen this move before; Bacon made it in the Novum
Organon. Bacon understood that reality might be holistic in the
sense that a rigorous separation between subject and object is not
possible. Nevertheless, using a rhetoric of power and domina-
tion, he argued that it was necessary to act as г/ such a separation
could be made, because otherwise man could never establish a
"chaste and lawful marriage between Mind and Nature." * It may
appear in Bacon's metaphor marriage that nature is the superior
partner; but this is only a strategic illusion, for man "cannot
command nature except by obeying her."4 Warren Weaver, the
commentator who interpreted Shannon's theory for a general
scientific audience, similarly argued that the separation of information from meaning was a regrettable but necessary price to
pay if man was to subdue the information channel he likened to a
discreet "office girl" who processes all that comes to her desk
without regard for its meaning.5
That both Bacon and Weaver used gendered metaphors to
justify their decontextualizations hints at how deeply grounded
in prevailing cultural assumptions these theories were. The proposition that theory is context-laden has become a commonplace in the poststructuralist era; but it is worth repeating here, because it foregrounds why Shannon's theory was a watershed.
Such well-known experiments in separating text from context as
cubism and dadaism testify to currents within the culture that
preceded Shannon's theory, and that in a sense enabled it by
creating a context in which his was an obvious or thinkable move.
But once embodied in the technology, information theory did
not just transform the content of its cultural context; rather, it
fundamentally altered how contexts are constituted. Never be-
fore in human history had the cultural context itself been consti-
tuted through a technology that makes it possible to fragment,
manipulate, and reconstitute informational texts at will. For postmodern culture, the manipulation of text and its consequently arbitrary relation to context is our context.6
Information theory alone, of course, would not have had
such an impact, were it not for the technology it made possible.
One of the first practical applications of information technology
was improved weapon guidance systems. The drive to develop
such systems became acute during World War II, when aircraft
had progressed sufficiently so that they could take sophisticated
evasive maneuvers during fire. The problem of tracking them
was a problem in probability, for their movements could not be
predicted with certainty. One of the first theoreticians to work on
the problem was Norbert Wiener, who made the connection
between the ability to gain information about the aircraft's movements and a theory of information based on probability.7 This
connection was to prove extremely fruitful during the decade
following the war, and by 1960 sophisticated automatic guidance
systems were a reality. The complete separation of the operator
from the context in which the weapons are used has had a profound effect on the psychology of modern warfare. Primo Levi,
in his account of Jewish partisans fighting against the Germans in
World War II, gives a vivid account of the distancing even early
information technology made possible.

I was in the artillery, you know. It's not like having a rifle.
You set up the piece, you aim, you fire, and you can't see a
thing . . . Who knows how many men have died at my hand?
Maybe a thousand, maybe not even one. Your orders come
by field telephone or radio, through earphones: left three,
drop one, you obey, and that's the end of it. It's like bomber
planes; or when you pour acid into an anthill to kill the ants:

a hundred thousand ants die, and you don't feel anything, you aren't even aware of it.8

Another area in which the separation of information from its original context has had dramatic effects is genetic engineer-
ing. Once the informational nature of the genetic code was un-
derstood, information theory joined with genetics to create bio-
technology.9 Rapid technological innovation in this field has
made it possible to separate genetic material from its original
context and either excise it, for example when a defective gene
sequence is removed, or replace it with new genetic material to
create mutant forms. One application is to splice DNA proteins
into bacteria and introduce them into the body to help activate
the body's own immunological responses to disease.10 If the body
is considered as an informational text, this technique opens the
body's interior space to a literal embodiment of intertextuality,
for the foreign bacteria's DNA merges with the DNA that was the
body's originary text to create an intertextual code that decon-
structs the distinction between exterior and interior, text and
context.

The impact that these physical enactments of decontextualization will have on cultural values is most immediately apparent
in the new techniques of birth technology. Eggs from a woman
can be withdrawn from the body, frozen for an indefinite period,
fertilized in vitro , and placed in the uterus of the same woman at a
later time or in another woman. As the formerly integral connec-
tion between the genetic text contained in an unfertilized egg and
its biological context in the mother is disrupted, traditional defi-
nitions of "birth," "child," and "mother" all have to be re-exam-
ined. The complex legal and ethical issues to which these separa-
tions of genetic text and context give rise are already being
adjudicated in the courts. Of course, not all the effects of techno-
logical decontextualization are so visible and dramatic as these
examples. But the pervasiveness of countless smaller instances
make them no less important a part of the postmodern experi-
ence.

How much of what we call postmodernism is a response to the separation of text from context that information technology
makes possible? Jean-François Lyotard begins his definition of
postmodernism by recalling Kant's philosophy of the sublime,
which situates sublimity in the incommensurability between be-
ing able to conceive of something and being able to present it. For
example, infinite power and goodness can be conceived; but any
attempts to present them are painfully inadequate (witness Mil-
ton's God in Paradise Lost). Modernism, Lyotard argues, is charac-
terized by attempts "to make visible that there is something which

can be conceived and which can neither be seen nor made visible."11 For example, in Proust's Remembrance of Things Past the
identity of consciousness is implicitly conceived but never directly
presented, because it is constantly falling victim to the time it can
recall but not encompass. Nevertheless, Proust's writing is fully
present to itself, in the sense that its syntax and vocabulary do not
inscribe this incommensurability within themselves. According to
Lyotard, postmodernism occurs when the unpresentable has
"become perceptible in [the] writing itself, in the signifier" (Lyo-
tard 80), as in Joyce's Finnegans Wake.
Lyotard's definition foregrounds postmodernism's associ-
ation with the transdisciplinary movement to refute the classical
ideal of transparent language and replace it with the opacity of
discourse. From Bataille's desecration of the eye to Kristeva's
argument that women's language is located in the semiotic rather
than the semantic, from DeLillo's pastiche in White Noise to Calvino's second-person address in If on a winter's night a traveler,
postmodernism is identified with the recognition that there is no
transcendent viewpoint which can create a universal context, no
"outside-the-text." Theorists of the postmodern have recognized
the importance of pastiche and of what Fredric Jameson calls the
simulacrum, the object created by random cannibalizing of pre-
vious codes so that an image is created for which there is no
original. And intermittently they have recognized the impor-
tance of information technology.12 But the transformation of
theory into technology that Shannon's move made possible has
been largely ignored, and the centrality of decontextualization to
our experience of the postmodern has consequently been under-
emphasized. To see how pervasive and banal the fracturing of
texts and their transplantation into reconstituted contexts has
become, we have only to think of an MTV video. As demon-like
creatures give way to cows grazing in a meadow, in the midst of
which a singer with blue hair suddenly appears, followed by cars
engulfed in flames, the images and medium collaborate to create
a technological demonstration that any text can be embedded in
any context. The disappearance of a stable, universal context for
our texts is the context for postmodern culture.
In literary studies, the separation of text from context was
already well underway with the American New Criticism of the
1940s and 1950s, a movement contemporaneous with the devel-
opment of information theory. But it was not until information
had begun to replace capital as the dominant medium of ex-
change in the most highly developed societies that deconstruc-
tion exposed the deeper implications for literary theory of sepa-
rating text from context.13 If the text is taken to be a closed
system, existing as a verbal icon independent of context, deconstruction can operate upon it to transform the binary oppositions used to generate meaning into series of "undecidables" that make
meaning radically indeterminate.14 If the text is taken to be an
open system, whose meaning is ostensibly stabilized once it is
embedded in a particular context, deconstruction can show that
the text is subject to penetration by a potentially infinite prolif-
eration of intertexts, in an endless process of dissemination that
is "always already" present.15
The "always already" formula is significant, for it suggests
that textual indeterminacy has always existed, independent of the historical context that I have been sketching here. In a strictly logical sense this may be true, insofar as even a 12th-century manuscript could be transported into another era or culture
where its meaning would be changed. But the information revo-
lution has made us much more aware of the problematics of
decontextualization. When "texts" can mean not only ink marks
on pages but polarized positions on magnetic disks, electron
beams sweeping across cathode tubes, and light impulses in optic
fibers, questions such as "What is a text?" are over-determined by
cultural conditions in a way they were not in pre-information
eras. And when those "texts" are routinely transported into con-
texts violently at odds with their original contexts, the relation
between text and context becomes problematic in a way it was not
for Proust or Joyce.
I emphasize the central role of information theory and tech-
nology in creating the cultural conditions out of which current
literary theory emerged because it seems to me that separating
literary debate from this context results in the same destabiliza-
tion of meaning that separating text from context inevitably en-
tails. For example, Wendell Harris recently argued that a better
approach to textuality than deconstruction was "ecological criti-
cism," an umbrella term he coined to include speech-act theory,
the sociology of knowledge, argumentation theory, and discourse
analysis. Harris relates the emergence of deconstruction to Saus-
surean linguistics, in which "the values or meanings of signs are
generated by the relations between them, not by the correspon-
dence between the signs and an independent and pre-existing
reality."16 However, he fails to note that the inward-turning of the
theory to the internal, relational structure of message elements is
precisely analogous to Shannon's theory of information. He ig-
nores this correspondence - assuming he is aware of it - be-
cause he sees deconstruction only in relation to other literary
theories. Instead of situating deconstruction or ecological criti-
cism in its cultural context, he argues for the superiority of
ecological approaches over deconstruction on the grounds that
one does not need to be absolutely certain of a context in order to

reconstruct it. The irony, of course, is that even a theorist who champions contextual approaches ignores the deep connections
that current literary theory has with its cultural context.
The meaning that emerges once we place critical theory
within this cultural context is that the separation of text from
context is synonymous with the exercise of power. Whether in biotechnology, disinformation campaigns, or high-tech weaponry,
the ability to separate text from context and to determine how
the new context will be reconstituted is literally the power of life
and death. In this context, what Niklas Luhmann calls "context
control" is crucial to understanding how relations between power
and knowledge are constituted in postmodern society.17 Very
often there are no villains in these stories, only informational
processes whose full implications are grasped by no one because
human beings are no longer in control.
As an example, consider a problem with international reper-
cussions, the present drought in central Africa. Recently experts
have questioned how extensive it really is. Crucial to deciding this
question are satellite transmissions, because they provide the
only visual information sweeping enough to encompass the large
land mass concerned. These transmissions are computer-enhanced images; that is, the original informational text is fed into the satellite computer and reconstituted into a more "coherent"
image designed to enhance its salient features. Therein lies the
problem, for the enhancement process is designed as an information-preserving transformation only for certain variables.
Other variables are necessarily changed, because their alteration
is precisely what allows the image to be enhanced for the context
of interest. When an unforeseen context becomes important - as is now the case in Africa - it is not in general possible to retrieve the information lost in the transformation, or even to compensate accurately for changes in the "enhanced" information.
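The irretrievable loss the passage describes can be sketched in miniature. The pixel values and clipping window below are invented for illustration, and real image enhancement is far more elaborate; the sketch shows only the general principle, that a transformation tuned to one window of interest collapses values outside it:

```python
def enhance(pixels, lo=50, hi=200):
    """Toy 'enhancement': stretch contrast by clipping values to
    [lo, hi] and rescaling to the 0-255 range. The transformation
    preserves information only for pixels inside the window;
    everything outside is collapsed onto the window's edges."""
    out = []
    for p in pixels:
        clipped = min(max(p, lo), hi)
        out.append(round((clipped - lo) * 255 / (hi - lo)))
    return out

raw = [10, 30, 60, 120, 210, 240]   # hypothetical sensor readings
print(enhance(raw))
# The first two and last two readings become indistinguishable
# from the window edges: no later analysis can recover 10 vs 30.
```

Because many distinct inputs map to the same output, no inverse transformation exists; the information discarded as irrelevant to the original context of interest cannot be recovered when an unforeseen context makes it relevant.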
As crucial decisions come to rely more and more on informa-
tion that has been processed and reconstituted in ways the deci-
sion-makers do not understand, the gap between text and con-
text will increase. Such acceleration is virtually guaranteed by the
rates at which information technology continues to advance.
Consider the following review of information technology materi-
als by John S. Mayo:

Since 1960 the number of circuit elements on the most advanced chips has nearly doubled each year. Memory chips
[smaller than a fingertip] incorporating more than two million elements interconnected by about five million conductors are now on the market. Less than three decades ago,
before the advent of microelectronics, it would have taken a
wireman 10 years to hook up two million discrete circuit
elements. A product that once would have been worth 10
man-years . . . will fall to just a few dollars.18

So rapid is the development that the writer estimates the limits of silicon technology will be reached in about a decade. Even now,
intense research is being conducted into alternative materials.
The most promising are gallium arsenide chips, which will have
the potential to use photons (light particles) rather than electrons
to carry information. Projections indicate that the limits of pho-
tonic materials will be reached within two decades. Significantly,
Mayo's conclusion is not that limits inherent in the materials will
finally act as a brake on development, but that humans will be-
come increasingly peripheral to information processing.

Materials science has carried the information and communication industry far and will carry it farther still: to the point
at which photonic systems have the functional powers of
control and networking that only electronic systems have
today. Yet information itself is not material, and materials
science has its limits . . . The higher the functionality of a
system is, the more important is its software component. To
reach the highest levels of functionality, that is, to build
systems that are self-directing and independent of intelli-
gent human beings, is the job not only of materials scientists
but also of their partners the software engineers.19

Experts in software engineering, aware of its growing importance, express alarm about the implications of software failures in large-scale systems.20 They point to the failure of The
Bank of New York's computer program for handling sales of
government securities; when the program failed, the bank could
not transmit the necessary data to the New York Federal Reserve
Bank and ended up owing about $32 billion by the end of the day.
To cover the deficiency, it was forced to borrow $24 billion
(pledging all of its assets), and when the system was finally
brought back up two days later, owed millions of dollars in inter-
est payments. Even more chilling are software failures when
human lives are at stake. Science News reports that in separate
incidents, "software bugs were responsible for the death of one
patient and injuries to two others when an irradiation unit for
cancer therapy generated 'inappropriate' doses."21 Areas that
are critically dependent on large-scale software include air traffic
control systems, nuclear reactors, chemical plants, and defense
and aerospace systems. The experimental X-29 aircraft, for example, has a swept-forward wing design that is aerodynamically unstable. The pilot has no choice but to rely on the aircraft's
computer, because adjustments must be made so quickly that
only a computer can handle them. Testing has already revealed
errors in the software for this aircraft.
If single software programs can fail, even more likely are
failures when two or more large software programs are inter-
faced. This would be the case, for example, with the systems
involved in the proposed Strategic Defense Initiative (Star Wars).
The Union of Concerned Scientists, a prestigious group that
includes over half of the living American Nobel Laureates, has
spoken out eloquently on the risks involved. "The problem of
designing a reliable computer program to coordinate and control
this intricate system is staggering. . . . Many battle decisions
would have to be left up to computers, as very short reaction
times would be involved. . . . The actual conditions of a nuclear
war under which [Star Wars] would have to operate could never
be simulated, making an adequate test program impossible."22
As context control becomes increasingly vital to human sur-
vival, and at the same time is increasingly turned over to comput-
ers, it is possible to discern in these developments a metanarra-
tive. Lyotard, who characterizes postmodernism as "incredulity
toward metanarratives," nevertheless provides the basis for un-
derstanding the metanarrative that has made context control a
central issue in our society. To construct his metanarrative, Lyo-
tard argues that during the Enlightenment narrative knowledge
was replaced by scientific knowledge as the dominant way of
knowing, and that the quest for legitimacy inherent to scientific
discourse in turn became the model for an ideal state. "The
people debate among themselves about what is just or unjust in
the same way that the scientific community debates about what is
true or false; they accumulate civil laws just as scientists accumu-
late scientific laws; they perfect their rule of consensus just as the
scientists produce new 'paradigms' to revise their rules in light of
what they have learned" (Lyotard 30).
Hovering on the threshold of recognition in Lyotard's text,
especially in passages concerned with the technology of "tele-
tronics," is the realization that another chapter has been added to
this metanarrative. As photonics replaces electronics, transmis-
sion speeds will continue to increase, since messages will travel at
the speed of light. Thus the reaction times required to transmit
texts and embed them in new contexts will be much shorter than
they are even now. When the time within which critical decisions
must be made are measured in microseconds, where is there time
for the "people [to] debate among themselves about what is just

or unjust," to "perfect their rule of consensus," and to "revise their rules in light of what they have learned"? The incredibly
rapid reaction times that information technology has necessitated imply that the metanarrative's next chapter will show democracy giving way to technocracy. The sequel to the state as the people is the state as the computer.
Searching for some way to resist this obvious conclusion,
Lyotard turns to his version of what Clifford Geertz has called "local knowledge." Instancing the work of Mandelbrot on fractal geometry, Thom on catastrophe theory, and Gödel on formal systems, Lyotard constructs a different ending, in which "paralogy" will supersede the global knowledge sought by traditional
science.23 This reconceptualization of what it means for knowledge to be legitimate will: "Let us wage a war on totality; let us be witnesses to the unpresentable; let us activate the differences and save the honor of the name" (Lyotard 82).
What Lyotard proposes as the solution to the decontextuali-
zation characteristic of postmodernism, then, is a new kind of
knowledge which recognizes that there are no global contexts,
that every context is local and circumscribed, and that before one
can analyze any text, even the informational texts of physical
systems, one must identify the limits of the operative local context
and locate the text within it. Put this way, Lyotard's program for
science can be seen as analogous to such important developments
in the humanities as the new historicism, the critique of the
totalizing gaze within discourse theory, and the emerging feminist attack on scientific epistemology.24 In the metanarrative Lyotard constructs to counter the sinister implications of decontextualization, context is restored by recognizing its limitations.
Obviously there is a strong contradiction here, for Lyotard is
in effect proposing a global theory of local knowledge.25 A related paradox is his explanation of postmodern incredulity toward metanarratives by a metanarrative that explains why we no longer believe in metanarratives.26 In my view, these paradoxes do
not in themselves invalidate his position, since they are reflections
of the larger paradox that the context of postmodernism is the
absence of any stable context. The more serious problem is the
ending he writes to his metanarrative, which envisions informa-
tion technology restored to the people at the same time as (or
perhaps because?) local knowledge has superseded global knowledge as the legitimate form of knowing. Lyotard can make this
claim only by ignoring the fact that any new form of knowledge,
including "paralogy," will be embedded in the same institutions

that have brought us to this critical point. To believe in Lyotard's metanarrative, we must decontextualize it from the technocratic
context that will determine how new modes of knowledge are in
fact used in the world.
The alternative is to recognize that the postmodern suspi-
cion of global knowledge itself arises from a technological con-
text that is global in scope. What ending will be written for the
metanarrative we are living is not yet clear; but whatever the
outcome, it will be the product of complex and paradoxical inter-
actions between the text of local knowledge and the context of
global technology.

Notes

1 Claude E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal 27 (July and October 1948): 379-423, 623-56; Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: U of Illinois P, 1949).

2 Donald M. MacKay, Information, Mechanism and Meaning (Cambridge: MIT Press, 1969).

3 Quoted in Evelyn Fox Keller's discussion of Bacon's gendered metaphors in Reflections on Gender and Science (New Haven: Yale UP, 1985) 36.

4 Francis Bacon, The New Organon and Related Writings, ed. Fulton
Anderson (New York: Bobbs-Merrill, 1960) 119.

5 The example is used by Weaver in Shannon and Weaver, 116.

6 The transition from aesthetic to technological decontextualization is one of the distinguishing characteristics of postmodernism as
compared to modernism. Andreas Huyssen in "Mapping the Postmo-
dern," New German Critique 33 (1984): 5-55, surveys the climates in
France, Germany and the United States that made modernism possible
and that gave rise to postmodernism. Huyssen argues - mistakenly in
my view - that postmodernism has little in common with poststructura-
lism; that is, within postmodernism he sees a continuing rift between the
aesthetic and the technological. I argue later in this essay that there are
substantive parallels between postmodern technology and poststructur-
alist theory. Huyssen is correct, however, in the sense that these parallels
are not the result of direct influence, but of complex mediation through
the culture.

7 An account of Wiener's contribution to the war effort is given in Jeremy Campbell, Grammatical Man: Information, Entropy, Language and Life (New York: Simon and Schuster, 1982) 15-31.

8 Primo Levi, If Not Now, When?, trans. William Weaver (New York:
Summit Books, 1985) 110-11.

9 For a discussion of how the use of information theory in biotechnology is underlaid by an "informatics of domination," see Donna Haraway, "A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s," Socialist Review 80 (1985): 65-107.

10 A summary of this technology is given in Mary Murray, "Battling Illness with Body Proteins," Science News 131 (January 17, 1987): 42-45. See also Franklin W. Stahl, "Genetic Recombination," Scientific American 256 (1987): 90-101.

11 Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, trans. Geoff Bennington and Brian Massumi, Theory and History of Literature, vol. 10 (Minneapolis: U of Minnesota P, 1984) 78. Further references to this source are given in the text.

12 Exceptions are Lyotard and Donna Haraway; Jameson is more typical in that he acknowledges the "deep constitutive relationships of [postmodernism] to a whole new technology" (Jameson 58), but does not explore them or their relation to decontextualization.

13 The assault on the text/context distinction within deconstruction is too well-known to need extensive annotation. Landmarks include Jacques Derrida, "Signature Event Context," Margins of Philosophy (Chicago: Chicago UP, 1982) 307-30; Michel Foucault, Language, Counter-Memory, Practice: Selected Essays and Interviews, ed. Donald F. Bouchard, trans. Donald F. Bouchard and Sherry Simon (Ithaca: Cornell UP, 1977); and Roland Barthes, S/Z, trans. Richard Miller (New York: Hill and Wang, 1974).

14 Derrida discusses the double gesture of opening and closing the text in Of Grammatology, trans. Gayatri Spivak (Baltimore: Johns Hopkins UP, 1976) 33-34.

15 Derrida in "Signature Event Context" argues that "Every sign, linguistic or non-linguistic, spoken or written . . . can be cited, put between quotation marks; thereby it can break with every given context, and engender infinitely new contexts in an absolutely nonsaturable fashion" (320).

16 Wendell V. Harris, "Toward an Ecological Criticism: Contextual versus Unconditioned Literary Theory," College English 48.2 (February 1986): 116-31.

17 Niklas Luhmann, Legitimation durch Verfahren (Neuwied: Luchterhand, 1969).

18 John S. Mayo, "Materials for Information and Communication," Scientific American 255.4 (October 1986): 61.

19 Mayo 65.

20 The following examples are drawn from Ivars Peterson's report on software problems in "Warning: This Software May Be Unsafe," Science News 130 (September 13, 1986): 171-73.

21 Peterson 172.

22 This was taken from a newsletter included in a mass mailing by the Union of Concerned Scientists.

23 Lyotard never explicitly defines "paralogy" but intends it to include "the study of open systems, local determinism, antimethod" (Lyotard, n. 121, 100).

24 Seminal for the new historicism is Stephen Greenblatt, Renaissance Self-Fashioning (Chicago: Chicago UP, 1980); for the totalizing gaze, Michel Foucault, Discipline and Punish: The Birth of the Prison, trans. Alan Sheridan (New York: Pantheon, 1977); and for feminist epistemology, Sandra Harding, The Science Question in Feminism (Ithaca: Cornell UP, 1986).

25 Paul de Man in "The Resistance to Theory," Yale French Studies 63 (1982): 3-20, considers a related paradox when he recognizes that he is in effect proposing a global theory of local reading.

26 Fredric Jameson explores the significance of this paradox in his "Foreword" to Lyotard's text (Lyotard vii-xxi).
