
Beyond Presence:
Epistemological and Pedagogical Implications of ‘Strong’ Emergence
DEBORAH OSBERG
GERT J.J. BIESTA
University of Exeter

ABSTRACT: In this paper we argue that the notion of strong
emergence offers a challenge to the idea, currently dominant in
schooling, that knowledge somehow relates to a pre-existing world,
present in itself. We do this first by providing an account of strong
emergence, showing how it brings into question the assumption of
determinism. Following this we explain the epistemological
consequences of this failure of determinism and in so doing develop
an “emergentist” epistemology which has some compatibilities with
deconstruction. Finally, we show that the emergentist critique of
determinism makes it possible to imagine a form of schooling
which is no longer concerned with questions about how best to
teach the child about a pre-existing world (which, largely, are
questions about whether this world should be presented or
represented in schools). Instead it becomes possible to imagine a
form of schooling which is concerned with questions about
responsibility and response.

KEYWORDS: Presence, pedagogy, epistemology, emergence,
schooling, determinism, deconstruction, knowledge, complexity,
responsibility.

Introduction
Ever since modern schooling emerged in the 17th century, it has been
assumed that the purpose of schooling is to help the child acquire
knowledge of a real or pre-existing world (natural, social, cultural, and
so on) that exists somewhere outside of the school (see Biesta & Osberg,
2007). It has further been assumed that it is possible to teach children
about this pre-existing world either by telling them about it (i.e.,
representing it to them) or by showing it to them (presenting it to them
directly), thereby letting them discover it for themselves. The point is,
whatever the method of instruction, modern schooling has mostly been
about getting the child to understand a pre-existing world.

In this paper we argue that some insights from complexity science, in
particular the notion of strong emergence, offer a challenge to the idea
that knowledge somehow relates to a pre-existing world, present in
itself. Although the notion of emergence is by no means new, we believe
complexity science – and in particular Ilya Prigogine’s work in
thermodynamics – offers something new to debates about this concept
which makes it possible to think again about the epistemological
implications of this concept. Our main argument in this paper is that if
the epistemological implications of strong emergence are taken into
account then the whole pedagogical argument about whether the real
or pre-existing world should be presented or represented in schools in
order that the child will get an accurate understanding of it falls away
(see Biesta & Osberg, 2007). This, of course, raises the question of what
sort of questions pedagogy should then be concerned with if not with
issues about how to impart knowledge of a pre-existing world to the
child. We believe that through an examination of the epistemological
implications of strong emergence, it becomes possible to imagine a form
of schooling that is not structured around questions of how best to teach
the child about a pre-existing world (which, largely, are questions about
whether this world should be presented or represented in schools).
Instead it becomes possible to imagine a form of schooling which is
concerned with questions of responsibility and response. In view of this
we believe the notion of emergence makes a valuable contribution to a
possible shift in understanding about what schooling is for.

The Notion of Emergence


The History of the Notion
In most standard accounts of the history of the concept of emergence
(e.g., Stanford Encyclopedia of Philosophy), the term is said to have
been coined by G.H. Lewes (1875) to differentiate between chemical
products that could be logically derived from their constituents and
those that could not. He called the former “resultants” and the latter
“emergents.” Following this formulation the philosophy of
“emergentism” began taking shape, mostly led by the British
emergentists including Lewes, Broad, Morgan, and Alexander. Jaegwon
Kim suggests that:
At the core of these ideas was the thought that as systems acquire
increasingly higher degrees of organisational complexity they begin
to exhibit novel properties that in some sense transcend the
properties of their constituent parts, and behave in ways that
cannot be predicted on the basis of the laws governing simpler
systems. (1999, p.3)
Emergence therefore came to be defined, first and foremost, as the
creation of new properties. More specifically, it came to be understood as
a process whereby properties that have never existed before and, more
importantly, are inconceivable from what has come before, are created
or somehow come into being for the first time.
In the early part of the 20th century the idea of emergence was
highly problematic because it brought into question the idea of
determinism during a time when scientific reductionism was on the rise
(Kim, 1999). This, and an apparent link with “vitalism,” according to
Kim, contributed to the emergentist movement failing to become a
visible part of mainstream philosophy of science early in the 20th
century. Nevertheless, the rise of non-linear mathematics and
complexity science in the past three decades, together with the decline
of reductionism, have resulted in a “resurgence of emergentism” (Kim,
1999, p. 2).

Contemporary Understandings of Emergence


Contemporary understandings of emergence have retained the idea that
emergence introduces properties that are novel and sometimes even
inconceivable or unimaginable. The point about the unimaginability of
what emerges is pressed home by John Ziman who remarks that when
we are dealing with emergence even the word property is problematic
as:
Entities can emerge with features that are so novel that they do
not conform to any … pre-existing criteria. It is not just that these
new entities have different properties. Previously unimaginable
notions of what constitutes a property are required. (2003, p. 1626
[Italics added])
Nevertheless, as Chalmers (2002) has noted, with the rise of complexity
science and non-linear mathematics, there now appear to be two
understandings of emergence – a strong and a weak version – which are
associated with different understandings of the term novel. In the weak
sense, emergent properties are understood as novel in that they are
unexpected given the principles governing the lower-level domain. Such
unexpected properties (no matter how inconceivable or unimaginable)
can emerge entirely deterministically from non-linear rules of
interaction, as is the case with fractals. Although the fractals
themselves are surprising, their emergence is nevertheless completely
explainable in terms of the lower level. In this sense complexity science
has to a certain extent reconciled emergence and determinism. Even
properties that are unpredictable, inconceivable, or unimaginable before
they actually appear are still understood to be a consequence of
deterministic physical processes, that is, they are still logically derived
from their constituents. For this reason complexity science and non-
linear mathematics bring into question Lewes’ (1875) classification
system which distinguished emergent from resultant chemical products
on the basis that emergents could not be logically derived from their
constituents. Since, with non-linear mathematics, both can be shown to
be the product of deterministic processes, the distinction falls away.
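To make the weak sense concrete, the short sketch below (an illustrative toy of ours, not drawn from the emergence literature) computes a coarse picture of the Mandelbrot set: every point is produced by the same fully deterministic non-linear rule, yet the intricate structure that appears is, in the weak sense, novel and surprising. The grid size and iteration limit are arbitrary choices.

```python
# A minimal sketch (our illustration, not from the paper): a fractal as an
# example of "weak" emergence. Every point is computed by the same fully
# deterministic non-linear rule z -> z^2 + c, yet the boundary that appears
# is endlessly intricate and surprising.

def escape_time(c, max_iter=30):
    """Iterate z -> z^2 + c from z = 0; return how long |z| stays bounded."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:       # diverged: c lies outside the set
            return n
    return max_iter          # still bounded: treat c as inside the set

def render(width=60, height=24):
    """Print a coarse ASCII picture of the set over the usual window."""
    for row in range(height):
        line = ""
        for col in range(width):
            re = -2.0 + 3.0 * col / (width - 1)
            im = -1.2 + 2.4 * row / (height - 1)
            line += "#" if escape_time(complex(re, im)) == 30 else " "
        print(line)

if __name__ == "__main__":
    render()
```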
While the scientific and philosophical (epistemological) prospects for
weak emergence, as explicated by the complexity sciences, have been
described as promising (Bedau, 1997), we want to make clear that (a) in
this paper we are not concerned with this variety of emergence and (b)
complexity science is not wholly concerned with weak emergence. We
believe that Ilya Prigogine’s work in thermodynamics – for which he
was awarded the Nobel Prize in chemistry in 1977 – is incompatible
with weak emergence and in fact supports a theory of strong emergence.
In the strong sense emergent properties can be understood as novel
in that they are not deducible even in principle “from the most complete
and exhaustive knowledge of their emergence bases” (Kim, 1999, p. 6).
Strong emergence therefore presents a direct challenge to determinism
(the idea that given one set of circumstances there is only one logical
outcome). With strong emergence what emerges is always radically
novel. It is this strong emergence that the British emergentists had in mind.
Lloyd Morgan’s 1923 description of emergence captures a sense of the
radical novelty involved. In Morgan’s words: “Under what I call
emergent evolution, stress is laid on the incoming of the new … if
nothing new emerges – if there be only regrouping of pre-existing
events and nothing more – then there is no emergent evolution” (1923,
pp. 1-2, [Italics added]).

A Prigoginian Understanding of Emergence


The word emergence seldom appears in Prigogine’s writings.
Nevertheless, Prigogine was very much concerned with those kinds of
processes that give rise to increasingly higher levels of organisational
complexity. As such one could say his work has been intimately
concerned with emergence. More specifically, one could say his work
explores the notion of emergence through a meticulous examination of
the passage between the micro and the macro level. In explaining the
dynamics of this passage he provides a convincing argument for the idea
that emergent phenomena are in principle not reducible to or calculable
from the lower levels from which they emerged. It is for this reason that
we would say Prigogine’s work can be read as a theory of strong
emergence although it should be noted that Prigogine calls it a “theory
of irreversible processes” (Prigogine & Stengers, 1984, p. 310). In
particular, this theory makes clear that the crux of what separates
resultants from emergents is a distinction between the kind of processes
that take place in closed systems and those that take place in open
systems.1 The former are, in principle, reversible processes (more about
this in the next section of this paper), while the latter, he claims, are
strictly irreversible (we would say emergent). If this distinction between
reversible and irreversible processes is not made we have no means to
understand why “emergents” should be different from resultants, so in
this sense Prigogine’s work can be seen as having provided a crucial
layer of understanding to the emergentist debate.

Prigogine’s Work on Irreversible Processes


Prigogine’s main aim was to better understand irreversible processes in
the physical sciences. In the physical sciences irreversible processes are
generally considered to be those that inevitably run down or tend to a
state of disorder. It has been assumed that all known processes in the
universe eventually tend towards a state of disorder (following the 2nd
Law of Thermodynamics). The main explanation for all this
irreversibility in the universe is that there are simply more ways for a
system to be disordered than ordered so the chances of the universe
becoming disordered are higher than the chances of it becoming ordered
(Gell-Mann, 1994 provides an example of this understanding of
irreversibility). With this understanding, irreversibility is not a special
property of a particular kind of process. Since it is possible to logically
determine every step of their progression towards a more disordered
state – and hence in theory run the process both backwards and
forwards – apparently irreversible processes are in principle reversible
and the property of irreversibility (the irreversible tendency to disorder)
is understood simply as an illusion or side effect of standard
deterministic (reversible) processes. In other words there is nothing
unusual about the mechanics of irreversible processes.
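The statistical account summarised above can be illustrated with a small calculation (our example, not Gell-Mann's own): for N particles that may each occupy the left or right half of a box, the number of microstates compatible with an ordered macrostate (all particles on one side) is dwarfed by the number compatible with a mixed one, so disorder is simply the overwhelmingly more probable outcome. The particle number is an arbitrary choice.

```python
# A minimal sketch (our illustration, not from the paper) of the standard
# "more ways to be disordered" account of irreversibility: the evenly mixed
# macrostate has vastly more microstates than the fully ordered one, so an
# unordered configuration is overwhelmingly the more probable outcome.

from math import comb

N = 100                          # number of particles (assumed for illustration)
total = 2 ** N                   # every left/right assignment is one microstate

ordered = comb(N, 0)             # all particles on the left: exactly 1 microstate
mixed   = comb(N, N // 2)        # a 50/50 split: the most numerous macrostate

print(f"microstates with all {N} particles on one side: {ordered}")
print(f"microstates with a 50/50 split:                 {mixed}")
print(f"probability of the fully ordered state:         {ordered / total:.1e}")
print(f"probability of the 50/50 split:                 {mixed / total:.3f}")
```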
Prigogine, however, brought into question this understanding of
irreversibility as being simply an illusion or side effect of timeless
physical processes. He showed, instead, that this understanding of irreversibility
applies only to closed systems, that is, systems with distinct boundaries.
When such systems run down – that is, when there are irreversible
changes to states that are more disordered – we can logically deduce
every step of the progression to disorder and so, in principle, such
processes are fully deterministic and reversible. On the other hand,
when we are dealing with open systems – which are systems that
interact with their environment and which change themselves and their
environment in the process – there are irreversible changes towards
states that are more ordered. Furthermore, these irreversible changes
are not deterministic but probabilistic. This probabilistic element
according to Prigogine – and we shall explain this in more detail in what
follows – lends a completely different meaning to the concept of
irreversibility. The irreversibility of such processes can no longer be
understood as a side effect or illusion. Instead it must be understood as
an integral property of the process itself. In other words there is
something special about these sorts of processes. They are distinctly
different from reversible (deterministic) processes. This is the crux of
Prigogine’s “theory of irreversibility” (or ‘strong’ emergence). To
appreciate how Prigogine’s formulation of irreversibility affects
determinism, we need to introduce three important concepts that
Prigogine uses. These are non-equilibrium, self-organisation, and
bifurcation. We then need to explain how bifurcation introduces a
probabilistic element into our descriptions of process, which in turn
challenges determinism.

Non-Equilibrium
Non-equilibrium is a state that characterises open systems. The most
obvious examples of such systems are boiling water, tornadoes,
lightning and all living systems. These are systems that are exchanging
energy and matter with their environment and which exist only because
they are open. If an open system is cut off from its environment it dies
or simply fades away. It cannot be separated from the fluxes that
sustain it. In contrast to this, equilibrium systems are closed to their
environment. They are essentially static, inert systems that, once
formed, can be maintained indefinitely without further interaction with
their environment. An example of a system at equilibrium is a container
of cold water. If we started to apply heat from below we would be
starting to push the system away from equilibrium. Heat would be
‘coming in’ to the system and the system would move into a ‘non-
equilibrium’ state. When the system has been pushed sufficiently far
from equilibrium the water responds by organising itself into a macro-
level pattern, that is, it erupts into turbulence (it boils).2 The system
‘jumps’ to a new level of order. This response is produced by the non-
equilibrium situation and is a response that will be maintained as long
as the non-equilibrium situation producing it is maintained.

Self-Organisation
The spontaneous appearance of the macro-level pattern is a process
which Prigogine calls “self-organisation.” This process is entirely in
accord with known physical laws. The micro-level entities are simply
obeying known physical laws and in doing so the macro-level pattern
spontaneously emerges. There are many examples of such spontaneous
patterning in nature, such as tornadoes, turbulence and the flocking
behaviour of birds to name but a few.3 Prigogine calls these self-
organised patterns “dissipative structures” and claims that the
emergence of such structures is a characteristic of non-equilibrium.
Further, he insists that most of reality is not stable and equilibrial, but
characterised by process and dissipative structures.
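As a rough illustration of self-organisation from purely local rules (our sketch, loosely modelled on the Vicsek flocking model rather than on any system Prigogine analyses), the following toy simulation lets each "bird" align only with its near neighbours; a coherent macro-level flock direction nevertheless emerges, as the rising order parameter shows. All numerical parameters are arbitrary.

```python
# A minimal sketch (our illustration, not from the paper) of self-organisation:
# each agent follows only a local rule - turn towards the average heading of
# nearby neighbours plus a little noise - yet a shared macro-level direction
# emerges without any global coordinator.

import math, random

random.seed(0)
N, radius, noise, steps = 80, 0.2, 0.1, 50
pos = [(random.random(), random.random()) for _ in range(N)]
heading = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def order_parameter(h):
    """1.0 = everyone moving the same way, ~0 = headings are random."""
    x = sum(math.cos(a) for a in h) / len(h)
    y = sum(math.sin(a) for a in h) / len(h)
    return math.hypot(x, y)

print(f"initial alignment: {order_parameter(heading):.2f}")
for _ in range(steps):
    new_heading = []
    for xi, yi in pos:
        # local rule: average the headings of neighbours within `radius`
        nbrs = [heading[j] for j, (xj, yj) in enumerate(pos)
                if math.hypot(xi - xj, yi - yj) < radius]
        avg = math.atan2(sum(math.sin(a) for a in nbrs),
                         sum(math.cos(a) for a in nbrs))
        new_heading.append(avg + random.uniform(-noise, noise))
    heading = new_heading
    # positions drift a little in the current heading (kept inside the unit box)
    pos = [((x + 0.01 * math.cos(a)) % 1.0, (y + 0.01 * math.sin(a)) % 1.0)
           for (x, y), a in zip(pos, heading)]
print(f"alignment after {steps} steps: {order_parameter(heading):.2f}")
```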
But we need to look a bit more closely at the concept of self-
organisation because we have said that the lower level entities form
themselves or self-organise into macro-level patterns entirely in
accordance with known physical laws. This seems to imply that the
macro-level structure that emerges at the higher level is entirely
explainable in terms of known physical laws. If this were the case
Prigogine’s theory might be a theory of emergence, but it would not be
a theory of strong emergence. It would mean that the universe is
unfolding in an entirely deterministic fashion. It is here that the concept
of bifurcation is required. It is bifurcation, not self-organisation, that
brings determinism into question.

Bifurcation
According to Prigogine, while everything that is taking place at non-
equilibrium takes place under necessary conditions, these conditions,
while necessary, do not determine in its full reality that which emerges
in non-equilibrium conditions. This, he maintains, is because when a
system responds to an external ‘flux’ by ‘jumping’ to a new level of order,
there are always a number of structural possibilities for a higher level
of order that would be equally satisfactory in terms of the known
physical laws. This means that the single actualised version – the
‘solution’ that is ‘chosen’ by the emerging system – is always one among
a number of plausible alternatives that happened not to occur. Prigogine
calls the point at which these possibilities appear a bifurcation point.4
This is a point that corresponds to a symmetry break (which means
additional degrees of freedom have been provided in a particular
dimension as a result of the system being pushed out of equilibrium) so
at bifurcation the system must choose between several equally
satisfactory symmetry options. Prigogine has shown that as a system is
pushed further and further from equilibrium, additional bifurcations
(symmetry breaks) will appear. So the question is how the system selects
its options and this is where Prigogine’s work becomes contentious and
indeed has contributed to what Max Planck dubbed the “determinism
quarrel” in theoretical physics (Freire, 2003).
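The idea of a symmetry-breaking bifurcation with several equally satisfactory outcomes can be illustrated with a standard textbook toy model (our example, not Prigogine's own equations): for the dynamics dx/dt = r*x - x^3 the state x = 0 is the only stable solution while r <= 0, but once the control parameter r is pushed past zero two symmetric solutions appear, and which of them is realised depends on an arbitrarily small initial fluctuation.

```python
# A minimal sketch (our illustration, not from the paper) of a symmetry-breaking
# bifurcation. Below the bifurcation (r < 0) every run settles at 0; above it
# (r > 0) each run settles near +sqrt(r) or -sqrt(r), the branch being decided
# by a tiny chance "kick". All parameter values are arbitrary.

import math, random

def settle(r, kick, dt=0.01, steps=20000):
    """Integrate dx/dt = r*x - x**3 from a tiny random initial perturbation."""
    x = kick
    for _ in range(steps):
        x += dt * (r * x - x ** 3)
    return x

random.seed(1)
for r in (-0.5, 0.5):
    outcomes = [settle(r, random.gauss(0, 1e-6)) for _ in range(5)]
    print(f"r = {r:+.1f}: settles near "
          + ", ".join(f"{x:+.3f}" for x in outcomes))
```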

The Role of Chance: the Roots of Strong Emergence


Prigogine insists that the choice that is made at a bifurcation is not
determined. It is purely the result of chance (Pomian, 1990 has collected
some of the most important contributions to the determinism debate
surrounding Prigogine’s work). In his words: “The system ‘chooses’ one
of the possible branches available when far from equilibrium. But
nothing in the macroscopic equations justifies the preference for any one
solution” (Prigogine, 1997, p. 68 [Italics added]).
The difficulty with this – and this is the difficulty that brings
determinism into question – is that “chance,” according to Prigogine,5
“can neither be defined nor understood” (Prigogine, 1997, p. 5). Chance
is something that is in itself not present in the macroscopic equations
or, at least, always missing. Because something is always missing from
the equations that describe such a process (i.e., something is missing
from the logic of the process) it is in principle impossible to provide a
complete description of the emergent level from the submergent level
components. Such systems, in other words, are strongly emergent
because what happens at the micro-level is in principle insufficient to
provide a complete description of what emerges at the macro-level.
What we have, therefore, is a system that is indeterminate or strongly
emergent despite the fact that it is operating according to known
physical laws. Because probability is built into the system at every
bifurcation, we must understand it as an operator in what emerges. It
is not just that we have insufficient information about the system to
know what will emerge, we cannot determine what will emerge even in
principle (all we have are probabilities). Furthermore, because chance
is involved at each bifurcation and since it takes only a very few
bifurcations to produce an inordinate number of options (Figure 1), the
trajectory of the system is radically indeterminate.

Figure 1. Fractal tree showing how simple binary branching can quickly lead to an inordinate number of outcomes.
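A small calculation (ours, merely to quantify the figure's point) shows how quickly chance at each bifurcation multiplies the possibilities: with a binary choice at every branching point there are 2^n possible paths after n bifurcations, and independent runs of the "same" process almost never coincide.

```python
# A minimal sketch (our illustration, not from the paper) of how quickly chance
# at each bifurcation makes a trajectory indeterminate.

import random

def trajectory(n_bifurcations, rng):
    """Record the (chance) choice made at each of n successive bifurcations."""
    return tuple(rng.choice("LR") for _ in range(n_bifurcations))

n = 30
print(f"possible paths after {n} bifurcations: {2 ** n:,}")

rng = random.Random()            # deliberately unseeded: each run differs
runs = [trajectory(n, rng) for _ in range(1000)]
print(f"distinct paths in 1000 runs: {len(set(runs))}")   # almost always 1000
```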

Probability therefore introduces an irreducible element into our
description of nature which is quite foreign to strictly deterministic
systems with trajectories that can be fully determined and therefore
read forwards or backwards in time. Deterministic systems are, in
principle, reversible systems. Emergent systems are, in principle,
irreversible systems.
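The contrast can be made concrete with a toy sketch of ours (not an example used by Prigogine): a deterministic update rule can be inverted step by step, so the initial state is fully recoverable, whereas a rule with a chance contribution at each step cannot be run backwards, because that contribution never appears in the equations one can write down.

```python
# A minimal sketch (our illustration, not from the paper): a deterministic rule
# is reversible in principle, while a rule with chance at every step is not.

import random

def forward_deterministic(x, steps):
    for _ in range(steps):
        x = 3 * x + 1          # an invertible deterministic rule
    return x

def backward_deterministic(x, steps):
    for _ in range(steps):
        x = (x - 1) / 3        # the exact inverse: history is fully recoverable
    return x

def forward_with_chance(x, steps, rng):
    for _ in range(steps):
        x = 3 * x + 1 + rng.choice([-1, 1])   # a chance element at every step
    return x

x0 = 5.0
xT = forward_deterministic(x0, 10)
print("deterministic start recovered:", backward_deterministic(xT, 10))  # 5.0

rng = random.Random(42)
yT = forward_with_chance(x0, 10, rng)
# Reversing with the deterministic rule alone gives the wrong history, because
# each chance contribution is absent from the equations we can write down.
print("attempted reversal with chance:", backward_deterministic(yT, 10))
```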
Because (at least according to Prigogine) most of reality is
characterised by dissipative structures (or irreversible processes) we can
therefore understand ourselves to be in an emergent (rather than a
deterministic) universe where the present and the future are always more
than the sum of the parts from which they were generated. Because an
irreversible or emergent universe does not unfold like an automaton for
which we can simply discover the rules which are present but hidden
from view, irreversibility, Prigogine insists, requires a reformulation of
our descriptions of natural processes. In an emergent universe,
probability or chance – that which can never be known – must play a
fundamental role in our descriptions of nature. In Prigogine’s words:
“We need a more dialectical view of nature” (Prigogine, 1997, p.182).
This brings us to the epistemological implications of strong emergence.

The Epistemological Implications of Strong Emergence


Prigogine’s work – as we have tried to show – brings into question the
assumption that the universe necessarily unfolds in a deterministic
fashion. This bringing into question of the assumption of determinism,
so we would like to argue in this section, makes it possible to rethink
the relationship between the world and our knowledge of it. In what
follows we explore some of the ways in which Prigogine’s challenge to
the idea of a deterministic universe affects our understanding of the
concept of knowledge in relation to the past, the present and the future.
In doing this we find it helpful to draw on the work of George Herbert
Mead and Jacques Derrida.

Knowledge of the Past (The Loss of Truth in History)


Let us begin by reviewing the meaning of the past (history) in a fully
deterministic world. With strict determinism each successive state of a
process follows on and is entirely predictable from what came before.
Since, in principle, nothing is missing from the equations that describe
each of these states – because chance is not involved – the process can
be understood from any temporal standpoint (forwards or backwards in
time). This means, if we are concerned with the history or past states of
a process, it is possible to refer to the logic that drives the whole process
and use this logic to work backwards, through successive stages in the
sequence, thereby getting to the correct history of the process. The idea
that we can correctly determine the configuration of each stage of a
process, back through time, leads us to believe that each of these stages
has a true identity, an identity that we can somehow capture. Henri
Bergson provides a useful metaphor in this regard, calling this a
“cinematographical” view of temporality with each stage in the history
of a temporal sequence being the equivalent of a photographic snapshot,
a frozen moment in time. The past, with such an understanding, is an
immutable past (see Figure 2). Prigogine’s view of irreversibility
destroys this view of history.

Figure 2. A linear understanding of temporality, in which history is immutable.

If a process is probabilistic – or emergent – rather than deterministic
this means not everything can be accounted for as the process
progresses from one stage to the next. Since there is always something
missing in the jump between stages this means there is no immutable
logic driving the process, which we can use to work out the past states
of the process. But if this is the case how should we understand the
histories that we have already constructed by means of a logic we
thought we had? For example, what should we make of the series of
“snapshots in time” (Bergson, 1911) that trace the unfoldment of our
universe all the way to the Big Bang? Or the snapshots that trace the
history of madness for that matter? (see Foucault, 1965, and also
Derrida’s analysis of this book: Derrida 1978)
We think George Herbert Mead’s Philosophy of the Present (Mead,
1932) is quite helpful in this regard for he denies the idea that there is
a real or immutable past which we have recourse to. Mead argues that
it is only from our experience of the present that we are able to
reconstruct the past. This means our historical accounts of the past will
always give us a story of the past from the perspective of the present.
The past, for Mead “must always be set over against a present in which
the emergent appears, and the past, which must then be looked at from
the standpoint of the emergent becomes a different past” (1932, p. 2,
[Italics added]).
Since, in an emerging universe, each subsequent present always
includes more than was present in the present that has just passed (see
Figure 3), we must, with each subsequent present, always rewrite the
past. As Mead puts it:
There is an entire absence of finality in such presentations. It is of
course the implication of our research method that the historian in
any field of science will be able to reconstruct what has been, as an
authenticated account of the past. Yet we look forward with vivid
interest to the reconstruction, in the world that will be, of the
world that has been, for we realise that the world that will be
cannot differ from the world that is without rewriting the past to
which we now look back. (Mead, 1932, p. 3)
Our knowledge of the past is not accurate in a timeless or
immutable sense. We cannot accurately represent the past, all we can
do is keep reconstructing the past in a way that makes sense from the
perspective of the present. Since the materials out of which the past is
constructed lie in the present (which, in an emerging universe is always
more than its past) this means each new present must necessarily bring
a reinterpretation of the past.

Figure 3. Diagram to show how each new present brings a reconstruction of the
past. For example, theories A, B, and C, about the possible origin of the
universe (these being theories which arise from different presents a, b, and c,
respectively), would be qualitatively different, not cumulatively different.
Knowing the Present (Experiencing the Impossible)


As we have seen from Mead, in an emergent (probabilistic, irreversible)
universe it is pointless to think we can know it in a timeless, immutable
sense and so Prigogine’s challenge to determinism has important
implications for our knowledge of the past. But what of our knowledge
of the present? Even if we cannot know the past in an immutable sense,
can we know the present? Can we know the present moment as it really
is? The problem with knowing the present is that the very concept of
knowledge implies knowledge of a moment/world that is already in the
past (or is at least in the process of passing). This is the case even if we
understand knowledge as lived experience for, in an emergent universe,
knowledge which is arising in the present is always already passing into
being insufficient for that present. We can never have knowledge of the
world (even the present world of lived experience) because, in an
emergent universe, we would constantly have to reassess our knowledge
of the present in the very moment we acquired it. But if we cannot get
knowledge from lived experience then how should we understand the
concept? One way of making sense of the concept is to think of
knowledge itself in emergentist terms. Before doing that let us review
what it means to have knowledge of the present in non-emergentist (i.e.,
deterministic) terms.
If we think of knowledge as continually taking place in our
engagement with the world then we could say that knowledge results
from our engagement with the world. This, however, would imply that
all the bits and pieces of the world somehow determine the knowledge
that results from our engagement with them. But if the world itself
determines what knowledge can result from our engagement with it,
this in turn implies that knowledge has a reversible relationship with
the world: that is, if the world determines knowledge then our
knowledge tells us what the world is really like (in an immutable sense).
We have already seen, however, that in an emergent universe this sort
of relationship between the world and knowledge would be impossible.
If, on the other hand, we think of knowledge (or knowing) not as
determined by our engagement with the present, but as emerging from
our engagement with the present, then the problem goes away. If
knowledge/knowing has an emergent (rather than a deterministic)
relationship with the world, then we must bear in mind that although
what we can know is constrained or conditioned by the present that we
engage with, each knowledge event – which is to say each taking place
of knowledge (knowing) – is necessarily also radically new. It is
radically new because, although it follows from what has come before,
it does not follow on logically from what has come before. It does not
follow on logically because it contains an addition – a supplement –
which was not present in what came before. The knowledge that
emerges in the present must therefore, at the same time, affirm that
which preceded it, and also be “unthinkable” from that which precedes
it. To say this another way, emergent knowledge draws on what is
there, but not as a ground to think our way into that which follows on
logically (deterministically) from this ground (and thereby to grasp or
understand the way something is, or could be), but rather to find new
ground which is incalculable from the ground we are on. If we were only
taking the next logical step this would imply we would always be
responding from a set of pre-given rules. From the perspective of
emergence knowledge is therefore not about understanding or grasping
a meaning in its essence or completely or adequately (or even
inadequately). Knowledge does not bring us closer to what is already
present. Rather it emerges into that which is unthinkable from the
ground that precedes it and therefore transcends or brings into question
the knowledge we thought we had (in the representational sense).
Emergent knowledge, in other words, moves us into a new reality which
is incalculable from what came before. Because it enables us to
transcend what came before, this means it also allows us to penetrate
deeper into that which does not seem possible from the perspective of the
present. This emergentist understanding of knowledge, so we believe,
comes close to key insights developed by Derrida, under the label of
“deconstruction,” which he defines6 as “the experience of the impossible”
(Derrida, 1992a, p. 200).
To make sense of this comment it is necessary to understand that
Derrida uses the terms “impossible” and “incalculable” to denote “that
which cannot be foreseen as a possibility” (see Biesta, 2001, p. 48). We
believe this is also a way of describing the radical invention or novelty
involved in strong emergence. An invention, Derrida argues, is
incalculable before it actually appears. It has to “declare itself to be the
invention of that which did not appear to be possible; otherwise it only
makes explicit a program of possibilities within the economy of the same”
(Derrida, cited in Biesta, 2001, p. 33 [Italics added]).
This description of invention bears a strong resemblance to Lloyd
Morgan’s 1923 description of emergence which also captures a sense of
the radical novelty involved in strong emergence. In Morgan’s words:
“Under what I call emergent evolution, stress is laid on the incoming of
the new … if nothing new emerges – if there be only regrouping of pre-
existing events and nothing more – then there is no emergent evolution”
(1923, pp. 1-2 [Italics added]).
Although a comparison between strong emergence and
deconstruction is risky, we make this link because we believe there are
at least some epistemological compatibilities between deconstruction
and strong emergence. Of particular importance is the similarity in the
way both deconstruction and strong emergence challenge existing
knowledge. Neither strong emergence nor deconstruction challenges
existing knowledge by overturning it. Rather, they ask us to imagine a
future which is incalculable from the perspective (or logic) of existing
knowledge. They do this through affirming existing knowledge without
allowing it to overrule what is to come. By acknowledging but not
following existing knowledge, both deconstruction and strong emergence
seek to negotiate a passage between the knowledge that has been and
that which is still to come.

Knowledge and the Future (Taking Responsibility)


In an emergent (or irreversible) universe the future is always radically
inventionalistic, that is to say it is, in principle, incalculable from the
perspective of the present because “the emergent is not there in
advance, and by definition could not be brought within even the fullest
presentation of the present” (Mead, 1932, p.10). Not knowing what the
future will bring does not mean, however, that we have no responsibility
for the future. Quite the contrary. As Derrida points out, it is when we
make secure and unambiguous decisions in a particular situation
because we think we know in advance what the future will bring (if we
apply the rules) that we are in fact not acting responsibly. For Derrida,
this type of responsibility:
Simply follows a direction and elaborates a programme. It makes
of action the applied consequence, the simple application of a
knowledge or know-how. It makes of ethics and politics a
technology. No longer of the order of practical reason or decision,
it begins to be irresponsible. (Derrida, 1992b, p. 41)
From the discussion so far, it should be clear that Derrida’s notion of
irresponsibility applies to an emergent rather than a deterministic
universe. In a deterministic universe, where the rules are pre-given,
following the rules is the best we can do in terms of responsibility for,
if we already know what the outcome of these rules will be, the only way
we can be responsible is to choose to do the right thing. In an emergent
universe, however – where the present is not contained in the past –
simply following the rules can only be seen as irresponsible for the
present moment does not follow the same rules as the moment that has
passed. Since each new present is radically new, in that it contains
elements that were not present in the past, each new present requires
its own unique interpretation. No existing interpretation or set of rules
can do it justice.
This, however, does not mean we can ignore what came before. If we
ignore the lessons of the past we again become irresponsible. We must
therefore make two apparently contradictory gestures at the same time.
We must make a decision now, based on what has come before but at
the same time we cannot rely on what has come before to make this
decision. In Derrida’s words:
For a decision to be just and responsible, it must, in its proper
moment if there is one, be both regulated and without regulation:
it must conserve the law and also destroy it or suspend it enough
to have to reinvent it in each case, rejustify it, at least reinvent it
in the reaffirmation and the new and free confirmation of its
principle. Each case is other, each decision is different and requires
an absolutely unique interpretation, which no existing, coded rule
can or ought to guarantee absolutely. (Derrida, 1990, p. 971)
The invention Derrida speaks of is not something that can be logically
described, for it does not follow on from a logic (or set of rules) that came
before. Derrida puts it thus:
The condition of possibility of this thing called responsibility is a
certain experience and experiment of the possibility of the
impossible: the testing of the aporia from which one may invent the
only possible invention, the impossible invention. (1992b, p. 41).
“But there is no responsibility that is not the experience and the
experiment of the impossible.” (pp. 44-45)

A Pedagogy of Invention
We believe that a complexity inspired epistemology suggests a
“pedagogy of invention” (we borrow this phrase from Ulmer, 1985) for
it brings into view the idea that knowledge does not bring us closer to
what is already present but, rather, moves us into a new reality which
is incalculable from what came before. Because knowledge enables us
to transcend what came before, this means it also allows us to penetrate
deeper into that which does not seem possible from the perspective of the
present. Knowledge, in other words, is not conservative, but radically
inventionalistic. This means when we are thinking about schooling, we
should not think of it as primarily being about providing children with
knowledge of an already determined world. We need to bear in mind
that the world we are teaching about has always already passed and so
any attempt to transmit the rules of this world (which, in an emergent
universe are no longer appropriate for the present) might be considered
pedagogically irresponsible. What then do we do in schools? We would
like to answer this question in terms of what we perceive to be two
extremely important functions of schooling in the modern world. One
function of schooling is to teach the young how to take care of the world
(see Arendt, 1954/1961). We educate the young about the world that is
and the world that has been precisely because we care about and wish
to take responsibility for the future, the world that will emerge. Another
function of schooling is to facilitate the emergence of human subjectivity
(Biesta, 2006). We teach so that children can become better human
beings. Both these functions of schooling are intimately connected with
the concept of emergence, the emergence of the world on the one hand
and the emergence of human subjectivity on the other. Teachers are
responsible both for the emergence of the world (the future) and for the
emergence of human subjectivity, and these two functions of schooling
are very closely intertwined. In this sense responsibility and emergence
can be considered key to the whole idea of schooling.
When we consider schooling in terms of taking responsibility for the
future, then we must acknowledge that the answers to the future do not
lie in the present or the past and so we cannot teach the answers to the
future. In an emergent universe we cannot rely on the rules of the past
to dictate what we should do in the future. For this reason it is
misguided to think of schools as places where the rules of the past are
taught in order to take care of the future. Such an attitude succeeds only
in replicating the past and holding the world still. Schools, rather,
should be thought of as places where the world is renewed. As Hannah
Arendt so aptly remarked:
Our hope always hangs on the new which every generation brings;
but precisely because we can base our hope only on this, we destroy
everything if we try to control the new so that we, the old, can dictate
how it will look. (Arendt, 1954/1961, p. 11)
The idea of schools being places where the world is renewed is very
much caught up with the idea of human subjectivity since it is largely
the choices made by human subjects which cause the world to emerge
in the way that it does. But if we cannot rely on the rules of the world
(that has always already passed) to make responsible choices, then how
do we use schooling to ensure that the future is taken care of? To do so
we must ensure children are allowed to respond responsibly in the
present. As we argued in previous sections, to respond responsibly we
are asked to respond to the present (which is always a present that has
already passed) by simultaneously following it and not following it. We
follow it in a conventional sense, in that we acknowledge and affirm the
rules (logic) of the past, but then if we try to be responsible and
understand the present from its own perspective, there always comes a
moment when this logic is no longer appropriate for the present (for the
present is always a new present) and then we have no rule to follow. In
this moment of undecidability we must make a choice without
knowledge/logic. We are forced to take a position which is not dictated
by the rules of the past. But in this taking of a position, we ourselves
emerge or come into being. We become who we are through such
moments of undecidability. This seems to imply that in taking up
responsibility for the future, schools also facilitate the emergence – or
what Biesta (2006) calls the “coming into presence” – of the human
subjects (whose choices renew the world).
If this is the case, then it is clear that schools should do more than
teach the child about a pre-given world. If the coming into presence of
human subjectivity takes place when the human subject is presented
with an opportunity to respond responsibly – i.e. take a position on
something about which it is impossible to simply follow the rules of the
past – then schools need to ensure that children at least have the
opportunity to respond responsibly to what they are presented with.
This seems to suggest that the objective of presenting content in the
school curriculum is not to ensure that children get an accurate
understanding of it, but to provide children with the opportunity to
engage with content deeply enough to respond responsibly to it. This
means allowing them to make choices, and hence take a position, based
on the full contingency of what is presented in the curriculum and the
present in which it is presented and without knowing what to do. It is
necessary, in other words, to allow undecidability to exist in the
classroom. This undecidability must also extend to the curricular
content itself, which should not follow a logic of predetermination. To
determine the curricular content in advance would be to apply a rule –
“this is what should happen in all schools, not that” – which, in itself,
would be insensitive to the contingency of the present, and hence
pedagogically irresponsible. Who is to say what the curricular content
should be, particularly in today’s climate of multiculturalism?
If undecidability were allowed to exist in the classroom, schools
could become places that facilitate the emergence of human subjects as
unique beings, each taking a position, rather than being places which
encourage the (blind) following of rules carried over from a different
present. In doing this they would also be places which facilitate the
renewal of the world rather than its replication. The classroom must
become a place where meanings can be responsibly negotiated and hence
where the new is allowed to appear.

Conclusion
In this paper we have tried to argue that an epistemology inspired by
strong emergence allows a different way of thinking about the
organisation of schooling, which is not another pedagogical version on
the representational-presentational spectrum that was discussed in our
other paper in this issue (Biesta & Osberg, 2007), a spectrum which is primarily
concerned with ways in which the child can be brought to an accurate
understanding of the world that is present (either through
presentational or representational means). We believe that if we re-
think the purposes of schooling using insights from complexity and
deconstruction, which suggest an emergentist relationship between the
world and our knowledge of it, then we must think of schools not as
places where the meanings of a present world (which is also a world that
has always already passed) are replicated and hence preserved. Instead
schools can be thought of as places where new worlds are allowed to
emerge or, to say this differently, where the world is renewed. To do this
we must make sure that within schools children have the opportunity
not only to engage deeply and responsibly with curricular content, but
also to respond to it, to make choices, take a position, and be heard.

NOTES
1. Closed systems, essentially, are those that do not interact or exchange
information with their environment, while open systems, typically are those
that do interact with or exchange information with their environment
changing themselves and their environment in the process.
2. Turbulence is a complex pattern of order, not a disordered or chaotic
state.
3. A well-used, micro-level example is the case of the Rayleigh-Bénard
instability. If we apply heat from below to a thin layer of water, the water
responds by organising itself into tiny convection currents called Bénard
cells. In other words a macro-level pattern emerges which transcends and
subsumes the micro-level components.
4. The name is misleading, however, as it means separation in two, when
in fact there may be several possibilities.
5. Here Prigogine draws on Abraham de Moivre, one of the founders of the
classical theory of probability.
6. Derrida suggests this is the “least bad definition of deconstruction”
(1992a, p. 200).

ACKNOWLEDGMENTS
This paper was originally presented as part of a symposium entitled
Complexity and the Idea of a Pedagogy of Invention at the British
Educational Research Association Conference, Manchester, 2004.
Symposium co-presenters included Brent Davis and Dennis Sumara.
Thanks to Barney Ricca for useful comments on an early draft of this paper.

REFERENCES
Arendt, H. (1961). The crisis in education. In Between past and future. New
York: Penguin Books. (Original work published 1954)
Bedau, M.A. (1997). Weak emergence. In J. Tomberlin (Ed.), Philosophical
perspectives: Mind, causation, and world (Vol. 11, pp. 375-399).
Malden, MA: Blackwell Publishers.
Bergson, H. (1911). Creative evolution. New York: Henry Holt and
Company.
Biesta, G.J.J. (2001). Preparing for the incalculable. In G.J.J. Biesta & D.
Egea-Kuehne (Eds.), Derrida & education (pp. 32-34). New York:
Routledge.
Biesta, G.J.J. (2006). Beyond learning. Democratic education for a human
future. Boulder, CO: Paradigm Publishers.
Biesta, G.J.J. & Osberg, D. (2007). Beyond re/presentation: A case for
updating the epistemology of schooling. Interchange, 38(1).
Chalmers, D.J. (2002). Varieties of emergence. Available from:
http://jamaica.u.arizona/~chalmers/papers/granada.html
Chalmers, D.J. (2006). Strong and weak emergence. In P. Clayton & P.
Davies (Eds.), The re-emergence of emergence (pp. 244-256). Oxford, UK:
Oxford University Press.
Derrida, J. (1990). The force of law: The mystical foundation of authority.
Cardozo Law Review, 11(5-6), 919-1045.
Derrida, J. (1992a). Afterw.rds: Or, at least, less than a letter about a letter
less (G. Bennington, Trans.). In N. Royle (Ed.), Afterwords (pp. 197-
203). Tampere, Finland: Outside Books.
Derrida, J. (1992b). The other heading: Reflections on today’s Europe (P.-A.
Brault & M.B. Naas, Trans.). Bloomington, IN: Indiana University
Press.
Derrida, J. (1998). Forgiveness and mercy in politics and law. In
Conversation with Jacques Derrida. N. Benjamin, Law and Humanism
Speaker Series. Cardozo School of Law, October.
Foucault, M. (1965). Madness and civilization: A history of insanity in an
age of reason (R. Howard, Trans.). New York: Random House.
Freire, O.J. (2003). A story without an ending: The quantum physics
controversy 1950-1970. Science and Education, 12, 573-586.
Gell-Mann, M. (1994). The quark and the jaguar: Adventures in the simple
and complex. London: Little Brown Company.
Kim, J. (1999). Making sense of emergence. Philosophical Studies, 95, 3-36.
Lewes, G.H. (1875). Problems of life and mind (vol. 2). London: Kegan Paul,
Trench, Trübner.
Mead, G.H. (1932). The philosophy of the present. La Salle, IL: Open Court.
Morgan, C.L. (1923). Emergent evolution. London: Williams and Norgate.
Pomian, K. (Ed.). (1990). La querelle du déterminisme. Philosophie de la
science aujourd’hui. Paris: Gallimard/Le Débat.
Prigogine, I. (1997). The end of certainty: Time, chaos, and the new laws of
nature. London: The Free Press.
Prigogine, I. & Stengers, I. (1984). Order out of chaos: Man’s new dialogue
with nature. London: Bantam Books.
Ulmer, G.L. (1985). Applied grammatology: Post(e)-pedagogy from Jacques
Derrida to Joseph Beuys. Baltimore: Johns Hopkins University Press.
Ziman, J. (2003). Emerging out of nature into history: The plurality of the
sciences. Philosophical Transactions of the Royal Society of London, 361
(1809), 1617-1633.

Authors’ Address:
School of Education and Lifelong Learning
University of Exeter
Heavitree Road
Exeter EX1 2LU
UNITED KINGDOM
EMAIL: g.biesta@exeter.ac.uk
d.c.osberg@exeter.ac.uk
