Complexity and Self-organization
prepared for the Encyclopedia of Library and Information Sciences,
edited by Marcia J. Bates and Mary Niles Maack (Taylor & Francis, 2008)
Francis Heylighen
Free University of Brussels, Krijgskundestr. 33, 1160 Brussels, Belgium
fheyligh@vub.ac.be
Keywords: complex systems, unpredictability, self-organization, networks, non-linearity,
multi-agent systems, emergence
Abstract: This article introduces, in a non-technical manner, some of the main concepts and
methods of the science that studies complex, self-organizing systems and networks.
Complexity cannot be strictly defined, only situated in between order and disorder. A
complex system is typically modeled as a collection of interacting agents, representing
components as diverse as people, cells or molecules. Because of the non-linearity of the
interactions, the overall system evolution is to an important degree unpredictable and
uncontrollable. However, the system tends to self-organize, in the sense that local
interactions eventually produce global coordination and synergy. The resulting structure
can in many cases be modeled as a network, with stabilized interactions functioning as
links connecting the agents. Such complex, self-organized networks typically exhibit the
properties of clustering, being scale-free, and forming a small world. These ideas have
obvious applications in information science when studying networks of authors and their
publications.
INTRODUCTION
In the last two decades, a new paradigm for scientific inquiry has been emerging:
complexity. Classical science, as exemplified by Newtonian mechanics, is essentially
reductionist: it reduces all complex phenomena to their simplest components, and then
tries to describe these components in a complete, objective and deterministic manner
(3,8). The philosophy of complexity is that this is in general impossible: complex
systems, such as organisms, societies or the Internet, have properties—emergent
properties—that cannot be reduced to the mere properties of their parts. Moreover, the
behavior of these systems has aspects that are intrinsically unpredictable and
uncontrollable, and cannot be described in any complete manner. At best, we can find
certain statistical regularities in their quantitative features, or understand their qualitative
behavior through metaphors, models, and computer simulations.
While these observations are mostly negative, emphasizing the traditional qualities that
complex systems lack, these systems also have a number of surprisingly positive features,
such as flexibility, autonomy and robustness, that traditional mechanistic systems lack.
These qualities can all be seen as aspects of the process of self-organization that typifies
complex systems: these systems spontaneously organize themselves so as to better cope
with various internal and external perturbations and conflicts. This allows them to evolve
and adapt to a constantly changing environment.
Processes of self-organization literally create order out of disorder (3). They are
responsible for most of the patterns, structures and orderly arrangements that we find in
the natural world, and many of those in the realms of mind, society and culture. The aim
of information science can be seen as finding or creating such patterns in the immense
amount of data that we are confronted with. Initially, patterns used to organize
information were simple and orderly, such as “flat” databases in which items were
ordered alphabetically by author’s name or title, or hierarchically organized subject
indices where each item was assigned to a fixed category. Present-day information
systems, such as the world-wide web, are much less orderly, and may appear chaotic in
comparison. Yet, being a result of self-organization, the web possesses a non-trivial
structure that potentially makes information retrieval much more efficient. This structure
and others have recently been investigated in the science of networks, which can be seen
as part of the sciences of complexity and self-organization.
The concept of self-organization was first proposed by the cyberneticist W. Ross Ashby
(1) in the 1940s, and developed, among others, by his colleague Heinz von Foerster (2).
During the 1960s and 1970s, the idea was picked up by physicists and chemists studying
phase transitions and other phenomena of spontaneous ordering of molecules and
particles. These include Ilya Prigogine (3), who received a Nobel Prize for his
investigation of self-organizing “dissipative structures”, and Hermann Haken (4), who
dubbed his approach “synergetics”. In the 1980s, this tradition cross-fertilized with the
emerging mathematics of non-linear dynamics and chaos, producing an investigation of
complex systems that is mostly quantitative, mathematical, and practiced by physicists.
However, the same period saw the appearance of a parallel tradition of “complex
adaptive systems" (5), associated with the newly founded Santa Fe Institute for the
sciences of complexity, which is closer in spirit to the cybernetic roots of the field. Building
on the work of John Holland, Stuart Kauffman, Robert Axelrod, Brian Arthur and other
SFI associates, this approach is more qualitative and rooted in computer simulation. It
took its inspiration more from biology and the social sciences than from physics and
chemistry, thus helping to create the new disciplines of artificial life ands o c i a l
simulation. The remainder of this paper will mostly focus on this second, simulation-
based tradition, because it is most applicable to the intrinsically social and cognitive
processes that produce the systems studied by information science. Although the other,
mathematical tradition sometimes uses the term “complex systems” to characterize itself,
the labels of “non-linear systems” or “chaos theory” seem more appropriate, given that
this tradition is still rooted in the Newtonian assumption that apparently complex
behavior can be reduced to simple, deterministic dynamics—an assumption which may
be applicable to the weather, but not to the evolution of a real-world social system.
Extending both traditions, the turn of the century witnessed a surging popularity of
research into complex networks. This was inspired mostly by the growth of the world-
wide web and the models proposed by Watts and Strogatz (6), and Barabasi and Albert
(7).
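As an illustrative aside (not part of the original encyclopedia entry), both models can be generated and probed with the Python library networkx; the network sizes and parameters below are arbitrary choices for demonstration only:

import networkx as nx

# Watts-Strogatz small-world model: a ring of 1000 nodes, each linked to its
# 6 nearest neighbours, with every link rewired at random with probability 0.1.
ws = nx.connected_watts_strogatz_graph(n=1000, k=6, p=0.1, seed=1)

# Barabasi-Albert scale-free model: nodes are added one at a time, each
# attaching preferentially to 3 already well-connected nodes.
ba = nx.barabasi_albert_graph(n=1000, m=3, seed=1)

print("WS clustering:", nx.average_clustering(ws))                      # high clustering
print("WS average path length:", nx.average_shortest_path_length(ws))   # short paths: the "small world" effect
print("BA maximum degree:", max(d for _, d in ba.degree()))             # a few highly connected hubs

The high clustering, short average path length and highly connected hubs computed here are the small-world and scale-free signatures summarized in the abstract.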
At present, the "science of complexity" taken as a whole is still little more than a collection
of exemplars, methods and metaphors for modeling complex, self-organizing systems.
However, while it still lacks integrated theoretical foundations, it has developed a number
of widely applicable, fundamental concepts and paradigms that help us to better
understand both the challenges and opportunities of complex systems. The present article
will try to introduce the most important of these concepts in a simple and coherent
manner, with an emphasis on the ones that may help us to understand the organization of
networks of information sources.
COMPLEX SYSTEMS
There is no generally accepted definition of complexity (13): different authors have
proposed dozens of measures or conceptions, none of which captures all the intuitive
aspects of the concept, while each is applicable only to a limited class of
phenomena, such as binary strings or genomes. For example, the best-known measure,
“Kolmogorov complexity”, which is the basis of algorithmic information theory, defines
the complexity of a string of characters as the length of the shortest program that can
generate that string. However, this implies that random strings are maximally complex,
since they allow no description shorter than the string itself. This contradicts our intuition
that random systems are not truly complex. A number of more sophisticated variants of this
definition have been proposed to tackle this issue, but they still apply only to strings,
not to real-world systems. Moreover, it has been
proven that the “shortest possible” description is in general uncomputable, implying that
we can never be sure that we really have determined the true complexity of a string.
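Although the true Kolmogorov complexity is uncomputable, a rough feel for the idea can be obtained by using a general-purpose compressor as a crude, computable stand-in for the shortest description. The following Python sketch, added here as an illustration rather than taken from the original text, compares a highly regular string with a random one:

import os
import zlib

def description_length(data: bytes) -> int:
    # Size of the zlib-compressed data: a crude upper bound on the length of
    # the shortest description (the exact Kolmogorov complexity is uncomputable).
    return len(zlib.compress(data, level=9))

ordered = b"ab" * 5000            # a highly regular 10,000-byte string
random_bytes = os.urandom(10000)  # 10,000 random bytes, essentially incompressible

print(description_length(ordered))       # a few dozen bytes: low "complexity"
print(description_length(random_bytes))  # roughly 10,000 bytes: maximal "complexity"

Under this measure the random string indeed comes out as maximally complex, which is exactly the counter-intuitive outcome described above.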
In spite of these fundamental problems in formalizing the notion of complexity, there are
a number of more intuitive features of complex systems that appear again and again in
