Toward a Mathematical Definition of “Life”

In R. D. Levine and M. Tribus, The Maximum Entropy Formalism, MIT Press, 1979, pp. 477–498

Gregory J. Chaitin

Abstract
In discussions of the nature of life, the terms “complexity,” “organism,” and “information content” are sometimes used in ways remarkably analogous to the approach of algorithmic information theory, a mathematical discipline which studies the amount of information necessary for computations. We submit that this is not a coincidence and that it is useful in discussions of the nature of life to be able to refer to analogous precisely defined concepts whose properties can be rigorously studied. We propose and discuss a measure of degree of organization and structure of geometrical patterns which is based on the algorithmic version of Shannon’s concept of mutual information. This paper is intended as a contribution to von Neumann’s program of formulating mathematically the fundamental concepts of biology in a very general setting, i.e. in highly simplified model universes.
1. Introduction
Here are two quotations from works dealing with the origins of life and exobiology:

These vague remarks can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. One can see intuitively that many instructions are needed to specify a complex structure. On the other hand, a simple repeating structure can be specified in rather few instructions. [1]

The traditional concept of life, therefore, may be too narrow for our purpose... We should try to break away from the four properties of growth, feeding, reaction, and reproduction... Perhaps there is a clue in the way we speak of living things. They are highly organized, and perhaps this is indeed their essence... What, then, is organization? What sets it apart from other similarly vague concepts? Organization is perhaps viewed best as “complex interrelatedness”... A book is complex; it only resembles an organism in that passages in one paragraph or chapter refer to others elsewhere. A dictionary or thesaurus shows more organization, for every entry refers to others. A telephone directory shows less, for although it is equally elaborate, there is little cross-reference between its entries... [2]

If one compares the first quotation with any introductory article on algorithmic information theory (e.g. [3–4]), and compares the second quotation with a preliminary version of this paper [5], one is struck by the similarities. As these quotations show, there has been a great
deal of thought about how to define “life,” “complexity,” “organism,” and “information content of organism.” The attempted contribution of this paper is that we propose a rigorous quantitative definition of these concepts and are able to prove theorems about them. We do not claim that our proposals are in any sense definitive, but, following von Neumann [6–7], we submit that a precise mathematical definition must be given.

Some preliminary considerations: We shall find it useful to distinguish between the notion of degree of interrelatedness, interdependence, structure, or organization, and that of information content. Two extreme examples are an ideal gas and a perfect crystal. The complete microstate at a given time of the first one is very difficult to describe fully, and for the second one this is trivial to do, but neither is organized. In other words, white noise is the most informative message possible, and a constant pitch tone is least informative, but neither is organized. Neither a gas nor a crystal should count as organized (see Theorems 1 and 2 in Section 5), nor should a whale or elephant be considered more organized than a person simply because it requires more information to specify the precise details of the current position of each molecule in its much larger bulk. Also note that following von Neumann [7] we deal with a discrete model universe, a cellular automata space, each of whose cells has only a finite number of states. Thus we impose a certain level of granularity in our idealized description of the real world.

We shall now propose a rigorous theoretical measure of degree of organization or structure.
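The gas/crystal contrast can be illustrated numerically. As a rough sketch (our illustration, not the paper's), the zlib-compressed length of a string stands in for the number of bits needed to describe a microstate: a perfectly regular “crystal” admits a very short description, a random “gas” does not, and neither figure says anything about organization.

```python
# Rough numerical sketch of the gas/crystal contrast (an illustration,
# not part of the paper): compressed length approximates the number of
# bits needed to describe each microstate.
import os
import zlib

crystal = bytes([7]) * 100_000   # every cell in the same state
gas = os.urandom(100_000)        # each cell independently random

crystal_bits = len(zlib.compress(crystal))
gas_bits = len(zlib.compress(gas))

# The regular state compresses to a tiny description; the random state
# is essentially incompressible. Neither number measures organization.
print(crystal_bits, gas_bits)
```

Note that high information content here is simply incompressibility; the measure proposed below is designed precisely to separate organization from this kind of raw descriptive cost.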
We use ideas from the new algorithmic formulation of information theory, in which one considers individual objects and the amount of information in bits needed to compute, construct, describe, generate or produce them, as opposed to the classical formulation of information theory, in which one considers an ensemble of possibilities and the uncertainty as to which of them is actually the case. In that theory the uncertainty or “entropy” of a distribution is defined to be

$$H = -\sum_{k=1}^{n} p_k \log_2 p_k$$

and is a measure of one's ignorance of which of the $n$ possibilities actually holds, given that the a priori probability of the $k$th alternative is $p_k$.
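The classical entropy just defined can be computed directly. A minimal sketch (the example distributions are our own, not the paper's):

```python
# Shannon entropy H = -sum_k p_k log2 p_k, in bits (a minimal sketch;
# the example distributions below are our own, not the paper's).
from math import log2

def entropy(p):
    """Entropy in bits of a discrete distribution p_1, ..., p_n."""
    # Terms with p_k = 0 contribute nothing, by the usual convention.
    return sum(-pk * log2(pk) for pk in p if pk > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
print(entropy([1.0]))        # a certain outcome: 0.0 bits
print(entropy([0.25] * 4))   # four equal alternatives: 2.0 bits
```

As the examples show, entropy is maximal when all alternatives are equally likely and zero when the outcome is certain, matching the white-noise versus constant-tone contrast above.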
