To cite this article: Lars Löfgren (1977): "Complexity of Descriptions of Systems: A Foundational Study," International Journal of General Systems, 3:4, 197-214.
By definition (Webster) a system is deemed complex and complicated when the interconnection or arrangement of its parts is difficult to trace or understand. Tracing and understanding are associated with cerebral learning processes. By the aid of a previously developed model for such processes we can study the difficulties that define the complexity impressions and thus provide a foundational, rather than methodological, study of complexity. The nature of complexity as difficulty of perception suggests that a complexity classification should, in general, be more difficult than a simplicity classification. This thesis is discussed and applied with a mathematical notion of degree of difficulty.

INDEX TERMS Learning processes, description processes, foundational study, perception, complexity classification, simplicity classification.
following two symbol-strings as descriptions (and not as objects in the above sense): "the sum of the numbers ten and one-hundred-and-twenty-two," denoted C, and "the smallest set which contains itself as a member," denoted D. When asked about the complexity of the two descriptions, most people (with a sufficient background) would point at D as the more complex, although C is slightly longer than D. To be sure, it may be the relative easiness by which the meaning of the description C is comprehended that accounts for the impression that C is less complex than D.

The examples suggest that the concept of complexity basically refers to cognition processes. We want to analyse it in terms of those processes which generate our inner (cerebral) descriptions and those which are responsible for our awareness of meanings of descriptions (of systems intended as descriptions, like the printed pages of this paper, cerebral descriptions, etc.).

As we have seen, the complexity of a system depends upon whether the system is considered an object or a description. Yet it may be argued that objects are in fact descriptions (that objects are existential perceptions occurring in the cerebral description of the data-flow that stimulates our receptors; see Section 7), or that descriptions are in fact objects (typed sentences, DNA-molecules, etc.). How could we then make a distinction between object and description when dealing with their complexities?

We want to avoid the problem by talking about description process versus interpretation process instead of object versus description. After all, the difficulties of tracing or understanding a string depend on whether we describe it (as an object) or interpret it (as a description). Although there is an interdependence between description- and interpretation-extractions, we shall be able to isolate the corresponding difficulties, cf. Sections 4 and 5.

Figure 1 illustrates the inverse character of description processes (learning, inheritance) and interpretation processes.

[Figure 1: "system" is mapped to "description" by learning or inheritance, and "description" back to "system" by interpretation.]

FIGURE 1 Description and interpretation processes.

We first recognize two ways, the two ways, in which descriptions are generated. Either a description of a system is extracted from the system by a learning process (compare an acquired character in a biological context), or it is generated by an hereditary-like mechanism (compare an inherited character and the biological dichotomy of characters into acquired and inherited characters).

Next, a description would not be a description if not associated with an interpretation. By interpreting the description we get that which is described.

Notice that the "system" indicated in Figure 1 may be considered an object as well as a description. In the latter case the description may for example be described in a higher-level language (eventually in the same language as that used for the primary description, provided that it is rich enough to permit self-reference). Again, as we have argued, the generated description may be looked at as an object. However, its character of description is unmistakable as soon as it is associated with an interpretation.

The learning or description mechanism indicated in Figure 1 is supposed to be of a general kind. It covers verbal descriptions in natural or formal languages as well as descriptions in cerebral languages. It will be explained further in Section 3 (where we shall see that learning actually involves interpretation).

Let us now return to the classification of complexities with reference to Figure 1. For the moment we consider only description by learning but not by inheritance, because the latter is not directly associated with cognition phenomena and thus not with complexity sensations. This does not prevent us, later on, from considering complexities of genetic descriptions (as objects and as descriptions).

What remains are two mutually inverse processes, a learning or description process and an interpretation process. Each of them is associated with cognition phenomena, and we are thus led to two corresponding kinds of complexity.

Associated with the learning process is the complexity of describing, the d-complexity. It is measured by the difficulty associated with extracting the description of a system S, the description responsible for how we "see" S and feel its complexity. The complexities discussed above for the strings A and B are of this type. After having discussed the learning process in Section 3 we will return to a further classification of d-complexities.

Associated with the interpretation process is the complexity of interpreting, the i-complexity. It is measured by the difficulty associated with extracting the interpretation (meaning) of a description. The complexities discussed above for the strings C and D are of this type. In Section 5, i-complexities will be further classified (depending on the language and the nature of the a priori knowledge with respect to which the description is considered a description).

3 THE LEARNING OR DESCRIPTION PROCESS

[Figure 2: diagram not recoverable from the scan; only label fragments such as "program for the generation of …" and "the surrounding …" survive.]

FIGURE 2 Structure of learning (description) process.
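The generate-and-test structure that Figure 2 attributes to learning can be sketched in miniature. The following is my own illustration, not Löfgren's model: a fixed, tiny pool of candidate rules (constant and alternating strings) is tested against every step of an observation string, and a hypothesis is kept only when every observed symbol supports it.

```python
# A minimal generate-and-test loop in the spirit of Figure 2 (a sketch,
# not the paper's mechanism): propose constant- and alternation-rules
# for a binary string; confirm the first rule every symbol supports.
def learn(z: str) -> str:
    hypotheses = [
        ("all zeros", lambda i, s: s == "0"),
        ("all ones", lambda i, s: s == "1"),
        ("alternating from 0", lambda i, s: s == "01"[i % 2]),
        ("alternating from 1", lambda i, s: s == "10"[i % 2]),
    ]
    for name, rule in hypotheses:
        # every position is a test step; one counterexample rejects the rule
        if all(rule(i, s) for i, s in enumerate(z)):
            return name          # confirmed: every step supports it
    return "no rule found"       # fall back to the literal description

print(learn("10101010"))  # alternating from 1
print(learn("0110"))      # no rule found
```

A rejected hypothesis here is simply discarded; in the paper's richer setting it may instead be weakened into a rule with exceptions, as discussed in Section 4.2.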
of the proper axioms, provided that the theories have the same logical basis).

A strict definition of the confirmation relations can be given¹⁶ in terms of the deduction relation A ⊢_T B ("the formula B can be derived from the formula A in the theory T" or, equivalently, "B is a theorem in the theory T(A), i.e., T augmented with A as a new axiom"). Equivalently, the definition can be given in terms of complexity values |W|, which are defined in Section 5.1. Such equivalent formulations are given within parentheses in the following definition.

Finally we want to make the comment that some of the subprocesses of Figure 2 may themselves be regarded as learning processes, again with a structure according to Figure 2. Consider for example the generation of relevant questions. It depends on a fixed basis for the actual theory. How is this basis, a logical basis for example, determined? By a learning process, which in turn requires a still more basic basis, etc. Finally we may come down to a basis that is genetically determined. In general it is thus reasonable to consider hierarchies of learning structures of the type of Figure 2.
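The deduction relation A ⊢_T B behind these confirmation relations can be made concrete with a toy sketch (mine, not the paper's formalism): take T to be a set of Horn rules, and check derivability of B in T augmented with the axiom A by forward chaining. The "snowball" facts are borrowed from the example used later in Section 5.1.

```python
# Toy sketch of a deduction relation A |-_T B: T is a list of Horn rules
# (premises -> conclusion); we test whether B is a theorem of T augmented
# with A as a new axiom, by exhaustive forward chaining.
def derives(T, A, B):
    """True iff B is derivable from A in the theory T."""
    facts = {A}
    changed = True
    while changed:
        changed = False
        for premises, conclusion in T:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return B in facts

# T: a snowball is cold, and cold things feel cold in my hands.
T = [(("snowball",), "cold"), (("cold",), "feels_cold")]
print(derives(T, "snowball", "feels_cold"))  # True
print(derives(T, "feels_cold", "snowball"))  # False: deduction is one-way
```

Real first-order deduction is of course only semidecidable; the always-terminating loop above works because this toy T is a finite propositional rule set.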
and B as difficult (complex) takes place in our subconscious minds. The strings are of such lengths that they are not recognized in a single act. The visual center of attention sweeps over the strings and covers two or a few more figures at a time. When seeing two 0's close to each other in string A, the subconscious hypothesis is formed that also the neighboring figures are 0's. The hypothesis is supported, strengthened, etc., and we tend to "see" half of the string as only 0's and, in a similar way, the other half as only 1's. Since this part of the learning process is carried out subconsciously, we may even get the impression that we are recognizing A in a single act. That this is not the case, however, may be argued from the way we are aware of the other string, B. Here simple hypotheses of the above kind will be rejected. Most likely the hypotheses that can at all be generated in this type of subconscious "seeing" are very limited. We may not be able to express any rules in the internal language, in terms of which we can describe B sufficiently short to be aware of it the way we are aware of A with its very short internal description. When letting the eye sweep over B we can only become aware of it as "a string of 0's and 1's." The shortening of the detailed description of B, which is necessary to have it contained in the limited awareness space, makes us miss a lot of facts of B (which are sequentially generated, and forgotten, when letting the eye sweep over B). Compare the fact that it is much easier to remember A than B in such a detail that we can reconstruct the strings from their short-term memorizations.

It is my impression that it is not more difficult to classify B as "difficult" than it is to classify A as "easy," i.e., when looking at the strings for just a short while. This probably depends upon the fact that the learning processes at play are entirely subconscious and, furthermore, so fast that we are not even aware of the learning times. Should there be any difference in the difficulties of learning, we will simply not be aware of them. Yet we do classify B as "difficult" and A as "easy." These different impressions may be due to the difference in the way we are aware of the strings. It is easy to be aware of A in such a detail that we can reconstruct A from the short-term memory we immediately develop. Our awareness of B is different. When we find this, which we do easily, we may consciously try to find rules for B in the hope that they will permit a sufficiently short description. That search process may generate a direct impression of difficulty. But if we do not have any real need for remembering B, we let the type of awareness we have for B predict for us that it will be difficult for us to find rules. We may hence refrain from trying and simply classify B as difficult, or complex, the easy way, i.e., by an easy prediction of a forthcoming difficulty.

Our discussions so far suggest a dichotomy of strings, z, into complex (difficult) and simple (easy) strings, depending upon whether their learned descriptions are longer or shorter than a given length, w. The learned description depends both on the descriptive language, L, and the actual learning process. For languages with limited expressibility, it is reasonable to assume that the learning process is adapted to L such that the learned description coincides with the shortest description of z that is expressible in L, denoted s(z, L). With the notation [s(z, L)] for the length of the shortest description s(z, L), we are thus led to the following complexity and simplicity predicates.

DEFINITION 2 ℭ(z, w), the d-complexity predicate "the string z is complex with respect to the number w," is defined by:

ℭ(z, w): [s(z, L)] > w.

𝔖(z, w), the d-simplicity predicate "the string z is simple with respect to the number w," is defined by:

𝔖(z, w): ¬ℭ(z, w) (: [s(z, L)] ≤ w).

The extensions of these predicates define complexity and simplicity classes as follows.

DEFINITION 3 C_w, the d-complexity class with respect to w, is defined by:

C_w = {z: ℭ(z, w)} (= {z: [s(z, L)] > w}).

S_w, the d-simplicity class with respect to w, is defined by:

S_w = {z: 𝔖(z, w)} (= {z: [s(z, L)] ≤ w}).

Our impression that it is as easy to classify a string as easy (simple) as it is to classify a string as difficult (complex) is valid only within this case of subconscious learning. It may be compared with an alleged saying of Salvador Dali. Asked if he found it difficult to paint, Dali admitted: "No, it is either easy or it is impossible!"

Later on, when dealing with more expressible languages and more conscious learning processes, we will find that it is in fact more difficult to classify a
difficult string as difficult than it is to classify an easy string as easy (cf. Sections 4.2 and 4.5).

4.2 Conscious Difficulties in Partly Conscious Learning

Let us recall from Section 3 that learning essentially is a search process. It searches for rules hidden in a string (an observation report) by suggesting hypotheses for test and eventual confirmation into such rules. A rule is naturally considered the more powerful the more observable facts it concerns, and a measure of the power of the rules that have been found is the shortening of the description (of the observation report) that can be obtained by utilizing the rules.

The domain of search may be fixed and small (cf. the fixed strings A and B of the previous discussion). Again, it may be unlimited, as in the case where z = z(n) is considered an observation report for the first n observations of a behavior that we want to describe. Hypotheses then naturally concern z for all n, and the problem arises how to decide on such hypotheses with finite tests.

Consider an hypothesis concerning an infinite domain, for example a universally quantified hypothesis that can be particularized to an infinity of distinct test-propositions. Should the hypothesis be true, it would have a very large (infinite) shortening effect on the description of the (steadily increasing) observation report. Of course, such an hypothesis cannot be verified by a finite test sequence. It can be supported, however, and in cases of strong support it may be considered confirmed. However, if such a universally quantified hypothesis is falsified, its negation, i.e., an existentially quantified hypothesis, will be verified (cf. Theorem 1). An existentially quantified hypothesis, however, will have almost no shortening effect on the description. Rather, it has the character of a single observation, or step, in the test sequence z.

If an hypothesis H: ∀x Rx, where x indicates (varies over) the steps of z, is obtaining increasing support until a step x₀ is reached where H is falsified, i.e., ¬Rx₀, then we may look at R as a rule that holds with certain exceptions. This means that we are in fact considering a new hypothesis H₁: ∀x (x ≠ x₀) ⇒ Rx, which is slightly longer and slightly less powerful than H. Should there be many exceptions also from this new hypothesis, certainly a much longer and weaker hypothesis can be formed. However, the shortening effect of this hypothesis may be so small that, instead, we may start to look for an entirely different hypothesis or, eventually, for an hypothesis concerning a rule for the exceptions.

Let us reconsider, for a moment, the string A of Section 2. Here the rule that a 0 has a 0 as neighbor and a 1 a 1 as neighbor becomes increasingly supported both in the left part of the string and in the right part. In the middle, however, there is an exception. Nevertheless, the rule with the exception permits a shortening of the description of the string such that we can easily memorize it. We may even tend to "see" the string as two disconnected strings, each satisfying the rule without exception. However, when looking at the whole string as a test sequence to be continued to the right, the exception from the rule makes us feel uncertain of how to predict the next figure. Should it be a 1, or should we expect another exception? If the exceptions occur regularly, it is about time to expect another exception!

Consider instead the following sequence, E: 101010101010101010101. Here the rule "the successor of a 1 (0) is 0 (1)" is valid throughout without exception (i.e., wherever there is a successor). When going through the sequence as a test sequence, the corresponding hypothesis is getting an increasing number of supports and we feel safer in predicting E than in predicting A.

Universally quantified hypotheses are of special interest for learning processes because of the strong shortening effects they have on the description, provided that they are confirmed. If such an hypothesis H is obtaining a steadily increasing support when the test sequence z is increased (when increasing n in z(n)), then the redundancy of z with respect to H is steadily increasing. Compare Definition 1, saying that a certain step B in the test sequence z supports H with respect to the a priori knowledge T if and only if B can be deduced from H in T, i.e., when B is redundant with respect to H. Hence, we may measure the support that z gives to H by the shortening (removal of redundancy) that can be obtained by describing z in terms of H.

It is when an hypothesis is getting a continuously increasing support, i.e., support with no intermediary rejection, that it may be considered for confirmation. In this case of steadily increasing redundancy of the sequence z (with respect to the hypothesis) the ratio [d(z, L)]/[z], where d(z, L) is the actual description of z in L which utilizes the hypothesis, is steadily decreasing with increasing length of z. In fact, the behavior of this ratio may indicate how close we are to a reasonable termination of the learning process.

Let us in the actual case of partly conscious
learning associate feelings of increasing difficulty of learning with an increasing number of test steps that have to be performed before the learning process is terminated. At each step we have to decide if the observation (the new element of the test sequence) is irrelevant for the hypothesis H, falsifies H, or supports H (cf. Definition 1).

Let z₁(n) and z₂(n) be two test sequences, each of a reasonably large test length, n, such that the ratio [s(z, L)]/[z] is large (close to 1) for z₁(n) and small (close to 0) for z₂(n). Which is the most difficult to learn?

We can conclude that z₁ contains fewer rules to be found than z₂ because of its longer shortest description. In fact, since the ratio for z₁ is close to 1, we know that the test sequence z₁(n) already produced can contain but little support for eventual hypotheses. Hence, the learning process for z₁ cannot be terminated at this point (n), but will have to be continued, with increasing difficulty of learning as a result.

The learning process for z₂, on the other hand, may be terminated already at this point n, provided that so many of the hidden rules (we know that they are there) have been found that also the ratio [d(z, L)]/[z] is close to 0. Recall at this point that we assume that n is reasonably large, such that we may assume that [s(z, L)] and [d(z, L)] are about the same.

Hence, we conclude that z₁ with its larger [s(z, L)]/[z]-ratio is more difficult to learn than z₂. Indeed, it is the behavior of the ratio [s(z, L)]/[z] for increasing [z] that determines whether a string z shall be classified as difficult or easy. Let φ(z) be a delimiting ratio-behavior such that sequences with ratios above φ(z) are classified as difficult and sequences with ratios below φ(z) are classified as easy. Then, without having to go into a specification of φ(z) at this point, we can define relative complexity and simplicity classes with respect to a φ-function as follows.

DEFINITION 4 Γ_φ, the relative d-complexity class with respect to the ratio-function φ, is defined by:

Γ_φ = {z: [s(z, L)]/[z] > φ(z)} (= {z: ℭ(z, [z]φ(z))}).

Σ_φ, the relative d-simplicity class with respect to the ratio-function φ, is defined by:

Σ_φ = {z: [s(z, L)]/[z] ≤ φ(z)} (= {z: 𝔖(z, [z]φ(z))}).

4.3 Numerosity

Consider a primitive learning mechanism that can but distinguish parts of a system, for example the balls on a billiards table. Let us assume that the involved descriptive language, L, has a low power of expressibility such that it cannot express any relations between the parts except being distinct. For example, the color and form of the balls, the way they bounce against each other, are properties that cannot be expressed in L. Furthermore, let us assume that L has an alphabet consisting of just a single letter, a.

A most natural description of a system in this language L will be a string of a's, like aaaa, with a one-to-one correspondence between the letter-occurrences and the parts of the system as "seen" by the learning mechanism. Any such description will be a shortest description, and its length will indicate the number of parts, the numerosity, of the system.

The d-complexity predicate ℭ(z, w) (cf. Definition 2) will become a numerosity predicate "z consists of more than w parts." The d-complexity class with respect to w, C_w, will become the class of systems with more than w parts.

In many contexts we deal with numerosity as a simple form of complexity also when we have at our disposal a more powerful language than that described above. This means, however, that we are then deliberately abstracting from properties that go beyond the mere distinctness of parts that underlies numerosity as a complexity-measure.

4.4 Intricacy and Over-All d-Complexity

Should I try to describe the whole world, I would certainly find it more complicated than if I tried to describe just a part of it, provided that I used one and the same depth of description.

However, even if I restrict myself to systems with the same numerosity, I may get very different impressions of difficulty, as shown by considering the two strings A and B of Section 2.

The part of the complexity impression that is independent of numerosity has to do with the nature of the interrelations of the parts, the intricacy of the system. A reasonable measure of the intricacy of a string z is [s(z, L)]/[z], because [z] measures the numerosity of z (considered as a system of positioned letters). Hence, using the notation of Definition 4, we may define the intricacy class with respect to the ratio-constant α as Γ_α, where α denotes the function φ which has the value α for all z.

Notice that intricacy, as well as numerosity, are complexity measures primarily associated with
strings of fixed length (i.e., the domain of search for the learning process is fixed). On the other hand, Γ_φ, with φ depending on z, was developed with the view that z(n) was a steadily increasing observation report (i.e., the domain of search for the learning process is considered unlimited).

Should I want to compare two fixed strings z₁ and z₂ of different lengths, I may find that the numerosity of z₁ is larger than the numerosity of z₂, and yet that the intricacy of z₁ is smaller than the intricacy of z₂. Which, then, has the greatest over-all complexity? Well, since the strings are fixed, a natural answer is to apply Definition 3, i.e., to compare z₁ and z₂ with respect to the length of their

2, 3, and 4, we obtain the following results concerning the computability of complexity classes.

LEMMA 1 For no universal Turing machine U is s(z, U) a computable (recursive) function of z.

This is Theorem II.2 of Relative Explanations of Systems.¹² It was first proved in Ref. 9.

LEMMA 2 For no universal Turing machine U is there an algorithm (effective process) for the determination of a z such that [s(z, U)] > x, where x is an arbitrarily given number.

This is Theorem II.3 of Relative Explanations of Systems.¹²
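Lemma 1 can be put in miniature with a sketch of my own: any fixed, computable encoding scheme yields only an upper bound on the shortest-description length [s(z, U)], never the value itself. Here two toy schemes are tried, "quote the string literally" and "repeat a block k times", and the better bound is kept; the scheme names and overhead constants are illustrative assumptions, not anything from the paper.

```python
# Lemma 1 in miniature (a sketch): a computable encoder only *upper-bounds*
# the noncomputable [s(z, U)]. Scheme 1 quotes z literally (2 symbols of
# overhead assumed); scheme 2 encodes z as "repeat <block> <count> times".
def upper_bound(z: str) -> int:
    best = len(z) + 2                       # literal scheme: quote the string
    for blen in range(1, len(z) // 2 + 1):  # try every candidate block length
        block = z[:blen]
        if len(z) % blen == 0 and block * (len(z) // blen) == z:
            # cost: the block, the digits of the count, 2 symbols of overhead
            best = min(best, blen + len(str(len(z) // blen)) + 2)
            break
    return best

print(upper_bound("01" * 32))           # regular: bound far below length 64
print(upper_bound("0110100110010110"))  # irregular: only the literal bound
```

Adding more schemes can only lower the bound; by Lemma 1 no computable family of schemes can close the gap to [s(z, U)] for all z.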
(iii) Γ_φ contains no infinite, recursively enumerable subset.

Proof (i): Σ_φ is recursively enumerable, because 𝔖(z, w), and hence also 𝔖(z, [z]φ(z)), is semicomputable (Theorem 2). (ii) Suppose that Γ_φ is finite. Then [s(z, U)] > [z]φ(z) holds for just a finite number of strings z. Let m be the length of the longest of these, and let w = max(m, k) + 1. If a is the number of symbols in the Turing machine alphabet, there are precisely a^(w+1) distinct z-strings of length w + 1, and each of these must have a shortest description such that [s(z, U)] ≤ [z]φ(z) < [z], i.e., a description of at most length w. However, there can be at most a^w + a^(w-1) + ... + a

follows. If a set is recursive (computable), then and only then is there an effective method of deciding for arbitrarily given x:s whether x belongs to the set or not. If a set is recursively enumerable (the extension of a semicomputable predicate), then there is an effective method for deciding, of arbitrarily given x:s that in fact do belong to the set, that they belong to the set, i.e., the set is positively decidable in an effective way. However, if a set is not recursively enumerable, then it is effectively impossible to decide for arbitrarily given x:s whether x belongs to the set, even for all those x:s that in fact do belong to the set.

Hence, the simplicity sets S_w (cf. Definition 3 and Theorem 2) and Σ_φ (cf. Theorem 3) are positively decidable.
passive method the self-reproducing automaton contains within itself a passive description of itself and treats this description in such a way that the description cannot interfere with the automaton's operations. In the active method the self-reproducing automaton examines itself and thereby constructs a description of itself. Von Neumann suggests that this second method would probably lead to paradoxes of the Richard type, and for this reason he adopts the first method.

The two methods of generating self-descriptions can immediately be identified with the two ways of producing a description as are illustrated in Figure 1, namely inheritance and learning.

Inheritance, where a genetic description of an organism is contained within the organism and released in the act of reproduction, corresponds to

measured by the difficulty associated with extracting the interpretation (meaning) of a description.

In the following subsections we will discuss various types of interpretations, and see how corresponding types of i-complexity result.

5.1 Complexity and Syntactical Information

Let W and V be well-formed formulas (wff's), considered as descriptions with reference to a theory T. In a definite sense the meaning of W includes the meaning of V, if V is a consequence of W in T. For example, the meaning of "a snowball" includes the meaning of sentences like "it feels cold in my hands,"
Then the syntactical information¹² of W with respect to T, I(W, T) =df {X: X ∈ L, W ⊢_T X, and not ⊢_T X}, satisfies Definition 6 for |W|-complexity in the sense that:

I(W, T) = I(V, T) if and only if |W| = |V|,

I(W, T) ⊆ I(V, T) if and only if |W| ≤ |V|.

Information, and thus complexity, is frequently associated with independence from the a priori knowledge T (or with degree of freedom). If W and V are two descriptions such that |W| > |V|, then W is more deductively independent than V, because W ⊢_T V but not V ⊢_T W. Independence from T may

emitted from the source. The i-complexity of a whole string A emitted from the source will be I(A) = Σᵢ I(xᵢ), where xᵢ is the symbol at the i-th position of A.

The probability that the i-th symbol of A is x, p(x), is independent of i, and hence the average i-complexity, C(n, H), of a string of length n produced by the source will be C(n, H) = nH, where

H = Σ_{x∈a} p(x) log 1/p(x)

is the entropy of the source. Here we may compare n with the numerosity and H with the intricacy for d-complexities (where, instead, the problem is to
plcxity properties). In the following we will sum- However, if"',(x) is undefined for a given z and x,
marizc thc axiomatic description and provide it is true that a corresponding computation sequ-
motivational aspects partly with reference to the ence y is not defined as a number, but nevertheless
recursive Turing machine predicate 12 T(z, X, y) and an infinite behavior of the machine Z is defined.
partly with reference to the ideas of expected Accordingly it has been suggested to define <!J.(.x) as
difficulties of difficulty classification, advanced in a certain nonnumber (infinity) when "'.(.x) is
Sect ion 4.5. . undefined.
Let us recall that T(z, x, y) is true if and only if t he Condition (ii) implies that there is an effective way
Turing muchinc Z, with code number z, when of deciding, for arbitrarily given z, x, IV, if <!J,(x) ~ II'
starting from the argument x performs the com- is true or false. Here ~ is the "less than" relation
putation sequence Y, with code number y. The among numbers, such that if<!J,(x) is undefined as a
computation sequence is a description of what the machine does, step for step, such that each segment of the sequence describes the state of the machine, the scanning position, and the contents of the tape at the corresponding step of the computation. Thus the computation sequence Y itself, or its code number y, may be taken as a representative of the actual computation and its complexity. However, Y is a complete description of the computation and contains further properties besides those which directly bear on the complexity of computation. In trying to abstract away from such irrelevant properties it has been suggested to measure the complexity of the computation ψ_z(x) with a measure-function Φ_z(x), with natural numbers as values, according to the following definition.⁴

DEFINITION 8  Φ_z(x) is a complexity measure for the computation ψ_z(x) if:

i) Φ_z(x) is defined if and only if ψ_z(x) is defined,
ii) the simplicity predicate Φ_z(x) ≤ w is computable in z, x, w.

Let us first see how these two properties of a complexity measure Φ_z(x) are to be found among the properties of the computation sequence itself.

i) By definition ψ_z(x) is defined if and only if ∃y T(z, x, y), i.e., if and only if the code number y (= Φ_z(x)) is defined.
ii) The predicate T(z, x, y) is recursive, and with it the predicate ∃y(y ≤ w and T(z, x, y)) (= Φ_z(x) ≤ w), because the existential quantification is bounded.

Condition (i) is natural because ψ_z(x) is defined if and only if the corresponding computation sequence Y is defined, and a complexity of computation, Φ_z(x), is not naturally defined if the computation is not defined. Conversely, if the computation Y is defined, its complexity should also be defined. Otherwise the measure would not be considered sufficiently well developed. If, on the other hand, Φ_z(x) is not defined as a number, then it is false that Φ_z(x) ≤ w and, of course, also false that Φ_z(x) > w.

Why should, in this case of i-complexity (complexity of interpreting), the predicate Φ_z(x) ≤ w be computable (recursive)? After all, in the case of d-complexity (complexity of describing), the corresponding predicate δ(z, w), i.e., [s(z, L)] ≤ w, is in general noncomputable (cf. Definition 2 and Theorem 2).

A machine oriented motivation of condition (ii) may be developed as follows. The type of interpretation we are considering, namely an interpretation of a description as a program that orders the machine how to compute, is obviously effective (computable). Describing processes, on the other hand, are in general noneffective (cf. Section 4.5). Now, the difficulty of interpreting is, in the actual case of computational complexity, realized by the computer (for example as a length of computation, number of tape positions used, etc.). Hence, if Φ_z(x) is defined as a number, that number is computable and with it the predicates Φ_z(x) ≤ w and Φ_z(x) > w. On the other hand, if Φ_z(x) is not defined as a number, the defined program ⟨z, x⟩ still effectively determines a machine behavior, an infinite behavior, and we can from the triple z, x, w effectively determine that Φ_z(x) ≤ w is false (but not that Φ_z(x) > w is false; cf. the following Theorem 5), at least if Φ_z(x) is measured by y.

A more abstract motivation of condition (ii) may be developed with reference to the idea that a complexity classification should be more difficult than a simplicity classification (cf. Section 4.5). To begin with, let us observe that condition (i) implies that the complexity predicate Φ_z(x) > w (cf. Γ(z, w) of Definition 2) is noncomputable.

THEOREM 5  Let Φ_z(x) be any functions that satisfy condition (i) of Definition 8, such that Φ_z(x), when defined, takes values from the natural numbers 1, 2, 3, .... Then the complexity predicate Φ_z(x) > w is noncomputable.
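The computability asserted in condition (ii), and the asymmetry that Theorem 5 then establishes, can be illustrated with a small step-counting sketch. The following Python toy is our own illustration, not the paper's (or Blum's) formalism: a program is modelled as a generator that yields once per computation step, the number of yields before halting plays the role of Φ_z(x), and the simplicity predicate Φ_z(x) ≤ w is decided by simulating at most w + 1 steps, even when the computation never halts.

```python
# Minimal sketch (illustrative assumption, not the paper's formalism):
# a "machine" is a Python generator yielding once per computation step.

def collatz(x):
    """psi_z(x): iterate x -> x/2 or 3x+1 until 1; one yield per step."""
    while x != 1:
        yield                      # one computation step
        x = x // 2 if x % 2 == 0 else 3 * x + 1
    return x

def loop_forever(x):
    """A computation that is undefined: Phi_z(x) is not a number."""
    while True:
        yield

def simple(program, x, w):
    """The simplicity predicate Phi_z(x) <= w of condition (ii):
    computable, since at most w + 1 simulated steps are needed."""
    gen = program(x)
    for _ in range(w + 1):
        try:
            next(gen)
        except StopIteration:      # halted within the budget of w steps
            return True
    # More than w steps used.  For a nonhalting program this correctly
    # reports Phi_z(x) <= w false; certifying that Phi_z(x) > w is ALSO
    # false would require solving the halting problem (cf. Theorem 5).
    return False

print(simple(collatz, 6, 8))          # True:  6->3->10->5->16->8->4->2->1
print(simple(collatz, 6, 7))          # False: 8 steps are needed
print(simple(loop_forever, 0, 1000))  # False, after finitely many steps
```

The bounded simulation is exactly why the simplicity predicate is decidable while the complexity predicate Φ_z(x) > w is only semicomputable.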
Proof  Let us assume that the complexity predicate Φ_z(x) > w is computable. Then the simplicity predicate Φ_z(x) ≤ w will also be computable. This can be seen as follows. The simplicity predicate is equivalent with a bounded existential quantification over complexity predicates:

   Φ_z(x) ≤ w ≡ ∃y ((0 < y ≤ w) and (Φ_z(x) = y))
              ≡ ∃y ((0 < y ≤ w) and (Φ_z(x) ≯ y) and (Φ_z(x) > y-1)).

Hence, since the complexity predicate is assumed computable, the simplicity predicate will also be computable. Finally, by condition (i) of Definition 8, ψ_z(x) is defined if and only if Φ_z(x) is defined, and Φ_z(x) is undefined if and only if both Φ_z(x) > w and Φ_z(x) ≤ w are false. Hence we can effectively decide if ψ_z(x) is defined or not. This contradiction against the non-computability of the halting problem for Turing machines proves the theorem.

Hence, if we accept condition (i), the complexity predicate will be noncomputable, and the computability of the simplicity predicate, i.e., condition (ii), is a natural assurance for the complexity classifications to be more difficult than the simplicity classifications. Furthermore, condition (ii) implies that the complexity predicate is semicomputable and that Φ_z(x) is partially computable in z and x.

THEOREM 6  Let Φ_z(x) be a complexity measure according to Definition 8. Then Φ_z(x) is partial recursive in z and x, and the complexity predicate Φ_z(x) > w semicomputable (but not computable; cf. Theorem 5).

Proof  Since Φ_z(x) ≤ w, according to condition (ii), is computable, the function f(z, x) =def μw[Φ_z(x) ≤ w] is partially computable and equal to Φ_z(x), i.e., Φ_z(x) is partially computable in z, x. Hence the complexity predicate Φ_z(x) > w is semicomputable.

Definition 8 has interesting consequences concerning the computational complexity of functions. Notice that Φ_z₀(x₀), if defined, specifies the complexity of computing the specific function value ψ_z₀(x₀), when computed with the computation sequence y₀ such that T(z₀, x₀, y₀). On the other hand, the complexity of computing a whole function, say ψ_z₀, will have to be determined by z₀, i.e., by Φ_z₀ rather than by an isolated computation sequence y₀ (determined by ⟨z₀, x₀⟩). In the case of computing functions rather than function values we will thus speak of computations specified by z-values rather than by y-values.

Given a function φ(x), it may be computed by an infinity of computations

   z₁, z₂, ...   (φ(x) = ψ_z₁(x) = ψ_z₂(x) = ...).

It is natural to ask for a computation zᵢ with a smallest complexity. On that point Definition 8 implies the interesting result that there are computable functions that do not have a computation of minimal complexity. The following theorem is proved by Blum.⁴

THEOREM 7  Let Φ_z(x) be a complexity measure according to Definition 8, and let f(x) be any monotone increasing recursive function. Then there exists a recursive function φ(x) having no simplest computation in the following sense. Given any computation z₁ for φ (ψ_z₁(x) = φ(x)), there exists another computation z₂ for φ (ψ_z₂(x) = φ(x)) such that f(Φ_z₂(x)) ≤ Φ_z₁(x) for all sufficiently large x.

Thus, however fast growing the function f is chosen to be, there is a function φ such that any computation z₁ for φ will turn out to be at least f times as complex as needed by another computation z₂. This reasoning can again be applied on the computation z₂, and so on.

It is true, however, that function complexities here are being compared for all but a finite number of arguments ("for almost all x"). It may thus happen that those arguments, that for some reason are found to be the most interesting, all belong to the finite number which are excluded by the above theorem (a so called speed-up theorem).

Computational complexity is a well studied field although rather broad and, perhaps, not very clearly defined. Some authors include within computational complexity not only complexity of computations of functions (a so called dynamic measure of computational complexity), but also complexity of describing functions (with their computational programs; a so called static measure of computational complexity). In our terminology, a static measure of computational complexity is a d-complexity, whereas a dynamic measure is an i-complexity. This is why we have studied questions concerning static measures already in Section 4.5. Furthermore, computational aspects on proofs in formal theories (cf. the computability requirements on the axioms and rules of inference in a formal theory) may suggest the area of proof-complexity as a branch of computational complexity.

Parts of proof-complexity are akin to d-complexity. For example when the complexity is considered as the difficulty of finding (learning) a
proof to a given theorem, or when considered as the difficulty of memorizing a given proof. On the other hand, the work needed to check if an alleged proof really is a proof is akin to computational complexity in the form of i-complexity (also akin to the work of checking the wellformedness of a given formula that implies part of the meaning of the formula).

Concerning proof-complexity measured by the length of the proofs we want to mention a result of Gödel-Mostowski¹⁸ which has a certain resemblance with Theorem 7. Let S be a first-order arithmetic and S₁ an extended arithmetic, essentially containing also second-order quantifications (for details, see¹⁸). Then there are wffs W which are theorems in both S and S₁ such that the shortest S₁-proofs of W are shorter than the shortest S-proofs by more than any prescribed recursive factor.

6.1 On von Neumann's Cut-Point for Complexity

In his work on self-reproducing automata, von Neumann²⁵ considered the high potentiality to do things that may be found in very complicated automata. Indeed, he more or less identified complexity with such a potentiality (and thereby used complexity in a restricted sense that we have tried to avoid in this more general study of the concept of complexity). On p. 78 of his book²⁵ he writes:

There is a concept which will be quite useful here, of which we have a certain intuitive idea, but which is vague, unscientific, and imperfect. This concept clearly belongs to the subject of information, and quasi-thermodynamical considerations are relevant to it. I know no adequate name for it, but it is best described by calling it "complication." It is effectivity in complication, or the potentiality to do things. I am not thinking about how involved the object is, but how involved its purposive operations are. In this sense, an object is of the highest degree of complexity if it can do very difficult and involved things.

Evolving biological organisms have the ability to produce something more complicated than themselves. Certain artificial automata on the other hand, like machine tools, are more complicated than the elements which can be made with them. Von Neumann suggests the hypothesis that complication is degenerative below a certain level, but that above this cut-point automata may produce automata of equal or higher complication.

It would seem that von Neumann here in fact is using distinct meanings of complexity, partly the potentiality aspect of i-complexity and partly a d-complexity akin to numerosity. Without a more precise definition of complexity, von Neumann's cut-point hypothesis may lack in contents, and it is often remarked that trivially simple automata may be considered self-reproductive.

Since the basic question of concern is the potentiality aspect of complexity, we want to consider i-complexities alone, first in connection with a nonreferential type of interpretation and later in connection with a referential.
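The two meanings distinguished above, a d-complexity of the description itself versus an i-complexity of interpreting it, can be made concrete with a small sketch. The Python toy below is our own illustration (the names phi, d_measure and i_measure are assumptions, not the paper's notation): two descriptions z₁ and z₂ of the same function, where the shorter description is by far the costlier to interpret, in the spirit of the trade-off behind Blum's speed-up theorem.

```python
# Two *descriptions* (program texts) of the same function phi(n) = fib(n).
Z1 = """def phi(n, tick):
    tick()
    return n if n < 2 else phi(n - 1, tick) + phi(n - 2, tick)
"""

Z2 = """def phi(n, tick):
    a, b = 0, 1
    for _ in range(n):
        tick()
        a, b = b, a + b
    return a
"""

def d_measure(z):
    """Static measure (a d-complexity): the length of the description."""
    return len(z)

def i_measure(z, n):
    """Dynamic measure (an i-complexity): steps used when the description
    is interpreted as a program and run on the argument n."""
    steps = [0]
    def tick():                  # crude stand-in for a machine step
        steps[0] += 1
    env = {}
    exec(z, env)                 # interpretation of the description
    value = env["phi"](n, tick)
    return value, steps[0]

v1, s1 = i_measure(Z1, 20)
v2, s2 = i_measure(Z2, 20)
print(v1 == v2)                          # True: the same function phi
print(d_measure(Z1) < d_measure(Z2))     # True: z1 is the shorter description
print(s1 > s2)                           # True: but far costlier to interpret
```

The example shows why the two kinds of measure can rank the same pair of programs in opposite orders, which is what gives the speed-up phenomenon its room to operate.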
As a starting point for the development of a nonreferential complexity aspect, let us consider a suggestion of von Neumann that when an automaton is not very complicated, the description of the functions of that automaton is simpler than a description of the automaton itself, but that the situation is reversed with respect to complicated automata. The suggestion has been commented upon by Burks and Gödel; see pp. 54-56 of Ref. 25.

Let T be a theory of automata such that T(w_A), i.e., T augmented with the proper axioms w_A, describes the automaton A. In the same way, let T(w_B) describe the behavior of the automaton A, provided that w_B are the full behavior axioms for A. For automata like Turing machines it can happen

If the surrounding is thought of as neutral with respect to the describing process, which is a natural assumption in the case of constructing automata, then we have to consider whether the description is to be produced by the automaton according to the passive or active method (cf. Section 4.6).

Let us assume that the passive method is at play (cf. our intention to consider i-complexities alone). Let a₁ be a productive automaton with a behavior which is interpreted by the surrounding as a description of another automaton a₂ which again is productive, etc. Then, since aᵢ produces aᵢ₊₁, we can say that aᵢ is more complex than aᵢ₊₁ if we consider a complexity which results from an i-complexity of interpretation.
6.2 Limitations for an Effective Increase of the Length of Randomized Sequences

In discussing von Mises' axiom of randomness Popper writes:

The axiom of randomness or, as it is sometimes called, "the principle of the excluded gambling system," is designed to give mathematical expression to the chance-like character of the sequence. Clearly, a gambler would be able to improve his chances by the use of a gambling system if sequences of penny tosses showed regularities such as, say, a fairly regular appearance of tails after every run of three heads. Now the axiom of randomness postulates of all collectives that there does not exist a gambling system that can be successfully applied to them.

Instead of founding the concept of randomness upon the powers of gamblers, I suggested that we replace the gambler with an effective machine, a Turing machine, and consider a sequence as random if no Turing machine can find any rules in it with which to shorten the description of it (if there are rules they can be used to shorten the description; cf. Sections 4.2 and 4.5). Hence I suggested that a sequence is randomized if its shortest description with respect to a universal Turing machine is maximal, and that a sequence is the more randomized the longer its shortest description.

DEFINITION 9  Let z₁ and z₂ be two strings of equal length, written on the alphabet of a universal Turing machine U. Then, in relation to U, z₁ is more randomized than z₂ if and only if s(z₁, U) > s(z₂, U).

With U defining the semantics of L, Definitions 9 and 2 immediately suggest that randomization be identified with d-complexity as follows.

COROLLARY of Definitions 2 and 9  Let z₁ and z₂ be two strings of equal length, written on the alphabet of a universal Turing machine U. Then z₁ is more randomized than z₂ if and only if z₁ is more d-complex than z₂ according to Definition 2 (with s(z, U) substituted for s(z, L)).

Argument  z₁ is more d-complex than z₂ according to Definition 2 if and only if there is a number w such that Γ(z₁, w) and δ(z₂, w), i.e., such that z₁ is complex and z₂ simple with respect to w. This is equivalent with the condition that s(z₁, U) > s(z₂, U), i.e., with the condition that z₁ is more randomized than z₂.

I previously defined¹² a degree of randomization as follows and showed that there is no effective way of generating arbitrarily long sequences randomized to a nonzero degree.

DEFINITION 10  A string z of length v, written on an alphabet of n symbols, is randomized to a degree r in relation to U if s(z, U) ≥ s(w, U) holds for r·nᵛ of the nᵛ strings w of length v, where r is a rational number such that 0 < r ≤ 1.

THEOREM 9  For no universal Turing machine U and for no degree of randomization r > 0 is there an algorithm for the generation of a string z(v), where v is an arbitrarily given length of the string, such that z(v) is randomized to degree r relative to U.

In view of the above Corollary we may as well interpret Theorem 9 as saying that there is no effective way of generating sequences with an arbitrarily large over-all d-complexity (cf. Section 4.4). (Cf. also a theorem¹⁰ which says that randomized sequences can only be effectively generated provided that they are shorter than a fixed length, which depends on U.)

Other definitions of randomized sequences have been suggested which also involve the notion of effective computability in one way or another.⁶ ¹⁷ It would seem that they all imply that it is impossible to effectively generate arbitrarily long randomized sequences.

7 REDUCTION OF COMPLEXITY

d-complexities are usually wanted small, just as we want the difficulties of learning or describing our surroundings to be made small.

Let us recall that a d-complexity is not an intrinsic property of objects but rather a concept that depends on our learning capacity. Hence we may well consider reducing the complexity of an object, not by changing the object, but by changing our views of it, for example by describing it in a more powerful language.

We want to illustrate such a complexity reduction with a type of reduction that occurs within our cerebral description processes and which is likely to generate our existential perceptions of concrete objects. Thus we will outline a (foundational) study of objects by looking into their nature as perceptions. The interested reader is referred to Löfgren¹⁵ for a fuller study. Distinct methodological studies of things and objects have been given by Bunge⁵ and Goguen.⁸

Let us first notice that it is not the objects of the real world that are the entries to our cerebral description mechanisms. Rather, it is an enormous
data-flow into our receptors that is the direct entry. In trying to describe that flow, in itself indescribably large, the description process must reduce its complexity. We want to suggest that it is this reduction that calls forth perceptions of existing objects. Thus the objects, in terms of which we "see" the real world, are the results of the cerebral description process rather than being the entries to that process.

Let us, for the development of the arguments, recall a principle according to which the extensions of concepts vary inversely with their intensions. Nagel considers it a logical principle. Let Ext A = {A₁, A₂, ...} be the extension of a concept A.

Complexity reduction becomes possible when the description process is sufficiently advanced to allow a renaming process, essentially of a type known from definability theory.

For example, if Wx is a very large wff that occurs frequently in a description, it may be economical to give it a short name and to use this name, instead of Wx itself, wherever possible in the description. For example, if Q is a not yet used predicate symbol, such a naming can be accomplished by adding the wff ∀x(Qx ≡ Wx), the definition, as an axiom to the actual theory (description). Such a naming process will be harmless in the sense that it fulfills the requirements of eliminability and noncreativity. Yet it can be quite effective in the sense that it may shorten the description of the data-flow considerably.
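The economy of such a renaming can be made concrete. The toy below is our own illustration, not the paper's formalism: descriptions are treated simply as symbol strings, and the wff Wx and the function name description are assumptions for the sketch. Introducing the definition ∀x(Qx ≡ Wx) costs a fixed number of symbols once, while every later occurrence of the long wff Wx shrinks to the single predicate symbol Q.

```python
# Hedged sketch: measure how a definitional extension (a renaming)
# shortens a description.  Wx stands for a large, frequently used wff.
Wx = "(Px & Rxy & ~Sx & (Ey)(Txy v Uxy))"

def description(occurrences, use_name):
    """Build a description containing `occurrences` copies of Wx,
    either verbatim or via the freshly defined predicate symbol Q."""
    if use_name:
        axioms = ["(x)(Qx == " + Wx + ")"]   # the definition, added once
        body = ["Qx"] * occurrences          # each occurrence is now short
    else:
        axioms = []
        body = [Wx] * occurrences            # each occurrence is verbatim
    return " & ".join(axioms + body)

# A frequent wff is worth naming; a rare one is not.
print(len(description(10, use_name=True)) < len(description(10, use_name=False)))  # True
print(len(description(1, use_name=True)) > len(description(1, use_name=False)))    # True
```

The break-even point depends on the length of Wx and on how often it recurs, which is exactly the sense in which the naming is "harmless but effective": it changes the description's length, not its content.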
Whether existential perceptions arise first at a certain level of confirmation, or first at a certain level of expressibility of the language used in the description process, is beyond the reach of this essay. Instead we have intended to support the likelihood of just existential perceptions, taken for granted that perceptions in general do occur.

As we have seen, the possibility of reducing complexity depends essentially on the nature of the language. Our ideas about languages are mostly based on our acquaintance with the external communication languages. It is important, however, to conceive of languages very broadly when associating general learning processes with languages (in which the produced descriptions are

6. A. Church, "On the Concept of a Random Sequence." Bull. Amer. Math. Soc., 46, 1940, pp. 130-135.
7. M. Davis, Computability and Unsolvability. McGraw-Hill, New York, 1958.
8. J. Goguen, "Objects." Int. J. General Systems, 1, 1975, pp. 237-243.
9. L. Löfgren, "Recognition of Order and Evolutionary Systems." Computer and Information Sciences, Vol. II, edited by J. Tou, Academic Press, New York, 1967.
10. L. Löfgren, "Relative Recursiveness of Randomization and Law-Recognition." Notices Amer. Math. Soc., 16, 1969, p. 685.
11. L. Löfgren, "Complexities of Descriptions and Random Numbers." J. Symbolic Logic, 36, 1971, p. 360. [Abstract.]
12. L. Löfgren, "Relative Explanations of Systems." Trends in General Systems Theory, edited by G. Klir, John Wiley, New York, 1972.
13. L. Löfgren, "On the Formalizability of Learning and