Proc. Natl. Acad. Sci. USA, Vol. 96, pp. 3257–3262, March 1999
Neurobiology
Measures of degeneracy and redundancy in biological networks
Giulio Tononi, Olaf Sporns, and Gerald M. Edelman
The Neurosciences Institute, 10640 John J. Hopkins Drive, San Diego, CA 92121
Contributed by Gerald M. Edelman, December 29, 1998
ABSTRACT    Degeneracy, the ability of elements that are structurally different to perform the same function, is a prominent property of many biological systems ranging from genes to neural networks to evolution itself. Because structurally different elements may produce different outputs in different contexts, degeneracy should be distinguished from redundancy, which occurs when the same function is performed by identical elements. However, because of ambiguities in the distinction between structure and function and because of the lack of a theoretical treatment, these two notions often are conflated. By using information theoretical concepts, we develop here functional measures of the degeneracy and redundancy of a system with respect to a set of outputs. These measures help to distinguish the concept of degeneracy from that of redundancy and make it operationally useful. Through computer simulations of neural systems differing in connectivity, we show that degeneracy is low both for systems in which each element affects the output independently and for redundant systems in which many elements can affect the output in a similar way but do not have independent effects. By contrast, degeneracy is high for systems in which many different elements can affect the output in a similar way and at the same time can have independent effects. We demonstrate that networks that have been selected for degeneracy have high values of complexity, a measure of the average mutual information between the subsets of a system. These measures promise to be useful in characterizing and understanding the functional robustness and adaptability of biological networks.
There are many examples of biological systems composed of elements that are structurally different but which, under certain conditions, can perform similar functions. Classic illustrations include nucleotide triplets degenerate in the third position that code for the same amino acid, different proteins that catalyze the same enzymatic reaction, and different genotypes that produce the same phenotype. In a neurobiological context, the capability of different, nonisomorphic structures to yield isofunctional outputs, effects, or consequences has been called degeneracy (1, 2). This term previously was used in immunology to refer to the ability of different antibodies to bind to the same antigen (3). In physics, the term degeneracy is applied to systems taking on several discrete or distinct energy values or states.

Examples of degeneracy in neurobiology abound (2). The convergent-divergent connectivity of the brain suggests that large numbers of neuronal groups are able to affect the output of any chosen subset of neurons in a similar way. For example, a large number of different brain structures can influence, in series or in parallel, the same motor outputs, and after localized brain lesions, alternative pathways capable of generating functionally equivalent behaviors frequently emerge (4). Degeneracy also is encountered at the neuronal level. Neural signaling mechanisms typically use parallel as well as converging pathways of transmitters, receptors, kinases, phosphatases, and second messengers. A constrained set of outputs such as specific changes in the levels of second messengers or in gene expression thus can be brought about by a large number of different input combinations.

Although many similar examples exist in all fields and levels of biology, a specific notion of degeneracy has yet to be firmly incorporated into biological thinking, largely because of the lack of a formal theoretical framework. As a consequence, instances of degeneracy often are not treated as exemplifications of a biological principle, but are discounted as redundancy. In technical usage, redundancy refers to duplication or repetition of elements within electronic or mechanical components to provide alternative functional channels in case of failure. In information theory, it refers to the repetition of parts or all of a message to circumvent transmission error. Although the structural definition of redundancy applies readily in engineering, the requirement of identity among elements in a population or in biological networks is rarely satisfied and in any case is difficult to assess. The biological examples mentioned above, for instance, do not in general involve structurally identical parts. This fact has important consequences, because unlike redundant elements, degenerate elements may produce different outputs in different contexts.

In the present paper, we propose to clarify these issues by moving explicitly to the functional level, defining both degeneracy and redundancy in information theoretical terms. This approach allows us to distinguish between degeneracy and redundancy within a unified framework and, through the use of well-defined measures, to make both concepts operationally useful. Although these measures are applied here to neural examples, in principle they may be extended to any biological network or complex system.
THEORY 
As in previous papers, we consider neural systems consisting of a number of elementary components, which we take to represent neuronal groups sharing anatomical connections (5, 6). We then study the dynamic interactions between these elements and an output sheet, assuming that the global statistical properties of a system do not change with time (stationarity). With these assumptions in place, we characterize degeneracy and redundancy in terms of the average mutual information between subsets of elements within a neural system and the output sheet.

Consider a neural system X with n elements whose activity is described by a Gaussian stationary multidimensional stochastic process (7) that produces a set of outputs by means of an output sheet O and a fixed connectivity matrix CON(X;O) between O and a subset of the system units (Fig. 1A). The joint probability density function describing such a multivariate process, corresponding here to its functional connectivity, can be characterized (7, 8) in terms of entropy (H) and mutual information (MI). Entropy and mutual information are used here in their statistical connotation; they can be thought of as
multivariate generalizations of variance and covariance in univariate statistics that are sensitive both to linear and nonlinear interactions. For instance, consider a jth subset X_j^k of the system X composed of k units, and the output sheet O. The MI between X_j^k and O is given by:

$$\mathrm{MI}(X_j^k;O) = H(X_j^k) + H(O) - H(X_j^k,O), \qquad [1]$$

where H(X_j^k) and H(O) are the entropies of X_j^k and O considered independently, and H(X_j^k,O) is the joint entropy of subset X_j^k and output O. Thus, MI(X_j^k;O) measures the portion of entropy shared by the system subset X_j^k and the output O. It is high if both X_j^k and O have high entropy (high variance) and share a large fraction of it (high covariance); it is low or zero if X_j^k and O have low entropy or are statistically independent.
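Under the paper's Gaussian assumptions, these entropies reduce to log determinants of covariance matrices, so Eq. 1 can be evaluated directly from a joint covariance matrix. The following is a minimal sketch in Python/NumPy; it is our illustration rather than the authors' code, and the function names are ours:

```python
import numpy as np

def gaussian_entropy(cov):
    # Differential entropy (in nats) of a Gaussian with covariance `cov`:
    # H = 0.5 * log((2*pi*e)^n * det(cov)); slogdet avoids under/overflow.
    cov = np.atleast_2d(cov)
    n = cov.shape[0]
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def mutual_information(cov, idx_a, idx_b):
    # Eq. 1: MI(A;B) = H(A) + H(B) - H(A,B), where A and B are disjoint
    # groups of variables given as index lists into the joint covariance.
    H_a = gaussian_entropy(cov[np.ix_(idx_a, idx_a)])
    H_b = gaussian_entropy(cov[np.ix_(idx_b, idx_b)])
    joint = list(idx_a) + list(idx_b)
    H_ab = gaussian_entropy(cov[np.ix_(joint, joint)])
    return H_a + H_b - H_ab
```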
Degeneracy and Redundancy.
By itself, mutual information is a measure of the deviation from statistical independence between two subsets and is insensitive to the direction of the interaction: a positive value of mutual information between X_j^k and O could be the result of a causal effect of X_j^k on O through forward connections, of O on X_j^k through back connections, or of another subset simply providing statistically correlated input to both X_j^k and O. Here we are interested only in measuring the causal effects on the output of changes in the state of subsets of the system. To evaluate these causal effects, we consider the change in mutual information between each subset X_j^k and O when that subset is injected with a fixed amount of variance (uncorrelated random noise, Fig. 1B). We call the mutual information value obtained by such perturbation of each subset MI_P(X_j^k;O). For simplicity, we shall assume that before injecting the variance the variance of the system is zero, so that the initial value of mutual information between the system and the output is also zero.

We define the degeneracy D_N(X;O) of X with respect to O as:

$$D_N(X;O) = \sum_{k=1}^{n} \left[ \left\langle \mathrm{MI}_P(X_j^k;O) \right\rangle - (k/n)\,\mathrm{MI}_P(X;O) \right], \qquad [2a]$$

where the angle brackets denote the average over all subsets of size k (index j). According to Eq. 2a, D_N(X;O) is high when the mutual information between the whole system (k = n) and the output is high and at the same time the average mutual information between small subsets of the system (small values of k) and the output is higher than would be expected from a linear increase over increasing subset size (Fig. 2A).
D_N(X;O) can be expressed in mathematically equivalent ways. In particular, D_N(X;O) corresponds to the average mutual information shared between bipartitions of X and the output O, summed over all bipartition sizes:

$$D_N(X;O) = \frac{1}{2} \sum_{k=1}^{n} \left\langle \mathrm{MI}_P(X_j^k;X-X_j^k;O) \right\rangle, \qquad [2b]$$

where the mutual information that is shared between X_j^k, X−X_j^k, and O is MI_P(X_j^k;X−X_j^k;O) = MI_P(X_j^k;O) + MI_P(X−X_j^k;O) − MI_P(X;O). Thus, according to Eq. 2b, D_N(X;O) is high when, on average, the mutual information shared between any bipartition of the system and the output is high (Fig. 2B).
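Given any routine mi_p that returns MI_P(X_j^k;O) for a subset of elements (for example, the perturbation procedure described under Implementation below), Eq. 2a can be evaluated by exhaustive averaging over subsets. A sketch, again ours rather than the authors' code:

```python
from itertools import combinations

def degeneracy_DN(n, mi_p):
    # Eq. 2a: D_N(X;O) = sum_k [ <MI_P(X_j^k;O)> - (k/n) * MI_P(X;O) ],
    # where <.> averages over all subsets of size k. `mi_p` maps a tuple
    # of element indices to the mutual information under perturbation.
    mi_full = mi_p(tuple(range(n)))
    total = 0.0
    for k in range(1, n + 1):
        subsets = list(combinations(range(n), k))
        avg_mi = sum(mi_p(s) for s in subsets) / len(subsets)
        total += avg_mi - (k / n) * mi_full
    return total
```

Exhaustive enumeration is exponential in n; for larger systems the inner average would be taken over a random sample of subsets of each size.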
We now define the redundancy R(X;O) of X with respect to O as the difference between the summed mutual information upon perturbation between the n subsets of size k = 1 and O and the mutual information upon perturbation between the entire system (k = n) and O:

$$R(X;O) = \sum_{j=1}^{n} \mathrm{MI}_P(X_j^1;O) - \mathrm{MI}_P(X;O). \qquad [3]$$
According to this definition, redundancy is high if the sum of the mutual information between each element and the output is much larger than the mutual information between the entire system and the output. This means that each of the elements of the system contributes similar information with respect to the output. Redundancy is zero if all elements of the system contribute to the output independently and the mutual information between the entire system and O is equal to the sum of the mutual information between each element of the system and O.

Based on this definition, we also can express D_N(X;O) in terms of average redundancy values with respect to O for increasing subset sizes:

$$D_N(X;O) = \sum_{k=1}^{n} \left[ (k/n)\,R(X;O) - \left\langle R(X_j^k;O) \right\rangle \right]. \qquad [4]$$
According to Eq. 4, D_N(X;O) is high when the redundancy of the system (i.e., for k = n) with respect to the output is high and at the same time the average redundancy for small subsets (small values of k) is lower than would be expected from a linear increase over increasing subset size (Fig. 2C).
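Redundancy (Eq. 3) is simple enough to state directly in code; a short sketch under the same conventions as the previous example:

```python
def redundancy(n, mi_p):
    # Eq. 3: R(X;O) = sum_j MI_P(X_j^1;O) - MI_P(X;O); zero when all
    # elements contribute to the output independently.
    return sum(mi_p((j,)) for j in range(n)) - mi_p(tuple(range(n)))
```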
Fig. 1. Schematic diagram illustrating bases for the proposed measure of degeneracy. (A) We consider a system X, composed of individual elements (gray circles, n = 8) that are interconnected (arrows) among each other [with a connection matrix CON(X)] and that also have connections to a set of output units O [with a connection matrix CON(X;O)]. (B) A subset (shaded in gray) of the system X is perturbed by injecting (large Vs at the top) a fixed amount of variance (uncorrelated noise) into each of its constituent units. This perturbation activates numerous connections within the system (thick arrows) and produces changes in the variance of a number of units within X and O. The resulting mutual information under perturbation MI_P(X_j^k;O) is computed (see Eq. 1). This procedure is repeated for all subsets of sizes 1 ≤ k ≤ n of the system X.

Fig. 2. Graphical representation of different expressions for degeneracy. (A) Degeneracy expressed in terms of the average mutual information between subsets of X and O under perturbation (see Eq. 2a). (B) Degeneracy expressed in terms of the average mutual information shared between bipartitions of X and O (see Eq. 2b). (C) Degeneracy expressed in terms of the average redundancy (see Eq. 4). A graphical interpretation for the degeneracy D(X;O) (see Eq. 5) is indicated as a dotted rectangular area with height corresponding to that of the bar at n − 1.
A measure of degeneracy that does not require averaging among different subsets is also usefully introduced. This measure, which we shall designate by italics as D(X;O) and which is related to degeneracy D_N(X;O), measures the portion of the entropy of the output that is jointly accounted for by different elements of the system. It is given by:

$$D(X;O) = \sum_{j=1}^{n} \mathrm{MI}_P(X_j^1;X-X_j^1;O) - R(X;O) = \mathrm{MI}_P(X;O) - \sum_{j=1}^{n} \mathrm{MI}_P(X_j^1;O \mid X-X_j^1;O), \qquad [5]$$

where the sum is over the n elements and MI_P(X_j^1;O|X−X_j^1;O) is the conditional mutual information between each element and O given the mutual information between the rest of the system and O. The relationship between D_N(X;O) and D(X;O) is displayed graphically in Fig. 2C.
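Using the identity MI_P(X_j^k;X−X_j^k;O) = MI_P(X_j^k;O) + MI_P(X−X_j^k;O) − MI_P(X;O) given after Eq. 2b, the first equality of Eq. 5 can be sketched as follows (our illustration; `redundancy` is the function defined above):

```python
def degeneracy_D(n, mi_p):
    # Eq. 5, first equality:
    # D(X;O) = sum_j MI_P(X_j^1; X-X_j^1; O) - R(X;O).
    full = tuple(range(n))
    mi_full = mi_p(full)
    shared = 0.0
    for j in range(n):
        rest = tuple(i for i in range(n) if i != j)
        # Three-way shared information between element j, the rest, and O.
        shared += mi_p((j,)) + mi_p(rest) - mi_full
    return shared - redundancy(n, mi_p)
```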
Relationship to Complexity and Integration.
In previous work (5), we introduced a measure called neural complexity C_N(X), which measures the degree to which an isolated system is both functionally integrated (different elements of the system are integrated and behave coherently) and functionally segregated or specialized (different elements are relatively independent and the system is highly differentiated). We also introduced the system integration I(X), a measure of the total reduction of entropy because of the interactions within the system (5). The expressions for complexity and integration of a system bear a striking formal resemblance to the expressions for the degeneracy and redundancy of a system with respect to an output (Fig. 3). If, in the expression for D_N(X;O), one replaces MI_P(X_j^k;O), the mutual information between each subset and the output, by H(X_j^k), the entropy of each subset, one obtains C_N(X) and the equations defining the two measures are formally identical.

As defined, I(X) = Σ H(x_i) − H(X), i.e., the difference between the sum of the entropies of the n individual components considered independently and the entropy of X considered as a whole [compare Eq. 3; compare also the original information theoretical expression for redundancy (7) as the ratio (Σ H(x_i) − H(X)) / Σ H(x_i)]. The neural complexity C_N(X) of system X is defined in three mathematically equivalent ways (Fig. 3 A–C):

$$C_N(X) = \sum_{k=1}^{n} \left[ \left\langle H(X_j^k) \right\rangle - (k/n)\,H(X) \right] = \frac{1}{2} \sum_{k=1}^{n} \left\langle \mathrm{MI}(X_j^k;X-X_j^k) \right\rangle = \sum_{k=1}^{n} \left[ (k/n)\,I(X) - \left\langle I(X_j^k) \right\rangle \right]$$

(compare with Eqs. 2a, 2b, and 4). A related expression for complexity (9) that does not require averaging among different subsets measures the portion of the entropy of a system that is accounted for by the interactions among its elements (Fig. 3C). This measure, which we shall designate by italics as C(X), is given by

$$C(X) = \sum_{j=1}^{n} \mathrm{MI}(X_j^1;X-X_j^1) - I(X) = H(X) - \sum_{j=1}^{n} H(X_j^1 \mid X-X_j^1),$$

where H(X_j^1|X−X_j^1) is the conditional entropy of each element given the entropy of the rest of the system (compare Eq. 5). Note that both C_N(X) and C(X) have zero values for systems composed of disconnected elements, low values for systems composed of elements that are integrated and homogeneous (undifferentiated), and high values for systems that are both integrated and differentiated.
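Because C(X) involves only entropies of the isolated system, it can be computed from a single covariance matrix. A sketch (ours), reusing gaussian_entropy from the earlier example together with the Gaussian identity H(X_j^1|X−X_j^1) = H(X) − H(X−X_j^1):

```python
import numpy as np

def complexity_C(cov):
    # C(X) = H(X) - sum_j H(X_j^1 | X - X_j^1)
    #      = sum_j H(X - X_j^1) - (n - 1) * H(X).
    n = cov.shape[0]
    H_full = gaussian_entropy(cov)
    H_rests = 0.0
    for j in range(n):
        rest = [i for i in range(n) if i != j]
        H_rests += gaussian_entropy(cov[np.ix_(rest, rest)])
    return H_rests - (n - 1) * H_full
```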
Implementation.
To evaluate D_N(X;O) for systems with many different connectivity patterns, we implemented several model systems as linear realizations. As described (5, 6), this implementation allows us to derive covariance matrices analytically. Each linear system X consisted of n neural elements connected to each other according to a connection matrix CON(X) with no self-connections. CON(X) was normalized so that the absolute value of the sum of the afferent synaptic weights per unit was set to a constant value w < 1. A subset of m output elements (m ≤ n) was connected [CON(X;O), Fig. 1A] one by one to elements of the output sheet O with connections of unit weight.

We consider the vector A of random variables that represents the activity of the elements of X after perturbation, i.e., after injecting uncorrelated Gaussian noise R of unit magnitude into a subset X_j^k of elements. For example, if the subset receiving the injection of uncorrelated noise corresponds to the entire system, under stationary conditions we have that A = A · CON(X) + R. By defining Q = [1 − CON(X)]^{-1} and averaging over the states produced by successive values of R, we obtain the covariance matrix under perturbation COV_P(X) = ⟨A^t A⟩ = ⟨Q^t R^t R Q⟩ = Q^t Q, where the superscript t refers to the transpose. The covariance matrix then is normalized to a correlation matrix to ensure that all output elements have unit variance, and the correlation matrix is multiplied through the output connection matrix CON(X;O) to obtain the correlation matrix between the system and the output sheet. Under Gaussian assumptions, all deviations from independence among the units are expressed by their covariances; from these, values of H(X) and therefore of MI_P(X;O) can be derived according to standard formulae (8). This procedure is repeated by applying the uncorrelated Gaussian noise in turn to all possible subsets of the system, in such a way that R has unit magnitude for the elements of each subset and is set to zero for the other elements. If the resulting covariance matrix has some elements with zero variance, the corresponding rows and columns are eliminated.
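The procedure just described can be sketched end to end. The code below is our reading of the text, not the authors' implementation: the diagonal mask standing in for ⟨R^t R⟩, the zero-variance threshold, and all names (con, con_out) are our assumptions:

```python
import numpy as np

def system_output_correlation(con, con_out, subset):
    # Stationarity gives A = A @ CON(X) + R, hence A = R @ Q with
    # Q = inv(I - CON(X)); averaging over samples of R (unit variance on
    # `subset`, zero elsewhere) yields COV_P(X) = Q^t <R^t R> Q.
    n = con.shape[0]
    Q = np.linalg.inv(np.eye(n) - con)
    mask = np.diag([1.0 if i in subset else 0.0 for i in range(n)])
    cov = Q.T @ mask @ Q
    # Eliminate rows/columns with zero variance, as described in the text.
    keep = np.flatnonzero(np.diag(cov) > 1e-12)
    cov = cov[np.ix_(keep, keep)]
    # Normalize to a correlation matrix, then multiply through CON(X;O)
    # to obtain correlations between the system and the output sheet.
    d = np.sqrt(np.diag(cov))
    corr = cov / np.outer(d, d)
    return corr @ con_out[keep, :]
```

Here con_out is an n × m realization of CON(X;O) with a single unit weight per output element; from the returned system-output correlations, entropies and hence MI_P values follow from the Gaussian formulas given earlier.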
ILLUSTRATION
We first illustrate the measures of degeneracy defined above by applying them to several simple examples. Then, we demonstrate that an increase in degeneracy with respect to an output pattern tends to be accompanied by an increase in complexity.
Examples of Independence, Degeneracy, and Redundancy.
Three basic examples can be constructed to illustrate key properties of degeneracy. Fig. 4 (Top) shows an example of a network for which different elements produce totally independent effects or do not affect the output at all (independence). Fig. 4 (Bottom) shows an example of a fully connected network; all of its units can affect the output but their effects on the output are essentially identical, i.e., the units show functional redundancy. Fig. 4 (Middle) shows an example of a degenerate network with structured connections; many of the elements have effects on the output that are both functionally redundant and functionally independent (degeneracy). Graphs of connectivities are given in Fig. 4A; resulting correlation matrices are presented in Fig. 4B. Plots in Fig. 4 C–E display, respectively, the distributions of average mutual information between subsets of X and O (Eq. 2a), average mutual information shared between bipartitions of X and O (Eq. 2b), and average redundancy (Eq. 4), with the resulting degeneracy D_N(X;O) displayed as the shaded area. D_N(X;O) is zero for a system (Fig. 4 C–E, Top) for which all elements affect the output independently (zero redundancy). A system (Fig. 4 C–E, Bottom) that has very high functional redundancy has relatively low values of degeneracy, as different combinations of the system's units have similar effects on the output. A system (Fig. 4 C–E, Middle) in which many combinations of elements can affect the output in a similar way while at the same time having independent effects shows high degeneracy.

Fig. 3. Graphical representation of different expressions for complexity. Note the homology between these expressions and those illustrated in Fig. 2. (A) Complexity, C_N(X), expressed in terms of the average entropy. (B) Complexity expressed in terms of the average mutual information. (C) Complexity expressed in terms of the average integration (see ref. 9). A graphical interpretation for the complexity C(X) is indicated as a dotted rectangular area with height corresponding to that of the bar at n − 1.