64 CYBERNETICS

Note that this expression (Eq. 3.091) is positive, and that it is inde-
pendent of w. It is one-half the logarithm of the ratio of the sum of
the mean squares of u and v to the mean square of v. If v has only a
small range of variation, the amount of information concerning u
which a knowledge of u + v gives is large, and it becomes infinite as
b goes to 0.
We can consider this result in the following light: let us treat u as a
message and v as a noise. Then the information carried by a precise
message in the absence of a noise is infinite. In the presence of a
noise, however, this amount of information is finite, and it approaches
0 very rapidly as the noise increases in intensity.
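The behavior described above can be checked numerically. A minimal sketch, taking Eq. 3.091 in the form stated in the text — one-half the logarithm of the ratio of the sum of the mean squares of u and v to the mean square of v — with logarithms taken base 2 so the result is in bits (an assumption; the text does not fix the base):

```python
import math

def info_about_u(mean_sq_u, mean_sq_v):
    """Information (in bits) that a knowledge of u + v gives about u:
    one-half the log of (mean square of u + mean square of v)
    over the mean square of v, per Eq. 3.091."""
    return 0.5 * math.log2((mean_sq_u + mean_sq_v) / mean_sq_v)

# Small noise: knowing u + v tells us a lot about u.
print(info_about_u(1.0, 0.01))   # ~3.33 bits
# Strong noise: the information falls toward 0.
print(info_about_u(1.0, 100.0))  # ~0.0072 bits
```

As the mean square of the noise v goes to 0 the value grows without bound, and as the noise dominates it approaches 0, matching the limiting cases in the text.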
We have said that amount of information, being the negative
logarithm of a quantity which we may consider as a probability, is
essentially a negative entropy. It is interesting to show that, on
the average, it has the properties we associate with an entropy.
Let $\phi(x)$ and $\psi(x)$ be two probability densities; then $[\phi(x) + \psi(x)]/2$ is also a probability density. Then

$$\int_{-\infty}^{\infty} \frac{\phi(x)+\psi(x)}{2}\,\log\frac{\phi(x)+\psi(x)}{2}\,dx \;\le\; \int_{-\infty}^{\infty} \frac{\phi(x)}{2}\,\log\phi(x)\,dx + \int_{-\infty}^{\infty} \frac{\psi(x)}{2}\,\log\psi(x)\,dx \tag{3.10}$$

This follows from the fact that


$$\frac{a+b}{2}\,\log\frac{a+b}{2} \;\le\; \frac{1}{2}\,(a\log a + b\log b) \tag{3.11}$$
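Both inequalities can be verified numerically. A minimal sketch: the pointwise inequality (3.11) follows from the convexity of $x\log x$, and the integrated form (3.10) is checked here by a Riemann sum with two Gaussian densities (an illustrative choice of $\phi$ and $\psi$, not from the text):

```python
import math

# Pointwise inequality (3.11): x log x is convex, so its value at the
# midpoint (a+b)/2 lies below the average of its values at a and b.
def midpoint_side(a, b):
    m = (a + b) / 2
    return m * math.log(m)

def average_side(a, b):
    return 0.5 * (a * math.log(a) + b * math.log(b))

for a, b in [(0.1, 2.0), (1.0, 1.0), (0.5, 3.0)]:
    assert midpoint_side(a, b) <= average_side(a, b) + 1e-12

# Integrated form (3.10), with phi, psi two unit Gaussians at +/-1.
def gauss(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def xlogx(p):
    return p * math.log(p) if p > 0 else 0.0

dx = 0.01
xs = [-10 + i * dx for i in range(2001)]
phi = [gauss(x, -1.0, 1.0) for x in xs]
psi = [gauss(x, +1.0, 1.0) for x in xs]
left  = sum(xlogx((p + q) / 2) for p, q in zip(phi, psi)) * dx
right = 0.5 * (sum(xlogx(p) for p in phi) + sum(xlogx(q) for q in psi)) * dx
assert left <= right  # fusing the two densities lowers the information
```

The assertion in the last line is the content of (3.10): averaging two distinct densities can only decrease $\int f \log f$, i.e., increase the entropy.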

In other words, the overlap of the regions under $\phi(x)$ and $\psi(x)$
reduces the maximum information belonging to $\phi(x) + \psi(x)$. On
the other hand, if $\phi(x)$ is a probability density vanishing outside
$(a, b)$,

$$\int_{-\infty}^{\infty} \phi(x)\,\log\phi(x)\,dx \tag{3.12}$$

is a minimum when $\phi(x) = 1/(b - a)$ over $(a, b)$ and is zero elsewhere.


This follows from the fact that the logarithm curve is convex
upward.
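The claim that the uniform density minimizes (3.12) can be illustrated numerically. A minimal sketch comparing the uniform density on $(0, 2)$ against a triangular density on the same interval (the triangular density is a hypothetical comparison, not from the text):

```python
import math

a, b = 0.0, 2.0

# Uniform density: phi = 1/(b - a), so the integral (3.12) is
# (b - a) * (1/(b - a)) * log(1/(b - a)) = -log(b - a).
uniform_value = -math.log(b - a)  # ~ -0.693

# Triangular density on (0, 2), peaked at 1: phi(x) = x for x <= 1,
# 2 - x for x > 1 (integrates to 1).
def tri(x):
    return x if x <= 1 else 2 - x

def xlogx(p):
    return p * math.log(p) if p > 0 else 0.0

dx = 1e-4
tri_value = sum(xlogx(tri(i * dx)) for i in range(int((b - a) / dx))) * dx  # ~ -0.5

# The uniform density gives the smaller value of (3.12),
# i.e., the larger entropy.
assert uniform_value < tri_value
```

Any density concentrated unevenly over $(a, b)$ yields a larger value of $\int \phi \log \phi \, dx$ than the uniform one, which is the sense in which the uniform density carries the least information.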
It will be seen that the processes which lose information are, as we
should expect, closely analogous to the processes which gain entropy.
They consist in the fusion of regions of probability which were
originally distinct. For example, if we replace the distribution of a
certain variable by the distribution of a function of that variable
