Note that this expression (Eq. 3.091) is positive, and that it is independent of w. It is one-half the logarithm of the ratio of the sum of
the mean squares of u and v to the mean square of v. If v has only a
small range of variation, the amount of information concerning u
which a knowledge of u + v gives is large, and it becomes infinite as
b goes to 0.
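As a quick numerical sketch of this behavior: assuming (not stated explicitly in this passage) that a and b denote the root-mean-square values of u and v, the expression reads ½ log[(a² + b²)/b²], taken here in natural logarithms. The function name below is ours, for illustration only.

```python
import math

def information_gain(a, b):
    """One-half the logarithm of the ratio of the sum of the mean
    squares of u and v to the mean square of v (cf. Eq. 3.091).
    a, b = root-mean-square amplitudes of message u and noise v."""
    return 0.5 * math.log((a ** 2 + b ** 2) / b ** 2)

# As the noise amplitude b shrinks toward 0, the amount of
# information grows without bound, as the text asserts.
for b in (1.0, 0.1, 0.001):
    print(b, information_gain(1.0, b))
```

With unit-amplitude message, b = 1 gives ½ log 2, and each tenfold reduction of b adds roughly log 10 to the total.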
We can consider this result in the following light: let us treat u as a
message and v as a noise. Then the information carried by a precise
message in the absence of a noise is infinite. In the presence of a
noise, however, this amount of information is finite, and it approaches
0 very rapidly as the noise increases in intensity.
We have said that the amount of information, being the negative
logarithm of a quantity which we may consider as a probability, is
essentially a negative entropy. It is interesting to show that, on
the average, it has the properties we associate with an entropy.
Let f₁(x) and f₂(x) be two probability densities; then [f₁(x) + f₂(x)]/2
is also a probability density. Then

    ∫ {[f₁(x) + f₂(x)]/2} log {[f₁(x) + f₂(x)]/2} dx
        ≤ ½ [∫ f₁(x) log f₁(x) dx + ∫ f₂(x) log f₂(x) dx]
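This is the convexity inequality for t log t: the information ∫ f log f of the averaged density never exceeds the average of the two informations. A minimal numerical check, using two unit-variance Gaussian densities as illustrative choices (they are not taken from the text):

```python
import math

def info(f, xs, dx):
    """Numerical approximation of the information integral
    ∫ f(x) log f(x) dx over the grid xs with spacing dx."""
    return sum(f(x) * math.log(f(x)) * dx for x in xs)

def gauss(mu, sigma):
    """Gaussian probability density with mean mu, std dev sigma."""
    return lambda x: (math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
                      / (sigma * math.sqrt(2 * math.pi)))

f1, f2 = gauss(-1.0, 1.0), gauss(1.0, 1.0)
mix = lambda x: (f1(x) + f2(x)) / 2   # the averaged density

dx = 0.01
xs = [-10.0 + i * dx for i in range(2000)]

lhs = info(mix, xs, dx)                            # information of the average
rhs = 0.5 * (info(f1, xs, dx) + info(f2, xs, dx))  # average of the informations
print(lhs, rhs)  # the overlap of the two densities makes lhs strictly smaller
```

The gap rhs − lhs shrinks as the two densities are pulled apart, since the averaged density then loses less sharpness to the overlap.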
In other words, the overlap of the regions under f₁(x) and f₂(x)
reduces the maximum information belonging to f₁(x) + f₂(x). On
the other hand, if f₁(x) is a probability density vanishing outside
(a, b),