816 Random Signals App. A
the probability distributions are independent of time, it is intuitively reasonable that the
amplitude distribution (histogram) of a long segment of an individual sequence of samples
should be approximately equal to the single probability density that describes each of
the random variables of the random-process model. Similarly, the arithmetic average of
a large number of samples of a single sequence should be very close to the mean of the
process. To formalize these intuitive notions, we define the time average of a random
process as
$$\langle x[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n]. \qquad \text{(A.25)}$$
Similarly, the time autocorrelation sequence is defined as
$$\langle x[n+m]\, x^*[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n+m]\, x^*[n]. \qquad \text{(A.26)}$$
It can be shown that the preceding limits exist if {x[n]} is a stationary process with finite
mean. As defined in Eqs. (A.25) and (A.26), these time averages are functions of an
infinite set of random variables and thus are properly viewed as random variables
themselves. However, under the condition known as ergodicity, the time averages in
Eqs. (A.25) and (A.26) are equal to constants in the sense that the time averages of
almost all possible sample sequences are equal to the same constant. Furthermore,
they are equal to the corresponding ensemble averages. That is, for any single sample
sequence {x[n]} for −∞ < n < ∞,
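As a numerical sanity check of Eqs. (A.25) and (A.26), the following sketch estimates the time average and the time autocorrelation from a single long sample sequence of a stationary, ergodic process and compares them with the known ensemble values. The specific process (white Gaussian noise with mean 0.5 and unit variance) and the segment length are assumptions chosen for illustration; they are not part of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example process: i.i.d. Gaussian samples, mean 0.5, variance 1.
# Such a process is stationary and ergodic, so time averages over one
# long sample sequence should approach the ensemble averages.
mean_true = 0.5
N = 100_000
x = mean_true + rng.standard_normal(N)

# Time average <x[n]>, Eq. (A.25), approximated over a finite segment.
time_mean = x.mean()

def time_autocorr(x, m):
    """Time autocorrelation <x[n+m] x*[n]>, Eq. (A.26), over a finite segment."""
    return np.mean(x[m:] * np.conj(x[: len(x) - m]))

# Ensemble values for this process:
#   E{x[n+0] x[n]} = variance + mean^2 = 1.25
#   E{x[n+5] x[n]} = mean^2 = 0.25   (samples at different times are independent)
phi0 = time_autocorr(x, 0)
phi5 = time_autocorr(x, 5)

print(time_mean, phi0, phi5)
```

For a segment of 10^5 samples the estimates typically agree with the ensemble values (0.5, 1.25, and 0.25) to within a few hundredths, and the agreement improves as the segment length grows, as the limits in Eqs. (A.25) and (A.26) suggest.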