0.1 Capacity
The information capacity of the Gaussian channel with power constraint $E_{tr}$ depends on the mutual information $I(\mathbf{x};\mathbf{y})$. For a Gaussian vector $\mathbf{z}$ with covariance matrix $\mathbf{R}_{zz}$, the differential entropy is
$$
h(\mathbf{z}) = \frac{1}{2}\log_2\!\left(\det\left[2\pi e\,\mathbf{R}_{zz}\right]\right)\ \text{bits} \tag{2}
$$
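As a numerical sanity check (a sketch added here, not part of the source), the differential-entropy formula in (2) can be evaluated directly with NumPy; the covariance values below are arbitrary examples:

```python
import numpy as np

def gaussian_entropy_bits(R):
    """Differential entropy of a Gaussian vector with covariance R:
    h(z) = 0.5 * log2(det(2*pi*e * R))  bits  (Eq. 2)."""
    return 0.5 * np.log2(np.linalg.det(2 * np.pi * np.e * np.asarray(R)))

# Scalar check: for variance sigma^2, h = 0.5 * log2(2*pi*e*sigma^2)
sigma2 = 3.0
h_scalar = gaussian_entropy_bits([[sigma2]])

# For a diagonal covariance the entropy is the sum of the per-component entropies
R = np.diag([1.0, 2.0, 4.0])
h_vector = gaussian_entropy_bits(R)
print(h_scalar, h_vector)
```

For independent components the determinant factorizes, so the vector entropy reduces to a sum of scalar entropies, which the diagonal case above exercises.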
Let us consider zero-mean i.i.d. symbols drawn from a Gaussian distribution. Suppose also that the symbols and the noise are uncorrelated. From the receive vector equation we can obtain the covariance matrix for user $k$ as follows:
$$
\mathbf{R}_{y_k y_k} = E\!\left[\mathbf{y}_k \mathbf{y}_k^{H}\right]
$$
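The source does not show the receive vector equation itself, so the sketch below assumes the common linear model $\mathbf{y}_k = \mathbf{H}\mathbf{x}_k + \mathbf{z}_k$ with a hypothetical channel matrix `H`. It estimates $\mathbf{R}_{y_k y_k} = E[\mathbf{y}_k \mathbf{y}_k^{H}]$ from samples and checks that, because the symbols and noise are uncorrelated, it converges to $\mathbf{H}\mathbf{R}_{xx}\mathbf{H}^{H} + \mathbf{R}_{zz}$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000          # number of symbol vectors averaged
n = 4                # hypothetical number of receive antennas

H = rng.normal(size=(n, n))        # hypothetical channel matrix (assumption)
x = rng.normal(size=(n, N))        # zero-mean i.i.d. Gaussian symbols, R_xx = I
z = rng.normal(size=(n, N))        # noise, uncorrelated with x, R_zz = I

y = H @ x + z                      # assumed receive vector model y = H x + z

# Sample estimate of R_yy = E[y y^H]
R_yy = (y @ y.conj().T) / N

# Cross terms E[x z^H] vanish, so R_yy -> H R_xx H^H + R_zz = H H^T + I here
R_theory = H @ H.T + np.eye(n)
err = np.max(np.abs(R_yy - R_theory))
print(err)
```

The vanishing of the cross terms $\mathbf{H}E[\mathbf{x}\mathbf{z}^{H}]$ is exactly the uncorrelatedness assumption stated above; only the signal and noise covariances survive in the limit.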