
Contents

0.1 Capacity

0.1 Capacity
The information capacity of the Gaussian channel with power constraint $E_{tr}$ depends on the mutual information $I(\mathbf{x}, \mathbf{y})$:

$$ I(\mathbf{x}, \mathbf{y}) = h(\mathbf{y}) - h(\mathbf{n}). \tag{1} $$
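A standard scalar instance (added here purely as an illustration, not part of the original derivation): for a real AWGN channel $y = x + n$ with $x \sim \mathcal{N}(0, P)$ and $n \sim \mathcal{N}(0, N)$ independent, Theorem 1 below gives $h(y) = \frac{1}{2}\log_2\left(2\pi e (P + N)\right)$ and $h(n) = \frac{1}{2}\log_2\left(2\pi e N\right)$, so that

$$ I(x, y) = h(y) - h(n) = \frac{1}{2}\log_2\left(1 + \frac{P}{N}\right) \ \text{bits}. $$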


Theorem 1 Let the Gaussian random vector $\mathbf{z} \in \mathbb{R}^n$ have mean $\boldsymbol{\mu}$ and covariance matrix $\mathbf{R}_{zz} = E\left[(\mathbf{z} - \boldsymbol{\mu})(\mathbf{z} - \boldsymbol{\mu})^T\right]$. Then,

$$ h(\mathbf{z}) = \frac{1}{2}\log_2\left(\det\left[2\pi e \mathbf{R}_{zz}\right]\right) \ \text{bits}. \tag{2} $$
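To make Theorem 1 concrete, here is a minimal numerical sketch (Python/NumPy; the function name `gaussian_entropy_bits` is illustrative, not from the original) that evaluates Eq. (2) and checks that entropy is additive across independent components:

```python
import numpy as np

def gaussian_entropy_bits(R):
    """Differential entropy in bits of a Gaussian vector with covariance
    matrix R, per Eq. (2): 0.5 * log2(det(2*pi*e*R))."""
    sign, logdet = np.linalg.slogdet(2.0 * np.pi * np.e * np.atleast_2d(R))
    return 0.5 * logdet / np.log(2.0)  # convert natural log to log2

# Scalar sanity check against the closed form 0.5*log2(2*pi*e*sigma^2)
sigma2 = 2.0
assert np.isclose(gaussian_entropy_bits([[sigma2]]),
                  0.5 * np.log2(2.0 * np.pi * np.e * sigma2))

# Independent components: the determinant factorizes, so entropies add
assert np.isclose(gaussian_entropy_bits(np.diag([1.0, 4.0])),
                  gaussian_entropy_bits([[1.0]]) + gaussian_entropy_bits([[4.0]]))
```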
Let us consider zero-mean i.i.d. symbols drawn from a Gaussian distribution. Suppose also that the symbols and the noise are uncorrelated. From the received-vector equation we can obtain the covariance matrix for user $k$ as follows:

$$
\begin{aligned}
\mathbf{R}_{y_k y_k} &= E\left[\mathbf{y}_k \mathbf{y}_k^H\right] \\
&= \mathbf{H}_k \bar{\mathbf{P}}_k \operatorname{diag}(\mathbf{a}_k)\, \mathbf{R}_{ss}\, \operatorname{diag}(\mathbf{a}_k)\, \bar{\mathbf{P}}_k^H \mathbf{H}_k^H
+ \sum_{\substack{i=1 \\ i \neq k}}^{K} \mathbf{H}_k \bar{\mathbf{P}}_i \operatorname{diag}(\mathbf{a}_i)\, \mathbf{R}_{ss}\, \operatorname{diag}(\mathbf{a}_i)\, \bar{\mathbf{P}}_i^H \mathbf{H}_k^H
+ \mathbf{R}_{nn}
\end{aligned}
\tag{3}
$$
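As a hedged illustration of how Eq. (3) is assembled (the dimensions, the random data, and the helper `cov_yk` are assumptions made for this sketch, not from the original), one could form the per-user receive covariance as:

```python
import numpy as np

rng = np.random.default_rng(0)
K, Nr, Nt, Ns = 3, 4, 4, 2          # users, rx/tx antennas, streams (illustrative)
sigma_s2, sigma_n2 = 1.0, 0.1       # symbol and noise variances (assumed)

H = [rng.standard_normal((Nr, Nt)) + 1j*rng.standard_normal((Nr, Nt)) for _ in range(K)]
P_bar = [rng.standard_normal((Nt, Ns)) + 1j*rng.standard_normal((Nt, Ns)) for _ in range(K)]
a = [rng.random(Ns) for _ in range(K)]   # real per-stream allocation coefficients a_i
R_ss = sigma_s2 * np.eye(Ns)             # R_ss = sigma_s^2 * I
R_nn = sigma_n2 * np.eye(Nr)             # white noise covariance (assumed)

def cov_yk(k):
    """Receive covariance of user k per Eq. (3): the i = k term is the
    desired signal, i != k terms are multi-user interference, plus noise."""
    def term(i):
        A = np.diag(a[i])
        return H[k] @ P_bar[i] @ A @ R_ss @ A @ P_bar[i].conj().T @ H[k].conj().T
    return sum(term(i) for i in range(K)) + R_nn

R_ykyk = cov_yk(0)
assert np.allclose(R_ykyk, R_ykyk.conj().T)   # a covariance matrix is Hermitian
```

Note that with $\mathbf{R}_{ss} = \sigma_s^2 \mathbf{I}$ and real $\mathbf{a}_i$, the inner product $\operatorname{diag}(\mathbf{a}_i)\, \mathbf{R}_{ss}\, \operatorname{diag}(\mathbf{a}_i)$ collapses to $\sigma_s^2 \operatorname{diag}(\mathbf{a}_i)^2$, which is exactly the simplification invoked next.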

Considering that $\mathbf{R}_{ss} = \sigma_s^2 \mathbf{I}$, we obtain
