
CHAPTER 2

Problem 2.1 :
$$P(A_i) = \sum_{j=1}^{3} P(A_i, B_j), \quad i = 1, 2, 3, 4$$
Hence :
$$P(A_1) = \sum_{j=1}^{3} P(A_1, B_j) = 0.10 + 0.08 + 0.13 = 0.31$$
$$P(A_2) = \sum_{j=1}^{3} P(A_2, B_j) = 0.05 + 0.03 + 0.09 = 0.17$$
$$P(A_3) = \sum_{j=1}^{3} P(A_3, B_j) = 0.05 + 0.12 + 0.14 = 0.31$$
$$P(A_4) = \sum_{j=1}^{3} P(A_4, B_j) = 0.11 + 0.04 + 0.06 = 0.21$$
Similarly :
$$P(B_1) = \sum_{i=1}^{4} P(A_i, B_1) = 0.10 + 0.05 + 0.05 + 0.11 = 0.31$$
$$P(B_2) = \sum_{i=1}^{4} P(A_i, B_2) = 0.08 + 0.03 + 0.12 + 0.04 = 0.27$$
$$P(B_3) = \sum_{i=1}^{4} P(A_i, B_3) = 0.13 + 0.09 + 0.14 + 0.06 = 0.42$$
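As a quick numerical sanity check, the row and column sums of the joint-probability table (the values transcribed in the solution above) can be verified:

```python
import numpy as np

# Joint probability table P(A_i, B_j) as transcribed in Problem 2.1
# (rows: A_1..A_4, columns: B_1..B_3)
P = np.array([[0.10, 0.08, 0.13],
              [0.05, 0.03, 0.09],
              [0.05, 0.12, 0.14],
              [0.11, 0.04, 0.06]])

P_A = P.sum(axis=1)  # marginals P(A_i): sum over j
P_B = P.sum(axis=0)  # marginals P(B_j): sum over i

print(P_A)       # ≈ [0.31 0.17 0.31 0.21]
print(P_B)       # ≈ [0.31 0.27 0.42]
print(P.sum())   # ≈ 1.0 (a valid joint distribution)
```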

Problem 2.2 :
The relationship holds for n = 2 (2-1-34) : $p(x_1, x_2) = p(x_2|x_1)\,p(x_1)$.
Suppose it holds for n = k, i.e. :
$$p(x_1, x_2, \ldots, x_k) = p(x_k|x_{k-1}, \ldots, x_1)\, p(x_{k-1}|x_{k-2}, \ldots, x_1) \cdots p(x_1)$$
Then for n = k + 1 :
$$p(x_1, x_2, \ldots, x_k, x_{k+1}) = p(x_{k+1}|x_k, x_{k-1}, \ldots, x_1)\, p(x_k, x_{k-1}, \ldots, x_1)$$
$$= p(x_{k+1}|x_k, x_{k-1}, \ldots, x_1)\, p(x_k|x_{k-1}, \ldots, x_1)\, p(x_{k-1}|x_{k-2}, \ldots, x_1) \cdots p(x_1)$$
Hence the relationship holds for n = k + 1, and by induction it holds for any n.

Problem 2.3 :
Following the same procedure as in example 2-1-1, we prove :
$$p_Y(y) = \frac{1}{|a|}\, p_X\!\left(\frac{y-b}{a}\right)$$

Problem 2.4 :
Relationship (2-1-44) gives :


$$p_Y(y) = \frac{1}{3a\left[(y-b)/a\right]^{2/3}}\, p_X\!\left(\left(\frac{y-b}{a}\right)^{1/3}\right)$$
X is a Gaussian r.v. with zero mean and unit variance : $p_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$
Hence :
$$p_Y(y) = \frac{1}{3a\sqrt{2\pi}\left[(y-b)/a\right]^{2/3}}\, e^{-\frac{1}{2}\left(\frac{y-b}{a}\right)^{2/3}}$$
[Figure: pdf of Y for a = 2, b = 3, plotted over −10 ≤ y ≤ 10.]
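The pdf above corresponds to the transformation $Y = aX^3 + b$ (the mapping implied by the Jacobian factor $3a[(y-b)/a]^{2/3}$). A Monte Carlo sketch of this, with a = 2, b = 3 as in the figure, checks the resulting CDF $P(Y \leq y_0) = \Phi\left(((y_0-b)/a)^{1/3}\right)$ at one point:

```python
import numpy as np
from math import erf

# Monte Carlo check of the transformation Y = a X^3 + b, a = 2, b = 3
rng = np.random.default_rng(0)
a, b = 2.0, 3.0
x = rng.standard_normal(200_000)
y = a * x**3 + b

def Phi(t):  # standard normal CDF
    return 0.5 * (1 + erf(t / np.sqrt(2)))

# For y0 > b:  P(Y <= y0) = Phi(((y0 - b)/a)^(1/3))
y0 = 5.0
empirical = np.mean(y <= y0)
analytic = Phi(((y0 - b) / a) ** (1 / 3))
print(empirical, analytic)  # both ≈ 0.841
```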

Problem 2.5 :
(a) Since $(X_r, X_i)$ are statistically independent :
$$p_X(x_r, x_i) = p_X(x_r)\,p_X(x_i) = \frac{1}{2\pi\sigma^2}\, e^{-(x_r^2 + x_i^2)/2\sigma^2}$$
Also :
$$Y_r + jY_i = (X_r + jX_i)e^{j\phi}$$
$$X_r + jX_i = (Y_r + jY_i)e^{-j\phi} = Y_r\cos\phi + Y_i\sin\phi + j(-Y_r\sin\phi + Y_i\cos\phi)$$
$$\Rightarrow \begin{cases} X_r = Y_r\cos\phi + Y_i\sin\phi \\ X_i = -Y_r\sin\phi + Y_i\cos\phi \end{cases}$$
The Jacobian of the above transformation is :
$$J = \begin{vmatrix} \partial X_r/\partial Y_r & \partial X_r/\partial Y_i \\ \partial X_i/\partial Y_r & \partial X_i/\partial Y_i \end{vmatrix} = \begin{vmatrix} \cos\phi & \sin\phi \\ -\sin\phi & \cos\phi \end{vmatrix} = 1$$
Hence, by (2-1-55) :
$$p_Y(y_r, y_i) = p_X\!\left(y_r\cos\phi + y_i\sin\phi,\; -y_r\sin\phi + y_i\cos\phi\right) = \frac{1}{2\pi\sigma^2}\, e^{-(y_r^2 + y_i^2)/2\sigma^2}$$
(b) $Y = AX$ and $X = A^{-1}Y$.
Now, $p_X(x) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-x'x/2\sigma^2}$ (the covariance matrix M of the random variables $x_1, \ldots, x_n$ is $M = \sigma^2 I$, since they are i.i.d.) and $J = 1/|\det(A)|$. Hence :
$$p_Y(y) = \frac{1}{(2\pi\sigma^2)^{n/2}\,|\det(A)|}\, e^{-y'(A^{-1})'A^{-1}y/2\sigma^2}$$
For the pdfs of X and Y to be identical we require that :
$$|\det(A)| = 1 \quad \text{and} \quad (A^{-1})'A^{-1} = I \;\Rightarrow\; A^{-1} = A'$$
Hence, A must be a unitary (orthogonal) matrix.

Problem 2.6 :
(a)
$$\psi_Y(jv) = E\left[e^{jvY}\right] = E\left[e^{jv\sum_{i=1}^{n} X_i}\right] = \prod_{i=1}^{n} E\left[e^{jvX_i}\right] = \left[\psi_X(jv)\right]^n$$
But,
$$p_X(x) = p\,\delta(x-1) + (1-p)\,\delta(x) \;\Rightarrow\; \psi_X(jv) = 1 - p + pe^{jv}$$
$$\Rightarrow\; \psi_Y(jv) = \left(1 - p + pe^{jv}\right)^n$$
(b)
$$E(Y) = -j\,\frac{d\psi_Y(jv)}{dv}\Big|_{v=0} = -j\, n\left(1 - p + pe^{jv}\right)^{n-1} jpe^{jv}\Big|_{v=0} = np$$
and
$$E(Y^2) = -\frac{d^2\psi_Y(jv)}{dv^2}\Big|_{v=0} = -\frac{d}{dv}\left[jn\left(1 - p + pe^{jv}\right)^{n-1} pe^{jv}\right]\Big|_{v=0} = np + n(n-1)p^2$$
$$\Rightarrow\; E(Y^2) = n^2p^2 + np(1-p)$$
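The moments derived from the characteristic function can be checked by differentiating $\psi_Y(jv) = (1 - p + pe^{jv})^n$ numerically (a sketch, with arbitrary n and p):

```python
import numpy as np

# Numerical check of Problem 2.6: psi_Y(jv) = (1 - p + p e^{jv})^n gives
# E(Y) = -j psi'(0) = np and E(Y^2) = -psi''(0) = n^2 p^2 + np(1-p).
n, p = 10, 0.3

def psi(v):
    return (1 - p + p * np.exp(1j * v)) ** n

h = 1e-5
d1 = (psi(h) - psi(-h)) / (2 * h)            # psi'(0), central difference
d2 = (psi(h) - 2 * psi(0) + psi(-h)) / h**2  # psi''(0)

EY = (-1j * d1).real
EY2 = (-d2).real
print(EY)   # ≈ np = 3.0
print(EY2)  # ≈ n^2 p^2 + np(1-p) = 11.1
```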

Problem 2.7 :

$$\psi(jv_1, jv_2, jv_3, jv_4) = E\left[e^{j(v_1x_1 + v_2x_2 + v_3x_3 + v_4x_4)}\right]$$
$$E(X_1X_2X_3X_4) = (-j)^4\, \frac{\partial^4 \psi(jv_1, jv_2, jv_3, jv_4)}{\partial v_1\,\partial v_2\,\partial v_3\,\partial v_4}\Big|_{v_1 = v_2 = v_3 = v_4 = 0}$$
From (2-1-151) of the text, and the zero-mean property of the given rvs :
$$\psi(j\mathbf{v}) = e^{-\frac{1}{2}\mathbf{v}'M\mathbf{v}}$$
where $\mathbf{v} = [v_1, v_2, v_3, v_4]'$, $M = [\mu_{ij}]$.
We obtain the desired result by bringing the exponent to a scalar form and then performing
quadruple differentiation. We can simplify the procedure by noting that :
$$\frac{\partial \psi(j\mathbf{v})}{\partial v_i} = -\boldsymbol{\mu}_i'\mathbf{v}\; e^{-\frac{1}{2}\mathbf{v}'M\mathbf{v}}$$
where $\boldsymbol{\mu}_i = [\mu_{i1}, \mu_{i2}, \mu_{i3}, \mu_{i4}]'$. Also note that :
$$\frac{\partial\, \boldsymbol{\mu}_j'\mathbf{v}}{\partial v_i} = \mu_{ij} = \mu_{ji}$$
Hence :
$$E(X_1X_2X_3X_4) = \frac{\partial^4 \psi(jv_1, jv_2, jv_3, jv_4)}{\partial v_1\,\partial v_2\,\partial v_3\,\partial v_4}\Big|_{\mathbf{v}=0} = \mu_{12}\mu_{34} + \mu_{23}\mu_{14} + \mu_{24}\mu_{13}$$
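The quadruple differentiation can be checked numerically: a 4th-order mixed central difference of $\psi(\mathbf{v}) = e^{-\frac{1}{2}\mathbf{v}'M\mathbf{v}}$ at $\mathbf{v} = 0$ should reproduce $\mu_{12}\mu_{34} + \mu_{23}\mu_{14} + \mu_{24}\mu_{13}$ (the covariance matrix below is an arbitrary choice for illustration):

```python
import numpy as np
from itertools import product

# Check of Problem 2.7: quadruple mixed derivative of exp(-v'Mv/2) at v = 0
M = np.array([[1.00, 0.20, 0.30, 0.10],
              [0.20, 1.00, 0.25, 0.15],
              [0.30, 0.25, 1.00, 0.20],
              [0.10, 0.15, 0.20, 1.00]])  # an arbitrary valid covariance matrix

def psi(v):
    return np.exp(-0.5 * v @ M @ v)

# 4th-order mixed central difference: signed sum over the 16 sign patterns
h = 0.01
deriv = sum(np.prod(s) * psi(h * np.array(s, float))
            for s in product([1, -1], repeat=4))
deriv /= (2 * h) ** 4

expected = M[0, 1] * M[2, 3] + M[1, 2] * M[0, 3] + M[1, 3] * M[0, 2]
print(deriv, expected)  # both ≈ 0.11
```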

Problem 2.8 :
For the central chi-square with n degrees of freedom :
$$\psi(jv) = \frac{1}{(1 - j2v\sigma^2)^{n/2}}$$
Now :
$$\frac{d\psi(jv)}{dv} = \frac{jn\sigma^2}{(1 - j2v\sigma^2)^{n/2+1}} \;\Rightarrow\; E(Y) = -j\,\frac{d\psi(jv)}{dv}\Big|_{v=0} = n\sigma^2$$
$$\frac{d^2\psi(jv)}{dv^2} = \frac{-2n\sigma^4(n/2+1)}{(1 - j2v\sigma^2)^{n/2+2}} \;\Rightarrow\; E\left(Y^2\right) = -\frac{d^2\psi(jv)}{dv^2}\Big|_{v=0} = n(n+2)\sigma^4$$
The variance is $\sigma_Y^2 = E\left(Y^2\right) - \left[E(Y)\right]^2 = 2n\sigma^4$.

For the non-central chi-square with n degrees of freedom :
$$\psi(jv) = \frac{e^{jvs^2/(1 - j2v\sigma^2)}}{(1 - j2v\sigma^2)^{n/2}}$$
where by definition : $s^2 = \sum_{i=1}^{n} m_i^2$.
$$\frac{d\psi(jv)}{dv} = \left[\frac{jn\sigma^2}{(1 - j2v\sigma^2)^{n/2+1}} + \frac{js^2}{(1 - j2v\sigma^2)^{n/2+2}}\right] e^{jvs^2/(1 - j2v\sigma^2)}$$
Hence, $E(Y) = -j\,\frac{d\psi(jv)}{dv}\Big|_{v=0} = n\sigma^2 + s^2$.
$$\frac{d^2\psi(jv)}{dv^2} = -\left[\frac{n\sigma^4(n+2)}{(1 - j2v\sigma^2)^{n/2+2}} + \frac{s^2(n+4)\sigma^2 + ns^2\sigma^2}{(1 - j2v\sigma^2)^{n/2+3}} + \frac{s^4}{(1 - j2v\sigma^2)^{n/2+4}}\right] e^{jvs^2/(1 - j2v\sigma^2)}$$
Hence,
$$E\left(Y^2\right) = -\frac{d^2\psi(jv)}{dv^2}\Big|_{v=0} = 2n\sigma^4 + 4s^2\sigma^2 + \left(n\sigma^2 + s^2\right)^2$$
and
$$\sigma_Y^2 = E\left(Y^2\right) - \left[E(Y)\right]^2 = 2n\sigma^4 + 4\sigma^2 s^2$$
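The non-central chi-square moments can be sketched with a Monte Carlo check: $Y = \sum_i (\sigma G_i + m_i)^2$ with $G_i \sim \mathcal{N}(0,1)$ should have mean $n\sigma^2 + s^2$ and variance $2n\sigma^4 + 4\sigma^2 s^2$ (the n, $\sigma$, $m_i$ below are arbitrary):

```python
import numpy as np

# Monte Carlo check of the non-central chi-square moments in Problem 2.8
rng = np.random.default_rng(1)
n, sigma = 4, 1.5
m = np.array([1.0, -0.5, 2.0, 0.0])
s2 = np.sum(m**2)

N = 1_000_000
samples = ((sigma * rng.standard_normal((N, n)) + m) ** 2).sum(axis=1)

mean_th = n * sigma**2 + s2                  # 14.25
var_th = 2 * n * sigma**4 + 4 * sigma**2 * s2  # 87.75
print(samples.mean(), mean_th)
print(samples.var(), var_th)
```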

Problem 2.9 :
The Cauchy r.v. has : $p(x) = \dfrac{a/\pi}{x^2 + a^2}, \; -\infty < x < \infty$.
(a)
$$E(X) = \int_{-\infty}^{\infty} x\,p(x)\,dx = 0$$
since $x\,p(x)$ is an odd function. Also :
$$E\left(X^2\right) = \int_{-\infty}^{\infty} x^2 p(x)\,dx = \frac{a}{\pi}\int_{-\infty}^{\infty} \frac{x^2}{x^2 + a^2}\,dx$$
Note that for large x, $\frac{x^2}{x^2 + a^2} \to 1$ (i.e. a non-zero value). Hence,
$$E\left(X^2\right) = \infty, \quad \sigma^2 = \infty$$
(b)
$$\psi(jv) = E\left[e^{jvX}\right] = \int_{-\infty}^{\infty} \frac{a/\pi}{x^2 + a^2}\, e^{jvx}\,dx = \int_{-\infty}^{\infty} \frac{a/\pi}{(x + ja)(x - ja)}\, e^{jvx}\,dx$$
This integral can be evaluated by using the residue theorem in complex variable theory. Then, for $v \geq 0$ :
$$\psi(jv) = 2\pi j\left[\frac{a/\pi}{x + ja}\, e^{jvx}\right]_{x=ja} = e^{-av}$$
For $v < 0$ :
$$\psi(jv) = -2\pi j\left[\frac{a/\pi}{x - ja}\, e^{jvx}\right]_{x=-ja} = e^{av}$$
Therefore :
$$\psi(jv) = e^{-a|v|}$$
Note: an alternative way to find the characteristic function is to use the Fourier transform
relationship between $p(x)$, $\psi(jv)$ and the Fourier pair :
$$e^{-b|t|} \;\longleftrightarrow\; \frac{1}{\pi}\,\frac{c}{c^2 + f^2}, \quad c = b/2\pi, \; f = v/2\pi$$
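The residue-theorem result can be sketched numerically: integrating $\frac{a/\pi}{x^2+a^2}\cos(vx)$ over a wide, finely sampled grid (the sine part vanishes by symmetry) should give $e^{-a|v|}$:

```python
import numpy as np

# Numerical check that the Cauchy characteristic function is e^{-a|v|}
a, v = 1.0, 1.0
x = np.linspace(-1e4, 1e4, 2_000_001)
dx = x[1] - x[0]
integrand = (a / np.pi) * np.cos(v * x) / (x**2 + a**2)
psi = np.sum(integrand) * dx  # simple Riemann sum; tail beyond 1e4 is negligible

print(psi, np.exp(-a * abs(v)))  # both ≈ 0.3679
```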

Problem 2.10 :
(a) $Y = \frac{1}{n}\sum_{i=1}^{n} X_i$, with $\psi_{X_i}(jv) = e^{-a|v|}$. Then :
$$\psi_Y(jv) = E\left[e^{j\frac{v}{n}\sum_{i=1}^{n} X_i}\right] = \prod_{i=1}^{n} E\left[e^{j\frac{v}{n}X_i}\right] = \prod_{i=1}^{n} \psi_{X_i}(jv/n) = \left(e^{-a|v|/n}\right)^n = e^{-a|v|}$$
(b) Since $\psi_Y(jv) = \psi_{X_i}(jv)$, it follows that $p_Y(y) = p_{X_i}(y) \;\Rightarrow\; p_Y(y) = \dfrac{a/\pi}{y^2 + a^2}$.
(c) As $n \to \infty$, $p_Y(y) = \dfrac{a/\pi}{y^2 + a^2}$, which is not Gaussian ; hence, the central limit theorem does
not hold. The reason is that the Cauchy distribution does not have a finite variance.
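The result in (b) can be illustrated empirically: the sample mean of n i.i.d. standard Cauchy variables is again standard Cauchy, so its quartiles stay at ±1 (the quartiles of Cauchy with a = 1) no matter how large n gets:

```python
import numpy as np

# Sample means of Cauchy variables do not concentrate: their quartiles
# remain those of a single standard Cauchy sample (±1).
rng = np.random.default_rng(2)
n, trials = 500, 5_000
means = rng.standard_cauchy((trials, n)).mean(axis=1)

q1, q3 = np.quantile(means, [0.25, 0.75])
print(q1, q3)  # ≈ -1 and ≈ +1
```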

Problem 2.11 :
We assume that x(t), y(t), z(t) are real-valued stochastic processes, with z(t) = x(t) + y(t). The treatment of complex-valued processes is similar.
(a)
$$\phi_{zz}(\tau) = E\left\{[x(t+\tau) + y(t+\tau)][x(t) + y(t)]\right\} = \phi_{xx}(\tau) + \phi_{xy}(\tau) + \phi_{yx}(\tau) + \phi_{yy}(\tau)$$
(b) When x(t), y(t) are uncorrelated :
$$\phi_{xy}(\tau) = E[x(t+\tau)y(t)] = E[x(t+\tau)]\,E[y(t)] = m_x m_y$$
Similarly :
$$\phi_{yx}(\tau) = m_x m_y$$
Hence :
$$\phi_{zz}(\tau) = \phi_{xx}(\tau) + \phi_{yy}(\tau) + 2m_x m_y$$
(c) When x(t), y(t) are uncorrelated and have zero means :
$$\phi_{zz}(\tau) = \phi_{xx}(\tau) + \phi_{yy}(\tau)$$

Problem 2.12 :
The power spectral density of the random process x(t) is :
$$\Phi_{xx}(f) = \int_{-\infty}^{\infty} \phi_{xx}(\tau)\,e^{-j2\pi f\tau}\,d\tau = N_0/2$$
The power spectral density at the output of the filter will be :
$$\Phi_{yy}(f) = \Phi_{xx}(f)\,|H(f)|^2 = \frac{N_0}{2}\,|H(f)|^2$$
Hence, the total power at the output of the filter will be :
$$\phi_{yy}(\tau = 0) = \int_{-\infty}^{\infty} \Phi_{yy}(f)\,df = \frac{N_0}{2}\int_{-\infty}^{\infty} |H(f)|^2\,df = \frac{N_0}{2}\,(2B) = N_0 B$$

Problem 2.13 :

$$M_X = E\left[(X - m_x)(X - m_x)'\right], \quad X = \begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}, \; m_x \text{ is the corresponding vector of mean values.}$$
Then :
$$M_Y = E\left[(Y - m_y)(Y - m_y)'\right] = E\left[A(X - m_x)\left(A(X - m_x)\right)'\right] = E\left[A(X - m_x)(X - m_x)'A'\right] = A\,E\left[(X - m_x)(X - m_x)'\right]A' = A M_X A'$$
Hence :
$$M_Y = \begin{bmatrix} \mu_{11} & 0 & \mu_{11} + \mu_{13} \\ 0 & 4\mu_{22} & 0 \\ \mu_{11} + \mu_{31} & 0 & \mu_{11} + \mu_{13} + \mu_{31} + \mu_{33} \end{bmatrix}$$
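A quick numerical sketch of $M_Y = A M_X A'$ follows. Note that the matrix A below is inferred from the entries of the result (e.g. the $4\mu_{22}$ term) and is an assumption, since the original problem statement is not reproduced here:

```python
import numpy as np

# Sanity check of M_Y = A M_X A' in Problem 2.13.  A is assumed to be the
# matrix below (inferred from the structure of the resulting M_Y).
A = np.array([[1., 0., 0.],
              [0., 2., 0.],
              [1., 0., 1.]])

rng = np.random.default_rng(3)
B = rng.standard_normal((3, 3))
MX = B @ B.T  # an arbitrary symmetric covariance matrix [mu_ij]
MY = A @ MX @ A.T

mu = MX  # mu[i-1, j-1] = mu_ij
assert np.isclose(MY[0, 0], mu[0, 0])             # mu_11
assert np.isclose(MY[1, 1], 4 * mu[1, 1])         # 4 mu_22
assert np.isclose(MY[0, 2], mu[0, 0] + mu[0, 2])  # mu_11 + mu_13
assert np.isclose(MY[2, 2], mu[0, 0] + mu[0, 2] + mu[2, 0] + mu[2, 2])
print("all checked entries match the closed-form M_Y")
```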

Problem 2.14 :
$$Y(t) = X^2(t), \quad \phi_{xx}(\tau) = E[x(t+\tau)x(t)]$$
$$\phi_{yy}(\tau) = E[y(t+\tau)y(t)] = E\left[x^2(t+\tau)x^2(t)\right]$$
Let $X_1 = X_2 = x(t)$, $X_3 = X_4 = x(t+\tau)$. Then, from problem 2.7 :
$$E(X_1X_2X_3X_4) = E(X_1X_2)\,E(X_3X_4) + E(X_1X_3)\,E(X_2X_4) + E(X_1X_4)\,E(X_2X_3)$$
Hence :
$$\phi_{yy}(\tau) = \phi_{xx}^2(0) + 2\phi_{xx}^2(\tau)$$
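For a zero-mean, unit-variance Gaussian pair with correlation $\rho = \phi_{xx}(\tau)$, the result above says $E[x^2y^2] = 1 + 2\rho^2$, which a Monte Carlo sketch confirms:

```python
import numpy as np

# Monte Carlo check of phi_yy(tau) = phi_xx(0)^2 + 2 phi_xx(tau)^2
rng = np.random.default_rng(4)
rho = 0.6
N = 2_000_000
g1 = rng.standard_normal(N)
g2 = rho * g1 + np.sqrt(1 - rho**2) * rng.standard_normal(N)

empirical = np.mean(g1**2 * g2**2)
analytic = 1 + 2 * rho**2
print(empirical, analytic)  # both ≈ 1.72
```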

Problem 2.15 :
$$p_R(r) = \frac{2}{\Gamma(m)}\left(\frac{m}{\Omega}\right)^m r^{2m-1}\, e^{-mr^2/\Omega}$$
We know that if $X = \sqrt{1/\Omega}\,R$, then $p_X(x) = \sqrt{\Omega}\,p_R\!\left(\sqrt{\Omega}\,x\right)$.
Hence :
$$p_X(x) = \frac{2\sqrt{\Omega}}{\Gamma(m)}\left(\frac{m}{\Omega}\right)^m \left(\sqrt{\Omega}\,x\right)^{2m-1} e^{-m(\sqrt{\Omega}\,x)^2/\Omega} = \frac{2}{\Gamma(m)}\, m^m x^{2m-1}\, e^{-mx^2}$$

Problem 2.16 :
The transfer function of the filter is :
$$H(f) = \frac{1/j\omega C}{R + 1/j\omega C} = \frac{1}{j\omega RC + 1} = \frac{1}{j2\pi fRC + 1}$$
(a)
$$\Phi_{xx}(f) = \sigma^2 \;\Rightarrow\; \Phi_{yy}(f) = \Phi_{xx}(f)\,|H(f)|^2 = \frac{\sigma^2}{(2\pi RC)^2 f^2 + 1}$$
(b)
$$\phi_{yy}(\tau) = \mathcal{F}^{-1}\left\{\Phi_{yy}(f)\right\} = \frac{\sigma^2}{RC}\int_{-\infty}^{\infty} \frac{1/RC}{(1/RC)^2 + (2\pi f)^2}\, e^{j2\pi f\tau}\,df$$
Let : $a = 1/RC$, $v = 2\pi f$. Then :
$$\phi_{yy}(\tau) = \frac{\sigma^2}{2RC}\int_{-\infty}^{\infty} \frac{a/\pi}{a^2 + v^2}\, e^{jv\tau}\,dv = \frac{\sigma^2}{2RC}\, e^{-a|\tau|} = \frac{\sigma^2}{2RC}\, e^{-|\tau|/RC}$$
where the last integral is evaluated in the same way as in problem P-2.9 . Finally :
$$E\left[Y^2(t)\right] = \phi_{yy}(0) = \frac{\sigma^2}{2RC}$$
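Integrating the output power spectral density numerically gives the same total power, a quick frequency-domain sketch of the result above:

```python
import numpy as np

# Numerical check of Problem 2.16: the integral of sigma^2/((2 pi f RC)^2 + 1)
# over all f equals sigma^2/(2 RC).
sigma2, RC = 1.0, 1.0

f = np.linspace(-1e3, 1e3, 2_000_001)
Phi_yy = sigma2 / ((2 * np.pi * f * RC) ** 2 + 1)
power = np.sum(Phi_yy) * (f[1] - f[0])  # simple Riemann sum

print(power, sigma2 / (2 * RC))  # both ≈ 0.5
```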

Problem 2.17 :
If $\Phi_X(f) = 0$ for $|f| > W$, then $\Phi_X(f)e^{j2\pi fa}$ is also bandlimited. The corresponding autocorrelation function can be represented as (remember that $\Phi_X(f)$ is deterministic) :
$$\phi_X(\tau - a) = \sum_{n=-\infty}^{\infty} \phi_X\!\left(\frac{n}{2W} - a\right) \frac{\sin 2\pi W\!\left(\tau - \frac{n}{2W}\right)}{2\pi W\!\left(\tau - \frac{n}{2W}\right)} \quad (1)$$
Let us define :
$$\hat{X}(t) = \sum_{n=-\infty}^{\infty} X\!\left(\frac{n}{2W}\right) \frac{\sin 2\pi W\!\left(t - \frac{n}{2W}\right)}{2\pi W\!\left(t - \frac{n}{2W}\right)}$$
We must show that :
$$E\left[\left|X(t) - \hat{X}(t)\right|^2\right] = 0$$
or
$$E\left\{\left[X(t) - \hat{X}(t)\right]\left[X(t) - \sum_{m=-\infty}^{\infty} X\!\left(\frac{m}{2W}\right) \frac{\sin 2\pi W\!\left(t - \frac{m}{2W}\right)}{2\pi W\!\left(t - \frac{m}{2W}\right)}\right]\right\} = 0 \quad (2)$$
First we have :
$$E\left\{\left[X(t) - \hat{X}(t)\right] X\!\left(\frac{m}{2W}\right)\right\} = \phi_X\!\left(t - \frac{m}{2W}\right) - \sum_{n=-\infty}^{\infty} \phi_X\!\left(\frac{n}{2W} - \frac{m}{2W}\right) \frac{\sin 2\pi W\!\left(t - \frac{n}{2W}\right)}{2\pi W\!\left(t - \frac{n}{2W}\right)}$$
But the right-hand side of this equation is equal to zero by application of (1) with $a = m/2W$.
Since this is true for any m, it follows that $E\left\{\left[X(t) - \hat{X}(t)\right]\hat{X}(t)\right\} = 0$. Also :
$$E\left\{\left[X(t) - \hat{X}(t)\right] X(t)\right\} = \phi_X(0) - \sum_{n=-\infty}^{\infty} \phi_X\!\left(\frac{n}{2W} - t\right) \frac{\sin 2\pi W\!\left(t - \frac{n}{2W}\right)}{2\pi W\!\left(t - \frac{n}{2W}\right)}$$
Again, by applying (1) with $a = t$ and $\tau = t$, we observe that the right-hand side of the equation
is also zero. Hence (2) holds.

Problem 2.18 :
$$Q(x) = \frac{1}{\sqrt{2\pi}}\int_x^{\infty} e^{-t^2/2}\,dt = P[N \geq x]$$
where N is a Gaussian r.v. with zero mean and unit variance. From the Chernoff bound :
$$P[N \geq x] \leq e^{-\hat{v}x}\, E\left[e^{\hat{v}N}\right] \quad (1)$$
where $\hat{v}$ is the solution to :
$$E\left[Ne^{vN}\right] - xE\left[e^{vN}\right] = 0 \quad (2)$$
Now :
$$E\left[e^{vN}\right] = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{vt}\, e^{-t^2/2}\,dt = e^{v^2/2}\,\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-(t-v)^2/2}\,dt = e^{v^2/2}$$
and
$$E\left[Ne^{vN}\right] = \frac{d}{dv}E\left[e^{vN}\right] = ve^{v^2/2}$$
Hence (2) gives :
$$\hat{v} = x$$
and then :
$$(1) \;\Rightarrow\; Q(x) \leq e^{-x^2}e^{x^2/2} \;\Rightarrow\; Q(x) \leq e^{-x^2/2}$$

Problem 2.19 :
Since $H(0) = \sum_n h(n) = 0 \;\Rightarrow\; m_y = m_x H(0) = 0$.
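As a numerical check of the Chernoff bound just derived in Problem 2.18, $Q(x) \leq e^{-x^2/2}$ can be verified for several x using the identity $Q(x) = \frac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$:

```python
import math

# Check of the Problem 2.18 bound: Q(x) <= exp(-x^2/2) for x >= 0
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

for x in [0.0, 0.5, 1.0, 2.0, 3.0, 5.0]:
    bound = math.exp(-x**2 / 2)
    print(f"x={x}: Q={Q(x):.3e}  bound={bound:.3e}")
    assert Q(x) <= bound
```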

The autocorrelation of the output sequence is
$$\phi_{yy}(k) = \sum_i\sum_j h(i)h(j)\,\phi_{xx}(k - j + i) = \sigma_x^2 \sum_{i=-\infty}^{\infty} h(i)h(k+i)$$
where the last equality stems from the autocorrelation function of X(n) :
$$\phi_{xx}(k - j + i) = \sigma_x^2\,\delta(k - j + i) = \begin{cases} \sigma_x^2, & j = k + i \\ 0, & \text{o.w.} \end{cases}$$
Hence, $\phi_{yy}(0) = 6\sigma_x^2$, $\phi_{yy}(1) = \phi_{yy}(-1) = -4\sigma_x^2$, $\phi_{yy}(2) = \phi_{yy}(-2) = \sigma_x^2$, $\phi_{yy}(k) = 0$ otherwise.
Finally, the frequency response of the discrete-time system is :
$$H(f) = \sum_n h(n)e^{-j2\pi fn} = 1 - 2e^{-j2\pi f} + e^{-j4\pi f} = \left(1 - e^{-j2\pi f}\right)^2 = \left[e^{-j\pi f}\left(e^{j\pi f} - e^{-j\pi f}\right)\right]^2 = -4e^{-j2\pi f}\sin^2\pi f$$
which gives the power density spectrum of the output :
$$\Phi_{yy}(f) = \Phi_{xx}(f)\,|H(f)|^2 = \sigma_x^2\left(16\sin^4\pi f\right) = 16\sigma_x^2\sin^4\pi f$$
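The impulse response implied by $H(f) = 1 - 2e^{-j2\pi f} + e^{-j4\pi f}$ is h(n) = {1, −2, 1}, so the output autocorrelation values above can be checked directly as the deterministic autocorrelation of h (with $\sigma_x^2 = 1$):

```python
import numpy as np

# Check of Problem 2.19: phi_yy(k) = sum_i h(i) h(k+i) for h = {1, -2, 1}
h = np.array([1.0, -2.0, 1.0])

phi_yy = np.correlate(h, h, mode="full")  # k = -2..2
print(phi_yy)   # [ 1. -4.  6. -4.  1.]
print(h.sum())  # H(0) = 0, so the output mean is 0
```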

Problem 2.20 :
$$\phi(k) = \left(\frac{1}{2}\right)^{|k|}$$
The power density spectrum is
$$\Phi(f) = \sum_{k=-\infty}^{\infty} \phi(k)e^{-j2\pi fk} = \sum_{k=1}^{\infty}\left(\frac{1}{2}\right)^k e^{j2\pi fk} + \sum_{k=0}^{\infty}\left(\frac{1}{2}\right)^k e^{-j2\pi fk}$$
$$= \frac{1}{1 - e^{j2\pi f}/2} - 1 + \frac{1}{1 - e^{-j2\pi f}/2} = \frac{2 - \cos 2\pi f}{5/4 - \cos 2\pi f} - 1 = \frac{3}{5 - 4\cos 2\pi f}$$
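The closed form can be checked by summing the series directly over a truncated range of k:

```python
import numpy as np

# Check of Problem 2.20: truncated sum of (1/2)^{|k|} e^{-j 2 pi f k}
# against the closed form 3/(5 - 4 cos(2 pi f)).
f = np.linspace(-0.5, 0.5, 101)
K = 60  # (1/2)^60 makes the truncation error negligible
k = np.arange(-K, K + 1)

Phi_sum = ((0.5 ** np.abs(k))[None, :]
           * np.exp(-2j * np.pi * f[:, None] * k)).sum(axis=1)
Phi_closed = 3 / (5 - 4 * np.cos(2 * np.pi * f))

print(np.max(np.abs(Phi_sum - Phi_closed)))  # ≈ 0 (truncation error only)
```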

Problem 2.21 :
We will denote the discrete-time process by the subscript d and the continuous-time (analog)
process by the subscript a. Also, f will denote the analog frequency and fd the discrete-time
frequency.
(a)
$$\phi_d(k) = E\left[X^*(n)X(n+k)\right] = E\left[X^*(nT)X(nT + kT)\right] = \phi_a(kT)$$
Hence, the autocorrelation function of the sampled signal is equal to the sampled autocorrelation
function of X(t).
(b)
$$\phi_d(k) = \phi_a(kT) = \int_{-\infty}^{\infty} \Phi_a(F)e^{j2\pi FkT}\,dF = \sum_{l=-\infty}^{\infty}\int_{(2l-1)/2T}^{(2l+1)/2T} \Phi_a(F)e^{j2\pi FkT}\,dF$$
$$= \sum_{l=-\infty}^{\infty}\int_{-1/2T}^{1/2T} \Phi_a\!\left(f + \frac{l}{T}\right)e^{j2\pi fkT}\,df = \int_{-1/2T}^{1/2T}\left[\sum_{l=-\infty}^{\infty} \Phi_a\!\left(f + \frac{l}{T}\right)\right]e^{j2\pi fkT}\,df$$
Let $f_d = fT$. Then :
$$\phi_d(k) = \int_{-1/2}^{1/2} \frac{1}{T}\sum_{l=-\infty}^{\infty} \Phi_a\!\left(\frac{f_d + l}{T}\right)e^{j2\pi f_d k}\,df_d \quad (1)$$
We know that the autocorrelation function of a discrete-time process is the inverse Fourier
transform of its power spectral density :
$$\phi_d(k) = \int_{-1/2}^{1/2} \Phi_d(f_d)e^{j2\pi f_d k}\,df_d \quad (2)$$
Comparing (1),(2) :
$$\Phi_d(f_d) = \frac{1}{T}\sum_{l=-\infty}^{\infty} \Phi_a\!\left(\frac{f_d + l}{T}\right) \quad (3)$$
(c) From (3) we conclude that :
$$\Phi_d(f_d) = \frac{1}{T}\,\Phi_a\!\left(\frac{f_d}{T}\right)$$
iff :
$$\Phi_a(f) = 0, \quad \forall\, f : |f| > 1/2T$$
Otherwise, the sum of the shifted copies of $\Phi_a$ (in (3)) will overlap and aliasing will occur.

Problem 2.22 :
(a)
$$\phi_a(\tau) = \int_{-\infty}^{\infty} \Phi_a(f)e^{j2\pi f\tau}\,df = \int_{-W}^{W} e^{j2\pi f\tau}\,df = \frac{\sin 2\pi W\tau}{\pi\tau}$$
By applying the result in problem 2.21, we have :
$$\phi_d(k) = \phi_a(kT) = \frac{\sin 2\pi WkT}{\pi kT}$$
(b) If $T = \frac{1}{2W}$, then :
$$\phi_d(k) = \begin{cases} 2W = 1/T, & k = 0 \\ 0, & \text{otherwise} \end{cases}$$
Thus, the sequence X(n) is a white-noise sequence. The fact that this is the minimum value of
T can be shown from the following figure of the power spectral density of the sampled process:
[Figure: power spectral density of the sampled process, with spectral copies centered at multiples of $f_s$ and band edges at $f_s - W$, $f_s$, $f_s + W$, etc.]
We see that the maximum sampling rate $f_s$ that gives a spectrally flat sequence is obtained
when :
$$f_s - W = W \;\Rightarrow\; f_s = 2W \;\Rightarrow\; T = \frac{1}{2W}$$
(c) The triangular-shaped spectrum $\Phi(f) = 1 - \frac{|f|}{W}$, $|f| \leq W$, may be obtained by convolving the rectangular-shaped spectrum $\Phi_1(f) = 1/\sqrt{W}$, $|f| \leq W/2$, with itself. Hence, $\phi(\tau) = \phi_1^2(\tau) = \frac{1}{W}\left(\frac{\sin \pi W\tau}{\pi\tau}\right)^2$. Therefore, sampling X(t) at a rate $\frac{1}{T} = W$ samples/sec produces a white sequence
with autocorrelation function :
$$\phi_d(k) = \frac{1}{W}\left(\frac{\sin \pi WkT}{\pi kT}\right)^2 = W\left(\frac{\sin \pi k}{\pi k}\right)^2 = \begin{cases} W, & k = 0 \\ 0, & \text{otherwise} \end{cases}$$

Problem 2.23 :
Let us denote : $y(t) = f_k(t)f_j(t)$. Then :
$$\int_{-\infty}^{\infty} f_k(t)f_j(t)\,dt = \int_{-\infty}^{\infty} y(t)\,dt = Y(f)\big|_{f=0}$$
where Y(f) is the Fourier transform of y(t). Since $y(t) = f_k(t)f_j(t)$ : $Y(f) = F_k(f) * F_j(f)$.
But :
$$F_k(f) = \int_{-\infty}^{\infty} f_k(t)e^{-j2\pi ft}\,dt = \frac{1}{2W}\, e^{-j2\pi fk/2W}$$
Then :
$$Y(f) = F_k(f) * F_j(f) = \int_{-\infty}^{\infty} F_k(a)\,F_j(f - a)\,da$$
and at f = 0 :
$$Y(f)\big|_{f=0} = \int_{-\infty}^{\infty} F_k(a)\,F_j(-a)\,da = \left(\frac{1}{2W}\right)^2\int_{-W}^{W} e^{-j2\pi a(k-j)/2W}\,da = \begin{cases} 1/2W, & k = j \\ 0, & k \neq j \end{cases}$$

Problem 2.24 :


$$B_{eq} = \frac{1}{G}\int_0^{\infty} |H(f)|^2\,df$$
For the filter shown in Fig. P2-12 we have G = 1 and
$$B_{eq} = \int_0^{\infty} |H(f)|^2\,df = B$$
For the lowpass filter shown in Fig. P2-16 we have
$$H(f) = \frac{1}{1 + j2\pi fRC} \;\Rightarrow\; |H(f)|^2 = \frac{1}{1 + (2\pi fRC)^2}$$
So G = 1 and
$$B_{eq} = \int_0^{\infty} |H(f)|^2\,df = \frac{1}{2}\int_{-\infty}^{\infty} \frac{df}{1 + (2\pi fRC)^2} = \frac{1}{4RC}$$
where the last integral is evaluated in the same way as in problem P-2.9 .
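The equivalent noise bandwidth of the RC lowpass filter can be confirmed by direct numerical integration:

```python
import numpy as np

# Numerical check of Problem 2.24: Beq = integral of |H(f)|^2 over f >= 0
# for the RC lowpass filter should equal 1/(4 RC).
RC = 2.0

f = np.linspace(0.0, 1e3, 2_000_001)
H2 = 1.0 / (1.0 + (2 * np.pi * f * RC) ** 2)
Beq = np.sum(H2) * (f[1] - f[0])  # simple Riemann sum; tail beyond 1e3 is negligible

print(Beq, 1 / (4 * RC))  # both ≈ 0.125
```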
