Solutions Manual for Communication Systems, 4th Edition
Simon Haykin, McMaster University, Canada

Preface

This Manual is written to accompany the fourth edition of my book on Communication Systems. It consists of the following:

* Detailed solutions to all the problems in Chapters 1 to 10 of the book
* MATLAB codes and representative results for the computer experiments in Chapters 1, 2, 3, 4, 6, 7, 9, and 10

I would like to express my thanks to my graduate student, Mathini Sellathurai, for her help in solving some of the problems and writing the above-mentioned MATLAB codes. I am also grateful to my technical coordinator, Lola Brooks, for typing the solutions to new problems and preparing the manuscript for the Manual.

Simon Haykin
Ancaster
April 29, 2000

CHAPTER 1

Problem 1.1

As an illustration, three particular sample functions of the random process X(t), corresponding to F = W/4, W/2, and W, are plotted below:

(Figure: the three sample functions, each of the form sin(2πFt), plotted over the interval −2/W ≤ t ≤ 2/W.)

To show that X(t) is nonstationary, we need only observe that every waveform illustrated above is zero at t = 0, positive for 0 < t < 1/(2W), and negative for −1/(2W) < t < 0. Thus, the probability density function of the random variable X(t₁) obtained by sampling X(t) at t₁ = 1/(4W) is identically zero for negative arguments, whereas the probability density function of the random variable X(t₂) obtained by sampling X(t) at t₂ = −1/(4W) is nonzero only for negative arguments. Clearly, therefore,

    f_{X(t₁)}(x) ≠ f_{X(t₂)}(x)

and the random process X(t) is nonstationary.

Problem 1.2

    X(t) = A cos(2πf_c t)

Therefore,

    X(t_k) = A cos(2πf_c t_k)

Since the amplitude A is uniformly distributed, we may write

    f_{X(t_k)}(x) = 1/cos(2πf_c t_k),   0 ≤ x ≤ cos(2πf_c t_k)
                  = 0,                  otherwise

Similarly, we may write

    X(t_i) = A cos(2πf_c t_i)

and

    f_{X(t_i)}(x) = 1/cos(2πf_c t_i),   0 ≤ x ≤ cos(2πf_c t_i)
                  = 0,                  otherwise

Since the first-order probability density function of X(t) depends on the sampling time in this way, the process X(t) is nonstationary.

Problem

(a) Since every weighted sum of samples of the process Z(t) is Gaussian, it follows that Z(t) is a Gaussian process. Furthermore, for any two observation times t₁ and t₂,

    E[Z(t₁) Z(t₂)] = cos[2π(t₁ − t₂)]                                   (1)

so that

    E[Z²(t₁)] = 1

This result is obtained by putting t₁ = t₂ in Eq. (1). Similarly,

    E[Z²(t₂)] = 1

Therefore, the correlation coefficient of Z(t₁) and Z(t₂) is

    ρ_{Z(t₁)Z(t₂)} = Cov[Z(t₁)Z(t₂)] / (σ_{Z(t₁)} σ_{Z(t₂)}) = cos[2π(t₁ − t₂)]

Hence, the joint probability density function of Z(t₁) and Z(t₂) is

    f_{Z(t₁),Z(t₂)}(z₁, z₂) = C exp[−Q(z₁, z₂)]

where

    C = 1 / (2π sin[2π(t₁ − t₂)])

    Q(z₁, z₂) = (z₁² − 2 cos[2π(t₁ − t₂)] z₁ z₂ + z₂²) / (2 sin²[2π(t₁ − t₂)])

(b) We note that the covariance of Z(t₁) and Z(t₂) depends only on the time difference t₁ − t₂. The process Z(t) is therefore wide-sense stationary. Since it is Gaussian, it is also strictly stationary.

Problem 1.5

(a) Let

    X(t) = A + Y(t)

where A is a constant and Y(t) is a zero-mean random process. The autocorrelation function of X(t) is

    R_X(τ) = E[X(t + τ) X(t)]
           = E{[A + Y(t + τ)][A + Y(t)]}
           = E[A² + A Y(t + τ) + A Y(t) + Y(t + τ) Y(t)]
           = A² + R_Y(τ)

which shows that R_X(τ) contains a constant component equal to A².

(b) Let

    X(t) = A_c cos(2πf_c t + θ) + Z(t)

where A_c cos(2πf_c t + θ) represents the sinusoidal component of X(t) and θ is a random phase variable. The autocorrelation function of X(t) is

    R_X(τ) = E[X(t + τ) X(t)]
           = E{[A_c cos(2πf_c t + 2πf_c τ + θ) + Z(t + τ)][A_c cos(2πf_c t + θ) + Z(t)]}
           = E[A_c² cos(2πf_c t + 2πf_c τ + θ) cos(2πf_c t + θ)]
             + E[Z(t + τ) A_c cos(2πf_c t + θ)]
             + E[A_c cos(2πf_c t + 2πf_c τ + θ) Z(t)]
             + E[Z(t + τ) Z(t)]
           = (A_c²/2) cos(2πf_c τ) + R_Z(τ)

which shows that R_X(τ) contains a sinusoidal component of the same frequency as X(t).
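As a supplementary illustration (not part of the original manual), the short MATLAB sketch below estimates the autocorrelation of the sinusoidal component in Problem 1.5(b) by ensemble averaging over the random phase θ, and compares the estimate with the closed-form result (A_c²/2) cos(2πf_c τ). The particular values of A_c, f_c, the observation time t, the lag grid, and the number of ensemble members are arbitrary choices, not taken from the problem.

% Supplementary sketch (not from the original manual): Monte Carlo check of
% R_X(tau) = (Ac^2/2)*cos(2*pi*fc*tau) for X(t) = Ac*cos(2*pi*fc*t + theta),
% with theta uniform over [0, 2*pi). All parameter values are arbitrary.
Ac = 2; fc = 5; t = 0.3;           % arbitrary amplitude, frequency, observation time
N = 1e5;                           % number of ensemble members
theta = 2*pi*rand(N,1);            % one random phase per sample function
tau = linspace(-0.5, 0.5, 101);    % lags at which R_X(tau) is estimated
Rhat = zeros(size(tau));
for k = 1:numel(tau)
    x1 = Ac*cos(2*pi*fc*(t + tau(k)) + theta);   % X(t + tau) across the ensemble
    x2 = Ac*cos(2*pi*fc*t + theta);              % X(t) across the ensemble
    Rhat(k) = mean(x1 .* x2);                    % ensemble-average estimate of R_X(tau)
end
Rtheory = (Ac^2/2)*cos(2*pi*fc*tau);
fprintf('largest deviation from the closed-form result: %g\n', max(abs(Rhat - Rtheory)));

With this many ensemble members the estimated and closed-form curves agree closely, consistent with the derivation above; the estimate does not depend on the chosen observation time t, reflecting the wide-sense stationarity of the sinusoidal component.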
Problem 1.6

(a) We note that the distribution function of X(t) is

    F_{X(t)}(x) = 0,     x < 0
                = 1/2,   0 ≤ x < A
                = 1,     x ≥ A

and the corresponding probability density function is

    f_{X(t)}(x) = (1/2) δ(x) + (1/2) δ(x − A)

which are illustrated below:

(Figure: the staircase distribution function F_{X(t)}(x) and the corresponding two-impulse density f_{X(t)}(x).)

(b) By ensemble averaging, we have

    E[X(t)] = ∫ x f_{X(t)}(x) dx
            = ∫ x [(1/2) δ(x) + (1/2) δ(x − A)] dx
            = A/2

The autocorrelation function of X(t) is

    R_X(τ) = E[X(t + τ) X(t)]

Define the square function sq_{T₀}(t) as the square wave shown below:

(Figure: the square function sq_{T₀}(t), a square wave of period T₀ alternating between 0 and 1.)

Then we may write

    R_X(τ) = E[A² sq_{T₀}(t − t_d + τ) sq_{T₀}(t − t_d)]

Averaging over the delay t_d, which is uniformly distributed over one period, gives

    R_X(τ) = (A²/2)(1 − 2|τ|/T₀),   |τ| ≤ T₀/2

Since the wave is periodic with period T₀, R_X(τ) must also be periodic with period T₀.

(c) On a time-averaging basis, we note by inspection of the square wave in the problem figure that the mean is

    <x(t)> = A/2

Next, the time-averaged autocorrelation function

    <x(t + τ) x(t)> = (1/T₀) ∫_{−T₀/2}^{T₀/2} x(t + τ) x(t) dt

has its maximum value of A²/2 at τ = 0 and decreases linearly to zero at τ = T₀/2. Therefore,

    <x(t + τ) x(t)> = (A²/2)(1 − 2|τ|/T₀),   |τ| ≤ T₀/2

Again, the autocorrelation must be periodic with period T₀.

(d) We note that the ensemble-averaging and time-averaging procedures yield the same set of results for the mean and autocorrelation functions. Therefore, X(t) is ergodic in both the mean and the autocorrelation function. Since ergodicity implies wide-sense stationarity, it follows that X(t) must be wide-sense stationary. (A short numerical check of this agreement between ensemble and time averages is sketched at the end of this section.)

Problem 1.7

(a) For |τ| > T, the random variables X(t) and X(t + τ) occur in different pulse intervals and are therefore independent. Thus,

    E[X(t) X(t + τ)] = E[X(t)] E[X(t + τ)],   |τ| > T

Since both amplitudes are equally likely, we have E[X(t)] = E[X(t + τ)] = A/2. Therefore,

    R_X(τ) = A²/4,   |τ| > T

For |τ| < T, the random variables X(t) and X(t + τ) can occur in the same pulse interval, and this case must be treated separately.

Problem

Form the non-negative quantity

    E{[X(t + τ) + Y(t)]²} = E[X²(t + τ) + 2 X(t + τ) Y(t) + Y²(t)]
                          = E[X²(t + τ)] + 2 E[X(t + τ) Y(t)] + E[Y²(t)]
                          = R_X(0) + 2 R_XY(τ) + R_Y(0)

Hence,

    R_X(0) + 2 R_XY(τ) + R_Y(0) ≥ 0

or

    R_XY(τ) ≥ −(1/2)[R_X(0) + R_Y(0)]

Repeating the argument with the non-negative quantity E{[X(t + τ) − Y(t)]²} gives the complementary bound R_XY(τ) ≤ (1/2)[R_X(0) + R_Y(0)], so that

    |R_XY(τ)| ≤ (1/2)[R_X(0) + R_Y(0)]
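As a supplementary numerical illustration (not part of the original manual), the MATLAB sketch below checks the bound just derived on one example pair of processes: X(t) = cos(2πf_c t + θ) with θ uniform over [0, 2π), and Y(t) a delayed copy of X(t) plus independent noise. The processes themselves, the delay d, the noise level, and all other parameter values are arbitrary choices; the cross-correlation is estimated as R_XY(τ) = E[X(t + τ) Y(t)], matching the definition used in the derivation.

% Supplementary sketch (not from the original manual): Monte Carlo check of
% |R_XY(tau)| <= (1/2)[R_X(0) + R_Y(0)] for one example pair of processes.
% All parameter values below are arbitrary choices.
fc = 3; d = 0.1; t = 0.4; N = 1e5;
theta = 2*pi*rand(N,1);                              % common random phase
X0 = cos(2*pi*fc*t + theta);                         % X(t) across the ensemble
Y0 = cos(2*pi*fc*(t - d) + theta) + 0.5*randn(N,1);  % Y(t): delayed X(t) plus noise
Rx0 = mean(X0.^2);                                   % estimate of R_X(0)
Ry0 = mean(Y0.^2);                                   % estimate of R_Y(0)
tau = linspace(-1, 1, 81);
Rxy = zeros(size(tau));
for k = 1:numel(tau)
    Xt = cos(2*pi*fc*(t + tau(k)) + theta);          % X(t + tau) across the ensemble
    Rxy(k) = mean(Xt .* Y0);                         % estimate of R_XY(tau)
end
fprintf('max |R_XY(tau)| = %.3f, bound (1/2)[R_X(0)+R_Y(0)] = %.3f\n', ...
        max(abs(Rxy)), 0.5*(Rx0 + Ry0));

For this particular pair the theory gives max |R_XY(τ)| = 1/2 against a bound of 5/8, so the printed estimates should show the inequality holding with some margin.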

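Finally, the numerical sketch referred to in Problem 1.6(d) is given below (again, not part of the original manual). It compares the ensemble average of the randomly delayed square wave, taken over many independent delays at a fixed observation time, with the time average of a single sample function; both should come out close to A/2. The amplitude A, period T₀, observation time, fixed delay of the single sample function, and grid sizes are arbitrary choices, and the inline helper sq is simply a 0/1 square wave written here to mimic the sq_{T₀}(t) notation of the solution.

% Supplementary sketch (not from the original manual): ensemble average versus
% time average of the mean for the randomly delayed square wave of Problem 1.6.
% All parameter values and the inline helper sq are arbitrary choices.
A = 1; T0 = 2;                        % amplitude and period (arbitrary)
sq = @(t) double(mod(t, T0) < T0/2);  % 0/1 square wave: 1 on the first half of each period
% Ensemble average at a fixed time, over random delays uniform on [0, T0):
N  = 1e5;                             % number of sample functions
td = T0*rand(N,1);                    % one random delay per sample function
t  = 0.7;                             % arbitrary observation time
ens_mean = mean(A*sq(t - td));        % should be close to A/2
% Time average of a single sample function (one fixed delay):
tt = linspace(0, 100*T0, 1e5);        % long observation interval
time_mean = mean(A*sq(tt - 0.3));     % should also be close to A/2
fprintf('ensemble mean = %.3f, time mean = %.3f, A/2 = %.3f\n', ...
        ens_mean, time_mean, A/2);

The agreement of both printed values with A/2 mirrors part (d) of Problem 1.6: for this process, ensemble averaging and time averaging yield the same mean.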