
TON DUC THANG UNIVERSITY
FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

402064
ENGINEERING ANALYSIS
CHAPTER 4: RANDOM PROCESSES

Phuong T. Tran, PhD


OBJECTIVES

• Understand the basics of random vectors and random processes
• Understand the Central Limit Theorem and its applications
• Applications in communication systems

02/11/2015 402064 - Chapter 4: Random Processes 2


CHAPTER 4: RANDOM PROCESSES

• 4.1 Random vectors
• 4.2 Function of random vectors
• 4.3 Law of Large Numbers – Central Limit Theorem
• 4.4 Random processes
• 4.5 Stationary random processes – Random signals


PROBABILITY MODEL OF N RANDOM VARIABLES

1. Generalization of the probability model for a pair of random variables.
2. Definition 1: The joint CDF of X1, X2, …, Xn is
   F_{X1,X2,…,Xn}(x1, x2, …, xn) = P(X1 ≤ x1, …, Xn ≤ xn)
3. Definition 2: The joint PMF of the discrete random variables X1, X2, …, Xn is
   P_{X1,X2,…,Xn}(x1, x2, …, xn) = P(X1 = x1, …, Xn = xn)
4. Definition 3: A random vector is a column vector [X1 X2 … Xn]^T, where each Xi is a random variable.
JOINT PDF OF N CONTINUOUS RVS

Definition 4: The joint PDF of n continuous random variables X1, X2, …, Xn is

   f_{X1,X2,…,Xn}(x1, x2, …, xn) = ∂^n F_{X1,X2,…,Xn}(x1, x2, …, xn) / (∂x1 … ∂xn)


SOME PROPERTIES

If X1, X2, …, Xn are discrete random variables with joint PMF P_{X1,X2,…,Xn}(x1, x2, …, xn), then:

• P_{X1,X2,…,Xn}(x1, x2, …, xn) ≥ 0

• Σ_{x1∈S_X1} … Σ_{xn∈S_Xn} P_{X1,X2,…,Xn}(x1, x2, …, xn) = 1
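As a quick numerical illustration (not part of the original slides), both properties can be checked for a small hypothetical joint PMF stored as a NumPy array; summing over one variable also produces the marginal PMFs discussed later in the chapter.

```python
import numpy as np

# Hypothetical joint PMF P_{X,Y}(x, y) for x in {0, 1, 2}, y in {0, 1}
P = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])

assert (P >= 0).all()            # property 1: every probability is non-negative
assert np.isclose(P.sum(), 1.0)  # property 2: all probabilities sum to 1

# Marginal PMFs, obtained by summing out the other variable
P_X = P.sum(axis=1)  # P_X(x) = sum over y of P(x, y)
P_Y = P.sum(axis=0)  # P_Y(y) = sum over x of P(x, y)
print(P_X, P_Y)
```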


SOME PROPERTIES

If X1, X2, …, Xn are continuous random variables with joint PDF f_{X1,X2,…,Xn}(x1, x2, …, xn), then:

• f_{X1,X2,…,Xn}(x1, x2, …, xn) ≥ 0 for all (x1, …, xn)

• F_{X1,X2,…,Xn}(x1, x2, …, xn) = ∫_{−∞}^{x1} … ∫_{−∞}^{xn} f_{X1,X2,…,Xn}(u1, u2, …, un) du1 … dun

• ∫_{−∞}^{∞} … ∫_{−∞}^{∞} f_{X1,X2,…,Xn}(u1, u2, …, un) du1 … dun = 1


SOME PROPERTIES

The probability of an event A expressed in terms of the random variables X1, …, Xn is:

• For discrete RVs:
   P(A) = Σ_{(x1,…,xn)∈A} P_{X1,X2,…,Xn}(x1, x2, …, xn)

• For continuous RVs:
   P(A) = ∫…∫_A f_{X1,X2,…,Xn}(x1, x2, …, xn) dx1 … dxn


EXAMPLE

The random vector (Y1, …, Y4) has the joint PDF

   f_{Y1,Y2,Y3,Y4}(y1, y2, y3, y4) = 4   if 0 ≤ y1 ≤ y2 ≤ 1 and 0 ≤ y3 ≤ y4 ≤ 1,
                                     0   otherwise.

Let C denote the event that max_i{Yi} ≤ ½. Find P(C).
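A Monte Carlo sanity check (not from the slides): analytically P(C) = 4 · (1/8) · (1/8) = 1/16 = 0.0625, since the density is 4 on the region {0 ≤ y1 ≤ y2 ≤ 1} × {0 ≤ y3 ≤ y4 ≤ 1} and each triangle {0 ≤ yi ≤ yj ≤ ½} has area 1/8. Sampling uniformly on the unit 4-cube and keeping only points in the support reproduces draws from this PDF.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(size=(2_000_000, 4))                # uniform on the unit 4-cube
ok = (y[:, 0] <= y[:, 1]) & (y[:, 2] <= y[:, 3])    # keep samples in the support
samples = y[ok]                                     # now distributed with PDF f = 4
p_c = (samples.max(axis=1) <= 0.5).mean()
print(p_c)  # ≈ 0.0625
```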


VECTOR NOTATION

1. The CDF of a random vector X is
   F_X(x) = F_{X1,X2,…,Xn}(x1, x2, …, xn)
2. The PMF of a discrete random vector X is
   P_X(x) = P_{X1,X2,…,Xn}(x1, x2, …, xn)
3. The PDF of a continuous random vector X is
   f_X(x) = f_{X1,X2,…,Xn}(x1, x2, …, xn)


MARGINAL PROBABILITY FUNCTIONS

For a joint PMF P_{X,Y,Z,W}(x, y, z, w) of discrete random variables W, X, Y, Z, some marginal PMFs are:

• P_{X,Y,Z}(x, y, z) = Σ_{w∈S_W} P_{X,Y,Z,W}(x, y, z, w)

• P_{W,Z}(w, z) = Σ_{x∈S_X} Σ_{y∈S_Y} P_{X,Y,Z,W}(x, y, z, w)

• P_X(x) = Σ_{w∈S_W} Σ_{y∈S_Y} Σ_{z∈S_Z} P_{X,Y,Z,W}(x, y, z, w)


MARGINAL PROBABILITY FUNCTIONS

For a joint PDF f_{X,Y,Z,W}(x, y, z, w) of continuous random variables W, X, Y, Z, some marginal PDFs are:

• f_{X,Y,Z}(x, y, z) = ∫_{−∞}^{∞} f_{X,Y,Z,W}(x, y, z, w) dw

• f_{W,Z}(w, z) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y,Z,W}(x, y, z, w) dx dy

• f_X(x) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y,Z,W}(x, y, z, w) dw dy dz


INDEPENDENCE OF RANDOM VECTORS

• Random variables X1, …, Xn are independent if for all x1, …, xn:
   Discrete:   P_{X1,…,Xn}(x1, …, xn) = P_{X1}(x1) P_{X2}(x2) … P_{Xn}(xn)
   Continuous: f_{X1,…,Xn}(x1, …, xn) = f_{X1}(x1) f_{X2}(x2) … f_{Xn}(xn)

• Random vectors X and Y are independent if
   Discrete:   P_{X,Y}(x, y) = P_X(x) P_Y(y)
   Continuous: f_{X,Y}(x, y) = f_X(x) f_Y(y)


GAUSSIAN RANDOM VECTORS

• Random variables X and Y are said to be jointly Gaussian, or bivariate Gaussian, if their joint PDF has the form

   f_{X,Y}(x, y) = 1 / (2π σ_X σ_Y √(1 − ρ_{X,Y}²)) ·
       exp{ − [ ((x − μ_X)/σ_X)² − 2ρ_{X,Y}(x − μ_X)(y − μ_Y)/(σ_X σ_Y) + ((y − μ_Y)/σ_Y)² ] / (2(1 − ρ_{X,Y}²)) }

   where σ_X, σ_Y > 0 and −1 < ρ_{X,Y} < 1.
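A small sketch (not from the slides) that evaluates the bivariate Gaussian PDF above and checks one of its basic consequences: with ρ = 0 the joint PDF factors into the product of the two univariate Gaussian PDFs, i.e. X and Y are independent.

```python
import math

def bivariate_gaussian_pdf(x, y, mu_x, mu_y, sig_x, sig_y, rho):
    """Joint PDF of jointly Gaussian X, Y with correlation coefficient rho."""
    zx = (x - mu_x) / sig_x
    zy = (y - mu_y) / sig_y
    q = (zx * zx - 2 * rho * zx * zy + zy * zy) / (2 * (1 - rho * rho))
    return math.exp(-q) / (2 * math.pi * sig_x * sig_y * math.sqrt(1 - rho * rho))

def gaussian_pdf(x, mu, sig):
    """Univariate Gaussian PDF."""
    return math.exp(-((x - mu) ** 2) / (2 * sig * sig)) / (sig * math.sqrt(2 * math.pi))

# With rho = 0, the joint PDF equals f_X(x) * f_Y(y)
joint = bivariate_gaussian_pdf(0.3, -1.2, 0.0, 0.0, 1.0, 1.0, rho=0.0)
product = gaussian_pdf(0.3, 0.0, 1.0) * gaussian_pdf(-1.2, 0.0, 1.0)
assert math.isclose(joint, product)
```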
BIVARIATE GAUSSIAN RANDOM VARIABLES

• ρ = 0.9, σ_X = σ_Y = 1, μ_X = μ_Y = 0


BIVARIATE GAUSSIAN RANDOM VARIABLES

• ρ = −0.9, σ_X = σ_Y = 1, μ_X = μ_Y = 0


BIVARIATE GAUSSIAN RANDOM VARIABLES

• ρ = 0, σ_X = σ_Y = 1, μ_X = μ_Y = 0


GAUSSIAN RANDOM VECTORS

• X is the Gaussian (μ_X, C_X) random vector with expected value μ_X and covariance matrix C_X if and only if

   f_X(x) = 1 / ((2π)^{n/2} det(C_X)^{1/2}) · exp{ −½ (x − μ_X)^T C_X^{−1} (x − μ_X) }

   where det(C_X), the determinant of C_X, satisfies det(C_X) > 0.

• Theorem 1: A Gaussian random vector X has independent components if and only if C_X is a diagonal matrix.
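A numeric illustration of Theorem 1 (not from the slides): for a diagonal covariance matrix, the Gaussian vector PDF above factors into a product of univariate Gaussian PDFs, so the components are independent.

```python
import numpy as np

def gaussian_vector_pdf(x, mu, C):
    """PDF of a Gaussian random vector with mean mu and covariance matrix C."""
    n = len(mu)
    d = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C))
    return float(np.exp(-0.5 * d @ np.linalg.inv(C) @ d) / norm)

mu = np.array([1.0, -2.0, 0.5])
C = np.diag([1.0, 4.0, 0.25])      # diagonal covariance => independent components
x = np.array([0.7, -1.0, 0.4])

joint = gaussian_vector_pdf(x, mu, C)
product = np.prod([
    np.exp(-0.5 * (x[i] - mu[i]) ** 2 / C[i, i]) / np.sqrt(2 * np.pi * C[i, i])
    for i in range(3)
])
assert np.isclose(joint, product)
```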


CHAPTER 4: RANDOM PROCESSES

• 4.1 Random vectors
• 4.2 Function of random vectors
• 4.3 Law of Large Numbers – Central Limit Theorem
• 4.4 Random processes
• 4.5 Stationary random processes – Random signals


FUNCTION OF RANDOM VECTORS

• Let X be a vector of n i.i.d. random variables, each with CDF F_X(x) and PDF f_X(x).

• The CDF and PDF of Y = max{X1, …, Xn} are
   F_Y(y) = [F_X(y)]^n,   f_Y(y) = n[F_X(y)]^{n−1} f_X(y)

• The CDF and PDF of W = min{X1, …, Xn} are
   F_W(w) = 1 − [1 − F_X(w)]^n,   f_W(w) = n[1 − F_X(w)]^{n−1} f_X(w)

• Proof: textbook p. 283.
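A simulation sketch (not from the slides) checking the max-CDF formula: for X1, …, Xn i.i.d. Uniform(0, 1), F_X(y) = y, so F_Y(y) = y^n and P(max ≤ 0.5) with n = 3 should be 0.5³ = 0.125.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 3, 1_000_000
x = rng.uniform(size=(trials, n))        # each row: n i.i.d. Uniform(0,1) draws
p_emp = (x.max(axis=1) <= 0.5).mean()    # empirical P(max <= 0.5)
print(p_emp)  # ≈ 0.125
```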




FUNCTION OF RANDOM VECTORS

• For a random vector X, the random variable g(X) has expected value:

   Discrete:   E[g(X)] = Σ_{x1∈S_X1} … Σ_{xn∈S_Xn} g(x) P_X(x)

   Continuous: E[g(X)] = ∫_{−∞}^{∞} … ∫_{−∞}^{∞} g(x) f_X(x) dx1 … dxn


LINEAR FUNCTION OF RANDOM VECTORS

• Given the continuous random vector X, define the derived random vector Y such that Yk = aXk + b for constants a > 0 and b. The CDF and PDF of Y are:

   F_Y(y) = F_X((y1 − b)/a, …, (yn − b)/a),   f_Y(y) = (1/a^n) f_X((y1 − b)/a, …, (yn − b)/a)

• If X is a continuous random vector and A is an invertible matrix, then Y = AX + b has PDF

   f_Y(y) = f_X(A^{−1}(y − b)) / |det(A)|


ASSIGNMENT
 Problems: 8.1.1 – 8.1.5, 8.1.9, 8.2.2, 8.3.3
 Read textbook from p.307 to p.330.



CHAPTER 4: RANDOM PROCESSES

• 4.1 Random vectors
• 4.2 Function of random vectors
• 4.3 Law of Large Numbers – Central Limit Theorem
• 4.4 Random processes
• 4.5 Stationary random processes – Random signals


CENTRAL LIMIT THEOREM

• Theorem 2: Suppose X1, X2, …, Xn is a sequence of independent, identically distributed (i.i.d.) random variables, each with mean μ and variance σ². Then the distribution of

   Y = (X1 + X2 + … + Xn − nμ) / (σ√n)

 tends to the standard normal distribution as n → ∞.

• The central limit theorem states that a large sum of independent random variables, each with finite variance, tends to behave like a normal random variable.
CENTRAL LIMIT THEOREM

• Example: Flip a coin n times and let X be the number of heads. Then X can be regarded as the sum of n independent Bernoulli random variables.
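A simulation sketch of this example (not from the slides): with μ = ½ and σ = ½ per flip, the standardized head count is approximately standard normal, so P(Y ≤ 1) should be close to Φ(1) ≈ 0.8413. The match is approximate because X is discrete; it improves as n grows.

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n, trials = 400, 200_000
heads = rng.binomial(n, 0.5, size=trials)
y = (heads - n * 0.5) / (0.5 * math.sqrt(n))    # standardized sum, as in Theorem 2

phi_1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))  # standard normal CDF at 1
p_emp = (y <= 1).mean()                          # empirical P(Y <= 1)
print(p_emp, phi_1)
```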


CHAPTER 4: RANDOM PROCESSES

• 4.1 Random vectors
• 4.2 Function of random vectors
• 4.3 Law of Large Numbers – Central Limit Theorem
• 4.4 Random processes
• 4.5 Stationary random processes – Random signals

Reading assignment: Chapter 13 (pp. 429-452)


DEFINITIONS AND EXAMPLES

• Definition 5: A stochastic process X(t) consists of an experiment with a probability measure P(·) defined on a sample space S and a time function x(t, s) that assigns a real number to each outcome s ∈ S at each time t.
• x(t, s) is called a sample function of X(t).
• The ensemble of a stochastic process is the set of all possible time functions that can result from the experiment.
• A stochastic process is also called a random process.




DEFINITIONS AND EXAMPLES

• Example 1: X(t) = a·cos(ω0·t + Θ), where Θ is a random variable uniformly distributed on [0, 2π), represents a continuous stochastic process.

• Example 2 (Random walk): Toss a coin multiple times. For each toss, if the outcome is H you win $1; otherwise you lose $1. Let X(n) be your profit after the nth toss. This is a discrete stochastic process.
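A short sketch of Example 2 (not from the slides), generating one sample function of the coin-toss random walk with the slide's payoff convention (+$1 for H, −$1 for T):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
steps = rng.choice([1, -1], size=n)   # +1 for heads, -1 for tails, fair coin
profit = np.cumsum(steps)             # X(n): cumulative profit after each toss

# For a fair coin, E[X(n)] = 0 and Var[X(n)] = n
print(profit[-1])
```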


TYPES OF STOCHASTIC PROCESSES



ENSEMBLE AVERAGE AND TIME AVERAGE

• Ensemble average: with t fixed at t = t0, X(t0) is a random variable, and we have the averages (expected value and variance) studied earlier.
• Time average: applies to a specific sample function x(t, s0) and produces a single typical number for that sample function.


EXAMPLE 3

• Consider an experiment in which we record M(t), the number of active calls at a telephone switch at time t, each second over a 15-minute interval.
• A trial of this experiment yields a sample function m(t, s).


EXAMPLE 3
 An ensemble average is the average number of
calls in progress at t = 403s.
 A time average is the average number of calls in
progress during a specific 15-minute interval.



EXAMPLE 4

• In QPSK communication systems, one of 4 equally probable symbols s0, s1, s2, s3 is transmitted every T seconds.
• If symbol si is sent, the waveform x(t, si) = cos(2πf0·t + π/4 + iπ/2) is transmitted during the interval [0, T].
• Every T seconds a random symbol from {s0, s1, s2, s3} is transmitted. Here the experiment is to transmit n symbols over the interval [0, nT].
• An outcome corresponds to a sequence of n symbols, and each sample function has duration nT seconds.




RANDOM VARIABLES FROM RANDOM PROCESS

• At a particular time t1, each x(t1, s) is a sample value of a random variable, denoted by X(t1).

For a specific t1, X(t1) is a random variable with distribution

   F(x, t1) = P[X(t1) ≤ x],   f(x, t1) = ∂F(x, t1)/∂x
EXPECTED VALUE AND CORRELATION

• Definition 6: The expected value of a stochastic process X(t) is the deterministic function
   μ_X(t) = E[X(t)]
• Definition 7: The autocorrelation function of the stochastic process X(t) is
   R_X(t, τ) = E[X(t)·X(t + τ)]
• Definition 8: The autocovariance function of the stochastic process X(t) is
   C_X(t, τ) = E[(X(t) − μ_X(t))(X(t + τ) − μ_X(t + τ))]


EXPECTED VALUE AND CORRELATION

• The autocovariance indicates how much the process is likely to change in the τ seconds elapsed between t and t + τ.
• A high covariance indicates that the sample function is unlikely to change much in the τ-second interval.
• A covariance near zero suggests rapid change.


CHAPTER 4: RANDOM PROCESSES

• 4.1 Random vectors
• 4.2 Function of random vectors
• 4.3 Law of Large Numbers – Central Limit Theorem
• 4.4 Random processes
• 4.5 Stationary random processes – Random signals

Reading assignment: Chapter 13 (pp. 452-468)


STATIONARY PROCESS

• Recall that in a stochastic process X(t), there is a random variable X(t1) at every time t1 with PDF f_{X(t1)}(x).
• For most random processes, the PDF f_{X(t1)}(x) depends on t1.
• For a special class of random processes known as stationary processes, f_{X(t1)}(x) does not depend on t1.
• Therefore, the statistical properties of a stationary process do not change with time (they are time-invariant).


STATIONARY PROCESS

• Definition 9: A stochastic process X(t) is stationary if and only if, for all sets of time instants t1, …, tm and any time difference τ,
   f_{X(t1),…,X(tm)}(x1, …, xm) = f_{X(t1+τ),…,X(tm+τ)}(x1, …, xm)
• Definition 10: A random sequence Xn is stationary if and only if, for all sets of integer time instants n1, …, nm and any integer time difference k,
   f_{X_{n1},…,X_{nm}}(x1, …, xm) = f_{X_{n1+k},…,X_{nm+k}}(x1, …, xm)


PROPERTIES

• Theorem 3: Let X(t) be a stationary random process. For constants a > 0 and b, Y(t) = aX(t) + b is also a stationary random process.
• Theorem 4: For a stationary process X(t), the expected value, the autocorrelation, and the autocovariance have the following properties for all t:
   μ_X(t) = μ_X (a constant)
   R_X(t, τ) = R_X(0, τ) = R_X(τ)
   C_X(t, τ) = R_X(τ) − μ_X² = C_X(τ)


WIDE-SENSE STATIONARY RANDOM PROCESS

• Definition 11: X(t) is a wide-sense stationary random process if and only if, for all t,
   E[X(t)] = μ_X = const   and   R_X(t, τ) = R_X(0, τ) = R_X(τ)

• Definition 12: Xn is a wide-sense stationary random sequence if and only if, for all n,
   E[Xn] = μ_X = const   and   R_X(n, k) = R_X(0, k) = R_X(k)


PROPERTIES

• For a wide-sense stationary random process X(t):
   R_X(0) ≥ 0 and R_X(τ) = R_X(−τ)
   R_X(0) = E[X²(t)] is called the average power of the random process X(t).
• For a wide-sense stationary random sequence Xn:
   R_X(0) ≥ 0 and R_X(k) = R_X(−k)
   R_X(0) = E[Xn²] is called the average power of the random sequence Xn.
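A sketch (not from the slides): estimating R_X(k) of a wide-sense stationary sequence by time averaging. For an i.i.d. zero-mean sequence with variance σ², R_X(0) = σ² (the average power) while R_X(k) ≈ 0 for k ≠ 0; the estimator also uses the symmetry R_X(k) = R_X(−k).

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 2.0, size=100_000)   # sigma = 2, so average power sigma^2 = 4

def autocorr_estimate(x, k):
    """Time-average estimate of R_X(k) = E[X_n * X_{n+k}]."""
    k = abs(k)                           # symmetry: R_X(k) = R_X(-k)
    return float(np.mean(x[:len(x) - k] * x[k:])) if k else float(np.mean(x * x))

print(autocorr_estimate(x, 0))   # ≈ 4.0 (average power)
print(autocorr_estimate(x, 5))   # ≈ 0.0 (uncorrelated samples)
```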


GAUSSIAN PROCESS

• X(t) is a Gaussian stochastic process if and only if X = [X(t1) … X(tk)]^T is a Gaussian random vector for any integer k > 0 and any set of time instants t1, t2, …, tk.
• If X(t) is a wide-sense stationary Gaussian process, then X(t) is also a stationary random process.
• W(t) is a white Gaussian noise process if and only if W(t) is a stationary Gaussian stochastic process with the properties μ_W = 0 and R_W(τ) = N0·δ(τ).


POISSON PROCESS

• Definition 13: A stochastic process N(t) is a counting process if, for every sample function, n(t, s) = 0 for t < 0 and n(t, s) is integer-valued and non-decreasing with time.
• Definition 14: A counting process N(t) is called a Poisson process with rate λ if:
   a) N(t1) − N(t0) is a Poisson random variable with expected value λ(t1 − t0).
   b) For any pair of non-overlapping intervals (t0, t1] and (t0′, t1′], N(t1) − N(t0) and N(t1′) − N(t0′) are independent random variables.
PROPERTIES OF POISSON PROCESS

• Theorem 5: For a Poisson process of rate λ, the interarrival times X1, X2, … are an i.i.d. random sequence with the exponential PDF
   f_X(x) = λe^{−λx} for x ≥ 0, and 0 otherwise.
• Theorem 6: A counting process with independent exponential(λ) interarrival times X1, X2, … is a Poisson process of rate λ.
• Theorem 7: If N1(t) and N2(t) are two independent Poisson processes of rates λ1 and λ2, respectively, then N1(t) + N2(t) is a Poisson process of rate λ1 + λ2.
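A simulation sketch of Theorem 6 (not from the slides): building a counting process from exponential(λ) interarrival times and checking that the count N(t) behaves like a Poisson(λt) random variable, whose mean and variance are both λt.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t, trials = 2.0, 10.0, 20_000   # rate, horizon, number of simulated paths

counts = []
for _ in range(trials):
    arrival, n = 0.0, 0
    while True:
        arrival += rng.exponential(1.0 / lam)   # interarrival ~ exponential(lam)
        if arrival > t:
            break
        n += 1
    counts.append(n)                             # N(t) for this sample path

counts = np.array(counts)
print(counts.mean(), counts.var())   # both ≈ lam * t = 20 for a Poisson RV
```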
SUMMARY

• Random vectors
   Joint PDF of a random vector
   Independence of random vectors
   Function of a random vector
• Law of Large Numbers
• Central Limit Theorem
• Random processes
   Stationary random processes
   Wide-sense stationary processes
   Gaussian processes
   Poisson processes


ASSIGNMENT

 Read Chapter 11 (pp. 366-389).


 Self-studying: Poisson process
 Exercises: 13.2.1, 13.2.2, 13.2.4, 13.4.1-13.4.4,
13.8.3, 13.8.4, 13.9.5, 13.11.1, 13.12.2

