
Lecture Notes 6

Random Processes

• Definition and Simple Examples


• Important Classes of Random Processes
◦ IID
◦ Random Walk Process
◦ Markov Processes
◦ Independent Increment Processes
◦ Counting processes and Poisson Process

• Mean and Autocorrelation Function


• Gaussian Random Processes
◦ Gauss–Markov Process


Random Process

• A random process (RP) (or stochastic process) is an infinite indexed collection


of random variables {X(t) : t ∈ T }, defined over a common probability space
• The index parameter t is typically time, but can also be a spatial dimension
• Random processes are used to model random experiments that evolve in time:
◦ Received sequence/waveform at the output of a communication channel
◦ Packet arrival times at a node in a communication network
◦ Thermal noise in a resistor
◦ Scores of an NBA team in consecutive games
◦ Daily price of a stock
◦ Winnings or losses of a gambler



Questions Involving Random Processes

• Dependencies of the random variables of the process


◦ How do future received values depend on past received values?
◦ How do future prices of a stock depend on its past values?
• Long term averages
◦ What is the proportion of time a queue is empty?
◦ What is the average noise power at the output of a circuit?
• Extreme or boundary events
◦ What is the probability that a link in a communication network is congested?
◦ What is the probability that the maximum power in a power distribution line
is exceeded?
◦ What is the probability that a gambler will lose all his capital?
• Estimation/detection of a signal from a noisy waveform


Two Ways to View a Random Process

• A random process can be viewed as a function X(t, ω) of two variables,


time t ∈ T and the outcome of the underlying random experiment ω ∈ Ω
◦ For fixed t, X(t, ω) is a random variable over Ω
◦ For fixed ω, X(t, ω) is a deterministic function of t, called a sample function

[Figure: three sample functions X(t, ω1), X(t, ω2), X(t, ω3) plotted against t; at fixed times t1 and t2, the vertical slices X(t1, ω) and X(t2, ω) are random variables]



Discrete Time Random Process

• A random process is said to be discrete time if T is a countably infinite set, e.g.,


◦ N = {0, 1, 2, . . .}
◦ Z = {. . . , −2, −1, 0, +1, +2, . . .}
• In this case the process is denoted by Xn , for n ∈ N , a countably infinite set,
and is simply an infinite sequence of random variables
• A sample function for a discrete time process is called a sample sequence or
sample path
• A discrete-time process can comprise discrete, continuous, or mixed r.v.s


Example

• Let Z ∼ U[0, 1], and define the discrete time process Xn = Z^n for n ≥ 1
• Sample paths:
[Figure: sample paths xn = Z^n versus n = 1, 2, 3, . . .: for Z = 1/2 the values 1/2, 1/4, 1/8, 1/16, . . .; for Z = 1/4 the values 1/4, 1/16, 1/64, . . .; for Z = 0 the all-zero path]



• First-order pdf of the process: For each n, Xn = Z^n is a r.v.; the sequence of
pdfs of Xn is called the first-order pdf of the process

[Figure: xn = z^n as a function of z ∈ [0, 1]]

Since Xn is a differentiable function of the continuous r.v. Z, we can find its
pdf as
fXn(x) = 1/(n x^((n−1)/n)) = (1/n) x^(1/n − 1) ,  0 ≤ x ≤ 1
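A quick numerical sanity check, not part of the original notes (the parameters and script are illustrative): the CDF implied by this pdf is F(x) = x^(1/n), which we can compare against the empirical CDF of simulated Z^n:

# Sanity check (illustrative): the CDF of Xn = Z^n is F(x) = x^(1/n)
import numpy as np

rng = np.random.default_rng(0)
n = 3                                        # illustrative order
samples = rng.uniform(0.0, 1.0, 200_000) ** n

for x in [0.1, 0.5, 0.9]:
    print(x, (samples <= x).mean(), x ** (1 / n))   # empirical vs exact CDF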


Continuous Time Random Process

• A random process is continuous time if T is a continuous set


• Example: Sinusoidal Signal with Random Phase
X(t) = α cos(ωt + Θ) , t≥0
where Θ ∼ U[0, 2π] and α and ω are constants
• Sample functions:
[Figure: x(t) = α cos(ωt + Θ) for Θ = 0, Θ = π/4, and Θ = π/2; each sample function is a sinusoid of amplitude α shifted in phase]


• The first-order pdf of the process is the pdf of X(t) = α cos(ωt + Θ). In an
earlier homework exercise, we found it to be
fX(t)(x) = 1/(απ √(1 − (x/α)^2)) ,  −α < x < +α

The graph of the pdf is shown below
[Figure: fX(t)(x) versus x on (−α, +α); the pdf has minimum value 1/(απ) at x = 0 and grows without bound as x → ±α]
Note that the pdf is independent of t (the process is stationary)


Specifying a Random Process

• In the above examples we specified the random process by describing the set of
sample functions (sequences, paths) and explicitly providing a probability
measure over the set of events (subsets of sample functions)
• This way of specifying a random process has very limited applicability, and is
suited only for very simple processes
• A random process is typically specified (directly or indirectly) by specifying all its
n-th order cdfs (pdfs, pmfs), i.e., the joint cdf (pdf, pmf) of the samples
X(t1), X(t2), . . . , X(tn)
for every order n and for every set of n points t1, t2, . . . , tn ∈ T
• The following examples of important random processes will be specified (directly
or indirectly) in this manner



Important Classes of Random Processes

• IID process: {Xn : n ∈ N } is an IID process if the r.v.s Xn are i.i.d.


Examples:
◦ Bernoulli process: X1, X2, . . . , Xn, . . . i.i.d. ∼ Bern(p)
◦ Discrete-time white Gaussian noise (WGN): X1, . . . , Xn, . . . i.i.d. ∼ N (0, N )
• Here we specified the n-th order pmfs (pdfs) of the processes by specifying the
first-order pmf (pdf) and stating that the r.v.s are independent
• It would be quite difficult to provide the specifications for an IID process by
specifying the probability measure over the subsets of the sample space


The Random Walk Process

• Let Z1, Z2, . . . , Zn, . . . be i.i.d., where


Zn = +1 with probability 1/2, and Zn = −1 with probability 1/2

• The random walk process is defined by


X0 = 0
Xn = Σ_{i=1}^n Zi ,  n ≥ 1

• Again this process is specified by (indirectly) specifying all n-th order pmfs
• Sample path: The sample path for a random walk is a sequence of integers, e.g.,
0, +1, 0, −1, −2, −3, −4, . . .
or
0, +1, +2, +3, +4, +3, +4, +3, +4, . . .



Example: [Figure: sample path xn of the random walk for zn = 1, 1, 1, −1, 1, −1, −1, −1, −1, −1, giving xn = 1, 2, 3, 2, 3, 2, 1, 0, −1, −2]
• First-order pmf: The first-order pmf is P{Xn = k} as a function of n. Note that
k ∈ {−n, −(n − 2), . . . , −2, 0, +2, . . . , +(n − 2), +n} for n even
k ∈ {−n, −(n − 2), . . . , −1, +1, +3, . . . , +(n − 2), +n} for n odd
Hence, P{Xn = k} = 0 if n + k is odd, or if k < −n or k > n


Now for n + k even, let a be the number of +1’s in n steps; then the number
of −1’s is n − a, and we find that
k = a − (n − a) = 2a − n ⇒ a = (n + k)/2
Thus
P{Xn = k} = P{(n + k)/2 heads in n independent coin tosses}
= (n choose (n+k)/2) · 2^−n for n + k even and −n ≤ k ≤ n

[Figure: the pmf P{Xn = k} for n even, supported on even k ∈ {−n, . . . , −2, 0, +2, . . . , +n} with a bell-shaped profile]
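A short simulation sketch (illustrative, not from the notes) confirms the formula by comparing the empirical pmf of Xn with the binomial expression:

# Empirical pmf of the random walk vs P{Xn = k} = (n choose (n+k)/2) 2^(-n)
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n, trials = 10, 200_000
xn = rng.choice([-1, 1], size=(trials, n)).sum(axis=1)   # X_n over many runs

for k in [-4, 0, 2, 6]:                      # n + k even in each case
    print(k, (xn == k).mean(), comb(n, (n + k) // 2) * 2.0 ** (-n))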



Markov Processes

• A discrete-time random process Xn is said to be a Markov process if its future
and past are conditionally independent given its present value
• Mathematically this can be rephrased in several ways. For example, if the r.v.s
{Xn : n ≥ 1} are discrete, then the process is Markov iff
pXn+1|Xn,...,X1(xn+1 | xn, . . . , x1) = pXn+1|Xn(xn+1 | xn)
for every n
• IID processes are Markov
• The random walk process is Markov. To see this consider
P{Xn+1 = xn+1 | Xn = xn, . . . , X1 = x1}
= P{Xn + Zn+1 = xn+1 | Xn = xn, . . . , X1 = x1}
= P{Zn+1 = xn+1 − xn}   (since Zn+1 is independent of X1, . . . , Xn)
= P{Xn+1 = xn+1 | Xn = xn}


Independent Increment Processes

• A discrete-time random process {Xn : n ≥ 0} is said to be independent


increment if the increment random variables
Xn1 , Xn2 − Xn1 , . . . , Xnk − Xnk−1
are independent for all sequences of indices such that n1 < n2 < · · · < nk
• Example: Random walk is an independent increment process because
Xn1 = Σ_{i=1}^{n1} Zi ,  Xn2 − Xn1 = Σ_{i=n1+1}^{n2} Zi , . . . ,  Xnk − Xnk−1 = Σ_{i=nk−1+1}^{nk} Zi

are independent, since they are functions of disjoint sets of the independent r.v.s Z1, Z2, . . .


• The independent increment property makes it easy to find the n-th order pmfs
of a random walk process from knowledge only of the first-order pmf



• Example: Find P{X5 = 3, X10 = 6, X20 = 10} for random walk process {Xn}

Solution: We use the independent increment property as follows


P{X5 = 3, X10 = 6, X20 = 10} = P{X5 = 3, X10 − X5 = 3, X20 − X10 = 4}
= P{X5 = 3} P{X10 − X5 = 3} P{X20 − X10 = 4}
= P{X5 = 3} P{X5 = 3} P{X10 = 4} ,
since each increment has the same pmf as the walk itself after the same number
of steps. Hence the probability is
(5 choose 4) 2^−5 · (5 choose 4) 2^−5 · (10 choose 7) 2^−10 = 3000 · 2^−20
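The arithmetic can be double-checked in a few lines of Python (a throwaway check, not part of the notes):

# Check: (5 choose 4)2^-5 * (5 choose 4)2^-5 * (10 choose 7)2^-10 = 3000 * 2^-20
from math import comb

p = comb(5, 4) * 2**-5 * comb(5, 4) * 2**-5 * comb(10, 7) * 2**-10
print(p, 3000 * 2**-20)                      # both print 0.00286102294921875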
• In general if a process is independent increment, then it is also Markov. To see
this let Xn be an independent increment process and define
∆Xn = [X1, X2 − X1, . . . , Xn − Xn−1]T
Then
pXn+1|Xn,...,X1(xn+1 | xn, . . . , x1) = P{Xn+1 = xn+1 | Xn = xn, . . . , X1 = x1}
= P{Xn+1 − Xn = xn+1 − xn | ∆Xn = ∆xn}   (the map from (X1, . . . , Xn) to ∆Xn is one-to-one)
= P{Xn+1 − Xn = xn+1 − xn}   (by the independent increment property)
= P{Xn+1 = xn+1 | Xn = xn}

• The converse is not necessarily true, e.g., IID processes are Markov but not
independent increment


• The independent increment property can be extended to continuous-time


processes:
A process X(t), t ≥ 0, is said to be independent increment if X(t1),
X(t2) − X(t1), . . . , X(tk ) − X(tk−1) are independent for every
0 ≤ t1 < t2 < . . . < tk and every k ≥ 2
• Markovity can also be extended to continuous-time processes:
A process X(t) is said to be Markov if X(tk+1) and (X(t1), . . . , X(tk−1)) are
conditionally independent given X(tk ) for every 0 ≤ t1 < t2 < . . . < tk < tk+1
and every k ≥ 3



Counting Processes and Poisson Process

• A continuous-time random process N (t), t ≥ 0, is said to be a counting process


if N (0) = 0 and N (t) ∈ {0, 1, 2, . . .} is the number of events occurring in the
interval (0, t] (hence N (t2) ≥ N (t1) for every t2 > t1 ≥ 0)
Sample path of a counting process:
[Figure: staircase sample path n(t) that starts at 0 and jumps by 1 at each of the arrival times t1 < t2 < t3 < t4 < t5]
t1, t2, . . . are the arrival times (or waiting times) of the events
t1, t2 − t1, t3 − t2, . . . are the interarrival times of the events


The events may be:


◦ Photon arrivals at an optical detector
◦ Packet arrivals at a router
◦ Student arrivals at a class
• The Poisson process is a counting process in which the events are “independent
of each other”
• More precisely, N (t) is a Poisson process with rate (intensity) λ > 0 if:
◦ N (0) = 0
◦ N (t) is independent increment
◦ (N (t2) − N (t1)) ∼ Poisson(λ(t2 − t1)) for all t2 > t1 ≥ 0
• To find the kth order pmf, we use the independent increment property
P{N (t1) = n1, N (t2) = n2, . . . , N (tk ) = nk }
= P{N (t1 ) = n1, N (t2) − N (t1) = n2 − n1, . . . , N (tk ) − N (tk−1) = nk − nk−1}
= pN (t1)(n1)pN (t2)−N (t1)(n2 − n1) . . . pN (tk )−N (tk−1)(nk − nk−1)
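A Poisson process is also easy to simulate by accumulating i.i.d. Exp(λ) interarrival times (this anticipates the interarrival-time characterization given on a later slide; the sketch and its parameters are illustrative):

# Minimal Poisson process sketch: arrivals via i.i.d. Exp(lam) interarrivals
import numpy as np

rng = np.random.default_rng(2)
lam, t_end = 2.0, 1000.0
arrival_times = np.cumsum(rng.exponential(1 / lam, size=int(3 * lam * t_end)))

def N(t):
    """Number of events in (0, t]."""
    return int(np.searchsorted(arrival_times, t, side="right"))

print(N(t_end) / t_end, lam)                 # long-run rate, close to lam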



• Example: Packets arrive at a router according to a Poisson process N (t) with
rate λ. Assume the service times of successive packets are i.i.d. T ∼ Exp(β),
independent of N (t). What is the probability that k packets arrive during a
service time?
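The slide leaves the computation to the reader; a sketch of one standard route (conditioning on T = t and averaging over fT) gives
P{k arrivals during T} = ∫_0^∞ P{N (t) = k} fT (t) dt
= ∫_0^∞ ((λt)^k/k!) e^−λt · βe^−βt dt
= (β/(λ + β)) (λ/(λ + β))^k ,  k = 0, 1, 2, . . . ,
using ∫_0^∞ t^k e^−(λ+β)t dt = k!/(λ + β)^(k+1); that is, the number of arrivals during a service time is geometric with parameter β/(λ + β)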
• Merging : The sum of independent Poisson processes is a Poisson process. This is a
consequence of the infinite divisibility of the Poisson r.v.
• Branching : Let N (t) be a Poisson process with rate λ. We split N (t) into two
counting subprocesses N1(t) and N2(t) such that N (t) = N1(t) + N2(t) as
follows:
Each event is randomly and independently assigned to process N1(t) with
probability p, otherwise it is assigned to N2(t)
Then N1(t) is a Poisson process with rate pλ and N2(t) is a Poisson process
with rate (1 − p)λ
This can be generalized to splitting a Poisson process into more than two
processes
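A small thinning sketch (illustrative parameters, not from the notes) shows the split rates empirically:

# Thinning a rate-lam Poisson process into N1 (prob p) and N2 (prob 1 - p)
import numpy as np

rng = np.random.default_rng(3)
lam, p, t_end = 5.0, 0.3, 2000.0
arrivals = np.cumsum(rng.exponential(1 / lam, size=int(2 * lam * t_end)))
arrivals = arrivals[arrivals <= t_end]       # events of N(t) in (0, t_end]

to_n1 = rng.random(arrivals.size) < p        # independent Bernoulli(p) labels
print(np.sum(to_n1) / t_end, p * lam)        # empirical rate of N1 vs p*lam
print(np.sum(~to_n1) / t_end, (1 - p) * lam) # empirical rate of N2 vs (1-p)*lam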


Related Processes

• Arrival time process: Let N (t) be Poisson with rate λ. The arrival time process
Tn , n ≥ 0 is a discrete time process such that:
◦ T0 = 0
◦ Tn is the arrival time of the nth event of N (t)
• Interarrival time process: Let N (t) be a Poisson process with rate λ. The
interarrival time process is Xn = Tn − Tn−1 for n ≥ 1
• Xn is an IID process with Xn ∼ Exp(λ)
• Tn = Σ_{i=1}^n Xi is an independent increment process with
Tn2 − Tn1 ∼ Gamma(λ, n2 − n1) for n2 > n1 ≥ 1, i.e.,
fTn2−Tn1(t) = λ^(n2−n1) t^(n2−n1−1) e^−λt / (n2 − n1 − 1)! ,  t ≥ 0
• Example: Let N1 (t) and N2(t) be two independent Poisson processes with rates
λ1 and λ2 , respectively. What is the probability that N1(t) = 1 before
N2(t) = 1?
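The slide poses this as a question; a sketch of the answer: the time to the first event of N1(t) is Exp(λ1) and the time to the first event of N2(t) is Exp(λ2), independent of each other, so
P{N1(t) = 1 before N2(t) = 1} = ∫_0^∞ λ1 e^−λ1t · e^−λ2t dt = λ1/(λ1 + λ2)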



• Random telegraph process: A random telegraph process Y (t), t ≥ 0, assumes
values of +1 and −1 with Y (0) = +1 with probability 1/2 and −1 with
probability 1/2, and
Y (t) changes polarity at each event of a Poisson process with rate λ > 0
Sample path:
[Figure: y(t) alternating between +1 and −1, switching polarity at each Poisson event; x1, x2, . . . mark the interarrival times]


Mean and Autocorrelation Functions

• For a random vector X the first and second order moments are
◦ mean µ = E(X)
◦ correlation matrix RX = E(XXT )
• For a random process X(t) the first and second order moments are
◦ mean function: µX (t) = E(X(t)) for t ∈ T

◦ autocorrelation function: RX (t1, t2) = E(X(t1)X(t2)) for t1, t2 ∈ T
• The autocovariance function of a random process is defined as
  
CX (t1, t2) = E[(X(t1) − E(X(t1)))(X(t2) − E(X(t2)))]
The autocovariance function can be expressed using the mean and
autocorrelation functions as
CX (t1, t2) = RX (t1, t2) − µX (t1)µX (t2)



Examples

• IID process:
µX (n) = E(X1)
RX (n1, n2) = E(Xn1Xn2) = E(X1^2) if n1 = n2 , and (E(X1))^2 if n1 ≠ n2
• Random phase signal process:
µX (t) = E(α cos(ωt + Θ)) = ∫_0^2π α cos(ωt + θ) (1/2π) dθ = 0

RX (t1, t2) = E(X(t1)X(t2))
= ∫_0^2π α^2 cos(ωt1 + θ) cos(ωt2 + θ) (1/2π) dθ
= ∫_0^2π (α^2/4π) [ cos(ω(t1 + t2) + 2θ) + cos(ω(t1 − t2)) ] dθ
= (α^2/2) cos(ω(t1 − t2))
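A Monte Carlo spot check of this autocorrelation (illustrative α, ω, t1, t2; not part of the notes):

# Check R_X(t1, t2) = (alpha^2 / 2) cos(omega (t1 - t2)) by simulation
import numpy as np

rng = np.random.default_rng(4)
alpha, omega, t1, t2 = 2.0, 3.0, 0.7, 1.9
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)       # Theta ~ U[0, 2pi]

emp = np.mean(alpha * np.cos(omega * t1 + theta)
              * alpha * np.cos(omega * t2 + theta))
print(emp, alpha**2 / 2 * np.cos(omega * (t1 - t2)))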


• Random walk:
µX (n) = E(Σ_{i=1}^n Zi) = Σ_{i=1}^n E(Zi) = 0

RX (n1, n2) = E(Xn1Xn2)
= E[Xn1(Xn2 − Xn1 + Xn1)]
= E(Xn1) E(Xn2 − Xn1) + E(Xn1^2)   (by independent increments)
= E(Xn1^2) = n1 assuming n2 ≥ n1
= min{n1, n2} in general

• Poisson process:
µN (t) = λt
RN (t1, t2) = E(N (t1)N (t2))
= E[N (t1)(N (t2) − N (t1) + N (t1))]
= λt1 · λ(t2 − t1) + (λt1 + λ^2 t1^2) = λt1 + λ^2 t1 t2 assuming t2 ≥ t1
= λ min{t1, t2} + λ^2 t1 t2 in general



Gaussian Random Processes

• A Gaussian random process (GRP) is a random process X(t) such that


[X(t1), X(t2), . . . , X(tn) ]T
is a GRV for all t1, t2, . . . , tn ∈ T
• Since the joint pdf for a GRV is specified by its mean and covariance matrix, a
GRP is specified by its mean µX (t) and autocorrelation RX (t1, t2) functions
• Example: The discrete time WGN process is a GRP


Gauss-Markov Process

• Let Zn , n ≥ 1, be a WGN process, i.e., an IID process with Z1 ∼ N (0, N )


The Gauss-Markov process is a first-order autoregressive process defined by
X1 = Z1
Xn = αXn−1 + Zn , n > 1,
where |α| < 1
• This process is a GRP, since X1 = Z1 and Xk = αXk−1 + Zk with Z1, Z2, . . .
i.i.d. N (0, N ) imply
Xn = Σ_{k=1}^n α^(n−k) Zk ,
i.e., [X1, X2, . . . , Xn]T is a linear transformation of the GRV [Z1, Z2, . . . , Zn]T
(by the lower triangular matrix with (i, j) entry α^(i−j) for i ≥ j and 0 otherwise)
and is therefore a GRV
• Clearly, the Gauss-Markov process is Markov. It is not, however, an independent
increment process
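A minimal generation sketch (illustrative parameters; the noise power N is written N0 in code to avoid clashing with a sample count):

# Generate a Gauss-Markov path: X_1 = Z_1, X_n = alpha X_{n-1} + Z_n
import numpy as np

rng = np.random.default_rng(5)
alpha, N0, n = 0.9, 1.0, 500
z = rng.normal(0.0, np.sqrt(N0), size=n)     # WGN: Z_n i.i.d. N(0, N0)

x = np.empty(n)
x[0] = z[0]
for i in range(1, n):
    x[i] = alpha * x[i - 1] + z[i]           # first-order autoregression

Plotting x shows the slowly wandering, strongly correlated behavior typical of α near 1.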



• Mean and covariance functions:
µX (n) = E(Xn) = E(αXn−1 + Zn)
= α E(Xn−1) + E(Zn) = α E(Xn−1) = α^(n−1) E(Z1) = 0
To find the autocorrelation function, for n2 > n1 we write
Xn2 = α^(n2−n1) Xn1 + Σ_{i=0}^{n2−n1−1} α^i Zn2−i
Thus
RX (n1, n2) = E(Xn1Xn2) = α^(n2−n1) E(Xn1^2) + 0 ,
since Xn1 and Zn2−i are independent and zero mean for 0 ≤ i ≤ n2 − n1 − 1
Next, to find E(Xn1^2), consider
E(X1^2) = N
E(Xn1^2) = E[(αXn1−1 + Zn1)^2] = α^2 E(Xn1−1^2) + N
Thus
E(Xn1^2) = N (1 − α^(2n1)) / (1 − α^2)


Finally, the autocorrelation function is
RX (n1, n2) = α^|n2−n1| N (1 − α^(2 min{n1,n2})) / (1 − α^2)
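This closed form can be spot-checked by averaging Xn1 Xn2 over many simulated paths (illustrative sketch, not from the notes):

# Empirical E(X_{n1} X_{n2}) vs alpha^(n2-n1) N0 (1 - alpha^(2 n1)) / (1 - alpha^2)
import numpy as np

rng = np.random.default_rng(6)
alpha, N0, n, trials = 0.8, 1.0, 30, 200_000
z = rng.normal(0.0, np.sqrt(N0), size=(trials, n))

x = np.empty_like(z)
x[:, 0] = z[:, 0]
for i in range(1, n):
    x[:, i] = alpha * x[:, i - 1] + z[:, i]

n1, n2 = 10, 20                              # columns n1-1, n2-1 hold X_{n1}, X_{n2}
emp = np.mean(x[:, n1 - 1] * x[:, n2 - 1])
exact = alpha ** (n2 - n1) * N0 * (1 - alpha ** (2 * n1)) / (1 - alpha ** 2)
print(emp, exact)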
• Estimation of Gauss-Markov process: Suppose we observe a noisy version of the
Gauss-Markov process,
Yn = Xn + Wn ,
where Wn is a WGN process independent of Zn with average power Q
We can use the Kalman filter from Lecture Notes 4 to estimate Xi+1 from Y^i = (Y1, . . . , Yi)
as follows:
Initialization:
X̂1|0 = 0
σ²1|0 = N



Update: For i = 1, 2, . . . ,
X̂i+1|i = αX̂i|i−1 + ki(Yi − X̂i|i−1) ,
σ²i+1|i = α(α − ki)σ²i|i−1 + N , where
ki = ασ²i|i−1 / (σ²i|i−1 + Q)

Substituting from the ki equation into the MSE update equation, we obtain
σ²i+1|i = α²Qσ²i|i−1 / (σ²i|i−1 + Q) + N
This is a Riccati recursion (a quadratic recursion in the MSE) and has a steady
state solution:
σ² = α²Qσ² / (σ² + Q) + N
Solving this quadratic equation, we obtain
σ² = [ N − (1 − α²)Q + √(4N Q + (N − (1 − α²)Q)²) ] / 2


The Kalman gain ki converges to
k = [ −N − (1 − α²)Q + √(4N Q + (N − (1 − α²)Q)²) ] / (2αQ)
and the steady-state Kalman filter is
X̂i+1|i = αX̂i|i−1 + k(Yi − X̂i|i−1)
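A scalar implementation of these recursions (a sketch with illustrative α, N, Q; not the course's official code) confirms that ki settles to the closed-form gain:

# Scalar Kalman filter for Y_n = X_n + W_n, X_n Gauss-Markov
import numpy as np

rng = np.random.default_rng(7)
alpha, N0, Q, n = 0.9, 1.0, 0.5, 200         # N0 = process noise power N

x = np.empty(n)                              # simulate the state ...
x[0] = rng.normal(0.0, np.sqrt(N0))          # X_1 = Z_1
for i in range(1, n):
    x[i] = alpha * x[i - 1] + rng.normal(0.0, np.sqrt(N0))
y = x + rng.normal(0.0, np.sqrt(Q), size=n)  # ... and the noisy observations

x_pred, sig2 = 0.0, N0                       # X-hat_{1|0} = 0, sigma^2_{1|0} = N
for i in range(n):
    k = alpha * sig2 / (sig2 + Q)            # Kalman gain k_i
    x_pred = alpha * x_pred + k * (y[i] - x_pred)
    sig2 = alpha * (alpha - k) * sig2 + N0   # MSE (Riccati) update

k_ss = (-N0 - (1 - alpha**2) * Q
        + np.sqrt(4 * N0 * Q + (N0 - (1 - alpha**2) * Q) ** 2)) / (2 * alpha * Q)
print(k, k_ss)                               # recursion gain vs closed form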

