
Chapter 2.

Poisson Processes

Prof. Ai-Chun Pang


Graduate Institute of Networking and Multimedia,
Department of Computer Science and Information Engineering,
National Taiwan University, Taiwan
Outline
• Introduction to Poisson Processes
• Properties of Poisson processes
– Inter-arrival time distribution
– Waiting time distribution
– Superposition and decomposition
• Non-homogeneous Poisson processes (relaxing stationary increments)
• Compound Poisson processes (relaxing single arrivals)
• Modulated Poisson processes (relaxing independent increments)
• Poisson Arrivals See Time Averages (PASTA)

Prof. Ai-Chun Pang, NTU 1


Introduction
[Figure: a time axis with arrival epochs S̃0, S̃1, S̃2, S̃3, S̃4, the inter-arrival times x̃1, x̃2, x̃3 between them, and the counting process ñ(t).]

(i) The nth arrival epoch S̃n is

S̃n = x̃1 + x̃2 + · · · + x̃n = Σ_{i=1}^{n} x̃i,   S̃0 = 0

(ii) The number of arrivals by time t is ñ(t). Notice that:

{ñ(t) ≥ n} iff {S̃n ≤ t},   {ñ(t) = n} iff {S̃n ≤ t and S̃n+1 > t}
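The equivalence between the arrival epochs and the counting process can be checked numerically. A minimal sketch (the values λ = 2 and N = 10 are assumed for illustration; they do not appear in the slides):

```python
import random

random.seed(1)
lam = 2.0   # assumed arrival rate, for illustration only
N = 10      # number of arrivals to generate

# Inter-arrival times x_i and arrival epochs S_n = x_1 + ... + x_n, S_0 = 0
x = [random.expovariate(lam) for _ in range(N)]
S = [0.0]
for xi in x:
    S.append(S[-1] + xi)

def n_of_t(t):
    """Number of arrivals in (0, t]: the largest n with S_n <= t."""
    return sum(1 for s in S[1:] if s <= t)

# Check the equivalence {n(t) >= n} <=> {S_n <= t} on a grid of times
for t in [0.5, 1.0, 2.0, 4.0]:
    for n in range(1, N + 1):
        assert (n_of_t(t) >= n) == (S[n] <= t)
```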



Introduction

Arrival Process: X = {x̃i, i = 1, 2, . . .}; the x̃i's may have any distribution
S = {S̃i, i = 0, 1, 2, . . .}; the S̃i's may have any distribution
N = {ñ(t), t ≥ 0} −→ called an arrival process

Renewal Process: X = {x̃i, i = 1, 2, . . .}; the x̃i's are i.i.d. with a general distribution
S = {S̃i, i = 0, 1, 2, . . .}; the S̃i's are sums of i.i.d. generally distributed random variables
N = {ñ(t), t ≥ 0} −→ called a renewal process

Poisson Process: X = {x̃i, i = 1, 2, . . .}; the x̃i's are i.i.d. exponentially distributed
S = {S̃i, i = 0, 1, 2, . . .}; the S̃i's are Erlang distributed
N = {ñ(t), t ≥ 0} −→ called a Poisson process



Counting Processes
• A stochastic process N = {ñ(t), t ≥ 0} is said to be a counting process
if ñ(t) represents the total number of “events” that have occurred up
to time t.
• From the definition we see that for a counting process ñ(t) must
satisfy:
1. ñ(t) ≥ 0.
2. ñ(t) is integer valued.
3. If s < t, then ñ(s) ≤ ñ(t).
4. For s < t, ñ(t) − ñ(s) equals the number of events that have
occurred in the interval (s, t].



Definition 1: Poisson Processes
The counting process N = {ñ(t), t ≥ 0} is a Poisson process with rate λ
(λ > 0), if:
1. ñ(0) = 0
2. Independent increments (relaxing this ⇒ modulated Poisson process):

P[ñ(t) − ñ(s) = k1 | ñ(r) = k2, r ≤ s < t] = P[ñ(t) − ñ(s) = k1]

3. Stationary increments (relaxing this ⇒ non-homogeneous Poisson process):

P[ñ(t + s) − ñ(t) = k] = P[ñ(l + s) − ñ(l) = k]

4. Single arrivals (relaxing this ⇒ compound Poisson process):

P[ñ(h) = 1] = λh + o(h)
P[ñ(h) ≥ 2] = o(h)



Definition 2: Poisson Processes
The counting process N = {ñ(t), t ≥ 0} is a Poisson process with rate λ
(λ > 0), if:
1. ñ(0) = 0
2. Independent increments
3. The number of events in any interval of length t is Poisson distributed
with mean λt. That is, for all s, t ≥ 0
P[ñ(t + s) − ñ(s) = n] = e^{−λt}(λt)^n / n!,   n = 0, 1, . . .
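Definition 2 can be checked against the exponential inter-arrival construction by simulation: counting arrivals generated from i.i.d. exponential gaps should reproduce the Poisson pmf. A sketch with assumed parameters λ = 1.5 and t = 2 (illustrative choices, not from the slides):

```python
import math
import random

random.seed(2)
lam, t, runs = 1.5, 2.0, 20000   # assumed experiment parameters

def count_arrivals(rate, horizon):
    """Count arrivals in (0, horizon] of a process with exp(rate) gaps."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(rate)
        if s > horizon:
            return n
        n += 1

counts = [count_arrivals(lam, t) for _ in range(runs)]

# Empirical P[n(t) = n] vs the Poisson pmf e^{-lam*t}(lam*t)^n / n!
for n in range(5):
    emp = sum(1 for c in counts if c == n) / runs
    pmf = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    assert abs(emp - pmf) < 0.02
```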



Theorem: Definitions 1 and 2 are equivalent.
Proof. We show that Definition 1 implies Definition 2. To start, fix u ≥ 0
and let
g(t) = E[e−uñ(t) ]
We derive a differential equation for g(t) as follows:

g(t + h) = E[e^{−uñ(t+h)}]
= E[e^{−uñ(t)} e^{−u[ñ(t+h)−ñ(t)]}]
= E[e^{−uñ(t)}] E[e^{−u[ñ(t+h)−ñ(t)]}]   (by independent increments)
= g(t) E[e^{−uñ(h)}]   (by stationary increments)   (1)



Theorem: Definitions 1 and 2 are equivalent.
Conditioning on whether ñ(h) = 0, ñ(h) = 1, or ñ(h) ≥ 2 yields

E[e^{−uñ(h)}] = (1 − λh + o(h)) + e^{−u}(λh + o(h)) + o(h)
= 1 − λh + e^{−u}λh + o(h)   (2)

From (1) and (2), we obtain that

g(t + h) = g(t)(1 − λh + e^{−u}λh) + o(h)

implying that

[g(t + h) − g(t)] / h = g(t)λ(e^{−u} − 1) + o(h)/h



Theorem: Definitions 1 and 2 are equivalent.
Letting h → 0 gives
g′(t) = g(t)λ(e^{−u} − 1)
or, equivalently,
g′(t) / g(t) = λ(e^{−u} − 1)
Integrating, and using g(0) = 1, shows that

log g(t) = λt(e^{−u} − 1)

or
g(t) = e^{λt(e^{−u}−1)} −→ the Laplace transform of a Poisson r.v.

Since g(t) is also the Laplace transform of ñ(t), ñ(t) is a Poisson r.v.



Theorem: Definitions 1 and 2 are equivalent
Let Pn (t) = P [ñ(t) = n].
We derive a differential equation for P0 (t) in the following manner:
P0(t + h) = P[ñ(t + h) = 0]
= P[ñ(t) = 0, ñ(t + h) − ñ(t) = 0]
= P[ñ(t) = 0] P[ñ(t + h) − ñ(t) = 0]
= P0(t)[1 − λh + o(h)]
Hence
[P0(t + h) − P0(t)] / h = −λP0(t) + o(h)/h
Letting h → 0 yields
P0′(t) = −λP0(t)
Since P0(0) = 1, we get
P0(t) = e^{−λt}
Theorem: Definitions 1 and 2 are equivalent
Similarly, for n ≥ 1,
Pn(t + h) = P[ñ(t + h) = n]
= P[ñ(t) = n, ñ(t + h) − ñ(t) = 0]
+ P[ñ(t) = n − 1, ñ(t + h) − ñ(t) = 1]
+ P[ñ(t + h) = n, ñ(t + h) − ñ(t) ≥ 2]
= Pn(t)P0(h) + Pn−1(t)P1(h) + o(h)
= (1 − λh)Pn(t) + λhPn−1(t) + o(h)
Thus
[Pn(t + h) − Pn(t)] / h = −λPn(t) + λPn−1(t) + o(h)/h
Letting h → 0,
Pn′(t) = −λPn(t) + λPn−1(t)
Solving this family of differential equations with P0(t) = e^{−λt} and Pn(0) = 0 for n ≥ 1 yields Pn(t) = e^{−λt}(λt)^n / n!.



The Inter-Arrival Time Distribution
Theorem. Poisson Processes have exponential inter-arrival time
distribution, i.e., {x̃n , n = 1, 2, . . .} are i.i.d and exponentially
distributed with parameter λ (i.e., mean inter-arrival time = 1/λ).
Proof.

x̃1: P(x̃1 > t) = P(ñ(t) = 0) = e^{−λt}(λt)^0 / 0! = e^{−λt}
∴ x̃1 ∼ exp(t; λ)

x̃2: P(x̃2 > t | x̃1 = s)
= P{0 arrivals in (s, s + t] | x̃1 = s}
= P{0 arrivals in (s, s + t]}   (by independent increments)
= P{0 arrivals in (0, t]}   (by stationary increments)
= e^{−λt}
∴ x̃2 is independent of x̃1 and x̃2 ∼ exp(t; λ).
⇒ The procedure repeats for the rest of the x̃i's.
The Arrival Time Distribution of the nth Event
Theorem. The arrival time of the nth event, S̃n (also called the waiting
time until the nth event), is Erlang distributed with parameter (n, λ).
Proof. Method 1:

∴ P[S̃n ≤ t] = P[ñ(t) ≥ n] = Σ_{k=n}^{∞} e^{−λt}(λt)^k / k!

∴ f_{S̃n}(t) = λe^{−λt}(λt)^{n−1} / (n − 1)!   (exercise)

Method 2:

f_{S̃n}(t)dt = dF_{S̃n}(t) = P[t < S̃n < t + dt]
= P{n − 1 arrivals in (0, t] and 1 arrival in (t, t + dt)} + o(dt)
= P[ñ(t) = n − 1 and 1 arrival in (t, t + dt)] + o(dt)
= P[ñ(t) = n − 1] P[1 arrival in (t, t + dt)] + o(dt)   (why?)
The Arrival Time Distribution of the nth Event
= [e^{−λt}(λt)^{n−1} / (n − 1)!] · λdt + o(dt)

∴ f_{S̃n}(t) = lim_{dt→0} f_{S̃n}(t)dt / dt = λe^{−λt}(λt)^{n−1} / (n − 1)!
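The Erlang result can be sanity-checked by simulating S̃n as a sum of n exponentials and comparing with the closed-form CDF P[S̃n ≤ t] = 1 − Σ_{k=0}^{n−1} e^{−λt}(λt)^k / k!. The parameters λ = 2 and n = 4 below are assumed for illustration:

```python
import math
import random

random.seed(3)
lam, n, runs = 2.0, 4, 20000   # assumed parameters

# S_n is a sum of n i.i.d. exp(lam) random variables
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(runs)]

def erlang_cdf(t):
    """Erlang(n, lam) CDF via P[S_n <= t] = P[n(t) >= n]."""
    return 1.0 - sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
                     for k in range(n))

for t in [0.5, 1.5, 3.0]:
    emp = sum(1 for s in samples if s <= t) / runs
    assert abs(emp - erlang_cdf(t)) < 0.02

# The mean of Erlang(n, lam) is n / lam
mean = sum(samples) / runs
assert abs(mean - n / lam) < 0.05
```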



Conditional Distribution of the Arrival Times
Theorem. Given that ñ(t) = n, the n arrival times S̃1 , S̃2 , . . . , S̃n have
the same distribution as the order statistics corresponding to n i.i.d.
uniformly distributed random variables from (0, t).
...........................................................................
Order Statistics. Let x̃1 , x̃2 , . . . , x̃n be n i.i.d. continuous random
variables having common pdf f . Define x̃(k) as the kth smallest value
among all x̃i ’s, i.e., x̃(1) ≤ x̃(2) ≤ x̃(3) ≤ . . . ≤ x̃(n) , then x̃(1) , . . . , x̃(n)
are known as the “order statistics” corresponding to random variables
x̃1 , . . . , x̃n . We have that the joint pdf of x̃(1) , x̃(2) , . . . , x̃(n) is

fx̃(1) ,x̃(2) ,...,x̃(n) (x1 , x2 , . . . , xn ) = n!f (x1 )f (x2 ) . . . f (xn ),

where x1 < x2 < . . . < xn (check the textbook [Ross]).


...........................................................................
Conditional Distribution of the Arrival Times
Proof. Let 0 < t1 < t2 < . . . < tn+1 = t and let hi be small enough so that
ti + hi < ti+1 , i = 1, . . . , n.
∴ P[ti < S̃i < ti + hi, i = 1, . . . , n | ñ(t) = n]

= P(exactly one arrival in each [ti, ti + hi], i = 1, 2, . . . , n, and no arrivals elsewhere in [0, t]) / P[ñ(t) = n]

= (e^{−λh1}λh1)(e^{−λh2}λh2) · · · (e^{−λhn}λhn) e^{−λ(t−h1−h2−···−hn)} / [e^{−λt}(λt)^n / n!]

= n!(h1h2h3 · · · hn) / t^n

∴ P[ti < S̃i < ti + hi, i = 1, . . . , n | ñ(t) = n] / (h1h2 · · · hn) = n! / t^n



Conditional Distribution of the Arrival Times
Taking the limit hi → 0, i = 1, . . . , n, we obtain

f_{S̃1,S̃2,...,S̃n | ñ(t)}(t1, t2, . . . , tn | n) = n! / t^n,   0 < t1 < t2 < . . . < tn < t.
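The order-statistics characterization can be tested by conditioning simulated Poisson paths on their count. One consequence: given ñ(t) = n, the first arrival S̃1 is distributed as the minimum of n uniforms on (0, t), whose mean is t/(n + 1). A sketch with assumed values λ = 1, t = 5, n = 3:

```python
import random

random.seed(4)
lam, t, runs = 1.0, 5.0, 100000   # assumed parameters
target = 3                        # condition on n(t) = 3

first_arrivals = []
for _ in range(runs):
    s, epochs = 0.0, []
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        epochs.append(s)
    if len(epochs) == target:          # keep only paths with n(t) = 3
        first_arrivals.append(epochs[0])

# Given n(t) = 3, S_1 ~ min of 3 uniforms on (0, t): E[S_1] = t / 4
mean = sum(first_arrivals) / len(first_arrivals)
assert abs(mean - t / (target + 1)) < 0.05
```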



Conditional Distribution of the Arrival Times

Example (see Ref [Ross], Ex. 2.3(A), p. 68). Suppose that travellers arrive at a train depot in accordance with a Poisson process with rate λ. If the train departs at time t, what is the expected sum of the waiting times of the travellers arriving in (0, t)? That is, E[Σ_{i=1}^{ñ(t)} (t − S̃i)] = ?

[Figure: arrival times S̃1, S̃2, S̃3, . . . , S̃ñ(t) marked on (0, t].]



Conditional Distribution of the Arrival Times
Answer. Conditioning on ñ(t) = n yields

E[Σ_{i=1}^{ñ(t)} (t − S̃i) | ñ(t) = n] = nt − E[Σ_{i=1}^{n} S̃i]
= nt − E[Σ_{i=1}^{n} ũ(i)]   (by the theorem, where ũ(1), . . . , ũ(n) are the order statistics of i.i.d. uniforms ũ1, . . . , ũn on (0, t))
= nt − E[Σ_{i=1}^{n} ũi]   (∵ Σ_{i=1}^{n} ũ(i) = Σ_{i=1}^{n} ũi)
= nt − (t/2) · n = nt/2   (∵ E[ũi] = t/2)

To find E[Σ_{i=1}^{ñ(t)} (t − S̃i)], we take another expectation:

∴ E[Σ_{i=1}^{ñ(t)} (t − S̃i)] = (t/2) · E[ñ(t)] = λt²/2   (∵ E[ñ(t)] = λt)
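The answer λt²/2 is easy to verify by direct simulation of the depot example. The parameters λ = 2 and t = 3 below are assumed for illustration (so the theoretical value is 9):

```python
import random

random.seed(5)
lam, t, runs = 2.0, 3.0, 20000   # assumed parameters

totals = []
for _ in range(runs):
    s, total = 0.0, 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        total += t - s        # waiting time of the traveller arriving at s
    totals.append(total)

# Theory: E[sum (t - S_i)] = lam * t^2 / 2
emp = sum(totals) / runs
assert abs(emp - lam * t * t / 2) < 0.15
```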



Superposition of Independent Poisson Processes
Theorem. The superposition of N independent Poisson processes with rates λi, i = 1, . . . , N, is also a Poisson process with rate Σ_{i=1}^{N} λi.

[Figure: Poisson streams with rates λ1, λ2, . . . , λN merged into a single Poisson stream with rate Σ_{i=1}^{N} λi.]

<Homework> Prove the theorem (note that a Poisson process must satisfy Definition 1 or 2).
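Before attempting the proof, the claim can be checked empirically: the merged count over (0, t] should have both mean and variance equal to (λ1 + λ2)t, as a Poisson random variable must. A sketch with assumed rates λ1 = 1, λ2 = 2.5 and t = 2:

```python
import random

random.seed(6)
l1, l2, t, runs = 1.0, 2.5, 2.0, 20000   # assumed parameters

def count(rate):
    """Arrivals in (0, t] of a Poisson process with the given rate."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(rate)
        if s > t:
            return n
        n += 1

# Counting each independent stream separately and summing counts
merged = [count(l1) + count(l2) for _ in range(runs)]

mean = sum(merged) / runs
var = sum((m - mean) ** 2 for m in merged) / runs
assert abs(mean - (l1 + l2) * t) < 0.1    # mean of Poisson((l1+l2)t)
assert abs(var - (l1 + l2) * t) < 0.3     # variance equals the mean
```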



Decomposition of a Poisson Process
Theorem.
• Given a Poisson process N = {ñ(t), t ≥ 0};
• If ñi (t) represents the number of type-i events that occur by time
t, i = 1, 2;
• Arrival occurring at time s is a type-1 arrival with probability p(s),
and type-2 arrival with probability 1 − p(s)

⇓ then

• ñ1(t), ñ2(t) are independent,
• ñ1(t) ∼ P(k; λtp), and
• ñ2(t) ∼ P(k; λt(1 − p)), where p = (1/t) ∫₀ᵗ p(s)ds



Decomposition of a Poisson Process

[Figure: a Poisson(λ) stream is split: an arrival at time s joins stream 1 with probability p(s), giving ñ1(t) ∼ Poisson(k; λ∫₀ᵗ p(s)ds), and joins stream 2 otherwise, giving ñ2(t) ∼ Poisson(k; λ∫₀ᵗ [1 − p(s)]ds).]

Special case: if p(s) = p is constant, the two output streams are Poisson processes with rates λp and λ(1 − p).
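The constant-p special case (thinning) is straightforward to check: the two substreams should have means λpt and λ(1 − p)t, and their counts should be uncorrelated. A sketch with assumed values λ = 3, p = 0.4, t = 2:

```python
import random

random.seed(7)
lam, p, t, runs = 3.0, 0.4, 2.0, 20000   # assumed parameters

n1, n2 = [], []
for _ in range(runs):
    s, a, b = 0.0, 0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        if random.random() < p:   # classify each arrival as type 1 w.p. p
            a += 1
        else:
            b += 1
    n1.append(a)
    n2.append(b)

m1 = sum(n1) / runs
m2 = sum(n2) / runs
assert abs(m1 - lam * p * t) < 0.1         # E[n1(t)] = lam*p*t
assert abs(m2 - lam * (1 - p) * t) < 0.1   # E[n2(t)] = lam*(1-p)*t

# Independence: the sample covariance of n1(t), n2(t) should be near 0
cov = sum((x - m1) * (y - m2) for x, y in zip(n1, n2)) / runs
assert abs(cov) < 0.1
```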



Decomposition of a Poisson Process
Proof. It suffices to prove that, for fixed time t,

P[ñ1(t) = n, ñ2(t) = m] = P[ñ1(t) = n] P[ñ2(t) = m]
= [e^{−λpt}(λpt)^n / n!] · [e^{−λ(1−p)t}(λ(1 − p)t)^m / m!]
...........................................................................

P[ñ1(t) = n, ñ2(t) = m]
= Σ_{k=0}^{∞} P[ñ1(t) = n, ñ2(t) = m | ñ1(t) + ñ2(t) = k] · P[ñ1(t) + ñ2(t) = k]
= P[ñ1(t) = n, ñ2(t) = m | ñ1(t) + ñ2(t) = n + m] · P[ñ1(t) + ñ2(t) = n + m]



Decomposition of a Poisson Process
• From the "conditional distribution of the arrival times", each event occurs at a time that is uniformly distributed on (0, t), independently of the other events.
• Consider the case where only one arrival occurs in the interval [0, t]:

P[type-1 arrival | ñ(t) = 1]
= ∫₀ᵗ P[type-1 arrival | arrival time S̃1 = s, ñ(t) = 1] · f_{S̃1|ñ(t)}(s | ñ(t) = 1) ds
= ∫₀ᵗ p(s) · (1/t) ds = (1/t) ∫₀ᵗ p(s)ds = p



Decomposition of a Poisson Process

∴ P[ñ1(t) = n, ñ2(t) = m]
= P[ñ1(t) = n, ñ2(t) = m | ñ1(t) + ñ2(t) = n + m] · P[ñ1(t) + ñ2(t) = n + m]
= C(n + m, n) p^n (1 − p)^m · e^{−λt}(λt)^{n+m} / (n + m)!
= [(n + m)! / (n! m!)] p^n (1 − p)^m · e^{−λt}(λt)^{n+m} / (n + m)!
= [e^{−λpt}(λpt)^n / n!] · [e^{−λ(1−p)t}(λ(1 − p)t)^m / m!]



Decomposition of a Poisson Process
Example (An Infinite Server Queue, textbook [Ross]).

[Figure: M/G/∞ queue, Poisson(λ) arrivals, service distribution G, departures.]

• Gs̃(t) = P(S̃ ≤ t), where S̃ = service time
• The service times are i.i.d. and independent of the arrival process
• ñ1(t): the number of customers who have left by time t
• ñ2(t): the number of customers still in the system at time t
⇒ ñ1(t) ∼ ? and ñ2(t) ∼ ?
Decomposition of a Poisson Process
Answer.
ñ1(t): the number of type-1 customers
ñ2(t): the number of type-2 customers

type-1: p(s) = P(finish before t) = P(S̃ ≤ t − s) = Gs̃(t − s)
type-2: 1 − p(s) = Ḡs̃(t − s)

∴ ñ1(t) ∼ P(k; λt · (1/t) ∫₀ᵗ Gs̃(t − s)ds)
ñ2(t) ∼ P(k; λt · (1/t) ∫₀ᵗ Ḡs̃(t − s)ds)



Decomposition of a Poisson Process
∴ E[ñ1(t)] = λt · (1/t) ∫₀ᵗ G(t − s)ds
= λ ∫₀ᵗ G(y)dy   (substituting y = t − s)

As t → ∞, we have

lim_{t→∞} E[ñ2(t)] = λ ∫₀^∞ Ḡ(y)dy = λE[S̃]   (Little's formula)



Non-homogeneous Poisson Processes
• The counting process N = {ñ(t), t ≥ 0} is said to be a non-stationary
or non-homogeneous Poisson Process with time-varying intensity
function λ(t), t ≥ 0, if:
1. ñ(0) = 0
2. N has independent increments
3. P [ñ(t + h) − ñ(t) ≥ 2] = o(h)
4. P [ñ(t + h) − ñ(t) = 1] = λ(t) · h + o(h)
 t
• Define “integrated intensity function” m(t) = λ(t )dt .
0

Theorem.
e−[m(t+s)−m(t)] [m(t + s) − m(t)]n
P [ñ(t + s) − ñ(t) = n] =
n!
Proof. < Homework >.
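A standard way to simulate a non-homogeneous Poisson process (not covered in the slides) is Lewis-Shedler thinning: generate a homogeneous Poisson process at a rate λ_max that dominates λ(t), then keep a candidate arrival at time s with probability λ(s)/λ_max. The intensity λ(t) = 1 + 0.5t and horizon T = 4 below are assumed for illustration, giving m(T) = 8:

```python
import random

random.seed(8)
T = 4.0
lam_fn = lambda s: 1.0 + 0.5 * s   # assumed intensity function lambda(t)
lam_max = lam_fn(T)                # lambda(t) is increasing, so this bounds it

def simulate_nhpp():
    """Thinning: rate-lam_max candidates, accepted w.p. lam(s)/lam_max."""
    s, epochs = 0.0, []
    while True:
        s += random.expovariate(lam_max)
        if s > T:
            return epochs
        if random.random() < lam_fn(s) / lam_max:
            epochs.append(s)

runs = 20000
counts = [len(simulate_nhpp()) for _ in range(runs)]

# E[n(T)] = m(T) = integral of (1 + 0.5 t) over [0, T] = T + 0.25 T^2 = 8
m_T = T + 0.25 * T * T
mean = sum(counts) / runs
assert abs(mean - m_T) < 0.1
```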
Non-homogeneous Poisson Processes
Example. The “output process” of the M/G/∞ queue is a
non-homogeneous Poisson process having intensity function
λ(t) = λG(t), where G is the service distribution.
Hint. Let D(s, s + r) denote the number of service completions in the interval (s, s + r], where (s, s + r] ⊂ (0, t]. If we can show that
• D(s, s + r) follows a Poisson distribution with mean λ ∫ₛ^{s+r} G(y)dy, and
• the numbers of service completions in disjoint intervals are independent,
then we are finished, by the definition of a non-homogeneous Poisson process.



Non-homogeneous Poisson Processes
Answer.
• An arrival at time y is called a type-1 arrival if its service
completion occurs in (s, s + r].
• Consider three cases to find the probability P (y) that an arrival at
time y is a type-1 arrival:
[Figure: the three cases for the arrival time y relative to (s, s + r] within (0, t]: Case 1, y ≤ s; Case 2, s < y ≤ s + r; Case 3, s + r < y ≤ t.]

– Case 1: y ≤ s.

P (y) = P {s − y < S̃ < s + r − y} = G(s + r − y) − G(s − y)

– Case 2: s < y ≤ s + r.

P (y) = P {S̃ < s + r − y} = G(s + r − y)


Non-homogeneous Poisson Processes
– Case 3: s + r < y ≤ t.
P (y) = 0
• Based on the decomposition property of a Poisson process, we conclude that D(s, s + r) follows a Poisson distribution with mean λpt, where p = (1/t) ∫₀ᵗ P(y)dy.

∫₀ᵗ P(y)dy = ∫₀ˢ [G(s + r − y) − G(s − y)]dy + ∫ₛ^{s+r} G(s + r − y)dy + ∫_{s+r}^{t} 0 dy
= ∫₀^{s+r} G(s + r − y)dy − ∫₀ˢ G(s − y)dy
= ∫₀^{s+r} G(z)dz − ∫₀ˢ G(z)dz = ∫ₛ^{s+r} G(z)dz



Non-homogeneous Poisson Processes
• Because of

– the independent increment assumption of the Poisson arrival


process, and
– the fact that there are always servers available for arrivals,

⇒ the departure process has independent increments



Compound Poisson Processes
• A stochastic process {x̃(t), t ≥ 0} is said to be a compound Poisson
process if
– it can be represented as

x̃(t) = Σ_{i=1}^{ñ(t)} ỹi,   t ≥ 0

– {ñ(t), t ≥ 0} is a Poisson process


– {ỹi , i ≥ 1} is a family of independent and identically distributed
random variables which are also independent of {ñ(t), t ≥ 0}
• The random variable x̃(t) is said to be a compound Poisson random
variable.
• E[x̃(t)] = λtE[ỹi ] and V ar[x̃(t)] = λtE[ỹi2 ].
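The mean and variance formulas can be checked by simulating a batch arrival process. The batch-size distribution below (uniform on {1, 2, 3, 4}) and the parameters λ = 2, t = 3 are assumed for illustration:

```python
import random

random.seed(9)
lam, t, runs = 2.0, 3.0, 20000   # assumed parameters

samples = []
for _ in range(runs):
    s, total = 0.0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        total += random.randint(1, 4)   # batch size y_i ~ uniform{1,...,4}
    samples.append(total)

Ey = 2.5                          # E[y] for uniform{1,2,3,4}
Ey2 = (1 + 4 + 9 + 16) / 4        # E[y^2] = 7.5
mean = sum(samples) / runs
var = sum((x - mean) ** 2 for x in samples) / runs
assert abs(mean - lam * t * Ey) < 0.2    # E[x(t)] = lam*t*E[y]
assert abs(var - lam * t * Ey2) < 1.5    # Var[x(t)] = lam*t*E[y^2]
```

Note that the variance uses the second moment E[ỹ²], not Var[ỹ], which is what distinguishes a compound Poisson variance from a plain Poisson one.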



Compound Poisson Processes
• Example (Batch Arrival Process). Consider a parallel-processing
system where each job arrival consists of a possibly random number
of tasks. Then we can model the arrival process as a compound
Poisson process, which is also called a batch arrival process.
• Let ỹi be a random variable that denotes the number of tasks
comprising a job. We derive the probability generating function
Px̃(t) (z) as follows:
       
P_{x̃(t)}(z) = E[z^{x̃(t)}] = E[E[z^{x̃(t)} | ñ(t)]] = E[E[z^{ỹ1+···+ỹñ(t)} | ñ(t)]]
= E[E[z^{ỹ1}] · · · E[z^{ỹñ(t)}]]   (by independence of ñ(t) and {ỹi}, and by independence of ỹ1, . . . , ỹñ(t))
= E[(Pỹ(z))^{ñ(t)}] = P_{ñ(t)}(Pỹ(z))



Modulated Poisson Processes
• Assume that there are two states, 0 and 1, for a “modulating process.”

[Figure: two-state modulating process alternating between states 0 and 1.]

• When the state of the modulating process equals 0, the arrival rate of customers is λ0; when it equals 1, the arrival rate is λ1.
• The residence time in a particular modulating state is exponentially
distributed with parameter μ and, after expiration of this time, the
modulating process changes state.
• The initial state of the modulating process is randomly selected and is
equally likely to be state 0 or 1.
Modulated Poisson Processes
• For a given period of time (0, t), let Υ be a random variable that
indicates the total amount of time that the modulating process has
been in state 0. Let x̃(t) be the number of arrivals in (0, t).
• Then, given Υ = τ, the value of x̃(t) is distributed as a non-homogeneous Poisson random variable and thus

P[x̃(t) = n | Υ = τ] = (λ0τ + λ1(t − τ))^n e^{−(λ0τ+λ1(t−τ))} / n!

• As μ → 0, the probability that the modulating process makes no transitions within t seconds converges to 1, and we expect for this case that

P[x̃(t) = n] = (1/2)[(λ0t)^n e^{−λ0t} / n! + (λ1t)^n e^{−λ1t} / n!]
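A modulated Poisson process is also easy to simulate directly. Since the initial state is equally likely to be 0 or 1 and the residence times are symmetric, the expected time in each state over (0, t) is t/2, so E[x̃(t)] = (λ0 + λ1)t/2. A sketch with assumed parameters λ0 = 1, λ1 = 5, μ = 2, t = 3:

```python
import random

random.seed(10)
l0, l1, mu, t, runs = 1.0, 5.0, 2.0, 3.0, 20000   # assumed parameters

def mmpp_count():
    """Alternate exp(mu) residence periods; arrivals occur at rate l0 or l1
    depending on the current modulating state."""
    state = random.randint(0, 1)   # initial state equally likely
    now, n = 0.0, 0
    while now < t:
        stay = min(random.expovariate(mu), t - now)
        rate = l0 if state == 0 else l1
        s = 0.0
        while True:                # count arrivals within this period
            s += random.expovariate(rate)
            if s > stay:
                break
            n += 1
        now += stay
        state = 1 - state
    return n

counts = [mmpp_count() for _ in range(runs)]

# E[x(t)] = (l0 + l1) * t / 2, by symmetry of the initial state
mean = sum(counts) / runs
assert abs(mean - (l0 + l1) * t / 2) < 0.15
```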



Modulated Poisson Processes
• As μ → ∞, the modulating process makes an infinite number of transitions within t seconds, and we expect for this case that

P[x̃(t) = n] = (βt)^n e^{−βt} / n!,   where β = (λ0 + λ1)/2
• Example (Modeling Voice).
– A basic feature of speech is that it comprises an alternation of
silent periods and non-silent periods.
– The arrival rate of packets during a talk spurt period is Poisson
with rate λ1 and silent periods produce a Poisson rate with λ0 ≈ 0.
– The durations of talk spurts and silent periods are exponentially
distributed with parameters μ1 and μ0 , respectively.
⇒ The model of the arrival stream of packets is given by a modulated
Poisson process.
Poisson Arrivals See Time Averages (PASTA)
• PASTA says: as t → ∞,
– the fraction of arrivals that see the system in a given state upon arrival (arrival average)
= the fraction of time the system is in that state (time average)
= the probability that the system is in that state at a random time in steady state

• Counter-example (textbook [Kao]: Example 2.7.1)

[Figure: deterministic arrivals at t = 0, 1, 2, . . . (inter-arrival time 1), each with service time 1/2, so the server is busy on [k, k + 1/2) and idle on [k + 1/2, k + 1).]



Poisson Arrivals See Time Averages (PASTA)
– Arrival average that an arrival will see an idle system = 1
– Time average of system being idle = 1/2
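The counter-example is deterministic, so the two averages can be computed exactly in a few lines (the horizon T = 1000 and the grid resolution are assumed choices):

```python
# D/D/1 counter-example: arrivals at t = 1, 2, 3, ..., each service takes
# 1/2, so the server is busy exactly on [k, k + 1/2) for integer k.
def busy(t):
    return (t % 1.0) < 0.5

T, eps = 1000, 1e-9

# Arrival average: the state an arrival sees just before its epoch k
seen_idle = sum(1 for k in range(1, T + 1) if not busy(k - eps))
arrival_avg = seen_idle / T

# Time average of idleness over [0, T], via a fine grid
n_grid = 100000
time_avg = sum(1 for i in range(n_grid) if not busy(i * T / n_grid)) / n_grid

assert arrival_avg == 1.0            # every arrival sees an idle system
assert abs(time_avg - 0.5) < 0.01    # the system is idle half the time
```

The mismatch (1 vs. 1/2) arises precisely because the arrivals are not Poisson: they are synchronized with the service pattern, violating the lack-of-anticipation structure discussed below.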
• Mathematically,
– Let X = {x̃(t), t ≥ 0} be a stochastic process with state space S,
and B ⊂ S
– Define an indicator random variable

ũ(t) = 1 if x̃(t) ∈ B, and ũ(t) = 0 otherwise

– Let N = {ñ(t), t ≥ 0} be a Poisson process with rate λ denoting the


arrival process
then,



Poisson Arrivals See Time Averages (PASTA)
lim_{t→∞} [∫₀ᵗ ũ(s) dñ(s) / ñ(t)] = lim_{t→∞} [∫₀ᵗ ũ(s) ds / t]
(arrival average)                    (time average)

• Condition – For PASTA to hold, we need the lack of anticipation


assumption (LAA): for each t ≥ 0,
– the arrival process {ñ(t + u) − ñ(t), u ≥ 0} is independent of
{x̃(s), 0 ≤ s ≤ t} and {ñ(s), 0 ≤ s ≤ t}.

• Application:
– To find the waiting time distribution of any arriving customer
– Given: P[system is idle] = 1 − ρ; P[system is busy] = ρ



Poisson Arrivals See Time Averages (PASTA)
[Figure: Case 1, a Poisson arrival finds the system idle; Case 2, a Poisson arrival finds the system busy.]

⇒ P(w̃ ≤ t) = P(w̃ ≤ t | idle) · P(idle upon arrival)
+ P(w̃ ≤ t | busy) · P(busy upon arrival)



Memoryless Property of the Exponential Distribution
• A random variable x̃ is said to be without memory, or memoryless, if

P [x̃ > s + t|x̃ > t] = P [x̃ > s] for all s, t ≥ 0 (3)

• The condition in Equation (3) is equivalent to


P[x̃ > s + t, x̃ > t] / P[x̃ > t] = P[x̃ > s]
or
P [x̃ > s + t] = P [x̃ > s]P [x̃ > t] (4)
• Since Equation (4) is satisfied when x̃ is exponentially distributed (for
e^{−λ(s+t)} = e^{−λs}e^{−λt}), it follows that exponential random variables are
memoryless.
• Not only is the exponential distribution “memoryless,” but it is the
unique continuous distribution possessing this property.
Comparison of Two Exponential Random Variables
Suppose that x̃1 and x̃2 are independent exponential random variables
with respective means 1/λ1 and 1/λ2 . What is P [x̃1 < x̃2 ]?
P[x̃1 < x̃2] = ∫₀^∞ P[x̃1 < x̃2 | x̃1 = x] λ1e^{−λ1x} dx
= ∫₀^∞ P[x < x̃2] λ1e^{−λ1x} dx
= ∫₀^∞ e^{−λ2x} λ1e^{−λ1x} dx
= ∫₀^∞ λ1e^{−(λ1+λ2)x} dx
= λ1 / (λ1 + λ2)
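The closed form λ1/(λ1 + λ2) is simple to confirm by Monte Carlo. The rates λ1 = 2 and λ2 = 3 below are assumed for illustration, giving a theoretical probability of 0.4:

```python
import random

random.seed(11)
l1, l2, runs = 2.0, 3.0, 200000   # assumed parameters

wins = sum(1 for _ in range(runs)
           if random.expovariate(l1) < random.expovariate(l2))

# Theory: P[x1 < x2] = l1 / (l1 + l2)
assert abs(wins / runs - l1 / (l1 + l2)) < 0.01
```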



Minimum of Exponential Random Variables
Suppose that x̃1 , x̃2 , · · · , x̃n are independent exponential random variables,
with x̃i having rate μi , i = 1, · · · , n. It turns out that the smallest of the x̃i
is exponential with a rate equal to the sum of the μi .

P[min(x̃1, x̃2, · · · , x̃n) > x] = P[x̃i > x for each i = 1, · · · , n]
= ∏_{i=1}^{n} P[x̃i > x]   (by independence)
= ∏_{i=1}^{n} e^{−μi x}
= exp(−(Σ_{i=1}^{n} μi) x)

How about max(x̃1 , x̃2 , · · · , x̃n )? (exercise)
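The minimum property can be checked numerically: the simulated minimum should have mean 1/Σμi and survival function exp(−(Σμi)x). The rates {1, 2, 3} below are assumed for illustration:

```python
import math
import random

random.seed(12)
mus = [1.0, 2.0, 3.0]   # assumed rates
runs = 200000

mins = [min(random.expovariate(m) for m in mus) for _ in range(runs)]

# The minimum should be exponential with rate sum(mus) = 6
rate = sum(mus)
mean = sum(mins) / runs
assert abs(mean - 1 / rate) < 0.005

# Check the survival function P[min > x] = exp(-rate * x) at a few points
for x in [0.1, 0.3]:
    emp = sum(1 for v in mins if v > x) / runs
    assert abs(emp - math.exp(-rate * x)) < 0.01
```

The maximum, by contrast, is not exponential; its CDF is the product ∏(1 − e^{−μi x}), which is one way to approach the exercise.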

