
Queuing Analysis:

Introduction to Queuing Analysis


Hongwei Zhang
http://www.cs.wayne.edu/~hzhang

Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis.

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Sources of Network Delay?


Processing Delay

Time between receiving a packet and assigning the packet to an outgoing link queue

Queueing Delay

Time buffered waiting for transmission

Transmission Delay

Time between transmitting the first and the last bit of the packet

Propagation Delay

Time spent on the link for the propagation of the electrical signal

Independent of traffic carried by the link

Focus: Queueing & Transmission Delay

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Basic Queueing Model

[Diagram: Arrivals → Buffer (queued customers) → Server(s) (in service) → Departures]

A queue models any service station with:

One or multiple servers

A waiting area or buffer

Customers arrive to receive service

A customer that upon arrival does not find a free server waits in the buffer

Characteristics of a Queue

Number of servers m: one, multiple, infinite

Buffer size b

Service discipline (scheduling)




FCFS, LCFS, Processor Sharing (PS), etc

Arrival process

Service statistics

Arrival Process

[Timeline: arrivals of customers n−1, n, n+1 at times t_{n−1}, t_n, t_{n+1}]

τ_n: interarrival time between customers n and n+1

τ_n is a random variable

{τ_n, n ≥ 1} is a stochastic process

Interarrival times are identically distributed and have a common mean E[τ_n] = E[τ] = 1/λ, where λ is called the arrival rate

Service-Time Process

[Timeline: service of customers n−1, n, n+1; s_n is the service interval]

s_n: service time of customer n at the server

{s_n, n ≥ 1} is a stochastic process

Service times are identically distributed with common mean E[s_n] = E[s] = 1/μ, where μ is called the service rate

For packets, are the service times really random?

Queue Descriptors


Generic descriptor: A/S/m/k




A denotes the arrival process




For Poisson arrivals we use M (for Markovian)

S denotes the service-time distribution




M: exponential distribution

D: deterministic service times

G: general distribution

m is the number of servers

k is the maximum number of customers allowed in the system, either in the buffer or in service

k is omitted when the buffer size is infinite

Queue Descriptors: Examples


M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer

M/M/m: same as previous, with m servers

M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering

M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer

*/D/∞: a constant-delay system

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Some probability distributions and random processes




Exponential Distribution


Memoryless Property

Poisson Distribution

Poisson Process


Definition and Properties

Interarrival Time Distribution

Modeling Arrival Statistics

Exponential Distribution


A continuous R.V. X follows the exponential distribution with parameter λ if its pdf is:

f_X(x) = λ e^{−λx} if x ≥ 0; 0 if x < 0

=> Probability distribution function:

F_X(x) = P{X ≤ x} = 1 − e^{−λx} if x ≥ 0; 0 if x < 0

Usually used for modeling service time
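As a quick numerical check (not part of the original slides), the pdf and CDF above can be verified by inverse-transform sampling; the rate λ = 2 and the test point x = 0.5 are arbitrary illustrative choices:

```python
import math
import random

random.seed(1)
lam = 2.0   # illustrative rate parameter lambda

# Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(U)/lam is Exp(lam)
samples = [-math.log(random.random()) / lam for _ in range(200_000)]

mean = sum(samples) / len(samples)      # theory: E[X] = 1/lam = 0.5
x = 0.5
ecdf = sum(s <= x for s in samples) / len(samples)
cdf = 1.0 - math.exp(-lam * x)          # F_X(x) = 1 - e^{-lam x}

print(abs(mean - 0.5) < 0.01, abs(ecdf - cdf) < 0.01)  # True True
```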

Exponential Distribution (contd.)


Mean and Variance:

E[X] = 1/λ,  Var(X) = 1/λ²

Proof:

E[X] = ∫_0^∞ x f_X(x) dx = ∫_0^∞ x λ e^{−λx} dx = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx = 1/λ

E[X²] = ∫_0^∞ x² λ e^{−λx} dx = [−x² e^{−λx}]_0^∞ + ∫_0^∞ 2x e^{−λx} dx = (2/λ) E[X] = 2/λ²

Var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ²

Memoryless Property


Past history has no influence on the future:

P{X > x + t | X > t} = P{X > x}

Proof:

P{X > x + t | X > t} = P{X > x + t, X > t} / P{X > t} = P{X > x + t} / P{X > t} = e^{−λ(x+t)} / e^{−λt} = e^{−λx} = P{X > x}

The exponential is the only continuous distribution with the memoryless property
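A small simulation can illustrate the memoryless property; the values λ = 1, x = 0.7, and t = 1.2 below are arbitrary illustrative choices:

```python
import math
import random

random.seed(2)
lam, n = 1.0, 500_000      # illustrative rate and sample size
samples = [-math.log(random.random()) / lam for _ in range(n)]

x, t = 0.7, 1.2            # arbitrary illustrative points
survivors = [s for s in samples if s > t]                 # condition on {X > t}
lhs = sum(s > x + t for s in survivors) / len(survivors)  # P{X > x+t | X > t}
rhs = sum(s > x for s in samples) / n                     # P{X > x}

print(abs(lhs - rhs) < 0.01)  # True: both estimate e^{-lam*x}
```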

Poisson Distribution


A discrete R.V. X follows the Poisson distribution with parameter λ if its probability mass function is:

P{X = k} = e^{−λ} λ^k / k!,  k = 0, 1, 2, ...

Wide applicability in modeling the number of random events that occur during a given time interval (=> Poisson Process):

Customers that arrive at a post office during a day

Wrong phone calls received during a week

Students that go to the instructor's office during office hours

Packets that arrive at a network switch

etc.

Poisson Distribution (contd.)


Mean and Variance:

E[X] = λ,  Var(X) = λ

Proof:

E[X] = Σ_{k=0}^∞ k P{X = k} = e^{−λ} Σ_{k=1}^∞ λ^k / (k−1)! = λ e^{−λ} Σ_{j=0}^∞ λ^j / j! = λ e^{−λ} e^λ = λ

E[X²] = Σ_{k=0}^∞ k² P{X = k} = e^{−λ} Σ_{k=1}^∞ k λ^k / (k−1)! = λ e^{−λ} Σ_{j=0}^∞ (j+1) λ^j / j!

     = λ e^{−λ} ( Σ_{j=0}^∞ j λ^j / j! + Σ_{j=0}^∞ λ^j / j! ) = λ e^{−λ} (λ e^λ + e^λ) = λ² + λ

Var(X) = E[X²] − (E[X])² = λ² + λ − λ² = λ

Sum of Poisson Random Variables


X_i, i = 1, 2, ..., n, are independent R.V.s

X_i follows Poisson distribution with parameter λ_i

Then the sum S_n = X_1 + X_2 + ... + X_n follows Poisson distribution with parameter λ = λ_1 + λ_2 + ... + λ_n

Sum of Poisson Random Variables (cont.)


Proof: For n = 2; generalization by induction. The pmf of S = X_1 + X_2 is:

P{S = m} = Σ_{k=0}^m P{X_1 = k, X_2 = m−k}

         = Σ_{k=0}^m P{X_1 = k} P{X_2 = m−k}

         = Σ_{k=0}^m [e^{−λ1} λ1^k / k!] [e^{−λ2} λ2^{m−k} / (m−k)!]

         = e^{−(λ1+λ2)} (1/m!) Σ_{k=0}^m [m! / (k!(m−k)!)] λ1^k λ2^{m−k}

         = e^{−(λ1+λ2)} (λ1 + λ2)^m / m!

i.e., Poisson with parameter λ = λ1 + λ2.
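The result can also be seen by simulation (not from the slides). The sampler below uses Knuth's multiplication method; the parameters λ1 = 1.5 and λ2 = 2.5 are illustrative:

```python
import math
import random

random.seed(3)

def poisson_sample(lam):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

lam1, lam2 = 1.5, 2.5      # illustrative parameters; lam1 + lam2 = 4
n = 100_000
sums = [poisson_sample(lam1) + poisson_sample(lam2) for _ in range(n)]

mean = sum(sums) / n                               # theory: lam1 + lam2 = 4
p4_hat = sums.count(4) / n                         # empirical P{S = 4}
p4 = math.exp(-4.0) * 4.0**4 / math.factorial(4)   # Poisson(4) pmf at k = 4
print(abs(mean - 4.0) < 0.05, abs(p4_hat - p4) < 0.01)  # True True
```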

Sampling a Poisson Variable


X follows Poisson distribution with parameter λ

Each of the X arrivals is of type i with probability p_i, i = 1, 2, ..., n, independent of other arrivals; p_1 + p_2 + ... + p_n = 1

If X_i denotes the number of type i arrivals, then:

X_1, X_2, ..., X_n are independent

X_i follows Poisson distribution with parameter λ_i = λ p_i

=> Splitting of a Poisson process (later)

Sampling a Poisson Variable (contd.)


Proof: For n = 2; generalize by induction. Joint pmf:

P{X_1 = k_1, X_2 = k_2} = P{X_1 = k_1, X_2 = k_2 | X = k_1 + k_2} P{X = k_1 + k_2}

  = [(k_1 + k_2)! / (k_1! k_2!)] p_1^{k_1} p_2^{k_2} · e^{−λ} λ^{k_1 + k_2} / (k_1 + k_2)!

  = [(λ p_1)^{k_1} / k_1!] [(λ p_2)^{k_2} / k_2!] e^{−λ(p_1 + p_2)}

  = [e^{−λp_1} (λ p_1)^{k_1} / k_1!] [e^{−λp_2} (λ p_2)^{k_2} / k_2!]

=> X_1 and X_2 are independent, with

P{X_1 = k_1} = e^{−λp_1} (λp_1)^{k_1} / k_1!,  P{X_2 = k_2} = e^{−λp_2} (λp_2)^{k_2} / k_2!

i.e., X_i follows Poisson distribution with parameter λ p_i.
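A simulation sketch of the same splitting result (λ = 5 and p1 = 0.3 are illustrative choices, not from the slides):

```python
import math
import random

random.seed(4)

def poisson_sample(lam):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

lam, p1 = 5.0, 0.3         # illustrative: X ~ Poisson(5), type-1 probability 0.3
n = 100_000
x1 = []
for _ in range(n):
    total = poisson_sample(lam)
    # independently mark each of the 'total' arrivals as type 1 w.p. p1
    x1.append(sum(random.random() < p1 for _ in range(total)))

mean1 = sum(x1) / n
var1 = sum(v * v for v in x1) / n - mean1**2
# theory: X1 ~ Poisson(lam * p1), so mean and variance should both be 1.5
print(abs(mean1 - 1.5) < 0.03, abs(var1 - 1.5) < 0.05)  # True True
```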

Poisson Approximation to Binomial


Binomial distribution with parameters (n, p):

P{X = k} = C(n, k) p^k (1 − p)^{n−k}

As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to Poisson with parameter λ

Proof:

P{X = k} = C(n, k) p^k (1 − p)^{n−k} = [n(n−1)···(n−k+1) / n^k] · (λ^k / k!) · (1 − λ/n)^{n−k}

As n → ∞:

n(n−1)···(n−k+1) / n^k → 1,  (1 − λ/n)^n → e^{−λ},  (1 − λ/n)^{−k} → 1

=> P{X = k} → e^{−λ} λ^k / k!
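The convergence is easy to observe numerically; λ = 3 and k = 2 below are arbitrary illustrative choices:

```python
import math

def binom_pmf(k, n, p):
    # P{X = k} = C(n,k) p^k (1-p)^{n-k}
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

lam, k = 3.0, 2            # illustrative: np held at 3, pmf evaluated at k = 2
poisson = math.exp(-lam) * lam**k / math.factorial(k)

# the approximation error shrinks as n grows with p = lam/n
errs = [abs(binom_pmf(k, n, lam / n) - poisson) for n in (10, 100, 1000)]
print(errs[0] > errs[1] > errs[2], errs[2] < 1e-3)  # True True
```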

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Poisson Process with Rate λ


{A(t): t ≥ 0} is a counting process:

A(t) is the number of events (arrivals) that have occurred from time 0 to time t, where A(0) = 0

A(t) − A(s) is the number of arrivals in interval (s, t]

Numbers of arrivals in disjoint intervals are independent

The number of arrivals in any interval (t, t+τ] of length τ:

Depends only on its length τ

Follows Poisson distribution with parameter λτ:

P{A(t + τ) − A(t) = n} = e^{−λτ} (λτ)^n / n!,  n = 0, 1, ...

=> Average number of arrivals in the interval is λτ; λ is the arrival rate

Interarrival-Time Statistics


Interarrival times for a Poisson process are independent and follow exponential distribution with parameter λ:

t_n: time of the nth arrival; τ_n = t_{n+1} − t_n: nth interarrival time

P{τ_n ≤ s} = 1 − e^{−λs},  s ≥ 0

Proof:

Probability distribution function:

P{τ_n ≤ s} = 1 − P{τ_n > s} = 1 − P{A(t_n + s) − A(t_n) = 0} = 1 − e^{−λs}

Independence follows from independence of the numbers of arrivals in disjoint intervals
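Combining the two results, a Poisson process can be generated from exponential interarrivals and the resulting count distribution checked (λ = 3 over a unit interval; both values are illustrative, not from the slides):

```python
import math
import random

random.seed(5)
lam, T = 3.0, 1.0          # illustrative rate and observation window
n_runs = 50_000

counts = []
for _ in range(n_runs):
    t, count = 0.0, 0
    while True:
        t += -math.log(random.random()) / lam   # exponential interarrival
        if t > T:
            break
        count += 1
    counts.append(count)

mean = sum(counts) / n_runs        # theory: lam * T = 3
p0 = counts.count(0) / n_runs      # theory: P{A(T) = 0} = e^{-lam*T}
print(abs(mean - 3.0) < 0.05, abs(p0 - math.exp(-3.0)) < 0.01)  # True True
```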

Small Interval Probabilities


Interval (t, t + δ] of length δ:

P{A(t + δ) − A(t) = 0} = 1 − λδ + o(δ)

P{A(t + δ) − A(t) = 1} = λδ + o(δ)

P{A(t + δ) − A(t) ≥ 2} = o(δ)

Proof: expand e^{−λδ} = 1 − λδ + o(δ) in the Poisson pmf P{A(t+δ) − A(t) = n} = e^{−λδ} (λδ)^n / n!

Merging & Splitting Poisson Processes

[Diagram: rates λ1 and λ2 merged into a single process of rate λ1 + λ2; a rate-λ process split with probabilities p and 1−p into rates λp and λ(1−p)]

Merging:

A_1, ..., A_k independent Poisson processes with rates λ_1, ..., λ_k

Merged into a single process A = A_1 + ... + A_k

A is a Poisson process with rate λ = λ_1 + ... + λ_k

Splitting:

A: Poisson process with rate λ

Split into processes A_1 and A_2 independently, with probabilities p and 1−p respectively

A_1 is Poisson with rate λ_1 = λp

A_2 is Poisson with rate λ_2 = λ(1−p)
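A simulation sketch of the merging result (rates λ1 = 2 and λ2 = 3 and the horizon are illustrative choices, not from the slides):

```python
import math
import random

random.seed(6)

def arrival_times(lam, horizon):
    # arrival instants of a Poisson process of rate lam on [0, horizon]
    times, t = [], 0.0
    while True:
        t += -math.log(random.random()) / lam
        if t > horizon:
            return times
        times.append(t)

lam1, lam2, horizon = 2.0, 3.0, 10_000.0   # illustrative rates and horizon
merged = sorted(arrival_times(lam1, horizon) + arrival_times(lam2, horizon))

gaps = [b - a for a, b in zip(merged, merged[1:])]
mean_gap = sum(gaps) / len(gaps)   # theory: 1 / (lam1 + lam2) = 0.2
print(abs(mean_gap - 0.2) < 0.005)  # True
```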

Modeling Arrival Statistics


Poisson process widely used to model packet arrivals in numerous networking problems

Justification: provides a good model for the aggregate traffic of a large number of independent users

n traffic streams, with independent identically distributed (iid) interarrival times with PDF F(s), not necessarily exponential

Arrival rate of each stream is λ/n

As n → ∞, the combined stream can be approximated by Poisson under mild conditions on F(s), e.g., F(0) = 0, F'(0) > 0

Most important reason for the Poisson assumption: analytic tractability of queueing models

Outline


Delay in packet networks

Introduction to queuing theory

Exponential and Poisson distributions

Poisson process

Little's Theorem

Little's Theorem


λ: customer arrival rate

N: average number of customers in system

T: average delay per customer in system

Little's Theorem: for a system in steady-state,

N = λT
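The theorem can be illustrated on a simulated FCFS single-server queue with exponential interarrivals and service times (an M/M/1 system; λ = 0.5 and μ = 1 are illustrative choices, not from the slides):

```python
import math
import random

random.seed(7)
lam, mu = 0.5, 1.0   # illustrative arrival and service rates (utilization 0.5)
n_cust = 200_000

arrival_times, departure_times = [], []
t_arr = last_dep = 0.0
for _ in range(n_cust):
    t_arr += -math.log(random.random()) / lam   # exponential interarrival
    service = -math.log(random.random()) / mu   # exponential service time
    start = max(t_arr, last_dep)                # FCFS: wait for the server
    last_dep = start + service
    arrival_times.append(t_arr)
    departure_times.append(last_dep)

area = sum(d - a for a, d in zip(arrival_times, departure_times))
T = area / n_cust                # average time in system per customer
N = area / departure_times[-1]   # time-average number in system

# Little: N should match lam * T (M/M/1 theory also gives T = 1/(mu-lam) = 2)
print(abs(N - lam * T) < 0.05)  # True
```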

Counting Processes of a Queue

[Plot: staircase curves α(t) and β(t) vs. t; N(t) is the vertical gap between them]

N(t): number of customers in system at time t

α(t): number of customer arrivals till time t

β(t): number of customer departures till time t

T_i: time spent in system by the ith customer

Time Averages


Time averages over interval [0, t]:

N_t = (1/t) ∫_0^t N(s) ds

λ_t = α(t) / t

T_t = (1/α(t)) Σ_{i=1}^{α(t)} T_i

β_t = β(t) / t

Steady-state time averages:

N = lim_{t→∞} N_t,  λ = lim_{t→∞} λ_t,  T = lim_{t→∞} T_t,  β = lim_{t→∞} β_t

Little's theorem: N = λT

Applies to any queueing system provided that the limits T, λ, and β exist, and λ = β

We give a simple graphical proof under a set of more restrictive assumptions

Proof of Little's Theorem for FCFS

[Plot: α(t) and β(t) staircases; the shaded area between them is made up of the delays T_1, T_2, ...]

FCFS system, N(0) = 0

α(t) and β(t): staircase graphs

N(t) = α(t) − β(t)

Shaded area between the graphs:

S(t) = ∫_0^t N(s) ds

Assumption: infinitely often, N(t) = 0. For any such t:

∫_0^t N(s) ds = Σ_{i=1}^{α(t)} T_i

=> (1/t) ∫_0^t N(s) ds = (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} T_i,  i.e.,  N_t = λ_t T_t

If the limits N_t → N, T_t → T, λ_t → λ exist, Little's formula follows

We will relax the last assumption (i.e., infinitely often, N(t) = 0)

Proof of Little's for FCFS (contd.)

[Plot: α(t) and β(t) staircases with the delays T_1, T_2, ... between them]

In general, even if the queue is not empty infinitely often:

Σ_{i=1}^{β(t)} T_i ≤ ∫_0^t N(s) ds ≤ Σ_{i=1}^{α(t)} T_i

=> (β(t)/t) · (1/β(t)) Σ_{i=1}^{β(t)} T_i ≤ N_t ≤ (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} T_i

The result follows assuming the limits T_t → T, λ_t → λ, and β_t → β exist, and λ = β

Probabilistic Form of Little's Theorem


Have considered a single sample function for a stochastic process

Now will focus on the probabilities of the various sample functions of a stochastic process

Probability of n customers in system at time t:

p_n(t) = P{N(t) = n}

Expected number of customers in system at t:

E[N(t)] = Σ_{n=0}^∞ n P{N(t) = n} = Σ_{n=0}^∞ n p_n(t)

Probabilistic Form of Little (contd.)


p_n(t), E[N(t)] depend on t and on the initial distribution at t = 0

We will consider systems that converge to steady-state, where there exist p_n independent of the initial distribution:

lim_{t→∞} p_n(t) = p_n,  n = 0, 1, ...

Expected number of customers in steady-state [stochastic average]:

EN = Σ_{n=0}^∞ n p_n = lim_{t→∞} E[N(t)]

For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1:

N = lim_{t→∞} N_t = lim_{t→∞} E[N(t)] = EN

Probabilistic Form of Little (contd.)


In principle, we can find the probability distribution of the delay T_i for customer i, and from that the expected value E[T_i], which converges to steady-state:

ET = lim_{i→∞} E[T_i]

For an ergodic system:

T = lim_{i→∞} (1/i) Σ_{j=1}^i T_j = lim_{i→∞} E[T_i] = ET

Probabilistic form of Little's formula:

EN = λ · ET

where the arrival rate λ is defined as

λ = lim_{t→∞} E[α(t)] / t

Time vs. Stochastic Averages


"Time averages = stochastic averages" for all systems of interest in this course

This holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞

Can be justified on the basis of general properties of Markov chains

Example 0: a single line


For a transmission line:

λ: packet arrival rate

N_Q: average number of packets waiting in queue (i.e., not under transmission)

W: average time spent by a packet waiting in queue (i.e., not including transmission time)

=> N_Q = λW

Similarly, if X̄ is the average transmission time, then the average number of packets under transmission is

ρ = λX̄

ρ is also called the utilization factor
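A tiny worked instance of these two formulas (all numbers below are made up for illustration):

```python
# All numbers below are made up for illustration
lam = 800.0      # packet arrival rate (packets/s)
x_bar = 0.001    # average transmission time (s)
w = 0.004        # average wait in queue per packet (s)

rho = lam * x_bar   # average number of packets under transmission (utilization)
n_q = lam * w       # Little's theorem applied to the waiting area only

print(round(rho, 3), round(n_q, 3))  # → 0.8 3.2
```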

Example 1: a network


Given:

A network with packets arriving at n different nodes, with arrival rates λ_1, ..., λ_n respectively

N: average number of packets inside the network

Then:

The average delay per packet (regardless of packet length distribution and routing algorithms) is

T = N / (λ_1 + ... + λ_n)

Moreover, N_i = λ_i T_i for each node i
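A small worked instance of the network formula (the rates and per-node delays below are made up for illustration):

```python
# Per-node arrival rates (packets/s) and average per-node delays (s);
# all numbers are made up for illustration
lams = [100.0, 250.0, 150.0]
Ts = [0.02, 0.01, 0.04]

Ns = [l * t for l, t in zip(lams, Ts)]   # Little per node: N_i = lam_i * T_i
N = sum(Ns)                              # total packets in the network
T = N / sum(lams)                        # network-wide average delay per packet

print(round(N, 2), round(T, 4))  # → 10.5 0.021
```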

Example 2: data transport (congestion control)


Consider:

A window flow-control system with a window of size W for each session

λ: per-session packet arrival rate

T: average packet delay in the network

Then W ≥ λT

=> if congestion builds up (i.e., T increases), λ must eventually decrease

Now suppose:

The network is congested but capable of maintaining the delivery rate λ; then

λ ≈ W/T

=> increasing W only increases the delay T

Summary


Delay in packet networks

Introduction to queuing theory

A few more points about probability theory

The Poisson process

Little's Theorem

Homework #7


Problems 3.1, 3.4, and 3.6 of R1

Grading:


Overall points 130




20 points for Prob. 3.1

50 points for Prob. 3.4

60 points for Prob. 3.6
