
Queuing Theory

(Theory of Waiting Lines)


Applications
• ATM
• Payment for purchase
• Ticket window
• Health services (e.g., control of hospital bed assignments)
• Airport traffic
• Job shop
• Telecommunications
• Traffic control
Random Variable
Random variable: A variable is random if it takes on
different values as a result of the outcomes of a random
experiment.

Random experiment: an experiment whose outcomes cannot be predicted with certainty.

Examples:
X = number of heads when the experiment is flipping a coin 20
times.
C = the daily change in a stock price.
R = the number of miles per liter you get on your bike.
Types of Random Variables
A discrete random variable can take on only a
finite or countably infinite number of values.
- you can sit down and list all possible outcomes without missing any,

although it might take you an infinite amount of time.

X = sum of values on the roll of two dice: X has to be either 2, 3, 4, …, or 12.


Y = number of accidents on the KGP–KOL Highway during a month: Y has to be 0,
1, 2, 3, 4, 5, 6, 7, 8, … up to some very large number.
Types of Random Variables
A continuous random variable can take on
any of the uncountably many values in a given
interval.
- this means you can never list all possible outcomes even if you had an infinite
amount of time.
– usually measurement data [time, weight, distance, etc]
X = time it takes you to drive home from class: X > 0; it might be recorded as
30.1 minutes to the nearest tenth, but the actual time could be
30.10000001… minutes.
Exercise: try to list all possible numbers between 0 and 1.
Probability distribution
• A random variable (discrete or continuous) has a
probability distribution.
• A probability distribution lists the possible values of
the random variable together with the probability that
each value occurs in the experiment.
• A probability distribution can be in the form of a table,
graph, or mathematical formula.

Probability Distributions

• Two Types of Probability Distributions


– Discrete: When the variables being measured can only
take on certain values, such as the integers 0, 1, 2, …, the
probability distribution is called a discrete distribution.
The distribution of the number of nonconformities would
be a discrete distribution.
– Continuous: When a variable being measured is expressed
on a continuous scale, its probability distribution is called
a continuous distribution. The probability distribution of
piston-ring diameter is continuous.

Discrete
Probability Distribution
1. List of all possible [x, P(x)] pairs
   x = value of the random variable (outcome)
   P(x) = probability associated with that value
2. Mutually exclusive (no overlap)
3. Collectively exhaustive (nothing left out)
4. 0 ≤ P(x) ≤ 1
5. Σ P(x) = 1
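As a quick check of these requirements, here is a minimal Python sketch (assuming, purely as an illustration, the two-dice distribution from the earlier example):

```python
# Minimal sketch: verify properties 4 and 5 for the distribution of the sum of two fair dice.
from fractions import Fraction

# P(x) for x = 2..12 when rolling two fair dice
p = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

assert all(0 <= px <= 1 for px in p.values())   # property 4: 0 <= P(x) <= 1
assert sum(p.values()) == 1                     # property 5: sum of P(x) equals 1
print(p[7])                                     # 1/6, the most likely sum
```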
Mean and Variance of a
discrete probability distribution
μ = E(X) = Σ_i x_i p(x_i)

V(X) = σ_X² = E[(X − E(X))²] = Σ_i (x_i − E(X))² p(x_i)

     = E(X²) − [E(X)]² = Σ_i x_i² p(x_i) − [Σ_i x_i p(x_i)]²
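A short numerical sketch (again assuming the two-dice distribution, for illustration) that evaluates both forms of the variance formula:

```python
# Mean and variance of a discrete distribution, computed from the definitions above.
xs = range(2, 13)
p = {x: (6 - abs(x - 7)) / 36 for x in xs}            # sum of two fair dice

mean = sum(x * p[x] for x in xs)                      # E(X) = sum of x * p(x)
var = sum((x - mean) ** 2 * p[x] for x in xs)         # sum of (x - E(X))^2 * p(x)
var_alt = sum(x * x * p[x] for x in xs) - mean ** 2   # E(X^2) - [E(X)]^2

print(mean, var, var_alt)                             # 7.0, 5.833..., 5.833...
```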
A few notes about Expected Value
If c is a constant (i.e., not a random variable) and X and Y are any random
variables…
• E(c) = c
• E(cX)=cE(X)
• E(c + X)=c + E(X)
• E(X+Y)= E(X) + E(Y)
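A small simulated-data sketch of these rules (illustrative only; sample means stand in for expectations, and the distributions and constant are arbitrary choices):

```python
# Checking the expectation rules numerically on simulated data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.poisson(4.0, size=100_000)
Y = rng.exponential(2.0, size=100_000)
c = 3.0

print(np.mean(c * X), c * np.mean(X))            # E(cX) = c E(X)
print(np.mean(c + X), c + np.mean(X))            # E(c + X) = c + E(X)
print(np.mean(X + Y), np.mean(X) + np.mean(Y))   # E(X + Y) = E(X) + E(Y)
```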

A few notes about Variance
If c is a constant (i.e., not a random variable) and X and Y are random
variables, then
• V(c) = 0
• V(c + X) = V(X)
• V(cX) = c²V(X)
• V(X + Y) = V(X) + V(Y), only if X and Y are independent
• V(X + Y) = V(X) + V(Y) + 2COV(X, Y), if X and Y are not independent
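A similar simulated-data sketch for the variance rules (illustrative only; Y is constructed to depend on X so that the covariance term matters):

```python
# Checking the variance rules numerically on simulated data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 2.0, size=100_000)
Z = rng.normal(0.0, 1.0, size=100_000)   # independent of X
Y = 0.5 * X + Z                          # dependent on X
c = 3.0

print(np.var(c * X), c**2 * np.var(X))        # V(cX) = c^2 V(X)
print(np.var(X + Z), np.var(X) + np.var(Z))   # independent: V(X+Z) ~ V(X) + V(Z)
print(np.var(X + Y),                          # dependent: V(X+Y) = V(X) + V(Y) + 2 COV(X,Y)
      np.var(X) + np.var(Y) + 2 * np.cov(X, Y, bias=True)[0, 1])
```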

The Expected Value and variance of
a continuous Random Variable
If dx is an infinitely small number, the probability that X falls within the interval
(x, x + dx) is equal to f(x)dx, i.e. P(x < X < x + dx) = P(x ≤ X ≤ x + dx) = f(x)dx,
where f(x) is the probability density function, a piecewise continuous function with f(x) ≥ 0.

∫_{R_X} f(x) dx = 1

P(a ≤ X ≤ b) = ∫_a^b f(x) dx

E(X) = μ = ∫_{−∞}^{+∞} x f(x) dx

V(X) = E[(X − μ)²] = ∫_{−∞}^{+∞} (x − μ)² f(x) dx
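As a sketch, these integrals can be evaluated numerically for an assumed density; the example below uses a hypothetical f(x) = 3x² on [0, 1], which is not from the slides:

```python
# Numerical evaluation of the total probability, mean and variance of a continuous density.
from scipy.integrate import quad

f = lambda x: 3 * x**2                                  # assumed pdf on [0, 1]

total, _ = quad(f, 0, 1)                                # integral of f(x) dx = 1
mean, _ = quad(lambda x: x * f(x), 0, 1)                # E(X) = 3/4
var, _ = quad(lambda x: (x - mean)**2 * f(x), 0, 1)     # V(X) = 3/80

print(total, mean, var)                                 # 1.0, 0.75, 0.0375
```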
The Poisson Distribution
The distribution was derived by the French
mathematician Siméon Poisson in 1837,
and the first application was the description
of the number of deaths by horse kicking in
the Prussian army.

Suitable for modeling of rare events that occur infrequently in


space, time, volume, and so forth.
The Poisson Distribution
– Any random phenomenon that occurs on a per
unit basis (per unit area, per unit volume, per
unit time, etc.) is often well approximated by
the Poisson distribution.
– Examples: number of defective rivets in an
airplane wing, spots per square meter of cloth,
machine breakdowns per week, the number of
misprints in a book, the number of customers
arriving per day, wrong phone calls received
during a week
Assumptions
A random variable X follows a Poisson process provided
the following conditions are met

1. The probability of two or more occurrences in any
sufficiently small interval is essentially zero.
2. The probability of occurrence is the same for any two
intervals of equal length.
3. The number of occurrences in any interval is
independent of the number of occurrences in any other
interval provided the intervals are not overlapping.
The Poisson Distribution
The Poisson distribution function is

p(x) = e^{−λ} λ^x / x!,   x = 0, 1, 2, …

where the parameter λ > 0. The mean and
variance of the Poisson distribution are

μ = σ² = λ

For a Poisson random variable, the
variance and mean are the same!

λ = average number of events, on a per-unit basis, that happen over a specified area,
volume, time, etc. (shape parameter)

Denoted by X ~ Poisson(λ)


Example
Arrivals at a bus-stop follow a
Poisson distribution with an average
of 4.5 every quarter of an hour.
Calculate the probability of fewer than
3 arrivals in a quarter of an hour.
The probabilities of 0 up to 2 arrivals can be calculated
directly from the formula

p(x) = e^{−λ} λ^x / x!   with λ = 4.5

p(0) = e^{−4.5} (4.5)^0 / 0! = 0.01111

Similarly p(1) = 0.04999 and p(2) = 0.11248

So the probability of fewer than 3 arrivals
is 0.01111 + 0.04999 + 0.11248 = 0.17358
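The same numbers can be cross-checked with scipy.stats, noting that "fewer than 3 arrivals" means P(X ≤ 2):

```python
# Cross-check of the bus-stop example.
from scipy.stats import poisson

lam = 4.5
print([round(poisson.pmf(k, lam), 5) for k in range(3)])   # [0.01111, 0.04999, 0.11248]
print(round(poisson.cdf(2, lam), 5))                       # 0.17358
```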
Mean of Poisson Distribution
p(x) = e^{−λ} λ^x / x!,   x = 0, 1, 2, …

E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x [e^{−λ} λ^x / x!]

     = Σ_{x=1}^{∞} x [e^{−λ} λ^x / x!] = Σ_{x=1}^{∞} e^{−λ} λ^x / (x − 1)!

     = λ e^{−λ} Σ_{x=1}^{∞} λ^{x−1} / (x − 1)! = λ e^{−λ} e^{λ} = λ
Variance of Poisson Distribution
V(X) = E[X²] − [E(X)]² = E[X(X − 1)] + E(X) − [E(X)]²

     = Σ_{x=0}^{∞} x(x − 1) [e^{−λ} λ^x / x!] + λ − λ²

     = Σ_{x=2}^{∞} x(x − 1) [e^{−λ} λ^x / x!] + λ − λ²

     = Σ_{x=2}^{∞} e^{−λ} λ^x / (x − 2)! + λ − λ²

     = λ² e^{−λ} Σ_{x=2}^{∞} λ^{x−2} / (x − 2)! + λ − λ²

     = λ² e^{−λ} e^{λ} + λ − λ² = λ² + λ − λ² = λ

V(X) = λ
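A quick numerical sketch (using λ = 4.5 from the earlier example) confirming that the mean and the variance both come out to λ:

```python
# Numerical check that the Poisson mean and variance both equal lambda.
import math

lam = 4.5
pmf = lambda x: math.exp(-lam) * lam**x / math.factorial(x)

mean = sum(x * pmf(x) for x in range(100))
var = sum(x**2 * pmf(x) for x in range(100)) - mean**2
print(mean, var)                                # both approximately 4.5
```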
Exponential Distribution
• A continuous random variable X follows the exponential
distribution with parameter λ, if its probability density
function is:

f(x) = λ e^{−λx}   if x ≥ 0
       0           if x < 0

• Cumulative distribution function:

F(x) = P{X ≤ x} = 1 − e^{−λx}   if x ≥ 0
                  0             if x < 0

Exponential Distribution
• Mean and Variance:
E[X] = 1/λ,   Var(X) = 1/λ²

Proof:

E[X] = ∫_0^∞ x f(x) dx = ∫_0^∞ x λ e^{−λx} dx = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx = 1/λ

E[X²] = ∫_0^∞ x² λ e^{−λx} dx = [−x² e^{−λx}]_0^∞ + 2 ∫_0^∞ x e^{−λx} dx = (2/λ) E[X] = 2/λ²

Var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ²

Graphing Exponential Distribution

[Figure: exponential density f(x) = λ e^{−λx} for x ≥ 0; the shaded area up to X = t represents the probability that the inter-arrival time is within t time units.]
Model Queuing System
Queuing System:
Customer Arrival → Queue → Server → Customer Departure
• Use Queuing models to


– Describe the behavior of queuing systems
– Evaluate system performance
Kendall Notation (a/b/c):(d/e/f)
• To summarize the characteristics of the queuing
system
• Six parameters in shorthand; the first three are typically
used unless otherwise specified
a: Arrival distribution (e.g., Poisson arrival rate)
b: Service distribution (e.g., Poisson departure/service rate)
c: Number of servers (= 1, 2, …, ∞)
d: Service discipline (FCFS/LIFO/SIRO, etc.)
e: Maximum number allowed in the system (finite or infinite)
f: Size of the input source (finite/infinite)
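For example, (M/M/1):(FCFS/∞/∞) describes a queue with Poisson (Markovian) arrivals, exponential (Markovian) service times, a single server, first-come-first-served discipline, no limit on the number allowed in the system, and an infinite input source.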
Steady State Performance Measures of
Queuing System
Steady-state condition: the system state becomes independent of the initial state and of the elapsed time.

Performance Measures

L : Average Number of customers in system

Lq : Average Number of customers in queue (Average queue length)

W : Average time a customer spends in system

Wq : Average time a customer spends in queue (Average waiting time)


Little’s Law
• Under steady state, L = λW
λ: Average arrival rate (average number of customers arriving per unit time)

Let
T : a long period of time;
n(t) : number of customers in the system at time t;
A(T) : area under the curve n(t) over the time period T;
N(T) : number of arrivals during the time period T.

[Figure: number of customers in a queuing system, n(t), versus time.]

Then, during the time period T:

λ(T) = N(T)/T = average arrival rate
⇒ N(T) = T λ(T)   … (i)

L(T) = A(T)/T = average number of customers in the system
⇒ A(T) = T L(T)   … (ii)

W(T) = A(T)/N(T) = average time a customer spends in the system
⇒ W(T) = L(T)/λ(T)   (from (i) and (ii))
⇒ L(T) = λ(T) W(T)

As T → ∞: L = λW

Similarly, Lq = λWq
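A minimal simulation sketch of Little's Law (a single FCFS server with assumed Poisson arrivals at rate λ = 2 and exponential service at rate μ = 3; names and rates are illustrative):

```python
# Simulate a single-server FCFS queue and verify L = lambda * W.
import numpy as np

rng = np.random.default_rng(42)
lam, mu, n = 2.0, 3.0, 50_000                       # assumed arrival and service rates

arrivals = np.cumsum(rng.exponential(1 / lam, n))   # arrival instants
service = rng.exponential(1 / mu, n)                # service times

depart = np.empty(n)
depart[0] = arrivals[0] + service[0]
for i in range(1, n):                               # FCFS: start when server free and customer present
    depart[i] = max(arrivals[i], depart[i - 1]) + service[i]

T = depart[-1]                                      # observation period
time_in_system = depart - arrivals
L = time_in_system.sum() / T                        # time-average number in system (area / T)
W = time_in_system.mean()                           # average time a customer spends in system
lam_hat = n / T                                     # observed arrival rate

print(L, lam_hat * W)                               # the two values agree: L = lambda * W
```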
Arrival rate and Inter-arrival time distributions
• Arrival rate follows Poisson distribution:
Number of arrivals during specific time interval follows Poisson
distribution.
Probability of n arrivals during time t:  p_n(t) = (λt)^n e^{−λt} / n!,  n = 0, 1, 2, …
Mean = Variance = λt, where λ = average arrival rate

• Inter-arrival time follows exponential distribution:
The inter-arrival time t follows the exponential distribution with
probability density function (pdf) f(t) = λ e^{−λt}, t ≥ 0
Mean = 1/λ = average inter-arrival time, and Variance = 1/λ²

Cumulative distribution function (cdf): P(t ≤ T) = ∫_0^T λ e^{−λt} dt = 1 − e^{−λT}, T ≥ 0
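A short sketch (with an assumed rate λ = 4 arrivals per unit time) showing both views of the same arrival process: exponential inter-arrival times produce Poisson-distributed counts per unit interval:

```python
# Exponential inter-arrival times vs. Poisson counts per unit interval.
import numpy as np

rng = np.random.default_rng(7)
lam, horizon = 4.0, 10_000.0

gaps = rng.exponential(1 / lam, size=int(2 * lam * horizon))   # inter-arrival times
arrival_times = np.cumsum(gaps)
arrival_times = arrival_times[arrival_times < horizon]

counts, _ = np.histogram(arrival_times, bins=np.arange(0, horizon + 1))
print(counts.mean(), counts.var())       # both close to lambda = 4 (Poisson: mean = variance)
print(np.diff(arrival_times).mean())     # close to 1/lambda = 0.25 (exponential mean)
```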
