
Probability Review

Thinh Nguyen
Probability Theory Review

- Sample space
- Bayes' Rule
- Independence
- Expectation
- Distributions
Sample Space - Events

- Sample Point
  - The outcome of a random experiment
- Sample Space S
  - The set of all possible outcomes
  - Discrete or continuous
- Events
  - A set of outcomes, thus a subset of S
  - Certain, impossible, and elementary
Set Operations

- Union: A ∪ B
- Intersection: A ∩ B
- Complement: A^c
- Properties
  - Commutativity: A ∪ B = B ∪ A
  - Associativity: A ∪ (B ∪ C) = (A ∪ B) ∪ C
  - Distributivity: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
  - De Morgan's Rule: (A ∪ B)^c = A^c ∩ B^c
Axioms and Corollaries

- Axioms
  - 0 ≤ P(A)
  - P(S) = 1
  - If A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)
  - If A1, A2, … are pairwise mutually exclusive, then P(∪k Ak) = Σk P(Ak)
- Corollaries
  - P(A^c) = 1 − P(A)
  - P(A) ≤ 1
  - P(∅) = 0
  - P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Conditional Probability

- Conditional probability of event A given that event B has occurred:

  P(A | B) = P(A ∩ B) / P(B)

- If B1, B2, …, Bn form a partition of S, then

  P(A) = P(A | B1) P(B1) + … + P(A | Bn) P(Bn)

  (Law of Total Probability)
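As a small numerical sketch of the law of total probability (the two-coin setup and all probabilities below are made-up values, not from the slides):

```python
# Hypothetical setup: a coin is picked at random (partition B1 = "fair",
# B2 = "biased"), then flipped once; A = "heads". All numbers are assumed.
P_B = {"fair": 0.5, "biased": 0.5}          # priors P(Bj)
P_A_given_B = {"fair": 0.5, "biased": 0.9}  # likelihoods P(A | Bj)

# Law of total probability: P(A) = sum_j P(A | Bj) P(Bj)
P_A = sum(P_A_given_B[b] * P_B[b] for b in P_B)
print(round(P_A, 4))  # 0.7
```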
Bayes’ Rule

- If B1, …, Bn form a partition of S, then

  P(Bj | A) = P(A ∩ Bj) / P(A) = P(A | Bj) P(Bj) / Σ_{k=1}^n P(A | Bk) P(Bk)

- In words: posterior = likelihood × prior / evidence
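Continuing in the same spirit (the two-hypothesis partition and all numbers below are assumed for illustration), Bayes' Rule inverts the conditioning:

```python
# posterior = likelihood * prior / evidence, for a hypothetical partition.
priors = {"B1": 0.5, "B2": 0.5}       # P(Bj), assumed
likelihoods = {"B1": 0.5, "B2": 0.9}  # P(A | Bj), assumed

evidence = sum(likelihoods[b] * priors[b] for b in priors)  # P(A), total probability
posterior = {b: likelihoods[b] * priors[b] / evidence for b in priors}

print(round(posterior["B2"], 3))  # 0.643
```

The posteriors always sum to 1 because the evidence term normalizes them.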
Event Independence

- Events A and B are independent if

  P(A ∩ B) = P(A) P(B)

- If two events have non-zero probability and are mutually exclusive, then they cannot be independent.
Random Variables
Random Variables

- The notion of a random variable
  - The outcome ζ of an experiment is not always a number
  - Assign a numerical value X(ζ) = x to the outcome of the experiment
- Definition
  - A function X which assigns a real number X(ζ) to each outcome ζ in the sample space of a random experiment
Cumulative Distribution Function

- Defined as the probability of the event {X ≤ x}:

  F_X(x) = P(X ≤ x)

- Properties
  - 0 ≤ F_X(x) ≤ 1
  - lim x→∞ F_X(x) = 1
  - lim x→−∞ F_X(x) = 0
  - If a ≤ b, then F_X(a) ≤ F_X(b)
  - P(a < X ≤ b) = F_X(b) − F_X(a)
  - P(X > x) = 1 − F_X(x)
Types of Random Variables

- Continuous: probability density function

  f_X(x) = dF_X(x)/dx
  F_X(x) = ∫_−∞^x f_X(t) dt

- Discrete: probability mass function

  P_X(xk) = P(X = xk)
  F_X(x) = Σk P_X(xk) u(x − xk)
Probability Density Function

- The pdf is computed from the cdf:

  f_X(x) = dF_X(x)/dx

- Properties
  - P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx
  - F_X(x) = ∫_−∞^x f_X(t) dt
  - P(x < X ≤ x + dx) ≈ f_X(x) dx
  - 1 = ∫_−∞^∞ f_X(t) dt
- For a discrete r.v.:

  f_X(x) = Σk P_X(xk) δ(x − xk)
Expected Value and Variance

- The expected value or mean of X is

  E[X] = ∫_−∞^∞ t f_X(t) dt (continuous)
  E[X] = Σk xk P_X(xk) (discrete)

- Properties
  - E[c] = c
  - E[cX] = c E[X]
  - E[X + c] = E[X] + c
- The variance of X is

  Var(X) = σ² = E[(X − E[X])²]

- The standard deviation of X is

  Std(X) = σ = √Var(X)

- Properties
  - Var(c) = 0
  - Var(cX) = c² Var(X)
  - Var(X + c) = Var(X)
Queuing Theory
Example

Send a file over the internet: packets from the network card pass through a buffer before a fixed-rate modem link from host A to host C.

Delay Models

- Propagation delay
- Transmission delay
- Computation (queuing) time

Queue Model

Practical Example

- Multiserver queue vs. multiple single-server queues
- Impact of the standard deviation of service time on queueing time
Queuing Theory

The theoretical study of waiting lines, expressed in mathematical terms:

  input → queue → server → output

Delay = queue time + service time
The Problem

Given:
- One or more servers that render the service
- A (possibly infinite) pool of customers
- Some description of the arrival and service processes

Describe the dynamics of the system and evaluate its performance.

- If there is more than one queue for the server(s), there may also be some policy regarding queue changes for the customers.
Common Assumptions

- The queue is FCFS (FIFO).
- We look at steady state: after the system has started up and things have settled down.
- State = a vector indicating the total # of customers in each queue at a particular time instant (all the information necessary to completely describe the system)
Notation for queuing systems

A/B/c/d

- A = the interarrival time distribution
- B = the service time distribution
- c = the number of servers
- d = the queue size limit (omitted if infinite)

Where A and B can be:
- D for Deterministic distribution
- M for Markovian (exponential) distribution
- G for General (arbitrary) distribution
The M/M/1 System

  Poisson process → queue → exponential server → output

- Arrivals follow a Poisson process
  - Readily amenable to analysis
  - Reasonable for a wide variety of situations
- a(t) = # of arrivals in the time interval [0, t]
- λ = mean arrival rate
- For a small interval δ → 0:
  - Pr(exactly 1 arrival in [t, t+δ]) = λδ
  - Pr(no arrivals in [t, t+δ]) = 1 − λδ
  - Pr(more than 1 arrival in [t, t+δ]) ≈ 0
- Pr(a(t) = n) = e^(−λt) (λt)^n / n!, n = 0, 1, …


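The Poisson counting formula above can be evaluated directly; λ = 2 and t = 1 below are arbitrary assumed values, not from the slides:

```python
import math

def poisson_pmf(n, lam, t):
    """Pr(a(t) = n) = e^(-lam*t) * (lam*t)^n / n!"""
    return math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)

lam, t = 2.0, 1.0  # assumed arrival rate and observation interval

# The probabilities over all n sum to 1 (series truncated at 50 terms here).
total = sum(poisson_pmf(n, lam, t) for n in range(50))
print(round(total, 6))                    # 1.0
print(round(poisson_pmf(0, lam, t), 4))   # 0.1353, i.e. e^-2: chance of no arrivals
```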
Model for Interarrivals and Service times

- Customers arrive at times t0 < t1 < … (Poisson distributed)
- The differences between consecutive arrivals are the interarrival times: τn = tn − tn−1
- In a Poisson process with mean arrival rate λ, the τn are exponentially distributed:

  Pr(τn ≤ t) = 1 − e^(−λt)

- Service times are exponentially distributed, with mean service rate μ:

  Pr(Sn ≤ s) = 1 − e^(−μs)
System Features

- Service times are independent
- Service times are independent of the arrivals
- Both inter-arrival and service times are memoryless:

  Pr(Tn > t0 + t | Tn > t0) = Pr(Tn > t)

- Future events depend only on the present state
- This is a Markovian system
Exponential Distribution

Given an arrival at time x:

  P(τ − x ≤ t | τ > x) = P(x < τ ≤ x + t) / P(τ > x)
                       = [(1 − e^(−λ(x+t))) − (1 − e^(−λx))] / [1 − (1 − e^(−λx))]
                       = [e^(−λx) − e^(−λ(x+t))] / e^(−λx)
                       = e^(−λx) (1 − e^(−λt)) / e^(−λx)
                       = 1 − e^(−λt)

Same as the probability starting at time = 0.
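A quick numerical confirmation of the memoryless property derived above (the rate λ and times x, t below are arbitrary assumed values):

```python
import math

lam, x, t = 1.5, 2.0, 0.7  # arbitrary assumed rate and times

def cdf(u):
    """P(tau <= u) = 1 - e^(-lam*u) for an exponential interarrival time."""
    return 1.0 - math.exp(-lam * u)

# P(tau - x <= t | tau > x), from the definition of conditional probability...
lhs = (cdf(x + t) - cdf(x)) / (1.0 - cdf(x))
# ...equals the unconditioned P(tau <= t): the distribution "restarts" at x.
rhs = cdf(t)
print(abs(lhs - rhs) < 1e-9)  # True
```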
Markov Models

Buffer occupancy moves between states n−1, n, n+1: an arrival (rate λ) moves the system up one state, a departure (rate μ) moves it down one state.

Probability of being in state n:

  Pn(t + Δt) = Pn(t) [(1 − λΔt)(1 − μΔt) + λΔt μΔt]
             + Pn−1(t) [λΔt (1 − μΔt)]
             + Pn+1(t) [μΔt (1 − λΔt)]

As Δt → 0, by Taylor series:

  Pn(t + Δt) ≈ Pn(t) + (dPn(t)/dt) Δt
Steady State Analysis

Substituting for Pn(t + Δt) and letting dPn/dt = 0 in steady state:

  (λ + μ) Pn = λ Pn−1 + μ Pn+1

At state 0:

  λ P0 = μ P1
Markov Chains

States 0, 1, …, n−1, n, n+1, … with transition rate λ up (arrival) and μ down (departure).

- Rate leaving n = Pn (λ + μ)
- Rate arriving at n = λ Pn−1 + μ Pn+1
- Steady state: Pn (λ + μ) = λ Pn−1 + μ Pn+1
- State 0: λ P0 = μ P1
Substituting Utilization

With utilization ρ = λ/μ:

  P1 = (λ/μ) P0 = ρ P0
  μ P2 = (λ + μ) P1 − λ P0
  P2 = (λ/μ) P1 + P1 − (λ/μ) P0 = P1 (ρ + 1) − ρ P0

Substituting P1:

  P2 = ρ P0 (ρ + 1) − ρ P0 = ρ² P0 + ρ P0 − ρ P0 = ρ² P0

In general:

  Pn = ρ^n P0

- Higher states have decreasing probability
- Higher utilization causes higher probability of higher states
What about P0

  Σ_{n=0}^∞ Pn = 1 = Σ_{n=0}^∞ ρ^n P0 = P0 Σ_{n=0}^∞ ρ^n = P0 / (1 − ρ)

  P0 = 1 − ρ

  Pn = (1 − ρ) ρ^n

The queue is determined by ρ.
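The resulting geometric distribution is easy to sketch numerically (the arrival and service rates below are made up):

```python
lam, mu = 0.5, 1.0   # assumed arrival and service rates
rho = lam / mu       # utilization

# M/M/1 steady-state probabilities: Pn = (1 - rho) * rho^n
P = [(1 - rho) * rho ** n for n in range(20)]

print(P[0])              # P0 = 1 - rho = 0.5
print(round(sum(P), 4))  # ≈ 1.0 (geometric series, tail truncated at n = 20)
```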

E(n), Average Queue Size

  q = E(n) = Σ_{n=0}^∞ n Pn = Σ_{n=0}^∞ n (1 − ρ) ρ^n = (1 − ρ) Σ_{n=0}^∞ n ρ^n = ρ / (1 − ρ)
Selecting Buffers

  E(n)   ρ
  1/3    .25
  1      .5
  3      .75
  9      .9

For large utilization, the average queue size (and thus the required buffering) grows very fast: E(n) = ρ/(1 − ρ) → ∞ as ρ → 1.
Throughput

- Throughput = utilization / service time = ρ / Ts
- For ρ = .5 and Ts = 1 ms, throughput is 500 packets/sec
Intuition on Little’s Law

- If a typical customer spends T time units, on average, in the system, then the number of customers left behind by that typical customer is equal to λT:

  q = λ Tq
Applying Little’s Law

M/M/1 Average Delay

  E(n) = λ E(T), or w = λ Tw, or q = λ Tq

  E(T) = E(n)/λ = ρ / (λ(1 − ρ)) = 1 / (μ(1 − ρ)) = 1 / (μ − λ)

  Ts = 1/μ, so E(T) = Ts / (1 − ρ)
Probability of Overflow

  P(n > N) = Σ_{n=N+1}^∞ pn = (1 − ρ) Σ_{n=N+1}^∞ ρ^n = ρ^(N+1)
Buffer with N Packets

  Σ_{n=0}^N pn = 1 = p0 Σ_{n=0}^N ρ^n = p0 (1 − ρ^(N+1)) / (1 − ρ)

  p0 = (1 − ρ) / (1 − ρ^(N+1)) and pn = (1 − ρ) ρ^n / (1 − ρ^(N+1))

  pN = (1 − ρ) ρ^N / (1 − ρ^(N+1)) ≈ (1 − ρ) ρ^N for ρ < 1 and large N
  (for ρ = 1, pn = 1/(N+1))
Example

- Given
  - Arrival rate λ = 1000 packets/sec
  - Service rate μ = 1100 packets/sec
- Find
  - Utilization: ρ = λ/μ = 1000/1100 ≈ 0.91
  - Probability of having 4 packets in the queue:

    P4 = (1 − ρ) ρ^4 ≈ .062

  - Similarly: P1 ≈ .082, P2 ≈ .075, P3 ≈ .068, P5 ≈ .056


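The example's numbers can be reproduced directly (rates taken from the example above):

```python
lam, mu = 1000.0, 1100.0  # rates from the example
rho = lam / mu            # utilization

def Pn(n):
    """M/M/1 steady-state probability of n packets in the system."""
    return (1 - rho) * rho ** n

print(round(rho, 2))    # 0.91
print(round(Pn(4), 3))  # 0.062
```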
Example

  E(n) = ρ / (1 − ρ) ≈ 9.99

With infinite buffers:

  P(n > 12) = ρ^13 ≈ .28

With 12 fixed buffers, the cell loss probability is

  P12 = (1 − ρ) ρ^12 / (1 − ρ^13) ≈ .04

  Pn ≈ .11, .10, .09, .09, .08, .07, .07, .06, .05, .05, .04, …
Application to Statistical Multiplexing

- Consider one transmission line with rate R.
- Time-division multiplexing
  - Divide the capacity of the transmitter into N channels, each with rate R/N.
  - Average delay per channel: T = 1 / (μ − λ)
- Statistical multiplexing
  - Buffer the packets coming from N streams into a single buffer and transmit them one at a time at the full rate R.
  - Average delay: T′ = 1 / (Nμ − Nλ) = T/N
Network of M/M/1 Queues

With external arrival streams γ1, γ2, γ3 routed through queues 1, 2, 3 as in the figure, the arrival rate at each queue is the sum of the flows passing through it, e.g.

  λ1 = γ1 + γ2,  λ2 = γ1 + γ2 + γ3,  λ3 = γ1 + γ3

  L = L1 + L2 + L3, with Li = λi / (μi − λi)

Average delay through a network of J queues, with γ the total external arrival rate:

  T = (1/γ) Σ_{i=1}^J λi / (μi − λi)
The customers pat at rate C since each
M/G/1 Queue customer pays C on the average and 
customers go through the queue per unit time.
Assume that every customer in the queue pays at rate R when his or
her remaining service time is equal
At atogiven
R. time t, the customers pay at a rate
S : Service Time equal to the sum of the remaining service times
Q : Queuing Time of all the customer in the queue. The queue
S 2come-first served, this sum is equal to
begin first
Total cost paid by a customer: the 
SQqueueing time of a customer who would
2
enter the queue at time t. 2
Expected cost paid by each customer: C 
E[ Q ] E [ S ]

 2
 E[Q] E[ S 2 ] 
E[Q]  C     
  2 
 E[ S 2 ]
S E[Q] 
2(1   )
1
0 T  E[Q] 
Q S 

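The mean waiting-time formula above can be sanity-checked against the M/M/1 result from earlier: for exponential service, E[S²] = 2/μ², and the formula should reduce to T = 1/(μ − λ). The rates below are assumed:

```python
lam, mu = 0.5, 1.0   # assumed rates; service time S ~ Exp(mu)
rho = lam / mu
ES2 = 2.0 / mu ** 2  # second moment of an exponential service time

EQ = lam * ES2 / (2.0 * (1.0 - rho))  # E[Q] = lam*E[S^2] / (2*(1 - rho))
T = EQ + 1.0 / mu                     # total delay = waiting + service

print(EQ, T)  # 1.0 2.0 — matches the M/M/1 result 1/(mu - lam) = 2.0
```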