4 Review of Basic Probability and Statistics
4.1 Introduction
[Flowchart: the role of probability and statistics in a simulation study —
• Choose the input probability distributions
• Validate the simulation model
• Design the simulation experiments
• Perform statistical analyses of the simulation output data]
4.2 Random variables and their properties
• An experiment is a process whose outcome is not known with certainty.
• The sample space (S) is the set of all possible outcomes of an experiment.
• Sample points are the outcomes themselves.
• A random variable (denoted X, Y, Z) is a function that assigns a real number to each point in the sample space S.
• The values that random variables take on are denoted by lowercase letters x, y, z.
Examples
• Flipping a coin: S = {H, T}
• Tossing a die: S = {1, 2, …, 6}
• Flipping two coins: S = {(H,H), (H,T), (T,H), (T,T)}; X: the number of heads that occur
• Rolling a pair of dice: S = {(1,1), (1,2), …, (6,6)}; X: the sum of the two dice
Distribution (cumulative) function:
F(x) = P(X ≤ x) for −∞ < x < ∞
Properties:
1. 0 ≤ F(x) ≤ 1 for all x.
A discrete random variable X takes on at most a countable number of values x_1, x_2, …, with probability mass function p(x_i) = P(X = x_i). Then
Σ_{i=1}^∞ p(x_i) = 1
and
F(x) = Σ_{x_i ≤ x} p(x_i) for all x.
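The pmf-to-distribution-function relationship above translates directly into code. A minimal Python sketch (the helper name and the pmf values here are illustrative, not from the text):

```python
def make_cdf(pmf):
    """Return F(x) = sum of p(x_i) over all x_i <= x, for a pmf given as a dict."""
    points = sorted(pmf)
    def F(x):
        return sum(pmf[xi] for xi in points if xi <= x)
    return F

# Hypothetical pmf on {1, 2, 3, 4}: p(1) = 1/6, p(2) = 1/3, p(3) = 1/3, p(4) = 1/6
p = {1: 1/6, 2: 1/3, 3: 1/3, 4: 1/6}
F = make_cdf(p)
```

Note that F is a nondecreasing step function: it jumps by p(x_i) at each x_i.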
Examples
[Figure: a probability mass function p(x) (left) and the corresponding step distribution function F(x) (right) for a discrete random variable on {0, 1, 2, 3, 4}; y-axis ticks at 1/6, 1/3, 1/2, 2/3, 5/6, 1.]
A random variable X is continuous if there exists a nonnegative function f(x), the probability density function, such that for any set of real numbers B,
P(X ∈ B) = ∫_B f(y) dy and ∫_{−∞}^{∞} f(y) dy = 1.
Then
P(X = x) = ∫_x^x f(y) dy = 0 for all x
P(X ∈ [x, x + Δx]) = ∫_x^{x+Δx} f(y) dy ≈ f(x) Δx for small Δx
F(x) = P(X ∈ (−∞, x]) = ∫_{−∞}^x f(y) dy for all x
f(x) = F′(x)
P(X ∈ I) = ∫_a^b f(y) dy = F(b) − F(a) for I = [a, b]
[Figure: interpretation of the density — P(X ∈ [x, x + Δx]) is the area under f(x) between x and x + Δx.]
Example: the uniform random variable on [0, 1] has density
f(x) = 1 if 0 ≤ x ≤ 1, and 0 otherwise.
If 0 ≤ x ≤ 1, then
F(x) = ∫_0^x f(y) dy = ∫_0^x 1 dy = x.
[Figure: f(x) for a uniform random variable on [0, 1] (left) and F(x) for a uniform random variable on [0, 1] (right).]
Furthermore,
P(X ∈ [x, x + Δx]) = ∫_x^{x+Δx} f(y) dy = F(x + Δx) − F(x) = (x + Δx) − x = Δx
where 0 ≤ x < x + Δx ≤ 1.
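A minimal Python sketch of the uniform [0, 1] distribution function (the function name is ours) checks that an interval inside [0, 1] has probability equal to its length:

```python
def F_uniform(x):
    """Distribution function of a uniform random variable on [0, 1]."""
    if x < 0.0:
        return 0.0
    if x > 1.0:
        return 1.0
    return x

# P(X in [x, x + dx]) = F(x + dx) - F(x) = dx for an interval inside [0, 1]
x, dx = 0.3, 0.1
prob = F_uniform(x + dx) - F_uniform(x)
```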
Exponential random variable
[Figure: density f(x) and distribution function F(x) for an exponential random variable.]
Example: suppose X and Y are jointly discrete random variables with
p(x, y) = xy/27 for x = 1, 2 and y = 2, 3, 4 (and 0 otherwise).
Then
p_X(x) = Σ_{y=2}^{4} xy/27 = x/3 for x = 1, 2
p_Y(y) = Σ_{x=1}^{2} xy/27 = y/9 for y = 2, 3, 4.
Since p(x, y) = p_X(x) p_Y(y) for all x, y, the random variables X and Y are independent.
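The marginal and independence computations for the joint pmf p(x, y) = xy/27 above can be checked in exact arithmetic; a short Python sketch:

```python
from fractions import Fraction

# Joint pmf p(x, y) = xy/27 for x = 1, 2 and y = 2, 3, 4, in exact arithmetic
p = {(x, y): Fraction(x * y, 27) for x in (1, 2) for y in (2, 3, 4)}

# Marginal pmfs: sum the joint pmf over the other variable
pX = {x: sum(p[(x, y)] for y in (2, 3, 4)) for x in (1, 2)}
pY = {y: sum(p[(x, y)] for x in (1, 2)) for y in (2, 3, 4)}

# Independence holds iff p(x, y) == pX(x) * pY(y) for every pair
independent = all(p[(x, y)] == pX[x] * pY[y] for (x, y) in p)
```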
Joint probability density function
The random variables X and Y are jointly continuous if there exists a nonnegative function f(x, y), such that for all sets of real numbers A and B,
P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy.
X and Y are independent if and only if
f(x, y) = f_X(x) f_Y(y) for all x and y
where
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
are the (marginal) probability density functions of X and Y, respectively.
Example 4.11
Suppose that X and Y are jointly continuous random variables with
f(x, y) = 24xy for x ≥ 0, y ≥ 0, x + y ≤ 1 (and 0 otherwise).
Then
f_X(x) = ∫_0^{1−x} 24xy dy = 12xy²|_0^{1−x} = 12x(1 − x)² for 0 ≤ x ≤ 1
f_Y(y) = ∫_0^{1−y} 24xy dx = 12yx²|_0^{1−y} = 12y(1 − y)² for 0 ≤ y ≤ 1.
Since
f(1/2, 1/2) = 6 ≠ (3/2)(3/2) = f_X(1/2) f_Y(1/2)
the random variables X and Y are not independent.
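A rough numerical check of Example 4.11 with midpoint sums on a grid (grid size and tolerances are arbitrary choices of ours) confirms that the density integrates to 1, that f_X(1/2) = 3/2, and that independence fails:

```python
def f(x, y):
    """Joint density from Example 4.11: 24xy on the triangle x, y >= 0, x + y <= 1."""
    return 24.0 * x * y if x >= 0.0 and y >= 0.0 and x + y <= 1.0 else 0.0

n = 500                      # grid resolution (arbitrary)
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]

# Total mass: double midpoint sum, should be close to 1
total = sum(f(x, y) for x in mids for y in mids) * h * h
# Marginal density at x = 1/2: integrate out y, should be close to 12*(1/2)*(1/2)^2 = 3/2
fX_half = sum(f(0.5, y) for y in mids) * h
# f(1/2, 1/2) = 6, but f_X(1/2) * f_Y(1/2) = (3/2)^2 = 2.25, so not independent
not_independent = abs(f(0.5, 0.5) - fX_half * fX_half) > 1.0
```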
Properties of the expected value:
1. E(cX) = cE(X)
2. E(Σ_{i=1}^n c_i X_i) = Σ_{i=1}^n c_i E(X_i), even if the X_i's are dependent.
Median x_{0.5}
The median x_{0.5} of the random variable X_i is defined to be the smallest value of x such that F_{X_i}(x) ≥ 0.5.
[Figure: the median x_{0.5} for a continuous random variable — the area under f_{X_i}(x) to the left of x_{0.5} equals 0.5.]
Example 4.14
1. Consider a discrete random variable X that takes on each of the values 1, 2, 3, 4, and 5 with probability 0.2. Clearly, the mean and the median of X are 3.
Because it is insensitive to a few extreme values, the median may be a better measure of central tendency than the mean.
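The "smallest x with F(x) ≥ 0.5" definition translates directly into code; a sketch (helper name is ours) applied to the pmf of Example 4.14:

```python
def median_discrete(pmf):
    """Smallest x with F(x) >= 0.5, for a discrete pmf given as a dict."""
    cumulative = 0.0
    for x in sorted(pmf):          # walk the support in increasing order
        cumulative += pmf[x]       # accumulate F(x)
        if cumulative >= 0.5:
            return x

# Example 4.14: X takes each of 1, 2, 3, 4, 5 with probability 0.2
p = {x: 0.2 for x in (1, 2, 3, 4, 5)}
med = median_discrete(p)
mean = sum(x * px for x, px in p.items())
```

Both the mean and the median come out to 3 here, matching the example.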
Variance
Var(X_i) = σ_i² = E[(X_i − μ_i)²] = E(X_i²) − μ_i² = E(X_i²) − [E(X_i)]²
Example: for the uniform random variable X on [0, 1],
E(X) = ∫_0^1 x f(x) dx = ∫_0^1 x dx = 1/2
E(X²) = ∫_0^1 x² f(x) dx = ∫_0^1 x² dx = 1/3
Var(X) = E(X²) − [E(X)]² = 1/3 − (1/2)² = 1/3 − 1/4 = 1/12
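A quick midpoint-rule check in Python (grid size arbitrary) of E(X), E(X²), and Var(X) for the uniform [0, 1] case:

```python
n = 10_000
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]   # midpoints of a uniform grid on [0, 1]

EX  = sum(x * h for x in mids)        # approximates E(X)  = 1/2
EX2 = sum(x * x * h for x in mids)    # approximates E(X^2) = 1/3
var = EX2 - EX * EX                   # approximates Var(X) = 1/12
```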
[Figure: density functions with small variance σ² (concentrated) and large variance σ² (spread out).]
Properties:
1. Var(X) ≥ 0
2. Var(cX) = c² Var(X)
3. Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) if the X_i's are independent.
Example: for the jointly continuous random variables X and Y of Example 4.11,
E(XY) = ∫_0^1 ∫_0^{1−x} xy · 24xy dy dx = ∫_0^1 8x²(1 − x)³ dx = 2/15
E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 12x²(1 − x)² dx = 2/5
E(Y) = ∫_0^1 y f_Y(y) dy = ∫_0^1 12y²(1 − y)² dy = 2/5
Cov(X, Y) = E(XY) − E(X)E(Y) = 2/15 − (2/5)(2/5) = −2/75
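The value Cov(X, Y) = −2/75 ≈ −0.0267 can be reproduced with the same kind of midpoint sum (grid size and tolerance are arbitrary choices of ours):

```python
def f(x, y):
    """Joint density 24xy on the triangle x >= 0, y >= 0, x + y <= 1."""
    return 24.0 * x * y if x >= 0.0 and y >= 0.0 and x + y <= 1.0 else 0.0

n = 400
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]

EXY = EX = EY = 0.0
for x in mids:
    for y in mids:
        w = f(x, y) * h * h   # probability mass of this grid cell (midpoint rule)
        EXY += x * y * w
        EX  += x * w
        EY  += y * w

cov = EXY - EX * EY           # exact value is -2/75
```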
If X_i and X_j are independent random variables, then C_ij = 0.
The correlation ρ_ij = C_ij / √(σ_i² σ_j²) satisfies −1 ≤ ρ_ij ≤ 1.
A stochastic process is a collection of "similar" random variables ordered over time; the state space is the set of all possible values that these random variables can take on.
The process X_1, X_2, … is covariance-stationary if
μ_j = μ for j = 1, 2, … and σ_j² = σ² for j = 1, 2, …
and C_{i,i+j} = Cov(X_i, X_{i+j}) = C_j depends only on the lag j, not on i. For a covariance-stationary process the correlation is
ρ_j = C_j / σ² = C_j / C_0.
Example 4.22
Consider the output process of delays D_1, D_2, … for a covariance-stationary M/M/1 queue with utilization ρ = λ/ω < 1.
[Figure: the warmup period of the delay process.]
In general, output processes for queueing systems are positively correlated.
Unbiased estimators
Suppose X_1, X_2, …, X_n are IID random variables with mean μ and variance σ².
Sample mean:
X̄(n) = Σ_{i=1}^n X_i / n, with E[X̄(n)] = μ
Sample variance:
S²(n) = Σ_{i=1}^n [X_i − X̄(n)]² / (n − 1), with E[S²(n)] = σ²
That is, X̄(n) and S²(n) are unbiased estimators of μ and σ², respectively.
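The unbiasedness claims can be illustrated by simulation. In this sketch we draw (an arbitrary) 20,000 independent samples of size 5 from U(0, 1), for which μ = 1/2 and σ² = 1/12, and average the two estimators across replications:

```python
import random

random.seed(12345)  # fixed seed so the run is reproducible

def sample_stats(n):
    """Sample mean and sample variance (n-1 divisor) of n U(0,1) draws."""
    xs = [random.random() for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    return xbar, s2

reps, n = 20_000, 5
stats = [sample_stats(n) for _ in range(reps)]
avg_mean = sum(m for m, _ in stats) / reps   # should be close to mu = 0.5
avg_var  = sum(v for _, v in stats) / reps   # should be close to sigma^2 = 1/12
```

Dividing by n instead of n − 1 in sample_stats would make avg_var systematically too small.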
How close is X̄(n) to μ?
[Figure: the individual observations and the resulting X̄(n) for two independent samples (a first and a second observation of X̄(n)); different samples give different values of X̄(n).]
To assess how close X̄(n) is to μ, one constructs a confidence interval for μ.
Var[X̄(n)] = Var(Σ_{i=1}^n X_i / n)
= (1/n²) Var(Σ_{i=1}^n X_i)
= (1/n²) Σ_{i=1}^n Var(X_i)   (because the X_i's are independent)
= (1/n²) n σ²
= σ²/n
An unbiased estimator of Var[X̄(n)] is
V̂ar[X̄(n)] = S²(n)/n = Σ_{i=1}^n [X_i − X̄(n)]² / [n(n − 1)].
[Figure: density functions for X̄(n) — concentrated around μ for large n, spread out for small n.]
If the X_i's are from a covariance-stationary process with ρ_j > 0, then E[S²(n)] < σ², and
E[S²(n)/n] = {[n/a(n)] − 1} / (n − 1) · Var[X̄(n)]   (3)
where
a(n) = 1 + 2 Σ_{j=1}^{n−1} (1 − j/n) ρ_j.
If ρ_j > 0, then a(n) > 1 and E[S²(n)/n] < Var[X̄(n)].
Example 4.24
Suppose D_1, D_2, …, D_10 are from the process of delays for a covariance-stationary M/M/1 queue with ρ = 0.9. From Eqs. (1) and (3),
E[S²(10)] = 0.0328 σ²
E[S²(10)/10] = 0.0034 Var[D̄(10)]
where
D̄(10) = Σ_{i=1}^{10} D_i / 10 and S²(10) = Σ_{i=1}^{10} [D_i − D̄(10)]² / 9.
4.5.1 Confidence Intervals
Let
Z_n = [X̄(n) − μ] / √(σ²/n)
and let F_n(z) = P(Z_n ≤ z).
Central Limit Theorem: F_n(z) → Φ(z) as n → ∞, where Φ(z), the distribution function of a normal random variable with μ = 0 and σ² = 1, is given by
Φ(z) = (1/√(2π)) ∫_{−∞}^z e^{−y²/2} dy for −∞ < z < ∞.
Replacing σ² by S²(n), let
t_n = [X̄(n) − μ] / √(S²(n)/n).
For large n, t_n is approximately standard normal, so
P(−z_{1−α/2} ≤ [X̄(n) − μ]/√(S²(n)/n) ≤ z_{1−α/2})
= P(X̄(n) − z_{1−α/2} √(S²(n)/n) ≤ μ ≤ X̄(n) + z_{1−α/2} √(S²(n)/n))
≈ 1 − α
where z_{1−α/2} (for 0 < α < 1) is the upper 1 − α/2 critical point for a standard normal random variable.
[Figure: standard normal density f(x); the shaded area between −z_{1−α/2} and z_{1−α/2} equals 1 − α.]
If n is sufficiently large, an approximate 100(1 − α) percent confidence interval for μ is given by
X̄(n) ± z_{1−α/2} √(S²(n)/n).
Interpretation I: If one constructs a very large number of independent 100(1 − α) percent confidence intervals, each based on n observations, where n is sufficiently large, the proportion of these confidence intervals that contain (cover) μ should be 1 − α.
An alternative interval uses the t distribution:
X̄(n) ± t_{n−1,1−α/2} √(S²(n)/n)
where t_{n−1,1−α/2} is the upper 1 − α/2 critical point for the t distribution with n − 1 df.
[Figure 4.16: Density functions for the t distribution with 4 df and for the standard normal distribution.]
Example 4.26
Suppose the 10 observations 1.20, 1.50, 1.68, 1.89, 0.95, 1.49, 1.58, 1.55, 0.50, 1.09 are from a normal distribution with unknown mean μ. A 90 percent confidence interval for μ is
X̄(10) ± t_{9,0.95} √(S²(10)/10) = 1.34 ± 1.83 √(0.17/10) = 1.34 ± 0.24.
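The interval of Example 4.26 can be reproduced with Python's standard library; the critical value t_{9,0.95} ≈ 1.833 is hardcoded from a t table, since the standard library has no t quantile function:

```python
from math import sqrt
from statistics import mean, variance

data = [1.20, 1.50, 1.68, 1.89, 0.95, 1.49, 1.58, 1.55, 0.50, 1.09]
n = len(data)
xbar = mean(data)                   # about 1.34
s2 = variance(data)                 # sample variance (n-1 divisor), about 0.17
t_crit = 1.833                      # t_{9, 0.95}, from a t table
half_width = t_crit * sqrt(s2 / n)  # about 0.24
ci = (xbar - half_width, xbar + half_width)
```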
Table 4.1 Estimated coverages based on 500 experiments
Distribution   Skewness ν   n=5     n=10    n=20    n=40
Normal         0.00         0.910   0.902   0.898   0.900
where the skewness is ν = E[(X − μ)³] / (σ²)^{3/2}.
4.5.2 Hypothesis tests for the mean
To test the null hypothesis H_0: μ = μ_0, form the statistic
t_n = [X̄(n) − μ_0] / √(S²(n)/n).
If |t_n| > t_{n−1,1−α/2}, reject H_0; if |t_n| ≤ t_{n−1,1−α/2}, "accept" H_0.
Example 4.27
For Example 4.26, to test the null hypothesis H_0 that μ = 1 at level α = 0.10:
t_10 = [X̄(10) − 1] / √(S²(10)/10) = 0.34 / √(0.17/10) = 2.65 > 1.83 = t_{9,0.95}.
We therefore reject H_0.
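The test statistic of Example 4.27 can be computed the same way (again hardcoding t_{9,0.95} ≈ 1.833 from a t table):

```python
from math import sqrt
from statistics import mean, variance

data = [1.20, 1.50, 1.68, 1.89, 0.95, 1.49, 1.58, 1.55, 0.50, 1.09]
mu0 = 1.0                            # hypothesized mean under H0
n = len(data)
t_n = (mean(data) - mu0) / sqrt(variance(data) / n)   # about 2.65
t_crit = 1.833                       # t_{9, 0.95}, from a t table
reject_H0 = abs(t_n) > t_crit        # reject H0 at level alpha = 0.10
```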
4.6 The Strong Law of Large Numbers