Probability Statistics Random Processes
INDEX

! factorial ..... 3, 24
1st fundamental theorem of probability ..... 16
2nd fundamental theorem of probability ..... 17
absorbing matrices ..... 21
absorption
  probability ..... 21
  time ..... 21
approximation theorem ..... 18
area
  sphere ..... 25
Bayes' formula ..... 11
Bayes' inverse problem ..... 11
Bayes' theorem ..... 11
Bernoulli trials
  central limit theorem ..... 17
Bernoulli trials process ..... 6, 7
beta density function ..... 12
binomial coefficient ..... 5
binomial distribution
  central limit theorem ..... 17
  negative ..... 7
binomial distribution function ..... 6
binomial expansion ..... 25
binomial theorem ..... 25
birthday problem ..... 23
B-matrix ..... 21
calculus ..... 24
canonical form ..... 21
card problem ..... 23
central limit theorem ..... 17
  general form ..... 18
  sum of discrete variables ..... 18
central limit theorem for Bernoulli trials ..... 17
central limit theorem for binomial distributions ..... 17
Chebyshev inequality ..... 16
coefficient
  binomial ..... 5
  multinomial ..... 5
coefficients of ordinary generating function ..... 20
combinations ..... 3
conditional density function
  continuous ..... 11
conditional probability ..... 10, 11
continuous conditional density function ..... 11
continuous uniform density ..... 9
convolution ..... 15
  example ..... 16
correlation ..... 15
covariance ..... 15
cumulative distribution function ..... 6
cumulative normal distribution function ..... 6
cyclic permutation ..... 3
general math ..... 24
generating function ..... 18, 19
  ordinary ..... 19
  properties ..... 20
geometric distribution ..... 7
glossary ..... 25
graphing terminology ..... 25
h(z) ordinary generating function ..... 19
hat check problem ..... 23
hypergeometric distribution ..... 7
hypotheses ..... 11
inclusion-exclusion principle ..... 3
independent events ..... 4
independent trials ..... 25
independent variable ..... 25
inequality
  Chebyshev ..... 16
  Markov ..... 13
infinite sample space ..... 3
infinite sum ..... 24
integration ..... 24
intersection ..... 3
joint cumulative distribution function ..... 6
joint density function ..... 9
joint distribution ..... 7
L'Hôpital's rule ..... 24
law of averages ..... 16
law of large numbers ..... 16
linearizing an equation ..... 25
logarithms ..... 24
M mean first passage matrix ..... 22
m(ω) distribution function ..... 5
Markov chains ..... 20
Markov inequality ..... 13
mean ..... 12
median ..... 25
medical probabilities ..... 23
memoryless property ..... 12
moment generating function ..... 18, 19
moments ..... 19
multinomial coefficient ..... 5
multinomial distribution ..... 7
multiple toss ..... 2
mutually exclusive ..... 3
NA(0,z) table of values ..... 17
natural number e ..... 24
Nc-matrix ..... 21
negative binomial distribution ..... 7
N-matrix ..... 21
normal density function ..... 8
normal random variable ..... 4
normalization ..... 8
NR-matrix ..... 21
ordering ..... 3
ordinary generating function ..... 19
  coefficients ..... 20
p(j) coefficients of the ordinary generating function ..... 20
permutation ..... 3
P-matrix ..... 21
Poisson distribution ..... 7
poker probabilities ..... 10
posterior probabilities ..... 11
prior probabilities ..... 11
probabilities
  posterior ..... 11
  prior ..... 11
probability ..... 2
  uneven ..... 4
probability of absorption ..... 21
probability space ..... 3
probability tree ..... 11
problems ..... 23
properties of expectation ..... 13
Q-matrix ..... 21
quadratic equation ..... 25
random variable ..... 25
  normal ..... 4
  standard normal ..... 4
regular chain ..... 22
  properties ..... 22
reliability ..... 10
reverse tree ..... 11
R-matrix ..... 21
round table ..... 3
sample problems ..... 23
sample space ..... 3, 5
series ..... 25
set properties ..... 3
single coin toss ..... 2
Sn* standardized sum ..... 17
specific generating functions ..... 19
sphere ..... 25
standard deviation ..... 6, 8, 14
standard normal density function ..... 8
standard normal random variable ..... 4
standardized sum ..... 17
states ..... 20
step ..... 20
Stirling approximation ..... 24
stochastic ..... 25
sum
  standardized ..... 17
sum of continuous variables ..... 15
sum of discrete variables ..... 15
table of values for NA(0,z) ..... 17
time to absorption ..... 21
t-matrix ..... 21
transient state ..... 21
transition matrix ..... 20
tree
  probability ..... 11
trials
  independent ..... 25
trigonometric identities ..... 24
uneven probability ..... 4
uniform density ..... 9
uniform distribution ..... 5, 25
union ..... 3
V(X) variance ..... 14
variable
  random ..... 25
variance ..... 14
  continuous random variables ..... 14
  discrete random variables ..... 14
  properties ..... 14
vector
  fixed probability ..... 22
volume
  sphere ..... 25
λ failure rate ..... 9
µ expected value ..... 12
µn moments ..... 19
ρ correlation ..... 15
σ standard deviation ..... 14
σ² variance ..... 14
PROBABILITY
Mathematically, the probability of an outcome is the number of favorable outcomes divided by the total number of possible outcomes (the size of the sample space). For example, 3 favorable outcomes out of 5 possible gives

$P(\text{favorable outcome}) = \frac{3}{5}$

SINGLE COIN TOSS

$P(H) = P(T) = \frac{1}{2}$

MULTIPLE TOSS

$P(HH) = P(HT) = P(TH) = P(TT) = \frac{1}{4}$

The probability of at least one head in two tosses:

$P(HH) + P(HT) + P(TH) = 1 - P(TT) = \frac{3}{4}$

In n tosses, the probability of getting exactly two heads is

$P(\text{exactly two heads}) = \frac{n(n-1)}{2} \cdot \frac{1}{2^n}$

where n is the number of possible positions of the first head, (n − 1) is the number of possible positions of the second head, the division by 2 accounts for the order of occurrence not being a factor, and $1/2^n$ is the probability of any one of the $2^n$ equally likely outcomes.

$P(\text{exactly three heads}) = \frac{n(n-1)(n-2)}{3 \cdot 2} \cdot \frac{1}{2^n}$

where 3 · 2 is the number of ways to order 3 items.
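As a quick numerical check (an addition to the notes, not part of the original), the exact-k-heads counting formula can be compared against the binomial coefficient and a brute-force enumeration; a minimal Python sketch, with an arbitrary helper name:

```python
from itertools import product
from math import comb

def p_exact_heads(n: int, k: int) -> float:
    """Probability of exactly k heads in n fair tosses: C(n,k) / 2^n."""
    return comb(n, k) / 2**n

n = 6
# The counting form used above for k = 2: n(n-1)/2 * 1/2^n
assert p_exact_heads(n, 2) == n * (n - 1) / 2 / 2**n
# Brute-force check: enumerate all 2^n outcomes
for k in range(n + 1):
    count = sum(1 for seq in product("HT", repeat=n) if seq.count("H") == k)
    assert abs(count / 2**n - p_exact_heads(n, k)) < 1e-12
print(p_exact_heads(6, 2))  # 15/64 = 0.234375
```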
SET PROPERTIES

$\Omega = \{1, 2, 3, \ldots\}$ is an example of an infinite sample space.

The union of two sets, $A \cup B$, refers to that which is in set A or set B. In terms of area, it is the sum of the two areas minus the common area, the intersection $A \cap B$.

$P(A \cup B) = P(A) + P(B) - P(A \cap B)$

$P(A) + P(\bar{A}) = 1$

PERMUTATION

An ordering of a collection of objects, e.g. the $3! = 6$ orderings of three objects, $a\,b\,c$ through $c\,b\,a$.

INCLUSION-EXCLUSION PRINCIPLE

Expanding on a theorem presented in the previous box, the probability that at least one event contained within a group of (possibly) intersecting sets of events occurs is the sum of the probabilities of each event, minus the sum of the probabilities of all 2-way intersections, plus the sum of the probabilities of all 3-way intersections, minus the sum of all 4-way intersections, etc.

$P(E_1 \cup E_2 \cup \cdots \cup E_n) = \sum_{i=1}^{n} P(E_i) - \sum_{1 \le i < j \le n} P(E_i \cap E_j) + \sum_{1 \le i < j < k \le n} P(E_i \cap E_j \cap E_k) - \cdots$

ORDERING/COMBINATIONS 1

KONSTANTOPOULOS: There are 15 letters in the word. This includes 4 Os, 2 Ns, 2 Ts, and 2 Ss. How many 15-letter combinations can be formed with these letters given that some letters are identical?

$\text{number of words} = \frac{15!}{4!\,2!\,2!\,2!}$

ORDERING/COMBINATIONS 2

In how many ways can n identical objects be arranged in m containers?

$\binom{n+m-1}{n} = \frac{(n+m-1)!}{n!\,(m-1)!}$

A problem that appeared in the textbook was: how many ways can 6 identical letters be put in 3 mailboxes? The answer is 28.

SAMPLE SPACE

The set of all possible outcomes. (Means the same as probability space, I think.) For example, if a coin is tossed twice, the sample space is

$\Omega = \{(H,H),\,(H,T),\,(T,H),\,(T,T)\}$
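Both counting formulas above can be verified with a few lines of Python (an addition, not in the original notes); `comb`, `factorial`, and `combinations_with_replacement` are standard library:

```python
from math import comb, factorial
from itertools import combinations_with_replacement

# KONSTANTOPOULOS: 15 letters with repeats 4 O's, 2 N's, 2 T's, 2 S's
words = factorial(15) // (factorial(4) * factorial(2) * factorial(2) * factorial(2))
print(words)  # 6810804000

# Stars and bars: n identical objects in m containers = C(n+m-1, m-1)
n, m = 6, 3
print(comb(n + m - 1, m - 1))  # 28, matching the mailbox example

# Brute-force confirmation: each arrangement is a multiset of container labels
print(sum(1 for _ in combinations_with_replacement(range(m), n)))  # 28
```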
INDEPENDENT EVENTS (p.139)

$P(A \cap B) = P(A)\,P(B)$

For example, the outcome of the first roll of a die does not affect the second roll. The independence of two events can be lost if the probabilities are not even, e.g. with an unfair coin or die.

Independence can also be expressed in terms of a conditional probability. The probability of A given B is still the probability of A:

$P(A \mid B) = P(A)$

And if this is true, then it is also true that

$P(B \mid A) = P(B)$
NORMAL RANDOM VARIABLE (p.213)

A normal random variable X with mean µ and deviation σ can be written in terms of the standard normal random variable Z:

$X = \sigma Z + \mu, \qquad Z = \frac{X - \mu}{\sigma}$

[Figure: plot of the density functions for normal random variables with expectation µ = 0.]

UNEVEN PROBABILITY

If we assign the probability n to the outcome of heads of an unfair coin, then

$P(H) = n, \qquad P(T) = 1 - n$

The probabilities of a double coin toss will be

HH: $n^2$    HT: $n(1-n)$    TH: $n(1-n)$    TT: $(1-n)^2$
BINOMIAL COEFFICIENT

$\binom{a}{b} = \frac{a!}{(a-b)!\,b!}$

$\binom{52}{10}$ is read "52 choose 10" and stands for the number of ways 10 objects can be selected from 52. For example, the probability that a 10-card hand drawn from a 52-card deck contains all four aces is

$\frac{\binom{4}{4}\binom{48}{6}}{\binom{52}{10}} = \frac{\dfrac{4!}{0!\,4!} \cdot \dfrac{48!}{42!\,6!}}{\dfrac{52!}{42!\,10!}} = \frac{48!\;42!\,10!}{42!\,6!\;52!} = 0.000776$

MULTINOMIAL COEFFICIENT

A binomial coefficient becomes multinomial when there is more than one type of object to choose or there is more than one location to choose objects from.

Multiple object types:

$\binom{n}{n_1\;n_2\;\cdots\;n_k} = \frac{n!}{n_1!\,n_2! \cdots n_k!}$

Multiple locations (n objects among k locations):

$\binom{n+k-1}{n} = \frac{(n+k-1)!}{n!\,(k-1)!}$

DISTRIBUTION FUNCTIONS (p.95)

m(ω) DISCRETE UNIFORM DISTRIBUTION FUNCTION (p.19, 367)

The function assigning probabilities to a finite set of equally likely outcomes. For example, the distribution function of a fair double coin toss is

$m(H,H) = m(H,T) = m(T,H) = m(T,T) = \frac{1}{4}$

and for a fair die, $m(\omega) = \frac{1}{6}$ for $\omega = 1, 2, 3, 4, 5, 6$. A distribution function may also be defined on an infinite sample space, e.g. $m(\omega_j) = 1/2^j$, since $\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1$.

For the discrete uniform distribution on $x = k, k+1, k+2, \ldots, l$:

$P(X = x) = \begin{cases} 1/(l-k+1), & x = k, k+1, k+2, \ldots, l \\ 0, & \text{otherwise} \end{cases}$

$\mu = \frac{k+l}{2} \qquad \sigma^2 = \frac{(l-k)(l-k+2)}{12}$

Generating function:

$g(t) = \frac{e^{tk} - e^{t(l+1)}}{(l-k+1)\,(1 - e^t)}$
CUMULATIVE DISTRIBUTION FUNCTION (p.61)

$F_X(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt$

CUMULATIVE NORMAL DISTRIBUTION FUNCTION

$F_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{x} e^{-(u-\mu)^2/2\sigma^2}\,du$

JOINT CUMULATIVE DISTRIBUTION FUNCTION

$F(x, y) = P(X \le x,\; Y \le y)$

b() BINOMIAL DISTRIBUTION FUNCTION (p.184)

$b(n, p, k) = \binom{n}{k}\, p^k q^{n-k}$

where $\mu = np$, $\sigma^2 = np(1-p) = npq$, and the moment generating function is $g(t) = (q + pe^t)^n$.

For a fair coin (p = 0.5):

$b(n, 0.5, k) = \binom{n}{k}\, 0.5^n$

In the case of no successful outcomes (k = 0), the function reduces to

$b(n, p, 0) = q^n$
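A minimal sketch (an addition to the notes) evaluating b(n, p, k) and confirming the mean and variance formulas by direct summation over the distribution:

```python
from math import comb

def b(n: int, p: float, k: int) -> float:
    """Binomial distribution function b(n, p, k) = C(n,k) p^k q^(n-k)."""
    q = 1.0 - p
    return comb(n, k) * p**k * q**(n - k)

n, p = 10, 0.3
pmf = [b(n, p, k) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))
print(abs(mean - n * p) < 1e-12)           # True: mu = np
print(abs(var - n * p * (1 - p)) < 1e-12)  # True: sigma^2 = npq
```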
HYPERGEOMETRIC DISTRIBUTION

Drawing k balls without replacement from an urn containing M black balls and N white balls, the probability of selecting exactly m black balls is

$P(\text{selecting } m \text{ black balls}) = \frac{\binom{M}{m}\binom{N}{k-m}}{\binom{M+N}{k}}$

JOINT DISTRIBUTION (p.141)

For independent random variables X and Y, the joint cumulative distribution function factors:

$F_{X,Y}(x, y) = F_X(x)\,F_Y(y)$

u() NEGATIVE BINOMIAL DISTRIBUTION (p.186)

$u(x, k, p) = P(X = x) = \binom{x-1}{k-1}\, p^k q^{x-k}$

This seems to describe the Bernoulli trials process, which is a sequence of x chance experiments such that 1) each experiment has 2 possible outcomes, and 2) the probability p of success is the same for each experiment and is not affected by knowledge of previous outcomes.

X = random variable: the observation of an experimental outcome
x = the number of attempts
k = the number of successful outcomes
p = the probability that any one event is successful
q = the probability that an event is not successful, 1 − p
GEOMETRIC DISTRIBUTION (p.184)

The geometric distribution applies to a series of dual-outcome events (such as coin tosses) where p is the probability of success on any given event, q is the probability of failure, j is the event number, and T is a random variable that stands for the event that produces the first success. The geometric distribution function has the memoryless property, p.8. See also Specific Generating Functions, p.19.

$P(T = j) = q^{j-1}\,p$

$\mu = \frac{1}{p} \qquad \sigma^2 = \frac{q}{p^2} \qquad g(t) = \frac{pe^t}{1 - qe^t}$

POISSON DISTRIBUTION (p.187)

$P(X = k) = \frac{\lambda^k}{k!}\, e^{-\lambda}$

$\mu = \lambda \qquad \sigma^2 = \lambda \qquad g(t) = e^{\lambda(e^t - 1)}$
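An illustrative simulation (an addition to the notes): sample geometric variables, check the mean 1/p, and observe the memoryless property empirically. The helper name and parameter values are arbitrary.

```python
import random

def geometric(p: float, rng: random.Random) -> int:
    """Number of the trial on which the first success occurs."""
    j = 1
    while rng.random() >= p:
        j += 1
    return j

rng = random.Random(1)
p = 0.25
samples = [geometric(p, rng) for _ in range(200_000)]
print(sum(samples) / len(samples))  # close to 1/p = 4.0

# Memoryless property: P(T > r + s | T > r) should approximate P(T > s)
r, s = 3, 2
tail_r = [t for t in samples if t > r]
print(sum(t > r + s for t in tail_r) / len(tail_r))  # ~ P(T > s)
print(sum(t > s for t in samples) / len(samples))    # = q^s = 0.5625
```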
EXPONENTIAL DISTRIBUTION (p.205)

$F(t) = 1 - e^{-\lambda t}$

Expectation: $\mu = \frac{1}{\lambda}$    Deviation: $\sigma = \frac{1}{\lambda}$    Variance: $\sigma^2 = \frac{1}{\lambda^2}$    Generating function: $g(t) = \frac{\lambda}{\lambda - t}$

DENSITY FUNCTIONS (p.212)

$P(a \le X \le b) = \int_a^b f(x)\,dx, \quad \text{where } f(x) = F'(x)$

NORMAL DENSITY FUNCTION

$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2/2\sigma^2}$

φ(x) STANDARD NORMAL DENSITY FUNCTION (p.325)

$\varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$
f() CONTINUOUS UNIFORM DENSITY (p.205)

$f(\omega) = \frac{1}{b - a}, \qquad \mu = \frac{a + b}{2}$

f(t) EXPONENTIAL DENSITY (p.205)

$f(t) = \lambda e^{-\lambda t} \qquad F(t) = 1 - e^{-\lambda t}$

e.g. for a random variable T,

$P(T \le x) = 1 - e^{-\lambda x}, \qquad P(T > x) = e^{-\lambda x}$

[Figure: two exponential density functions, λ = 2 (high failure rate) and λ = 0.5 (low failure rate). Note that in each case the area under the curve is 1. Both curves extend to infinity.]

Expectation: $\mu = \frac{1}{\lambda}$    Variance: $\sigma^2 = \frac{1}{\lambda^2}$    Generating function: $g(t) = \frac{\lambda}{\lambda - t}$

JOINT DENSITY FUNCTION (p.165)

$f(x, y) = \frac{\partial^2 F(x, y)}{\partial x\,\partial y}, \quad \text{where } f(x) = F'(x)$

$F(x, y) = \int\!\!\!\int f(t, u)\,dt\,du$

In general,

$f(x_1, x_2, \ldots, x_n) = \frac{\partial^n F(x_1, x_2, \ldots, x_n)}{\partial x_1\,\partial x_2 \cdots \partial x_n}$
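A short sketch (an addition to the notes) using inverse-CDF sampling, t = −ln(1 − u)/λ, to draw from the exponential density and confirm µ = 1/λ and the tail probability:

```python
import math
import random

def sample_exponential(lam: float, rng: random.Random) -> float:
    """Invert F(t) = 1 - exp(-lam*t): t = -ln(1 - u)/lam for u ~ Uniform(0,1)."""
    return -math.log(1.0 - rng.random()) / lam

rng = random.Random(7)
lam = 2.0  # the high-failure-rate case from the plotted example
samples = [sample_exponential(lam, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))                    # ~ 1/lam = 0.5
print(sum(t > 1.0 for t in samples) / len(samples))   # ~ exp(-lam*1) = 0.135
```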
RELIABILITY (p.205)

$\text{reliability} = \int_T^{\infty} f(t)\,dt = \int_T^{\infty} \lambda e^{-\lambda t}\,dt = e^{-\lambda T}$

λ = failure rate
t = time [s]

DE MORGAN'S LAWS

$P(\overline{A \cup B}) = P(\bar{A} \cap \bar{B}) \qquad P(\overline{A \cap B}) = P(\bar{A} \cup \bar{B})$

$A \cup (B \cup C) = (A \cup B) \cup C = A \cup B \cup C$

$A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$

POKER PROBABILITIES

The number of 5-card poker hands is $\binom{52}{5} = 2{,}598{,}960$, so the probability of being dealt one particular hand is

$\frac{1}{\binom{52}{5}} = \frac{1}{2{,}598{,}960} = 384.77 \times 10^{-9}$

Another example:

$\frac{13\binom{4}{3}(48)(44)}{\binom{52}{5}} = 0.04226$

CONDITIONAL PROBABILITY

$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$

e.g. $P(A \mid B) = \frac{1/6}{1/3} = \frac{1}{2}$
f(x|E) CONTINUOUS CONDITIONAL DENSITY FUNCTION

$f(x \mid E) = \begin{cases} f(x)/P(E), & \text{if } x \in E \\ 0, & \text{if } x \notin E \end{cases}$

For example, with f uniform on [0, 1) and E = [0, 1/2]:

$f(x \mid E) = \begin{cases} 1/(1/2) = 2, & \text{if } 0 \le x \le 1/2 \\ 0, & \text{if } 1/2 < x < 1 \end{cases}$

BAYES' INVERSE PROBLEM

Estimating an unknown success probability x after observing m successes in n trials leads to integrals of the form

$\int_0^1 x^m (1-x)^{n-m}\,dx$

PROBABILITY TREE

An urn is chosen at random, then a ball is drawn from it:

Urn (start)    Color of ball    p(ω)
I (1/2)        blk (2/5)        1/5
I (1/2)        wht (3/5)        3/10
II (1/2)       blk (1/2)        1/4
II (1/2)       wht (1/2)        1/4

BAYES' THEOREM

$P(B \mid A) = \frac{P(A \mid B)\,P(B)}{P(A)}$

BAYES' FORMULA (p.145)

$P(H_i \mid E) = \frac{P(H_i)\,P(E \mid H_i)}{\sum_{k=1}^{m} P(H_k)\,P(E \mid H_k)}$

REVERSE TREE

Reversing the tree above gives the probability of each urn given the color drawn:

Color (start)    Urn          p(ω)
blk (9/20)       I (4/9)      1/5
blk (9/20)       II (5/9)     1/4
wht (11/20)      I (6/11)     3/10
wht (11/20)      II (5/11)    1/4
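A small Python version of the urn example (an addition to the notes), recovering the reverse-tree probabilities with Bayes' formula using exact fractions:

```python
from fractions import Fraction as F

# Forward tree: P(urn) and P(black | urn)
p_urn = {"I": F(1, 2), "II": F(1, 2)}
p_black = {"I": F(2, 5), "II": F(1, 2)}

# Total probability of drawing black
pb = sum(p_urn[u] * p_black[u] for u in p_urn)
print(pb)  # 9/20

# Bayes' formula: P(urn | black)
for u in p_urn:
    print(u, p_urn[u] * p_black[u] / pb)  # I 4/9, II 5/9
```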
B(α,β,x) BETA DENSITY FUNCTION (p.168)

$B(\alpha, \beta, x) = \begin{cases} \dfrac{1}{B(\alpha, \beta)}\, x^{\alpha-1}(1-x)^{\beta-1}, & \text{if } 0 \le x \le 1 \\ 0, & \text{otherwise} \end{cases}$

Beta function:

$B(\alpha, \beta) = \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx = \frac{(\alpha-1)!\,(\beta-1)!}{(\alpha+\beta-1)!}$

For a success probability distributed with this density, $P(\text{success}) = \frac{\alpha}{\alpha+\beta}$ (the mean of the beta density).

MEMORYLESS PROPERTY (p.206)

For the exponential (and geometric) distribution, waiting without a success so far does not change the distribution of the remaining wait:

$P(T > r + s \mid T > r) = P(T > s)$

µ MEAN

Discrete:

$\mu = E(X) = \sum_x x\, m(x)$

Binomial: $\mu = E(X) = np$.    Continuous uniform on [a, b]: $\mu = E(X) = \frac{a+b}{2}$.

Continuous:

$\mu = E(X) = \int_{-\infty}^{\infty} x\, f(x)\,dx$, provided that $\int_{-\infty}^{\infty} |x|\, f(x)\,dx$ is finite.

Otherwise the expected value does not exist. Note that the limits of integration may be reduced provided they include the sample space. For an exponential density $f(t) = \lambda e^{-\lambda t}$ (p.9), the expected value is $\mu = 1/\lambda$.
PROPERTIES OF EXPECTATION (p.268, 394)

$E(X^2) = V(X) + \mu^2$

$E(X^n) = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$

$E(X + Y) = E(X) + E(Y)$

If X is a random variable and c is a constant:

$E(cX) = c\,E(X)$

If X and Y are independent:

$E(XY) = E(X)\,E(Y)$

f_X(x) = density function for the random variable X

MARKOV INEQUALITY

For a nonnegative random variable X:

$P(X > k) \le \frac{E(X)}{k}$

E(φ(X)) EXPECTATION OF A FUNCTION (p.229)

$E(\varphi(X)) = \sum_x \varphi(x)\, m(x)$
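A quick empirical check of the Markov inequality (an addition to the notes), reusing the exponential-sampling idea from earlier:

```python
import math
import random

rng = random.Random(3)
lam = 1.0
xs = [-math.log(1.0 - rng.random()) / lam for _ in range(100_000)]
mean = sum(xs) / len(xs)

for k in (1.0, 2.0, 5.0):
    tail = sum(x > k for x in xs) / len(xs)
    # For Exponential(1), P(X > k) = exp(-k), always below E(X)/k = 1/k
    print(f"k={k}: P(X>k)={tail:.4f} <= E(X)/k={mean / k:.4f}")
```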
σ² VARIANCE

Discrete (p.257):

$\sigma^2 = V(X) = E\big((X - \mu)^2\big) = \sum_x (x - \mu)^2\, m(x)$

Continuous (p.271):

$\sigma^2 = V(X) = E\big((X - \mu)^2\big) = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx$

Properties:

$V(X + Y) = V(X) + V(Y)$ (X and Y independent)

$V(cX) = c^2\, V(X)$

$V(X + c) = V(X)$

$V(X) = E(X^2) - \mu^2$

For example, for one roll of a fair die, $E(X) = \frac{7}{2}$ and $E(X^2) = \frac{91}{6}$, so

$V(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12}$

D(X) STANDARD DEVIATION (p.257)

$D(X) = \sqrt{V(X)} = \sqrt{E\big((X - \mu)^2\big)}$

For the binomial distribution, $\sigma = \sqrt{npq}$.

X = random variable: the observation of an experimental outcome
V(X) = the variance of X
µ = the expected value of X, E(X)
cov(X,Y) COVARIANCE (p.280)

$\operatorname{cov}(X, Y) = E\big((X - \mu(X))(Y - \mu(Y))\big) = E(XY) - \mu(X)\,\mu(Y)$

Property of covariance:

$V(X + Y) = V(X) + V(Y) + 2\operatorname{cov}(X, Y)$

ρ(X,Y) CORRELATION (p.281)

$\rho(X, Y) = \frac{\operatorname{cov}(X, Y)}{\sqrt{V(X)\,V(Y)}}$

CONVOLUTION / SUM OF RANDOM VARIABLES (p.285, 291)

Discrete:

$m_3 = m_1 * m_2, \qquad m_3(z) = \sum_k m_1(k)\, m_2(z - k)$

Continuous:

$f(x) * g(y) = h(z) = \int_{-\infty}^{\infty} f(z - y)\, g(y)\,dy = \int_{-\infty}^{\infty} g(z - x)\, f(x)\,dx$
CONVOLUTION EXAMPLE

Suppose the distribution functions m1(x) and m2(y) for the discrete random variables X and Y are

$m_1(x) = m_2(x) = \begin{pmatrix} 0 & 1 & 2 \\ 1/8 & 3/8 & 1/2 \end{pmatrix}$

Given Z = X + Y, we can find the distribution function m3(z) of Z using convolution.

$m_3(z) = m_1(x) * m_2(y) = \sum_k m_1(k)\, m_2(z - k)$

$P(z = 0) = \frac{1}{8}\cdot\frac{1}{8} = \frac{1}{64}, \quad k = 0$

$P(z = 1) = \frac{1}{8}\cdot\frac{3}{8} + \frac{3}{8}\cdot\frac{1}{8} = \frac{3}{32}, \quad k = 0, 1$

$P(z = 2) = \frac{1}{8}\cdot\frac{1}{2} + \frac{3}{8}\cdot\frac{3}{8} + \frac{1}{2}\cdot\frac{1}{8} = \frac{17}{64}, \quad k = 0, 1, 2$

$P(z = 3) = \frac{3}{8}\cdot\frac{1}{2} + \frac{1}{2}\cdot\frac{3}{8} = \frac{3}{8}, \quad k = 1, 2$

$P(z = 4) = \frac{1}{2}\cdot\frac{1}{2} = \frac{1}{4}, \quad k = 2$

Therefore

$m_3(z) = \begin{pmatrix} 0 & 1 & 2 & 3 & 4 \\ 1/64 & 3/32 & 17/64 & 3/8 & 1/4 \end{pmatrix}$

LAW OF LARGE NUMBERS (p.305)

$P\left(\left|\frac{S_n}{n} - \mu\right| \ge \epsilon\right) \to 0 \quad \text{as } n \to \infty$

$P\left(\left|\frac{S_n}{n} - \mu\right| < \epsilon\right) \to 1 \quad \text{as } n \to \infty$

Note that $\frac{S_n}{n}$ is the average of the n individual outcomes.

CHEBYSHEV INEQUALITY (p.305, 316)

$P(|X - \mu| \ge \epsilon) \le \frac{V(X)}{\epsilon^2}$
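The worked convolution example can be verified with a few lines of Python (an addition to the notes), using exact fractions:

```python
from fractions import Fraction as F

m1 = {0: F(1, 8), 1: F(3, 8), 2: F(1, 2)}
m2 = dict(m1)

# Discrete convolution: m3(z) = sum_k m1(k) * m2(z - k)
m3 = {}
for x, px in m1.items():
    for y, py in m2.items():
        m3[x + y] = m3.get(x + y, F(0)) + px * py

for z in sorted(m3):
    print(z, m3[z])       # 0 1/64, 1 3/32, 2 17/64, 3 3/8, 4 1/4
print(sum(m3.values()))   # 1
```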
CENTRAL LIMIT THEOREM FOR BERNOULLI TRIALS (p.325)

$\lim_{n \to \infty} P(a \le S_n \le b) = \int_{a^*}^{b^*} \varphi(x)\,dx$

where

$a^* = \frac{a - np}{\sqrt{npq}}, \qquad b^* = \frac{b - np}{\sqrt{npq}}, \qquad \varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \;\text{ (see p.8)}$

S*n STANDARDIZED SUM

$S_n^* = \frac{S_n - np}{\sqrt{npq}} \qquad \text{or} \qquad S_n^* = \frac{S_n - n\mu}{\sqrt{n\sigma^2}}$

σ = the deviation
σ² = the variance
µ = the expected value of X, E(X)
a = lower bound
b = upper bound
φ(x) = standard normal density function
n = number of trials or selections
p = probability of success
q = probability of failure (1 − p)
TABLE OF VALUES FOR NA(0, z) (p.331)

NA(0, z) is the area under the standard normal curve between 0 and z.

z     NA(z)      z     NA(z)      z     NA(z)      z     NA(z)
0.0   .0000      1.0   .3413      2.0   .4772      3.0   .4987
0.1   .0398      1.1   .3643      2.1   .4821      3.1   .4990
0.2   .0793      1.2   .3849      2.2   .4861      3.2   .4993
0.3   .1179      1.3   .4032      2.3   .4893      3.3   .4995
0.4   .1554      1.4   .4192      2.4   .4918      3.4   .4997
0.5   .1915      1.5   .4332      2.5   .4938      3.5   .4998
0.6   .2257      1.6   .4452      2.6   .4953      3.6   .4998
0.7   .2580      1.7   .4554      2.7   .4965      3.7   .4999
0.8   .2881      1.8   .4641      2.8   .4974      3.8   .4999
0.9   .3159      1.9   .4713      2.9   .4981      3.9   .5000
CENTRAL LIMIT THEOREM: SUM OF DISCRETE VARIABLES

$\lim_{n \to \infty} P\left(a < \frac{S_n - n\mu}{\sqrt{n\sigma^2}} < b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-x^2/2}\,dx$

Note that $\frac{S_n - n\mu}{\sqrt{n}\,\sigma} = S_n^*$.

a = lower bound
b = upper bound
n = number of trials or selections

CENTRAL LIMIT THEOREM: GENERAL FORM

$\lim_{n \to \infty} P\left(a < \frac{S_n - m_n}{s_n} < b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-x^2/2}\,dx$

m_n = the mean of S_n
s_n = the deviation of S_n (square root of the variance)
a = lower bound
b = upper bound
n = number of trials or selections

GENERATING FUNCTIONS (p.365)

Discrete: $g(t) = E(e^{tX}) = \sum_{j=1}^{\infty} e^{t x_j}\, p(x_j)$

Continuous: $g(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx$

Uniform density: $g(t) = E(e^{tX}) = \frac{1}{b - a} \int_a^b e^{tx}\,dx$

APPROXIMATION THEOREM (p.342)

For n large:

$P(S_n = j) \approx \frac{\varphi(x_j)}{\sqrt{n}\,\sigma} = \frac{1}{\sqrt{2\pi n\sigma^2}}\, e^{-(j - n\mu)^2/2n\sigma^2}, \qquad x_j = \frac{j - n\mu}{\sqrt{n}\,\sigma}$
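A numerical illustration of the approximation theorem (an addition to the notes): compare the exact binomial probability P(Sn = j) with the normal approximation above, for Bernoulli trials with µ = p and σ² = p(1 − p) per trial.

```python
import math

n, p = 100, 0.5
mu, sigma = p, math.sqrt(p * (1 - p))  # per-trial mean and deviation

for j in (45, 50, 60):
    exact = math.comb(n, j) * p**j * (1 - p) ** (n - j)
    approx = math.exp(-((j - n * mu) ** 2) / (2 * n * sigma**2)) / math.sqrt(
        2 * math.pi * n * sigma**2
    )
    print(j, round(exact, 5), round(approx, 5))  # e.g. j=50: 0.07959 vs 0.07979
```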
µn MOMENTS (p.366, 394)

The nth moment is obtained from the generating function by differentiation:

$\mu_n = E(X^n) = \left.\frac{d^n}{dt^n}\, g(t)\right|_{t=0}$

Discrete: $\mu_n = \sum_{j=1}^{\infty} (x_j)^n\, P(X = x_j)$

Continuous: $\mu_n = E(X^n) = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$

Sanity check: $\mu_0 = 1$ at t = 0.    Mean: $\mu = \mu_1$ at t = 0.    Variance: $\sigma^2 = \mu_2 - \mu_1^2$ at t = 0, where µ2 is the 2nd moment, etc.

SPECIFIC GENERATING FUNCTIONS (p.366)

Uniform on {1, …, n}:

$p_X(j) = \frac{1}{n} \qquad g(t) = \frac{e^t(e^{nt} - 1)}{n(e^t - 1)} \qquad \mu = \frac{n+1}{2} \qquad \sigma^2 = \frac{n^2 - 1}{12}$

Binomial:

$p_X(j) = \binom{n}{j} p^j q^{n-j} \qquad g(t) = (pe^t + q)^n \qquad \mu = np \qquad \sigma^2 = np(1-p)$

Geometric:

$p_X(j) = q^{j-1} p \qquad g(t) = \frac{pe^t}{1 - qe^t} \qquad \mu = \frac{1}{p} \qquad \sigma^2 = \frac{q}{p^2}$

Poisson:

$p_X(j) = \frac{e^{-\lambda} \lambda^j}{j!} \qquad g(t) = e^{\lambda(e^t - 1)} \qquad \mu = \lambda \qquad \sigma^2 = \lambda$

h(z) ORDINARY GENERATING FUNCTION (p.370)

$h(z) = g(\log z) = \sum_{j=0}^{\infty} z^j\, p(j)$
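A sketch of the moment method (an addition to the notes, assuming SymPy is available): recover µ and σ² of the geometric distribution by differentiating its generating function at t = 0.

```python
import sympy as sp

t, p = sp.symbols("t p", positive=True)
q = 1 - p
g = p * sp.exp(t) / (1 - q * sp.exp(t))  # geometric generating function

mu1 = sp.simplify(sp.diff(g, t).subs(t, 0))      # first moment
mu2 = sp.simplify(sp.diff(g, t, 2).subs(t, 0))   # second moment
print(mu1)                        # 1/p
print(sp.simplify(mu2 - mu1**2))  # (1 - p)/p**2, i.e. q/p^2
```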
PROPERTIES OF GENERATING FUNCTIONS (p.371)

For Y = X + a:  $g_Y(t) = E(e^{tY}) = E(e^{t(X+a)}) = e^{ta}\, g_X(t)$

For Y = bX:  $g_Y(t) = E(e^{tY}) = E(e^{tbX}) = g_X(bt)$

For Y = bX + a:  $g_Y(t) = E(e^{tY}) = E(e^{t(bX+a)}) = e^{ta}\, g_X(bt)$

For $X^* = \frac{X - \mu}{\sigma}$:  $g_{X^*}(t) = e^{-\mu t/\sigma}\, g_X(t/\sigma)$

For Z = X + Y (X and Y independent):

$g_Z(t) = E(e^{tZ}) = E(e^{t(X+Y)}) = E(e^{tX})\,E(e^{tY}) = g_X(t)\,g_Y(t)$

therefore also $h_Z(z) = h_X(z)\,h_Y(z)$.

For t = 0:  $g(t) = 1$

p(j) COEFFICIENTS OF THE ORDINARY GENERATING FUNCTION (p.370)

$p(j) = \frac{h^{(j)}(0)}{j!}$

For example, if $h(z) = \frac{1}{4} + \frac{1}{2}z + \frac{1}{4}z^2$, then we have a distribution of {1/4, 1/2, 1/4}.

MARKOV CHAINS

STATES (p.405)

For example, let's say in the Land of Oz, there are never 2 nice days in a row. When they have a nice day, the following day will be rain or snow with equal probability. When they have snow or rain, there is a 50% chance that the following day will be the same and an equal chance of the other two possibilities. So the states look like this:

[State diagram: RAIN→RAIN 1/2, RAIN→NICE 1/4, RAIN→SNOW 1/4; NICE→RAIN 1/2, NICE→SNOW 1/2; SNOW→SNOW 1/2, SNOW→RAIN 1/4, SNOW→NICE 1/4.]

TRANSITION MATRIX (p.406)

With the states ordered (rain, nice, snow):

$P = \begin{pmatrix} 1/2 & 1/4 & 1/4 \\ 1/2 & 0 & 1/2 \\ 1/4 & 1/4 & 1/2 \end{pmatrix}$

MATRIX POWERS (p.406)

The entries of $P^n$ give the n-step transition probabilities: $(P^n)_{ij}$ is the probability of being in state j after n steps, starting from state i.
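A short sketch of the Land of Oz chain (an addition to the notes, assuming NumPy is available); raising P to a high power shows every row converging to the same fixed probability vector:

```python
import numpy as np

P = np.array([
    [0.50, 0.25, 0.25],  # rain
    [0.50, 0.00, 0.50],  # nice
    [0.25, 0.25, 0.50],  # snow
])

Pn = np.linalg.matrix_power(P, 20)
print(Pn.round(4))  # every row approaches w = [0.4, 0.2, 0.4]
```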
ABSORBING CHAINS (p.415)

For example, consider a random walk on the states 1, 2, 3, 4 in which states 1 and 4 are absorbing; from each of the transient states 2 and 3 the walk moves left with probability .6 and right with probability .4. With the states ordered (1, 2, 3, 4):

$P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ .6 & 0 & .4 & 0 \\ 0 & .6 & 0 & .4 \\ 0 & 0 & 0 & 1 \end{pmatrix}$

CANONICAL FORM (p.417)

Reorder the states so the transient states come first, here (2, 3, 1, 4):

$P = \begin{pmatrix} 0 & .4 & .6 & 0 \\ .6 & 0 & 0 & .4 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}$

N FUNDAMENTAL MATRIX OF AN ABSORBING CHAIN (p.418)

$N = (I - Q)^{-1}$

$N = \begin{pmatrix} 1 & -.4 \\ -.6 & 1 \end{pmatrix}^{-1} = \begin{pmatrix} 1.32 & .526 \\ .789 & 1.32 \end{pmatrix}$ (rows and columns indexed by the transient states 2, 3)

Q = a submatrix extracted from the P-matrix canonical form and used to obtain the fundamental matrix
I = an identity matrix

TIME TO ABSORPTION (p.419)

$t = Nc$

$t = \begin{pmatrix} 1.32 & .526 \\ .789 & 1.32 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1.84 \\ 2.11 \end{pmatrix}$

where c is a column matrix of ones; entry i is the expected number of steps before absorption when starting in transient state i.

PROBABILITY OF ABSORPTION (p.420)

$B = NR$

From our example N-matrix of the previous box we have

$B = \begin{pmatrix} 1.32 & .526 \\ .789 & 1.32 \end{pmatrix} \begin{pmatrix} .6 & 0 \\ 0 & .4 \end{pmatrix} = \begin{pmatrix} .789 & .211 \\ .474 & .526 \end{pmatrix}$

(rows indexed by the transient states 2, 3; columns by the absorbing states 1, 4)

N = the fundamental matrix
R = a submatrix of the canonical form
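The N, t, and B computations above, reproduced in NumPy (an addition to the notes, assuming NumPy is available):

```python
import numpy as np

Q = np.array([[0.0, 0.4], [0.6, 0.0]])  # transient-to-transient
R = np.array([[0.6, 0.0], [0.0, 0.4]])  # transient-to-absorbing

N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix
t = N @ np.ones(2)                # expected steps to absorption
B = N @ R                         # absorption probabilities

print(N.round(3))  # [[1.316 0.526] [0.789 1.316]]
print(t.round(2))  # [1.84 2.11]
print(B.round(3))  # [[0.789 0.211] [0.474 0.526]]
```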
W LIMITING MATRIX (p.436)

$W = \lim_{n \to \infty} P^n$

SOLVING FOR w (p.433, 434)

The fixed probability vector w satisfies

$wP = w \qquad Pc = c \qquad w(P - I) = 0$

together with the normalization $w_1 + w_2 + w_3 = 1$. Written out for the first component:

$[wP]_1 = p_{11} w_1 + p_{21} w_2 + p_{31} w_3 = w_1$

P = the transition matrix
c = a column matrix of ones
w = the fixed probability vector
I = an identity matrix

Z FUNDAMENTAL MATRIX OF AN ERGODIC CHAIN (p.456)

As with absorbing chains, the fundamental matrix of an ergodic chain leads to useful information, but it is found in a different way:

$Z = (I - P + W)^{-1}$

M MEAN FIRST PASSAGE MATRIX

$m_{ij} = \frac{z_{jj} - z_{ij}}{w_j}$
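Solving for w in the Land of Oz example (an addition to the notes, assuming NumPy): stack the normalization condition onto w(P − I) = 0 and solve in the least-squares sense.

```python
import numpy as np

P = np.array([[0.50, 0.25, 0.25],
              [0.50, 0.00, 0.50],
              [0.25, 0.25, 0.50]])

# Solve w(P - I) = 0 together with w1 + w2 + w3 = 1
A = np.vstack([(P - np.eye(3)).T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
w, *_ = np.linalg.lstsq(A, b, rcond=None)
print(w.round(3))  # [0.4 0.2 0.4]
```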
With a beta(α, β) prior and i successes observed in n trials, the posterior probability of success on the next trial is

$P(\text{success}) = \frac{\alpha + i}{\alpha + \beta + n}$

A CARD PROBLEM

A Gin hand of 10 cards is dealt. What is the probability that 4 cards belong to one suit, and there are 3 cards in each of two other suits?

$\frac{\binom{4}{1}\binom{3}{2}\binom{13}{4}\binom{13}{3}\binom{13}{3}}{\binom{52}{10}} = 0.044$

The equation reads: from 4 suits choose 1 suit, from the remaining 3 suits choose 2 suits, from one suit of 13 cards choose 4 cards, from another suit choose 3 cards, and from another suit choose 3 cards. Divide the product of these by the number of 10-card hands possible from a deck of 52 cards.

HAT CHECK PROBLEM

The probability that at least one of n checked hats is returned to its owner:

$P(E_1 \cup E_2 \cup \cdots \cup E_n) = 1 - \frac{1}{2!} + \frac{1}{3!} - \cdots \pm \frac{1}{n!} \approx 1 - e^{-1}$
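The card-problem arithmetic, checked with the standard library (an addition to the notes):

```python
from math import comb

num = comb(4, 1) * comb(3, 2) * comb(13, 4) * comb(13, 3) * comb(13, 3)
print(num / comb(52, 10))  # 0.04436..., about 0.044
```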
GENERAL MATHEMATICAL

EULER'S EQUATION

$e^{j\theta} = \cos\theta + j\sin\theta$

TRIGONOMETRIC IDENTITIES

$e^{+j\theta} + e^{-j\theta} = 2\cos\theta$

$e^{+j\theta} - e^{-j\theta} = j\,2\sin\theta$

$e^{-j\theta} = \cos\theta - j\sin\theta$

$\tan\theta = \frac{\sin\theta}{\cos\theta}$

CALCULUS - DERIVATIVES

$\frac{d}{dx}\left(\frac{u}{v}\right) = \frac{v\,u' - u\,v'}{v^2}$

$\frac{d}{dx}\,a^x = a^x \ln a \qquad \frac{d}{dx}\,a^u = u'\,a^u \ln a$

$\frac{d}{dx}\,e^u = u'\,e^u$

$\frac{d}{dx}\,\ln x = \frac{1}{x} \qquad \frac{d}{dx}\,\ln u = \frac{u'}{u}$

CALCULUS - INTEGRATION

$\int dx = x + C \qquad \int x^n\,dx = \frac{x^{n+1}}{n+1} + C$

$\int e^u\,dx = \frac{1}{u'}\,e^u + C \;\text{ (u linear in x)} \qquad \int x e^x\,dx = (x-1)\,e^x + C$

$\int x e^{ax}\,dx = \frac{e^{ax}}{a^2}(ax - 1) + C$

$\int \frac{1}{x}\,dx = \ln x + C \qquad \int a^x\,dx = \frac{1}{\ln a}\,a^x + C$

$\int_0^{\infty} x^n e^{-ax^2}\,dx = \begin{cases} \dfrac{\left(\frac{n-1}{2}\right)!}{2a^{(n+1)/2}}, & \text{for odd } n \\[2ex] \dfrac{1\cdot 3\cdot 5 \cdots (n-1)}{2^{(n/2)+1}\,a^{n/2}} \sqrt{\dfrac{\pi}{a}}, & \text{for even } n \end{cases}$

$\int \sin^2 u\,du = \tfrac{1}{2}u - \tfrac{1}{4}\sin 2u + C \qquad \int \cos^2 u\,du = \tfrac{1}{2}u + \tfrac{1}{4}\sin 2u + C$

Integration by parts:

$\int u\,dv = uv - \int v\,du$

L'HÔPITAL'S RULE

$\lim_{x \to c} \frac{f(x)}{g(x)} = \lim_{x \to c} \frac{f'(x)}{g'(x)}$

LOGARITHMS

$\ln x^y = y \ln x$

$\ln x = b \;\Leftrightarrow\; e^b = x$

$\ln e^x = x$

$e^{a \ln b} = b^a$

$\log_a x = y \;\Leftrightarrow\; a^y = x$

FACTORIAL

n! is the number of ways a collection of n objects can be ordered. See also Stirling Approximation.

STIRLING APPROXIMATION

Useful in calculating large factorials.

$n! \approx n^n e^{-n} \sqrt{2\pi n}$

e^x INFINITE SUM

Useful in the subject of probability.

$e^x = \sum_{k=0}^{\infty} \frac{x^k}{k!}$

As $n \to \infty$, $\left(1 - \frac{x}{n}\right)^n \to e^{-x}$
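A quick check of the Stirling approximation (an addition to the notes):

```python
import math

for n in (5, 10, 50):
    stirling = n**n * math.exp(-n) * math.sqrt(2 * math.pi * n)
    ratio = stirling / math.factorial(n)
    print(n, math.factorial(n), round(stirling, 1), round(ratio, 4))
    # The ratio approaches 1 as n grows (already 0.9835 at n = 5)
```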
SERIES

$1 + 2 + 3 + \cdots + (n-1) = \frac{n(n-1)}{2}$

$1 + 2 + 3 + \cdots + n = \frac{n(n+1)}{2}$

$1^2 + 2^2 + 3^2 + \cdots + n^2 = \frac{n(n+1)(2n+1)}{6}$

$\sqrt{1+x} \approx 1 + \tfrac{1}{2}x, \quad x \ll 1$

$\frac{1}{\sqrt{1+x}} \approx 1 - \frac{x}{2} + \frac{3x^2}{8} - \frac{5x^3}{16} + \frac{35x^4}{128} - \cdots, \quad -\tfrac{1}{2} < x < \tfrac{1}{2}$

$\frac{1}{1-x^2} \approx 1 + x^2 + x^4 + x^6 + \cdots, \quad -\tfrac{1}{2} < x < \tfrac{1}{2}$

$\frac{1}{(1-x)^2} \approx 1 + 2x + 3x^2 + 4x^3 + \cdots, \quad -\tfrac{1}{2} < x < \tfrac{1}{2}$

$\frac{1}{1+x} \approx 1 - x + x^2 - x^3 + \cdots, \quad -\tfrac{1}{2} < x < \tfrac{1}{2}$

$\frac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots, \quad -1 < x < 1$

$\frac{x}{1-x} = x + x^2 + x^3 + x^4 + \cdots, \quad -1 < x < 1$

$e^x = 1 + x + \frac{x^2}{2} + \frac{x^3}{3!} + \frac{x^4}{4!} + \cdots$

SPHERE

$\text{Area} = \pi d^2 = 4\pi r^2$

$\text{Volume} = \tfrac{1}{6}\pi d^3 = \tfrac{4}{3}\pi r^3$

GRAPHING TERMINOLOGY

BINOMIAL THEOREM

Also called binomial expansion. When m is a positive integer, this is a finite series of m + 1 terms. When m is not a positive integer, the series converges for −1 < x < 1.

$(1+x)^m = 1 + mx + \frac{m(m-1)}{2!}x^2 + \cdots + \frac{m(m-1)(m-2)\cdots(m-n+1)}{n!}x^n + \cdots$
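A numerical check of the binomial series for a non-integer exponent (an addition to the notes; the helper name is arbitrary):

```python
def binomial_series(x: float, m: float, terms: int = 30) -> float:
    """Partial sum of (1+x)^m = 1 + mx + m(m-1)/2! x^2 + ..."""
    total, coef = 1.0, 1.0
    for n in range(1, terms):
        coef *= (m - (n - 1)) / n  # next binomial coefficient factor
        total += coef * x**n
    return total

x, m = 0.2, 0.5
print(binomial_series(x, m))  # ~ 1.0954451
print((1 + x) ** m)           # sqrt(1.2) = 1.0954451...
```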
GLOSSARY

derangement: A permutation of elements in which the positions of all elements have changed with respect to a reference permutation.

distribution function: A distribution function assigns probabilities to each possible outcome. The sum of the probabilities is 1. The probability density function may be obtained by taking the derivative of the distribution function.

independent trials: A special class of random variables. A sequence of random variables X1, X2, …, Xn that are mutually independent and that have the same distribution is called a sequence of independent trials or an independent trials process.

median: A value of a random variable for which all greater values make the distribution function greater than one half and all lesser values make it less than one half. Or, a value in an ordered set of values below and above which there is an equal number of values.

random variable: A variable representing the outcome of a particular experiment. For example, the random variable X1 might represent the outcome of two coin tosses. Its value could be HT or HH, etc.

stochastic: Random; involving a random variable; involving chance or probability.

uniform distribution: The probabilities of all outcomes are equal. If the sample space contains n discrete outcomes numbered 1 through n, then the uniform distribution function is m(ω) = 1/n.
QUADRATIC EQUATION

Given the equation $ax^2 + bx + c = 0$:

$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$

LINEARIZING AN EQUATION

Small nonlinear terms are removed. Nonlinear terms include:

- variables raised to a power
- variables multiplied by other variables

∆ values are considered variables, e.g. ∆t.