CDF: F_x(x) = P(x ≤ x) = ∫_(-∞)^x p_x(s) ds.

Total Probability: P(A) = Σ_i P(A ∩ B_i), where the B_i partition S (∪_i B_i = S);
equivalently, P(A) = Σ_i P(A|B_i) P(B_i).

Conditional Probability: P(A|B) = P(A ∩ B)/P(B), (P(B) ≠ 0);
equivalently, p_(x|x∈A)(x) = p_x(x)/P(A) for x ∈ A.
Bayes' Theorem: P(A|B) P(B) = P(B|A) P(A);
equivalently, p_(x|y=y)(x) p_y(y) = p_(y|x=x)(y) p_x(x).
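As a numerical illustration of Bayes' theorem together with total probability (all numbers below are hypothetical, chosen only for the example):

```python
# Hypothetical setup: a test with 1% prevalence, 95% sensitivity,
# and a 10% false-positive rate.
p_A = 0.01              # P(A): prior probability of the condition
p_B_given_A = 0.95      # P(B|A): probability of a positive test given A
p_B_given_notA = 0.10   # P(B|~A): probability of a positive test given not-A

# Total probability: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 4))
```

Note how the small prior keeps the posterior well below the test's sensitivity.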
Expectation: E{g(x)} = ∫ g(x) p_x(x) dx;  E(x|A) = ∫_A x p_(x|A)(x) dx.
Variance: var(x) = E((x - μ_x)²).  nth moment: E(xⁿ).
Covariance: cov(x, y) = E((x - μ_x)(y - μ_y)).  Correlation: E(xy).
Correlation coefficient: ρ(x, y) = cov(x, y)/(σ_x σ_y).
MGF: φ_x(s) = E(e^(sx)).
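The covariance and correlation-coefficient definitions can be checked by hand on a small joint pmf (the pmf below is hypothetical):

```python
# Hypothetical joint pmf on {0,1} x {0,1}; checks
# cov(x,y) = E(xy) - E(x)E(y) and rho = cov/(sigma_x * sigma_y).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

Ex = sum(x * pr for (x, y), pr in p.items())
Ey = sum(y * pr for (x, y), pr in p.items())
Exy = sum(x * y * pr for (x, y), pr in p.items())
varx = sum((x - Ex) ** 2 * pr for (x, y), pr in p.items())
vary = sum((y - Ey) ** 2 * pr for (x, y), pr in p.items())

cov = Exy - Ex * Ey
rho = cov / (varx ** 0.5 * vary ** 0.5)
print(cov, rho)
```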
Distribution            pmf/pdf                                                   parameter(s)
Bernoulli(p)            p_x(x) = 1-p if x = 0; p if x = 1                         0 < p < 1
Binomial(n,p)           p_x(x) = C(n,x) p^x (1-p)^(n-x) for x = 0, 1, ..., n      0 < p < 1, n a positive integer
Geometric(p)            p_x(x) = (1-p)^(x-1) p for x = 1, 2, 3, ...               0 < p < 1
Pascal(k,p)             p_x(x) = C(x-1,k-1) p^k (1-p)^(x-k) for x = k, k+1, ...   0 < p < 1, k a positive integer
Discrete Uniform(k,l)   p_x(x) = 1/(l-k+1) for x = k, k+1, ..., l                 k, l integers, k < l
Uniform(a,b)            p_x(x) = 1/(b-a) for x ∈ [a, b]                           a, b real, a < b
Exponential(λ)          p_x(x) = λe^(-λx) for x > 0                               λ > 0
Gaussian(μ,σ²)          p_x(x) = (1/√(2πσ²)) e^(-(x-μ)²/(2σ²))                    μ real, σ > 0
Poisson(λ)              p_x(x) = e^(-λ) λ^x / x! for x = 0, 1, 2, ...             λ > 0
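As a sanity check, each pmf in the table sums to 1 over its support; here three of them are checked numerically with hypothetical parameter values:

```python
import math

# Hypothetical parameters; each pmf should sum to (approximately) 1.
p = 0.3
geom = sum((1 - p) ** (x - 1) * p for x in range(1, 200))  # truncated tail is negligible

lam = 2.5
poisson = sum(math.exp(-lam) * lam ** x / math.factorial(x) for x in range(0, 50))

n = 10
binom = sum(math.comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(0, n + 1))
print(geom, poisson, binom)
```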
Here's another table giving their vital statistics.
Distribution            mean        variance             MGF
Bernoulli(p)            p           p(1-p)               1 - p + pe^s
Binomial(n,p)           np          np(1-p)              (1 - p + pe^s)^n
Geometric(p)            1/p         (1-p)/p²             pe^s / (1 - (1-p)e^s)
Pascal(k,p)             k/p         k(1-p)/p²            (pe^s / (1 - (1-p)e^s))^k
Discrete Uniform(k,l)   (k+l)/2     ((l-k+1)² - 1)/12    (e^(sk) - e^(s(l+1))) / ((l-k+1)(1 - e^s))
Uniform(a,b)            (a+b)/2     (b-a)²/12            (e^(bs) - e^(as)) / (s(b-a))
Exponential(λ)          1/λ         1/λ²                 λ/(λ - s)
Gaussian(μ,σ²)          μ           σ²                   e^(μs + σ²s²/2)
Poisson(λ)              λ           λ                    e^(λ(e^s - 1))
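Any row of the table can be verified directly from the corresponding pmf; here the Binomial(n,p) row, with hypothetical values of n, p, and the MGF argument s:

```python
import math

# Hypothetical n, p, s: compute mean, variance, and MGF directly from the
# Binomial pmf and compare to the closed forms np, np(1-p), (1-p+pe^s)^n.
n, p, s = 10, 0.3, 0.2
pmf = [math.comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)]

mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))
mgf = sum(math.exp(s * x) * q for x, q in enumerate(pmf))
print(mean, var, mgf)
```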
Vector-valued random variables (or random vectors): C_xy = E((x - μ_x)(y - μ_y)')
(cross-covariance matrix), where μ_x and μ_y are the mean vectors of the two random
vectors. C_x = C_xx.
Multivariate Gaussian: p_x(x) = (1/√((2π)^n |C_x|)) e^(-(1/2)(x - μ_x)' C_x⁻¹ (x - μ_x)).
(|C_x| is the determinant of the covariance matrix.)
If y = Ax, then C_y = A C_x A'.
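The transform rule C_y = A C_x A' can be checked on a small example (both matrices below are hypothetical, using plain lists to stay dependency-free):

```python
# Check C_y = A C_x A' on a 2x2 example.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

Cx = [[2.0, 0.5], [0.5, 1.0]]   # a valid (symmetric, positive-definite) covariance
A = [[1.0, 1.0], [0.0, 2.0]]    # hypothetical linear transform

Cy = matmul(matmul(A, Cx), transpose(A))
print(Cy)  # note the result is again symmetric, as a covariance must be
```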
The Chebyshev inequality: P(|x - μ_x| ≥ c) ≤ var(x)/c², or P(|x - μ_x| ≥ kσ_x) ≤ 1/k².
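An empirical illustration of the Chebyshev bound (the distribution and threshold are hypothetical choices):

```python
import random

# Uniform(0,1) samples have mean 1/2 and variance 1/12; Chebyshev then says
# P(|x - 1/2| >= c) <= (1/12)/c^2. Check the empirical frequency against the bound.
random.seed(0)
xs = [random.random() for _ in range(100_000)]
c = 0.4
freq = sum(abs(x - 0.5) >= c for x in xs) / len(xs)
bound = (1 / 12) / c ** 2
print(freq, bound, freq <= bound)
```

The bound is loose here (the true probability is 0.2), which is typical: Chebyshev trades tightness for complete generality.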
The Central Limit Theorem: Let W_n = Σ_(i=1)^n x_i, where the x_i are iid with
E(x) = μ_x and var(x) = σ_x². Then the CDF F_(W_n) approaches a Gaussian CDF
with mean nμ_x and variance nσ_x². Another version, with the same hypotheses:
let u_n = √n ((1/n) Σ_(i=1)^n x_i - μ_x)/σ_x. Then the pdf of u_n approaches the
pdf of an N(0,1) Gaussian variable.
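A quick simulation of the second form of the theorem (sample sizes and the underlying distribution are hypothetical choices):

```python
import math
import random

# Standardized sums of iid Uniform(0,1) variables should be approximately
# N(0,1), so roughly 84% of them fall below z = 1 (Phi(1) ~ 0.84134).
random.seed(1)
mu, sigma = 0.5, (1 / 12) ** 0.5   # mean and std of Uniform(0,1)
n, trials = 30, 20_000

def u_n():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (math.sqrt(n) * sigma)

frac = sum(u_n() < 1.0 for _ in range(trials)) / trials
print(round(frac, 2))
```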
Here's a table for the right half of the CDF of the N(0,1) random variable:
z       Φ(z)       z       Φ(z)       z       Φ(z)       z       Φ(z)
0.00    0.50000    1.00    0.84134    2.00    0.97725    3.00    0.99865
0.10    0.53983    1.10    0.86433    2.10    0.98214    3.10    0.99903
0.20    0.57926    1.20    0.88493    2.20    0.98610    3.20    0.99931
0.30    0.61791    1.30    0.90320    2.30    0.98928    3.30    0.99952
0.40    0.65542    1.40    0.91924    2.40    0.99180    3.40    0.99966
0.50    0.69146    1.50    0.93319    2.50    0.99379    3.50    0.99977
0.60    0.72575    1.60    0.94520    2.60    0.99534    3.60    0.99984
0.70    0.75804    1.70    0.95543    2.70    0.99653    3.70    0.99989
0.80    0.78814    1.80    0.96407    2.80    0.99744    3.80    0.99993
0.90    0.81594    1.90    0.97128    2.90    0.99813    3.90    0.99995
(If you need a value for a z between two of the values above, simply interpolate
or use the nearest value.)
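The suggested interpolation can be sketched as follows (a minimal sketch; only the first few table entries are included, and the 0.1 grid spacing matches the table):

```python
# Linear interpolation between adjacent entries of the Phi table.
grid = {0.0: 0.50000, 0.1: 0.53983, 0.2: 0.57926, 0.3: 0.61791}

def phi_interp(z):
    lo = round(int(z / 0.1) * 0.1, 1)   # grid point at or below z
    hi = round(lo + 0.1, 1)
    t = (z - lo) / 0.1                  # fractional position between grid points
    return grid[lo] + t * (grid[hi] - grid[lo])

print(phi_interp(0.25))
```

For negative arguments, use the symmetry Φ(-z) = 1 - Φ(z), since the table only covers the right half.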
Estimators (in all cases, the observable is y, the estimated quantity is x):
x̂_ML(y) = arg max_x p_(y|x=x)(y) (for random variables; for a parameter θ,
θ̂_ML = arg max_θ p_y(y; θ)).
x̂_MAP(y) = arg max_x p_(x|y=y)(x) (or, you can use arg max_x p_(x,y)(x, y)).
x̂_MMSE(y) = E(x|y = y).
x̂_LMMSE(y) = ay + b, where a = ρ σ_x/σ_y and b = μ_x - a μ_y. For the LMMSE,
the mean-square error is E((x̂_LMMSE - x)²) = σ_x²(1 - ρ²).
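The LMMSE coefficients and its error formula can be checked on simulated data (the observation model y = x + noise below is hypothetical):

```python
import random

# Simulate x ~ N(1, 4) observed as y = x + N(0, 1) noise, estimate the moments,
# form the LMMSE estimator ay + b, and compare its empirical mean-square error
# to sigma_x^2 (1 - rho^2).
random.seed(2)
N = 50_000
xs = [random.gauss(1.0, 2.0) for _ in range(N)]
ys = [x + random.gauss(0.0, 1.0) for x in xs]

mx = sum(xs) / N
my = sum(ys) / N
vx = sum((x - mx) ** 2 for x in xs) / N
vy = sum((y - my) ** 2 for y in ys) / N
cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
rho = cxy / (vx * vy) ** 0.5

a = rho * (vx / vy) ** 0.5        # a = rho * sigma_x / sigma_y
b = mx - a * my                   # b = mu_x - a * mu_y
mse = sum((a * y + b - x) ** 2 for x, y in zip(xs, ys)) / N
print(mse, vx * (1 - rho ** 2))   # the two quantities agree
```

Here the theoretical values are σ_x² = 4 and ρ² = 4/5, so the mean-square error should come out near 0.8.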
Stochastic Process formulas: Autocorrelation: R_x(t, τ) = E(x(t) x(t + τ)).
Cross-correlation: R_xy(t, τ) = E(x(t) y(t + τ)). Autocovariance:
C_x(t, τ) = E((x(t) - μ_x(t))(x(t + τ) - μ_x(t + τ))). Cross-covariance:
C_xy(t, τ) = E((x(t) - μ_x(t))(y(t + τ) - μ_y(t + τ))).
(For wide-sense stationary processes these depend only on τ, e.g. C_x(τ).)
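As an illustration of the autocorrelation definition (the signal is a hypothetical iid Gaussian sequence, i.e. discrete-time white noise):

```python
import random

# For zero-mean iid noise with unit variance, the sample autocorrelation
# R_x(tau) should be near 1 at tau = 0 and near 0 for tau != 0.
random.seed(3)
N = 100_000
x = [random.gauss(0.0, 1.0) for _ in range(N)]

def R(tau):
    return sum(x[t] * x[t + tau] for t in range(N - tau)) / (N - tau)

print(round(R(0), 2), round(R(5), 2))
```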