Relations among the modes of convergence: $\xrightarrow{a.s.}$ implies $\xrightarrow{p.}$, $\xrightarrow{m.s.}$ implies $\xrightarrow{p.}$, and $\xrightarrow{p.}$ implies $\xrightarrow{d.}$. Convergence in probability implies m.s. convergence if the sequence is dominated by a single random variable with finite second moment.
Law of large numbers. Suppose $X_1, X_2, \ldots$ is a sequence of random variables such that each $X_i$ has finite mean $m$. Let $S_n = X_1 + \cdots + X_n$. Then
(a) $\frac{S_n}{n} \xrightarrow{m.s.} m$ (hence also $\frac{S_n}{n} \xrightarrow{p.} m$ and $\frac{S_n}{n} \xrightarrow{d.} m$) if for some constant $c$, $\mathrm{Var}(X_i) \le c$ for all $i$, and $\mathrm{Cov}(X_i, X_j) = 0$ if $i \neq j$ (i.e. if the variances are bounded and the $X_i$'s are uncorrelated).
(b) $\frac{S_n}{n} \xrightarrow{p.} m$ if $X_1, X_2, \ldots$ are iid (weak law).
(c) $\frac{S_n}{n} \xrightarrow{a.s.} m$ if $X_1, X_2, \ldots$ are iid (strong law).
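A minimal simulation sketch of the law (Python with NumPy; the Uniform(0, 2) choice, which has mean $m = 1$, is an illustrative assumption, not from the notes):

```python
import numpy as np

# Sample means of iid Uniform(0, 2) variables (finite mean m = 1);
# S_n / n should settle near m as n grows, per the law of large numbers.
rng = np.random.default_rng(0)
for n in [10, 1_000, 100_000]:
    x = rng.uniform(0.0, 2.0, size=n)
    print(n, x.mean())
```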
Central limit theorem. Suppose $X_1, X_2, \ldots$ are i.i.d., each with mean $\mu$ and variance $\sigma^2$. Let $S_n = X_1 + \cdots + X_n$. Then $\frac{S_n - n\mu}{\sqrt{n}}$ converges in distribution to the $N(0, \sigma^2)$ distribution as $n \to \infty$.
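A quick numerical check (illustrative choice: iid Exponential(1), so $\mu = \sigma^2 = 1$):

```python
import numpy as np

# Standardized sums (S_n - n*mu) / sqrt(n) for iid Exponential(1) samples
# should have mean near 0 and variance near sigma^2 = 1 for large n.
rng = np.random.default_rng(1)
n, trials = 1_000, 5_000
s = rng.exponential(1.0, size=(trials, n)).sum(axis=1)
z = (s - n * 1.0) / np.sqrt(n)
print(z.mean(), z.var())  # roughly 0 and 1
```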
Cramér's theorem. Suppose $E[X_1]$ is finite, and that $E[X_1] < a$. Then for $\epsilon > 0$ there exists a number $n_\epsilon$ such that $P\left\{\frac{S_n}{n} \ge a\right\} \ge \exp(-n(\ell(a) + \epsilon))$ for all $n \ge n_\epsilon$. Combining this bound with the Chernoff inequality yields $\lim_{n \to \infty} \frac{1}{n} \ln P\left\{\frac{S_n}{n} \ge a\right\} = -\ell(a)$.
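A numerical illustration for iid $N(0,1)$ samples, for which the rate function is $\ell(a) = a^2/2$ (a standard special case; SciPy assumed):

```python
import numpy as np
from scipy.stats import norm

# For iid N(0,1), S_n / n is exactly N(0, 1/n), so
# P{S_n/n >= a} = Q(a * sqrt(n)) and (1/n) ln P -> -l(a) = -a^2/2.
a = 0.5
for n in [10, 100, 1_000, 10_000]:
    print(n, norm.logsf(a * np.sqrt(n)) / n)  # approaches -0.125
```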
A Brownian motion, also called a Wiener process, with parameter $\sigma^2$ is a random process $W = (W_t : t \ge 0)$ such that
- $P\{W_0 = 0\} = 1$,
- $W$ has independent increments,
- $W_t - W_s$ has the $N(0, \sigma^2(t - s))$ distribution for $t \ge s$, and
- $W$ is sample path continuous with probability one.
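A minimal path-simulation sketch (the discretization is an assumption for illustration):

```python
import numpy as np

# Approximate a Brownian path with parameter sigma^2 on [0, T] by
# cumulative sums of independent N(0, sigma^2 * dt) increments.
rng = np.random.default_rng(3)
sigma2, T, steps = 2.0, 1.0, 1_000
dt = T / steps
increments = rng.normal(0.0, np.sqrt(sigma2 * dt), size=steps)
W = np.concatenate([[0.0], np.cumsum(increments)])  # W_0 = 0
print(W[-1])  # one draw of W_T, which is N(0, sigma^2 * T)
```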
A Poisson process with rate $\lambda$ is a random process $N = (N_t : t \ge 0)$ such that
- $N$ is a counting process,
- $N$ has independent increments, and
- $N(t) - N(s)$ has the $Poi(\lambda(t - s))$ distribution for $t \ge s$.
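A minimal simulation sketch using the standard fact that the interarrival times are iid Exponential($\lambda$):

```python
import numpy as np

# Jump times of a rate-lambda Poisson process on [0, T] via cumulative
# sums of iid Exponential(lambda) interarrival times.
rng = np.random.default_rng(4)
lam, T = 3.0, 10.0
jumps = np.cumsum(rng.exponential(1.0 / lam, size=int(10 * lam * T)))
jumps = jumps[jumps <= T]
print(len(jumps), "jumps; E[N_T] = lambda * T =", lam * T)
```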
Kolmogorov forward equations for time-homogeneous, continuous-time, discrete-state Markov processes:
$$\frac{\partial \pi(t)}{\partial t} = \pi(t) Q \quad \text{or} \quad \frac{\partial \pi_j(t)}{\partial t} = \sum_{i \in S} \pi_i(t) q_{ij} \quad \text{or} \quad \frac{\partial \pi_j(t)}{\partial t} = \sum_{i \in S, i \neq j} \pi_i(t) q_{ij} - \sum_{i \in S, i \neq j} \pi_j(t) q_{ji}.$$
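A minimal sketch integrating the forward equation for an assumed two-state generator $Q$:

```python
import numpy as np

# Euler-integrate dpi/dt = pi(t) Q for a two-state generator (rows sum
# to zero); pi(t) tends to the equilibrium distribution (2/3, 1/3).
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])
pi, dt = np.array([1.0, 0.0]), 0.001
for _ in range(10_000):  # integrate out to t = 10
    pi = pi + dt * (pi @ Q)
print(pi)
```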
The orthogonality principle. Let $\mathcal{V}$ be a closed, linear subspace of $L^2(\Omega, \mathcal{F}, P)$, and let $X \in L^2(\Omega, \mathcal{F}, P)$, for some probability space $(\Omega, \mathcal{F}, P)$. There exists a unique element $Z^*$ (also denoted by $\Pi_{\mathcal{V}}(X)$) in $\mathcal{V}$ so that $E[(X - Z^*)^2] \le E[(X - Z)^2]$ for all $Z \in \mathcal{V}$.
(a) (Characterization) Let $W$ be a random variable. Then $W = Z^*$ if and only if the following two conditions hold:
(i) $W \in \mathcal{V}$
(ii) $(X - W) \perp Z$ for all $Z$ in $\mathcal{V}$.
(b) (Error expression) $E[(X - Z^*)^2] = E[X^2] - E[(Z^*)^2]$.
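A quick numeric check of the error expression in the simplest case $\mathcal{V} = \mathbb{R}$, where $Z^* = E[X]$ (the Exponential data choice is illustrative):

```python
import numpy as np

# With V the constants, Z* = E[X], and the error expression
# E[(X - Z*)^2] = E[X^2] - E[(Z*)^2] reduces to the variance of X.
rng = np.random.default_rng(5)
x = rng.exponential(2.0, size=1_000_000)
z_star = x.mean()
print(np.mean((x - z_star) ** 2))     # E[(X - Z*)^2]
print(np.mean(x ** 2) - z_star ** 2)  # E[X^2] - E[(Z*)^2], same value
```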
Special cases:
- $\mathcal{V} = \mathbb{R}$: $Z^* = E[X]$
- $\mathcal{V} = \{g(Y) : g : \mathbb{R}^m \to \mathbb{R},\ E[g(Y)^2] < +\infty\}$: $Z^* = E[X \mid Y]$
- $\mathcal{V} = \{c_0 + c_1 Y_1 + c_2 Y_2 + \cdots + c_n Y_n : c_0, c_1, \ldots, c_n \in \mathbb{R}\}$: $Z^* = \widehat{E}[X \mid Y]$
For random vectors:
$$\widehat{E}[X \mid Y] = E[X] + \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}(Y - E[Y])$$
and for $e = X - \widehat{E}[X \mid Y]$: $\mathrm{Cov}(e) = \mathrm{Cov}(X) - \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}\,\mathrm{Cov}(Y, X)$.
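A minimal sketch of the vector formulas using sample moments (the linear model generating the data is an assumption for illustration):

```python
import numpy as np

# Estimate scalar X from a 2-vector Y with
# E_hat[X|Y] = E[X] + Cov(X,Y) Cov(Y,Y)^{-1} (Y - E[Y]).
rng = np.random.default_rng(6)
n = 200_000
y = rng.standard_normal((n, 2))
x = 1.0 + y @ np.array([2.0, -1.0]) + 0.5 * rng.standard_normal(n)

ex, ey = x.mean(), y.mean(axis=0)
cov_xy = (x - ex) @ (y - ey) / n         # Cov(X, Y)
cov_yy = (y - ey).T @ (y - ey) / n       # Cov(Y, Y)
x_hat = ex + (y - ey) @ np.linalg.solve(cov_yy, cov_xy)
print((x - x_hat).var())                                   # Cov(e), ~0.25
print(x.var() - cov_xy @ np.linalg.solve(cov_yy, cov_xy))  # error formula
```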
Conditional pdf for jointly Gaussian random vectors. Let $X$ and $Y$ be jointly Gaussian vectors. Given $Y = y$, the conditional distribution of $X$ is $N(\widehat{E}[X \mid Y = y], \mathrm{Cov}(e))$.
Maximum likelihood (ML) and maximum a posteriori (MAP) estimators:
$$\widehat{\theta}_{ML}(y) = \arg\max_{\theta} p_Y(y \mid \theta)$$
$$\widehat{\theta}_{MAP}(y) = \arg\max_{\theta} p_{\theta \mid Y}(\theta \mid y) = \arg\max_{\theta} \pi(\theta)\, p_Y(y \mid \theta) = \arg\max_{\theta} p_{\theta, Y}(\theta, y)$$
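A small discrete illustration (the coin model and the prior are assumptions for the example; SciPy supplies the binomial pmf):

```python
import numpy as np
from scipy.stats import binom

# theta in {0.2, 0.5, 0.8} is an unknown coin bias; y = 7 heads out of
# n = 10 flips. ML maximizes p_Y(y|theta); MAP also weighs in a prior.
thetas = np.array([0.2, 0.5, 0.8])
prior = np.array([0.7, 0.2, 0.1])            # assumed prior pi(theta)
y, n = 7, 10
lik = binom.pmf(y, n, thetas)                # p_Y(y | theta)
print("ML: ", thetas[np.argmax(lik)])          # 0.8
print("MAP:", thetas[np.argmax(prior * lik)])  # 0.5: prior pulls it back
```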
Proposition. Suppose $X$ is an irreducible, aperiodic, discrete-time, discrete-state Markov process. Then:
(a) All states are transient, or all are positive recurrent, or all are null recurrent.
(b) For any initial distribution $\pi(0)$, $\lim_{t \to \infty} \pi_i(t) = 1/M_i$, with the understanding that the limit is zero if $M_i = +\infty$, where $M_i$ is the mean time to return to state $i$ starting from state $i$.
(c) An equilibrium probability distribution exists if and only if all states are positive recurrent.
(d) If it exists, the equilibrium probability distribution is given by $\pi_i = 1/M_i$. (In particular, if it exists, the equilibrium probability distribution is unique.)
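A minimal sketch for an assumed irreducible, aperiodic two-state chain, checking (b) and (d) numerically:

```python
import numpy as np

# pi(t+1) = pi(t) P converges to the equilibrium distribution for any
# initial pi(0); the mean return times satisfy M_i = 1 / pi_i.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
pi = np.array([1.0, 0.0])
for _ in range(500):
    pi = pi @ P
print(pi)        # equilibrium (0.75, 0.25)
print(1.0 / pi)  # mean return times M_i
```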
A Karhunen-Loève (KL) expansion for a random process $X = (X_t : a \le t \le b)$ is a representation of the form $X_t = \sum_{n=1}^{N} C_n \phi_n(t)$ with $N \le +\infty$ such that:
(1) the functions $(\phi_n)$ are orthonormal: $\langle \phi_m, \phi_n \rangle = I_{\{m=n\}}$, and
(2) the coordinate random variables $C_n$ are mutually orthogonal: $E[C_m C_n^*] = 0$ if $n \neq m$.
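A discrete-time sketch (assumed setup: Brownian motion sampled on a grid, with the eigenvectors of the covariance matrix standing in for the $\phi_n$):

```python
import numpy as np

# On a grid, eigenvectors of the covariance matrix K(s,t) = min(s,t)
# give orthonormal phi_n, and the coordinates C_n = <X, phi_n> of
# sampled Brownian paths come out (approximately) mutually orthogonal.
rng = np.random.default_rng(7)
dt, n, paths = 0.01, 100, 5_000
t = dt * np.arange(1, n + 1)
K = np.minimum.outer(t, t)
lam, phi = np.linalg.eigh(K)                 # K phi_n = lambda_n phi_n
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(paths, n)), axis=1)
C = X @ phi                                  # coordinate random variables
cov_C = C.T @ C / paths                      # approx diag(lambda_n)
off = np.abs(cov_C - np.diag(np.diag(cov_C))).max()
print(off / np.diag(cov_C).max())            # small: E[C_m C_n] ~ 0, m != n
```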
Jointly WSS $X$ and $Y$ in LTI systems: if $U = h * X$ and $V = k * Y$, then
$$R_{U,V} = h * \widetilde{k}^* * R_{X,Y} \quad \text{and} \quad S_{U,V} = H K^* S_{X,Y},$$
where $\widetilde{k}(t) = k(-t)$.
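A discrete-time sanity check (assumed FIR filters and a common white input, so $S_{X,Y} = 1$; note that scipy.signal.csd conjugates its first argument, so csd(V, U) estimates $H K^* S_{X,Y}$):

```python
import numpy as np
from scipy import signal

# Pass the same unit-variance white noise through FIR filters h and k,
# then compare the Welch cross-spectrum of (V, U) with H(w) K(w)* S_XY(w).
rng = np.random.default_rng(8)
x = rng.standard_normal(2 ** 18)
h = np.array([1.0, 0.5])                       # U = h * X
k = np.array([1.0, -0.25, 0.1])                # V = k * Y, with Y = X here
u = signal.lfilter(h, [1.0], x)
v = signal.lfilter(k, [1.0], x)
f, s = signal.csd(v, u, fs=1.0, nperseg=1024, return_onesided=False)
w = 2 * np.pi * f
H = h[0] + h[1] * np.exp(-1j * w)
K = k[0] + k[1] * np.exp(-1j * w) + k[2] * np.exp(-2j * w)
print(np.abs(s - H * np.conj(K)).max())        # small (estimation error)
```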