
Fundamentals and conditional probabilities

P[A^c] = 1 - P[A]        P[A ∪ B] = P[A] + P[B] - P[AB]
P[A | B] = P[AB] / P[B]        P[AB] = P[B] P[A | B]
A and B are mutually exclusive if P[AB] = 0.
A and B are independent if P[AB] = P[A]P[B], which also implies that P[A | B] = P[A].
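The independence definition above can be checked numerically on a toy sample space; the two-dice events below are a hypothetical example, not from the sheet:

```python
from fractions import Fraction

# Sketch: checking independence P[AB] = P[A]P[B] on two fair dice
# (hypothetical example events).
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0      # first die is even
B = lambda w: w[0] + w[1] == 7   # dice sum to 7
p_ab = prob(lambda w: A(w) and B(w))
print(p_ab == prob(A) * prob(B))  # True: A and B are independent
```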
Three events A, B, and C are independent if every possible combination factors, e.g., P[ABC] = P[A]P[B]P[C], P[A^c C] = P[A^c]P[C], etc.
If A_1, A_2, . . ., A_n are mutually exclusive and ∑_i P[A_i] = 1 (i.e., the A_i are a way of breaking things down into all possible cases), then

P[B] = ∑_i P[B | A_i] P[A_i]    (Law of Total Probability)
P[A_j | B] = P[A_j B] / P[B] = P[B | A_j] P[A_j] / ∑_i P[B | A_i] P[A_i]    (Bayes' Theorem)
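The Law of Total Probability and Bayes' Theorem can be sketched together in a few lines of Python; the prevalence and test-accuracy numbers below are hypothetical illustration values:

```python
# Sketch: Total Probability + Bayes with a hypothetical diagnostic test.
# Partition: A1 = "has condition", A2 = "does not"; B = "test positive".
p_a = [0.01, 0.99]          # P[A_1], P[A_2]; they sum to 1
p_b_given_a = [0.95, 0.05]  # P[B | A_1], P[B | A_2]

# Law of Total Probability: P[B] = sum_i P[B | A_i] P[A_i]
p_b = sum(pb * pa for pb, pa in zip(p_b_given_a, p_a))

# Bayes' Theorem: P[A_1 | B]
posterior = p_b_given_a[0] * p_a[0] / p_b
print(round(posterior, 3))  # ≈ 0.161
```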
CDFs and densities:
F(x) = P[X ≤ x]
If F(x) is continuous, then f(x) = (d/dx) F(x) is the density and P[a ≤ X ≤ b] = ∫_a^b f(x) dx.
If F(x) has a jump at a, i.e. F(a) > lim_{x→a⁻} F(x), then the size of the jump is the probability of X = a, i.e.,
P[X = a] = F(a) - F(a⁻).
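A small sketch of the jump rule, using a hypothetical mixed CDF (continuous mass of 1/2 on (0, 1) plus a point mass of 1/2 at 1):

```python
# Sketch: reading a point mass off a CDF jump. Hypothetical mixed CDF:
# F(x) = x/2 for 0 <= x < 1 and F(x) = 1 for x >= 1 (jump of size 1/2 at a = 1).
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return x / 2
    return 1.0

a = 1
p_at_a = F(a) - F(a - 1e-12)  # approximates F(a) - F(a^-)
print(round(p_at_a, 6))  # ≈ 0.5, i.e. P[X = 1] = 1/2
```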
Moments:
E[X] = ∑_x x P[X = x] + ∫ x f(x) dx
(for discrete random variables, only the first part applies; for purely continuous, only the second; for mixed distributions, use both)
If X ≥ 0, then E[X] = ∫_0^∞ P[X > x] dx.
If X ≥ 0 and only takes on integer values, E[X] = ∑_{k=0}^∞ P[X > k].
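The tail-sum identity for nonnegative integer-valued X can be verified directly on a small discrete example (a fair die; hypothetical, not from the sheet):

```python
from fractions import Fraction

# Sketch: check E[X] = sum_{k=0}^inf P[X > k] for a nonnegative
# integer-valued X, here a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

e_direct = sum(k * p for k, p in pmf.items())  # sum_x x P[X = x]
# tail sum; terms vanish once k >= 6, so range(6) suffices
e_tail = sum(sum(p for x, p in pmf.items() if x > k) for k in range(6))
print(e_direct, e_tail)  # 7/2 7/2
```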
E[X^2] = ∑_x x^2 P[X = x] + ∫ x^2 f(x) dx
E[g(X)] = ∑_x g(x) P[X = x] + ∫ g(x) f(x) dx
Var[X] = E[(X - μ_X)^2] = E[X^2] - (E[X])^2
Var[aX] = a^2 Var[X]
SD[X] = √(Var[X])
Cov(X, Y) = E[XY] - E[X] E[Y]
Cov(X, X) = Var[X]
Corr(X, Y) = Cov(X, Y) / (SD(X) SD(Y))
Coefficient of Variation of X = SD(X) / E(X)
Covariance is bilinear, meaning that it distributes as you might expect. This means that Cov(aX + bY, cZ) = ac Cov(X, Z) + bc Cov(Y, Z), and
Var(aX + bY) = a^2 Var X + 2ab Cov(X, Y) + b^2 Var Y.
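The variance-of-a-linear-combination identity can be checked exactly on a small joint pmf; the pmf and the coefficients a, b below are hypothetical, chosen so that the covariance term is nonzero:

```python
from fractions import Fraction

# Sketch: verify Var(aX + bY) = a^2 VarX + 2ab Cov(X,Y) + b^2 VarY
# on a small hypothetical joint pmf with nonzero covariance.
joint = {(0, 0): Fraction(1, 2), (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}

def E(g):
    # expectation of g(X, Y) under the joint pmf
    return sum(g(x, y) * p for (x, y), p in joint.items())

def var(g):
    return E(lambda x, y: g(x, y) ** 2) - E(g) ** 2

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
a, b = 2, -3
lhs = var(lambda x, y: a * x + b * y)
rhs = a**2 * var(lambda x, y: x) + 2*a*b*cov + b**2 * var(lambda x, y: y)
print(lhs == rhs)  # True
```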
Moment generating functions:
M_X(t) = E[e^{tX}]
M_X(0) = E[e^0] = 1
(d/dt) M_X(t) |_{t=0} = E[X]
(d^n/dt^n) M_X(t) |_{t=0} = E[X^n]
M_{aX+b}(t) = E[e^{atX+bt}] = e^{bt} M_X(at)
Normal, Gamma, and Binomial are the most common mgfs used on the exam.
If X and Y are independent, then M_{X+Y}(t) = M_X(t) M_Y(t).
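A numerical sanity check of the derivative rule M_X'(0) = E[X], using the standard Bernoulli(p) mgf M_X(t) = 1 - p + p e^t (the value of p is hypothetical):

```python
import math

# Sketch: M_X'(0) = E[X], checked by central difference for a
# Bernoulli(p) variable, whose mgf is M_X(t) = 1 - p + p*e^t, E[X] = p.
p = 0.3
M = lambda t: 1 - p + p * math.exp(t)

h = 1e-6
deriv_at_0 = (M(h) - M(-h)) / (2 * h)  # numerical M'(0)
print(abs(deriv_at_0 - p) < 1e-6)  # True
```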
Joint moment generating functions:
M_{X_1,X_2}(t_1, t_2) = E[e^{t_1 X_1 + t_2 X_2}]
E[X_1^m X_2^n] = (∂^m/∂t_1^m)(∂^n/∂t_2^n) M_{X_1,X_2}(t_1, t_2) |_{(0,0)}
M_{X_1}(t_1) = E[e^{t_1 X_1 + 0 X_2}] = M_{X_1,X_2}(t_1, 0)
Convolutions and transformations
If Z = X + Y, then f_Z(z) = ∫ f_{X,Y}(x, z - x) dx = ∫ f_X(x) f_Y(z - x) dx. The last equality (the convolution formula) only holds if X and Y are independent.
You can also find f_Z(z) by finding the CDF and differentiating.
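The discrete analogue of the convolution formula (a sum in place of the integral) can be sketched for two independent fair dice, a hypothetical example:

```python
from fractions import Fraction

# Sketch: discrete convolution P[Z = z] = sum_x P[X = x] P[Y = z - x]
# for Z = X + Y with X, Y independent fair dice.
f = {k: Fraction(1, 6) for k in range(1, 7)}

f_z = {z: sum(f[x] * f.get(z - x, 0) for x in f) for z in range(2, 13)}
print(f_z[7])  # 1/6 (six of the 36 equally likely outcomes sum to 7)
```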
If Y = g(X) and g is one-to-one, then f_Y(y) = f_X[g^{-1}(y)] |(d/dy) g^{-1}(y)|.
If g is two-to-one (or more), then you can use this with each inverse and add them up.
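A sketch of the one-to-one case with the hypothetical choice g(x) = x² and X ~ Uniform(0, 1), cross-checked against the differentiate-the-CDF approach:

```python
import math

# Sketch: Y = g(X) = X**2 for X ~ Uniform(0, 1); g is one-to-one on (0, 1).
# Inverse: g^{-1}(y) = sqrt(y), and |d/dy g^{-1}(y)| = 1 / (2 * sqrt(y)).
f_X = lambda x: 1.0 if 0 < x < 1 else 0.0
f_Y = lambda y: f_X(math.sqrt(y)) / (2 * math.sqrt(y))

# Cross-check against differentiating the CDF F_Y(y) = P[X <= sqrt(y)] = sqrt(y):
y, h = 0.25, 1e-6
cdf_deriv = (math.sqrt(y + h) - math.sqrt(y - h)) / (2 * h)
print(abs(f_Y(y) - cdf_deriv) < 1e-6)  # True
```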
If (Y_1, Y_2) = g(X_1, X_2), then f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}[g^{-1}(y_1, y_2)] |J(g^{-1})|,
where J(g^{-1}) is the determinant

    | ∂g_1^{-1}/∂y_1   ∂g_1^{-1}/∂y_2 |
    | ∂g_2^{-1}/∂y_1   ∂g_2^{-1}/∂y_2 |

and

    | a  b |
    | c  d |  = ad - bc
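A worked instance of the Jacobian formula, using the hypothetical linear map (Y_1, Y_2) = (X_1 + X_2, X_1 - X_2):

```python
# Worked instance of the Jacobian formula for the hypothetical linear map
# (Y1, Y2) = g(X1, X2) = (X1 + X2, X1 - X2).
# Inverse map: x1 = (y1 + y2)/2, x2 = (y1 - y2)/2, so the partials are constant.
dx1_dy1, dx1_dy2 = 0.5, 0.5
dx2_dy1, dx2_dy2 = 0.5, -0.5

det = dx1_dy1 * dx2_dy2 - dx1_dy2 * dx2_dy1  # ad - bc
print(abs(det))  # 0.5, so f_{Y1,Y2}(y1, y2) = f_{X1,X2}((y1+y2)/2, (y1-y2)/2) * 1/2
```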
