Properties of a joint probability mass function:

1. $0 \le p(x_1, \dots, x_n) \le 1$

2. $\sum_{x_1} \cdots \sum_{x_n} p(x_1, \dots, x_n) = 1$

3. $P[(X_1, \dots, X_n) \in A] = \sum_{(x_1, \dots, x_n) \in A} p(x_1, \dots, x_n)$
Example
A hand of 13 cards is dealt from a standard 52-card deck. Let X and Y be the numbers of cards the hand contains from two given 13-card suits (say hearts and spades). Then

$p(x, y) = \dfrac{\binom{13}{x}\binom{13}{y}\binom{26}{13-x-y}}{\binom{52}{13}}$

where $\binom{52}{13}$ is the total number of ways of choosing the 13 cards for the hand. The table gives p(x, y) with x as the row index and y as the column index:

x \ y   0      1      2      3      4      5      6      7      8      9      10     11     12     13
0 0.0000 0.0002 0.0009 0.0024 0.0035 0.0032 0.0018 0.0006 0.0001 0.0000 0.0000 0.0000 0.0000 0.0000
1 0.0002 0.0021 0.0085 0.0183 0.0229 0.0173 0.0081 0.0023 0.0004 0.0000 0.0000 0.0000 0.0000 -
2 0.0009 0.0085 0.0299 0.0549 0.0578 0.0364 0.0139 0.0032 0.0004 0.0000 0.0000 0.0000 - -
3 0.0024 0.0183 0.0549 0.0847 0.0741 0.0381 0.0116 0.0020 0.0002 0.0000 0.0000 - - -
4 0.0035 0.0229 0.0578 0.0741 0.0530 0.0217 0.0050 0.0006 0.0000 0.0000 - - - -
5 0.0032 0.0173 0.0364 0.0381 0.0217 0.0068 0.0011 0.0001 0.0000 - - - - -
6 0.0018 0.0081 0.0139 0.0116 0.0050 0.0011 0.0001 0.0000 - - - - - -
7 0.0006 0.0023 0.0032 0.0020 0.0006 0.0001 0.0000 - - - - - - -
8 0.0001 0.0004 0.0004 0.0002 0.0000 0.0000 - - - - - - - -
9 0.0000 0.0000 0.0000 0.0000 0.0000 - - - - - - - - -
10 0.0000 0.0000 0.0000 0.0000 - - - - - - - - - -
11 0.0000 0.0000 0.0000 - - - - - - - - - - -
12 0.0000 0.0000 - - - - - - - - - - - -
13 0.0000 - - - - - - - - - - - - -
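As a quick numeric sanity check, the formula above can be evaluated directly with Python's `math.comb`; the helper name `p` is just for illustration.

```python
from math import comb

def p(x, y):
    """Joint pmf: x cards from one suit and y from another
    in a 13-card hand dealt from a 52-card deck."""
    if x < 0 or y < 0 or x + y > 13:
        return 0.0
    return comb(13, x) * comb(13, y) * comb(26, 13 - x - y) / comb(52, 13)

# The probabilities over all (x, y) pairs sum to 1.
total = sum(p(x, y) for x in range(14) for y in range(14))
print(round(total, 6))     # 1.0
print(round(p(2, 2), 4))   # 0.0299, matching the table entry
```

The dashes in the table correspond to pairs with x + y > 13, which the helper returns as probability 0.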
Example
Suppose that we observe an experiment that has k
possible outcomes {O1, O2, …, Ok } independently n
times.
Let p1, p2, …, pk denote probabilities of O1, O2, …, Ok
respectively.
Let Xi denote the number of times that outcome Oi
occurs in the n repetitions of the experiment.
Then the joint probability function of the random
variables X1, X2, …, Xk is (The Multinomial
distribution)
$p(x_1, \dots, x_k) = \dfrac{n!}{x_1! \, x_2! \cdots x_k!} \, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$
Note:
$p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$
is the probability of a sequence of length n containing
x1 outcomes O1
x2 outcomes O2
…
xk outcomes Ok
The multinomial coefficient counts the orderings of such a sequence:

$\dfrac{n!}{x_1!\, x_2! \cdots x_k!} = \dbinom{n}{x_1 \, x_2 \cdots x_k} = \dbinom{n}{x_1}\dbinom{n - x_1}{x_2}\dbinom{n - x_1 - x_2}{x_3} \cdots \dbinom{n - x_1 - \cdots - x_{k-1}}{x_k}$

$= \dfrac{n!}{x_1!\,(n - x_1)!} \cdot \dfrac{(n - x_1)!}{x_2!\,(n - x_1 - x_2)!} \cdot \dfrac{(n - x_1 - x_2)!}{x_3!\,(n - x_1 - x_2 - x_3)!} \cdots = \dfrac{n!}{x_1!\, x_2! \cdots x_k!}$

Hence

$p(x_1, \dots, x_k) = \dfrac{n!}{x_1!\, x_2! \cdots x_k!}\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k} = \dbinom{n}{x_1 \, x_2 \cdots x_k}\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$
$p(x, y, z) = \dfrac{4!}{x!\, y!\, z!}\,(0.30)^x (0.50)^y (0.20)^z$ for $x + y + z = 4$

Table: p(x, y, z); rows indexed by (x, y), columns by z = 0, 1, 2, 3, 4

x  y    z=0     z=1     z=2     z=3     z=4
0 0 0 0 0 0 0.0016
0 1 0 0 0 0.0160 0
0 2 0 0 0.0600 0 0
0 3 0 0.1000 0 0 0
0 4 0.0625 0 0 0 0
1 0 0 0 0 0.0096 0
1 1 0 0 0.0720 0 0
1 2 0 0.1800 0 0 0
1 3 0.1500 0 0 0 0
1 4 0 0 0 0 0
2 0 0 0 0.0216 0 0
2 1 0 0.1080 0 0 0
2 2 0.1350 0 0 0 0
2 3 0 0 0 0 0
2 4 0 0 0 0 0
3 0 0 0.0216 0 0 0
3 1 0.0540 0 0 0 0
3 2 0 0 0 0 0
3 3 0 0 0 0 0
3 4 0 0 0 0 0
4 0 0.0081 0 0 0 0
4 1 0 0 0 0 0
4 2 0 0 0 0 0
4 3 0 0 0 0 0
4 4 0 0 0 0 0
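The multinomial pmf behind the table above can be sketched in a few lines of Python; `multinomial_pmf` is an illustrative helper name, not a library function.

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """n!/(x1!...xk!) * p1^x1 * ... * pk^xk for counts (x1,...,xk)."""
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)
    value = float(coef)
    for x, p in zip(counts, probs):
        value *= p ** x
    return value

# Entry of the table above: x = 1, y = 2, z = 1 with p = (0.30, 0.50, 0.20).
print(round(multinomial_pmf([1, 2, 1], [0.30, 0.50, 0.20]), 4))  # 0.18
```

Any other cell of the table can be checked the same way; cells with x + y + z ≠ 4 are structurally zero.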
MARGINAL DISCRETE DISTRIBUTIONS
Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete
random variables with joint probability function
p(x1, x2, …, xq, xq+1, …, xk);
then the marginal joint probability function
of X1, X2, …, Xq is

$p_{12 \cdots q}(x_1, \dots, x_q) = \sum_{x_{q+1}} \cdots \sum_{x_k} p(x_1, \dots, x_k)$
• If the pair (X1,X2) of discrete random
variables has the joint pmf p(x1,x2), then the
marginal pmfs of X1 and X2 are
$p_1(x_1) = \sum_{x_2} p(x_1, x_2)$ and $p_2(x_2) = \sum_{x_1} p(x_1, x_2)$
Example
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x \ y   0       1       2       3       4       5      p X (x )
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019
1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019
2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608
3 0.0206 0.0103 0.0013 0 0 0 0.0322
4 0.0026 0.0006 0 0 0 0 0.0032
5 0.0001 0 0 0 0 0 0.0001
p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
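The joint pmf behind this table is multinomial with category probabilities 1/6, 1/6, 4/6 (six, five, anything else), and the row sums give the marginal of X. A minimal sketch, with `joint` as an illustrative helper name:

```python
from math import factorial

def joint(x, y, n=5):
    """P(x sixes and y fives in n rolls): multinomial with
    category probabilities 1/6, 1/6, 4/6."""
    if x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

# Marginal of X by summing out y, as in the p_X(x) column.
p_X = [sum(joint(x, y) for y in range(6)) for x in range(6)]
print([round(v, 4) for v in p_X])
# [0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001]
```

The marginal matches a Binomial(5, 1/6), as it should, since summing out the fives leaves only "six vs. not six".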
Example
Assume that the random variables X and Y have the joint
probability mass function given as
$p_X(x) = \dfrac{\lambda^x e^{-\lambda}}{x!}$

$\left[\text{using } e^{\lambda} = \sum_{t=0}^{\infty} \dfrac{\lambda^t}{t!}\right]$
Example
Let the joint distribution of X and Y be given as
f(x, y) = (x + y)/30,  x = 0, 1, 2, 3;  y = 0, 1, 2

Find the marginal distribution function of X and Y.

Marginal of X:

$f(x) = \sum_{y=0}^{2} f(x, y) = \sum_{y=0}^{2} \dfrac{x+y}{30} = \dfrac{1}{30}\left[(x+0) + (x+1) + (x+2)\right] = \dfrac{1}{30}(3x+3) = \dfrac{x+1}{10}$
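Both marginals of this joint pmf can be tabulated by brute-force summation; a short sketch, with `f`, `f_X`, `f_Y` as illustrative names:

```python
# Joint pmf f(x, y) = (x + y)/30 on x = 0..3, y = 0..2.
f = {(x, y): (x + y) / 30 for x in range(4) for y in range(3)}

# Marginals: sum out the other variable.
f_X = {x: sum(f[x, y] for y in range(3)) for x in range(4)}
f_Y = {y: sum(f[x, y] for x in range(4)) for y in range(3)}

print({x: round(v, 4) for x, v in f_X.items()})  # {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
print({y: round(v, 4) for y, v in f_Y.items()})
```

The marginal of X agrees with the closed form (x + 1)/10 derived above, and by symmetry of the sum the marginal of Y is (6 + 4y)/30.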
PROPERTIES OF THE JOINT CDF

$\lim_{x_1 \to -\infty} F(x_1, x_2) = F(-\infty, x_2) = 0$, for all $x_2$.

$\lim_{x_2 \to -\infty} F(x_1, x_2) = F(x_1, -\infty) = 0$, for all $x_1$.

$\lim_{x_1 \to \infty,\, x_2 \to \infty} F(x_1, x_2) = F(\infty, \infty) = 1$

$P(a < X_1 \le b,\ c < X_2 \le d) = F(b, d) - F(b, c) - F(a, d) + F(a, c) \ge 0$, for all $a < b$ and $c < d$.

$\lim_{h \to 0^+} F(x_1 + h, x_2) = \lim_{h \to 0^+} F(x_1, x_2 + h) = F(x_1, x_2)$, for all $x_1$ and $x_2$ (right continuity).
CONDITIONAL DISCRETE DISTRIBUTION
Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete
random variables with joint probability function
p(x1, x2, …, xq, xq+1, …, xk);
then the conditional joint probability function
of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is

$p_{1 \cdots q \mid q+1 \cdots k}(x_1, \dots, x_q \mid x_{q+1}, \dots, x_k) = \dfrac{p(x_1, \dots, x_k)}{p_{q+1 \cdots k}(x_{q+1}, \dots, x_k)}$
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX|Y(x|y) = P[X = x|Y = y] is called the conditional
probability function of X given Y = y
and
pY|X(y|x) = P[Y = y|X = x] is called the conditional
probability function of Y given X = x
Note

$p_{X \mid Y}(x \mid y) = P[X = x \mid Y = y] = \dfrac{P[X = x, Y = y]}{P[Y = y]} = \dfrac{p(x, y)}{p_Y(y)}$

and

$p_{Y \mid X}(y \mid x) = P[Y = y \mid X = x] = \dfrac{P[X = x, Y = y]}{P[X = x]} = \dfrac{p(x, y)}{p_X(x)}$
Example
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x \ y   0       1       2       3       4       5      p X (x )
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019
1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019
2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608
3 0.0206 0.0103 0.0013 0 0 0 0.0322
4 0.0026 0.0006 0 0 0 0 0.0032
5 0.0001 0 0 0 0 0 0.0001
p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
The conditional distribution of X given Y = y.
pX |Y(x|y) = P[X = x|Y = y]
x \ y   0       1       2       3       4       5
0 0.3277 0.4096 0.5120 0.6400 0.8000 1.0000
1 0.4096 0.4096 0.3840 0.3200 0.2000 0.0000
2 0.2048 0.1536 0.0960 0.0400 0.0000 0.0000
3 0.0512 0.0256 0.0080 0.0000 0.0000 0.0000
4 0.0064 0.0016 0.0000 0.0000 0.0000 0.0000
5 0.0003 0.0000 0.0000 0.0000 0.0000 0.0000
The conditional distribution of Y given X = x.
x \ y   0       1       2       3       4       5
0 0.3277 0.4096 0.2048 0.0512 0.0064 0.0003
1 0.4096 0.4096 0.1536 0.0256 0.0016 0.0000
2 0.5120 0.3840 0.0960 0.0080 0.0000 0.0000
3 0.6400 0.3200 0.0400 0.0000 0.0000 0.0000
4 0.8000 0.2000 0.0000 0.0000 0.0000 0.0000
5 1.0000 0.0000 0.0000 0.0000 0.0000 0.0000
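The conditional tables above come from dividing the joint pmf by the appropriate marginal. A brief sketch for the X-given-Y table, with `joint` and `p_X_given_Y` as illustrative names:

```python
from math import factorial

def joint(x, y, n=5):
    """Joint pmf of sixes (x) and fives (y) in n = 5 die rolls."""
    if x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

# Marginal of Y, then condition on it.
p_Y = [sum(joint(x, y) for x in range(6)) for y in range(6)]

def p_X_given_Y(x, y):
    """p(x, y) / p_Y(y)."""
    return joint(x, y) / p_Y[y]

print(round(p_X_given_Y(0, 0), 4))  # 0.3277, as in the table
```

Each column of the conditional table sums to 1, since it is a genuine pmf in x for fixed y.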
Example
Suppose that X and Y have the joint probability density function

$f(x, y) = 4xy$ for $0 \le x \le 1,\ 0 \le y \le 1$

(which integrates to 1, as every joint density must: $\int \cdots \int f(x_1, \dots, x_n)\, dx_1 \cdots dx_n = 1$). Then the joint cdf is

$F(x, y) = \int_0^y \!\! \int_0^x 4 t_1 t_2 \, dt_1 \, dt_2 = x^2 y^2$ for $0 \le x \le 1,\ 0 \le y \le 1$
MARGINAL CONTINUOUS DISTRIBUTION
Let X1, X2, …, Xq, Xq+1 …, Xk denote k continuous
random variables with joint probability density
function
f(x1, x2, …, xq, xq+1 …, xk )
$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$

The marginal density of X is

$g(x) = \int_{-\infty}^{\infty} f(x, y)\, dy = \int_0^1 \frac{2}{5}(2x + 3y)\, dy = \frac{2}{5}\left[2xy + \frac{3y^2}{2}\right]_{y=0}^{1} = \frac{4}{5}x + \frac{3}{5}$
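The marginal g(x) = (4x + 3)/5 can be checked numerically by a midpoint-rule integration over y; `g_numeric` is an illustrative helper name, and the midpoint rule is exact here because the integrand is linear in y.

```python
# Midpoint-rule approximation of g(x) = integral of (2/5)(2x + 3y) dy over [0, 1].
def g_numeric(x, m=10_000):
    h = 1.0 / m
    return sum((2/5) * (2*x + 3*(j + 0.5) * h) * h for j in range(m))

for x in (0.0, 0.5, 1.0):
    exact = (4*x + 3) / 5
    print(x, round(g_numeric(x), 6), round(exact, 6))
```

The same pattern (integrate the joint density over the unwanted variable) applies to any continuous marginal.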
INDEPENDENCE OF RANDOM
VARIABLES
• If X1, X2, …, Xk are independent of each
other, then the joint pdf factors as

$f(x_1, x_2, \dots, x_k) = f_1(x_1)\, f_2(x_2) \cdots f_k(x_k)$

In general, the conditional joint pdf is

$f_{1 \cdots q \mid q+1 \cdots k}(x_1, \dots, x_q \mid x_{q+1}, \dots, x_k) = \dfrac{f(x_1, \dots, x_k)}{f_{q+1 \cdots k}(x_{q+1}, \dots, x_k)}$
• If X1 and X2 are continuous random variables
with joint pdf f(x1,x2), then the conditional pdf
of X2 given X1=x1 is defined by
$f(x_2 \mid x_1) = \dfrac{f(x_1, x_2)}{f_1(x_1)}$, for $x_1$ such that $f_1(x_1) > 0$; $0$ elsewhere.
f ( x, y) f ( x) f ( y | x) f ( y) f ( x | y)
$f_{Y \mid X}(y \mid x) = \dfrac{1}{x^2}$ if $0 \le y \le x^2$

so, with $f_X(x) = \frac{1}{5} e^{-x/5}$,

$f(x, y) = f_X(x)\, f_{Y \mid X}(y \mid x) = \frac{1}{5} e^{-x/5} \cdot \dfrac{1}{x^2} = \dfrac{e^{-x/5}}{5x^2}$ if $0 \le y \le x^2,\ x > 0$
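As a consistency check, integrating this joint density over y from 0 to x² should recover the exponential marginal f_X(x) = (1/5)e^{-x/5}; the helper names below are illustrative, and the midpoint rule is exact because the integrand is constant in y.

```python
from math import exp

def f_joint(x, y):
    """Joint density e^(-x/5) / (5 x^2) on 0 <= y <= x^2, x > 0."""
    return exp(-x/5) / (5 * x**2) if x > 0 and 0 <= y <= x**2 else 0.0

def f_X_numeric(x, m=1000):
    """Midpoint-rule integral of f_joint(x, y) over y in [0, x^2]."""
    h = x**2 / m
    return sum(f_joint(x, (j + 0.5) * h) * h for j in range(m))

x = 2.0
print(round(f_X_numeric(x), 6), round(exp(-x/5) / 5, 6))
```

The two printed values agree, confirming that multiplying a marginal by a conditional yields a valid joint density.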
Example
Let X, Y, Z denote three jointly distributed random
variables with joint density function

$f(x, y, z) = \begin{cases} K(x^2 + yz) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$
Find the value of K.
Determine the marginal distributions of X, Y and Z.
Determine the joint marginal distributions of
X, Y
X, Z
Y, Z
Determining the value of K:

$1 = \int \!\!\! \int \!\!\! \int f(x, y, z)\, dx\, dy\, dz = K \int_0^1 \!\!\! \int_0^1 \!\!\! \int_0^1 (x^2 + yz)\, dx\, dy\, dz$

$= K \int_0^1 \!\!\! \int_0^1 \left[\frac{x^3}{3} + xyz\right]_{x=0}^{x=1} dy\, dz = K \int_0^1 \!\!\! \int_0^1 \left(\frac{1}{3} + yz\right) dy\, dz$

$= K \int_0^1 \left[\frac{y}{3} + \frac{y^2}{2} z\right]_{y=0}^{y=1} dz = K \int_0^1 \left(\frac{1}{3} + \frac{z}{2}\right) dz$

$= K \left[\frac{z}{3} + \frac{z^2}{4}\right]_0^1 = K \left(\frac{1}{3} + \frac{1}{4}\right) = \frac{7}{12} K = 1 \quad \text{if } K = \frac{12}{7}$
The marginal distribution of X:

$f_1(x) = \int \!\!\! \int f(x, y, z)\, dy\, dz = \frac{12}{7} \int_0^1 \!\!\! \int_0^1 (x^2 + yz)\, dy\, dz$

$= \frac{12}{7} \int_0^1 \left[x^2 y + \frac{y^2}{2} z\right]_{y=0}^{y=1} dz = \frac{12}{7} \int_0^1 \left(x^2 + \frac{z}{2}\right) dz$

$= \frac{12}{7} \left[x^2 z + \frac{z^2}{4}\right]_{z=0}^{z=1} = \frac{12}{7} \left(x^2 + \frac{1}{4}\right)$ for $0 \le x \le 1$
The marginal distribution of X, Y:

$f_{12}(x, y) = \int f(x, y, z)\, dz = \frac{12}{7} \int_0^1 (x^2 + yz)\, dz$

$= \frac{12}{7} \left[x^2 z + y \frac{z^2}{2}\right]_{z=0}^{z=1} = \frac{12}{7} \left(x^2 + \frac{y}{2}\right)$ for $0 \le x \le 1,\ 0 \le y \le 1$
Summary of the marginals found so far:

$f_{12}(x, y) = \frac{12}{7} \left(x^2 + \frac{y}{2}\right)$ for $0 \le x \le 1,\ 0 \le y \le 1$

$f_1(x) = \frac{12}{7} \left(x^2 + \frac{1}{4}\right)$ for $0 \le x \le 1$
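Both K = 12/7 and the marginal of X can be checked numerically with a midpoint-rule grid over the unit cube; the variable names below are illustrative, and m = 100 keeps the discretization error well below the assertion tolerance.

```python
# Midpoint-rule check of K = 12/7 and f1(x) = (12/7)(x^2 + 1/4).
K = 12 / 7
m = 100
h = 1.0 / m
pts = [(j + 0.5) * h for j in range(m)]  # midpoints of the m subintervals

# Triple integral of the density over the unit cube (should be ~1).
total = sum(K * (x**2 + y*z) * h**3 for x in pts for y in pts for z in pts)

# Marginal of X at x = 0.5: integrate out y and z.
f1_half = sum(K * (0.5**2 + y*z) * h**2 for y in pts for z in pts)

print(round(total, 4))    # ~1.0
print(round(f1_half, 4))  # ~0.8571, i.e. (12/7)(1/4 + 1/4) = 6/7
```

The same grid could be reused to check the joint marginals f12(x, y), f13(x, z), and f23(y, z) asked for in the example.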
$-\infty < \mu_x < \infty,\ -\infty < \mu_y < \infty,\ \sigma_1 > 0,\ \sigma_2 > 0,\ -1 \le \rho \le 1$.

If $(X, Y) \sim \text{BVN}(\mu_x, \mu_y, \sigma_1^2, \sigma_2^2, \rho)$, then

$X \sim N(\mu_x, \sigma_1^2)$ and $Y \sim N(\mu_y, \sigma_2^2)$
$V(Y \mid X = x) = \sum_y \left(y - E(Y \mid X = x)\right)^2 p_{Y \mid X}(y \mid x)$

$= E(Y^2 \mid X = x) - \left[E(Y \mid X = x)\right]^2$

• In general, $E[h(Y) \mid X = x] = \sum_y h(y)\, p_{Y \mid X}(y \mid x)$
• For X, Y continuous random variables, the conditional
expectation of Y given X = x is
$E(Y \mid X = x) = \int_{-\infty}^{\infty} y\, f_{Y \mid X}(y \mid x)\, dy$

• In general, $E[h(Y) \mid X = x] = \int_{-\infty}^{\infty} h(y)\, f_{Y \mid X}(y \mid x)\, dy$
• If X and Y are jointly distributed random variables then,
E[E(Y|X)]=E(Y)
• If X and Y are independent random variables then,
E(Y|X)=E(Y)
E(X|Y)=E(X)
• If X and Y are jointly distributed random variables then,
Var(Y) = E X [Var(Y|X)] + Var X [E(Y|X)]
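Both identities can be verified on a small discrete joint pmf, e.g. the earlier example f(x, y) = (x + y)/30; the variable names below are illustrative.

```python
# Check E[E(Y|X)] = E(Y) and Var(Y) = E[Var(Y|X)] + Var[E(Y|X)]
# on the joint pmf f(x, y) = (x + y)/30, x = 0..3, y = 0..2.
f = {(x, y): (x + y) / 30 for x in range(4) for y in range(3)}
f_X = {x: sum(f[x, y] for y in range(3)) for x in range(4)}

E_Y = sum(y * f[x, y] for x in range(4) for y in range(3))
Var_Y = sum(y**2 * f[x, y] for x in range(4) for y in range(3)) - E_Y**2

# Conditional mean and variance of Y given X = x.
EY_given = {x: sum(y * f[x, y] for y in range(3)) / f_X[x] for x in range(4)}
VarY_given = {x: sum(y**2 * f[x, y] for y in range(3)) / f_X[x] - EY_given[x]**2
              for x in range(4)}

tower = sum(EY_given[x] * f_X[x] for x in range(4))          # E[E(Y|X)]
decomposition = (sum(VarY_given[x] * f_X[x] for x in range(4))
                 + sum((EY_given[x] - tower)**2 * f_X[x] for x in range(4)))

print(round(tower, 6), round(E_Y, 6))              # equal
print(round(decomposition, 6), round(Var_Y, 6))    # equal
```

Here X and Y are not independent, so the variance decomposition has both terms nonzero, yet the totals still match.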
MOMENT GENERATING FUNCTION OF A
JOINT DISTRIBUTION
• Both the univariate definition E(e^{tX}) and the moment-extraction rule M'(0) = E(X) extend to the joint case:

$M_{X,Y}(t_1, t_2) = E\left[e^{t_1 X + t_2 Y}\right]$

$E[X^n Y^m] = \left.\dfrac{\partial^{n+m}}{\partial t_1^n\, \partial t_2^m}\, M_{X,Y}(t_1, t_2)\right|_{t_1 = t_2 = 0}$
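The moment-extraction rule can be illustrated numerically: build M(t1, t2) for a small joint pmf and approximate the mixed partial at (0, 0) by central differences, which should recover E[XY]. The joint pmf and helper names below reuse the earlier example f(x, y) = (x + y)/30 for illustration.

```python
from math import exp

# Joint pmf from the earlier example: f(x, y) = (x + y)/30.
f = {(x, y): (x + y) / 30 for x in range(4) for y in range(3)}

def M(t1, t2):
    """Joint mgf M_{X,Y}(t1, t2) = E[exp(t1*X + t2*Y)]."""
    return sum(p * exp(t1*x + t2*y) for (x, y), p in f.items())

# Mixed second partial at (0, 0) via central differences ~ E[XY].
h = 1e-3
est = (M(h, h) - M(h, -h) - M(-h, h) + M(-h, -h)) / (4 * h * h)

E_XY = sum(x * y * p for (x, y), p in f.items())
print(round(est, 4), round(E_XY, 4))  # both ≈ 2.4
```

With an analytic mgf the differentiation would be done symbolically; the finite-difference version just makes the identity concrete.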