5.1 Introduction
5.2 Joint CDFs of Bivariate Random Variables (or Vector Random Variables)
◈ Consider two random variables X and Y defined on the same sample space
- e.g., X: students' grades, Y: the same students' heights
- Note that there are two mapping functions!
- Events and their marginal CDFs:
A = {X ≤ x}, F_X(x) = P{X ≤ x}
B = {Y ≤ y}, F_Y(y) = P{Y ≤ y}
◈ The probability of the joint event A ∩ B → joint probability distribution function: F_{X,Y}(x, y) = P{X ≤ x, Y ≤ y}
◈ Example)
- S_J (joint sample space): (1,1), (2,1), and (3,3)
- Probabilities: P(1,1) = 0.2, P(2,1) = 0.3, and P(3,3) = 0.5
- Find F_{X,Y}(x, y)
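For a discrete pair like this, the joint CDF is just the running sum of the point masses at or below (x, y). A minimal Python sketch, with the dictionary of point masses taken from the example above:

```python
# Joint CDF of a discrete pair: F(x, y) = sum of P(x_n, y_m) over x_n <= x, y_m <= y
points = {(1, 1): 0.2, (2, 1): 0.3, (3, 3): 0.5}

def joint_cdf(x, y):
    return sum(p for (xn, yn), p in points.items() if xn <= x and yn <= y)

print(joint_cdf(2, 1))  # 0.2 + 0.3 = 0.5
print(joint_cdf(3, 3))  # all of the mass: 1.0
```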
Multiple Random Variables
Joint Distribution
◈ Example) A fair coin is tossed twice. S = {(H,H), (H,T), (T,H), (T,T)}, each outcome with probability 1/4.
X = "number of heads on the first toss", Y = "number of heads on the second toss"
- Mapping: Heads → 1, Tails → 0
- Compute the CDF
- X and Y are independent
- Marginal pdfs:
f_X(x) = (2/4)δ(x) + (2/4)δ(x − 1)
f_Y(y) = (2/4)δ(y) + (2/4)δ(y − 1)
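The coin-toss example can be enumerated directly; the sketch below builds the joint PMF from the four outcomes and checks that it factors into the two marginals above:

```python
from itertools import product
from fractions import Fraction

# Two fair coin tosses: X = heads on first toss, Y = heads on second toss
pmf = {}
for first, second in product("HT", repeat=2):        # (H,H), (H,T), (T,H), (T,T)
    x, y = int(first == "H"), int(second == "H")
    pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 4)

px = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pmf.items() if b == y) for y in (0, 1)}

# Independence: the joint PMF equals the product of the marginals everywhere
assert all(pmf[(x, y)] == px[x] * py[y] for x in (0, 1) for y in (0, 1))
print(px, py)  # each marginal puts mass 1/2 on 0 and 1/2 on 1
```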
◈ Example 5.1) Draw the region to understand this!
P{X > a and Y > b} = P{X > a ∩ Y > b}
= 1 − P{X ≤ a ∪ Y ≤ b}
= 1 − F_X(a) − F_Y(b) + F_{X,Y}(a, b)
◈ Properties of the joint PMF
1) A PMF can be neither negative nor greater than 1: 0 ≤ p_{X,Y}(x, y) ≤ 1
2) The joint PMF sums to 1: Σ_x Σ_y p_{X,Y}(x, y) = 1
For the earlier example:
F_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) u(x − x_n) u(y − y_m)
= P(1,1)u(x − 1)u(y − 1) + P(2,1)u(x − 2)u(y − 1) + P(3,3)u(x − 3)u(y − 3)
f_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) δ(x − x_n) δ(y − y_m)
= P(1,1)δ(x − 1)δ(y − 1) + P(2,1)δ(x − 2)δ(y − 1) + P(3,3)δ(x − 3)δ(y − 3)
◈ The distribution function of one R.V. can be obtained by setting the value of the other variable to infinity: F_X(x) = F_{X,Y}(x, ∞), F_Y(y) = F_{X,Y}(∞, y)
◈ Example) From the previous example,
F_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) u(x − x_n) u(y − y_m)
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y : joint density function (the concept extends to Nth-order derivatives)
f_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) δ(x − x_n) δ(y − y_m) : in the discrete (discontinuous) case, defined using delta functions
Properties of the joint pdf:
(1) f_{X,Y}(x, y) ≥ 0
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1
(3) F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) dv du
(4) F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(u, y) dy du,  F_Y(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} f_{X,Y}(x, v) dx dv
(5) P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_{X,Y}(x, y) dy dx
(6) f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
◈ Example) Find the constant b that makes f_{X,Y}(x, y) = b e^{−x} cos(y), 0 ≤ x ≤ 2, 0 ≤ y ≤ π/2 (and 0 otherwise), a valid joint pdf, using:
(1) f_{X,Y}(x, y) ≥ 0
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1
∫_0^{π/2} ∫_0^2 b e^{−x} cos(y) dx dy = b (∫_0^2 e^{−x} dx)(∫_0^{π/2} cos(y) dy) = b(1 − e^{−2})(1) = 1
∴ b = 1/(1 − e^{−2})
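The normalization b(1 − e^{−2}) = 1 can be sanity-checked numerically; a small sketch using a midpoint rule, exploiting the fact that the double integral separates into an x-part and a y-part:

```python
import math

# Check that b = 1/(1 - e^-2) normalizes f(x, y) = b e^(-x) cos(y)
# on 0 <= x <= 2, 0 <= y <= pi/2.
b = 1.0 / (1.0 - math.exp(-2.0))

def midpoint(f, a, c, n=20000):
    h = (c - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

ix = midpoint(lambda x: math.exp(-x), 0.0, 2.0)     # 1 - e^-2
iy = midpoint(math.cos, 0.0, math.pi / 2)           # 1
print(b * ix * iy)  # ≈ 1.0
```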
(6) f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx : marginal pdfs
◈ Example) f_{X,Y}(x, y) = u(x) u(y) e^{−(x+y)}
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = u(x) e^{−x} ∫_0^∞ e^{−y} dy = u(x) e^{−x}
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = u(y) e^{−y} ∫_0^∞ e^{−x} dx = u(y) e^{−y}
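The marginal f_X(x) = u(x)e^{−x} can be confirmed by integrating the joint pdf over y numerically; a small sketch:

```python
import math

# Numerically marginalize f(x, y) = exp(-(x + y)), x, y >= 0, over y
# and compare with the closed form f_X(x) = exp(-x).
def f_joint(x, y):
    return math.exp(-(x + y)) if x >= 0 and y >= 0 else 0.0

def marginal_x(x, ymax=50.0, n=20000):
    h = ymax / n
    return h * sum(f_joint(x, (i + 0.5) * h) for i in range(n))

for x in (0.5, 1.0, 2.0):
    assert abs(marginal_x(x) - math.exp(-x)) < 1e-4
print("numerical marginal matches exp(-x)")
```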
Statistical Independence
◈ What does statistical independence mean in probability? P(A ∩ B) = P(A)P(B)
◈ For random variables:
P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}
In terms of the cdf and pdf: F_{X,Y}(x, y) = F_X(x) F_Y(y),  f_{X,Y}(x, y) = f_X(x) f_Y(y)
◈ Example) f_{X,Y}(x, y) = (1/12) u(x) u(y) e^{−(x/4) − (y/3)}
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = (1/12) u(x) e^{−x/4} ∫_0^∞ e^{−y/3} dy = (1/4) u(x) e^{−x/4}
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = (1/12) u(y) e^{−y/3} ∫_0^∞ e^{−x/4} dx = (1/3) u(y) e^{−y/3}
Since f_X(x) f_Y(y) = (1/12) u(x) u(y) e^{−(x/4) − (y/3)} = f_{X,Y}(x, y), X and Y are independent.
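The factorization can be spot-checked in a few lines; the sketch below compares the joint pdf with the product of the two marginals on a grid of points:

```python
import math

# Independence check for f(x, y) = (1/12) exp(-x/4) exp(-y/3), x, y >= 0:
# the joint pdf should equal f_X(x) * f_Y(y) with
# f_X(x) = (1/4) exp(-x/4) and f_Y(y) = (1/3) exp(-y/3).
def f_joint(x, y):
    return (1 / 12) * math.exp(-x / 4 - y / 3)

def f_x(x):
    return (1 / 4) * math.exp(-x / 4)

def f_y(y):
    return (1 / 3) * math.exp(-y / 3)

for x in (0.0, 1.0, 3.0):
    for y in (0.0, 2.0, 5.0):
        assert abs(f_joint(x, y) - f_x(x) * f_y(y)) < 1e-12
print("joint pdf factors into its marginals: X and Y are independent")
```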
◈ Conditional probability: P(A|B) = P(A ∩ B)/P(B)
◈ Conditional PMF
p_{Y|X}(y|x) = P[X = x, Y = y]/P[X = x] = p_{XY}(x, y)/p_X(x)
p_{X|Y}(x|y) = P[X = x, Y = y]/P[Y = y] = p_{XY}(x, y)/p_Y(y)
◈ Example 5.8) p_{X,Y}(x, y) = (1/18)(2x + y), x = 1, 2; y = 1, 2
Joint PMF values: p_{X,Y}(1,1) = 3/18, p_{X,Y}(1,2) = 4/18, p_{X,Y}(2,1) = 5/18, p_{X,Y}(2,2) = 6/18
p_X(x) = Σ_y (1/18)(2x + y) = (1/18) Σ_{y=1}^{2} (2x + y) = (1/18)(4x + 3), x = 1, 2
p_Y(y) = Σ_x (1/18)(2x + y) = (1/18) Σ_{x=1}^{2} (2x + y) = (1/18)(2y + 6), y = 1, 2
p_{Y|X}(y|x) = p_{XY}(x, y)/p_X(x) = (2x + y)/(4x + 3)
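Example 5.8 is small enough to verify exactly with rational arithmetic; the sketch below recomputes the marginals and the conditional PMF:

```python
from fractions import Fraction

# Example 5.8: p(x, y) = (2x + y) / 18 for x, y in {1, 2}
def p(x, y):
    return Fraction(2 * x + y, 18)

p_x = {x: p(x, 1) + p(x, 2) for x in (1, 2)}   # (4x + 3) / 18
p_y = {y: p(1, y) + p(2, y) for y in (1, 2)}   # (2y + 6) / 18

assert p_x[1] == Fraction(7, 18) and p_x[2] == Fraction(11, 18)
assert p_y[1] == Fraction(8, 18) and p_y[2] == Fraction(10, 18)

# Conditional PMF: p(y | x) = (2x + y) / (4x + 3)
for x in (1, 2):
    for y in (1, 2):
        assert p(x, y) / p_x[x] == Fraction(2 * x + y, 4 * x + 3)
print("conditional PMF matches (2x + y)/(4x + 3)")
```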
◈ Conditional PDF
f_{Y|X}(y|x) = f_{XY}(x, y)/f_X(x),  f_{X|Y}(x|y) = f_{XY}(x, y)/f_Y(y)
◈ Example 5.9)
f_{XY}(x, y) = x e^{−x(y+1)}, 0 ≤ x < ∞, 0 ≤ y < ∞; 0 otherwise
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = x e^{−x} ∫_0^∞ e^{−xy} dy = x e^{−x} [−e^{−xy}/x]_0^∞ = e^{−x}
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = ∫_0^∞ x e^{−x(y+1)} dx
Integration by parts is needed: u = x, dv = e^{−x(y+1)} dx → du = dx, v = −e^{−x(y+1)}/(y + 1)
f_Y(y) = [uv]_0^∞ − ∫_0^∞ v du = [−x e^{−x(y+1)}/(y + 1)]_0^∞ + ∫_0^∞ e^{−x(y+1)}/(y + 1) dx = 0 + [−e^{−x(y+1)}/(y + 1)²]_0^∞ = 1/(y + 1)²
f_{Y|X}(y|x) = f_{XY}(x, y)/f_X(x) = x e^{−x(y+1)}/e^{−x} = x e^{−xy}
f_{X|Y}(x|y) = f_{XY}(x, y)/f_Y(y) = x(y + 1)² e^{−x(y+1)}
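The integration-by-parts result f_Y(y) = 1/(y + 1)² can be double-checked numerically; a minimal sketch:

```python
import math

# Example 5.9: f(x, y) = x * exp(-x(y + 1)) for x, y >= 0.
# Check numerically that integrating out x gives f_Y(y) = 1 / (y + 1)**2.
def f_joint(x, y):
    return x * math.exp(-x * (y + 1))

def f_y_numeric(y, xmax=60.0, n=60000):
    h = xmax / n
    return h * sum(f_joint((i + 0.5) * h, y) for i in range(n))

for y in (0.0, 1.0, 3.0):
    assert abs(f_y_numeric(y) - 1 / (y + 1) ** 2) < 1e-4
print("f_Y(y) = 1/(y+1)^2 confirmed numerically")
```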
◈ Example) f_{X,Y}(x, y) = (1/y) e^{−x/y} e^{−y}, x ≥ 0, 0 < y < ∞
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = ∫_0^∞ (1/y) e^{−x/y} e^{−y} dx = (e^{−y}/y) ∫_0^∞ e^{−x/y} dx = e^{−y}
f_{X|Y}(x|y) = f_{XY}(x, y)/f_Y(y) = (1/y) e^{−x/y} e^{−y} / e^{−y} = (1/y) e^{−x/y}
E[X|Y = y] = ∫_0^∞ x f_{X|Y}(x|y) dx = (1/y) ∫_0^∞ x e^{−x/y} dx
Integration by parts is needed: u = x, dv = e^{−x/y} dx → du = dx, v = −y e^{−x/y}
= (1/y)([−xy e^{−x/y}]_0^∞ + y ∫_0^∞ e^{−x/y} dx) = (1/y)(0 + y·y) = y
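Since f_{X|Y}(x|y) = (1/y)e^{−x/y} is an exponential density with mean y, E[X|Y = y] = y; a quick numerical check:

```python
import math

# E[X | Y = y] for f_{X|Y}(x|y) = (1/y) exp(-x/y): an exponential pdf with mean y
def cond_mean(y, n=200000):
    xmax = 40.0 * y           # truncate the tail; the mass beyond is negligible
    h = xmax / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x * math.exp(-x / y) / y
    return h * total

for y in (0.5, 1.0, 2.0):
    assert abs(cond_mean(y) - y) < 1e-3
print("E[X | Y = y] = y confirmed numerically")
```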
◈ When are X and Y independent?
X and Y are independent if their joint PDF has the following factored form in rectangular coordinates:
f_{XY}(x, y) = constant × (x-factor) × (y-factor), a ≤ x ≤ b, c ≤ y ≤ d
◈ Definition of the covariance of two R.V.s X and Y:
Cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)]
= E[XY − μ_Y X − μ_X Y + μ_X μ_Y]
= E[XY] − μ_X μ_Y
If X and Y are independent, the covariance is zero.
◈ Correlation coefficient
A measure of how good a prediction of the value of one of the two random variables can be formed based on an observed value of the other:
ρ_XY = Cov(X, Y)/√(Var(X)Var(Y)) = σ_XY/(σ_X σ_Y), which takes values from −1 to 1.
What do positive and negative correlations mean?
◈ Example)
f_{XY}(x, y) = 25 e^{−5y}, 0 ≤ x ≤ 0.2, 0 ≤ y < ∞
f_{XY}(x, y) = 0, otherwise
This factors into a constant x-factor on [0, 0.2] and the y-factor e^{−5y}, so X and Y are independent.
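Because the pdf factors into a uniform x-part on [0, 0.2] and an exponential y-part, the covariance should come out zero; a Monte Carlo sketch (the sampler and seed are illustrative choices, not from the slide):

```python
import random

# f(x, y) = 25 exp(-5y) on 0 <= x <= 0.2, y >= 0 factors into
# Uniform(0, 0.2) in x times Exponential(rate 5) in y, so Cov(X, Y) ~ 0.
random.seed(1)
n = 200_000
xs = [random.uniform(0.0, 0.2) for _ in range(n)]
ys = [random.expovariate(5.0) for _ in range(n)]

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(cov)  # close to 0
```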
◈ Example) X = Hans's arrival time, Y = Ann's arrival time
- Probability that they meet: P[|X − Y| ≤ 5]
- Probability that Ann arrives before Hans: P[Y ≤ X]
(The original slide contained a figure showing these events as regions A–E in the (X, Y) plane, bounded by the lines Y = X + 5 and Y = X − 5: Ann arriving more than 5 minutes before Hans, within 5 minutes of Hans, and more than 5 minutes after Hans.)
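A Monte Carlo sketch of the two probabilities, under the assumption (not fixed by what survives of the figure) that Hans and Ann arrive independently and uniformly within the same 60-minute window:

```python
import random

# Meeting problem, assuming X (Hans) and Y (Ann) are independent
# and uniform on [0, 60] minutes.
random.seed(7)
n = 400_000
meet = before = 0
for _ in range(n):
    x = random.uniform(0, 60)   # Hans
    y = random.uniform(0, 60)   # Ann
    meet += abs(x - y) <= 5
    before += y <= x

print(meet / n)     # P[|X - Y| <= 5] = 1 - (55/60)^2 ≈ 0.1597
print(before / n)   # P[Y <= X] = 1/2 by symmetry
```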