Chapter 6
Functions of Random Variables
Method of Distribution Functions
• X1,…,Xn ~ f(x1,…,xn)
• U = g(X1,…,Xn); we want to obtain fU(u)
• Identify the region in (x1,…,xn)-space where U ≤ u
• Obtain FU(u) = P(U ≤ u) by integrating f(x1,…,xn) over the region where U ≤ u
• Differentiate: fU(u) = dFU(u)/du
Example – Uniform X
• Stores are located along a linear city; a store's location X has density f(x) = 0.05 for −10 ≤ x ≤ 10, 0 otherwise
• A courier whose office is located at 0 incurs a cost of U = 16X^2 when she delivers to a store located at X
$$U \le u \;\Leftrightarrow\; 16X^2 \le u \;\Leftrightarrow\; -\frac{\sqrt{u}}{4} \le X \le \frac{\sqrt{u}}{4}$$

$$F_U(u) = P(U \le u) = \int_{-\sqrt{u}/4}^{\sqrt{u}/4} 0.05\,dx = 0.05\left(\frac{\sqrt{u}}{4} + \frac{\sqrt{u}}{4}\right) = \frac{\sqrt{u}}{40}, \qquad 0 \le u \le 1600$$

$$f_U(u) = \frac{dF_U(u)}{du} = \frac{u^{-1/2}}{80}, \qquad 0 \le u \le 1600$$
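The closed-form CDF can be sanity-checked by simulation. A minimal sketch (the seed, sample size, and probe points u are arbitrary choices):

```python
import random
import math

# Monte Carlo check of the derived CDF F_U(u) = sqrt(u)/40 for
# U = 16 X^2 with X ~ Uniform(-10, 10).
random.seed(1)
n = 200_000
u_samples = [16 * random.uniform(-10, 10) ** 2 for _ in range(n)]

for u in (100.0, 400.0, 900.0):
    empirical = sum(v <= u for v in u_samples) / n
    analytic = math.sqrt(u) / 40   # F_U(u) = sqrt(u)/40, 0 <= u <= 1600
    print(f"u={u:6.0f}  empirical={empirical:.4f}  analytic={analytic:.4f}")
```

The empirical CDF should agree with sqrt(u)/40 to within Monte Carlo error (about ±0.003 at this sample size).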
Example – Sum of Exponentials
• X1, X2 independent Exponential(β)
• f(xi) = β^(−1) e^(−xi/β), xi > 0, β > 0, i = 1, 2
• f(x1, x2) = β^(−2) e^(−(x1+x2)/β), x1, x2 > 0
• U=X1+X2
$$U \le u \;\Leftrightarrow\; X_1 + X_2 \le u \;\Leftrightarrow\; X_1 \le u - X_2, \quad 0 \le X_2 \le u$$

$$F_U(u) = P(U \le u) = \int_0^u \int_0^{u-x_2} \beta^{-2} e^{-(x_1+x_2)/\beta}\,dx_1\,dx_2 = \int_0^u \beta^{-1} e^{-x_2/\beta}\left[1 - e^{-(u-x_2)/\beta}\right]dx_2$$

$$= \int_0^u \beta^{-1} e^{-x_2/\beta}\,dx_2 - \int_0^u \beta^{-1} e^{-u/\beta}\,dx_2 = \left(1 - e^{-u/\beta}\right) - \frac{u}{\beta}\,e^{-u/\beta}$$

$$f_U(u) = \frac{dF_U(u)}{du} = \frac{1}{\beta}e^{-u/\beta} - \frac{1}{\beta}e^{-u/\beta} + \frac{u}{\beta^2}e^{-u/\beta} = \frac{1}{\beta^2}\,u\,e^{-u/\beta}, \quad u > 0 \;\Rightarrow\; U \sim \mathrm{Gamma}(2,\,\beta)$$
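The Gamma(2, β) conclusion is easy to spot-check numerically: that distribution has mean 2β and variance 2β². A sketch (β = 3 and the sample size are arbitrary choices):

```python
import random

# Simulation check that U = X1 + X2, with X1, X2 iid Exponential(beta),
# behaves like Gamma(2, beta): E[U] = 2*beta, Var(U) = 2*beta^2.
random.seed(2)
beta, n = 3.0, 200_000
u = [random.expovariate(1 / beta) + random.expovariate(1 / beta) for _ in range(n)]

mean = sum(u) / n
var = sum((v - mean) ** 2 for v in u) / n
print(f"mean={mean:.3f} (theory {2 * beta}), var={var:.3f} (theory {2 * beta ** 2})")
```

Note that `random.expovariate` takes the rate 1/β, not the mean β.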
Method of Transformations
• X~fX(x)
• U=h(X) is either increasing or decreasing in X
• fU(u) = fX(x)|dx/du| where x = h⁻¹(u)
For the bivariate extension, with (X1, X2) → (U1, U2) one-to-one:

$$|J| = \left|\frac{\partial X_1}{\partial U_1}\frac{\partial X_2}{\partial U_2} - \frac{\partial X_1}{\partial U_2}\frac{\partial X_2}{\partial U_1}\right|, \qquad J = \det\begin{bmatrix} \dfrac{\partial X_1}{\partial U_1} & \dfrac{\partial X_1}{\partial U_2} \\[6pt] \dfrac{\partial X_2}{\partial U_1} & \dfrac{\partial X_2}{\partial U_2} \end{bmatrix}$$

$$f(u_1, u_2) = f(x_1, x_2)\,|J|, \qquad f_{U_1}(u_1) = \int f(u_1, u_2)\,du_2$$
Example
• fX(x) = 2x 0≤ x ≤ 1, 0 otherwise
• U=10+500X (increasing in x)
• x=(u-10)/500
• fX(x) = 2x = 2(u-10)/500 = (u-10)/250
• dx/du = d((u-10)/500)/du = 1/500
• fU(u) = [(u-10)/250]|1/500| = (u-10)/125000
10 ≤ u ≤ 510, 0 otherwise
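The transformed density can be verified against simulation: since F_X(x) = x², X can be drawn by inverse CDF as the square root of a uniform, and the resulting U = 10 + 500X should have CDF F_U(u) = ((u−10)/500)². A sketch (seed, sample size, and probe points arbitrary):

```python
import random

# Check f_U(u) = (u-10)/125000 on [10, 510] through its CDF,
# F_U(u) = ((u-10)/500)^2, via simulation.
# X has density 2x on [0,1], so F_X(x) = x^2 and X = sqrt(V), V ~ U(0,1).
random.seed(3)
n = 200_000
u_samples = [10 + 500 * random.random() ** 0.5 for _ in range(n)]

for u in (135.0, 260.0, 385.0):
    empirical = sum(v <= u for v in u_samples) / n
    analytic = ((u - 10) / 500) ** 2
    print(f"u={u:5.0f}  empirical={empirical:.4f}  analytic={analytic:.4f}")
```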
Method of Conditioning
• U=h(X1,X2)
• Find f(u|x2) by transformations (Fixing X2=x2)
• Obtain the joint density of U, X2:
• f(u,x2) = f(u|x2)f(x2)
• Obtain the marginal distribution of U by
integrating joint density over X2
$$f_U(u) = \int f(u \mid x_2)\, f(x_2)\,dx_2$$
Example (Problem 6.11)
• X1 ~ Beta(2, 2), X2 ~ Beta(3, 1), independent
• U=X1X2
• Fix X2=x2 and get f(u|x2)
$$f(x_1) = 6x_1(1 - x_1),\ 0 \le x_1 \le 1 \qquad f(x_2) = 3x_2^2,\ 0 \le x_2 \le 1$$

$$U = X_1 x_2 \;\Rightarrow\; X_1 = U/x_2, \qquad \frac{dX_1}{dU} = \frac{1}{x_2}$$

$$f(u \mid x_2) = 6\left(\frac{u}{x_2}\right)\left(1 - \frac{u}{x_2}\right)\frac{1}{x_2}, \quad 0 \le u \le x_2$$

$$f(u, x_2) = f(u \mid x_2)\,f(x_2) = 6\left(\frac{u}{x_2}\right)\left(1 - \frac{u}{x_2}\right)\frac{1}{x_2}\cdot 3x_2^2 = 18u\left(1 - \frac{u}{x_2}\right), \quad 0 \le u \le x_2 \le 1$$

$$f_U(u) = \int_u^1 18u\left(1 - \frac{u}{x_2}\right)dx_2 = \Big[18u\,x_2 - 18u^2\ln(x_2)\Big]_u^1 = 18u - 18u^2 + 18u^2\ln(u)$$

$$= 18u\left(1 - u + u\ln u\right), \quad 0 < u < 1$$
[Figure (Problem 6.11): density of U = X1X2, f(u), plotted with the conditional densities f(u|x2=.25), f(u|x2=.5), f(u|x2=.75) for 0 ≤ u ≤ 1]
Method of Moment-Generating Functions
• X,Y are two random variables
• CDF’s: FX(x) and FY(y)
• MGF’s: MX(t) and MY(t) exist and are equal for |t| < h, for some h > 0
• Then the CDF’s FX(x) and FY(y) are equal
• Three properties:
– Y = aX + b ⇒ MY(t) = E(e^(tY)) = E(e^(t(aX+b))) = e^(bt)E(e^((at)X)) = e^(bt)MX(at)
– X, Y independent ⇒ M(X+Y)(t) = MX(t)MY(t)
– M(X1,X2)(t1,t2) = E[e^(t1X1+t2X2)] = MX1(t1)MX2(t2) if X1, X2 are independent
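The first property can be checked numerically: with X ~ N(0,1), whose MGF is exp(t²/2), the MGF of Y = aX + b estimated by Monte Carlo should match e^(bt)·MX(at). A sketch (a = 2, b = 3, t = 0.25, and the sample size are arbitrary choices):

```python
import random
import math

# Numerical check of M_{aX+b}(t) = e^{bt} M_X(at) for X ~ N(0,1),
# whose MGF is M_X(t) = exp(t^2 / 2).
random.seed(5)
a, b, t, n = 2.0, 3.0, 0.25, 400_000
xs = [random.gauss(0, 1) for _ in range(n)]

mc = sum(math.exp(t * (a * x + b)) for x in xs) / n        # E[e^{tY}] by Monte Carlo
theory = math.exp(b * t) * math.exp((a * t) ** 2 / 2)      # e^{bt} M_X(at)
print(f"MC estimate {mc:.4f}  vs  theory {theory:.4f}")
```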
Sum of Independent Gammas
Let X1, …, Xn be independent with Xi ~ Gamma(αi, β) (common scale β), and let Y = X1 + ⋯ + Xn:

$$M_Y(t) = E\left[e^{tY}\right] = E\left[e^{t(X_1 + \cdots + X_n)}\right] = E\left[e^{tX_1}\right]\cdots E\left[e^{tX_n}\right] = M_{X_1}(t)\cdots M_{X_n}(t)$$

$$= (1 - \beta t)^{-\alpha_1}\cdots(1 - \beta t)^{-\alpha_n} = (1 - \beta t)^{-\sum_{i=1}^n \alpha_i}$$

$$\Rightarrow\; Y = \sum_{i=1}^n X_i \sim \mathrm{Gamma}\left(\sum_{i=1}^n \alpha_i,\; \beta\right)$$
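A simulation sketch of the gamma-sum result (the shapes, scale, and sample size below are arbitrary): summing independent Gamma(1.5, 2), Gamma(2, 2), Gamma(2.5, 2) draws should give Gamma(6, 2), with mean 6·2 = 12 and variance 6·2² = 24.

```python
import random

# Check: sum of independent Gammas with common scale beta is
# Gamma(sum of shapes, beta).  Here Gamma(6, 2): mean 12, variance 24.
random.seed(6)
shapes, beta, n = (1.5, 2.0, 2.5), 2.0, 200_000
y = [sum(random.gammavariate(a, beta) for a in shapes) for _ in range(n)]

mean = sum(y) / n
var = sum((v - mean) ** 2 for v in y) / n
print(f"mean={mean:.2f} (theory 12), var={var:.2f} (theory 24)")
```

`random.gammavariate(alpha, beta)` uses β as the scale, matching the (α, β) parameterization above.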
Linear Function of Independent Normals
Let X1, …, Xn be independent with Xi ~ N(μi, σi²), and let Y = a1X1 + ⋯ + anXn:

$$M_Y(t) = E\left[e^{tY}\right] = E\left[e^{t(a_1X_1 + \cdots + a_nX_n)}\right] = E\left[e^{ta_1X_1}\right]\cdots E\left[e^{ta_nX_n}\right] = M_{X_1}(a_1 t)\cdots M_{X_n}(a_n t)$$

$$= \exp\left(\mu_1 a_1 t + \frac{\sigma_1^2 a_1^2 t^2}{2}\right)\cdots\exp\left(\mu_n a_n t + \frac{\sigma_n^2 a_n^2 t^2}{2}\right) = \exp\left(\sum_{i=1}^n a_i\mu_i\; t + \frac{\sum_{i=1}^n a_i^2\sigma_i^2}{2}\; t^2\right)$$

$$\Rightarrow\; Y = \sum_{i=1}^n a_i X_i \sim \mathrm{Normal}\left(\sum_{i=1}^n a_i\mu_i,\; \sum_{i=1}^n a_i^2\sigma_i^2\right)$$
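A simulation sketch of the linear-combination result (the coefficients and parameters are arbitrary choices): with X1 ~ N(1, 4), X2 ~ N(−2, 9) independent and Y = 3X1 − X2, the theory gives Y ~ N(3·1 + (−1)(−2), 9·4 + 1·9) = N(5, 45).

```python
import random

# Check Y = 3*X1 - X2 with X1 ~ N(1, 2^2), X2 ~ N(-2, 3^2) independent:
# mean 3*1 + (-1)*(-2) = 5, variance 9*4 + 1*9 = 45.
random.seed(7)
n = 200_000
y = [3 * random.gauss(1, 2) - random.gauss(-2, 3) for _ in range(n)]

mean = sum(y) / n
var = sum((v - mean) ** 2 for v in y) / n
print(f"mean={mean:.3f} (theory 5), var={var:.2f} (theory 45)")
```

Note `random.gauss` takes the standard deviation, so N(1, 4) is `gauss(1, 2)`.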
Distribution of Z² (Z ~ N(0,1))

$$Z \sim N(0,1) \;\Rightarrow\; f_Z(z) = \frac{1}{\sqrt{2\pi}}\,e^{-z^2/2}, \quad -\infty < z < \infty$$

$$M_{Z^2}(t) = \int_{-\infty}^{\infty} e^{tz^2}\,\frac{1}{\sqrt{2\pi}}\,e^{-z^2/2}\,dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{z^2}{2}(1-2t)}\,dz = 2\int_0^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{z^2}{2}(1-2t)}\,dz \quad \text{(integrand symmetric about 0)}$$

Let u = z², so z = u^{1/2} and dz = ½ u^{−1/2} du:

$$2\int_0^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{u}{2}(1-2t)}\,\frac{1}{2}\,u^{-1/2}\,du = \frac{1}{\sqrt{2\pi}}\int_0^{\infty} u^{\frac{1}{2}-1}\,e^{-u\big/\frac{2}{1-2t}}\,du = \frac{1}{\sqrt{2\pi}}\,\Gamma(1/2)\left(\frac{2}{1-2t}\right)^{1/2} = (1-2t)^{-1/2}$$

(The constants cancel since Γ(1/2) = √π, so Γ(1/2)·√2 / √(2π) = 1.)

$$\Rightarrow\; Z^2 \sim \mathrm{Gamma}(1/2,\,2) = \chi^2_1$$

Notes:
$$\int_0^{\infty} y^{\alpha - 1}\,e^{-y/\beta}\,dy = \Gamma(\alpha)\,\beta^{\alpha}, \qquad \Gamma(1/2) = \sqrt{\pi}$$

$$Z_1, \ldots, Z_n \text{ mutually independent} \;\Rightarrow\; \sum_{i=1}^n Z_i^2 \sim \mathrm{Gamma}(n/2,\,2) = \chi^2_n$$
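Both facts are easy to spot-check: Z² ~ χ²₁ has mean 1 and variance 2, and a sum of five independent Z²'s ~ χ²₅ has mean 5. A simulation sketch (seed and sample size arbitrary):

```python
import random

# Check Z^2 ~ chi-square(1) = Gamma(1/2, 2): mean 1, variance 2;
# and that a sum of n=5 independent Z_i^2 has mean 5 (chi-square(5)).
random.seed(8)
n = 200_000
z2 = [random.gauss(0, 1) ** 2 for _ in range(n)]
mean = sum(z2) / n
var = sum((v - mean) ** 2 for v in z2) / n
print(f"Z^2: mean={mean:.3f} (theory 1), var={var:.3f} (theory 2)")

s5 = [sum(random.gauss(0, 1) ** 2 for _ in range(5)) for _ in range(n)]
print(f"sum of 5: mean={sum(s5) / n:.3f} (theory 5)")
```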
Distributions of X̄ and S² (Normal Data)

$$X_1, \ldots, X_n \sim NID(\mu, \sigma^2) \qquad (NID = \text{Normal and Independently Distributed})$$

Sample mean:
$$\bar X = \frac{\sum_{i=1}^n X_i}{n} = \frac{1}{n}\sum_{i=1}^n X_i = \sum_{i=1}^n a_i X_i, \qquad a_i = \frac{1}{n},\ i = 1, \ldots, n$$

Note:
$$\sum_{i=1}^n \left(X_i - \bar X\right) = \sum_{i=1}^n X_i - n\bar X = \sum_{i=1}^n X_i - \sum_{i=1}^n X_i = 0$$

Sample variance:
$$S^2 = \frac{\sum_{i=1}^n \left(X_i - \bar X\right)^2}{n-1}$$

Alternative representation of S²:
$$\frac{1}{2n(n-1)}\sum_{i=1}^n\sum_{j=1}^n (X_i - X_j)^2 = \frac{1}{2n(n-1)}\sum_{i=1}^n\sum_{j=1}^n \left[\left(X_i - \bar X\right) - \left(X_j - \bar X\right)\right]^2$$

$$= \frac{1}{2n(n-1)}\sum_{i=1}^n\sum_{j=1}^n \left[\left(X_i - \bar X\right)^2 + \left(X_j - \bar X\right)^2 - 2\left(X_i - \bar X\right)\left(X_j - \bar X\right)\right]$$

$$= \frac{1}{2n(n-1)}\left[n\sum_{i=1}^n \left(X_i - \bar X\right)^2 + n\sum_{j=1}^n \left(X_j - \bar X\right)^2 - 2\sum_{i=1}^n \left(X_i - \bar X\right)\sum_{j=1}^n \left(X_j - \bar X\right)\right]$$

$$= \frac{1}{2n(n-1)}\left[n(n-1)S^2 + n(n-1)S^2 - 2(0)(0)\right] = \frac{2n(n-1)S^2}{2n(n-1)} = S^2$$
Independence of X̄ and S² (Normal Data)
Independence of T = X1 + X2 and D = X2 − X1 for the case n = 2

$$X_1, X_2 \sim NID(\mu, \sigma^2)$$

$$T = X_1 + X_2 \sim N(2\mu,\,2\sigma^2) \;\Rightarrow\; M_T(t) = \exp\left(2\mu t + \frac{2\sigma^2 t^2}{2}\right) = \exp\{2\mu t + \sigma^2 t^2\}$$

$$D = X_2 - X_1 \sim N(0,\,2\sigma^2) \;\Rightarrow\; M_D(t) = \exp\left(0\,t + \frac{2\sigma^2 t^2}{2}\right) = \exp\{\sigma^2 t^2\}$$

$$M_{T,D}(t_1, t_2) = E\left(e^{t_1 T + t_2 D}\right) = E\left[\exp\left(t_1(X_1 + X_2) + t_2(X_2 - X_1)\right)\right] = E\left[\exp\left(X_1(t_1 - t_2) + X_2(t_1 + t_2)\right)\right]$$

$$\stackrel{ind}{=} E\left[\exp\left(X_1(t_1 - t_2)\right)\right]\,E\left[\exp\left(X_2(t_1 + t_2)\right)\right]$$

$$= \exp\left(\mu(t_1 - t_2) + \frac{\sigma^2(t_1 - t_2)^2}{2}\right)\exp\left(\mu(t_1 + t_2) + \frac{\sigma^2(t_1 + t_2)^2}{2}\right)$$

$$= \exp\left(\mu(t_1 - t_2 + t_1 + t_2) + \frac{\sigma^2\left(t_1^2 + t_2^2 - 2t_1t_2 + t_1^2 + t_2^2 + 2t_1t_2\right)}{2}\right)$$

$$= \exp\left(2\mu t_1 + \frac{2\sigma^2 t_1^2}{2} + \frac{2\sigma^2 t_2^2}{2}\right) = \exp\left(2\mu t_1 + \sigma^2 t_1^2\right)\exp\left(\sigma^2 t_2^2\right) = M_T(t_1)\,M_D(t_2)$$

Thus T = X1 + X2 and D = X2 − X1 are independent normals. Since X̄ = T/2 is a function of T alone and S² = D²/2 is a function of D alone, X̄ and S² are independent (for n = 2).
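A simulation sketch of the n = 2 case (μ = 10, σ = 2, and the sample size are arbitrary choices): T and D are jointly normal, so zero covariance implies independence, and the sample covariance should be near 0.

```python
import random

# Simulation check for n=2: T = X1+X2 and D = X2-X1 are uncorrelated
# (and, being jointly normal, independent), so Xbar = T/2 and
# S^2 = D^2/2 are independent.
random.seed(9)
n = 200_000
pairs = [(random.gauss(10, 2), random.gauss(10, 2)) for _ in range(n)]
t = [x1 + x2 for x1, x2 in pairs]
d = [x2 - x1 for x1, x2 in pairs]

mt, md = sum(t) / n, sum(d) / n
cov = sum((a - mt) * (b - md) for a, b in zip(t, d)) / n
print(f"Cov(T, D) = {cov:.4f} (theory 0)")
```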
Distribution of S² (P.1)

$$X_i \sim NID(\mu, \sigma^2) \;\Rightarrow\; Z_i = \frac{X_i - \mu}{\sigma} \sim N(0,1) \;\Rightarrow\; Z_i^2 \sim \chi^2_1$$

$$\sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2_n = \mathrm{Gamma}(n/2,\,2)$$

$$\sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 = \frac{1}{\sigma^2}\sum_{i=1}^n \left(X_i - \mu\right)^2 = \frac{1}{\sigma^2}\sum_{i=1}^n \left[\left(X_i - \bar X\right) + \left(\bar X - \mu\right)\right]^2$$

$$= \frac{1}{\sigma^2}\left[\sum_{i=1}^n \left(X_i - \bar X\right)^2 + n\left(\bar X - \mu\right)^2 + 2\left(\bar X - \mu\right)\sum_{i=1}^n \left(X_i - \bar X\right)\right]$$

$$= \frac{(n-1)S^2}{\sigma^2} + \frac{n\left(\bar X - \mu\right)^2}{\sigma^2} + 0$$

Now, X̄ and S² are independent:

$$M_{\frac{(n-1)S^2}{\sigma^2} + \frac{n(\bar X - \mu)^2}{\sigma^2}}(t) = M_{\frac{(n-1)S^2}{\sigma^2}}(t)\;M_{\frac{n(\bar X - \mu)^2}{\sigma^2}}(t) = M_{\frac{1}{\sigma^2}\sum_{i=1}^n (X_i - \mu)^2}(t) = (1-2t)^{-n/2}$$
Distribution of S² (P.2)

Now consider n(X̄ − μ)²/σ²:

$$\bar X \sim N\left(\mu,\,\frac{\sigma^2}{n}\right) \;\Rightarrow\; Z_{\bar X} = \frac{\bar X - \mu}{\sigma/\sqrt{n}} = \frac{\sqrt{n}\left(\bar X - \mu\right)}{\sigma} \sim N(0,1)$$

$$\Rightarrow\; \frac{n\left(\bar X - \mu\right)^2}{\sigma^2} \sim \chi^2_1 \;\Rightarrow\; M_{\frac{n(\bar X - \mu)^2}{\sigma^2}}(t) = (1-2t)^{-1/2}$$

$$\Rightarrow\; M_{\frac{(n-1)S^2}{\sigma^2}}(t) = \frac{M_{\frac{1}{\sigma^2}\sum_{i=1}^n (X_i - \mu)^2}(t)}{M_{\frac{n(\bar X - \mu)^2}{\sigma^2}}(t)} = \frac{(1-2t)^{-n/2}}{(1-2t)^{-1/2}} = (1-2t)^{-(n-1)/2}$$

$$\Rightarrow\; \frac{(n-1)S^2}{\sigma^2} \sim \mathrm{Gamma}\left(\frac{n-1}{2},\,2\right) = \chi^2_{n-1}$$
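The conclusion (n−1)S²/σ² ~ χ²ₙ₋₁ can be spot-checked by simulation: for n = 8 the statistic should have mean 7 and variance 14. A sketch (μ = 5, σ = 3, and the replication count are arbitrary choices):

```python
import random
import statistics

# Check that (n-1)S^2/sigma^2 ~ chi-square(n-1) for normal samples:
# with n=8, mean should be 7 and variance 14.
random.seed(10)
reps, n, mu, sigma = 100_000, 8, 5.0, 3.0
stats = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    stats.append((n - 1) * statistics.variance(xs) / sigma ** 2)

mean = sum(stats) / reps
var = sum((v - mean) ** 2 for v in stats) / reps
print(f"mean={mean:.3f} (theory 7), var={var:.3f} (theory 14)")
```

`statistics.variance` is the sample variance with the n−1 denominator, matching the S² defined above.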
Summary of Results
• X1,…,Xn ≡ random sample from a N(μ, σ²) population
• In practice, we observe the sample mean and sample variance (not the population values μ, σ²)
• We use the sample values (and their distributions) to make inferences about the population values

$$\bar X = \frac{\sum_{i=1}^n X_i}{n} \sim N\left(\mu,\,\frac{\sigma^2}{n}\right) \qquad S^2 = \frac{\sum_{i=1}^n \left(X_i - \bar X\right)^2}{n-1}, \qquad \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$$

X̄ and S² are independent, so:

$$t = \frac{\bar X - \mu}{S/\sqrt{n}} = \frac{\dfrac{\bar X - \mu}{\sigma/\sqrt{n}}}{\sqrt{\dfrac{(n-1)S^2}{\sigma^2}\Big/(n-1)}} = \frac{Z}{\sqrt{\chi^2_{n-1}\big/(n-1)}} \sim t_{n-1}$$

(See the derivation using the method of conditioning in the .ppt presentation for the t and F distributions.)
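A simulation sketch of the t result (n = 10, μ = 50, σ = 4, and the replication count are arbitrary): the statistic should have mean 0 and variance (n−1)/(n−3) = 9/7 ≈ 1.286, the variance of a t distribution with 9 degrees of freedom.

```python
import random
import statistics
import math

# Check that t = (Xbar - mu) / (S / sqrt(n)) follows t with n-1 df:
# for n=10, mean 0 and variance (n-1)/(n-3) = 9/7.
random.seed(11)
reps, n, mu, sigma = 100_000, 10, 50.0, 4.0
ts = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s = statistics.stdev(xs)                      # sample SD (n-1 denominator)
    ts.append((xbar - mu) / (s / math.sqrt(n)))

mean = sum(ts) / reps
var = sum((v - mean) ** 2 for v in ts) / reps
print(f"mean={mean:.3f} (theory 0), var={var:.3f} (theory {9 / 7:.3f})")
```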
Order Statistics
• X1, X2, ..., Xn independent, identically distributed continuous RV’s
• F(x) = P(X ≤ x): cumulative distribution function
• f(x) = dF(x)/dx: probability density function
• For the extremes: gn(x) = n[F(x)]^(n−1) f(x) (maximum) and g1(x) = n[1−F(x)]^(n−1) f(x) (minimum)
• Example: Xi ~ U(0,1), n = 5, so F(x) = x and f(x) = 1:

Maximum: g5(x) = 5x^4 (1) = 5x^4 for 0 ≤ x ≤ 1, 0 otherwise
Minimum: g1(x) = 5(1)(1−x)^4 = 5(1−x)^4 for 0 ≤ x ≤ 1, 0 otherwise
[Figure: pdf f(x) with the order-statistic densities gn(x) (maximum) and g1(x) (minimum) for U(0,1), n = 5]
Distributions of Order Statistics
• Consider case with n=4
• X(1) ≤ x corresponds to one of the following cases:
• Exactly one of the Xi is less than x
• Exactly two are less than x
• Exactly three are less than x
• All four are less than x
• X(3) ≤ x corresponds to one of the following cases:
• Exactly three are less than x
• All four are less than x
• The number of Xi below x is modeled as Binomial: n trials, p = F(x)
Case with n=4
$$P\left(X_{(1)} \le x\right) = \binom{4}{1}[F(x)]^1[1-F(x)]^3 + \binom{4}{2}[F(x)]^2[1-F(x)]^2 + \binom{4}{3}[F(x)]^3[1-F(x)]^1 + \binom{4}{4}[F(x)]^4[1-F(x)]^0$$

$$= 1 - [1 - F(x)]^4$$

$$P\left(X_{(3)} \le x\right) = \binom{4}{3}[F(x)]^3[1-F(x)]^1 + \binom{4}{4}[F(x)]^4[1-F(x)]^0 = 4F(x)^3 - 4F(x)^4 + F(x)^4 = 4F(x)^3 - 3F(x)^4$$

$$g_3(x) = \frac{d}{dx}P\left(X_{(3)} \le x\right) = 12F(x)^2 f(x) - 12F(x)^3 f(x) = 12\,f(x)\,F(x)^2\left(1 - F(x)\right)$$
General Case (Sample of size n)
$$g_j(x) = \frac{n!}{(j-1)!\,(n-j)!}\,[F(x)]^{j-1}\,[1-F(x)]^{n-j}\,f(x), \qquad 1 \le j \le n$$
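The general formula can be checked against brute-force simulation: integrate g_j numerically and compare with the empirical CDF of the j-th smallest value in sorted samples. A sketch for U(0,1), n = 5, j = 3 (the probe point x = 0.5 and sample sizes are arbitrary):

```python
import random
import math

# Check the order-statistic density for U(0,1), n=5, j=3:
# g_j(x) = n!/((j-1)!(n-j)!) * x^(j-1) * (1-x)^(n-j).
n, j = 5, 3

def g(x):
    c = math.factorial(n) / (math.factorial(j - 1) * math.factorial(n - j))
    return c * x ** (j - 1) * (1 - x) ** (n - j)

# Analytic P(X_(3) <= 0.5) by midpoint-rule integration of g over (0, 0.5).
m = 100_000
analytic = sum(g(0.5 * (k + 0.5) / m) for k in range(m)) * 0.5 / m

# Empirical P(X_(3) <= 0.5) from sorted uniform samples of size 5.
random.seed(12)
reps = 200_000
empirical = sum(sorted(random.random() for _ in range(n))[j - 1] <= 0.5
                for _ in range(reps)) / reps
print(f"P(X_(3) <= 0.5): empirical={empirical:.4f}  analytic={analytic:.4f}")
```

By symmetry the median of five uniforms satisfies P(X₍₃₎ ≤ 0.5) = 0.5, so both numbers should be close to 0.5.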
[Figure: pdf f(x) with the order-statistic densities g1(x), g2(x), g3(x), g4(x), g5(x) for U(0,1), n = 5]