Homework 1 STAT 6202

Yang Liu
Instructor: Professor Tapan K. Nayak

January 26, 2019

1.

Let $X_1, \dots, X_n$ be i.i.d. Bernoulli variables with success probability $\theta$, where $n > 2$, and let $T = \sum_{i=1}^{n} X_i$. Derive the conditional distribution of $X_1, \dots, X_n$ given $T = t$.
Proof. Since $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} \text{Bernoulli}(\theta)$, we have $T = \sum_{i=1}^{n} X_i \sim \text{Binomial}(n, \theta)$ and
\[
P(X_1 = x_1, \dots, X_n = x_n) = \prod_{i=1}^{n} \theta^{x_i} (1-\theta)^{1-x_i}.
\]
For $x_1, \dots, x_n \in \{0, 1\}$ with $\sum_{i=1}^{n} x_i = t$,
\[
P\left(X_1 = x_1, \dots, X_n = x_n, \; T = \sum_{i=1}^{n} X_i = t\right) = \theta^t (1-\theta)^{n-t},
\]
so
\[
P\left(X_1 = x_1, \dots, X_n = x_n \,\middle|\, \sum_{i=1}^{n} X_i = t\right)
= \frac{\theta^t (1-\theta)^{n-t}}{\binom{n}{t}\,\theta^t (1-\theta)^{n-t}}
= \frac{1}{\binom{n}{t}}. \qquad \square
\]
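To see the result concretely, the sketch below (not part of the original solution) enumerates all binary sequences for small, arbitrarily chosen values of $n$, $t$, $\theta$ and confirms that every sequence with $\sum_i x_i = t$ has conditional probability $1/\binom{n}{t}$.

```python
# Numerical check of Problem 1: the conditional distribution of (X_1, ..., X_n)
# given T = t is uniform over the C(n, t) binary sequences with t ones.
from itertools import product
from math import comb

n, t, theta = 4, 2, 0.3  # arbitrary example values

def joint_pmf(x):
    """Joint pmf of an i.i.d. Bernoulli(theta) vector."""
    return theta ** sum(x) * (1 - theta) ** (len(x) - sum(x))

# P(T = t), obtained by summing the joint pmf over sequences with t ones
p_t = sum(joint_pmf(x) for x in product([0, 1], repeat=n) if sum(x) == t)

for x in product([0, 1], repeat=n):
    if sum(x) == t:
        assert abs(joint_pmf(x) / p_t - 1 / comb(n, t)) < 1e-12

print("each such sequence has conditional probability", 1 / comb(n, t))
```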
2.
Suppose $X_1$ and $X_2$ are i.i.d. $\text{Poisson}(\theta)$ random variables and let $T = X_1 + 2X_2$.

(a) Find the conditional distribution of (X1 , X2 ) given T = 7.

(b) For θ = 1 and θ = 2, respectively, calculate all probabilities in the above conditional
distribution and present the two conditional distributions numerically.
Proof. (a) Since $X_1, X_2 \overset{\text{i.i.d.}}{\sim} \text{Poisson}(\theta)$, we have

{X1 + 2X2 = 7} = {(X1 = 1, X2 = 3), (X1 = 3, X2 = 2), (X1 = 5, X2 = 1), (X1 = 7, X2 = 0)}

and (X1 = 1, X2 = 3), (X1 = 3, X2 = 2), (X1 = 5, X2 = 1), (X1 = 7, X2 = 0) are mutually
exclusive, then

\[
\begin{aligned}
P(T = 7) &= P(X_1 = 1, X_2 = 3) + P(X_1 = 3, X_2 = 2) + P(X_1 = 5, X_2 = 1) + P(X_1 = 7, X_2 = 0) \\
&= e^{-\theta}\frac{\theta}{1!} \cdot e^{-\theta}\frac{\theta^3}{3!}
 + e^{-\theta}\frac{\theta^3}{3!} \cdot e^{-\theta}\frac{\theta^2}{2!}
 + e^{-\theta}\frac{\theta^5}{5!} \cdot e^{-\theta}\frac{\theta}{1!}
 + e^{-\theta}\frac{\theta^7}{7!} \cdot e^{-\theta} \\
&= \frac{\theta^4 e^{-2\theta}}{6}\left(1 + \frac{\theta}{2} + \frac{\theta^2}{20} + \frac{\theta^3}{840}\right)
\end{aligned}
\]

Then the conditional distribution of (X1 , X2 ) given T = 7 is

\[
\begin{aligned}
P(X_1 = 1, X_2 = 3 \mid T = 7) &= \frac{P(X_1 = 1, X_2 = 3)}{P(T = 7)}
= \frac{\frac{\theta^4 e^{-2\theta}}{6}}{\frac{\theta^4 e^{-2\theta}}{6}\left(1 + \frac{\theta}{2} + \frac{\theta^2}{20} + \frac{\theta^3}{840}\right)}
= \frac{840}{840 + 420\theta + 42\theta^2 + \theta^3} \\[4pt]
P(X_1 = 3, X_2 = 2 \mid T = 7) &= \frac{P(X_1 = 3, X_2 = 2)}{P(T = 7)}
= \frac{\frac{\theta^4 e^{-2\theta}}{6}\cdot\frac{\theta}{2}}{\frac{\theta^4 e^{-2\theta}}{6}\left(1 + \frac{\theta}{2} + \frac{\theta^2}{20} + \frac{\theta^3}{840}\right)}
= \frac{420\theta}{840 + 420\theta + 42\theta^2 + \theta^3} \\[4pt]
P(X_1 = 5, X_2 = 1 \mid T = 7) &= \frac{P(X_1 = 5, X_2 = 1)}{P(T = 7)}
= \frac{\frac{\theta^4 e^{-2\theta}}{6}\cdot\frac{\theta^2}{20}}{\frac{\theta^4 e^{-2\theta}}{6}\left(1 + \frac{\theta}{2} + \frac{\theta^2}{20} + \frac{\theta^3}{840}\right)}
= \frac{42\theta^2}{840 + 420\theta + 42\theta^2 + \theta^3} \\[4pt]
P(X_1 = 7, X_2 = 0 \mid T = 7) &= \frac{P(X_1 = 7, X_2 = 0)}{P(T = 7)}
= \frac{\frac{\theta^4 e^{-2\theta}}{6}\cdot\frac{\theta^3}{840}}{\frac{\theta^4 e^{-2\theta}}{6}\left(1 + \frac{\theta}{2} + \frac{\theta^2}{20} + \frac{\theta^3}{840}\right)}
= \frac{\theta^3}{840 + 420\theta + 42\theta^2 + \theta^3}
\end{aligned}
\]
(b) The conditional distribution of $(X_1, X_2)$ given $T = 7$ is given in Table 1.

Table 1: Conditional distribution of $(X_1, X_2)$ given $T = 7$

  P(X1 = x1, X2 = x2 | T = 7)   (x1, x2) = (1, 3)   (3, 2)      (5, 1)      (7, 0)
  θ = 1                         840/1303            420/1303    42/1303     1/1303
  θ = 2                         840/1856            840/1856    168/1856    8/1856

3.
Let $X_1, \dots, X_n$ be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$. Let $\bar{X}$ denote the sample mean and $V = \sum_{i=1}^{n} (X_i - \bar{X})^2$.

(a) Derive the expected value of X̄ and V .

(b) Further suppose that $X_1, \dots, X_n$ are normally distributed. Let $A_{n \times n} = ((a_{ij}))$ be an orthogonal matrix whose first row is $(\tfrac{1}{\sqrt{n}}, \dots, \tfrac{1}{\sqrt{n}})$ and let $Y = AX$, where $Y = (Y_1, \dots, Y_n)'$ and $X = (X_1, \dots, X_n)'$ are (column) vectors. (It is not necessary to know $a_{ij}$ for $i = 2, \dots, n$, $j = 1, \dots, n$ for answering the following questions.)

(i) Find $\sum_{j=1}^{n} a_{ij}$ for $i = 1, \dots, n$ and show that $\sum_{i=1}^{n} Y_i^2 = \sum_{i=1}^{n} X_i^2$. (Use properties of orthogonal matrices.)
(ii) Express $\bar{X}$ and $V$ in terms (or as functions) of $Y_1, \dots, Y_n$.
(iii) Use (only) the transformation-of-variables approach to find the joint distribution of $Y_1, \dots, Y_n$. Are $Y_1, \dots, Y_n$ independently distributed, and what are their marginal distributions?
(iv) Prove that $\bar{X}$ and $V$ are independent and give their marginal distributions.

Proof. (a) Since $E[X_i] = \mu$ and $\mathrm{Var}[X_i] = \sigma^2$ for $i = 1, \dots, n$,
\[
E[\bar{X}] = E\left[\frac{\sum_{i=1}^{n} X_i}{n}\right] = \frac{\sum_{i=1}^{n} E[X_i]}{n} = \frac{n\mu}{n} = \mu
\]
\[
\mathrm{Var}[\bar{X}] = E\left[(\bar{X} - \mu)^2\right] = \mathrm{Var}\left[\frac{\sum_{i=1}^{n} X_i}{n}\right] = \frac{\sum_{i=1}^{n} \mathrm{Var}[X_i]}{n^2} = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}
\]
Since $\sum_{i=1}^{n} (X_i - \mu)(\bar{X} - \mu) = (\bar{X} - \mu)\sum_{i=1}^{n} (X_i - \mu) = n(\bar{X} - \mu)^2$,
\[
\begin{aligned}
E[V] &= E\left[\sum_{i=1}^{n} (X_i - \bar{X})^2\right] = E\left[\sum_{i=1}^{n} \left((X_i - \mu) - (\bar{X} - \mu)\right)^2\right] \\
&= E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] + nE\left[(\bar{X} - \mu)^2\right] - 2E\left[\sum_{i=1}^{n} (X_i - \mu)(\bar{X} - \mu)\right] \\
&= E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] - nE\left[(\bar{X} - \mu)^2\right] \\
&= n\,\mathrm{Var}[X_i] - n\,\mathrm{Var}[\bar{X}] = n\sigma^2 - n \cdot \frac{\sigma^2}{n} = (n-1)\sigma^2
\end{aligned}
\]
Alternatively, since
\[
E[\bar{X}^2] = \mathrm{Var}[\bar{X}] + \left(E[\bar{X}]\right)^2 = \frac{\sigma^2}{n} + \mu^2,
\]
\[
\begin{aligned}
E[V] &= E\left[\sum_{i=1}^{n} (X_i - \bar{X})^2\right] = E\left[\sum_{i=1}^{n} X_i^2 - 2\bar{X}\sum_{i=1}^{n} X_i + n\bar{X}^2\right] = E\left[\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right] \\
&= n\left(\sigma^2 + \mu^2\right) - n\left(\frac{\sigma^2}{n} + \mu^2\right) = (n-1)\sigma^2
\end{aligned}
\]
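Before moving to (b), here is a small Monte Carlo sanity check of part (a); it is not part of the original solution, the normal distribution and the values of µ, σ, n, and the number of replications are arbitrary choices, and the agreement holds only up to simulation error.

```python
# Empirical check: E[X̄] = mu, Var[X̄] = sigma^2 / n, E[V] = (n - 1) * sigma^2.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000

x = rng.normal(mu, sigma, size=(reps, n))   # any distribution with mean mu, variance sigma^2 works
xbar = x.mean(axis=1)
v = ((x - xbar[:, None]) ** 2).sum(axis=1)

print(xbar.mean(), "vs", mu)                 # E[X̄] = mu
print(xbar.var(), "vs", sigma ** 2 / n)      # Var[X̄] = sigma^2 / n = 0.9
print(v.mean(), "vs", (n - 1) * sigma ** 2)  # E[V] = (n - 1) * sigma^2 = 81
```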

(b) (i) Due to the orthogonality of $A$, $A'A = AA' = I_{n \times n}$ (the $n \times n$ identity matrix). Let $A = (a_{1\cdot}, \dots, a_{n\cdot})'$, where $a_{j\cdot}$ is the $j$th row vector. Then for $i, j = 1, \dots, n$ with $i \ne j$,
\[
a_{i\cdot} a_{i\cdot}' = 1 \quad \text{and} \quad a_{i\cdot} a_{j\cdot}' = 0.
\]
Since $a_{1j} = \frac{1}{\sqrt{n}}$ for every $j$, $\sum_{j=1}^{n} a_{1j} = \frac{n}{\sqrt{n}} = \sqrt{n}$, and for $i = 2, \dots, n$,
\[
0 = a_{i\cdot} a_{1\cdot}' = \sum_{j=1}^{n} a_{ij} \cdot \frac{1}{\sqrt{n}} = \frac{\sum_{j=1}^{n} a_{ij}}{\sqrt{n}},
\]
so $\sum_{j=1}^{n} a_{ij} = \sqrt{n}$ for $i = 1$ and $\sum_{j=1}^{n} a_{ij} = 0$ for $i = 2, \dots, n$. Moreover, using $A'A = I$,
\[
\sum_{i=1}^{n} Y_i^2 = Y'Y = (AX)'(AX) = X'A'AX = X'X = \sum_{i=1}^{n} X_i^2.
\]
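These facts can also be checked numerically. The sketch below (not part of the original solution) builds one orthogonal matrix $A$ whose first row is $(1/\sqrt{n}, \dots, 1/\sqrt{n})$ via a QR factorization (an assumed construction; any orthogonal completion of the first row would do) and verifies the row sums and the preservation of the sum of squares; $n$ and the random data are arbitrary choices.

```python
# Numerical illustration of (b)(i): row sums of A and sum-of-squares preservation.
import numpy as np

rng = np.random.default_rng(1)
n = 6

# QR of a matrix whose first column is the all-ones vector gives an orthonormal
# basis whose first vector is proportional to (1, ..., 1); its transpose is A.
m = np.column_stack([np.ones(n), rng.normal(size=(n, n - 1))])
q, _ = np.linalg.qr(m)
a = np.sign(q[0, 0]) * q.T          # fix the sign so the first row is +1/sqrt(n)

print(np.allclose(a[0], 1 / np.sqrt(n)))   # first row is (1/sqrt(n), ..., 1/sqrt(n))
print(np.allclose(a @ a.T, np.eye(n)))     # A is orthogonal

row_sums = a.sum(axis=1)
print(np.isclose(row_sums[0], np.sqrt(n)), np.allclose(row_sums[1:], 0))  # sqrt(n), then zeros

x = rng.normal(size=n)
y = a @ x
print(np.isclose((y ** 2).sum(), (x ** 2).sum()))  # sum Y_i^2 = sum X_i^2
```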
(ii) Note that $Y_1 = \sum_{i=1}^{n} \frac{1}{\sqrt{n}} X_i = \frac{\sum_{i=1}^{n} X_i}{\sqrt{n}} = \sqrt{n}\,\bar{X}$, so
\[
\sum_{i=2}^{n} Y_i^2 = Y'Y - Y_1^2 = \sum_{i=1}^{n} X_i^2 - n\bar{X}^2 = \sum_{i=1}^{n} (X_i - \bar{X})^2.
\]
Therefore $\bar{X} = \frac{Y_1}{\sqrt{n}}$ and $V = \sum_{i=1}^{n} (X_i - \bar{X})^2 = \sum_{i=2}^{n} Y_i^2$.
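The two identities in (ii) can be verified on any sample; the short sketch below (not part of the original solution) reuses the same assumed QR construction of $A$ as above, with an arbitrary $n$ and random data.

```python
# Numerical check of (b)(ii): X̄ = Y_1 / sqrt(n) and V = sum_{i>=2} Y_i^2.
import numpy as np

rng = np.random.default_rng(2)
n = 8
m = np.column_stack([np.ones(n), rng.normal(size=(n, n - 1))])
q, _ = np.linalg.qr(m)
a = np.sign(q[0, 0]) * q.T          # orthogonal, first row (1/sqrt(n), ..., 1/sqrt(n))

x = rng.normal(size=n)
y = a @ x

print(np.isclose(x.mean(), y[0] / np.sqrt(n)))                      # X̄ = Y_1 / sqrt(n)
print(np.isclose(((x - x.mean()) ** 2).sum(), (y[1:] ** 2).sum()))  # V = sum_{i>=2} Y_i^2
```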
(iii) Since $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} N(\mu, \sigma^2)$,
\[
f_{X_1, \dots, X_n}(x_1, \dots, x_n) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x_i - \mu)^2}{2\sigma^2}}
= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{\sum_{i=1}^{n}(x_i - \mu)^2}{2\sigma^2}}
= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{(x - \mathbf{1}\mu)'(x - \mathbf{1}\mu)}{2\sigma^2}},
\]
where $\mathbf{1} = (1, \dots, 1)'$. Since $Y = AX$ and $A$ is orthogonal, $X = A'AX = A'Y$, so the Jacobian matrix of the transformation is $\frac{\partial x}{\partial y} = A'$ and $\left|\det(A')\right| = 1$ (because $\det(A')\det(A) = \det(A'A) = 1$). Then we have
\[
\begin{aligned}
f_{Y_1, \dots, Y_n}(y_1, \dots, y_n) &= f_{X_1, \dots, X_n}(x_1, \dots, x_n)\left|\det\!\left(\frac{\partial x}{\partial y}\right)\right| \Bigg|_{x = A'y} \\
&= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{(A'y - \mathbf{1}\mu)'(A'y - \mathbf{1}\mu)}{2\sigma^2}} \\
&= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{(y - A\mathbf{1}\mu)'AA'(y - A\mathbf{1}\mu)}{2\sigma^2}} \\
&= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{(y - A\mathbf{1}\mu)'(y - A\mathbf{1}\mu)}{2\sigma^2}} \\
&= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{\sum_{i=1}^{n}\left(y_i - a_{i\cdot}\mathbf{1}\mu\right)^2}{2\sigma^2}}
= (2\pi)^{-\frac{n}{2}} \sigma^{-n} e^{-\frac{\sum_{i=1}^{n}\left(y_i - \mu\sum_{j=1}^{n} a_{ij}\right)^2}{2\sigma^2}} \\
&= \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(y_1 - \sqrt{n}\mu)^2}{2\sigma^2}} \cdot \prod_{i=2}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{y_i^2}{2\sigma^2}}.
\end{aligned}
\]
The last equality is due to (b)(i). Note that $E[Y] = AE[X] = A\mathbf{1}\mu$ and $\mathrm{Var}[Y] = A\,\mathrm{Var}[X]\,A' = \sigma^2 AA' = \sigma^2 I$; therefore $Y_1 \sim N(\sqrt{n}\mu, \sigma^2)$ independently of $Y_2, \dots, Y_n \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$.
Alternatively, by (b)(i) and the identity $\sum_{i=1}^{n} X_i = \sqrt{n}\,Y_1$ from (b)(ii),
\[
\sum_{i=1}^{n} (X_i - \mu)^2 = \sum_{i=1}^{n} X_i^2 - 2\mu\sum_{i=1}^{n} X_i + n\mu^2
= \sum_{i=1}^{n} Y_i^2 - 2\sqrt{n}\,\mu Y_1 + n\mu^2
= \sum_{i=2}^{n} Y_i^2 + (Y_1 - \sqrt{n}\mu)^2.
\]
Hence
\[
f_{Y_1, \dots, Y_n}(y_1, \dots, y_n) = f_{X_1, \dots, X_n}(x_1, \dots, x_n)\left|\det\!\left(\frac{\partial x}{\partial y}\right)\right| \Bigg|_{x = A'y}
= \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(y_1 - \sqrt{n}\mu)^2}{2\sigma^2}} \cdot \prod_{i=2}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{y_i^2}{2\sigma^2}}.
\]
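As an informal illustration of this factorization (not part of the original solution), one can simulate $X$ and check that the empirical mean of $Y = AX$ is close to $A\mathbf{1}\mu = (\sqrt{n}\mu, 0, \dots, 0)'$ and its empirical covariance close to $\sigma^2 I$; the values of µ, σ, n, and the number of replications below are arbitrary choices.

```python
# Monte Carlo look at the joint distribution of Y in (b)(iii).
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 1.5, 2.0, 4, 200_000

m = np.column_stack([np.ones(n), rng.normal(size=(n, n - 1))])
q, _ = np.linalg.qr(m)
a = np.sign(q[0, 0]) * q.T                   # orthogonal, first row 1/sqrt(n)

x = rng.normal(mu, sigma, size=(reps, n))
y = x @ a.T                                  # each row is Y = A X

print(np.round(y.mean(axis=0), 2))           # close to (sqrt(n)*mu, 0, ..., 0) = (3, 0, 0, 0)
print(np.round(np.cov(y, rowvar=False), 2))  # close to sigma^2 * I = 4 * I
```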

(iv) Since $Y_1 \sim N(\sqrt{n}\mu, \sigma^2)$ is independent of $Y_2, \dots, Y_n \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$, the statistic $\bar{X} = \frac{Y_1}{\sqrt{n}} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right)$ is independent of $(Y_2, \dots, Y_n)$ and hence of $V = \sum_{i=2}^{n} Y_i^2$. Moreover, $\frac{Y_2^2}{\sigma^2}, \dots, \frac{Y_n^2}{\sigma^2} \overset{\text{i.i.d.}}{\sim} \chi^2_1$, so $\sum_{i=2}^{n} \frac{Y_i^2}{\sigma^2} \sim \chi^2_{n-1}$. Therefore $\bar{X}$ and $V$ are independent, with
\[
\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right) \quad \text{and} \quad V = \sum_{i=2}^{n} Y_i^2 \sim \sigma^2 \cdot \chi^2_{n-1}.
\]
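A Monte Carlo sanity check of (iv) is sketched below (not part of the original solution). A near-zero sample correlation is of course only a necessary condition for independence, but together with the marginal checks it illustrates the result; µ, σ, n, and the number of replications are arbitrary choices.

```python
# Check that X̄ and V are uncorrelated, X̄ behaves like N(mu, sigma^2/n),
# and V / sigma^2 behaves like a chi-square with n - 1 degrees of freedom.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, reps = 2.0, 3.0, 5, 200_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
v = ((x - xbar[:, None]) ** 2).sum(axis=1)

print(np.round(np.corrcoef(xbar, v)[0, 1], 3))            # close to 0
print(np.round(xbar.mean(), 2), np.round(xbar.var(), 3))  # close to mu = 2, sigma^2/n = 1.8
print(np.round((v / sigma ** 2).mean(), 2),
      np.round((v / sigma ** 2).var(), 2))                # close to n - 1 = 4 and 2(n - 1) = 8
```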

4.
Consider a large population of individuals and let θ denote the (unknown) proportion of the
population belonging to a sensitive group A (e.g. drug users).
Suppose we randomly select n individuals from the population and ask each person to select a card from a deck and answer the question written on the card. Each card in the deck has one of two questions: Q1: "Do you belong to A?" and Q2: "Do you not belong to A?" Also, 85% of the cards ask Q1 and the remaining 15% ask Q2.
Assume that each person answers Yes or No truthfully to the selected question. For i = 1, · · · , n, let Xi = 1 if the ith person answers 'Yes' and Xi = 0 otherwise. So, the data are the observed values of X1, · · · , Xn.
Give the joint distribution of X1 , · · · , Xn and the distribution of the total number of Yes
responses.

Proof. We first calculate the probability that the $i$th person answers 'Yes':
\[
\begin{aligned}
P(X_i = 1) &= P(\text{draws } Q_1) \cdot P(\text{'Yes'} \mid \text{answers } Q_1) + P(\text{draws } Q_2) \cdot P(\text{'Yes'} \mid \text{answers } Q_2) \\
&= 0.85 \times \theta + 0.15 \times (1 - \theta) = 0.15 + 0.7\theta.
\end{aligned}
\]
Then $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} \text{Bernoulli}(0.15 + 0.7\theta)$, and therefore
\[
P(X_1 = x_1, \dots, X_n = x_n) = \prod_{i=1}^{n} (0.15 + 0.7\theta)^{x_i} (0.85 - 0.7\theta)^{1 - x_i}
= (0.15 + 0.7\theta)^{\sum_{i=1}^{n} x_i} (0.85 - 0.7\theta)^{\,n - \sum_{i=1}^{n} x_i}.
\]
Let $Y_n = \sum_{i=1}^{n} X_i$ be the total number of 'Yes' responses; then $Y_n \sim \text{Binomial}(n, 0.15 + 0.7\theta)$ and
\[
P(Y_n = y) = \binom{n}{y} (0.15 + 0.7\theta)^{y} (0.85 - 0.7\theta)^{n - y}, \quad y = 0, \dots, n.
\]
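A quick simulation of this randomized-response mechanism is sketched below (not part of the original solution); θ, n, and the number of replications are arbitrary choices. The empirical frequency of 'Yes' answers should be close to 0.15 + 0.7θ, and the empirical distribution of the total count close to the Binomial pmf above.

```python
# Simulate the card mechanism and compare with the derived distributions.
import numpy as np
from math import comb

rng = np.random.default_rng(5)
theta, n, reps = 0.3, 20, 200_000

in_group = rng.random((reps, n)) < theta       # does person i belong to A?
asks_q1 = rng.random((reps, n)) < 0.85         # the drawn card asks Q1 with probability 0.85
yes = np.where(asks_q1, in_group, ~in_group)   # truthful answer to the selected question

p_yes = 0.15 + 0.7 * theta
print(yes.mean(), "vs", p_yes)                 # P(X_i = 1) = 0.15 + 0.7*theta = 0.36

# total number of 'Yes' responses vs Binomial(n, 0.15 + 0.7*theta)
totals = yes.sum(axis=1)
emp = np.bincount(totals, minlength=n + 1) / reps
binom = np.array([comb(n, y) * p_yes ** y * (1 - p_yes) ** (n - y) for y in range(n + 1)])
print(np.max(np.abs(emp - binom)) < 0.01)      # empirical pmf agrees up to simulation error
```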
