
University of California, Los Angeles

Department of Statistics

Statistics 100B Instructor: Nicolas Christou

Exam 2
24 February 2021
This is a 2-hour exam: Due by 8:15 pm on Thursday, 25 February

Name:

Problem 1 (25 points)


Answer the following questions:
a. Consider the simple regression model $Y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, where $\epsilon_i \sim N(0, \sigma)$ and $\epsilon_1, \ldots, \epsilon_n$ are independent. Form a $\chi^2$ random variable using all the $Y_i$'s.

b. Refer to question (a). It can be shown that $S_e^2 = \frac{\sum_{i=1}^n e_i^2}{n-2}$ is an unbiased estimator of $\sigma^2$. It is also given that $\frac{(n-2)S_e^2}{\sigma^2} \sim \chi^2_{n-2}$. Find the distribution of $S_e^2$ and its mean and variance. Can we form a $t$ distribution using $\frac{(n-2)S_e^2}{\sigma^2} \sim \chi^2_{n-2}$ and $\hat{\beta}_1 \sim N\left(\beta_1, \frac{\sigma}{\sqrt{\sum_{i=1}^n (x_i - \bar{x})^2}}\right)$? If yes, what do we need to check?

c. Refer to question (a). We have seen in class that $\hat{\beta}_1 \sim N\left(\beta_1, \frac{\sigma}{\sqrt{\sum_{i=1}^n (x_i - \bar{x})^2}}\right)$ and $\hat{\beta}_0 \sim N\left(\beta_0, \sigma\sqrt{\frac{1}{n} + \frac{\bar{x}^2}{\sum_{i=1}^n (x_i - \bar{x})^2}}\right)$. Can we form a ratio that follows the $F$ distribution using these two results?

d. Refer to question (a). Use $\hat{\beta}_1 \sim N\left(\beta_1, \frac{\sigma}{\sqrt{\sum_{i=1}^n (x_i - \bar{x})^2}}\right)$ to form a non-central $\chi^2$ random variable. What is the non-centrality parameter?
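The unbiasedness fact quoted in part (b) can be checked numerically. The sketch below simulates the simple regression model many times and averages $S_e^2$; all numerical choices ($\beta_0 = 1$, $\beta_1 = 2$, $\sigma = 3$, $n = 10$) are illustrative, not part of the exam.

```python
import random

# Monte Carlo check of the fact given in (b): S_e^2 = (sum of e_i^2)/(n-2)
# is unbiased for sigma^2 in the model Y_i = b0 + b1*x_i + eps_i.
# b0, b1, sigma, n below are illustrative choices only.
random.seed(1)
b0, b1, sigma, n, reps = 1.0, 2.0, 3.0, 10, 20000
x = [float(i) for i in range(n)]
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

total = 0.0
for _ in range(reps):
    y = [b0 + b1 * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / n
    b1_hat = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx
    b0_hat = ybar - b1_hat * xbar
    sse = sum((yi - (b0_hat + b1_hat * xi)) ** 2 for xi, yi in zip(x, y))
    total += sse / (n - 2)

print(total / reps)  # close to sigma^2 = 9
```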

Problem 2 (25 points)
Answer the following questions:
a. Let $X_{(1)}, X_{(2)}, X_{(3)}$ be the order statistics of a random sample of size $n = 3$ from a $U(0, \theta)$ distribution. Find $c$ such that $cX_{(2)}$ is an unbiased estimator of $\theta$.

b. Suppose $X_1, \ldots, X_n$ are i.i.d. random variables with $X_i \sim \exp(\frac{1}{\lambda})$. Show that the sample mean $\bar{X}$ is an unbiased estimator of $\lambda$ and has variance $\frac{\lambda^2}{n}$.

c. Let $\hat{\theta}_1$ and $\hat{\theta}_2$ be two independent unbiased estimators of a parameter $\theta$. Suppose $var(\hat{\theta}_1) = 2\,var(\hat{\theta}_2)$. Find the constants $c_1$ and $c_2$ so that $c_1\hat{\theta}_1 + c_2\hat{\theta}_2$ is unbiased with the smallest variance.

d. Let $X_1, \ldots, X_n$ be i.i.d. random variables with $X_i \sim N(\mu, \sigma)$. Find $c$ so that $E\left[(cS^2 - \sigma^2)^2\right]$ is minimized. How does this new estimator $cS^2$ compare to $S^2$?
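The two facts stated in part (b), $E[\bar{X}] = \lambda$ and $var(\bar{X}) = \lambda^2/n$ for i.i.d. exponentials with mean $\lambda$, can be illustrated by simulation. The parameter values ($\lambda = 2$, $n = 5$) below are illustrative choices.

```python
import random

# Numerical illustration of the facts stated in (b): for i.i.d.
# exponential X_i with mean lambda, E[Xbar] = lambda and
# var(Xbar) = lambda^2 / n. lam and n are illustrative choices.
random.seed(2)
lam, n, reps = 2.0, 5, 40000
means = []
for _ in range(reps):
    # random.expovariate takes the rate, so rate 1/lam gives mean lam
    xs = [random.expovariate(1.0 / lam) for _ in range(n)]
    means.append(sum(xs) / n)

mean_of_xbar = sum(means) / reps
var_of_xbar = sum((m - mean_of_xbar) ** 2 for m in means) / reps
print(mean_of_xbar)  # close to lam = 2
print(var_of_xbar)   # close to lam**2 / n = 0.8
```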

Problem 3 (25 points)
Answer the following questions:
a. In class we discussed the following theorem: If $Y \sim N_n(0, I)$ and if $A$ is an $n \times n$ symmetric and idempotent matrix, then the quadratic form $Y'AY \sim \chi^2_r$, where $r$ is the number of eigenvalues of $A$ equal to 1. Use this theorem to prove that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$, where $S^2$ is the sample variance of a random sample from $N(\mu, \sigma)$.
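The theorem quoted above can be checked numerically for a simple idempotent matrix. The sketch below uses the rank-one projection $A = vv'/(v'v)$ for a fixed vector $v$ (an illustrative choice), so $Y'AY$ should behave like $\chi^2_1$ with mean 1 and variance 2.

```python
import random

# Illustration of the quoted theorem: if Y ~ N_n(0, I) and A is symmetric
# idempotent of rank r, then Y'AY ~ chi-square_r (mean r, variance 2r).
# Here A = v v' / (v'v) projects onto span(v), so r = 1; v is illustrative.
random.seed(4)
n, reps = 4, 30000
v = [1.0, 2.0, 0.0, -1.0]
vv = sum(c * c for c in v)

vals = []
for _ in range(reps):
    y = [random.gauss(0, 1) for _ in range(n)]
    vy = sum(vi * yi for vi, yi in zip(v, y))  # v'y
    vals.append(vy * vy / vv)                  # Y'AY = (v'y)^2 / (v'v)

mean_q = sum(vals) / reps
var_q = sum((q - mean_q) ** 2 for q in vals) / reps
print(mean_q)  # close to r = 1
print(var_q)   # close to 2r = 2
```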

b. Please list all the properties of the F distribution that were discussed in class or assigned as homework questions.

c. Suppose $X_1, \ldots, X_n$ are i.i.d. random variables with $X_i \sim \exp(\frac{1}{\lambda_1})$ and $Y_1, \ldots, Y_m$ are i.i.d. random variables with $Y_i \sim \exp(\frac{1}{\lambda_2})$. The two samples are independent. Find the variance of $\frac{\bar{X}}{\bar{Y}}$.

d. Suppose $X_1, \ldots, X_n$ are i.i.d. Poisson($\lambda$). It is given that $E[X(X-1)] = \lambda^2$ and $var[X(X-1)] = 2\lambda^2 + 4\lambda^3$. Now let $Q = \sum_{i=1}^n X_i$. Find $E[Q(Q-1)]$ and $var[Q(Q-1)]$.
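The given Poisson fact $E[X(X-1)] = \lambda^2$ (the second factorial moment) can be checked by simulation. The sketch below draws Poisson variates with Knuth's method; $\lambda = 1.5$ is an illustrative choice.

```python
import math
import random

# Check of the fact given in (d): for X ~ Poisson(lam),
# E[X(X-1)] = lam^2. lam is an illustrative choice.
random.seed(3)
lam, reps = 1.5, 50000

def poisson(lam):
    # Knuth's method: count uniforms until their running product
    # drops below exp(-lam).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

vals = [poisson(lam) for _ in range(reps)]
m2 = sum(x * (x - 1) for x in vals) / reps
print(m2)  # close to lam**2 = 2.25
```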

Problem 4 (25 points)
Answer the following questions:
a. Suppose the lifetime of a certain electronic component follows an exponential distribution with mean 100 hours.
Suppose three such components are connected in series in an electric system. Find the probability that the
electric system will survive more than 40 hours.

b. Suppose $Y_1, Y_2, \ldots, Y_n$ are i.i.d. $U(0, \theta)$ random variables. Find the expected value of the $j$th order statistic. (Find the pdf of $Y_{(j)}$ and use it to find the expected value of $Y_{(j)}$; this involves the beta distribution.)

c. Suppose $X_1, X_2$ are independent $N(\mu, 1)$ random variables. Let $\hat{\theta}_1 = \bar{X}$ and $\hat{\theta}_2 = E[\bar{X}|X_1]$. Show that even though $var(\hat{\theta}_2) < var(\hat{\theta}_1)$, we cannot use $\hat{\theta}_2$ as an estimator of $\mu$.
    
d. Suppose $Y \sim N_n\left(\mu\begin{pmatrix}1\\1\\\vdots\\1\end{pmatrix}, \ \sigma^2\begin{pmatrix}1 & \rho & \rho & \cdots & \rho\\ \rho & 1 & \rho & \cdots & \rho\\ \rho & \rho & 1 & \cdots & \rho\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ \rho & \rho & \rho & \cdots & 1\end{pmatrix}\right)$. Consider $\bar{Y}$ as an estimator of $\mu$. Is $\bar{Y}$ a consistent estimator of $\mu$? Can you suggest a different estimator for $\mu$?
