
STATS 200 (Stanford University, Summer 2015)

Final Exam Sample Questions


This document contains some questions that are fairly representative of the content, style,
and difficulty of the questions that will appear on the final exam. Most of these questions
come from actual exams that I gave in previous editions of the course. Please keep the
following things in mind:
• This document is much longer than the actual final exam will be.
• All material covered in the lecture notes (up through the end of class on Monday, August 10) is eligible for inclusion on the final exam, regardless of whether it is covered by any of the sample questions below.
• The final exam will be cumulative, but material not covered on the midterm exam will be represented more heavily. The proportion of questions on the final exam that are drawn from pre-midterm material may not be exactly equal to the proportion of the sample questions below that are drawn from pre-midterm material.
1. Note: Parts (a)–(f) of this question already appeared in the midterm exam sample questions, but they are repeated here to maintain the original format of the overall question.
Let X₁, …, Xₙ be iid Beta(1, θ), where θ > 0 is unknown.
Note: The Beta(1, θ) distribution has pdf

f_θ(x) = θ(1 − x)^(θ−1)   if 0 < x < 1,
f_θ(x) = 0                 otherwise.

Also,

E_θ(X₁) = 1/(1 + θ),   Var_θ(X₁) = θ/[(1 + θ)²(2 + θ)].

You may use these facts without proof.
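The stated mean and variance can be sanity-checked numerically. The sketch below (Python, an illustration only, not part of the exam; the value θ = 2 and the midpoint-rule integrator are arbitrary choices) integrates the Beta(1, θ) pdf directly:

```python
def beta1_pdf(x, theta):
    """pdf of Beta(1, theta): theta * (1 - x)**(theta - 1) on (0, 1)."""
    return theta * (1.0 - x) ** (theta - 1.0)

def integrate(f, a, b, n=100_000):
    """Composite midpoint-rule approximation of the integral of f on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

theta = 2.0  # arbitrary illustrative value
mean = integrate(lambda x: x * beta1_pdf(x, theta), 0.0, 1.0)
second_moment = integrate(lambda x: x * x * beta1_pdf(x, theta), 0.0, 1.0)
var = second_moment - mean ** 2

# stated formulas: E(X) = 1/(1 + theta) = 1/3, Var(X) = theta/((1 + theta)^2 (2 + theta)) = 1/18
assert abs(mean - 1.0 / (1.0 + theta)) < 1e-6
assert abs(var - theta / ((1.0 + theta) ** 2 * (2.0 + theta))) < 1e-6
```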


(a) Find the maximum likelihood estimator θ̂ₙ^MLE of θ.
Note: Recall that any logarithm of a number between 0 and 1 is negative.
(b) Do we know for certain that θ̂ₙ^MLE is an unbiased estimator of θ?
(c) Find the asymptotic distribution of the maximum likelihood estimator θ̂ₙ^MLE.
Note: Your answer should be a formal probabilistic result involving convergence in distribution.
(d) Let μ = 1/(1 + θ) = E_θ(X₁), and define

X̄ₙ = n⁻¹ Σᵢ₌₁ⁿ Xᵢ.

Do we know for certain that X̄ₙ is an unbiased estimator of μ?


(e) Define the estimator

θ̃ₙ = 1/X̄ₙ − 1.

Do we know for certain that θ̃ₙ is an unbiased estimator of θ?
(f) Find the asymptotic distribution of θ̃ₙ.
Note: Your answer should be a formal probabilistic result involving convergence in distribution.
(g) Find the asymptotic relative efficiency of θ̂ₙ^MLE compared to θ̃ₙ.
Note: If you were unable to find the asymptotic distributions of θ̂ₙ^MLE and/or θ̃ₙ, then call their asymptotic variances v_MLE(θ) and/or ṽ(θ) so that you can still answer this question.
2. Let X₁, …, Xₙ be iid Geometric(θ), where θ is unknown and 0 < θ < 1.
Note: The Geometric(θ) distribution has pmf

f_θ(x) = θ(1 − θ)ˣ   if x ∈ {0, 1, 2, …},
f_θ(x) = 0            otherwise,

and it has mean (1 − θ)/θ. Also, the maximum likelihood estimator of θ is

θ̂ₙ^MLE = n/(n + Σᵢ₌₁ⁿ Xᵢ).

You may use these facts without proof.
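As a quick numerical illustration (not part of the exam; θ = 0.3 is an arbitrary choice), the pmf above sums to 1 and its mean matches the stated formula:

```python
theta = 0.3  # arbitrary illustrative value
pmf = [theta * (1.0 - theta) ** x for x in range(2000)]  # tail beyond x = 2000 is negligible

total = sum(pmf)
mean = sum(x * p for x, p in enumerate(pmf))

assert abs(total - 1.0) < 1e-9
assert abs(mean - (1.0 - theta) / theta) < 1e-9  # (1 - theta)/theta = 7/3
```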


(a) Find the Fisher information in the sample.
(b) Let θ₀ be fixed and known, where 0 < θ₀ < 1. Find the Wald test of H₀: θ = θ₀ versus H₁: θ ≠ θ₀, and state how to choose the critical value to give the test approximate size α, where 0 < α < 1.
Note: You may use either version of the Wald test.
(c) Let θ₀ be fixed and known, where 0 < θ₀ < 1. Find the score test of H₀: θ = θ₀ versus H₁: θ ≠ θ₀, and state how to choose the critical value to give the test approximate size α, where 0 < α < 1.
(d) Find the Wald confidence interval for θ with approximate confidence level 1 − α, where 0 < α < 1.
Note: You may use either version of the Wald confidence interval.


3. Let X₁, …, Xₙ be iid random variables such that E_{μ,σ²}(X₁) = μ and Var_{μ,σ²}(X₁) = σ² are both finite. However, suppose that X₁, …, Xₙ are not normally distributed. Now define

X̄ₙ = n⁻¹ Σᵢ₌₁ⁿ Xᵢ.

(a) Do we know for certain that X̄ₙ is a consistent estimator of μ?
(b) Do we know for certain that the distribution of X̄ₙ is approximately normal for large n?
4. Let X₁, …, Xₙ be iid continuous random variables with pdf

f_θ(x) = 2θx exp(−θx²)   if x ≥ 0,
f_θ(x) = 0                if x < 0,

where θ > 0 is unknown. Suppose we assign a Gamma(a, b) prior to θ, where a > 0 and b > 0 are known.
Note: The Gamma(a, b) distribution has pdf

f(x) = [bᵃ/Γ(a)] x^(a−1) exp(−bx)   if x > 0,
f(x) = 0                             if x ≤ 0,

and its mean is a/b. You may use these facts without proof.
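The stated Gamma(a, b) facts can also be checked numerically. The sketch below (illustration only; a = 3, b = 2 and the integration range are arbitrary choices) verifies that the pdf integrates to 1 and has mean a/b:

```python
import math

def gamma_pdf(x, a, b):
    """Gamma(a, b) pdf: b**a / Gamma(a) * x**(a - 1) * exp(-b * x) for x > 0."""
    return b ** a / math.gamma(a) * x ** (a - 1.0) * math.exp(-b * x)

def integrate(f, lo, hi, n=100_000):
    """Composite midpoint-rule approximation of the integral of f on [lo, hi]."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

a, b = 3.0, 2.0  # arbitrary illustrative values
total = integrate(lambda x: gamma_pdf(x, a, b), 0.0, 40.0)      # mass beyond 40 is negligible
mean = integrate(lambda x: x * gamma_pdf(x, a, b), 0.0, 40.0)

assert abs(total - 1.0) < 1e-6
assert abs(mean - a / b) < 1e-6  # a/b = 1.5
```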
(a) Find the posterior distribution of θ.
(b) Find (or simply state) the posterior mean of θ.
5. Let X₁, …, Xₙ be iid Poisson(θ), where θ > 0 is unknown.
Note: The Poisson(θ) distribution has pmf

f_θ(x) = θˣ exp(−θ)/x!   if x ∈ {0, 1, 2, …},
f_θ(x) = 0                otherwise,

and its mean and variance are both θ. Also, the maximum likelihood estimator of θ is

θ̂ₙ^MLE = n⁻¹ Σᵢ₌₁ⁿ Xᵢ.

You may use these facts without proof.
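As a quick illustration (not part of the exam; θ = 4 is an arbitrary choice), summing the pmf confirms that the mean and variance are both θ:

```python
import math

theta = 4.0  # arbitrary illustrative value
# truncate the support at x = 99; for Poisson(4) the remaining tail mass is negligible
pmf = [theta ** x * math.exp(-theta) / math.factorial(x) for x in range(100)]

total = sum(pmf)
mean = sum(x * p for x, p in enumerate(pmf))
var = sum((x - mean) ** 2 * p for x, p in enumerate(pmf))

assert abs(total - 1.0) < 1e-12
assert abs(mean - theta) < 1e-9
assert abs(var - theta) < 1e-9
```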
(a) Let θ₀ > 0 be fixed and known. Find the likelihood ratio test of H₀: θ = θ₀ versus H₁: θ ≠ θ₀. (You do not need to state how to choose the critical value for this part of the question.)
(b) State how the critical value of the likelihood ratio test in part (a) can be chosen to give the test approximate size α.


6. Let X be a single continuous random variable with pdf

f_θ(x) = exp(θ − x)/[1 + exp(θ − x)]² = (1/4) sech²((θ − x)/2) = (1/4) sech²((x − θ)/2),

where sech is the hyperbolic secant function, and cdf

F_θ(x) = 1/[1 + exp(θ − x)],

where θ ∈ ℝ is unknown.
Note: The maximum likelihood estimator of θ is θ̂^MLE = X. Also, sech(−t) = sech(t) for all t ∈ ℝ, and sech(t) is a strictly decreasing function of |t|. You may use these facts without proof.
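The two expressions for this pdf (a logistic density with location θ) agree as an exact identity, which can be spot-checked numerically. The sketch below (illustration only; θ = 1.5 and the test points are arbitrary choices) also checks that sech is even:

```python
import math

def sech(t):
    """Hyperbolic secant: sech(t) = 1 / cosh(t)."""
    return 1.0 / math.cosh(t)

theta = 1.5  # arbitrary illustrative value
xs = [-3.0, -0.5, 0.0, 0.7, 2.0, 10.0]
# difference between exp(theta - x)/(1 + exp(theta - x))^2 and (1/4) sech^2((theta - x)/2)
diffs = [abs(math.exp(theta - x) / (1.0 + math.exp(theta - x)) ** 2
             - 0.25 * sech((theta - x) / 2.0) ** 2) for x in xs]

assert max(diffs) < 1e-12                   # the two forms of the pdf agree
assert abs(sech(2.0) - sech(-2.0)) < 1e-15  # sech is an even function
```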
(a) Show that the likelihood ratio test of H₀: θ = 0 versus H₁: θ ≠ 0 rejects H₀ if and only if |X| ≥ c for some critical value c. (You do not need to state how to choose the critical value for this part of the question.)
(b) State how the critical value c of the likelihood ratio test in part (a) can be chosen to give the test size α (exactly, not just approximately), where 0 < α < 1.
(c) For the likelihood ratio test with size α in parts (a) and (b), find the probability of a type II error if the true value of θ happens to be some θ₁ ≠ 0.
(d) Suppose we observe X = x_obs. Find the p-value of the likelihood ratio test for the observed data x_obs.
Note: Be sure your answer is correct for both positive and negative values of x_obs.
7. Suppose that we call a hypothesis test trivial if its rejection region is either the empty
set or the entire sample space, i.e., a trivial test is a test that either never rejects H0
or always rejects H₀. Now let X ~ Bin(n, θ), where θ is unknown, and consider testing H₀: θ = 1/2 versus H₁: θ ≠ 1/2. Find a necessary and sufficient condition (in terms of n)
for the existence of a test of these hypotheses with level = 0.05 = 1/20 that is not trivial.
8. Let X₁, …, Xₙ be iid Exp(1) random variables with pdf

f(x) = exp(−x)   if x ≥ 0,
f(x) = 0          if x < 0.

Then let Yₙ = max_{1≤i≤n} Xᵢ. Find a sequence of constants aₙ such that Yₙ − aₙ converges in distribution to a random variable with cdf G(t) = exp[−exp(−t)]. (This limiting distribution is called the Gumbel distribution.)
Hint: For any c ∈ ℝ, (1 + n⁻¹c)ⁿ → exp(c) as n → ∞.
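The limit in the hint is easy to see numerically. The sketch below (illustration only; the values of c and of n are arbitrary choices) compares (1 + c/n)ⁿ with exp(c) for a large n:

```python
import math

cs = [-2.0, 0.5, 3.0]  # arbitrary illustrative values of c
n = 1_000_000
# relative error of the approximation (1 + c/n)^n ~ exp(c); it shrinks like c^2/(2n)
rel_errs = [abs((1.0 + c / n) ** n - math.exp(c)) / math.exp(c) for c in cs]

assert max(rel_errs) < 1e-4
```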


9. Let X₁, …, Xₙ be iid continuous random variables with pdf

f_θ(x) = [1/√(2πx³)] exp(−x/(2θ²) + 1/θ − 1/(2x))   if x > 0,
f_θ(x) = 0                                           if x ≤ 0,

where θ > 0 is unknown.


(a) Find the maximum likelihood estimator of θ.
(b) Let θ₀ > 0 be fixed and known. Find the likelihood ratio test of H₀: θ = θ₀ versus H₁: θ ≠ θ₀. (You do not need to state how to choose the critical value to give the test a particular size in this part of the question.)
(c) State how the critical value of the likelihood ratio test in part (b) can be chosen to give the test approximate size α.
(d) Explain how the likelihood ratio test in part (b) would change if the alternative hypothesis were H₁: θ < θ₀ instead.
10. Let X₁, …, Xₙ be iid continuous random variables with pdf

f_θ(x) = θ(θ + 1)/(θ + x)²   if 0 ≤ x ≤ 1,
f_θ(x) = 0                    otherwise,

where θ > 0 is unknown. It can be shown that the Fisher information in the sample is

Iₙ(θ) = n/[3θ²(θ + 1)²].

Use this fact to find (or simply state) the asymptotic distribution of the maximum likelihood estimator θ̂ₙ of θ.
Note: There is no need to actually find the form of θ̂ₙ or to verify the result for the Fisher information. Also, you may assume that the regularity conditions of Section 7.4 of the notes hold.
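As a quick sanity check of the density in this question (illustration only; θ = 2 is an arbitrary choice), a midpoint-rule integral confirms that θ(θ + 1)/(θ + x)² integrates to 1 over [0, 1]:

```python
def pdf(x, theta):
    """Density from the question: theta*(theta + 1)/(theta + x)**2 on [0, 1]."""
    return theta * (theta + 1.0) / (theta + x) ** 2

theta = 2.0  # arbitrary illustrative value
n = 100_000
h = 1.0 / n
total = sum(pdf((i + 0.5) * h, theta) for i in range(n)) * h  # midpoint rule on [0, 1]

assert abs(total - 1.0) < 1e-8
```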


11. Let X₁, …, Xₙ be iid continuous random variables with pdf

f_θ(x) = θ/(θ + x)²   if x ≥ 0,
f_θ(x) = 0             if x < 0,

where θ > 0 is unknown. Let θ̂ₙ denote the maximum likelihood estimator of θ (which you do not need to find).
Note: It can be shown by simple calculus that

E_θ(X₁) = E_θ(1/X₁) = ∞,   E_θ[1/(θ + X₁)] = 1/(2θ),   E_θ[1/(θ + X₁)²] = 1/(3θ²),

so you may use any of these facts without proof. You may also assume that the relevant regularity conditions (i.e., those of Section 7.4 of the notes) are satisfied.
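The two finite expectations stated above can be verified numerically. The sketch below (illustration only; θ = 2 is an arbitrary choice) uses the substitution x = F⁻¹(u), under which E[g(X)] = ∫₀¹ g(F⁻¹(u)) du, since for this density F(x) = x/(θ + x):

```python
theta = 2.0  # arbitrary illustrative value
n = 100_000
h = 1.0 / n

def quantile(u):
    """Inverse cdf of f_theta: F(x) = x/(theta + x), so F^{-1}(u) = theta*u/(1 - u)."""
    return theta * u / (1.0 - u)

# E[g(X)] = integral_0^1 g(F^{-1}(u)) du, approximated by the midpoint rule
e_inv = sum(1.0 / (theta + quantile((i + 0.5) * h)) for i in range(n)) * h
e_inv_sq = sum(1.0 / (theta + quantile((i + 0.5) * h)) ** 2 for i in range(n)) * h

assert abs(e_inv - 1.0 / (2.0 * theta)) < 1e-6      # E[1/(theta + X)] = 1/(2 theta)
assert abs(e_inv_sq - 1.0 / (3.0 * theta ** 2)) < 1e-6  # E[1/(theta + X)^2] = 1/(3 theta^2)
```

(The other stated fact, E_θ(X₁) = E_θ(1/X₁) = ∞, cannot be checked this way, since the corresponding integrals diverge.)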
(a) Find the score function for the sample and show explicitly that its expectation is
zero (i.e., do not simply cite the result from the notes that says that the expectation
is zero).
(b) Find the Fisher information for the sample.
(c) Find (or simply state) the asymptotic distribution of θ̂ₙ.
Note: Your answer should be a formal probabilistic result involving convergence in
distribution.
12. Let X₁, …, Xₙ be iid N(θ, 1), where θ ∈ ℝ is unknown.
(a) State an estimator of θ that is consistent but not unbiased.
(b) State an estimator of θ that is consistent but not asymptotically efficient.
(c) Is (X̄ₙ)² = (n⁻¹ Σᵢ₌₁ⁿ Xᵢ)² an unbiased estimator of θ²? Why or why not?
(d) Is (X̄ₙ)² = (n⁻¹ Σᵢ₌₁ⁿ Xᵢ)² a consistent estimator of θ²? Why or why not?
13. Lemma 7.2.1 of the notes states that E_θ[ℓ′_{Xₙ}(θ)] = 0 for all θ in the parameter space Θ. This result uses the regularity condition that Xₙ = (X₁, …, Xₙ) is an iid sample. Now suppose that we were to remove the condition of independence while keeping all other regularity conditions in place. Explain why the result that E_θ[ℓ′_{Xₙ}(θ)] = 0 for all θ ∈ Θ would still be true.


14. Let X₁, …, Xₙ be iid random variables from a distribution that depends on an unknown parameter θ ∈ ℝ. This distribution has the following properties:

E_θ(X₁) = 2 exp(θ),        Var_θ(X₁) = 12 exp(2θ),
E_θ(log X₁) = θ,           Var_θ(log X₁) = 2 log 2,
E_θ(X₁⁻¹) = 2 exp(−θ),     Var_θ(X₁⁻¹) = 12 exp(−2θ).

Now define the estimators

θ̂ₙ⁽¹⁾ = log[(2n)⁻¹ Σᵢ₌₁ⁿ Xᵢ],   θ̂ₙ⁽²⁾ = log[(Πᵢ₌₁ⁿ Xᵢ)^(1/n)].

(a) Find a function v⁽¹⁾(θ) such that √n [θ̂ₙ⁽¹⁾ − θ] →_D N[0, v⁽¹⁾(θ)].
(b) Find a function v⁽²⁾(θ) such that √n [θ̂ₙ⁽²⁾ − θ] →_D N[0, v⁽²⁾(θ)].
(c) Find ARE_θ[θ̂ₙ⁽¹⁾, θ̂ₙ⁽²⁾], the asymptotic relative efficiency of θ̂ₙ⁽¹⁾ compared to θ̂ₙ⁽²⁾, and use it to state which of the two estimators is better for large n.

15. Let X₁, …, Xₙ be iid discrete random variables with pmf

p_θ(x) = [(x + k − 1)!/(x! (k − 1)!)] θᵏ (1 − θ)ˣ   if x ∈ {0, 1, 2, …},
p_θ(x) = 0                                          otherwise,

where k is a known positive integer, θ is unknown, and 0 < θ < 1.

(a) Find the maximum likelihood estimator θ̂ₙ of θ.
Note: For the purposes of this question, you can ignore any possible data values for which the MLE does not exist.
(b) Find the asymptotic distribution of the maximum likelihood estimator θ̂ₙ of θ.
Note: You may use without proof the fact that E_θ(X₁) = (1 − θ)k/θ, and you may assume that the regularity conditions of Section 7.4 of the notes hold.
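The pmf in this question is the negative binomial pmf, and the stated mean can be checked numerically. The sketch below (illustration only; k = 3 and θ = 0.4 are arbitrary choices) sums the pmf over a long truncated support:

```python
import math

k, theta = 3, 0.4  # arbitrary illustrative values

def pmf(x):
    """Negative binomial pmf: C(x + k - 1, k - 1) * theta^k * (1 - theta)^x."""
    coef = math.factorial(x + k - 1) // (math.factorial(x) * math.factorial(k - 1))
    return coef * theta ** k * (1.0 - theta) ** x

probs = [pmf(x) for x in range(150)]  # tail beyond x = 150 is negligible
total = sum(probs)
mean = sum(x * p for x, p in enumerate(probs))

assert abs(total - 1.0) < 1e-9
assert abs(mean - (1.0 - theta) * k / theta) < 1e-9  # (1 - theta)*k/theta = 4.5
```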


16. Let X be a single continuous random variable with pdf

f_θ(x) = 1/{π[1 + (x − θ)²]},

where θ ∈ ℝ is unknown. Let θ₀ ∈ ℝ be fixed and known, and consider testing H₀: θ = θ₀ versus H₁: θ ≠ θ₀.
Note: The cdf of X is F_θ(x) = π⁻¹ arctan(x − θ) + 1/2, noting that the arctan function is strictly increasing and is also an odd function, i.e., arctan(−u) = −arctan(u) for all u ∈ ℝ. Also, the π that appears in the pdf and cdf is the usual mathematical constant, i.e., π ≈ 3.14. You may use any of these facts without proof.
(a) Show that the likelihood ratio test of these hypotheses rejects H₀ if and only if X ≤ θ₀ − c_α or X ≥ θ₀ + c_α, where c_α ≥ 0.
(b) Let 0 < α < 1. Find the value of c_α such that the test in part (a) has size α (exactly).
Note: The inverse of the arctan function is simply the tan function.
17. Let X be a single random variable with a continuous uniform distribution on [0, θ], where θ > 0 is unknown. Consider testing H₀: θ = 1 versus H₁: θ ≠ 1.
Note: The maximum likelihood estimator of θ is θ̂ = X. You may use this fact without proof.
(a) Let 0 < α < 1. Find the likelihood ratio test of these hypotheses, and find the critical value that gives the test size α.
(b) Find values c₁ > 0 and c₂ > 0 such that the likelihood ratio test with size α in part (a) takes the form "Reject H₀ if and only if either X ≤ c₁ or X > c₂."
(c) Give two reasons why it would not be appropriate to choose the critical value for part (a) based on the result that −2 log Λ has an approximate χ²₁ distribution.
18. Let X₁, …, Xₙ be iid Poisson(θ) random variables with pmf

p_θ(x) = θˣ exp(−θ)/x!   if x ∈ {0, 1, 2, …},
p_θ(x) = 0                otherwise,

where θ > 0 is unknown. Then let θ₀ > 0 be fixed and known, and consider testing H₀: θ = θ₀ versus H₁: θ ≠ θ₀.
Note: The MLE of θ is θ̂ₙ = X̄ₙ, and E_θ(X₁) = θ. You may use these facts without proof.
(a) Find the Wald test of these hypotheses, and state how to choose the critical value to give the test approximate size α. (You may use either version of the Wald test.)
(b) Find the score test of these hypotheses, and state how to choose the critical value to give the test approximate size α.
(c) Find the Wald confidence interval for θ with approximate confidence level 1 − α, where z_{α/2} is the number such that P(Z ≥ z_{α/2}) = α/2 for a standard normal random variable Z. (You may use either version of the Wald confidence interval.)
