
The University of Edinburgh

College of Science and Engineering


School of Mathematics

Mathematics 3 Honours
MATH10095 Statistical Methodology

Monday 16th December 2019

2.30pm-4.30pm

Chairman of Examiners – Professor A Olde Daalhuis


External Examiner – Professor J McColl

Answer ALL questions

Calculators and other electronic aids:

A scientific calculator is permitted in this examination.


It must not be a graphical calculator.
It must not be able to communicate with any other device.

This examination will be marked anonymously.



(1) Let x1, ..., xn denote an i.i.d. sample from a distribution with the probability
    density function

        f(x; \theta) = \theta x \exp\left( -\frac{\theta x^2}{2} \right), \qquad x > 0, \ \theta > 0.

    (a) Obtain the maximum likelihood estimator of the parameter θ. What is the
        asymptotic distribution of this estimator, θ̂, as n → ∞ (specify the asymptotic
        mean and variance of the distribution)? [6 marks]
    (b) Suppose we observe n = 100 and Σ_{i=1}^{100} xi² = 258. Using one iteration of the
        Fisher-scoring method, find the maximum likelihood estimate by setting the
        initial value θ₀ = 0.70. Compare this estimate with the estimate you get from
        the derived θ̂ in part (a). [7 marks]
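
    The following is an illustrative numerical sketch of parts (a) and (b), not part of the
    original question. The score and Fisher information used here are my own working from
    the density above (which the question asks you to derive), and the update rule is the
    standard Fisher-scoring step θ_new = θ_old + U(θ_old) / I(θ_old).

        # Sketch only: one Fisher-scoring step for the model f(x; theta) = theta*x*exp(-theta*x^2/2).
        # Assumed working: score U(theta) = n/theta - sum(x_i^2)/2, information I(theta) = n/theta^2.
        n, sum_x2 = 100, 258.0        # observed quantities from part (b)
        theta0 = 0.70                 # given starting value

        def score(theta):
            """Score function U(theta), the derivative of the log-likelihood."""
            return n / theta - sum_x2 / 2.0

        def fisher_info(theta):
            """Expected (Fisher) information I(theta)."""
            return n / theta**2

        # One Fisher-scoring step
        theta1 = theta0 + score(theta0) / fisher_info(theta0)

        # Closed-form MLE for comparison: theta_hat = 2n / sum(x_i^2)
        theta_hat = 2.0 * n / sum_x2

        print(f"one Fisher-scoring step: {theta1:.4f}")    # approx 0.768
        print(f"closed-form MLE:         {theta_hat:.4f}") # approx 0.775

    With the numbers given in part (b), a single scoring step already lands close to the
    closed-form maximum likelihood estimate.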
    (c) A random variable Y is from an Exp(λ) distribution with E(Y) = 1/λ, i.e.,

            f(y; \lambda) = \lambda e^{-\lambda y}, \qquad y > 0, \ \lambda > 0.

        Assume y1, ..., yn is a sample of i.i.d. observations. Let the prior distribution of
        λ be Γ(α, β), with E(λ) = α/β, i.e.,

            p(\lambda) = \frac{\beta^{\alpha} \lambda^{\alpha - 1} e^{-\beta \lambda}}{\Gamma(\alpha)}.

Show that the posterior distribution of λ, p(λ | y), is also a gamma distribution
and derive a point estimator for λ. [7 marks]
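
    A short numerical check of the conjugacy claimed in part (c), not part of the original
    question. It assumes the standard conjugate result that the posterior is
    Γ(α + n, β + Σ yi), with the posterior mean as one natural point estimator; the data and
    hyperparameter values below are made up for illustration.

        # Check the gamma-exponential conjugate update against a brute-force grid posterior.
        import numpy as np

        rng = np.random.default_rng(0)
        alpha, beta = 2.0, 1.0                        # illustrative prior hyperparameters
        y = rng.exponential(scale=2.0, size=50)       # mean 1/lambda = 2, i.e. true lambda = 0.5

        # Conjugate-form posterior mean: (alpha + n) / (beta + sum(y))
        post_mean = (alpha + y.size) / (beta + y.sum())

        # Brute force: evaluate log(prior) + log(likelihood) on a grid and take the mean
        lam = np.linspace(1e-4, 3, 20000)
        log_post = (alpha - 1) * np.log(lam) - beta * lam \
                   + y.size * np.log(lam) - lam * y.sum()
        w = np.exp(log_post - log_post.max())
        grid_mean = np.sum(lam * w) / np.sum(w)

        print(post_mean, grid_mean)   # the two should agree closely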


(2) Let y1, ..., yn be independent observations from a normal distribution with zero
    mean and unknown variance parameter σ². The probability density function is

        f(y; \sigma^2) = \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\left( -\frac{y^2}{2\sigma^2} \right), \qquad -\infty < y < \infty.

    (a) Obtain the maximum likelihood estimator, score function, and the Fisher
        information for the parameter σ². [6 marks]
        Note: Var(Y) = E(Y²) − [E(Y)]².
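
    An illustrative numerical sketch for part (a), not part of the original question. It
    checks the candidate maximum likelihood estimator σ̂² = mean(yi²) (my own working) by
    comparing it with a grid search over the log-likelihood; the data are simulated.

        # Grid check that mean(y^2) maximizes the zero-mean normal log-likelihood.
        import numpy as np

        rng = np.random.default_rng(1)
        y = rng.normal(loc=0.0, scale=2.0, size=500)   # true sigma^2 = 4
        n = y.size

        def loglik(s2):
            """Log-likelihood of the zero-mean normal model at variance s2."""
            return -0.5 * n * np.log(2 * np.pi * s2) - np.sum(y**2) / (2 * s2)

        sigma2_hat = np.mean(y**2)                      # candidate MLE

        grid = np.linspace(0.5 * sigma2_hat, 1.5 * sigma2_hat, 1001)
        best = grid[np.argmax([loglik(s2) for s2 in grid])]
        print(best, sigma2_hat)                         # should be close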
    (b) Suppose we want to test H₀: σ² = σ₀² against H₁: σ² ≠ σ₀². Derive the
        likelihood ratio test for this setting and specify the asymptotic distribution of
        this test statistic. [6 marks]
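
    An illustrative sketch for part (b), not part of the original question. It computes the
    generic likelihood-ratio statistic W = 2[ℓ(σ̂²) − ℓ(σ₀²)] and compares it with a
    chi-squared(1) critical value, its asymptotic null distribution by Wilks' theorem; the
    data and the value of σ₀² are made up.

        # Generic likelihood ratio test for the variance of a zero-mean normal sample.
        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(2)
        y = rng.normal(loc=0.0, scale=1.0, size=200)
        n = y.size

        def loglik(s2):
            return -0.5 * n * np.log(2 * np.pi * s2) - np.sum(y**2) / (2 * s2)

        sigma2_hat = np.mean(y**2)      # unrestricted MLE
        sigma2_0 = 1.2                  # hypothesised value under H0 (illustrative)

        W = 2 * (loglik(sigma2_hat) - loglik(sigma2_0))
        print(W, chi2.ppf(0.95, df=1))  # reject H0 at the 5% level if W exceeds the critical value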
    (c) Explain how we could find a 95% confidence interval for the parameter σ². (You
        don’t need to find the actual confidence interval.) [2 marks]
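
    One possible approach to part (c), sketched numerically; this is my own illustration and
    not the only valid method. It assumes a Wald-type interval based on the asymptotic
    normality of the MLE, with variance approximated by 1 / I(σ̂²) = 2σ̂⁴ / n.

        # Approximate 95% Wald interval for sigma^2 from simulated data.
        import numpy as np

        rng = np.random.default_rng(3)
        y = rng.normal(loc=0.0, scale=1.5, size=300)
        n = y.size

        sigma2_hat = np.mean(y**2)
        se = np.sqrt(2 * sigma2_hat**2 / n)       # approximate standard error of the MLE
        ci = (sigma2_hat - 1.96 * se, sigma2_hat + 1.96 * se)
        print(ci)                                  # approximate 95% confidence interval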


(3) Consider a linear regression model in which responses Yi are uncorrelated and have
    the common variance σ², for i = 1, ..., n.

    (a) If the regression model is E(Yi | xi) = β₀ + β₁ xi², find the least squares estimators
        of β₀ and β₁. [8 marks]
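
    An illustrative sketch for part (a), not part of the original question: fitting
    E(Y | x) = β₀ + β₁x² is ordinary least squares on the design matrix [1, x²]. The data
    below are made up.

        # Least squares with a squared regressor, via the design-matrix formulation.
        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 3, size=40)
        y = 1.0 + 0.5 * x**2 + rng.normal(scale=0.3, size=40)   # true beta0 = 1, beta1 = 0.5

        X = np.column_stack([np.ones_like(x), x**2])             # columns: intercept, x^2
        beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(beta_hat)                                          # estimates of (beta0, beta1)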
    (b) If the regression model is E(Yi | xi) = β₀ + β₁ xi, show that

            \sum_{i=1}^{n} (y_i - \bar{y})^2 = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 + \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2.

        (This is Syy = SS_Residuals + SS_Regression, in which SS denotes “sum of squares”.)

        [4 marks]
    (c) For the model in part (b), it is known that

            \mathrm{SS}_{\mathrm{Residuals}} = S_{yy} - S_{xy}^2 / S_{xx}.

        Show that SS_Regression = β̂₁² Sxx. [4 marks]

    Note for parts (b) and (c) of this question:

        S_{xx} = \sum (x_i - \bar{x})^2, \qquad S_{yy} = \sum (y_i - \bar{y})^2, \qquad S_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y}).
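
    An illustrative numerical check of the identities in parts (b) and (c), not part of the
    original question. It fits the simple linear model by least squares (slope β̂₁ = Sxy/Sxx)
    on made-up data and verifies the sum-of-squares decomposition and SS_Regression = β̂₁² Sxx.

        # Verify Syy = SS_Residuals + SS_Regression and SS_Regression = beta1_hat^2 * Sxx.
        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(0, 10, size=60)
        y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=60)

        xbar, ybar = x.mean(), y.mean()
        Sxx = np.sum((x - xbar) ** 2)
        Syy = np.sum((y - ybar) ** 2)
        Sxy = np.sum((x - xbar) * (y - ybar))

        beta1_hat = Sxy / Sxx                    # least squares slope
        beta0_hat = ybar - beta1_hat * xbar      # least squares intercept
        y_hat = beta0_hat + beta1_hat * x

        SS_res = np.sum((y - y_hat) ** 2)
        SS_reg = np.sum((y_hat - ybar) ** 2)

        print(np.isclose(Syy, SS_res + SS_reg))          # part (b) identity
        print(np.isclose(SS_reg, beta1_hat**2 * Sxx))    # part (c) identity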

[End of Paper]
