
Practice problems for Homework 11 - Point Estimation

1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341
students. Which of the following strategies is the best:
a) Pick 5 students from the first row.
b) Pick 5 of your friends from the class.
c) Post a note on eLearning and get the first 5 students who respond.
d) Wait near the door and pick the first 5 students who enter the class.
e) Assign a number to each student in the class and use a random number generator to pick 5
students.
2. (10 marks) Consider the following results of 10 tosses of a coin:
H, T, T, T, T, H, T, H, T, T
a) Estimate the probability of head (H) for this coin.
b) Estimate the standard error of your estimate.
3. (10 marks) Suppose the following data show the number of problems from the Practice
Problems Set attempted in the past week by 10 randomly selected students:
2, 4, 0, 7, 1, 2, 0, 3, 2, 1.
a) Find the sample mean.
b) Find the sample variance.
c) Estimate the mean number of practice problems attempted by a student in the past week.
d) Estimate the standard error of the estimated mean.
4. (10 marks) Let X1, ..., Xn denote a random sample from a Uniform(0, θ) distribution, with
θ > 0 as the unknown parameter. Let X̄ denote the sample mean.
a) Is X̄ unbiased for θ? Explain your answer.
b) Find an unbiased estimator for θ.
c) Find the variance of the estimator in the previous part.
d) Suggest an alternative estimator for θ.

5. (10 marks) A sample of 3 observations, (X1 = 0.4, X2 = 0.7, X3 = 0.9), is
collected from a continuous distribution with density
  f(x) = θ x^(θ−1) for 0 < x < 1.
Estimate θ
a. by the method of moments;
b. by the method of maximum likelihood.
6. (10 marks) (P. 648, #3, first part)
The memory residence times of 13,171 jobs were measured; the sample mean was found
to be 0.05 s and the sample variance 0.006724 s². Assuming that the memory residence time is
gamma-distributed, estimate its parameters r and λ using the method of moments.
7. (10 marks) Derive method of moments and maximum-likelihood estimators for:
a) parameter p based on a Bernoulli(p) sample of size n.
b) parameter p based on a Binomial(N, p) sample of size n.
Compute your estimators if the observed sample is (3, 6, 2, 0, 0, 3) and N = 10.
c) parameter λ based on a Poisson(λ) sample of size n.
d) parameters a and b based on a Uniform(a, b) sample of size n.
e) parameter μ based on a Normal(μ, σ²) sample of size n with known variance σ² and unknown mean μ.
f) parameter σ based on a Normal(μ, σ²) sample of size n with known mean μ and unknown
variance σ².
g) parameters (μ, σ²) based on a Normal(μ, σ²) sample of size n with unknown mean μ and
variance σ².

Solutions:
1.
Strategy (e) is the best as this is the only strategy that ensures that each possible sample of 5
students is equally likely to be selected. The other strategies lead to convenience samples.
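
For concreteness, strategy (e) is easy to carry out with any random number generator; a minimal sketch in Python (the class size of 80 is an assumed value, purely for illustration):

    # Simple random sample of 5 student IDs: every 5-student subset is equally likely.
    import random
    class_size = 80                                # assumed class size
    print(random.sample(range(1, class_size + 1), 5))
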
2.
Let X denote the outcome of a single coin toss, with X = 1 if a head results and X = 0 if a tail
results. Then X is a Bernoulli(p) random variable, where p denotes the probability of head. Let
p̂ denote the estimator of p.
a) The estimated value of p is p̂ = (1 + 0 + 0 + ... + 1 + 0 + 0)/10 = 0.3.
b) The estimated standard error of p̂ is √(p̂(1 − p̂)/n) = √(0.3 · 0.7/10) ≈ 0.14.
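
As a quick numerical check of both estimates, here is a short Python sketch (not part of the original solution):

    # Problem 2: estimate p and its standard error from the 10 tosses (H = 1, T = 0).
    tosses = [1, 0, 0, 0, 0, 1, 0, 1, 0, 0]
    n = len(tosses)
    p_hat = sum(tosses) / n                     # 0.3
    se_hat = (p_hat * (1 - p_hat) / n) ** 0.5   # sqrt(0.021) ≈ 0.14
    print(p_hat, round(se_hat, 2))
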
3.
a) X̄ = (Σᵢ Xi)/n = (2 + 4 + ... + 2 + 1)/10 = 2.2.
b) S² = Σᵢ (Xi − X̄)²/(n − 1) = [(2 − 2.2)² + (4 − 2.2)² + ... + (2 − 2.2)² + (1 − 2.2)²]/(10 − 1) = 4.4.
c) The estimate is X̄ = 2.2.
d) The estimated standard error of X̄ is S/√n = √(4.4/10) ≈ 0.66.
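
These computations can be verified with a few lines of Python (a sketch, not part of the original solution):

    # Problem 3: sample mean, sample variance, and standard error of the mean.
    data = [2, 4, 0, 7, 1, 2, 0, 3, 2, 1]
    n = len(data)
    xbar = sum(data) / n                                 # 2.2
    s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)    # 4.4
    print(xbar, s2, round((s2 / n) ** 0.5, 2))           # 2.2 4.4 0.66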

4.
Let X denote a Uniform(0, θ) random variable.
a) No. This is because E(X) = θ/2, which implies that E(X̄) = E(X) = θ/2 ≠ θ.
b) The estimator 2X̄ is unbiased, since E(2X̄) = θ for all θ.
c) Var(2X̄) = 4 Var(X̄) = 4 Var(X)/n = 4θ²/(12n) = θ²/(3n).
d) Since θ is the maximum possible value in the population, a natural estimator is the maximum value
in the sample, i.e., max{X1, ..., Xn}.
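
A small Monte Carlo sketch illustrates parts (a), (b), and (d); the values θ = 1, n = 5, and the number of replications are assumptions made only for this illustration:

    # Problem 4: 2*X̄ averages to θ, while max(Xi) averages to nθ/(n+1) < θ.
    import random
    theta, n, reps = 1.0, 5, 100_000
    sum1 = sum2 = 0.0
    for _ in range(reps):
        sample = [random.uniform(0, theta) for _ in range(n)]
        sum1 += 2 * sum(sample) / n    # the unbiased estimator 2*X̄
        sum2 += max(sample)            # the sample maximum
    print(sum1 / reps, sum2 / reps)    # approx. 1.0 and 5/6
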
5.
a. Method of moments.
Compute
  μ₁ = E(X) = ∫₀¹ x f(x) dx = ∫₀¹ θ x^θ dx = [θ x^(θ+1)/(θ + 1)] from x = 0 to x = 1 = θ/(θ + 1),
equate it to
  m₁ = X̄ = (0.4 + 0.7 + 0.9)/3 = 2/3,
and solve for θ:
  θ/(θ + 1) = 2/3  ⟹  θ̂ = 2.
b. Method of maximum likelihood.
The joint density is
  f(X1, X2, X3) = ∏ᵢ₌₁³ θ Xi^(θ−1).
Take the logarithm,
  ln f(X1, X2, X3) = Σᵢ₌₁³ {ln θ + (θ − 1) ln Xi} = 3 ln θ + (θ − 1) Σᵢ₌₁³ ln Xi.
Take the derivative, equate it to 0, and solve for θ:
  ∂ ln f/∂θ = 3/θ + Σᵢ₌₁³ ln Xi = 0,
  θ̂ = −3/Σᵢ₌₁³ ln Xi = −3/(ln 0.4 + ln 0.7 + ln 0.9) ≈ 2.1766.
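
Both estimates can be checked numerically (a Python sketch, not part of the original solution):

    # Problem 5: method-of-moments and maximum-likelihood estimates of θ.
    import math
    x = [0.4, 0.7, 0.9]
    m1 = sum(x) / len(x)                                # 2/3
    theta_mom = m1 / (1 - m1)                           # solves θ/(θ+1) = m1; gives 2.0
    theta_mle = -len(x) / sum(math.log(v) for v in x)   # gives 2.1766
    print(round(theta_mom, 4), round(theta_mle, 4))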

6.
For the Gamma distribution, E(X) = r/λ and Var(X) = r/λ².
So, the first two population moments are μ₁ = r/λ and μ₂ = r/λ² + (r/λ)².
Also, the first two sample moments are M₁ = X̄ and M₂ = (1/n) Σᵢ Xi².
Then the method of moments estimators can be obtained by solving the system of equations
  r/λ = M₁,
  r/λ² + (r/λ)² = M₂
with respect to r and λ. Substituting (r/λ)² = M₁² into the second equation gives
r/λ² = M₂ − M₁²; dividing the first equation by this, we get
  λ̂ = M₁/(M₂ − M₁²),  r̂ = λ̂M₁ = M₁²/(M₂ − M₁²).
Notice that
  M₂ − M₁² = (1/n) Σᵢ (Xi − X̄)² = (n − 1)s²/n = (13170)(0.006724)/13171 = 0.006723.
Then we can compute the estimates,
  r̂ = X̄²/0.006723 = 0.372,  λ̂ = X̄/0.006723 = 7.437.

It is easier to use the central second moments, s² and Var(X), instead of M₂ and E(X²).
This way, we solve the system of equations
  E(X) = r/λ = X̄,
  Var(X) = r/λ² = s².
Solving this system in terms of r and λ, we immediately get the method of moments estimates
  r̂ = X̄²/s² = 0.372,  λ̂ = X̄/s² = 7.436.
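
The arithmetic in both versions is easy to reproduce (a Python sketch, not part of the original solution):

    # Problem 6: Gamma parameters by the method of moments.
    xbar, s2, n = 0.05, 0.006724, 13171
    v = (n - 1) * s2 / n            # M2 - M1^2 = (n-1)s²/n = 0.006723
    print(round(xbar**2 / v, 3))    # r̂ = 0.372
    print(round(xbar / v, 3))       # λ̂ = 7.437
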
7.
# 7a (Bernoulli)
This is a special case of #7(b) with N = 1. Following #7(b), both methods result in p̂ = X̄.
# 7b (Binomial)
Method of moments.
To estimate the parameter p of the Binomial(N, p) distribution, we recall that
  μ₁ = E(X) = Np.
There is only one unknown parameter p, hence we write one equation: equate μ₁ = E(X) = Np
to X̄ and solve, p̂ = X̄/N.
Maximum likelihood.
Start with the joint p.m.f.
  f(X1, ..., Xn) = ∏ᵢ (N choose Xi) p^Xi (1 − p)^(N − Xi),
then
  ln f(X1, ..., Xn) = Σᵢ ln(N choose Xi) + (Σᵢ Xi) ln p + Σᵢ (N − Xi) ln(1 − p),
  ∂ ln f/∂p = (Σᵢ Xi)/p − (Σᵢ (N − Xi))/(1 − p) = nX̄/p − (nN − nX̄)/(1 − p) = 0
(find the roots of the derivative in order to maximize the density). Solving this equation,
  (1 − p)X̄ = p(N − X̄)  ⟹  p̂ = X̄/N.
Answer: p̂ = X̄/N.
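
The problem also asks for the numerical value of the estimator for the observed sample; a quick computation (a Python sketch):

    # Problem 7(b): p̂ = X̄/N for the observed sample (3, 6, 2, 0, 0, 3), N = 10.
    sample, N = [3, 6, 2, 0, 0, 3], 10
    xbar = sum(sample) / len(sample)   # 14/6 ≈ 2.3333
    print(round(xbar / N, 4))          # p̂ ≈ 0.2333
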
# 7c (Poisson)
Method of moments.
To estimate the parameter λ of the Poisson(λ) distribution, we recall that μ₁ = E(X) = λ.
There is only one unknown parameter, hence we write one equation,
  μ₁ = λ = m₁ = X̄.
Solving it for λ, we obtain
  λ̂ = X̄,
the method of moments estimator of λ.
Maximum likelihood.
The p.m.f. of the Poisson distribution is P(x) = e^(−λ) λ^x/x!, and its logarithm is
  ln P(x) = −λ + x ln λ − ln(x!).
Thus, we need to maximize
  ln P(X1, ..., Xn) = Σᵢ (−λ + Xi ln λ) + C = −nλ + ln λ Σᵢ Xi + C,
where C = −Σᵢ ln(Xi!) is a constant that does not contain the unknown parameter λ.
Find the critical point(s) of this log-likelihood. Differentiating it and equating the derivative to 0,
we get
  ∂/∂λ ln P(X1, ..., Xn) = −n + (1/λ) Σᵢ Xi = 0.
This equation has only one solution,
  λ̂ = (1/n) Σᵢ Xi = X̄.
Since this is the only critical point, and since the likelihood vanishes (converges to 0)
as λ → 0 or λ → ∞, we conclude that λ̂ is the maximizer. Therefore, it is the maximum likelihood
estimator of λ.
For the Poisson distribution, the method of moments and the method of maximum likelihood
returned the same estimator,
  λ̂ = X̄.
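
A numerical sanity check that the Poisson log-likelihood peaks at X̄ (the toy sample is borrowed from part (b), purely for illustration):

    # Problem 7(c): the log-likelihood is largest at λ = X̄ among nearby values.
    import math
    sample = [3, 6, 2, 0, 0, 3]
    xbar = sum(sample) / len(sample)
    def loglik(lam):
        # ln λ^x e^(-λ)/x! summed over the sample; lgamma(x+1) = ln(x!)
        return sum(-lam + x * math.log(lam) - math.lgamma(x + 1) for x in sample)
    grid = [xbar - 0.5, xbar - 0.1, xbar, xbar + 0.1, xbar + 0.5]
    print(max(grid, key=loglik) == xbar)   # True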

# 7d (Uniform)
Method of moments.
Use two moments, μ₁ = (a + b)/2 and, for example, the central moment Var(X) = (b − a)²/12.
Find â and b̂ by solving the system
  (a + b)/2 = X̄,
  (b − a)²/12 = S²,
which gives
  â = (2X̄ − S√12)/2 = X̄ − S√3,  b̂ = (2X̄ + S√12)/2 = X̄ + S√3.
Maximum likelihood.
The joint density
  f(X1, ..., Xn) = (1/(b − a))ⁿ if a ≤ X1, ..., Xn ≤ b, and 0 otherwise,
is monotonically increasing in a and decreasing in b. It is maximized at the largest value of a
and the smallest value of b for which this density is not 0. These are
  â = min(Xi)  and  b̂ = max(Xi).
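
The two pairs of estimators can be compared on simulated data; a = 2, b = 5, and n = 50 below are assumptions made only for this sketch:

    # Problem 7(d): method-of-moments vs. maximum-likelihood estimates of (a, b).
    import random
    a, b, n = 2.0, 5.0, 50
    x = [random.uniform(a, b) for _ in range(n)]
    xbar = sum(x) / n
    s = (sum((v - xbar) ** 2 for v in x) / (n - 1)) ** 0.5
    print(xbar - s * 3**0.5, xbar + s * 3**0.5)   # method of moments: â, b̂
    print(min(x), max(x))                         # maximum likelihood: â, b̂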

# 7e (Normal with unknown mean)
Method of moments. Equate μ₁ = μ to X̄ and trivially obtain μ̂ = X̄.
Maximum likelihood. The joint density is
  f(X1, ..., Xn) = ∏ᵢ (1/(σ√(2π))) exp(−(Xi − μ)²/(2σ²)).
Then
  ln f(X1, ..., Xn) = −n ln(σ√(2π)) − Σᵢ (Xi − μ)²/(2σ²)
                    = −n ln(σ√(2π)) − (nμ² − 2μ Σᵢ Xi + Σᵢ Xi²)/(2σ²)
is a parabola in terms of μ, and it is maximized at
  μ̂ = (Σᵢ Xi)/n = X̄.

# 7f (Normal with unknown variance)
Method of moments. The first population moment is not a function of σ, thus we equate the
second (central, for simplicity) population moment σ² to the sample moment S² and obtain σ̂ = S.
Since μ is known, we can also use the estimator
  σ̂² = (1/n) Σᵢ (Xi − μ)²
instead of S².
Maximum likelihood. Differentiate ln f(X1, ..., Xn) from #7e with respect to σ,
  ∂ ln f/∂σ = −n/σ + Σᵢ (Xi − μ)²/σ³ = 0
(find the roots of the derivative in order to maximize the density), so that
  σ̂ = √( Σᵢ (Xi − μ)²/n ).
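
A short check of the #7f maximum-likelihood formula (the data of Problem 3 and μ = 2 are assumed inputs, for illustration only):

    # Problem 7(f): MLE of σ with known mean μ.
    data = [2, 4, 0, 7, 1, 2, 0, 3, 2, 1]
    mu = 2.0
    print((sum((x - mu) ** 2 for x in data) / len(data)) ** 0.5)   # 2.0
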
# 7g (Normal with both parameters unknown)
Method of moments.
When both μ and σ are unknown, we equate the first and second (central, for simplicity) moments
and trivially obtain μ̂ = X̄ and σ̂² = S², so that σ̂ = S.
Maximum likelihood. First, maximizing ln f(X1, ..., Xn) from #7e in μ, we again obtain
  μ̂ = X̄,
regardless of the value of σ. Then, substitute this maximizer into ln f(X1, ..., Xn) in place
of the unknown μ and maximize the resulting function in terms of σ. We get the same
answer as in #7(f) with the unknown parameter μ replaced by X̄,
  σ̂ = √( Σᵢ (Xi − X̄)²/n ),
which divides by n rather than n − 1 and therefore differs slightly from S.
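
To see how the maximum-likelihood estimate differs from S, compare the two denominators on the data of Problem 3 (an assumed input, for illustration only):

    # Problem 7(g): σ̂ divides by n; S divides by n - 1.
    data = [2, 4, 0, 7, 1, 2, 0, 3, 2, 1]
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)
    print((ss / n) ** 0.5, (ss / (n - 1)) ** 0.5)   # ≈ 1.990 vs. 2.098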
