ESTIMATION
Any statistic used to estimate the value of an unknown parameter θ is called an estimator of θ. The
observed value of the estimator is called the estimate. Suppose that the random variables
$x_1, x_2, \ldots, x_n$, whose joint distribution is assumed given except for the unknown parameter θ, are
observed. The problem of interest is to use the observed values to estimate θ. For example, the $x_i$'s
might be independent exponential random variables, each having the same unknown mean θ. The joint
density function of the random variables would then be given by
$$f(x_1, x_2, \ldots, x_n) = f(x_1) \cdot f(x_2) \cdots f(x_n)$$
$$= \frac{1}{\theta} e^{-x_1/\theta} \cdot \frac{1}{\theta} e^{-x_2/\theta} \cdots \frac{1}{\theta} e^{-x_n/\theta}, \qquad 0 < x_i < \infty; \; i = 1, \ldots, n$$
$$= \frac{1}{\theta^n} \exp\left(-\frac{\sum_{i=1}^{n} x_i}{\theta}\right), \qquad 0 < x_i < \infty; \; i = 1, \ldots, n$$
and the objective would be to estimate θ from the observed data $x_1, x_2, \ldots, x_n$.
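As a quick numerical check (not part of the original derivation), the compact form of the joint density above can be compared against the product of the individual exponential densities; the sample values and the mean θ = 2.0 below are arbitrary:

```python
import math

def exp_joint_density(xs, theta):
    """Joint density of i.i.d. exponential observations with mean theta:
    f(x1, ..., xn) = theta^(-n) * exp(-sum(x_i) / theta)."""
    return theta ** (-len(xs)) * math.exp(-sum(xs) / theta)

# Product of the individual densities (1/theta) * exp(-x/theta)
xs = [1.5, 0.7, 2.2]
theta = 2.0
product = 1.0
for x in xs:
    product *= (1 / theta) * math.exp(-x / theta)

# The two expressions agree, as the derivation shows.
```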
A particular type of estimator, known as
the maximum likelihood estimator, is widely used in statistics.
It provides a procedure for deriving the point estimator of the parameter directly.
Consider a random variable with density function $f(x, \theta)$, in which θ is the parameter. We then
inquire: what is the most likely value of θ that produced the set of observations $x_1, x_2, \ldots, x_n$? That is,
what value of θ will maximize the likelihood of obtaining the observed set?
The likelihood of obtaining a particular sample value $x_i$ can be assumed to be proportional to the value
of the density function evaluated at $x_i$. Hence the likelihood of obtaining the $n$ independent observations
$x_1, x_2, \ldots, x_n$ is
$$L(x_1, \ldots, x_n; \theta) = f(x_1, \theta) \cdot f(x_2, \theta) \cdots f(x_n, \theta)$$
The maximum likelihood estimator $\hat{\theta}$ is the value of θ that maximizes the likelihood function
$L(x_1, \ldots, x_n; \theta)$. This estimator may be obtained by differentiating $L(x_1, \ldots, x_n; \theta)$ with respect to θ
and setting the derivative equal to zero; i.e.,
$$\frac{\partial L(x_1, \ldots, x_n; \theta)}{\partial \theta} = 0$$
Because of the multiplicative nature of the likelihood function, it is frequently more convenient to
maximize the logarithm of the likelihood function instead; that is,
$$\frac{\partial \log L(x_1, \ldots, x_n; \theta)}{\partial \theta} = 0$$
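Since the logarithm is a monotone function, maximizing $\log L$ yields the same $\hat{\theta}$ as maximizing $L$ itself. A small sketch illustrates this, using the exponential likelihood derived earlier, an arbitrary sample, and a simple grid search in place of calculus:

```python
import math

def likelihood(theta, xs):
    # Exponential likelihood from above: theta^(-n) * exp(-sum(x_i)/theta)
    return theta ** (-len(xs)) * math.exp(-sum(xs) / theta)

def log_likelihood(theta, xs):
    # Its logarithm: -n log(theta) - sum(x_i)/theta
    return -len(xs) * math.log(theta) - sum(xs) / theta

xs = [1.2, 3.0, 6.3]                          # arbitrary sample values
grid = [0.5 + 0.01 * k for k in range(1000)]  # candidate values of theta
best_from_L = max(grid, key=lambda th: likelihood(th, xs))
best_from_logL = max(grid, key=lambda th: log_likelihood(th, xs))
# Both searches land on the same value, near the sample mean.
```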
For density functions with two or more parameters, the likelihood function becomes
$$L(x_1, \ldots, x_n; \theta_1, \ldots, \theta_m) = \prod_{i=1}^{n} f(x_i; \theta_1, \ldots, \theta_m)$$
where $\theta_1, \ldots, \theta_m$ are the $m$ parameters to be
estimated. In this case, the maximum likelihood estimators would be obtained from the solution to the
following simultaneous equations:
$$\frac{\partial L(x_1, \ldots, x_n; \theta_1, \ldots, \theta_m)}{\partial \theta_j} = 0; \qquad j = 1, \ldots, m$$
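For instance, a normal density has the two parameters μ and σ, so its log-likelihood must be maximized over both simultaneously. The sketch below (sample values arbitrary) solves the two simultaneous conditions numerically with a coarse grid search rather than calculus; the known closed-form answers are the sample mean and the root-mean-square deviation from it:

```python
import math

def normal_log_likelihood(mu, sigma, xs):
    # log L(x1,...,xn; mu, sigma) for independent normal observations
    n = len(xs)
    return (-n * math.log(sigma)
            - 0.5 * n * math.log(2 * math.pi)
            - sum((x - mu) ** 2 for x in xs) / (2 * sigma ** 2))

xs = [2.0, 4.0, 6.0]                         # arbitrary sample values
mu_grid = [3.0 + 0.1 * i for i in range(21)]
sigma_grid = [1.0 + 0.1 * j for j in range(21)]
mu_hat, sigma_hat = max(
    ((m, s) for m in mu_grid for s in sigma_grid),
    key=lambda p: normal_log_likelihood(p[0], p[1], xs),
)
# mu_hat is near the sample mean; sigma_hat is near sqrt(sum((x - mean)^2) / n).
```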
For large n, the MLE is often considered the best estimator, in the sense that it asymptotically attains
the minimum variance.
Example
Suppose that n independent trials, each of which is a success with probability p, are performed. What is
the maximum likelihood estimator of p?
Solution
Each trial results in $x_i = 1$ (success) with probability $p$ and $x_i = 0$ (failure) with probability $1 - p$; that is,
$$P\{X_i = x\} = p^x (1 - p)^{1 - x}, \qquad x = 0, 1$$
Now, the likelihood (that is, the joint p.m.f.) of the data, assuming independence of trials, is given by
$$L(x_1, \ldots, x_n; p) = \prod_{i=1}^{n} p^{x_i} (1 - p)^{1 - x_i} = p^{\sum x_i} (1 - p)^{n - \sum x_i}$$
To determine the value of p that maximizes the likelihood, first take logs to obtain
$$\log L = \sum_{i=1}^{n} x_i \log p + \left(n - \sum_{i=1}^{n} x_i\right) \log(1 - p)$$
Differentiating with respect to p and setting the result equal to zero gives
$$\frac{\sum x_i}{p} - \frac{n - \sum x_i}{1 - p} = 0 \quad \Longrightarrow \quad \hat{p} = \frac{1}{n} \sum_{i=1}^{n} x_i$$
so the maximum likelihood estimator of p is the observed fraction of successes.
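The closed-form result $\hat{p} = \sum x_i / n$ can be checked numerically; the trial outcomes below are arbitrary, and a grid search stands in for the calculus:

```python
import math

def bernoulli_log_likelihood(p, xs):
    # log L = (sum x_i) log p + (n - sum x_i) log(1 - p)
    s, n = sum(xs), len(xs)
    return s * math.log(p) + (n - s) * math.log(1 - p)

xs = [1, 0, 1, 1, 0, 1, 1, 0]        # 5 successes in 8 arbitrary trials
p_hat = sum(xs) / len(xs)            # closed-form MLE from the derivation
grid = [0.01 * k for k in range(1, 100)]
p_grid = max(grid, key=lambda p: bernoulli_log_likelihood(p, xs))
# The grid maximizer sits next to the closed-form value 5/8.
```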
Example
The times between successive arrivals of vehicles in a traffic flow were observed as follows:
1.2, 3.0, 6.3, 10.1, 5.2, 2.4, 7.1 sec. Suppose the interarrival time of vehicles follows an exponential
distribution; that is,
$$f(t) = \frac{1}{\lambda} e^{-t/\lambda}$$
Determine the maximum likelihood estimator (MLE) for the mean interarrival time λ.
Solution
The likelihood function is
$$L(t_1, \ldots, t_7; \lambda) = \prod_{i=1}^{7} \frac{1}{\lambda} e^{-t_i/\lambda} = \frac{1}{\lambda^7} \exp\left(-\frac{\sum_{i=1}^{7} t_i}{\lambda}\right)$$
where $t_i$ is the $i$th observed interarrival time. Differentiating with respect to λ and setting the result
equal to zero,
$$\frac{\partial L}{\partial \lambda} = -\frac{7}{\lambda^8} \exp\left(-\frac{\sum t_i}{\lambda}\right) + \frac{1}{\lambda^7} \exp\left(-\frac{\sum t_i}{\lambda}\right) \cdot \frac{\sum t_i}{\lambda^2} = 0$$
or
$$-\frac{7}{\lambda} + \frac{\sum t_i}{\lambda^2} = 0$$
from which
$$\hat{\lambda} = \frac{1}{7} \sum_{i=1}^{7} t_i = \frac{35.3}{7} = 5.04 \text{ sec}$$
In general, therefore, the MLE for λ from a sample of size $n$ is
$$\hat{\lambda} = \frac{1}{n} \sum_{i=1}^{n} t_i$$
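Plugging the observed interarrival times into the general formula reproduces the estimate above; a quick sketch:

```python
# Observed interarrival times from the example, in seconds
times = [1.2, 3.0, 6.3, 10.1, 5.2, 2.4, 7.1]

# MLE of the mean interarrival time: lambda_hat = (1/n) * sum(t_i)
lam_hat = sum(times) / len(times)
print(round(lam_hat, 2))   # 5.04
```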