DSP Estimation Theory
Professor A G Constantinides©
Let a random variable X have a probability
distribution dependent on a parameter θ.
The parameter θ lies in a space Θ of all
possible parameters.
Let f_X(x | θ) be the probability density
function of X.
Assume that the mathematical form of f_X is
known, but θ is not.
The joint pdf of m sample random variables,
evaluated at the sample points
x_1, x_2, ..., x_m,
is given as

l(θ, x_1, x_2, ..., x_m) = l(θ, x) = Π_{i=1}^{m} f_X(x_i | θ)
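As a concrete sketch (not part of the slides), the likelihood of a sample under an assumed Gaussian model is simply the product of the per-sample densities; the names `gaussian_pdf` and `likelihood` below are illustrative, not from the original:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood(theta, xs):
    """l(theta, x): product of f_X(x_i | theta) over the sample points."""
    mu, sigma = theta
    prod = 1.0
    for x in xs:
        prod *= gaussian_pdf(x, mu, sigma)
    return prod
```

For a single observation the likelihood reduces to the density itself; for independent samples it factorises exactly as in the product formula above.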
The likelihood function is a function of the
unknown parameter θ for a fixed set of
observations.
The Maximum Likelihood Principle requires
us to select the value of θ that maximises
the likelihood function.
The parameter θ may also be regarded as
a vector of parameters
θ = [θ_1 θ_2 ... θ_k]^T
It is often more convenient to use the log-likelihood

ℓ(θ, x) = log f_X(x | θ)

The maximum is then at

∂ℓ(θ, x)/∂θ = 0
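A minimal numerical sketch of this idea (my own illustration, assuming a Gaussian model with known unit variance): maximise the log-likelihood over μ by a crude grid search and observe that the maximiser agrees with the zero-gradient prediction, i.e. the sample mean:

```python
import math

def log_likelihood(mu, xs, sigma=1.0):
    # log of the product of N(mu, sigma^2) densities = sum of log-densities
    return sum(-(x - mu) ** 2 / (2 * sigma ** 2)
               - math.log(sigma * math.sqrt(2 * math.pi)) for x in xs)

xs = [1.2, 0.8, 1.0, 1.4, 0.6]          # toy observations; sample mean is 1.0
grid = [i / 1000 for i in range(0, 2001)]
# crude grid search for the maximiser of the log-likelihood
mu_hat = max(grid, key=lambda m: log_likelihood(m, xs))
```

In practice one solves the gradient equation directly; the grid search is only meant to make the "maximise over θ" step tangible.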
DSP An example
Let x = {x_1, x_2, ..., x_m} be a random sample
selected from a normal distribution
N(μ, σ²)
Form the log-likelihood function

ℓ(θ, x) = -m log(√(2π)) - m log σ - (1/(2σ²)) Σ_{i=1}^{m} (x_i - μ)²

Hence

∂ℓ(θ, x)/∂μ = (1/σ²) Σ_{i=1}^{m} (x_i - μ) = 0

∂ℓ(θ, x)/∂σ = -m/σ + (1/σ³) Σ_{i=1}^{m} (x_i - μ)² = 0

or

μ̂_ML = (1/m) Σ_{i=1}^{m} x_i,   σ̂²_ML = (1/m) Σ_{i=1}^{m} (x_i - μ̂_ML)²
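The closed-form estimates are straightforward to implement; this sketch (function name `ml_normal` is my own) computes both, noting that the ML variance estimate divides by m, not m-1, and is therefore biased:

```python
def ml_normal(xs):
    """Closed-form ML estimates for N(mu, sigma^2).

    mu_hat   = sample mean
    sigma2_hat = average squared deviation about mu_hat
    (note the 1/m factor, not 1/(m-1): the ML variance estimate is biased)
    """
    m = len(xs)
    mu_hat = sum(xs) / m
    sigma2_hat = sum((x - mu_hat) ** 2 for x in xs) / m
    return mu_hat, sigma2_hat

# e.g. ml_normal([2.0, 4.0]) gives mu_hat = 3.0 and sigma2_hat = 1.0
```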
DSP Fisher and Cramer-Rao
The Fisher Information helps in placing a bound on
estimators:

J(θ) = E{ (∂ℓ(θ, x)/∂θ) (∂ℓ(θ, x)/∂θ)^T }

Cramér-Rao Lower Bound: "If t(X) is any unbiased
estimator of θ, then

E{ [t(X) - θ][t(X) - θ]^T } ≥ J(θ)⁻¹

where the inequality means the difference of the two
matrices is positive semidefinite."
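For the Gaussian mean with σ known, a standard calculation gives J(μ) = m/σ², so the CRLB for any unbiased estimator of μ is σ²/m, and the sample mean attains it. The simulation below (my own illustration, not from the slides) checks this empirically:

```python
import random
import statistics

# For N(mu, sigma^2) with sigma known: J(mu) = m / sigma^2,
# so the CRLB on the variance of any unbiased estimator of mu
# is sigma^2 / m.  The sample mean attains this bound.
random.seed(0)
mu, sigma, m, trials = 0.0, 2.0, 50, 2000
crlb = sigma ** 2 / m                     # = 0.08

means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(m))
         for _ in range(trials)]
empirical_var = statistics.pvariance(means)
# empirical_var should be close to crlb
```

This is an illustration, not a proof: the simulation only shows that the sample mean's variance matches the bound for this particular model.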
It can be seen that if we model the
observations as the output of an AR
process driven by zero-mean Gaussian
noise, then the Maximum Likelihood
estimator for the variance is also the
Least Squares estimator.
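A sketch of this connection for the simplest case (AR(1), my own illustration): with Gaussian driving noise, maximising the conditional likelihood over the AR coefficient is the same as minimising the sum of squared one-step prediction errors, so the ML coefficient estimate coincides with the least-squares one:

```python
import random

def ar1_ls(x):
    """Least-squares (= conditional ML under Gaussian noise) estimate of
    the coefficient a in x[n] = a*x[n-1] + w[n]:
        a_hat = sum(x[n]*x[n-1]) / sum(x[n-1]^2)
    """
    num = sum(x[n] * x[n - 1] for n in range(1, len(x)))
    den = sum(x[n - 1] ** 2 for n in range(1, len(x)))
    return num / den

# simulate an AR(1) process with known coefficient and recover it
random.seed(1)
a_true = 0.7
x = [0.0]
for _ in range(5000):
    x.append(a_true * x[-1] + random.gauss(0.0, 1.0))
a_hat = ar1_ls(x)   # should be close to a_true
```

The residual variance of this fit is then the ML estimate of the noise variance, mirroring the Gaussian example above.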