
Estimation theory

Johan E. Carlson

Dept. of Computer Science, Electrical and Space Engineering

Lule University of Technology

Lecture 1

1 / 26

Outline

1. General course information

2. Introduction (Chapter 1)

2.1. Estimation theory in Signal Processing

2.2. Problem formulation

2.3. Assessing estimator performance

3. Minimum variance unbiased estimation (Chapter 2)

3.1. Unbiased estimators

3.2. Existence of the MVUB estimator

3.3. Finding the MVUB estimator

3.4. Extension to a vector parameter

2 / 26


http://staff.www.ltu.se/~johanc/estimation_theory/

Textbook:

Steven M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory, Vol. 1. Prentice Hall, 1993. ISBN-10: 0133457117

Examination:

Completion of theoretical homework assignments (written solutions to be handed in to me).

Completion of computer assignments (short lab reports to me).

3 / 26

Modern estimation theory is central to many electrical systems, e.g.

Radar, Sonar, Acoustics

Speech

Image and video

Biomedicine (Biomedical engineering)

Communications (Channel estimation, synchronization, etc.)

Automatic control

Seismology

4 / 26


Application example: layered materials

Locate flaws/cracks in internal layers

5 / 26

Simply put, given an observed N-point data set

{x[0], x[1], . . . , x[N−1]}

which depends on an unknown parameter θ, we define the estimator

θ̂ = g(x[0], x[1], . . . , x[N−1]),

where g is some function.

6 / 26


The data are inherently random, and are therefore described in terms of probability density functions (PDF), i.e.

p(x[0], x[1], . . . , x[N−1]; θ),

where the semicolon (;) denotes that the PDF is parameterized by the unknown parameter θ.

The estimation problem is thus to find (infer) the value of θ from the observations. The PDF should be chosen so that

It takes into account any prior knowledge or constraints

It is mathematically tractable

7 / 26

Example: The Dow-Jones index

[Figure: Dow-Jones average plotted against day number (0 to 100); the values rise roughly linearly from about 2800 to 3200.]

8 / 26


A reasonable model could then be

x[n] = A + Bn + w[n], n = 0, 1, . . . , N−1,

where w[n] is white Gaussian noise (WGN), i.e. each sample of w[n] has the PDF N(0, σ²) and is uncorrelated with all the other samples. The unknown parameters can be arranged in the vector θ = [A B]ᵀ. The PDF of x[n] then is

p(x; θ) = (2πσ²)^(−N/2) exp( −(1/(2σ²)) Σ_{n=0}^{N−1} (x[n] − A − Bn)² )

9 / 26
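As a quick numerical sanity check of this model (not part of the original handout), one can simulate a realization of x[n] and evaluate the log of p(x; θ); the values of A, B, σ and N below are arbitrary illustrative choices:

```python
import math
import random

random.seed(0)

# Arbitrary illustrative values (not from the handout)
A, B, sigma, N = 2900.0, 3.0, 10.0, 100

# Simulate x[n] = A + B*n + w[n], with w[n] ~ N(0, sigma^2)
x = [A + B * n + random.gauss(0.0, sigma) for n in range(N)]

def log_pdf(x, A, B, sigma):
    """Log of p(x; theta) for the straight-line-in-WGN model."""
    N = len(x)
    ssq = sum((x[n] - A - B * n) ** 2 for n in range(N))
    return -0.5 * N * math.log(2 * math.pi * sigma ** 2) - ssq / (2 * sigma ** 2)

# The log-PDF at the true parameters exceeds that at a clearly wrong offset
print(log_pdf(x, A, B, sigma) > log_pdf(x, A + 50, B, sigma))  # True
```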

Example: The Dow-Jones index

The straight-line assumption is consistent with data. A models the offset and B models the linear increase over time.

The choice of the Gaussian PDF makes the model mathematically tractable.

Here the parameters θ are assumed to be unknown, but deterministic.

One could also assume that θ is random but constrained, say A is in [2800, 3200], and that it is uniformly distributed in this interval. This would lead to a Bayesian approach, and the joint PDF would be

p(x, θ) = p(x|θ)p(θ)

10 / 26


Example: a DC level in noise (a constant value corrupted by noise).

[Figure: one realization of x[n] for n = 0, . . . , 100; the samples fluctuate roughly between −1 and 3.]

x[n] = A + w[n],

where w[n] is N(0, σ²).

11 / 26

How to estimate A?

A reasonable estimator would be the sample mean

Â = (1/N) Σ_{n=0}^{N−1} x[n]

How close will Â be to A?

Are there any better estimators than the sample mean?

12 / 26
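The sample-mean estimator is easy to try out in a short simulation (a sketch; the values of A, σ and N are arbitrary illustrative choices, not from the handout):

```python
import random

random.seed(1)

# Arbitrary illustrative values (not from the handout)
A, sigma, N = 1.0, 0.5, 1000

# One realization of x[n] = A + w[n], w[n] ~ N(0, sigma^2)
x = [A + random.gauss(0.0, sigma) for _ in range(N)]

# Sample-mean estimator: A_hat = (1/N) * sum of x[n]
A_hat = sum(x) / N

print(abs(A_hat - A) < 0.1)  # close to the true A for large N
```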


Consider instead the estimator

Â = x[0]

Intuitively, this should be worse, since it is not using all available data. But, for any given realization of x[n] it might actually be closer to the true A than the sample mean.

So, how do we assess the performance? We need to consider the estimators from a statistical perspective!

Let's consider E(Â) and var(Â).

13 / 26

For the first estimator

E(Â) = E[ (1/N) Σ_{n=0}^{N−1} x[n] ] = (1/N) Σ_{n=0}^{N−1} E(x[n]) = A

For the second estimator

E(Â) = E(x[0]) = A

14 / 26


For the first estimator

var(Â) = var[ (1/N) Σ_{n=0}^{N−1} x[n] ] = (1/N²) Σ_{n=0}^{N−1} var(x[n]) = (1/N²) · Nσ² = σ²/N

For the second estimator

var(Â) = var(x[0]) = σ²

15 / 26

So, the expected value of both estimators is E(Â) = A, i.e. they are both unbiased.

The variance of the second estimator is σ², which is larger than the variance σ²/N of the first estimator.

It appears that the sample mean is indeed a better estimator than x[0]!

16 / 26
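Both variance expressions can be checked by Monte Carlo simulation (a sketch; A, σ, N and the trial count are arbitrary illustrative choices, not from the handout):

```python
import random

random.seed(2)

# Arbitrary illustrative values (not from the handout)
A, sigma, N, trials = 0.0, 1.0, 25, 20000

mean_est, first_est = [], []
for _ in range(trials):
    x = [A + random.gauss(0.0, sigma) for _ in range(N)]
    mean_est.append(sum(x) / N)   # sample-mean estimator
    first_est.append(x[0])        # single-sample estimator

def var(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

print(var(mean_est))   # approx sigma^2 / N = 0.04
print(var(first_est))  # approx sigma^2 = 1.0
```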


In this course, we will mainly consider estimation of unknown but deterministic parameters.

We will restrict the search for estimators to those that on average yield the true parameter value, i.e. to unbiased estimators.

Among all possible unbiased estimators, we will then look for the one with the minimum variance, i.e. the minimum variance unbiased (MVUB) estimator.

17 / 26

Unbiased estimators

An estimator is said to be unbiased if

E(θ̂) = θ,

for all possible values of θ.

If θ̂ = g(x), this means that

E(θ̂) = ∫ g(x) p(x; θ) dx = θ,

for all θ.

18 / 26
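The "for all θ" part of the definition can be probed numerically for the sample-mean estimator by repeating a Monte Carlo average at several parameter values (a sketch; the θ values, σ, N and trial count are arbitrary illustrative choices, not from the handout):

```python
import random

random.seed(4)

# Arbitrary illustrative values (not from the handout)
sigma, N, trials = 1.0, 20, 20000

for theta in (-3.0, 0.0, 5.0):  # a few representative parameter values
    est = []
    for _ in range(trials):
        x = [theta + random.gauss(0.0, sigma) for _ in range(N)]
        est.append(sum(x) / N)  # theta_hat = g(x) = sample mean
    # Monte Carlo estimate of E(theta_hat) lands close to theta every time
    print(abs(sum(est) / trials - theta) < 0.02)
```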


A natural performance criterion is the mean square error (MSE)

mse(θ̂) = E[ (θ̂ − θ)² ]

19 / 26

Unfortunately, this criterion often leads to unrealizable estimators, since

mse(θ̂) = E[ (θ̂ − E(θ̂) + E(θ̂) − θ)² ]

       = var(θ̂) + [E(θ̂) − θ]²

       = var(θ̂) + b²(θ),

which shows that the MSE depends both on the variance of the estimator and on the bias. If the bias depends on the parameter itself, we're in trouble!

Let's restrict ourselves to search only for unbiased estimators!

20 / 26
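The decomposition mse = var + bias² can be illustrated with a deliberately biased estimator; the identity even holds exactly for the corresponding sample moments (a sketch; the shrinkage estimator and the constants below are illustrative choices, not from the handout):

```python
import random

random.seed(3)

# Arbitrary illustrative values (not from the handout)
A, sigma, N, trials = 2.0, 1.0, 10, 50000

est = []
for _ in range(trials):
    x = [A + random.gauss(0.0, sigma) for _ in range(N)]
    est.append(0.5 * sum(x) / N)  # deliberately biased: E(est) = A/2

mean = sum(est) / trials
mse  = sum((e - A) ** 2 for e in est) / trials
var  = sum((e - mean) ** 2 for e in est) / trials
bias = mean - A

# mse(theta_hat) = var(theta_hat) + b^2(theta); exact for sample moments
print(abs(mse - (var + bias ** 2)) < 1e-9)  # True
```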


Does the MVUB estimator always exist?

No!

21 / 26

A counterexample to the existence

Assume that we have two independent observations x[0] and x[1] with PDF

x[0] ~ N(θ, 1)

x[1] ~ N(θ, 1) for θ ≥ 0

x[1] ~ N(θ, 2) for θ < 0

22 / 26


Consider the two estimators

θ̂₁ = (1/2)(x[0] + x[1])

θ̂₂ = (2/3)x[0] + (1/3)x[1]

Their variances are

var(θ̂₁) = (1/4)(var(x[0]) + var(x[1]))

var(θ̂₂) = (4/9)var(x[0]) + (1/9)var(x[1])

23 / 26

As a result (looking back at the PDFs), we have that

var(θ̂₁) = 18/36 for θ ≥ 0, and 27/36 for θ < 0

var(θ̂₂) = 20/36 for θ ≥ 0, and 24/36 for θ < 0

So, for θ ≥ 0, the minimum variance is 18/36 (estimator 1) and for θ < 0 it is 24/36 (estimator 2). Hence, no single estimator has the uniformly minimum variance for all θ.

24 / 26
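This variance bookkeeping follows directly from var(x[0]) and var(x[1]) in the two regimes, and can be verified with exact rational arithmetic (a small sketch using Python's fractions module):

```python
from fractions import Fraction as F

def var1(v0, v1):
    # theta_hat_1 = (x[0] + x[1]) / 2
    return F(1, 4) * (v0 + v1)

def var2(v0, v1):
    # theta_hat_2 = (2/3) x[0] + (1/3) x[1]
    return F(4, 9) * v0 + F(1, 9) * v1

# theta >= 0: var(x[0]) = var(x[1]) = 1
print(var1(1, 1), var2(1, 1))  # 1/2 5/9  (i.e. 18/36 and 20/36)

# theta < 0: var(x[0]) = 1, var(x[1]) = 2
print(var1(1, 2), var2(1, 2))  # 3/4 2/3  (i.e. 27/36 and 24/36)
```

For θ ≥ 0 the first estimator wins; for θ < 0 the second does, so neither is uniformly best.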


In general, there is no systematic procedure that is guaranteed to produce the MVUB estimator.

There are some possible approaches, though:

Determine the Cramér-Rao lower bound (CRLB) and check if some estimator satisfies it (Chapters 3 and 4).

Apply the Rao-Blackwell-Lehmann-Scheffé (RBLS) theorem (Chapter 5).

Further restrict the estimators to also be linear, and then find the MVUB estimator within this class (Chapter 6).

25 / 26

If θ = [θ₁, θ₂, . . . , θₚ]ᵀ is a vector of unknown parameters, we say that an estimator is unbiased if

E(θ̂ᵢ) = θᵢ, for i = 1, 2, . . . , p.

By defining

E(θ̂) = [E(θ̂₁), E(θ̂₂), . . . , E(θ̂ₚ)]ᵀ,

this condition can be written compactly as E(θ̂) = θ.

26 / 26
