
Statistical Signal Processing

Chapter 2: DETECTION THEORY


Instructor:

Phuong T. Tran, PhD.


(slides adapted from Prof. Natasha Devroye, UIC)

Ton Duc Thang University


Faculty of Electrical and Electronics Engineering

Apr. 27 2019
Outline

1 Matched Filters
Development of Detector
Performance of Matched Filters

2 Generalized Matched Filters


Test Statistics
Performance of Generalized Matched Filter

3 Multiple Signals

Apr. 27 2019 EE702030 - Lecture 4: Deterministic Signal in Noise 2 / 24




Objectives

Understand the concept of the matched filter for detection of a deterministic signal in noise.
Apply the matched filter to solve practical detection problems.
Evaluate the performance of generalized matched filters.


Matched Filters Development of Detector



Consider the problem of detecting the presence of a known signal s[n], n = 0, 1, ..., N − 1, in Gaussian noise. This leads to the binary hypothesis test:

H0 : x[n] = w[n]
H1 : x[n] = s[n] + w[n],   n = 0, 1, ..., N − 1

where w[n] is WGN with zero mean and variance σ², and x[n] is the received signal.
The autocorrelation function of w[n] is r_ww[k] = E[w[n]w[n + k]] = σ²δ[k], where δ[k] = 1 for k = 0 and 0 otherwise.
By the NP test, we decide H1 if the following test statistic exceeds the threshold γ′ = σ² ln γ + (1/2) Σ_{n=0}^{N−1} s²[n]:

T(x) = Σ_{n=0}^{N−1} x[n]s[n] > γ′    (1)
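The NP detector above can be sketched in a few lines of Python. This is a minimal sketch; the signal shape, σ², and the LRT threshold γ are hypothetical illustrative values, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical known signal and noise level (illustrative only)
N, sigma2 = 100, 1.0
s = np.cos(2 * np.pi * 0.1 * np.arange(N))          # known signal s[n]

# Received data under H1: x[n] = s[n] + w[n], with w[n] ~ N(0, sigma^2) WGN
x = s + rng.normal(scale=np.sqrt(sigma2), size=N)

# Replica-correlator test statistic (1): T(x) = sum_n x[n] s[n]
T = float(np.dot(x, s))

# Threshold: gamma' = sigma^2 ln(gamma) + (1/2) sum_n s^2[n]
gamma = 1.0                                          # hypothetical LRT threshold
gamma_prime = sigma2 * np.log(gamma) + 0.5 * float(np.dot(s, s))

decide_H1 = T > gamma_prime
```

With γ = 1 (ln γ = 0), the threshold reduces to half the signal energy, E/2.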



Correlator / Matched Filter

The above test is called the correlator or replica-correlator. It is optimal in white Gaussian noise.
We can represent T(x) as

T(x) = Σ_{n=0}^{N−1} x[n]h[N − 1 − n] = x[n] ⊗ h[n] |_{n=N−1}

where h[n] = s[N − 1 − n], n = 0, 1, ..., N − 1, and ⊗ denotes convolution.
That means we can feed the received signal through an FIR filter with impulse response h[n], then make our decision by sampling the output of the filter at time N − 1 and comparing it with the threshold γ′.
This filter is called a matched filter because its impulse response "matches" the signal (it is a time-reversed version of s[n]).
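The equivalence between the correlator and the matched-filter (FIR) implementation can be checked numerically. A sketch using a hypothetical random signal:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50
s = rng.normal(size=N)               # hypothetical known signal s[n]
x = s + rng.normal(size=N)           # received signal under H1

# Matched-filter impulse response: time-reversed signal, h[n] = s[N-1-n]
h = s[::-1]

# Correlator output T(x) = sum_n x[n] s[n] ...
T_corr = float(np.dot(x, s))

# ... equals the FIR filter (convolution) output sampled at time n = N-1
T_filt = float(np.convolve(x, h)[N - 1])
```

Both numbers agree because (x ⊗ h)[N−1] = Σ_k x[k] h[N−1−k] = Σ_k x[k] s[k].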
Matched Filters Performance of Matched Filters




Performance of Matched Filters

It can be shown that the matched filter maximizes the SNR at the output of the FIR filter. In the frequency domain, it emphasizes the bands in which more signal power is located.
Its performance (PD) can be derived explicitly as a function of PFA:

PD = Q( Q⁻¹(PFA) − √(E/σ²) ),    (2)

where E = Σ_{n=0}^{N−1} s²[n] is the signal energy.
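Equation (2) can be evaluated directly. This sketch uses only the Python standard library (statistics.NormalDist), with Q(x) = 1 − Φ(x); the function name is illustrative:

```python
import math
from statistics import NormalDist

_std = NormalDist()                       # standard normal N(0, 1)

def Q(x):
    """Right-tail probability Q(x) = Pr{N(0,1) > x}."""
    return 1.0 - _std.cdf(x)

def Qinv(p):
    """Inverse of Q: Q(Qinv(p)) = p."""
    return _std.inv_cdf(1.0 - p)

def pd_matched_filter(pfa, energy, sigma2):
    """P_D = Q(Q^{-1}(P_FA) - sqrt(E / sigma^2)), eq. (2)."""
    return Q(Qinv(pfa) - math.sqrt(energy / sigma2))
```

Note that with E = 0 the formula collapses to PD = PFA, and PD grows monotonically with the energy-to-noise ratio E/σ².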

Example 1

While the performance of the matched filter in white (uncorrelated) Gaussian noise is unaffected by the signal shape, this is not the case for colored (correlated) Gaussian noise.

Example 1:
Consider the binary hypothesis test:

H0 : x[n] = w[n]
H1 : x[n] = s[n] + w[n],   n = 0, 1, ..., N − 1

where w[n] is assumed to be white with variance σ².


Example 1 (cont.)

The test statistic T(x) = xᵀs = Σ_{n=0}^{N−1} x[n]s[n] ∼ N(0, σ²E) under H0 and T(x) ∼ N(E, σ²E) under H1. So we can obtain PFA and PD as follows:

PFA = Pr{T > γ′; H0}  ⇒  γ′ = √(σ²E) Q⁻¹(PFA)

PD = Pr{T > γ′; H1} = Q( (γ′ − E)/√(σ²E) ) = Q( Q⁻¹(PFA) − √(E/σ²) )
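These closed-form expressions can be checked by Monte Carlo simulation under H0. A sketch with a hypothetical constant signal and an illustrative target PFA:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)
std = NormalDist()

N, sigma2, n_trials = 64, 1.0, 200_000
s = np.ones(N)                     # hypothetical signal; energy E = N
E = float(np.dot(s, s))

pfa_target = 0.1
# gamma' = sqrt(sigma^2 E) * Q^{-1}(P_FA), with Q^{-1}(p) = Phi^{-1}(1 - p)
gamma_prime = np.sqrt(sigma2 * E) * std.inv_cdf(1.0 - pfa_target)

# Draw H0 data (noise only) and estimate the false-alarm rate of T(x) = x^T s
W = rng.normal(scale=np.sqrt(sigma2), size=(n_trials, N))
T0 = W @ s
pfa_hat = float(np.mean(T0 > gamma_prime))
```

With 200,000 trials the empirical false-alarm rate lands within roughly ±0.002 of the target.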


Generalized Matched Filters Test Statistics




Test Statistics

The white Gaussian noise in the previous example can be represented as w = (w[0], w[1], ..., w[N − 1])ᵀ ∼ N(0, σ²I). Now we assume, more generally, that w ∼ N(0, C) for an arbitrary covariance matrix C. Recall that Cmn = cov(w[m], w[n]) = E(w[m]w[n]) = r_ww[m − n] when the noise is zero-mean (and wide-sense stationary).

By the NP test, we decide H1 if

T(x) ≜ xᵀC⁻¹s > γ′

When C is positive definite, C⁻¹ exists and is also positive definite, and so it can be factored as C⁻¹ = DᵀD for a non-singular N × N matrix D, called the pre-whitening matrix.



Correlated Noise and Whitening

If we process the received signal x by multiplying it by D (and similarly for the known signal), i.e. form x′ = Dx and s′ = Ds, then the generalized correlator above becomes an ordinary white-noise correlator:

T(x) ≜ xᵀC⁻¹s = xᵀDᵀDs = x′ᵀs′
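One concrete way to obtain a pre-whitening matrix D is from the Cholesky factor of C: if C = LLᵀ, then D = L⁻¹ satisfies DᵀD = C⁻¹. A sketch with a hypothetical AR(1)-style covariance:

```python
import numpy as np

rng = np.random.default_rng(3)
N, rho = 8, 0.7

# Hypothetical positive-definite covariance: C[m, n] = rho^|m - n|
idx = np.arange(N)
C = rho ** np.abs(idx[:, None] - idx[None, :])

# Pre-whitening matrix from the Cholesky factor: C = L L^T  =>  D = L^{-1}
L = np.linalg.cholesky(C)
D = np.linalg.inv(L)

s = rng.normal(size=N)               # hypothetical known signal
x = rng.normal(size=N)               # hypothetical received vector

# Whitened quantities x' = Dx, s' = Ds
xw, sw = D @ x, D @ s

T_direct = float(x @ np.linalg.solve(C, s))   # x^T C^{-1} s
T_white = float(xw @ sw)                      # x'^T s'
```

The two statistics agree because DᵀD = L⁻ᵀL⁻¹ = (LLᵀ)⁻¹ = C⁻¹.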


Generalized Matched Filters Performance of Generalized Matched Filter




Performance of Generalized Matched Filters

The test statistic T(x) = xᵀC⁻¹s ∼ N(0, sᵀC⁻¹s) under H0 and T(x) ∼ N(sᵀC⁻¹s, sᵀC⁻¹s) under H1.
We can obtain PFA and PD as follows:

PFA = Pr{T > γ′; H0}  ⇒  γ′ = (sᵀC⁻¹s)^{1/2} Q⁻¹(PFA)

PD = Pr{T > γ′; H1} = Q( (γ′ − sᵀC⁻¹s)/(sᵀC⁻¹s)^{1/2} ) = Q( Q⁻¹(PFA) − √(sᵀC⁻¹s) )

From the expression for PD we see that in colored noise, unlike in white Gaussian noise, the shape of the signal s can affect the performance.
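The PD expression is straightforward to evaluate numerically; the function name below is illustrative. With C = I it reduces to the white-noise formula (2) with E = sᵀs:

```python
import math
from statistics import NormalDist
import numpy as np

_std = NormalDist()

def pd_generalized_mf(pfa, s, C):
    """P_D = Q(Q^{-1}(P_FA) - sqrt(s^T C^{-1} s))."""
    d2 = float(s @ np.linalg.solve(C, s))          # deflection s^T C^{-1} s
    return 1.0 - _std.cdf(_std.inv_cdf(1.0 - pfa) - math.sqrt(d2))

# Sanity check: C = I, s with energy 4 => same as Q(Q^{-1}(0.1) - 2)
pd_white = pd_generalized_mf(0.1, np.ones(4), np.eye(4))
```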



Designing Signals

The question now is how to design s to maximize PD, subject to a given energy E = sᵀs (otherwise the best performance would result from taking a signal of infinite energy).

Attack!
Using Lagrange multipliers, we find that if λmin is the smallest eigenvalue of C, with corresponding normalized eigenvector vmin, then the signal s that maximizes PD under an energy constraint of E is s = √E vmin.
Be able to show this!
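The eigenvector claim can be checked numerically (hypothetical covariance; numpy.linalg.eigh returns eigenvalues in ascending order, so column 0 is the minimum-eigenvalue eigenvector):

```python
import numpy as np

N, rho = 6, 0.9

# Hypothetical correlated-noise covariance: C[m, n] = rho^|m - n|
idx = np.arange(N)
C = rho ** np.abs(idx[:, None] - idx[None, :])

E = 1.0                                     # energy budget s^T s = E

# Optimal signal: scaled eigenvector for the smallest eigenvalue of C
eigvals, eigvecs = np.linalg.eigh(C)        # eigenvalues ascending
s_best = np.sqrt(E) * eigvecs[:, 0]

def deflection(s):
    """d^2 = s^T C^{-1} s; P_D is increasing in this quantity."""
    return float(s @ np.linalg.solve(C, s))

# Compare against a flat signal of the same energy
s_flat = np.sqrt(E / N) * np.ones(N)
```

For the optimal choice, the deflection attains its maximum value E/λmin; the flat signal, which is poorly aligned with vmin for this highly correlated C, does much worse.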



Example 2: Signal design for uncorrelated noise with unequal variances

Consider the case where w[n] ∼ N(0, σn²) and the w[n]'s are uncorrelated. Then C = diag(σ0², σ1², ..., σ²_{N−1}) and C⁻¹ = diag(σ0⁻², σ1⁻², ..., σ⁻²_{N−1}).

By the NP test, we decide H1 if

T(x) = Σ_{n=0}^{N−1} x[n]s[n]/σn² > γ′

where γ′ = ln γ + (1/2) Σ_{n=0}^{N−1} s²[n]/σn².
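For diagonal C, the generalized correlator is just a per-sample weighted correlator. A sketch with hypothetical variances and signal, confirming the two forms coincide:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 32

# Hypothetical unequal noise variances and known signal
sigma2 = rng.uniform(0.5, 2.0, size=N)
s = rng.normal(size=N)

# Received data under H1: x[n] = s[n] + w[n], w[n] ~ N(0, sigma_n^2)
x = s + rng.normal(scale=np.sqrt(sigma2))

# Weighted correlator: T(x) = sum_n x[n] s[n] / sigma_n^2
T = float(np.sum(x * s / sigma2))

# Same statistic via the general form x^T C^{-1} s with diagonal C
C = np.diag(sigma2)
T_general = float(x @ np.linalg.solve(C, s))
```

Samples with larger noise variance are down-weighted, which is the whole effect of C⁻¹ in the diagonal case.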



Example 3: Signal design for correlated noise, very simple case

Consider the case where w ∼ N(0, C) with

C = [ 1  ρ ]
    [ ρ  1 ]

where ρ is the correlation coefficient, which satisfies |ρ| ≤ 1.

What does the test statistic become?
What does the deflection coefficient become?

Solution:
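For this 2 × 2 case the inverse is available in closed form, C⁻¹ = (1/(1 − ρ²)) [[1, −ρ], [−ρ, 1]], and for s = (1, 1)ᵀ the deflection works out to 2/(1 + ρ). A quick check with hypothetical numbers (this is a sketch alongside the exercise, not its full solution):

```python
import numpy as np

rho = 0.5                                   # hypothetical correlation value

C = np.array([[1.0, rho], [rho, 1.0]])
# Closed-form inverse: C^{-1} = (1 / (1 - rho^2)) * [[1, -rho], [-rho, 1]]
C_inv = np.array([[1.0, -rho], [-rho, 1.0]]) / (1.0 - rho ** 2)

s = np.array([1.0, 1.0])                    # hypothetical signal
x = np.array([0.3, 1.2])                    # hypothetical received vector

T = float(x @ C_inv @ s)                    # test statistic x^T C^{-1} s
d2 = float(s @ C_inv @ s)                   # deflection; here 2 / (1 + rho)
```

Note the deflection 2/(1 + ρ) grows as ρ → −1: negatively correlated noise samples partially cancel, improving detection for this s.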



Application: The Linear Model

x is an N × 1 received vector, H is a known N × p full-rank matrix, θ is a p × 1 vector of parameters (known or unknown), and w ∼ N(0, C) is an N × 1 noise vector with covariance matrix C. Assume θ = θ1 for some known parameter vector θ1.
Our hypotheses are:

H0 : x = w
H1 : x = Hθ + w

Our NP detector in colored Gaussian noise with covariance matrix C reduces to the test statistic T(x) = xᵀC⁻¹s, which is compared to the threshold γ′ = ln γ + (1/2) sᵀC⁻¹s. We can apply this directly by taking s = Hθ1.
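A sketch of the reduction s = Hθ1 with hypothetical model matrices (H, θ1, and C are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(5)
N, p = 20, 3

# Hypothetical known linear model and parameter vector theta_1
H = rng.normal(size=(N, p))
theta1 = np.array([1.0, -0.5, 2.0])

# Hypothetical noise covariance: C[m, n] = 0.8^|m - n|
idx = np.arange(N)
C = 0.8 ** np.abs(idx[:, None] - idx[None, :])

# The equivalent known signal is s = H theta_1
s = H @ theta1

# Generate one H1 realization: x = s + w, w ~ N(0, C), sampled via C = L L^T
L = np.linalg.cholesky(C)
x = s + L @ rng.normal(size=N)

# Generalized matched filter statistic and its deflection
T = float(x @ np.linalg.solve(C, s))
d2 = float(s @ np.linalg.solve(C, s))
```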
Multiple Signals

Two signals in noise

Before:

H0 : x = w
H1 : x = s + w

Now:

H0 : x = s0 + w
H1 : x = s1 + w

We are interested in simple ML detection in white Gaussian noise, i.e. selecting the hypothesis Hi for which p(x|Hi) is maximal.



Two signals in noise (cont.)

When we want to determine which of many possible signals was sent, the general approach is to correlate the received signal with each of the candidate signals si = (si[0], si[1], ..., si[N − 1])ᵀ, adjust for the signal energy Ei = Σ_{n=0}^{N−1} si²[n], and select the hypothesis that yields the maximal adjusted correlator output. That is, pick the Hi for which the following is maximum:

Ti(x) = Σ_{n=0}^{N−1} x[n]si[n] − (1/2)Ei

Let's derive this decision rule!

Geometric meaning?

Detector for Multiple Signals in WGN

[Block diagrams of the correlator-based detectors for the two cases:]
Signal vs. noise only:   H0 : x = w,   H1 : x = s1 + w
Two known signals:       H0 : x = s0 + w,   H1 : x = s1 + w



Geometric Interpretation

Ti(x) = Σ_{n=0}^{N−1} x[n]si[n] − (1/2)Ei

This test statistic is equivalent to selecting the hypothesis Hi for which the squared Euclidean distance Di² from the received vector to the known signal si is smallest, where

Di² = Σ_{n=0}^{N−1} (x[n] − si[n])² = ‖x − si‖²
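The equivalence follows from expanding Di² = ‖x‖² − 2xᵀsi + Ei: the ‖x‖² term is common to all i, so minimizing Di² is the same as maximizing xᵀsi − Ei/2. A numerical sketch with hypothetical random signals:

```python
import numpy as np

rng = np.random.default_rng(6)
N, M = 40, 4

# Hypothetical candidate signals (row i is s_i) and a received vector
S = rng.normal(size=(M, N))
x = S[2] + 0.5 * rng.normal(size=N)        # generated from signal index 2

# Energy-adjusted correlator: T_i(x) = x^T s_i - E_i / 2
E = np.sum(S ** 2, axis=1)
T = S @ x - 0.5 * E

# Minimum-distance rule: D_i^2 = ||x - s_i||^2
D2 = np.sum((x[None, :] - S) ** 2, axis=1)

i_corr = int(np.argmax(T))
i_dist = int(np.argmin(D2))
```

The two rules pick the same index for any x, since they differ only by the common term ‖x‖².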



Example 4: M-ary Case

How would you choose which of M deterministic signals {s0[n], s1[n], ..., s_{M−1}[n]}, with equal prior probabilities, occurred upon receiving x[n] = si[n] + w[n] under hypothesis Hi?
Find the test statistic T(x).
Find the probability of error Pe.
