Target Tracking: Lecture 5
Multiple Target Tracking: Part II

Gustaf Hendeby
hendeby@isy.liu.se
Div. Automatic Control, Dept. Electrical Engineering
Linköping University

December 10, 2014

Lecture Outline

1. Conceptual MHT
   • Fundamental Components
   • Simplifications
   • Summary
2. Hypothesis-Based MHT
   • Assignment Problem
   • Algorithm
3. Track-Based MHT
   • Implementation Details
   • Summary
4. User Interaction
5. Examples
   • SUPPORT
   • ADABTS
6. Summary
   • Concluding Remarks
   • Learn More...

Last Lecture

• Introduction to multi-target tracking (MTT)
• Single hypothesis trackers (SHT)
  • Global nearest neighbor (GNN)
  • Joint probabilistic data association (JPDA)
• Integrated track initialization
• Auction algorithm
• Fundamental theorem of target tracking

Multiple Hypothesis Tracking (MHT)

• MHT: consider multiple association hypotheses over time
• Started with the conceptual MHT
• Two principal implementations:
  • hypothesis based
  • track based
Conceptual MHT: basic idea

• Described in Reid (1979)
• Intuitive, hypothesis-based, brute-force implementation
• Between consecutive time instants, different association hypotheses, {Θ^i_{k−1}}_{i=1}^{N_h}, are kept in memory
• Idea: generate all possible hypotheses, and then prune to avoid combinatorial hypothesis growth
• Hypothesis limiting techniques:
  • clustering
  • pruning low-probability hypotheses
  • N-scan pruning
  • combining similar hypotheses
  • ...

Representing Hypotheses

Each hypothesis, Θ^i_{k−1} in {Θ^i_{k−1}}_{i=1}^{N_h}, is characterized by the number of targets (tracks) and their corresponding sufficient statistics.

(Figure: two hypotheses, Θ^1_{k−1} with probability P(Θ^1_{k−1}) and Θ^2_{k−1} with probability P(Θ^2_{k−1}), each holding its own set of predicted measurements ŷ^1_{k|k−1}, ŷ^2_{k|k−1}.)

Generating Hypotheses

Form Θ^ℓ_k ≜ {θ_k, Θ^i_{k−1}}: each new hypothesis combines a parent hypothesis with an association event θ_k for the current measurements.

(Figure: starting from the parents Θ^1_{k−1} and Θ^2_{k−1}, a hypothesis tree assigns each measurement y^1_k, y^2_k, y^3_k either to an existing track (T1, T2), to a false alarm (FA), or to a new target (NT), enumerating all feasible combinations.)
Computing Hypothesis Probabilities

Let Θ^ℓ_k ≜ {θ_k, Θ^i_{k−1}}. Then (using the "fundamental theorem of target tracking")

  P(Θ^ℓ_k | y_{0:k}) ∝ p(y_k | Θ^ℓ_k, y_{0:k−1}) P(θ_k | Θ^i_{k−1}, y_{0:k−1}) P(Θ^i_{k−1} | y_{0:k−1})
    ∝ β_fa^{m^fa_k} β_nt^{m^nt_k} [ ∏_{j∈J^i_D} P^j_D p^j_{k|k−1}(y_k^{θ_k^{−1}(j)}) ] [ ∏_{j∈J^i_ND} (1 − P^j_D P^j_G) ] P(Θ^i_{k−1} | y_{0:k−1})

Note: the sets J^i_D and J^i_ND depend on Θ^i_{k−1}! The number of targets and the target estimates usually differ between hypotheses.

System Overview

(Block diagram: the new set of measurements {y^i_k}_{i=1}^{m_k} enters "Generate New Hypotheses", which feeds "Calculate Hyp. Probabilities"; "Reduce Set of Hypotheses" limits the number of hypotheses; the surviving hypotheses {Θ^i_k}_{i=1}^{N_h} with probabilities {P(Θ^i_k)}_{i=1}^{N_h} are delayed one time step (z^{−1}) and fed back, and also drive the user presentation logic.)
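The factorization above can be sketched numerically. The following is a minimal sketch with illustrative names (not taken from the lecture): it evaluates the unnormalized weight of one child hypothesis from its parent's probability, the counts of false alarms and new targets, and the per-track detection terms.

```python
def hypothesis_weight(parent_prob, m_fa, m_nt, beta_fa, beta_nt,
                      detected_likelihoods, pd_detected, pd_pg_missed):
    """Unnormalized P(Theta_k^l | y_0:k) following the slide's factorization.

    detected_likelihoods : predicted likelihoods p^j_{k|k-1}(y_k) for detected tracks j
    pd_detected          : detection probabilities P_D^j of the detected tracks
    pd_pg_missed         : products P_D^j * P_G^j for the non-detected tracks
    """
    w = beta_fa**m_fa * beta_nt**m_nt * parent_prob
    for pd, lik in zip(pd_detected, detected_likelihoods):
        w *= pd * lik                # detected targets: P_D^j * likelihood
    for pdpg in pd_pg_missed:
        w *= (1.0 - pdpg)            # missed detections: (1 - P_D^j P_G^j)
    return w
```

In practice the weights of all child hypotheses are normalized to sum to one afterwards.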

Reducing Complexity

• Clustering
• Pruning of low-probability hypotheses
• N-scan pruning
• Merging similar hypotheses

Clustering

Group targets without common measurements, and handle the groups separately.

(Figure: hypotheses Θ^1_{k−1} and Θ^2_{k−1} with measurements y^1_k, y^2_k, y^3_k and predicted measurements ŷ^1_{k|k−1}, ŷ^2_{k|k−1}; the targets split into Cluster-1 around y^1_k, y^2_k and Cluster-2 around y^3_k.)
Clustering: cluster management

When targets get closer:
• If a measurement falls inside the gates of tracks in different clusters, merge the clusters
• The hypotheses of the merged clusters are combined into super-hypotheses

When targets separate:
• If a group of tracks in a cluster does not share measurements with the other tracks in the cluster (for a period of time), split the cluster
• The hypotheses of the cluster are then also divided into smaller hypotheses corresponding to the two smaller clusters

Clustering: process clusters separately (1/2)

Hypotheses generation: form Θ^ℓ_k ≜ {θ_k, Θ^i_{k−1}} for each cluster as if the other clusters did not exist.

(Figure: without clustering, a single hypothesis tree assigns all of y^1_k, y^2_k, y^3_k jointly, producing a large number of branches; with clustering, Cluster-1 (y^1_k, y^2_k) and Cluster-2 (y^3_k) each get a much smaller tree, and cluster hypotheses are combined only when needed.)
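Clustering as described above amounts to finding connected components of tracks linked by shared gated measurements. A small stdlib sketch (the dict-of-gated-measurements interface is an assumption for illustration):

```python
def cluster_tracks(gates):
    """gates: dict track_id -> set of gated measurement ids.
    Returns a list of clusters (sets of track ids) such that tracks
    sharing at least one measurement end up in the same cluster."""
    parent = {t: t for t in gates}

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    meas_owner = {}                   # first track seen gating each measurement
    for t, ms in gates.items():
        for m in ms:
            if m in meas_owner:
                union(t, meas_owner[m])
            else:
                meas_owner[m] = t

    clusters = {}
    for t in gates:
        clusters.setdefault(find(t), set()).add(t)
    return list(clusters.values())
```

Each resulting cluster can then be processed with its own, much smaller, hypothesis tree.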
Clustering: process clusters separately (2/2)

Hypotheses reduction. For each cluster:

• Delete hypotheses with probability below a threshold γ_p (e.g., γ_p = 0.001):
  deletion condition: P(Θ^i_k) < γ_p
• Keep only the most probable hypotheses whose total probability mass reaches a threshold γ_c (e.g., γ_c = 0.99):
  deletion condition: Σ_{κ=1}^{i} P(Θ^{ℓ_κ}_k) > γ_c,
  where ℓ_κ orders the hypotheses so that P(Θ^{ℓ_κ}_k) ≥ P(Θ^{ℓ_{κ+1}}_k)

N-scan Pruning

Case N = 2:
• This scheme assumes that any uncertainty is perfectly resolved after N time steps
• Common practice is to choose N ≥ 5 (situation dependent)
• The N last ancestors of each hypothesis must be stored

(Figure: a hypothesis tree over time steps 0–4; at each new step, the branches that do not descend from the best hypothesis's ancestor N steps back are pruned.)
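The two deletion rules above can be combined in a few lines. A sketch (the function name and interface are illustrative):

```python
def prune_hypotheses(probs, gamma_p=0.001, gamma_c=0.99):
    """Return the indices of the hypotheses kept after the two rules:
    drop anything below gamma_p, and stop once the kept (sorted)
    probability mass exceeds gamma_c."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        if probs[i] < gamma_p:        # rule 1: low-probability pruning
            break                     # (sorted, so the rest is lower still)
        if mass > gamma_c:            # rule 2: cumulative mass reached
            break
        kept.append(i)
        mass += probs[i]
    return kept
```

The surviving probabilities are then renormalized within each cluster.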


Hypothesis Merging

Reid's original paper suggests checking for hypothesis pairs with:
• the same number of targets (tracks)
• similar track estimates

If these conditions are satisfied:
• merge the hypotheses
• assign the new hypothesis the sum of the merged hypotheses' probabilities
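Reid's merging test can be sketched as follows. This is an illustrative stdlib sketch that assumes the tracks of both hypotheses are already aligned in the same order; a real implementation must first match tracks between the hypotheses.

```python
import math

def try_merge(hyp_a, hyp_b, tol=0.1):
    """hyp = (probability, list of state estimates as coordinate tuples).
    Returns the merged hypothesis, or None if the pair does not qualify."""
    pa, xa = hyp_a
    pb, xb = hyp_b
    if len(xa) != len(xb):
        return None                   # different number of tracks
    if any(math.dist(a, b) > tol for a, b in zip(xa, xb)):
        return None                   # estimates not similar enough
    return (pa + pb, xa)              # keep one estimate set, sum probabilities
```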
Conceptual MHT: Summary

• Attractive method, since each hypothesis is
  • an alternative representation of reality
  • easily interpreted
• Drawback: generating all possible hypotheses only to discard (most of) them is inefficient
• Some hypotheses contain the same track; hence there are fewer unique tracks than hypotheses
• Track-based methods were popular until an efficient way to implement a hypothesis-based MHT was given by Cox and Hingorani (1996)

Hypothesis-Based MHT

• Proposed by Cox and Hingorani (1996)
• Generate only the best hypotheses; skip hypotheses that would be deleted anyway
• Use the N-best solutions to the assignment problem (introduced last lecture with GNN)
  • Murty's method, 1968
• Find the N_h best hypotheses while generating as few unnecessary hypotheses as possible
• Hypothesis reduction techniques still apply

Assignment Problem: repetition (1/2)

Let Θ^ℓ_k ≜ {θ_k, Θ^i_{k−1}}. Then

  P(Θ^ℓ_k | y_{0:k}) ∝ p(y_k | Θ^ℓ_k, y_{0:k−1}) P(θ_k | Θ^i_{k−1}, y_{0:k−1}) P(Θ^i_{k−1} | y_{0:k−1})
    ∝ β_fa^{m^fa_k} β_nt^{m^nt_k} [ ∏_{j∈J^i_D} P^j_D p^j_{k|k−1}(y_k^{θ_k^{−1}(j)}) ] [ ∏_{j∈J^i_ND} (1 − P^j_D P^j_G) ] P(Θ^i_{k−1} | y_{0:k−1})

Divide and multiply the right-hand side by

  C_i ≜ ∏_{j=1}^{n_{T_i}} (1 − P^j_D P^j_G),

which yields

  P(Θ^ℓ_k | y_{0:k}) ∝ β_fa^{m^fa_k} β_nt^{m^nt_k} [ ∏_{j∈J^i_D} (P^j_D p^j_{k|k−1}(y_k^{θ_k^{−1}(j)})) / (1 − P^j_D P^j_G) ] C_i P(Θ^i_{k−1} | y_{0:k−1})

Assignment Problem: repetition (2/2)

Logarithmize and form the assignment matrices, with entries

  ℓ_ij ≜ log [ P^j_D p^j_{k|k−1}(y^i_k) / (1 − P^j_D P^j_G) ],

where × represents −∞:

  A1       T1    T2    fa1        fa2        fa3        nt1        nt2        nt3
  y^1_k    ℓ11   ℓ12   log β_fa   ×          ×          log β_nt   ×          ×
  y^2_k    ℓ21   ×     ×          log β_fa   ×          ×          log β_nt   ×
  y^3_k    ×     ×     ×          ×          log β_fa   ×          ×          log β_nt

  A2       T1    fa1        fa2        fa3        nt1        nt2        nt3
  y^1_k    ℓ11   log β_fa   ×          ×          log β_nt   ×          ×
  y^2_k    ℓ21   ×          log β_fa   ×          ×          log β_nt   ×
  y^3_k    ×     ×          ×          log β_fa   ×          ×          log β_nt
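Building such an assignment matrix is mechanical: one column per track, plus one false-alarm and one new-target column per measurement. A stdlib sketch (the function name is illustrative; −∞ stands in for the × entries):

```python
NEG = float('-inf')   # the "x" entries of the slide's matrices

def assignment_matrix(ell, log_beta_fa, log_beta_nt):
    """ell: m x n list-of-lists of log-ratios l_ij for measurement i vs
    track j (NEG where the measurement is outside the track's gate).
    Columns: n tracks | m false-alarm columns | m new-target columns."""
    m, n = len(ell), len(ell[0])
    A = [[NEG] * (n + 2 * m) for _ in range(m)]
    for i in range(m):
        for j in range(n):
            A[i][j] = ell[i][j]       # track columns
        A[i][n + i] = log_beta_fa     # fa_i: each measurement has its own FA column
        A[i][n + m + i] = log_beta_nt # nt_i: and its own NT column
    return A
```

The diagonal FA/NT structure guarantees every measurement always has a feasible assignment.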
Assignment Problem: N-best solutions

• Given an assignment matrix A_i, the Auction algorithm (or similar) finds the best assignment in polynomial time
• Generalizations of this problem find the N-best solutions:
  • formulate the problem as several best-assignment problems
  • solve them independently using the Auction algorithm
  • Murty's method

Assignment Problem: Murty's Method

Given the assignment matrix A_i:
• Find the best solution using the Auction algorithm
• 2nd best solution:
  • express the 2nd best solution as the solution of a number of best-solution assignment problems
  • find the solution to each of these problems by Auction
  • the solution giving the maximum reward (minimum cost) is the second best solution
• Repeat the procedure for more solutions
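Murty's partitioning can be sketched compactly. This is an illustrative stdlib sketch: a brute-force solver stands in for the Auction algorithm (fine for tiny matrices, not for production), and a heap orders the subproblems by reward so solutions come out best first.

```python
import heapq
from itertools import permutations, count

NEG = float('-inf')

def best_assignment(R, banned, forced):
    """Max-reward row->column assignment by brute force (Auction stand-in).
    banned: set of (row, col) pairs excluded; forced: {row: col} fixed."""
    m, n = len(R), len(R[0])
    best, best_sol = NEG, None
    for cols in permutations(range(n), m):
        if any((i, c) in banned for i, c in enumerate(cols)):
            continue
        if any(cols[i] != c for i, c in forced.items()):
            continue
        r = sum(R[i][cols[i]] for i in range(m))
        if r > best:
            best, best_sol = r, list(cols)
    return best, best_sol

def murty(R, N):
    """Up to N best assignments of R, each as (reward, column list)."""
    tie = count()                     # tie-breaker so the heap never compares sets
    r, sol = best_assignment(R, set(), {})
    heap = [(-r, next(tie), sol, set(), {})]
    out = []
    while heap and len(out) < N:
        negr, _, sol, banned, forced = heapq.heappop(heap)
        out.append((-negr, sol))
        # Partition: subproblem i bans the popped pairing of row i and
        # forces rows < i, so the subproblem solution spaces are disjoint.
        for i in range(len(sol)):
            if i in forced:
                continue
            b = banned | {(i, sol[i])}
            f = dict(forced)
            f.update({row: sol[row] for row in range(i)})
            rr, ss = best_assignment(R, b, f)
            if ss is not None:
                heapq.heappush(heap, (-rr, next(tie), ss, b, f))
    return out
```

Replacing `best_assignment` with an Auction (or Jonker-Volgenant) solver gives the polynomial-time version used in practice.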

Algorithm Outline

Aim: given hypotheses {Θ^i_{k−1}}_{i=1}^{N_h} with probabilities {P(Θ^i_{k−1} | y_{0:k−1})}_{i=1}^{N_h} and measurements {y^i_k}_{i=1}^{m_k}, find the N_h best hypotheses {Θ^i_k}_{i=1}^{N_h} while avoiding generating all hypotheses.

Reminder of the hypothesis probability:

  P(Θ^ℓ_k | y_{0:k}) ∝ β_fa^{m_fa} β_nt^{m_nt} [ ∏_{j∈J^i_D} (P^j_D p^j_{k|k−1}(y_k^{θ_k^{−1}(j)})) / (1 − P^j_D P^j_G) ] C_i P(Θ^i_{k−1} | y_{0:k−1})

where the bracketed factor is assignment dependent and C_i P(Θ^i_{k−1} | y_{0:k−1}) is the legacy (parent) part.

Find the {Θ^ℓ_k}_{ℓ=1}^{N_h} that maximize P(Θ^ℓ_k | y_{0:k}). Two steps:
• Obtain the solution from the assignment problem (Murty's method)
• Multiply the obtained quantity by the parent-hypothesis-dependent terms

Generating the N_h-best Hypotheses

Input: {Θ^i_{k−1}}_{i=1}^{N_h} and {y^i_k}_{i=1}^{m_k}
Output: HYP-LIST (N_h hypotheses, in decreasing probability) and PROB-LIST (the matching probabilities)

1. Initialize all elements of HYP-LIST and PROB-LIST to ∅ and −1
2. Find the assignment matrices {A_i}_{i=1}^{N_h} for {Θ^i_{k−1}}_{i=1}^{N_h}
3. For j = 1 ... N_h:
   1. For i = 1 ... N_h:
      1. For the assignment matrix A_i, find the jth best solution Θ^{ji}_k
      2. Compute the probability P(Θ^{ji}_k)
      3. Update HYP-LIST and PROB-LIST: if the new hypothesis enters the list, discard the least probable entry
      4. If P(Θ^{ji}_k) is lower than the lowest probability in PROB-LIST, discard Θ^{ji}_k and never use A_i again in subsequent recursions
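The loop above can be sketched in the log domain. This is an illustrative sketch: `hyp_list` plays the role of HYP-LIST/PROB-LIST combined, and `solvers[i](j)` stands in for "the jth best assignment log-reward for matrix A_i" (e.g., from Murty's method), returning None when A_i has no jth solution.

```python
def n_best_hypotheses(parent_logprobs, solvers, Nh):
    """Return up to Nh entries (log prob, (parent i, rank j)), best first."""
    hyp_list = []
    active = set(range(len(parent_logprobs)))   # parents still worth querying
    for j in range(Nh):
        for i in sorted(active):
            r = solvers[i](j)
            if r is None:                       # A_i has no jth best solution
                active.discard(i)
                continue
            entry = (parent_logprobs[i] + r, (i, j))
            hyp_list.append(entry)
            hyp_list.sort(reverse=True)
            hyp_list = hyp_list[:Nh]            # keep the Nh most probable
            # Step 3.1.4: solutions of A_i only get worse with j, so a parent
            # whose jth best already fell off the list can never re-enter.
            if len(hyp_list) == Nh and entry not in hyp_list:
                active.discard(i)
    return hyp_list
```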
Track-Based MHT: motivation

• Hypotheses usually contain identical tracks; there are significantly fewer unique tracks than hypotheses
• Idea: store tracks, T^i, not hypotheses, Θ^i, over time

Track-Based MHT: principle

• Tracks at time k: {T^i_k}_{i=1}^{N_t}
• Track scores: Sc(T^i_k)
• Form a track tree, not a hypothesis tree
• Delete tracks with low scores

(Figure: a track list in which each old track T^1_{k−1}, T^2_{k−1} branches into new tracks T^1_k, ..., T^8_k, one branch per associated measurement y^1_k, y^2_k, y^3_k plus a no-measurement (NM) branch.)

Hypotheses Generation

• A hypothesis is a collection of compatible tracks, e.g.,
  Θ^1_k = {T^1_k, T^5_k, T^8_k},  Θ^2_k = {T^2_k, T^3_k, T^7_k, T^8_k}
• Generating hypotheses is needed to reduce the number of tracks further and for user presentation
• Use only tracks with high score
• Keep track compatibility information (e.g., in a binary matrix)

(Figure: the track tree above together with an 8×8 binary compatibility matrix over T^1_k, ..., T^8_k; entry (i, j) is 1 when tracks i and j share no measurement and may therefore appear in the same hypothesis.)

Track Scores and Hypotheses Probabilities

Track probability: the sum over the hypotheses containing the track,

  P(T^i_k) = Σ_{Θ^j_k : T^i_k ∈ Θ^j_k} P(Θ^j_k)

Hypothesis score: the sum of the member tracks' scores,

  Sc(Θ^i_k) = Σ_{T^j_k ∈ Θ^i_k} Sc(T^j_k)

Hypothesis probability:

  P(Θ^i_k) = exp(Sc(Θ^i_k)) / (1 + Σ_{j=1}^{N_h} exp(Sc(Θ^j_k)))
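The three formulas above translate directly into code. A small sketch with illustrative names:

```python
import math

def hypothesis_probability(track_scores, hypotheses):
    """P(Theta_i) = exp(Sc(Theta_i)) / (1 + sum_j exp(Sc(Theta_j))).
    track_scores: {track_id: score}; hypotheses: list of sets of track ids."""
    hyp_scores = [sum(track_scores[t] for t in h) for h in hypotheses]
    denom = 1.0 + sum(math.exp(s) for s in hyp_scores)
    return [math.exp(s) / denom for s in hyp_scores]

def track_probability(track, hyp_probs, hypotheses):
    """P(T) = sum of the probabilities of the hypotheses containing T."""
    return sum(p for p, h in zip(hyp_probs, hypotheses) if track in h)
```

For long score sums, a log-sum-exp formulation would avoid overflow in `math.exp`.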
Complexity Reducing Techniques

• Cluster incompatible tracks for efficient hypothesis generation
• Apply N-scan pruning to the track trees
• Merge tracks with a common recent measurement history

System Components

(Block diagram: the new measurements {y^i_k}_{i=1}^{m_k} enter "Generate New Tracks"; low-score tracks and then low-probability tracks are discarded; the surviving tracks {T^i_k}_{i=1}^{N_t} feed "Generate Hypotheses" and "Calculate Track Probabilities" producing {Θ^i_k}_{i=1}^{N_h}; the track set is delayed one time step (z^{−1}) and fed back, and also drives the user presentation logic.)

User Presentation Logic

• Maximum probability hypothesis: the simplest alternative
  • possibly jumpy; the maximum probability hypothesis can change erratically
• Show track clusters: (weighted) mean, covariance, and expected number of targets
• Keep a separate track list: update it at each step with a selection of tracks from different hypotheses
• Consult Blackman and Popoli (1999) for details

Example: harbor protection (SUPPORT)
Example: busy indoor environments (ADABTS)

Which Multi-TT Method to Use?

  Computation \ SNR   Low              Medium        High
  Low                 Group TT / PHD   GNN           GNN
  Medium              MHT              GNN or JPDA   GNN
  High                TrBD / MHT       MHT           Any

• GNN and JPDA perform very badly in low SNR
• When using GNN, one generally has to enlarge the overconfident covariances to account for the neglected data association uncertainty
• JPDA suffers from track coalescence and should not be used with closely spaced targets; see the "coalescence avoiding" versions
• MHT requires a significantly higher computational load, but is said to work reasonably under 10–100 times worse SNR

Learning More

Samuel S. Blackman.
Multiple hypothesis tracking for multiple target tracking.
IEEE Aerospace and Electronic Systems Magazine, 19(1):5–18, January 2004.

Samuel S. Blackman and Robert Popoli.
Design and Analysis of Modern Tracking Systems.
Artech House radar library. Artech House, Inc., 1999. ISBN 1-58053-006-0.

Ingemar J. Cox and Sunita L. Hingorani.
An efficient implementation of Reid's multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(2):138–150, February 1996.

Ingemar J. Cox and Matthew L. Miller.
On finding ranked assignments with application to multitarget tracking and motion correspondence.
IEEE Transactions on Aerospace and Electronic Systems, 31(1):486–489, January 1995.

Ingemar J. Cox, Matthew L. Miller, Roy Danchick, and G. E. Newnam.
A comparison of two algorithms for determining ranked assignments with application to multitarget tracking and motion correspondence.
IEEE Transactions on Aerospace and Electronic Systems, 33(1):295–301, January 1997.

Roy Danchick and G. E. Newnam.
Reformulating Reid's MHT method with generalised Murty K-best ranked linear assignment algorithm.
IEE Proceedings - Radar, Sonar and Navigation, 153(1):13–22, February 2006.

Matthew L. Miller, Harold S. Stone, and Ingemar J. Cox.
Optimizing Murty's ranked assignment method.
IEEE Transactions on Aerospace and Electronic Systems, 33(3):851–862, July 1997.

Donald B. Reid.
An algorithm for tracking multiple targets.
IEEE Transactions on Automatic Control, 24(6):843–854, December 1979.
