
2004 IEEE Intelligent Vehicles Symposium

University of Parma
Parma, Italy, June 14-17, 2004

Multi-Target Multi-Object Tracking,


Sensor Fusion of Radar and Infrared
Rainer Möbus 1,2    Uli Kolbe 1

1 Research & Technology, RIC/AA, DaimlerChrysler AG, 70546 Stuttgart, Germany
  email: {rainer.moebus, uli.u.kolbe}@daimlerchrysler.com
2 Automatic Control Laboratory, ETH Zentrum - ETL, CH-8092 Zurich, Switzerland
  email: moebus@control.ee.ethz.ch

Abstract - This paper presents algorithms and techniques for single-sensor tracking and multi-sensor fusion of infrared and radar data. Multiple model filtering and data association techniques are presented and results are shown for all presented algorithms.

I. INTRODUCTION

Many driver assistance systems heavily rely on external vehicle sensors. One product that is already available on the market is the so-called Adaptive Cruise Control, ACC, that makes use of a 77GHz radar sensor. The maximum acceleration is constrained to 1 m/s², the maximum deceleration is limited to -3 m/s². A considerable disadvantage of current 77GHz ACC systems is their limited angular sensor range. The longitudinal range is very good and goes up to 150 meters, in best cases beyond. The lateral range, however, is limited to ±3.4 degrees. For assistance systems beyond ACC the 77GHz radar sensor might not be sufficient. For improved ACC functionality, lane change assistants, inner-city assisting systems or collision warning / collision mitigation systems, a rather complete representation of the traffic scene ahead of the car is needed and the reliability of the sensor information should be as high as possible. Another sensor suitable for object recognition is the infrared sensor, scanning an area of about ±20 degrees up to 80 meters. Figure 1 shows the ranges of the 77GHz radar sensor and the infrared scanner, the infrared sensor scanning a large area uncovered by the radar. To exploit the advantages of each technology without being affected by their shortcomings, all sensor information can be fused in a sensor fusion step. Complementary regions add up to a complete range ahead of the car; in overlapping regions redundant information is obtained, increasing the reliability of the detected objects. Figure 2 shows a 77GHz standard ACC sensor and the infrared sensor used for the multi-sensor fusion in this paper. To show the motivation and necessity of tracking and fusion algorithms, figure 3 shows a bird view of all infrared and radar measurements obtained during a 20 second test drive. The graph shows the trajectory of the test vehicle as a solid line and the obtained sensor measurements as points. For the sake of readability the lateral position of the measurements is exaggerated by a factor of 20. These data show that powerful tracking, data association and fusion algorithms are needed to reject clutter and noise and to estimate the dynamics of the preceding vehicles reliably.

Fig. 1. Sensor ranges of 77GHz radar and infrared laser.

Fig. 2. Pictures of a 77GHz radar sensor and an infrared sensor for automotive use.

II. OUTLINE OF THIS PAPER

This paper presents algorithms and techniques for the single-sensor tracking and the multi-sensor fusion. Chapter three is dedicated to single-sensor tracking and to results of radar tracking and infrared tracking with the interacting multiple model approach. Chapter four covers sensor fusion and data association techniques, especially the probabilistic data association. The fifth chapter presents results of multiple model filtering applied to the sensor fusion of infrared and radar data. The last chapter gives an outlook.

III. SINGLE-SENSOR TRACKING

An overview of the operations performed in the single-sensor tracking is shown in figure 4. Different sensors generate sensor specific measurements or targets. In the first operation (gating, clustering) a pre-selection of these targets takes place. New targets are compared to existing objects.



Fig. 5. Flow chart of the IMM filter principles.


Fig. 3. Bird view of a 20 second test drive. The trajectory of the test vehicle as well as all infrared and radar measurements are shown.

Fig. 4. Flow chart of the operations performed in the single-sensor tracking step.

If a target is likely to be a new measurement for an already existing object, the target is labeled. In some cases an object creates more than one measurement. In this case measurements are clustered and labeled accordingly. In this gating step some unlikely measurements can be discarded, e.g. standing objects that might not be of interest in certain applications [2], [4]. After this step the data association takes place. The task is to associate the validated targets to existing objects. After this operation, all associated measurements are used to update the state estimation of the existing objects. If a target could not be associated to an existing object, a new object is created. If no measurement was assigned to an existing object, this object is killed. Objects might also be merged or split here. Output of the single-sensor tracking is a list of objects for each sensor. Attributes of this list are e.g. distance, lateral distance, relative lateral velocity and relative longitudinal velocity.
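As a minimal sketch of this gating / association / update / track-management cycle, the following Python fragment shows one possible way to organize the loop. All names (Track, gate, associate, track_cycle) and the greedy nearest-neighbour assignment are assumptions made for illustration; the paper itself uses the probabilistic data association described in chapter IV.

```python
# Illustrative sketch of the single-sensor tracking cycle described above.
# Names and the nearest-neighbour rule are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class Track:
    x: float          # longitudinal distance [m]
    y: float          # lateral distance [m]
    missed: int = 0   # consecutive cycles without an associated measurement

def gate(targets, max_range=150.0):
    """Pre-selection: discard implausible targets (placeholder criterion)."""
    return [t for t in targets if 0.0 < t[0] <= max_range]

def associate(tracks, targets, max_dist=5.0):
    """Greedy nearest-neighbour assignment of targets to existing tracks."""
    assignment = {}
    for j, (tx, ty) in enumerate(targets):
        best, best_d = None, max_dist
        for i, trk in enumerate(tracks):
            d = ((trk.x - tx) ** 2 + (trk.y - ty) ** 2) ** 0.5
            if d < best_d and i not in assignment.values():
                best, best_d = i, d
        if best is not None:
            assignment[j] = best
    return assignment

def track_cycle(tracks, targets):
    """One cycle: gate, associate, update, create new tracks, kill stale ones."""
    targets = gate(targets)
    assignment = associate(tracks, targets)
    for j, i in assignment.items():
        tracks[i].x, tracks[i].y = targets[j]   # state update (filter omitted)
        tracks[i].missed = 0
    new = [Track(x, y) for j, (x, y) in enumerate(targets) if j not in assignment]
    for i, trk in enumerate(tracks):
        if i not in assignment.values():
            trk.missed += 1                     # no measurement this cycle
    survivors = [t for t in tracks if t.missed < 3]
    return survivors + new                      # object list handed downstream
```

In the paper the state update is of course performed by the IMM filter described next, and the hard assignment is replaced by probabilistic data association.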
A. Tracking with 77GHz Radar Sensor

In [8] and [7] the single-sensor radar tracking was presented with the so-called interacting multiple model filter approach. The idea of the IMM filter is briefly explained by figure 5. Two Kalman filters with different models or parameter sets are running in parallel and interact according to their likelihood. In each time step the actual state vector is predicted and innovated using different hypotheses. The resulting state vector is a linear combination of the two hypotheses. The weighting is according to the likelihood of the hypotheses. For further detail the reader is referred to [5], [6], [8]. Results obtained by this approach applied to the longitudinal tracking of 77GHz radar data are shown in figure 6. Model 1 was chosen as a constant velocity model, model 2 as a constant acceleration model. The results presented in figure 6 show relative velocity and relative acceleration corresponding to a traffic scene where a car is cruising straight ahead of the test vehicle. First the relative velocity decreases considerably, remains constant for a while and then increases again. The graph shows raw relative velocity measurements of the radar sensor, the tracked relative velocity and the estimated relative acceleration. The segments with changing relative velocity correspond to longitudinal maneuvers of the test vehicle or the car ahead. The model probabilities graph shows that in case of these maneuvers the probability of the constant acceleration model is very high. The constant velocity model, however, dominates in case of a rather steady state behavior of the two cars.

Fig. 6. Results of longitudinal radar tracking using the IMM filter approach.
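A hedged sketch of the interacting multiple model idea is given below: a constant-velocity and a constant-acceleration Kalman filter run in parallel and are mixed according to their measurement likelihoods. The matrices, noise levels and the Markov transition probabilities are illustrative assumptions, not values taken from the paper.

```python
# Minimal 1-D longitudinal IMM sketch with a constant-velocity (CV) and a
# constant-acceleration (CA) Kalman filter running in parallel.
import numpy as np

dt = 0.1
F_cv = np.array([[1, dt, 0], [0, 1, 0], [0, 0, 0]], float)              # CV: acc ~ 0
F_ca = np.array([[1, dt, 0.5 * dt**2], [0, 1, dt], [0, 0, 1]], float)   # CA
F = [F_cv, F_ca]
H = np.array([[1.0, 0.0, 0.0]])                  # radar measures relative distance
Q = [np.diag([0.01, 0.01, 1e-4]), np.diag([0.01, 0.1, 0.5])]
R = np.array([[0.25]])
PI = np.array([[0.95, 0.05], [0.05, 0.95]])      # model transition probabilities

x = [np.zeros(3), np.zeros(3)]                   # per-model state estimates
P = [np.eye(3) * 10.0, np.eye(3) * 10.0]         # per-model covariances
mu = np.array([0.5, 0.5])                        # model probabilities

def imm_step(z):
    """One IMM cycle: mixing, model-matched filtering, probability update."""
    global mu
    c = PI.T @ mu                                # predicted model probabilities
    w = (PI * mu[:, None]) / c[None, :]          # mixing weights w[i, j]
    x0 = [sum(w[i, j] * x[i] for i in range(2)) for j in range(2)]
    P0 = [sum(w[i, j] * (P[i] + np.outer(x[i] - x0[j], x[i] - x0[j]))
              for i in range(2)) for j in range(2)]
    like = np.zeros(2)
    for j in range(2):
        xp = F[j] @ x0[j]                        # prediction with hypothesis j
        Pp = F[j] @ P0[j] @ F[j].T + Q[j]
        S = H @ Pp @ H.T + R                     # residual covariance
        S_inv = np.linalg.inv(S)
        gamma = z - H @ xp                       # residual (innovation)
        K = Pp @ H.T @ S_inv
        x[j] = xp + K @ gamma                    # innovation with hypothesis j
        P[j] = Pp - K @ S @ K.T
        like[j] = np.exp(-0.5 * gamma @ S_inv @ gamma) / \
                  np.sqrt(2 * np.pi * np.linalg.det(S))
    mu = like * c
    mu = mu / mu.sum()                           # updated model probabilities
    return sum(mu[j] * x[j] for j in range(2)), mu   # combined state estimate

state, probs = imm_step(np.array([50.0]))        # feed one range measurement
```

The vector mu in this sketch corresponds to the model probabilities plotted in figures 6 and 7.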


Fig. 8. Flow chart of the operations performed in the multi-sensor fusion step.

Fig. 7. Results of lateral infrared laser tracking using the IMM filter approach.

B. Tracking with Infrared Sensor

Results of the IMM filtering applied to the lateral tracking of infrared data are shown in figure 7. Model 1 is chosen here as a constant lateral distance model and model 2 as a constant lateral velocity model. The results presented in figure 7 correspond to a traffic scene where a car is cruising ahead of the test vehicle, then performs a lane-change to the left, another lane-change back to the lane of the test vehicle and finally leaves this lane to the right. The graphs show the raw data and the tracked data of the lateral position, the estimated lateral velocity and the corresponding model probabilities. The model probabilities graph shows that the probability of the constant lateral velocity model gets very high in case of lane-changing maneuvers.

IV. SENSOR FUSION

The multi-sensor fusion is set up in the same manner as the single-sensor tracking. The multi-sensor fusion, however, uses the output of the single-sensor tracking as input list. An overview of the operations performed in this sensor fusion step is shown in figure 8. Comparing figures 4 and 8 shows that the principles and techniques of the single-sensor tracking and the sensor fusion step are very similar. The advantages of this separation of the tracking and sensor fusion step are modularity, complexity reduction and reusability of the algorithms. These advantages are given priority over one large sensor fusion step on the measurement level.
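A small sketch of this two-stage architecture follows: each sensor runs its own tracker and only the resulting object lists are handed to the fusion stage. The class names and the list-based interface are assumptions for illustration, not the authors' software design.

```python
# Illustrative two-stage architecture: per-sensor tracking followed by a
# fusion stage that consumes object lists rather than raw measurements.
from typing import Dict, List

class SingleSensorTracker:
    """Wraps gating, association and filtering for one sensor (cf. figure 4)."""
    def __init__(self, name: str):
        self.name = name
        self.objects: List[Dict] = []   # e.g. {"x": ..., "y": ..., "vx": ...}

    def update(self, targets: List[Dict]) -> List[Dict]:
        # ... gating, data association and IMM filtering would go here ...
        self.objects = targets          # placeholder: pass measurements through
        return self.objects

class FusionTracker:
    """Runs the same gating/association/filtering scheme on the object lists
    delivered by the single-sensor trackers (cf. figure 8)."""
    def update(self, object_lists: List[List[Dict]]) -> List[Dict]:
        fused: List[Dict] = []
        for objects in object_lists:
            # ... association to existing fusion objects, PDA update ...
            fused.extend(objects)       # placeholder: concatenate for the sketch
        return fused

radar = SingleSensorTracker("77GHz radar")
infrared = SingleSensorTracker("infrared laser")
fusion = FusionTracker()

def cycle(radar_targets: List[Dict], ir_targets: List[Dict]) -> List[Dict]:
    # modular pipeline: a sensor can be exchanged without touching the fusion step
    return fusion.update([radar.update(radar_targets), infrared.update(ir_targets)])
```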
A. Probabilistic Data Association (PDA)

In a multi-target, multi-object environment two typical situations are depicted in figure 9. These two traffic scenes are real-world examples and were obtained during a test drive on a public highway. The figures show laser and radar objects tracked with the presented tracking algorithms. The laser objects appear as thin black boxes, the radar objects as thick black crosses. The fusion objects are depicted as thick boxes. The lane information obtained by a vision system is also depicted as dashed lines. The lines moving away from the small car symbol indicate the ranges of the radar and the infrared sensor. In the left scene at about 65m two fusion objects, two laser objects and two radar objects can be seen. In the right scene at about 40m a similar configuration is shown with two fusion objects, two radar objects and one laser object. In the fusion algorithm it now has to be decided which measurements (radar and laser objects) are associated to which existing objects (fusion objects). For most objects shown in these two scenes this step is rather straightforward; for the two particular configurations just mentioned, however, it is not. Various approaches exist to perform this data association operation [1], [2], [3], [4]. Most of these data association algorithms make use of the so-called gating.

Fig. 9. Bird view of two traffic scenes to illustrate the data association problem.

The probability density function of the Kalman filter residuals is given by the zero-mean Gaussian

p(γ(k)) = N(γ(k); 0, S(k)),    (1)

where γ(k) is the corresponding residual vector, given by

γ(k) = [y(k) - H(k)x̂(k|k-1)].    (2)

The corresponding residual covariance matrix S evaluates to

S(k) = H(k)P(k|k-1)H(k)^T + R(k),    (3)

with H(k) describing the observation matrix and R the measurement noise. A measurement is declared valid if the normalized residual fulfills the following thresholding condition:

√(γ(k)^T S(k|k-1)^(-1) γ(k)) ≤ c.    (4)

Statistically more than 99.8% of all valid measurements lie within the gate defined by choosing c = 3. The probabilistic data association approach uses all measurements that are valid according to this gating. All valid measurements are then combined to one single residual. The weighting is according to the likelihood values of the corresponding measurements. The combined residual vector of m valid measurements is calculated as

γ̄(k) = Σ_{i=1}^{m} p_i(k) γ_i(k),    (5)

with the weights

p_i(k) = L_i(k) / Σ_{j=1}^{m} L_j(k),    (6)

where L_i(k) denotes the likelihood of measurement i, i.e. the residual probability density (1) evaluated at γ_i(k). The state update is then performed with this combined residual,

x̂(k|k) = x̂(k|k-1) + K(k)γ̄(k).    (7)

The updated covariance matrix P(k|k) is the standard Kalman filter covariance update plus a correction term that accounts for additional uncertainty due to the fact that the different residuals are combined to one but are unequally weighted. The PDA Kalman filter covariance update is

P(k|k) = P(k|k-1) - K(k)S(k)K(k)^T + K(k)[Σ_{i=1}^{m} p_i(k) γ_i(k)γ_i(k)^T - γ̄(k)γ̄(k)^T]K(k)^T.    (8)

The probabilistic data association is clearly a suboptimal data association technique. The advantages over other (optimal) data association techniques are low complexity and the fact that all measurements with a certain minimum likelihood value are considered in the innovation step. Only very unlikely measurements are discarded. For the specific problem of object tracking for automotive applications these characteristics seem appropriate. Looking at the two traffic scenes in figure 9 it is reasonable to use the radar objects between the fusion objects for the update of both fusion objects. Discarding them might implicate a loss of information. Associating them to just one fusion object might be wrong, and creating a new fusion object is not reasonable concerning the little lateral distance of the existing fusion objects.
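A hedged Python sketch of this gating and likelihood-weighted residual combination might look as follows. The state layout, the noise values and the helper name pda_update are assumptions for illustration.

```python
# Sketch of probabilistic data association for one track: gate the residuals,
# weight the valid ones by their Gaussian likelihood, and perform a single
# combined Kalman update.  All numeric values are illustrative.
import numpy as np

def pda_update(x_pred, P_pred, H, R, measurements, c=3.0):
    """x_pred: predicted state (n,), P_pred: (n,n), measurements: list of (m,)."""
    S = H @ P_pred @ H.T + R                     # residual covariance, cf. (3)
    S_inv = np.linalg.inv(S)
    residuals, likelihoods = [], []
    for z in measurements:
        gamma = z - H @ x_pred                   # residual vector, cf. (2)
        if np.sqrt(gamma @ S_inv @ gamma) <= c:  # gating with c = 3, cf. (4)
            residuals.append(gamma)
            norm = np.sqrt((2 * np.pi) ** len(z) * np.linalg.det(S))
            likelihoods.append(np.exp(-0.5 * gamma @ S_inv @ gamma) / norm)
    if not residuals:                            # no valid measurement: no update
        return x_pred, P_pred
    w = np.array(likelihoods) / np.sum(likelihoods)
    gamma_bar = sum(wi * gi for wi, gi in zip(w, residuals))   # combined residual
    K = P_pred @ H.T @ S_inv                     # Kalman gain
    x_upd = x_pred + K @ gamma_bar
    # standard covariance update plus a spread term for the unequal weighting
    spread = sum(wi * np.outer(gi, gi) for wi, gi in zip(w, residuals)) \
             - np.outer(gamma_bar, gamma_bar)
    P_upd = P_pred - K @ S @ K.T + K @ spread @ K.T
    return x_upd, P_upd

# usage with a 2-state (distance, relative velocity) track and two range readings
H = np.array([[1.0, 0.0]])
x, P = pda_update(np.array([50.0, -2.0]), np.eye(2), H,
                  np.array([[0.25]]), [np.array([50.4]), np.array([51.0])])
```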
B. Probabilistic Data Association with IMM

The data association operations always take place in between the prediction and innovation step of the dynamic Kalman filter process [1]. If IMM filtering is applied, this has to be considered and the PDA has to be adapted. With n filters running in parallel, first a combined covariance matrix Pc is calculated according to the predicted model probabilities,

Pc(k|k-1) = Σ_{i=1}^{n} μ^i(k|k-1) P^i(k|k-1),    (9)

with

μ^i(k|k-1), i = 1...n    (10)

being the predicted model probabilities. For each measurement γ_i(k) the corresponding residual covariance matrix

S_i(k) = H_i(k) Pc(k|k-1) H_i(k)^T + R_i(k)    (11)

can be calculated. This matrix S_i is then used to perform the gating according to (4) to decide whether the measurement γ_i(k) is declared valid or not. The combined residual according to (5) and (6) is then used to perform the innovation step separately for all n parallel Kalman filters within the IMM filter.
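A hedged sketch of this adaptation is given below: the predicted model probabilities weight the per-model covariances into one combined matrix before gating. Variable names, dimensions and noise values are assumptions made for illustration.

```python
# Illustrative computation of the combined covariance (9) and per-measurement
# gating (cf. (11) and (4)) for an IMM bank of n parallel filters.
import numpy as np

def combined_covariance(mu_pred, P_models):
    """Pc(k|k-1) = sum_i mu^i(k|k-1) * P^i(k|k-1)."""
    return sum(m * P for m, P in zip(mu_pred, P_models))

def gate_measurements(x_pred, Pc, H, R, measurements, c=3.0):
    """Gate each measurement with its own residual covariance S_i."""
    valid = []
    for z in measurements:
        S_i = H @ Pc @ H.T + R                    # residual covariance, cf. (11)
        gamma_i = z - H @ x_pred                  # residual of measurement i
        if np.sqrt(gamma_i @ np.linalg.inv(S_i) @ gamma_i) <= c:
            valid.append(gamma_i)
    return valid

# two parallel models (e.g. CV and CA) with predicted probabilities 0.7 / 0.3
mu_pred = np.array([0.7, 0.3])
P_models = [np.diag([1.0, 0.5, 0.1]), np.diag([1.0, 1.0, 1.0])]
Pc = combined_covariance(mu_pred, P_models)

H = np.array([[1.0, 0.0, 0.0]])                   # position-only measurement
R = np.array([[0.25]])
valid = gate_measurements(np.zeros(3), Pc, H, R,
                          [np.array([0.4]), np.array([5.0])])
```

In the paper each measurement can originate from a different sensor, so the observation matrix and measurement noise would be measurement-specific (H_i, R_i); the sketch shares them for brevity.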
V. SENSOR FUSION OF 77GHz RADAR AND IR

In this chapter results of single-sensor tracking and sensor fusion are presented, obtained by the algorithms described above. In figure 10 results of both infrared and radar single-sensor tracking and of the sensor fusion are shown. The results correspond to the same traffic scene described for figure 7.

The first graph shows the tracked lateral distance of radar and infrared single-sensor tracking. The result of the sensor fusion step is also shown in the same subfigure. It can be seen that the car is always in range of the infrared sensor, whereas the radar tracking loses the object when it leaves the sensor's range. The following two graphs show longitudinal distance and relative velocities of both infrared and radar. The results of single-sensor tracking and sensor fusion are depicted. The next graph shows the relative longitudinal acceleration obtained from the sensor fusion step. The last graph shows the longitudinal model probabilities of the steady state model (constant relative velocity) and the maneuver model (constant relative acceleration model). As expected, the maneuver model prevails when the acceleration is considerably high. Figure 11 zooms into the interval of about 15s-25s of both the lateral position estimation and the relative velocity estimation. In the lateral position graph it can be seen that the fusion result follows the infrared tracking closer than the radar tracking. This is due to the smaller measurement noise given to the infrared signal and to the PDAF based residual calculation. The same reasoning applies to the relative velocity graph where the sensor fusion signal follows the radar data closer.


Fig. 12. Bird view of a 20 second test drive. All infrared and radar measurements are shown as well as the trajectory of the test vehicle and two fusion objects.

To demonstrate the capability of the presented algorithms, figure 12 shows the same data as figure 3. Additionally the tracked trajectories of two fusion objects obtained by the presented algorithms are shown. The fusion object track close to the trajectory of the test vehicle corresponds to the data shown in figures 7, 10 and 11. The second track belongs to another vehicle cruising on the left side of the test vehicle. Comparing the trajectories of the fusion objects with the sensor measurements, it can be seen that the phase delay between measurements and fused objects is rather moderate. Another obvious observation is that the lateral position estimation is dominated by the infrared sensor.

Fig. 10. Results of longitudinal and lateral sensor fusion of radar and infrared data using IMM filtering and probabilistic data association.

Fig. 11. Close look at lateral distance and relative velocity estimation of figure 10.

VI. SUMMARY AND CONCLUSION

In this paper algorithms for single-sensor tracking and multi-sensor fusion were presented. The results show that fusing radar data with infrared data considerably increases detection range, reliability and accuracy of the object tracking. This is mandatory for further development of driver assistance systems. Using multiple model filtering for sensor fusion applications helps to capture the dynamics of maneuvering objects while still achieving smooth object tracking for non-maneuvering objects. This is important when safety and comfort systems have to make use of the same sensor information. Comfort systems generally require smoothly filtered data, whereas for safety systems it is crucial to capture maneuvers of other road users as fast as possible. All presented algorithms have been tested in real-time on standard PC systems.

VII. OUTLOOK

In further research the presented data association algorithms will be compared to optimal data association algorithms. The results of the presented tracking algorithms will be compared to reference data obtained by vehicle-to-vehicle communication.

Further, the effects of multiple model estimation upon optimal longitudinal control as presented in [7] will be examined in more detail.

REFERENCES

[1] A. Gelb, "Applied Optimal Estimation", MIT Press, 1974.
[2] Y. Bar-Shalom, T. E. Fortmann, "Tracking and Data Association", Academic Press Inc., San Diego, New York, 1988.
[3] Y. Bar-Shalom, W. Dale Blair, "Multitarget-Multisensor Tracking: Applications and Advances Volume III", Artech House, Boston, London, 2000.
[4] S. S. Blackman, "Multiple Target Tracking with Radar Applications", Artech House, 1986.
[5] H. A. P. Blom, Y. Bar-Shalom, "The Interacting Multiple Model Algorithm for Systems with Markovian Switching Coefficients", IEEE Transactions on Automatic Control, Vol. 33, No. 8, 1988.
[6] Xiao-Rong Li, Y. Bar-Shalom, "Multiple Model Estimation with Variable Structure", IEEE Transactions on Automatic Control, Vol. 41, No. 4, 1996.
[7] Rainer Möbus, Mato Baotic, Manfred Morari, "Multi-Object Adaptive Cruise Control", Hybrid Systems: Computation and Control Conference HSCC 03, Prague, Czech Republic, April 2003.
[8] Rainer Möbus, Armin Joos, Uli Kolbe, "Multi-Target Multi-Object Radartracking", IEEE Intelligent Vehicles Conference, Columbus, Ohio, USA, June 2003.

