Nonlinear Filtering
Concepts and Engineering Applications

Jitendra R. Raol
Girija Gopalratnam
Bhekisipho Twala
MATLAB® is a trademark of The MathWorks, Inc. and is used with permission. The MathWorks does not warrant the accuracy of the text or exercises in this book. This book’s use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.

CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2017 by Taylor & Francis Group, LLC


CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Printed on acid-free paper

International Standard Book Number-13: 978-1-4987-4517-8 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation
without intent to infringe.
Visit the Taylor & Francis Web site at
http://www.taylorandfrancis.com
and the CRC Press Web site at
http://www.crcpress.com
Arbib Athans Anderson Astrom Ackermann Ahmed Aggoun

Bode Brown Bayes Bellman Bucy Bierman Balakrishnan Bryson Basar Box Boyd

Billings Benes Baras Bendat Brigo Bar-Shalom Breeman Brockett Bagchi

Cramer Clark Chen Collins Chang Challa Chigansky

Doob Davis Detchmendy Daum Deutsch Doyle Durrant-Whyte Dasarathy

Einstein Euler Evans Eykhoff Elliott

Fisher Fokker Fujisaki Franklin Falb Fuller Fourier Frost

Gauss Girsanov Gelb Grimble Geesey Graupe Goodwin

Hopf Ho Handel Hassibi Haddad Haykin Hanzon Hedrick Hsia Hall

Ito Isidori Iliff Isermann

Jenkins Jazwinski Joseph Junkins Julier Jategaonkar

Kolmogorov Kushner Kalman Kailath Kallianpur Kunita Krishnan Kumar Kwakernaak

Kalaba Klein Kokotovic Khargonekar Kucera Krebs Karandikar Kirubarajan

Legendre Lee Liptser Langevin Laub Luenberger Ljung Levy Lyapunov Liebelt Lo Loh Leondes Lindsey Lewis Limebeer

Mayne Minorsky Meyer Mendel Moore Masani Movellan Maybeck Mitter Meditch

Mehra Maine Melsa Mook Morelli Mahalanabis Mulder Mutambara Manyika

Nyquist Nahi Nichols Nelson

Ornstein Oldenburger Olsder

Papoulis Planck Pugachev Pontryagin Popov Priestley Pearson Petersen Payne Piersol Powell Potter Plaetschke

Routh Ragazzini Rao

Shannon Stratonovich Striebel Schmidt Sage Sorenson Segall Swerling Shiryaev Scott

Sivan Smoluchowski Spall Stoica Soderstrom Sayed Shaked Simon Speyer Sridhar Saridis

Skorokhod Schaft Stengel Shinners Skogestad Song Schon Sarkka Sinha

Tao Tempo Tanizaki Uhlenbeck

Vidyasagar Viterbi Varaiya Verhaegen Varshney

Wiener Wong Wonham Warwick Widrow Watanabe Willems Wittenmark

Xiong (Jie) Yurkovich

Zadeh Zakai Zames Zhou Zhang

Dedicated to these and many others who have contributed directly or indirectly to nonlinear filtering and some supporting areas like stochastic processes/calculus …

(The ordering of names within a line is random, and the list is not exhaustive.)
Contents

Preface.........................................................................................................................................................................................xv
Acknowledgements............................................................................................................................................................... xvii
Authors......................................................................................................................................................................................xix
Introduction..............................................................................................................................................................................xxi

Section I Mathematical Models, Kalman Filtering and H-Infinity Filters

1. Dynamic System Models and Basic Concepts.............................................................................................................. 3


1.1 Dynamic Systems: The Need for Modelling, Parameter Estimation and Filtering........................................ 3
1.2 Mathematical Modelling of Systems...................................................................................................................... 5
1.2.1 Time and Frequency Domain Aspects..................................................................................................... 5
1.2.2 Differential Equations................................................................................................................................. 6
1.2.3 Difference Equations................................................................................................................................... 6
1.2.4 State Space Models....................................................................................................................................... 7
1.2.4.1 Physical Representation.............................................................................................................. 7
1.2.4.2 Controllable Canonical Form..................................................................................................... 7
1.2.4.3 Observable Canonical Form....................................................................................................... 8
1.2.4.4 Diagonal Form.............................................................................................................................. 8
1.2.4.5 General State Space Models........................................................................................................ 8
1.2.5 Polynomial Models...................................................................................................................................... 9
1.2.6 Time Series Models...................................................................................................................................... 9
1.2.6.1 Autoregressive Model................................................................................................................ 10
1.2.6.2 Least Squares Model.................................................................................................................. 10
1.2.7 Transfer Function Models......................................................................................................................... 11
1.3 Nonlinear Dynamic Systems................................................................................................................................ 11
1.3.1 Nonlinearities in a System........................................................................................................................ 12
1.3.2 Mathematical Models of Nonlinear Systems......................................................................................... 12
1.3.2.1 Nonlinear Differential and Difference Equations................................................................. 12
1.3.2.2 Volterra Series............................................................................................................................. 13
1.3.2.3 Hammerstein Model.................................................................................................................. 13
1.3.2.4 Nonlinear State Space Models.................................................................................................. 14
1.3.2.5 Nonlinear Time Series Models................................................................................................. 14
1.4 Signal and System Norms...................................................................................................................................... 15
1.4.1 Signal Norms.............................................................................................................................................. 15
1.4.2 System Norms............................................................................................................................................ 16
1.4.2.1 H2 Norm....................................................................................................................................... 16
1.4.2.2 H∞ Norm...................................................................................................................................... 17
1.5 Digital Signal Processing, Parameter Estimation and Filtering....................................................................... 18
1.5.1 Signal Processing....................................................................................................................................... 18
1.5.2 Parameter Estimation: Recursive Approach.......................................................................................... 19
1.5.3 Filtering Concepts...................................................................................................................................... 20
1.5.4 Simple Recursive Filtering........................................................................................................................ 21
Appendix 1A: Mean Square Estimation......................................................................................................................... 21
Appendix 1B: Nonlinear Models Based on Artificial Neural Networks and Fuzzy Logic..................................... 24
Appendix 1C: Illustrative Examples............................................................................................................................... 29

2. Filtering and Smoothing................................................................................................................................................. 35


2.1 Wiener Filtering....................................................................................................................................................... 35
2.2 Least Squares Parameter Estimation.................................................................................................... 38
2.3 Recursive Least Squares Filter.............................................................................. 39
2.4 State Space Models and Kalman Filtering........................................................................................................... 40
2.4.1 Discrete Time Filter................................................................................................................................... 41
2.4.1.1 State and Covariance Matrix Propagation.............................................................................. 41
2.4.1.2 Measurement Update................................................................................................................ 42
2.4.1.3 Kalman Gain............................................................................................................................... 42
2.4.2 Continuous Time Kalman Filter.............................................................................................................. 43
2.4.3 Interpretation of Kalman Filter................................................................................................................ 44
2.4.3.1 Continuous Time Filter.............................................................................................................. 44
2.4.3.2 Discrete Time Filter.................................................................................................................... 44
2.4.4 Filters for Correlated/Coloured Process and Measurement Noises.................................................. 45
2.4.4.1 Kalman Filter for the Correlated Process and Measurement Noises................................. 45
2.4.4.2 Handling of Coloured Process Noise and Coloured Measurement Noise
in Kalman Filters........................................................................................................................ 46
2.4.5 Time-Varying Linear Kalman Filters...................................................................................................... 47
2.4.6 Steady State Filtering................................................................................................................................. 48
2.4.7 Kalman Filter Implementation Aspects.................................................................................................. 48
2.4.8 Parallelization of Kalman Filters............................................................................................................. 49
2.4.8.1 Measurement Update Parallelization...................................................................................... 50
2.4.8.2 Time Propagation Parallelization............................................................................................ 51
2.5 Filter Error Methods............................................................................................................................................... 51
2.5.1 Output Error Method................................................................................................................................ 51
2.5.2 Process Noise Algorithms for Linear Systems...................................................................................... 52
2.5.2.1 Natural Formulation.................................................................................................................. 53
2.5.2.2 Innovations Formulation........................................................................................................... 53
2.5.2.3 Mixed Formulation.................................................................................................................... 53
2.5.3 Process Noise Algorithms for Nonlinear Systems................................................................................ 54
2.5.3.1 Steady-State Filter....................................................................................................................... 54
2.5.3.2 Time-Varying Filter.................................................................................................................... 55
2.6 Information Filtering.............................................................................................................................................. 56
2.6.1 Fisher’s Information Concept................................................................................................................... 57
2.6.2 Linear Information Filter.......................................................................................................................... 57
2.7 Smoothers................................................................................................................................................................. 58
2.7.1 Smoothing as a Combination of Forward and Backward Filtering................................................... 59
2.7.2 Fixed Interval RTS Smoother................................................................................................................... 62
2.7.3 Fixed Point Smoother................................................................................................................................ 63
2.7.4 Fixed Lag Smoother................................................................................................................................... 65
Appendix 2A: Innovations Approach to Linear Least Squares Estimation.............................................................. 66
Appendix 2B: Filtering Algorithms for Delayed State and Missing Measurements – Illustrative Example........ 74
Appendix 2C: Artificial Neural Network Based Filtering........................................................................................... 82
Appendix 2D: Image Centroid Tracking with Fuzzy Logic in Filtering Algorithms – Illustrative Example....... 85
Appendix 2E: Illustrative Examples............................................................................................................................... 93

3. H∞ Filtering...................................................................................................................................................................... 103
3.1 H∞ Norm and Robustness.................................................................................................................................... 103
3.2 H∞ Filtering Problem............................................................................................................................................. 104
3.2.1 H∞ A Posteriori Filter............................................................................................................................... 107
3.2.2 H∞ A Priori Filter...................................................................................................................................... 108
3.3 H∞ Smoother............................................................................................................................................................110
3.4 H∞ Risk-Sensitive Filter..........................................................................................................................................111
3.4.1 A Posteriori Risk-Sensitive Filter............................................................................113
3.4.2 A Priori Risk-Sensitive Filter...................................................................................................................113
3.4.3 Risk-Sensitive Smoother..........................................................................................................................113
3.5 Mixed H∞ and Kalman Filtering..........................................................................................................................114
3.6 Global H∞ Filter.......................................................................................................................................................116
Appendix 3A: Krein Space and Some Definitions and Theorems.............................................................................118
Appendix 3B: Illustrative Examples............................................................................................................................. 121

4. Adaptive Filtering.......................................................................................................................................................... 131


4.1 Need of Filter Tuning and Adaptation............................................................................................................... 131
4.2 Approaches to Adaptive Filtering...................................................................................................................... 131
4.2.1 Heuristic Approach................................................................................................................................. 132
4.2.2 Bayesian Approach.................................................................................................................................. 132
4.2.3 Maximum Likelihood–Based Optimal Adaptive Filtering............................................................... 133
4.2.4 Correlation-Based Adaptation............................................................................................................... 136
4.2.5 Concept of Covariance Matching.......................................................................................................... 137
4.2.6 Fuzzy Logic–Based Adaptation............................................................................................................. 138
4.2.6.1 Fuzzy Inference System for R with Known Q..................................................................... 138
4.2.6.2 Fuzzy Inference System for Q with Known R..................................................................... 138
4.3 H∞ Finite Memory Adaptive Filter...................................................................................................................... 139
Appendix 4A: Maneuvering Target – Illustrative Examples..................................................................................... 140
Appendix 4B: Adaptive Kalman Filter – Illustrative Example................................................................................. 148
Exercises for Section I (Chapters 1–4)........................................................................................................................... 150
References for Section I (Chapters 1–4)........................................................................................................................ 152

Section II Factorization and Approximation Filters

5. Factorization Filtering................................................................................................................................................... 157


5.1 Divergence of Kalman Filter; Need of Factorization....................................................................................... 157
5.2 UD Factorization Filter......................................................................................................................................... 157
5.2.1 Time Propagation..................................................................................................................................... 158
5.2.2 Measurement Data Update..................................................................................................................... 158
5.2.3 Filter for Correlated Process Noise and Bias Parameters.................................................................. 159
5.3 Filtering Algorithms Based on Square-Root Arrays........................................................................................ 160
5.3.1 H2 Square-Root Arrays............................................................................................................................ 160
5.3.2 Chandrasekhar Recursions.....................................................................................................................161
5.3.3 H2 Chandrasekhar Recursions................................................................................................................161
5.4 Square-Root Information Filter............................................................................................................................162
5.4.1 Inclusion of A Priori Information in the Least Squares Cost Function............................................162
5.4.2 Measurements Data Update....................................................................................................................162
5.4.3 State Propagation of Square-Root Information Filter......................................................................... 163
5.4.4 Measurements Data Update of Square-Root Information Filter....................................................... 165
5.5 Eigenvalue–Eigenvector Factorization Filtering............................................................................................... 165
5.5.1 V–D Discrete Time Measurement Update........................................................................................... 165
5.5.2 V–D Square-Root Filtering......................................................................................................................167
5.5.2.1 Continuous Time/Discrete Time Square-Root Filtering Algorithm.................................167
5.5.2.2 Discrete Time/Discrete Time Square-Root Filtering Algorithm........................................167
5.6 H-Infinity Square-Root Filters............................................................................................................................. 168
5.6.1 H-Infinity Square-Root Arrays.............................................................................................................. 168
5.6.2 H-Infinity Chandrasekhar Recursions................................................................................................. 169

6. Approximation Filters for Nonlinear Systems......................................................................................................... 175


6.1 Continuous Extended Kalman–Bucy Filter....................................................................................................... 175
6.2 Continuous-Discrete Extended Kalman–Bucy Filter........................................................................................176
6.2.1 Time Propagation Filter...........................................................................................................................176
6.2.2 Measurement Data Update/Filtering....................................................................................................176
6.3 Continuous Discrete Extended Kalman–Bucy Filter for Joint State Parameter Estimation....................... 177
6.3.1 Time Propagation..................................................................................................................................... 178
6.3.2 Measurement Data Update..................................................................................................................... 178
6.4 Iterated Extended Kalman Filter......................................................................... 178
6.5 Linearized Kalman Filter..................................................................................................................................... 179
6.6 Continuous Second-Order Minimum Variance Estimator (SOF).................................................................. 180
6.7 Continuous-Discrete Modified Gaussian Second-Order (CDMGSO) Filter................................................. 181
6.7.1 Measurement Update.............................................................................................................................. 181
6.7.2 Time Propagation/Prediction Part........................................................................................................ 182
6.8 Extended Information Filter................................................................................................................................ 182
6.9 Statistically Linearized Filter.............................................................................................................................. 183
6.10 Derivative-Free Kalman Filter............................................................................................................................. 185
6.10.1 Derivative-Free Kalman Filter Initialization....................................................................................... 185
6.10.2 Sigma Points Computation..................................................................................................................... 186
6.10.3 State and Covariance Propagation........................................................................................................ 186
6.10.4 State and Covariance Update................................................................................................................. 186
6.11 Global Approximations Nonlinear Filters......................................................................................................... 186
6.11.1 Orthogonal Series Expansion Approximations.................................................................................. 187
6.11.1.1 Approximation Based on Legendre or Fourier Basis Functions...................................... 188
6.11.1.2 Approximation Based on Chebyshev Polynomials............................................................. 189
6.11.2 Gaussian Sum Approximation............................................................................................................... 193
6.11.3 Point-Mass Approximation.................................................................................................................... 194
6.11.3.1 Measurement Update.............................................................................................................. 196
6.11.3.2 Time Propagation..................................................................................................................... 196
6.11.3.3 Point Estimates......................................................................................................................... 197
6.11.3.4 Algorithmic Aspects................................................................................................................ 197
6.11.4 Spline Approximation............................................................................................................................. 198
6.11.4.1 B-Splines.................................................................................................................................... 198
6.11.4.2 Spline Filtering......................................................................................................................... 199
6.12 Extended H-Infinity Filters.................................................................................................................................. 201
6.12.1 Continuous Time System........................................................................................................................ 201
6.12.2 Discrete Time System.............................................................................................................................. 202
Appendix 6A: Approximate Filters................................................................................................................................211
Appendix 6B: Basic Numerical Approximation Approaches.................................................................................... 219
Appendix 6C: Satellite Orbit Determination as a Nonlinear Filtering Problem – Application
of the Extended Kalman Filter, Extended UD Filter and Extended UD-RTS Smoother............................. 226
Appendix 6D: Application to Planar Tracking Problem – Illustrative Example.................................................... 242

7. Generalized Model Error Estimators for Nonlinear Systems............................................................................... 249


7.1 Philosophy of Model Error.................................................................................................................................. 250
7.2 Pontryagin’s Conditions....................................................................................................................................... 250
7.3 Basic Invariant Embedding Approach............................................................................................................... 252
7.4 Generalized Continuous Time Algorithm........................................................................................................ 253
7.5 Generalized Discrete Time Algorithm.............................................................................................................. 255
7.6 Conventional Invariant Embedding Estimators............................................................................ 256
7.7 Robust Estimation of Model Error in H-Infinity Setting................................................................................. 256
7.7.1 Performance Norm.................................................................................................................................. 256
7.7.2 Constraint on Cost Function.................................................................................................................. 257
7.7.3 Semi-Robust/Adaptive Invariant Embedding Estimators................................................................. 258
7.8 Model Fitting Procedure to the Discrepancy/Model Error............................................................................ 258
7.9 Features of Model Error Algorithm................................................................................................................... 259
Exercises for Section II (Chapters 5–7).......................................................................................................................... 264
References for Section II (Chapters 5–7)....................................................................................................................... 266

Section III Nonlinear Filtering, Estimation and Implementation Approaches

8. Nonlinear Estimation and Filtering........................................................................................................................... 271


8.1 The General Estimation Framework.................................................................................................................. 271
8.2 Continuous Time Dynamic Model and Filtering............................................................................................. 272
8.2.1 Fokker–Planck Equation......................................................................................................................... 273
8.2.2 Kushner–Stratonovich Equation.............................................................................................................274
8.2.3 Minimum Variance Estimation............................................................................................................. 275
8.2.4 Bayesian Approach to Continuous Time Filtering.............................................................................. 276
8.2.4.1 Bayes Formula........................................................................................................................... 276
8.2.4.2 Nonlinear Filtering for Stochastic Differential Equation–Continuous Time Systems........278
8.2.5 Computation of the Filtered Estimates................................................................................................. 280
8.3 Bayesian Recursive Estimation–Discrete Time Systems................................................................................. 281
8.3.1 Measurement Data Update/Filtering................................................................................................... 281
8.3.2 Prediction-Time Propagation/Evolution.............................................................................................. 282
8.4 Continuous Time State–Discrete Time Measurement Estimator................................................................... 282
8.4.1 Filtering/Measurement Data Update................................................................................................... 282
8.4.2 Prediction-Time Propagation/Evolution.............................................................................................. 282
8.5 Benes Filter............................................................................................................................................................. 282
8.5.1 Equations of the Benes Filter.................................................................................................................. 285
8.5.1.1 Derivation of the Propagation Part of the Benes Filter....................................................... 285
8.5.1.2 Derivation of the Measurement Data Update Part of the Benes Filter............................. 287
8.6 Wonham Filter....................................................................................................................................................... 287
8.6.1 Development of Filtering Distribution Formulas for n = 0................................................................ 288
8.6.2 Development of Filtering Distribution Formulas for n > 0................................................................ 289
8.7 Conditionally Gaussian Filtering....................................................................................................................... 291
8.7.1 Conditionally Gaussian Models............................................................................................................ 291
8.7.2 State Space Dynamic Models................................................................................................................. 292
8.7.2.1 Time Propagation..................................................................................................................... 293
8.7.2.2 Measurement/Data Update.................................................................................................... 293
8.7.3 Conditional Gauss–Hermite Filtering.................................................................................................. 294
8.7.3.1 Time Propagation Part of the Conditional Gauss–Hermite Filter..................................... 294
8.7.3.2 Measurement/Data Update Part of the Conditional Gauss–Hermite Filter.................... 295
8.8 Daum’s Filter.......................................................................................................................................................... 297
8.8.1 Examples of Daum’s New Filter............................................................................................................. 299
8.8.2 Derivations of Daum’s Filter................................................................................................................... 301
8.8.3 Explicit Estimation Formulas................................................................................................................. 302
8.9 Schmidt’s Design of Nonlinear Filters Based on Daum’s Theory.................................................................. 303
8.9.1 Conditions for Solution of Daum’s Equations..................................................................................... 304
8.9.2 Time Propagation/Evolution and Measurement Update Equations................................................ 305
8.10 Cubature Kalman Filter: A Nonlinear Filter for High-Dimensional State Estimation............................... 306
8.10.1 Basic Bayesian Approach........................................................................................................................ 306
8.10.2 Bayesian Approach: Gaussian Assumption......................................................................................... 307
8.10.2.1 Time Evolution/Propagation.................................................................................................. 307
8.10.2.2 Measurement/Data Update.................................................................................................... 308
8.10.3 Numerical Methods for Moment Integrals.......................................................................................... 309
8.10.3.1 Product Rules............................................................................................................................ 309
8.10.3.2 Non-Product Rules................................................................................................................... 309
8.10.3.3 Basic Aspects of Cubature Rule............................................................................................. 309
8.10.4 Cubature Filter......................................................................................................................................... 309
8.10.4.1 Transformation..........................................................................................................................310
8.10.4.2 Spherical Cubature Rule...........................................................................................................310
8.10.4.3 Radial Rule.................................................................................................................................310
8.10.4.4 Spherical-Radial Rule...............................................................................................................311
8.10.5 Cubature Kalman Filter Algorithm........................................................................311
8.10.5.1 Time Propagation/Evolution...................................................................................................311
8.10.5.2 Measurement/Data Update.................................................................................................... 312
Appendix 8A: Innovations Approach to Nonlinear Filtering and Smoothing....................................................... 312
Appendix 8B: Extended Benes Filter............................................................................................................................. 320
Appendix 8C: Comparative Aspects of Nonlinear Filters......................................................................................... 323
Appendix 8D: Illustrative Examples............................................................................................................................. 326

9. Nonlinear Filtering Based on Characteristic Functions......................................................................................... 337


9.1 Conditionally Optimal Filtering......................................................................................................................... 337
9.2 Conditionally Optimal Filters for Continuous Systems.................................................................................. 338
9.2.1 pth-Order Filter Structure...................................................................................................................... 339
9.2.2 Conventional Generalized Structure.................................................................................................... 339
9.2.3 Simplification to Linear Filters............................................................................................................... 340
9.3 Conditionally Optimal Filters for Discrete Systems........................................................................................ 341
9.3.1 Models Linear in Noise........................................................................................................................... 341
9.3.2 pth-Order-Discrete Structure................................................................................................................. 342
9.3.3 Conventional Structure........................................................................................................................... 343
9.3.4 Simplification to Discrete Linear Filter................................................................................................. 343
9.3.5 Linear Filter for Second-Order Vector Difference System................................................................. 343
9.4 Filtering for Continuous Systems with Discrete Measurements................................................................... 344
9.4.1 Conditionally Optimal Filter.................................................................................................................. 345
9.4.1.1 Time Propagation Filter........................................................................................................... 345
9.4.1.2 Measurement Update Filter.................................................................................................... 345
9.4.2 A Special Case of the Filter..................................................................................................................... 346
9.4.2.1 Time Propagation Part............................................................................................................. 346
9.4.2.2 Measurement Data Update Part............................................................................................. 346
9.4.3 Continuous Discrete Linear Filter......................................................................................................... 347
9.4.3.1 Time Propagation Filter........................................................................................................... 347
9.4.3.2 Measurement Update Filter.................................................................................................... 347
9.5 Nonlinear Filtering for Correlated Noise Processes........................................................................................ 348
9.5.1 Conditionally Optimal Solutions.......................................................................................................... 348
9.5.1.1 Decomposition of the General Structure.............................................................................. 349
9.5.1.2 Simplification to Linear Filter................................................................................................. 349
9.5.2 Measurement Noise Correlation........................................................................................................... 350
9.5.2.1 Conditionally Optimal Solutions........................................................................................... 350
9.5.2.2 General Solution....................................................................................................................... 350
9.5.2.3 Correlated Measurement Noise............................................................................................. 351
9.5.2.4 Linear Filters............................................................................................................................. 351
9.5.3 Pseudo-Measurement Filter................................................................................................................... 352
9.6 Simulation Results................................................................................................................................................ 353
9.7 Derivations of Conditionally Optimal Gains for CSDM Nonlinear Systems.............................................. 355
9.7.1 Time Propagation Filter.......................................................................................................................... 355
9.7.2 Measurement Data Update..................................................................................................................... 356
9.8 Derivations of Conditionally Optimal Gains for CSDM Nonlinear Systems with Correlated
Measurement Noise.............................................................................................................................................. 356
Appendix 9A: Finite Dimensional Minmax Algorithm for Nonlinear State Estimation....................................... 357

10. Implementation Aspects of Nonlinear Filters.......................................................................................................... 361


10.1 Sequential Monte Carlo Methods: Particle Filters............................................................................................ 361
10.1.1 Sequential Importance Sampling Filter................................................................................................ 362
10.1.2 Bootstrap–Sampling Importance Resampling Filter.......................................................................... 362
10.1.3 Improved Sampling Importance Resampling Filter........................................................................... 363
10.1.4 Auxiliary Particle Filter.......................................................................................................................... 363
10.1.5 Rejection Particle Filter........................................................................................................................... 364
10.1.6 Rao–Blackwell Particle Filter.................................................................................. 365
10.1.7 Kernel Smoothing and Regularization................................................................................................. 366
10.1.8 Data Augmentation................................................................................................................................. 366
10.1.8.1 Data Augmentation as a Bayesian Sampling Method........................................................ 367
10.1.9 Markov Chain Monte Carlo (MCMC) Particle Filter.......................................................................... 368
10.1.10 Mixture Kalman Filters........................................................................................................................... 368
10.1.11 Mixture Particle Filters........................................................................................................................... 368
10.1.12 Other Monte Carlo Filters....................................................................................................................... 369
10.2 Selection of Proposal Probability Density Functions...................................................................................... 369
10.2.1 Prior Distribution..................................................................................................................................... 369
10.2.2 Annealed Prior Distribution.................................................................................................................. 370
10.2.3 Likelihood................................................................................................................................................. 370
10.2.4 Bridging the Density and the Partitioned Sampling.......................................................................... 371
10.2.5 Gradient-Based Transition Density....................................................................................................... 371
10.2.6 Extended Kalman Filter as Proposal Distribution.............................................................................. 372
10.2.7 Unscented Particle Filter......................................................................................................................... 372
10.3 Theoretical and Practical Aspects of the Particle Filters................................................................................. 372
10.3.1 Convergence and Asymptotic Aspects................................................................................................. 372
10.3.1.1 Almost Sure Convergence....................................................................................................... 372
10.3.1.2 Mean Square Convergence..................................................................................................... 372
10.3.2 Bias and Variance..................................................................................................................................... 373
10.3.3 Robustness.................................................................................................................................................374
10.3.4 Adaptive Procedure................................................................................................................................. 375
10.4 Evaluation and Implementation of the Particles Filters.................................................................................. 375
10.5 Selection of Structural Functions and Computation of Gains in Conditionally Optimal Filters............. 376
10.5.1 Methods for Computation of Gains...................................................................................................... 376
10.5.1.1 Alpha-Family Method............................................................................................................. 376
10.5.1.2 Functional Approximation Method...................................................................................... 377
10.5.1.3 Semi-Invariant Method........................................................................................................... 378
10.5.2 Selection of Structural Functions.......................................................................................................... 378
10.5.2.1 Specialized Structures............................................................................................................. 378
10.5.2.2 Nonlinear Function of Residuals Structures........................................................................ 378
10.5.2.3 Stability Method....................................................................................................................... 379
Appendix 10A: Daum’s Particle and Non-Particle Filters......................................................................................... 379
Appendix 10B: Illustrative Examples........................................................................................................................... 384

11. Nonlinear Parameter Estimation................................................................................................................................. 387


11.1 Nonlinear Least Squares...................................................................................................................................... 387
11.2 Gaussian Least Squares Differential Correction Method............................................................................... 388
11.3 Output Error Method–Maximum Likelihood Approach............................................................................... 389
11.3.1 Principle of Maximum Likelihood........................................................................................................ 389
11.3.2 Cramér–Rao Lower Bound..................................................................................................................... 390
11.3.3 The Maximum Likelihood Estimate Is Efficient................................................................................. 392
11.3.4 Maximum Likelihood Estimation for Dynamic System.................................................................... 393
11.3.5 Derivation of the Likelihood Function................................................................................................. 393
11.3.6 Accuracy Aspects..................................................................................................................................... 394
11.3.7 Output Error Method.............................................................................................................................. 395
11.3.8 Features and Numerical Aspects of OEM/MLM............................................................................... 397
11.4 Estimation Before Modelling Approach............................................................................................................ 397
11.4.1 The Two-Step Procedure......................................................................................................................... 398
11.4.1.1 Extended Kalman Filter/Fixed Interval Smoother.............................................................. 398
11.4.1.2 Regression for Parameter Estimation.................................................................................... 398
11.4.1.3 Model Parameter Selection Procedure.................................................................................. 398
Appendix 11A: Aircraft Real Data Analysis – Illustrative Example......................................................................... 406
Appendix 11B: Expectation Maximization Algorithm for Parameter Estimation.................................................. 408

12. Nonlinear Observers...................................................................................................................................................... 415


12.1 Continuous Time Full-Order Observer Design................................................................................................ 415
12.1.1 System–Observer Configuration............................................................................................................416
12.2 Discrete Time Full-Order Observer................................................................................................................... 417
12.3 Reduced Order Observer..................................................................................................................................... 417
12.4 Nonlinear Observers............................................................................................................................................ 418
12.4.1 Lyapunov Method................................................................................................................................... 419
12.4.2 Method of Extended Linearization....................................................................................................... 420
12.4.3 Method Based on Lie Algebra................................................................................................................ 421
12.4.4 Deterministic Lyapunov-Based Nonlinear Observer......................................................................... 424
12.4.4.1 Thau’s Method.......................................................................................................................... 424
12.4.4.2 Raghavan’s Method.................................................................................................................. 425
Appendix 12A: Illustrative Examples........................................................................................................................... 427
Exercises for Section III and Section IV (Chapters 8–12 and Appendixes A–F)..................................................... 431
References for Section III (Chapters 8–12).................................................................................................................... 433

Section IV Appendixes – Basic Concepts and Supporting Material

Appendix A: System Theoretic Concepts – Controllability, Observability, Identifiability and Estimability....... 437
Appendix B: Probability, Stochastic Processes and Stochastic Calculus................................................................... 441
Appendix C: Bayesian Filtering.......................................................................................................................................... 485
Appendix D: Girsanov Theorem........................................................................................................................................ 491
Appendix E: Concepts from Signal and Stochastic Analyses...................................................................................... 495
Appendix F: Notes on Simulation and Some Algorithms............................................................................................. 509
Appendix G: Additional Examples.................................................................................................................................... 519
Index......................................................................................................................................................................................... 535
Preface

The stochastic filtering theory was established in the early 1940s with the pioneering work of Norbert Wiener and A. N. Kolmogorov, and then it culminated in the early 1960s with the classic and celebrated Kalman filter and soon after with the Kalman–Bucy filter. The Kalman filter and its several variants have been based on the state space modelling approach (of dynamic systems), and are still dominating the recursive-adaptive filter theory. Kalman filters have been, in recent times, applied in communications systems/networks, machine learning (in training algorithms for artificial neural networks), neuroscience, economics, finance, and political science, besides their usual and conventional applications to science and engineering technologies, especially to aerospace dynamic systems. Various extensions for nonlinear systems have also been proposed, starting from the extended Kalman filter, since the basic Kalman filter is applicable to only linear estimation problems. Hence, there are now numerous approximation filters that are applicable to nonlinear systems. Also, proper and formal nonlinear filters directly applicable to handling nonlinear filtering problems without any or with more approximations have also been formulated; however, the practical solutions still need some special assumptions and numerical methods. A very interesting alternative to stochastic filtering is the H-infinity filtering approach that is based on the H-infinity norm; it does not need any statistical assumptions on the noise processes, and here these processes are considered as generalized random processes. This approach is also discussed in the present book. Another alternative to stochastic filtering is the approach of invariant embedding; this is discussed in this book from the point of view of determination of the deterministic discrepancy or estimation of model error, both meaning essentially the same thing.

Filtering is desirable and is required in many situations in engineering: electronics systems, power systems, power grids and aerospace vehicles. Also, filtering is very essential for estimating orbital states of (artificial) satellites, in autonomous flight (estimation/control) of unmanned aircraft systems/micro (or mini) air vehicles (MAVs), and in (multisensor multitarget, MSMT) target tracking in aerospace/aviation situations. Filtering has also become an integral part of multisensory data fusion systems and solutions. The approaches to nonlinear filtering are based on the minimum mean square (MMS) – or the least squares (LS) – error principle; the latter, which does not need any probabilistic framework, can be regarded as a special case of the former. Fundamentally, certain approaches are also based on Bayesian theory, mainly the Bayes formula/rule. The classical minimum mean square error (MMSE) approach proceeds via determination of the conditional probability density function (pdf) of the system's states given the actual measurements of the dynamic system. A nonlinear estimation problem then reduces to the computation of the time evolution of a pdf and the measurement/data update of the pdf (resulting from the previous cycle). The resultant algorithms are applicable to situations with nonlinear measurements and transition equations and non-normal error terms, in addition to being applicable to the normal situations. In the present book we study two approaches to nonlinear filtering based on (1) the conditional pdf of the states given the measurement; and (2) the joint pdf of the states, their estimates, and the measurements. The latter is based on the characteristic function (of probability); often these estimators are called Pugachev's nonlinear estimators. The latter estimators are called conditionally optimal estimators, mainly because the estimator is specified by certain structural functions, which need to be specified or determined. We attempt to provide some relations between these two differing filtering concepts for nonlinear systems, specifically for linear problems.

Where appropriate we use numerical simulation examples coded in MATLAB® (MATLAB is the trademark of MathWorks Ltd., USA), and the user should have access to PC-based MATLAB software and certain toolboxes: signal processing, control system, system identification and related ones. There are a few good books on filtering; however, most are quite dated and some are not as comprehensive as desired. The treatment in the present volume is comprehensive and covers three areas of filtering: (1) linear, (2) approximation filters for nonlinear systems and (3) direct and/or exact nonlinear filters – again, these are based on two differing concepts. The approach is, where appropriate, the derivation of some important linear/nonlinear filtering algorithms, and, where feasible, a few numerical illustrative examples using MATLAB are given. The approach is not specifically of the explicit theorem-proof type, intentionally kept this way so as to reduce the heavy theoretical look of the treatment. However, several important theorems and theoretical and/or analytical results, short-named TARs, are presented in various chapters and the appendixes, and the treatment in the entire book is still formal and proper.
The end users of the soft technology of nonlinear filtering presented in this book will be systems, aero, mechanical, civil, electrical, and electronics engineers; electronics and communications, and telecommunications educational institutions; several R&D laboratories; aerospace and other industries; robotics; the transportation and automation industry; environmental sciences; and engineering and economics system studies. It is most likely that in the near future this soft technology of nonlinear filtering, along with system identification and parameter estimation methods, would make definite inroads into the emerging and evolving field of systems analytics and data science, and hence into analyses of business, commercial, environmental, and health-related data; collectively, this new and emerging field is being called data analytics or system analytics.

MATLAB® is a registered trademark of The MathWorks, Inc. For product information, please contact:

The MathWorks, Inc.
3 Apple Hill Drive
Natick, MA 01760-2098 USA
Tel: 508-647-7000
Fax: 508-647-7001
E-mail: info@mathworks.com
Web: www.mathworks.com
Acknowledgements

This book is primarily dedicated to all those who worked in the very difficult areas of stochastic processes, stochastic calculus and nonlinear estimation-cum-filtering. I (JRR) am very grateful to Dr. Ranjit C. Desai (emeritus professor, M. S. University of Baroda, Vadodara), Dr. S. Balakrishna (former head and senior scientist, FMCD, CSIR-NAL; now in the United States, and NASA awardee), the late Dr. Naresh Kumar Sinha (emeritus professor, Department of Electrical and Computer Engineering, McMaster University, Canada, and my doctoral thesis supervisor) and Dr. S. Srinathkumar (former head and senior scientist, FMCD, CSIR-NAL, and NASA awardee) for giving me several opportunities to learn several concepts and methods in the areas of system identification, parameter estimation, linear and nonlinear filtering, and target tracking and data fusion, and applying these to some practical engineering problems. I am as ever very grateful to my own family (my 91-year-old mother, daughter, son, and wife), very close relatives and friends for their support and endurance for more than four decades. Special thanks are due to my doctoral students: Drs. Girija G., VPS Naidu, Sudesh Kumar Kashyap, and C. Kamali (senior scientists, FMCD, CSIR-NAL, Bangalore); and Drs. T. V. Ramamurthy (Reva University, Bangalore), and Shobha R. Savanur (vice principal, BLDE college, Bijapur) for giving me opportunities for technical discussions on various topics related to filtering and data fusion. I am also very grateful to Dr. P. G. Madhavan (CAO and chief algorist, Syzen Analytics, United States) and Dr. Ambalal Patel (senior scientist, Aeronautical Development Agency, Bangalore) for moral and technical support for several years. I am very grateful to Jonathan Plant and his unique team at CRC Press/Taylor & Francis Group. I am especially very grateful to Nora Konopka, Amber Donley, Kathryn Everett, Jennifer Ahringer, Arlene Kopeloff, Christina M. Taranto, Kyra Lindholm, Claudia Kisielewicz, and Cynthia Klivecka for their tremendous support, quick responses, alertness, and very helpful nature during this book project and earlier ones with CRC. Jonathan Plant has been a unique editor with tremendous zeal and patience and a lot of respect for his authors; but for him I would not have been able to write five books for CRC Press. I am also very grateful to N. Shanthakumar and Basappa (of the system identification group, FMCD-NAL) for their technical support of my activities on parameter estimation for several years. I am also very grateful to Professors N. Shivashankarappa, Dr. T. Vishwanath, Parimala A., and Swetha A. (Department of Telecommunications Engineering, MS Ramaiah Institute of Technology, MSRIT), and Professor Reshma Verma and Lakshmi S. (Department of Electronics and Communications Engineering, MSRIT, Bangalore) for several technical discussions and for their help in developing certain codes.

I (GG) would like to express my deep gratitude to my husband, S. Gopalratnam, and other members of my family for having made a lot of adjustments during the writing of this book; but for their support this work would not have been possible.

We are also very grateful to Hector Mojena III, Jonathan Pennell, Todd Perry, and Adel Rosario for their efficient handling of the book manuscript in its various stages of editing and publication. We are also grateful to Mayur J. Raol and his creative team for the initial cover design of the book.

Authors

Jitendra R. Raol earned BE and ME degrees in electrical engineering from M. S. University of Baroda, Vadodara, in 1971 and 1973, respectively, and a PhD (in electrical and computer engineering) from McMaster University, Hamilton, Canada, in 1986, where he was also a research and teaching assistant. He taught for two years at the M. S. University of Baroda before joining the National Aeronautical Laboratory in 1975. At CSIR-NAL he was involved in the activities on human pilot modelling in fixed- and motion-based research flight simulators. He re-joined NAL in 1986 and retired on July 31, 2007, as a scientist G (and head of the flight mechanics and control division at CSIR-NAL). He has visited Syria, Germany, United Kingdom, Canada, China, United States, and South Africa on deputation/fellowships to work on research problems on system identification, neural networks, parameter estimation, multisensor data fusion, and robotics; to present several technical papers at several international conferences; and to deliver guest lectures at some of these places. He has given several guest lectures at many Indian colleges and universities, and at Honeywell (HTSL, Bangalore). He had also become a fellow of the IEE/IET (United Kingdom) and a senior member of the IEEE (United States). He is a life-fellow of the Aeronautical Society of India and a life member of the Systems Society of India. In 1976, he won the K. F. Antia Memorial Prize of the Institution of Engineers (India) for his research paper on nonlinear filtering. He was awarded a certificate of merit by the Institution of Engineers (India) for his paper on parameter estimation of unstable systems. He received a best poster paper award from the National Conference on Sensor Technology (New Delhi) for a paper on sensor data fusion. He has also received a gold medal and a certificate for a paper related to target tracking (from the Institute of Electronics and Telecommunications Engineers, India). He is also one of five recipients of the CSIR (Council of Scientific and Industrial Research, India) prestigious technology shield for the year 2003 for the leadership and contributions to the development of integrated flight mechanics and control technology for aerospace vehicles in the country. The shield was associated with a plaque, a certificate, and a cash prize for the project work. He has published more than 100 research papers and numerous technical reports. He has guest-edited two special issues of Sadhana (an engineering journal published by the Indian Academy of Sciences, Bangalore) on, first, advances in modelling, system identification and parameter estimation (jointly with the late Dr. Naresh Kumar Sinha) and, second, on multisource, multisensor information fusion. He has also guest-edited two special issues of the Defense Science Journal, on mobile intelligent autonomous systems (jointly with Dr. Ajith Gopal, CSIR-SA) and on aerospace avionics and allied technologies (jointly with Professor A. Ramachandran, MSRIT). He has guided six doctoral and eight master research scholars, and presently he is guiding nearly eight faculty members, formally and/or technically (of M. S. Ramaiah Institute of Technology, Bangalore), for their doctoral theses. He has co-authored an IEE/IET (London) Control Series book Modeling and Parameter Estimation of Dynamic Systems (2004) and a CRC Press (Boca Raton) book Flight Mechanics Modeling and Analysis (2009). He has also authored the CRC Press books Multi-Sensor Data Fusion with MATLAB (2010) and Data Fusion Mathematics: Theory and Practice (2015). He has edited (with Ajith Gopal) the CRC Press book Mobile Intelligent Autonomous Systems (2012). He has served as a member/chairman of numerous advisory, technical project review, and doctoral examination committees. He has also conducted sponsored research and worked on several projects from industry as well as other R&D organizations for NAL with substantial budgets. He is a reviewer of a dozen national and international journals. His main research interests have been and are data fusion, system identification, state/parameter estimation, flight mechanics/flight data analysis, H-infinity filtering, nonlinear filtering, artificial neural networks, fuzzy logic systems, genetic algorithms, and soft technologies for robotics. He has also authored some books as the collection of his 300 (poems and free-) verses on various facets closely related to science, philosophy, evolution, and life itself, in search of the true meaning of human life on this planet. His new area of study and research is data systems analytics (DaSyA).

Girija Gopalratnam earned her PhD degree from Bangalore University, Karnataka, India, in 1996 for her thesis titled 'Integrated Modelling and Parameter Estimation for Flight Data Analysis'. A gold medalist from Bangalore University, she holds a master's degree and a BSc (Hons.) in physics. She worked at the National Aerospace Laboratories (NAL) from 1976 and retired as chief scientist and head of the Flight Mechanics and Control Division. She also served as adviser to management and administration (M&A) for one year before her retirement on March 31, 2014. She is the recipient of the NAL outstanding performance award for research, for the theory of parameter estimation for inherently unstable/augmented fly-by-wire aircraft/control systems, in the year 1996. She has led teams that have received the NAL outstanding performance awards for design, development, and project execution in the areas of parameter estimation and multisensor data fusion. She has over 75 research publications in peer-reviewed journals and conferences. She is a co-author of the IEE/IET book Modeling and Parameter Estimation of Dynamic Systems. She has delivered a number of invited lectures and presented papers at national and international conferences within India and abroad. She was a guest scientist at the DLR Institute for Flight Mechanics, Braunschweig, Germany, March to November 1990 and May to October 2004. She was selected to participate in the Science Technology & Innovation Policy (STIP) Executive Education Program at the John F. Kennedy School of Government at Harvard University (Cambridge, Massachusetts), sponsored by the Indo-US Science & Technology Forum (IUSSTF), in 2008. Her R&D interests include multisensor data fusion and tracking, modelling and parameter estimation of dynamical systems, fusion for enhanced synthetic vision, Bayesian networks, and fuzzy logic.

Bhekisipho Twala is a research scientist with over 20 years' experience in putting mathematics to scientific use in the form of data comparison, inference, analysis, and presentation to design, collect, and interpret data experiments surrounding the fields of transport, medicine, artificial intelligence, software engineering, robotics, and most recently electrical and electronic science engineering. He is a professor of artificial intelligence and statistical sciences, and head of the Department of Electrical and Electronic Engineering Science at the University of Johannesburg, South Africa. Previously, he worked as a principal research scientist in the unit of Modelling and Digital Sciences of the Council for Scientific and Industrial Research, Pretoria. He had also worked as a manager at the Methodology and Standards Division, South Africa. He was also a research fellow at the Brunel Software Engineering Research Group, Brunel University, and at Bournemouth University. He was a mathematical modeller–consultant at Imperial College London, and a chief technical officer at the Ministry of Public Works and Transport. He has numerous publications in workshops, conferences, and journals.
Introduction

Estimation-cum-filtering is desirable and required in many situations in engineering (e.g. electronics systems, power systems/power grids) and for aerospace vehicles. For example, radio communication signals are corrupted with noise, and a good filtering algorithm can remove the noise from, or drastically reduce the effect of noise in, such electromagnetic signals while retaining the useful information. Also, the uninterruptible power supply device filters the line voltage in order to smooth out undesirable fluctuations that might otherwise shorten the lifespan of electrical devices such as computers, printers, and TVs. Filtering has now become very essential for estimating states of satellites and for target tracking, including multisensory data fusion, in aerospace and aviation applications. We study the filtering problem mainly from the point of view of determination and estimation of the states of a dynamic system using its state space mathematical models in the filter structure and noisy measurement data (empirical data), in optimal, suboptimal, and robust frameworks of the underlying optimization structure. So, this is a model-based approach. Much of the filtering theory, and especially the nonlinear filtering theory, is based on theory and results from stochastic (random) processes and stochastic calculus. Stochastic filtering theory was established in the early 1940s with the pioneering work by Norbert Wiener and A. N. Kolmogorov, and it was advanced in 1960 with the publication of the classic Kalman filter (KF) and the Kalman–Bucy filter in 1961. Also, some earlier works on filtering theory by Bode and Shannon, Zadeh and Ragazzini, Swerling, and Levinson were equally important. Subsequently, the linear filtering theory in H-2 space (Kalman filter, innovations approach) and in Krein space (H-infinity domain) was further enriched and strengthened by Thomas Kailath (and his team at Stanford University). The Kalman filter and its numerous variants, being based on the state space modelling approach, have dominated the recursive-adaptive filter theory for decades in the signal processing and control areas. In recent times, Kalman filters have been applied in communications, machine learning, neuroscience, economics, finance, political science, and many other fields, besides their usual engineering applications. Because the Kalman filter is basically applicable to linear estimation/filtering problems, various extensions to nonlinear systems have been proposed, starting from the extended Kalman filter. As a result, there are now several approximation filters that are applicable to nonlinear systems. Also, nonlinear filters that are directly applicable to handle nonlinear systems without any approximations (under special circumstances) or with one or more approximations are available.

State space domain-based estimation and filtering play an important role in the field of nonlinear multivariable stochastic control. The idea is, given noisy measurements of the state of a nonlinear dynamic system, the filtering problem consists in estimating as precisely as possible the system's state to obtain good estimates of the time histories of these states (for example, in target tracking these states are position, velocity, acceleration, and certain angular states of the moving target). The approaches to nonlinear filtering are based on the minimum mean square (MMS) or the least squares (LS) of errors (in states/measurements) methods. Also, some approaches are based on Bayesian theory. The LS method considers state estimation as an output (error) LS problem and is generally non-recursive; however, recursive approaches are available, like recursive least squares (RLS), a special case of the Kalman filter. The classical MMS error approach proceeds via determination of the conditional probability density function (cpdf) for the system's state given the measurements. A nonlinear estimation problem then reduces to the computation of the time evolution of a pdf (which, of course, incorporates the measurement-data update part also). The algorithms based on the pdf are applicable to nonlinear measurements and transition equations and non-normal error terms, in addition to being applicable to the normal situations.
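To make this pdf viewpoint concrete, consider the following minimal MATLAB sketch (our own illustration here, not one of the book's codes; the scalar random-walk model, the grid, and all numerical values are assumed), which carries a gridded conditional pdf through one time-evolution step and one measurement/data update via the Bayes rule:

% One cycle of a grid-based Bayesian filter for an assumed scalar system:
% x(k+1) = x(k) + w(k), w ~ N(0,Q); z(k) = x(k) + v(k), v ~ N(0,R).
g  = @(u,m,s2) exp(-(u-m).^2./(2*s2))./sqrt(2*pi*s2);  % Gaussian pdf
x  = linspace(-10,10,401); dx = x(2)-x(1);             % state grid
Q  = 0.5; R = 1.0; z = 1.2;                            % assumed statistics and datum
p  = g(x,0,4);                                         % prior pdf of the state
pp = conv(p, g(x,0,Q), 'same')*dx;                     % time evolution of the pdf
pu = pp.*g(z-x,0,R); pu = pu/(sum(pu)*dx);             % measurement update (Bayes rule)
xhat = sum(x.*pu)*dx;                                  % MMSE estimate: conditional mean

The two lines marked time evolution and measurement update are exactly the two stages of the pdf cycle referred to above; the nonlinear filters treated later are elaborations of this cycle.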

Overview

We study in the present book two approaches to nonlinear filtering based on (1) the conditional pdf of the states given the measurement and (2) the joint pdf of the states, their estimates and the measurements. The main style will be derivations of several (and not all) linear and nonlinear filtering algorithms, and implementations of some important ones for illustration purposes using the MATLAB® framework, for which, where appropriate, we use numerical simulation examples coded in MATLAB (MATLAB is the trademark of MathWorks Ltd., USA). Some codes written by the authors (and their colleagues) will be available to the readers of the book at https://www.crcpress.com/Nonlinear-Filtering-Concepts-and-Engineering-Applications/Raol-Gopalratnam-Twala/p/book/9781498745178. The user should have access to PC-based MATLAB software and certain toolboxes: signal processing, control system, system identification, neural networks and fuzzy logic, and related ones. Although the codes are written carefully (by us), perfection is not claimed (there might be feasibility to improve the codes, make them more optimal, more correct/more efficient, less sensitive to some tuning parameters, and make them more robust; all that is left to the users). These codes will produce the results given in the illustrative examples of the respective chapters and appendixes of the book. Readers should use their own discretion in utilizing these illustrative algorithms, programs, and software for solving their own practical problems, and the authors and the publisher of the book do not take any responsibility for any unsuccessful application of the said algorithms/programs/software. There are some good books on filtering [1–12]; however, most are dated and some are not as comprehensive as desired. The treatment in the present book is comprehensive in that it covers three areas of filtering: (1) linear and factorization-based methods, (2) approximation filters for nonlinear systems and (3) direct and/or exact nonlinear filters, with some appropriate, where feasible, numerical illustrative examples using MATLAB. The end users of the nonlinear filtering soft technology presented in this book will be systems, aero, mechanical, civil, electrical, and electronics engineers; electronics and communications educational institutions; several R&D laboratories; aerospace and other industries; the transportation/automation industry; environmental sciences/engineering; robotics; and economics system studies. Also, in the near future this soft technology will find definite applications in the emerging and evolving field of systems analytics, or data science, and thereby in the business/commercial and environmental signal/data processing arena. Next, a brief description of the chapters is given.

In Section I, in Chapter 1 we describe mathematical modelling aspects for linear and nonlinear systems. Also, signal and system norms are discussed. The latter play a very important role in optimization frameworks for obtaining filtering algorithms, in a stochastic or deterministic domain. Chapter 2 discusses the fundamental aspects of recursive filtering, and then dwells on the celebrated Kalman filter and its linear variants; a minimal KF cycle is sketched below for reference. Also, the filter error method that incorporates the KF in the structure of the maximum likelihood (ML)/output error method (OEM) to handle process noise is developed. We also discuss information filters and smoothers for linear systems. These information filters are directly applicable to multisensory data fusion problems wherein target tracking and fusion aspects are involved. The reason is that the fusion in the information domain is straightforward compared to that in the covariance domain, that is, using the KF as a data fuser. We also present some linear filters to handle delayed states/measurements, missing measurement data, or both. Such a requirement is encountered in wireless sensor networks (WSN). In Chapter 3 we study several H-infinity filtering algorithms that are based on the H-infinity norm. These are alternative approaches to (linear) filtering problems that are conventionally based on the stochastic process theory. We also discuss the mixed estimation that can combine the KF and the H-infinity filter for certain specific problem situations. In Chapter 4 we discuss some adaptive filtering approaches that are useful for proper tuning of the basic KF. These approaches are also applicable to the extended KF and its many variants. Approaches based on covariance matching, fuzzy logic, neural networks, and H-infinity frameworks are discussed for adaptive tuning.

In Section II, Chapter 5 discusses some factorization filtering methods that are useful for retaining efficiency and accuracy when the (original) filtering algorithms are implemented on finite word-length computing machines. These will be useful for embedded systems, for on-board applications of filtering algorithms for control and estimation, and for implementation of very large-scale aerospace estimation problems. In Chapter 6 we study several extensions of the KF to nonlinear systems. We also study several other approximation filtering algorithms that are used for solving nonlinear filtering problems, which are dealt with formally in Section III. The extended information filters and the extended H-infinity filters are also discussed. Chapter 7 discusses the concept of model error and presents several estimators that are set in the deterministic domain (using the least squares criterion) and are used for direct estimation of model error, or the so-called deterministic discrepancy, in state space nonlinear dynamic systems. The idea is that if the proper nonlinear model of the system is not available, then in two steps one can reasonably determine the accurate model from the real data by this approach, which is called the invariant embedding procedure. We present generalized and semi-robust adaptive estimators for model errors for discrete and continuous time systems; these estimators are based on the H-infinity norm and the solutions of two-point boundary value problems using the method of invariant embedding.
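As a concrete reference point for the linear theory that Sections I and II build on, a minimal Kalman filter time/measurement update for a scalar system might look as follows in MATLAB (a sketch under assumed values of F, H, Q, R and assumed data; it is not one of the book's codes):

% Minimal linear Kalman filter cycle for a scalar random-walk state.
F = 1; H = 1; Q = 0.01; R = 1;        % assumed model and noise covariances
xh = 0; P = 10;                       % initial state estimate and covariance
z  = [1.1 0.9 1.3 1.0];               % assumed measurement data
for k = 1:numel(z)
    xh = F*xh;  P = F*P*F' + Q;       % time update (prediction)
    K  = P*H'/(H*P*H' + R);           % Kalman gain
    xh = xh + K*(z(k) - H*xh);        % measurement update (correction)
    P  = (1 - K*H)*P;                 % covariance update
end

The adaptive schemes of Chapter 4 essentially tune Q and R in such a cycle, and the factorization methods of Chapter 5 re-arrange the covariance update for numerical reliability.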
In Section III, in Chapter 8, we present proper and formal methods of filtering for nonlinear dynamic systems. The background on stochastic differential and difference equations, and on stochastic analysis and stochastic calculus, that is needed in developing and understanding these filtering methods is presented mainly in Appendix B, with other supporting theories and analytical results presented in Appendixes C to F. This material will also be useful in understanding the developments of many previous chapters of the book. In Chapter 8, we basically study several nonlinear filtering approaches based on the conditional probability density function (pdf) of the states given the measurements in the (minimum) mean-square setting for continuous and discrete time systems. Chapter 9 discusses the nonlinear estimators and filtering approaches based on the concept of the characteristic function (of the underlying joint pdf). We obtain the conditionally optimal estimators and several special cases that are similar to some of the existing linear KFs and some nonlinear filters that are discussed in Chapter 8. The similarities and contradistinctions are presented, perhaps for the first time in a book on nonlinear filtering. These nonlinear estimators/filters are based on the concept of the joint probability density function (jpdf) of the states, their estimates, and the measurements. We also study some finite dimensional and robust filtering algorithms here. In Chapter 10 we discuss several approximation approaches, including particle filtering and Monte Carlo simulations, for implementation of the nonlinear (and approximation) filters presented in the preceding chapters. In Chapter 11 we discuss the problem of direct parameter estimation in nonlinear dynamic systems and present nonlinear least squares, Gaussian least squares differential correction, maximum likelihood-output error, and estimation-before-modelling methods. In Chapter 12 we discuss several approaches to nonlinear observers, starting with the linear observer. We understand the observer in the sense that the Kalman filter is the best linear optimal observer.

Where appropriate, we present important theories, algorithms, and some application results for certain engineering examples in the appendixes of certain chapters. Finally, in Section IV, the main Appendixes A to F, we give some special concepts and special formulations/equations applicable to linear and nonlinear filtering and estimation, with some important concepts/theorems/analytical results used in some of the previous chapters, where appropriate; throughout the book such results are termed TARs (theorem or theoretical and analytical results). Certain exercises and their solutions are mainly from the open literature that is cited in the various Sections I to IV, and in the appendixes at the end of the chapters. Due to the types, forms, and variability of these exercises, it has not been feasible to cite individual references for some of the exercises. The solution manual will be available from the publisher for the instructors. Chapters 1 to 4, and Appendixes 1C, 2E, 3A, 3B, 4A, are contributed by the second author. All the remaining chapters and appendixes are contributed by the first author. However, since the book is written in a shared manner (with the authors corresponding only via the Internet, working on different schedules, and concurrently), the disposition of the material could not be made sequential, even though enough care has been taken to have a uniform treatment; we have tried to be as cohesive as feasible, but we do not claim any perfection.

Prelude to Nonlinear Filtering

The nonlinear filtering theory is mainly based on theories of stochastic processes, stochastic calculus, and stochastic differential/difference equations, which in turn are based on probability theory (and statistics); the latter is considered as an art (and science/mathematics!) of calculating probabilities (Appendix B) [13]. The stochastic processes are understood in the (restricted) sense of random evolutions (and fluctuations; often the word volatility is used) with respect to time. Often measure theory is adopted as the foundation of probability theory. However, the continuous-time processes pose difficult measure theory-related issues [13]: if a particle is subject to a random evolution, to show that its trajectory is continuous, or bounded, requires that all time instances be considered. But the classical measure theory can only handle a countable infinity of time values. Interestingly, the probability theory depends on measure theory, and it also requires more of the measure theory than other analyses. Kolmogorov's mathematical model represents real-world events by the elements of the sigma algebra F of a probability space (Ω,F,P); here, the set Ω can be imagined as a very big urn/vessel from which we can pull out a small ball/set ω; then the elements of F describe the various questions that one can ask about ω. A single random draw is 'revealed' progressively, and time t (discrete or continuous) is introduced in the form of an increasing family F_t of sigma algebras; thus the sigma algebra F_t represents 'what is known of ω up to time t'. Then we call T the moment where, for the first time, the random evolution shows a certain property. It is a random quantity such that, to know if T = t, there is no need to look at the evolution beyond t. In mathematical language, the event T = t belongs to F_t; analogously, we can say that to know if there was a fire in January 2015, there is no need to wait until the month of April. Also, to know if the fire occurred in November, we need to know that a fire occurred in November, and also that no fire occurred in December. We call these 'non-anticipatory' random times stopping times.
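The stopping-time idea just described is easy to exhibit numerically. In the following MATLAB sketch (our illustration; the walk and the level a are assumed), the first exit time of a random walk from an interval is computed, and deciding whether T = t plainly needs only the path up to time t:

% First time a random walk leaves (-a,a): a stopping time, since the
% event {T = t} depends only on the path X(1),...,X(t), never on the future.
rng(1); N = 1000; a = 5;
X = cumsum(randn(1,N));            % a discrete random evolution
T = find(abs(X) >= a, 1, 'first'); % first exit time (empty if no exit by N)

By contrast, 'the last time the walk visits (-a,a)' is not a stopping time: deciding it requires looking at the future of the path, just as for the fire example above.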
The study of probability mostly involves the study of independence: (a) sums of independent random variables (IRVs) and (b) corresponding limit (probability) distributions. The simplest type of random evolution is Markov-dependency, or the Markov-dependency property (MDP): if all useful information is included in (a complete) knowledge of the current state of the event (and if this is known), the knowledge of previous states does not add any more information about the accuracy of the prediction. Most examples of random evolution in nature have MDP, or become MDP by a suitable interpretation of the words 'current state' and 'complete knowledge'.

The theory of Markov processes divides into sub-theories, depending on whether (a) time is discrete or continuous, or (b) the set of possible states is finite, or countably infinite (Markov chains), or continuous. The classical notion of sums of IRVs can be generalized into a branch of Markov process theory where a group structure replaces addition, called (a) random walks in discrete time and (b) processes with independent increments (in continuous time), of which the very famous Brownian motion (often called the Wiener process) is an example. From a probabilistic point of view, a Markov process is determined by its initial law and its transition function (probability) P(x, A). This gives, if we observed the process in state x at time s, the probability that we find it at a later time t in a set A (if we exclude the case of chains, the probability of finding it exactly in a given state x is null in general). The transition function is an analytical object; when it is stationary (it only depends on the difference (t − s)), we obtain a function P_{s,t}(x, A) to which the analytical theory of semi-groups applies. The main aspect of these processes is their long-term evolution; the evolution of animal or human populations can be described by Markov models with three types of limiting behaviour: extinction, equilibrium, or explosion. The study of these equilibrium states, where a stationarity argument is applicable, is related to statistical mechanics. Continuous-time Markov processes and finite state space Markov chains represent a model of perfectly regular random evolution, which stays in a state for a certain period of time (with known law), then jumps into another state drawn at random according to a known law, and so on and so forth indefinitely. But, when the number of states becomes infinite, extraordinary phenomena can happen: it could be that the jumps accumulate in a finite period of time (and then the process becomes indescribably complicated); even worse, it could be that from the start each state is occupied according to a fractal set. The other important area of Markov process theory is diffusion theory. In contrast to Markov chains (which, in simple cases, progress only by jumps separated by an interval of constant length), the diffusions are Markov processes (real, or with values in spaces of real numbers, or on a manifold) whose trajectories are continuous. We know from Kolmogorov that the transition function is a solution to a parabolic partial differential equation (PDE), the Fokker–Planck equation (i.e. the forward or backward Kolmogorov equation, FPKFE). The ideas of increasing families of sigma algebras (stopping times) made it possible to give a precise meaning to what we call the strong Markov property. Given a Markov process whose transition function is known (and stationary), the process considered from a random time T is again a Markov process with the same transition function, provided T is a stopping time.
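The statement that the transition function of a diffusion solves the Fokker–Planck equation can be checked numerically in a simple case. In the MATLAB sketch below (our illustration; the pure-diffusion model dX = dB and all numerical values are assumed), an ensemble of Euler–Maruyama trajectories starting at x0 is compared with the Gaussian density N(x0, t) that solves the heat-type Fokker–Planck equation for this model:

% Transition density of dX = dB started at x0 is empirically N(x0, t),
% the solution of the corresponding Fokker-Planck (heat) equation.
rng(2); M = 1e5; x0 = 1; t = 2; nst = 200; dt = t/nst;
X = x0*ones(M,1);
for k = 1:nst
    X = X + sqrt(dt)*randn(M,1);          % Euler-Maruyama step
end
[c,e] = histcounts(X, 60, 'Normalization', 'pdf');
xc  = 0.5*(e(1:end-1) + e(2:end));        % bin centres
pth = exp(-(xc-x0).^2/(2*t))/sqrt(2*pi*t);% Fokker-Planck solution at time t
err = max(abs(c - pth));                  % small for large M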
There is some connection between classical potential theory (Appendix E) and continuous time martingale theory (Appendix B). The idea is the link between the solution of Dirichlet's problem (Appendix E) in an open set, and the behaviour of Brownian motion starting from a point x of this open set; the first moment (meaning first time) when a trajectory ω of Brownian motion meets the boundary depends on ω; hence, it is a random variable. Call it T(ω); and let X(ω) be the position of the trajectory at that (time) moment. Then it is a point on the boundary, so if f is a boundary function, f(X) is a random quantity whose expected value depends on the initial point x. Call it then F(x); this function on the open set solves Dirichlet's problem on the open set with boundary condition f. There is a link between the harmonic and super-harmonic functions of potential theory, and martingale theory: if a harmonic (or super-harmonic) function is composed with Brownian motion, then we obtain a martingale (or super-martingale) with continuous trajectories. If we emphasize this continuity, super-harmonic functions are not in general continuous functions, but Brownian trajectories do not see their irregularities.

A good example of martingales is the capital of a player during a fair game: on average, this capital stays constant; however, in detail it can fluctuate considerably; significant, and yet rare, gains can compensate for accumulations of small losses (or conversely). The notion of super-martingale corresponds to an unfavorable game (the 'super' expresses the point of view of the casino). In continuous time, Brownian motion (the mathematical model describing the motion of a pollen particle in water seen under a microscope) is a pure fluctuation. On average, the particle does not move: hence, the two-dimensional Brownian motion is a martingale; however, if we add a third, vertical dimension, it loses the martingale property, because the particle will tend to go down if it is denser than water (this vertical component is a super-martingale), and go up otherwise. It is also clear that the results obtained for Brownian motion can be extended to more general Markov processes. Interestingly, the main role is played by a pair of transition semi-groups that are dual with respect to a measure; in classical potential theory, the Brownian semi-group is its own dual with respect to Lebesgue measure (Appendix E). Then, we can build a much richer potential theory; however, provisionally, duality remains devoid of probabilistic interpretation. In two dimensions, Brownian motion is said to be recurrent. Its trajectories (instead of tending to infinity) come back (infinitely often) to an arbitrary neighbourhood of any point of the plane, giving rise to (the special theory of) logarithmic potential. There exists a class of Markov processes of the same kind, whose study is related rather to the ergodic theory; also, there is a link between probability, harmonic analysis, and group theory (discrete groups and Lie groups). Ergodicity signifies that the properties of a (stochastic) process remain the same whether these are evaluated across a given ensemble of the process at a certain time or are evaluated for any one member of the ensemble along the entire time history of that member, that is, one realization.

In the study of diffusions – the time evolution of random processes – the main concern is the structure of diffusions in several dimensions, and in particular the possible behaviour (at the boundary of an open set) of a diffusion whose infinitesimal generator is known in the interior. For example, choose a problem, then find all strongly Markov processes with continuous trajectories on the positive closed half-line, which are Brownian motions in the open half-line. However, the problem in several dimensions is much more difficult. The idea is that the diffusion is formed from an interior process, describing the first trip to the boundary, then the subsequent excursions starting and ending on the boundary; an infinite number of small excursions happen in a finite amount of time, and we must manage to describe them and join them back together. It is natural that martingales should be applied to Markov processes; conversely, methods developed for Markov processes have an impact on the theory of martingales.

A stochastic process X is thought of as a function of two variables X(t,ω) or X_t(ω). Here, ω is 'chance', a parameter drawn randomly from a giant 'urn/vessel' Ω, and the trajectories of the processes are functions of time t → X_t(ω). However, in general they are irregular functions, and we cannot define, by the methods of analysis, an integral ∫_0^t f(s) dX_s(ω) (for reasonable functions of time), which, in fact, would be the limit of Riemann sums (Appendix E) on the interval (0,t): Σ_k f(s(k))(X(k + 1) − X(k)) (here, the suffix t is avoided), where s(k) would be an arbitrary point in the interval (k, k + 1). However, this is much less feasible if the function f(s,ω) itself depends on chance. However, Ito had studied the case where X is a Brownian motion, and f a process such that at each instant t, f(t,ω) does not depend on the behaviour of the Brownian motion after the instant t, and where s(k) is the left endpoint of the interval (k, k + 1). In this case, we can show that the Riemann sums converge (not for each ω, but as RVs on Ω) to a quantity that is called the (Ito) stochastic integral, with all the properties desired for an integral. This could seem artificial, but the discrete analog shows that it is not so; the sums considered in this case are of the form

S(n) = Σ_{k=1}^{n} f(k)(X(k + 1) − X(k)).

Set X(k + 1) − X(k) = x(k), and think of S(n) as the capital (positive/gain or negative/loss) of a gambler passing his time in a casino, just after the nth game. In this case, f(k) represents the stake, whereas x(k) is a normalized quantity representing the gain of a gambler (who stakes $1 at the kth game); and that f(k) only depends on the past then signifies that the gambler is not a psychic. Using the language of financial mathematics, we see that the normalized quantities X_t represent prices of stocks, for example, and we know this is how Brownian motion made its appearance in mathematics. A question of great practical importance involving the stochastic integral is the modelling of the noise that disturbs the (time) evolution of a (mechanical or any feasible) dynamic system. The only one of these aspects that has a properly mathematical importance is the Stratonovich integral (SSI). The SSI possesses the remarkable property of being the limit of deterministic integrals when we approach Brownian motion by differentiable curves. Ito's most important contribution (not to have defined stochastic integrals, since Wiener had already prepared the way, but to have developed their calculus) was the famous Ito formula, which expresses how this integral differs from the ordinary integral. He also used it to develop a very complete theory of stochastic differential equations. It is already known that Ito's stochastic integral theory is not essentially tied to Brownian motion, but could be extended to some square-integrable martingales. This theory would very quickly extend to martingales that are not necessarily square-integrable: on one hand by means of the notion of a local martingale, which leads to the final notion of a semi-martingale, and on the other hand by means of new martingale inequalities. From a concrete point of view, a semi-martingale is a process obtained by superposing (a) a signal, a process with regular trajectories (of bounded variation) satisfying the technical condition of being predictable, and (b) a noise (i.e. a meaningless process/pure fluctuation, unwanted stuff), modelled by a local martingale. The decomposition theorem says that (under minimal integrability conditions on the absence of very big jumps), the decomposition of the process into the sum of a signal and a noise is unique. Knowing the law of probability we can filter out the noise and recover the signal in a unique manner. This reading (out)/recovering of the signal depends not only on the process, but also on the underlying filtration/estimation/filtering method that represents the knowledge of the observer. The fundamental properties of Ito's stochastic integral (ITSI) can be extended to all semi-martingales, and most of all one can develop a unified theory of stochastic differential equations with regard to semi-martingales. The study of stability (with respect to all parameters at the same time) was also carried out, and we can equally extend to these general equations a big part of the theory of stochastic flows; hence, the theory of stochastic differential equations ends up being in complete parallelism with that of ordinary differential equations.
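The discrete gambling sum above is easy to experiment with. The following MATLAB sketch (our illustration; the particular stake rule is an assumed example) builds S(n) = Σ f(k)(X(k + 1) − X(k)) with a stake f(k) that depends only on the past, and a Monte Carlo average shows the fair-game (martingale) property E[S(n)] = 0:

% Gambler's capital S(n) = sum_k f(k)*(X(k+1)-X(k)) over a fair game:
% with a non-anticipating stake f, the mean capital stays at zero.
rng(3); M = 2e4; N = 100; S = zeros(M,1);
for m = 1:M
    x = randn(1,N);                 % fair-game gains x(k) = X(k+1)-X(k)
    f = [1, 1 + abs(x(1:N-1))];     % stake f(k) uses only games 1..k-1
    S(m) = sum(f.*x);               % capital after the Nth game
end
meanS = mean(S);                    % close to 0: no psychic advantage

Replacing the stake by one that peeks at x(k) itself (e.g. f = sign(x)) destroys the martingale property, which is precisely why non-anticipation matters in the Ito construction.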
case: the distinction between uniqueness of trajectories or all the data are contaminated with random noise.
and uniqueness in law. The possibility of bringing sev- Sometimes, the information about certain unknown but
eral distinct driving semi-martingales (in other words constant (or slowly varying) parameters of this dynamic
several different ‘times’) into a stochastic differential system (in fact the model parameters/coefficients)
equation (of several dimensions) makes them resemble should also be obtained and hence, the problem of state
equations with total differentials more than ordinary and parameter estimation has developed into a distinct
differential equations, with appropriate geometric con- discipline called the estimation theory. In general, many
siderations (properties of Lie algebras). Ito’s integral dynamic real-life systems are described by stochastic
is not a ‘true’ integral, trajectory by trajectory, but it is differential and/or difference (liner or nonlinear) equa-
one in the sense of vector measures. The class of semi- tions. Most of the real-life systems are nonlinear and
martingales is indeed a class of processes large enough distributed parameter systems. Due to the simplicity of
to contain most of the usual processes, and possesses the analysis and design, the linear estimation algorithms
very good properties of stability. If we replace a law on have been greatly developed and widely used. However,
the space Ω by an equivalent law without changing the full-scale nonlinear estimators/filters are required to
filtration, the semi-martingales for the two laws are the handle the problems that are highly non­linear and where
same, however, their decompositions into signal plus the noise processes are non-Gaussian and for joint state
noise change. Indeed, statistics seeks to determine the and parameter estimation. Interestingly enough, certain
law of a random phenomenon from observations, and astronomical studies have provided a great impetus
this law is not known a priori, and a search for proper- to the development of the estimation theory. The earli-
ties of processes that are invariant under changes in the est approaches to the characterization of the estimation
law is very important. problems in connection with the determination of the
The set of topics – martingale inequalities, the general theory, the stochastic integral, and enlargement – constitutes what is called stochastic calculus. Yet it carries more branches: (a) the use of martingale methods to deal with problems of narrow convergence of process laws, (b) the generalization of martingale convergence theorems, leading to some form of asymptotic martingales (or 'amarts') in discrete or continuous time, (c) the extension of known results on martingales to certain multidimensional-time processes, and (d) prediction theory, which shows the tight links uniting the most general possible theory of processes with Markov processes. The universality of martingales extends to the notion of martingale problems in diffusion theory, since used in many other areas, such as point processes. The idea is to characterize the law of a stochastic process by a family of processes that we require to be martingales (eventually local martingales). In the case of diffusions (more generally, Markov processes), these processes are constructed in a simple manner from the infinitesimal generator. What is unknown in the martingale problem is a probability law, for which we must discuss existence and uniqueness – and for existence, it is quite natural to use a method of narrow convergence. Further tools are narrow compactness criteria using 'local characteristics' of semi-martingales, and the problem of constructing all martingales from a given family of martingales by using stochastic integrals.
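The construction from the infinitesimal generator mentioned above can be made concrete; the following is the standard (Stroock–Varadhan) formulation, given here only as an illustration. For a one-dimensional diffusion with generator

    (Lf)(x) = b(x) f'(x) + \frac{1}{2} a(x) f''(x),

a probability law P on path space solves the martingale problem for L if, for every smooth test function f,

    M_t^f = f(X_t) - f(X_0) - \int_0^t (Lf)(X_s)\, ds

is a P-martingale; the existence and uniqueness questions for such a P are exactly those referred to in the text.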
Now, we give an overview of the state estimation (-cum-filtering) problem [14]. State estimation is the process of obtaining as much accurate information as possible (basically the time history of the states) regarding the internal states of a given dynamic system using the system's input/output data, wherein usually some or all of the data are contaminated with random noise. Sometimes, information about certain unknown but constant (or slowly varying) parameters of this dynamic system (in fact, the model parameters/coefficients) should also be obtained; hence, the problem of state and parameter estimation has developed into a distinct discipline called estimation theory. In general, many dynamic real-life systems are described by stochastic differential and/or difference (linear or nonlinear) equations. Most real-life systems are nonlinear and distributed-parameter systems. Owing to the simplicity of their analysis and design, linear estimation algorithms have been greatly developed and widely used. However, full-scale nonlinear estimators/filters are required to handle problems that are highly nonlinear, problems where the noise processes are non-Gaussian, and joint state and parameter estimation.

Interestingly enough, certain astronomical studies provided a great impetus to the development of estimation theory. The earliest approaches to the characterization of estimation problems, in connection with the determination of the relevant parameters that describe the motion of celestial bodies, were due to the Babylonian astronomers (300 B.C.), Roger Cotes in 1722, Euler in 1749, Bernoulli in 1777, Gauss in 1795, and Legendre in 1806 [15]. Some important results of estimation theory, the method of moments due to Pearson in 1894 and the method of maximum likelihood due to Fisher in 1911 (to 1925), have provided a unified framework and stimulus to the development of the theory of estimation [15]. Much of this work was concerned with the problem of estimating parameters from measured data. The work related to the minimization of various functions of the errors is attributed to Galileo Galilei in 1632 [16]. Studies of least-squares estimation in stochastic processes were made by Kolmogorov in 1939 (to 1941), Krein in 1945, and Wiener in 1942 (to 1949). Carlton and Follin (1956) used a recursive approach that provided a stimulus to the work of Bucy and laid the basis for the development of recursive filters [17].

It was soon realized that linear filtering algorithms may exhibit the so-called divergence phenomenon due to any of the following reasons [18]: (1) the a priori statistics necessary for tuning the Kalman filtering algorithm are chosen incorrectly, (2) there are un-modelled parameters, (3) a linear system model is used even though there might be a significant effect of nonlinear parameters on the data used in the estimation/filtering algorithm, and (4) the filtering algorithm is implemented on a finite (small) word-length digital computer. In order to compensate for model inadequacies when nonlinear systems are approximated by linear models, and to obtain information about the unknown a priori statistics of the noise processes, adaptive filtering algorithms have been developed. Factorization-based techniques/algorithms, which provide accurate and reliable estimates in spite of finite-word-length implementations of the linear algorithms, have also been developed.
Application of the linear filtering algorithms to nonlinear estimation problems has been through the extension and linearization of the nonlinear equations; the resulting algorithms are the linearized and extended Kalman filters. These and related variants of the Kalman filter algorithm were developed in order to compensate for the linear model inadequacies.
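Concretely, for a nonlinear model x(k+1) = f(x(k)) + w(k) with measurements z(k) = h(x(k)) + v(k), the extended KF re-uses the linear recursions with the Jacobians evaluated along the current estimate (the standard form, shown only for orientation):

    F_k = \partial f / \partial x \,\big|_{x = \hat{x}(k)}, \qquad H_k = \partial h / \partial x \,\big|_{x = \hat{x}(k)},

and it is this repeated local linearization that can misbehave, as noted next.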
It must be noted here that all these algorithms and techniques, most suitable for linear and linearized filtering problems, do not always give satisfactory performance when applied to actual nonlinear systems' data. In many cases the linearization is ill-conditioned, and hence the extended versions of the basic Kalman filtering algorithm are not useful. Proper estimation procedures for nonlinear systems have been developed based on the Kushner (–Stratonovich) equation, which describes the evolution of the conditional probability density function (cpdf) of the states, given the measurements, for continuous-time state estimation with continuously available measurements. For discrete-time systems, the description of the conditional probability density is based on Kolmogorov's forward equation (KFE), also often called the Fokker–Planck equation.
These results do not directly yield practically implementable algorithms, because the conditional pdf for nonlinear systems in general requires an infinite number of parameters for its representation (unlike the Gaussian pdf, which requires only two parameters, the mean and the covariance of the stochastic process involved). As a result, implementable algorithms were obtained by various approximations: (1) the truncated second-order filter (whereas the extended KF is a first-order filter), (2) the Gaussian second-order filter, (3) the assumed density filter, which does not involve a Taylor series approximation, and (4) cumulant truncation and statistical linearization, among many other algorithms. Also, the maximum a posteriori (MAP) estimate is obtained as the conditional mode. Alternative nonlinear estimation results have been developed based on the principle of invariant embedding (for systems with no process noise, often called deterministic estimation problems) and on stochastic approximations. The innovations process-based approach has been used to generate estimation results for linear as well as nonlinear systems; these have provided considerable insight into general filtering problems.

The estimation/filtering solutions developed for linear problems have very limited utility for highly nonlinear systems, and the nonlinear methods mentioned so far require either approximations of the cpdf and/or the differentiation of the nonlinear functions of the system equations with respect to the current state or a nominal/pre-specified time history of the state. Some further development was in the area of the so-called unscented KF, or derivative-free KF, for which linearization of the nonlinear system equations is not required. Instead, attention is shifted to the probability model, for which so-called sigma points are computed; that is, the probability model itself is approximated, as sketched below.
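A brief sketch of the sigma-point (unscented transformation) construction just described is given below in MATLAB-style code; the nonlinearity, prior moments, and scaling parameter kappa are assumed for illustration, following the commonly published form of the transformation rather than any listing from this book.

    % Unscented transformation sketch: propagate the mean xm and
    % covariance P of an n-state vector through a nonlinearity fnl.
    n     = 2;
    xm    = [1; 0.5];                              % assumed prior mean
    P     = [0.10 0.02; 0.02 0.20];                % assumed prior covariance
    kappa = 1;                                     % assumed scaling parameter
    fnl   = @(x) [x(1)*cos(x(2)); x(1)*sin(x(2))]; % assumed nonlinearity
    S = chol((n + kappa)*P, 'lower');              % matrix square root
    X = [xm, repmat(xm,1,n) + S, repmat(xm,1,n) - S];  % 2n+1 sigma points
    W = [kappa, 0.5*ones(1,2*n)]/(n + kappa);      % sigma-point weights
    Y = zeros(n, 2*n+1);
    for i = 1:2*n+1
        Y(:,i) = fnl(X(:,i));    % propagate each point; no Jacobians needed
    end
    ym = Y*W';                   % transformed mean
    Py = zeros(n);
    for i = 1:2*n+1
        d  = Y(:,i) - ym;
        Py = Py + W(i)*(d*d');   % transformed covariance
    end

In the unscented KF, these transformed moments replace the Jacobian-based prediction and update computations of the extended KF.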
The alternative nonlinear estimators proposed by Pugachev are based on the joint pdf of the states, their estimates, and the measurements; this approach is also called the characteristic function approach. Table I.1 depicts a brief developmental (and partial) history of stochastic filtering theory, from linear to nonlinear, Gaussian to non-Gaussian, and stationary to non-stationary problems [19]. Several of the linear [20] and nonlinear estimation-cum-filtering approaches and algorithms indicated in the preceding paragraphs are discussed in the sections of the present book.

TABLE I.1
Brief History of Stochastic/Nonlinear Filtering Theory

Author(s)                       | Method                                          | Solution                                               | Main Features/Applicability
Kolmogorov, 1941                | Innovations                                     | Exact                                                  | Linear, stationary
Wiener, 1942                    | Spectral factorization                          | Exact                                                  | Linear, stationary, infinite memory
Levinson, 1947                  | Lattice filter                                  | Approximate                                            | Linear, stationary, finite memory
Bode and Shannon, 1950          | Innovations, whitening                          | Exact                                                  | Linear, stationary
Zadeh and Ragazzini, 1950       | Innovations, whitening                          | Exact                                                  | Linear, non-stationary
Kalman, 1960                    | Orthogonal projection                           | Exact                                                  | LQG, non-stationary, discrete
Kalman and Bucy, 1961           | Recursive Riccati equation                      | Exact                                                  | LQG, non-stationary, continuous
Stratonovich, 1960              | Conditional Markov process                      | Exact                                                  | Nonlinear, non-stationary
Kushner, 1967                   | PDE                                             | Exact                                                  | Nonlinear, non-stationary
Zakai, 1969                     | PDE                                             | Exact                                                  | Nonlinear, non-stationary
Handschin and Mayne, 1969       | Monte Carlo                                     | Approximate                                            | Nonlinear, non-Gaussian, non-stationary
Bucy and Senne, 1971            | Point-mass, Bayes                               | Approximate                                            | Nonlinear, non-Gaussian, non-stationary
Kailath, 1971                   | Innovations                                     | Exact                                                  | Linear, non-Gaussian, non-stationary
Benes, 1981                     | Potential function                              | Exact solution of Zakai equation                       | Nonlinear, finite-dimensional
Pugachev, early '80s            | Characteristic function approach                | Exact                                                  | Nonlinear, conditionally optimal
Daum, 1986                      | Potential function, log homotopy, particle flow | Exact solution of FPKF equation, particle flow filter  | Nonlinear, finite-dimensional
Gordon, Salmond and Smith, 1993 | Bootstrap, sequential Monte Carlo               | Approximate                                            | Nonlinear, non-Gaussian, non-stationary
Julier and Uhlmann, 1997        | Unscented transformation                        | Approximate                                            | Nonlinear, (non-)Gaussian, derivative-free

Source: Adapted from Chen, Z., Bayesian filtering: From Kalman filters to particle filters, and beyond, www.dsi.unifi.it/users/chisci/idfric/Nonlinear_filtering_Chen.pdf (accessed May 2015).
References

1. Verhaegen, M., and Verdult, V. Filtering and System Identification – A Least Squares Approach. Cambridge University Press, New York, 2007.
2. Tanizaki, H. Nonlinear Filters: Estimation and Applications (2nd edn.). Springer-Verlag, Berlin, 1996. http://www2.econ.osaka-u.ac.jp/~tanizaki/cv/books/nf/nf.pdf.
3. Nelles, O. Nonlinear System Identification. Springer-Verlag, Berlin, 2001.
4. Ahmed, N. U. Linear and Nonlinear Filtering for Scientists and Engineers. World Scientific Publishing Company, Singapore, 1999.
5. Krishnan, V. Nonlinear Filtering and Smoothing: An Introduction to Martingales, Stochastic Integrals and Estimation. Dover Publications, New York, 2005.
6. Pitas, I., and Venetsanopoulos, A. N. Nonlinear Digital Filters: Principles and Applications. Kluwer, New York, 1990.
7. Astola, J., and Kuosmanen, P. Fundamentals of Nonlinear Digital Filtering. CRC Press, Florida, 1997.
8. Gelb, A. (Ed.). Applied Optimal Estimation. The Analytic Sciences Corporation, MIT Press, Cambridge, MA, 1974.
9. Haykin, S. (Ed.). Kalman Filtering and Neural Networks. John Wiley & Sons, New York, 2001.
10. Candy, J. V. Model-Based Signal Processing. Wiley-IEEE Press, Hoboken, NJ, 2005.
11. Haykin, S. Adaptive Filter Theory. Prentice Hall, Upper Saddle River, NJ, 1986.
12. Simon, D. Optimal State Estimation: Kalman, H-Infinity, and Nonlinear Approaches. John Wiley & Sons, Hoboken, NJ, 2006.
13. Meyer, P.-A. Stochastic processes from 1950 to the present (translated from the French by Jeanine Sedjro, Rutgers University; originally published as 'Les Processus Stochastiques de 1950 à Nos Jours', pp. 813–848 of Development of Mathematics 1950–2000, edited by Jean-Paul Pier, Birkhäuser, 2000). Electronic Journal for History of Probability and Statistics, 5(1), June 2009. www.jehps.net (accessed April 2015).
14. Raol, J. R. Stochastic state estimation with application to satellite orbit determination. Ph.D. thesis, McMaster University, Hamilton, Ontario, Canada, 1986.
15. Sorenson, H. W. Parameter Estimation – Principles and Problems. Marcel Dekker, New York, 1980.
16. Kailath, T. A view of three decades of linear filtering theory. IEEE Transactions on Information Theory, 20(2), 146–181, 1974.
17. Bucy, R. S., and Joseph, P. D. Filtering for Stochastic Processes with Applications to Guidance. Interscience, New York, 1968.
18. Bierman, G. J. Factorization Methods for Discrete Sequential Estimation. Academic Press, New York, 1977.
19. Chen, Z. Bayesian filtering: From Kalman filters to particle filters, and beyond. www.dsi.unifi.it/users/chisci/idfric/Nonlinear_filtering_Chen.pdf (accessed May 2015).
20. Raol, J. R., Girija, G., and Singh, J. Modelling and Parameter Estimation of Dynamic Systems. IEE/IET Control Series Vol. 65, London, 2004.