technique is a kind of stochastic sampling approach aiming to tackle complex systems that are analytically intractable. The power of Monte Carlo methods is that they can attack difficult numerical integration problems. In recent years, sequential Monte Carlo approaches have attracted more and more attention from researchers in different areas, with many successful applications in statistics (see, e.g., the March 2001 special issue of the Annals of the Institute of Statistical Mathematics), signal processing (see, e.g., the February 2002 special issue of the IEEE Transactions on Signal Processing), machine learning, econometrics, automatic control, tracking, communications, biology, and many others (see the references therein). One of the attractive merits of sequential Monte Carlo approaches lies in the fact that they allow on-line estimation, combining powerful Monte Carlo sampling methods with Bayesian inference at a reasonable computational cost. In particular, the sequential Monte Carlo approach has been used in parameter estimation and state estimation; for the latter it is sometimes called the particle filter.
The basic idea of the particle filter is to use a number of random variables called particles, sampled directly from the state space, to represent the posterior probability, and to update the posterior by incorporating new observations; the "particle system" is properly located, weighted, and propagated recursively according to the Bayes rule. In retrospect, the earliest ideas of using the Monte Carlo method in statistical inference appeared much earlier, but the formal establishment of the particle filter seems fairly attributed to Gordon, Salmond, and Smith, who introduced a novel resampling technique into the formulation. Around the same time, a number of statisticians independently rediscovered and developed the sampling-importance-resampling (SIR) idea, which had originally been proposed by Rubin in a non-dynamic framework.
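To make the locate-weight-propagate-resample cycle concrete, the following is a minimal sketch of a bootstrap (SIR) particle filter for an assumed scalar random-walk state model with Gaussian observation noise. The model, noise levels, particle count, and function name are illustrative assumptions for this sketch, not a prescription from any particular reference.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=1.0, obs_std=1.0, seed=0):
    """Minimal SIR sketch for an assumed scalar model:
        x_t = x_{t-1} + process noise,   y_t = x_t + observation noise.
    Returns the posterior-mean state estimate at each time step."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from an assumed prior
    estimates = []
    for y in observations:
        # Propagate: sample each particle from the transition density.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight: Gaussian likelihood of the new observation (Bayes update),
        # computed in log space for numerical stability.
        log_w = -0.5 * ((y - particles) / obs_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Estimate: posterior mean under the weighted particle system.
        estimates.append(float(np.sum(w * particles)))
        # Resample: multinomial resampling to combat weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates

# Track a simulated noisy random walk; the posterior-mean estimates
# should follow the true states to within roughly the observation noise.
rng = np.random.default_rng(1)
true_states = np.cumsum(rng.normal(0.0, 1.0, 50))
ys = true_states + rng.normal(0.0, 1.0, 50)
est = bootstrap_particle_filter(ys)
print(float(np.mean(np.abs(np.array(est) - true_states))))
```

Resampling after every step, as here, is the simplest choice; many variants instead resample only when the effective sample size falls below a threshold, which reduces the Monte Carlo variance introduced by resampling.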
The rediscovery and renaissance of particle filters in the mid-1990s, after a long dormant period, is owed partially to the ever-increasing computing power. Recently, much work has been done to improve the performance of particle filters, and many doctoral theses have been devoted to Monte Carlo filtering and inference from different perspectives. It should be noted that the particle filter is not the only leaf on the Bayesian filtering tree, in the sense that Bayesian filtering can also be tackled with other techniques, such as the differential geometry approach, the variational method, or the conjugate method. One potential future direction, as we will discuss in the paper, is to combine these methods with Monte Carlo sampling techniques. The attention of this paper, however, remains on Monte Carlo methods and particularly on sequential Monte Carlo estimation.

Many other terminologies also exist in the literature, e.g., the SIS filter, SIR filter, bootstrap filter, sequential imputation, or the CONDENSATION algorithm, though they are addressed differently in different areas. In this paper, we treat them as different variants within the generic Monte Carlo filter family. Not all Monte Carlo filters perform sequential Monte Carlo estimation.

The particle filter is so called if it produces i.i.d. samples; sometimes it is deliberate to introduce correlations among the particles for the sake of variance reduction.

The earliest idea of multiple imputation, due to Rubin, was published in 1978.
D. Outline of Paper
In this paper, we present a comprehensive review of stochastic filtering theory from a Bayesian perspective. [It happens to be almost three decades since the 1974 publication of Prof. Thomas Kailath's illuminating review paper, "A view of three decades of linear filtering theory"; we take this opportunity to dedicate this paper to him, who has greatly contributed to the literature on stochastic filtering theory.] With the tools of Bayesian statistics, it turns out that the celebrated Kalman filter is a special case of Bayesian filtering under the LQG (linear, quadratic, Gaussian) circumstance, a fact first observed by Ho and Lee; particle filters are likewise rooted in Bayesian statistics, in the spirit of recursive Bayesian estimation. Of particular interest to us are the situations that we mostly encounter in the real world. Generally, for nonlinear filtering, no exact solution can be obtained, or the solution is infinite-dimensional; hence various numerical approximation methods come in to address the intractability. In particular, we focus our attention on the sequential Monte Carlo method, which allows on-line estimation from a Bayesian perspective. The historical roots of Monte Carlo filtering are traced and remarks are given. Bayesian filtering approaches outside the Monte Carlo framework are also reviewed. Besides, we extend our discussion from Bayesian filtering to Bayesian inference, in which the well-known hidden Markov model (HMM) (a.k.a. the HMM filter), dynamic Bayesian networks (DBN), and Bayesian kernel machines are briefly discussed.

Nowadays, Bayesian filtering has become such a broad topic, involving many scientific areas, that a comprehensive survey and detailed treatment seem crucial to cater to the ever-growing demand for understanding this important field among novices, though it is noticed by the author that there exist in the literature a number of excellent tutorial papers on particle filters and Monte Carlo filters, as well as relevant edited volumes and books. Unfortunately, as observed in our comprehensive bibliographies, many papers were written by statisticians or physicists using specialized terminology that might be unfamiliar to many engineers. Besides, the papers were written with different nomenclatures for different purposes (e.g., convergence and asymptotic results are rarely of concern in engineering but are important to statisticians). The author thus felt obligated to write a tutorial paper on this emerging and promising area for a readership of engineers, and to introduce the reader to many techniques developed in statistics
Or the sufficient statistic is infinite-dimensional.