
Kalman filters

LISTING 1, continued: Kalman filter simulation

% Update the estimation error covariance.
P = a * P * a' - a * P * c' * inv(s) * c * P * a' + Sw;
% Save some parameters for plotting later.
pos = [pos; x(1)];
posmeas = [posmeas; y];
poshat = [poshat; xhat(1)];
vel = [vel; x(2)];
velhat = [velhat; xhat(2)];
end

% Plot the results
close all;
t = 0 : dt : duration;
figure;
plot(t,pos, t,posmeas, t,poshat);
grid;
xlabel('Time (sec)');
ylabel('Position (feet)');
title('Figure 1 - Vehicle Position (True, Measured, and Estimated)');

figure;
plot(t,pos-posmeas, t,pos-poshat);
grid;
xlabel('Time (sec)');
ylabel('Position Error (feet)');
title('Figure 2 - Position Measurement Error and Position Estimation Error');

figure;
plot(t,vel, t,velhat);
grid;
xlabel('Time (sec)');
ylabel('Velocity (feet/sec)');
title('Figure 3 - Velocity (True and Estimated)');

figure;
plot(t,vel-velhat);
grid;
xlabel('Time (sec)');
ylabel('Velocity Error (feet/sec)');
title('Figure 4 - Velocity Estimation Error');
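Since the listing shows only the tail end of the simulation loop, a minimal sketch of the per-iteration update it relies on may be helpful. This is pure Python rather than the article's MATLAB, and every numeric value below (sample period, noise covariances, initial state, input, and measurement) is an illustrative assumption, not taken from the listing. The covariance update matches the listing's P = a*P*a' - a*P*c'*inv(s)*c*P*a' + Sw line; because c is 1-by-2 here, the innovation covariance s is a scalar and can be divided through directly.

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def kalman_step(xhat, P, u, y, a, b, c, Sw, Sz):
    """One Kalman iteration: innovation, gain, state update, covariance update."""
    # Innovation: measurement minus predicted measurement (scalar here).
    inn = y - mat_mul(c, xhat)[0][0]
    # Innovation covariance s = c*P*c' + Sz (scalar, since c is 1x2).
    s = mat_mul(mat_mul(c, P), transpose(c))[0][0] + Sz
    # Kalman gain K = a*P*c' * inv(s), a 2x1 vector.
    K = [[row[0] / s] for row in mat_mul(mat_mul(a, P), transpose(c))]
    # State update: xhat = a*xhat + b*u + K*inn.
    xhat = mat_add(mat_add(mat_mul(a, xhat),
                           [[b[0][0] * u], [b[1][0] * u]]),
                   [[K[0][0] * inn], [K[1][0] * inn]])
    # Covariance update: P = a*P*a' - a*P*c'*inv(s)*c*P*a' + Sw.
    aPat = mat_mul(mat_mul(a, P), transpose(a))
    aPct = mat_mul(mat_mul(a, P), transpose(c))   # 2x1
    cPat = mat_mul(mat_mul(c, P), transpose(a))   # 1x2
    P = mat_add(mat_sub(aPat,
                        [[aPct[i][0] * cPat[0][j] / s for j in range(2)]
                         for i in range(2)]), Sw)
    return xhat, P

# Illustrative parameters (assumed for demonstration, not from the article):
T = 0.1                                   # sample period (sec)
a = [[1, T], [0, 1]]                      # state transition matrix
b = [[T * T / 2], [T]]                    # input matrix
c = [[1, 0]]                              # measurement matrix (position only)
Sw = [[T**4 / 4, T**3 / 2],
      [T**3 / 2, T**2]]                   # process noise covariance (shape only)
Sz = 10.0                                 # measurement noise covariance
xhat = [[0.0], [0.0]]                     # initial state estimate
P = [[1.0, 0.0], [0.0, 1.0]]              # initial error covariance
xhat, P = kalman_step(xhat, P, u=1.0, y=0.5, a=a, b=b, c=c, Sw=Sw, Sz=Sz)
```

Running many of these steps in a loop, while logging pos, poshat, vel, and velhat as the listing does, reproduces the kind of traces plotted in Figures 1 through 4.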

The Kalman filter not only works well but is theoretically attractive. It can be shown that the Kalman filter minimizes the variance of the estimation error. But what if we have a problem where we are more concerned with the worst-case estimation error? What if we want to minimize the “worst” estimation error rather than the “average” estimation error? This problem is solved by the H∞ filter.

The H∞ filter (pronounced “H infinity” and sometimes written as H∞) is an alternative to Kalman filtering that was developed in the 1980s. It is less widely known and less commonly applied than the Kalman filter, but it has advantages that make it more effective in certain situations. Kalman filter theory assumes that the process noise w and the measurement noise z are independent of each other. What if we have a system where these two noise processes are not independent? This is the correlated noise problem, and the Kalman filter can be modified to handle this case. In addition, the Kalman filter requires that the noise covariances Sw and Sz be known. What if they are not known? Then how can we best estimate the state? Again, this is the problem solved by the H∞ filter. Kalman filtering is a huge field whose depths we cannot hope to begin to plumb in these few pages. Thousands of papers and dozens of textbooks have been written on this subject since its inception in 1960.

Historical perspective

The Kalman filter was developed by Rudolph Kalman, although Peter Swerling developed a very similar algorithm in 1958. The filter is named after Kalman because he published his results in a more prestigious journal and his work was more general and complete. Sometimes the filter is referred to as the Kalman-Bucy filter because of Richard Bucy’s early work on the topic, conducted jointly with Kalman. The roots of the algorithm can be traced all the way back to the 18-year-old Karl Gauss’s method of least squares in 1795. Like many new technologies, the Kalman filter was developed to solve a specific problem, in this case, spacecraft navigation for the Apollo space program. Since then, the Kalman filter has found applications in hundreds of diverse areas, including all forms of navigation (aerospace, land, and marine), nuclear power plant instrumentation, demographic modeling, manufacturing, the detection of underground radioactivity, and fuzzy logic and neural network training.

Dan Simon is a professor in the electrical and computer engineering department at

Embedded Systems Programming, June 2001