Recursive Bayesian Filtering

Table of Contents

Notation

Kalman Filter

Assumed model

We assume the state to be a random variable $\mathbf{x}_k$ as follows:

$$\mathbf{x}_k = \mathbf{F}_k \mathbf{x}_{k-1} + \mathbf{B}_k \mathbf{u}_k + \mathbf{w}_k, \quad \mathbf{w}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{Q}_k)$$

and the measurements or observations to be a random variable $\mathbf{z}_k$:

$$\mathbf{z}_k = \mathbf{H}_k \mathbf{x}_k + \mathbf{v}_k, \quad \mathbf{v}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{R}_k)$$

I.e. both the state $\mathbf{x}_k$ and the measurement $\mathbf{z}_k$ are assumed to be subject to normally distributed noise with covariance-matrices $\mathbf{Q}_k$ and $\mathbf{R}_k$, respectively.
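As a concrete illustration, the assumed model can be simulated directly. This is a minimal sketch: the constant-velocity state, position-only measurement, and all numeric values for $\mathbf{F}$, $\mathbf{H}$, $\mathbf{Q}$, $\mathbf{R}$ are invented for the example.

```python
import numpy as np

# Minimal sketch of the assumed linear-Gaussian model. The names F, H, Q, R
# follow this section's notation; the numeric values are illustrative.
rng = np.random.default_rng(0)

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: (position, velocity)
Q = 0.01 * np.eye(2)                    # process-noise covariance
H = np.array([[1.0, 0.0]])              # we only measure the position
R = np.array([[0.1]])                   # measurement-noise covariance

x = np.array([0.0, 1.0])                # initial state
# x_k = F x_{k-1} + w_k,   w_k ~ N(0, Q)
x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
# z_k = H x_k + v_k,       v_k ~ N(0, R)
z = H @ x + rng.multivariate_normal(np.zeros(1), R)
```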

Deduction

Notation

  • $\mathbf{x}_k$ state
  • $\hat{\mathbf{x}}_k$ predicted state
  • $\mathbf{P}_k$ covariance matrix for the predicted state
  • $\mathbf{F}_k$ prediction- / state-transition-matrix, mapping the state at step $k-1$ to the state at step $k$
  • $\mathbf{B}_k$ control-matrix, e.g. mapping external forces to how they affect the state
  • $\mathbf{u}_k$ control-vector, e.g. external force
  • $\mathbf{Q}_k$ covariance matrix describing external uncertainty from the environment
  • $\mathbf{H}_k$ maps the predicted state to the measurement-space, i.e. "predicts" the measurement from the state
  • $\mathbf{z}_k$ real measurement
  • $\mathbf{R}_k$ uncertainty in the real measurements

Algorithm

We start out by writing the predicted state $\hat{\mathbf{x}}_k$ as a function of the previous predicted state $\hat{\mathbf{x}}_{k-1}$:

$$\hat{\mathbf{x}}_k = \mathbf{F}_k \hat{\mathbf{x}}_{k-1}, \qquad \mathbf{P}_k = \mathbf{F}_k \mathbf{P}_{k-1} \mathbf{F}_k^T$$

where $\mathbf{P}_k$ is due to the following identity:

$$\mathrm{Cov}(\mathbf{F}_k \mathbf{x}) = \mathbf{F}_k \, \mathrm{Cov}(\mathbf{x}) \, \mathbf{F}_k^T$$

I.e. mapping the covariance at step $k-1$ to the covariance of the state at step $k$.

The next step is then to incorporate "external forces", where we use:

  • $\mathbf{u}_k$ the control-vector, e.g. an external force
  • $\mathbf{B}_k$ the control-matrix, mapping the control-vector into the state-space

I like to view $\mathbf{B}_k$ as a simple function mapping the control-vector $\mathbf{u}_k$ to the state-space, or rather, a function which maps the external "force" to how this "force" affects the state.

This, together with the uncertainty $\mathbf{Q}_k$ of the external "forces" / influences, gives the full prediction:

$$\hat{\mathbf{x}}_k = \mathbf{F}_k \hat{\mathbf{x}}_{k-1} + \mathbf{B}_k \mathbf{u}_k, \qquad \mathbf{P}_k = \mathbf{F}_k \mathbf{P}_{k-1} \mathbf{F}_k^T + \mathbf{Q}_k$$
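The prediction step can be sketched as a small function. The matrices and the control example (a scalar acceleration-like input) are illustrative assumptions, not values from the text.

```python
import numpy as np

# Predict step: x_hat_k = F x_hat_{k-1} + B u_k,  P_k = F P_{k-1} F^T + Q.
def predict(x_hat, P, F, Q, B=None, u=None):
    x_pred = F @ x_hat
    if B is not None and u is not None:
        x_pred = x_pred + B @ u          # incorporate the "external force"
    P_pred = F @ P @ F.T + Q             # propagate uncertainty, add Q
    return x_pred, P_pred

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # (position, velocity) transition
Q = 0.01 * np.eye(2)
B = np.array([[0.5], [1.0]])             # maps a scalar input to (pos, vel)
u = np.array([0.2])

x_pred, P_pred = predict(np.array([0.0, 1.0]), np.eye(2), F, Q, B, u)
```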

Now, what if we can see the real measurement $\mathbf{z}_k$ for step $k$ before proceeding? Well, we would like to update our model! To do this, we need to relate the predicted state to the real state. We thus write the predicted measurement as:

$$\hat{\mathbf{z}}_k = \mathbf{H}_k \hat{\mathbf{x}}_k$$

Now, we can then write the real measurement as follows:

$$\mathbf{z}_k = \mathbf{H}_k \mathbf{x}_k + \mathbf{v}_k, \quad \mathbf{v}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{R}_k)$$

And we of course want the following to be true:

$$\mathbb{E}[\hat{\mathbf{z}}_k] = \mathbb{E}[\mathbf{z}_k]$$

Therefore we can map the predicted state to the measurement-space by using the matrix $\mathbf{H}_k$.

Thus we have a model for making the predictions; now we need to figure out how to update our model!

For this we first map our "internal" state $\hat{\mathbf{x}}_k$ to the measurement-space. In this case, with a state of position and velocity, our measurements only consist of the position $p_k$:

$$\mathbf{H}_k \hat{\mathbf{x}}_k = \begin{pmatrix} 1 & 0 \end{pmatrix} \begin{pmatrix} \hat{p}_k \\ \hat{v}_k \end{pmatrix} = \hat{p}_k$$

Thus, the predicted measurements follow $\mathcal{N}\big(\mathbf{H}_k \hat{\mathbf{x}}_k, \mathbf{H}_k \mathbf{P}_k \mathbf{H}_k^T\big)$ and the real measurements follow $\mathcal{N}(\mathbf{H}_k \mathbf{x}_k, \mathbf{R}_k)$, where $\mathbf{x}_k$ is the "real" internal state.

To compute our updated state, we can then combine the distributions of the predicted and real measurements, incorporating information from both our estimate and the current measurement.

The product of two Gaussian densities is itself (proportional to) a Gaussian, with mean $\vec{\mu}'$ and covariance $\Sigma'$:

$$\mathbf{K} = \Sigma_0 (\Sigma_0 + \Sigma_1)^{-1}$$

$$\vec{\mu}' = \vec{\mu}_0 + \mathbf{K} (\vec{\mu}_1 - \vec{\mu}_0)$$

$$\Sigma' = \Sigma_0 - \mathbf{K} \Sigma_0$$

In the context of Kalman filters we call $\mathbf{K}$ the Kalman gain.
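A quick scalar sanity-check of these three formulas (all numbers invented): with equal variances, the combined mean lands halfway between the two means and the variance is halved.

```python
# Scalar instance of the Gaussian-combination formulas above.
mu0, var0 = 0.0, 1.0      # first Gaussian (e.g. the prediction)
mu1, var1 = 2.0, 1.0      # second Gaussian (e.g. the measurement)

K = var0 / (var0 + var1)              # Kalman gain, scalar case
mu_new = mu0 + K * (mu1 - mu0)        # midpoint when variances are equal
var_new = var0 - K * var0             # variance shrinks after combining
```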

By letting $\vec{\mu}_0 = \mathbf{H}_k \hat{\mathbf{x}}_k,\ \Sigma_0 = \mathbf{H}_k \mathbf{P}_k \mathbf{H}_k^T$ and $\vec{\mu}_1 = \mathbf{z}_k,\ \Sigma_1 = \mathbf{R}_k$, we get the update-equations:

$$\hat{\mathbf{x}}_k' = \hat{\mathbf{x}}_k + \mathbf{K}' (\mathbf{z}_k - \mathbf{H}_k \hat{\mathbf{x}}_k)$$

$$\mathbf{P}_k' = \mathbf{P}_k - \mathbf{K}' \mathbf{H}_k \mathbf{P}_k$$

where $\mathbf{K}' = \mathbf{P}_k \mathbf{H}_k^T \big(\mathbf{H}_k \mathbf{P}_k \mathbf{H}_k^T + \mathbf{R}_k\big)^{-1}$ is $\mathbf{K} = \mathbf{H}_k \mathbf{P}_k \mathbf{H}_k^T \big(\mathbf{H}_k \mathbf{P}_k \mathbf{H}_k^T + \mathbf{R}_k\big)^{-1}$ without the $\mathbf{H}_k$ in front.
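The update-equations can be sketched as follows, using the $\mathbf{K}'$ form; the numeric inputs are invented for illustration.

```python
import numpy as np

# Update step:  K' = P H^T (H P H^T + R)^{-1},
#               x' = x + K'(z - H x),  P' = P - K' H P.
def update(x_pred, P_pred, z, H, R):
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # this is K' from the text
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = P_pred - K @ H @ P_pred
    return x_new, P_new

H = np.array([[1.0, 0.0]])                   # measure position only
R = np.array([[0.1]])
x_new, P_new = update(np.array([1.0, 1.0]),  # predicted state (pos, vel)
                      np.eye(2),             # predicted covariance
                      np.array([1.5]),       # observed position
                      H, R)
```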

Summary


Prediction

$$\hat{\mathbf{x}}_k = \mathbf{F}_k \hat{\mathbf{x}}_{k-1} + \mathbf{B}_k \mathbf{u}_k, \qquad \mathbf{P}_k = \mathbf{F}_k \mathbf{P}_{k-1} \mathbf{F}_k^T + \mathbf{Q}_k$$

Update

Since our predictive model is Gaussian, and we assume a Gaussian distribution for our measurements, we simply update our model parameters $(\hat{\mathbf{x}}_k, \mathbf{P}_k)$ by combining these two Gaussian rvs., which again gives us a Gaussian distribution with the new updated parameters:

$$\hat{\mathbf{x}}_k' = \hat{\mathbf{x}}_k + \mathbf{K}' (\mathbf{z}_k - \mathbf{H}_k \hat{\mathbf{x}}_k), \qquad \mathbf{P}_k' = \mathbf{P}_k - \mathbf{K}' \mathbf{H}_k \mathbf{P}_k$$
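Putting prediction and update together, a minimal end-to-end sketch; the constant-velocity scenario and all numeric values are invented for illustration.

```python
import numpy as np

# Track an object moving at constant velocity from noisy position measurements.
rng = np.random.default_rng(1)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (pos, vel)
Q = 1e-4 * np.eye(2)                     # small process noise
H = np.array([[1.0, 0.0]])               # observe position only
R = np.array([[0.25]])                   # measurement-noise variance (std 0.5)

x_true = np.array([0.0, 1.0])            # true state: starts at 0, velocity 1
x_hat, P = np.zeros(2), 10.0 * np.eye(2) # vague prior

for _ in range(50):
    x_true = F @ x_true
    z = H @ x_true + rng.normal(0.0, 0.5, size=1)
    # Predict
    x_hat = F @ x_hat
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # K' from the derivation
    x_hat = x_hat + K @ (z - H @ x_hat)
    P = P - K @ H @ P

# The velocity estimate x_hat[1] should settle near the true value 1.0.
```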

Particle Filtering

Notation

  • $x_k^{(i)}$ denotes the k-th sampled step of the i-th "particle" / sample-sequence
  • $q\big(x_k \mid x_{k-1}, z_k\big)$ is the "importance distribution" which we can sample from

Overview

  • Can represent arbitrary distributions
    • But this is an approximate method, unlike methods such as the Kalman Filter, which are exact and "optimal" under their model assumptions
  • Name comes from the fact that we're keeping track of multiple "particles"
    • "Particle" is just a single "chain" of samples
  • Main building-block is Importance Sampling (IS)
    • "Extended" to sequences, which we call Sequential Importance Sampling

Sequential Importance Sampling (SIS)

Sequential Importance Sampling (SIS) extends Importance Sampling (IS) by updating the sample-weights $w_k^{(i)}$ sequentially.

The particle and weight updates, respectively, are defined by

$$x_k^{(i)} \sim q\big(x_k \mid x_{k-1}^{(i)}, z_k\big), \qquad w_k^{(i)} \propto w_{k-1}^{(i)} \, \frac{p\big(z_k \mid x_k^{(i)}\big) \, p\big(x_k^{(i)} \mid x_{k-1}^{(i)}\big)}{q\big(x_k^{(i)} \mid x_{k-1}^{(i)}, z_k\big)}$$

IMPORTANT: normalize the weights after each weight-update, such that we can use the weights to approximate expectations:

$$\mathbb{E}\big[f(x_k)\big] \approx \sum_{i=1}^{N} w_k^{(i)} f\big(x_k^{(i)}\big)$$
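A single SIS step might look as follows for a toy 1-D model. The "bootstrap" choice $q(x_k \mid x_{k-1}, z_k) = p(x_k \mid x_{k-1})$ is an assumption made here for simplicity; with it, the weight update reduces to multiplying by the likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = rng.normal(0.0, 1.0, size=N)     # x_{k-1}^(i)
weights = np.full(N, 1.0 / N)

z = 0.5                                      # observed measurement
# Propagate: x_k^(i) ~ q(. | x_{k-1}^(i)) = N(x_{k-1}^(i), 0.1)  [bootstrap]
particles = particles + rng.normal(0.0, np.sqrt(0.1), size=N)
# Reweight: w_k^(i) ∝ w_{k-1}^(i) p(z_k | x_k^(i)), with p(z|x) = N(z; x, 0.2)
weights = weights * np.exp(-0.5 * (z - particles) ** 2 / 0.2)
weights /= weights.sum()                     # IMPORTANT: normalize

estimate = np.sum(weights * particles)       # E[x_k] ≈ Σ_i w^(i) x^(i)
```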

Degeneracy phenomenon

  • Usually after a few iterations of SIS most particles have negligible weights
    • => large computational effort is spent updating particles with very small contributions: highly inefficient
  • One can "measure" degeneracy using the "effective sample size":

    $$\hat{N}_{\text{eff}} = \frac{1}{\sum_{i=1}^{N} \big(w_k^{(i)}\big)^2}$$

    • Uniform weights: $\hat{N}_{\text{eff}} = N$
    • Severe degeneracy: $\hat{N}_{\text{eff}} = 1$
  • To deal with degeneracy we introduce resampling
  1. When degeneracy is above a threshold, eliminate particles with low importance weights
  2. Sample $N$ new samples / "particles" $x_k^{(i)*}$ by

    $$\Pr\big(x_k^{(i)*} = x_k^{(j)}\big) = w_k^{(j)}$$

    i.e. we're generating new particles by drawing from the current particles, each with probability equal to the corresponding weight.

    • If we eliminated the particles with low importance weights, the corresponding weights should of course not be included.
  3. New samples and weights are

    $$x_k^{(i)} \leftarrow x_k^{(i)*}, \qquad w_k^{(i)} = \frac{1}{N}$$

    i.e. new weights are uniform.
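The effective sample size and the resampling step above can be sketched as follows (multinomial resampling; the function names and weight values are illustrative):

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum_i (w^(i))^2, for normalized weights."""
    return 1.0 / np.sum(weights ** 2)

def resample(particles, weights, rng):
    """Draw N new particles, each old particle chosen with prob. = its weight;
    the returned weights are uniform."""
    N = len(particles)
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

rng = np.random.default_rng(0)
uniform = np.full(4, 0.25)                     # N_eff = N = 4
degenerate = np.array([0.97, 0.01, 0.01, 0.01])  # N_eff ≈ 1

new_particles, new_weights = resample(np.array([1.0, 2.0, 3.0, 4.0]),
                                      degenerate, rng)
# Most resampled particles duplicate the one heavy particle (value 1.0).
```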

In practice, the resampling threshold $N_T$ is chosen in terms of this effective sample size $\hat{N}_{\text{eff}}$ defined above.

Practical considerations

Implementing the categorical sampling used in resampling can be performed efficiently by:

  1. Normalize the weights
  2. Calculate an array of the cumulative sums of the weights
    • This partitions the range $(0, 1]$ into $N$ intervals
    • If $c_i = \sum_{j \le i} w_k^{(j)}$, then $u \in (c_{i-1}, c_i]$ means the interval $u$ falls into corresponds to $x_k^{(i)}$
  3. Randomly generate $N$ uniform numbers $u_1, \dots, u_N \sim \mathcal{U}(0, 1]$ and determine the corresponding samples
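The three steps above map almost directly onto numpy, using `searchsorted` to find which interval each uniform number falls into (the weight values here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = np.array([0.1, 0.2, 0.3, 0.4])   # step 1: already normalized
cumsum = np.cumsum(weights)                # step 2: [0.1, 0.3, 0.6, 1.0]
u = rng.uniform(size=6)                    # step 3: uniform numbers
indices = np.searchsorted(cumsum, u)       # interval (hence particle) per u_j
```

`searchsorted` with its default `side='left'` returns the first index `i` with `cumsum[i] >= u`, which is exactly the half-open interval $(c_{i-1}, c_i]$ described above.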

Generic Particle Filtering

Combining SIS and resampling, we get generic Particle Filtering.

At each "step" $k$, we "obtain new states" / "update the particles" through:

  1. Produce the tuple of samples $\big\{\big(x_k^{(i)}, w_k^{(i)}\big)\big\}_{i=1}^{N}$ using SIS
  2. Compute $\hat{N}_{\text{eff}}$
    • If $\hat{N}_{\text{eff}} < N_T$: resample!
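Combining everything, a generic particle-filter loop might look like this. The 1-D random-walk model, the bootstrap proposal, the measurement sequence, and the threshold $N_T = N/2$ are all illustrative choices, not prescribed by the text.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
particles = rng.normal(0.0, 1.0, size=N)
weights = np.full(N, 1.0 / N)

for z in [0.2, 0.4, 0.6]:                    # made-up measurement sequence
    # 1. SIS: propagate (bootstrap proposal = transition prior) and reweight
    particles = particles + rng.normal(0.0, np.sqrt(0.1), size=N)
    weights = weights * np.exp(-0.5 * (z - particles) ** 2 / 0.2)
    weights /= weights.sum()
    # 2. Compute N_eff; resample when it drops below the threshold N_T = N/2
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

estimate = np.sum(weights * particles)       # posterior-mean estimate
```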