Recursive Bayes

It’s funny, this is such a standard use of Bayesian analysis, but most PPLs don’t have much support for it. That’s because it’s usually better to do inference on all the data you have at once, since the posterior is represented as a function of the data. There are two main exceptions to this:

  • If you make strong assumptions about the data, e.g. for a Kalman filter you assume normality, so the posterior has a closed form you can update in place (see the sketch after this list).
  • If you make very few assumptions and do things very empirically, then you end up with a particle filter.
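
To make the first case concrete, here’s a minimal sketch in plain Julia with Distributions.jl (not Soss; `update` and `recurse` are made-up helper names). Under normality the posterior is again a Normal, so conditioning chunk by chunk gives exactly the same answer as conditioning on everything at once:

```julia
using Distributions

# Conjugate update for the mean θ of a Normal with known noise sd σ:
# prior θ ~ Normal(μ0, τ0), data xs ~ Normal(θ, σ). The posterior is
# again Normal, so it can be reused directly as the next prior.
function update(prior::Normal, xs, σ)
    μ0, τ0 = mean(prior), std(prior)
    prec = 1 / τ0^2 + length(xs) / σ^2        # posterior precision
    μ = (μ0 / τ0^2 + sum(xs) / σ^2) / prec    # posterior mean
    return Normal(μ, sqrt(1 / prec))
end

# Condition chunk by chunk, feeding each posterior back in as the prior.
function recurse(prior::Normal, data, σ; chunksize=10)
    for chunk in Iterators.partition(data, chunksize)
        prior = update(prior, chunk, σ)
    end
    return prior
end

data = randn(100) .+ 2.0                       # observations, true mean ≈ 2
batch = update(Normal(0, 10), data, 1.0)       # all at once
streamed = recurse(Normal(0, 10), data, 1.0)   # chunked
# `batch` and `streamed` agree up to floating-point error.
```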

One big focus of Soss.jl is model composability; in particular, models can chain together. There’s an example here of building a Markov chain in Soss; a rough sketch of the chaining idea follows.
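
To be clear, this is not Soss syntax, and `transition` and `simulate` are made-up names; the point is just that each step is a model from the current state to a distribution over the next state, and the chain is those steps composed:

```julia
using Distributions, Random

# One step of the chain: given the current state, a distribution
# over the next state (here a Gaussian random walk).
transition(x) = Normal(x, 1.0)

# Compose n steps, starting from a draw from the initial distribution.
function simulate(rng, init::Distribution, n::Int)
    x = rand(rng, init)
    xs = [x]
    for _ in 2:n
        x = rand(rng, transition(x))
        push!(xs, x)
    end
    return xs
end

xs = simulate(Xoshiro(1), Normal(0, 1), 100)
```

This kind of thing will get easier with some updates currently in the works. What (I think) we really want is something like this: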

  • Represent a chain abstractly
  • Observe a chunk of data, and get the posterior conditional on all of it (all at once)
  • Turn this into a representation of a new distribution to be used as a prior. Maybe it’s a Gaussian approximation, or a mixture of Gaussians, or an approximation from some variational family. (A rough sketch of this loop follows the list.)
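
None of this is Soss’s current API, but here’s roughly the shape of that loop in plain Julia with Distributions.jl, using a crude random-walk Metropolis sampler and a Normal as the approximating family (`logpost`, `sample_posterior`, and `refit_loop` are all hypothetical names):

```julia
using Distributions, Random

# Unnormalized log posterior for a Normal-location model with known sd σ.
logpost(θ, prior, xs, σ) = logpdf(prior, θ) + sum(logpdf.(Normal(θ, σ), xs))

# A crude random-walk Metropolis sampler; just enough for a 1-d sketch.
function sample_posterior(rng, prior, xs, σ; iters=5_000, stepsize=0.5)
    θ = mean(prior)
    lp = logpost(θ, prior, xs, σ)
    out = Float64[]
    for _ in 1:iters
        θ′ = θ + stepsize * randn(rng)
        lp′ = logpost(θ′, prior, xs, σ)
        if log(rand(rng)) < lp′ - lp
            θ, lp = θ′, lp′
        end
        push!(out, θ)
    end
    return out
end

# The recursive loop: observe a chunk, sample the posterior, then project
# it back into the chosen family (a Normal here) to act as the next prior.
function refit_loop(rng, prior, chunks, σ)
    for chunk in chunks
        samples = sample_posterior(rng, prior, chunk, σ)
        prior = fit(Normal, samples)   # Gaussian approximation → new prior
    end
    return prior
end

rng = Xoshiro(1)
data = randn(rng, 200) .+ 2.0
post = refit_loop(rng, Normal(0, 10), Iterators.partition(data, 50), 1.0)
```

Swapping `fit(Normal, samples)` for a mixture fit or a variational approximation is exactly the choice point in the last bullet: the approximation family determines how much posterior structure survives from one chunk to the next.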