Over the past couple of months we’ve been working on a fairly significant PR designed to overhaul Turing’s internal API and make future development much easier. Many of these changes are not something a user will ever see, but there are several significant breaking changes that everyone needs to know about.
Currently the changes have only landed on the master branch, but they will eventually make their way into the stable release.
The most obvious change is how the number of iterations is specified when generating a chain with `sample`. Previously, users would draw samples using a line like this:
```julia
chain = sample(model, NUTS(1000, 100, 0.65))
```
This means that we are requesting 1,000 posterior samples, that the first 100 samples should be used for adaptation, and that we’re targeting a 65% acceptance rate.
No longer! The number of samples is being moved outside the algorithm construction, so you will be able to use:
```julia
# Using NUTS. Note that the number of adaptation samples still
# needs to be given in the algorithm definition.
chain = sample(model, NUTS(100, 0.65), 1000)

# You can also use the default NUTS settings for
# easy inference in continuous models.
chain = sample(model, NUTS(), 1000)

# Using 10 particles in Particle Gibbs (PG)
chain = sample(model, PG(10), 1000)

# Using Gibbs to combine PG and HMC across different parameters
chain = sample(model, Gibbs(PG(10, :theta1), HMC(0.1, 7, :theta2)), 1000)
```
For this PR we were mostly focused on getting the internal API set up correctly, so several non-core samplers that typically do not see much use have been left behind for now. The plan is to eventually move them over to the new internal setup, but until then these samplers will be nonfunctional.
Why are we doing this?
Much of the reason we are breaking some things is that the Turing team is aiming to make the package the home of plug-and-play Bayesian inference methods, and this means we need to have a consistent interface between models and sampling methods.
A lot of the design work for the PR went into creating and implementing a very general interface that provides most of the framework for any MCMC method, whether that’s Metropolis-Hastings, a Hamiltonian sampler, or any other method you might like.
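To make the idea concrete, an interface of this kind separates the sampler-specific transition logic from a generic sampling loop. The sketch below is illustrative only, not the actual Turing internals: the names `AbstractSampler`, `step`, and `mcmc_sample` are hypothetical stand-ins, and the toy random-walk Metropolis-Hastings sampler is just one example of what could plug into such a loop.

```julia
# A minimal sketch of a plug-and-play sampler interface. Each sampler type
# implements `step` for its own transition rule; the generic `mcmc_sample`
# loop never needs to change.

abstract type AbstractSampler end

# Toy random-walk Metropolis-Hastings targeting a user-supplied log density.
struct MetropolisHastings <: AbstractSampler
    proposal_std::Float64
end

# One transition: propose a move, accept or reject it, return the new state.
function step(logdensity, spl::MetropolisHastings, x::Float64)
    x_prop = x + spl.proposal_std * randn()
    log_ratio = logdensity(x_prop) - logdensity(x)
    return log(rand()) < log_ratio ? x_prop : x
end

# Generic driver: any sampler with a `step` method plugs in here.
function mcmc_sample(logdensity, spl::AbstractSampler, n::Int; init = 0.0)
    samples = Vector{Float64}(undef, n)
    x = init
    for i in 1:n
        x = step(logdensity, spl, x)
        samples[i] = x
    end
    return samples
end

# Draw from a standard normal target (log density up to a constant).
logp(x) = -x^2 / 2
chain = mcmc_sample(logp, MetropolisHastings(1.0), 10_000)
```

The payoff of this factoring is that adding a new inference method only requires defining a new sampler type and its transition; chain construction, initialization, and the sampling loop come for free.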
We’re hoping that we can get at least some of the wider Julia PPL universe onto our sampling interface, so that everyone has a common language for transitioning between samples and returning sampled chains. The raw interface can be found here, but the plan is eventually to move it to MCMCChains.jl. A guide on how to use the interface and its design goals is coming Soon™.