[ANN] Turing.jl 0.14.0

Hey folks,

It’s been a while since our last release, a gap we’re hoping to avoid in the future. The Turing-verse is mostly adopting ColPrac going forward, which means we’ll be tagging releases much more frequently to keep up with bug fixes and minor changes.

There are a lot of things wrapped up in this release. The biggest thing we want to note is that Turing is dropping support for Julia versions below 1.3, so please update to 1.3 or greater to continue to receive the most current updates. We dropped pre-1.3 versions for a handful of reasons, primarily that necessary fixes for some Turing dependencies required Julia 1.3+.

I’ll list out the major features and improvements you can expect in 0.14. There were a ton of bug fixes and miscellaneous improvements across the whole Turing-verse, too many to list here, so I’ll focus on the primary changes to the system.

Affine invariant ensemble sampling (emcee)

You may be familiar with the excellent emcee package in Python, which implements affine invariant ensemble sampling. We added emcee-style sampling to AdvancedMH recently, and Turing 0.14 allows users to sample Turing models with the ensemble sampler.

Usage:

using Turing

@model gdemo(x, y) = begin
    # Priors on the variance and the mean.
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))

    # Observations.
    x ~ Normal(m, sqrt(s))
    y ~ Normal(m, sqrt(s))
end

n_samples = 100
n_walkers = 1000

# Construct the ensemble sampler and condition the model on data.
spl = Turing.Inference.Emcee(n_walkers, MvNormal(2, 100), 2.0)
model = gdemo(1.5, 2.0)

# Each walker appears as a separate chain in the result.
chain1 = sample(model, spl, n_samples)

There are two caveats with this particular sampling method:

  1. Each “chain” in the result represents a single walker (see the indexing sketch after this list).
  2. You cannot use Gibbs with Emcee.
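
Since each walker is stored as its own chain, here’s a minimal sketch of pulling out a single walker, assuming the standard MCMCChains indexing (iterations × parameters × chains); the walker index is illustrative:

# Pull out every sample drawn by walker 10 (an illustrative index).
walker_10 = chain1[:, :, 10]

# Summary statistics pool across all walkers.
describe(chain1)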

As per usual, give it a try and let us know how it works.

MLE/MAP fixes

You can now provide a starting point for MLE/MAP, as in

estimate = optimize(model, MLE(), starting_point, LBFGS())

Second-order optimizers like Newton are no longer broken. In addition, the previous version of MLE/MAP used finite difference gradients rather than the available AD gradient; first-order methods now use the AD-derived gradient.
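
To illustrate both fixes, here’s a minimal sketch running MAP with Optim’s Newton() and an explicit starting point, reusing the gdemo model from above; the starting values for (s, m) are illustrative:

using Turing, Optim

model = gdemo(1.5, 2.0)

# Newton is a second-order method that previously failed here; the
# starting values below are illustrative, not recommendations.
map_estimate = optimize(model, MAP(), [1.0, 0.0], Newton())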

AdvancedVI.jl

The variational inference library has been spun off into its own package, AdvancedVI.jl, mirroring the other satellite packages in the Turing ecosystem. Expect future development on VI to show up there.
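
For reference, here’s a minimal sketch of variational inference as exposed through Turing, run on the gdemo model from above; the ADVI settings are illustrative:

using Turing

model = gdemo(1.5, 2.0)

# ADVI(samples_per_step, max_iters): 10 Monte Carlo samples per
# gradient step and 1000 optimization steps (illustrative settings).
q = vi(model, ADVI(10, 1000))

# Draw samples from the fitted variational posterior.
posterior_samples = rand(q, 1000)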
