[ANN] Turing.jl 0.10.0

Hey folks. We’re happy to announce Turing version 0.10.0! We’re now maintaining a HISTORY.md file with an in-depth summary of changes, but I wanted to give an overview of the biggest features that you can use in 0.10.0.

Automatic differentiation

@mohamed82008 added two additional automatic differentiation backends, Zygote.jl and ReverseDiff.jl. You can use these backends with Turing.setadbackend(:zygote) and Turing.setadbackend(:reversediff), respectively.

It’s worth noting that Zygote does not support array mutation, so it cannot be used with models that mutate arrays; choose your models accordingly. Both Zygote and ReverseDiff currently perform poorly with loops, so vectorized code will be more performant. See the performance tips for more information.

Both are experimental at this stage, so we would appreciate it if you tried them out and let us know about any bugs you find, or your general experience.
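As a quick sketch of how backend selection fits into a workflow, here is a toy model (the model and data below are illustrative, not from the announcement; the NUTS settings are just common defaults):

```julia
using Turing

# A simple illustrative Normal model with unknown mean and variance.
@model gdemo(x) = begin
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    for i in eachindex(x)
        x[i] ~ Normal(m, sqrt(s))
    end
end

# Switch the AD backend before sampling -- a one-line change.
Turing.setadbackend(:reversediff)
chain = sample(gdemo([1.5, 2.0]), NUTS(0.65), 1000)
```

Note that if you want to try :zygote with a model like this, the explicit loop over x is the kind of code you would want to vectorize first, per the performance notes above.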

We have also introduced breaking changes to how you select your AD backend. Previously, you could use Turing.setadbackend(:forward_diff) and Turing.setadbackend(:reverse_diff); the new syntax uses the lowercase AD package name without underscores.

List of available backends:

  • Turing.setadbackend(:zygote) is Zygote.jl
  • Turing.setadbackend(:tracker) is Tracker.jl
  • Turing.setadbackend(:reversediff) is ReverseDiff.jl
  • Turing.setadbackend(:forwarddiff) is ForwardDiff.jl

The previous options Turing.setadbackend(:forward_diff) and Turing.setadbackend(:reverse_diff) have been deprecated and are slated for removal, but for now they map to

  • Turing.setadbackend(:forward_diff) is ForwardDiff.jl
  • Turing.setadbackend(:reverse_diff) is Tracker.jl

Elliptical slice sampling

Elliptical slice sampling has been supported by Turing since v0.8, but @devmotion recently hooked the excellent EllipticalSliceSampling.jl package up to Turing. Elliptical slice sampling is a very cool tool, and I don’t think I’ve told everyone about it as well as I should have, so please try it out! You can sample your model using sample(model, ESS(), n_samples) as usual.

Note that ESS requires Gaussian priors, so you will get (fun) error messages if you use non-Gaussian priors.
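To make the Gaussian-prior requirement concrete, here is a minimal sketch (the model and data are illustrative assumptions, not from the announcement):

```julia
using Turing

# ESS requires Gaussian priors: m ~ Normal satisfies this,
# while e.g. m ~ Gamma(2, 3) would error at sampling time.
@model demo(x) = begin
    m ~ Normal(0, 1)
    for i in eachindex(x)
        x[i] ~ Normal(m, 0.5)
    end
end

chain = sample(demo([0.1, -0.2, 0.3]), ESS(), 1000)
```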