[ANN] Turing.jl 0.12.0 release

Hey folks.

Turing 0.12.0 is out, and it comes with a lovely set of changes and updates. There are some deprecations and API changes, so please read carefully if 1) you use custom distributions or 2) you use the parallel sampling function psample.

See all the changes in our HISTORY.md file; for now, here are the main changes from the user's side.

Custom distributions

The interface for defining new distributions with a constrained support and making them compatible with Turing has changed. To make a custom distribution type CustomDistribution compatible with Turing, you need to define the method bijector(d::CustomDistribution), which returns an instance of a type implementing the Bijectors.Bijector API.
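As a rough sketch of what this looks like (the wrapped Exponential, and the choice of a log bijector for positive support, are illustrative additions; the exact bijector constructor may vary across Bijectors.jl versions):

```julia
using Random, Distributions, Bijectors

# Hypothetical distribution with support (0, ∞); all names here are illustrative.
struct CustomDistribution <: ContinuousUnivariateDistribution end

# Delegate sampling and density evaluation to an Exponential for the sketch.
Distributions.rand(rng::Random.AbstractRNG, ::CustomDistribution) =
    rand(rng, Exponential())
Distributions.logpdf(::CustomDistribution, x::Real) = logpdf(Exponential(), x)

# Tell Turing how to map the constrained support (0, ∞) onto ℝ,
# assuming a log transform is the right choice for this support:
Bijectors.bijector(::CustomDistribution) = Bijectors.Log()
```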

You can read more about Bijectors.jl here. Please reach out to the team on Discourse in the probabilistic programming category, or on Slack/Zulip.


Parallel sampling

Upstream changes in AbstractMCMC.jl have deprecated the psample function, which previously sampled multiple chains with thread parallelism. psample started out as an experiment to see if we could make it work, and it was time to bring it more in line with the design patterns we use elsewhere.

To that end, sample now has additional signatures which allow you to specify a type of parallelism (threaded or process-based) to use on each chain. Here are some examples:

# Sample n_chains using one process per chain
sample(model, sampler, MCMCDistributed(), n_samples, n_chains)

# Sample n_chains using one thread per chain
sample(model, sampler, MCMCThreads(), n_samples, n_chains)

I hope everyone will try this out – the distributed process sampling is a new feature that I’m fond of.

Thread safety

In other parallelism news, observe statements are now thread-safe. As an example, the following code is now thread-safe:

@model function example(data)
    Threads.@threads for i in eachindex(data)
        data[i] ~ Normal(0, 1)
    end
end

This should help if you observe a large amount of data and want to parallelize logpdf calls.
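Putting the pieces together, a full run might look like this (the parameter m, the sampler choice, and the data size are illustrative additions, not from the release notes):

```julia
using Turing

@model function example(data)
    m ~ Normal(0, 1)
    # Observations inside the threaded loop are accumulated thread-safely.
    Threads.@threads for i in eachindex(data)
        data[i] ~ Normal(m, 1)
    end
end

chain = sample(example(randn(10_000)), NUTS(), 1_000)
```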

Model tools

The macros @varinfo, @logpdf, and @sampler have been removed. Instead, you can access the internal variables _varinfo, _model, _sampler, and _context inside the @model definition.

This removes macros from inside the model definition, which was something of an antipattern. Since @logpdf is gone, it's worth noting that you can access the current log density using DynamicPPL.getlogp(_varinfo).
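As an illustrative sketch (the model and the @show call are my additions), you might inspect the running log density like this:

```julia
using Turing, DynamicPPL

@model function demo(x)
    m ~ Normal()
    x ~ Normal(m, 1)
    # _varinfo is in scope inside @model; getlogp returns the
    # log density accumulated so far during model evaluation.
    @show DynamicPPL.getlogp(_varinfo)
end

demo(1.0)()  # evaluate the model once, printing the running logp
```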

Particle sampler methods

SMC and PG now have new constructors that make it far easier to set the resampling threshold and resampler method.

For example, if you want to sample with a threshold of 0.4, you can do so with SMC(0.4). For PG, this is PG(n_particles, 0.4).
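For instance (the toy model, particle count, and draw count below are illustrative, not from the release notes):

```julia
using Turing

# A toy coin-flip model to sample from.
@model function coinflip(y)
    p ~ Beta(1, 1)
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)
    end
end

model = coinflip(rand(Bool, 100))

# SMC with a resampling threshold of 0.4:
chn_smc = sample(model, SMC(0.4), 1_000)

# PG with 20 particles and the same threshold:
chn_pg = sample(model, PG(20, 0.4), 1_000)
```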

That’s all for now, folks. Thanks for reading, and please give us whatever feedback you have.


@torfjelde also wanted me to mention that you can now add docstrings to your models. The following should now work:

julia> using DynamicPPL, Distributions

julia> """
       This is a demo model.
       """
       @model function demo(x)
           m ~ Normal()
           x ~ Normal(m, 1)
           return m
       end
help?> demo
search: demo denominator ReadOnlyMemoryError widemul StridedMatrix DenseMatrix divrem code_llvm @code_llvm StridedVecOrMat DimensionMismatch DenseVecOrMat


  This is a demo model.

julia> m = demo(1)
Model{var"###evaluator#307",(:x,),Tuple{Int64},(),ModelGen{var"###generator#308",(:x,),(),Tuple{}}}(##evaluator#307, (x = 1,), ModelGen{var"###generator#308",(:x,),(),Tuple{}}(##generator#308, NamedTuple()))

julia> m()

It’s a small change, but also a cool one that makes Turing models feel more tightly integrated with the Julia ecosystem.


Thanks a lot for writing this package. Could you please provide a complete working example about how to write a custom distribution using the new interface?

Sure. @torfjelde is going to update the custom distribution documentation here – I’ll post back here when it’s done.

The updated distribution guide is here: https://turing.ml/dev/docs/using-turing/advanced#31-domain-transformation

Thank you very much. I also found concrete examples in this discussion.