Traces of transformed variables within Turing.jl models

Hi all,

Is it possible to make Turing return the transformations of random variables in chains? For instance:

@model function silly_example(y)
    x ~ Beta(1,1)
    z = x > 0.5                # This guy!
    y ~ Normal(0, x)
end

Is it possible to get traces of z in the chains when sampling?

Thanks!
Brad


One stupid way, if you are on Distributions.jl >= 0.24.7, is to make z a draw from a Dirac distribution:

@model function silly_example(y)
    x ~ Beta(1,1)
    z ~ Dirac(x > 0.5)
    y ~ Normal(0, x)
end

This approach is limited to univariate quantities, unfortunately.
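
With that change, z is recorded in the chain alongside x, so it can be summarized like any other variable. A minimal sketch of pulling it out (the sampler choice and draw count here are illustrative, and gradient-based samplers may not get along with the discrete Dirac draw):

model = silly_example(1.0)
chain = sample(model, MH(), 1000)   # MH avoids differentiating through the Dirac

# z shows up as a column in the chain; its posterior mean is a
# Monte Carlo estimate of P(x > 0.5 | y)
mean(chain[:z])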

Alternatively, you can use the “generated quantities” method:

@model function silly_example_2(y)
    x ~ Beta(1,1)
    z = x > 0.5
    y ~ Normal(0, x)

    return z
end

model2 = silly_example_2(1.0)
chain2 = sample(model2, NUTS(), 1000)
gen = generated_quantities(model2, chain2)

Note the addition of the return statement here. generated_quantities re-runs the model with the parameters set to each sample in the chain and extracts whatever the return statement produces.
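
The result has one entry per sample (and per chain). A quick sketch of summarizing it, assuming the scalar return value above:

# each entry of gen is the Bool returned by the model for one sample,
# so the mean is a Monte Carlo estimate of P(x > 0.5 | y)
z_draws = vec(gen)
mean(z_draws)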

Honestly, there has been a long discussion about how to include these in chains more easily, and I don’t think we’ve settled on a good solution yet.


Oh great, generated quantities was what I needed! Thanks!

I had the same question on Twitter https://twitter.com/inferencelab/status/1350067286539964417 before I knew about this Discourse. FYI, there’s not much on generated_quantities in the docs, so as a beginner to Julia and Turing, I felt a bit lost.


I am somewhat late to the discussion, but is there also a way to conveniently use generated_quantities or something similar with variational inference?

If I understand correctly, generated_quantities simply takes your chain, sets the parameters equal to the values found in it, and executes the remaining statements to compute the quantities of interest. It shouldn’t matter whether the chain was produced by variational inference or any other sampling method.

Nice, and thanks for the quick response @dlakelan! For future reference, here is the example from the docs using variational inference:

using Turing
using StatsPlots
using Turing: Variational

@model function demo(xs)
    s ~ InverseGamma(2, 3)
    m_shifted ~ Normal(10, √s)
    m = m_shifted - 10

    for i in eachindex(xs) 
        xs[i] ~ Normal(m, √s)
    end

    return (m, )
end


# the model must be instantiated with data before running VI;
# this data is illustrative
xs = randn(100)
model = demo(xs)

q = vi(model, ADVI(10, 2000))
samples = rand(q, 1000)

#NOTE: constructing the chain becomes slightly more involved with sampled vectors and matrices
_, sym2range = bijector(model, Val(true))
val = vcat([samples[range...,:] for (sym, range) in pairs(sym2range)]...)'
parameter_names = [keys(sym2range)...]
chain = Chains(val, parameter_names)

gq = generated_quantities(model, chain)
m = vec(getindex.(gq, 1))
density(m, lab="generated quantity (VI)")
vline!([0], lab="true value")

(plot: density of the generated quantity m from the VI run, with the true value marked at 0)

PyMC3 solves this with pm.Deterministic; could there be an easy way to support this special case? Hopefully with fewer symbols to type, though… :yum:
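
In the meantime, one workaround is to fold the generated quantities back into a Chains object so they plot and summarize like regular parameters. A sketch, reusing the m computed above (the parameter name :m is illustrative):

# wrap the generated quantity in a Chains object
m_chain = Chains(reshape(m, :, 1), [:m])
plot(m_chain)   # trace and density plots like any sampled parameter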