How to obtain log-likelihoods from a Turing model

What are some good ways of obtaining the likelihood of observing some data given the parameters, i.e. P(y | \theta), where I feed in both y and \theta? I couldn't quite find it in the documentation.

Edit: it would be especially helpful if I could also get likelihoods when sampling from the posterior with one of the samplers, such as NUTS.

Call loglikelihood.


To elaborate, here’s a simple example showing how to actually do it:

using Turing

@model function demo(x)
    μ ~ Normal(0, 1)
    σ ~ Exponential(1)
    x .~ Normal(μ, σ)
end

m = demo(randn(10))
# log-likelihood of the data in m at the supplied parameter values, i.e. log P(x | μ, σ)
loglikelihood(m, (μ=2, σ=3))
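
If you also want these values at the posterior draws (re the edit in the question), one option is simply to re-evaluate loglikelihood at each sample in the chain. A minimal sketch, assuming a single chain whose parameter columns are named :μ and :σ:

c = sample(m, NUTS(), 100)
# re-evaluate the data log-likelihood at every posterior draw
lls = [loglikelihood(m, (μ = c[:μ][i], σ = c[:σ][i])) for i in 1:size(c, 1)]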

The MCMCChains object returned by sampling contains two (apparently identical) fields, :lp and :log_density, holding the value of the log-posterior at each sample point, but those values include the priors as well as the likelihood. I'm not sure whether there's a built-in way to extract just the likelihood from a posterior chain, but you can compute the log-likelihood manually inside the model, return it, and retrieve it after sampling via generated_quantities:

@model function demo1(x)
    μ ~ Normal(0, 1)
    σ ~ Exponential(1)
    # compute the log-likelihood of the observations explicitly and add it to the
    # model's joint log-density (this replaces the usual x .~ Normal(μ, σ) statement)
    loglik = loglikelihood(Normal(μ, σ), x)
    Turing.@addlogprob!(loglik)
    return loglik
end

m1 = demo1(randn(10))
c1 = sample(m1, NUTS(), 100)
ll1 = generated_quantities(m1, c1)  # per-draw return values, i.e. the log-likelihoods
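
generated_quantities should give you an iterations × chains array of the model's return value, so ll1 holds one log-likelihood per draw. A small usage sketch:

using Statistics

ll_vec = vec(ll1)  # flatten to a plain vector of per-draw log-likelihoods
mean(ll_vec)       # e.g. the average log-likelihood over the posterior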

Thanks so much for the detailed example!


Thanks!