Turing conditioning syntax and models using @addlogprob!

Hello! It is my understanding that the preferred syntax for providing observations to Turing models going forward is the condition operator, model() | (;y), rather than passing the observations explicitly as model(y). However, I do not see how this works when building a model without an explicit likelihood in the code. Consider the model below, which runs a Kalman filter over the observations and accumulates the log-likelihood with the Turing.@addlogprob! macro instead of a statement like y ~ Distribution(...).

@model function kalman(ys, H, F, Q, P, R)
    _, N = size(ys)
    latent_dim = size(H, 2)
    x₀ ~ MvNormal(zeros(latent_dim), I)
    x = x₀
    for t in 1:N
        # Predict step: propagate the state and form the predicted observation.
        x, P, y, S = kalman_predict(x, P, H, F, Q, R)
        r = ys[:, t] - y # without ys as an argument this residual cannot be computed
        # Update step: fold the residual back into the state estimate.
        x, P, y, S = kalman_update(x, P, r, S, H, R)
        # Accumulate the Gaussian log-likelihood of the residual by hand.
        Turing.@addlogprob! -0.5 * (logdet(S) + r' * (S \ r))
    end
end
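
For completeness, the helpers above are just the standard Kalman recursions. Here is a minimal sketch of what they might look like (hypothetical implementations; the exact details don't matter for the question):

using LinearAlgebra

# Predict step: propagate the state estimate one step and form the
# predicted observation y and its covariance S.
function kalman_predict(x, P, H, F, Q, R)
    x = F * x
    P = F * P * F' + Q
    y = H * x
    S = H * P * H' + R
    return x, P, y, S
end

# Update step: fold the residual r (with covariance S) back into the
# state estimate via the Kalman gain. R is kept only to match the
# signature used in the model above.
function kalman_update(x, P, r, S, H, R)
    K = P * H' / S # Kalman gain
    x = x + K * r
    P = (I - K * H) * P
    return x, P, H * x, S
end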

Without passing ys explicitly there is no way to write down the log probability, so I am curious whether there is a way to use the conditioning syntax with this kind of model?
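
For comparison, here is a minimal sketch of how I understand the conditioning syntax works on a model that does have an explicit likelihood (the gaussian model and its parameters here are made up for illustration):

using Turing, LinearAlgebra

@model function gaussian(n)
    μ ~ Normal(0, 1)
    y ~ MvNormal(fill(μ, n), I)
end

data = randn(10)
conditioned = gaussian(10) | (; y = data) # condition on the observed y
chain = sample(conditioned, NUTS(), 1000)

Here y is a random variable inside the model, so conditioning can intercept the y ~ ... statement. In my Kalman model there is no such statement to intercept.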

Yeah, that’s unfortunately because we don’t have a great solution to that at the moment.

As a short-term solution, see the code snippets after EDIT 2 in Supporting mutating ADs in models that fill arrays of parameters · Issue #412 · TuringLang/DynamicPPL.jl · GitHub

Using the @isobservation and @value macros there, you should be able to achieve what you want.

These might make their way into the codebase in the not too distant future.

Thank you for explaining the situation. I’ll probably wait to make changes until an API becomes public.

Could you explain why Turing is moving to the newer syntax? It seems to introduce some new complexity, so I assume there is a good reason for it.
