Hello! It is my understanding that the preferred syntax for providing observations to Turing models going forward is the condition operator, `model() | (; y)`, instead of explicitly passing the observations as `model(y)`. However, I do not see how this works when building a model without an explicit likelihood in the code. Consider the model below, which runs a Kalman filter over the observations and uses the `Turing.@addlogprob!` macro instead of something like `y ~ Distribution(...)`.
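For context, here is the kind of model where the conditioning syntax does work straightforwardly, because the observation appears on the left of a `~` statement (a toy model of my own, not from the docs):

```julia
using Turing

# A toy model with an explicit likelihood: `y` is a free random
# variable that can be conditioned on later via the `|` operator.
@model function toy()
    μ ~ Normal(0, 1)
    y ~ Normal(μ, 1)
end

# Condition on a NamedTuple of observations instead of passing
# `y` as a function argument.
conditioned = toy() | (; y = 1.5)
chain = sample(conditioned, NUTS(), 100)
```

Here `y` exists as a variable in the model, so conditioning can intercept it. My Kalman filter model below has no such variable.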
```julia
@model function kalman(ys, H, F, Q, P, R)
    _, N = size(ys)
    latent_dim = size(H, 2)
    x₀ ~ MvNormal(zeros(latent_dim), I)
    x = x₀
    for t in 1:N
        x, P, y, S = kalman_predict(x, P, H, F, Q, R)
        r = ys[:, t] - y  # without ys as an argument this residual cannot be computed
        x, P, y, S = kalman_update(x, P, r, S, H, R)
        Turing.@addlogprob! -0.5 * (logdet(S) + r' * inv(S) * r)
    end
end
```
Without explicitly passing `ys`, there is no way to compute the log probability inside the model body, so I am curious: is there a way to use the conditioning syntax with this kind of model?