Turing.jl setting different updates for arrays of variables

Hello, I have a model with an array of discrete binary parameters J[t]
and an array of continuous parameters X[t]. I want to use NUTS() or
HMC() updates for the continuous variables and MH() updates
for the discrete parameters in the MCMC. It should be possible
with Gibbs(), but it is not clear how
to use it when the random variables are indexed arrays, and I could not
find such examples in the documentation.

Here is the changepoint model:

```julia
Turing.@model function poisson_model(N, T, p, rho)
    J = tzeros(Int, T)
    X = tzeros(Real, T)
    J[1] ~ Dirac(1)
    X[1] ~ Exponential(rho)
    theta0 = X[1]
    N[1] ~ Poisson(theta0)
    for t in 2:T
        J[t] ~ Bernoulli(p)
        X[t] ~ Laplace(0, rho)
        theta = sum(J[s] * X[s] for s in 1:t)
        if theta < 0.0
            Turing.@addlogprob! -Inf
        end
        N[t] ~ Poisson(theta)
    end
end
```

```julia
poisson_posterior = poisson_model(N, T, p, rho)
chain = Turing.sample(poisson_posterior, MH(), nMCMC)
```

This works, but it mixes very slowly. If I then try to combine
different updates using Gibbs()

```julia
chain = Turing.sample(poisson_posterior, Gibbs(HMC(0.1, 5, :X), MH(:J)), nMCMC)
```

the last command gives errors, whereas the same approach works when X and J are scalars.

Thanks !

It is better to surround code with triple backquotes (```) so it formats properly.
It is also better to post code that runs when copy-pasted into the REPL (known as an MWE, a minimal working example). In this case:

```julia
using Turing

Turing.@model function poisson_model(N, T, p, rho)
    J = tzeros(Int, T)
    X = tzeros(Real, T)
    J[1] ~ Dirac(1)
    X[1] ~ Exponential(rho)
    theta0 = X[1]
    N[1] ~ Poisson(theta0)
    for t in 2:T
        J[t] ~ Bernoulli(p)
        X[t] ~ Laplace(0, rho)
        theta = sum(J[s] * X[s] for s in 1:t)
        if theta < 0.0
            Turing.@addlogprob! -Inf
        end
        N[t] ~ Poisson(theta)
    end
    return (J, X)
end

rho = 0.5
T = 3
N = [1, 2, 2]
p = 0.5
poisson_posterior = poisson_model(N, T, p, rho)
nMCMC = 1000

using StatsPlots
chain = Turing.sample(poisson_posterior, Gibbs(HMC(0.1, 5, :X), MH(:J)), nMCMC)
```

As for the question, the above code ran without errors on Julia 1.8.3 with Turing 0.22.0. What is the error on your setup?
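If it errors on your setup, it also helps to post the exact versions; for example:

```julia
import Pkg

# Print the Julia version and the installed Turing version, to compare
# against a setup where the code is known to work (Julia 1.8.3, Turing 0.22.0).
println("Julia ", VERSION)
Pkg.status("Turing")
```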

Thanks a lot Dan, next time I will use triple backquotes and include a minimal working example. I was not sure whether the syntax
`Gibbs(NUTS(100, 0.65, :X), MH(:J))` would be correct when X[t] and J[t] are arrays of random variables, but yes, it is correct; the code is running now. The problem was perhaps a too-large HMC step size for my data
N = [182,165,182,136,135,127,136,133,
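For reference, a sketch of the adaptive variant mentioned above. NUTS tunes its own step size during warm-up, which sidesteps hand-picking the HMC step size; this assumes the `poisson_posterior` and `nMCMC` defined in the MWE earlier in the thread:

```julia
using Turing

# NUTS adapts its step size over 100 warm-up iterations toward a target
# acceptance rate of 0.65, so no step size has to be tuned by hand.
# :X restricts NUTS to the continuous block; MH handles the discrete :J.
sampler = Gibbs(NUTS(100, 0.65, :X), MH(:J))

# chain = Turing.sample(poisson_posterior, sampler, nMCMC)
```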