Hi everyone,
I just finished reading the paper for the Turing module in Julia and went through your tutorials – it all looks really great, congratulations! I did not find a dedicated forum for it, so I am posting my question here:
Your tutorials include an HMM example here: https://turing.ml/tutorials/4-bayeshmm/
I will abuse notation a bit and drop indices. You have some evidence e and want the joint posterior of the parameters \theta = \{m, T\} and the state trajectory s, P(\theta, s | e). You construct the sampler via the command:
g = Gibbs(1000, HMC(2, 0.001, 7, :m, :T), PG(20, 1, :s) )
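(For concreteness, here is a minimal sketch of the kind of model this sampler is applied to – my own paraphrase from memory, not the tutorial's exact code, and the constructor signatures have changed across Turing versions.)

using Turing

# Minimal HMM sketch along the lines of the tutorial (my paraphrase, not the
# exact tutorial code): K hidden states, Gaussian emissions with unknown means m,
# transition matrix rows T, and the latent trajectory s.
@model function simple_hmm(y, K)
    N = length(y)
    s = tzeros(Int, N)              # trajectory container, copied per particle by PG
    m = Vector(undef, K)            # emission means
    T = Vector{Vector}(undef, K)    # transition matrix, stored row-wise
    for k in 1:K
        T[k] ~ Dirichlet(ones(K) / K)
        m[k] ~ Normal(k, 0.5)
    end
    s[1] ~ Categorical(K)
    y[1] ~ Normal(m[s[1]], 0.1)
    for t in 2:N
        s[t] ~ Categorical(vec(T[s[t-1]]))
        y[t] ~ Normal(m[s[t]], 0.1)
    end
end

# Compositional sampler as above; the exact constructor arguments depend on the
# Turing version, so the calls below are only indicative.
# g = Gibbs(HMC(0.001, 7, :m, :T), PG(20, :s))
# chain = sample(simple_hmm(y, 3), g, 1000)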
Now, to my actual question. I really like your concept of compositional Gibbs, and this sampler runs really well, but what you actually do there is run a Particle Gibbs algorithm: (1) particle filtering the state trajectory s and (2) jointly sampling \theta via HMC plus a correction step. It cannot be a plain Gibbs sampler, since you use a particle approximation for the trajectory; and since you only use 20 particle trajectories, I do not think any form of correction is applied here, because your chain never gets stuck and works perfectly fine for this example.
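To spell out the alternation I am describing, here is a rough sketch (conditional_smc and hmc_step are placeholder names I am using for illustration, not Turing internals):

# Rough sketch of the alternation I mean; conditional_smc and hmc_step are
# placeholders for illustration, not functions that exist in Turing.
function particle_gibbs_sketch(y, θ0, s0; n_iters = 1000, n_particles = 20)
    θ, s = θ0, s0
    samples = Vector{Tuple{typeof(θ), typeof(s)}}()
    for _ in 1:n_iters
        # (1) Conditional particle filter over the trajectory: run n_particles
        #     trajectories with the previous s kept as reference, then draw one
        #     trajectory from the resulting particle approximation.
        s = conditional_smc(y, θ, s, n_particles)
        # (2) Parameter update given the sampled trajectory, e.g. one or more
        #     HMC steps targeting p(θ | s, y), with the usual accept/reject step.
        θ = hmc_step(y, s, θ)
        push!(samples, (θ, s))
    end
    return samples
end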
I understand that you want to unify the commands needed for inference, but this looks a bit odd: you call PG inside the Gibbs sampler, yet in the end you are performing Particle Gibbs, not a Gibbs sampler, right? They are really different beasts. I have seen that there is an SMC.jl file inside your Turing module – wouldn't it make much more sense to write something like this for state-space models:
g = PG(YOUR_SMC_METHOD(:s, …), YOUR_MCMC_METHOD(:theta, …))
# or, more generally
g = ParticleMCMC(YOUR_SMC_METHOD(:s, …), YOUR_MCMC_METHOD(:theta, …))
Apologies if I somehow misinterpreted it, and thank you for your input!
Best regards
Edit: Just to clarify - the sampler worked perfectly fine; I am just wondering about the syntax for combining various MC methods.