I recently fell in love with Julia, and I was looking for packages that can efficiently sample from a Boltzmann-type probability distribution:
p(x) ∼ exp[-H(x)]
where the variables x can be either continuous or discrete.
I found out about DynamicHMC, but apparently it handles only continuous variables, since it demands a gradient…
Does anyone know about a package that handles the mixed variables case?
So that I can avoid writing a slower, buggier, more simplistic MCMC sampler for my research project…
Thank you in advance!!
It’s not clear what you are asking for: HMC won’t work unless the (log) density is continuous and differentiable, which is a basic requirement of the algorithm.
You can either
- marginalize out the discrete part, and use DynamicHMC or any other NUTS variant,
- use a Gibbs-like sampler with HMC on the continuous part (but for that, you really need to understand the details and tune it yourself),
- use a gradient-free sampler, e.g. random-walk Metropolis–Hastings (RWMH).
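The third option is the simplest to sketch in self-contained code. Below is a minimal random-walk Metropolis–Hastings sampler for p(x, s) ∝ exp(-H(x, s)) with one continuous variable x and one discrete spin s ∈ {-1, +1}; the Hamiltonian H(x, s) = x²/2 - s·x is just an illustrative example, not something from this thread.

```julia
using Random

# Hypothetical toy Hamiltonian: continuous x, discrete spin s ∈ {-1, +1}.
H(x, s) = x^2 / 2 - s * x

# Random-walk MH for p(x, s) ∝ exp(-H(x, s)): perturb the continuous part
# with a symmetric Gaussian step, propose a spin flip half the time, and
# accept with probability min(1, exp(H_old - H_new)).
function rwmh(H, x0, s0; n = 10_000, step = 0.5, rng = Random.default_rng())
    x, s = x0, s0
    chain = Vector{Tuple{Float64,Int}}(undef, n)
    for i in 1:n
        x′ = x + step * randn(rng)     # symmetric continuous proposal
        s′ = rand(rng) < 0.5 ? -s : s  # symmetric discrete proposal (flip)
        if rand(rng) < exp(H(x, s) - H(x′, s′))
            x, s = x′, s′
        end
        chain[i] = (x, s)
    end
    return chain
end

chain = rwmh(H, 0.0, 1)
```

Because both proposals are symmetric, the plain Metropolis acceptance ratio exp(-ΔH) is correct; for this H, conditional on s the target is N(s, 1), so samples of x track the sign of s.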
Hi Tamas, first of all thanks for answering and for the good work on Julia!
The first option unfortunately cannot be pursued, since I need to estimate statistics on composite variables…
The other options could work; could you recommend a package or a combination of packages?
Apparently there are ways to augment HMC with discrete variables! Very interesting.
I think that Klara and Mamba can do combined NUTS–Gibbs steps, but I am not sure.
Turing.jl supports a Gibbs sampler.
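For reference, a Turing.jl Gibbs sampler can pair a discrete-variable sampler (e.g. particle Gibbs) with HMC on the continuous part. The sketch below is an assumption-laden illustration, not from this thread: the toy model, the variable names, and the tuning numbers are all made up, and the `Gibbs`/`PG`/`HMC` constructor signatures may differ between Turing.jl versions, so check the current docs before copying this.

```julia
using Turing  # requires the Turing.jl package

# Hypothetical toy model: discrete mixture index z, continuous location x.
@model function mixed(y)
    z ~ Categorical([0.5, 0.5])          # discrete variable
    x ~ Normal(z == 1 ? -2.0 : 2.0, 1.0) # continuous variable
    y ~ Normal(x, 1.0)
end

# Gibbs: particle Gibbs (PG) handles the discrete z, HMC handles the
# continuous x; tuning constants here are placeholders.
chain = sample(mixed(0.5), Gibbs(PG(20, :z), HMC(0.1, 5, :x)), 1000)
```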