ANN: DynamicHMC 2.0

DynamicHMC 2.0 was just released. I wrote a blog post about the changes and the new features.

There are breaking API changes, so please read the documentation.

Feel free to ask support questions here.

32 Likes

@Tamas_Papp thanks a lot for the updated version of your package!
I am a newbie in this field of MCMC for Bayesian estimation… I hope you don’t mind if I use this thread for a bit of help in understanding why my code is not working.

I have quite a complicated loglikelihood function, written in Julia, that can be correctly evaluated.
However, after following the steps in your worked example (transforming the parameters and setting up the gradient) I get an error when I run mcmc_with_warmup():

`ArgumentError: nothing should not be printed; use show, repr, or custom output instead.`

(This is the full stacktrace)

I wouldn’t be surprised if the problem was in the automatic differentiation, since the loglikelihood is quite an involved function, but unfortunately I’m not able to understand this error message…

Is there a print statement somewhere in your log likelihood function?
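For context on why a stray debug statement is the likely culprit: Julia deliberately refuses to `print` the value `nothing` (which is what many functions, e.g. a failed `findfirst`, return), and doing so raises exactly the `ArgumentError` quoted above. A minimal sketch:

```julia
# In Julia, print(nothing) throws — e.g. a leftover debug print of a
# variable that happens to hold `nothing` produces the error above.
err = try
    print(nothing)
catch e
    e
end

# `show` and `repr`, by contrast, handle `nothing` fine.
@assert err isa ArgumentError
@assert repr(nothing) == "nothing"
```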

@jkbest2 there are no `print` statements in the loglikelihood… From what I understand by looking at the stacktrace, it seems to be something related to internal functions of the library

My knowledge of Bayesian statistics is really rusty, and I plan to pick it up again in the near future. My old textbooks all use BUGS; I recently bought the Statistical Rethinking book, but it uses R. The Julia code for this book on GitHub still seems to be in progress, with many chapters missing.

Do you have any recommended readings for using your DynamicHMC package?

1 Like

I am happy to help you with this, but I need an MWE.

Yes, there is a bibliography in the README. I would recommend starting with Bayesian Data Analysis. But Statistical Rethinking should also be fine — coding everything is part of the learning process.

Note that learning even the basics of Bayesian analysis is not a trivial undertaking, and can easily be equivalent to a full semester course.

Thanks, today I will try to strip down the loglikelihood function to the minimum that still produces the problem!

Thanks. It would be best if you opened an issue on GitHub, with version information etc.

BTW, are you sure you are using 2.0.0 or master? There was an issue like this which was fixed by this line. The line numbers in your stacktrace don’t match up with the current version, but that could be unrelated.
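One quick way to check which package versions are actually in the active environment (the exact output format varies between Julia versions):

```julia
# List the packages in the active environment together with their
# installed versions; also reachable from the REPL package mode as `] st`.
using Pkg
Pkg.status()
@assert isdefined(Main, :Pkg)  # Pkg is in the standard library
```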

Hey, thanks a lot, I don’t think I was using the latest version.
I have now updated to Julia 1.2 and got the packages from your GitHub repository, and now everything works.
Incidentally, for some reason I still get the same problem if I run the code in a Jupyter notebook instead of a script…

Thanks. I will read these materials.

To my knowledge, we currently have two high-quality HMC implementations: DynamicHMC.jl and AdvancedHMC.jl. I haven’t used either of them intensively (but I definitely plan to), so I’m curious what the main differences between the two packages are.

4 Likes

This is a good question. AFAIK DynamicHMC.jl predates AdvancedHMC.jl, so it would be best to ask the authors of the latter about what, if anything, they were missing that made them start their own package.

@Kai_Xu, could you provide some information in this regard? I have heard good things about both DynamicHMC.jl and AdvancedHMC.jl, and I guess they probably have their individual strengths. From a user perspective, it would be great to have some pointers on which might fit what kind of use cases better.

1 Like

It occurred to me that @goedman (hope you don’t mind me pinging) coded up everything in

https://github.com/StatisticalRethinkingJulia

for all available options, so perhaps he is the best person to ask about a comparison.

1 Like

Thanks for the ping @oschulz. Here are some of my thoughts.

What’s AdvancedHMC.jl (AHMC) for

AHMC was created for two purposes:

  1. To serve as the HMC backend of the Turing probabilistic programming language.

    • It extracts the HMC code written during my MPhil in 2016 from Turing.jl into a standalone package of its own.
  2. To serve as a research platform for developing novel variants of HMC algorithms.

    • We made some efforts so that new HMC algorithms can be developed quickly with AHMC. You might see some novel samplers available in AHMC shortly.

Note that because of these two purposes, using AHMC directly requires some basic understanding of HMC algorithms, and it might not be suitable for users of all levels. But you can achieve more with it because of its flexibility and GPU support. E.g. you can sample from an energy-based model defined by a neural network in Flux.jl, running on GPUs via CuArrays.jl.

See our poster from StanCon 2019 for more details on AHMC, in which we also statistically compare our no-U-turn sampler (NUTS) implementation to Stan’s using MCMCBenchmarks.jl.

The differences between AHMC and DynamicHMC.jl (DHMC)

The main difference is that AHMC aims to support a wide range of HMC variants, including both static and dynamic HMC algorithms, while DHMC, as the name indicates, implements one specific dynamic HMC variant: NUTS with multinomial sampling and the generalised no-U-turn criterion (AHMC also supports this variant). Again, our poster illustrates the variants that AHMC currently supports.

The Turing language and the differences in modelling pipeline

For end-users who focus on modelling, I highly recommend using Turing directly. The default HMC samplers are based on AHMC, but you can choose the NUTS implementation from DHMC as well. Plus, there are many other inference methods available in Turing.jl, e.g. importance sampling, SMC, particle Gibbs, variational inference, etc. You will need some of those if you are working with discrete data :slight_smile:

PS: A compositional interface that combines different MCMC samplers in a Gibbs algorithm is also available (see our AISTATS paper).

Turing has its own modelling language (quite similar to Stan’s), which automatically uses Bijectors.jl (another library extracted from Turing.jl) to deal with constrained variables. The use of Turing.jl and Bijectors.jl (roughly) corresponds to the use of LogDensityProblems.jl and TransformVariables.jl.

For analysis, Turing provides MCMCChains.jl, which implements a unified and easy way to compute many MCMC statistics as well as plot useful figures. It is also the default MCMC summary backend of CmdStan.jl.

9 Likes

The only differences I see are that AdvancedHMC supports the slice sampler and the turning criterion of the original NUTS, but since both have been superseded (by the multinomial sampler and the generalized turning criterion), I don’t think this is relevant for most users; these are less efficient variants, not something you would ever want to go back to.

Thanks @Tamas_Papp and @Kai_Xu!

Regarding Turing: we’re planning to integrate DynamicHMC and/or AdvancedHMC into BAT.jl, which is focused on use cases with (often complex) user-written likelihoods. So I’ll need to use the lower-level interfaces, and I’m very happy that we now have two high-quality (as far as I can judge) standalone HMC packages that can be used as backends (without pulling in large frameworks geared towards specific use cases).

1 Like

I started DynamicHMC.jl specifically to support user-coded likelihoods, because the packages that were around at the time preferred to expose a higher-level framework with directed acyclic graphs, which are not really suitable for the models I work with. I find that being able to debug, unit test, benchmark, and optimize my log posterior as a plain vanilla Julia function is really valuable.

If you are using (modern) NUTS, I would say that the sampler should be close to identical between AdvancedHMC.jl and DynamicHMC.jl, so perhaps you should just review the interfaces and use whichever you find convenient.

For DynamicHMC.jl, the API for defining a log density is documented here in detail:

https://tamaspapp.eu/LogDensityProblems.jl/dev/

You can use this interface to have the AD performed for you by any of the supported AD packages (and it is easy to switch later).
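As a rough sketch of that pipeline, following the names used in the DynamicHMC 2.0 worked example (the toy log posterior and its weak priors below are my own illustration, not from the docs):

```julia
using TransformVariables, LogDensityProblems, DynamicHMC, Random
import ForwardDiff  # AD backend used by ADgradient below

y = randn(100)  # toy data

# Plain Julia log posterior: Normal(μ, σ) likelihood with weak priors.
# θ arrives as a NamedTuple with fields matching the transformation.
function logpost(θ)
    μ, σ = θ.μ, θ.σ
    -length(y) * log(σ) - sum(abs2, y .- μ) / (2 * σ^2) - abs2(μ) / 2 - σ
end

t = as((μ = asℝ, σ = asℝ₊))             # σ constrained to be positive
P = TransformedLogDensity(t, logpost)    # log density on unconstrained ℝ²
∇P = ADgradient(:ForwardDiff, P)         # gradient via automatic differentiation

results = mcmc_with_warmup(Random.GLOBAL_RNG, ∇P, 1000)
posterior = transform.(t, results.chain) # back to named, constrained parameters
```

The point is that `logpost` is an ordinary Julia function that can be tested and benchmarked on its own before any sampling happens.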

If you want to discuss something or just need a new feature, just open an issue or ping me on this forum.