MCMC landscape

All the same it seems, just different data :smiley:
Thanks for the link, I’ll try to process the material and see how far I get!

EDIT: There is actually a topic related to my problem (Differentiating through a Jump Problem).

EDIT2: Just for completeness, my current conclusions:

  1. Following the linked topic above, Gillespie simulations with discrete states and discrete random selection of reactions do not seem to be (automatically) differentiable. You could make them continuous as an SDE so that AD works (see the linked topic for more info), but that’s not the path I want to take. So I’m now using samplers that don’t need a gradient, i.e. AdvancedMH as mentioned by @cpfiffer (a rough sketch of what that looks like follows after this list).
    If there are more gradient-free samplers in Julia, I would be glad to know! At first glance, the currently implemented MH sampler generally works, but can yield a low number of effective samples.
  2. Approximate Bayesian computation (ABC) methods should be applicable to these problems; they do not need the evaluation of a potentially costly likelihood, but are – as the name suggests – only an approximation of the posterior (how good has to be checked case by case; see the rejection-ABC sketch further below). I will try them out once I run into runtime limitations. Potential Julia implementations: ApproxBayes, GpABC, ApproximateBayesianComputing, ABC (and maybe more).
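
For point 1, roughly something like the following is what I mean by a gradient-free run with AdvancedMH.jl. This is only a minimal sketch: `gillespie_loglik` is a hypothetical placeholder for whatever (possibly simulation-based) log-density you can actually evaluate for your jump model, not real model code.

```julia
using AdvancedMH, Distributions, LinearAlgebra

# Hypothetical placeholder: replace with the log-density of your jump model
# evaluated at parameter vector θ (no gradient needed).
gillespie_loglik(θ) = all(θ .> 0) ? -sum(abs2, log.(θ)) : -Inf

# Wrap the black-box log-density and use a random-walk Metropolis-Hastings proposal.
model   = DensityModel(gillespie_loglik)
sampler = RWMH(MvNormal(zeros(2), I))

# Draw samples; each transition stores the parameters and log-probability.
chain     = sample(model, sampler, 10_000)
θ_samples = [t.params for t in chain]
```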
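
And for point 2, a hand-rolled sketch of the basic rejection-ABC idea (not using any of the packages listed above). `simulate` and `summary_distance` are assumed placeholders for the Gillespie forward simulator and for comparing simulated vs. observed summary statistics.

```julia
using Distributions

# Keep proposing parameters from the prior and accept those whose simulated
# output is within `tolerance` of the observed data.
function rejection_abc(observed, prior, simulate, summary_distance;
                       n_samples = 1_000, tolerance = 0.1)
    accepted = Vector{Vector{Float64}}()
    while length(accepted) < n_samples
        θ   = rand(prior)                     # candidate from the prior
        sim = simulate(θ)                     # forward-simulate the model (placeholder)
        if summary_distance(sim, observed) <= tolerance
            push!(accepted, θ)                # keep θ if the simulation is close enough
        end
    end
    return accepted
end

# Hypothetical usage:
# prior = product_distribution([Uniform(0, 1), Uniform(0, 1)])
# posterior_samples = rejection_abc(data, prior, simulate, summary_distance)
```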