Which MCMC library for continuous time stochastic process inference?


I am just getting started in Julia, and there seem to be many options for doing Bayesian inference. I have been looking at Mamba and Klara, but I am not sure which is better suited to what I want to do: inference for continuous-time Markov processes observed at discrete intervals.

In this scenario, writing the likelihood function requires a matrix exponential. HMC would then require computing derivatives of the matrix exponential in a numerically stable fashion, which is apparently not trivial (see this Stan issue: https://github.com/stan-dev/math/issues/32). Has anybody here used Julia for this kind of application?
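To make the setup concrete, here is a sketch of the kind of likelihood I mean (the states, observation times, and rate matrix are made up for illustration):

```julia
using LinearAlgebra  # `exp` of a Matrix is the matrix exponential (`expm` in older Julia)

# Log-likelihood of a CTMC with rate matrix Q, observed at discrete times:
# the transition kernel over a gap Δt is exp(Q*Δt), and the log-likelihood
# sums the log transition probabilities along the observed path.
function ctmc_loglik(Q::AbstractMatrix, states::Vector{Int}, times::Vector{Float64})
    ll = 0.0
    for i in 1:length(states)-1
        P = exp(Q * (times[i+1] - times[i]))   # transition probabilities over the gap
        ll += log(P[states[i], states[i+1]])
    end
    return ll
end

# two-state example: rate 0.5 for 1→2, rate 0.3 for 2→1
Q = [-0.5 0.5; 0.3 -0.3]
ctmc_loglik(Q, [1, 2, 2, 1], [0.0, 1.0, 2.5, 4.0])
```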


Can we include reasons why posts are withdrawn?


Can sequential Monte Carlo work for this? Turing.jl has an SMC sampler.

There is also this: https://github.com/JuliaPOMDP


I was short-sighted and pointed him to a continuous-time stochastic process library.

But that’s not what he needed. Not sure if this response is cluttering the discussion now, you can remove it if you like.


Hi, I am currently working on such a model. There are various issues and workarounds:

  1. For the matrix exponential, you can remove the balancing code from the Julia implementation and do the rest in native Julia. This should be OK if your model's transition matrix is not too unbalanced. This works with ForwardDiff.jl, and presumably with reverse-mode AD too, but I could not get the latter working because of an issue.
  2. I made a library for HMC likelihood calculations, available here. One test fails; I plan to fix it next week when I am less busy. Otherwise it is fine: it uses log probabilities for numerical stability and works with AD.
  3. Mamba is more mature; Klara is more hands-on. Mamba has no AD and Klara does, but since I cannot get reverse-mode AD working at the moment, I am using Mamba; forward-mode AD is actually not much of an improvement.

I am happy to answer questions about details.


I wouldn’t worry about it. If anything, my preference is that nothing is ever deleted under any circumstances.


Forgot to mention a few important differences between Klara and Mamba:

  1. in Klara, you can work directly with the likelihood function, while in Mamba you need to use a trick, since it prefers DAGs for model construction. But the trick is simple.
  2. Mamba has much better facilities for convergence analysis.
  3. Neither has tools for posterior predictive checks, so I am working on a library for that (you can find it among my repos, but it is heavily WIP).

BTW, I (think I) fixed the heisenbug in the tests; the tests are still running.


Thanks for your tips. Right now I am just trying to get a simple two-state discrete-time Markov chain working in Mamba, but I am having some difficulty figuring out how to use it.

Do you know if it is possible to implement gradexpm by hand following the strategy used by Theano, and then use automatic differentiation for everything except expm?
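For concreteness, the kind of thing I mean: the directional (Fréchet) derivative of expm can be read off an augmented block matrix exponential. A sketch of that identity (not necessarily Theano's exact implementation):

```julia
using LinearAlgebra

# Fréchet derivative of the matrix exponential via the block identity
#     exp([A E; 0 A]) = [exp(A)  L(A,E); 0  exp(A)]
# so a single 2n×2n matrix exponential gives both exp(A) and its
# derivative L(A,E) in direction E.
function expm_frechet(A::AbstractMatrix, E::AbstractMatrix)
    n = size(A, 1)
    M = [A E; zero(A) A]                  # augmented block matrix
    F = exp(M)
    return F[1:n, 1:n], F[1:n, n+1:2n]    # (exp(A), L(A,E))
end

A = [-0.5 0.5; 0.3 -0.3]
E = [1.0 0.0; 0.0 0.0]                    # perturb the (1,1) entry
expA, dexpA = expm_frechet(A, E)
```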


Sorry, I don’t know the approach used by Theano and I have no time at the moment to read up on it. You can always use AD for steps implemented in pure Julia.


I was just wondering if I can use AD and plug in a hand-implemented gradient for a certain function.


Just to answer my own question: it looks like a custom derivative in ForwardDiff.jl can be implemented by doing something similar to the code at the bottom of dual.jl.
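A minimal sketch of what I have in mind, overloading on `Dual` in that spirit (here `f`/`df` are made-up stand-ins for a primitive with a hand-coded derivative; check with the ForwardDiff developers for the currently recommended mechanism):

```julia
using ForwardDiff
using ForwardDiff: Dual, value, partials

f(x::Real) = x^2        # pretend this is an AD-opaque primitive
df(x::Real) = 2x        # its hand-implemented derivative

# chain rule: propagate the dual's partials through the hand-coded df
f(d::Dual{T}) where {T} = Dual{T}(f(value(d)), partials(d) * df(value(d)))

ForwardDiff.derivative(f, 3.0)   # → 6.0
```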


In principle yes, and it is simple for ℝ → ℝ functions, but last time I checked there was an issue with matrix-to-matrix functions; in particular, my impression was that you need to wait for this. But ask the ForwardDiff.jl crowd, they are really friendly and helpful.

That said, forward AD is not what you want for likelihoods/posteriors, which are ℝⁿ → ℝ. You want reverse mode AD.

And finally, if you are working with a 2×2 system, you can calculate expm in closed form. That may be a good way to get started and explore the tools currently available.
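For example (a sketch, writing the two transition rates as `a` and `b`): a rate matrix `Q = [-a a; b -b]` has eigenvalues 0 and -(a+b), which gives the closed form directly.

```julia
using LinearAlgebra

# Closed-form matrix exponential for the two-state rate matrix
#     Q = [-a  a;  b  -b]
# with eigenvalues 0 and -(a+b):
#     exp(Qt) = ( [b a; b a] + exp(-(a+b)t) * [a -a; -b b] ) / (a+b)
function expm_2state(a::Real, b::Real, t::Real)
    s = a + b
    e = exp(-s * t)
    return [(b + a*e) (a - a*e);
            (b - b*e) (a + b*e)] / s
end

a, b, t = 0.5, 0.3, 2.0
P = expm_2state(a, b, t)
P ≈ exp([-a a; b -b] * t)   # agrees with the generic matrix exponential
```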


This may be of relevance: