Analysis and Diagnostics for MCMC

Is there a package in Julia similar to the coda package in R? Most of the functions in coda are pretty straightforward, but it’s nice to have a bunch of MCMC diagnostics/analysis functions all grouped together. I see that the documentation for Klara has an MCMC Stats section, but it contains no information.

Mamba.jl has some diagnostic functions.

It would be great if, in the long run, one could factor out

  1. a common MCMC format,
  2. diagnostics for that format

into (two) separate packages. If you are interested, this would be a great project.


Hi @Tamas_Papp, thanks for the reply. Agreed: it would be great for Julia to have a standalone package for an MCMC object, plus diagnostics and visualization methods for that type. Would it be acceptable to begin by pulling some of the existing code from Mamba.jl? Or should this be written from scratch?

Would be nice to have the MCMC methods play well with StatsBase.StatisticalModel. I imagine good integration with solvers (for maximum a posteriori estimation) and Distributions.jl.


I am not sure how that would work. In my experience, one typically uses Bayesian methods for problems which are more complex than those one would handle in a GLM framework.

Instead of integrating things under a single umbrella framework, I prefer a modular approach:

  1. just coding the likelihood/posterior, ideally in an AD-compatible manner (potentially using Distributions.jl, of course),
  2. optimizing that with Optim.jl or BlackBoxOptim.jl, or whatever is the most convenient,
  3. when doing MCMC, just getting vectors from the posterior, for which I can calculate \hat{R} or ESS.

I think this way because when I run into problems with MCMC, it is much easier to debug small independent parts than the insides of some framework.
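To make the modular workflow concrete, here is a minimal sketch with a made-up toy model and placeholder draws (the model, the draws, and the variable names are purely illustrative):

using Distributions, Optim, MCMCDiagnostics

# 1. code the log posterior directly (toy model: normal likelihood, flat prior on μ)
data = randn(100) .+ 2
logpost(θ) = loglikelihood(Normal(θ[1], 1.0), data)

# 2. maximum a posteriori with a generic optimizer
opt = optimize(θ -> -logpost(θ), [0.0])
θ_map = Optim.minimizer(opt)

# 3. post-process vectors of posterior draws from any sampler
#    (placeholder draws; assuming effective_sample_size(x) returns the ESS of the draws)
μ_draws = θ_map[1] .+ 0.1 .* randn(1000)
ess = effective_sample_size(μ_draws)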

BTW, on the original question: since then I have written and registered

https://github.com/tpapp/MCMCDiagnostics.jl

Cool suite of packages! I was actually thinking mostly of Bayesian / GLM problems (e.g., Stata’s bayes:). Those could potentially be low-hanging fruit compared with tools for more complex scenarios.

It would be very nice to have a small number of methods for estimating asymptotic variance / ESS associated with particular functions [since these are one-to-one]. Particularly batch means, overlapping batch means, and a spectral variance approach mentioned in

Flegal, J. M., & Jones, G. L. (2010). Batch means and spectral variance estimators in Markov chain Monte Carlo. The Annals of Statistics.

These are relatively simple, and then I would certainly use the MCMCDiagnostics package for some of my work. I would be happy to try and contribute if you are interested, although I don’t know when I will have time.


Picking back up on this, it would be really nice to get this going. I think what @Tamas_Papp has provided in his package is a really nice start. I also really like his idea of having two packages: one for a common MCMC format, and one for diagnostics operating on that format.

I think a first step should be to get a package for storing chains. I also feel like these packages would probably get better traction if they were hosted under the JuliaStats organization on GitHub. Any thoughts there?

I would be happy to start the work - any volunteers to help with it? I’ll start a package and share the link this weekend.


I am interested. Things I use or have a use for: online statistics, the ability to deal with vectors of static arrays, and computing credibility bands and uncertainty measures and visualising them.
For me, the interface I would need would basically be to create an MCMC chain:

samples = MCSamples(x0, options)
...
while cond
   ...
   push!(samples, vector, accepted)
end

Useful options for the MCSamples object would be to keep only subsamples, to keep only accepted states and their count, to keep only certain coordinates, or to do only online statistics (running mean and variance). Here MCSamples could be an abstract type with different subtypes depending on the options.
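A bare-bones sketch of what one such subtype could look like, keeping the initial state and then every thin-th accepted state (the type and field names are made up, just to pin down the interface):

abstract type MCSamples end   # abstract, with option-specific subtypes as suggested

mutable struct ThinnedSamples{T} <: MCSamples
    states::Vector{T}    # stored states, starting from x0
    thin::Int            # keep every thin-th accepted state
    naccepted::Int       # total number of accepted proposals seen
end

ThinnedSamples(x0::T; thin = 1) where {T} = ThinnedSamples{T}([x0], thin, 0)

function Base.push!(s::ThinnedSamples, x, accepted::Bool)
    if accepted
        s.naccepted += 1
        s.naccepted % s.thin == 0 && push!(s.states, x)
    end
    s
end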

I have some preliminary things for MCMC online statistics in
https://github.com/mschauer/Bridge.jl/blob/master/src/mclog.jl

I think that there will potentially be a very large number of packages implementing different MCMC algorithms. Some of them may be so specialized that they can’t use a simple diagnostics package. However, for many cases, one ultimately wants asymptotic variances or effective sample sizes associated with one or more univariate functions, e.g. in order to provide asymptotically exact confidence intervals.

So I agree with Tamas’ approach, also because it keeps the scope focused. One could have separate methods for online estimation, as these would necessarily be somewhat different.

Here is a simple batch means algorithm for the asymptotic variance, with a test.

https://gist.github.com/awllee/fe8411179d280da65a4dfc5d6af0b120
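For anyone following along, the non-overlapping batch means idea itself fits in a few lines; this is a generic sketch of the estimator, not the code in the gist:

using Statistics

# Non-overlapping batch means estimate of the asymptotic variance of the
# sample mean of f(X_i): split the chain into batches, take batch means,
# and scale their variance by the batch size.
function batch_means_avar(xs::AbstractVector, f = identity; b = floor(Int, sqrt(length(xs))))
    n = length(xs)
    a = div(n, b)                                   # number of batches
    ys = f.(xs[1:(a * b)])                          # drop the remainder
    bmeans = [mean(ys[((k - 1) * b + 1):(k * b)]) for k in 1:a]
    b * var(bmeans)                                 # estimates lim n·Var(ȳ_n)
end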

I use

\tau = \frac{\text{effective sample size}}{\text{sample size}}

(from MCMCDiagnostics.ess_factor_estimate) for diagnostics, not for quantification of the variance. This is because, asymptotic guarantees aside, it can have a relatively large sampling variance for chains of a few thousand draws, even with good algorithms. I did some experiments here.

Because of this, I am distrustful of algorithms that terminate conditional on this statistic, no matter how it is obtained. MCMCDiagnostics.jl uses the relatively simple autocorrelation cutoff method, which has proven to be very robust (references are in the docstrings). When \tau > 0.2 in difficult problems, I am reasonably happy about the outcome; otherwise I get suspicious and experiment with variable transformations.
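In code, that check is roughly the following sketch (tau_factors and suspicious_columns are hypothetical helpers; draws are assumed to be arranged with one column per quantity):

using MCMCDiagnostics

# rows are draws, columns are quantities of interest
tau_factors(draws::AbstractMatrix) =
    [effective_sample_size(draws[:, j]) / size(draws, 1) for j in 1:size(draws, 2)]

# columns with τ below the rough 0.2 threshold warrant a closer look
suspicious_columns(draws) = findall(τ -> τ < 0.2, tau_factors(draws))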

Contributions are always welcome, but I am not sure this is the right package for online methods. Maybe OnlineStats.jl?

In practice, I find that I like to work with the following two representations:

  1. A vector of vectors in \mathbb{R}^n. These are the untransformed, “raw” coordinates in the parameter space, before the application of various transformations. I use these for convergence diagnostics, e.g. \hat{R}.

  2. The above, transformed into the actual variables. E.g. for a variable on (0, \infty) one would use exp; ContinuousTransformations.jl implements quite a few such transformations, also for vectors of variables. I like to use these for summarizing inference (quantiles, HPD regions) and posterior predictive checks.

A package about these is in the works, but I am waiting for v0.7 because named tuples provide such a nice interface for (2) above. I will announce it here soon.
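As a toy illustration of (2) on v0.7, without ContinuousTransformations.jl: if the raw coordinates are (\mu, \log\sigma), each draw can be mapped to a named tuple of constrained variables before summarizing (the draws below are placeholders):

using Statistics

raw_draws = [randn(2) for _ in 1:1000]       # placeholder "raw" ℝ² draws

transform(x) = (μ = x[1], σ = exp(x[2]))     # σ > 0 via exp

draws = transform.(raw_draws)

# summarize on the constrained scale, e.g. a 90% central interval for σ
σ_interval = quantile([d.σ for d in draws], [0.05, 0.95])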


On the static arrays point: any reasonable implementation should deal with anything <: AbstractVector, but I don’t see the advantage of static arrays for MCMC analysis. All the operations are super-cheap and I don’t think it is worth the specialization overhead.

To be honest, I am personally not as interested in diagnostics for conditional termination.

But I am interested in having widely used and consistent methods available for computing the integrated autocorrelation time / asymptotic variance. This is anyway what you are using to compute your diagnostic, and it is a good measure to provide alongside MCMC estimates and for assessing the performance of different Markov chain Monte Carlo algorithms.

I am also not that interested in online methods, at least as a first aim. The main thing I would be interested in is the availability of the different basic methods for estimating the asymptotic variance as a post-processing step, which requires very little complicated design or specialization.

I don’t get you. I argue against overspecialisation: MCMC state logging methods should be at least general enough to accept state vectors whose elements are geometric points, not restricted to Vector{Float64}.

In many cases, at least for estimating the asymptotic variance, which is the main algorithmic part, specialization should come for free. In the gist I posted, the function signature is

function estimateAvar(xs::Vector{T}, f::F = x -> x) where F<:Function where T

which I have used before with T being an SVector or MVector, and really it could be anything.

Of course, one might want to further specialize beyond Vector{T}, and I assume that is feasible.

Anyway, my main question is whether different asymptotic variance estimators should be part of MCMCDiagnostics.jl or if they belong in another package.

As I said, I am happy to accept PRs, except for online algorithms (in the style of OnlineStats.jl), which should go there instead IMO.

I am sorry, I don’t fully understand. If you accept all kinds of inputs, overspecialization (of the generated code) is precisely what you get. E.g. I work with StaticVector types in DynamicHMC when I implement algorithms that benefit from them; currently these generate specialized methods for diagnostics.

But an MCMC diagnostics package may just want to (quietly, automatically) convert everything to Vector{Float64}; otherwise specialized code is generated for everything, which may not be worth it. In any case, this is not a major cost at the moment, especially on v0.7, so I may not change it unless someone opens an issue.
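The conversion pattern I mean is roughly the following (my_diagnostic is a placeholder, not an actual function in the package):

# only one method body does the work; every other real-valued AbstractVector
# is funneled through a conversion, so specialized code is not generated per input type
my_diagnostic(x::Vector{Float64}) = sum(abs2, x) / length(x)   # placeholder computation
my_diagnostic(x::AbstractVector{<:Real}) = my_diagnostic(convert(Vector{Float64}, x))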

Also, a nice (if a bit NUTS-specific) thread on diagnostics recently on the Stan forum:

I plan to implement these in Julia soon.

Just to get the process started, I created a package here for storing chains. I think the first step is setting up a struct that essentially holds a vector of vectors, as stated. After that, we can start to expand to have things like push!. I’d also like to figure out a way to store chains that have birth/death moves, but I’ve never come across a good method for that. Any ideas?
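To make that concrete, a bare-bones version of the struct could look like this (the names are placeholders, not a final API):

struct Chain{T}
    draws::Vector{Vector{T}}   # one parameter vector per stored iteration
    names::Vector{Symbol}      # parameter names, one per coordinate
end

Chain(draws::Vector{Vector{T}}) where {T} =
    Chain{T}(draws, [Symbol(:θ, i) for i in 1:length(first(draws))])

Base.length(c::Chain) = length(c.draws)
Base.push!(c::Chain, x::AbstractVector) = (push!(c.draws, collect(x)); c)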