Issue with the Metropolis adjusted Langevin sampler from AdvancedMH.jl

I am trying to use the Metropolis-adjusted Langevin algorithm (MALA) sampler from AdvancedMH.jl, `AdvancedMH.MALA`. As a simple check that things are working correctly, I tried sampling from a zero-mean multivariate normal distribution. Here is the code:

import AdvancedMH, LogDensityProblems, Distributions
using LinearAlgebra: I  # the proposal below uses the identity `I`

struct TheNormalLogDensity{M}
    A::M
end

# first-order capability: the problem can provide gradients
LogDensityProblems.capabilities(::Type{<:TheNormalLogDensity}) = LogDensityProblems.LogDensityOrder{1}()

LogDensityProblems.dimension(d::TheNormalLogDensity) = size(d.A, 1)
LogDensityProblems.logdensity(d::TheNormalLogDensity, x) = -x' * d.A * x / 2

function LogDensityProblems.logdensity_and_gradient(d::TheNormalLogDensity, x)
    return -x' * d.A * x / 2, -d.A * x
end
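To rule out a mistake in the gradient itself, one can compare the analytic form -A*x against a central finite-difference approximation of the log density. A minimal standalone sketch (the matrix `A` and test point here are arbitrary choices for illustration, not values from the run above):

```julia
# Sanity check: the analytic gradient of -x'Ax/2 should match finite differences.
A = [1.0 0.3; 0.3 2.0]          # any symmetric positive-definite matrix
logp(x) = -x' * A * x / 2
grad(x) = -A * x

x = [0.7, -1.2]
h = 1e-6
# central finite differences, one coordinate at a time
fd = [(logp(x .+ h .* (1:2 .== i)) - logp(x .- h .* (1:2 .== i))) / (2h) for i in 1:2]
@assert maximum(abs.(fd .- grad(x))) < 1e-6
```

The gradient checks out, so the `LogDensityProblems` interface above should be returning the right values.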

Σ = [1.5 0.35; 0.35 1.0]  # target covariance
σ² = 0.01                 # MALA step size
# proposal: drift (σ²/2)·∇logπ plus N(0, σ²I) noise
spl = AdvancedMH.MALA(g -> Distributions.MvNormal((σ² / 2) .* g, σ² * I))

chain = AdvancedMH.sample(TheNormalLogDensity(inv(Σ)), spl, 500000; initial_params=ones(2))
data = stack([c.params for c in chain])  # 2×500000 matrix of samples

I can then compute an empirical covariance matrix from the sampled data. Since the target has zero mean, no mean subtraction is needed, and with this many samples the estimate should be close to Σ.

# this should be close to Σ
C = data * data' / size(data, 2)

But actually, I get an empirical covariance that seems very different from Σ:

julia> data * data' / size(data, 2)
2×2 Matrix{Float64}:
 0.507211  0.124855
 0.124855  0.341631
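To check that the covariance estimator itself is sound, here is a sketch that draws i.i.d. samples directly from the same target with `Distributions.rand` and applies the identical formula; it recovers Σ, so the discrepancy above must come from the sampler:

```julia
import Distributions

# i.i.d. draws straight from the target should reproduce Σ
Σ = [1.5 0.35; 0.35 1.0]
iid = Distributions.rand(Distributions.MvNormal(zeros(2), Σ), 500_000)  # 2×N matrix
C_iid = iid * iid' / size(iid, 2)
@assert maximum(abs.(C_iid .- Σ)) < 0.05   # close to Σ, unlike the MALA output
```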

Am I doing something wrong here?


This seems to be a bug: MALA implementation results in the wrong covariance · Issue #95 · TuringLang/AdvancedMH.jl · GitHub