Hello,
I am attempting to perform Gibbs sampling using a mixture of close formed conditional posteriors and other samplers. Ultimately, I will use this to estimate parameters in a nonlinear model where I can write out the posterior distribution for some components but not all.
In the simple working example below, I simulate data from a normal distribution with known mean = 0 and variance parameter varphi. I know the closed form for conditional posteriors for varphi (https://en.wikipedia.org/wiki/Conjugate_prior - Normal with known mean). Since I can write out the conditional posterior distributions for varphi, I can sample from the conditional posteriors directly using MH() with a static proposal, or so I thought. When I run the code below, I notice that the updates for varphi are never accepted - and they should be accepted every time in theory.
My question is, why isn’t MH() behaving as I would expect? This is pretty odd.
Replicable code is pasted below. Thank you for any insight. This seems like a use case that would be generally relevant.
using Turing
using Distributions
using MCMCChains
using StatsPlots
using PDMats
# Simulation setup
n = 1000 # number of subjects
# Linear regression model
@model function J_samp(y, n)
varphi ~ Gamma(1, 1)
y ~ MvNormal(zeros(n), PDiagMat(fill(1, n) ./ varphi))
# Gamma parameter for MH() proposals
v = (1 + 0.5 * sum((y) .^ 2))
return y
end
# Generate y
model = J_samp(missing, n)
chain = sample(model, Prior(), 1)
y = generated_quantities(model, chain)[1]
# Perform Gibbs sampling
n_samples = 20000
model2 = J_samp(y, n)
samp_meth = Gibbs(
MH(:varphi => v -> Gamma(1 + 1000/2, (1 / v))),
)
chain2 = sample(model2, samp_meth, n_samples)
plot(group(chain2, :varphi))