Simulation with Mamba.jl or something else

As background, I’m working through Allen Downey’s Think Bayes, 2nd edition. My doubly pedagogical approach is to translate all of the examples and exercises from the original Python into equivalent Julia code. Through chapter 18 of 20 this has been challenging but doable: I’ve been able to come up with native Julia solutions that produce identical results, or, where random variables are concerned, approximately identical ones. In Chapter 19, however, he demonstrates an example of MCMC using pymc3. After looking at the options, it appeared to me that Mamba most closely reflects the spirit of pymc3, especially since the Mamba docs actually mention it. But I’ve been having trouble getting a solution that is acceptably similar to the Python example.

I’ll admit I’m a little out of my depth here (there’s a reason I’m doing the coursework), and the possibility that I’ve come up with a facepalm solution is pretty high.

Is Mamba the best package to use for this MCMC simulation? If so, any idea why my model isn’t accurately reflecting the pymc3 model?

Here’s a short snippet of the Python that produces the “correct” values:

import pymc3 as pm

# prior lam ~ Gamma(1.4, 1) with an observed Poisson count of 4 goals
with pm.Model() as model:
    lam = pm.Gamma('lam', alpha=1.4, beta=1.0)
    goals = pm.Poisson('goals', lam, observed=4)

# 500 posterior draws per chain
options = dict(return_inferencedata=False)
with model:
    trace = pm.sample(500, **options)

sample_post_pymc = trace['lam']

# posterior predictive samples of goals
with model:
    post_pred = pm.sample_posterior_predictive(trace)

The Julia code that I’ve come up with looks like this:

using Mamba

# observed data: the 4 goals from the pymc3 example; a length-1 vector
# because :goals is declared as a 1-dimensional node below
line = Dict{Symbol, Any}(:goals => [4])

model = Model(
  goals = Stochastic(1,
    (lam) -> Poisson(lam),
    false                        # don't monitor the observed node
  ),
  # Distributions.jl's Gamma(α, θ) is shape/scale, while pymc3's
  # Gamma(alpha, beta) is shape/rate; with the second parameter at 1
  # the two parameterizations coincide
  lam = Stochastic(() -> Gamma(1.4, 1.0))
);

scheme = [NUTS([:lam])];

setsamplers!(model, scheme);

# one set of initial values per chain (a single chain here)
inits = [
  Dict{Symbol, Any}(
    :lam => rand(Gamma(1.4, 1.0)),
    :goals => line[:goals]
  )
]

# 2000 iterations, the first 1000 discarded as burn-in
sim = mcmc(model, line, inits, 2000, burnin=1000, chains=1)

# posterior predictive draws of :goals, flattened to the 1000 retained samples
x = predict(sim, :goals)
preds = reshape(x.value, 1000)
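
In case it’s relevant to the “or something else” part of the question: my understanding is that the same model could also be written in Turing.jl, roughly as below. This is just a sketch of what I’d expect the equivalent to look like (the model name world_cup is mine, and the posterior predictive is a manual loop rather than a built-in); I haven’t verified it against the book’s values.

using Turing

@model function world_cup(goals)
    # same prior and likelihood as the pymc3 model
    lam ~ Gamma(1.4, 1.0)
    goals ~ Poisson(lam)
end

# 1000 posterior draws with NUTS, analogous to pm.sample
chain = sample(world_cup(4), NUTS(), 1000)
sample_post = vec(chain[:lam])

# manual posterior predictive: one Poisson draw per posterior sample of lam
preds_turing = [rand(Poisson(l)) for l in sample_post]

I believe Turing also has a predict function that could replace the manual loop, but the loop keeps the sketch self-contained.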