How do I distribute Mamba mcmc function iterations on different processors?

I’m not 100% sure what your model is, since the code snippet you posted seems to only have one level. Do you think you could put together a minimal working example? That said, it is definitely possible to do a (hierarchical) logistic regression in Turing. Here’s a basic example showing one way to do it:

using Turing
using Distributions

# define logistic (invlogit) function
logistic(x) = 1 / (1 + exp(-x))
# make up some fake parameters and data
npop = 2
ntime = 10
alpha_hyper = Normal(1, 2)
beta_hyper = Normal(0, 0.4)
alpha = rand(alpha_hyper, 1, npop)  # one slope per population
beta = rand(beta_hyper, 1, npop)    # one intercept per population
x = randn(ntime, npop)
p = logistic.(alpha .* x .+ beta)
y = rand.(Bernoulli.(p))

@model function hierarchical_logistic_reg(x, y)
    # priors on hyperparameters
    mu_alpha ~ Normal(0, 5)
    sigma_alpha ~ Gamma(2, 3)
    mu_beta ~ Normal(0, 5)
    sigma_beta ~ Gamma(2, 3)
    # per-population regression parameters, drawn from the shared hyperpriors
    alpha ~ filldist(Normal(mu_alpha, sigma_alpha), size(x, 2))
    beta ~ filldist(Normal(mu_beta, sigma_beta), size(x, 2))
    # success probability for each observation
    p = logistic.(alpha' .* x .+ beta')
    # observation likelihood
    y ~ arraydist(Bernoulli.(p))
end

model = hierarchical_logistic_reg(x, y)
chn = sample(model, NUTS(), 1000)
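
If you want a quick numeric summary before plotting, MCMCChains (which Turing uses for its output) has summary functions; describe(chn) should print posterior means, standard deviations, and convergence diagnostics:

describe(chn)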

using StatsPlots
plot(chn)

If you’re just starting out with Bayesian modeling in Julia, I’d probably recommend using Turing; at this point it’s a more recent and more actively developed library than Mamba.
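
On your original question of distributing the sampling itself across processors: the simplest option in Turing is to parallelize across chains rather than inside the model. Something like the following should work (the chain count of 4 here is just an illustrative choice):

# sample 4 chains in parallel, one per thread
# (start Julia with e.g. `julia --threads 4` for this to help)
chn = sample(model, NUTS(), MCMCThreads(), 1000, 4)

# or spread the chains over worker processes instead
# (the model definition also needs to exist on every worker):
# using Distributed; addprocs(4)
# @everywhere using Turing
# chn = sample(model, NUTS(), MCMCDistributed(), 1000, 4)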

One other point about doing distributed computations inside the model: double-check that it’s actually giving you a speedup over serial computation. For the example I gave above, the overhead of the parallel likelihood calculations actually makes that approach slower than a simpler serial version.
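
A quick way to check is to benchmark the two versions of the likelihood directly. Here’s a minimal sketch using BenchmarkTools; the loglik_serial/loglik_threaded helpers are made up for illustration, reusing the p and y from the fake data above:

using BenchmarkTools, Distributions

# serial log-likelihood over all observations
loglik_serial(p, y) = sum(logpdf.(Bernoulli.(p), y))

# threaded version: one partial sum per population column
function loglik_threaded(p, y)
    partial = zeros(size(p, 2))
    Threads.@threads for j in 1:size(p, 2)
        partial[j] = sum(logpdf.(Bernoulli.(view(p, :, j)), view(y, :, j)))
    end
    return sum(partial)
end

@btime loglik_serial($p, $y)
@btime loglik_threaded($p, $y)

With only two small columns here, the threaded version will almost certainly lose to the serial one, which is exactly the overhead problem I mean.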