Converting a BUGS model to Turing, simple linear regression

Hello everyone,

I’m thinking about translating a course that focuses on WinBUGS to Turing so the students can use something a little more modern if they want. I’m new to Julia and Turing.

Here is the WinBUGS model:

for (i in 1:N) {
    Y[i] ~ dnorm(mu[i],tau)
    mu[i] <- alpha + beta * (x[i] - mean(x[]))
}
alpha ~ dnorm(0, 0.0001)
beta ~ dnorm(0, 0.0001)
tau ~ dgamma(0.001, 0.001)
sigma <- 1.0/sqrt(tau)
# data
list(N=5, x=c(1,2,3,4,5), Y=c(1,3,3,3,5))
# initial values
list(alpha = 0.1, beta = 0.6, tau = 1)

and here is my attempt using Turing:

using Turing, Distributions
using DataFrames
using Random
using Statistics

data = DataFrame(x = [1, 2, 3, 4, 5], y = [1, 3, 3, 3, 5])

@model function simplereg(x, y)
    # intercept prior
    α ~ Normal(0, 10e-4)
    # coefficient prior
    β ~ Normal(0, 10e-4)
    # variance prior
    τ ~ Gamma(10e-3, 100)
    σ = 1/sqrt(τ)

    for i in 1:length(x)
        y[i] ~ Normal(α + β*(x[i] - mean(x)), σ)
    end
end

iterations = 10000
model = simplereg(data.x, data.y)
chain = sample(model, MH(), iterations)

This runs, but the results are wildly different: Turing gives a mean alpha of about 0.0001, beta about 0.00003, and tau about 0.1, while BUGS gives alpha about 3, beta about 0.8, and tau about 1.9.
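(As a rough sanity check on the BUGS numbers: with priors this vague, the posterior means should land near the ordinary least-squares fit of the centered model. A quick sketch, using only the data above:)

```julia
using Statistics

x = [1, 2, 3, 4, 5]
y = [1, 3, 3, 3, 5]

# Least-squares fit of y = alpha + beta*(x - mean(x)):
# because x is centered, the intercept is just mean(y).
xc = x .- mean(x)
beta_hat = sum(xc .* y) / sum(xc .^ 2)
alpha_hat = mean(y)

println("alpha ≈ $alpha_hat, beta ≈ $beta_hat")  # alpha ≈ 3.0, beta ≈ 0.8
```

So alpha ≈ 3 and beta ≈ 0.8 are the right ballpark, matching what BUGS reports.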

Also, I didn’t see a simple way to provide initial values - how is that handled in Turing?

I saw a way to use burn-in depending on which sampler is selected, but MH doesn’t use it. Why is that?

Distributions.jl (which Turing and most of the Julia ecosystem uses) parameterizes the normal distribution by its standard deviation, whereas BUGS's dnorm uses a precision (1/variance). So dnorm(0, 0.0001) means variance 10^4, i.e. standard deviation 100, and your Turing code should use Normal(0, 100) instead.
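In other words, the conversion is (a small sketch, assuming the usual tau = 1/sigma² relationship):

```julia
using Distributions

# BUGS: dnorm(mean, tau), where tau is a precision (1/variance).
# Distributions.jl: Normal(mean, sigma), where sigma is a standard deviation.
bugs_precision_to_sd(tau) = 1 / sqrt(tau)

sd = bugs_precision_to_sd(0.0001)  # ≈ 100.0
prior = Normal(0, sd)
println(var(prior))                # ≈ 10000.0, i.e. 1/0.0001
```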

You’ll also want to check how Gamma is parameterized; I don’t remember off the top of my head whether Distributions.jl differs from WinBUGS in that regard. (Edit: it looks like you did change the Gamma parameterization from WinBUGS, so I guess you already checked.)
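For anyone landing here later: Distributions.jl's Gamma(α, θ) takes shape and scale, while BUGS's dgamma(a, b) takes shape and rate, so the scale is the reciprocal of the rate. A quick sketch of the conversion (the helper name is mine):

```julia
using Distributions

# BUGS: dgamma(shape, rate). Distributions.jl: Gamma(shape, scale), scale = 1/rate.
bugs_dgamma(shape, rate) = Gamma(shape, 1 / rate)

d = bugs_dgamma(0.001, 0.001)  # corresponds to Gamma(0.001, 1000.0)
println(mean(d))               # shape * scale ≈ 1.0
```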


Whoops, I overlooked that. Time to double-check all the parameterizations. Thanks for catching that; what a silly mistake.

@model function simplereg(x, y)
    α ~ Normal(0, 100)
    β ~ Normal(0, 100)
    τ ~ Gamma(0.001, 1000)
    σ = 1/sqrt(τ)

    for i in 1:length(x)
        y[i] ~ Normal(α + β*(x[i] - mean(x)), σ)
    end
end

Okay, I think this should be equivalent, but the results are still way off; tau is now coming out around 0.001. What am I missing here?

I don’t think there’s a problem with your model; the default MH proposal is just a very bad fit for these parameters (when I ran the code, the R-hat values looked fine, but that was apparently very misleading). If you use NUTS, the estimates match what you report for WinBUGS.
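Here's a minimal end-to-end sketch of that, repeating your corrected model and sampling with NUTS() at its default settings (1000 iterations is an arbitrary choice on my part):

```julia
using Turing, Statistics

@model function simplereg(x, y)
    α ~ Normal(0, 100)
    β ~ Normal(0, 100)
    τ ~ Gamma(0.001, 1000)
    σ = 1 / sqrt(τ)
    for i in 1:length(x)
        y[i] ~ Normal(α + β * (x[i] - mean(x)), σ)
    end
end

x = [1, 2, 3, 4, 5]
y = [1, 3, 3, 3, 5]

# NUTS adapts its own step size during warm-up, unlike the default MH proposal.
chain = sample(simplereg(x, y), NUTS(), 1000)
describe(chain)
```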


NUTS did the trick! Thank you very much.