Help converting PyMC3 switchpoint analysis code to Turing.jl

Hey all,

I’ve been trying to practice Turing.jl and Julia as a whole, and it’s been a great experience so far!

I am having some trouble coming from a PyMC3 background, so any help and direction is much appreciated!

Given the code from the (amazing) book Bayesian Methods for Hackers:

import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    alpha = 1.0/count_data.mean()  # Recall count_data is the
                                   # variable that holds our txt counts
    lambda_1 = pm.Exponential("lambda_1", alpha)
    lambda_2 = pm.Exponential("lambda_2", alpha)
    
    tau = pm.DiscreteUniform("tau", lower=0, upper=n_count_data - 1)

    idx = np.arange(n_count_data) # Index
    lambda_ = pm.math.switch(tau > idx, lambda_1, lambda_2)

    observation = pm.Poisson("obs", lambda_, observed=count_data)

    step = pm.Metropolis()
    trace = pm.sample(10000, tune=5000,step=step)

My translation is:

using Turing, Distributions, Statistics

@model text_model(y) = begin
    α = 1/mean(y)
    λ₁ ~ Exponential(α)
    λ₂ ~ Exponential(α)
    
    τ ~ DiscreteUniform(1, length(y))
    
    function equality(idx)
        if τ > idx
            return λ₁
        end
        return λ₂
    end
    
    times = 1:length(y)
    λ = map(equality, times)

    for idx in times
        y[idx] ~ Poisson(λ[idx])
    end
    return τ
end

model = text_model(data.count)
trace = sample(model, Gibbs(HMC(0.01, 10, :λ₁, :λ₂), PG(20, :τ)), 1000)

There are no errors (other than the occasional warning about proposals being rejected for numerical reasons), and the sampling runs without a problem.

The problem is that I get the wrong results :expressionless: .

Question:
Do you think the code translation makes sense? I’m not super sure about the switchpoint method I am using.

Thanks in advance Julia and Turing Communities!

What exactly is wrong in the results? Also, in the Python script the upper bound of the discrete uniform is n_count_data - 1, while in the Turing script it is length(y). Maybe that is the problem? It is hard to examine without data. Is the data available somewhere?

Thanks for reaching out @trappmartin. The reason the Python version uses n_count_data - 1 is that Python is 0-indexed, so the length of a list (the data) differs from the index of its last item.
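
To make that concrete, here is a minimal sketch of how the bounds line up (n is just a stand-in for the data length, not the actual counts):

using Distributions

n = 10                      # stand-in for length(y)
d = DiscreteUniform(1, n)   # the Turing/Julia prior, 1-based
minimum(d), maximum(d)      # (1, 10) — the same n support points as
                            # PyMC3's DiscreteUniform(lower=0, upper=n-1), shifted up by one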

The code is discussed here: Jupyter Notebook Viewer

And does the switch statement make sense to you? Is this how you would implement pm.math.switch in Turing?

Thanks again!

This seems to work fine:

@model text_model(y) = begin
    α = 1/mean(y)
    λ₁ ~ Exponential(α)
    λ₂ ~ Exponential(α)
    τ ~ DiscreteUniform(1, length(y))
    for idx in 1:length(y)
        y[idx] ~ Poisson(τ > idx ? λ₁ : λ₂)
    end
    return τ
end
sample(text_model(rand(1:100, 100)), MH(), 1000)
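
If you want to check the estimated switchpoint, here is a rough sketch of inspecting the returned chain (same toy data as above; the exact accessors can vary a bit between Turing/MCMCChains versions):

chain = sample(text_model(rand(1:100, 100)), MH(), 1000)
describe(chain)       # posterior summaries for λ₁, λ₂, and τ
τ_draws = chain[:τ]   # draws of the switchpoint for closer inspection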

Edit: Man, Turing is neat!! Looking at that Python code above brought back some unpleasant memories!

Your results will match approximately if you change α = 1/mean(y) to α = mean(y) in your code: Exponential(α) in Distributions.jl is parameterized by the scale (the mean), whereas PyMC3's Exponential(α) is parameterized by the rate.
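
To spell out the difference, a small sketch (α here is just an illustrative rate, not the value from the data):

using Distributions

α = 0.05                 # PyMC3: pm.Exponential("λ", α) is rate-parameterized, so its mean is 1/α
mean(Exponential(1/α))   # 20.0 — Distributions.jl's Exponential takes the scale, i.e. the mean
# So α = 1/mean(y) with pm.Exponential(α) in PyMC3 corresponds to Exponential(mean(y)) in Turing.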
