Help with first non-trivial Turing example

I’d like some help writing my first non-trivial example in Turing.jl. I’m not even sure this model can be expressed in the Turing modeling framework. Below is the code I have so far, along with the current error:

# Import Turing and Distributions.
using Turing, Distributions

n = 1000   # number of respondents
m = 10     # number of items
k = 4      # response categories per item
theta = randn(n)   # latent abilities
b = zeros(k,m)     # item thresholds; each column must be ascending
for i in 1:m
    b[1,i] = randn()
    for j in 2:k
        # each threshold is drawn above the previous one, so b[:,i] stays ascending
        dd = truncated(Normal(), b[j-1,i], Inf)
        b[j,i] = rand(dd)
    end
end

logit = x -> log(x / (1 - x))           # logit link
invlogit = x -> exp(x)/(1 + exp(x))     # inverse logit (logistic function)
y = zeros(m,n)
probs = zeros(k,m,n)
for p in 1:n
    for i in 1:m
        probs[1,i,p] = 1.0
        for j in 1:(k-1)
            Q = invlogit(theta[p] - b[j,i])
            probs[j,i,p] -= Q
            probs[j+1,i,p] = Q
        end
        y[i,p] = rand(Categorical(probs[:,i,p]))
    end
end
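
As a quick sanity check, the simulated probability vectors are valid by construction (since each b[:,i] is ascending, the invlogit terms telescope to 1), and a check along these lines confirms it:

# each column probs[:,i,p] should be nonnegative and sum to one
@assert all(x -> x >= 0, probs)
@assert all(isapprox.(sum(probs, dims=1), 1.0; atol=1e-10))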



# Graded Response Model
@model grm(y, n, m, k) = begin
    b = Array{Float64,2}(undef, k, m)
    for i in 1:m
        b[1,i] ~ Normal(0,1)
        for j in 2:k
            b[j,i] ~ truncated(Normal(0,1), b[j-1,i], Inf) 
        end
    end
    probs = zeros(k, m, n)
    theta = Vector{Float64}(undef, n)
    for p in 1:n
        theta[p] ~ Normal(0,1)
        for i in 1:m
            probs[1,i,p] = 1.0
            for j in 1:(k-1)
                Q = invlogit(theta[p] - b[j,i])
                probs[j,i,p] -= Q
                probs[j+1,i,p] = Q
            end
            y[i,p] ~ Categorical(probs[:,i,p])
        end
    end
    return theta, b
end;

sample(grm(y, n, m, k), HMC(0.05, 10), 1500)

and the error:

ERROR: TypeError: in typeassert, expected Float64, got ForwardDiff.Dual{Nothing,Float64,10}

The issue is that automatic differentiation changes the element types of your variables from Float64 to ForwardDiff.Dual, so the hard-coded container type in b = Array{Float64,2}(undef, k, m) throws a type error.

Fortunately, there’s a quick fix for this. Turing supports a type-stable notation where you pass the container types as model arguments, so the appropriate element type can be substituted during sampling. Just change your model to

@model function grm(y, n, m, k, ::Type{TM}=Array{Float64,2}, ::Type{TV}=Vector{Float64}) where {TM, TV}
    b = TM(undef, k, m)
    for i in 1:m
        b[1,i] ~ Normal(0,1)
        for j in 2:k
            b[j,i] ~ truncated(Normal(0,1), b[j-1,i], Inf) 
        end
    end
    probs = zeros(k, m, n)
    theta = TV(undef, n)
    for p in 1:n
        theta[p] ~ Normal(0,1)
        for i in 1:m
            probs[1,i,p] = 1.0
            for j in 1:(k-1)
                Q = invlogit(theta[p] - b[j,i])
                probs[j,i,p] -= Q
                probs[j+1,i,p] = Q
            end
            y[i,p] ~ Categorical(probs[:,i,p])
        end
    end
    return theta, b
end;

Give that a shot and let me know if it works for you.

Thanks, I got it to start sampling, but I’m running into a different error now. I’ll try to tackle it for a bit before coming back here. For future reference, I also had to do something similar with the probs variable.
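
Concretely, I added a third ::Type parameter to the signature and used it for probs (fragment only; the full code is in my next post below):

@model function grm(y, n, m, k, ::Type{TC}=Array{Float64,3}, ::Type{TM}=Array{Float64,2}, ::Type{TV}=Vector{Float64}) where {TC, TM, TV}
    # ...
    probs = TC(undef, k, m, n)   # was probs = zeros(k, m, n)
    # ...
end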

Yeah, that makes sense. I should have caught that one too. What’s your error? It might be a quick fix.

Hold on, I threw a @show into the model to try to debug it, and now my REPL is just endlessly printing things haha.

The sampling progress made it all the way to 100%, but then I got an error from the Categorical distribution saying the input probability vector didn’t sum to 1, or something like that.

Here we go:

julia> result = sample(grm(y, n, m, k), HMC(0.05, 10), 1500)
Sampling: 100%|████████████| Time: 0:00:12
ERROR: ArgumentError: Categorical: the condition isprobvec(p) is not satisfied.

If I do just 1 sample I get:

julia> result = sample(grm(y, n, m, k), HMC(0.05, 10), 1)
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, true, false, true)
└ @ AdvancedHMC C:\Users\Benjamin\.julia\packages\AdvancedHMC\WJCQA\src\hamiltonian.jl:47
Sampling: 100%|████████████| Time: 0:00:11
ERROR: ArgumentError: Categorical: the condition isprobvec(p) is not satisfied.

@cpfiffer, could it be a problem with the conditionally defined b parameter?

Possibly. One trick here is to add a println(probs[:,i,p]) statement just above the line y[i,p] ~ Categorical(probs[:,i,p]) to see what the draw looks like – it’ll probably be pretty ugly because it’s wrapped in a ForwardDiff.Dual.
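
For example, inside the model, right before the observation statement:

println(probs[:,i,p])   # will print Duals under HMC
y[i,p] ~ Categorical(probs[:,i,p])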

A way to make it easier to read (i.e. not wrapped in a Dual) is just to sample with Metropolis-Hastings for diagnostic purposes while printing out your vector – sample(model, MH(), 100) or something.

Switching to MH temporarily is also useful because, if the error doesn’t occur under MH, it’s likely a gradient issue of some kind, which can sometimes be an easy fix.

This error is common when floating-point rounding leaves the probability vector summing to slightly less or slightly more than 1. Try normalizing it so it sums to exactly 1, for example:
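
probs[:,i,p] ./= sum(probs[:,i,p])   # renormalize right before the observation
y[i,p] ~ Categorical(probs[:,i,p])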

Doesn’t seem to work.

Code currently:

# Import Turing and Distributions.
using Turing, Distributions

n = 1000   # number of respondents
m = 10     # number of items
k = 4      # response categories per item
theta = randn(n)   # latent abilities
b = zeros(k,m)     # item thresholds; each column must be ascending
for i in 1:m
    b[1,i] = randn()
    for j in 2:k
        # each threshold is drawn above the previous one, so b[:,i] stays ascending
        dd = truncated(Normal(), b[j-1,i], Inf)
        b[j,i] = rand(dd)
    end
end

logit = x -> log(x / (1 - x))           # logit link
invlogit = x -> exp(x)/(1 + exp(x))     # inverse logit (logistic function)
y = zeros(m,n)
probs = zeros(k,m,n)
for p in 1:n
    for i in 1:m
        probs[1,i,p] = 1.0
        for j in 1:(k-1)
            Q = invlogit(theta[p] - b[j,i])
            probs[j,i,p] -= Q
            probs[j+1,i,p] = Q
        end
        y[i,p] = rand(Categorical(probs[:,i,p]))
    end
end



# Graded Response Model, now with a type parameter for probs as well
@model function grm(y, n, m, k, ::Type{TC}=Array{Float64,3}, ::Type{TM}=Array{Float64,2}, ::Type{TV}=Vector{Float64}) where {TC, TM, TV}
    b = TM(undef, k, m)
    for i in 1:m
        b[1,i] ~ Normal(0,1)
        for j in 2:k
            b[j,i] ~ truncated(Normal(0,1), b[j-1,i], Inf) 
        end
    end
    probs = TC(undef, k, m, n)   # was zeros(k, m, n)
    theta = TV(undef, n)
    for p in 1:n
        theta[p] ~ Normal(0,1)
        for i in 1:m
            probs[1,i,p] = 1.0
            for j in 1:(k-1)
                Q = invlogit(theta[p] - b[j,i])
                probs[j,i,p] -= Q
                probs[j+1,i,p] = Q
            end
            probs[:,i,p] ./= sum(probs[:,i,p])   # normalize, as suggested above
            y[i,p] ~ Categorical(probs[:,i,p])
        end
    end
    return theta, b
end;

result = sample(grm(y, n, m, k), MH(), 1)

with error

Sampling: 100%|████████████| Time: 0:00:01
ERROR: ArgumentError: Categorical: the condition isprobvec(p) is not satisfied.

Are any of the probabilities slightly negative, by any chance?

All the debugging print statements I put in always show well-formed probability vectors.

Correction: the probability is sometimes slightly negative. I checked again.

Ah, the initial values are not appropriate. Can I set initial values?

You mean define probs to be zeros? Yeah, sure, go ahead.

No, I mean the b[:,i] parameters need to be in ascending order; that is why the distribution of b[j,i] is conditional on b[j-1,i]. But when I print out the parameters for the draws where probs goes negative, it’s because this condition doesn’t hold. How does this happen?
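
(For reference, this is roughly the check I added inside the model loop; the names match my model above:)

# the thresholds for each item should be ascending
if !issorted(b[:,i])
    println("ordering violated for item ", i, ": ", b[:,i])
end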

This is a bug. I will work on a fix. Thanks for reporting!

To clarify, the problem is that the supports of the prior distributions are themselves random: the lower bound of b[j,i] depends on the sampled value of b[j-1,i]. We didn’t account for this case properly in Turing, but fixing it isn’t difficult.
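
In the meantime, one standard workaround is to avoid random supports altogether: sample positive increments with fixed supports and accumulate them, so the ordering holds by construction. An untested sketch (note the implied prior on b is not identical to your conditional truncated normals):

@model function grm_ordered(y, n, m, k, ::Type{TC}=Array{Float64,3}, ::Type{TM}=Array{Float64,2}, ::Type{TV}=Vector{Float64}) where {TC, TM, TV}
    # positive increments with fixed supports instead of conditionally truncated priors
    delta = TM(undef, k, m)
    for i in 1:m
        delta[1,i] ~ Normal(0, 1)
        for j in 2:k
            delta[j,i] ~ truncated(Normal(0, 1), 0, Inf)   # support no longer random
        end
    end
    b = cumsum(delta, dims=1)   # each column is ascending by construction
    probs = TC(undef, k, m, n)
    theta = TV(undef, n)
    for p in 1:n
        theta[p] ~ Normal(0, 1)
        for i in 1:m
            probs[1,i,p] = 1.0
            for j in 1:(k-1)
                Q = invlogit(theta[p] - b[j,i])
                probs[j,i,p] -= Q
                probs[j+1,i,p] = Q
            end
            probs[:,i,p] ./= sum(probs[:,i,p])   # guard against rounding
            y[i,p] ~ Categorical(probs[:,i,p])
        end
    end
    return theta, b
end;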

Want me to open an issue on GitHub so it can be properly tracked there?

Sure, feel free to open one. Thanks!