@model function coinflip_binomial_broken(y::Array{Int64})
    # prior on p
    p ~ Beta(1, 1)
    # updates on p
    heads = sum(y)
    heads ~ Binomial(length(y), p)
end
When run, I get InexactError: Int64(3.0555688441753306) (the exact value changes each time), which seems to be coming from the logpdf calculation for the Binomial.
I can make it work by pulling the sum and length calls out of the model block:
@model function coinflip_binomial(heads::Int64, flips::Int64)
    # prior on p
    p ~ Beta(1, 1)
    # update on p
    heads ~ Binomial(flips, p)
end
That said, why does the first one not work?
It seems to be due to sum(y) rather than length(y): I deduced this by passing the heads value into the model as an argument instead of calculating it inside the block, while leaving the length(y) call as is (sketched below).
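(For reference, the diagnostic model for that check looked roughly like this; the name and exact form are approximate reconstructions:)

@model function coinflip_binomial_check(heads::Int64, y::Array{Int64})
    # prior on p
    p ~ Beta(1, 1)
    # heads is passed in as data here, but length(y) is still computed inside the block
    heads ~ Binomial(length(y), p)
end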
As a beginner to Turing and to Julia, I’d really like to understand what’s happening here. I suspect my gap is in how to write and interpret code in an AD setting. Any ideas on the above, or suggestions for resources, would be much appreciated.
The stack trace would help, as we can’t see where your error happens. But my guess is that you have a Vector{Int} somewhere (or some other container parameterized with Int) and you calculate a Float64 that you then assign into it. That causes an InexactError, because a non-integral float can’t be converted to an integer.
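A minimal illustration of that failure mode in plain Julia (nothing Turing-specific; the numbers are just for show):

x = [1, 0, 0, 1]           # a Vector{Int64}
x[1] = 3.0                 # fine: 3.0 converts exactly to the integer 3
x[1] = 3.0555688441753306  # throws InexactError: Int64(3.0555688441753306)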
I am unable to reproduce your result. Please post a working example that reproduces the error you see, and indicate which version of Turing you are using.
On my system, using Turing 0.15.8, I get numerous gradient errors. My guess is that the macro is not handling the length(y) inside the Binomial call.
using Turing, Random
@model function coinflip_binomial_broken(y::Array{Int64})
    # prior on p
    p ~ Beta(1, 1)
    # updates on p
    heads = sum(y)
    heads ~ Binomial(length(y), p)
end
Random.seed!(5484)
y = [1,0,0,1]
chain = sample(coinflip_binomial_broken(y), NUTS(1000,.65), 2000)
Output:
Summary Statistics
  parameters      mean       std   naive_se      mcse       ess      rhat
      Symbol   Float64   Float64    Float64   Float64   Float64   Float64

       heads    2.0000    0.0000     0.0000    0.0000       NaN       NaN
           p    0.2672    0.0000     0.0000    0.0000    7.1985    1.0187

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

       heads    2.0000    2.0000    2.0000    2.0000    2.0000
           p    0.2672    0.2672    0.2672    0.2672    0.2672
As you can see, the sampler never moves away from its initial values. Note also that heads shows up in the parameters table, which suggests the ~ statement is treating it as a quantity to be sampled rather than as an observation.
You are correct: it is not length(y) but rather heads = sum(y) that is the culprit. The following works.
using Turing, Random
@model function coinflip_binomial_broken(y::Array{Int64})
    # prior on p
    p ~ Beta(1, 1)
    # updates on p
    heads = sum(y)
    2 ~ Binomial(length(y), p)
end
Random.seed!(5484)
y = [1,0,0,1]
chain = sample(coinflip_binomial_broken(y), NUTS(1000,.65), 2000)
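For completeness, the second formulation from the first post also samples fine when the reduction is done outside the model; a rough, untested sketch reusing that definition:

using Turing, Random

@model function coinflip_binomial(heads::Int64, flips::Int64)
    # prior on p
    p ~ Beta(1, 1)
    # heads and flips are model arguments, so this ~ is treated as an observation
    heads ~ Binomial(flips, p)
end

Random.seed!(5484)
y = [1, 0, 0, 1]
chain = sample(coinflip_binomial(sum(y), length(y)), NUTS(1000, .65), 2000)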