# Help with Turing.jl: BetaBinomial ArgumentError: BetaBinomial: the condition n >= zero(n) && (α >= zero(α) && β >= zero(β)) is not satisfied

New to Julia, but I've had my eye on it for a while now and finally got a bit of extra time during the summer to play around with it, wuhuu!

My goal is to port a small program I already have running in Python with NumPyro and JAX, in the hope of improving its speed.

I am trying to use a BetaBinomial model with Turing.

```julia
using Turing
using Turing: Variational

N = [7642688, 7609177, 8992872, 8679915, 8877887, 8669401]
k = [2036195, 745632, 279947, 200865, 106383, 150621]
z = collect(1:length(N))

@model function betabin_model(z, k, N)
    q ~ Beta(2, 3)
    A ~ Beta(2, 3)
    c ~ Beta(1, 9)

    ϕ ~ Exponential(1000)
    θ = ϕ + 2

    for i in eachindex(z)
        μ = clamp(A * (1 - q)^(z[i] - 1) + c, 0, 1)
        α = μ * θ
        β = (1 - μ) * θ
        k[i] ~ BetaBinomial(N[i], α, β)
    end
end

model = betabin_model(z, k, N);
chains = sample(model, NUTS(), 1000);
```
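As a side note on robustness (my own suggestion, not something established in this thread): clamping `μ` to `[0, 1]` still allows `μ` to hit the endpoints exactly, in which case `α` or `β` becomes exactly zero and the Beta component degenerates, which can produce non-finite log densities. A hypothetical guard is to clamp into a slightly narrowed interval:

```julia
# Hypothetical guard: clamp μ into a slightly narrowed interval so that
# α = μ * θ and β = (1 - μ) * θ can never be exactly zero.
θ = 100.0
ε = 1e-10

μ_plain = clamp(1.2, 0, 1)       # μ hits 1 exactly
μ_safe  = clamp(1.2, ε, 1 - ε)   # μ stays strictly inside (0, 1)

β_plain = (1 - μ_plain) * θ      # exactly 0.0 → degenerate Beta(α, 0)
β_safe  = (1 - μ_safe) * θ       # tiny but strictly positive
```

This does not by itself explain the `ArgumentError` below, but it removes one source of the kind of non-finite values that the numerical-error warnings complain about.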

I get a warning, but it seems to run anyway:

```
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, false, false, false)
```

However, sometimes when I run it, I run into a specific error. I can reproduce the error when I try to use ADVI:

```julia
advi = ADVI(10, 1000)
q = vi(model, advi);
```

which gives the following error:

```
ERROR: LoadError: ArgumentError: BetaBinomial: the condition n >= zero(n) && (α >= zero(α) && β >= zero(β)) is not satisfied.
```

This I cannot understand, at least from a mathematical standpoint: all of the elements in `N` are positive, `θ` is greater than 2, and I am even clamping `μ` to lie between 0 and 1. How can `α`, `β`, or `n` then be negative?

Does anyone have an explanation? Also, please feel free to comment on the quality of the code, as this is my first Julia code and I am here to learn.


Numerical error warnings during warm-up are normal and expected. If they persist past the warm-up, they generally indicate a problem with the model or its implementation.

Do you see this error when sampling with NUTS?

As far as I know, I should be sampling with `NUTS`?

```julia
chains = sample(model, NUTS(), 1000);
```

Ah, I misunderstood you. Yeah, when sampling with ADVI it happens every time, but I have experienced it at other times too, although now that I think about it, it might have been when using:

```julia
using Optim

# Generate an MLE estimate.
mle_estimate = optimize(model, MLE())

# Generate a MAP estimate.
map_estimate = optimize(model, MAP())
```

I sampled your model with `NUTS` and also estimated the modes with `MLE` and `MAP` and did not see these errors. For `NUTS`, I drew 1000 samples from 16 chains and only found one transition with numerical error, which is quite good. `rhat` and `ess` both look good, so there aren’t any obvious issues here.

I’d chalk this up as an issue specific to ADVI, which unfortunately I can’t help with.

Okay, I’ve managed to narrow it down to the following.

If I change `k` to the following:

```julia
k = [2042914, 745828, 277760, 205701, 97427, 152673]
```

and then generate the MAP estimate with a specific set of initial parameters:

```julia
map_estimate = optimize(model, MAP(), [0.1, 0.1, 0.01, 100])
```

(with everything else kept the same) I run into the previously mentioned error:

```
ERROR: LoadError: ArgumentError: BetaBinomial: the condition n >= zero(n) && (α >= zero(α) && β >= zero(β)) is not satisfied.
```

I have tried adding `@info N[i], α, β` inside the model. With this, I find the following lines just before the error is thrown:

(I have elided the enormous `ForwardDiff.Dual` type parameters as `Dual{...}` for readability.)

```
[...] a lot of [ Info: ] above here

[ Info: (8877887, Dual{...}(7.453710778999033, -1.1633888898325928, 0.29162337763363744, 6.945280851663119, 7.417474913162984), Dual{...}(403.94591260649196, 1.1633888898325928, -0.29162337763363744, -6.945280851663119, 401.98214847232805))
[ Info: (8669401, Dual{...}(7.162863556540884, -0.3614415363041935, 0.07248121590660006, 6.945280851663119, 7.128041630854146), Dual{...}(404.2367598289501, 0.3614415363041935, -0.07248121590660006, -6.945280851663119, 402.2715817546368))
[ Info: (7642688, Dual{...}(NaN, NaN, NaN, NaN, NaN), Dual{...}(Inf, NaN, NaN, NaN, NaN))
ERROR: LoadError: ArgumentError: BetaBinomial: the condition n >= zero(n) && (α >= zero(α) && β >= zero(β)) is not satisfied.
Stacktrace:
  [1] macro expansion
    @ ~/.julia/packages/Distributions/fXTVC/src/utils.jl:6 [inlined]
  [2] #BetaBinomial#42
    @ ~/.julia/packages/Distributions/fXTVC/src/univariate/discrete/betabinomial.jl:30 [inlined]
  [3] BetaBinomial
    @ ~/.julia/packages/Distributions/fXTVC/src/univariate/discrete/betabinomial.jl:30 [inlined]
  [4] #24
```
I was able to reproduce this error with these values of `k`. Since there aren’t any obvious issues with your model or implementation, I’d recommend opening an issue on the Turing repo.
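One note on the `NaN` duals visible in the log: the error message talks about `α >= zero(α)`, but that check also fails when `α` or `β` is `NaN`, because every ordered comparison involving `NaN` evaluates to `false`. So the arguments don't have to be negative for the constructor check to reject them, which would explain the confusion above. A minimal illustration:

```julia
# NaN is neither >= 0 nor < 0: all ordered comparisons with NaN are false.
α = NaN
α >= zero(α)   # false → the BetaBinomial argument check fails here
α < zero(α)    # also false → α is not "negative" in the usual sense either
```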