Negative binomial regression with horseshoe prior in Turing sometimes fails

Hi,
I am trying to fit a negative binomial regression with a horseshoe prior using Turing. Here is my code:

using Turing, StatsModels, RDatasets, Random, LinearAlgebra

sanction = dataset("Zelig", "sanction")

## mean-dispersion parameterization: mean μ and dispersion ϕ,
## so that the variance is μ + μ²/ϕ
function NegativeBinomial2(μ, ϕ)
    p = 1 / (1 + μ / ϕ)
    r = ϕ

    return NegativeBinomial(r, p)
end

@model function NegativeBinomialRegression(X, y)
    p = size(X, 2)
    n = size(X, 1)
    h = 1.0

    ## priors
    halfcauchy = Truncated(TDist(1), 0, Inf)  ## half-Cauchy: Student-t(1) truncated to [0, Inf)

    τ ~ halfcauchy              ## global shrinkage
    λ ~ filldist(halfcauchy, p) ## local shrinkage
    σ ~ InverseGamma(h, h)      ## dispersion, passed as ϕ to NegativeBinomial2
    β0 = zeros(p)               ## prior mean
    β ~ MvNormal(β0, Diagonal((λ .* τ) .^ 2))  ## horseshoe: each βᵢ ~ Normal(0, λᵢ * τ)

    ## log link (X already carries the intercept column added by apply_schema)
    z = X * β
    mu = exp.(z)

    ## likelihood
    for i = 1:n
        y[i] ~ NegativeBinomial2(mu[i], σ)
    end
end

frm = @formula(Num ~ Target + Coop + NCost)

frm = apply_schema(frm, schema(frm, sanction), RegressionModel)
y, X = modelcols(frm, sanction)

chain = sample(MersenneTwister(), NegativeBinomialRegression(X, y), NUTS(), 1000)

This often works, but occasionally I get the following error:

ERROR: DomainError with Dual{ForwardDiff.Tag{DynamicPPL.DynamicPPLTag, Float64}}(0.0,NaN,NaN,NaN,NaN,NaN,NaN,NaN):
NegativeBinomial: the condition zero(p) < p <= one(p) is not satisfied.
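
For context, one way p can hit exactly 0.0 here is μ/ϕ overflowing, e.g. when a divergent step pushes the linear predictor z past the range of exp. A quick illustration with made-up numbers:

mu = exp(800.0)        # Inf: exp overflows for arguments beyond ~709.8
p = 1 / (1 + mu / 2.0) # 0.0, which violates zero(p) < p <= one(p)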

Running the sampler a few times should reproduce this error:

for i in 1:20
    sample(MersenneTwister(), NegativeBinomialRegression(X, y), NUTS(), 100)
end

Thanks,
Ayush

Of course, global shrinkage being exactly zero doesn’t make sense, but it can happen due to limited floating-point precision. If it does, you can try simple things like setting the lower bound on the Truncated to something like eps(Float64).
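
A minimal sketch of that fix, assuming eps(Float64) is an acceptable bound:

## truncate just above zero so τ and λ can never be exactly 0
halfcauchy = Truncated(TDist(1), eps(Float64), Inf)

Newer versions of Distributions also accept the keyword form truncated(TDist(1); lower=eps(Float64)).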