I’m trying to use Turing to do inference for a Normal-Inverse-Gaussian (NIG) distribution, but I have two problems:
- I can only make the inference work for certain realizations.
- The parameter restriction feels clumsy.
My attempt is to enforce `|β| < α`:
```julia
using Turing

@model function nig1(x)
    μ ~ Normal(0, 1)
    α ~ Exponential(1)
    δ ~ Exponential(1)
    β_limit = 0.9 * abs(α)
    β ~ Uniform(-β_limit, β_limit)
    β = clamp(β, -β_limit, β_limit)
    x ~ filldist(NormalInverseGaussian(μ, α, β, δ), length(x))
end
```
Without the `clamp` I get errors from computing the square root of a negative number (in the NIG pdf, which involves `sqrt(α^2 - β^2)`). If `0.9` is replaced with e.g. `0.95`, there are numerical problems.
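What I would like instead of the `clamp` is a parameterization where the restriction holds by construction, e.g. by putting the prior on the ratio β/α. A sketch of what I mean (untested; `nig1b` and `r` are names I made up):

```julia
@model function nig1b(x)
    μ ~ Normal(0, 1)
    α ~ Exponential(1)
    δ ~ Exponential(1)
    r ~ Uniform(-1, 1)  # ratio β/α, so |β| < α holds by construction
    β = r * α           # deterministic transform; no clamp needed
    x ~ filldist(NormalInverseGaussian(μ, α, β, δ), length(x))
end
```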
Turing can return sensible output with `nig1`, but only for certain realizations! Other realizations cause problems when evaluating the Bessel function in the NIG pdf.
```julia
import Random
using Distributions

Random.seed!(12)
μ = 0; α = 0.5; β = 0.2; δ = 1
X = Distributions.NormalInverseGaussian(μ, α, β, δ)
data = rand(X, 2_000)

model = nig1(data)
chain = Turing.sample(model, NUTS(), 3_000)
```
The `sample` call succeeds, but there are lots of rejected proposals.
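(To quantify "lots": I count divergences via the chain internals; I'm assuming the field is called `numerical_error`, as in the Turing versions I've used.)

```julia
# Count the transitions NUTS flagged as divergent
sum(chain[:numerical_error])
```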
To avoid the NIG pdf I try to use the alternative representation as a normal mean-variance mixture.
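Concretely (in the mean/shape parameterization that `InverseGaussian` uses), the representation is `X | Z = z ~ Normal(μ + β*z, sqrt(z))` with `Z ~ InverseGaussian(δ/γ, δ^2)` and `γ = sqrt(α^2 - β^2)`: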
```julia
@model function nig2(x)
    μ ~ Normal(0, 1)
    α ~ Exponential(1)
    δ ~ Exponential(1)
    β_limit = 0.95 * α
    β ~ Uniform(-β_limit, β_limit)
    β = clamp(β, -β_limit, β_limit)
    γ = sqrt(α^2 - β^2)
    # Draw the mixing variables with a plain rand (=), not ~
    z = rand(filldist(InverseGaussian(δ / γ, δ^2), length(x)))
    for i in eachindex(x)
        x[i] ~ Normal(μ + β * z[i], sqrt(z[i]))
    end
end
```
(For now) I'm not interested in the latent `z` variables, which is why I draw them with `=` (a plain `rand` inside the model) instead of `~`. I don't know if this is smart?
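For comparison, this is what I think the `~` version (treating `z` as actual latent variables, one per observation) would look like; `nig3` is just a name I made up, and I haven't tested how NUTS copes with the 2,000 extra dimensions:

```julia
@model function nig3(x)
    μ ~ Normal(0, 1)
    α ~ Exponential(1)
    δ ~ Exponential(1)
    β_limit = 0.95 * α
    β ~ Uniform(-β_limit, β_limit)
    γ = sqrt(α^2 - β^2)
    # Latent mixing variables, sampled jointly with the parameters
    z ~ filldist(InverseGaussian(δ / γ, δ^2), length(x))
    for i in eachindex(x)
        x[i] ~ Normal(μ + β * z[i], sqrt(z[i]))
    end
end
```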
(Replacing the loop with `x ~ MvNormal(μ + β * z, LinearAlgebra.diagm(z))` doesn't seem to work.)
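(My best guess is that the vectorized form needs broadcasting over the scalar `μ` and a `Diagonal` covariance, i.e. something like the line below inside the model, but I haven't verified that this fixes anything.)

```julia
using LinearAlgebra
x ~ MvNormal(μ .+ β .* z, Diagonal(z))  # covariance Diagonal(z), i.e. variance z[i] per observation
```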
This takes a lot longer to run, and the trace plots are very suspicious: only a single value is sampled for each parameter.
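(For reference, the trace plots are just the standard `StatsPlots` recipe, so I don't think the plotting is the issue.)

```julia
using StatsPlots
plot(chain)  # trace and density plots for each parameter
```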