My Turing implementation of the model:
@model sv_model(r) = begin
    N = length(r)
    h = Array{Float64}(undef, N)    # pre-allocated latent log-volatility states
    # priors
    σ ~ TruncatedNormal(0.3, 0.1, 0.001, 10)
    ρ ~ Uniform(-1, 1)
    ϕ ~ Uniform(-1, 1)
    μ ~ Normal(-1.0, 0.3)
    h₀ = μ
    r₀ = 0.0
    # initial observation and state
    r[1] ~ Normal(0.0, exp(0.5 * μ))
    h[1] ~ Normal(μ, sqrt(1 - ρ * ρ) * σ)
    # transitions: the leverage effect enters the state mean through ρ
    for i = 2:N
        r[i] ~ Normal(0.0, exp(0.5 * h[i-1]))
        h[i] ~ Normal(μ + ϕ * (h[i-1] - μ) + σ * ρ * exp(-0.5 * h[i-1]) * r[i-1],
                      sqrt(1 - ρ * ρ) * σ)
    end
end
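For reference, this is roughly how I call it (the data and draw count below are placeholders standing in for my actual return series and settings):

    using Turing
    returns = randn(500)    # placeholder for my return series
    chain = sample(sv_model(returns), NUTS(0.65), 1000)
    # chain = sample(sv_model(returns), HMC(0.01, 10), 1000)   # same error with HMC
    # chain = sample(sv_model(returns), PG(20), 1000)          # PG occasionally works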
With HMC() or NUTS() as the sampler, I get the error below.
TypeError: in typeassert, expected Float64, got ForwardDiff.Dual{Nothing,Float64,10}
Stacktrace:
[1] setindex!(::Array{Float64,1}, ::ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#7"
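If I had to guess, the TypeError comes from writing the ForwardDiff dual numbers that HMC/NUTS push through the model into the pre-allocated Float64 array. Below is a sketch of a type-parameterized variant I'm considering; the ::Type{T} argument and the function-style @model syntax are my assumptions about the recommended pattern, not something I've verified for my Turing version:

    # Sketch: let the element type of h follow the sampler's number type
    @model function sv_model_typed(r, ::Type{T} = Float64) where {T}
        N = length(r)
        h = Vector{T}(undef, N)    # can hold Float64 or ForwardDiff.Dual as needed
        σ ~ TruncatedNormal(0.3, 0.1, 0.001, 10)
        ρ ~ Uniform(-1, 1)
        ϕ ~ Uniform(-1, 1)
        μ ~ Normal(-1.0, 0.3)
        r[1] ~ Normal(0.0, exp(0.5 * μ))
        h[1] ~ Normal(μ, sqrt(1 - ρ * ρ) * σ)
        for i in 2:N
            r[i] ~ Normal(0.0, exp(0.5 * h[i-1]))
            h[i] ~ Normal(μ + ϕ * (h[i-1] - μ) + σ * ρ * exp(-0.5 * h[i-1]) * r[i-1],
                          sqrt(1 - ρ * ρ) * σ)
        end
    end

I haven't confirmed whether this is the right fix, so corrections welcome.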
With PG() I occasionally get lucky; most of the time, though, I get the message below (even though I can't see how σ could ever be negative here).
CTaskException:
ArgumentError: Normal: the condition σ >= zero(σ) is not satisfied.
Stacktrace:
[1] macro expansion at /Users/balaji/.julia/packages/Distributions/RAeyY/src/utils.jl:6 [inlined]
[2] #Normal#98 at /Users/balaji/.julia/packages/Distributions/RAeyY/src/univariate/continuous/normal.jl:37 [inlined]
[3] Normal at /Users/balaji/.julia/packages/Distributions/RAeyY/src/univariate/continuous/normal.jl:37 [inlined]
[4] macro expansion at ./In[133]:18 [inlined]
[5] ##evaluator#771(::Random._GLOBAL_RNG, ::DynamicPPL.Model{var"###evaluator#771",(:r,),Tuple{Array{Float64,1}},(),
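For what it's worth, that message also appears when the scale is NaN rather than negative (NaN >= 0 is false, so it trips the same check), which makes me wonder whether exp(0.5 * h[i-1]) is going non-finite under some particle proposals. This is just a guess; the snippet below only shows how the message can arise:

    using Distributions
    Normal(0.0, -1.0)    # ArgumentError: Normal: the condition σ >= zero(σ) is not satisfied.
    Normal(0.0, NaN)     # same message, even though the scale is not "negative"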
The exact same model runs fine in Stan (via the Julia interface), though.