Making Turing Fast with large numbers of parameters?

really should be written by someone who actually knows what is going on though :slight_smile:

2 Likes

True. But it would be faster if someone took their understanding from a thread like this, put it in a PR with some examples, and then had the Turing team correct or edit it.

1 Like

Well, I’m certainly willing to do that. It would help if I understood exactly how LazyArrays really works, and also what I ideally should have written as the code. I’m thinking…

    totweight ~ arraydist([truncated(Normal(staffweights[staffid[i]] + patientweights[patientid[i]] + usewch[i] * wcwt, measerr),0.0,Inf) for i in 1:length(totweight)])

becomes:

    means = @~ view(staffweights,staffid) .+ view(patientweights,patientid) .+ usewch .* wcwt
    totweight ~ arraydist(LazyArray(Base.broadcasted(m -> truncated(Normal(m,measerr),0.0,Inf),
                                                     means)))

But this gives an error:

ERROR: MethodError: no method matching arraydist(::BroadcastVector{Any, var"#182#184", Tuple{BroadcastVector{Float64, typeof(+), Tuple{BroadcastVector{Float64, typeof(+), Tuple{SubArray{Float64, 1, Vector{Float64}, Tuple{Vector{Int64}}, false}, SubArray{Float64, 1, Vector{Float64}, Tuple{Vector{Int64}}, false}}}, BroadcastVector{Float64, typeof(*), Tuple{Vector{Int64}, Float64}}}}}})
Closest candidates are:
  arraydist(::AbstractVector{<:UnivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:5
  arraydist(::AbstractMatrix{<:UnivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:17
  arraydist(::AbstractVector{<:MultivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:47
Stacktrace:

It makes me think the views are a problem here…

Also, it doesn’t help if I wrap the @~ in LazyArray, as in LazyArray(@~...)

@mohamed82008 is there a modification to Turing which specifically handles BroadcastVectors that I don’t have? Is this something in the master branch of Turing or somewhere else? When I look at my install:

julia> methods(arraydist)
# 3 methods for generic function "arraydist":
[1] arraydist(dists::AbstractVector{<:UnivariateDistribution}) in DistributionsAD at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:5
[2] arraydist(dists::AbstractMatrix{<:UnivariateDistribution}) in DistributionsAD at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:17
[3] arraydist(dists::AbstractVector{<:MultivariateDistribution}) in DistributionsAD at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:47

I am trying to see if I can find a solution to your problem. But I have a question so I can better understand what is going on. In this particular case, why are you using arraydist?

I’ve tried running a few models using different distributions, and so far have not encountered your problem. It was only when I used truncated(Normal(...)...) or arraydist that I had issues.

The Turing performance tips say that using arraydist is better/faster than writing loops. At the very beginning I had a loop over Normals, which I converted to an MvNormal that ran tens to hundreds of times faster… but now I want a distribution with a shape other than normal (and which is restricted to positive values).
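For reference, the loop-versus-MvNormal rewrite described above can be sketched like this (a minimal illustration with made-up model names and priors, not the actual model from this thread):

```julia
using Turing, Distributions, LinearAlgebra

# Slow version: N separate univariate Normal likelihood statements,
# each handled individually inside the model.
@model function loopy(y)
    mu ~ Normal(0, 10)
    sigma ~ truncated(Normal(0, 5), 0.0, Inf)
    for i in 1:length(y)
        y[i] ~ Normal(mu, sigma)
    end
end

# Fast version: one N-dimensional MvNormal likelihood evaluated
# in a single call.
@model function vectorized(y)
    mu ~ Normal(0, 10)
    sigma ~ truncated(Normal(0, 5), 0.0, Inf)
    y ~ MvNormal(fill(mu, length(y)), sigma^2 * I)
end
```

The vectorized form only works when the likelihood really is (multivariate) normal, which is exactly the limitation being discussed here.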

In my real problem, like in this problem, the data is restricted to positive values, but unlike in this problem, my real problem can have “most likely” values down near 0, with errors only really possible in the positive direction. If I don’t model that correctly then the model can predict negative values for a quantity that can only logically be 0 or bigger.

Gotcha. My understanding (which is probably wrong) is that arraydist and filldist sped things up when dealing with parameters, but not for the observations. Hopefully someone will correct me if I am wrong.

You could write it using standard broadcasting syntax. That worked fine for me when I rewrote your model 2.


    θ = staffweights[staffid] .+ patientweights[patientid] .+ usewch .* wcwt
    totweight .~ Gamma.(15, θ ./ 14) 

Another option might be a custom distribution. An example using MvBinomial is here.

My only other (probably unhelpful) suggestions at this point to keep things positive are the obvious options (e.g., modeling log(y), or using something like a Gamma distribution, as in your example), which I assume you have tried or cannot use.
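For completeness, the log-scale suggestion amounts to something like the following sketch (assumes strictly positive observations; the model and parameter names are illustrative only):

```julia
using Turing, Distributions

@model function logscale(y)
    mu ~ Normal(5, 1)
    sigma ~ truncated(Normal(0, 1), 0.0, Inf)
    # Each positive observation is modeled as LogNormal, i.e.
    # log(y[i]) ~ Normal(mu, sigma), which keeps predictions positive.
    y .~ LogNormal(mu, sigma)
end
```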

I’ve tried a Gamma, but not using the broadcast syntax; it was crazy slow. I’ll give it a try with broadcast!

I like this LazyArrays idea, but it doesn’t seem to work in my case.

That’s definitely valid. But the way broadcasting is done in Turing, the RHS will be evaluated first, and then the broadcasting of the logpdf computation will be done afterwards. When using ForwardDiff or Zygote, this is fine. When using Tracker or ReverseDiff, you can do better because of the TrackedArray issue I mention above.

That’s another good option. A perhaps easier alternative is to accumulate the observations’ log probabilities manually using the Turing.@addlogprob! macro. So you can do something like:

Turing.@addlogprob! sum(logpdf.(Gamma.(15, θ ./ 14), totweight))

This line of code is what the lazy arrays trick will evaluate under the hood.
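Put into a full model, that would look something like the sketch below (untested; it reuses the Gamma parameterization and the priors from earlier in the thread):

```julia
@model function estweights_addlogprob(nstaff, staffid, npatients, patientid, usewch, totweight)
    wcwt ~ Gamma(20.0, 15.0/19)
    staffweights ~ filldist(truncated(Normal(150, 30), 90.0, Inf), nstaff)
    patientweights ~ filldist(truncated(Normal(150, 30), 90.0, Inf), npatients)
    θ = staffweights[staffid] .+ patientweights[patientid] .+ usewch .* wcwt
    # Accumulate the observations' log-likelihood manually instead of
    # using `~`; this is what the lazy-array trick computes under the hood.
    Turing.@addlogprob! sum(logpdf.(Gamma.(15, θ ./ 14), totweight))
end
```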

2 Likes

Yeah, I’m actually thinking this might be a smart idea because the thing I’m modeling has a variety of weird aspects to it. If I did this I could make sure it had its own gradient.

What line of code would I use to make the LazyArray method work? I couldn’t get that to work; it would error out saying arraydist couldn’t take that type of object (see the errors above).

The problem seems to be that the type of the vector is BroadcastVector{Any}, which looks like a LazyArrays.jl inference issue. Try asserting the type of the output of the broadcasted function to give the compiler a hint.

1 Like

I tried this:


function truncatenormal(a,b)::UnivariateDistribution
    truncated(Normal(a,b),0.0,Inf)
end


@model function estweights3lazy(nstaff,staffid,npatients,patientid,usewch,totweight)
    wcwt ~ Gamma(20.0,15.0/19)
    measerr ~ Gamma(10.0,20.0/9)
    staffweights ~ filldist(truncated(Normal(150,30),90.0,Inf),nstaff)
    patientweights ~ filldist(truncated(Normal(150,30),90.0,Inf),npatients)
    theta = LazyArray(@~ view(staffweights,staffid) .+ view(patientweights,patientid) .+ usewch .* wcwt)
    totweight ~ arraydist(LazyArray(@~ truncatenormal.(theta,measerr)))
end

But it gives a backtrace:

ERROR: MethodError: no method matching arraydist(::BroadcastVector{Any, typeof(truncatenormal), Tuple{BroadcastVector{Float64, typeof(+), Tuple{BroadcastVector{Float64, typeof(+), Tuple{SubArray{Float64, 1, Vector{Float64}, Tuple{Vector{Int64}}, false}, SubArray{Float64, 1, Vector{Float64}, Tuple{Vector{Int64}}, false}}}, BroadcastVector{Float64, typeof(*), Tuple{Vector{Int64}, Float64}}}}, Float64}})
Closest candidates are:
  arraydist(::AbstractVector{<:UnivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:5
  arraydist(::AbstractMatrix{<:UnivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:17
  arraydist(::AbstractVector{<:MultivariateDistribution}) at /home/dlakelan/.julia/packages/DistributionsAD/b93cZ/src/arraydist.jl:47
Stacktrace:
  [1] estweights3lazy(__model__::DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, __varinfo__::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.UntypedVarInfo{DynamicPPL.Metadata{Dict{AbstractPPL.VarName, Int64}, Vector{Distribution}, Vector{AbstractPPL.VarName}, Vector{Real}, Vector{Set{DynamicPPL.Selector}}}, Float64}, Vector{Base.RefValue{Float64}}}, __context__::DynamicPPL.SamplingContext{DynamicPPL.SampleFromUniform, DynamicPPL.DefaultContext, Random._GLOBAL_RNG}, nstaff::Int64, staffid::Vector{Int64}, npatients::Int64, patientid::Vector{Int64}, usewch::Vector{Int64}, totweight::Vector{Float64})
    @ Main ./REPL[81]:7
  [2] macro expansion
    @ ~/.julia/packages/DynamicPPL/RcfQU/src/model.jl:465 [inlined]
  [3] _evaluate(model::DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.UntypedVarInfo{DynamicPPL.Metadata{Dict{AbstractPPL.VarName, Int64}, Vector{Distribution}, Vector{AbstractPPL.VarName}, Vector{Real}, Vector{Set{DynamicPPL.Selector}}}, Float64}, Vector{Base.RefValue{Float64}}}, context::DynamicPPL.SamplingContext{DynamicPPL.SampleFromUniform, DynamicPPL.DefaultContext, Random._GLOBAL_RNG})
    @ DynamicPPL ~/.julia/packages/DynamicPPL/RcfQU/src/model.jl:448
  [4] evaluate_threadsafe(model::DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.UntypedVarInfo{DynamicPPL.Metadata{Dict{AbstractPPL.VarName, Int64}, Vector{Distribution}, Vector{AbstractPPL.VarName}, Vector{Real}, Vector{Set{DynamicPPL.Selector}}}, Float64}, context::DynamicPPL.SamplingContext{DynamicPPL.SampleFromUniform, DynamicPPL.DefaultContext, Random._GLOBAL_RNG})
...

Please post a full MWE for me to copy and paste.

Sure, it’s the same as the above MWE, but here it is in just copy-paste form:

using Pkg
Pkg.activate(".")
using Turing, DataFrames,DataFramesMeta,LazyArrays,Distributions,DistributionsAD

## every few hours a random staff member comes and gets a random
## patient to bring them outside to a garden through a door that has a
## scale. Sometimes using a wheelchair, sometimes not. knowing the
## total weight of the two people and the wheelchair plus some errors
## (from the scale measurements), infer the individual weights of all
## individuals and the weight of the wheelchair.

nstaff = 100
npat = 100
staffids = collect(1:nstaff)
patientids = collect(1:npat)
staffweights = rand(Normal(150,30),length(staffids))
patientweights = rand(Normal(150,30),length(patientids))
wheelchairwt = 15
nobs = 300

data = DataFrame(staff=rand(staffids,nobs),patient=rand(patientids,nobs))
data.usewch = rand(0:1,nobs)
data.totweights = [staffweights[data.staff[i]] + patientweights[data.patient[i]] for i in 1:nrow(data)] .+ data.usewch .* wheelchairwt .+ rand(Normal(0.0,20.0),nrow(data))


function truncatenormal(a,b)::UnivariateDistribution
    truncated(Normal(a,b),0.0,Inf)
end


@model function estweights3lazy(nstaff,staffid,npatients,patientid,usewch,totweight)
    wcwt ~ Gamma(20.0,15.0/19)
    measerr ~ Gamma(10.0,20.0/9)
    staffweights ~ filldist(truncated(Normal(150,30),90.0,Inf),nstaff)
    patientweights ~ filldist(truncated(Normal(150,30),90.0,Inf),npatients)
    theta = LazyArray(@~ view(staffweights,staffid) .+ view(patientweights,patientid) .+ usewch .* wcwt)
    totweight ~ arraydist(LazyArray(@~ truncatenormal.(theta,measerr)))
end

ch3l = sample(estweights3lazy(nstaff,data.staff,npat,data.patient,data.usewch,data.totweights),NUTS(500,.75),1000)
1 Like

I am not getting that error but I am getting numerical errors. I am using Turing 0.18.

As am I; let me just restart my Julia session and see if it persists.

Do you get it sampling at all?

Also, when I print the type of the BroadcastVector, I get:

BroadcastVector{Truncated{Normal{ReverseDiff.TrackedReal{Float64, Float64, Nothing}}, Continuous, ReverseDiff.TrackedReal{Float64, Float64, Nothing}}, typeof(truncatenormal), Tuple{BroadcastVector{ReverseDiff.TrackedReal{Float64, Float64, Nothing}, typeof(+), Tuple{BroadcastVector{ReverseDiff.TrackedReal{Float64, Float64, Nothing}, typeof(+), Tuple{SubArray{ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}, 1, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, Tuple{Vector{Int64}}, false}, SubArray{ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}, 1, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, Tuple{Vector{Int64}}, false}}}, BroadcastVector{ReverseDiff.TrackedReal{Float64, Float64, Nothing}, typeof(*), Tuple{Vector{Int64}, ReverseDiff.TrackedReal{Float64, Float64, Nothing}}}}}, ReverseDiff.TrackedReal{Float64, Float64, Nothing}}}

which seems right to me.

Do you get it sampling at all?

No. I suspect the prior or model might be bad. Try a smaller dataset or initialising from a good point.

So since the model and data are entirely synthetic, the model is actually essentially perfect. All the real weights are normally distributed from the given priors and the true wheelchair weight is right at the peak density of the prior for the wcwt parameter. However obviously initialization might be the issue. I’ll see if I can figure out an init vector to avoid that… after I get my kids breakfast etc.

Thanks so much for your help!

1 Like

If it’s synthetic from a single set of parameter values, and there is sufficient data and your model is identifiable, the posterior will asymptotically tend to a Dirac delta distribution. This will cause numerical issues with a large enough data set, since you are trying to sample from a very concentrated distribution where the log joint is -Inf almost everywhere. A solution is to generate data from multiple sets of parameter values, or to use a smaller data set.

1 Like

Yes, or a very good initial point and a very small step size, good point.

However, in this case there are 100 patients and 100 staff, and 300 observations, so on average each person is represented only around 3 times. I don’t think that is the issue, but I’m happy to try with 50 or 100 observations and see if it changes anything.

I let this model run for 1 hour and it was still at 0%; the backtrace when I interrupted looked like:

ERROR: InterruptException:
Stacktrace:
  [1] Array
    @ ./boot.jl:457 [inlined]
  [2] Array
    @ ./boot.jl:466 [inlined]
  [3] Array
    @ ./boot.jl:474 [inlined]
  [4] similar
    @ ./abstractarray.jl:829 [inlined]
  [5] similar
    @ ./abstractarray.jl:828 [inlined]
  [6] similar
    @ ./broadcast.jl:212 [inlined]
  [7] similar
    @ ./broadcast.jl:211 [inlined]
  [8] copy
    @ ./broadcast.jl:929 [inlined]
  [9] materialize
    @ ./broadcast.jl:904 [inlined]
 [10] logabsdetjac(b::Bijectors.TruncatedBijector{1, Float64, Float64}, x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}})
    @ Bijectors ~/.julia/packages/Bijectors/EELoe/src/bijectors/truncated.jl:125
 [11] logpdf_with_trans
    @ ~/.julia/packages/Bijectors/EELoe/src/Bijectors.jl:134 [inlined]
 [12] assume(rng::Random._GLOBAL_RNG, spl::DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, dist::Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}, vn::AbstractPPL.VarName{:patientweights, Tuple{}}, vi::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, 
:totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, 
DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, 
DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, 
DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}}}, ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Base.RefValue{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), 
Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}}})
    @ Turing.Inference ~/.julia/packages/Turing/uMQmD/src/inference/hmc.jl:484
 [13] tilde_assume
    @ ~/.julia/packages/DynamicPPL/RcfQU/src/context_implementations.jl:63 [inlined]
 [14] tilde_assume
    @ ~/.julia/packages/DynamicPPL/RcfQU/src/context_implementations.jl:58 [inlined]
 [15] tilde_assume
    @ ~/.julia/packages/DynamicPPL/RcfQU/src/context_implementations.jl:43 [inlined]
 [16] tilde_assume!
    @ ~/.julia/packages/DynamicPPL/RcfQU/src/context_implementations.jl:140 [inlined]
 [17] estweights3lazy(__model__::DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, __varinfo__::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, 
Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, 
DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, 
DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, 
DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Set{DynamicPPL.Selector}}}}}, ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}, Vector{Base.RefValue{ForwardDiff.Dual{ForwardDiff.Tag{Turing.Core.var"#f#1"{DynamicPPL.TypedVarInfo{NamedTuple{(:wcwt, :measerr, :staffweights, :patientweights), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:wcwt, Tuple{}}, Int64}, Vector{Gamma{Float64}}, 
Vector{AbstractPPL.VarName{:wcwt, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:measerr, Tuple{}}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:measerr, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:staffweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:staffweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:patientweights, Tuple{}}, Int64}, Vector{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64}, FillArrays.Fill{Truncated{Normal{Float64}, Continuous, Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:patientweights, Tuple{}}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(estweights3lazy), (:nstaff, :staffid, :npatients, :patientid, :usewch, :totweight), (), (), Tuple{Int64, Vector{Int64}, Int64, Vector{Int64}, Vector{Int64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}, Float64}, Float64, 10}}}}, __context__::DynamicPPL.SamplingContext{DynamicPPL.Sampler{NUTS{Turing.Core.ForwardDiffAD{40}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext, Random._GLOBAL_RNG}, nstaff::Int64, staffid::Vector{Int64}, npatients::Int64, patientid::Vector{Int64}, usewch::Vector{Int64}, totweight::Vector{Float64})

...
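The `BroadcastVector{Any, …}` in the MethodError above hints that the eltype of the lazy broadcast is being inferred as `Any` rather than a concrete `UnivariateDistribution`, so none of the `arraydist` methods can match. A possible workaround (untested here, but following the lazy-broadcast pattern from Turing's performance tips) is to drop the intermediate `means` variable and the `view`s and build the whole truncated-normal vector in a single `@~` broadcast, so the element type stays concrete:

```julia
using LazyArrays  # for @~ and LazyArray

# Plain indexing instead of `view`, and one fused lazy broadcast, so the
# element type should be a concrete Truncated{Normal{Float64}, ...}
# rather than Any:
totweight ~ arraydist(LazyArray(@~ truncated.(
    Normal.(staffweights[staffid] .+ patientweights[patientid] .+ usewch .* wcwt,
            measerr),
    0.0, Inf)))
```

Whether this actually fixes the eltype inference would need to be checked; one plausible culprit in the earlier attempt is the anonymous function `m -> truncated(Normal(m, measerr), 0.0, Inf)`, whose captured `measerr` can defeat inference inside the lazy broadcast.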

I tried 50 observations, put the initial position at exactly the true values, and set the initial epsilon in NUTS to 1e-9, as follows:

julia> ch3l = sample(estweights3lazy(nstaff,data.staff,npat,data.patient,data.usewch,data.totweights),NUTS(500,.75;init_ϵ=1e-9),1000;init_theta=vcat([15.0,20.0],staffweights,patientweights))

it still spits out a bunch of warnings like:

┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, true, false, true)
└ @ AdvancedHMC ~/.julia/packages/AdvancedHMC/HQHnm/src/hamiltonian.jl:47
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, true, false, true)
└ @ AdvancedHMC ~/.julia/packages/AdvancedHMC/HQHnm/src/hamiltonian.jl:47
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, true, false, true)
└ @ AdvancedHMC ~/.julia/packages/AdvancedHMC/HQHnm/src/hamiltonian.jl:47

and goes nowhere.
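The `isfinite.((θ, r, ℓπ, ℓκ)) = (true, true, false, true)` pattern means the log joint density `ℓπ` itself is `-Inf` or `NaN` at the proposed point; one common cause is an observation or parameter value outside a distribution's support. A quick way to narrow that down outside of Turing (a sketch with hypothetical stand-in values, not the real data) is to evaluate the observation log-likelihood terms directly with Distributions.jl:

```julia
using Distributions

# Hypothetical stand-ins; substitute actual parameter draws and data.
means     = [35.0, 42.0, 38.5]
measerr   = 1.5
totweight = [34.2, -1.0, 39.1]

# Any observation outside the support of truncated(Normal(m, measerr), 0, Inf)
# gets log density -Inf, which propagates to a non-finite ℓπ.
lps = [logpdf(truncated(Normal(m, measerr), 0.0, Inf), w)
       for (m, w) in zip(means, totweight)]
findall(!isfinite, lps)   # indices of any offending observations
```

If every term comes back finite at the initial point, the non-finite `ℓπ` is more likely arising during leapfrog steps (e.g. the step size pushing parameters into a region where the density underflows) than from the data itself.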

@cscherrer Would this be a good usecase for Soss.jl?

1 Like