Error running MuseInference.jl example

Not sure if this should be a new topic. It follows on from https://discourse.julialang.org/t/ann-museinference-jl-drop-in-replacement-for-hmc-vi-try-it-on-your-problem/73468.

When I run the example code

using Turing, MuseInference

@model function funnel()
    θ ~ Normal(0, 3)
    z ~ MvNormal(zeros(512), exp(θ/2))
    x ~ MvNormal(z, 1)
end

x = (funnel() | (θ=0,))() # draw a sample of `x` to use as simulated data
model = funnel() | (;x)

muse(model, 0, get_covariance=true)

I get the error InexactError: Int64(-2.0175508218727827), which seems to come from code attempting to index by thread id (although I don’t think anything here is multithreaded).

Thanks for any help.

Here’s the complete error

ERROR: InexactError: Int64(-2.0175508218727827)
Stacktrace:
  [1] Int64
    @ .\float.jl:900 [inlined]
  [2] convert
    @ .\number.jl:7 [inlined]
  [3] setindex!
    @ .\array.jl:969 [inlined]
  [4] acclogp!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\threadsafe.jl:25 [inlined]
  [5] tilde_observe!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\context_implementations.jl:184 [inlined]
  [6] tilde_observe!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\context_implementations.jl:170 [inlined]
  [7] funnel(__model__::DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{NamedTuple{(:θ,), Tuple{Float32}}, DynamicPPL.DefaultContext}}, __varinfo__::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.SimpleVarInfo{NamedTuple{(), Tuple{}}, Int64, MuseInference.PartialTransformation{Tuple{AbstractPPL.VarName{:z, Setfield.IdentityLens}}}}, Vector{Int64}}, __context__::DynamicPPL.SamplingContext{DynamicPPL.SampleFromPrior, DynamicPPL.ConditionContext{NamedTuple{(:θ,), Tuple{Float32}}, DynamicPPL.DefaultContext}, Xoshiro})
    @ Main c:\Users\mpritchard\Desktop\Untitled-1__asdf.jl:27
  [8] _evaluate!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\model.jl:582 [inlined]
  [9] evaluate_threadsafe!!(model::DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{NamedTuple{(:θ,), Tuple{Float32}}, DynamicPPL.DefaultContext}}, varinfo::DynamicPPL.SimpleVarInfo{NamedTuple{(), Tuple{}}, Int64, MuseInference.PartialTransformation{Tuple{AbstractPPL.VarName{:z, Setfield.IdentityLens}}}}, context::DynamicPPL.SamplingContext{DynamicPPL.SampleFromPrior, DynamicPPL.DefaultContext, Xoshiro})
    @ DynamicPPL C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\model.jl:571
 [10] evaluate!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\model.jl:506 [inlined]
 [11] evaluate!!
    @ C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\model.jl:519 [inlined]
 [12] evaluate!!(model::DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{NamedTuple{(:θ,), Tuple{Float32}}, DynamicPPL.DefaultContext}}, rng::Xoshiro, varinfo::DynamicPPL.SimpleVarInfo{NamedTuple{(), Tuple{}}, Int64, MuseInference.PartialTransformation{Tuple{AbstractPPL.VarName{:z, Setfield.IdentityLens}}}})    
    @ DynamicPPL C:\Users\mpritchard\.julia\packages\DynamicPPL\oJMmE\src\model.jl:519
 [13] sample_x_z(prob::TuringMuseProblem{AbstractDifferentiation.ReverseRuleConfigBackend{Zygote.ZygoteRuleConfig{Zygote.Context{false}}}, DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}}, rng::Xoshiro, θ::ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(θ = 1,)}}})
    @ MuseInference C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\turing.jl:214
 [14] (::MuseInference.var"#47#53"{Nothing, TuringMuseProblem{AbstractDifferentiation.ReverseRuleConfigBackend{Zygote.ZygoteRuleConfig{Zygote.Context{false}}}, DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}}})(rng::Xoshiro)
    @ MuseInference C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\muse.jl:146
 [15] iterate
    @ .\generator.jl:47 [inlined]
 [16] _collect(c::Vector{Xoshiro}, itr::Base.Generator{Vector{Xoshiro}, MuseInference.var"#47#53"{Nothing, TuringMuseProblem{AbstractDifferentiation.ReverseRuleConfigBackend{Zygote.ZygoteRuleConfig{Zygote.Context{false}}}, DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}}}}, #unused#::Base.EltypeUnknown, isz::Base.HasShape{1})
    @ Base .\array.jl:802
 [17] collect_similar(cont::Vector{Xoshiro}, itr::Base.Generator{Vector{Xoshiro}, MuseInference.var"#47#53"{Nothing, TuringMuseProblem{AbstractDifferentiation.ReverseRuleConfigBackend{Zygote.ZygoteRuleConfig{Zygote.Context{false}}}, DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}}}})
    @ Base .\array.jl:711
 [18] map(f::Function, A::Vector{Xoshiro})
    @ Base .\abstractarray.jl:3261
 [19] pmap(f::Function, #unused#::MuseInference.LocalWorkerPool, args::Vector{Xoshiro})
    @ MuseInference C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\util.jl:75
 [20] muse!(result::MuseResult, prob::TuringMuseProblem{AbstractDifferentiation.ReverseRuleConfigBackend{Zygote.ZygoteRuleConfig{Zygote.Context{false}}}, DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}}, θ₀::Int64; rng::Nothing, z₀::Nothing, maxsteps::Int64, θ_rtol::Float64, ∇z_logLike_atol::Float64, nsims::Int64, α::Float64, progress::Bool, pool::MuseInference.LocalWorkerPool, regularize::typeof(identity), H⁻¹_like′::Nothing, H⁻¹_update::Symbol, broyden_memory::Float64, checkpoint_filename::Nothing, get_covariance::Bool, save_MAPs::Bool)
    @ MuseInference C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\muse.jl:145
 [21] muse!
    @ C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\muse.jl:112 [inlined]
 [22] #muse!#132
    @ C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\turing.jl:235 [inlined]
 [23] muse(::DynamicPPL.Model{typeof(funnel), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{NamedTuple{(:x,), Tuple{Vector{Float64}}}, DynamicPPL.DefaultContext}}, ::Vararg{Any}; kwargs::Base.Pairs{Symbol, Bool, Tuple{Symbol}, NamedTuple{(:get_covariance,), Tuple{Bool}}})
    @ MuseInference C:\Users\mpritchard\.julia\packages\MuseInference\d1Xjv\src\muse.jl:107
 [24] top-level scope
    @ c:\Users\mpritchard\Desktop\Untitled-1__asdf.jl:35

PS: I get the same error running the longer example in the MuseInference.jl documentation.

Yeah, this error pops up when you start Julia with multiple threads, even if you don’t use them; my guess is you are doing so, even if not intentionally. I believe it’s a DynamicPPL bug which I have been meaning to file an issue for. In the meantime, following here, defining this should fix it:

import Turing.DynamicPPL as DynPPL

# Use an abstractly-typed Vector{Real} for the per-thread log-prob
# accumulators, so float values can be stored even when the varinfo's
# logp type is Int64 (which is what triggers the InexactError):
function DynPPL.ThreadSafeVarInfo(vi::DynPPL.SimpleVarInfo)
    return DynPPL.ThreadSafeVarInfo(vi, Vector{Real}(zeros(typeof(DynPPL.getlogp(vi)), Threads.nthreads())))
end

Edit: ThreadSafeVarInfo(::SimpleVarInfo) bug with ForwardDiff · Issue #524 · TuringLang/DynamicPPL.jl · GitHub

Thank you. I’m afraid I now get a new error:

ERROR: Mutating arrays is not supported -- called setindex!(Vector{Real}, ...)
This error occurs when you ask Zygote to differentiate operations that change
the elements of arrays in place (e.g. setting values with x .= ...)     

Yeah, I ran into that as well. I believe that’s a bona fide DynamicPPL/Zygote issue, e.g.:

Threads.nthreads() # 2 or more
using Turing, Zygote
@model function foo()
    x ~ Normal()
end
model = foo()
Zygote.gradient(x -> logjoint(model, (;x)), 1) # your error

You can follow along here to see if we figure it out. If you don’t actually need threads, make sure you start Julia with only one; or, if ForwardDiff works for you, the latest tagged version of MuseInference (0.2.4, from today) is fine with it on any number of threads.

The above comment wasn’t quite accurate: we do support Zygote usage with threads, but not when calling logjoint(model, (;x)), because that is effectively just a wrapper around

logjoint(model, SimpleVarInfo((;x), 0.0))

In short, in Turing / DynamicPPL we have two different “trace” structures that are used when working with a model:

  • VarInfo which is the one that has been around since the beginning of Turing, and is the one which is most heavily used.
  • SimpleVarInfo which is a “new” one that is supposed to be much “simpler” to work with (the implementation of VarInfo is quite complicated) and allows us to provide some simpler functionality for querying the model about some information, e.g. logjoint, in a nicer way than what we do with VarInfo.

By default, the threadsafe VarInfo uses a Vector of Refs, one per thread, which are mutated to accumulate the log-probability; mutating a Ref is supported by Zygote. The threadsafe SimpleVarInfo, by contrast, uses a Vector{<:Real}, i.e. no Refs, and hence Zygote complains about the array mutation.
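The Vector-vs-Ref distinction can be seen in miniature outside of DynamicPPL. This is a hypothetical sketch, not DynamicPPL code: `f` accumulates into a plain Vector (as the threadsafe SimpleVarInfo does) and `g` into a Ref (as the threadsafe VarInfo does):

```julia
using Zygote

# Accumulating into a plain Vector inside the differentiated function
# triggers "Mutating arrays is not supported -- called setindex!(...)":
f(x) = (acc = zeros(1); acc[1] += x^2; acc[1])

# Accumulating into a Ref involves no array mutation, so Zygote
# can differentiate it:
g(x) = (acc = Ref(0.0); acc[] += x^2; acc[])

# Zygote.gradient(f, 3.0)  # throws the array-mutation error
# Zygote.gradient(g, 3.0)  # should return (6.0,), i.e. d(x^2)/dx at x = 3
```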

Sooo, the way you can make this work is as follows:

Zygote.gradient(x -> logjoint(model, SimpleVarInfo((;x), Ref(0.0))), 1)

That is, we manually wrap the logp in a Ref, which is then compatible with Zygote.

Unfortunately, SimpleVarInfo with Ref is not as well-tested, so when trying the above I ran into an error which is fixed by adding

DynamicPPL.getlogp(vi::DynamicPPL.SimpleVarInfo{<:Any,<:Ref}) = vi.logp[]

which I’ll now make a PR for :grimacing:

Thank you. For now I have turned off multiple threads and it works for me.

Unfortunately, SimpleVarInfo with Ref is not as well-tested, so when trying the above I ran into an error which is fixed by adding

This is now fixed on DPPL@0.23.15