DiffEqFlux example code not working

Hello people,
I am trying to get into the DiffEqFlux package and neural ODE networks, but the example code from the documentation of the package is not working for me. It raises the error:

LoadError: ArgumentError: tuple must be non-empty
Stacktrace:
[1] first(#unused#::Tuple{})
@ Base .\tuple.jl:134
[2] _unapply(t::Nothing, xs::Tuple{})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:163
[3] _unapply(t::Tuple{Nothing}, xs::Tuple{}) (repeats 2 times)
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:167
[4] _unapply(t::Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, xs::Tuple{Nothing, Nothing, Nothing, Vector{Float32}, Vector{Float32}, Nothing})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:168
[5] unapply(t::Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, xs::Tuple{Nothing, Nothing, Nothing, Vector{Float32}, Vector{Float32}, Nothing})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:177
[6] (::Zygote.var"#188#189"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, Zygote.var"#kw_zpullback#40"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#179"{Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, Tsit5, InterpolatingAdjoint{0, true, Val{:central}, Bool, Bool}, Vector{Float32}, Vector{Float32}, Tuple{}, Colon, NamedTuple{(), Tuple{}}}}})(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:195
[7] (::Zygote.var"#1689#back#190"{Zygote.var"#188#189"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, Zygote.var"#kw_zpullback#40"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#179"{Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, Tsit5, InterpolatingAdjoint{0, true, Val{:central}, Bool, Bool}, Vector{Float32}, Vector{Float32}, Tuple{}, Colon, NamedTuple{(), Tuple{}}}}}})(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
[8] Pullback
@ C:\Users\aspec.julia\packages\DiffEqBase\dCe5g\src\solve.jl:70 [inlined]
[9] (::typeof(∂(#solve#59)))(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface2.jl:0
[10] (::Zygote.var"#188#189"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, typeof(∂(#solve#59))})(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:194
[11] (::Zygote.var"#1689#back#190"{Zygote.var"#188#189"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, typeof(∂(#solve#59))}})(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
[12] Pullback
@ C:\Users\aspec.julia\packages\DiffEqBase\dCe5g\src\solve.jl:68 [inlined]
[13] (::typeof(∂(solve##kw)))(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface2.jl:0
[14] Pullback
@ D:\Users\aspec\Desktop\Juno\test.jl:23 [inlined]
[15] (::typeof(∂(predict_n_ode)))(Δ::Matrix{Float32})
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface2.jl:0
[16] Pullback
@ D:\Users\aspec\Desktop\Juno\test.jl:27 [inlined]
[17] (::typeof(∂(loss_n_ode)))(Δ::Float32)
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface2.jl:0
[18] #188
@ C:\Users\aspec.julia\packages\Zygote\zowrf\src\lib\lib.jl:194 [inlined]
[19] #1689#back
@ C:\Users\aspec.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59 [inlined]
[20] Pullback
@ C:\Users\aspec.julia\packages\Flux\0c9kI\src\optimise\train.jl:102 [inlined]
[21] (::Zygote.var"#69#70"{Params, typeof(∂(#38)), Zygote.Context})(Δ::Float32)
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface.jl:255
[22] gradient(f::Function, args::Params)
@ Zygote C:\Users\aspec.julia\packages\Zygote\zowrf\src\compiler\interface.jl:59
[23] macro expansion
@ C:\Users\aspec.julia\packages\Flux\0c9kI\src\optimise\train.jl:101 [inlined]
[24] macro expansion
@ C:\Users\aspec.julia\packages\Juno\n6wyj\src\progress.jl:119 [inlined]
[25] train!(loss::Function, ps::Params, data::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, opt::ADAM; cb::var"#78#80")
@ Flux.Optimise C:\Users\aspec.julia\packages\Flux\0c9kI\src\optimise\train.jl:99
[26] top-level scope
@ D:\Users\aspec\Desktop\Juno\test.jl:48
[27] eval
@ .\boot.jl:360 [inlined]
[28] include_string(mapexpr::typeof(identity), mod::Module, code::String, filename::String)
@ Base .\loading.jl:1094

I also tried to run the code from this nice blog post. At first I got UndefVarError: param not defined, so I changed p = param([2.2, 1.0, 2.0, 0.4]) to p = Flux.params([2.2, 1.0, 2.0, 0.4]), and then I got BoundsError: attempt to access Params at index [2].
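For reference, `param` was removed from Flux around v0.10, and `Flux.params` returns a `Params` collection rather than a trainable array, which is why indexing it fails. In current DiffEqFlux examples the parameters are just a plain vector. A minimal sketch of the updated Lotka-Volterra setup, assuming DiffEqFlux ~1.x with `sciml_train` (the loss here is a placeholder, not the blog post's):

```julia
using DifferentialEquations, DiffEqFlux, Flux

function lotka_volterra!(du, u, p, t)
    # standard Lotka-Volterra equations
    α, β, δ, γ = p
    du[1] =  α*u[1] - β*u[1]*u[2]
    du[2] = -δ*u[2] + γ*u[1]*u[2]
end

u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
# plain parameter vector instead of the old `param([...])` wrapper
p = [2.2, 1.0, 2.0, 0.4]
prob = ODEProblem(lotka_volterra!, u0, tspan, p)

# toy loss: sum of squares of the solution at the current parameters
loss(p) = sum(abs2, Array(solve(prob, Tsit5(), p = p, saveat = 0.1)))

result = DiffEqFlux.sciml_train(loss, p, ADAM(0.1), maxiters = 100)
```

The key change is that `p` is an ordinary `Vector{Float64}` that gets passed through `solve` and optimized directly, rather than a `Params`/`TrackedArray` object.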

I am aware of this post. I ran Pkg.update(), and afterwards my versions are Julia 1.6.1, DiffEqFlux 1.39, Flux 0.12.4, Zygote 0.6.12, and DiffEqSensitivity 6.49.1.
There are brand-new versions of Zygote (0.6.14) and DiffEqSensitivity (6.50.1), but Pkg does not download these for me yet. Is this a problem with my package versions?

I think there is something wrong with the parameters in both cases?

Oh no. Does ]add DiffEqSensitivity@6.50.1 give it to you? Not even after ]up? There was a nasty bug in the release from a few days ago that that patch fixes, and that bug is what you found. The good thing is that the broken release was only the latest for about six hours; the bad thing is that you hit it :sweat_smile:

2 Likes

Today I was able to update Zygote and DiffEqSensitivity to their latest versions with Pkg.update().
That fixed it; the code from the documentation now works just fine.
The code from the blog post still doesn't, but since it's from 2019 I'll just leave it and work with the latest documentation.

Thanks Chris!!

1 Like

Hi,
I also had the same issue, and I ran the following commands:

] activate .
] add Flux
] add DifferentialEquations
] add DiffEqFlux
] add Pluto
] up

Now when I run the first Lotka-Volterra ODE example, I get the following error:

result_ode

Failed to show value:

MethodError: no method matching has_syms(::SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, GalacticOptim.var"#163#179"{GalacticOptim.var"#161#177"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#167#183"{GalacticOptim.var"#165#181"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#169#185", Nothing, Nothing, Nothing})

Closest candidates are:

has_syms(!Matched::SciMLBase.AbstractSciMLFunction) at /Users/amit/.julia/packages/SciMLBase/cU5k7/src/scimlfunctions.jl:1312

    rows(::SciMLBase.OptimizationSolution{Float64, 1, Vector{Float64}, SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, GalacticOptim.var"#163#179"{GalacticOptim.var"#161#177"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#167#183"{GalacticOptim.var"#165#181"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#169#185", Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Iterators.Pairs{Symbol, Main.workspace4.var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{Main.workspace4.var"#1#2"}}}}, Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, Optim.MultivariateOptimizationResults{Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, Vector{Float64}, Float64, Float64, Vector{Optim.OptimizationState{Float64, Optim.BFGS{LineSearches.InitialStatic{Float64}, 
LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}}}, Bool, NamedTuple{(:f_limit_reached, :g_limit_reached, :h_limit_reached, :time_limit, :callback, :f_increased), NTuple{6, Bool}}}})@tabletraits.jl:25
    table_data(::SciMLBase.OptimizationSolution{Float64, 1, Vector{Float64}, SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, GalacticOptim.var"#163#179"{GalacticOptim.var"#161#177"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#167#183"{GalacticOptim.var"#165#181"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#169#185", Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Iterators.Pairs{Symbol, Main.workspace4.var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{Main.workspace4.var"#1#2"}}}}, Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, Optim.MultivariateOptimizationResults{Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, Vector{Float64}, Float64, Float64, Vector{Optim.OptimizationState{Float64, 
Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}}}, Bool, NamedTuple{(:f_limit_reached, :g_limit_reached, :h_limit_reached, :time_limit, :callback, :f_increased), NTuple{6, Bool}}}}, ::IOContext{IOBuffer})@PlutoRunner.jl:953
    show_richest(::IOContext{IOBuffer}, ::Any)@PlutoRunner.jl:641
    var"#sprint_withreturned#28"(::IOContext{Base.DevNull}, ::Int64, ::typeof(Main.PlutoRunner.sprint_withreturned), ::Function, ::SciMLBase.OptimizationSolution{Float64, 1, Vector{Float64}, SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, GalacticOptim.var"#163#179"{GalacticOptim.var"#161#177"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#167#183"{GalacticOptim.var"#165#181"{Vector{Float64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Int64}, GalacticOptim.var"#160#176"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoForwardDiff{nothing}, DiffEqFlux.var"#81#86"{typeof(Main.workspace4.loss)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#169#185", Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Iterators.Pairs{Symbol, Main.workspace4.var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{Main.workspace4.var"#1#2"}}}}, Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, Optim.MultivariateOptimizationResults{Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}, Float64, 
Vector{Float64}, Float64, Float64, Vector{Optim.OptimizationState{Float64, Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Optim.Flat}}}, Bool, NamedTuple{(:f_limit_reached, :g_limit_reached, :h_limit_reached, :time_limit, :callback, :f_increased), NTuple{6, Bool}}}})@PlutoRunner.jl:591
    format_output_default(::Any, ::Any)@PlutoRunner.jl:515
    #format_output#17@PlutoRunner.jl:532[inlined]
    formatted_result_of(::Base.UUID, ::Bool, ::Nothing, ::Module)@PlutoRunner.jl:448
    top-level scope@none:1

Any hints on this? Googling “DiffEqFlux has_syms” yielded zero results!

"DifferentialEquations"   v"6.17.2"
"DiffEqFlux"              v"1.41.0"
"Plots"                   v"1.19.3"

What version of SciMLBase? This should all work: tests passed this morning.

1 Like

I think it would be 1.18.2

(neuralode) pkg> add SciMLBase
   Resolving package versions...
    Updating `~/Programs/lammps/neuralode/Project.toml`
  [0bca4576] + SciMLBase v1.18.2
  No Changes to `~/Programs/lammps/neuralode/Manifest.toml`

Julia v1.6? Double check your packages against: Merge pull request #592 from SciML/dg/zeros · SciML/DiffEqFlux.jl@10db629 · GitHub

Yes, Julia 1.6. I will do a fresh install as per Merge pull request #592 from SciML/dg/zeros · SciML/DiffEqFlux.jl@10db629 · GitHub and try again.

Hi, my bad. The marked solution works fine. It was my Pluto environment that defaulted to incompatible versions.

1 Like

Sorry but I am again getting the same error :frowning:

Below is the output of Pkg.installed()

Dict

"DifferentialEquations"  v"6.17.2"
"IJulia"                 v"1.23.2"
"DiffEqFlux"             v"1.41.0"
"Flux"                   v"0.12.6"
"DiffEqSensitivity"      v"6.55.3"
"GalacticOptim"          v"2.0.3"
"SciMLBase"              v"1.18.2"
"Pluto"                  v"0.15.1"

This started exactly when I tried the Neural Ordinary Differential Equations with sciml_train · DiffEqFlux.jl example and installed GalacticOptim; before that, the first example ran fine. I checked that all my versions match the GitHub test output @ChrisRackauckas linked above. Can someone run this example and confirm, or let me know the exact versions to install?
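For what it's worth, one way to rule out the resolver picking incompatible versions is to pin the exact set in a fresh environment. A sketch, using the version numbers from my list above (the environment name is arbitrary):

```julia
using Pkg

# activate a fresh, empty environment so Pluto/global deps can't interfere
Pkg.activate("neuralode-pinned")

# pin the exact versions to install
Pkg.add(PackageSpec(name = "DiffEqFlux",    version = "1.41.0"))
Pkg.add(PackageSpec(name = "GalacticOptim", version = "2.0.3"))
Pkg.add(PackageSpec(name = "SciMLBase",     version = "1.18.2"))

# confirm what the resolver actually installed
Pkg.status()
```

If `Pkg.status()` then shows different versions than requested, the resolver downgraded something to satisfy compat bounds, which would point to the conflict.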


Curiously, when I exit the notebook, I suddenly get a dump of what I believe are loss values:

Pluto.run()      From worker 3:	568.15894f0434.4206f0332.06467f0250.48984f0187.0734f0152.50137f0136.16827f0126.96706f0121.2905f0117.60257f0115.15193f0113.51723f0112.40828f0111.57703f0110.81408f0109.988945f0109.06599f0108.08015f0107.087494f0106.130066f0105.22804f0104.384094f0103.59345f0102.85079f0102.15321f0101.50064f0100.89501f0100.3388f099.833374f099.37761f098.96815f098.59792f098.257515f097.93651f097.62526f097.31654f097.00604f096.69209f096.374794f096.05442f095.73055f095.402435f095.067535f094.72321f094.36593f093.99072f093.59294f093.16605f092.70265f092.194664f091.63346f091.01227f090.328094f089.58445f088.792534f087.970055f087.13608f086.30306f085.47203f084.63121f083.7587f082.83261f081.838165f080.772545f079.64241f078.46562f077.2562f076.02693f074.770584f073.47433f072.11543f070.68138f069.183754f067.657715f066.102715f064.52064f062.886257f061.199165f059.464417f057.68492f055.909504f054.106533f052.221027f050.292976f048.328747f046.310036f044.12147f041.775806f039.193882f036.26379f032.65502f027.907555f022.354113f018.43349f016.897245f016.335962f015.697074f015.608442f015.2385845f014.764184f014.495843f014.2397375f013.890338f013.045919f012.510047f012.131939f011.95006f011.292672f010.760164f010.418838f010.2662115f09.930472f09.695794f09.522998f09.570238f09.5872555f09.547219f09.48484f09.552379f09.506222f09.486923f09.387326f09.337777f09.250162f09.163964f09.088326f08.967246f08.829781f08.696905f08.596681f08.538674f08.394243f08.31968f08.290894f08.187294f08.159099f08.028077f08.021745f07.977979f07.8815804f07.819726f07.8295503f07.6914396f07.6676316f07.627518f07.527976f07.469681f07.441427f07.3785663f07.2955456f07.2708926f07.178184f07.1534233f07.1166983f07.054399f07.0168014f06.972834f06.9272513f06.8617764f06.826621f06.789806f06.719758f06.701697f06.6581135f06.6142445f06.550403f06.5332766f06.465754f06.4106936f06.3928046f06.349188f06.296569f06.285975f06.238286f06.1910152f06.16512f06.1253705f06.082903f06.0443377f06.011204f06.020672f05.915146f05.898919f05.8771763f05.832835f05.7892246f05.762329f05.746
222f05.7536855f05.6681633f05.667557f05.6192584f05.603681f05.5579205f05.527161f05.4997663f05.4583626f05.439699f05.4230833f05.409861f05.3623056f05.337322f05.3116107f05.26598f05.2382646f05.2203703f05.2195477f05.161291f05.1336517f05.1099534f05.070721f05.0339894f05.017902f04.989069f04.9443793f04.923273f04.8770094f04.866687f04.822877f04.8717484f04.765713f04.741526f04.712138f04.6928883f04.691238f04.6152163f04.5952115f04.5558057f04.5217805f04.4913464f04.455991f04.4425898f04.3972635f04.366869f04.3319817f04.2970405f04.2653265f04.2405753f04.2077765f04.1824946f04.1570296f04.119496f04.102591f04.068939f04.0352306f04.023786f03.9831362f03.9697852f03.948108f03.916962f03.8992085f03.8614178f03.8544784f03.83019f03.8080332f03.797446f03.7525935f03.7406178f03.7508569f03.6993494f03.687811f03.6704361f03.6429176f03.6276693f03.600624f03.5798368f03.5632331f03.542536f03.524584f03.5066445f03.4865615f03.4712062f03.4508545f03.4337256f03.4200463f03.3991728f03.3785882f03.3748863f03.357627f03.334887f03.3155105f03.2958634f03.281297f03.2640254f03.250614f03.2333226f03.2149863f03.2039466f03.1877465f03.1728635f03.161281f03.1464067f03.1301765f03.1121116f03.1016512f03.0851889f03.0714684f03.0585163f03.0441897f03.0297153f03.0297153f03.019445f02.9473295f02.8890715f02.7300403f02.6166162f02.534856f02.3499265f01.959265f01.7775936f01.6027204f01.5160251f01.2178062f01.1230245f01.0729015f01.0576413f01.0175136f00.9540978f00.9081272f00.8391925f00.7483134f00.68620694f00.59908384f00.5477955f00.4933506f00.45541283f00.42468095f00.38609543f00.36694038f00.34359246f00.32173973f00.31286559f00.2977528f00.27428943f00.26711512f00.24022837f00.21361437f00.18615705f00.17063197f00.16265129f00.15699944f00.14188226f00.12822522f00.120355316f00.114945725f00.10933838f00.10105974f00.096922345f00.08888299f00.07308835f00.07173807f00.06372737f00.063386895f00.057111f00.05494704f00.05477482f00.054249056f00.054062452f00.052890923f00.052890923f0

I also get the following warnings:

      From worker 3:	WARNING: both Flux and Iterators export "flatten"; uses of it in module DiffEqFlux must be qualified
      From worker 3:	WARNING: both Flux and Distributions export "params"; uses of it in module DiffEqFlux must be qualified

In case this is of any help: I think there may be some collision of function names?
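For reference, those warnings just mean that two loaded packages export the same name, so any unqualified use inside a third module is ambiguous and must be qualified. A minimal sketch of what that looks like for `params`:

```julia
using Flux, Distributions

# `params` is exported by both Flux and Distributions, so a bare
# `params(...)` is ambiguous; qualify the module explicitly:
ps = Flux.params([1.0, 2.0])   # Flux's trainable-parameter collection
d  = Normal(0.0, 1.0)
μσ = Distributions.params(d)   # the distribution's parameter tuple
```

The warnings themselves are usually harmless; they only become errors if DiffEqFlux (or your code) actually calls the colliding name without qualification.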

This doesn’t seem like the same issue. Open a new thread. Sounds like a Pluto thing.