DiffEqFlux, Neural ODEs: Unable to run example code

Hi all,

I’ve recently started using Julia for machine learning. I’ve been playing around with neural networks using Flux and DiffEqFlux, and I’m trying to run some of the example code from the DiffEqFlux documentation, but I always get the same error.

Here is some code, adapted from the example code on the site, which helped me pinpoint where the error is coming from:

using Flux, DiffEqFlux, DifferentialEquations


# Lotka-Volterra predator-prey dynamics
function lotka_volterra(du,u,p,t)
  x, y = u
  α, β, δ, γ = p
  du[1] = dx = α*x - β*x*y
  du[2] = dy = -δ*y + γ*x*y
end

parameters = [2.2; 1.0; 2.0; 0.4] # Initial Parameter Vector

u0 = [1.0, 1.0] # initial condition
tspan = (0.0, 10.0)

prob = ODEProblem(lotka_volterra, u0, tspan, parameters);

function predict_rd(u0)
    prob = ODEProblem(lotka_volterra, u0, tspan, parameters)
    solve(prob, Tsit5(), saveat=0.1)[1, :]
end

function loss_rd(u0)
    sum((predict_rd(u0) .- 1).^2)
end

predict_rd(u0)

gs = gradient(() -> loss_rd(u0), params(parameters))

When I run the last line, where the gradient is calculated, I get the following error:

LoadError: ArgumentError: tuple must be non-empty
in expression starting at C:\Users\chris\neural_nets_tutorials\gradient_test.jl:33
first(::Tuple{}) at tuple.jl:95
_unapply(::Nothing, ::Tuple{}) at lib.jl:163
_unapply(::Tuple{Nothing}, ::Tuple{}) at lib.jl:167
_unapply(::Tuple{Tuple{Nothing}}, ::Tuple{}) at lib.jl:167
_unapply(::Tuple{NTuple{6,Nothing},Tuple{Nothing}}, ::Tuple{Nothing,Nothing,Nothing,Array{Float64,1},Array{Float64,1},Nothing}) at lib.jl:168
unapply(::Tuple{NTuple{6,Nothing},Tuple{Nothing}}, ::Tuple{Nothing,Nothing,Nothing,Array{Float64,1},Array{Float64,1},Nothing}) at lib.jl:177
(::Zygote.var"#188#189"{Zygote.var"#kw_zpullback#40"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#179"{Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},Tsit5,InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float64,1},Tuple{},NamedTuple{(),Tuple{}},Colon}},Tuple{NTuple{6,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at lib.jl:195
(::Zygote.var"#1741#back#190"{Zygote.var"#188#189"{Zygote.var"#kw_zpullback#40"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#179"{Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},Tsit5,InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float64,1},Tuple{},NamedTuple{(),Tuple{}},Colon}},Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at adjoint.jl:59
#solve#59 at solve.jl:70 [inlined]
(::typeof(∂(#solve#59)))(::Array{Float64,2}) at interface2.jl:0
(::Zygote.var"#188#189"{typeof(∂(#solve#59)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at lib.jl:194
(::Zygote.var"#1741#back#190"{Zygote.var"#188#189"{typeof(∂(#solve#59)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at adjoint.jl:59
solve at solve.jl:68 [inlined]
(::typeof(∂(solve##kw)))(::Array{Float64,2}) at interface2.jl:0
predict_rd at gradient_test.jl:21 [inlined]
(::typeof(∂(predict_rd)))(::Array{Float64,1}) at interface2.jl:0
loss_rd at gradient_test.jl:27 [inlined]
(::typeof(∂(loss_rd)))(::Float64) at interface2.jl:0
#119 at gradient_test.jl:33 [inlined]
(::typeof(∂(#119)))(::Float64) at interface2.jl:0
(::Zygote.var"#69#70"{Params,Zygote.Context,typeof(∂(#119))})(::Float64) at interface.jl:255
gradient(::Function, ::Params) at interface.jl:59
top-level scope at gradient_test.jl:33
include_string(::Function, ::Module, ::String, ::String) at loading.jl:1088

I’ve tried to update DiffEqSensitivity to version 6.50.1, per this post’s suggestion (DiffEqFlux Example codes not working), but this gives the following error:

ERROR: Unsatisfiable requirements detected for package CuArrays [3a865a2d]:

I’ve cut the above output short as it is very long.

These are my package versions:

[aae7a2af] DiffEqFlux v1.39.0
[41bf760c] DiffEqSensitivity v6.49.1
[0c46a032] DifferentialEquations v6.18.0
[587475ba] Flux v0.12.1

I’d be grateful for any help. I’m just trying to get the example code to run so I can finally start applying these methods.

I am not sure why you can’t update the dependencies, but I can tell you that the code does work with:

(jl_kbMB9w) pkg> st
      Status `/tmp/jl_kbMB9w/Project.toml`
  [aae7a2af] DiffEqFlux v1.43.0
  [0c46a032] DifferentialEquations v6.19.0
  [587475ba] Flux v0.12.6

The package versions are probably being held back by something else in that environment. Can you try ]up, and if that doesn’t change the versions, try ]add DifferentialEquations@v6.19.0 (and similarly for the other out-of-date dependencies)?
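For example, a session forcing the updates might look like this (a sketch only; the versions are the ones from the working environment above, and you would pin whichever packages the resolver refuses to update on its own):

pkg> up
pkg> add DifferentialEquations@6.19.0
pkg> add DiffEqFlux@1.43.0 Flux@0.12.6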

Another option would be to use a different environment than the default one (which is recommended anyway, precisely to keep packages you’re not actively using from holding back the ones you do use).
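For instance, you could start a clean environment just for this tutorial (the environment name below is arbitrary):

pkg> activate neural_ode_tutorial
pkg> add DiffEqFlux DifferentialEquations Flux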

That output would be very helpful in diagnosing why your packages are being held back, though. You can use the “Hide Details” option in Discourse (click on the gear in the reply box and select “Hide Details”) to collapse very long output into a collapsible box.


Late update, but I figured out what the problem was: the new versions of the libraries required a newer version of Julia. I was using Julia 1.5.4, whereas the libraries required Julia 1.6.2.
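You can check which Julia version you are running directly in the REPL (the output below is what mine showed):

julia> VERSION
v"1.5.4"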

For anyone having similar problems, I would suggest checking the Project.toml file of each library on GitHub for any compatibility issues.
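For example, a package declares the Julia versions it supports in the [compat] section of its Project.toml. A hypothetical entry like the one below excludes Julia 1.5, so on 1.5.4 the resolver quietly holds that package at an older release instead of installing the latest one:

[compat]
julia = "1.6"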
