I have been testing the Neural ODE example code with Julia 1.8.5, and I get a ComponentArray error:

```julia
Lux, DiffEqFlux, DifferentialEquations, Optimization, OptimizationOptimJL, Random, Plots,ComponentArrays

rng = Random.default_rng()
u0 = Float32[2.0; 0.0]
datasize = 30
tspan = (0.0f0, 1.5f0)
tsteps = range(tspan[1], tspan[2], length = datasize)

function trueODEfunc(du, u, p, t)
    true_A = [-0.1 2.0; -2.0 -0.1]
    du .= ((u.^3)'true_A)'
end

prob_trueode = ODEProblem(trueODEfunc, u0, tspan)
ode_data = Array(solve(prob_trueode, Tsit5(), saveat = tsteps))

dudt2 = Lux.Chain(x -> x.^3,
                  Lux.Dense(2, 50, tanh),
                  Lux.Dense(50, 2))
p, st = Lux.setup(rng, dudt2)
prob_neuralode = NeuralODE(dudt2, tspan, Tsit5(), saveat = tsteps)

function predict_neuralode(p)
    Array(prob_neuralode(u0, p, st)[1])
end

function loss_neuralode(p)
    pred = predict_neuralode(p)
    loss = sum(abs2, ode_data .- pred)
    return loss, pred
end

# Do not plot by default for the documentation
# Users should change doplot=true to see the plots callbacks
callback = function (p, l, pred; doplot = false)
    println(l)
    # plot current prediction against data
    if doplot
        plt = scatter(tsteps, ode_data[1, :], label = "data")
        scatter!(plt, tsteps, pred[1, :], label = "prediction")
        display(plot(plt))
    end
    return false
end

pinit = Lux.ComponentArray(p)
callback(pinit, loss_neuralode(pinit)...; doplot = true)

# use Optimization.jl to solve the problem
adtype = Optimization.AutoZygote()

optf = Optimization.OptimizationFunction((x, p) -> loss_neuralode(x), adtype)
optprob = Optimization.OptimizationProblem(optf, pinit)

result_neuralode = Optimization.solve(optprob,
                                      ADAM(0.05),
                                      callback = callback,
                                      maxiters = 300)

optprob2 = remake(optprob, u0 = result_neuralode.u)

result_neuralode2 = Optimization.solve(optprob2,
                                       Optim.BFGS(initial_stepnorm = 0.01),
                                       callback = callback,
                                       allow_f_increases = false)

callback(result_neuralode2.u, loss_neuralode(result_neuralode2.u)...; doplot = true)
```

What error do you get? Can you share the whole stack trace?

```
UndefVarError: ComponentArray not defined

Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
   @ Base .\Base.jl:31
 [2] top-level scope
   @ In[59]:1
```

You could add and use the ComponentArrays package explicitly.

Like here: First Neural ODE example — PDEs

Edit: I should have referred to the original source: Neural Ordinary Differential Equations · DiffEqFlux.jl

If I’m reading your quote literally, this is incorrect because you dropped the using statement:

```julia
using Lux, DiffEqFlux, DifferentialEquations, Optimization, OptimizationOptimJL, Random, Plots,ComponentArrays
```

and you need to make sure these packages are installed.
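For reference, a minimal sketch of what that looks like, assuming the package is already installed in the active environment (the NamedTuple `p` here is just a stand-in for the Lux parameters, to keep the snippet self-contained):

```julia
# One-time setup, if needed:  using Pkg; Pkg.add("ComponentArrays")
using ComponentArrays

# Toy stand-in for the (weight, bias) parameters Lux.setup returns
p = (weight = rand(Float32, 2, 2), bias = zeros(Float32, 2))

# ComponentArray lives in ComponentArrays, not in Lux
pinit = ComponentArray(p)

length(pinit)        # flat vector view over all named components
size(pinit.weight)   # the named blocks remain accessible
```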


Thank you. Your example is running smoothly with Julia 1.9.1.

Thank you Chris. But there seems to be a complaint at the line `pinit = Lux.ComponentArray(p)` (with Julia 1.9.1).

What does it say? Please post full stack traces.

```
UndefVarError: ComponentArray not defined

Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
   @ Base .\Base.jl:31
 [2] top-level scope
   @ In[2]:46
```

What did you get when you did `using ComponentArrays`?

Same error as before.

So you opened a new REPL, did `using ComponentArrays`, got no warnings or errors, and then did `ComponentArray(x = rand(4, 4))` or something of the sort, and it said it wasn’t defined?
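That sanity check would look something like this, assuming a fresh session with ComponentArrays installed:

```julia
using ComponentArrays

# If this runs without an UndefVarError, the package itself is fine
ca = ComponentArray(x = rand(4, 4))

size(ca.x)   # the named block is accessible
length(ca)   # 16 underlying elements in the flat view
```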

I am using Julia 1.8.5. Being new to Julia, I also have problems. I copied the code from “Neural Ordinary Differential Equations · DiffEqFlux.jl”.

  1. I have the same error with `pinit = Lux.ComponentArray(p)`, and it works after I add `using ComponentArrays` and drop the `Lux.` prefix:

     ```julia
     using ComponentArrays
     pinit = ComponentArray(p)
     ```

  2. But I am still stuck at `Optimization.solve` using ADAM. I changed it to the following, but it still does not work, with the error “Adam not defined”:

     ```julia
     using OptimizationOptimisers
     result_neuralode = Optimization.solve(optprob,
                                           Optim.Adam(0.05),
                                           callback = callback,
                                           maxiters = 300)
     ```

     The error message changed to “optimization algorithm not found” if I used plain `Adam` instead of `Optim.Adam`, as below:

     ```julia
     result_neuralode = Optimization.solve(optprob,
                                           Adam(0.05),
                                           callback = callback,
                                           maxiters = 300)
     ```

     Any suggestion on how I can proceed? Thanks.

Follow the dev docs: Neural Ordinary Differential Equations · DiffEqFlux.jl

I thought I linked it to you the other day. I’m trying to figure out why the docs build is behind.

Sorry, I am a different person but encountered the same problem. I did download the linked file you mentioned. I still have the error at the ADAM line:

```julia
result_neuralode = Optimization.solve(optprob,
                                      ADAM(0.05),
                                      callback = callback,
                                      maxiters = 300)
```

The error message is as follows:

```
ERROR: Optimization algorithm not found. Either the chosen algorithm is not a valid solver
choice for the OptimizationProblem, or the Optimization solver library is not loaded.
Make sure that you have loaded an appropriate Optimization.jl solver library, for example,
solve(prob,Optim.BFGS()) requires using OptimizationOptimJL and
solve(prob,Adam()) requires using OptimizationOptimisers.
```

For more information, see the Optimization.jl documentation: Optimization.jl: A Unified Optimization Package · Optimization.jl.


Did you `using OptimizationOptimisers`?
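To illustrate on a toy objective (the quadratic and its names are illustrative, not from the Neural ODE example): once OptimizationOptimisers is loaded, `Adam` resolves and `solve` accepts it. This sketch assumes Optimization, OptimizationOptimisers, and ForwardDiff are installed.

```julia
using Optimization, OptimizationOptimisers, ForwardDiff

# Minimize ||u - p||^2, so the optimum is u == p
quadratic(u, p) = sum(abs2, u .- p)

optf = OptimizationFunction(quadratic, Optimization.AutoForwardDiff())
optprob = OptimizationProblem(optf, zeros(2), [1.0, 2.0])

# Adam is exported by OptimizationOptimisers; without that `using`,
# solve throws the "Optimization algorithm not found" error above
sol = solve(optprob, Adam(0.05); maxiters = 500)
sol.u   # should be close to [1.0, 2.0]
```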


Hello, I am facing the same error, and I already import and use the package OptimizationOptimisers.

Can you share an MWE?

After I restarted the computer and ran the code as posted, the original source works: Neural Ordinary Differential Equations · DiffEqFlux.jl

I am not sure what happened. Even though it gave me an error before, it now works. Is there something one should do to clear an existing cache or precompiled artifacts?

Much appreciated.
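For what it's worth, a few standard-library Pkg commands sometimes clear up stale environment or precompile state without a reboot (a sketch, not a guaranteed fix):

```julia
using Pkg

Pkg.status()       # confirm which package versions the active environment actually has
Pkg.resolve()      # re-sync Manifest.toml with Project.toml
Pkg.precompile()   # rebuild any stale precompile caches up front
```

Restarting the Julia session (rather than the computer) after these is usually enough, since `using` results are cached per session.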
