UndefVarError: ComponentArray not defined

using Lux, DiffEqFlux, DifferentialEquations, Optimization, OptimizationOptimJL, Random, Plots

rng = Random.default_rng()
u0 = Float32[1.0, 1.0]
datasize = 80
tspan = (0.0f0, 10f0)
tsteps = range(tspan[1], tspan[2], length = datasize)
p = Float32[1.5, 1.0, 3.0, 1.0]

function lotka_volterra(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = dx = α*x - β*x*y
    du[2] = dy = -δ*y + γ*x*y
end
prob_trueode = ODEProblem(lotka_volterra,u0,tspan,p)

ode_data = Array(solve(prob_trueode, Tsit5(), saveat = tsteps))
plot(tsteps, ode_data[1,:], label = ["x" "y"], xlabel = "Time", ylabel = "Population", title = "Lotka-Volterra", lw = 3)
dudt2 = Lux.Chain(x -> x.^3,
                  Lux.Dense(2, 50, tanh),
                  Lux.Dense(50, 2))
p, st = Lux.setup(rng, dudt2)
prob_neuralode = NeuralODE(dudt2, tspan, Tsit5(), saveat = tsteps)

function predict_neuralode(p)
    Array(prob_neuralode(u0, p, st)[1])
end

function loss_neuralode(p)
    pred = predict_neuralode(p)
    loss = sum(abs2, ode_data .- pred)
    return loss, pred
end

# Do not plot by default for the documentation
# Users should change doplot=true to see the plots in the callback

callback = function (p, l, pred; doplot = false)
    println(l)
    # plot current prediction against data
    if doplot
        plt = scatter(tsteps, ode_data[1,:], label = "data")
        scatter!(plt, tsteps, pred[1,:], label = "prediction")
        display(plot(plt))
    end
    return false
end

pinit = Lux.ComponentArray(p)
callback(pinit, loss_neuralode(pinit)...; doplot=true)

# use Optimization.jl to solve the problem

adtype = Optimization.AutoZygote()

optf = Optimization.OptimizationFunction((x, p) -> loss_neuralode(x), adtype)
optprob = Optimization.OptimizationProblem(optf, pinit)

result_neuralode = Optimization.solve(optprob,
ADAM(0.05),
callback = callback,
maxiters = 500)

optprob2 = remake(optprob,u0 = result_neuralode.u)

result_neuralode2 = Optimization.solve(optprob2,
Optim.BFGS(initial_stepnorm=0.01),
callback=callback,
allow_f_increases = false)

callback(result_neuralode2.u, loss_neuralode(result_neuralode2.u)...; doplot=true)

Did you install ComponentArrays?

Hello Chris, yes, I have installed ComponentArrays.
Manifest.toml (94.8 KB)
Project.toml (1.4 KB)

Oh, it looks like you're missing the `using ComponentArrays` here. Where was this example pulled from?
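For anyone landing here with the same UndefVarError: in recent setups `ComponentArray` comes from the ComponentArrays package itself rather than being re-exported as `Lux.ComponentArray`, so a minimal sketch of the fix (exact re-export behavior is version-dependent) is:

```julia
using ComponentArrays

# Flatten Lux's nested NamedTuple of parameters into a single flat,
# differentiable vector that Optimization.jl can treat as u0.
pinit = ComponentArray(p)
```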

Neural Ordinary Differential Equations · DiffEqFlux.jl (sciml.ai)

Oh interesting, the dev version is updated: Neural Ordinary Differential Equations · DiffEqFlux.jl. I’ll make sure to trigger the stable version to update.

Thank you, Chris. For my thesis project, the idea is to use this approach to simulate an RC circuit using NODEs and see if it can help in predicting the internal temperatures of a house. What do you think? I would be glad to know.
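For context, the grey-box model usually meant by "RC circuit for a house" is a thermal-network ODE, which fits this tutorial's pattern directly. A sketch of a single-resistance, single-capacitance (1R1C) version, where `R`, `C`, `Q`, and the sinusoidal outdoor temperature are illustrative placeholders rather than values from any real building:

```julia
# 1R1C thermal model: C * dT/dt = (T_out - T) / R + Q
# T  = indoor temperature (state), R = envelope thermal resistance,
# C  = thermal capacitance, Q = internal heat gains. All illustrative.
function rc_house!(du, u, p, t)
    T = u[1]
    R, C, Q = p
    T_out = 10f0 + 5f0 * sin(2f0 * Float32(π) * t / 24f0)  # toy daily cycle
    du[1] = ((T_out - T) / R + Q) / C
end
```

A Neural ODE could then either replace this right-hand side entirely or learn a correction term on top of it, trained against measured indoor temperatures in the same way the Lotka-Volterra data is fit above.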

I tried replacing `pinit = Lux.ComponentArray(p)` with `pinit = deepcopy(p)`, but now I have this error:

Optimization algorithm not found. Either the chosen algorithm is not a valid solver
choice for the OptimizationProblem, or the Optimization solver library is not loaded.
Make sure that you have loaded an appropriate Optimization.jl solver library, for example,
solve(prob,Optim.BFGS()) requires using OptimizationOptimJL and
solve(prob,Adam()) requires using OptimizationOptimisers.

For more information, see the Optimization.jl documentation: https://docs.sciml.ai/Optimization/stable/.

Chosen Optimizer: Adam(0.05, (0.9, 0.999), 1.0e-8, IdDict{Any, Any}())

Stacktrace:
[1] __solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#26#27", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}, NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Adam; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:callback, :maxiters), Tuple{var"#23#25", Int64}}})
@ SciMLBase C:\Users\marco\.julia\packages\SciMLBase\QqtZA\src\solve.jl:173
[2] solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#26#27", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}, NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Adam; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:callback, :maxiters), Tuple{var"#23#25", Int64}}})
@ SciMLBase C:\Users\marco\.julia\packages\SciMLBase\QqtZA\src\solve.jl:84
[3] top-level scope
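This error means the `Adam` being passed to `Optimization.solve` has no loaded solver wrapper behind it. A sketch of the usual fix, assuming the current package split (OptimizationOptimisers wraps the Optimisers.jl optimizers such as Adam, OptimizationOptimJL wraps Optim.jl solvers such as BFGS; exact exports are version-dependent):

```julia
using OptimizationOptimisers   # provides Adam for Optimization.solve
using OptimizationOptimJL      # provides the Optim.jl wrappers, e.g. BFGS

result_neuralode = Optimization.solve(optprob,
                                      OptimizationOptimisers.Adam(0.05),
                                      callback = callback,
                                      maxiters = 500)
```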

Did you change your code entirely to the updated tutorial from the dev version? I just set the doc rebuild so that stable is fixed, but you should make sure you’re using the Lux updated form.

I solved it by running the same code in Julia from VS Code instead of a Jupyter notebook. I don't know why, but now it works. Thanks, Chris.