# Optimization.jl and Lux.jl 1.10.0 Compatibility

I was revisiting code I wrote in Julia 1.9.3 that closely mimics the code written by Chris here. It worked fine in 1.9.3, but when I updated my packages and Julia installation to 1.10.0, I started getting the following error:

```
LoadError: type OptimizationState has no field layer_1
```

The error seems to come from the way `Optimization.solve` interacts with the Lux neural network. Here is a snippet of the code in my file.

```julia
rbf(x) = exp.(-(x .^ 2))

# Multilayer FeedForward
U = Lux.Chain(Lux.Dense(1, 5, rbf), Lux.Dense(5, 5, rbf), Lux.Dense(5, 5, tanh),
    Lux.Dense(5, 1))

# Get the initial parameters and state variables of the model
p, st = Lux.setup(rng, U);

# Define the hybrid model
function ude_dynamics!(du, u, p, t, p_true)
    û = U(u, p, st)[1]
    du[1] = p_true[1] * û[1]
end

# Closure with the known parameter
nn_dynamics!(du, u, p, t) = ude_dynamics!(du, u, p, t, p_)

# Define the UDE problem
prob_nn = ODEProblem(nn_dynamics!, u0, tspan, p)
```

which, after some function definitions, continues to

```julia
# Define the optimization function with automatic differentiation
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(p))

res1 = Optimization.solve(optprob, ADAM(), callback = callback, maxiters = 1000)
```

The error is thrown at `res1 = Optimization.solve(optprob, ADAM(), callback = callback, maxiters = 1000)`. Any help is appreciated.

You dropped the important part.

Edit: just figured it out, it's simply the callback function. Replace `loss=loss(p)` with `loss=l`.
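For reference, a minimal sketch of what the fixed callback looks like, assuming a recent Optimization.jl where the callback's first argument is an `OptimizationState` and the second is the loss value already computed for the current iterate:

```julia
# Sketch: in recent Optimization.jl the callback receives an OptimizationState,
# not the raw parameter vector, so calling loss(p) tries to index the state
# like a ComponentArray and fails with "has no field layer_1". Log the
# already-computed loss `l` instead; if the parameters themselves are needed,
# they are available as `state.u`.
function callback(state, l)
    @info "Training" loss=l
    return false  # returning true would stop the optimization early
end
```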

Did you figure it out in the end? I literally copy-pasted the example at

with `maxiters = 1` in a brand new `--temp` environment and I get the same error:

```julia
using ComponentArrays, DiffEqFlux, OrdinaryDiffEq, Optimization, Distributions, Random,
      OptimizationOptimisers, OptimizationOptimJL

nn = Chain(Dense(1, 3, tanh), Dense(3, 1, tanh))
tspan = (0.0f0, 10.0f0)

ffjord_mdl = FFJORD(nn, tspan, (1,), Tsit5(); ad = AutoZygote())
ps, st = Lux.setup(Xoshiro(0), ffjord_mdl)
ps = ComponentArray(ps)
model = StatefulLuxLayer{true}(ffjord_mdl, nothing, st)

# Training
data_dist = Normal(6.0f0, 0.7f0)
train_data = Float32.(rand(data_dist, 1, 100))

function loss(θ)
    logpx, λ₁, λ₂ = model(train_data, θ)
    return -mean(logpx)
end

function cb(p, l)
    @info "FFJORD Training" loss=loss(p)
    return false
end

# adtype was undefined in the snippet as pasted; AutoZygote matches the
# FFJORD `ad` choice above
adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, ps)

res1 = Optimization.solve(
    optprob, OptimizationOptimisers.Adam(0.01); maxiters = 1, callback = cb)
```

```
┌ Error: Exception while generating log record in module Main at REPL[103]:2
│   exception =
│    type OptimizationState has no field layer_1
│    Stacktrace:
│      [1] getproperty
│        @ ./Base.jl:37 [inlined]
│      [2] macro expansion
│        @ ~/.julia/packages/Lux/7UzHr/src/layers/containers.jl:0 [inlined]
│      [3] applychain(layers::@NamedTuple{layer_1::Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(zeros32)}}, x::SubArray{Float32, 2, Matrix{Float32}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, ps::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = ViewAxis(4:6, ShapedAxis((3, 1))))), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = ViewAxis(4:4, ShapedAxis((1, 1))))))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = ViewAxis(4:6, ShapedAxis((3, 1))))), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = ViewAxis(4:4, ShapedAxis((1, 1))))))}}}, Nothing, Optimisers.Leaf{Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, st::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}})
│        @ Lux ~/.julia/packages/Lux/7UzHr/src/layers/containers.jl:478

[tens of thousands of error lines]
```

Yes, thanks for the report. The full answer is in The FFJORD copy-pasteable code doesn't work · Issue #931 · SciML/DiffEqFlux.jl · GitHub.
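For anyone landing here later, a hedged sketch of how the snippet's `cb` can be adapted, assuming the same callback signature change as above (the first argument is an `OptimizationState` whose parameter vector lives in `state.u`):

```julia
function cb(state, l)
    # `l` is the loss Optimization.jl just computed, so there is no need to
    # re-evaluate loss(state); if the raw parameters are needed for extra
    # diagnostics, read them from state.u.
    @info "FFJORD Training" loss=l param_norm=sum(abs2, state.u)
    return false
end
```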