Optimizer not found error

@ChrisRackauckas,
I am running a UODE (universal ODE) solver using Optimization.jl, as recommended. Here are a few lines from a larger code base (I might construct an MWE later).

adtype = Optimization.AutoZygote()
# adtype = Optimization.AutoFiniteDiff()
# ERROR: Cannot find function signature. Must wait on Rackauckas
optf = Optimization.OptimizationFunction((x,p)->loss_neuralode(x), adtype)
# ComponentArray is a projection operator
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(ps_NN))
res1 = Optimization.solve(optprob, Adam(0.1), maxiters=200)

I get the error:

Chosen Optimizer: Adam{Float64}(0.1, (0.8999999761581421, 0.9990000128746033), 2.220446049250313e-16)
Stacktrace:
 [1] __solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#24#25", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(coeffs = ViewAxis(1:20, ShapedAxis((10, 2), NamedTuple())),)}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Adam{Float64}; kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:maxiters,), Tuple{Int64}}})
   @ SciMLBase ~/.julia/packages/SciMLBase/QqtZA/src/solve.jl:177
 [2] solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#24#25", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(coeffs = ViewAxis(1:20, ShapedAxis((10, 2), NamedTuple())),)}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Adam{Float64}; kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:maxiters,), Tuple{Int64}}})
   @ SciMLBase ~/.julia/packages/SciMLBase/QqtZA/src/solve.jl:88
 [3] top-level scope
   @ ~/src/2022/basic_UODE/custom_lux_layer/Lokta-Voltera_polylayer.jl:160
Adam{Float64}(0.1, (0.8999999761581421, 0.9990000128746033), 2.220446049250313e-16)

I looked at the Julia source code and found that this fallback `__solve` method in SciMLBase.jl is what gets called:

# if no cache interface is supported, at least the following method has to be defined
function __solve(prob::OptimizationProblem, alg, args...; kwargs...)
    throw(OptimizerMissingError(alg))
end

Why is it that the Adam algorithm is not recognized? While waiting for a response, I’ll work on an MWE. Thanks.

Did you do `using OptimizationOptimisers`?

No, I used Optimisers on its own. Is that not correct?

I checked out OptimizationOptimisers and looked at its various functions. I notice that ADAM, which is deprecated, is still in this package. Why? I also notice that freeze! is not in the package, although it is in Optimisers.jl. So my question is: when does one use OptimizationOptimisers.jl as opposed to Optimisers.jl? Thanks.

In fact, a Google search for “OptimizationOptimisers.jl” does not return any results. Not clear how that is possible. :slight_smile:

ADAM is different from Adam. Flux has ADAM, and Optimisers has Adam.
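To make the naming difference concrete, here is a minimal sketch assuming Optimisers.jl is installed (Flux’s old all-caps `ADAM` is mentioned only in a comment):

```julia
using Optimisers

# Optimisers.jl exports the modern rule, spelled `Adam` (note the capitalization):
opt = Optimisers.Adam(0.1)   # learning rate 0.1

# Flux.jl historically exported an all-caps `ADAM`. The two are different
# types, so a method written for `Optimisers.Adam` will not match `Flux.ADAM`,
# and vice versa; the capitalization is what dispatch sees.
```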

To use any of the optimization packages with Optimization.jl, you need to load the wrapper. See the documentation page:

https://docs.sciml.ai/Optimization/stable/optimization_packages/optimisers/
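Concretely, the fix for the snippet in the original post is to load the wrapper before calling `solve`. A minimal sketch (`loss_neuralode` and `ps_NN` are the names from the original post, defined elsewhere in that code):

```julia
using Optimization
using OptimizationOptimisers  # defines the __solve method for Optimisers rules like Adam
using ComponentArrays

adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss_neuralode(x), adtype)
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(ps_NN))

# With OptimizationOptimisers loaded, Adam is now recognized:
res1 = Optimization.solve(optprob, Adam(0.1), maxiters = 200)
```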

It’s a bit unfortunate, though with Julia v1.9 we can make use of the new weak dependencies feature to make that easier.
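For reference, the Julia v1.9 weak-dependencies feature mentioned here lets a package declare an extension module that loads automatically only when the user also loads the other package. A hypothetical sketch of what this could look like in a Project.toml (illustrative only, not the actual file of either package; the extension name is made up):

```toml
[weakdeps]
# UUID elided; it would be copied from the dependency's own Project.toml
Optimization = "..."

[extensions]
# OptimisersOptimizationExt would load automatically once both
# Optimisers and Optimization are in the session
OptimisersOptimizationExt = "Optimization"
```

With something like this, the glue code in the wrapper package could live in the extension instead, and no separate `using` would be needed.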

Thanks. Note that the demo you shared with me a while back did not require this wrapper. Why is it that Optimisers.jl does not simply include OptimizationOptimisers.jl as a dependency? Is that not possible? Thanks.

It’s not possible until Julia v1.9.

Which demo?

A few weeks ago, when you were updating the documentation, you shared a demo program using the latest libraries that demonstrated the L-V equation with Lux and DataDrivenDiffEq, and helped me get the code running. I just looked, and OptimizationOptimisers is indeed loaded (I had forgotten).

Here is the “demo”: Automatically Discover Missing Physics by Embedding Machine Learning into Differential Equations · Overview of Julia's SciML

Just one question: if I load OptimizationOptimisers, is Optimisers also necessary? Many functions are contained in both libraries, but not all. Why weren’t all the functions in Optimisers.jl transferred over? I am probably digging into the weeds too much :slight_smile:

Gordon

Optimisers.jl is a separate library that we don’t control. OptimizationOptimisers wraps it into the Optimization.jl API. It would be nice if libraries adopted the Optimization.jl API themselves, which would get rid of the wrapper, but the API is still young enough that it’s not surprising they haven’t. With the weak-dependencies machinery we can make the loading automatic.

Thanks. This makes sense. Cheers!