Unsupported Argument in Optimization.solve

I am running the Neural ODE demonstration found at
https://diffeqflux.sciml.ai/stable/examples/neural_ode/:

result_neuralode2 = Optimization.solve(optprob2,
    Optim.BFGS(initial_stepnorm = 0.01),
    callback = callback)
    # allow_f_increases = false)  # unsupported argument. Strange.

and get the error message

ERROR: MethodError: no method matching __map_optimizer_args(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#52#53", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = 1:0, layer_2 = ViewAxis(1:150, Axis(weight = ViewAxis(1:100, ShapedAxis((50, 2), NamedTuple())), bias = ViewAxis(101:150, ShapedAxis((50, 1), NamedTuple())))), layer_3 = ViewAxis(151:252, Axis(weight = ViewAxis(1:100, ShapedAxis((2, 50), NamedTuple())), bias = ViewAxis(101:102, ShapedAxis((2, 1), NamedTuple())))))}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}; callback=OptimizationOptimJL.var"#_cb#11"{var"#49#51", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}(var"#49#51"(), BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}(LineSearches.InitialStatic{Float64}
  alpha: Float64 1.0
  scaled: Bool false
, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}
  delta: Float64 0.1
  sigma: Float64 0.9
  alphamax: Float64 Inf
  rho: Float64 5.0
  epsilon: Float64 1.0e-6
  gamma: Float64 0.66
  linesearchmax: Int64 50
  psi3: Float64 0.1
  display: Int64 0
  mayterminate: Base.RefValue{Bool}
, nothing, 0.01, Flat()), Base.Iterators.Cycle{Tuple{Optimization.NullData}}((Optimization.NullData(),)), Core.Box(#undef), Core.Box(Optimization.NullData()), Core.Box(2)), maxiters=nothing, maxtime=nothing, abstol=nothing, reltol=nothing, allow_f_increases=false)
Closest candidates are:
  __map_optimizer_args(::OptimizationProblem, ::Optim.AbstractOptimizer; callback, maxiters, maxtime, abstol, reltol) at ~/.julia/packages/OptimizationOptimJL/MgZBn/src/OptimizationOptimJL.jl:16 got unsupported keyword argument "allow_f_increases"
Stacktrace:

which seems clear enough, but the documentation for Optimization.jl (and for OptimizationOptimJL) does not mention this restriction, nor have I found any record of this error anywhere.

I notice that the latest version of Optimization.jl, as listed at https://optimization.sciml.ai/stable/optimization_packages/optim/, is 3.9.2, and when I look at the Manifest.toml of my project I find that this is indeed the version installed:

[[deps.Optimization]]
deps = ["ArrayInterfaceCore", "ConsoleProgressMonitor", "DocStringExtensions", "Logging", "LoggingExtras", "Pkg", "Printf", "ProgressLogging", "Reexport", "Requires", "SciMLBase", "SparseArrays", "TerminalLoggers"]
git-tree-sha1 = "da5ac09bd9d4d7a0a69ccb5ee7f4c870b7a6bc7c"
uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
version = "3.9.2"
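
For what it's worth, the installed version can also be confirmed directly from the REPL with the standard Pkg API, without opening Manifest.toml:

using Pkg
Pkg.status("Optimization")   # reports the resolved version of Optimization.jl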

I conclude that there is an inconsistency between the documentation and “reality”. Is there another argument to replace allow_f_increases?
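
For comparison, Optim.jl on its own does accept this option through Optim.Options, so I would have expected the wrapper to forward it. A minimal sketch, calling Optim.jl directly and bypassing Optimization.jl entirely:

using Optim

# Standalone Optim.jl takes allow_f_increases as a field of Optim.Options.
rosenbrock(u) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2
res = optimize(rosenbrock, zeros(2), BFGS(initial_stepnorm = 0.01),
               Optim.Options(allow_f_increases = true))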

Thanks for any insight.

Gordon

Seems fine:

using Optimization, OptimizationOptimJL
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)
sol = solve(prob, BFGS(), allow_f_increases = true)

Can you share your ]st (package status)?
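
If you're not at the Pkg REPL prompt, the equivalent call is:

using Pkg
Pkg.status()   # prints the same environment listing as ]st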
