PolyesterForwardDiff - backend choice is not available

Hi everybody,

I’m trying to use PolyesterForwardDiff in my code but am stumbling upon an error. The MWE below is adapted from the docs:

julia> using Optimization, OptimizationOptimJL, ForwardDiff, PolyesterForwardDiff

julia> rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
rosenbrock (generic function with 1 method)

julia> x0 = zeros(2)
2-element Vector{Float64}:
 0.0
 0.0

julia> _p = [1.0, 100.0]
2-element Vector{Float64}:
   1.0
 100.0

julia> optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
(::OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}) (generic function with 1 method)

julia> prob = OptimizationProblem(optf, x0, _p)
OptimizationProblem. In-place: true
u0: 2-element Vector{Float64}:
 0.0
 0.0

julia> sol = solve(prob, BFGS())
retcode: Success
u: 2-element Vector{Float64}:
 0.9999999999373603
 0.9999999998686199

julia> optf = OptimizationFunction(rosenbrock, Optimization.AutoPolyesterForwardDiff())
(::OptimizationFunction{true, AutoPolyesterForwardDiff{nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}) (generic function with 1 method)

julia> prob = OptimizationProblem(optf, x0, _p)
OptimizationProblem. In-place: true
u0: 2-element Vector{Float64}:
 0.0
 0.0

julia> sol = solve(prob, BFGS())
ERROR: ArgumentError: The passed automatic differentiation backend choice is not available. Please load the corresponding AD package PolyesterForwardDiff.
Stacktrace:
  [1] instantiate_function(f::Function, x::Optimization.ReInitCache{…}, adtype::AutoPolyesterForwardDiff{…}, p::Int64, num_cons::Int64)
    @ Optimization ~/.julia/packages/Optimization/xbuNJ/src/function.jl:114
  [2] instantiate_function(f::Function, x::Optimization.ReInitCache{…}, adtype::AutoPolyesterForwardDiff{…}, p::Int64)
    @ Optimization ~/.julia/packages/Optimization/xbuNJ/src/function.jl:106
  [3] OptimizationCache(prob::OptimizationProblem{…}, opt::BFGS{…}, data::Base.Iterators.Cycle{…}; callback::Function, maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, progress::Bool, kwargs::@Kwargs{})
    @ Optimization ~/.julia/packages/Optimization/xbuNJ/src/cache.jl:27
  [4] __init(prob::OptimizationProblem{…}, opt::BFGS{…}, data::Base.Iterators.Cycle{…}; callback::Function, maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, progress::Bool, kwargs::@Kwargs{})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/yMF3E/src/OptimizationOptimJL.jl:101
  [5] __init(prob::OptimizationProblem{…}, opt::BFGS{…}, data::Base.Iterators.Cycle{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/yMF3E/src/OptimizationOptimJL.jl:68
  [6] init(::OptimizationProblem{…}, ::BFGS{…}; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/Dwomw/src/solve.jl:166
  [7] init(::OptimizationProblem{…}, ::BFGS{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/Dwomw/src/solve.jl:164
  [8] solve(::OptimizationProblem{…}, ::BFGS{…}; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/Dwomw/src/solve.jl:96
  [9] solve(::OptimizationProblem{…}, ::BFGS{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/Dwomw/src/solve.jl:93
 [10] top-level scope
    @ REPL[17]:1
Some type information was truncated. Use `show(err)` to see complete types.

I looked at the code for OptimizationCache but couldn’t figure it out. Has anybody encountered this?

My package versions:

(Test) pkg> st
Project Test v0.1.0
Status `~/workspace/julia/1.10/Test/Project.toml`
  [052768ef] CUDA v5.2.0
  [5ae59095] Colors v0.12.10
  [a93c6f00] DataFrames v1.6.1
  [aae7a2af] DiffEqFlux v3.3.1
  [071ae1c0] DiffEqGPU v3.4.1
  [0c46a032] DifferentialEquations v7.12.0
  [31c24e10] Distributions v0.25.107
  [587475ba] Flux v0.14.12
  [f6369f11] ForwardDiff v0.10.36
  [e9467ef8] GLMakie v0.9.9
  [cd3eb016] HTTP v1.10.2
  [7f7a1694] Optimization v3.23.0
  [36348300] OptimizationOptimJL v0.2.2
  [42dfb2eb] OptimizationOptimisers v0.2.1
  [1dea7af3] OrdinaryDiffEq v6.73.1
  [f0f68f2c] PlotlyJS v0.18.13
  [91a5bcdd] Plots v1.40.1
  [98d1487c] PolyesterForwardDiff v0.1.1
  [92933f4c] ProgressMeter v1.9.0
  [ce6b1742] RDatasets v0.7.7
  [1ed8b502] SciMLSensitivity v7.56.0
  [789caeaf] StochasticDiffEq v6.65.1
  [e88e6eb3] Zygote v0.6.69
  [02a925ec] cuDNN v1.3.0
  [ea8e919c] SHA v0.7.0
  [10745b16] Statistics v1.10.0


Note that I am able to successfully call PolyesterForwardDiff.threaded_gradient!(f, dx, x, ForwardDiff.Chunk(8)) directly, for MWE-style values of f, dx, and x.
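For reference, here is a minimal sketch of that direct call (the Rosenbrock closure, the chunk size of 2, and the expected gradient values are my illustration choices, not taken from the failing example above):

using ForwardDiff, PolyesterForwardDiff

# Rosenbrock with p = [1.0, 100.0] baked in, so f takes a single argument.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x  = zeros(2)    # evaluation point
dx = similar(x)  # preallocated gradient buffer, filled in place
PolyesterForwardDiff.threaded_gradient!(f, dx, x, ForwardDiff.Chunk(2))
dx               # gradient at the origin: [-2.0, 0.0]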

While trying to recreate a smaller, cleaner environment, I got this, which I suspect could be related, but I’m not sure how to interpret it.

PolyesterForwardDiff is not a supported AD backend for Optimization.jl yet; only the ones listed in the docs (OptimizationFunction · Optimization.jl) are supported. Create an issue in SciML/OptimizationBase.jl (the base package for Optimization.jl, containing its structs and basic functions) if you’d like it supported.


That would explain it. Thank you, sir. I’ve created this issue. I’m unfortunately not knowledgeable enough to provide more details, but I’m happy to try to help if you think it needs to be altered.

For context, I’m trying to minimize a loss function that is decently well-behaved but unfortunately has a couple dozen dimensions, and fast convergence is necessary. I’m hoping to leverage every core I have.
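In the meantime, a possible workaround I’m considering (just a sketch: rosen_grad! is a name I made up, I’m assuming the in-place grad(G, u, p) signature from the OptimizationFunction docs, and with an explicit gradient BFGS should not need an AD backend at all) is to bypass backend selection and supply the threaded gradient by hand:

using Optimization, OptimizationOptimJL, ForwardDiff, PolyesterForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Threaded gradient via PolyesterForwardDiff; the closure captures p so
# threaded_gradient! sees a single-argument function of x.
function rosen_grad!(G, u, p)
    PolyesterForwardDiff.threaded_gradient!(x -> rosenbrock(x, p), G, u,
                                            ForwardDiff.Chunk(2))
    return G
end

optf = OptimizationFunction(rosenbrock; grad = rosen_grad!)  # adtype left at its default
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol  = solve(prob, BFGS())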