Use ReverseDiff.jl in JuMP

Is it possible to use ReverseDiff.jl in JuMP.jl instead of ForwardDiff.jl? I am not seeing the desired speedup in JuMP when I use ForwardDiff.jl.

Can you provide a reproducible example of what you are doing and what you expect?

JuMP uses ForwardDiff only if you register a user-defined operator with @operator.

By default, JuMP uses a sparse reverse-mode automatic differentiation algorithm.
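To make the two paths concrete, here is a minimal sketch (assuming Ipopt is installed; any NLP solver works). An algebraic objective is differentiated by JuMP's built-in sparse reverse-mode AD, while a function wrapped with @operator is differentiated with ForwardDiff:

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:2] <= 2)

# Path 1: write the expression algebraically -- JuMP applies its
# sparse reverse-mode AD automatically.
@objective(model, Min, sum(x[i]^2 for i in 1:2))

# Path 2: wrap the same computation in a user-defined operator --
# derivatives of f now come from ForwardDiff instead.
f(y...) = sum(yi^2 for yi in y)
@operator(model, op_f, 2, f)
# @objective(model, Min, op_f(x...))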

Here is my call to the optimizer:

using JuMP, NLopt

function optimization(a)
    model = Model(NLopt.Optimizer)
    set_optimizer_attribute(model, "algorithm", :LD_MMA)

    # Use a distinct name for the decision variables so they do not
    # shadow the starting point `a`.
    @variable(model, 0 <= x[i = 1:num_variables] <= 2, start = a[i])

    # `autodiff = true` makes JuMP differentiate `optim` with ForwardDiff.
    register(model, :optim, num_variables, optim; autodiff = true)

    @NLobjective(model, Min, optim(x...))

    # Solve the optimization problem
    JuMP.optimize!(model)

    # Check solution status and print results
    println("got ", objective_value(model))
end

Here is a simplified version of my optimization function:

a = ones(12)
num_variables = length(a)

# A registered function receives `num_variables` scalar arguments,
# so it needs a splatted signature.
function optim(a::Real...)
    solution = sum(ai^2 for ai in a)
    return solution
end

I know the simplified version looks trivial, but the real function is too complicated to write out algebraically, so it has to be registered.


JuMP might be the wrong tool for the job here. Why not use NLopt directly? Or try Ipopt.
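If you go the direct route, here is a sketch using NLopt.jl's own interface on the simplified x.^2 objective. The gradient is hand-coded here; in the real problem you could fill it in with ReverseDiff or any other AD package:

using NLopt

opt = Opt(:LD_MMA, num_variables)
opt.lower_bounds = zeros(num_variables)
opt.upper_bounds = fill(2.0, num_variables)

# NLopt expects f(x, grad) that fills `grad` in place when requested.
opt.min_objective = (x, grad) -> begin
    if length(grad) > 0
        grad .= 2 .* x   # hand-coded gradient of sum(x.^2)
    end
    return sum(abs2, x)
end

minf, minx, ret = optimize(opt, ones(num_variables))
println("got ", minf, " (", ret, ")")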

I see, okay. So I need an extremely simple objective function to take advantage of JuMP.


No, it’s more that if all you have is an objective function that accepts a vector x, then there are other tools better suited.

JuMP is meant for constrained mathematical optimization problems.

Would this not count as a constraint?
2 >= a[i=1:num_variables] >= 0
In the real problem, I have bounds on the variables.

The normal use case for a modeling language is one or more of the following: the problem is not easily or best described by functions; you want the modeling language to compute derivatives for you; or you want the interface to a solver that the modeling language provides. The point is not that the problem is simple, but that its description is simple: summations over various indices, several named variables, and so on (a small sketch of that style follows below).
Your situation is the opposite: you want to minimize f(x) subject to lvar <= x <= uvar, you want a separate package to compute the derivatives, and you want a solver that is directly available via other interfaces. That specific combination makes JuMP ill-suited.
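For illustration, here is a hypothetical model in the style JuMP is designed for: named variables indexed over sets, with the objective and constraints written as summations that JuMP can differentiate natively. The data (n, m, c, w) is made up purely for the example:

using JuMP, Ipopt

# Hypothetical problem data, just for illustration.
n, m = 4, 3
c = rand(n)
w = rand(m, n)

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:n] <= 2)
@variable(model, y[1:m] >= 0)
@constraint(model, [j = 1:m], sum(w[j, i] * x[i] for i in 1:n) <= y[j])
@objective(model, Min, sum(c[i] * x[i]^2 for i in 1:n) + sum(y))
optimize!(model)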

The question is then whether you should use JuMP or something else, and it depends on why you wanted to use JuMP in the first place.
If you just want to solve a problem with a simple description, and you want it to be fast, you can try other packages.
For instance:

using ADNLPModels, JSOSolvers

lvar = zeros(num_variables)       # lower bounds on the variables
uvar = fill(2.0, num_variables)   # upper bounds on the variables

# ADNLPModels expects a function of a single vector argument,
# e.g. optim(x) = sum(xi^2 for xi in x).
nlp = ADNLPModel(
    optim, # your function
    a,     # a starting point
    lvar,  # vector of lower bounds
    uvar,  # vector of upper bounds
)
output = tron(nlp)

This will use ADNLPModels to define the problem and compute derivatives using various backends (see Default backends · ADNLPModels.jl).
Then it hands the problem to TRON, which uses first and second derivatives to solve a bound-constrained problem.
If you want to use an LBFGS model on top of it, you can follow this: https://youtu.be/JswiadZohK4, replacing Percival with tron.
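A sketch of that LBFGS variant, assuming the LBFGSModel wrapper from NLPModelsModifiers.jl and the nlp, optim, a, lvar, uvar defined above:

using ADNLPModels, JSOSolvers, NLPModelsModifiers

nlp = ADNLPModel(optim, a, lvar, uvar)
lbfgs_nlp = LBFGSModel(nlp)  # replaces exact Hessians with an LBFGS approximation
output = tron(lbfgs_nlp)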
