Hi @niro, welcome to the forum!
> I was wondering if there is a way to tell JuMP that I don't want to compute the Hessian information
To check whether your settings make a difference, you can compare the number of function calls.
With:
```julia
using JuMP, Ipopt

function main(method)
    model = Model(Ipopt.Optimizer)
    @variable(model, x)
    @variable(model, y)
    @objective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
    # "exact" uses the true Hessian from JuMP's AD;
    # "limited-memory" tells Ipopt to use an L-BFGS approximation instead.
    set_attribute(model, "hessian_approximation", method)
    optimize!(model)
    return
end
```
I get:

```
julia> main("exact")
This is Ipopt version 3.14.14, running with linear solver MUMPS 5.6.2.
Number of nonzeros in equality constraint Jacobian...: 0
Number of nonzeros in inequality constraint Jacobian.: 0
Number of nonzeros in Lagrangian Hessian.............: 3
... lines omitted ...
Number of objective function evaluations = 36
Number of objective gradient evaluations = 15
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 14
Total seconds in IPOPT = 0.008

julia> main("limited-memory")
This is Ipopt version 3.14.14, running with linear solver MUMPS 5.6.2.
Number of nonzeros in equality constraint Jacobian...: 0
Number of nonzeros in inequality constraint Jacobian.: 0
Number of nonzeros in Lagrangian Hessian.............: 0
... lines omitted ...
Number of objective function evaluations = 47
Number of objective gradient evaluations = 25
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 0
Total seconds in IPOPT = 0.043
```
The key difference is `Number of nonzeros in Lagrangian Hessian` and `Number of Lagrangian Hessian evaluations`, which are both 0 if you set the `limited-memory` option. This shows that JuMP is neither computing the Hessian nor passing it to Ipopt.
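As an aside, you can also set the same option when constructing the model. A minimal sketch using the documented `optimizer_with_attributes` helper:

```julia
using JuMP, Ipopt

# Equivalent to calling set_attribute after Model(Ipopt.Optimizer):
model = Model(
    optimizer_with_attributes(
        Ipopt.Optimizer,
        "hessian_approximation" => "limited-memory",
    ),
)
```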
> Could it also speed up the line starting with `@expression`?
Nope. The `hessian_approximation` option is purely an Ipopt setting: it changes what happens during the solve, not how JuMP builds the model.
> does function tracing mean that AD is fully performed at this stage already?
Nope. Tracing only builds out the expression; derivatives are computed later, when the solver queries them during the solve. It works well, but this step may take some time because the generated function may be quite lengthy.
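To make "tracing" concrete, here is a minimal sketch (the `rosenbrock` helper is my own illustration, not from this thread):

```julia
using JuMP

# An ordinary Julia function; JuMP knows nothing special about it.
rosenbrock(x, y) = (1 - x)^2 + 100 * (y - x^2)^2

model = Model()
@variable(model, x)
@variable(model, y)
# Calling rosenbrock with JuMP variables "traces" it: the arithmetic is
# recorded into a nonlinear expression. No derivatives are evaluated here;
# they are computed later, when the solver asks for them in optimize!.
@objective(model, Min, rosenbrock(x, y))
```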
For improving the speed of `@expression`, we can likely make some improvements. Can you provide a reproducible example of your code?
I really should write a tutorial on optimal control, with some common tricks for performance.
You could take a read of: Help solving: Adding a few equations to ODE slows JuMP >100x
Or look at:
- GitHub - control-toolbox/OptimalControl.jl: Solvers of optimal control problems
- GitHub - JuliaControl/ModelPredictiveControl.jl: An open source model predictive control package for Julia.
- GitHub - infiniteopt/InfiniteOpt.jl: An intuitive modeling interface for infinite-dimensional optimization problems.