Which Optimization package for large-scale structured nonlinear problems

Hello everyone,

As the title says, I would like some insight into which Julia optimization package would be most appropriate for efficiently solving nonlinear problems with exploitable sparsity structure (e.g. a block-diagonal Hessian).

I have come across various packages:

  • NLPModels.jl emphasises support for user-defined Jacobian/Hessian sparsity structure (see the sketch just after this list);
  • JuMP.jl, Optim.jl, and NLopt.jl all seem to support nonlinear problems, but I’m not sure what happens under the hood with respect to exploiting sparsity.
    Perhaps someone with experience on this could help?
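For concreteness, here is a rough sketch of what I mean by user-defined structure, based on my reading of the NLPModels.jl API (the type name `DiagonalObjective` and the toy objective are made up, so treat this as illustrative rather than tested):

```julia
using NLPModels

# Unconstrained toy problem with objective sum(x.^4): its Hessian is diagonal,
# so we declare only n nonzeros and return only the diagonal entries.
struct DiagonalObjective <: AbstractNLPModel{Float64, Vector{Float64}}
    meta::NLPModelMeta{Float64, Vector{Float64}}
    counters::Counters
end

DiagonalObjective(n::Int) =
    DiagonalObjective(NLPModelMeta(n; x0 = ones(n), nnzh = n), Counters())

NLPModels.obj(nlp::DiagonalObjective, x::AbstractVector) = sum(xi^4 for xi in x)

function NLPModels.grad!(nlp::DiagonalObjective, x::AbstractVector, g::AbstractVector)
    g .= 4 .* x .^ 3
    return g
end

# Sparsity pattern of the (lower-triangular) Hessian: just the diagonal.
function NLPModels.hess_structure!(nlp::DiagonalObjective, rows::AbstractVector, cols::AbstractVector)
    rows .= 1:nlp.meta.nvar
    cols .= 1:nlp.meta.nvar
    return rows, cols
end

# Values of those nonzeros at the point x.
function NLPModels.hess_coord!(nlp::DiagonalObjective, x::AbstractVector, vals::AbstractVector; obj_weight = 1.0)
    vals .= obj_weight .* 12 .* x .^ 2
    return vals
end
```

As I understand it, solvers built on this interface can then query hess_structure!/hess_coord! and only ever work with those n entries.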

Many thanks in advance :)

Perhaps StructuredOptimization.jl could be interesting for you as well!

It relies on solvers that do not build the Hessian explicitly (for example, L-BFGS), but it can handle problems with sparsity-inducing cost functions such as the LASSO.
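For example, a LASSO fit looks roughly like this (loosely adapted from the package README; the data and the choice of λ are made up, so treat it as a sketch rather than a tested snippet):

```julia
using StructuredOptimization, LinearAlgebra

# Made-up LASSO instance: min ½‖A x − b‖² + λ‖x‖₁
m, n = 50, 200
A = randn(m, n)
b = randn(m)
λ = 1e-2 * norm(A' * b, Inf)

x = Variable(n)                            # optimization variable
@minimize ls(A * x - b) + λ * norm(x, 1)   # least squares + ℓ1 penalty
x_sol = ~x                                 # extract the solution array
```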


A good starting point is to use JuMP with Ipopt and see if the performance is acceptable.
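For instance, something along these lines (a made-up separable toy problem just to illustrate the workflow; depending on your JuMP version, the plain @objective macro may also accept the nonlinear expression):

```julia
using JuMP, Ipopt

n = 1_000
model = Model(Ipopt.Optimizer)
@variable(model, x[1:n] >= 0.5)
# Separable objective, so the Hessian of the Lagrangian is diagonal;
# JuMP detects this sparsity and passes it to Ipopt.
@NLobjective(model, Min, sum(x[i] * log(x[i]) for i in 1:n))
@constraint(model, sum(x) == n)
optimize!(model)
```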

How large is “large-scale”?


Thank you for the tip. Large enough to benefit significantly from exploiting sparsity in the Jacobian and Hessian. Would JuMP allow for that?

JuMP computes the Jacobian and Hessian using sparse reverse-mode AD. Try it out and see if the performance is acceptable.

Do you know if this is also done in the case of user-defined functions?

For user-defined functions, we disable all Hessians and compute the gradient with forward-mode AD. Depending on the model, this can cause significant issues.

If at all possible, you should write out your constraints algebraically.
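To make the distinction concrete, here is a hypothetical sketch using the register/@NLobjective syntax (the function `rosen` is made up; newer JuMP versions expose similar functionality through other macros):

```julia
using JuMP, Ipopt

# A function we might be tempted to hand to JuMP as a black box.
rosen(x, y) = (1 - x)^2 + 100 * (y - x^2)^2

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)

# Black-box route: JuMP only gets gradients (via ForwardDiff),
# and no Hessian, for this term.
register(model, :rosen, 2, rosen; autodiff = true)
@NLobjective(model, Min, rosen(x, y))

# Algebraic route (preferred): JuMP sees the whole expression graph and can
# supply exact, sparse second derivatives to the solver.
# @NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
```

In the registered version, Ipopt runs without exact second-order information for that term; the commented-out algebraic version gives JuMP the full expression graph.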
