If I have made an error in the characterization of any of the listed packages, corrections are greatly appreciated.
So far, it seems that ADNLPModels, GalacticOptim, JuMP, Nonconvex and Optim are the packages that currently support all of my requirements. However, if you know of some other Julia package that might be able to solve such problems, I would very much like to hear about it.
Note: original post has been revised based on info from this discussion.
Thanks! Now I see the difference in terminology. In MOI/JuMP, the distinction between global and local refers to a solver's ability to certify (numerical issues aside, of course) that the solution is a global optimum rather than only a local one. For GalacticOptim, it seems to be more about how the feasible space is explored.
@mohamed82008, thanks for the tip about Optim. From that example, though, I did not see how to combine it with an AD approach for the Jacobian and Hessian. Do you know of an example that does this?
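For concreteness, here is a minimal sketch of the kind of combination I mean, if Optim's `autodiff` keyword works the way I think it does (the Rosenbrock objective is just a placeholder, not my actual problem):

```julia
using Optim

# Placeholder objective: the classic Rosenbrock function, minimum at [1, 1].
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# `autodiff = :forward` asks Optim to construct the gradient and Hessian
# with ForwardDiff rather than finite differences, so Newton's method
# gets exact derivatives without hand-written callbacks.
result = optimize(rosenbrock, zeros(2), Newton(); autodiff = :forward)

minimizer = Optim.minimizer(result)
```

What I am really after is the analogous setup for large sparse Jacobians and Hessians, which this keyword alone presumably does not address.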
@ChrisRackauckas, I will give GalacticOptim a try, going to Ipopt through the MOI backend, unless you suggest a different one. What AD system do you recommend for sparse large-scale problems? AutoModelingToolkit sounds like the best choice from the docs you posted, correct?
Not necessarily. Each has its own advantages. MTK will scalarize the equations but will generate really fast code. It won’t scale the best in compile time, but for scalar-heavy code that is big and sparse it’s really good, if it compiles in time. Otherwise, ReverseDiff with tape compilation is good with similar properties, but it can segfault if the tape gets too long. If the code is heavy in linear algebra, Zygote is a good bet. Tracker is kind of an in-between Zygote-ish thing that can work in some cases where Zygote doesn’t. Forward mode doesn’t scale as well.
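In GalacticOptim the backend choice is a single argument to `OptimizationFunction`, so trying several of the options above is cheap. A minimal sketch, assuming the `Auto*` wrappers (the toy objective is my own placeholder):

```julia
using GalacticOptim, Optim

# Toy objective; `p` is the (unused here) parameter vector GalacticOptim expects.
f(x, p) = sum(abs2, x .- 1)

# Swap the AD backend by changing this one argument:
# AutoForwardDiff(), AutoReverseDiff(), AutoZygote(),
# AutoTracker(), or AutoModelingToolkit().
optf = OptimizationFunction(f, GalacticOptim.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(3))
sol = solve(prob, BFGS())
```

Since only the `OptimizationFunction` constructor changes, benchmarking the candidates on a representative slice of the real problem is probably the most reliable way to pick.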
Yes, we should probably add a cons diff overload to MTK. It’s only like 10 lines.