Uno v2.0.0 is out!
Already available in your favorite language: Uno_jll.
The major changes are:
- a more powerful unification framework, with ingredients such as the Hessian model (exact, identity, zero), the regularization strategy (primal, primal-dual, none) and the inequality handling method (inequality constrained, interior point);
- the null-space active-set QP solver BQPD is now available as precompiled binaries via the binary package BQPD_jll. This means that the filtersqp preset (a trust-region filter SQP method) can now be used with Uno_jll; see the sketch below.
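Here is a minimal sketch of how the preset can be tried from JuMP through AmplNLWriter.jl. The exact export name of the AMPL executable in Uno_jll (`amplexe` below, following the convention of Ipopt_jll and Bonmin_jll) is an assumption; check the JLL's exports (it may be called `uno_ampl`).

```julia
using JuMP, AmplNLWriter, Uno_jll

# Assumption: Uno_jll exports its AMPL executable as `amplexe`;
# Uno options are passed as key=value strings on the command line.
model = Model(() -> AmplNLWriter.Optimizer(Uno_jll.amplexe, ["preset=filtersqp"]))

@variable(model, x >= 0)
@variable(model, y >= 0)
@constraint(model, x + y >= 1)
@objective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
println(termination_status(model), " ", objective_value(model))
```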
Wow, seriously exciting! Thanks for all your work on this @cvanaret. Is a C API on the short- or medium-term horizon, or would you say that the library needs to stabilize more before it makes sense to start developing one? I don't have a good understanding of how hard it is to produce one, e.g. whether it requires big internal rewrites or things like that.
Thanks again for everything here! I do have some problems where I can go through AMPLNLWriter, and I'm super stoked to try out the filtersqp preset.
Thank you for the kind words @cgeoga!
I'm working on the Python bindings at the moment, which gives me a pretty good idea of how much of Uno I should expose on the Python side. I don't think it will be a lot of effort to write a C API. I'm quite busy right now, but the end of the year sounds realistic.
Let me know how the filtersqp preset performs! Tip: the preset uses globalization_strategy=fletcher_filter_method, but do give globalization_strategy=waechter_filter_method (à la IPOPT) a shot as well.
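If you want to compare the two strategies side by side, a small loop along these lines should do it (same `Uno_jll.amplexe` assumption as in the snippet above; the option names are the ones quoted here):

```julia
using JuMP, AmplNLWriter, Uno_jll

for strategy in ("fletcher_filter_method", "waechter_filter_method")
    model = Model(() -> AmplNLWriter.Optimizer(
        Uno_jll.amplexe,
        ["preset=filtersqp", "globalization_strategy=$strategy"],
    ))
    @variable(model, x >= 0)
    @objective(model, Min, (x - 2)^2)
    optimize!(model)
    println(strategy, ": ", objective_value(model))
end
```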
It should be usable from the Optimization.jl interface via the AMPLNLWriter route. We should probably document how to do it. I'd also like to see a PR that has the SciMLBenchmarks.jl machine test it in the battery of global optimizer tests:
I'd be interested to see where it lands. It's not really possible to know whether it's useful until such a comparison is done. At face value I assume it will do well, since it has globalization machinery and exploits derivatives, so it "should" beat, say, differential evolution, but at the end of the day we only recommend what the benchmarks say.
Optimization.jl looks like a pretty broad toolbox. Comparing local Newton methods (SQP/barrier) against metaheuristics makes little sense to me, but on bound-constrained problems Uno should be more or less on par with IPOPT, SNOPT, L-BFGS-B and so on.
I can give it a shot when I have more time, but as I wrote in the previous message, I think this is a strange comparison. Newton methods "solve" (because we have a first-order characterization of stationary points), while metaheuristics "search". It's like comparing two fundamentally different kinds of algorithms (local methods vs global search methods).
That said, I have nothing against metaheuristics: I used Differential Evolution a lot during my PhD, as a primal strategy for solving global optimization problems. Combining local methods and metaheuristics (in so-called memetic algorithms) makes total sense to me.
That's a great question!
When I came up with the unification framework, I always had augmented Lagrangian methods (as constraint relaxation strategies, see the wheel diagram) in the back of my mind. They're not implemented in Uno yet, but the abstractions are there. I heard about NCL a few years ago at a Michael Saunders talk and I'm definitely going to test that!
@CeterisPartybus We need a C interface in Uno to be able to plug it into NCL.jl.
We plan to discuss and work on it with @cvanaret next week at ICCOPT.
Do you have a specific application in mind?
That said, a direct integration into Uno is probably more relevant. We are working on a new evolution of NCL and how to exploit it from a linear algebra perspective, with specific KKT formulations.
We are collaborating with @frapac and @sshin23 on MadNLP.jl and GPU usage.
It should be quite easy to integrate it into Uno afterward.
Thanks for the answer. Correct me if I am wrong, but my understanding is that the idea of Uno is to let the user combine strategies to build a "custom" algorithm. So I should be able to configure Uno to mimic the NCL algorithm as well, right?
I do not have an application in mind but was merely wondering if I could actually do that.
Very cool & ambitious work! Congratulations @cvanaret on the 2.0.0 release.
Would it be possible to write an NLPModels.jl wrapper? NLPModels.jl is a very thin template for NLPs (specifying callbacks, dimensions, etc.), so writing a wrapper wouldn't be too much work. It'd make it possible to use Uno on a variety of NLP test cases and from modeling environments such as ADNLPModels.jl and ExaModels.jl. Or maybe that's what you're planning to do at ICCOPT, @amontoison?
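To make the question concrete, here is a minimal sketch of the NLPModels.jl side, using ADNLPModels.jl to build the callbacks. This is not an existing Uno wrapper; the `UnoSolver.uno` call at the end is hypothetical.

```julia
using ADNLPModels, NLPModels

# Small constrained NLP built through ADNLPModels.jl:
# minimize (1 - x1)^2 + 100 (x2 - x1^2)^2  s.t.  x1^2 + x2^2 <= 1.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
c(x) = [x[1]^2 + x[2]^2]
nlp = ADNLPModel(f, [-1.2, 1.0], c, [-Inf], [1.0])

# NLPModels.jl exposes exactly the callbacks and dimensions a solver
# wrapper would need to translate into Uno's model interface:
x0 = nlp.meta.x0
obj(nlp, x0)                   # objective value
grad(nlp, x0)                  # gradient
cons(nlp, x0)                  # constraint values
nlp.meta.nvar, nlp.meta.ncon   # dimensions

# A hypothetical UnoSolver.jl wrapper could then be called as:
# stats = UnoSolver.uno(nlp; preset = "filtersqp")
```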
@sshin23 That's exactly what we have in mind.
We want to test a new feature of Uno that we are working on with Nick Gould and Sven Leyffer, and for that we want to use CUTEst.jl.
Once we have a C interface, we can do as Ipopt.jl and KNITRO.jl do and add extensions for MOI.jl / NLPModels.jl.
Dear all,
We've fixed a few bugs in the C and Julia interfaces: here's Uno 2.2.1!
Now, all vectors passed at model creation are copied internally, which means you can pass temporaries.
We're now working on improving the documentation. There's also a tiny bug left: when several instances are solved sequentially, the numbers of function evaluations accumulate. I suspect it's due to these quantities being static variables in the C++ code.
We're also trying to register Uno with JuliaRegistries, and most likely we won't get the name Uno.jl (too short). The best candidates at the moment are UnoSolver.jl and UnifyingNonlinearOptimization.jl.