The Uno (Unifying Nonconvex Optimization) solver

Uno v2.0.0 is out!
Already available in your favorite language: Uno_jll.

The major changes are:

  • a more powerful unification framework with ingredients such as Hessian model (exact, identity, zero), regularization strategy (primal, primal-dual, none) and inequality handling method (inequality constrained, interior-point).

  • the null space active-set QP solver BQPD is now available as precompiled binaries and a binary package BQPD_jll :partying_face: This means we can now use the filtersqp preset (trust-region filter SQP method) within Uno_jll:
julia> using JuMP, AmplNLWriter, Uno_jll
julia> options = String["preset=filtersqp", "QP_solver=BQPD"];
julia> model = Model(() -> AmplNLWriter.Optimizer(Uno_jll.amplexe, options));
julia> @variable(model, x <= 0.5, start = -2);
julia> @variable(model, y, start = 1);
julia> @objective(model, Min, 100 * (y - x^2)^2 + (1 - x)^2);
julia> @constraint(model, x*y >= 1);
julia> @constraint(model, x + y^2 >= 0);
julia> optimize!(model)
Original model /tmp/jl_gTmcsU/model.nl
2 variables, 2 constraints (0 equality, 2 inequality)
Reformulated model /tmp/jl_gTmcsU/model.nl
2 variables, 2 constraints (0 equality, 2 inequality)

Used overwritten options:
- QP_solver = BQPD
- TR_min_radius = 1e-8
- TR_radius = 10
- constraint_relaxation_strategy = feasibility_restoration
- filter_type = standard
- globalization_mechanism = TR
- globalization_strategy = fletcher_filter_method
- hessian_model = exact
- inequality_handling_method = inequality_constrained
- l1_constraint_violation_coefficient = 1.
- loose_tolerance = 1e-6
- progress_norm = L1
- protect_actual_reduction_against_roundoff = no
- regularization_strategy = none
- residual_norm = L2
- switch_to_optimality_requires_linearized_feasibility = yes
- tolerance = 1e-6

─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 iter   TR iter  TR radius    phase  step norm   objective   primal feas  stationarity  complementarity  status          
─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 0      -        1.0000e+01   OPT    -           9.0900e+02  4.0000e+00   2.4797e+03    0.0000e+00       initial point   
 1      1        1.0000e+01   OPT    2.0000e+00  2.6000e+01  1.0000e+00   5.5713e+03    2.8887e+03       βœ” (h-type)
 2      1        1.0000e+01   OPT    -           -           -            -             -                infeasible subproblem
 2      1        1.0000e+01   FEAS   7.5000e-01  2.5250e+01  1.1250e+00   -             -                ✘ (restoration) 
 -      2        3.7500e-01   FEAS   3.7500e-01  4.1504e-01  9.5312e-01   3.9528e-01    0.0000e+00       βœ” (restoration)
 3      1        7.5000e-01   FEAS   7.5000e-01  3.9312e+01  5.6250e-01   5.0000e-01    0.0000e+00       βœ” (restoration)
 4      1        1.5000e+00   FEAS   1.1250e+00  3.0650e+02  0.0000e+00   1.6654e+04    1.4508e+04       βœ” (restoration)
 5      1        1.5000e+00   OPT    0.0000e+00  3.0650e+02  0.0000e+00   0.0000e+00    0.0000e+00       0 primal step   
─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 iter   TR iter  TR radius    phase  step norm   objective   primal feas  stationarity  complementarity  status          
─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────

Uno 2.0.0 (TR Fletcher-filter restoration inequality-constrained method with exact Hessian and no regularization)
Mon Jul  7 15:13:07 2025
────────────────────────────────────────
Optimization status:                    Success
Iterate status:                         Feasible KKT point
Objective value:                        306.5
Primal feasibility:                     0
β”Œ Stationarity residual:                0
β”” Complementarity residual:             0
β”Œ Feasibility stationarity residual:    0
β”” Feasibility complementarity residual: 0
β”Œ Infeasibility measure:                0
β”‚ Objective measure:                    306.5
β”” Auxiliary measure:                    0
CPU time:                               0.015675s
Iterations:                             5
Objective evaluations:                  3
Constraints evaluations:                7
Objective gradient evaluations:         6
Jacobian evaluations:                   6
Hessian evaluations:                    6
Number of subproblems solved:           7
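
For illustration, these ingredients can also be overridden individually on top of a preset by passing extra options, in the same way as above. This is a hedged sketch: the option names (hessian_model, regularization_strategy, inequality_handling_method) are taken from the option dump above, but the alternative value strings (e.g. "zero" for the Hessian model) are assumptions based on the list in the release notes and should be checked against Uno's option documentation.

# Sketch: override individual unification-framework ingredients on top of the filtersqp preset.
# Option names come from the "Used overwritten options" dump above; the value "zero" for
# hessian_model is assumed from the release notes (exact, identity, zero) and may need checking.
using JuMP, AmplNLWriter, Uno_jll
options = String[
    "preset=filtersqp",
    "QP_solver=BQPD",
    "hessian_model=zero",                                  # assumed value string
    "regularization_strategy=none",                        # as in the preset above
    "inequality_handling_method=inequality_constrained",
]
model = Model(() -> AmplNLWriter.Optimizer(Uno_jll.amplexe, options))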

cc @cgeoga

A comprehensive description of the changes from v1.3.0 can be found here.

I’m actively working on the following features:

  • L-BFGS Hessian approximation;
  • SLP-EQP method;
  • a barrier method that better exploits the structure of the original problem;
  • an exponential barrier method, an alternative to IPOPT’s log barrier method;
  • Python bindings.

If you’re attending the ICCOPT 2025 conference in two weeks, don’t miss our session on Recent advances in open-source continuous solvers with talks about the HiGHS, acados and Uno solvers.

Wow, seriously exciting! Thanks for all your work on this @cvanaret. Is a C API on the short- or medium-term horizon, or would you say that the library needs to stabilize more before it makes sense to start developing that? I don’t have a good understanding of how hard it is to produce one, like whether or not it requires big internal rewrites or things like that.

Thanks again for everything here! I do have some problems that I can run through AmplNLWriter, and I’m super stoked to try out the filtersqp preset.

Thank you for the kind words @cgeoga!
I’m working on the Python bindings at the moment, which gives me a pretty good idea of how much of Uno I should expose on the Python side. I don’t think writing a C API will take a lot of effort. I’m quite busy right now, but the end of the year sounds realistic :slight_smile:

Let me know how the filtersqp preset performs! Tip: the preset uses globalization_strategy=fletcher_filter_method, but do give globalization_strategy=waechter_filter_method (a la IPOPT) a shot as well.
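
For example, reusing the setup from the announcement, only the globalization_strategy option needs to change; the rest follows the filtersqp preset:

# Same model setup as in the announcement, but with the Waechter filter method
# (Γ  la IPOPT) instead of the Fletcher filter method.
using JuMP, AmplNLWriter, Uno_jll
options = String["preset=filtersqp", "QP_solver=BQPD", "globalization_strategy=waechter_filter_method"]
model = Model(() -> AmplNLWriter.Optimizer(Uno_jll.amplexe, options))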

Naive question: if we ignore the bounds on C(x), is Uno competitive with the optimization routines from Optimization.jl?

It should be usable through the Optimization.jl interface via AmplNLWriter. We should probably document how to do it. I’d like to see a PR to have the SciMLBenchmarks.jl machine test it in the battery of global optimizer tests:

I’d be interested to see where it lands. It’s not really possible to know how useful it is until such a comparison is done. At face value I assume it will do well, since it has globalization machinery and exploits differentiability, so it β€œshould” beat, say, differential evolution, but at the end of the day we only recommend what the benchmarks say :sweat_smile:
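
For reference, here is a rough, untested sketch of what calling Uno from Optimization.jl through OptimizationMOI (which bridges Optimization.jl problems to MathOptInterface solvers) could look like. The AD backend choice is an assumption: AmplNLWriter needs an expression graph to write the .nl file, and the exact symbolic backend name (AutoModelingToolkit here) should be checked against the OptimizationMOI documentation.

# Untested sketch: Uno via Optimization.jl -> OptimizationMOI -> AmplNLWriter -> Uno_jll.
# The symbolic AD backend (AutoModelingToolkit) is an assumption; AmplNLWriter needs an
# expression graph, so a plain derivative-only backend may not be sufficient.
using Optimization, OptimizationMOI, AmplNLWriter, Uno_jll
rosenbrock(x, p) = 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit())
prob = OptimizationProblem(optf, [-2.0, 1.0])
sol = solve(prob, AmplNLWriter.Optimizer(Uno_jll.amplexe))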

Any way to get NOMAD.jl into this benchmark?

Just PR the benchmark file and bump the manifest.

Optimization.jl looks like a pretty broad toolbox. Comparing local Newton methods (SQP/barrier) against metaheuristics makes little sense to me, but for bound-constrained problems, Uno should be more or less on par with IPOPT, SNOPT, L-BFGS-B and so on.

I can give it a shot when I have more time, but as I wrote in the previous message, I think this is a weird comparison. Newton methods β€œsolve” (because we have a first-order characterization of stationary points) while metaheuristics β€œsearch”. It’s like comparing :red_apple: and :tangerine: (local methods vs global search methods).
That said, I have nothing against metaheuristics; I used Differential Evolution a lot during my PhD as a primal strategy for solving global optimization problems. Combining local methods and metaheuristics (in so-called memetic algorithms) makes total sense to me.

Wow, Uno 2.0 looks impressive. Thanks!

Can one mimic the NCL solver (GitHub - JuliaSmoothOptimizers/NCL.jl: A nonlinearly-constrained augmented-Lagrangian method) with Uno as well?

That’s a great question :smiley:
When I came up with the unification framework, I always had augmented Lagrangian methods (as constraint relaxation strategies, see wheel diagram) in the back of my mind. They’re not implemented in Uno yet, but the abstractions are there. I heard about NCL a few years ago at a Michael Saunders talk and I’m definitely going to test that!