Non-linear gradient descent under linear constraints?

Hey,

Is there a pure Julia implementation of non-linear gradient descent under linear matrix/vector equality and inequality constraints?
My problem is given as:

\min\limits_{Ax=b,\, x\ge 0}\, f(x)

where f is non-linear (but written in Julia, differentiable and so on), A is a matrix, and b is a vector. I'm looking for a local gradient descent under constraints, but I need pure Julia code (so I can pass in custom number types).

Check FrankWolfe.jl: scalable constrained optimization | Mathieu Besançon | JuliaCon2021 - YouTube.
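
If the feasible set happens to reduce to the probability simplex (the special case A = ones(1, n), b = [1] of your Ax = b, x \ge 0), here is a minimal sketch based on the package README; the objective f, its gradient, and the data are placeholders:

```julia
using FrankWolfe

n = 10
f(x) = sum(abs2, x .- 1 / n)              # toy smooth objective (placeholder)
grad!(g, x) = (g .= 2 .* (x .- 1 / n))    # in-place gradient, as FrankWolfe expects

# LMO for the probability simplex {x : sum(x) = 1, x >= 0},
# i.e. the special case A = ones(1, n), b = [1] of Ax = b, x >= 0
lmo = FrankWolfe.ProbabilitySimplexOracle(1.0)
x0 = FrankWolfe.compute_extreme_point(lmo, zeros(n))   # a feasible vertex to start from

x_opt, = FrankWolfe.frank_wolfe(f, grad!, lmo, x0; max_iteration=1_000)
```

Since everything here is plain Julia, it should also work with custom number types as long as f and the LMO are written generically.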

2 Likes

Percival.jl and MadNLP.jl should also solve this problem.
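
For Percival (and MadNLP, which accepts the same NLPModels-style problems), a minimal sketch, assuming the ADNLPModels constructor that takes bounds and constraints positionally; the data and objective are placeholders:

```julia
using ADNLPModels, Percival   # assumption: modeling via ADNLPModels (NLPModels API)

A = [1.0 1.0 1.0; 1.0 -1.0 0.0]    # placeholder data
b = [1.0, 0.0]
f(x) = sum(abs2, x .- 0.5)         # placeholder nonlinear objective
n = size(A, 2)

nlp = ADNLPModel(
    f,
    fill(1 / n, n),           # starting point
    zeros(n), fill(Inf, n),   # bounds encoding x >= 0
    x -> A * x, b, b,         # lcon = ucon = b encodes the equality Ax = b
)

stats = percival(nlp)         # MadNLP.madnlp(nlp) should accept the same model
stats.solution
```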

2 Likes

FrankWolfe.jl looks promising; I'll start with that one. I knew about Percival (already tried it; it did not work well since it does not 'use' the fact that the constraints are linear…), and I did not know about MadNLP.jl; I'll try that one next.

Thanks a lot !

1 Like

Some algorithms in ProximalAlgorithms.jl or COSMO.jl may also be able to handle this problem, but their APIs are not straightforward, and I can't confirm whether the implementations support a nonlinear f, although some algorithms should.

I can confirm that COSMO does not handle non-linear objectives, only quadratic ones (but it's great, and can be interfaced via MathOptInterface, so Convex.jl and JuMP work with it). I do not know about ProximalAlgorithms.

1 Like

If factorising A is an option, consider using a nullspace formulation. Then Percival will probably do better. MMA from Nonconvex.jl can also be used in this case; note that MMA only supports inequality constraints, not equality constraints.

A is 'wide', so there are fewer constraints than variables. Factorising it is an option, yes; I can construct a generator for elements of \{x :\, Ax = b\}. Is that the nullspace formulation you are talking about? Wouldn't that complicate the conic constraint x \ge 0?

Anyway you gave me a few options :slight_smile:

If you find a nullspace basis N of A such that A N = 0 and you have a single feasible solution x_0 = A \backslash b, then every feasible solution can be written as x_0 + N y, where y is a vector of "free" nullspace coordinates. The constraint x \geq 0 then becomes the linear inequality constraint x_0 + N y \geq 0, and f(x) is replaced by f(x_0 + N y).
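
In Julia this is a few lines with LinearAlgebra's nullspace; a minimal sketch with placeholder data (the objective f is hypothetical):

```julia
using LinearAlgebra

A = [1.0 1.0 1.0; 1.0 -1.0 0.0]   # placeholder wide matrix (full row rank)
b = [1.0, 0.0]

N = nullspace(A)   # columns span {y : A * y = 0}
x0 = A \ b         # one particular solution of A * x = b

f(x) = sum(abs2, x .- 0.5)    # placeholder objective
g(y) = f(x0 + N * y)          # reduced objective over the free coordinates y
# The remaining constraint is the linear inequality x0 + N * y >= 0.
```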

2 Likes

Yes, that I can do. But it does not help me that much.

Actually, the constraints can also be given in another form, for which I managed to derive the LMO (linear minimization oracle) required by FrankWolfe.jl. I'm now going to try to make FrankWolfe work :slight_smile:
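
For reference, a custom LMO for FrankWolfe.jl can be sketched in pure Julia by subtyping FrankWolfe.LinearMinimizationOracle and implementing compute_extreme_point; here is a hypothetical example for the unit simplex (not the actual constraint set of this thread):

```julia
using FrankWolfe

# Hypothetical custom LMO for the unit simplex: min <d, v> over
# {v : sum(v) = 1, v >= 0} is attained at the vertex e_i with i = argmin(d).
struct MySimplexLMO <: FrankWolfe.LinearMinimizationOracle end

function FrankWolfe.compute_extreme_point(::MySimplexLMO, direction; kwargs...)
    T = eltype(direction)
    v = zeros(T, length(direction))   # stays generic over custom number types
    v[argmin(direction)] = one(T)
    return v
end
```

An instance of such a type can then be passed as the lmo argument to frank_wolfe.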

1 Like