Is there a pure Julia implementation of non-linear gradient descent under linear matrix/vector equality and inequality constraints?
My problem is given as:

\min\limits_{Ax=b,\, x\ge 0}\, f(x)

where f is non-linear (but written in Julia, differentiable and so on), A is a matrix and b a vector. I'm looking for a local gradient descent under constraints, but I need pure Julia code (so I can pass in custom number types).
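For concreteness, here is a toy instance of that setup in plain Julia — the particular f, A, and b below are made-up placeholders, with f written generically so custom number types flow through:

```julia
using LinearAlgebra

# Toy instance: minimize a non-linear f over {x : A*x == b, x .>= 0}.
# f, A and b are placeholders; f is written generically so that custom
# number types (BigFloat, dual numbers, ...) pass through unchanged.
f(x) = sum(xi -> xi * log1p(xi^2), x)   # some smooth non-linear objective

A = [1.0 1.0 1.0 0.0;
     0.0 1.0 0.0 1.0]                   # "wide": 2 constraints, 4 variables
b = [1.0, 0.5]

x_feasible = [0.25, 0.5, 0.25, 0.0]     # satisfies A*x == b and x .>= 0
```

Because f only uses generic arithmetic, calling it with e.g. `big.(x_feasible)` returns a `BigFloat`, which is the "pure Julia" property the question is after.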

FrankWolfe.jl looks promising, I'll start with that one. I knew about Percival (already tried; it does not work well since it does not "use" the fact that the constraints are linear…), and I did not know about MadNLP.jl, I'll try that one next.

Some algorithms in ProximalAlgorithms.jl or COSMO.jl may also be able to handle this problem, but their API is not straightforward and I can't confirm whether the implementations support a non-linear f, although some algorithms should.

I can confirm that COSMO does not handle non-linear objectives, only quadratic ones (but it's great, and can be interfaced via MathOptInterface, so Convex.jl and JuMP work with it). I do not know about ProximalAlgorithms.

If factorising A is an option, consider using a nullspace formulation. Then Percival will probably do better. MMA from Nonconvex can also be used in this case, but MMA only supports inequality constraints, not equality.

A is "wide", so there are fewer constraints than variables. Factorising it is an option, yes; I can construct a generator for elements of \{x \,:\, Ax = b\}. Is that the nullspace formulation you are talking about? Wouldn't that complicate the conic x \ge 0 constraint?

If you find a nullspace basis N of A such that A N = 0, and you have a single feasible solution x_0 = A \backslash b, then every feasible solution can be written as x_0 + N y, where y is a vector of "free" nullspace coordinates. The constraint x \geq 0 then becomes the linear inequality constraint x_0 + N y \geq 0, and f(x) is replaced by f(x_0 + N y).
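A minimal sketch of that reformulation using only the standard-library LinearAlgebra (`nullspace` and `\`); the small A and b here are placeholders:

```julia
using LinearAlgebra

# Nullspace reformulation of {x : A*x == b, x .>= 0}.
A = [1.0 1.0 1.0 0.0;
     0.0 1.0 0.0 1.0]
b = [1.0, 0.5]

N  = nullspace(A)   # columns of N span {y : A*y == 0}
x0 = A \ b          # one particular solution of A*x == b

# Any x = x0 + N*y satisfies A*x == b automatically, for every y:
y = randn(size(N, 2))
x = x0 + N * y

# The original problem becomes: minimize g(y) = f(x0 + N*y)
# subject only to the linear inequalities x0 + N*y .>= 0.
g(f, y) = f(x0 + N * y)
```

The equality constraints disappear entirely; what remains is an inequality-constrained problem in the lower-dimensional variable y, which is exactly the form MMA-style solvers accept.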

Yes, that I can do. But it does not help me that much.

Actually, the constraints can also be given in another form, for which I managed to derive the LMO from FrankWolfe.jl. I'm now going to try to make FrankWolfe work.