I am glad to announce Nonconvex.jl, a package in which I implement and wrap a few constrained nonlinear optimization algorithms and packages, using automatic differentiation as a fundamental building block. There is no way to define gradients directly in Nonconvex.jl; instead, you define a custom adjoint rule using ChainRulesCore.jl. This is especially powerful because it allows you to use Zygote.jl with your favorite nonlinear optimization algorithm. The following algorithms are available:
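As a sketch of what such an adjoint rule looks like, here is a hypothetical black-box function (`mysolve` is made up for illustration, not part of Nonconvex.jl) given a hand-written gradient through ChainRulesCore.jl's `rrule` mechanism; note that the exact tangent types (`NoTangent()` versus the older `NO_FIELDS`) depend on your ChainRulesCore.jl version:

```julia
using ChainRulesCore

# Hypothetical black-box function that AD cannot trace through,
# e.g. a call into an external solver. Here it is just x .^ 2 for illustration.
mysolve(x) = x .^ 2

# Custom adjoint (reverse) rule: return the primal value together with a
# pullback mapping the output cotangent ȳ to cotangents of the inputs.
function ChainRulesCore.rrule(::typeof(mysolve), x)
    y = mysolve(x)
    mysolve_pullback(ȳ) = (NoTangent(), 2 .* x .* ȳ)
    return y, mysolve_pullback
end
```

Once the rule is defined, Zygote.jl picks it up automatically, so a call like `Zygote.gradient(x -> sum(mysolve(x)), x0)` uses the hand-written pullback instead of differentiating through `mysolve` itself.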

- A pure Julia implementation of the original method of moving asymptotes (MMA), referred to as `MMA87`. This is a first order algorithm. The algorithm was generalized to handle infinite bounds.
- A pure Julia implementation of the globally convergent MMA, referred to as `MMA02`. This is a first order algorithm. The algorithm was generalized to handle infinite bounds.
- First and second order augmented Lagrangian algorithms as implemented in Percival.jl. Using the first order augmented Lagrangian algorithm together with Zygote.jl is the star of the show because it enables efficient ODE/PDE-constrained optimization where the constraints are defined per element or per time point. I use it in TopOpt.jl, for example, to do stress-constrained topology optimization.
- First and second order interior point algorithms as available in Ipopt.jl.
- All the algorithms available in NLopt.jl.
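To show how the pieces fit together, here is a sketch of solving a small constrained problem with the `MMA87` algorithm, loosely following the package's README; the function names `Model`, `addvar!`, `add_ineq_constraint!`, and `Nonconvex.optimize` reflect the current API and may change as the package evolves:

```julia
using Nonconvex

# Classic two-variable test problem (from the NLopt tutorial):
# minimize sqrt(x2) subject to (a*x1 + b)^3 - x2 <= 0 for two (a, b) pairs.
f(x) = sqrt(x[2])
g(x, a, b) = (a * x[1] + b)^3 - x[2]

m = Model(f)
addvar!(m, [0.0, 0.0], [10.0, 10.0])        # lower and upper variable bounds
add_ineq_constraint!(m, x -> g(x, 2, 0))
add_ineq_constraint!(m, x -> g(x, -1, 1))

# Gradients of the objective and constraints are computed automatically
# via Zygote.jl; no hand-written derivatives are needed here.
r = Nonconvex.optimize(m, MMA87(), [1.234, 2.345])
@show r.minimum r.minimizer
```

Swapping `MMA87()` for another algorithm struct is all it takes to try a different solver on the same model.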

I have an ambitious set of open issues, so if you are interested in helping out, I am happy to chat.