Update - Release v1.0
I released Nonconvex 1.0 and started the JuliaNonconvex organisation. In the 1.0 release, Nonconvex.jl is just a shell package that loads individual solver/wrapper packages via the Nonconvex.@load macro. The individual solvers and wrappers live in their own packages in the JuliaNonconvex org, while the documentation still lives in the Nonconvex.jl repo. There are plenty of features available in 1.0 that I haven't announced on this forum yet; a quick sketch of the new workflow is below, followed by the list of features.
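Here is a minimal sketch of the new workflow; the NLopt wrapper and the MMA algorithm are just illustrative choices, and the problem is a toy one:

```julia
using Nonconvex
Nonconvex.@load NLopt  # loads the NLopt wrapper package

# Objective and a parameterised inequality constraint (g(x) <= 0 convention)
f(x) = sqrt(x[2])
g(x, a, b) = (a * x[1] + b)^3 - x[2]

model = Model(f)
addvar!(model, [0.0, 0.0], [10.0, 10.0])        # variable bounds
add_ineq_constraint!(model, x -> g(x, 2, 0))
add_ineq_constraint!(model, x -> g(x, -1, 1))

result = optimize(model, NLoptAlg(:LD_MMA), [1.234, 2.345], options = NLoptOptions())
result.minimum, result.minimizer
```

Now for the new features: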
- Arbitrary decision variable types are now supported, including tuples, named tuples, arrays of arrays, dictionaries, structs, etc. These can also be nested in each other or put in a heterogeneous vector. (Sketches of this and of the other features below follow after this list.)
- A multi-start version of all the algorithms is now available, making use of Hyperopt.jl to restart the optimisation from different initial solutions using various sampling strategies.
- A JuMP model can now be converted to a Nonconvex model. You can start from a JuMP model, define all your variables and linear constraints there, and then convert it to a Nonconvex model to define the nonlinear functions, differentiate them with Zygote, and/or use the optimisation algorithms available in Nonconvex.jl.
- An experimental constrained surrogate-assisted optimisation meta-algorithm is now available. It can replace expensive functions with Gaussian process surrogates and solve the surrogate model using any of the other optimisation algorithms in Nonconvex.jl (hence the meta part). Inequality and equality constraints are supported. It's experimental because it still needs more fine-tuning and experimentation.
- The augmented Lagrangian algorithm in Nonconvex (wrapped from Percival.jl) now handles arbitrary-precision number types (e.g. BigFloat) correctly.
- Thanks to @noil-reed's GSoC project, Nonconvex.jl now has nonconvex semidefinite constraint support via an interior-point meta-algorithm that can turn any nonlinear programming solver into a nonlinear semidefinite programming solver by iteratively solving the barrier problem. This means you can constrain any (nonconvex) matrix-valued function of the decision variables to be positive semidefinite. This is recent work and the documentation is still underway, but you can find an example here.
- @noil-reed also implemented some zero-order local search algorithms for bound-constrained problems that don't require gradients. See the multiple trajectory search (MTS) algorithm in the documentation for more details.
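Here are the promised sketches, starting with arbitrary decision variable types. I'm assuming here that addvar! accepts the bounds in the same nested structure as the decision variables, so treat the details as illustrative:

```julia
using Nonconvex
Nonconvex.@load NLopt

# The decision variables are a named tuple with a scalar field and an array field.
f(x) = (x.a - 1)^2 + sum(abs2, x.b)

model = Model(f)
addvar!(model, (a = -2.0, b = [-2.0, -2.0]), (a = 2.0, b = [2.0, 2.0]))  # lower and upper bounds
add_ineq_constraint!(model, x -> x.a + sum(x.b) - 3)

result = optimize(model, NLoptAlg(:LN_COBYLA), (a = 0.5, b = [0.5, 0.5]),
                  options = NLoptOptions())
```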
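For the multi-start feature, the sketch below wraps a gradient-based sub-solver in the Hyperopt.jl-based meta-algorithm. The load name, the HyperoptAlg/HyperoptOptions wrappers, and the sub_options, sampler and iters keywords are my best reading of the wrapper API, so double-check them against the documentation:

```julia
using Nonconvex
Nonconvex.@load Hyperopt  # assumed load name for the multi-start wrapper package
Nonconvex.@load NLopt

# A multi-modal (Rastrigin-like) objective where multi-start helps
f(x) = sum(abs2, x) + 10 * sum(1 .- cos.(2π .* x))

model = Model(f)
addvar!(model, [-5.0, -5.0], [5.0, 5.0])

alg = HyperoptAlg(NLoptAlg(:LD_LBFGS))  # multi-start wrapper around a local solver
options = HyperoptOptions(sub_options = NLoptOptions(),
                          sampler = RandomSampler(),  # sampler from Hyperopt.jl, re-exported I believe
                          iters = 20)
result = optimize(model, alg, [4.7, -3.3], options = options)
```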
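For the JuMP conversion, here is a sketch assuming the converted model is a dictionary-based Nonconvex model whose functions receive the variables keyed by their JuMP names, and that the conversion and objective-setting functions are called DictModel and set_objective! (treat these as assumptions):

```julia
using JuMP, Nonconvex

# Variables and linear constraints are defined on the JuMP side
jump_model = JuMP.Model()
@variable(jump_model, 0 <= x1 <= 10)
@variable(jump_model, 0 <= x2 <= 10)
@constraint(jump_model, x1 + x2 <= 10)

# Convert to a Nonconvex model and add the nonlinear parts there
model = DictModel(jump_model)
set_objective!(model, x -> (x[:x1] - 1)^2 + x[:x2]^2)   # variables accessed by name
add_ineq_constraint!(model, x -> x[:x1] * x[:x2] - 4)

# From here, optimize(model, alg, x0, options = ...) works as in the other sketches.
```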
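For the surrogate-assisted meta-algorithm, a very rough sketch. I'm assuming wrapper names along the lines of BayesOptAlg/BayesOptOptions and a maxiter keyword, so consult the documentation for the real names:

```julia
using Nonconvex
Nonconvex.@load BayesOpt  # assumed load name for the surrogate/Bayesian wrapper package
Nonconvex.@load NLopt

expensive_f(x) = sqrt(x[2])  # stand-in for an expensive black-box function
model = Model(expensive_f)
addvar!(model, [0.0, 0.0], [10.0, 10.0])
add_ineq_constraint!(model, x -> (2x[1])^3 - x[2])

# The meta-algorithm replaces the expensive functions with Gaussian process surrogates
# and solves the surrogate model with the wrapped sub-solver.
alg = BayesOptAlg(NLoptAlg(:LD_MMA))
options = BayesOptOptions(sub_options = NLoptOptions(), maxiter = 20)
result = optimize(model, alg, [0.5, 2.3], options = options)
```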
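For arbitrary precision with the augmented Lagrangian wrapper, the sketch below simply feeds BigFloat bounds and a BigFloat initial point. I believe the Percival wrapper exports AugLag/AugLagOptions, but verify the names against the documentation:

```julia
using Nonconvex
Nonconvex.@load Percival  # assumed load name for the augmented Lagrangian wrapper

model = Model(x -> (x[1] - 1)^2 + x[2]^2)
addvar!(model, big.([-2.0, -2.0]), big.([2.0, 2.0]))  # BigFloat bounds
add_eq_constraint!(model, x -> sum(x) - big(1.0))

result = optimize(model, AugLag(), big.([0.5, 0.5]), options = AugLagOptions())
```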
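For the nonlinear semidefinite constraint support, a rough sketch. The load name, add_sd_constraint!, and the SDPBarrierAlg/SDPBarrierOptions names and keywords are my assumptions, so check the linked example for the real API:

```julia
using Nonconvex
Nonconvex.@load Semidefinite  # assumed load name for the SDP meta-algorithm package
Nonconvex.@load Ipopt

model = Model(x -> sum(abs2, x))
addvar!(model, fill(-5.0, 3), fill(5.0, 3))

# Constrain a (nonconvex) matrix-valued function of x to be positive semidefinite
sd_func(x) = [x[1] 1.0; 1.0 x[2] + x[3]^2]
add_sd_constraint!(model, sd_func)

alg = SDPBarrierAlg(sub_alg = IpoptAlg())  # barrier meta-algorithm around an NLP solver
options = SDPBarrierOptions(sub_options = IpoptOptions())
result = optimize(model, alg, [1.0, 1.0, 1.0], options = options)
```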
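And for the zero-order local search, a tiny sketch assuming the multiple trajectory search wrapper is exposed as MTSAlg/MTSOptions:

```julia
using Nonconvex
Nonconvex.@load MTS  # assumed load name for the zero-order search wrapper

# Bound-constrained problem; no gradients are needed
model = Model(x -> sum(abs2, x .- 0.5))
addvar!(model, zeros(4), ones(4))

result = optimize(model, MTSAlg(), rand(4), options = MTSOptions())
```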
Happy optimisation! As always, feel free to test the package, open issues, start discussions, and of course reach out if you would like to contribute!