We are glad to announce the release of FrankWolfe.jl, a package implementing several variants of the Frank-Wolfe / conditional gradient algorithm.
The method optimizes arbitrary objective functions over convex sets using only first-order information. We implement familiar convex sets such as the L1-, L2-, and L∞-norm balls, as well as more exotic ones like the Birkhoff polytope. Furthermore, arbitrary constraint sets can be defined through MathOptInterface.jl, using JuMP or Convex.jl.
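For readers new to the method, here is a minimal sketch of the core Frank-Wolfe iteration in plain Python (illustrative only; the function and variable names here are not the FrankWolfe.jl API). The key point is that each step only calls a linear minimization oracle (LMO) over the feasible set, so no projection is ever needed:

```python
def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle for the l1-norm ball:
    argmin over ||v||_1 <= radius of <grad, v> is a signed,
    scaled coordinate vector at the largest-magnitude gradient entry."""
    i = max(range(len(grad)), key=lambda j: abs(grad[j]))
    v = [0.0] * len(grad)
    v[i] = -radius if grad[i] > 0 else radius
    return v

def frank_wolfe(grad_f, lmo, x0, max_iter=1000):
    """Vanilla Frank-Wolfe with the classic agnostic step size 2/(t+2)."""
    x = list(x0)
    for t in range(max_iter):
        g = grad_f(x)
        v = lmo(g)                 # best vertex in the descent direction
        gamma = 2.0 / (t + 2.0)    # step size, no line search needed
        x = [(1 - gamma) * xi + gamma * vi for xi, vi in zip(x, v)]
    return x

# Toy example: minimize ||x - b||^2 over the unit l1 ball.
# Since ||b||_1 > 1, the constrained optimum lies on the boundary, at (0.6, 0.4).
b = [0.8, 0.6]
grad_f = lambda x: [2 * (xi - bi) for xi, bi in zip(x, b)]
x_star = frank_wolfe(grad_f, lmo_l1_ball, [0.0, 0.0])
```

Because every iterate is a convex combination of vertices returned by the LMO, the iterates stay feasible by construction; this is what makes the method attractive for sets like the Birkhoff polytope, where projection is expensive but linear minimization is cheap.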
Go check out our preprint for a more detailed tour!
Our research group will continue to work on it, integrating our own and others' recent developments and improvements on conditional gradients, and applying the package to exciting problems in optimization and ML. Feel free to contribute, open issues, or give us feedback!
Edit: it's really cool to see all the work being started and done in optimization; I saw Nonconvex.jl being announced this week too!