FrankWolfe.jl: convex-constrained non-linear optimization at scale

We are glad to have released FrankWolfe.jl, a package implementing several variants of the Frank-Wolfe / Conditional Gradient algorithm.

The method allows optimizing arbitrary objective functions with first-order information over convex sets. We implement key convex sets that people are familiar with, like the L{1, 2, inf}-norm balls, as well as funkier ones like the Birkhoff polytope. Furthermore, arbitrary constraint sets can be defined through MathOptInterface.jl, using JuMP or Convex.jl.
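To give a flavor of the API, here is a minimal sketch of minimizing a quadratic over an L1-norm ball. It assumes the `FrankWolfe.frank_wolfe`, `LpNormLMO`, and `compute_extreme_point` names from the package; exact signatures and keyword arguments may differ across versions.

```julia
using FrankWolfe
using LinearAlgebra

# Illustrative target point outside the feasible region.
p = [1.0, 2.0, 3.0]

# Objective f(x) = 1/2 * ||x - p||^2 and its in-place gradient.
f(x) = 0.5 * norm(x - p)^2
function grad!(storage, x)
    storage .= x .- p
end

# Linear minimization oracle (LMO) over the L1-norm ball of radius 1.
lmo = FrankWolfe.LpNormLMO{1}(1.0)

# Obtain a starting vertex from the oracle itself.
x0 = FrankWolfe.compute_extreme_point(lmo, ones(3))

# Run vanilla Frank-Wolfe; the iterate stays feasible at every step
# because it is always a convex combination of vertices.
x, v, primal, dual_gap = FrankWolfe.frank_wolfe(
    f, grad!, lmo, x0;
    max_iteration=1_000,
)
```

Since every iterate is a convex combination of oracle outputs, no projection step is ever needed, which is the main appeal over projected gradient methods.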

Go check out our preprint for a more detailed tour!

Our research group will continue to work on it, integrating our own and other people's recent developments and improvements on conditional gradients, and applying it to exciting problems in optimization and ML. Feel free to contribute, open issues, or give us feedback!

Edit: it's really cool to see all the work being started and done in optimization; I saw Nonconvex.jl being announced this week too!


This looks really cool! My main question is whether this method would work well when the convex set is a polytope defined by arbitrary linear inequalities. I guess this line from the docs means something similar?

  • you can use an LP oracle defined via an LP solver (e.g., glop, scip, soplex) with MathOptInterface

It would be nice to have a bit more detail/docs on this – or should it be self-evident how to define an LP oracle with MOI for use with FrankWolfe?

Yes, we can define an arbitrary set (polytope or general convex set) from MOI. The best way to see it right now is through the example here, which is also reproduced in the paper.
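To sketch what that looks like: build the polytope as an MOI model backed by an LP solver, then wrap it as a linear minimization oracle. This is a sketch assuming the `FrankWolfe.MathOptLMO` wrapper and the GLPK solver; constructor names may differ across versions.

```julia
using FrankWolfe
using GLPK
import MathOptInterface
const MOI = MathOptInterface

# Build the simplex-like polytope {x : x >= 0, sum(x) <= 1} in raw MOI.
o = GLPK.Optimizer()
MOI.set(o, MOI.Silent(), true)
x = MOI.add_variables(o, 3)
for xi in x
    MOI.add_constraint(o, xi, MOI.GreaterThan(0.0))
end
MOI.add_constraint(
    o,
    MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(1.0, x), 0.0),
    MOI.LessThan(1.0),
)

# Wrap the solver-backed model as an LMO for Frank-Wolfe.
lmo = FrankWolfe.MathOptLMO(o)

# Each oracle call solves one LP: it returns a vertex v minimizing
# <direction, v> over the polytope.
v = FrankWolfe.compute_extreme_point(lmo, [1.0, -1.0, 0.0])
```

Any set JuMP or Convex.jl can express ends up as such an MOI model, so the same wrapper covers polytopes from arbitrary linear inequalities.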

We plan to add proper documentation pages too yes :slight_smile: