[ANN] OptimKit.jl – A blissfully ignorant Julia package for gradient optimization

This looks like a reasonable package, nice! It’s always good to see the optimization community growing, and of course also more work on manifolds.

How do you plan to provide manifolds for your approach?

Currently, I am working with a student on an L-BFGS implementation in Manopt.jl, which focuses on non-smooth optimization methods on manifolds but also includes, for example, Nelder-Mead and gradient descent. It is based on the ManifoldsBase.jl interface, so it can use the manifolds from Manifolds.jl.

I just took a short look, and one thing I am not much in favour of is the explicit `verbosity -> print something` style. That’s why Manopt.jl differs from both Optim and OptimKit in a few ways:

  • a solver implements only a single iteration instead of the whole while loop; this way the step itself is decoupled from its surroundings (debug output, stopping criterion, etc.)
  • debug output is done by decorating the options and dispatching on the decorated type (see the sketch below)
  • the optimise function is called `solve` in Manopt.jl; it handles the while loop (and dispatches to the inner iteration)
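Roughly, the decoration idea looks like this; the type and function names below are made up for illustration and are not the actual Manopt.jl API:

```julia
# Sketch only, with illustrative names, not the actual Manopt.jl API.
abstract type AbstractOptions end

# Decorator: wraps any options type and adds debug printing on top.
struct DebugOptions{O<:AbstractOptions} <: AbstractOptions
    options::O
    print_every::Int
end

# Concrete solvers specialize this single-iteration function.
step!(problem, options::AbstractOptions, iter) = options

# The decorator intercepts the step via dispatch, forwards it, then prints.
function step!(problem, d::DebugOptions, iter)
    step!(problem, d.options, iter)
    iter % d.print_every == 0 && println("iteration $iter done")
    return d
end

# The outer function owns the while loop and only calls the single step.
function solve(problem, options::AbstractOptions; maxiter = 100)
    iter = 0
    while iter < maxiter
        iter += 1
        step!(problem, options, iter)
    end
    return options
end
```

This keeps the stopping criterion, the debug output, and the actual iteration in separate places, and adding a new kind of decoration is just another wrapper type plus one `step!` method.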

The parameters in Manopt.jl are just an arbitrary struct inheriting from the `Options` (the changing data), while the function and the manifold are stored in a `Problem` (the static data), which follows the idea from manopt and pymanopt.
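Continuing the sketch above, that split between static and changing data might look roughly like this (again with made-up names, not the real Manopt.jl types):

```julia
# Static data: the manifold and the functions describing the task.
struct GradientProblem{M,F,G}
    manifold::M     # e.g. a manifold from Manifolds.jl
    cost::F         # x -> f(x)
    gradient::G     # x -> (Riemannian) gradient of f at x
end

# Changing data: everything the solver updates while iterating.
mutable struct GradientDescentOptions{P,S} <: AbstractOptions
    x::P            # current iterate on the manifold
    stepsize::S
end

# One iteration of gradient descent; on a general manifold a retraction
# would replace the plain subtraction used here for the Euclidean case.
function step!(p::GradientProblem, o::GradientDescentOptions, iter)
    o.x = o.x - o.stepsize * p.gradient(o.x)
    return o
end
```

A call would then be something like `solve(GradientProblem(M, f, gradf), GradientDescentOptions(x0, 0.1))`, and debug output comes for free by wrapping the options in `DebugOptions` before passing them to `solve`.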

It would be great to discuss such modelling choices together.
