Manopt.jl provides a framework for optimization on manifolds.
Based on Manopt and MVIRT, both implemented in Matlab, this toolbox provides easy access to optimization methods on manifolds in Julia.
If you want to delve right in, check out the Getting Started: Optimize! tutorial.
Manopt.jl makes it easy to use an algorithm on your favorite manifold as well as a manifold with your favorite algorithm. It already provides many manifolds and several algorithms, which can easily be extended, for example to record interim values or produce debug output throughout the iterations.
The main features are:
- types and inheritance for manifolds, points on manifolds, and tangent vectors
- meta-manifolds such as the product manifold, the power manifold, and the tangent bundle, available for any manifold
- traits for special manifold properties like Lie groups and matrix manifolds
- functions to start optimizing directly: several cost functions, differentials, gradients, and proximal maps are already available
- solvers implemented at a high level, so they are directly available for your own manifold as well
- visualization (in Asymptote) and plots, a further focus of the toolbox, to also visually compare optimization techniques on manifolds
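To illustrate the general idea (not the actual Manopt.jl API — all names below are hypothetical), here is a minimal self-contained sketch of Riemannian gradient descent on the unit sphere, minimizing the Rayleigh quotient to find an extremal eigenvector:

```julia
using LinearAlgebra

# Exponential map on the unit sphere S^{n-1}: follow the great circle from x in direction v.
function sphere_exp(x, v)
    nv = norm(v)
    nv == 0 ? x : cos(nv) * x + sin(nv) * v / nv
end

# Riemannian gradient: project the Euclidean gradient onto the tangent space at x.
tangent_project(x, g) = g - dot(x, g) * x

# Plain gradient descent with a constant step size (illustrative, not the Manopt.jl solver).
function sphere_gradient_descent(egrad, x; stepsize = 0.1, maxiter = 1000)
    for _ in 1:maxiter
        v = -tangent_project(x, egrad(x))
        norm(v) < 1e-10 && break
        x = sphere_exp(x, stepsize * v)
    end
    return x
end

# Minimize x' * A * x on the sphere: converges to an eigenvector of the smallest eigenvalue.
A = Diagonal([3.0, 2.0, 1.0])
x0 = normalize([1.0, 1.0, 1.0])
xmin = sphere_gradient_descent(x -> 2A * x, x0)   # ≈ ±[0, 0, 1]
```

The point of the high-level design is that only `sphere_exp` and `tangent_project` are manifold-specific; the descent loop itself is generic.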
Thanks for reading, and for all the help all of you have already provided.
Very cool! At first glance, it seems like there are more manifolds than in Optim.jl and a different set of algorithms than what I’m used to (you seem to be more interested in convex non-differentiable programming). Could we join forces here? Also, there was an effort to split off the manifold stuff in Optim into a separate package (see e.g. https://github.com/antoine-levitt/Manifolds.jl, although I seem to recall a more recent effort which I can’t find again); that could be a good starting point. cc @pkofod, who is planning a rewrite of Optim.
Cool, I was not aware of your Manifolds.jl approach. Yes, I am mainly interested in non-smooth optimisation; that’s where I started (in Matlab), and after a discussion with Nicolas Boumal (the Manopt guy) we decided to join the ideas, and I started a package to include both approaches: his smooth algorithms (which are not yet all covered in my first version) and the non-smooth ones.
From Optim.jl I also tried to adapt the modular idea of the solver having three parts (init, step, and get result). The reason I did not ask to join Optim is that the manifolds there only cover quite few functions (mainly retraction and inner) and I wanted to cover more (parallel transport, for example).
Currently I am trying to cover all the manifolds that Manopt covers (see https://www.manopt.org/tutorial.html#manifolds) in addition to the ones I already have, and to find a good interface for handling embeddings of manifolds.
I played around with parallel transport, but for the cases I was interested in (mainly L-BFGS on Stiefel) it didn’t do anything that projecting the gradient didn’t, so I didn’t bother. I’m still not seeing the interest in the context of optimization, but that’s probably because I’m used to embedded manifolds? I’d welcome a clarification (we can do it by mail or Gitter to avoid spamming everybody). In any case, a separate library for manifolds can (should) support parallel transport and friends, even if the optimizer ends up not using them.
There’s also https://github.com/Jutho/OptimKit.jl that has some manifold functionality. The number of optimization libs that do manifolds is getting out of hand (in julia as in other languages), and writing a full-featured, optimized, robust library is not so trivial, so I’m in favor of anything that can be done to share code.
@mateuszbaran is also working on a manifold optimization package of sorts, just to add another one to the mix
Sharing code in the sense of a common framework would definitely be helpful, for example also for things like debug, record, or even caching capabilities. Manopt, for example, has quite an elaborate way of caching computed function values and gradients.
Concerning parallel transport: for symmetric positive definite matrices or on the sphere it is really more than just projecting, and in order to compute several differentials one needs parallel transport.
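The sphere is a good small example of this difference. Along the minimizing geodesic between two unit vectors x and y, parallel transport has a closed form, and it genuinely rotates a tangent vector rather than just projecting it (a sketch in plain Julia, not any package’s API):

```julia
using LinearAlgebra

# Parallel transport on the unit sphere S^{n-1} along the minimizing geodesic
# from x to y (x, y unit vectors, x ≠ -y), for a tangent vector v at x.
function sphere_parallel_transport(x, y, v)
    return v - (dot(y, v) / (1 + dot(x, y))) * (x + y)
end

x = [1.0, 0.0, 0.0]
y = [0.0, 1.0, 0.0]
v = [0.0, 1.0, 0.0]                     # tangent at x
w = sphere_parallel_transport(x, y, v)  # [-1, 0, 0]: tangent at y, same length
```

Merely projecting v onto the tangent space at y (v − ⟨y, v⟩y) would give the zero vector here, so the two operations really differ.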
Ah, projections, yes. I would like to include something like that with more structure. I already started a trait in my toolbox, with the goal of specifying that manifold B is embedded in manifold A, and then defining both embed(B, A, x) to get the embedded representation in A of a point x on B, and the inverse project(A, B, y) to project a point y from A onto the submanifold B. But I decided to first go public with these first algorithms and manifolds.
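As a toy illustration of those two signatures (types and names are hypothetical, not the released API), consider the unit circle, stored as an angle, embedded in the Euclidean plane:

```julia
using LinearAlgebra

struct Circle end       # points on the submanifold stored as angles θ
struct Euclidean2 end   # ambient points stored as vectors in R^2

# embed(B, A, x): representation in the ambient manifold A of a point x on B.
embed(::Circle, ::Euclidean2, θ::Real) = [cos(θ), sin(θ)]

# project(A, B, y): closest point on the submanifold B to a point y in A.
project(::Euclidean2, ::Circle, y::AbstractVector) = atan(y[2], y[1])

θ = project(Euclidean2(), Circle(), [2.0, 2.0])   # π/4
p = embed(Circle(), Euclidean2(), θ)              # ≈ [√2/2, √2/2]
```

Dispatching on a pair of manifold types like this keeps each embedding pair independent, which is one way such a trait could be organised.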
So does Optim. This is the kind of wheel-reinventing that could be avoided with a common library. Re parallel transport, I’ll send you an email.
Yes, we should stop reinventing wheels, you’re right. The Julia version does not have caching yet, for example. I will look into the Optim.jl way of doing that, to maybe use the same framework.
Maybe it would be good to have an OptimFramework package or something?
I hope I don’t come across as too annoying, btw. I’m as fond of reinventing wheels as anybody: it’s just much more fun than trying to interface with somebody else’s code. But there seem to be a lot of people interested in these kinds of techniques (people doing optimization, but also people doing e.g. ODEs on manifolds), @pkofod is in the middle of a rewrite, and so it seems like a good point to get around a table and see what we can do to avoid the usual “four libraries with different feature sets, idiosyncrasies, and bugs” setup.
No, you don’t. I am completely on your side that common frameworks are a good idea. We should definitely try to find a common way of providing features, to hopefully end up with just one caching scheme, for example. That’s why I proposed a meta/interface package (well, as soon as at least a few people agree on how that should look).
Very cool! I was just about to release a somewhat similar package, although with a few notable differences:
- It’s designed to have good support for functional manifolds (in the style of https://www.springer.com/gp/book/9781493940189 ),
- I’ve put quite a lot of effort into optimising the basic operations (exponential and logarithmic maps, inner products, parallel transport, etc.): I have mutating versions, versions that work directly on AbstractArray representations, and support for mutating arrays of isbits types (this was quite tricky).
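The allocating/mutating split mentioned above typically looks like the following sketch (generic Julia, not that package’s actual code), again with the sphere exponential map as the example operation:

```julia
using LinearAlgebra

# Mutating version: write exp_x(v) on the unit sphere into a preallocated y.
function sphere_exp!(y, x, v)
    nv = norm(v)
    if nv == 0
        copyto!(y, x)
    else
        @. y = cos(nv) * x + (sin(nv) / nv) * v
    end
    return y
end

# Allocating version as a thin wrapper, so only one implementation is maintained.
sphere_exp(x, v) = sphere_exp!(similar(x), x, v)

x = [1.0, 0.0]
v = [0.0, π / 2]
y = similar(x)
sphere_exp!(y, x, v)   # y ≈ [0.0, 1.0], no allocation for the result
```

Keeping the mutating form as the primitive is the usual Julia pattern for making inner optimization loops allocation-free.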
I will publish my code as a separate package, at least as a reference for future work. I definitely support the idea of having a common framework.
Looking forward to seeing those tricky details. For the basic operations I also did a lot of work providing all the formulas for the documentation. With your package having a slightly different focus, a common framework would really be helpful to leverage the advantages of both (and other manifold/optimisation) packages.
OK, so it looks like a good first step would be to externalize the definition of manifolds to an external package, which all those libraries can then depend upon. The bare-bones version (retract/project) used by Optim is https://github.com/JuliaNLSolvers/ManifoldProjections.jl (see also https://github.com/JuliaNLSolvers/ManifoldProjections.jl/issues/1), could we agree on a simple minimal interface that such a library should support (eg discuss it on an issue there)?
Yes, maybe a package like Manifold.jl to collect such a common framework (similar to a common framework for solvers) would be a good idea.
I would not only define a manifold but also points and tangent vectors. For retract/project I would like to try the general approach of defining them between manifolds (a manifold and its submanifold) – maybe with the additional idea that vectors and matrices can be implicitly converted to points on Euclidean(n) and Euclidean(n,n), respectively… but we should discuss that in a separate thread, maybe as an issue, maybe as a discussion somewhere here on Discourse? Since it affects more than one package, it would be nice to have a common place.
edit: and, for example, one could also discuss how to distinguish manifolds and their metrics; for symmetric positive definite matrices you can use different metrics, leading to different geodesics and such.
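One possible design for that distinction (purely a sketch, all names hypothetical) is to parametrise the manifold type by a metric while sharing the point representation, e.g. SPD matrices under two well-known metrics:

```julia
using LinearAlgebra

# One point representation (SPD matrices), several metrics: each metric is
# carried as a type parameter, so methods can dispatch on it.
abstract type Metric end
struct AffineInvariant <: Metric end
struct LogEuclidean <: Metric end

struct SPD{M<:Metric}
    n::Int
end

# Affine-invariant distance: ‖log(A^{-1/2} B A^{-1/2})‖_F.
function distance(::SPD{AffineInvariant}, A, B)
    s = sqrt(Symmetric(A))
    return norm(log(Symmetric(s \ B / s)))
end

# Log-Euclidean distance: ‖log(A) - log(B)‖_F.
distance(::SPD{LogEuclidean}, A, B) = norm(log(Symmetric(A)) - log(Symmetric(B)))

A = [2.0 0.0; 0.0 1.0]
B = [1.0 0.0; 0.0 1.0]
d1 = distance(SPD{AffineInvariant}(2), A, B)   # log(2)
d2 = distance(SPD{LogEuclidean}(2), A, B)      # also log(2): A and B commute here
```

For commuting matrices the two distances coincide, as in this example; for generic SPD pairs they differ, which is exactly why treating each metric as its own manifold type can make sense.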
Here is my code: https://github.com/mateuszbaran/FunManifolds.jl . I’m still setting up CI and docs, many things are missing and there is still quite a bit of code to be published later. I think product space is, at the moment, the best demonstration of some tricks I’ve used to get good performance.
I agree that details of a common interface should be discussed somewhere else, maybe an issue in Manopt.jl?
About distinguishing manifolds and metrics: that’s something I was thinking about some time ago, and my conclusion was that so few things remain the same when changing the metric that it’s best to make two completely separate implementations.
Thanks for providing the code. Even though my ProdTVector is named a little differently (and log and exp are actually called log and exp in Manopt.jl), our implementations might be similar. For the power manifold I also allow arbitrary n-dimensional arrays, not just vectors, but that’s not that much of a difference (most functions can even be written with the same comprehension).
I, however, defined the geodesic on a general manifold just using exp (to not implement it again and again and again), so you only have to override that if you have geodesics but no nice exp. And I haven’t yet worked on an approach for working in the ambient space.
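That generic fallback is essentially a one-liner: on any manifold with exponential and logarithmic maps, the geodesic from x to y is exp_x(t · log_x(y)). A sketch with hand-rolled sphere maps (names illustrative, not the package’s API):

```julia
using LinearAlgebra

# Exponential and logarithmic maps on the unit sphere.
sphere_exp(x, v) = (nv = norm(v); nv == 0 ? x : cos(nv) * x + sin(nv) * v / nv)
function sphere_log(x, y)
    v = y - dot(x, y) * x             # project y onto the tangent space at x
    nv = norm(v)
    return nv == 0 ? v : acos(clamp(dot(x, y), -1, 1)) * v / nv
end

# Generic geodesic, defined once for any manifold supplying exp and log.
geodesic(expmap, logmap, x, y, t) = expmap(x, t * logmap(x, y))

x = [1.0, 0.0]
y = [0.0, 1.0]
mid = geodesic(sphere_exp, sphere_log, x, y, 0.5)   # ≈ [√2/2, √2/2]
```

A manifold then only overrides `geodesic` when it has a cheap closed-form geodesic but no convenient exp.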
For your optimization approach: I also tried to modularise that, having a Stepsize type and a stopping-criterion type (functors, actually) to abstract those away; see for example my steepestDescent.
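The functor idea is roughly the following (a sketch with made-up type names; the actual Manopt.jl types may differ): step sizes and stopping criteria are callable structs, so the solver loop stays agnostic of their internals.

```julia
# Step-size rule as a functor.
struct ConstantStepsize
    s::Float64
end
(cs::ConstantStepsize)(iter) = cs.s

# Two interchangeable stopping criteria, also functors.
struct StopAfterIteration
    maxiter::Int
end
(sc::StopAfterIteration)(iter, gradnorm) = iter >= sc.maxiter

struct StopWhenGradientSmall
    tol::Float64
end
(sc::StopWhenGradientSmall)(iter, gradnorm) = gradnorm < sc.tol

# A toy solver loop that only ever calls the two functors.
function run(stop, step)
    iter, gradnorm = 0, 1.0
    while !stop(iter, gradnorm)
        iter += 1
        gradnorm *= 1 - step(iter)   # stand-in for a real descent step
    end
    return iter, gradnorm
end

run(StopAfterIteration(10), ConstantStepsize(0.5))     # stops at iteration 10
run(StopWhenGradientSmall(1e-3), ConstantStepsize(0.5)) # stops once gradnorm < 1e-3
```

Swapping criteria (or combining them) then never touches the solver code itself.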
For the discussion on a common framework I started a thread here: Towards a common Manifold Framework / Package. Also, regarding the metric, you’re right that it might be better to really have two manifolds, though they most probably share the same point/tangent-vector representations – but one could still work on a common type for those and just have a different manifold type for each metric.
Let me also mention OptimKit.jl: not released yet, but functional and with basic documentation in the README. It only covers some basic gradient optimization methods, but it is completely generic in how you specify your problem, retraction, transport, …
That’s also an interesting – though quite different – approach, thanks for the link.