Manopt.jl makes it easy to use an algorithm for your favorite manifold as well as a manifold for your favorite algorithm. It already provides many manifolds and several algorithms, which can easily be enhanced, for example to record interim values or produce debug output throughout the iterations.

The main features are:

types and inheritance for manifolds, points on manifolds, and tangent vectors

Meta manifolds like a product manifold, a power manifold and the tangent bundle available for any manifold

Traits for special manifold properties like Lie groups and matrix manifolds

functions to directly start optimizing: several cost functions, differentials, gradients and proximal maps are already available

solvers are implemented on a high level, so that they are also directly available for your own manifold.

visualization (in Asymptote) and plots are a further focus of the toolbox to also visually compare optimization techniques on manifolds
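As a rough illustration of the first point, here is a hedged, self-contained sketch (the type and function names here are assumptions for illustration, not necessarily the actual Manopt.jl API) of what a type hierarchy for manifolds, points, and tangent vectors could look like:

```julia
# Sketch only: abstract supertypes let solvers dispatch generically,
# while each concrete manifold supplies its own point/vector types.
abstract type Manifold end
abstract type MPoint end      # a point on a manifold
abstract type TVector end     # a tangent vector

struct Sphere <: Manifold
    dimension::Int
end
struct SnPoint <: MPoint
    value::Vector{Float64}
end
struct SnTVector <: TVector
    value::Vector{Float64}
end

# a generic-looking function working on the concrete sphere types:
# geodesic distance on the unit sphere via the angle between points
distance(M::Sphere, x::SnPoint, y::SnPoint) =
    acos(clamp(sum(x.value .* y.value), -1.0, 1.0))

d = distance(Sphere(2), SnPoint([1.0, 0.0, 0.0]), SnPoint([0.0, 1.0, 0.0]))
# d ≈ π/2, a quarter of a great circle
```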

Thanks for reading and for all the help all of you already provided,
Ronny.

Very cool! At first glance, it seems like there are more manifolds than in Optim.jl and a different set of algorithms than what I'm used to (you seem to be more interested in convex non-differentiable programming). Could we join forces here? Also, there was an effort to split off the manifold stuff in Optim into a separate package (see e.g. https://github.com/antoine-levitt/Manifolds.jl, although I seem to recall a more recent effort which I can't find again); that could be a good starting point. cc @pkofod, who is planning a rewrite of Optim.

Cool, I was not aware of your Manifolds.jl approach. Yes, I am mainly interested in nonsmooth optimisation; that's where I started (in Matlab), and after a discussion with Nicolas Boumal (the Manopt guy) we decided to join the ideas, and I started a package to include both approaches: his smooth algorithms (which are not yet all covered in my first version) and the nonsmooth ones.
From Optim.jl I also tried to adapt the modular idea of the solver having three parts (init, step, and get result). The reason I did not ask to join Optim is that the manifolds there only cover quite few functions (mainly retraction and inner) and I wanted to cover more (parallel transport, for example).
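To make the three-part solver idea concrete, here is a minimal sketch (names are illustrative, not the actual Optim.jl or Manopt.jl types): initialization, a step, and result extraction are separate functions, so the driving loop stays generic:

```julia
# Sketch: a solver split into init / step / get-result parts.
abstract type SolverState end

mutable struct GradientDescentState <: SolverState
    x::Vector{Float64}
    stepsize::Float64
end

# part 1: set up the solver state
initialize(x0; stepsize = 0.1) = GradientDescentState(copy(x0), stepsize)

# part 2: one iteration, mutating the state in place
function step!(s::GradientDescentState, grad)
    s.x .-= s.stepsize .* grad(s.x)
    return s
end

# part 3: extract the result
get_result(s::GradientDescentState) = s.x

# the generic driver only uses the three parts
function solve(x0, grad; iterations = 100)
    s = initialize(x0)
    for _ in 1:iterations
        step!(s, grad)
    end
    return get_result(s)
end

# minimize f(x) = ‖x‖²/2, whose gradient is x itself
x = solve([1.0, 2.0], x -> x)   # shrinks toward the origin
```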

Currently I am trying to cover all the manifolds that Manopt covers (see https://www.manopt.org/tutorial.html#manifolds) in addition to the ones I already did, and to find a good interface for an approach to handle embeddings of manifolds.

I played around with parallel transport, but for the cases I was interested in (L-BFGS on Stiefel, mainly) it didn't do anything that projecting the gradient didn't, so I didn't bother. I'm still not seeing the interest in the context of optimization, but that's probably because I'm used to embedded manifolds? I'd welcome a clarification (we can do it by mail or gitter to avoid spamming everybody). In any case, a separate library for manifolds can (should) support parallel transport & friends, even if the optimizer ends up not using them.

There's also https://github.com/Jutho/OptimKit.jl that has some manifold functionality. The number of optimization libs that do manifolds is getting out of hand (in Julia as in other languages), and writing a full-featured, optimized, robust library is not so trivial, so I'm in favor of anything that can be done to share code.

Sharing code in the sense of a common framework would definitely be helpful, for example also for things like debug, record, or even caching capabilities. Manopt, for example, has a quite elaborate way of caching computed function values and gradients.

Concerning parallel transport: well, for symmetric positive definite matrices or on the sphere it's really more than just projecting, and in order to compute several differentials, one needs parallel transport.
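A concrete instance of this point, using the well-known closed form on the unit sphere: parallel transport along the minimizing geodesic from x to y (valid for x ≠ −y) differs from merely projecting the vector onto the tangent space at y, and, unlike projection, it preserves the norm:

```julia
using LinearAlgebra

# parallel transport of a tangent vector ξ at x to the point y on S^2
transport(x, y, ξ) = ξ - (dot(y, ξ) / (1 + dot(x, y))) * (x + y)

# naive alternative: orthogonal projection onto the tangent space at y
project_tangent(y, ξ) = ξ - dot(y, ξ) * y

x = [1.0, 0.0, 0.0]
y = normalize([1.0, 1.0, 0.0])
ξ = [0.0, 1.0, 1.0]           # tangent at x, since ⟨x, ξ⟩ = 0

ξt = transport(x, y, ξ)       # tangent at y, same norm as ξ
ξp = project_tangent(y, ξ)    # tangent at y, but generally shorter
```

Here ξt and ξp disagree, and only ξt has the same length as ξ, which is exactly why transport matters when comparing or combining vectors from different tangent spaces (as quasi-Newton methods do).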

Ah, projections, yes. I would like to include something like that with more structure. I already started a trait in my toolbox, with the goal to specify that manifold A is embedded in manifold B, and then define both an embed(B, A, x) to get the embedded representation (on B) of a point x from A, and the inverse project(A, B, y) to project a point y from B onto the submanifold A. But I decided to go public first with these initial algorithms and manifolds.
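A hypothetical sketch of that embed/project interface (the signatures and types are assumptions, not an existing API), using the sphere embedded in Euclidean space as the simplest case:

```julia
# Illustrative manifold types; only the sphere/Euclidean pair is shown.
struct Euclidean
    n::Int
end
struct Sphere
    n::Int            # unit sphere S^n sitting in R^{n+1}
end

# embedded representation in B of a point x on A; for the sphere the
# point is already stored as a unit vector in the ambient space
embed(B::Euclidean, A::Sphere, x) = copy(x)

# project a point y from B back onto the submanifold A
project(A::Sphere, B::Euclidean, y) = y ./ sqrt(sum(abs2, y))

E = Euclidean(3)
S = Sphere(2)
y = project(S, E, [3.0, 0.0, 4.0])   # [0.6, 0.0, 0.8], back on S^2
```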

Yes, we should stop reinventing wheels, you're right. The Julia version does not have caching yet, for example. I will look into the Optim.jl way of doing that to maybe also use the same framework already.

Maybe it would be good to have an OptimFramework or something?

I hope I don't come across as too annoying, btw. I'm as fond of reinventing wheels as anybody: it's just much more fun than trying to interface with somebody else's code. But here there seem to be a lot of people interested in this kind of technique (people doing optimization, but also people doing e.g. ODEs on manifolds), @pkofod is in the middle of a rewrite, and so it seems like a good point to get around a table and see what we can do to avoid the usual "four libraries with different feature sets, idiosyncrasies and bugs" setup.

No, you don't. I am completely on your side: common frameworks are a good idea. We should definitely try to find a common way of providing features, to hopefully end up with just one caching scheme, for example. That's why I proposed to do a meta/interface package (well, as soon as at least a few people agree on how that should look).

I've put quite a lot of effort into having basic operations (exponential and logarithmic maps, inner products, parallel transport, etc.) optimised: I have mutating versions, versions that work directly on AbstractArray representations, and support for mutating arrays of isbits types (this was quite tricky).
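The mutating/non-mutating pairing mentioned here can be sketched as follows (names illustrative, not the package's actual API): the `!` version writes into a preallocated output array so hot loops allocate nothing, and the non-mutating version is a thin wrapper:

```julia
using LinearAlgebra

# exponential map on the unit sphere, result written into `out`
function sphere_exp!(out::AbstractVector, x::AbstractVector, ξ::AbstractVector)
    t = norm(ξ)
    if t ≈ 0
        copyto!(out, x)           # zero tangent vector: stay at x
    else
        @. out = cos(t) * x + sin(t) * (ξ / t)
    end
    return out
end

# non-mutating convenience wrapper allocating a fresh result
sphere_exp(x, ξ) = sphere_exp!(similar(x), x, ξ)

x = [1.0, 0.0, 0.0]
ξ = [0.0, π / 2, 0.0]             # quarter turn along the equator
out = similar(x)
sphere_exp!(out, x, ξ)            # ≈ [0.0, 1.0, 0.0], reuses `out`
```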

I will publish my code as a separate package, at least as a reference for future work. I definitely support the idea of having a common framework.

Looking forward to seeing those tricky details. For the basic operations I also did a lot of work providing all the formulas for the documentation. With your package having a slightly different focus, a common framework would really be helpful to be able to use the advantages of both (and other manifold/optimisation) packages.

Yes, maybe a package like Manifolds.jl to collect such a common framework (similar to a common framework for solvers) would be a good idea.

I would not only define a manifold but also points and tangent vectors. For retract/project I would like to try the general approach of defining those between manifolds (a manifold and its submanifold), maybe with the additional idea that vectors and matrices can be implicitly converted to points on Euclidean(n) and Euclidean(n,n), respectively… but we should discuss that in a separate thread, maybe as an issue, maybe as a discussion somewhere here on Discourse? Since it affects more than one package, it would maybe be nice to have a common place here.

edit: and, for example, one could discuss how to distinguish manifolds and their metrics, too; for symmetric positive definite matrices, you can use different metrics, leading to different geodesics and such.

Here is my code: https://github.com/mateuszbaran/FunManifolds.jl . I'm still setting up CI and docs, many things are missing, and there is still quite a bit of code to be published later. I think the product space is, at the moment, the best demonstration of some tricks I've used to get good performance.

I agree that details of a common interface should be discussed somewhere else, maybe an issue in Manopt.jl?

About distinguishing manifolds and metrics: that's something I was thinking about some time ago, and my conclusion was that so few things remain the same when changing the metric that it's best to make two completely separate implementations.

Thanks for providing the code. Even though my Product, ProdMPoint, and ProdTVector are named a little differently (and log and exp are actually called log and exp in Manopt.jl), our implementations might be similar. For the power space, I also allow arbitrary n-dimensional arrays, not just vectors, but that's also not that much of a difference (most functions can even be written with the same comprehension).

I did, however, define the geodesic on a general manifold just using exp (so as not to implement it again and again), so you only have to override that if you have geodesics but no nice exp. And I haven't yet worked on an approach to work in the ambient space.
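That generic fallback can be sketched in a few lines (names illustrative): the geodesic is defined once for any manifold in terms of the exponential map, and only manifolds with geodesics but no convenient exp need to override it:

```julia
abstract type Manifold end

# generic fallback: γ(t) = exp_x(t·ξ), defined once for all manifolds
geodesic(M::Manifold, x, ξ, t) = exp_map(M, x, t .* ξ)

# a concrete manifold only supplies exp_map: the circle S^1,
# with points stored as angles in [0, 2π)
struct Circle <: Manifold end
exp_map(::Circle, x, ξ) = mod2pi(x + ξ)

# geodesic from angle 0 with initial velocity π/2, for free
γ = t -> geodesic(Circle(), 0.0, π / 2, t)
γ(1.0)   # π/2
```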

For your optimization approach, I also tried to modularise that, having a Stepsize type and a stopping criterion type (functors, actually) to abstract away from those; see for example my steepestDescent.
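The functor pattern mentioned here can be sketched like this (type names are illustrative, not the actual Manopt.jl types): both the step size and the stopping criterion are callable structs, so swapping strategies never touches the solver loop:

```julia
# step-size rule as a callable struct
struct ConstantStepsize
    α::Float64
end
(s::ConstantStepsize)(iter) = s.α

# stopping criterion as a callable struct
mutable struct StopAfterIteration
    maxiter::Int
end
(c::StopAfterIteration)(iter, x) = iter ≥ c.maxiter

# the solver loop only ever calls the two functors
function descend(x0, grad; stepsize = ConstantStepsize(0.1),
                 stop = StopAfterIteration(100))
    x, iter = copy(x0), 0
    while !stop(iter, x)
        x .-= stepsize(iter) .* grad(x)
        iter += 1
    end
    return x
end

# minimize f(x) = ‖x‖²/2 (gradient is x); other rules plug in the
# same way, e.g. descend(x0, grad; stop = StopAfterIteration(10))
x = descend([1.0, 2.0], x -> x)
```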

For the discussion on a common framework I started a thread here: Towards a common Manifold Framework / Package. Also, regarding the metric, you're right, it might be better to really have two manifolds, though they most probably share the same point/tangent vector representations; still, one could work on a common type for those and just have a different manifold type for each metric.

Let me also mention OptimKit.jl, not released but functional and with basic documentation in the README. It only covers some basic gradient optimization methods, but it is completely generic in how you specify your problem, retraction, transport, …