[ANN] Manopt.jl

After the change, Manopt’s trust region solver still assumes that the subsolver has a few specific fields (p, η, trust_region_radius, X, ηPη, and δHδ), so I’d suggest copying the existing TruncatedConjugateGradientState and either following @kellertuer 's suggestions or just implementing solve!(::AbstractManoptProblem, ::MyNewState) that does all the solving, if you don’t want to interact with Manopt’s iteration protocol.
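To illustrate the second route, here is a minimal sketch; MyNewState is a hypothetical state type carrying the fields named above, and the actual solving logic is left as a placeholder.

using Manopt

# Hypothetical subsolver state with the fields the trust region solver expects
mutable struct MyNewState{P,T,R} <: Manopt.AbstractManoptSolverState
    p::P                      # the current (outer) iterate
    η::T                      # tangent vector: the subproblem solution
    X::T                      # the gradient at p
    trust_region_radius::R
    ηPη::R                    # the (preconditioned) inner product ⟨η, Pη⟩
    δHδ::R                    # the curvature term ⟨δ, Hδ⟩
end

# Bypass the iteration protocol: perform the whole subsolve in one call
function Manopt.solve!(mp::Manopt.AbstractManoptProblem, s::MyNewState)
    # ... run your own solver here and store the result in s.η ...
    return s
end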

:loudspeaker: Manopt.jl v0.4.21 :snow_capped_mountain: Count and Cache everything (from the objective)!
See the tutorial at :books: Count and use a Cache · Manopt.jl

This version also brings the new weak dependencies feature of Julia 1.9 to Manopt.jl, so that Manopt.jl now loads 3x faster.
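As a small, hedged sketch of the counting and caching in action (keyword names as I read them from the linked tutorial, so please check there for the authoritative version) — both are activated per solver call, and the decorated objective is returned alongside the result:

using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[1]^2                                     # a toy cost
grad_f(M, p) = project(M, p, [2 * p[1], 0.0, 0.0])   # its Riemannian gradient
p0 = [0.0, 0.0, 1.0]

# count cost/gradient evaluations and cache up to 25 of each
o, q = gradient_descent(M, f, grad_f, p0;
    count=[:Cost, :Gradient],
    cache=(:LRU, [:Cost, :Gradient], 25),
    return_objective=true,
)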

2 Likes

:speaker: Manopt.jl v0.4.34 :mountain_snow: Euclidean Gradient conversion.

introduces the keyword objective_type=:Euclidean to all solvers,
which allows you to provide a Euclidean cost, gradient, and Hessian in the embedding of a manifold.
The conversion to the Riemannian gradient and Hessian is then performed automatically within Manopt.jl.
See https://manoptjl.org/stable/tutorials/EmbeddingObjectives/ for a tutorial / example.
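A small sketch of the idea, minimizing a Rayleigh quotient on the sphere while providing only the Euclidean cost and gradient from the embedding (the problem setup around the keyword is illustrative):

using Manopt, Manifolds, LinearAlgebra

A = Symmetric(randn(3, 3))
f(M, p) = p' * A * p           # Euclidean cost, evaluated in the embedding ℝ³
grad_f(M, p) = 2 * A * p       # Euclidean gradient, not projected

M = Sphere(2)
p0 = [1.0, 0.0, 0.0]

# objective_type=:Euclidean triggers the automatic conversion to the
# Riemannian gradient (here: projection onto the tangent space)
q = gradient_descent(M, f, grad_f, p0; objective_type=:Euclidean)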

1 Like

To follow up on the Apache et al. paper – that might be challenging on manifolds, or at least not straightforward, since it would probably require the metric tensor and its eigenvalues.

Concerning the ARC solver that I mentioned, it is now available at Adaptive Regularization with Cubics · Manopt.jl.

1 Like

Wow, I am quite impressed. I will try the ARC solver once I have time again.

I can see that the unit tests, e.g. here: https://github.com/JuliaManifolds/Manopt.jl/blob/b574950cf349013232e8ab07fa355046bd90e8d3/test/solvers/test_adaptive_regularization_with_cubics.jl
use the same optimisation problems. Would you have any intuition about how the various methods perform relative to each other, and whether that is consistent with the MATLAB implementation?

I specifically think about this figure:
https://link.springer.com/article/10.1007/s10107-020-01505-1/figures/1

Thanks,
Robert

The implementation is based on a Master’s thesis (which is not yet publicly available, but will be at some point) that specifically compared the Julia implementation (written by said student) with the MATLAB implementation.

They are – according to the experiments in the thesis – very consistent in their results; we might just have a small memory disadvantage for high-dimensional problems that we could not completely narrow down yet (edit: it seemed that MATLAB’s \ was sometimes faster than our linear system solver from LinearAlgebra, but we could not completely find out why).

For the tests I mainly use problems that are easy to compare and that verify the methods work correctly. They are not necessarily meant for comparing methods.

Comparing methods is also a bit delicate, since performance depends a lot on the application; a general intuition is hard to give, because it really depends on the individual problem.

Compared to the figure, we do have Lanczos, CG, and also GD as sub solvers available in Manopt.jl – I wrote the CG and GD variants based on the student’s implementation.

edit: and concerning the comparison, also keep in mind that this is not professional software – so I cannot just run such a comparison myself. If you, for example, have a student who could write a thesis on such a comparison, I can surely help; but otherwise, the open source package also gets better with other people’s contributions.

1 Like

The newest version, Manopt.jl 0.4.41, now has all sub solvers decoupled; that is, also for Trust Regions one can now provide alternative solvers for the sub problem – which has also been rewritten to work on the tangent space.

@fastwave so if we now had a paper describing a Lanczos method on tangent spaces, we could implement it. However, the challenge is to work independently of coordinates, i.e. using only inner products, since coordinates are a bit hard to keep track of. For example, on the 2-sphere each tangent space is 2-dimensional, but the tangent vectors are a subset of R3 – a plane. So for coordinates we would first need a basis of that space. Basis-free approaches are therefore preferable.
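To make the coordinate issue concrete, here is a small sketch in Manifolds.jl / ManifoldsBase.jl terms: inner products are available directly, while coordinates require constructing a basis of the tangent space first.

using Manifolds

M = Sphere(2)
p = [0.0, 0.0, 1.0]            # a point on the 2-sphere
X = [1.0, 2.0, 0.0]            # a tangent vector at p (orthogonal to p in ℝ³)

# basis-free: only the inner product is needed
inner(M, p, X, X)              # squared norm of X

# coordinate-based: first build an (orthonormal) basis of the 2-dimensional T_p S²
B = get_basis(M, p, DefaultOrthonormalBasis())
c = get_coordinates(M, p, X, B)   # two coordinates, one per basis vector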

I looked into this a bit, and so far I have not found a good reference for, e.g., Lanczos for Trust Regions on manifolds.

3 Likes

The new version

Manopt.jl 0.4.42

introduces an extension for JuMP.jl, so that you can use manifolds based on ManifoldsBase.jl (e.g. from Manifolds.jl) and algorithms from Manopt.jl within JuMP modelling.
It is realized as a weak dependency, so although MOI.jl is a bit heavy to load, this does not affect Manopt.jl in general.

See Extensions · Manopt.jl for a first small example and a documentation of the interface. Thanks @blegat for starting the idea and implementing this!
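A minimal, hedged sketch of what this enables – the optimizer name and the variable-in-manifold syntax follow the linked Extensions page, so please check there for the authoritative version:

using JuMP, Manopt, Manifolds

A = [1.0 2.0; 3.0 4.0]
model = Model(Manopt.JuMP_Optimizer)           # use Manopt as a JuMP/MOI optimizer
# a matrix variable constrained to the Stiefel manifold St(2, 2)
@variable(model, U[1:2, 1:2] in Stiefel(2, 2), start = 1.0)
@objective(model, Min, sum((A - U) .^ 2))      # closest Stiefel point to A
optimize!(model)
value.(U)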

9 Likes

Our newest version

Manopt.jl 0.4.54

introduces two new non-smooth solvers, where the first is the reference implementation for a recently published paper of the same title.

8 Likes

Our newest version

Manopt.jl 0.4.59

introduces a Riemannian variant of the covariance matrix adaptation evolution strategy (CMA-ES)! Thanks @mateuszbaran for implementing it.
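A short, hedged usage sketch (assuming the solver is exposed as cma_es following Manopt’s naming scheme; see the documentation for the exact signature):

using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[1]^2 + 2 * p[2]^2   # a toy, possibly multimodal cost
# CMA-ES is derivative-free: only the cost is needed
q = cma_es(M, f, rand(M))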

4 Likes

And already the next nice version

Manopt.jl 0.4.60

introduces the possibility to extend recording to the subsolver; to be precise, the subsolver could of course already record things itself, but those records were lost after every subsolver call.

Now, for example with

s1 = exact_penalty_method(
    M2,
    f2,
    grad_f2,
    p0;
    g=g,
    grad_g=grad_g,
    record=[:Iteration, :Cost, :Subsolver],
    sub_kwargs=[:record => [:Iteration, :Cost]],
    return_state=true,
);

The :Subsolver keyword states that the (main) solver shall take over the recording from the subsolver in every step. For more examples, see the new section at Record values · Manopt.jl.
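The combined records can then be inspected from the returned state, for example (assuming get_record as described in the linked Record section):

# every entry now also contains the subsolver's records for that iteration
rec = get_record(s1)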

Manopt.jl 0.4.64

Introduces a rewrite of the constrained objective. Now

  • any constraint function can either be a single function or a vector of functions
  • its gradient can be given in the same two forms, and each of these can now map either into a vector of tangent vectors or into the tangent space of a power manifold (using a new range= keyword specifying the power manifold representation)
  • its Hessian can now also be provided

Furthermore, the access functions for inequality and equality constraints (and their gradients and Hessians), like get_grad_equality_constraint(M, o, p, i), now accept more than just a single index i – namely any range, a bit vector, or even just : for all constraints;
hence both algorithms and implementations of constraints can benefit from more flexible implementations.
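For illustration, a hedged sketch of the new indexing (the manifold M, the constrained objective o, and the point p are assumed to be set up already):

X1 = get_grad_equality_constraint(M, o, p, 1)     # a single constraint gradient
Xr = get_grad_equality_constraint(M, o, p, 1:3)   # a range of them
Xa = get_grad_equality_constraint(M, o, p, :)     # all of them at once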

2 Likes

Manopt.jl 0.4.68

introduces the Interior Point Newton Method
to solve constrained problems on Riemannian manifolds.
We also introduce the conjugate residual solver for the sub problem, as well as several methods to work with and evaluate the KKT conditions.
Both follow the paper by Lai & Yoshise.
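A rough usage sketch – the solver name and keyword names here are assumptions based on Manopt’s usual conventions, so please consult the documentation; f, grad_f, Hess_f and the inequality constraints g (with grad_g, Hess_g) are placeholders:

# hypothetical call; check the docs for the exact signature
q = interior_point_Newton(
    M, f, grad_f, Hess_f, p0;
    g=g, grad_g=grad_g, Hess_g=Hess_g,
)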

5 Likes

:mountain_snow: Manopt.jl 0.5.0

This breaking release is mainly concerned with stability and usability:

  • all interfaces have been unified, especially the order of arguments and the names of keyword arguments
  • for gradient rules like CG or the average gradient, and for stepsizes like the Armijo linesearch, specifying the manifold (yet again) is no longer necessary – thanks to an idea from @bvdmitri, and of course thanks for the nice discussions at JuliaCon! See the sketch after this list.
  • the documentation has been reworked to use a glossary internally
  • we now use Aqua.jl to avoid ambiguities
  • after this rework, we are also back to supporting Julia 1.6
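As a minimal sketch of the simplified stepsize interface (assuming gradient_descent and the stepsize= keyword as in earlier versions; M, f, grad_f, p0 are placeholders):

# the manifold is no longer needed when constructing the stepsize rule
q = gradient_descent(M, f, grad_f, p0; stepsize=ArmijoLinesearch())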

For a full list of changes, and especially everything to adapt for the breaking ones, see the Changelog.md.

6 Likes