PosDefManifoldML: machine learning for positive definite matrices

Hello all,

PosDefManifoldML is a pure Julia package for classifying data in the Riemannian manifold P of real or complex positive definite matrices, for example covariance matrices, Fourier cross-spectral matrices, kernels, etc. It is based on the PosDefManifold.jl, GLMNet.jl and LIBSVM.jl packages.

For the time being, PosDefManifoldML implements the Riemannian Minimum Distance to Mean (MDM) classifier, which operates directly in P, as well as elastic net logistic regression (including the pure ridge and pure lasso logistic regression models) and several support-vector machine classifiers operating in the tangent space. The tangent-space models can also be used with traditional (Euclidean) feature vectors; in the Riemannian setting implemented here they have won several international machine learning competitions in the field of brain-computer interfaces (see table 1, page 29 here).
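To give a feel for the MDM idea, here is a minimal, self-contained sketch (this is not PosDefManifoldML's actual API; it uses the log-Euclidean metric because its mean has a closed form):

```julia
using LinearAlgebra

logE(P) = log(Hermitian(P))                               # matrix logarithm of an SPD matrix
dLE(P, Q) = norm(logE(P) - logE(Q))                       # log-Euclidean distance
meanLE(Ps) = exp(Hermitian(sum(logE, Ps) / length(Ps)))   # log-Euclidean mean

# Fit: compute one mean per class. Predict: nearest class mean in the metric.
function mdm_fit(Ps, y)
    classes = sort(unique(y))
    [meanLE(Ps[y .== c]) for c in classes], classes
end
mdm_predict(means, classes, P) = classes[argmin([dLE(P, M) for M in means])]

# Toy example with two classes of 2×2 SPD matrices
Ps = [[2.0 0.3; 0.3 1.0], [2.1 0.2; 0.2 0.9], [1.9 0.4; 0.4 1.1],
      [1.0 0.0; 0.0 5.0], [1.1 0.1; 0.1 4.8], [0.9 0.0; 0.0 5.2]]
y  = [1, 1, 1, 2, 2, 2]
means, classes = mdm_fit(Ps, y)
mdm_predict(means, classes, [2.0 0.25; 0.25 1.1])   # → 1
```

Swapping in another metric (and its mean) is all it takes to change the geometry of the classifier.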


Cool! Any plan to support the MLJ.jl interface?


That would be nice, if there is someone interested in helping with that.

Nice package. You might be interested in the Manifolds.jl package, with which you could do ML on many other manifolds (we have the SPD manifold with two metrics).


This looks interesting, and I appreciate the nicely written README - it would also be cool to have some references to publications on the topic, to help fill in gaps for those of us looking for new tools!

Thanks. I am concentrating on the manifold of positive definite matrices, for which I implemented 10 metrics in PosDefManifold.jl. The level of abstraction of Manifolds.jl seems really impressive, though.


Thank you. There are several references with links to the PDFs in the docs here. Also, I wrote an extensive introduction here.


Very nice - thanks!

Marco, your documentation is gorgeous! Fantastic work here.


Thank you very much, Casey; just as fantastic is your willingness to read it. All the packages I release are documented alike, as I believe good documentation is fundamental for making a package really useful. All details are covered there and, hopefully, reading the documentation is a pleasant and instructive experience.


This looks like quite an interesting package; thanks for providing it with such detailed documentation. Might I offer a small criticism: the manifold in the illustrations looks as if it has positive curvature, while the SPD manifold with most metrics has nonpositive curvature. That aside, the illustrations look really nice; which program do you use to make them?

We could try to get more metrics into Manifolds.jl, where we currently have 3 metrics covered (more or less) – I might take a look and “steal” a few metrics (documenting their origin, of course). Or, with ManifoldsBase.jl, we could try to get your algorithms working on other manifolds as well. Of course only if you’d be interested (we can of course help!).

PS: Thanks for referring to Manopt.jl in your references. To some extent your ML methods are also optimization methods in that regime.

The package DirectSum.jl is intended to provide a potentially universal interface for metrics. Its current design is not adapted for use with matrices, since it was designed for Grassmann.jl, but it could be adapted for that. For diagonal metrics it uses a bit encoding; however, the VectorBundle <: Manifold is not limited to bit-encoded diagonal metrics and can be extended to any quadratic form defining a metric, although I have not taken the time to fully do that yet. In my experience, something like DirectSum is very useful for working with pre-allocated metric algebras.

Hello Ronny,
indeed, the curves in the figure look like curves of positive curvature; however, this makes for much nicer pictures. As it is just for illustration purposes, I have been using this kind of figure for a long time. I do the artwork simply with PowerPoint and paint.net (on Windows).

I have implemented many metrics for completeness, but I don’t think all of them are necessary. For the manifold of positive definite matrices I would say that the Jeffrey and S-divergences, along with the Wasserstein metric (which has recently been shown to be a Riemannian metric), nicely complement the affine-invariant metric and the Euclidean and log-Euclidean distances.
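For readers unfamiliar with these metrics, here are textbook formulas for three of them (a sketch, not PosDefManifold.jl's implementation):

```julia
using LinearAlgebra

function dFisher(P, Q)        # affine-invariant: ‖log(P^{-1/2} Q P^{-1/2})‖
    S = sqrt(Hermitian(P))
    norm(log(Hermitian(S \ Q / S)))
end

dLogE(P, Q) = norm(log(Hermitian(P)) - log(Hermitian(Q)))   # log-Euclidean

function dWass(P, Q)          # Bures-Wasserstein
    S = sqrt(Hermitian(P))
    sqrt(tr(P) + tr(Q) - 2tr(sqrt(Hermitian(S * Q * S))))
end

P = [2.0 0.5; 0.5 1.0]; Q = [1.0 0.0; 0.0 3.0]
dFisher(P, Q), dLogE(P, Q), dWass(P, Q)
```

All three are symmetric in their arguments and vanish only at P = Q; they differ in how they weight directions toward the boundary of the cone.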

In riemannianGeometry.jl I have implemented gradient-descent iterative algorithms to estimate the center of mass for all metrics that require an iterative algorithm, including the whole family of power means and p-means (e.g., the median). You are welcome to copy or call them in Manopt.jl.
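The flavor of such an iterative barycenter algorithm can be sketched as follows, using the classical fixed-point scheme for the affine-invariant (Karcher) mean (a textbook illustration, not the package's actual implementation):

```julia
using LinearAlgebra

# Fixed-point iteration for the affine-invariant mean:
# G ← G^{1/2} · exp( mean_k log(G^{-1/2} P_k G^{-1/2}) ) · G^{1/2}
function karcher_mean(Ps; tol=1e-9, maxit=100)
    G = sum(Ps) / length(Ps)          # initialize at the arithmetic mean
    for _ in 1:maxit
        S = sqrt(Hermitian(G))
        T = sum(log(Hermitian(S \ P / S)) for P in Ps) / length(Ps)
        norm(T) < tol && break        # gradient (tangent vector) vanished
        G = Hermitian(S * exp(Hermitian(T)) * S)
    end
    Matrix(Hermitian(G))
end

Ps = [[2.0 0.3; 0.3 1.0], [1.5 0.1; 0.1 0.8], [3.0 0.4; 0.4 1.2]]
G = karcher_mean(Ps)
```

At convergence the mean of the logarithms (the Riemannian gradient) is zero, which is exactly the defining equation of the Karcher mean.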

I would be glad to contribute. Indeed, PosDefManifoldML.jl can easily be adapted to classify data on the tangent space of any manifold for which a projection onto the tangent space is defined. As long as functions for computing distances and means on the manifold are ready to be called, classification on the manifold itself with PosDefManifoldML.jl is also straightforward.
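For the SPD case, the tangent-space projection that feeds the logistic regression and SVM models can be sketched like this (standard formulas, hedged as an illustration rather than the package's code): map each matrix through the logarithm at a base point G (typically the mean of the training set) and vectorize the upper triangle, weighting off-diagonal entries by √2 so the Euclidean inner product of the vectors matches the Frobenius inner product of the matrices.

```julia
using LinearAlgebra

# Tangent-space feature vector at base point G: P ↦ vec⁺(log(G^{-1/2} P G^{-1/2}))
function tangent_vector(P, G)
    S = sqrt(Hermitian(G))
    L = log(Hermitian(S \ P / S))
    n = size(L, 1)
    [i == j ? L[i, j] : √2 * L[i, j] for j in 1:n for i in 1:j]
end

G = [2.0 0.3; 0.3 1.0]
v = tangent_vector([2.2 0.4; 0.4 0.9], G)   # an n(n+1)/2 = 3 element feature vector
```

Any Euclidean classifier can then be trained on these vectors; generalizing to another manifold only requires replacing the log map.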

Good to see that there is a community working on Riemannian stuff in Julia. Some of my former students may be interested in contributing. I am confident people will join, slowly, as they become convinced of how nice Julia is!

Hi Marco,
oh, of course, median and mean are the first examples used to illustrate the solver framework, so they are already within Manopt.jl; we even ported simpler versions (without the framework) to Manifolds.jl, together with variance estimation within the statistics we provide on arbitrary manifolds. But I will definitely take a look at the Wasserstein metric (affine-invariant, log-Euclidean and log-Cholesky are already available). Which one do you mean by Euclidean? The metric from the embedding would make things complicated, since points not on the manifold would be at finite distance.

Thanks for the info on your illustrations; I was just curious what other people use (I use TikZ for 2D and Asymptote for 3D). As for the illustration – yes, the positive curvature makes the images nicer; maybe it’s a trade-off between exactness and readability of the figure.

It’s great to see more people working on manifolds in Julia, yes! And exactly, as you write, your methods should work on any tangent space, so generalizing them would be great.

Hi Ronny,

the algorithm I implemented for the Wasserstein barycenter is the fastest I have found in the literature so far.

By ‘Euclidean’ I mean just the Euclidean (Frobenius) distance ||P-Q||.

OK, I will check what it takes to allow passing a ManifoldsBase.Manifold to the machine learning methods of PosDefManifoldML.jl. I will start a thread on the Manifolds.jl GitHub.

Then I will definitely take a look at that algorithm!

But then Euclidean is not a Riemannian metric on the symmetric positive definite matrices, since non-positive-definite matrices are at a finite distance.
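A quick numeric illustration of this point, using textbook formulas: under the Frobenius (Euclidean) distance, a singular matrix is at finite distance from the identity, whereas the affine-invariant distance blows up as an eigenvalue approaches zero.

```julia
using LinearAlgebra

dEuclid(P, Q) = norm(P - Q)                    # Frobenius distance
function dFisher(P, Q)                         # affine-invariant distance
    S = sqrt(Hermitian(P))
    norm(log(Hermitian(S \ Q / S)))
end

P = Matrix(1.0I, 2, 2)
for ε in (1e-1, 1e-4, 1e-8)
    Q = [1.0 0.0; 0.0 ε]                       # approaching the singular boundary
    println((ε, dEuclid(P, Q), dFisher(P, Q))) # Euclidean stays ≤ 1, Fisher grows
end
dEuclid(P, [1.0 0.0; 0.0 0.0])                 # finite, although Q is not even PD
```

In the affine-invariant geometry the boundary of the cone is infinitely far away, which is one reason it is the natural metric for this manifold.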

Great, looking forward to reading about that thread to discuss that idea further.

Sure, I allow using the Euclidean distance for compatibility with other stuff. PosDefManifold is actually for manipulating positive definite matrices in general, not just on the Riemannian manifold.

Thanks for the clarification. Then ‘Euclidean’ slightly misuses the idea of a metric, but maybe that’s the easiest way to include it in your package.