I have written a new solver for robustified non-linear least squares problems with non-Euclidean variables. It can compute auto-differentiated Jacobians when necessary. I have called the package NLLSsolver. Please feel free to clone it from GitHub and try it out. Feedback is welcome.
Background:
Until recently I prototyped in MATLAB, and wrote production code in C++. I have been following Julia for a while, as a potential replacement for both. Recently I decided to try it out, and since I tend to do a lot of non-linear least squares optimizations of geometry problems in my work, my first goal was to optimize bundle adjustment (BA) problems.
I first had a look at existing Julia optimizer frameworks, JuMP and NLPModels. I found an implementation of BA as an NLPModel. However, I found that the model did not support robustification (necessary for downweighting outlier or erroneous measurements), did not seem able to model non-Euclidean variables such as 3D rotation matrices, and also had hand-coded derivatives (I’m not sure if auto-differentiation (AD) was or is now available). I needed to be able to do all 3 of these things. In addition, I was put off JuMP by the assessment of the authors of that BA implementation, who made it sound even less suitable.
This led me to decide to implement my own optimization framework for robustified non-linear least squares problems. It’s been a very interesting first major project in Julia for me, and now I’m ready to share it publicly. It’s still in the early stages, and there isn’t much documentation (just the readme, and file comments), but there are a couple of examples, BA being one of them. I tested it against the NLPModels BA optimization on an unrobustified problem, and it was competitive (actually slightly faster, reaching the same cost), despite having auto-differentiated derivatives instead of hand-coded ones. There’s also still huge scope for reducing allocations.
Many thanks to everyone who has answered my questions on this forum since I started learning Julia. Hopefully I can now give a bit back to the community.
You should now be able to install it from the package registry.
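For anyone finding this later, the standard install from the General registry looks like this:

```julia
# Install NLLSsolver from the Julia package registry
using Pkg
Pkg.add("NLLSsolver")
```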
This sounds very interesting!
I just saw rotations (SO(3)) in your announcement; do you also have other non-Euclidean variables?
There is a bunch of algorithms for optimization on manifolds (like SO(3)) in Manopt.jl (disclaimer: I am a developer). The solver closest to yours is probably the Levenberg–Marquardt solver in Manopt.jl.
Concerning the derivatives, you can employ Euclidean AD and "convert" the result to a Riemannian one. See for example Nicolas Boumal's book An Introduction to Optimization on Smooth Manifolds, section 4.7 for gradients (of embedded manifolds) and, I think, section 5.5 for the Hessian.
The gradient conversion is implemented in Manifolds.jl and ManifoldDiff.jl (again a disclaimer, I am among the developers). The Hessian is still future work.
With that you can compute the Euclidean (Matrix-)Gradient and turn it into a Riemannian (Rotations-)Gradient.
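As a rough sketch of that conversion (my own toy example; the cost and target matrix are made up, and the `riemannian_gradient` signature is based on my reading of the ManifoldDiff.jl docs, so please check them):

```julia
# Compute a Euclidean (matrix) gradient with ForwardDiff, then convert it to a
# Riemannian gradient on the rotation group SO(3) using ManifoldDiff.
using Manifolds, ManifoldDiff, ForwardDiff, LinearAlgebra

M = Rotations(3)
A = [0.0 -1.0 0.0; 1.0 0.0 0.0; 0.0 0.0 1.0]   # arbitrary target matrix (illustration only)
f(R) = 0.5 * norm(R - A)^2                      # plain function of a 3x3 matrix

R = Matrix(1.0I, 3, 3)                          # the identity rotation as the evaluation point
egrad = ForwardDiff.gradient(f, R)              # Euclidean (matrix) gradient in the embedding
rgrad = ManifoldDiff.riemannian_gradient(M, R, egrad)  # Riemannian (rotations) gradient
```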
There are lower-bounded variables, and variables bounded both above and below, but I'm not sure how well they work in practice.
Do they support robustified least squares cost functions?
Just bounds would not make Euclidean variables non-Euclidean; that would just be constrained (but still Euclidean) optimisation (like the augmented Lagrangian method).
Can you phrase this a bit more mathematically? What does "robustified" mean?
For example, I know robust PCA, which instead of the distance squared (Euclidean: norm squared) minimises the distance (norm). But "robust" per se is a bit vague mathematically, so what are you referring to when you speak about robustified?
Sure, but as with PCA, using the norm is no longer least squares, so I am unsure what your "robust" refers to.
In general, all algorithms work with a cost f(M, p), where p is a point on the manifold M. Without a gradient you can use, for example, particle_swarm. If you have a gradient, there are for example gradient_descent or quasi_Newton. If you even have an (approximation of the) Hessian, there is trust_regions. A small usage sketch follows below.
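Here is a minimal usage sketch of that `f(M, p)` convention (a toy problem of my own, not from the thread): gradient descent on the sphere S² towards a fixed target point.

```julia
# Gradient descent in Manopt.jl: the cost and gradient both take (M, p).
using Manopt, Manifolds, LinearAlgebra

M = Sphere(2)
q = [0.0, 0.0, 1.0]                       # target point on the sphere
f(M, p) = 0.5 * norm(p - q)^2             # cost f(M, p) with p a point on M
grad_f(M, p) = project(M, p, p - q)       # Riemannian gradient: project the Euclidean one

p0 = [1.0, 0.0, 0.0]                      # starting point on the sphere
p_opt = gradient_descent(M, f, grad_f, p0)   # should converge towards q
```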
It depends on how the "bounds" are implemented. I'm using unconstrained, non-Euclidean variables, as opposed to constrained Euclidean variables. My solver doesn't support constraints.
Robustified non-linear least squares is explained well in the Ceres solver documentation.
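For concreteness, here is a rough sketch of what robustification means in that formulation (my own illustration following the Ceres docs, not NLLSsolver's actual API): instead of minimising the sum of squared residual norms, each squared norm is passed through a robust kernel ρ that grows sub-quadratically for large residuals, downweighting outliers.

```julia
# Huber kernel applied to the squared residual norm s = ‖r‖²:
# quadratic for small residuals, linear growth beyond the scale δ.
huber(s, δ=1.0) = s <= δ^2 ? s : 2δ * sqrt(s) - δ^2

# Robustified cost: sum_i ρ(‖r_i‖²) over all residual vectors r_i.
robust_cost(residuals) = sum(huber(sum(abs2, r)) for r in residuals)

# An outlier residual contributes far less than its squared norm would:
robust_cost([[0.1, 0.2], [10.0, 0.0]])   # ≈ 0.05 + 19.0 instead of 0.05 + 100.0
```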
Thanks, I will check the Ceres docs. Ah, I see: with the chain rule we could handle that as well (to get the gradient), but we do not have a specific solver for it. It might be interesting to give that a look.
And sure, the usual tradeoff/choice is either constrained Euclidean or unconstrained Riemannian (non-Euclidean) optimisation. Manopt.jl even mixes both and has two solvers (augmented Lagrangian and exact penalty method) that can do constrained Riemannian optimisation (or, in your words, constrained non-Euclidean).
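A very rough sketch of what such a constrained Riemannian problem could look like (my own toy example; the keyword names `g`/`grad_g` are based on my reading of the Manopt.jl docs and should be checked there):

```julia
# Augmented Lagrangian method on the sphere S²: minimise the z-coordinate
# subject to the inequality constraint p[1] >= 0 (written as g(p) <= 0).
using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[3]                                   # cost: the z-coordinate of p
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])    # Riemannian gradient via tangent projection
g(M, p) = [-p[1]]                                # inequality constraints, g_i(p) <= 0
grad_g(M, p) = [project(M, p, [-1.0, 0.0, 0.0])] # their Riemannian gradients

p0 = [0.0, 1.0, 0.0]
q = augmented_Lagrangian_method(M, f, grad_f, p0; g=g, grad_g=grad_g)
```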