I’m happy to announce that ProximalAlgorithms v0.5.0 was released! This new version comes with some breaking changes and a significant code overhaul. Most importantly, it comes with documentation, which was long overdue! A big thanks goes to the contributors and reviewers who helped push this release over the finish line in the past few months.
ProximalAlgorithms allows solving a variety of optimization problems where non-differentiable terms occur in the objective. Constrained problems are one example: there, the indicator function of the feasible set is included in the objective. Other examples include models with non-differentiable losses or regularization terms.
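Concretely, these are problems of the composite form

$$\min_x \; f(x) + g(x),$$

where $f$ is smooth and $g$ is possibly non-differentiable: a constrained problem corresponds to taking $g = \delta_C$, the indicator of the feasible set $C$, which is zero on $C$ and $+\infty$ outside of it.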
Here is a summary of the changes that went into this release.
Introducing ProximalCore
The core API for proximal minimization is now defined in the new, extremely lightweight ProximalCore package, which ProximalAlgorithms depends on. This way, one can easily define the proximal mapping for custom types, as explained here, and have ProximalAlgorithms work with them, without needing to depend on ProximalOperators.
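As a minimal sketch of what that looks like, here is a hypothetical custom type (the indicator of the nonnegative orthant), assuming the ProximalCore convention that `prox!` writes the proximal point into its first argument and returns the value of the function at that point:

```julia
using ProximalCore

# Hypothetical custom type: indicator of the nonnegative orthant
struct IndNonnegative end

(f::IndNonnegative)(x) = all(>=(0), x) ? 0.0 : Inf

# Overload ProximalCore.prox!: write the proximal point (here, the
# projection onto the orthant) into y, and return the value of f at y
function ProximalCore.prox!(y, ::IndNonnegative, x, gamma)
    y .= max.(x, 0)
    return 0.0  # f is zero at any feasible point
end
```

With these two methods in place, any algorithm in ProximalAlgorithms that only needs the proximal mapping of a term can work with `IndNonnegative` directly.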
ProximalOperators v0.15
The latest version of ProximalOperators depends on ProximalCore, and implements the interface defined there: if you wish to use the function types defined in ProximalOperators, don’t forget to update it to the recently released v0.15.0 so that it works with the latest ProximalAlgorithms release.
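If you manage your environment with Pkg, something along these lines should do the trick:

```julia
using Pkg

# Pin ProximalOperators to the release that implements the ProximalCore interface
Pkg.add(name="ProximalOperators", version="0.15")
```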
Algorithms, algorithms, algorithms
Significant work went into refactoring the algorithms code, and some new algorithms were also added. The fleet of available algorithms now includes:
- (Fast) Proximal gradient methods
- Douglas-Rachford splitting
- Three-term splitting
- Primal-dual splitting algorithms
- Newton-type methods (based both on proximal gradient and Douglas-Rachford)
Here is a list of the implemented algorithms and the types of problems they target. And here is a quick introduction to the interface to algorithms, with some examples.
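To give a flavor of typical usage, here is a minimal sketch that solves a small lasso problem with the fast proximal gradient method (the data below is made up for illustration):

```julia
using ProximalAlgorithms, ProximalOperators

# Made-up lasso instance: minimize 0.5 * ||Q x - q||^2 + lam * ||x||_1
Q = [1.0 2.0; 3.0 4.0; 5.0 6.0]
q = [1.0, 2.0, 3.0]
lam = 0.1

f = LeastSquares(Q, q)  # smooth term, from ProximalOperators
g = NormL1(lam)         # non-differentiable term with an inexpensive prox

# Algorithms are callable objects: configure them once, then invoke
# them with the problem terms as keyword arguments
ffb = ProximalAlgorithms.FastForwardBackward(maxit=1000, tol=1e-6)
solution, iterations = ffb(x0=zeros(2), f=f, g=g)
```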
Zygote as default backend for gradients
While the ProximalCore API allows one to explicitly define the gradient of a custom function type, it is only natural to fall back to the amazing AD ecosystem that Julia has to offer. Therefore, ProximalAlgorithms falls back to Zygote whenever gradients are to be computed. This means that one can provide a regular Julia function wherever only gradients are needed, as one would naturally do, with no need to define a custom type as is required for proximal mappings. This simple example shows this in action.
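Along those lines, a minimal sketch (the objective and constraint below are made up for illustration): the smooth term is a plain Julia function, and its gradient is obtained through the Zygote fallback.

```julia
using ProximalAlgorithms, ProximalOperators

# Plain Julia function as the smooth term: no custom type needed,
# since gradients are computed via the Zygote fallback
rosenbrock(x) = 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2

# Non-differentiable term: indicator of a box, from ProximalOperators
g = IndBox(-0.5, 0.5)

panoc = ProximalAlgorithms.PANOC()
solution, iterations = panoc(x0=zeros(2), f=rosenbrock, g=g)
```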
In the future, it will make sense to add support for multiple AD backends, so that one can easily switch from Zygote to ReverseDiff, Enzyme, Diffractor, Yota, Nabla… whatever works best. If anyone is interested in this, feel free to reach out here or through issues.