Hi all,

We registered our package ProximalOperators.jl, which implements the proximal mapping (prox) of several convex and nonconvex penalty functions, indicator functions, and so on. It can be useful for quickly implementing prox-based algorithms such as ADMM, the proximal gradient method, or the Vu-Condat primal-dual splitting algorithm.
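To make the "prox-based algorithms" point concrete, here is a minimal, self-contained sketch (not the package's API; the names `soft_threshold` and `proximal_gradient` are made up for illustration) of a proximal gradient (ISTA) loop for a lasso-type problem, using the closed-form prox of the L1 norm:

```julia
using LinearAlgebra  # for opnorm

# Closed-form prox of the L1 norm: soft-thresholding.
soft_threshold(x, t) = sign.(x) .* max.(abs.(x) .- t, 0.0)

# Proximal gradient method for: minimize 0.5*||A*x - b||^2 + λ*||x||_1
function proximal_gradient(A, b, λ; iters=500)
    x = zeros(size(A, 2))
    γ = 1.0 / opnorm(A)^2                      # step size 1/L for the smooth term
    for _ in 1:iters
        g = A' * (A * x - b)                   # gradient of 0.5*||A*x - b||^2
        x = soft_threshold(x - γ * g, γ * λ)   # prox step on λ*||x||_1
    end
    return x
end
```

Each iteration alternates a gradient step on the smooth term with a prox step on the nonsmooth one, which is exactly the pattern a library of prox implementations makes easy to write.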

We tried to keep every implementation of prox as efficient as possible, but I’m sure that there’s a lot of room for improvement.

Please don’t hesitate to give your feedback and suggest additional features or proxes that may be missing!



Thank you, this looks great! Are you familiar with JuliaML (ProximalOperators was just mentioned in our gitter chat), specifically PenaltyFunctions? It looks like we adopted identical syntax for the prox function. I’m not sure what form it would take, but there’s a lot of room for working together here.


Hey, no I wasn’t familiar with PenaltyFunctions. Yes, the structure looks similar, although we have used a four-argument in-place prox! method.

Edit: another difference is that our implementation of prox! returns the function value at the proximal point, along with the proximal point itself.
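For illustration, here is a hypothetical sketch of what such a four-argument in-place convention could look like; this is not the package's actual code, just an assumed signature `prox!(y, f, x, γ)` that overwrites `y` with the proximal point of `γ*f` at `x` and returns the function value there, using the L1 norm as the example:

```julia
# Hypothetical type and convention, for illustration only.
struct NormL1
    λ::Float64
end

(f::NormL1)(x) = f.λ * sum(abs, x)          # evaluate the function itself

# Assumed convention: write prox_{γf}(x) into y, return f(y).
function prox!(y, f::NormL1, x, γ)
    t = γ * f.λ
    @. y = sign(x) * max(abs(x) - t, 0)     # soft-thresholding, in place
    return f(y)                             # function value at the proximal point
end
```

Usage under this convention would look like `y = similar(x); fy = prox!(y, NormL1(1.0), x, 1.0)`, so a solver gets the function value for free without re-evaluating `f`.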

We still need to finish implementing calculus rules: the functions we implemented in ProximalOperators are very customizable through parameters and so on, but an alternative take on a library like this could start from elementary functions and build up more complicated ones, using calculus rules to evaluate prox. There are not many such rules: conjugation, postcomposition, and the sum of independent functions are fully working; precomposition with an affine mapping needs to be made more general than it currently is.
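As an example of such a rule, the conjugation case can be handled generically via the Moreau decomposition, prox_{γ f*}(x) = x - γ * prox_{(1/γ) f}(x / γ). Below is a self-contained sketch (not the package internals), where `prox_f(u, σ)` stands for any routine computing the prox of σ·f at u:

```julia
# Conjugation calculus rule via the Moreau decomposition:
#   prox_{γ f*}(x) = x - γ * prox_{(1/γ) f}(x / γ)
prox_conjugate(prox_f, x, γ) = x .- γ .* prox_f(x ./ γ, 1 / γ)

# Example building block: f = ||·||_1, whose prox is soft-thresholding.
# Its conjugate f* is the indicator of the unit L∞ ball, so prox of f*
# is the projection onto that ball, i.e. clamping to [-1, 1].
soft_threshold(u, σ) = sign.(u) .* max.(abs.(u) .- σ, 0.0)
```

With this rule, adding a prox for an elementary function automatically yields the prox of its conjugate, which is the kind of composability the "build up from elementary functions" approach relies on.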

There are a few more items on our TODO list that need to be addressed. What is also important, I think, is a scalable method to evaluate prox for generic smooth functions.

But yes, there’s a lot we can do together, I agree.

Edit: it would also be nice to interface a library like this with some modeling language I suppose.

Thank you!