I am trying to minimize a function that depends on only a few parameters but contains a logsumexp with many terms in the sum, i.e. a function of the form

f(x) = \log \sum_{i=1}^{n} \exp(a_i^\top x),

where a_i, x \in \mathbb{R}^m. In my case the number of terms n is very large, but the number of parameters m is small. Typically, I might have n \sim 10^5 and m \sim 10.
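For concreteness, here is a plain-Julia sketch of the objective itself (the max-shifted, numerically stable evaluation; the helper name `logsumexp_obj` is just for illustration and is unrelated to the Convex.jl atom):

```julia
# Numerically stable evaluation of f(x) = log(sum(exp.(A * x))),
# shifting by the maximum so the exponentials cannot overflow.
function logsumexp_obj(A, x)
    v = A * x                        # the n inner products a_i' * x
    c = maximum(v)
    return c + log(sum(exp.(v .- c)))
end

A = [1.0 0.0; 0.0 1.0; 1.0 1.0]      # n = 3 terms, m = 2 parameters
x = [0.0, 0.0]
logsumexp_obj(A, x)                  # log(3), since every term is exp(0) = 1
```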
Naively putting this function into Convex.jl will result in O(n) variables and constraints being passed to the solver, which I am hoping to avoid.
Is there any way of representing this function efficiently?
Could you post a minimal example? (See also: Please read: make it easier to help you.) It looks like one might be able to avoid ~n variables/constraints, but it would be easier to figure out with some code to work with.
Thanks for your replies. Here is a minimal example for a moderate number of terms:
using Convex, SCS
A = randn(1000, 10)
x = Variable(10)
problem = minimize(Convex.logsumexp(A*x))
solve!(problem, SCS.Optimizer)
This example has 10 variables and 1000 terms, yet the SCS log shows that 1012 variables and 3002 constraints are passed to the solver.
I found that problems with more than 10,000 terms easily consume more than 20 GB of memory.
It is my understanding that JuMP would require me to formulate the problem in a solver-compatible form myself, which I am not sure how to do with fewer than O(n) variables.
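For what it's worth, the standard conic reformulation (which, as far as I know, is what modeling layers ultimately pass to SCS) goes through the epigraph:

\min_x \; \log \sum_{i=1}^{n} \exp(a_i^\top x)
\quad\Longleftrightarrow\quad
\min_{x,\,t} \; t \quad \text{s.t.} \quad \sum_{i=1}^{n} \exp(a_i^\top x - t) \le 1,

and then bounds each term by an auxiliary variable u_i \ge \exp(a_i^\top x - t) using one exponential-cone membership per term:

\min_{x,\,t,\,u} \; t \quad \text{s.t.} \quad \sum_{i=1}^{n} u_i \le 1, \quad (a_i^\top x - t,\; 1,\; u_i) \in K_{\exp}, \quad i = 1, \dots, n.

So in the conic form, roughly n auxiliary variables and cone constraints seem hard to avoid; the question is only how much modeling overhead sits on top of them.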
Thanks a lot! I just tried your JuMP code with a larger number of terms and it seems to work. While it actually creates more variables and constraints than the Convex.jl version, the memory requirement is much, much lower (50,000 terms use about 1 GB). So maybe a clever reformulation of the problem is not actually needed.
Maybe the excess memory requirement is an issue with Convex.jl (possibly related to this)?
More generally, Convex.jl is not actively maintained, whereas the JuMP developers are extremely responsive, so betting on the latter is a good call for current projects.
You could try Convex#master. Convex has not been maintained much over the years, but we merged a big refactor, and Oscar (who does an amazing job maintaining tons of JuMP packages) also spent some time cleaning up Convex, fixing bugs, and writing tests. Those changes haven't made it to a release yet, though.
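To try the development version, in the Pkg REPL (press `]` at the julia prompt):

```
pkg> add Convex#master
```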