Writing large least-squares-like problems in JuMP

Hi,
Here is a small example showing the issue I am facing when trying to build a large (least-squares-like) problem with JuMP:

using JuMP, Ipopt, BenchmarkTools

p = 500
n = 200

solver = with_optimizer(Ipopt.Optimizer);
ML = Model(solver)
@variable(ML, v[1:p])

A = rand(n, p)
b = [ones(15); zeros(p - 15)];
y = A*b + 0.5*rand(n);

@btime sum((A*v - y).^2);        # computing the sum of squares where v are JuMP variables
# 30.526 s (50026082 allocations: 2.55 GiB)
@btime sum((A*rand(p) - y).^2);  # the same sum when v are Floats
# 33.332 μs (10 allocations: 9.47 KiB)

The problem is not that big, yet it takes 30 s just to build the expression (I am not even solving it yet). It is normal for the second sum to be faster, but here it is about 10^6 times faster! Presumably the scalar expansion of sum((A*v - y).^2) materializes on the order of n·p² = 200·500² ≈ 5×10^7 quadratic terms, which matches the allocation count above.
My questions are:

  • Is there an efficient way to form this type of problem,
    @objective(ML, Min, sum((A*v - y).^2))?
    Is it normal that such a simple sum of squares takes so long to build with JuMP?
    I want to handle p ~ 5000, n ~ 1000, plus some handmade regularization terms, and as of now JuMP is not an option (~3000 s just to build).

  • Without JuMP, are there efficient packages one could use? Convex.jl seems faster, but I am not sure whether I can add custom (non-convex) regularization terms.


Yes: define a new set of variables z, add the constraint z == A*v - y, and minimize sum(z.^2). To see why this helps, compare the density of the Hessian matrices of the two formulations: the Hessian of the original objective is the dense p×p matrix 2A'A, while the reformulated objective is separable, with Hessian 2I in z and all the coupling moved into the linear constraints.
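Concretely, here is a minimal sketch of that reformulation (my illustration, not from the original reply; it reuses the ML, v, A, y, and n defined in the first post):

@variable(ML, z[1:n])             # one residual variable per row of A
@constraint(ML, z .== A*v - y)    # linear constraints coupling z to v
@objective(ML, Min, sum(z.^2))    # separable quadratic objective in z

Each scalar expression now stays small, so building the model is cheap.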

Ipopt is really designed for constrained nonlinear optimization. If you only have a least squares problem plus other objective terms, you can use other packages like Optim. Others will likely chime in with additional alternatives.
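For instance, a minimal sketch with Optim.jl (my own illustration, not from the reply above; it reuses A, y, and p from the first post and assumes an L-BFGS solve with a hand-written gradient):

using Optim, LinearAlgebra

f(v) = sum(abs2, A*v - y)                 # objective: ||A*v - y||^2
g!(G, v) = (G .= 2 .* (A' * (A*v - y)))   # gradient: 2*A'*(A*v - y)

res = optimize(f, g!, zeros(p), LBFGS())
v_hat = Optim.minimizer(res)

Custom regularization terms can simply be added to f (with their derivatives added to g!).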


Thanks! It works well and is way faster:

@variable(ML, z[1:n])
@btime @constraint(ML, z .== A*v - y)
# 82.711 ms (116252 allocations: 14.70 MiB)
@btime @objective(ML, Min, sum(z.^2))
# 231.922 μs (2854 allocations: 238.06 KiB)
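From there, solving the model and reading the fit back out should just be (my addition, assuming the JuMP 0.19-era API that matches the with_optimizer call above):

optimize!(ML)                  # hand the model to Ipopt
v_opt = value.(v)              # fitted coefficients
rss = objective_value(ML)      # residual sum of squares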