Here is a small code example showing the issue I am facing when trying to form a large (least-squares-like) problem with JuMP:
```julia
using JuMP, Ipopt, BenchmarkTools

p = 500
n = 200
solver = with_optimizer(Ipopt.Optimizer)
ML = Model(solver)
@variable(ML, v[1:p])

A = rand(n, p)
b = [ones(15); zeros(p - 15)]
y = A * b + 0.5 * rand(n)

@btime sum((A*v - y).^2)          # sum of squares where v are JuMP variables
#  30.526 s (50026082 allocations: 2.55 GiB)
@btime sum((A*rand(500) - y).^2)  # same sum when v are floats
#  33.332 μs (10 allocations: 9.47 KiB)
```
The problem is not that big, yet it takes 30 s just to form the objective (I am not even solving it yet). It is normal for the second sum to be faster, but here it is 10^6 times faster!
My questions are:
Is there an efficient way to form this type of problem, i.e. `@objective(ML, Min, sum((A*v - y).^2))`?
Is it normal that such a simple sum of squares takes so long to form with JuMP?
I want to deal with p ~ 5000 and n ~ 1000, plus some handmade regularization terms, so as of now JuMP is not an option.
Without JuMP, are there efficient packages that one could use?
Convex.jl seems faster, but I am not sure whether I can add custom (non-convex) regularization terms.
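For reference, here is a minimal sketch of a scalarized formulation I am considering as a workaround (my guess is that building each residual as a scalar expression inside a macro avoids the slow broadcast `(A*v - y).^2` over the variable vector; I am not sure this is the idiomatic fix):

```julia
using JuMP, Ipopt

p, n = 500, 200
A = rand(n, p)
y = A * [ones(15); zeros(p - 15)] + 0.5 * rand(n)

ML = Model(with_optimizer(Ipopt.Optimizer))
@variable(ML, v[1:p])

# Build each residual A[i,:]'*v - y[i] as a scalar affine expression
# inside the @expression macro, instead of broadcasting over v.
@expression(ML, r[i = 1:n], sum(A[i, j] * v[j] for j in 1:p) - y[i])
@objective(ML, Min, sum(r[i]^2 for i in 1:n))
```

Comments on whether this is the right pattern (or whether there is something better) would be welcome.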
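The Convex.jl variant I tried looks roughly like this (`sumsquares` handles the least-squares part; whether a non-convex penalty term can be attached to this is exactly my question):

```julia
using Convex, SCS

p, n = 500, 200
A = rand(n, p)
y = A * [ones(15); zeros(p - 15)] + 0.5 * rand(n)

x = Variable(p)
# Forming the problem this way is fast; it is the non-convex
# regularization term that I do not know how to add here.
problem = minimize(sumsquares(A * x - y))
```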