Problem forming a least squares solution

I’m having trouble formulating a model to minimize ||Ax - b||.
Ideally I’m looking to minimize the 1-norm, but in trying to get there I ran into errors that I’m not sure I understand:

using JuMP
using ECOS
using LinearAlgebra   # norm is defined in LinearAlgebra

model = Model(with_optimizer(ECOS.Optimizer))
A = rand(10, 10)
b = rand(10)
@variable(model, x[1:10])

@objective(model, Min, norm(A*x - b))
ERROR: JuMP no longer performs automatic transformation of `norm()` expressions into second-order cone constraints. They should now be expressed using the SecondOrderCone() set. For example, `@constraint(model, norm(x) <= t)` should now be written as `@constraint(model, [t; x] in SecondOrderCone())`
...

So I see that the norm function has a problem. Is this error correct even though I’m using it in the objective function, not in a constraint?

When I try to make it into the norm squared, I get this:

@objective(model, Min, sum(el^2 for el in A*x - b))
ERROR: The solver does not support an objective function of type MathOptInterface.ScalarQuadraticFunction{Float64}.
Stacktrace:
...

Now, the ECOS solver is documented as supporting second-order cone programming (including problems with convex quadratic constraints and/or objective). I’m not super experienced with optimization yet, but wouldn’t ScalarQuadraticFunction fall under a quadratic objective?

Thanks

Automatic transformation of quadratic objective to quadratic constraint is being implemented in https://github.com/JuliaOpt/MathOptInterface.jl/pull/789.
Meanwhile, you can do it manually:

@variable(model, t)
@objective(model, Min, t)
@constraint(model, sum(el^2 for el in A*x - b) <= t)

or even closer to ECOS format:

@variable(model, t)
@objective(model, Min, t)
@constraint(model, [t; A * x - b] in SecondOrderCone())
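Putting that second formulation together into a complete, runnable sketch (assuming random A and b as in the original post; on older JuMP versions you may need Model(with_optimizer(ECOS.Optimizer)) instead):

using JuMP
using ECOS

A = rand(10, 10)
b = rand(10)

model = Model(ECOS.Optimizer)
@variable(model, x[1:10])
@variable(model, t)
# t is an upper bound on ||A*x - b||, enforced by the second-order cone constraint
@constraint(model, [t; A * x - b] in SecondOrderCone())
@objective(model, Min, t)
optimize!(model)
value.(x)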

Oh, great to hear!
Thanks for showing me how to do that in the meantime.

Is there an equally simple way to formulate the L1-norm version of this problem? Or does that make it much harder, since it’s no longer in the SecondOrderCone?

Do you have to use JuMP, or can you use Convex.jl? An example using Convex.jl would be the following:

using Convex, SCS

n = 10
A = rand(n,n)
b = rand(n)

x = Variable(n)

problem = minimize(norm(A * x - b,1))
solve!(problem, SCSSolver(verbose=false))

println(problem.status)
println(x.value)

I’ve currently been using Convex to solve it in the way you posted. I started to look at JuMP because I had read somewhere that it might be faster for optimizing in a loop. I’m doing this 1-norm minimization in a loop over about 1 billion pixels (each pixel has an independent inversion problem to solve), and Convex has some memory leak bug that causes RAM to slowly grow until it crashes.

If it’s not true that JuMP can be better for many repeated/possibly parallel optimizations, then I’ll just look into solving the memory bug for Convex.

This is the issue I’m talking about with Convex: https://github.com/JuliaOpt/Convex.jl/issues/83

Just to summarize some of the discussion in that issue: I think in this case there’s something different from that bug going on, since with the workaround Convex.clearmemory(), SCS seems to be fine but ECOS keeps growing in memory. We should do some more testing to isolate whether it’s really in Convex, in the ECOS wrapper, or in ECOS itself.

@chriscoey is adding an automatic reformulation of the L1 norm in https://github.com/JuliaOpt/MathOptInterface.jl/pull/818.
You will then be able to use it as easily as with SecondOrderCone.
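In the meantime, the 1-norm can be reformulated by hand with one bound variable per residual entry (this is the standard linear-programming trick, not the MOI feature from that PR); a sketch, assuming random A and b as in the earlier examples:

using JuMP
using ECOS

n = 10
A = rand(n, n)
b = rand(n)

model = Model(ECOS.Optimizer)
@variable(model, x[1:n])
@variable(model, t[1:n])                  # t[i] bounds |(A*x - b)[i]|
@constraint(model,  (A * x - b) .<= t)
@constraint(model, -(A * x - b) .<= t)
@objective(model, Min, sum(t))            # sum of bounds equals ||A*x - b||_1 at the optimum
optimize!(model)
value.(x)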

Ah, that’s unfortunate that it might be in ECOS. I haven’t gotten around to testing it much further, since for my problem the SCS solver fails to converge pretty consistently, and when it does converge it runs 10-15x longer than the ECOS solver (and I’m solving billions of small-ish problems), so using SCS would make my entire script take too long for now.

Trying to revive this topic, as the new JuMP update removed the support for “norm”.

I have a similar problem using JuMP & ECOS (or SCS) for formulating a simple least-squares problem, i.e., trying to “minimize norm(y - A*x, 2)”. Of course, knowing that ECOS supports only SOCP, I convert the problem into one. Here is the MWE:

using JuMP
using ECOS
n = 6; m = 3
A = rand(m,n)
y = rand(m)

model = Model(ECOS.Optimizer);
@variable(model, z)
@variable(model, x[1:n])
@constraint(model, [z, y-A*x] in SecondOrderCone())
@objective(model, Min, z);
optimize!(model)

and I get the following error while adding the SOC constraint:

In `@constraint(model, [z, y - A * x] in SecondOrderCone())`: unable to add the constraint because we don't recognize Any[z, GenericAffExpr{Float64,VariableRef} (...) as a valid JuMP function.

It seems that JuMP has some issue recognizing the “GenericAffExpr{Float64,VariableRef}”, since a “hard-coded” constraint like “@constraint(model, [z, 1 - 0.1x[1] - 0.2x[2]] in SecondOrderCone())” works fine for me.

Any thoughts?

Beware of wrong conversion to conic form of LASSO: https://github.com/jump-dev/Convex.jl/issues/219

That issue is outdated, and doesn’t seem to be a problem on Convex 0.13

[z, y - A * x] makes a vector of two elements, rather than concatenating the elements.

You want

@constraint(model, [z; y - A * x] in SecondOrderCone())

or

@constraint(model, vcat(z, y - A * x) in SecondOrderCone())
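The difference is easiest to see outside of JuMP; a quick illustration in plain Julia:

v = [1.0, 2.0]
a = 0.5

[a, v]        # 2-element vector holding the number and the nested vector (what the error is about)
[a; v]        # vertical concatenation: [0.5, 1.0, 2.0]
vcat(a, v)    # same as [a; v]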

Thanks a lot. Indeed it works with

@constraint(model, [z; y - A * x] in SecondOrderCone())

I was too fixated on the GenericAffExpr{Float64,VariableRef} in the error message then…