Which optimization package to use for constrained least squares?

I want to minimize (A*x - b)^2 subject to x ∈ [lower, upper] and sum(x) <= 1.

With Optim.jl, I can easily solve the box-constrained version of this problem, but I don’t see a way to add the constraint on sum(x). Should I be using something like JuMP.jl or Convex.jl instead?

Good question. Following, because I’m also interested in constrained optimization: I’m looking for argmax f(x), where f(x) is a computed function (not something with a simple symbolic form; think of it as some kind of weighted average of data), subject to sum(abs.(x)) == C.

So far, I’ve been doing this with NLopt and the COBYLA algorithm, which works, but it only finds a local optimum, so I restart it multiple times from different starting points. It’s a bit cumbersome. I’d like to know what other options there are.
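For reference, the restart loop you describe can be sketched with NLopt.jl roughly as below. The objective `f` here is a stand-in (your real objective is a black-box computation), `C = 1.0` and the search box `[-1, 1]^n` are illustrative assumptions:

```julia
using NLopt, Random

const C = 1.0                        # assumed constraint level
f(x) = -sum((x .- 0.3) .^ 2)         # stand-in for the real black-box objective

function solve_once(x0)
    opt = Opt(:LN_COBYLA, length(x0))
    opt.xtol_rel = 1e-8
    opt.max_objective = (x, grad) -> f(x)
    # enforce sum(abs.(x)) == C to within a small tolerance
    equality_constraint!(opt, (x, grad) -> sum(abs.(x)) - C, 1e-8)
    optimize(opt, x0)                # returns (fval, xsol, return_code)
end

function multistart(n, trials)
    best_f, best_x = -Inf, zeros(n)
    for _ in 1:trials
        x0 = 2 .* rand(n) .- 1       # random start in [-1, 1]^n
        fval, xsol, _ = solve_once(x0)
        if fval > best_f
            best_f, best_x = fval, xsol
        end
    end
    return best_f, best_x
end

best_f, best_x = multistart(5, 20)
```

This is just the multistart pattern made explicit; it doesn’t remove the local-optimum issue, only amortizes it over random restarts.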

You can use JuMP and Ipopt for this:

using JuMP, Ipopt

A = randn(10, 5); b = randn(10)              # example data
lower = fill(-1.0, 5); upper = fill(2.0, 5)  # example bounds

model = Model(Ipopt.Optimizer)
@variable(model, lower[i] <= x[i = 1:size(A, 2)] <= upper[i])
@constraint(model, sum(x) <= 1)
@objective(model, Min, sum((A * x - b) .^ 2))
optimize!(model)
value.(x)

Since the problem is convex, you can also use the Convex.jl package:

using Convex, SCS

A = randn(10,5)
b = randn(10)
x = Variable(5)
add_constraint!(x, sum(x) ≤ 1)
add_constraint!(x, x ≤ +2) # upper bound
add_constraint!(x, x ≥ -1) # lower bound
problem = minimize(sumsquares(A*x-b))
#problem = minimize(sumsquares(A*x-b+A*fill(2,5)))
solve!(problem, SCS.Optimizer)

To test that the constraints are indeed satisfied, I shifted the solution by 2, which is why you see the commented-out line with A * [2, 2, ..., 2].
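Instead of shifting the solution, you can also check feasibility directly by evaluating the solved variable and testing it against the constraints. A small sketch (the tolerance `1e-4` is illustrative, since SCS is a first-order solver with moderate default accuracy):

```julia
using Convex, SCS

A = randn(10, 5); b = randn(10)
x = Variable(5)
constraints = [sum(x) ≤ 1, x ≤ 2, x ≥ -1]
problem = minimize(sumsquares(A * x - b), constraints)
solve!(problem, SCS.Optimizer)

xv = evaluate(x)                       # numeric value of x at the solution
@assert sum(xv) ≤ 1 + 1e-4             # sum constraint holds (up to solver tolerance)
@assert all(-1 - 1e-4 .≤ xv .≤ 2 + 1e-4)  # box constraints hold
```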