Potential bug related to initialization of a quadratic program

Hi all,

Consider the code below (a convex, quadratically constrained quadratic program I pulled out of a hat). As it stands, it runs fine. However, if I comment out the for loop (i.e. if I don’t manually set the starting values of x to arbitrary nonzero values), JuMP tells me the solution is infeasible. Does anybody know what’s going on here?

Thanks a lot!

using LinearAlgebra  # needed for qr and diagm
Q, R = qr(randn(53, 53))
J = randn(53)
H = Q*diagm(randn(53).^2)*Q'
using JuMP
using Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, x[1:53])
for i = 1:53
    # Ipopt screws up if this is not the case!
    set_start_value(x[i], 1.0)
end

@objective(model, Min, x'*H*x)
@constraint(model, x'*J == 0.)
@constraint(model, x'*x == 1.)

JuMP.optimize!(model)

What is the output of termination_status(model) in both cases?
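
For reference, a minimal way (a sketch, assuming the model from your snippet) to inspect both statuses after solving:

JuMP.optimize!(model)
@show termination_status(model)  # e.g. LOCALLY_SOLVED or LOCALLY_INFEASIBLE
@show primal_status(model)       # whether the returned point is feasible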

Ipopt is probably converging to a point of local infeasibility. This isn’t the same thing as proving that the problem has no feasible solution, and it is expected behavior.

In general, for nonlinear solvers you should provide a good guess for the starting solution using domain knowledge.
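
For example, for this particular model a feasible start can be constructed by hand (an illustrative sketch, using J and x from the snippet above): project a random vector onto the orthogonal complement of J, then normalise it, so both constraints hold at the initial point.

x0 = randn(53)
x0 -= (dot(x0, J) / dot(J, J)) * J  # enforce x0'*J == 0
x0 /= norm(x0)                      # enforce x0'*x0 == 1
for i in 1:53
    set_start_value(x[i], x0[i])
end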


The gradient of the constraint @constraint(model, x'*x == 1.) is zero at the default starting point (all zeros) and the constraint isn’t satisfied, so this may be triggering Ipopt’s local infeasibility criterion. If that’s the case, Ipopt is working as designed.
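
Concretely, g(x) = x'*x - 1 has gradient 2x, which vanishes at the all-zeros start even though g(0) = -1, so the constraint is violated with no first-order escape direction (a quick sketch to verify):

using LinearAlgebra
g(x) = x' * x - 1.0
∇g(x) = 2 .* x      # gradient of the quadratic constraint
x0 = zeros(53)
@show g(x0)         # -1.0: constraint violated at the default start
@show norm(∇g(x0))  # 0.0: no first-order direction to reduce the violation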

Hi, you are both correct that x .= 0 is infeasible. But so is the manual initialisation x .= 1. Since this is a convex program (the quadratic matrices are positive definite), I would have expected Ipopt to find a feasible starting point on its own. Is this not the case?

Thanks

In answer to the question:

feasible case:

LOCALLY_SOLVED::TerminationStatusCode = 4

infeasible case:

LOCALLY_INFEASIBLE::TerminationStatusCode = 5

The constraint x'*x == 1 is not a convex constraint: its feasible set is the unit sphere, which is not a convex set.

The issue isn’t simply whether the initial point is infeasible; it’s whether the initial point satisfies Ipopt’s conditions for declaring local infeasibility. I don’t know those conditions in detail, but an unsatisfied constraint whose gradient is zero (so there is no first-order direction that reduces the infeasibility) seems likely to trigger them.


Yes, completely correct; I was careless. x'*x <= 1 is the convex constraint. Sorry about that.

In which case it is a nonconvex nonlinear program, and the initial conditions will determine whether a feasible point is found. It makes intuitive sense that you need a nonzero constraint gradient at the starting point to move towards a feasible solution. Everything makes sense now. Thanks a lot to both!

If the constraint is x' * x <= 1, then the global optimum is just x .= 0, right? (H is positive definite, so x'*H*x >= 0 with equality only at x = 0, and x = 0 satisfies both constraints.)

If you do have a convex problem like this, a conic formulation should always find the global optimum. In Convex.jl, e.g., the problem would be formulated as

using LinearAlgebra
Q,R = qr(randn(53,53))
J = randn(53)
H = Q*diagm(randn(53).^2)*Q'
using Convex
using SCS
x = Variable(53)

problem = minimize(quadform(x, Symmetric(H)), [x' * J == 0, norm(x, 2) <= 1])
solve!(problem, SCS.Optimizer)
@show problem.optval
@show evaluate(x)

You can use conic solvers like SCS in JuMP too.
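
For instance, here is one possible conic formulation in JuMP (a sketch under the assumption that H and J come from the first snippet; the epigraph variable t and the Cholesky factor U are introduced here for illustration):

using JuMP, SCS, LinearAlgebra
U = cholesky(Symmetric(H)).U  # H == U' * U since H is positive definite
model = Model(SCS.Optimizer)
@variable(model, x[1:53])
@variable(model, t >= 0)
@constraint(model, dot(x, J) == 0)
@constraint(model, [1; x] in SecondOrderCone())                  # ||x|| <= 1
@constraint(model, [t; 0.5; U * x] in RotatedSecondOrderCone())  # t >= x'*H*x
@objective(model, Min, t)
optimize!(model)
@show objective_value(model)

The rotated second-order cone constraint (t, 1/2, U*x) encodes 2 * t * (1/2) >= ||U*x||^2, i.e. t >= x'*H*x, so minimising t minimises the original objective.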


Yes, that’s correct: the convex constraint x'*x <= 1 would make the optimisation pointless. Thanks; I haven’t looked at Convex.jl, but I’ll have a browse!
