Optimization.jl constraints violated...why?

I am trying to implement model predictive control using Optimization.jl with the Optim backend. It seems like it should be straightforward: a quadratic program with some linear constraints. But the constraints just aren't enforced, and it happens with the Newton(), BFGS(), and LBFGS() solvers. I don't know why. The output says the solution was successful.

My cons(u, p) function returns a vector of constraint values, I pass in lcons = 0 and ucons = 0 (elementwise) to the OptimizationProblem, and after solving, cons(sol.u, p) is not all zeros like it's supposed to be.

I know many people will recommend using JuMP instead. I’m open to that. I’d still like to know why Optimization.jl is failing me, though.

Here is code that gets at the gist of what I'm saying; the full code is here.

using Optimization, OptimizationOptimJL

optf = OptimizationFunction(J, Optimization.AutoForwardDiff(), cons = cons)
...
lcons = ...  # all zeros
ucons = ...  # all zeros except for upper bounds on a couple variables
prob = OptimizationProblem(optf, u0, p, lcons = lcons, ucons = ucons)
sol = solve(prob, Newton())
cons(sol.u, p)  # DOESN'T SATISFY CONSTRAINTS

Hey, all the optimizers you mentioned are unconstrained optimizers. You'd have to use IPNewton from Optim, or one of the constrained optimizers available through MOI, for the constraints to be respected. We'll soon have better checks for things like this, so that an error is thrown when you attempt it!
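
For example, a minimal sketch of the fix, reusing the optf, u0, p, lcons, and ucons names from your snippet (assumed from that post, not tested against your full code):

using Optimization, OptimizationOptimJL  # OptimizationOptimJL re-exports IPNewton from Optim.jl

prob = OptimizationProblem(optf, u0, p, lcons = lcons, ucons = ucons)
sol = solve(prob, IPNewton())  # interior-point Newton actually enforces lcons/ucons
cons(sol.u, p)                 # should now satisfy the constraints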


Awesome! The consequences of trying to do optimization without a good formal foundation…


Hmm, I'm still stuck… it appears a vector of constraints isn't working as expected. In my example, I keep getting a warning that my initial guess isn't an interior point, even though I double-checked and, as far as I can tell, it is.

So I went to the example from the docs, and there I'm having the opposite problem. In the cons function I add an unsatisfiable constraint: the second element of the vector is the constant 1847, which clearly is not 0 as required by lcons/ucons. Yet I get no warning that the initial guess is infeasible, and the solver runs anyway. What am I missing?

using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
function cons(res, x, p)
    # second entry is the constant 1847, which can never satisfy 0 <= c2 <= 0
    res .= [x[1]^2 + x[2]^2, 1847]
end
x0 = zeros(2)
p = [1.0, 100.0]
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), cons = cons)
lcons = [-5.0, 0.0]; ucons = [10.0, 0.0]
prob = Optimization.OptimizationProblem(optf, x0, p, lcons = lcons, ucons = ucons)
sol = solve(prob, IPNewton())

After some messing with lcons and ucons, it looks like it does see the problem and complain, as long as I make it an inequality constraint, e.g., 0 < cons[2] < 1. Does this mean equality constraints are just totally ignored?
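
For reference, this is the tweak I mean; widening the bound on the second constraint to an inequality (everything else unchanged from the snippet above) does make IPNewton complain about the infeasible start:

# same optf, x0, p as above; only the bounds on the second constraint change
prob2 = Optimization.OptimizationProblem(optf, x0, p, lcons = [-5.0, 0.0], ucons = [10.0, 1.0])
sol2 = solve(prob2, IPNewton())  # now warns that the initial guess is not an interior point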

As far as I can tell, it fails when there’s a mix of equality and inequality constraints.

@kjohnsen from a quick glance it looks like the problem you would like to solve would be a good fit for the JuMP framework. Have you given that a try?
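
As a minimal sketch of what a small QP with linear constraints looks like in JuMP (the variables, objective, bounds, and choice of Ipopt here are illustrative assumptions, not your actual MPC model):

using JuMP, Ipopt  # any QP-capable solver works; Ipopt is just an example

model = Model(Ipopt.Optimizer)
@variable(model, u[1:2])                 # hypothetical decision variables
@objective(model, Min, sum(u .^ 2))      # quadratic objective
@constraint(model, u[1] + u[2] == 1.0)   # linear equality constraint
@constraint(model, 0.0 <= u[1] <= 0.8)   # linear inequality constraint
optimize!(model)
value.(u)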


Not yet… I realize it's more mature and better documented, so I bet it would work well. I was hoping Optimization/Optim would do the trick, since the pure-Julia approach is attractive.

In my experience, Optim can be suitable for small problems, but I have had difficulty getting it to work on large constrained optimization problems. There is some discussion of my experience here: AC Optimal Power Flow in Various Nonlinear Optimization Frameworks

There are a few pure-Julia MOI-compatible solvers out there, but there is a good reason that JuMP primarily calls a wide variety of non-Julia solvers: they distill decades of R&D in optimization algorithms, and it is not easy to get similar performance rolling your own.


Yeah, it looks like the interior-point check is limited to inequality constraints, so that's why this doesn't error.

As far as I can tell, it fails when there’s a mix of equality and inequality constraints.

That's partially true: if you set a satisfiable equality constraint, it will be followed, but an infeasible one like this is not enforced. See the examples here: http://optimization.sciml.ai/stable/tutorials/constraints/
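
For instance, a sketch along the lines of the linked tutorial (the specific constraint values are my reading of that page; treat them as illustrative), where both constraint entries depend on x and the equalities are therefore satisfiable and enforced:

using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
function cons_sat(res, x, p)
    res .= [x[1]^2 + x[2]^2, x[1] * x[2]]  # both entries depend on x
end
x0 = zeros(2)
p = [1.0, 100.0]
optf_sat = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), cons = cons_sat)
prob_sat = OptimizationProblem(optf_sat, x0, p, lcons = [1.0, 0.5], ucons = [1.0, 0.5])
sol_sat = solve(prob_sat, IPNewton())  # enforces x1^2 + x2^2 = 1 and x1*x2 = 0.5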


Yes, Optim.jl seems to have some issues here. I plan to move more of the Optimization.jl tutorials away from Optim.jl whenever I have time, for this reason. NLopt, Ipopt, etc. are generally more stable, and we should be pushing people in that direction instead.
