JuMP: unused `@operator` has critical impact

I have a model defined like

```julia
fake_fun(a, b) = 0

function my_model()
    ...
    @operator(m, unused_op, 2, fake_fun)
    ...
end
```

where both `unused_op` and `fake_fun` are not used anywhere except on that line. With this `@operator` line, Ipopt swiftly optimizes my (InfiniteOpt) models (in under 5 seconds). However, if I change it to

```julia
fake_fun(a) = 0

function my_model()
    ...
    @operator(m, unused_op, 1, fake_fun)
    ...
end
```

or if I take that line out altogether, then the process becomes unresponsive (unkillable) and I see this message:

```
MUMPS returned INFO(1) = -9 and requires more memory, reallocating.  Attempt 1
  Increasing icntl[13] from 1000 to 2000.
```

Furthermore, I have noticed that the number of nonzeros in the Lagrangian Hessian goes from 0 to:

```
Number of nonzeros in equality constraint Jacobian...:     7842
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:    11625
```

when I take out the `@operator` line.

Is this a bug that deserves an MWE, or is it a case of “Well, Ipopt has a number of heuristics, and one of them changes when an `@operator` is present”?

> However, if I change it to

I think you might be missing something. They look the same to my eyes. I’m guessing you meant `#@operator`.

It’s this one. Except it isn’t really the fault of Ipopt.

The issue is in MathOptInterface’s sparse reverse-mode automatic differentiation backend: if you have a multivariate user-defined operator and you do not provide an explicit oracle for the Hessian, then MOI disables the Hessian of the Lagrangian.
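For reference, you can keep exact Hessians with a multivariate operator by supplying the gradient and Hessian oracles yourself. A minimal sketch (the operator `op_f` and the quadratic `f` here are made up for illustration; they are not from the thread):

```julia
using JuMP, Ipopt

f(x, y) = x^2 + y^2

# In-place gradient oracle: fills g with (∂f/∂x, ∂f/∂y).
function ∇f(g::AbstractVector, x, y)
    g[1] = 2x
    g[2] = 2y
    return
end

# In-place Hessian oracle: JuMP reads only the lower triangle of H.
function ∇²f(H::AbstractMatrix, x, y)
    H[1, 1] = 2.0
    H[2, 2] = 2.0
    return
end

model = Model(Ipopt.Optimizer)
@variable(model, x[1:2])
# With all three oracles provided, MOI does not disable the
# Hessian of the Lagrangian for this multivariate operator.
@operator(model, op_f, 2, f, ∇f, ∇²f)
@objective(model, Min, op_f(x[1], x[2]))
optimize!(model)
```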

In Ipopt, this means it sets the `hessian_approximation = "limited-memory"` option.

So if you call `set_attribute(model, "hessian_approximation", "limited-memory")`, then you should be able to remove the `@operator` line and it will still be fast.
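Concretely, the workaround looks like this (a minimal sketch; the actual model construction is assumed, and for an InfiniteOpt model you would set the attribute the same way):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# Opt in to the quasi-Newton (L-BFGS) Hessian approximation explicitly,
# instead of relying on an unused multivariate @operator to trigger it
# as a side effect:
set_attribute(model, "hessian_approximation", "limited-memory")
```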


Thank you, that’s what I suspected. It worked!
