Is it possible that passing a user-defined gradient slows down the solver?

Hi all,

I am using JuMP with Ipopt and have run into the following problem. For example, I have a constraint that looks like x^2 + y^2 <= 1. Originally I just added this constraint to the model directly. But when I define a function f(x, y) = x^2 + y^2, register this function and its gradient with the model (I tried both autodiff and a user-defined gradient), and then run the code, Ipopt actually takes more time and more iterations to converge. I am wondering: is this possible, or am I not coding it the correct way?
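
Roughly, here is what I am doing (a minimal sketch in the JuMP 0.18-style API; the objective and start values are just placeholders for illustration):

```julia
using JuMP, Ipopt

f(x, y) = x^2 + y^2

# Hand-coded gradient: fills g in place with (∂f/∂x, ∂f/∂y).
function ∇f(g, x, y)
    g[1] = 2x
    g[2] = 2y
end

m = Model(solver=IpoptSolver())
@variable(m, x, start = 0.5)
@variable(m, y, start = 0.5)
@NLobjective(m, Min, (x - 1)^2 + (y - 1)^2)  # placeholder objective

# Version 1 (fast): write the constraint directly.
# @NLconstraint(m, x^2 + y^2 <= 1)

# Version 2 (slower for me): register f and use it in the constraint.
JuMP.register(m, :f, 2, f, ∇f)  # or: JuMP.register(m, :f, 2, f, autodiff=true)
@NLconstraint(m, f(x, y) <= 1)

solve(m)
```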

In addition, just to make sure: currently we cannot supply the Hessian of a multivariate function, right?

Thanks in advance.


Since Ipopt is actually taking more iterations (rather than just more time), the gradients you're supplying might be wrong. Ipopt has a built-in derivative checker (enabled via the derivative_test option), which is pretty handy. Specifically, I've used the following in tests:

https://github.com/JuliaRobotics/MotionCaptureJointCalibration.jl/blob/efd5f040ba7a856f9cc96f16aa5a76f3352d228c/test/runtests.jl#L83
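
In case it helps, turning the checker on through Ipopt.jl is just a solver option (a sketch; "first-order" checks gradients and Jacobians, "second-order" also checks Hessians):

```julia
using JuMP, Ipopt

# Ask Ipopt to compare the supplied derivatives against finite differences
# at the starting point and print any entries that disagree.
m = Model(solver=IpoptSolver(derivative_test="first-order"))
```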


Thank you for the fast responses. I just tried it, and the solver prints that no errors were detected by the derivative checker. It looks like with a user-defined function and gradient (i.e., JuMP.register()), Ipopt is not evaluating the Lagrangian Hessian.


That is correct. JuMP does not provide Hessian information to Ipopt if there are non-univariate user-defined functions present. This can, of course, affect the algorithmic performance of the solver.
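
Note that this applies only to multivariate user-defined functions. A univariate registered function does not disable Hessians, since JuMP can differentiate it twice automatically. A sketch, assuming autodiff:

```julia
using JuMP, Ipopt

g(t) = t^2  # univariate

m = Model(solver=IpoptSolver())
@variable(m, x)
@variable(m, y)

# For univariate registered functions, JuMP computes second derivatives
# automatically, so exact Hessians remain available to Ipopt.
JuMP.register(m, :g, 1, g, autodiff=true)
@NLconstraint(m, g(x) + g(y) <= 1)
```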

Thank you for the response. So if I use JuMP.register(), Ipopt will not use second-order information? But if I don't use a user-defined function, i.e., I build the constraint expression directly (for example, in a loop over the variables), then Ipopt will automatically get both the gradient and the Hessian? Am I understanding correctly? Thanks.

If you register a non-univariate user-defined function, then JuMP will report to Ipopt that Hessians are not available to be queried. Ipopt switches to its limited-memory Hessian approximation in this case (see https://github.com/JuliaOpt/Ipopt.jl/blob/8319f091254f5ad2974f8812280fcb9b29df003b/src/IpoptSolverInterface.jl#L38). Otherwise, JuMP provides the derivatives, and by default Ipopt uses both gradients and Hessians.
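
In other words, registering the multivariate f has the same effect as setting Ipopt's hessian_approximation option yourself; to keep exact Hessians, write the constraint natively. A sketch of both for comparison:

```julia
using JuMP, Ipopt

# Native formulation: JuMP supplies exact gradients and Hessians,
# and Ipopt uses its exact-Hessian algorithm by default.
m = Model(solver=IpoptSolver())
@variable(m, x)
@variable(m, y)
@NLconstraint(m, x^2 + y^2 <= 1)

# This forces the same limited-memory (L-BFGS) approximation that a
# registered multivariate function triggers:
m2 = Model(solver=IpoptSolver(hessian_approximation="limited-memory"))
```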
