Debugging in Ipopt without JuMP

Hi all,
I am trying to solve a constrained optimization problem with Ipopt, without using JuMP, because I want to see how the performance changes when I supply the gradient and Hessian information myself.
I am referring to the C wrapper example in Ipopt.jl.

Actually, when I use JuMP + Ipopt, the problem is not solved correctly, and I found that the number of nonzero elements in the Lagrangian Hessian is much larger than the number theoretically derived.
This is another reason why I don’t want to use JuMP + Ipopt.

The unfortunate thing is that Ipopt stops due to a segfault.
My main question is how I can track down the source of the segfault.
The REPL just reports that a segfault happened.
What is the proper way to debug when the package wraps code written in another language?
If you also know of another good solver that, like Ipopt, accepts user-supplied gradient and Hessian information without JuMP, I would really appreciate a pointer.

Thanks.

Welcome. You may be interested in this past post:

I’m moving your post to the more relevant Optimization category where you might get more expert attention.

If you can share a reproducible example, this would be of interest to JuMP developers to potentially identify an issue.

The unfortunate thing is that Ipopt stops due to a segfault.

Did you follow the warnings here?

It’s hard to say more without a reproducible example.

the problem is not solved correctly

What happened? It found a suboptimal solution? Or it converged to an infeasible point?

I found that the number of nonzero elements in the Lagrangian Hessian is much larger than the number theoretically derived.

Yes. JuMP computes the Hessian in a way that produces repeated entries in the sparsity pattern, but this is not a performance problem; JuMP still computes the true sparse Hessian.
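A minimal illustration of why repeated entries in the pattern are harmless: in triplet (COO) format, duplicate (row, col) entries are simply summed when the sparse matrix is assembled, which is also how Ipopt's triplet input is commonly documented to behave. This sketch uses only Julia's standard SparseArrays, not JuMP or Ipopt:

```julia
using SparseArrays

# Two triplet entries land on the same (1, 1) position, as a
# JuMP-style Hessian sparsity pattern with repeats might produce.
rows = [1, 1, 2]
cols = [1, 1, 2]
vals = [2.0, 3.0, 4.0]

# sparse() sums duplicate entries by default, so the repeated
# pattern still yields the true sparse matrix.
H = sparse(rows, cols, vals)

H[1, 1]  # 2.0 + 3.0 = 5.0
nnz(H)   # 2 structural nonzeros, not 3
```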

Hi @jd-foster and @odow,
Thank you for your replies.

I understand that a reproducible example would help our conversation. But since I am trying to solve a complicated problem, it will take time to distill it into a simple version that I can post here.

On the other hand, I would like to ask about the warnings pointed out by @odow.

I read the warning and checked the C wrapper example in the test folder before.
My understanding of the warning is as follows.
At the start of the optimization, the callbacks are called with values == nothing so that we can tell Ipopt which elements of the Jacobian and Hessian are nonzero, and in that phase the example never assigns to values.
Therefore, as long as we do not try to access the elements of x inside a values == nothing branch (say, starting from line 88 of the example), the situation the warning describes does not occur.
And I do not access the elements of x in any branch guarded by values == nothing.
Thus I think the segfault does not come from this.
Do I misunderstand the warning? Or is there another situation in which the warning applies?

By the way, when I use JuMP + Ipopt, the optimization stops because it converges to an infeasible point.

I understand that a reproducible example would help our conversation

If you are manually constructing the callbacks, there are a lot of things that can go wrong. It’s hard to say more without code. A segfault usually means that you did something wrong in the callback. Did you compute the correct number of non-zeros? Does the structure of your non-zeros remain constant over the iterations? Did you fill them in in the correct order?
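One cheap sanity check for the first question, a count mismatch between the declared number of Hessian nonzeros and the actual structure is a classic cause of out-of-bounds writes in the callback. This helper is purely illustrative (not part of Ipopt.jl) and compares a dense Hessian's lower-triangle nonzero count against what you would pass as the nonzero count:

```julia
# Count structural nonzeros in the lower triangle of a dense Hessian,
# to cross-check the count declared to the solver.
function lower_triangle_nnz(H::AbstractMatrix)
    n = size(H, 1)
    return count(!iszero, H[i, j] for j in 1:n for i in j:n)
end

H = [2.0  1.0  0.0;
     1.0  2.0  0.0;
     0.0  0.0  4.0]

lower_triangle_nnz(H)  # 4 entries: (1,1), (2,1), (2,2), (3,3)
```

Note that if an entry can ever become nonzero at some iterate, it must be declared in the structure from the start; the pattern cannot change between iterations.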

By the way, when I use JuMP + Ipopt, the optimization stops because it converges to an infeasible point.

If you know the problem has a feasible solution, that usually means your starting point was far from feasibility. Try passing a feasible starting point. You can use set_start_value(x, value) in JuMP to set the starting point.

Hi @odow,

Actually, I rechecked my code based on your advice and found a mistake.
Now Ipopt works without JuMP.

Thank you for your replies!
