When using Ipopt through JuMP, I find myself unable to set hessian_approximation="exact" (i.e. to tell Ipopt that I will give it an exact Hessian). When I try, I get an error to the effect of "Exception message: eval_h is called but has not been implemented." This suggests that JuMP is not passing the Hessian when asked. Am I doing something wrong, or am I just mistaken in my understanding that JuMP uses ReverseDiff to compute exact Hessians?

The only case in which JuMP does not pass Hessian matrices by default is if you use multivariate user-defined functions, where Hessians are not supported at all. There’s no need to set the hessian_approximation parameter either way. You can confirm that Ipopt used Hessians by looking at the summary it prints at the end of a solve.
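To illustrate, here is a minimal sketch of setting and checking the option. The API names assume a recent JuMP/Ipopt.jl (older versions used `IpoptSolver(hessian_approximation = "exact")` when constructing the model); the tiny objective is purely illustrative.

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# "exact" is Ipopt's default, so setting it explicitly is normally unnecessary.
set_optimizer_attribute(model, "hessian_approximation", "exact")

@variable(model, x >= 0)
@variable(model, y >= 0)
@NLobjective(model, Min, (x - 1)^2 + (y - 2)^2)

optimize!(model)
# Ipopt's end-of-solve summary reports the number of Lagrangian Hessian
# evaluations; a nonzero count confirms exact Hessians were supplied.
```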

Thanks for your response. Could you give me an example of such a problematic multivariate user-defined function? Does this mean multivariate input or output? If it's the former, I thought that automatic differentiation could compute gradients and Hessians for user-defined functions (that are Julia code). Also, how can one have a Hessian without multivariate input? I suspect I am just being dense and not understanding what you mean.

Also, to clarify: I have confirmed that Ipopt is using a Hessian approximation, which is not Ipopt's default setting, which further suggests that JuMP is not passing a Hessian. (If I force Ipopt to ask JuMP for an exact Hessian, I get the error described above.)

That makes sense. It sounds like for all cases except univariate functions or simple multivariate expressions that can basically be written inline, JuMP does not support exact Hessians. If I am understanding correctly, even the following would fail to compute/pass an exact Hessian:
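Something along these lines (a hypothetical sketch of a registered multivariate user-defined function, since the original example is not reproduced here; the function `f` is illustrative):

```julia
using JuMP, Ipopt

# A multivariate user-defined function: JuMP can AD its gradient,
# but (per the discussion above) not its Hessian.
f(x, y) = exp(x) * sin(y)

model = Model(Ipopt.Optimizer)
register(model, :f, 2, f; autodiff = true)  # 2 = number of arguments

@variable(model, x)
@variable(model, y)
@NLobjective(model, Min, f(x, y))
```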

While I'm sure you're right, I looked through the documentation pretty carefully (including the section on user-defined functions, from which I borrowed the above example) and cannot find any mention suggesting that Hessians are not supported for multivariate user-defined functions. Did I miss something, or is this just obvious/common knowledge?

Second-order derivatives of multivariate functions are not currently supported;

These are the cases that JuMP was originally designed to address. User-defined functions were an add-on on top of that. If there isn’t much of a closed-form component to your problem, you may decide instead to talk to Ipopt directly and use AD tools to compute derivatives. The situation in JuMP could change in the long term, but it’s not a high priority.
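For the "AD tools to compute derivatives" route, a minimal sketch of producing the quantities Ipopt's callback interface needs, using ForwardDiff (ReverseDiff is analogous; the Rosenbrock-style objective is just an example):

```julia
using ForwardDiff

# Example objective; replace with your own function of a vector argument.
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2

x0 = [0.5, 0.5]
g = ForwardDiff.gradient(f, x0)   # what Ipopt's eval_grad_f callback returns
H = ForwardDiff.hessian(f, x0)    # what eval_h needs (note: this is dense;
                                  # Ipopt expects the sparse lower triangle)
```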

I'll keep my eye out for updates to JuMP and will look into talking directly with Ipopt or Knitro, using ReverseDiff to compute the Hessian, but I suspect that my own implementation will likely underperform JuMP's (even sans Hessian), since I suspect JuMP does a lot of work to carefully minimize redundant allocations/computations.

I am not able to install Ipopt; the build fails with errors. May I know how you installed it? I raised this question a while back but got no response.
Thanks.

Are you certain that you have installed all of the required dependencies outside of Julia? IPOPT has a guide for installing IPOPT on Unix. Pay particular attention to the required system packages and possibly the external linear solvers. My best guess is that you're missing some of the required system packages, or that your IPOPT installation isn't correctly pointing to a linear solver.
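For reference, on Debian/Ubuntu the prerequisites look something like the following (package names vary by distribution; treat this as a sketch and check IPOPT's installation guide for the authoritative list):

```shell
# Compilers, build tools, and a BLAS/LAPACK implementation that
# IPOPT's build scripts expect to find on the system.
sudo apt-get install gcc g++ gfortran patch wget pkg-config liblapack-dev
```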

Doesn’t the Pkg.add("Ipopt") install the Ipopt solver with all dependencies?

I had a peek at files like get.Blas, get.ASL, etc. These scripts download the sources and then build them.
I downloaded the required source packages and kept them in the respective folders.

So I think that the command Pkg.add("Ipopt") should be able to install the package, and consequently Pkg.build("Ipopt") should also be able to build it.

I will look into the other required dependencies, though.

If I remember correctly, I also had to install the packages in the "required system packages" link before Pkg.add("Ipopt") would work correctly for me.