Is the Hessian used in nonlinear JuMP?

For nonlinear optimization, JuMP allows users to create their own functions and either use autodiff or specify the derivatives manually:
https://jump.dev/JuMP.jl/v0.21.3/nlp/#User-defined-Functions-1
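
For context, here is a minimal sketch of that workflow as I understand it from the linked docs (the function name my_square and the use of Ipopt are just for illustration):

```julia
using JuMP, Ipopt

my_square(x) = x^2          # a plain Julia function

model = Model(Ipopt.Optimizer)
@variable(model, x >= 1)

# Register the function so it can appear in @NL* expressions;
# autodiff=true tells JuMP to differentiate it automatically.
register(model, :my_square, 1, my_square, autodiff = true)

@NLobjective(model, Min, my_square(x))
optimize!(model)
```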

However the documentation makes the following statements:

" […] the fourth argument is the name of the Julia method which computes the function, and autodiff=true instructs JuMP to compute exact gradients automatically" (no mention to hessian)

"users may want to provide their own routines for evaluating gradients […] (but) second-order derivatives of multivariate functions are not currently supported "

Both statements seem to imply that JuMP does not use autodiff to compute the Hessian. But that seems unlikely, because solvers like Ipopt would benefit from exact Hessian information computed by ForwardDiff instead of relying only on a BFGS (quasi-Newton) approximation.

Also, if only the gradient is used, and not the Hessian, it would seem more logical to use ReverseDiff instead of ForwardDiff, as the former would be orders of magnitude more efficient in high dimensions.

Does anyone know the answer? Does JuMP use the Hessian for user-defined functions or not?


JuMP does not use the Hessian for user-defined functions.

JuMP will compute the Hessian for algebraic constraints.

Therefore, where possible, write out the constraint algebraically, like @NLconstraint(model, x^2 + sin(x) >= 1), instead of defining f(x) = x^2 + sin(x) and writing @NLconstraint(model, f(x) >= 1).
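
For example (a minimal sketch; f, x, and the choice of Ipopt are only illustrative):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x)

# Preferred: an algebraic constraint, for which JuMP can compute exact Hessians.
@NLconstraint(model, x^2 + sin(x) >= 1)

# Avoid when possible: the same constraint via a user-defined function,
# for which only gradients (no Hessians) are available.
# f(x) = x^2 + sin(x)
# register(model, :f, 1, f, autodiff = true)
# @NLconstraint(model, f(x) >= 1)
```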

Fixing this is on our list of things to do…

Thanks for the reply @odow.

Just to make it a bit more clear, if I have a problem of the type:
@NLobjective(model, Min, J)

@NLconstraint(model, J==my_NL_fun_with_autodiff(x))

will that use the Hessian? (Sorry to insist, but I know that for my specific problem quasi-Newton does not work well enough.)
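
For concreteness, here is roughly the formulation I have in mind (the body of my_NL_fun_with_autodiff is just a placeholder, and I assume it is registered with autodiff=true):

```julia
using JuMP, Ipopt

# Placeholder body standing in for my actual nonlinear function.
my_NL_fun_with_autodiff(x) = x^2 + sin(x)

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, J)

register(model, :my_NL_fun_with_autodiff, 1, my_NL_fun_with_autodiff, autodiff = true)

@NLobjective(model, Min, J)
@NLconstraint(model, J == my_NL_fun_with_autodiff(x))
```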

Actually, this has apparently already been marked as a feature request

https://github.com/jump-dev/JuMP.jl/issues/1198
