For nonlinear optimization, JuMP allows users to create their own functions and either use autodiff or specify the derivatives manually:
https://jump.dev/JuMP.jl/v0.21.3/nlp/#User-defined-Functions-1
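For context, here is roughly what I mean (a minimal sketch adapted from the linked docs; the function name and the choice of Ipopt are mine):

```julia
using JuMP, Ipopt

# User-defined function registered with autodiff = true, so JuMP
# computes its gradient automatically (via ForwardDiff, per the docs).
my_f(x, y) = (x - 1)^2 + (y - 2)^2

model = Model(Ipopt.Optimizer)
register(model, :my_f, 2, my_f, autodiff = true)

@variable(model, x)
@variable(model, y)
@NLobjective(model, Min, my_f(x, y))

optimize!(model)
```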
However, the documentation makes the following statements:
" […] the fourth argument is the name of the Julia method which computes the function, and autodiff=true
instructs JuMP to compute exact gradients automatically" (no mention to hessian)
"users may want to provide their own routines for evaluating gradients […] (but) second-order derivatives of multivariate functions are not currently supported "
Both quotes seem to imply that JuMP does not use autodiff to compute the Hessian. But that seems unlikely to me, because solvers like Ipopt could benefit from exact Hessian information computed by ForwardDiff instead of relying only on a quasi-Newton (BFGS-type) approximation.
Also, if only the gradient is used and not the Hessian, it would seem more logical to use ReverseDiff instead of ForwardDiff, since reverse mode can be orders of magnitude more efficient for gradients in high dimensions.
Does anyone know the answer? Does JuMP use the Hessian for user-defined functions or not?