Hello,
Hello,
The new `MOI.VectorNonlinearOracle` supports providing the Hessian of the Lagrangian function, which, according to the MOI.jl documentation, is defined as:

$$\sum_i \mu_i \nabla^2 f_i(\mathbf{x})$$
but the Lagrangian typically also includes the objective function:

$$\mathcal{L}(\mathbf{x}, \boldsymbol{\mu}) = J(\mathbf{x}) + \boldsymbol{\mu} \cdot \mathbf{f}(\mathbf{x})$$
where $\cdot$ is the dot product, $J(\mathbf{x})$ is the objective function, and the nonlinear equality constraints are defined as $\mathbf{f}(\mathbf{x}) = \mathbf{0}$.
- My understanding is that MOI will compute $\nabla^2 J(\mathbf{x}) + \sum_i \mu_i \nabla^2 f_i(\mathbf{x})$ internally before passing it to the solver (if the Hessian of the objective function is also provided). Am I right?
- `MOI.VectorNonlinearOracle` also requires functions to evaluate the equality constraints $\mathbf{f}(\mathbf{x})$ and their Jacobian $\nabla \mathbf{f}(\mathbf{x})$. When computing the Jacobian with e.g. DifferentiationInterface.jl and ForwardDiff.jl, both can be computed efficiently with `value_and_jacobian`. Now, if I compute the $\sum_i \mu_i \nabla^2 f_i(\mathbf{x})$ matrix with `value_gradient_and_hessian`, can I reuse the intermediate results (the value and the gradient) to construct the $\mathbf{f}(\mathbf{x})$ vector and the $\nabla \mathbf{f}(\mathbf{x})$ Jacobian? My feeling right now is no, so I should just compute $\sum_i \mu_i \nabla^2 f_i(\mathbf{x})$ with `hessian` directly, since the intermediate results are useless here.
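To make my reasoning explicit (assuming I differentiate the scalar function $g(\mathbf{x}) = \boldsymbol{\mu} \cdot \mathbf{f}(\mathbf{x})$ to obtain the weighted Hessian $\nabla^2 g(\mathbf{x}) = \sum_i \mu_i \nabla^2 f_i(\mathbf{x})$), the intermediate results returned by `value_gradient_and_hessian` would be:

$$g(\mathbf{x}) = \boldsymbol{\mu} \cdot \mathbf{f}(\mathbf{x}), \qquad \nabla g(\mathbf{x}) = \nabla \mathbf{f}(\mathbf{x})^\top \boldsymbol{\mu}$$

that is, a scalar and a vector in which the components of $\mathbf{f}(\mathbf{x})$ and $\nabla \mathbf{f}(\mathbf{x})$ are already contracted against $\boldsymbol{\mu}$, so they cannot be recovered from these intermediates alone.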
Thanks!