Hi, I am faced with a troublesome optimization problem. As shown below, the nonlinear optimization problem is easy to model and solve with Ipopt through JuMP.jl. However, the objective function and constraints are time-consuming to evaluate, even with the automatic differentiation built into JuMP.jl.
However, the Jacobian and Hessian of the constraints are very sparse. Someone has provided the analytical Jacobian and Hessian of the constraints, along with the analytical gradient of the objective function, to Ipopt with MATLAB's OPTI toolbox as the interface, and the speed is super fast.
Now that I know the analytical form of the objective gradient and the Jacobian and Hessian of the constraints, I wonder how I can provide those analytical expressions to the Ipopt solver through JuMP.jl to speed things up. I didn't find any useful JuMP.jl docs on this topic.
I guess MathOptInterface.jl could do it, but it is low-level and not easy to use. Besides, the lack of examples stopped me.
You can find an example in the tutorial section of the manual (User-defined Hessians). There is also some information in the Nonlinear Modeling part of the manual; search for "hessian" on the page. However, there is a caveat:
Finally, the matrix is treated as dense, so the performance will be poor on functions with high-dimensional input.
If you just have some Julia functions f, g, and s that you want to minimize, then JuMP is not the best tool for the job. See Should you use JuMP? · JuMP
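For reference, here is roughly what that tutorial's pattern looks like, using the Rosenbrock function as a stand-in for an expensive objective (a sketch against the legacy `register`/`@NLobjective` interface; newer JuMP versions use `@operator` instead):

```julia
using JuMP, Ipopt

rosenbrock(x...) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# Gradient callback: fills `g` in place.
function ∇rosenbrock(g::AbstractVector, x...)
    g[1] = 400 * x[1]^3 - 400 * x[1] * x[2] + 2 * x[1] - 2
    g[2] = 200 * (x[2] - x[1]^2)
    return
end

# Hessian callback: fill only the lower triangle.
# Note the caveat quoted above: JuMP treats `H` as dense.
function ∇²rosenbrock(H::AbstractMatrix, x...)
    H[1, 1] = 1200 * x[1]^2 - 400 * x[2] + 2
    H[2, 1] = -400 * x[1]
    H[2, 2] = 200.0
    return
end

model = Model(Ipopt.Optimizer)
@variable(model, x[1:2])
register(model, :rosenbrock, 2, rosenbrock, ∇rosenbrock, ∇²rosenbrock)
@NLobjective(model, Min, rosenbrock(x[1], x[2]))
optimize!(model)
```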
In the Julia ecosystem, BLPDemand.jl has implemented the estimation algorithm, but without providing the Jacobian and Hessian of the constraints, so the speed might be limited.
I am thinking about translating these MATLAB codes into Julia. The key takeaway is to feed the sparse structure and values of the Jacobian and Hessian to Ipopt.
So far, I guess NLPModelsIpopt.jl could be a good tool, thanks to @abelsiqueira. Here is the link to the docs: Tutorial · NLPModelsIpopt.jl. The C API of Ipopt.jl might work as well.
I would appreciate any other advice on this.
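For instance, here is a minimal sketch of how one might hand Ipopt a sparse Jacobian and Hessian through NLPModels.jl. The toy problem and all names below are my own illustration, not code from the docs, and signatures may vary between NLPModels versions:

```julia
using NLPModels, NLPModelsIpopt

# Toy problem: min (x1 - 1)^2 + (x2 - 2)^2  s.t.  x1^2 + x2^2 = 1
struct MyNLP <: AbstractNLPModel{Float64, Vector{Float64}}
    meta::NLPModelMeta{Float64, Vector{Float64}}
    counters::Counters
end

MyNLP() = MyNLP(
    NLPModelMeta(2; ncon = 1, x0 = [0.5, 0.5],
                 lcon = [1.0], ucon = [1.0],  # equality constraint
                 nnzj = 2, nnzh = 2),         # number of sparse nonzeros
    Counters(),
)

NLPModels.obj(nlp::MyNLP, x) = (x[1] - 1)^2 + (x[2] - 2)^2

function NLPModels.grad!(nlp::MyNLP, x, g)
    g[1] = 2 * (x[1] - 1)
    g[2] = 2 * (x[2] - 2)
    return g
end

function NLPModels.cons!(nlp::MyNLP, x, c)
    c[1] = x[1]^2 + x[2]^2
    return c
end

# Sparsity pattern of the constraint Jacobian: (row, col) of each nonzero.
function NLPModels.jac_structure!(nlp::MyNLP, rows, cols)
    rows .= [1, 1]; cols .= [1, 2]
    return rows, cols
end

function NLPModels.jac_coord!(nlp::MyNLP, x, vals)
    vals[1] = 2 * x[1]
    vals[2] = 2 * x[2]
    return vals
end

# Lower triangle of the Lagrangian Hessian ∇²f + y[1] ∇²c (diagonal here).
function NLPModels.hess_structure!(nlp::MyNLP, rows, cols)
    rows .= [1, 2]; cols .= [1, 2]
    return rows, cols
end

function NLPModels.hess_coord!(nlp::MyNLP, x, y, vals; obj_weight = 1.0)
    vals[1] = 2 * obj_weight + 2 * y[1]
    vals[2] = 2 * obj_weight + 2 * y[1]
    return vals
end

stats = ipopt(MyNLP())
```

Only the declared nonzeros are ever evaluated or passed to Ipopt, which is exactly the "feed the sparse structure and values" workflow from the MATLAB code.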
but without providing the Jacobian and Hessian of the constraints
Note that JuMP automatically computes this information. You don’t need to provide it analytically. Using the low-level API might be slightly faster, but it’s easy to make a mistake, and it takes longer to code. If you have a JuMP model and it works, then I’d just use that.
A short note: since the problem doesn’t have any inequality constraints, IPOPT (a barrier or interior-point method) doesn’t even use its barrier-related techniques. It boils down to a Lagrange-Newton method, and the differences between different Newton-based methods are Hessian regularization (problem convexification) and globalization techniques (line search vs trust-region method & what kind of merit function).
What if the Jacobian and Hessian of the constraints are very sparse? Does JuMP detect this automatically, and will declaring the sparsity structure reduce computing time?
FastDifferentiation.jl will handle sparse Jacobians and Hessians. It has limitations on problem size, so it may not work for your case, but if it does, it generates very efficient derivatives.
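A rough sketch based on my reading of the FastDifferentiation.jl docs (`make_variables`, `sparse_jacobian`, and `make_function` are the documented entry points, but check the docs for current signatures; the constraint system here is just an illustration):

```julia
using FastDifferentiation

# Symbolic variables and a small sparse constraint system
x = make_variables(:x, 3)
c = [x[1] * x[2], x[2] + x[3]^2]

# Symbolic sparse Jacobian, compiled into a fast evaluator
Jsym = sparse_jacobian(c, x)
J = make_function(Jsym, x)

J([1.0, 2.0, 3.0])  # evaluates the Jacobian as a sparse matrix
```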