JuMP.jl: Supplying gradient of objective and jacobian of constraints in Nonlinear Optimization

Hi, I am faced with a troublesome optimization problem. As shown below, the nonlinear optimization problem is easy to model and solve with Ipopt, using JuMP.jl as the interface. However, the objective function and constraints are time-consuming to compute, even with the automatic differentiation embedded in JuMP.jl.

\begin{aligned}
\min_x \quad & f(x) \\
\text{s.t.} \quad & g(x) = a \\
& s(x) = b
\end{aligned}
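For context, here is a minimal sketch of the kind of JuMP model I mean (the functions and data below are placeholders for illustration, not the actual f, g, and s):

```julia
using JuMP, Ipopt

# Placeholder dimensions and right-hand sides; the real f, g, s come from the application.
n = 10
a, b = 1.0, 2.0

model = Model(Ipopt.Optimizer)
@variable(model, x[1:n])
@NLobjective(model, Min, sum(x[i]^2 for i in 1:n))            # stands in for f(x)
@NLconstraint(model, sum(exp(x[i]) for i in 1:n) == a)        # stands in for g(x) = a
@NLconstraint(model, sum(x[i] * x[i+1] for i in 1:n-1) == b)  # stands in for s(x) = b
optimize!(model)
```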

However, the Jacobian and Hessian of the constraints are very sparse. Someone has already supplied the analytical Jacobian and Hessian of the constraints, along with the analytical gradient of the objective function, to Ipopt through the MATLAB OPTI interface, and the speed is very fast.

Now, I know the analytical form of the objective gradient and of the Jacobian and Hessian of the constraints. How can I provide these analytical expressions to the Ipopt solver through JuMP.jl to speed things up? I didn’t find anything in the JuMP.jl docs on this.

I guess MathOptInterface.jl could do it, but it is low-level and not easy to use, and the lack of examples stopped me.

You can find an example in the tutorial section of the manual (User-defined Hessians). There is also some information in the Nonlinear Modeling part of the manual; search for "hessian" on the page. However, there is a caveat:

Finally, the matrix is treated as dense, so the performance will be poor on functions with high-dimensional input.
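For concreteness, here is a minimal sketch of that approach using the legacy `register` API from the manual, with a toy two-variable function standing in for the real objective (newer JuMP versions use `@operator` instead):

```julia
using JuMP, Ipopt

# Toy objective with hand-coded derivatives (Rosenbrock, as in the manual's tutorial).
f(x, y) = (x - 1)^2 + 100 * (y - x^2)^2

function ∇f(g, x, y)
    g[1] = 2 * (x - 1) - 400 * x * (y - x^2)
    g[2] = 200 * (y - x^2)
    return
end

function ∇²f(H, x, y)
    # Only the lower triangle is read, and H is treated as dense.
    H[1, 1] = 2 - 400 * y + 1200 * x^2
    H[2, 1] = -400 * x
    H[2, 2] = 200.0
    return
end

model = Model(Ipopt.Optimizer)
@variable(model, x[1:2])
register(model, :my_f, 2, f, ∇f, ∇²f)
@NLobjective(model, Min, my_f(x[1], x[2]))
optimize!(model)
```

As the quoted caveat says, the Hessian callback receives a dense matrix, so this scales poorly with high-dimensional inputs.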

If you don’t have to use JuMP, you can specify your optimization problem with the JuliaSmoothOptimizers packages, for instance ManualNLPModels.

Here is an example with the gradient: How to create a model from the function and its derivatives
And the constructor docs: Reference · ManualNLPModels.jl
You can then use Ipopt through NLPModelsIpopt. See these tutorials:

The last link also shows a different way to specify an optimization problem that does not depend on ManualNLPModels.jl.
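To give a flavor, here is a condensed sketch along the lines of the ManualNLPModels tutorial linked above (the Rosenbrock objective and the `grad` keyword are taken from that tutorial; adapt them to your own functions and add the constraint callbacks described in the constructor docs):

```julia
using ManualNLPModels, NLPModelsIpopt

# Objective and in-place gradient supplied by hand.
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
function grad!(gx, x)
    gx[1] = 2 * (x[1] - 1) - 400 * x[1] * (x[2] - x[1]^2)
    gx[2] = 200 * (x[2] - x[1]^2)
    return gx
end

nlp = NLPModel([-1.2; 1.0], f; grad = grad!)  # model built from the function and its gradient
output = ipopt(nlp)                           # solve with Ipopt via NLPModelsIpopt
println(output.solution)
```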

4 Likes

If you just have some Julia functions f, g, and s that you want to minimize, then JuMP is not the best tool for the job. See Should you use JuMP? · JuMP

If you can share an example people may have suggestions, otherwise NLPModels is a good choice, or perhaps even the C API to Ipopt: GitHub - jump-dev/Ipopt.jl: Julia interface to the Ipopt nonlinear solver.

1 Like

@odow The model I want to replicate is the random-coefficients logit model, a classic demand estimation model in empirical industrial organization.

As I mentioned, the model is solved faster if we provide the analytical Jacobian and Hessian of the constraints to the Ipopt solver. Here is the link to the paper.
https://onlinelibrary.wiley.com/doi/abs/10.3982/ECTA8585
The author Jean-Pierre Dubé has provided MATLAB code to replicate results.

In the Julia ecosystem, BLPDemand.jl has implemented the estimation algorithm, but without providing the Jacobian and Hessian of the constraints, so the speed might be limited.

I am thinking about translating this MATLAB code into Julia. The key point is to feed the sparse structure and values of the Jacobian and Hessian to Ipopt.

For now, I guess NLPModelsIpopt.jl could be a good tool, thanks to @abelsiqueira. Here is the link to the docs: Tutorial · NLPModelsIpopt.jl. The C API of Ipopt.jl might work as well.

I would appreciate any other advice in this respect.

but without providing the Jacobian and Hessian of the constraints

Note that JuMP automatically computes this information. You don’t need to provide it analytically. Using the low-level API might be slightly faster, but it’s easy to make a mistake, and it takes longer to code. If you have a JuMP model and it works, then I’d just use that.

A short note: since the problem doesn’t have any inequality constraints, IPOPT (a barrier or interior-point method) doesn’t even use its barrier-related techniques. It boils down to a Lagrange-Newton method, and the differences between different Newton-based methods are Hessian regularization (problem convexification) and globalization techniques (line search vs trust-region method & what kind of merit function).
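Concretely, with only the equality constraints g(x) = a and s(x) = b, each Newton step solves a KKT system of roughly this form (J_g and J_s denote the constraint Jacobians), which is why the sparsity of the Jacobians and of the Hessian of the Lagrangian matters so much:

\mathcal{L}(x, \lambda, \mu) = f(x) + \lambda^\top (g(x) - a) + \mu^\top (s(x) - b), \qquad
\begin{bmatrix}
\nabla_{xx}^2 \mathcal{L} & J_g^\top & J_s^\top \\
J_g & 0 & 0 \\
J_s & 0 & 0
\end{bmatrix}
\begin{bmatrix} \Delta x \\ \Delta \lambda \\ \Delta \mu \end{bmatrix}
= -
\begin{bmatrix} \nabla_x \mathcal{L} \\ g(x) - a \\ s(x) - b \end{bmatrix}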

1 Like

What if the Jacobian and Hessian of the constraints are very sparse? Does JuMP detect this automatically, and will declaring the sparsity structure reduce computing time?

FastDifferentiation.jl will handle sparse Jacobians and Hessians. It has limitations on problem size, so it may not work for your case, but if it does, it generates very efficient derivatives.

What if the Jacobian and Hessian of the constraints are very sparse? Does JuMP detect this automatically?

Yes. JuMP automatically computes sparse Jacobians and Hessians.
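If you want to check what JuMP detects, here is a small sketch that queries the sparsity structures of a (legacy @NL) JuMP model called `model`; the structure queries are MathOptInterface functions:

```julia
using JuMP
const MOI = JuMP.MOI  # MathOptInterface, as bundled with JuMP

# `model` is assumed to be a JuMP model built with @NLobjective / @NLconstraint.
evaluator = NLPEvaluator(model)
MOI.initialize(evaluator, [:Grad, :Jac, :Hess])

# Nonzero patterns JuMP detected, as vectors of (row, column) tuples.
jac_structure  = MOI.jacobian_structure(evaluator)
hess_structure = MOI.hessian_lagrangian_structure(evaluator)
println("Jacobian nonzeros: ", length(jac_structure))
println("Hessian nonzeros:  ", length(hess_structure))
```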

Yep, I implemented a JuMP model and it works, but the speed is slow. I am thinking about showing my code to ask for additional performance advice.

2 Likes

Yes, please show your code. There might be other improvements.

Thanks, I have posted my code in a separate question. Here is the link.

1 Like