Solving a nonlinear constrained optimization problem

Hello,

I have recently been struggling to solve an NLP in Julia. The problem can be written as:

min  f(x)
s.t. c(x) = 0

It is important to mention that the objective function is a discretized integral over a trajectory and the constraints come from a discretization of a PDE, so the problem can be prohibitively expensive. I have derived and implemented the analytical gradient and Hessian of the objective function, and I was looking for a library to solve this as a local SQP problem. I implemented the local KKT system myself and computed the step, but it became difficult for me to implement a line-search/trust-region algorithm that involves merit functions and a lot of heuristics. At this point, I started looking for a library that computes the minimum efficiently, given that I provide a sparse Hessian and Jacobian. I am thinking of something like Optim.jl, but for constrained optimization. Any suggestions are appreciated.
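For reference, here is a minimal sketch of the step computation I have so far (the callbacks `grad_f`, `cons`, `jac_c`, and `hess_lag` are illustrative stand-ins for my analytical derivatives, not library functions):

```julia
using SparseArrays, LinearAlgebra

# Minimal sketch of one (undamped) SQP / Newton-KKT step for
#   min f(x)  s.t.  c(x) = 0.
function kkt_step(x, λ, grad_f, cons, jac_c, hess_lag)
    g  = grad_f(x)         # ∇f(x)
    cx = cons(x)           # c(x)
    J  = jac_c(x)          # sparse m×n constraint Jacobian
    H  = hess_lag(x, λ)    # sparse n×n Hessian of the Lagrangian
    n, m = length(x), length(cx)

    # Assemble the symmetric indefinite KKT system
    #   [H  Jᵀ] [Δx]   [-g ]
    #   [J  0 ] [λ⁺] = [-cx]
    K   = [H  J'; J  spzeros(m, m)]
    rhs = [-g; -cx]

    sol = K \ rhs          # a sparse LDLᵀ (or LU) factorization is typical here
    Δx    = sol[1:n]
    λplus = sol[n+1:end]
    return Δx, λplus       # step and new multiplier estimate
end
```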

“Differential equation constrained optimization” can just be framed as a parameter estimation problem. https://diffeqflux.sciml.ai/dev/examples/pde_constrained/


Thanks, Chris, for your prompt response. I see that it works fine for parameter estimation, but do you think it could be used for a tracking problem (i.e., where my optimization parameter is an entire trajectory)? In that case, my parameter vector x can be very large.

The state-of-the-art solvers you need are Ipopt, filterSQP, SNOPT, and MINOS. Since you don’t have inequality constraints, they will all do pretty much the same thing (solve the KKT system, then use globalization techniques to force sufficient decrease).

I know Ipopt is available in Julia; I’m not sure about the others.


That is indeed correct. I already tried solving the problem with the Julia wrapper of Ipopt (NLPModelsIpopt.jl), and that worked fine for a small problem size. The difficulty I faced, however, was passing analytical Jacobians and Hessians to the solver. I think you need to strictly follow a sparsity structure in your definitions in order to avoid dense matrices. But yes, this is the closest thing I found to what I am trying to do.
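For completeness, here is a minimal sketch of how one can hand analytical derivatives to NLPModelsIpopt.jl by defining a custom NLPModels.jl model that returns them in sparse coordinate (COO) form. The toy problem is purely illustrative, and the exact `NLPModelMeta` / `AbstractNLPModel` signatures may differ between NLPModels versions:

```julia
using NLPModels, NLPModelsIpopt

# Toy equality-constrained problem (stand-in for the real PDE-constrained one):
#   min (x1 - 1)^2 + (x2 - 2)^2   s.t.   x1^2 + x2^2 - 1 = 0
struct ToyNLP <: AbstractNLPModel{Float64, Vector{Float64}}
    meta::NLPModelMeta{Float64, Vector{Float64}}
    counters::Counters
end

function ToyNLP()
    meta = NLPModelMeta(2;                 # n = 2 variables
                        ncon = 1,          # m = 1 equality constraint
                        x0 = [0.5, 0.5],
                        lcon = [0.0], ucon = [0.0],
                        nnzj = 2,          # nonzeros in the Jacobian
                        nnzh = 2,          # nonzeros in the lower-triangular Hessian
                        name = "toy")
    ToyNLP(meta, Counters())
end

NLPModels.obj(nlp::ToyNLP, x) = (x[1] - 1)^2 + (x[2] - 2)^2

function NLPModels.grad!(nlp::ToyNLP, x, g)
    g[1] = 2 * (x[1] - 1)
    g[2] = 2 * (x[2] - 2)
    return g
end

function NLPModels.cons!(nlp::ToyNLP, x, c)
    c[1] = x[1]^2 + x[2]^2 - 1
    return c
end

# Sparsity pattern of the constraint Jacobian (COO rows/cols), then its values.
function NLPModels.jac_structure!(nlp::ToyNLP, rows, cols)
    rows .= [1, 1]; cols .= [1, 2]
    return rows, cols
end

function NLPModels.jac_coord!(nlp::ToyNLP, x, vals)
    vals .= [2x[1], 2x[2]]
    return vals
end

# Lower triangle of the Lagrangian Hessian  σ∇²f + y[1]∇²c  (both are 2I here).
function NLPModels.hess_structure!(nlp::ToyNLP, rows, cols)
    rows .= [1, 2]; cols .= [1, 2]
    return rows, cols
end

function NLPModels.hess_coord!(nlp::ToyNLP, x, y, vals; obj_weight = 1.0)
    vals .= [2obj_weight + 2y[1], 2obj_weight + 2y[1]]
    return vals
end

stats = ipopt(ToyNLP())   # the solver only touches the declared nonzeros
```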

The whole point is for this to be used on big systems. It’s much more efficient to take this approach than to use an optimizer directly. The neural ODE case is simply a case of doing this with thousands of parameters.

https://diffeqflux.sciml.ai/dev/examples/neural_ode_sciml/

Here’s a case with tens of thousands of parameters on a GPU:

https://diffeqflux.sciml.ai/dev/examples/mnist_conv_neural_ode/


For trajectory optimization I’m using Casadi.jl, which wraps Python’s casadi library, which in turn wraps the C++ library with Ipopt as the NLP solver. I found it the fastest for my tasks. JuMP (with an Ipopt backend) can also be used, with the limitations I mentioned in the post above. I’m transcribing my problem using pseudospectral collocation methods. For a system like a quadcopter with 13 states and several waypoints, I get hundreds of decision variables and the optimization takes about 1 second. For a more complicated fixed-wing aircraft with highly nonlinear aerodynamics, it takes thousands of decision variables and tens of seconds. In both cases it is faster than real time, with a real-time factor of about 10x.
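As a point of comparison, here is a minimal JuMP + Ipopt transcription sketch (a simple double integrator with trapezoidal collocation rather than the pseudospectral scheme I actually use; all names and numbers are illustrative):

```julia
using JuMP, Ipopt

# Minimal direct-collocation sketch: drive a double integrator from rest at 0
# to rest at 1 in fixed time T while minimizing control effort.
N = 50                      # number of collocation intervals
T = 2.0                     # horizon
h = T / N                   # step length

model = Model(Ipopt.Optimizer)

@variable(model, q[1:N+1])               # position
@variable(model, v[1:N+1])               # velocity
@variable(model, -5 <= u[1:N+1] <= 5)    # control

# Trapezoidal collocation of the dynamics q' = v, v' = u
@constraint(model, [k = 1:N], q[k+1] - q[k] == 0.5 * h * (v[k] + v[k+1]))
@constraint(model, [k = 1:N], v[k+1] - v[k] == 0.5 * h * (u[k] + u[k+1]))

# Boundary conditions
@constraint(model, q[1] == 0)
@constraint(model, v[1] == 0)
@constraint(model, q[N+1] == 1)
@constraint(model, v[N+1] == 0)

# Discretized integral of u^2 (trapezoidal rule)
@objective(model, Min, 0.5 * h * sum(u[k]^2 + u[k+1]^2 for k in 1:N))

optimize!(model)
```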