High-dimensional optimization problem with inequality constraint

Hello,
I am solving a high-dimensional optimization problem with an inequality constraint using BlackBoxOptim.jl. The objective function is differentiable, but I cannot write down the Hessian by hand. The dimensionality is over 7000. I know the dimensionality is extremely high, so I am seeking advice on which way I should go to achieve better results and what alternative packages (besides BlackBoxOptim) to use:

  1. choose multiple starting points (achievable in BlackBoxOptim, but I suspect that many of these starting points will violate the inequality constraint, and the only way I could come up with to pick feasible starting points is the rejection-sampling sketch after this list);
  2. use a more efficient algorithm. Since the objective function is automatically differentiable, I wonder if it is possible to solve the problem without providing the Hessian matrix. If so, can I use NLopt or JuMP instead?
  3. given the high dimensionality, multiplicity is expected. Any suggestions for dealing with multiplicity in optimization problems?
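For item 1, the only approach I could come up with is plain rejection sampling: draw candidates uniformly inside the box bounds and keep those that satisfy the constraint. `lb`, `ub`, and `g` below are placeholders for my actual problem, with the constraint written as `g(x) <= 0`:

```julia
# Rejection sampling for feasible starting points.
# lb, ub: bound vectors; g: constraint function, feasible when g(x) <= 0.
function feasible_starts(g, lb, ub, nstarts; maxtries = 10_000)
    n = length(lb)
    starts = Vector{Vector{Float64}}()
    for _ in 1:maxtries
        x = lb .+ rand(n) .* (ub .- lb)   # uniform draw inside the box
        g(x) <= 0 && push!(starts, x)     # keep only feasible candidates
        length(starts) == nstarts && break
    end
    return starts                          # may hold fewer than nstarts
end
```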
Thanks a bunch!!!

You don’t need the Hessian (second derivative), you just need the first derivatives (gradients/Jacobians) of the objective and constraints. NLopt provides several algorithms that can handle this.
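For example, a minimal NLopt.jl sketch with a gradient-based algorithm (MMA) and one inequality constraint. `f` and `g` are placeholders, and ForwardDiff stands in for whatever AD tool you use (at 7000 dimensions a reverse-mode tool will usually be faster for gradients):

```julia
using NLopt, ForwardDiff

f(x) = sum((x .- 1).^2)      # placeholder objective
g(x) = x[1] + x[2] - 1.0     # placeholder constraint, feasible when g(x) <= 0

# NLopt callbacks fill the gradient in place when `grad` is non-empty.
function obj_cb(x, grad)
    length(grad) > 0 && ForwardDiff.gradient!(grad, f, x)
    return f(x)
end
function con_cb(x, grad)
    length(grad) > 0 && ForwardDiff.gradient!(grad, g, x)
    return g(x)
end

n = 10
opt = Opt(:LD_MMA, n)               # gradient-based, handles inequalities
opt.lower_bounds = zeros(n)
opt.upper_bounds = fill(2.0, n)
opt.xtol_rel = 1e-6
opt.min_objective = obj_cb
inequality_constraint!(opt, con_cb, 1e-8)
minf, minx, ret = optimize(opt, fill(0.5, n))
```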

(We regularly solve problems with > 10^6 degrees of freedom and multiple inequality constraints.)

By “multiplicity” you mean “multiple local optima”? In an arbitrary non-convex problem, you just have to learn to live with the fact that you are unlikely to have a global optimum (and can never guarantee it). Be happy if you find a local optimum that does much better than your competitors or what you could come up with by hand.

(There are various heuristic approaches that one can try, e.g. gradually increasing the number of degrees of freedom, that can sometimes help if you find yourself getting frequently stuck in poor local minima. But this is highly problem dependent.)


Yes, you should be able to get the Hessian using automatic differentiation. However, as mengxiaoliu points out, you can use first-order methods or quasi-Newton methods that do not require the exact Hessian.

There is only one inequality, but there might be bound constraints? In that case, they also count as inequalities :wink: Depending on the number of inequalities, it is possible that both active-set methods and interior-point methods perform well. I would start with Ipopt (an interior-point solver); it's always a safe bet. It is an infeasible-start method, that is, you can start from an infeasible point and it will iteratively reduce the constraint violation.
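For instance, a minimal JuMP + Ipopt sketch with box bounds and a single extra inequality (the objective and constraint are placeholders):

```julia
using JuMP, Ipopt

n = 10
model = Model(Ipopt.Optimizer)
@variable(model, 0.0 <= x[1:n] <= 2.0)     # bounds on every parameter
@constraint(model, x[1] + x[2] <= 1.0)     # the one extra inequality
@objective(model, Min, sum((x[i] - 1.0)^2 for i in 1:n))
optimize!(model)
value.(x)                                  # recovered solution
```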


Shameless plug alert

Check out https://github.com/mohamed82008/Nonconvex.jl, where you can use NLopt or Ipopt (or other algorithms) with automatic differentiation by default. The API there is probably more beginner-friendly than JuMP's blackbox nonlinear optimization API. You can also run a multi-start version of these algorithms in Nonconvex, which I found useful for exploring a few local optima.
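A sketch following the usage pattern in the Nonconvex.jl docs (placeholder `f` and `g`; check the repo for the current API):

```julia
using Nonconvex
Nonconvex.@load Ipopt

f(x) = sum((x .- 1).^2)      # placeholder objective
g(x) = x[1] + x[2] - 1.0     # placeholder constraint, feasible when g(x) <= 0

n = 10
model = Model(f)
addvar!(model, zeros(n), fill(2.0, n))   # lower and upper bounds
add_ineq_constraint!(model, g)

result = optimize(model, IpoptAlg(), fill(0.5, n), options = IpoptOptions())
result.minimum, result.minimizer
```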


You don’t need to compute the gradient and Hessian by hand; they are computed with automatic differentiation. So you can use one of the NLP solvers listed at Installation Guide · JuMP with JuMP, and they will compute first- or second-order derivatives (depending on the solver) without you having to specify any derivative manually.
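For example, if the objective is an ordinary Julia function rather than an algebraic expression, you can register it as an operator and JuMP will differentiate it for you (a sketch using the `@operator` macro from recent JuMP versions; `my_obj` is a placeholder):

```julia
using JuMP, Ipopt

my_obj(x...) = sum((xi - 1.0)^2 for xi in x)   # placeholder objective

n = 10
model = Model(Ipopt.Optimizer)
@variable(model, 0.0 <= x[1:n] <= 2.0)
@operator(model, op_obj, n, my_obj)   # JuMP computes the gradient via AD
@objective(model, Min, op_obj(x...))
@constraint(model, x[1] + x[2] <= 1.0)
optimize!(model)
```

(Splatting thousands of arguments into a registered operator gets slow, so at 7000 dimensions it is usually better to write the objective algebraically in JuMP syntax when possible.)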


In high-dimensional problems, it is generally too expensive to compute (or even store) the Hessian matrix (unless it is very sparse), whether it is done by hand or automatically. For 7000 parameters, a dense Hessian has 7000² ≈ 4.9 × 10⁷ entries, i.e. roughly 400 MB in double precision. Fortunately, there are many optimization algorithms that only require first derivatives.


Thanks. This is tremendously helpful! Regarding the “multiplicity,” yes, I meant multiple local optima. I was hoping to obtain multiple local optima and then process them by hand.

Yes. All the parameters have a lower bound and an upper bound. There is one additional inequality constraint for two of the parameters. I will experiment with Ipopt and see how it goes. Thanks!!

Thanks!! Multistart sounds like a good way to approach this problem.

Re: (We regularly solve problems with > 10^6 degrees of freedom and multiple inequality constraints.)

Do you mean the objective function takes more than 10^6 parameters? Are you using NLopt for that? That is very impressive!

Yes. Provided you have gradients, this size of problem can be solved with NLopt or tools like JuMP and Ipopt.