Latest recommendations for global optimization

What are people’s recommendations for a global optimizer? I’m trying to do simulated method of moments (choosing a set of parameters for an economic model to minimise the distance between model predictions and data) and have so far been using BlackBoxOptim.jl. I find it somewhat unreliable: running exactly the same code on the same data, sometimes the optimizer gives me a parameter choice that provides a close fit to the data, but sometimes it just seems to get stuck without converging. Any particular favourites out there?
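For concreteness, here is a minimal sketch of this kind of setup, assuming BlackBoxOptim.jl’s `bboptimize` interface; the objective and moments are hypothetical stand-ins for the real model:

```julia
using BlackBoxOptim

# Hypothetical SMM-style objective: distance between simulated and empirical moments.
target_moments = [0.5, 2.0]                      # stand-in empirical moments
simulated_moments(θ) = [θ[1], θ[2]^2]            # stand-in for the economic model
smm_distance(θ) = sum(abs2, simulated_moments(θ) .- target_moments)

res = bboptimize(smm_distance;
                 SearchRange = (-5.0, 5.0),      # same bounds for every parameter
                 NumDimensions = 2,
                 MaxFuncEvals = 15_000)

best_candidate(res), best_fitness(res)
```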


IIRC BlackBoxOptim uses stochastic optimization algorithms, so a stochastic result is somewhat expected.
NLopt includes a simulated annealing (SAMIN) solver I’ve used with success before.
You can also find some global solvers in the evolutionary category at https://github.com/wildart/Evolutionary.jl (evolutionary & genetic algorithms for Julia).
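For illustration, here is a minimal sketch of calling one of NLopt’s derivative-free global algorithms (CRS2 here) through NLopt.jl; the bounds and objective are made up:

```julia
using NLopt

# Hypothetical objective; NLopt passes a (possibly empty) gradient vector that
# derivative-free algorithms simply ignore.
function objective(x::Vector, grad::Vector)
    return (x[1] - 1.0)^2 + (x[2] + 2.0)^2
end

opt = Opt(:GN_CRS2_LM, 2)        # controlled random search, 2 variables
lower_bounds!(opt, [-5.0, -5.0]) # global algorithms require finite bounds
upper_bounds!(opt, [5.0, 5.0])
maxeval!(opt, 10_000)            # cap the number of objective evaluations
min_objective!(opt, objective)

minf, minx, ret = optimize(opt, [0.0, 0.0])
```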


Out of curiosity, how many parameters are you optimizing? How much time did BlackBoxOptim take? I have an optimization problem which I know has local minima and I’m also looking for options. My problem has on the order of 10^5 variables.

There’s also https://github.com/timholy/QuadDIRECT.jl .

https://github.com/JuliaDiffEq/DiffEqBenchmarks.jl has parameter estimation tests with difficult likelihood landscapes that we use to test global optimizers. We will soon update all of these to v1.0, and that will be a very good read on how well these optimizers work.

Note that global optimization itself is always somewhat unreliable.


Is this task a constrained or unconstrained optimization problem?


That is to be expected, if the objective is stochastic. If you can’t use common random variables to make the objective smooth, you should use a stochastic optimization method, eg simulated annealing or similar. Yes, it will be slow, and global convergence is not guaranteed. It never is :wink:

Also, it is good to expand the model gradually: find the optimum for a simple model, then add bells & whistles starting from that point.
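Going back to the common-random-numbers point above, here is a minimal sketch of the idea for an SMM objective; the toy “model”, moments, and names are all hypothetical. Drawing the simulation shocks once, up front, makes the simulated moments a deterministic (and typically smoother) function of the parameters:

```julia
using Random, Statistics

nsim, T = 500, 100
shocks = randn(MersenneTwister(1234), nsim, T)   # drawn once, reused for every evaluation

# Toy stand-in for an economic model: location/scale transform of the shocks.
function simulated_moments(θ, shocks)
    y = θ[1] .+ θ[2] .* shocks
    return [mean(y), var(y)]
end

data_moments = [0.5, 2.0]                        # hypothetical empirical moments

# With fixed shocks, this objective is deterministic in θ, so repeated runs of a
# deterministic optimizer give the same answer.
smm_objective(θ) = sum(abs2, simulated_moments(θ, shocks) .- data_moments)
```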

Unless you use a solver like COUENNE or BARON, but then you definitely can’t handle 10^5 variables, unless you were really solving a convex problem all along.


Thankfully I’m only optimizing over 8 parameters. Re timings, I actually haven’t been patient enough to get to full convergence, but I have managed to get a good fit to the data after approximately 15,000 evaluations, which tends to take circa 30,000 seconds. Point taken about the stochastic nature of the optimizers - a bit of a silly question on my part! But it’s good to get thoughts on what’s out there.

Sounds useful! Looking forward to seeing that.

In what sense do these solvers guarantee a global optimum?

Finding guaranteed global minima for non-convex problems is a really hard problem. Feasible approaches require that you be able to compute guaranteed bounds on the function value over continuous chunks of parameter space. https://github.com/JuliaIntervals/IntervalOptimisation.jl is the only pure-Julia package I know of that does this, based on interval arithmetic. Whether it scales to a large number of variables is a bit doubtful, though, depending on what you call “large” (10^5 seems likely to be impossible, 8 might be feasible).
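To give a sense of the interface, here is a minimal sketch assuming IntervalOptimisation.jl’s `minimise` together with IntervalArithmetic.jl; the function and search box are made up:

```julia
using IntervalArithmetic, IntervalOptimisation

# Rigorous global minimisation of a 2-variable function over a box.
f(x) = (x[1] - 3)^2 + (x[2] + 1)^2

X = IntervalBox(-10..10, 2)              # search region

global_min, minimisers = minimise(f, X)  # interval enclosing the minimum value,
                                         # plus a list of candidate boxes
@show global_min length(minimisers)
```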


They can provide both lower and upper bounds on the optimal objective function value. So if the upper bound is equal to the lower bound and you have a feasible point that achieves that objective function value, you know that it is a global minimizer. These solvers do this using (variants of) a branch and bound algorithm, where they recursively subdivide the search space into smaller regions and have ways of eliminating whole regions.

These solvers definitely scale very badly though, and all the ones I know of require you to specify the problem in a symbolic form: the solver needs to ‘know’ the exact functions that define the constraints and objective function; it’s not enough to just supply a function that returns a value and its derivatives.


I have some benchmarks here (https://github.com/jonathanBieler/BlackBoxOptimizationBenchmarking.jl), but the conclusion is that you should use Python’s CMA-ES.

IntervalOptimisation should work for regular julia functions, though how you define them can be a bit of a delicate matter due to the non-distributive property of interval arithmetic. In that world, minimizing (x-3)^2 is not the same thing as minimizing x^2 - 6x + 9; you’ll get much faster convergence with the former.
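A tiny illustration of that point, assuming IntervalArithmetic.jl (exact output widths may vary with the package version):

```julia
using IntervalArithmetic

x = 2..4                      # an interval containing the minimiser x = 3

f_factored(x) = (x - 3)^2     # single occurrence of x: tight enclosure
f_expanded(x) = x^2 - 6x + 9  # each occurrence of x is treated independently

f_factored(x)                 # [0, 1]  -- the true range over [2, 4]
f_expanded(x)                 # roughly [-11, 13] -- much wider, so branch-and-bound
                              # can discard far fewer boxes per iteration
```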


Yes, in that sense it’s nicer, since it works through method overloading and a special scalar type. You can use BARON, COUENNE et al. through JuMP, but since JuMP can only handle nonlinear expressions in @NLconstraint/@NLexpression/@NLobjective macros, it can be a little more awkward. However, fundamentally, neither method allows you to just call out to some C function for the objective value and gradient, which would be possible with a gradient-based method.
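A minimal sketch of that JuMP route, assuming AmplNLWriter.jl and a `couenne` executable on the PATH; the toy problem itself is made up:

```julia
using JuMP, AmplNLWriter

model = Model(() -> AmplNLWriter.Optimizer("couenne"))

@variable(model, -5 <= x <= 5)
@variable(model, -5 <= y <= 5)

# Nonlinear expressions must go through the @NL... macros so the solver
# receives them symbolically rather than as a black-box callback.
@NLconstraint(model, x^2 + y^2 >= 1)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
value(x), value(y), objective_value(model)
```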


Note by the way that IntervalOptimisation’s Moore-Skelboe algorithm is only for unconstrained optimization, whereas general MINLP solvers (such as COUENNE and BARON) can handle constraints.


I haven’t used it, but there’s also https://github.com/JuliaIntervals/IntervalConstraintProgramming.jl. Maybe @dpsanders can comment.


I can see that I actually have seen this repo before, because I’ve asked for dashed lines in an issue :slight_smile: If you look at all the Optim solvers (or solvers using Optim sub-solvers), they flatten out at around 1000 f_calls or iterations. At least that’s how I figure (funny, right?) the plots should be read. The only issue is that the f_calls_limit set here https://github.com/jonathanBieler/BlackBoxOptimizationBenchmarking.jl/blob/887b868d321200a714027dfdf4ef1d2fc16a2936/scripts/optimizers_interface.jl#L93-L97 does not override the built-in iteration limit of iterations=1000. You need to set that as well.

I’m not saying these solvers will work miracles on these types of problems; that x=1000 tick mark is just quite obvious to me :slight_smile: I’d love to see a plot where you also allow iterations=run_length, so that whichever limit is reached first terminates the run.
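To make the suggestion concrete, here is a minimal sketch assuming Optim.jl; `run_length` and the test function are placeholders:

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

run_length = 50_000   # hypothetical evaluation budget from the benchmark

opts = Optim.Options(
    f_calls_limit = run_length,   # budget on objective evaluations
    iterations    = run_length,   # also raise the default cap of 1000 iterations
)

result = optimize(rosenbrock, [-1.0, 2.0], ParticleSwarm(), opts)
```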

Indeed, IntervalOptimisation.jl can, in principle, find guaranteed global optima for Julia functions satisfying certain conditions. It does this by using a version of branch and bound that uses interval arithmetic to guarantee that it can remove regions from the search space.

It can probably (hopefully) handle a pretty complicated function with 8 variables, although very oscillatory functions can be a problem. Certain functions can be handled with even a hundred variables or more, but 10^5 is certainly way out of range.

The currently released version handles only unconstrained optimization, but constrained optimization is, in principle, “easy”; Eeshan Gupta implemented this as part of his GSoC project this summer (Pull Request #9 on JuliaIntervals/IntervalOptimisation.jl), but it is in the process of being updated to version 1.0.

As Tim suggested, this uses interval constraint propagation from https://github.com/JuliaIntervals/IntervalConstraintProgramming.jl, which rigorously calculates the feasible region for a set of real-valued inequalities. This is a technique that enables one to “contract” (squash) a box to a smaller box that still satisfies a constraint.
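A minimal sketch of that contraction idea, assuming the `@constraint`/`pave` interface from IntervalConstraintProgramming.jl; the constraint here is made up:

```julia
using IntervalArithmetic, IntervalConstraintProgramming

S = @constraint x^2 + y^2 <= 1          # build a separator for the constraint
X = IntervalBox(-100..100, 2)           # a huge initial box

paving = pave(S, X, 0.125)              # recursively contract and bisect

paving.inner      # boxes proven to satisfy the constraint
paving.boundary   # boxes that may straddle the constraint boundary
```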

I gave an overview / intro to this stuff in this workshop at JuliaCon 2018 (starting from minute ~46).

Hopefully all of this will be getting an overhaul in the next few months.


Thanks, good catch. I’ll fix that.