A performant optimization algorithm for an expensive cost function with parallel execution

I have a very expensive multivariate cost function (an FDTD simulation with openEMS), but I have the resources to run many evaluations in parallel. What would be a good algorithm (and package) to take advantage of this? A variant of differential evolution, perhaps?


Response surface techniques are designed specifically for the global optimization of functions that are very expensive to evaluate. In each iteration they construct an interpolation or approximation surrogate function of known analytic form. The surrogate function is then subjected to global optimization, e.g., by some form of multiple random start (started at a selection of the current points). The resulting optimizers (or some points where the feasible region has only been sparsely explored) are taken as new evaluation points. Since this is sequential global optimization, each step is much more expensive than in other methods, but the reduction in the number of function values needed gives (for sufficiently expensive function evaluations) a net gain in speed. In principle, these methods may have convergence guarantees if the point selection strategy is well chosen; but this is irrelevant in view of the fact that for expensive functions, only a few (perhaps up to 1000) function evaluations are admissible.
…
Both simulated annealing methods and genetic algorithms* are, in their simpler forms, easy to understand and easy to implement, features that invite potential users of optimization methods to experiment with their own versions. The methods often work, if only slowly, and may be useful tools for applications where function values are not very expensive and the primary interest is to find (near-)solutions now, even when the reliability is uncertain and only subglobal optima are reached.
*More generally, all metaheuristic algorithms are included here.

Reference: Neumaier A. Complete search in continuous global optimization and constraint satisfaction. Acta Numerica. 2004;13:271-369. doi:10.1017/S0962492904000194
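
To make the response-surface loop described above concrete, here is a minimal, self-contained Julia sketch: a toy Gaussian-RBF surrogate with batch point selection. Every name and parameter here is illustrative, not from any particular package; in practice you would reach for something like Surrogates.jl.

```julia
using LinearAlgebra

# Stand-in for the expensive simulation (e.g., one FDTD run).
expensive_f(x) = sum(abs2, x .- 0.3) + 0.1 * sin(10 * x[1])

# Fit a Gaussian RBF interpolant through the evaluated points.
function rbf_fit(X, y; s = 0.5)
    Phi = [exp(-norm(xi - xj)^2 / s^2) for xi in X, xj in X]
    return (Phi + 1e-10I) \ y          # tiny ridge for numerical safety
end

rbf_eval(w, X, x; s = 0.5) =
    sum(w[i] * exp(-norm(x - X[i])^2 / s^2) for i in eachindex(X))

function surrogate_loop(f, lb, ub; n_init = 8, n_batch = 4, n_iter = 10)
    rand_point() = lb .+ rand(length(lb)) .* (ub .- lb)
    X = [rand_point() for _ in 1:n_init]
    y = map(f, X)                      # expensive; parallelize with pmap
    for _ in 1:n_iter
        w = rbf_fit(X, y)
        # "Global" optimization of the cheap surrogate by dense random
        # sampling; a real implementation would multistart a local method.
        cands = [rand_point() for _ in 1:10_000]
        order = sortperm([rbf_eval(w, X, c) for c in cands])
        batch = cands[order[1:n_batch]]
        append!(X, batch)
        append!(y, map(f, batch))      # the only expensive calls per iteration
    end
    i = argmin(y)
    return X[i], y[i]
end

xbest, fbest = surrogate_loop(expensive_f, [0.0, 0.0], [1.0, 1.0])
```

Only the `map(f, ...)` calls touch the expensive function, and each batch of new evaluation points can be dispatched to your parallel resources.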


I’ve had the same problem with global optimization of non-differentiable functions. Unfortunately, even algorithms that are inherently parallel are typically not set up for it. Implementors seem to assume that the objective function is cheap and focus their attention on other things, probably because the standard benchmark functions usually are cheap.

I’ve made a parallel version of ParticleSwarm in Optim.jl, available at https://github.com/sgaure/Optim.jl. It hasn’t been merged yet, but you’re of course free to use it.
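
For reference, the stock (serial) `ParticleSwarm` call in Optim.jl looks like the sketch below; I’d expect the fork to keep a similar signature, but check its README. The objective is just a stand-in:

```julia
using Optim

f(x) = sum(abs2, x)                   # stand-in for the expensive objective

res = optimize(f, zeros(3),
               ParticleSwarm(lower = fill(-2.0, 3),
                             upper = fill(2.0, 3),
                             n_particles = 16),
               Optim.Options(iterations = 200))

Optim.minimizer(res)
```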


Or, better, they assume that if the objective function is expensive, the bottleneck won’t be the algorithm itself.

My two cents for the OP is that the best algorithm depends a lot on the problem. For example, some problems are well suited to genetic algorithms for the global optimization phase*, and those are rather simple to parallelize; a rough sketch of such a hybrid follows after the footnote.

*I assume that it is always a good idea to have a local optimizer at the bottom improving the current guesses.
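
A minimal sketch of that pattern, assuming the `Distributed` standard library for the parallel evaluations and Optim.jl’s NelderMead for the local polish; the objective and every parameter here are stand-ins:

```julia
using Distributed
addprocs(4)                                  # assuming 4 local workers
@everywhere expensive_f(x) = sum(abs2, x)    # stand-in for the FDTD run

using Optim

# One generation: evaluate the whole population in parallel, keep the
# best half, refill with mutated copies of the survivors.
function generation!(pop)
    fitness = pmap(expensive_f, pop)         # parallel objective evaluations
    order = sortperm(fitness)
    elite = pop[order[1:end ÷ 2]]
    pop .= vcat(elite, [e .+ 0.1 .* randn(length(e)) for e in elite])
    return fitness[order[1]]
end

pop = [2 .* randn(3) for _ in 1:16]
for _ in 1:20
    generation!(pop)
end

# Local optimizer at the bottom, polishing the best candidate.
best = pop[argmin(pmap(expensive_f, pop))]
res = optimize(expensive_f, best, NelderMead())
Optim.minimizer(res)
```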


The previous suggestion of using a surrogate optimizer is good if the dimensionality of your search space is not too large.

NOMAD.jl is designed for expensive blackbox functions and can provide a good, but probably not optimal, solution in fewer objective function evaluations than most other optimizers. The Julia wrapper for NOMAD doesn’t support parallel objective function evaluations, but the underlying C++ package does. It can be called from Julia with more effort using the file-based interface.
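
If the serial wrapper is still useful to you, the basic NOMAD.jl interface looks roughly like the sketch below, going by the package README at the time of writing (the blackbox returns a `(success, count_eval, outputs)` tuple); treat the signatures as assumptions and verify against the current docs:

```julia
using NOMAD

# Blackbox: must return (success, count_eval, outputs).
function bb(x)
    f = sum(abs2, x)                # stand-in for the expensive objective
    return (true, true, [f])
end

p = NomadProblem(3, 1, ["OBJ"], bb;
                 lower_bound = fill(-2.0, 3),
                 upper_bound = fill(2.0, 3))

result = solve(p, zeros(3))         # second argument is the starting point
```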

CMAEvolutionStrategy.jl is a global optimizer that supports parallel evaluation of the objective function. It is slow but very thorough. I have used it for similar electromagnetic optimization problems, using a coarser grid for the electromagnetic solver to get into the vicinity of a good solution and then polishing with a local optimizer from NLopt.jl, as suggested by @lmiq above. Note: nowadays, instead of the Powell algorithms available in NLopt, I would choose the corrected versions from PRIMA.jl.
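
A minimal sketch of the parallel-evaluation mode, assuming the `parallel_evaluation` keyword described in the CMAEvolutionStrategy.jl docs (with it set, the objective receives a matrix with one candidate per column and must return a vector of values; verify the details against the current README):

```julia
using CMAEvolutionStrategy

# With parallel_evaluation = true the objective receives a d×p matrix
# (one candidate per column) and must return a vector of p values;
# dispatch the expensive simulations in parallel inside this function.
f_batch(X) = [sum(abs2, view(X, :, j)) for j in 1:size(X, 2)]

result = minimize(f_batch, zeros(3), 1.0;
                  lower = fill(-2.0, 3),
                  upper = fill(2.0, 3),
                  parallel_evaluation = true)

xbest(result), fbest(result)
```

The batch function is the natural place to hand the candidate columns to your parallel resources, e.g., with `pmap`.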

Edit: Another package for blackbox optimization that includes multiple algorithms (DE, PSO, and others) is Metaheuristics.jl. It supports single- and multi-objective problems. This table lists the available algorithms and shows which ones support parallel (denoted as “batch” therein) objective function evaluations.
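
For example, a batch-evaluated DE run might look like the sketch below, assuming the `parallel_evaluation` option from the Metaheuristics.jl documentation (with it set, the objective receives the whole population as a matrix, one candidate per row, and returns a vector of values); treat the details as assumptions and check the current docs:

```julia
using Metaheuristics

# Batch objective: one candidate per row; dispatch the parallel
# simulations here and return one value per candidate.
f(X) = [sum(abs2, X[i, :]) for i in 1:size(X, 1)]

bounds = [-2.0 -2.0 -2.0;     # lower bounds (first row)
           2.0  2.0  2.0]     # upper bounds (second row)

options = Options(parallel_evaluation = true)
result = optimize(f, bounds, DE(options = options))
minimizer(result)
```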
