Latest recommendations for global optimization

If your problem lacks differentiability and continuity, you really shouldn’t be using any algorithm that, at its heart, takes a derivative. This includes nominally derivative-free algorithms like NEWUOA, BOBYQA, and COBYLA, which internally fit the data to a smooth (differentiable) model.

If you are trying to solve an optimization problem that lacks differentiability, usually your best course of action is to think more deeply about the problem and see whether you can reformulate it in a differentiable way, e.g. by using an epigraph formulation.
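For instance, a worst-case objective like minimizing max(f₁(x), f₂(x)) is nonsmooth at crossings, but the epigraph trick makes it smooth: introduce a dummy variable t, minimize t, and add the smooth constraints fᵢ(x) ≤ t. Here's a minimal sketch using SciPy's SLSQP (the particular functions and solver are my illustrative choices, not from the post):

```python
import numpy as np
from scipy.optimize import minimize

# Nonsmooth problem: minimize max((x-2)^2, (x+2)^2), whose optimum is x = 0.
# Epigraph form: minimize t over (x, t) subject to (x-2)^2 <= t and (x+2)^2 <= t.
def objective(z):
    return z[1]  # z = [x, t]; the objective is just the epigraph variable t

constraints = [
    {"type": "ineq", "fun": lambda z: z[1] - (z[0] - 2) ** 2},  # (x-2)^2 <= t
    {"type": "ineq", "fun": lambda z: z[1] - (z[0] + 2) ** 2},  # (x+2)^2 <= t
]

res = minimize(objective, x0=[1.0, 10.0], method="SLSQP", constraints=constraints)
x_opt, t_opt = res.x
print(x_opt, t_opt)  # expect x near 0, t near 4
```

Both constraints are active at the solution, which is exactly where the original max was nondifferentiable; the reformulated problem sees only smooth quadratics.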

For global optimization of smooth functions in < 10 dimensions, usually the MLSL multistart algorithm is my first choice, though like most multistart algorithms it requires some problem-specific experimentation to choose a good stopping criterion for the local optimizer (since you don’t want to waste too much time doing local optimization to high precision … you can always “polish” the best local optimum at the end). MLSL doesn’t handle nonlinear constraints, however.
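The multistart-then-polish idea can be sketched generically: run a loose-tolerance local optimization from many starting points, then refine only the best result to high precision. This sketch uses SciPy rather than NLopt's actual MLSL (which adds quasi-random sampling and clustering rules to avoid rediscovering the same minima); the double-well test function and the deterministic grid of starts are illustrative assumptions for reproducibility:

```python
import numpy as np
from scipy.optimize import minimize

# Double-well test function: global minimum near x = -1.01, local minimum near x = +0.99.
def f(x):
    return (x[0] ** 2 - 1.0) ** 2 + 0.1 * x[0]

lo, hi = -2.0, 2.0

# Multistart phase: loose tolerance, so no run wastes effort on high precision.
best = None
for x0 in np.linspace(lo, hi, 8):
    res = minimize(f, [x0], method="L-BFGS-B", bounds=[(lo, hi)], tol=1e-3)
    if best is None or res.fun < best.fun:
        best = res

# Polishing phase: refine only the best local optimum to high precision.
polished = minimize(f, best.x, method="L-BFGS-B", bounds=[(lo, hi)], tol=1e-12)
print(polished.x, polished.fun)  # expect x near -1.01, f near -0.10
```

The loose tolerance in the first phase is exactly the problem-specific knob mentioned above: too tight and every start pays for precision you only need once; too loose and the ranking of local optima can be wrong.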
