Optimizing a noisy objective

Just some ideas:

  • Particle-Swarm Optimisation: I’ve had good results with this on various “black-box” problems, but it will probably require too many function evaluations for an expensive objective. (I also can’t point to a reference implementation right now…)
  • DIRECT (e.g. https://github.com/npinto/direct)
  • Some sort of surrogate modelling (Wikipedia)?
  • Sobol sampling with a Gaussian Process Model (GPM) as surrogate: I did this once for a parameter optimisation where each evaluation was a very expensive simulation (hours per run). We sampled the parameter space with as many Sobol points as we could obtain overnight, trained a GPM, and checked via cross-validation whether the model accuracy was stagnating. We kept sampling (overnight) until the accuracy stopped improving, then applied a (very expensive) global optimisation algorithm to the GPM, which evaluated orders of magnitude faster than the original simulation… I don’t know whether this can be successfully applied to such high-dimensional problems, though. Maybe sparse GPMs or a simpler type of surrogate model would help there?
  • Polynomial Chaos Expansion (PCE, as surrogate model): according to a colleague of mine, PCE yields reliably accurate models with even fewer samples than e.g. GPMs. They use it for surrogate modelling of astronomic processes… Just some pointers I found with a quick Google search:
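The PSO idea from the first bullet fits in a few lines of numpy if you want to try it without a library. This is a minimal global-best variant; the hyperparameters and the noisy test function are just my own illustration, not from any reference implementation:

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle-swarm optimiser (minimisation)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest_x = x.copy()                                 # per-particle best position
    pbest_f = np.array([f(p) for p in x])
    g = pbest_x[pbest_f.argmin()].copy()               # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive pull (own best) + social pull (swarm best)
        v = w * v + c1 * r1 * (pbest_x - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest_x[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest_x[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy noisy objective: a sphere function with small additive noise, optimum at 0.
def noisy_sphere(x, rng=np.random.default_rng(42)):
    return float(np.sum(x**2) + 0.01 * rng.normal())

best_x, best_f = pso(noisy_sphere, bounds=[(-5, 5)] * 3)
```

As noted above, the catch is the evaluation count: this run already spends 30 × 200 = 6000 evaluations, which is fine for a cheap noisy function but not for an hours-per-run simulation.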
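Regarding DIRECT: besides the repo linked above, SciPy ships an implementation as `scipy.optimize.direct`, which may be easier to depend on. A quick sketch on a toy quadratic (the objective and bounds are just for illustration):

```python
from scipy.optimize import direct

# Deterministic, derivative-free global search over a box.
# Toy objective: quadratic bowl with its minimum at (0.3, 0.3).
res = direct(lambda x: sum((xi - 0.3) ** 2 for xi in x),
             bounds=[(0.0, 1.0)] * 2)
print(res.x, res.fun)
```

DIRECT is deterministic, so repeated runs give the same result; since it only averages out noise through the objective itself, a noisy objective may still mislead its subdivision choices.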
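The Sobol-plus-GPM loop described in the fourth bullet could look roughly like this with scipy/scikit-learn. A toy sketch only: the cheap quadratic stands in for the expensive simulation, the batch size plays the role of one “night” of sampling, and the R² stopping threshold is arbitrary:

```python
import numpy as np
from scipy.stats import qmc
from scipy.optimize import differential_evolution
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score

def expensive_sim(x):
    """Stand-in for the hours-long simulation; minimum at (0.3, 0.3)."""
    return float(np.sum((x - 0.3) ** 2))

dim, lo, hi = 2, 0.0, 1.0
sampler = qmc.Sobol(d=dim, scramble=True, seed=0)

X, y = np.empty((0, dim)), np.empty(0)
for night in range(5):                       # one Sobol batch per "night"
    Xb = qmc.scale(sampler.random(16), lo, hi)
    X = np.vstack([X, Xb])
    y = np.concatenate([y, [expensive_sim(x) for x in Xb]])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    # Cross-validated accuracy as the "is the model still improving?" check.
    score = cross_val_score(gp, X, y, cv=5, scoring="r2").mean()
    if score > 0.99:                         # accuracy stagnating -> stop sampling
        break

gp.fit(X, y)
# Global optimisation on the surrogate, which is cheap to evaluate.
res = differential_evolution(lambda x: gp.predict(x.reshape(1, -1))[0],
                             bounds=[(lo, hi)] * dim, seed=0)
```

I used differential evolution here as the global optimiser over the surrogate; any global method works, since surrogate evaluations are essentially free compared to the simulation.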