I’m trying to optimize a non-differentiable function with a mixture of discrete and continuous parameters. Each evaluation takes a long time to run, but we can run many evaluations in parallel on separate machines, so I would like something where I can ask the optimizer for the next 100 experiments, feed the results back into the optimizer, and iterate.
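To make the workflow concrete, here is roughly the interface I have in mind, shown with plain random search as the sampling strategy. Everything here (the `RandomSearchOptimizer` class, the space encoding, the parameter names) is made up for illustration, not from any particular library:

```python
import random

class RandomSearchOptimizer:
    """Toy ask/tell optimizer over a mixed search space (illustration only)."""

    def __init__(self, space, seed=0):
        # space maps name -> ("choice", options) or ("uniform", lo, hi)
        self.space = space
        self.rng = random.Random(seed)
        self.history = []  # completed (params, score) pairs

    def ask(self, n):
        """Propose n parameter settings to evaluate in parallel."""
        batch = []
        for _ in range(n):
            params = {}
            for name, spec in self.space.items():
                if spec[0] == "choice":
                    params[name] = self.rng.choice(spec[1])
                else:
                    _, lo, hi = spec
                    params[name] = self.rng.uniform(lo, hi)
            batch.append(params)
        return batch

    def tell(self, params_batch, scores):
        """Record the results of a completed batch of experiments."""
        self.history.extend(zip(params_batch, scores))

space = {
    "optimizer": ("choice", ["adam", "sgd"]),
    "lr": ("uniform", 1e-4, 1e-1),
}
opt = RandomSearchOptimizer(space)
batch = opt.ask(100)  # dispatch these 100 configs to separate machines
# stand-in for real experiment results:
scores = [p["lr"] for p in batch]
opt.tell(batch, scores)
```

Several real libraries expose this ask/tell pattern directly (e.g. scikit-optimize's `Optimizer`, Optuna's `study.ask()`/`study.tell()`, Nevergrad), which is why I'm describing the workflow this way.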
Empirically, we’ve had bad results with Bayesian Optimization, but good results with Random Search where we manually analyze the results and tweak the distributions for the next batch of experiments. This is unsustainable as our number of parameters grows, so I’m trying to find something that can prune the search space in a more automated way.
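For context, the manual step we'd like to automate looks roughly like a cross-entropy-method update: after each batch, keep the top fraction of results and refit the sampling distributions to them. A toy sketch of that update, under the same made-up space encoding as above (higher score = better is an assumption; the `update_distributions` helper is hypothetical, not from a library):

```python
from collections import Counter

def update_distributions(space, results, elite_frac=0.2):
    """Refit sampling distributions to the best results (cross-entropy-style sketch).

    space:   name -> ("choice", options) or ("uniform", lo, hi)
    results: list of (params_dict, score) pairs; higher score is better (assumption).
    """
    ranked = sorted(results, key=lambda r: r[1], reverse=True)
    n_elite = max(1, int(elite_frac * len(ranked)))
    elites = [params for params, _ in ranked[:n_elite]]

    new_space = {}
    for name, spec in space.items():
        values = [p[name] for p in elites]
        if spec[0] == "choice":
            counts = Counter(values)
            options = spec[1]
            # Reweight categories by elite frequency; +1 smoothing keeps
            # every option alive so the search space isn't pruned too hard.
            weights = [counts.get(o, 0) + 1 for o in options]
            new_space[name] = ("choice", options, weights)
        else:
            # Shrink the uniform range toward the range observed in the elites.
            new_space[name] = ("uniform", min(values), max(values))
    return new_space

# Fake batch: "adam" runs with lr near 0.4 scored well, one "sgd" run scored poorly.
space = {"optimizer": ("choice", ["adam", "sgd"]), "lr": ("uniform", 0.0, 1.0)}
results = [({"optimizer": "adam", "lr": 0.40 + 0.01 * i}, 1.0) for i in range(5)]
results += [({"optimizer": "sgd", "lr": 0.9}, 0.0) for _ in range(5)]
new_space = update_distributions(space, results, elite_frac=0.5)
```

This is basically what we do by hand today, so I'm hoping there is an established algorithm or library that does it in a principled way.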
There are a lot of optimization algorithms out there – do you have any recommendations for libraries, algorithms, or approaches that are similar to random search but a bit more automated between iterations?