Optimizing a physics experiment

I’m a member of an atomic physics lab, where a big part of our day-to-day work involves optimizing the performance of a complicated machine with lots of “knobs”. We are currently not very sophisticated in how we perform the optimization, and we usually only turn one or two knobs at a time. I’ve thought many times that it would be nice to automate this process as much as possible. I am wondering what Julia packages might help in doing this.

The problem I am trying to solve can be broadly characterized as follows:

  • We seek to optimize a scalar function f(x_1,x_2,\dots,x_n). What we actually measure is this function plus some noise, f(x_1,\dots,x_n) + \epsilon. The function f is measured “by hand” or by some program outside of Julia, so in particular it is not amenable to differentiable programming. f is smooth, but pathological in every other way (non-convex, many local extrema, etc.).
  • Some of the input parameters x_i can be measured precisely (e.g. voltage set points). Others can only be measured approximately (e.g. knobs on optics mounts).
  • I would like a set of tools that helps me optimize f: suggesting new combinations of settings for the parameters x_i, receiving the measurement values I provide, and keeping track of the evolution of f as parameters are changed.

Based on what I know of common optimization toolboxes, two major challenges this problem poses are (1) optimizing a noisy signal, and (2) the manual data input. The second in particular implies that the algorithm must work with limited samples, as input is very slow.

Note that the desired toolbox doesn’t have to accomplish spectacular optimization; it just has to do better than what we can do entirely by hand.

So my questions are:

  1. Are there any Julia packages suitable for this application?
  2. If not, would anyone be interested in collaborating or providing guidance on a package for this purpose? I think this problem has broad applicability in hard sciences (anything with complicated, custom machines).

That’s a fairly common problem in chemistry. For instance, I just found this package, which implements the most common methods: GitHub - phrb/ExperimentalDesign.jl: Design of Experiments in Julia

The problem you have is very similar to hyper-parameter optimization (HPO) of ML algorithms: you have a noisy black-box objective and some decision variables you can tweak in your experiment. Of course, HPO is just a special case of this generic black-box optimization problem with a noisy objective. The main difference between your problem and standard ML HPO or black-box optimization is that yours has a “human in the loop”: each f(x) evaluation requires you to run an experiment and manually input the result.

In general, HPO algorithms can be categorised as follows:

  1. Fixed experiment design, e.g. a full factorial experiment, Latin hypercube sampling, a pre-specified random sequence, Sobol sequences, etc. You can find methods for generating diverse experiment designs in GitHub - phrb/ExperimentalDesign.jl: Design of Experiments in Julia, GitHub - baggepinnen/Hyperopt.jl: Hyperparameter optimization in Julia, and GitHub - tpapp/MultistartOptimization.jl: Multistart optimization methods in Julia. You may have to inspect the code and just copy and paste the design-generation part (see the first sketch after this list).

  2. Adaptive experiment design. This can be formulated as a black-box optimization problem: broadly speaking, figuring out the next most promising settings to try given all the previous ones. Some black-box optimization algorithms don’t mind a noisy objective f(x); others do, so you can’t just use any algorithm if the noise is significant. Given the manual nature of your function evaluations, you would need to run one iteration at a time of the chosen optimization algorithm. Perhaps the most popular method for noisy objectives is so-called Bayesian optimization, a form of surrogate-assisted optimization: you build a model (aka surrogate) of your f(x) and optimize the model instead. Most surrogates used here are ML-based, so they don’t mind the noise; when the surrogate is a Gaussian process, the approach is usually called Bayesian optimization. For this path, you have the BOHB algorithm in GitHub - baggepinnen/Hyperopt.jl: Hyperparameter optimization in Julia, you have GitHub - SciML/Surrogates.jl: Surrogate modeling and optimization for scientific machine learning (SciML) which has many surrogates implemented, and you have the less popular, not-very-well-tested, experimental GitHub - JuliaNonconvex/NonconvexBayesian.jl: Constrained Bayesian optimization implementation in Nonconvex.jl (my package, shameless plug). See the second sketch after this list.
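
For item 1, here is a minimal sketch of generating a fixed design with Sobol.jl (assuming the SobolSeq/next! interface from that package; the knob count and ranges are made up for illustration):

using Sobol

# hypothetical ranges for three knobs, e.g. two voltages and a mount position
lb = [0.0, -5.0, 0.0]
ub = [10.0, 5.0, 1.0]

seq = SobolSeq(lb, ub)
design = [next!(seq) for _ in 1:20]  # 20 quasi-random settings to measure by hand

You would measure f at each setting and keep the best, or feed the results into a surrogate as in the next sketch.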
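
And for item 2, a rough sketch of the surrogate path, assuming the Kriging + surrogate_optimize workflow documented for Surrogates.jl. The noisy toy objective is just a stand-in for your manual measurement; you would replace it with a function that prompts you for the measured value, as in the reply below:

using Surrogates

# stand-in for the real experiment: noisy, with an optimum near x = (3, 3)
measure(x) = -sum(abs2, x .- 3.0) + 0.1 * randn()

lb, ub = [0.0, 0.0], [10.0, 10.0]        # hypothetical ranges for two knobs
xs = sample(10, lb, ub, SobolSample())   # initial design, a vector of 2-tuples
ys = measure.(xs)                        # 10 hand-entered measurements
krig = Kriging(xs, ys, lb, ub)           # Gaussian-process surrogate
# repeatedly suggest new settings, evaluate them, and update the surrogate
surrogate_optimize(measure, SRBF(), lb, ub, krig, SobolSample(); maxiters = 20)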

I hope this was useful.

Of course, you can also find Python alternatives to all the packages above, which may be more mature and better maintained.

For the human-in-the-loop part, your objective function can be:

function f(x)
    # show the experimenter the settings to apply on the machine
    println("Try these settings: $x")
    # then read the measured objective value back from the keyboard
    print("Input objective: ")
    return parse(Float64, readline())
end

You may need a try-catch and a while loop to handle typos; see the sketch below. Pass this function to any black-box optimization algorithm that can handle a noisy objective.
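
For instance, a minimal sketch of the same function with the retry loop, so a typo doesn’t crash the optimization run:

function f(x)
    println("Try these settings: $x")
    while true
        print("Input objective: ")
        try
            return parse(Float64, readline())
        catch
            println("Could not parse that as a number, please try again.")
        end
    end
end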
