Which optimization API/Pkg for simple global optimization

#1

Hi, in a laser physics experiment I am building I need to optimize the positions of 4 actuators on some mirrors; these positions are simply controlled by 4 numbers. The optimization needs to be done to maximize a set of metrics that I can encode as a single fitness value. Usually I perform this optimization by assigning it to some PhD students or post-docs. But I’m hoping I can save their time for other tasks and get Julia to do this for me!

I need to essentially solve a bounded global optimization problem. Previously I have used genetic algorithms for such tasks with some success, but I get the impression that these are now out of date and that better solutions exist.

However, when I look at the packages in Julia I quickly get overwhelmed by the range of choices and the optimization jargon that I do not understand (and don’t really have the time to study). For example, I do not really know what my “model” is, or whether my problem is convex, quadratic, etc. The table of solvers at http://www.juliaopt.org/, for instance, is essentially useless to me!

Essentially I’d like a routine to which I pass my fitness function (it takes 4 numbers and returns a single number to be maximized), and which then finds (or has a good chance of finding) the globally optimal set within bounds. I don’t mind tuning parameters of the algorithm. As I said, I’ve used genetic algorithms for this before. I also don’t have access to gradients, since the function is evaluated by hardware. And there is likely some hysteresis, but I’m hoping to ignore that.

It seems that NLopt might be my best bet, but I’m happy to hear suggestions from the experts!
Thanks!

#2

Is your function differentiable? You said you don’t have the gradient, but I am wondering whether it can be calculated numerically. In any case, I can shamelessly plug my own package, Genoid.jl. It is a genetic algorithm combined with a local optimizer, which finds a global solution faster (it exploits curvature information if the fitness is differentiable, and skips it otherwise).

Optim.jl is another alternative (it has both gradient-based and gradient-free solvers and it works very well).
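For the bounded, derivative-free setting described above, a minimal Optim.jl sketch might use `SAMIN()` (box-constrained simulated annealing). The fitness function here is a toy stand-in with a known optimum, not the real experimental objective, and the bounds are assumed to be `[0, 1]` for each actuator:

```julia
using Optim

# Toy stand-in for the experimental fitness (assumed shape);
# the real objective would query the apparatus instead.
fitness(x) = 1.0 - sum(abs2, x .- 0.3)

lb = zeros(4)        # lower travel limits of the 4 actuators (assumed)
ub = ones(4)         # upper travel limits (assumed)
x0 = fill(0.5, 4)    # starting positions, inside the bounds

# Optim minimizes, so negate the fitness. SAMIN is Optim.jl's
# box-constrained simulated annealing: derivative-free and global.
res = optimize(x -> -fitness(x), lb, ub, x0, SAMIN(),
               Optim.Options(iterations = 10^5))
best = Optim.minimizer(res)
```

Simulated annealing needs a generous iteration budget, which is why `iterations` is raised well above the default here.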

#3

Are the constraints simple bounds? What does the objective function look like? I’m guessing it’s non-convex, otherwise it wouldn’t be a very hard problem. If you can express the objective function as an analytical expression of the 4 input variables and set that as an @NLobjective in JuMP, I would try the open-source Couenne solver, which uses a spatial branch-and-bound technique to provably obtain a globally optimal solution to your nonlinear constrained optimization problem. Keep in mind that non-convex global optimization is a difficult problem: even quadratic programming with a single negative eigenvalue is NP-hard. But for only 4 variables and simple constraints, spatial branch-and-bound shouldn’t scale too badly.
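As a sketch of what that would look like (assuming Couenne is reached through AmplNLWriter.jl and the Couenne_jll binary, and using a stand-in quartic objective since the real fitness has no analytical form): maximizing a convex function over a box is non-convex, and spatial branch-and-bound certifies the global optimum of 1.0 at a vertex of the box.

```julia
using JuMP, AmplNLWriter, Couenne_jll

# Stand-in non-convex problem: maximize a convex quartic-free quadratic
# over the unit box. Global optimum is 1.0, attained at any box vertex.
model = Model(() -> AmplNLWriter.Optimizer(Couenne_jll.amplexe))
@variable(model, 0 <= x[1:4] <= 1)                   # simple bounds
@NLobjective(model, Max, sum((x[i] - 0.5)^2 for i in 1:4))
optimize!(model)
objective_value(model)   # provably global via spatial branch-and-bound
```

Again, this route only applies if the objective can be written symbolically; a black-box experimental readout cannot be handed to Couenne.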

#4

Yes the constraints are simple bounds.

The objective function is the output of the physical experiment as measured by my apparatus: there is no analytical expression for it, and derivatives are not available. I do expect it to be somewhat “smooth”, but full of local optima.

#5

Just to be clear: I want to run this optimization in real time in my experiment. The optimization routine will provide a trial set of my 4 variables and my code will then actually physically set the mirror actuators to these values, my laser will fire, and I will measure the outcome of the experiment, which I then return to the optimization routine. The bounds are the limits that I can move the mirrors to.
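That hardware-in-the-loop setup can be wrapped directly as an NLopt objective. The sketch below stubs the hardware with hypothetical placeholder functions (`set_actuators!` and `fire_and_measure` are invented names, and the simulated readout is a smooth toy function); in the real experiment they would move the mirrors, fire the laser, and read the detector:

```julia
using NLopt

set_actuators!(pos) = nothing                          # placeholder: move mirrors
fire_and_measure(pos) = 1.0 - sum(abs2, pos .- 0.3)    # placeholder: detector readout

function fitness(x::Vector, grad::Vector)   # NLopt's objective signature; grad is
    set_actuators!(x)                       # ignored by derivative-free algorithms
    return fire_and_measure(x)
end

opt = Opt(:GN_DIRECT_L, 4)      # deterministic derivative-free global search
lower_bounds!(opt, zeros(4))    # actuator travel limits (assumed [0, 1])
upper_bounds!(opt, ones(4))
max_objective!(opt, fitness)
maxeval!(opt, 1000)             # each evaluation is one laser shot, so cap them
maxf, maxx, ret = optimize(opt, fill(0.5, 4))
```

Capping `maxeval` matters here, since every function evaluation costs a physical shot of the experiment.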

#6

It would be interesting to compare that to the MLSL algorithm (available in NLopt; another option for differentiable or non-differentiable global optimization), which also combines local and global search (see the end of this notebook for an example).
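A minimal MLSL sketch in NLopt.jl, on an assumed multimodal test function (a separable double well, not the experimental objective): MLSL launches a derivative-free local optimizer, here Nelder-Mead, from low-discrepancy starting points and clusters the results.

```julia
using NLopt

# Smooth but multimodal stand-in objective; its global minimum is
# near x ≈ (-1.04, -1.04, -1.04, -1.04), with 16 local minima in total.
doublewell(x) = sum(t -> (t^2 - 1)^2 + 0.3t, x)

opt = Opt(:G_MLSL_LDS, 4)              # MLSL with low-discrepancy starts
local_opt = Opt(:LN_NELDERMEAD, 4)     # derivative-free local polisher
xtol_rel!(local_opt, 1e-4)             # stopping tolerance for each local run
local_optimizer!(opt, local_opt)       # MLSL requires a local optimizer
lower_bounds!(opt, fill(-2.0, 4))
upper_bounds!(opt, fill(2.0, 4))
min_objective!(opt, (x, grad) -> doublewell(x))
maxeval!(opt, 20_000)                  # evaluation budget shared by all restarts
minf, minx, ret = optimize(opt, fill(1.5, 4))
```

Note that MLSL in NLopt must be given an explicit local optimizer, and that the total `maxeval` budget is spread over the restarts.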

#7

Good idea! I have coded several functions that are smooth but challenging for local solvers to optimize. If I find the time, I will write a blog post comparing the different approaches available in Julia.

#8

Yeah, just set the problem up with JuMP or MathProgBase and try a bunch of different solvers. My problems all tended to do well with the offerings from NLopt, but YMMV.
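Trying a bunch of solvers can be automated with a small loop over NLopt algorithms, as in this sketch (the objective is a toy stand-in with a known minimum, and the algorithm list is just an illustrative sample of NLopt's global and local derivative-free offerings):

```julia
using NLopt

objective(x) = sum(abs2, x .- 0.3)     # toy stand-in, minimum 0 at x = 0.3

function solve_with(alg)
    opt = Opt(alg, 4)
    lower_bounds!(opt, zeros(4))       # assumed box constraints [0, 1]^4
    upper_bounds!(opt, ones(4))
    min_objective!(opt, (x, grad) -> objective(x))
    maxeval!(opt, 5000)                # same evaluation budget for each solver
    minf, minx, ret = optimize(opt, fill(0.5, 4))
    return minf
end

# Two global (GN_*) and two local (LN_*) derivative-free algorithms.
results = Dict(alg => solve_with(alg)
               for alg in (:GN_DIRECT_L, :GN_CRS2_LM, :LN_NELDERMEAD, :LN_SBPLX))
```

Comparing the returned objective values across algorithms, under a fixed evaluation budget, is a cheap way to pick a solver before pointing it at the real hardware.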

#9

I have a simple meta-heuristic optimisation package available in Julia, which I developed as a means of learning the language. You can find out more at http://www.ucl.ac.uk/~ucecesf/fresa.html .

Happy to communicate off-line, as I do not like web-based systems for communication. I do wish this community used a proper mailing list… but that’s an issue for another day.