From the README:
The CMA Evolution Strategy (CMA-ES) is a stochastic method for derivative-free optimization of potentially non-linear, non-convex, or noisy functions over continuous domains (Hansen 2016). A brief discussion of its performance in practice can be found on Wikipedia.
The default settings and implementation details closely follow Hansen (2016) and pycma.
For details on noise handling, see Hansen (2009).
Example
julia> function rosenbrock(x)
           n = length(x)
           sum(100 * (x[2i-1]^2 - x[2i])^2 + (x[2i-1] - 1)^2 for i in 1:div(n, 2))
       end
julia> using CMAEvolutionStrategy
julia> result = minimize(rosenbrock, zeros(6), 1.)
(4_w,9)-aCMA-ES (mu_w=2.8,w_1=49%) in dimension 6 (seed=17743412058849885570, 2020-05-12T16:22:27.211)
  iter  fevals  function value      sigma  time[s]
     1       9  6.06282462e+02  8.36e-01    0.008
     2      18  6.00709117e+02  8.42e-01    0.009
     3      27  2.40853796e+02  7.84e-01    0.009
   100     900  8.25748973e-01  1.44e-01    0.021
   200    1800  2.21358637e-05  1.12e-02    0.040
   266    2394  5.58767672e-12  2.76e-05    0.051
Optimizer{Parameters,BasicLogger,Stop}
(4_w,9)-aCMA-ES (mu_w=2.8,w_1=49%) in dimension 6 (seed=17743412058849885570, 2020-05-12T16:22:27.254)
termination reason: ftol = 1.0e-11 (2020-05-12T16:22:27.255)
lowest observed function value: 1.076905008476142e-12 at [0.9999990479016964, 0.9999981609497738, 0.9999990365312236, 0.9999981369588251, 0.9999994689450983, 0.9999988356249463]
population mean: [1.000000255106133, 1.0000004709845969, 1.0000006232562606, 1.0000012290059055, 0.9999998790530266, 0.9999997338544545]
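The result object printed above can also be queried programmatically. The sketch below assumes the package exports the accessors `xbest`, `fbest`, and `population_mean` (as the printed summary suggests) and that `minimize` accepts a `verbosity` keyword to silence the log; if those names differ in the released version, substitute the actual accessors:

```julia
using CMAEvolutionStrategy

# Same pairwise Rosenbrock as above; requires an even-dimensional input.
rosenbrock(x) = sum(100 * (x[2i-1]^2 - x[2i])^2 + (x[2i-1] - 1)^2
                    for i in 1:div(length(x), 2))

# `verbosity = 0` is assumed here to suppress the iteration log.
result = minimize(rosenbrock, zeros(6), 1.0; verbosity = 0)

x = xbest(result)            # best candidate seen during the run
f = fbest(result)            # its function value
m = population_mean(result)  # mean of the final population
```

For the Rosenbrock function, both `xbest(result)` and `population_mean(result)` should be close to the global minimizer, the all-ones vector.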
If there is interest, I'll register the package.