[ANN-RFC] NaturalES.jl - Natural Gradient Optimization

NaturalES.jl

This package implements the optimization methods described in Wierstra et al., “Natural Evolution Strategies”, JMLR (2014). The implementation follows the KISS™ principle.

Usage

function rosenbrock(x::AbstractVector{T}) where T
    s = (1.0 - x[1])^2
    for i in 1:(length(x) - 1)
        s += 100.0 * (x[i+1] - x[i]^2)^2
    end
    return s
end

optimize(rosenbrock, [0.3, 0.6], 1.0, sNES) # separable NES

(sol = [0.9999902815083116, 0.9999805401026993], cost = 9.450201922031972e-11)


optimize(rosenbrock, [0.3, 0.6], 1.0, xNES) # exponential NES

(sol = [0.9999999934969991, 0.9999999871800216], cost = 4.574949214506023e-17)

For further info, type ?optimize in the Julia REPL.

Tips:

  • Use xNES for hard problems with strongly correlated variables
  • Use sNES for high dimensional problems that exhibit many local minima
  • Use sNES for problems with mostly separable variables
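The rules of thumb above can be sketched as a tiny helper. This is a hypothetical function (not part of NaturalES.jl), and the dimension cutoff of 50 is an illustrative assumption, not a figure from the package:

```julia
# Hypothetical sketch of the tips above: prefer xNES for small, strongly
# correlated problems; prefer sNES for high-dimensional or mostly
# separable ones. The cutoff of 50 dimensions is an assumed placeholder.
choose_method(dim::Integer; correlated::Bool = false) =
    (correlated && dim <= 50) ? :xNES : :sNES
```

For example, `choose_method(2; correlated = true)` suggests `:xNES`, while `choose_method(1000)` suggests `:sNES`.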

Future Plans:

  • Implementing other strongly performing variants
  • Parallelization

If there is any interest in having this package registered, let me know!
You can now install this package by typing ]add NaturalES in the Julia REPL.


Sweet! How does it compare to https://github.com/robertfeldt/BlackBoxOptim.jl ?


I’d say in some cases it compares very favorably :smiley:


using BenchmarkTools
using BlackBoxOptim
using NaturalES

function rosenbrock(x::AbstractVector{T}) where T
    s = (1.0 - x[1])^2
    for i in 1:(length(x) - 1)
        s += 100.0 * (x[i+1] - x[i]^2)^2
    end
    return s
end

function bb_task()
    best_candidate(bboptimize(rosenbrock; SearchRange = (-5.0, 5.0),
                              NumDimensions = 2, Method = :separable_nes,
                              TraceMode = :silent))
end

function nates_task()
    optimize(rosenbrock, [0.0, 0.0], 2.0, sNES).sol
end

@btime bb_task()  #19.940 ms (276040 allocations: 14.91 MiB)
@btime nates_task() #652.101 μs (3867 allocations: 61.84 KiB)

println(bb_task())    # [0.9876181286752018, 0.975347963147114]
println(nates_task()) # [0.9999963410120233, 0.9999926866427169]

using LinearAlgebra

println(norm(bb_task() .- [1, 1]) / norm(nates_task() .- [1, 1])) # ~ 10^3


Much less time, far fewer allocations, and a much better solution.

In this case my package is ~31 times faster and reaches a solution three orders of magnitude more accurate.

The same argument also holds for xNES:

function bb_task()
    best_candidate(bboptimize(rosenbrock; SearchRange = (-5.0, 5.0),
                              NumDimensions = 2, Method = :xnes,
                              TraceMode = :silent))
end

function nates_task()
    optimize(rosenbrock, [0.0, 0.0], 2.0, xNES).sol
end

@btime bb_task()  #126.951 ms (326961 allocations: 26.50 MiB)
@btime nates_task() #175.700 μs (2743 allocations: 233.63 KiB)

Nice work. Maybe this can be merged into BlackBoxOptim, or vice versa, in the future.


https://github.com/JuliaRegistries/General/pull/13727

The package is in the process of being registered.
The package is now registered!
Thank you, everyone, for showing some love for this tiny package!


Note that the performance people usually care about is the probability of finding the global minimum as a function of the number of function evaluations, not really the wall-clock time of the algorithm itself (since the bottleneck is often evaluating the objective function).
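One simple way to measure that kind of cost is to wrap the objective in a counting closure before handing it to the optimizer. This is a generic sketch, not part of NaturalES.jl; the commented-out `optimize` call assumes the `optimize(f, x0, sigma, method)` signature shown earlier in the thread:

```julia
# Sketch: count objective evaluations by wrapping the function in a closure
# that increments a shared counter on every call.
function counted(f)
    n = Ref(0)                   # evaluation counter
    g(x) = (n[] += 1; f(x))      # wrapped objective
    return g, n
end

# 2-D Rosenbrock, as used elsewhere in this thread.
rosen(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

g, n = counted(rosen)
# optimize(g, [0.3, 0.6], 1.0, sNES)  # requires NaturalES.jl (signature assumed)
# n[] would then hold the number of objective evaluations used.
```

This makes evaluation counts comparable across optimizers regardless of their internal overhead.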

I have a small framework for that kind of benchmark, if you are interested.


It would be nice to add my package to those beautiful plots!