Optimizer in Evolutionary.jl returns initial conditions as minimizer with default settings

I want to use a genetic algorithm to minimize a function. After searching, I read that the best available option is Evolutionary.jl. Hence, I started reading the documentation. I tried running the examples found in the tutorial but came across some issues.

Running the example:

res = Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), GA())

indeed returns the correct solution. However, the same piece of code with different initial conditions (that are not the final result) produces wrong results (at least on my PC). For instance:

res = Evolutionary.optimize(x->-sum(x), [11,19,-1], GA())


 * Status: success

 * Candidate solution
    Minimizer:  [11, 19, -1]
    Minimum:    -29
    Iterations: 12

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0001 (vs limit Inf)
    Iterations:    12
    f(x) calls:    650

which is clearly wrong. In similar cases, the result I get is the initial condition, not the minimum point.

However, when running the example found here, I do get the correct results. So if I alter the settings of GA() and use these:

ga = GA(populationSize=100, selection=uniformranking(3),
        mutation=gaussian(), crossover=uniformbin())

I get results that seem ok.

To dig into this, I also tried defining a custom sum function and printing its argument to see what is going on:

function sumTest(x)
    println(x)
    return sum(x)
end

and then defining a test:

initialParameters = [1.0, -1.0, 10.0]
results = Evolutionary.optimize(sumTest, initialParameters, GA())

As expected, the argument passed to sumTest is always initialParameters.

This is also fixed when using custom GA parameters.
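Putting the two observations together, here is a minimal self-contained sketch of the workaround. The operator names uniformranking, gaussian, and uniformbin come from the Evolutionary.jl examples and may differ between versions of the package, so treat them as assumptions:

```julia
using Evolutionary

# Diagnostic objective: print each candidate the GA evaluates.
function sumTest(x)
    println(x)
    return sum(x)
end

initialParameters = [1.0, -1.0, 10.0]

# Explicit operators instead of the defaults. With a bare GA(), the
# printed candidates never change from initialParameters; with these
# operators they should vary between evaluations.
ga = GA(populationSize = 100,
        selection = uniformranking(3),
        mutation  = gaussian(),
        crossover = uniformbin())

results = Evolutionary.optimize(sumTest, initialParameters, ga)
```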

I found a relevant issue in the GitHub repository.

You are correct; the default parameters are useless.
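If I recall the constructor correctly (worth double-checking against the source for your version), the stock GA() falls back to trivial pass-through operators, roughly along these lines, which is why the population never moves away from the initial candidates:

```julia
# Rough sketch of the default GA operators (an assumption, not the
# verbatim source): selection keeps the whole population in order,
# crossover merely swaps the parents, and mutation is the identity,
# so no generation ever produces a new candidate.
selection = (fitness, N) -> 1:N   # no selection pressure
crossover = (x, y) -> (y, x)      # parents returned unchanged
mutation  = x -> x                # no mutation
```

Passing real operators (e.g. uniformranking, gaussian, uniformbin) is therefore required to get any actual search.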