Genetic Algorithm packages

For those who run genetic algorithms or similar optimization methods in Julia, which package do you use?

I’ve been using Evolutionary.jl and it works well, but I’m wondering if there are any alternatives worth checking out.

Metaheuristics.jl implements many methods including Genetic Algorithms, Particle Swarm Optimization, Differential Evolution, Artificial Bee Colony, Machine-coded Compact Genetic Algorithms, etc. I use it in my optimization courses.
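To give a flavor of the API, here is a minimal sketch (assuming the 2×D bounds-matrix convention and the `minimum`/`minimizer` accessors from recent Metaheuristics.jl releases; the Rastrigin objective is just a standard test problem):

```julia
using Metaheuristics

# Rastrigin: a standard multimodal test function, global minimum 0 at the origin
f(x) = 10length(x) + sum(abs2.(x) .- 10 .* cos.(2π .* x))

# Box constraints as a 2×D matrix: first row lower bounds, second row upper
bounds = [-5.12 -5.12 -5.12;
           5.12  5.12  5.12]

result = optimize(f, bounds, ECA())   # swap in DE(), PSO(), ABC(), MCCGA(), ...
@show minimum(result) minimizer(result)
```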


Other packages worth a look: Nonconvex.jl, which wraps both Metaheuristics.jl and NOMAD, and CMAEvolutionStrategy.jl, which I’ve used extensively with good success on tough global optimization problems. Also, Optimization.jl is a meta-package that wraps many population-based optimizers (among others).
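For CMAEvolutionStrategy.jl, a typical call looks roughly like this (the objective, starting point, and initial step size are placeholders; `minimize`, `fbest`, and `xbest` are the package’s exported names, as far as I recall):

```julia
using CMAEvolutionStrategy

f(x) = sum(abs2, x .- 1)                      # toy quadratic, minimum at ones(5)

result = minimize(f, zeros(5), 0.5;           # start point and initial step size σ
                  lower = fill(-5.0, 5),
                  upper = fill(5.0, 5))
@show fbest(result) xbest(result)
```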


NLopt.jl also exposes some genetic algorithms.

(Though in my experience these are algorithms of last resort.)
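For reference, a sketch of one of those algorithms (ESCH, an evolutionary method) through NLopt.jl’s property-based API; the objective and evaluation budget here are illustrative only:

```julia
using NLopt

opt = Opt(:GN_ESCH, 2)               # GN_* = global, derivative-free algorithm
opt.lower_bounds = [-5.0, -5.0]
opt.upper_bounds = [5.0, 5.0]
opt.maxeval = 20_000                 # stopping criterion: evaluation budget
# NLopt objectives take (x, grad); derivative-free algorithms ignore `grad`
opt.min_objective = (x, grad) -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
minf, minx, ret = optimize(opt, [0.0, 0.0])
@show minf minx ret
```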


BlackBoxOptim.jl gave very good results for my use case.
Its `compare_optimizers` function is nice, but the default optimizer already gave good results.
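For anyone curious, the basic BlackBoxOptim.jl pattern is roughly the following (keyword names as in the package’s README; the search range and budget are placeholders):

```julia
using BlackBoxOptim

f(x) = sum(abs2, x)                  # sphere function, minimum 0 at the origin

res = bboptimize(f; SearchRange = (-5.0, 5.0),
                    NumDimensions = 3,
                    MaxFuncEvals = 10_000)
@show best_fitness(res) best_candidate(res)
```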

Could you please elaborate on which algorithm(s) you prefer and why for global optimization?

There is no single “use this and drop the others” algorithm for global optimization. The success of an algorithm depends on the objective function, the dimensionality, and the choice of parameters, among other factors. I generally run multiple instances of several algorithms to gain insight into the global optimum.

If you ask my personal opinion, I always try out many of them with different parameters (I assume the question concerns real-valued domains). DE is generally effective for many problems, while PSO can be the right choice for others. ECA (Evolutionary Centers Algorithm) and ABC (Artificial Bee Colony) are also promising. MCCGA (machine-coded compact genetic algorithm) is the most performant one in many cases.
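A minimal sketch of that run-several-algorithms workflow with Metaheuristics.jl (the test problem, the dimension, and five runs per algorithm are arbitrary choices for illustration):

```julia
using Metaheuristics

f(x) = 10length(x) + sum(abs2.(x) .- 10 .* cos.(2π .* x))  # Rastrigin
bounds = repeat([-5.12; 5.12], 1, 10)                      # 2×10 bounds matrix

for A in (DE, PSO, ECA, ABC, MCCGA)
    # fresh algorithm instance per run to avoid reusing internal state
    fvals = [minimum(optimize(f, bounds, A())) for _ in 1:5]
    println(nameof(A), ": best of 5 runs = ", minimum(fvals))
end
```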


Thanks for your insights. I’m hoping @stevengj will also expand on why he considers genetic algorithms to be methods of last resort.

At least for my use cases, I’ve almost always found other methods to be more efficient. If you have derivatives, then I usually like MLSL (combining gradient-based local search with random or quasi-random global search, plus a technique that reduces repeated searching of the same local optima).
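For concreteness, a sketch of MLSL through NLopt.jl, assuming the `local_optimizer` property for attaching a gradient-based local solver (the quadratic objective is a placeholder):

```julia
using NLopt

n = 2
function f(x, grad)
    if length(grad) > 0                   # MLSL's local searches use gradients
        grad[1] = 2 * (x[1] - 1)
        grad[2] = 2 * (x[2] + 2)
    end
    return (x[1] - 1)^2 + (x[2] + 2)^2
end

opt = Opt(:G_MLSL_LM, n)                  # MLSL with low-discrepancy sampling
opt.lower_bounds = fill(-10.0, n)
opt.upper_bounds = fill(10.0, n)
opt.maxeval = 5_000
opt.min_objective = f

local_opt = Opt(:LD_LBFGS, n)             # gradient-based local optimizer
local_opt.xtol_rel = 1e-8
opt.local_optimizer = local_opt

minf, minx, ret = optimize(opt, zeros(n))
@show minf minx ret
```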

Genetic algorithms (and particle swarms and simulated annealing) have evocative natural analogies that, in my opinion, have given them disproportionate mindshare compared to algorithms that just have math (e.g. MLSL, DIRECT & other branch-and-bound algorithms…).


I share the same thoughts as @stevengj regarding the use of derivative-based methods when derivatives are available. Additionally, it’s worth noting that many metaheuristics and evolutionary algorithms are minimalist in their assumptions, requiring neither continuity nor the availability of derivatives.

For additional insight on nature-inspired algorithms:

The field of meta-heuristic search algorithms has a long history of finding inspiration in natural systems. Starting from classics such as Genetic Algorithms and Ant Colony Optimization, the last two decades have witnessed a fireworks-style explosion (pun intended) of natural (and sometimes supernatural) heuristics - from Birds and Bees to Zombies and Reincarnation.

The goal of the Evolutionary Computation Bestiary is to catalog the, ermm… exuberance of the meta-heuristic “eco-system”. We try to keep a list of the many different animals, plants, microbes, natural phenomena and supernatural activities that can be spotted in the wild lands of the metaphor-based computation literature.


What’s MLSL? Could you please share some references?

The MLSL section of the NLopt algorithms manual has a couple of references to the original papers.
