Hello, I’m an engineer in operations research and I was working on a MILP. Since the problem was too big, I decided to use Benders cuts. But it’s still very big, so I want to change my method and use a genetic algorithm. Indeed, in my work I can simplify the primal subproblem and make it run faster when I just evaluate the subproblem for one solution of the master problem, so I want to test many possible configurations of the design given by the master problem. So I want to know if there is a library that implements genetic algorithms with constraints (my master problem has constraints) and whose fitness function can take the result of a simulation (here, the value of the subproblem for one specific combination of the master problem). Thank you.

Unfortunately, Julia does not seem to have many generic packages for genetic algorithms.

wildart/Evolutionary.jl seems too limited to accomplish what you ask. You could maybe still use it by returning a very bad fitness value whenever a constraint is violated.

GeneticAlgorithms.jl could work, as it asks you to implement your own crossover operation (so you can implement it in a way that takes the constraints into account). However, it has not been updated in a long time (since before Julia 1.0), so I doubt it still works. Maybe your best bet is to clone it and update it to the latest Julia version.

If you are open to something other than genetic algorithms, you may be interested in the solutions provided by BlackBoxOptim.jl. (Optim.jl also offers a few algorithms that do not require the objective function to be differentiable.)

I don’t think there are any libraries in Julia with specific implementations for constrained genetic algorithms. As far as I understand it, handling constraints in a GA is usually problem-specific – most often done by transforming the constraints into a penalty function and integrating it into the fitness function. For generic GAs, the packages listed in JuliaOpt are Evolutionary.jl and GeneticAlgorithms.jl.
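As a rough illustration of that penalty idea (a Python sketch with made-up constraints `g_i(x) <= 0` and a fixed penalty weight `rho` – not taken from any particular package):

```python
def constraint_violations(x):
    # Illustrative constraints g_i(x) <= 0 (invented for this example):
    #   g1: x[0] + x[1] - 10 <= 0
    #   g2: -x[0] <= 0  (i.e. x[0] >= 0)
    g = [x[0] + x[1] - 10, -x[0]]
    # Keep only the positive part: how far each constraint is violated.
    return [max(0.0, gi) for gi in g]

def penalized_fitness(x, objective, rho=1e3):
    """Objective plus a quadratic penalty on the constraint violations."""
    penalty = sum(v ** 2 for v in constraint_violations(x))
    return objective(x) + rho * penalty

# Example objective (also invented): distance to the point (3, 4).
obj = lambda x: (x[0] - 3) ** 2 + (x[1] - 4) ** 2
print(penalized_fitness([3.0, 4.0], obj))  # feasible point: penalty is zero
```

The GA then minimizes `penalized_fitness` instead of `obj`; the catch, as noted below, is that `rho` has to be tuned.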

If you’re trying to find an optimal solution by testing many possible configurations of the design in an IP solver, your approach is much more closely related to mathematical-programming-based heuristics or hybrid metaheuristics, where you use an IP solver in tandem with heuristic or metaheuristic methods. Several references for these methods are here:

Hope it helps!

Thank you! I can also arrange it so that if the constraints aren’t respected, I don’t evaluate the LP subproblem at all; that could be a way to save computation time.

A very simple option is to use Deb’s approach to incorporate constraints into the fitness function.

https://doi.org/10.1016/S0045-7825(99)00389-8

We have been using it, and it’s simple yet very effective.
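For reference, Deb’s comparison rules can be sketched like this (illustrative Python, names invented): a feasible solution beats an infeasible one; two feasible solutions compare objectives; two infeasible solutions compare total violation.

```python
def total_violation(g):
    """Sum of the positive parts of inequality constraints g_i(x) <= 0."""
    return sum(max(0.0, gi) for gi in g)

def deb_better(f1, g1, f2, g2):
    """True if solution 1 is preferred over solution 2 under Deb's rules.
    f1, f2: objective values; g1, g2: lists of constraint values."""
    v1, v2 = total_violation(g1), total_violation(g2)
    if v1 == 0.0 and v2 == 0.0:
        return f1 < f2          # both feasible: lower objective wins
    if v1 == 0.0 or v2 == 0.0:
        return v1 == 0.0        # feasible beats infeasible
    return v1 < v2              # both infeasible: smaller violation wins
```

No penalty weight is needed, which is a big part of why this comparison works well inside tournament selection.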

If anyone wants to implement constraint-handling methods for zeroth-order algorithms, I will be happy to host them in GitHub - mohamed82008/Nonconvex.jl: Toolbox for non-convex constrained optimization.

The ISRES algorithm is a GA supporting arbitrary nonlinear constraints directly via a stochastic-ranking strategy (not via simple penalties). It’s available in NLopt.jl, but not for MILP.
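As a hedged sketch of the stochastic-ranking idea behind ISRES (pure Python, not NLopt’s actual implementation): a bubble-sort-like pass over the population that compares adjacent individuals by objective when both are feasible or with probability `pf`, and by total constraint violation otherwise.

```python
import random

def stochastic_rank(f, phi, pf=0.45, rng=random):
    """Return population indices sorted by stochastic ranking
    (Runarsson & Yao). f: list of objective values; phi: list of total
    constraint violations (0.0 means feasible)."""
    n = len(f)
    idx = list(range(n))
    for _ in range(n):                     # up to n bubble-sort sweeps
        swapped = False
        for i in range(n - 1):
            a, b = idx[i], idx[i + 1]
            if (phi[a] == 0 and phi[b] == 0) or rng.random() < pf:
                out_of_order = f[a] > f[b]     # compare by objective
            else:
                out_of_order = phi[a] > phi[b] # compare by violation
            if out_of_order:
                idx[i], idx[i + 1] = b, a
                swapped = True
        if not swapped:
            break
    return idx
```

On an all-feasible population this degenerates into a plain sort by objective; the randomness only matters when feasible and infeasible individuals have to be traded off.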

(If you are using penalties, you should also look into the augmented-Lagrangian algorithm, which is a much better way of implementing constraints as penalties than the obvious approach.)
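To illustrate the augmented-Lagrangian idea (a minimal 1-D sketch with an invented toy problem, not a production setup): minimize f(x) = x² subject to x ≥ 1, with the inner minimization done by a crude grid search standing in for a GA or any other zero-order method. The outer loop updates the multiplier so the penalty does not have to be driven to infinity.

```python
def aug_lagrangian_value(x, lam, rho):
    """Augmented Lagrangian (up to a constant) for
    min f(x) = x^2  s.t.  g(x) = 1 - x <= 0   (toy problem)."""
    f = x * x
    g = 1.0 - x
    return f + (rho / 2.0) * max(0.0, g + lam / rho) ** 2

def solve(rho=10.0, iters=20):
    lam = 0.0
    xs = [i / 1000.0 for i in range(-2000, 2001)]  # coarse search grid
    x = 0.0
    for _ in range(iters):
        # Inner minimization by grid search (stand-in for a GA).
        x = min(xs, key=lambda x: aug_lagrangian_value(x, lam, rho))
        # Multiplier update for the inequality constraint.
        lam = max(0.0, lam + rho * (1.0 - x))
    return x, lam
```

On this toy problem the iterates approach the true optimum x = 1 with multiplier λ ≈ 2 (the KKT value), without `rho` ever having to grow.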

Rather than relaxing the constraints into the objective function (you have to tune the penalty parameter, and it can be pretty finicky), you can build a selection operator that compares the objective values and the constraint violations of two individuals and determines which one is better (see for example *Differential Evolution: A Practical Approach to Global Optimization*).
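For example, in a DE-style survivor step that comparison could look like this (illustrative Python sketch, not library code; `f` and `g` stand for the user’s objective and constraint functions):

```python
def violation(g_values):
    """Total violation of constraints g_j(x) <= 0."""
    return sum(max(0.0, gj) for gj in g_values)

def de_select(target, trial, f, g):
    """Survivor selection without a penalty parameter: the trial vector
    replaces the target if it is less infeasible, or equally feasible
    with a no-worse objective."""
    vt, vu = violation(g(target)), violation(g(trial))
    if vu < vt:
        return trial
    if vu == vt and f(trial) <= f(target):
        return trial
    return target

# Toy problem: minimize x^2 subject to x >= 1 (i.e. g(x) = 1 - x <= 0).
f = lambda x: x * x
g = lambda x: [1.0 - x]
```

Applied over generations, this steers the population toward the feasible region first and only then optimizes the objective, much like Deb’s rules mentioned above.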