I’m looking for a combinatorial optimizer for a black-box function (i.e., only the value of the objective function is obtainable, no derivatives), where the decision variables are a permutation of the integers 1, 2, \ldots, N, with N between, say, 50 and 500. This is for a short-fuse project, so I would prefer a solution that I can just plug in and use as-is without extensive customization. I am aware of Combo, which looks like it has been untouched for more than three years and doesn’t work under Julia 1.x. I’ve also looked at MHLib, which might be made to work but has no documentation, so it may require quite a bit of time and effort to understand and set up properly. Any other leads would be appreciated. I have very little experience with JuMP, but I am open to using it if someone can explain how it can be used for this type of problem.

I’ve also considered using a continuous variable optimizer, say, DE or CMA-ES, where the N floating-point decision variables are bounded between 0 and 1, and the permutation is inferred by their order. Can anyone comment on the advantages/pitfalls of using such a scheme?
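To make the scheme concrete, the decoding step I have in mind is just sortperm over the continuous vector (a minimal sketch; the function names and the toy objective are mine):

```julia
# "Random keys" decoding: each permutation is represented by N floats in
# [0, 1], and the permutation is the order (rank) of those keys.
decode(keys::Vector{Float64}) = sortperm(keys)

# Example: keys (0.7, 0.1, 0.4) decode to the permutation (2, 3, 1),
# since the 2nd key is smallest, then the 3rd, then the 1st.
keys = [0.7, 0.1, 0.4]
p = decode(keys)

# A continuous optimizer would work on `keys`, with the black-box
# objective evaluated on the decoded permutation:
f(perm) = sum(i * perm[i] for i in eachindex(perm))  # toy stand-in objective
f(decode(keys))
```

One property I’m unsure about is that this mapping is many-to-one (all key vectors with the same ordering decode to the same permutation), which is part of what I’m asking about.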

Thanks, I suspected JuMP was not right for this problem and I appreciate your confirming this.

Regarding the difficulty, I think it is not too bad. I used a Matlab genetic algorithm years ago for a similar problem and it worked quite well. I just don’t want to have to spend the time translating and debugging the Matlab code if there is a canned solution waiting for me.

Edit: Here is an example of someone using a genetic algorithm to solve the traveling salesman problem in seconds with about the same number of unknowns as I need. I don’t believe any domain knowledge was used here, just the distances between cities.
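To make the comparison concrete, the standard ingredients of such a permutation GA (order crossover, swap mutation, truncation selection) can be sketched in plain Julia; this is my own sketch, not the code from that example:

```julia
using Random

# Order crossover (OX): copy a random slice from parent a, then fill the
# remaining positions with the unused values in the order they appear in b.
function order_crossover(a::Vector{Int}, b::Vector{Int})
    n = length(a)
    i, j = minmax(rand(1:n), rand(1:n))
    child = fill(0, n)
    child[i:j] = a[i:j]
    used = Set(a[i:j])
    pos = 1
    for v in b
        v in used && continue
        while i <= pos <= j          # skip over the copied slice
            pos += 1
        end
        child[pos] = v
        pos += 1
    end
    return child
end

# Swap mutation: with some probability, exchange two random positions.
function swap_mutate!(p::Vector{Int}, rate::Float64)
    if rand() < rate
        i, j = rand(1:length(p)), rand(1:length(p))
        p[i], p[j] = p[j], p[i]
    end
    return p
end

# Minimal GA: truncation selection keeps the cheaper half, children are
# bred from random pairs of survivors. f maps a permutation to a cost.
function ga(f, N; pop_size = 100, generations = 200, mutation_rate = 0.2)
    pop = [shuffle(collect(1:N)) for _ in 1:pop_size]
    for _ in 1:generations
        sort!(pop, by = f)                   # cheapest first
        elite = pop[1:pop_size ÷ 2]
        children = [swap_mutate!(order_crossover(rand(elite), rand(elite)),
                                 mutation_rate)
                    for _ in 1:(pop_size - length(elite))]
        pop = vcat(elite, children)
    end
    best = pop[argmin(map(f, pop))]
    return best, f(best)
end
```

For a TSP, f would be the tour length computed from the distance matrix; no other domain knowledge enters.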

However, unless you have domain knowledge, you’re unlikely to do better than:

function solve(f, N; time_limit)
    start = time()
    x = collect(1:N)
    best_x, best_f = copy(x), Inf
    while time() - start < time_limit
        cost = f(x)
        if cost < best_f
            best_f = cost
            best_x .= x
        end
        sortperm!(x, rand(N))  # overwrite x with a fresh uniform random permutation
    end
    return best_x, best_f
end

function black_box(x::Vector{Int})
    return sum(i * x[i] for i in 1:length(x))
end

solve(black_box, 50; time_limit = 1.0)

Thanks for the suggestions. I believe Metaheuristics and BlackBoxOptim are intended only for objective functions of continuous variables. I looked at Evolutionary, and as far as I can tell from the documentation, the only combinatorial operators the package provides are for binary variables.

The random search function you supplied takes N continuous random samples between 0 and 1 and infers the permutation from their order, the technique I was asking about. Instead of just using randomly generated samples, these N continuous variables could be optimized using a standard, continuous variable black-box optimizer like the ones you pointed me to. I wanted to know if anyone has experience doing this sort of thing and how it compares to an optimizer specialized for combinatorial domains.
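To spell it out, the lifting I mean is a one-line wrapper; any continuous black-box optimizer could then be pointed at the wrapped objective. The hill climb below is only a self-contained stand-in for a real DE or CMA-ES implementation, and all names here are mine:

```julia
# f is the black-box objective on permutations; make_continuous lifts it
# to R^N via the random-key decoding (permutation = order of the values).
make_continuous(f) = keys -> f(sortperm(keys))

# Stand-in for a real continuous optimizer (DE, CMA-ES, ...): a (1+1)
# hill climb with Gaussian perturbations, included only so the example
# runs without external packages.
function hill_climb(g, N; iters = 5_000, sigma = 0.1)
    y = rand(N)
    best = g(y)
    for _ in 1:iters
        cand = clamp.(y .+ sigma .* randn(N), 0.0, 1.0)
        c = g(cand)
        if c < best
            y, best = cand, c
        end
    end
    return y, best
end

f(perm) = sum(i * perm[i] for i in eachindex(perm))  # toy objective
g = make_continuous(f)
keys, val = hill_climb(g, 20)
perm = sortperm(keys)  # recover the best permutation from the best keys
```

The same `g` could be handed to, e.g., BlackBoxOptim over the box [0, 1]^N; whether that beats a specialized combinatorial search is exactly my question.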

MHLib.jl may indeed be a good starting point for your work if you want to apply a heuristic approach. Realizing a basic (general) variable neighborhood search, GRASP, (adaptive) large neighborhood search, etc. for a black-box permutation problem is easy by just adapting the TSP demo; see the more detailed comments here.

Note, however, that as for any challenging combinatorial optimization problem, it is always a good idea to think about the problem structure and possibly implement more specialized methods that you provide to the framework. For example, a problem-specific construction heuristic for creating an initial solution might be a good idea, or some more specific neighborhood structures, possibly allowing an incremental solution evaluation. MHLib.jl contains only basic functionality, but it is designed for efficiency and gives much flexibility for such problem-specific extensions.
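As a minimal illustration, independent of MHLib.jl itself, a first-improvement local search in the swap neighborhood for a black-box permutation objective looks like this; replacing the full f(x) call with an O(1) delta computation, where the objective’s structure allows it, is exactly the kind of problem-specific extension meant above (names and toy objective are mine):

```julia
using Random

# First-improvement local search in the swap neighborhood: try exchanging
# two positions, keep the swap if it improves the objective, undo it
# otherwise. With a pure black box every move costs one full evaluation;
# a problem-specific extension would compute the cost change of a swap
# incrementally instead of re-evaluating f from scratch.
function swap_local_search(f, x::Vector{Int}; max_evals = 10_000)
    best = f(x)
    evals = 1
    improved = true
    while improved && evals < max_evals
        improved = false
        for i in 1:length(x)-1, j in i+1:length(x)
            x[i], x[j] = x[j], x[i]
            c = f(x)
            evals += 1
            if c < best
                best = c
                improved = true
            else
                x[i], x[j] = x[j], x[i]  # undo the non-improving swap
            end
            evals >= max_evals && break
        end
    end
    return x, best
end

f(perm) = sum(i * perm[i] for i in eachindex(perm))  # toy objective
x, val = swap_local_search(f, shuffle(collect(1:10)))
```

Embedded in a (G)VNS or as the repair step of an LNS, this kind of routine is where such extensions pay off most.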