BlackBoxOptim - beginner's questions

I am new to solving optimization problems. So please excuse any ignorance in my questions.

I am using BlackBoxOptim.jl, and I have a few questions:

  1. Initial guess and search range.
    I see that there is an optional argument of SearchRange. What happens when no range is specified? What is the initial guess? Is it random or deterministic? Is there a way to control the initial guess?

  2. Stopping condition.
    In case I do not know the minimum of the function, and therefore cannot use the TargetFitness parameter, what is the stopping condition of the optimizer? I guess that if it gets to more than MaxSteps iterations, it stops. But is there a way to provide other conditions? Like the wanted tolerance of the fitness function?

  3. Keeping track of the results.
    Is there a way to log the attempted values by the optimizer? I am interested to see which regions in the parameter space were explored, and what was the value of the fitness function.

  4. Choosing an optimizer.
  4. I do not know how to choose an optimizer. I read the guide in the documentation, so I am using the recommended adaptive_de_rand_1_bin_radiuslimited and de_rand_1_bin DE optimizers, and I compare their results against random_search. Is there any further advice on how to choose between, or compare, the optimizers? (My problem has a dimension of O(100).)


I’m in the same boat and have a lot of similar questions, but I can also help with some answers maybe.

  1. It seems like the initial guess is randomly drawn from the search range, which defaults to (-1, 1) if no range is provided.

  3. I just modified my objective function to print its arguments, which seems to work fine.
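Expanding on the print-the-arguments idea with a concrete sketch: you can wrap the objective in a closure that records every evaluated candidate and its fitness. Only bboptimize and its standard keywords (SearchRange, NumDimensions, MaxSteps, TraceMode) come from BlackBoxOptim; the sphere objective and all other names are placeholders of mine:

```julia
using BlackBoxOptim

# Log of (candidate, fitness) pairs, filled in by the wrapped objective.
const evaluated = Vector{Tuple{Vector{Float64}, Float64}}()

function logged_fitness(x)
    f = sum(abs2, x)               # placeholder objective: sphere function
    push!(evaluated, (copy(x), f)) # copy, since the optimizer may reuse buffers
    return f
end

res = bboptimize(logged_fitness; SearchRange = (-2.0, 2.0), NumDimensions = 3,
                 MaxSteps = 2_000, TraceMode = :silent)
println("Logged ", length(evaluated), " evaluations; best fitness = ", best_fitness(res))
```

Afterwards, `evaluated` holds every point the optimizer tried, which you can inspect or plot to see which regions of the parameter space were explored.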

Thanks for the input!

Sorry for missing these questions earlier. Anyway:

  1. The default SearchRange is (-1.0, 1.0). Default sampling is Latin hypercube sampling, with the goal of a better “spread” over the search space. There is currently no easy way to provide an initial guess, but this is one of the most requested features, so we are working on it.

  2. MaxSteps and MaxTime are the most commonly used stopping conditions, but you can also use a callback function and explicitly stop based on your own logic. There is an example of this in the test suite; see the bottom of:

  3. There is logging and tracing, but they are not very flexible. If you want to log everything, you can either do it in the fitness function or use a callback function; see for example:

  4. The default choice (adaptive_de_rand_1_bin_radiuslimited) is very robust, particularly when your problem is large. You can also try dxnes (can be more effective on smaller problems) or generating_set_search (can be faster for “simpler” problems). The others rarely have an edge on any of these three in my experience, but YMMV. A new CMA-ES implementation, which I have high hopes for, is coming soon.
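One way to compare the optimizers named above is to run each on the same budget via the Method keyword; this is only a sketch, and the sphere objective plus the fixed budget are arbitrary choices of mine:

```julia
using BlackBoxOptim

sphere(x) = sum(abs2, x)  # arbitrary smooth test objective

# Run the three recommended methods with an identical evaluation budget.
for m in (:adaptive_de_rand_1_bin_radiuslimited, :dxnes, :generating_set_search)
    res = bboptimize(sphere; Method = m, SearchRange = (-5.0, 5.0),
                     NumDimensions = 10, MaxSteps = 5_000, TraceMode = :silent)
    println(rpad(string(m), 40), " => best fitness ", best_fitness(res))
end
```

For a real comparison you would substitute your own fitness function and repeat each run several times, since the methods are stochastic.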

Hope this helps!


Hi, I have another beginner’s question.

I have one version of BlackBoxOptim where using Tuples for the SearchRange
(as in the introductory Rosenbrock example) works, and another version where the use of Tuples for SearchRange is rejected.

I could not find any documentation clarifying how to specify ranges.
[Update: I figured out that the range bounds cannot be Integers - you have to add the decimal points! Tuples are OK now.]
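To illustrate the update above with a minimal sketch (the toy objective is mine): bounds written with decimal points work, e.g. as a vector of per-dimension Float64 tuples:

```julia
using BlackBoxOptim

f(x) = abs2(x[1] - 1.0) + abs2(x[2] + 2.0)  # arbitrary toy objective

# A vector of (lo, hi) Float64 tuples, one per dimension. Writing the bounds
# with Integers, e.g. (-5, 5), is what triggers the rejection mentioned above.
res = bboptimize(f; SearchRange = [(-5.0, 5.0), (-10.0, 0.0)],
                 MaxSteps = 5_000, TraceMode = :silent)
println(best_candidate(res))
```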

I am also new to BlackBoxOptim.jl.

Where can I find references for the optimizers?

Warmest Regards

Sorry for the late reply; vacation here…

Anyway, documentation is lacking but if you want to see which methods are implemented you can check:

Note, though, that the DE variants for single-objective problems and BORG for multi-objective problems have seen the most use and testing.

Hope this helps.

I have another question for @robertfeldt:
Is it possible to add an inequality constraint for two parameters, e.g., x[1]>x[2]?

These types of optimizers don’t support that out of the box, so the typical way to do it is either to add a large penalty for violating the constraint, or to use a multi-objective optimizer and make the second fitness the constraint-violation penalty. Some code to give you an idea (this is for the first option and can be adapted for the second):

const BasePenalty = 10000 # Select some large number that will dominate the "normal" fitness.

function penalty_constraint_x1_gt_x2(x)
    if x[1] < x[2]
        return BasePenalty + (x[2] - x[1]) # Smaller penalty as we get closer to non-violation of the constraint
    else
        return 0.0
    end
end

my_new_fitness(x) = my_normal_fitness(x) + penalty_constraint_x1_gt_x2(x)

res = bboptimize(my_new_fitness; ...)
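Filling in the sketch with a concrete, made-up my_normal_fitness gives a runnable, self-contained version; everything except bboptimize, best_candidate, and the standard keywords is my own placeholder:

```julia
using BlackBoxOptim

const BasePenalty = 10_000.0  # large enough to dominate the normal fitness

# Unconstrained optimum (2, 3) violates x[1] > x[2], so the penalty matters.
my_normal_fitness(x) = abs2(x[1] - 2.0) + abs2(x[2] - 3.0)

penalty_constraint_x1_gt_x2(x) = x[1] < x[2] ? BasePenalty + (x[2] - x[1]) : 0.0

my_new_fitness(x) = my_normal_fitness(x) + penalty_constraint_x1_gt_x2(x)

res = bboptimize(my_new_fitness; SearchRange = (-10.0, 10.0), NumDimensions = 2,
                 MaxSteps = 20_000, TraceMode = :silent)
xbest = best_candidate(res)
println(xbest, " satisfies x[1] >= x[2]: ", xbest[1] >= xbest[2])
```

With the penalty in place, the optimizer is pushed onto the boundary x[1] ≈ x[2] rather than the unconstrained optimum.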

Got it. Thanks!
