Randomness in initialization

Hi,

I have a simple model which I know has multiple optimal solutions. When I start Ipopt to optimize this model, I found out that every time the optimizer gives me different results. The question is: is there a way to control this randomness? Thanks!
Here are two consecutive runs as an example: the first run stops after 950 iterations and gives the messages below; the second run, however, already stops after 752 iterations.
[screenshots of the two Ipopt iteration logs]

Moreover, I also encounter even stranger randomness: sometimes Ipopt can solve the problem, and sometimes it reports infeasible…


Assuming that the code uses Julia’s random number generator (RNG), you should be able to set a specific sequence with

using Random
Random.seed!(1234)

(or any other favorite number).

It looks like Ipopt is a wrapper around a C library, so this may not work if they don’t use Julia’s RNG.

thanks awasserman. I tried, but it doesn’t work… I also tried to set
VariablePrimalStart() = nothing
ConstraintPrimalStart() = nothing
ConstraintDualStart() = nothing
NLPBlockDualStart() = nothing
None of them helps…

I found out that every time the optimizer gives me different results.

This should not happen. Can you provide a minimal working example?

You set start values with set_start_value, or the start = keyword.
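For illustration, a minimal sketch of both ways to set start values in JuMP (the variable names here are made up):

```julia
using JuMP

model = Model()

# Option 1: the `start =` keyword at variable creation.
@variable(model, x >= 0, start = 1.0)

# Option 2: `set_start_value` after the variable exists.
@variable(model, y >= 0)
set_start_value(y, 2.0)

start_value(x)  # 1.0
start_value(y)  # 2.0
```

Either way, the solver receives an explicit primal start instead of its own default.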

hi odow, thanks. I’ll need some time to prepare one.

I somehow know why it is like that. Because I don’t use either start or set_start_value for any of my variables. My problem should have multiple optimal results (even global). I originally want to understand how JuMP initializes my problem. Then I found out that it embeds some random generation processes, which I cannot decipher.

I somehow know why it is like that

Are you calling rand? You cannot have randomness in your constraints, including in user-defined functions.
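To illustrate the point about rand, here is a hedged sketch (the function and names are hypothetical, not from the thread): if a model coefficient comes from rand(), every build produces a different model, and seeding the RNG is what makes it reproducible.

```julia
using JuMP, Random

# Anti-pattern: rand() inside the model build makes each run a
# different optimization problem. Seeding the RNG removes this.
function build_model(; seed = nothing)
    seed === nothing || Random.seed!(seed)  # fix the RNG if a seed is given
    model = Model()
    @variable(model, x >= 0)
    a = rand()                       # random coefficient: differs per build
    @constraint(model, con, a * x <= 1)
    return model, a
end

# Two builds with the same seed produce identical coefficients.
_, a1 = build_model(seed = 1234)
_, a2 = build_model(seed = 1234)
```

Unseeded calls would almost surely give different coefficients, and hence different solutions.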

My problem should have multiple optimal results (even global)

This is not a problem. Ipopt will find you a locally optimal solution.

I originally want to understand how JuMP initializes my problem.

JuMP sets the start value of variables to 0, or projects them onto their bounds if 0 is not feasible.

Then I found out that it embeds some random generation processes

This is not correct. Repeated runs of the same model should find the same solution.

I just took another look at your screenshots. The dual infeasibility error is far too large for Ipopt to be reporting a locally optimal solution.

Are you setting a time limit? What is the rest of the log? What is termination_status(model)?

I would guess that the acceptable_tol parameter or related has been set to a custom value?
https://coin-or.github.io/Ipopt/OPTIONS.html#OPT_acceptable_tol
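For reference, a minimal sketch of how that option is set from JuMP, assuming the Ipopt.jl wrapper (1e-6 is Ipopt's documented default for acceptable_tol):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# Loosening acceptable_tol lets Ipopt stop early at a merely
# "acceptable" point, which can differ between runs that take
# slightly different paths. Resetting it to the default (1e-6)
# restores the stricter convergence test.
set_optimizer_attribute(model, "acceptable_tol", 1e-6)
```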

yes, jd-foster, exactly, the acceptable_tol is changed.

yes, odow. That’s another problem of my model, which is not easy to initialize :(
Yesterday I tried to build a minimal example for you, then I found something interesting:

In my original model, I use a function _add_node! to build up the model:
function _add_node!(model::JuMP.Model, load::AbstractLoad)::Nothing
    key = :nloads
    if !(key in keys(object_dictionary(model)))
        id = model[key] = 1
    else
        id = model[key] += 1
    end
    T = @variable(model, base_name = "T$id", lower_bound = T_min, upper_bound = T_max, start = T_init)
    # ... more variables and constraints ...
    return nothing
end

In the light version (hard-coded), it looks like:
T1 = @variable(model, base_name = "T1", lower_bound = T_min, upper_bound = T_max, start = T_init)
T2 = @variable(model, base_name = "T2", lower_bound = T_min, upper_bound = T_max, start = T_init)
and so on…

Then I found out that my light version doesn’t create randomness any more!
That means it has something to do with my original model formulation. I’ll study it first before I report new findings.

Do you already have some clues, what could go wrong? Thanks in advance!

I’m not aware of anything in Ipopt’s algorithm that’s non-deterministic, which means that running the same problem (same data inputs) on the same machine should give you the same results.

If you see some non-determinism, I would look outside Ipopt’s core code, which could include:

  • The linear solver you’re using (default is Mumps, but users can use another linear solver), especially if you use multiple cores
  • The exact input that’s passed to Ipopt. For instance, permuting the variables gives an equivalent problem but it may yield a different solution. Something to look out for is whether something is using non-ordered dictionaries (last time I checked, iterating over the key-value pairs of a Dict is done in a non-deterministic fashion).
    One way to check this is to export the Ipopt model to a file, then call Ipopt directly on that file (without going through JuMP). This will also rule out anything that’s in the Julia part of your code.
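As a small pure-Julia illustration of the Dict point (a sketch, not your model): iterating a Dict follows the hash table, not insertion order, so building constraints with `for (k, v) in d` can permute the variables between otherwise identical models. Iterating the keys in sorted order is one deterministic alternative.

```julia
# Dict iteration order follows the internal hash table, not the
# order in which pairs were inserted.
d = Dict(:b => 2, :c => 3, :a => 1)

# Deterministic alternative: visit keys in sorted order.
ordered = [(k, d[k]) for k in sort!(collect(keys(d)))]
# ordered == [(:a, 1), (:b, 2), (:c, 3)]
```

The same idea applies to any loop that adds variables or constraints to the model: fix the iteration order explicitly and the generated problem is the same on every run.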