InfiniteOpt/supports violate the domain bounds

I want to solve a problem with randomly generated bounds using InfiniteOpt. However, if I don’t specify the number of supports for the parameter, I get an error:


If I specify the number of supports, the problem is solved.

The code:

using JuMP, InfiniteOpt, Ipopt, Random
rng = MersenneTwister(0)
Random.seed!(rng, 0)
aR = 0
bR = 1
dR = (bR - aR) .* rand(rng, 1, 2) .+ aR
model = InfiniteModel(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)

# @infinite_parameter(model, z1 in [minimum(dR), maximum(dR)],num_supports = 10) ######Works
@infinite_parameter(model, z1 in [minimum(dR), maximum(dR)]) ######Doesn't work

@objective(model,Max,x[1]+x[2])
@constraint(model,z1*x[1]+1*x[2]<=2.0)
status=optimize!(model)

The same happens if I hard-code the random bounds obtained above:

using JuMP, InfiniteOpt, Ipopt, Random
dRNotRAndom = [0.8236475079774124, 0.9103565379264364] ###obtained above
model = InfiniteModel(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)

@infinite_parameter(model, z1 in [minimum(dRNotRAndom), maximum(dRNotRAndom)],num_supports = 10) ###Works
# @infinite_parameter(model, z1 in [minimum(dRNotRAndom), maximum(dRNotRAndom)]) ###Doesn't work

@objective(model,Max,x[1]+x[2])
@constraint(model,z1*x[1]+1*x[2]<=2.0)
@show [minimum(dRNotRAndom), maximum(dRNotRAndom)]
status=optimize!(model)

But rounding the bounds to fewer digits works:

using JuMP, InfiniteOpt, Ipopt, Random
dRNotRAndomApp = [0.82364751, 0.91035654]                 #####Approximated
model = InfiniteModel(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)

# @infinite_parameter(model, z1 in [minimum(dRNotRAndomApp), maximum(dRNotRAndomApp)],num_supports = 10) ###Works
@infinite_parameter(model, z1 in [minimum(dRNotRAndomApp), maximum(dRNotRAndomApp)]) ###Works

@objective(model,Max,x[1]+x[2])
@constraint(model,z1*x[1]+1*x[2]<=2.0)
@show [minimum(dRNotRAndomApp), maximum(dRNotRAndomApp)]
status=optimize!(model)

Not sure if this is a bug or if I am doing something wrong.

Hi @blob, this is indeed a bug.

The problem is ultimately solved via discretization, and all supports (i.e., discretization points) are rounded to a certain number of significant digits (the default is 12). The bug is that specifying domain bounds with more significant digits than that can lead to generated support points that violate the bounds.
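To illustrate with the bounds from your example: rounding the lower bound itself to 12 significant digits already yields a value strictly below that bound, so a support generated at (or rounded toward) the boundary can land outside the domain:

```julia
lb = 0.8236475079774124   # lower bound with 16 significant digits

# Round to 12 significant digits, as is done for supports by default
rounded = round(lb, sigdigits = 12)   # 0.823647507977

# The rounded value falls below the original bound, i.e., it violates the domain
println(rounded < lb)   # true
```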

This will be fixed by pull request #344 on infiniteopt/InfiniteOpt.jl ("Fix SigFig Bug" by pulsipher), which will round the domain bounds to use the same number of sigfigs.

In the meantime, you can work around this by specifying a higher number of sigfigs via the sig_figs argument when defining your infinite parameter:

@infinite_parameter(model, z1 in [minimum(dRNotRAndom), maximum(dRNotRAndom)], sig_figs = 12)

Or you can first round your bounds, as you already tried. For a given infinite parameter, you can check its sigfigs via the significant_digits function.

With respect to your problem, it appears you are solving a robust optimization problem. It is worth noting that InfiniteOpt has limited support for these types of problems and will only solve them via sample average approximation (i.e., enforcing that the constraints hold at each discretization point). Hence, you should control which support points are used rather than relying on the default behavior, which just generates 10 support points.
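As a sketch of what controlling the supports could look like (using the rounded bounds from your last snippet; the uniform 10-point grid here is just an illustrative choice, not a recommendation):

```julia
using JuMP, InfiniteOpt, Ipopt

model = InfiniteModel(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)

lb, ub = 0.82364751, 0.91035654   # rounded bounds from the example above

# Explicitly choose the discretization points instead of relying on the
# default support generation; here a uniform 10-point grid over the domain:
@infinite_parameter(model, z1 in [lb, ub],
                    supports = collect(range(lb, ub, length = 10)))

@objective(model, Max, x[1] + x[2])
@constraint(model, z1 * x[1] + x[2] <= 2.0)
optimize!(model)
```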

Also, one small note: the assignment in status=optimize!(model) is unnecessary, since optimize! always returns nothing. You can check the solution status via status = termination_status(model) (after optimize! has been called).
