# Constraint MINLP problem

Hello. I was wondering if someone could clarify something for me. I am having trouble with a model in JuMP: a MINLP problem with the following objective function and constraints.

``````@NLobjective(model,Min,sum(sum((c_00[i]+c_10[i]*x[i,t] + c_01[i]*n[i,t]+ c_20[i]*(x[i,t])^2 +c_11[i]*x[i,t]*n[i,t] + c_02[i]*(n[i,t])^2) +y[i,t] for i=1:nus)+μ*(z[t])^2 for t=1:nd))

@variable(model,x[i=1:nus,t=1:nd]≥0)
@variable(model, n[i=1:nus, t=0:nd],Int)

for i=1:nus
fix(n[i, 0], 3)
end
@variable(model,y[i=1:nus,t=1:nd]≥0,Int)
@variable(model,z[t=1:nd]≥0)

@constraint(model,restri[i=1:nus,t=1:nd],y[i,t]≥n[i,t]-n[i,t-1])   # (1)
@constraint(model,restric[i=1:nus,t=1:nd],y[i,t]≥n[i,t-1]-n[i,t])  # (2)
@constraint(model,lim_nummaq[i=1:nus,t=1:nd],nmin[i,t]≤n[i,t]≤nmax[i,t])  # (3)
@constraint(model,lim_geracao[i=1:nus,t=1:nd],xmin[i,t].*n[i,t]≤x[i,t]≤xmax[i,t].*n[i,t])  # (4)
@constraint(model,demanda[t=1:nd],sum(x[i,t] for i=1:nus)+z[t]==d[t])  # (5)
@constraint(model,meta_geracao[i=1:nus],sum(x[i,t] for t=1:nd)==metas[i]*24)  # (6)
``````

Constraint (4) depends on the variable `n[i,t]`, with `xmin` and `xmax` being lower and upper bounds respectively. When I run the model I get the following message: `n[i,t] expected to be a number`. How do I write this constraint correctly? The bounds `xmin` and `xmax` must depend on `n[i,t]`.

Hi there. Please read "Please read: make it easier to help you" and provide a minimal reproducible example. To be able to help, we need to be able to run your code. In particular, I don't know the data such as `xmin` and `xmax`.

``````using JuMP,AmplNLWriter

us=["Chavante","Capivara"]
d=[1080 864 702 621 594 810 972 1080 1161 1242 1296 1188 1350 1512 1485 1458 1215 1053 999 1053 1107 945 864 783]
metas=[220,320]
xmin=[30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30; 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50]
xmax=[440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440;640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640]
nmin=[1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1;1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1]
nmax=[4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4;4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4]
cpp=[3*110,3*160]
cp=3
c_00=[0.1959,-0.006446]
c_10=[1.369,2.227]
c_01=[12.96,38.23]
c_20=[0.001165,0.002172]
c_11=[-0.142,-0.4079]
c_02=[4.353,18.59]
μ=0.2

model = Model(
() -> AmplNLWriter.Optimizer(
)
)

nus=length(us)
nd=length(d)

@NLobjective(model,Min,sum(sum(cp*(c_00[i]+c_10[i]*x[i,t] + c_01[i]*n[i,t]+ c_20[i]*(x[i,t])^2 +c_11[i]*x[i,t]*n[i,t] + c_02[i]*(n[i,t])^2)+cpp[i]*y[i,t] for i=1:nus)+μ*(z[t])^2 for t=1:nd))

@variable(model,x[i=1:nus,t=1:nd]≥0)
@variable(model, n[i=1:nus, t=0:nd],Int)
for i=1:nus
fix(n[i, 0], 3)
end
@variable(model,y[i=1:nus,t=1:nd]≥0,Int)
@variable(model,z[t=1:nd]≥0)

@constraint(model,restri[i=1:nus,t=1:nd],y[i,t]≥n[i,t]-n[i,t-1])  # (1)
@constraint(model,restric[i=1:nus,t=1:nd],y[i,t]≥n[i,t-1]-n[i,t])  # (2)
@constraint(model,lim_nummaq[i=1:nus,t=1:nd],nmin[i,t]≤n[i,t]≤nmax[i,t])  # (3)
@constraint(model,lim_geracao[i=1:nus,t=1:nd],xmin[i,t].*n[i,t]≤x[i,t]≤xmax[i,t].*n[i,t])  # (4)
@constraint(model,demanda[t=1:nd],sum(x[i,t] for i=1:nus)+z[t]==d[t])  # (5)
@constraint(model,meta_geracao[i=1:nus],sum(x[i,t] for t=1:nd)==metas[i]*24)  # (6)
``````

Like this?

1. You need to specify the `@NLobjective` after declaring your variables.
2. You need to split constraint (4) into two constraints, one for each side of the inequality. We used to throw a nicer error message. This is a regression in the latest release. (Edit: issue https://github.com/jump-dev/JuMP.jl/issues/2599)
3. You can use `Bonmin_jll` instead of `bonmin.exe`.

Here’s how I would write your code:

``````using JuMP
import AmplNLWriter
import Bonmin_jll

us = ["Chavante","Capivara"]
d = [
1080 864 702 621 594 810 972 1080 1161 1242 1296 1188 1350 1512 1485 1458 1215 1053 999 1053 1107 945 864 783
]
metas = [220,320]
xmin = [
30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30 30
50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50
]
xmax = [
440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440 440
640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640 640
]
nmin = [
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
]
nmax = [
4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
]
cpp = [3 * 110, 3 * 160]
cp = 3
c_00 = [0.1959, -0.006446]
c_10 = [1.369, 2.227]
c_01 = [12.96, 38.23]
c_20 = [0.001165, 0.002172]
c_11 = [-0.142, -0.4079]
c_02 = [4.353, 18.59]
μ = 0.2

model = Model() do
AmplNLWriter.Optimizer(Bonmin_jll.amplexe)
end

nus = length(us)
nd = length(d)

@variables(model, begin
x[i=1:nus, t=1:nd] >= 0
n[i=1:nus, t=0:nd], Int
y[i=1:nus, t=1:nd] >= 0, Int
z[t=1:nd] >= 0
end)
for i in 1:nus
fix(n[i, 0], 3)
for t in 1:nd
set_lower_bound(n[i, t], nmin[i, t])
set_upper_bound(n[i, t], nmax[i, t])
end
end
@NLobjective(
model,
Min,
sum(
sum(
cp * (
c_00[i] +
c_10[i] * x[i, t] +
c_01[i] * n[i, t] +
c_20[i] * x[i, t]^2 +
c_11[i] * x[i, t] * n[i, t] +
c_02[i] * n[i, t]^2
) +
cpp[i] * y[i, t]
for i = 1:nus
) + μ * z[t]^2
for t = 1:nd
)
)
@constraints(model, begin
[i=1:nus, t=1:nd], y[i, t] >= n[i, t] - n[i, t-1]
[i=1:nus, t=1:nd], y[i, t] >= n[i, t-1] - n[i, t]
[i=1:nus, t=1:nd], xmin[i, t] * n[i, t] <= x[i, t]
[i=1:nus, t=1:nd], x[i, t] <= xmax[i, t] * n[i, t]
[t=1:nd], sum(x[:, t]) + z[t] == d[t]
[i=1:nus], sum(x[i, :]) == 24 * metas[i]
end)
``````

Thanks for your help. Is there a difference between `Bonmin_jll` and `bonmin.exe`? I ran the code with `bonmin.exe`.

> Is there a difference between Bonmin_jll and bonmin.exe?

No. But you can install Bonmin_jll via the Julia package manager instead of having to keep a copy of the `.exe` around and remember the path.
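For reference, the setup looks something like this (a minimal sketch; `Bonmin_jll.amplexe` is the path to the bundled Bonmin binary exported by the package):

``````import Pkg
Pkg.add(["JuMP", "AmplNLWriter", "Bonmin_jll"])  # one-time install

using JuMP
import AmplNLWriter
import Bonmin_jll

# Bonmin_jll.amplexe points at the bundled solver binary,
# so there is no separate .exe path to keep track of:
model = Model(() -> AmplNLWriter.Optimizer(Bonmin_jll.amplexe))
``````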
