Equivalence to add_nonlinear_constraint in new JuMP version

Hi,

I have used the functions add_nonlinear_constraint / add_nonlinear_expression / add_nonlinear_objective in the old version of JuMP to add programmatically generated Expr objects to models. Are these functions still available in the newer versions of JuMP (v1.15 and later), or is there an equivalent? Thank you.

There is no equivalent to the old functions. You shouldn’t need them.

What are you trying to achieve?

Hi @odow,
Thanks for responding. I need to repeatedly solve an optimization problem as part of my genetic programming workflow. The problem has fixed parameters, variables, and objective function. It has a few fixed constraints, while the other constraints change each time. The constraints that change are generated programmatically, so I cannot control their form or declare them manually.
Previously I used add_nonlinear_constraint() together with delete() for this purpose, but that is no longer possible with the newer versions.
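
Roughly, the loop I have in mind looks like this (a simplified sketch; `make_constraints` and the dummy bound it adds are just placeholders for my genetic-programming generator):

```julia
using JuMP

# `make_constraints` stands in for the genetic-programming generator;
# here it just returns a single dummy bound.
make_constraints(model, x, g) = [@constraint(model, x >= g)]

function evolve!(model, x; generations = 3)
    changing = ConstraintRef[]
    for g in 1:generations
        for c in changing           # drop last generation's constraints
            delete(model, c)
        end
        changing = make_constraints(model, x, g)
        # optimize!(model)          # solve with the updated constraint set
    end
    return changing
end

model = Model()
@variable(model, x)
@objective(model, Min, x^2)   # fixed objective
@constraint(model, x <= 10)   # fixed constraint
cs = evolve!(model, x)
```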

Can you provide a reproducible example of what you are trying to do?

The @NL interface is now legacy.

The new nonlinear interface has first-class support for nonlinear expressions: Nonlinear Modeling · JuMP, and you can, for example, delete a nonlinear constraint.

You should not need to use the Base.Expr input syntax.
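
For example, a minimal sketch (variable names are arbitrary; attach any nonlinear-capable solver when you actually optimize):

```julia
using JuMP

model = Model()
@variable(model, x, start = 1.0)
@variable(model, y, start = 2.0)
# Nonlinear expressions can be written directly in @constraint:
c = @constraint(model, x == min(x, y) * sin(y))
# ...and deleted later, just like any other constraint:
delete(model, c)
@assert !is_valid(model, c)
```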

Sorry for my late response. Here is an example:

using JuMP, Ipopt
model = Model(Ipopt.Optimizer);
# set_silent(model)
@variable(model, ir1, start = 3.0)
@variable(model, ir2, start = 3.0)
@variable(model, ir3, start = 3.0)
@variable(model, SP, start = 5.0)
@variable(model, yA, start = 1.0)
@variable(model, yF, start = 2.0)

fix(yA, 1.0)
fix(yF, 2.0)
@constraint(model, SP*yA == yF)

@constraint(model, constr1, ir2 == min(ir1, yA*SP))
@constraint(model, constr2, ir2 == yF)
@constraint(model, constr3, ir3 == max(ir1, yF))
@constraint(model, constr4, ir3 == yA*SP)

optimize!(model)

In this example, the four constraints (constr1 to constr4) are generated programmatically in the form of Expr objects. For example, constr1 would originally be :(ir2 == min(ir1, yA*SP)). I have no prior knowledge of what these Expr objects will look like before they are actually generated, so it is not possible to declare them in advance and then manipulate some coefficients.

the four constraints (constr1 to constr4) are generated programmatically in the form of Expr objects

Can you provide a reproducible example of how this is done?

Here is a conceptual example of how I do it. I hope it is sufficient to illustrate the process.

using JuMP, Ipopt, Random
model = Model(Ipopt.Optimizer);
@variable(model, ir1, start = 3.0)
@variable(model, ir2, start = 3.0)
@variable(model, ir3, start = 3.0)
@variable(model, SP, start = 5.0)
@variable(model, yA, start = 173.0)
@variable(model, yF, start = 2.0)

fix(yA, 1.0)
fix(yF, 2.0)

ir1 = model[:ir1]; ir2 = model[:ir2]; ir3 = model[:ir3]
SP = model[:SP]; yA = model[:yA]; yF = model[:yF]

set_LHS = [ir1, ir2, ir3];
set_RHS = [ir1, ir2, ir3, SP, yA, yF]
set_func = [:+, :-, :*, :max, :min]

function make_RHS(set_RHS, set_func)
      return Expr(:call, rand(set_func), rand(set_RHS), rand(set_RHS))
end

function make_constr(set_LHS, set_RHS, set_func)
      rhs = make_RHS(set_RHS, set_func)
      return Expr(:call, :(==), rand(set_LHS), rhs)
end

constr1 = make_constr(set_LHS, set_RHS, set_func)

You don’t need to use Expr any more. Just do:

julia> using JuMP

julia> begin
           model = Model()
           @variable(model, ir1, start = 3.0)
           @variable(model, ir2, start = 3.0)
           @variable(model, ir3, start = 3.0)
           @variable(model, SP, start = 5.0)
           @variable(model, yA == 1.0)
           @variable(model, yF == 2.0)
           set_LHS = [ir1, ir2, ir3];
           set_RHS = [ir1, ir2, ir3, SP, yA, yF]
           set_func = Any[+, -, *, max, min]
           function make_RHS(set_RHS, set_func)
               return rand(set_func)(rand(set_RHS), rand(set_RHS))
           end
           function make_constr(set_LHS, set_RHS, set_func)
               return @constraint(model, rand(set_LHS) == make_RHS(set_RHS, set_func))
           end
       end
make_constr (generic function with 1 method)

julia> constr1 = make_constr(set_LHS, set_RHS, set_func)
-SP*ir3 + ir2 = 0

julia> constr2 = make_constr(set_LHS, set_RHS, set_func)
-yF*ir3 + ir3 = 0

julia> constr3 = make_constr(set_LHS, set_RHS, set_func)
ir1 - min(ir3, yF) = 0

julia> constr4 = make_constr(set_LHS, set_RHS, set_func)
ir3 - min(ir1, yA) = 0

julia> constr5 = make_constr(set_LHS, set_RHS, set_func)
ir2 - max(ir2, SP) = 0

julia> print(model)
Feasibility
Subject to
 ir1 - min(ir3, yF) = 0
 ir3 - min(ir1, yA) = 0
 ir2 - max(ir2, SP) = 0
 -SP*ir3 + ir2 = 0
 -yF*ir3 + ir3 = 0
 yA = 1
 yF = 2

@odow: Thanks a lot! I will try this. Could you also show me how to handle a more deeply nested constraint, to an arbitrary depth?
For example, if I have this Vector{Expr}, which is basically generated programmatically in a similar way to the example above:

[:(var = yA + yA), :(var = var * SP), :(var = var - yF)]

I have managed to transform this Vector{Expr} into a single Expr: :(var = (yA + yA)*SP - yF). From this Expr, I can easily build the constraint that I really want, which is :((yA + yA)*SP - yF == 0), but I have no idea how to add it to the model.

You can do something like:

julia> function make_RHS2(set_RHS, set_func, depth)
           rhs = rand(set_RHS)
           for _ in 1:depth
               fi = rand(set_func)
               rhs = fi(rhs, rand(set_RHS))
           end 
           return rhs    
       end
make_RHS2 (generic function with 1 method)

julia> make_RHS2(set_RHS, set_func, 3)
2 ir3 + ir2 - yF

julia> make_RHS2(set_RHS, set_func, 3)
max(max(ir3, yA) + yA, SP)

julia> make_RHS2(set_RHS, set_func, 3)
ir3*ir1 + ir2 + yF

julia> make_RHS2(set_RHS, set_func, 3)
max(min(max(ir2, yA), yA), yA)

The main takeaway is that you do not need to use Expr. Just build the function as if it contained Float64 data. Use loops, higher-order functions, etc.
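
For instance, the chain of assignments from your earlier post (var = yA + yA, then var = var * SP, then var = var - yF) can be built with an ordinary foldl over (operator, operand) pairs, with no Expr manipulation (sketch):

```julia
using JuMP

model = Model()
@variable(model, SP, start = 5.0)
@variable(model, yA == 1.0)
@variable(model, yF == 2.0)

# Each assignment step becomes an (operator, operand) pair; folding the
# pairs applies one step at a time, to arbitrary depth.
steps = [(+, yA), (*, SP), (-, yF)]
rhs = foldl((acc, (f, v)) -> f(acc, v), steps; init = yA)
# rhs is now the JuMP expression (yA + yA) * SP - yF
c = @constraint(model, rhs == 0)
```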


Thanks, @odow! I improvised a bit and managed to do what I wanted.
