MethodError: no method matching ^(::VariableRef, ::Vector{VariableRef})

Take another look at the subscripts in your math formulation: you have r_i in the big constraint, as well as i in I and i in J_i. Those mismatched indices are why JuMP ends up trying to raise a scalar variable to a vector of variables, which is exactly what the MethodError is telling you.
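You can reproduce the same shape of error in plain Julia, independent of JuMP: `^` has no method between a scalar and a vector. This is a minimal sketch of the mechanism, not your exact model:

```julia
# Raising a scalar to a Vector power has no method in Julia, which is
# the same failure JuMP reports for ^(::VariableRef, ::Vector{VariableRef}):
try
    2 ^ [1, 2]
catch err
    println(err isa MethodError)   # true: no method matching ^(::Int64, ::Vector{Int64})
end

# Broadcasting with .^ applies the power elementwise instead:
println(2 .^ [1, 2])               # [2, 4]
```

In your case the fix is not broadcasting but indexing consistently, so that a scalar is raised to a scalar.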

You should also think a lot more about your formulation.

  • What does (x_j)^x mean if x_j is either 0 or 1? The power has no effect (0 raised to any positive power is 0, and 1 raised to any power is 1), so you can just remove it.
  • Is r_i really a variable or is it data? Can’t r_i just take arbitrarily negative values to reduce R? Is r_i >= 0?
  • Does it make sense for x and y to have upper bounds of 1? I don’t think they’d ever take a value outside [0, 1], since the other terms inside the squares are either 0 or 1.
  • Why do you even need the “for all j in J”? There’s no j-indexed data, so every constraint is identical.
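On the first bullet: a binary value is a fixed point of any positive power, which is easy to check directly:

```julia
# 0^p == 0 and 1^p == 1 for any positive power p, so for a binary x_j
# the exponent in (x_j)^x adds nothing to the model:
for b in (0, 1), p in (1, 2, 5)
    @assert b^p == b
end
println("powers of a 0/1 value are unchanged")
```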

Ignoring the formulation issues, here’s how you could write it in JuMP, but note that you likely won’t get the answer you’re expecting because the formulation doesn’t make sense:

using JuMP
import Juniper
import Ipopt
n = 5
model = Model(
    optimizer_with_attributes(
        Juniper.Optimizer,
        "nl_solver" => optimizer_with_attributes(Ipopt.Optimizer, MOI.Silent() => true),
    ),
)
@variable(model, 0 <= x <= 1)
@variable(model, 0 <= y <= 1)
@variable(model, R)
@variable(model, r[1:n] >= 0)
@variable(model, x_j[j=1:n], Bin, start = j == 1 ? 1.0 : 0.0)
@objective(model, Min, R)
@constraint(model, sum(x_j[j] for j in 1:n) >= 1)
@NLconstraint(
    model,
    [j=1:n],
    (sqrt((x_j[j] - x)^2 + (x_j[j] - y)^2) + r[j]) * x_j[j] <= R,
)
optimize!(model)