Issue with JuMP's Parameter making the problem recognized as nonlinear

Hello!

I am working on an algorithm where I need to solve a convex program (CP) repeatedly; the CP's structure remains the same across iterations (same variables, same number of constraints, etc.), but the coefficients in the constraints get updated. To automate the modification of the JuMP.Model (without having to reinstantiate one at each iteration), I am trying to use Parameters.
However, the issue I am having is that any nonlinear operation on a Parameter is treated as a nonlinearity of the optimization problem itself (even though the Parameter is meant to be a fixed value, not an optimization variable), making the resulting model appear as though it is not a CP.

Please see below a minimal example that I would like to resolve:

using ECOS
using JuMP

model = Model(ECOS.Optimizer)
@variable(model, x[1:2] >= 0)
@variable(model, θ in Parameter(0))         # initialize storage for parameter
set_parameter_value(θ, 0.5)                 # parameter value would be updated by outer loop

@constraint(model, x[1] + x[2] <= cos(θ))   # problematic due to nonlinear operation!

@objective(model, Min, 2x[1] + 3x[2])

optimize!(model)
@show objective_value(model)

for which I get the error

ERROR: LoadError: Constraints of type MathOptInterface.ScalarNonlinearFunction-in-MathOptInterface.LessThan{Float64} are not supported by the solver.

If you expected the solver to support your problem, you may have an error in your formulation. Otherwise, consider using a different solver.

The list of available solvers, along with the problem types they support, is available at https://jump.dev/JuMP.jl/stable/installation/#Supported-solvers.

Is there something that I am missing about how I should be using Parameter…?
If not (and if this is unavoidable), what would be a good way to update the value of θ in the above code without having to recreate the model or overwrite the constraint?

I’d appreciate any help/suggestion, thank you!

Hi @Yuricst, correct, we don’t detect that a nonlinear operation on a parameter is in fact a constant.

See this part of the JuMP documentation: Constraints · JuMP

For your code, it would look something like (I swapped <= to >= to yield different solutions):

julia> using ECOS

julia> using JuMP

julia> begin  # Option 1
           model = Model(ECOS.Optimizer)
           set_silent(model)
           @variable(model, x[1:2] >= 0)
           constraint = @constraint(model, x[1] + x[2] >= 0.0)
           @objective(model, Min, 2x[1] + 3x[2])
           for rad in [0, 0.5, 1]
               set_normalized_rhs(constraint, cos(rad))
               optimize!(model)
               @assert is_solved_and_feasible(model)
               @show objective_value(model)
           end
       end
objective_value(model) = 1.999999997831738
objective_value(model) = 1.7551651219056938
objective_value(model) = 1.0806046105119194

julia> begin  # Option 2
           model = Model(ECOS.Optimizer)
           set_silent(model)
           @variable(model, x[1:2] >= 0)
           @variable(model, cos_θ == cos(0))
           @constraint(model, x[1] + x[2] >= cos_θ)
           @objective(model, Min, 2x[1] + 3x[2])
           for rad in [0, 0.5, 1]
               fix(cos_θ, cos(rad); force = true)
               optimize!(model)
               @assert is_solved_and_feasible(model)
               @show objective_value(model)
           end
       end
objective_value(model) = 1.999999997831738
objective_value(model) = 1.7551651219056938
objective_value(model) = 1.0806046105119194

Thank you for your reply!
I see, for now I am resorting to using set_normalized_rhs().

Concerning Option 2 in your code, I imagine using fix(cos_θ, cos(rad); force = true) still leaves cos_θ as a variable as far as the model is concerned?
I am asking because of a scenario where I might have, for example, a constraint like

@constraint(model, x[1] * sin(θ) + x[2] * cos(θ) <= 0)

where I imagine even if I did

@variable(model, cos_θ == cos(0))
@variable(model, sin_θ == sin(0))

@constraint(model, x[1] * sin_θ + x[2] * cos_θ <= 0)

then

fix(cos_θ, cos(rad); force = true)
fix(sin_θ, sin(rad); force = true)

this would still be recognized as a quadratic constraint…?

Correct. We don’t substitute parameters into expressions and reduce them from nonlinear to linear/quadratic or quadratic to linear.

But you can use set_normalized_coefficient to modify the coefficient of a variable.
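For the constraint you describe, a minimal sketch of this could look like the following (same x and objective as before; I swapped the constraint to x[1] * sin(θ) + x[2] * cos(θ) >= 1 purely so that the example stays feasible and non-trivial):

using ECOS
using JuMP

model = Model(ECOS.Optimizer)
set_silent(model)
@variable(model, x[1:2] >= 0)
# Build the constraint once, using the coefficients for θ = 0
con = @constraint(model, sin(0.0) * x[1] + cos(0.0) * x[2] >= 1)
@objective(model, Min, 2x[1] + 3x[2])
for rad in [0, 0.5, 1]
    # Overwrite the coefficients of x[1] and x[2] in the existing constraint
    set_normalized_coefficient(con, x[1], sin(rad))
    set_normalized_coefficient(con, x[2], cos(rad))
    optimize!(model)
    @assert is_solved_and_feasible(model)
    @show objective_value(model)
end

This keeps the constraint linear in x, so the model stays a CP; only the stored coefficients change between solves.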