[JuMP] Updating NLparameter of a model in a loop

copied from JuMP Gitter channel

I have a question about solving a nonlinear JuMP model in a loop while updating some nonlinear parameters of the objective function:

using JuMP, Ipopt

function setup_model()
    model = JuMP.Model(with_optimizer(Ipopt.Optimizer))
    @variable(model, x[1:2])
    # I want to change this parameter in an (MPC) loop later
    @NLparameter(model, x_ref[1:2] == 0)

    function my_obj(x...)
        return (x[1] - x_ref[1])^3 + (x[2] - x_ref[2])^2
    end

    function ∇my_obj(g, x...)
        g[1] = 3 * (x[1] - x_ref[1])^2
        g[2] = 2 * (x[2] - x_ref[2])
        return
    end

    JuMP.register(model, :my_obj, 2, my_obj, ∇my_obj)

    @NLobjective(model, Min, my_obj(x...))

    return model
end

function run_in_loop(model::JuMP.Model, new_ref)
    # update NLparameter x_ref here, how?
    # x_ref <-- new_ref
    JuMP.optimize!(model)
    return JuMP.value.(model[:x])
end

model = setup_model()

for i = 1:5
    new_ref = rand(2)
    x = run_in_loop(model, new_ref)
end

My problem looks similar to this simplified (stupid) example. Basically I want to solve the model in each iteration with updated parameters x_ref. I have two questions about that:

  1. How do I access the NLparameter in another function when I have only the model? model[:x_ref] doesn’t work.
  2. When trying to solve a problem with such an objective function I get the error:
ERROR: LoadError: MethodError: no method matching -(::Float64, ::NonlinearParameter)
Closest candidates are:
  -(::Float64, ::Float64) at float.jl:403
  -(::Float64) at float.jl:393
  -(::Real, ::Complex{Bool}) at complex.jl:300
 [1] (::var"#my_obj#501"{Array{NonlinearParameter,1}})(::Float64, ::Vararg{Float64,N} where N) at /Users/Micha/Dropbox/Research/4YP_Linh/question.jl:11
 [2] (::JuMP.var"#96#99"{var"#my_obj#501"{Array{NonlinearParameter,1}}})(::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}) at /Users/Micha/.julia/packages/JuMP/MsUSY/src/nlp.jl:1177

Hmm. I don’t know if NL parameters work in user-defined functions. It would be useful to update the docs for this.

If you can build the objective programmatically, you might be better off using https://www.juliaopt.org/JuMP.jl/stable/nlp/#Raw-expression-input-1
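For reference, raw expression input splices variables and parameters directly into an `Expr` built programmatically, so the objective can be assembled term by term. A minimal sketch based on the linked docs (JuMP 0.19/0.20-era API):

```julia
using JuMP

model = Model()
@variable(model, x[1:2])
@NLparameter(model, x_ref[1:2] == 0)

# Assemble the objective as an expression, splicing in variables and
# parameters with $(...); the terms could be generated in a loop instead.
terms = [:(($(x[1]) - $(x_ref[1]))^3), :(($(x[2]) - $(x_ref[2]))^2)]
obj = Expr(:call, :+, terms...)
JuMP.set_NL_objective(model, MOI.MIN_SENSE, obj)
```

Because the parameters appear directly in the expression, no user-defined function (and hence no registration) is needed.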

Thanks @odow.
I think building it like this would be quite tricky as the real objective function that I am using has a lot of terms (a model predictive control objective).

  1. Could I abuse a JuMP variable as a parameter? Define @variable(model, p) and then constrain p == parameter at each iteration?
  2. Even if the NLparameter is not used in a user-defined function, how would I reference the NLparameter if I am given only the model?
  1. Sure. Use fix(p, parameter). This is what I do, for example, in SDDP.

  2. Store it in the model:

model = Model()
@NLparameter(model, my_param == 1)
model[:my_param] = my_param

This is likely an oversight. PRs to fix appreciated.
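Putting the two answers together, updating the parameter from another function might look like this (a sketch; `update_ref!` is my own helper name, and `JuMP.set_value` is the documented way to change the value of an `@NLparameter`):

```julia
using JuMP

model = Model()
@variable(model, x[1:2])
@NLparameter(model, x_ref[1:2] == 0)
model[:x_ref] = x_ref  # store it so model[:x_ref] works in other functions

# Hypothetical helper: update the NL parameters given only the model.
function update_ref!(model::JuMP.Model, new_ref)
    for (p, v) in zip(model[:x_ref], new_ref)
        JuMP.set_value(p, v)
    end
    return
end

update_ref!(model, [1.5, 2.5])
JuMP.value.(model[:x_ref])  # [1.5, 2.5]
```

In an MPC loop you would call `update_ref!` (or `fix.(p, new_ref)` with the fix-variable trick) before each `optimize!`.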
