# InfiniteOpt: Unable to put expressions with divide operation into objective

Hi there!

I am quite new to Julia and JuMP/InfiniteOpt. I’m trying to solve a time-optimal control problem, and I run into an issue when I incorporate an expression that uses the divide operator into the objective function. The following error comes up (full stack trace below): `ERROR: MethodError: no method matching check_belongs_to_model(::NonlinearExpression, ::Model)`.

I have tested some nonlinear expressions that don’t involve division or a negative power, e.g. `(x[1] + x[2])^2`, and I don’t encounter this problem. I think the expression should be differentiable via automatic differentiation.

The following code should recreate the error:

```julia
using InfiniteOpt
using Ipopt
using Plots

# initial variable values
x0 = 5
y0 = 10
x_0 = [x0, y0]
vx0 = 2
vy0 = 2
v_0 = [vx0, vy0]
ux0 = 2
uy0 = 3
u_0 = [ux0, uy0]
N = 100
x_f = [1.0,1.0]
v_f = [1.0,1.0]

model = InfiniteModel(Ipopt.Optimizer)

@infinite_parameter(model, τ ∈ [0, 1], num_supports = N)

@variables(model, begin
    # state variables
    x[i = 1:2], Infinite(τ), (start = x_0[i])
    v[j = 1:2], Infinite(τ), (start = v_0[j])
    # control variables
    -1 <= u[k = 1:2] <= 1, Infinite(τ), (start = u_0[k])
end)

@variable(model, 5 ≤ tf ≤ 15, start = 12)

# initial conditions
@constraint(model, [i = [1, 2]], x[i](0) == x_0[i])
@constraint(model, [i = [1, 2]], v[i](0) == v_0[i])
@constraint(model, [i = [1, 2]], u[i](0) == 0)

# terminal conditions
@constraint(model, [i=[1,2]], x[i](1) == 0)
@constraint(model, [i = [1, 2]], v[i](1) == 0)

# ODEs
@constraint(model, [i = 1:2], ∂(x[i], τ) == tf * v[i])
@constraint(model, [i = 1:2], ∂(v[i], τ) == tf * u[i])

# the negative power here is what triggers the error
I = @expression(model, (x[1] + x[2])^(-1))

@objective(model, Min, tf * ∫(1 - I + u[1]^2 + u[2]^2, τ))

optimize!(model)
```

The full stack trace is:

```
ERROR: MethodError: no method matching check_belongs_to_model(::NonlinearExpression, ::Model)

Closest candidates are:
check_belongs_to_model(::GenericAffExpr, ::AbstractModel)
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/aff_expr.jl:625
check_belongs_to_model(::AbstractVariableRef, ::AbstractModel)
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/variables.jl:349
check_belongs_to_model(::VectorConstraint, ::Any)
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/constraints.jl:674
...

Stacktrace:
[1] check_belongs_to_model(expr::NonlinearExpr, model::Model) (repeats 2 times)
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/nlp_expr.jl:523
[2] set_objective_function
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/objective.jl:279 [inlined]
[3] set_objective
@ JuMP ~/.julia/packages/JuMP/as6Ji/src/objective.jl:326 [inlined]
[4] _set_objective(trans_model::Model, sense::MathOptInterface.OptimizationSense, expr::NonlinearExpr)
@ InfiniteOpt.TranscriptionOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/transcribe.jl:582
[5] transcribe_objective!(trans_model::Model, inf_model::InfiniteModel)
@ InfiniteOpt.TranscriptionOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/transcribe.jl:605
[6] build_transcription_model!(trans_model::Model, inf_model::InfiniteModel; check_support_dims::Bool)
@ InfiniteOpt.TranscriptionOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/transcribe.jl:965
[7] build_transcription_model!
@ ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/transcribe.jl:938 [inlined]
[8] #build_optimizer_model!#103
@ ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/optimize.jl:34 [inlined]
[9] build_optimizer_model!
@ ~/.julia/packages/InfiniteOpt/WWtLl/src/TranscriptionOpt/optimize.jl:26 [inlined]
[10] build_optimizer_model!(model::InfiniteModel; kwargs::@Kwargs{})
@ InfiniteOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/optimize.jl:534
[11] build_optimizer_model!
@ InfiniteOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/optimize.jl:528 [inlined]
[12] #optimize!#475
@ InfiniteOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/optimize.jl:924 [inlined]
[13] optimize!(model::InfiniteModel)
@ InfiniteOpt ~/.julia/packages/InfiniteOpt/WWtLl/src/optimize.jl:922
[14] top-level scope
```

The versions I’m using are InfiniteOpt v0.5.8 and JuMP v1.21.1.
Thanks in advance for any help!

Without line numbers, it’s hard to tell which line is line 63, but it does seem to flag the nonlinear expression, and I don’t understand what you’re trying to do there. It looks like a double integrator with positive initial conditions, driving `x[1]` and `x[2]` to zero at the final time. But I am confused by the nonlinear expression

```julia
I = @expression(model, (x[1] + x[2])^(-1))

@objective(model, Min, tf * ∫(1 - I + u[1]^2 + u[2]^2, τ))
```

So `I` has a singularity at the final condition. If `x[1] + x[2]` is small and positive, then `-I` is large and negative, and if the `x`s are exactly zero, `I` is infinite. Why are you integrating something that blows up? Or maybe my interpretation is wrong; I’m just a fragile human.

Hi! Thanks for the response.
Hmm, I’ll try to find a way to include line numbers when copying code, but yes, the problem is the nonlinear expression.
There is nothing to interpret with `I`: my actual objective function is a nonlinear function with negative powers of expressions (i.e. lots of expressions divided by one another), and I found that I get this error whenever the objective contains some expression raised to the power −1.

Re the singularity: yes, this is true, but if I set the terminal conditions `x_f, v_f = 1` I still get the same error. Thanks for pointing it out, though; I forgot to change the script before submitting it!

Hi @aruna-ram, welcome to the forum

This looks like a bug in InfiniteOpt, because it shouldn’t be mixing `NonlinearExpression` (the legacy nonlinear API) with `NonlinearExpr` (the new one) — the difference is very subtle to the reader!
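
For context, here is a minimal sketch in plain JuMP (no InfiniteOpt needed; the variable `x` here is purely illustrative) showing that the two APIs produce unrelated expression types:

```julia
using JuMP

model = Model()
@variable(model, x >= 1)

# New nonlinear interface (JuMP >= 1.15): a nonlinear @expression builds
# a GenericNonlinearExpr (of which NonlinearExpr is a type alias).
new_expr = @expression(model, (x + 1)^(-1))

# Legacy nonlinear interface: @NLexpression builds a NonlinearExpression,
# which lives in a separate nonlinear data structure inside the model.
old_expr = @NLexpression(model, (x + 1)^(-1))

# The two types are unrelated, so passing one where the other is expected
# produces MethodErrors like the one in the stack trace above.
println(new_expr isa GenericNonlinearExpr)  # true
println(old_expr isa NonlinearExpression)   # true
```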

@pulsipher is the main developer of InfiniteOpt, so he can help us out.

It only helps a little to set the final condition to 1. That means you’re not forcing the optimization to a singularity, but you are still incentivizing it to hang out near zero. What it should then do is, at t = 0, decrease x[1] and x[2] as close to zero as possible, and then stay there as long as it can before coming back up to 1. The objective integrates `-I`, so it may well send your x’s toward zero until `1/(x[1] + x[2])` blows up.

It may be better to formulate the problem more nicely, and avoid it wanting to blow up.
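
As a sketch of one such reformulation (the constant `ε` below is my own addition, not part of the original model): bound the denominator away from zero so the reciprocal stays finite over the whole horizon:

```julia
using InfiniteOpt, Ipopt

model = InfiniteModel(Ipopt.Optimizer)
@infinite_parameter(model, τ ∈ [0, 1], num_supports = 100)
@variable(model, x[1:2], Infinite(τ), start = 1.0)
@variable(model, -1 <= u[1:2] <= 1, Infinite(τ))

# Option 1: keep the denominator strictly positive everywhere on the horizon...
ε = 1e-3
@constraint(model, x[1] + x[2] >= ε)

# Option 2: ...or regularize the reciprocal itself, so it cannot blow up
# even if the state touches zero.
I_reg = @expression(model, (x[1] + x[2] + ε)^(-1))
```

Either variant keeps the integrand bounded, at the cost of a small modeling bias controlled by `ε`.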

Hi @aruna-ram,

You uncovered a bug in InfiniteOpt’s legacy nonlinear API. Currently, JuMP allows the unintentional creation of nonlinear expressions that mix the new/old nonlinear interfaces (Should error when trying to mix `NonlinearExpression` and `GenericNonlinearExpr` · Issue #3740 · jump-dev/JuMP.jl · GitHub).

A new API built on JuMP’s new nonlinear interface is ready to go on the master branch, but it hasn’t been released yet since there are a few additional features we want to add before making a major release.

In the meantime, I have added a patch to InfiniteOpt’s `v0.5` release that fixes this problem: Fix legacy NLP interface by pulsipher · Pull Request #340 · infiniteopt/InfiniteOpt.jl · GitHub. I will release a new package version once all the tests clear.

Version `v0.5.9` has now been released which fixes the bug.

Thanks for fixing this! I can now run the code without getting this error.
