How to save nonlinear JuMP model to file?

Is this feature not supported yet?
Invoking write_to_file I get:

ERROR: MathOptInterface.UnsupportedAttribute{MathOptInterface.NLPBlock}: Attribute MathOptInterface.NLPBlock() is not supported by the model.

Could someone point some good formats to store nonlinear problems? Even though it is not supported bu JuMP yet. That might be a good feature to add support later.

Of course I can use plain Serialization.serialize for the moment.
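For reference, the Serialization fallback is a one-liner each way. A minimal sketch (the object and path here are stand-ins); note the on-disk format is tied to the Julia version, so this is a cache, not an interchange format:

```julia
using Serialization

# Stand-in object; in practice this would be the model or its expressions.
obj = :(x^2 + sin(y))

path = tempname()                   # use a real path like "model.bin" in practice
open(io -> serialize(io, obj), path, "w")
restored = open(deserialize, path)  # round-trips to an equal object
```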

Since there is no answer yet, it is quite likely that this is just not possible. In the linear case, the interface can store the internal representation of the linear/quadratic terms. For nonlinear terms, the interface would need to be able to store an arbitrary Julia function, which is of course much harder.

I think the question then becomes: if you have a very long and complicated function, is it maybe easier to put it into a Julia file and store this script (and include it again later), rather than use some serialization which might not work in the future?

If your nonlinear function is a symbolic expression, you could use Symbolics to store it: Symbolic Calculations and Building Callable Functions · Symbolics.jl. They also use the approach of storing a Julia file with the plain definition. (See this recent post: Using Serialization to store then load a lambdified function? - #4 by SteffenPL, but of course there are more posts on this topic.)

What file format did you use?

You should be able to use .nl or .mof.json.


Well, this makes a difference.
I used the wrong file extension, and it seems the docs do not clearly state which file extension should be used for nonlinear models.

Still no great success.

The *.nl extension fails due to the registered function atan2, claiming:
ERROR: Unsupported operation atan2

while *.mof.json fails due to the use of the abs2 function in the model:
ERROR: Cannot convert Expr to MOF. Unknown function: abs2.

Thanks @odow! It’s a step forward anyway.

What is the model you are trying to write to a file, and why are you trying to do it?

It looks like I need to add support for atan2 to the writers. NL should support it, so I guess it’s an oversight.

The problem is aircraft trajectory optimization. It has hundreds or thousands of nonlinear constraints produced from the nonlinear dynamics, goals, and constraints during transcription. That transcription takes a while in the first place.

Then I have some Julia functions that implement the computations for each constraint. I have to extract expressions for the nonlinear constraints and the nonlinear cost function using Symbolics.jl, as JuMP does not allow nonlinear expressions on its variables. Then I have to use macro substitutions to change those expressions into something JuMP will accept. So it takes a while…

And I don’t want to wait every time I run optimizations based on the model, so I want to save it to a file.

atan2 is implemented as a registered user-defined function, because Julia has only one atan for both the 1-argument and 2-argument versions, which confuses JuMP.
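To illustrate the clash in plain Julia (no JuMP involved): the name atan covers both arities, and a separate atan2 name is just a thin wrapper that pins the arity down:

```julia
# One-argument arctangent:
@assert atan(1.0) ≈ π / 4
# Two-argument method, i.e. the classic atan2(y, x); quadrant-aware:
@assert atan(-1.0, -1.0) ≈ -3π / 4
@assert atan(-1.0, -1.0) != atan(-1.0 / -1.0)  # atan(y / x) loses the quadrant
# A dedicated name disambiguates the arity for JuMP registration:
atan2(y, x) = atan(y, x)
@assert atan2(-1.0, -1.0) ≈ -3π / 4
```

With such a wrapper, register(model, :atan2, 2, atan; autodiff = true) maps the new name onto the two-argument method, as done below.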

You cannot write a JuMP model to file which contains a user-defined function. (Because we’d somehow have to write the Julia function to a text file.)

And I don’t want to wait every time I run optimizations based on the model, so I want to save it to a file.

Why does creating the model take so long? It shouldn’t take any longer than reading from a file. Perhaps you could share your problem, and we can see if we can improve things so you don’t have to write to a file.

You cannot write a JuMP model to file which contains a user-defined function. (Because we’d somehow have to write the Julia function to a text file.)

But atan2 is quite a useful function, so it would be nice to have some solution for using it in our models.
Meanwhile I have to write

JuMP.register(model, :atan2, 2, atan; autodiff = true)

And also do some macro substitutions in expressions:

ex = MacroTools.prewalk(u -> @capture(u, ($(atan)(a_, b_))) ? :(atan2($a, $b)) : u, ex)

I also have to do many more macro substitutions in expressions, because JuMP's add_nonlinear_constraint does not understand expressions like $(+)(a, b) generated by the Symbolics module; it expects something like (a + b).
Thus I have a long list of such substitutions:

    ex = Symbolics.toexpr(symexpr)
    ex = MacroTools.prewalk(u -> @capture(u, ($(getindex)(u, n_))) ? :($(var[n])) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(+)(a_, b_))) ? :($a + $b) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(*)(a_, b_))) ? :($a * $b) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(^)(a_, b_))) ? :($a^$b) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(/)(a_, b_))) ? :($a / $b) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(atan)(a_, b_))) ? :(atan2($a, $b)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(max)(a_, b_))) ? :(max($a, $b)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(min)(a_, b_))) ? :(min($a, $b)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(sin)(a_))) ? :(sin($a)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(cos)(a_))) ? :(cos($a)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(sqrt)(a_))) ? :(sqrt($a)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(abs2)(a_))) ? :(abs2($a)) : u, ex)
    ex = MacroTools.prewalk(u -> @capture(u, ($(atan)(a_))) ? :(atan($a)) : u, ex)

And these substitutions take most of the time. It is not too much (it might take a minute), but it is still quite annoying during development.

It would be nice to get rid of Symbolics.jl in the first place, if JuMP would accept nonlinear expressions on its variables directly. That is still not the case; I hope this is on the list of upcoming features.

Can you share an example of your larger problem? Why are you using symbolics and why the macro rewrites?

Unfortunately it is a relatively big project, and it is hard to cut out a part of it to share on the forum…

I’m doing trajectory optimization using a collocation method. It transcribes the problem into a big list of nonlinear constraints extracted from the system’s dynamics. You can think of it as a function which computes a list of residuals given the current state, goals, and parameters. Each residual is a constraint which should be zero in order for the dynamics to be feasible. To extract expressions for these residuals/constraints I’m using Symbolics.jl: I pass symbolic values to the function and get a list of expressions, one per constraint residual. I can’t pass JuMP’s symbolic values to the function, as JuMP does not allow nonlinear operations on them.

Then I do what I described earlier with the expressions extracted using Symbolics.jl’s symbolic variables.

If the macros take most of the time, it’s because you’re repeatedly walking the same expression graph. Do something like this to walk only once:

using JuMP, Symbolics, Ipopt

# Recursively convert a Symbolics-generated Expr into a form JuMP's nonlinear
# interface accepts: function objects become symbols, symbolic names are
# mapped to the corresponding JuMP variables, and constants pass through.
to_jump_expr(c::Any, ::Dict{Symbol,VariableRef}) = c
to_jump_expr(x::Symbol, map::Dict{Symbol,VariableRef}) = JuMP.index(map[x])
to_jump_expr(f::Function, ::Dict{Symbol,VariableRef}) = Symbol(f)
function to_jump_expr(expr::Expr, map::Dict{Symbol,VariableRef})
    for i in 1:length(expr.args)
        expr.args[i] = to_jump_expr(expr.args[i], map)
    end
    return expr
end

foo(x, y) = sqrt(sin(x) + cos(y))

Symbolics.@variables X Y
sym_expr = Symbolics.toexpr(foo(X, Y))

model = Model()
@variable(model, x)
@variable(model, y)
jump_expr = to_jump_expr(sym_expr, Dict(:X => x, :Y => y))
add_nonlinear_constraint(model, :($jump_expr <= 0))

In general though, JuMP is not built to interact with Symbolics. Instead of using functions, write your constraints using JuMP’s @NL macros.


Thanks @odow for great recipe!

The @NL macros are good when you are hand-writing a small model; they are not convenient for big models. The best solution, IMHO, would be if JuMP variables worked the way Symbolics variables do, so that nonlinear expressions could be extracted from functions.

A function is the most universal representation of a computation. With my constraints constructed as functions, I can extract expressions with Symbolics and try to solve with some SciML tools, or convert them to JuMP nonlinear constraints and solve there. Or I can extract CasADi expressions using its variables and use CasADi to solve the problem. Or I can use different automatic differentiation libraries and other tools (something from JuliaSmoothOptimizers) to get a solution.

Btw, both Symbolics.jl and CasADi.jl allow extracting nonlinear expressions from functions, which is not yet the case for JuMP.jl. SymPy.jl can also do that, but it is almost useless for any big problem; it might be useful for some operations on the dynamics equations before transcription, when the problem becomes huge.

The best solution, IMHO, would be if JuMP variables worked the way Symbolics variables do, so that nonlinear expressions could be extracted from functions.

Thanks @odow for the video! Great work you are doing! SymbolicAD in particular looks very promising; in trajectory-optimization transcription, constraints tend to have very similar structure…

Please consider the issue with atan, and probably other multi-method functions which take one or two arguments in Julia. Specifically, atan2 is used very often in nonlinear problems, yet there is no such dedicated function in Julia: it overloads atan for both the one- and two-argument versions, which confuses JuMP.

Meanwhile I have to register the user-defined function atan2, which brings all the limitations of user-registered functions.

Can you share a small example of a model containing atan2? We might be able to add support for it to MathOptInterface.

julia> using JuMP

julia> m = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(m, x)
x

julia> @variable(m, y)
y

julia> @NLconstraint(m, atan(x, y) >= 0)
┌ Warning: Function atan automatically registered with 2 arguments.
│ 
│ Calling the function with a different number of arguments will result in an
│ error.
│ 
│ While you can safely ignore this warning, we recommend that you manually
│ register the function as follows:
│ ```Julia
│ model = Model()
│ register(model, :atan, 2, atan; autodiff = true)
│ ```
└ @ JuMP ~/.julia/packages/JuMP/Y4piv/src/parse_nlp.jl:21
atan(x, y) - 0.0 ≥ 0

julia> write_to_file(m, "model.mof.json")
ERROR: The function atan is a Unary function, but you have passed 2 arguments.
Stacktrace:

So if you do not register the 2-argument atan yourself, JuMP does it for you. But then it cannot save this to any file, as it expects atan to be a unary function. You are also not able to mix the unary and binary versions of atan in one model.

That’s why I’m registering the binary version under the name atan2, like so:

JuMP.register(model, :atan2, 2, atan; autodiff = true)

But this still does not allow saving the model, and it also requires me to make substitutions in expressions, renaming all binary atan invocations to the new name:

ex = MacroTools.prewalk(u -> @capture(u, ($(atan)(a_, b_))) ? :(atan2($a, $b)) : u, ex)


I guess I meant a realistic example of a practical application.

Well, any nonlinear model using the atan2 function would be realistic.

For example, to calculate the aerodynamic forces on an aircraft, one has to know the angle of attack and the sideslip angle, which are usually calculated as atan2 of two of the velocity components.

More on the history, importance, uses, and motivation behind the atan2 function can be found on its Wikipedia page.

That function is available in the math libraries of all languages I know, starting with C. It is supported by most nonlinear optimization tools, CasADi for example.

It is unfortunate that Julia decided to implement atan2 as one more method of the atan function, combining the unary and binary versions under the same name. This has caused confusion in packages like JuMP ever since…

Please open an issue at MathOptInterface.jl. We can probably implement this.

Done.

Btw, the proposed example of converting Symbolics expressions to JuMP with a recursive function unfortunately does not stretch to the case where vectors are used.

If you have @variable(model, u[1:10]), for example, there is no JuMP.index(u); instead there are separate entities of the form JuMP.index(u[i]).

And Symbolics.jl will produce expressions that look like :((getindex)(U, i)). There is no point in parsing that down to a Symbol; one has to substitute the whole pattern of that kind with JuMP.index(u[i]).

So some macro work is still required… unless there are smarter tricks.
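One way around it is a hand-rolled walk (a hypothetical helper, not part of JuMP or Symbolics) that treats the whole getindex(U, i) pattern as one unit instead of descending into its arguments. The substitution table maps each symbolic array name to a vector of replacements, which with JuMP would be something like JuMP.index.(u):

```julia
# Replace whole :(getindex(U, i)) patterns using a lookup table; everything
# else is walked recursively. `vars` maps array names to their substitutes.
# Handles both the symbol :getindex and the function object getindex,
# since Symbolics.toexpr may emit either form as the call head.
function subst_getindex(ex, vars::Dict{Symbol,<:AbstractVector})
    if ex isa Expr
        if ex.head == :call && length(ex.args) == 3 &&
           (ex.args[1] === :getindex || ex.args[1] === getindex) &&
           haskey(vars, ex.args[2]) && ex.args[3] isa Integer
            return vars[ex.args[2]][ex.args[3]]
        end
        return Expr(ex.head, (subst_getindex(a, vars) for a in ex.args)...)
    end
    return ex
end

# Illustration with plain symbols standing in for the JuMP variable indices:
ex  = :(getindex(U, 1) + sin(getindex(U, 2)))
out = subst_getindex(ex, Dict(:U => [:u1, :u2]))  # gives :(u1 + sin(u2))
```

With a real model, subst_getindex(ex, Dict(:U => JuMP.index.(u))) would interpolate the indices for u[i] directly into the expression, after which the recursive to_jump_expr conversion from the earlier post can handle the rest.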