Convert a nonlinear model defined with JuMP's legacy nonlinear interface to the latest version

Hi everyone, I am fetching models from MINLPLib.jl, and I believe they use "Nonlinear Modeling (Legacy)" to define nonlinear constraints and objectives. Is there any way to transform a model defined with the legacy interface into one defined with the latest "Nonlinear Modeling" interface?
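For context, here is a toy sketch of the same model written both ways (the variable and constraints are made up for illustration, not taken from MINLPLib):

```julia
using JuMP

# Legacy nonlinear interface: the @NL* macros, which build a separate
# expression-tree representation of the nonlinear parts.
model_legacy = Model()
@variable(model_legacy, x >= 1)
@NLobjective(model_legacy, Min, x * log(x))
@NLconstraint(model_legacy, x^2 + sin(x) <= 10)

# New interface: the regular @objective/@constraint macros, which now
# accept nonlinear expressions directly.
model_new = Model()
@variable(model_new, x >= 1)
@objective(model_new, Min, x * log(x))
@constraint(model_new, x^2 + sin(x) <= 10)
```

The two models are mathematically identical; the difference is only in how the nonlinear expressions are stored and passed to the solver.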


@odow I tested using .nl files from both the MINLPLib organization and the Julia package. Both throw an error

ERROR: MethodError: no method matching operate(::typeof(*), ::Type{Float64}, ::Float64, ::Nothing)
The function operate exists, but no method is defined for this combination of argument types.

when I set

new_model = JuMP.read_from_file("minlplib/nl/4stufen.nl"; use_nlp_block = false)

But it outputs a model when I set

m_new = JuMP.read_from_file("minlplib/nl/4stufen.nl"; use_nlp_block = true)


Hi @zhu266, thanks for posting over here.

Yes, I was just trying this locally as well and I found the same bug. I’m just about to push a fix. I’ll update this post with the link when it’s ready: [FileFormats.NL] fix try_scalar_affine_function by odow · Pull Request #2766 · jump-dev/MathOptInterface.jl · GitHub

One approach is:

using MINLPLib
using JuMP
function convert_legacy_to_new(model)
    path = mktempdir()
    filename = joinpath(path, "model.nl")
    write_to_file(model, filename)
    return read_from_file(filename; use_nlp_block = false)
end

model = fetch_model("minlp/4stufen")
model = convert_legacy_to_new(model)

The other approach is:

using JuMP
import Downloads
function download_model(name; kwargs...)
    io = IOBuffer()
    Downloads.download("https://www.minlplib.org/nl/$(name).nl", io)
    seekstart(io)
    return read(
        io,
        Model;
        use_nlp_block = false,
        format = MOI.FileFormats.FORMAT_NL,
    )
end
model = download_model("4stufen")

Great, thank you!

I’m just preparing a new release of MathOptInterface, Prep for v1.40.2 by odow · Pull Request #2767 · jump-dev/MathOptInterface.jl · GitHub, so this should be fixed in a few hours.

Just chiming back in to say that this is now fixed. Sorry for the bug!

(minlp) pkg> st
Status `/private/tmp/minlp/Project.toml`
  [4076af6c] JuMP v1.26.0
  [c36e90e8] MINLPLib v0.1.0 `https://github.com/lanl-ansi/MINLPLib.jl#master`
  [b8f27783] MathOptInterface v1.40.2

julia> using JuMP

julia> import Downloads

julia> function download_model(name; kwargs...)
           io = IOBuffer()
           Downloads.download("https://www.minlplib.org/nl/$(name).nl", io)
           seekstart(io)
           return read(
               io,
               Model;
               use_nlp_block = false,
               format = MOI.FileFormats.FORMAT_NL,
           )
       end
download_model (generic function with 1 method)

julia> model = download_model("4stufen")
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: AffExpr
├ num_variables: 149
├ num_constraints: 351
│ ├ NonlinearExpr in MOI.EqualTo{Float64}: 34
│ ├ AffExpr in MOI.EqualTo{Float64}: 60
│ ├ AffExpr in MOI.GreaterThan{Float64}: 4
│ ├ VariableRef in MOI.GreaterThan{Float64}: 149
│ ├ VariableRef in MOI.LessThan{Float64}: 56
│ └ VariableRef in MOI.ZeroOne: 48
└ Names registered in the model: none

I really appreciate your efforts! I also noticed that the model returned by convert_legacy_to_new(model) has anonymous variables. Supposing _[i] corresponds to x, is the index i the VariableIndex of x?

If you want to maintain names, use the MathOptFormat file format instead (MathOptFormat | Specification and description of the MathOptFormat file format):

function convert_legacy_to_new(model)
    path = mktempdir()
    filename = joinpath(path, "model.mof.json")
    write_to_file(model, filename)
    return read_from_file(filename; use_nlp_block = false)
end

This won’t keep the JuMP-level structures like model[:x], but you can look things up by their name:

julia> variable_by_name(model, "x[13]")
x[13]