Automatically generated API and Frontend for a JuMP model

Dear JuMP community,

We would like to share with you MOS, software that automatically generates an API and Frontend for an annotated optimization model: MOS Software Documentation — MOS documentation

There are instructions at the link for trying it out. We hope it makes it easier to deploy JuMP models to solve problems, and we would be very happy to hear any feedback or answer any questions. We know the language-specific kernels need more work, and we are learning example by example.

Nice!

One suggestion: it took me a little while to find your examples: mos-examples/knapsack_model.jl at main · Fuinn/mos-examples · GitHub.

Perhaps the first part of the docs could be “MOS helps you easily deploy optimization models. Here’s a brief example for CVXPY/JuMP/Pyomo/…” and then some sample code?

Thank you very much, Oscar, for the suggestion; it is greatly appreciated. We have just updated the docs to reflect it, and we will provide some sample JuMP code directly when we do a larger revamp.

This example shows how MOS can serve a JuMP model; it serves the same model as shown in this JuMP tutorial example.
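
For context, the model being served is a small mixed-integer program. The sketch below is a plain JuMP version of that kind of model; the coefficients and the choice of HiGHS as solver are illustrative rather than taken from the tutorial, and the MOS annotations required for deployment are omitted (see the docs for those).

using JuMP
using HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x >= 0, Int)         # integer decision variables
@variable(model, y >= 0, Int)
@constraint(model, 6x + 8y >= 100)    # illustrative constraints
@constraint(model, 7x + 12y >= 120)
@objective(model, Min, 12x + 20y)     # illustrative objective
optimize!(model)
value(x), value(y), objective_value(model)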

As a short example, after the model has been added to a local or remote MOS server, it can be accessed as follows:

using MOSInterface

interface = Interface()                                           # connect to the MOS server
model = get_model_with_name(interface, "Simple MIP JuMP Model")   # retrieve the deployed model by name
set_interface_object(model, "lower_bound", 1.2)                   # set the "lower_bound" interface object
MOSInterface.run(model)                                           # run the model on the server
get_status(model)                                                 # query the run status

Details about authentication and preparing the model can be found at the first link above.

(An annotated model file can be added to MOS with a one-line command.)

model = new_model(interface, "./examples/jump/simple_mip/simple_mip_model.jl")
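
Putting the two snippets together, a minimal end-to-end session might look roughly like the following. This simply chains the calls shown above, and it assumes the model object returned by new_model can be used in the same way as one returned by get_model_with_name:

using MOSInterface

interface = Interface()
model = new_model(interface, "./examples/jump/simple_mip/simple_mip_model.jl")   # add the annotated model file
set_interface_object(model, "lower_bound", 1.2)                                  # set an input
MOSInterface.run(model)                                                          # run on the server
get_status(model)                                                                # check the run status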