Does AmplNLWriter support most of the functionality that JuMP natively provides through MathOptInterface?

I am trying to decide whether I should write some kind of MathOptInterface wrapper for my solver so that JuMP can interface with it directly, or whether I should just use AmplNLWriter to talk to JuMP indirectly. The solver I have already has AMPL support.

PS: I am new to both JuMP and Julia, so any input whatsoever would be very helpful.

Thank you :slight_smile:

Hi @Mihir_V146, welcome to the forum :smile:

Does AmplNLWriter support most of the functionality that JuMP natively provides through MathOptInterface?

No. As one example, AmplNLWriter does not provide incremental interfaces. So if you solve the problem, modify it, and then re-solve, AmplNLWriter will throw out the previous model and build a new one. This is different to incremental optimisers like HiGHS or Gurobi, where, if you modify the problem, they update the solver’s in-memory representation allowing very efficient re-solves.
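To make the distinction concrete, here is a minimal sketch of the AmplNLWriter workflow (shown with Ipopt_jll's AMPL executable, since that is readily installable; for your solver you would pass the path to its AMPL-enabled binary instead):

```julia
using JuMP, AmplNLWriter, Ipopt_jll

# AmplNLWriter drives any solver that reads AMPL .nl files:
model = Model(() -> AmplNLWriter.Optimizer(Ipopt_jll.amplexe))
@variable(model, x >= 1)
@objective(model, Min, (x - 2)^2)
optimize!(model)

# Modifying and re-solving works, but AmplNLWriter writes a fresh .nl
# file and restarts the solver from scratch on every call to optimize!:
set_lower_bound(x, 3)
optimize!(model)
```

With an incremental solver like HiGHS or Gurobi, that second `optimize!` would instead update the in-memory model and warm-start from the previous solution.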

But you should care about that only if your solver has native support for incremental updates. It might not, in which case AmplNLWriter is good.

Also note that writing a MathOptInterface wrapper is a lot of work. Just use AmplNLWriter to begin with.

What’s the solver? Do you have a link?

Thanks a lot for your guidance, I really appreciate it!

The solver is the Minotaur optimization solver, which I am working on:

https://minotaur-solver.github.io/

(I am not allowed to reply with links for some reason, which is quite absurd.)

I will just stick with AmplNLWriter for now, and maybe try to make Minotaur work with AmplNLWriter the way Bonmin_jll does with precompiled binaries.

maybe try to make Minotaur work with AmplNLWriter the way Bonmin_jll does with precompiled binaries.

It has been a while since I last looked at Minotaur: Native Support from JuMP in Julia · Issue #19 · coin-or/minotaur · GitHub

It needs a build recipe for GitHub - JuliaPackaging/Yggdrasil: Collection of builder repositories for BinaryBuilder.jl · GitHub

I am not allowed to reply with links for some reason which is quite absurd

It’s because you are a new user. You can probably post a link now. The rule is to stop spam from new users (which we get a lot of, the bots just remove it before you can see it).

Thank you once again for your time and guidance.

The links you sent were very helpful.

I went through https://jump.dev/JuMP.jl/stable/moi/tutorials/implementing/ to read up on implementing a primitive MathOptInterface for Minotaur.

Based on my understanding, the general workflow for integrating Minotaur with JuMP through MathOptInterface might look something like the following:

  1. Model and solve using C++ only with Minotaur and get the output (this should work, as Minotaur currently supports a C++ interface).
  2. Do the same, but implement and use a C interface instead.
  3. Make a Julia interface.
  4. Finally, make it interface with JuMP.

I wanted to check whether this workflow is reasonable, or if there is a more appropriate approach for integrating Minotaur with JuMP (instead of the indirect AmplNLWriter way).

I would really appreciate any guidance or suggestions you might have.

What is the motivation for writing an MOI interface instead of using AmplNLWriter? Note the big red warning at the top of that tutorial.

  1. The first step is to build a Minotaur_jll in Yggdrasil so that people in Julia can install it. This means building only with free solvers (you won’t be able to use CPLEX etc).
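For reference, a Yggdrasil recipe is a `build_tarballs.jl` script. The following is only a rough sketch of what one for a hypothetical Minotaur_jll might look like; the version, source commit, CMake options, and product name are all placeholders, not a working recipe:

```julia
using BinaryBuilder

name = "Minotaur"
version = v"0.4.1"  # placeholder version

sources = [
    # Placeholder commit hash; pin to a real release commit in practice.
    GitSource("https://github.com/coin-or/minotaur.git",
              "0000000000000000000000000000000000000000"),
]

# Cross-compilation script run inside BinaryBuilder's build environment.
script = raw"""
cd $WORKSPACE/srcdir/minotaur
cmake -B build -DCMAKE_INSTALL_PREFIX=$prefix \
    -DCMAKE_TOOLCHAIN_FILE=${CMAKE_TARGET_TOOLCHAIN}
cmake --build build --parallel ${nproc}
cmake --install build
"""

platforms = supported_platforms()
# Executable name assumed; use whichever AMPL driver Minotaur installs.
products = [ExecutableProduct("mbnb", :amplexe)]
dependencies = Dependency[]

build_tarballs(ARGS, name, version, sources, script, platforms,
               products, dependencies)
```

Once merged into Yggdrasil, users could then `] add Minotaur_jll` and pass its executable to AmplNLWriter.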

Unless there is good reason, stop here. Going further is a lot of work.

If you really want to continue:

  1. Write a pure C API for Minotaur (not C++). I didn’t spot one looking through the source code, but I might have missed it.
  2. Rebuild the Minotaur_jll to wrap the C API as a shared library.
  3. Write a Minotaur.jl package similar to Ipopt.jl.

Note that neither of steps 1-2 requires you to write any Julia. I wouldn’t consider anything about the Julia wrapper until you have finished step 2.

Moreover, if all the C API is going to do is a single-shot “build the model and solve,” then there is very little reason to go to the effort. Do step 1 and use AmplNLWriter. Ipopt had a C interface already, and we wrote Ipopt.jl because it means we can do our own automatic differentiation and efficiently update parameters between solves.
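For a sense of what step 3 would involve, here is a purely hypothetical sketch of calling a C API from Julia via `ccall`, in the style of Ipopt.jl. None of these symbols exist; `minotaur_create`, `minotaur_solve`, and `minotaur_free` are invented names that only illustrate the shape of such a wrapper:

```julia
# The library name would come from a future Minotaur_jll.
const libminotaur = "libminotaur"

function solve_once(n_vars::Int)
    # Create a problem handle, solve it once, and free it.
    prob = ccall((:minotaur_create, libminotaur), Ptr{Cvoid}, (Cint,), n_vars)
    status = ccall((:minotaur_solve, libminotaur), Cint, (Ptr{Cvoid},), prob)
    ccall((:minotaur_free, libminotaur), Cvoid, (Ptr{Cvoid},), prob)
    return status
end
```

The value of going beyond this single-shot pattern is exactly the incremental-modification ability discussed above; if the C API can’t do that, AmplNLWriter already covers the single-shot case.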

I will work on Minotaur_jll in Yggdrasil as per your advice, as there is not a lot of motivation right now to proceed further.

I was initially planning to build the MOI interface purely for exploratory purposes, to learn more about the topic and gain new skills, but it seems to be a very long and time-consuming journey. Maybe we could work on that in the future, if the need arises :slight_smile:

Thanks a lot for all your help! :smile:

I was initially planning to build the MOI interface purely for exploratory purposes, to learn more about the topic and gain new skills

This is the best reason :smile: But the first steps are to get it building and write a C API.