Using ANTIGONE from Julia

I’ve been investigating the various options for (globally optimal) MINLP in Julia, and ANTIGONE looks like a nice option. I’m curious if anyone has any experience using it from Julia.

At first glance, it looks like this will be difficult: ANTIGONE appears to only support GAMS models, and I can’t find any JuMP-GAMS interface. Converting from .nl to GAMS also looks nontrivial. NEOS does support ANTIGONE, but only through GAMS file uploads, so that doesn’t really resolve the issue.

Am I missing anything? Any suggestions on the best path forward?

Or, perhaps, what is your favorite MINLP solver? So far on my list I have:

  • COUENNE: easy to use, compatible with JuMP (see the sketch after this list), but appears to perform poorly
  • SCIP: also compatible with JuMP, but haven’t evaluated its performance
  • BARON: compatible with JuMP, but expensive. Currently waiting for a license
  • ANTIGONE: GAMS-only
  • LINDOGlobal: GAMS-only

Edits:

  • POD.jl: native Julia, with JuMP support
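
For reference, here is a minimal sketch of how I’ve been driving one of the JuMP-compatible options (COUENNE, here via AmplNLWriter.jl). The toy model is purely illustrative, and it assumes AmplNLWriter.jl is installed and a couenne executable is on the PATH:

using JuMP, AmplNLWriter

# AmplNLWriter writes the model to an AMPL .nl file and calls the given solver executable
model = Model(() -> AmplNLWriter.Optimizer("couenne"))

# A tiny nonconvex MINLP, just to exercise the solver
@variable(model, 1 <= x <= 4, Int)
@variable(model, 0 <= y <= 2)
@NLobjective(model, Min, (x - 1.5)^2 + x * y)
@NLconstraint(model, x * y >= 1)

optimize!(model)
println(termination_status(model), ", objective = ", objective_value(model))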

This package provides a way to access any solver that supports AMPL NL files.

You might also consider this native Julia global solver.

Thanks! Unfortunately, I don’t think ANTIGONE itself supports AMPL NL files, and I can’t find any clear way to transform an NL model into a GAMS one.

I’ve played around with POD.jl a bit in the past, but forgot to add it to the list. I’ll fix that now.

I know this is a bit old, but I’m posting in case others run into a similar issue.

You can give GAMS.jl a try: GitHub - GAMS-dev/gams.jl: A MathOptInterface Optimizer to solve JuMP models using GAMS
To use ANTIGONE, simply do:

using GAMS, JuMP
model = Model(GAMS.Optimizer)
set_optimizer_attribute(model, GAMS.Solver(), "antigone")
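
To make that concrete, here is a minimal end-to-end sketch building on the snippet above. The toy model is purely illustrative, and it assumes a local GAMS installation with an ANTIGONE license is available on the PATH:

using GAMS, JuMP

model = Model(GAMS.Optimizer)
set_optimizer_attribute(model, GAMS.Solver(), "antigone")   # run ANTIGONE as the GAMS subsolver

# A small nonconvex MINLP, purely for illustration
@variable(model, 1 <= x <= 10, Int)
@variable(model, 0 <= y <= 5)
@NLobjective(model, Min, (x - 3)^2 + x * y)
@NLconstraint(model, x^2 * y >= 2)

optimize!(model)
println(termination_status(model), ", objective = ", objective_value(model))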

Cool! Is this an officially supported GAMS product? Please let me know if we can offer any help from our side.

Yes, this is officially supported. However, it’s still at an early stage and we are very busy with testing. Thanks very much for the offer; I will very likely come back to it. If you have any suggestions for improvements or feature requests, simply open an issue. We are more than happy to hear your feedback!


Very nice.

This repo also has a lot of tests for nonlinear solvers: GitHub - jump-dev/MINLPTests.jl: Unit and Integration Tests for JuMP NLP and MINLP solvers

JuMP/MOI’s current support for NLP is second-class. We hope to make significant improvements over the next 12-24 months.

You can join: JuliaOpt/JuMP-dev - Gitter. We also hold monthly development calls; the next is June 26. If you’re interested, ask @mlubin on the Gitter channel for an invite. I’m sure we’d all be interested to hear about the challenges of interfacing JuMP with GAMS.
