Announcement: Algencan.jl initial release

After a long planning period I have finished a first rough implementation of an interface between the Algencan solver for nonlinear optimization (general NLP with constraints) and JuMP / MathProgBase, which I am naming Algencan.jl. This is pre-alpha software; please read the README.md file on the GitHub page.

At this time, before using it you first need to download and compile Algencan and create a shared library from it. There are instructions on how to do it in the README.md file of Algencan.jl; see the GitHub page. After that, you need to create an environment variable named ALGENCAN_LIB_DIR pointing to the directory that contains the shared library.
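For example, the variable can be set in your shell startup file or from within Julia itself, before importing the module. The path below is just a placeholder for wherever you actually placed the compiled library:

```julia
# Set the variable from within Julia, before `import Algencan`.
# "/opt/algencan/lib" is a hypothetical path; replace it with the
# directory that contains the Algencan shared library you built.
ENV["ALGENCAN_LIB_DIR"] = "/opt/algencan/lib"
```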

For now Algencan.jl is a simple module, not a package yet, so you cannot use Pkg to install it. I plan to add some Pkg integration soon. To use it, download the code from GitHub, add the src directory to Julia's LOAD_PATH, and import Algencan. There is an example of use in the file first_tests.jl in the example directory.
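The steps above can be sketched as follows. The clone path is a placeholder, and the `AlgencanSolver` constructor name follows the usual MathProgBase solver convention; check first_tests.jl for the actual exported name and options:

```julia
# Make the module findable; "/path/to/Algencan.jl" is a placeholder
# for wherever you cloned the repository.
push!(LOAD_PATH, "/path/to/Algencan.jl/src")
import Algencan

# A minimal JuMP model using the solver (old MathProgBase-style API).
using JuMP
m = Model(solver=Algencan.AlgencanSolver())
@variable(m, x >= 0)
@NLobjective(m, Min, (x - 2)^2)
solve(m)
```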

Feel free to test it, make comments, and find issues.

Paulo


It would be great to see how it does against Optim.

Perhaps you are aware of this, but PkgDev.generate will generate the package skeleton with little effort. Even if you are not registering it, Pkg.update is a great help for keeping up with WIP packages.
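For reference, in the pre-Pkg3 workflow this amounts to a single call. The license string here is just an example choice:

```julia
# Requires PkgDev to be installed first (Pkg.add("PkgDev")).
# Generates the package skeleton (src/, test/, REQUIRE, license, CI
# config) under the package directory; "MIT" is an example license.
using PkgDev
PkgDev.generate("Algencan", "MIT")
```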

Algencan is supposed to be competitive with Ipopt; at least that was the case when it was first released. It is a very mature code. I'll try to write some simple tests, maybe using CUTEst.jl.

I admit that I have not even looked at how to make packages. I will take a look at PkgDev.generate, thanks for the tip.

I am aware - this is why I am interested in the comparison.

I used CUTEst.jl to make a very naive comparison between Algencan.jl and Ipopt.jl using their simple linear algebra solvers (without the HSL libraries). You can see the performance profile below. The code for the test is in the GitHub repository so that you can reproduce it.
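A performance profile like this can be produced with BenchmarkProfiles.jl (an assumption on my part; the repository may use different plotting code). Each row is a problem, each column a solver, and failures are marked with `Inf`:

```julia
# Sketch with made-up timings, not the actual benchmark data.
using BenchmarkProfiles
# T: n_problems x 2 matrix of solve times, Inf marks a failure.
# Column 1 = Algencan, column 2 = Ipopt.
T = [1.0 1.2;
     2.5 Inf;
     0.9 0.8]
performance_profile(T, ["Algencan", "Ipopt"])
```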

I selected all the problems from CUTEst that have at least 10 variables and 10 constraints and at most 1000 variables and 1000 constraints, but I had to skip problems with lower-bounded constraints, as those are not yet implemented in Algencan.jl.
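The selection can be done with CUTEst.jl's filtering helper. The keyword names below are the ones I believe `CUTEst.select` accepts; consult the CUTEst.jl documentation to confirm:

```julia
using CUTEst
# Select problems with 10-1000 variables and 10-1000 constraints.
problems = CUTEst.select(min_var=10, max_var=1000,
                         min_con=10, max_con=1000)
```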

Note that comparing optimization software is hard. A special difficulty in this case is that Ipopt uses a scaled stopping criterion (in its tol parameter) while Algencan uses an unscaled one. I tried to make the results comparable, but I may have failed miserably. If anyone sees any clear pitfall in my code, please let me know and I'll fix it ASAP. I tried to set both tolerances to 1.0e-5.
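Concretely, the intent was to configure the two solvers along these lines. Ipopt's `tol` option is documented; the Algencan option names `epsfeas` and `epsopt` are my assumption here, so check the Algencan.jl README for the actual names:

```julia
using Ipopt
import Algencan

# Ipopt: scaled overall tolerance.
ipopt_solver = IpoptSolver(tol=1.0e-5)
# Algencan: separate (unscaled) feasibility and optimality
# tolerances -- option names assumed, see the README.
algencan_solver = Algencan.AlgencanSolver(epsfeas=1.0e-5,
                                          epsopt=1.0e-5)
```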

As I said, the moral here is that both codes behave quite similarly overall on the test set but can have dramatically different behaviour on a specific problem. So the best approach is to try both and use the one that works best for your application. And, if you can, try both with the HSL libraries; they should both perform better.