[MadNLPHSL] Feature proposal: simplify installation with custom-compiled HSL library

HSL_jll.jl obtained from https://licences.stfc.ac.uk/products/Software/HSL/LibHSL requires a free academic or purchased license and contains the extended set of HSL libraries. However, there is a free license for individuals for the HSL Archive, which contains only a subset of the HSL_jll libraries but includes `ma27`, which is usually the best choice for optimal power flow anyway. To use the HSL Archive instead of HSL_jll, the archive needs to be custom-compiled, which I will assume here has already been done. I have a confirmed working `libhsl.so` with `ma27`, tested with Ipopt (loaded at runtime).

MadNLPHSL includes a short tutorial on how to use a custom-compiled library, but I have failed to reproduce it. It states that `~/.julia/artifacts/Overrides.toml` should be modified in the following way:

```toml
# replace the HSL_jll.jl artifact with /usr/local/lib/libhsl.so
ecece3e2c69a413a0e935cf52e03a3ad5492e137 = "/usr/local"
```

I’ve put `libhsl.so` at `/home/karlo/HSL/lib/libhsl.so`, and after installing MadNLPHSL (a clean install, as the only package in the Julia environment) I put the following in Overrides.toml. Note that the path must be a quoted string for the TOML to parse:

```toml
ecece3e2c69a413a0e935cf52e03a3ad5492e137 = "/home/karlo/HSL"
```

Normally, when I install MadNLPHSL, then dev-install HSL_jll, and then run `pkg> instantiate`, everything rebuilds and works (the installation must happen in that order). With the Overrides.toml approach, however, it should work without installing HSL_jll (using the custom-compiled library instead), but it does not.
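For reference, the order-sensitive workflow that does work for me looks like this in the package REPL (a sketch; `dev` assumes HSL_jll is obtainable in your registry or from a local path):

```julia-repl
pkg> add MadNLPHSL
pkg> dev HSL_jll
pkg> instantiate
```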

It is my understanding that the content hash (the “ecece…” code) is operating-system dependent, and even version dependent. I have even tried the content hash that HSL_jll actually creates on my system, which for me is 4caf94fdf1f7c386576863cb01ae5c56354bf33f, but without installing HSL_jll it did not work; and since HSL_jll is only obtainable under the stricter licensing conditions, it should not be required at all.
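If HSL_jll is installed (even just temporarily, to diagnose), the hash the override key must match can be read out of its Artifacts.toml rather than guessed. A sketch, assuming HSL_jll is resolvable in the current environment:

```julia
# Sketch: list the git-tree-sha1 values that the installed HSL_jll expects,
# so the key in Overrides.toml can be matched to your platform.
using TOML

# Base.find_package returns .../HSL_jll/src/HSL_jll.jl; Artifacts.toml sits
# two directories up, next to Project.toml.
pkg_src = Base.find_package("HSL_jll")
artifacts_toml = joinpath(dirname(dirname(pkg_src)), "Artifacts.toml")

for (name, meta) in TOML.parsefile(artifacts_toml)
    # An artifact may have a single entry or a list of per-platform entries.
    for entry in (meta isa Vector ? meta : [meta])
        os   = get(entry, "os", "any")
        arch = get(entry, "arch", "any")
        println(name, " (", os, "-", arch, ") => ", entry["git-tree-sha1"])
    end
end
```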

To conclude, I think we need either a better tutorial for this, or a function to define a custom library, just as CUDA.jl has a function to set the CUDA version for the case of installing on a system without CUDA, such as a container. The referenced function is `CUDA.set_runtime_version!(v"11.8"; local_toolkit=true)`.
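For illustration, such a helper might look like the sketch below. Everything in it is hypothetical: the module name, `set_library_path!`, and the stored-path mechanism are invented for this proposal and are not an existing API of MadNLPHSL.jl or HSL.jl.

```julia
# Hypothetical sketch only -- no such function exists today.
module HSLLocal  # stand-in name for wherever this would live

using Libdl

const LIBHSL_PATH = Ref{String}("")

"""
Point the HSL wrappers at a locally compiled libhsl,
in the spirit of CUDA.set_runtime_version!.
"""
function set_library_path!(path::AbstractString)
    isfile(path) || error("no library found at $path")
    # Fail early if the file is not a loadable shared library.
    dlclose(dlopen(path))
    LIBHSL_PATH[] = path
    return path
end

end # module

# Usage (path from this thread):
# HSLLocal.set_library_path!("/home/karlo/HSL/lib/libhsl.so")
```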

@amontoison you may know whether this should go on the MadNLPHSL.jl or the HSL.jl issue list.

Here are the JuMP docs for how to use a custom binary: Custom binaries · JuMP


This helps a lot. The confusing part with HSL_jll is that there are two versions of it: one obtained from the HSL site (HSL_jll.jl v2024.11.28) and the public one, HSL_jll.jl (v4.0.2+0), each with different license conditions. The public version appears to carry a free-for-all license that allows redistribution and modification, even of the binaries, as long as you do not take credit for the product. The v2024 version does not allow redistribution unless purchased, but you may take credit for it. The Julia code is in both cases covered by the permissive MIT license. Basically, the public version can be used to override the artifact and use the binaries from the HSL Archive.
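As an aside, Pkg's Overrides.toml also supports overriding by package UUID plus artifact name, which side-steps the platform-dependent tree hash entirely. A sketch (the UUID below is a placeholder; copy the real one for HSL_jll from your Manifest.toml, and I am assuming the artifact name is `HSL` — check HSL_jll's Artifacts.toml):

```toml
# ~/.julia/artifacts/Overrides.toml
# Replace the placeholder UUID with HSL_jll's UUID from your Manifest.toml.
[00000000-0000-0000-0000-000000000000]
HSL = "/home/karlo/HSL"
```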

I haven’t tried it yet, but JuMP docs are very detailed. I marked it as solved. Thank you! :slightly_smiling_face:

Note that the public HSL_jll is a small shim that allows compilation by providing the appropriate symbols. It does not contain the actual contents of the HSL routines, so any code that actually uses them at runtime will error.

The idea is that you can add HSL_jll as a package and distribute that so things will compile etc, but any user will still need to download the official package and agree to those license terms.
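This distinction can be checked directly: the shim exports stub symbols, so a quick way to tell whether a given `libhsl.so` really contains MA27 is to look for its Fortran entry point. A sketch, assuming gfortran's trailing-underscore name mangling (`ma27ad_`) and the library path from this thread:

```julia
# Sketch: verify that a libhsl.so actually provides MA27 before relying on it.
using Libdl

libpath = "/home/karlo/HSL/lib/libhsl.so"  # adjust to your build
handle = dlopen(libpath)

# dlsym_e returns C_NULL instead of throwing when the symbol is absent.
sym = dlsym_e(handle, :ma27ad_)
println(sym == C_NULL ? "ma27 missing (likely the public shim)" :
                        "ma27 symbols present")
dlclose(handle)
```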

One example of code that compiles against the public HSL_jll is Uno: