[ANN] GALAHAD.jl: Bridging Julia with advanced optimization techniques

Dear Julia Community,

We are pleased to announce the release of GALAHAD.jl, the Julia
interface to the renowned GALAHAD library, now at version 5.0.
GALAHAD.jl brings the powerful optimization capabilities of GALAHAD
directly to Julia users, offering tools specifically designed to tackle
a wide range of smooth optimization problems.

Why GALAHAD?

GALAHAD is a sophisticated collection of Fortran modules, developed to
solve large-scale smooth optimization problems both efficiently and
accurately. It provides advanced algorithms for:

  • Unconstrained and Bound-Constrained Optimization:
    Solve problems with or without bounds on the variables.
  • Optimization Subproblems:
    Solve quadratic or least-squares subproblems involving a
    trust-region constraint or norm regularization (a sketch of
    these subproblems follows this list).
  • Linear and Nonlinear Least-Squares:
    Solve linear and nonlinear least-squares problems.
  • Linear and Quadratic Programming (LP/QP):
    Handle optimization with linear constraints and linear or
    quadratic objectives.
  • Nonlinear Programming (NLP):
    Address problems with nonlinear objectives or constraints.
  • Global Optimization: Extend local optimization techniques
    to try to identify global minima of non-convex problems.
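
To make the subproblem item above concrete, the prototypical trust-region and regularized subproblems (handled in GALAHAD by packages such as TRS and RQS) take, roughly, the form

$$\min_{x \in \mathbb{R}^n} \; q(x) = g^\top x + \tfrac{1}{2} x^\top H x \quad \text{subject to} \quad \|x\| \le \Delta$$

or

$$\min_{x \in \mathbb{R}^n} \; q(x) + \frac{\sigma}{p} \|x\|^p,$$

where Δ > 0 is the trust-region radius, σ > 0 a regularization weight, and p ≥ 2 the regularization power; the norm may also be a weighted two-norm. This is only a rough statement to fix ideas; see the GALAHAD documentation for the exact formulations each package solves.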

For more details and to start using GALAHAD.jl, visit:

Alexis Montoison (@amontoison), on behalf of the GALAHAD team
Nick Gould (@nimgould), Jari Fowkes (@jfowkes) & Dominique Orban (@dpo)

35 Likes

Thanks a ton, Alexis, for this announcement!
I am sure that with more users of the software, the examples section for each algorithm will become more approachable.

Just out of curiosity, do you plan to write an interface between GALAHAD.jl and Optimization.jl?

It would be fantastic to have access to the algorithms in GALAHAD directly in Optimization.jl.

2 Likes

This looks great! Is the threading in this package controlled mostly by the number of BLAS threads? I’m guessing one has to create thread-local structures to use these routines across multiple Julia threads?

At the very least, we plan to write interfaces between JSO’s NLPModels and GALAHAD.jl.

1 Like

@abehersan We plan to write interfaces between NLPModels.jl and GALAHAD.jl.
Since JuMP-dev 2024, we have had a fully working bridge from JuMP models to NLPModels thanks to @blegat and NLPModelsJuMP.jl, so we expect to get two interfaces for the price of one (as with Percival.jl).
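
For anyone curious what that bridge looks like in practice, here is a minimal sketch (the model is just an illustrative Rosenbrock problem; the eventual GALAHAD.jl solver call is not shown since that interface is still to come):

```julia
using JuMP, NLPModelsJuMP, NLPModels

# An illustrative JuMP model (Rosenbrock).
model = Model()
@variable(model, x[1:2])
@objective(model, Min, (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2)

# Convert the JuMP model into an NLPModel via NLPModelsJuMP.jl.
nlp = MathOptNLPModel(model)

# The standard NLPModels API is now available.
x0 = nlp.meta.x0
obj(nlp, x0)    # objective value at the starting point
grad(nlp, x0)   # gradient at the starting point
```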

For the interface between Optimization.jl and NLPModels.jl, we only have one direction (NLPModels.jl → Optimization.jl), but it should not be hard to add the other one.

@xzackli GALAHAD doesn’t do multithreading directly. However, it relies heavily on external dependencies that do, such as BLAS/LAPACK, the linear solvers, and the tools that perform automatic differentiation.

For automatic differentiation, everything can be done in Julia because GALAHAD supports reverse communication: we just need to fill the arrays of derivatives (gradient, sparse Jacobian, sparse Hessian), and we can use whatever tools we want (ADNLPModels.jl, JuMP.jl, DifferentiationInterface.jl, etc.).
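
To make the reverse-communication idea concrete, here is a minimal sketch (not GALAHAD.jl’s actual driver loop) of how the requested derivative arrays could be filled with ADNLPModels.jl; the objective is just a placeholder:

```julia
using ADNLPModels, NLPModels

# Placeholder smooth objective (Rosenbrock) and starting point.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2, 1.0])

x = nlp.meta.x0

# Arrays a reverse-communication solver would ask us to fill:
g = grad(nlp, x)                  # dense gradient
rows, cols = hess_structure(nlp)  # sparsity pattern of the Hessian
vals = hess_coord(nlp, x)         # corresponding Hessian values
```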

GALAHAD is quite similar to Ipopt in terms of the linear solvers it supports (MUMPS, HSL, SPRAL, MKL Pardiso), all of which are known to be parallel (through BLAS or internal threading).
In particular, GALAHAD shines brightest with the HSL linear solvers, which are developed at the same place (the Rutherford Appleton Laboratory) by other members of the same team.
We initially developed libHSL and HSL_jll.jl for GALAHAD.jl, even though we knew that all users of Ipopt.jl would also be happy to have them.

Last but not least, GALAHAD.jl and its dependencies are compiled with LBT (libblastrampoline), which means we can dynamically switch between BLAS/LAPACK backends at runtime if a more efficient one is available for our platform (MKL, Apple Accelerate, NVBLAS, …).
I documented how to do this in the README of Ipopt.jl.
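
As an illustration (generic LBT usage, nothing specific to GALAHAD.jl), switching the backend is just a matter of loading the corresponding package:

```julia
using LinearAlgebra

# Backend shipped with Julia (typically OpenBLAS).
BLAS.get_config()

# Loading MKL.jl (or AppleAccelerate.jl on macOS) redirects all
# BLAS/LAPACK calls through libblastrampoline to the new backend.
using MKL
BLAS.get_config()
```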

4 Likes

Awesome! I can only wish you good luck. Hope everything goes smoothly there.

Thanks for making Julia and its optimization ecosystem awesome!

1 Like