PRIMA: a package for solving general nonlinear optimization problems without using derivatives

Hi everyone,

I am very glad to announce PRIMA, a package for solving general nonlinear optimization problems without using derivatives.

PRIMA provides the reference implementation for M.J.D. Powell’s renowned derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. The “P” in the name stands for Powell, and “RIMA” is an acronym for “Reference Implementation with Modernization and Amelioration”.

Powell’s solvers are widely used by engineers and scientists. For an idea of their popularity, see Section 1 of a recent paper on Powell’s solvers, as well as the Google search results for COBYLA and BOBYQA.

The current version of PRIMA is implemented in modern Fortran (F2008 or above). Interfaces to Julia, MATLAB, Python, and C are available. Native implementations in these languages will also be provided in the future.

If you are interested in PRIMA, you may check its GitHub repository at

libprima / prima

and its Julia interface PRIMA.jl:

JuliaRegistries / General / P / PRIMA.

The Julia interface of PRIMA is mainly due to the efforts of Éric Thiébaut and Alexis Montoison, who deserve all the credit.
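To give a flavor of the Julia interface, here is a minimal sketch of calling one of the solvers through PRIMA.jl. It assumes the package exposes solver functions such as `PRIMA.newuoa(f, x0)` returning the approximate minimizer together with an information structure; please check the PRIMA.jl README for the exact names and options.

```julia
using PRIMA  # Julia interface to the PRIMA solvers

# A derivative-free objective: the classic Rosenbrock function.
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

x0 = [-1.2, 1.0]  # starting point

# NEWUOA targets unconstrained problems and only ever evaluates the objective,
# never its derivatives. The returned `info` is assumed to carry details such as
# the final objective value and the number of function evaluations.
x, info = PRIMA.newuoa(rosenbrock, x0)

println("approximate minimizer: ", x)
```

For bound-constrained or linearly constrained problems, BOBYQA and LINCOA can be called in the same spirit; see the repository for details.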

Thanks and regards,
Zaikun ZHANG
Ph.D. and Assistant Professor
Dept. App. Math., Hong Kong Polytechnic University


PS: Who was Powell?

Michael James David Powell FRS was “a British numerical analyst who was among the pioneers of computational mathematics”. He was an inventor of or early contributor to the quasi-Newton, trust-region, augmented Lagrangian, and SQP methods, each of which is a pillar of modern numerical optimization. He also made significant contributions to approximation theory and methods.

Among numerous honors, Powell was one of the two recipients of the first Dantzig Prize, awarded jointly by the Mathematical Optimization Society (MOS) and the Society for Industrial and Applied Mathematics (SIAM). The Dantzig Prize is considered the highest award in optimization.


Thanks for setting this up! I believe @Vaibhavdixit02 will have a wrapper for Optimization.jl by the end of the day.


Great! Thank you and @Vaibhavdixit02. I hope PRIMA will be useful to the Julia community.

Éric Thiébaut and Alexis Montoison deserve all the credit for the Julia interface, PRIMA.jl.


My guess is that it will be less widely used than it is in Fortran, because derivative-free optimizers are generally worse than optimizers with derivatives and Julia has pretty good autodiff functionality. Still, having good derivative-free methods is nice for checking why everything is exploding. It is also great to see new work in this area. More robust methods are always wonderful.

The wrapper is almost done: [WIP] Add PRIMA wrapper by Vaibhavdixit02 · Pull Request #612 · SciML/Optimization.jl · GitHub

Derivative-free methods have their place in the canon of methods. Indeed, pervasive AD makes them less widely used than they would be in something like Python. However, they are a good sanity check and a good option for people who have wrapped a C/Fortran code that isn’t differentiable. So it’ll be interesting to benchmark them in the SciMLBenchmarks and see how they perform against derivative-based methods and against the NLopt implementations.
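As a toy illustration of that kind of cross-check (not a benchmark), one could run the same problem through PRIMA.jl and through NLopt.jl’s port of the same Powell algorithm and compare the answers. The snippet below is only a sketch: the PRIMA.jl call assumes a `PRIMA.newuoa(f, x0)` signature returning the solution and an info structure, while the NLopt part follows its documented property-style API.

```julia
using PRIMA, NLopt

rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
x0 = [-1.2, 1.0]

# PRIMA's reference NEWUOA (assumed signature: solver(f, x0) -> (x, info)).
x_prima, info = PRIMA.newuoa(rosenbrock, x0)

# NLopt's NEWUOA implementation on the same problem; `grad` is unused by
# derivative-free algorithms but required by the NLopt objective signature.
opt = Opt(:LN_NEWUOA, length(x0))
opt.min_objective = (x, grad) -> rosenbrock(x)
opt.xtol_rel = 1e-8
fmin, x_nlopt, ret = NLopt.optimize(opt, x0)

println("PRIMA : ", x_prima)
println("NLopt : ", x_nlopt, " (", ret, ")")
```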
