[ANN] NonlinearOptimizationTestFunctions.jl: A comprehensive collection of test functions for nonlinear optimization

Hello Julia Community,

I am excited to announce my new package, NonlinearOptimizationTestFunctions.jl. It is designed to be a comprehensive collection of test functions for benchmarking nonlinear optimization algorithms. Each function comes with an analytical gradient and detailed metadata to ensure efficient and reliable testing.

Key Features:

  • Standardized Test Functions: Includes well-known functions like Rosenbrock, Ackley, and Branin, all with consistent interfaces.
  • Analytical Gradients: Accelerates testing and benchmarking processes.
  • Detailed Metadata: Each function is supplied with information on its starting points, global minima, and properties (e.g., unimodal, multimodal).
  • Compatibility: Seamless integration with popular optimization packages such as Optim.jl, NLopt.jl, and tools like ForwardDiff.jl.

This package is a tool for anyone looking to develop, test, or compare optimization algorithms.
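
To give a feel for the intended workflow, here is a minimal sketch using Optim.jl. Only the TEST_FUNCTIONS lookup is confirmed by the examples later in this thread; the f, grad!, and meta fields are illustrative assumptions, not the package's documented API:

using Optim, NonlinearOptimizationTestFunctions

# Look up a test function from the collection; the field names below
# are assumptions for illustration, not the documented interface
tf = TEST_FUNCTIONS["rosenbrock"]

x0 = tf.meta[:start](2)                      # assumed: metadata supplies a start point for n = 2
res = optimize(tf.f, tf.grad!, x0, LBFGS())  # analytical gradient, no finite differencing

println(Optim.minimizer(res))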

You can find the repository here: https://github.com/UweAlex/NonlinearOptimizationTestFunctions.jl

Cool! Looks similar to CUTEst.jl (https://github.com/JuliaSmoothOptimizers/CUTEst.jl; it has 286 unconstrained problems) and OptimizationProblems.jl (https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl).

More test functions:

Can you make sure to support the Optimization.jl form? Then we’ll add it to the SciMLBenchmarks.jl server.
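
For reference, here is a minimal sketch of what the Optimization.jl form looks like (Rosenbrock as a stand-in objective; not tied to this package's API):

using Optimization, OptimizationOptimJL

# Optimization.jl objectives take (u, p): decision variables and parameters
rosen(u, p) = (1 - u[1])^2 + 100 * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosen, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0])  # initial point u0
sol = solve(prob, LBFGS())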

Thanks for the great feedback! You’re right that our project shares vibes with CUTEst.jl and OptimizationProblems.jl, but we’re carving out a niche with a pure-Julia, lightweight suite starting with Molga & Smutnicki (2005) functions like Cross-in-Tray. We’re excited to expand with AMPGO and BBOB functions, like Alpine, to test algorithms across dimensions. Unlike CUTEst.jl, we avoid Fortran dependencies for stability.

There’s no Fortran in OptimizationProblems.jl. Why not contribute new problems there?

I am new to GitHub and not very experienced with Julia, so I thought it better to start something of my own first: try to create Problems without making problems ^^. The plan is to learn enough to be able to support other projects in the future.

Update: The first 100 test functions are now implemented! Here's the link to the current state on GitHub: https://github.com/UweAlex/NonlinearOptimizationTestFunctions.jl. Feedback is welcome! 😊

I’m excited to share an update on NonlinearOptimizationTestFunctions.jl!

I have now implemented over 200 test functions, including the full sets from Jamil & Yang (2013) and Molga & Smutnicki (2005). All functions, and their analytical gradients where available, can now be evaluated in high precision, enabling reliable benchmarking for both standard and arbitrary-precision optimization workflows.
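
As a quick illustration of the high-precision claim, here is a sketch; the TEST_FUNCTIONS lookup matches the example further down in this thread, while the .f field is an assumption:

using NonlinearOptimizationTestFunctions

setprecision(BigFloat, 256)             # working precision in bits
tf = TEST_FUNCTIONS["rosenbrock"]       # assumed field access below
x = [BigFloat("1.5"), BigFloat("0.5")]  # parse strings to avoid Float64 rounding
tf.f(x)                                 # generic code propagates BigFloat throughout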

In addition, 100+ more unconstrained test functions are already in the pipeline. I’ve also completed the initial groundwork for incorporating constrained optimization problems, which will significantly expand the package’s applicability in future releases.

More updates soon — and as always, contributions, feedback, and ideas are welcome!

I’m currently polishing up the documentation and experimenting with a few more demos to make the package even more user-friendly. One feature I’m particularly excited about is the with_box_constraints wrapper, which turns bounded test functions into unconstrained ones using a domain-safe modified exact L1 penalty method.

Unlike traditional L1 penalties, which may evaluate the objective or gradient outside the bounds (risking domain errors or instability), this wrapper clamps points to the feasible region before evaluation and adds a penalty term (with fixed ρ = 10^6) that pulls violations back inward. This ensures safe, stable runs, especially for benchmarks where the bounds define the valid domain, and it works with any unconstrained optimizer (e.g., L-BFGS or BFGS), not just bound-aware methods like Optim.jl's Fminbox.
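
The idea behind the wrapper, in sketch form (illustrative only, not the actual with_box_constraints implementation):

# Sketch of the clamp-then-penalize scheme described above
function box_penalized(f, lb, ub; ρ = 1e6)
    return function (x)
        xc = clamp.(x, lb, ub)         # f never sees points outside the box
        violation = sum(abs, x .- xc)  # L1 distance from x to the feasible region
        return f(xc) + ρ * violation   # exact penalty pulls iterates back inside
    end
end

Any unconstrained method can then minimize the wrapped objective directly; for sufficiently large ρ, the exact L1 penalty makes the wrapped minimizers coincide with the bounded ones.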

A quick benchmark across all 155 bounded functions (10 random feasible starts each) shows it is often more robust and efficient than Fminbox: higher success rates on tricky cases (e.g., non-differentiable or ill-conditioned functions) and fewer function calls when both converge. Check out the updated docs for usage examples and the full comparison.

Feedback welcome! Let me know if you'd like more details or want to try it out.

@amontoison may be interested in this

SHGO.jl: Global Optimization for Test Function Analysis

Quick update on NonlinearOptimizationTestFunctions.jl: I’m building SHGO.jl (Simplicial Homology Global Optimization) to automatically characterize multimodal test functions.

Status

A working prototype finds all local minima and reports the basin count:

using SHGO, NonlinearOptimizationTestFunctions

# Pin the six-hump camelback test function to n = 2 dimensions
tf = fixed(TEST_FUNCTIONS["sixhumpcamelback"]; n=2)
# Run the SHGO analysis; n_div controls the sampling resolution
result = analyze(tf; n_div=15)

println("$(result.num_basins) basins, global min: $(get_global_value(result))")
# Output: 6 basins, global min: -1.031628

https://github.com/UweAlex/SHGO.jl

Next: PyCall Benchmark vs SciPy

This week I'll validate against scipy.optimize.shgo to check correctness and performance. The goal is to add multimodality metrics to the test function collection automatically.
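
The planned cross-check could look roughly like this (a sketch assuming SciPy is installed in the linked Python environment; Himmelblau's function is just a stand-in objective):

using PyCall

so = pyimport("scipy.optimize")
# Himmelblau's function: four known global minima, a good SHGO smoke test
himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

res = so.shgo(himmelblau, [(-5.0, 5.0), (-5.0, 5.0)])
println(res.fun, " at ", res.x)  # best value and minimizer found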

Feedback welcome! 🚀
