I am excited to announce my new package, NonlinearOptimizationTestFunctionsInJulia.jl. It is designed to be a comprehensive collection of test functions for benchmarking nonlinear optimization algorithms. Each function comes with an analytical gradient and detailed metadata to ensure efficient and reliable testing.
Key Features:
Standardized Test Functions: Includes well-known functions like Rosenbrock, Ackley, and Branin, all with consistent interfaces.
Analytical Gradients: An exact gradient is provided for every function, which speeds up testing and benchmarking.
Detailed Metadata: Each function is supplied with information on its starting points, global minima, and properties (e.g., unimodal, multimodal).
Compatibility: Seamless integration with popular optimization packages such as Optim.jl and NLopt.jl, and with tools like ForwardDiff.jl (see the sketch below this list).
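To give an idea of how this looks in practice, here is a minimal sketch of the intended pattern: a test function bundled with its analytical gradient, handed straight to Optim.jl. The names below are illustrative, not the package's actual exported API.

```julia
using Optim

# Rosenbrock: f(x) = (1 - x1)^2 + 100 * (x2 - x1^2)^2, global minimum at (1, 1).
# Illustrative definitions only -- not the package's exported names.
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Analytical gradient, written in place as Optim.jl expects.
function rosenbrock_grad!(g, x)
    g[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    g[2] = 200.0 * (x[2] - x[1]^2)
    return g
end

x0 = [-1.2, 1.0]                                  # classic starting point
result = optimize(rosenbrock, rosenbrock_grad!, x0, LBFGS())
println(Optim.minimizer(result))                  # ≈ [1.0, 1.0]
```

Passing the analytical gradient means the solver never has to fall back on finite differences, which is what makes benchmarking faster and more reliable.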
This package is a tool for anyone looking to develop, test, or compare optimization algorithms.
Thanks for the great feedback! You're right that our project overlaps with CUTEst.jl and OptimizationProblems.jl, but we're carving out a niche as a lightweight, pure-Julia suite, starting with Molga & Smutnicki (2005) functions such as Cross-in-Tray. We plan to expand with AMPGO and BBOB functions, such as Alpine, so algorithms can be tested across a range of dimensions. Unlike CUTEst.jl, we avoid Fortran dependencies for the sake of stability.
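For reference, Cross-in-Tray is a compact example of the multimodal functions in that compilation; the sketch below (again with illustrative names, not the package's API) shows roughly what such a definition looks like in pure Julia.

```julia
# Cross-in-Tray: multimodal, with four global minima of f ≈ -2.06261
# at (±1.34941, ±1.34941). Illustrative definition, not the exported API.
crossintray(x) = -1e-4 * (abs(sin(x[1]) * sin(x[2]) *
                              exp(abs(100 - sqrt(x[1]^2 + x[2]^2) / π))) + 1)^0.1

crossintray([1.34941, 1.34941])   # ≈ -2.06261
```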
I am new to GitHub and not very experienced with Julia, so I thought it would be better to start with a project of my own first: trying to create Problems without making problems ^^. The plan is to learn enough to be able to support other projects in the future.