Choosing a package for univariate (convex) optimization in Julia

I need to solve 1d convex optimization programs where I have access to all derivatives. Ideally I’d use Optim.jl, but its interface only supports vectors. The backend NLSolvers.jl works, but its documentation is not as detailed.
I know I can wrap x::Real into a StaticArrays.MVector(x) and use any vector optimization package, but I was wondering if there are easier options. My goal is to minimize code complexity and setup costs, especially memory allocations, because this optimization subroutine will get called a gazillion times.
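For concreteness, here is a minimal sketch of that wrapping approach (the quadratic objective and the starting point are just placeholders):

```julia
using Optim, StaticArrays

# Placeholder 1d objective, viewed through a length-1 vector:
f(v) = (v[1] - 2.0)^2

x0 = MVector(0.0)                 # wrap the scalar initial guess
res = optimize(f, x0, BFGS())     # any multivariate method of Optim.jl
xmin = Optim.minimizer(res)[1]    # unwrap the scalar result
```

It works, but every call still carries the ceremony (and potential internal allocations) of the multivariate machinery, which is exactly what I’d like to avoid.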

Note that NLSolvers.jl is not the backend of Optim.jl; it is a rewrite that Patrick has been slowly working on solo for a while now. One of the goals, IIRC, was to make your use case more “first class”, as shown in the README. For example, optimising small problems with static-array decision variables or plain scalars should not allocate.
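From memory, the scalar path in the README looks roughly like this; treat the exact names (`ScalarObjective`, `OptimizationProblem`, `LineSearch`, `Newton`, `OptimizationOptions`) and keyword arguments as approximate, since the package is still evolving:

```julia
using NLSolvers

obj(x) = x^4 + x^2                                  # scalar objective
grad(∇f, x) = 4x^3 + 2x                             # derivative (the ∇f slot is unused for scalars)
objgrad(∇f, x) = (obj(x), grad(∇f, x))
objgradhess(∇f, ∇²f, x) = (obj(x), grad(∇f, x), 12x^2 + 2)

sobj = ScalarObjective(f = obj, g = grad, fg = objgrad, fgh = objgradhess)
prob = OptimizationProblem(sobj; inplace = false)   # scalar problem, nothing in-place
solve(prob, 0.3, LineSearch(Newton()), OptimizationOptions())
```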


I would still recommend a dedicated 1d algorithm for 1d problems.
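For what it’s worth, Optim.jl itself already has a univariate interface over a bracketing interval (Brent / golden section). It doesn’t use derivatives, which may be why it was ruled out, but it needs nothing more than a bracket (placeholder objective below):

```julia
using Optim

f(x) = (x - 2.0)^2                    # placeholder objective
res = optimize(f, 0.0, 5.0, Brent())  # Brent's method on the bracket [0, 5]
Optim.minimizer(res)                  # ≈ 2.0
```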


If you have the first and second derivatives, using a second-order optimisation algorithm may be better for local optimisation (e.g. Newton’s method converges in a single iteration in the quadratic case).
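To spell out the quadratic case: Newton iterates $x_{k+1} = x_k - f'(x_k)/f''(x_k)$, so for $f(x) = \tfrac{a}{2}x^2 + bx + c$ with $a > 0$,

$$x_1 = x_0 - \frac{a x_0 + b}{a} = -\frac{b}{a},$$

which is the exact minimizer, regardless of the starting point $x_0$.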


Yes, I would just write a custom implementation of Newton’s method for that.
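A minimal sketch of what I mean, assuming you can supply both derivatives and the problem is smooth and strictly convex (so f″ > 0 everywhere); the function name and tolerances are made up:

```julia
# Pure Newton iteration on f′ for a smooth, strictly convex f.
# f′ and f″ are the user-supplied first and second derivatives.
function newton1d(f′, f″, x0; tol = 1e-10, maxiter = 100)
    x = float(x0)
    for _ in 1:maxiter
        g = f′(x)
        abs(g) ≤ tol && return x   # stationary point ⇒ global minimum (convexity)
        x -= g / f″(x)             # Newton step
    end
    return x                       # best iterate after maxiter steps
end

# Example: minimize f(x) = exp(x) - 2x, whose minimizer is log(2)
newton1d(x -> exp(x) - 2, x -> exp(x), 0.0)  # ≈ 0.6931
```

Everything here is scalar arithmetic, so the loop should compile down allocation-free.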


Ya, it’s probably hard to beat simple Newton + a back-tracking line search for 1D; a minimal variant is sketched below. But if you want fancier line-search algorithms, it gets slightly more complex.
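Concretely, the back-tracking guard is only a few extra lines on top of the plain Newton sketch above. This version uses a simple decrease test rather than a full Armijo condition, and the halving constants are arbitrary:

```julia
# Damped Newton: shrink the step until the objective actually decreases.
function damped_newton1d(f, f′, f″, x0; tol = 1e-10, maxiter = 100)
    x = float(x0)
    for _ in 1:maxiter
        g = f′(x)
        abs(g) ≤ tol && return x
        Δ = -g / f″(x)                        # Newton direction
        t, fx = 1.0, f(x)
        while f(x + t * Δ) > fx && t > 1e-8
            t /= 2                            # back-track
        end
        x += t * Δ
    end
    return x
end
```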

Why line search at all? It’s convex.


Ya, it might not be necessary if you know the second derivative and the problem is strictly convex and close to quadratic; far from the minimum a pure Newton step can still overshoot, so the back-tracking guard is cheap insurance.