[ANN] Optim.jl updates

Hi all!

I am not sure if the Package Announcements category existed back when the previous Optim.jl version announcements were made, so I am starting a new thread here.

Optim v1.8.0 is out as of yesterday. It is a feature release because @blegat has added MathOptInterface support (Introduction · MathOptInterface), thereby closing one of the oldest issues in Optim.jl. I also made the release because I found out that the docs were not building.

Lastly, I want to point something out. I saw some discussions where people had the impression that Optim.jl may not be maintained, may be deprecated, and so on. I just wanted to say that that is definitely not the case. I have been busy with work at times and have not reviewed and responded to issues quickly. Actually, very slowly if at all. I have changed my GitHub settings and mail inbox settings to hopefully highlight these so I don’t forget or miss them. That said, you are always welcome to ping me on Slack, and I’ll probably get to it rather quickly. The fact that there are not new versions every month does not mean the package is dead; rather, it probably means that it has reached a relatively stable state.

For a very long time I have also been working on NLSolvers.jl, which is supposed to replace the Optim.jl internals at some point. It is tagged, and I am also happy to help with issues in that package whenever needed. The package tries to avoid a lot of the issues I had with Optim.jl and supports non-allocating versions of both equation solving and optimization. I will make a separate post in the near future about the ideas and plans for that package.

Here at the end I also want to point out that there have been several unconstrained optimization, constrained optimization, nonlinear equation solving, and nonlinear least squares packages in the ecosystem for a long time. That is good! I think they all fill various niches and needs. I use Optim.jl every day and so do our users, and it is adjusted and improved according to the needs that I and our product have. If users open issues and give feedback, their input will also help shape it. My needs are mostly met, so that might be why there are not 20 new versions a year, and I think many other users find it useful as well. At least, more than 100 people and teams a year have been kind enough to cite their use in the past couple of years according to Google Scholar. It’s great to see that citing software has become somewhat normalized!

Until next time,
Patrick / @pkofod

70 Likes

I’ve been using Optim.jl for years, but I miss an Adam optimizer since it’s so commonly used in practice. Also, the efficient fg! API always requires some boilerplate.
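To show what I mean by boilerplate: a typical fg! setup looks roughly like this (a minimal sketch with an illustrative Rosenbrock-style objective; see the Optim docs for the exact interface):

```julia
using Optim

# Combined objective/gradient callback: Optim calls fg!(F, G, x) and we only
# compute what is requested, filling G in place and returning the objective.
function fg!(F, G, x)
    a, b = 1.0, 100.0  # Rosenbrock parameters (illustrative)
    if G !== nothing
        G[1] = 2 * (x[1] - a) - 4b * x[1] * (x[2] - x[1]^2)
        G[2] = 2b * (x[2] - x[1]^2)
    end
    if F !== nothing
        return (a - x[1])^2 + b * (x[2] - x[1]^2)^2
    end
    return nothing
end

result = optimize(Optim.only_fg!(fg!), zeros(2), LBFGS())
```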

I will probably switch to Optimization.jl (which uses Optim.jl as one of its backends) as soon as they fix this issue (avoiding multiple forward passes) and the LBFGS issue.

2 Likes

You are free to switch of course! :slightly_smiling_face: Any feature requests can go to the respective packages. NLSolvers.jl has Adam and AdaMax. I worked on that when I had more time, and the intention was to bring the NLSolvers.jl code in as the next set of internals for Optim.jl, but I have not gotten there yet :slight_smile:

9 Likes

If you have your stuff set up for Optim still, you can give Add Adam and AdaMax by pkofod · Pull Request #1069 · JuliaNLSolvers/Optim.jl · GitHub a go.

6 Likes

I use Optim.jl and just want to say thank you for your work on this nice package!

7 Likes

Yes, I’ll give it a shot!
Thanks a lot :slight_smile:

1 Like

Optim v1.9.0 is out
A new feature release of Optim is out.

Based on @roflmaostc’s feedback I pulled in the Adam/AdaMax code from NLSolvers.jl, exported the methods as Adam and AdaMax, and added very brief mentions in the docs. Since these are fixed step-length methods, the user may have to set the step length themselves, but I didn’t make it a required keyword. Instead, the defaults are the values from the paper.
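A minimal usage sketch (here I just construct Adam() with its defaults and raise the iteration budget; see the docs for the step-length keyword):

```julia
using Optim

f(x) = sum(abs2, x .- 1)  # simple smooth test objective

# Adam/AdaMax use a fixed step length, so they typically need many more
# iterations than quasi-Newton methods to converge on smooth problems.
res = optimize(f, zeros(10), Adam(), Optim.Options(iterations = 10_000))
```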

Edit:

Optim v1.9.1 is out
@roflmaostc found out that some operations in Adam and AdaMax used scalar indexing and thus were not CUDA compatible, so a small bug fix was released as well.

10 Likes

This looks very nice. It would make a great solver backend for my NLLSsolver.jl package. That package focuses more on problem definition (i.e., the frontend), with features such as:

  • automatic computation of the gradient and Hessian (including sparse Hessians)
  • support for robust kernels, including optimizing their parameters
  • support for manifold variables

However, it lacks support for:

  • a broad variety of solvers
  • constraints

So it seems that NLSolvers.jl offers solutions to those. In order to use it as the backend, NLSolvers.jl would need to support manifold variables, which I see is on your roadmap. I’d be happy to discuss it with you when you’re at that stage.

1 Like

Yes, manifolds are only included in a few places in the form of some basic scaffolding :slight_smile: I am working on some fixes and documentation. I assume you would want to wait until that is out so you can better evaluate if it’s a good fit.

Did anyone notice a performance regression from v1.7 to v1.8 and v1.9? We have some scripts that are taking much longer to finish now after the update. We are not sure yet if the regression is coming from Optim.jl.

The optimization problem that we think was affected is this one:

We add box constraints with tight bounds sometimes (lower ~= upper).
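For reference, the call pattern is roughly this (a simplified sketch with a stand-in objective, not the actual problem from our scripts):

```julia
using Optim

f(x) = sum(abs2, x)  # stand-in for the real objective

lower = [0.0, 0.5, -1.0]
upper = [1.0, 0.5 + 1e-8, 1.0]   # note the almost tight bound on x[2]
x0    = [0.5, 0.5 + 5e-9, 0.0]   # kept strictly inside the box

res = optimize(f, lower, upper, x0, Fminbox(LBFGS()))
```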

If you’re tracking run time, hopefully you store a Manifest as well, so you can see exactly which package versions were used? Maybe it’s better to start a separate thread for this discussion?

We will try to isolate a MWE in a separate thread :+1:

1 Like

Please tag me and I’ll try to help!

1 Like

That depends on whether you’ve already settled on the design. If so, I’ll wait. If not, I’m happy to have a chat, if that helps you with the design.

If you are interested, we could also check how to couple Optim.jl and ManifoldsBase.jl – we had some discussions quite a while back and concluded that it might be something nice as well. So if you are interested in that, let me know – I am happy to help.
It might not be easily doable in every solver, but probably in some.

1 Like

I am planning a small release later today, so let me catch up on two versions that have not been reflected in this thread yet.

Back in March, version 1.9.3 was released. This patch release fixed the long load times of Optim.jl caused by the then recent addition of MathOptInterface.jl support. A fix was also included to make sure the documentation actually gets updated; previously it was quite outdated.

Soon after, in April, version 1.9.4 was released. It was also related to the MOI support, this time fixing a precompilation issue.

There is ongoing work by @tim.holy to fix the Hager & Zhang line search, but sometimes you find a large number of issues once you scratch the surface. Tim is not one to hot fix issues, so we hope that the recurring issues with this line search will be improved once it’s done.

5 Likes

Okay, here we go.

Optim v1.10.0 has been released

Links
  • Release notes: Release v1.10.0 · JuliaNLSolvers/Optim.jl · GitHub
  • Registry PR: New version: Optim v1.10.0 by JuliaRegistrator · Pull Request #119223 · JuliaRegistries/General · GitHub
  • Updated docs: Home · Optim

Description
Overall, this release mostly contains small bug fixes, but it increments the minor version to v1.10 since Christian Hauschel added the time_limit option to univariate optimization, which was previously only available for multivariate optimization.
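A small sketch of the new option in the univariate case (assuming time_limit, in seconds, is passed as a keyword to optimize like the other univariate options):

```julia
using Optim

f(x) = (x - 1.5)^2 + sin(10x)  # a univariate objective

# Univariate optimize takes its options as keyword arguments;
# time_limit now works here too, not just for multivariate problems.
res = optimize(f, 0.0, 3.0, GoldenSection(); time_limit = 1.0)
```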

NelderMead() had an issue where trace elements were not copied on each iteration, which caused the stored trace to show the same values (the final ones) for all iterations. There was also a fix to maximize, which did not work if you provided f, g!, and h! due to a misspelled variable that was not caught because of lacking test coverage. There were also several docs improvements.

Release notes from GitHub

Shoutout
Special shoutout to christianhauschel and tuncbkose, who made contributions to Optim for the first time. I could not figure out whether they are on here or what their usernames may be. Feel free to reveal yourself :slight_smile:

Also a shoutout to @tim.holy, who has been looking into various issues and thinking about better testing of algorithms than what is currently in Optim.jl/test/. Tim Holy also recently joined the JuliaNLSolvers organization as an owner. This means that if something stalls or I cannot be reached anymore (for whatever reason, though hopefully not the proverbial bus), Tim can help out. The other owners are @johnmyleswhite and myself. Tim Holy has been contributing to Optim.jl with John Myles White since before my time in the Julia ecosystem, so it is quite natural that he joins the ranks. Note, I will probably remain the most active, but having John and Tim marked as owners ensures that someone has access.

22 Likes