[ANN] Optim.jl updates

Hi all!

I am not sure if the Package Announcements category existed back when the previous version announcements were made about Optim.jl, so I am starting a new thread here.

Optim v1.8.0 is out as of yesterday. It is a feature release because @blegat has added MathOptInterface support (Introduction · MathOptInterface), thereby closing one of the oldest issues in Optim.jl. I also made the release because I found out that the docs were not building.

Lastly, I want to point something out. I saw some discussions where people had the impression that Optim.jl may not be maintained, may be deprecated, and so on. I just wanted to say that that is definitely not the case. I have been busy with work at times and have not reviewed and responded to issues quickly. Actually, very slowly if at all. I have changed my GitHub and mail inbox settings to hopefully highlight these so I don’t forget or miss them. That said, you are always welcome to ping me on Slack, and I’ll probably get to it rather quickly. The fact that there are not new versions every month does not mean that the package is dead; rather, it probably means that it has reached a relatively stable state.

For a very long time I have also been working on NLSolvers.jl, which is supposed to replace the Optim.jl internals at some point. It is tagged, and I am also happy to help with issues in that package whenever needed. The package tries to avoid a lot of the issues I had with Optim.jl and supports non-allocating versions of both equation solving and optimization. I will make a separate post in the near future about the ideas and plans for that package.

Here at the end I also want to point out that there have been several unconstrained optimization, constrained optimization, nonlinear equation solving, and nonlinear least squares packages in the ecosystem for a long time. That is good! I think they all fill various niches and needs. I use Optim.jl every day and so do our users, and it is adjusted and improved according to the needs that I and our product have. If users open issues and give feedback, their input will also help shape it. My needs are mostly met, so that might be why there are not 20 new versions a year, and I think many other users find it useful as well. At least, more than 100 people and teams a year have been kind enough to cite their use in the past couple of years according to Google Scholar. It’s great to see that citing software has become somewhat normalized!

Until next time,
Patrick / @pkofod

72 Likes

I’ve been using Optim.jl for years, but I miss an Adam optimizer since it’s used so commonly in practice. Also, the efficient fg! API always requires some boilerplate.
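To illustrate what I mean by boilerplate, roughly this pattern, using Optim’s only_fg! wrapper (the objective here is just a toy example):

```julia
using Optim

# Shared objective/gradient evaluation: compute F and G in a single pass.
function fg!(F, G, x)
    if G !== nothing
        G[1] = 2x[1]
        G[2] = 2x[2]
    end
    if F !== nothing
        return x[1]^2 + x[2]^2
    end
    return nothing
end

res = optimize(Optim.only_fg!(fg!), [1.0, 2.0], LBFGS())
```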

I’ll probably switch to Optimization.jl (which uses Optim.jl as one of its backends) as soon as they fix this issue (avoiding multiple forward passes) and the LBFGS issue.

2 Likes

You are free to switch, of course! :slightly_smiling_face: Any feature requests can go to the respective packages. NLSolvers.jl has Adam and AdaMax. I worked on that when I had more time, and the intention was to bring the NLSolvers.jl code in as the next internal code for Optim.jl, but I have not gotten there yet :slight_smile:

9 Likes

If you have your stuff set up for Optim still, you can give Add Adam and AdaMax by pkofod · Pull Request #1069 · JuliaNLSolvers/Optim.jl · GitHub a go.

6 Likes

I use Optim.jl and just want to say thank you for your work on this nice package!

7 Likes

Yes, I’ll give it a shot!
Thanks a lot :slight_smile:

1 Like

Optim v1.9.0 is out
A new feature release of Optim is out.

Based on @roflmaostc’s feedback I pulled in the Adam/AdaMax code from NLSolvers.jl, exported them as Adam and AdaMax, and added very brief mentions in the docs. Since these are fixed step length methods, the user may have to set the step length themselves, but I didn’t make it a required keyword. Instead, I used the default values from the paper.
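As a minimal sketch of usage, assuming the step length is set through the alpha keyword (0.001 being the default from the Adam paper):

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Fixed step length method: the paper default is used unless you override it.
res = optimize(rosenbrock, zeros(2), Adam(alpha = 0.001),
               Optim.Options(iterations = 10_000))
```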

Edit:

Optim v1.9.1 is out
@roflmaostc found out that some operations in Adam and AdaMax were using scalar indexing and thus were not CUDA compatible, so a small bug fix release was made as well.

10 Likes

This looks very nice. It would make a great solver backend for my NLLSsolver.jl package. That package focuses more on problem definition (i.e. the frontend), with features such as:

  • automatic computation of gradient and hessian (including sparse hessians)
  • support for robust kernels, including optimizing their parameters
  • support for manifold variables

However, it lacks support for:

  • a broad variety of solvers
  • constraints

So it seems that NLSolvers.jl offers solutions to those gaps. In order to use it as the backend, NLSolvers.jl would need to support manifold variables, which I see is on your roadmap. I’d be happy to discuss it with you when you’re at that stage.

1 Like

Yes, manifolds are only included in a few places in the form of some basic scaffolding :slight_smile: I am working on some fixes and documentation. I assume you would want to wait until that is out so you can better evaluate if it’s a good fit.

Did anyone notice a performance regression from v1.7 to v1.8 and v1.9? We have some scripts that are taking much longer to finish now after the update. We are not sure yet if the regression is coming from Optim.jl.

The optimization problem that we think was affected is this one:

We add box constraints with tight bounds sometimes (lower ~= upper).
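Roughly this kind of setup, using Optim’s Fminbox wrapper (the objective and bounds here are made-up placeholders, not our actual problem):

```julia
using Optim

f(x) = (x[1] - 1.0)^2 + (x[2] - 3.0)^2

# The second variable has a very tight box (lower ~= upper).
lower = [-5.0, 1.0]
upper = [ 5.0, 1.0 + 1e-6]
x0    = [ 0.0, 1.0 + 5e-7]   # start strictly inside the box

res = optimize(f, lower, upper, x0, Fminbox(LBFGS()))
```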

If you’re tracking run-time, hopefully you also store a Manifest so that you can see which package versions were used? Maybe it’s better to open a separate thread for this discussion?

We will try to isolate a MWE in a separate thread :+1:

1 Like

Please tag me and I’ll try to help!

1 Like

That depends on whether you’ve already settled on the design. If so, I’ll wait. If not, I’m happy to have a chat, if that helps you with the design.

If you are interested, we could also check how to couple Optim.jl and ManifoldsBase.jl – we had some discussions quite a while back and concluded that this might be something nice as well. So if you are interested in that, let me know – I am happy to help.
It might not be easily doable in every solver, but in some it probably is.

1 Like

I am planning a small release later today, so let me catch up on two versions that have not been reflected in this thread yet.

Back in March, version 1.9.3 was released. This patch release fixed the long load times of Optim.jl caused by the then recent addition of MathOptInterface.jl support. A fix was also included to make sure that the documentation actually gets updated; previously it was quite outdated.

Soon after, in April, version 1.9.4 was released. This was also related to MOI package issues, this time a precompilation issue.

There is ongoing work by @tim.holy to fix the Hager & Zhang line search, but sometimes you find a large number of issues once you scratch the surface. Tim is not one to hot-fix issues, so we hope that the recurring issues with this line search will be resolved once his work is done.

5 Likes

Okay, here we go.

Optim v1.10.0 has been released

Links
Release notes Release v1.10.0 · JuliaNLSolvers/Optim.jl · GitHub
Registry PR New version: Optim v1.10.0 by JuliaRegistrator · Pull Request #119223 · JuliaRegistries/General · GitHub
Updated docs Home · Optim

Description
Overall, this release is mostly small bug fixes, but it increments the minor version to v1.10 because Christian Hauschel added the time_limit option to univariate optimization, which was previously only available for multivariate optimization.
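A minimal sketch of what that enables, assuming time_limit is passed as a keyword like the other univariate options:

```julia
using Optim

f(x) = sin(x) + x^2 / 10

# Univariate optimization on [-5, 5] with a wall-clock budget of one second.
res = optimize(f, -5.0, 5.0, Brent(); time_limit = 1.0)
```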

NelderMead() had an issue where trace elements were not copied on each iteration, which caused every stored iteration to show the final values. There was also a fix to maximize, which did not work if you provided f, g! and h! due to a misspelled variable that was not caught because of lacking test coverage. There were also several docs improvements.

Release notes from Github

Shoutout
Special shoutout to christianhauschel and tuncbkose, who made contributions to Optim for the first time. I could not figure out whether they are on here or what their usernames may be. Feel free to reveal yourselves :slight_smile:

Also a shoutout to @tim.holy, who has been looking into various issues and thinking about better testing of algorithms than what is currently in Optim.jl/test/. Tim Holy also recently joined the JuliaNLSolvers organization as an owner. This means that if something stalls or I cannot be reached anymore (for whatever reason, though hopefully not the proverbial bus), Tim can help out. The other owners are @johnmyleswhite and myself. Tim Holy has been contributing to Optim.jl with John Myles White since before my time in the Julia ecosystem, so it is quite natural that he joins the ranks. Note, I will probably remain the most active, but having John and Tim marked as owners ensures that someone has access.

23 Likes

Optim v1.11.0 has been released

Links
Release notes Release v1.11.0 · JuliaNLSolvers/Optim.jl · GitHub
Registry PR New version: Optim v1.11.0 by JuliaRegistrator · Pull Request #123560 · JuliaRegistries/General · GitHub
Updated docs Home · Optim

Description
This feature release is a small one: it updates outdated docs on simulated annealing, fixes some trace-related issues for Adam and AdaMax, and changes the alpha parameter so that it can accept a callable object, which means ParameterSchedulers.jl can be used. Essentially, a predefined line search.
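A rough sketch of that, assuming the callable alpha receives the iteration number (the decay schedule itself is just an example):

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# A decaying step length schedule instead of a fixed alpha.
schedule(t) = 1e-2 / sqrt(t)

res = optimize(rosenbrock, zeros(2), Adam(alpha = schedule),
               Optim.Options(iterations = 10_000))
```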

Release notes from Github

  • Fixing extended trace failure for Adam and AdaMax and generalising alpha parameter to accept callable object (scheduler) (#1115) (@kishore-nori)
  • Update simulated_annealing.md documentation to avoid pointing to a very old issue. (#1116) (@pkofod)

Shoutout
Special shoutout to Sai Krishna Kishore Nori who made contributions to Optim for the first time.

11 Likes

Optim v1.12.0 has been released

Links
Release notes: Release v1.12.0
Registry PR New version: Optim v1.12.0 by JuliaRegistrator · Pull Request #123560 · JuliaRegistries/General · GitHub
Updated docs Home · Optim

Description
This feature release has several new features and bug fixes. The most notable changes are:
- Support for autodiff with DifferentiationInterface.jl
- Support for EnumX termination codes
- Refactor of the preconditioning code and update documentation to correctly reflect the behavior
- Updates to the documentation that were long overdue including convergence tolerance descriptions
- Fix of the initial convergence check to correctly return the convergence info when the gradient was small in the first evaluation
- The start of a benchmark suite to be run on each PR to avoid regressions

I’m sure @gdalle wants to expand on why his contributions are important, but on the Optim side we were already able to simplify the objective function struct construction with AD, and we now finally support reverse mode AD, which I turned off at the beginning of my Optim.jl days because ReverseDiff was so unstable. Reverse mode autodiff should be the go-to for scalar objectives of many variables, so I’m curious whether you all find it to work well! The old inputs should still work.
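A minimal sketch of the old and (assumed) new input styles, where the autodiff keyword also accepts an ADTypes.jl backend handled through DifferentiationInterface.jl:

```julia
using Optim, ADTypes, ReverseDiff

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Old-style symbol input, which should still work:
res_forward = optimize(rosenbrock, zeros(2), LBFGS(); autodiff = :forward)

# Assumed new-style input: an ADTypes backend object for reverse mode.
res_reverse = optimize(rosenbrock, zeros(2), LBFGS(); autodiff = AutoReverseDiff())
```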

There were a few changes that touch important internals, including the AD work, so please do not hesitate to open an issue if you find a bug after upgrading (or in general, of course)!

I also noticed that Optim.jl has surpassed 1100 stars on GitHub, so let’s celebrate that as the JOSS paper also nears 600 citations according to Google Scholar! New stars and citations seem to come at a steady pace, so Julia is very much still alive in the scientific realm!

I have more changes planned in the near future so stay tuned :slight_smile:

Release notes from Github

- Fix default of allow_f_increases and allow_outer_f_increases in docs by @JoshuaLampert in #1127
- Start in interior of variable bounds by @blegat in #1073
- Update MOI_wrapper.jl to exclude two failing tests by @pkofod in #1134
- Support autodiff with DifferentiationInterface by @gdalle in #1131
- Update Options documentation by @pkofod in #1135
- Use DocumenterCitations.jl by @abhro in #1130
- Expose SAMIN options by @pkofod in #1136
- Create benchmark suite by @MilesCranmer in #1084
- Refactor preconditioning code by @pkofod in #1138
- Fix admonition block and URL display by @abhro in #1141
- Implement EnumX termination codes by @pkofod in #1142
- Format repository by @pkofod in #1150
- Refactor OptimizationResults to not store so many different fields by @pkofod in #1151
- Update ipnewton_basics.jl by @pkofod in #1153
- Fix initial convergence by @pkofod in #1152
- Remove Parameters dependency by @pkofod in #1154
- fix: equality comparison where assignment was likely meant by @ForceBru in #979
- CompatHelper: bump compat for ForwardDiff to 1, (keep existing compat) by @github-actions in #1140
- Slight rewrite of update_g and update_fg as well as some SAMIN fixes by @pkofod in #1155
- Update TerminationCode names and add ObjectiveNotFinite by @pkofod in #1156
- Some cleanup for deprecated tolerance specifications by @pkofod in #1157
- Don't use _tol in tests by @pkofod in #1158
- Change tag to 1.12.0 by @pkofod in #1159

Shoutout
As always I want to thank all the contributors to this release. This includes all the people who have contributed to the package for the first time:
- @JoshuaLampert made their first contribution in #1127
- @gdalle made their first contribution in #1131
- @abhro made their first contribution in #1130
- @ForceBru made their first contribution in #979

24 Likes