How to update the learning rate during Flux training in a better way?

The error comes from mixing old-style optimisation rules (`Optimiser` and `ExpDecay`) with the new-style Optimisers.jl API that Flux now favours. The Optimisation Rules · Flux docs page has a warning about this.

If you’re fine with writing a couple of extra lines to turn that `train!` call into a custom training loop, see my runnable example of scheduling the learning rate with the new optimisation interface plus ParameterSchedulers.jl in Learning rate scheduler with the new interface of Flux - #5 by ToucheSir.
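For reference, here is a minimal sketch of that pattern (not the exact code from the linked post): an exponential schedule drives `Flux.adjust!` on the optimiser state once per epoch inside an explicit training loop. The toy model, data, and schedule constants (`1f-3` start, `0.9f0` decay, 20 epochs) are placeholders, and I use the positional `Exp(start, decay)` constructor because the keyword names changed between ParameterSchedulers.jl versions.

```julia
using Flux, ParameterSchedulers

# Toy model and data, just to make the loop runnable
model = Chain(Dense(2 => 16, relu), Dense(16 => 1))
x = rand(Float32, 2, 64)
y = sum(x; dims = 1)            # target: sum of the two inputs
data = [(x, y)]

# New-style optimiser state (replaces the old implicit-params flow)
opt_state = Flux.setup(Adam(1f-3), model)

# Exponential decay: eta(t) = 1f-3 * 0.9f0^(t - 1)
sched = Exp(1f-3, 0.9f0)

# Schedules are iterable, so zip one value per epoch
for (epoch, eta) in zip(1:20, sched)
    # Overwrite the learning rate in the optimiser state for this epoch
    Flux.adjust!(opt_state, eta)
    for (xb, yb) in data
        grads = Flux.gradient(m -> Flux.mse(m(xb), yb), model)
        Flux.update!(opt_state, model, grads[1])
    end
end
```

Adjusting per epoch is the usual choice, but you can move the `Flux.adjust!` call inside the batch loop (and iterate the schedule per step) if you want finer-grained decay.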
