I’m pleased to announce the 0.5 release of the Flux machine learning library. Here’s an incomplete list of things that have changed since the last announcement.
- Experimental JIT compilation work for models, applying optimisations such as pre-allocating memory.
- Run models in the browser via Flux.JS.
- Among many GPU performance improvements, CUDNN integration for RNNs: RNNs are now much faster, with no changes to your code needed!
- A new and improved N-dimensional API for convolutions, whose CPU versions now have pure-Julia implementations.
- Regularisation of model weights.
- Stable APIs for saving and loading models.
- Tracked scalars for more advanced automatic differentiation (AD) use cases.
- Many new layers and functions, including the GRU RNN, numerically stable log versions of softmax and sigmoid, binary cross entropy, permutedims, the Kronecker product, and much more.
- Many more models in the model zoo.
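As a taste of the weight-regularisation support, here is a minimal sketch of adding an L2 penalty over a model's parameters to a loss function. The model shape, penalty coefficient, and use of `Flux.params` with `LinearAlgebra.norm` are illustrative choices, not the only way to do it, and exact API details may differ between Flux versions:

```julia
using Flux
using LinearAlgebra: norm

# A small illustrative model (shapes chosen arbitrarily).
m = Chain(Dense(10, 5, relu), Dense(5, 2))

# L2 penalty summed over all trainable parameters.
penalty() = sum(norm, Flux.params(m))

# Add the penalty to an ordinary loss; 0.01 is an arbitrary coefficient.
loss(x, y) = Flux.mse(m(x), y) + 0.01f0 * penalty()
```

Training then proceeds as usual; the optimiser sees the penalty simply as part of the loss being differentiated.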
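The numerically stable log versions mentioned above matter when probabilities underflow: taking `log` after `softmax` can produce `-Inf`, while `logsoftmax` computes the same quantity stably. A small sketch (the input values are contrived to force underflow):

```julia
using Flux  # re-exports softmax and logsoftmax from NNlib

x = Float32[0, 1000]

# The naive composition underflows: softmax(x)[1] rounds to 0, so its log is -Inf.
log.(softmax(x))

# The fused version stays finite: approximately [-1000, 0].
logsoftmax(x)
```

The same reasoning applies to `logσ` versus `log(σ(x))` for the sigmoid.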
Thanks to all contributors to Flux!