[ANN] MLJFlux.jl 0.5

MLJFlux.jl allows users to quickly build, train and optimise Flux.jl neural network models using the high-level, general purpose machine learning framework MLJ.jl.

The new 0.5 release brings a number of substantial under-the-hood improvements, including:

  • a better-performing re-implementation of L1/L2 regularization
  • a switch to explicit-style automatic differentiation (AD)
  • use of Optimisers.jl optimisers (see the sketch after this list)
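
To illustrate how these changes surface in the model interface, here is a minimal sketch: the `optimiser` hyperparameter now takes an Optimisers.jl rule, while `lambda` and `alpha` are the hyperparameters controlling the L1/L2 penalty. The specific values below are arbitrary, chosen only for illustration:

```julia
using MLJ
import Optimisers

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

clf = NeuralNetworkClassifier(
    optimiser = Optimisers.Adam(0.001), # an Optimisers.jl rule, not a Flux.jl optimiser
    lambda = 0.01,                      # overall strength of the regularization penalty
    alpha = 0.5,                        # L1/L2 mix: 0 is pure L2, 1 is pure L1
    epochs = 10,
)
```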

It also provides a new model, NeuralNetworkBinaryClassifier, an optimised version of the existing classifier for binary targets.
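
As a quick sketch (the synthetic data and hyperparameters below are purely illustrative), the new model is loaded and used like any other MLJ model:

```julia
using MLJ

NeuralNetworkBinaryClassifier = @load NeuralNetworkBinaryClassifier pkg=MLJFlux

# Synthetic two-class data; `y` is a binary categorical vector:
X, y = make_moons(200)

clf = NeuralNetworkBinaryClassifier(epochs=20)
mach = machine(clf, X, y)
fit!(mach)

predict(mach, X)   # probabilistic predictions for the binary target
```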

MLJFlux.jl 0.5 ships with substantially revamped documentation, including new sample workflows and extended examples.

Thanks to Tiem van der Deure, @EssamWisam, and @pat-alt for their contributions to this release.
