We (@ayush-1506, @ablaom) are releasing the first version of MLJFlux.jl, an interface to the Flux deep learning framework for MLJ.jl. MLJFlux provides four main models:
- `NeuralNetworkRegressor`
- `MultitargetNeuralNetworkRegressor`
- `NeuralNetworkClassifier`
- `ImageClassifier`
As with any MLJ model, a model in this context is a container for hyper-parameters; it does not store the learned neural network parameters.
For instance, the `NeuralNetworkRegressor` takes the following hyper-parameters:
```julia
mutable struct NeuralNetworkRegressor{B,O,L} <: MLJModelInterface.Deterministic
    builder::B      # a function that returns the Flux chain
    optimiser::O    # mutable struct from Flux/src/optimise/optimisers.jl
    loss::L         # loss function
    epochs::Int     # number of epochs
    batch_size::Int # size of a batch
    lambda::Float64 # regularization strength
    alpha::Float64  # regularization mix (0 for all L2, 1 for all L1)
    optimiser_changes_trigger_retraining::Bool # should changing the optimiser trigger retraining of the chain?
end
```
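As a rough sketch of how a custom architecture and model fit together (the `MLJFlux.Builder` subtyping and the `MLJFlux.build` signature below are assumptions based on the pattern in the documentation and may differ between versions):

```julia
using MLJFlux, Flux

# A builder constructs the Flux chain once the input/output dimensions
# are known at fit time (assumed API; see the MLJFlux documentation):
mutable struct MyBuilder <: MLJFlux.Builder
    n_hidden::Int   # width of the single hidden layer
end

MLJFlux.build(b::MyBuilder, n_in, n_out) =
    Chain(Dense(n_in, b.n_hidden, relu), Dense(b.n_hidden, n_out))

# The model itself is just a hyper-parameter container:
model = NeuralNetworkRegressor(builder=MyBuilder(20),
                               optimiser=ADAM(0.001),
                               loss=Flux.mse,
                               epochs=50,
                               batch_size=32,
                               lambda=0.01,  # regularization strength
                               alpha=0.0)    # pure L2 penalty
```

See the documentation for the authoritative builder interface.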
For more on the input scitypes these models accept and the prediction types they return, see the documentation.
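For instance, the expected types can be queried with MLJ's generic trait functions (the values in the comments are illustrative, not authoritative):

```julia
using MLJ

input_scitype(model)   # e.g. Table(Continuous)
target_scitype(model)  # e.g. AbstractVector{Continuous}
```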
Users can define their own neural network architecture using Flux (however complicated, so long as it can be run with Flux.jl), supply it to any of the four models above, wrap the model in an MLJ machine, and then perform any task we would with any other MLJ.jl machine: hyper-parameter tuning, plotting, evaluation, learning curves, pipelining, and so on, as sketched below. It is also possible to change certain hyper-parameters without having to retrain the complete chain.
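By way of illustration, here is a minimal sketch of that workflow on synthetic data. `machine`, `fit!`, `predict`, and `evaluate!` are the generic MLJ API; the data and numbers are made up:

```julia
using MLJ

X = MLJ.table(rand(100, 5))  # any Tables.jl-compatible table of features
y = rand(100)                # a Continuous target vector

mach = machine(model, X, y)  # bind the hyper-parameter container to data
fit!(mach)                   # train the underlying Flux chain
yhat = predict(mach, X)

# Generic MLJ meta-functionality applies as usual, e.g. cross-validation:
evaluate!(mach, resampling=CV(nfolds=5), measure=rms)

# Changing some hyper-parameters, such as `epochs`, and refitting need
# not retrain the chain from scratch:
model.epochs = 100
fit!(mach)
```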
For a basic example of using MLJFlux.jl for regression on the standard Boston housing dataset, head over to the boston-flux example in DataScienceTutorials.
This release of MLJFlux.jl supports the Zygote version of Flux. (The plan is to keep up with newer Flux releases as they come along.)
For installation: `]add MLJFlux`
Documentation: https://github.com/FluxML/MLJFlux.jl
The architecture has been inspired by skorch, a scikit-learn compatible neural network library that wraps PyTorch. If you do give MLJFlux a try, we'd love to know your thoughts on the design, the implementation, and directions in which we can improve the package.