Try it out (nothing to install):
Learn: Data Science Tutorials in Julia.
Free: workshop at JuliaCon2020
It's been a little while since our v0.1 announcement. MLJ is now at
v0.12.0 (up from v0.11.6), with functionality spread over more than ten repositories:
MLJ has been under continuous development since November 2018, so it is worth a fresh look if you have not tried it for a while.
MLJ is a toolbox written in Julia providing a common interface and meta-algorithms for selecting, tuning, evaluating, composing and comparing machine learning models written in Julia and other languages. It now gives access to over 140 models, including most popular scikit-learn models, as well as deep learning models via the recently released MLJFlux.
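To give a flavor of the common interface, here is a minimal sketch (assuming the MLJ and DecisionTree packages are installed; the dataset and model choice are just for illustration, and details may vary slightly between MLJ versions):

```julia
using MLJ

# A built-in toy dataset: features as a table, target as a categorical vector
X, y = @load_iris

# Load the model code from the DecisionTree.jl package and instantiate it
Tree = @load DecisionTreeClassifier pkg=DecisionTree
tree = Tree(max_depth=3)

# Bind model and data in a "machine", then estimate out-of-sample performance
mach = machine(tree, X, y)
evaluate!(mach, resampling=CV(nfolds=5, shuffle=true), measure=log_loss)
```

The same `machine`/`evaluate!` pattern applies whatever the model's package of origin, which is what makes model comparison across ecosystems straightforward.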
MLJ's strengths include:

A flexible, carefully thought-out API for model composition, including recent improvements.
An extensible hyper-parameter optimization interface that plays well with model composition, with multiple resampling and (nested) parallelization options. Recently added tuning strategies include:
RandomSearch, with customizable priors (built-in)
MLJTreeParzenTuning, provided by TreeParzen.jl
A searchable database of model metadata, for matching models to learning tasks.
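Composition in its simplest form is a linear pipeline. A sketch, using the pipeline constructor as found in recent MLJ releases (dataset and model choices are illustrative):

```julia
using MLJ

X, y = @load_iris
Tree = @load DecisionTreeClassifier pkg=DecisionTree

# Compose a feature standardizer and a classifier into a single model;
# the composite behaves like any other MLJ model
pipe = Pipeline(Standardizer(), Tree())

mach = machine(pipe, X, y)
evaluate!(mach, resampling=Holdout(fraction_train=0.7), measure=log_loss)
```

Because the composite is itself a model, it can in turn be wrapped, tuned or evaluated like any atomic model.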
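For example, the RandomSearch strategy can be wrapped around any model using TunedModel. A sketch (the hyper-parameter, its bounds and the budget `n` are illustrative):

```julia
using MLJ

X, y = @load_iris
Tree = @load DecisionTreeClassifier pkg=DecisionTree
tree = Tree()

# A one-dimensional hyper-parameter range to sample from
r = range(tree, :max_depth, lower=1, upper=10)

# Wrap the model in a self-tuning version driven by random search
tuned_tree = TunedModel(model=tree,
                        tuning=RandomSearch(),
                        range=r,
                        resampling=CV(nfolds=6),
                        measure=log_loss,
                        n=25)

# Fitting the wrapped model runs the search and retrains on the best values
mach = machine(tuned_tree, X, y)
fit!(mach)
fitted_params(mach).best_model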
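The model registry can be queried to find models compatible with given data, along the lines of (a sketch; exact metadata fields may differ between versions):

```julia
using MLJ

X, y = @load_iris

# All registered models whose data requirements match (X, y)
for m in models(matching(X, y))
    println(m.name, " (", m.package_name, ")")
end
```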
Developers should note that MLJ has extensive documentation on integrating a new machine learning model into MLJ and on adding a hyper-parameter optimization algorithm.