Announcing two significant enhancements to MLJ, the multi-paradigm machine learning framework for Julia.
A model wrapper for controlling iterative models
By wrapping a model using IteratedModel(model=...), one can control the model externally, enabling features such as early stopping (based on an out-of-sample loss), saving model snapshots, and tracking learned parameters. As a wrapper, iterated model control composes with other MLJ meta-algorithms, such as hyper-parameter optimization.
The wrapper, documented here, is based on the generic iteration control provided by IterationControl.jl.
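Here is a sketch of typical usage. The model, dataset, and choice of controls are illustrative only, and the example assumes EvoTrees.jl (one of several iterative model providers) is installed:

```julia
using MLJ

# Load an iterative model type (any iterative MLJ model will do):
EvoTreeClassifier = @load EvoTreeClassifier pkg=EvoTrees

imodel = IteratedModel(model=EvoTreeClassifier(),
                       resampling=Holdout(fraction_train=0.8),
                       measure=log_loss,  # out-of-sample loss driving the controls
                       controls=[Step(2),            # train 2 iterations at a time
                                 Patience(3),        # stop after 3 consecutive loss increases
                                 NumberLimit(100)])  # hard cap on control steps

X, y = @load_iris  # small demonstration dataset shipped with MLJ
mach = machine(imodel, X, y)
fit!(mach)  # trains until one of the controls triggers a stop
```

Because the wrapper is itself an MLJ model, `imodel` can in turn be passed to `TunedModel` or inserted into a pipeline like any other model.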
Learning more
For a little more detail, see our JuliaCon 2021 poster.
For a detailed demonstration of the wrapper on the MNIST image dataset, see here.
Model stacking
Using MLJ's new Stack constructor, one can blend the predictions of multiple models with the help of an adjudicating model. Model stacking, introduced by Wolpert (1992), is popular in machine learning competitions. In the biostatistics community, a model stack is also known as a super learner (Van der Laan et al. (2007)).
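As a minimal sketch of the constructor: the base models and adjudicator below are illustrative choices only, and assume the relevant model-providing packages are installed. The keyword names `tree` and `knn` are arbitrary labels for the base models; `metalearner` is the adjudicating model trained on their out-of-sample predictions:

```julia
using MLJ

DecisionTreeRegressor = @load DecisionTreeRegressor pkg=DecisionTree
KNNRegressor = @load KNNRegressor pkg=NearestNeighborModels
LinearRegressor = @load LinearRegressor pkg=MLJLinearModels

stack = Stack(metalearner=LinearRegressor(),
              resampling=CV(nfolds=5),  # generates out-of-sample base predictions
              tree=DecisionTreeRegressor(),
              knn=KNNRegressor())

X, y = make_regression(100, 3)  # synthetic data bundled with MLJ
mach = machine(stack, X, y)
fit!(mach)
yhat = predict(mach, X)
```

A stack is itself an MLJ model, so it can be evaluated, tuned, or nested inside further compositions.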
Thanks to Olivier Labayle (@olivierlabayle) for adding this constructor.
Learning more
For a quick start, see the MLJ documentation.
To learn more about how stacking works, and how it is implemented using MLJ's generic model composition API, see this tutorial.