Announcement for the initial release of NeuroTreeModels.jl
`NeuroTree`-based models comprise a collection of differentiable trees, an attempt to get both the performance benefits of boosted tree methods and the flexibility of gradient-based learning. A more comprehensive description can be found in the docs' design section: NeuroTree - A differentiable tree operator for tabular data | NeuroTreeModels
Comprehensive benchmarks have been run against XGBoost, LightGBM, CatBoost and EvoTrees on 6 datasets commonly used in publications on ML methods for tabular data. Results and the code to reproduce them can be found at MLBenchmarks.jl.
NeuroTree shares similarities with Yandex's Neural Oblivious Decision Ensembles (NODE). Key differences include:
- Full binary trees (rather than oblivious ones).
- Reliance on a simple `NeuroTree` operator that behaves similarly to a `Dense` operator on tabular, 2D input data. Such an operator can be composed like a `Dense` operator in Flux chains to build more complex models, such as stacks of trees or combinations with any other operators.
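As a sketch of that composability, here is what mixing a tree operator with standard Flux layers in a `Chain` could look like. The `NeuroTree(in => out; depth)` constructor signature and the layer sizes below are illustrative assumptions, not the package's documented API; consult the NeuroTreeModels docs for the actual interface.

```julia
using Flux
using NeuroTreeModels

nfeats = 16  # number of input features (illustrative)

# Hypothetical composition: a NeuroTree operator stacked with Dense
# layers, treated like any other layer mapping 2D tabular input.
# The NeuroTree constructor arguments here are assumed, not verified.
model = Chain(
    Dense(nfeats => 32, relu),
    NeuroTree(32 => 8; depth=4),  # assumed signature
    Dense(8 => 1),
)

# A batch of 64 tabular observations would flow through as usual:
# x = rand(Float32, nfeats, 64); y = model(x)
```

Because each stage is differentiable, the whole chain can be trained end to end with standard gradient-based optimizers, which is the flexibility argument made above.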