[ANN] EvoLinear.jl for Linear Boosting

A new package implementing linear boosting has just been launched: EvoLinear.jl.

It essentially covers the functionality provided by the gblinear learner found in XGBoost.

Notably, it has L1 & L2 regularization, plus the following loss functions along with their associated evaluation metrics (a minimal usage sketch follows the list):

  • MSE (mean squared error)
  • Logistic / Logloss
  • Poisson
  • Gamma
  • Tweedie
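
For reference, here is a minimal usage sketch. The keyword names (`nrounds`, `loss`, `L1`, `L2`, `metric`) and the `fit` signature are written from memory of the README and are assumptions; check the package docs for the exact API.

```julia
using EvoLinear

# toy regression data
x = randn(1_000, 5)
y = x * randn(5) .+ 0.1 .* randn(1_000)

# keyword names shown for illustration; see the EvoLinear docs for the exact API
config = EvoLinearRegressor(nrounds = 10, loss = :mse, L1 = 1e-2, L2 = 1e-3)
m = EvoLinear.fit(config; x, y, metric = :mse)
p = m(x)  # predictions from the fitted linear booster
```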

How is it different from GitHub - svs14/GradientBoost.jl (a gradient boosting framework for Julia)?
A general-purpose boosting package into which one can plug an arbitrary classifier would be very nice.

My perspective was to complement EvoTrees.jl with linear-based learners, so that most boosting needs are covered (trees + linear). I’m mainly concerned with implementation performance, along with the flexibility to support diverse loss functions.

I’m not sure if there are other important base learners that would be relevant in a boosting context?


What I meant was to just expose some API, such that one can add a different learner, for example neural networks instead of decision trees. But it is just a suggestion. My motivation was that having a go-to boosting package would be nice, especially since there are already a few of those.

Although I understand this was just used as an example, I’m not sure all kinds of learners are well adapted to boosting applications. I’d argue that a NN already learns in a fashion similar to boosting, in the sense that weights are updated in sequence based on the current state of the underlying model. As such, my first impression is that boosting over NNs would turn out to be a cumbersome and likely less efficient way to converge. Anecdotally, I was actually about to look into how effective deep learning optimizers would be for training a linear booster using first-order gradients rather than the second-order methods currently used by EvoLinear and XGBoost. Such an approach would technically result in a NN model.
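
To make the first-order vs second-order distinction concrete, here is a simplified coordinate-update sketch for a linear booster with MSE loss. This is an illustration only, not the actual EvoLinear or XGBoost implementation:

```julia
using Statistics

# Toy comparison of second-order (Newton-style) vs first-order coordinate
# updates for a linear booster with MSE loss and L2 regularization.
function boost_linear(x, y; nrounds = 10, order = 2, eta = 0.1, L2 = 0.0)
    n, p = size(x)
    w = zeros(p)
    b = mean(y)                                # start from the mean prediction
    for _ in 1:nrounds, j in 1:p
        pred = x * w .+ b
        g = sum(x[:, j] .* (pred .- y)) + L2 * w[j]   # first-order gradient for w[j]
        if order == 2
            h = sum(abs2, x[:, j])             # second-order term (constant for MSE)
            w[j] -= g / (h + L2)               # Newton-style step
        else
            w[j] -= eta * g / n                # plain gradient step
        end
    end
    return w, b
end
```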
Just some thoughts here, thanks for your input!

Since EvoTrees already implements the MLJ model interface, wouldn’t making EvoLinear implement that as well solve this problem? Or do you mean substituting 1 tree → 1 NN in an existing boosting model?

I’m also planning to add MLJ integration to EvoLinear, likely within the next week or so, once it gets added to the General registry.


Starting with v0.3.0, EvoLinear.jl now implements the MLJ interface.
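
For MLJ users, fitting should then follow the standard MLJ workflow. The registered model name and the `nrounds` keyword below are assumptions for illustration; check the EvoLinear docs for the actual registered model.

```julia
using MLJ

# model name and keyword are assumptions; see the EvoLinear docs
EvoLinearRegressor = @load EvoLinearRegressor pkg=EvoLinear

X, y = make_regression(200, 5)                 # toy dataset from MLJ
mach = machine(EvoLinearRegressor(nrounds = 10), X, y)
fit!(mach)
yhat = predict(mach, X)
```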
