New machine learning package: Julia implementation of XGBoost

Here is some critical feedback:
One of the main strengths of gradient boosting (and extensions like XGBoost) is that it generalizes to any differentiable loss, not just the losses behind linear or logistic regression. This generalization is very easy to achieve via multiple dispatch in Julia. In that context, it was surprising to see that the package only supports logistic regression; the code also seems to be hard-coded around the logit loss.
It would be great to abstract the XGBoost algorithm away from any direct dependence on a particular loss. Instead, the algorithm can obtain the gradient (and Hessian) from the loss type, which is exactly where multiple dispatch shines. That would make adding new losses very easy; a sketch of the idea follows.
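For concreteness, here is a minimal sketch of one way this could look. All names here (`AbstractLoss`, `LogitLoss`, `SquaredLoss`, `gradient`, `hessian`, `boost_statistics`) are hypothetical and not part of the package; the formulas are the standard second-order statistics XGBoost uses (for the logit loss with labels y ∈ {0, 1}, g = σ(f) − y and h = σ(f)(1 − σ(f))).

```julia
# Hypothetical sketch: decoupling the boosting update from the loss
# via multiple dispatch. None of these names come from the package.
abstract type AbstractLoss end

struct LogitLoss   <: AbstractLoss end
struct SquaredLoss <: AbstractLoss end

sigmoid(x) = 1 / (1 + exp(-x))

# First and second derivatives of each loss w.r.t. the raw score f.
gradient(::LogitLoss, y, f) = sigmoid(f) - y            # y ∈ {0, 1}
hessian(::LogitLoss, y, f)  = (p = sigmoid(f); p * (1 - p))

gradient(::SquaredLoss, y, f) = f - y                   # loss = (f - y)^2 / 2
hessian(::SquaredLoss, y, f)  = one(f)

# The boosting step sees the loss only through `gradient`/`hessian`,
# so supporting a new objective is just two new methods on a new subtype.
function boost_statistics(loss::AbstractLoss, y::AbstractVector, f::AbstractVector)
    g = gradient.(Ref(loss), y, f)
    h = hessian.(Ref(loss), y, f)
    return g, h
end

# Usage: per-example gradients and Hessians for the tree builder.
g, h = boost_statistics(LogitLoss(), [0, 1, 1], [0.2, -0.1, 0.5])
```

With a design along these lines, the tree-building code never needs to know which loss it is optimizing, and users can plug in custom objectives without touching the core algorithm.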
Thanks.
