Quantile Random Forest in Julia?

Hi there,
I have played a bit with DecisionTree.jl and MLJ.jl.
Now I would like to get some uncertainty quantification for my RandomForest regressions. I have seen that Quantile Random Forest is an increasingly accepted method for obtaining prediction intervals, e.g. the 5%-95% percentiles taken over the learner trees instead of only the mean as in standard RF regression. Does someone have a pointer on how to go about this in Julia? There are some Python implementations I am aware of, like:

I don’t think that I am skilled enough to port these myself, but I’d be happy to help in some way.
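For reference, the naive version of what I'm after would look something like the sketch below with DecisionTree.jl, just taking empirical quantiles over the per-tree predictions (not the leaf-level response distributions that proper QRF implementations use; reaching into the Ensemble's trees field is my assumption about the package internals):

```julia
using DecisionTree, Statistics

# toy data
X = rand(200, 4)
y = X[:, 1] .+ 0.1 .* randn(200)

# plain random forest: 2 candidate features per split, 100 trees
forest = build_forest(y, X, 2, 100)

# per-tree predictions for new points, then empirical 5% / 95% quantiles
Xnew = rand(5, 4)
tree_preds = hcat([apply_tree(t, Xnew) for t in forest.trees]...)  # n_new x n_trees
lo = [quantile(tree_preds[i, :], 0.05) for i in axes(tree_preds, 1)]
hi = [quantile(tree_preds[i, :], 0.95) for i in axes(tree_preds, 1)]
```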

In EvoTrees.jl, Quantile is a supported regression loss type. There’s a minimal example in the docs:
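Something along these lines should work as a starting point (a minimal sketch on toy data, assuming the keyword-based fit_evotree API and that alpha sets the target quantile; one model is fit per quantile to get a 5%-95% band):

```julia
using EvoTrees

x_train = rand(1_000, 4)
y_train = x_train[:, 1] .+ 0.1 .* randn(1_000)

# one booster per quantile level; alpha is the target quantile
config_lo = EvoTreeRegressor(loss=:quantile, alpha=0.05, nrounds=100, eta=0.1, max_depth=5)
config_hi = EvoTreeRegressor(loss=:quantile, alpha=0.95, nrounds=100, eta=0.1, max_depth=5)

m_lo = fit_evotree(config_lo; x_train, y_train)
m_hi = fit_evotree(config_hi; x_train, y_train)

pred_lo = m_lo(x_train)  # ~5% quantile predictions
pred_hi = m_hi(x_train)  # ~95% quantile predictions
```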

EvoTrees.jl, however, is primarily a gradient-boosted trees library, so each tree learns on top of the residuals of the previous trees. In practice, it typically performs as well as, if not better than, a random forest.

There’s the (undocumented) bagging_size kwarg that can be specified in the learner to emulate a RandomForest-like behavior. This, however, doesn’t work well for some loss functions, which I believe is the case for the Quantile loss.

You could also consider conformal prediction to build prediction intervals in a more data-driven way: GitHub - JuliaTrustworthyAI/ConformalPrediction.jl: Predictive Uncertainty Quantification through Conformal Prediction for Machine Learning models trained in MLJ.
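As a rough sketch of how that could look with an MLJ random forest (assuming ConformalPrediction.jl's conformal_model wrapper and its coverage keyword; parameter choices here are illustrative):

```julia
using MLJ, ConformalPrediction

# any MLJ-compatible point predictor works; DecisionTree.jl's random forest here
# (requires MLJDecisionTreeInterface in the environment)
RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree verbosity=0
model = RandomForestRegressor()

X, y = make_regression(500, 4)  # MLJ's synthetic regression data

# wrap the point predictor in a conformal model targeting 90% coverage
conf_model = conformal_model(model; coverage=0.9)
mach = machine(conf_model, X, y)
fit!(mach)

predict(mach, X)[1:3]  # interval predictions rather than point estimates
```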