@Ivan Thanks for reporting this.
In the last breaking release of MLJXGBoostInterface those particular access points were indeed removed. However, MLJ now provides a generic feature_importances
accessor function you can call on machines wrapping supported models, and the MLJXGBoostInterface models are now supported.
Unfortunately, I just discovered a minor bug, so only the classifier is currently working. Here’s the workflow in that case:
using MLJ
XGBoostClassifier = @load XGBoostClassifier pkg=XGBoost
X, y = @load_iris
model = XGBoostClassifier()
mach = machine(model, X, y) |> fit!
julia> feature_importances(mach)
4-element Vector{Pair{Symbol, Float32}}:
:petal_length => 2.991818
:petal_width => 1.3149351
:sepal_width => 0.072732545
:sepal_length => 0.042442977
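As an aside, if you want to check up front whether a given model supports this accessor, MLJ exposes a model trait for that. A minimal sketch (this assumes the `reports_feature_importances` trait re-exported by MLJ; check your installed version):

```julia
using MLJ

XGBoostClassifier = @load XGBoostClassifier pkg=XGBoost

model = XGBoostClassifier()

# Trait query: models for which `feature_importances` is expected to
# return something meaningful report `true` here.
reports_feature_importances(model)
```

That way you can guard a call to `feature_importances(mach)` in generic code rather than relying on a particular interface package.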