I am trying to reproduce the XGBoost.jl Crabs tutorial with JLBoost.jl. I think the data is identical and most things work fine, but I can't get `learning_curve!` to work. Here is the code.
```julia
using Pkg; Pkg.activate(".")
using MLJ, StatsBase, Random, PyPlot, CategoricalArrays, PrettyPrinting, DataFrames
X, y = @load_crabs
X = DataFrame(X)
using XGBoost  # MLJ is already loaded above
@load XGBoostClassifier
xgb = XGBoostClassifier()
xgbm = machine(xgb, X, y)
r = range(xgb, :num_round, lower=10, upper=500)
curve = learning_curve!(xgbm, resampling=CV(),
                        range=r, resolution=25,
                        measure=cross_entropy)
```
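The XGBoost version runs fine; as in the tutorial, the named tuple returned by `learning_curve!` (with fields `parameter_values` and `measurements`) can then be plotted:

```julia
# curve is the named tuple returned by learning_curve! above
plot(curve.parameter_values, curve.measurements)  # PyPlot, loaded above
xlabel("num_round")
ylabel("CV estimate of cross entropy")
```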
```julia
]add https://github.com/xiaodaigh/JLBoost.jl#development
using JLBoost
xgb = JLBoostClassifier()
xgbm = machine(xgb, X, y)
r = range(xgb, :nrounds, lower=10, upper=500)
curve = learning_curve!(xgbm, resampling=CV(),
                        range=r, resolution=25,
                        measure=cross_entropy)
```
But I am getting the error below, and I am not sure where this `CrossEntropy` comes in. I also looked through the XGBoost MLJ interface implementation, but I can't find what I need to change to get this working.
```
MethodError: no method matching (::MLJBase.CrossEntropy)(::Array{Float64,1}, ::CategoricalArray{String,1,UInt8,String,CategoricalString{UInt8},Union{}})
Closest candidates are:
  Any(!Matched::AbstractArray{#s160,1} where #s160<:UnivariateFinite, ::AbstractArray{#s159,1} where #s159<:(Union{CategoricalString{U}, CategoricalValue{#s15,U} where #s15} where U)) at C:\Users\RTX2080\.julia\packages\MLJBase\JdmO3\src\measures\finite.jl:36
```
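From the "Closest candidates" line, my guess is that `cross_entropy` only accepts probabilistic predictions (a vector of `UnivariateFinite` distributions), while here it is being handed a plain `Array{Float64,1}`. A minimal sketch of what I mean (the labels are made up):

```julia
using MLJBase, CategoricalArrays

y    = categorical(["B", "O", "B"])   # ground-truth labels
yhat = categorical(["B", "B", "B"])   # deterministic (point) predictions

# A deterministic measure accepts point predictions:
misclassification_rate(yhat, y)

# cross_entropy has no such method; it wants a vector of
# UnivariateFinite distributions, as returned by predict for a
# probabilistic model:
# cross_entropy(yhat, y)  # would throw a MethodError like the one above
```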