From a quick look at their documentation, it seems they don’t have an MLJ.confusion_matrix! method, so you are probably out of luck (or you could copy that method and turn it into an in-place update, but that would be ugly ;)
@ablaom Just an FYI, I wrote a function for the multi-class case.
```julia
function kappa(yhat, y)
    # Get the confusion matrix. The assignment must happen outside the
    # try/catch, which introduces its own scope in Julia.
    confmat = try
        MLJ.confusion_matrix(mode.(yhat), y)  # probabilistic predictions
    catch
        MLJ.confusion_matrix(yhat, y)         # deterministic predictions
    end
    confmat = confmat.mat
    # sizes
    c = size(confmat, 1)  # number of classes
    m = sum(confmat)      # number of instances
    # relative observed agreement
    diags = [confmat[i, i] for i in 1:c]
    p_0 = sum(diags) / m
    # probability of agreement due to chance; for each class this is
    # (# positive predictions)/(# instances) * (# positive observed)/(# instances)
    p_e = 0.0
    for i in 1:c
        p_e += sum(confmat[i, j] for j in 1:c) * sum(confmat[j, i] for j in 1:c) / m^2
    end
    # Kappa calculation
    κ = (p_0 - p_e) / (1 - p_e)
    return κ
end
```
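As a quick sanity check of the formula (not part of the original post), the statistics can be computed by hand for a small, hypothetical 2×2 confusion matrix:

```julia
# Hypothetical confusion matrix (rows = predicted, columns = observed):
# 20 instances of class A and 15 of class B classified correctly, 15 errors.
confmat = [20  5;
           10 15]

c = size(confmat, 1)                       # 2 classes
m = sum(confmat)                           # 50 instances
p_0 = sum(confmat[i, i] for i in 1:c) / m  # observed agreement: 35/50 = 0.7
p_e = sum(sum(confmat[i, :]) * sum(confmat[:, i]) for i in 1:c) / m^2
                                           # chance agreement: (25*30 + 25*20)/2500 = 0.5
κ = (p_0 - p_e) / (1 - p_e)                # (0.7 - 0.5)/(1 - 0.5) = 0.4
```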
Noted, thanks. The try/catch block is discouraged as slow. Basically, kappa is a deterministic measure and would be implemented as such in MLJ. The user can compute mode if they want to apply it to probabilistic predictions. In any case, MLJ’s evaluate! apparatus allows you to specify deterministic measures where predictions are probabilistic, automatically calling the model’s predict_mode method instead of predict before passing on to the measure.
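For example, a sketch of that pattern (assuming the usual MLJ workflow with the built-in iris loader and the DecisionTree classifier; not from the original thread):

```julia
using MLJ

X, y = @load_iris                                    # toy dataset shipped with MLJ
Tree = @load DecisionTreeClassifier pkg=DecisionTree # a probabilistic model
mach = machine(Tree(), X, y)

# `accuracy` is a deterministic measure, but the model is probabilistic;
# evaluate! calls predict_mode behind the scenes before applying it.
evaluate!(mach, resampling=CV(nfolds=5), measure=accuracy)
```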
BTW, a PR is welcome if you are happy to include tests. Apart from the user guidelines for measures, there are also guidelines for adding new measures.