The `evaluate` and `evaluate!` methods in MLJ can accept a vector of metric functions. However, it doesn't appear that you can evaluate metrics based on probabilistic predictions (e.g. AUC) and metrics based on deterministic predictions (e.g. accuracy) at the same time. Here's a MWE:

```julia
using DataFrames
using RDatasets
using MLJ
using MLJLinearModels

iris = dataset("datasets", "iris")
df = filter(r -> r.Species != "virginica", iris)
y = droplevels!(copy(df.Species))
X = select(df, Not(:Species))

model = LogisticClassifier(penalty=:none)
logistic_machine = machine(model, X, y)
holdout = Holdout(shuffle=true, rng=1)

# AUC needs probabilistic predictions (the default `predict` operation):
logistic_auc = evaluate!(
    logistic_machine,
    resampling = holdout,
    measure = auc,
)

# Accuracy needs point predictions, hence `operation = predict_mode`:
logistic_accuracy = evaluate!(
    logistic_machine,
    resampling = holdout,
    operation = predict_mode,
    measure = accuracy,
)
```

Does anyone know if it's possible to evaluate `auc` and `accuracy` at the same time, without having to run `evaluate!` twice?
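For reference, this is the kind of combined call I was hoping would work. It's only a sketch: I'm *assuming* `operation` might accept a vector of operations paired element-wise with a vector of measures, but I haven't found that confirmed in the docs.

```julia
# Hypothetical combined call (unconfirmed API): one `evaluate!` with a
# per-measure operation. Assumes `operation` can be a vector aligned with
# the measures vector; if not, the two separate calls above are the fallback.
logistic_eval = evaluate!(
    logistic_machine,
    resampling = holdout,
    operation = [predict, predict_mode],  # predict -> auc, predict_mode -> accuracy
    measures = [auc, accuracy],
)
```

If that vector form isn't supported, is there another idiomatic way to get both measures from a single resampling pass?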