[ANN] BetaML v0.7: new missing-values imputers and a "standardised" fit!/predict API

Hello all, I have just published a “big” new release of BetaML.jl, the Machine Learning toolbox of the Bureau d’Économie Théorique et Appliquée (“BETA”) of Nancy: a repository with several Machine Learning algorithms (linear classifiers, random forests, neural networks, k-means/medoids, Gaussian Mixture Models…) and utility functions (samplers, partitioners, encoders, fitting measures, scalers, cross-validation, confusion matrix…) in pure Julia.

These are the release notes:

  • new experimental “V2” API that implements a “standard” mod = Model([Options]), fit!(mod,X,[Y]), predict(mod,[X]) workflow (see the sketch after this list). In BetaML v0.7 this new API is still experimental, as documentation and implementation are not yet complete (perceptrons and neural networks are still missing). We plan to make it the default API in BetaML 0.8, at which point the current API will be deemed deprecated.
  • new Imputation module with several missing-values imputers (MeanImputer, GMMImputer, RFImputer, GeneralImputer) together with the corresponding MLJ interfaces (an imputation sketch follows the list). The last one, in particular, allows using any regressor/classifier (not necessarily from BetaML) that implements the API described above
  • Cluster module reorganised to contain only the hard clustering algorithms (K-means and K-medoids), while GMM clustering and the new GMMRegressor1 and GMMRegressor2 (two different strategies over the underlying EM algorithm) moved to the new GMM module
  • Large files split into subfiles, e.g. Trees.jl, where decision trees and random forests now live in separate (included) files
  • New oneHotDecoder(x) function in the Utils module
  • New dependency on DocStringExtensions.jl
  • Several bugfixes
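
To make the new workflow concrete, here is a minimal sketch of the three-step pattern from the first bullet, assuming the new GMMRegressor2 already supports the experimental API with its default constructor options (exact option names and defaults may differ in v0.7):

```julia
using BetaML

# Some toy regression data
X = rand(100, 3)
Y = 2 .* X[:, 1] .- X[:, 2] .+ 0.1 .* rand(100)

mod = GMMRegressor2()   # 1. build the model (any option goes to the constructor)
fit!(mod, X, Y)         # 2. fit (train) it on the data
Ŷ   = predict(mod, X)   # 3. predict on the same or new data
```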
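And a similarly hedged sketch of the new imputers, assuming that for imputers fit! takes only the (partially missing) data matrix and that predict called without further arguments returns the imputed matrix:

```julia
using BetaML

# A small matrix with missing values to fill in
X = [1.0 10.5; missing 38.0; 3.0 missing; 4.0 42.0]

mod    = RFImputer()    # random-forest based imputer (default options)
fit!(mod, X)            # learn the imputation model from the observed values
X_full = predict(mod)   # retrieve the data with the missing values imputed
```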