ANN: LossFunctions.jl

Hello everyone!

After MLLabelUtils.jl, I am happy to announce the next JuliaML package to reach a mature state: LossFunctions.jl

Github: https://github.com/JuliaML/LossFunctions.jl

Description

LossFunctions.jl is a package designed for the sole purpose of providing an efficient and extensible implementation of the various loss functions used throughout Machine Learning. It is thus intended to serve as a special-purpose back-end for other (Julia)ML libraries that require losses to accomplish their tasks. To that end we provide a considerable number of carefully implemented loss functions, as well as an API to query their properties (e.g. convexity). Furthermore, we expose methods to compute their values, derivatives, and second derivatives for single observations as well as for arbitrarily sized arrays of observations. In the case of arrays, a user can additionally specify whether and how the element-wise results are averaged or summed. We put a lot of effort into testing correctness and type-stability, and we tried to make everything as type-friendly as possible.
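To give a rough feel for the API, here is a sketch of the kind of usage the documentation describes. Take the argument order and the aggregation syntax (`AvgMode.Sum()` / `AvgMode.Mean()`) with a grain of salt, since those details may differ between versions; the docs are the authoritative reference.

```julia
using LossFunctions

# Losses are plain Julia types; two common ones:
loss  = L2DistLoss()     # squared distance loss for regression
hinge = L1HingeLoss()    # hinge loss for margin-based classification

# Query analytical properties of a loss.
isconvex(loss)           # true
isdifferentiable(hinge)  # false (kink at the margin)

# Evaluate for a single observation (argument order (target, output)
# as in the docs at the time of writing; may have changed since).
value(loss, 0.5, 0.3)    # (0.3 - 0.5)^2 = 0.04
deriv(loss, 0.5, 0.3)    # first derivative w.r.t. the output
deriv2(loss, 0.5, 0.3)   # second derivative

# Element-wise evaluation over arrays of observations.
targets = rand(10)
outputs = rand(10)
value(loss, targets, outputs)                 # vector of element-wise losses

# Optionally sum or average the element-wise results.
value(loss, targets, outputs, AvgMode.Sum())
value(loss, targets, outputs, AvgMode.Mean())
```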

Check it out! The documentation just got a huge overhaul.

Documentation: LossFunctions.jl latest documentation (on Read the Docs)

Motivation

The motivation behind LossFunctions is pretty simple. It is quite common in other languages to bake the loss function into the implementation of the algorithm (you may have seen frameworks where you have to specify the loss as a string, e.g. loss="l1"). While that may be necessary in some special cases, many algorithms really only require that the loss belongs to some family of loss functions with certain properties. Our goal was to let users define new losses in a first-class manner without losing the ability to use them in any algorithm that could, in theory, handle such a loss function.

Setting aside the general restriction imposed by a two-language barrier, there are various other reasons I can think of why other frameworks still do it this way. My guess is that one of the most convincing reasons is a practical one: most loss functions are themselves very simple operations; for some it boils down to just performing a subtraction. In most languages, any kind of abstraction around that could introduce overhead. In Julia we were able to define the losses with many layers of abstraction that still have zero cost, which is quite nice.
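To make "first-class" concrete, here is a rough sketch of what defining and using a custom loss could look like. The type `MyScaledL2Loss` and the `fit_scalar` routine are made up for illustration, and the extension pattern (defining `value`/`deriv` on the difference for a `DistanceLoss` subtype) is paraphrased from memory of the docs rather than copied, so treat this as a sketch, not the canonical recipe.

```julia
using LossFunctions
import LossFunctions: value, deriv, isdifferentiable

# Hypothetical user-defined loss: a scaled squared-distance loss. Subtyping
# DistanceLoss means the loss is a function of the difference (output - target).
struct MyScaledL2Loss <: LossFunctions.DistanceLoss
    scale::Float64
end

# Single-argument methods on the difference; the (target, output) methods
# of the package should fall back to these for DistanceLoss subtypes.
value(l::MyScaledL2Loss, difference::Number) = l.scale * abs2(difference)
deriv(l::MyScaledL2Loss, difference::Number) = 2 * l.scale * difference

# Declaring properties lets generic code query them.
isdifferentiable(::MyScaledL2Loss) = true

# A generic algorithm can then accept any differentiable loss instead of a
# hard-coded string like loss="l1". Toy gradient-descent sketch for a
# one-parameter linear model (purely illustrative):
function fit_scalar(loss, x, y; lr = 0.1, iters = 100)
    w = 0.0
    for _ in 1:iters
        g = sum(deriv(loss, y[i], w * x[i]) * x[i] for i in eachindex(x)) / length(x)
        w -= lr * g
    end
    return w
end

fit_scalar(MyScaledL2Loss(0.5), rand(100), rand(100))  # works with the custom loss
fit_scalar(L2DistLoss(), rand(100), rand(100))         # or with any built-in loss
```

Because the dispatch boundaries compile away, the generic routine pays nothing for this flexibility.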

Closing Words

Let me know what you think. Any kind of feedback or criticism is very welcome!

Big thanks to all the contributors!


Thank you for this library! It’s very succinct and does its job well. Keep up the good work!


This looks like a great package – I haven’t tried it but the documentation and design are excellent. Having packages like this that implement one kind of thing – in this case loss functions – in a generic, reusable, high-performance way is, to me, really a big part of the promise of Julia. Multiple dispatch really does make sharing a package like this feasible 🤗


Many thanks for this library. I’m the guy who already had a LossFunctions.jl on GitHub (unregistered), but your package is vastly better and I’ll be switching my other code over to using this library going forward.
