Best Julia Package for Neural Networks

What is currently the best machine learning package in Julia?

First, in terms of ease of use for students.

Second, in terms of capability & performance for experts?

Flux.jl and Knet.jl are the most popular.

6 Likes

Flux.jl is amazingly easy to read. By far the most readable NN package I’ve ever used.
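To give a concrete idea of why, here is roughly what a dense layer looks like in Flux’s source (a simplified sketch, not the exact current code):

```julia
# A layer in Flux is just a struct holding parameters...
struct Dense{F,S,T}
    W::S   # weight matrix
    b::T   # bias vector
    σ::F   # activation function
end

Dense(in::Integer, out::Integer, σ = identity) =
    Dense(randn(out, in), zeros(out), σ)

# ...plus a one-line call method that is the forward pass
(a::Dense)(x) = a.σ.(a.W * x .+ a.b)
```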

If you’re looking for a general ML package, then MLJ.jl is your best bet! It’s a unified interface that can access Flux.jl and many other ML packages.
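For a taste of the workflow, here’s a minimal sketch (this particular model additionally needs the DecisionTree interface package installed):

```julia
using MLJ

# Pull in a model type from one of the packages MLJ wraps
Tree = @load DecisionTreeClassifier pkg=DecisionTree

X, y = @load_iris  # small built-in demo dataset

# The same machine/fit!/predict workflow applies to any wrapped model
mach = machine(Tree(), X, y)
fit!(mach)
ŷ = predict(mach, X)
```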

5 Likes

Do you mean the API level, like `Chain`, or the underlying code?

1 Like

I meant the underlying code, but I like both

My personal experience with both:
Ease of use? Probably Flux.

Performance? Knet.

Not sure if this is unique to me, but with my custom models on large 3D data, an epoch on a single core of my K80 takes 35 minutes in Flux. The same model in Knet, with the recent CUDA changes, takes 10 minutes. YMMV.

Flux comes “ready to go” with predefined layers, whereas Knet provides the underlying NN computations but you have to define your own layers (not too big a deal, to be honest). The maintainer of Knet is very active and is constantly updating the package. In fact, I think he’s working on a layers interface, so that should be available soon.
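For reference, here is what “defining your own layers” amounts to, a sketch along the lines of the Knet tutorial:

```julia
using Knet

# A dense layer is a struct holding the parameters plus a call method;
# param/param0 create trainable weight and bias arrays
struct Dense; w; b; f; end
Dense(i::Int, o::Int, f = relu) = Dense(param(o, i), param0(o), f)
(d::Dense)(x) = d.f.(d.w * mat(x) .+ d.b)

# A chain is just a tuple of layers applied in sequence
struct Chain; layers; end
Chain(layers...) = Chain(layers)
(c::Chain)(x) = (for l in c.layers; x = l(x); end; x)
```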

4 Likes

I have also been looking at Knet recently, but it seems like a one-person project. Not sure how long it will be maintained.

Keras was a one person project for a very long time :wink:

2 Likes

+1 for Knet. I’ve been using it for the past two years and it’s getting better and better. I somehow find the underlying code more user-friendly and easier to understand for a newbie in comparison to Flux (although Flux is more elegant, I still find it harder to understand).
When I started I was very curious about the internals, so I used the debugger a lot. In the case of Knet, if you start debugging, you’ll encounter mostly three “code blocks”: the types (`Chain`, `Conv`, which you defined yourself), a whole bunch of iterators over your data, and the backpropagation part, where you can see the tape being created. The one obscure part I found is the primitives for the backpropagation, which are automatically generated and which you cannot “step into” (you can find them in AutoGrad.jl).
It does receive contributions from other users, but my impression is that the core stuff is maintained only by Deniz Yuret. It should now gain more traction, since the transition to CuArrays is mostly done and there is no longer any need to compile the kernels separately at installation.
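If you’re curious about that tape, you can poke at it directly through AutoGrad.jl. A minimal sketch (the values are just made-up toy data):

```julia
using AutoGrad

# Wrap a value in Param so AutoGrad records the operations applied to it
w = Param([1.0, 2.0, 3.0])

# @diff runs the expression and returns the tape it recorded
tape = @diff sum(w .* w)

value(tape)   # forward result: 14.0
grad(tape, w) # gradient read off the tape: [2.0, 4.0, 6.0]
```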

Actually, it is maintained by a research group at Koç University.

2 Likes

I haven’t used Knet recently, but at the time it was a bit lower-level than Flux, which is sometimes better for learning and debugging. Flux feels more like magic, which is more convenient when it works.
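For comparison, the “magic” side looks like this in Flux (standard usage, give or take API versions):

```julia
using Flux

# A full model in one line, no layer definitions required
model = Chain(Dense(28^2, 32, relu), Dense(32, 10), softmax)

loss(x, y) = Flux.crossentropy(model(x), y)

# Dummy data, just to show the one-call training step
x = rand(Float32, 28^2, 100)
y = Flux.onehotbatch(rand(0:9, 100), 0:9)
Flux.train!(loss, Flux.params(model), [(x, y)], ADAM())
```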

1 Like

I am tempted to add my own package to the list: BetaML, a repository of ML algorithms (feed-forward NNs with optional AD, clustering/recommendation systems, linear/kernel classifiers, decision trees/random forests).

It is nowhere close to the other packages in the list in terms of performance/capabilities, but it is very easy to use, and the code is readable by a beginner (indeed, it was written by a beginner… :slight_smile: ) who wants to understand the basics of the various ML methods…

Note: in the title you say “NN”, but your question is about ML algorithms… NNs are “only” one category of ML algorithms…

2 Likes

The other thing you could do is just try both. They are similar enough, in my experience, that it doesn’t take much to port code from one to the other; in fact, I started in Flux and then moved over to Knet for the performance improvements. See which one works for your use case.
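To give a flavour of how close they can be, here’s the same two-layer forward pass in each (a rough sketch; initialization and types will differ in detail):

```julia
x = rand(Float32, 10)

# Flux: the layers come predefined
using Flux
m_flux = Chain(Dense(10, 5, relu), Dense(5, 2))
m_flux(x)

# Knet: the same structure from a hand-rolled layer
# (run in a separate session; Flux and Knet both export names like relu)
using Knet
struct KDense; w; b; f; end
KDense(i, o, f = identity) = KDense(param(o, i), param0(o), f)
(d::KDense)(h) = d.f.(d.w * h .+ d.b)
knet_layers = (KDense(10, 5, relu), KDense(5, 2))
m_knet(h) = foldl((a, l) -> l(a), knet_layers; init = h)
m_knet(x)
```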