Flux.jl versus Lux.jl

The Scientific Machine Learning (SciML) ecosystem in Julia now recommends Lux.jl over Flux.jl. In what cases, generally speaking, is Flux.jl better than Lux.jl? Looking at the questions on Discourse, I have the impression that most work still uses Flux. Is this true?

I tried to find complex examples of networks using Lux.jl and came up short. Where can I find these? For example, have transformers been implemented with Lux? What about GRUs, LSTMs, and attention models?

The documentation explains some of this here: Why use Lux over Flux?


I read that DiffEqFlux extends Lux. Is it 100% implemented in Lux? If not, why not? If so, shouldn’t the module then be renamed DiffEqLux? I have read on this site that choosing good names matters. Using the substring Flux in a library that extends Lux is confusing to me.



I guess now DiffEqFlux is actually DiffEqFLux. :smirk: (F stands for “for”)

It will still take in Flux neural networks, but convert them to Lux (because the two are interchangeable). That makes the code simpler and improves performance. We don’t plan to drop the Flux frontend; users can still give us Flux models everywhere if they so choose. But, for documented reasons, our tutorials will not suggest that people use Flux, and we will convert Flux models to Lux models internally.
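To make the workflow above concrete, here is a minimal sketch of driving a Lux network with DiffEqFlux’s `NeuralODE`. The names come from the public APIs of Lux, DiffEqFlux, and OrdinaryDiffEq, but treat the exact signatures and keyword arguments as illustrative rather than canonical:

```julia
using Lux, DiffEqFlux, OrdinaryDiffEq, Random

rng = Random.default_rng()

# The ODE right-hand side is a plain Lux model...
dudt = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
ps, st = Lux.setup(rng, dudt)   # ...with its parameters and state held outside it

# Wrap it in a NeuralODE solved with Tsit5 over t ∈ [0, 1]
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)

u0 = Float32[1.0, 0.0]
sol, st = node(u0, ps, st)      # state is threaded through explicitly, Lux-style
```

Because `ps` is just a plain parameter structure rather than something buried inside layer objects, it slots directly into the SciML optimization and sensitivity machinery.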

:slight_smile: I thought I was being original.

disclaimer: I’m a Flux maintainer and haven’t tinkered much with Lux, so I’m probably a little biased.

I would pick Flux for any typical deep learning task, and use Lux for NeuralODE-like tasks instead.
Flux has been around longer, is maintained by more people, and has a larger ecosystem of packages built on top of it.

The main difference from Lux is that Flux models have stateful layers (like PyTorch), while Lux passes the state around explicitly (like Flax). I don’t think you’ll find many other differences: Lux borrows heavily from Flux, and both packages rely on the same packages in the Julia ecosystem, e.g. NNlib.jl, Optimisers.jl, Zygote.jl (all maintained by the Flux people), and then CUDA.jl etc…
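The stateful-versus-explicit distinction can be sketched side by side. This assumes both packages are installed; the layer constructors and `Lux.setup` are their real public APIs, but consider this an illustrative sketch rather than a complete example:

```julia
using Flux, Lux, Random

x = rand(Float32, 2, 8)   # a batch of 8 two-dimensional inputs

# Flux: parameters live inside the layer objects (stateful, PyTorch-style)
fmodel = Flux.Chain(Flux.Dense(2 => 3, relu), Flux.Dense(3 => 1))
y = fmodel(x)             # just call the model; it carries its own weights

# Lux: the model is an immutable description of the architecture;
# parameters and state are created separately and passed in explicitly (Flax-style)
lmodel = Lux.Chain(Lux.Dense(2 => 3, relu), Lux.Dense(3 => 1))
ps, st = Lux.setup(Random.default_rng(), lmodel)
y, st = lmodel(x, ps, st) # state is returned and threaded through each call
```

The explicit `ps`/`st` style is what makes Lux models compose cleanly with ODE solvers and other SciML tooling, since the solver can treat the parameters as an ordinary vector.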