Lilith.jl is now called Avalon.jl

Exactly. Frankly speaking, I can't recommend either Flux or Avalon as the main deep learning library for someone in industry (just yet), and not because dropout or transposed convolutions are missing, but because there are still too many bugs and caveats. Sometimes these issues come from third-party libraries and have a long way to go before being fixed (e.g. like this bug in CUDA.jl), sometimes they hit corner cases and take weeks to fix (e.g. this one). But that's part of the infrastructure maturing - by the time GPU support, web programming, API clients, big data tools, etc. are ready, we will already have the ML kitchen in good enough shape to finally replace Python.
What is good about the existing deep learning libraries in Julia is that they are already suitable for certain tasks (e.g. I used Avalon extensively for my representation learning experiments, some of which can be found in the model zoo), and if something is missing, it's usually not too hard to add (e.g. I'm currently working on Transformers, which require at least an Embedding layer, so that's my next goal).
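To illustrate why such additions tend to be small, here is a minimal sketch of what a standalone Embedding layer could look like in plain Julia. The struct name, constructor, and callable lookup are assumptions for this example, not Avalon's actual API:

```julia
# Hypothetical sketch of an embedding layer; not Avalon's actual implementation.
struct Embedding
    weight::Matrix{Float32}   # (embedding_dim, vocab_size)
end

# Initialize the lookup table with small random weights
Embedding(vocab_size::Int, embedding_dim::Int) =
    Embedding(0.01f0 .* randn(Float32, embedding_dim, vocab_size))

# Look up embedding vectors for a sequence of token ids
(e::Embedding)(ids::AbstractVector{<:Integer}) = e.weight[:, ids]

emb = Embedding(10_000, 128)
x = emb([1, 5, 42])   # 128×3 matrix, one column per token
```

The forward pass is just column indexing into the weight matrix; the real work in a library is mostly wiring up the gradient (scatter-add into the weight matrix) for the autodiff engine.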

No, I didn't, thanks for letting me know! It will definitely influence my work in the ONNX branch.