I am excited to announce the release of FastAI.jl which has been in the works for over a year now. FastAI.jl is a high-level, user-friendly library for training deep learning models built on Flux.jl. It draws inspiration from the great fastai. We hope the package makes the Julia deep learning ecosystem more approachable for beginners and life easier for more advanced practitioners.
The extensive documentation has a host of tutorials and examples, and you can try them out by installing FastAI.jl or, if you want access to a free GPU, by using this Google Colab template.
Features include:

- high-level training loops with hyperparameter scheduling and callbacks
- a learning rate finder
- dataset loading
- efficient data augmentation
- visualizations
- logging
- efficient, parallelized data loading
- and many more
FastAI.jl comes with stable low-level and mid-level APIs for doing most things that are possible in fastai. Support for computer vision applications is very good, and we hope to add the high-level functionality for other applications like tabular data, recommendation systems, and text over the following weeks and months. This documentation page gives a comparison between the original fastai package's and FastAI.jl's APIs.
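To give a feel for the high-level API, a quickstart along the lines of the docs looks roughly like the sketch below. This is hedged: names such as `loaddataset`, `methodlearner`, and `Models.xresnet18` follow the documentation of the time and may have changed in later versions.

```julia
using FastAI

# Load an image classification dataset as a data container, plus
# block descriptions of the inputs and targets.
data, blocks = loaddataset("imagenette2-160", (Image, Label))

# A learning method bundles the data pipeline for a task.
method = ImageClassificationSingle(blocks)

# Build a Learner from the method, data, model, and callbacks.
learner = methodlearner(method, data, Models.xresnet18(),
                        ToGPU(), Metrics(accuracy))

# Train with the one-cycle learning rate schedule.
fitonecycle!(learner, 10)

# Visualize model predictions on a batch.
showoutputs(method, learner)
```

The same `fitonecycle!` call appears in the Pluto notebook discussed further down in this thread.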
For those interested in learning more, we’ll be hosting a Q&A session 02.08., 10PM UTC (03.08., 12AM CEST | 8AM AEST). Jeremy Howard (creator of fastai) will be there, too.
That said, unless you are representing FastAI (which is a specific organization/entity), it might be better to choose another name if all you wish to portray is a similar flavor of ease-of-use.
I don’t know if FastAI has registered trademarks (and could therefore legally force you to change the name), but regardless, it isn’t a good look to create confusion and put them in the awkward position of explaining that they are not responsible for the Julia library (or its lack of feature parity, distinct design decisions, etc.).
Imitation is certainly a form of flattery, but the way to keep it in good taste is not to copy the name.
Forgive this comment if you do represent FastAI after all.
See here: it looks like Jeremy Howard, (one of) the creators/maintainers of fastai, is thrilled to have the project named like this and is also excited about the general Julia ML community! Hopefully trademarks and such have been sorted if necessary, but if the author is pleased then that’s 99% of the way there.
We (well, not me, the folks who put the hard work into FastAI and supporting libraries) have been in touch with Jeremy and fast.ai contributors off and on since the project’s inception. You can probably find the nitty-gritty details scattered across Zulip and the fastai discord.
Great to hear! @holylorenzo & co, Good luck with the project!
The modular pieces look particularly useful and exciting for everyone in the Julia ecosystem.
FastAI builds on top of Flux, which it uses as a differentiable programming toolbox for neural networks. FastAI then provides different levels of abstraction that allow practitioners to quickly develop models for specific tasks such as image classification. There are many more nuances, of course, and I’m sure the authors can provide a lot more detail, but that’s how I view it currently.
FastAI.jl is built on top of many packages and combines them. It uses Flux.jl for models and optimization, but is still useful if you’re not using Flux.jl. Flux.jl is to FastAI.jl what PyTorch is to fast.ai. FastAI.jl builds on many other packages as well; some purpose-built for FastAI.jl, but all useful in their own right. From the docs:
- Flux.jl provides models, optimizers, and loss functions, fulfilling a similar role to PyTorch
- MLDataPattern.jl gives you tools for building and transforming data containers
- DataLoaders.jl takes care of efficient, parallelized iteration of data containers
- DLPipelines.jl provides the low-level LearningMethod interface for defining data pipelines
- DataAugmentation.jl takes care of the lower levels of high-performance, composable data augmentations
- FluxTraining.jl contributes a highly extensible training loop with two-way callbacks
You may also find this new blog post to give a good overview of how FastAI.jl brings these together in a high-level interface.
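To make the layering concrete, here is a hedged mid-level sketch of how those packages might fit together for an image classification task. The `method*` helper names follow the FastAI.jl docs of the time and may differ between versions, so treat this as an illustration of the architecture rather than a definitive recipe.

```julia
using FastAI  # re-exports functionality from the packages below

# Dataset loading returns an MLDataPattern.jl-style data container.
data, blocks = loaddataset("imagenette2-160", (Image, Label))

# DLPipelines.jl: a LearningMethod describes the data pipeline,
# with DataAugmentation.jl transforms applied under the hood.
method = ImageClassificationSingle(blocks)

# DataLoaders.jl: efficient, parallelized batch iteration.
traindl, valdl = methoddataloaders(method, data)

# Flux.jl: the model and loss function.
model = methodmodel(method, Models.xresnet18())
lossfn = methodlossfn(method)

# FluxTraining.jl: the extensible training loop with callbacks.
learner = Learner(model, (traindl, valdl), ADAM(), lossfn, ToGPU())
fit!(learner, 10)
```

The nice part of this design is that each layer is usable on its own: you can, for example, use DataLoaders.jl for parallel data iteration without touching the rest of the stack.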
I’ve just updated the description with the meeting link for the Q&A about FastAI.jl that’s happening 02.08., 10PM UTC (03.08., 12AM CEST | 8AM AEST). If you already have a question in mind, you can submit it to the pigeonhole link.
There was a small issue that prevented it from working for me:
From worker 2: WARNING: both FastAI and CairoMakie export "Image"; uses of it in module workspace2 must be qualified
From worker 2: WARNING: both FastAI and CairoMakie export "Label"; uses of it in module workspace3 must be qualified
… and it worked almost until the end. The final `fitonecycle!(learner, 10)` seems to freeze for me… I’m not sure if it’s because it’s running too slowly on my CPU?
Thanks for the feedback, I’ll update the imports so it works.
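For anyone hitting the same warning in the meantime, one hedged workaround (assuming the same `loaddataset` call as the notebook) is to qualify the clashing names rather than rely on the unqualified exports:

```julia
using CairoMakie
using FastAI

# Both packages export `Image` and `Label`; qualifying the FastAI
# block types explicitly avoids the ambiguity warning.
data, blocks = loaddataset("imagenette2-160", (FastAI.Image, FastAI.Label))
```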
Unfortunately, Pluto.jl doesn’t yet show you the output during training, so once you run the cell it will only return once training is complete, which can take quite some time, especially the first time when it’s compiling. You’ll get better interactive output for the training part in a Jupyter notebook atm. Training will also be prohibitively slow on a CPU, so I recommend using a GPU. If you don’t have access to one, you can try out this Google Colab template and copy the code over. Let me know if everything works out!
Is there a reference for some of the high-level philosophy guiding the FastAI.jl port? I’m interested in contributing but am somewhat intimidated by the mapping from Python to Julia, specifically which of Jeremy’s and Sylvain’s choices were constrained by Python and wouldn’t apply in Julia.
EDIT: I am reading through the API comparison doc now.
Has anyone else run into problems with the visualization based on the functions defined under interpretation (in `method.jl`)? I consistently seem to be getting `ERROR: UndefVarError: showbatch not defined` or `ERROR: UndefVarError: showoutputs not defined` etc., even though I am of course `using FastAI`, and I checked that FastAI.jl does `include("interpretation/method.jl")` and export `showbatch` etc. It also doesn’t help when I explicitly write `FastAI.showbatch` or so. I am working in VS Code. What am I missing here? Everything else works like a charm though!