I wanted to announce to y’all that I’m heading up a study group to apply Julia to the fantastic deep learning course at fast.ai. We’re going to be following along with the book and implementing things in Julia as we go!
For those of you not familiar, fastai is an extremely good deep learning course. It’s typically done in Python. They have tons of lectures, resources, and tools to make deep learning extremely easy. Importantly, they have a “code-first” approach to teaching, which I personally find to be much nicer than dense math.
Some things you can expect in our study group:
Learning about deep learning!
Writing good Flux.jl code.
Regular meetings to check in on everyone’s learning.
Presentations on deep dives by group members (I’ll do a bunch of them if people would like that)
Intermittent breaks from following the book to do fun side projects
A good community of smart people!
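To give a taste of the kind of Flux.jl code we’ll be writing, here’s a minimal training-loop sketch. The data here is purely hypothetical (random inputs and labels), and this assumes a recent Flux.jl version with the explicit optimiser-state API:

```julia
using Flux

# A tiny multilayer perceptron: 4 inputs -> 8 hidden units -> 2 outputs
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

# Hypothetical toy data: 100 random samples with random class labels
X = rand(Float32, 4, 100)
Y = Flux.onehotbatch(rand(1:2, 100), 1:2)

# Explicit optimiser state, then a plain training loop
opt_state = Flux.setup(Adam(1e-3), model)
for epoch in 1:10
    loss, grads = Flux.withgradient(model) do m
        Flux.logitcrossentropy(m(X), Y)
    end
    Flux.update!(opt_state, model, grads[1])
end
```

Nothing fancy, but it shows the shape of things: build a `Chain`, take gradients with `withgradient`, and step the optimiser with `update!`.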
Please join our Discord if you are interested in following along! I am trying to keep the Discord relatively spam-free, so please DM me here on the Julia Discourse or email me at cameron (at) pfiffer (dot) org.
Everyone is welcome, no matter your skill level! I personally find that teaching is the best way to learn, so I’m hoping to have folks around that I can be helpful to.
Flux is way more mature at the moment. It’s a bit funny, but I am unable to settle on one framework in Julia because of the lack of consensus and the constant rewrites to fix flaws. It just incites the urge to stick with Python for work instead. We sort of dumped Knet, and now we’re trying to dump Flux in a way, and it feels like a mess from my general user perspective. I recently started thinking I might have to waste a couple of years writing my own framework (out of love for Julia) for my work, because these folks can’t settle.
To be fair, it’s not really better in Python. You have PyTorch, TensorFlow, MXNet, JAX, etc. I would say it’s not really settled there either. Most of the frameworks in Julia will probably also coexist. That’s not necessarily a bad thing, even if it makes choosing a framework harder.
Fair, there are ample choices there too (they have the problem of choice as well), but then it comes down to how battle-tested things are. Most of those frameworks are well ironed out for research purposes; the gap between getting started and working in the trenches is much smaller there than what we have here right now. It would be nice to have one well-supported framework that we can trust and use for research. PyTorch looks like it’s clearly winning over there and seems to want to maintain that place for a while.
My sense of Lux is that it’s still quite new – I’d be interested in playing with it during the study group and I would fully endorse people exploring it, but Flux is pretty robust for the moment and thus is probably a good self-learning tool.
My sense is that it’s good if you only want to do computer vision & tabular data stuff. Internally we are probably not going to use it a lot for this course, though it’s up to each learner. We may end up trying to merge some features back into FastAI.jl if we can.
Basic support for text data was added as part of a GSoC project, but it still needs some work before merging. In general I think the JuliaText ecosystem could use some help too, but that’s beyond the scope of FastAI.jl.
WRT development, all I know is that real life commitments mean that there isn’t nearly as much time to work on FastAI.jl as there was before. If anyone is interested in helping with continued development of the library and has a decent amount of experience with Julia dev + ML, feel free to reach out.
I think I could be interested in talking to someone about it. I’m very, very interested in text stuff, so I expect to show up around the JuliaText ecosystem. I wouldn’t mind a one-on-one chat with whoever is currently in the JuliaText world to get me acquainted with the ecosystem.
There are also some folks in this study group who are pretty good developers; it might be worth giving me an overview, if only so I can start offering up little tasks to people.
Unfortunately, I am likewise disconnected from the JuliaText ecosystem and would not know who to talk to about this. If you figure that out, I would be more than happy to join the conversation too!