I’m not so much asking whether they are better; I’m mostly after a mapping, trying to get an overview of the mainstream and of where there might be big holes in the Julia ecosystem.
E.g. PyTorch Lightning corresponds to Keras; what does it correspond to in Julia? See e.g.: “Converting From Keras To PyTorch Lightning” by William Falcon | Towards Data Science
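To make the comparison concrete, here’s roughly what that level of abstraction looks like in Flux, as far as I understand it (a minimal sketch against the Flux 0.12-era API; the model, data, and hyperparameters are just placeholders):

```julia
using Flux

# A toy classifier; layer sizes are arbitrary placeholders
model = Chain(Dense(784, 32, relu), Dense(32, 10), softmax)

loss(x, y) = Flux.crossentropy(model(x), y)
opt = ADAM()

# Dummy data just to make the example self-contained
x = rand(Float32, 784, 100)
y = Flux.onehotbatch(rand(0:9, 100), 0:9)

# Flux.train! plays a role loosely analogous to Keras's model.fit or
# Lightning's Trainer.fit, but for anything fancier you write the loop yourself
Flux.train!(loss, Flux.params(model), [(x, y)], opt)
```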
They are higher-level interfaces to PyTorch and TensorFlow, and those two are the most mainstream low-level packages, right? There’s also MXNet, which Julia is already officially part of (MXNet ships official Julia bindings), but has MXNet fallen out of favor (not only for Julia)?
Just as Torch was written in Lua and then migrated to Python, the same could happen again with Julia, or has it already happened in the form of Flux (which would also be a replacement for TensorFlow)? Where does Knet fit in?
I’m mostly thinking of neural networks, but feel free to add more mappings for machine learning in general. E.g. MLJ.jl would correspond to scikit-learn? And MLJ can also use scikit-learn as a backend.
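On the MLJ/scikit-learn mapping, the fit/predict cycle looks like this as far as I can tell (a minimal sketch assuming MLJ plus DecisionTree.jl are installed):

```julia
using MLJ

# Load a registered model type; this one is implemented in DecisionTree.jl
Tree = @load DecisionTreeClassifier pkg=DecisionTree

# Built-in toy data, like sklearn.datasets.load_iris
X, y = @load_iris

# A "machine" binds model and data, roughly an unfitted sklearn estimator
mach = machine(Tree(), X, y)
fit!(mach)
yhat = predict(mach, X)  # probabilistic predictions by default
```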
For some applications, e.g. GPT-3-style models, would you find them in a model zoo, e.g. Flux’s model-zoo (if available at all), or in other, more likely places? And what about models improving on GPT-3, like Google Brain’s Switch Transformer? See: “Google Brain’s Switch Transformer Language Model Packs 1.6-Trillion Parameters” | Synced
For multi-GPU training, what’s the go-to library now: Horovod, DeepSpeed, or DeepSpeed’s fork https://github.com/EleutherAI/DeeperSpeed ? And what’s the equivalent in Julia, if any, or could you use those from Julia?
https://github.com/FluxML/FastAI.jl corresponds well to fast.ai.
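From its README, the high-level API seems to mirror fastai’s Learner quite closely; this sketch is from my reading of the docs, so the exact function names may differ between versions:

```julia
using FastAI

# Load a dataset plus block (input/target) descriptions
# (names per the FastAI.jl README at the time of writing; may have changed)
data, blocks = loaddataset("imagenette2-160", (Image, Label))

# Build a learning task and a learner, then train with a one-cycle schedule,
# much like fastai's fit_one_cycle
task = ImageClassificationSingle(blocks)
learner = tasklearner(task, data, callbacks=[ToGPU()])
fitonecycle!(learner, 10)
```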