What are the current options for deep learning on an AMD GPU using Julia?

Hello everyone,

I am quite new to Julia and have spent the last few days wondering how I should go about my next project, which will involve training word embedding models and performing various downstream tasks afterwards. I would like to use as much Julia as possible, but I was wondering what my different options are. My current perception is that, although a lot of work is being done in that area, AMDGPU.jl has not yet been integrated with the other machine learning libraries I have found in Julia (Flux, Knet, MLJ, more?), because it is not fully ready yet, unlike CUDA.jl; however, it would be required as the backend for these libraries to work with AMD GPUs (please correct me if this is wrong!).
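For concreteness, this is roughly the Flux + CUDA.jl workflow I have in mind (just a sketch with a dummy model and batch, not something I have run myself); my question is essentially whether an AMDGPU.jl equivalent of this exists yet:

```julia
# Sketch of the usual Flux + CUDA.jl pattern (NVIDIA GPUs only, as far as I know).
using Flux, CUDA

model = Chain(Dense(300, 128, relu), Dense(128, 10)) |> gpu  # move parameters to the GPU
x = rand(Float32, 300, 32) |> gpu                            # dummy batch of 32 inputs
y = model(x)                                                 # forward pass runs on the GPU
```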
Is using PyTorch via PyCall a viable option, and does anyone have experience doing that? I was also wondering whether MXNet might work, since ROCm seems to support it and it appears to have a Julia interface. I have also seen the “old” TensorFlow package, but it has been in maintenance mode for such a long time that I am a bit “afraid” to use it, since many things may have changed. If none of those is an option, what would be my current best shot at handling everything after word embedding training within Julia? Has anyone had a similar problem before, and how did you get your models into Julia?
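In case it clarifies what I mean by the PyCall route, something along these lines is what I am imagining (untested sketch; it assumes a Python environment with a ROCm build of PyTorch that PyCall can find, and I believe the ROCm build still exposes the GPU through the torch.cuda API, but that is an assumption on my part):

```julia
# Untested sketch of driving PyTorch from Julia via PyCall.
using PyCall

torch = pyimport("torch")
println(torch.cuda.is_available())   # ROCm builds reportedly report the GPU here too

device = torch.device(torch.cuda.is_available() ? "cuda" : "cpu")
emb = torch.nn.Embedding(10_000, 300).to(device)        # toy embedding layer

ids = torch.tensor([[1, 2, 3], [4, 5, 6]]).to(device)   # a small batch of token ids
vectors = emb(ids)                                       # lookup runs on the GPU
println(vectors.shape)
```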

This is a bit off-topic from the main question, but maybe it makes sense to also describe my situation and background a bit, so it is more understandable why I would like to do all (or most) of this in Julia (despite it possibly being an uphill battle here or there): I come from an R background, and the last time I did some deep learning was about three years ago, using R's tensorflow package (which is just a wrapper for calling TensorFlow via Python from R).
A few months ago I had to do some clustering for a project and faced the problem that neither R nor Python had a fast enough implementation of the algorithm available. So I had to decide whether I wanted to start learning C++ (zero previous experience) and switch back and forth between R and C++, or try something else. During my research on possible alternatives I stumbled upon Julia, and I've fallen in love with the language quite a bit ever since.
Now I am in the situation described above, where I want to train word embedding models and then perform various downstream tasks, and I am lucky enough to have server access to a machine with an MI100 AMD GPU, which is also needed for the amount of data I have to process. This is also my first “big data” project, and I expect I will have to write some code that is not available in pre-existing R libraries, which is why I would like to do as much of the work as possible within Julia. My data is also too large to fit in memory, so data.table, which used to be my go-to package in R, might not be a good choice anymore (and I have read good things about JuliaDB, for example).

Sorry for the kind of long post, but thanks for taking the time to read it :slight_smile: