Flux with AMD GPU(s)?

Has anyone used Flux with an AMD GPU? I will be involved in a project that will likely be using the LUMI supercomputer in Finland, which is based on AMD GPUs.

1 Like

Not sure about Flux in particular, but AMD GPU support has been making good progress recently as far as I understand, see:

@jpsamaroo can probably be more specific.

5 Likes

Hi @johnbb ! Flux should work with AMDGPU.jl, although many features (like CNNs or softmax) don’t work yet because we haven’t hooked up the necessary functions from ROCm’s MIOpen library. That should be pretty easy to wire up, though, so if you want to take this on, please let me know!
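A minimal sketch of what that could look like for a plain MLP (this fmap-based conversion is just an illustration assuming a recent Flux, not an official Flux/AMD integration; anything that needs MIOpen, like conv or softmax, is expected to error for now):

```julia
# Illustrative sketch: move a plain MLP onto an AMD GPU by converting its
# parameter arrays to ROCArray. Layers needing MIOpen (conv, softmax, ...)
# are expected to fail at this point.
using Flux, Functors, AMDGPU

model = Chain(Dense(10 => 32, relu), Dense(32 => 1))

# fmap (from Functors.jl) walks the model and converts every plain Array.
rocmodel = fmap(a -> a isa Array ? ROCArray(a) : a, model)

x = ROCArray(rand(Float32, 10, 64))  # a batch of 64 samples
y = rocmodel(x)                      # forward pass on the GPU
```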

5 Likes

Pinging @luraess because he’s actively working on Julia + AMD GPUs and also is testing on LUMI if I’m not mistaken.

1 Like

Julian, do you maybe have a GitHub issue with a short list of what steps you think are required… beyond just trying each Flux test/example and seeing what’s missing/broken, then examining the CUDA.jl equivalent? Thanks!

2 Likes

Indeed, doing some early access tests with AMDGPU.jl, MPI.jl and ImplicitGlobalGrid.jl on LUMI. The ROCm stack is functional and accessible from AMDGPU. Currently testing with Julia v1.8.0-rc3. “Classical” HPC though so nothing done with Flux (yet).
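For context, the array-level AMDGPU.jl usage this kind of work exercises looks roughly like the following (a generic sketch, nothing LUMI-specific):

```julia
# Generic AMDGPU.jl array programming: broadcasts and reductions compile to
# GPU kernels, matrix multiply goes through rocBLAS.
using AMDGPU

a = ROCArray(rand(Float32, 1024, 1024))
b = ROCArray(rand(Float32, 1024, 1024))

c = a * b             # matrix multiply via rocBLAS
d = c .* 2f0 .+ 1f0   # fused broadcast, compiled to a GPU kernel
s = sum(d)            # reduction on the GPU
```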

2 Likes

In addition to AMDGPU.jl providing the necessary bindings, we’ll want to create an AMD equivalent for NNlib. Once that’s in place, Flux models should just work™. Maybe we ought to create this repo and use the NNlib interface as “the list” to track all the missing pieces top down?
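Roughly, such a glue package would just add ROCArray methods for the NNlib entry points; a hypothetical sketch (the real thing would call into MIOpen, here a generic broadcast fallback stands in):

```julia
# Hypothetical sketch of the glue-package idea: overload an NNlib function
# for ROCArray. A real package would forward to MIOpen / rocBLAS; a generic
# broadcast implementation is shown only to make the example self-contained.
using NNlib, AMDGPU

function NNlib.softmax(x::ROCArray; dims = 1)
    e = exp.(x .- maximum(x; dims = dims))
    return e ./ sum(e; dims = dims)
end
```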

5 Likes

BTW while our 3D+ML team at AMD is using Julia and AMDGPU.jl, we’re not heavy users of Flux and NNlib yet… we write our ML kernels primarily using KernelAbstractions.jl. So while I wish we could address this Flux+AMDGPU limitation, we can’t prioritize it right now. But our team will be hiring 3 more research engineers soon. If anyone who reads this is interested in joining our team and supporting this use case, please message me. 🙂
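For anyone unfamiliar with that style, a kernel written with KernelAbstractions.jl looks like this (a minimal sketch; the launch syntax shown in the comments follows the newer get_backend API and varies between KernelAbstractions versions):

```julia
# A vendor-agnostic kernel: the same definition runs on CPU, CUDA or ROCm
# backends, which is why it sidesteps the NNlib/MIOpen gap.
using KernelAbstractions

@kernel function axpy!(y, a, @Const(x))
    i = @index(Global)
    @inbounds y[i] += a * x[i]
end

# Launching on whatever backend the arrays live on (e.g. ROCArrays):
#   backend = get_backend(y)
#   axpy!(backend, 256)(y, a, x; ndrange = length(y))
#   KernelAbstractions.synchronize(backend)
```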

9 Likes

Using the NNlib interface as our list of missing features sounds good; I’m not actively focusing on Flux-based ML right now, though, so I’ll let one of you create NNlibAMDGPU.jl (or NNlibROCm.jl, etc.). Feel free to also add comments to Implement Neural Network primitives · Issue #11 · JuliaGPU/AMDGPU.jl · GitHub.

1 Like

Thanks for the responses, everyone. I likely don’t have the skills to contribute to an AMD NNlib, unfortunately, apart from writing tests and being a keen user (through Flux). As it stands, I will not have access to LUMI/AMD GPUs until well into 2023. It would be great if we could somehow use Flux (or similar) on AMD GPUs within the next year or two.

Given the current status, would it make sense to start working on Flux with CUDA and then switch to AMD once things are more complete?
How many changes would be needed to port the code from the NVIDIA to the AMD backend (restricting ourselves to ROCm-supported components, of course)? The idea is to use NVIDIA as the primary backend and just cross-test on AMD from time to time to know when it is ready for a switch.
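One low-tech way I am considering to keep the code portable while developing on CUDA (a sketch of my own, not an official Flux mechanism) is to funnel every device transfer through a single function and swap its definition when targeting AMD:

```julia
# Keep all device transfers behind one function so switching backends is a
# one-line change; everything else is written against AbstractArray.
using Flux, Functors

# Pick one backend by swapping a single definition:
# using CUDA;   to_device(a) = cu(a)         # NVIDIA
# using AMDGPU; to_device(a) = ROCArray(a)   # AMD
to_device(a) = a                             # CPU fallback

model = Chain(Dense(10 => 32, relu), Dense(32 => 1))
gpu_model = fmap(a -> a isa Array ? to_device(a) : a, model)

x = to_device(rand(Float32, 10, 8))
y = gpu_model(x)
```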

Yes, in my case I already have working models in Flux on single NVIDIA GPUs. In the aforementioned project, a (sub)project under the European Destination Earth programme, I have/had no influence on the choice of HPC/compute resources, as my task is a relatively minor one. Besides, I am possibly the only one using Julia, but I am of course eager to demonstrate that Julia/Flux is a viable alternative to TensorFlow and PyTorch, in particular since I don’t know Python. I guess I will mostly do the development and testing on my own computer (as is often the case), but in the end I need to have the code running on AMD hardware.

I will do the same.
I am also wondering whether there is any tool that can check Julia code and highlight features unsupported on ROCm, something like HIPify.
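Short of such a tool, the best I can think of is a home-grown check that pushes a small batch through the model layer by layer on the GPU and reports which layers throw (a sketch under my own assumptions, not an existing utility):

```julia
# Home-grown check: run each layer of a Chain on a small ROC input and report
# which ones fail (e.g. because MIOpen-backed kernels are not hooked up yet).
using Flux, Functors, AMDGPU

function check_layers(model::Chain, x)
    for (i, layer) in enumerate(model.layers)
        try
            x = layer(x)
            println("layer $i ($layer): ok")
        catch err
            println("layer $i ($layer): FAILED with $(typeof(err))")
            break   # later layers have no valid input once one fails
        end
    end
end

cpu_model = Chain(Dense(4 => 8, relu), Dense(8 => 2), softmax)
roc_model = fmap(a -> a isa Array ? ROCArray(a) : a, cpu_model)
check_layers(roc_model, ROCArray(rand(Float32, 4, 16)))
```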