AMDGPU.jl and AMD Instinct MI300A APUs

Hi everyone,

At my research institute, we’re considering purchasing a server equipped with four AMD Instinct MI300A APUs. Our numerical code is entirely written in Julia and currently runs smoothly on NVIDIA GPUs using CUDA.jl. However, we’re concerned about the compatibility of AMDGPU.jl with the MI300A.

As discussed in this GitHub issue, support for the gfx942 architecture was added in LLVM v17. The upgrade to LLVM v17 appears to have been merged into the master branch of Julia a few weeks ago (Bump LLVM to v17), so I suppose it will be available in one of the upcoming releases.

Does anyone see any additional issues with using the MI300A with AMDGPU.jl, especially considering the unified memory structure of these new APUs?

Thanks in advance for your help!

Regarding unified memory, AMDGPU.jl already uses it on normal (non-MI300A/MI300X) GPUs, so it should work the same, just faster.
IIRC, AMD GPU architectures are backward compatible, so you might miss out on a couple of features until LLVM 17 lands, but it should all work. If you have any further questions, I believe the #gpu channel on Slack will get you better/faster answers.
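Once the hardware arrives, a quick way to verify the toolchain is a minimal AMDGPU.jl smoke test. This is a sketch, not an MI300A-specific recipe; it assumes ROCm is installed and a supported device is visible, and cannot run without that hardware:

```julia
# Minimal AMDGPU.jl smoke test (sketch; requires ROCm and a supported AMD GPU).
using AMDGPU

AMDGPU.versioninfo()                 # report the detected ROCm/device setup

a = ROCArray(rand(Float32, 1024))    # host -> device copy
b = a .+ 1f0                         # broadcast executes on the GPU
@assert Array(b) ≈ Array(a) .+ 1f0   # copy back and check the result
```

If `versioninfo()` reports the device and the assertion passes, the basic array path works; kernel-level features tied to gfx942 would still depend on the LLVM 17 upgrade mentioned above.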
