GPU computing on AMD consumer hardware?

I’m putting together a new computer and would like to be able to experiment with GPU computing in Julia (on Ubuntu). I would greatly prefer an AMD GPU because of the in-kernel driver support on Linux.

My understanding is that the main AMD GPU package is AMDGPU.jl, which is built on top of ROCm.
The hardware support list for ROCm does not mention “consumer” hardware like the RX 5700 or RX 6800.

However, benchmarks seem to indicate that ROCm does work on RX 5700 and RX 6800 cards, and that they perform at least as well as similarly priced Nvidia cards. I can also find hints that at least the RX 6800 will be supported in ROCm soon.

Does anybody here have experience using RX 5700 or RX 6800 or similar hardware for GPU computing in Julia, and can comment on how well it works?
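
For concreteness, the kind of code I’d like to run looks roughly like this (adapted from the AMDGPU.jl examples; I have no AMD card to test on yet, so treat it as a sketch):

```julia
using AMDGPU

# Upload data to the GPU; broadcasting compiles a native GPU kernel through LLVM.
a = ROCArray(rand(Float32, 1024))
b = ROCArray(rand(Float32, 1024))
c = a .+ b
Array(c)   # copy the result back to the host
```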

4 Likes

Those benchmarks are only for OpenCL on ROCm and thus don’t offer any guarantees about whether other libraries will work on Navi (RX 5000/6000) GPUs. I assume some basic functionality will work, but support is spotty, and messaging about when these cards will be fully supported is even more so.

Thank you for the reply, @ToucheSir.
Not very uplifting, unfortunately :slightly_frowning_face:

If OpenCL supports the card, then AMDGPU.jl should be able to support it too, since both rely on the same underlying libraries (LLVM, ROCR, and ROCT). You might also consider running AMD’s ROCK kernel for the most up-to-date support (although IMO it has some stability issues on my Raven Ridge system compared to a mainline kernel). I would expect regular non-library things to work fine, but rocBLAS, rocFFT, et al. may have spotty support, since those include a lot of hand-rolled intrinsics.
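
To make that distinction concrete, here is a rough sketch of the two paths (untested on Navi, and assuming the ROCArray/@roc API; the launch syntax may differ slightly between AMDGPU.jl versions):

```julia
using AMDGPU

# Hand-written kernel: compiled straight through LLVM, so it only needs the
# base runtime (ROCR/ROCT), not the hand-tuned ROCm math libraries.
function axpy!(y, x, a)
    i = workitemIdx().x              # 1-based work-item index within the workgroup
    @inbounds y[i] = a * x[i] + y[i]
    return
end

x = ROCArray(rand(Float32, 256))
y = ROCArray(rand(Float32, 256))
wait(@roc groupsize=256 axpy!(y, x, 2f0))   # single workgroup of 256 work-items

# Library-backed path: `*` on ROCArrays dispatches to rocBLAS, which needs
# architecture-specific support and may lag behind on new GPUs.
A = ROCArray(rand(Float32, 128, 128))
B = ROCArray(rand(Float32, 128, 128))
C = A * B
```

If the kernel above runs but the matrix multiply errors out, that points at rocBLAS rather than the compiler stack.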

If you have any issues getting it to work, please ping me here on Discourse, or file an issue on AMDGPU.jl’s GitHub page, and we can debug it from there!
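
When reporting a problem, something along these lines is usually a useful first check (the exact helper functions have moved around between AMDGPU.jl versions, so treat this as a sketch):

```julia
using AMDGPU

# Reports which ROCm components (HSA runtime, rocBLAS, rocFFT, ...) were found.
AMDGPU.versioninfo()

# Minimal smoke test of the kernel-compilation path.
x = ROCArray(ones(Float32, 32))
@assert Array(x .+ 1f0) == fill(2f0, 32)
```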

3 Likes

Thank you for the reply, @jpsamaroo. I watched your JuliaCon 2020 talk with great interest.
Can you recommend a current AMD GPU for experimenting with GPU computing in Julia, at not too much more than 500 USD?

My recommendation would be a Vega 56/64, although most AMD card prices are inflated right now (holidays? shortage due to CNY? I have no idea why, but maybe try eBay). The RX 480/580 line is apparently going unsupported in ROCm, which is quite sad, but the Vega line is known to be well supported by all of ROCm, while not being absurdly priced.

That said, I’m personally looking forward to when the RX 6800 comes down in price (and ROCm support gets better), because I’d love to play around with the raytracing instructions!

2 Likes

Thanks for the reply, @jpsamaroo. eBay has some reasonably priced Vega 64 cards; that’s a good suggestion.

Still no ROCm support for RX 5000 or RX 6000 (Navi) cards:

1 Like

There’s a new version of ROCm: AMD Documentation - Portal
However, I cannot find anything about supported hardware where it used to be.
Phoronix indicates that some RDNA2 cards (RX 6000 series) could be supported.

There seems to be spotty support for RX 6000 cards: https://github.com/RadeonOpenCompute/ROCm/issues/1617#issuecomment-1046570695

1 Like

Is there updated advice on recommended consumer hardware for AMDGPU.jl development in 2022?

Whilst I am ultimately developing for the MI250, in the meantime I can’t tie up entire HPC nodes for development. So I’m looking for well-supported consumer AMD hardware that allows easy local development.

None of the currently available consumer GPUs are supported by ROCm. AMD was supposed to add support for RDNA2 cards with ROCm 5, but it is not there yet.

If you want to start developing now, I recommend buying a used Radeon VII. You can find them on eBay, for example.

1 Like