Mac's AMD GPU

The new Mac Pro seems to be very powerful and can be equipped with multiple powerful AMD GPUs. But OpenCL.jl seems very inactive.

Do we have any chance to unleash the power of the Mac's GPUs using Julia? (Macs no longer support NVIDIA.)

4 Likes

I hope so! Once the "Add OpenCL runtime support" PR by jpsamaroo (Pull Request #24 on JuliaGPU/AMDGPUnative.jl at GitHub) is working and ready to go, it should only be a small amount of work to fully support AMD GPUs on both Mac and Windows.

12 Likes

So glad to hear that! Thanks!

1 Like

I have the same concern. I hope it’s OK if I piggyback off your question, @tomtom.

I’m considering buying a new Mac to replace my old one, but given the increasing importance of GPUs both in machine learning and general number crunching, I am quite worried that tools to support non-Nvidia cards are lagging behind – I don’t mean specifically in Julia, but I get the impression that the momentum is mostly behind CUDA now.

GPGPU is not currently an important part of my workflow, but I’d like to dip my toes, and will probably do more over the lifetime of a new laptop.

What do people think (I'm asking for unsubstantiated opinions and wild guesses)? Are Macs a viable platform for GPGPU and machine learning over the next few years? Do you think that OpenCL alternatives to Julia packages like CUDAnative and CuArrays can reach rough parity, or are Macs now a dead end?

Would you buy a Mac if GPUs make up any significant part of your workflow?

2 Likes

It seems like AMDGPUnative.jl will depend on OpenCL.jl?
Isn't that a concern, given that OpenCL.jl looks very outdated?

OpenCL.jl is a very mature and rather stable package. It doesn't see much recent development, partly because it already does the job it's supposed to do.

However, OpenCL.jl doesn't provide the same conveniences that one gets with the CUDA ecosystem (specifically, CuArrays), which makes it less useful for composing GPU support with other packages through multiple dispatch.
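To make the contrast concrete, here is roughly what a vector add looks like with OpenCL.jl's low-level API (a sketch adapted from the package's README example; exact buffer flags may differ between versions):

    using OpenCL

    const sum_kernel = """
        __kernel void sum(__global const float *a,
                          __global const float *b,
                          __global float *c)
        {
            int gid = get_global_id(0);
            c[gid] = a[gid] + b[gid];
        }
        """

    a = rand(Float32, 50_000)
    b = rand(Float32, 50_000)

    # Pick a device and create a context plus command queue for it.
    device, ctx, queue = cl.create_compute_context()

    # Copy the inputs into device buffers and allocate an output buffer.
    a_buff = cl.Buffer(Float32, ctx, (:r, :copy), hostbuf=a)
    b_buff = cl.Buffer(Float32, ctx, (:r, :copy), hostbuf=b)
    c_buff = cl.Buffer(Float32, ctx, :w, length(a))

    # Compile the kernel source, then launch one work-item per element.
    p = cl.Program(ctx, source=sum_kernel) |> cl.build!
    k = cl.Kernel(p, "sum")
    queue(k, size(a), nothing, a_buff, b_buff, c_buff)

    result = cl.read(queue, c_buff)

Every step (device selection, buffer management, compilation, launch) is explicit, and the kernel itself is a string of OpenCL C, which is exactly what the CuArrays-style packages hide from you.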

Adding OpenCL support to AMDGPUnative would work around that by leveraging what already exists for AMD GPUs, specifically ROCArrays (the sister package to CuArrays).
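By contrast, the array-level interface looks like ordinary Julia code. The sketch below uses CuArrays as it exists today; the assumption is that ROCArrays will mirror the same interface for AMD GPUs:

    using CuArrays   # with ROCArrays, the assumption is you'd swap CuArray for ROCArray

    a = CuArray(rand(Float32, 50_000))
    b = CuArray(rand(Float32, 50_000))

    # Broadcasting compiles to a single fused GPU kernel; no kernel strings needed.
    c = a .+ 2f0 .* b

    # Reductions and many Base functions dispatch to GPU implementations, too.
    total = sum(c)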

3 Likes

I don’t think anyone can accurately predict what Apple will do w.r.t. GPGPU support in their software ecosystem, but I assume they’re smart enough to keep OpenCL working for the foreseeable future. Dropping that support would immediately kill off the portion of their compute ecosystem that isn’t willing to move from OpenCL to whatever proprietary garbage they replace it with, while the rest of the ecosystem would need to invest significant effort to change to such an API.

If you’re willing to work on Linux, it is really the ideal platform for GPGPU right now and gets full support from AMD (especially if you go with Ubuntu).

4 Likes

You mean NVIDIA instead of AMD?

No, I mean AMD :smile: AMD’s ROCm stack is their premier GPGPU software stack, and it only works on Linux at the moment. AMD also has their drivers integrated into the mainline Linux kernel.

1 Like

I can’t imagine! That means the Mac Pro cannot use ROCm even with a Radeon Vega II?! Either Apple or AMD is being ridiculous, then.

Neither is really at fault; modern GPU drivers are complicated beasts that touch multiple subsystems within the kernel and require cooperation between the vendor and the kernel owner (AMD and Apple, respectively). Still, it’s just a matter of time for kernel engineers to integrate everything. In the meantime, you’ll be able to use OpenCL behind the scenes through AMDGPUnative once that PR is working. Once AMD and Apple get the ROCm stack working on macOS, we’ll make use of it in AMDGPUnative.

3 Likes

FWIW, I am not sure I understand the motivation for buying an Apple machine for computational purposes. For a sleek and well-designed laptop, or possibly even a nice desktop, I understand: they produce very appealing hardware with a polished OS experience.

But for a server that’s just going to generate a lot of heat and noise and end up in the server room anyway (ideally, or under the desk if that’s not possible), why pay the Apple premium? You can get more value with commodity hardware.

2 Likes

Haha, I’m now using an iMac for coding and as a personal desktop (i.e., not a server hidden away in a server room), and I hope to buy a Mac Pro to do some serious deep learning.
Macs are good-looking and have the best GUI. Besides, they have an ecosystem that works well with iOS devices.
Yes, Apple products generally carry a premium, but the Mac Pro (2019) is an exception: it is fairly priced, given that you upgrade the RAM yourself and do not buy the exclusive MPX modules. Also, various tests on YouTube show that the Mac Pro has excellent cooling and produces very little noise.

I wonder why the package is called AMDGPUnative.jl; does that mean it won’t work for other GPUs with a proper OpenCL driver?

Apple is not supporting OpenCL anymore. It’s not even installed in new versions of macOS.

1 Like

In the scientific workgroup I’m in, people mostly use iMacs and some old Mac Pros to do exploratory data science and ODE modeling. Quite a few still use MATLAB, and some use Python.

We have been discussing leveraging the GPUs in the Macs to speed up the exploratory part of data processing, and some smaller parallel jobs of ODE solving, using Julia DiffEq’s GPU computation locally (sketched below). This seems out of the question for now. I get the feeling that Apple is ditching its open-source scientific users in favor of some kind of strategic position where it controls what can be done with the GPUs in its computers, first by letting OpenCL die and then by not adopting Vulkan or ROCm.
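For reference, the kind of local GPU-ensemble workflow we have in mind looks roughly like this with DiffEqGPU.jl (a sketch based on its documented ensemble interface; note that it currently targets CUDA-capable GPUs, which is exactly the problem on our Macs):

    using OrdinaryDiffEq, DiffEqGPU

    function lorenz!(du, u, p, t)
        du[1] = p[1] * (u[2] - u[1])
        du[2] = u[1] * (p[2] - u[3]) - u[2]
        du[3] = u[1] * u[2] - p[3] * u[3]
    end

    u0 = Float32[1.0, 0.0, 0.0]
    p  = Float32[10.0, 28.0, 8/3]
    prob = ODEProblem(lorenz!, u0, (0f0, 100f0), p)

    # Solve many parameter variations of the same ODE in parallel on the GPU.
    ensemble = EnsembleProblem(prob,
        prob_func = (prob, i, repeat) -> remake(prob, p = rand(Float32, 3) .* p))
    sol = solve(ensemble, Tsit5(), EnsembleGPUArray(), trajectories = 10_000)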

For heavy computation we use an institutional compute cluster. The cluster runs Linux and is getting increasing GPU support. The GPU cluster admins require that people test their GPU workloads locally before starting batch jobs.

So for us, working GPU acceleration on Macs would be quite useful. We are not going to switch everyone to Linux because of (1) reliable hardware, (2) institutional support, and (3) required (M$, etc.) software running only on macOS. We rely on macOS as a bridge to access both the commercial and open-source software worlds.

I thought that Vulkan support would be rather mandatory for all manufacturers, unless they want to let go of an increasing share of gaming production…

I am not really convinced that Apple hardware is more reliable than reasonable alternatives (e.g., everything except the very low-tier manufacturers).

In any case, I am not arguing that anyone should switch from OS X (and Macs) to Linux, just pointing out that there is a very hefty premium. The entry level is $6000, for which you get an 8-core Intel Xeon W, 32 GB of ECC RAM, a 256 GB SSD (seriously), and a Radeon Pro 580X.

Of these, ECC RAM used to be the most difficult to replicate in commodity hardware… until entry-level AM4 motherboards started supporting it. Now you can get a better AMD system for 40–60% of the price.

Whether having OS X and a sleek design is worth this much markup is of course a subjective preference that everyone should just decide for themselves. For some applications, it may be justified (Apple markets to “serious creators” and the movie industry, for example), but specifically for Julia, I am not sure it is a sensible choice.

It seems like Apple still supports OpenGL and OpenCL:
Apple’s OpenCL support

Nope.

Most machines can still use OpenCL, but not in the latest macOS version.

Here you will get this message:


    Important: OpenCL was deprecated in macOS 10.14. To create high-performance
    code on GPUs, use the Metal framework instead. See Metal.
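On machines where it still works, you can check what OpenCL reports from Julia (a small sketch using OpenCL.jl's platform/device queries):

    using OpenCL

    # List whatever OpenCL platforms and devices the OS still exposes.
    for platform in cl.platforms()
        println("Platform: ", platform[:name])
        for device in cl.devices(platform)
            println("  Device: ", device[:name])
        end
    end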

I also still use macOS, but my next workstation will not be a Mac because of this. What I am curious about is whether GPUArrays works in an Ubuntu installation on a Mac (dual boot). To make it even worse, macOS will not support CUDA, so forget about buying a new Mac Pro with NVIDIA GPUs.

2 Likes