If you look at the model zoo MNIST MLP example for Flux.jl, you may notice that the line `using CuArrays` is commented out. Presumably this is because the rest of the code would fail if that line were enabled. CuArrays.jl is a GPU array library that allows computations to be done on the GPU. I am so excited that I got it to work that I've decided to write this down right now, even though it's 2am where I live (and I still have a full day of consulting work tomorrow starting at 9am T_T).
Here is my setup:
- Windows 10 Pro
- NVIDIA RTX 2080 graphics card
- Julia v1.0.3
- Flux.jl v0.6.10
- Visual Studio Community 2015
** Important: do NOT install the 2017 version
** I am not sure if this is required, but if you also want to use Knet.jl, you might as well follow these instructions because Knet.jl only works with 2015
You need to install:
- CUDA toolkit
** Important: I have tested it to work with v9; it did not seem to work for me when I initially installed v10
** Update: I have since tested upgrading to v10 and it now works as well
- cuDNN library
Now rebuild CuArrays and Flux.
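The rebuild can be done from the Julia REPL using the package manager; a minimal sketch (assuming CuArrays.jl and Flux.jl are already added to your environment):

```julia
using Pkg

# Rebuild both packages so they pick up the newly
# installed CUDA toolkit and cuDNN libraries.
Pkg.build("CuArrays")
Pkg.build("Flux")
```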
Now rerun the model zoo example after uncommenting `using CuArrays`, and it should work!
`|> gpu` is a no-op unless CuArrays.jl is loaded, in which case it moves everything to the GPU. I think a comment should be added to these model zoo examples saying
# Uncomment the following line to use GPUs. Requires CuArrays.jl is installed.
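To illustrate the pattern, here is a minimal sketch in the style of the model zoo (the model definition below is illustrative, not the actual zoo code):

```julia
using Flux

# Uncomment the following line to use GPUs. Requires CuArrays.jl is installed.
# using CuArrays

# A small MLP. `gpu` is the identity unless CuArrays.jl is loaded,
# in which case it moves the parameters to GPU memory.
model = Chain(Dense(784, 32, relu), Dense(32, 10), softmax) |> gpu

# Inputs must be moved the same way so they live on the same device.
x = rand(Float32, 784) |> gpu
y = model(x)
```

Because `gpu` is a no-op on the CPU, the same script runs unchanged whether or not CuArrays.jl is loaded.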
I think the issue was that it just didn't work: even when CuArrays.jl could be installed, it didn't work for me. Maybe it's a toolkit version thing. I remember doing an interview where the interviewer asked me whether Flux's GPU support was working for me.
Yes, you need to install the drivers, the CUDA toolkit, and cuDNN separately; a licensing issue stops this from being automatic. I think the CUDAnative.jl docs mention the first two, while the CuArrays.jl docs mention the last one.
Maybe the Flux.jl docs need to mention this in their installation section. https://fluxml.ai/Flux.jl/stable/#Installation-1 I noticed that page is pretty bare.
Hope this PR helps with that
I spent yesterday not getting Flux to work on GPUs on AWS (some details here). This kind of thing is very frustrating.
In my humble opinion, it would be very nice if someone™ maintained a set of tests that checked compatibility with at least some more or less standard configurations. The FreeBSD project does this for the builds (including unit tests) of their packages. It's one of those things that would be nice for Julia to have, if the funding were available.
(Yes, I know the combinatorial explosion of configurations makes any kind of "total" test coverage infeasible, but a total absence of coverage is also bad.)
The problem with this is that it’s expensive to run all of these compatibility checks as part of CI, especially when testing on proprietary operating systems. The best solution might be to donate money or hardware + software licenses to the appropriate project so that they don’t need to pay out of pocket.