Error using CuArrays in Colab

I get the following error while trying to use CuArrays on Google Colab:

```julia
using Pkg, BenchmarkTools
pkg"add CuArrays; precompile;"
using CuArrays

mcpu = rand(1024, 1024)  # host matrix (defined earlier in my notebook)
mgpu = cu(mcpu)
@benchmark mgpu * mgpu
```

The error message is:

Resolving package versions...
 Installed Requires ─────────── v0.5.2
 Installed MacroTools ───────── v0.5.2
 Installed AbstractFFTs ─────── v0.4.1
 Installed GPUArrays ────────── v2.0.0
 Installed Adapt ────────────── v1.0.0
 Installed CUDAapi ──────────── v2.0.0
 Installed DataStructures ───── v0.17.5
 Installed CuArrays ─────────── v1.4.5
 Installed CEnum ────────────── v0.2.0
 Installed TimerOutputs ─────── v0.5.2
 Installed NNlib ────────────── v0.6.0
 Installed OrderedCollections ─ v1.1.0
 Installed CUDAnative ───────── v2.5.5
 Installed CUDAdrv ──────────── v4.0.4
 Installed LLVM ─────────────── v1.3.2
  Updating `~/.julia/environments/v1.0/Project.toml`
  [3a865a2d] + CuArrays v1.4.5
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [621f4979] + AbstractFFTs v0.4.1
  [79e6a3ab] + Adapt v1.0.0
  [fa961155] + CEnum v0.2.0
  [3895d2a7] + CUDAapi v2.0.0
  [c5f51814] + CUDAdrv v4.0.4
  [be33ccc6] + CUDAnative v2.5.5
  [3a865a2d] + CuArrays v1.4.5
  [864edb3b] + DataStructures v0.17.5
  [0c68f7d7] + GPUArrays v2.0.0
  [929cbde3] + LLVM v1.3.2
  [1914dd2f] + MacroTools v0.5.2
  [872c559c] + NNlib v0.6.0
  [bac558e1] + OrderedCollections v1.1.0
  [ae029012] + Requires v0.5.2
  [a759f4b9] + TimerOutputs v0.5.2
Precompiling project...
Precompiling CuArrays
┌ Info: Precompiling CuArrays [3a865a2d-5b23-5a0f-bc46-62713ec82fae]
└ @ Base loading.jl:1192
┌ Info: CuArrays.jl failed to initialize and will be unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
└ @ CuArrays /root/.julia/packages/CuArrays/Dzwot/src/CuArrays.jl:144
SYSTEM: show(lasterr) caused an error

Stacktrace:
 [1] macro expansion at /root/.julia/packages/CUDAdrv/3EzC1/src/error.jl:123 [inlined]
 [2] cuDeviceGet(::Base.RefValue{Int32}, ::Int64) at /root/.julia/packages/CUDAdrv/3EzC1/src/libcuda.jl:30
 [3] Type at /root/.julia/packages/CUDAdrv/3EzC1/src/devices.jl:25 [inlined]
 [4] initialize at /root/.julia/packages/CUDAnative/2WQzk/src/init.jl:40 [inlined]
 [5] maybe_initialize(::Symbol) at /root/.julia/packages/CUDAnative/2WQzk/src/init.jl:33
 [6] macro expansion at /root/.julia/packages/CUDAdrv/3EzC1/src/error.jl:119 [inlined]
 [7] cuMemAlloc_v2(::Base.RefValue{CUDAdrv.CuPtr{Nothing}}, ::Int64) at /root/.julia/packages/CUDAdrv/3EzC1/src/libcuda.jl:312
 [8] alloc at /root/.julia/packages/CUDAdrv/3EzC1/src/memory.jl:64 [inlined]
 [9] macro expansion at /root/.julia/packages/TimerOutputs/Tf7lx/src/TimerOutput.jl:214 [inlined]
 [10] macro expansion at /root/.julia/packages/CuArrays/Dzwot/src/memory.jl:55 [inlined]
 [11] macro expansion at ./util.jl:213 [inlined]
 [12] actual_alloc(::Int64) at /root/.julia/packages/CuArrays/Dzwot/src/memory.jl:54
 [13] actual_alloc at /root/.julia/packages/CuArrays/Dzwot/src/memory/binned.jl:53 [inlined]
 [14] macro expansion at /root/.julia/packages/CuArrays/Dzwot/src/memory/binned.jl:194 [inlined]
 [15] macro expansion at /root/.julia/packages/TimerOutputs/Tf7lx/src/TimerOutput.jl:214 [inlined]
 [16] pool_alloc(::Int64, ::Int64) at /root/.julia/packages/CuArrays/Dzwot/src/memory/binned.jl:193
 [17] (::getfield(CuArrays.BinnedPool, Symbol("##9#10")){Int64,Int64,Set{CuArrays.BinnedPool.Block},Array{CuArrays.BinnedPool.Block,1}})() at /root/.julia/packages/CuArrays/Dzwot/src/memory/binned.jl:291
 [18] lock(::getfield(CuArrays.BinnedPool, Symbol("##9#10")){Int64,Int64,Set{CuArrays.BinnedPool.Block},Array{CuArrays.BinnedPool.Block,1}}, ::ReentrantLock) at ./lock.jl:101
 [19] alloc(::Int64) at /root/.julia/packages/CuArrays/Dzwot/src/memory/binned.jl:290
 [20] macro expansion at /root/.julia/packages/TimerOutputs/Tf7lx/src/TimerOutput.jl:214 [inlined]
 [21] macro expansion at /root/.julia/packages/CuArrays/Dzwot/src/memory.jl:121 [inlined]
 [22] macro expansion at ./util.jl:213 [inlined]
 [23] alloc at /root/.julia/packages/CuArrays/Dzwot/src/memory.jl:120 [inlined]
 [24] CuArray{Float32,2,P} where P(::UndefInitializer, ::Tuple{Int64,Int64}) at /root/.julia/packages/CuArrays/Dzwot/src/array.jl:90
 [25] convert at /root/.julia/packages/CuArrays/Dzwot/src/array.jl:98 [inlined]
 [26] adapt_storage at /root/.julia/packages/CuArrays/Dzwot/src/array.jl:240 [inlined]
 [27] adapt_structure at /root/.julia/packages/Adapt/aeQPS/src/Adapt.jl:9 [inlined]
 [28] adapt at /root/.julia/packages/Adapt/aeQPS/src/Adapt.jl:6 [inlined]
 [29] cu(::Array{Float64,2}) at /root/.julia/packages/CuArrays/Dzwot/src/array.jl:288
 [30] top-level scope at In[2]:3

How do I fix this?

@maleadt Can you please comment on how to fix this?

Well, the error gives you a suggestion for getting more information (set the environment variable before importing), so please do that first. There has also been a new release that includes a fix for initialization on Colab, so be sure to update your packages and try again.
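Concretely, the steps suggested above might look like this (a minimal sketch; the environment variable names come from the error message itself, and the matrix `mcpu` is just an illustrative placeholder):

```julia
# The variable must be set BEFORE `using CuArrays`, since initialization
# happens at import time.
ENV["JULIA_CUDA_VERBOSE"] = "true"

using Pkg
Pkg.update()           # pick up the release with the Colab initialization fix

using CuArrays         # any initialization failure is now reported in detail

# Re-run the original computation once initialization succeeds:
mcpu = rand(Float32, 1024, 1024)
mgpu = cu(mcpu)
mgpu * mgpu
```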

I tried with the latest release and it seems to work fine. Well, I came here asking because I am a beginner; sorry if some of my posts are redundant. I find it difficult to seek help with anything related to GPU programming in Julia: there is very little documentation and very little help.

What do you mean by that? We do our best to support all users and their questions, but you posted this outside of the GPU domain so I didn’t get a notification. Documentation is lacking, sure, but we’re working on that: https://juliagpu.gitlab.io/CUDA.jl/
For quick questions, feel free to hop on the #gpu Slack channel and ask there.

  1. To begin with, I followed the CuArrays tutorial. There, it is pretty straightforward: one just adds and uses the packages. No additional environment-variable settings are specified; those only came up when I tried to run the code on Colab.
  2. For the longest time, I was clueless about running Julia on Colab, because the procedure mentioned here https://discourse.julialang.org/t/julia-on-google-colab-free-gpu-accelerated-shareable-notebooks/15319?u=ennvvy was broken. I had to wait a couple of weeks before I was able to work around that issue.

I am not trying to find fault with the community. I am only reporting the problems I have faced, hoping the feedback eventually leads to a better user experience.