I am trying to set up Julia to use my GPU, out of curiosity, as I am a new user.
After a few problems, I encountered this one, for which I have not found anything (apart from this, which didn't help me either):
using BenchmarkTools
using FFTW
using CuArrays
using Random
const nx = 1024 # do 1024 x 1024 2D FFT
xc = CuArray{ComplexF64}(CuArrays.randn(Float64, nx, nx))
AssertionError: ctx === CuCurrentContext()
Stacktrace:
[1] context() at /home/benoit/.julia/packages/CUDAnative/Phjco/src/init.jl:61
[2] generator() at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/CURAND.jl:33
[3] #randn#113(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Type{Float64}, ::Int32, ::Int32) at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/random.jl:176
[4] randn(::Type{Float64}, ::Int32, ::Int32) at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/random.jl:176
[5] top-level scope at In[8]:7
Interesting; is it always reproducible? Was this in a fresh REPL session? Does it happen if you just call CUDAnative.context()? Could you try the following (and paste its output):
julia> using CUDAdrv
julia> CuCurrentContext()
julia> using CUDAnative
julia> CUDAnative.initialize()
julia> CuCurrentContext()
CuContext(Ptr{Nothing} @0x0000556265bd5720)
julia> CUDAnative.thread_contexts
1-element Array{Union{Nothing, CuContext},1}:
CuContext(Ptr{Nothing} @0x0000556265bd5720)
julia> CUDAnative.context()
CuContext(Ptr{Nothing} @0x0000556265bd5720)
I had this problem when running the code in a fresh Jupyter Notebook, and it was reproducible. Now, running it in the REPL, I get:
julia> using BenchmarkTools
julia> using FFTW
julia> using CuArrays
[ Info: CUDAnative.jl failed to initialize, GPU functionality unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
julia> using Random
julia> const nx = 1024 # do 1024 x 1024 2D FFT
1024
julia> xc = CuArray{ComplexF64}(CuArrays.randn(Float64, nx, nx))
ERROR: AssertionError: ctx === CuCurrentContext()
Stacktrace:
[1] context() at /home/benoit/.julia/packages/CUDAnative/Phjco/src/init.jl:61
[2] generator() at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/CURAND.jl:33
[3] #randn#113(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Type{Float64}, ::Int32, ::Int32) at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/random.jl:176
[4] randn(::Type{Float64}, ::Int32, ::Int32) at /home/benoit/.julia/packages/CuArrays/rNxse/src/rand/random.jl:176
[5] top-level scope at none:0
As for your commands,
julia> using CUDAdrv
julia> CuCurrentContext()
julia>
julia> using CUDAnative
julia> CUDAnative.initialize()
ERROR: CUDA error: invalid device context (code 201, ERROR_INVALID_CONTEXT)
Stacktrace:
[1] throw_api_error(::CUDAdrv.cudaError_enum) at /home/benoit/.julia/packages/CUDAdrv/aBgcd/src/error.jl:136
[2] macro expansion at /home/benoit/.julia/packages/CUDAdrv/aBgcd/src/error.jl:149 [inlined]
[3] cuCtxGetDevice at /home/benoit/.julia/packages/CUDAdrv/aBgcd/src/libcuda.jl:144 [inlined]
[4] device at /home/benoit/.julia/packages/CUDAdrv/aBgcd/src/context.jl:145 [inlined]
[5] device!(::CuDevice) at /home/benoit/.julia/packages/CUDAnative/Phjco/src/init.jl:114
[6] initialize() at /home/benoit/.julia/packages/CUDAnative/Phjco/src/init.jl:32
[7] top-level scope at none:0
julia> CuCurrentContext()
julia> CUDAnative.thread_contexts
0-element Array{Union{Nothing, CuContext},1}
julia> CUDAnative.context()
ERROR: AssertionError: ctx === CuCurrentContext()
Stacktrace:
[1] context() at /home/benoit/.julia/packages/CUDAnative/Phjco/src/init.jl:61
[2] top-level scope at none:0
Well, that message says it all: the package failed to initialize, so it isn't functional. Follow the advice in the message and run with JULIA_CUDA_VERBOSE=true to see additional details.
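As a minimal sketch (assuming the variable is checked when CuArrays loads), you can either export it in your shell before starting Julia, or set it from within Julia in a fresh session before loading the package:

julia> ENV["JULIA_CUDA_VERBOSE"] = "true"   # must be set before `using CuArrays`

julia> using CuArrays                       # the initialization failure should now print the full error and backtrace

Paste the expanded error message here; it should show why CUDAnative failed to initialize on your system.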