Bug in CUDA, CuArray, or something I just don't know?

Hello!

When I run this code to convert a vector of ints into a CuArray, I see something extremely weird…

julia> ListOfInts = collect(1:1*10^6)
1000000-element Vector{Int64}:
       1
       2
       3
       4
       5
       ⋮
  999997
  999998
  999999
 1000000

julia> CuArray(ListOfInts)
1000000-element CuArray{Int64, 1, CUDA.Mem.DeviceBuffer}:
       1
       2
       3
       4
       5
       ⋮
  999997
  999998
  999999
 1000000

julia> CuArray(ListOfInts)
1000000-element CuArray{Int64, 1, CUDA.Mem.DeviceBuffer}:
                    1
                    2
                    3
                    4
                    5
                    ⋮
  4533517044312782213
  4529285733552316672
 -9223372036854775808
 -4708928475910122632

It works the first time, but prints garbage values the second time — no error is thrown, the displayed numbers are just wrong. Makes me a bit nervous about using it; could anyone tell me where I am going wrong?

I am new to GPU programming.

Kind regards

I suspect it might just be a display problem, since I do get the correct sum:

julia> j = CuArray(ListOfInts)
1000000-element CuArray{Int64, 1, CUDA.Mem.DeviceBuffer}:
  1
  2
  3
  4
  5
  6
  7
  8
  9
 10
 11
 12
 13
 14
 15
 16
 17
 18
 19
  ⋮
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0
  0

julia> sum(j)
500000500000
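A minimal sketch of how to confirm the device data itself is intact (assuming a working CUDA.jl setup; this just copies the array back to the host and compares element-for-element):

```julia
using CUDA

hostvec = collect(1:10^6)   # host Vector{Int64}
j = CuArray(hostvec)        # upload to the GPU

# Copy the device buffer back to host memory and compare.
# If this holds, any strange numbers in the REPL are purely a display artifact.
@assert Array(j) == hostvec

# The sum also matches the closed form n*(n+1)/2:
@assert sum(j) == 10^6 * (10^6 + 1) ÷ 2
```

If both assertions pass, the stored values are correct and only the printing path is misbehaving.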

I take it you’re running this in the VS Code terminal: CuArrays don't seem to display correctly in VS code · Issue #875 · JuliaGPU/CUDA.jl · GitHub
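Until that issue is resolved, one possible workaround (my own suggestion, not taken from the issue thread) is to copy the array to the host before displaying it, so the REPL never reads from device memory while printing:

```julia
using CUDA

j = CuArray(collect(1:10^6))

# Array(j) transfers the data back to an ordinary Vector{Int64},
# which the VS Code terminal displays without problems.
Array(j)
```

This costs a device-to-host transfer, so it is only worth doing for inspection, not inside performance-critical code.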

Indeed!

Thank you for making me aware of that issue :slight_smile:

I will mark your answer as the solution. Hope it gets fixed one day — VS Code is a bit too good to let go of, even for GPU programming… :wink:

Kind regards