Out of GPU memory with user defined function


Something seems fishy here:

using BenchmarkTools

function test()
    X = rand(300, Int(40e3))   # 300 × 40_000
    Y = rand(Int(40e3), 300)   # 40_000 × 300
    Y * X                      # 40_000 × 40_000 result
end

@btime test()

reports

6.625 s (6 allocations: 12.10 GiB)
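For what it's worth, the result matrix alone seems to account for nearly all of that: Y * X is a 40_000 × 40_000 matrix of Float64, which is about 11.9 GiB before counting any intermediates. A quick sanity check:

```julia
# size of a 40_000 × 40_000 Float64 matrix, in GiB
40_000 * 40_000 * sizeof(Float64) / 2^30   # ≈ 11.92
```

So the 12.10 GiB reported by @btime may simply be the output plus small temporaries, not a leak.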