Julia v1.8.5
LoopVectorization v0.12.159
Working on trying to maximize performance of some functions and I’m getting errors. Here is a minimum code example of one:
using LoopVectorization

function test(a,n)
    rows = size(a,1)
    cols = size(a,2)
    println(rows,":",cols)
    # a[n,:] = mean.(eachcol(a[1:n,:]))
    @turbo for j in 1:cols
        _s = 0.0
        for i in 1:n
            _s += a[i,j]
        end
        a[n,j] = _s / n
    end
end
N = 100_000
x = rand(N,2)
test(x,10)
y = rand(N)
test(y,10)
The second call gives:

ERROR: MethodError: reducing over an empty collection is not allowed; consider supplying init to the reducer
I realize the first passes a Matrix and the second passes a Vector.
The function works as expected without @turbo.
Why is this error being generated and how can I modify the code for this to work with both Matrix and Vector types?
At first glance this code seems explicitly designed for a two-dimensional array, for instance because it refers to rows and cols. What do you expect to achieve when passing it a vector?
Redefine the existing method as test(a::AbstractMatrix, n), and add a method for test(a::AbstractVector, n), which would have only a single loop (see the sketch below).
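A minimal sketch of that split, keeping the original body for the Matrix method and using a single @turbo reduction for the Vector method (untested, just to show the shape of the dispatch):

using LoopVectorization

# Matrix method: same column-wise loop as the original
function test(a::AbstractMatrix, n)
    cols = size(a, 2)
    @turbo for j in 1:cols
        _s = 0.0
        for i in 1:n
            _s += a[i, j]
        end
        a[n, j] = _s / n
    end
end

# Vector method: only one dimension, so a single reduction loop
function test(a::AbstractVector, n)
    _s = 0.0
    @turbo for i in 1:n
        _s += a[i]
    end
    a[n] = _s / n
end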
You can define different methods for the different combinations of input parameter types: https://docs.julialang.org/en/v1/manual/methods/
If you don’t want to define a vector-specific method, as others suggest, try moving @turbo to the inner loop, which is the more demanding one.
function test(a,n)
    rows = size(a,1)
    cols = size(a,2)
    println(rows,":",cols)
    # a[n,:] = mean.(eachcol(a[1:n,:]))
    for j in 1:cols
        _s = 0.0
        @turbo for i in 1:n
            _s += a[i,j]
        end
        a[n,j] = _s / n
    end
end
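For reference, with @turbo on the inner loop the same calls from the question should then cover both cases, since indexing a Vector as a[i,j] with j == 1 is valid in base Julia (whether @turbo accepts it for the Vector case is exactly what to check):

N = 100_000
x = rand(N, 2)
y = rand(N)
test(x, 10)   # Matrix: the j loop runs over both columns
test(y, 10)   # Vector: size(y, 2) == 1, so the j loop runs once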
Or reshape the vector as a column matrix (although it doesn’t seem like a nice solution to me):
function test1(a,n)
    rows = size(a,1)
    cols = size(a,2)
    a = reshape(a,:,cols)
    println(rows,":",cols)
    # a[n,:] = mean.(eachcol(a[1:n,:]))
    @turbo for j in 1:cols
        _s = 0.0
        for i in 1:n
            _s += a[i,j]
        end
        a[n,j] = _s / n
    end
end
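One thing worth noting about this version: reshape on an Array shares memory with the original, so writing into the reshaped column matrix also updates the caller's vector. A quick check of that (names here are just for illustration):

y = rand(100_000)
m = sum(y[1:10]) / 10    # expected mean, computed before the call
test1(y, 10)             # prints 100000:1
y[10] ≈ m                # true: the write through the reshaped array reached y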
PS: I don’t know the exact scope and applicability limits of the macro, but the problem highlighted here deserves further investigation by someone who knows the macro well.