Threads.@threads not working, even without i/o

#1

This is a follow-up to my earlier reports of multi-threading issues in Julia. I am using v1.1.0 and running on 4 threads. The following code

using SparseArrays
N = 1000
K = spzeros(N, N)
Threads.@threads for i in 1:N
    K[i, i] = 1.0
end

gives the strange error

Error thrown in threaded loop on thread 2: BoundsError(a=Array{Float64, (17,)}[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], i=23)
Error thrown in threaded loop on thread 0: BoundsError(a=Array{Float64, (17,)}[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], i=22)
Error thrown in threaded loop on thread 1: BoundsError(a=Array{Float64, (17,)}[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], i=22)
Error thrown in threaded loop on thread 3: BoundsError(a=Array{Float64, (17,)}[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], i=23)

This is a very simple use of multi-threading, without any I/O. Could you tell me what is going wrong? Thanks.

0 Likes

#2

I’d be surprised if sparse arrays were thread-safe.

0 Likes

#3

Multithreading is indeed rather tricky, and you need to think through how your memory access patterns will work across all threads — particularly when you’re mutating the same object across multiple threads.

In the case of sparse arrays, indexed assignment from multiple threads mutates the backing arrays (colptr, rowval, nzval) concurrently and quickly leaves them in an inconsistent state, which is exactly the kind of BoundsError you are seeing.
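If you really must fill the matrix incrementally from multiple threads, one sketch of a workaround is to guard every assignment with a lock so that only one thread mutates the matrix at a time (note this serializes the writes and gives up most of the parallel speedup for a loop like this):

```julia
using SparseArrays

N = 1000
K = spzeros(N, N)
lk = ReentrantLock()  # one shared lock protecting all writes to K

Threads.@threads for i in 1:N
    lock(lk) do
        K[i, i] = 1.0  # only one thread inside this critical section at a time
    end
end
```

This is correct but slow; it mainly demonstrates that the original error is a data race, not a compiler bug.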

I encourage you to first maximize your single-threaded performance. Profile, follow the performance tips, and try to eliminate allocations in your hot loops. In the case of a sparse array, that often means creating arrays of indices and values and using the sparse constructor instead of incrementally filling it. That will likely give you far bigger gains than 4x threading ever could. Once you have things lean and mean, then is the time to parallelize — but you need to be very cautious about how you structure your algorithm to ensure there aren’t race conditions like this.
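For the diagonal fill above, the constructor approach might look like the following sketch: collect index and value vectors first, then build the matrix in a single `sparse` call (in a threaded setting, each thread can fill its own chunk of these vectors and you concatenate them afterwards):

```julia
using SparseArrays

N = 1000
Is = collect(1:N)      # row indices of the nonzeros
Js = collect(1:N)      # column indices of the nonzeros
Vs = fill(1.0, N)      # the nonzero values
K = sparse(Is, Js, Vs, N, N)  # build the whole matrix in one call
```

Besides being thread-friendly, this avoids the repeated shifting of the backing arrays that incremental `setindex!` on a sparse matrix incurs.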

7 Likes

#4

Thanks, I will try that.

0 Likes