Hi,
I just updated to Julia 0.6 and am using the MPI wrapper. When I try to send a vector using MPI, the send never completes. For example, the following code does not finish:
using MPI
MPI.Init()
x = rand( Complex128, 300 )
sreq = MPI.Isend( x, 1, 0, MPI.COMM_WORLD )
MPI.Wait!(sreq)
MPI.Finalize()
Am I doing something wrong? Did anybody else experience similar issues?
When using Julia 0.5 I never ran into such issues. Also, if I skip the wait on the send request and simply receive the message, only the first part of the vector arrives. With smaller vectors everything works fine.
Thank you,
Matthias
Can you post a complete minimal example? Whether or not this behavior is expected depends on what code is running on MPI rank 1.
Hi Jared,
My file test.jl looks like this:
using MPI
MPI.Init()
rank = MPI.Comm_rank(MPI.COMM_WORLD)
if( rank==0 )
    x = rand(Complex128,200)
    sreq = Vector{MPI.Request}(0)
    for pp=1:MPI.Comm_size(MPI.COMM_WORLD)
        push!( sreq, MPI.Isend( x, pp-1, pp, MPI.COMM_WORLD ) )
    end
    MPI.Waitall!(sreq)
end
println("done")
MPI.Finalize()
I then run the code with
mpirun -np 2 julia test.jl
and it does not finish.
If I change test.jl to just send a shorter (10-element) vector:
using MPI
MPI.Init()
rank = MPI.Comm_rank(MPI.COMM_WORLD)
if( rank==0 )
    x = rand(Complex128,10)
    sreq = Vector{MPI.Request}(0)
    for pp=1:MPI.Comm_size(MPI.COMM_WORLD)
        push!( sreq, MPI.Isend( x, pp-1, pp, MPI.COMM_WORLD ) )
    end
    MPI.Waitall!(sreq)
end
println("done")
MPI.Finalize()
Then, when I run the file as
mpirun -np 2 julia test.jl
it finishes with no problems.
Thank you,
Matthias
This behavior is expected. There is no MPI receive to match the MPI.Isend, so in general MPI.Waitall! is waiting for a send that can never complete. In some special cases (small messages, for example) MPI will complete the send even though no receive has been posted yet, buffer the data internally, and then copy it into the receive buffer when the receive is finally posted.
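For reference, here is a minimal sketch of how test.jl could be restructured so that every Isend has a matching receive, using the same Julia 0.6 / MPI.jl calls as the examples above. The receive buffer y and the rank+1 tag are my own choices, picked to match the tag scheme of the sends:

using MPI
MPI.Init()

comm = MPI.COMM_WORLD
rank = MPI.Comm_rank(comm)

if( rank==0 )
    x = rand(Complex128, 200)
    sreq = Vector{MPI.Request}(0)
    # post one nonblocking send per rank; tag pp matches the receive below
    for pp=1:MPI.Comm_size(comm)
        push!( sreq, MPI.Isend( x, pp-1, pp, comm ) )
    end
end

# every rank (including rank 0, which also sends to itself above)
# posts the matching receive for its message from rank 0
y = Array{Complex128}(200)
rreq = MPI.Irecv!( y, 0, rank+1, comm )
MPI.Wait!(rreq)

if( rank==0 )
    MPI.Waitall!(sreq)
end

println("done")
MPI.Finalize()

With a matching receive posted on every rank, the Waitall! on rank 0 can complete regardless of the vector length.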