Custom distribution for DistributedArrays

Hi everybody,

I am going to port some old code of mine to Julia. Since this code uses MPI, I figured I could use DistributedArrays.jl. My code implements a calculation on a very large array (~1 TB) and uses MPI to distribute the data among a number of nodes in a computing cluster. The way the data are distributed depends on a number of factors: each node gets roughly the same amount of data, but not exactly the same. Suppose, for instance, that my very long array has 100 elements and that I am running the code on two nodes; depending on the way the data have been taken, I might end up splitting the array into two chunks containing 56 and 44 elements, respectively. This split helps the computation, as many operations have to be performed on the two sequences of 56 and 44 elements, and I found that using it instead of the simpler 50+50 scheme improves the speed.
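
To make the intent concrete, here is a rough sketch of how the old MPI code decides the split (the counts vector here is just an example; in the real code it is computed from the data):

```julia
# Element counts per node: roughly equal, but not exactly (example values)
counts = [56, 44]

# Starting offset of each node's chunk
offsets = cumsum([0; counts[1:end-1]])

# Index range owned by each node: here [1:56, 57:100]
ranges = [offsets[i]+1 : offsets[i]+counts[i] for i in eachindex(counts)]
```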

Is there a way to tell a function like dzeros to use a custom partitioning scheme by specifying exactly how many elements to assign to each worker? I was able to find how to specify the number of splits in each dimension, but this seems to produce partitions of equal size (i.e., 50+50), as in the snippet below.
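
For reference, this is the kind of call I found (assuming two workers have been added), which only lets me choose the number of pieces per dimension and ends up splitting evenly:

```julia
using Distributed
addprocs(2)
@everywhere using DistributedArrays

# Ask for 2 pieces along the only dimension of a length-100 array
d = dzeros((100,), workers()[1:2], [2])

# Size of the chunk held by each worker: gives [50, 50], not the [56, 44] I need
sizes = [(@fetchfrom p length(localpart(d))) for p in workers()[1:2]]
```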

Thanks a lot,
Maurizio.