I think it’s just double counting, due to how `Base.summarysize` behaves: it recursively counts the memory used by every reachable object. A `SharedArray` has a field `s`, which holds the whole array, and a field `loc_subarr_1d`, which holds a 1-D view of that same array. Mutating one mutates the other, so the same underlying memory is being counted twice. It’s probably doing something like:
```julia
mysizeof(f) = sum((sizeof(f), (mysizeof(getfield(f, i)) for i in 1:nfields(f))...))
```
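Here is a runnable sketch of that naive recursion (my guess at the mechanism, not the actual `summarysize` implementation). It stops at arrays and keeps no record of objects it has already visited, so a buffer reachable through two fields gets counted twice, whereas `Base.summarysize` deduplicates:

```julia
# Hypothetical naive recursive size: no tracking of already-visited objects.
struct TwoRefs
    a::Vector{Float64}
    b::Vector{Float64}  # may alias `a`, like SharedArray's two fields
end

naivesize(x::Array) = sizeof(x)  # bytes of array data; recursion stops here
naivesize(x) = sizeof(x) +
    sum(naivesize(getfield(x, i)) for i in 1:nfields(x); init = 0)

v = zeros(Float64, 1_000)   # 8_000 bytes of data
t = TwoRefs(v, v)           # both fields reference the same buffer

naivesize(t)           # counts the 8_000-byte buffer twice
Base.summarysize(t)    # skips objects it has already seen
```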
But it’s odd that it doesn’t double count in the 1-D case. Anyway, I think this deserves some attention from the devs.
Maybe adding a `bug` label to the issue will get it attention more quickly. In the meantime, the Task Manager on Windows or a system monitor on Linux will give you a rough idea of the memory used, at least for large enough data.
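If you’d rather get a number from inside Julia than from the OS tools, `Sys.maxrss()` reports the peak resident set size of the process. Note it’s a high-water mark, not the current usage, so it only ever grows:

```julia
# Peak resident set size of the Julia process, in bytes (never decreases).
before = Sys.maxrss()
x = zeros(Float64, 10^7)   # ~80 MB, and zeros() touches every page
after = Sys.maxrss()
println("peak RSS grew by ~$(round((after - before) / 2^20, digits = 1)) MiB")
```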