For my code I am using a large multidimensional array of anonymous functions, which I computed in a previous session and load as a constant into the current session. I then run a pmap operation over another list, handing the array over as a function argument to each of the workers.
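Roughly, my setup looks like this (a simplified sketch; `funcs.jls`, `inputs`, and the exact way I apply the functions are stand-ins for my actual code):

```julia
using Distributed, Serialization
addprocs(4)

# Array of anonymous functions computed in an earlier session
# ("funcs.jls" is a placeholder for my actual file).
const FUNCS = deserialize("funcs.jls")

inputs = 1:100   # placeholder for the list I actually map over

# Hand the array to each task via the captured local `funcs`.
results = let funcs = FUNCS
    pmap(x -> map(f -> f(x), funcs), inputs)
end
```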
Now, when I load the list, Julia allocates a lot of memory, which is never freed again. This is not reflected in varinfo(): the object is reported to be several orders of magnitude smaller than what Julia allocates when creating it.
So my first question is: is there a way in Julia to actually see what is responsible for the currently allocated memory?
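So far I have only found tools that measure either the object itself or the whole process, e.g. (the Profile.Allocs part needs Julia 1.8+; `funcs.jls` is again a placeholder):

```julia
using InteractiveUtils, Serialization

obj = deserialize("funcs.jls")      # placeholder, as above

# summarysize / varinfo() only count the bytes reachable from the object:
println(Base.summarysize(obj))

# process-level numbers show what the session as a whole holds:
GC.gc()                             # force a full collection first
println(Base.gc_live_bytes())       # bytes currently live on the GC heap
println(Sys.maxrss())               # peak resident set size of the process

# On Julia 1.8+ the allocation profiler can attribute allocations to code:
using Profile
Profile.Allocs.@profile sample_rate=0.01 deserialize("funcs.jls")
prof = Profile.Allocs.fetch()       # inspect e.g. with PProf.jl
```

None of these tell me which live objects are actually holding the memory, though.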
Further, once I start the pmap operations, Julia again allocates a lot of memory. I assume this is because parts of whatever uses so much memory on the master get copied to the workers. Searching for a way around this, I came across SharedArrays, but they do not seem to work trivially here, because the objects in the list all have different bits types.
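One thing I also came across is CachingPool from Distributed, which, if I understand correctly, serializes a closure (including the data it captures) to each worker only once instead of once per task. A sketch of how I would use it, with the same placeholders as above:

```julia
using Distributed, Serialization

const FUNCS = deserialize("funcs.jls")   # placeholder, as above
inputs = 1:100                           # placeholder work items

# The pool caches the closure (with its captured data) on each worker,
# so it is not re-serialized for every task.
pool = CachingPool(workers())
results = let funcs = FUNCS
    pmap(x -> map(f -> f(x), funcs), pool, inputs)
end
clear!(pool)   # release the cached closures on the workers afterwards
```

That reduces repeated transfers, but every worker still ends up holding its own copy.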
Therefore my second question is: is there a way for several workers to read a large number of objects of different bits types without copying the data to every worker?
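The only workaround I can see is to have every worker load the list from disk once into its own global, so that nothing is shipped from the master per task:

```julia
using Distributed
addprocs(4)

# Each worker deserializes the list once into its own global constant.
@everywhere begin
    using Serialization
    const FUNCS = deserialize("funcs.jls")   # placeholder path
end

inputs = 1:100   # placeholder work items

# The closure refers to the global FUNCS, which resolves to the
# worker-local copy on each worker; only x and the result travel.
results = pmap(x -> map(f -> f(x), FUNCS), inputs)
```

But this still duplicates the memory once per worker process, which is exactly what I would like to avoid. As far as I can tell, true sharing (one copy visible to all workers) would require laying the data out as a flat bits-type buffer, e.g. a memory-mapped Vector{UInt8}, which does not fit an array of heterogeneous closures well.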
I know I could use threads, but they turned out to be much slower than workers with pmap. Also, the calculations done on each worker are completely independent, except that they all need to read from this one list.
Any more general advice on dealing with this situation is also appreciated.