Memory allocation for large numbers of objects?

I’m experiencing some strange behaviour that I think may be due to memory allocation.

I have a parser that runs through a big (90GB) file. It munches lines at a decent rate and then every so often… just hangs for a few seconds before starting again.

I’m not asking for a diagnosis of my problem.

Instead, suppose I create, say, 20GB of smallish objects (a couple of KB each). Is there some fundamental mechanism that would cause the REPL to pause for a while to expand the amount of memory it has at its disposal?

(E.g. Python maintains its own memory pool.)

Definitely sounds like it could be garbage collection. When you create an object, Julia allocates memory for it. Once there are no longer any references to it (e.g. the variable goes out of scope, or the last reference is removed from some other data structure), that data is unreachable and therefore useless, so Julia periodically cleans up this orphaned data to keep it from eating your RAM.
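A quick way to check whether GC is behind the pauses: `@time` reports how much of the elapsed time went to garbage collection. The toy workload below is just a stand-in for your parser (not your actual code); it allocates a small array per iteration, which is exactly the pattern that keeps the collector busy.

```julia
# Allocation-heavy toy workload: each iteration creates a fresh small object
# (~2 KB), which piles up garbage for the collector to clean up periodically.
function churn(n)
    total = 0
    for i in 1:n
        v = rand(256)        # new ~2 KB array every iteration
        total += length(v)
    end
    return total
end

# @time reports how much of the wall time was spent in garbage collection.
# A large "gc time" percentage means the pauses you see are GC pauses.
@time churn(10^6)
```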

Check your code to see if you can pre-allocate buffers to reduce the amount of memory that needs to be garbage-collected (reduce, reuse, recycle!): http://docs.julialang.org/en/stable/manual/performance-tips/#pre-allocating-outputs
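As a rough sketch of what pre-allocating looks like for a line-by-line parser (the function name `parse_into!`, the three-field comma-separated format, and `data.csv` are all made up for illustration):

```julia
# Allocate one buffer up front and overwrite it for every line,
# instead of allocating a fresh array per line.
function parse_into!(buf::Vector{Float64}, line::AbstractString)
    for (i, field) in enumerate(split(line, ','))
        buf[i] = parse(Float64, field)   # overwrite in place, no new array
    end
    return buf
end

buf = zeros(3)                       # allocated once, reused for every line
for line in eachline("data.csv")     # hypothetical input file
    parse_into!(buf, line)           # assumes each line has exactly 3 fields
    # ... use buf before the next iteration overwrites it ...
end
```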

Thanks for the hint. Trying it out with gc_enable(false) makes my code 3 times faster and stops the hanging, so a hacky improvement is just to manually disable and re-enable it as appropriate for a roughly 2x boost. I also have a better idea of what to fix properly next.
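Roughly what I mean by the hacky version (the file name and batch size are placeholders; gc_enable()/gc() are the names from this Julia version, spelled GC.enable()/GC.gc() in newer releases):

```julia
gc_enable(false)                     # stop the collector from pausing the loop
for (i, line) in enumerate(eachline("bigfile.txt"))   # hypothetical file
    # ... parse the line ...
    if i % 1_000_000 == 0            # every so often, collect at a moment we choose
        gc_enable(true)
        gc()
        gc_enable(false)
    end
end
gc_enable(true)                      # always re-enable when done
```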

You may find this thread from the old mailing list interesting, particularly Jeff Bezanson’s comment on reusing space efficiently by overwriting it.
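One minimal reading of that idea (the field names and file are invented for illustration, and `mutable struct` is the newer spelling of the old `type` keyword): keep a single mutable record and overwrite its fields for each line, rather than constructing a new object per line.

```julia
mutable struct Record
    id::Int
    value::Float64
end

rec = Record(0, 0.0)                     # allocated once
for line in eachline("bigfile.txt")      # hypothetical file
    fields = split(line, ',')
    rec.id = parse(Int, fields[1])       # overwrite fields, no new Record allocated
    rec.value = parse(Float64, fields[2])
    # ... process rec before the next line overwrites it ...
end
```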
