Worker process memory leak?

Hello,

I have a question about garbage collection on worker processes (this is on Julia 0.6.2). I have a long-running function that uses a RemoteChannel to communicate with the master process. This function allocates and frees memory over the course of its operation, but the freed memory doesn’t appear to be reclaimed by the garbage collector, even when @everywhere gc(true) is called.

Here’s a simple example. First, define the worker function:

function worker(channel)

    println("Starting")
    a = 0

    while true
        cmd = take!(channel)

        if cmd == :alloc
            println("Allocating")
            a = zeros(10000, 50000)    # allocate ~4 GB of Float64s

        elseif cmd == :free
            println("Freeing")
            a = 0    # drop the only reference, making the array garbage

        elseif cmd == :quit
            println("Quitting")
            return
        end
    end
end

Start julia with one worker process (julia -p 1), load the function on all processes, and start the worker function running:

> @everywhere include("worker.jl")
> c = RemoteChannel(()->Channel(1))
> remote_do(worker, 2, c)

Allocate some memory on the worker, and note that the memory usage of the julia worker process (as measured by the operating system, e.g. htop’s RES field) increases by about 4 GB (10000 × 50000 Float64s at 8 bytes each), as expected:

> put!(c, :alloc)

Now, let’s try to free the memory:

> put!(c, :free)
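
Then force a full collection on every process:

> @everywhere gc(true)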

Neither step brings the memory usage of the worker process back down. However, if we quit the worker function and then run the same gc call again, the memory usage does return to its pre-allocation level:

> put!(c, :quit)
> @everywhere gc(true)

This feels like a bug to me, but if I’m not understanding something about how garbage collection is supposed to happen on worker processes, please let me know! :slight_smile:
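
In case it helps narrow this down, one variant worth trying is to run the collection from inside the worker loop itself, in case @everywhere gc(true) executes in a different task than the one that held the reference. Something like this extra branch in the while loop (the :gc command is my own addition for illustration; I haven’t confirmed it changes the behavior):

        elseif cmd == :gc
            println("Collecting")
            gc(true)    # full collection, run from the worker's own task

driven from the master with > put!(c, :gc).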

It seems like this might be somewhat related to this issue, although in that one the memory leak appeared when using Futures instead of Channels.

By the way, thanks to the community for such a great language! Julia is a pleasure to use.
