JuMP uses all available memory, can't do anything else after


#1

Hey guys, so I have an optimization problem in which JuMP with Ipopt uses 90% of my available RAM for automatic differentiation.

In my layman’s understanding, after it calculates the Hessian and so on, it stores it in RAM, then the text “This program contains Ipopt (…)” shows up and the iterations start running.

After many hours of solving the first optimization problem, I then change the initial set of values and try to run it again by defining the exact same JuMP model all over again. It is here that I notice that Julia does not free any of the memory used in the first pass, so when JuMP starts to compute the AD, memory usage is already at 90% and it keeps increasing until it runs out of memory.

In fact, even unrelated simple operations start to run out of memory as well.

Is there a way to ‘clean’ this memory, or better yet, to change only the initial value but preserve the AD Hessian? Here is a simple example of what I’m trying to say:

My code is in JuMP 0.18, but answers/tips for 0.19 are very welcome as well. Thanks!

using JuMP, Ipopt, Random, Distributions

function genFake()
    X = [ones(1000) randn(1000) randn(1000).+10]
    ϵ = rand(Normal(0,10), 1000)
    β0 = [5.0, 1.5, 2.3]
    y = X*β0 .+ ϵ
    return y, X
end

function OLS(y, X, test)
    m = Model(solver = IpoptSolver())
    @variable(m, β[i = 1:3], start = test[i])
    @objective(m, Min, sum((y[i] - X[i, :]' * β)^2 for i in 1:1000))
    status = solve(m)
    println("Objective value: ", getobjectivevalue(m))
    println("β = ", getvalue(β))
end

test = rand(5, 3)
y, X = genFake()
for i in 1:5
    OLS(y, X, test[i, :])
end

#2

That may be an issue with Julia’s JIT.

I tested with JuMP 0.19, Julia 1.1, and Linux:

using JuMP, Ipopt, Random, Distributions

function genFake(npoints)
    X = [ones(npoints) randn(npoints) randn(npoints).+10]
    ϵ = rand(Normal(0,10), npoints)
    β0 = [5.0, 1.5, 2.3]
    y = X*β0 .+ ϵ
    return y, X
end

function OLS(y, X, test)
    m = Model(with_optimizer(Ipopt.Optimizer))
    @variable(m, β[i = 1:3], start = test[i])
    @objective(m, Min, sum((y[i] - X[i, :]' * β)^2 for i in 1:length(y)))
    status = JuMP.optimize!(m)
    println("Objective value: ", JuMP.objective_value(m))
    println("β = ", JuMP.value.(β))
end

function main(N, nrepeat)
    test = rand(nrepeat,3)
    y, X = genFake(N)
    for i in 1:nrepeat
        OLS(y, X, test[i, :])
    end
end

I first run it with small values (e.g. main(10, 1)) to precompile all the functions, then run main(1000, 5). I do not see any memory issue arising:

@time main(1000, 5)
  0.039539 seconds (60.84 k allocations: 3.943 MiB)

Ipopt converges in one iteration, which is expected: the objective is a convex quadratic, so Newton’s method reaches the minimizer in a single step (and this small least-squares problem even has an analytical solution).
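As a sanity check, the analytical solution can be computed directly with the backslash operator; a minimal sketch reusing the same fake-data setup (the seed is an arbitrary choice, and `10 .* randn(...)` replaces `rand(Normal(0, 10), ...)` to avoid the Distributions dependency):

```julia
using LinearAlgebra, Random

Random.seed!(1)  # arbitrary seed, just for reproducibility

# Same fake data as genFake(1000): intercept, standard normal, normal shifted by 10
X = [ones(1000) randn(1000) randn(1000) .+ 10]
y = X * [5.0, 1.5, 2.3] .+ 10 .* randn(1000)

# Closed-form least-squares solution; backslash solves min ‖y - Xβ‖² via QR
β̂ = X \ y

# At the optimum the residuals are orthogonal to the columns of X
println("β̂ = ", β̂)
println("‖X'(y - Xβ̂)‖ = ", norm(X' * (y - X * β̂)))
```

The resulting β̂ should match (up to solver tolerance) what Ipopt reports after its single iteration.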