Hey guys, I have an optimization problem in which JuMP with Ipopt uses about 90% of my available RAM for automatic differentiation.

In my layman’s understanding, it computes the Hessian and so on, stores it in RAM, and then the text “This program contains Ipopt (…)” shows up and the iterations start.

After many hours of solving the first optimization problem, I change the initial values and run it again by defining the exact same JuMP model from scratch. This is where I notice that Julia does not release any of the memory used in the first pass: when JuMP starts the AD computation, memory usage is already at 90%, and it keeps increasing until it runs out of memory.

In fact, even simple unrelated operations start running out of memory as well.
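By “clean”, what I have in mind is something like dropping every reference to the finished model and forcing a garbage-collection pass (just a sketch; I don’t know whether this actually releases the AD buffers, and the `GC.gc()` spelling assumes Julia 1.0+, on Julia 0.6 it would be `gc()`):

```julia
# After the first solve, drop the only reference to the model
# and explicitly ask Julia's garbage collector to run.
m = nothing   # `m` is the JuMP model from the previous solve
GC.gc()       # force a full collection pass
```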

Is there a way to ‘clean’ this memory, or better yet, to change only the initial values while preserving the AD Hessian? Here is a simple example of what I mean:

My code is on JuMP 0.18, but answers/tips for 0.19 are very welcome as well. Thanks!

```julia
using JuMP, Ipopt, Random, Distributions

# Generate fake data for an OLS regression: y = X*β0 + ϵ
function genFake()
    X = [ones(1000) randn(1000) randn(1000) .+ 10]
    ϵ = rand(Normal(0, 10), 1000)
    β0 = [5.0, 1.5, 2.3]
    y = X * β0 .+ ϵ
    return y, X
end

# Build and solve the least-squares problem from scratch on every call
function OLS(y, X, start_vals)
    m = Model(solver = IpoptSolver())
    @variable(m, β[i = 1:3], start = start_vals[i])
    @objective(m, Min, sum((y[i] - X[i, :]' * β)^2 for i in 1:1000))
    status = solve(m)
    println("Objective value: ", getobjectivevalue(m))
    println("β = ", getvalue(β))
end

test = rand(5, 3)   # five sets of random starting values
y, X = genFake()
for i in 1:5
    OLS(y, X, test[i, :])
end
```
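What I would ideally like is to build the model only once and just swap the start values between solves, something like the sketch below (JuMP 0.18 syntax; I’m assuming that `setvalue` followed by another `solve` warm-starts Ipopt from the new values and reuses the existing AD structures, which is exactly the part I’d like confirmed):

```julia
using JuMP, Ipopt, Random, Distributions

# Same fake data as in the example above
X = [ones(1000) randn(1000) randn(1000) .+ 10]
y = X * [5.0, 1.5, 2.3] .+ rand(Normal(0, 10), 1000)
test = rand(5, 3)   # five sets of starting values

# Build the model a single time
m = Model(solver = IpoptSolver())
@variable(m, β[i = 1:3], start = test[1, i])
@objective(m, Min, sum((y[i] - X[i, :]' * β)^2 for i in 1:1000))

for r in 1:5
    # Overwrite the current variable values; in JuMP 0.18 these are
    # passed to the solver as the starting point on the next solve.
    for i in 1:3
        setvalue(β[i], test[r, i])
    end
    solve(m)
    println("run $r: objective = ", getobjectivevalue(m), ", β = ", getvalue(β))
end
```

If re-solving like this keeps the Hessian sparsity/AD tape around instead of rebuilding it, that would solve both my time and my memory problem.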