Large amount of memory allocation when using autodiff and IPNewton

Hi all,

I’m new to Julia and have been working on a model that is essentially a long loop over optimisation problems:

```julia
for state = 1:n
    x0 = ones(5)/2
    fun1(x) = -hh_utility(x, state)[1]
    cfg = ForwardDiff.GradientConfig(fun1, x0, ForwardDiff.Chunk{5}())
    cfg2 = ForwardDiff.HessianConfig(fun1, x0, ForwardDiff.Chunk{5}())
    grad1(g, x) = ForwardDiff.gradient!(g, fun1, x, cfg)
    hes1(H, x) = ForwardDiff.hessian!(H, fun1, x, cfg2)
    df = TwiceDifferentiable(fun1, grad1, hes1, x0)
    res = optimize(df, dfc, x0, IPNewton())
    output[state, :] = res.minimizer
end
```


When I run the code, I get the following timing:

```
127.602522 seconds (2.09 G allocations: 161.172 GiB, 18.27% gc time, 0.42% compilation time)
```

The memory allocation seems quite large. Should I try to reduce it, or is this normal? If I just time the function and its derivatives on their own, I get these allocations:

```julia
@time fun1(x0)
  0.000014 seconds (3 allocations: 80 bytes)

@time grad1(g, x0)
  0.000034 seconds (17 allocations: 1.297 KiB)

@time hes1(H, x0)
  0.000058 seconds (21 allocations: 9.703 KiB)
```

In case I should try to reduce allocations, what could be a good way of doing that?


There could be potential for optimisation.

In general, I like this reference:

For your case, you could also use the profiler in the Julia VS Code extension (Profiler · Julia in VS Code) to find out where the allocations are coming from.
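If you prefer to stay in the REPL, the `Profile` standard library can also record where allocations happen (the `Profile.Allocs` API is available since Julia 1.8). A minimal sketch with a hypothetical allocation-heavy workload standing in for your loop:

```julia
using Profile

# Hypothetical workload standing in for the optimisation loop:
work() = sum(sum(rand(100)) for _ in 1:1_000)

work()  # warm up first so compilation doesn't dominate the profile

Profile.Allocs.clear()
Profile.Allocs.@profile sample_rate = 1 work()

results = Profile.Allocs.fetch()
println(length(results.allocs))  # number of recorded allocations
```

Each entry in `results.allocs` carries a type, size, and stack trace, so you can see which lines allocate the most (PProf.jl can visualise this nicely).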

About your code: you should initialise the gradient and Hessian configs outside of the `for` loop, since rebuilding them on every iteration allocates unnecessarily. In addition, there might be some type instability in `hh_utility`, which would make everything less efficient. Is it possible to share `hh_utility`?
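To illustrate hoisting the configs, here is a sketch of the restructured loop. `hh_utility` and the constraints `dfc` are not shown in your post, so I use a made-up stand-in utility and box constraints purely as placeholders; a `Ref` carries the current `state` into the objective so the closure and configs can be built once:

```julia
using ForwardDiff, Optim

# Hypothetical stand-in for hh_utility, just so the sketch runs:
hh_utility(x, state) = (-sum(abs2, x .- state / 10),)

n = 3
output = zeros(n, 5)
x0 = ones(5) / 2

# Build the closure and configs ONCE; a Ref smuggles `state` in.
state_ref = Ref(1)
fun1(x) = -hh_utility(x, state_ref[])[1]
cfg  = ForwardDiff.GradientConfig(fun1, x0, ForwardDiff.Chunk{5}())
cfg2 = ForwardDiff.HessianConfig(fun1, x0, ForwardDiff.Chunk{5}())
grad1!(g, x) = ForwardDiff.gradient!(g, fun1, x, cfg)
hes1!(H, x) = ForwardDiff.hessian!(H, fun1, x, cfg2)

# Placeholder box constraints 0 ≤ x ≤ 1; use your actual dfc here.
dfc = TwiceDifferentiableConstraints(zeros(5), ones(5))

for state = 1:n
    state_ref[] = state
    df = TwiceDifferentiable(fun1, grad1!, hes1!, x0)
    res = optimize(df, dfc, x0, IPNewton())
    output[state, :] = Optim.minimizer(res)
end
```

The configs hold the dual-number work buffers, which are the main per-iteration allocation you can avoid; whether `TwiceDifferentiable` can also be safely reused across `optimize` calls I would test separately, so the sketch conservatively rebuilds it each iteration.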