I am using Optim with LBFGS for a statistical application (maximum likelihood), and I have run into an issue where Optim consumes an increasingly large amount of memory until it runs out and crashes. The data set is on the order of 1-2 GB, and there are ~1000 parameters.

I tried running Optim while watching memory use across iterations. Oddly, memory usage rose and fell (presumably with garbage collection) until roughly iteration 70, after which it grew steadily until the process crashed, exhausting all ~155 GB available. It's worth noting that I provide an analytic gradient, so it isn't ForwardDiff.Duals that are generating the memory use.
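For concreteness, my call pattern looks roughly like this, with a toy Rosenbrock objective standing in for my actual likelihood (the function names and dimensions here are placeholders, not my real model):

```julia
using Optim

# Toy objective standing in for the log-likelihood over the real data set.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Analytic gradient in Optim's in-place form: g!(G, x) writes into the
# preallocated buffer G rather than allocating a fresh vector per call.
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    G[2] = 200.0 * (x[2] - x[1]^2)
end

result = optimize(f, g!, zeros(2), LBFGS())
```

In my real problem `x` has ~1000 entries and `f`/`g!` sweep over the full data set on each call.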
Any thoughts about what could be going on or how to avoid/resolve the issue would be greatly appreciated. Thank you!