Nelder-Mead tolerance vastly higher when I restart optimization with an updated guess for initial parameters

As I understand it, optimization algorithms themselves generally cannot be parallelized: the iterations are inherently sequential, so the usual approach is to parallelize the objective function, as you have done. In principle Nelder-Mead can be parallelized, as you mention, but I don't think Optim.jl supports that. The book I linked above has sample code for Nelder-Mead, which you can modify and parallelize at will. Another suggestion is to optimize the computation of the objective function and log-likelihood itself, using speed-up tools like LoopVectorization.jl; see How to speed up these functions? for an example.
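
To make the objective-parallelization idea concrete, here is a minimal sketch, assuming a toy Gaussian negative log-likelihood on simulated data (the data `y`, the function `negloglik`, and the parameterization are illustrative, not your actual model). Each evaluation splits the per-observation sum across threads with `Threads.@spawn`, and the resulting objective is handed to Optim.jl's `NelderMead` as usual:

```julia
using Optim

# Simulated data standing in for the real dataset (illustrative only).
const y = randn(10_000) .* 2.0 .+ 1.0

# Negative log-likelihood of N(mu, sigma^2), with the per-observation sum
# split into chunks that are evaluated on separate threads.
function negloglik(theta)
    mu, logsigma = theta
    sigma = exp(logsigma)   # optimize over log(sigma) to keep sigma > 0
    chunks = Iterators.partition(eachindex(y), cld(length(y), Threads.nthreads()))
    tasks = map(chunks) do idxs
        Threads.@spawn begin
            s = 0.0
            for i in idxs
                s += log(sigma) + 0.5 * ((y[i] - mu) / sigma)^2
            end
            s
        end
    end
    return sum(fetch.(tasks))   # combine the per-chunk partial sums
end

# Start Julia with multiple threads (e.g. `julia -t auto`) for this to help.
result = optimize(negloglik, [0.0, 0.0], NelderMead(),
                  Optim.Options(g_tol = 1e-8, iterations = 10_000))
println(Optim.minimizer(result))
```

The inner per-chunk loop is also the natural place to try LoopVectorization.jl's `@turbo` macro when the per-observation terms are plain arithmetic; whether that pays off depends on your actual likelihood.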