Preconditioner for nonlinear optimization

Hello,

I am working on the following nonlinear minimization problem:

\displaystyle{\min_{\boldsymbol{c} \in \mathbb{R}^n}} \; \frac{1}{M} \sum_{i=1}^{M} \left(\frac{1}{2} \left[\boldsymbol{c}^T \boldsymbol{f}_0 + \int_0^{x^i} \sigma(\boldsymbol{c}^T\boldsymbol{g}(t))\, dt \right]^2 - \log\left(\sigma(\boldsymbol{c}^T \boldsymbol{g}(x^i))\right) \right),

where \sigma(x) = \log(1 + \exp(x)) is the softplus function. The x^1, x^2, \ldots, x^M are fixed parameters with M \approx 200\text{–}300, and the coefficient vector \boldsymbol{c} has dimension n \approx 20. This optimization is a performance-critical part of my code.
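
For concreteness, here is a simplified Julia sketch of how I evaluate the objective (the basis `g`, the vector `f0`, and the use of QuadGK.jl for the integral are placeholders standing in for my actual code):

```julia
using LinearAlgebra, QuadGK

# Softplus; log1p(exp(x)) overflows for large x, so branch on the sign for stability.
softplus(x) = x > 0 ? x + log1p(exp(-x)) : log1p(exp(x))

# Placeholder basis: g(t) is the vector of basis functions at t, f0 a fixed vector.
g(t) = [t^k for k in 0:19]
const f0 = g(0.0)

function objective(c, xs)
    s = 0.0
    for x in xs
        # ∫₀ˣ σ(cᵀ g(t)) dt via adaptive quadrature
        I, _ = quadgk(t -> softplus(dot(c, g(t))), 0.0, x)
        s += 0.5 * (dot(c, f0) + I)^2 - log(softplus(dot(c, g(x))))
    end
    return s / length(xs)
end
```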

For now, I am using Optim.jl with L-BFGS.
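Concretely, my call is essentially the following (a sketch, reusing the `objective` above and letting Optim.jl build the gradient with forward-mode autodiff):

```julia
using Optim

xs = sort(rand(250))   # placeholder for my parameters x^1, ..., x^M
c0 = zeros(20)         # initial guess for the coefficients

result = optimize(c -> objective(c, xs), c0, LBFGS();
                  autodiff = :forward)
cmin = Optim.minimizer(result)
```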
Do you have any advice for speeding up the optimization?
Are there rules of thumb for designing a preconditioner? (A sketch of how I would plug one in follows below.)
Should I also supply the Hessian and switch to a Newton method?
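
For reference, my understanding is that Optim.jl's `LBFGS` accepts a preconditioner through its `P` keyword (it only needs `ldiv!` and the three-argument `dot`), and that `Newton()` can obtain the Hessian via autodiff. So mechanically I would write something like the sketch below; my question is whether and how to choose `P` (here just a placeholder diagonal scaling):

```julia
using Optim, LinearAlgebra

# Placeholder: a fixed diagonal scaling. Diagonal supports ldiv! and dot(x, P, y),
# which is what Optim.jl requires of a preconditioner.
P = Diagonal(ones(length(c0)))

res_lbfgs = optimize(c -> objective(c, xs), c0, LBFGS(P = P);
                     autodiff = :forward)

# Alternative: Newton's method, with the Hessian computed by ForwardDiff.
res_newton = optimize(c -> objective(c, xs), c0, Newton();
                      autodiff = :forward)
```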

Thank you for your help,