I think it might be due to you using u0 = 0.0 instead of u0 = [0.0]. I can't tell from the docs whether the scalar form is supported by the general NonlinearProblem (but you can use StaticArrays to speed things up).

Note that f'(x_0) = 0 here, so you might want to use a different initial guess. Even with the in-place version, I find that NewtonRaphson stalls on this problem.

Why? Because exact derivatives report the gradient as exactly 0 at that point, the Newton-Raphson update gets stuck at the stationary point. A finite-difference approximation, on the other hand (except perhaps a central difference), can return a small nonzero gradient, which lets the iteration move.
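To make this concrete, here is a minimal sketch (in Python rather than Julia, purely for illustration; f(x) = x² − 1 with initial guess x₀ = 0 is a made-up example, not the OP's actual function) of why an exact zero derivative stalls Newton-Raphson while a forward difference does not:

```python
def f(x):
    return x**2 - 1.0  # roots at x = ±1

def fprime(x):
    return 2.0 * x     # exact derivative

x0 = 0.0               # stationary point of f: f'(0) = 0

# Exact derivative: the Newton step -f(x0)/f'(x0) divides by zero,
# so the iteration cannot move -- the solver stalls (or errors).
exact_grad = fprime(x0)
print(exact_grad)      # 0.0

# Forward difference with step h: (f(x0+h) - f(x0)) / h equals h here,
# which is tiny but nonzero, so the Newton update is defined and the
# iterate escapes the stationary point (by accident, not by merit).
h = 1e-6
fd_grad = (f(x0 + h) - f(x0)) / h
print(fd_grad)         # ~1e-6, nonzero

# A central difference (f(x0+h) - f(x0-h)) / (2h) cancels exactly by
# symmetry around x0 = 0, matching the caveat about central differences.
central_grad = (f(x0 + h) - f(x0 - h)) / (2 * h)
print(central_grad)    # 0.0
```

So the finite-difference solver moving at all is an artifact of truncation error in the derivative estimate, not a sign that it handles the problem better; a different initial guess fixes both variants.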

Oh right, my bad. Usually, when people say “finite differentiation vs exact” they either mean “autodiff is performing finite differentiation” (which is wrong) or “exact differentiation would be better” (which in this case also happens to be wrong, due to the flat gradient), but you meant neither of those ^^

My comparison was between the OP's two results: one from exact derivatives via autodiff, the other without autodiff. In retrospect, though, my last sentence stated them in the opposite order.