I think it might be because you're using `u0 = 0.0` instead of `u0 = [0.0]`. I can't tell from the docs whether the scalar form is supported in the general `NonlinearProblem` (but you can use StaticArrays to speed things up).
Note that f'(x_0) = 0 here, so you might want to use another initial guess. Even with the in-place version, I find that NewtonRaphson stalls on this problem.
Why? Since the derivatives are exact, Newton–Raphson gets stuck at the zero gradient. With finite differences, on the other hand (except maybe when central differences are used), the estimated gradient can be nonzero, so the iteration can escape.
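To make that concrete, here's a minimal generic sketch (plain Python, not NonlinearSolve.jl) on a toy residual f(x) = x² − 1 with initial guess x₀ = 0, where the exact derivative is 0. The function names and tolerances are my own illustration, not anything from the thread:

```python
def newton(f, dfdx, x, tol=1e-12, maxiter=100):
    """Basic Newton-Raphson iteration on a scalar residual."""
    for _ in range(maxiter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = dfdx(x)
        if d == 0.0:       # exact derivative is flat: Newton step undefined
            return None    # the solver stalls here
        x -= fx / d
    return x

f = lambda x: x**2 - 1.0

# Exact derivative (what autodiff would give): f'(0) = 0, so Newton stalls.
exact = lambda x: 2.0 * x
print(newton(f, exact, 0.0))  # prints None: stuck at the flat point

# Forward finite difference: truncation error gives a tiny nonzero slope
# ((f(h) - f(0)) / h = h here), so the first step is huge (about 1/h),
# but the iteration then converges to the root at 1.
h = 1e-8
fd = lambda x: (f(x + h) - f(x)) / h
print(newton(f, fd, 0.0))     # converges to ~1.0
```

A central difference would give (f(h) − f(−h)) / (2h) = 0 exactly at x = 0 for this symmetric function, which is why the caveat about central differences applies.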
Oh right, my bad. Usually when people say "finite differentiation vs. exact", they mean either "autodiff is performing finite differentiation" (which is wrong) or "exact differentiation would be better" (which in this case also happens to be wrong, due to the flat gradient), but you meant neither of those ^^
My comparison referred to the OP getting two results: one from exact derivatives via autodiff, and the other without autodiff. In retrospect, my last sentence stated them in the opposite order.