Struggling to get scipy.newton performance using NonlinearSolve.jl

Yes, if it’s a bunch of independent scalar rootfinds, you should run a scalar rootfinder N times rather than one vector Newton method, since the latter has to build and factorize a large (N×N) Jacobian at every iteration (there’s a sketch of that vector formulation at the end of this post for comparison). With that in mind, here’s the NonlinearSolve code:

using NonlinearSolve

# Residual of the linear water-wave dispersion relation ω² = G·k·tanh(k·d)
function dispersion(k, p)
    k * p.G * tanh(k * p.d) - p.ω^2
end

# Set up parameters: gravity G, depth d, and N angular frequencies ω (one parameter tuple per frequency)
N = 1000
G, d, ω = 9.8, 30.0, 2π * collect(range(0.05, 1, N))
p = map(ω -> (G=G, d=d, ω=ω), ω)

# Starting point: first-order approximation, i.e. one iteration of k = ω²/(G·tanh(k·d))
# from the shallow-water guess k = ω/√(G·d)
k0 = @. ω^2 / G / tanh(ω * sqrt(d / G))
# One scalar (out-of-place) problem per frequency; broadcast the scalar Newton solver over all of them
prob = NonlinearProblem{false}.(dispersion, k0, p)
solver = solve.(prob, (NewtonRaphson(),), tol=1e-9)

which as expected is really fast:

using BenchmarkTools
julia> @benchmark solver = solve.(prob, (NewtonRaphson(),), tol=1e-9)
BenchmarkTools.Trial: 10000 samples with 1 evaluation.
 Range (min … max):  128.400 μs …   5.394 ms  ┊ GC (min … max): 0.00% … 96.94%
 Time  (median):     144.500 μs               ┊ GC (median):    0.00%
 Time  (mean ± σ):   152.696 μs ± 125.333 μs  ┊ GC (mean ± σ):  2.62% ±  3.17%

   ▅▆ ▁ ▃█ ▃
  ▁██▆█▃█████▆▄▃▂▂▂▁▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▂
  128 μs           Histogram: frequency by time          262 μs <

 Memory estimate: 78.69 KiB, allocs estimate: 18.

150 μs ain’t too shabby.
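
If it helps, each element of solver is an ordinary scalar solution object, so pulling out the wavenumbers and sanity-checking the residuals is just a broadcast away. A small usage sketch, reusing the variables defined above:

# Collect the roots (one wavenumber per frequency) from the broadcasted solutions
k = map(sol -> sol.u, solver)

# Sanity check: the dispersion residual should be on the order of the solver tolerance at each root
maximum(abs, dispersion.(k, p))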

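For contrast, here is roughly what the vector formulation mentioned at the top would look like: all N equations stacked into one residual, solved as a single N-dimensional Newton iteration. This is only a sketch (it reuses G, d, ω, and k0 from the setup above, and dispersion_vec is just a name picked for the example), but it shows where the cost of building and factorizing the N×N Jacobian comes from, even though the true Jacobian here is diagonal.

function dispersion_vec(k, p)
    # Same residual as above, but for the whole vector of wavenumbers at once
    @. k * p.G * tanh(k * p.d) - p.ω^2
end

prob_vec = NonlinearProblem{false}(dispersion_vec, k0, (G=G, d=d, ω=ω))
# By default this builds a dense N×N Jacobian and factorizes it on every Newton step
sol_vec = solve(prob_vec, NewtonRaphson())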