Using NLsolve to find zeros

I was using Roots.jl for this sort of problem, and wanted to try ModelingToolkit.jl or NLsolve.jl (the former uses the latter). Here is what I have tried so far:

using ModelingToolkit
∂xf(x)=(2 * x) * inv(2 * sqrt(3 + (x ^ 2)))
nl_f = @eval eval(∂xf[.01])

f2 = (du,u) -> nl_f(du,u,(-100,100))

using NLsolve
nlsolve(f2,ones(1))

I’m trying to find x such that:
∂xf(x) = 0.01
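
For comparison, here is a minimal Roots.jl sketch of the same problem (∂xf simplifies to x / √(3 + x²), and find_zero with a bracketing interval is standard Roots.jl usage):

using Roots

# Solve ∂xf(x) = 0.01 by bracketing: the residual is negative at 0 and positive at 100
find_zero(x -> x / sqrt(3 + x^2) - 0.01, (0, 100))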

using ModelingToolkit, NLsolve

# define some symbols to be used for the symbolic part of the evaluation
@variables x
@parameters slope
@derivatives D'~x

# The interesting function
f(x) = √(3 + x^2)

# A symbolic expression for the derivative of the function
∂xf_sym = expand_derivatives(D(f(x)))

# Build the system we would like solved
nl_sys = NonlinearSystem([0 ~ ∂xf_sym - slope], [x], [slope])

# generate an in-place evaluator and Jacobian for the system
# These are Julia functions that implement the symbolic calculation built above.
# https://mtk.sciml.ai/stable/tutorials/nonlinear/
f! = generate_function(nl_sys, [x], [slope], expression=Val{false})[2]

j! = generate_jacobian(nl_sys, expression=Val{false})[2]

# the desired value for slope
params = [0.01]

# pass the generated function and jacobian to the solver, with an initial guess
soln = nlsolve((out, x)->f!(out, x, params), (out, x)->j!(out, x, params), [0.0])

# build a Julia version of the symbolic derivative
∂xf = eval(build_function([∂xf_sym], [x])[1])

# check that the system is really solved
∂xf(soln.zero) ≈ params
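
As an extra check, this particular equation can also be solved by hand (my derivation, not part of the generated code): from ∂xf(x) = x / √(3 + x²) = s it follows that x = s·√(3 / (1 − s²)).

# Hand-derived closed form: x / √(3 + x^2) = s  ⇒  x = s * √(3 / (1 - s^2))
s = 0.01
soln.zero[1] ≈ s * sqrt(3 / (1 - s^2))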

It works, though the other approaches are simpler. I didn’t want to use IntervalRootFinding.jl because it is written in Python (as though I’m doing anything close to where speed would begin to matter), so instead I think I’ll use Roots.jl for this. This works well though, so I’ll consider it for other things.

Err… I am the author of IntervalRootFinding.jl and I can guarantee that it is written in pure Julia. I’m not sure why you think otherwise.
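
For reference, the thread’s problem in IntervalRootFinding.jl looks roughly like this (a sketch; roots over an interval is the package’s standard entry point):

using IntervalArithmetic, IntervalRootFinding

# Rigorously enclose every x in [-100, 100] where ∂xf(x) = 0.01
roots(x -> x / sqrt(3 + x^2) - 0.01, -100..100)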

I got that impression from the documentation. It definitely works fine, apart from the variable type. NLsolve is cool in that it shares macros with ModelingToolkit.jl. I’ll keep using each and see which I prefer.

I wonder why you want to use root-finding methods for what is essentially an optimization problem. Typically, specialized optimization algorithms should be more efficient and easier to use (for example, because they take derivatives automatically).

I agree. I was using the Mean Value Theorem and the Intermediate Value Theorem for demonstration, but for practical things I would use optimization. What packages would I use for optimization problems?

So far I have mostly used Optim.jl, which is very straightforward. If you know that your function is convex, Convex.jl is great. Many swear by JuMP.jl; I haven’t tried it yet because, as far as I can tell, it requires me to rewrite my functions in its domain-specific language.
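
To tie that back to this thread’s problem: since f(x) = √(3 + x²) is convex, solving ∂xf(x) = s is exactly the first-order condition for minimizing g(x) = f(x) − s·x. A sketch of that with Optim.jl (my formulation, not from the posts above):

using Optim

s = 0.01
g(x) = sqrt(3 + x[1]^2) - s * x[1]  # convex, so the minimizer satisfies ∂xf(x) = s
res = optimize(g, [0.0], BFGS())
Optim.minimizer(res)                # ≈ [0.0173], matching the nlsolve answer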

I’ll look at Optim.jl. I’ve heard of JuMP, but never used it.

Autodiff can be done for equations as well. The larger difference between minimizing f and solving ∇f = 0 with a nonlinear solver is that the nonlinear solver is not aware that the goal is minimization. Optimization methods use a model Hessian that is at least symmetric and often positive definite. Optimization methods also guide the search with f itself, rather than with ‖∇f‖, which is what nonlinear solvers do.
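
A small illustration of that last point (my example, not from the posts above): for a non-convex f, a nonlinear solver applied to ∇f = 0 happily converges to a stationary point that is not a minimum, while an optimizer guided by f itself moves away from it.

using NLsolve, Optim

f(x) = x^4 - x^2                          # local maximum at 0, minima at ±1/√2
∇f!(F, x) = (F[1] = 4 * x[1]^3 - 2 * x[1])  # in-place gradient for nlsolve

nlsolve(∇f!, [0.1]).zero                                # ≈ [0.0], the local maximum
Optim.minimizer(optimize(x -> f(x[1]), [0.1], BFGS()))  # ≈ [0.707], a true minimum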
