Here’s what I checked: minimizing |x|² with a fixed step size of 0.1. The normal run finishes with no error, but if I make the target function return a constant (e.g. 0) while keeping the correct gradient, it terminates early even with a negative f_tol.
using Optim, LineSearches, LinearAlgebra
bfgs = BFGS(
    alphaguess = LineSearches.InitialStatic(alpha = 0.1),  # fixed initial step of 0.1
    linesearch = LineSearches.Static(),                    # no search: always take alphaguess
)
# do a proper minimization
optimize(
    x -> dot(x, x),  # objective |x|²
    x -> 2x,         # its gradient
    ones(10),
    bfgs,
    inplace = false,
).iterations # returns 181
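As a sanity check on the run above, Optim's results object exposes convergence accessors, so you can see which stopping criterion actually fired (a minimal sketch using the same setup; I'd expect the gradient-norm flag to be the true one here):

res = optimize(x -> dot(x, x), x -> 2x, ones(10), bfgs, inplace = false)
Optim.converged(res)    # overall convergence flag
Optim.g_converged(res)  # gradient-norm criterion
Optim.f_converged(res)  # objective-change criterion
Optim.x_converged(res)  # iterate-change criterion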
# make the target function always return 0, but keep the correct gradient;
# this doesn't work and terminates early even with f_tol = -1
optimize(
    x -> 0,
    x -> 2x,
    ones(10),
    bfgs,
    Optim.Options(f_tol = -1, allow_f_increases = true),
    inplace = false,
).iterations # returns 2
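My guess is that this is the relative f-change test firing: with f identically 0, |f_k − f_{k−1}| = 0 and f_tol · |f_k| = 0, so the check passes for any f_tol, even −1. The convergence flags should confirm that (a sketch, same setup as above):

res = optimize(x -> 0, x -> 2x, ones(10), bfgs,
    Optim.Options(f_tol = -1, allow_f_increases = true),
    inplace = false)
Optim.f_converged(res)  # expected true: 0 <= -1 * 0 passes the relative f-change test
Optim.g_converged(res)  # expected false: the gradient is nowhere near zero after 2 steps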
# the only workaround I've found: return a strictly decreasing fake objective
f₀ = 0
optimize(
    x -> (global f₀ -= 1),  # decremented on every call, so f never stagnates
    x -> 2x,
    ones(10),
    bfgs,
    inplace = false,
).iterations # returns 181
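If the global bothers you, the same trick works with a captured Ref instead (a sketch of the workaround, not part of the original test):

fcount = Ref(0)
optimize(
    x -> (fcount[] -= 1),  # strictly decreasing fake objective, no global state
    x -> 2x,
    ones(10),
    bfgs,
    inplace = false,
).iterations # should again return 181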