ForwardDiff evaluation of objective in Optim returns wrong result (objective uses DifferentialEquations pkg)

I am having a strange problem with ForwardDiff as used internally by Optim. The minimum returned by Optim does not match the value of the objective function evaluated at the minimizer returned by Optim; see the following MWE.

using DifferentialEquations, RecursiveArrayTools, Optim

function f(du,u,p,t)
  du[1] = p[1]*u[1] - u[1]*u[2]
  du[2] = -3*u[2] + u[1]*u[2]
end

u0 = [1.0;1.0]
tspan = (0.0,10.0)
p = [1.5]
prob = ODEProblem(f,u0,tspan,p)
tstops = range(0.,stop=10.,length=10)
sol = solve(prob,Tsit5(),saveat=tstops)

randomized = VectorOfArray([(sol(tstops[i]) + .01randn(2)) for i in 1:length(tstops)])
data = convert(Array,randomized)

function least_squares(x)
           _prob = remake(prob, u0=convert.(eltype(x),prob.u0),p=x)
           sol = solve(_prob,Tsit5(),saveat=tstops)
           sum((hcat(sol.u...) .- data).^2)
end

result = optimize(least_squares, [5.], Newton(),autodiff=:forward)

result.minimum # returns 275.97
least_squares(result.minimizer) # returns 276.68

Please read the first part of this post: PSA: make it easier to help you.

You should provide a minimal working example.

I have added a MWE, see the original post above.

It might be related to the DifferentialEquations pkg as well. I have not seen this in an optimization that does not involve solving differential equations.

Solving with dual numbers can change the stepping behavior: the adaptive error norm also does norm control on the derivative (dual) terms, so the solver can take different steps than in a plain solve, which in turn would cause this.
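If the discrepancy really comes from the dual-number solve stepping differently, one way to check this is to tighten the solver tolerances so that both solves converge to essentially the same trajectory. A sketch, assuming the MWE above (`least_squares_tight` is just an illustrative name, and the tolerance values are a guess):

```julia
# Sketch: with tight abstol/reltol, the dual-number solve inside ForwardDiff
# and the plain Float64 solve should take nearly identical steps, so the
# reported minimum and the re-evaluated objective should agree much more closely.
function least_squares_tight(x)
    _prob = remake(prob, u0=convert.(eltype(x), prob.u0), p=x)
    sol = solve(_prob, Tsit5(), saveat=tstops, abstol=1e-12, reltol=1e-12)
    sum((hcat(sol.u...) .- data).^2)
end

result = optimize(least_squares_tight, [5.], Newton(), autodiff=:forward)
# Compare: result.minimum vs. least_squares_tight(result.minimizer) —
# any remaining difference should be down at roundoff level.
```

The trade-off is a slower solve; for the optimization itself the looser tolerances may be fine, since the discrepancy only shows up when re-evaluating the objective outside the differentiated solve.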

Thanks for the hint!

I have switched to NLopt and likewise used ForwardDiff for the calculation of the gradient. However, I do not observe the same problem there.