LsqFit: strange DomainError

I am using LsqFit to fit data for a correlation function. I won't go into the details, but it is convenient for me to fit the logarithm of the correlation function.

using LsqFit
g(x, N, ξ, α, β) = log((exp(-(x-1)/ξ)+exp(-(N+1-x)/ξ)) + α) + β
model(x,p) = g.(xdata, 14, p[1], p[2], p[3])
fit = curve_fit(model, xdata, corfunc_data .|> log, [1.0, 0.1, 1.0]; lower = [0.01, 0.0, -Inf])

When I try to fit the data, I sometimes get a DomainError coming from the logarithm inside the g function: for some reason, its argument becomes negative.

The exponentials in the g function are always positive, so the only way to obtain a negative argument is with a negative α. However, I put a lower bound of zero on α. Why do I get a DomainError, then? Does LsqFit for some reason try values outside the specified bounds?

Maybe you get 0, for which, as far as I remember, the logarithm is also not defined.

You can also try setting show_trace = true to print all parameters during the optimization.
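For example (a sketch, assuming the model, xdata, and corfunc_data from the original post are in scope):

```julia
using LsqFit

# show_trace = true makes curve_fit print the parameter values at each
# iteration, which should reveal whether α ever goes negative.
fit = curve_fit(model, xdata, log.(corfunc_data), [1.0, 0.1, 1.0];
                lower = [0.01, 0.0, -Inf], show_trace = true)
```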

I think the lower bound is a valid parameter value according to the code.

No, log(0.0) does not throw a DomainError; it simply returns -Inf.
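This is easy to check at the REPL; only a negative argument throws:

```julia
julia> log(0.0)
-Inf

julia> log(-1.0)
ERROR: DomainError with -1.0:
log was called with a negative real argument but will only return a complex result if called with a complex argument. Try log(Complex(x)).
```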

I think the problem might be with the Jacobian estimation. If the Jacobian is not provided, it is approximated with a finite-difference method. My guess is that the finite-difference procedure does not respect the bounds and produces a negative argument.

It may be a good idea to add an option that restricts the finite-difference method to values inside the specified bounds as well.
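A minimal illustration of this suspected failure mode (the numbers are hypothetical: `1e-12` stands in for the nearly vanished exponentials at small ξ, `h` for the finite-difference step). A central difference at the lower bound α = 0 must evaluate the model at α - h < 0, which is exactly where a DomainError would come from:

```julia
h = 1e-6                  # typical finite-difference step size
f(α) = log(1e-12 + α)     # stand-in for the model's log(exp(...) + exp(...) + α)
α0 = 0.0                  # parameter sitting exactly on the lower bound

f(α0 + h)                 # fine: argument is positive
f(α0 - h)                 # DomainError: 1e-12 - 1e-6 < 0
```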

I will try it tomorrow, but I think my guess will prove to be correct.

ForwardDiff should work with log(0).

julia> ForwardDiff.jacobian((x)->[log(x[1])],[0.0])
1×1 Matrix{Float64}:
 Inf

julia> ForwardDiff.jacobian((x)->[log(x[1])],[0.1])
1×1 Matrix{Float64}:
 10.0

julia> ForwardDiff.jacobian((x)->[log(x[1])],[0.5])
1×1 Matrix{Float64}:
 2.0

Also, the xdata in the model function might be a typo and should be x:

model(x,p) = g.(x, 14, p[1], p[2], p[3])

I don’t know LsqFit, but in general I think constrained nonlinear solvers only promise that their final result satisfies the constraints; maybe that applies to box constraints too, and the solver peeks a bit outside your box in some intermediate steps. A simple check would be to instead fit the logs of alpha and beta, so that g contains exp(alpha) and exp(beta). That can obviously cause other problems and isn’t always advisable, but it doesn’t take much effort to try and might be illuminating about what’s going on.
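A sketch of this reparameterization, applied here to ξ and α (the two parameters with lower bounds in the original post); it assumes xdata and corfunc_data are in scope, and the starting values are just the logs of the original guess:

```julia
using LsqFit

g(x, N, ξ, α, β) = log(exp(-(x - 1) / ξ) + exp(-(N + 1 - x) / ξ) + α) + β

# p = (log ξ, log α, β): ξ = exp(p[1]) and α = exp(p[2]) are strictly
# positive for any real p, so no box constraints are needed at all.
model_exp(x, p) = g.(x, 14, exp(p[1]), exp(p[2]), p[3])

fit = curve_fit(model_exp, xdata, log.(corfunc_data), [log(1.0), log(0.1), 1.0])
```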

A second option, which I think is pretty common, is to check the constraints inside your objective and return a big number if they are violated. How best to do this is probably solver-dependent, but I’m sure you see what I mean.
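A sketch of this penalty idea (names and the penalty value are illustrative, not from LsqFit): wrap the model so that any parameter outside the box produces large residuals instead of a DomainError:

```julia
g(x, N, ξ, α, β) = log(exp(-(x - 1) / ξ) + exp(-(N + 1 - x) / ξ) + α) + β

function model_safe(x, p)
    ξ, α, β = p
    # Outside the box from the original post? Return huge "residuals"
    # so the solver is pushed back inside instead of crashing on log(<0).
    (ξ < 0.01 || α < 0.0) && return fill(1e10, length(x))
    g.(x, 14, ξ, α, β)
end
```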

As I said, this really is a problem with the finite-difference Jacobian estimation. I checked the stacktrace of the DomainError, and it contained a function named something like jacobian_finite_difference (up to the order of the words).

I passed autodiff = :forward to curve_fit, and the problem was solved.
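For reference, the full call then looks like this (assuming the model and data from the original post); with forward-mode automatic differentiation the Jacobian is computed exactly at the current parameters, so nothing is ever evaluated outside the bounds:

```julia
using LsqFit

fit = curve_fit(model, xdata, log.(corfunc_data), [1.0, 0.1, 1.0];
                lower = [0.01, 0.0, -Inf], autodiff = :forward)
```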