FiniteDiff.finite_difference_gradient changes its input when it errors?

I ran into a surprising thing today. Consider the function sumlog:

function sumlog(x)
    sum(log(i) for i in x)
end

I’m interested in taking the gradient of sumlog at x:

using FiniteDiff

x = [1e-6, 1e-6]

FiniteDiff.finite_difference_gradient(sumlog, x)
#DomainError with -5.05545445239334e-6:
#log will only return a complex result if called with a complex argument. Try log(Complex(x)).

Naturally it errors; what surprised me is that x has now changed:

x
#2-element Array{Float64,1}:
# -5.05545445239334e-6
#  1.0e-6

Is that expected? I’m using FiniteDiff v2.5.0.
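
In the meantime I'm guarding against this by passing a copy, so that whatever gets perturbed in place is the copy rather than my original array (just a defensive workaround, assuming the mutation comes from the call itself):

FiniteDiff.finite_difference_gradient(sumlog, copy(x))
# x is left untouched even if the call throws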

I think the error arises because the gradient is computed with a symmetric (central) difference quotient, and in doing so it evaluates the function at a point less than 0. Is there a way to request the gradient using a one-sided difference quotient instead (where the derivative is approximated using only points greater than x)?
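
To make concrete what I mean by a one-sided quotient, here is a minimal hand-rolled sketch (not using FiniteDiff; the name forward_diff_gradient and the step size h are just illustrative) that only evaluates at x and at x shifted upward in each coordinate, so it never steps below the original point:

# One-sided (forward) difference: g[i] ≈ (f(x + h*e_i) - f(x)) / h
function forward_diff_gradient(f, x; h=sqrt(eps(eltype(x))))
    fx = f(x)
    g = similar(x)
    for i in eachindex(x)
        xp = copy(x)      # perturb a copy so the input is never mutated
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    end
    return g
end

forward_diff_gradient(sumlog, [1e-6, 1e-6])
# roughly [1e6, 1e6]; some truncation error is visible because the entries
# of x are tiny relative to the absolute step h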

Can you open an issue on this?

Just did. Do you know if FiniteDiff can compute non-symmetric difference quotients?

Not for the gradient IIRC