Hi everyone,
I am currently looking to evaluate some numerical data and need a method similar to numpy.gradient. All the packages I have found seem to assume that I have some continuous function that I can evaluate at arbitrary points, whereas I only have a fixed array of function values to work with (i.e. x = Array{Float64,1}, y = Array{Float64,1}).
I have found an old topic (Is there a central difference/gradient function somewhere?) that discusses this, but I was not completely satisfied with the solutions discussed there and hope that something has changed since then.
Most packages seem to have advanced functions for differentiating matrices or higher-dimensional arrays, but not for simple 1-D arrays.
The closest I have come is this quick and dirty code of mine, inspired by one of the answers from that topic, which gives what I want for evenly spaced x arrays:
# returns the derivative of y with respect to x, with the same length as x
function deriv(y::AbstractVector, x::AbstractVector)
    function centraldiff(v::AbstractVector)
        dv = diff(v) / 2        # half of the successive differences
        a = [dv[1]; dv]         # duplicate the first element
        a .+= [dv; dv[end]]     # duplicate the last element and add, averaging adjacent differences
        return a
    end
    return centraldiff(y) ./ centraldiff(x)
end
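For reference, a quick sanity check on an evenly spaced grid behaves as I expect (the sine data below is purely illustrative, not my actual data):

x = collect(range(0, 2π; length = 201))   # evenly spaced sample points
y = sin.(x)                               # sampled function values
dy = deriv(y, x)                          # should approximate cos.(x)
maximum(abs.(dy .- cos.(x)))              # small, so the helper does what I want on uniform grids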
But for such a simple problem I would not expect to have to rely on my own clumsy implementation. It would be nice to be able to import a function that can do this for non-evenly spaced x as well.
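To make it concrete, here is a rough sketch of the kind of non-uniform central difference I have in mind (modelled on the second-order interior formula that numpy.gradient documents; nonuniform_gradient is just a placeholder name, and it assumes x is strictly increasing with at least two entries):

# Rough sketch: second-order central differences on a possibly non-uniform grid,
# with simple one-sided differences at the two endpoints.
function nonuniform_gradient(y::AbstractVector, x::AbstractVector)
    length(y) == length(x) || throw(DimensionMismatch("y and x must have the same length"))
    n = length(x)
    dydx = similar(y, Float64)
    dydx[1] = (y[2] - y[1]) / (x[2] - x[1])       # forward difference at the left edge
    dydx[n] = (y[n] - y[n-1]) / (x[n] - x[n-1])   # backward difference at the right edge
    for i in 2:n-1
        hl = x[i] - x[i-1]                         # spacing to the left neighbour
        hr = x[i+1] - x[i]                         # spacing to the right neighbour
        # weights from a Taylor expansion; they reduce to the usual central difference when hl == hr
        dydx[i] = (-hr / (hl * (hl + hr))) * y[i-1] +
                  ((hr - hl) / (hl * hr)) * y[i] +
                  (hl / (hr * (hl + hr))) * y[i+1]
    end
    return dydx
end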
Have I overlooked a suitable function in one of the many differentiation libraries, or is my problem somehow ill-posed, such that there is a much better workflow altogether? Computing values of y on demand is not numerically feasible in my case.
I hope you can help me with this problem!