Given a matrix where each row represents a point in 3D space and the columns represent the x, y, and z coordinates, what is the recommended way to calculate the gradient at each point (i.e. at each row)?
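For concreteness, a minimal sketch of the layout being described (the numbers are made up):

```julia
# Each row is one point; the columns are its x, y, and z coordinates.
points = [0.0  0.0  0.0;
          1.0  0.0  0.0;
          1.0  1.0  0.0;
          1.0  1.0  1.0]   # a 4×3 matrix: four points in 3D
```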
I’m not sure the concept of ‘gradient’ makes any sense in this context. If `points` is just a list of positions, what are you taking the gradient of?
The gradient represents the rate of change of a function or field with respect to position. But you only have the positions, not the ‘thing’ whose gradient you want to find.
The rate of change of position with respect to position is just constant.
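In symbols (a clarifying aside, not part of the original reply): the only map available here is the identity x ↦ x, and its Jacobian is the same at every point:

```latex
\frac{\partial x_i}{\partial x_j} = \delta_{ij}
\qquad\Longrightarrow\qquad
J = \nabla \mathbf{x} = I_{3\times 3} \quad \text{everywhere.}
```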
That’s a scheme for estimating the gradient, but what are you taking the gradient of? That is, where are the data values sampled at the points you have provided?
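To make the missing ingredient concrete, here is a minimal sketch of such a scheme, assuming you also have a vector `f` of values of some function sampled at the points, and that the rows trace out a path (both assumptions are mine, not the original poster’s):

```julia
using LinearAlgebra

# Estimate df/ds along a polyline of points (rows of `points`),
# given f[i] = f(points[i, :]). Assumes at least two points.
function gradient_along_path(points::AbstractMatrix, f::AbstractVector)
    n = length(f)
    g = similar(f, float(eltype(f)))
    for i in 1:n
        lo = max(i - 1, 1)                          # one-sided at the ends,
        hi = min(i + 1, n)                          # central in the interior
        ds = norm(points[hi, :] .- points[lo, :])   # arc-length step
        g[i] = (f[hi] - f[lo]) / ds
    end
    return g
end
```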
That looks very inefficient, allocating multiple arrays where only one should be needed. (Sorry, it’s too late at night for me to suggest a fix from my phone, but I’d suggest an array comprehension, or pre-allocate + loop, as sketched below. This is one case where Matlab-style vectorization is very suboptimal.)
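A sketch of what that fix might look like; the central-difference function itself is illustrative, since the thread’s actual code isn’t shown here:

```julia
# Pre-allocate one output array and fill it in a single loop, instead of
# allocating temporaries through vectorized slicing. `xs` holds samples on a
# uniform grid with spacing `h`; assumes at least two samples.
function central_diff(xs::AbstractVector, h::Real)
    n = length(xs)
    out = similar(xs, float(eltype(xs)))
    @inbounds for i in 2:n-1
        out[i] = (xs[i+1] - xs[i-1]) / (2h)   # central difference
    end
    out[1] = (xs[2] - xs[1]) / h              # one-sided at the ends
    out[n] = (xs[n] - xs[n-1]) / h
    return out
end

# Or, for the interior points only, a one-allocation array comprehension:
central_diff_interior(xs, h) = [(xs[i+1] - xs[i-1]) / (2h) for i in 2:length(xs)-1]
```

Either form allocates exactly one output array, which is what the comment above is getting at.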