Getting the types right is not so easy in the beginning. I'm trying to use the gradient() function on an Any array, but it does not work. Could someone help me with the type of the w array? Second, I want to compare the base gradient function with Forward.gradient from the ForwardDiff package, and I would be grateful if someone who knows this package could tell me what needs to be changed. Obviously, just changing gradient to Forward.gradient was not enough.
n = 100
p = 10
x = randn(n,p)'
y = sum(x[1:5,:],1) .+ randn(n)'*0.1
w = [0.0001*randn(1,p), 0]
loss(w,x,y) = sumabs2(y - (w[1]*x .+ w[2])) / size(y,2)
lossgradient = gradient(loss)
function train(w, data; lr=.1)
    for (x,y) in data
        dw = lossgradient(w, x, y)
        w[1] -= lr * dw[1]
        w[2] -= lr * dw[2]
    end
    return w
end
for i=1:50; train(w, [(x,y)]); println(loss(w,x,y)); end
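To make the type question concrete: w comes out as an Any array because it mixes a 1×p Float64 matrix with the integer 0. This is just a guess on my part at a more explicitly typed version (the 0.0 bias instead of 0 is my assumption, not something I have verified helps):

w = Any[0.0001*randn(1,p), 0.0]    # Vector{Any}: a 1×p weight matrix plus a Float64 bias

I don't know whether this is what gradient() actually expects, which is the core of my question.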
using ForwardDiff
lossgradient = Forward.gradient(loss)
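If I read the ForwardDiff docs right, ForwardDiff.gradient differentiates a function of a single real-valued vector at a given point, rather than wrapping a multi-argument function the way I tried above. So my current guess (untested, and f, wvec, g are just names I made up) is that I would have to flatten w into one Float64 vector and close over x and y:

using ForwardDiff

# Flatten the parameters into one Float64 vector: p weights followed by the bias.
wvec = vcat(vec(0.0001*randn(1,p)), 0.0)

# The loss as a function of that single vector, closing over the data x and y.
f(wv) = sumabs2(y - (reshape(wv[1:p], 1, p)*x .+ wv[end])) / size(y,2)

# Evaluate the gradient at the point wvec.
g = ForwardDiff.gradient(f, wvec)

Is that the intended way to handle a two-part w like mine, or is there a cleaner option?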