How to get the gradient of a NN w.r.t. its input?

Why do I get this error with this MWE?

using Flux

nn = Dense(5, 1)
static_input = [1, 2, 3, 4]
variable_input = 5
opt = Nesterov()
ps = Flux.Params(variable_input)
g = Flux.gradient(ps) do
    sum(nn(vcat(static_input, variable_input)))
end
ERROR: MethodError: no method matching copy(::Nothing)
Closest candidates are:
  copy(::Expr) at expr.jl:36
  copy(::Core.CodeInfo) at expr.jl:64
  copy(::BitSet) at bitset.jl:46
 [1] extract_grad!(::Int64) at C:\Users\Henri\.julia\packages\Tracker\SAr25\src\back.jl:82
 [2] gradient_(::getfield(Main, Symbol("##40#41")), ::Tracker.Params) at C:\Users\Henri\.julia\packages\Tracker\SAr25\src\back.jl:102
 [3] #gradient#24(::Bool, ::Function, ::Function, ::Tracker.Params) at C:\Users\Henri\.julia\packages\Tracker\SAr25\src\back.jl:164
 [4] gradient(::Function, ::Tracker.Params) at C:\Users\Henri\.julia\packages\Tracker\SAr25\src\back.jl:164
 [5] top-level scope at none:0

I’m a little confused by your MWE. Are you sure ps should be variable_input rather than the parameters of nn? And what do you want to update with this g?

I want to find the minimum of the curve the network approximates, using simple gradient descent on the input. It’s not for training the network, so yes. I could get that gradient with ReverseDiff.jl, but I was hoping to use Flux so I can reuse its built-in optimizers.
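For context, here is a rough sketch of what that input-optimization loop could look like once the gradient call works (this assumes the Tracker-era Flux API, i.e. that param, Flux.Params and Flux.Optimise.update! behave as in Flux 0.8; the update! form shown here for a single tracked array is an assumption, not taken from the thread):

using Flux
using Flux.Optimise: update!

nn = Dense(5, 1)
static_input = [1., 2, 3, 4]
variable_input = param([5.])           # the input we descend on, as a tracked array
opt = Nesterov()
ps = Flux.Params([variable_input])     # a collection containing the tracked array

for step in 1:100
    g = Flux.gradient(ps) do
        sum(nn(vcat(static_input, variable_input)))
    end
    # update the input, not nn's weights
    update!(opt, variable_input, g[variable_input])
end

The key point is that only variable_input is in ps, so the optimizer moves the input while the network stays fixed.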

Does this work as expected?

nn = Dense(5, 1)
static_input = [1., 2, 3, 4]
variable_input = param([5])
ps = Flux.Params(variable_input)
g = Flux.gradient(ps) do
    sum(nn(vcat(static_input, variable_input)))
end

I think it does: g.grads is an IdDict of length 1, so it appears to have computed something like what I expect. But this is supposed to return the gradient I want, isn’t it?

julia> g[variable_input]
ERROR: KeyError: key Tracker.Tracked{Array{Float64,1}}(0x00000000, Tracker.Call{Nothing,Tuple{}}(nothing, ()), true, [0.8417484760284424]) not found
 [1] getindex at ./abstractdict.jl:599 [inlined]
 [2] getindex at /home/dehaybe/.julia/packages/Tracker/m6d46/src/params.jl:39 [inlined]
 [3] getindex(::Tracker.Grads, ::TrackedArray{…,Array{Float64,1}}) at /home/dehaybe/.julia/packages/Tracker/m6d46/src/params.jl:43
 [4] top-level scope at REPL[14]:1

Replacing it with ps = Flux.Params([variable_input]) should fix the error.


Yes it does, thank you! So to sum up, I had to use both param and Params, and the argument to Params should be a collection of tracked objects (e.g. an array containing the tracked array), not the tracked array itself.
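Putting the thread’s fixes together, a minimal working version might look like this (again assuming the Tracker-era Flux API used throughout this thread):

using Flux

nn = Dense(5, 1)
static_input = [1., 2, 3, 4]
variable_input = param([5.])          # fix 1: the input must be a tracked array
ps = Flux.Params([variable_input])    # fix 2: Params takes a collection of tracked objects

g = Flux.gradient(ps) do
    sum(nn(vcat(static_input, variable_input)))
end

g[variable_input]   # the gradient of the output sum w.r.t. the input element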