Hi,
I’d like to take the gradient of a simple function using ReverseDiff, but I’ve stumbled upon an error that I haven’t been able to figure out. The code that throws the error is:
using ReverseDiff

# Piecewise potential for a single distance R: Inf for R ≤ b, the expression
# below for b ≤ R ≤ c, and zero otherwise.
function f_V(R, a, b, c, d, g)
    V = 0.
    if R ≤ b
        V = Inf
    elseif b ≤ R ≤ c
        R2 = R^2
        V = -a*(c^2 - R2)^g * ((R2 - d^2)/(R2 - b^2))
    end
    return V
end

# Sum the potentials over the columns selected by `indexer`, then normalise
# by the largest magnitude.
function f_U(R, indexer, arg_v...)
    n_data = size(R)[1]; n_dim = size(indexer)[2]
    Vref = f_V.(R, arg_v...)
    U = Matrix{Float64}(undef, n_data, n_dim)
    @simd for i = 1:n_dim
        Vsub = @view Vref[:, indexer[:, i]]
        U[:, i] = sum(Vsub, dims=2)
    end
    U = U./maximum(abs.(U))
    return U
end

R = rand(10, 3)
a = 1.; b = 1e-6; c = 2.; d = .9; g = 6.
indexer = [1 1 2; 2 3 3]

ReverseDiff.gradient(var -> f_U(R, indexer, var[1], b, c, d, g), [a])
The complete error trace is:
ERROR: LoadError: ArgumentError: Converting an instance of ReverseDiff.TrackedReal{Float64, Float64, Nothing} to Float64 is not defined. Please use `ReverseDiff.value` instead.
Stacktrace:
[1] convert(#unused#::Type{Float64}, t::ReverseDiff.TrackedReal{Float64, Float64, Nothing})
@ ReverseDiff C:\Users\beryl\.julia\packages\ReverseDiff\Y5qec\src\tracked.jl:261
[2] setindex!
@ .\array.jl:905 [inlined]
[3] macro expansion
@ .\multidimensional.jl:910 [inlined]
[4] macro expansion
@ .\cartesian.jl:64 [inlined]
[5] _unsafe_setindex!(::IndexLinear, ::Matrix{Float64}, ::Matrix{ReverseDiff.TrackedReal{Float64, Float64, Nothing}}, ::Base.Slice{Base.OneTo{Int64}}, ::Int64)
@ Base .\multidimensional.jl:905
[6] _setindex!
@ .\multidimensional.jl:894 [inlined]
[7] setindex!
@ .\abstractarray.jl:1315 [inlined]
[8] macro expansion
@ C:\Users\beryl\Documents\Coding\Python\pes\test.jl:20 [inlined]
[9] macro expansion
@ .\simdloop.jl:77 [inlined]
[10] f_U(::Matrix{Float64}, ::Matrix{Int64}, ::ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}, ::Vararg{Any})
@ Main C:\Users\beryl\Documents\Coding\Python\pes\test.jl:18
[11] (::var"#5#6")(var::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}})
@ Main C:\Users\beryl\Documents\Coding\Python\pes\test.jl:30
[12] ReverseDiff.GradientTape(f::var"#5#6", input::Vector{Float64}, cfg::ReverseDiff.GradientConfig{ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}})
@ ReverseDiff C:\Users\beryl\.julia\packages\ReverseDiff\Y5qec\src\api\tape.jl:199
[13] gradient(f::Function, input::Vector{Float64}, cfg::ReverseDiff.GradientConfig{ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}) (repeats 2 times)
@ ReverseDiff C:\Users\beryl\.julia\packages\ReverseDiff\Y5qec\src\api\gradients.jl:22
[14] top-level scope
@ C:\Users\beryl\Documents\Coding\Python\pes\test.jl:30
[15] include(fname::String)
@ Base.MainInclude .\client.jl:451
[16] top-level scope
@ REPL[1]:1
in expression starting at C:\Users\beryl\Documents\Coding\Python\pes\test.jl:30
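As far as I can tell from frames [2]–[8], the failing convert is triggered inside the @simd loop by the assignment U[:, i] = sum(Vsub, dims=2) (test.jl:20) writing tracked values into the Float64 buffer; this stripped-down snippet (my attempt to isolate it, not my actual code) throws the same ArgumentError:

using ReverseDiff

# Guess at the trigger: assigning a TrackedReal into a plain Float64 array
# goes through convert(Float64, ::TrackedReal), which raises the error above.
buf = Matrix{Float64}(undef, 1, 1)
ReverseDiff.gradient(x -> (buf[1, 1] = x[1]^2; buf[1, 1]), [1.0])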
The only thing that worked was taking the gradient of f_V directly, which makes sense since f_V is technically the only function being differentiated:
ReverseDiff.gradient(var -> f_V(R[1,1], var[1], b, c, d, g), [a]) #<<--- works correctly
However, I’d like to run the gradient routine through f_U (actually through a function that calls f_U; from narrowing the error down, it seems to come from f_U), since the gradient will be used as the derivative fed into Optim, with [a, b, c, d] as (a subset of) the tuning parameters. Roughly, the intended setup looks like the sketch below (`target` and `loss` are placeholders, not part of my actual code):
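using ReverseDiff, Optim

# Sketch of the intended usage only: `target` stands in for the data U will be
# fitted against and `loss` is a placeholder objective. The point is that the
# gradient of `loss` has to go through f_U.
target = rand(10, 3)
loss(p) = sum(abs2, f_U(R, indexer, p[1], p[2], p[3], p[4], g) .- target)
g!(grad, p) = ReverseDiff.gradient!(grad, loss, p)

Optim.optimize(loss, g!, [a, b, c, d], LBFGS())

So I’d like to solve this error. Thanks.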