Problem with ForwardDiff not letting me update variable "d"

Here is a watered down version of code I am using

import ForwardDiff

a = zeros(2393)
a = findall(x -> x == 0, a)

b = zeros(293)

e = rand(2979)

l = zeros(137,293)

g = ones(137,293)

h = rand(137,170)


function dummy(e::T...) where {T}
    
    f = zeros(T,49810)
    
    f[a] .= e[587:end]
    
    f = reshape(f, 293, 170)
    
    f = transpose(f)
    
    c = zeros(T,137, 293)
    
    m = 1.0./(1.0 .- (l * transpose(f)))
        
    n = ((h.*m)*f).*g
    
    d = b
    
    for i in 1:137
        
        for j in 1:293
            
            c[i,j] = m[j]
            d[j] = c[i,j]
        end
    end
    
end

ForwardDiff.gradient(e -> dummy(e...),e)

I get this error:

MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12})
Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:200
  (::Type{T})(::T) where T<:Number at boot.jl:760
  (::Type{T})(::AbstractChar) where T<:Union{AbstractChar, Number} at char.jl:50
  ...

Stacktrace:
  [1] convert(#unused#::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12})
    @ Base ./number.jl:7
  [2] setindex!(A::Vector{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}, i1::Int64)
    @ Base ./array.jl:839
  [3] dummy(::ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}, ::Vararg{ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}, N} where N)
    @ Main ./In[78]:38
  [4] #97
    @ ./In[79]:1 [inlined]
  [5] chunk_mode_gradient(f::var"#97#98", x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}}})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:150
  [6] gradient(f::Function, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}}}, ::Val{true})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:21
  [7] gradient(f::Function, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#97#98", Float64}, Float64, 12}}}) (repeats 2 times)
    @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:17
  [8] top-level scope
    @ In[79]:1
  [9] eval
    @ ./boot.jl:360 [inlined]
 [10] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
    @ Base ./loading.jl:1116

Line 38 is as follows:

d[j] = c[i,j]

Your c is of type Array{ForwardDiff.Dual}, but you initialise your d to be an array of Float64s (even globally, since d = b makes it just an alias for your global b).
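In miniature, it is the same problem as doing

import ForwardDiff

x = zeros(2)                        # a Vector{Float64}
x[1] = ForwardDiff.Dual(1.0, 1.0)   # MethodError: no method matching Float64(::ForwardDiff.Dual{...})

i.e. setindex! tries to convert the Dual into a Float64 and fails, which is exactly frames [1] and [2] of your stack trace.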

You already have the dummy function generic in T, so you could use that to initialise d instead of setting it to b.
Presumably this is only due to minimising your problem, but your dummy also currently does not have a return statement.

Here is an adjustment to the code:

import ForwardDiff
a = zeros(2393)
a = findall(x -> x == 0, a)

b = rand(293)

e = rand(2979)

l = zeros(137,293)

g = ones(137,293)

h = rand(137,170)


function dummy(e::T...) where {T}
    
    f = zeros(T,49810)
    
    f[a] .= e[587:end]
    
    f = reshape(f, 293, 170)
    
    f = transpose(f)
    
    c = zeros(T,137, 293)
    
    m = 1.0./(1.0 .- (l * transpose(f)))
        
    n = ((h.*m)*f).*g
    
    d = zeros(T,293)
    
    d .= b
    
    for i in 1:137
        
        for j in 1:293
            
            c[i,j] = m[j]
            d[j] = c[i,j]
        end
    end
    
end

I changed b to the following:

b = rand(293)

I changed d to the following:

d = zeros(T,293)

When I run the following command:

ForwardDiff.gradient(e -> dummy(e...),e)

I get the following error:

MethodError: no method matching zero(::Nothing)
Closest candidates are:
  zero(::Union{Type{P}, P}) where P<:Dates.Period at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/Dates/src/periods.jl:53
  zero(::ForwardDiff.Partials) at /home/ubuntu/.julia/packages/ForwardDiff/QdStj/src/partials.jl:39
  zero(::SparseArrays.AbstractSparseArray) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/SparseArrays/src/SparseArrays.jl:55
  ...

Stacktrace:
 [1] partials
   @ ~/.julia/packages/ForwardDiff/QdStj/src/dual.jl:100 [inlined]
 [2] partials
   @ ~/.julia/packages/ForwardDiff/QdStj/src/dual.jl:105 [inlined]
 [3] extract_gradient_chunk!(#unused#::Type{ForwardDiff.Tag{var"#117#118", Float64}}, result::Vector{Nothing}, dual::Nothing, index::Int64, chunksize::Int64)
   @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:86
 [4] chunk_mode_gradient(f::var"#117#118", x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12}}})
   @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:152
 [5] gradient(f::Function, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12}}}, ::Val{true})
   @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:21
 [6] gradient(f::Function, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#117#118", Float64}, Float64, 12}}}) (repeats 2 times)
   @ ForwardDiff ~/.julia/packages/ForwardDiff/QdStj/src/gradient.jl:17
 [7] top-level scope
   @ In[91]:1
 [8] eval
   @ ./boot.jl:360 [inlined]
 [9] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
   @ Base ./loading.jl:1116

Hm, this is a bit of guesswork on my side, but the error states that the result (in [3]) is a vector of Nothings, maybe because your function returns nothing?
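For comparison, gradient works as soon as the function returns a real number; as a sanity check (unrelated to your code):

import ForwardDiff

ForwardDiff.gradient(x -> sum(abs2, x), rand(3))   # gradient of the sum of squares, i.e. 2 .* x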

Also, your d .= b will probably error afterwards, since the element types of b and d do not agree.

To help further: I am sorry, but I am missing quite a bit of context here, since it is just a snippet of code and an error message, and your dummy function returns nothing. Without any details on what you aim to achieve, I can sadly not do much more than point out single technical errors.

Oh you know what, I’m dumb.

I wasn’t returning a value, which is why the gradient was giving that error.
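For reference, the fix is basically just adding a return at the end of dummy, e.g. like this, with the same setup as above (sum(n) + sum(d) is only a stand-in here for my real objective):

function dummy(e::T...) where {T}
    f = zeros(T, 49810)
    f[a] .= e[587:end]
    f = reshape(f, 293, 170)
    f = transpose(f)
    c = zeros(T, 137, 293)
    m = 1.0 ./ (1.0 .- (l * transpose(f)))
    n = ((h .* m) * f) .* g
    d = zeros(T, 293)
    d .= b
    for i in 1:137
        for j in 1:293
            c[i, j] = m[j]
            d[j] = c[i, j]
        end
    end
    # return a real scalar so ForwardDiff.gradient has a value to differentiate
    return sum(n) + sum(d)
end

ForwardDiff.gradient(e -> dummy(e...), e)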

Thank you for your help, @kellertuer

For context, I had an objective function that was failing to be registered in JuMP. So to further diagnose the problem, I went in and tried to get a dummy function (with the same error) to work in ForwardDiff.

Ah, I would not use such a word about yourself, but from the error message my candidate was indeed the result/return value of your dummy.

But good that the tips helped :slight_smile: Just check carefully how to best set up d; maybe initialising it to zero is enough, since you set all its values later anyway. Also, I just noticed that you are currently overwriting d 137 times, since the assignment is in the inner for loop but independent of the outer one, but maybe that is also only because you tried to minimise your case.
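To sketch what I mean with those two points, inside dummy you could do something like the following (assuming d should in the end just hold the last row of c, since I do not know your real computation):

d = zeros(T, 293)            # element type follows the input, no need to start from b
for i in 1:137
    for j in 1:293
        c[i, j] = m[j]
    end
end
d .= view(c, 137, :)         # fill d once after the loops instead of 137 times inside them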