ReverseDiff.jl gives zeros when using fill

Hey,

the following MWE:

function g(x)
    out = fill(0.0, size(x))   # element type is Float64, fixed by the literal 0.0
    for i = 1:size(x, 1)
        out[i, :] .= x[i, :]
    end
    return out
end
ReverseDiff.gradient(x -> sum(g(x)), randn((5, 5)))

which returns

5×5 Array{Float64,2}:
 0.0  0.0  0.0  0.0  0.0
 0.0  0.0  0.0  0.0  0.0
 0.0  0.0  0.0  0.0  0.0
 0.0  0.0  0.0  0.0  0.0
 0.0  0.0  0.0  0.0  0.0
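Just to note what fill does here (a quick check with plain arrays, no ReverseDiff needed): the element type of out is fixed by fill's first argument, regardless of the element type of x.

```julia
x = Float32.(randn(5, 5))   # any element type other than Float64
out = fill(0.0, size(x))    # fill takes its element type from 0.0
eltype(out)                 # Float64, no matter what eltype(x) is
```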

Using out = zeros(eltype(x), size(x)) instead of the fill works as expected:

function g(x)
    out = zeros(eltype(x), size(x))   # element type follows x
    for i = 1:size(x, 1)
        out[i, :] .= x[i, :]
    end
    return out
end
ReverseDiff.gradient(x -> sum(g(x)), randn((5, 5)))

5×5 Array{Float64,2}:
 1.0  1.0  1.0  1.0  1.0
 1.0  1.0  1.0  1.0  1.0
 1.0  1.0  1.0  1.0  1.0
 1.0  1.0  1.0  1.0  1.0
 1.0  1.0  1.0  1.0  1.0
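To illustrate why the element type could matter, here is a minimal sketch — Tracked below is a made-up wrapper type standing in for an AD-tracked number, not ReverseDiff's actual TrackedReal — showing how assigning into a Float64 buffer silently strips the wrapper, while a buffer whose element type follows x keeps it:

```julia
# Tracked is a hypothetical stand-in for an AD-tracked number.
struct Tracked <: Real
    value::Float64
end
Base.Float64(t::Tracked) = t.value
Base.convert(::Type{Float64}, t::Tracked) = t.value  # this is where tracking would be lost
Base.zero(::Type{Tracked}) = Tracked(0.0)

x = [Tracked(1.0), Tracked(2.0)]

out_fill  = fill(0.0, size(x))         # Array{Float64}: eltype fixed by the 0.0
out_zeros = zeros(eltype(x), size(x))  # Array{Tracked}: eltype follows x

out_fill[1]  = x[1]   # stored as a plain Float64 — wrapper stripped on assignment
out_zeros[1] = x[1]   # stored as a Tracked value

eltype(out_fill), eltype(out_zeros)    # (Float64, Tracked)
```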

Is this expected behavior in ReverseDiff that I'm not aware of, or a bug?

Thanks,

Felix

Ah, it’s getting clearer, I think:

function g(x)
    out = fill(eltype(x)(0), size(x))
    for i = 1:size(x, 1)
        out[i, :] .= x[i, :]
    end
    return out
end
ReverseDiff.gradient(x -> sum(g(x)), randn((5, 5)))

This also works, even though for an ordinary Float64 input my first fill expression and this one would produce the same array type.
So the reason must be something about the connection between x and out, right?
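If that diagnosis is right, it comes down to fill's element type being fixed by its argument: fill(0.0, ...) is always Float64, while fill(eltype(x)(0), ...) and zeros(eltype(x), ...) follow x. A quick check with a non-Float64 element type standing in for the tracked input:

```julia
x = Int32[1 2; 3 4]              # stand-in: any element type other than Float64
a = fill(0.0, size(x))           # Array{Float64} — ignores eltype(x)
b = fill(eltype(x)(0), size(x))  # Array{Int32}  — follows eltype(x)
c = zeros(eltype(x), size(x))    # Array{Int32}  — same as b
eltype(a), eltype(b), eltype(c)  # (Float64, Int32, Int32)
```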