ReverseDiff method error in multiplication

Hi all,

I have a question about a MethodError in ReverseDiff.jl that arises in my function when calling * (the line p/g + p*(a-am) in my MWE below) while using ReverseDiff to calculate the Hessian matrix. I’ve checked that ForwardDiff works correctly for this problem (and using ReverseDiff to compute just the gradient works too, FWIW). I’d really appreciate help interpreting the details behind the error and any suggestions for how to fix it.

My deliberately simplified MWE is below. Thanks everyone.

[EDIT]: I’m on Julia 1.8.3 with the latest versions of these packages (from the General registry).

using ForwardDiff
using ReverseDiff
using Distributions

log_ω::Float64 = -4.0
μg = -0.2
log_σg = log(0.1)
μp = 0.2
log_σp = log(0.1)

n = 10
a::Vector{Int64} = rand(Poisson(10), n)
log_g::Vector{Float64} = rand(Normal(μg, exp(log_σg)), n)
log_p::Vector{Float64} = rand(Normal(μp, exp(log_σp)), n)

function mwe(log_ω, log_g, log_p, a)
    ω = exp(log_ω)
    p = exp.(log_p)
    g = exp.(log_g)
    am = @. log(p/(g*ω))/g   # crossover point where the two branches meet
    μ = map(a, am, g, p) do a, am, g, p
        if a < am
            ω * exp(g*a)
        else
            p/g + p*(a-am)   # the * in this line is where the MethodError arises
        end
    end
    return sum(μ)
end

ForwardDiff.hessian(x -> mwe(log_ω, log_g, x, a), log_p)   # works
ReverseDiff.hessian(x -> mwe(log_ω, log_g, x, a), log_p)   # throws the MethodError

# just the gradients work fine
ForwardDiff.gradient(x -> mwe(log_ω, log_g, x, a), log_p)
ReverseDiff.gradient(x -> mwe(log_ω, log_g, x, a), log_p)

I’ve found where the problem is but am still no closer to understanding why. The problem seems to be the multiplication p*(a-am) in the line where the error arises. In fact, it can be simplified to p*am and it will give the same MethodError. So something odd is happening when multiplying p, which is the tracked input, by am, which is created from the tracked input.
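
To show exactly what I mean, here is the same MWE with only that one line simplified (mwe_simplified is just my name for this reduced version); calling ReverseDiff.hessian on it still errors in the same way, while the gradient still works:

function mwe_simplified(log_ω, log_g, log_p, a)
    ω = exp(log_ω)
    p = exp.(log_p)
    g = exp.(log_g)
    am = @. log(p/(g*ω))/g
    μ = map(a, am, g, p) do a, am, g, p
        if a < am
            ω * exp(g*a)
        else
            p*am   # simplified from p/g + p*(a-am); gives the same MethodError
        end
    end
    return sum(μ)
end

ReverseDiff.hessian(x -> mwe_simplified(log_ω, log_g, x, a), log_p)   # same MethodError
ReverseDiff.gradient(x -> mwe_simplified(log_ω, log_g, x, a), log_p)  # still fine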