Reduce type-inference time when using nested ForwardDiff

Hi Julia wizards,

I am computing derivatives with respect to different arguments of a function. Unfortunately, in my actual use case the compilation time is very high due to type inference. I have a quick fix that reduces it, but it does not remedy the problem completely.

Consider the following MWE:

using ProfileView
using BenchmarkTools
using ForwardDiff

const D = ForwardDiff.derivative

function func(p::Vector{T1}, v1::T2, v2::T3) where {T1 <: Real, T2 <: Real, T3 <: Real}
    out_type = typeof(p[1] * v1 * v2)   # bad fix: compute the common (nested Dual) type
    v1 = convert(out_type, v1)          # bad fix: promote v1
    v2 = convert(out_type, v2)          # bad fix: promote v2
    p = convert(Vector{out_type}, p)    # bad fix: allocates a converted copy of p

    p[1] * v2 * (p[2] - v2) - p[3] * v1^p[4] * log(v1) + p[4]^2 * v1^2 * v2^3
end

p_orig = [0.436, 0.105, 0.436, 1.01]
v1 = rand()
v2 = rand()

# a simple derivative w.r.t. one variable
# @profview D(v1_i -> func(p_orig, v1_i, v2), v1)
@time D(v1_i -> func(p_orig, v1_i, v2), v1)

# gradient w.r.t. p of the mixed second derivative above
@time ForwardDiff.gradient(p_i -> D(v1_i -> D(v2_i -> func(p_i, v1_i, v2_i), v2), v1), p_orig)
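
To see the types concretely, one can wrap func and print what it receives inside the nested call (a quick sketch; show_types is just a throwaway helper I use for illustration, and the tag types in the output are whatever ForwardDiff generates for each derivative level):

# sketch: print the concrete types func receives in the nested call
function show_types(p, v1, v2)
    println(typeof(p))              # e.g. Vector{Dual{<gradient tag>, Float64, 4}}
    println(typeof(v1))             # e.g. Dual{<v1 tag>, Float64, 1}
    println(typeof(v2))             # e.g. Dual{<v2 tag>, Float64, 1}
    println(typeof(p[1] * v1 * v2)) # e.g. the nested Dual stored in out_type
    func(p, v1, v2)
end

ForwardDiff.gradient(p_i -> D(v1_i -> D(v2_i -> show_types(p_i, v1_i, v2_i), v2), v1), p_orig)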

I think the expensive type inference happens because T1, T2, and T3 are all different concrete types. I would like to help the compiler by pre-converting all values to a common Dual type, but I cannot determine that type without the small throwaway computation stored in out_type. If I use promote_type instead, I get Real, which is of course correct, but being abstract it does not solve the problem. Maybe this is due to the Tags in the Dual type? How can I safely convert all the arguments to the same type and depth of Dual, with the appropriate Tags?
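
To make that last question concrete, by "depth" I mean nesting like the following hand-built example (a sketch; the symbols :v1 and :v2 are stand-ins for the Tag types that ForwardDiff creates internally):

using ForwardDiff: Dual

# hand-built illustration of Dual nesting depth (tags are stand-in symbols)
x  = 0.5
d1 = Dual{:v1}(x, 1.0)        # Dual{:v1, Float64, 1} -- depth 1
d2 = Dual{:v2}(d1, one(d1))   # Dual{:v2, Dual{:v1, Float64, 1}, 1} -- depth 2
typeof(d2)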
Thanks in advance :slight_smile: