Finding types of combined variables to initialise arrays

I recently removed all concrete types from my code with the goal of using autodiff for gradients. In doing so I was faced with a difficulty: initialising arrays with a type inferred from the input parameters. I worked around this by constructing dummy variables to infer the type, and it seems to work, but it feels quite clunky, so I was hoping to get some tips (typs?) on idiomatic patterns for this kind of problem.

For concreteness, consider this example:


using LinearAlgebra

# this helper function updates the existing matrix A in place
function blind_helper!(A, x, y)
    k = 0.1im + pi/12
    A[1,1] = exp(k * x[1] * y[1])
    return A
end

function dummy(x, y)
    # what is the type of the arrays to initialise?
    proto_x = x[1]
    proto_y = y[1]
    proto_α = 0.1 + 0.1im # dummy internal variable, complex
    T1 = typeof(proto_x * imag(proto_α * proto_y)) # e.g. Float64 in normal use
    T2 = typeof(proto_x * proto_α * proto_y)       # e.g. Complex{Float64} in normal use

    # initialise an identity matrix with the Complex{"something"} type
    A = Matrix{T2}(I, 3, 3)
    results = Vector{T1}(undef, 3) # array of results, e.g. Float64 in normal use

    # update some matrix elements that depend on x and y
    blind_helper!(A, x, y)
    for i in 1:3
        results[i] = imag(dot(A[:, i], A[:, 1]))
    end

    return results
end

dummy([1.0, 2.0], [3.0, 4.0])

# but then I might need to run it through AD, which will feed Dual types

using ForwardDiff

function cost(p)
    res = dummy(p, [3.0, 4.0])
    return res[1]
end

grad = ForwardDiff.gradient(cost, [0.0, 0.0])

The part around

    proto_x = x[1]
    proto_y = y[1]
    proto_α = 0.1 + 0.1im # dummy internal variable, complex
    T1 = typeof(proto_x * imag(proto_α * proto_y)) # e.g. Float64 in normal use
    T2 = typeof(proto_x * proto_α * proto_y)       # e.g. Complex{Float64} in normal use

is something I came up with for sheer lack of better ideas, and I would imagine there are cleaner ways to deal with this kind of problem? (typeof() is basically the only function I know here, but I imagine there are compositions around it, such as a combined_typeof(x, y), a type_of_complexthing(Complex{thing}), etc.?)

From your example I would say that

T1 = eltype(x)
T2 = Complex{eltype(y)}

I don’t understand what “Dual types” are, so I can’t imagine what p can be.
Perhaps you need typejoin, e.g.:

julia> typejoin(Float64,Int)
Real

julia> Complex{typejoin(eltype(x),eltype(y))}
ComplexF64 (alias for Complex{Float64})

but I am speculating.


Thanks! Re: Dual types, I mean something like ForwardDiff.Dual.
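
For context, here is a rough sketch of what such a value looks like (constructing a Dual by hand like this is purely for illustration; ForwardDiff normally builds these internally while differentiating):

using ForwardDiff

# hand-made Dual, only to show the type that would flow through dummy() under AD
d = ForwardDiff.Dual(1.0, 1.0)   # value 1.0 carrying one partial derivative
typeof(d)                        # ForwardDiff.Dual{Nothing, Float64, 1}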

Generally promote_type will work well in this kind of circumstance:

julia> promote_type(Float64, Float64)
Float64

julia> promote_type(Float64, ForwardDiff.Dual{:tag, Float64, 2})
ForwardDiff.Dual{:tag, Float64, 2}

promote_type uses the same promotion machinery that many other operations use, as discussed here: Conversion and Promotion · The Julia Language

There are also some fancier things like promote_op which may be useful in specific cases:

# Promotion of two `Int64`s just gives another `Int64`...
julia> promote_type(Int64, Int64)
Int64

# ... but division of integers gives a floating point value, 
# and `promote_op` can show us:
julia> Base.promote_op(/, Int64, Int64)
Float64
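
Applied to the expression inside dummy, that could look something like this (a sketch, assuming plain Float64 inputs; under AD the argument types would be Dual instead):

# the type that proto_x * proto_α * proto_y would have, without building dummy values:
julia> Base.promote_op(*, Float64, ComplexF64, Float64)
ComplexF64 (alias for Complex{Float64})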

but I would stick with promote_type until you have some specific need.
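
For example, a minimal sketch of how the dummy-variable block in dummy might be replaced with promote_type (keeping the T1/T2 names from the original post; this is a suggestion, not tested against your full code):

# combine the element types of the inputs...
Tx = promote_type(eltype(x), eltype(y))   # e.g. Float64, or a Dual type under ForwardDiff
# ...and absorb the complex internal constant
T2 = promote_type(Tx, ComplexF64)         # e.g. ComplexF64, or Complex{<:Dual} under AD
T1 = real(T2)                             # the matching real type, e.g. Float64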
