Inferring the return type of an operation on two variables

Suppose I have two vectors: A::Vector{Type1} and B::Vector{Type2}. I want to create a vector C::Vector{Type3} where Type3 is the type of a * b, where a and b are elements of A and B, respectively.

A basic example of this is A::Vector{Int64} = [1, 2, 3], B::Vector{Float64} = [1.0, 2.0, 3.0], C::Vector{Float64} = A .* B. In this case, Julia is able to infer the type of A .* B, but in my case each element of C is created by a more complicated series of operations, so my intended approach is to do C = Vector{Type3}(undef, num_elements), then fill C by doing the necessary operations.
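
To make the intended pattern concrete, here is a minimal sketch, where complicated_op is a hypothetical stand-in for the real series of operations and Type3 is hard-coded for now:

complicated_op(a, b) = a * b               # placeholder for the real computation
A = [1, 2, 3]
B = [1.0, 2.0, 3.0]
Type3 = Float64                            # the type I would like to determine up front
C = Vector{Type3}(undef, length(A))
for i in eachindex(A, B)
    C[i] = complicated_op(A[i], B[i])
end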

How can I determine Type3? I would like to do this without performing any arithmetic operations. For example, if A and B are vectors whose elements are large matrices, I wouldn’t want to perform a matrix-matrix multiplication just to know what the type of the result is.

It seems Base.promote_op does what I need (e.g. Type3 = Base.promote_op(*, eltype(A), eltype(B)), since it takes types rather than values), but the documentation for the function says

  │ Warning
  │
  │  Due to its fragility, use of promote_op should be avoided. It is preferable to base the
  │  container eltype on the type of the actual elements. Only in the absence of any elements
  │  (for an empty result container), it may be unavoidable to call promote_op.

In this case my result container will not be empty, so is there some other function or technique I should be using?

Technically, the only way to determine a Type3 that stays correct in perpetuity is to specify it manually when preallocating, then handle any conversions or type errors during the computation. Automatic output type inference is inherently dynamic. Consider Type1 = Number and Type2 = Number. A not-so-clean way to compute an element type wide enough for every known * method is:

julia> Union{Base.return_types(*, Tuple{Number, Number})...}
Any

And this matches what promote_op itself gets from compiler inference, though I don’t know if that holds generally:

julia> Base.promote_op(*, Number, Number)
Any

It’s quite unfortunate that we can’t even infer a Number output in all cases; that’s just how generic Julia’s methods can be. But what happens when we actually run the computation on instances?

julia> Number[1 2] .* Number[3, 4]
2×2 Matrix{Int64}:
 3  6
 4  8

julia> Number[1 2.1] .* Number[3, 4]
2×2 Matrix{Real}:
 3  6.3
 4  8.4

The element type diverges from the compiler-inferred types and is narrowed based on the actual elements. That’s what promote_op’s docstring means by being a mere upper bound. Julia is a dynamic language after all; leveraging runtime information is par for the course. I’m a little surprised that promote_op is even in the Manual, given that it’s internal and has so many caveats, but maybe generic matrix operations were important enough to warrant saying something for developers. I wouldn’t use it for general purposes.
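
If it helps to see that narrowing explicitly, here is a rough sketch of the mechanism using the internal Base.promote_typejoin, so treat it as illustrative rather than something to rely on: collect the actual results into an Any container, then join their concrete types.

julia> results = Any[a * b for (a, b) in zip(Number[1, 2.1], Number[3, 4])];

julia> mapreduce(typeof, Base.promote_typejoin, results)
Real

That Real is the same element type the broadcast above settled on, computed from the runtime values rather than from inference.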

In practice, concrete input types and an inferrable method do give you a return type that stays reliably concrete over time, because it’s incredibly unusual and undesirable for the compiler to regress to an abstract type. Basic operations on primitive (or primitive-like) types are usually even more straightforward by design and are expected to fit promote_type regardless of the actual function; regressions on the BLAS-supported types (Float32, Float64, and their Complex counterparts) would be especially disastrous. If you really don’t want to specify output types manually, you could leverage these facts.
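
For example, a sketch of that last idea, assuming the element types are concrete and promote_type-friendly (Int64, Float64, ComplexF64, and the like):

A = [1, 2, 3]                               # Vector{Int64}
B = [1.0, 2.0, 3.0]                         # Vector{Float64}
Type3 = promote_type(eltype(A), eltype(B))  # Float64
C = Vector{Type3}(undef, length(A))
for i in eachindex(A, B)
    C[i] = A[i] * B[i]   # assignment converts, and errors if Type3 turns out too narrow
end

If the real computation involves more than *, the same preallocation works as long as each result still fits the promoted type; otherwise you’re back to specifying Type3 by hand.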