Type promotion of Vectors

I believe this might be considered a bug in type promotion, but I wanted to ask here first before reporting an issue. In Julia v1.2.0 and v1.3.0-rc4.1, I see the following behavior:

julia> promote_type(ComplexF64,Float32)
Complex{Float64}

julia> promote_type(Vector{ComplexF64},Vector{Float32})
Array{Complex{Float64},1}

as I would expect. However, I also see this:

julia> promote_type(ComplexF32,Float64)
Complex{Float64}

julia> promote_type(Vector{ComplexF32},Vector{Float64})
Array{T,1} where T

even though I would expect Array{Complex{Float64},1}. This also seems to happen with other numeric types:

julia> promote_type(Complex{Int32},Int64)
Complex{Int64}

julia> promote_type(Vector{Complex{Int32}},Vector{Int64})
Array{T,1} where T

julia> promote_type(ComplexF16,Float64)
Complex{Float64}

julia> promote_type(Vector{ComplexF16},Vector{Float64})
Array{T,1} where T

It seems like the promotion rule should be:

promote_rule(::Type{Array{T,N}},::Type{Array{S,N}}) where {T,S,N} = Array{promote_type(T,S),N}

but that doesn’t seem to be the case here.

Any idea if this is expected behavior or just a missing promote_rule?

-Matt

That promotion would entail copying entire arrays and converting to a different type, which can be an arbitrarily large amount of work and require an arbitrary amount of extra memory for the copy. In general, the system should never do that kind of conversion or copying “behind your back”, only when you explicitly ask for it.
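
If it helps explain the asymmetry you’re seeing: Base does have a promotion rule for arrays, but (roughly paraphrasing the source from memory) it goes through an internal helper, el_same, that only keeps the promotion when the promoted element type already equals one of the input element types, i.e. when no conversion would be needed:

promote_rule(::Type{Array{T,N}}, ::Type{Array{S,N}}) where {T,S,N} =
    el_same(promote_type(T, S), Array{T,N}, Array{S,N})

That’s why promote_type(Vector{ComplexF64}, Vector{Float32}) gives Vector{ComplexF64} (the result is already one of the inputs), while promote_type(Vector{ComplexF32}, Vector{Float64}) falls back to Array{T,1} where T.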

Thanks for the reply. That argument seems to make an assumption about where promote_type is being used (so perhaps I am using it incorrectly and there is a better way to do what I am trying to do).

In my use case, I have a custom AbstractArray type that uses a Vector as data, for example:

struct MyArray{T,N} <: AbstractArray{T,N}
  data::Vector{T}      # flat storage
  inds::NTuple{N,Int}  # dimensions
end

I want to perform binary operations on these, for example a tensor contraction. I need a generic way to determine the output type of the result, so I was hoping to use type promotion rules on the type of the data. Clearly Julia itself handles this kind of situation, for example:

julia> A = randn(ComplexF32,2,2)
2×2 Array{Complex{Float32},2}:
  1.06398-0.592462im  0.124138+0.386911im
 0.234007+1.13804im   -1.49845-0.844161im

julia> B = randn(Float64,2,2)
2×2 Array{Float64,2}:
  1.16284    0.981459  
 -0.299988  -0.00321795

julia> A*B
2×2 Array{Complex{Float64},2}:
     1.2-0.805005im   1.04386-0.582722im
 0.72163+1.5766im    0.234491+1.11966im 

Is there a better pattern to use here besides promote_type? Note that the storage may not always be Vector; in general it will be an AbstractVector. If it were always Vector, I could simply promote the element types of the input tensors (maybe that is the better way to go, but it seemed nicer not to assume anything about the type of the data and just rely on its promotion rules).
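
For what it’s worth, promoting on the element types instead would look something like this sketch (the function names are just made up for illustration):

# hypothetical: result eltype of a contraction from the two tensors' eltypes
contract_eltype(A::MyArray, B::MyArray) = promote_type(eltype(A), eltype(B))

# hypothetical: allocate output storage with similar, so the storage type
# of the data is respected rather than hard-coding Vector
contract_storage(A::MyArray, B::MyArray, len::Integer) =
    similar(A.data, contract_eltype(A, B), len)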

Do you know of Base.promote_op?

I didn’t know about that function, thanks for pointing it out. Indeed, it may be a way to determine the output type:

julia> Base.promote_op(+,Vector{ComplexF32},Vector{Float64})
Array{Complex{Float64},1}

However, as the warning in that function’s docstring mentions, it relies on type inference to determine the output type, so it may be fragile.
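
It also seems to work when applied directly to the element types with the actual operation, which might be closer to what I need:

Base.promote_op(*, ComplexF32, Float64)  # returns Complex{Float64} here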

Perhaps the answer in the end is that I need to create my own set of type promotion functions and rules for my own storage types, since the concept of promote_type might not fit my particular use case (I was just hoping to leverage Julia’s system, since it is nice and extensible).

The use case for promote_type is that it’s used by promote, which converts its arguments to compatible types. Fortunately, it’s not hard to roll your own promotion-like mechanism that can even call promote_type and modify the result however you need.
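
For example, a minimal sketch of such a mechanism (promote_storage is just a made-up name):

# promote two Vector storage types by promoting their element types;
# this deliberately does the eltype conversion that Base's rule avoids
promote_storage(::Type{Vector{T}}, ::Type{Vector{S}}) where {T,S} =
    Vector{promote_type(T, S)}

# fallback for other AbstractVector storage types
promote_storage(::Type{A}, ::Type{B}) where {A<:AbstractVector,B<:AbstractVector} =
    Vector{promote_type(eltype(A), eltype(B))}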

I see. I suppose the way promotion is used in Julia is quite similar to taking the inputs of a binary operation and determining the output type, but not exactly the same, so I may be pushing the boundary of what it is meant for.

However, I would still argue that the rule I proposed in the original post might be a good one. This still seems a bit strange to me:

julia> [Vector{ComplexF64}(),Vector{Float32}()]
2-element Array{Array{Complex{Float64},1},1}:
 []
 []

julia> [Vector{ComplexF32}(),Vector{Float64}()]
2-element Array{Array{T,1} where T,1}:
 Complex{Float32}[]
 Float64[]         

The current rule does avoid some copying in that case, but it creates inconsistent behavior (clearly the two data types can be promoted to a common one). Type promotion inherently involves the possibility of copying data, so avoiding copies in some situations but not others seems a bit arbitrary. But maybe it is a balancing act: trying to get things that “just work” while not doing too many unnecessary copies.

Anyway, yes, it seems like the solution is to create my own promotion system and leverage Julia’s where I can (in general it worked very smoothly, but I came across this corner case).
