Though, in fairness, TypeSortedCollections.jl seems to just handle the Tuple case by collecting it and treating it as a vector, which is… IMO not good, but I guess it depends on @Larbino1’s use case whether that’s acceptable or not.
If I may, I think you’re missing the point of my question, which is understandable, as this is not a typical way to use tuples. If the compiler knows the index at compile time, it knows the type. I am talking about a situation where the compiler does not know the index at compile time.
I think that TypeSortedCollections is a much more idiomatic and sensible way to structure my data and to solve this problem, so I will pursue that moving forward, but for now I’m using @raminammour’s nice solution!
The solution is sensitive to the tuple input. A right-hand type assertion gives the compiler a typeassert it can use for inference, but it isolates the runtime type check rather than eliminating it outright, which also means @code_warntype reports can look cleaner than the actual behavior. Something I neglected to notice earlier is that the allocation count being read as a sign of instability is also affected by whether @time runs in global scope and by whether you’re retrieving one of the interned Ints.
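For reference, test4 in the timings below stands for that right-hand-assertion approach; roughly (my paraphrase, not the verbatim definition from earlier in the thread), it has this shape:

test4(tup, i, ::Type{T}) where {T} = tup[i]::T  # index with a runtime i, then assert the expected element type T

With that in mind, the timings: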
julia> let # 2nd run, omitted 1st run; no allocation like in global scope
           tup = (10, 100im, 1., 2.0im, 3.)
           i = 2
           @time test4(tup, i, Complex{Int})
       end
0.000001 seconds
0 + 100im
julia> let # 2nd run, omitted 1st run; 1 allocation shows up with non-literal tuple
           tup = Tuple(append!(Any[10, 100im, 1., 2.0im, 3.], fill(1im, 0)))
           i = 2
           @time test4(tup, i, Complex{Int})
       end
0.000008 seconds (1 allocation: 32 bytes)
0 + 100im
julia> let # 2nd run, omitted 1st run; 33+ elements scale in allocations
           tup = Tuple(append!(Any[10, 100im, 1., 2.0im, 3.], fill(1im, 33)))
           i = 20
           @time test4(tup, i, Complex{Int})
       end
0.000026 seconds (221 allocations: 11.000 KiB)
0 + 1im
julia> let # 2nd run, omitted 1st run; retrieving an interned Int saves an allocation
           tup = Tuple(append!(Any[10, 100im, 1., 2.0im, 3.], fill(1im, 20)))
           i = 1
           @time test4(tup, i, Int)
       end
0.000007 seconds
10
As someone mentioned earlier, omitting the type assertion (changing x::T to just x) does not change the performance, even though @code_warntype switches from looking type-stable to returning ::Any. Limiting the number of types to help @code_warntype infer better doesn’t affect allocations either. This makes me think that the allocations here actually have little to do with type instability.
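Concretely, the unasserted variant is the same definition with the ::T dropped (hypothetical name, just for illustration):

test4_noassert(tup, i, ::Type{T}) where {T} = tup[i]  # no assertion, so the inferred return type is no longer pinned down

and, per the above, its @time allocations come out the same as the asserted version’s.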
I had assumed that getindex would show the same effects, but it actually scales better (I don’t know why; it might be a perk of calling a built-in function to do the work):
julia> let
           tup = (10, 100im, 1., 2.0im, 3.)
           i = 2
           @time getindex(tup, i)
       end
0.000005 seconds
0 + 100im
julia> let
           tup = Tuple(append!(Any[10, 100im, 1., 2.0im, 3.], fill(1im, 0)))
           i = 2
           @time getindex(tup, i)
       end
0.000001 seconds (1 allocation: 32 bytes)
0 + 100im
julia> let
           tup = Tuple(append!(Any[10, 100im, 1., 2.0im, 3.], fill(1im, 33)))
           i = 20
           @time getindex(tup, i)
       end
0.000001 seconds (1 allocation: 32 bytes)
0 + 1im