Why is reduce with concatenation type-unstable?

reduce(vcat, arr) returns a scalar or an array depending on the length of the array.

julia> reduce(vcat, [1])
1

julia> reduce(vcat, [1,2])
2-element Array{Int64,1}:
 1
 2

julia> @code_warntype reduce(vcat, [1,2])
Variables
  #self#::Core.Compiler.Const(reduce, false)
  op::Core.Compiler.Const(vcat, false)
  A::Array{Int64,1}

Body::Union{Int64, Array{Int64,1}}
1 ─ %1 = Core.NamedTuple()::Core.Compiler.Const(NamedTuple(), false)
│   %2 = Base.pairs(%1)::Core.Compiler.Const(Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}(), false)
│   %3 = Base.:(var"#reduce#622")(%2, #self#, op, A)::Union{Int64, Array{Int64,1}}
└──      return %3

I find this surprising, as vcat itself does not behave like this:

julia> vcat(1)
1-element Array{Int64,1}:
 1

julia> vcat(1,2)
2-element Array{Int64,1}:
 1
 2

The issue may be resolved by specifying an initial value, e.g.:

julia> reduce(vcat, [1], init=Int[])
1-element Array{Int64,1}:
 1

Why is the type-unstable definition the default?

But reduce is not splatting its input into a single call. For inputs of length 1, 2, 3, …, what you are doing is equivalent to

1, vcat(1, 2), vcat(vcat(1, 2), 3), etc.

And this has nothing to do with type stability. It’s just how it must behave by the definition.
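This pairwise behaviour is easy to check directly (a small sketch):

```julia
# reduce combines elements with binary calls to vcat; it never splats
# the whole collection into one call. Up to associativity:
@assert reduce(vcat, [1, 2, 3]) == vcat(vcat(1, 2), 3)

# With a single element there is nothing to combine, so the element
# itself comes back unchanged, as a scalar:
@assert reduce(vcat, [1]) === 1
```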

Currently reduce with a single element calls reduce_first, and defining additional methods for it is encouraged where that helps reduce become type-stable. There are issues on the Julia tracker discussing this.

The docstring for reduce_first states

The default is x for most types. The main purpose is to ensure type stability, so
additional methods should only be defined for cases where op gives a result with
different types than its inputs.

As discussed in the issues, concatenation is one such case where it makes sense to define reduce_first appropriately to obtain a type-stable method.
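As a sketch of what such a definition could look like (illustration only, not Base's actual method; note that adding methods to Base functions for Base types is type piracy and should not be done in real packages):

```julia
# Hypothetical: wrap a lone number so the single-element reduction
# already returns a vector, matching what longer inputs produce.
Base.reduce_first(::typeof(vcat), x::Number) = [x]

reduce(vcat, [1])     # now a 1-element Vector{Int64} instead of the scalar 1
reduce(vcat, [1, 2])  # unchanged: 2-element Vector{Int64}
```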

And this has nothing to do with type stability. It’s just how it must behave by the definition.

I would argue that the use of reduce_first implies that it does have to do with type-stability.
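Base itself relies on this mechanism for sum: the reducing operator add_sum widens small integers, and reduce_first widens the single-element case to match, so the result type does not depend on the input length (assuming a 64-bit system where Int is Int64):

```julia
# sum reduces with Base.add_sum, which widens Int8 to Int; reduce_first
# widens the one-element case too, keeping sum type-stable:
@assert sum(Int8[1]) === 1       # Int, not Int8
@assert sum(Int8[1, 2]) === 3    # Int as well
```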