Is there a reason why eigvals returns a vector of length 1 instead of a scalar when given a scalar input? For example: typeof(eigvals(1.0)) == Array{Float64,1}.
It seems a bit inconsistent compared to svdvals, which returns a scalar. For example: typeof(svdvals(1.0)) == Float64.
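For concreteness, here is a small session showing the difference (recent Julia with LinearAlgebra loaded; the printed type aliases may vary by version):

```julia
using LinearAlgebra

eigvals(1.0)          # [1.0]  -- a freshly allocated Vector{Float64}
typeof(eigvals(1.0))  # Vector{Float64}, i.e. Array{Float64,1}

svdvals(1.0)          # 1.0    -- a plain Float64, no allocation
typeof(svdvals(1.0))  # Float64
```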
I personally like the behavior of svdvals much better: it avoids allocating an array unnecessarily, and it allows a recursion to terminate based on the return type rather than on a size check. It also aligns with the general Julia behavior of returning the most efficient type that can be cleanly inferred from the input type(s).
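To make that point concrete, here is a contrived sketch (the helper names process and spread are mine, not an existing API): because svdvals(::Number) hands back a plain Number, the scalar case can be handled by its own method, which in a recursive algorithm would be the dispatch-selected base case; with eigvals the result is always a Vector, so the same code needs an explicit length check.

```julia
using LinearAlgebra

# Contrived sketch -- `process` and `spread` are my own names.

process(σ::Number)         = zero(σ)              # base case: a single value has zero spread
process(σ::AbstractVector) = first(σ) - last(σ)   # general case: spread of the singular values

spread(x) = process(svdvals(x))    # works uniformly for Numbers and matrices

spread(3.0)                        # 0.0 -- hits process(::Number), no array allocated
spread([3.0 0.0; 0.0 1.0])         # 2.0 -- hits process(::AbstractVector)

# The eigvals analogue cannot distinguish the two cases by type:
# eigvals(3.0) == [3.0], so process(eigvals(x)) always takes the Vector branch
# and the scalar case would need a length(λ) == 1 check instead.
```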
Maybe there’s a historical reason. Currently, Julia allows superfluous indices as long as they are all equal to one. For example: 1.0[1] == 1.0. (Hence, if the return type of eigvals(::Number) were changed from a vector to a scalar, most existing code would continue to work.) But I don’t know if this was always the case.
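A quick illustration of that compatibility point (assuming a recent Julia; this only demonstrates the indexing behavior, not the proposed change itself):

```julia
using LinearAlgebra

1.0[1]             # 1.0 -- a scalar tolerates a superfluous index equal to one
svdvals(1.0)[1]    # 1.0 -- so indexing still works even though the result is a scalar
eigvals(1.0)[1]    # 1.0 -- and code like this would keep working if eigvals
                   #        returned a scalar instead of a one-element vector
```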
At some point, both eigvals and svdvals were changed to return a vector for scalar input. Whatever reasoning later reverted this for svdvals should probably apply to eigvals as well.