Index type of UnitRange{BigInt} changed from Int to BigInt after Julia 1.8: regression?

This expression returns Int on Julia 1.8, but BigInt on versions after 1.8:

typeof(firstindex(map(BigInt, 1:1)))
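For context, here is my rough understanding of what happens (a sketch; I have not checked every version, so treat the annotations as approximate):

r = map(BigInt, 1:1)       # Base specializes map over ranges, so this is a
typeof(r)                  # UnitRange{BigInt}, not a Vector{BigInt}
typeof(firstindex(r))      # Int on 1.8, BigInt on later versions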

I think this might be a regression, but it could also have been an intentional change?

This is arguably inconsistent with the behavior of:

typeof(firstindex(BigInt[10, 20]))
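Side by side, on a recent version (a sketch; the exact output may vary by release):

typeof(firstindex(BigInt[10, 20]))    # Int: a Vector{BigInt} keeps machine-integer indices
typeof(firstindex(map(BigInt, 1:1)))  # BigInt after 1.8 (the inconsistency I mean)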

Personally I don’t really see why a BigInt index type would be necessary; it’s not as if the indexing base/offset will be some huge number. In fact, the offset is fixed at one!

How would you access a[end] for a = 1:big(10)^300 without a BigInt index?
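A rough sketch of what I mean (on recent versions; exact types may differ by release):

a = 1:big(10)^300
length(a)           # about 10^300 elements, far more than typemax(Int) can hold
a[end]              # lowers to a[lastindex(a)], so the index itself has to be a BigInt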

I know that @dlfivefifty worked a lot on related issues (e.g. https://github.com/JuliaLang/julia/pull/37741), so maybe he can comment more on when and why this change occurred.


You’ve definitely got a point. On the other hand, maybe UnitRange isn’t quite the right type for that kind of thing? I think breaking the assumption that the index type is Int makes writing efficient code more complicated, because now I have to explicitly convert my indices from BigInt to Int. And BigInt isn’t even a pure-Julia type, which gets in the way of Julia’s inference and hurts performance.
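Here is a rough, hypothetical sketch (not from any real code base) of how the new index type leaks into a plain loop:

r = map(BigInt, 1:10)      # UnitRange{BigInt} on recent versions
i = firstindex(r)          # now a BigInt, so the loop counter is a heap-allocated integer
s = zero(eltype(r))
while i <= lastindex(r)
    s += r[i]
    i += 1                 # BigInt arithmetic every iteration, unless I write i = Int(firstindex(r)) up front
end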

I mean, why would someone use 1:big(10)^300 instead of just writing a short anonymous function?

LazyArrays.jl supports lazy manipulations of such arrays, something an anonymous function can’t do. One can use such big arrays for combinatorial calculations; e.g., you can take a sum.
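For example, summing doesn’t iterate at all; Base computes range sums with the closed-form arithmetic-series formula, so this returns immediately even though the range has 10^300 elements (a sketch, relying on recent behaviour for BigInt range lengths):

a = 1:big(10)^300
sum(a)              # evaluates the arithmetic-series formula, roughly 5 * 10^599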

Though the real reason for this was to support InfiniteArrays.jl and InfiniteLinearAlgebra.jl.