More generally, would it be possible to have some equivalent of type inference for integer and float literals? Literals are one of the bits of Julia that I find most annoying.
For example, if I write x = 101, it would be great if 101 parsed as an Int16 if x is known to be a 16-bit integer, or as an Int64 if x is known to be a 64-bit integer.
Maybe create a special type called "Literal" which promotes to other numeric types as soon as it is used in some other compound expression, so that I can write y = 1 - x instead of y = one(x) - x. In that system, the 1 would be of type NumericLiteral{"1"}. Whenever it is used in any primitive numeric operation, it would promote to the type of the other argument, or to some default type. This would also make it a lot harder to write code that isn't type stable.
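For what it's worth, something close to this can already be sketched in user code today using Julia's existing promotion machinery. Below is a minimal, hypothetical sketch of the idea: the NumericLiteral name is from this proposal (not an existing Julia type), and I parametrize it on the value itself rather than a string, since plain integer values are valid type parameters.

```julia
# Hypothetical sketch of the proposed NumericLiteral idea (not a real
# Julia feature). The literal's value is carried in the type parameter,
# so it has no runtime payload of its own.
struct NumericLiteral{V} <: Number end

# Promote a literal to the type of whatever numeric type it meets.
Base.promote_rule(::Type{NumericLiteral{V}}, ::Type{T}) where {V,T<:Number} = T
Base.convert(::Type{T}, ::NumericLiteral{V}) where {T<:Number,V} = convert(T, V)

# With these two definitions, the generic fallback for `-` (which calls
# `promote` on its arguments) does the rest: the literal takes on the
# type of the other operand.
x = Int16(101)
y = NumericLiteral{1}() - x

println(typeof(y))  # Int16 — the result stays in x's type
println(y)          # -100
```

Of course this doesn't change how the parser treats a bare 1, which is the part that would need language-level support; it only shows that the promotion behavior itself fits Julia's existing promote_rule/convert protocol.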