What is the effect of `parse` on `Char` types?

Passing a base changes the behavior. For `Char` arguments the base appears to default to 36, with ‘A’/‘a’ through ‘Z’/‘z’ mapped to the values 10 through 35.
Bases up to 62 are also accepted: above base 36, lowercase letters become distinct digits, with ‘a’ as 36 and ‘z’ as 61.
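
A minimal sketch of the digit mapping these observations suggest (the `char_digit` function below is hypothetical, not Base's actual method):

```julia
# Sketch of the apparent digit mapping: lowercase letters only become
# distinct digits once the base exceeds 36.
function char_digit(c::Char, base::Integer)
    2 <= base <= 62 || throw(ArgumentError("base must be between 2 and 62"))
    lower_offset = base <= 36 ? 10 : 36   # 'a' == 'A' for base <= 36, else 'a' == 36
    d = '0' <= c <= '9' ? c - '0' :
        'A' <= c <= 'Z' ? c - 'A' + 10 :
        'a' <= c <= 'z' ? c - 'a' + lower_offset :
        throw(ArgumentError("invalid digit: $(repr(c))"))
    d < base || throw(ArgumentError("digit $(repr(c)) is not valid in base $base"))
    return Int(d)
end
```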

```julia
parse(Int, 'A', base=36) == parse(Int, 'a', base=36)  # true: both are 10
parse(Int, 'A', base=36) != parse(Int, 'a', base=37)  # true: 10 vs. 36
parse(Int, 'z', base=37)  # error: 'z' (61) is not a valid base-37 digit
parse(Int, 'A', base=10)  # error: 'A' (10) is not a valid base-10 digit
```
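
The base-62 case described above can be checked the same way; assuming a Julia 1.x `base` keyword argument:

```julia
parse(Int, 'A', base=62) == 10   # uppercase keeps its base-36 value
parse(Int, 'a', base=62) == 36   # lowercase is a distinct digit above base 36
parse(Int, 'z', base=62) == 61   # the largest base-62 digit
```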