Is the least significant bit from bitstring() correct?

This is just rounding. It's like a plain old decimal representation of \frac{1}{101} = 0.\overline{0099}. If you limit that repeating 0.009900990099\dots to 7 decimal places, it rounds to 0.0099010. That's exactly what's happening here: \frac{7}{9} is similarly a repeating fraction in both decimal (0.\overline{7}) and binary (0.\overline{110001}), so it has to snap to the nearest representable value at some point, and round-to-nearest is what decides that last bit.
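If you want to check this yourself, here's a minimal sketch in Julia comparing the rounded value against its two neighboring floats. The idea is that the value `bitstring()` shows should be strictly closer to the true \frac{7}{9} than either `prevfloat` or `nextfloat` of it:

```julia
x = 7 / 9                        # Float64, rounded to the nearest representable value
exact = big(7) // big(9)         # exact rational 7//9 for comparison

println(bitstring(x))            # mantissa shows the repeating ...000111... pattern,
                                 # with the last bit rounded

# Rounding error of x versus its two neighboring floats:
println(abs(big(x) - exact))             # error of the value bitstring() shows
println(abs(big(prevfloat(x)) - exact))  # error of the next float below
println(abs(big(nextfloat(x)) - exact))  # error of the next float above
```

Running this, the error for `x` comes out smaller than for either neighbor, which is round-to-nearest doing its job: the least significant bit is the correctly rounded one.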
