Division by zero runs without warning -> complicates finding bugs

/ in Julia is floating-point division (unlike C/C++), and the IEEE 754 standard says that floating-point division of a nonzero (and non-NaN) ±value by +0 gives ±Inf. (Though it would be nice to have a way to trap floating-point exceptions; see Julia issue #27705.)
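
For example, at the Julia REPL (note that / promotes integer arguments to Float64):

julia> 100/0
Inf

julia> -100/0
-Inf

julia> 0/0
NaN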

In C/C++, you will also get Inf with floating-point division like 100/0.0. It’s just that 100/0 in C/C++ is truncated integer division (3/2 == 1 in C/C++), not floating-point division, analogous to Julia’s div:

julia> div(100,0)
ERROR: DivideError: integer division error

which throws an error, just as integer division by zero in C/C++ typically faults at runtime (it is undefined behavior there).
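
To see the two kinds of division side by side in Julia:

julia> div(3, 2)
1

julia> 3/2
1.5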

Even in Python, division by zero produces inf when NumPy integers are involved:

>>> import numpy as np
>>> np.int64(100)/0
inf

I’m guessing that the reason 100/0 raises a ZeroDivisionError rather than producing inf in plain Python is a legacy of Python 2, where integer / integer was truncated integer division like C/C++, not floating-point division.

In other languages where / is floating-point division, the examples I can find also typically seem to follow the IEEE 754 behavior of producing Inf from division by zero, similar to Julia. For example, R, JavaScript, MATLAB, …
