Will decreasing the precision of intermediate variables improve performance of code?

Here’s an example of performance getting worse when performing Float32 math involving an Int64 loop variable.

The solution in that case was to explicitly use an Int32 loop variable:

```julia
for i in Int32(1):N
    ...
end
```
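To make the effect concrete, here is a hedged sketch of the kind of comparison that can expose it (the function name and structure are my own, not from the original thread; the assumption is that a per-iteration Int64-to-Float32 conversion in a hot loop can hinder SIMD vectorization, while an Int32 index converts more cheaply):

```julia
# Accumulate Float32 values derived from the loop index.
# The index type T controls the cost of the integer -> Float32
# conversion inside the loop.
function accum(N::T) where {T<:Integer}
    s = 0.0f0
    @inbounds for i in T(1):N   # range element type matches N's type
        s += Float32(i) * 0.5f0
    end
    return s
end

N = 10_000
accum(Int64(N))   # Int64 index: conversion may block vectorization
accum(Int32(N))   # Int32 index: often vectorizes better
```

Timing the two calls (e.g. with `@btime` from BenchmarkTools.jl) is how one would check whether the narrower index type actually helps on a given machine; note that `Int32(1):N` only yields an `Int32` range when `N` itself is an `Int32`, since mixed-type ranges promote.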

However, that `Int32(1)` is rather unpleasant to look at and, notably, very easy to forget, with disastrous performance consequences.
