I am writing functions which are basically algebra coded up (from an economic model), with code like
```julia
(1-βᵢ)*(1-αᵢ)*ωᵢ / ((1-βᵢ + γᵢ*βᵢ)*(1-αᵢ)*ωᵢ + γᵢ*βₖ*(1-αₖ)*ωₖ)
```
and it gets worse. They should work with `Float64`, `Rational{Int64}` (for unit tests!), and `ForwardDiff.Dual`.
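For concreteness, here is a toy version of such a function; all names and parameter values are made up for illustration, not the actual model:

```julia
using ForwardDiff

# Illustrative single-agent version of the kind of share formula above.
share(α, β, γ, ω) = (1-β)*(1-α)*ω / ((1-β + γ*β)*(1-α)*ω + γ*β*(1-α)*ω)

share(0.3, 0.5, 0.2, 1.0)                                   # Float64
share(3//10, 1//2, 1//5, 1//1)                              # Rational{Int64}, exact for tests
ForwardDiff.derivative(α -> share(α, 0.5, 0.2, 1.0), 0.3)   # ForwardDiff.Dual internally
```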
Out of habit (following suggestions in the manual), I use `one(...)` for `1`, but now I have come to question this, because the compiler seems to produce the same code for the literal `1` (which is neatly promoted) and for `one(...)`. MWE:
```julia
with1(x) = 1+x
withone(x) = one(x) + x
```
e.g. the output of `@code_warntype` is the same for both.
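What I checked, roughly (a sketch; the `1.0` and `1//2` inputs are just example values):

```julia
using InteractiveUtils  # provides @code_warntype / @code_llvm outside the REPL

@code_warntype with1(1.0)
@code_warntype withone(1.0)     # same typed code as with1 on the versions I tried

@code_llvm with1(1//2)
@code_llvm withone(1//2)        # same LLVM IR as well

with1(1//2) === withone(1//2)   # true, and both are Rational{Int64}
```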
So is it an OK habit to use a literal `1` in formulas like these? Or is there a downside?
PS: I understand, of course, that a standalone use of the literal with a generic type, where no promotion applies, would require `one(...)` for type stability.
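(A sketch of what I mean by that, with made-up function names:)

```julia
# The literal is returned on its own here, so nothing promotes it:
# the return type is Union{Int64, Float64} for x::Float64.
lower_bound_bad(x)  = x < 0 ? 1 : x

# one(x) keeps the return type equal to typeof(x).
lower_bound_good(x) = x < 0 ? one(x) : x
```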