Apart from a little inaccuracy, it seems that “2.718281828459045 ^” is much faster than “exp()”. Similarly, “2.0 ^” is faster than “exp2()”, and exp() is particularly slow:
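A minimal sketch of how one might check this claim, assuming the BenchmarkTools.jl package is installed (timings will of course vary by machine and Julia version):

```julia
using BenchmarkTools  # assumes BenchmarkTools.jl is installed

x = rand(1_000)

@btime 2.718281828459045 .^ $x;  # power with a literal base
@btime exp.($x);                 # library exp
```

Interpolating `x` with `$` keeps the benchmark from measuring global-variable access instead of the math itself.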

Sort of “related” to this (and it’s Friday, so cut me some slack).

Computing log10(x) as log(x)*log10(exp(1)), with log10(exp(1)) replaced by its numerical value, is faster than log10(x) according to the benchmark tools (if I’m using them properly?). Obviously it does not produce exactly the same value, but sometimes speed trumps accuracy… Julia 1.0.0 / Windows 7.
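A sketch of the trick described above; `f1` is the name used later in the thread, and the literal is log10(ℯ) written out to Float64 precision:

```julia
# log10(e) precomputed as a literal constant
const LOG10E = 0.4342944819032518

# faster but slightly less accurate replacement for log10(x)
f1(x) = log(x) * LOG10E
```

For example, `f1(1000.0)` is very close to 3.0, but is not guaranteed to match `log10(1000.0)` in the last bit.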

Yes, this is useful to keep in mind. Generally, if a function is called something like log10 (or exp, or …) it will be implemented as fast as possible, but within the constraint of staying within 1 ulp of the “true” result. That is, not quite correctly rounded, but still quite close. I’m not sure what the worst-case error of your f1 is, but I’m certain it is more than 1 ulp.
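One way to estimate how far the shortcut strays, sketched here against an extended-precision reference (`err_in_ulps` is a made-up helper name, not anything from Base):

```julia
f1(x) = log(x) * 0.4342944819032518  # the shortcut from the earlier post

# error of f1 relative to a BigFloat reference, in ulps of log10(x)
err_in_ulps(x) = abs(f1(x) - Float64(log10(big(x)))) / eps(log10(x))

# sample the error over some positive inputs
maximum(err_in_ulps, rand(10_000) .* 1e6)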

That said, we do have a @fastmath macro, and there’s certainly more we could do with it to make it actually choose faster, less accurate versions of all the elementary functions.
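For reference, a minimal sketch of how @fastmath is applied per expression (the function names here are made up; the macro rewrites the operations inside the expression to fast variants, and actual speedups depend on the expression):

```julia
f_exact(x) = exp(x) + log(x)
f_fast(x)  = @fastmath exp(x) + log(x)

# the two should agree closely, though not necessarily bit-for-bit
f_exact(2.0), f_fast(2.0)
```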

I’m using JuliaPro 0.6.3 … and somehow there is no help on @fastmath?

help?> @fastmath
No documentation found.
Base.FastMath.@fastmath is a macro.
# 1 method for macro "@fastmath":
@fastmath(expr::ANY) in Base.FastMath at fastmath.jl:127

So, would starting Julia with --math-mode=fast put all math operations in fastmath mode?
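For reference, the flag asked about above is passed on the command line when launching Julia; applying @fastmath to specific expressions is usually the safer, more targeted choice:

```shell
# launch Julia with the global fast-math mode mentioned above
julia --math-mode=fast
```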