# Precompute some values of a function

Consider the following MWE:

```julia
d = rand(10^8);
f(x, d) = x + sum(d)
g(x) = f(x, d)
```

If I run `@benchmark f(1,d)` and `@benchmark g(1)` in the REPL, both take the same amount of time to run (which is more or less what I expected). Is there a way to write a function equivalent to `g(x)` that runs as fast as the sum of two floating-point numbers? I am thinking that maybe macros could work, because they are evaluated at parse time instead of at run time, but I don’t know how to do it.

```julia
d = rand(10^8);
g = let D = sum(d)
    x -> x + D
end
```

This does not really solve my issue: `g(x)` needs to be defined through `f(x,d)`. The context in which I am using this is more something like

```julia
my_fun = function(f)
    d = generate_data()
    g(x) = f(x, d)
    # do things with g(x)
    return result
end
```

So, in some cases the function `f(x,d)` does work that, “intuitively”, would not need to be repeated at each call (as in the first example), but in other cases it does not.

Basically, it sounds like you need to refactor your code so that expensive calculations can be precomputed (and passed as parameters, possibly hidden within an opaque data structure) if desired.
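For instance, a minimal sketch of that refactoring for your first example (the names `precompute` and the two-argument `f` here are illustrative, not part of your code):

```julia
# Split f(x, d) into an expensive precompute step and a cheap apply step.
precompute(d) = sum(d)   # expensive: O(length(d)), done once
f(x, s) = x + s          # cheap: a single float addition per call

d = rand(10^8)
s = precompute(d)        # pay the cost of sum(d) once
f(1.0, s)                # every subsequent call is just x + s
f(2.0, s)
```

The precomputed value `s` plays the role of the opaque parameter: callers that have it pay only the cost of the addition.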

For example, `A \ b` (where `A` is a square matrix) calls `lu(A) \ b`, so if you want you can precompute the expensive LU factorization `F = lu(A)` (stored in an opaque “factorization object” data structure) and then re-use it for subsequent solves `F \ b`.
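A small sketch of that pattern (matrix sizes chosen arbitrarily for illustration):

```julia
using LinearAlgebra

A = rand(100, 100)
b1, b2 = rand(100), rand(100)

F = lu(A)     # expensive O(n^3) factorization, done once
x1 = F \ b1   # each solve is now a cheap O(n^2) triangular solve
x2 = F \ b2   # reuse the same factorization for new right-hand sides
```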


e.g. memoization with Memoize.jl, which caches the result for a given set of arguments:

```julia
using Memoize

f₁(x, d) = x + sum(d)
@memoize f₂(x, d) = x + sum(d)
```
```julia
julia> @time f₁(1.2, d)
  0.034865 seconds (3.04 k allocations: 147.166 KiB, 14.22% compilation time)
4.999867722459446e7

julia> @time f₁(1.2, d)
  0.023525 seconds (1 allocation: 16 bytes)
4.999867722459446e7

julia> @time f₂(1.2, d)
  0.036218 seconds (37.59 k allocations: 1.805 MiB, 34.95% compilation time)
4.999867722459446e7

julia> @time f₂(1.2, d)
  0.000005 seconds (2 allocations: 48 bytes)
4.999867722459446e7
```

Or a functor, see one example here:
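A functor is a callable struct that stores the precomputed value; a sketch (the struct and field names here are made up for illustration):

```julia
# A callable struct holding the precomputed sum of d.
struct SumPlus{T}
    s::T
end

# Precompute sum(d) once, at construction time.
SumPlus(d::AbstractVector) = SumPlus(sum(d))

# Make instances callable: each call is just an addition.
(g::SumPlus)(x) = x + g.s

d = rand(10^8)
g = SumPlus(d)   # sum(d) computed here, once
g(1.0)           # fast: no re-summation on each call
```

This keeps the same call syntax `g(x)` as your original closure, but makes the stored state explicit and typed.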

Thank you for all the answers, folks!