I am writing the initialization part of my Julia program and I want to define functions that depend on parameters that are already defined.
I am using @generated for the definition of the function.
Then I have to use precompile() to force the solve to happen at this stage:
@generated function my_fun(x)
    computed_results = solve(BIG_PROBLEM)
    return :($computed_results[x])
end
precompile(my_fun,(Int64,))
Is there another way to do this?
Why does precompile need the type tuple even if I annotate the arguments?
Why do I need to provide the type tuple to precompile even though x is not used in the expression interpolation?
If the result of solve is serializable, this in global scope should work:
const computed_results = solve(BIG_PROBLEM)
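To make the suggestion concrete, here is a minimal sketch of the global-const pattern; `solve` and `BIG_PROBLEM` are hypothetical stand-ins for the original expensive computation:

```julia
# Hypothetical stand-ins for the expensive problem/solver from the question.
const BIG_PROBLEM = 1:10
solve(problem) = [i^2 for i in problem]

# Runs once, at load (or precompile) time.
const computed_results = solve(BIG_PROBLEM)

# An ordinary function closing over the global const; no @generated needed.
my_fun(x::Int) = computed_results[x]
```

If this lives in a precompiled module, the const is computed at precompile time and serialized into the cache file, which is exactly the effect the question is after.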
Also, I have started incorporating static computations like this into the deps/build.jl file. That way, the user can rebuild things more transparently, rather than deleting some precompiled cache file (which might well be independent of the computation).
Why does precompile need the type tuple even if I annotate the arguments?
Because precompilation is mostly about lowering and type inference. I guess one could introduce more heuristics to figure out the types, e.g. from your typed arguments, but you would still need to select the method if a function combines multiple signatures.
I'm sure that these things will get a lot more comfortable in the future; until then, it's probably a lot more consistent to always have the user supply the type tuple.
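To illustrate the method-selection point: a function can have several signatures, and the type tuple tells precompile which one to compile. A small sketch with a hypothetical two-method function:

```julia
# Two methods under one function name: the name alone is ambiguous.
f(x::Int)     = x + 1
f(x::Float64) = 2x

# The type tuple selects one specific method/signature to infer and compile.
precompile(f, (Int,))      # compiles only the Int method
precompile(f, (Float64,))  # a separate signature needs a separate call
```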
Thank you for replying. It does make sense.
I don't like the global const solution because code readability suffers: the user will not be able to tell that the computation is used only for generating the function. Another problem, in general, is that the body could depend on the type of x.
What about the following solution?
I could define a new macro, say @generated_precompile, that would embed the precompile call and derive the argument types for the generated function by analyzing the arguments given.
Is this considered useful?
Can I open an issue with a request for this?
What you are allowed to do in a @generated function is very limited. Something like computed_results = solve(BIG_PROBLEM) inside a @generated function is not the right way to do it. The results should be cached in a dict or something similar, and you can generate the function from the result with an @eval. @generated functions are meant for non-trivial computations on type parameters, which you do not seem to be doing at all.
Specifically (quoting from the manual), you are, among other things, not allowed to do any of the following:
Caching of native pointers.
Interacting with the contents or methods of Core.Inference in any way.
Observing any mutable state.
Inference on the generated function may be run at any time, including while your code is attempting to observe or mutate this state.
Taking any locks: C code you call out to may use locks internally, (for example, it is not problematic to call malloc, even though most implementations require locks internally) but don’t attempt to hold or acquire any while executing Julia code.
Calling any function that is defined after the body of the generated function. This condition is relaxed for incrementally-loaded precompiled modules to allow calling any function in the module.
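For contrast, here is a sketch of the kind of use @generated is actually meant for: a computation driven purely by type parameters (the tuple length N below), not by runtime state. The function name and body are illustrative, not from the original thread:

```julia
# Legitimate @generated use: the body sees only argument *types*, and
# builds an unrolled expression from the type parameter N.
@generated function tuple_sum(t::NTuple{N,Int}) where {N}
    ex = :(0)
    for i in 1:N
        ex = :($ex + t[$i])   # unroll the sum at generation time
    end
    return ex
end
```

Everything the generator touches here (N) comes from the signature, so none of the restrictions above are violated.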
I would strongly recommend reconsidering using @generated functions for this use case.
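The cache-plus-@eval alternative mentioned above can be sketched as follows; `solve` and the problem are again hypothetical stand-ins:

```julia
# Hypothetical expensive computation, cached in a Dict as suggested.
solve(problem) = Dict(i => i^2 for i in problem)

let results = solve(1:10)          # heavy work happens exactly once, here
    # @eval evaluates at module scope, splicing the finished result
    # into an ordinary method -- no @generated machinery involved.
    @eval my_fun(x) = $results[x]
end
```

Because the result is interpolated as a value, my_fun is a plain function and none of the @generated restrictions apply.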