Precompilation, __init__, and eval

I have some formulas that are loaded from text configuration files. These are used in other places besides my Julia program, so I don’t have any control over the format. They are expressions in the single variable x, e.g. “sqrt(x)” or “231.0 - 0.12*x”. They all happen to be valid Julia expressions, so I was hoping to parse them.

Here is a simplified example that demonstrates my problem.
File Test1.jl:

module Test1

fn = nothing
function __init__()
    global fn
    # text is actually loaded from a cell of an xlsm file
    fn_text = "sqrt(x)"
    fn = eval(Meta.parse("x->"*fn_text))
end
end

File Test2.jl:

module Test2

import Test1

function apply(x::Real)
    return Test1.fn(x)
end
end

If I start Julia (1.6.3) with --compiled-modules=no, then I can import Test2 and everything works.
If I start Julia with --compiled-modules=yes, then I get:
julia> import Test2
[ Info: Precompiling Test2 [top-level]
ERROR: LoadError: InitError: Evaluation into the closed module Test1 breaks incremental compilation because the side effects will not be permanent. This is likely due to some other module mutating Test1 with eval during precompilation - don’t do this.
Stacktrace:
[1] eval
@ ./boot.jl:360 [inlined]
[2] eval
@ ~/test_julia_module/Test1.jl:1 [inlined]
[3] __init__()

Is __init__() supposed to be evaluated during precompilation?
I found a closed discussion here: https://github.com/JuliaLang/julia/issues/29059 but couldn’t understand what the conclusion was.


Does the fn_text loaded from your xlsm file change over time? If not, the easiest way to fix this is to do the evaluation at top-level in module Test1 rather than in __init__() — then the eval’d functions will be precompiled into your Test1 module and this is efficient when loading Test1 later.
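For concreteness, the top-level variant would look something like this (a minimal sketch; the hard-coded string stands in for the real file read, and `read_formula` is a hypothetical helper):

```julia
module Test1

# In the real version this would be read from the xlsm file at
# precompile time, e.g. fn_text = read_formula("config.xlsm")
fn_text = "sqrt(x)"

# Top-level eval runs while the module is being defined, so the
# resulting method is serialized into Test1's precompile cache.
fn = eval(Meta.parse("x->" * fn_text))

end
```

Because the eval happens during module definition rather than in `__init__`, there is no “evaluation into the closed module” error, and no world-age problem when `Test2.apply` later calls `Test1.fn`.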

The next easiest solution would probably be to use the RuntimeGeneratedFunctions package. You can use this from __init__ or from any other function to dynamically create functions from Julia expressions. For example:

julia> using RuntimeGeneratedFunctions

julia> RuntimeGeneratedFunctions.init(@__MODULE__)

julia> function rgf_demo()
           fn_text = "sqrt(x)" # loaded from file in the real version
           fn_ex = Meta.parse("x->"*fn_text)
           fn = @RuntimeGeneratedFunction(fn_ex)
           # We can call this right away (even in the same world age)
           fn(2.0)
       end

julia> rgf_demo()
1.4142135623730951

Yes, precompilation just loads your package (mostly as normal), and then serializes out “all the changes brought about by the module” (new methods and types, etc) into the precompile file.

If you absolutely need to detect precompilation, something like the following works (though I don’t recommend it if you have an alternative):

_isprecompiling() = ccall(:jl_generating_output, Cint, ()) == 1

function __init__()
    if !_isprecompiling()
        # do precompile-unsafe stuff
    end
end

Thank you Chris. This was very clear.
Regarding the first proposed solution: There are actually many formulas, and the text does change occasionally. Unfortunately, the names and number of files that are used also changes, otherwise I might have tried your first solution along with Base.include_dependency for the files containing the formulas.
The RuntimeGeneratedFunctions module seems to be exactly what I was looking for. I had been having headaches about the world-age issues of attempting to use Meta.parse/eval to create functions.


Perfect! Yes, RuntimeGeneratedFunctions is designed for exactly this use case, and it solves the world-age problems you typically get if you call eval() any time later than __init__().

RuntimeGeneratedFunctions hackily does several things that the Julia runtime should be managing, though we manage to mostly hide that from users :laughing: