Solution concept for the "world age" problem occurring with @generated and macros

This is not a beginner’s question; it is a question for advanced Julia users:
is the concept below a valid and generalizable approach?
If the answers turn out to be positive, it might help others overcome
“world age” errors in their use cases.

I will explain my concept with some demonstration code (I recommend executing it in the REPL).
It revolves around a function f which is performance critical and is to be optimized.

f0 is the slow first version, suffering from a complex but pure calculation.
The basic optimization idea is to avoid this computation at runtime by doing it
only once, at compile time.

The best solution would be to persuade the Julia compiler to “compile away” the
complex computation via constant propagation. In some cases, a reformulation
of the code helps. In very rare cases, Base.@pure can enforce it. In my real
application, I tried both, ultimately without success (interested in details? See
the @pure discussion and this issue).

An alternative is metaprogramming, using a macro or, in this case,
a @generated function. It is implemented as function f1, but fails due to the
“world age problem”; details follow.

f2, with methods generated by generate, is my proposed solution.

Module M is a tool module, to be used by third-party
applications. Module App is such an application.

module M

function bitsizeof(::Type{T}) where {T<:Enum}
    # use the "new world" in all function calls depending on T
    8*sizeof(Int) - leading_zeros(Int(Base.invokelatest(typemax, T)) - Int(Base.invokelatest(typemin, T)))
end

fib(i::UInt) = i<=1 ? 1 : fib(i-1)+fib(i-2) # I know you can optimize it - compiler can't
transform(s) = fib(hash(string(s))%48) # just for demo: a pure but expensive function

function f0(::Type{T},s) where {T<:Enum}
    bits = bitsizeof(T)
    sTransformed = transform(s) # expensive pure calculation
    return (T,bits,sTransformed)
end

@generated function f1(::Type{T}, ::Val{s}) where {T<:Enum, s}
    bits = Int(Base.invokelatest(bitsizeof, T))
    sTransformed = transform(s) # expensive pure calculation
    return :(($T, $bits, $sTransformed))
end

function f2 end

function generate(::Type{T}) where {T<:Enum}
    bits = bitsizeof(T)
    for s in instances(T) # generate one method per enum instance
        sTransformed = transform(s) # expensive pure calculation
        ex = :(function f2(::Type{T}, ::Val{$s}) where {T<:Enum}
            return (T, $bits, $sTransformed)
        end)
        eval(ex)
    end
    return nothing
end
export f0,f1,f2,generate

end # module M

module App
using Main.M
@enum MyEnum::Int8 e1 = -5 e2 = 0 e3 = 5
export MyEnum, e1,e2,e3
generate(MyEnum)
end # module app

# in application (another module)
using Main.App
using Main.M


f0(App.MyEnum,e2)
f2(App.MyEnum,Val(e2))
@time f0(App.MyEnum,e2)
@time f2(App.MyEnum,Val(e2))
f1(App.MyEnum,Val(e2))

If you execute it, you will see that f0 and f2 return the same result, while f1 fails with:

ERROR: MethodError: no method matching typemax(::Type{MyEnum})
The applicable method may be too new: running in world age 29649, while current world is 29661.
Closest candidates are:
  typemax(::Type{MyEnum}) at Enums.jl:197 (method too new to be called from this world context.)
  typemax(::Union{Dates.DateTime, Type{Dates.DateTime}}) at C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\Dates\src\types.jl:426        
  typemax(::Union{Dates.Date, Type{Dates.Date}}) at C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\Dates\src\types.jl:428

The problem arises from the sequence of function and type compilations:
the functions in module M are compiled before the enum MyEnum is created.
The first use of MyEnum, by calling f0 or f1, triggers multiple-dispatch
compilation of a bunch of methods having MyEnum as a parameter.
With normal Julia code, this is no problem - all methods are compiled
on the fly when needed. In @generated functions and macros
this does not happen: that code only sees the method tables as they
were at its compile time, and missing methods are not compiled on demand.
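
A minimal, self-contained illustration of this effect (my own sketch, independent of the enum example): the generator of a @generated function runs in the world in which the function was defined, so a method added afterwards is invisible to it.

# the generator calls h(), but h is defined only after g;
# from g's frozen world, h does not exist yet
@generated g() = :($(h()))
h() = 1
g()   # expected to fail with a world-age MethodError ("method too new ...")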

A common recommendation to overcome the “world age problem” is the use of
Base.invokelatest. In the example code, it works for the call of bitsizeof
in f1, but that only moves the problem one stage down the call chain: the
“world age problem” occurs again for typemax and typemin. Even wrapping
all calls inside bitsizeof in Base.invokelatest does not work
(I have no idea why - can anyone explain?). Even if you manage to
fix that, more world age problems will pop up in transform(s).
Polluting your code with lots of Base.invokelatest calls kills type stability
and is not a good idea with respect to performance.
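
For concreteness, here is what “wrapping all calls inside bitsizeof in Base.invokelatest” could look like (a hypothetical variant of mine, not part of the demo code above); as described, a variant along these lines still ran into world-age errors when invoked from f1's generator, and every invokelatest call is a dynamic dispatch, so type stability is gone:

# hypothetical variant: every call inside bitsizeof routed through invokelatest
function bitsizeof_latest(::Type{T}) where {T<:Enum}
    lo = Base.invokelatest(Int, Base.invokelatest(typemin, T))
    hi = Base.invokelatest(Int, Base.invokelatest(typemax, T))
    return 8*sizeof(Int) - Base.invokelatest(leading_zeros, hi - lo)
end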

You could try to reorder statements such that MyEnum and all methods used by f1,
like typemax, are compiled before the first call of f1. That is possible in
this concrete case, but hardly generalizable. And
working around the problem by reordering conflicts with the common practice
of putting all using / import statements at the beginning of a module.

My proposed solution is to generate code directly with eval in a
generating function. generate looks very similar to f1; it just wraps the
computed expression in a function definition. The critical point: generate must be called after
MyEnum is defined and before f2 is used with MyEnum as a
parameter. This puts some burden on the programmer of module App.

On the other hand, the generating function allows for more control
over code generation. In the sample code, generate produces one method per enum
instance. That is fine for the example, but impractical if MyEnum happens to have
100,000 instances. In a real-world scenario, generate could be parameterized
to offer different variants of code generation.
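
As a hypothetical illustration (the keyword argument and the Dict-based branch are my additions, not part of the demo code): generate could take a switch that chooses between one method per instance, as above, and a single method backed by a lookup table that is still computed only once, at generation time.

function generate(::Type{T}; per_instance::Bool = true) where {T<:Enum}
    bits = bitsizeof(T)
    if per_instance
        # as in the demo: one f2 method per enum instance, result baked into the body
        for s in instances(T)
            sTransformed = transform(s)
            @eval f2(::Type{$T}, ::Val{$s}) = ($T, $bits, $sTransformed)
        end
    else
        # one method dispatching on the enum value itself (no Val wrapper);
        # results are still precomputed once and stored in a Dict
        table = Dict(s => transform(s) for s in instances(T))
        @eval f2(::Type{$T}, s::$T) = ($T, $bits, $table[s])
    end
    return nothing
end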

Dear Julia experts, what do you think about my approach? Is it reliable?
Or do you see a “back door” through which the world age problems could return?

eval has far fewer restrictions than @generated or macros, but it still has some.
Will one of those restrictions cause problems with my generated methods in a more
complex setting?


I don’t know if it is relevant, but have you seen JuliaStaging/GeneralizedGenerated.jl (a generalized version of Julia @generated functions, allowing closures in generated functions and avoiding runtime eval or invokelatest) and SciML/RuntimeGeneratedFunctions.jl (functions generated at runtime without world-age issues or overhead)?


No, thanks for the hint. I just took a quick look at them. GeneralizedGenerated.jl seems to specialize in closures; its documentation recommends RuntimeGeneratedFunctions.jl if closures are not involved.

RuntimeGeneratedFunctions.jl states that it does not generate a named generic function or methods of a generic function (which would participate in multiple dispatch), but rather something like an anonymous function with a fixed signature.
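
For readers who have not used the package, a usage sketch roughly along the lines of its README (written from memory; please check the package documentation for the exact current API):

using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)

ex = :(function (x)
    x + 1
end)
f = @RuntimeGeneratedFunction(ex)   # a callable object, not a named generic function
f(2)                                # usable immediately, no world-age error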

My proposal is fully compatible with multiple dispatch and generic functions. It works excellently for generating new methods of already existing functions. One of my important use cases is the generation of Base.getproperty methods. The code example utilizes multiple dispatch for memoizing function results as methods, with zero runtime cost. My impression at first glance is that this cannot be done with RuntimeGeneratedFunctions.jl. But I could be wrong.
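
To make the getproperty use case concrete, here is a purely illustrative sketch (the type name, field layout and helper function are invented for this example): an eval-based generator that adds a Base.getproperty method whose property-name lookup is built at generation time.

# hypothetical: a type packing several small unsigned fields into one UInt64
struct Packed
    data::UInt64
end

# generate Base.getproperty(::Packed, ::Symbol) as an if/else chain over the
# property names, with shifts and masks computed once at generation time
function generate_getproperty(names::Vector{Symbol}, bits::Int)
    mask = (UInt64(1) << bits) - 1
    body = :(error("Packed has no property ", name))
    for i in length(names):-1:1
        shift = (i - 1) * bits
        body = :(name === $(QuoteNode(names[i])) ?
                 (getfield(p, :data) >> $shift) & $mask : $body)
    end
    @eval Base.getproperty(p::Packed, name::Symbol) = $body
    return nothing
end

generate_getproperty([:lo, :mid, :hi], 8)
Packed(0x030201).mid   # == 0x02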

I think both approaches avoid “world age” problems by delaying code generation. In my proposal, the user is responsible for calling generate only after all prerequisites are in place. @RuntimeGeneratedFunction wraps the expression to compile in a struct which is made callable (its supertype is Function). It looks like the expression is compiled when this struct is first called. I am not sure about the runtime overhead caused by this wrapping. I will contact the author; maybe he can clarify things and post a statement here.

I can imagine the following issues with the eval-ed solution. Possibly nothing new here for the OP, but someone may find this helpful, and I am trying to validate my understanding of the limitations of eval:

  • The world age issue with new methods: Between calling generate and using the newly generated methods, execution must reach top level. Usually not a problem in scripts, but it may be for a large application. Also, it seems impossible for any package that uses generate to provide a single-function API.

  • The information to generate all methods may not be available before the need to use one of them. (finer-grained generation can help)

  • Generating everything at once may take a long time, and if it must happen at “startup”, that may worsen the TTFP (time to first plot) problem.

Overall it seems like an acceptable solution for some use cases, but I would first try to fix the @generated solution.

It would be great to hear what more seasoned Julians think!

About @generated: method tables are mutable global state, which generated functions are forbidden to “access” during generation. At least in theory. But practice is different: e.g. RuntimeGeneratedFunctions.jl violates this rule. Also, in my understanding, taken literally this rule would also mean that generated functions cannot call other functions at all, which is absurd.

The big question for me is: why did wrapping the call in invokelatest fail to help? Do I understand correctly that invokelatest does not work during @generated generation?

An important requirement, thanks. It is fulfilled in my example code (the call is at global level), but it might be violated if someone wraps initialization and subsequent use in a function, for convenience.
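
A minimal sketch of that pitfall (the wrapper function is hypothetical, and it assumes generate has not already been run for MyEnum at top level): generation and first use inside one function body fail, because the eval-ed method only becomes visible once execution returns to the top level.

function init_and_use()
    generate(MyEnum)
    return f2(MyEnum, Val(e2))   # world-age error: the new f2 method is "too new"
end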

Sure. I did not work this out in the example; it is already quite complex for a discussion post. I would like to emphasize this point: in real-world applications, think about parameterizing generate or supplying several generate functions.

Right. It’s so easy to overuse multiple dispatch. Moving a function parameter into a type parameter, as Val(s) does in my example, multiplies the number of methods to generate by the cardinality of s. It does not scale. You can find this in the wild, e.g. when using TypedTables.jl: it works fine with a couple of tables, but do not try to model a complete database with hundreds of table types.

It must be “before use”, not necessarily at startup. A complex application could do lazy initialization and include source code with types, methods and code generation later on. However, that needs additional logic. I would not do it in a batch or server application.

Have a look at the example code (or just run it, maybe in a debugger). f1(App.MyEnum, Val(e2)) calls Base.invokelatest(bitsizeof, MyEnum). No world age error. bitsizeof calls Base.invokelatest(typemax, MyEnum). World age error.

So: yes, it happens during the generation phase. Generation is aborted, and the generated function is never called.

What I really do not understand: if bitsizeof can be called in f1 via Base.invokelatest, why does the call of typemax in bitsizeof fail? In the expression sequence in Main, f0 is called before f1. It calls exactly the same sequence of functions as f1 does before code generation, including a call to typemax(::Type{MyEnum}). How can it be that typemax(::Type{MyEnum}) exists in the world state when returning from f0, but is not found in the newest world in the subsequent call of f1?
If there is no answer in the next few days, I will open a Julia issue on this.

I suspect that, from the point of view of the Julia committers, there is nothing to fix; everything is working as designed. The “world age” concept is well thought out and results in a clear, understandable error message. But the error is a programmer error (improper use of @generated), not a bug in Julia Base.

My understanding is slightly different: you may access global state in a @generated block, at least the method tables, but it is state frozen at the time the @generated block was precompiled. This is checked on every subsequent call, and if a method is called which exists in the current world state but not in the frozen state, we get the “world age” error.


You are right, this is also documented.

This issue helps to understand the background: generated functions seem to have too-conservative world age counter set · Issue #23223 · JuliaLang/julia · GitHub

And maybe Cassette’s solution with an extra macro and some black magic is also relevant: add pass type generation · Issue #26 · JuliaLabs/Cassette.jl · GitHub


There is no runtime overhead to them. You can test it out, and the regression tests in the repo make sure that’s the case.

Essentially what it does is take the expression, hash it, and push it into a dictionary. That hash is kept in the type. When the function is called, an @generated function is invoked which takes the code from the dictionary at the spot of the hash and compiles it. The compiled code is cached. Technically this violates purity, since the code in the dictionary could in theory be changed, and this process caches the code (to achieve maximal runtime performance), which will not update if the code in the dictionary is changed. This is the violation that @tisztamo refers to. In practice, it’s hash-based, so it should never be directly modified anyway, since then the hash would be wrong and you’d want a different RGF.
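
A rough conceptual sketch of that mechanism (my own simplification with invented names, not the actual RuntimeGeneratedFunctions.jl source): the expression is stored in a global table keyed by its hash, the hash travels in the type, and a generated function splices the stored body back in on the first call.

const EXPR_CACHE = Dict{UInt,Expr}()    # global table of expression bodies

struct RGF{H} <: Function end           # callable wrapper; H carries the hash

function make_rgf(ex::Expr)             # ex is a body written in terms of x
    h = hash(ex)
    EXPR_CACHE[h] = ex
    return RGF{h}()
end

# the generator reads the cached body: this is the purity violation discussed
# above, tolerable because the entry behind a given hash is never mutated
@generated function (::RGF{H})(arg) where {H}
    body = EXPR_CACHE[H]
    return :(let x = arg; $body; end)
end

g = make_rgf(:(x + 1))
g(2)   # == 3; compiled on the first call, cached afterwards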

Guaranteeing purity of functions is very difficult in Julia, due to Julia’s concept of generic functions, which are by design redefinable at any time, in any module. Hashing is a good solution to prevent unintentional changes.

I opened this issue; the conclusion: @generated works as designed, and using Base.invokelatest in a @generated function (or any function called from there) is forbidden. If it is used, the behaviour of the program is undefined. This applies to the apparently working call Base.invokelatest(bitsizeof, T) in f1 and to the strange error message of the subsequent call Base.invokelatest(typemax, T) in bitsizeof, stating that there is a newer version of typemax which cannot be called.

If there is any issue, it is a documentation issue: the documentation on @generated and Base.invokelatest does not explicitly state that it is not allowed to call Base.invokelatest from a @generated block. Stefan Karpinski admitted: ‘It might be good to clarify this in the docs though.’
