Is there a way to know in advance if a function definition will overwrite a previous one?

I’d like to know in advance whether defining a method will overwrite a previous one. E.g., say I have

f(x::Int) = 3

In pseudo-code I’m trying to do something like

m = method that would be overwritten by f(x::Int) = 4
if m !== nothing
    f(x::Int) = 4
end

Is there a way to know the method m which will be eventually overwritten?


In my actual code I have access to the signature of the new definition (as an expression in a macro context), so I thought I could use methods(f, (Int,)), but it doesn’t work because it matches the method that would be called, even when it would not be overwritten:

f(x) = 2


> methods(f, (Int,))
# 1 method for generic function "f" from Main:
 [1] f(x)
     @ REPL[3]:1

Ok, maybe

julia> tuple(methods(f, (Int,))[1].sig.parameters[2:end]...)

does the trick to match the signature more strictly
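For reference, a small sketch of the difference, using `which` (which returns the single most specific applicable method) rather than indexing into `methods`:

```julia
f(x) = 2

# `which` returns the method that would be called for these argument types.
m = which(f, (Int,))
# Its stored signature is Tuple{typeof(f), Any}; compare the argument part
# with the query to test for an *exact* match:
Tuple(m.sig.parameters[2:end]) == (Int,)   # false: the match is f(::Any)

f(x::Int) = 3
m = which(f, (Int,))
Tuple(m.sig.parameters[2:end]) == (Int,)   # true: an exact f(::Int) exists now
```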

To answer the question from your title:
Generally no, and I think the question might be an XY problem.

To go into a bit more detail: you can of course try to detect whether there is currently a method with a given signature, but this is not good enough. The method could simply not be defined yet because it is loaded later, or generated later, or… IIUC from your example, you would consider foo(x::Int) not to overwrite a definition of foo(x::Any), but this is not really what Julia’s semantics reflect in my view.

The aim of your question is likely connected to type piracy (perhaps inspired by this current thread: How to detect/avoid type piracy?). If that is the goal, then no metaprogramming can really help you. Even checking whether a method “exists” before you provide your own definition does not save you from committing type piracy. Also, that check should be stricter than what you showed in your post. Defining foo(::Int) for a foo you do not own that has a definition for foo(::Any) is definitely type piracy and can bite you.
Perhaps one can argue that defining foo(::String) if you don’t own foo(::Number) is not quite as bad, since hopefully nobody relied on the MethodError being hit, but it is still type piracy.


Thanks for the help @abraemer. To add some more context: I’d actually like to redefine a function which was previously defined in the same module (by my macro at a previous point inside the module). The problem with just redefining it without any further thought is that, as I discovered, precompilation is then not possible: it gives a warning “function overwritten at …”. To try to solve this, I thought I could just remove the previous method with Base.delete_method, but that means I need to identify if and which method was previously defined. I’m doing something along the lines of

    m = which($f_name, tuple($(f_signature...)))
    if tuple($(f_signature...)) != fieldtypes(m.sig)[2:end]
        # no method with exactly this signature defined yet

where f_name contains an expression with the name of the function and f_signature its signature, always as an expression. This seems to work fine. Is there any issue with something like that?
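As a concrete (non-macro) sketch of that check-then-delete pattern, with a hypothetical function `g` and a hard-coded signature; note that `which` throws a MethodError if no method matches at all, hence the `isempty` guard:

```julia
g(x::Int) = 1

sig = (Int,)
if !isempty(methods(g, sig))
    m = which(g, sig)
    if Tuple(m.sig.parameters[2:end]) == sig
        # an exact g(::Int) already exists: remove it before redefining,
        # instead of silently overwriting it
        Base.delete_method(m)
    end
end
g(x::Int) = 2
g(3)  # 2
```

Keep in mind that `Base.delete_method` is internal API and that deleting methods invalidates compiled code, so this is best kept out of any hot path.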

This still sounds a bit confused to me. Let me try to understand better:

  1. You define some function in your module (via a macro)
  2. Later you overwrite it
  3. This gives a warning that precompilation is broken.
  4. You want to retain precompilation and thus work around this redefinition.

This raises a couple of questions for me:

  1. Why do you need to redefine the function at all? Why not just have the latter definition?
  2. When you check and see that the function is already defined, you want to remove the old method and put your new definition in its place, right? That won’t fix precompilation in a meaningful way, though. I am not sure what will happen when you try this, but in the best case the precompilation is just wasted, and in the worst case you break something badly, so I would recommend not doing this.
  3. If you really need to overwrite methods constantly, then precompilation does not make sense and you should probably just turn it off (by putting __precompile__(false) at the top of the module). You can possibly put the stuff that needs to change all the time in its own module to still have precompilation for the rest of the code, but tbh my understanding is a bit hazy in that area.
  4. If you define the function via a macro, then the easiest thing would be to have that macro generate a bit more code that also does some bookkeeping for you, so you can check for the existence of definitions more easily.
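A hypothetical sketch of what such macro-side bookkeeping could look like: the macro records each signature it has expanded in a module-level set, so later expansions consult the bookkeeping instead of inspecting method tables. All names here are made up.

```julia
const SEEN_SIGNATURES = Set{Any}()

macro tracked(def)
    # assumes the short form `name(args...) = body`
    call = def.args[1]
    key = (call.args[1], call.args[2:end])   # (name, argument expressions)
    already = key in SEEN_SIGNATURES         # bookkeeping at expansion time
    push!(SEEN_SIGNATURES, key)
    already && @warn "redefining" key
    esc(def)
end

@tracked h(x::Int) = x + 1
h(1)                            # 2
(:h, [:(x::Int)]) in SEEN_SIGNATURES  # true
```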

Yeah… or keep some bookkeeping of which functions you need to generate, and generate all the functions at the end.


I have seen this kind of technique in a couple of repos, e.g. in ProtoStructs.jl, where methods are removed and new ones are added. I would be okay with wasting the precompilation of that function if that still fixes the bad precompilation warnings. At the same time, I don’t want something bad to happen, like undefined behaviour. I thought about __precompile__(false), but I wasn’t sure how to apply it only to those functions defined (maybe in another package) with my macro; I’m not sure if this is even possible.

But yes, fortunately I already have the bookkeeping of methods in place. What is keeping me from just defining all the methods at the end is that I would not want to require the user to register the functions at the end with some other macro. Do you know if there is a way to tell Julia to define them at the end of a module automatically, @abraemer @Tarny_GG_Channie?

One thing you could play around with is using more modules. I.e. you could think about generating a whole module per function and importing the function from there (your macro would generate the necessary code).

Or you could make a single module (with precompilation disabled) for all the functions you need to generate and use @eval module to add the functions to that module.
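A tiny sketch of the `@eval module` flavor (hypothetical names; in a real package the `Store` module would carry `__precompile__(false)` per the earlier suggestion):

```julia
module Store end   # dedicated module to hold the generated functions

for (i, T) in enumerate((Int, String))
    @eval Store q(::$T) = $i   # add/replace definitions inside Store
end

Store.q(0), Store.q("")  # (1, 2)
```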

It is kinda hard to suggest useful things without knowing more about your requirements/constraints. If you produce a MWE code for us to play around with, it would be easier to help you.


Yes, sure, no problem. More specifically, I’m talking about the @dispatch macro in MixedStructTypes.jl. I discovered this issue when another package tried to use it and got the overwriting warnings (Use dispatch instead of manual branching by Tortar · Pull Request #90 · mastrof/MicrobeAgents.jl · GitHub). This is how it works:

julia> using MixedStructTypes

julia> @sum_structs AB begin
           struct A end
           struct B end
       end

julia> @dispatch f(::A) = 1
f (generic function with 1 method)

julia> @dispatch f(::B) = 2
f (generic function with 1 method)

julia> f(A())
1

julia> f(B())
2

The dispatch macro actually adds/substitutes some branches.

Do you still think that the module tricks you suggest can be used? I’m still digesting the suggestion

Maybe I could use something like

julia> macro m(name)
           e = quote
               @eval module $name
                   f() = $name
               end
               $__module__.f = $name.f
           end
           return e
       end
@m (macro with 1 method)

which seems to work fine

julia> @m X
f (generic function with 1 method)

julia> f()

julia> @m Y
f (generic function with 1 method)

julia> f()

Whenever I change the function, I would produce a new module like that.

Mmh, I hate the solution I proposed: forbidding precompilation for the @dispatch-ed functions would also have side effects on other functions in the package which use them, I think, something I would not want.

I would just really like to do some bookkeeping and, after all macros are expanded in a module (maybe when the last one of that kind is called?), evaluate all the definitions. Do you have any idea how to accomplish this, @abraemer? Thanks a lot for the help anyway, really appreciated.

Well, the inelegant solution would be to just have 2 macros, e.g. @dispatch_add and @dispatch_finish or so, where the first is just used to add branches to the bookkeeping and the second actually generates the definition. You could use this split to allow @dispatch to define multiple branches at once, like:

@dispatch f begin
    f(::A) = 1
    f(::B) = 2
end

This would transform to

@dispatch_add f(::A) = 1
@dispatch_add f(::B) = 2
@dispatch_finish f

I don’t think there is a solution that preserves the exact current syntax without downsides. The module solution above should work with precompilation as well (as you can simply generate a new module each time, which will then be precompiled), but the huge downside is that the function name is no longer a const value, so using this function for dispatch means using a global… Typing it makes it slightly better, but it will still be horrible for performance.

Edit: A slight variant: instead of @dispatch_finish f you could have a @dispatch_generate_all that just generates all the definitions; users would then need to put that once at the very bottom of their module when they decide to use the package. That seems quite low effort to me, so maybe it’s the closest we can get to the current state.
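A minimal sketch of what that two-macro split could look like (all names hypothetical; `@dispatch_add` only records the expression at expansion time, `@dispatch_finish` splices every recorded definition back in):

```julia
const PENDING = Dict{Symbol, Vector{Expr}}()

macro dispatch_add(def)
    name = def.args[1].args[1]               # `name(args...) = body` assumed
    push!(get!(PENDING, name, Expr[]), def)  # bookkeeping at expansion time
    nothing                                  # expand to nothing: no method yet
end

macro dispatch_finish(name)
    defs = pop!(PENDING, name, Expr[])
    esc(Expr(:block, defs...))               # emit all recorded definitions
end

@dispatch_add p(::Int) = 1
@dispatch_add p(::String) = 2
@dispatch_finish p

p(0), p("")  # (1, 2)
```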


That is what I thought as well. I’m not sure if there is some sort of trick which could keep the current syntax entirely, e.g. call_at_the_end_of_the_module(f); maybe I want too much :smiley:

Well, I just found a hack :smiley: Not saying you should use it though… I am unsure whether this is behavior that should be relied upon.

We can use a @generated function to generate the dispatch function on first call from some global variable.
Consider this behavior:

BREAKPOINTS = [3, 7]  # assumes some global datastore of breakpoints

@generated function foo(x)
    stmts = [:(if x < $i return $i end) for i in sort(BREAKPOINTS)]
    return Expr(:block, stmts...)
end

foo(6.3) # this generates the function with all BREAKPOINTS defined up to here
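To make the staleness concrete, a self-contained variant (hedged: this deliberately relies on generator-time global state, which the manual warns against):

```julia
BP = [2, 5]   # hypothetical global datastore of breakpoints

@generated function bar(x)
    # the generator runs once per argument-type specialization,
    # reading BP at that moment
    stmts = [:(x < $i && return $i) for i in sort(BP)]
    return Expr(:block, stmts..., :(return nothing))
end

push!(BP, 1)   # added BEFORE the first call: picked up by the generator
bar(1.5)       # 2
bar(0.5)       # 1
push!(BP, 0)   # added AFTER the first call: NOT picked up, no regeneration
bar(0.5)       # still 1: the generated body is cached
```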

So with this in mind:

  1. The first @dispatch should register the function name in some global datastore and generate the @generated function definition.
  2. All following @dispatch macros just put their data into the datastore.

That avoids the precompilation issues and retains the exact current semantics, for packages at least.

The downsides here are:

  1. Of course this function is not precompiled - which a package author could fix by just calling it once in a precompile workload
  2. In interactive development, adding new branches via @dispatch AFTER the function was run once does NOT lead to recompilation… This could maybe be fixed by a second macro @dispatch_interactive that behaves like the current @dispatch and is intended specifically for interactive use, not for package code.

Ingenious!! I think 2. could be solved by checking isinteractive() and behaving differently in that case. 1. is still a bit concerning to me, because it means that any function which contains these @dispatch functions wouldn’t be precompiled either, right?

No, I think that’s fine. IIUC, precompilation, if you don’t do anything else manually, is essentially just parsing the Julia files and creating the bookkeeping stuff. Inference and actual compilation are only part of precompilation if you actually run some functions; otherwise Julia doesn’t know for what types it should compile methods.


Then, even if it is a bit hacky, I think we have a solution! But I need to test this before crying victory :smiley: Thank you a lot @abraemer


I am unsure whether this is behavior that should be relied upon.

Before relying on it, though, I could maybe try to open a new thread for this :slight_smile:

Well, actually I am sure that this hack breaks the usual contract of generated functions (see 5. in the manual: Metaprogramming · The Julia Language).
The thing is, I think it will be relatively fine. Maybe some future Julia version breaks something, but I don’t really see how. So while it is technically undefined behavior or something like that, I think it should be rather fine. But someone else might have more to say on that. The official recommendation will be to not do it, though.

Edit: Thinking about it some more: we really need to try this in a module/precompilation setting… Maybe the global variable holding the dispatch rules does not survive well into runtime, and that would spell doom for this approach. In that case I would resort to the 2-macro variant.
