I have a package which defines a function and some methods for that function when the package is loaded.
In addition, the package defines a macro which returns an expression that defines additional methods for this function, depending on the macro arguments.
Both the function and the macro are exported.
This works fine if the user of the package just does a using MyPackage. However, in general, the user is free to only do a using MyPackage: @my_macro, because that's all they think they need. This breaks the straightforward definition of additional methods, since the function is defined in MyPackage's scope whereas the macro is evaluated in the user module's scope.
I haven't found a clean solution to this problem yet, so I boiled it down to its essentials and wanted to ask whether you know of a better one. I have already added some of my (non-)solutions; you need to comment out all but one to test it.
module MyModule

export my_function, @my_macro

my_function(::Int) = 42

macro my_macro()
    quote
        # This does not work, because `MyModule` is not in
        # the user module's scope.
        $my_function(::String) = -1
        17 |> $my_function |> println
        "" |> $my_function |> println

        # Neither does this work, because `my_function(::Int)`
        # is not known in e.g. `Main`.
        my_function(::String) = -1
        17 |> my_function |> println
        "" |> my_function |> println

        # This does work, but now there are two functions with
        # one method each.
        my_function(::String) = -1
        17 |> $my_function |> println
        "" |> my_function |> println

        # This works, but is problematic if the definition of the
        # method depends on other expressions which should
        # be returned by the macro and are thus not yet available
        # when the next line is evaluated.
        $(@eval my_function(::String) = -1)
        17 |> $my_function |> println
        "" |> $my_function |> println

        # This works, but it feels wrong to push `my_function` into
        # the user's scope when they decided not to have it.
        import .MyModule: my_function
        my_function(::String) = -1
        17 |> $my_function |> println
        "" |> my_function |> println
    end |> esc
end

end

# How to get this to work?
using .MyModule: @my_macro
@my_macro()
No, you're right, I missed it in the pipe |> esc at the very end. The first part does surprise me, because I expected to be able to embed arbitrary Julia objects into an Expr to be evaluated; in other words, whether the name MyModule is available in the call scope shouldn't matter for a directly embedded function object. That's exactly what you do in this case:
julia> (@macroexpand @my_macro).args[2].args[1].args[1]
my_function (generic function with 1 method)
julia> (@macroexpand @my_macro).args[2].args[1].args[1] |> nameof
:my_function
Yet the macro call throws an error about an invalid name "MyModule.my_function", even without the esc or the calls, with just the definition in MyModule's scope. Ironically, writing out MyModule.my_function without any interpolation would work, because MyModule is indeed in the call scope in your example. But we don't necessarily want to rely on that, which is why I like to embed objects. $MyModule embeds the module, not the function, so it works too.
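For reference, a rough sketch of that variant (same names as the original example, only the method definition and calls changed):

macro my_macro()
    quote
        # $MyModule interpolates the module object, so the my_function lookup
        # happens in MyModule no matter what the caller has imported.
        $MyModule.my_function(::String) = -1
        17 |> $MyModule.my_function |> println
        "" |> $MyModule.my_function |> println
    end |> esc
end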
Update: it only throws that invalid-name error for the named method definition; comment that and its associated call out, and the other call works. Interpolating a bare function object where the parser expects a name (a Symbol, possibly qualified by modules, which may themselves be Symbols or interpolated module objects) results in a wrong name. I don't know exactly how it's wrong, because defining a function with var"MyModule.my_function" is allowed but results in a separate function with a weird name that has nothing to do with MyModule. If you really want to interpolate the function object directly into the macro's Expr, instead of the module followed by some symbols, then you need the type-annotated functor syntax:
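Presumably along these lines (reconstructed from the foo MWE below, using the same names as the original example):

# inside the quote ... end |> esc block of the macro
(::typeof($my_function))(::String) = -1
17 |> $my_function |> println
"" |> $my_function |> println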
That just makes me wonder why $my_function in the name position can't do the same thing.
MWE of this interpolation limitation for method definition names; no macros, just eval:
julia> function foo end # motivation: foo or Main may not exist in macro call scope
foo (generic function with 0 methods)
julia> eval(:($foo()=0))
ERROR: syntax: invalid function name "Main.foo"
...
julia> eval(:($(@__MODULE__).foo()=0))
julia> eval(:((::typeof($foo))(x)=1))
julia> foo(), foo("bar")
(0, 1)
Adding and interpolating the module or using a functor are really great solutions, thanks a lot to both of you!
I wonder the same. Interpolating the function is more intuitive to me. Does anyone know if there is a reason why it would not be a good idea to support it? Might be worth an issue if there are no disadvantages involved.
I tried Googling for a GitHub issue earlier and failed. I wouldn't know the reason, but I can speculate that it's not actually straightforward which type's method table should be changed. For the typical const-named function, it seems obvious that we're interested in its singleton type, effectively making $foo()=0 do (::typeof(foo))() = 0. But there are complications for other kinds of functions and callables:
When the callable or function doesn't have a singleton type, e.g. a closure capturing variables, it's not possible to define a method for the callable instance's particular captured values, only for its type (a sketch follows the Bar example below). (::typeof( Bar(args...) ))() would work, but having to instantiate dummy values before interpolation isn't a good thing.
If you instead interpolate the callable's type, say Bar, to dodge the dummy instantiation, you don't have syntax to specify whether you're trying to define a method for the callable instances, (::Bar)(), or a constructor for the callables' type, (::Type{Bar})(). You'd have to pick one, and it's not intuitive which.
When the callable is a type and you do want to define constructor methods, you really don’t want to do (::typeof(Bar)) because that goes into the method table for DataType:
julia> struct Bar end # instead of function foo end
julia> (::typeof(Bar))() = 1234567890 # would work for foo
julia> Bar() # wat
Bar()
julia> DataType() # oh no
1234567890
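A minimal sketch of the first complication (the make_adder closure here is hypothetical, not from the thread): two closures with different captured values share one type, so any method added via that type applies to all of them.

julia> make_adder(n) = x -> x + n;     # each call builds a closure capturing n

julia> f = make_adder(1); g = make_adder(2);

julia> typeof(f) === typeof(g)         # one closure type covers both captured values
true

julia> (::typeof(f))(x::String) = -1;  # meant only for f, but applies to g as well

julia> g("hi")
-1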
So it's practical for definitions to demand either a true name, as in source code (even if qualified with interpolated modules), or a type annotation that makes the target type unambiguous; otherwise eval would have to do a much more complicated branch over an arbitrary interpolated callable to guess what you intended. I'd prefer a better error message than "invalid name", though.
This is an enormously interesting perspective! Thanks for sharing it.
But I still have a missing link in my mental model:
I see that interpolation is complex in general, but why does it get easier when the module is interpolated? Doesn't $MyModule.my_function, i.e. "resolve my_function in the interpolated module MyModule", have to solve the very same problems you outlined for $my_function, which would mean "resolve my_function in the current module", too?
I think the difference there is that $MyModule.my_function means resolving the name :my_function in the interpolated module object MyModule, like the call getglobal(MyModule, :my_function), whereas $my_function or $(MyModule.my_function) is only the function object. Granted, you can get the module (parentmodule) and a usable name (nameof) from that object, but there are callables that don't have a neat name. Like other interpolated objects, callables are fine as values or for calls like "" |> $my_function; we only run into trouble when dealing with types and method tables in definitions.
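A rough sketch of that difference, assuming the MyModule setup from above:

ex1 = :($MyModule.my_function)    # Expr(:., <module MyModule>, QuoteNode(:my_function))
ex2 = :($(MyModule.my_function))  # not an Expr at all: the function object itself

# Evaluating ex1 looks the name up in the embedded module, roughly
# getglobal(MyModule, :my_function); evaluating ex2 just returns the object,
# which has a parentmodule and a nameof but not necessarily a usable name.
eval(ex1) === eval(ex2)           # true here, but only ex1 involves a name lookup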
I think the question around my missing link boils down to:
Would anything break if $my_function were always syntactic sugar for $MyModule.my_function?
That is, instead of "function object" → parentmodule → nameof, effectively just do getglobal(current_module, :my_function). In my current understanding, $my_function always needs to be resolvable in the current module's scope anyway. But maybe I am missing a counterexample?
Definitely. Note that the expression isn't necessarily interpolating a global variable of MyModule; my_function could very well be assigned something else in a local scope, even a function with a different parent module. Changing the expression to $MyModule.my_function would then do something very different from ::typeof($my_function). The disconnect between the information attached to an object and how names and annotations are provided to and handled by definitions is fairly deep.
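A minimal sketch of such a counterexample (other_function and @my_other_macro are hypothetical names), placed inside MyModule next to the original definitions:

other_function(::Int) = 0                 # a different function, also in MyModule

macro my_other_macro()
    my_function = other_function          # local my_function shadows the global one
    quote
        # The object interpolated below is other_function, so this extends
        # other_function; sugar rewriting it to $MyModule.my_function would
        # instead extend the global my_function, i.e. something else entirely.
        (::typeof($my_function))(::String) = -1
    end |> esc
end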
That also generously assumes that it's easy to isolate this change to method definitions. It's not: the various method definition syntaxes are different expressions that all contain a :call expression, which by itself is not distinguishable from an actual call that we don't want to change.
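For instance, the name-and-signature part of a short-form definition and a plain call look exactly the same on their own:

julia> :(foo(x) = x).args[1].head   # the left-hand side of a definition
:call

julia> :(foo(x)).head               # an ordinary call
:call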
Thanks a lot, now I get it! This might still be a desirable thing in the long run, but it would involve all the complications you mentioned and probably a significant amount of time to solve them, only to save typing the module name in some interpolations.
Not sure I would want to interpolate in the name position of a method definition anymore, even if it could be pulled off flawlessly. To summarize the following demo in words: definitions treat the name position strictly as a name, not as the type or function assigned to that name at the time. Definition-time values have to go into type annotations instead. Besides the increased typing, type annotations with non-const names don't seem like a good idea, because a later reassignment of that name can cause confusion and failures, even silent ones. Better to use the const name when possible and resort to annotations (e.g. functors) otherwise.
In the example, you ensure that a const name my_function is used whether you write $MyModule.my_function or $my_function, but in the latter case the definition doesn't know that after interpolation; it just finds a function object where it currently expects a const name. It would be inconsistent and arguably strange for definitions to derive the const name from the interpolated object when possible, given that this isn't done for non-const names in plain source code.
julia> struct Bar end # const global
julia> b = Bar # non-const global
Bar
julia> b() = 0 # fails because treated as non-const name :b
ERROR: cannot define function b; it already has a value
...
julia> let b = Bar
           b() = 0 # fails because the local name b is reassigned
           Bar()
       end
Bar()
julia> (::Type{b})() = 0 # succeeds by definition-time value
julia> Bar() # constructor successfully ruined
0
julia> (::b)() = 1 # succeeds by definition-time value
julia> Bar.instance() # need internal because constructor was ruined
1
julia> b = 2 # change value, now previous definitions are confusing
2
julia> (::Type{b})() = 3 # fails silently(!) by definition-time value
julia> Bar()
0
julia> (::b)() = 3 # fails by definition-time value
ERROR: function type in method definition is not a type
...