[RFC/WIP] DuckDispatch.jl

I’ve been working through an idea for a package, and I’d love your feedback.

TL;DR: This package allows you to create method dispatches based on the methods defined for a type, not on the type's place in the type lattice.

Here’s a little snippet to get the idea:

@duck_type struct FiniteIterable{T}
    # the signatures that must be defined to make a type a `FiniteIterable`
    function Base.iterate(::This)::Union{Nothing, Tuple{T, <:Any}} end
    function Base.iterate(::This, ::Any)::Union{Nothing, Tuple{T, <:Any}} end
    function Base.length(::This)::Int end

    # Calculate the parameter for `FiniteIterable` for a given type
    @narrow InputT -> FiniteIterable{eltype(InputT)}
end

@duck_dispatch function my_collect(arg1::FiniteIterable{T}) where {T}
    return T[x for x in arg1]
end

ch = Channel{Int}() do ch
    for i in 1:2
        put!(ch, i)
    end
end
my_collect((1,2)) # [1,2]
my_collect([1,2]) # [1,2]

# When a type does not have the methods required, an error is thrown
# currently, this error is a generic ErrorException, but will soon be a proper
# MethodError
my_collect(ch) # error -- no method `length`

In the background, DuckDispatch wraps any DuckType arguments in a Guise{<:DuckType, T}. When execution reaches a method call listed in the DuckType's interface, it unwraps the data.
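As a rough illustration of that wrapping idea, here is a minimal stand-in (these definitions are simplified assumptions for exposition, not the package's actual internals):

```julia
# Simplified sketch: a Guise carries the duck type in its first parameter
# and the wrapped value in its field.
abstract type DuckType end
struct FiniteIterable{T} <: DuckType end

struct Guise{D <: DuckType, T}
    data::T
end

unwrap(g::Guise) = g.data

# Methods listed in the DuckType's interface forward to the wrapped data:
Base.iterate(g::Guise{<:FiniteIterable}) = iterate(unwrap(g))
Base.iterate(g::Guise{<:FiniteIterable}, s) = iterate(unwrap(g), s)
Base.length(g::Guise{<:FiniteIterable}) = length(unwrap(g))

g = Guise{FiniteIterable{Int}, Vector{Int}}([1, 2, 3])
sum(g)  # the wrapper behaves like the data underneath
```

The real package generates this forwarding from the signatures in the `@duck_type` block; the sketch only shows why calls on a `Guise` end up hitting the wrapped value's methods.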

DuckTypes can be composed together, so you can build complex DuckTypes without cluttering the method table with tons of new methods. Composition looks like this:

@duck_type struct Indexible{T}
    function getindex(::This, ::Any)::T end
    @narrow InputT -> Indexible{eltype(InputT)}
end

@duck_type struct Container{T} <: Union{FiniteIterable{T}, Indexible{T}}
    @narrow InputT -> Container{eltype(InputT)}
end

Now, a Container has behaviors for iterate, length, and getindex.


All the dispatch work happens in the type domain, so it completely compiles away in my testing. Since a lot of the composition logic requires recursion, I don’t know how the inference performs in deeply nested DuckType definitions. That said, I see this kind of flexibility as most useful at the top level of a script. I don’t think it makes sense to use this inside a hot loop in general.

I will soon add a @check function along with @narrow to allow custom checking of input types beyond just “does it have these methods.” This is because a lot of types have default methods that don’t make sense. For example, iterate and eltype are defined for Int, but you probably want to exclude Int from an Iterable DuckType.
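The Int example is easy to verify in a REPL; this is plain Base behavior, and it is exactly the false positive a purely method-based check runs into:

```julia
# Numbers iterate once over themselves in Julia, so a check that only
# asks "does iterate exist for this type?" would admit Int as iterable.
hasmethod(iterate, Tuple{Int})   # true
iterate(1)                       # (1, nothing)
```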

All my error messages are generic and unhelpful at the moment. One of the perks of owning the dispatching infrastructure is that I can design very nice error messages. That’s on my roadmap.

Edit: After a couple rounds of optimization, there is no overhead for many types!


This is really neat, especially for being able to dispatch based on behavior rather than the type hierarchy directly. Do you have any thoughts on this versus the various trait or interface implementations?

Also, I think showing what happens when the methods are missing is instructive as well:

julia> struct Foo end

julia> my_collect(Foo())
ERROR: Could not find a matching method for the given arguments.
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] wrap_args(duck_sigs::Tuple{DataType}, args::Tuple{Foo})
   @ DuckDispatch /tmp/tmp.moMlfFdyQf/DuckDispatch.jl/src/MethodDispatch.jl:95
 [3] #_#3
   @ /tmp/tmp.moMlfFdyQf/DuckDispatch.jl/src/MethodDispatch.jl:39 [inlined]
 [4] my_collect(arg1::Foo)
   @ Main /tmp/tmp.moMlfFdyQf/DuckDispatch.jl/src/MethodDispatch.jl:38
 [5] top-level scope
   @ REPL[12]:1

Edit: I think the idea behind this is particularly interesting because it lets you define your interfaces in an easy-to-opt-into but rigidly enforced way. It doesn't require anyone to work directly with your package to interoperate with code that accepts a Guise. In comparison with, say, BinaryTraits.jl, you don't have to apply the trait to conform to the interface.


I use Holy Traits in my code, but I find reading a bunch of method cascades to reason about how a type will dispatch less intuitive than calling DuckDispatch.quacks_like(MyDuckType, SomeType).

I see this project as related to all the work on interfaces because a DuckType is basically laying out runtime checks on an interface (whereas the interface packages are trying to prove interface adherence a priori).

I’ll update my post to reflect the behavior for missing methods!


Very cool idea!

Indeed, from what I can tell, it mixes:

  • the required methods from RequiredInterfaces.jl (ping @sukera)
  • the trait-based dispatch from Interfaces.jl (ping @Raf)

Hope this can lead to a fruitful discussion, the maintainers will know the differences between their packages and yours better than me.


I do like this feature in C++20, so nice to see an effort at making it convenient in Julia, even if it doesn’t compile away yet (in C++ it is of course totally resolved at compile time).


I got it to almost compile away, but I’m hitting an issue.

julia> hasmethod(iterate, (Vector{Int}, Any))
true

julia> hasmethod(iterate, (Tuple{Int}, Any))
false # grrrr...

julia> methods(iterate, (Tuple{Int}, Any))
# 1 method for generic function "iterate" from Base:
 [1] iterate(t::Tuple, i::Int64)
     @ tuple.jl:70

julia> hasmethod(iterate, (Tuple{Int}, Int))
true
hasmethod will compile away, but methods does not. And I can't check "is there any method with a signature this call matches" without !isempty(methods(...)). Since that reads from the global method table, I can't use a generated function either. Does anyone have an idea for how I could convince Julia to do this check only at compile time?
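The combined check described above can be sketched as a single predicate (a runtime sketch only; the whole problem is that the `methods` half of it does not compile away):

```julia
# `hasmethod` only answers whether the exact call signature is covered by
# some method's declared signature, so it misses iterate(::Tuple, ::Int64)
# when probed with a wider second argument type:
hasmethod(iterate, (Tuple{Int}, Any))         # false
isempty(methods(iterate, (Tuple{Int}, Any)))  # false: one method matches

# Fallback predicate: accept if either check finds a method.
has_matching_method(f, argtypes) =
    hasmethod(f, argtypes) || !isempty(methods(f, argtypes))

has_matching_method(iterate, (Tuple{Int}, Any))  # true, via `methods`
```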

Edit: One thought I had was that, when I create a new method with @duck_dispatch, the macro could also eval a frozen version of get_methods for all the behaviors of all the DuckTypes in the signature of the new method. That won’t work because a type defined after the method won’t be included in the frozen method table.

To start the discussion:

I only have required behaviors, whereas Interfaces.jl allows for optional methods. And Interfaces.jl still requires that you add the trait types to the signature of the function.

I also borrow from @Sukera's idea of Meet in the docs for RequiredInterfaces.jl. DuckType is a recursive type which holds Behaviors and the Meet of other DuckTypes. Then, each concrete DuckType defined by a user is a subtype of some nested DuckType.
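For readers unfamiliar with the idea, here is a very rough sketch of that recursive shape (names and layout are assumed for illustration, not the package's actual definitions):

```julia
# Sketch of the recursive type-level layout described above.
abstract type DuckType end

# A Behavior records one required function and argument signature
# purely in the type domain (no fields, no runtime data).
struct Behavior{F, Sig} <: DuckType end

# Meet is the conjunction of several DuckTypes, itself a DuckType,
# so duck types can nest arbitrarily deep.
struct Meet{Ducks <: Tuple} <: DuckType end

# A user-facing duck type then corresponds to some nested Meet:
const FiniteIter = Meet{Tuple{
    Behavior{typeof(iterate), Tuple{Any}},
    Behavior{typeof(length), Tuple{Any}}
}}
```

Because everything lives in type parameters, checks like "does type X satisfy all Behaviors of this Meet?" can in principle be answered by the compiler, which is what makes the zero-overhead dispatch plausible.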

I think that DuckDispatch.jl should be able to consume interfaces from these other packages and create dispatch types, but I have not tried it yet. I want to add that to extensions later once my design has settled down a bit.


EDIT: The post below is out of date. DuckDispatch.jl no longer has an overhead!

If anyone wants to take a stab at areas for improvement, here is a script you can try:

using BenchmarkTools

# Content generated by the @duck_type macro
module tmp

using DuckDispatch

DuckDispatch.@duck_type struct Iterable{T}
    function Base.eltype(::Type{DuckDispatch.This})::Type{T} end
    function Base.iterate(::DuckDispatch.This)::Union{Nothing, Tuple{T, <:Any}} end
    function Base.iterate(
            ::DuckDispatch.This, ::Any)::Union{Nothing, Tuple{T, <:Any}} end
    @narrow T -> Iterable{eltype(T)}
end

DuckDispatch.@duck_type struct IsContainer{T} <: Union{Iterable{T}}
    function Base.length(::DuckDispatch.This)::Int end
    function Base.getindex(::DuckDispatch.This, ::Int)::T end
    function Base.collect(::DuckDispatch.This)::Vector{T} end
    @narrow T -> IsContainer{eltype(T)}
end

DuckDispatch.@duck_dispatch function collect_ints(arg1::IsContainer{T}) where {T}
    return collect(arg1)
end

function collect_ints_concrete(arg1::Vector{T}) where {T}
    return collect(arg1)
end

end # module tmp

@benchmark tmp.collect_ints($[1, 2, 3])
@benchmark tmp.collect_ints_concrete($[1, 2, 3])

On my machine using the current main, I get:

julia> @benchmark tmp.collect_ints($[1, 2, 3])
BenchmarkTools.Trial: 10000 samples with 960 evaluations.
 Range (min … max):   87.292 ns …   4.765 μs  ┊ GC (min … max): 0.00% … 96.94%
 Time  (median):     101.875 ns               ┊ GC (median):    0.00%
 Time  (mean ± σ):   112.963 ns ± 132.757 ns  ┊ GC (mean ± σ):  4.39% ±  3.79%

  ▇█▄▄▄▄▇▆▄▃▄▄▂▂▂▂▁  ▁   ▁▁                                     ▂
  █████████████████████████████▇▇▇▇▆▇▆▆▆▆▆▄▅▅▆▆▅▄▄▅▄▄▅▄▅▅▄▄▅▄▅▄ █
  87.3 ns       Histogram: log(frequency) by time        252 ns <

 Memory estimate: 80 bytes, allocs estimate: 1.

julia> @benchmark tmp.collect_ints_concrete($[1, 2, 3])
BenchmarkTools.Trial: 10000 samples with 996 evaluations.
 Range (min … max):  24.900 ns …   3.689 μs  ┊ GC (min … max):  0.00% … 97.92%
 Time  (median):     29.016 ns               ┊ GC (median):     0.00%
 Time  (mean ± σ):   38.520 ns ± 136.849 ns  ┊ GC (mean ± σ):  14.43% ±  4.04%

  ▄█████▇▅▄▄▃▃▃▃▃▂▂▂▂▂▂▂▂▂▃▂▂▂▂▂▁▁▁▁▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▂
  24.9 ns         Histogram: frequency by time         73.4 ns <

Edit: I’m now matching the same number of allocations, but it’s still about 3 times slower than without a Guise. Still, I’m pleased with this performance for now.

This improvement does not hold for types where we have to fall back to using methods. In this case, there are a bunch of allocations and it is much slower.

Cool to see more interface stuff happening! I’ll definitely take a look over the weekend :slight_smile:


Ah yes, I'm familiar with this area. Trying to do reflection around the details of function methods in Julia currently feels very hacky. In DataToolkit, I have lines like:

first(Base.unwrap_unionall(T).parameters).ub
T isa DataType && T.name.name != :Type

for working out “what types can argument X of f be according to the methods defined, given another argument is of type Y?”


UnionAll and I don’t get along :wink:

Edit: I switched to Tricks.jl and I now have zero-overhead for types where I can use hasmethod and just 4 allocations when I need to fallback to methods! Thanks @oxinabox!

P.S. I think the PSA at the top of the README for Tricks.jl is incomplete. It reads "Tricks.jl is not required post-Julia v1.10.0-DEV.609." But it has a lot of features besides static_hasmethod, so it is still required for those cases.

I've edited my comments above, but just so that they don't get lost in the noise: for types where I can check behaviors with hasmethod, the dispatch system does not allocate at all. It does add some overhead (about 80ns) to the function call right now. I'll try to work on that next.

For types where I have to fallback to using methods, I’m hitting tons of allocations that are out of my control. For now, I’m putting a pin in that issue.


Interesting! I’m actually working on a similar package right now. The package is called ExtendableInterfaces. Unfortunately, it’s been more than a month since I’ve had time to work on it. Hopefully I can get back to work on that package and release an initial version soon.

ExtendableInterfaces has two primary goals, which appear to overlap with the goals of DuckDispatch. The two primary goals of ExtendableInterfaces are to provide:

  • Multiple inheritance of interfaces.
    • An interface is a fixed list of required methods. (Optionality of methods is achieved by defining a subinterface.)
    • A new interface can extend multiple existing interfaces.
    • You can declare which interfaces a type implements.
  • Dispatch on interfaces.
    • The interface dispatch code should compile away.
    • Defining a method with interface dispatches will look something like this:
      @polymorphic f(x: Foo + Bar, y::Int) = ...

However, the overall design and implementation of ExtendableInterfaces differs a bit from the design and implementation of DuckDispatch.


Oh, nice! When you’re ready to share, I’d love to look at the repo to see if there is any chance of merging my stuff into yours. I definitely don’t need to be the owner of anything.

That said, my primary goal is Go-like interfaces where the compiler automatically detects if a type meets the requirements of an interface and automatically does the dispatching. My position is that interface adoption would benefit from requiring very little coordination between packages.


Note that @Raf has also been working on multiple inheritance for interfaces in Interfaces.jl. Here's the draft PR:

I very much appreciate that spirit! At the moment I’m aware of at least 5 initiatives for interfaces, and it would be great if merging some of them were possible:


There is also the unregistered quinnj/Interfaces.jl on GitHub.

It's great to have all of these to compare for/against for a possible Base version one day.

(For this topic, I intentionally didn't use hasmethod in Interfaces.jl because of the amount of false positives out there - fallback methods that just error with a warning are pretty common, as are fallback methods that "just work" on a lot of objects. We really need to run the method at some point to know if it's actually properly defined.)

Also Interfaces.jl trait dispatch will entirely compile away, although the new multiple inheritance may add some compilation load to this.


I intentionally do not use the word "interface" anywhere in the code or docs for DuckDispatch.jl because it is not an a priori proof of adherence to an interface, but a runtime check. Adding return type annotations to the Behaviors of a DuckType will add type conversions to those methods, but that's about all.

Edit: Maybe clearer: declaring a DuckType and dispatching on it doesn't protect you from imposters that put on the Guise of your DuckType without behaving the way you want. But it does guarantee that you won't get any MethodErrors unless you, as the caller, accidentally call a function which is not included in the DuckType's Behaviors.

I also plan to add an additional check function where users can specify that they, for example, don't want <:Number types included in Iterable even though they have eltype and iterate methods.
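A user-supplied check might look something like the predicate below (purely hypothetical: `is_iterable_like` is not part of the package, and the real check API may well differ):

```julia
# Hypothetical check: require the iteration method AND exclude Numbers,
# which pass the method check without being meaningful iterables here.
is_iterable_like(::Type{T}) where {T} =
    hasmethod(iterate, Tuple{T}) && !(T <: Number)

is_iterable_like(Vector{Int})  # true
is_iterable_like(Int)          # false, despite Int having an iterate method
```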

Yes, please make a PR to this effect.


I feel we should really be able to use effects (and a little compile-time code analysis) to break methods down into Always throws vs. May throw vs. Never throws.

Probably even to the point of getting a list of all exception types a method might throw – that should be a reasonable use of abstract interpretation.
Just having "Always throws" or not is enough to eliminate a ton of the false positives – namely the ones that come from stub definitions that just throw "NotImplementedException". And that should just need effects.
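One signal along these lines is already observable today: a method that unconditionally throws infers a return type of Union{} ("bottom"). This uses Base.return_types, which is reflection machinery rather than a stable public API:

```julia
# A stub that always throws is inferred to return Union{},
# distinguishing it from methods that can actually return a value.
stub(x) = error("NotImplementedException")
works(x) = x + 1

Base.return_types(stub, (Int,)) == [Union{}]  # true
Base.return_types(works, (Int,)) == [Int]     # true
```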

I just haven’t got round to looking into it.


One final performance update: I was able to remove the last of the allocations for checking the MethodTable, so it is now zero-overhead for all the test cases I have.

If you find a case where there is overhead, please open an issue!