Help optimise an advice implementation

Hello, in my DataToolkitBase.jl I make extensive use of advised functions, e.g. instead of

identity(x)

I’ll have

@advise identity(x)

As such, I’m very interested in minimising the performance impact of advised calls. Currently, an advised call seems to have an overhead of a few microseconds.

I’ve had a good look at this myself, but I can’t see anything I can do to improve this, and so I’m hoping other people might be able to provide some pointers :pray:.


Each “advice” is represented with the following structure,

struct DataAdvice{func, context} <: Function
    priority::Int # REVIEW should this really be an Int?
    f::Function
end

where the “advise function” (f) takes a tuple of:

  • a post-processing function
  • the function being called
  • the function arguments
  • the function keyword arguments

and returns a tuple of the same form.

This allows an “all advices” function to be created by composing all of the individual DataAdvices, and I do exactly this: ∘(reverse(advisors)...).
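
To make the composition concrete, here’s a minimal, self-contained sketch of that pattern (the advisor closures here are made up for illustration, not taken from the package):

# Each advisor maps a (post, func, args, kwargs) tuple to a tuple of the same form.
a1 = ((post, func, args, kwargs),) -> (post, func, map(x -> x + 1, args), kwargs)
a2 = ((post, func, args, kwargs),) -> (string ∘ post, func, args, kwargs)

advisors = [a1, a2]
alladvice = ∘(reverse(advisors)...)  # == a2 ∘ a1, so a1 sees the call form first

alladvice((identity, sqrt, (3,), (;)))
# post becomes string ∘ identity, args becomes (4,); func and kwargs are untouched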

Each advice is called with this method:

The method definition
function (dt::DataAdvice{F, C})(
    (post, func, args, kwargs)::Tuple{Function, Function, Tuple, NamedTuple}) where {F, C}
    # Abstract-y `typeof`.
    atypeof(val::Any) = typeof(val)
    atypeof(val::Type) = Type{val}
    # @info "Testing $dt"
    if hasmethod(dt.f, Tuple{typeof(post), typeof(func), atypeof.(args)...}, keys(kwargs))
        # @info "Applying $dt"
        result = invokepkglatest(dt.f, post, func, args...; kwargs...)
        if result isa Tuple{Function, Function, Tuple}
            post, func, args = result
            (post, func, args, NamedTuple())
        else
            result
        end
    else
        (post, func, args, kwargs) # act as the identity function
    end
end

In the no-op case, this has an overhead of ~0.5us and performs 9 allocations (of ~430 bytes in total). I can reduce this to ~0.2us (and 8 allocations) by resolving the hasmethod test in a generated function, but then the first call takes as much as 1000us.

Generated function form
@generated function (dt::DataAdvice{F, C})(funcargs::Tuple{Function, Function, Tuple, NamedTuple}) where {F, C}
    Tpost, Tfunc, Targs, Tkwargs = funcargs.parameters
    kwargkeys = first(Tkwargs.parameters)
    if hasmethod(F, Tuple{Tpost, Tfunc, Targs.parameters...}, kwargkeys)
        quote
            post, func, args, kwargs = funcargs
            result = invokepkglatest(dt.f, post, func, args...; kwargs...)
            if result isa Tuple{Function, Function, Tuple}
                post, func, args = result
                (post, func, args, NamedTuple())
            else
                result
            end
        end
    else
        :funcargs
    end
end

While this is somewhat underwhelming on its own, with the generated method the total overhead for calling a ~dozen composed no-op advice functions drops from ~4us to ~300ns. The first-call cost is now as much as 10,000us though. ~10ms of first-time cost per function, for ~50 different advised calls, adds up to half a second of latency, which isn’t ideal.


Separately from the cost of applying the advice functions to the tuple, there’s the cost of creating that tuple and then going from the final tuple to the end result.

This is done by the following method, which converts a (func, args...; kwargs...) call to said tuple and then obtains the final result:

function (dta::DataAdviceAmalgamation)(func::Function, args...; kwargs...)
    post::Function, func2::Function, args2::Tuple, kwargs2::NamedTuple =
        dta((identity, func, args, merge(NamedTuple(), kwargs)))
    invokepkglatest(func2, args2...; kwargs2...) |> post
end

This seems to incur an overhead of ~1.5us.


For reference, the full “advice” implementation can be found here: src/model/advice.jl.

If there’s any way I could shave this down, and you could give me some pointers on that, it would be much appreciated :grinning:.


Did you ever make any progress here? I was thinking about an advice system as well recently.

Unfortunately not. In my packages, thankfully these aren’t called enough to be a big performance concern, but it would be great to get the overhead down if possible.

I don’t really understand your type parameters here tbh. They are not used to parametrise fields of the struct, and you don’t use them where you call the advice, if I read that right. So what are they used for?

Somewhat related: DataAdvice.f is abstractly typed. You could maybe consider putting its type into the type parameters like:

struct DataAdvice{func, context, F} <: Function
    priority::Int # REVIEW should this really be an Int?
    f::F
end

Maybe that was your intention anyway with the func parameter? In that case, just use it as the type of f. Or does it refer to the wrapped function?

I think there should be a lot of performance to gain, if we try to make this as type-stable as possible. Could you maybe provide a MWE of the implementation of the advice machinery so it is easy to analyze/benchmark?

I read the code some more (still don’t get the type parameters though) and I have some observations. Disclaimer: I did not check how this advice system is used in practice, so these comments might not be accurate.

  • In the end you always use invokepkglatest which essentially wraps Base.invokelatest. I don’t see why this is necessary. Do you expect to have advised functions in long-running parallel loops while the user redefines some methods?
  • The code contains a lot of @nospecialize. This of course subverts type inference but is this really necessary? (See the sketch after this list.) If you worry about execution speed you usually want as much type inference as possible and specialized methods compiled for everything. Well, except if there is an unreasonable number of type combinations such that compilation times would dwarf any runtime improvements. Did you check that this is the case?
  • If you could afford to compile specialized methods for your Advices then you could restructure your code and types a bit to consistently have the relevant information in the type domain such that the compiler can infer everything. This should give you code that is as fast as if you had written it by hand. Maybe in practice this would not work out due to excessive compilation times but I don’t quite see why this should be the case.
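
For illustration, the difference @nospecialize makes is roughly the following (function names are hypothetical, not from the package):

# One specialised copy of the method is compiled per concrete argument-type
# combination, so the inner call can be inferred (and often inlined):
apply_specialised(f, x) = f(x)

# A single generic method body is compiled and the inner call becomes a dynamic
# dispatch, but no new specialisation (or inference) happens per combination:
apply_generic(@nospecialize(f), @nospecialize(x)) = f(x)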

Thanks for your comments @abraemer.

I put a minimal advice implementation in the slides of my JuliaCon presentation (see p43), which I’ll copy-paste here

struct Advice
    f::Function
end

function (adv::Advice)(callform::Tuple{Function, Function, Tuple, NamedTuple})
    post, func, args, kwargs = callform
    atypeof(val::Any) = typeof(val)
    atypeof(val::Type) = Type{val}
    if hasmethod(adv.f, Tuple{typeof(func), atypeof.(args)...}, keys(kwargs))
        after, func, args, kwargs = invokelatest(adv.f, func, args...; kwargs...)
        (post ∘ after, func, args, kwargs)
    else
        (post, func, args, kwargs) # act as the identity function
    end
end

function (adv::Advice)(func::Function, args...; kwargs...)
    callform = (identity, func, args, merge(NamedTuple(), kwargs))
    post, func, args, kwargs = adv(callform)
    func(args...; kwargs...) |> post
end

function (advs::Vector{Advice})(func::Function, args...; kwargs...)
    callform = (identity, func, args, merge(NamedTuple(), kwargs))
    post, func, args, kwargs = (∘(reverse(advs)...))(callform)
    func(args...; kwargs...) |> post
end

Some example usage:

julia> f(x) = x^2
f (generic function with 1 method)

julia> plus1 = Advice((::typeof(f), x::Int) ->
       (identity, f, (x+1,), (;)))
Advice(var"#4#5"())

julia> postdouble = Advice((::typeof(f), x::Int) ->
       (fx -> fx*2, f, (x,), (;)))
Advice(var"#6#8"())

julia> [plus1, postdouble](f, 2)
18
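
For what it’s worth, the overhead of this toy version can be measured along these lines (using BenchmarkTools, with the definitions above in scope):

using BenchmarkTools

advs = [plus1, postdouble]
@btime $f(2)           # the plain call, for reference
@btime $advs($f, 2)    # the advised call: hasmethod checks + invokelatest on top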

That said, you asked about the particular implementation of DataAdvice and the intent of the func and context parameters. I think the inner constructor should help clarify this:

struct Advice{func, context} <: Function
    priority::Int # REVIEW should this be an Int?
    f::Function
    function Advice(priority::Int, f::Function)
        validmethods = methods(f, Tuple{Function, Any, Vararg{Any}})
        if length(validmethods) === 0
            throw(ArgumentError("Advising function $f had no valid methods."))
        end
        # Extract the advised function's type and the types of its arguments.
        functype, context = first(validmethods).sig.types[[2, 3:end]]
        new{functype, Tuple{context...}}(priority, f)
    end
end

This allows the display of an Advice to show exactly what is being advised.
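
As a rough sketch of what that display could look like (this is not the actual method from the package):

function Base.show(io::IO, adv::Advice{func, context}) where {func, context}
    print(io, "Advice(", adv.priority, "): advises ", func,
          " called with arguments ", context)
end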

I see you’ve put a few more questions to me :slight_smile:

In the end you always use invokepkglatest which essentially wraps Base.invokelatest. I don’t see why this is necessary.

So, in the particular use case of DataToolkit, the problem isn’t just redefinition but that a package may be loaded on the fly during the execution of an advice function. This is signalled with a custom error type, and invokepkglatest handles that in addition to invoking invokelatest.
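
Roughly, the shape of it is something like this (the error type and the loading step here are hypothetical stand-ins, not the actual DataToolkitBase code):

struct PackageRequired <: Exception  # hypothetical; the real error type differs
    pkg::Symbol
end

function invokepkglatest(f, args...; kwargs...)
    try
        Base.invokelatest(f, args...; kwargs...)
    catch err
        if err isa PackageRequired
            # load/import the requested package here, then retry the call so it
            # runs in the new world age with the package's methods available
            invokepkglatest(f, args...; kwargs...)
        else
            rethrow()
        end
    end
end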

The code contains a lot of @nospecialize. This of course subverts type inference but is this really necessary?

Perhaps I’m too greedy, but I’m rather worried about latency. Currently DataToolkit is nowhere near being able to call itself a “lightweight” package in terms of load time and latency, but I’d love it to be able to be an easy dependency for other packages that want to use some external data.

If you could afford to compile specialized methods for your Advices then you could restructure your code and types a bit to consistently have the relevant information in the type domain such that the compiler can infer everything.

This is basically what the @generated approach I showed does, no?

I think the obvious advice here would be not to use advice, heh, as it exposes you to the painful trade-off between compilation and execution time. That said, perhaps you don’t care about run time as much as about compilation time? Have you considered disabling optimization and/or compilation? Something like:

Experimental.@compiler_options optimize=0 compile=min infer=no max_methods=1
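
For reference, @compiler_options applies to the enclosing module, so it would need to sit at the top level of the package module, roughly:

module DataToolkitBase

Base.Experimental.@compiler_options optimize=0 compile=min infer=no max_methods=1

# ... rest of the package ...

end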

Ah, but the flexibility they provide is :ok_hand: and the backbone of the plugin system.

FTR the struct definitions are instead here:

Just at a glance: the f field is annotated with ::Function rather than being given its own type parameter. Parametrising it would make accessing it type stable. Is there a reason why it’s not done that way?

It’s a trade-off between having a field with a non-concrete type and having a container with a non-concrete element type.
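
To illustrate the trade-off (struct names are hypothetical, not the package’s):

struct AdviceUntyped       # f untyped: Vector{AdviceUntyped} has a concrete element
    f::Function            # type, but accessing adv.f is not inferable
end

struct AdviceTyped{F}      # f concretely typed: adv.f is inferable, but a vector of
    f::F                   # different advices is a Vector{AdviceTyped}, whose
end                        # element type is not concrete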

I’m definitely missing something here still.

It looks like the struct has a func parameter which the constructor derives from the function stored in the field f. So Vector{Advice} isn’t concrete because the element type is missing its parameters, and accessing f for a particular Advice isn’t type stable either. So neither is concrete/stable?


Yep, I’m actually giving a bit of attention to load time and inference right now, and as a result the revised struct I currently have no longer has type parameters.