I would like to add function metadata and reflection to accomplish the following (I’ll use a simplistic scenario):

Say I am building a big-data solution (call it ‘g’) using a library of functions (‘f_i’) for performing sort, set-intersection, disk access, …
Assume that the library has a few versions for each of these “primitive” functions, and that their performance depends on the data types, data size, data distribution, h/w (disk, network, cores) parameters, …

Knowing this dependence, I would like to annotate each of the primitive functions with its known “performance dependence” function and, using some sort of reflection on my function (‘g’), be able to select the library functions that optimize my performance criteria.

This example might seem a bit contrived, but my actual use case (which is more complicated to describe) would greatly benefit from such a feature (hopefully without becoming a Julia parsing guru)

Actually, you can profitably use a simple version of traits here:

julia> struct Fast end

julia> struct Slow end

julia> execute(f::Fast) = "fast!"    # `run` would clash with Base.run, so use a fresh name
execute (generic function with 1 method)

julia> execute(f::Slow) = "slow!"
execute (generic function with 2 methods)

julia> d = Dict(sin => Fast(), cos => Slow())
Dict{Function, Any} with 2 entries:
  cos => Slow()
  sin => Fast()

julia> execute(f) = execute(d[f])
execute (generic function with 3 methods)

julia> execute(cos)
"slow!"

julia> execute(sin)
"fast!"

Or, avoiding the `Dict` lookup entirely by putting the annotation in a method:

julia> annotate(::typeof(sin)) = Fast()
annotate (generic function with 1 method)

julia> execute(f) = execute(annotate(f))
execute (generic function with 3 methods)
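The same trait idea can select among library variants based on the runtime input rather than just the function’s identity. A minimal sketch, assuming a made-up size threshold and invented names (`strategy`, `mysort`); in practice the threshold would come from the annotated performance model:

```julia
# Trait types describing which implementation to use.
abstract type Strategy end
struct InMemory <: Strategy end
struct External <: Strategy end

# Compute the trait from the input; the 1_000-element cutoff is an
# arbitrary placeholder for a real cost model.
strategy(xs) = length(xs) < 1_000 ? InMemory() : External()

# Dispatch on the computed trait to pick the implementation.
mysort(xs) = mysort(strategy(xs), xs)
mysort(::InMemory, xs) = sort(xs)                 # plain in-memory sort
mysort(::External, xs) = sort(xs; alg=MergeSort)  # stand-in for an external sort
```

Calling `mysort(rand(10))` then routes through `InMemory`, while a million-element input routes through `External`, with no branching in the caller’s code.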

I’ve thought about this for a bit. I think it would be really interesting to have an extension to multiple dispatch that allowed methods to be picked via a simple cost function. The details, of course, would be really complicated to implement, but if successful, it would simplify many of the generic-fallback issues in sparse linear algebra.
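Short of changing the dispatch system, a user-level version of “pick the method via a cost function” can be layered on top of ordinary dispatch. A hedged sketch, with invented names (`Impl`, `choose`, `costsort`) and toy cost models:

```julia
# Each candidate implementation carries a cost model mapping input size
# to an estimated cost.
struct Impl
    f::Function
    cost::Function   # n -> estimated cost for input size n
end

const SORT_IMPLS = [
    Impl(xs -> sort(xs; alg=InsertionSort), n -> Float64(n)^2),     # wins for tiny inputs
    Impl(xs -> sort(xs; alg=QuickSort),     n -> n * log2(n + 1)),  # wins otherwise
]

# argmin(f, itr) returns the element of itr minimizing f (Julia >= 1.7).
choose(impls, n) = argmin(impl -> impl.cost(n), impls)

costsort(xs) = choose(SORT_IMPLS, length(xs)).f(xs)
```

The selection happens at run time per call, so unlike true dispatch it is not free; a compile-time variant would need the cost models to be resolvable from types alone.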

I may not understand the specs fully, but I think that traits are the perfect solution here. See e.g. StaticArrays.jl, which uses a Size trait (or similar) in quite a few places to fall back to generic linear algebra methods for arrays above a certain size.

Thanks for the suggestions.
I am looking for a dynamic-analysis capability rather than automatic dispatching.
I would like to take a function and “reflect” on it (including loops and all…) to discover how many times each of the library functions is called and with what parameters, so that I can output both an estimate of expected resource usage and identify whether the “contracts” of the library functions are met (e.g. maintaining a security parameter when calling a sequence of cryptographic primitives).
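Short of full static reflection, one pragmatic route is to instrument the library: wrap each primitive so that running ‘g’ on representative data records every call together with its argument sizes. A minimal sketch with invented names (`instrument`, `lib_sort`, `CALL_LOG`); packages like Cassette.jl can achieve a similar effect without hand-wrapping each function:

```julia
# A global log mapping a primitive's name to the sizes it was called with.
const CALL_LOG = Dict{Symbol, Vector{Int}}()

# Wrap a library primitive so every call records the total argument size
# before delegating to the original function.
function instrument(name::Symbol, f)
    return (args...) -> begin
        push!(get!(CALL_LOG, name, Int[]), sum(length, args))
        f(args...)
    end
end

lib_sort      = instrument(:sort, sort)
lib_intersect = instrument(:intersect, intersect)

# The "big-data" pipeline under analysis, built from the wrapped primitives.
g(xs, ys) = lib_intersect(lib_sort(xs), lib_sort(ys))

g(collect(1:100), collect(50:150))
CALL_LOG[:sort]   # the two recorded sort calls, with their input sizes
```

The same wrapper is a natural place to check per-call contracts (e.g. asserting a security parameter before delegating), and the accumulated log feeds a resource-usage estimate.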