Compile Time Dictionary

Does that message potentially result in a new parametrized Max?

No. All concrete node types are known ahead of time.

You could just as well use a precomputed, constant lookup table, which will almost certainly be faster than dynamic dispatch.

A constant lookup table is what I am looking for, although I am not sure it would be faster than dispatching (I think it would be about as fast at runtime). Wouldn’t the specialization of calc compile away the lookup’s dispatch? I thought that was the whole point of using metaprogramming to create the lookup methods rather than using a dictionary.

Is there a way to have a dictionary that is precomputed and immutable? Hence the lookups can be compiled away?

I’m not sure I properly understand what you want - if you just have struct Node{T} and some functions f1(::Node{Int}), f2(::Node{Float64}), etc… then the compiler will specialize f1 and f2 and you see no run-time cost. This should almost always be what you want.
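A minimal sketch of that situation (the field and the function bodies are made up for illustration):

```julia
# A parametric node type; each concrete Node{T} is its own type.
struct Node{T}
    val::T
end

# One method per concrete type; the compiler specializes each call site.
f1(n::Node{Int}) = n.val * 2
f2(n::Node{Float64}) = n.val + 2

# When the argument type is known at compile time, these calls are
# statically dispatched -- no runtime lookup is involved.
f1(Node(3))    # 6
f2(Node(3.0)) # 5.0
```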

However, I did make a compile-time dictionary a couple of years back, I don’t remember for what purpose. Posting code inline below. You can use it like:

julia> using Base: setindex

julia> f1(x) = x * 2
f1 (generic function with 1 method)

julia> f2(x) = x + 2
f2 (generic function with 1 method)

julia> const lookup = let d = StaticDict()
           d = setindex(d, f1, Int)
           d = setindex(d, f2, Float64)
       end
StaticDict{Int64, f1, StaticDict{Float64, f2, StaticDict{StaticDictionary.NullTag, StaticDictionary.NullTag, StaticDictionary.NullTag}}}()

julia> function using_lookup(x::T) where T
           lookup[T](x)
       end
using_lookup (generic function with 1 method)

julia> using_lookup(3)
6

julia> using_lookup(3.0)
5.0

julia> @code_warntype using_lookup(3.0)
MethodInstance for using_lookup(::Float64)
  from using_lookup(x::T) where T in Main at REPL[8]:1
Static Parameters
  T = Float64
Arguments
  #self#::Core.Const(using_lookup)
  x::Float64
Body::Float64
1 ─ %1 = Base.getindex(Main.lookup, $(Expr(:static_parameter, 1)))::Core.Const(f2)
│   %2 = (%1)(x)::Float64
└──      return %2

Probably missing some methods that it should have.

module StaticDictionary

export StaticDict

import Base: keys, values, getindex, haskey, setindex

struct NullTag end

struct StaticDict{K, V, D}
    StaticDict{K, V}() where {K, V} = new{K, V, StaticDict{NullTag, NullTag, NullTag}}()
    StaticDict() = new{NullTag, NullTag, NullTag}()
    
    function StaticDict{K, V}(::Type{C}) where {K, V} where C <: StaticDict
        if haskey(C, Val(K))
            KeyError("Will not implicitly merge StaticDicts with " *
                     "duplicate keys (duplicated key: $K)") |> throw
        else
            new{K, V, C}()
        end
    end
end

const NullDictionaryType = StaticDict{NullTag, NullTag, D} where D

StaticDict{K, V}(::C) where {K, V} where C <: StaticDict = StaticDict{K, V}(C)
StaticDict(::Val{K}, ::Val{V}) where {K, V} = StaticDict{K, V}()
StaticDict(::Val{K}, ::Val{V}, ::C) where {K, V} where C <: StaticDict = StaticDict{K, V}(C)

keys(::D) where D <: StaticDict = keys(D)
keys(::Type{StaticDict{NullTag, NullTag, NullTag}}) = ()
keys(::Type{StaticDict{K, V, D}}) where {K, V} where D <: StaticDict = (K, keys(D)...)

values(::D) where D <: StaticDict = values(D)
values(::Type{StaticDict{NullTag, NullTag, NullTag}}) = ()
values(::Type{StaticDict{K, V, D}}) where {K, V} where D <: StaticDict = (V, values(D)...)

getindex(::D, ::Val{K}) where D <: StaticDict where K = getindex(D, Val(K))
getindex(::D, k) where D <: StaticDict = getindex(D, Val(k))
function getindex(::Type{D}, ::Val{K}) where D <: NullDictionaryType where K
    KeyError("Did not find key $K in StaticDict") |> throw
end
function getindex(::Type{StaticDict{K, V, D}}, ::Val{K2}) where {K, V, K2} where D <: StaticDict
    K === K2 ? V : getindex(D, Val(K2))
end

haskey(::D, ::Val{K}) where D <: StaticDict where K = haskey(D, Val(K))
haskey(::D, k) where D <: StaticDict = haskey(D, Val(k))
Base.@pure haskey(::Type{D}, ::Val{K}) where D <: NullDictionaryType where K = false
function haskey(::Type{StaticDict{K, V, D}}, ::Val{K2}) where {K, V, K2} where D <: StaticDict
    K === K2 || haskey(D, Val(K2))
end

setindex(::D, ::Val{V}, ::Val{K}) where D <: StaticDict where {K, V} = setindex(D, Val(V), Val(K))
setindex(::D, v, k) where D <: StaticDict = setindex(D, Val(v), Val(k))
setindex(::Type{D}, ::Val{V}, ::Val{K}) where D <: NullDictionaryType where {K, V} = StaticDict{K, V}()
function setindex(::Type{StaticDict{K, V, D}}, ::Val, ::Val{K}) where {K, V, D}
    KeyError("Refusing to set a different mapping for key $K in StaticDict") |> throw
end
function setindex(::Type{StaticDict{K, V, D}}, ::Val{V2}, ::Val{K2}) where {K, V, K2, V2} where D <: StaticDict
    StaticDict{K, V}(setindex(D, Val(V2), Val(K2)))
end

end # module

No, if your operation is inherently dynamic, dispatch can never be compiled away. My initial suggestion with @eval was under the assumption that whatever produced the incoming type was type stable - if that is not the case, you will pay for dispatch, which can be more expensive than a lookup in a dictionary or other specialized data structure.

Like @Oscar_Smith said in the beginning, you’re probably not going to do better than an IdDict.
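For reference, the IdDict route might look like this sketch (the table contents and names are made up for illustration):

```julia
# A runtime lookup table keyed by type. IdDict hashes by object identity,
# which is cheap for Type keys since each type is a singleton object.
const runtime_lookup = IdDict{Type, Function}(Int => x -> x * 2,
                                              Float64 => x -> x + 2)

# The lookup happens at runtime; because the stored value has abstract
# type Function, the call afterwards is dynamically dispatched.
apply(x::T) where T = runtime_lookup[T](x)

apply(3)    # 6
apply(3.0) # 5.0
```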

I’m not sure why you want so badly to compile away a dynamic lookup over inherently dynamic data. You can’t get rid of the dynamism if you have to parse an incoming message to figure out what to return. The only way to “compile away” the lookup would be to know ahead of time which messages you need to return at which points in time.

This does not solve the underlying issue of having to dispatch due to being type unstable from parsing the incoming message.

I think everything is type stable. The only thing that is dynamic is the value of val in each node; its type is always the same.

Nor are nodes ever created or destroyed dynamically.

I am not too hung up on compiling away the lookup. I just thought it would be interesting to do. This is for a personal prototype.

You’re quite correct. Like I said, I don’t have time to properly understand the problem right now, but I don’t see how a compile-time lookup table solves the problem unless the whole DAG computation could be boiled down to a lookup table… in which case you could just make the whole thing a lookup table.

I remember a time when I was hung up on trying to metaprogram everything and eventually I found an intuition for where it might be beneficial and where it’s not. I think it can be quite hard to form a mental model of what things are necessarily computed in dynamic fashion.

I’m probably just not explaining it well. I am pretty sure that the metaprogramming solution of creating a type-stable, non-polymorphic lookup method for each concrete node type in my DAG would cause the lookup to compile away, at the expense of a very long compilation time.

Having a compiled method for each possible concrete type seems to be the equivalent of specialization in my mental model.

Similarly, using a method for each type also seems to be the equivalent of a lookup table from type to value, since the methods themselves basically form key-value pairs. The methods would look like:

lookup(::Type{CONCRETETYPE}) = VALUE
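That correspondence can be sketched with @eval generating one such method per concrete type (MaxNode, MinNode, and the mapped values are made up for illustration):

```julia
# Hypothetical concrete node types.
struct MaxNode end
struct MinNode end

# The "dictionary", written once as data...
const NODE_VALUES = (MaxNode => 1, MinNode => 2)

# ...and turned into one method per key. Each method is a constant
# function of its type argument, so when the type is known statically
# the whole lookup constant-folds away.
for (T, v) in NODE_VALUES
    @eval lookup(::Type{$T}) = $v
end

lookup(MaxNode)  # 1
lookup(MinNode) # 2
```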

No, I think I understood what you’re trying to do. The issue is that while lookup itself is type stable, parsing a message received over the network ensures that the compiler cannot know ahead of time which specialization of lookup to call. Hence, it has to insert a dynamic dispatch, no way around that.

Parsing is inherently a type-unstable operation, because the output type depends on a value, not on an input type.
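A toy example of that, assuming a message carries a tag that selects the result type (all names here are illustrative):

```julia
struct MaxMsg end
struct MinMsg end

# The return type depends on the *value* of the tag, not on any input
# type, so the compiler can only infer Union{MaxMsg, MinMsg} here.
parse_message(tag::String) = tag == "max" ? MaxMsg() : MinMsg()

handle(::MaxMsg) = "max"
handle(::MinMsg) = "min"

# Which `handle` method runs is only known once the message value has
# been inspected at runtime, so this call is dynamically dispatched.
handle(parse_message("max"))  # "max"
```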

The lookup is inside a calc method; which lookup to call is determined by the types of the arguments coming into calc.

Yes, but which calc will be taken?

The engine will iterate over its nodes (whose types are known, since the engine type is parameterized by the types of all its nodes) and call calc for each of them. There is a polymorphic calc method for each parametric type (one for Max, one for Min, etc.).
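One way such an engine can stay type stable is to store the nodes in a tuple, so that their concrete types are part of the engine’s type. A sketch (all names and node behaviors are made up for illustration):

```julia
struct Max end
struct Min end

calc(::Max, xs) = maximum(xs)
calc(::Min, xs) = minimum(xs)

# The tuple type records every node's concrete type, so the engine's
# type carries the full node list at compile time.
struct Engine{N<:Tuple}
    nodes::N
end

# `map` over a tuple is unrolled by the compiler: each `calc` call is
# statically dispatched to the right method, with no runtime lookup.
calc_all(e::Engine, xs) = map(n -> calc(n, xs), e.nodes)

calc_all(Engine((Max(), Min())), [1, 2, 3])  # (3, 1)
```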

I think I need to continue to read up on some stuff. I only started learning Julia two days ago and have probably made several mistakes in my description and mental model.

I think your mental model is remarkably advanced for someone that has studied the language for just two days. My impression is that the problem lies more in the fact that what you want is something complex and at this high level of descriptions it is hard to have enough detail to say what will and will not work.


Let’s work from the situation as @Henrique_Becker surmises it to be.

@HashBrown, give some thought to articulating what you want to have happen and the constituent elements (kinds of information) that are available. Rather than getting too wrapped up in the specifics of how parts, wholes, and partial abstractions would best be designed… let us help with that.


I think I am good to close this discussion, but thank you for the offer. I’ve picked up useful pieces of information and see some areas where I can educate myself more with more directed questions.
