Compile-time exponentiation

I want to evaluate x^y at compile time.
The reason is to facilitate efficient construction of an NTuple using the ntuple(f, Val(N)) method.

Compare the bar1 and bar2 functions below:

struct Foo end

getX(::Foo) = 5

foo = Foo()

function bar1(foo::Foo)
    X = getX(foo)
    Y = 2^X
    return (ntuple(i->1, Val(X)), ntuple(i->2, Val(Y)))
end

function bar2(foo::Foo)
    X = getX(foo)
    Y = prod(ntuple(i->2, Val(X)))
    return (ntuple(i->1, Val(X)), ntuple(i->2, Val(Y)))
end

Benchmarking and @code_warntype suggest that the exponentiation is evaluated at run time in bar1 but at compile time in bar2.

  1. Is this the correct interpretation?
  2. If so, why isn’t x^y evaluated at compile time? (I believe the +, -, *, / operators are?)
  3. On a practical note, is there any reason not to use bar2?
Julia-1.1.0> using BenchmarkTools

Julia-1.1.0> @btime bar1(foo)
  2.965 μs (1 allocation: 304 bytes)
((1, 1, 1, 1, 1), (2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2))

Julia-1.1.0> @btime bar2(foo)
  7.184 ns (0 allocations: 0 bytes)
((1, 1, 1, 1, 1), (2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2))
Julia-1.1.0> @code_warntype bar1(foo)
Body::Tuple{NTuple{5,Int64},Any}
1 ─       nothing
2 ┄ %2  = φ (#1 => 2, #6 => %16)::Int64
│   %3  = φ (#1 => 2, #6 => %23)::Int64
│   %4  = φ (#1 => 2, #6 => %15)::Int64
│   %5  = (Base.slt_int)(0, %4)::Bool
└──       goto #7 if not %5
3 ─ %7  = (Base.cttz_int)(%4)::Int64
│   %8  = (Base.add_int)(%7, 1)::Int64
│   %9  = (Base.sle_int)(0, %8)::Bool
│   %10 = (Base.bitcast)(UInt64, %8)::UInt64
│   %11 = (Base.ashr_int)(%4, %10)::Int64
│   %12 = (Base.neg_int)(%8)::Int64
│   %13 = (Base.bitcast)(UInt64, %12)::UInt64
│   %14 = (Base.shl_int)(%4, %13)::Int64
└── %15 = (Base.ifelse)(%9, %11, %14)::Int64
4 ┄ %16 = φ (#3 => %2, #5 => %21)::Int64
│   %17 = φ (#3 => %8, #5 => %18)::Int64
│   %18 = (Base.sub_int)(%17, 1)::Int64
│   %19 = (Base.sle_int)(0, %18)::Bool
└──       goto #6 if not %19
5 ─ %21 = (Base.mul_int)(%16, %16)::Int64
└──       goto #4
6 ─ %23 = (Base.mul_int)(%3, %16)::Int64
└──       goto #2
7 ─       goto #8
8 ─       goto #9
9 ─ %27 = %new(Main.:(##6#8))::Core.Compiler.Const(getfield(Main, Symbol("##6#8"))(), false)
│   %28 = invoke Main.Val(%3::Int64)::Val{_1} where _1
│   %29 = (Main.ntuple)(%27, %28)::Any
│   %30 = (Core.tuple)((1, 1, 1, 1, 1), %29)::Core.Compiler.PartialTuple(Tuple{NTuple{5,Int64},Any}, Any[Const((1, 1, 1, 1, 1), false), Any])
└──       return %30

Julia-1.1.0> @code_warntype bar2(foo)
Body::Tuple{NTuple{5,Int64},NTuple{32,Int64}}
1 ─ %1 = (Core.tuple)((1, 1, 1, 1, 1), (2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2))::Core.Compiler.Const(((1, 1, 1, 1, 1), (2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2)), false)
└──      return %1


It looks like the compiler doesn’t propagate known constants through Base.power_by_squaring. For example, look at the @code_warntype of pbs25() = Base.power_by_squaring(2, 5).
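
Spelled out as something you can paste into the REPL (a sketch; pbs25 is just a throwaway name, and the exact IR depends on the Julia version):

pbs25() = Base.power_by_squaring(2, 5)
@code_warntype pbs25()   # on 1.1 the squaring loop shows up in the body instead of the constant 32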

You can work around this by introducing a shim function marked with @pure:

Base.@pure _power(m,n) = m^n

function bar3(foo::Foo)
    X = getX(foo)
    Y = _power(2,X)
    return (ntuple(i->1, Val(X)), ntuple(i->2, Val(Y)))
end

@pure comes with some caveats, but should be safe if you’re only putting integers into _power.
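
If you want to double-check it, the same measurements as above should now look like the bar2 case (a sketch of what I would run; output not reproduced here):

using BenchmarkTools

foo = Foo()                # reusing the Foo and getX definitions from the question
@btime bar3(foo)           # should report a few ns and 0 allocations, like bar2
@code_warntype bar3(foo)   # should infer Tuple{NTuple{5,Int64},NTuple{32,Int64}}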

Thanks!
(I must admit I don’t really understand @pure, but it seems to work here, so I’m happy)

Currently I think of @pure as a tool for convincing the compiler that it should constant-propagate through a function with abandon. I believe the name comes from the standard definition of a pure function. For functions that are pure in that mathematical sense, it’s always valid to evaluate them with compile-time constant inputs.
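
To make “pure in that mathematical sense” concrete (a sketch with made-up names, nothing the compiler itself uses):

square(x) = x * x                      # pure: the result depends only on the argument

counter = Ref(0)
impure(x) = (counter[] += 1; x * x)    # not pure: it also mutates observable state

Only functions like square can safely be replaced by their value when their inputs are compile-time constants; folding impure away would silently drop the side effect.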

The compiler team has so far avoided committing to the mathematical definition or making @pure a public API, I think mostly because it’s considered a compiler implementation detail, but also because it’s easy to abuse.

A big problem with @pure is that it does not conform to that naive standard mathematical definition: @pure functions don’t properly propagate backedges. This means that their reverse dependencies, i.e. call sites, don’t get recompiled if something in (or downstream of) your pure function changes, which creates load-order-dependent behavior and potential debugging nightmares.
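
A rough way to see the kind of staleness this causes (a sketch with made-up names; the exact behavior varies across Julia versions, so treat it as illustrative only):

helper(n) = n + 1
Base.@pure badpure(n) = helper(n)
usepure() = Val(badpure(1))

usepure()              # Val{2}()
helper(n) = n + 100    # redefine something the @pure function depends on
usepure()              # may still return Val{2}(), since no backedge triggers recompilation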

I think the consensus is that avoiding misuse of Base.@pure requires a level of understanding of Julia’s internals that very few people have, and even then it is easy to make mistakes.

I don’t think it is good practice to advocate its use on this forum, especially in the Usage category.

Well, of course it’s good to avoid messy hacks, but I think this use of @pure is pretty harmless. (The annotation may go away, of course.)

Feel free to offer an alternative. I haven’t found a neat one yet.