A recursive solution might be nonintuitive to implement, but it’s safe, as opposed to generated functions. The @assume_effects part is not essential, and, in any case, it’s much easier to tell when assuming :terminates_globally is correct than when @generated is correct. Generated functions are a UB-loaded footgun for anyone not intimately familiar with the Julia implementation. E.g.:
People are notoriously bad at guessing what effects should apply to a function. Particularly things like foldable (struck out because nsajko edited their comment to remove the foldable reference). @assume_effects is a massive UB-footgun.
But regardless, I really don’t think it’s appropriate to browbeat people for daring to show a user that a generated function can solve their problem.
OK, but @generated has those same issues and then some. See the docs excerpt above.
Can it? The rand()
call accesses and mutates global state. AFAIK that’s not allowed in any generated function.
rand() is not being called in the generated function body; it’s being called in the produced code, which is fine. If the Expr produced by the generated function were dependent on global mutable state, that would be incorrect.
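To make this distinction concrete, here is a hedged sketch with hypothetical names (not code from this thread). The first definition is fine: its body deterministically builds the same Expr for a given signature, and rand() only runs in the returned code, at call time. The second is the incorrect pattern, because the Expr it produces depends on mutable global state.

```julia
# OK: the generated-function body is deterministic; rand() appears only in
# the Expr it returns, so rand() runs when the compiled method runs.
@generated function sumrands(::Val{N}) where {N}
    body = Expr(:block, :(total = 0.0))
    for _ in 1:N
        push!(body.args, :(total += rand()))
    end
    push!(body.args, :(return total))
    return body
end

# NOT OK (shown only as an anti-example): the produced Expr depends on
# mutable global state, so the code Julia caches for this method is
# unpredictable.
const FLAG = Ref(false)
@generated function badrand()
    return FLAG[] ? :(1) : :(2)
end
```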
I’m going to duck out of this conversation now because it’s veering way off topic, but I’ll just re-iterate that I think @danielwe’s solution is clean and easy to understand, and is fine to use.
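For context, since the recursive solution itself isn’t quoted in this excerpt, here is a hypothetical sketch of the tuple-peeling pattern it alludes to (the names are made up, and this is not necessarily the actual code being discussed):

```julia
# Hypothetical recursive sketch: compare x against the first cumulative
# weight; on a miss, recurse on the tails. For concrete tuple types the
# compiler unrolls the recursion, so no @generated is needed.
pickfunc_rec(x, funcs::Tuple{Any}, cumweights::Tuple{Any}) = first(funcs)
function pickfunc_rec(x, funcs::Tuple, cumweights::Tuple)
    x < first(cumweights) && return first(funcs)
    return pickfunc_rec(x, Base.tail(funcs), Base.tail(cumweights))
end
```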
The generated function provided here is as safe as one can get. Its body would be non-deterministic only if one committed type piracy and changed the definition of push!, :, or some other widely used Base function, at which point all hell would break loose, and not because of the generated function…
This generated function is a perfectly reasonable solution and there is little reason to discourage its usage.
The only possible reason to discourage it I can think of is if one wants to statically compile the code with PackageCompiler.jl
or similar tools.
In this case, since the OP knows an upper bound on the tuple length (up to 10), we can still use metaprogramming while avoiding the generated function by just using @eval. Something along the lines of (I didn’t test it):
for N in 2:10
    ex = quote
        x = rand()
    end
    for i in 1:(N - 1)
        line = :(x < cumweights[$i] && (usefunc(data, funcs[$i]); return))
        push!(ex.args, line)
    end
    lastline = :(usefunc(data, funcs[$N]); return)
    push!(ex.args, lastline)
    @eval function pickfunc(data, funcs::NTuple{$N,Any}, cumweights::Tuple)
        $ex
    end
end
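As a quick sanity check of the sketch above (the loop is repeated here only so the snippet runs standalone, and usefunc is a hypothetical helper that simply applies the chosen function to the data):

```julia
usefunc(data, f) = f(data)  # hypothetical helper: apply the chosen function

for N in 2:10
    ex = quote
        x = rand()
    end
    for i in 1:(N - 1)
        push!(ex.args, :(x < cumweights[$i] && (usefunc(data, funcs[$i]); return)))
    end
    push!(ex.args, :(usefunc(data, funcs[$N]); return))
    @eval function pickfunc(data, funcs::NTuple{$N,Any}, cumweights::Tuple)
        $ex
    end
end

data = Symbol[]
# rand() ∈ [0, 1) is never < 0.0, so with cumulative weights (0.0, 1.0)
# the second function is always chosen.
pickfunc(data, (d -> push!(d, :first), d -> push!(d, :second)), (0.0, 1.0))
```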
I like this! I haven’t had the chance to try these solutions yet, so a question: I would be surprised if NTuple{N, Function} led to methods that correctly specialize to tuples of specific functions, rather than a single method (per N) with lots of boxing/unboxing. Is that indeed the case?
The NTuple{N, Function} where N signature is used for dispatch purposes only. Julia will still compile specialized code for each concrete type of funcs passed to the function.
For example:
julia> function test_specialize(t::NTuple{N,Any}) where N
           @show N
           @show typeof(t)
       end
test_specialize (generic function with 1 method)
julia> @code_warntype test_specialize((sin,cos,+,2.0))
MethodInstance for test_specialize(::Tuple{typeof(sin), typeof(cos), typeof(+), Float64})
  from test_specialize(t::Tuple{Vararg{Any, N}}) where N @ Main REPL[2]:1
Static Parameters
  N = 4
Arguments
  #self#::Core.Const(test_specialize)
  t::Tuple{typeof(sin), typeof(cos), typeof(+), Float64}
Locals
  value@_3::Type{Tuple{typeof(sin), typeof(cos), typeof(+), Float64}}
  value@_4::Int64
Body::Type{Tuple{typeof(sin), typeof(cos), typeof(+), Float64}}
1 ─      (value@_4 = $(Expr(:static_parameter, 1)))
│   %2 = $(Expr(:static_parameter, 1))::Core.Const(4)
│   %3 = Base.repr(%2)::String
│        Base.println("N = ", %3)
│        value@_4
│   %6 = Main.typeof(t)::Core.Const(Tuple{typeof(sin), typeof(cos), typeof(+), Float64})
│        (value@_3 = %6)
│   %8 = Base.repr(%6)::String
│        Base.println("typeof(t) = ", %8)
└──      return value@_3::Core.Const(Tuple{typeof(sin), typeof(cos), typeof(+), Float64})
julia> @code_warntype test_specialize((-,^,exp,2im))
MethodInstance for test_specialize(::Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}})
  from test_specialize(t::Tuple{Vararg{Any, N}}) where N @ Main REPL[2]:1
Static Parameters
  N = 4
Arguments
  #self#::Core.Const(test_specialize)
  t::Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}}
Locals
  value@_3::Type{Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}}}
  value@_4::Int64
Body::Type{Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}}}
1 ─      (value@_4 = $(Expr(:static_parameter, 1)))
│   %2 = $(Expr(:static_parameter, 1))::Core.Const(4)
│   %3 = Base.repr(%2)::String
│        Base.println("N = ", %3)
│        value@_4
│   %6 = Main.typeof(t)::Core.Const(Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}})
│        (value@_3 = %6)
│   %8 = Base.repr(%6)::String
│        Base.println("typeof(t) = ", %8)
└──      return value@_3::Core.Const(Tuple{typeof(-), typeof(^), typeof(exp), Complex{Int64}})
The only possible reason to discourage it I can think of is if one wants to statically compile the code with
PackageCompiler.jl
or similar tools.
Just a sidenote, generated functions work fine with PackageCompiler and other forms of static compilation.
The
NTuple{N, Function} where N
signature is used for dispatch purposes only.
Oh, I had assumed this was analogous to how Float64 <: Any is true but Vector{Float64} <: Vector{Any} is false, and hence that tuples of concrete types would not match an NTuple with an abstract element type. Good to know!
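The underlying reason is that tuple types are covariant in their parameters, unlike other parametric types such as Vector. A minimal illustration:

```julia
# Tuple types are covariant: a tuple type with concrete element types is a
# subtype of a tuple type with abstract element types...
@assert Tuple{Float64, Int} <: Tuple{Real, Any}
# ...whereas other parametric types are invariant in their parameters:
@assert !(Vector{Float64} <: Vector{Any})
# Hence a tuple of specific functions matches NTuple{N, Function}:
@assert (sin, cos) isa NTuple{2, Function}
```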