I have an example of a simple processing pipeline where I noticed Transducers.jl was making a lot of allocations and running noticeably slower than the equivalent composition of iterators and generators. I’m wondering whether there’s a better way to express this with Transducers.jl.
Version with iterators and function composition
function process(input)
    output = Iterators.map(a -> a^2, Iterators.filter(isodd, input))
    Iterators.drop(output, 100) |> first
end
julia> process(a^3 for a in Iterators.countfrom(1))
65944160601201
julia> @btime process(a^3 for a in Iterators.countfrom(1))
109.819 ns (0 allocations: 0 bytes)
65944160601201
Version with transducers
using Transducers

function processxf(input)
    input |> Filter(isodd) |> Map(a -> a^2) |> Drop(100) |> first
end
julia> processxf(a^3 for a in Iterators.countfrom(1))
65944160601201
julia> @btime processxf(a^3 for a in Iterators.countfrom(1))
1.149 μs (109 allocations: 3.50 KiB)
65944160601201
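For reference, a variant I was considering drives the pipeline with a fold instead of calling first on the eduction. This is only a sketch based on my understanding of the Transducers.jl API (Take, foldxl, and right); I haven’t verified whether it changes the allocation behaviour:

using Transducers

# Sketch: compose the transducer once and run it with foldxl instead of
# relying on the iterator protocol. `right` keeps the last element seen,
# so Drop(100) |> Take(1) followed by foldxl(right, ...) returns the
# 101st element of the filtered-and-mapped stream.
function processxf_fold(input)
    xf = Filter(isodd) |> Map(a -> a^2) |> Drop(100) |> Take(1)
    foldxl(right, xf, input)
end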
I tried checking with Cthulhu for type instability or some other obvious problem, but couldn’t find anything.
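In case it helps, this is roughly how I looked at it (using Cthulhu’s @descend macro to step through the generated code interactively):

using Cthulhu

# Descend into the transducer version of the pipeline and inspect the
# inferred types at each call site for instabilities.
@descend processxf(a^3 for a in Iterators.countfrom(1))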