How to write a function that outputs the linear combination of other functions?

I have a list of functions fs and coefficients cs from which I would like to form a linear combination, i.e., I’d like to obtain the function

(args...; kwargs...) -> ( cs[1] * fs[1](args...; kwargs...) + ... 
                        + cs[n] * fs[n](args...; kwargs...) )

The attempt

function linear_combination(fs, coeffs)
    (args...; kwargs...) -> sum(c * f(args...; kwargs...) for (c,f) ∈ zip(coeffs, fs))
end

is a partial solution. Running the example

f(x) = x
g(x) = x^2

fs = (f, g)
coeffs = (1,1/2)
h = linear_combination(fs, coeffs)
h(1) # 1.5

I get the expected answer, but running @code_warntype h(1) gives me

MethodInstance for (::var"#15#18"{var"#15#16#19"{Tuple{typeof(f), typeof(g)}, Tuple{Int64, Float64}}})(::Int64)
  from (::var"#15#18")(args...; kwargs...) @ Main ~/Desktop/DevStructuredLight.jl/test.jl:4
Arguments
  #self#::var"#15#18"{var"#15#16#19"{Tuple{typeof(f), typeof(g)}, Tuple{Int64, Float64}}}
  args::Tuple{Int64}
Body::Any
1 ─ %1 = Core.getfield(#self#, Symbol("#15#16"))::var"#15#16#19"{Tuple{typeof(f), typeof(g)}, Tuple{Int64, Float64}}
│   %2 = Core.NamedTuple()::Core.Const(NamedTuple())
│   %3 = Base.pairs(%2)::Core.Const(Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}())
│   %4 = Core.tuple(%3, #self#)::Tuple{Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, var"#15#18"{var"#15#16#19"{Tuple{typeof(f), typeof(g)}, Tuple{Int64, Float64}}}}
│   %5 = Core._apply_iterate(Base.iterate, %1, %4, args)::Any
└──      return %5

so h is not type stable. Could anyone help me improve this solution? Thanks!

What about putting coefficients and functions into a new type?

struct LinearCombinations{C,F}
    c::C
    f::F
end

function (lc::LinearCombinations)(x...; kw...)
    map(lc.c, lc.f) do c, f
        c*f(x...; kw...)
    end |> sum
end

julia> lc = LinearCombinations(coeffs, (f, g))
LinearCombinations{Tuple{Int64, Float64}, Tuple{typeof(f), typeof(g)}}((1, 0.5), (f, g))

julia> @code_typed lc(1)
CodeInfo(
1 ─ %1  = Core.getfield(x, 1)::Int64
│   %2  = Base.getfield(lc, :c)::Tuple{Int64, Float64}
│   %3  = Base.getfield(%2, 1, true)::Int64
│   %4  = Base.mul_int(%3, %1)::Int64
│   %5  = Base.getfield(%2, 2, true)::Float64
│   %6  = Base.mul_int(%1, %1)::Int64
│   %7  = Base.sitofp(Float64, %6)::Float64
│   %8  = Base.mul_float(%5, %7)::Float64
│   %9  = Base.sitofp(Float64, %4)::Float64
│   %10 = Base.add_float(%9, %8)::Float64
└──       return %10
) => Float64

No new type needed. You can also get type-stable code with:

linear_combinations(f, c) = (x...; kw...) -> map(c, f) do c, f
    c*f(x...; kw...)
end |> sum
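For example, reusing the f, g, and coeffs from the question (a quick sketch, with the definition repeated so it runs standalone):

```julia
# Closure-based version: map over tuples, so the compiler can specialize
linear_combinations(f, c) = (x...; kw...) -> map(c, f) do c, f
    c * f(x...; kw...)
end |> sum

f(x) = x
g(x) = x^2

h = linear_combinations((f, g), (1, 1/2))
h(1)  # 1.5, same result as before, but now inferable
```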

Defining new callable types (“functors”) can be useful for other reasons, of course; see How are functors useful? - #6 by stevengj

The trick here is that you construct tuples from tuples (map has specialized methods for tuples), rather than using a generator like (c * f(x...) for (c, f) in zip(...)). I believe zip by itself is also type-unstable for heterogeneous tuples, for that matter.
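To make the contrast concrete, a minimal sketch comparing the two constructions (the inference claims can be checked with @code_warntype):

```julia
cs = (1, 1/2)
fs = (x -> x, x -> x^2)

# map over tuples returns a tuple whose element types are known exactly:
t = map((c, f) -> c * f(2), cs, fs)  # (2, 2.0) :: Tuple{Int64, Float64}

# a generator over zip of heterogeneous tuples loses that information;
# the result is correct, but its element type is not concretely inferred:
s = sum(c * f(2) for (c, f) in zip(cs, fs))  # 4.0
```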


This version is indeed type stable! But, at least for the example that I gave, it appears to allocate:

@benchmark h(1)

BenchmarkTools.Trial: 10000 samples with 999 evaluations.
 Range (min … max):   9.966 ns … 201.626 ns  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     13.266 ns               ┊ GC (median):    0.00%
 Time  (mean ± σ):   13.422 ns ±   6.974 ns  ┊ GC (mean ± σ):  0.00% ± 0.00%

           ▁               ▂█▆▃▂▄▃▂                             
  ▁▁▂▂▂▂▅█▇█▄▂▂▁▁▁▁▁▂▂▃▂▂▃▅████████▆▅▅▅▅▅▄▄▃▃▃▃▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁ ▃
  9.97 ns         Histogram: frequency by time         16.9 ns <

 Memory estimate: 16 bytes, allocs estimate: 1.

Could it be avoided? It also appears slower than the hand-written version h(x) = x + x^2 / 2.

Use a dollar sign:

julia> @benchmark $h(1)

BenchmarkTools.Trial: 10000 samples with 1000 evaluations.
 Range (min … max):  2.679 ns … 18.640 ns  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     2.772 ns              ┊ GC (median):    0.00%
 Time  (mean ± σ):   2.963 ns ±  0.540 ns  ┊ GC (mean ± σ):  0.00% ± 0.00%

  █ █ ▄▁▁▄ ▄▁ ▃  ▄  ▃▁  ▄      ▃▃                            ▂
  █▁██████▅██▁█▆▃█▆▁██▁▆█▄▅▃▁▁▃██▁▄▄▃▁▄▁▁▆▆█▇▆▆▆▆▇▆▇█▇▇▆▇▆▇▇ █
  2.68 ns      Histogram: log(frequency) by time     4.86 ns <

 Memory estimate: 0 bytes, allocs estimate: 0.

Well, now the allocation goes away, but I had never seen $ used before the function when benchmarking, only before global variables passed as arguments. What is going on here? :thinking:

The point is that h is a global variable that is not a constant. If you write

const h = linear_combinations(fs, coeffs)

then you don’t need the dollar sign. If you define a function via

h(x) = x + x^2 / 2

then h is automatically constant.


I don’t follow this observation. I can freely redefine h, e.g., I could type h(x) = x and then h(x) = x^2 without a problem. It does not seem constant to me. What am I missing?

You can add or change methods of a function, but it’s still considered the same function. You cannot do something like the following:

julia> h(x) = x
h (generic function with 1 method)

julia> h = 1
ERROR: invalid redefinition of constant Main.h

This seems like a strange definition of “sameness” to me, but I get it now, thank you!

It may become a bit less strange if you think of a function as a special “dictionary” that maps type signatures to methods. With regular dictionaries, you can say const d = Dict(1 => 2) and still add new entries to d because it’s considered the same dictionary. (This analogy is not perfect.)
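A short sketch of the analogy (a const binding fixes which object the name refers to, not the object's contents):

```julia
const d = Dict(1 => 2)
d[3] = 4          # fine: mutating the same dictionary object
# d = Dict()      # would error: cannot rebind a constant

# Likewise, adding a method modifies the existing function object
# rather than rebinding the name:
k(x) = x          # creates the function `k` (a constant binding)
k(x::Int) = 2x    # adds a second, more specific method to the same `k`

k(1.0)  # 1.0  (generic method)
k(1)    # 2    (Int-specific method)
```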