GPs: sum of independent Kernels

The documentation of KernelFunctions.jl describes many building blocks for defining composite kernels, but I can't find a way to build a sum of independent kernels. I'm looking for something similar to KernelTensorProduct, but with addition instead of multiplication:

For inputs x = (x_1,\dots,x_n) and x' = (x_1',\dots,x_n'), the independent sum of kernels k_1, \dots, k_n is defined as:

k(x, x'; k_1, \dots, k_n) = \sum_{i=1}^n k_i(x_i, x_i')
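
For concreteness, with two input dimensions this is just a coordinate-wise sum of two (possibly different) kernels. Computed by hand, with the kernel choices below being arbitrary and purely for illustration, it would look like:

    using KernelFunctions

    # Two different kernels, one per coordinate
    k1 = SqExponentialKernel()
    k2 = Matern32Kernel()

    x = (0.5, 1.0)
    x′ = (0.3, 2.0)

    # k(x, x'; k1, k2) = k1(x_1, x_1') + k2(x_2, x_2')
    k1(x[1], x′[1]) + k2(x[2], x′[2])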

If all dimensions use the same kernel, I think one could use an ARDTransform, but if the kernels are different it is, as far as I know, not possible. Is there a smart way to do this? Or is this functionality not yet supported?

Hey,

It’s true that there is currently no way to build that. It generally seems like a good idea, though! Could you open an issue on the repo?

If you need a temporary fix, you can compose each kernel with a SelectTransform and sum the results.
So something like this:

sum(kernels[i] ∘ SelectTransform([i]) for i in eachindex(kernels))
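
Spelled out as a self-contained sketch (the two kernels and the test inputs below are arbitrary, just to show the pattern), checked against the definition above:

    using KernelFunctions

    # One (possibly different) kernel per input dimension
    kernels = [SqExponentialKernel(), Matern32Kernel()]

    # Each kernel only sees its own coordinate via SelectTransform;
    # `sum` combines the transformed kernels into a KernelSum
    k = sum(kernels[i] ∘ SelectTransform([i]) for i in eachindex(kernels))

    x, x′ = [0.5, 1.0], [0.3, 2.0]

    # Agrees with k_1(x_1, x_1') + k_2(x_2, x_2')
    k(x, x′) ≈ kernels[1](x[1], x′[1]) + kernels[2](x[2], x′[2])  # true

Since each SelectTransform([i]) projects the input onto dimension i before its kernel is applied, the resulting KernelSum evaluates exactly the independent sum defined above.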

Issue: Sum of independent kernels · Issue #506 · JuliaGaussianProcesses/KernelFunctions.jl · GitHub