The documentation of `KernelFunctions.jl` specifies many building blocks for defining composite kernels, but I can't find a way to build a sum of independent kernels, something similar to `KernelTensorProduct` but with addition instead of multiplication:

For inputs $x = (x_1,\dots,x_n)$ and $x' = (x_1',\dots,x_n')$, the independent sum of kernels $k_1, \dots, k_n$ is defined as:

$$k(x, x'; k_1, \dots, k_n) = \sum_{i=1}^n k_i(x_i, x_i')$$

For identical kernels I think one could use an `ARDTransform`, but if the kernels are different this is, as far as I know, not possible. Is there a smart way to do this, or is this functionality not yet supported?
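The closest workaround I can think of (assuming I'm reading the transform API correctly) is to restrict each kernel to a single input dimension with a `SelectTransform` and then add the transformed kernels, something like:

```julia
using KernelFunctions

# Restrict each base kernel to one input dimension via SelectTransform,
# then add the transformed kernels (this builds a KernelSum).
k1 = SqExponentialKernel() ∘ SelectTransform([1])
k2 = Matern32Kernel() ∘ SelectTransform([2])
k = k1 + k2

x  = [1.0, 2.0]
x′ = [0.5, 1.5]

# Should equal SqExponentialKernel()(x[1], x′[1]) + Matern32Kernel()(x[2], x′[2])
k(x, x′)
```

But this feels clunky when the number of dimensions is large, and I'm not sure it's the intended way to express an additive kernel over independent input dimensions.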