I am trying to use an Epanechnikov kernel density estimator in KernelDensity.jl. Surprisingly, this kernel is not implemented, but it should be possible to add it by extending the function `kernel_dist`. However, I have been unable to match the results from R, and it is not clear to me whether this is because I extended `kernel_dist` incorrectly or because the interpolation functions differ. So I was hoping that someone with more familiarity with the topic might provide some help.

Julia code:

```julia
using KernelDensity, Distributions
import KernelDensity: kernel_dist

# Extend kernel_dist so that kde() accepts the Epanechnikov kernel
kernel_dist(::Type{Epanechnikov}, w::Real) = Epanechnikov(0.0, w)

data = [0.952, 0.854, 0.414, 0.328, 0.564, 0.196, 0.096, 0.366, 0.902, 0.804]
kd = kde(data; kernel = Epanechnikov)
dist = InterpKDE(kd)
pdf(dist, 0.3)
```

Result = 1.3029
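For what it's worth, the Julia number can be reproduced by a direct kernel sum (no interpolation grid) under two assumptions I have not verified in the sources: that KernelDensity.jl's default bandwidth is Silverman's rule of thumb, and that Distributions.jl's `Epanechnikov(0.0, w)` treats `w` as the support half-width. A Python sketch under those assumptions:

```python
import numpy as np

data = np.array([0.952, 0.854, 0.414, 0.328, 0.564,
                 0.196, 0.096, 0.366, 0.902, 0.804])
n = len(data)

# Assumed default bandwidth (Silverman's rule of thumb):
# 0.9 * min(sd, IQR/1.34) * n^(-1/5)
sd = np.std(data, ddof=1)
q1, q3 = np.percentile(data, [25, 75])
w = 0.9 * min(sd, (q3 - q1) / 1.34) * n ** (-1 / 5)

# Epanechnikov density with support half-width w
# (the assumed meaning of Epanechnikov(0.0, w) in Distributions.jl)
u = (0.3 - data) / w
f = np.mean(np.where(np.abs(u) < 1, 0.75 * (1 - u**2) / w, 0.0))
print(f)  # ≈ 1.3029, matching the Julia result above
```

If that assumption about `Epanechnikov(0.0, w)` is right, the kernel's standard deviation is w/√5 rather than w, which would make the Julia estimate narrower than R's.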

R code:

```r
data = c(0.952, 0.854, 0.414, 0.328, 0.564, 0.196, 0.096, 0.366, 0.902, 0.804)
kd = density(data, kernel = "epanechnikov")
f = approxfun(kd)
f(0.3)
```

Result = 0.9670
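R's `density` documents that its kernels are scaled so that the bandwidth equals the kernel's standard deviation; for the Epanechnikov kernel that means a support half-width of √5 · bw. A minimal Python sketch of the direct kernel sum under R's convention (assuming the default `bw.nrd0` bandwidth, i.e. 0.9 · min(sd, IQR/1.34) · n^(−1/5)) comes out very close to the interpolated 0.9670 above:

```python
import numpy as np

data = np.array([0.952, 0.854, 0.414, 0.328, 0.564,
                 0.196, 0.096, 0.366, 0.902, 0.804])
n = len(data)

# R's default bandwidth bw.nrd0: 0.9 * min(sd, IQR/1.34) * n^(-1/5)
sd = np.std(data, ddof=1)
q1, q3 = np.percentile(data, [25, 75])  # numpy's default matches R's quantile type 7
bw = 0.9 * min(sd, (q3 - q1) / 1.34) * n ** (-1 / 5)

# Epanechnikov kernel rescaled so its standard deviation equals bw,
# i.e. support half-width a = sqrt(5) * bw (R's convention)
a = np.sqrt(5) * bw
u = (0.3 - data) / a
f = np.mean(np.where(np.abs(u) < 1, 0.75 * (1 - u**2) / a, 0.0))
print(f)  # ≈ 0.967
```

The small remaining gap to `f(0.3)` in R comes from `density` evaluating on a grid and `approxfun` interpolating linearly between grid points.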

I repeated the comparison for a standard normal with 10^5 samples and found a discrepancy, albeit a smaller one: 0.3818 for R, which is very close to the true standard normal density at 0.3 (φ(0.3) ≈ 0.3814), versus 0.3782 for Julia.