You assume that there is only one definition for each window. Even if the Wikipedia article showed a normalized form, that would also be a correct definition.
Indeed, let’s look at the Wikipedia entry on Julius von Hann, the inventor of the Hann filter:
His original filter window was [1/4, 1/2, 1/4], which was normalized! (He was from my field: meteorology, oceanography, climate science, etc.)
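To show what I mean by “normalized”, here is a small Julia sketch (my own, not tied to any package) that applies these weights as a 3-point smoother; because they sum to 1, a constant series passes through unchanged:

```julia
# Von Hann's original smoothing weights; note that they sum to 1.
w = [1/4, 1/2, 1/4]

# Apply the 3-point filter to the interior of a series
# (endpoints are simply left untouched in this sketch).
function smooth3(x, w)
    y = float.(x)
    for i in 2:length(x)-1
        y[i] = w[1]*x[i-1] + w[2]*x[i] + w[3]*x[i+1]
    end
    return y
end

smooth3(fill(10.0, 8), w)   # a constant series comes back unchanged, because sum(w) == 1
```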
We can see what happened to normalization in this article about the Hann function:
It first shows the continuous form, which is normalized; that is
integral H(x) dx = 1
When converting this integral form to a discrete form, one would normally put dx = L/N and then
integral H(x) dx = sum_n H(x_n) L / N = sum_n G(n),
where G(n) = (L/N) H(x_n) is a discrete form of the Hann window. If this were the definition, we would have
sum_n G(n) = 1.
Instead, the above Wikipedia article drops the 1/N factor and defines F(n) = L H(x_n) to be the discrete version of the Hann window. Under this definition,
sum_n F(n) = N
instead of 1. Apparently, this is the definition people in the field of digital signal processing use.
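To make the two conventions concrete, here is a small Julia sketch (my own construction, using one particular unit-integral H; the exact constants depend on the chosen shape, but the contrast between summing to 1 and summing to N is the point):

```julia
L, N = 1.0, 16
H(x) = (2 / L) * sin(pi * x / L)^2      # continuous Hann shape; its integral over [0, L] is 1
xn = [n * L / N for n in 0:N-1]         # sample points x_n = n L / N

G = (L / N) .* H.(xn)    # discretization that keeps the quadrature weight dx = L/N
F = L .* H.(xn)          # Wikipedia-style discretization, without the 1/N

sum(G)   # ≈ 1
sum(F)   # ≈ 16, i.e. N
```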
I’m not saying that the definition of the window in your discipline is wrong; you must have your own reasons. But the reason shouldn’t be that it is the only correct definition.
> Maybe a boolean keyword `normalized` that defaults to `false` would be a good idea.
I would appreciate that, but I guess the current user base of the DSP package is dominated by people from the digital signal processing field, so a `normalized` keyword may not be widely used even if implemented.
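Just to be concrete about the suggestion, a wrapper with such a keyword could look roughly like this; `hann_window` is hypothetical, not an actual DSP.jl function, and the unnormalized convention inside is just one of several in use:

```julia
# Hypothetical wrapper (not an actual DSP.jl function): build an N-point
# Hann window and optionally rescale it so the coefficients sum to 1.
function hann_window(N::Integer; normalized::Bool = false)
    w = [sin(pi * n / N)^2 for n in 0:N-1]   # the usual unnormalized window
    return normalized ? w ./ sum(w) : w
end

sum(hann_window(8))                      # ≈ 4.0 (N/2 with this convention)
sum(hann_window(8; normalized = true))   # ≈ 1.0
```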
On the other hand, I’m discovering other Julia packages that directly talk about “local filters”, moving averages, smoothing, etc. One of them (I forgot which one) includes the capability of plugging in one’s own window functions. Maybe I should go with such a package.
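For reference, “plugging in one’s own window” amounts to something like the following generic weighted moving average; again this is my own sketch, not the API of any particular package:

```julia
# Generic centered moving average with a user-supplied window.
# Weights are normalized so they sum to 1, which preserves a constant series;
# near the edges the window is truncated and renormalized.
function moving_average(x::AbstractVector, weights::AbstractVector)
    w = weights ./ sum(weights)
    h = (length(w) - 1) ÷ 2                # half-width; an odd-length window is assumed
    y = similar(x, float(eltype(x)))
    for i in eachindex(x)
        lo = max(i - h, firstindex(x))
        hi = min(i + h, lastindex(x))
        ws = w[(lo - i + h + 1):(hi - i + h + 1)]
        y[i] = sum(ws .* x[lo:hi]) / sum(ws)
    end
    return y
end

moving_average(randn(100), [1/4, 1/2, 1/4])   # von Hann's own weights as the window
```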
But what worries me is this proliferation of similar packages. One of the packages I looked at explicitly mentions that it has a large overlap with DSP.