Understanding the behaviour of Butterworth filter in DSP

Hi all,
I'm using a highpass filter to isolate the oscillations from the trend in my signal. The filter I use for my time series signal is:

response = Highpass((๐‘“/1); fs)
design = Butterworth(2)
filtered = filt(digitalfilter(response,design), signal)

where fs is the sampling frequency and f is the frequency of the oscillations.

For some reason, the result is this:

[plot: the filtered signal shows a large drop at the very beginning before settling]

Can someone explain why I get this behavior at the beginning of the filtered signal? I can't pinpoint the cause of this drop, and I'm not sure how to fix it.

That's the filter transient. A filter's frequency response assumes an infinite-duration sinusoidal input; anything else produces transients. See Transient response - Wikipedia.
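To see this concretely, here is a minimal sketch using Python/SciPy as a stand-in for the filter above (the values of fs, f, and the DC offset are assumptions for illustration; the effect itself is library-independent):

```python
import numpy as np
from scipy import signal

fs = 100.0  # sampling frequency (assumed)
f = 5.0     # oscillation frequency (assumed)

# 2nd-order Butterworth highpass: the SciPy analogue of
# Butterworth(2) + Highpass(f; fs) above
b, a = signal.butter(2, f, btype="highpass", fs=fs)

t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * f * t) + 10.0  # oscillation riding on a DC offset

y = signal.lfilter(b, a, x)

# The filter starts from all-zero internal state, so it sees the DC offset
# as a sudden step at n = 0 and rings before settling: that ringing is the
# drop at the start of the filtered signal.
early = np.max(np.abs(y[:20]))    # dominated by the startup transient
late = np.max(np.abs(y[-20:]))    # steady-state oscillation only
print(early > late)
```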


Interesting, thank you! Is there any way to design a filter that minimizes this effect in this case?

The usual approach is to accept the transient and simply discard the first few output samples. Another possibility is to use a FIR filter, which may have a shorter transient; in any case, its transient length is easier to predict: a linear-phase FIR of order N settles within N samples and delays the signal by N/2.
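Both suggestions can be sketched as follows (Python/SciPy for illustration; fs, f, the settling length, and the tap count are all assumptions):

```python
import numpy as np
from scipy import signal

fs, f = 100.0, 5.0                    # assumed sampling / oscillation frequencies
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * f * t) + 10.0  # oscillation on a DC offset

# (1) IIR: filter as before, then simply discard the first few samples.
b, a = signal.butter(2, f, btype="highpass", fs=fs)
y_iir = signal.lfilter(b, a, x)
settle = 30                           # rough settling time in samples (assumption)
y_trimmed = y_iir[settle:]

# (2) FIR: a linear-phase highpass; a FIR of length ntaps is fully
# settled after ntaps - 1 samples, and its group delay is (ntaps - 1) / 2.
ntaps = 101                           # odd length, required for a highpass design
taps = signal.firwin(ntaps, f, pass_zero=False, fs=fs)
y_fir = signal.lfilter(taps, 1.0, x)
delay = (ntaps - 1) // 2
y_aligned = y_fir[delay:]             # compensate the known, fixed delay
```

If offline processing is acceptable, forward-backward filtering (scipy.signal.filtfilt, or filtfilt in DSP.jl) is another common option; it cancels the phase delay, though it has edge effects of its own.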

Another interesting resource here: Transient Response, Steady State, and Decay | Introduction to Digital Filters

You may also consider filters specifically designed to remove DC trends; there's a good selection here: Linear-phase DC Removal Filter - Rick Lyons
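As a hedged sketch of one such design (a moving-average subtraction, one of several approaches discussed at that link; the averaging length and signal parameters are assumptions):

```python
import numpy as np

def remove_dc(x, navg=64):
    """Subtract a centered running mean (the local DC level) from the signal.

    The averaging kernel is symmetric, so the subtraction is linear phase
    and the oscillations are not phase-distorted.
    """
    kernel = np.ones(navg) / navg
    trend = np.convolve(x, kernel, mode="same")  # centered moving average
    return x - trend

fs, f = 100.0, 5.0                    # assumed frequencies
t = np.arange(0, 4.0, 1 / fs)
x = np.sin(2 * np.pi * f * t) + 10.0  # oscillation on a DC offset
y = remove_dc(x)
# away from the edges the offset is removed while the sinusoid survives
```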
