where fs is the sampling frequency and f is the frequency of the oscillations.
For some reason, the result is this:
Can someone explain why I get this behavior at the beginning of the filtered signal? I can't pinpoint the cause of this drop, and I'm not sure how to fix it.
That’s the filter transient. A filter’s frequency response assumes an infinite-duration sinusoidal input – anything else produces transients (see Transient response - Wikipedia).
The usual approach is to ignore the transient – just discard the first few output samples. Another possibility is to use an FIR filter, which may have a shorter transient; in any case, its transient length is easier to predict (roughly half the filter order, i.e. the group delay of a linear-phase FIR).
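A minimal sketch of the FIR approach, with placeholder values for fs, the signal frequency, the cutoff, and the number of taps (none of these come from the original post):

```python
import numpy as np
from scipy import signal

fs = 1000.0   # sampling frequency in Hz (placeholder)
f = 5.0       # frequency of the oscillations in Hz (placeholder)

t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * f * t)

# Linear-phase FIR low-pass filter; assumed cutoff and tap count
numtaps = 101
fir = signal.firwin(numtaps, cutoff=20.0, fs=fs)
y = signal.lfilter(fir, 1.0, x)

# The group delay of a linear-phase FIR is (numtaps - 1) / 2 samples,
# so discard roughly that many samples to skip the startup transient.
delay = (numtaps - 1) // 2
y_trimmed = y[delay:]
```

After the filter has fully settled (about one filter length into the output), y is just a delayed copy of the passband input, which is why discarding the first samples removes the visible drop. An alternative, if zero-phase output is acceptable offline, is scipy.signal.filtfilt, which handles edge transients via padding.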