[Math] Artifacts at low frequencies in the FFT.

fourier-analysis, signal-processing

I am working on analyzing a time signal and want to perform an FFT. However, I run into some artifacts at low frequencies. I have managed to reproduce the behavior in a test signal, given by
$$S(t) = \theta(0.3 - t)\left[\sin(2\pi \cdot 30\, t) + \sin(2\pi \cdot 100\, t) + 3.0\right]$$
In other words, a signal that is a combination of two sine waves with frequencies 30 and 100 Hz (plus a constant offset) for $t < 0.3$, and zero afterwards.
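To make the setup concrete, here is a minimal sketch that reproduces the test signal and its amplitude spectrum. The sampling rate and record length are my assumptions (the question does not state them), and I follow the formula $\theta(0.3 - t)$, i.e. the signal is nonzero for $t < 0.3$:

```python
import numpy as np

# Assumed sampling parameters (not stated in the question)
fs = 1000                        # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)  # 1 s record

# theta(0.3 - t): nonzero for t < 0.3, zero afterwards
step = (t < 0.3).astype(float)
s = step * (np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 100 * t) + 3.0)

# One-sided amplitude spectrum
spec = np.abs(np.fft.rfft(s)) / len(s)
freqs = np.fft.rfftfreq(len(s), d=1.0 / fs)
```

With these parameters the bins just above DC carry a large "skirt": the 1 Hz bin comes out larger than the 30 Hz peak even though the signal contains no 1 Hz component, which is exactly the artifact described.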

[Plot of the test signal $S(t)$]

This gives the following DFT amplitudes:
[Plot of the DFT amplitude spectrum]

I have tried removing the mean, windowing, and combinations of the two. The only things that seem to help are removing the constant, or shortening the time interval where the function is zero (which also requires windowing).
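For reference, a sketch of why removing the global mean does not help while removing the constant on the signal's support does (sampling rate and record length are my assumptions):

```python
import numpy as np

fs = 1000
t = np.arange(0, 1.0, 1.0 / fs)
step = (t < 0.3).astype(float)   # theta(0.3 - t)
s = step * (np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 100 * t) + 3.0)

def amp(x):
    """One-sided amplitude spectrum."""
    return np.abs(np.fft.rfft(x)) / len(x)

# 1) Subtracting the global mean still leaves a step (of height
#    3.0 minus the mean), so the low-frequency skirt remains.
demeaned = s - s.mean()

# 2) Subtracting the constant only where the signal is "on"
#    removes the step entirely, and with it the skirt.
no_const = s - 3.0 * step

a_raw, a_demean, a_noconst = amp(s), amp(demeaned), amp(no_const)
```

Comparing `a_demean` and `a_noconst` at low frequencies shows the skirt survives mean removal but essentially vanishes once the constant is taken off the support.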

So why not just cut it off and be done with it? For this test signal that would be an ideal solution, but for my real signal I do not really want to do that, and it does not seem to help either.

Does anyone know what causes this behavior and how to fix it?

EDIT: The plot does not include the DC term.

Best Answer

  • compare it with the FFT of $\theta (0.3-t)$ and of $$H(t) = \theta (0.3-t)[\exp(2i \pi \ 30 \ t)+\exp(2 i \pi \ 100 \ t) + 3.0]$$

  • use a window (e.g. a Hann window)

  • if you understand what is happening here, you will understand nearly everything about the FFT.
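The comparison suggested in the first bullet can be sketched as follows (sampling parameters are my assumptions): the FFT of the bare step $\theta(0.3 - t)$ already shows the same sinc-like skirt around DC, and in the full signal that skirt is essentially scaled by the constant 3.0.

```python
import numpy as np

fs = 1000
t = np.arange(0, 1.0, 1.0 / fs)
step = (t < 0.3).astype(float)   # theta(0.3 - t)

def amp(x):
    """One-sided amplitude spectrum."""
    return np.abs(np.fft.rfft(x)) / len(x)

# FFT of the bare step: a sinc-like skirt around DC
a_step = amp(step)

# Full test signal: its low-frequency skirt is essentially the
# step's skirt scaled by the constant offset 3.0
s = step * (np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 100 * t) + 3.0)
a_sig = amp(s)

# Hann window: tapers the record edges, but the discontinuity at
# t = 0.3 lies in the interior, so low-frequency content remains
a_win = amp(s * np.hanning(len(s)))
```

This is why windowing alone does not cure the questioner's artifact: the window suppresses leakage from the record edges, not the step-plus-offset sitting inside the record.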