Solved – Dubious use of signal processing principles to identify a trend

data mining, signal processing, time series, trend

I am proposing to try to find a trend in some very noisy long-term data. The data consists of weekly measurements of something that moved about 5 mm over a period of about 8 months. The readings are recorded to 1 mm accuracy and are very noisy, regularly changing by +/-1 or 2 mm in a week. We only have the data to the nearest mm.

We plan to use some basic signal processing, specifically a fast Fourier transform, to separate the noise from the raw data. The basic assumption is that if we mirror our data set and append the mirror image to the end of the existing data, we create a full wavelength of a periodic signal; the trend should then show up in the low-frequency components of the FFT, and we can hopefully separate it out.
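For concreteness, here is a minimal sketch of the mirror-and-append idea, using hypothetical data matching the description (a 5 mm drift over 35 weekly readings with +/-1-2 mm noise, rounded to the nearest mm); the cutoff of 3 frequency bins is an arbitrary illustration, not a recommendation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series: ~8 months of weekly readings, a slow 5 mm drift,
# noise of roughly +/-1-2 mm, rounded to the nearest mm.
weeks = np.arange(35)
true_trend = 5.0 * weeks / weeks[-1]
data = np.round(true_trend + rng.normal(0.0, 1.5, weeks.size))

# Mirror the series and append it so the spliced signal is periodic,
# then keep only the lowest-frequency FFT components as the "trend".
mirrored = np.concatenate([data, data[::-1]])
spectrum = np.fft.rfft(mirrored)
keep = 3                      # low-frequency bins to retain (illustrative)
spectrum[keep:] = 0
smooth = np.fft.irfft(spectrum, n=mirrored.size)[:data.size]
```

Mirroring makes the spliced series an even extension of the data, so the endpoints match and no artificial jump is introduced at the splice; whether the low-frequency components then recover the true trend is exactly the question being asked.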

Given that this sounds a little dubious to me: is this method worth pursuing, or is mirroring and appending our data set somehow fundamentally flawed? We are also looking at other approaches, such as using a low-pass filter.

Best Answer

It sounds dodgy to me, as the trend estimate will be biased near the point where you splice on the artificial data. An alternative approach is a nonparametric regression smoother, such as loess or splines.
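As a sketch of the spline alternative, one could fit a smoothing spline with SciPy; the data below is hypothetical (matching the question's description), and the smoothing factor `s` is an illustrative choice that trades fidelity against smoothness:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)

# Hypothetical series matching the question: 5 mm drift over 35 weeks,
# +/-1-2 mm noise, rounded to the nearest mm.
weeks = np.arange(35, dtype=float)
data = np.round(5.0 * weeks / weeks[-1] + rng.normal(0.0, 1.5, weeks.size))

# Smoothing spline: s controls how closely the fit follows the data.
# Roughly n * sigma^2 is a common starting point when the noise level
# (here ~1.5 mm) is known.
spline = UnivariateSpline(weeks, data, s=weeks.size * 1.5**2)
trend = spline(weeks)
```

Unlike the mirror-and-FFT approach, the smoother uses no artificial data, so the estimate near the ends of the series is biased only by the usual boundary effects of local fitting.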
