I have a vector of timecourse data (1 Hz) that gradually rises and then levels off over time. The data has some natural variability plus occasional artifacts that appear as one- or two-sample peaks or valleys. The variability is useful, but I want to get rid of the artifacts. A 'rule' of the data is that a change of more than 2.6 between adjacent samples cannot possibly be valid and must therefore be an artifact. I have tried a first-order Butterworth filter with a normalized cutoff of 0.2, and a 3-sample median filter; both get rid of the spikes but flatten the variability. In this instance the median gives the better result. As I understand things, band-pass filters don't make sense here, since specific amplitudes are not a problem per se, only samples that differ too greatly from their immediate neighbours.
What I would like is some kind of moving-window filter that acts only on amplitude changes greater than the allowable 2.6, without flattening smaller spikes.
So in the following example, the 160 in the middle and the 148 at the end are suspects, but the rest of the data should be left alone.
HR=[136 137 138 140 141 142 143 143 144 144 145 146 160 144 143 143 142 143 143 144 145 146 145 148];
[b,a]=butter(1,0.2); % 1st-order Butterworth, normalized cutoff 0.2
butterHR=filter(b,a,HR);
gives
butterHR=[33.35 83.94 110.21 124.33 132.26 136.79 139.59 141.26 142.36 143.16 143.82 144.64 148.74 150.34 146.99 145.03 143.79 143.16 143.08 143.29 143.88 144.68 145.08 145.78]
and
medHR=medfilt1(HR,3); % 3-point median filter
gives
medHR=[136 137 138 140 141 142 143 143 144 144 145 146 146 144 143 143 143 143 143 144 145 145 146 145];
The Butterworth smooths everything a little too much, and the median does a great job of getting rid of the 160 and the 148, but it also changes samples 17, 22, and 23. In my perfect world, I would leave those alone.
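To make the intent concrete, here is a rough sketch of the kind of conditional filter I have in mind (written in Python for illustration; the both-neighbours test and the 3-point median replacement are my own assumptions, not an existing library filter):

```python
from statistics import median

def despike(x, max_step=2.6):
    """Replace single-sample artifacts: an interior sample that jumps by
    more than max_step away from BOTH neighbours is swapped for the median
    of its 3-sample window; everything else is left untouched."""
    y = list(x)
    for i in range(1, len(x) - 1):
        if abs(x[i] - x[i - 1]) > max_step and abs(x[i] - x[i + 1]) > max_step:
            y[i] = median([x[i - 1], x[i], x[i + 1]])
    # endpoints have only one neighbour to compare against
    if len(x) >= 2:
        if abs(x[0] - x[1]) > max_step:
            y[0] = x[1]
        if abs(x[-1] - x[-2]) > max_step:
            y[-1] = x[-2]
    return y

HR = [136, 137, 138, 140, 141, 142, 143, 143, 144, 144, 145, 146,
      160, 144, 143, 143, 142, 143, 143, 144, 145, 146, 145, 148]
clean = despike(HR)  # only the 160 (-> 146) and the final 148 (-> 145) change
```

Note that a two-sample artifact would pass the both-neighbours test (the two bad samples sit close to each other), so catching those would need a wider window or a repeated pass.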
Any thoughts?
Best Answer