Smoothing Sensor Signals with Response Times – How to Implement Effective Filtering

filter, moving average, smoothing

I have a sensor (e.g. for temperature) with a response time ($t_{99}$, i.e. the time the sensor needs to reach 99% of the actual value) of, say, 10 seconds. The sensor outputs one value per second (1 Hz). The signal is noisy, but it also contains peaks that should not be smoothed away too much.

How would you approach this problem? Is it correct to use the response time as the window size in, e.g., a simple moving average? Would a Savitzky-Golay filter be better for preserving peak heights? I would probably center the window, since everything happens in post-processing and no real-time output is needed.
How (if it is even reasonable) can I relate the response time to my window size?
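For concreteness, here is a minimal sketch of the two candidates being compared, using a made-up 1 Hz temperature trace (all values invented for illustration) and `scipy.signal.savgol_filter`:

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical 1 Hz temperature trace: flat baseline with a short peak, plus noise.
rng = np.random.default_rng(0)
t = np.arange(120)
truth = 20.0 + 5.0 * np.exp(-0.5 * ((t - 60) / 3.0) ** 2)
noisy = truth + rng.normal(0.0, 0.3, size=t.size)

window = 11  # odd length so the window can be centered (~ t_99 of 10 s at 1 Hz)

# Centered simple moving average
sma = np.convolve(noisy, np.ones(window) / window, mode="same")

# Savitzky-Golay: fits a local polynomial per window, which tends to
# preserve peak height better than a plain average of the same width
sg = savgol_filter(noisy, window_length=window, polyorder=3)

print(f"true peak height:    {truth.max():.2f}")
print(f"moving-average peak: {sma[50:70].max():.2f}")
print(f"Savitzky-Golay peak: {sg[50:70].max():.2f}")
```

On a trace like this, the moving average flattens the peak noticeably more than the Savitzky-Golay filter of the same window length, which is exactly the trade-off I am unsure about.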

Best Answer

If it takes the sensor 10 s to achieve a reliable measurement, yet it outputs a value every second, some internal (physical or firmware) smoothing is probably already in place. In that case, additional smoothing would be neither necessary nor useful.

If the sensor gives you raw data every second, but the manufacturer specifies that a 99%-accurate measurement takes 10 seconds, then you have to compute a moving average over 10 s to get correct readings. In that case the peaks may well be unreliable outliers, so it would be bad to preserve them.
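That 10 s average is a one-liner in practice. A sketch (the readings and noise level below are invented for illustration; at 1 Hz, a 10-sample window spans the 10 s $t_{99}$):

```python
import numpy as np

def moving_average(samples, window=10):
    """Simple moving average; at 1 Hz a 10-sample window spans the 10 s t_99."""
    kernel = np.ones(window) / window
    # mode="valid" drops the edges where the window is only partially filled
    return np.convolve(samples, kernel, mode="valid")

# Hypothetical noisy 1 Hz readings around 20 degC
rng = np.random.default_rng(1)
raw = 20.0 + rng.normal(0.0, 0.5, size=60)
smoothed = moving_average(raw, window=10)
```

Note that `mode="valid"` shortens the output by `window - 1` samples; since you are post-processing, you can also center the window with `mode="same"` and discard the edge artifacts manually.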

If you are unsure, a little experimentation is in order. Heat something to a known temperature, set up the sensor in a controlled environment, and expose it to the heat source for 1 s, 2 s, 5 s, and 10 s, always allowing 30 s of cooldown after each exposure. Note the exact timestamps of the exposures. Then look at the data stream and adjust the post-processing (moving-average window size, window position, peak retention) until it best matches the actual series of events.
