[Math] Measuring Linearity of a Time Series

st.statistics

Is there a de facto standard process or function to measure the linearity of a time series? I have Googled the problem and have come across a few different papers outlining various methods of doing this. The problem is that I'm not well-versed enough in mathematics to comprehend these papers and determine which of the methods, if any, is best for my application.

Here are several papers that I came across:

Just in case I phrased this incorrectly, I'll outline what I'm trying to do: I have a time series data set. I would simply like to know how much the series is like a straight line. For example, a time series derived from f(x)=2x would have a linearity of 1.0, f(x)=sin(x) would be something less, and a random data set would have a linearity of 0.0 or near-zero.

Any ideas on how to derive this measurement given an arbitrary time series?

Best Answer

First of all: by "linear time series", do you mean a time series with a linear recurrence relation, or a time series that is linear with respect to time? The two are entirely different things. The first paper you cited defines a "linear" time series as one with a linear recurrence relation: in its opening pages, the author assumes the series is a standard $AR(p)$ process and then proceeds to determine its linearity.
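
For reference, an $AR(p)$ process satisfies a linear recurrence of the form $X(t) = c + \sum_{i=1}^{p} \phi_i X(t-i) + \varepsilon(t)$, where $\varepsilon(t)$ is white noise; "linear" there refers to the dependence on past values, not to a straight-line trend in $t$.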

From your post, I think you are not concerned with that kind of linearity. Let $X(t)$ be the time series, $t=1,2,\cdots$. Take $t$ as the independent variable and $X(t)$ as the dependent variable, and fit a line to the data. From the fitted line, obtain the residuals $e(t)=\hat{X}(t)-X(t)$ and check whether they are significantly different from zero using a t-test.

In case you are not well-versed in all of this, read up on the basics of curve fitting and the t-test. The latter can be done very easily in any statistical software, and the former in most math software.
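
For concreteness, here is a minimal sketch in Python (assuming NumPy and SciPy are available) of the procedure above: fit a line in $t$, compute the residuals, and t-test them. The reported $r^2$ of the fit is my own addition as one convenient 0-to-1 summary of how line-like the series is, which seems to match what the question is after.

```python
import numpy as np
from scipy import stats

def linearity_check(x):
    """Fit a straight line X(t) ~ a*t + b and summarize how line-like the series is."""
    t = np.arange(len(x))

    # Least-squares line fit of X(t) against t
    fit = stats.linregress(t, x)
    fitted = fit.intercept + fit.slope * t

    # Residuals of the fit (fitted minus observed, as in the answer above)
    residuals = fitted - x

    # One-sample t-test of whether the residuals differ from zero, as suggested above.
    # Note: least-squares residuals average to roughly zero by construction,
    # so the r^2 of the fit is often the more informative summary.
    t_stat, p_value = stats.ttest_1samp(residuals, 0.0)

    return {
        "slope": fit.slope,
        "intercept": fit.intercept,
        "r_squared": fit.rvalue ** 2,   # 1.0 for an exact line, near 0 for noise
        "residual_t_stat": t_stat,
        "residual_p_value": p_value,
    }

# Example: a near-perfect line, a sine wave, and pure noise
rng = np.random.default_rng(0)
t = np.arange(200)
print(linearity_check(2 * t + rng.normal(scale=0.01, size=200)))  # r_squared ~ 1.0
print(linearity_check(np.sin(0.3 * t)))                           # r_squared well below 1
print(linearity_check(rng.normal(size=200)))                      # r_squared near 0
```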
