Solved – Automatic test measuring dissimilarity between two time series

computational-statistics, cross-correlation, distributions, kolmogorov-smirnov-test, time-series

I have two time series, Series1 and Series2. My aim is to quantify, automatically, how different Series2 is from Series1.
[Figure: histogram comparing Series1 and Series2]

Series1 is the expected result.
Series2 is the test/incoming series.

I am providing a histogram plot, where Series2 is shown in dark brown. Note that on the x-axis, between 221 and 353, there is a significant variation: Series2 is lower than Series1. I am coding in C++.

I think cross-correlation will help, but it produces a value based on similarity rather than dissimilarity. I have seen people mention the Kolmogorov-Smirnov test. Is that the test I should be performing?

UPDATE 1: I am trying to perform template matching. I have divided both my template image and my incoming test image into 8×8 blocks. I compare each block of the template image with the corresponding block (based on spatial pixel position) in the test image by computing the sum of intensities within each block. This gives me Series1 for the template image and Series2 for the test image.
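For concreteness, the block-sum step described above could be sketched as follows, assuming a grayscale image stored row-major in a `std::vector<unsigned char>` and dimensions that are multiples of the block size (the function name and layout are my own, not from the original code):

```cpp
#include <cstddef>
#include <vector>

// Sum pixel intensities inside each b x b block of a width x height
// grayscale image stored row-major. Assumes width and height are
// multiples of b. The returned sums, in row-major block order, form
// the series to be compared (Series1 for the template, Series2 for
// the test image).
std::vector<double> blockSums(const std::vector<unsigned char>& img,
                              std::size_t width, std::size_t height,
                              std::size_t b = 8)
{
    std::vector<double> sums;
    for (std::size_t by = 0; by < height; by += b) {
        for (std::size_t bx = 0; bx < width; bx += b) {
            double s = 0.0;
            for (std::size_t y = by; y < by + b; ++y)
                for (std::size_t x = bx; x < bx + b; ++x)
                    s += img[y * width + x];
            sums.push_back(s);
        }
    }
    return sums;
}
```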

Best Answer

There are many different distance measures. For starters, there's always the correlation.
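Since the questioner works in C++, here is a minimal sketch of Pearson's correlation between two equal-length series; taking 1 − r turns it into a dissimilarity score (0 for identical shape, 2 for perfectly anti-correlated). The function name is my own:

```cpp
#include <cmath>
#include <cstddef>
#include <numeric>
#include <vector>

// Pearson correlation between two equal-length series.
// 1 - pearson(a, b) can be used as a dissimilarity measure.
double pearson(const std::vector<double>& a, const std::vector<double>& b)
{
    const std::size_t n = a.size();
    const double ma = std::accumulate(a.begin(), a.end(), 0.0) / n;
    const double mb = std::accumulate(b.begin(), b.end(), 0.0) / n;
    double num = 0.0, da = 0.0, db = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        num += (a[i] - ma) * (b[i] - mb);
        da  += (a[i] - ma) * (a[i] - ma);
        db  += (b[i] - mb) * (b[i] - mb);
    }
    return num / std::sqrt(da * db);
}
```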

You can also look at the mean square error. In R, you can see an implementation for time series in Rob Hyndman's ftsa package (see the error function).

See Liao (2005) for a nice short survey of time series similarity measures, including Euclidean distance, root mean square distance, Minkowski distance, Pearson's correlation, dynamic time warping distance, Kullback-Leibler distance, symmetric Chernoff information divergence, and cross-correlation.
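Two of the simplest measures in that list reduce to one small function: the Minkowski distance of order p, where p = 2 gives the ordinary Euclidean distance and p = 1 the Manhattan distance. A C++ sketch:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minkowski distance of order p between two equal-length series:
// (sum_i |a_i - b_i|^p)^(1/p). p = 2 is Euclidean, p = 1 is Manhattan.
double minkowski(const std::vector<double>& a,
                 const std::vector<double>& b, double p)
{
    double s = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        s += std::pow(std::abs(a[i] - b[i]), p);
    return std::pow(s, 1.0 / p);
}
```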