The problem is definitely hard.
Mechanical rules like +/- N1 standard deviations, or +/- N2 times the MAD, or +/- N3 times the IQR, etc., will fail because there are always some series that behave differently. For example:
- fixings such as interbank rates may be constant for some time and then jump all of a sudden
- similarly for, e.g., certain foreign exchange rates coming off a peg
- certain instruments are implicitly spreads; these may be near zero for long periods and then all of a sudden jump manifold
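To make the failure mode concrete, here is a minimal sketch of one such mechanical rule: a rolling median +/- a multiple of the MAD. The function name, window length, and multiplier are arbitrary choices, not recommendations. Note what happens when the recent history is constant: the MAD is zero and *every* subsequent move looks infinitely extreme, which is exactly the fixing/peg problem above.

```python
import numpy as np

def rolling_mad_outliers(x, window=20, n_mads=5.0):
    """Flag points more than n_mads robust deviations from a rolling
    median.  A sketch only: window and n_mads would need tuning per
    series, and the zero-MAD branch shows the constant-series pitfall."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        hist = x[i - window:i]
        med = np.median(hist)
        mad = np.median(np.abs(hist - med))
        if mad == 0:
            # Constant history: any change at all gets flagged,
            # including a perfectly legitimate jump in a fixing.
            flags[i] = x[i] != med
        else:
            # 0.6745 scales the MAD to a sigma-equivalent.
            flags[i] = abs(x[i] - med) > n_mads * mad / 0.6745
    return flags
```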
Been there, done that, ... in a previous job. You could try to bracket each series using arbitrage relationships (e.g., assuming USD/EUR and EUR/JPY are good, you can work out bands around what USD/JPY should be; likewise for derivatives off an underlying, etc. pp.).
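A sketch of that bracketing idea (reading USD/EUR as EUR per USD and EUR/JPY as JPY per EUR, so the implied cross is their product; the function name and the tolerance standing in for spreads/fees are made up):

```python
# Hypothetical sketch: if the USD/EUR and EUR/JPY quotes are trusted,
# the USD/JPY quote should lie inside a band around the implied
# triangular cross rate.  `tol` is an arbitrary stand-in for
# bid/ask spreads and fees.
def usdjpy_band(usdeur, eurjpy, tol=0.002):
    implied = usdeur * eurjpy              # no-arbitrage cross rate
    return implied * (1 - tol), implied * (1 + tol)

lo, hi = usdjpy_band(0.92, 160.0)          # implied cross near 147.2
flag_quote = lambda q: not (lo <= q <= hi)  # True -> suspect data point
```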
Commercial data vendors expend some effort on this, and those of us who are clients of theirs know ... it still does not exclude errors.
It sounds like you have a set of known patterns and want to find places in your signal where these patterns occur. A typical way of doing this is using the cross correlation. In this approach, you'd compute the cross correlation of your pattern with the signal. You can think of this as repeatedly shifting the pattern by some lag to align it with a different portion of the signal, then taking the dot product of the pattern and the local portion of the signal. This gives a measure of the similarity between the pattern and the local signal at each lag. When the signal matches the pattern, this will manifest as a peak in the cross correlation.
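A toy sketch of that sliding-dot-product view, using NumPy's `np.correlate` (the pattern, noise level, and embedded offset are all made up):

```python
import numpy as np

rng = np.random.default_rng(0)
pattern = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
signal = rng.normal(0.0, 0.1, 50)     # background noise
signal[20:25] += pattern              # embed the pattern at offset 20

# One dot product per lag; 'valid' keeps only full-overlap positions.
xcorr = np.correlate(signal, pattern, mode="valid")
best_lag = int(np.argmax(xcorr))      # peak where the pattern was embedded
```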
Different variants of the cross correlation exist. For example, some versions locally scale and/or normalize the signals. This can be useful if you want your comparison to be shift/scale invariant (e.g. you want the shape of the signal to be the same, but don't care about the actual magnitude; in the case of detecting accelerometer patterns, this might correspond to performing the same motion but more or less vigorously).
The cross correlation will naturally fluctuate, reflecting varying degrees of similarity between the pattern and signal. So, the question is how to distinguish peaks that represent a 'true match' from those that reflect only partial similarity. You'll have to define this based on the variant of cross correlation you use. For example, if the pattern exactly matches the signal at some offset, the magnitude of the unnormalized cross correlation will equal the squared $l_2$ norm of the pattern (i.e. the dot product of the pattern with itself). Some normalized versions of the cross correlation have a maximum amplitude of 1. You'd also need to define some tolerance to account for noise in the signal (you probably don't want to require an exact match).
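A sketch of one locally normalized variant: each window and the pattern are mean-centered and scaled to unit norm, so an exact shifted/scaled match scores 1. The function name, toy data, and the 0.99 tolerance are arbitrary choices.

```python
import numpy as np

def normalized_xcorr(signal, pattern):
    """Locally normalized cross correlation; a sketch only.  Real code
    should handle zero-variance windows more carefully than this."""
    p = pattern - pattern.mean()
    p = p / np.linalg.norm(p)
    n = len(pattern)
    out = np.empty(len(signal) - n + 1)
    for lag in range(len(out)):
        w = signal[lag:lag + n] - signal[lag:lag + n].mean()
        norm = np.linalg.norm(w)
        out[lag] = (w @ p) / norm if norm > 0 else 0.0
    return out

pattern = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
signal = np.zeros(30)
signal[10:15] = 5.0 * pattern + 3.0        # scaled and shifted copy
scores = normalized_xcorr(signal, pattern)
matches = np.flatnonzero(scores > 0.99)    # tolerance for noise
```

Because of the centering and scaling, the copy at offset 10 still scores 1 despite being rescaled and shifted in level.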
Another possibility is that you want to use some other measure of similarity (e.g. the euclidean distance). In this case, you could use peaks in the cross correlation to identify candidate matches, then check them using whatever distance metric/similarity function you like.
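A sketch of that two-stage idea (the function name, toy data, and tolerance are made up):

```python
import numpy as np

# Use cross-correlation peaks only as candidate offsets, then accept
# or reject each one with the distance you actually care about
# (here, plain Euclidean distance against a made-up tolerance).
def verify_candidates(signal, pattern, candidate_lags, max_dist):
    n = len(pattern)
    return [lag for lag in candidate_lags
            if np.linalg.norm(signal[lag:lag + n] - pattern) <= max_dist]

pattern = np.array([1.0, 2.0, 1.0])
signal = np.array([0.0, 1.0, 2.0, 1.0, 0.0, 5.0, 5.0, 5.0])
accepted = verify_candidates(signal, pattern, [1, 5], max_dist=0.5)
```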
One of the main reasons to use cross correlation is that it's very computationally efficient. For large signals, you can gain even more speed by computing it in the Fourier domain, using FFTs. Many packages/libraries are available to do this.
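A sketch of the FFT route using only NumPy, exploiting the fact that correlation is convolution with a reversed pattern (`fft_xcorr` is a made-up name; in practice you would usually just call a library routine such as `scipy.signal.correlate`):

```python
import numpy as np

def fft_xcorr(signal, pattern):
    """Cross correlation via three FFTs instead of O(N*M) dot products."""
    n = len(signal) + len(pattern) - 1        # full linear correlation length
    F = np.fft.rfft(signal, n)
    G = np.fft.rfft(pattern[::-1], n)         # reversal turns conv into corr
    full = np.fft.irfft(F * G, n)
    return full[len(pattern) - 1:len(signal)] # keep 'valid' lags only

sig = np.array([0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
pat = np.array([1.0, 2.0, 1.0])
direct = np.correlate(sig, pat, mode="valid")
fast = fft_xcorr(sig, pat)                    # same values, up to float error
```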
The cross correlation approach (and FFT acceleration) will also work for higher dimensional signals (e.g. images).
Best Answer
Pattern recognition in time series can involve a number of components. Memory, i.e. auto-dependence, can be characterized via an ARIMA component (stochastic/adaptive structure). Deterministic structure such as level shifts, local time trends, pulses, and seasonal pulses can be found via intervention detection schemes. Changes in error variance and changes in the parameters of the model over time can be found via residual diagnostic checking. Furthermore, patterns can respond to known events; the response is sometimes in anticipation of the event, but much more frequently when the event occurs and in the periods following it. If you have a pet time series, please post it and I will try to develop the underlying pattern via available software.
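For illustration only, here is a toy version of one intervention-detection idea: find the single breakpoint that best explains the series as two constant levels. The function name and minimum segment length are arbitrary, and this is far simpler than a full intervention-detection scheme (no trends, pulses, or significance testing).

```python
import numpy as np

def best_level_shift(y, min_seg=5):
    """Return the split point minimizing the two-segment sum of squared
    errors around each segment's mean -- a toy level-shift detector."""
    y = np.asarray(y, dtype=float)
    best_t, best_sse = None, np.inf
    for t in range(min_seg, len(y) - min_seg):
        sse = (((y[:t] - y[:t].mean()) ** 2).sum()
               + ((y[t:] - y[t:].mean()) ** 2).sum())
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t

y = np.concatenate([np.zeros(20), np.full(20, 4.0)])  # shift at t = 20
shift_at = best_level_shift(y)
```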