First there are ways to make a non-stationary series stationary.
1. Remove polynomial trends (e.g. 1st-, 2nd-, or 3rd-order differencing).
2. Remove seasonal components (e.g. seasonal differencing).
3. Identify changes over time (intervention analysis).
These are not the only approaches but are what you will find suggested by Box, Jenkins and Reinsel in their time series book.
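The first two steps can be sketched with ordinary and seasonal differencing. The toy series below is my own illustration (a linear trend plus a period-4 seasonal pattern), not an example from the book:

```python
import numpy as np

# Toy series: linear trend (slope 2) plus a repeating period-4 seasonal pattern.
t = np.arange(24)
series = 2.0 * t + np.tile([10.0, -5.0, 3.0, -8.0], 6)

# 1st-order differencing removes the linear trend
# (what remains is periodic with a constant mean).
d1 = np.diff(series)

# Seasonal differencing at lag 4 removes the periodic component
# (what remains here is the constant 4-step trend increment, 8).
sd = series[4:] - series[:-4]
```

Applying both operations in sequence reduces this noise-free toy series to a constant, which is the point of the differencing steps.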
Those steps will make the series stationary but not necessarily independent or uncorrelated.
After doing the above, fitting an autoregressive moving average (ARMA) model may then leave the residuals uncorrelated (of course, uncorrelated does not imply that the residuals are independent).
This is often called the Box-Jenkins approach to time series analysis. Usually the interest is in the model that characterizes the data, not in the uncorrelated residuals.
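As a minimal sketch of that idea, here is an AR(1) fit (the simplest special case of ARMA) by least squares, using a simulated series; in practice one would use a proper ARIMA routine rather than this hand-rolled estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary but autocorrelated AR(1) series:
# x[t] = 0.8 * x[t-1] + e[t]
n = 2000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + e[t]

def lag1_corr(z):
    """Sample lag-1 autocorrelation."""
    z = z - z.mean()
    return np.dot(z[1:], z[:-1]) / np.dot(z, z)

# Least-squares estimate of the AR(1) coefficient.
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# The residuals of the fitted model should be (approximately) uncorrelated.
resid = x[1:] - phi_hat * x[:-1]
```

The series itself has strong lag-1 autocorrelation (near 0.8), while the residuals' lag-1 autocorrelation is near zero, which is exactly what the ARMA step is meant to achieve.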
This is all done in the time domain. Periodic components can also be identified in the frequency domain: once the series is stationary, take a Fourier transform and smooth the resulting periodogram.
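A raw (unsmoothed) periodogram already illustrates the idea; the series below with a hidden period-12 component is my own toy example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stationary series with a hidden period-12 component plus noise.
n = 600
t = np.arange(n)
x = np.sin(2 * np.pi * t / 12) + 0.5 * rng.standard_normal(n)

# Raw periodogram via the FFT; in practice it would be smoothed
# before interpreting it.
freqs = np.fft.rfftfreq(n)                       # cycles per observation
pgram = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n

# The dominant frequency (skipping the zero frequency) recovers the period.
peak_freq = freqs[np.argmax(pgram[1:]) + 1]
```

The peak sits at frequency 1/12, i.e. the period-12 component, even though it is not obvious from the noisy series itself.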
Rather than removing trends and autocorrelation first, intervention analysis will identify change points as part of the modeling process. Software such as autobox will do this automatically for you. Irishstat can help you with that.
Keep in mind that lack of correlation is not the same as independence. Also, stationarity means that the series does not change behavior over time: no trends, no seasonal components, and no changes in variance with time. A stationary series can still have significant autocorrelation, so filters or ARMA models need to be applied to a stationary series to get uncorrelated residuals.
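The classic textbook illustration of "uncorrelated but not independent" is a symmetric variable and its square; the simulation below is my own sketch of it:

```python
import numpy as np

rng = np.random.default_rng(3)

# X is symmetric about 0 and Y = X**2 is a deterministic function of X,
# so the two are clearly dependent -- yet their correlation is ~0,
# because E[X**3] = 0 for a symmetric distribution.
x = rng.standard_normal(100_000)
y = x ** 2

corr = np.corrcoef(x, y)[0, 1]
```

Correlation only measures linear association, which is why uncorrelated residuals from an ARMA fit need not be independent.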
As a signal is by definition a time series, there is significant overlap between the two.
I would expect a book on time-series analysis to be either a mathematical treatment or a business/commercial treatment, while a book on statistical signal processing is likely to make heavy use of mathematics but be interested in the problems of signal analysis, classification, noise reduction, and other problems relevant to engineering / applied science.
Statistical signal processing uses the language and techniques of mathematical time-series analysis, but also introduces into the problem domain many concepts and techniques from electrical engineering: signal-to-noise ratio, dynamic range, and time/frequency-domain transforms.
In my view, time-series analysis is a mathematical field, which then has applications wherever time series tend to crop up. Those fields then develop techniques that are specialised for those problem domains, with a specialised body of knowledge.
As time series arise in business and economics, there is an industry of material on time-series forecasting, trend analysis, etc. Much of this 'commercial' application is not present in the material on statistical signal processing, in part because the nature of the two kinds of time series is very different: signals are continuous in both time and the measured variable (e.g. voltage, intensity, etc.), whereas most business time series are taken over a discrete time domain (days, weeks, months, quarters, years).
It sounds dodgy to me, as the trend estimate will be biased near the point where you splice on the false data. An alternative approach is a nonparametric regression smoother such as loess or splines.
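A loess-style smoother can be sketched in a few lines: for each point, fit a weighted local line through its nearest neighbours with tricube weights. This is a simplified version of what loess does (no robustness iterations), on a made-up noisy sine series:

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy observations of a smooth trend.
n = 200
x = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

def loess_fit(x, y, span=0.3):
    """Minimal loess sketch: local linear regression with tricube weights."""
    k = max(2, int(span * len(x)))
    fitted = np.empty_like(y)
    for i, x0 in enumerate(x):
        d = np.abs(x - x0)
        idx = np.argsort(d)[:k]                       # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3   # tricube weights
        # Weighted least squares for a local line a + b * x.
        A = np.vstack([np.ones(k), x[idx]]).T
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = coef[0] + coef[1] * x0
    return fitted

smooth = loess_fit(x, y)
```

Because the fit at each point uses only nearby data, spliced-on artificial data would only distort the estimate locally, and the local-linear form keeps the fit well behaved at the series endpoints.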