[Math] How to generalise the Fourier transform

fourier-analysis, signal-processing

The Fourier transform represents a signal as a sum of sine and cosine waves. The inverse Fourier transform then reconstructs the original signal from this information.

I am told that it's possible to decompose a signal using some other set of functions, rather than the usual sine and cosine. My question is, how do you do this?

For a start, I'm assuming that for your set of functions to be able to approximate every possible signal, you need to have "enough" of these functions, and ideally you want them to be "different" such that each one measures an unrelated aspect of the signal.

To be completely clear, I'm mostly interested in the case of digital sampled data. But the continuous case might be interesting too…


Edit:

I'm not sure why my question isn't producing any answers. Maybe it's because nobody actually knows the answer, or maybe it's because the answer is "too obvious" to somebody who actually possesses formal mathematical training. I'm not sure. But I've wanted to know the answer to this question for years, so let's try one more time…

The discrete Fourier transform works by computing the correlation of the input signal with several different sine waves. The inverse transform then adds together the specified amplitudes of those waves, recovering the original signal. That much seems clear.
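To make sure I'm describing the process correctly, here is a sketch in Python (numpy) of what I understand the DFT to be doing — correlate against complex exponentials, then sum them back up with the measured amplitudes:

```python
import numpy as np

def dft(x):
    """Correlate the signal with each complex exponential."""
    N = len(x)
    n = np.arange(N)
    # One row per frequency k: exp(-2*pi*i*k*n/N)
    basis = np.exp(-2j * np.pi * np.outer(n, n) / N)
    return basis @ x

def idft(X):
    """Add the exponentials back together with the measured amplitudes."""
    N = len(X)
    n = np.arange(N)
    basis = np.exp(2j * np.pi * np.outer(n, n) / N)
    return (basis @ X) / N   # the 1/N is the normalisation

x = np.array([1.0, 2.0, 3.0, 4.0])
assert np.allclose(idft(dft(x)), x)   # round trip recovers the signal
```

This round-trips correctly with the sine/cosine basis, which is what makes my failure below so puzzling.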

It looks like I could just invent a family of functions to use instead of the sine and cosine functions, and do exactly the same process… except that when I do this, it doesn't work in any way, shape or form. If I transform and then inverse-transform, I get gibberish. And I don't know why… but it seems like the key phrase is "complete set of orthonormal functions", whatever that means.


Update:

I had assumed that if I could just find a system of basis functions such that none of them are correlated, and their number equals the number of points in the input, the transform would work. Apparently, it does not.

Consider the following set of functions:

$$f_1 = [1,1,1,1]$$
$$f_2 = [0,1,0,-1]$$
$$f_3 = [1,0,-1,0]$$
$$f_4 = [1,-1,1,-1]$$

Clearly, there are 4 functions. As far as I can tell, none of them are correlated. For example,

$$f_1 * f_4 = (1 * 1) + (1 * -1) + (1 * 1) + (1 * -1) = 1 - 1 + 1 - 1 = 0$$

If we take, say, $x = [1,2,3,4]$ and compute the correlations, we get

$$f_1 * x = 1 + 2 + 3 + 4 = 10$$
$$f_2 * x = 0 + 2 + 0 - 4 = -2$$
$$f_3 * x = 1 + 0 - 3 + 0 = -2$$
$$f_4 * x = 1 - 2 + 3 - 4 = -2$$

Now, computing $10 f_1 - 2 f_2 - 2 f_3 - 2 f_4$, we get

$$10 + 0 - 2 - 2 = 6$$
$$10 - 2 + 0 + 2 = 10$$
$$10 + 0 + 2 - 2 = 10$$
$$10 + 2 + 0 + 2 = 14$$

Clearly $[6, 10, 10, 14]$ is nothing like $[1,2,3,4]$, even with scaling. So… what am I missing?
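In case I've made an arithmetic slip somewhere, here is the same computation in Python (numpy):

```python
import numpy as np

f = np.array([[1, 1, 1, 1],
              [0, 1, 0, -1],
              [1, 0, -1, 0],
              [1, -1, 1, -1]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])

coeffs = f @ x      # correlations with each basis function: [10, -2, -2, -2]
recon = coeffs @ f  # sum of coefficient times basis function
print(recon)        # [ 6. 10. 10. 14.] -- not [1, 2, 3, 4]
```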

Best Answer

Your question lies at the core not only of signal processing but also of differential equations and orthogonal special functions, fields that have a long history and are still active and evolving, so it's a daunting task to point out where you could start your studies.

The Wikipedia article leonbloy pointed out, Generalized Fourier Series, together with the Green's Function article and its section on eigenvalue expansions, introduces the jargon you should become thoroughly familiar with.

The basic algorithm is to find dual sets of eigenvectors/eigenfunctions, parametrized by a continuous index (e.g., $\omega$ below) or a discrete one (e.g., $n$ below), that satisfy completeness and orthogonality relations encapsulated in Dirac delta function resolutions, such as the one for the Fourier transform

$$\delta(x-y)= \int_{-\infty}^{\infty}\exp(i2\pi \omega x)\exp(-i2\pi \omega y)\,d\omega$$

giving

$$\int_{-\infty}^{\infty}f(y)\delta(x-y)\,dy=f(x)=\int_{-\infty}^{\infty}\exp(i2\pi \omega x)\int_{-\infty}^{\infty}f(y)\exp(-i2\pi \omega y)\, dy\, d\omega$$

$$=\int_{-\infty}^{\infty}\exp(i2\pi \omega x)\hat{f}(\omega) d\omega,$$
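In the discrete setting you care about, the same resolution of the identity becomes $\frac{1}{N}\sum_{k=0}^{N-1}\exp\left(i2\pi k(m-n)/N\right)=\delta_{mn}$, which is easy to verify numerically (a quick sketch in Python with numpy):

```python
import numpy as np

N = 8
k = np.arange(N)
# Rows (and columns) are the discrete exponentials exp(i 2 pi k n / N)
F = np.exp(2j * np.pi * np.outer(k, k) / N)

# Completeness/orthogonality: (1/N) F F^H = I, the Kronecker delta
assert np.allclose(F @ F.conj().T / N, np.eye(N))
```

This unitarity (up to the $1/N$ factor) is exactly what guarantees that transform followed by inverse transform recovers the signal.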

or that for the eigenfunctions of Sturm-Liouville differential operators over finite domains

$$\delta(x-y)=\sum_{n=0}^{\infty }\Psi_n(x)\Psi_n^*(y)$$

giving

$$f(x)=\sum_{n=0}^{\infty }\Psi_n(x)\int f(y)\Psi_n^*(y)\, dy,$$

with the integral taken over the finite domain,
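As a concrete finite-domain example of such an eigenfunction expansion, here is a sketch using the Legendre polynomials on $[-1,1]$, which satisfy $\int_{-1}^{1}P_n(x)P_m(x)\,dx = \frac{2}{2n+1}\delta_{mn}$ (so each projection must be scaled by $\frac{2n+1}{2}$; this uses numpy's `numpy.polynomial.legendre` module):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_coeffs(f, N, quad_nodes=60):
    """Project f onto P_0..P_{N-1}: c_n = (2n+1)/2 * integral of f(x) P_n(x)."""
    x, w = legendre.leggauss(quad_nodes)   # Gauss-Legendre quadrature on [-1, 1]
    fx = f(x)
    return np.array([(2 * n + 1) / 2
                     * np.sum(w * fx * legendre.legval(x, np.eye(N)[n]))
                     for n in range(N)])

f = np.exp                        # expand f(x) = e^x on [-1, 1]
c = legendre_coeffs(f, 10)
xs = np.linspace(-1, 1, 7)
recon = legendre.legval(xs, c)    # sum over n of c_n * P_n(x)
assert np.allclose(recon, f(xs), atol=1e-6)
```

Ten terms already reproduce $e^x$ to better than $10^{-6}$ on the whole interval, because the coefficients of a smooth function decay very quickly.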

or Kronecker delta resolutions such as that for the associated Laguerre functions

$$\frac{(n+\alpha)!}{n!}\delta_{mn}=\int_{0}^{\infty}x^{\alpha}e^{-x}L_{n}^{\alpha}(x)L_{m}^{\alpha}(x)dx$$

giving

$$f(x)=\sum_{n=0}^{\infty }\frac{n!L_{n}^{\alpha}(x)}{(n+\alpha)!}\hat{f}_n$$

with

$$\hat{f}_n=\int_{0}^{\infty}x^{\alpha}e^{-x}L_{n}^{\alpha}(x)f(x)dx.$$
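Note how the factor $\frac{n!}{(n+\alpha)!}$ in the Laguerre expansion divides out the norm appearing in the orthogonality relation. That same correction is what is missing from your $4\times 4$ experiment: your vectors are orthogonal but not orthonormal, so each correlation must be divided by $\lVert f_i\rVert^2$ before summing (a sketch in Python):

```python
import numpy as np

f = np.array([[1, 1, 1, 1],
              [0, 1, 0, -1],
              [1, 0, -1, 0],
              [1, -1, 1, -1]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])

# Divide each correlation by the squared norm ||f_i||^2 = [4, 2, 2, 4]
coeffs = (f @ x) / np.sum(f * f, axis=1)
recon = coeffs @ f
print(recon)   # [1. 2. 3. 4.] -- recovered exactly
```

Equivalently, you could normalize each basis vector to unit length up front, which is what "orthonormal" buys you: the transform and its inverse then use the very same vectors with no extra scaling.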

The Fourier Transform and Its Applications by R. Bracewell is a really good book for grasping the fundamentals of the FT and DFT, as is G. Strang's Introduction to Applied Mathematics.

Methods of Applied Mathematics by F. Hildebrand and Principles and Techniques of Applied Mathematics by B. Friedman give good intros to Fredholm theory and Green's functions.

More advanced books on harmonic analysis, such as J. Partington's Interpolation, Identification, and Sampling might be the next leap if you are comfortable with complex analysis (e.g., fractional linear transformations) and other integral transforms such as the Laplace transform.
