[Math] Waves of differing frequency are orthogonal – help me understand

linear algebra, signal processing

I know that sinusoidal waves of different frequencies are orthogonal to each other. For instance:

# Shows that 1 Hz and 2 Hz waves are orthogonal
import numpy

x = numpy.linspace(0, 1, 1000, endpoint=False)  # one full period, evenly sampled
wave_1hz = numpy.sin(1 * 2*numpy.pi*x)
wave_2hz = numpy.sin(2 * 2*numpy.pi*x)
print(numpy.dot(wave_1hz, wave_2hz))
# This prints a value very near 0, showing they are orthogonal

I am wondering if someone can give me an analogy to help me understand "why" waves of different frequencies are orthogonal.

To give a better idea of what I am looking for: I intuitively understand that if you have two $\Bbb R^2$ vectors at a right angle, and you look at the dot product as projecting one onto the other, the result will be 0 (like shining a flashlight straight down on a vertical pole). This helps me understand what orthogonality means in the context of $\Bbb R^2$ vectors. But I don't have any such analogy for waves of different frequencies.

Best Answer

Orthogonality in this context means using an inner product like $$\langle\phi_1,\phi_2\rangle = \int_0^{2\pi} \phi_1(x)\phi_2(x)\ dx.$$ This inner product measures the scalar projection of one function onto another by multiplying the two functions pointwise and integrating (averaging) the product.
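As a concrete numerical check, here is a minimal sketch of this inner product using scipy.integrate.quad (the helper name inner is just for illustration):

# Sketch: the inner product <phi1, phi2> as a numerical integral over [0, 2*pi]
import numpy as np
from scipy.integrate import quad

def inner(phi1, phi2):
    value, _ = quad(lambda x: phi1(x) * phi2(x), 0, 2 * np.pi)
    return value

print(inner(np.sin, lambda x: np.sin(2 * x)))  # ~0: orthogonal
print(inner(np.sin, np.sin))                   # ~pi: sin is not a unit vector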

So let's look at the integral of the product of two sine curves of differing frequency. Let's use $\phi_1 = \sin(x)$ and $\phi_2 = \sin(2x)$. Note that the frequency of $\phi_1$ is $1$ and the frequency of $\phi_2$ is $2$.

The basic idea is that if the frequencies of the two sine curves are different, then between $0$ and $2\pi$, the two sine curves are of opposite sign as much as they are of the same sign:

[Figure: the two sine curves, courtesy of WolframAlpha]

Thus their product will be positive as much as it is negative. In the integral, those positive contributions will exactly cancel the negative contributions, leading to an average of zero:

[Figure: plot of $\sin(x)\sin(2x)$]
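You can see the cancellation numerically; a quick sketch, sampling the product over one period:

# Sketch: the product sin(x)*sin(2x) averages to zero over [0, 2*pi]
import numpy as np

x = np.linspace(0, 2 * np.pi, 10000, endpoint=False)
print((np.sin(x) * np.sin(2 * x)).mean())  # very near 0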

That's the intuition. Proving it just takes a bit of Calc 2:

We know from trig that $\sin(mx)\sin(nx) = \frac{1}{2}\bigg(\cos\big((m-n)x\big) - \cos\big((m+n)x\big)\bigg)$, so here for $m=1$ and $n=2$, \begin{align*} \int_0^{2\pi}\sin(x)\sin(2x)\ dx &= \frac{1}{2}\int_0^{2\pi}\cos(-x)-\cos(3x)\ dx\\ &= \frac{1}{2}\bigg(\sin(x)\bigg|_0^{2\pi}-\frac{1}{3}\sin(3x)\bigg|_0^{2\pi}\bigg)\\ &= 0 \end{align*} since $\sin(2\pi m) = \sin(0) = 0$ for any integer $m$.
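If you'd like to verify this (and a few neighboring frequency pairs) exactly, here is a small symbolic sketch using sympy:

# Sketch: exact symbolic check of the integral for small frequency pairs
import sympy as sp

x = sp.symbols('x')
for m in range(1, 4):
    for n in range(1, 4):
        val = sp.integrate(sp.sin(m * x) * sp.sin(n * x), (x, 0, 2 * sp.pi))
        print(m, n, val)  # pi when m == n, 0 otherwise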

I'll leave the general case of two sines/cosines of differing frequencies to you as an exercise.


More generally, functions out of some space into $\mathbb{R}$ form a vector space. They can be added, subtracted, and scaled. Thus you can do linear algebra to them.
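For instance, here is a minimal sketch of those vector-space operations in Python (the names f, g, h are just illustrative):

# Sketch: functions behave like vectors -- add, subtract, and scale pointwise
import numpy as np

f = lambda x: np.sin(x)
g = lambda x: np.cos(2 * x)
h = lambda x: 3 * f(x) - 2 * g(x)   # a linear combination is again a function
print(h(0.5))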

In particular, you can decompose functions on $[0,2\pi]$ into sinusoidal components by averaging them with sine and cosine curves. This is exactly analogous to shining a flashlight on the function and seeing how much of its shadow projects onto the $\sin(x)$ vector; the projection is $$\frac{\langle f(x),\sin(x)\rangle}{\langle \sin(x),\sin(x)\rangle} \sin(x),$$ where the denominator $\langle\sin(x),\sin(x)\rangle = \pi$ accounts for $\sin(x)$ not being a unit vector under this inner product.
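Here is a minimal sketch of that projection, using the example function $f(x) = x$ (chosen only for illustration):

# Sketch: project f(x) = x onto sin(x) over [0, 2*pi]
import numpy as np
from scipy.integrate import quad

f = lambda x: x                                  # an example function
coeff = quad(lambda x: f(x) * np.sin(x), 0, 2 * np.pi)[0] / np.pi
projection = lambda x: coeff * np.sin(x)         # the shadow of f along sin
print(coeff)                                     # -2.0 for this f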

As we saw previously, the sine and cosine curves of different frequencies are orthogonal to each other because they average against each other to zero. In fact, they form an orthogonal basis of the vector space of functions on $[0,2\pi]$ (orthonormal once each is divided by its norm $\sqrt{\pi}$). Every function $f$ can be written as a sum of these basis vectors: $$f(x) = \frac{\langle f(x),1\rangle}{2\pi} + \frac{1}{\pi}\sum_{k=1}^\infty \Big(\langle f(x),\sin(kx)\rangle\sin(kx) + \langle f(x),\cos(kx)\rangle\cos(kx)\Big),$$ where the $\frac{1}{\pi}$ factors normalize the basis vectors. This is its Fourier series; the study of this decomposition is Fourier analysis.
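As a sketch, you can compute the first several coefficients numerically and watch the partial sums approach $f$, again using $f(x) = x$ as an illustrative example:

# Sketch: numerically build a partial Fourier sum for f(x) = x on [0, 2*pi]
import numpy as np
from scipy.integrate import quad

f = lambda x: x

def a(k):  # cosine coefficients; the constant term is normalized by 2*pi
    return quad(lambda x: f(x) * np.cos(k * x), 0, 2 * np.pi)[0] / (np.pi if k else 2 * np.pi)

def b(k):  # sine coefficients
    return quad(lambda x: f(x) * np.sin(k * x), 0, 2 * np.pi)[0] / np.pi

def partial_sum(x, K=50):
    return a(0) + sum(a(k) * np.cos(k * x) + b(k) * np.sin(k * x) for k in range(1, K + 1))

print(f(1.0), partial_sum(1.0))  # close, away from the interval's endpoints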


Here's one more neat trick. The second derivative $\frac{d^2}{dx^2} = \Delta$ is a linear operator on the vector space of functions on $[0,2\pi]$. If you integrate by parts twice (the boundary terms vanish for periodic functions), you can see that it's a symmetric linear operator, like a symmetric matrix. It turns out that the sines and cosines are eigenvectors of $\Delta$, a fact you can easily verify for yourself by differentiating.
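A one-line symbolic check of the eigenvector claim, sketched with sympy:

# Sketch: sin(kx) and cos(kx) are eigenvectors of d^2/dx^2 with eigenvalue -k^2
import sympy as sp

x, k = sp.symbols('x k')
print(sp.diff(sp.sin(k * x), x, 2))  # -k**2*sin(k*x)
print(sp.diff(sp.cos(k * x), x, 2))  # -k**2*cos(k*x)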

Abstract fun fact: different-eigenvalue eigenvectors of a symmetric operator on any vector space with inner product are orthogonal, for if $v$ and $w$ are eigenvectors with eigenvalues $\lambda_v$ and $\lambda_w$, respectively, and $\lambda_v\neq \lambda_w$, \begin{align*} \lambda_v\langle v,w\rangle &= \langle \lambda_v v,w\rangle \\ &= \langle \Delta v,w\rangle \\ &= \langle v,\Delta w\rangle \\ &= \langle v,\lambda_w w\rangle\\ &= \lambda_w\langle v,w\rangle, \end{align*} so $(\lambda_v - \lambda_w)\langle v,w\rangle = 0$ and hence $\langle v,w\rangle = 0$.
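The finite-dimensional version of this fact is easy to see numerically; here is a minimal sketch with an arbitrary symmetric matrix standing in for $\Delta$:

# Sketch: eigenvectors of a symmetric matrix are mutually orthogonal
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)                 # symmetric, like Delta
eigvals, eigvecs = np.linalg.eigh(A)       # eigh is for symmetric matrices
print(np.round(eigvecs.T @ eigvecs, 10))   # ~identity: columns are orthonormal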
