Proof that sinusoids are unique in retaining their shape when summed with waveforms of the same shape and frequency

fourier-analysis, periodic-functions, signal-processing, trigonometry

The Wikipedia article on Sine wave states that the sine wave "retains its wave shape when added to another sine wave of the same frequency and arbitrary phase and magnitude." I am aware that this can be shown using the harmonic addition theorem.

However, the article goes on to state that the sine wave "is the only periodic waveform that has this property."

How can it be shown that the sinusoid is unique in this respect, i.e., that only for sinusoidal waveforms does summing waveforms of the same frequency but arbitrarily different amplitudes and phase offsets always result in a waveform of the same shape? It is not difficult to test the property on individual examples, such as square waves, but it is not clear to me how to establish that the property holds only for sine waves.
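To make the square-wave test concrete, here is a minimal NumPy sketch of the kind of check I mean (the brute-force helper `shape_mismatch` and the chosen amplitudes/phases are purely illustrative): it searches for the best scaled and shifted copy of the base waveform and reports the relative residual of the fit.

```python
import numpy as np

T = 1.0                                   # common period
t = np.linspace(0.0, T, 2000, endpoint=False)

def sine(t, phase=0.0):
    return np.sin(2 * np.pi * t / T + phase)

def square(t, phase=0.0):
    return np.sign(np.sin(2 * np.pi * t / T + phase))

def shape_mismatch(s, f):
    # Smallest relative residual when approximating s(t) by A * f(t + tau),
    # brute-forcing tau over the sampling grid and fitting A by least squares.
    best = np.inf
    base = f(t)
    for shift in range(len(t)):
        g = np.roll(base, -shift)
        A = np.dot(s, g) / np.dot(g, g)
        best = min(best, np.linalg.norm(s - A * g) / np.linalg.norm(s))
    return best

for f in (sine, square):
    s = f(t) + 0.7 * f(t, phase=1.0)      # sum of two copies, same frequency
    print(f.__name__, shape_mismatch(s, f))
# sine:   ~1e-3 (limited only by the discrete shift grid) -- the sum is again a sine
# square: clearly nonzero -- the sum is a three-level staircase, not a square wave
```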

It seems to me that this property could be stated as a functional equation along the lines of:

$$ f(t) + A_2 \, f(t + \delta_2) = A_3 \, f(t + \delta_3) $$
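For the sinusoid itself, the harmonic addition theorem mentioned above delivers $A_3$ and $\delta_3$ explicitly by adding the phasors $1$ and $A_2\,{\rm e}^{\jmath\delta_2}$. A minimal NumPy sketch (with arbitrary illustrative values) that checks this numerically:

```python
import numpy as np

# Harmonic addition: sin(t) + A2*sin(t + d2) = A3*sin(t + d3),
# where A3*exp(j*d3) = 1 + A2*exp(j*d2)  (phasor addition).
A2, d2 = 0.7, 1.0                      # arbitrary amplitude and phase offset
z = 1.0 + A2 * np.exp(1j * d2)
A3, d3 = np.abs(z), np.angle(z)

t = np.linspace(0, 2 * np.pi, 1000)
lhs = np.sin(t) + A2 * np.sin(t + d2)
rhs = A3 * np.sin(t + d3)
print(np.max(np.abs(lhs - rhs)))       # ~1e-16: the identity holds exactly
```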

I found several answers by user21467 that reference the paper below, in which a set of functional equations is used to define sine and cosine; however, those functional equations are not quite the same as the one that is directly relevant to my question.

Robison, G. (1968). A New Approach to Circular Functions, π, and lim (sin x)/x. Mathematics Magazine, 41(2), 66-70. doi:10.2307/2689051

Best Answer

Here is an attempt at a constructive answer. The claim is that a pure sinusoid is the only periodic waveform that has the property $\forall A_1,\tau_1 \exists A_2,\tau_2$ such that $x(t) + A_1x(t+\tau_1) = A_2 x(t+\tau_2)\;\forall t$ where $x(t)$ is $T$-periodic.

I am going to limit myself to "well-behaved" functions which can be uniquely expressed as a Fourier series, i.e., $$x(t) = \sum_\ell \gamma_\ell\, {\rm e}^{\jmath 2\pi \ell t/T}.$$ The pure sinusoids are the functions where only one pair $\gamma_{\ell_0}$, $\gamma_{-\ell_0}$ is nonzero and all other $\gamma_\ell$ are zero. For instance, $\gamma_1 = \gamma_{-1} = \frac 12$ and all others zero gives $x(t) = \cos(2\pi t/T)$, but it could also be $\gamma_2$ and $\gamma_{-2}$, giving rise to a cosine of twice the frequency (whose fundamental period is then $T/2$, but which is still also $T$-periodic).
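(As an illustrative aside, under this convention the coefficients $\gamma_\ell$ can be estimated with an FFT. A small sketch with assumed sampling parameters, showing that a cosine has exactly one nonzero pair of coefficients while a square wave has nonzero coefficients at every odd harmonic:)

```python
import numpy as np

T, N = 1.0, 1024
t = np.arange(N) * T / N

def fourier_coeffs(x, num=5):
    # gamma_l for l = -num..num, with x(t) = sum_l gamma_l * exp(j*2*pi*l*t/T)
    c = np.fft.fft(x) / N
    return {l: c[l % N] for l in range(-num, num + 1)}

cosine = np.cos(2 * np.pi * t / T)
square = np.sign(np.sin(2 * np.pi * t / T))

for name, x in [("cosine", cosine), ("square", square)]:
    # the 0.01 tolerance absorbs small discretization effects at the jumps
    nz = {l: np.round(g, 3) for l, g in fourier_coeffs(x).items() if abs(g) > 1e-2}
    print(name, nz)
# cosine: only gamma_{+1} = gamma_{-1} = 0.5 are nonzero
# square: gamma_{+-1}, gamma_{+-3}, gamma_{+-5}, ... are all nonzero
```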

Now insert this expansion into the defining property. We obtain $$\sum_\ell \gamma_\ell\, {\rm e}^{\jmath 2\pi \ell t/T} + A_1 \sum_\ell \gamma_\ell\, {\rm e}^{\jmath 2\pi \ell t/T} {\rm e}^{\jmath 2\pi \ell \tau_1/T} \stackrel{!}{=} A_2\sum_\ell \gamma_\ell\, {\rm e}^{\jmath 2\pi \ell t/T}{\rm e}^{\jmath 2\pi \ell \tau_2/T}, $$ which we can write as $$\sum_\ell \gamma_\ell\cdot\left(1+{\rm e}^{\jmath 2\pi \ell \tau_1/T}A_1\right) {\rm e}^{\jmath 2\pi \ell t/T}\stackrel{!}{=} \sum_\ell \gamma_\ell\cdot\left({\rm e}^{\jmath 2\pi \ell \tau_2/T}A_2\right) {\rm e}^{\jmath 2\pi \ell t/T}.$$
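(The expansion uses the shift property of the Fourier series: replacing $t$ by $t+\tau$ multiplies $\gamma_\ell$ by ${\rm e}^{\jmath 2\pi \ell \tau/T}$. A quick numerical sanity check of that property on an arbitrary sampled test signal, shifting by a whole number of samples:)

```python
import numpy as np

# Shift property: if x(t) has coefficients gamma_l, then x(t + tau) has
# coefficients gamma_l * exp(j*2*pi*l*tau/T).
T, N = 1.0, 1024
n = np.arange(N)
t = n * T / N

# any T-periodic test signal will do
x = np.cos(2 * np.pi * t / T) + 0.3 * np.sign(np.sin(2 * np.pi * t / T))
tau = 64 * T / N                        # shift by a whole number of samples
x_shifted = np.roll(x, -64)             # samples of x(t + tau)

gamma = np.fft.fft(x) / N
gamma_shifted = np.fft.fft(x_shifted) / N

l = np.where(n < N // 2, n, n - N)      # signed harmonic index for each FFT bin
predicted = gamma * np.exp(1j * 2 * np.pi * l * tau / T)
print(np.max(np.abs(gamma_shifted - predicted)))   # ~1e-15, i.e. machine precision
```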

For both sides of the equation to be equal for all $t$, the coefficients must match term by term: any deviation in the coefficients for the same $\ell$ gives rise to a nonzero difference function, and coefficients at different $\ell$ cannot cancel each other, since the basis functions of the Fourier series are orthogonal. Therefore, the above condition translates to $$ \gamma_\ell\cdot\left(1+{\rm e}^{\jmath 2\pi \ell \tau_1/T}A_1\right) \stackrel{!}{=} \gamma_\ell\cdot\left({\rm e}^{\jmath 2\pi \ell \tau_2/T}A_2\right) \; \forall \ell. $$

So again: given an arbitrary $A_1$ and $\tau_1$, we must find an $A_2, \tau_2$ such that the above condition is true for all $\ell$. For each $\ell$ there are two ways to satisfy the equation: either $\gamma_\ell = 0$, or $|A_2| = \sqrt{1+A_1^2 + 2A_1\cos(2\pi \ell \tau_1/T)}$ (using $|1+A|=\sqrt{(1+\Re A)^2 + (\Im A)^2}$) and $\tau_2= \frac{T}{2\pi\ell} \arg\left\{\frac{1+{\rm e}^{\jmath 2\pi \ell \tau_1/T}A_1}{A_2}\right\}$. Now, obviously, the required $A_2$ is different for each $\ell$ as long as $A_1 \neq 0$ (and $\tau_1$ is generic, e.g. $\tau_1/T$ irrational). So we can solve for $A_2$ only for one $\ell$ (and since the cosine is even, the same solution works for $-\ell$).
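(To see this concretely, the sketch below evaluates the required $A_2$ and $\tau_2$ for one arbitrary choice of $(A_1,\tau_1)$ and for $\ell=1,\dots,5$; each harmonic demands a different pair:)

```python
import numpy as np

# For a fixed (A1, tau1), the (A2, tau2) that the l-th harmonic would require.
T = 1.0
A1, tau1 = 0.7, 0.13 * T                # arbitrary choice

for l in range(1, 6):
    z = 1.0 + A1 * np.exp(1j * 2 * np.pi * l * tau1 / T)
    A2 = np.abs(z)                      # = sqrt(1 + A1^2 + 2*A1*cos(2*pi*l*tau1/T))
    tau2 = T / (2 * np.pi * l) * np.angle(z)
    print(l, round(A2, 4), round(tau2, 4))
# A2 (and tau2) come out different for every l, so no single (A2, tau2)
# can serve two different harmonics at once.
```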

In consequence, we can have one pair $(\ell_0,-\ell_0)$ for which $\gamma_{\pm\ell_0} \neq 0$, and for that pair we can solve for $(A_2, \tau_2)$ given any $(A_1,\tau_1)$. However, since the same solution will not work for any other $\ell$, all other $\gamma_\ell$ must be zero. Hence, the only solutions that work are of the form $$x(t) = \gamma_{\ell_0} \, {\rm e}^{\jmath 2\pi \ell_0 t/T} + \gamma_{-\ell_0}\, {\rm e}^{-\jmath 2\pi \ell_0 t/T} = (\gamma_{\ell_0} + \gamma_{-\ell_0}) \cos( 2\pi \ell_0 t/T) + \jmath (\gamma_{\ell_0} - \gamma_{-\ell_0}) \sin( 2\pi \ell_0 t/T),$$

which are exactly the pure sinusoids of frequency $\ell_0/T$ (radian frequency $2\pi\ell_0/T$).
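(Closing the loop numerically: for a pure sinusoid, the $(A_2,\tau_2)$ given by the formulas above satisfies the identity for any $(A_1,\tau_1)$. A minimal sketch with arbitrarily chosen values:)

```python
import numpy as np

# For a pure sinusoid, the derived (A2, tau2) works for any (A1, tau1).
T, l0 = 1.0, 3                          # try the third harmonic, period T/3
A1, tau1 = -1.8, 0.41 * T               # arbitrary, including a negative A1

z = 1.0 + A1 * np.exp(1j * 2 * np.pi * l0 * tau1 / T)
A2, tau2 = np.abs(z), T / (2 * np.pi * l0) * np.angle(z)

t = np.linspace(0, T, 1000)
x = lambda t: np.cos(2 * np.pi * l0 * t / T + 0.4)   # any amplitude/phase works
lhs = x(t) + A1 * x(t + tau1)
rhs = A2 * x(t + tau2)
print(np.max(np.abs(lhs - rhs)))        # ~1e-15: the identity holds for all t
```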