The Wikipedia article is still under construction, and still contains errors.
I am one of the editors.
The formula you cite is from the section of the article about the power spectral density of a stochastic process; it is rather sloppy and still needs to be corrected.
But the wordy definition you cite is from a different paragraph of the article, and applies first of all to an individual signal, i.e., a deterministic signal (a single sample function of the process), ignoring the existence of all other sample functions and thus ignoring the structure of the process. Secondly, it applies to a process too, but only to the spectral decomposition of the process, not to the formula you mention.
Now, the truth is this: given any function of time (a deterministic function of time) x(t),
such that $$\lim_{T\rightarrow\infty} {1\over 2T} \int_{-T}^T x(t+\tau)x(t)\, dt \tag{$*$}$$ exists for all $\tau$, one can find a statistical distribution function $S$, called the power spectral distribution function of $x$, such that for almost all frequencies $f_1,f_2$,
$S(f_2)-S(f_1)$ is the amount of power contributed to $x$ by frequencies in the band $[f_1,f_2]$, in the sense of the sum of the squares of the jumps, at frequencies in that band, of $s$, the generalised Fourier transform of $x$, defined as the limit in mean (i.e., the limit in an $L^2$ space, not a pointwise limit) of
$$s(\omega) = \int_{-A}^{-1} x(t) {e^{-i\omega t}\over it} dt + \int_{-1}^{1} x(t) {e^{-i\omega t}-1\over it} dt + \int^{A}_{1} x(t) {e^{-i\omega t}\over it} dt \tag{$**$}$$ as $A$ goes to infinity, with $\omega = 2\pi f$.
The first tricky bit is that $x$ will not usually have a Fourier transform, which is why we have to put a factor of $t$ in the denominator here, for convergence. If only $x$ had a Fourier transform $X$, this generalisation, $s$, would be the integral of $X$.
The second tricky bit is that even if $s$ is continuous, it might be so far from being differentiable that its "infinitesimal" jumps contribute something to the power. For this reason, the intuitive notion of "the sum of the squares of the Fourier coefficients of $x$" has to be interpreted as "the sum of the squares of the jumps of $s$", which, in turn, has to be interpreted as $$\lim_{\epsilon\rightarrow0} {1 \over 2\epsilon} \int_{f_1}^{f_2} \vert s(u+\epsilon) - s(u-\epsilon) \vert ^2 du.$$ This succeeds in defining $S$ almost everywhere.
Now even if $S$ is not differentiable, it does define a distribution, and its derivative in the sense of a distribution can be defined as the power spectral density. But since $S$ can have jump discontinuities, its derivative can have delta functions in it.
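To make condition $(*)$ concrete, here is a minimal numerical sketch in Python (the test signal, frequency, step size, and horizon are all illustrative assumptions, not taken from the discussion above). For $x(t)=\cos(2\pi f_0 t)$ the time average in $(*)$ exists and equals ${1\over 2}\cos(2\pi f_0 \tau)$, and a Riemann-sum approximation over a large but finite $T$ already comes close:

```python
import numpy as np

def time_avg_autocorr(x, dt, T, k):
    """Riemann-sum approximation of (1/2T) * int_{-T}^{T} x(t+tau) x(t) dt,
    where tau = k*dt and x is sampled on a grid of spacing dt."""
    n = len(x)
    prod = x[k:] * x[:n - k]
    return prod.sum() * dt / (2.0 * T)

f0 = 3.0                              # hypothetical test frequency (Hz)
dt = 1e-3
T = 500.0                             # large horizon to approximate the limit
t = np.arange(-T, T, dt)
x = np.cos(2 * np.pi * f0 * t)

tau = 0.1
k = int(round(tau / dt))              # lag expressed in samples
r = time_avg_autocorr(x, dt, T, k)
expected = 0.5 * np.cos(2 * np.pi * f0 * tau)
print(r, expected)                    # the two values agree closely
```

For a signal with a line spectrum like this one, the limit $(*)$ exists, and the finite-$T$ error is of order $1/T$.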
=== The case of a stochastic process ===
Suppose now that $X(t)$ is a stochastic process. We must further assume that it is stationary (in the wide sense); this assumption is analogous to assumption $(*)$ above for a deterministic signal. Then $X$ has a spectral decomposition, which is a rather sophisticated analogue of the Fourier transform of a deterministic function. It uses a notion of stochastic integration that is much more elementary than Ito's stochastic integral; see Gnedenko, Kurs Teorii Veroyatnostei (Theory of Probability), Chapter 10, § 56, p. 316. Provided the integrals are understood in the sense of the limit in mean of stochastic processes, one can write a spectral decomposition of $X$ entirely analogous to $(**)$:
$$X(t) = \int_0^\infty \cos \omega t \, dZ_1(\omega) + \int_0^\infty \sin \omega t \, dZ_2(\omega) ,$$
where $$Z_1(\omega) = \lim_{T\rightarrow\infty} {1\over 2\pi} \int_{-T}^T X(t) {\sin \omega t\over t} dt$$
and
$$Z_2(\omega) = \lim_{T\rightarrow\infty} {1\over 2\pi} \int_{-T}^T X(t) {1-\cos \omega t\over t} dt.$$
Here, too, if the process is ergodic, so that time averages can be replaced by ensemble averages (taking the expectation operator $\bf E$), then the average power or variance contributed by the frequency $f$ can be found by looking at the expected value of the jump of $Z_i$ at $f$, i.e., by studying $\bf E( \vert Z_i(\omega + \Delta\omega) - Z_i(\omega)\vert ^2 )$ and the like. But at this point one bails out and uses a theorem of Bochner, as generalised by Khinchin to the context of stochastic processes, and sees that this quantity equals $F(\omega + \Delta\omega) - F(\omega)$, where $F$ is the statistical distribution function given by Bochner's theorem applied to the auto-correlation function of the process $X$.
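To see the Bochner/Khinchin route in action numerically, here is a hedged Python sketch (the discrete-time AR(1) model, its parameter $\phi=0.6$, and the ensemble size are illustrative assumptions, not part of the argument above): the ensemble average of periodograms over many sample paths tracks the spectral density computed from the autocorrelation function of the process.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.6, 1.0            # assumed AR(1) parameters for this demo
N, M, burn = 1024, 800, 200      # path length, ensemble size, burn-in

# Simulate M stationary AR(1) paths: X_n = phi * X_{n-1} + eps_n
eps = rng.normal(0.0, sigma, (M, N + burn))
x = np.zeros((M, N + burn))
for n in range(1, N + burn):
    x[:, n] = phi * x[:, n - 1] + eps[:, n]
x = x[:, burn:]                  # discard the burn-in transient

# Ensemble average of periodograms (expectation over sample paths)
I = np.abs(np.fft.rfft(x, axis=1))**2 / N
I_avg = I.mean(axis=0)

# Khinchin: S(w) = sum_k gamma(k) e^{-iwk} = sigma^2 / |1 - phi e^{-iw}|^2
w = 2 * np.pi * np.fft.rfftfreq(N)
S_theory = sigma**2 / (1.0 - 2.0 * phi * np.cos(w) + phi**2)

mid = len(w) // 2
print(I_avg[mid], S_theory[mid])     # agree to a few percent
```

Note that it is the ensemble average that converges here; a single periodogram does not, as discussed below.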
==Now, as to the formula itself==
The formula you quote,
$$S_{xx}(\omega) := \lim_{T\rightarrow\infty} \mathbf E\left[\, \left\vert {1\over \sqrt T}\int_0^T x(t)\, e^{-i\omega t}\, dt \right\vert ^2 \,\right],$$
is not correct. I have never seen a reliable source that proves (or even asserts) that the limit converges. I see the formula a lot on the internet and in engineering textbooks, where no one bothers to address its convergence. I computed an example for a line spectrum, and it does not converge---admittedly, it should not converge, since a line spectrum does not have a power spectral density.
Consider the right-hand side without the expectation operator, as if for a deterministic signal. Then it does not converge even when the spectral density exists, since the sample paths of a noisy process have unbounded variation on every finite interval. As far as I know, one must introduce a lag window factor to make it converge, i.e., something like Cesàro summation, but for an integral instead of a series.
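The divergence for a line spectrum is easy to exhibit numerically. In this hedged sketch (the frequency and the horizons are arbitrary choices), the quantity inside the expectation of the quoted formula is evaluated at the line frequency for $x(t)=\cos(\omega_0 t)$; it grows linearly in $T$ instead of converging:

```python
import numpy as np

w0 = 2 * np.pi * 5.0                  # a single spectral line (arbitrary)

def periodogram_at_line(T, dt=1e-3):
    """(1/T) |int_0^T cos(w0 t) e^{-i w0 t} dt|^2, by Riemann sum."""
    t = np.arange(0.0, T, dt)
    F = np.sum(np.cos(w0 * t) * np.exp(-1j * w0 * t)) * dt
    return np.abs(F)**2 / T

vals = [periodogram_at_line(T) for T in (50.0, 100.0, 200.0)]
print(vals)                           # roughly T/4: about 12.5, 25, 50
```

Doubling $T$ doubles the value, which is exactly what a delta function in the spectrum should do to this expression.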
This topic is fraught with peril: a signal contaminated with noise is modelled by a function which is continuous but nowhere differentiable and of unbounded variation on every finite interval, so Fourier inversion is never valid. More generally, because of the nature of these signals, one can never be sure it is valid to interchange two limits.
One often hears hand-waving assertions to the effect that Laurent Schwartz's theory of distributions makes these formulas all right. But even with distributions, one still has to convolve with a lag window or a spectral window to obtain convergence. I have never seen proofs of these hand-waving assertions, and the only careful statement of the relevant theorems I know of, D. C. Champeney, A Handbook of Fourier Theorems, Cambridge Univ. Press (which, being a handbook, omits the proofs), does not treat stochastic processes.
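For completeness, here is a sketch of the lag-window device in Python (a Blackman-Tukey estimate with a Bartlett window; the white-noise input, sample size, and truncation point are illustrative assumptions). Smoothing the sample autocovariance with the triangular window, the Cesàro-like taper mentioned above, produces an estimate whose scatter around the true flat spectrum is far smaller than that of the raw periodogram:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 4096, 64                      # sample size, lag-window truncation
x = rng.normal(0.0, 1.0, N)          # unit-variance white noise: true S(w) = 1

# Sample autocovariance for lags 0..L (biased estimator, divides by N)
gamma = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(L + 1)])

# Bartlett (triangular) lag window
wlag = 1.0 - np.arange(L + 1) / L

freqs = np.linspace(0.0, np.pi, 200)
# Blackman-Tukey estimate: S_hat(w) = gamma(0) + 2 sum_k w(k) gamma(k) cos(wk)
S_bt = np.array([gamma[0] + 2.0 * np.sum(wlag[1:] * gamma[1:] *
                                         np.cos(f * np.arange(1, L + 1)))
                 for f in freqs])

# Raw periodogram for comparison: inconsistent, its variance does not shrink
I = np.abs(np.fft.rfft(x))**2 / N

print(S_bt.mean(), S_bt.std(), I.std())
```

The windowed estimate hovers near the true value 1 with small scatter, while the raw periodogram fluctuates with a standard deviation comparable to the spectrum itself, no matter how large $N$ is.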
The following result shows that the conjecture is true. Consider a stationary complex random function $\zeta(t)$. Stationarity requires that the autocorrelation function depend only on the difference $t - t'$ and that the mean value be a constant, independent of $t$. Assume the power spectral density (hereafter called the spectrum) is absolutely continuous everywhere. The Fourier transform of $\zeta(t)$ itself does not exist in general, because stationary random functions are generally neither absolutely integrable nor square integrable, i.e., they do not satisfy Dirichlet's condition. The correct representation of the random variable $\zeta(t)$ is either as a Fourier series or as a Fourier-Stieltjes integral [see, for example, Akira Ishimaru, Wave Propagation and Scattering in Random Media (IEEE Press, 1997), Appendix A]. Here, the Fourier-Stieltjes integral will be used
$$\zeta(t) = \int_{-\infty}^{+\infty} e^{i\omega t}dA(\omega)$$
where $dA(\omega)$ is the random amplitude. Typically, the mean value and the autocorrelation function are found using ensemble averages, but this method provides little insight into the behavior of $dA(\omega)$. Hence, the mean value and autocorrelation function of $\zeta(t)$ will be found here using temporal averages; the ergodic hypothesis asserts that for a stationary ergodic process the temporal average and the ensemble average coincide.
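The ergodic hypothesis invoked here can be illustrated numerically. The following Python sketch uses a discrete-time AR(1) process as a stand-in for $\zeta(t)$ (the model and all parameters are illustrative assumptions): the temporal mean along one long sample path and the ensemble mean over many independent paths agree, both being close to the true mean, zero.

```python
import numpy as np

rng = np.random.default_rng(2)
phi = 0.5                             # assumed AR(1) parameter for the demo

# Temporal average: the mean along one long sample path
n = 200_000
eps = rng.normal(size=n)
path = np.zeros(n)
for i in range(1, n):
    path[i] = phi * path[i - 1] + eps[i]
temporal_mean = path[1000:].mean()    # drop an initial burn-in

# Ensemble average: many independent paths, observed at one fixed time
M, steps = 5000, 2000
x = np.zeros(M)
for _ in range(steps):
    x = phi * x + rng.normal(size=M)
ensemble_mean = x.mean()

print(temporal_mean, ensemble_mean)   # both near the true mean, 0
```

Of course this only illustrates, and does not prove, ergodicity; stationarity alone does not guarantee it.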
$\mathbf{Temporal \ Averages}$ - The temporal average $\overline{\zeta(t)}$ is calculated over a finite interval $T$, and then the limit is taken as $T$ goes to infinity
$$\overline {\zeta(t)} = \lim_{T\to \infty} \frac 1T \int_{-T/2}^{+T/2}\zeta(t) dt$$
Substitute the Fourier-Stieltjes integral into the equation and perform the $t$ integration to get
$$\overline {\zeta(t)} = \lim_{T\to \infty} \frac 1T \int_{-\infty}^{+\infty} dA(\omega) \int_{-T/2}^{+T/2} e^{i\omega t}dt = \lim_{T\to \infty} \frac 1T \int_{-\infty}^{+\infty} dA(\omega) \frac {2\sin(\omega T/2)} {\omega} $$
Use the following expressions to take the limit
$$ \lim_{T\to \infty} \frac {2\sin(\omega T/2)}{\omega} \rightarrow 2\pi\delta(\omega) \quad \mathrm{and} \quad \lim_{T\to \infty} \frac 1T = \lim_{T\to \infty} \frac {\Delta\omega} {2\pi}\rightarrow \frac {d\omega}{2\pi} $$
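The first of these limit statements can be checked numerically: integrating the kernel $2\sin(\omega T/2)/\omega$ against a smooth test function concentrates all the weight at $\omega=0$ with total mass $2\pi$. A hedged Python sketch (the Gaussian test function, grid, and value of $T$ are arbitrary choices):

```python
import numpy as np

f = lambda w: np.exp(-w**2)            # smooth test function with f(0) = 1

def kernel_integral(T, dw=1e-4, W=10.0):
    """Midpoint-rule approximation of int f(w) * 2 sin(wT/2)/w dw."""
    w = np.arange(-W, W, dw) + dw / 2  # shifted grid, avoids w = 0 exactly
    return np.sum(f(w) * 2.0 * np.sin(w * T / 2.0) / w) * dw

val = kernel_integral(200.0)
print(val, 2 * np.pi)                  # close to 2*pi, as the delta predicts
```

As $T$ grows the result approaches $2\pi f(0)$, which is what $2\pi\delta(\omega)$ means when paired with a test function.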
which gives for the temporal average
$$\overline {\zeta(t)} = \int_{-\infty}^{+\infty} dA(\omega) \ \delta(\omega) \ d\omega = dA(0) $$
Finally, for a zero-mean process
$$\overline {\zeta(t)} = \langle \zeta \rangle = dA(0) = 0 $$
Thus, the mean value of $\zeta(t)$ obtained via temporal averaging is $dA(0)$ which is the random amplitude evaluated at the origin.
In the following analysis the random variable $\zeta(t)$ will be decomposed into the sum of a zero-mean random variable $\zeta_0(t)$ and the mean $\langle\zeta\rangle$, so that the mean value is everywhere explicit. The temporal average of the autocorrelation function, $\overline{\zeta(t)\zeta^*(t+\tau)}$, is then given by
$$ \overline{\zeta(t)\zeta^*(t+\tau)} = \overline{\zeta_0(t)\zeta_0^*(t+\tau)} + \langle \zeta \rangle^2 = \sigma^2C(\tau) + \langle \zeta \rangle^2 $$
Substituting the Fourier-Stieltjes integral for each factor gives
$$ \lim_{T\to \infty} \frac 1T \iint_{-\infty}^{+\infty} e^{-i\omega'\tau}dA(\omega)dA^*(\omega') \int_{-T/2}^{+T/2} e^{i(\omega-\omega')t}dt $$
Performing the $t$ integration yields
$$ \sigma^2C(\tau) + \langle \zeta \rangle^2 = \iint_{-\infty}^{+\infty} e^{-i\omega'\tau}dA(\omega)dA^*(\omega') \lim_{T\to \infty} \frac 1T \frac {2\sin[(\omega-\omega') T/2]} {(\omega-\omega')} $$
The following expressions are used in the limit process
$$ \lim_{T\to \infty} \frac {2\sin[(\omega-\omega') T/2]}{(\omega-\omega')} \rightarrow 2\pi\delta(\omega-\omega') \quad \mathrm{and} \quad \lim_{T\to \infty} \frac 1T = \lim_{T\to \infty} \frac {\Delta\omega} {2\pi}\rightarrow \frac {d\omega}{2\pi} $$
Substituting these expressions into the autocorrelation gives
$$ \sigma^2C(\tau) + \langle \zeta \rangle^2 = \iint_{-\infty}^{+\infty} e^{-i\omega'\tau}dA(\omega)dA^*(\omega') \delta(\omega-\omega') \ d\omega $$
Finally, performing the $\omega$ integration yields
$$ \sigma^2C(\tau) + \langle \zeta \rangle^2 = \int_{-\infty}^{+\infty} e^{-i\omega'\tau}|dA(\omega')|^2 \qquad \mathrm {Equation \ (1)}$$
The quantity $dA(\omega')$ can be related to the mean value $\langle\zeta\rangle$ and the spectrum $S(\omega)$ by taking the inverse Fourier transform of the autocorrelation function
$$ \int_{-\infty}^{+\infty}\sigma^2C(\tau)e^{i\omega\tau} \ d\tau + \int_{-\infty}^{+\infty} \langle\zeta \rangle^2 e^{i\omega\tau} \ d\tau = \int_{-\infty}^{+\infty}|dA(\omega')|^2\int_{-\infty}^{+\infty}e^{i(\omega-\omega')\tau}\ d\tau$$
Performing the $\tau$ integration yields
$$ S(\omega)+2\pi \langle \zeta \rangle^2 \delta(\omega)= 2\pi \int_{-\infty}^{+\infty} |dA(\omega')|^2 \delta(\omega-\omega') $$
Consider the case of $\omega = 0$ for a zero-mean process, i.e. $\langle\zeta\rangle = 0$; both sides of the expression reduce to
$$ S(0)+0 = 2\pi \int_{-\infty}^{+\infty} |dA(\omega')|^2 \delta(\omega') $$
The Dirac delta function forces the terms on the right side to be evaluated at $\omega' = 0$, producing
$$ S(0) = 2\pi |dA(0)|^2\delta(\omega') = 2\pi \langle \zeta \rangle ^2 \delta(\omega') =0 $$
where the previous result $dA(0) = \langle\zeta\rangle$ has been used. Thus the spectrum of a zero-mean process vanishes at zero frequency, i.e., $S(0) = 0$. Equation (1) then shows that in the general case $|dA(\omega')|^2$ must be given by
$$|dA(\omega')|^2 = \frac 1{2\pi}S(\omega') \ d\omega'+|dA(0)|^2 \delta(\omega') \ d\omega' $$
$\mathbf{Discussion}$ - The vanishing of the spectrum at $\omega = 0$ establishes some new requirements on the autocorrelation function and spectrum. For a zero-mean random variable, when $\tau = 0$ the autocorrelation form of the Wiener-Khinchin theorem reduces to
$$ \sigma^2 = \frac 1{2\pi}\int_{-\infty}^{+\infty}S(\omega)\ d\omega $$
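A quick numerical check of this relation (a sketch; the white-noise signal and sample size are arbitrary choices): for sampled data, the discrete Parseval identity makes the mean of the periodogram over the Fourier frequencies play the role of ${1\over 2\pi}\int S(\omega)\,d\omega$, and it reproduces the variance exactly.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 2.0, 4096)        # arbitrary signal, variance about 4
x = x - x.mean()                      # make it zero-mean, as assumed above

# Periodogram over all N Fourier frequencies; its mean is the discrete
# analogue of (1/2pi) * int S(w) dw, by Parseval's identity.
I = np.abs(np.fft.fft(x))**2 / len(x)
print(I.mean(), x.var())              # identical up to rounding error
```

For sampled data this is an identity, not an estimate, which is one reason the $\tau=0$ form of the theorem is so robust in practice.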
This expression has proven useful and has wide application. Now consider the spectrum form of the Wiener-Khinchin theorem when $\omega = 0$. This expression becomes
$$ S(0) = \int_{-\infty}^{+\infty}\sigma^2C(\tau)\ d\tau =0 $$
Besides describing the behavior of the spectrum, this gives useful information about the autocorrelation function. For example, the autocovariance function must have both positive and negative excursions, and the areas under these excursions must cancel; otherwise the integral cannot be zero. This means that the simple Gaussian or exponential forms often used for the autocorrelation function are not valid for a zero-mean process: they have no negative excursions anywhere.
Your process is 1-periodic with probability 1, so I think you should expand it in a Fourier series; the power spectrum is then the squared modulus of the Fourier coefficients.
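As a concrete sketch of that recipe in Python (the 1-periodic sample path and its harmonics are invented for illustration): sample one period, take the FFT to get the Fourier-series coefficients $c_k$, and read the power spectrum off as $|c_k|^2$.

```python
import numpy as np

# A hypothetical 1-periodic sample path: sin(2 pi t) + 0.5 cos(4 pi t)
N = 1024
t = np.arange(N) / N                  # one period, sampled uniformly
x = np.sin(2 * np.pi * t) + 0.5 * np.cos(4 * np.pi * t)

c = np.fft.fft(x) / N                 # Fourier-series coefficients c_k
power = np.abs(c)**2                  # power at harmonic k is |c_k|^2

print(power[1], power[2])             # 0.25 at k = 1, 0.0625 at k = 2
```

Summing $|c_k|^2$ over all $k$ (positive and negative harmonics) recovers the mean square of the signal, which is the periodic analogue of the spectral decompositions discussed above.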