[Math] Proof of the law of large numbers for higher moments

fourier-analysis, law-of-large-numbers, probability, probability-theory, statistics

Let us work on some probability space $(\Omega,\mathscr{A},\mathbb{P})$.

I'm looking for two (independent) proofs of the generalised weak and strong laws of large numbers, respectively.

That is, I'm looking for proofs that the sample moments are consistent estimators of the moments of the distribution in question (given appropriate conditions thereon).
In symbols: I'm looking for two proofs that:

$m_k\overset{D}{\rightarrow} \mu_k$ and $m_k\overset{a.s.}{\rightarrow} \mu_k$, where $m_k$ is the $k^{th}$ sample moment and $\mu_k$ is the $k^{th}$ moment (assuming at most that $\mu_{k+1}<\infty$)?

It would be preferable if the proof of the weak law relied on characteristic functions.

Thanks in advance

Best Answer

Proof Sketch:

Just use the strong law of large numbers together with the continuous mapping theorem (under the assumption that $X_1\in L^{2k}(\mathscr{A})$, for $k\in[1,\infty)$).
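As a numerical sanity check of the sketch (not a proof): the $k^{th}$ sample moment is just the sample mean of the transformed variables $X_i^k$, so the SLLN applied to $X_i^k$ drives it to $\mu_k$. Standard normal data and $k=4$ are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.standard_normal(n)  # illustrative choice: X_i ~ N(0, 1) i.i.d.

# The k-th raw sample moment m_k is the sample mean of X_i^k,
# so the SLLN applied to the i.i.d. sequence X_i^k gives m_k -> mu_k a.s.
k = 4
m_k = np.mean(x ** k)

# For N(0,1) the 4th moment is (k-1)!! = 3.
print(m_k)
```

For large $n$ the printed value sits close to $3$, as the SLLN predicts.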


Bonus:

As a corollary, if $X_1\in L^{2k}$ then $X_1\in L^j(\mathscr{A})$ for every $j\in [1,k]$, whence the binomial theorem and the continuous mapping theorem yield the result that the $k^{th}$ central sample moments $M_k:=\frac{1}{n}\sum_{i=1}^n (X_i -m_1)^k$ converge a.s. to the $k^{th}$ central moment.
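The binomial-expansion step can be checked numerically as well: expanding $(X_i-m_1)^k$ writes $M_k$ as a fixed continuous function of the raw sample moments $m_1,\dots,m_k$, each of which converges a.s., so the continuous mapping theorem finishes the job. Exp(1) data and $k=3$ are assumptions chosen for illustration (the third central moment of Exp(1) is $2$).

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n = 100_000
x = rng.exponential(scale=1.0, size=n)  # illustrative choice: X_i ~ Exp(1)

k = 3
m1 = x.mean()

# Direct central sample moment: M_k = (1/n) * sum (X_i - m_1)^k.
M_k_direct = np.mean((x - m1) ** k)

# Binomial expansion: M_k = sum_{j=0}^{k} C(k,j) * m_j * (-m1)^(k-j),
# a continuous function of the raw sample moments m_0, ..., m_k.
raw = [np.mean(x ** j) for j in range(k + 1)]
M_k_binom = sum(comb(k, j) * raw[j] * (-m1) ** (k - j) for j in range(k + 1))

print(M_k_direct, M_k_binom)
```

The two computations agree up to floating-point error, and both approach the true central moment as $n$ grows.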


Comment:

(The weak version follows since convergence a.s. implies convergence in distribution.) However, I don't know how to prove convergence in distribution directly using CFs, without passing through a stronger form of convergence.
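For what it's worth, one standard route (a sketch only, assuming just $\mu_k<\infty$) is to run the usual CF proof of the weak law on the transformed sequence $Y_i := X_i^k$, whose mean is $\mu_k$:

$$\varphi_{m_k}(t) = \varphi_{Y_1}\!\left(\tfrac{t}{n}\right)^{\!n} = \left(1 + \frac{it\,\mu_k}{n} + o\!\left(\tfrac{1}{n}\right)\right)^{\!n} \longrightarrow e^{it\mu_k},$$

which is the CF of the constant $\mu_k$, so Lévy's continuity theorem gives $m_k\overset{D}{\rightarrow}\mu_k$; convergence in distribution to a constant then upgrades to convergence in probability.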
