$$\sup_{n} E[|X_n| 1_{|X_n|>M} ] \leq \sup_{n} E[|X_n|] \leq \sup_{n} E[|X_n|^p],$$
using Jensen.
You didn't apply Jensen's inequality correctly; it should read
$$\sup_{n} E[|X_n| 1_{|X_n|>M} ] \leq \sup_{n} E[|X_n|] \leq \sup_{n} \left( E[|X_n|^p] \right)^{\color{red}{\frac{1}{p}}}.$$
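For completeness, here is the corrected step spelled out; it is Jensen's inequality applied to the concave function $x \mapsto x^{1/p}$ on $[0,\infty)$ (valid for $p \geq 1$), with $Z = |X_n|^p$:
$$E[|X_n|] = E\big[(|X_n|^p)^{1/p}\big] \leq \big(E[|X_n|^p]\big)^{1/p} = \|X_n\|_p.$$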
[...] and the claim follows by letting $M \rightarrow \infty$.
No, it's not that simple. Letting $M \to \infty$ you get
$$\lim_{M \to \infty} \sup_n \mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \sup_{n \in \mathbb{N}} \|X_n\|_p,$$
but that's not good enough; you have to show that the limit equals $0$. Hint for this problem: Use Markov's inequality, i.e.
$$\mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \frac{1}{M^{p-1}} \mathbb{E}(|X_n|^p 1_{|X_n|>M}) \leq \frac{1}{M^{p-1}} \mathbb{E}(|X_n|^p).$$
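To see the first estimate, note that on the event $\{|X_n|>M\}$ we have $|X_n| = |X_n|^p/|X_n|^{p-1} \leq |X_n|^p/M^{p-1}$. Writing $C := \sup_n \mathbb{E}(|X_n|^p)$ (a name introduced here for the assumed finite bound) and assuming $p > 1$, this gives
$$\sup_n \mathbb{E}\big(|X_n| 1_{|X_n|>M}\big) \leq \frac{C}{M^{p-1}} \xrightarrow[M \to \infty]{} 0,$$
which is exactly the uniform integrability claim.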
Define $$M_0:=\max_{n \leq N} |X_n|.$$ Then we have $$E[|X_n| 1_{|X_n|>M_0}]= E[|X_n|\cdot 0 ] = 0,$$
No, this doesn't work, because $M_0$ depends on $\omega$. Unfortunately, this means that your approach fails. Hint for this one: using, e.g., the dominated convergence theorem, check first that the set $\{f\}$ consisting of a single integrable random variable is uniformly integrable; then extend the approach to finitely many integrable random variables.
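A sketch of the suggested route, under the standing assumption that $f$ is integrable: since $|f| 1_{|f|>M} \to 0$ pointwise as $M \to \infty$ and $|f| 1_{|f|>M} \leq |f| \in L^1$, dominated convergence gives
$$\lim_{M \to \infty} \mathbb{E}\big(|f| 1_{|f|>M}\big) = 0,$$
i.e. $\{f\}$ is uniformly integrable. For finitely many integrable random variables $X_1, \ldots, X_N$, apply this to each one; then $\max_{k \leq N} \mathbb{E}(|X_k| 1_{|X_k|>M}) \to 0$, being a maximum of finitely many terms each tending to $0$.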
If $E[\sup_n |X_n|] < \infty$, then the sequence is uniformly integrable.
Hint: By assumption, $Y := \sup_n |X_n|$ is integrable and $|X_n| \leq Y$ for all $n \in \mathbb{N}$. Consequently,
$$\mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \mathbb{E}(|Y| 1_{|Y|>M}) \qquad \text{for all $M>0$ and $n \in \mathbb{N}$.}$$
Now use the fact that $\{Y\}$ is uniformly integrable (see question nr. 2).
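Spelled out, this combines the domination with the single-variable case applied to $Y$ (a sketch):
$$\sup_{n} \mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \mathbb{E}(Y 1_{Y>M}) \xrightarrow[M \to \infty]{} 0,$$
where the limit is $0$ by dominated convergence, since $Y 1_{Y>M} \leq Y \in L^1$ and $Y 1_{Y>M} \to 0$ pointwise.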
The condition in Wikipedia is too weak to imply 1), 2) and 3). Consider $\mathbb{N}$ with the power set and the counting measure. Any family of functions satisfies the Wikipedia condition, since $\mu(E) < 1$ implies that $E$ is the empty set. You can easily write down a family of functions, each of which satisfies the condition $\|f\|_1 = \sum |f(n)| < \infty$, but whose norms are not bounded as $f$ varies over the family. [I will write down an explicit example if you ask for it.]
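One family of the kind alluded to (my own example, not necessarily the one the author had in mind): on $(\mathbb{N}, \mathcal{P}(\mathbb{N}), \mu)$ with $\mu$ the counting measure, take
$$f_k := k \cdot 1_{\{1\}}, \qquad k \in \mathbb{N}.$$
Each $f_k$ satisfies $\|f_k\|_1 = k < \infty$, yet $\sup_k \|f_k\|_1 = \infty$, while the Wikipedia condition holds vacuously, since the only set with $\mu(E) < 1$ is $E = \emptyset$.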
Best Answer
This can be proved in many ways.

First way: Note that $$E(|X_n|^q) = E(|\sigma_nN + m_n|^q) \leq E((\sigma_n|N| + |m_n|)^q).$$ For $a, b \geq 0$, $(a + b)^q \leq (2\max(a, b))^q = 2^q\max(a, b)^q \leq 2^q(a^q + b^q)$. Hence $$E(|X_n|^q) \leq 2^q(\sigma_n^qE(|N|^q) + |m_n|^q) \to 2^q(\sigma^qE(|N|^q) + |m|^q) < \infty.$$

Second way: Use the DCT to conclude that $$E(|X_n|^q) = \int |x|^q \frac{1}{\sqrt{2\pi}\sigma_n}\exp\left(-\frac{(x - m_n)^2}{2\sigma_n^2}\right)dx \to \int |x|^q \frac{1}{\sqrt{2\pi}\sigma}\exp\left(-\frac{(x - m)^2}{2\sigma^2}\right)dx < \infty.$$
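A remark on the dominated convergence step in the second way (my sketch; the envelope below is my own construction, assuming $\sigma_n \to \sigma > 0$ and $m_n \to m$ as in the problem): for all large $n$ we have $\sigma/2 \leq \sigma_n \leq 2\sigma$ and $|m_n| \leq |m| + 1$, so the integrands are dominated by the integrable function
$$g(x) := \frac{2}{\sqrt{2\pi}\,\sigma}\, |x|^q \exp\left(-\frac{\big((|x| - |m| - 1)_+\big)^2}{8\sigma^2}\right),$$
where $(\cdot)_+$ denotes the positive part. Indeed, $1/\sigma_n \leq 2/\sigma$, $2\sigma_n^2 \leq 8\sigma^2$, and for $|x| > |m| + 1$ one has $|x - m_n| \geq |x| - |m_n| \geq |x| - |m| - 1$.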
This is again due to $(a + b)^q \leq 2^q(a^q + b^q)$ for $a, b \geq 0$.
You know that $X_n \to X$ in $L^2$, so $X_n \to X$ in probability. This means $|X_n - X| \to 0$ in probability. Then, by the continuous mapping theorem for convergence in probability, you get $|X_n - X|^p \to 0^p = 0$ in probability.
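If you prefer not to invoke the continuous mapping theorem, the same conclusion follows directly from the definition: for every $\varepsilon > 0$,
$$P\big(|X_n - X|^p > \varepsilon\big) = P\big(|X_n - X| > \varepsilon^{1/p}\big) \xrightarrow[n \to \infty]{} 0,$$
since $X_n \to X$ in probability.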
On a probability space, uniform integrability of a collection $\{X_i : i \in I\}$ is equivalent to the existence of a function $f : [0, \infty) \to [0, \infty)$ such that $f(x)/x \to \infty$ as $x \to \infty$ and $\sup_{i \in I}E(f(|X_i|)) < \infty$. See Theorem 6.19 on page 154 of "Probability Theory" by Klenke. Now take $f(x) = x^2$.
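To apply this here (a sketch, assuming the setting of the previous question, where $X_n \to X$ in $L^2$): with $f(x) = x^2$, the triangle inequality in $L^2$ gives
$$\sup_n E(X_n^2) = \sup_n \|X_n\|_2^2 \leq \Big(\sup_n \|X_n - X\|_2 + \|X\|_2\Big)^2 < \infty,$$
since $\|X_n - X\|_2 \to 0$ makes the supremum finite; hence the family $\{X_n\}$ is uniformly integrable by the criterion.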