Let $Y_n = E(X \mid \mathcal{F}_n)$ for an integrable random variable $X$. Then $(Y_n)_n$ is a martingale, and
$$\sup_n E(|Y_n|) = \sup_n E(|E(X| \mathcal{F}_n)|) \leq \sup_n E(E(|X||\mathcal{F}_n)) = E(|X|) $$
where the bound in the middle is due to the conditional Jensen inequality, and the last equality to the tower property.
Now the heavy artillery: by Doob's convergence theorem, $Y_\infty := \lim_{n \to \infty} Y_n$ exists almost surely. Moreover, the family $(Y_n)_n$ is uniformly integrable, since $|Y_n| \leq E(|X| \mid \mathcal{F}_n)$ (conditional Jensen again) and the conditional expectations of an integrable random variable form a uniformly integrable family. This upgrades the almost sure convergence to $L^1$ convergence, and hence to convergence in probability.
You can find Doob's convergence theorem in Williams' "Probability with Martingales", Thm. 11.5. It is a rather important result, based on an upcrossing argument, and it can be extended to continuous-time martingales.
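As a quick numerical illustration of this convergence, here is a minimal sketch on $([0,1], \text{Lebesgue})$ with the dyadic filtration $\mathcal{F}_n$ generated by the intervals $[k 2^{-n}, (k+1)2^{-n})$; the helper `Y_n` and the choice $X(x) = \sqrt{x}$ are my own, not from the argument above.

```python
import numpy as np

# Sketch: Y_n = E(X | F_n) on [0,1] with the dyadic filtration is the
# average of X over the level-n dyadic interval containing x, and
# Y_n(x) -> X(x) almost surely as the filtration refines.

def Y_n(x, n, f, m=1000):
    """E(f | F_n)(x): average of f over the level-n dyadic interval
    containing x, approximated by sampling m equally spaced points."""
    k = int(x * 2 ** n)                  # index of the dyadic interval
    a, b = k / 2 ** n, (k + 1) / 2 ** n
    return f(np.linspace(a, b, m)).mean()

# Y_n(0.3) approaches X(0.3) = sqrt(0.3) as n grows:
for n in [1, 4, 8, 12]:
    print(n, Y_n(0.3, n, np.sqrt))
```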
Let $(Y_n)_{n \in \mathbb{N}}$ be a sequence of independent random variables such that
$$\mathbb{P}(Y_n = 1) = \mathbb{P}(Y_n=-1) = \frac{1}{2n} \qquad \mathbb{P}(Y_n=0) = 1- \frac{1}{n}.$$
If we define
$$X_n := \begin{cases} Y_n, & X_{n-1} = 0, \\ n X_{n-1} |Y_n|, & X_{n-1} \neq 0 \end{cases} \qquad X_0 := 0$$
then the process $(X_n)_{n \in \mathbb{N}_0}$ is a martingale with respect to $\mathcal{F}_n := \sigma(Y_k; k \leq n)$. Indeed:
$$\begin{align*} \mathbb{E}(X_n \mid \mathcal{F}_{n-1}) &= 1_{\{X_{n-1}=0\}} \underbrace{\mathbb{E}(Y_n \mid \mathcal{F}_{n-1})}_{=\mathbb{E}(Y_n)=0} + n 1_{\{X_{n-1} \neq 0\}} X_{n-1} \underbrace{\mathbb{E}(|Y_n| \mid \mathcal{F}_{n-1})}_{=\mathbb{E}(|Y_n|) = 1/n} \\ &= 0 \cdot 1_{\{X_{n-1}=0\}} + 1_{\{X_{n-1} \neq 0\}} X_{n-1} = X_{n-1}. \end{align*}$$
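Since $X_n$ depends on the past only through $X_{n-1}$ and $Y_n$, the one-step identity above can also be checked exactly by enumerating the three outcomes of $Y_n$; this small sketch (the helper `cond_mean` is mine, not part of the answer) uses exact rational arithmetic.

```python
from fractions import Fraction

def cond_mean(c, n):
    """E(X_n | X_{n-1} = c), by enumerating the outcomes of Y_n."""
    p = Fraction(1, 2 * n)                        # P(Y_n = 1) = P(Y_n = -1)
    outcomes = [(1, p), (-1, p), (0, 1 - 2 * p)]  # (value of Y_n, probability)
    if c == 0:
        return sum(y * pr for y, pr in outcomes)           # X_n = Y_n
    return sum(n * c * abs(y) * pr for y, pr in outcomes)  # X_n = n c |Y_n|

# The conditional expectation equals X_{n-1} for every n and value c:
for n in [2, 5, 10]:
    for c in [0, 1, -1, 6]:
        assert cond_mean(c, n) == c
```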
For any fixed $a \in \{-1,0,1\}$ we have
$$\begin{align*} \sum_{n \geq 1} \mathbb{P}(Y_{2n}=0, Y_{2n+1}=a) &= \sum_{n \geq 1} \mathbb{P}(Y_{2n}=0) \mathbb{P}(Y_{2n+1}=a) \\ &\geq \sum_{n \geq 1} \left(1-\frac{1}{2n} \right) \frac{1}{2(2n+1)} = \infty, \end{align*}$$
and therefore the second Borel-Cantelli lemma (the events are independent) shows that, for almost all $\omega$, there are infinitely many $n \in \mathbb{N}$ with $Y_{2n}(\omega)=0$ and $Y_{2n+1}(\omega)=a$. By the very definition of $X_n$, this implies that $X_{2n}(\omega)=0$ and $$X_{2n+1}(\omega)=Y_{2n+1}(\omega)=a$$ for any such $n \in \mathbb{N}$. Consequently, we have shown that $$\mathbb{P}(X_k = a \, \, \text{infinitely often})=1$$ for any $a \in \{-1,0,1\}$.

It remains to prove that $$\sup_{n \in \mathbb{N}} |X_n(\omega)| < \infty \quad \text{a.s.}$$ To this end, we note that $$\sum_{n \geq 1} \mathbb{P}(Y_n \neq 0, Y_{n+1} \neq 0) = \sum_{n \geq 1} \mathbb{P}(Y_n \neq 0) \, \mathbb{P}(Y_{n+1} \neq 0) \leq \sum_{n \geq 1} \frac{1}{n^2} < \infty.$$ Applying the (first) Borel-Cantelli lemma, we find that for almost all $\omega$ we can choose $N=N(\omega)$ such that $$Y_{n}(\omega) \neq 0 \implies Y_{n+1}(\omega)=0 \quad \text{for all $n \geq N$.}$$ As $$X_n(\omega) \neq 0 \implies Y_n(\omega) \neq 0 \quad \text{and} \quad Y_{n+1}(\omega) = 0 \implies X_{n+1}(\omega)=0,$$ this means that $$X_n(\omega) \neq 0 \implies X_{n+1}(\omega)=0 \quad \text{for all $n \geq N$.}$$ By the definition of $X_n$, this implies that $|X_n(\omega)| \leq |Y_n(\omega)| \leq 1$ for all $n \geq N$. Thus, $$\sup_{n \in \mathbb{N}} |X_n(\omega)| \leq \sup_{n \leq N} |X_n(\omega)| + 1<\infty.$$
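For illustration, here is a Monte Carlo sketch of a single long path (the simulation, the seed, and the horizon $N = 200{,}000$ are my own choices, not part of the answer): it counts visits to $-1$, $0$, $1$ and tracks the running maximum of $|X_n|$, which stays finite along the path while the three values keep recurring.

```python
import random

# Simulate one path of (X_n): Y_n = +-1 with probability 1/(2n) each,
# Y_n = 0 otherwise; X_n = Y_n if X_{n-1} = 0, else X_n = n * X_{n-1} * |Y_n|.
random.seed(1)
N = 200_000
x = 0
visits = {-1: 0, 0: 0, 1: 0}
sup_abs = 0
for n in range(1, N + 1):
    u = random.random()
    if u < 1 / (2 * n):
        y = 1
    elif u < 1 / n:
        y = -1
    else:
        y = 0
    x = y if x == 0 else n * x * abs(y)
    if x in visits:
        visits[x] += 1
    sup_abs = max(sup_abs, abs(x))

print(visits, sup_abs)   # many visits to each of -1, 0, 1; finite sup
```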
Best Answer
The conditional expectation operator is a contraction on $L^p$ for each $p \ge 1$, so the hypothesis that $A_n(x) \to 0$ in $L^p$ implies that $E[A_n \mid \mathcal F_n] \to 0$ in $L^p$. However, the answer is different for almost sure convergence.
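The contraction claim can be checked numerically on a discretized probability space; in this sketch (all names and the setup are mine) conditioning on a finite $\sigma$-field amounts to replacing a vector by its block averages, and the $L^p$ norm never increases.

```python
import numpy as np

# Discretize [0,1) into 2^10 equal cells; F is generated by 2^4 equal blocks.
# E[A | F] replaces A by its block averages; check ||E[A|F]||_p <= ||A||_p.
rng = np.random.default_rng(0)
A = rng.standard_normal(2 ** 10)            # a random "function" on the cells
blocks = A.reshape(2 ** 4, -1)              # F-atoms: 16 blocks of 64 cells
condA = np.repeat(blocks.mean(axis=1), 2 ** 10 // 2 ** 4)   # E[A | F]

for p in [1, 2, 4]:
    norm = lambda f: np.mean(np.abs(f) ** p) ** (1 / p)
    print(p, norm(condA), norm(A))          # first column <= second column
```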
Consider the unit interval with Lebesgue measure $\mu$. For integers $n \in [2^k,2^{k+1}-1]$, let $\mathcal F_n$ denote the $\sigma$-field generated by the uniform partition of $[0,1]$ into $2^k$ intervals.
Also, for $n \in [2^k,2^{k+1}-1]$, let $I_n$ be an interval of length $2^{-k}/k^2$ contained in the interval $J_n=[n/2^k-1,(n+1)/2^k-1]$, and define $A_n(x)=k^2$ for $x \in I_n$ and $A_n(x)=0$ for $x \notin I_n$.
Then $A_n(x) \to 0$ in $L^p[0,1]$ for every $p \ge 1$, and $A_n(x) \to 0$ a.e. by Borel-Cantelli, since $\sum_{n \ge 1} \mu(I_n)=\sum_{k \ge 1} 1/k^2 <\infty\,.$
However, the conditional expectations $E[A_n \mid \mathcal F_n] ={\bf 1}_{J_n}$ do not tend to zero a.e.: every $x \in [0,1)$ lies in $J_n$ for exactly one $n$ in each block $[2^k,2^{k+1}-1]$, so $E[A_n \mid \mathcal F_n](x)=1$ for infinitely many $n$.
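To see the example concretely, one can discretize $[0,1]$ and compute both $\|A_n\|_{L^1}$ and $E[A_n \mid \mathcal F_n]$. This sketch is my own (in particular, it takes $I_n$ to be the left end of $J_n$, which the answer leaves unspecified): the $L^1$ norm decays like $1/k^2$, while the conditional expectation equals $1$ on all of $J_n$.

```python
import numpy as np

def example(n, M=2 ** 16):
    """Return (||A_n||_{L^1}, max of E[A_n | F_n]) on an M-cell grid."""
    k = n.bit_length() - 1                              # n in [2^k, 2^{k+1})
    x = (np.arange(M) + 0.5) / M                        # midpoints of the cells
    a, b = (n - 2 ** k) / 2 ** k, (n - 2 ** k + 1) / 2 ** k   # J_n = [a, b)
    inJ = (x >= a) & (x < b)
    inI = (x >= a) & (x < a + 2 ** -k / k ** 2)         # I_n: left piece of J_n
    A = np.where(inI, float(k ** 2), 0.0)               # A_n = k^2 on I_n
    condA = np.zeros(M)
    condA[inJ] = A[inJ].mean()                          # block average over J_n
    return A.mean(), condA.max()

# ||A_n||_1 = 1/k^2 shrinks, but sup E[A_n | F_n] = 1 for every n:
for n in [4, 16]:
    print(n, example(n))
```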