The $L^p$ martingale convergence theorem also holds for non-negative submartingales. The proof relies on Doob's maximal inequality:
Let $(X_j)_{j \in \mathbb{N}}$ be a non-negative submartingale (or a martingale). Then $X_n^* := \sup_{j \leq n} |X_j|$ satisfies $$\|X_n^*\|_p \leq \frac{p}{p-1} \|X_n\|_p$$ for any $p>1$. Moreover, for $X_{\infty}^* := \sup_{j \geq 1} |X_j|$ we have $$\|X_{\infty}^*\|_p \leq \frac{p}{p-1} \sup_{j \geq 1} \|X_j\|_p. \tag{1}$$
For a proof see e.g. René Schilling: Measures, Integrals and Martingales, Theorem 19.12.
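The inequality can be sanity-checked numerically. Below is a quick Monte Carlo sketch (assuming NumPy is available; the walk and all parameters are my choices, not from the theorem) using $X_j = |S_j|$ for a simple symmetric random walk $S_j$, which is a non-negative submartingale:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 2.0, 200, 20_000

# X_j = |S_j| for a simple symmetric random walk S_j is a
# non-negative submartingale (|.| is convex, S_j is a martingale).
steps = rng.choice([-1.0, 1.0], size=(trials, n))
X = np.abs(np.cumsum(steps, axis=1))

X_star = X.max(axis=1)                                   # X_n^* = sup_{j<=n} X_j
lhs = np.mean(X_star ** p) ** (1 / p)                    # ||X_n^*||_p
rhs = (p / (p - 1)) * np.mean(X[:, -1] ** p) ** (1 / p)  # (p/(p-1)) ||X_n||_p
print(f"||X_n^*||_p = {lhs:.2f} <= {rhs:.2f} = p/(p-1) * ||X_n||_p")
```

Of course this only illustrates the inequality for one process and one $p$; it is no substitute for the proof in Schilling's book.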
So let's prove the convergence theorem for non-negative submartingales:
Let $(X_j)_{j \in \mathbb{N}}$ be a non-negative submartingale which is bounded in $L^p$ for some $p>1$, i.e.
$$\sup_{j \geq 1} \|X_j\|_p <\infty. \tag{2}$$
Then $(X_j)_{j \in \mathbb{N}}$ is in particular bounded in $L^1$, and therefore, by the standard a.s. convergence theorem for submartingales, we have $X_j \to X$ almost surely for some random variable $X$. By $(1)$ and $(2)$ we get $X_{\infty}^* \in L^p$, and since $|X_j| \leq X_{\infty}^*$ for all $j$, hence also $|X| \leq X_{\infty}^*$, we find
$$|X-X_j| \leq 2 X_{\infty}^* \in L^p.$$
Consequently, the dominated convergence theorem proves
$$\|X-X_j\|_p \to 0 \qquad \text{as $j \to \infty$.}$$
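As an illustration (not part of the proof), one can watch this $L^p$ convergence numerically. The sketch below (assuming NumPy; the Pólya urn, the horizon $N$, and the checkpoints are my choices for demonstration) uses the Pólya-urn proportion, a $[0,1]$-valued martingale and hence bounded in every $L^p$; the value at a large time stands in for the a.s. limit $X$:

```python
import numpy as np

rng = np.random.default_rng(1)
trials, N, p = 20_000, 2_000, 2.0
checkpoints = (10, 100, 1_000)

# Polya urn: start with one red and one blue ball; after each draw the
# drawn colour is returned together with one extra ball of that colour.
# X_j = fraction of red balls after j draws is a martingale in [0, 1].
red, total = np.ones(trials), 2.0
snap = {}
for j in range(1, N + 1):
    red += rng.random(trials) < red / total  # draw a red ball w.p. red/total
    total += 1.0
    if j in checkpoints:
        snap[j] = red / total
X = red / total  # proxy for the a.s. limit X (the exact limit needs N = infinity)

errs = [np.mean(np.abs(X - snap[j]) ** p) ** (1 / p) for j in checkpoints]
print(errs)  # ||X - X_j||_p shrinks as j grows
```

The empirical $\|X - X_j\|_p$ decreases along the checkpoints, as the theorem predicts.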
You've stepped off a cliff in deducing that the expectation $E(X^+_n)$ is bounded above by the random variable $\sup_n X_n$.
Suggestion: Employ the argument used by Durrett in the proof of his Theorem 5.3.1. Fix a positive real $K$, define the stopping time $T=T_K$ to be the first time $n$ that $X_n$ is larger than $K$, and observe that the stopped process satisfies (writing $\xi_m := X_m - X_{m-1}$ for the increments of $X$)
$$
X_{n\wedge T}\le K+\sup_m\xi_m^+,
$$
so that
$$
E(X_{n\wedge T})\le K+E(\sup_m\xi_m^+)<\infty,\qquad\forall n.
$$
Now apply the submartingale convergence theorem to the stopped process. This yields a.s. convergence of $X_n$ on the event $\{T_K=\infty\}=\{\sup_mX_m\le K\}$. Finally, let $K\to\infty$.
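For a concrete feel of the key pathwise bound, here is a sketch (assuming NumPy; the biased walk, $K$, and horizon are my choices) with increments $\xi_m \in \{-1,+1\}$, so $\sup_m \xi_m^+ = 1$ and the stopped paths can never exceed $K+1$:

```python
import numpy as np

rng = np.random.default_rng(2)
K, n, trials = 5.0, 500, 10_000

# A submartingale with increments xi_m in {-1, +1}: an upward-biased walk.
xi = rng.choice([-1.0, 1.0], p=[0.45, 0.55], size=(trials, n))
X = np.cumsum(xi, axis=1)

# T = first time X_n > K (or the horizon, if that never happens).
exceed = X > K
T = np.where(exceed.any(axis=1), exceed.argmax(axis=1), n - 1)

# Stopped process X_{n ^ T}: freeze each path at time T.
idx = np.minimum(np.arange(n), T[:, None])
stopped = np.take_along_axis(X, idx, axis=1)

print(stopped.max())  # pathwise: X_{n ^ T} <= K + sup_m xi_m^+ = K + 1
```

Since a path can overshoot $K$ only by a single increment, every stopped value stays below $K + 1$, which is the inequality used in the argument above.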
Best Answer
Let $R_i,\,i\ge1,$ be independent, nonnegative random variables with mean $1$. Then $M_n:=\prod_{i=1}^nR_i,\,n\ge0,$ defines a nonnegative martingale w.r.t. its natural filtration $F_n=\sigma(R_1,\ldots,R_n),\,n\ge0$ (with $M_0=1$ and $F_0$ trivial). By the martingale convergence theorem, it converges almost surely as $n\to\infty$ towards a nonnegative random variable $M_\infty$, and further $\mathbb E[M_\infty]\le1$ (by Fatou's lemma).
Next, let $X_n:=-\sqrt{M_n}$. It is clear that $X_n,\,n\ge0,$ is a $\{F_n\}_{n=0}^\infty$-adapted process bounded in $\mathrm L^2(\mathbb P)$ (we have $\mathbb E[X_n^2]=\mathbb E[M_n]=1$ for every $n\in\mathbb N$). Further, by convexity of the function $-\sqrt{\cdot}$ and (conditional) Jensen's inequality, $$\mathbb E[X_{n+1}\mid F_n]=\mathbb E\!\left[-\sqrt{M_{n+1}}\mid F_n\right]\ge-\sqrt{\mathbb E[M_{n+1}\mid F_n]}=-\sqrt{M_n}=X_n$$ for every $n\in\mathbb N$. Thus $(X_n)_{n\ge0}$ is a submartingale bounded in $\mathrm L^2(\mathbb P)$.
Now $X_n$ converges almost surely as $n\to\infty$ towards $X_\infty:=-\sqrt{M_\infty}$. Since $\|X_n\|_2=1$ for every $n$, the Riesz-Scheffé lemma shows that the following assertions are equivalent: (i) $X_n\to X_\infty$ in $\mathrm L^2(\mathbb P)$; (ii) $\mathbb E[X_\infty^2]=\mathbb E[M_\infty]=1$.
By Kakutani's martingale theorem, these assertions are also equivalent to $$\prod_{i=1}^\infty\mathbb E\!\left[\sqrt{R_i}\right]>0,$$ or equivalently $$\sum_{i=1}^\infty\left(1-\mathbb E\!\left[\sqrt{R_i}\right]\right)<\infty.$$
For instance, choose the independent $R_i$'s such that $$R_i=\begin{cases}\frac{(i+1)^2}{i^2},&\text{with probability $\frac{i^2}{(i+1)^2}$,}\\0,&\text{with the remaining probability.}\end{cases}$$ Then $\mathbb E[\sqrt{R_i}]=\frac{i}{i+1}$, so $\sum_{i\ge1}\left(1-\mathbb E[\sqrt{R_i}]\right)=\sum_{i\ge1}\frac1{i+1}=\infty$. In this case $(X_n)_{n\ge0}$ is a (nonpositive) submartingale which is bounded in $\mathrm L^2(\mathbb P)$ but does not converge in $\mathrm L^2(\mathbb P)$: indeed, $M_n=(n+1)^2$ with probability $(n+1)^{-2}$ and $M_n=0$ otherwise, so $X_n\to0$ almost surely while $\|X_n\|_2\equiv1$.
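A quick Monte Carlo sketch (assuming NumPy; the horizon and sample size are my choices) can make this concrete: the sample mean of $M_n = X_n^2$ stays near its exact value $1$, while the fraction of paths with $M_n > 0$ collapses to the survival probability $(n+1)^{-2}$:

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n = 200_000, 30

# R_i = (i+1)^2/i^2 with probability i^2/(i+1)^2, else 0, so E[R_i] = 1.
M = np.ones(trials)
for i in range(1, n + 1):
    survive = rng.random(trials) < i**2 / (i + 1) ** 2
    M *= np.where(survive, (i + 1) ** 2 / i**2, 0.0)

# M_n = (n+1)^2 on the surviving paths and 0 elsewhere.
print(np.mean(M))      # sample estimate of E[X_n^2] = E[M_n] (exactly 1)
print(np.mean(M > 0))  # survival fraction, theoretically (n+1)^(-2)
```

The few surviving paths carry the huge value $(n+1)^2$, which is exactly why $\|X_n\|_2$ stays at $1$ even though $X_n \to 0$ almost surely.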