It follows from your assumption on the conditional expectation that
$$Z_n = X_n - \sum_{i=0}^{n-1} Y_i$$
is a supermartingale. If we define
$$T_k := \inf\left\{n \in \mathbb{N}; \sum_{i=0}^n Y_i \geq k\right\}$$
for fixed $k \in \mathbb{N}$, then $T_k$ is an $\mathcal{F}_n$-stopping time. By the optional stopping theorem, the stopped process $(Z_{n \wedge T_k})_{n \in \mathbb{N}}$ is also a supermartingale. Moreover, the definition of $T_k$ and the non-negativity of $X_n$ and $Y_n$ entail that $Z_{n \wedge T_k} \geq -k$ for all $n \in \mathbb{N}$, and therefore
$$\sup_{n \in \mathbb{N}} \mathbb{E}(Z_{n \wedge T_k}^-) < \infty.$$
Applying the standard convergence theorem for supermartingales, we conclude that the limit $\lim_{n \to \infty} Z_{n \wedge T_k}$ exists almost surely, and so
$$\Omega_0 := \bigcap_{k \geq 1} \{\omega \in \Omega; \lim_{n \to \infty} Z_{n \wedge T_k}(\omega) \, \, \text{exists}\}$$
has probability $1$. Now if $\omega \in \Omega_0$ is such that $\sum_{n} Y_n(\omega)<\infty$, then we can choose $k \in \mathbb{N}$ large enough that $T_k(\omega)=\infty$. As $\omega \in \Omega_0$, we thus know that
$$\lim_{n \to \infty} Z_{n \wedge T_k}(\omega) = \lim_{n \to \infty} Z_n(\omega) = \lim_{n \to \infty} \left( X_n(\omega)- \sum_{i=0}^{n-1} Y_i(\omega) \right)$$
exists. Using once more that $\sum_{i} Y_i(\omega)<\infty$, we conclude that $\lim_n X_n(\omega)$ exists.
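For intuition, the statement can be watched numerically. The following is a toy sketch; all concrete choices (the distributions, $X_0$, the horizon) are mine for illustration, not from the proof: take $Y_n = 2^{-n}$ deterministic, so $\sum_n Y_n < \infty$ surely, and $X_{n+1} = X_n + Y_n U_n$ with $U_n$ i.i.d. uniform on $[-1,1]$, so that $\mathbb{E}(X_{n+1} \mid \mathcal{F}_n) = X_n \leq X_n + Y_n$.

```python
import numpy as np

# Illustrative choices (mine, not from the proof):
# Y_n = 2^{-n} is deterministic, so sum_n Y_n < infinity on all of Omega,
# and X_{n+1} = X_n + Y_n * U_n with U_n ~ Uniform[-1, 1] i.i.d., giving
# E[X_{n+1} | F_n] = X_n <= X_n + Y_n.  Starting at X_0 = 2 keeps X_n >= 0,
# since the increments sum to at most sum_n 2^{-n} = 2 in absolute value.
rng = np.random.default_rng(42)
N = 60
X = np.empty(N + 1)
X[0] = 2.0
for n in range(N):
    X[n + 1] = X[n] + 2.0 ** (-n) * rng.uniform(-1.0, 1.0)

print(X[-1])  # the path barely moves over the last steps: X_n converges
```

With these choices the hypotheses hold pathwise, and the simulated path settles down to a limit, as the argument above predicts.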
Take $X_n=n$ and a sequence of independent r.v.s. $\{Y_n\}$ such that $\mathsf{P}(Y_n=0)=1-1/n$ and $\mathsf{P}(Y_n=-2n)=1/n$. Then,
$$
\sum_{n\ge 1}\mathsf{P}(X_n+Y_n<0)=\sum_{n\ge 1}\frac{1}{n}=\infty.
$$
Since the $Y_n$ are independent, the second Borel–Cantelli lemma gives $\mathsf{P}(X_n+Y_n<0\text{ i.o.})=1$.
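One can watch how often the event occurs along a single path; here is a quick sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
n = np.arange(1, N + 1)
X = n.astype(float)                 # X_n = n
# Y_n = -2n with probability 1/n and 0 otherwise, independently over n
hit = rng.random(N) < 1.0 / n
Y = np.where(hit, -2.0 * n, 0.0)

neg = X + Y < 0                     # the event {X_n + Y_n < 0} = {Y_n = -2n}
print(neg.sum())                    # roughly sum_{n <= N} 1/n ~ log N
```

Of course a finite simulation cannot exhibit "infinitely often", but the count of occurrences up to $N$ grows like $\log N$, consistent with the divergent sum above.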
Without any additional assumptions (e.g. on the speed of convergence), the assertion is wrong.
(Counter)Example Let $Z \sim N(0,1)$ be a standard Gaussian random variable. If we set $$X_n := \frac{1}{\sqrt{n}} Z \qquad Y_n:= \frac{1}{n^{1/4}} Z$$ then clearly $X_n \to 0$ and $Y_n \to 0$ in probability. The density $p$ of $Z$,
$$p(y) = \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{y^2}{2} \right)$$
is continuous and satisfies $p(0)=1/\sqrt{2\pi}$, and therefore it follows that we can choose $r>0$ such that
$$p(y) \geq \frac{3}{4} \frac{1}{\sqrt{2\pi}} \quad \text{for all $|y| \leq r$}.$$
This implies that the density $f_n$ of $X_n$ satisfies
$$f_n(y) = \sqrt{n}\, p \left( \sqrt{n} y \right) \geq \frac{3}{4} \frac{\sqrt{n}}{\sqrt{2\pi}} \quad \text{for all $|y| \leq r/\sqrt{n}$.}$$
Similarly, the density $g_n$ of $Y_n$ is given by $g_n(y) = n^{1/4} p(n^{1/4} y)$.
Since
$$\int_{\mathbb{R}} |f_n(y)-g_n(y)| \, dy \geq \int_{|y| \leq r/\sqrt{n}} (f_n(y)-g_n(y)) \, dy$$
and $\|g_n\|_{\infty} \leq n^{1/4}/\sqrt{2\pi}$, we get
$$\int_{\mathbb{R}} |f_n(y)-g_n(y)| \, dy \geq \left( \frac{3}{4} \frac{\sqrt{n}}{\sqrt{2\pi}} - \frac{n^{1/4}}{\sqrt{2 \pi}} \right) \frac{2r}{\sqrt{n}} \xrightarrow[]{n \to \infty} \frac{3}{2} \frac{r}{\sqrt{2\pi}}>0$$
and so
$$\int_{\mathbb{R}} |f_n(y)-g_n(y)| \, dy \to 0$$
does not hold true.
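The lower bound can be cross-checked numerically. The sketch below (grid parameters are arbitrary) evaluates $\int |f_n - g_n|$ for the densities used above, $f_n(y) = \sqrt{n}\,p(\sqrt{n}y)$ and $g_n(y) = n^{1/4} p(n^{1/4} y)$, i.e. centered Gaussians with standard deviations $n^{-1/2}$ and $n^{-1/4}$:

```python
import numpy as np

SQRT2PI = np.sqrt(2.0 * np.pi)

def l1_distance(n, width=8.0, grid=400_001):
    """Riemann-sum approximation of  int |f_n - g_n| dy  for
    f_n(y) = sqrt(n) p(sqrt(n) y)  and  g_n(y) = n**0.25 p(n**0.25 y)."""
    s1, s2 = n ** -0.5, n ** -0.25      # standard deviations, s1 <= s2
    y = np.linspace(-width * s2, width * s2, grid)
    f = np.exp(-y**2 / (2 * s1**2)) / (s1 * SQRT2PI)
    g = np.exp(-y**2 / (2 * s2**2)) / (s2 * SQRT2PI)
    return np.sum(np.abs(f - g)) * (y[1] - y[0])

for n in (1, 16, 10_000):
    print(n, l1_distance(n))   # increases with n, bounded away from 0
```

For $n = 1$ the two densities coincide and the distance is $0$; as $n$ grows the distance increases and stays bounded away from $0$, matching the limit $\frac{3}{2}\frac{r}{\sqrt{2\pi}} > 0$ obtained above.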
Remark For random variables $X$ and $Y$ with density $f$ and $g$, respectively, the total variation distance is defined as
$$d_{\text{TV}}(X,Y) = \frac{1}{2} \int |f-g|.$$
It is possible to show that
$$d_{\text{TV}}(X,Y) = \sup_{A \in \mathcal{B}(\mathbb{R})} |\mathbb{P}(X \in A)-\mathbb{P}(Y \in A)|.$$
The above (counter)example shows that $X_n \to \mu$ and $Y_n \to \mu$ in probability do not, in general, imply $d_{\text{TV}}(X_n,Y_n) \to 0$.
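As a sanity check of this identity in the Gaussian setting (the variances below are arbitrary choices): for centered normals with $s_1 < s_2$, the supremum is attained at $A = \{f > g\} = \{|y| < c\}$, where $c$ is the crossing point of the two densities, and this should agree with $\frac{1}{2}\int |f-g|$:

```python
import numpy as np
from math import erf, sqrt, log, pi

s1, s2 = 1.0, 3.0                                # arbitrary example deviations
c = sqrt(2 * log(s2 / s1) / (s1**-2 - s2**-2))   # densities cross at |y| = c

def prob_abs_below(s, c):
    """P(|N(0, s^2)| < c), via the error function."""
    return erf(c / (s * sqrt(2)))

# sup over A is attained at A = {f > g} = {|y| < c}
sup_form = prob_abs_below(s1, c) - prob_abs_below(s2, c)

# (1/2) * integral |f - g|, by a Riemann sum
y = np.linspace(-12 * s2, 12 * s2, 600_001)
f = np.exp(-y**2 / (2 * s1**2)) / (s1 * sqrt(2 * pi))
g = np.exp(-y**2 / (2 * s2**2)) / (s2 * sqrt(2 * pi))
half_l1 = 0.5 * np.sum(np.abs(f - g)) * (y[1] - y[0])

print(sup_form, half_l1)                         # the two expressions agree
```

The agreement of the two printed numbers illustrates, for this pair of distributions, that the supremum over Borel sets equals half the $L^1$ distance of the densities.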