If $X_n \overset{\text{a.s.}}{\to} \infty$ and $Y_n = O_P(1)$, does $X_n + Y_n \to \infty$ in probability or almost surely?

probability-theory, probability-limit-theorems, solution-verification

Question

Let $(X_n)$ and $(Y_n)$ be random variables on the probability space $(\Omega, \mathcal A, P)$.

Suppose $X_n \overset{\text{a.s.}}{\longrightarrow} \infty$ and $Y_n = O_P(1)$, i.e. $(Y_n)$ is tight. What do we then know about the behavior of $X_n + Y_n$ as $n \to \infty$ ?

I think that $X_n + Y_n \overset{P}{\longrightarrow} \infty$. (See my proof attempt below.)

I don't think that $X_n + Y_n \overset{\text{a.s.}}{\longrightarrow}\infty$ necessarily holds. (Although I don't have a counterexample.)


Thoughts/Attempt

Claim: $X_n + Y_n \overset{P}{\longrightarrow} \infty$, i.e. $P(X_n + Y_n \geq c) \to 1$ for all $c \in \mathbb R$.

Proof:
Fix $c \in \mathbb R$ and $N \in \mathbb N$.
$$
\begin{aligned}
P(X_n + Y_n \geq c) =& P(X_n + Y_n \geq c \mid Y_n \in [-N, N]) \cdot P(Y_n \in [-N, N]) \\
&+ P(X_n + Y_n \geq c \mid Y_n \notin [-N, N]) \cdot P(Y_n \notin [-N, N]) \\
\geq& P(X_n + Y_n \geq c \mid Y_n \in [-N, N]) \cdot P(Y_n \in [-N, N])\\
\geq& P(X_n - N \geq c \mid Y_n \in [-N, N]) \cdot P(Y_n \in [-N, N]) \\
=& P(X_n \geq c + N \mid Y_n \in [-N, N]) \cdot P(Y_n \in [-N, N])
\end{aligned}
$$

Since $(Y_n)$ is tight, we may choose $N$ so large that $P(Y_n \in [-N, N]) > 1/2$ for all sufficiently large $n$. For such $N$, since $X_n \overset{\text{a.s.}}{\longrightarrow} \infty$ implies $P(X_n < c + N) \to 0$, it holds
$$
P(X_n \geq c + N \mid Y_n \in [-N, N]) \geq 1 - \frac{P(X_n < c + N)}{P(Y_n \in [-N, N])} \geq 1 - 2\,P(X_n < c + N) \longrightarrow 1.
$$

By the tightness of $(Y_n)$,
$$
\lim_{N \to \infty} \liminf_{n \to \infty} P(Y_n \in [-N,N]) = 1.
$$

Hence, for every sufficiently large $N$,
$$
\liminf_{n \to \infty} P(X_n + Y_n \geq c) \geq \liminf_{n \to \infty} P(Y_n \in [-N,N]),
$$

and letting $N \to \infty$ gives $\liminf_{n \to \infty} P(X_n + Y_n \geq c) = 1$, so $\lim_{n \to \infty} P(X_n + Y_n \geq c) = 1. \quad \square$
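As a numerical sanity check (separate from the proof), one can take the hypothetical concrete instance $X_n = n$ (deterministic, hence $X_n \to \infty$ a.s.) and $Y_n \sim \mathcal N(0, 1)$ (a tight sequence) and estimate $P(X_n + Y_n \geq c)$ by Monte Carlo; the function name and parameters below are my own choices:

```python
import random

def sanity_check(n, c, trials=20_000, seed=0):
    """Monte Carlo estimate of P(X_n + Y_n >= c) in the concrete
    instance X_n = n (deterministic) and Y_n ~ N(0, 1) (tight)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if n + rng.gauss(0.0, 1.0) >= c)
    return hits / trials
```

For $c = 5$, the estimate is near $0$ at $n = 1$ and near $1$ at $n = 100$, consistent with $P(X_n + Y_n \geq c) \to 1$ for every fixed $c$.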


Update

Here's a counterexample similar to d.k.o.'s answer below.

Set $X_n := n$ for all $n \in \mathbb N$.

Let $F$ be an arbitrary distribution function and let $Y \sim F$. Let $(B_n)$ be a sequence of independent random variables with $B_n \sim \text{Bernoulli}(1/n)$, independent of $Y$.

Set
$$
Y_n =
\begin{cases}
Y, \quad &B_n = 0, \\
-n, \quad &B_n = 1,
\end{cases}
$$

for all $n \in \mathbb N$.

Then $Y_n \overset{d}{\longrightarrow} F$, since $P(Y_n \neq Y) \leq P(B_n = 1) = 1/n \to 0$. In particular, $Y_n = O_P(1)$.

However, $X_n + Y_n = n + Y_n = 0$ whenever $B_n = 1$, the events $\{B_n = 1\}$ are independent, and
$$
\sum_{n=1}^\infty P(B_n = 1) = \sum_{n=1}^\infty \frac 1 n = \infty.
$$

By the second Borel–Cantelli lemma, $P(B_n = 1 \text{ i.o.}) = 1$, and therefore
$$
P(X_n + Y_n = 0 \quad \text{i.o.}) = 1.
$$

In particular,
$$
X_n + Y_n \overset{\text{a.s.}}{\nrightarrow} \infty.
$$
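This counterexample is easy to simulate: for a continuous $F$, $X_n + Y_n = 0$ exactly when $B_n = 1$ (up to a null event), so it suffices to draw the independent Bernoulli indicators. The expected number of zeros among $n \leq n_{\max}$ is the harmonic number $H_{n_{\max}}$, which diverges. A sketch, with trial count and seed chosen arbitrarily:

```python
import random

def average_hits(n_max, trials=2_000, seed=1):
    """Average over sample paths of #{n <= n_max : B_n = 1}, where the
    B_n ~ Bernoulli(1/n) are independent.  On {B_n = 1} we have
    X_n + Y_n = n - n = 0, so this counts the zeros of X_n + Y_n.
    The exact expectation is the harmonic number H_{n_max}."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(1 for n in range(1, n_max + 1) if rng.random() < 1.0 / n)
    return total / trials
```

With $n_{\max} = 1000$ the average is close to $H_{1000} \approx 7.49$, and it keeps growing (logarithmically) as $n_{\max}$ increases, in line with $P(X_n + Y_n = 0 \text{ i.o.}) = 1$.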

Best Answer

Take $X_n = n$ and a sequence of independent random variables $(Y_n)$ with $\mathsf{P}(Y_n = 0) = 1 - 1/n$ and $\mathsf{P}(Y_n = -2n) = 1/n$. Since $Y_n \to 0$ in probability, $(Y_n)$ is tight, i.e. $Y_n = O_P(1)$. Then
$$
\sum_{n\ge 1}\mathsf{P}(X_n+Y_n<0)=\sum_{n\ge 1}\frac{1}{n}=\infty,
$$
and since the events $\{X_n + Y_n < 0\} = \{Y_n = -2n\}$ are independent, the second Borel–Cantelli lemma gives $\mathsf{P}(X_n+Y_n<0\text{ i.o.})=1$. In particular, $X_n + Y_n$ does not converge to $\infty$ almost surely.
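To illustrate the "infinitely often" conclusion numerically: for any $N$, the probability of at least one $n \in (N, 2N]$ with $Y_n = -2n$ telescopes to $1 - \prod_{n=N+1}^{2N} \frac{n-1}{n} = 1 - \frac{N}{2N} = \frac12$, the same for every $N$, so the event $X_n + Y_n < 0$ keeps recurring in arbitrarily late windows. A hedged simulation sketch (trial count and seed are arbitrary choices):

```python
import random

def window_hit_fraction(N, trials=4_000, seed=2):
    """Fraction of simulated paths with at least one n in (N, 2N] where
    Y_n = -2n (independent events of probability 1/n each), i.e. where
    X_n + Y_n = n - 2n = -n < 0.  The exact probability is
    1 - prod_{n=N+1}^{2N} (n-1)/n = 1/2 for every N."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if any(rng.random() < 1.0 / n for n in range(N + 1, 2 * N + 1))
    )
    return hits / trials
```

The empirical fraction hovers around $1/2$ whether the window starts at $N = 50$ or $N = 500$, matching the telescoping computation.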