Big Oh asymptotic notation for real sequences

Tags: asymptotics, real-analysis, self-learning

This is an old question of mine. Let $a_n$ be a real sequence that tends to zero as $n\to\infty$, and let $O_p$ denote the Big Oh in probability notation. Suppose that $a_n=O_p(b_n)$, where $b_n\to 0$ as $n\to\infty$.
If $C_1,C_2$ are constants, independent of $n$, then
$$\frac{C_1+a_n}{C_2+a_n}=\frac{C_1}{C_2}+O_p(b_n).$$
Why is it true?

The term $C_1/C_2$ is intuitive as the limit of the left-hand side of the above equation; the rate $b_n$, however, is not obvious to me. I am looking for a proof of this result. If one shows that
$$\frac{C_1}{C_2+O_p(b_n)}=\frac{C_1}{C_2}+O_p(b_n)$$
I think the rest is straightforward.
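For concreteness, writing $r_n$ for a generic $O_p(b_n)$ term (a stand-in of my own, assuming $C_2\neq 0$), the exact algebra behind this step is
$$\frac{C_1}{C_2+r_n}-\frac{C_1}{C_2}=\frac{-C_1\,r_n}{C_2(C_2+r_n)},$$
so the claim would reduce to showing that $1/(C_2+r_n)$ is bounded in probability, which should follow since $r_n=O_p(b_n)$ and $b_n\to 0$ force $r_n\to 0$ in probability.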

Best Answer

That is because, with the usual rules for big Oh and little oh, and with the hypothesis that $C_1,C_2\ne 0$,
\begin{align}
\frac{C_1 +a_n}{C_2+a_n}&=\frac{C_1}{C_2}\,\frac{1+\cfrac{a_n}{C_1}}{1+\cfrac{a_n}{C_2}}
=\frac{C_1}{C_2}\Bigl(1+\frac{a_n}{C_1}\Bigr)\Bigl(1-\frac{a_n}{C_2}+o\bigl(a_n\bigr)\Bigr)\\
&=\frac{C_1}{C_2}\Bigl(1+a_n\Bigl(\frac{1}{C_1}-\frac{1}{C_2}\Bigr)+o\bigl(a_n\bigr)\Bigr)\\
&=\frac{C_1}{C_2}+a_n\Bigl(\frac{C_2-C_1}{C_2^2}\Bigr)+o\bigl(a_n\bigr)\\
&=\frac{C_1}{C_2}+O(a_n)+o(a_n)=\frac{C_1}{C_2}+O(a_n),
\end{align}
where the second equality uses the expansion $\frac{1}{1+x}=1-x+o(x)$ as $x\to 0$, valid here because $a_n\to 0$. And, of course, if $a_n=O(b_n)$, then $O(a_n)\subseteq O(b_n)$.
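As a sanity check on the rate (illustrative, not part of the proof), one can verify numerically that the remainder scales like $a_n$ with the predicted constant $(C_2-C_1)/C_2^2$; the values of $C_1,C_2$ and the choice $a_n=1/n$ below are assumptions made for the example:

```python
# Sanity check: (C1 + a_n)/(C2 + a_n) - C1/C2 should scale like a_n,
# with constant (C2 - C1)/C2^2.  C1, C2 and a_n = 1/n are illustrative.
C1, C2 = 3.0, 2.0
predicted = (C2 - C1) / C2**2  # = -0.25 for these constants

for n in (10, 100, 1000, 10_000):
    a_n = 1.0 / n  # a deterministic sequence tending to zero
    remainder = (C1 + a_n) / (C2 + a_n) - C1 / C2
    print(n, remainder / a_n)  # ratio should approach `predicted`
```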
