Prove $\lim_{x\to x_0}\left(f(x)+g(x)\right)=+\infty$ if $f(x)\rightarrow A$ and $g(x)\rightarrow +\infty$

Tags: calculus, limits, real-analysis

I want to prove that if $\lim_{x\to x_0} f(x) = A \in \mathbb{R}$ and $\lim_{x\to x_0} g(x) = +\infty$, then $\lim_{x \to x_0} \left(f(x) + g(x)\right) = +\infty$ (where $x_0 \in \bar{\mathbb{R}}$), using the Cauchy ($\epsilon$-$\delta$) definition.

Translating it into $\epsilon,\delta$ language it becomes:

Hypothesis 1. $\forall \epsilon>0, \exists \delta>0, \forall x \in D_f, x\in U(x_0, \delta) \rightarrow A-\epsilon < f(x) < A + \epsilon$.
Hypothesis 2. $\forall \epsilon>0, \exists \delta>0, \forall x \in D_g, x\in U(x_0, \delta) \rightarrow \frac{1}{\epsilon} < g(x)$.

Goal. $\forall \epsilon>0, \exists\delta>0, \forall x\in D_f \cap D_g, x\in U(x_0, \delta) \rightarrow \frac{1}{\epsilon} < f(x) + g(x)$

So: given an arbitrary $\epsilon>0$, I choose some $\epsilon_1>0$ and $\epsilon_2>0$ to put into the two hypotheses, get corresponding $\delta_1>0$ and $\delta_2>0$, and then use $\delta = \min(\delta_1, \delta_2)$. Skipping forward, I arrive at the following: from $A-\epsilon_1 < f(x) < A + \epsilon_1$ and $\frac{1}{\epsilon_2} < g(x)$, I have to prove that $\frac{1}{\epsilon} < f(x) + g(x)$ follows.

How should I choose $\epsilon_1$ and $\epsilon_2$ to make that work? If $A > 0$, it seems I can choose $\epsilon_1 = A$ and $\epsilon_2 = \epsilon$, but what should I do when that's not the case?

Best Answer

Suppose $x_0$ is finite (the case $x_0 = \pm\infty$ is handled the same way, with the appropriate neighborhoods). If $\lim_{x \to x_0}f(x)=A$, where $A\in\Bbb{R}$, then there is a $\delta>0$ such that if $0<|x-x_0|<\delta$ then $|f(x)-A|<1$. Equivalently, $A-1<f(x)<A+1$.

If $\lim_{x \to x_0}g(x)=+\infty$, then for every $N>0$ there is a $\delta'>0$ such that if $0<|x-x_0|<\delta'$, then $g(x)>N$.

Combining these inequalities: for every $N>0$, if $0<|x-x_0|<\min(\delta,\delta')$, then $f(x)+g(x)>A+N-1$. Since $N$ is arbitrary, $A+N-1$ can be made larger than any prescribed bound. Can you finish the proof?
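In the question's own $\epsilon_1,\epsilon_2$ notation, the hint above corresponds to one possible (not the only) choice of constants; a sketch:

```latex
% Given an arbitrary \epsilon > 0:
% take \epsilon_1 = 1 in Hypothesis 1, and choose \epsilon_2 > 0 so that
%   1/\epsilon_2 = 1/\epsilon + |A| + 1.
% Then for x \in U(x_0, \min(\delta_1, \delta_2)):
\begin{aligned}
f(x) + g(x) &> (A - \epsilon_1) + \frac{1}{\epsilon_2} \\
            &= (A - 1) + \left(\frac{1}{\epsilon} + |A| + 1\right) \\
            &= \frac{1}{\epsilon} + A + |A| \\
            &\ge \frac{1}{\epsilon},
\end{aligned}
% where the last step uses A + |A| \ge 0 for every real A.
```

The point is that $\epsilon_1$ need not depend on $\epsilon$ at all; only the bound fed into Hypothesis 2 has to absorb both the target $\frac{1}{\epsilon}$ and the worst-case loss $\epsilon_1 - A$ from $f$.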
