[Math] Does a Sequence Converge to 0 if and only if its Reciprocal Sequence Diverges to Infinity

real-analysis, sequences-and-series

I was considering yesterday whether or not the statement in the title is, in fact, true. I believe that the definition of a convergent sequence is
$(\forall \epsilon>0)(\exists N\in\mathbb{N})(\forall n,m\in\mathbb{N},\ n,m>N):\left|a_m-a_n\right|<\epsilon$

and so a divergent sequence is:

$(\exists \epsilon>0)(\forall N\in\mathbb{N})(\exists n,m\in\mathbb{N},\ n,m>N):\left|a_m-a_n\right|\ge\epsilon$
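
For concreteness, a quick check of these definitions against standard examples: the sequence $a_n=\frac{1}{n}$ satisfies the first condition, since given $\epsilon>0$ one may take $N>\frac{2}{\epsilon}$, and then for all $n,m>N$

$\left|a_m-a_n\right|\le\frac{1}{m}+\frac{1}{n}<\frac{2}{N}<\epsilon,$

while $a_n=(-1)^n$ satisfies the second with $\epsilon=1$, since $\left|a_{n+1}-a_n\right|=2$ for every $n$.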

So I'm trying to see if the statement:

$\langle a_n\rangle\rightarrow0\iff\langle\frac{1}{a_n}\rangle\rightarrow\infty$

is true, or, at least, that the statement

$\langle a_n\rangle\rightarrow0\implies\langle\frac{1}{a_n}\rangle\rightarrow\infty$

is true.

I'm not sure how I would do this. Assuming the first statement and trying to prove the second, I would have to find an $\epsilon$ and, beyond every index $N$, a pair of indices such that $\left|\frac{1}{a_{m'}}-\frac{1}{a_{n'}}\right|\ge\epsilon$, correct?

I would think my method of attack here would be to suppose that the sequence of reciprocals does converge and derive a contradiction. For instance, fixing an $\epsilon'>0$, suppose I have found an index $N$ such that for all $n',m'>N$

$\left|\frac{1}{a_{m'}}-\frac{1}{a_{n'}}\right|<\epsilon'$, which means that $|a_{m'}-a_{n'}|<|a_{n'}a_{m'}|\epsilon'$ for all $n',m'>N$. By assumption there also exists some $\epsilon_0$ for this sequence such that $|a_{m'}-a_{n'}|<\epsilon_0$ for all $n',m'>N$. Then either $\epsilon_0\ge|a_{n'}a_{m'}|\epsilon'$ or $\epsilon_0\le|a_{n'}a_{m'}|\epsilon'$ for all $n',m'>N$. Of course, this does not seem like a very promising way to see that the other sequence diverges explicitly to infinity.

Does this seem like the correct approach? Any input is welcome! Personally, I've never been very good with these, so I'm likely attacking this from an angle that will horrify and appall the more astute among you!

Best Answer

There appears to be some confusion between a few concepts. First, the definition of convergent sequence you provide is the definition of a Cauchy sequence (which is equivalent to convergence in the complete metric space $\mathbb{R}$).

At any rate, "divergent sequence" and $a_n \to \infty$ usually mean two different things. We usually write $a_n \to \infty$ for $(\forall M)(\exists N)(\forall n > N)(a_n > M)$. This is not the same as "not converging".
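
For instance, $a_n=(-1)^n$ does not converge (it is not Cauchy), yet $a_n\not\to\infty$: taking $M=1$, no $N$ works, because beyond every $N$ there is an odd $n$ with

$a_n=-1\le 1=M.$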

Another point is that if you use the definition of Cauchy sequence, you're using the fact that the sequence is convergent, not the fact that it is converging to $0$.

Anyway, in conclusion:

1. The fact that $a_n$ is a convergent sequence does not imply that $1/a_n$ is a divergent sequence (quite the contrary: unless $a_n \to 0$, the reciprocal is convergent; see the example below).

2. If $a_n \to 0$, then $\frac{1}{|a_n|} \to \infty$ (with the definition I provided above; a sketch follows).
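
A sketch of the second point, using the definition of $a_n \to \infty$ above and assuming $a_n\ne 0$ for all $n$ so that the reciprocals are defined: fix $M>0$. Since $a_n\to 0$, taking $\epsilon=\frac{1}{M}$ gives an $N$ with $|a_n|<\frac{1}{M}$ for all $n>N$, hence

$\frac{1}{|a_n|}>M\quad\text{for all }n>N,$

which is precisely $\frac{1}{|a_n|}\to\infty$. The absolute value matters here: for $a_n=\frac{(-1)^n}{n}$ we have $a_n\to 0$ and $\frac{1}{|a_n|}=n\to\infty$, while $\frac{1}{a_n}=(-1)^n n$ does not tend to $\infty$. For the first point, $a_n=1+\frac{1}{n}$ converges to $1$, and its reciprocal $\frac{1}{a_n}=\frac{n}{n+1}$ converges to $1$ as well.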