Using the definition of convergence to prove a sequence converges

limits, real-analysis

Use the definition of convergence to prove that $x_k = (\frac{k}{k+1}, \frac{1}{k})$ converges.

Here is my attempt:

The definition in my book is as follows: $\{x_k\}$ is said to converge to some point $a \in \mathbb{R}^n$ iff for every $\epsilon > 0$ there exists an $N \in \mathbb{N}$ such that $k \geq N$ implies $\| x_k - a\| < \epsilon$.

This is what I have gathered so far:

For $x^{(1)}_k$ we have $x^{(1)}_k = \frac{k}{k+1}$, and taking the limit we get $1$. Doing the same for the second coordinate, $x^{(2)}_k = \frac{1}{k}$, the limit is $0$. Now let $\epsilon > 0$; we must find $N \in \mathbb{N}$ such that $k \geq N$ implies
$$\| x_k - (1,0)\| = \left\|\left(\tfrac{k}{k+1},\tfrac{1}{k}\right) - (1,0)\right\| = \left\|\left(\tfrac{-1}{k+1},\tfrac{1}{k}\right)\right\| < \epsilon.$$
But what does this say exactly, and is it right so far? We want this term to be less than $\epsilon$, but what step do I take next if what I have done so far is correct?
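For reference, assuming $\|\cdot\|$ is the standard Euclidean norm on $\mathbb{R}^2$ (the usual choice in this setting), the quantity above expands to
$$\left\|\left(\tfrac{-1}{k+1},\tfrac{1}{k}\right)\right\| = \sqrt{\frac{1}{(k+1)^2} + \frac{1}{k^2}},$$
so the remaining task is to make this square root smaller than $\epsilon$ for all sufficiently large $k$.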

Best Answer

Simple approach: $\|x_k - (1,0)\| < \frac{\sqrt{2}}{k}$. You need $N$ with $\frac{\sqrt{2}}{N} < \epsilon$, i.e. $N > \frac{\sqrt{2}}{\epsilon}$.
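Spelling this out (a sketch assuming the Euclidean norm, filling in the steps the answer leaves implicit): since $\frac{1}{(k+1)^2} < \frac{1}{k^2}$,
$$\|x_k - (1,0)\| = \sqrt{\frac{1}{(k+1)^2} + \frac{1}{k^2}} < \sqrt{\frac{1}{k^2} + \frac{1}{k^2}} = \frac{\sqrt{2}}{k}.$$
Hence, given $\epsilon > 0$, choose $N \in \mathbb{N}$ with $N > \frac{\sqrt{2}}{\epsilon}$; then for every $k \geq N$,
$$\|x_k - (1,0)\| < \frac{\sqrt{2}}{k} \leq \frac{\sqrt{2}}{N} < \epsilon,$$
which is exactly the definition of $x_k \to (1,0)$.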