For a Taylor Series of a function:
$f(x) = \displaystyle\sum\limits_{k=0}^{+\infty}c_k (x-a)^k = c_0 + c_1(x-a) + c_2(x-a)^2 + \dots$
The equality holds for $x$ within the radius of convergence $R$, which is given by:
$\displaystyle\frac{1}{R} = \limsup\limits_{k \rightarrow \infty} \left|c_k\right|^{\large{\frac{1}{k}}}$ (Cauchy-Hadamard formula)
where $R$ is the radius of convergence of the series; more concretely, the power series converges to the function for all $x$ satisfying $\displaystyle\left|\,x-a\, \right| < R$, where $R\in [0,+\infty]$. $\hspace{1cm}\blacksquare$
In your example, for $\displaystyle f(x) = \frac{1}{x}$, we have:
$\displaystyle f(x) = \sum\limits_{k=0}^{+\infty} \frac{(-1)^k}{a^{k+1}}(x-a)^k$
so $\displaystyle c_k = \frac{(-1)^k}{a^{k+1}}$ (indeed $c_k = \frac{f^{(k)}(a)}{k!} = \frac{(-1)^k k!/a^{k+1}}{k!}$).
Now for the radius of convergence:
$\displaystyle\frac{1}{R} = \limsup\limits_{k\rightarrow\infty} \sqrt[k]{\frac{1}{|a|^{k+1}}} = \frac{1}{|a|} \implies R = |a|.$
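As a quick numerical sketch of my own (the choice $a=2$ is arbitrary, not from the question), you can watch $|c_k|^{1/k}$ approach $1/|a|$ as Cauchy-Hadamard predicts:

```python
# Sketch: Cauchy-Hadamard applied to c_k = (-1)^k / a^(k+1).
# |c_k|^(1/k) = |a|^(-(k+1)/k) should tend to 1/|a|, giving R = |a|.
a = 2.0  # arbitrary nonzero center, chosen for illustration
roots = []
for k in (10, 100, 1000):
    root = (1 / abs(a) ** (k + 1)) ** (1 / k)  # this is |c_k|^(1/k)
    roots.append(root)
    print(k, root)
# the printed values tend to 1/|a| = 0.5
```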
So the series converges for: $|x-a| < |a|
\Longleftrightarrow \left\{
\begin{array}{l l}
0 < x < 2a & \quad \text{for $a>0$}\\
2a < x < 0 & \quad \text{for $a<0$}
\end{array} \right.$
Now, you wanted $a = 1$, so our series is convergent for $\displaystyle\boxed{0 < x < 2}$. For all other $x$, the series is divergent.
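You can check this behavior directly with a few partial sums (a sketch of my own, with sample points $0.5$, $1.9$, $2.5$ chosen just for illustration):

```python
# Sketch: partial sums of sum_k (-1)^k (x-1)^k, the Taylor series of 1/x
# about a = 1, inside and outside the interval of convergence (0, 2).
def partial_sum(x, n=60):
    return sum((-1) ** k * (x - 1) ** k for k in range(n))

inside = partial_sum(0.5)    # close to 1/0.5 = 2
edgeish = partial_sum(1.9)   # still inside (0, 2); converges, more slowly
outside = partial_sum(2.5)   # outside (0, 2): partial sums blow up
print(inside, edgeish, abs(outside))
```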
My professor used to say:
You might want to do calculus in $\Bbb{R}$, but the functions themselves naturally live in $\Bbb{C}$. Euler was the first to discover that if you don't look at what they do everywhere in the complex plane, you don't really understand their habits.
This is as subjective as it gets, but it has always helped my intuition. In particular, you might think that some function is doing nothing wrong, so it should be analytic. Well, if it does nothing wrong in $\Bbb{R}$, look at what it does in $\Bbb{C}$! If it also does nothing wrong in $\Bbb{C}$, then it is analytic. If it makes some mess in $\Bbb{C}$, then you have to be careful in $\Bbb{R}$ as well. To quote my professor again:
Even in $\Bbb{R}$, and in the most practical and applied problems, you can hear distant echoes of the complex behavior of the functions. It's their nature; you can't change it.
Best Answer
Surprisingly, the answers to your questions require some complex analysis, even though your questions are ostensibly about real-valued functions of a real variable.
That said:
1:
Taylor's theorem with the Lagrange remainder says that if $\lim_{n \to \infty} \frac{|f^{(n)}(x_1)|}{n!} |x-x_0|^n = 0$ whenever $x \in (a,b)$ and $x_1$ is between $x$ and $x_0$, then the Taylor series centered at $x_0 \in (a,b)$ converges to $f$ at all points of $(a,b)$. However, this condition can be extremely hard to check in practice.
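One case where the criterion is easy to check is $f(x) = e^x$ about $x_0 = 0$: there $f^{(n)}(x_1) = e^{x_1}$, so the remainder term is bounded by $e^{|x|}|x|^n/n!$, which vanishes as $n \to \infty$ for every fixed $x$. A small sketch (the point $x = 5$ is an arbitrary choice):

```python
# Sketch: Lagrange-remainder criterion for f(x) = e^x about x_0 = 0.
# Since f^(n)(x_1) = e^(x_1) <= e^|x| for x_1 between 0 and x, the
# remainder bound e^|x| * |x|^n / n! shrinks to 0 as n grows.
import math

x = 5.0  # arbitrary evaluation point
bounds = [math.exp(x) * x ** n / math.factorial(n) for n in (10, 50, 100)]
print(bounds)  # the factorial in the denominator wins eventually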
2:
Complex analysis tells us that the radius of convergence of the Taylor series at a given $x_0$ is the distance from $x_0$ to the nearest complex singularity. For $\operatorname{Log}(z)$ (the principal complex logarithm) for example this is the distance from $x_0$ to $0$, which is a singularity of $\operatorname{Log}$ because it is a branch point. Thus for instance $\ln(1+x)$ centered at $x_0=5$ has radius of convergence $6$.
Complex analysis also tells us that if $f$ is complex differentiable on a disk of radius $r$ centered at $z_0$, then the Taylor series of $f$ at $z_0$ converges to $f$ on a disk of radius $R=\min \left \{ r,\left ( \limsup_{n \to \infty} \left ( \frac{|f^{(n)}(z_0)|}{n!} \right )^{1/n} \right )^{-1} \right \}$, where $0^{-1}$ is understood as $+\infty$. We can't freely drop this "if $f$ is complex differentiable..." assumption. If we do drop it, we can still say the series converges on a disk of radius $\left ( \limsup_{n \to \infty} \left ( \frac{|f^{(n)}(z_0)|}{n!} \right )^{1/n} \right )^{-1}$ but we cannot be sure that the limit is actually $f$.
3:
I think this is answered by my answer to #2.
This gets much more counterintuitive with functions that look completely innocuous on the real line, such as $\frac{1}{x^2+1}$. Here the series at $x_0=0$ has radius of convergence $1$ even though the restriction to the real line has no singularities anywhere. In some sense the restriction "knows" about the singularity at $\pm i$.