The Taylor polynomial does not always tell you the value of $f(x)$ everywhere; there is a radius of convergence. If you look at the Taylor series of $\ln x$ centered at $a$, it only converges when $|x-a| < a$, which makes sense: as $x$ gets near zero, something has to break.
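A quick numeric sketch of this (the helper name `ln_taylor` is mine, and I'm using the standard series for $\ln x$ centered at $1$, whose radius of convergence is $1$):

```python
import math

def ln_taylor(x, terms):
    # Partial sum of the Taylor series of ln(x) centered at a = 1:
    # ln(x) = sum_{n>=1} (-1)^(n+1) (x-1)^n / n, valid for |x-1| < 1
    return sum((-1)**(n + 1) * (x - 1)**n / n for n in range(1, terms + 1))

# Inside the radius (|x-1| < 1): partial sums approach ln(x)
print(abs(ln_taylor(1.5, 50) - math.log(1.5)))  # tiny
# Outside the radius: the terms grow and the partial sums diverge
print(abs(ln_taylor(2.5, 50) - math.log(2.5)))  # huge
```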
You could say that if the function is analytic, i.e. its Taylor series actually converges to it, then all of its derivatives at a single point tell you everything you need to know about the function everywhere inside the radius of convergence. (Being merely smooth, i.e. infinitely differentiable, is not quite enough: $e^{-1/x^2}$ is smooth at $0$, yet every term of its Taylor series there is zero.)
Anyway, how do you know that the Taylor polynomial of a function will be near
that function for all $x$ in the domain?
Taylor showed that for an approximation of any degree, he could estimate the error.
Suppose $T_n(x)$ is the $n^{th}$-degree Taylor polynomial of some function $f(x)$;
e.g.
$T_3(x) = x - \frac 16 x^3$ is the third-degree Taylor approximation of $\sin x$ centered at $0$.
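You can check directly how close this cubic stays to $\sin x$ near $0$ (a small sketch; the name `T3` is mine):

```python
import math

def T3(x):
    # Third-degree Taylor polynomial of sin(x) centered at 0
    return x - x**3 / 6

# The approximation is excellent near 0 and degrades as x moves away
for x in (0.1, 0.5, 1.0):
    print(x, T3(x), math.sin(x), abs(T3(x) - math.sin(x)))
```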
This approximation has some error, and that error can be bounded by a single term of the next degree:
$|T_k(x) - f(x)| \le M_{k+1}\,|x-a|^{k+1}$
where $M_{k+1} = \frac {1}{(k+1)!} |f^{(k+1)}(\xi)|$
and $\xi$ is the point between $a$ and $x$ that maximizes the absolute value of the $(k+1)^{th}$ derivative.
If $x$ is inside the radius of convergence, $|x-a| < r \implies \lim\limits_{k\to \infty} M_{k+1}\,|x-a|^{k+1} = 0$, so the approximations converge to $f(x)$.
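For $\sin x$, every derivative is bounded by $1$ in absolute value, so the bound above is just $|x-a|^{k+1}/(k+1)!$, and you can watch it collapse to zero as $k$ grows (a sketch with $a = 0$, $x = 1$):

```python
import math

# Lagrange error bound for sin(x) about a = 0: |x|^(k+1) / (k+1)!,
# since |f^(k+1)| <= 1 for every derivative of sin
x = 1.0
for k in (1, 3, 5, 10, 20):
    bound = abs(x)**(k + 1) / math.factorial(k + 1)
    print(k, bound)
```

The factorial in the denominator eventually beats any fixed power of $|x-a|$, which is exactly why the limit above is $0$.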
Without using a calculator, a set of tables, etc., how would you find the value of $e^x$? For some people, the series $\sum_{n\ge 0} \frac{x^n}{n!}$ actually is the definition of $e^x$.
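That is exactly how you can compute it by hand or in code: sum enough terms of the series (the helper name `exp_series` is mine):

```python
import math

def exp_series(x, terms=30):
    # Partial sum of sum_{n>=0} x^n / n!, which some texts take
    # as the definition of e^x; each term is built from the last
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total

print(exp_series(1.0))  # ≈ 2.718281828...
```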
A more general use is extending the domain of a function, e.g. from $\mathbb{R}$ to $\mathbb{C}$.
Another is integrating a function that has no elementary antiderivative.
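The classic example is $e^{-x^2}$: it has no elementary antiderivative, but integrating its series term by term gives $\int_0^1 e^{-x^2}\,dx = \sum_{n\ge 0} \frac{(-1)^n}{n!\,(2n+1)}$, which is easy to evaluate (a sketch):

```python
import math

# Term-by-term integration of the series for e^(-x^2) over [0, 1]:
# integral = sum_{n>=0} (-1)^n / (n! * (2n+1))
approx = sum((-1)**n / (math.factorial(n) * (2 * n + 1)) for n in range(20))
print(approx)  # ≈ 0.746824...
```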
Best Answer
The first order Taylor approximation will be linear, second order will be quadratic, etc.
So you can think of the first order approximation as the tangent line to the true function at the point you are choosing to expand from.
Think of "order" as "degree" in Taylor's theorem.
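The tangent-line picture can be sketched in a few lines (the helper name `tangent_line` is mine):

```python
import math

def tangent_line(f, df, a):
    # First-order Taylor approximation of f at a:
    # the tangent line y = f(a) + f'(a) (x - a)
    return lambda x: f(a) + df(a) * (x - a)

# Tangent line to sin at a = 0 is just y = x
T1 = tangent_line(math.sin, math.cos, 0.0)
print(T1(0.1), math.sin(0.1))  # very close near a
```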