It's not that your understanding is wrong; it's more that the wording of the question suggests that the definitions, properties, applications, and "cultural context" of power series are a bit tangled.
For definiteness, let's work over the real numbers. If $x_{0}$ is a real number, and if $(a_{k})_{k=0}^{\infty}$ is an arbitrary sequence of real numbers, the associated power series with coefficients $(a_{k})$ and center $x_{0}$ is the expression
$$
\sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k}.
\tag{1}
$$
Power series make sense as purely algebraic entities. For example, two power series (with the same center) can be added termwise and multiplied using the "Cauchy product":
\begin{gather*}
\sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k} + \sum_{k=0}^{\infty} b_{k} (x - x_{0})^{k}
= \sum_{k=0}^{\infty} (a_{k} + b_{k}) (x - x_{0})^{k}, \\
\left(\sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k}\right) \left(\sum_{k=0}^{\infty} b_{k} (x - x_{0})^{k}\right)
= \sum_{k=0}^{\infty} \left(\sum_{j=0}^{k} a_{j} b_{k - j}\right) (x - x_{0})^{k}.
\end{gather*}
These definitions make sense whether or not the individual series converge (see the next item). The crucial point is that each coefficient of the sum or product is a finite algebraic expression in the coefficients of the "operands".
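To make the purely algebraic nature of these operations concrete, here is a minimal sketch in Python, representing a series by a truncated list of coefficients. The helper names `add_series` and `cauchy_product` are illustrative, not standard:

```python
# A power series (centered anywhere) is represented here by its first
# n coefficients [a_0, a_1, ..., a_{n-1}].  No convergence is assumed:
# each output coefficient is a finite expression in the inputs.

def add_series(a, b):
    """Termwise sum of two coefficient lists of equal length."""
    return [x + y for x, y in zip(a, b)]

def cauchy_product(a, b):
    """Cauchy product: c_k = sum_{j=0}^{k} a_j * b_{k-j}."""
    n = min(len(a), len(b))
    return [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n)]

# Example: the geometric series 1/(1-x) has coefficients [1, 1, 1, ...];
# its square 1/(1-x)^2 has coefficients [1, 2, 3, ...].
geom = [1] * 6
print(cauchy_product(geom, geom))  # [1, 2, 3, 4, 5, 6]
```

Note that `cauchy_product` needs only the first $k+1$ coefficients of each operand to produce $c_{k}$, which is exactly the "finite algebraic expression" point above.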
For each real number $x$, the power series (1) is an infinite series of real numbers, which may converge (the sequence of partial sums
$$
s_{n}(x) = \sum_{k=0}^{n} a_{k} (x - x_{0})^{k}
$$
converges to a finite limit) or diverge (otherwise). Clearly, (1) converges for $x = x_{0}$. It's not difficult to show that:
a. If (1) converges for some $x$ with $|x - x_{0}| = r$, then (1) converges for every $x$ with $|x - x_{0}| < r$;
b. If (1) diverges for some $x$ with $|x - x_{0}| = r$, then (1) diverges for every $x$ with $|x - x_{0}| > r$.
It follows that for every power series (1), there exists an extended non-negative real number $R$ (i.e., $0 \leq R \leq \infty$) such that (1) converges for $|x - x_{0}| < R$ and diverges for $|x - x_{0}| > R$. (If $R = 0$, the former condition is empty; if $R = \infty$, the latter is empty.) This $R$ is called the radius of convergence of the power series (1). The power series
$$
\sum_{k=0}^{\infty} k!\, x^{k},\qquad
\sum_{k=0}^{\infty} \frac{x^{k}}{R^{k}},\qquad
\sum_{k=0}^{\infty} \frac{x^{k}}{k!}
$$
have radii $0$, $R$, and $\infty$, respectively.
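When the limit exists, the ratio $|a_{k}/a_{k+1}|$ tends to the radius of convergence, so these three radii can be seen numerically. A small sketch (the helper `ratio_estimate` is illustrative, and the middle series uses the concrete value $R = 2$):

```python
from math import factorial

def ratio_estimate(coeff, k):
    """The ratio |a_k / a_{k+1}|, which tends to the radius of
    convergence when the limit exists."""
    return abs(coeff(k) / coeff(k + 1))

# sum k! x^k : ratio is k!/(k+1)! = 1/(k+1) -> 0, so the radius is 0.
print(ratio_estimate(lambda k: factorial(k), 50))      # 1/51 ~ 0.0196

# sum x^k / 2^k : ratio is 2^{k+1}/2^k = 2 for every k, so the radius is 2.
print(ratio_estimate(lambda k: 1 / 2**k, 50))          # 2.0

# sum x^k / k! : ratio is (k+1)!/k! = k+1 -> infinity, so the radius is
# infinite.
print(ratio_estimate(lambda k: 1 / factorial(k), 50))  # 51.0
```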
If (1) has positive radius (i.e., $0 < R$), the sum of the series defines a function
$$
p(x) = \sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k}
$$
whose domain is the set of $x$ with $|x - x_{0}| < R$. In this region, $p$ is infinitely differentiable, and its derivatives are found by termwise differentiation, e.g.,
$$
p'(x) = \sum_{k=1}^{\infty} ka_{k} (x - x_{0})^{k-1}
= \sum_{k=0}^{\infty} (k + 1) a_{k+1} (x - x_{0})^{k}.
$$
Each derivative is itself a power series, convergent in the same region.
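Termwise differentiation is easy to check numerically. The sketch below (helper names `differentiate` and `evaluate` are illustrative) applies the rule $b_{k} = (k+1)a_{k+1}$ to the exponential series, whose derivative is itself:

```python
from math import exp, factorial

def differentiate(a):
    """Coefficients of p' given coefficients a of p: b_k = (k+1) a_{k+1}."""
    return [(k + 1) * a[k + 1] for k in range(len(a) - 1)]

def evaluate(a, x):
    """Partial sum sum_k a_k x^k, via Horner's rule."""
    s = 0.0
    for c in reversed(a):
        s = s * x + c
    return s

# exp(x) = sum x^k / k!: termwise differentiation gives
# (k+1)/(k+1)! = 1/k!, i.e., the same series back.
a = [1 / factorial(k) for k in range(20)]
x = 0.5
print(evaluate(differentiate(a), x), exp(x))  # both ~ 1.6487...
```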
A function $f$ is analytic at $x_{0}$ if $f$ is represented by a convergent power series in some open interval about $x_{0}$, and is analytic if $f$ is analytic at $x_{0}$ for every interior point $x_{0}$ of the domain of $f$.
If $f$ is infinitely-differentiable at some point $x_{0}$, the Taylor series of $f$ at $x_{0}$ is the power series
$$
\sum_{k=0}^{\infty} \frac{f^{(k)}(x_{0})}{k!} (x - x_{0})^{k}.
$$
As is well known, a function can be infinitely differentiable at $x_{0}$ without the Taylor series converging to $f$ in any open interval about $x_{0}$. The standard example is $f(x) = e^{-1/x^{2}}$ if $x \neq 0$, and $f(0) = 0$, whose Taylor series at $0$ is identically zero.
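The mismatch is easy to see numerically: every Taylor partial sum of this flat function at $0$ is identically zero, yet the function itself is strictly positive away from $0$. A minimal sketch:

```python
from math import exp

def f(x):
    """The standard flat function: e^{-1/x^2} for x != 0, and 0 at x = 0.
    Every derivative at 0 vanishes, so its Taylor series at 0 is 0."""
    return exp(-1.0 / x**2) if x != 0 else 0.0

for x in (0.5, 0.2, 0.1):
    # f(x) > 0, while the Taylor series at 0 predicts 0 at every x.
    print(x, f(x))
```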
Incidentally, being transcendental has nothing to do with analyticity. As Paramanand Singh notes, the defining property of a transcendental function is not "requires an infinite series" (i.e., "is not a polynomial"), but "does not satisfy a polynomial equation in two variables" (i.e., "is not an algebraic function").
Power series are important for many reasons. The most common rationale for introducing them in calculus is to obtain algebraic/analytic expressions for the exponential function (and for the closely related circular and hyperbolic functions, and for power functions with non-integer exponents), and so forth. For example, from the exponential power series one obtains
$$
e = \exp(1) = \sum_{k=0}^{\infty} \frac{1}{k!},
$$
from which one can easily show $e$ is irrational.
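The partial sums of this series converge extremely fast; exact rational arithmetic (Python's `fractions` module) makes this easy to verify:

```python
from fractions import Fraction
from math import e, factorial

# Partial sum s_n = sum_{k=0}^{n} 1/k!, computed exactly as a rational.
# The tail is smaller than 2/(n+1)!, so few terms give full double precision.
s = Fraction(0)
for k in range(18):
    s += Fraction(1, factorial(k))

print(float(s), e)  # both ~ 2.718281828459045
```

The same rapid decay of the tail is what drives the classical irrationality proof: if $e = p/q$, then $q!\,(e - s_{q})$ would be a positive integer smaller than $1$.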
Another application is to obtain power series solutions of linear ordinary differential equations with non-constant coefficients. Power series are by no means the only infinite series useful for studying functions; Fourier series and wavelets come immediately to mind.
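As a small illustration of the ODE application (my choice of equation, not one from the question): for $y' = xy$ with $y(0) = 1$, substituting $\sum a_{k} x^{k}$ and comparing coefficients gives the recurrence $(k+1)\,a_{k+1} = a_{k-1}$ with $a_{1} = 0$, and the resulting series sums to the known solution $e^{x^{2}/2}$:

```python
from math import exp

# Power-series solution of y' = x*y, y(0) = 1 (exact solution: e^{x^2/2}).
# Matching coefficients of x^k gives (k+1) a_{k+1} = a_{k-1} for k >= 1.
n = 30
a = [0.0] * n
a[0] = 1.0  # initial condition y(0) = 1
a[1] = 0.0  # the x^0 coefficient of y' = x*y forces a_1 = 0
for k in range(1, n - 1):
    a[k + 1] = a[k - 1] / (k + 1)

x = 0.7
y = sum(c * x**k for k, c in enumerate(a))
print(y, exp(x**2 / 2))  # both ~ 1.2776...
```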
As for generalizations: Power series with complex coefficients (and complex centers) make perfect sense; the wordings and notation above are chosen to minimize the modifications required to consider complex power series. There are useful notions of matrix-valued power series, operator-valued power series, etc.
Complex power series exhibit interesting behavior (such as monodromy) not encountered over the reals, because a connected open set in the plane need not be simply-connected. Further, every holomorphic (i.e., complex-differentiable) function of one complex variable is automatically analytic. The logical strength of being "once complex-differentiable" gives complex analysis a completely different flavor from real analysis.
Best Answer
I think you should take another look at your derivative to make sure it is right (it is close, but not quite). You are on the right track: you have the form $\frac{1}{1+y}$, and you want $\frac{1}{1-y}$; what do you have to do to your $y$ term to get this? When someone told you to integrate, they meant integrate the power series, which you know you can do inside the interval of convergence. You are right that integrating gives you $\arctan$ back on one side, but on the other side you will have the integral of the power series, and that is precisely the power series for $\arctan$!
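If it helps, the end result of that computation can be sketched as follows: expanding $\frac{1}{1+x^{2}} = \frac{1}{1-(-x^{2})} = \sum_{k=0}^{\infty} (-x^{2})^{k}$ for $|x| < 1$ and integrating termwise from $0$ gives $\arctan x = \sum_{k=0}^{\infty} \frac{(-1)^{k} x^{2k+1}}{2k+1}$ (the helper name `arctan_series` is mine):

```python
from math import atan

def arctan_series(x, n=200):
    """Partial sum of the integrated geometric series for arctan,
    valid for |x| < 1: sum (-1)^k x^{2k+1} / (2k+1)."""
    return sum((-1)**k * x**(2 * k + 1) / (2 * k + 1) for k in range(n))

x = 0.5
print(arctan_series(x), atan(x))  # both ~ 0.46364...
```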