I don't think there's a very definite question up there, but I understand why it might have been hard for you to pinpoint exactly what was troubling you, or to express it clearly even supposing you could pinpoint it.
So I'll answer the titular question (by explaining the link between sequences and series), and then talk about summability.
First of all, a ramble about terminology. The word sequence in modern mathematics is often reserved exclusively for a succession of objects -- thus a sequence of steps, a sequence of notes, a sequence of operations, a sequence of numbers, a sequence of sequences, and so on. In nonmathematical speech and in older mathematics, however, this word was used more or less synonymously with the word series, which in those contexts meant much the same thing. So we have a series of tones, a series of blasts, etc. In contemporary mathematics, by contrast, the word series is often reserved for a particular type of sequence that has become ubiquitous in mathematics -- a sequence of additions. Thus by a series is meant a sequence of addition operations. For example, start with any real number you like, add to it any real number you like (positive, negative, or zero), then add another to the sum, then another to the new sum, and so on, and you have what is called a series. What I have just done is explain in words what is meant by the (quite informal) symbol $$a_1+a_2+a_3+a_4+a_5+a_6+\cdots,$$ which is more formally written as $$\sum_{i\ge 1}{a_i}.$$ That is, a series is just a way of representing a sequence of cumulative sums, the sum in the $i$th position of the sequence consisting of $i$ summands. This sequence is called the sequence of partial sums of the series.
In this way you see the link between series and sequences: every series is actually a sequence of sums, as described above; on the flip side, every sequence can be thought of as a series, namely the one obtained by adding to its first term the successive differences between consecutive terms.
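To make both directions of this correspondence concrete, here is a minimal Python sketch (the function names are my own, not standard terminology):

```python
# Any sequence can be rebuilt as a series: its terms are recovered as
# the partial sums of (first term, then consecutive differences).
def sequence_to_series_terms(seq):
    """Return summands whose partial sums reproduce seq."""
    return [seq[0]] + [seq[i] - seq[i - 1] for i in range(1, len(seq))]

def partial_sums(terms):
    """The sequence of cumulative sums of a series' terms."""
    sums, total = [], 0
    for t in terms:
        total += t
        sums.append(total)
    return sums

seq = [2, 5, 4, 10, 10]
terms = sequence_to_series_terms(seq)   # [2, 3, -1, 6, 0]
assert partial_sums(terms) == seq       # round trip: series -> sequence
```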
Finally, we can talk about infinite series; that simply means the addition never ends. The question of convergence arises because such a series does not always yield a number as a result, even when all its terms are numbers. Infinite series exhibit different behaviors: some simply accumulate to become larger than any number; some have no predictable behavior at all, changing course every now and then. The definition you were given pins down those series of numbers that -- though infinite -- add up to a real number. These are the ones called convergent, or summable.
The definition simply asks you to consider the sequence of partial sums of the series and to check whether this sequence converges (becomes stable and settles to a real number); if so, the series itself is also called convergent, or summable.
In sum, it's all about sequences. A series is just a sequence of additions, or equivalently a sequence of partial sums; conversely, every sequence of addable objects (in particular, of numbers) can be decomposed into a series.
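As a numeric illustration of convergent versus non-convergent partial sums (the helper names and the two example series are my own choices):

```python
# Partial sums of the geometric series 1 + 1/2 + 1/4 + ... settle
# toward 2, while those of the harmonic series 1 + 1/2 + 1/3 + ...
# keep growing: the first series is summable, the second is not.
def partial_sum(term, n):
    """Sum of the first n terms of the series with k-th term term(k)."""
    return sum(term(k) for k in range(1, n + 1))

geometric = lambda k: 1 / 2 ** (k - 1)
harmonic = lambda k: 1 / k

print(partial_sum(geometric, 50))  # very close to 2
print(partial_sum(harmonic, 50))   # still climbing, around 4.5
```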
Even though you call it the "Geometric Series Test," the actual argument your proof describes is clearly the Ratio Test:
For example,
$\sum_{n=1}^\infty \frac{x^n}{n^4 4^n} = \sum_{n=1}^\infty \frac{1}{n^4} \left( \frac{x}{4} \right)^n$
The "common ratio" is $r = \frac{x}{4}$
since it's the factor being raised to the power $n$.
Here, $a_n = \frac{x^n}{n^4 4^n}$, so applying the Ratio Test gives $$r = \lim_{n \to \infty} \frac{a_{n+1}}{a_n} = \lim_{n \to \infty} \frac{x^{n+1}}{(n+1)^4\,4^{n+1}} \cdot \frac{n^4\,4^n}{x^n} = \left( \frac{x}{4} \right) \lim_{n \to \infty} \frac{n^4}{(n+1)^4} = \frac{x}{4}.$$ The Ratio Test and the Root Test are both based on (and proven via) the condition for convergence of a geometric series. So it's not surprising that "pretending the series is geometric" works when the complicating factor is a rational function of $n$ such as $\frac{1}{n^4}$: that factor multiplies the limit by $1$ in either the Ratio Test or the Root Test.
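A quick numeric check of this limit (the function name `a` mirrors $a_n$ above; the choice $x = 1$ is mine, and exact rationals are used to avoid overflow in $4^n$):

```python
from fractions import Fraction

# Ratio Test applied numerically to a_n = x^n / (n^4 * 4^n) at x = 1:
# the ratio a_{n+1}/a_n should approach x/4 = 0.25 as n grows.
def a(n, x=Fraction(1)):
    return x ** n / (Fraction(n) ** 4 * Fraction(4) ** n)

ratios = [float(a(n + 1) / a(n)) for n in (10, 100, 1000)]
print(ratios)  # climbs toward 0.25
```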
1) They are the same function, so they have the same power series.
2) In this answer, it is shown that for the generalized binomial theorem, we have for negative exponents, $$ \binom{-n}{k}=(-1)^k\binom{n+k-1}{k} $$ Thus, we have $$ \begin{align} (a+x)^{-3} &=a^{-3}\left(1+\frac xa\right)^{-3}\\ &=a^{-3}\sum_{k=0}^\infty\binom{-3}{k}\left(\frac xa\right)^k\\ &=a^{-3}\sum_{k=0}^\infty(-1)^k\binom{k+2}{k}\left(\frac xa\right)^k\\ &=\sum_{k=0}^\infty(-1)^k\binom{k+2}{2}\frac{x^k}{a^{k+3}} \end{align} $$ The same can be done for fractional exponents, but the formulas for the coefficients are more complicated.
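A numeric sanity check of this expansion, including the alternating sign coming from $\binom{-3}{k}=(-1)^k\binom{k+2}{k}$ (the function name and the sample values $a=2$, $x=1/2$ are my own choices):

```python
from math import comb

# Truncated generalized binomial series for (a+x)^(-3), compared to the
# closed form; the series converges for |x| < |a| (here a=2, x=0.5).
def binom_series(a, x, terms=50):
    return sum((-1) ** k * comb(k + 2, 2) * x ** k / a ** (k + 3)
               for k in range(terms))

approx = binom_series(2.0, 0.5)
exact = (2.0 + 0.5) ** -3
print(approx, exact)  # the two agree closely
```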
3) In the answer to 2), we factored out the $a^{-3}$ so that one term of the sum was $1$. This allows us to use the binomial theorem in an open-ended way; that is, we don't need to worry about the exponent $n-k$ on the other term, since $1^{n-k}=1$. In particular, the generalized binomial theorem reads $$ (1+x)^n=\sum_{k=0}^\infty\binom{n}{k}x^k $$ where $$ \binom{n}{k}=\frac{n(n-1)(n-2)\cdots(n-k+1)}{k!} $$ Furthermore, if $n$ is not a non-negative integer, the binomial expansion does not terminate. In that case, the series for $$ (a+x)^n=a^n\left(1+\frac xa\right)^n $$ converges for $|x|\lt|a|$.
Extension for $\boldsymbol{|x|\gt|a|}$
We can extend the convergence of a series for $(a+x)^n$ to $|x|\gt|a|$ if we allow Laurent expansions and write $$ (a+x)^n=x^n\left(1+\frac ax\right)^n $$ Using the same example as above, $$ \begin{align} (a+x)^{-3} &=x^{-3}\left(1+\frac ax\right)^{-3}\\ &=x^{-3}\sum_{k=0}^\infty\binom{-3}{k}\left(\frac ax\right)^k\\ &=x^{-3}\sum_{k=0}^\infty(-1)^k\binom{k+2}{k}\left(\frac ax\right)^k\\ &=\sum_{k=0}^\infty(-1)^k\binom{k+2}{2}\frac{a^k}{x^{k+3}} \end{align} $$ which converges for $|x|\gt|a|$.
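The Laurent-type expansion can be checked numerically the same way (the function name and the sample values $a=1/2$, $x=2$ are my own choices; note $|x|\gt|a|$ here):

```python
from math import comb

# Truncated expansion of (a+x)^(-3) in powers of a/x, compared to the
# closed form; this form converges for |x| > |a| (here a=0.5, x=2).
def laurent_series(a, x, terms=50):
    return sum((-1) ** k * comb(k + 2, 2) * a ** k / x ** (k + 3)
               for k in range(terms))

approx = laurent_series(0.5, 2.0)
exact = (0.5 + 2.0) ** -3
print(approx, exact)  # the two agree closely
```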