$\frac{1}{x^2} $ using geometric series vs. Taylor


So I have the function $f(x) = \frac{1}{x^2}$ and I want to represent it as a Taylor series centered at $x = 1$ (and then evaluate it at $x = 1.02$).

I did it using the standard Taylor series method:

$$ \sum_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n = 1 - 2(x-1) + 3(x-1)^2 - 4(x-1)^3 + \cdots $$
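
For reference, the coefficients above come from the derivatives of $f$:

$$ f^{(n)}(x) = \frac{(-1)^n\,(n+1)!}{x^{\,n+2}}, \qquad \frac{f^{(n)}(1)}{n!} = (-1)^n (n+1), $$

so the series is $\sum_{n=0}^\infty (-1)^n (n+1)(x-1)^n$.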

And I did it using algebraic manipulations to get it to match a geometric series:

$$ \frac{1}{x^2}=\frac{1}{1-1+x^2} = \frac{1}{1-(1-x^2)} = \sum_{n=0}^\infty (1-x^2)^{n} $$
Or:

$$ \frac{1}{x^2}=\frac{1}{1-1+x^2} = \frac{1}{1-(-(x^2-1))} = \sum_{n=0}^\infty (-(x^2-1))^{n} = \sum_{n=0}^\infty (-1)^n(x^2-1)^{n} $$

So these two end up being pretty close when comparing the partial sums $T_n(x)$, but they're not the same. Did I do something wrong?
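
For concreteness, here is a quick Python check of the partial sums at $x = 1.02$ (just a sketch; the helper names `taylor_partial` and `geometric_partial` are mine):

```python
# Partial sums of the Taylor series of 1/x^2 centered at 1,
# compared with partial sums of the geometric-series form, at x = 1.02.

x = 1.02
exact = 1 / x**2

def taylor_partial(N, x):
    """T_N(x) = sum_{n=0}^{N} (-1)^n (n+1) (x-1)^n."""
    return sum((-1)**n * (n + 1) * (x - 1)**n for n in range(N + 1))

def geometric_partial(N, x):
    """S_N(x) = sum_{n=0}^{N} (1 - x^2)^n."""
    return sum((1 - x**2)**n for n in range(N + 1))

for N in range(5):
    print(f"N={N}: Taylor={taylor_partial(N, x):.10f}  geometric={geometric_partial(N, x):.10f}")
print(f"exact:  {exact:.10f}")
```

Both sequences approach $1/1.02^2 \approx 0.9611688$, but the individual partial sums are not identical.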
Thank you for your help!

Best Answer

The second approach is wrong in the sense that $\sum_{n=0}^\infty(1-x^2)^n$ is not a power series centered at $1$: its terms are powers of $1-x^2$, not of $x-1$, so its partial sums are not the Taylor polynomials $T_n(x)$.
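
To see concretely why the partial sums differ, write $u = x-1$, so $1-x^2 = -2u - u^2$, and expand one truncation of the geometric form:

$$ 1 + (1-x^2) + (1-x^2)^2 = 1 - 2(x-1) + 3(x-1)^2 + 4(x-1)^3 + (x-1)^4, $$

which agrees with the Taylor expansion only up to the quadratic term (the Taylor series has $-4(x-1)^3$ next). Since $(1-x^2)^n$ vanishes to order $n$ at $x=1$, the $N$-th partial sum of the geometric form matches the Taylor series through order $N$ but not beyond, which is why the two truncations at $x = 1.02$ are close but not equal.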
