Essentially, such a series is alternating: for each $x\in D$, the series $\sum\limits_{n=1}^\infty{(-1)^{n+1}}f_n(x)$ is a convergent alternating series. From this and from 3) (note that 3) says exactly that the sequence $(f_n)$ converges uniformly to $0$), it will follow that the series is uniformly Cauchy on $D$ and thus uniformly convergent on $D$.
Let $\alpha_n=\sup\limits_{x\in D} f_n(x)$.
Then for any positive integers $m$ and $n$ with $m\ge n$ and any $x\in D$, the alternating series estimate (the absolute value of a tail of a convergent alternating series with decreasing terms is at most its first term) applied to $\sum\limits_{n=1}^\infty{(-1)^{n+1}}f_n(x)$ gives
$$\tag{1}
\Biggl|\,{(-1)^{n+1} } f_n(x)+{(-1)^{n+2} }f_{n+1}(x)+\cdots+{ (-1)^{m+1} }f_m(x)\,\Biggr|\ \le\ f_n(x)\le \alpha_n .
$$
The term on the right-hand side of $(1)$ is independent of $x$ and can be made as small as desired, since $\lim\limits_{n\rightarrow\infty }\alpha_n=0$. It follows that the series $\sum\limits_{n=1}^\infty{(-1)^{n+1}}f_n(x)$ is uniformly Cauchy on $D$, and thus uniformly convergent on $D$.
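As a numerical sanity check of estimate $(1)$, here is a minimal sketch. The choice $f_n(x)=\frac1{n+x}$ on $D=[0,1]$ is an illustrative assumption (not from the text above): it is decreasing in $n$ for each $x$, and $\alpha_n=\sup_{x\in[0,1]}f_n(x)=\frac1n\to0$.

```python
# Hypothetical example: f_n(x) = 1/(n + x) on D = [0, 1].
# (f_n) decreases in n for each x, and alpha_n = sup_x f_n(x) = 1/n -> 0.
def f(n, x):
    return 1.0 / (n + x)

def alternating_tail(n, m, x):
    # (-1)^{n+1} f_n(x) + (-1)^{n+2} f_{n+1}(x) + ... + (-1)^{m+1} f_m(x)
    return sum((-1) ** (k + 1) * f(k, x) for k in range(n, m + 1))

# Check bound (1): |tail| <= f_n(x) <= alpha_n, on sample points of D.
for n in range(1, 20):
    alpha_n = 1.0 / n  # sup over [0,1] of 1/(n+x), attained at x = 0
    for m in range(n, n + 30):
        for x in [0.0, 0.25, 0.5, 0.75, 1.0]:
            t = abs(alternating_tail(n, m, x))
            assert t <= f(n, x) + 1e-12 <= alpha_n + 1e-12
```

The check only samples finitely many points, so it illustrates $(1)$ rather than proving it.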
Recall that a sequence $( g_n)$ of real-valued functions is uniformly Cauchy on a set $A$ if for any $\epsilon>0$, there exists a positive integer $N$ so that whenever $n$ and $m$ are positive integers with $m,n\ge N$ we have $\bigl|g_n(x)-g_m(x)\bigr|<\epsilon$ for all $x\in A$.
An easily proven result is that a sequence $( g_n)$ of real-valued functions is uniformly convergent on $A$ if and only if it is uniformly Cauchy on $A$.
This result phrased for a series of functions would read as follows: a series $\sum\limits_{n=1}^\infty g_n$ of real valued functions converges uniformly on $A$ if and only if its sequence of partial sums is uniformly Cauchy on $A$.
We say a series of functions is uniformly Cauchy if its sequence of partial sums is.
Note that $\sum\limits_{n=1}^\infty g_n$ is uniformly Cauchy on $A$ if and only if for any $\epsilon>0$, there is a positive integer $N$ so that for $m\ge n\ge N$,
we have $\Bigl| \sum\limits_{k=n}^m g_k(x)\Bigr|<\epsilon$ for all $x\in A$.
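The Cauchy criterion for series can be sketched numerically: estimate $\sup_{x\in A}\bigl|\sum_{k=n}^m g_k(x)\bigr|$ on a sample grid and check that it drops below a given $\epsilon$ once $n\ge N$. The choice $g_k(x)=(-1)^{k+1}x^k/k$ on $A=[0,1]$ is an illustrative assumption, not from the text; its alternating tail is bounded by $x^n/n\le 1/n$.

```python
# Illustrative series: g_k(x) = (-1)^{k+1} x^k / k on A = [0, 1]
# (its sum is ln(1 + x); the terms x^k/k decrease in k for x in [0, 1]).
def g(k, x):
    return (-1) ** (k + 1) * x**k / k

def sup_block(n, m, grid):
    # crude estimate of sup_{x in A} |g_n(x) + ... + g_m(x)| over sample points
    return max(abs(sum(g(k, x) for k in range(n, m + 1))) for x in grid)

grid = [i / 50 for i in range(51)]
eps = 0.05
N = 21  # the alternating tail is at most 1/n, and 1/21 < 0.05
for n in range(N, N + 10):
    for m in range(n, n + 20):
        assert sup_block(n, m, grid) < eps
```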
I would do it as follows: if your series were uniformly convergent on $(0,1)$, then the sequence $\bigl(f_n(x)\bigr)_{n\in\mathbb N}$, where $f_n(x)=\frac1{(nx)^2+1}$, would converge uniformly to the null function. But it doesn't, since $$(\forall n\in\mathbb N):f_n\left(\frac1n\right)=\frac12.$$
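A quick numerical check of this obstruction (a sketch; the sample points are my own choice):

```python
# f_n(x) = 1/((n x)^2 + 1) does not converge uniformly to 0 on (0, 1),
# because f_n(1/n) = 1/2 for every n, so sup_{(0,1)} f_n >= 1/2.
def f(n, x):
    return 1.0 / ((n * x) ** 2 + 1)

for n in range(2, 1000):  # n >= 2 so that 1/n lies in (0, 1)
    assert abs(f(n, 1.0 / n) - 0.5) < 1e-12
```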
On the other hand, if $N\in\mathbb N$, then$$g(N)=\sum_{n=1}^\infty\frac1{(Nn)^2+1}<\sum_{n=1}^\infty\frac1{(Nn)^2}\le\sum_{n=1}^\infty\frac1{(N+n-1)^2}=\sum_{n=N}^\infty\frac1{n^2}\to_{N\to\infty}0,$$where the second inequality uses $Nn\ge N+n-1$ (which holds because $(N-1)(n-1)\ge0$). This, together with the fact that $g$ is decreasing on $(1,\infty)$ (since it's the sum of decreasing functions), shows that $\lim_{x\to\infty}g(x)=0.$
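The decay and monotonicity of $g$ can be checked numerically as well; this is a sketch that approximates $g$ by truncating the series (the truncation length is an arbitrary choice of mine):

```python
# g(x) = sum_{n>=1} 1/((x n)^2 + 1), approximated by truncation.
def g(x, terms=100_000):
    return sum(1.0 / ((x * n) ** 2 + 1) for n in range(1, terms + 1))

vals = [g(float(N)) for N in (1, 2, 5, 10, 50)]
assert all(a > b for a, b in zip(vals, vals[1:]))  # g is decreasing
assert vals[-1] < 0.01  # g(50) is well below the tail bound sum_{n>=50} 1/n^2
```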
Best Answer
In view of the definition of $f(x)$ we have $$ \left| \sum\limits_{k=1}^n f_k (x) - f(x) \right| = \left| \sum\limits_{k=n+1}^\infty f_k (x) \right| \leq \sum\limits_{k=n+1}^\infty |f_k(x)| \leq \sum\limits_{k=n+1}^\infty u_k \to 0 \text{ as } n\to \infty. $$ The last estimate is independent of $x$, hence the claim.
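This tail estimate (the Weierstrass M-test) can be illustrated with a concrete sketch. The choice $f_k(x)=\sin(kx)/k^2$ with $u_k=1/k^2$ is a hypothetical example of mine, and a long truncation stands in for the limit function $f$:

```python
import math

# Hypothetical example for the M-test: |f_k(x)| = |sin(k x)| / k^2 <= u_k = 1/k^2.
def f_k(k, x):
    return math.sin(k * x) / k**2

def partial(n, x):
    return sum(f_k(k, x) for k in range(1, n + 1))

def f_limit(x, big=100_000):
    return partial(big, x)  # truncation stands in for the limit function f

# The uniform error |partial(n, x) - f(x)| is bounded by sum_{k>n} u_k,
# which does not depend on x.
n = 50
tail_bound = sum(1.0 / k**2 for k in range(n + 1, 100_001))
for x in [0.1, 1.0, 2.5, 3.0]:
    assert abs(partial(n, x) - f_limit(x)) <= tail_bound + 1e-12
```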