For (i), given $\epsilon > 0$ there exists $N \in \mathbb{N}$ such that for all $m > n > N$ and all $x \in X$ we have
$$\left|\sum_{k=n+1}^m \log (1 + g_k(x)) \right| \leqslant\sum_{k=n+1}^m |\log (1 + g_k(x))| \leqslant \frac{3}{2}\sum_{k=n+1}^m|g_k(x)| < \epsilon,$$
since $\sum |g_k(x)|$ converges uniformly and therefore satisfies the uniform Cauchy criterion. The middle inequality uses the standard estimate $|\log(1+z)| \leqslant \frac{3}{2}|z|$, valid for $|z| \leqslant \frac{1}{2}$.
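As a quick numerical sanity check (illustration only, not part of the proof), the estimate $|\log(1+z)| \leqslant \frac{3}{2}|z|$ on the disc $|z| \leqslant \frac{1}{2}$ can be probed at a few sample points:

```python
import cmath

def log_bound_holds(z):
    """Check the estimate |log(1+z)| <= (3/2)|z|, valid for |z| <= 1/2."""
    return abs(cmath.log(1 + z)) <= 1.5 * abs(z)

# Sample points in the closed disc |z| <= 1/2 (arbitrary choices for this sketch).
samples = [0.5, -0.5, 0.25j, 0.3 - 0.3j, 1e-6, -0.4 + 0.2j]
assert all(log_bound_holds(z) for z in samples)
```

The bound follows from the power series $\log(1+z) = \sum_{k \geqslant 1} (-1)^{k-1} z^k/k$, whose tail beyond the linear term is dominated by a geometric series when $|z| \leqslant \frac{1}{2}$.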
For (iii), the series $\sum_{n \geqslant 1} \log(1+g_n(x))$ converges uniformly if and only if $\sum_{n \geqslant n_0+1} \log(1+g_n(x))$ does, since adding or removing finitely many terms does not affect uniform convergence.
Also, if $S_n(x) \to S(x)$ uniformly, then $\exp(S_n(x)) \to \exp(S(x))$ uniformly: the functions $S_n$ are uniformly bounded, and the exponential function is uniformly continuous on bounded sets. Thus, uniform convergence of $h(x)$ implies uniform convergence of $f(x)$.
Addendum: Absolute convergence of an infinite product implies convergence
Let $P_n = \prod_{k=1}^n (1+a_k)$ and $Q_n = \prod_{k=1}^n (1+|a_k|)$. We have
$$P_n - P_{n-1} = (1+a_1) \cdots (1+a_{n-1})\, a_n, \qquad Q_n - Q_{n-1} = (1+|a_1|) \cdots (1+|a_{n-1}|)\, |a_n|,$$
and since $|1+a_k| \leqslant 1+|a_k|$ for each $k$, it follows that $|P_n - P_{n-1}| \leqslant Q_n - Q_{n-1}$.
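This inequality can be spot-checked numerically; the sequence $a_n = i^n/n^2$ below is an arbitrary choice made for this illustration:

```python
def partial_products(a):
    """Partial products P_n of (1 + a_k) and Q_n of (1 + |a_k|)."""
    P, Q = [1 + a[0]], [1 + abs(a[0])]
    for z in a[1:]:
        P.append(P[-1] * (1 + z))
        Q.append(Q[-1] * (1 + abs(z)))
    return P, Q

# Arbitrary absolutely summable complex sequence for this sketch.
a = [1j ** n / n ** 2 for n in range(1, 50)]
P, Q = partial_products(a)

# |P_n - P_{n-1}| <= Q_n - Q_{n-1}, allowing a little floating-point slack.
assert all(abs(P[n] - P[n - 1]) <= Q[n] - Q[n - 1] + 1e-12 for n in range(1, len(P)))
```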
If $\prod(1+|a_n|)$ is convergent then the telescoping series $\sum(Q_n- Q_{n-1})$ converges, since
$$\lim_{N \to \infty}\sum_{n=2}^N (Q_n - Q_{n-1})= \lim_{N \to \infty}Q_N - Q_1 = \prod_{n=1}^\infty (1 + |a_n|) - (1+|a_1|).$$
By the comparison test, the series $\sum(P_n- P_{n-1})$ is absolutely convergent and, therefore, the product $\prod(1+a_n)$ is convergent, since
$$\prod_{n=1}^\infty(1+a_n) = \lim_{N \to \infty} P_N = P_1 + \sum_{n=2}^\infty (P_n - P_{n-1}).$$
A final but important detail is to show that $\lim_{N \to \infty}P_N \neq 0$. This follows from the convergence of $\sum|a_n|$, which implies $1 + a_n \to 1$, so that $|1+a_n| \geqslant \frac{1}{2}$ for all sufficiently large $n$ and, hence, $|a_n(1+a_n)^{-1}| \leqslant 2|a_n|$. By comparison, the series $\sum |a_n(1+a_n)^{-1}|$ and, hence, the product $\prod(1 - a_n(1+a_n)^{-1})$ are convergent. Thus,
$$\lim_{N\to \infty} \frac{1}{P_N} = \prod_{n=1}^\infty \frac{1}{1+a_n} = \prod_{n=1}^\infty \left(1 - \frac{a_n}{1+a_n}\right) \neq \infty,$$
and a finite limit for $1/P_N$ forces $\lim_{N\to \infty} P_N \neq 0$.
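Numerically, for an absolutely summable sequence such as $a_n = i/n^2$ (an arbitrary choice for this sketch), the partial products are seen to stabilise at a nonzero value:

```python
# Partial products P_N of (1 + i/n^2), recorded at two checkpoints.
P, snapshots = 1.0 + 0j, {}
for n in range(1, 10001):
    P *= 1 + 1j / n ** 2
    if n in (100, 10000):
        snapshots[n] = P

# The partial products settle down: P_100 and P_10000 nearly agree.
assert abs(snapshots[10000] - snapshots[100]) < 0.05
# Each factor has modulus sqrt(1 + 1/n^4) >= 1, so the limit stays away from 0.
assert abs(P) >= 1.0
```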
Best Answer
To say that $\sum f_n$ is uniformly and absolutely convergent (for $x \in D$) without further qualification generally means that there exist functions $S$ and $\hat {S}$ such that for all $\epsilon > 0$ there exists $N_1 \in \mathbb{N}$ such that for all $n > N_1$ and for all $x \in D$ we have
$$\left |\sum_{k=1}^n f_k(x) - S(x) \right| < \epsilon,$$
and, for each $x \in D$, there exists $N_2(x) \in \mathbb{N}$, which may depend on $x$, such that $n > N_2(x)$ implies
$$\left |\sum_{k=1}^n |f_k(x)| - \hat{S}(x) \right| < \epsilon.$$
However, it is possible that $\sum f_n$ converges uniformly, but $\sum|f_n|$ converges pointwise but not uniformly.
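A standard example of this phenomenon (one possible choice, on $D = [0,1)$) is $f_n(x) = (-1)^n x^n/n$: the alternating series test makes the tails of $\sum f_n$ uniformly small (at most $1/(n+1)$), while the tails of $\sum |f_n(x)| = \sum x^n/n$ grow without bound as $x \to 1^-$. A rough numerical illustration:

```python
def alt_tail(x, n, terms=20000):
    """Truncated tail sum_{k=n+1}^{n+terms} (-1)^k x^k / k of the signed series."""
    return sum((-1) ** k * x ** k / k for k in range(n + 1, n + terms + 1))

def abs_tail(x, n, terms=20000):
    """Truncated tail sum_{k=n+1}^{n+terms} x^k / k of the absolute-value series."""
    return sum(x ** k / k for k in range(n + 1, n + terms + 1))

# After n = 10 terms: the signed tail is below the alternating-series bound
# 1/11 even for x near 1, but the absolute tail is already large there.
assert abs(alt_tail(0.999, 10)) < 1 / 11
assert abs_tail(0.999, 10) > 1.0
```

Since $\sup_{x \in [0,1)} \sum_{k > n} x^k/k = \infty$ for every $n$, no tail of the absolute series can be made uniformly small, so $\sum |f_n|$ converges only pointwise.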
If $\sum |f_n|$ is also uniformly convergent then the correct and unambiguous terminology is that $\sum f_n$ is uniformly absolutely convergent.
For the proof in question, "they" are specifying uniform absolute convergence by saying "converging uniformly and absolutely". This facilitates the proof of uniform absolute convergence of $\sum f_n$ through the Cauchy criterion using the inequalities
$$\left|\sum_{k=n+1}^m f_k(x) \right| \leqslant \sum_{k=n+1}^m |f_k(x)| \leqslant \sum_{k = n+1}^m |g_k(x)|.$$
Otherwise, for complex-valued functions, the bound $|f_n(x)| \leqslant |g_n(x)|$ alone is not sufficient to prove uniform convergence of $\sum f_n$ if $\sum|g_n|$ does not converge uniformly.