Why do we only use the ratio test, and not conditional convergence, to determine the interval of convergence of an alternating series?

Tags: absolute-convergence, convergence-divergence, sequences-and-series

For example, consider
$$S(x)=\sum_{n=1}^{\infty} \frac{(-1)^n x^n} {\sqrt{n}}$$
When determining the interval of convergence, we use the ratio test to find the interval on which the series converges absolutely.
From the ratio test we get the interval of convergence (IoC) to be
$$ -1 \lt x \le 1 $$
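(A quick sketch of that ratio-test computation, writing $a_n=\frac{(-1)^n x^n}{\sqrt{n}}$:
$$\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right|=\lim_{n\to\infty}|x|\sqrt{\frac{n}{n+1}}=|x|,$$
so the series converges absolutely whenever $|x|<1$.)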

Why don't we use the Leibniz Test for an alternating series to find the IoC?
Wouldn't it give a larger interval of convergence?

I couldn't find this addressed anywhere online. Is there a particular reason we do this?

Best Answer

You are wrong in your assumption that the ratio test lets you deduce that the series converges if and only if $-1<x\leqslant1$. The ratio test is inconclusive when $x=1$, and that is exactly why you need the Leibniz test to determine whether the series converges at $x=1$.
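To spell out the endpoint checks (sketched here for completeness, using only standard facts): the ratio test gives absolute convergence for $|x|<1$ and divergence for $|x|>1$, so only $x=\pm1$ remain to be examined separately.
$$x=1:\quad \sum_{n=1}^{\infty}\frac{(-1)^n}{\sqrt{n}}\ \text{converges by the Leibniz test, since } \tfrac{1}{\sqrt{n}}\downarrow 0\ \text{(only conditionally, as } \textstyle\sum 1/\sqrt{n}\ \text{diverges)}.$$
$$x=-1:\quad \sum_{n=1}^{\infty}\frac{(-1)^n(-1)^n}{\sqrt{n}}=\sum_{n=1}^{\infty}\frac{1}{\sqrt{n}}\ \text{diverges }(p\text{-series with } p=\tfrac12).$$
Putting the pieces together gives the interval of convergence $-1<x\le1$. Note that the Leibniz test cannot enlarge the interval beyond this: for $|x|>1$ the terms $\frac{(-1)^n x^n}{\sqrt{n}}$ do not even tend to $0$, so no convergence test can help there.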