[Math] Does this series violate the decreasing condition of the Integral Test for Convergence?

calculus, convergence-divergence, sequences-and-series

I'm working through the section on the Integral Test for Convergence in my Calculus II class, and I've run into an apparent conflict between the statement of the Integral Test and the solutions to some of the homework exercises as given by both my professor and the textbook.

According to the definition, my research, and my understanding of the test, the Integral Test can only be applied to a series $\sum a_n$ where $a_n = f(n)$ and $f(x)$ is positive, continuous, and decreasing for all $x \ge N$, where $N$ is the starting index of the series. However, there are several problems where $f(x)$ is only decreasing past some point, such as $f'(x) < 0 \iff x > 3$ with $N \lt 3$. It seems to me that the Integral Test cannot be used to determine convergence of a series when the function is only decreasing for $x \gt k$ with $N \lt k$, yet the book and my professor apply the test anyway.

For example, with the series:

$$ \sum_{n=1}^\infty \frac{n}{(4n+5)^\frac{3}{2}} $$

If we let $a_n = f(n)$, then for $f(x)$:

  • $f(x) \gt 0$ for all $x \ge 1$
  • $f(x)$ is continuous for all $x \gt -\frac{5}{4} $

But $f'(x) \lt 0$ only when $x \gt \frac{5}{2}$, as testing the sign of the derivative around its critical point shows:

$$ f'(x) = \frac{5-2x}{(4x+5)^\frac{5}{2}}$$
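For reference, here is the quotient-rule computation behind that derivative (my own work, so worth double-checking):

$$ f'(x) = \frac{(4x+5)^\frac{3}{2} - 6x(4x+5)^\frac{1}{2}}{(4x+5)^3} = \frac{(4x+5) - 6x}{(4x+5)^\frac{5}{2}} = \frac{5-2x}{(4x+5)^\frac{5}{2}}, $$

which is negative exactly when $5 - 2x \lt 0$, i.e. when $x \gt \frac{5}{2}$.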

The professor notes this in her solution, but instead of ending with that and writing, "The Integral Test cannot be applied because $f(x)$ fails to satisfy the required conditions," she applies the test using the original index for $n$:

$$ \int_1^\infty \frac{x}{(4x+5)^\frac{3}{2}}\,dx = \infty \Rightarrow \sum a_n \text{ diverges} $$
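For completeness, the divergence of that integral can be checked directly with the substitution $u = 4x + 5$ (my own computation, so it may need verifying):

$$ \int \frac{x}{(4x+5)^\frac{3}{2}}\,dx = \frac{1}{16}\int \left(u^{-\frac{1}{2}} - 5u^{-\frac{3}{2}}\right)du = \frac{\sqrt{4x+5}}{8} + \frac{5}{8\sqrt{4x+5}} + C, $$

and since $\frac{\sqrt{4x+5}}{8} \to \infty$ as $x \to \infty$, the improper integral grows without bound.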

The textbook reaches the same conclusion. Also, the problems in question are listed under a section where the instructions state, "Confirm that the Integral test can be applied to the series. Then use the Integral Test to determine the convergence or divergence of the series," implying the test can be used on the subsequent exercises.

Is there a reason the Integral Test for Convergence can be used to test for convergence in these problems, where $N \lt k$ and $f'(x) < 0 \iff x \gt k$? Am I missing something, or are the book and my professor applying the Integral Test to these series incorrectly?


Other exercises with the same issue (a derivative check for the first one is sketched after the list):

$$\bullet \sum_{n=1}^\infty \frac{\ln n}{n^2} $$

$$\bullet \frac{\ln 2}{2} + \frac{\ln 3}{3} + \frac{\ln 4}{4} + \frac{\ln 5}{5} + \frac{\ln 6}{6} + \cdots $$
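For instance, in the first of these, $f(x) = \frac{\ln x}{x^2}$ gives (again my own computation, so worth verifying)

$$ f'(x) = \frac{x^2 \cdot \frac{1}{x} - \ln x \cdot 2x}{x^4} = \frac{1 - 2\ln x}{x^3}, $$

which is negative only for $x \gt \sqrt{e} \approx 1.65$, so again $f$ is not decreasing on all of $[1, \infty)$ even though the series starts at $n = 1$.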

Best Answer

The point, which isn't made often enough (in my opinion) in classes on the subject, is that the convergence of a series is a limit process. In this case, that means the question of convergence is completely determined by the behavior of the series for $n > N$, for any fixed, finite $N$. If I take a convergent series $\sum a_n$, cut off the first 55 quintillion terms, and replace them all with $n!$, I get a new sequence

$$ b_n = \begin{cases} a_n \text{ if } n > 55,000,000,000,000,000,000 \\ n! \text{ if } n \leq 55,000,000,000,000,000,000 \end{cases}$$

then the sum $\sum b_n$ is still convergent. In fact, $\sum b_n$ converges if and only if $\sum a_n$ does. Of course, the two sums will be different, but they differ by precisely

$$ \sum\limits_{n=1}^{\infty} b_n - \sum\limits_{n=1}^{\infty}a_n = \sum\limits_{n=1}^{55,000,000,000,000,000,000} (n! - a_n)$$

which is just a finite number. Of course, I've contrived the example to be huge and ridiculous, but the point to be made about the Integral Test is that as long as the function behaves in the desired way beyond some large $N$ (say, 55 quintillion), the argument still works for the infinite part of the sum beyond that point, and that is where all questions of convergence live. Everything else is just addition.
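Applied to the series in the question, this plays out as follows (my own sketch, not the professor's or the book's write-up): split off the finitely many terms that come before the function starts decreasing, and run the test on the tail,

$$ \sum_{n=1}^\infty \frac{n}{(4n+5)^\frac{3}{2}} = \underbrace{\frac{1}{27} + \frac{2}{13^\frac{3}{2}}}_{\text{finite}} + \sum_{n=3}^\infty \frac{n}{(4n+5)^\frac{3}{2}}. $$

On $[3, \infty)$ the function $f(x) = \frac{x}{(4x+5)^\frac{3}{2}}$ is positive, continuous, and decreasing (since $3 \gt \frac{5}{2}$), so the Integral Test applies to the tail; $\int_3^\infty f(x)\,dx = \infty$, so the tail diverges, and adding the two finite terms back in cannot change that. Hence the whole series diverges.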
