$x_{n+1}=\log(1+x_n)$: how can I find $\lim nx_n$ at the high-school level?

calculus, limits, limits-without-lhopital

Given $x_{n+1}=\log(1+x_n)$, I know $x_n\to0$: if $\lim x_n=\alpha$ exists, then $\alpha=\log(1+\alpha)$, which forces $\alpha=0$.

Using the Stolz-Cesaro theorem, $\lim nx_n=\lim\frac{x_nx_{n+1}}{x_n-x_{n+1}}$, which can be rewritten as $\lim_{t\to0}\frac{t\log(1+t)}{t-\log(1+t)}$. It is not difficult to see that this value is $2$ (using L'Hopital, Taylor, etc.).
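As a quick numerical sanity check (not part of the argument, and the sample points are an arbitrary choice), one can probe that single-variable limit directly:

```python
import math

# Numerically probe t*log(1+t) / (t - log(1+t)) as t -> 0+;
# the ratio should approach 2.
for t in [1e-1, 1e-2, 1e-3]:
    val = t * math.log(1 + t) / (t - math.log(1 + t))
    print(f"t = {t:g}: ratio = {val:.6f}")
```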

But how about at the high-school level? I don't know any way that avoids the Stolz-Cesaro theorem. Please help me if you know how to solve this at the high-school level.

Best Answer

I follow David Speyer's approach, but provide a complete "high-school" answer (high-school in quotes, because I hope my methods justify the label).

It is not possible to do this question without using the Taylor expansion of $\log(1+x)$. The reason is that Stolz-Cesaro (while not available to us here, still true) converts this to a limit involving $t$ and $\log(1+t)$ (see the question post), and a limit like that cannot be evaluated without performing the equivalent of a Taylor expansion (e.g. L'Hopital requires differentiating $\log(1+x)$, which exposes its Taylor coefficients). It may be possible to dress the argument up further to be more "high-school", but I prefer to balance concision and accuracy.



Let $x_{n+1} = \log(1+x_n)$ with $x_1 \geq 0$. By induction, $x_i \geq 0$ for all $i$.

We can do better: consider $h(y) = y - \log(1+y)$ on the interval $[0,\infty)$. Its derivative is $h'(y) = 1 - \frac 1{1+y} \geq 0$, so $h$ is increasing and $y \geq \log(1+y)$ on $[0,\infty)$. Therefore $x_{n+1} \leq x_n$: the sequence is decreasing and bounded below by $0$, hence converges. If it converges to $\alpha$, then taking limits in $x_{n+1} = \log(1+x_n)$ and using continuity of $\log(1+z)$ gives $\alpha = \log(1+\alpha)$, i.e. $h(\alpha)=0$. Since $h'(y)>0$ for $y>0$, $h$ is strictly increasing on $(0,\infty)$ with $h(0)=0$, so $\alpha=0$ is forced, and we conclude $\lim_{n \to \infty} x_n = 0$.
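One can watch this monotone convergence numerically; a minimal sketch (the starting value $x_1 = 1$ is an arbitrary choice, not from the argument):

```python
import math

# Iterate x_{n+1} = log(1 + x_n) from an arbitrary start x_1 = 1
# and confirm the sequence decreases toward 0.
x = 1.0
seq = [x]
for _ in range(1000):
    x = math.log(1 + x)
    seq.append(x)

# monotone decreasing and nonnegative throughout
assert all(a >= b >= 0 for a, b in zip(seq, seq[1:]))
print(f"x_1001 = {seq[-1]:.6f}")
```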

Now, let $I = [0,x_1]$. On this interval, we use the Taylor expansion (at zero) to the third order, with the Lagrange form of the remainder (the most common "remainder" version of Taylor's theorem). This tells us that for every $x \in [0,x_1]$ there exists a $\xi \in [0,x]$ such that $$ \log(1+x) = 0 + x - \frac{x^2}{2}+\frac{x^3}{6} \left[\frac{2}{(\xi+1)^3}\right] $$

However, on the interval $[0,x_1]$, $$ 0\leq \left[\frac{2}{(\xi+1)^3}\right] \leq 2 $$

and therefore $$ x-\frac{x^2}{2}\leq \log(1+x) \leq x-\frac{x^2}{2}+\frac{x^3}{3} $$

as a concrete inequality that can be used to replace the O-notation from David's answer. Plugging this into the recursion gives $$ x_n - \frac{x_n^2}2 \leq x_{n+1} \leq x_n - \frac{x_n^2}{2} + \frac{x_n^3}{3} $$
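The two-sided bound on $\log(1+x)$ can be spot-checked numerically; a small sketch (the grid over $(0,1]$ is an arbitrary choice):

```python
import math

# Check x - x^2/2 <= log(1+x) <= x - x^2/2 + x^3/3 on a grid in (0, 1].
for k in range(1, 101):
    x = k / 100
    lo = x - x**2 / 2
    hi = x - x**2 / 2 + x**3 / 3
    assert lo <= math.log(1 + x) <= hi
print("bounds hold on the grid")
```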

If $x_1>0$, then $x_i >0$ for all $i$, since $\log(1+z)>0$ exactly when $z>0$. Therefore, the sequence $y_n = \frac 1{x_n}$ is well-defined. I'll explain why we bring in the reciprocal a little later; the need of the hour is to take the reciprocal of all sides above (for $n$ large enough that $x_n<2$, all three denominators are positive, and only large $n$ matters): $$ \frac{1}{x_n - \frac{x_n^2}{2}} \geq \frac 1{x_{n+1}} \geq \frac{1}{x_n - \frac{x_n^2}{2} + \frac{x_n^3}{3}} $$

I might as well explain the reciprocal step now: we need the limit of $nx_n$, right? Multiplying by $n$ is a tricky affair, because if terms are going to $0$, then multiplying by $n$ can make those same terms converge to nonzero values, or even blow up, which is a significant risk.

On the other hand, if we divide by $n$, then terms that contributed earlier will likely go to zero, and those already going to zero will continue to do so. In other words, converting the problem to finding the limit of $\frac{1}{nx_n}$ makes it easier. That is the aim of the reciprocation.

However, the point now is that we need to replace the complicated expressions on either side with $\frac{1}{x_n}$ plus simpler terms. How do we do that? Using the geometric series formula. For this, we first use the limit $x_n \to 0$ to make $x_n$ small enough: for some $N$, we have $x_n<1$ for all $n>N$. We will assume $n>N$ from now on.

Using the Geometric Series formula, $$ \frac 1{x_n - \frac{x_n^2}{2}} = \frac 1{x_n}\frac 1{1 - \frac{x_n}{2}} = \frac 1{x_n} \sum_{i=0}^{\infty} \frac{x_n^i}{2^i} = \frac{1}{x_n}+\frac 12 + x_n\sum_{i=2}^\infty \frac{x_n^{i-2}}{2^i} $$

(Note: the series formula is valid since $0<\frac{x_n}{2} < 1$, which lies inside the radius of convergence. At the high-school level, I do not think this needs much explanation.) On the other side, we have $$ \frac{1}{x_n - \frac{x_{n}^2}2 +\frac{x_n^3}{3}} = \frac 1{x_n} \frac{1}{1 - \frac{x_n}{2} +\frac{x_n^2}{3}} = \frac{1}{x_n} \sum_{i=0}^\infty \left(\frac{x_n}{2} - \frac{x_n^2}{3}\right)^i $$

(Once again, valid since $0<\frac{x_n}{2} - \frac{x_n^2}{3} <1$.) But there are so, so many powers of $x_n$ lying around! Fear not: most of these terms contribute next to nothing. For example, $x_n<1$ leads to $$ \frac{1}{x_n}+\frac 12 + x_n\sum_{i=2}^\infty \frac{x_n^{i-2}}{2^i} \leq \frac 1{x_n} +\frac 12 + x_n\sum_{i=2}^\infty \frac{1}{2^i} \leq \frac 1{x_n}+ \frac 12 + x_n $$

(Note: the coefficient of $x_n$ can be calculated explicitly; the crude bound is enough.) On the other side, we want $\frac 1{x_n}+\frac 12+{}$something involving $x_n$ as a lower bound, so let's just truncate the series to the $i=0,1$ terms. Since every term of the series is nonnegative, this is an obvious lower bound: $$ \frac{1}{x_n} \sum_{i=0}^\infty \left(\frac{x_n}{2} - \frac{x_n^2}{3}\right)^i \geq \frac 1{x_n} \left(1+\frac{x_n}{2} - \frac{x_n^2}{3} \right) = \frac {1}{x_n} + \frac 1{2} - \frac{x_n}{3} $$

Combining everything gives $$ \frac{1}{x_n}+\frac 12+x_n \geq \frac 1{x_{n+1}} \geq \frac {1}{x_n} + \frac 1{2} - \frac{x_n}{3} $$ which we rewrite as $$ \frac{1}{2}+x_n \geq \frac 1{x_{n+1}}-\frac 1{x_n} \geq \frac 12 - \frac{x_n}{3} $$
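This bracketing of the increments can be checked along an actual orbit; a minimal sketch (the start $x = 0.5$ is an arbitrary choice below the $x_n<1$ threshold, so the bounds apply from the first step):

```python
import math

# Along the orbit, check 1/2 - x_n/3 <= 1/x_{n+1} - 1/x_n <= 1/2 + x_n.
x = 0.5  # arbitrary start below 1, so the derived bounds apply
for _ in range(200):
    x_next = math.log(1 + x)
    inc = 1 / x_next - 1 / x
    assert 0.5 - x / 3 <= inc <= 0.5 + x
    x = x_next
print(f"final increment = {inc:.6f}")
```

The printed increment sits very close to $\frac12$, which is exactly what the squeeze below exploits.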

Note that this inequality is only true for $n>N$, but the end is in sight! Indeed, let's take $m>N+1$ and write $$ \frac 1{x_m} - \frac 1{x_{N+1}} = \sum_{i=N+1}^{m-1}\left(\frac{1}{x_{i+1}} - \frac 1{x_i}\right) $$

Now, we use the bounds (the sum has $m-N-1$ terms, each contributing $\frac12$) and get $$ \frac{m-N-1}{2}-\sum_{n=N+1}^{m-1} \frac{x_n}{3} \leq \frac 1{x_m} - \frac 1{x_{N+1}} \leq \frac{m-N-1}{2}+\sum_{n=N+1}^{m-1} x_n $$

which is the same as $$ \frac{1}{x_{N+1}} + \frac{m-N-1}{2}-\sum_{n=N+1}^{m-1} \frac{x_n}{3} \leq \frac 1{x_m} \leq \frac{m-N-1}{2}+\sum_{n=N+1}^{m-1} x_n + \frac 1{x_{N+1}} $$

We may now divide by $m$: $$ \frac{1}{mx_{N+1}} + \frac{m-N-1}{2m}-\frac{1}{m}\sum_{n=N+1}^{m-1} \frac{x_n}{3} \leq \frac 1{mx_m} \leq \frac{m-N-1}{2m}+\frac{1}{m}\sum_{n=N+1}^{m-1}x_n + \frac 1{mx_{N+1}} $$

Squeeze-theorem time, with $m \to \infty$. That $\frac{1}{mx_{N+1}} \to 0$ and $\frac{m-N-1}{2m} \to \frac 12$ is quite obvious. What about the other terms? A very simple result is that the running average of a sequence converging to zero also converges to zero (a standard exercise). The two leftover terms are, up to finitely many initial terms, running averages of sequences that converge to zero, which I leave you to identify. Once you do that, those terms converge to zero.
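The running-average fact is easy to see in action; a small demo with the hypothetical null sequence $a_n = 1/\sqrt{n}$ (any null sequence would do), whose average up to $m$ behaves like $2/\sqrt{m}$:

```python
import math

# The running average of a null sequence is itself null:
# for a_n = 1/sqrt(n), the average up to m behaves like 2/sqrt(m).
m = 10**6
avg = sum(1 / math.sqrt(n) for n in range(1, m + 1)) / m
print(f"average of the first {m} terms: {avg:.6f}")
```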

By the squeeze theorem, $\frac{1}{mx_m} \to \frac 12$, and therefore $mx_m \to 2$. If this proof felt difficult, you can see why Stolz-Cesaro is so useful, and perhaps this motivates it at the high-school level.
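Finally, the conclusion itself can be verified numerically; a minimal sketch (the start $x_1 = 1$ and the iteration count are arbitrary choices):

```python
import math

# Iterate far enough that n * x_n visibly settles near 2.
x = 1.0
n = 1
while n < 10**5:
    x = math.log(1 + x)
    n += 1
print(f"n = {n}, n*x_n = {n * x:.5f}")
```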
