Given $f'(x)=\frac{x^2-f(x)^2}{x^2(f(x)^2+1)}$, prove that $\lim\limits_{x \to \infty}f(x) = \infty$.

Tags: calculus, derivatives, limits

Let $f:(0,\infty)\to\mathbb R$ be a differentiable function such that $$f'(x)=\frac{x^2-f(x)^2}{x^2(f(x)^2+1)}$$ for all $x\gt1$. Prove that $\lim\limits_{x \to \infty}f(x) = \infty$.

Here is what I thought: $$f'(x)=\frac{x^2-f(x)^2}{x^2(f(x)^2+1)}=\frac{1-\frac{f(x)^2}{x^2}}{f(x)^2+1}.$$
If it could somehow be proved that $\frac{f(x)^2}{x^2}$ lies in $[0,1)$, then it would follow that $f'(x)\gt0$, which means the function is strictly increasing, and I assumed that for an increasing function $\lim\limits_{x \to \infty}f(x) = \infty$ always holds.

But I am not sure how to do that. If someone can come up with another creative solution, that would be great! Please also explain the thought process behind your solution.


EDIT: My assertion that an increasing function tends to $\infty$ is wrong in some cases, for example $f(x)=\arctan(x)$. But this can be explained: the derivative of $\arctan(x)$ is $\frac{1}{1+x^2}$ and $\lim\limits_{x \to \infty}\frac{1}{1+x^2}=0$. So if we could also prove the additional condition $\lim\limits_{x \to \infty}f'(x)\ne0$, the proof would be complete.

Best Answer

Introduction

I planned to give this question a properly high-level treatment. However, while working it out, I realized that I would probably rise a bit too far above the OP's level, making the answer inaccessible at that level.

Hence, I have instead adapted the same argument to this specific case. Note that I will still describe the more general case afterwards.

I do not think that this solution coincides with any of the solutions presented in the manual linked in the comments above, which are excellent in their own right. I will instead show why the approach here is far more widely applicable, covering a very large class of equations.


The big result

Consider the differential equation $$ y' = \frac{x^2-y(x)^2}{x^2(y(x)^2+1)} \tag{ODE} $$

The denominator of the RHS of $(ODE)$ is positive for $x>1$. As for the numerator, it factors as $(x-y(x))(x+y(x))$. We claim that any solution $y$ of $(ODE)$ on $x>1$ must be eventually monotonic, i.e. there exists an $M>0$ such that $y'$ does not change sign for $x > M$.

Note first that $y$ cannot be a constant solution: substituting $y\equiv c$ into $(ODE)$ forces $x^2=c^2$ for all $x$, which is impossible. Next, the RHS of $(ODE)$ is continuous on the domain $\{x>1\} \times \mathbb R$, so $y'$ is a continuous function. Hence, the question reduces to showing the following: there does not exist a sequence of points $x_i \to \infty$ with $y'(x_i) = 0$.

Observe that at points where $y'=0$ we also have $x^2 = y(x)^2$, i.e. either $y(x) = x$ or $y(x) = -x$. So either $x_i = y(x_i)$ or $x_i = -y(x_i)$ happens infinitely often. We will show that this cannot happen when $x_i \to \infty$.

Consider, for example, the set of all points $x_{1i}$ such that $y'(x_{1i})=0$ because $x_{1i} = y(x_{1i})$. Suppose that this is an infinite sequence which goes to $+\infty$. Let $x_{1k}$ be such a point. We claim that $x_{1k}$ cannot be an extremum of $y$.

To see this, consider a small enough $\delta>0$ and $x_{1k} < x < x_{1k}+\delta$. Without loss of generality, assume that $x_{1k}$ is a maximum. At such $x$ we have $y(x) \le y(x_{1k}) = x_{1k}$, and by continuity $y(x)>0$ if $\delta$ is small enough (since $y(x_{1k})=x_{1k}>1$); hence $y(x)^2 \leq x_{1k}^2 < x^2$, so that $y'(x)>0$ by $(ODE)$. But then $y$ is strictly increasing just to the right of $x_{1k}$, contradicting the maximum condition. Thus, $x_{1k}$ cannot be a maximum. It similarly cannot be a minimum.

Thus, each $x_{1k}$ is a stationary point of $y$ that is not an extremum. Consider two consecutive points $x_{1k}, x_{1(k+1)}$ in this sequence.

We know that $y'(x_{1k})=0$, while the function $g(x)=x$ has derivative $1$ everywhere. Since $y(x_{1k}) = x_{1k} = g(x_{1k})$ and $g$ increases faster than $y$ at $x_{1k}$, for $\delta>0$ small enough and $x_{1k} < x < x_{1k}+\delta$ the point $(x,x)$ lies above the point $(x,y(x))$, i.e. $y(x) < x$.

However, at $x_{1(k+1)}$ the opposite reasoning takes over. Indeed, the derivative of $y$ at this point is $0$, while the derivative of $g(x)=x$ is $1$. Since $y(x_{1(k+1)}) = x_{1(k+1)}$ and $y$ increases more slowly than $g$ there, for some $\delta>0$ and $x_{1(k+1)}-\delta < x < x_{1(k+1)}$ the point $(x,x)$ lies below the point $(x,y(x))$, i.e. $y(x) > x$.

However, this is impossible: if $x_{1k},x_{1(k+1)}$ were chosen to be consecutive intersection points of the graph of $y$ with the line $y=x$, then $y(x)-x$ has no zero on the open interval $(x_{1k},x_{1(k+1)})$ and hence, by the intermediate value theorem, keeps a constant sign there. This contradicts the two observations above.

Finally, we conclude the following: the graph of $y(x)$ intersects the line $y=x$ at only finitely many places. An analogous argument shows that the graph of $y(x)$ intersects the line $y=-x$ at only finitely many places (the argument barely changes).

Consequently, the number of intersection points of the graph of $y(x)$ with the set $x^2-y^2=0$ is finite, so $y'$ has only finitely many zeros. Since a continuous function cannot change sign without vanishing, $y'(x)$ is either positive for all large enough $x$ or negative for all large enough $x$, as desired.
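As a quick numerical illustration (not part of the proof; the RK4 stepper, step size, and starting values below are my own arbitrary choices), one can count how often $x^2-y(x)^2$, and hence $y'$, changes sign along a computed trajectory:

```python
# Count sign changes of x^2 - y^2 (which carries the sign of y') along a
# trajectory of y' = (x^2 - y^2) / (x^2 (y^2 + 1)).
# Stepper, step size and initial data are illustrative choices only.

def yprime(x, y):
    return (x * x - y * y) / (x * x * (y * y + 1.0))

def sign_changes(x0, y0, x_end, h=1e-3):
    """Integrate with classical RK4 and count sign flips of x^2 - y^2."""
    x, y = x0, y0
    changes = 0
    prev = x * x - y * y
    while x < x_end:
        k1 = yprime(x, y)
        k2 = yprime(x + h / 2, y + h * k1 / 2)
        k3 = yprime(x + h / 2, y + h * k2 / 2)
        k4 = yprime(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
        cur = x * x - y * y
        if cur * prev < 0:  # the sign of y' flipped between steps
            changes += 1
        prev = cur
    return changes

# One trajectory starting above the line y = x, one starting below y = -x:
print(sign_changes(2.0, 5.0, 50.0), sign_changes(2.0, -3.0, 50.0))
```

In both runs the trajectory crosses the zero set $x^2=y^2$ exactly once, after which $y'$ stays positive, consistent with the finiteness claim above.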


Finishing the proof

Now that we have eventual monotonicity, we know that $\lim_{x \to \infty} y(x)$ exists in $[-\infty,+\infty]$, because a monotonic function always has a (possibly infinite) limit as $x \to \infty$. What can this limit be? Here we follow one of the solutions from the manual.

Call the limit $L$. First, $L=-\infty$ cannot happen. Indeed, if it did, then $y' \le 0$ for large $x$, so by $(ODE)$, $y(x)^2 \ge x^2$, i.e. $y(x) \le -x$. However, a direct computation from $(ODE)$ gives $$ (y+x)' = \frac{y(x)^2(x^2-1)+2x^2}{x^2(y(x)^2+1)} > 0 \quad \text{for } x>1, $$ so $y(x)+x$ is increasing and, being bounded above by $0$, converges to some $m \le 0$. Then $x^2-y(x)^2=(x-y(x))(x+y(x)) = O(x)$ while the denominator of $(ODE)$ grows like $x^4$, so $y'(x)=O(x^{-3})$ is integrable and $y$ converges to a finite limit, a contradiction. Next, suppose $L$ is finite. Taking limits as $x\to \infty$ in $(ODE)$ yields $\lim_{x\to \infty} y'(x) = \frac{1}{1+L^2}>0$. This is a contradiction because it forces linear growth: pick $X$ large enough that $y'(x)>\frac{1}{2(1+L^2)}$ for $x>X$. Then, by the mean value theorem, $y(x)-y(X) > \frac{x-X}{2(1+L^2)}$ for $x>X$, which implies $y(x) \to \infty$, contradicting the finiteness of $L$.

Therefore, $L = +\infty$ and we are done.


The generalization

It is possible to prove the following rather amazing result in general.

Theorem [Hardy, Bellman]: Let $P(r,s),Q(r,s)$ be polynomials such that $P(r,s)Q(r,s) \neq 0$ for all large enough $r,s$. Consider the ODE $$ \frac{dy}{dx} = \frac{P(x,y)}{Q(x,y)} $$ Then there is an $x_0$ such that any solution $y(x)$ of this ODE for $x>x_0$ has the following property: every rational function of the form $T(x) = \frac{H(x,y(x))}{L(x,y(x))}$ (where $H,L$ are arbitrary polynomials) is eventually strictly monotone in $x$, unless $L(x,y(x))$ is identically zero or $T(x)$ is eventually constant.

For example, in the present case this entails that a quantity like $\frac{y(x)+2x^4}{y(x)^3+x^2y(x)}$ is eventually monotonic, where $y(x)$ is a solution to our initial ODE. This might appear very difficult to believe, but it is true.
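A numerical experiment makes this easier to believe. The sketch below (the RK4 stepper, the initial value $y(2)=1$, and the sample points are my own illustrative choices, not part of the theorem) evaluates $T(x)=\frac{y(x)+2x^4}{y(x)^3+x^2y(x)}$ along a computed solution and checks that it increases between well-separated samples:

```python
# Probe the Hardy-Bellman monotonicity claim on T(x) = (y + 2x^4)/(y^3 + x^2 y)
# along a solution of y' = (x^2 - y^2) / (x^2 (y^2 + 1)).
# Stepper, step size, initial value and sample points are illustrative choices.

def yprime(x, y):
    return (x * x - y * y) / (x * x * (y * y + 1.0))

def solve(x0, y0, x_end, h=1e-2):
    """Classical RK4 from (x0, y0) up to x_end; returns the final (x, y)."""
    x, y = x0, y0
    while x < x_end:
        k1 = yprime(x, y)
        k2 = yprime(x + h / 2, y + h * k1 / 2)
        k3 = yprime(x + h / 2, y + h * k2 / 2)
        k4 = yprime(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return x, y

def T(x, y):
    return (y + 2 * x ** 4) / (y ** 3 + x ** 2 * y)

# y(2) = 1 keeps y(x) > 0, so the denominator of T never vanishes.
values = [T(*solve(2.0, 1.0, xe)) for xe in (50.0, 100.0, 200.0, 400.0)]
print(values)  # strictly increasing at these samples
```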

Even more is true: such solutions admit a complete asymptotic analysis.

Theorem [Hardy]: Any solution of $$ y'(x) = \frac{P(x,y)}{Q(x,y)}, $$ where $P,Q$ are as in the previous theorem, has the following asymptotics: it is eventually monotonic together with all its derivatives, and, as $x \to \infty$, either $(1)$ $y(x) \sim ax^be^{E(x)}$ for some polynomial $E(x)$ and reals $a,b$, or $(2)$ $y(x) \sim ax^b(\log x)^{\frac 1c}$ for an integer $c$ and reals $a,b$.

In the case above, one can use Hardy's asymptotic analysis to prove $$ \lim_{x \to \infty} \frac{y(x)}{(3x)^{\frac 13}} = 1. $$ Heuristically, once $y(x)^2 \ll x^2$ the equation reads $y' \approx \frac{1}{y^2}$, which integrates to $y^3 \approx 3x$. This shows that the asymptotics are of form $(1)$ with $a=3^{\frac 13}$, $b=\frac 13$ and $E(x)=0$.
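This limit can also be checked numerically. Below is an illustrative sketch (the stepper, step size, initial value $y(2)=0$, and endpoint are arbitrary choices of mine; the convergence of the ratio is slow, roughly like $1/x$, so only rough agreement is expected at moderate $x$):

```python
# Check y(x) / (3x)^(1/3) -> 1 along a computed solution of
# y' = (x^2 - y^2) / (x^2 (y^2 + 1)).
# Stepper, step size, initial value and endpoint are illustrative choices.

def yprime(x, y):
    return (x * x - y * y) / (x * x * (y * y + 1.0))

def solve(x0, y0, x_end, h=1e-2):
    """Classical RK4 from (x0, y0) up to x_end; returns the final (x, y)."""
    x, y = x0, y0
    while x < x_end:
        k1 = yprime(x, y)
        k2 = yprime(x + h / 2, y + h * k1 / 2)
        k3 = yprime(x + h / 2, y + h * k2 / 2)
        k4 = yprime(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return x, y

x, y = solve(2.0, 0.0, 2000.0)
ratio = y / (3.0 * x) ** (1.0 / 3.0)
print(f"y/(3x)^(1/3) at x = {x:.0f}: {ratio:.4f}")  # close to 1
```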

The proofs of these two statements are fairly involved, but not very difficult. Indeed, the idea is the following: one does not have a general description of the zeros of $P(x,y)$, but one can obtain finitely many "branches" of zeros that go to infinity. It is then shown that the graph of $y$ intersects each of these branches at only finitely many points (the arguments are frankly very similar to the one above), so that $y'$ eventually has constant sign.

Then, one notices that $\frac{H(x,y(x))}{L(x,y(x))}$ also solves a differential equation of a similar type, which allows the same argument to be reused. One may use the theory of power series to estimate the growth rate of the branches outlined above, leading to the asymptotics stated in the theorem. The actual mathematics is a little more involved and cannot be presented rigorously within this outline. It may yet be satisfying to know that this extremely general class of equations admits so much regularity.


Links
  1. Hardy, G. H. (1912), "Some Results Concerning the Behaviour at Infinity of a Real and Continuous Solution of an Algebraic Differential Equation of the First Order", Proceedings of the London Mathematical Society, s2-10(1), 451–468 (doi:10.1112/plms/s2-10.1.451).

  2. Bellman, R., Stability Theory of Differential Equations, Chapter 3. Many more results can be found there than just the one cited here.

  3. This reference contains the proof of the specific result, which is problem B5 of the competition. It uses techniques that are very specific to the problem, unlike Hardy's approach, which is much more malleable.
