Prove the following inequality with “real” or “continuous” induction

inequality · initial-value-problems · transfinite-induction

Consider the function $y(x)$ satisfying $y(0) = 1$ and $y'(x) = x^2 + y(x)^2$ for every $x$ in its maximal interval of existence, whose right endpoint we call $a \in (0,\infty]$. The function is nonelementary and very complicated, as a quick Wolfram Alpha query can attest. But if you drop the $x^2$ term from the IVP, then the solution is much easier: take the function $y_*(x)$ such that $y_*(0) = 1$ and $y_*'(x) = y_*(x)^2$ for each $x \in \operatorname{dom} y_*$. In this case, solving the separable equation yields $y_*(x) = (1-x)^{-1}$, defined on $\operatorname{dom} y_* = (-\infty, 1)$.
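For completeness, the separation of variables runs as follows:
$$\frac{y_*'(x)}{y_*(x)^2} = 1 \quad\Longrightarrow\quad -\frac{1}{y_*(x)} = x + C \quad\Longrightarrow\quad y_*(x) = \frac{1}{1-x},$$
where $C = -1$ is fixed by the condition $y_*(0) = 1$.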

It is not difficult to convince yourself that $y_*(x) \leq y(x)$ for each $x \in [0,a)$ (and therefore $y(1)$ is undefined and $a \leq 1$). Intuitively, this is because $x^2 + y^2 \geq y^2$ for each $(x,y) \in \mathbb R^2$. As the former determines the growth of $y$ and the latter that of $y_*$, it must be the case that $y_*$ grows no faster than $y$, hence $y_*(x) \leq y(x)$.
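As a quick numerical sanity check (not a proof), one can integrate the original IVP and compare it to $y_*$ on a grid; a minimal sketch, assuming NumPy and SciPy are available:

```python
# Numerical sanity check (not a proof): integrate y' = x^2 + y^2, y(0) = 1,
# and compare with the explicit y_*(x) = 1/(1 - x) on [0, 0.9].
import numpy as np
from scipy.integrate import solve_ivp

def rhs(x, y):
    # Right-hand side of the original IVP.
    return x**2 + y**2

xs = np.linspace(0.0, 0.9, 10)
sol = solve_ivp(rhs, (0.0, 0.9), [1.0], t_eval=xs, rtol=1e-10, atol=1e-12)

y_star = 1.0 / (1.0 - xs)
for x, yv, ysv in zip(xs, sol.y[0], y_star):
    print(f"x={x:.2f}  y(x)~{yv:10.4f}  y_*(x)={ysv:10.4f}  y >= y_*: {yv >= ysv}")
```

Of course this only illustrates the inequality at finitely many points; it proves nothing.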

What I wonder about is how to formally prove this inequality – and, in particular, how to make the intuitive argument above into a formal one. The catch is that you cannot directly compare the IVPs of the two functions, as the first is really $x^2 + y(x)^2$ and the second is $y_*(x)^2$. To compare them directly, you would need the bound $y_* \leq y$ to already be true, but that is what we are trying to show.

Imagine a different problem, where the functions $y$ and $y_*$ are replaced by sequences, and the derivative replaced by the forward difference. That is, imagine $y$ is the sequence given by $y(0) = 1$ and $(\Delta y)(n) = n^2 + y(n)^2$, and similarly for $y_*$. Then you can easily prove $y(n) \geq y_*(n)$ holds for every $n$ by induction.
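Spelled out, the induction step is just
$$y(n+1) = y(n) + n^2 + y(n)^2 \;\ge\; y_*(n) + y_*(n)^2 = y_*(n+1),$$
using the hypothesis $y(n) \ge y_*(n)$ together with $y_*(n) \ge 1 > 0$ (so that squaring preserves the inequality); the latter also follows by induction, since both sequences start at $1$ and are nondecreasing.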

Is it possible, then, to use something like "real induction" or "continuous induction" to prove that $y(x) \geq y_*(x)$ for every $x$? I did some digging and found one theorem which seems to be close to what I want, but I still could not figure it out.

Suppose $S \subset \mathbb R$ is a set of real numbers satisfying the following:

  1. $a \in S$,
  2. $S$ is extensible, in the sense that $\forall x \in S\ \exists y > x\ ([x,y] \subset S)$, and
  3. $S$ is upward-closed, in the sense that if $(x_n)_n$ is any increasing, bounded sequence in $S$, then $\lim_n x_n \in S$.

Then $[a,\infty) \subset S$.
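For reference, the usual proof of such a statement is a short supremum argument; here is a sketch under the stated hypotheses. Let $T = \{x \ge a : [a,x] \subset S\}$, which contains $a$ by condition 1. If $T$ were bounded above with $s = \sup T$, then taking an increasing sequence in $T$ converging to $s$ and applying condition 3 gives $[a,s] \subset S$; condition 2 applied at $s$ then produces a $t > s$ with $[s,t] \subset S$, hence $[a,t] \subset S$, contradicting $s = \sup T$. So $T$ is unbounded above and $[a,\infty) \subset S$.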

One problem with applying this theorem is that I would like to apply it to the domain of $y(x)$, which is (purportedly) a bounded interval, not an infinite one. Even setting that aside, I cannot find a way to show that $\{x : y_*(x) \leq y(x)\}$ is extensible. Seemingly, you want to show that for each $x$ with $y_*(x) \leq y(x)$, there exists a $\delta > 0$ such that $y_*(x + h) \leq y(x+h)$ for every $h \in (0,\delta)$. Attempting to unpack the definition of the derivative, I have shown something that is almost this, but falls short.

That is, so far I have shown that for every $\varepsilon > 0$ and every $x$ such that $y_*(x) \leq y(x)$, there is a $\delta > 0$ such that $0 < h < \delta$ implies $y_*(x+h) \leq y(x+h) + 2\varepsilon h$. But you cannot eliminate the $\varepsilon$ from the statement by sending it to zero without also being forced to send $h$ to zero.
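Concretely, the derivation of that estimate is along the following lines (for $x \ge 0$, so that both functions are $\ge 1$): by the definition of the derivative there is a $\delta > 0$ such that, for $0 < h < \delta$,
$$y_*(x+h) \le y_*(x) + h\,y_*'(x) + \varepsilon h \qquad\text{and}\qquad y(x+h) \ge y(x) + h\,y'(x) - \varepsilon h,$$
and since $y_*(x) \le y(x)$ implies $y_*'(x) = y_*(x)^2 \le y(x)^2 \le x^2 + y(x)^2 = y'(x)$, chaining the two displays gives $y_*(x+h) \le y(x+h) + 2\varepsilon h$.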

Best Answer

Continuous induction was discussed here recently. I proposed a definition of induction that covers both the discrete and the continuous cases.

For $\mathbb{R}$, proving a property $P(x)$ by induction means we have to prove:

  • (0) The property $P$ is true at some $x_0$.
  • (1) $\forall a,b$, if $P$ is true on $[a, b)$ then it is true at $b$.
  • (2) $\forall a,b$, if $P$ is true on $[a, b]$ then $\exists c>b$ such that $P$ is true on $[b,c]$.

From that, we conclude that $P$ is true on $[x_0,+\infty)$. Of course the propagation can also be restricted to some interval $[x_0, x_1]$ or $[x_0, x_1)$, if what has to be proven for propagation no longer works after $x_1$.

Here is a tentative application to the present case, which fails. I post it for possible improvement later (by me or someone else), and for comparison with the problem you encountered in your own attempt.

So, to apply this to our case, we define
$$z(x) = y(x) - y_*(x),$$
so that
$$z'(x) = x^2 + y(x)^2 - y_*(x)^2 = x^2 + \bigl(y(x) + y_*(x)\bigr)\,z(x).$$

Our induction hypothesis will be: $y(x) \ge 1, y_*(x) \ge 1, z(x)\ge 0$.

(0) It is true at $x=0$: $y(0) = y_*(0) = 1$ and $z(0) = 0$.

(1) If it is true on $[a,b)$, then it is also true at $b$, the three functions being continuous.

(2) If it is true on $[a,b]$, then $y'(b)$ and $y_*'(b)$ are both $>0$, so the two functions are locally increasing and stay $\ge 1$ on some interval to the right of $b$.
And this is where it fails: we cannot do the same for $z(x)$, because the case $b=0$ has to be dealt with, and at that point we cannot conclude $z'(0) > 0$, so we cannot obtain a nondegenerate interval to the right of $0$ on which $z(x) \ge 0$.
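Indeed, from the formulas above,
$$z(0) = y(0) - y_*(0) = 0 \qquad\text{and}\qquad z'(0) = 0^2 + \bigl(y(0) + y_*(0)\bigr)\,z(0) = 0,$$
so first-order information at $0$ does not by itself force $z \ge 0$ on an interval to the right of $0$.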

It seems the difficulty has less to do with the method (continuous induction) and more with the induction hypothesis: we do not have a hypothesis that is both valid at $0$ and strong enough to allow propagation.

Another option would be to refine point (2) above specifically for differential equations (i.e., the requirement that $\forall a,b$, if $P$ is true on $[a, b]$ then $\exists c>b$ such that $P$ is true on $[b,c]$): as it stands, this point requires making a discrete step from $b$ to some $c$ and does not take advantage of the differential equation itself.
