Is the rate of change of Finite-Duration Solutions always bounded

finite-duration, nonlinear-analysis, ordinary-differential-equations, solution-verification, upper-lower-bounds

Is the rate of change of finite-duration solutions always bounded? (between times $[t_0,\,t_F]$)

I recently found the paper Finite Time Differential Equations (V. T. Haimo, 1985), where it is proved that there exist conditions under which first- and second-order scalar autonomous nonlinear ODEs admit finite-duration solutions: solutions that become exactly zero due to their own dynamics and remain zero forever after this ending time. They are therefore different from a piecewise section of a solution to a common initial-value problem, and also different from solutions that merely vanish at infinity. Moreover, the very condition of being zero forever after a specific time makes these solutions fail the uniqueness conditions, which rules out every linear ODE from having them.

First of all, I will restrict the question here to scalar second-order ODE examples, and specifically to autonomous systems, since they are the common representation of time-invariant physical models; note, however, that their Wikipedia page does not mention the existence of finite-duration solutions anywhere (I have added it now).

Also, I will set aside from the analysis pathological solutions such as the Weierstrass function, the Cantor function, fractals, and other nowhere-differentiable functions, since I believe their differential equations are of a completely different kind (such as stochastic differential equations for Wiener processes). With this, for a continuous function, a jump discontinuity is forbidden, so "no teleportation is allowed".

Finally, since a scalar finite-duration function is compactly supported (it exists between the initial time $t_0$, where the initial-value condition is defined, and the ending time $t_F$, where the dynamics of the system "dies"), it is of unlimited bandwidth, so in theory it is not slope-limited. The slope becomes limited if: (i) the solution is differentiable at every point within its compact support, or (ii) the function has finitely many continuous "sharp edges", like the $\text{abs}()$ function, located on a set of measure zero, making the derivative discontinuous there but with a jump discontinuity of bounded size. And since the solution is of finite duration, being unbounded at infinity is not a possible scenario (unlike the equation $y'=y$, where exponentials lead to an unbounded $y''$ at infinity, although it stays locally bounded everywhere else). So for finite-duration solutions it is as if we were thinking only of "local" properties of functions.

The cited paper shows which conditions the nonlinear, scalar, second-order differential equation must fulfill to support finite-duration solutions (you should probably read the paper first, since I am not listing here everything explained there). In Theorem 2, point (i), it is said that, without loss of generality, we may take the ending time of the finite-duration solution to be $t_F = 0$; then, for a second-order dynamical system described by $\ddot{x}(t) = g(x(t),\dot{x}(t))$ with $g(0,0)=0$ (the system dynamics "die" at $t_F = 0$) and at least $g \in C^1(\mathbb{R}\setminus \{0\})$, for the system to support finite-duration solutions, the following other differential equation must have solutions:
$$q(z)\frac{dq(z)}{dz} = g(z,q(z)),\,q(0)=0$$

Honestly, the paper is a bit advanced for my mathematical skills, but if I didn't make any mistakes, what the author is doing is splitting the second derivative of the scalar one-variable function $x(t)$ as:

$$\ddot{x} = \frac{d}{dt}\frac{dx}{dt} = \frac{d\dot{x}}{dx}\frac{dx}{dt} = q(z)\frac{dq(z)}{dz}$$
by using the change of variable $z=x(t)$ and $q(z)=\dot{x}$.
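As a sanity check on this reduction of order (my own sketch, not from the paper), the following SymPy snippet verifies the identity $\ddot{x} = q(z)\,\frac{dq}{dz}\big|_{z=x(t)}$ on the sample trajectory $x(t)=t^3$, for which $q(z)=3z^{2/3}$:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x = t**3                      # sample trajectory x(t)
xdot = sp.diff(x, t)          # dx/dt = 3 t^2

# change of variable z = x(t): here t = z^(1/3), so q(z) = 3 z^(2/3)
z = sp.symbols('z', positive=True)
q = xdot.subs(t, z**sp.Rational(1, 3))

lhs = (q * sp.diff(q, z)).subs(z, x)   # q dq/dz evaluated at z = x(t)
rhs = sp.diff(x, t, 2)                 # x''(t) = 6 t
assert sp.simplify(lhs - rhs) == 0     # the identity holds
```

The same check works for any trajectory that is invertible on the interval considered, which is what makes $z=x(t)$ a valid change of variable there.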

I don't really understand why this transformation leads to another differential equation that "tells" how the original equation will behave, so the following analysis is probably wrong, but I want to share it with you so you can correct me:

Since I am looking for the maximum speed $\sup_t |\dot{x}|$ to figure out whether the solutions have a bounded derivative $\|\dot{x}\|_\infty < \infty$, and from the paper it looks like I can figure out the behavior of $\dot{x}$ from $q(z)$, I believe that wherever each of them achieves a maximum, the values obtained should be the same, so $\sup_t |\dot{x}| \equiv \sup_z |q(z)|$. With this, since $q(z)=0$ is not really a value I "care" about (if the maximum rate of change happens at a zero value, the solution is constant), I could use first-order conditions to look for the maximum value of $q(z)$, so I need to find $z$ such that $$\frac{dq(z)}{dz}=0 \rightarrow z^* \rightarrow q(z^*)$$

So, since I am interested in $q(z) \neq 0$, looking for the first-order conditions is equivalent (I think) to looking for $q(z)\frac{dq(z)}{dz}=0$, which is indeed the same as looking for $\ddot{x} = 0$ (if my assumption of interchangeability of the equations is right). This would mean that finite-duration solutions of differential equations can only achieve their maximum speed at points of the acceleration profile where it equals zero, $\ddot{x}=0$, which instantly discards situations like the example $f(t) = \frac{t}{2} \cdot \log(t^2),\,|t|\leq 1$, which "softly" achieves $|f'(t)| \to \infty$ at $t=0$, but at this point its second derivative is nonzero (it actually diverges to infinity) – look at it here.
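To make this counterexample concrete, here is a quick SymPy check (my own, not from the paper) that $f(t)=\frac{t}{2}\log(t^2)$ has an unbounded first derivative at $t=0$ while its second derivative also diverges there, rather than vanishing:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
f = t/2 * sp.log(t**2)               # = t*log(t) for t > 0

f1 = sp.simplify(sp.diff(f, t))      # log(t) + 1
f2 = sp.simplify(sp.diff(f, t, 2))   # 1/t

print(sp.limit(f1, t, 0, '+'))       # -oo : unbounded slope at the origin
print(sp.limit(f2, t, 0, '+'))       # oo  : f'' diverges there, it is not zero
```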

This is quite an aggressive claim, since it can be reformulated as: finite-duration solutions of scalar second-order differential equations (which are of unlimited bandwidth because of having finite duration, so in principle they could achieve infinite speeds) are actually restricted, by being solutions of finite-time differential equations (with $g(x, \dot{x}) \in C^1(\mathbb{R}\setminus \{0\})$), to achieving their maximum rate of change only where the acceleration is zero, which rules out its happening at discontinuities (the only places where a "non-teleporting" function could be non-differentiable); so their maximum rate of change is actually bounded, $\|\dot{x}\|_\infty < \infty$…. This is actually too good to be true, I think (maybe it happens because of the restriction that $g(x,\dot{x})$ be at least $C^1$), but since these finite-duration functions are "localized", I think the second-derivative criterion will stand in any case.

I think it is interesting to see which restrictions could arise for mechanical systems described by these finite-time differential equations (as, in my opinion, every classical-mechanics system that considers friction should be) because of having a bounded derivative. Remember that the solutions are already bounded, since they are continuous and compactly supported; if they are also Lebesgue integrable, they will be of finite energy by Hölder's inequality; similarly, if they are also of bounded variation, then since their derivative is bounded (which is what I am trying to prove here), their Dirichlet energy will also be bounded… and I hope (a long shot) that all these restrictions will give a more intuitive interpretation of phenomena like causality, entropy, and the 2nd law of thermodynamics (since "speed" is always limited – if the hypothesis is true), because a finite-duration solution has an ending time, which makes the direction of time's flow obvious.

I did not find much information related to them, so I think they are quite unknown: Are these finite-duration solutions always differentiable? Of bounded variation? Absolutely continuous? Surely they aren't Lipschitz, since in the paper the author explains that:

"One notices immediately that finite time differential equations cannot be Lipschitz at the origin. As all solutions reach zero in finite time, there is non-uniqueness of solutions through zero in backwards time. This, of course, violates the uniqueness condition for solutions of Lipschitz differential equations." This also excludes any solutions that are analytic on the whole real line.

Since the ODEs aren't Lipschitz, I am not sure whether the solutions will always be continuously differentiable, or whether more exotic solutions like functions of unbounded variation could stand, or instead whether they will always be Lebesgue integrable (hence absolutely continuous); this is part of what I am trying to understand through this question.

I hope you find them as interesting as I do, and thanks beforehand for explaining why and where my reasoning is right/wrong. Unfortunately, I don't have any example of these functions so far, which is what I am asking for here.

Update: I believe I have found an example, and I have started a related bounty on this other question trying to prove it: the equation $\dot{x}=-\text{sgn}(x)\sqrt{|x|},\,x(0)=1$ has as solutions, in addition to the trivial $x(t)=0$, the finite-duration solution $x(t) = \frac{1}{4}\left(1-\frac{t}{2}+\left|1-\frac{t}{2}\right|\right)^2$ (see the plot here), which should be a solution by the local existence and uniqueness of solutions of ODEs (remember, I am trying to prove this in the other question).
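As a numerical sanity check (my own sketch, not part of the bounty question), a plain explicit Euler integration of $\dot{x}=-\text{sgn}(x)\sqrt{|x|}$ tracks the claimed closed form, which sits exactly at zero for every $t$ past the ending time $t_F = 2$:

```python
import math

def rhs(x):
    # right-hand side of x' = -sgn(x) * sqrt(|x|)
    return -math.copysign(math.sqrt(abs(x)), x) if x != 0 else 0.0

def exact(t):
    # claimed finite-duration solution: (1 - t/2)^2 up to t = 2, then 0
    u = 1 - t/2
    return 0.25 * (u + abs(u))**2

dt, x, t = 1e-4, 1.0, 0.0     # initial condition x(0) = 1
max_err = 0.0
while t < 3.0:
    x += dt * rhs(x)
    t += dt
    max_err = max(max_err, abs(x - exact(t)))

print(max_err)                  # stays small over the whole of [0, 3]
print(exact(2.0), exact(3.0))   # 0.0 0.0 : the dynamics "die" at t_F = 2
```

Note that the numerical solution also stays (numerically) at zero after $t=2$, instead of going negative, which is the qualitative signature of a finite-duration solution.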

PS: At the beginning of the solution, since there is a jump discontinuity when the initial-value condition is reached (it depends on your definition of an IVP solution, but if before $t_0$ you choose the solution to be zero, this will hold), you could have a problem with the rate of change and the Fourier transform (recall here that $\|\dot{f}\|_\infty \leq \|w\hat{f}\|_1$). This is why it is emphasized that the analysis is within the compact support; but, without loss of generality, it can be extended to include the edges of the support by using the finite-duration Fourier transform, as is done here, avoiding the issue of the discontinuity at the beginning.

Best Answer

Answer by the author of the question:

I had the intuition that an infinitely fast change would require infinite power, since the Fourier transform of a delta function is of unbounded energy... but recently, through other questions, I have started to figure out that I was mistaken: a step function, which has an infinite-speed jump, has a Fourier transform whose energy is convergent (the square of a sinc function, here), but its Fourier spectrum is of unbounded bandwidth, as it is going to be for every finite-duration function (Wiki).
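This can be checked numerically. The sketch below (my own, using a rectangular pulse as the finite-duration signal with jumps) confirms via Parseval's theorem that the spectral energy converges, while the spectrum itself never becomes band-limited, only decaying like $1/|\omega|$:

```python
import numpy as np

# rectangular pulse of width 1: finite duration, jump discontinuities
N, T = 2**16, 100.0
t = np.linspace(-T/2, T/2, N, endpoint=False)
dt = t[1] - t[0]
f = np.where(np.abs(t) <= 0.5, 1.0, 0.0)

F = np.fft.fftshift(np.fft.fft(f)) * dt                 # approx continuous FT
w = np.fft.fftshift(np.fft.fftfreq(N, dt)) * 2*np.pi    # angular frequency

E_time = np.sum(np.abs(f)**2) * dt                      # ~1 (pulse of width 1)
E_freq = np.sum(np.abs(F)**2) * (w[1] - w[0]) / (2*np.pi)
print(E_time, E_freq)            # Parseval: both energies agree

tail = np.abs(F[np.abs(w) > 50])
print(tail.max())                # spectrum still non-negligible far out
```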

But this doesn't mean that every finite-duration function will necessarily have an unbounded rate of change: I now believe that every function that is a solution of a differential equation $x^{(n)} = F(t,x,\dot{x},\cdots,x^{(n-1)})$ will be of bounded rate of change if $F(\cdot)$ doesn't have singularities, which is the case in the cited paper, where it is required that $F(\cdot)$ be at least class $C^1$ for first- and second-order ODEs (non-Lipschitz). This "implies" that the solutions are continuously differentiable, so their derivatives are bounded (I cannot prove it; it was explained to me by another user here in a chat – but it looks like it is a widely known property of differential equations that their solutions are typically one order smoother than the differential equation itself).

Now, with this, I have understood that differentiability is quite unrelated to the signal power (which is highly counterintuitive to me), so it is also possible to make examples of finite-duration functions that are not of bounded derivative: $$x(t) = \frac{(1-t^2+|1-t^2|)\,t\log(t^2)}{4}\,\exp\left(-\frac{t^2}{1-t^2}\right)$$ (see the plot here), whose derivative is smooth except at the origin, where it is unbounded, while being a function of compact support (so, of finite duration).
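A quick numerical sketch of this example (my own): central finite differences show the slope growing without bound as we approach the origin, even though the function itself is continuous and compactly supported:

```python
import math

def x(t):
    a = 1 - t*t
    if t == 0.0 or a <= 0.0:
        return 0.0           # value at the origin and outside the support |t| < 1
    return (a + abs(a)) * t * math.log(t*t) / 4 * math.exp(-t*t / a)

# central finite differences at points approaching the origin
slopes = []
for t0 in (1e-2, 1e-4, 1e-6, 1e-8):
    h = t0 / 10
    slopes.append((x(t0 + h) - x(t0 - h)) / (2*h))

print(slopes)   # near 0 the slope behaves like log|t| + 1, diverging to -infinity
```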