Suppose you have an object for which all the derivatives of position are zero. In order to get the object moving, you would need to increase the velocity from zero to some finite value. A change in velocity is acceleration, so the acceleration would have to increase from zero to some value. You would also need to increase the jerk from zero to some value to have a change in acceleration. Does this require an infinite series of higher derivatives of position to work? Or am I missing something?
[Physics] Infinite series of derivatives of position when starting from rest
calculus, kinematics, mathematical physics, mathematics
Related Solutions
If you want a simple intuitive explanation, you can get a lot from vehicles.
In a car traveling at a constant speed, suppose there is a white dot painted on the top of the steering wheel. If that dot is in the center, you are traveling in a straight line. If you turn it some angle to the left, say 90 degrees, then the car is traveling in a circular arc at a constant lateral acceleration. That is the second derivative of lateral position.
If you turn the steering wheel at a constant rate from 0 degrees to 90 degrees, then the lateral acceleration changes at a constant rate while you are turning the wheel. That rate of change is jerk, and it is constant because you are turning the steering wheel at a constant speed. It is the third derivative of lateral position.
(While you are doing this, the path traced by the car is a spiral of linearly increasing curvature. Highway and railway curves are built from such spirals to connect segments of constant curvature. The spiral gives room to bank the roadway gradually; without it, drivers tend to drift across lanes, and trains noticeably jerk when entering or leaving a curve.)
However, if you don't turn the steering wheel at a constant rate, but rather accelerate it leftward from 0 degrees until you are turning it quickly at 45 degrees, and then decelerate it until it reaches 90 degrees, then you are giving the car a doublet of snap, first positive, then negative. Snap is the fourth derivative of lateral position.
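If it helps to see numbers, here is a minimal sketch of that last maneuver, assuming a simple constant-speed relation in which lateral acceleration is proportional to the steering-wheel angle; the speed, the curvature gain, and the steering profile are all invented for illustration.

```python
import numpy as np

# Assumed model: at constant speed v, lateral acceleration a = v^2 * kappa,
# with path curvature kappa proportional to the steering-wheel angle.
v = 20.0              # speed, m/s (assumed)
k = 0.002             # curvature per degree of wheel angle (assumed)
t = np.linspace(0.0, 2.0, 4001)

# Accelerate the wheel from 0 to 45 deg in the first second, then
# decelerate it from 45 to 90 deg in the second: constant angular
# acceleration +90 deg/s^2, then -90 deg/s^2.
alpha = 90.0
wheel = np.where(t < 1.0,
                 0.5 * alpha * t**2,
                 45.0 + alpha * (t - 1.0) - 0.5 * alpha * (t - 1.0)**2)

a = v**2 * k * wheel          # lateral acceleration (2nd derivative)
jerk = np.gradient(a, t)      # 3rd derivative of lateral position
snap = np.gradient(jerk, t)   # 4th derivative: the +/- doublet

print(f"snap in first half:  {np.median(snap[t < 1.0]):+.1f} m/s^4")
print(f"snap in second half: {np.median(snap[t > 1.0]):+.1f} m/s^4")
```

The printed values are equal and opposite: the doublet of snap described above.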
Another vehicle that illustrates this is a submarine with bow planes to control depth. Suppose a motor rotates the bow planes at a constant speed: up, down, or stopped. The angle of the bow planes determines the pitch rate of the submarine, and the pitch angle of the submarine determines the rate of change of depth.
So the pitch angle is proportional to the first derivative of depth, the bow plane angle determines the second derivative, and the speed of the bow plane motor is the third derivative.
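A minimal sketch of that chain of integrations, with every gain set to 1 purely for illustration: a constant motor speed (constant third derivative) makes depth grow like $t^3/6$.

```python
import numpy as np

# Assumed chain, all gains = 1:
# motor speed -> bow-plane angle -> pitch angle -> depth.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
motor = 1.0                                      # constant 3rd derivative of depth
plane = np.cumsum(np.full_like(t, motor)) * dt   # bow-plane angle (2nd derivative)
pitch = np.cumsum(plane) * dt                    # pitch angle (1st derivative)
depth = np.cumsum(pitch) * dt                    # depth itself

# With a constant third derivative, depth grows like t^3/6:
print(depth[-1], t[-1]**3 / 6.0)                 # both roughly 166.7
```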
Also, take a rocket with gimballed engines. If the thrust line of an engine does not pass through the rocket's center of mass, it produces angular acceleration, the second derivative of the rocket's orientation. Motors move the engine gimbals, and the rate at which they move them determines the third derivative of orientation.
I'm sure you can think of other examples.
Any "reasonable" function $f$ (such functions are called analytic) has such an expansion, known as a Taylor expansion involving the derivatives of the function itself, which converges to it. Consider a position $x(t)$ of a particle: \begin{aligned} x(t) = x(t_0 + \Delta t) &= x(t_0) + x'(t_0)\Delta t + \tfrac{1}{2}x''(t_0)\Delta t^2 + \tfrac{1}{6}x'''(t_0)\Delta t^3 + \dotsb \\ &= \sum_{n=0}^\infty \tfrac{1}{n!}x^{(n)}(t_0)\Delta t^n. \end{aligned} As you mention, we expand the function to higher orders depending on the influence of the higher-order derivatives. However, note that in this expansion, the powers of $\Delta t$ increase with each additional term we add to the expansion. For small values of $\Delta t$, these terms have less and less influence. A small term cubed is smaller than a small term squared is less than a small term to the first power.
For that matter, this kind of series expansion is ubiquitous in physics, for example in the potential energy of a particle near a point of stable equilibrium $x_0$: \begin{aligned} U(x) = U(x_0 + \Delta x) &= U(x_0) + U'(x_0)\Delta x + \tfrac{1}{2}U''(x_0)\Delta x^2 + \dotsb \\ &= U(x_0) + \tfrac{1}{2}U''(x_0)\Delta x^2 + \dotsb \end{aligned} since the term involving the first derivative of $U$ vanishes at a point of stable equilibrium (the derivative is zero at a minimum). If we rewrite this equation, ignoring the higher-order terms that contribute less and less, and define $U(x) - U(x_0) = \Delta U$, we have \begin{aligned} U(x) &= U(x_0) + \tfrac{1}{2}U''(x_0)\Delta x^2 \\ \Delta U &= \tfrac{1}{2}U''(x_0)\Delta x^2, \end{aligned} which you may notice is suspiciously similar to the potential energy of a harmonic oscillator such as a mass on a spring: $$ \Delta U = \tfrac{1}{2}k\Delta x^2. $$ The harmonic oscillator is thus a special case of the more general expansion above.
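As a sanity check, here is a short sympy sketch using an assumed sample potential, the pendulum-like $U(x) = -\cos x$ near its minimum at $x_0 = 0$: the linear term drops out and the quadratic coefficient is exactly $\tfrac{1}{2}U''(x_0)$, i.e. $k = U''(x_0)$.

```python
import sympy as sp

# Expand the assumed potential U(x) = -cos(x) about its minimum x0 = 0.
x, dx = sp.symbols('x dx')
U = -sp.cos(x)
x0 = 0

series = U.subs(x, x0 + dx).series(dx, 0, 4).removeO()
print(sp.expand(series))          # -1 + dx**2/2: no linear term
print(U.diff(x, 2).subs(x, x0))   # 1, so k = U''(x0) = 1 here
```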
Best Answer
I) Disclaimer: In this answer, we will only consider the mathematically idealized classical problem, which is interesting in itself, although admittedly academic and detached from actual physical systems.
II) It is in principle possible that all time derivatives of the position $x(t)$ vanish at $t=0$, and yet the particle moves away (from where it was at $t=0$).
Fig. 1: A plot of position $x$ as a function of time $t$.
Example: Assume that the position is given by the following infinitely many times differentiable function
$$ x(t)~:=~\left\{ \begin{array}{ccl} \exp\left(-\frac{1}{|t|}\right)&\text{for} & t~\neq~ 0, \\ \\ 0 &\text{for} & t~=~ 0. \end{array} \right. $$
Note that the Taylor series for the $C^{\infty}$-function $x:\mathbb{R}\to\mathbb{R}$ at the point $t=0$ is identically zero.$^1$ So the function $x$ and its Taylor series differ for $t\neq 0$. In particular, we conclude that the smooth function $x$ is not a real analytic function.
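A quick numerical illustration of just how flat this function is at the origin (the function is even, so $t>0$ suffices): even after dividing by $t^5$, the values still collapse to zero as $t \to 0$.

```python
import math

# exp(-1/|t|) is crushed flat near t = 0: every quotient x(t)/t^n
# still tends to zero, so every derivative at the origin vanishes.
def x(t):
    return math.exp(-1.0 / abs(t)) if t != 0 else 0.0

for t in (0.5, 0.1, 0.05, 0.02):
    print(f"t={t:5}  x(t)={x(t):.3e}  x(t)/t^5={x(t)/t**5:.3e}")
```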
III) If you like this Phys.SE question, you may also enjoy reading What situations in classical physics are non-deterministic?, Non-uniqueness of solutions in Newtonian mechanics, and Norton's dome and its equation.
--
$^1$ Sketched proof that $x\in C^{\infty}(\mathbb{R})$: Firstly, obviously, $x$ is $C^{\infty}$ for $t\neq 0$. By induction in $n\in\mathbb{N}_0$, for $t\neq 0$, it is straightforward to deduce that the $n$'th derivative is of the form $$x^{(n)}(t)~=~\frac{P_n(t,|t|)}{Q_n(t,|t|)}x(t), \qquad t\neq 0,$$ for some polynomials $P_n$ and $Q_n\neq 0$. Secondly, it follows that the $(n\!+\!1)$'th derivative at the origin $$x^{(n+1)}(0)~=~\lim_{t\to 0} \frac{x^{(n)}(t)}{t}~=~0$$ exists and is zero "because exponentials beat polynomials". $\Box$
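For the skeptical, here is a short sympy check of those limits for the first few derivatives (by the symmetry of $|t|$, the one-sided case $t>0$ is enough):

```python
import sympy as sp

# For t > 0 the function is exp(-1/t); verify that its first few
# derivatives all vanish as t -> 0+ ("exponentials beat polynomials").
t = sp.symbols('t', positive=True)
expr = sp.exp(-1/t)

for n in range(1, 5):
    expr = sp.diff(expr, t)
    print(n, sp.simplify(expr), sp.limit(expr, t, 0, '+'))
```

Each derivative comes out as a rational function of $t$ times $e^{-1/t}$, exactly the $P_n/Q_n$ form in the sketched proof, and each limit prints $0$.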