Ordinary Differential Equations – What Exactly Is a Steady-State Solution?

ordinary differential equations

In solving differential equations, one encounters the notion of a steady-state solution. My textbook says that the steady-state solution is the limit of solutions of (ordinary) differential equations as $t \rightarrow \infty$. But the steady-state solution is given as a function $f(t)$, so the solution depends on $t$; what, then, is the $t$ that appears in the limit?

Best Answer

In different areas, steady state has slightly different meanings, so please be aware of that.

We want a theory to study the qualitative properties of solutions of differential equations, without solving the equations explicitly.

Moreover, we often want to know whether a certain property of these solutions remains unchanged if the system is subjected to various changes (often called perturbations).

It is very important to be able to study how sensitive a particular model is to small perturbations, or to changes in the initial conditions and in the various parameters.

This leads us to an area of differential equations called stability analysis, which uses phase-space methods; we consider it for both autonomous and nonautonomous systems under the umbrella term of equilibrium.

Autonomous

Definition: The equilibrium solution ${y_0}$ of an autonomous system $y' = f(y)$ is said to be stable if for each number $\varepsilon > 0$ we can find a number $\delta > 0$ (depending on $\varepsilon$) such that if $\psi(t)$ is any solution of $y' = f(y)$ having $\Vert \psi(t_0) - y_0 \Vert < \delta$, then the solution $\psi(t)$ exists for all $t \geq t_0$ and $\Vert \psi(t) - y_0 \Vert < \varepsilon$ for all $t \geq t_0$ (where, for convenience, the norm is the Euclidean distance, which makes neighborhoods spherical).
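As a small illustration (my own standard textbook example, not part of the quoted definition): for the linear oscillator written as a planar system, the origin is stable but not asymptotically stable. We have

$$x' = y, \qquad y' = -x, \qquad \psi(t) = \big(x(t_0)\cos(t-t_0) + y(t_0)\sin(t-t_0),\; -x(t_0)\sin(t-t_0) + y(t_0)\cos(t-t_0)\big),$$

and $\Vert \psi(t) \Vert = \Vert \psi(t_0) \Vert$ for all $t$ (the solutions are circles around the origin), so taking $\delta = \varepsilon$ works in the definition, yet $\psi(t)$ never converges to the origin.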

Definition: The equilibrium solution ${y_0}$ is said to be asymptotically stable if it is stable and if there exists a number $\delta_0 > 0$ such that if $\psi(t)$ is any solution of $y' = f(y)$ having $\Vert \psi(t_0) - y_0 \Vert < \delta_0$, then $\lim_{t\rightarrow+\infty} \psi(t) = y_0$.
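A minimal example (again my own illustration): for $y' = -y$ the equilibrium $y_0 = 0$ is asymptotically stable, since

$$\psi(t) = \psi(t_0)\, e^{-(t - t_0)}, \qquad \lim_{t \to +\infty} \psi(t) = 0 = y_0 .$$

Here $t$ is simply the independent variable of the solution $\psi(t)$, and this limit is exactly the "limit as $t \to \infty$" that your textbook calls the steady state: the value the solution settles down to as $t$ grows without bound.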

The equilibrium solution ${y_0}$ is said to be unstable if it is not stable.
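For contrast (one more illustrative example of mine), $y' = y$ has the unstable equilibrium $y_0 = 0$: every nonzero solution

$$\psi(t) = \psi(t_0)\, e^{\,t - t_0}$$

eventually leaves any $\varepsilon$-neighborhood of $0$, no matter how small $\Vert \psi(t_0) \Vert$ is chosen.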

Equivalent definitions can be written for the nonautonomous system $y' = f(t, y)$.

Now we can add notions of global asymptotic stability, regions of asymptotic stability, and so forth.

From all of these definitions, we can prove nice theorems about linear and almost-linear systems by looking at eigenvalues, and we can add notions of conditional stability.
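To give the flavor of such a theorem (stated here as a standard fact for illustration, not as part of the original answer): for the constant-coefficient linear system $y' = Ay$, the equilibrium $y_0 = 0$ is asymptotically stable when every eigenvalue of $A$ has negative real part, and unstable when some eigenvalue has positive real part. For example,

$$A = \begin{pmatrix} -1 & 0 \\ 0 & -2 \end{pmatrix} \quad\Longrightarrow\quad \psi(t) = \begin{pmatrix} c_1 e^{-t} \\ c_2 e^{-2t} \end{pmatrix} \to 0 \quad \text{as } t \to +\infty,$$

so the origin is asymptotically stable for this system.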

Update

You might also want to peruse the web for notes that deal with the above; see, for example, DEQ.

Regards