On the stability of time-delayed systems in Laplace form

delay-differential-equations, laplace-transform, linear-algebra, linear-control, stability-theory

Consider the following time-delayed system:

$$ \dot{x}(t)=A_0 x(t)+A_1 x(t-d), \quad \quad (1)$$

where $A_0$ and $A_1$ are time-invariant matrices and $d$ is a constant time delay. Taking the Laplace transform (with zero initial history) leads to

$$ s{X}(s)=(A_0+A_1e^{-ds})X(s). $$
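
Equivalently, a nonzero $X(s)$ can exist only where the characteristic matrix is singular, i.e., at the roots of

$$ \det\left(sI - A_0 - A_1 e^{-ds}\right) = 0. $$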

Question: Suppose that we investigate the eigenvalues of $A_0+A_1e^{-ds}$ and derive some useful conditions under which all eigenvalues of $A_0+A_1e^{-ds}$ have strictly negative real parts. Can we conclude that system (1) is asymptotically stable under those conditions?
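
For intuition, here is a minimal numerical sketch for the scalar case $\dot{x}(t) = a_0 x(t) + a_1 x(t-d)$; the coefficient values are illustrative assumptions, and scipy's `lambertw` is used to solve the characteristic equation exactly:

```python
# Minimal numerical sketch for the scalar case x'(t) = a0*x(t) + a1*x(t-d).
# Its characteristic equation s = a0 + a1*exp(-s*d) is solved by the Lambert W
# function: s_k = a0 + W_k(a1*d*exp(-a0*d))/d for every integer branch k, so
# there are infinitely many characteristic roots. For real coefficients the
# principal branch k = 0 is known to give the rightmost root.
# The values of a0, a1, d below are illustrative assumptions.
import numpy as np
from scipy.special import lambertw

def rightmost_root(a0, a1, d):
    """Rightmost characteristic root of x'(t) = a0*x(t) + a1*x(t-d)."""
    return a0 + lambertw(a1 * d * np.exp(-a0 * d), k=0) / d

a0, a1, d = -1.0, 0.5, 2.0
s0 = complex(rightmost_root(a0, a1, d))
print(f"rightmost root: {s0:.4f}")
print("asymptotically stable:", s0.real < 0)  # True for these values
```

The point of the sketch is that the stability certificate is $\operatorname{Re}(s) < 0$ for every root of the characteristic equation, which is not the same object as a sign condition on $A_0 + A_1 e^{-ds}$ evaluated at a fixed $s$.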

Example: Let $A_0+A_1e^{-ds} = \left[\begin{matrix} Ce^{-ds} & D \\ M & -Q \end{matrix}\right]$. By the Schur complement (assuming the matrix is symmetric, i.e. $M = D^{\top}$ and $C$, $Q$ symmetric with $Q \succ 0$), if $Ce^{-ds} + DQ^{-1}M \prec 0$ then $A_0+A_1e^{-ds}$ is negative definite. Now, we proceed with the stability analysis based on conditions that guarantee $Ce^{-ds} + DQ^{-1}M \prec 0$ for all $s$. Does this make sense?
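
As a sanity check of the Schur-complement step, a minimal sketch in NumPy, assuming a symmetric instance with $M = D^{\top}$, $Q \succ 0$, and a real test value of $s$ so that $e^{-ds}$ is real (all sizes and values below are illustrative):

```python
# Sanity check of the Schur-complement step on a symmetric instance
# X = [[C*exp(-d*s), D], [D.T, -Q]] with Q positive definite and s real:
# X is negative definite iff the Schur complement C*exp(-d*s) + D Q^{-1} D^T
# is negative definite. All sizes and values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 3, 1.0, 0.7                           # block size, delay, real test point

C = -5.0 * np.eye(n)                            # symmetric top-left factor
D = rng.standard_normal((n, n))
A = rng.standard_normal((n, n))
Q = A @ A.T + np.eye(n)                         # symmetric positive definite

top_left = C * np.exp(-d * s)
X = np.block([[top_left, D], [D.T, -Q]])        # symmetric block matrix
schur = top_left + D @ np.linalg.inv(Q) @ D.T   # Schur complement w.r.t. -Q

# The two tests always agree; that is exactly the Schur-complement equivalence.
print("X negative definite:    ", np.linalg.eigvalsh(X).max() < 0)
print("Schur negative definite:", np.linalg.eigvalsh(schur).max() < 0)
```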

Best Answer

If you are able to prove that the matrix (with the exponential term) is Hurwitz, then yes. But that may not be easy, as the exponential term makes the characteristic equation transcendental, so it can have infinitely many roots. Also, keep in mind that, with a delay, the system may have different regions of stability. Anyhow, I think the answer you are looking for can be found here:

Niculescu, S.-I., et al. "Delay-dependent stability of linear systems with delayed state: An LMI approach." Proceedings of the 34th IEEE Conference on Decision and Control, vol. 2, IEEE, 1995.
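
As a quick illustration of the delay-dependence point, a minimal sketch for the scalar test system $\dot{x}(t) = -x(t-d)$ (an illustrative example, not taken from the cited paper): each Lambert W branch yields one of the infinitely many characteristic roots, and scanning $d$ shows the rightmost root crossing the imaginary axis at the critical delay $d = \pi/2$.

```python
# Delay-dependent stability of the scalar test system x'(t) = -x(t-d).
# The characteristic equation s = -exp(-s*d) rearranges to (s*d)*exp(s*d) = -d,
# so s = W_k(-d)/d over all integer branches k (infinitely many roots);
# the principal branch k = 0 gives the rightmost root here.
import numpy as np
from scipy.special import lambertw

for d in (1.0, 1.5, np.pi / 2, 1.7, 3.0):
    s0 = complex(lambertw(-d, k=0) / d)       # rightmost characteristic root
    print(f"d = {d:.3f}   Re(rightmost root) = {s0.real:+.4f}")

# A few of the infinitely many roots for d = 1, one per Lambert W branch:
print([f"{complex(lambertw(-1.0, k=k)):.2f}" for k in range(-2, 3)])
```

The printout shows $\operatorname{Re}(s) < 0$ for $d < \pi/2$ and $\operatorname{Re}(s) > 0$ for $d > \pi/2$: the same system is asymptotically stable below the critical delay and unstable above it, which is the kind of delay-dependent behaviour the cited LMI approach addresses.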