[Math] Non-hyperbolic fixed points in multidimensional systems

differential equations, ds.dynamical-systems

Consider first a one-dimensional dynamical system given by $dx/dt = f(x)$. Suppose that the origin is a fixed point, i.e. $f(0)=0$. Suppose that we're interested in whether trajectories that start from positive initial values of $x$ (hereafter "positive trajectories") will converge towards the origin or diverge away from it.

If $df/dx$ is negative at $0$ then the fixed point is locally asymptotically stable; if it's positive then it's unstable. If $df/dx=0$ at the origin then the fixed point is called "non-hyperbolic". If additionally $d^2 f/dx^2 <0$ then it may or may not be Lyapunov stable (because negative trajectories might diverge), but positive trajectories will converge toward the origin, albeit more slowly than toward a linearly stable fixed point. In general, positive trajectories converge towards the origin iff the lowest-order non-zero derivative of $f$ at the origin is negative (assuming such a derivative exists).
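(To fix ideas, this one-dimensional test is easy to automate symbolically. The sketch below is only an illustration: the helper `positive_side_stability` and the example $f(x)=-x^3+x^5$ are hypothetical, not from any particular system. It looks for the lowest-order non-vanishing derivative of $f$ at $0$ and reads off the fate of positive trajectories from its sign.)

```python
import sympy as sp

x = sp.symbols('x')

def positive_side_stability(f, max_order=10):
    """For dx/dt = f(x) with f(0) = 0, find the lowest-order non-zero
    derivative of f at 0; its sign determines whether positive
    trajectories near 0 converge to the origin or diverge from it."""
    for k in range(1, max_order + 1):
        c = sp.diff(f, x, k).subs(x, 0)
        if c != 0:
            return ("converge" if c < 0 else "diverge"), k, c
    return "undetermined up to the tested order", None, None

f = -x**3 + x**5                     # hypothetical non-hyperbolic example
print(positive_side_stability(f))    # ('converge', 3, -6)
```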

My question is about how to generalise this to the $n$-dimensional case. I'm dealing with systems where no trajectory starting in the non-negative orthant can leave the non-negative orthant, and I'm interested only in trajectories that are confined to it ("nonnegative trajectories").

In the case where the origin is a hyperbolic fixed point it's clear how to proceed. One forms the linear approximation $\mathbf{\dot{x}} = A\mathbf{x}$. The Jacobian matrix $A$ is the analogue of $df/dx$: if all its eigenvalues have real part strictly less than $0$ then the fixed point is stable, and hence all trajectories (nonnegative or otherwise) sufficiently close to the origin will converge toward it. If one of the eigenvalues has real part greater than $0$ then the fixed point is unstable. (The condition that trajectories are confined to the non-negative orthant implies that $A$ can have negative entries only on its diagonal, which in turn means that its leading eigenvalue always has a corresponding eigenvector in the non-negative orthant, by the Perron-Frobenius theorem applied to $e^{\varepsilon A}$ for sufficiently small $\varepsilon$; hence we don't need to worry about the possibility that the fixed point is unstable only along trajectories that leave the non-negative orthant.)
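In practice this check is cheap even for $n$ of order 1000. The sketch below is a minimal illustration (the helper name, the tolerance and the toy Metzler matrix are hypothetical, not part of any actual system); it simply looks at the largest real part of the spectrum of $A$:

```python
import numpy as np

def classify_hyperbolic(A, tol=1e-9):
    """Classify the origin of x' = A x via the largest real part of the
    spectrum of A.  A dense eigensolver is used here; for a large sparse
    Jacobian one could instead ask an iterative solver such as
    scipy.sparse.linalg.eigs(A, k=1, which='LR') for the leading eigenvalue."""
    lead = np.linalg.eigvals(A).real.max()
    if lead < -tol:
        return "asymptotically stable"
    if lead > tol:
        return "unstable"
    return "non-hyperbolic: the linearisation is inconclusive"

# Toy Metzler matrix (negative entries only on the diagonal):
A = np.array([[-2.0, 1.0],
              [0.5, -1.0]])
print(classify_hyperbolic(A))   # both eigenvalues are negative -> stable
```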

However, I'm currently dealing with systems where the leading eigenvalue of the Jacobian $A$ is exactly $0$, i.e. the fixed point is non-hyperbolic. In this case I guess I have to move up to a quadratic approximation:
$$
\frac{dx_i}{dt} = \sum_j A_{ij}x_j + \sum_{jk}B_{ijk}x_jx_k.
$$
I can calculate the values of $A_{ij}$ and $B_{ijk}$ for my systems. (They are generated computationally, and in my systems $n$ is of the order of 1000 or so.) In addition to $A$ having negative entries only on its diagonal, the condition of being confined to the non-negative orthant implies that if $B_{ijk}$ is non-zero then it is negative if and only if $i=j$ or $i=k$ (or both).
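For what it's worth, the quadratic model itself is easy to evaluate and to probe numerically. The sketch below (a hypothetical $2\times 2$ example with dense arrays, purely for illustration; at $n\sim 1000$ the tensor $B$ would have to be stored sparsely) just integrates one nonnegative trajectory of the truncated system, which is of course only an empirical check, not a stability proof:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x, A, B):
    """dx_i/dt = sum_j A_ij x_j + sum_{jk} B_ijk x_j x_k."""
    return A @ x + np.einsum('ijk,j,k->i', B, x, x)

# Hypothetical 2-dimensional example (not an actual system):
A = np.array([[0.0, 1.0],      # leading eigenvalue 0, off-diagonal >= 0
              [0.0, -1.0]])
B = np.zeros((2, 2, 2))
B[0, 0, 0] = -0.5              # negative entry only where i = j (= k)

sol = solve_ivp(rhs, (0.0, 200.0), [0.1, 0.1], args=(A, B), rtol=1e-8)
print(sol.y[:, -1])            # this particular trajectory decays (slowly)
```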

The problem is that I don't know how to tell from these values whether the origin is stable or not. I'm looking both for an algorithm to determine this and for necessary and sufficient conditions that are easy to compute. It strikes me that there's probably a nicely worked-out theory that deals with this, but I don't know its name, so I can't find any information about it.

Best Answer

Unfortunately you'll have to face a «no» answer, as there is no general/generic way to handle your problem…

First of all, your definition of «non-hyperbolic» is not clear to me when $n>1$. As far as I know a singular point like yours is deemed hyperbolic when $0$ does not belong to the real convex hull of the spectrum (in $\mathbb C$) of the linear part $A$. In case the singularity is hyperbolic, Poincaré's linearization theorem ensures that the system is dynamically equivalent to its linear part, as you refer to in the question. Certainly when you have null eigenvalues the system is not hyperbolic in that sense.

Now if the singular point is not hyperbolic, then a lot of bad behaviors can crop up, even in the case $n=2$. The presence of resonances ($\mathbb Q$-linear relations between the eigenvalues of $A$) or quasi-resonances (irrational relations which are «well approximated» by rational ones) can mess things up. Even from a formal viewpoint the answer is not clear, and certainly not computable without restrictive conditions on $f$ (e.g. polynomial, and even then…). In particular it is not true that the system $\dot x=f(x)$ is conjugate (dynamically equivalent), via changes of coordinates which are merely $C^1$ at the singular point, to a system where $f$ is algebraic. Hence it is not sufficient to take a finite jet (let alone the second jet) to answer your question when there are resonances (as is the case here, since at least one eigenvalue vanishes).
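To make that last point concrete, a standard illustration (not specific to your system) is the Euler example
$$
\dot x = x^2, \qquad \dot y = y - x,
$$
whose linear part has eigenvalues $0$ and $1$. Its formal centre manifold is the graph of $y=\sum_{n\ge 1}(n-1)!\,x^{n}$, a series that diverges for every $x\neq 0$; this kind of divergence is the typical mechanism behind the failure of finite-jet (and even formal) classification near such singular points.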

Dynamical systems with at least one zero eigenvalue are called «saddle-nodes», so you might wish to search for this keyword. Notice, though, that the theory of saddle-nodes is complete only in the smooth case (that is, $f$ smooth) and for $n=2$; it is only partially worked out when $n>2$. The more resonances there are (e.g. zero eigenvalues), the more difficult the problem becomes, even from a purely theoretical viewpoint…

So, without more information on your specific system, nothing can be said.

EDIT: You can still say something regarding stability. For instance look at the system $$\dot x(t) = x(t)^{k+1} \\ \dot y(t)=-y(t)$$ with $x, y\in\mathbb R$. Depending on the parity of $k$, either every trajectory starting in the half-plane $x\le 0$ converges to the origin, or only the trajectories on the line $x=0$ do. In these problems you need to identify the directions in the $x$-variable along which you pick up contributions of exponential terms $\exp(x^{-k}/k)$. This can be done by inspecting the linear part, and in general it governs «how stable» the solutions will be. I don't have a recipe when the non-hyperbolic part has dimension greater than $1$, though.
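Explicitly, where that exponential comes from: along a trajectory of the example, $\frac{dy}{dx}=\frac{\dot y}{\dot x}=-\frac{y}{x^{k+1}}$, which integrates to $y=C\exp\!\left(x^{-k}/k\right)$; whether this factor is flat or blows up as $x\to 0^{\pm}$ is exactly what separates the directions along which trajectories reach the origin from those along which they do not.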
