[Math] Estimating the spectral radius of a matrix, noniteratively

linear algebra

Morris Marden's "Geometry of Polynomials" displays a number of formulae that bound the modulus of the largest root of a polynomial without any actual rootfinding. Inspired by this, and since this particular problem crops up in something I'm working on, I was wondering whether one could get good estimates of the spectral radius of a general dense n-by-n matrix $A$ that has been preprocessed as follows:

  1. a similarity transformation to upper Hessenberg form ($A=QHQ^T$, $Q$ orthogonal and $H$ upper Hessenberg); and
  2. subtracting the identity multiplied by the mean of the eigenvalues from $H$ ($H'=H-\frac{\operatorname{trace}(H)}{n}I$, which geometrically corresponds to centering the eigenvalues around the origin of the complex plane). A short sketch of both steps follows this list.
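
For concreteness, here is a minimal sketch of these two preprocessing steps, assuming NumPy/SciPy are available (`scipy.linalg.hessenberg` does the reduction):

```python
import numpy as np
from scipy.linalg import hessenberg

def centered_hessenberg(A):
    """H' = H - (trace(H)/n) I, where A = Q H Q^T is the Hessenberg reduction."""
    H = hessenberg(A)                         # orthogonal similarity: eigenvalues unchanged
    n = H.shape[0]
    return H - (np.trace(H) / n) * np.eye(n)  # shift by the mean eigenvalue
```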

As much as possible, I am trying to avoid resorting to an eigenvalue method, e.g. QR (too much effort!) or the power method (which can misbehave when more than one eigenvalue has modulus equal to the spectral radius), since I only need a quick 2-3 digit approximation of the spectral radius. I have considered expanding $H'$ into its characteristic polynomial (equivalently, a similarity transformation of $H'$ to a Frobenius companion matrix) so that the formulae listed in Marden could apply, but after reading Wilkinson's wonderful book "The Algebraic Eigenvalue Problem", where he details how unstable computing the coefficients of the characteristic polynomial from a matrix can be, I suppose that idea is shot.
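
For concreteness, one classical bound of the sort Marden collects (quoting from memory) is the Cauchy bound:

$$p(z)=z^n+c_{n-1}z^{n-1}+\cdots+c_1 z+c_0 \quad\Rightarrow\quad |z_k|\le 1+\max_{0\le j\le n-1}|c_j| \ \text{ for every root } z_k,$$

so with the characteristic polynomial of $H'$ in hand the spectral radius would be bounded immediately; the instability of forming those coefficients is the whole obstacle.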

My other naïve attempts at estimating the spectral radius include using $||H'||_\infty$ as an estimate and deriving rough bounds from Gerschgorin's theorem (sketched below); the problem I've seen is that both tend to overestimate the spectral radius by a significant factor.
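
A minimal sketch of those two crude bounds, assuming NumPy and with `Hp` standing for the centered matrix $H'$:

```python
import numpy as np

def crude_bounds(Hp):
    inf_norm = np.linalg.norm(Hp, np.inf)        # max absolute row sum, >= spectral radius
    # Gerschgorin: each eigenvalue lies in a disc |z - h_ii| <= sum_{j != i} |h_ij|,
    # so |lambda| <= max_i (|h_ii| + R_i)
    radii = np.sum(np.abs(Hp), axis=1) - np.abs(np.diag(Hp))
    gersh = np.max(np.abs(np.diag(Hp)) + radii)
    return inf_norm, gersh
```

(As written the two coincide, since $|h_{ii}|+\sum_{j\ne i}|h_{ij}|$ is just the $i$-th absolute row sum; applying Gerschgorin to $H'^T$ as well and taking the smaller of the two bounds can only help.)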

Is there a way to estimate the spectral radius more cheaply and noniteratively than actually computing eigenvalues?

Best Answer

Why are you trying to avoid eigenvalue calculations in the first place? I think Arnoldi methods (such as ARPACK, used e.g. in Matlab's eigs) would do a respectable job, and maybe even the power method itself: when there are multiple eigenvalues of about the same modulus, convergence to an eigenvector is problematic, but the growth factor should still be a reliable approximation of the spectral radius.
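
Not the answerer's code, but a minimal NumPy sketch of the growth-factor idea; SciPy also exposes ARPACK directly via `scipy.sparse.linalg.eigs(A, k=1, which='LM')`. The per-step norms are combined geometrically, so even an oscillating pair of dominant eigenvalues averages out to roughly $\rho(A)$:

```python
import numpy as np

def spectral_radius_estimate(A, iters=50, seed=0):
    """Rough estimate of rho(A) from the average growth factor of power iteration."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    log_growth = 0.0
    for _ in range(iters):
        y = A @ x
        step = np.linalg.norm(y)       # per-step growth factor
        log_growth += np.log(step)
        x = y / step
    return np.exp(log_growth / iters)  # geometric mean ~ rho(A) for a generic start vector
```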
