Distance from eigenspace of matrix

algorithms, eigenvalues-eigenvectors, linear algebra, numerical linear algebra, terminology

In linear algebra, is there a dedicated name or concept for the notion of distance between vector subspaces?

I'm asking this because I'm considering a problem in numerical linear algebra where a Krylov subspace iterative method is used. At each step $n$, such a method implicitly adds one basis vector to the Krylov subspace $\mathcal{K}_n(A, b) = \operatorname{span}\{b, Ab, \dots, A^{n-1}b\}$, which increasingly aligns with eigenspaces of the matrix $A$ in the problem $Ax = b$ being solved. So it must be true that if $b$ lies in an eigenspace of $A$, or in the span of only a few eigenvectors, then convergence happens faster.
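Here is a minimal numpy sketch of that intuition (the test matrix, its eigenvalue range, and the vectors `b_easy`/`b_hard` are made-up illustration data, and `krylov_dim` is my own helper, not a library routine): if $b$ lies in the span of $k$ eigenvectors of $A$, the Krylov subspace stops growing after $k$ steps, which is exactly the step at which a method like GMRES would reach the exact solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# A symmetric test matrix with a known orthonormal eigenbasis Q
# and distinct eigenvalues (made-up data for illustration).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T

def krylov_dim(A, b, m, tol=1e-10):
    """Arnoldi iteration: return the dimension at which the Krylov
    subspace K_m(A, b) stops growing (at most m)."""
    V = np.zeros((A.shape[0], m + 1))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):              # modified Gram-Schmidt
            w -= (V[:, i] @ w) * V[:, i]
        h = np.linalg.norm(w)
        if h < tol:                         # "lucky breakdown": the next
            return j + 1                    # Krylov space equals this one
        V[:, j + 1] = w / h
    return m

b_easy = Q[:, :3] @ rng.standard_normal(3)  # in the span of 3 eigenvectors
b_hard = rng.standard_normal(n)             # excites all n eigenvectors

print(krylov_dim(A, b_easy, 10))  # 3  -> GMRES would converge in 3 steps
print(krylov_dim(A, b_hard, 10))  # 10 -> the subspace is still growing
```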

But what if $b$ is very "far" from the eigenspace? I'm trying to work out what a notion of distance between two vector subspaces could mean, or how it could be defined. Would a vector $b$ contained in a subspace "far away" from the eigenspace of $A$ make a Krylov subspace method take more iterations than in the general case?

Best Answer

The common notion of distance is to consider the orthogonal projection $P$ onto the first subspace $V$ and the orthogonal projection $Q$ onto the other subspace $W$.

We can then define the distance between these subspaces as $$d(V, W) = \| P - Q \|,$$ where the norm used is the operator ($2$-) norm. For properties and applications, see Section 2.5.3 of Golub and Van Loan, *Matrix Computations*.

This distance metric is used throughout GVL's exposition of the unsymmetric eigenvalue problem (which is where Krylov methods enter); see Chapter 7.
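As a concrete illustration, here is a minimal numpy sketch of the definition above (the basis matrices and dimensions are made-up, and `subspace_distance` is a hypothetical helper name). For subspaces of equal dimension, this distance equals the sine of the largest principal angle between them, so it always lies in $[0, 1]$.

```python
import numpy as np

def subspace_distance(Vb, Wb):
    """d(V, W) = ||P - Q||_2, where the columns of Vb and Wb span
    V and W, and P, Q are the orthogonal projectors onto them."""
    QV, _ = np.linalg.qr(Vb)          # orthonormal basis for V
    QW, _ = np.linalg.qr(Wb)          # orthonormal basis for W
    P = QV @ QV.T                     # projector onto V
    Q = QW @ QW.T                     # projector onto W
    return np.linalg.norm(P - Q, 2)   # operator (spectral) norm

rng = np.random.default_rng(1)
Vb = rng.standard_normal((6, 2))      # a made-up 2-dim subspace of R^6
Wb = rng.standard_normal((6, 2))      # another made-up 2-dim subspace

print(subspace_distance(Vb, Wb))                                # in [0, 1]
print(subspace_distance(Vb, Vb @ rng.standard_normal((2, 2))))  # ~0: same span
```

The second print shows that two different bases for the same span give distance (numerically) zero, confirming that the definition depends only on the subspaces, not on the bases chosen to represent them.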
