[Math] Fast multiplication of constant symmetric positive-definite matrix and vector.

linear-algebra, na.numerical-analysis

Consider the matrix $H=H^T$, $H>0$, $H \in R^{n \times n}$, and the vector $v \in R^n$. In a numerical algorithm, I need to compute the product $b = Hv$. Right now I am using the naive approach:
$b_i = \sum_{j=1}^{n} h_{ij} v_j, \quad i=1,\dots,n$.
Is there a faster way to compute this product? $H$ is non-sparse and constant (i.e. eigenvectors, eigenvalues, etc. of $H$ are available).

Best Answer

If I understand the question right, by "constant" it is meant that $H$ is a fixed, but arbitrary positive definite matrix.

In general, I don't think you can compute the matrix-vector product $Hv$ faster than $O(n^2)$. But if $H$ has structure (Toeplitz, circulant, strictly diagonally dominant, etc.), or if you are willing to settle for approximate answers, you can compute the product faster ("randomized linear algebra" is a good search term).
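As a concrete illustration of the structured case: a circulant matrix is diagonalized by the DFT, so its matrix-vector product is a circular convolution computable in $O(n \log n)$ with the FFT. The sketch below (a hypothetical example with NumPy; circulant matrices are generally not symmetric positive definite, so this is only to show the structure idea) verifies the FFT-based product against the naive one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)  # first column of the circulant matrix

# Build H explicitly only to check the result; H[i, j] = c[(i - j) mod n].
H = np.array([np.roll(c, i) for i in range(n)]).T
v = rng.standard_normal(n)

# Fast product in O(n log n): Hv = ifft(fft(c) * fft(v)).
b_fast = np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)).real

b_naive = H @ v  # O(n^2) reference
print(np.allclose(b_fast, b_naive))
```

The same FFT trick extends to Toeplitz matrices by embedding them in a circulant matrix of size $2n$.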

A very naive approach is the following: suppose that $v$ is some vector, and instead of using $H$, you use $H_k$, the best rank-$k$ approximation of $H$ (built from its top $k$ eigenpairs, which are available since $H$ is constant). Then $H_kv$ can be computed in time $O(nk)$. The error of this computation is $\|Hv-H_kv\| \le \|H-H_k\|_2\,\|v\| = \lambda_{k+1}\|v\|$, where $\lambda_{k+1}$ is the $(k+1)$-st largest eigenvalue of $H$, so it is easy to characterize.
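A minimal sketch of this low-rank idea, assuming the eigendecomposition of $H$ is precomputed (names and the test matrix are made up for illustration). The key point is to apply $U_k \Lambda_k U_k^T v$ as two tall-skinny products, never forming $H_k$ itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 10

# A fixed symmetric positive-definite test matrix.
A = rng.standard_normal((n, n))
H = A @ A.T + 1e-3 * np.eye(n)

# One-time O(n^3) precomputation, amortized since H is constant.
lam, U = np.linalg.eigh(H)   # eigenvalues in ascending order
Uk = U[:, -k:]               # top-k eigenvectors
lamk = lam[-k:]              # top-k eigenvalues

v = rng.standard_normal(n)

# Per-vector cost O(nk): project, scale, expand.
b_approx = Uk @ (lamk * (Uk.T @ v))

# The error obeys ||Hv - H_k v|| <= lambda_{k+1} ||v||.
err = np.linalg.norm(H @ v - b_approx)
bound = lam[-(k + 1)] * np.linalg.norm(v)
print(err <= bound)
```

Whether this pays off depends entirely on how fast the spectrum of $H$ decays; if the eigenvalues are flat, $k$ must be close to $n$ and nothing is gained.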
