[Math] Deducing a formula for multiplying a tri-diagonal symmetrical matrix with vectors

linear-algebra, matrices, numerical-linear-algebra

This is more of a math-programming problem, dealing with memory efficiency, but I thought it would be nice to share it here.

Let $A \in \mathbb{R}^{n \times n}$ be a tri-diagonal symmetrical matrix, like so: $A = \begin{pmatrix}
d_1 & s_1 \\
s_1 & d_2 & s_2 \\
& s_2 & \ddots & \ddots \\
& & \ddots & \ddots & s_{n-1} \\
& & & s_{n-1} & d_n
\end{pmatrix}$.

Because $A$ is sparse and $n$ can be large, $A$ is stored in memory as two vectors: $d = [d_1 \ d_2 \ \dots \ d_n]$ and $s = [s_1 \ s_2 \ \dots \ s_{n-1}]$.
Given a column vector $v \in \mathbb{R}^{n \times 1}$ and its transpose $v^T \in \mathbb{R}^{1 \times n}$, what formula can be deduced for computing the scalar

$B = v^T \cdot A \cdot v$

using only the vectors $v, d, s$?

Best Answer

Just multiply out to get $$v^TAv=\sum_{i=1}^n d_iv_i^2+2\sum_{i=1}^{n-1}s_iv_iv_{i+1}.$$ You can do whatever vectorised operations you want using this.
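As a sketch of how one might vectorise this formula (function name and NumPy choice are mine, not part of the question), the two sums map directly to dot products, giving an $O(n)$ evaluation that never forms the full matrix:

```python
import numpy as np

def quad_form_tridiag(d, s, v):
    """Compute v^T A v for a symmetric tridiagonal A stored as its
    main diagonal d (length n) and off-diagonal s (length n-1).

    Uses the identity v^T A v = sum(d_i v_i^2) + 2 sum(s_i v_i v_{i+1}),
    so it runs in O(n) time and O(n) memory.
    """
    d = np.asarray(d, dtype=float)
    s = np.asarray(s, dtype=float)
    v = np.asarray(v, dtype=float)
    # First sum: d_i * v_i^2; second sum: s_i * v_i * v_{i+1}.
    return d @ (v * v) + 2.0 * (s @ (v[:-1] * v[1:]))
```

One can sanity-check this against a densely assembled $A$ (e.g. via `np.diag(d) + np.diag(s, 1) + np.diag(s, -1)`) for small $n$.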