Linear Algebra – Show Vandermonde Matrix is Invertible Without Using Determinant

Tags: linear-algebra, matrices

Suppose $a_1, a_2, \dots, a_n$ are distinct elements of a field $F$ (that is, $a_i\neq a_j$ whenever $i\neq j$). Show that the following matrix is invertible:
$$V=\begin{bmatrix}
1 & 1 & \dots & 1\\
a_1 & a_2 & \dots & a_n\\
a_1^2 & a_2^2 & \dots & a_n^2\\
\vdots & \vdots & \ddots & \vdots\\
a_1^{n-1} & a_2^{n-1} & \dots & a_n^{n-1}
\end{bmatrix}$$

Matrix $V$ is a Vandermonde matrix. I don't want to use determinants anywhere. Also, I haven't learned about linear independence yet and don't want to use it here. These are the things I know:

  • Elementary Row Operations and Elementary Row Matrices
  • Row-Reduced Echelon Forms
  • Row Equivalence
  • If $Ax=0$ has only the trivial solution $x=0$, then $A$ is invertible

And things like this, as I'm reading Hoffman and Kunze's book on linear algebra. I want to show $V$ is invertible using only the facts listed above. If you want to use something that I may not know, I would be thankful if you asked in the comments. Any help is appreciated!
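As a quick numerical sanity check of the last criterion (only an illustration over the reals with arbitrarily chosen nodes, not a proof over a general field $F$), one can verify on a small example that $Vx=0$ forces $x=0$:

```python
import numpy as np

# Arbitrary example of distinct nodes a_1, ..., a_n (illustration only).
a = np.array([0.5, 1.0, 2.0, 3.0])
n = a.size

# Build V as in the question: entry (i, j) is a_j ** i for i = 0, ..., n-1.
V = np.vander(a, N=n, increasing=True).T

# Full rank means Vx = 0 has only the trivial solution, i.e. V is invertible.
print(np.linalg.matrix_rank(V) == n)                  # True
print(np.allclose(V @ np.linalg.inv(V), np.eye(n)))   # True
```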

Best Answer

Here is a proof by induction on $n$. For $n=1$ this is trivial (why?). Now write
$$ V_n = \begin{bmatrix} 1 & 1 & \cdots & 1\\ a_1 & a_2 & \cdots & a_n\\ a_1^2 & a_2^2 & \cdots & a_n^2\\ \vdots & \vdots & & \vdots\\ a_1^{n-1} & a_2^{n-1} & \cdots & a_n^{n-1} \end{bmatrix}, $$
assume that $V_n$ is invertible, and consider the matrix $V_{n+1}$.

Now we apply a sequence of elementary row operations: starting from row $n+1$ and finishing with row $2$, subtract from row $j$ row $j-1$ multiplied by $a_{n+1}$. In terms of elementary matrices, this amounts to multiplying $V_{n+1}$ on the left by a sequence of $n$ elementary matrices, and hence by an invertible matrix $A_1$. Thus we obtain
$$ A_1V_{n+1} = \begin{bmatrix} 1 & 1 & \cdots & 1 & 1\\ a_1-a_{n+1} & a_2-a_{n+1} & \cdots & a_n-a_{n+1} & 0\\ a_1(a_1-a_{n+1}) & a_2(a_2-a_{n+1}) & \cdots & a_n(a_n-a_{n+1}) & 0\\ a_1^2(a_1-a_{n+1}) & a_2^2(a_2-a_{n+1}) & \cdots & a_n^2(a_n-a_{n+1}) & 0\\ \vdots & \vdots & & \vdots & \vdots\\ a_1^{n-1}(a_1-a_{n+1}) & a_2^{n-1}(a_2-a_{n+1}) & \cdots & a_n^{n-1}(a_n-a_{n+1}) & 0 \end{bmatrix}. $$
Now consider the transpose matrix
$$ V_{n+1}^TA_1^T = (A_1V_{n+1})^T = \begin{bmatrix} 1 & a_1-a_{n+1} & a_1(a_1-a_{n+1}) & a_1^2(a_1-a_{n+1}) & \cdots & a_1^{n-1}(a_1-a_{n+1})\\ 1 & a_2-a_{n+1} & a_2(a_2-a_{n+1}) & a_2^2(a_2-a_{n+1}) & \cdots & a_2^{n-1}(a_2-a_{n+1})\\ \vdots & \vdots & \vdots & \vdots & & \vdots\\ 1 & a_n-a_{n+1} & a_n(a_n-a_{n+1}) & a_n^2(a_n-a_{n+1}) & \cdots & a_n^{n-1}(a_n-a_{n+1})\\ 1 & 0 & 0 & 0 & \cdots & 0 \end{bmatrix}. $$
Since $a_j-a_{n+1}\neq 0$ for $j=1,\dots,n$, we can divide row $j$ ($1\leq j \leq n$) by the scalar $a_j-a_{n+1}$, again by multiplying on the left by elementary matrices. Thus there is an invertible matrix $A_2$ such that
$$ A_2 V_{n+1}^T A_1^T = \begin{bmatrix} \frac{1}{a_1-a_{n+1}} & 1 & a_1 & a_1^2 & \cdots & a_1^{n-1}\\ \frac{1}{a_2-a_{n+1}} & 1 & a_2 & a_2^2 & \cdots & a_2^{n-1}\\ \vdots & \vdots & \vdots & \vdots & & \vdots\\ \frac{1}{a_n-a_{n+1}} & 1 & a_n & a_n^2 & \cdots & a_n^{n-1}\\ 1 & 0 & 0 & 0 & \cdots & 0 \end{bmatrix}. $$
Now we interchange rows to put row $n+1$ on top. Thus there is an invertible matrix $A_3$ such that
$$ A_3A_2V_{n+1}^T A_1^T = \begin{bmatrix} 1 & 0 & 0 & 0 & \cdots & 0\\ \frac{1}{a_1-a_{n+1}} & 1 & a_1 & a_1^2 & \cdots & a_1^{n-1}\\ \frac{1}{a_2-a_{n+1}} & 1 & a_2 & a_2^2 & \cdots & a_2^{n-1}\\ \vdots & \vdots & \vdots & \vdots & & \vdots\\ \frac{1}{a_n-a_{n+1}} & 1 & a_n & a_n^2 & \cdots & a_n^{n-1} \end{bmatrix}. $$
We now subtract $\frac{1}{a_{j-1}-a_{n+1}}$ times row $1$ from row $j$ for $j=2,\dots,n+1$, which clears the first column, so there is an invertible matrix $A_4$ such that
$$ A_4A_3A_2V_{n+1}^TA_1^T = \begin{bmatrix} 1 & 0 & 0 & 0 & \cdots & 0\\ 0 & 1 & a_1 & a_1^2 & \cdots & a_1^{n-1}\\ 0 & 1 & a_2 & a_2^2 & \cdots & a_2^{n-1}\\ \vdots & \vdots & \vdots & \vdots & & \vdots\\ 0 & 1 & a_n & a_n^2 & \cdots & a_n^{n-1} \end{bmatrix}. $$
Taking transposes (and noting that the lower-right block above is $V_n^T$), we can write, in block notation,
$$ A_1V_{n+1}B = \begin{bmatrix} 1 & \mathbf{0}_{1\times n}\\ \mathbf{0}_{n\times 1} & V_n \end{bmatrix}, $$
where $\mathbf{0}_{i\times j}$ is an $i\times j$ matrix of zeroes and $B=(A_4A_3A_2)^T$ is an invertible matrix. By the induction hypothesis $V_n$ is invertible, so consider the block matrix
$$ C = \begin{bmatrix} 1 & \mathbf{0}_{1\times n}\\ \mathbf{0}_{n\times 1} & V_n^{-1} \end{bmatrix}; $$
a simple multiplication gives
$$ A_1V_{n+1}BC=CA_1V_{n+1}B = I_{n+1}, $$
where $I_{n+1}$ is the identity matrix of size $(n+1)\times(n+1)$. This shows that $V_{n+1}$ is invertible, with $V_{n+1} = A_1^{-1}C^{-1}B^{-1}$ and $V_{n+1}^{-1} = BCA_1$.
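For readers who like to see the row operations in action, here is a small NumPy sketch (an illustration for one arbitrary choice of real nodes, not part of the proof itself) that builds $A_1$ as a product of elementary matrices and checks that $A_1V_{n+1}$ has the displayed form:

```python
import numpy as np

# Illustration of the first step for one arbitrary choice of real nodes
# a_1, ..., a_{n+1} (here n = 3); not part of the proof itself.
a = np.array([0.5, 1.0, 2.0, 3.0])
m = a.size                                  # m = n + 1
V = np.vander(a, N=m, increasing=True).T    # the (n+1) x (n+1) Vandermonde matrix

# Build A_1 as a product of elementary matrices: starting from row n+1 and
# finishing with row 2, subtract a_{n+1} times row j-1 from row j.
A1 = np.eye(m)
for j in range(m - 1, 0, -1):               # rows n+1, n, ..., 2 (0-based indices)
    E = np.eye(m)
    E[j, j - 1] = -a[-1]                    # row_j <- row_j - a_{n+1} * row_{j-1}
    A1 = E @ A1

W = A1 @ V

# Compare with the matrix displayed above: row 1 is all ones, the last column
# collapses to (1, 0, ..., 0)^T, and row i+1, column k is a_k^{i-1}(a_k - a_{n+1}).
expected = np.ones((m, m))
for i in range(1, m):
    expected[i, :-1] = a[:-1] ** (i - 1) * (a[:-1] - a[-1])
    expected[i, -1] = 0.0
print(np.allclose(W, expected))             # True
```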

Note. For those familiar with the computation of the Vandermonde determinant, it is clear that I just followed that computation, so there is nothing new here beyond turning it into a computation of the inverse. Also, taking transposes avoids the use of column operations, for the OP's comfort.

ADDED

As I said in the comments, here is a proof using linear maps. Let $F_n[x]$ be the vector space of polynomials over $F$ of degree strictly less than $n$. We know that
$$ \dim_F F_n[x]=n. $$
Write $x_1,\dots,x_n$ for the given distinct elements of $F$ (the $a_i$ of the question, renamed to avoid a clash with the coordinates used below), and consider the linear map
$$ T:F_n[x]\to F^{n}, \quad f(x)\mapsto (f(x_1),f(x_2),\dots,f(x_n)). $$
This map is surjective. Indeed, let $(c_1,\dots,c_n)\in F^n$. For each $j\in\{1,\dots,n\}$ let
$$ L_j(x) = \frac{(x-x_1)\cdots (x-x_{j-1})(x-x_{j+1})\cdots (x-x_n)}{(x_j-x_1)\cdots (x_j-x_{j-1})(x_j-x_{j+1})\cdots (x_j-x_n)}\in F_n[x]. $$
Note that $L_j(x_k)=\delta_{jk}$, where $\delta_{jk}$ is the Kronecker symbol. Now let
$$ f(x) = \sum_{j=1}^n c_j L_j(x)\in F_n[x]; $$
then $T(f)=(c_1,\dots,c_n)$. This shows that $T$ is surjective. Since both $F_n[x]$ and $F^n$ have dimension $n$, $T$ is a vector space isomorphism; in particular, the matrix of $T$ with respect to any pair of bases is invertible. Now let $B_1=(1,x,x^2,\dots,x^{n-1})$, which is a basis of $F_n[x]$, and let $B_2=(e_1,\dots,e_n)$ be the canonical basis of $F^n$. Then
$$ T(1)=e_1+e_2+\cdots+e_n $$
and, for $j\in\{1,\dots,n-1\}$,
$$ T(x^j) = x_1^j e_1 + x_2^je_2+\cdots + x_n^j e_n. $$
This means that the matrix of $T$ in these bases is
$$ [T] = \begin{bmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1}\\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1}\\ \vdots & \vdots & \vdots & & \vdots\\ 1 & x_n & x_n^2 & \cdots & x_n^{n-1} \end{bmatrix}, $$
and thus $[T]$ is invertible. But $[T]$ is the transpose of the Vandermonde matrix, and the transpose of an invertible matrix is invertible (since $M^T(M^{-1})^T=(M^{-1}M)^T=I$), so the Vandermonde matrix itself is invertible.
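Here too a short NumPy sketch may help (an illustration with arbitrary real nodes, not a proof): it builds the Lagrange polynomials $L_j$, forms $f=\sum_j c_j L_j$ for an arbitrary target vector, and checks that the matrix $[T]$, the transpose of the Vandermonde matrix, sends the coefficients of $f$ to the prescribed values.

```python
import numpy as np

# Illustration of the surjectivity argument for one arbitrary choice of
# real nodes x_1, ..., x_n; not a proof.
x = np.array([0.5, 1.0, 2.0, 3.0])            # distinct nodes
n = x.size

def lagrange_basis(j):
    """Coefficients of L_j in increasing powers, so that L_j(x_k) = delta_jk."""
    num = np.poly([xm for m, xm in enumerate(x) if m != j])   # monic, decreasing powers
    den = np.prod([x[j] - xm for m, xm in enumerate(x) if m != j])
    return (num / den)[::-1]                  # flip to increasing powers

# Matrix of T in the bases (1, x, ..., x^{n-1}) and (e_1, ..., e_n):
# entry (k, j) is x_k^j, i.e. the transpose of the Vandermonde matrix.
T = np.vander(x, N=n, increasing=True)

# Surjectivity in action: f = sum_j c_j L_j satisfies T(f) = (c_1, ..., c_n).
c = np.array([1.0, -2.0, 0.0, 5.0])           # arbitrary target values
coeffs = sum(c[j] * lagrange_basis(j) for j in range(n))
print(np.allclose(T @ coeffs, c))             # True
```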
