Prove that generalized Vandermonde matrix is invertible

determinant, linear algebra, matrices, matrix analysis

Given $$A = \left( z_i^{\lambda_j}\right)_{i,j = 1,\ldots, n} =
\begin{pmatrix}
z_1^{\lambda_1} & z_1^{\lambda_2} & \cdots & z_1^{\lambda_n} \\
z_2^{\lambda_1} & z_2^{\lambda_2} & \cdots & z_2^{\lambda_n} \\
\vdots & \vdots & \ddots & \vdots \\
z_n^{\lambda_1} & z_n^{\lambda_2} & \cdots & z_n^{\lambda_n}
\end{pmatrix}$$

with $0 < z_1 < z_2 < \cdots < z_n$ and real exponents $\lambda_1 < \lambda_2 < \cdots < \lambda_n$, how can one show that $A$ is invertible?

I think I probably need to show that the determinant is not zero, but I am not sure how.
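As a quick numerical sanity check (not a proof), I also tried some arbitrary sample values for the $z_i$ and $\lambda_j$ in NumPy, and the determinant does come out nonzero (in fact positive):

```python
import numpy as np

# Arbitrary sample values, chosen only for illustration:
z = np.array([0.5, 1.3, 2.0, 3.7])      # 0 < z_1 < z_2 < ... < z_n
lam = np.array([-1.2, 0.0, 0.5, 2.8])   # real exponents lambda_1 < ... < lambda_n

# A[i, j] = z_i ** lambda_j
A = z[:, None] ** lam[None, :]

print(np.linalg.det(A))  # nonzero (positive) for these sample values
```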

I did some derivation (shown in an image, omitted here), but I am not sure how to continue, or whether this is the right approach at all.

There is a similar question here, Generalized Vandermonde-Matrix, but the conditions there are different.

Best Answer

We assume $\alpha_1<\alpha_2<\ldots<\alpha_n$ are real and $x_j>0$ for $j=1,2,\ldots, n$ (writing $x_j$ for $z_j$ and $\alpha_j$ for $\lambda_j$). The proof goes by induction on $n.$ The conclusion is obviously true for $n=2$: indeed, $\det A=x_1^{\alpha_1}x_2^{\alpha_1}\bigl(x_2^{\alpha_2-\alpha_1}-x_1^{\alpha_2-\alpha_1}\bigr)\neq 0$ because $0<x_1<x_2$ and $\alpha_1<\alpha_2.$ Assume the conclusion is true for $n-1,$ and suppose $\det A=0.$ Equivalently, the columns of $A$ are linearly dependent. Therefore there exist constants $c_1,c_2,\ldots , c_n,$ not all equal to $0,$ such that the function $$c_1x^{\alpha_1}+c_2x^{\alpha_2}+\ldots +c_nx^{\alpha_n}$$ vanishes at the points $x_1<x_2<\ldots < x_n.$ Hence the function $$f(x)=c_1+c_2x^{\alpha_2-\alpha_1}+\ldots +c_nx^{\alpha_n-\alpha_1}\quad (*)$$ vanishes at $x_1<x_2<\ldots < x_n.$ By Rolle's theorem, applied between consecutive zeros, there exist points $x_1<u_1<x_2<u_2<\ldots<u_{n-1}<x_n$ at which the derivative $$f'(x)=c_2(\alpha_2-\alpha_1)x^{\alpha_2-\alpha_1-1}+\ldots +c_n(\alpha_n-\alpha_1)x^{\alpha_n-\alpha_1-1}$$ vanishes. This is a linear combination of $n-1$ powers of $x$ with strictly increasing real exponents that vanishes at $n-1$ distinct positive points, so by the induction hypothesis (invertibility of the corresponding $(n-1)\times(n-1)$ matrix) we conclude that $c_j(\alpha_j-\alpha_1)=0$ for $2\le j\le n.$ Since $\alpha_j>\alpha_1,$ this gives $c_2=\ldots =c_n=0.$ By $(*)$ we then have $f(x)\equiv c_1,$ and since $f$ vanishes at $x_1$ we get $c_1=0,$ a contradiction. The induction step is thus completed. $\blacksquare$
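Not part of the proof, but as a numerical illustration of the fact the induction establishes, namely that a nontrivial combination $c_1x^{\alpha_1}+\ldots+c_nx^{\alpha_n}$ with distinct real exponents has at most $n-1$ zeros on $(0,\infty)$: here is a minimal NumPy sketch with randomly made-up instances. Counting sign changes of $f$ on a grid can only undercount its zeros, so the assertion below is a one-sided check of that bound.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.01, 20.0, 20_000)          # grid in (0, infinity)

for _ in range(200):
    n = int(rng.integers(2, 6))
    a = np.sort(rng.uniform(-3.0, 3.0, n))   # distinct real exponents (almost surely)
    c = rng.normal(size=n)                   # random nontrivial coefficients
    f = (c * x[:, None] ** a).sum(axis=1)    # f(x) = sum_j c_j * x ** a_j on the grid
    sign_changes = np.count_nonzero(np.diff(np.sign(f)) != 0)
    assert sign_changes <= n - 1             # at most n-1 zeros, hence at most n-1 sign changes
```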

Remark. By the intermediate value property it can be proved that the determinant is in fact positive. Indeed, the function $$F(\alpha_1,\alpha_2,\ldots,\alpha_n)=\det A(\alpha_1,\alpha_2,\ldots,\alpha_n)$$ is continuous. Let $k$ be a positive integer such that $\alpha_1<\ldots <\alpha_n <k.$ Then $\alpha_j<k+j.$ Consider the function $$g(t):=F\bigl((1-t)\alpha_1+t(k+1) ,(1-t)\alpha_2+t(k+2),\ldots,(1-t)\alpha_{n}+t(k+n)\bigr),\qquad t\in[0,1].$$ Then $g(0)=F(\alpha_1,\alpha_2,\ldots,\alpha_n)$ and $$g(1)=F(k+1,k+2,\ldots, k+n)=\det\bigl(x_i^{k+j}\bigr)=\prod_{i=1}^n x_i^{k+1}\prod_{i<j}(x_j-x_i)>0,$$ an ordinary Vandermonde determinant up to a positive factor. For every $t\in[0,1]$ the exponents $(1-t)\alpha_j+t(k+j)$ are strictly increasing in $j,$ so by the result above $g(t)$ never vanishes; since $g$ is continuous, it cannot change sign, and we conclude $g(0)>0.$
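Again only as an illustration, not a proof: a small NumPy sketch of this homotopy, with made-up values for the $x_i$, $\alpha_j$ and $k$ (any values satisfying the stated inequalities would do). The exponents stay strictly increasing along the path, so $g(t)$ should stay positive for all $t$:

```python
import numpy as np

def F(x, alpha):
    """Determinant of the generalized Vandermonde matrix (x_i ** alpha_j)."""
    x, alpha = np.asarray(x, float), np.asarray(alpha, float)
    return np.linalg.det(x[:, None] ** alpha[None, :])

# Made-up data: 0 < x_1 < ... < x_n and alpha_1 < ... < alpha_n < k.
x = np.array([0.7, 1.1, 2.4, 3.0])
alpha = np.array([-1.3, 0.2, 0.9, 2.5])
k = 3
target = k + 1 + np.arange(len(alpha))      # exponents (k+1, k+2, ..., k+n)

for t in np.linspace(0.0, 1.0, 101):
    g = F(x, (1 - t) * alpha + t * target)  # exponents remain strictly increasing
    assert g > 0                            # g never vanishes along the path; in particular g(0) > 0
```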
