Positive definite matrices – symmetric – diagonal elements

matrices, positive-definite, symmetric-matrices, tridiagonal-matrices

a) Show that the tridiagonal matrix $$A=\begin{pmatrix} 2 & 1 & & 0 \\ 1 & 2 & \ddots & \\ & \ddots & \ddots & 1 \\ 0 & & 1 & 2\end{pmatrix}$$ is positive definite.

b) Show that a symmetric, positive definite matrix $A\in \mathbb{R}^{n\times n}, \ A=\{a_{k\ell}\}_{k,\ell=1}^n$ has only positive diagonal elements and that it holds that $\displaystyle{\max_{k=1,\ldots, n}| a_{kk}| = \max_{k,\ell=1,\ldots,n}| a_{k\ell}|}$.


I have done the following:

a) Let $x\in \mathbb{R}^n\setminus \{\vec{0}\}$, and set $x_0=x_{n+1}=0$ so that the boundary terms below vanish.
\begin{align*}x^TAx&=\sum_{i,k=1}^na_{ik}x_ix_k \\ & [\text{ if } i=k : a_{ii}=2 \ , \ \text{ if } k=i-1 : a_{i(i-1)}=1 \ , \ \text{ if } k=i+1 : a_{i(i+1)}=1 ]\\ & =\sum_{i=1}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^n x_ix_{i+1} \\ & =x_1 x_{0}+\sum_{i=2}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} +x_nx_{n+1} \\ & =\sum_{i=2}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} \\ & = \sum_{i=1}^{n-1}x_i x_{i+1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} \\ & = 2\sum_{i=1}^{n-1}x_i x_{i+1}+2\sum_{i=1}^n x_i^2 \\ & = \left (\sum_{i=1}^{n-1} x_i^2+x_n^2\right )+2\sum_{i=1}^{n-1}x_i x_{i+1}+\left (x_1^2+\sum_{i=2}^n x_i^2\right ) \\ & = \sum_{i=1}^{n-1} x_i^2+2\sum_{i=1}^{n-1}x_i x_{i+1}+\sum_{i=2}^n x_i^2+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} x_i^2+2\sum_{i=1}^{n-1}x_i x_{i+1}+\sum_{i=1}^{n-1} x_{i+1}^2+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} \left (x_i^2+2x_i x_{i+1}+ x_{i+1}^2\right )+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} \left (x_i+ x_{i+1}\right )^2+x_1^2+x_n^2 \\ & >0 \end{align*}
Since this is a sum of nonnegative terms (squares), it is positive unless every term vanishes, i.e. unless $x_1=x_n=0$ and $x_i+x_{i+1}=0$ for all $i$, which forces $\vec{x}=\vec{0}$.
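As a sanity check (not part of the proof), the quadratic form $x^TAx$ and the sum-of-squares expression derived above can be compared numerically in plain Python; the size $n=8$ and the random test vectors are arbitrary choices:

```python
import random

def quad_form(x):
    """x^T A x for the tridiagonal A (2 on the diagonal, 1 on the off-diagonals)."""
    n = len(x)
    total = sum(2 * x[i] * x[i] for i in range(n))
    total += sum(2 * x[i] * x[i + 1] for i in range(n - 1))
    return total

def sum_of_squares(x):
    """The closed form derived above: sum_(i=1)^(n-1) (x_i + x_(i+1))^2 + x_1^2 + x_n^2."""
    n = len(x)
    return sum((x[i] + x[i + 1]) ** 2 for i in range(n - 1)) + x[0] ** 2 + x[n - 1] ** 2

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-5, 5) for _ in range(8)]
    assert abs(quad_form(x) - sum_of_squares(x)) < 1e-9
    assert quad_form(x) > 0  # x is (almost surely) nonzero
```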

Is that correct and complete?


At b) I need some help.

We have that $A$ is symmetric, that means that $\displaystyle{a_{k\ell}=a_{\ell k}}$, or not?

We also have that $A$ is positive definite, which means that $\displaystyle{x^TAx>0}$, i.e. $\displaystyle{\sum_{k, \ell=1}^na_{k \ell }x_kx_\ell >0}$, for all $x\neq 0$, right?

How do we get that $a_{kk}>0$ ?

Do we maybe take half of the sum, with the other half being equal by symmetry?


EDIT :

For b) I have done the following:

We assume the contrary, i.e. that $\displaystyle{\max_{k,\ell=1,\ldots,n}|a_{k\ell}|=|a_{ij}|>\max_{k=1,\ldots,n}|a_{kk}|}$ for some indices $i<j$.

Since $A$ is positive definite we have that $x^T A x >0 \quad \forall x\neq 0$.

We consider the vector $x_{ij}$ whose $i$-th entry is $1$, whose $j$-th entry is $-1$, and whose remaining entries are $0$, where $i$ and $j$ are the indices fixed above.

For this vector we get : \begin{align*}&x_{ij}^T A x_{ij} >0 \\ & \Rightarrow \begin{pmatrix}0 & \ldots & 0 & 1 & 0 & \ldots & -1 & \ldots & 0\end{pmatrix}\begin{pmatrix}a_{11} & \ldots & a_{1n} \\ \ldots & \ldots & \ldots \\ a_{i1} & \ldots & a_{in} \\ \ldots & \ldots & \ldots \\ a_{j1} & \ldots & a_{jn} \\ \ldots & \ldots & \ldots \\ a_{n1} & \ldots & a_{nn}\end{pmatrix}\begin{pmatrix}0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ -1 \\ \vdots \\ 0\end{pmatrix}>0 \\ & \Rightarrow \begin{pmatrix}a_{i1}-a_{j1} & \ldots & a_{ii}-a_{ji} & \ldots & a_{ij}-a_{jj} & \ldots & a_{in}-a_{jn}\end{pmatrix}\begin{pmatrix}0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ -1 \\ \vdots \\ 0\end{pmatrix}>0 \\ & \Rightarrow (a_{ii}-a_{ji})-(a_{ij}-a_{jj})>0\\ & \Rightarrow a_{ii}-a_{ji}-a_{ij}+a_{jj}>0\end{align*}
Since $A$ is symmetric we have $a_{ji}=a_{ij}$, and so \begin{equation*}a_{ii}-a_{ji}-a_{ij}+a_{jj}>0 \Rightarrow a_{ii}+a_{jj}>2a_{ij}.\end{equation*} If $a_{ij}>0$, this contradicts the assumption: since $a_{ii}\leq |a_{ij}|=a_{ij}$ and $a_{jj}\leq a_{ij}$, we would have $a_{ii}+a_{jj}\leq 2a_{ij}$. If $a_{ij}<0$, the same computation with the vector having $+1$ in both the $i$-th and $j$-th entries gives $a_{ii}+a_{jj}>-2a_{ij}=2|a_{ij}|$, again a contradiction.

Is that correct and complete?

Best Answer

  1. Show that a (symmetric) positive definite matrix $A\in \mathbb{R}^{n\times n}$ has positive diagonal elements.

Positive definite: $x^T A x >0 \quad \forall x\neq 0$.

Consider the vector $x_k$ with $k^{\text{th}}$ entry $1$ and remaining entries $0$.

$x_k^T A x_k = a_{kk} >0 \quad { }_\blacksquare$
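The computation $x_k^T A x_k = a_{kk}$ can be checked mechanically; here is a minimal sketch in plain Python (the matrix entries are arbitrary illustrative values, only symmetry matters):

```python
def quad(A, x):
    """x^T A x for a square matrix A given as a list of rows."""
    n = len(A)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

# A sample symmetric matrix (entries chosen arbitrarily for illustration).
A = [[4.0, 1.0, -2.0],
     [1.0, 3.0, 0.5],
     [-2.0, 0.5, 5.0]]

for k in range(3):
    e_k = [1.0 if i == k else 0.0 for i in range(3)]
    assert quad(A, e_k) == A[k][k]  # x_k^T A x_k picks out the diagonal entry a_kk
```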

  2. Show that the matrix element with maximum absolute value lies on the diagonal.

Consider the vector $x = [x_1\ x_2\ 0 \cdots 0]^T$ with $(x_1,x_2)\neq (0,0)$.

$$x^T A x = a_{11}x_1^2 + 2a_{12}x_1 x_2 + a_{22}x_2^2 >0 \quad \text{for all } (x_1,x_2)\neq(0,0).$$ A binary quadratic form $ax_1^2+2bx_1x_2+cx_2^2$ with $a>0$ is positive for all $(x_1,x_2)\neq(0,0)$ exactly when its discriminant is negative, i.e. $b^2<ac$. Hence $$a_{12}^2 < a_{11}a_{22} \;\Rightarrow\; \frac{a_{11}}{|a_{12}|} \frac{a_{22}}{|a_{12}|} > 1$$ (assuming $a_{12}\neq 0$; if $a_{12}=0$ there is nothing to prove).

This implies that either $a_{11}$ or $a_{22}$ must be greater than $|a_{12}|$. You can easily generalize to show that $|a_{k\ell}|$ is upper-bounded by either $a_{kk}$ or $a_{\ell\ell}$ for all $k,\ell$, which completes the proof.
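An empirical check of the statement, in plain Python; the construction $A = B^TB + 0.01\,I$ is just one arbitrary way to generate symmetric positive definite test matrices:

```python
import random

def random_spd(n):
    """A = B^T B + 0.01 I is symmetric positive definite for any real matrix B."""
    B = [[random.uniform(-1.0, 1.0) for _ in range(n)] for _ in range(n)]
    return [[sum(B[k][i] * B[k][j] for k in range(n)) + (0.01 if i == j else 0.0)
             for j in range(n)] for i in range(n)]

random.seed(1)
for _ in range(200):
    n = 5
    A = random_spd(n)
    # Every diagonal entry is positive...
    assert all(A[i][i] > 0 for i in range(n))
    # ...and the largest entry in absolute value sits on the diagonal.
    max_all = max(abs(A[i][j]) for i in range(n) for j in range(n))
    max_diag = max(A[i][i] for i in range(n))
    assert max_all == max_diag
```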


Of course, the Gershgorin circle theorem is also a useful tool to have in your linear algebra toolbox.