[Math] Norm of a vector and a positive definite matrix (inequality)

convex-optimization, matrices

Prove that $\frac{\|a\|^2}{a^TAa}\le \|A^{-1}\|$, where $0\ne a\in \mathbb{R}^n$, $A\in \mathbb{R}^{n\times n}$ is a symmetric positive definite matrix, and $\|x\|$ is the Euclidean norm $\sqrt{x_1^2+\cdots+x_n^2}$ for $x\in\mathbb{R}^n$.

For a matrix $A$, $\|A\|=\max \{\|Ax\|:\|x\|=1\}$; for a symmetric matrix this equals $\max\{|x^TAx|:\|x\|=1\}$, which is also the largest absolute value of an eigenvalue of $A$.
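As a quick numerical sanity check of the claimed inequality (not part of a proof), here is a minimal NumPy sketch; the construction $B^TB + 0.1I$ is just one arbitrary way to produce a symmetric positive definite matrix, and the seed is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# One arbitrary way to build a symmetric positive definite matrix.
B = rng.standard_normal((n, n))
A = B.T @ B + 0.1 * np.eye(n)

a = rng.standard_normal(n)  # a random (almost surely nonzero) vector

lhs = (a @ a) / (a @ A @ a)                 # ||a||^2 / (a^T A a)
rhs = np.linalg.norm(np.linalg.inv(A), 2)   # spectral norm ||A^{-1}||

assert lhs <= rhs + 1e-12
print(f"{lhs:.6f} <= {rhs:.6f}")
```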

I see that $\frac{a^TAa}{\|a\|^2}=\frac{a^T}{\|a\|}A\frac{a}{\|a\|}\le \|A\|$, so $\frac{\|a\|^2}{a^TAa}\ge\|A\|^{-1}$. This, however, is a lower bound rather than the upper bound I need.

We also have $\|A\|\|A^{-1}\|\ge1$.
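To see concretely that this chain only bounds the ratio from below, a sketch under the same assumptions as above (NumPy, an arbitrary random SPD construction) checks both sides:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
A = B.T @ B + 0.1 * np.eye(n)   # random symmetric positive definite matrix
a = rng.standard_normal(n)

ratio = (a @ a) / (a @ A @ a)
norm_A = np.linalg.norm(A, 2)
norm_Ainv = np.linalg.norm(np.linalg.inv(A), 2)

# The attempt gives ratio >= 1/||A|| (a lower bound); the problem asks
# for the upper bound ratio <= ||A^{-1}||. Both hold here.
assert 1 / norm_A - 1e-12 <= ratio <= norm_Ainv + 1e-12
```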

Best Answer

By the min-max theorem (the Rayleigh quotient bounds), $\lambda_{\min}\|x\|^2\leq x^THx \le \lambda_{\max}\|x\|^2$, where $\lambda_{\min},\lambda_{\max}$ are the smallest and largest eigenvalues of the Hermitian matrix $H$.
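A minimal numerical illustration of these Rayleigh quotient bounds, assuming NumPy and a randomly generated real symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
H = (M + M.T) / 2            # a random real symmetric (Hermitian) matrix
x = rng.standard_normal(n)

lam = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
quad = x @ H @ x
norm2 = x @ x

# lambda_min ||x||^2 <= x^T H x <= lambda_max ||x||^2
assert lam[0] * norm2 - 1e-9 <= quad <= lam[-1] * norm2 + 1e-9
```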

Applying this with $H=A$ gives $\frac{a^TAa}{\|a\|^2} \ge \lambda_{\min}$, and $\lambda_{\min}>0$ since $A$ is positive definite, so $\frac{\|a\|^2}{a^TAa} \le \frac{1}{\lambda_{\min}}=\|A^{-1}\|$, because $\frac{1}{\lambda_{\min}}$ is the largest eigenvalue of $A^{-1}$ and the spectral norm of the symmetric positive definite matrix $A^{-1}$ equals its largest eigenvalue.
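The last step rests on the identity $\|A^{-1}\| = 1/\lambda_{\min}$ for symmetric positive definite $A$. A quick check of that identity, under the same assumptions as the earlier sketches (NumPy, one arbitrary SPD construction):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
B = rng.standard_normal((n, n))
A = B.T @ B + 0.1 * np.eye(n)   # random symmetric positive definite matrix

lam_min = np.linalg.eigvalsh(A)[0]               # smallest eigenvalue of A
norm_Ainv = np.linalg.norm(np.linalg.inv(A), 2)  # spectral norm of A^{-1}

# For SPD A, ||A^{-1}|| is the largest eigenvalue of A^{-1}, i.e. 1/lambda_min.
assert np.isclose(norm_Ainv, 1 / lam_min)
```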
