For $A$ symmetric positive definite, show that $\langle x, Ax\rangle \langle y, A^{-1}y\rangle \geq \langle y, x\rangle^{2}$

linear-algebra · positive-definite

$\langle\:,\:\rangle$ denotes the canonical inner product of $\mathbb{R}^{n}$
Let $A \in S_n^{++}(\mathbb{R})$. Show that $\forall(x,y) \in \mathbb{R}^{n} \times \mathbb{R}^{n},\ \langle x, Ax\rangle \langle y, A^{-1}y\rangle \geq \langle y, x\rangle^{2}$.

My attempt:

Let $\lambda_{\min}$ denote the smallest eigenvalue of $A$; since $A$ is positive definite, $\lambda_{\min}>0$.

Let $\lambda_{\max}$ denote the largest eigenvalue of $A$; likewise $\lambda_{\max}>0$.

Since $A$ is symmetric, we have the Rayleigh-quotient bounds $\lambda_{\min} \langle x, x\rangle \leq \langle x, Ax\rangle \le \lambda_{\max} \langle x, x\rangle$.

We also have $\frac{1}{\lambda_{\max}} \langle y, y\rangle \leq \langle y, A^{-1}y\rangle \le \frac{1}{\lambda_{\min}} \langle y, y\rangle$, since the eigenvalues of $A^{-1}$ are the reciprocals of those of $A$.

Thus $\frac{\lambda_{\min}}{\lambda_{\max}} \langle y, y\rangle \langle x, x\rangle \leq \langle x, Ax\rangle \langle y, A^{-1}y\rangle \leq \frac{\lambda_{\max}}{\lambda_{\min}} \langle y, y\rangle \langle x, x\rangle$.

Using the Cauchy–Schwarz inequality, $\langle y, x\rangle^{2} \leq \langle y, y\rangle \langle x, x\rangle$.

In the end, we only get $\frac{\lambda_{\min}}{\lambda_{\max}} \langle y, x\rangle^{2} \leq \langle x, Ax\rangle \langle y, A^{-1}y\rangle$ with $0<\frac{\lambda_{\min}}{\lambda_{\max}}\le1$, which is weaker than the claimed inequality, so this is not what was asked.
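As a quick numerical sanity check of the eigenvalue bounds used above, here is a minimal sketch with numpy (the random matrix and all variable names are my own illustration, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive definite matrix: M^T M + I.
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)

lam = np.linalg.eigvalsh(A)              # eigenvalues, ascending
lam_min, lam_max = lam[0], lam[-1]

x = rng.standard_normal(4)
# Rayleigh-quotient sandwich: lam_min <x,x> <= <x,Ax> <= lam_max <x,x>
assert lam_min * (x @ x) <= x @ A @ x <= lam_max * (x @ x)
```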

Best Answer

Fact 1: There exists a positive definite matrix $B$ such that $B^{2}=A$.

Proof of Fact 1: Recall that a symmetric positive definite matrix is orthogonally diagonalizable. That is, there exists an orthogonal matrix $P$ and a diagonal matrix $D=\mbox{diag}(\lambda_{1},\ldots,\lambda_{n})$ such that $A=PDP^{-1}$, where $\lambda_{1},\ldots,\lambda_{n}$ are the eigenvalues of $A$. Note that $\lambda_{1},\ldots,\lambda_{n}$ are positive. Define $B=P\sqrt{D}P^{-1}$, where $\sqrt{D}=\mbox{diag}(\sqrt{\lambda_{1}},\ldots,\sqrt{\lambda_{n}})$. Clearly $B$ is also positive definite, and $B^{2}=P\sqrt{D}P^{-1}P\sqrt{D}P^{-1}=PDP^{-1}=A$.
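The construction in Fact 1 translates directly into code. A minimal numerical sketch (the example matrix is an assumption for illustration; `eigh` returns an orthonormal eigenbasis, so $P^{-1}=P^{t}$):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive definite matrix.
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)

# Orthogonal diagonalization A = P D P^T.
w, P = np.linalg.eigh(A)

# B = P sqrt(D) P^T is the positive definite square root of A.
B = P @ np.diag(np.sqrt(w)) @ P.T

assert np.allclose(B, B.T)                # B is symmetric
assert np.all(np.linalg.eigvalsh(B) > 0)  # B is positive definite
assert np.allclose(B @ B, A)              # B^2 = A
```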


Claim 2: For any $x,y\in\mathbb{R}^{n}$, we have that $\langle Ax,y\rangle^{2}\leq\langle Ax,x\rangle\langle Ay,y\rangle$.

Proof of Claim 2: Let $x,y\in\mathbb{R}^{n}$ be given. Note that \begin{eqnarray*} \langle Ax,y\rangle^{2} & = & \langle B^{2}x,y\rangle^{2}\\ & = & \langle Bx,B^{t}y\rangle^{2}\\ & = & \langle Bx,By\rangle^{2}\\ & \leq & \langle Bx,Bx\rangle\langle By,By\rangle\\ & = & \langle B^{t}Bx,x\rangle\langle B^{t}By,y\rangle\\ & = & \langle BBx,x\rangle\langle BBy,y\rangle\\ & = & \langle Ax,x\rangle\langle Ay,y\rangle. \end{eqnarray*} Here we used $B^{t}=B$ (as $B$ is symmetric) and, in the fourth line, the Cauchy–Schwarz inequality applied to $Bx$ and $By$.
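A quick numerical check of Claim 2 on random data (just an illustrative sketch, not part of the proof; the matrix and vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)          # symmetric positive definite

x = rng.standard_normal(5)
y = rng.standard_normal(5)

# <Ax, y>^2 <= <Ax, x> <Ay, y>
lhs = ((A @ x) @ y) ** 2
rhs = ((A @ x) @ x) * ((A @ y) @ y)
assert lhs <= rhs
```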

Remark: The above inequality can be proved directly without considering the matrix $B$. Indeed, let $\sigma(x,y)=\langle Ax,y\rangle$. Then $\sigma$ is a positive definite, symmetric bilinear form, i.e. an inner product on $\mathbb{R}^{n}$, so we can repeat the proof of the Cauchy–Schwarz inequality. Consider $0\leq\sigma(\lambda x+y,\lambda x+y)=\lambda^{2}\sigma(x,x)+2\lambda\sigma(x,y)+\sigma(y,y)$. If $\sigma(x,x)\neq0$, the above is a quadratic in $\lambda$ that is always non-negative, so its discriminant is non-positive: $4\sigma(x,y)^{2}-4\sigma(x,x)\sigma(y,y)\leq0$, which is exactly $\langle Ax,y\rangle^{2}\leq\langle Ax,x\rangle\langle Ay,y\rangle$. If $\sigma(x,x)=0$, then $x=0$ by positive definiteness and the inequality holds trivially.


For your problem: let $x',y'\in\mathbb{R}^{n}$ be given. Put $x=x'$ and $y=A^{-1}y'$ in Claim 2; then we get \begin{eqnarray*} \langle Ax',A^{-1}y'\rangle^{2} & \leq & \langle Ax',x'\rangle\langle AA^{-1}y',A^{-1}y'\rangle\\ & = & \langle Ax',x'\rangle\langle y',A^{-1}y'\rangle. \end{eqnarray*} Note that $\left(A^{-1}\right)^{t}=\left(A^{t}\right)^{-1}$, so \begin{eqnarray*} \langle Ax',A^{-1}y'\rangle & = & \langle(A^{-1})^{t}Ax',y'\rangle\\ & = & \langle(A^{t})^{-1}Ax',y'\rangle\\ & = & \langle A^{-1}Ax',y'\rangle\\ & = & \langle x',y'\rangle. \end{eqnarray*} Combining the two displays gives $\langle x',y'\rangle^{2}\leq\langle Ax',x'\rangle\langle y',A^{-1}y'\rangle$, which is the desired inequality.
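Putting everything together, the original inequality can also be checked numerically. A small sketch with a random positive definite $A$ (an illustration under my own test setup, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)          # symmetric positive definite
A_inv = np.linalg.inv(A)

for _ in range(1000):
    x = rng.standard_normal(5)
    y = rng.standard_normal(5)
    # <x, Ax> <y, A^{-1} y> >= <y, x>^2  (small tolerance for float error)
    assert (x @ A @ x) * (y @ A_inv @ y) >= (x @ y) ** 2 - 1e-9
```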
