[Math] Positive Semidefinite Function

covariance, functional-analysis, linear-algebra, positive-semidefinite

What does it mean for a function to be positive (semi)definite?

Assume we have a positive (semi)definite function $K \colon \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$, i.e. one that maps pairs of vectors in $\mathbb{R}^n$ to a scalar in $\mathbb{R}$. For instance, a kernel function is defined as $K(x,y) = \langle \phi(x), \phi(y) \rangle$ for some appropriate mapping function $\phi$. Assume further that we have $n$ vectors $v_1, \ldots, v_n$ with real entries, and define a matrix $A$ by $A_{ij} = K(v_i, v_j) = K(v_j, v_i) = A_{ji}$. Why can we conclude that $A$ is a positive (semi)definite matrix?
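As a small numerical sketch of the setup (here the feature map $\phi$ is taken to be the identity, so $K$ is the plain dot product; this is just an illustrative choice, not the only kernel):

```python
import numpy as np

# Illustrative kernel K(x, y) = <phi(x), phi(y)> with phi(x) = x,
# i.e. the ordinary dot product (a hypothetical, simplest-possible choice).
def K(x, y):
    return x @ y

rng = np.random.default_rng(0)
n = 4
vs = [rng.standard_normal(n) for _ in range(n)]  # vectors v_1, ..., v_n

# Gram matrix A with A_ij = K(v_i, v_j); symmetric by construction.
A = np.array([[K(vi, vj) for vj in vs] for vi in vs])

# A positive semidefinite matrix has only nonnegative eigenvalues
# (up to floating-point tolerance).
eigvals = np.linalg.eigvalsh(A)
print(np.all(eigvals >= -1e-10))  # True
```

Checking the eigenvalue sign is a standard numerical proxy for positive semidefiniteness of a symmetric matrix.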

Why is the sum of positive (semi)definite functions also positive (semi)definite?

Best Answer

Recall that:

Definition. Let $X$ be a $\def\R{\mathbf R}\R$-vector space. A bilinear map $K \colon X \times X \to \R$ is called positive semi-definite, iff we have $K(x,x) \ge 0$ for all $x \in X$. If moreover $K(x,x) = 0 \iff x= 0$, $K$ is called positive definite.

With that we have: suppose $K \colon \R^n \times \R^n \to \R$ is positive (semi)definite and let $v_1, \ldots, v_n \in \R^n$ be $n$ vectors. Then the matrix $A = \bigl(K(v_i,v_j)\bigr)_{i,j}$ is positive (semi)definite, since for every $\xi \in \R^n$ we have, due to $K$'s bilinearity: \begin{align*} \def\<#1>{\left<#1\right>}\<A\xi, \xi> &= \sum_{i=1}^n (A\xi)_i\xi_i\\ &= \sum_{i,j=1}^n A_{ij}\xi_j\xi_i\\ &= \sum_{i,j=1}^n \xi_j\xi_i K(v_i, v_j)\\ &= K\left(\sum_i \xi_i v_i, \sum_j \xi_j v_j\right)\\ &\ge 0 \end{align*}

If $K$ is positive definite and the $v_i$ are linearly independent, then $A$ is positive definite: suppose $\<A\xi, \xi> = 0$. By the computation above, $K\left(\sum_i \xi_i v_i, \sum_i \xi_i v_i\right) = 0$, hence, as $K$ is definite, $\sum_i \xi_i v_i = 0$. Since the $v_i$ are linearly independent, this forces $\xi = 0$. So $A$ is positive definite.

As for the sum: if $K_1$ and $K_2$ are positive semidefinite, then $(K_1 + K_2)(x, x) = K_1(x, x) + K_2(x, x) \ge 0$ for all $x$, so $K_1 + K_2$ is again positive semidefinite. If moreover at least one summand is positive definite, then $(K_1 + K_2)(x,x) = 0$ forces both nonnegative terms to vanish, hence $x = 0$, so the sum is positive definite.
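The key step above, that $\<A\xi,\xi>$ collapses by bilinearity to $K\left(\sum_i \xi_i v_i, \sum_j \xi_j v_j\right)$, can be checked numerically. Here $K(x, y) = x^\top M y$ with $M = B^\top B$ is a hypothetical positive semidefinite bilinear form chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# A positive semidefinite bilinear form K(x, y) = x^T M y, where
# M = B^T B guarantees K(x, x) = ||Bx||^2 >= 0. (Illustrative choice.)
B = rng.standard_normal((n, n))
M = B.T @ B
def K(x, y):
    return x @ M @ y

vs = rng.standard_normal((n, n))   # rows are the vectors v_1, ..., v_n
A = np.array([[K(vi, vj) for vj in vs] for vi in vs])

xi = rng.standard_normal(n)

# Left side: <A xi, xi>.  Right side: K(sum_i xi_i v_i, sum_j xi_j v_j).
lhs = xi @ A @ xi
w = xi @ vs                        # w = sum_i xi_i v_i
rhs = K(w, w)
print(np.isclose(lhs, rhs), rhs >= 0)  # True True
```

The two sides agree to floating-point precision, and the common value is nonnegative, exactly as the chain of equalities in the proof predicts.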