Functional Analysis – Weakly Separated Sequences in RKHS and Gleason Metric

cv.complex-variables, fa.functional-analysis, interpolation, operator-theory

I am reading Agler and McCarthy's Pick Interpolation and Hilbert Function Spaces. In Chapter 9, the authors ask the reader to observe that a weakly separated sequence in a reproducing kernel Hilbert space is separated in the Gleason metric. This is not immediate to me, however.


Here are the definitions:

Fix a Hilbert function space $\mathcal H_k$ on $X$. The sequence $\Lambda = \{ \lambda_i \}_{i=1}^{\infty}$ in $X$ is weakly separated if there exists a constant $\varepsilon > 0$ such that whenever $i\ne j$, there is some function $\varphi _{ij}$ in the closed unit ball of the multiplier algebra of $\mathcal H _k$ such that $\varphi _{ij} (\lambda_i) = \varepsilon$ and $\varphi _{ij} (\lambda _j) = 0$.

Also, the sequence $\Lambda$ in $X$ is said to be $d$-separated if there is some constant $\varepsilon > 0$ such that for $i\ne j$ we have that $d (\lambda _i , \lambda _j) > \varepsilon$. Note that the (pseudo)metric $d$ on $X$ is given by
\begin{align*}
d(\lambda _1 , \lambda_2) = \sqrt{ 1- \frac{\lvert k(\lambda_1 , \lambda _2)\rvert ^2}{k(\lambda_1 , \lambda_1 )k(\lambda_2 , \lambda_2 )}}
\end{align*}

where $\lambda_1 , \lambda_2 \in X$.
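
For orientation, here is a standard computation (worked out for concreteness, not quoted from the book): for the Szegő kernel $k(z,w) = (1-z\bar w)^{-1}$ of the Hardy space $H^2$ on the unit disc, this metric is exactly the pseudohyperbolic distance, since
\begin{align*}
d(z,w)^2 = 1 - \frac{(1-\lvert z\rvert^2)(1-\lvert w\rvert^2)}{\lvert 1-\bar w z\rvert^2}
= \frac{\lvert 1-\bar w z\rvert^2 - (1-\lvert z\rvert^2)(1-\lvert w\rvert^2)}{\lvert 1-\bar w z\rvert^2}
= \left\lvert \frac{z-w}{1-\bar w z} \right\rvert^2 .
\end{align*}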


Here's my attempt:
Let $\Lambda = \{ \lambda_i \}_{i=1}^{\infty}$ in $X$ be weakly separated. It suffices to show that
\begin{align*}
\sup \left\{ \frac{\lvert k(\lambda_i , \lambda _j)\rvert ^2}{k(\lambda_i , \lambda_i )k(\lambda_j , \lambda_j )} : \lambda_i \ne \lambda_j \right\} <1
\end{align*}

in order to show that $\Lambda$ is $d$-separated. Since $\Lambda$ is weakly separated, there is some $\varepsilon_0 > 0$ such that for each $i\ne j$ there is a function $\varphi_{ij}$ in the closed unit ball of the multiplier algebra of $\mathcal H _k$ with $\varphi _{ij} (\lambda_i) = \varepsilon_0$ and $\varphi _{ij} (\lambda _j) = 0$. Note that $\varepsilon_0 \le 1$ since $\varepsilon_0 =\lvert \varphi_{ij} (\lambda_i)\rvert \le \lVert \varphi_{ij} \rVert_{\infty} \le \lVert \varphi_{ij} \rVert _{\mathcal M (\mathcal H_k)} \le 1$.
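
(The middle inequality $\lVert \varphi_{ij}\rVert_{\infty} \le \lVert \varphi_{ij}\rVert_{\mathcal M(\mathcal H_k)}$ is the usual eigenvector estimate, recorded here for completeness:
\begin{align*}
M_{\varphi_{ij}}^{*} k_{\lambda} = \overline{\varphi_{ij}(\lambda)}\, k_{\lambda}
\quad\Longrightarrow\quad
\lvert \varphi_{ij}(\lambda)\rvert \,\lVert k_{\lambda}\rVert
= \lVert M_{\varphi_{ij}}^{*} k_{\lambda}\rVert
\le \lVert M_{\varphi_{ij}}\rVert \,\lVert k_{\lambda}\rVert ,
\end{align*}
so $\lvert \varphi_{ij}(\lambda)\rvert \le \lVert \varphi_{ij}\rVert_{\mathcal M(\mathcal H_k)}$ for every $\lambda \in X$ with $k_{\lambda}\ne 0$.)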

Now, note that for $i \ne j$, we have that
\begin{align*}
\varepsilon_0 ^2 \lvert k(\lambda_i , \lambda_j) \rvert &=\lvert{ \langle \varepsilon_0 k_{ \lambda_j}, \varepsilon_0 k_{ \lambda_i} \rangle}\rvert \\
&= \lvert \langle \varphi_{ji}(\lambda_j ) k_{\lambda_j}, \varphi_{ij}(\lambda_i ) k_{\lambda_i} \rangle \rvert \\
&= \lvert \langle {M_{\varphi_{ji}}}^* k_{\lambda_j}, {M_{\varphi_{ij}}}^* k_{\lambda_i} \rangle \rvert \\
&\le \lVert k_{\lambda _j} \rVert \lVert k_{\lambda _i} \rVert.
\end{align*}

This implies that
\begin{align*}
\frac{\lvert k(\lambda_i , \lambda _j)\rvert ^2}{k(\lambda_i , \lambda_i )k(\lambda_j , \lambda_j )} \le \frac{1}{\varepsilon_0 ^4}.
\end{align*}

But $\frac{1}{\varepsilon _0 ^4} \ge 1$, so I do not get what I want. I also realise that I have failed to use the fact that $\varphi_{ij} (\lambda_j) = 0$. Directions on proving this would be appreciated! $\ddot \smile$

Best Answer

If a sequence is weakly separated, i.e. for each $i\ne j$ there exists a multiplier $\varphi_{ij}$ of multiplier norm at most one such that $\varphi_{ij}(\lambda_i)=\varepsilon$ and $\varphi_{ij}(\lambda_j)=0$, then necessarily the $2\times2$ Pick matrix associated to this $2$-point interpolation problem is positive semidefinite:
\begin{equation*}
\begin{bmatrix}
(1-\varepsilon^2)k(\lambda_i,\lambda_i) & k(\lambda_i,\lambda_j)\\
\overline{k(\lambda_i,\lambda_j)} & k(\lambda_j,\lambda_j)
\end{bmatrix}
\geq 0.
\end{equation*}
In particular its determinant has to be nonnegative, that is,
\begin{equation*}
(1-\varepsilon^2)k(\lambda_i,\lambda_i)k(\lambda_j,\lambda_j)-|k(\lambda_i,\lambda_j)|^2 \geq 0.
\end{equation*}
Rearranging the terms,
\begin{equation*}
\frac{|k(\lambda_i,\lambda_j)|^2}{k(\lambda_i,\lambda_i)k(\lambda_j,\lambda_j)} \leq 1-\varepsilon^2,
\qquad\text{i.e.}\qquad
d(\lambda_i,\lambda_j) = \sqrt{1-\frac{|k(\lambda_i,\lambda_j)|^2}{k(\lambda_i,\lambda_i)k(\lambda_j,\lambda_j)}} \geq \varepsilon,
\end{equation*}
which gives $d$-separation (with, say, $\varepsilon/2$ as the separation constant if a strict inequality is wanted).
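
For completeness, the necessity of positive semidefiniteness (the direction of Pick's theorem used here, which holds for any kernel) comes from the usual Gram-matrix computation:
\begin{align*}
\Bigl[\bigl(1-\varphi_{ij}(\lambda_a)\overline{\varphi_{ij}(\lambda_b)}\bigr)\,k(\lambda_a,\lambda_b)\Bigr]_{a,b\in\{i,j\}}
= \Bigl[\bigl\langle (I - M_{\varphi_{ij}}M_{\varphi_{ij}}^{*})\,k_{\lambda_b},\,k_{\lambda_a}\bigr\rangle\Bigr]_{a,b\in\{i,j\}},
\end{align*}
and the right-hand side is positive semidefinite because $I - M_{\varphi_{ij}}M_{\varphi_{ij}}^{*} \geq 0$ whenever $\lVert M_{\varphi_{ij}}\rVert \leq 1$. Plugging in $\varphi_{ij}(\lambda_i)=\varepsilon$ and $\varphi_{ij}(\lambda_j)=0$ gives the matrix above.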
