[Math] Finding the spectrum of an integral operator

functional-analysis

I have the following integral operator:
$$(Ku)(x)=\int_{0}^{1} k(x,y)\,u(y)\,dy$$
with $k(x,y)=\min\{x,y\}$ for $0 \leq x,y \leq 1$.

I have already shown that $K$ is a compact, self-adjoint operator, but now I want to find the spectrum of $K$. I do not understand how to do this. Do I need to find the eigenvalues? If so, how would I go about doing this? (I am teaching myself functional analysis, so perhaps there is an easy way to do this that I just haven't come across.)

Thanks in advance.

Best Answer

Since $K$ is compact, $0\in\sigma(K)$ (a compact operator on an infinite-dimensional space is not invertible). Moreover, because $K$ is compact, any nonzero element of its spectrum must be an eigenvalue.

Now suppose that $Ku=\lambda u$ for some $\lambda\ne0$. With your specific $k$, this reads $$ \lambda u(x)=\int_0^x y\,u(y)\,dy-x\int_1^x u(y)\,dy $$ (note that $-x\int_1^x u = x\int_x^1 u$; the kernel equals $y$ for $y\le x$ and $x$ for $y>x$). Since $u$ is integrable (otherwise $Ku$ makes no sense), the right-hand side is continuous, so $u$ is continuous. But then the right-hand side is differentiable, so $u$ is differentiable. Note also that $u(0)=0$, again because the right-hand side is $0$ at $x=0$.
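As a quick sanity check on this split form of the kernel, here is a small numerical sketch (the test function $u(y)=y^2$ is an arbitrary choice, just for illustration): for that $u$, $\int_0^x y\cdot y^2\,dy + x\int_x^1 y^2\,dy = x/3 - x^4/12$, which we compare against direct quadrature of $\int_0^1 \min(x,y)\,y^2\,dy$.

```python
import numpy as np

def trapz(f, x):
    """Composite trapezoidal rule for samples f on grid x."""
    return float(np.sum((f[1:] + f[:-1]) * np.diff(x) / 2))

# For k(x,y) = min(x,y) and the (arbitrary) test function u(y) = y^2,
# the split form of the integral has the closed-form value
#   (Ku)(x) = int_0^x y*y^2 dy + x * int_x^1 y^2 dy = x/3 - x^4/12.
# Compare it against direct quadrature of int_0^1 min(x,y) * y^2 dy.
y = np.linspace(0.0, 1.0, 200001)
for x in (0.25, 0.5, 0.9):
    numeric = trapz(np.minimum(x, y) * y**2, y)
    exact = x / 3 - x**4 / 12
    assert abs(numeric - exact) < 1e-8
```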

Now, if we differentiate, $$ \lambda u'(x)=x\,u(x)-\int_1^xu(y)\,dy-x\,u(x)=-\int_1^xu(y)\,dy. $$ Reasoning as before, we deduce that $u'(1)=0$ and that $u'$ is differentiable. Taking derivatives again, $$ \lambda u''(x)=-u(x). $$ (Incidentally, setting $\lambda=0$ here gives $u=0$, so $0$ is not an eigenvalue; it belongs to the spectrum, though.) For $\lambda\ne0$ this is an easy second-order boundary value problem: $$ u''+\frac1\lambda\,u=0,\qquad u(0)=0,\quad u'(1)=0. $$ If $\lambda<0$ the solutions are hyperbolic and the boundary conditions force $u=0$, so any eigenvalue is positive. For $\lambda>0$, writing $r=1/\sqrt\lambda$, the general solution is $$ u(x)=\alpha\cos rx+\beta\sin rx. $$ The boundary conditions force $\alpha=0$ and, since $u'(1)=\beta r\cos r$, also $\cos r=0$. So $r=\frac{2k+1}2\pi$, $k\in\mathbb Z$; negative $k$ give the same values of $\lambda$ and (up to sign) the same eigenfunctions, so we may take $$ \frac1{\sqrt\lambda}=\frac{(2n+1)\pi}2, $$ and the eigenvalues are $$ \lambda_n=\frac4{(2n+1)^2\pi^2}, \qquad n\in\mathbb N\cup\{0\}, $$ with corresponding eigenfunctions $$ u_n(x)=\sin\frac{(2n+1)\pi x}2. $$ In summary, $$ \sigma(K)=\{0\}\cup\left\{\frac4{(2n+1)^2\pi^2}:\ n=0,1,2,\dots\right\}. $$
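The eigenvalue formula can also be checked numerically. The sketch below (a Nyström-style discretization; the grid size $n=1000$ is an arbitrary choice) approximates $K$ by the symmetric matrix $\min(x_i,x_j)\,h$ on a midpoint grid and compares its largest eigenvalues with $\lambda_n = 4/((2n+1)^2\pi^2)$:

```python
import numpy as np

# Discretize K on a midpoint grid so that
#   (Ku)(x_i) ~ sum_j min(x_i, x_j) * u(x_j) * h.
# The top eigenvalues of the symmetric matrix min(x_i, x_j)*h should
# approach lambda_n = 4 / ((2n+1)^2 * pi^2).
n = 1000
h = 1.0 / n
x = (np.arange(n) + 0.5) * h           # midpoints of [0,1] subintervals
K = np.minimum.outer(x, x) * h         # discretized kernel, symmetric
eigs = np.sort(np.linalg.eigvalsh(K))[::-1]

predicted = [4 / ((2 * k + 1) ** 2 * np.pi**2) for k in range(3)]
for k in range(3):
    assert abs(eigs[k] - predicted[k]) < 1e-4
```

The agreement for the first few eigenvalues (roughly $0.405$, $0.045$, $0.016$) is a reassuring consistency check on the boundary value problem above.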