A self-adjoint operator $S : X \to X$ (where $X$ is an inner product space) is an operator such that for all $x,y \in X$, we have $$\langle Sx,y \rangle = \langle x,Sy\rangle.$$ This generalizes a real symmetric matrix (and, over $\mathbb{C}$, a Hermitian matrix).
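If you want to see this defining identity numerically, here is a minimal NumPy sketch. The matrix $S = A + A^H$ and the convention $\langle u, w\rangle = \sum_i u_i \overline{w_i}$ are my own illustrative choices, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice: S = A + A^H is Hermitian, hence self-adjoint
# with respect to the inner product <u, w> = sum_i u_i * conj(w_i).
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S = A + A.conj().T

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# np.vdot conjugates its FIRST argument, so <u, w> = np.vdot(w, u).
lhs = np.vdot(y, S @ x)   # <Sx, y>
rhs = np.vdot(S @ y, x)   # <x, Sy>
print(np.isclose(lhs, rhs))  # True (up to floating-point error)
```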
One important property of such operators is that their eigenvalues are necessarily real. Indeed, if $k$ is any eigenvalue with corresponding (normalized) eigenvector $v$, we see $$k = k\langle v,v \rangle = \langle kv, v \rangle = \langle Sv, v \rangle = \langle v,Sv \rangle = \langle v, kv \rangle = \overline k \langle v, v \rangle = \overline k$$ showing that $k$ is real.
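A quick numerical sanity check of this fact (again with a Hermitian matrix of my own choosing, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S = A + A.conj().T  # Hermitian, hence self-adjoint

# Even the general (non-Hermitian) eigensolver returns eigenvalues
# whose imaginary parts are zero to machine precision.
eigvals = np.linalg.eigvals(S)
print(np.allclose(eigvals.imag, 0))  # True
```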
Another important property (perhaps the most important) of self-adjoint operators is that the eigenvectors of a self-adjoint operator can be taken to form an orthonormal basis for the ambient space (here I am assuming you are working in a finite-dimensional space, but a similar statement holds in infinite dimensions; we just need to generalize the idea of a basis a bit, and we need completeness). That is, we can take $k_1, \ldots, k_n$ to be the eigenvalues of $S$ (possibly with repetitions) with corresponding orthonormal eigenvectors $v_1,\ldots, v_n$ forming a basis for $X$. Then for any $v \in X$, there are scalars $\alpha_1, \ldots, \alpha_n$ so that $v = \alpha_1 v_1 + \cdots + \alpha_n v_n.$ Using (sesqui)linearity of the inner product, we see $$\langle v, v\rangle = \sum^n_{i=1} \sum^n_{j=1} \alpha_i \overline{\alpha_j} \langle v_i, v_j \rangle.$$ But by orthonormality, $\langle v_i, v_j \rangle = 0$ when $i \neq j$ and $\langle v_i, v_i \rangle = 1$. Thus the above sum becomes $$\langle v, v\rangle = \sum^n_{i=1} \alpha_i \overline{\alpha_i} = \sum^n_{i=1} \lvert \alpha_i \rvert^2.$$

Similarly, since $$Sv = S(\alpha_1v_1 + \cdots + \alpha_n v_n) = \alpha_1 k_1 v_1 + \cdots + \alpha_n k_n v_n,$$ we have $$\langle Sv, v\rangle = \sum^n_{i=1} \sum^n_{j=1} k_i \alpha_i \overline{\alpha_j} \langle v_i, v_j \rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2.$$ If $k_i \ge 0$ for all $i=1,\ldots, n$, then $$\langle Sv, v\rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2 \ge 0.$$ Also, if $k_i \le 1$ for all $i = 1,\ldots, n$, then $$\langle Sv, v\rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2 \le \sum^n_{i=1} \lvert \alpha_i \rvert^2 = \langle v , v \rangle.$$

Conversely, if the given condition holds for all vectors $v$, then applying the condition to the eigenvectors gives $$0 \le \langle Sv_i, v_i \rangle \le \langle v_i, v_i \rangle \implies 0 \le \langle k_i v_i, v_i \rangle \le \langle v_i, v_i \rangle,$$ whence pulling the $k_i$ out of the inner product gives $0 \le k_i \le 1.$
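As a sanity check (not part of the proof), one can build a symmetric $S$ with prescribed spectrum in $[0,1]$ and test the inequality $0 \le \langle Sv, v\rangle \le \langle v, v\rangle$ on random vectors. The QR-based construction below is just one convenient way to get orthonormal eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative construction: S = Q diag(k) Q^T with orthonormal
# columns Q (from a QR factorization) and eigenvalues k in [0, 1].
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
k = rng.uniform(0.0, 1.0, size=5)
S = Q @ np.diag(k) @ Q.T

for _ in range(1000):
    v = rng.standard_normal(5)
    q = v @ S @ v                          # <Sv, v> in the real case
    assert -1e-12 <= q <= v @ v + 1e-12    # 0 <= <Sv,v> <= <v,v>
print("all 1000 random checks passed")
```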
In that step, you expand using bilinearity of the inner product, obtaining
$$\sum_i \sum_j c_i \lambda_i c_j (v_i,v_j)$$
but by assumption $(v_i,v_j)=0$ unless $i=j$. (You can fill in the conjugates for the complex case yourself.) Also, $(v_i,v_i)=1$, again by assumption.
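If it helps, here is a small numerical sketch of that expansion, verifying $\langle v, v\rangle = \sum_i \lvert \alpha_i\rvert^2$ and $\langle Sv, v\rangle = \sum_i k_i \lvert \alpha_i\rvert^2$ in the real case. The matrix, the QR-based choice of orthonormal $v_i$, and the variable names are all mine, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns = orthonormal v_i
k = rng.standard_normal(4)                        # eigenvalues k_i
S = Q @ np.diag(k) @ Q.T

v = rng.standard_normal(4)
alpha = Q.T @ v   # coefficients alpha_i = <v, v_i>

print(np.isclose(v @ v, np.sum(alpha**2)))          # <v,v>  = sum |a_i|^2
print(np.isclose(v @ S @ v, np.sum(k * alpha**2)))  # <Sv,v> = sum k_i |a_i|^2
```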
As for the orthonormality, it is often convenient to assume the eigenvectors are unit vectors. It usually makes no real difference, but if you don't do it, you often wind up normalizing further down the line anyway. Here it really wouldn't change much: there would just be a $\| v_i \|^2$ in the last sum if you didn't assume the $v_i$ were unit vectors.
Best Answer
If you want to use the hint:
$$ \langle v_i, (T^2-5T+6I)v_i \rangle=\langle v_i, T^2v_i\rangle-5\langle v_i, Tv_i \rangle+6\langle v_i,v_i \rangle $$
$$=\begin{cases} 4\langle v_i,v_i\rangle-10\langle v_i,v_i\rangle+6\langle v_i,v_i\rangle & \text{eigenvalue } 2, \\ 9\langle v_i,v_i\rangle-15\langle v_i,v_i\rangle+6\langle v_i,v_i\rangle & \text{eigenvalue } 3 \end{cases}$$ $$=\begin{cases} 0 & \text{eigenvalue } 2, \\ 0 & \text{eigenvalue } 3. \end{cases}$$
So $T^2-5T+6I=0$, because each $v_i$ is also an eigenvector of $T^2-5T+6I$, so the vanishing of $\langle v_i, (T^2-5T+6I)v_i \rangle$ forces $(T^2-5T+6I)v_i = 0$; and an operator that annihilates every vector of a basis (here, a basis of eigenvectors of $T$, which exists since $T$ is self-adjoint) is the zero operator.
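A quick numerical check of the conclusion, using a hypothetical self-adjoint $T$ of my own construction whose only eigenvalues are $2$ and $3$:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical example: self-adjoint T with spectrum {2, 3}.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
T = Q @ np.diag([2.0, 2.0, 3.0, 3.0]) @ Q.T

P = T @ T - 5 * T + 6 * np.eye(4)  # T^2 - 5T + 6I
print(np.allclose(P, np.zeros((4, 4))))  # True: it is the zero operator
```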