Let $L$ be the operator $Lf=-f''$ defined on the domain $\mathcal{D}(L)$ consisting of twice absolutely continuous functions $f$ on $[-\pi,\pi]$ that satisfy periodic conditions $f(-\pi)=f(\pi)$ and $f'(-\pi)=f'(\pi)$. Then $L$ is symmetric on its domain because, for $f,g\in \mathcal{D}(L)$, one has
$$
(Lf,g)-(f,Lg) = \int_{-\pi}^{\pi}\{f(t)\overline{g''(t)}-f''(t)\overline{g(t)}\}\,dt \\
= \int_{-\pi}^{\pi}\frac{d}{dt}\{f(t)\overline{g'(t)}-f'(t)\overline{g(t)}\}\,dt \\
= \left.\{f(t)\overline{g'(t)}-f'(t)\overline{g(t)}\}\right|_{t=-\pi}^{\pi}=0.
$$
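As a quick numerical sanity check (my own sketch, not part of the argument above), the identity $(Lf,g)=(f,Lg)$ can be verified for concrete periodic functions; the particular $f$, $g$, and grid resolution below are arbitrary choices:

```python
import numpy as np

# Sanity check of (Lf, g) = (f, Lg) for Lf = -f'' with periodic functions.
# The specific f, g and the grid size are arbitrary illustrative choices.
x = np.linspace(-np.pi, np.pi, 4000, endpoint=False)  # uniform periodic grid
dx = x[1] - x[0]
inner = lambda u, v: np.sum(u * np.conj(v)) * dx      # Riemann-sum inner product

f  = np.sin(x) + np.cos(2 * x)       # smooth and 2*pi-periodic
Lf = np.sin(x) + 4 * np.cos(2 * x)   # -f'', computed by hand
g  = np.cos(x) + np.sin(3 * x)
Lg = np.cos(x) + 9 * np.sin(3 * x)   # -g''

print(inner(Lf, g) - inner(f, Lg))   # ~ 0: the boundary terms cancel
```

For trigonometric polynomials on a uniform periodic grid, these Riemann sums are exact up to rounding, so the difference is zero to machine precision.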
Therefore, if $Lf=\lambda f$ and $f\ne 0$, it follows that $\lambda$ is real because
$$
(\lambda-\overline{\lambda})(f,f)=(Lf,f)-(f,Lf) = 0.
$$
Likewise, if $f,g$ are not identically $0$ and $Lf=\lambda f$, $Lg=\mu g$ with $\lambda\ne \mu$, then $(f,g)=0$ because, using that $\mu$ is real,
$$
(\lambda-\mu)(f,g) = (Lf,g)-(f,Lg) = 0.
$$
Note that $L1 = \lambda 1$ with $\lambda=0$. This makes sense because the constant function $1$ is in the domain of $L$: it is twice absolutely continuous, and both it and its first derivative satisfy the periodic conditions. And,
$$
L\sin(nx) = n^2 \sin(nx),\;\; L\cos(nx)=n^2\cos(nx).
$$
So, $\lambda_n = n^2$ are eigenvalues for $n=0,1,2,\cdots$. The eigenspace $E_0=\operatorname{span}\{1\}$ for $\lambda=0$ is one-dimensional. For $n\ge 1$, the eigenspace $E_n=\operatorname{span}\{\sin(nx),\cos(nx)\}$ with eigenvalue $\lambda_n=n^2$ is two-dimensional. There is latitude in how you choose a basis of $E_n$ for $n \ne 0$, but it is convenient to choose an orthogonal basis, which is what choosing $\sin(nx),\cos(nx)$ does. You could instead choose $\{ e^{inx},e^{-inx}\}$, which is also an orthogonal basis. Or you could choose $\{ e^{inx},\cos(nx) \}$, which is a basis of the two-dimensional eigenspace but not an orthogonal one.
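These basis claims are easy to check numerically; the sketch below (my own illustration, with $n=3$ an arbitrary choice) uses the complex inner product $(u,v)=\int_{-\pi}^{\pi}u\,\overline{v}\,dx$ approximated by Riemann sums:

```python
import numpy as np

# Check orthogonality of the three bases of the eigenspace discussed above
# (n = 3 is an arbitrary choice).
x = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
dx = x[1] - x[0]
inner = lambda u, v: np.sum(u * np.conj(v)) * dx

n = 3
s, c = np.sin(n * x), np.cos(n * x)
e_plus, e_minus = np.exp(1j * n * x), np.exp(-1j * n * x)

print(abs(inner(s, c)))              # ~ 0: {sin(nx), cos(nx)} is orthogonal
print(abs(inner(e_plus, e_minus)))   # ~ 0: {e^{inx}, e^{-inx}} is orthogonal
print(abs(inner(e_plus, c)))         # ~ pi: {e^{inx}, cos(nx)} is NOT orthogonal
```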
The standard Fourier basis is $\{ 1,\cos(x),\sin(x),\cos(2x),\sin(2x),\cdots\}$ which consists of orthogonal real functions. To expand $f \in L^2[-\pi,\pi]$ in such a basis,
$$
f \sim a_0 1 + a_1 \cos(x)+ b_1 \sin(x) + a_2 \cos(2x)+ b_2\sin(2x) + \cdots,
$$
one formally takes the inner product of each basis element with $f$ and with the series on the right. Using orthogonality,
$$
(f,1) = a_0(1,1), \;\;\; a_0 = \frac{(f,1)}{(1,1)} \\
(f,\cos(nx)) = a_n (\cos(nx),\cos(nx)),\;\;\; a_n = \frac{(f,\cos(nx))}{(\cos(nx),\cos(nx))} \\
(f,\sin(nx)) = b_n (\sin(nx),\sin(nx)),\;\;\; b_n = \frac{(f,\sin(nx))}{(\sin(nx),\sin(nx))}.
$$
Using $(1,1)=2\pi$ and $(\cos(nx),\cos(nx))=\pi$, $(\sin(nx),\sin(nx))=\pi$ gives the Fourier series expansion:
$$
f \sim \frac{1}{2\pi}(f,1) + \frac{1}{\pi}\sum_{n=1}^{\infty}\{(f,\cos(nx))\cos(nx)+(f,\sin(nx))\sin(nx)\} \\
\sim \frac{1}{2\pi}\int_{-\pi}^{\pi}f(t)\,dt
+ \sum_{n=1}^{\infty}\frac{1}{\pi}\left\{\left(\int_{-\pi}^{\pi}f(t)\cos(nt)\,dt\right)\cos(nx)+\left(\int_{-\pi}^{\pi}f(t)\sin(nt)\,dt\right)\sin(nx)\right\}.
$$
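As a concrete worked example (mine, not from the text above): for $f(x)=x^2$ the classical expansion is $x^2 \sim \frac{\pi^2}{3} + 4\sum_{n\ge1}\frac{(-1)^n}{n^2}\cos(nx)$, and the coefficient formulas above reproduce it numerically:

```python
import numpy as np

# Fourier coefficients of f(x) = x^2 on [-pi, pi] via the formulas above.
# Known result: a0 = pi^2/3, a_n = 4*(-1)^n/n^2, b_n = 0.
x = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
dx = x[1] - x[0]
f = x**2

a0 = np.sum(f) * dx / (2 * np.pi)                        # (f,1)/(1,1)
def a(n): return np.sum(f * np.cos(n * x)) * dx / np.pi  # (f,cos(nx))/pi
def b(n): return np.sum(f * np.sin(n * x)) * dx / np.pi  # (f,sin(nx))/pi

print(a0, np.pi**2 / 3)   # both ~ 3.2899
print(a(1), a(2))         # ~ -4, ~ 1
print(abs(b(1)))          # ~ 0

partial = a0 + sum(a(n) * np.cos(n * x) + b(n) * np.sin(n * x)
                   for n in range(1, 50))
print(np.max(np.abs(partial - f)))   # small tail error, about 4*sum_{n>=50} 1/n^2
```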
Note: The integral orthogonality did not historically come from such arguments. The "orthogonality" property of the trigonometric functions was an experimentally discovered fact, found when trying to write an initial condition for a vibrating string problem in terms of travelling waves. This predated Fourier's work, finite-dimensional linear algebra, eigenfunction analysis, etc., by decades.
Best Answer
By "$p, q, r$ positive defined", I will assume it is a typo for $p, q, r$ being positive definite, i.e. $p(x), q(x), r(x) > 0$ for $x \in [a,b]$.
Start from $$(py')' - qy + \lambda ry = 0 \iff \lambda ry = q y - (py')'.$$ Multiply both sides by $y$ and integrate over $[a,b]$; one obtains${}^{\color{blue}{[1]}}$
$$\require{cancel} \begin{align}\lambda \int_a^b ry^2 dx &= \int_a^b (qy^2 - y (py')') dx = \int_a^b (qy^2 + p(y')^2 - (p yy')') dx\\ &= \int_a^b (qy^2 + p (y')^2) dx - \color{red}{\cancelto{0}{\color{gray}{\left[ p yy'\right]_a^b}}}\\ &= \int_a^b (qy^2 + p (y')^2) dx\end{align} $$ Since $y$ is non-zero and $p, q, r$ are positive over $[a,b]$, we have${}^{\color{blue}{[2]}}$
$$\int_a^b ry^2 dx > 0 \quad\text{ and }\quad \int_a^b (qy^2 + p(y')^2)dx \ge \int_a^b qy^2 dx > 0$$
As a result, $$\lambda = \frac{\int_a^b (qy^2 + p(y')^2)dx}{\int_a^b ry^2 dx} > 0$$
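A numerical spot check (my own example, not from the answer): with $p=q=r=1$ on $[-\pi,\pi]$ and periodic boundary conditions, the equation becomes $y''+(\lambda-1)y=0$, so $y=\sin(nx)$ is an eigenfunction with $\lambda=n^2+1$, and the Rayleigh-quotient formula above reproduces this positive value:

```python
import numpy as np

# Spot check of lambda = Int(q y^2 + p y'^2) / Int(r y^2) with p = q = r = 1
# on [-pi, pi] and y = sin(nx), a periodic eigenfunction with lambda = n^2 + 1.
x = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
dx = x[1] - x[0]

n = 4
y, dy = np.sin(n * x), n * np.cos(n * x)   # y and its derivative, by hand
p = q = r = np.ones_like(x)

lam = (np.sum(q * y**2 + p * dy**2) * dx) / (np.sum(r * y**2) * dx)
print(lam)   # ~ 17 (= n^2 + 1), positive as the argument guarantees
```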
Notes
$\color{blue}{[1]}$ Since $p(a) = p(b)$, $y(a) = y(b)$ and $y'(a) = y'(b)$, we have $$[pyy']_a^b = p(b)y(b)y'(b) - p(a)y(a)y'(a) = 0$$
$\color{blue}{[2]}$ In general, if you have two continuous functions $f$, $y$ with $f$ positive definite and $y$ non-zero over $[a,b]$, we will have $$\int_a^b f y^2 dx > 0.$$ Say $y(c) = Y \ne 0$ for some $c \in (a,b)$, and let $F = f(c) > 0$. Choose an $\epsilon$ so small that $(c-\epsilon,c+\epsilon) \subset (a,b)$ and, for every $x \in (c-\epsilon,c+\epsilon)$, we have $|y(x)| > \frac{|Y|}{2}$ and $f(x) > \frac{F}{2}$. Since $fy^2 \ge 0$ everywhere, this leads to $$\begin{align}\int_a^b fy^2 dx &= \left(\int_a^{c-\epsilon} + \int_{c-\epsilon}^{c+\epsilon}+ \int_{c+\epsilon}^b\right) fy^2 dx \ge \int_{c-\epsilon}^{c+\epsilon} fy^2 dx\\ &\ge \int_{c-\epsilon}^{c+\epsilon} \frac18 FY^2dx = \frac14 FY^2\epsilon > 0.\end{align} $$ The analysis when $c = a$ or $c = b$ is similar, and I won't repeat it here.