This follows from the Cauchy–Schwarz inequality, which states that for any two vectors $a$ and $b$ in an inner product space, we have
$$\lvert \langle a, b \rangle \rvert^2 \leq \lvert \langle a, a \rangle \rvert \lvert \langle b, b \rangle \rvert$$
In your case, the vector $a$ is taken as $a_i = (x_i-\bar{x})$ and the vector $b$ is taken as $b_i = (y_i-\bar{y})$ and the inner product of $a$ and $b$ is taken as $\displaystyle \langle a, b \rangle = \sum_{i=1}^n a_i b_i$. Hence, we get that
$$\displaystyle \langle a, b \rangle = \sum_{i=1}^n a_i b_i = \sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})$$
$$\displaystyle \langle a, a \rangle = \sum_{i=1}^n a_i a_i = \sum_{i=1}^n (x_i - \bar{x})^2$$
$$\displaystyle \langle b, b \rangle = \sum_{i=1}^n b_i b_i = \sum_{i=1}^n (y_i - \bar{y})^2$$
Hence, by the Cauchy–Schwarz inequality, we get that
$$\left(\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})\right)^2 \leq \left( \sum_{i=1}^n (x_i - \bar{x})^2 \right) \left( \sum_{i=1}^n (y_i - \bar{y})^2\right)$$
Taking the square root, we get that
$$\left|\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})\right| \leq \sqrt{\left( \sum_{i=1}^n (x_i - \bar{x})^2 \right) \left( \sum_{i=1}^n (y_i - \bar{y})^2\right)}$$
Hence, we can conclude that
$$|r_{xy}| = \dfrac{\left|\displaystyle \sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})\right|}{\displaystyle \sqrt{\left( \sum_{i=1}^n (x_i - \bar{x})^2 \right) \left( \sum_{i=1}^n (y_i - \bar{y})^2\right)}} \leq 1$$
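The bound $|r_{xy}| \leq 1$ is easy to sanity-check numerically. Here is a minimal Python sketch; the helper `pearson_r` and the random sample data are my own illustration, not part of the original argument:

```python
import math
import random

def pearson_r(x, y):
    """Sample Pearson correlation coefficient r_xy, computed from the
    centered sums used in the derivation above."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

# Whatever data we feed in, Cauchy–Schwarz forces |r| <= 1.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(200)]
y = [random.gauss(0, 1) for _ in range(200)]
r = pearson_r(x, y)
assert abs(r) <= 1
```

Running this with any data (not just Gaussian noise) keeps `r` inside $[-1, 1]$, exactly as the inequality guarantees.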
EDIT
Proof of the Cauchy–Schwarz inequality:
First note that if the vector $b$ is zero, then the inequality is trivially satisfied since both sides are zero. Hence, we can assume that $b \neq 0$. Now look at the component of $a$ orthogonal to $b$ i.e. $$c = a - \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b$$
i.e.
$$a = c + \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b$$
You can check that $c$ is orthogonal to $b$ by computing $$\langle c, b \rangle = \langle a,b \rangle - \dfrac{\langle a, b \rangle}{\langle b, b \rangle} \langle b, b \rangle = \langle a,b \rangle - \langle a,b \rangle = 0$$
You can also check that $\langle c, \alpha b \rangle = 0 = \langle \beta c, b \rangle$ for any scalars $\alpha$ and $\beta$.
We now have that
\begin{align}
\langle a,a \rangle & = \left \langle c + \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b, c + \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b \right \rangle\\
& = \langle c,c \rangle + \left \langle c,\dfrac{\langle a, b \rangle}{\langle b, b \rangle} b \right \rangle + \left \langle \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b, c \right \rangle + \left \langle \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b, \dfrac{\langle a, b \rangle}{\langle b, b \rangle} b \right \rangle\\
& = \langle c,c \rangle + \left \lvert \dfrac{\langle a, b \rangle}{\langle b, b \rangle} \right \rvert^2 \langle b, b \rangle = \langle c,c \rangle + \dfrac{\left \lvert \langle a, b \rangle \right \rvert^2}{\langle b, b \rangle}
\end{align}
Now $\langle c,c \rangle \geq 0$. This gives that
$$\langle a,a \rangle \geq \dfrac{\left \lvert \langle a, b \rangle \right \rvert^2}{\langle b, b \rangle}$$
Rearranging, we get what we want, namely
$$\lvert \langle a, b \rangle \rvert^2 \leq \lvert \langle a, a \rangle \rvert \lvert \langle b, b \rangle \rvert$$
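The orthogonal decomposition at the heart of this proof can be verified numerically. A short Python sketch, using the standard dot product as the inner product; the vectors `a` and `b` are arbitrary illustrative choices:

```python
def dot(u, v):
    """Standard inner product <u, v> on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

a = [1.0, 2.0, 3.0]
b = [4.0, 0.0, -1.0]

# c = a - (<a,b>/<b,b>) b, the component of a orthogonal to b
coef = dot(a, b) / dot(b, b)
c = [ai - coef * bi for ai, bi in zip(a, b)]

assert abs(dot(c, b)) < 1e-12  # c is orthogonal to b

# <a,a> = <c,c> + <a,b>^2 / <b,b>, the identity derived above
lhs = dot(a, a)
rhs = dot(c, c) + dot(a, b) ** 2 / dot(b, b)
assert abs(lhs - rhs) < 1e-12

# Since <c,c> >= 0, Cauchy–Schwarz follows:
assert dot(a, b) ** 2 <= dot(a, a) * dot(b, b)
```

Dropping the nonnegative term `dot(c, c)` from the identity is precisely the step that yields the inequality.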
Best Answer
In order to show how Pearson's correlation coefficient (simply "$r$" from now on) measures the strength of the linear relationship between two variables, it may be useful to show that if one variable is an increasing linear function of the other, then $r = 1$.
That is:
$$\forall a > 0,\ b \in \mathbb{R}: \quad Y = aX + b \Rightarrow Cov(X, Y) = \sqrt{Var(X)}\sqrt{Var(Y)}$$
where the latter clearly implies $r = 1$.
Proof:
\begin{align} Cov(X, Y) &= E(XY) - E(X)E(Y) \\ &= E[X(aX + b)] - E(X)E(aX + b) \\ &= E(aX^{2} + bX) - a[E(X)]^{2} - bE(X) \\ &= a[E(X^{2}) - [E(X)]^{2}] + bE(X) - bE(X) \\ &= aVar(X) \end{align}
where I have used linearity of expectation, namely $E(aX) = aE(X)$ and $E(b) = b$ for constants $a$ and $b$.
We also have:
$$Var(Y) = Var(aX + b) = a^{2}Var(X)$$
using $Var(aX) = a^{2}Var(X)$ and $Var(b) = 0$ for constants $a$ and $b$. Since $a > 0$, this implies:
$$\sqrt{Var(Y)} = a\sqrt{Var(X)}$$
from which we finally obtain that:
$$\sqrt{Var(X)}\sqrt{Var(Y)} = aVar(X)$$
proving the claim. Similarly, one can prove that $r = -1$ if $Y = -aX + b$ (again with $a > 0$), exploiting $Var(-X) = Var(X)$.
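Both extreme cases can be confirmed numerically. A minimal Python sketch; the helper `pearson_r`, the slope $a = 2$, intercept $b = 3$, and the sample points are illustrative assumptions:

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient r_xy."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

x = [0.5, 1.3, 2.7, 4.1, 5.0]
y_pos = [2 * xi + 3 for xi in x]   # Y = aX + b with a = 2 > 0
y_neg = [-2 * xi + 3 for xi in x]  # Y = -aX + b, negative slope

assert abs(pearson_r(x, y_pos) - 1) < 1e-12  # r = 1
assert abs(pearson_r(x, y_neg) + 1) < 1e-12  # r = -1
```

Up to floating-point rounding, any exact positive-slope linear relation gives $r = 1$ and any negative-slope one gives $r = -1$, regardless of the intercept.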
More generally, when $r$ lies strictly between $-1$ and $1$ (excluding $r = 0$, which indicates no linear relation), the data exhibit a roughly linear relationship. That is, in a scatter plot of the two variables, most of the data points (excluding outliers) gather in a cloud around a line, and the farther $r$ is from $-1$ or $1$, the more dispersed the cloud is around a negatively or positively sloped line, respectively.