As mentioned in the comments, one way to view it is as a discrete version of L'Hopital's rule. In analogy with the derivative of a function defined on the real numbers,
$$
\frac{d}{dx} f(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
$$
a "discrete derivative" of a sequence $a_n$ should be something like
$$
\frac{a_{n+h} - a_n}{h}
$$
for some small $h$. Well, the smallest $h$ can be in this case is $1$, so this discrete derivative should be
$$
\Delta a_n \stackrel{\text{def}}{=} \frac{a_{n+1} - a_n}{1} = a_{n+1} - a_n.
$$
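(As a quick numeric sanity check, here is a minimal Python sketch; the choice $a_n = n^2$ is just an illustrative example, for which $\Delta a_n = 2n + 1$, the discrete analogue of $\frac{d}{dx} x^2 = 2x$.)

```python
# Forward difference ("discrete derivative") of a sequence.
# Illustrative example: a_n = n^2, so Delta a_n = (n+1)^2 - n^2 = 2n + 1,
# the discrete analogue of d/dx x^2 = 2x.

def delta(a):
    """Return the forward differences a[n+1] - a[n]."""
    return [a[n + 1] - a[n] for n in range(len(a) - 1)]

a = [n ** 2 for n in range(10)]
print(delta(a))  # [1, 3, 5, 7, 9, 11, 13, 15, 17], i.e. 2n + 1
```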
Now, the usual L'Hopital's rule (under its usual hypotheses, e.g. when $g(x) \to \infty$) says that if
$$
\lim_{x \to \infty} \frac{f'(x)}{g'(x)} = \ell
$$
then
$$
\lim_{x \to \infty} \frac{f(x)}{g(x)} = \ell,
$$
and, by analogy, the Stolz-Cesaro theorem says that if $b_n$ is strictly increasing and unbounded and
$$
\lim_{n \to \infty} \frac{\Delta a_n}{\Delta b_n} = \ell
$$
then
$$
\lim_{n \to \infty} \frac{a_n}{b_n} = \ell.
$$
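Here is a quick numeric illustration of that analogy (a Python sketch; the sequences $a_n = 1^2 + \cdots + n^2$ and $b_n = n^3$ are just an illustrative choice): both $\frac{\Delta a_n}{\Delta b_n}$ and $\frac{a_n}{b_n}$ approach $\frac{1}{3}$.

```python
# Illustration: a_n = 1^2 + 2^2 + ... + n^2 and b_n = n^3.
# Delta a_n / Delta b_n = (n+1)^2 / (3n^2 + 3n + 1) -> 1/3,
# and, as Stolz-Cesaro predicts, a_n / b_n -> 1/3 as well.

def a(n):
    return sum(k ** 2 for k in range(1, n + 1))

def b(n):
    return n ** 3

for n in (10, 100, 1000):
    diff_ratio = (a(n + 1) - a(n)) / (b(n + 1) - b(n))
    ratio = a(n) / b(n)
    print(n, diff_ratio, ratio)
# Both columns tend to 1/3 = 0.333...
```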
I also like to view the Stolz-Cesaro theorem in terms of summation. Let
$$
a_n = \sum_{k=0}^{n} \alpha_k \qquad \text{and} \qquad b_n = \sum_{k=0}^{n} \beta_k,
$$
where $\beta_k > 0$ for all $k$ and
$$
\sum_{k=0}^{\infty} \beta_k = \infty.
$$
Then the theorem says:
If
$$
\lim_{n \to \infty} \frac{\alpha_{n}}{\beta_{n}} = \ell, \tag{1}
$$
then
$$
\lim_{n \to \infty} \frac{\sum_{k=0}^{n} \alpha_k}{\sum_{k=0}^{n} \beta_k} = \ell. \tag{2}
$$
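Here is a small numeric check of this form of the statement (a Python sketch; the sequences $\beta_k = \frac{1}{k+1}$ and $\alpha_k = \frac{2}{k+1} + \frac{1}{(k+1)^2}$ are just an illustrative choice with $\alpha_k/\beta_k \to 2$):

```python
# Check of (1) => (2): beta_k = 1/(k+1) has a divergent sum, and
# alpha_k = 2/(k+1) + 1/(k+1)**2 satisfies alpha_k / beta_k -> 2,
# so the ratio of partial sums should also tend to 2.

def partial_sum_ratio(n):
    alpha = sum(2 / (k + 1) + 1 / (k + 1) ** 2 for k in range(n + 1))
    beta = sum(1 / (k + 1) for k in range(n + 1))
    return alpha / beta

for n in (10, 1000, 100000):
    print(n, partial_sum_ratio(n))
# Approaches 2, slowly: the harmonic sum only grows like log n.
```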
For illustrative purposes we will assume from here on that $\ell > 0$.
Let's introduce some new notation that should help illuminate the idea of the theorem. We'll write $f_n \sim g_n$ to mean that
$$
\lim_{n \to \infty} \frac{f_n}{g_n} = 1.
$$
Intuitively, this notation says that $f_n$ and $g_n$ grow (or shrink) at the same rate. So, in terms of this new notation, the hypothesis $(1)$ becomes
$$
\alpha_n \sim \ell \beta_n.
$$
That is to say, $\alpha_n$ grows or shrinks at the same rate as a constant times $\beta_n$.
The conclusion, $(2)$, becomes
$$
\sum_{k=0}^{n} \alpha_k \sim \ell \sum_{k=0}^{n} \beta_k,
$$
or, pulling the constant $\ell$ inside the second sum,
$$
\sum_{k=0}^{n} \alpha_k \sim \sum_{k=0}^{n} \ell\beta_k.
$$
We can just redefine $\beta_n$ to be $\ell\beta_n$ (here we use $\ell > 0$, so the new $\beta_n$ is still positive and its sum still diverges), so what the theorem really says is:
Let $\alpha_n$ and $\beta_n$ be sequences with $\beta_n > 0$ and
$$
\sum_{k=0}^{\infty} \beta_k = \infty.
$$
If
$$
\alpha_n \sim \beta_n,
$$
then
$$
\sum_{k=0}^{n} \alpha_k \sim \sum_{k=0}^{n} \beta_k.
$$
Intuitively, if the summands $\alpha_k$ and $\beta_k$ grow at the same rate, then the partial sums $\sum_{k=0}^{n} \alpha_k$ and $\sum_{k=0}^{n} \beta_k$ also grow at the same rate. In other words, replacing the summand $\alpha_k$ by an equivalent sequence $\beta_k$ produces a sum that behaves approximately the same. Since we also know that $\sum_{k=0}^{\infty} \beta_k$ diverges, we can interpret this as saying that the partial sums diverge at the same rate.
Appendix: If $\ell = 0$ then $\alpha_n/\beta_n \to \ell = 0$ can be interpreted to mean that $\alpha_n$ is "smaller" than $\beta_n$ in the limit, and the conclusion that
$$
\sum_{k=0}^{n}\alpha_k \left/ \sum_{k=0}^{n}\beta_k \right. \to \ell = 0
$$
can be interpreted to mean that $\sum_{k=0}^{n}\alpha_k$ is therefore "smaller" than $\sum_{k=0}^{n}\beta_k$ in the limit.
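A small numeric illustration of this $\ell = 0$ case (a Python sketch; $\alpha_k = \frac{1}{(k+1)^2}$ and $\beta_k = \frac{1}{k+1}$ are just an illustrative choice): the partial sums of $\alpha$ stay bounded while those of $\beta$ diverge, so their ratio tends to $0$.

```python
# The l = 0 case: alpha_k = 1/(k+1)**2 is "smaller" than beta_k = 1/(k+1),
# and the ratio of partial sums tends to 0 (roughly like (pi^2/6) / log n).

for n in (10, 1000, 100000):
    alpha = sum(1 / (k + 1) ** 2 for k in range(n + 1))
    beta = sum(1 / (k + 1) for k in range(n + 1))
    print(n, alpha / beta)
```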
Perhaps this (contrived) example will give some idea of the usefulness of this theorem. Suppose we wish to calculate
$$
\lim_{n \to \infty} \log(n)^{-1} \sum_{k=1}^{n} \sin(1/k).
$$
We could begin by noticing that $\sin(1/k) \sim 1/k$ as $k \to \infty$, so by Stolz-Cesaro we know that
$$
\sum_{k=1}^{n} \sin(1/k) \sim \sum_{k=1}^{n} \frac{1}{k}.
$$
This second sum is much easier to estimate: comparing it with the integral of $1/x$, we have
$$
\log(n+1) = \int_1^{n+1} \frac{dx}{x} \leq \sum_{k=1}^{n} \frac{1}{k} \leq 1 + \int_1^n \frac{dx}{x} = 1 + \log(n),
$$
so
$$
\sum_{k=1}^{n} \frac{1}{k} \sim \log(n).
$$
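(A quick numeric check of these bounds, as a Python sketch:)

```python
import math

# Check log(n+1) <= H_n <= 1 + log(n) for H_n = 1 + 1/2 + ... + 1/n,
# and that H_n / log(n) -> 1.

for n in (10, 1000, 100000):
    H = sum(1 / k for k in range(1, n + 1))
    print(n, math.log(n + 1) <= H <= 1 + math.log(n), H / math.log(n))
# The bounds hold and the ratio H_n / log(n) drifts toward 1.
```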
Consequently,
$$
\sum_{k=1}^{n} \sin(1/k) \sim \log(n).
$$
That is,
$$
\lim_{n \to \infty} \log(n)^{-1} \sum_{k=1}^{n} \sin(1/k) = 1.
$$
Thanks to Stolz-Cesaro, instead of having to deal with the troublesome summand $\sin(1/k)$, we could replace it with the much simpler summand $1/k$ without changing the limiting behavior of the sum.
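For what it's worth, here is a numeric check of the limit itself (a Python sketch):

```python
import math

# Numeric check that (1 / log n) * sum_{k=1}^n sin(1/k) -> 1.

def ratio(n):
    return sum(math.sin(1 / k) for k in range(1, n + 1)) / math.log(n)

for n in (100, 10000, 1000000):
    print(n, ratio(n))
# The values drift toward 1; convergence is slow, on the order of 1/log(n).
```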