I'm assuming the conclusion is supposed to be that $f'(x)$ is either always positive or always negative (rather than $f(x)$), since the conclusion is false for $f(x)$: if it were true for a function $f$, then a suitable vertical translation would produce a function that is sometimes positive and sometimes negative on $(a,b)$, without changing the derivative. For example, $f(x)=x$ on $(-1,1)$ has $f'(x)=1$, which is never zero, yet $f$ takes both positive and negative values.
Assume there are points $d,e$ with $a\lt d\lt e\lt b$ and $f'(d)\lt 0 \lt f'(e)$.
By the Extreme Value Theorem, $f(x)$ achieves a maximum and a minimum on $[d,e]$, and this minimum is achieved at a critical point or at an endpoint. By hypothesis, there are no critical points, so the minimum must be achieved at $d$ or at $e$.
Show that the fact that $f'(d)\lt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $d$. One can do this by using the definition of derivative as a limit.
Added. Here is how one can prove this without trying to show that $f$ is decreasing on a neighborhood of $d$: by definition,
$$\lim_{h\to 0}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$
In particular,
$$\lim_{h\to 0^+}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$
By definition of the limit, for every $\epsilon\gt 0$ there exists $\delta\gt 0$ such that if $0\lt h\lt \delta$, then $\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right|\lt \epsilon$. Take $\epsilon = |f'(d)|/2$, and let $\delta_0$ be the corresponding $\delta$. Then for any $h$ with $0\lt h\lt \delta_0$ we have
$$\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right| \lt \frac{|f'(d)|}{2}.$$
From this, since the difference quotient lies within $|f'(d)|/2$ of $f'(d)$ and $f'(d)\lt 0$, we conclude that $\frac{f(d+h)-f(d)}{h} \lt f'(d)+\frac{|f'(d)|}{2} = \frac{f'(d)}{2} \lt 0$. Since $h\gt 0$, that means that $f(d+h)-f(d)\lt 0$, or $f(d+h)\lt f(d)$. This holds for all $h$ with $0\lt h\lt \delta_0$. That means that $f(d)$ is strictly larger than all values of $f$ on $(d,d+\delta_0)$, so $f(d)$ cannot be the minimum on $[d,d+\delta_0)$, and so cannot be the minimum on $[d,e]$.
Likewise, show that the fact that $f'(e)\gt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $e$. This contradiction arises from the assumption that we can find points $d$ and $e$ with $f'(d)\lt 0 \lt f'(e)$. Replace $f$ with $-f$ to show you cannot find points $d$ and $e$ with $f'(d)\gt 0 \gt f'(e)$ either.
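As a sanity check on the argument above, here is a small numeric sketch. The function $f(x)=x^2-x$ and the grid search are my own hypothetical choices, not part of the answer: it has $f'(0)=-1\lt 0$ and $f'(1)=1\gt 0$, so the minimum of $f$ on $[0,1]$ should land at an interior critical point rather than an endpoint.

```python
# Hypothetical illustration: f(x) = x^2 - x on [d, e] = [0, 1],
# where f'(0) = -1 < 0 and f'(1) = 1 > 0.
def f(x):
    return x * x - x

def fprime(x):
    return 2 * x - 1

d, e, n = 0.0, 1.0, 10_000
xs = [d + (e - d) * i / n for i in range(n + 1)]
x_min = min(xs, key=f)  # grid point minimizing f

interior = d < x_min < e              # the minimum is not at an endpoint...
critical = abs(fprime(x_min)) < 1e-3  # ...and f' vanishes there
```

On this grid the minimizer is $x=1/2$, the unique zero of $f'$, exactly as the proof by contradiction predicts.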
I show that it is true in the case where $f$ is not only differentiable but its derivative is also continuous.
Since $f'$ is a nonzero function, there is a point $\tilde{x} \in (a, b)$ such that $f'(\tilde{x}) > 0$ or $f'(\tilde{x}) < 0$. Suppose $f'(\tilde{x}) > 0$. Then define $c$ and $d$ as follows:
$$ \begin{align*}
c &:= \inf\{\, \tilde{c} \mid a \leq \tilde{c} \leq \tilde{x}, \quad f'(c') > 0 \quad \text{for all $c' \in (\tilde{c}, \tilde{x}]$} \,\} \\
d &:= \sup\{\, \tilde{d} \mid \tilde{x} \leq \tilde{d} \leq b, \quad f'(d') > 0 \quad \text{for all $d' \in [\tilde{x}, \tilde{d})$} \,\}.
\end{align*}$$
From the assumption that $f'$ is continuous, $c \neq \tilde{x} \neq d$. The interval $(c, d)$ is the required one (indeed, the largest such interval containing $\tilde{x}$).
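For concreteness, here is a numeric sketch of this construction on a hypothetical instance of my own choosing (not from the answer): take $f'(x)=\cos x$ on $(a,b)=(-2,2)$ with $\tilde{x}=0$, where $f'(0)=1>0$. Scanning outward from $\tilde{x}$ approximates the endpoints $c$ and $d$, which here should come out near $-\pi/2$ and $\pi/2$.

```python
import math

# Hypothetical instance: f'(x) = cos(x) on (a, b) = (-2, 2), tilde_x = 0.
# Scan outward from tilde_x in small steps to approximate the endpoints
# c and d of the largest interval around tilde_x on which f' stays positive.
def fprime(x):
    return math.cos(x)

a, b, tilde_x, step = -2.0, 2.0, 0.0, 1e-4

c = tilde_x
while c - step > a and fprime(c - step) > 0:
    c -= step  # extend to the left while f' remains positive

d = tilde_x
while d + step < b and fprime(d + step) > 0:
    d += step  # extend to the right while f' remains positive
```

The scan stops as soon as $f'$ fails to be positive, mirroring the $\inf$/$\sup$ in the definitions of $c$ and $d$ above.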
Best Answer
Since $f'$ is continuous, there is a $\delta>0$ such that $f'>0$ on the neighbourhood $(a-\delta, a+\delta)$. This is a general property of continuous functions and is not difficult to prove.
Suppose $f$ is not strictly increasing on $(a-\delta, a+\delta)$; from this supposition we will derive a contradiction. If $f$ is not strictly increasing, then there are $x,y \in (a-\delta, a+\delta)$ with $y>x$ such that $$\frac{f(y) - f(x)}{y-x} \leq 0.$$
By the mean value theorem, we thus have some $c \in (x,y) \subset (a-\delta, a+\delta)$ such that $f'(c) = \frac{f(y) - f(x)}{y-x} \leq 0$. However, this is absurd, since we have already demonstrated that $f'>0$ in $(a-\delta, a+\delta)$. We therefore conclude, by contradiction, that $f$ is strictly increasing in $(a-\delta, a+\delta)$.
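To illustrate with a hypothetical function of my own choosing (not from the answer): take $f(x)=x^3$ and the point $a=1$, where $f'(1)=3\gt 0$. With $\delta=\tfrac12$ we have $f'(x)=3x^2\gt 0$ on $(a-\delta,a+\delta)$, and a grid check confirms the strict increase the MVT argument predicts.

```python
# Hypothetical instance: f(x) = x^3, f'(x) = 3x^2, at a = 1 where f'(1) = 3 > 0.
# With delta = 0.5, f' stays positive on (a - delta, a + delta), so the mean
# value theorem argument says f is strictly increasing there; check on a grid.
def f(x):
    return x ** 3

a, delta, n = 1.0, 0.5, 1000
xs = [a - delta + 2 * delta * i / n for i in range(n + 1)]
values = [f(x) for x in xs]
strictly_increasing = all(u < v for u, v in zip(values, values[1:]))
```

If any consecutive pair of sampled values failed to increase, the mean value theorem would produce a point in between with $f'\leq 0$, contradicting $f'>0$ on the interval.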