Since $f^{(n)}\ge 0$ on $(a,b)$ for every $n\ge 1$, $f^{(n)}$ is increasing on $(a,b)$ for every $n\ge 0$.
Now, given $c\in (a,b)$, let us estimate $f^{(n)}(c)$ for every $n\ge 1$. Fix some $0<h<b-c$. By Taylor's theorem with the Lagrange form of the remainder, for every $n\ge 1$ there exists $\xi_n \in(c,c+h)$ such that
$$f(c+h)=f(c)+\sum_{k=1}^{n-1}\frac{f^{(k)}(c)}{k!}h^k+\frac{f^{(n)}(\xi_n)}{n!}h^n.\tag{1}$$
Since $f^{(k)}(c)\ge 0$ for $1\le k\le n-1$, and $f^{(n)}(\xi_n)\ge f^{(n)}(c)$ because $f^{(n)}$ is increasing and $\xi_n>c$, it follows that
$$\frac{f^{(n)}(c)}{n!}\le\frac{f^{(n)}(\xi_n)}{n!}\le{h^{-n}}[f(c+h)-f(c)]. \tag{2}$$
Claim: If $x_0,x\in (a,c)$ and $|x-x_0|<h$, then
$$\sum_{n=0}^\infty\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n\tag{3}$$
converges to $f(x)$.
Proof: Since $x_0,x\in(a,c)$, arguing as in $(1)$, for every $n\ge 1$ there exists $\xi_n$ between $x_0$ and $x$ (hence $\xi_n\in(a,c)$) such that
$$f(x)=\sum_{k=0}^{n-1}\frac{f^{(k)}(x_0)}{k!}(x-x_0)^k+\frac{f^{(n)}(\xi_n)}{n!}(x-x_0)^n.$$
By $(2)$ and noting that $0\le f^{(n)}(\xi_n)\le f^{(n)}(c)$ and $|x-x_0|<h$, we have
$$0\le \frac{f^{(n)}(\xi_n)}{n!}|x-x_0|^n\le \frac{f^{(n)}(c)}{n!}|x-x_0|^n\le[f(c+h)-f(c)]\cdot\left|\frac{x-x_0}{h}\right|^n\to 0$$
as $n\to\infty$, so the series in $(3)$ converges to $f(x)$. $\quad\square $
The claim implies that $f$ is analytic on $(a,c)$. Since $c\in(a,b)$ was arbitrary, this completes the proof.
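As a numerical sanity check of the argument above, one can test inequality $(2)$ and the convergence of the series $(3)$ for the absolutely monotone function $f=\exp$ (every derivative is $e^x\ge 0$). The specific choices of $c$, $h$, $x_0$, $x$ below are illustrative, not taken from the text.

```python
import math

# Sketch: check inequality (2) and the Taylor convergence (3) for
# f = exp, an absolutely monotone function (f^{(n)}(x) = e^x >= 0).
# The points c, h, x0, x are illustrative choices, not from the proof.
f = math.exp
c, h = 1.0, 0.5       # c in (a, b), 0 < h < b - c
x0, x = 0.0, 0.3      # x0, x in (a, c) with |x - x0| < h

# Inequality (2): f^{(n)}(c)/n! <= h^{-n} [f(c+h) - f(c)]
for n in range(1, 30):
    lhs = f(c) / math.factorial(n)          # f^{(n)}(c) = e^c for exp
    rhs = (f(c + h) - f(c)) / h**n
    assert lhs <= rhs

# Series (3): partial sums of the Taylor series at x0 approach f(x)
partial = sum(f(x0) / math.factorial(k) * (x - x0)**k for k in range(30))
print(abs(partial - f(x)))   # truncation error, very small
```

The final bound in the proof, $[f(c+h)-f(c)]\,|(x-x_0)/h|^n\to 0$, is exactly why the partial sums converge here.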
By the Mean Value Theorem, for every $x\ne c$ near $c$ there exists $t_x \in (\min (c,x), \max (c,x))$ such that
$$\dfrac {f(x)-f(c)}{x-c}=f'(t_x).$$
Since $t_x$ lies between $c$ and $x$, letting $x \rightarrow c$ forces $t_x \rightarrow c$, hence
$$\lim_{x \rightarrow c} \dfrac{f(x)-f(c)}{x-c}=\lim_{t_x \rightarrow c} f'(t_x) =L.$$
Since $\lim_{ x \rightarrow c} \dfrac{f(x)-f(c)}{x-c}$ exists and equals $L$, $f$ is differentiable at $c$ with $f'(c)=L$.
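The mechanism above can be watched numerically: the difference quotient equals $f'(t_x)$ at an intermediate point, so it tracks the limit $L$ of $f'$. The example function $f(x)=x|x|$ (with $f'(x)=2|x|$ for $x\ne 0$, so $L=0$ at $c=0$) is an illustrative choice, not from the text.

```python
# Sketch: the difference quotient equals f'(t_h) for some t_h between
# c and c+h, so it converges to L = lim_{x->c} f'(x).
# Illustrative example: f(x) = x|x|, f'(x) = 2|x| (x != 0), c = 0, L = 0.
f = lambda x: x * abs(x)
c, L = 0.0, 0.0
for h in [1e-1, 1e-3, 1e-5]:
    dq = (f(c + h) - f(c)) / h   # here dq = |h|, and dq -> L as h -> 0
    print(h, dq)
```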
Best Answer
We may assume $[a,b]=[-1,1]$ and, furthermore, that $f\in C^n\bigl([-1,1]\bigr)$ for some $n\geq0$. We argue by induction on $n$.
When $n=0$ (i.e., $f$ is merely continuous on $[-1,1]$) and $\epsilon>0$ is given, the Stone-Weierstrass theorem yields a polynomial $p$ such that $$|f(x)-p(x)|\leq\epsilon\qquad(-1\leq x\leq 1)\ .$$ (Such a polynomial has nothing to do with any Taylor expansion the function $f$ might have.)
Assume now that the statement is true for $n-1\geq0$ and that $f\in C^n\bigl([-1,1]\bigr)$. Applying it to $f'$ we obtain a polynomial $p$ whose derivatives up to order $n-1$ approximate the corresponding derivatives of $f'$: $$|(f')^{(k)}(x)- p^{(k)}(x)|<\epsilon \qquad(0\leq k\leq n-1)\ .\tag{1}$$ We now define the polynomial $P$ by $$P(x):=f(0)+\int_0^x p(t)\ dt\qquad(-1\leq x\leq 1)\ .$$ Then $$|f(x)-P(x)|\leq\left|\int_0^x \bigl(f'(t)-p(t)\bigr)\ dt\right|\leq |x|\sup_{-1\leq t\leq1}|f'(t)-p(t)|<\epsilon\ ,$$ and since $P^{(k)}=p^{(k-1)}$ for $1\le k\le n$, inequality $(1)$ shows that $P$ satisfies the stated requirements for $f$.
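The inductive step can be sketched numerically: fit a polynomial $p$ to $f'$, then integrate to get $P(x)=f(0)+\int_0^x p(t)\,dt$, so that $P$ approximates $f$ and $P'=p$ approximates $f'$ simultaneously. The concrete choices below ($f=\sin$, a degree-10 Chebyshev interpolant as the polynomial) are illustrative assumptions, not part of the proof.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Sketch of the inductive step with illustrative choices (f = sin,
# degree-10 Chebyshev fit): approximate f' by a polynomial p, then set
# P(x) = f(0) + integral_0^x p(t) dt, so P(0) = f(0) and P' = p.
f, fprime = np.sin, np.cos
p = Chebyshev.interpolate(fprime, deg=10, domain=[-1, 1])  # p ~ f'
P = p.integ(lbnd=0) + f(0.0)                               # P(0) = f(0)

xs = np.linspace(-1, 1, 1001)
err_f  = np.max(np.abs(f(xs) - P(xs)))           # sup-norm error for f
err_f1 = np.max(np.abs(fprime(xs) - P.deriv()(xs)))  # ... and for f'
print(err_f, err_f1)   # both are small, as the induction predicts
```

The key point mirrors the proof: the error in $P$ is controlled by integrating the error in $p$, so approximating $f'$ well automatically approximates $f$ well once the constant $f(0)$ is matched.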
Without the assumption that $f^{(n)}$ is continuous the stated claim is false: if a sequence of polynomials $(q_k)_{k\geq1}$ converges uniformly to some $g$ on the interval $[-1,1]$, then $g$ is necessarily continuous. As an example consider the function $$f(x):=\begin{cases}x^2\sin\frac{1}{x}&(0<|x|\leq 1)\\ 0&(x=0).\end{cases}$$ This function is differentiable on all of $[-1,1]$, but $$f'(x)=2x\sin\frac{1}{x}-\cos\frac{1}{x}\qquad(x\ne0)$$ takes values arbitrarily close to both $1$ and $-1$ in every neighborhood of $0$, while $f'(0)=0$; hence there is no polynomial that can approximate $f'$ with an error $<1$ in the $\sup$-norm.
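The oscillation of $f'$ near $0$ is easy to observe numerically: along $x=1/(2\pi k)$ we get $\cos(1/x)=1$ and $f'(x)\approx-1$, while along $x=1/((2k+1)\pi)$ we get $\cos(1/x)=-1$ and $f'(x)\approx+1$. The sample points below are illustrative.

```python
import math

# Sketch: f'(x) = 2x sin(1/x) - cos(1/x) oscillates between values near
# -1 and +1 arbitrarily close to 0, while f'(0) = 0, so no continuous
# function (in particular no polynomial) is uniformly within 1 of f'.
fprime = lambda x: 2 * x * math.sin(1 / x) - math.cos(1 / x)
for k in [10, 100, 1000]:
    x_minus = 1 / (2 * math.pi * k)        # cos(1/x) = 1  -> f' near -1
    x_plus = 1 / ((2 * k + 1) * math.pi)   # cos(1/x) = -1 -> f' near +1
    print(x_minus, fprime(x_minus), x_plus, fprime(x_plus))
```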