How does using limits to find the derivative of a function avoid dividing by $0$

calculus, derivatives, intuition

What is wrong with this argument that differentiation using the First Principle leads to division by $0$:

$$
f'(x)=\lim_{h \to 0} \frac{f(x+h)-f(x)}{h}
$$

Using the quotient limit law:

$$
\lim_{h \to 0} \frac{f(x+h)-f(x)}{h}=\frac{\lim\limits_{h \to 0}\bigl(f(x+h)-f(x)\bigr)}{\lim\limits_{h \to 0}h}
$$

$$
\lim_{h \to 0}h = 0
$$

Therefore, the numerator of the fraction is divided by $0$. Here is my reasoning for why $\lim\limits_{h \to 0}h = 0$:

As $h$ approaches $0$, its value becomes smaller (and eventually becomes smaller than any number strictly greater than $0$). For example, you cannot evaluate the limit as $0.001$, because at some point $h$ will be smaller than this. $0$ is the largest number that does not have this problem. Therefore, the limit expression is equal to $0$.

Thank you for reading.

Best Answer

The part of your reasoning that is wrong is this: $$\lim_{h \to 0} \frac{f(x+h)-f(x)}{h}=\frac{\lim\limits_{h \to 0}\bigl(f(x+h)-f(x)\bigr)}{\lim\limits_{h \to 0}h}$$

The quotient law $\lim \frac{f}{g}=\frac{\lim f}{\lim g}$ can only be used when $\lim g\neq 0$. If $\lim g = 0$, the right-hand side of your formula does not make any sense.
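For contrast, here is a small illustration of my own (not part of the original answer) of a case where the quotient law does apply, because the limit of the denominator is nonzero:

$$
\lim_{h \to 0}\frac{2+h}{1+h}=\frac{\lim\limits_{h \to 0}(2+h)}{\lim\limits_{h \to 0}(1+h)}=\frac{2}{1}=2.
$$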

To understand what happens here, just look at $$\lim_{h \to 0} \frac{h}{h}$$

For $h \neq 0$ we have $\frac{h}{h}=1$, so $\lim_{h \to 0} \frac{h}{h}=1$. But you cannot write $$\lim_{h \to 0} \frac{h}{h}=\frac{\lim_{h \to 0} h}{\lim_{h \to 0} h}$$ since the RHS makes no sense.
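To see how the difference quotient avoids an actual division by $0$, here is a worked example of my own (assuming $f(x)=x^2$, which is not taken from the question): the $h$ in the denominator cancels algebraically while $h \neq 0$, and only then is the limit taken.

$$
f'(x)=\lim_{h \to 0}\frac{(x+h)^2-x^2}{h}=\lim_{h \to 0}\frac{2xh+h^2}{h}=\lim_{h \to 0}(2x+h)=2x.
$$

At no point is anything divided by $0$: for every $h \neq 0$ the quotient is defined, and the limit only asks what value those quotients approach as $h \to 0$.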