Calculus – Is x^3 an Increasing Function for All Intervals?

applications, calculus, derivatives, functions, real numbers

I had an argument with my maths teacher today…
He says, along with another classmate of mine, that $x^3$ is increasing on all intervals. I argue that it isn't.
If we look at the conditions for a function $f(x)$ to be increasing, decreasing, or stable:

function is increasing for $f'(x) > 0$

function is decreasing for $f'(x) < 0$

(as per my knowledge) function is neither increasing nor decreasing (i.e. it's stable) for $f'(x) = 0$

so for $x = 0$, the function $f(x) = x^3$ should be stable, NOT INCREASING. Hence the set of values for which the function is increasing, according to me, would be $\mathbb{R} \setminus \{0\}$.

This means the function $f(x)=x^3$ should be increasing for all real values EXCEPT $x=0$.
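For reference, the derivative computation behind this is just the power rule:
$$f(x) = x^3 \;\Longrightarrow\; f'(x) = 3x^2, \qquad f'(x) > 0 \ \text{for } x \neq 0, \qquad f'(0) = 0.$$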

My teacher disagreed, saying that we consider values where $f'(x) \ge 0$.
My response was to look at the function $f(x) = x^2$, which is increasing on the interval $(0, \infty)$ and decreasing on the interval $(-\infty, 0)$, without a doubt.

Again, I reason that this time too, the function at $x=0$ is neither decreasing nor increasing, hence $x=0$ is not included in either of the intervals. My teacher had nothing to say on this, simply telling me that these two examples are completely incomparable.

[Another argument that my classmate and my teacher gave was that $x=0$ is a single point, so the function has to be constant there. Since the function is increasing before and after the point $x=0$, the function must be increasing at $x=0$ as well.

In my opinion, that is completely against the very idea of calculus. When we define the rate of change at a point, it is attributed solely to that point. The limit of the rate of change, as points around $x=0$ close in on $0$, is $0$; hence, in my opinion, the function should be constant at $x=0$ while increasing elsewhere on the set of real numbers.]

Am I correct?

Best Answer

When we say $f$ is increasing over an interval $I$, we are not making a statement about $f'$ at all; we are making a statement about $f$, namely that for $a,b\in I$ with $a<b$ we have $f(a)<f(b)$.

Since this is true for all intervals for the function $f(x)=x^3$, we say the function is increasing on all intervals.
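One quick way to check this directly from the definition (just one possible argument): for $a < b$,
$$b^3 - a^3 = (b-a)\left(a^2 + ab + b^2\right) = (b-a)\left[\left(a + \tfrac{b}{2}\right)^2 + \tfrac{3b^2}{4}\right] > 0,$$
since $b - a > 0$ and the bracketed factor is strictly positive unless $a = b = 0$, which is impossible when $a < b$. Hence $f(a) < f(b)$, with no mention of $f'$.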

Note this definition applies for functions that are not even differentiable everywhere, so we certainly don’t need or want to use the derivative in our definition of when a function is increasing.

Remark

A $C^1$ function (i.e., one with a continuous derivative) satisfying $f'(a)>0$ will be guaranteed to be increasing on some interval containing $a$, but as your example of $x^3$ shows, the converse is not necessarily true.
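One standard sketch of why the first claim holds: continuity of $f'$ gives an open interval around $a$ on which $f' > 0$, and for $x < y$ in that interval the mean value theorem provides some $c \in (x, y)$ with
$$f(y) - f(x) = f'(c)\,(y - x) > 0,$$
so $f$ is increasing there.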

Update

Having seen the discussion in the comments, I'm gathering that you have something of a philosophical objection to the above characterization of "increasing", and would prefer an infinitesimal definition based on the derivative, as more in keeping with "the very idea of calculus", as you put it in the main post.

The definition of "increasing" I've given is essentially standard (some would add the modifier "strictly" for clarity), but based on your concerns, let me also offer a sort of philosophical justification for why we define things the way we do.

While infinitesimal concepts (like the derivative) are certainly an important part of calculus, a perhaps even more important part of the subject concerns the relationship between infinitesimal information and the local and global behavior of a function.

This is famously seen in the fundamental theorem of calculus, which relates the infinitesimal rate of change with the global change over an interval, but is also seen in statements like "If $f$ has a local extremum at $x$, and $f$ is differentiable at $x$, then $f'(x)=0$" (note that the reverse is not true, as the example of $x^3$ shows).
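In symbols, the relationship the fundamental theorem gives is
$$f(b) - f(a) = \int_a^b f'(t)\,dt,$$
turning pointwise (infinitesimal) information about $f'$ into a statement about the total change of $f$ over $[a,b]$.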

In other words, while we obtain infinitesimal information from the derivative, we are often doing so not for its own sake, but rather in order to answer questions that are not about infinitesimal behavior, but instead about local and global behavior of a function. And whether $f$ is increasing or not is one such question.

Final clarification

[upgraded from comments since it relates to the original question]

I see near the end of your post you mentioned the example of $f(x)=x^2$. Here there is a subtle distinction at play. It is true that $f$ is increasing over $[0,\infty)$. However, when we say $f$ is increasing at a point, that's usually shorthand for "increasing over some open interval containing that point."
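(To see the first statement directly from the definition: if $0 \le a < b$, then $f(b) - f(a) = b^2 - a^2 = (b-a)(b+a) > 0$, so $f(a) < f(b)$.)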

This creates two slightly different questions:

  1. What's the largest interval over which $f$ is increasing? (answer: $[0,\infty)$).
  2. What's the set of points at which $f$ is increasing? (answer: $(0,\infty)$).

However, the first question doesn't always have a well-defined answer: consider something like $f(x)=2x^3-3x^2$, which is increasing on $(-\infty,0]$ and on $[1,\infty)$, but not over the union of those intervals (since $f(1)<f(0)$), so which interval would you pick as the answer? Therefore, we usually ask the second question instead (in this final example the answer would be $(-\infty,0)\cup(1,\infty)$).
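For completeness, the computations behind that last example:
$$f(x) = 2x^3 - 3x^2, \qquad f'(x) = 6x^2 - 6x = 6x(x-1),$$
so $f' \ge 0$ exactly on $(-\infty, 0] \cup [1, \infty)$, while $f(0) = 0 > -1 = f(1)$ shows that $f$ is not increasing over the union of the two intervals.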
