Elementary Number Theory – Why Cube Roots Don’t Sum Up to an Integer

diophantine-equations, diophantine-approximation, elementary-number-theory

My question looks quite obvious, but I'm looking for a rigorous proof of it. (At least, I assume that what I claim is true.)

Why can't the sum of two cube roots of positive non-perfect cubes be an integer?

For example: $\sqrt[3]{100}+\sqrt[3]{4}$ isn't an integer. Well, I know this looks obvious, but I can't prove it…

For given numbers this is easy to show by finding lower and upper bounds for the roots (or, say, by taking a calculator and checking…).
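For the concrete example above, the calculator idea can be made rigorous by bracketing each cube root between exact rational bounds. A minimal Python sketch (the helper `cbrt_bounds` is an illustrative name, not from the question):

```python
from fractions import Fraction

def cbrt_bounds(n, digits=1):
    """Exact rational bounds lo < n**(1/3) < hi in steps of 10**-digits,
    assuming n is a positive integer that is not a perfect cube."""
    step = Fraction(1, 10 ** digits)
    lo = Fraction(0)
    # Crude linear search; fine for one decimal digit of precision.
    while (lo + step) ** 3 <= n:
        lo += step
    return lo, lo + step

# Bracket each cube root with exact rational arithmetic (no floating point).
lo1, hi1 = cbrt_bounds(100)  # 46/10 < cbrt(100) < 47/10
lo2, hi2 = cbrt_bounds(4)    # 15/10 < cbrt(4)   < 16/10

# 61/10 < cbrt(100) + cbrt(4) < 63/10, so the sum lies strictly between
# 6 and 7 and therefore cannot be an integer.
print(lo1 + lo2, "<", hi1 + hi2)
```

Of course, this only settles one particular pair of numbers; the question asks for a proof that works for all positive non-cubes.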

Any work done so far:

Suppose $\sqrt[3]m+\sqrt[3]n=x$, where $x$ is an integer. This can be rewritten as $m+n+3x\sqrt[3]{mn}=x^3$ (by raising everything to the power of $3$ and then substituting $\sqrt[3]m+\sqrt[3]n=x$ again), so $\sqrt[3]{mn}$ is rational, which implies $mn$ is a perfect cube (this is shown in a way similar to the well-known proof that $\sqrt2$ is irrational).
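Spelled out, the cubing step is
$$(\sqrt[3]m+\sqrt[3]n)^3=m+n+3\sqrt[3]{mn}\,(\sqrt[3]m+\sqrt[3]n)=m+n+3x\sqrt[3]{mn}=x^3,$$
so $\sqrt[3]{mn}=\dfrac{x^3-m-n}{3x}$ is rational (note $x\neq 0$, since both cube roots are positive).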

Now I don't know how to continue. One way is setting $n=\frac{a^3}m$ (where $a=\sqrt[3]{mn}$ is a positive integer), which gives $m^2+a^3+3amx=mx^3$, but I'm not sure whether this is helpful.
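For the record, that equation comes from substituting $n=\frac{a^3}m$ into $m+n+3x\sqrt[3]{mn}=x^3$ and clearing the denominator:
$$m+\frac{a^3}{m}+3ax=x^3 \quad\Longrightarrow\quad m^2+a^3+3amx=mx^3.$$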

Maybe the solution should be found similarly to the way one would do it with a calculator: finding some bounds and squeezing the sum of these roots between two well-chosen integers. But this is no more than a wild idea.

Best Answer

Suppose $a+b=c$, so that $a+b-c=0$, with $a^3, b^3, c$ all rational.

Then we have $-3abc=a^3+b^3-c^3$ by virtue of the identity $$a^3+b^3+c^3-3abc=(a+b+c)(a^2+b^2+c^2-ab-ac-bc)$$ (applied with $-c$ in place of $c$).
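Explicitly, replacing $c$ by $-c$ in the identity and using $a+b-c=0$ gives
$$a^3+b^3-c^3+3abc=(a+b-c)(a^2+b^2+c^2-ab+ac+bc)=0,$$
hence $a^3+b^3-c^3=-3abc$.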

Hence $a+b=c$ and $ab=\dfrac{c^3-a^3-b^3}{3c}$ are both rational (note $c\neq 0$ here, since in our situation $c$ is the sum of two positive cube roots), so $a$ and $b$ satisfy a quadratic equation with rational coefficients.
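Concretely, $a$ and $b$ are the two roots of
$$t^2-(a+b)\,t+ab=t^2-ct+ab=0,$$
a monic quadratic with rational coefficients.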

There are lots of ways of completing the proof from here.