Questions about 0.999… equals 1

decimal-expansion, rational-numbers, truncation-error

Since 0.999… = 1, I expect them to behave the same way when the same algorithm/operation is applied to each, but:

  1. If we define >, <, and = by comparing two numbers digit by digit, the very first check gives 0 < 1, so such an algorithm will say 0.999… < 1.

  2. If we truncate both numbers at any point (say, at 3 decimal digits), we get 0.999 vs 1, and again 0.999 < 1.
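For instance, here is a small sketch of what I mean by truncating (the helper name `truncate_nines` is mine, and I use exact fractions to avoid floating-point noise):

```python
from fractions import Fraction

def truncate_nines(n):
    """Partial sum 0.9 + 0.09 + ... with n nines, computed exactly."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 3, 10):
    t = truncate_nines(n)
    # Every truncation falls short of 1 by exactly 1/10**n.
    print(n, t < 1, 1 - t)
```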

Why am I wrong? Can someone help me clarify this?

Thank you in advance from this junior amateur noob! 🙂

Please note that I'm aware of
Is it true that $0.999999999\dots=1$?
but I wanted to know why truncation and digit-by-digit comparison, as taught in
school, fail when dealing with 0.999…

Best Answer

It's true that if we truncate both numbers at any finite point, the truncation of $0.\overline{9}$ will compare as less than 1. But, by analogy, consider the following two programs:

i = 0
while True:
    print(i)
    print(i + 1)
    i += 2

and

i = 0
while True:
    print(i)
    i += 1

Those two programs will print out exactly the same numbers, but at any given iteration, the first one will have printed out twice as many of them. Any finite truncation of the first process will "look much bigger" than the corresponding finite truncation of the second process, and yet their output "at infinity" is the same.
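A finite-horizon view of that analogy (a sketch; the helper names and step counts are mine): after n iterations the first program has printed 0..2n-1 and the second 0..n-1, so every finite stage differs, yet every individual number is eventually printed by both.

```python
def printed_by_first(n):
    """Numbers printed by the first program after n iterations."""
    return set(range(2 * n))      # two prints per iteration

def printed_by_second(n):
    """Numbers printed by the second program after n iterations."""
    return set(range(n))          # one print per iteration

# Finite stages always differ...
assert printed_by_first(10) != printed_by_second(10)
# ...but any fixed k appears in both outputs once n is large enough.
k = 1000
assert k in printed_by_first(k + 1) and k in printed_by_second(k + 1)
```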

You need to "look at the whole process" to determine equality.


By the way, equality of arbitrary real numbers (given, say, as infinite digit streams) is undecidable: no algorithm can confirm equality after inspecting only finitely many digits. So your algorithm never stood a chance of being a general way of comparing arbitrary reals.
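A sketch of why digit comparison cannot decide equality (the function name is mine): a comparator on digit streams can refute equality after finitely many digits, but agreement "so far" proves nothing; worse, equal reals can have digit streams that disagree at every position, as with the fractional digits of 0.999… and 1.000….

```python
from itertools import islice, count

def first_disagreement(a, b, limit):
    """Return the first index < limit where digit streams a and b differ,
    or None if they agree on the first `limit` digits (not a proof of equality)."""
    for k, (x, y) in enumerate(islice(zip(a, b), limit)):
        if x != y:
            return k
    return None

nines = (9 for _ in count())   # fractional digits of 0.999...
zeros = (0 for _ in count())   # fractional digits of 1.000...

# The streams disagree immediately, even though the reals are equal.
assert first_disagreement(nines, zeros, 100) == 0
```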
