Is it true that $0.999999999\ldots = 1$?

I’m told by smart people that $0.999999999\ldots = 1$, and I believe them, but is there a proof that explains why this is?


What does it mean when you refer to $.99999\ldots$? Symbols don’t mean anything in particular until you’ve defined what you mean by them.

In this case the definition is that you are taking the limit of $.9, .99, .999, .9999$, etc. What does it mean to say that limit is $1$? Well, it means that no matter how small a number $x > 0$ you pick, I can show you a point in that sequence such that all further numbers in the sequence are within distance $x$ of $1$. But certainly whatever number you choose, your $x$ is bigger than $10^{-k}$ for some $k$. So I can just pick my point to be the $k$th spot in the sequence.
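The argument above can be checked mechanically with exact rational arithmetic. This is only a sketch of the limit argument, not part of the original answer; the helper names (`witness_index`, `partial`) are mine:

```python
from fractions import Fraction

def partial(n: int) -> Fraction:
    """The n-th term of the sequence .9, .99, .999, ... as an exact fraction:
    0.9...9 with n nines equals 1 - 10^(-n)."""
    return 1 - Fraction(1, 10**n)

def witness_index(x: Fraction) -> int:
    """Given any x > 0, return a k such that 10^(-k) < x; from that point on,
    every term of the sequence is within distance x of 1."""
    k = 1
    while Fraction(1, 10**k) >= x:
        k += 1
    return k

# Pick an arbitrarily small tolerance and verify the claim for the next 100 terms.
x = Fraction(1, 1_000_000)
k = witness_index(x)
assert all(1 - partial(n) < x for n in range(k, k + 100))
```

Using `Fraction` rather than floats matters here: the check is exact, so no rounding error can fake the convergence.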

A more intuitive way of explaining the above argument is that the reason $.99999\ldots = 1$ is that their difference is zero. So let’s subtract: $1.0000\ldots - .99999\ldots = .00000\ldots = 0$. That is,

$1.0 - .9 = .1$

$1.00 - .99 = .01$

$1.000 - .999 = .001$

$\vdots$

$1.000\ldots - .99999\ldots = .000\ldots = 0$
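The pattern in those subtractions can be stated compactly as a single limit, which ties the intuitive argument back to the formal one above:

```latex
\[
1 - \underbrace{0.99\ldots 9}_{n\text{ nines}} = 10^{-n}
\;\xrightarrow[\;n\to\infty\;]{}\; 0,
\qquad\text{hence}\qquad
1 - 0.999\ldots = \lim_{n\to\infty} 10^{-n} = 0.
\]
```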

Source: Link, Question Author: Community, Answer Author: 6 revs, 6 users 62%