0.9999... != 1

Not open for further replies.

Ben Gooding

Registered Member
I hold the position that

(a) 0.9999... is not 1

(b) 0.3333... is not 1/3

however, the limits of (a) and (b) are 1 and 1/3 respectively


If you are familiar with proofs by induction, the following will be easy to grasp.

Step 1. 0.9 is not one; it differs from one by 0.1. That is, "0."+["9"*n] with n=1 differs from one by one unit of value at the first digit position after the decimal point.
Step 2. If "0."+["9"*n] differs from one by one unit of value at position n, then "0."+["9"*(n+1)] differs from one by one unit of value at position n+1. One unit of value at position n corresponds to 10 units of value at position n+1, so 10 units at position n+1 would be needed to close the gap to one, but only 9 units are added at position n+1. The remaining 10 - 9 = 1 unit at position n+1 is the difference between 1 and "0."+["9"*(n+1)].
Step 3. Therefore, since Step 1 (n=1) is true and Step 2 is true, the claim is true for all n = 1, 2, 3, ... as n goes to infinity.
Hence, 0.9999... is not one.

Similar proof with 9/10 + 9/100 + 9/1000 + ... (same as 0.9999...)
and with 3/10 + 3/100 + 3/1000 + ... (same as 0.3333...)
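The finite-n part of this argument is easy to verify with exact rational arithmetic. Here is a small Python sketch (my own illustration, not part of the original proof; the helper name `nines` is made up):

```python
from fractions import Fraction

def nines(n):
    """Return the number 0.99...9 with n nines, as an exact fraction."""
    # 0.9...9 (n nines) = (10^n - 1) / 10^n
    return Fraction(10**n - 1, 10**n)

for n in range(1, 6):
    diff = 1 - nines(n)
    # The difference is exactly one unit at digit position n,
    # i.e. 10^-n -- never zero for any finite n.
    assert diff == Fraction(1, 10**n)
    print(n, nines(n), diff)
```

This confirms Steps 1 and 2 for every finite n; the whole dispute in this thread is about what happens "at" infinity.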

What does 0.9999... mean?

0.9999... means that infinitely many 9's follow the decimal point.

It does not carry an implied limit. If one wants to attach a limit to that number, one has to state so explicitly, e.g.,

lim (0.9999...)

Now that limit is one.

This does not mean that 0.9999... is one just because the limit is one. The proof above is not limited to finite n; n can go to infinity and the proof remains valid.

Thus 0.9999... is not one, since there is no implied limit.
You know, as a high school student I can say that your work here is very cool. If it weren't an already well-established idea, though, you'd score more points. :)

Oh well, I'm going to show it to all my math nerds at school. Thanks. I'll give ya credit.
I disagree.

0.333333333 equals 1/3 when it continues infinitely, and only because 3 doesn't mix well with base 10.

Do long division on 1/3 and see what happens.
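Long division of 1 by 3 never terminates: each step leaves the same remainder, so the 3s repeat forever. A quick sketch of the schoolbook algorithm (the function name is my own):

```python
def long_division_digits(numerator, denominator, n_digits):
    """Perform schoolbook long division, returning the first
    n_digits decimal digits after the point."""
    digits = []
    remainder = numerator % denominator
    for _ in range(n_digits):
        remainder *= 10              # bring down a zero
        digits.append(remainder // denominator)
        remainder %= denominator     # same remainder -> repeating digits
    return digits

print(long_division_digits(1, 3, 10))  # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
```

Because the remainder is always 1 again, no finite number of steps ever finishes the division.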
Let us speculate a bit: numbers are abstract representations of (mental) objects. There is a mental object for 1/3 but no mental object for 0.3333... It is outside the realm of mathematical representation of reality (infinity is not a mental object because it has no mental representation as an object).

The statement 1/3 = 0.3333... is an artificial expression which converts the non-mathematical object 0.3333... into a mathematical one (1/3). 0.3333... may well exist, but it is not mathematically useful: no calculations can be made with 0.3333... because its value is indeterminate.

Therefore, dividing numbers in a way that leads to values like 0.3333... shows that mathematics is an artificial system which must (at times) be corrected ("adjusted") to remain valid. Dividing 4 by 0 is said to be "indeterminate"; the same should be said about dividing 10 by 3, because it is indeterminate as well.
Is it just me, or is the physics & math forum

- Warren
What you're attempting to prove is more a philosophical perspective than mathematical fact. The value of 1 is subjective; it just represents a whole. Could you be specific enough to tell us when 0.999~ isn't equal to one?
some reflections

Let me just say that I can see the viewpoint of those who advocate that 0.9999... is equal to one.

The main argument, as I see it, is this,

(1) 0.9999...
is equal to
(2) 9/10 + 9/100 + 9/1000 + ...
is equal to
(3) Sum[n goes from 1 to infinity](9*((1/10)^n))
is equal to
(4) 1

The problem is the step from (2) to (3): the sum of a series always implies a limit, and that is not necessarily what we introduce in step (2). Step (3) implies a limit, and therefore we cannot go from step (2) to (3) without introducing the concept of a limit. The limit concept hidden in step (3) is insidious.

Since we would be introducing a new concept, the limit, we cannot go from step (2) to (3).

My whole argument depends on this.

Once the concept of a limit is introduced, then and only then is 0.9999... equal to 1. However, if the concept of a limit is not introduced, then 0.9999... is not one.

Therefore one can say that the whole matter depends on what viewpoint one has about 0.9999... and implied limits.

My viewpoint is that 0.9999... is equal to

9/10 + 9/100 + 9/1000 + ...

and we look at the sum as it is, without introducing the concept of a limit.
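Whatever one's view of the implied limit, the partial sums of 9/10 + 9/100 + 9/1000 + ... can be computed directly. A short numeric sketch (my own aside, not from any post in the thread):

```python
from fractions import Fraction

partial = Fraction(0)
for n in range(1, 11):
    partial += Fraction(9, 10**n)   # add the next term 9 * (1/10)^n
    gap = 1 - partial               # exact distance from 1
    print(n, float(partial), gap)
# Every partial sum falls short of 1 by exactly 10^-n; the value 1
# in step (4) is the limit of these partial sums.
```

The computation shows both sides of the dispute: no partial sum equals 1, and the gaps shrink toward 0.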
your problem, ben, is that you don't know any set theory. if you want to get very attentive to the finer details of the properties of numbers, you cannot do so without knowing the definitions of the numbers.

let me tell you, i know how numbers are defined, and you are simply wrong. i would tell you why you're wrong, except i'm getting tired of repeating myself on this issue.

one hint: look up Dedekind cuts.

i'll tell you something else. there is no mathematical way to write down an infinite decimal without taking a limit. therefore when you write down an infinite string of 9s, a limit is necessarily implied.
1 * 1,000,000 = 1,000,000
0.9 * 1,000,000 = 900,000

Awfully big difference. Heck, looks like a 10% difference to me.
The concept of infinity; Base-ten number system

0.99999... does not equal one. It approaches a limit of one. But no matter how many nines you have, the power of ten represented by that last digit defines the difference between the series of nines and one. If you had a way of reaching infinity, then you'd have a way of extending that number far enough to actually equal one. But you can't reach infinity, by definition.

As for 0.33333... not being equal to 1/3, that's just a silly trick of our base-ten number system. If we used a base three number system, then we'd be saying that 1/10 = 0.1 and it would be unequivocally true.
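The base-dependence is easy to check: the same digit-generating algorithm that yields repeating 3s in base ten gives 1/3 a terminating expansion in base three. A sketch (the function name is my own invention):

```python
def digits_in_base(numerator, denominator, base, n_digits):
    """First n_digits fractional digits of numerator/denominator
    in the given base, by repeated multiply-and-take-integer-part."""
    digits = []
    remainder = numerator % denominator
    for _ in range(n_digits):
        remainder *= base
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits

print(digits_in_base(1, 3, 10, 6))  # [3, 3, 3, 3, 3, 3] -> 0.333333... repeats
print(digits_in_base(1, 3, 3, 6))   # [1, 0, 0, 0, 0, 0] -> 0.1 exactly in base 3
```

Which fractions repeat is an artifact of the chosen base, not of the fractions themselves.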

"There are 10 kinds of people in the world. Those who understand binary, and those who don't."
Re: The concept of infinity; Base-ten number system

Originally posted by Fraggle Rocker
But you can't reach infinity, by definition.

hmm.... seems my textbooks got that definition wrong. lemme just get some whiteout... fix em right up...
Re: Re: The concept of infinity; Base-ten number system

"You can't reach infinity, by definition."
Originally posted by Lethe: hmm.... seems my textbooks got that definition wrong. Lemme just get some whiteout... fix em right up...
OK, OK, that was not very well stated at all! What I should have said is that in this particular situation, you cannot write an infinite series of nines. So, by writing with a pencil or typing with a word processor, you cannot physically create a display that presents enough nines for the series 0.99999... to equal one.

But because we all studied higher math, we are able to establish the notational convention that the string of symbols including the trailing ellipsis:

0.99999...

stands for an infinite series of nines, and is therefore equal to one.

But I don't much like that particular convention, sorry! It is pretty ambiguous. The ellipsis means "and so on until infinity," in this case. But if you write the similar-looking string of symbols 1.414... or 3.14159..., those three dots mean something entirely different: "continue the series of digits resulting from the implied calculation." Not a good notational convention in my opinion.

Fortunately we have other notational conventions. We can write

Sigma, x=1 ---> infinity, 9 * 10^-x

which is unambiguous and does equal one.

Or maybe it's not such a great notational convention, since I can't even come close to transcribing it correctly with this word processing software!

So, yes, if we all agree on the notational convention, then 0.99999... equals 1.00000... and 0.33333... equals 1/3.

Have we beaten it to death yet? How did the Greeks and the Mayas ever do long division, much less algebra, with their austere character sets?
In reality...

If I see 0.999..., and I'm going to bake some good cookies, I'm using a whole cup of milk.

But, if I'm traveling NEAR the speed of light, and I don't want to become a memory in wave form, I want my computer to understand the difference between 0.999... and 1.

"It only matters if it matters in the real world; knowing the difference is wisdom."
view on decimal numbers

What it all boils down to is whether one has the view that a decimal representation involving infinitely many decimals has an implied limit or not.

For example,

2/7 is 0.285714285714... (the digit block 285714 repeats)

If one takes my view, then 2/7 is not equal to the decimal representation above, since 2/7 is only the limit of that decimal representation. And a limit is not implied just by stating a series of numbers.

So it boils down to how one views decimal representations of numbers.

My view, anyway.
Yes, Ben, on that I agree: "It boils down to how one views decimal representations of numbers."

Of course, whether 1+1 is equal to 2 or 11 also boils down to how one views decimal representations of numbers.

The "view" (i.e. definition) that any mathematics text uses is that "0.999... " MEANS the infinite sum .9+ .09+.009+...
and that is defined as the limit of the partial sums: yes, there is a limit there- it is "implied" in the very definitions. And, as any student should know, the infinite sum .9+ .09+ .009+... is
a geometric series whose limit 1.
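For reference, the standard geometric-series evaluation being invoked here, written out (this closed form is textbook material, not part of the original post):

```latex
\sum_{n=1}^{\infty} \frac{9}{10^n}
  = \frac{9}{10}\cdot\frac{1}{1-\frac{1}{10}}
  = \frac{9}{10}\cdot\frac{10}{9}
  = 1
```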

Fraggle Rocker said:"0.99999... does not equal one. It approaches a limit of one"

No, it is not "approaching" anything. It is not changing. It would be correct to say that the sequence .9, .99, .999, etc. is "approaching a limit of one", but 0.9999... is DEFINED as that limit.

If you are going to talk about mathematics then you must either use the standard definitions or state clearly that your definition is not standard. (Of course, it helps to know what the standard definitions ARE. Consult any mathematical analysis text for that.)
If I try to visualise this concept of 0.99999... = 1, I can, e.g., use an infinite line as a representation of decimal numbers (from 0 to 2). In such a continuum you can never point out where 0.99999... ends and 1 starts. You can artificially mark a point on the line and proclaim it to be 1, but this point is not "stable"; its place is determined by the neighbouring numbers (in a way, 1 > 0.999999999999999... and 1 < 1 + (1 - 0.9999999999999999999...)). None of these numbers really has an exact value. They look more like variables to me.

Stating 0.9999999999... = 1 is an intervention which artificially structures this continuum, and in doing so one should start with some phrase such as "let's suppose that 0.9999999... = 1, because ..." etc.
HallsofIvy is right.

Also, Ben's original argument misses the idea of infinity.

0.9 differs from 1 by 0.1
0.99 differs from 1 by 0.01
0.999 differs from 1 by 0.001

If you have n nines after the decimal point, the number differs from 1 by (0.1)<sup>n</sup>. Now, here's the crunch:

<b>In the limit as n goes to infinity, the number differs from 1 by (0.1)<sup>infinity</sup>. But (0.1)<sup>infinity</sup> = 0.</b>

Therefore, 0.999... = 1.
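The limit statement can be unpacked into the usual epsilon-N form: for any tolerance, some finite number of nines gets within it. A small sketch (my own illustration; the helper name is made up):

```python
def nines_needed(epsilon):
    """Smallest n such that 1 - 0.99...9 (n nines) = 10^-n < epsilon."""
    n = 1
    while 10**-n >= epsilon:
        n += 1
    return n

for eps in (0.05, 1e-6, 1e-12):
    print(eps, nines_needed(eps))
# No single finite string of nines equals 1, but the differences 10^-n
# shrink below every epsilon > 0, which is exactly what "the limit
# is 1" means.
```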
Originally posted by Ben Gooding

What does 0.9999... mean?

0.9999... means that infinitely many 9's follow the decimal point.


0.9999....... = 1
is true because numbers are used for measuring, and you will never need the measurement 0.99999....; the number remaining after subtracting it from 1 (one) would be so small that there would be no use for it in measurement.
So therefore
0.9999........ = 1
When I was at school we did a pattern thing that went

1/9 = 0.11111
2/9 = 0.22222
3/9 = 0.33333
8/9 = 0.88888
9/9 = 0.99999 = 1
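The pattern can be checked with exact rational arithmetic; 9/9 is the only entry whose expansion collapses to a terminating form. A quick sketch (my own aside):

```python
from fractions import Fraction

for k in range(1, 10):
    frac = Fraction(k, 9)
    # First six digits after the point, by repeated multiplication.
    digits = []
    remainder = frac - int(frac)     # fractional part (0 for 9/9)
    for _ in range(6):
        remainder *= 10
        digits.append(int(remainder))
        remainder -= int(remainder)
    print(f"{k}/9 =", int(frac), digits)
# 1/9 gives digits [1,1,1,1,1,1], ..., 8/9 gives [8,...]; 9/9 has
# integer part 1 and digits [0,0,0,0,0,0] -- the repeating-9 form
# and the terminating form name the same number.
```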