It's funny that this started as an effort to understand 10c - c = 9c, and has spun into me no longer believing in rational numbers.

In other words, you have "progressed" in this thread from misunderstanding fringe mathematics (intuitionism) to misunderstanding even the most basic of mathematics. Have fun in Cuckooland.

BTW, you quoted me out of context here:

No, but it does assume that $$10\times0.\overline{9} = 9.\overline{9}$$ which is why many mathematicians reject this as a proof that $$0.\overline 9 = 1$$

Oh, man, thanks. That's it. I will stop torturing you with this other stuff. This is all I wanted to hear. Thanks.

You cut off the part where I said why mathematicians reject this as a proof. They don't reject it because it is false (it is not false; 0.999...=1); they reject it because it lacks rigor.
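A rigorous version of the same fact, sketched in outline, treats the repeating decimal as a geometric series rather than assuming anything about multiplying it by 10:

$$0.\overline{9} = \sum_{n=1}^{\infty}\frac{9}{10^n} = 9\cdot\frac{1/10}{1-1/10} = 1$$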

1. I take rational to mean "expressed as a ratio", or something like that.

"

*Expressible* as a ratio of integers" is a better way of saying it.

I don't take it that repeating decimals are, by definition, rational. It is the ability to be expressed as a fraction that makes them rational.

All repeating decimals are *expressible* as a ratio of integers and are thus by definition rationals.
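For what it's worth, the subtract-the-tail trick behind that fact can be mechanized. Here is a minimal sketch in Python (the function name `repeating_to_fraction` is my own, and it only handles purely repeating decimals of the form 0.(digits)):

```python
from fractions import Fraction

def repeating_to_fraction(digits: str) -> Fraction:
    """Convert a purely repeating decimal 0.(digits) to an exact fraction.

    E.g. "142857" stands for 0.142857142857...; if x is that number,
    then x * 10^k - x = int(digits), so x = int(digits) / (10^k - 1).
    """
    k = len(digits)
    return Fraction(int(digits), 10**k - 1)

print(repeating_to_fraction("1"))        # 0.111...       -> 1/9
print(repeating_to_fraction("142857"))   # 0.142857...    -> 1/7
print(repeating_to_fraction("9"))        # 0.999...       -> 1
```

Note that the last line lands exactly on the point in dispute: 0.999... reduces to the integer 1.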

2. 1/9 expressed as a repeating decimal is .1r.

3. .1r x 9 = .9r

Both statements are true. Your point?

Less plainly obvious:

4. Therefore, .1r is not a valid expression of 1/9.

Ohhhh. That's your point. This is simple, elementary school mathematics that you are rejecting.
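Spelled out, the elementary-school chain that statements 2 and 3 build is:

$$9\times\frac19 = 1 \quad\text{and}\quad 9\times 0.\overline{1} = 0.\overline{9}, \quad\text{so if}\quad \frac19 = 0.\overline{1} \quad\text{then}\quad 0.\overline{9} = 1$$

Rejecting the conclusion forces you to reject the premise that long division gives you.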

Just plain contentious: As a result, mathematicians move reality.

Mathematics is not about reality. That is the realm of science, engineering, finance, etc. Show me a one in the real world -- not one apple or one dog or one dollar or the symbol 1 -- I want you to show me a one. Science, engineering, finance, etc. use mathematics as a tool by relating the concepts developed by mathematicians to the real world.

7. All following proofs that .9r really equals 1 proceed from the assumption that repeating decimals really are valid expressions of rational numbers.

It's not an assumption; it's a theorem called the *division algorithm*.

As an example, 1/9 is not, by definition, the limit of the sum of 1/10 + 1/100 ... + 1/10^n as n approaches infinity.

This time do not quote me out of context. Read and understand this and the next paragraph, please. First off, mathematicians do not define 1/9 to be equal to $$\sum_{n=1}^{\infty}\frac 1{10^n}$$

Defining every possible fraction in this way would be downright silly and utterly fruitless, as there are an infinite number of rational numbers. Fortunately there is no reason to do this because the division algorithm tells us how to translate any rational expressed in the form p/q to a decimal representation (or to a representation in any other base, for that matter).
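A sketch of that translation, assuming nothing beyond repeated integer division (the function name `decimal_expansion` and the parenthesized-repetend notation are my own choices):

```python
def decimal_expansion(p: int, q: int, base: int = 10) -> str:
    """Expand p/q (with 0 <= p < q) by repeatedly applying the division
    algorithm: multiply the remainder by the base, divide by q for the
    next digit, keep the new remainder. A repeated remainder means the
    digits must cycle from the point where it first appeared.
    """
    digits = []
    seen = {}                      # remainder -> index of first occurrence
    r = p % q
    while r and r not in seen:
        seen[r] = len(digits)
        r *= base
        digits.append(str(r // q))
        r %= q
    if r == 0:
        return "0." + "".join(digits)          # terminating expansion
    i = seen[r]                                # start of the repetend
    return "0." + "".join(digits[:i]) + "(" + "".join(digits[i:]) + ")"

print(decimal_expansion(1, 9))   # 0.(1)
print(decimal_expansion(1, 7))   # 0.(142857)
print(decimal_expansion(1, 6))   # 0.1(6)
```

Because there are only q - 1 possible nonzero remainders, the expansion of any p/q must terminate or repeat; that is the theorem, not an assumption.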

Math argues that the two are equal because the limit of 1/(9 × 10^n) as n approaches infinity is zero. But it's not a given that the limit is the expression's actual value.

Yes, it is.
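It is a given, because an infinite decimal is *defined* to denote the limit of its partial sums, and the gap shrinks to zero:

$$1 - 0.\underbrace{9\ldots9}_{n\text{ nines}} = 10^{-n} \to 0 \text{ as } n\to\infty, \qquad\text{so}\qquad 0.\overline{9} = \lim_{n\to\infty}\sum_{k=1}^{n}\frac{9}{10^k} = 1$$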