.999999999(repeating) is equal to 1
Post Voting Period
The voting period for this debate has ended.
after 3 votes the winner is...
Shorack
Voting Style:  Open  Point System:  7 Point  
Started:  12/7/2008  Category:  Science  
Updated:  8 years ago  Status:  Post Voting Period  
Viewed:  4,457 times  Debate No:  6169 
Debate Rounds (2)
Comments (7)
Votes (3)
.999... is equal to one. First, I would like to represent this using digit manipulation...

x = .999...
10x = 9.999...
10x - x = 9.999... - .999...
9x = 9
x = 1
.999... = 1

Now, .999... can be explained through an infinite series, where .999... is a sum with a common ratio of (1/10), so:

.999... = 9(1/10) + 9(1/10)^2 + 9(1/10)^3 + ... = [9(1/10)] / [1 - (1/10)] = 1

Also, using fractions: 1/3 = .333... and 2/3 = .666..., so thus 3/3 = .999...

.333... = 1/3
3 * .333... = 3 * (1/3) = 1
.999... = 1

.111... = 1/9
9 * .111... = 9 * (1/9) = 1
.999... = 1

Both equations are valid due to the Transitive Property. [If you don't know the Transitive Property, I will state it: if a = b and b = c, then a = c.]

http://www.mathmojo.com...
http://polymathematics.typepad.com...
http://en.wikipedia.org......
http://www.greatplay.net...

I look forward to meeting my opponent, and would like to listen to their argument. Best of luck, Julius Caesar!!
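Both of Pro's arguments (the geometric series and the fraction argument) can be checked with exact rational arithmetic; here is a minimal sketch in Python using the standard-library `fractions` module (the variable names are mine, not from the debate):

```python
from fractions import Fraction

# Geometric-series argument: 0.999... = 9/10 + 9/100 + 9/1000 + ...
# Closed form of an infinite geometric sum: a / (1 - r),
# with first term a = 9/10 and common ratio r = 1/10.
a, r = Fraction(9, 10), Fraction(1, 10)
print(a / (1 - r))        # exact value of the infinite sum

# Fraction argument: 3 * (1/3) and 9 * (1/9) are exactly 1,
# even though their decimal expansions are 0.333... and 0.111...
print(3 * Fraction(1, 3) == 1)
print(9 * Fraction(1, 9) == 1)
```

Because `Fraction` is exact (no rounding), the closed-form sum comes out as the integer 1 rather than a float approximation.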
This is probably one of the hardest-fought debates imaginable. It is my sincere belief that this is due to both parties failing to make a distinction between 'reality' and mathematics. The mathematical proof given is 100% correct and trying to argue against it is a lost fight, but still people keep trying. You can't beat mathematical proof. If it's proven, it's correct, no matter what you try.

However, the mathematician's side often fails to make the leap back to reality. If you keep on placing 9's, it doesn't turn into a 1. You can keep adding 9's all your life, make your children do it and their children and so on (and even then you don't achieve infinity); it won't make it one. What number is it? Impossible to tell; nobody can tell.

Now, back to mathematics. You can't work with a number you can't specify; that's an extremely annoying situation. For this, there is the concept of infinity. Note the use of the word concept, because real infinity doesn't exist. (Nothing is infinite, because to take note of that (which is impossible too) you'd have to see it in its end state; if it's in an end state, you can technically get to the end, and if it is still going on, you haven't achieved infinity, since you are at that very moment going further, so not at infinity.) The mind isn't even capable of grasping it, let alone working with it. You might think you do, but you're actually counting on the concept again. Being able to work with real infinity would mean you're able to picture all those nines up to the end. Since there is no end, you can't; I'm very sorry, but we're limited.

So what is the practical concept of infinity based on, more or less? Basically, on the fact that if you go that far, the difference doesn't matter. To give an example: if you write 1 instead of 0.9, you're making an error of the size of 0.1. Quite some people will still care that you make such a rounding (depending on the situation of course, but let's assume it does matter, for example the thickness of steel a car producer wants to order). If you write 1 instead of 0.99, you make an error of 0.01 and fewer people will make a fuss about the difference. Writing 1 instead of 0.999 is only 0.001 off the mark. Even fewer people will care about such a tiny error. Now this goes on and on; every step you go further, the error gets smaller. At a certain point, the error gets so extremely little that it really can be discarded for us people.

Now one may try to argue that I'm making up how this concept works. I'll give simple and, in a way, mathematical proof that I'm not. For that, I'd like to use a practical example: the asymptote. Let's portray this 0.9999... as a function: y = 0.9*_x_ (the _ _ means you always round it down, no decimals). If you draw that, you'll get a horizontal asymptote y = 1. Definition of an asymptote: "A linear asymptote is essentially a straight line to which a graphed curve becomes closer and closer but does not become identical." (Wikipedia.) The same definition will be found in books on mathematics, and I believe most people are familiar with asymptotes. A function never has a point in common with an asymptote, but always gets closer. That's an asymptote by definition. Yet if you take the limit of x to infinity, you get 1. If you say 0.999... is 1, you directly violate the definition and idea of an asymptote. So a mathematician has to admit that there is a distinction between reality and concept. Yes, you never touch it, so your y never is 1. But yes, we note it as 1 in the limit. Why? Because it is a workable concept: your calculations will still work out with it.

To finish, I'd like to repeat my main points/theme (again? :p):
A. Real infinity doesn't exist and it is impossible for us to grasp it.
B. The mathematical concept of infinity hence isn't reality (or isn't infinity, but it can't be both; that would violate A ;)).
C. The mathematical approach to infinity allows us not to remain stuck with the problem, but to remain able to work out calculations.
D. If you refuse to make a distinction between mathematics and reality on this level, mathematics itself will pose you some paradoxes.

Apologies for my bad habit of over-repeating myself. Kind regards, Shark
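Con's shrinking-error observation can be made concrete: the truncation with n nines falls short of 1 by exactly 1/10^n, which is nonzero for every finite n but shrinks without bound. A small illustrative sketch (the function name `partial` is mine, not from the debate):

```python
from fractions import Fraction

def partial(n):
    """0.999...9 with exactly n nines, as an exact fraction."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 2, 3, 6):
    err = 1 - partial(n)
    # The error is exactly 1/10**n: positive for every finite n,
    # but shrinking by a factor of 10 with each extra digit.
    print(n, err)
```

This captures both sides' points at once: no finite string of nines equals 1 (the error is always positive), yet the errors approach 0, which is why the limit of the sequence is 1.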

Thank you very much for accepting my debate. I recognize that this has been fought over MANY times, and may be repetitive, but I feel it is a fascinating topic, and I'm eager to finish it.
Now back to the debate.

"However, the mathematician's side often fails to make the leap back to reality." I would just quickly like to remind you that this debate has to do with mathematics, not with concepts of reality. Thus, the mathematician's side must be correct when judging the computation of a number.

"You can't work with a number you can't specify... Note the use of the word concept, because real infinity doesn't exist. (Nothing is infinite... if it is still going on, you haven't achieved infinity, since you are at that very moment going further, so not at infinity.)" So you agree that .999… is infinite, right? So, as I represented earlier through the equation:

x = .999... (.999… is represented as a variable.)
10x = 9.999... (We multiply both sides by 10 to make the equation easier to understand.)
10x - x = 9.999... - .999... (*Note this: we have subtracted the 'x' of the equation, which is an infinite decimal. Obviously, you know x - x must equal 0, so .999… - .999… equals 0; in my representation, 9.999… - .999… must equal 9, and 10x - x must be 9x.)
x = 1 (We divided both sides by 9, since 9x = 9, so x equals 1.)
.999... = 1 (Plugging 'x' in, we get .999… = 1.)

Basically, in order to get my point across, I can subtract my concept of infinity from itself and get 0, making the digit manipulation possible.

You claim that you have to "see" infinity in its end state. You cannot "see" infinity. It is a mathematical term used to explain something that continues endlessly. Clearly, you can specify an infinite number with 2 simple examples, 1/3 and 2/3: 1/3 = .333… and 2/3 = .666…. The idea of infinity is clearly defined by a fraction, and can be multiplied by something to get a non-infinite decimal (i.e. 1/3 * 3 = 1; .333… * 3 = 1). Either way, we have represented an infinite decimal and multiplied it by another number, receiving one whole number. *To sum this up: infinity is endless, but can be specified by fractions. You cannot "see" infinity, or "go" into infinity, simply because it's endless and has no "end state."

"So what is the practical concept of infinity based on, more or less? Basically, on the fact that if you go that far, the difference doesn't matter... Even fewer people will care about such a tiny error." This would be true if the 9's stopped. Obviously, .999 is .999, not one; conversely, .999… is never-ending and thus cannot be justified as being a certain amount less than one. To simplify this statement, I ask you: what is 1 - .999…? It is a never-ending series of zeros, with nowhere to place a "1."

"Now this goes on and on; every step you go further, the error gets smaller. At a certain point, the error gets so extremely little that it really can be discarded for us people." If it's that tricky for "us people," why not just round up?

"Now one may try to argue that I'm making up how this concept works... Let's portray this 0.9999... as a function: y = 0.9*_x_ (the _ _ means you always round it down, no decimals). If you draw that, you'll get a horizontal asymptote y = 1." I believe you have made a major mistake. You cannot multiply 'x' by .9. You must multiply it by .999…, because .9 is equal to 9/10, 90%, etc. In other words, it stops, and is directly equal to .9.

"Definition of an asymptote... Because it is a workable concept: your calculations will still work out with it." This doesn't work because your asymptote is incorrect.

Sorry if I sound like an imbecile. XD Btw, I appreciate that you accepted such a difficult debate!! Really! Sincerely, Julius Caesar

Shorack forfeited this round.
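For reference, the point both rounds circle around can be written out as a limit: in standard analysis the notation 0.999… is *defined* as the limit of its partial sums, not as any finite truncation. A sketch of the usual textbook derivation (not from either debater):

```latex
0.999\ldots \;=\; \lim_{n\to\infty} \sum_{k=1}^{n} \frac{9}{10^{k}}
            \;=\; \lim_{n\to\infty} \left( 1 - 10^{-n} \right) \;=\; 1
```

Every finite partial sum is strictly less than 1 (Con's point about the curve never touching its asymptote), yet the limit itself, which is what the repeating-decimal notation denotes, equals 1 exactly (Pro's point).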
3 votes have been placed for this debate. Showing 1 through 3 records.
Vote Placed by Julius_Caesar 8 years ago
Agreed with before the debate: Tied (0 points)
Agreed with after the debate: Tied (0 points)
Who had better conduct: Julius_Caesar (1 point)
Had better spelling and grammar: Julius_Caesar (1 point)
Made more convincing arguments: Julius_Caesar (3 points)
Used the most reliable sources: Julius_Caesar (2 points)
Total points awarded: Julius_Caesar 7, Shorack 0
Vote Placed by theitalianstallion 8 years ago
Agreed with before the debate: Tied (0 points)
Agreed with after the debate: Tied (0 points)
Who had better conduct: Shorack (1 point)
Had better spelling and grammar: Shorack (1 point)
Made more convincing arguments: Shorack (3 points)
Used the most reliable sources: Shorack (2 points)
Total points awarded: Julius_Caesar 0, Shorack 7
Vote Placed by paramore102 8 years ago
Agreed with before the debate: Tied (0 points)
Agreed with after the debate: Tied (0 points)
Who had better conduct: Shorack (1 point)
Had better spelling and grammar: Shorack (1 point)
Made more convincing arguments: Shorack (3 points)
Used the most reliable sources: Shorack (2 points)
Total points awarded: Julius_Caesar 0, Shorack 7
"if you go to a store, and something is one million dollars, and you have 999,999.99$ will they sell it to you?
No."
This is clearly true as far as it goes, but .9999999... is a number that continues forever. It's pointless to compare a measly finite string of digits to the infinite amount of "9"s there are. Do you recognize the difference between the two? One is infinite and the other terminates.
I also provided better info, so really, what the heck.
This is totally different. It's not that simple.
No.
It's the same argument on both sides EVERY time.