#69275 - RE: It is a very nice riddle! (04/27/04 03:40, responding to Oleg's previous message)
Hi Oleg,
First, I've said before and I'll say again: your English is far better than my Russian, and I do think I understand what you are communicating, syntax notwithstanding. I think you're running into trouble because you're thinking of this in terms of numeric approximation. I'm talking about more than just approximating 1/3 with the finite decimal 0.33333333. In general you seem to be thinking in terms of how mathematics is used rather than from a purely theoretical perspective (which, frankly, is why you were able to spot the error immediately, at least intuitively, whereas a mathematician might have had more difficulty). Calculator results and numeric approximations work just fine in the applied sciences because they force us to deal with significant digits, but from a purely mathematical standpoint they are not exact.

Now ask the question: what does "well defined" mean? With regard to this subject, it means a definite and consistent result. This is why multiplication by zero is "well defined." Any time you multiply any number by zero, the result is zero. It may be a trivial result, which is to say useless to an engineer, but it is never ambiguous. Thus the line in this "proof" which states (a + b)(a - b) = b(a - b) is in fact still quite correct. It was derived by factoring (a - b) out of both the (a^2 - b^2) term and the (ab - b^2) term. And yes, at this point the "proof" is still valid, because it still asserts a true, albeit trivial, statement: that zero does indeed equal zero.

Now think about the next step. By carelessly eliminating the common factor on both sides of the equation, one arrives at the false assertion that (a + b) = b. There is a trivial case, a = b = 0, in which that statement happens to be true, but in the general case where a = b it is obviously not. This confuses most people because cancelling common factors on both sides of an equation is such a routine practice that they rarely stop to consider what they are actually doing. You can do anything you want to an equation, as long as you do it to both sides AND as long as it is well defined. But when you cancel that common factor you are really dividing both sides by it, producing (a - b)/(a - b), and since (a - b) = 0 and division by zero is undefined, that step is not allowed. This "proof" is a perfect example of why it is not allowed. It is also a good example of why, if you eliminate common factors while solving an equation, you absolutely MUST note that the solution is not valid in the cases where the common factor equals zero. Dividing by zero produces erroneous results; as someone pointed out, just because 1*0 = 2*0 does not mean that 1 = 2.
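In case it helps to see the whole chain in one place, here is the riddle written out step by step. I'm assuming the standard version that starts from a = b, which is where the quoted terms (a^2 - b^2) and (ab - b^2) come from:

\begin{align*}
a &= b && \text{given}\\
a^2 &= ab && \text{multiply both sides by } a\\
a^2 - b^2 &= ab - b^2 && \text{subtract } b^2 \text{ from both sides}\\
(a + b)(a - b) &= b(a - b) && \text{factor out } (a - b)\text{; still true, both sides are } 0\\
a + b &= b && \text{INVALID: divides both sides by } (a - b) = 0\\
2b &= b && \text{substitute } a = b\\
2 &= 1 && \text{divide by } b \text{ (fine for } b \neq 0\text{, but the damage is already done)}
\end{align*}

Every line up to and including the factored one is a true statement; the first line that is actually false is the one obtained by cancelling (a - b).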
To better understand why division by zero is undefined, consider the reciprocal of x, that is 1/x, as x gets closer to zero (you can use any number in the numerator). At x = 1, 1/x = 1. At x = 1/2, 1/x = 2. At x = 1/4, 1/x = 4, and as x gets closer and closer to zero its reciprocal asymptotically approaches infinity. But what if we come at it from the other side of zero? At x = -1, 1/x = -1. At x = -1/2, 1/x = -2. At x = -1/4, 1/x = -4, and as x gets closer and closer to zero the reciprocal of x asymptotically approaches negative infinity. So the very same point on the number line, zero, produces two different results depending on which direction you approach it from; picture the graph and you will see a discontinuity there. This is what the phrase "not well defined" means. In order to be well defined, 1/x would have to approach the same value from either direction.
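In limit notation (my shorthand, not part of the original riddle), the two one-sided approaches are

\[
\lim_{x \to 0^{+}} \frac{1}{x} = +\infty
\qquad\text{but}\qquad
\lim_{x \to 0^{-}} \frac{1}{x} = -\infty.
\]

The two one-sided limits disagree, so there is no single value that 1/0 could consistently be assigned.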