
Broken Maths? (0/0=X)


Lightmeow


Some time ago, someone asked what zero divided by zero was. Someone answered that if 0x = 0, then 0/0 = x. That's all fine and good, but isn't there a rule that anything divided by itself is one? So wouldn't you technically be turning nothing into one? How does that work? Can someone give a mathematical explanation of what is right? I don't get how 0/0 = x; how can that be true? Wouldn't 0/0 = 1, so x would equal one? But 0 does not equal one.

 

Thanks for your thought and time.


The structure of the real numbers, as we usually take it to be, does not allow for division by zero, for the reason that assigning a definition leads to logical inconsistency. Therefore, the correct answer in this context is that "division by zero is undefined."

 

I suppose that it's possible, in principle, to define a structure in which 0/0 is defined, but I'd imagine we'd have to abandon certain intuitive properties of numbers in that case.

 

Numberphile has a decent video covering this and other operations involving zero.

Edited by John

x/0 is inconsistent but 0/0 isn't, so it is called "indeterminate" instead of "undefined" as x/0 is.

I don't know if this is debatable, or if it's a rule or if it depends on context or what.

 

 

Because any value of x satisfies 0x=0, x is indeterminate in that equation.

You might be able to reason through your questions by considering that asking "what is 0/0?" is like asking "How many times does 0 go into 0?", and the answer is "any number of times," or "How many groups of size 0 can 0 be divided into?", and the answer is "any number of groups."

 

x/x = 1 doesn't override other rules; it just happens that x goes into x exactly 1 time for non-zero x. It's true that 0 goes into 0 one time, but it also does so 2 times, or 0 times, or 5 (I could go on). Conversely, for 1/0, there is no sensible meaning of splitting 1 into groups of size 0. There is no number of groups of size 0 that add up to 1.
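To put the same point in symbols (just a restatement of the counting argument above): for 0/0 we are looking for a count q with

[math]0 \times q = 0,[/math]

and q = 0, 1, 2, 5, ... all work, so no single answer is picked out. For 1/0 we would need

[math]0 \times q = 1,[/math]

and no real q works at all.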


x/0 is inconsistent but 0/0 isn't, so it is called "indeterminate" instead of "undefined" as x/0 is.

I don't know if this is debatable, or if it's a rule or if it depends on context or what.

 

"Indeterminate form" is used in the context of limits in analysis, when simple substitution gives a result that doesn't determine the value of the limit itself.

 

Division by zero, itself, is undefined, even if the numerator is also zero.


Undefined:

 

X * 0 = 5 --> 5/0 = X

 

Indeterminate:

 

X * 0 = 0 --> 0/0 = X

 

There are many numbers that, when multiplied by zero, equal zero, but there is nothing that, when multiplied by zero, equals 5.

 

This is how I generally view the difference anyways.


No. Zero is certainly a real number, whereas infinity is not.

 

However, division by zero cannot be defined in the usual structure of the real numbers without leading to logical inconsistency. That is to say, give me any reasonable and useful value for the expression x/0 and we can use it to deduce a false statement.
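For instance (a sketch of the sort of false statement that results): say we assign [math]\frac{1}{0} = k[/math] for some real number k. If division is to undo multiplication, this means [math]k \times 0 = 1[/math]; but [math]k \times 0 = 0[/math] for every real k, so we conclude 1 = 0, which is false.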

 

Also, it seems 0/0 is sometimes referred to as "indeterminate" even outside the context of limits, which is news to me. Admitting 0/0 as a valid arithmetical expression still leads to absurdity, however.

Edited by John

Admitting 0/0 as a valid arithmetical expression still leads to absurdity, however.

Do you have examples?

 

I can think of an example only if you don't properly treat it as indeterminate, for example if we say "Since it is unknown we can replace 0/0 with x, where x is unknown", and we've mistakenly implied that if 0/0 is used several times then it should have the same value everywhere it's used.

An equation like "0/0 = 0/0" would have to be treated as indeterminate forms, not as unknown reals, and understood to mean that "the indeterminate forms are the same" and not "the forms have the same unknown real value." Either that, or simply avoid defining 'equality' for indeterminate forms. Seems messy and tricky, but I can't think of any absurdity.

Edited by md65536

I watched the video at school today. So basically what you are saying is that zero is like infinity: it is a concept, not a real number? :unsure:

In modern mathematics zero and the negative numbers (e.g. -2) are real numbers. The real numbers satisfy some nice properties with respect to addition and multiplication. The inverse operations, subtraction and division, are also defined; however, the number zero is not invertible with respect to multiplication. We have no sensible way of defining division by zero, as lots of people here have stated.

 

Infinity is different. It (or really they) do not obey the nice rules of real numbers and so cannot be thought of as a real number.

 

The distinction between infinity as a mathematical concept and numbers as numbers is artificial. Everything in mathematics is a concept; some of these concepts are just more closely tied to our physical world than others. For example, we have the natural numbers 1, 2, 3, ... which are used for counting or ordering. This concept is easy to link with reality; I have 2 apples, you came 3rd in the class test, etc.

 

This is why the concept of the number zero took a while to become part of mathematics. Then of course we have the negative numbers, but that is another topic.


Do you have examples?

 

I can think of an example only if you don't properly treat it as indeterminate, for example if we say "Since it is unknown we can replace 0/0 with x, where x is unknown", and we've mistakenly implied that if 0/0 is used several times then it should have the same value everywhere it's used.

An equation like "0/0 = 0/0" would have to be treated as indeterminate forms, not as unknown reals, and understood to mean that "the indeterminate forms are the same" and not "the forms have the same unknown real value." Either that, or simply avoid defining 'equality' for indeterminate forms. Seems messy and tricky, but I can't think of any absurdity.

Well, to be clear, I'm speaking in the context of the field of real numbers. Thus, the expression "0/0" is meaningless to begin with, since 0 (more generally, the additive identity) has no multiplicative inverse in this field or any other. Even if we ignore that fact, admitting 0/0 means it must be taken as a real number to maintain closure under multiplication and retain the field structure.

 

By the definition of multiplicative inverse, 0/0 = 1. Given any a ≠ 1, we have 0a = 0, thus [math]0^{-1}(0a) = 0^{-1}(0)[/math], so [math]a = 0^{-1}(0) = 0/0 = 1[/math], a contradiction. Specifically, we have 0(0) = 0, so 0 = 0/0 = 1. Thus the additive and multiplicative identities are equal, which by definition is impossible in a field.
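A compact way to write out the same contradiction (restating the argument above in one chain): for any real a,

[math]a = a \times 1 = a \times (0 \times 0^{-1}) = (a \times 0) \times 0^{-1} = 0 \times 0^{-1} = 1,[/math]

so every real number would equal 1.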

 

Just taking 0/0 to be a unique element leads to two possibilities. If 0/0 is allowed to "mingle" with other reals through addition and multiplication, then (I think) we end up just finding 0/0 = 0, i.e. it's just another symbol for the additive identity. Other values can probably be determined as well, in which case we can derive that given two elements [math]a_1 \neq a_2[/math], we have [math]a_1 = a_2[/math], a contradiction. If, on the other hand, we isolate 0/0 into a separate class of "indeterminates," then it doesn't seem to add anything to the structure, rendering the expression rather useless.

Edited by John

Having read through the above posts, I'm still having trouble with this consideration -

 

Suppose we say "divide 5 by 0". Isn't that the same as saying "divide 5 by nothing"? And "nothing" literally means "no thing", and if you divide 5 by no thing, surely that means 5 ought to stay as 5.

 

Just as when you "subtract" no thing from 5, it stays 5. As in: "5 - 0 = 5". That doesn't seem to cause any problems in maths, so I can't see why division should either.

 

I mean, isn't "division" just a shorthand word for repeated subtraction? You can subtract 0 from 5 as many times as you like, 5 - 0 - 0 - 0 ..... and the answer is always 5, without apparently causing absurdity.

 

Which is why this whole business of "division by 0" causing a problem has always puzzled me. Can't mathematicians just stipulate that "5/0" means "5 - 0", i.e. still 5? Would that resolve the problem?

Edited by Dekan

Having read through the above posts, I'm still having trouble with this consideration -

 

Suppose we say "divide 5 by 0". Isn't that the same as saying "divide 5 by nothing"? And "nothing" literally means "no thing", and if you divide 5 by no thing, surely that means 5 ought to stay as 5.

 

Just as when you "subtract" no thing from 5, it stays 5. As in: "5 - 0 = 5". That doesn't seem to cause any problems in maths, so I can't see why division should either.

 

I mean, isn't "division" just a shorthand word for repeated subtraction? You can subtract 0 from 5 as many times as you like, 5 - 0 - 0 - 0 ..... and the answer is always 5, without apparently causing absurdity.

 

Which is why this whole business of "division by 0" causing a problem has always puzzled me. Can't mathematicians just stipulate that "5/0" means "5 - 0", i.e. still 5? Would that resolve the problem?

Division is really "separation into equal-sized groups". How do you separate anything into groups of size 0? You can achieve the same result by subtraction, but with subtraction you're actually reducing the size of the single group, not splitting the group; while they can be used to achieve the same result, they are fundamentally different operations.

 

Think of it like this: if I divide a pie into 2 pieces, I have two pieces of the same pie, but they're both still there. If, on the other hand, I eat half the pie (subtract half of it), I will only ever have half the pie from that point forward.

 

Think of division in this way:

 

[math] 5 \div 2 [/math] means to divide a set of 5 into 2 equal sets. That's what division means. So then

[math]5 \div 0[/math] means to divide a set of 5 into 0 equal sets, which is nonsense, since the smallest number of equal sets that makes sense is 1, which is the original set of 5.


 

The real numbers satisfy some nice properties with respect to addition and multiplication.

 

What's nice about them?

 

I only wish my wallet obeyed the rules for manipulating infinities, rather than the rules for manipulating real numbers, so if I take out £10 from my wallet I still have the same amount of money in it.

 

:)


Well, to be clear, I'm speaking in the context of the field of real numbers.

I think all of the problems you came up with involve treating 0/0 as a real number, with a definite value and sharing the properties of real numbers. But wouldn't you be able to do the same thing with say i, deriving contradictions if i is treated as real and using rules that apply to reals?

 

If, on the other hand, we isolate 0/0 into a separate class of "indeterminates," then it doesn't seem to add anything to the structure, rendering the expression rather useless.

I think this is the only way it could be done. As a mathematical object it would have to be able to retain the property of being indeterminate, and it could not be consistently treated as a real. It could be useless... I can't conceive of a case otherwise.

Suppose we say "divide 5 by 0", Isn't that the same as saying "divide 5 by nothing" . And "nothing" literally means "no thing", And if you divide 5 by no thing, surely that means 5 ought to stay as 5.

Just to add to what's already been said...

 

By similar reasoning, if you divide 10 into 5 groups, you still end up with 10. Some of the properties of numbers are based on those of physical objects, including conservation of mass, and the operations don't "destroy the input values". If you try to divide 5 into 0 groups, you can't. Yes, you still have that original 5, but it is not the result of the division operation.

 

To make 5/0 = 5 would redefine division and give it a very different meaning in different cases. You could always define a new operation.


Having read through the above posts, I'm still having trouble with this consideration -

 

Suppose we say "divide 5 by 0". Isn't that the same as saying "divide 5 by nothing"? And "nothing" literally means "no thing", and if you divide 5 by no thing, surely that means 5 ought to stay as 5.

 

Just as when you "subtract" no thing from 5, it stays 5. As in: "5 - 0 = 5". That doesn't seem to cause any problems in maths, so I can't see why division should either.

 

I mean, isn't "division" just a shorthand word for repeated subtraction? You can subtract 0 from 5 as many times as you like, 5 - 0 - 0 - 0 ..... and the answer is always 5, without apparently causing absurdity.

 

Which is why this whole business of "division by 0" causing a problem has always puzzled me. Can't mathematicians just stipulate that "5/0" means "5 - 0", i.e. still 5? Would that resolve the problem?

 

Division is defined as the inverse of multiplication, i.e. when we say "a divided by b," we're really saying "a multiplied by the multiplicative inverse of b." Thus, 5/0 is shorthand for [math]5 \times 0^{-1}[/math].

 

Now, by definition, the multiplicative inverse [math]x^{-1}[/math] of a field element x is the unique element such that [math]xx^{-1} = 1[/math]. Thus, we have that [math]0(0^{-1}) = 1[/math]. This is problematic, first because the proposed definition means [math]0(0^{-1})[/math] should equal 0 (which also follows from the demonstrable fact that 0 times any field element equals 0), and second because it yields inconsistency, e.g. 3 * 0 = 0 yields 3 = 1.
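As an aside, the "demonstrable fact" mentioned above is quick to check from the field axioms (a sketch): [math]0x = (0 + 0)x = 0x + 0x[/math] by distributivity, and adding the additive inverse of 0x to both sides leaves [math]0 = 0x[/math].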

 

I think all of the problems you came up with involve treating 0/0 as a real number, with a definite value and sharing the properties of real numbers. But wouldn't you be able to do the same thing with say i, deriving contradictions if i is treated as real and using rules that apply to reals?

 

I think this is the only way it could be done. As a mathematical object it would have to be able to retain the property of being indeterminate, and it could not be consistently treated as a real. It could be useless... I can't conceive of a case otherwise.

Indeed. All my points in this thread have been made in the context of the field of real numbers. Exotic structures may permit operations that we usually think of as being undefined, as mentioned in my first post in this thread.


No. Zero is certainly a real number, whereas infinity is not.

 

However, division by zero cannot be defined in the usual structure of the real numbers without leading to logical inconsistency. That is to say, give me any reasonable and useful value for the expression x/0 and we can use it to deduce a false statement.

 

Also, it seems 0/0 is sometimes referred to as "indeterminate" even outside the context of limits, which is news to me. Admitting 0/0 as a valid arithmetical expression still leads to absurdity, however.

 

Then how is zero a real number? If you can't get it, then what is it? Is zero even a real number? I have always been taught that it is on the number line, so it must be a real number. But because it cannot be divided in any way, what is it? Why is it on the number line if it isn't a real number? Is it being used as a metaphor for something more? If zero is nothing, then what is it?


The number axioms are surprisingly few and do not include division.

 

In advanced algebra there is a formal entity called a division ring.

 

http://en.wikipedia.org/wiki/Division_ring

 

Part of the reason we have rings, fields, groups and so on (carefully) defined is to overcome the difficulties that appear in less detailed treatments.


 

Then how is zero a real number? If you can't get it, then what is it? Is zero even a real number? I have always been taught that it is on the number line, so it must be a real number. But because it cannot be divided in any way, what is it? Why is it on the number line if it isn't a real number? Is it being used as a metaphor for something more? If zero is nothing, then what is it?

 

Zero is indeed a real number. In particular, zero is the unique element that serves as the additive identity and multiplicative annihilator for the field of real numbers, i.e. for any real number x, we have that x + 0 = x and x * 0 = 0.

 

I'm not sure what you mean by "if you can't get it." One can arrive at zero by subtracting a number from itself.

 

Also, zero can be divided by any non-zero real number, and this will again yield zero.

 

Division by zero is undefined because given a/0 = b for some a ≠ 0, there is no real value of b such that 0b = a.

 

As discussed above, conceptually, we might think of a/0 as meaning either "how many times must we subtract 0 from a to arrive at 0" or "starting at 0, how many times must we add 0 to arrive at a." In either case, the answer cannot be a real number, as adding 0 to itself any real number of times still yields 0 and subtracting 0 from a any real number of times still yields a.

 

Each of these also lends itself to conceptualization in terms of grouping or partitioning. How many groups of 0 objects must we collect to have a objects? Alternatively, given a objects, how many groups of 0 can we form if we must completely exhaust our collection? For any non-zero a, we run into the same issues as before.

 

If we consider 0/0, then it turns out that every real number b satisfies 0b = 0. Therefore, instead of having no possible real values, 0/0 has infinitely many possible real values. This is why, as some posters mentioned before, we might call 0/0 "indeterminate" rather than (or perhaps in addition to) "undefined."
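One way to summarise the contrast in symbols: defining [math]\frac{a}{0} = b[/math] amounts to requiring [math]0 \times b = a[/math]. For [math]a \neq 0[/math], no real b satisfies this (undefined); for [math]a = 0[/math], every real b satisfies it (indeterminate).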

 

Looking at the number line, including zero is part of achieving a very nice property of the real numbers, which is that the reals have no "gaps." This property is called "completeness," and without it, many important results (including, notably, much of calculus) are invalid.

 

Alternatively, if we take zero to be a natural number (or even if we leave it until we've defined the integers), then by definition zero is a real number, as [math]\mathbb{N} \subset \mathbb{Z} \subset \mathbb{R}[/math].

Edited by John

In another current thread Unity+ is asking about the ancient Greeks.

 

The ancient Greeks took a largely geometric view of maths and they offered a geometric construction for 'division'.

 

It is instructive to try to apply this to both division of a nonzero line by a zero line (= division of a real number by zero) and of a zero line by a zero line (= 0/0).


The number axioms are surprisingly few and do not include division.

 

In advanced algebra there is a formal entity called a division ring.

 

http://en.wikipedia.org/wiki/Division_ring

 

Part of the reason we have rings, fields, groups and so on (carefully) defined is to overcome the difficulties that appear in less detailed treatments.

 

Ok, so I read that and it makes sense. This graph on the multiplicative inverse page kind of says that you can never get to zero.

[Figure: graph of the hyperbola y = 1/x, from Wikipedia: http://en.wikipedia.org/wiki/Multiplicative_inverse]

And due to this graph, you can never get to 0, because you can't divide by zero. Instead, the number can get closer and closer to zero, to a point where it is 1/infinity. But isn't there a way to prove that .9999... (repeating) is equal to 1?

 

I think it goes something like this:

x=.99999...

10x=9.9999...

10x-x=9.9999...-0.9999...

9x=9

x=1

 

Couldn't you do the same thing for zero, because it is on the other side? What I mean is to prove that 1 over infinity is 0.


Ok, so I read that and it makes sense. This graph on the multiplicative inverse page kind of says that you can never get to zero.

[Figure: graph of the hyperbola y = 1/x, from Wikipedia: http://en.wikipedia.org/wiki/Multiplicative_inverse]

And due to this graph, you can never get to 0, because you can't divide by zero. Instead, the number can get closer and closer to zero, to a point where it is 1/infinity.

While this particular figure is drawn well, be careful not to place too much trust in graphs and diagrams. They can be misleading.

 

But isn't there a way to prove that .9999... (repeating) is equal to 1?

 

I think it goes something like this:

x=.99999...

10x=9.9999...

10x-x=9.9999...-0.9999...

9x=9

x=1

It is true that 0.999... = 1 (in fact, every nonzero terminating decimal has a second representation ending in infinitely many 9's). I've seen some people object to this particular line of reasoning, but that's a minor point. There are many ways to show that the equality holds.
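One such alternative (a sketch using the formula for a geometric series with ratio 1/10):

[math]0.999\ldots = \frac{9}{10} + \frac{9}{100} + \frac{9}{1000} + \cdots = \frac{9/10}{1 - 1/10} = 1.[/math]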

 

Couldn't you do the same thing for zero, because it is on the other side? What I mean is to prove that 1 over infinity is 0.

 

Well, again we must remember we're talking about the real numbers here, and since infinity is not a real number, an expression like 1/∞ isn't really valid. We can append ±∞ to the real numbers to obtain the extended real number line, but this is a new structure not equivalent to the field of real numbers. At the level we're talking about here, appending ±∞ is mainly useful to give us a shorthand notation for describing limits involving values that increase or decrease without bound. So while it's true that [math]\lim_{x \to \infty} \frac{1}{x} = 0[/math], it's incorrect to say that [math]\frac{1}{\infty} = 0[/math].

