
The origin of zero



  • 2 months later...

The Babylonians used a symbol as a placeholder long before 1700 BC; it was more like 3000 BC. The Greeks later used a round symbol to mark the gap.

 

In 642 AD the Muslims destroyed the great Library of Alexandria. Most of the ancient Greek mathematics was lost, but a few books survived.

 

Later the Hindus in India realised zero was a number in its own right. Years earlier Aristotle had said it could not be a number, since dividing something by zero was incomprehensible. Brahmagupta took the division of something by zero as a definition of infinity.

 

It was also India that gave us the counting system we use today.


  • 4 weeks later...

Oh me, oh my...

 

A teacher actually said 0/0 = 1?

 

At first I thought Homunculus was just being rough on the teacher until I actually read it for myself.

 

The other two I can understand from someone not (well?) versed in math... but even as an elementary student I knew that was wrong as hell.

 

If that is the level of pre-secondary educators' intelligence, it makes you wonder. I learned more at home than I ever did in grade school anyway, but that is something I just cannot comprehend. Maybe I was just a little too trusting of my education system (naive?) to doubt their teachings.

 

Edit -- http://www.mathmojo.com/interestinglessons/division_by_zero/division_by_zero_1.html

 

is what I'm talking about.


  • 7 months later...

0/0 is undefined for the same reason that any quantity divided by zero is undefined.

 

It makes a lot more sense if you just think about it for a couple of seconds. The definition of "a divided by b" is to find some number x such that b*x = a. Now consider the case 1/0: here a = 1 and b = 0, so we want to find a number x such that 0*x = 1. But this is impossible, so 1/0 is not defined. You can use a similar argument to show that we can't define any nonzero number divided by 0. (For 0/0 the failure is different but just as fatal: every x satisfies 0*x = 0, so there is no unique answer.)
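
To make that concrete, here is a minimal Python sketch of that definition (the function name divide is my own illustration, not a standard library routine):

[code]
def divide(a, b):
    """Return the unique x with b * x == a, if exactly one exists."""
    if b != 0:
        return a / b  # the one and only solution, x = a/b
    if a != 0:
        # no x at all: 0 * x is always 0, never equal to a
        raise ZeroDivisionError("no x satisfies 0 * x == " + repr(a))
    # too many x: 0 * x == 0 holds for every x
    raise ZeroDivisionError("0/0: every x works, so no unique answer")

print(divide(6, 3))  # 2.0
for a, b in [(1, 0), (0, 0)]:
    try:
        divide(a, b)
    except ZeroDivisionError as err:
        print(err)
[/code]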


  • 7 months later...

I could be way off base here, but I could not resist replying to this thread.

 

I am currently doing a paper on the origin of mathematics, and truly, zero has been most interesting!

 

Mainly because we have been conditioned to think of zero as both a placeholder and nothing.

 

If you are selling wheat in a Fertile Crescent market and you have one sack left, then after you sell that one you have nothing, and that's it ...

 

Zero was not concrete or necessary: why represent nothing with something? I believe that the best placeholder for nothing would actually be nothing! But we are much more conditioned now to think abstractly and analytically.

 

 

As for division by 0...

 

If we look at the limit of 1/x as x approaches 0 from the right, we find the following:

 

1/1 = 1

1/0.5 = 2

1/0.25 = 4

1/0.001 = 1000

1/(1e-25) = 1e+25

 

We can conclude that the limit of Y/x as x approaches 0 from the right is infinity, where Y is any positive number.

 

Doing the same from the left, we get -infinity.

 

+infinity from the right,

-infinity from the left: the one-sided limits disagree, so the function is not continuous there and 1/0 has no value.
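
Here is that two-sided probe as a minimal Python sketch (the sample points are arbitrary choices of mine, just to show the trend):

[code]
# Evaluate 1/x as x shrinks toward 0 from each side.
for x in [1.0, 0.5, 0.25, 0.001, 1e-25]:
    print(f"1/{x:g} = {1/x:g}    1/{-x:g} = {1/(-x):g}")
# Right-hand values climb toward +infinity, left-hand values
# fall toward -infinity: the two one-sided limits disagree,
# so 1/x has no limit at 0.
[/code]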

 

But doing the same at another point, like 1, from the left and the right we get 1/1.0000000001 and 1/0.9999999999, which both get close to 1.

Therefore 1/x is continuous and defined there.

 

Now if you take a value small enough to be negligible and divide it by another small number, this is what you get:

 

1e-136 is a very, very small number.

 

1e-136/1e-136 = 1, or 1e-136/(-1e-136) = -1

 

or (an infinitely small number)/(another infinitely small number) = 1.

 

That is perhaps why his teacher might have said 0/0 = 1: because we are conditioned to see zero as a number like any other, not as the absence of anything.

 

We could just as well conclude that dividing nothing by nothing leaves us with nothing:

0/0 = 0, since 0*0 = 0. Both of these statements could be seen as correct.

 

But here is the trick that leaves Y/0 undefined:

1e-136/1e-140 = 10,000 ... (an infinitely small number)/(an even smaller number) = an arbitrarily large number, positive or negative ...
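
A short Python sketch of that last point (the exponents are arbitrary picks of mine): ratios of two shrinking numbers can come out as almost anything, which is exactly why 0/0 is called indeterminate:

[code]
eps = 1e-136  # a very small, but nonzero, number

print(eps / eps)      # 1.0     -> suggests 0/0 = 1
print(eps / -eps)     # -1.0    -> suggests 0/0 = -1
print(eps / 1e-140)   # 10000.0 -> suggests 0/0 is huge
print(1e-140 / eps)   # 0.0001  -> suggests 0/0 is tiny
# Four "0/0-shaped" ratios, four different answers.
[/code]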



Just to add: the reason zero took so long to become an accepted part of mathematics is that most of the mathematical traditions in history have viewed geometry as the subject's foundation. While number theory and arithmetic were certainly studied, definitions and proofs in these fields were usually put in geometrical terms. For instance, the Babylonians, the Muslim mathematicians and the later western European mathematicians would express a simple quadratic equation such as [math]x^2+2x=3[/math] as "a square plus twice its side is 3," and a solution for x would be found by a geometrical construction*.

 

Consequently, zero and negative numbers were not developed (the work of Brahmagupta was ignored by later mathematicians), as they had no obvious geometrical interpretations. Their acceptance as legitimate mathematical concepts from the 17th century onwards represents a conceptual shift away from the view that "all is geometrical" to a more arithmetically oriented one. It is also interesting to note that complex numbers were introduced at around the same time, though the secure position they enjoy nowadays as legitimate concepts had to wait for Hamilton and others in the mid-19th century.

 

*In this case, you construct a square with side x, extend the edges by 1 unit, and then, literally, "complete the square" by adding a smaller square of side 1 in the corner, to deduce that [math]\left(x+1\right)^2=4[/math], and thus that [math]x=1[/math].
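 
For completeness, here is the algebra behind that construction written out step by step (my own expansion, including the negative root the geometry silently discards):

[math]x^2+2x=3 \;\Rightarrow\; x^2+2x+1=4 \;\Rightarrow\; \left(x+1\right)^2=4 \;\Rightarrow\; x+1=\pm 2 \;\Rightarrow\; x=1 \text{ or } x=-3[/math]

Since x is the side of a square, only the positive root x = 1 has a geometrical meaning, which is why the negative solution simply never arose.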

 

EDIT: Just trying out LaTeX in this forum.



Interesting thread with some fascinating links. I would have made Dave's mistake and said it was the Arabs who invented zero. Also, I did not know, or had long forgotten, that at one time all proofs were done geometrically. All in all, the thread proves that you can make something out of nothing.



It is possible and reasonable to define 0/0 = 1. 1/0 isn't really a well-defined concept, because it involves an infinity. We can regulate the infinity by saying that it is the limit of a normal number. So [math]\frac{1}{0} \equiv \lim_{x \to 0} \frac{1}{x}[/math].

 

In that case, using our definition for both factors and writing [math]\frac{0}{1} \equiv \lim_{x \to 0} x[/math] with the same x, [math]\frac{0}{0} = \frac{1}{0} \times \frac{0}{1} \equiv \lim_{x \to 0} \left( \frac{1}{x} \times x \right) = 1[/math]

 

The difficulty (or ambiguity) arises when we realise that we needn't have regulated the factor 0/1 at all. We could have just done [math]\frac{0}{0} = \frac{1}{0} \times 0 \equiv \lim_{x \to 0} \left( \frac{1}{x} \times 0 \right) = 0[/math].

 

But in the opinion of a physicist, 0/0 = 1 is the most sensible choice, since we use the same symbol for numerator and denominator, implying that they are the same thing. In a physical situation (which, let's face it, is all maths is good for) we only ever get close to zero, so a limiting-case approach is fine.
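
A small SymPy sketch of that path dependence (my own illustration, not from the post above): regulate numerator and denominator with the same variable and you get 1; let them vanish at different rates and you can get anything:

[code]
from sympy import symbols, limit

x = symbols('x', positive=True)

print(limit(x / x, x, 0))      # 1  -> the "same symbol" choice
print(limit(x**2 / x, x, 0))   # 0  -> numerator vanishes faster
print(limit(x / x**2, x, 0))   # oo -> denominator vanishes faster
print(limit(3 * x / x, x, 0))  # 3  -> in fact, any value you like
[/code]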

 

Professor Homunculus does seem to be a bit of an asshole.

