mooeypoo

Division by Zero, and Division by Infinity

Recommended Posts

Hi,

 

First off -- I am not a mathematical person, so if I understood the entire thing wrong, I'm sorry in advance.

 

We all know what we learned since we're relatively young: division by zero is infinity.

 

A friend of mine told me his teacher said that if you divide something by infinity, you do not get zero, you get (-) infinity.

 

WHAT!?!? :confused:

 

Uhm, well if you said that PHYSICALLY, I could understand, but isn't mathematics built on strict rules..?

 

if 1/0 = infinity

then mathematically

1/infinity = 0

 

 

Assuming I actually understood correctly, and that his teacher is not an idiot, could anyone please explain what is going on in that equation?

That entire thing of division by infinity and division by zero is not something I could completely understand..

 

If I got it all wrong, I'm sorry, just.. err.. please tell me :P

 

 

Thanks!

 

~moo


First off, 1/0 is not infinite. Division by zero is an undefined operation.

 

I've explained this in another thread, so I'm going to try and clear this up once and for all.

 

Let's say you want to divide one number by another. What does this actually mean? In a real world context, division is finding out how much of a cake you can share equally between 3 people, for example.

 

Say you want to define the division operation in a more mathematical way. So say we have some fraction a/b, where the / defines the division operator. The fraction a/b basically says "find a number so that when you multiply b by that number, the answer is a". In other words, find a number x such that b*x = a. (Then x=a/b.)

 

So for the case 6/3, x is obviously 2.

 

Now let's consider what we get for 1/0. If we want to keep things consistent, it makes sense to use the same definition for division. So let's bung the numbers in:

 

We want to find a number such that 0*x = 1. But just look at this for a second - if you multiply any number by 0, then you're going to get 0. It's impossible to find a number that satisfies the equation 0*x = 1, so 1/0 must be undefined.
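To make that definition concrete, here is a small Python sketch. The `divide` helper is hypothetical, written only to mirror the "find x such that b*x = a" definition above:

```python
def divide(a, b):
    """Return the x satisfying b * x == a, mirroring the definition above."""
    if b == 0:
        # No x satisfies 0 * x == 1, since 0 * x is always 0.
        raise ValueError("division by zero is undefined")
    return a / b

print(divide(6, 3))  # 2.0, because 3 * 2 == 6

try:
    divide(1, 0)
except ValueError as err:
    print("1/0 is", err)
```

Python's built-in `/` operator refuses 1/0 for the same reason, raising `ZeroDivisionError`.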

 

Now to the other bit. There is a great temptation in mathematics (as you have done) to toss infinity around casually in equations. The problem is that "infinity" is not actually a number, but rather a symbol that describes an idea. Things like 1/infinity = 0 are really shorthand for the more precise statement that the limit of 1/x as x tends to infinity is zero.

 

This is basically saying what you said; if you take a sequence of numbers that goes 1, 1/2, 1/3, 1/4,... and carry this on forever, then the sequence of numbers will tend to zero.

 

What your friend said is wrong, quite frankly. If you take any real number, say a, then apply the same principles - consider the sequence of numbers a/1, a/2, a/3,...,a/x,... Then we have that the nth term of the sequence is actually equal to a*(1/n). Now as you increase n to an extremely large value, 1/n is going to tend towards zero. As I said before, any number multiplied by zero is equal to zero, and since a is constant, the limit of the sequence in this case will be 0.
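A quick numerical sketch of that argument (the constant `a = 5.0` is an arbitrary choice for illustration):

```python
a = 5.0  # any fixed real number

# The terms a/1, a/2, a/3, ... shrink toward 0 as n grows;
# they never head off toward minus infinity.
for n in (1, 10, 1000, 10**6):
    print(n, a / n)
```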

 

I hope this helps some. It's a bit mathematically oriented, but to explain it properly you need to grasp the concept of limits and sequences of numbers tending to certain values. Cheers.


Yep; the previous post has said everything I was about to say in this one.

 

oh i love maths so much

 

Have a look at some sequences where the denominators tend to 0. You will find that the limit of the sequence doesn't have to be infinity.


One useful way to look at what happens if you divide 1 by 0 and 1 by infinity is to consider limiting cases.

 

If you consider the sequence 1/1, 1/(1/2), 1/(1/3), i.e. the sequence 1/(1/x), which is basically x, it is clear that a sequence with a finite numerator and a denominator which tends to 0 tends to infinity. In mathematics we say

 

[math]
\lim_{x \to 0} \frac{1}{x} \Leftrightarrow \lim_{x \to \infty} x \to \infty
[/math]

 

Notice that we don't say that it is equal to infinity, only that it tends to infinity.

 

It also follows that the sequence (1/1, 1/2, 1/3, ...), i.e. 1/x, tends to 0 as x tends to infinity. Not minus infinity. Again, mathematically,

 

[math]
\lim_{x \to \infty} \frac{1}{x} = 0
[/math]

 

This time we can say that the limit is equal to 0, because 0 is a real number, whereas infinity is not.
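A rough numerical sketch of the two limits above, with arbitrary sample values:

```python
# As x shrinks toward 0 (from the positive side), 1/x grows without bound.
for x in (1.0, 0.01, 0.0001):
    print("x =", x, " 1/x =", 1 / x)

# As x grows toward infinity, 1/x settles toward 0 (not minus infinity).
for x in (1.0, 100.0, 10000.0):
    print("x =", x, " 1/x =", 1 / x)
```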


I'm sorry to say, but I hate math :P I like physics a lot more...

but that might have a lot to do with the fact I don't KNOW a lot of advanced maths. So take my words about maths with caution ;)

 

Anyways, thanks guys!! That's... a lot clearer now. It sounded so weird before, and you just put it into perspective again.

 

I just wonder why they teach that 1/0 = infinity and 1/infinity = (-)infinity in schools. Very weird...

 

Thanks!!

 

~moo


Will do, AntiMagicMan.

 

oh, wait.. I'll have to pass through all the schools...

 

err...

 

remind me sometime in the late 23rd century, when I have the time.

 

:P

 

~moo


I've just read all this as thoroughly as I possibly can (and no, I didn't understand all of it), but I would have thought that in the case of 1/0 the answer would be 1,

 

as the zero would negate the divide operator.

An apple divided by nothing would be an apple, still whole too :)

Am I looking at it too logically again?


 

For YT, it seems that you are using the real world interpretation rather than the strictly mathematical one as dave so nicely explained above.

 

AntiMagicMan,

It would tend to infinity, but never reach it, creating an asymptote and therefore being undefined? Is this about right?


Basically yes. We can say quite rigorously that in the limit as x -> 0, 1/x tends to infinity. But we cannot say it equals infinity. So the expression 1/0 is undefined, but 1/x tends to infinity as x goes to 0.

 

This is all due to a move in mathematics over the last century or so to remove all mention of infinity and infinitesimals and instead talk about limiting processes.


 

As x goes to 0 from the right, 1/x tends to infinity. From the left it tends to negative infinity. That's the entire problem of trying to define 1/0.
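That two-sided disagreement is easy to see numerically (the step sizes here are arbitrary choices):

```python
# Approach 0 from the right (positive h) and from the left (negative h):
for h in (0.1, 0.001, 1e-6):
    print("1/%g = %g,  1/%g = %g" % (h, 1 / h, -h, 1 / -h))

# The right-hand values climb toward +infinity while the left-hand values
# fall toward -infinity, so no single number can serve as 1/0.
```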


Ah yes, that is a very good point. I shouldn't have said that the limit of 1/x as x tends to 0 is infinity without specifying a direction. The limit at 0 does not exist because the function is discontinuous there.

 

Basically we cannot equate 1/0 to anything.


Cool. I like the limits idea. It does seem to make more sense; though the concept of infinity still needs to remain, I do see why it is better to say a function tends to infinity rather than is infinity. Thanks for the help.


You also have to understand infinity. Infinity is not a quantified number, it is a "range". You can have different "degrees" of infinity; some infinities get bigger faster than other infinities as x approaches a number.
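As a rough illustration of those "degrees" (sample values arbitrary): x and x**2 both tend to infinity, but their ratio keeps growing too, so x**2 outpaces x.

```python
# Both columns grow without bound, but x**2 / x = x also keeps growing,
# so the two "infinities" grow at different rates.
for x in (10, 100, 10000):
    print(x, x ** 2, x ** 2 / x)
```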

The "divide by zero" concept is applied in calculus to calculate derivatives - the slope of a curve.

lim      [u]f(x+h)-f(h)[/u]
h->0         h

This gives you the slope at point x in the function f(x).

Basically, you are computing the change in f(x) divided by the change in x. As the change in x approaches zero, you find the instantaneous rate of change of the curve.

 

As for dividing by zero in a simple sense, that's pretty tricky. You learn simple division in elementary school - "If you have 9 coins and want to put them into three separate stacks, how many go in each stack? Three." But if you want to put them in zero stacks, you're going to have some trouble making them disappear...

lim      [u]f(x+h)-f(h)[/u]
h->0         h

 

that should be

lim      [u]f(x+h)-f(x)[/u]
h->0         h


He is correct about that definition, sorry.

The other definition of derivative is:

lim     [u]f(x)-f(a)[/u]
x->a       x-a

As you can see on both of these definitions of the derivative, you are calculating the slope when the change in x is approaching zero. So in these instances, you come very close to "dividing by zero."
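These difference quotients are easy to sketch numerically. Here is a hedged example using the first form with f(x) = x**2 (the function and step sizes are arbitrary choices); the quotient closes in on the true slope 6 at x = 3 while h stays nonzero throughout:

```python
def slope(f, x, h):
    # Difference quotient from the definition above; h is small but never 0.
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2  # example function; its derivative at x = 3 is 6

for h in (1.0, 0.01, 1e-6):
    print(h, slope(f, 3.0, h))
```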


For f(x) = 1/x (or indeed c/x, c being in the set of reals) and a = 0, that limit doesn't exist - which is the entire point of being unable to define c/0.

