
0÷0


0÷0 is Easy


40 minutes ago, TakenItSeriously said:

While I'm unfamiliar with Computer Mathematics, I don't think it should be confused with Computer Science, which is based on logic, not math.

While all math is founded on logical premises, generally speaking, Logic and Math are not the same thing. In fact, in general, I'd describe them as being pretty much opposite disciplines, though in a complementary way.

 

The difference between a Computer Scientist and an IT person, for example, is that the former can tell you which algorithm is better, or the best, when presented with several versions. Calculus is what makes that possible.

I did return around 2000 to take post-graduate compSci classes, but abandoned the M.A. early. The most fascinating class I took was a semester on the Turing machine and computability; I don't recall its name as I type, but I have the textbook saved in my basement, not within reach. The question posed on day one was: given an algorithm, is it computable? (I may not be remembering that correctly.) On the last day, the full, complete means of answering the question was revealed and my jaw hit the ground. The old woman who taught the class was very effective. I'll never forget that class.


1 hour ago, TakenItSeriously said:

While I'm unfamiliar with Computer Mathematics, I don't think it should be confused with Computer Science, which is based on logic, not math.

While all math is founded on logical premises, generally speaking, Logic and Math are not the same thing. In fact, in general, I'd describe them as being pretty much opposite disciplines, though in a complementary way.

 

 

29 minutes ago, Strange said:

There is quite a lot of mathematics in computer science.

I assume this is a case of cross-posting, but if you include my complete statement, I did say that the two are complementary, which implies that they are not incompatible.

So yes, I agree, computer science does make liberal use of mathematics in various functions or algorithms, though it's all applied within a logical framework, so I still think that they should not be confused with each other.

For instance:

i = i+1

This is a trivial concept in computer science that does not make sense in mathematics.
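A minimal sketch of the distinction being drawn here (Python used only for illustration; any imperative language would do):

```python
# In a program, "=" is assignment, not equality: the right-hand side
# is evaluated first, then the result is stored back into the variable.
i = 5
i = i + 1      # rebinds i to 6; a perfectly ordinary statement
print(i)       # prints 6

# Read as a mathematical equation, i = i + 1 has no solution:
# subtracting i from both sides would give 0 = 1.
```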

 

 

 

 


You seem to be confusing computer programming with computer science.

And (formal) logic is a branch of mathematics, not separate from it. It is also widely used in computer science. (And in computer programming.)


12 hours ago, studiot said:

My apologies if you do not, in fact, have a computer science degree; perhaps you call them something different in your part of the world.

Informatics, Computer Science, Computing Technology, Information Technology, ............... just plain Computing.

I think that the subject goes under many banners - the list is long.

I am also sure there is a large amount of overlap between syllabuses, although no two university courses will be exactly the same.

But I am also sure that no one course will include everything.

So a BA in Computing will include a good deal of respectable material, whatever it is called.

So why quibble over the name? Any of them will (or should) command respect.

 


1 hour ago, scherado said:

The difference between a Computer Scientist and an IT person, for example, is that the former can tell you which algorithm is better, or the best, when presented with several versions. Calculus is what makes that possible.

I did return around 2000 to take post-graduate compSci classes, but abandoned the M.A. early. The most fascinating class I took was a semester on the Turing machine and computability; I don't recall its name as I type, but I have the textbook saved in my basement, not within reach. The question posed on day one was: given an algorithm, is it computable? (I may not be remembering that correctly.) On the last day, the full, complete means of answering the question was revealed and my jaw hit the ground. The old woman who taught the class was very effective. I'll never forget that class.

 

Thanks for clarifying. I'm relieved to hear that the computer mathematics major later gave way to computer science, as opposed to being something new, since I just don't see a hybrid logic/math approach working to a very wide extent.

 

 


 

5 hours ago, TakenItSeriously said:

So yes, I agree, computer science does make liberal use of mathematics in various functions or algorithms, though it's all applied within a logical framework, so I still think that they should not be confused with each other.

No, no, three times no. Computer Science uses mathematics extensively and in depth. Did I not tell you that calculus is required to evaluate and compare algorithms? I had to take three semesters of calculus, one of differential equations, one of numerical analysis, and there may have been others; I should work that out for myself for occasions like these.

When the whole bleeping show slows to a crawl and people are picking their noses between lengthy response times, and you don't want to or can't trash the whole thing because a ton of money has been spent on it, you call in the Computer Scientist to analyze the software.

One of my most distinct memories of one of the post-graduate compSci classes is the professor looking at us (this guy is now Dean of the whole show, congratulations to him); he snickered and said, "Now you know the reason you took the calculus classes", and wrote the solution to our software problem on the blackboard (he was using chalk) in terms of limits. I now knew the reason I took all those calc classes.
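A generic sketch of the kind of limit argument being described; the actual classroom problem isn't given, so the running-time functions below are purely illustrative:

```latex
% Comparing two hypothetical running times f(n) = n*log(n) and g(n) = n^2
% by taking the limit of their ratio (purely illustrative functions):
\[
\lim_{n \to \infty} \frac{n \log n}{n^{2}}
  = \lim_{n \to \infty} \frac{\log n}{n}
  = 0,
\]
% so n*log(n) grows strictly more slowly than n^2, i.e. it is o(n^2):
% for large enough inputs, the n*log(n) algorithm wins.
```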


2 hours ago, scherado said:

 

No, no, three times no. Computer Science uses mathematics extensively and in depth. Did I not tell you that calculus is required to evaluate and compare algorithms? I had to take three semesters of calculus, one of differential equations, one of numerical analysis, and there may have been others; I should work that out for myself for occasions like these.

When the whole bleeping show slows to a crawl and people are picking their noses between lengthy response times, and you don't want to or can't trash the whole thing because a ton of money has been spent on it, you call in the Computer Scientist to analyze the software.

One of my most distinct memories of one of the post-graduate compSci classes is the professor looking at us (this guy is now Dean of the whole show, congratulations to him); he snickered and said, "Now you know the reason you took the calculus classes", and wrote the solution to our software problem on the blackboard (he was using chalk) in terms of limits. I now knew the reason I took all those calc classes.

Wow, it seems that I was wrong about other posters' opinions of this guy, which I thought unfairly characterised his ideas based on this thread alone. I now assume others were characterising his posts based on his adolescent behavior in past threads.

Too bad, really, since I thought many of his ideas were correct, but his arguments seemed to be nonsensical and baseless.

My mistake. Regardless of any merit his ideas may have, his adolescent responses prohibit any attempt at serious discussion.


12 hours ago, TakenItSeriously said:

So yes, I agree, computer science does make liberal use of mathematics in various functions or algorithms, though it's all applied within a logical framework, so I still think that they should not be confused with each other.

I think that you read a bit too much into my response and too little from my inference. I'm referring to the underlined portion above. Not only did I take the numerous mathematics courses I may have mentioned in previous posts, I worked ten years for an American corporation as a computer programmer, from 1982 to 1992, and lost more than a little sleep over the problems to which I alluded in my previous post and which I was paid to solve.

More than that, I'll list the languages I learned, in order of most used, many of which involved much more than "liberal use of mathematics" (your words):

APL
APL2
C+
C++
Pascal
Prolog
PL/I
Assembly
Fortran
COBOL
Visual Basic
Shockwave Flash

Please note that it was the inanity in your sentence that led me to make the response you didn't quite like.

If you dispute anything, then please let me know.


2 weeks later...
On 6/21/2017 at 11:15 AM, OldChemE said:

Wrong, unfortunately. 0 divided by 0 is undefined. Why, you might ask? The rule in math is that anything divided by itself is 1. This would argue that 0/0 = 1. However, zero has peculiar properties that prevent this. For example, 16/16 = 1, but 16 = 4 x 4, so 16/16 is the same as (4 x 4)/(4 x 4). But 4/4 = 1, so (4 x 4)/(4 x 4) = 1 x 1 = 1, and 16/16 = 1. The point of this example is that if we divide a non-zero number by itself, we always get an answer of 1, even if we factor the number.

 

Now, consider 0/0. 50 x 0 = 0, and 1 x 0 = 0, so 0/0 could be (50 x 0)/(1 x 0). If we cancel the zeros, treating 0/0 as 1, then this version of 0/0 = 50. The point here is that the answer changes depending on what numbers were multiplied to make zero. This, then, defeats mathematics, which is why the result you are seeking is "undefined" in mathematics.

 

Attempting to divide any number by zero, even zero itself, creates inconsistencies in mathematics, and is not permitted.

Considered as a reply, mate, your response raises more problems than it solves! You have successfully made it clear, and correctly I think, that calculations involving 0 share many properties in common with calculations involving infinities...
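A compact restatement, in symbols, of the argument quoted above (my paraphrase, not taken from the original post): if 0/0 had a value, the definition of division would fail to pin it down.

```latex
% If 0/0 were some number c, the definition of division (a/b = c iff a = b*c)
% would require
\[
\frac{0}{0} = c \iff 0 = 0 \cdot c,
\]
% which is true for every real c (c = 1, c = 50, ...). Since no single
% value is singled out, 0/0 is left undefined.
```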


On 21/06/2017 at 0:49 PM, DrKrettin said:

Let x = y

 

x^2 = x.y

 

x^2 - y^2 = x.y - y^2   

 

(x + y)(x - y) = y(x - y)  ===>  since x = y, this is 2*0 = 1*0 (taking x = y = 1); cancel out the 0 and you get 2 = 1, which is exactly why you can't divide by 0.

 

cancel out the (x - y)

 

x + y = y

 

2 = 1

 

Spot the error, or accept that 2 = 1

 


On 14/10/2017 at 11:09 AM, amplitude said:

Considered as a reply, mate, your response raises more problems than it solves! You have successfully made it clear, and correctly I think, that calculations involving 0 share many properties in common with calculations involving infinities...

It depends on the calculation. :)

Adding 0 or multiplying by 0 are well-defined. 

Arguably, doing any arithmetic with infinity is wrong, but, again, you can say what happens in most cases except division.

Note that there is a whole area of maths that deals with infinity (in fact, with the infinitely many different infinities).
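As a loose, concrete illustration of that split (an assumption on my part, since the posts stay abstract), IEEE-754 floating point gives defined answers for most operations involving infinity and returns NaN for the genuinely indeterminate ones. A quick Python sketch:

```python
import math

inf = math.inf

print(inf + 1)       # inf  -- adding a finite number is well-defined
print(2 * inf)       # inf  -- so is scaling by a positive number
print(0 * inf)       # nan  -- 0 times infinity is indeterminate
print(inf - inf)     # nan  -- as is infinity minus infinity
print(inf / inf)     # nan  -- and infinity divided by infinity
```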


On 6/21/2017 at 8:49 PM, DrKrettin said:

Let x = y

 

x^2 = x.y

 

x^2 - y^2 = x.y - y^2

 

(x + y)(x - y) = y(x - y)

 

cancel out the (x - y)

 

x + y = y

 

2 = 1

 

Spot the error, or accept that 2 = 1

Let it be given that your argument is valid. How do you reconcile it with the axioms and definitions of arithmetic, which stipulate that two different numbers cannot have the same successor?
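For readers who don't have the axiom to hand, this is Peano's requirement that the successor function is injective (my paraphrase, not quoted from the thread), and it is exactly what rules out accepting 2 = 1:

```latex
% Successor is injective, and 0 is not a successor:
\[
\forall m\,\forall n\,\bigl(S(m) = S(n) \Rightarrow m = n\bigr),
\qquad
\forall n\,\bigl(S(n) \neq 0\bigr).
\]
% Accepting 2 = 1, i.e. S(S(0)) = S(0), would force S(0) = 0 by
% injectivity, contradicting the second axiom. So the "proof" must
% contain an error: the cancelled factor (x - y) equals 0.
```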

