
proof that "all software has bugs?"


westfalr


Actually, no software has bugs- the computer does exactly what you tell it to.

The problem is that you are running the wrong software for the job you want to do.

It's a specification problem.

That's nonsense.

 

The number of times I have had the "It will shorten development time" / "But we don't have time" / "But it will save you more time than you spend" / "I'm sure it will, but we just don't have time" conversation is staggering.

Really? I can't even.

Edited by Thorham

Thank you for taking the time to frame your argument so convincingly.

I'm sure that you will have changed everyone's mind on this matter.

 

It's because software is always written with a particular task or function in mind. If software doesn't perform the function it's supposed to, then it contains one or more bugs. Yes, telling the computer to do something other than what you had in mind is a bug, a programming mistake.

Edited by Thorham

But the fault lies with you, not the software. It's still doing what you told it to.

 

That's exactly the point. Bugs are programming mistakes. The software ends up not doing what it's supposed to do. Saying that all software is correct is just not right.

 

When viewed from the computer's perspective, a piece of software is just a sequence of instructions. Nothing more, nothing less. There is no correct or incorrect at that level, because the computer doesn't know what the software is supposed to do. Only from the human perspective are sequences of instructions correct or incorrect. The computer's viewpoint is simply irrelevant.


Hello,

Here is the proof of your claim, "all software has bugs", presented in the articles below in different ways:

 

https://docs.google.com/document/d/1dCyAsZoC0UdKikV7el4SdhdvXdu00zMsXb2jJRyFYqs/edit?usp=sharing

 

https://docs.google.com/document/d/1zHT2gF-8uch3dxdH8OqtEh_Hy1D_7kf793A0wVhTFII/edit?usp=sharing

 

https://docs.google.com/document/d/154VkbJDrunJ-ZNahkq9mdY--J1FQe14OoFlil9BSP2Q/edit?usp=sharing

 

Speaking more precisely, your claim is as follows:

 

"all programs that run on computer have bugs due to processor work is based on arithmetic operations to which Godel Incompleteness Theorem is applicable"

 

You fix one bug, and the next day there is another bug.

 

That is why there are lots of support people for every known program.

Edited by super_zhuk

 

 

 

When viewed from the computer's perspective, ... The computer's viewpoint is simply irrelevant.

I see that you are talking about the computer's perspective as if it was somehow important to my point.

I, on the other hand, never mentioned it.

Was that a deliberate attempt at a straw-man attack, or did you just not read your post and think it through?

 

 

As I said, the problem does not lie with the software, but with the programmer.

(Actually, since I sometimes get a computer to write code for me, the programmer is sometimes software. But if it fails, it's still my fault, not the software's.)


I see that you are talking about the computer's perspective as if it was somehow important to my point.

I, on the other hand, never mentioned it.

This implies the computer's perspective:

 

Actually, no software has bugs- the computer does exactly what you tell it to.


How?

There are two parts to that.

I presume you mean the latter "the computer does exactly what you tell it to."

It's true from any perspective (the computer's, the software's or the programmer's) because a computer can't do anything other than follow what the s/w tells it.

So the bug rests not in the software, but in the programmer.

If, on the other hand, you were referring to the other clause of what I posted, "Actually, no software has bugs", then I can assure you that you are simply wrong. That statement is from my perspective, and I'm not a computer (or, if I am, we all are and the whole discussion is moot).

 

In any event your assertion that it's "nonsense" relates to a different view on the meaning of "bug" and as such isn't true from all points of view. I think you were wrong to make the blanket claim that it was "nonsense" without some sort of explanation.

 

It's like claiming that someone who says they were driving at 50 MPH is talking "nonsense" because it's not true from the point of view of someone passing them in another vehicle.


I presume you mean the latter "the computer does exactly what you tell it to."

 

Yes, what else?

 

It's true from any perspective (the computer's, the software's or the programmer's) because a computer can't do anything other than follow what the s/w tells it.

 

The fact that a computer only does what it's told isn't relevant. It's told to do the wrong thing by the software. The software produces the wrong result, so the bug is in the software. Caused by the programmer or the compiler, obviously, but it's still in there.
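As a small, hypothetical illustration of that point (not code quoted from anywhere in this thread), here is a C sketch of a routine whose stated purpose is to add two numbers but whose code tells the computer to subtract. The machine executes the subtraction faithfully; the bug is the gap between what the code says and what was meant:

#include <stdio.h>

/* Intended behaviour (the "specification"): return the sum of a and b.
 * Actual code: subtracts instead. The computer follows the instruction
 * faithfully; the mismatch with the intention is what is being called a bug. */
static int add(int a, int b)
{
    return a - b;   /* programming mistake: should be a + b */
}

int main(void)
{
    printf("add(2, 3) = %d (expected 5)\n", add(2, 3));
    return 0;
}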


Actually, no software has bugs- the computer does exactly what you tell it to.

The problem is that you are running the wrong software for the job you want to do.

It's a specification problem.

That's more of a semantics argument rather than an argument of the actual concept.


 

The fact that a computer only does what it's told isn't relevant. It's told to do the wrong thing by the software. The software produces the wrong result, so the bug is in the software. Caused by the programmer or the compiler, obviously, but it's still in there.

"The fact that a computer only does what it's told isn't relevant.

it is relevant to the point I was making; that it's not an issue of the computer's point of view (which you claimed it was)

 

"The software produces the wrong result, so the bug is in the software. "

and similarly

The programmer produces the wrong software, so the bug is in the programmer.

And the bug is in the programmer because... well, often it's a specification problem. Sometimes it's a programmer problem.

And it still has nothing to do with the computer's point of view.

 


That's more of a semantics argument rather than an argument of the actual concept.

Any proof that "all software has bugs" was going to be a semantics issue anyway.

 

10 PRINT "HELLO"

20 END


It is relevant to the point I was making: that it's not an issue of the computer's point of view

It's indeed not an issue of the computer's point of view, which is exactly why the bug is in the software.

 

Computer's view: No wrong or right software. Only a bunch of instructions. The exception is, of course, illegal instructions, which are opcodes that the CPU can't decode into anything meaningful.

 

Human view: There is wrong and right software based on the results the software generates and the requirements.

 

Only the human view is relevant, and from that viewpoint the software either does what it's supposed to do or it doesn't. If it doesn't, it contains one or more bugs.

 

The programmer produces the wrong software

 

Of course.

 

so the bug is in the programmer.

 

Software bugs don't refer to humans, they refer to software.

 

Any proof that "all software has bugs" was going to be a semantics issue anyway.

 

Not really, it simply isn't true...

 

10 PRINT "HELLO"

20 END

... as demonstrated by this little program, unless it's supposed to print something other than 'Hello' on the screen.

Edited by Thorham

What if I deliberately make a program that gives the value of Pi as 3? Would it be a bug or not?

A PI constant/variable should be used later in the code somewhere (otherwise it's an unused variable), e.g. for drawing a circle on screen, building a 3D sphere mesh, or calculating the area of a circle, the area of a sphere, or the volume of a sphere.

If the wrong constant value is used in those calculations, their results will also be wrong, by a factor of 3.14159265/3 ≈ 1.0472 (nearly a 5% difference).

In C/C++, pi is defined in math.h as M_PI, and it should be used in all programs that need pi.
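To make that roughly-5% figure concrete, here is a minimal C sketch (assuming a math.h that provides M_PI, as on POSIX systems; some compilers want _USE_MATH_DEFINES defined first) comparing a circle area computed with the crude value 3 against one computed with M_PI:

#include <stdio.h>
#include <math.h>   /* provides M_PI on POSIX systems; some compilers need _USE_MATH_DEFINES */

#define ROUGH_PI 3.0   /* the deliberately crude value from the example above */

int main(void)
{
    double r = 2.0;
    double rough  = ROUGH_PI * r * r;   /* area using pi taken as 3 */
    double proper = M_PI * r * r;       /* area using the math.h constant */

    /* proper / rough = M_PI / 3 ≈ 1.0472, i.e. the crude result is about 4.7% too small */
    printf("rough = %f  proper = %f  ratio = %f\n", rough, proper, proper / rough);
    return 0;
}

Whether that 5% error counts as a bug is exactly what the rest of the thread argues about; the program does precisely what it was written to do.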


The term bug should only be used to describe errors, leaks, loops or crashes. This is why good devs use flow control (e.g. if/else statements) so that data matching specific parameters is processed and anything else is ignored.

 

No programming language is without issues that may give rise to bugs.

 

For example, lists. Some start at Line 0, then Line 1... and so on. In the zero-based case, converting an integer index to a line number leaves you one out (you have to add 1), and conversely, getting an integer index from any line number requires subtracting 1.

 

Simply using +/- 1 is a workaround for an inadvertent feature, not necessarily a bug, especially if the program runs as intended.
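A minimal C sketch of that off-by-one trap (the array and helper names are made up purely for illustration):

#include <stdio.h>

/* Hypothetical helper: convert a human-style 1-based line number
 * into a 0-based array index. Forgetting this -1 (or the matching +1
 * when going the other way) is the classic off-by-one slip. */
static int line_to_index(int line_number)
{
    return line_number - 1;
}

int main(void)
{
    const char *lines[] = { "first line", "second line", "third line" };

    /* "Line 1", as a person counts it, lives at index 0. */
    printf("line 1 -> %s\n", lines[line_to_index(1)]);

    /* Using the line number directly as the index would silently fetch
     * the wrong element, or run off the end of the array for the last line. */
    printf("line 3 -> %s\n", lines[line_to_index(3)]);
    return 0;
}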


Human view: There is wrong and right software based on the results the software generates and the requirements.

Of course.

Not really, it simply isn't true...

... as demonstrated by this little program, unless it's supposed to print something other than 'Hello' on the screen.

 

Stop saying I'm not human.

But it's obviously trivially true, so, if it's worth asking, it's a semantic issue.

Incidentally, re.

"as demonstrated by this little program, unless it's supposed to print something other than 'Hello' on the screen."

Nope, it's supposed to write "HELLO"

Would that be a bug or not?


A PI constant/variable should be used later in the code somewhere (otherwise it's an unused variable), e.g. for drawing a circle on screen, building a 3D sphere mesh, or calculating the area of a circle, the area of a sphere, or the volume of a sphere.

If the wrong constant value is used in those calculations, their results will also be wrong, by a factor of 3.14159265/3 ≈ 1.0472 (nearly a 5% difference).

In C/C++, pi is defined in math.h as M_PI, and it should be used in all programs that need pi.

 

But I've decided I don't wish to use a more accurate value. Would it be a bug if the program is faithfully executing my commands?


What if I deliberately make a program that gives the value of Pi as 3? Would it be a bug or not?

No, it wouldn't, because the program is supposed to give PI a value of 3. Mathematically incorrect, of course, but not a bug. The only thing that matters is that the program does what it's supposed to do and nothing more or less.


But if someone else used the program they might consider it a bug, so which one of us is correct?

 

Strictly speaking neither value is mathematically correct, though obviously one will be more accurate than the other.

Edited by Endy0816

But if someone else used the program they might consider it a bug, so which one of us is correct?

 

Strictly speaking neither value is mathematically correct, though obviously one will be more accurate than the other.

You will hear the phrase "working as designed" crop up a lot, especially in very large applications. If the program is correctly executing the steps in its code, then it is without bugs. The design that led to that code may be wrong as hell, but that's not the code's fault.


But if someone else used the program they might consider it a bug, so which one of us is correct?

You are correct, because you wrote the program to do that. The other person would only consider it a bug if they didn't know you wrote it like that on purpose.

 

Strictly speaking neither value is mathematically correct, though obviously one will be more accurate than the other.

That's true, although you might say that something is a correct approximation, so that 3.1415926 is correct, while 3.1415123 is not, because some of the digits are wrong. Perhaps... maybe.


 

If it does what it's supposed to do, and nothing more, then there's no bug.

Yes.

And it's totally and trivially obvious that a programme can be written that has no bugs.*

So the only way the question can make sense is if it's a matter of semantics.

So my post making a point about semantics wasn't nonsense; it's just that you didn't understand it.

 

*

END

Edited by John Cuthber
