
The distinction between software and hardware


geordief


I have no coding skills (apart from HTML) and so no expertise in computing at all.

But I have often tried to grapple in my mind with the relationship between the concepts of software and hardware.

In reality, do these phenomena overlap, or is there a strict demarcation?

I can see they have to interact or nothing will happen, so is there a region that is composed of both elements?

Any software has to be written onto a physical component. Is that physical component (e.g. a hard drive) considered part of the hardware?

Is everything physical inside the hard drive of a computer part of the hardware?

Where does the software start?

Inside the mind of the coder? (But the brain is physical too.)

Is this a pointless question, just something to scratch one's head about when the practicalities are all that really matter?


Hardware is the physical side, including your hard-drive example. Software is the firmware, the OS, the programs you know and love, etc. Firmware is where the physical and the software connect, so to speak. Without it, your computer is a glorified paperweight.

 

https://www.fortinet.com/resources/cyberglossary/what-is-firmware

Edited by Steve81

One of my university professors used to say, "I can be a software engineer without ever stepping into the computer room." (It was a time when computers were mainframes.)

13 minutes ago, geordief said:

In reality, do these phenomena overlap, or is there a strict demarcation?

Memory is hardware. Its content is software.


5 minutes ago, Genady said:

One of my university professors used to say, "I can be a software engineer without ever stepping into the computer room." (It was a time when computers were mainframes.)

But could he be a software engineer (at that level) if the hardware had not already been built?

Is there not a symbiotic, two-sides-of-the-same-coin relationship?

At what stage in the history of computing (or machines?) could it be said that both hardware and software could be used to describe aspects of things?

Edited by geordief

1 minute ago, geordief said:

But could he be a software engineer (at that level) if the hardware had not already been built?

Yes, certainly.

 

1 minute ago, geordief said:

Is there not a symbiotic, two-sides-of-the-same-coin relationship?

This question, I don't understand. Sorry.

3 minutes ago, geordief said:

At what stage in the history of computing (or machines?) could it be said that both hardware and software could be used to describe aspects of things?

Never, if I understand the question correctly.


Just now, Genady said:

Yes, certainly

He could have been a software engineer at a time when Philip of Macedon was waging wars around the globe?

2 minutes ago, Genady said:

This question, I don't understand. Sorry

Is it chicken and egg? Can you have one without the other?


5 minutes ago, Genady said:

No. The principles of computing had to be invented first.

If Philip laid the plans on paper for his entry into India, could his soldiers' arms be viewed as hardware, and his plans on paper or in his head as "software" (as a crude concept)?

Aren't modern day computers sophisticated descendants of mental plans and concrete applications?

Edited by geordief

1 minute ago, geordief said:

If Philip laid the plans on paper for his entry into India, could his soldiers' arms be viewed as hardware, and his plans on paper or in his head as "software" (as a crude concept)?

Are there rules for using metaphors?

3 minutes ago, geordief said:

Aren't modern day computers sophisticated descendants of mental plans and concrete applications?

More metaphors?


Just now, Genady said:

Yes, I do.

Well, I am wondering whether the concepts of software and hardware can usefully be applied to human activities in the times before what we would call computers were invented.

Was, e.g., the abacus a primitive form of computing?

Did it have a software component?

Are there any other activities that could be described conceptually in the same way?


14 minutes ago, geordief said:

Was, e.g., the abacus a primitive form of computing?

No.

 

14 minutes ago, geordief said:

Did it have a software component?

No.

 

14 minutes ago, geordief said:

Are there any other activities that could be described conceptually in the same way?

Yes: the Turing machine. The tape with cells, the head, the register -- hardware. The set of instructions -- software.
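That split can be made concrete with a toy simulator, say in Python. This is only an illustrative sketch: the bit-flipping machine and its transition table below are invented for the example, not any standard named machine.

```python
# Toy Turing machine. The tape, the head position, and the state register
# play the role of "hardware"; the transition table (the instructions)
# plays the role of "software" and can be swapped out freely.

def run(tape, program, state="start"):
    head = 0
    while state != "halt":
        symbol = tape[head]
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return tape

# "Software": flip 0 and 1, moving right; halt on the blank marker "_".
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(list("1011_"), flip_bits))  # ['0', '1', '0', '0', '_']
```

The same `run` function (the same "hardware") executes any other transition table you hand it, which is the sense in which the instructions are a separate layer.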

Edited by Genady

This might be helpful regarding the history/development of modern computing:

https://en.m.wikipedia.org/wiki/History_of_general-purpose_CPUs

 

And some other basics:

You can see how massively complex a modern CPU is here:

https://en.m.wikipedia.org/wiki/Transistor_count

And a little comp sci 101:

https://www.geeksforgeeks.org/difference-between-half-adder-and-full-adder/amp/
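The half-adder/full-adder distinction in that last link can be sketched in a few lines of Python, modelling the gates with bitwise operators (a sketch for illustration only, not how real silicon is described):

```python
# Half adder: adds two bits, producing a sum bit and a carry-out bit.
def half_adder(a, b):
    return a ^ b, a & b          # (sum, carry)

# Full adder: adds two bits plus a carry-in; built from two half adders.
def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2           # (sum, carry)

# Chaining full adders bit by bit gives a ripple-carry adder for integers.
def ripple_add(x, y, bits=8):
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(23, 42))  # 65
```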

Edited by Steve81

8 hours ago, geordief said:

I have no coding skills (apart from HTML) and so no expertise in computing at all.

But I have often tried to grapple in my mind with the relationship between the concepts of software and hardware. [...]

OK, so as always, the answer to your basic question can be as simple or as complicated as you wish.

At the very simplest, yes, there is complete demarcation between hardware and software.
However, that view will not get you very far.

 

The most common model is usually called the layer model.
This comes in various amounts of complexity, i.e. more or fewer layers.

The layers are usually portrayed in a vertical hierarchy.

The simplest is the two-layer model, with 'hardware' at the bottom and 'software' at the top.

 

The all-important idea is that there are predefined exchange parameters which are passed from one layer to the next.

All communication between layers is by way of these parameters.
Layers do not have direct communication with or control of other layers; they can only communicate via these parameter channels with the layer immediately above/below.

The reason for this is that the 'software layer' does not need to 'know' what is actually in the hardware layer.
It could be switches, transistors, light bulbs, charged/discharged capacitors, or something not electrical at all.
My flatmate at university did his final-year project building a fluid-logic hardware layer.
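A minimal sketch of that two-layer idea in Python: the `DictHardware` class and its `read`/`write` interface are invented for this example; the point is only that the software layer sees nothing but the agreed exchange parameters.

```python
# Two-layer sketch: the software layer talks to the hardware layer only
# through a fixed interface (read/write a named cell). It never knows
# whether the storage underneath is transistors, relays, fluid logic,
# or, as here, a plain dictionary.

class DictHardware:
    """One possible 'hardware' layer; anything with read/write would do."""
    def __init__(self):
        self._cells = {}

    def write(self, addr, value):
        self._cells[addr] = value

    def read(self, addr):
        return self._cells.get(addr, 0)

def software_layer(hw):
    # The software layer uses only read/write, the agreed parameter channel.
    hw.write("x", 2)
    hw.write("y", 3)
    return hw.read("x") * hw.read("y")

print(software_layer(DictHardware()))  # 6
```

Swapping `DictHardware` for any other class with the same two methods leaves `software_layer` untouched, which is exactly the demarcation the layer model buys you.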

 

One more generalisation, and that will be enough for now.

All this is of no use without the traditional 'black box' theory.
You must have a means of input and output; otherwise you have a system that just sits there and contemplates its own navel.
So I favour adding a horizontal direction to the model to account for input and output.

 

If you find this worth pursuing, we can use the models, building up complexity, to look at your other questions about interactions and the nature of input and output.


2 hours ago, studiot said:

OK, so as always, the answer to your basic question can be as simple or as complicated as you wish.

At the very simplest, yes, there is complete demarcation between hardware and software. However, that view will not get you very far. [...]

As always, worth pursuing, but with the likelihood that my brain will not be able to cope with the complexity of the subject.

My mind tends to be attracted to broad generalities (an excuse for lazy thinking?) rather than hard analysis. (I actually posted this in Philosophy before it was reallocated here.)

 

Still, I seem to follow your description so far. The input seems to be a bit like the drum in my washing machine, providing dynamism to something of a comfortable, academic arrangement between the software and hardware layers. (Maybe the loose and inconsequential analogy would even extend to calling the water the software and the dirty clothes the hardware.)

 

One thing that occurs to me is that it must be possible to take the output from the system and return it to the input inlet.

Maybe that is common practice?

 

Aside from that, do you agree with @Genady that the concept of software/hardware is not something that could have had any application in the earlier history of our civilization?

 

It only emerged as a meaningful concept when computers were actually developed? (I think I have read that the Jacquard loom was the earliest precursor of modern-day computers.)

 


12 hours ago, geordief said:

Was, e.g., the abacus a primitive form of computing?

The abacus by itself was not a computer. But the original definition of "computer" was a person who did computations, and one could use an abacus to do that. So using an abacus is a primitive form of computing.


Quote

The distinction between software and hardware

Hardware is created at the factory. It is extremely hard to copy, and a copy is never identical (different physical particles are used).

 

Software is a special case of data: data that is understandable to a processor (compiled into machine code) or to a language interpreter (an interpreted text file). It is extremely easy to make an unlimited number of identical copies.

Examples of compiled languages: C, C++.

Examples of interpreted languages: JavaScript, PHP, Perl, Python, Bash, BASIC.

Examples of mixed compilation/interpretation (i.e. interpretation by a virtual machine): Java, C#.
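The "software is data" point can be seen directly in an interpreted language such as Python, where the compile and execute steps are exposed as built-ins:

```python
# A program exists first as a plain text string (data), is compiled to
# Python bytecode, and only then executed. Every stage is just bytes that
# can be copied without loss.

source = "result = 6 * 7"          # software stored as text, i.e. copyable data
code_obj = compile(source, "<string>", "exec")   # compilation step
namespace = {}
exec(code_obj, namespace)          # interpretation/execution step
print(namespace["result"])         # 42

# The bytecode itself is also just data: recompiling the same source
# yields byte-for-byte identical code.
print(code_obj.co_code == compile(source, "<string>", "exec").co_code)  # True
```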

 

Software communicates with hardware through drivers. These are OS-specific, OS-compliant small programs that read and write hardware registers. Generic drivers are provided by the developer of the operating system (e.g. Microsoft); they are usually very limited and/or slow. The proper drivers are provided by the hardware manufacturers' engineers.

Low-level software can ignore drivers and access hardware directly. This was especially true in the 1980s-90s, on 8-bit computers with primitive operating systems.
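The register-driver idea can be caricatured in Python. Real drivers are kernel code written against memory-mapped I/O; the register file and the `LED_CTRL` address below are entirely made up for illustration.

```python
# Sketch of the driver idea: hardware exposes numbered registers; a driver
# hides the raw register reads/writes behind a friendly interface, so
# application software never touches the register numbers directly.

REGISTERS = bytearray(256)        # pretend memory-mapped register file

LED_CTRL = 0x10                   # hypothetical control-register address

class LedDriver:
    def on(self):
        REGISTERS[LED_CTRL] = 1   # write the hardware register

    def off(self):
        REGISTERS[LED_CTRL] = 0

    def is_on(self):
        return REGISTERS[LED_CTRL] == 1   # read it back

led = LedDriver()
led.on()
print(led.is_on())  # True
```

"Low-level software ignoring the driver" would amount to poking `REGISTERS[0x10]` directly, which is exactly what 8-bit-era programs did with real addresses.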


I want to clarify that I am not very comfortable with my own example of the Turing machine. Calling the instructions there "software" is already a stretch. I think it was von Neumann's idea to have instructions in memory like other data. That was the birth of software.
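The stored-program idea, instructions sitting in the same memory as data, can be shown with a toy machine (a hypothetical two-instruction design, invented purely for this sketch):

```python
# Toy stored-program machine: memory is one flat list. The first cells
# happen to hold instructions and the later ones hold data; nothing but
# convention separates the two, which was von Neumann's point.

def run(memory):
    pc = 0                               # program counter
    while True:
        op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
        if op == "ADD":                  # memory[a] += memory[b]
            memory[a] += memory[b]
        elif op == "HALT":
            return memory
        pc += 3

# Cells 0-5 are the program; cells 6-7 are the data it operates on.
mem = ["ADD", 6, 7,      # add cell 7 into cell 6
       "HALT", 0, 0,
       40, 2]            # data
print(run(mem)[6])       # 42
```

Because instructions are ordinary memory cells, a program here could even overwrite its own instructions, something a fixed-instruction Turing machine cannot do.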

