Information theory and entropy


The concept seems silly. Do I lose entropy when I learn something and gain entropy when I forget it?

 

Does something exist if it can't be measured, counted, weighed, timed . . . ?

 

Isn't it a problem of systems and system boundaries? Only someone/something outside the system can make the measurement? To say a system has an increase or decrease of entropy, one must be outside the system in question. So if I say, "I learned something . . ." am I confusing information and knowledge? Logically, do we consider our minds as some THING existing outside the observable universe?

 

There is an old saying that you don't know something unless you can teach it to someone else. If you can't extract information from me . . . compare that with the concept of black holes subtracting information from the universe when they consume organized material. Is this not like comparing apples and oranges?

 

When Shannon wrote about information wasn't it information as data and not information as ideas?

 



Do I lose entropy when I learn something and gain entropy when I forget it?

Assuming the brain uses some kind of chemical structure to store memory (I'm not sure how memory works), then learning would involve a decrease in entropy as the chemicals become more ordered/structured. Forgetting something would not mean a gain in entropy, as it only means your brain can no longer access that memory. The memory itself is never "gone," only inaccessible.

 

Does something exist if it can't be measured, counted, weighed, timed . . . ?

Yes. For example, emotion, instinct, habits, etc. can't be "measured" as such, but they still exist.


Hello Jaden, please don't take offence at this, but since you say you don't know how memory works, how can you then go on to offer an explanation of entropy changes due to memory changes?

 

Billwald was correct in saying that Shannon 'information' refers to data not ideas.

However, he was incorrect in saying that entropy can never be measured (registered) from within a system.

 

FYI (pun intended), the link between classical entropy and information arises because, for any system that may only exist in certain 'states', we may store Shannon information (data) by causing that system to take up one of those states.

 

For a simple example, a switch may be on or off: two states, which is enough to store one bit of data.
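To make the counting concrete, here is a minimal Python sketch (my own illustration, not part of any standard library) of how many bits a system with a given number of equally likely states can store:

[code]
import math

def storable_bits(num_states: int) -> float:
    """Shannon information (in bits) that can be stored by putting a
    system into one of num_states equally likely states."""
    return math.log2(num_states)

print(storable_bits(2))  # an on/off switch stores 1.0 bit
print(storable_bits(8))  # an 8-position dial stores 3.0 bits
[/code]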

 

The link between this and classical entropy was stated by Boltzmann in 1896 and is

 

 

[math]S = k{\log _e}(w)[/math]

 

in conventional notation, where S is the entropy, w is the number of possible states, and k is Boltzmann's constant.
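For a feel of the numbers, here is a small Python sketch of the formula above (assuming the standard SI value of Boltzmann's constant); it also shows that each stored bit corresponds to k ln(2) of classical entropy:

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(w: int) -> float:
    """S = k * ln(w) for a system with w equally probable states."""
    return k_B * math.log(w)

# A two-state switch: one bit's worth of states
S_switch = boltzmann_entropy(2)
print(S_switch)                        # ~9.57e-24 J/K, i.e. k*ln(2)
print(S_switch / (k_B * math.log(2)))  # 1.0 "bit equivalent"
[/code]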

 

Current theory holds that memory and thought activity are conducted by 'synapses', which are biochemical switches in the nervous system (including the brain) that transfer information between neurons.

 

Here are two good links to neurons and synapses

 

http://faculty.washington.edu/chudler/cells.html

 

http://faculty.washi...er/synapse.html


The concept seems silly. Do I lose entropy when I learn something and gain entropy when I forget it?

 


When Shannon wrote about information wasn't it information as data and not information as ideas?

 

Your body has thermodynamic entropy. It is generated by dissipative processes in your body, such as chemical reactions. There is also a flow of entropy due to heat and mass exchanges with the surroundings (e.g., when you eat, your body gains the entropy contained in the food). For mature bodies, the thermodynamic entropy is approximately constant.

 

Thermodynamic entropy and Shannon informational entropy are two different beasts. The former is physical (entropy is a physical quantity), the latter is not.
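To illustrate the difference, here is a short Python sketch (my own example, not from the references above) computing Shannon entropy: it comes out in bits, a dimensionless count, whereas thermodynamic entropy carries physical units of J/K.

[code]
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits (dimensionless)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is worth 1 bit; a heavily biased coin carries less information.
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy_bits([0.9, 0.1]))  # ~0.47 bits
[/code]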
