thoughtfuhk

Consciousness causes higher entropy compared to unconscious states in the human brain


 
Anyway, the paper from the new article above says that the deeper the mind is in sleep, the lower the entropy.
 
The converse is also true: the more awake the mind is, the higher the information content and the number of neuronal interactions, and therefore the higher the values of entropy.
 
The paper uses the Stirling approximation to compute a measure of entropy from the number of configurations of the "macrostate" [math]C[/math]:
 
[math]S = N \ln\left(\frac{N}{N-p}\right) - p \ln\left(\frac{p}{N-p}\right) \equiv \ln C[/math] (Figure 1: Stirling approximation applied to human EEG data)
 
  • I think it is fair to imagine that [math]C \in \{X\}[/math], where [math]C[/math] represents an ensemble or macrostate sequence over some distribution of entropy in human neuronal terms, as described by Mateos et al. in the new article/paper above, while [math]\{X\}[/math] (with respect to equation 4 by Dr. Alex Wissner-Gross, from my earlier thread) describes some macrostate partition that reasonably encompasses constrained path capability and permits entropy maximization.
 
  • Finally, beyond the scope of humans (as indicated by [math]C[/math]), one may additionally consider some measure of [math]\{X\}[/math] that subsumes higher degrees of entropy. (For example, an Artificial General Intelligence will likely have more brain power than humans, and hence a higher measure of [math]\{X\}[/math] than humans.)
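As a quick sanity check on the Stirling form above, here is a short Python sketch (my own illustration, not from the paper) comparing it against the exact log-binomial [math]\ln C[/math] computed with `math.lgamma`; 10296 and 2000 are the example figures the authors use for possible and connected pairs:

```python
import math

def ln_C_exact(N: int, p: int) -> float:
    """Exact ln C = ln( N! / (p! (N-p)!) ) via log-gamma, avoiding overflow."""
    return math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)

def ln_C_stirling(N: int, p: int) -> float:
    """The paper's Stirling form: S = N ln(N/(N-p)) - p ln(p/(N-p))."""
    return N * math.log(N / (N - p)) - p * math.log(p / (N - p))

N, p = 10296, 2000  # possible pairwise connections, connected pairs
exact, approx = ln_C_exact(N, p), ln_C_stirling(N, p)
print(exact, approx)  # the two agree to within a fraction of a percent
```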

3 hours ago, thoughtfuhk said:

My earlier thread regarding human purpose, entropy and artificial general intelligence may be a good way to help explain the paper above. (See the earlier thread)

It seems unlikely. Your idea (as expressed in the title of this thread) is the almost exact opposite of the paper.

They say: increasing entropy may cause consciousness

You say: consciousness causes increasing entropy

(I say: correlation is not causation.)

You could also learn from the cautious language in that article. They are scientists doing research based on real data, and yet they use terms like "could be", "suggests", "might be", "a good starting point", "needs to be replicated", "hints at".

Compare that to your(*) aggressively confident, assertive and insulting tone. Perhaps you could learn something from the thoughtful approach in the article you link.

(*) Random Guy On The Internet with little support for your ideas beyond a few cherry-picked articles.

7 minutes ago, Strange said:

It seems unlikely. Your idea (as expressed in the title of this thread) is the almost exact opposite of the paper.

They say: increasing entropy may cause consciousness

You say: consciousness causes increasing entropy

(I say: correlation is not causation.)

You could also learn from the cautious language in that article. They are scientists doing research based on real data, and yet they use terms like "could be", "suggests", "might be", "a good starting point", "needs to be replicated", "hints at".

Compare that to your(*) aggressively confident, assertive and insulting tone. Perhaps you could learn something from the thoughtful approach in the article you link.

(*) Random Guy On The Internet with little support for your ideas beyond a few cherry-picked articles.

I find the cherry picking less problematic than your point that thoughtfuhk seems to misunderstand the research.


It is a great shame that two different branches of science should employ the same word for quite different meanings, just because the numerical part has the same mathematical form.

Readers should be aware of which type of entropy is intended.

7 hours ago, Strange said:

It seems unlikely. Your idea (as expressed in the title of this thread) is the almost exact opposite of the paper.

They say: increasing entropy may cause consciousness

You say: consciousness causes increasing entropy

(I say: correlation is not causation.)

You could also learn from the cautious language in that article. They are scientists doing research based on real data, and yet they use terms like "could be", "suggests", "might be", "a good starting point", "needs to be replicated", "hints at".

Compare that to your(*) aggressively confident, assertive and insulting tone. Perhaps you could learn something from the thoughtful approach in the article you link.

(*) Random Guy On The Internet with little support for your ideas beyond a few cherry-picked articles.

 

1. They didn't say that "entropy may cause consciousness". In fact the word "cause" can't be found in the paper!


1.b) Reference-A, showing what they say, contrary to your claim "Strange": They say that the "the maximisation of the configurations" in the brain is what yields higher values of entropy.

1.c) Reference-B, showing what they say, contrary to your claim "Strange": "In our view, consciousness can be considered as an emergent property of the organization of the (embodied) nervous system, especially a consequence of the most probable distribution that maximizes information content of brain functional networks".

1.d) Reference-C, showing what they say, contrary to your claim "Strange": "Hence, the macrostate with higher entropy (see scheme in Fig. 4) we have defined, composed of many microstates (the possible combinations of connections between diverse networks, our C variable defined in Results), can be thought of as an ensemble characterised by the largest number of configurations."

2.) You're correct about the confidence seen in the old purpose thread; in fact, I had updated my article elsewhere to reflect that Alex Wissner-Gross' work was hypothesis-level. (I even called it a hypothesis several times in the earlier OP, although you seem to have missed that!)

3.) Reference-D, showing that I updated the tone before your advice.

(It looks like duplicate threads were created, but you chose to respond to the one without the word "hypothesis" added.)

 

7 hours ago, EdEarl said:

I find the cherry picking less problematic than your point that thoughtfuhk seems to misunderstand the research.

I find it odd that you simply took "Strange's" words as valid, without thinking about the problem yourself.

Strange is shown to be wrong above.

6 hours ago, studiot said:

It is a great shame that two different branches of science should employ the same word for quite different meanings, just because the numerical part has the same mathematical form.

Readers should be aware of which type of entropy is intended.

I don't detect the relevance of your quote above. 

Could it have stemmed from your haphazard support of Strange's false claim above?

Edited by thoughtfuhk

10 minutes ago, thoughtfuhk said:

 

 

I find it odd that you simply took "Strange's" words as valid, without thinking about the problem yourself.

Strange is shown to be wrong above.

I don't detect the relevance of your quote above. 

Could it have stemmed from your haphazard support of Strange's false claim above?

 

If you wish to mention my post, then address my words.

Don't quote another responder at me.

5 minutes ago, studiot said:

 

If you wish to mention my post, then address my words.

Don't quote another responder at me.

I apologize.


Those profile pictures looked quite similar for a moment.

How do the types of entropy supposedly undermine the study's validity?

Edited by thoughtfuhk


Since you wished to preach about the subject, surely you understand what entropy is and the difference between Shannon Entropy, Classical Thermodynamic Entropy and Statistical Mechanical Entropy?

2 minutes ago, studiot said:

Since you wished to preach about the subject, surely you understand what entropy is and the difference between Shannon Entropy, Classical Thermodynamic Entropy and Statistical Mechanical Entropy?

 

  • Shannon entropy does not prevent measuring the difference between conscious and unconscious states. (As indicated by the authors, the Stirling approximation was used to circumvent the enormously large values arising in the EEG computations. It is typical in programming to use approximations or to compress the input space!)

 

  • Reference-A, quote from the paper describing the compression rationale: "However, the estimation of C (the combinations of connections between diverse signals), is not feasible due to the large number of sensors; for example, for 144 sensors, the total possible number of pairwise connections is [math]\binom{144}{2} = 10296[/math]; then if we find in the experiment that, say, 2000 pairs are connected, the computation of [math]\binom{10296}{2000}[/math] has too large numbers for numerical manipulations, as they cannot be represented as conventional floating point values in, for instance, MATLAB. To overcome this difficulty, we used the well-known Stirling approximation for large n: [math]\ln(n!) \approx n \ln(n) - n[/math]".

 

 

My question to you is: 

  • Why do you personally feel (contrary to the evidence) that the Shannon entropy measure supposedly prevents measuring conscious vs. unconscious states in the brain? And don't you realize that it is typical in programming to encode approximations of dense input spaces?
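The numerical difficulty the authors describe is easy to reproduce. A minimal Python sketch (my own; Python's arbitrary-precision integers can hold the raw binomial coefficient, but converting it to a conventional float overflows, exactly as the paper says happens with MATLAB's floating point):

```python
import math

N_pairs, connected = 10296, 2000  # the paper's example figures

# The raw count of microstates C is an integer with over 2000 digits.
C = math.comb(N_pairs, connected)

# As a conventional double-precision float it overflows (max ~1.8e308):
try:
    float(C)
except OverflowError:
    print("C is too large for a float")

# Working in log space sidesteps the problem: S = ln C is an ordinary float.
ln_C = (math.lgamma(N_pairs + 1) - math.lgamma(connected + 1)
        - math.lgamma(N_pairs - connected + 1))
print(ln_C)
```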

3 minutes ago, thoughtfuhk said:

 

  • Shannon entropy does not prevent measuring the difference between conscious and unconscious states. (As indicated by the authors, the Stirling approximation was used to circumvent the enormously large values arising in the EEG computations. It is typical in programming to use approximations or to compress the input space!)

 

  • Reference-A, quote from the paper describing the compression rationale: "However, the estimation of C (the combinations of connections between diverse signals), is not feasible due to the large number of sensors; for example, for 144 sensors, the total possible number of pairwise connections is [math]\binom{144}{2} = 10296[/math]; then if we find in the experiment that, say, 2000 pairs are connected, the computation of [math]\binom{10296}{2000}[/math] has too large numbers for numerical manipulations, as they cannot be represented as conventional floating point values in, for instance, MATLAB. To overcome this difficulty, we used the well-known Stirling approximation for large n: [math]\ln(n!) \approx n \ln(n) - n[/math]".

 

 

My question to you is: 

  • Why do you personally feel (contrary to the evidence) that the Shannon entropy measure supposedly prevents measuring conscious vs. unconscious states in the brain? And don't you realize that it is typical in programming to encode approximations of dense input spaces?

 

I don't feel anything about the use of Shannon Entropy (here) one way or the other, and nothing I have posted could remotely lead to that conclusion.

I have not read your papers, since you have not provided a sufficient summary of what is going on, in contravention of the rules of this forum.

Can you not provide a succinct summary of your point which I have yet to divine?

7 minutes ago, studiot said:

I don't feel anything about the use of Shannon Entropy (here) one way or the other, and nothing I have posted could remotely lead to that conclusion.

  • So why did you ask the "Shannon Entropy, Classical Thermodynamic Entropy and Statistical Mechanical Entropy" question earlier?

 

  • Do you see now how "Shannon Entropy" entered into my earlier response to your "Shannon Entropy" related query, especially given that "Shannon Entropy" occurs in the paper in question?

 

  • What was your point in mentioning Shannon Entropy in the beginning?
  • What was your point in asking about the difference between entropy types?
  • How did your question relate to the validity of the paper by Mateos et al in the OP? (or the validity of the OP itself?)
Edited by thoughtfuhk


The first use of Shannon in this thread occurs in my post here:

22 minutes ago, studiot said:

Since you wished to preach about the subject, surely you understand what entropy is and the difference between Shannon Entropy, Classical Thermodynamic Entropy and Statistical Mechanical Entropy?

 

I said that you should lay out your stall properly, and warned others that there is more than one definition of the crucial word 'entropy'.

 

I think you are saying that there is experimental evidence that consciousness is emergent from the complexity of the human neurological system, but I am not sure enough to comment.

 

The simplest emergent phenomenon (non sentient) I can think of is the action of the arch.

52 minutes ago, studiot said:

I said that you should lay out your stall properly and warned others that there is more than one definition of the crucial word 'entropy'

I don't detect why I should do that, especially when:

  • Doing that wouldn't change the reality that the Shannon entropy formalism can reasonably measure the difference in entropy between conscious and unconscious states.
  • There is already an indication of the types of entropy: the paper in question mentions "Gibbs" and "Shannon" entropy in the Results section. (You clearly missed that, based on your inquiry.)
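On the first bullet: in the paper's formalism the entropy [math]S = \ln C[/math] is exactly [math]N[/math] times the binary Shannon entropy (in nats) of the fraction of connected pairs, so it peaks when half of the possible pairs are connected. A quick Python illustration of that curve, using the Stirling form (my own sketch, with the paper's example value of N):

```python
import math

def S(N: int, p: int) -> float:
    """Stirling-form entropy S = N ln(N/(N-p)) - p ln(p/(N-p)) = ln C."""
    return N * math.log(N / (N - p)) - p * math.log(p / (N - p))

N = 10296  # possible pairwise connections in the paper's example
values = {p: S(N, p) for p in (1000, 3000, N // 2, 7000, 9000)}
peak = max(values, key=values.get)
print(peak)  # the sampled maximum sits at p = N // 2: half the pairs connected
```

This is the "curve" the paper refers to when it says wakeful states sit near the maximum while deeper sleep stages sit farther from it.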
Edited by thoughtfuhk

34 minutes ago, thoughtfuhk said:

I don't detect why I should do that, especially when:

Because it conforms to both the rules of this forum and good practice, and encourages helpful discussion.

2 hours ago, studiot said:

Because it conforms to both the rules of this forum and good practice, and encourages helpful discussion.

Please read my prior response carefully.

4 hours ago, thoughtfuhk said:

I find it odd that you simply took "Strange's" words as valid, without thinking about the problem yourself.

Strange is shown to be wrong above.

I didn't just take Strange's word. I may be mistaken. Your title for this thread seems misleading to me, even if the researchers didn't make any mistakes. Moreover, I don't believe that consciousness is a side effect of entropy. Here is a report of a conscious robot.

3 hours ago, EdEarl said:

I didn't just take Strange's word. I may be mistaken. Your title for this thread seems misleading to me, even if the researchers didn't make any mistakes. Moreover, I don't believe that consciousness is a side effect of entropy. Here is a report of a conscious robot.

  • Thread title: "Consciousness causes higher entropy compared to unconscious states in the human brain".
  • The title for this thread aligns with the researchers' words: "We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values".
  • And for several years now, I'd been aware of that Nao robot "self-awareness" test, which uses deontic calculus as far as I recall.

Wrt your "side effect" remark, see the paper by Alex Wissner-Gross listed in the OP to help you understand why the Science Alert article (also listed in the OP) used the words "side-effect".

15 hours ago, Strange said:

It seems unlikely. Your idea (as expressed in the title of this thread) is the almost exact opposite of the paper.

  1. They say: increasing entropy may cause consciousness
  2. You say: consciousness causes increasing entropy

(I say: correlation is not causation.)

A recent interaction with EdEarl got me thinking about some errors I may have made earlier. (Where I claimed the paper said the opposite of your first statement above.)


Perhaps the things you say above are valid. (except the correlation remark and your opening statement)
 

  • Your first statement, "increasing entropy may cause consciousness" (i.e. the process of maximizing entropy may yield more and more intelligence, as explored in the "purpose" thread and the paper by Mateos et al.), may be seen as somewhat equivalent to your second statement, "consciousness causes increasing entropy" (i.e. maximizing neuronal interactions may yield higher values of entropy), which is likewise explored in the "purpose" thread and the paper by Mateos et al.

 

  • This is reasonably because (as per the research) maximizing entropy may be observed to yield increases in intelligence, and eventually the emergence of consciousness (roughly your first statement); and also, when consciousness emerges, higher values of entropy may be observed (roughly your second statement).

(Note: simpler systems may maximize entropy too, although not to the degree of conscious entities.)

 

  • In retrospect, this probably means that the phrase "only consciousness causes increasing entropy" may be observed as invalid as far as the papers show (see note above), although nobody on the forum has made this claim!
Edited by thoughtfuhk

9 hours ago, thoughtfuhk said:
  • Thread title: "Consciousness causes higher entropy compared to unconscious states in the human brain".
  • The title for this thread aligns with the researchers' words: "We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values".
  • And for several years now, I'd been aware of that Nao robot "self-awareness" test, which uses deontic calculus as far as I recall.

Consciousness results in greater brain activity than unconsciousness. In fact, unconsciousness can be caused by low blood flow to the brain. My understanding is that there is greater blood flow in the brain during consciousness than during unconsciousness, which means the brain is using more energy during consciousness than when it is unconscious. According to Wikipedia's entropy article, entropy is inversely proportional to temperature. Thus, higher entropy would indicate an unconscious brain.

Shannon entropy is supposed to be analogous to thermodynamic entropy. Thus, your title seems incorrect; thus, it confuses me.

19 hours ago, thoughtfuhk said:

1. They didn't say that "entropy may cause consciousness". In fact the word "cause" can't be found in the paper!

True. I had only read the article you linked to, not the paper. So it is just another example of slightly exaggerated / sensationalised reporting. 

The paper just shows a correlation; they do not seem to claim cause and effect, either way. They are more concerned with the structure and organisation of the brain and its functions.

They certainly don't say that this is the "purpose" of consciousness. And they certainly don't extend it to increasing entropy outside of the brain (which you appeared to be doing, previously).

BTW: very interesting paper. Thanks for bringing it to our attention.

19 hours ago, thoughtfuhk said:

(It looks like duplicate threads were created, but you chose to respond to the one without the word "hypothesis" added.)

If they were duplicate threads, then it wouldn't matter which I responded to. I didn't "choose" one (and certainly not on the basis of which words it had or didn't have in it).

 

6 hours ago, EdEarl said:

Consciousness results in greater brain activity than unconsciousness. In fact, unconsciousness can be caused by low blood flow to the brain. My understanding is that there is greater blood flow in the brain during consciousness than during unconsciousness, which means the brain is using more energy during consciousness than when it is unconscious. According to Wikipedia's entropy article, entropy is inversely proportional to temperature. Thus, higher entropy would indicate an unconscious brain.

Shannon entropy is supposed to be analogous to thermodynamic entropy. Thus, your title seems incorrect; thus, it confuses me.

Remember, with more work, you've gotta pay out that entropy somehow.

This is a basic thing in entropy, as seen on the very Wikipedia page you cite.

4 hours ago, Strange said:

They certainly don't say that this is the "purpose" of consciousness. And they certainly don't extend it to increasing entropy outside of the brain (which you appeared to be doing, previously).

No, they certainly didn't, and that's where my hypothesis came in.

17 hours ago, EdEarl said:

Thus, higher entropy would indicate an unconscious brain.

From the same section you cited on your Wikipedia/entropy link: "Entropy is a measure of thermal energy per unit temperature that is not available for useful work."

Okay, I actually just now noticed you claimed that I misunderstood the research, although I had long ago responded to a message including your claim about my supposed misunderstanding.

Here's a bit more on why your quote above is reasonably false, and why the OP is likely valid, at least contrary to your criticism. (Ironically, reasonably due to your misunderstanding!)

  1. When the brain is unconscious, data shows that less "work" is being done.
  2. When the brain is conscious, data shows that more "work" is being done.
  3. As the very section on the very page you cited reveals:
    1. Entropy may be seen as a measure of disorder, or a measure of the inherent loss of usable heat.
    2. The more "work" the brain does, the less heat or energy is available for further work.
    3. This is why the farther the brain gets from sleep, and the closer it approaches a conscious state, the higher the value of entropy, i.e. the greater the loss of heat or energy usable for "work" or consciousness.
    4. Think of entropy like a currency: up to a point, the more work you are permitted to put in, the more entropy you have to "pay up" to your surroundings, in the form of unusable heat or energy. This reasonably means that in the unconscious state you have less unusable energy at your disposal (more usable energy, lower entropy), compared to conscious states, where you have used up your energy and so more unusable heat or energy exists (i.e. higher values of entropy).
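The "currency" picture in point 4 corresponds to the Clausius relation [math]\Delta S = Q/T[/math]: heat Q dissipated to surroundings at temperature T raises the surroundings' entropy. A toy Python sketch (my own illustration; the 20 W figure is a rough, commonly cited ballpark for the brain's power dissipation, not a number from the paper):

```python
# Clausius relation: entropy exported to the surroundings is dS = dQ / T.
# Toy ballpark figures (assumptions, not from the paper): the brain
# dissipates roughly 20 W as heat at a body temperature of about 310 K.
BRAIN_POWER_W = 20.0
BODY_TEMP_K = 310.0

def entropy_paid(seconds: float, power: float = BRAIN_POWER_W,
                 temp: float = BODY_TEMP_K) -> float:
    """Entropy (J/K) 'paid' to the surroundings as dissipated heat Q = P*t."""
    return power * seconds / temp

print(entropy_paid(1.0))     # ~0.065 J/K per second of dissipation
print(entropy_paid(3600.0))  # correspondingly more over an hour
```

On these rough numbers, a more active (conscious) brain dissipates more heat per unit time and therefore exports entropy to its surroundings faster, which is the direction of the argument above.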

 

A statement from the paper: "We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values".

Another statement from the paper: "Note how during wakefulness the entropy is closer to the maximum of the curve, whereas the deeper the sleep stage, the more distant from the maximum the values are".

So, the paper I originally cited, along with what I underlined in the OP, is reasonable, at least contrary to your criticism; it is simply likely that you misunderstand the general idea behind entropy. In other words, your quote above regarding entropy and the unconscious brain is "not even wrong"!

 

Edited by thoughtfuhk


Entropy has no purpose, so why do you insist it does?

13 hours ago, thoughtfuhk said:

From the same section you cited on your Wikipedia/entropy link: "Entropy is a measure of thermal energy per unit temperature that is not available for useful work."

Okay, I actually just now noticed you claimed that I misunderstood the research, although I had long ago responded to a message including your claim about my supposed misunderstanding.

Here's a bit more on why your quote above is reasonably false, and why the OP is likely valid, at least contrary to your criticism. (Ironically, reasonably due to your misunderstanding!)

  1. When the brain is unconscious, data shows that less "work" is being done.
  2. When the brain is conscious, data shows that more "work" is being done.
  3. As the very section on the very page you cited reveals:
    1. Entropy may be seen as a measure of disorder, or a measure of the inherent loss of usable heat.
    2. The more "work" the brain does, the less heat or energy is available for further work.
    3. This is why the farther the brain gets from sleep, and the closer it approaches a conscious state, the higher the value of entropy, i.e. the greater the loss of heat or energy usable for "work" or consciousness.
    4. Think of entropy like a currency: up to a point, the more work you are permitted to put in, the more entropy you have to "pay up" to your surroundings, in the form of unusable heat or energy. This reasonably means that in the unconscious state you have less unusable energy at your disposal (more usable energy, lower entropy), compared to conscious states, where you have used up your energy and so more unusable heat or energy exists (i.e. higher values of entropy).

 

A statement from the paper: "We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values".

Another statement from the paper: "Note how during wakefulness the entropy is closer to the maximum of the curve, whereas the deeper the sleep stage, the more distant from the maximum the values are".

So, the paper I originally cited, along with what I underlined in the OP, is reasonable, at least contrary to your criticism; it is simply likely that you misunderstand the general idea behind entropy. In other words, your quote above regarding entropy and the unconscious brain is "not even wrong"!

 

Sleep is confusing, because sometimes the brain is very active during sleep; and while a sleeper is not conscious, unconscious does not necessarily mean asleep. Let's consider unconscious to mean brain dead or in a coma.

The higher the temperature, the more usable heat energy is available for work. The lower the temperature, the less energy is available for work. Is it untrue that a high-temperature mass produces more [math]\Delta S[/math], but that the mass at high temperature has less entropy than the same mass at low temperature?

I think I figured it out. The more molecules vibrate, the higher the entropy. Also, the more disperse molecules are, the higher the entropy. However, I think of the Sun as having lower entropy than deep space; is that false?

Edited by EdEarl

4 hours ago, dimreepr said:

Entropy has no purpose, so why do you insist it does?

Perhaps you think things need deities in order to have a purpose.

Let me remind you that:

1. Purpose is not constrained to some deity, as far as definitions go. (See the Google definition of "purpose".)

2. Purpose may mean principle, and there are many principles in science, so I merely hypothesize of yet another principle in science in the OP.

3. Reference-A, Wikipedia/laws of science: "The laws of science, scientific laws, or scientific principles ..."

4. Reference-B, purpose/principle synonym.

Edited by thoughtfuhk

