
Why is entropy likened to disorder?


bascule


I recently watched a documentary on Boltzmann, which was very interesting but continually likened entropy to disorder (and "decay").

 

Wouldn't it be more appropriate to liken entropy to the information content of the universe, and to state that the information content of the universe is increasing? In other words, the universe is complexifying.

 

I've heard it argued that some types of complexity, such as the arrangement of molecules in a gas cloud, are not particularly important/useful/meaningful. Also I suppose as the universe reaches a state of extreme old age and approaches absolute zero it is no longer meaningful to liken entropy to complexity.

 

Thoughts?


I think the association of entropy with "disorder" is a bit vague.

 

Entropy is more a measure of the "number of possible states". Roughly, if a state has many configurations (ways to achieve that state), then it is more probable than a state that has fewer configurations.

 

Thus, the higher the entropy, the more ways there are to arrange things, and hence the greater the "disorder".
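As a toy illustration of "more configurations means more probable" (my own sketch, not from the documentary; the coin-flip system and numbers are just an example), count the microstates behind each macrostate of N coin flips:

[code]
from math import comb

N = 20                    # number of coins; a stand-in for a tiny physical system
total_microstates = 2 ** N

for heads in range(N + 1):
    W = comb(N, heads)    # microstates (orderings) consistent with this macrostate
    print(f"{heads:2d} heads: W = {W:6d}, probability = {W / total_microstates:.4f}")

# The "mixed" macrostates near N/2 heads have vastly more microstates than the
# all-heads or all-tails macrostates, so a randomly chosen microstate almost
# certainly looks disordered.
[/code]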

 

In information theory there is a concept of entropy as the measure of information in a message. I believe people have merged the thermodynamic and information entropies to create a measure of "disorder". (Been a while since I looked at information theory.)
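For the information-theory side, here is a minimal sketch (my own; the messages are arbitrary examples) of Shannon entropy estimated from symbol frequencies:

[code]
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol, estimated from the symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits/symbol: perfectly predictable
print(shannon_entropy("abababab"))  # 1.0 bit/symbol: two symbols, equally likely
print(shannon_entropy("abcdefgh"))  # 3.0 bits/symbol: eight symbols, equally likely
[/code]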


Physical Chemistry, by Thomas Engel & Philip Reid, introduces entropy as follows:

 

[math] dS = \frac {\slashed{d} q_{reversible} } {T} [/math]

 

and gives no worded definition anywhere in the text!

 

The closest thing to a worded definition of entropy is encountered in the following statement:

 

"It has not yet been demonstrated that S is a suitable function for measuring 'the natural direction of change in a process' that the system may undergo."

 

I find no reference in the text to entropy being defined as a measure of disorder or decay!
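As a concrete illustration of that defining relation (my own numbers, not taken from Engel & Reid): for ice melting reversibly at its normal melting point, with an enthalpy of fusion of about 6.01 kJ/mol,

[math] \Delta S_{fus} = \frac{q_{reversible}}{T} \approx \frac{6010 \, \mathrm{J \, mol^{-1}}}{273.15 \, \mathrm{K}} \approx 22 \, \mathrm{J \, mol^{-1} \, K^{-1}} [/math]

so the entropy change comes straight out of the heat transferred and the temperature, with no mention of disorder.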

 

I do recall an older text I read stating this, but this new text has a much more developed scope than that one did... :P

Edited by buttacup

Physical Chemistry, by Thomas Engel & Philip Reid, introduces entropy as follows:

 

[math] dS = \frac {\slashed{d} q_{reversible} } {T} [/math]

 

and gives no worded definition anywhere in the text!

 

The program I watched on Boltzmann showed the following equation for entropy, which is carved into Boltzmann's tombstone:

 

[math] S = k \, \log_{e} W[/math]

 

Please do not ask me to describe what this equation means. I will fail.



Interesting tidbits on the "disorder" issue on Wikipedia. It seems the entire idea of entropy-as-disorder comes from Boltzmann:

 

http://en.wikipedia.org/wiki/Ludwig_Boltzmann#The_Second_Law_as_a_law_of_disorder

 

The idea that the second law of thermodynamics or "entropy law" is a law of disorder (or that dynamically ordered states are "infinitely improbable") is due to Boltzmann's view of the second law, in particular his attempt to reduce it to a stochastic collision function, or law of probability following from the random collisions of mechanical particles. Following Maxwell, Boltzmann modeled gas molecules as colliding billiard balls in a box, noting that with each collision nonequilibrium velocity distributions (groups of molecules moving at the same speed and in the same direction) would become increasingly disordered, leading to a final state of macroscopic uniformity and maximum microscopic disorder, or the state of maximum entropy (where the macroscopic uniformity corresponds to the obliteration of all field potentials or gradients).

The second law, he argued, was thus simply the result of the fact that in a world of mechanically colliding particles disordered states are the most probable. Because there are so many more possible disordered states than ordered ones, a system will almost always be found either in the state of maximum disorder – the macrostate with the greatest number of accessible microstates, such as a gas in a box at equilibrium – or moving towards it. A dynamically ordered state, one with molecules moving "at the same speed and in the same direction," Boltzmann concluded, is thus "the most improbable case conceivable...an infinitely improbable configuration of energy."

This view of the second law is still widely disseminated, and indeed often treated as though it were the second law rather than an attempt to reduce it to a stochastic collision function, and this has led to the view that all dynamically ordered states, including life itself, are highly improbable states. But although many popular accounts have yet to catch up, during the last several decades this view of the second law as a law of disorder or dictating the improbability of ordered states has been shown to be false. Simple physical experiments such as the classic Bénard experiment readily falsify this view. "Dynamic order, or autocatakinetics, is seen to arise not infinitely improbably, but with probability one, that is, every time and as soon as it gets the chance." It shows that rather than being improbable, "order production is entirely opportunistic...inexorable...and following from natural law."
Edited by bascule

I recently watched a documentary on Boltzmann, which was very interesting but continually likened entropy to disorder (and "decay").

That's appropriate. It was Boltzmann who (rather reluctantly) did much of the work on developing statistical physics. Boltzmann described entropy as disorder. His tombstone:

 

[Image: Boltzmann's tombstone in Vienna's Zentralfriedhof, with the entropy formula carved into it]

 

Freeman Dyson similarly described entropy as a measure of disorder. I can't find his 1954 paper online, but the relevant parts are quoted in many undergraduate physics texts. For example, http://books.google.com/books?id=_ozWIrrBBxcC&pg=PA19&lpg=PA19

 

 

Wouldn't it be more appropriate to liken entropy to the information content of the universe, and to state that the information content of the universe is increasing?

 

In other words, the universe is complexifying.

 

You are talking about physical entropy defined as [math]S=-k \sum p_i \ln p_i[/math], compared to Shannon's information entropy defined as [math]H=-\sum p_i \log_2 p_i[/math]. Wheeler's "it from bit".
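The two expressions differ only by the constant k and the base of the logarithm, so they measure the same quantity in different units; a minimal sketch (my own, with an arbitrary example distribution) making that explicit:

[code]
from math import log, log2

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(p):
    """Physical entropy S = -k * sum(p ln p), in J/K."""
    return -k_B * sum(pi * log(pi) for pi in p if pi > 0)

def shannon_entropy(p):
    """Information entropy H = -sum(p log2 p), in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
print(gibbs_entropy(p))                   # about 1.7e-23 J/K
print(shannon_entropy(p))                 # 1.75 bits
print(gibbs_entropy(p) / (k_B * log(2)))  # 1.75 again: same quantity, different units
[/code]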

 

The problem with this view is that entropy/disorder does not translate all that well when viewed as information content, and even less so when viewed as complexity.

 

 

I've heard it argued that some types of complexity, such as the arrangement of molecules in a gas cloud, are not particularly important/useful/meaningful.

 

See above.

 

My opinion: Complexity is not the same thing as disorder, or lack thereof. Consider a perfect crystal versus a cloud of ideal gas in equilibrium. The former represents a minimal-entropy (disorder) configuration; the latter, a maximal-entropy configuration. Neither is all that complex. Neither is anything close to chaotic. Complexity is more akin to the concept of chaos. Chaos lives on the borderline between the boringly predictable and utterly random.
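To put a number on the crystal end of that comparison (my own addition): a perfect crystal at absolute zero has a single accessible microstate, so the Boltzmann relation gives

[math] S = k \ln W = k \ln 1 = 0 [/math]

which is essentially the third law of thermodynamics, while the equilibrium gas, with an enormous W, sits at the opposite extreme. Neither extreme is complex.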


Complexity is more akin to the concept of chaos. Chaos lives on the borderline between the boringly predictable and utterly random.

 

I'm sorry if I'm looking at this completely from a computer science perspective, but the amount of data needed to describe the utterly random is significantly higher than what it takes to describe the boringly predictable. The boringly predictable can be described via simple rules from which the structure can be computed, whereas the utterly random cannot be described in any way other than precisely cataloguing the particular configuration the system happens to be in.

 

In that regard, utterly random systems exhibit a higher degree of complexity.
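A minimal sketch of that computer-science intuition (my own, using a general-purpose compressor as a crude stand-in for description length):

[code]
import random
import zlib

predictable = ("ab" * 50000).encode()       # boringly predictable: one short rule
random.seed(0)
chaotic = bytes(random.randrange(256) for _ in range(100000))  # utterly random

print(len(zlib.compress(predictable)))      # a few hundred bytes
print(len(zlib.compress(chaotic)))          # roughly 100000 bytes: incompressible

# The random data cannot be described more briefly than by listing it outright,
# which is the sense in which it carries more information.
[/code]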


Please do not ask me to describe what this equation means. I will fail.

I wrote my previous post, where I included a picture of Boltzmann's tombstone, before you wrote this.

 

This equation pertains to a system in thermodynamic equilibrium. W (or [math]\Omega[/math] in modern notation) is the number of microstates the system can be in that are consistent with a given set of macroscopic observations of the system.

 

Another view of [math]\Omega[/math] is that it represents how little one knows about a system. Consider the sum obtained by rolling a pair of dice versus the specific combinations that lead to a given sum. If you told me that you rolled either snake eyes or box cars I would know exactly what you rolled. Suppose, on the other hand, you told me you rolled a seven. What exactly did you roll? The picture is much less clear. There are six combinations that sum to seven.
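A minimal sketch (my own) tabulating those dice multiplicities, treating k ln W as a toy entropy with k set to 1:

[code]
from collections import Counter
from itertools import product
from math import log

# Count the microstates (ordered die pairs) behind each macrostate (the sum).
W = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in sorted(W):
    print(f"sum {total:2d}: W = {W[total]}, k*ln(W) = {log(W[total]):.2f}")

# Snake eyes (2) and boxcars (12) have W = 1, so "entropy" 0: the macrostate
# pins down the microstate exactly. A seven has W = 6, the maximum: knowing
# the sum tells you the least about the actual roll.
[/code]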


The program I watched on Boltzmann showed the following equation for entropy, which is carved into Boltzmann's tombstone:

 

[math] S = k \, \log_{e} W[/math]

 

Please do not ask me to describe what this equation means. I will fail.



Interesting tidbits on the "disorder" issue on Wikipedia. It seems the entire idea of entropy-as-disorder comes from Boltzmann:

 

http://en.wikipedia.org/wiki/Ludwig_Boltzmann#The_Second_Law_as_a_law_of_disorder

 

[math] S = k \ln W [/math]

 

where [math]S[/math] is the entropy,

 

[math] W = \frac {N!}{\prod_j a_j!} [/math] is the weight of the configuration, i.e. the number of microstates with the given occupation numbers [math]a_j[/math] (and hence proportional to the statistical-mechanical probability of that configuration), and

 

[math]k[/math] is Boltzmann's constant.

 

[math] \Delta S = \int \frac {\slashed{d} q_{reversible} } {T} [/math]

 

This state function is the thermodynamic equivalent of the above Boltzmann relation and is needed for its derivation!
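A minimal sketch of that weight formula (my own toy example: three particles sharing three quanta of energy across four levels), showing that one configuration carries most of the weight:

[code]
from math import factorial, prod

def weight(occupations):
    """W = N! / (a_0! * a_1! * ...), the number of microstates of a configuration."""
    N = sum(occupations)
    return factorial(N) // prod(factorial(a) for a in occupations)

# Occupation numbers a_j of levels carrying 0, 1, 2, 3 quanta:
print(weight([2, 0, 0, 1]))  # one particle holds all 3 quanta  -> W = 3
print(weight([1, 1, 1, 0]))  # quanta spread as 0, 1, 2         -> W = 6
print(weight([0, 3, 0, 0]))  # every particle holds 1 quantum   -> W = 1

# The configuration with the greatest weight is the one the system is most
# likely to be found in, which is the point of the textbook quote below.
[/code]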

 

"Entropy is perhaps the most misunderstood thermodynamic property of matter. Introductory chemistry courses generally describe the entropic driving force for a reaction as an inherent desire for the system to increase its "randomness" or "disorder." With statistical mechanics, we will see that the tendency of an isolated system to evolve to a state of maximum entropy is a direct consequence of statistics."

 

"A system approaches equilibrium by achieving the configuration of energy with the maximum weight."

 

-Physical Chemistry; Thomas Engel, Philip Reid

 

Insane Alien: a superfluid approaching absolute zero, still in its liquid state (helium, for example), will still have much less entropy than dirt at room temperature!

 

:P


Consider the sum obtained by rolling a pair of dice versus the specific combinations that lead to a given sum. If you told me that you rolled either snake eyes or box cars I would know exactly what you rolled. Suppose, on the other hand, you told me you rolled a seven. What exactly did you roll? The picture is much less clear. There are six combinations that sum to seven.

 

Consider the fact that there is one possibility of getting snake eyes, whereas there are

 

1 6

6 1

2 5

5 2

3 4

4 3

 

six possibilities of getting seven, hence seven has the greater weight (I am just rewording this statement to better emphasize the points made above).

Edited by buttacup

Consider the fact that there is one possibility of getting snake eyes, whereas there are

 

1 6

6 1

2 5

5 2

3 4

4 3

 

Right, but that's a rather odd way of putting it. The point is that the probability of getting a higher score decreases, due to the restriction on the scores, and specifically we're talking about phase cells, i.e. there is a restriction on position etc.

 

Therefore, with a given energy [math]E[/math] acting as the restriction, the larger the number of molecules, the smaller the probability of a molecule occupying any given phase cell, and that probability falls as [math]E[/math] increases. The fall-off is exponential and, as already said, this only holds for gases in equilibrium. Here's a simpler equation...

 

[math]p = Ae^{-E/kT}[/math]

 

[math]p[/math] is the probability and [math]A[/math] the normalization factor; the exponential term on the right-hand side is the Boltzmann factor. "Disorder" is really a rewording of how little one can know about the position and velocity of a given molecule, though I think that was already covered (sorry if I'm repeating what's already been said). It gets more complicated when QM comes into play.
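A minimal sketch (my own, with made-up energy levels) of that factor, with A chosen so the probabilities sum to one:

[code]
from math import exp

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # temperature, K

energies = [0.0, 1e-21, 2e-21, 4e-21]            # example energy levels, J
factors = [exp(-E / (k * T)) for E in energies]  # Boltzmann factors
A = 1.0 / sum(factors)                           # normalization factor
probabilities = [A * f for f in factors]

for E, p in zip(energies, probabilities):
    print(f"E = {E:.1e} J  ->  p = {p:.3f}")

# Occupation probability falls off exponentially with energy at a given temperature.
[/code]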


