
Will entropy be low much of the time?


Tristan L


7 hours ago, studiot said:

I was hoping to introduce this in a more measured way, but you have jumped the gun.

I didn't and don't want to do that; rather, I only wanted and still want to make sure that there are no misunderstandings before I answer your other points and go on with the discussion.

 

7 hours ago, studiot said:

I am not, and never have, disagreed with your outlining of basic mathematical set theory in respect of the word 'partition'.

The important thing is that in this whole thread, I have only ever talked about partitioning the set of microstates into macrostates, not partitioning the system into subsystems. However, you seem to imply that I have done the latter by saying

21 hours ago, studiot said:

For the subsystems (partitions in your parlance)

In reality, I have only ever talked about partitioning the phase space.

 

7 hours ago, studiot said:

Mathematical partitioning of M_i (is based on) equipartition and implicitly assumes the 'equipartition theorem' of thermodynamics.

In what way does it do that?

Does that mean that without the equipartition theorem, one microstate could belong to more than one macrostate?

 

7 hours ago, studiot said:

You have also mentioned disjoint partitions, which is important in the mathematical statistics of this.

What exactly do you mean by that?

All members of a partition are pairwise disjoint by definition.

Being sets themselves, some partitions P, Q are disjoint (they have no members, i.e. no subsets of the ground-set, in common), while others are not.

 

7 hours ago, studiot said:

Nor can it arise in information technology, whose partitions are disjoint.

Partitions of what?

 

7 hours ago, studiot said:

Finally, I do hope you are not trying to disagree with Carathéodory.

Of course not, but as I said, I want to do away with any misunderstandings on my or your part before talking about your other points, including Carathéodory.

 

7 hours ago, studiot said:

And you seem to be disagreeing with my classical presentation, because it is much shorter than the same conclusion reached in the statistical paper you linked to?

Actually, I linked to the paper mainly because I find it interesting that there may be a way to outsmart the Second Law 😁.

Edited by Tristan L

  • 1 year later...
On 6/4/2020 at 1:55 PM, studiot said:

I was hoping to introduce this in a more measured way, but you have jumped the gun.

I was hoping that you'd actually read and try to understand what I've written, at least my last post, where I sweetle/state very clearly that I'm not talking about partitioning physical 3D space, but rather PHASE-ROOM. All I'm saying about disjointness is that macro-states are eachotherly disjoint sets of microstates; can we agree/forewyrd on that? Also, what exactly do you have in mind when you talk of disjointness?


On 6/4/2020 at 7:22 PM, Tristan L said:

Being sets themselves, some partitions P, Q are disjoint (they have no members, i.e. no subsets of the ground-set, in common), while others are not.

I think you misunderstand the meaning of disjoint in set theory.

This should be cleared up prior to any other consideration.

 


17 hours ago, studiot said:
On 6/4/2020 at 9:22 PM, Tristan L said:

others are not.

I think you misunderstand the meaning of disjoint in set theory.

This should be cleared up prior to any other consideration.

We should indeed clear up any misunderstandings before we go on, so we'll do that right now:

For any sets S, T, "S is disjoint from T" means that S and T have no elements in common.

For any sets P, S, "P is a partition of S" means that all members of P are non-empty subsets of S, that any two members T, B of P are disjoint (th.i. have no elements of S in common), and that the union of all members of P is S.

Now to my above quote, which I'll sweetle/explain with the help of a byspel/example: {1, 2, 3, 4} is our ground-set. All three of the following are partitions of {1, 2, 3, 4}:

{{1, 2}, {3, 4}}

{{1, 3}, {2, 4}}

{{1}, {2}, {3, 4}}

The first and the second are disjoint since they have no elements in common, but the first and the third one are not, for they have the member {3, 4} in common.
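
Here's a quick Python sketch that checks all of this (the helper name is_partition is just made up for this byspel/example):

from itertools import combinations

def is_partition(blocks, ground_set):
    # Non-empty blocks, pairwise disjoint, whose union is the whole ground-set
    return (all(b and b <= ground_set for b in blocks)
            and all(a.isdisjoint(b) for a, b in combinations(blocks, 2))
            and set().union(*blocks) == ground_set)

S = {1, 2, 3, 4}
P1 = [frozenset({1, 2}), frozenset({3, 4})]
P2 = [frozenset({1, 3}), frozenset({2, 4})]
P3 = [frozenset({1}), frozenset({2}), frozenset({3, 4})]

print(all(is_partition(P, S) for P in (P1, P2, P3)))  # True: all three are partitions
print(set(P1).isdisjoint(set(P2)))  # True: the first and second share no member
print(set(P1).isdisjoint(set(P3)))  # False: they share the member {3, 4}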


51 minutes ago, Tristan L said:

We should indeed clear up any misunderstandings before we go on, so we'll do that right now:

For any sets S, T, "S is disjoint from T" means that S and T have no elements in common.

For any sets P, S, "P is a partition of S" means that all members of P are non-empty subsets of S, that any two members T, B of P are disjoint (th.i. have no elements of S in common), and that the union of all members of P is S.

Now to my above quote, which I'll sweetle/explain with the help of a byspel/example: {1, 2, 3, 4} is our ground-set. All three of the following are partitions of {1, 2, 3, 4}:

{{1, 2}, {3, 4}}

{{1, 3}, {2, 4}}

{{1}, {2}, {3, 4}}

The first and the second are disjoint since they have no elements in common, but the first and the third one are not, for they have the member {3, 4} in common.

 

It is your attitude which impedes progress.

I said, "I think that........"

I did not try to lay down the law.

 

Your response could have been that you think you do understand set theory perfectly and so you wonder what I mean.

Instead you tried to lay down the law.

 

I agree with this first statement.

51 minutes ago, Tristan L said:

For any sets S, T, "S is disjoint with T" means that S and T share no elements in common.

But this second statement is inaccurate.

51 minutes ago, Tristan L said:

For any sets P, S, "P is a partition of S" means that all members of P are subsets of S and any two members T, B of P are disjoint, th.i. share no elements of S in common, and the union of all members of P is S.

This means that the set S must contain sets as members.

Which is not a requirement of set theory.

 

The issue of failure to distinguish between subsets and elements is further compounded by your sudden switch from elements to members.

The number 1 is a member, but not a subset of your set S = {1,2,3,4}

The partition {1} is a subset of S but not a member of S.

 

I don't take kindly to the attitude that you know everything and no one else knows anything.

Particularly as you are introducing mocking irrelevancies such as "byspel", which are no longer funny.

Edited by studiot

Not at all; you quite misunderstood. I would never dare to lay down the law about anything, and I never will; I only repeat 8th-grade mathematical definitions and knowledge.

4 hours ago, studiot said:

But this second statement is inaccurate.

In truth, this statement of yours is simply incorrect.

4 hours ago, studiot said:

This means that the set S must contain sets as members.

Which is not a requirement of set theory.

Could you please enlighten me as to the sinn/sense of what you've written there? Set theory indeed does not require every set to contain only sets (though ZFC actually does rule out ur-elements, for convenience), but a *partition* does contain only sets as elements, namely pairwise disjoint subsets of the ground-set.

4 hours ago, studiot said:

your sudden switch from elements to members.

I brook/use the words "element" and "member" in exactly one and the same meaning.

4 hours ago, studiot said:

The number 1 is a member, but not a subset of your set S = {1,2,3,4}

Oh, I thank you for the info, but my friend's 13-year-old cousin already told me that this morning. 😉

4 hours ago, studiot said:

The partition {1} is a subset of S but not a member of S.

{1} is not a partition of S; it's a member/element of, e.g., {{1}, {2, 3, 4}} and of {{1}, {2, 3}, {4}}, which in turn are partitions of S; accordingly, it's an underset/subset of S.

 

It's funny that you borrowed the very lines which I was about to write to you 🤣:

4 hours ago, studiot said:

It is your attitude which impedes progress.

 

4 hours ago, studiot said:

The issue of failure to distinguish between subsets and elements

 

4 hours ago, studiot said:

I don't take kindly to the attitude that you know everything and no one else knows anything.

 


15 hours ago, Tristan L said:

In truth, this statement of yours is simply incorrect.

You are right, it was my sloppy use of language.

I should have said:

The set {1} is part of the partition set of S, so it is a subset of S but not a member of S.

This difference between the use of 'partition' in the physical sciences and in mathematics strikes again.


That ambiguity in the meaning of words can indeed be very hindering. A much better word imho is German/Theech "Zerlegung", which unmistakably means that which is meant by English "partitioning" (should we use that word from now on for the mathematical concept?) or "sectioning" or something like that.

This brings us to the minor side-issue of speech:

23 hours ago, studiot said:

Particularly as you are introducing mocking irrelevancies such as "byspel", which are no longer funny.

I don't mean to mock or anything; I just have a side-hobby of bringing back English's true potential, and that includes brooking/using truly English words, for byspel "byspel", which is the proper English word for "example" and cognate/orbeteed to German "Beispiel". I brook this proper English on purpose/ettling where it's not the object but only the tool of talking; after all, that's the ord/point of speech. But again, that's just a hobby of mine.

On 6/2/2020 at 3:35 PM, studiot said:

Here is another simple problem

 

[Figure: a sealed tube with a piston dividing it into two chambers, A and B]

 

Suppose you have a sealed adiabatic tube containing an adiabatic frictionless piston dividing the tube into two chambers, A and B as shown.

Let both sides of the system contain an ideal gas.

Discuss the time evolution of the system in terms of entropy.

This is a very intrysting problem indeed 🤔. I'd say that it evolves as follows:

If the pressure/thrutch is the same on both sides of the resting piston, nothing will happen. Otherwise:

1. The piston will start to go from the high-thrutch side to the low-thrutch side.
2. As it goes in that direction, internal energy of the high-pressure gas is transferred to kinetic energy of both gases and to internal energy of the low-pressure gas.
3. When both thrutches become equal, the piston keeps shrithing/moving thanks to the inertia of the gases.
4. Now the kinetic energy of the gases and the internal energy of the former high-thrutch gas (now the low-pressure gas) are transferred to internal energy of the former low-thrutch gas (now the high-pressure one), slowing the piston down, until
5. the piston is at rest again.

Now the whole ongoings repeat in the other righting/direction.

Entropy doesn't change/wrixle during the whole process, so the Second Law of Thermodynamics is of no brook/use. But of course, we have another Second Law, namely that of Newton, and this one helps us further here.

Actually, I believe that we shouldn't find this too surprising 🤔, for there are other systems which wrixle/change although their entropy stays the same, e.g. a frictionless pendulum swinging. Mark that the system doesn't have to be periodic, I think; for instance, two bodies shrithing/moving at not-zero relative speed forever in space (forget about gravitational waves) make up such a system.
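
To flesh that out, here's a rough Python sketch of the to-and-fro (all numbers are parameters I've simply made up, and for eathness/simplicity I lump all the inertia into the piston and let each gas be squeezed reversibly and adiabatically):

# Undamped adiabatic piston: a rough sketch with made-up parameters.
GAMMA = 5 / 3                     # heat-capacity ratio of a monatomic ideal gas
AREA, MASS, L = 1e-2, 1.0, 1.0    # piston area (m^2), lumped mass (kg), tube length (m)
PA0, PB0, X0 = 2e5, 1e5, 0.5      # starting pressures/thrutches (Pa), piston spot (m)

def pressures(x):
    # Reversible adiabatic law P*V^gamma = const in each chamber
    return PA0 * (X0 / x) ** GAMMA, PB0 * ((L - X0) / (L - x)) ** GAMMA

x, v, dt = X0, 0.0, 1e-5
for step in range(200_001):
    pa, pb = pressures(x)
    v += AREA * (pa - pb) / MASS * dt   # Newton's second law on the piston
    x += v * dt                         # symplectic Euler: no numerical damping
    if step % 50_000 == 0:
        print(f"t={step*dt:5.2f} s   x={x:.4f} m   P_A={pa:9.0f} Pa   P_B={pb:9.0f} Pa")
# The piston overshoots the pressure-equalizing spot and swings back and forth
# forever; energy wrixles between kinetic and internal, and entropy stays put.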


  • 2 years later...

I’ve finally got ðe answer to my titular question, and þankfully, it’s “Yes, in a way”: In her video I don't believe the 2nd law of thermodynamics. from about five monþs ago, Sabine Hossenfelder made ðe same key point 💡 I made more ðan þree-and-a-half years ago in my topic Will entropy be low much of the time? and in a comment on PBS Space Time’s video The Misunderstood Nature of Entropy.

Ðe point in question is ðis: Ðe entropy of a system can’t be defined absolutely. It has to be defined wið regard to a partition (mark ðat I use ðe linked-to set-þeoretic definition of “partition”) of ðe state space of ðe system, ðat is, of ðe set of all possible microstates of ðe system. Ðe elements of a partition P of ðe state space are pairwise disjoint subsets of state space whose union is state space. Ðey are called “macrostates wið respect to P”. As I understand it (please set me right if I be wrong), ðe correct version of ðe Second Law of Þermodynamics says ðat for each partition P of ðe state space, ðe entropy wið respect to P is high more often ðan it is low. (So while ðe 2nd Law says entropy will likely not get lower going into ðe future, it also says entropy will likely not get lower going into ðe past, as I understand Dürr and Teufel explaining on page 90 of ðeir book Bohmian Mechanics: The Physics and Mathematics of Quantum Theory.)

However, at each time point t, ðere is a partition P_t wið regard to which ðe entropy is low at t. So at each time t, organisms can live for whom ðe elements of P_t are ðe relevant macrostates. Ðus, it seems ðat ðe spectre of ðe heat deaþ has been dispelled 😃 🎆. What are your þoughts on ðis matter? 🤔
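
To see why entropy wið respect to a fixed partition should be high most of ðe time, here's a toy Python sketch (a made-up six-state system hopping uniformly at random between its microstates, wið a partition of my own choosing):

import math, random

partition = [{1}, {2, 3, 5}, {4, 6}]   # toy macrostates of state space {1, ..., 6}

def entropy(z):
    # log2 of the size of the macrostate containing microstate z (in bits)
    return math.log2(len(next(M for M in partition if z in M)))

random.seed(0)
samples = [entropy(random.randint(1, 6)) for _ in range(100_000)]
for s in sorted(set(samples)):
    print(f"entropy {s:.2f} bits: {samples.count(s) / len(samples):.1%} of the time")
# Bigger macrostates hold more microstates, so high entropy dominates:
# ~1/6 of the time 0 bits, ~1/3 of the time 1 bit, ~1/2 of the time ~1.58 bits.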


6 hours ago, Tristan L said:

In her video I don't believe the 2nd law of thermodynamics. from about five monþs ago, Sabine Hossenfelder made ðe same key point 💡 I made more ðan þree-and-a-half years ago in my topic Will entropy be low much of the time?

She doesn't. At no point does she state or even imply that:

6 hours ago, Tristan L said:

Ðe entropy of a system can’t be defined absolutely. It has to be defined wið regard to a partition (mark ðat I use ðe linked-to set-þeoretic definition of “partition”) of ðe state space of ðe system, ðat is, of ðe set of all possible microstates of ðe system.

On the contrary, the entropy of a system may be defined absolutely by reference to the 3rd Law. And it is not 'all possible microstates of ðe system', it is 'all possible microstates of ðe applicable macrostate'. To understand this distinction, you need to understand the concept of microstate accessibility as referenced in the ergodic hypothesis. It's important. 

However, what she does say at ~9:40 is:

Quote

What does entropy have to do with order? 

This has always confused me. I now think the brief answer is 'nothing', because 'order' does not have a well-defined meaning.

For instance, imagine you're pouring milk into tea. In the beginning they're neatly separated: 'ordered' you could say. At the end, they are evenly mixed. I'd also call this 'ordered', but in physics, this even distribution doesn't count as order.

This reference to order causes a lot of confusion.

(S. Hossenfelder)

This is true, but rather than follow this logic, as I might, to demonstrate that there is no real conflict with the existence of domains of order within a closed system at thermodynamic equilibrium (for instance, the indefinite stability of an ice phase in an isolated system of water at its triple point), she continues with the following:

Quote

If we put air molecules in the corner of a box, we don't know where they're going, but at least we know where they are. 

If we now let the air spread into the entire box, it's still the same microstate. It's still the same microstate that one moment ago was in the corner of a box. But we can't tell. We used to have information about this state we no longer have. That's why entropy increases: because we lose access to information.

(S. Hossenfelder)

 Thermodynamic microstates are a snapshot of a system in a volume of space and have zero entropy; macrostates refer to the properties of that system observable in a volume of space-time. It is not 'still the same microstate': it is the ergodic progression through various microstates of different macrostates(!)

Dr Hossenfelder seems to have confused the evolution of an out-of-equilibrium thermodynamic system stepwise through various macrostates towards a condition of thermodynamic equilibrium (which is the process she describes) with the way in which a single quantum state may evolve.

I suspect that she is encountering problems in attempting to square her curious views on determinism with the 2nd Law.  

6 hours ago, Tristan L said:

Ðe elements of a partition P of ðe state space are pairwise disjoint subsets of state space whose union is state space. Ðey are called “macrostates wið respect to P”.

Don't you mean “microstates wið respect to P”?

6 hours ago, Tristan L said:

As I understand it (please set me right if I be wrong), ðe correct version of ðe Second Law of Þermodynamics says ðat for each partition P of ðe state space, ðe entropy wið respect to P is high more often ðan it is low.

No. All microstates have zero entropy. Entropy is an emergent property of all available microstates taken in combination. Hence a macrostate has a uniquely defined entropy even if some of the constituent microstates seem to exhibit spooky patterns.

Edited by sethoflagos
typo

Let's put some flesh on the bones

Microstates v Macrostates.

Here is an extract from a lovely article

Quote

https://chem.libretexts.org/Bookshelves/Physical_and_Theoretical_Chemistry_Textbook_Maps/Supplemental_Modules_(Physical_and_Theoretical_Chemistry)/Thermodynamics/Energies_and_Potentials/Entropy/Microstates

That might be fine, but how can we find out how many microstates are accessible for a macrostate? (Remember, a macrostate is just any system whose thermodynamic qualities of P, V, T, H, etc. have been measured so the system is exactly defined.) Fortunately, Ludwig Boltzmann gives us the answer in S = k_B ln W, where S is the value of entropy in joules/mole at T, k_B is Boltzmann's constant of 1.4 × 10^-23 J/K, and W is the number of microstates. Thus, if we look in “Standard State Tables” listing the entropy of a substance that has been determined experimentally by heating it from 0 K to 298 K, we find that ice at 273 K has been calculated to have an S° of 41.3 J/K mol. Inserting that value in the Boltzmann equation gives us a result that should boggle one's mind because it is among the largest numbers in science. (The estimated number of atoms in our entire galaxy is around 10^70 while the number for the whole universe may be about 10^80. A very large number in math is 10^100 and called "a googol" — not Google!) Crystalline ice at 273 K has 10^1,299,000,000,000,000,000,000,000 accessible microstates. (Writing 5,000 zeroes per page, it would take not just reams of paper, not just reams piled miles high, but light-years-high piles of reams of paper to list all those microstates!)

Entropy and entropy change are concerned with the energy dispersed in a system and its temperature, q_rev/T. Thus, entropy is measured by the number of accessible microstates, in any one of which the system's total energy might be at one instant, not by the orderly patterns of the molecules aligned in a crystal. Anyone who discusses entropy and calls "orderly" the energy distribution among those humanly incomprehensible numbers of different microstates for a crystalline solid — such as we have just seen for ice — is looking at the wrong thing.

Liquid water at the same temperature as ice, 273 K, has an S° of 63.3 J/K mol. Therefore, there are 10^1,991,000,000,000,000,000,000,000 accessible microstates for water.
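
As a quick sanity check, a few lines of Python (only a sketch; the inputs are just Boltzmann's constant and the quoted standard entropies) recover those huge exponents by inverting S = k_B ln W:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def log10_W(molar_entropy_J_per_K_mol):
    # Invert S = k_B ln W:  log10(W) = S / (k_B * ln 10)
    return molar_entropy_J_per_K_mol / (K_B * math.log(10))

print(f"ice   at 273 K: W = 10^{log10_W(41.3):.4g}")   # ~10^(1.299e24)
print(f"water at 273 K: W = 10^{log10_W(63.3):.4g}")   # ~10^(1.991e24)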

 


On 12/15/2023 at 2:31 PM, studiot said:

Any chance you could rewrite this in English for us plebs please?

Ðe letter Ðat (uppercase: 'Ð', lowercase: 'ð') stands for ðe 'th'-sound in "that", "then", "there" aso. whereas ðe letter Þorn (big: 'Þ', small: 'þ') represents ðe 'th'-sound in "thorn", "think", "thank" asf. Wielding 'ð' and 'þ' is þrice as precise as using 'th', since it distinguishes ðe /ð/ sound from ðe /þ/ sound and furðermore avoids mixing ðem up wið a /t/ sound followed by a /h/ sound. It is also double as efficient, because it uses only half as many letters. Ðis makes it six times as good (i.e. five times better).

On 12/15/2023 at 8:04 PM, sethoflagos said:

And it is not 'all possible microstates of ðe system', it is 'all possible microstates of ðe applicable macrostate'.

I wrote: "Ðe entropy of a system [...] has to be defined wið regard to a partition (mark ðat I use ðe linked-to set-þeoretic definition of “partition”) of [...] ðe set of all possible microstates of ðe system."

On 12/15/2023 at 8:04 PM, sethoflagos said:

Don't you mean “microstates wið respect to P”?

Nope, I really do mean macrostates wið respect to P. A macrostate wið respect to P is simply an element of P, where P is a partition (again, please mark ðat I use ðe set-þeoretic definition of "partition") of state space. Ðe elements of P are non-empty pairwise disjoint subsets of state space which togeðer cover state space. So a macrostate is a subset of state space. By contrast, a microstate is an element of state space.

On 12/15/2023 at 8:04 PM, sethoflagos said:

the entropy of a system may be defined absolutely by reference to the 3rd Law.

Wiðout reference to an arbitrarily chosen partition of state space into macrostates? 🤨

Edited by Tristan L

16 minutes ago, Tristan L said:

Ðe letter Ðat (uppercase: 'Ð', lowercase: 'ð') stands for ðe 'th'-sound in "that", "then", "there" aso. whereas ðe letter Þorn (big: 'Þ', small: 'þ') represents ðe 'th'-sound in "thorn", "think", "thank" asf. Wielding 'ð' and 'þ' is þrice as precise as using 'th', since it distinguishes ðe /ð/ sound from ðe /þ/ sound and furðermore avoids mixing ðem up wið a /t/ sound followed by a /h/ sound. It is also double as efficient, because it uses only half as many letters. Ðis makes it six times as good (i.e. five times better).

Using a common language makes it better. Using terminology that you understand but others don’t is not better.

English is the international language of science. 


On 12/15/2023 at 8:04 PM, sethoflagos said:

macrostates refer to the properties of that system observable in a volume of space-time.

Observable by whom? Ðat's ðe key point which Hossenfelder and I have made, contrary to:

On 12/15/2023 at 8:04 PM, sethoflagos said:

She doesn't. At no point does she state or even imply that:

On 12/15/2023 at 2:00 PM, Tristan L said:

Ðe entropy of a system can’t be defined absolutely. It has to be defined wið regard to a partition (mark ðat I use ðe linked-to set-þeoretic definition of “partition”) of ðe state space of ðe system, ðat is, of ðe set of all possible microstates of ðe system.

We've said ðat currently, ðe entropy of ðe Universe wið respect to a certain partition of state space, call it "P_human", is low. Accordingly, ðe Universe is currently teeming wið living beings (at least on Earþ) who divvy state space up into ðe elements of P_human. In accordance wið ðe Second Law, entropy wið regard to P_human will very likely rise until living beings, such as humans, for which ðe macrostates wi.re.to P_human matter, can no longer live. It will ðen very probably take a very, very long time until entropy wi.re.to P_human is low again. However, when entropy wi.re.to P_human is high, entropy will be low wi.re.to some oðer partition, e.g. P_Chubachaba, so living beings (like Chubachabas 😉) who split state space up into ðe members of P_Chubachaba will be able to live.

Let's say we have a very simple system wið just six states. Number ðem 1 to 6. Ðe state space of ðe system is ðe set {1, 2, 3, 4, 5, 6}. If ðe system is currently in state 4, what is its current entropy? Ðe question doesn't make sense. First, we have to break state space down into macrostates, e.g. into {1}, {2, 3, 5}, and {4, 6}. By choosing {{1}, {2, 3, 5}, {4, 6}} as our partition of {1, 2, 3, 4, 5, 6}, we've just categorized ðe microstates according to primality: {1} is ðe macrostate holding all microstates which are neiðer prime nor composite, {2, 3, 5} is ðe macrostate holding ðe prime microstates, and {4, 6} is ðe macrostate containing ðe composite ones. Now, we can ask: What is ðe current entropy of ðe system wið respect to ðe partition {{1}, {2, 3, 5}, {4, 6}}? Ðe answer is −Σ_{z ∈ M} P({z}|M) · log₂ P({z}|M), where M is ðe current macrostate; here M = {4, 6}, so ðe entropy is −log₂(1/2) = 1, given ðat all microstates be equally likely. Likewise, if ðe system was in microstate 3 a second ago, its entropy wið regard to {{1}, {2, 3, 5}, {4, 6}} a second ago was −log₂(1/3) ≈ 1.58, and if it was in microstate 1 two seconds ago, its entropy wið respect to {{1}, {2, 3, 5}, {4, 6}} back ðen was −log₂(1/1) = 0.

But who says we have to choose {{1}, {2, 3, 5}, {4, 6}} as our partition? Nobody. It's just ðat, in ðis example, we happen to be living beings who care about primeness. However, living beings who care about being a power of 2 would divvy ðe state space up into {2, 4} and {1, 3, 5, 6}, ðat is, choose {{2, 4}, {1, 3, 5, 6}} as ðe partition wið regard to which ðey define entropy. Ðe key point Hossenfelder and I have made is ðat at each time t, ðe system is in a microstate, and for each microstate z, ðere's a partition P of state space such ðat for ðe member M (a macrostate) of P which contains z, ðe likelihood of {z} given M is high; so if ðe system has microstate z, its entropy wið regard to P is low.
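
Here's a little Python sketch ðat reproduces ðe numbers above (ðe helper name entropy_wrt is made up just for ðis byspel/example, and I assume ðat all microstates be equally likely):

import math

def entropy_wrt(partition, microstate):
    # Shannon entropy (in bits) of the macrostate containing `microstate`,
    # under the uniform conditional distribution P({z}|M) = 1/|M|
    M = next(block for block in partition if microstate in block)
    return -sum(1 / len(M) * math.log2(1 / len(M)) for _ in M)  # equals log2(|M|)

P_prime = [{1}, {2, 3, 5}, {4, 6}]  # macrostates by primality
P_pow2  = [{2, 4}, {1, 3, 5, 6}]    # macrostates by "is a power of 2"

print(entropy_wrt(P_prime, 4))  # 1.0
print(entropy_wrt(P_prime, 3))  # 1.584962500721156 (~1.58)
print(entropy_wrt(P_prime, 1))  # -0.0, i.e. zero
print(entropy_wrt(P_pow2, 6))   # 2.0: another partition gives another entropy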

49 minutes ago, swansont said:

Using a common language makes it better. Using terminology that you understand but others don’t is not better.

English is the international language of science. 

Not only is ðe speech I use English, but ðe letters I'm writing are English letters.

On 12/15/2023 at 8:04 PM, sethoflagos said:

out-of-equilibrium

On 12/15/2023 at 8:04 PM, sethoflagos said:

thermodynamic equilibrium

Wið regard to which partition of state space into macrostates?

On 12/15/2023 at 8:04 PM, sethoflagos said:

I suspect that she is encountering problems in attempting to square her curious views on determinism with the 2nd Law.

I agree; in a deterministic universe, each probability is eiðer 0 or 1, so ðe entropy is always 0 ... ðough ðis is in accordance wið ðe 2nd Law, of course.

On 12/15/2023 at 8:04 PM, sethoflagos said:

Hence a macrostate has a uniquely defined entropy even if some of the constituent microstates seem to exhibit spooky patterns.

Ðat's correct. Ðe entropy of a macrostate is defined absolutely. However, when we ask about ðe entropy of ðe system, we have to ask: which subsets of state space count as macrostates?

