Is entropy always low in a way?


Tristan L


Since several of you have told me you can more easily read my arguments about entropy if I don't use Þorn ('Þ'/'þ') and Ðat ('Ð'/'ð'), I'll go back to using "th" in my physics arguments for now. As this is a physics forum, I've transferred the discussion about letter use to a thread in the Other Sciences forum (I've found no forum specifically for speechlore). Only in the following paragraph do I use Ðat and Þorn, but you can skip it if you wanna get to the interesting part, aka the physics.

Again, swansont locked my entropy þread, saying ðat my statement ðat I wield ðe English alphabet be a bad faiþ argument in his estimation. Well, his estimation is clearly wrong, so his repeated closure of my þread is not justified. Ðerefore, I reopen it hereby yet again. Furðermore, I demand ðat he take back ðe penalty point he's unrightly given me for supposedly making bad faiþ arguments. Anoðer þing: Nobody gets to tell me which letters to use, but I do listen to good faiþ arguments, and several of you have given me such arguments for switching back to using "th", pointing out ðat ðe currently not-yet-widely used letters distract ðem from the content of my arguments. I've answered your points in ðe aforementioned þread in ðe Oðer Sciences forum. Ðere as well as here, I've taken your advice to hold back from wielding Þorn and Ðat for now. Wið ðat out of ðe way, let's delve into ðe physics and maþematics of entropy! 😀

Please mark that I use “partition of S” in the set-theoretic sense, that is, to mean a set of subsets of S which are non-empty, pairwise disjoint, and together cover S. Thus, “partition of S” refers to a subset of the power set of S which fulfills certain conditions. For instance, {{1, 2, 3}, {4, 5}, {6}} is called “a partition of {1, 2, 3, 4, 5, 6}”. Choosing which subsets of a state space count as macrostates amounts to choosing a partition of the state space.
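As a quick illustration, the three partition conditions can be checked mechanically. Here's a small Python sketch (the function name is mine, purely for illustration):

```python
from itertools import chain

def is_partition(blocks, s):
    """Check that `blocks` is a set-theoretic partition of the set `s`:
    blocks are non-empty, pairwise disjoint, and together cover s."""
    blocks = [frozenset(b) for b in blocks]
    if any(len(b) == 0 for b in blocks):
        return False  # no empty blocks allowed
    union = set(chain.from_iterable(blocks))
    # pairwise disjoint <=> the block sizes add up to the size of the union
    if sum(len(b) for b in blocks) != len(union):
        return False
    return union == set(s)  # together, the blocks must cover s exactly

print(is_partition([{1, 2, 3}, {4, 5}, {6}], {1, 2, 3, 4, 5, 6}))  # True
print(is_partition([{1, 2}, {2, 3}], {1, 2, 3}))  # False: blocks overlap
```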

The state space (e.g. phase space in Classical Mechanics) of a system is the set of all possible states (called "microstates") of the system. We assume that to each microstate, z, belongs a probability, which we call "pr(z)". Of course, this is a highly non-trivial assumption IMHO, as in truth, we have only probabilities of going from one state to another, but let's suppose it be meaningful to speak of "the probability of a microstate". Then for any subset M of state space, "the probability that the system be in M" and "Pr(M)" refer to the probability that the system have a microstate lying in M, which equals the sum of the probabilities of the members of M. For any subsets M, N of state space, the conditional probability of N given M is denoted by "Pr(N | M)". For each subset M of state space, "the entropy of M" and "S(M)" are wielded to mean

S(M) = -∑_{z ∈ M} Pr({z} | M) * log2 Pr({z} | M),

that is, (how much more information we'd have about the system if we knew which microstate in M it has than if we knew only that its microstate is in M) if the microstate of the system lie in M. If all microstates be equally likely, the formula for the entropy of a subset M of state space reduces to

S(M) = -log2(1/#M) = log2 #M,

where #M is the number of microstates in M.
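A minimal Python sketch of this definition, assuming uniform probabilities over six hypothetical microstates; it confirms that in the uniform case, the full formula reduces to log2 #M:

```python
import math

def entropy(pr, M):
    """S(M) = -sum over z in M of Pr({z}|M) * log2 Pr({z}|M),
    where pr maps each microstate z to its probability pr(z)."""
    pr_M = sum(pr[z] for z in M)  # Pr(M)
    s = 0.0
    for z in M:
        p = pr[z] / pr_M  # Pr({z} | M)
        if p > 0:
            s -= p * math.log2(p)
    return s

# Uniform probabilities over six microstates (an illustrative assumption):
pr = {z: 1 / 6 for z in range(1, 7)}
M = {2, 3, 5}
print(entropy(pr, M))     # ≈ 1.585
print(math.log2(len(M)))  # log2 #M, the same value
```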

Okay, so what's the entropy of the system? Well, that depends on which subsets of the state space of the system count as macrostates. Given a partition Ma (e.g. {{1}, {2, 3, 5}, {4, 6}}) of state space (e.g. {1, 2, 3, 4, 5, 6}), we use "entropy of the system with regard to Ma" to mean the entropy of the macrostate in which the system is, that is, the entropy of the element (e.g. {2, 3, 5}) of Ma which holds the system's microstate (e.g. 2). The key point which I made more than three-and-a-half years ago in my thread Will entropy be low much of the time? is that, as we've just seen, the entropy of the system depends on how we divvy up state space into macrostates, i.e. on which subsets of state space we count as macrostates. I believe Sabine Hossenfelder made the same key point in her video I don't believe the 2nd law of thermodynamics about half a year ago.

Both she and I also point out that currently, the Universe's entropy with respect to a certain partition (call it "Ma_human") of state space is low, so currently, living beings, including humans, are thriving who split state space up into the members of Ma_human. As I understand it, the 2nd Law says that for each partition Ma of state space, the entropy with respect to Ma is high most of the time. Thus, it's very likely that the Universe's entropy with regard to Ma_human will become so high, and stay high for such a very, very long time, that humans, coleoids, and so forth can't live during that while.

However, at each time, there's a partition of phase space with regard to which the entropy of the system is low. So in a googol years, the Universe's entropy w.r.t. Ma_human may be high, but it will be low w.r.t. some other partition, say, Ma_Chubachaba. So in a googol years, living beings (call them "Chubachabas") can evolve for whom the members of Ma_Chubachaba are the relevant macrostates of the Universe. Thus, the Universe is likely to always harbor life of some kind.
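The dependence of entropy on the choice of macrostates can be sketched in Python. The two partitions and the uniform probabilities below are illustrative assumptions of mine; the point is only that one and the same microstate gets different entropies under different partitions:

```python
import math

def entropy(pr, M):
    """Entropy of a subset M of state space, given microstate probabilities pr."""
    pr_M = sum(pr[z] for z in M)
    return -sum((pr[z] / pr_M) * math.log2(pr[z] / pr_M) for z in M if pr[z] > 0)

def entropy_wrt(pr, partition, microstate):
    """Entropy of the system with regard to a partition: the entropy of the
    macrostate (block of the partition) that holds the system's microstate."""
    block = next(b for b in partition if microstate in b)
    return entropy(pr, block)

pr = {z: 1 / 6 for z in range(1, 7)}  # uniform microstate probabilities
Ma = [{1}, {2, 3, 5}, {4, 6}]         # one choice of macrostates
Mb = [{1, 2, 3, 4, 5}, {6}]           # another choice of macrostates

print(entropy_wrt(pr, Ma, 2))  # ≈ 1.585 (log2 3)
print(entropy_wrt(pr, Mb, 2))  # ≈ 2.322 (log2 5): same microstate, different entropy
```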

What are your thoughts on this matter?

This topic is now closed to further replies.