Entropy - At Maximum


Sorcerer

Is it possible for a system to have a maximum amount of disorder, i.e. to be at maximum entropy? Is this the same as equilibrium?

If so, when in this state, can it change (in ways other than heat)? Is entropy only a measure of the ordered nature of heat? Is it the same as information? Can there still be information retained in, say, the position of the particles, or do all the particles also have to be equidistant or completely randomly located?

Does anything prevent a system at maximum entropy losing some entropy temporarily and then regaining it? Most concepts of equilibrium I've encountered involve a system flowing back and forth between states, not simply resting in one place.

How would the concept of maximum entropy be reconciled with the second law of thermodynamics? If entropy always increases, how could it cease to increase? Is there no such thing as maximum entropy, but rather an asymptote approaching the maximum?

A lot of questions, I know.


Maximum entropy means you cannot extract work from the system, which is what happens when you no longer have a temperature difference to run your heat engine.
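To put rough numbers on that, here is a minimal Python sketch of the Carnot limit (the reservoir temperatures are made up for illustration):

def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work between two reservoirs (kelvin)."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5: half the heat can become work
print(carnot_efficiency(300.0, 300.0))  # 0.0: no temperature difference, no work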

I understand how that relates to heat, but I was also wondering about information. Let's say there are 15 particles, all at thermal equilibrium but all different, say all different elements of the periodic table. Couldn't they also be arranged in an order that codes information? Let's say you arranged them (that's putting energy into the system, upsetting the equilibrium) and stuck them in position. What would it mean for this system to be at equilibrium? Would they need to become unstuck and mix up randomly? Or would they just need to sit for a while until they, the substrate, and the 'glue' all became the same temperature?

 

Can there still be information retained in, say, the position of the particles, or do all the particles also have to be equidistant or completely randomly located?

 

Yes, an isolated system will tend to a maximum entropy at equilibrium, quantum and statistical fluctuations aside.

 

Tend towards or reach? Is maximum entropy asymptotic? Will there always be a tiny amount of heat left to flow and even things out?

 

I know these laws work when heat is quantified in joules, but what about when it's quantified as photons, i.e. as E? Do atoms stop emitting photons at equilibrium, or do they just exchange equal amounts to no net effect?

Edit: oops, I understand photon energy can also be measured in joules, but I still think there's a good question there.

Maximum entropy means you cannot extract work from the system, which is what happens when you no longer have a temperature difference to run your heat engine.

 

 

I think that there are more provisos needed to make this entirely true.

 

 

Yes, an isolated system

 

Do you know any real isolated systems, and would this apply to an infinite isolated system?


I understand how that relates to heat, but I was also wondering about information. Let's say there are 15 particles, all at thermal equilibrium but all different, say all different elements of the periodic table. Couldn't they also be arranged in an order that codes information? [...]

 

 

Tend towards or reach? Is maximum entropy asymptotic? Will there always be a tiny amount of heat left to flow and even things out?

 

I know these laws work when heat is quantified in joules, but what about when it's quantified as photons, i.e. as E? Do atoms stop emitting photons at equilibrium, or do they just exchange equal amounts to no net effect?

Edit: oops, I understand photon energy can also be measured in joules, but I still think there's a good question there.

The atoms don't reach equilibrium; the theoretically isolated system does. They would maintain a velocity distribution that would emit/receive a black-body distribution of photons for whatever temperature the system has settled to.
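That equilibrium spectrum is Planck's law; here is a minimal Python sketch (rounded SI constants, and the 300 K example frequency is just illustrative):

import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(nu, temp):
    """Spectral radiance of a black body at temperature temp (K), frequency nu (Hz)."""
    return (2.0 * H * nu**3 / C**2) / math.expm1(H * nu / (KB * temp))

# At equilibrium the gas emits and absorbs this same spectrum, so there is no net flow.
print(planck_radiance(1.76e13, 300.0))  # near the peak of the 300 K spectrum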

 

 

 

 

Do you know any real isolated systems?

Not really… there might be some lump of iron at 2.73 K floating around somewhere.


 

Do you know any real isolated systems, and would this apply to an infinite isolated system?

Isn't the universe an isolated system by definition, even if infinite?

 

 

Not really… there might be some lump of iron at 2.73 K floating around somewhere.

Wouldn't that have mass energy to exchange if it happened to float by something? So it's an isolated system only so long as it remains isolated (and thus unobservable, since observation would input energy)... and then it's not. It doesn't really seem worth making rules for it. An isolated system is like the Schrödinger's cat of systems.


I understand how that relates to heat, but I was also wondering about information. Let's say there are 15 particles, all at thermal equilibrium but all different, say all different elements of the periodic table. Couldn't they also be arranged in an order that codes information? [...]

 

I don't think information entropy and thermodynamic entropy are the same thing.


 

I don't think information entropy and thermodynamic entropy are the same thing.

Maybe that's where I'm getting so confused then. Does information entropy hold to a rule similar to the second law of thermodynamics?

 

In most cases I can think of, isn't it necessary for work to be done to maintain the order that holds information, e.g. life and DNA? Thus the two entropies would be linked. I'm recalling a term I heard many years ago, negentropy or something; let me look it up. (Fancy word for free energy: https://en.wikipedia.org/wiki/Negentropy.)

 

So, I've got two related threads going here: another one connecting maximum entropy and minimum energy, and I've also brought in the entropy of information.

 

It occurred to me: why is there conservation of energy and information, but not conservation of entropy? Isn't the increase in thermodynamic disorder (entropy) also a loss of information?
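For what it's worth, the formal bridge between the two is that Shannon's formula and the Gibbs entropy are the same sum over state probabilities, differing only in the logarithm base and a factor of Boltzmann's constant. A minimal Python sketch (the four-state distribution is just an example; the last line is the Landauer limit for erasing one bit at 300 K):

import math

KB = 1.381e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Information entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Thermodynamic entropy in J/K: S = -k_B * sum(p * ln(p))."""
    return -KB * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.25, 0.25, 0.25, 0.25]   # four equally likely (micro)states
print(shannon_entropy(probs))      # 2.0 bits
print(gibbs_entropy(probs))        # ~1.9e-23 J/K -- same sum, different units
print(KB * 300.0 * math.log(2.0))  # ~2.9e-21 J: minimum cost to erase one bit at 300 K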

 

 


 

So, I've got two related threads going here

 

 

Yes, and as I indicated in your other thread, it would help you enormously if you were to get hold of the proper definitions and meanings for the basic terms you are throwing around.

 

Please take this in the (good) spirit in which it is meant.

 

Information is another such word. It has a very specific meaning in statistical thermodynamics that is different from, but also related to, the (also specific) meaning in communications theory, and the much wider meaning in general English.

 

To understand the meaning of information you need to understand the meaning of 'state'.


OK, someone liked my last post, so here is a rough guide to some of these basic terms.

 

You are currently posting similar stuff in three threads about thermodynamics, but I am trying to collect it together in one thread (this one).

 

 

First, 'state'.

 

The state of anything is a complete list of the values of properties of interest for the system concerned.

 

The next bit is very important. A state can only be defined if one value for each property can be obtained that represents the whole system (for instance, average particle velocity in a gas).

If one part of the system is in one state and another is in a different state, you need to divide the system into two subsystems.

It may also be that one value cannot be obtained (for instance turbulent motion in a fluid) in which case the state of the system is undefined.

 

If the properties of interest do not change with time (i.e. are independent of time), the system is said to be in a state of equilibrium with respect to those properties.

A system can be in equilibrium with respect to one property but not with respect to another.

For example a system can be in horizontal equilibrium, but not vertical equilibrium in mechanics.

 

Which brings me to note that I have been general in my statements so far, since thermodynamic state is not the only possible state.

For instance, colour. It makes no thermodynamic difference if the working fluid changes colour at some temperature (although this might be useful for other purposes) or through ageing with time.

 

Further, thermodynamics and other subjects offer relationships (equations) connecting the properties of interest, so it is not necessary to know all of them to complete the list. Each relationship reduces the number of unknown property values by one.
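For example, the ideal gas equation of state PV = nRT fixes any one of the four properties once the other three are known; a minimal Python sketch:

R = 8.314  # molar gas constant, J/(mol K)

def ideal_gas_volume(pressure, moles, temp):
    """V = nRT / P: one relationship removes one unknown from the property list."""
    return moles * R * temp / pressure

print(ideal_gas_volume(pressure=101325.0, moles=1.0, temp=273.15))  # ~0.0224 m^3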

 

OK, so we have our list of properties and a list of values, and hence a state for some system.

 

The information content is a second list: a list of all the possible states, plus one piece of additional information telling us which state the system is in.

Since the possible states are not all independent, we can reduce the list by one: the last state can be deduced from the rest of the list, as being whatever is not already listed.

 

Relating this to entropy is tractable for systems that have a definable set of states (this is good in quantum theory), but not so easy where we enter an (infinite) continuum of states.
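As a toy illustration (my own example, not part of the argument above): take N two-state particles, so the macrostate with n 'up' has a countable number of microstates, Omega = C(N, n), and Boltzmann entropy S = k ln Omega. A minimal Python sketch:

import math

KB = 1.381e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_up):
    """S = k_B * ln(Omega) for the macrostate with n_up of n_particles 'up'."""
    omega = math.comb(n_particles, n_up)
    return KB * math.log(omega)

# The 50/50 macrostate has the most microstates, hence the maximum entropy.
for n_up in (0, 5, 10, 15, 20):
    print(n_up, boltzmann_entropy(20, n_up))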

 

Finally, heat, work, etc. are not properties of the system. They are exchange or interaction variables, describing interactions between the system and that which is not the system (its surroundings).


I liked it :)

Not all of my questions have been answered; although both threads are about entropy, looking back at the OP, they're different.

So a system can be at maximum entropy, i.e. have uniform heat, or heat at equilibrium, and the content of information able to be extracted from it is independent, because the "state" of maximum entropy with regards to heat is independent of the "state" of maximum entropy with regards to information?

How does a system retain/maintain information without being able to do "work"?

Perhaps I really should have made a thread for each question; I'd like an answer to this one specifically:

 

How would the concept of maximum entropy be reconciled with the second law of thermodynamics? If entropy always increases, how could it cease to increase? Is there no such thing as maximum entropy, but rather an asymptote approaching the maximum?


Keeping to this thread, although you have posted the same question in more than one place:

I don't know what you have been reading to gain the impression that entropy always increases.

It doesn't.

Entropy over a complete cycle can never decrease, but may decrease in parts of that cycle or parts of a system.

That is not the same thing at all.

Your question was pretty specific and to answer it I will have to get a wee bit technical.

Consider a sealed cylinder, divided into two compartments by an adiabatic frictionless piston.

Each compartment (a and b) contains some ideal gas.

Suppose the piston has an equilibrium position such that compartment a has volume Va and temperature Ta, and compartment b has volume Vb and temperature Tb.

Now suppose the piston suffers a slight displacement, so that the volume of compartment a increases to Va + dVa.

Since the piston is adiabatic, no heat is transferred, so q = 0 and dSa = q/Ta = 0.

Similarly for compartment b, q = 0, so dSb = q/Tb = 0.

 

Or, using the fundamental relation for compartment a: Ta dSa = dUa + Pa dVa, which is zero since dUa = -Pa dVa for this adiabatic change.

 

So there is no change in entropy as the piston is moved back and forth, compressing one compartment and expanding the other.

Mechanically we know that at the equilibrium position the pressure on one side of the piston equals the pressure on the other, whilst any displacement produces a pressure differential, which leads to a restoring force.

So if the displaced piston is released it will oscillate about this equilibrium position, in an isentropic process.

A plunger connected to this piston can transfer this oscillation as work to, say, a resistive dashpot outside the cylinder.
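A rough numerical sketch of that restoring force (made-up numbers, not the homework-thread mathematics): each compartment follows the adiabatic relation P V^gamma = constant, so displacing the piston creates a pressure imbalance pushing it back.

GAMMA = 5.0 / 3.0  # adiabatic index for a monatomic ideal gas

def pressures_after_displacement(p0, v0, dv):
    """Pressures in compartments a and b after Va grows by dv, per P*V**GAMMA = const."""
    p_a = p0 * (v0 / (v0 + dv)) ** GAMMA
    p_b = p0 * (v0 / (v0 - dv)) ** GAMMA
    return p_a, p_b

p_a, p_b = pressures_after_displacement(p0=1.0e5, v0=1.0e-3, dv=1.0e-5)  # Pa, m^3
print(p_b - p_a)  # positive: the net force pushes the piston back toward equilibrium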

 

There was a much more detailed question about this set in homework help a while back.

The full mathematics is available in that thread; I will try to find it.

Link to comment
Share on other sites

