Statistical Mechanics - Entropy


JamSmith

I had a discussion about statistical mechanics with a friend last night, and I came to know that the relationship between entropy and disorder is studied in a discipline called statistical mechanics.

I have no clear idea about how entropy is related to disorder.

Can anyone help me clear this up?


Entropy is a concept from thermodynamics, which doesn't study the behaviour of individual particles but of (very) large numbers of them, hence "statistical" mechanics. Examples of statistical properties are pressure and temperature, but also entropy and its complement, enthalpy.

In short: the useful energy in a gas can be split into two parts, entropy and enthalpy. Simply put:
- enthalpy is useful energy, which can e.g. be converted to electricity in a turbine
- entropy is the waste heat that cannot be recovered

The relation between entropy and disorder becomes clear in the second law of thermodynamics:
[math]\frac{dS}{dt} \geqslant 0[/math]

with S the entropy. In words: the entropy of a closed system can only increase over time. The typical example is a vase that falls from the table and shatters; the shards will never spontaneously come back together into a complete vase.

Another example is two containers of air at different pressures: there is useful energy in them, because the pressure difference can be used to do mechanical work. When they are connected, the gas will mix and the result will be two containers at the same pressure. While the total energy content did not change (conservation of energy), the result is useless. Something similar happens with different temperatures or with mixtures of fluids: when there is a difference (order), it can be used to produce work; when everything is homogeneous (disorder), it cannot be converted to work.

The transition from order to disorder happens naturally (e.g. objects in contact will evolve to the same temperature, or milk will spread through coffee); the transition from disorder to order can only happen through outside influence (and requires energy).
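A worked example of that temperature case (a standard textbook calculation, added here for illustration, assuming two identical bodies with the same constant heat capacity C): if they start at temperatures [math]T_h[/math] and [math]T_c[/math] and equilibrate at [math]T_f = \frac{T_h + T_c}{2}[/math], the total entropy change is

[math]\Delta S = C\ln\frac{T_f}{T_h} + C\ln\frac{T_f}{T_c} = C\ln\frac{(T_h + T_c)^2}{4T_h T_c} \geqslant 0[/math]

which is strictly positive whenever [math]T_h \neq T_c[/math], so the equalisation is indeed irreversible.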

 

EDIT: a consequence of this is that a combustion engine has a maximum possible efficiency, generally about 40% (Carnot efficiency). This is not a practical limit of how we make engines, but a theoretical limit dictated by thermodynamics.
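For illustration (with assumed temperatures, not figures from the post itself): the Carnot limit is

[math]\eta_{max} = 1 - \frac{T_c}{T_h}[/math]

so an engine rejecting heat at [math]T_c = 300\,\mathrm{K}[/math] while taking heat in at an effective [math]T_h = 500\,\mathrm{K}[/math] could at best reach [math]1 - \frac{300}{500} = 0.4[/math], i.e. 40%, no matter how cleverly it is built.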

Edited by Bender


Thank you Bender.

 

So finally it follows this rule: a natural process (reaction) causes an increase in entropy, or, put differently, an entropy increase is the driving force for natural reactions.

 

If there is an error, please correct me.

Edited by JamSmith

By the way, I appear to have made some hasty errors (thanks to studiot for pointing them out): it is obviously not the "useful" energy that is divided into two parts, but the "total" energy.

 

Technically, the entropy can also remain constant, but in dynamic systems, it typically doesn't.

 

I wouldn't call entropy a "driving force", but the idea is correct: nature favours reactions that minimise the potential energy and as a result increase the "thermal" energy, which is an increase in entropy.
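For chemical reactions at constant temperature and pressure, this trade-off is usually expressed through the Gibbs free energy (a standard criterion, added here for reference rather than quoted from the thread):

[math]\Delta G = \Delta H - T\Delta S < 0[/math]

for a spontaneous reaction, so a process can be favoured either by releasing heat (lowering the enthalpy H) or by increasing the entropy S.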

Edited by Bender

It is also possible to rewrite the microscopic definition of entropy, [math]S = k\ln W[/math], into the macroscopic definition [math]\Delta S = \frac{Q}{T}[/math], both for temperature changes and for volume changes. Since [math]S = k\ln W[/math] counts the number of combinations (microstates) behind the entropy for a given p, V, T, the statistical rule that the state with the maximum number of combinations, i.e. maximum entropy, is the most spontaneous one can be carried over mathematically to the macroscopic version as well. The macroscopic version and its calculations are often used in introductory thermodynamics courses without this derivation. Looking back, it seems a bit backwards that those courses don't first show the derivation of entropy from the microscopic to the macroscopic form.
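A sketch of that micro-to-macro link for the volume part (a standard derivation, assuming an ideal gas and a reversible isothermal expansion): for N molecules, the number of positional microstates scales as [math]W \propto V^N[/math], so

[math]\Delta S = k\ln\frac{W_2}{W_1} = Nk\ln\frac{V_2}{V_1} = nR\ln\frac{V_2}{V_1}[/math]

which is exactly what the macroscopic definition [math]\Delta S = \frac{Q_{rev}}{T}[/math] gives, because for an isothermal reversible expansion of an ideal gas [math]Q_{rev} = nRT\ln\frac{V_2}{V_1}[/math].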

Edited by Tor Fredrik


Thanks for the concept. But thinking of entropy as "disorder", I have come across so many articles. I found this one short and very clear: http://webs.morningside.edu/slaven/Physics/entropy/entropy7.html

Is it a relevant blog to follow?



It is good to see you coming back to discuss your topic; connecting entropy and order/disorder can be paradoxical and often leads to surprising or even inappropriate conclusions.

 

As noted in the link below, modern science is therefore veering away from offering this connection.

 

https://en.wikipedia.org/wiki/Entropy_(order_and_disorder)

 

This Wikipedia article is a fair summary and, unlike many offerings, says nothing actually incorrect, although it raises some questions it does not properly answer.

 

The connection really hinges on what you mean by order or disorder.

The definition of entropy is pretty well specified: swansont has provided a statistical definition, and Bender a physical one.

However there is no such convenient definition of either order or disorder. What is meant depends in part on the parameters of interest.

 

You have not discussed these further or indicated your mathematical level, but you can't fully consider the question without some mathematics.

 

Going with the statistical approach, since it is in the title, here is a simple introduction.

 

Consider a chessboard: it has 64 squares.

 

Which means there are 64 ways to place a single black pawn on the (otherwise empty) board.

There is no reason to assume any of these positions or arrangements is 'better' than any other so we choose one square and call it 'order'.

If we place the pawn there the arrangement is 'ordered'.

If we place it anywhere else the arrangement is 'disordered'.

There is thus 1 possible arrangement called order but there are 63 possible ways of disorder.

Now we consider a change of arrangement.

If we make a single change to the arrangement, i.e. move the disordered pawn to any other square, there are 62 ways of doing this whilst maintaining the disorder and only one way to change to an ordered pawn.

For this system a change is 62 times more likely to result in disorder than order.

 

This is only for one single pawn.

 

Now take all 8 black pawns and consider arrangements of them on the board.

 

You can place the first pawn in one of 64 ways, i.e. on any square.

You can place the second pawn in one of 63 ways, i.e. on any remaining square.

You can place the third pawn in one of 62 ways, i.e. on any remaining square.

and so on.

 

In total this means there are 64 x 63 x 62 x 61 x 60 x 59 x 58 x 57 = 178462987637760 different arrangements.
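A quick way to check that count, and to see how much it shrinks if the pawns are treated as indistinguishable (as in the numbered points further down), is this small Python sketch, added here for illustration:

from math import perm, comb

# ordered placements of 8 pawns on 64 squares: 64 * 63 * ... * 57
print(perm(64, 8))   # 178462987637760

# treating the 8 pawns as indistinguishable divides this by 8!
print(comb(64, 8))   # 4426165368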

So what is order now?

Is it perhaps some relationship between the positions of the pawns, say they are all in a straight line?

 

Well there are 2 ways this can be done if they are to remain on the same colour and another 16 ways if the colour does not matter.

 

This simple model can be developed to embody all the important characteristics of statistical mechanics, which are, in relation to the OP:

 

1) The pawns are not distinguished. Every pawn is equivalent to every other, so any pawn in the order position constitutes order.

 

2) The arrangements are not distinguished so any position can be chosen as order.

 

3) It is changes to the position that offer meaningful properties.

 

4) When talking about the change, only the beginning and end positions are meaningful. No meaning is attached to the positions during the change.

 

Note: what I have called arrangements or positions are called states in thermodynamics and statistical mechanics.

Edited by studiot

 


 

I did read the article. It looked nice. The volume-change part of entropy is directly related to disorder. See this article:

 

http://www.pa.msu.edu/courses/2005spring/phy215/phy215wk4.pdf

 

The temperature-entropy relation to the number of states is, as far as I can see, related to the amount of energy in the system: the more energy, the more entropy. And the higher the number of molecules, the more energy and the more entropy. The temperature side of entropy, as I have learned it, is related to the Boltzmann distribution, which is one of the main results in thermodynamics. You can find a derivation in the literature; Atkins' Physical Chemistry was where I found the most information about it myself.
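For reference, the Boltzmann distribution mentioned here (stated in its standard form, assuming a system in thermal equilibrium at temperature T) gives the probability of a molecule occupying a state with energy [math]E_i[/math] as

[math]p_i = \frac{e^{-E_i/kT}}{\sum_j e^{-E_j/kT}}[/math]

so higher-energy states become more accessible as T rises, which is the temperature side of the growth in the number of available states.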


 


I got your point. Thanks for your cooperation.

