
Entropy



Well, it's simply a definition, although it's chosen so that it makes contact with the entropy defined in (non-statistical) thermodynamics.

Some properties that make it look like a good choice to me:

1) As long as the map from the number of microstates to the entropy is monotonically increasing (A > B => log(A) > log(B)), the very important axiom "the system will be found in the macrostate with the most associated microstates" still translates to "the entropy will be at a maximum".

2) It seems like a practical definition: I imagine you often encounter problems where you multiply numbers of microstates. Since log(A*B) = log(A) + log(B), entropy is additive in those problems.

 

Sorry for being so vague in point 2, but I don't have a good physical example in mind right now; perhaps someone else has one. The bare additivity, at least, is easy to sketch (see below).
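A minimal sketch of the additivity (the labels W_1, W_2 are mine, not from the thread): take two independent subsystems with W_1 and W_2 microstates. Every microstate of the first can be paired with every microstate of the second, so the combined system has W_1 W_2 microstates, and

\[
S_{\text{total}} = \log(W_1 W_2) = \log W_1 + \log W_2 = S_1 + S_2 .
\]

Without the log, the "entropies" of independent subsystems would multiply instead of add.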

Either way: from the physics side it doesn't really matter whether you take the log or not; it changes the equations, but it's still the same physical quantity. Perhaps it's comparable to measuring temperature in Fahrenheit or Kelvin, only a tick more sophisticated.
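To make that analogy concrete (assuming the usual conventions, which the post doesn't spell out): the thermodynamic and the dimensionless definitions differ only by a constant factor, the Boltzmann constant,

\[
S_{\text{thermo}} = k_B \ln W , \qquad S_{\text{dimensionless}} = \ln W ,
\]

much as Kelvin and Fahrenheit readings of the same temperature differ by a fixed rescaling (plus an offset, in that case).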
