
Entropy


I was wondering if anyone could explain the underlying significance of the form of Boltzmann's formulation of entropy, S = k ln(omega), where omega is the number of microstates. Why is it that the entropy increases with the log of omega?

Well, it's simply a definition, although it's chosen so that it connects with the entropy defined in classical (non-statistical) thermodynamics.

Some properties which make it look like a good choice to me:

1) As long as the map from the number of microstates to the entropy is a monotonically increasing one (A > B implies log(A) > log(B)), the very important axiom "the system will be found in the macrostate with the most associated microstates" still translates to "the entropy will be at a maximum".
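
To spell that out (a one-line sketch in the thread's own notation): if macrostate 1 has more microstates than macrostate 2, i.e. omega_1 > omega_2, then ln(omega_1) > ln(omega_2), and therefore S_1 = k ln(omega_1) > k ln(omega_2) = S_2. Maximizing omega and maximizing S single out the same macrostate.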

2) It seems like a practical definition: you often encounter problems in which numbers of microstates multiply. For two independent systems A and B, every microstate of A can be combined with every microstate of B, so the combined system has omega_A * omega_B microstates. Since log(A*B) = log(A) + log(B), the entropy is additive in such problems: S_total = S_A + S_B.

 

Sorry for being so vague in point 2, but I don't have a good concrete example in mind right now; perhaps someone else has one.
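
One minimal sketch that might fill that gap (the coin setup and all numbers here are an assumed illustration of my own, written in Python): take two independent collections of coins, where a macrostate fixes the number of heads in each collection. The microstate counts multiply, so their logarithms, and with them the entropies, add.

from math import comb, log

# Subsystem A: 10 coins, macrostate "4 heads" -> C(10, 4) microstates.
# Subsystem B: 8 coins, macrostate "3 heads" -> C(8, 3) microstates.
omega_A = comb(10, 4)   # 210
omega_B = comb(8, 3)    # 56

# Independent subsystems: every microstate of A pairs with every
# microstate of B, so the microstate counts multiply...
omega_total = omega_A * omega_B   # 11760

# ...and the logs (hence the entropies S = k ln(omega), with k set to 1) add.
print(log(omega_total))               # 9.3724...
print(log(omega_A) + log(omega_B))    # the same value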

Either way: from the physics side, it doesn't really matter whether you take the log or not; it changes the equations, but it's still the same physical entity. Perhaps it's comparable to measuring temperature in Fahrenheit or Kelvin, only a tick more sophisticated.
