
Entropy: order or disorder?


I have some difficulty making sense of entropy. I mean, if I inspect a system that is working under changing physical circumstances, like increased or decreased temperature, the system constantly tries to adapt to the new circumstances to maintain equilibrium.

 

The question is why we call the state of equilibrium the state of maximum entropy (or maximum disorder), when it seems to be the state of maximum order in proportion to the physical circumstances in which we inspect the system. I mean, as I understand it, an isolated system will always approach equilibrium. But why do we call this state maximum disorder? In that equilibrium I can predict the correlations between the components of the whole system (like the density of an NaCl solution). But if we speak about maximum disorder, wouldn't that mean that the system behaves chaotically and I cannot predict the properties of the system?

 

 

Where my confusion originates:

 

Does information increase entropy?

 

If I inspect a human body and all of its molecular functions, the system is in constant change and motion. It is a very well-orchestrated order, but disorder of the system is more likely to happen, presenting itself as disease. In this case I would say that the entropy of a system increases with the amount of information present in its operations.

 

Another example: the concentration of white blood cells when a bacterium is present in the circulation. The equilibrium of the overall white blood cell concentration in the system has changed (disorder increased), so I would say entropy increased. As soon as the bacterium is eliminated, the system returns to equilibrium, where the WBC concentration is more balanced. Order increases. Or does it?

 

What do I misunderstand about entropy?

Edited by 1x0

 

1x0

Where my confusion originates:

 

It depends on what you understand the words entropy, order, and disorder to mean.

 

 

1x0

But if we speak about maximum disorder, wouldn't that mean that the system behaves chaotically and I cannot predict the properties of the system?

 

That's a rather extreme view.

Why would disorder being at a maximum make a system completely unpredictable?

Could there not be a scale of predictability?

Why do people always pick complex systems to try to understand entropy?

 

Take a box with a partition down the middle separating two differing gases. The difference between the gases is arbitrary: it could be temperature, composition, or even colour. It is in equilibrium, but there is only one way to 'organize' this separation.

Now remove the partition from the box and the gases quickly arrive at a new equilibrium, whether a median temperature, a mixed composition, or even a mixed colour. However, there are now a multitude of ways, or 'organizations', for this mixing.

This increase in the number of ways to organize the system, or degrees of freedom, is a measure of the increase in the entropy of the system. (Note also that with gases of differing temperatures, work can be done, but after the partition is removed and equilibrium is reached, no more work can be done, as there is no temperature difference.)
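
To put rough numbers on that counting argument, here is a minimal Python sketch (my own addition, not part of the original explanation). It assumes the molecules are distinguishable and tracks only which half of the box each one occupies, then applies Boltzmann's relation S = k ln W, where W is the number of arrangements; the function names and particle counts are purely illustrative.

```python
import math

# Toy version of the partitioned box: n_a molecules of gas A, n_b of gas B.
# Only the left/right position of each molecule is counted; momenta and
# positions within a half are ignored (a deliberate simplification).

k_B = 1.380649e-23  # Boltzmann constant, J/K


def ln_ways_separated(n_a: int, n_b: int) -> float:
    # Partition in place: every A molecule is on the left, every B on the
    # right. There is exactly one such left/right arrangement, so ln W = 0.
    return math.log(1)


def ln_ways_mixed(n_a: int, n_b: int) -> float:
    # Partition removed: each of the n_a + n_b molecules can sit in either
    # half, giving W = 2 ** (n_a + n_b) arrangements.
    return (n_a + n_b) * math.log(2)


n_a = n_b = 100
delta_S = k_B * (ln_ways_mixed(n_a, n_b) - ln_ways_separated(n_a, n_b))

print(f"ln W with the partition:    {ln_ways_separated(n_a, n_b):.2f}")
print(f"ln W without the partition: {ln_ways_mixed(n_a, n_b):.2f}")
print(f"Entropy increase k_B * delta(ln W): {delta_S:.3e} J/K")
```

Scaled up to a mole of each gas, the same counting gives the familiar entropy of mixing of roughly R ln 2 ≈ 5.8 J/(mol·K) per mole of gas, which is one way of putting a number on 'more ways to organize the system'.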

 

This is what we mean by order and disorder, and how entropy increases in irreversible processes.

Edited by MigL
