Posts posted by sioprequi

  1. Here are another forum member's ideas:

    [Thermodynamic entropy and work] is based on the [concept] of a "closed system". When you look at entropy you look at the total system. Entropy can 'decrease' in a subsystem as long as the total entropy of the entire system increases. It must be cyclic, and the overall entropy change then can be zero (part of a cycle can have a positive energy change, and another part an inverse, or negative, energy change, but energy is conserved: a negative part of the cycle is always balanced, and the overall change is zero, in a closed system).

     

    Another way to say the same thing is to look at the system (of interest) and the surroundings. The entropy of the system can decrease if the entropy of the system + surroundings increases.
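
    To put a rough number on that (my own Python sketch with made-up figures, not part of the quoted post): let a small amount of heat q flow from a hot body to a cold one. The hot body's entropy falls, but the cold body gains more entropy than the hot body loses, so the total still increases.

        # Sketch: a subsystem's entropy can fall while the total entropy rises.
        # Heat q flows from a hot body (T_hot) to a cold one (T_cold); the
        # numbers are illustrative only.
        q = 100.0        # joules transferred
        T_hot = 400.0    # kelvin
        T_cold = 300.0   # kelvin

        dS_hot = -q / T_hot            # hot side loses entropy: -0.25 J/K
        dS_cold = +q / T_cold          # cold side gains entropy: +0.33 J/K
        dS_total = dS_hot + dS_cold    # +0.08 J/K > 0, so the total increases

        print(dS_hot, dS_cold, dS_total)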

     

    Common misconceptions held by chemistry students

     

    Every year I teach my Advanced Chemistry course, I tend to run into the same misconceptions. Here are some of the more common examples.

     

    Entropy measures disorder - Entropy describes the number of ways to arrange a collection of particles or the energy of a system; it has little to do with disorder. A messy room doesn't have greater entropy than a neat room, and a shuffled deck of cards does not have more entropy than an ordered deck. Entropy is a useful way to describe the macroscopic behavior of a collection of microscopic particles. A shuffled deck is just another arrangement of the collection of cards that is exactly as likely as any other arrangement. As a bit of an aside here, it's important to remember that words matter. Using "disorder" to describe entropy allows religious fundamentalists to propose the preposterous idea that evolution violates the Laws of Thermodynamics.
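
    The "number of ways to arrange" idea in that quote can be made concrete with a toy calculation (my own Python sketch, using an Einstein-solid-style model that is not from the quoted teacher): count the arrangements of a few energy quanta among a few oscillators and take the logarithm.

        import math

        k_B = 1.380649e-23  # Boltzmann constant, J/K

        def multiplicity(N, q):
            """Number of ways to distribute q identical energy quanta among N oscillators."""
            return math.comb(q + N - 1, q)

        W = multiplicity(3, 4)     # 15 distinct arrangements of the energy
        S = k_B * math.log(W)      # Boltzmann's S = k_B ln W, about 3.7e-23 J/K
        print(W, S)

    Nothing in that count refers to how "messy" anything looks; entropy here is literally a count of arrangements.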

    Information is usually thought of as some message, or set of messages, that conveys meaning. That everyday sense has nothing to do with the information-theoretic notion of information.

     

    Order, or disorder, is a property that can be seen in something as simple as a deck of cards, but any meaning attached to a particular ordering is outside the realm of the theory. Shannon called his quantity information entropy, which is something of a misnomer; it is really a measure of uncertainty (or certainty) over probabilistically expected "messages".

    A message is any combination of symbols from some alphabet. Once the alphabet is defined, all messages have the same "meaning".

     

    Any order (shuffle) of a deck of cards is probabilistically equivalent to any other. Shuffling a deck, or spreading it out on a table, does not increase its entropy; it only changes the ordering. Disorder of the deck itself might be maximised by posting single cards to random destinations around the globe, say, or by throwing the deck into the air so it 'produces' some random ordering. But it's still a deck of cards; changing the order does not change the deck, or its entropic state. The notion of external information is the same kind of notion as that of external thermodynamic systems, in which we see energy disperse.
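
    A quick Python sketch of what "probabilistically equivalent" means here (my own illustration): under a uniform shuffle, the factory-sorted ordering and any scrambled ordering have exactly the same probability, and the entropy belongs to the distribution over orderings, not to whichever single ordering you happen to be holding.

        import math

        n_orderings = math.factorial(52)        # 52! possible orderings of the deck
        H_bits = math.log2(n_orderings)         # entropy of a uniform shuffle: ~225.6 bits

        p_sorted = 1 / n_orderings              # probability of the factory-sorted order
        p_any_shuffle = 1 / n_orderings         # identical probability for any other order
        print(H_bits, p_sorted == p_any_shuffle)

    Rearranging the cards only moves you from one equally likely ordering to another; the distribution, and hence the entropy, never changes.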

    Following the lead of Maxwell, who had modeled gas molecules as colliding billiard balls, Boltzmann argued that the second law was simply a consequence of the fact that with each collision nonequilibrium distributions would become increasingly disordered, leading to a final state of macroscopic uniformity and microscopic disorder.

     

    Because there are so many more possible disordered states than ordered ones, he concluded, a system will almost always be found either in the state of maximum disorder or moving towards it.

     

    As a consequence, a dynamically ordered state, one with molecules moving "at the same speed and in the same direction," Boltzmann ...asserted, is thus "the most improbable case conceivable...an infinitely improbable configuration of energy."

     

    Because this idea works for certain near equilibrium systems such as gases in boxes, and because science until recently was dominated by near equilibrium thinking, Boltzmann's attempted reduction of the second law to a law of disorder became widely accepted as the second law rather than simply an hypothesis about the second law, and one that we now know fails.
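
    The counting argument behind "so many more possible disordered states than ordered ones" can be illustrated with a toy model (my own Python sketch, using two-state "molecules" rather than actual velocities): the perfectly aligned macrostate corresponds to a single microstate, while the evenly split macrostate corresponds to an astronomically large number of them.

        import math

        N = 100                             # number of two-state "molecules"
        aligned = 1                         # microstates with every molecule in the same state
        balanced = math.comb(N, N // 2)     # microstates with an exact 50/50 split, ~1.0e29
        total = 2 ** N                      # all microstates, ~1.3e30

        print(aligned / total)              # ~8e-31: the "ordered" case is vanishingly rare
        print(balanced / total)             # ~0.08: the balanced split alone dominates

    Sampled at random, the system is essentially never found in the aligned configuration, which is the sense in which Boltzmann called it "an infinitely improbable configuration".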

    “We completely ignore the human value of the information. A selection of 100 letters is given a certain information value, and we do not investigate whether it makes sense in English, and, if so, whether the meaning of the sentence is of any practical importance. According to our definition, a set of 100 letters selected at random (according to the rules of Table 1.1), a sentence of 100 letters from a newspaper, a piece of Shakespeare or a theorem of Einstein are given exactly the same informational value.”
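
    Just to make the arithmetic in that quote explicit (my own Python sketch; the 27-symbol alphabet of letters plus a space is an assumption standing in for the quote's "Table 1.1"): under a uniform model, the information assigned to 100 letters depends only on the length and the alphabet size, never on what the letters say.

        import math
        import random
        import string

        alphabet = string.ascii_lowercase + " "        # assumed 27 symbols: 26 letters plus space
        bits_per_symbol = math.log2(len(alphabet))     # ~4.75 bits per letter

        def info_bits(text):
            return len(text) * bits_per_symbol         # same value for any text of equal length

        random_100 = "".join(random.choice(alphabet) for _ in range(100))
        meaningful_100 = ("to be or not to be that is the question whether tis nobler "
                          "in the mind to suffer the slings and arrows")[:100]

        print(info_bits(random_100), info_bits(meaningful_100))   # both ~475.5 bits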

     

    “Information is an absolute quantity which has the same numerical value for any observer. The human value on the other hand would necessarily be a relative quantity, and would have different values for different observers…..”

     

    “Whether this information is valuable or worthless does not concern us. The idea of “value” refers to the possible use by a living observer. This is beyond the reach of our theory…”
