# Fred56


1. ## Entropy and Information

When I learned about Shannon's theories, I was intrigued by the concept of information having entropy. The explanation, as I recall, went something like this: a message contains information, or has information content. This content may describe an expected or an unexpected event, and an unexpected message carries more information content than an expected one. This content is said to be a measure of the entropy of the information (Shannon entropy). A "message" here is both a concrete piece of information (e.g. encoded in binary) and any event (say a supernova, the decay of a particle, a change in kinetic energy, ...) which is encodable.

It is fairly intuitive that unexpected information has more content than expected information. For example, if someone woke you up in the morning and said "The sun came up" or "Breakfast will be ready soon", that is a lot less information than if they said "The house is on fire" or "There's a spaceship parked on the front lawn". The entropy of a particular message is measured by the number of bits needed to encode it (and I know the math works, because I've "done" it).

Thermodynamic entropy, by contrast, is a measure of change, specifically change in heat content, and is a statistical phenomenon. It is also connected to a much larger field of study called ergodic theory, which deals with how a system changes over time (often a large interval of time), and with measuring group behaviour, among other things. The units of thermodynamic, or classical, entropy are units of energy per degree of temperature (joules per kelvin). The units of information entropy are bits per message.

So just how are thermodynamic and information entropy related to each other? Are they the same thing, one expressed in physical units, the other in dimensionless bits? I have never really been able to reconcile this. Can anyone point out the blindingly obvious (which I must have missed)?
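The "unexpected messages carry more bits" idea above can be made concrete with a short sketch. The probabilities assigned to the two morning messages here are made-up illustrative values, not anything from Shannon; the function names are mine:

```python
import math

def surprisal_bits(p):
    """Self-information of an event with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def shannon_entropy_bits(probs):
    """Expected information per message: H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical probabilities for the two kinds of morning message:
likely = 0.99      # "The sun came up" -- almost certain
unlikely = 1e-6    # "There's a spaceship parked on the front lawn"

print(surprisal_bits(likely))    # tiny: an expected message carries ~0.01 bits
print(surprisal_bits(unlikely))  # large: the unexpected one carries ~20 bits

# A fair coin flip is the textbook unit case: exactly 1 bit per message.
print(shannon_entropy_bits([0.5, 0.5]))
```

Note that both functions are dimensionless (bits), which is exactly the puzzle posed above: the thermodynamic formula carries joules per kelvin, while nothing physical appears here.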
2. ## Time.

When discussing this particular topic, I think it pays to be aware of the following:

1/ We perceive the "flow" of time as a series of events, with intervals between them.

2/ Our perception of intervals of different lengths is due to our inner body clocks and timers. Several of these have been studied fairly extensively, including the well-known diurnal clock, which sends our bodies to sleep when it's dark and wakes us up when it gets light.

3/ Our language has evolved around our experience of the world, and so contains a lot of words that have a sense of, or some connection to, time itself.

4/ With the above in mind, and being careful of the difficulties with language (it doesn't offer a "time-independent" viewpoint, because of its dependence on concepts of time and its passage), trying to understand what time is turns out to be quite problematic.

5/ We all know what time is in an innate sense. We know that time can be represented by a line, just like any other dimension, but we also know that time doesn't move in any particular direction (it doesn't depend on which way the Earth is spinning, for instance). Instead it "increases everywhere" (ignoring Einstein for now) and appears, at least here on the Earth's surface, to flow at a fairly constant rate; it doesn't speed up or slow down dramatically. There is, however, some variation in our personal experience of the rate of flow: this depends on the same internal biological timers, which are affected by things like our current alertness and concentration, even by whether we have eaten recently, among others.

So, let's talk about this thing...