# A mass of information

## Recommended Posts

Martin, since relativistic mass is the sum of rest mass and kinetic energy, and is used in the same equations as mass is, I think it is fair to write it in terms of mass or energy...

Your photo there is one of Albert Einstein sticking his tongue out. He didn't like the concept of 'relativistic mass' and said that it should not be taught.

Look, I am not a semantic crusader. I will go with whatever Swansont says the appropriate definitions are, if he chooses to say. We should try to all have the same meaning of basic words.

I PREFER the usage (which in my experience grad level and up particle physicists share) where mass has its root meaning of INERTIA and the quantity of inertia is independent of direction. That means for inertia to be defined the body must be at rest.

A body that is moving does not have a well-defined inertia, because its acceleration in response to a force depends on direction. If you will, it has a "longitudinal inertia" and a "transverse inertia", and in-between amounts of inertia for directions of force in between. But that is too complicated.

Inertia is a deep idea, almost a mystery (the Higgs field and all that). The equivalence between an object's gravity (its interaction with the geometry around it) and its inertia is a mystery.

So I prefer the pure idea of mass as inertia (of an object at rest, because otherwise inertia is directional and not well-defined), and I think of this as logically cleaner.

Then E = mc^2 really says something! And it says it about an object at rest.

If you heat a cannonball up, its inertia increases---and consequently its weight.
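For scale, one can put rough numbers on this. The cannonball figures below are assumptions chosen purely for illustration, not anything from this thread:

```python
# Rough illustration: mass increase of a heated cannonball via E = mc^2.
# All cannonball parameters are assumed values for the sake of the example.
c = 299_792_458.0          # speed of light, m/s
mass = 10.0                # kg, assumed cannonball mass
specific_heat = 449.0      # J/(kg*K), approximate value for iron
delta_T = 500.0            # K, assumed temperature rise

heat_energy = mass * specific_heat * delta_T   # Q = m * c_p * dT, in joules
delta_m = heat_energy / c**2                   # mass equivalent of the added heat

print(f"added heat: {heat_energy:.3e} J")
print(f"mass increase: {delta_m:.3e} kg")
```

The increase comes out around 2.5e-11 kg: far too small to weigh, but nonzero, which is the point.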

But evidently you do NOT prefer to keep the ideas pure like this. You have various kinds of mass in mind, and you moosh together the idea of mass with the idea of energy. I think that muddies the water and makes it more difficult to see what are the most interesting fundamental questions.

So, unless Swansont wants to lay down some official website definition, I will just be quiet, and when Fred says "a photon has positive mass" I will let that pass without comment.


I intermittently lose focus in the discussions, as they seem to take place on different planes at the same time.

I thought the original topic "mass of information" aimed to understand mass in terms of information, or alternatively as a property of information. This is bound to be muddy but interesting. Perhaps the first step is to dissolve the pre-existing ideas and then try to regroup.

IMO, at the information/mass level we are probably below spacetime; then suddenly the concept of a "photon" enters the discussions, a concept which is defined in a different set of ideas and formalism, and which IMO is a higher-level construct, but belonging to another construction. So what are we talking about?

If we regroup and want to reconstruct the universe again, I think it's very confusing to mix into it objects defined elsewhere, as the connection is unclear to me at least.

I think if we are to start discussing inertia and information in a reconstruction, the photon, as well as all other existing concepts built on the standard formalisms and models, needs to be reconnected to the reconstruction. At least that's how I picture it.

What is the meaning of a photon if we have dissolved spacetime?

/Fredrik

I belong to those who wish to reconstruct the most fundamental concepts from minimal first principles, and then see why new concepts/structures, as complexity allows, are unavoidable inductions. It seems we may disagree about what should be fundamental?

If we start out by accepting the notion of 3D space, then I have personally lost track from the outset. I think we need to see how the space we apparently see can be induced, and ask about the certainty of that induction. It seems that at the logical level one would want to assign measures similar to inertia and momentum, and thus to find a proper information-theoretic connection to these concepts, just as we try to find information-theoretic explanations of entropy, which has also proven non-trivial.

/Fredrik

##### Share on other sites

I PREFER the usage (which in my experience grad level and up particle physicists share) where mass has its root meaning of INERTIA and the quantity of inertia is independent of direction. That means for inertia to be defined the body must be at rest.

My usage of rest mass and relativistic mass stems more from the fact that they are well defined and universally agreed upon (as far as I know). Mass is ambiguous because some people use it to refer to rest mass and others use it to refer to relativistic mass.

My current stance is to avoid using mass, since I understand energy and momentum better. I try to replace mass with these whenever possible, though I don't expect others to do this. This may change as I learn more.

A body that is moving does not have a well-defined inertia because its acceleration in response to force depends on direction. if you will it has a "longitudinal inertia" and a "transverse inertia". And in-between amount of inertia for directions of force that are in between. But that is too complicated.

I hadn't thought about this. My preference is $F = \frac{dp}{dt}$ which I understand is always valid rather than $F = ma$ where m might change at relativistic speeds. I suppose it would depend on what sort of problems I am doing.
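For what it's worth, the direction dependence Martin mentions can be made concrete: for a force parallel to the velocity the effective inertia is gamma^3 m (the old "longitudinal mass"), while perpendicular to it the effective inertia is gamma m ("transverse mass"). A quick sketch with illustrative numbers:

```python
import math

# Direction-dependent inertia in special relativity (illustrative numbers).
# Force parallel to the velocity:      a = F / (gamma^3 * m)  ("longitudinal mass")
# Force perpendicular to the velocity: a = F / (gamma * m)    ("transverse mass")
m = 1.0          # rest mass in kg (arbitrary example)
v_frac = 0.9     # speed as a fraction of c (arbitrary example)

gamma = 1.0 / math.sqrt(1.0 - v_frac**2)
longitudinal_mass = gamma**3 * m
transverse_mass = gamma * m

print(f"gamma = {gamma:.4f}")
print(f"longitudinal mass = {longitudinal_mass:.4f} kg")
print(f"transverse mass   = {transverse_mass:.4f} kg")
```

At 0.9c the two "masses" already differ by a factor of gamma^2, a bit over 5, which is why a single velocity-dependent mass in F = ma stops being well-defined.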

But evidently you do NOT prefer to keep the ideas pure like this. You have various kinds of mass in mind, and you moosh together the idea of mass with the idea of energy. I think that muddies the water and makes it more difficult to see what are the most interesting fundamental questions.

That's not true. I am a great believer in Einstein's idea that "Everything should be made as simple as possible, but not simpler." As I understand it, gravitation is based on energy/relativistic mass (such that photons create a gravitational field), and rest mass can always be converted into energy (even if with great difficulty, by using fission/fusion, annihilation, or black holes). I only treat them as equivalent because I see little difference -- other than the difficulty of converting rest mass into energy. If there is some fundamental difference, I will treat them separately.

----

Sorry if I sound like a prick, but definitions are very important to me. In fact, definitions, laws of physics, and the math to understand them are the only things I focus on remembering in physics. If I get my definitions wrong, how can I hope to apply the laws that use them?

##### Share on other sites

What you say in the main strikes me as reasonable, and having stated my own preferences I will not quarrel with it. Then at the end you say

Sorry if I sound like a prick, but definitions are very important to me. In fact, definitions, laws of physics, and the math to understand them are the only things I focus on remembering in physics. If I get my definitions wrong, how can I hope to apply the laws that use them?

How it sounds is probably irrelevant. This is an area where I see eye-to-eye with you. I care about definitions which are measurable, or operational.

If it is a physical quantity, then I try to visualize just what it means in terms of how to measure it---a verbal definition in terms of generalities belongs more to a philosophy discussion---and I think we may well agree on that.

##### Share on other sites

Then E = mc^2 really says something! And it says it about an object at rest.

If you heat a cannonball up, its inertia increases---and consequently its weight.

What about all the photons (of energy) that got together at some point to 'make' the iron atoms (in the cannonball)?

Comment: my interchanging 'mass' and 'mass equivalent' (saying information isn't massless, in the OP) is perhaps questionable (in a thermodynamical sense), but surely not interchanging 'energy' and 'mass', since they are (meant to be) equivalent? This appears to be the 'rub', if you will.

Really the only conclusion from $E=mc^2$ is that mass and energy appear as two different aspects of the same thing. Maybe it should be given a different name---manergy, or enessence, or whatever. But the two are definitely the 'same' thing, in two roles, as it were. There are two actors out on loan. They can switch roles if they 'wish' to, before our very eyes; at least, this is very much what the 'trick' looks like.

I am only trying to point this out. I also don't think we are going to get around any anthropomorphic viewpoints, or that it might be especially important to do so...

##### Share on other sites

This guy obviously can't say "photons are massless particles with a mass equivalent" six times fast.

##### Share on other sites

Suck it in dude... Whoa, looks like he sucked in a bit too much.

He's gone then, it's safe to come out...

I personally was offended by his aberrant use of 'the Kyngge's Englysshe'...

You can imagine a photon as a traveling wave, or as having a 2d surface that projects at right angles to the direction of travel. There are 2 degrees of freedom for a photon along this traveling 2d surface (let's say it looks like a little circle), and there are 2 components, an electrical and a magnetic component, which 'resolve' into a momentum. The wave is said to collapse when this happens. But this is a model, an idea of what a photon being absorbed or emitted 'looks like'.

A 2d surface can't have volume, so volume is meaningless in such a space, but area isn't. The radius for photons in this (mathematical) space is never more than the same constant value (but the two components vary sinusoidally about a zero point, so that the area also cycles this way, from zero to a constant value, and the cycle time, or frequency, determines the energy of a particular photon); this maximum constant amplitude appears to be related to its apparent velocity, somehow.

The energy is not related to the distance traveled, unless the photon interacts with another photon (or an electrical or magnetic field, or collides with an electron or other charged bit of matter). In other words you could say that the energy in a photon is bounded by (integrable over) a single period of its cycle (or something similar), like a packetised bit of energy, rather than the integral of all the periods it has cycled through on its journey.
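Setting the geometric picture aside, the standard relation behind "the cycle time, or frequency, determines the energy" is Planck's E = hν. A quick check, using a green-laser wavelength as an arbitrary example:

```python
# Photon energy from frequency via Planck's relation E = h * nu.
h = 6.62607015e-34      # Planck constant, J*s (exact in the SI definition)
c = 299_792_458.0       # speed of light, m/s

wavelength = 532e-9                 # m, a green-laser wavelength (example choice)
frequency = c / wavelength          # about 5.64e14 Hz
energy_J = h * frequency            # joules per photon
energy_eV = energy_J / 1.602176634e-19

print(f"frequency: {frequency:.3e} Hz")
print(f"energy per photon: {energy_J:.3e} J = {energy_eV:.3f} eV")
```

Each green photon carries a bit over 2 eV, regardless of how far it has traveled, which matches the point above that the energy is a per-cycle "packet", not an integral over the journey.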

Growing and gathering store

The idea that our knowledge is a store, like some record which has been "carved in the sides of great mountains", say, for all time, is whimsical at best, as any 'knowledgeable' man should realise. In spite of our hope, all knowledge --all records and symbols and argument-- is completely useless, meaningless or absurd without a mind to understand it. The store, ultimately, is in our heads.

Knowledge (of mathematics) is considered both in a static "knowable" sense, and in a growing or evolving "unknown" sense. All knowledge has this static/changing nature. It is up to us to observe and record, but the record, 'external and internal', is necessarily a work in progress, and one we will never complete: there is, for an observer with a limited, or "defined and bounded", connection to an external world, no such thing as complete knowledge (of any thing). The only "complete knowledge" we have available, in that sense, is that information will never be complete; there will always be uncertainty in our measurement. This "observer" uncertainty, an inherent characteristic, is mirrored in an inherent uncertainty in the fine structure of the world --something we have not known about for "very long". This discovery has changed (and is changing) the way we look at things.

We check the store's inventory constantly, and evaluate the "storability" of new information; we sort and gather, strew and scatter, connecting and severing the links that become a lasting record on the "wallpaper of the mind". But if the mind were paper we would not need to cut down so many trees, perhaps. The external record, in which we place greater faith, is more a convenient way to compare individual knowledge, and lets a group of observers compare individual knowledge (observation) more efficiently.

Thus the group mind has more status, generally, as the selector agent of persistent records. Today the external symbolic record is considered large enough and stable enough to provide (static, learned) knowledge for a very large group, but none of it makes any sense at all to, say, a whale, or an amoeba, or any of the many other kinds of observers. We are one of the few species to extend their learning this way (and the most successful at doing it). But the external record is transient (it has its own entropy).

"She tells Max to stay, when the class has gone away,

so he waits behind.

Writing 50 times, 'I must not be so', oh oh oh.

But as she turns her back on the boy,

he creeps up from behind..."

A comment about that mass stuff:

Mass is defined several different ways in physics, and so is energy: there is potential and kinetic energy (and chemical, nuclear, and thermodynamic, ...).

Einstein concluded that energy is a scalar quantity which is always conserved, and that photons (light waves or particles) are energy, and have zero (rest) mass.

Rest mass is an extrapolation (introduced in the early 20th century by de Broglie), which can never be determined empirically (directly), because nothing is ever at rest.

There is relativistic mass, intrinsic mass, rest mass, inertial mass; there is, however, only one kind of radiation energy. Photons have a mass equivalent. Cosmologists refer to energy rather than mass (maybe they do this because they are 'the same thing').

This is my position too; sorry if anyone has got all confused about it, but I thought the "issue" had been settled, like, 90-odd years ago...

If someone can illustrate what's important about saying "photons have zero mass", I'll move it off the page with all the other things a photon "doesn't have" (which can't be all that useful a way of describing anything). Dogs don't have wings, and cows don't jump over houses; the Moon (apparently) is not made of green cheese. This is all useful information, for some reason or other...

##### Share on other sites

• 2 weeks later...

There's a lot of information in the world around us.

Cosmologists estimate that about 95% of it is in the form of radiation, and space is filled with this background energy, radiating in all directions. This information, however, only becomes real when an observer acquires it and converts it into more information, which the observer then maintains at some level, in an abstracted form --photons of light become neural signals (in animals that have cells which react to light this way), are processed and augmented in various biochemical ways, and are maintained at some cost.

All this requires ongoing change, which must be regulated somehow, to avoid spending too much energy or effort on any particular information. Energy is available to living things from their environment, and they convert their own "information store" (themselves) into available energy, using various labile compounds that bind together reversibly; and also by exploiting the tension that the bonding in stereochemical compounds exhibits.

Life has learned to capture and harness ongoing biochemical reactions; it is essentially a vast, complex, self-regulating and self-assembling phenomenon. Life exhibits purpose, or rather this behaviour (purposefulness, among others) emerges from this "structured entropy".

Structure is also a requirement, or a necessary reality. An organism needs to be contained, or compartmented, somehow (life isn't something that spreads out or diffuses like a gas or a liquid, it "stays together", or has an interface or boundary). This structuring is an important part of Life's ability to "emerge" from a background of matter and energy.

There is information in this structuring (it requires ongoing repair and extension), and there is information which represents the activity that ensues (chemical and electrical), upon this structure. Life is like a player or actor upon its own stage (built of information --energy or its equivalent).

##### Share on other sites

So here we are, processing the film reel as it unwinds, with our own individual cameras recording their own reels, in the reverse sense, as it were, to the one playing on the screen in front of us. The idea of sampling some input from the world framewise might be replaceable (nowadays) with the idea of a solid-state device, one with a 2-level cache and no really permanent storage (as soon as it's switched off, the data is lost). And reality still 'despools' as we 'spool' the visual world into our individual memories.

The sampling is continuous, but still limited by the ability of the system (the solid-state video camera) to process and encode images; framing seems to be a natural way to do this, as does salient image processing to extract features such as outlines, distance (spatial) information, and colour. All of these are common features on modern video cameras.

These all sound a lot like what happens both inside such a modern electronic device and in our own brain: there is capture (of images, on an integrated matrix or surface of receptor elements), there is processing and encoding, and there is multilevel storage and representation (abstraction). Partitioning is also a natural mathematical approach to analysing the behaviour of such systems (especially the organic one), and in particular the way information "moves around", or undergoes changes in momentum and entropy. Information, in a brain, is maintained by, or in a larger sense is, the energy needed to maintain it.

##### Share on other sites

External records, messages that are representative, or abstracted information from a human mind, represent absolutely zero information (content) in themselves, because they are a channel.

We can't receive any information from a book written in a language we don't understand. So the information, in that sense, isn't in the book (you might be able to get another book of exactly the same make, with the same pages, the exact weight and same amount of ink, but different symbols in it).

The information is what ends up in (more than) one mind from all those written words, or spoken words or sounds, music, images (we have very developed visual apparatus), and so on. The channels are noisy; we actually receive only tiny amounts of energy as information from the world, which we must then expend energy making into some kind of internal record, which we know is transient. This is another well-known practical lesson in the existence of entropy (we even measure it): we are obliged to recall and review our knowledge constantly, and a group of minds can do this more effectively.

So books and words, and all methods of acquiring and understanding that we have in the external world (all the computers that analyse characters, or translate languages, or search for combinations of strings of bits), are all channels, analytical in the Shannon sense, and Shannon's ideas of communication (and the entropy changes involved) come into play.

We have ideas of certain measurement, which logically include the idea of no measurement (zero or no result) and all measurement (infinite or complete results), and we are forced to abstract these notions as mere vague symbols, as ideas of something that is beyond (all ideas). We can't communicate using 'nothing': communication, by definition, requires something, which is the channel for that information. We use the same idea for the channel as we do for the messages it 'transfers' from and to us and the world, or each other. We say the representation is also the information, but it is not so.

We can understand the difference between a channel and the information it carries.

A copper wire is a channel that can carry several different messages at once (a multiplexed channel). But the channel by itself can only modulate any message (it always has a certain noise level, or limit to the amount of information it can transmit); it can never provide a message itself. The messages require the medium (the copper wire), but are distinct from the channel. Books do not 'contain' information; they transmit the information that the words (that are understood, or decoded) contain as valid and syntactic elements (frames).
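That "limit to the amount of information it can transmit" has a quantitative form in Shannon's channel-capacity theorem, C = B log2(1 + S/N). A sketch with assumed figures (a 3 kHz voice line at 30 dB signal-to-noise is an illustrative choice, not a measurement):

```python
import math

# Shannon-Hartley capacity of a noisy channel: C = B * log2(1 + S/N).
# Bandwidth and SNR values below are assumptions chosen for illustration.
bandwidth_hz = 3000.0        # ~3 kHz, a classic voice-grade copper-line bandwidth
snr_db = 30.0                # assumed signal-to-noise ratio, in decibels

snr_linear = 10 ** (snr_db / 10)                     # 30 dB -> factor of 1000
capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)

print(f"channel capacity: {capacity_bps:.0f} bits/s")
```

With these numbers the wire can carry about 30 kbit/s no matter how cleverly the messages are multiplexed, which is exactly the sense in which the channel limits, but never supplies, the message.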

A single photon, or its measurement, cannot represent anything more meaningful than a "Schrodinger's cat" state.

An indeterminate state -will the message arrive?- is the only possible outcome if there is no known channel to measure or receive communication from. If (1) there is no channel, then (2) there is only a possibility of there being one at some point, so a single photon cannot send any information except this: "yes, there is a channel". If the single photon is also encoded (say it's polarised at a certain angle), this can be a message (of 1 bit), a yes/no message (these days it's common to encode single photons using more than one quantum state 'variable', which is how quantum encryption is done).

But you can't send a message with zero photons (or no bits), and a 0 bit isn't 'nothing', it has to be a real physical signal of some kind.

##### Share on other sites

Quantum information processing is waiting in the wings:

"QM plays a significant role in the operation of the laser, the FET, and classical SFQ logic, but none of these are coherent quantum devices. [T]hey do not preserve and exploit quantum mechanical phase information. Accordingly, they cannot provide the parallelism which leads to exponential speedup in a quantum computer." --Uncle Scrooge

"As a generalization ...think about ANY AQC operating on a hard (ie exponentially small gap) problem. Is there any physical system whose temperature is smaller than the gap at an anti-crossing of a hard problem? At an anticrossing, the temperature is ALWAYS going to be orders of magnitude larger than the gap. That's why inclusion of a thermal environment is ...[needed] in order to analyze how to operate an AQC (although note that at the anticrossings it's not really adiabatic anymore).

Qualitatively, the effect of the large temperature is to thermalize the two energy levels involved in the anticrossing, reducing the probability of success by 1/2, which is of course completely acceptable. The qubits are compound [Josephson] junction RF squids. The tunneling matrix elements for each qubit can be controlled by varying the flux applied through the CJJs for each qubit.

This approach is well-known and is centrally featured in the superconducting AQC papers I've linked to here. As I mentioned earlier the Hamiltonian is of the X+Z+ZZ type. Notice the X? ...From the theory perspective, adding environments qualitatively changes the behavior of the system. I don't believe that even this simple point is widely understood.

There are lots of things like this where computation and physics are related in non-trivial ways, and where cross-overs between classical and quantum behavior may affect computational scaling in a way that isn't just either/or. Also the system we're building is [not] going to exponentially speed up anything. The objective is the quadratic speed up for unstructured search. ... The way we operate our AQCs is like this:

($X_i$ and $Z_i$ are the Pauli X and Z matrices for qubit $i$):

(1) Turn up the tunneling term in the Hamiltonian to its maximum value

($H=\sum_i \Delta_i X_i$)

(2) Slowly turn the qubit biases and coupler strengths up to their target values (these define the particular problem instance);

after this process the Hamiltonian is

$H=\sum_i (\Delta_i X_i + h_i Z_i) +\sum_{ij} J_{ij} Z_i Z_j$

(3) Slowly turn the tunneling terms off; after this the Hamiltonian is

$H=\sum_i h_i Z_i +\sum_{ij} J_{ij} Z_i Z_j$

(4) Read out the (binary digital) values of the qubits

OK so the point of this is that the qubits are only read out when they are in classical bit states by design. The readout devices are sensitive magnetometers called DC-squids which sense the direction of the magnetic field threading the qubit and hence its bit state. The computational model is explicitly set up so that superposition states are used only during the annealing stage; the readouts never fire during this step.

Answers are encoded in bit strings. Each bit string corresponds to a particular solution.

If the computation succeeds, the bit string returned $\{s_i\}$ will minimize the energy

$E=\sum_i h_i s_i +\sum_{ij} J_{ij} s_i s_j$."

--Geordie

--superconducting.blogspot.com Jan 2007
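The readout in step (4) is looking for the bit string that minimizes that final classical Ising energy. For a handful of qubits that minimization can simply be brute-forced; the h and J values below are made-up example numbers, not anything from the blog:

```python
from itertools import product

# Brute-force minimization of the classical Ising energy from the quote:
#   E = sum_i h_i * s_i + sum_{ij} J_ij * s_i * s_j,  with s_i in {-1, +1}.
# The biases h and couplings J are arbitrary example values.
h = [0.5, -1.0, 0.2]                    # qubit biases h_i
J = {(0, 1): -0.8, (1, 2): 0.6}         # pairwise couplings J_ij

def ising_energy(s):
    energy = sum(h[i] * s[i] for i in range(len(s)))
    energy += sum(J_ij * s[i] * s[j] for (i, j), J_ij in J.items())
    return energy

# Enumerate all 2^n spin configurations and keep the lowest-energy one.
best = min(product((-1, 1), repeat=len(h)), key=ising_energy)
print("ground state:", best, "energy:", round(ising_energy(best), 6))
```

Of course the brute-force search is exponential in the number of qubits; the whole point of the annealing procedure above is to reach the same minimizing bit string without enumerating the configurations.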

##### Share on other sites

I guess this thread has turned blog-like. I think the association to computing, and in particular _computing time_, is a good one.

One can consider transformations as more or less efficient, and I think it might be possible to describe spontaneous structure formation in terms of a diffusion of information between structures, where the feedback implies a selection of efficient structures.

So what starts as the brute force of computing, like a random testing of all describable options, will randomly restructure, and clearly the more efficient structures are more likely to survive.

I'm trying to see how such a description might possibly explain the emergence of the complex amplitude formalism. I've got a strong feeling that the mathematical relation between the dual spaces is selected for its special properties in this way -- if this works, it implies an interesting connection between these information ideas and mathematics itself. I've got some ideas that are getting close to being possible to formulate in terms of a testable formalism.

In a few months I hope to have some more insight into this.

/Fredrik

##### Share on other sites

Blogging seems to be the way it goes sometimes, I guess. My previous comments about this, and my "thinking out loud" approach, ensue from the fact that people don't always respond, so I just keep going, as it were. The above from superconducting.blogspot.com is about a quantum processor they are designing, and I will no doubt stay with this story...

Efficiency certainly seems to be a natural thing in any system that commutes or changes energy state, or microcanonical partition-wise states. Relaxation of coupled oscillations, and superposition (of both quantised and analog waveforms), usually 'tries' to find a state of lowest energy, or it 'self-adjusts', or whatever. Randomness and chaos appear to be linked to this somehow, also.

A surface 'appears', and we seem obliged to look at its structure and sometimes this looks quite complex.

##### Share on other sites

• 2 weeks later...

Light is energy:

Science 14 December 2007:

Vol. 318. no. 5857, pp. 1748 - 1750

DOI: 10.1126/science.1149066

Reports

Stored Light in an Optical Fiber via Stimulated Brillouin Scattering

Zhaoming Zhu,1 Daniel J. Gauthier,1* Robert W. Boyd2

We describe a method for storing sequences of optical data pulses by converting them into long-lived acoustic excitations in an optical fiber through the process of stimulated Brillouin scattering. These stored pulses can be retrieved later, after a time interval limited by the lifetime of the acoustic excitation. In the experiment reported here, smooth 2-nanosecond-long pulses are stored for up to 12 nanoseconds with good readout efficiency: 29% at 4-nanosecond storage time and 2% at 12 nanoseconds. This method thus can potentially store data packets that are many bits long. It can be implemented at any wavelength where the fiber is transparent and can be incorporated into existing telecommunication networks because it operates using only commercially available components at room temperature.

1 Duke University, Department of Physics, Box 90305, Durham, NC 27708, USA.

2 The Institute of Optics, University of Rochester, Rochester, NY 14627, USA.

* To whom correspondence should be addressed. E-mail: gauthier@phy.duke.edu
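If one assumes the readout efficiency decays exponentially, eta(t) = eta0 * exp(-t/tau) (an assumption for illustration; the abstract does not state the decay law), the two quoted data points imply an acoustic lifetime of roughly 3 ns:

```python
import math

# Rough estimate of the acoustic-excitation decay time from the abstract's two
# data points, assuming simple exponential decay eta(t) = eta0 * exp(-t/tau).
# (The exponential model is an assumption, not a claim from the paper.)
t1, eta1 = 4e-9, 0.29     # 29% readout efficiency at 4 ns storage
t2, eta2 = 12e-9, 0.02    # 2% readout efficiency at 12 ns storage

tau = (t2 - t1) / math.log(eta1 / eta2)
print(f"estimated acoustic decay time: {tau * 1e9:.1f} ns")
```

That ~3 ns figure is consistent with the abstract's statement that storage time is limited by the lifetime of the acoustic excitation.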

##### Share on other sites

Light is energy:

Stored Light in an Optical Fiber via Stimulated Brillouin Scattering -- Zhu, Gauthier, and Boyd, Science 318 (5857), 1748-1750, 14 December 2007 (quoted in full in the previous post).

What's your point here? Do you know what Brillouin scattering is?

##### Share on other sites

Light is energy:

Light possesses energy. Nothing in that abstract says otherwise. Acoustic excitations possess energy, too.

##### Share on other sites

There you go

Light or photons, is that E stuff.

##### Share on other sites

There you go

Light or photons, is that E stuff.

There who goes?

"light is energy" <> "light possesses energy"

##### Share on other sites

Quite - which is it?

Is this "model" -a thing that is energy, i.e. $E = h\nu$;

or is it a bag, or 'container' that carries it around? A shopping bag for the energy-fairies? 'Scuse the humour, it just kind of comes out all by itself.

There who goes?
An energy fairy with a little bag (to carry a photon's energy around)...('chuckle', come off it)
storing sequences of optical data pulses by converting them into long-lived acoustic excitations
i.e. the energy is stored as sound excitations

Do you know what Brillouin scattering is?

Aren't you going to tell us?

P.S. What does the following have to do with this? Can you see a connection? (I can.)

A single photon, or its measurement, cannot represent anything more meaningful than a "Schrodinger's cat" state.

"There is something about our minds that is non-computable: NP-complete, something that is beyond the realm of computation.

So we can 'know' things other than through algorithms, sort of related to Gödel's famous theorem. The only thing that can give us this non-computable element in nature is a process that is not deterministic."

There are lots of things like this where computation and physics are related in non-trivial ways, and where cross-overs between classical and quantum behavior may affect computational scaling in a way that isn't just either/or.

----

Now for a bit of sailing

Channels are everywhere, and the idea of a channel is a fairly simple one.

The English Channel is a body of water that flows; it is also a geological (and submarine) feature, a gap or separation between bodies of land. So, like a river: a static channel through which something flows -water. The water can be said to be the 'information'; the English Channel commutes it between the North Atlantic and the Irish Sea.

This body of water will carry, or commute, any object caught up in its flow: that is, anything that floats, or doesn't sink to the seafloor. So the English Channel's geology (its structure) is a commuter -of water, and anything that the water carries.

The water will modulate and translocate or displace any 'message': a sealed, empty bottle, a boat, a log of wood, a Nike trainer, anything with air in it. It's transitive and commutative.

Also, as a natural gap between two bodies of land, there's significant human traffic across this same body of water. Trade, people, and vehicles all travel across, or use, this natural barrier as a channel. There's even a fixed 'route' (or channel) for this commutation of people and goods: a tunnel, which avoids the natural liquid one.
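The river analogy above maps directly onto Shannon's channel model. Here is a minimal sketch of a noisy binary channel; the flip probability and message are made-up illustration values, not anything from the thread:

```python
import math
import random

def binary_entropy(p):
    # H(p) = -p log2 p - (1-p) log2 (1-p), in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    # Capacity of a binary symmetric channel: C = 1 - H(p) bits per use
    return 1.0 - binary_entropy(flip_prob)

def transmit(bits, flip_prob, rng):
    # The channel carries every bit, occasionally corrupting one in transit
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
received = transmit(message, 0.1, rng)

c_clean = bsc_capacity(0.0)  # 1.0 bit per use: a noiseless channel
c_noisy = bsc_capacity(0.1)  # ~0.53 bits per use once noise enters
```

The point of the sketch: the channel itself is static structure, and its capacity drops smoothly as the "water" gets rougher.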

##### Share on other sites

This isn't a blog.

##### Share on other sites

It isn't not a blog too.

Why don't you blog off, and go blog someone else?

##### Share on other sites

Shannon's 'information entropy', which should more properly be called (un)certainty, has a symmetry with our thermodynamic model of heat, despite the perceived problems with equating the two (which are related to the observation that heat is physical and 'information' isn't). Entropy, a measure or metric of disorder in some system, presents a level of uncertainty to any observer who wishes to measure it.

Information in fact reduces an observer's uncertainty about a system, and so reduces the informational entropy (it doesn't add to it; the thermal or physical entropy doesn't change, just our knowledge of it).

The more information (the more content in any message), the more reduction in uncertainty occurs. Information has energy. The equations show this symmetry, so it must behave in an equivalent way...
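That reduction can be made concrete with Shannon's formula. A small sketch (the eight-sided-die scenario is my own illustration, not from the thread):

```python
import math

def shannon_entropy(probs):
    # H = -sum p log2 p, in bits (terms with p = 0 contribute nothing)
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before any message: a fair 8-sided die, maximal uncertainty
h_before = shannon_entropy([1 / 8] * 8)  # 3.0 bits

# After the message "the roll is even": four possibilities remain
h_after = shannon_entropy([1 / 4] * 4)   # 2.0 bits

# The message reduced the observer's uncertainty by exactly the difference
info_gained = h_before - h_after         # 1.0 bit
```

The die's physical state never changed; only the observer's uncertainty about it did, which is the distinction the paragraph above is drawing.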

Both information and entropy are nothing but the same thing. Your opinion that information reduces informational entropy is invalid. In statistical mechanics the definition of entropy is: the uncertainty in which microstate will be observed during the next measurement. For entropy to be measured, the process has to be randomised; if the same microstate appears again and again, it doesn't make any sense to measure the entropy of the system using Boltzmann's entropy equation S = k log W. In a cell, the DNA will be in the same microstate whenever you measure it. So you can't measure entropy based on uncertainty in a system with a specific mechanism.
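For reference, S = k log W itself is easy to evaluate, and it does give exactly zero for a system locked into one microstate, which is the "same microstate again and again" case. A sketch, with the W values chosen arbitrarily for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def boltzmann_entropy(W):
    # S = k ln W for W equally probable microstates
    return K_B * math.log(W)

# One accessible microstate: zero entropy, zero uncertainty
s_frozen = boltzmann_entropy(1)  # 0.0 J/K

# Doubling the number of accessible microstates adds k ln 2
s_small = boltzmann_entropy(2 ** 19)
s_large = boltzmann_entropy(2 ** 20)
delta = s_large - s_small
```

So the formula is only informative when more than one microstate is genuinely accessible, which is the condition both sides of this exchange are arguing over.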

It is very important to note what Clausius states about entropy: "when a system is in equilibrium its entropy increases, and decreases when it is out of equilibrium". For the definition of equilibrium: http://www.answers.com/topic/equilibrium-4?cat=technology

Life is well in accordance with this law: when an organism is in equilibrium with its surroundings (when it is best adapted), this leads to an increase in information, i.e. the organism replicates or multiplies. This doesn't mean that the purpose of life is to increase information, because life can decrease information when it is out of equilibrium; it will then be in a state of thermal equilibrium and the overall entropy increases.

A gene duplicates so that it helps the organism to attain equilibrium, which leads to replication of the organism, and the gene is preserved in the gene pool. Also, having copies of the same gene reduces error in the transmitted message.

Information exists to an observer only in a specific mechanism. Life has information because it has a specific mechanism: the decoding of DNA, where the decoder is the tRNA. Information theory is applicable to molecular biology, and it answers many questions, such as the repetitive segments of DNA. According to information theory, the more randomness in the message which is encoded, the more information is stored in it. We know that these repetitive segments do not code for any proteins, as there is less information stored in them. DNA is a code, but the problem is that we do not know how this code originated.

Gene duplication, replication, recombination etc. lead to an increase in information. Information has a tendency to increase in a specific mechanism. So we are all the result of the Second Law of Thermodynamics.

##### Share on other sites

E = hν tells you how much energy a photon has. Still doesn't mean light is energy.
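As a quick sanity check on E = hν; the frequency here is a value I have chosen for green light, not anything from the thread:

```python
# E = h * nu: photon energy from frequency
H_PLANCK = 6.62607015e-34  # Planck constant, J*s (exact since the 2019 SI redefinition)
EV = 1.602176634e-19       # one electron-volt in joules

nu_green = 5.45e14                 # Hz, roughly 550 nm green light
e_photon = H_PLANCK * nu_green     # ~3.6e-19 J
e_in_ev = e_photon / EV            # ~2.25 eV, the typical visible-light scale
```

The number says how much energy the photon carries; whether that makes the photon *be* energy is exactly the question being argued here.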

##### Share on other sites

It isn't not a blog too.

Why don't you blog off, and go blog someone else?

Hmmm... Is this response intended to serve as an example of the entropy of your personal reputation and coherence here on SFN? If so, you've done a bang up job.

##### Share on other sites

Your opinion that information reduces informational entropy is invalid... it doesn't make any sense to measure the entropy of the system using Boltzmann's entropy equation S = k log W... you can't measure entropy based on uncertainty.

Good one.

So you think Claude Shannon, John von Neumann, Turing & co., are wrong in saying you can measure entropy based on uncertainty?

Here's my take (from the usual source):

Intuitively, the combined system contains H(X,Y) bits of information: we need H(X,Y) bits of information to reconstruct its exact state.

If we learn the value of X, we have gained H(X) bits of information, and the system has H(Y | X) bits remaining of uncertainty.

H(Y | X) = 0 if and only if the value of Y is completely determined by the value of X. Conversely, H(Y | X) = H(Y) if and only if Y and X are independent random variables.

In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.

The conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information theory.

The conditional entropy is written S(ρ | σ), or H(ρ | σ), depending on the notation being used for the von Neumann entropy.

John von Neumann provided in this work a theory of measurement, where the usual notion of wave collapse is described as an irreversible process (the so called von Neumann or projective measurement).

Unlike the classical conditional entropy, the conditional quantum entropy can be negative." --wikipedia.org

i.e. Thermodynamics does not apply (except in an equivalent, but inverse, sense).
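That negativity can be checked numerically for a maximally entangled pair. A sketch assuming numpy is available; the Bell state is the standard textbook example, chosen by me:

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log2 rho), in bits; zero eigenvalues contribute nothing
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>) / sqrt(2) on qubits A and B
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# Reduced state of B: partial trace over A (sum over the repeated index)
rho_b = np.einsum('abac->bc', rho_ab.reshape(2, 2, 2, 2))

s_ab = von_neumann_entropy(rho_ab)  # 0 bits: the joint state is pure
s_b = von_neumann_entropy(rho_b)    # 1 bit: B alone is maximally mixed
s_a_given_b = s_ab - s_b            # -1 bit: negative, impossible classically
```

Classically H(Y|X) can never drop below zero, so a negative value really is a purely quantum signature.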

E = hν tells you how much energy a photon has.

Has where? What does a photon do with this energy it carries around?

What happens to a photon when an electron absorbs it?

Where does it go? The same place it goes when it turns into excitons in that optical fibre. It "vanishes".

P.S. If there are some who are unable to see what this thread is on about, or who feel it doesn't present a coherent discussion, I suggest the following meditation:

Why do you want it to "mean something" to you?

If it isn't transparent, post a question, fer chrissakes. Don't just assume that YOU know anything about what I'm saying. At all.

P.P.S. I can be rude and insulting too. But this is pointless, surely. Now children, really

##### Share on other sites

This topic is now closed to further replies.
