
Does information carry energy?


Mrs Zeta


The information we perceive via our senses is essentially patterns of photons, sound waves, etc. These signals are picked up and manipulated by the brain. Is this process associated with an increased amount of energy reaching the brain? I don't mean more energy from increased metabolism; what I mean is: does information (of any sort) carry energy?

 

More generally, if we input information and data into a computer, does its internal energy increase? If we input meaningful information into a system, does its entropy decrease?


Information is an abstract concept devised by humans as a way of interpreting their recognition of patterns. The patterns of energy you are talking about are only us recognizing part of a constant flow of energy. Think of it this way, for lack of a better analogy: you see a pile of rocks, and somehow several of them have, against the odds, grouped together in a pattern that spells your name. You are asking whether it required more rocks to form this pattern that you recognize.

 

 

More generally, if we input information and data into a computer, does its internal energy increase? If we input meaningful information into a system, does its entropy decrease?

 

Think about what a computer uses to store information: RAM and a HDD, both of which have a fixed size. You are just rearranging bits in the computer to match the pattern you want to enter.

Edited by Nexium Tao

More generally, if we input information and data into a computer, does its internal energy increase? If we input meaningful information into a system, does its entropy decrease?

 

 

What is information?

How do we describe that information?

The bit is the basic unit of information.

For example, take two 8-bit words:

00000000 00000000 (no information)

00001111 11111111 (information)

If 0 is represented by 0 volts and 1 by 0.1 volts, the first pattern requires 0 volts, while the second requires 0.1 × 12 = 1.2 volts.
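As a sanity check on that arithmetic, here is a short script (Python, purely illustrative; the 0.1 V-per-bit figure is the example's assumption, not a real hardware value):

```python
# Sum the per-bit voltage of a 16-bit pattern, assuming (as above)
# that a 0 costs 0 V and a 1 costs 0.1 V.
V_PER_BIT = 0.1

def pattern_voltage(bits: str) -> float:
    """Total voltage needed to hold this bit pattern."""
    return bits.count("1") * V_PER_BIT

print(pattern_voltage("0000000000000000"))           # no 1s: 0 V
print(f"{pattern_voltage('0000111111111111'):.1f}")  # twelve 1s: 1.2
```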

 

In nature the no-information state is the lowest-energy state, and a recorded-information state is a higher-energy state.

So we first transform the system to the high-energy state, read that state, and then let it return to the low-energy state; at that stage the energy is released.

If we used some other device for recording information, the result would be the same.

No-information state: entropy S = 0;

information-carrying state: S > 0.
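The S = 0 versus S > 0 claim lines up with Shannon's measure: a string of identical symbols has zero entropy, while a mixed string has positive entropy. A minimal sketch (Python; Shannon entropy is my framing here, not necessarily what the poster meant by S):

```python
from collections import Counter
from math import log2

def shannon_entropy(bits: str) -> float:
    """Shannon entropy, in bits per symbol, of a 0/1 string."""
    n = len(bits)
    return -sum((c / n) * log2(c / n) for c in Counter(bits).values())

print(shannon_entropy("0000000000000000"))  # all zeros: zero entropy
print(shannon_entropy("0000111111111111"))  # mixed: about 0.81 bits/symbol
```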


The information we perceive via our senses is essentially patterns of photons, sound waves, etc. These signals are picked up and manipulated by the brain. Is this process associated with an increased amount of energy reaching the brain? I don't mean more energy from increased metabolism; what I mean is: does information (of any sort) carry energy?

 

Strictly speaking, information is not physical at all. It is only formal. Information may be conveyed and stored via low entropy signals on high entropy carriers including energy waves, but the information is independent of the media it is transported and stored on.

 

More generally, if we input information and data into a computer, does its internal energy increase? If we input meaningful information into a system, does its entropy decrease?

 

Since information is not energy, the information itself does not increase the energy of the storage device. However, if the information is imported into the device on an energy-carrying signal, then that energy source clearly causes the internal energy of the device to increase.

 

Likewise, the thermal entropy of the physical device does not change when information is imported; however, the entropy of the stored information may well change when additional information is imported.


Fill the memory of a computer with a random sequence of "0"s and "1"s; this would represent gibberish. Now rearrange the "0"s and "1"s into a meaningful pattern, perhaps a computer program. Would this increase the energy within the memory? In my opinion it would not. For one thing, the random sequence might have bits of information here and there (no pun intended). Also, different methods of interpretation (e.g. different computer languages) might change what is gibberish and what is information. If it is impossible to decide what is gibberish and what is information, then we can't (imo) expect levels of stored energy to change.
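One way to probe the gibberish-versus-program distinction is compressibility: structured data compresses well, random data hardly at all. A rough sketch (using Python's zlib; the specific byte strings are invented examples):

```python
import random
import zlib

random.seed(0)  # reproducible "gibberish"

random_bytes = bytes(random.randrange(256) for _ in range(4096))
structured_bytes = b"print('hello world')\n" * 195  # ~4 KB of repetitive "program"

# A compressor exploits structure; truly random data offers none to exploit.
print(len(zlib.compress(random_bytes)))      # barely shrinks (may even grow)
print(len(zlib.compress(structured_bytes)))  # collapses to a few dozen bytes
```

Either way the memory holds the same 4096 bytes; only the pattern differs.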

Edited by TonyMcC

Fill the memory of a computer with a random sequence of "0"s and "1"s; this would represent gibberish. Now rearrange the "0"s and "1"s into a meaningful pattern, perhaps a computer program. Would this increase the energy within the memory? In my opinion it would not. For one thing, the random sequence might have bits of information here and there (no pun intended). Also, different methods of interpretation (e.g. different computer languages) might change what is gibberish and what is information. If it is impossible to decide what is gibberish and what is information, then we can't (imo) expect levels of stored energy to change.

 

0 and 1 are the simplest form of information.

We can represent them in many ways: for example, electron spin direction, molecular magnetic orientation, DNA coding, voltage differences...

Whichever way we represent it, the process requires energy.

In nature the no-information state is not a high-energy state, because nature is not so foolish as to do such a thing.

Is transferring energy by using information efficient? That question needs more thought.


If a dumptruck filled with bricks dumps the bricks into a pile, the bricks will pile up more or less at a high level of entropy (disorder). If you impart information into the bricks by assembling them into a house, this increases the order of the bricks. If you subsequently bulldoze the house, the entropy is once again increased. In this case, the information was stored in the bricks in the form of potential energy.

 

When you input information into a computer's memory, the signal must add a little heat to the conductors it travels through, but this heat is dissipated by the computer's cooling fan, which increases the heat-differential between the inside of the computer and outside of it. But since the computer is hotter than its surroundings, adding heat technically decreases the entropy of the system containing the computer and its surroundings. Turning off the computer and allowing it to reach temperature equilibrium with its surroundings would be maximum entropy, I think.


Consider this and see if it has a bearing on the discussion:

You write a letter conveying information and put it in an envelope. In another envelope you put a blank piece of paper. Two people carry the two envelopes to a destination by walking briskly. The two people will get warm, but the two envelopes will arrive at the destination at the same temperature. You will not be able to devise a test to determine which envelope contains information before opening it. I think it's OK to rule out weighing the two envelopes to detect the weight of ink; there would be ways to carry information without using ink, e.g. folding the sheet of paper a number of times according to a pre-arranged code.

Edited by TonyMcC

Consider this and see if it has a bearing on the discussion:

You write a letter conveying information and put it in an envelope. In another envelope you put a blank piece of paper. Two people carry the two envelopes to a destination by walking briskly. The two people will get warm, but the two envelopes will arrive at the destination at the same temperature. You will not be able to devise a test to determine which envelope contains information before opening it. I think it's OK to rule out weighing the two envelopes to detect the weight of ink; there would be ways to carry information without using ink, e.g. folding the sheet of paper a number of times according to a pre-arranged code.

 

I think the definition of information is important.

What is information?

In a closed system:

(one matter state) + (information) ---------------> (another matter state) + (others)

where each term carries energy. Then:

information energy = (another matter state energy) + (others energy) − (one matter state energy)

 


OK, I think what I am coming to understand is that information does not carry energy, but it can optimise the use of energy.

 

For example (and sorry to use a biological case in this forum), our brain has enough energy resources provided by the food we eat. But inputting information into the brain optimises the use of this energy. The information (an intentional cognitive stimulus) modulates the use of energy by the neuron, and thus the cell functions optimally. Without the purposeful information the cell would just sit there metabolising quietly (just like my fellow passengers on the train today!). Any loss of energy associated with, say, ageing would then be compensated by an optimal use of the remaining resources, a process encouraged by the additional (not background) cognitive information we input.

Edited by Mrs Zeta

In computers each bit of information was (and presumably still is) held in a bi-stable circuit known as a "flip-flop". When flipped one way the circuit "remembers" "1", and when flopped the other way it "remembers" "0". In general the circuit takes as much power from a supply to "remember" "1" as it does "0".

Perhaps the principle can be examined with a mechanical bi-stable device based on a child's see-saw (see diagram).

If you want to change its state you do have to do some work. However, once in the state you want, it will sit in that state until you want to change the information. Surely each state holds the same amount of stored energy?

At least in theory you could build a computer from millions of these see-saws and just flip them back and forth in the correct sequence to do anything a computer can do!
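That see-saw computer is easy to model in software: each element holds a 0 or a 1 indefinitely, and only an explicit flip changes it. A toy sketch (Python; the class name is mine):

```python
class SeeSaw:
    """Toy bi-stable element: holds 0 or 1 until explicitly flipped."""
    def __init__(self, state: int = 0):
        self.state = state

    def flip(self) -> None:
        # Changing state takes a bit of work; holding it takes none here.
        self.state ^= 1

# Eight see-saws storing one byte of information.
byte = [SeeSaw() for _ in range(8)]
for i in (0, 3, 7):  # write the pattern 10010001
    byte[i].flip()
print("".join(str(s.state) for s in byte))  # -> 10010001
```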

If a multitude of these devices can store any desired information doesn't this show that the information itself has no energy content?

They say when an old man dies a library burns to the ground - I'm quite old but my brain doesn't seem to have gained energy lol.

[Attached diagram: a see-saw as a mechanical bi-stable storage device]

Edited by TonyMcC

Say you buy two new laptops. The first is an ordinary laptop with nothing pre-installed. The second is identical but with many programmes pre-installed and downloaded (word processing, photos, music tracks, films, Adobe, a thesaurus, etc.). Both are switched on and in idle mode, with no programmes actively running. If I had to choose which one had the most 'energy' inside, I would choose the second, even though they both use the same amount of electrical energy in that passive state. The second one, with the information in it, has the potential to do stuff (a characteristic of energy). Energy does stuff; no energy equals uselessness, i.e. death.


Say you buy two new laptops. The first is an ordinary laptop with nothing pre-installed. The second is identical but with many programmes pre-installed and downloaded (word processing, photos, music tracks, films, Adobe, a thesaurus, etc.). Both are switched on and in idle mode, with no programmes actively running. If I had to choose which one had the most 'energy' inside, I would choose the second, even though they both use the same amount of electrical energy in that passive state. The second one, with the information in it, has the potential to do stuff (a characteristic of energy). Energy does stuff; no energy equals uselessness, i.e. death.

 

If you can't measure it in joules, it isn't "energy". End of story. That something happens to be more useful to you personally doesn't mean there is more energy present, or vice versa (the laptop still works if you put it in the freezer, but not if you set it on fire!), and just because something takes more information to describe does not make it more useful, or vice versa.

 

In fact, if anything the opposite is true. Entropy doesn't change the total energy of a system, but it does decrease the capacity for useful work, and it tends to increase the information needed to describe the system.


Mrs Zeta - I shall have to say that what follows comes under the description "in my opinion".

The second computer, with the programs installed, doesn't actually provide energy to "do stuff". The list of instructions which is the computer program has only a controlling function, and it in fact needs externally supplied energy to be able to influence the computer circuitry and any externally connected peripherals, such as a printer. As an indication of this, consider how much stuff the computer can do with a flat battery and not connected to the mains.

What follows may not be a good analogy, but it comes near to what I am saying: you decide to have a bath, and to do this your brain will have used a little energy derived from your food. You walk to the bathroom and turn the taps, again using energy derived from your food. Water gushes from the taps using energy from the water supply, driven by gravity or a pumping station, etc. I don't think the thought "I would like a bath, and here is how to achieve my objective" actually supplied any energy. If anything, having the thought used energy rather than supplying it.

 

Nope. Energy carries information!

Mr Skeptic - I would appreciate it if you would expand your thoughts on this topic - whether you support me or not - regards.

Edited by TonyMcC

Mr Skeptic - I would appreciate it if you would expand your thoughts on this topic - whether you support me or not - regards.

 

Sure. First of all, it is impossible to manipulate information without using energy, whether you want to read, transmit, or store it. Secondly, any particle can be considered to carry information (its location, momentum, and particle type), even if this information does not map to anything useful. All the particles are the same regardless of whether we're using them to transmit information; for example, with fiber optics the light is just the same light as any other light of the same color. For convenience we often use big groups of particles with very unnatural properties (such as being all the same phase with brightness rapidly increasing or decreasing, like a pulse in fiber optics), so that any such group, if seen, would clearly be a structure we made for transmitting information; but we can still do it with individual particles, at the cost of much more expensive detectors. So if individual particles can transmit information, and those particles are indistinguishable from any other particle, we can consider them all to carry information.

 

As for the energy required for this:

You can't move something without using energy, so no transmitting without energy. You can't look at something without using energy, so no reading without energy. You can't change something without using energy, so no writing without energy.
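There is a known physical floor on part of this: Landauer's principle puts the minimum heat dissipated when erasing one bit at kT ln 2. At room temperature that is tiny but nonzero (Python; the constants are standard values):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # roughly room temperature, K

# Landauer limit: minimum energy dissipated to erase one bit of information.
landauer_per_bit = K_B * T * log(2)
print(f"{landauer_per_bit:.2e} J per bit")       # about 2.87e-21 J
print(f"{landauer_per_bit * 8e9:.2e} J per GB")  # erasing a gigabyte: ~2.3e-11 J
```

Real circuits dissipate many orders of magnitude more than this, which is one reason the energy of the stored pattern itself is unmeasurable in practice.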


Nope. Energy carries information!

Mr Skeptic. If energy carries information, are you saying information does not carry energy? I can see that the construction and recording of information uses energy. I can also see that anything that can be done with information will need an input of energy, but what about the information itself? You seem to be saying any structure contains both information and energy, and also that the amount of energy involved is the same whether we can use that information or not. In particular, would you agree with me that a computer's RAM will not have an increase in energy just because we have arranged the data it holds so that the contents are meaningful to us (e.g. we have loaded a program)? I realise the actual arranging of the contents will need an input of energy, but how does the energy state of the RAM compare before and after the rearrangement? Thank you for your thoughts on this matter.

Edited by TonyMcC

Information is an abstract concept that can take many different forms, and there are many different definitions. In the example I gave, where everything is considered information, it seems physicists are undecided whether information can be destroyed or not, which would make comparison between things with and without information rather difficult; the information would also be inseparable from any particles. If we go with the slightly larger scale, where information is represented in macro arrangements and can be removed from them (like burning a book), then we could remove information from something and measure for differences. But then I think it would depend on the specifics of the information storage device; for example, if you store information as scorch marks on paper, then the paper with information would have less energy than blank paper.

 

There was also an interesting paper where researchers used information to gain energy, but the energy they gained was from random thermal motion. If they could have gotten that information for free, it would be a perpetual motion machine, but of course there's an energy cost in measuring the system to get the information. Basically, they use the information to lower entropy, so information acts as "free energy" (not free as in the stuff you hear from crazies, but free as in usable; the same sort of free energy you lose to friction), which is not quite the same as energy.

http://www.nature.com/news/2010/101114/full/news.2010.606.html


This is another problem.

We should first think about the definition of information.

Suppose we climb to the top of a mountain. There are two cases.

In the first case we have a map of the mountain.

In the second case we do not.

In the first case we can reach the top using less energy, because we have information about the route.

In the second case we use much more energy, because we don't know the right way.

From this, an "information energy" could be calculated:

information energy = (2nd case energy) − (1st case energy)

Is this right?

So the definition of information is very important.
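The map-versus-no-map difference can be made concrete with a toy search model (Python; the trail count and per-trail cost are invented numbers, purely to illustrate the subtraction above):

```python
import random

random.seed(42)  # fix the lost hiker's choices, for reproducibility

ROUTES = 8          # trails up the mountain; only one reaches the top
COST_PER_TRY = 100  # energy units spent walking one trail (assumed)

def energy_with_map() -> int:
    # The map names the correct trail, so one try suffices.
    return COST_PER_TRY

def energy_without_map(correct: int) -> int:
    # Without a map, try trails in random order until hitting the right one.
    order = random.sample(range(ROUTES), ROUTES)
    return COST_PER_TRY * (order.index(correct) + 1)

blind = energy_without_map(correct=3)
print("with map:", energy_with_map())
print("without map:", blind)
print("information energy =", blind - energy_with_map())
```

On average the blind search costs several times more, and the difference is the quantity the formula above calls information energy.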

 

Edited by alpha2cen

Nope. Energy carries information!

Would you like to define the word "nope"? I rather suspect your answer will be "Nope".

I have followed your link, and it seems James Clerk Maxwell didn't try to quantify the amount of energy input provided by the Demon's work in controlling access to the rooms and preventing the molecules returning to the central pool.

Masaki Sano's experiment seems quite suspect. The minute polystyrene bead would require an extremely small amount of energy to move, and the circuitry providing the necessary detection and timing, together with the production of voltage pulses, would (imo) be providing more than sufficient energy input.

I have to agree with alpha2cen that in order to come to a general decision in this matter we need a clear definition of "information".

It is possible that my definition of "information" is too narrow - but I don't really think so.

I can certainly provide a system where information can change although any form of energy held in the system doesn't change.

I do hope your answer to the first question is not "Nope"!


There was also an interesting paper where researchers used information to gain energy, but the energy they gained was from random thermal motion. If they could have gotten that information for free, it would be a perpetual motion machine, but of course there's an energy cost in measuring the system to get the information. Basically, they use the information to lower entropy, so information acts as "free energy" (not free as in the stuff you hear from crazies, but free as in usable; the same sort of free energy you lose to friction), which is not quite the same as energy.

http://www.nature.com/news/2010/101114/full/news.2010.606.html

 

The information was not used to reduce entropy, since even in this example net entropy was not reduced. They may have reduced the entropy of one subcomponent, but not of the entire system/apparatus. Instead, information and a powered feedback controller were employed to isolate a particle and allow it to absorb high-energy collisions while avoiding low-energy collisions that would drain energy.

 

The information is not "free energy" in any sense, as the energy carrier was actually an input into the system and contributed to its function. Information was not traded or transformed into free energy.

