
Evolution has never been observed


cabinintheforest


People, it's pretty clear he means getting the hot molecules out of room temperature sea water and using them for energy, or rather that that can't be done.

 

Yeah, and then maybe he could write a Nature article about it:

 

As the article noted, there have been reasonable arguments offered that Maxwell's Demon would not violate thermal entropy laws, but my understanding of the resolution is that it would require at least an equal amount of energy to acquire the information and operate the demon. This experiment indicates that a device may be able to raise potential without imparting equal or more energy into the particle, as the researchers state, but they do not address the energy cost of acquiring the particle data, processing the inputs into information, and operating the demon in the process. Furthermore, the paper says that information is converted into energy but is unclear on how the information was destroyed.

 

On this basis I have some questions.

 

Could you describe for me the source of the energy used to acquire the information and operate the feedback controller?

 

How much energy was used to construct the apparatus? How long must it operate to recover this energy investment? Will it ever recover this energy cost?

 

What was the total decrease in information in this system? How did it occur?

 

These questions notwithstanding, my question dealt with conversion of information entropy to thermal entropy, while this experiment seems to address a slightly different notion of conversion of information to energy. If the experiment acquires and then destroys information, what is the net information entropy change in that cycle? Wouldn't it be zero? If not, what was the net change in thermal entropy and information entropy? What about after including the demon?

Link to comment
Share on other sites

As the article noted, there have been reasonable arguments offered that Maxwell's Demon would not violate thermal entropy laws, but my understanding of the resolution is that it would require at least an equal amount of energy to acquire the information and operate the demon. This experiment indicates that a device may be able to raise potential without imparting equal or more energy into the particle, as the researchers state, but they do not address the energy cost of acquiring the particle data, processing the inputs into information, and operating the demon in the process. Furthermore, the paper says that information is converted into energy but is unclear on how the information was destroyed.

 

On this basis I have some questions.

 

Could you describe for me the source of the energy used to acquire the information and operate the feedback controller?

 

How much energy was used to construct the apparatus? How long must it operate to recover this energy investment? Will it ever recover this energy cost?

 

What was the total decrease in information in this system? How did it occur?

 

These questions notwithstanding, my question dealt with conversion of information entropy to thermal entropy, while this experiment seems to address a slightly different notion of conversion of information to energy. If the experiment acquires and then destroys information, what is the net information entropy change in that cycle? Wouldn't it be zero? If not, what was the net change in thermal entropy and information entropy? What about after including the demon?

 

True, the "demon" won't actually gain energy by this process; rather, it is to show that energy and information can be interconverted. You can gain information by taking measurements; however, the measurements require energy. That part is simple enough, but converting information back to energy is the new thing demonstrated here. As to how the information is destroyed, I imagine it is simply overwritten by new data. I suppose a copy could be kept, but I doubt energy could be recovered from that, since it needs real-time data. Anyhow, keeping copies of the data could easily be done, but it would require additional data storage material, which would be an energy cost.

 

As for reducing information entropy (as opposed to making more information), that can also be done with energy. For example, take undeveloped film with a photograph on it, which has significant information entropy, and expose it to sunlight. The information stored on the film will then be converted to extremely low entropy information, while (in theory) preserving the total amount of information.
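As a rough numeric sketch of this film example (my own illustration, with made-up grey-level data, not anything from the thread), the per-symbol Shannon entropy of the pixel-value distribution behaves exactly this way:

import math
from collections import Counter

def shannon_entropy(values):
    # Empirical Shannon entropy, in bits per symbol.
    counts = Counter(values)
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

photo = [0, 50, 50, 120, 180, 200, 255, 255]  # varied grey levels: a "picture"
overexposed = [255] * 8                       # washed out to uniform white

print(shannon_entropy(photo))        # 2.5 bits/pixel: many distinguishable states
print(shannon_entropy(overexposed))  # 0.0 bits/pixel: a single state left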

Link to comment
Share on other sites

When one speaks of conversion, as in kinetic energy converted to thermal or potential energy, the kinetic component must be reduced. How is the energy converted if it is not reduced in any meaningful way by the feedback controller? If it is not reduced, then it is not converted or consumed. Instead, the information used along with the energy consumed by the controller seems to catalyze the realignment of energy from one state to another, consistent with the postulated solution to Maxwell's demon whereby no net free energy is produced.

 

 

As for reducing information entropy (as opposed to making more information), that can also be done with energy. For example, take undeveloped film with a photograph on it, which has significant information entropy, and expose it to sunlight. The information stored on the film will then be converted to extremely low entropy information, while (in theory) preserving the total amount of information.

 

Isn't whatever information entropy is contained in the reflected light simply transferred to the paper when the paper absorbs the light?

Edited by cypress
Link to comment
Share on other sites

When one speaks of conversion, as in kinetic energy converted to thermal or potential energy, the kinetic component must be reduced. How is the energy converted if it is not reduced in any meaningful way by the feedback controller? If it is not reduced, then it is not converted or consumed. Instead, the information used along with the energy consumed by the controller seems to catalyze the realignment of energy from one state to another, consistent with the postulated solution to Maxwell's demon whereby no net free energy is produced.

 

Earlier you had said,

 

Perhaps our friend with the heat engine above will argue that information order can substitute for the thermal order his machine lacks.

 

Taking a measurement uses energy. Sure, the energy is conserved, but there's also the energy "lost" to entropy. This machine seems to be able to do the opposite: using information to reduce entropy and gain energy (not counting the energy/entropy used to get that info).

 

Isn't whatever information entropy is contained in the reflected light simply transferred to the paper when the paper absorbs the light?

 

I'm not sure what you mean. Are you saying I can't take a picture using energy, that I can't reduce the information entropy of undeveloped film using energy, or neither of those?

Link to comment
Share on other sites

Taking a measurement uses energy. Sure, the energy is conserved, but there's also the energy "lost" to entropy. This machine seems to be able to do the opposite: using information to reduce entropy and gain energy (not counting the energy/entropy used to get that info).

 

Let's review what it does and does not do:

 

1) Contrary to the title of the article and contrary to your introduction of it, it does not convert information into energy. Instead it employs (but does not consume) information and consumes energy in one subcomponent (the feedback controller) in the process of isolating discrete energy states of another component.

 

2) It does not alter information entropy and it does not substitute information entropy for energy entropy, which is the challenge I requested. You offered this example as an answer to the challenge, but it is not an answer to the challenge. Instead it raises thermal entropy of the feedback controller and slightly raises net thermal entropy of the system.

 

Summary: Information is not converted to energy. Information entropy is not substituted for thermal entropy.

 

I'm not sure what you mean. Are you saying I can't take a picture using energy, that I can't reduce the information entropy of undeveloped film using energy, or neither of those?

 

Neither of those. To the extent that the photograph is a lower-resolution representation of the light pattern at that time, location, and orientation, net information entropy rises in this process just as thermal entropy does. The light pattern prior to capture of the light by the photographic paper contains both energy and information. Energy is transferred to the paper and information is transferred to the paper; there is, however, no substitution of energy for information or vice versa.

 

No, the abstract talks about failures to synthesize anything longer than a two- or three-amino-acid oligopeptide. The abstract also uses the word peptide after having referred to oligomeric products. We can argue over where the arbitrary line for an oligomer lies, but I think peptide would imply a longer chain. Some people define insulin as a peptide (and not a protein) because it's so short, but I think we can agree it is a functional biomolecule.

 

Also notice, though, that the reactions were carried out at room temperature. I'm not sure at what pressure; I imagine 1 atm, since it wasn't specified. The reaction gave up to 80% yield under those conditions; is it not reasonable that the molecular weight of the polymer would increase if the reaction were run under much higher pressure? I'm speculating a bit here, but those high-pressure, oxygen-poor conditions do exist at the bottoms of some of our oceans.

 

The full text of the article makes it clear that only di- and tripeptides formed, and because they are so short, I continue to wonder how this could be evidence that natural processes are capable of generating irregular polymers that form stable tertiary structures, such that the probability of the specific set of functional polymers is small relative to the combinations possible. Low probability is the benchmark of low entropy. Recent studies indicate that the probability of obtaining a protein with a stable tertiary structure is less than 1 in 10^78 for even a short 150 amino acid chain.
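For scale, and taking the quoted 1-in-10^78 figure at face value (it is the poster's number, which I cannot verify here), the raw sequence space for a 150-residue chain is easy to reproduce:

import math

# Number of possible 150-residue sequences over the 20 standard amino acids.
log10_space = 150 * math.log10(20)
print(round(log10_space, 1))  # ~195.2, i.e. 20**150 is about 10**195 sequences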

 

There are thermophilic bacteria and archaea that live under these conditions currently. Their protein macrostructures are stabilized by relatively high numbers of cysteine residues and the conformationally stabilizing disulfide bonds that come with them. That, by the way, is another way in which new [functional] information is created by natural selection. Evidently, only the peptides that formed with a lot of disulfide bonds [cysteine residues] were able to survive in those extreme conditions. So an ordinarily insignificant cysteine residue has become a functional bit of genetic information that could later become associated with a certain mRNA codon and favored by natural selection because it contributes to the chemical stability of the proteomic information itself.

 

How is it known that these proteins were formed by natural selection? Do the cysteine residues alter the basic shape and primary function or do they primarily serve to stabilize the proteins for higher temperatures?

Link to comment
Share on other sites

Let's review what it does and does not do:

 

1) Contrary to the title of the article and contrary to your introduction of it, it does not convert information into energy. Instead it employs (but does not consume) information and consumes energy in one subcomponent (the feedback controller) in the process of isolating discrete energy states of another component.

 

2) It does not alter information entropy and it does not substitute information entropy for energy entropy, which is the challenge I requested. You offered this example as an answer to the challenge, but it is not an answer to the challenge. Instead it raises thermal entropy of the feedback controller and slightly raises net thermal entropy of the system.

 

Summary: Information is not converted to energy. Information entropy is not substituted for thermal entropy.

 

Using real-time feedback control, the particle is made to climb up a spiral-staircase-like potential exerted by an electric field and gains free energy larger than the amount of work done on it.

 

It uses information to gain energy from Brownian motion. Certainly, there is an energy cost to get that information (and there would be additional energy costs to store the information). But if you could magically feed the machine the proper information, it could go about doing this without having to spend energy to get this information.
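For scale (a back-of-the-envelope sketch of my own, not a figure from the paper): the Szilard/Landauer bound puts the work extractable per bit of information at k_B T ln 2, a few zeptojoules at room temperature:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

# Maximum work extractable per bit of information (Szilard/Landauer bound).
work_per_bit = k_B * T * math.log(2)
print(f"{work_per_bit:.3e} J per bit")  # ~2.871e-21 J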

 

Neither of those. To the extent that the photograph is a lower-resolution representation of the light pattern at that time, location, and orientation, net information entropy rises in this process just as thermal entropy does. The light pattern prior to capture of the light by the photographic paper contains both energy and information. Energy is transferred to the paper and information is transferred to the paper; there is, however, no substitution of energy for information or vice versa.

 

Ah, so would you rather the picture have been taken in the dark using a flash? Would that have created information, since the information wouldn't have been there in the light patterns? Or did the various processes that resulted in the items in the picture being there create information?

 

As for reducing information entropy of the film, I said this would happen when you expose undeveloped film (with a picture on it) to sunlight. Not when taking the picture.

 

And now a consistency check: How much information and information entropy is in the following sentences? (Or at least rank them.)

1) "These sentences are to test whether you are being consistent or not about information"
2) "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
3) "pqodjeklgmzelcye3km24jskclwv,adiem=clwpd slkw'ske;2lcio#zxlic.akjhewopkhft,aiehsx7peh"

 

Of particular interest:

Does the more meaningful sentence contain less information than the one that looks like gibberish, and more information entropy than the boring one?
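One concrete way to rank the three strings (an assumption on my part: treat each string as an i.i.d. symbol source and use its empirical per-character Shannon entropy, which is also I(X;X) for the string itself):

import math
from collections import Counter

def bits_per_char(s):
    # Empirical per-character Shannon entropy of the string.
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

sentences = {
    "meaningful": "These sentences are to test whether you are being consistent or not about information",
    "boring": "a" * 85,  # stands in for the all-a string above
    "gibberish": "pqodjeklgmzelcye3km24jskclwv,adiem=clwpd slkw'ske;2lcio#zxlic.akjhewopkhft,aiehsx7peh",
}

for name, s in sentences.items():
    print(name, round(bits_per_char(s), 2))
# Expected ranking: boring (0 bits) < meaningful (~4 bits) < gibberish (~5 bits)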

Link to comment
Share on other sites

I don't suppose you will answer the questions regarding change in information and energy values of the feedback controller.

 

 

It uses information to gain energy from Brownian motion. Certainly, there is an energy cost to get that information (and there would be additional energy costs to store the information). But if you could magically feed the machine the proper information, it could go about doing this without having to spend energy to get this information.

 

It does not accomplish what was advertised, apparently because it is constrained by physical law. Apparently information is not energy and cannot substitute for energy, indicating also that information order is not thermal order.

 

Ah, so would you rather the picture have been taken in the dark using a flash? Would that have created information, since the information wouldn't have been there in the light patterns? Or did the various processes that resulted in the items in the picture being there create information?

 

No information was created. What information there was previously existed in the configuration of the physical surroundings. The information was copied into the configuration of the reflected and absorbed light and then transferred onto the photographic paper.

 

As for reducing information entropy of the film, I said this would happen when you expose undeveloped film (with a picture on it) to sunlight. Not when taking the picture.

 

Correct me if I am wrong, but when the order present in the undeveloped film is reduced by overexposing it, entropy is increased.

 

And now a consistency check: How much information and information entropy is in the following sentences? (Or at least rank them.)

1) "These sentences are to test whether you are being consistent or not about information"
2) "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
3) "pqodjeklgmzelcye3km24jskclwv,adiem=clwpd slkw'ske;2lcio#zxlic.akjhewopkhft,aiehsx7peh"

 

Of particular interest:

Does the more meaningful sentence contain less information than the one that looks like gibberish, and more information entropy than the boring one?

 

Capt'n Refsmmat and I have discussed the difficulty of calculating absolute values for probability measures, including information and all forms of entropy, a few times now. Since information is a measure of probability elimination, the measure of information is not separable from the circumstances surrounding the process by which the information was obtained and recorded, because probability measures are influenced by the process and constraints. Therefore more information is needed to calculate probability and permutations for the outcomes you listed. When the processes are not known, one must postulate the processes and constraints and develop measures based on those postulates. In estimating information changes by evolutionary processes in biological systems, or the information gain in life from non-life, the processes (but not the intent, if any) are relevant and are postulated; then, based on those processes, it is possible to have a meaningful discussion of information content.

 

What processes generated the data strings you provided and what were the constraints?

Link to comment
Share on other sites

Carbonyl Sulphide mediated prebiotic peptide formation

 

The above link is to a paper concerning spontaneous synthesis of peptides in aqueous solution via in vitro non-enzymatic catalysis. If you're not a subscriber to the journal, the abstract should be enough. It shows that spontaneous, natural polymerization of amino acids is not outside the realm of possibility. Carbonyl sulphide is abundant in waters surrounding volcanic thermal vents.

 

I addressed this article and its implications in a short discussion over the previous several posts. The fact that di- and tripeptides form deterministically from reaction kinetics does not extend to formation of irregularly sequenced, long-chain proteins with stable tertiary structures. Our only experience with formation of these molecules is by use of pre-existing information, in the form of biological information stored within the molecular patterns of DNA, which includes a very low entropy sequence on a very high entropy carrier.

 

spontaneous polymerization and microstructure...

 

This link is to an article in which highly branched polymers of n-butyl acrylate are synthesized without initiators at 180 °C. One possible mechanism likely involves radical self-initiation. This would be highly disfavored by entropy, yet it still happens.

 

Please explain in precise terms why you conclude that self-initiation is highly disfavored by entropy. Are you implying that it contradicts molecular or thermal entropy constraints? If so, please explain in precise terms how this is so.

 

As I understand the research, the process simply follows and complies with the mechanisms of chemical reaction kinetics including both the random nature imposed by Brownian motion and the deterministic nature imposed by physical laws. I don't see any obvious contradiction to probability theory and the entropy considerations that are derived from probability theory. Please help me see what it is you think I am missing.

 

Nothing could be further from the truth. Spontaneous synthesis of biomolecules from pre-biotics is one of the leading hot topics in physical chemistry today. There is a large movement in p-chem to find the reason for all amino acids having the L-configuration, as well as a large interest in non-enzymatic, pre-biotic catalysis. Here is an article from the American Chemical Society, Journal of Physical Chemistry B:

 

emergence of phospholipid superstructures

 

The article concerns the apparent organization of a homogeneous phospholipid micelle into a myelin-like superstructure in about 240 minutes. Just give that another hundred million years or two and what could happen?

 

In response to my statement that complex biomolecules (biomolecules imply biological activity; activity implies stable tertiary structure and appropriate binding sites) containing irregular sequences don't seem to form by the typical chemical processes, but instead form using a transcribed blueprint and a carefully managed and controlled set of processes and molecular machines, you replied "Nothing could be further from the truth". Then you offered this example of generating regularly sequenced polymers using what amounts to nano templates, superstructures, or molds. I don't see how these two examples are in the same class. They are not irregularly sequenced. They do not use an information blueprint. They do not make use of a control system. They do not make use of molecular machines, but instead use a template or superstructure.

 

As for organization, they take on the organization that was determined by the superstructure or template, and the template was deterministically formed in some cases and formed by a purposed plan in other examples, so I am at a loss to understand what introduction of long periods of time would do to this process. Since they are formed deterministically based on the template, it seems time would not change that. What am I missing?

 

Cypress, you have been asking for specific examples of the spontaneous generation of complex biomolecules. Here I have cited three examples from the peer reviewed literature. If you still need more confirmation, I have barely scratched the surface of the host of similar articles and would be glad to cite more.

 

I'm sure you could cite many examples similar to the three you have, and I quite agree they are interesting and relevant for what they do demonstrate, but I don't see how they provide much, if any, insight into the issue being discussed here. If you could instead help reveal how these examples are more similar than they appear, I would be grateful.

Link to comment
Share on other sites

...Are you still on info' Entropy!?!?!?!?

Information is not a real thing. Will you stop thinking it is real? It does NOT exist. Info' entropy therefore does not apply to anything other than information. Info' is an invention of the mind!

Link to comment
Share on other sites

I don't suppose you will answer the questions regarding change in information and energy values of the feedback controller.

 

I'm not really sure what you mean, but it seems you're saying that the researcher is wrong. That's fine, but I can't really defend the researcher's assertions as I don't have access to the full article.

...we can, in principle, convert information to free energy [2,3,4,5,6].

 

Using real-time feedback control, the particle is made to climb up a spiral-staircase-like potential exerted by an electric field and gains free energy larger than the amount of work done on it.

 

Although from this it seems to me that thermal energy is converted to kinetic energy (which are similar but not exactly the same, the difference being mainly entropy), if that is what your question was.

 

 

It does not accomplish what was advertised, apparently because it is constrained by physical law. Apparently information is not energy and cannot substitute for energy, indicating also that information order is not thermal order.

 

That does not follow. Entropy is about the states in which matter may be found, not about energy. Energy is just one of many things that affect the states matter may be in. States are not energy, so are you also going to say that scientists have made a mistake in defining entropy? But states are very similar to information, and a messy arrangement of states is very similar to a messy arrangement of information.

 

Moreover, never did I claim that information is energy. I said that you can get information with energy, and that this new machine can get energy from Brownian motion with information. Not that it magically makes the energy appear or something.

 

No information was created. What information there was previously existed in the configuration of the physical surroundings. The information was copied into the configuration of the reflected and absorbed light and then transferred onto the photographic paper.

 

Good enough, now back up a little more. If we consider the physical environment to be information (such that taking a picture of it doesn't count as creating information), then where did that information come from? It wasn't always there as the environment is not static. Do natural processes such as cloud formation create the information that would be contained in a picture of clouds?

 

Correct me if I am wrong, but when the order present in the undeveloped film is reduced by overexposing it, entropy is increased.

 

Sure, I'll correct you. I'm talking about information entropy for the film, not regular entropy, and not entropy of any kind for anything other than the film. And, the film has an image on it, but has not been developed, so that exposure to light will erase the image.

 

Capt'n Refsmmat and I have discussed the difficulty of calculating absolute values for probability measures, including information and all forms of entropy, a few times now. Since information is a measure of probability elimination, the measure of information is not separable from the circumstances surrounding the process by which the information was obtained and recorded, because probability measures are influenced by the process and constraints. Therefore more information is needed to calculate probability and permutations for the outcomes you listed. When the processes are not known, one must postulate the processes and constraints and develop measures based on those postulates. In estimating information changes by evolutionary processes in biological systems, or the information gain in life from non-life, the processes (but not the intent, if any) are relevant and are postulated; then, based on those processes, it is possible to have a meaningful discussion of information content.

 

What processes generated the data strings you provided and what were the constraints?

 

I think that in keeping somewhat with the purpose of this thread, it would be best to say the data strings were generated by some unknown process, and you know nothing of the string other than that it is there. As such, I would suggest the best measure would be the information content of the string with respect to itself. (That's called the mutual information of the string with itself, or I(X;X)). But if you prefer, you could I suppose choose either God or an evolutionary algorithm as the source.
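For reference, the identity behind that notation is standard: the mutual information of a string with itself is just its entropy,

I(X;X) = H(X) - H(X \mid X) = H(X) - 0 = H(X).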

Link to comment
Share on other sites

I'm not really sure what you mean, but it seems you're saying that the researcher is wrong. That's fine, but I can't really defend the researcher's assertions as I don't have access to the full article.

 

It is surprising you would offer an article you didn't read and can't defend.

 

That does not follow. Entropy is about the states in which matter may be found, not about energy. Energy is just one of many things that affect the states matter may be in. States are not energy, so are you also going to say that scientists have made a mistake in defining entropy? But states are very similar to information, and a messy arrangement of states is very similar to a messy arrangement of information.

 

It follows from the formulas. Information entropy has information content as a component, and thermal entropy has heat energy content as a component. It is you who has made the mistake, not the scientists who derived the formulas.

 

Moreover, never did I claim that information is energy. I said that you can get information with energy, and that this new machine can get energy from Brownian motion with information. Not that it magically makes the energy appear or something.

 

Actually you offered it to counter my claim that information order cannot be substituted or converted into thermal order. Regardless of what you think you said, your attempt to counter my claim failed.

 

Good enough, now back up a little more. If we consider the physical environment to be information (such that taking a picture of it doesn't count as creating information), then where did that information come from? It wasn't always there as the environment is not static. Do natural processes such as cloud formation create the information that would be contained in a picture of clouds?

 

The present state of the universe is a function of the prior state. The information came from the previous state of the universe. Entropy (order) is a function of the initial state of the universe. It was always there.

 

Sure, I'll correct you. I'm talking about information entropy for the film, not regular entropy, and not entropy of any kind for anything other than the film. And, the film has an image on it, but has not been developed, so that exposure to light will erase the image.

 

When the image on the undeveloped film is overwritten by noise from random sunlight, order is reduced, probability is increased and entropy rises, so it is difficult to see how I am wrong.

 

I think that in keeping somewhat with the purpose of this thread, it would be best to say the data strings were generated by some unknown process, and you know nothing of the string other than that it is there.

 

But this would be false; I suspect you are quite aware of the processes by which the strings were generated. Are you suggesting that we instead switch to a hypothetical example? This is moving the goalposts and a site rule violation. I think it would be more useful to stay with the real example.

Link to comment
Share on other sites

It is surprising you would offer an article you didn't read and can't defend.

 

Not really. The abstract says what the research is about, and it supports what I said. If you really think you can find fault in an article published in a journal as reputable as Nature, perhaps rather than calling the researcher wrong on some internet forum you should offer a formal critique, and make yourself famous. But if you can't really find fault, then what are you complaining about? It is not my job to defend someone else's paper.

 

It follows from the formulas. Information entropy has information content as a component, and thermal entropy has heat energy content as a component. It is you who has made the mistake, not the scientists who derived the formulas.

 

You do know that thermal entropy is only a limited subset of entropy in general, and in particular thermal entropy can decrease? Or don't you believe in chemical cold packs?

 

If you use the full definition, then you get something that looks suspiciously like your formula for information:

More specifically, entropy is a logarithmic measure of the density of states: S = -k_B Σ_i p_i ln p_i, where k_B is the Boltzmann constant, equal to 1.38065×10^-23 J K^-1.

 

In what has been called the most famous equation of statistical thermodynamics, the entropy of a system in which all states, of number Ω, are equally likely, is given by S = k_B ln Ω.
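For reference, the equal-probability case follows directly from the general formula, and the parallel with Shannon's information entropy H differs only by the constant k_B and the base of the logarithm:

S = -k_B \sum_i p_i \ln p_i, \qquad p_i = \tfrac{1}{\Omega} \;\Rightarrow\; S = -k_B \sum_{i=1}^{\Omega} \tfrac{1}{\Omega} \ln \tfrac{1}{\Omega} = k_B \ln \Omega, \qquad H = -\sum_i p_i \log_2 p_i .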

 

 

Actually you offered it to counter my claim that information order cannot be substituted or converted into thermal order. Regardless of what you think you said, your attempt to counter my claim failed.

 

Oh, well if you like putting words in my mouth, I'm sure I can think of some very interesting things to say that you said, but that isn't really very constructive. When talking about thermodynamics people frequently say "energy" when they mean "free energy", which works fine and is implicitly understood when talking to people who know thermodynamics. For example, the person I quoted when introducing the article used that technically incorrect wording, and then I did too. If you like I can further explain to you the difference between "energy" and "free energy".

 

However, that still does not excuse substituting "energy" for "states", and then acting surprised that energy is not as similar to information as states are. This applies to thermal entropy as well as to entropy in general.

 

The present state of the universe is a function of the prior state. The information came from the previous state of the universe. Entropy (order) is a function of the initial state of the universe. It was always there.

 

OK. So then there's plenty of information around (a whole universeful of it), so it should come as no surprise to you that there's information in DNA molecules. Glad that is settled. I suppose now we can drop the "information" part, and move on to the "entropy" part, right?

 

When the image on the undeveloped film is overwritten by noise from random sunlight, order is reduced, probability is increased and entropy rises, so it is difficult to see how I am wrong.

 

I know it is hard for you to see why you are wrong, which is why I followed it up with a specific example. Just to clarify yet again since you seem to be trying to confuse the issue again, I'm talking about the information entropy of the film.

 

But this would be false; I suspect you are quite aware of the processes by which the strings were generated. Are you suggesting that we instead switch to a hypothetical example? This is moving the goalposts and a site rule violation. I think it would be more useful to stay with the real example.

 

In case you are not aware, strings can be generated in many different ways. These strings in particular could have been generated in many different ways, not necessarily the same for each string, and knowing the source would give you additional information not included in the strings. As such, giving you that information wouldn't allow you to calculate the entropy of the strings themselves, but of the strings given a source. Not that you'd be able to calculate it anyway. Furthermore, if it is false that you do not know how the strings were generated, then you have no need to ask me about them.

 

Furthermore, there is no moving of goalposts, and no switch to a hypothetical example. Had I told you the source of the strings and then insisted that the source was different, that would be moving of goalposts. I'm being very consistent here: you don't know the source of the strings, and should use only the strings themselves to calculate their information and entropy. In fact, I'm being nice in that I'm giving you something fairly easy to calculate. If I gave a source for the strings other than a random number generator, then you wouldn't be able to calculate (nor even rank) it now could you?

Link to comment
Share on other sites

 

You do know that thermal entropy is only a limited subset of entropy in general, and in particular thermal entropy can decrease? Or don't you believe in chemical cold packs?

 

If you use the full definition, then you get something that looks suspiciously like your formula for information:

 

The probabilities though are specific. In thermal entropy the probability refers to the probability of the discrete thermal energy states. I suspect you are aware that the formula is also written as a function of heat content and temperature while information entropy is a function of information content.

 

 

Oh, well if you like putting words in my mouth, I'm sure I can think of some very interesting things to say that you said, but that isn't really very constructive. When talking about thermodynamics people frequently say "energy" when they mean "free energy", which works fine and is implicitly understood when talking to people who know thermodynamics. For example, the person I quoted when introducing the article used that technically incorrect wording, and then I did too. If you like I can further explain to you the difference between "energy" and "free energy".

 

The problem with the abstract and title is that it makes the incorrect claim that information was "converted" to [higher] energy [states]. The information was not converted.

 

 

OK. So then there's plenty of information around (a whole universeful of it), so it should come as no surprise to you that there's information in DNA molecules. Glad that is settled. I suppose now we can drop the "information" part, and move on to the "entropy" part, right?

 

Absolute values for information and information entropy have never been an issue. The change in information content and information entropy of a particular system is the question at hand. Thus far, the source of this change in information order for abiogenesis and diversity of life has not been identified, nor has the mechanism for how the change was transferred into the patterns present in DNA. Most often when an attempt is made to explain this, the response is that thermal entropy accounts for these changes. However, as this discussion demonstrates, it is unknown how thermal order substitutes for molecular and information order.

 

I know it is hard for you to see why you are wrong, which is why I followed it up with a specific example. Just to clarify yet again since you seem to be trying to confuse the issue again, I'm talking about the information entropy of the film.

 

When noise is added to the film by exposing it to random light patterns, the probability of many or all of the discrete information states rises and entropy increases. The information is not "destroyed" per se, since the discrete information states still exist, but the high degree of order of the information is altered.

 

In case you are not aware, strings can be generated in many different ways. These strings in particular could have been generated in many different ways, not necessarily the same for each string, and knowing the source would give you additional information not included in the strings. As such, giving you that information wouldn't allow you to calculate the entropy of the strings themselves, but of the strings given a source. Not that you'd be able to calculate it anyway. Furthermore, if it is false that you do not know how the strings were generated, then you have no need to ask me about them.

 

It was false to say "it would be best to say the data strings were generated by some unknown process". It is not best to change the problem in midstream. I don't find anything best about this shift to a different and hypothetical example. It is moving the goalposts, and that is what I meant by false.

 

If I gave a source for the strings other than a random number generator, then you wouldn't be able to calculate (nor even rank) it now could you?

 

I believe I could generally describe net entropy change for each case and rank them. I have previously mentioned the difficulty in calculating absolute values for entropy of any kind, however, in this thread I am primarily concerned with changes in information and entropy, so this is not an issue for the broader topic. Here you want me to tell you the absolute values and now you want the case to be hypothetical as opposed to actual.

 

When one asks a question that involves probability, the system generally must be described. Thermodynamics problems involving entropy provide sufficient background to enable a solution. You are effectively asking me the probability of the three strings without describing the system. When someone asks "What is the probability of thus and such?", one generally responds "What are the givens?". It would be like asking a student to calculate, for a particular undescribed heat engine, the net thermal entropy rise per cycle. Failure to describe the actual situation makes the problem hypothetical, since the solver must make up a hypothetical system. The issue I have is that hypothetical examples have a way of making the impossible seem probable, the probable seem likely, and the likely seem actual.

Link to comment
Share on other sites

The probabilities though are specific. In thermal entropy the probability refers to the probability of the discrete thermal energy states. I suspect you are aware that the formula is also written as a function of heat content and temperature while information entropy is a function of information content.

Do you believe that for an organism to grow, it needs a source of low information entropy?

 

Because, when an organism grows, it reproduces the cells within it. Inside each of these cells are strands (a polymer) of DNA, and that DNA polymer was put together from DNA monomers. These monomers were in random places (free floating). So we have gone from high information entropy (complete uncertainty of any information, because the DNA monomers were free floating) to low information entropy (a complete pair-bonded DNA polymer that has a high degree of information, as it is the blueprint of an entire organism).

 

Do you believe that this process requires a source of low information entropy to occur? You don't seem to have a problem believing that organisms grow, and this has been observed to occur.

 

But these EXACT same processes that go on in cells are what allow the organism to grow and what allow evolution to work. If you are denying that evolution can occur, then you also have to deny that growth of organisms can occur, because they are the exact same thing. They have the exact same source of "information entropy", as it is the same process that allows both to occur.

 

So the question is: Does the growth of an organism require a source of low information entropy, and if so, what is it?

Link to comment
Share on other sites

Absolute values for information and information entropy have never been an issue. The change in information content and information entropy of a particular system is the question at hand. Thus far, the source of this change in information order for abiogenesis and diversity of life has not been identified, nor has the mechanism for how the change was transferred into the patterns present in DNA. Most often when an attempt is made to explain this, the response is that thermal entropy accounts for these changes. However, as this discussion demonstrates, it is unknown how thermal order substitutes for molecular and information order.

 

OK, but you're being inconsistent again... If you are now interested in the information content and information entropy of specific systems, then why, when I talk about photographic film, are you talking of light patterns that are not part of the film? Does the information content of the film increase when taking a picture or not? (My line of questioning about the information entropy of the film I will leave for later.)

 

When noise is added to the film by exposing it to random light patterns, the probability of many or all of the discrete information states rises and entropy increases. The information is not "destroyed" per se, since the discrete information states still exist, but the high degree of order of the information is altered.

 

 

 

It was false to say "it would be best to say the data strings were generated by some unknown process". It is not best to change the problem in midstream. I don't find anything best about this shift to a different and hypothetical example. It is moving the goalposts, and that is what I meant by false.

 

 

 

I believe I could generally describe net entropy change for each case and rank them. I have previously mentioned the difficulty in calculating absolute values for entropy of any kind, however, in this thread I am primarily concerned with changes in information and entropy, so this is not an issue for the broader topic. Here you want me to tell you the absolute values and now you want the case to be hypothetical as opposed to actual.

 

When one asks a question that involves probability, the system generally must be described. Thermodynamics problems involving entropy provide sufficient background to enable a solution. You are effectively asking me the probability of the three strings without describing the system. When someone asks "What is the probability of thus and such?", one generally responds "What are the givens?". It would be like asking a student to calculate, for a particular undescribed heat engine, the net thermal entropy rise per cycle. Failure to describe the actual situation makes the problem hypothetical, since the solver must make up a hypothetical system. The issue I have is that hypothetical examples have a way of making the impossible seem probable, the probable seem likely, and the likely seem actual.

 

I never changed the problem, just refused to change it into the problem you wanted. And yes, feel free to rank the strings instead of finding an absolute value. This is also not changing the problem, since I said so right from the start. Also, it is not a hypothetical example since the data there is real.

 

If you like, you can also compare the same strings' ranks if the strings were generated by a random number generator of equal probability for everything. Not sure if that would be a hypothetical example or a real one -- probably hypothetical due to the length of the strings.

Link to comment
Share on other sites

Do you believe that for an organism to grow, it needs a source of low information entropy?

 

Biological systems contain functional prescriptive information from inception, passed to them by the causal agent of that biological organism, and this information is indeed needed to allow for cell function, development, and growth.

 

Because, when an organism grows, it reproduces the cells within it. Inside each of these cells are strands (a polymer) of DNA, and that DNA polymer was put together from DNA monomers. These monomers were in random places (free floating). So we have gone from high information entropy (complete uncertainty of any information, because the DNA monomers were free floating) to low information entropy (a complete pair-bonded DNA polymer that has a high degree of information, as it is the blueprint of an entire organism).

 

Do you believe that this process requires a source of low information entropy to occur? You don't seem to have a problem believing that organisms grow, and this has been observed to occur.

 

It is not just a belief; it is a fact that this process requires and has a source of functional prescriptive information passed to it by its cause.

 

But these EXACT same processes that go on in cells are what allow the organism to grow and what allow evolution to work.

 

No, sorry, the modern theory of evolution is described as being driven by a random, unmanaged, undirected process of genetic error acted on by natural selection. Cell processes are carefully managed and highly controlled. They include signal induction circuits, feedback and feed-forward controllers, error correction circuits, inventory and expression control, transportation, messaging and transcription, and adaptive repair. All of this and much more is coordinated by a complex set of molecular controls, and all of it is performed guided by functional prescriptive information.

 

If you are denying that evolution can occur, then you also have to deny that growth of organisms can occur, because they are the exact same thing. They have the exact same source of "information entropy", as it is the same process that allows both to occur.

 

I do not deny that change can occur somehow. I question how this functional information was derived and how large-scale changes occur, given that new form and function requires large, coordinated increases in functional prescriptive information. The evolutionary model has no answer for this, and no wonder, because functional and prescriptive information is formal but not physical, and it seems not possible to demonstrate that physical systems generate functional information.

 

So the question is: Does the growth of an organism require a source of low information entropy, and if so, what is it?

 

Yes, as described above the source is the cause of that organism.

 

OK, but you're being inconsistent again... If you are now interested in the information content and information entropy of specific systems, then why, when I talk about photographic film, are you talking of light patterns that are not part of the film? Does the information content of the film increase when taking a picture or not? (My line of questioning about the information entropy of the film I will leave for later.)

 

The photographic film is not a closed system, so inputs and outputs must be included. The specific light patterns encountered by the film through the exposure cycle are a relevant part of the equation. When inputs and outputs are included, there is no apparent net change in information. Also, since the light patterns are deterministically generated (P=1), it is unclear how much information, if any, is transferred to the film.

 

I never changed the problem, just refused to change it into the problem you wanted. And yes, feel free to rank the strings instead of finding an absolute value. This is also not changing the problem, since I said so right from the start. Also, it is not a hypothetical example since the data there is real.

 

If you like, you can also compare the same strings' ranks if the strings were generated by a random number generator of equal probability for everything. Not sure if that would be a hypothetical example or a real one -- probably hypothetical due to the length of the strings.

 

To compute the probabilities, and thus information content, I need background information and boundary conditions as previously requested. If you will kindly provide it, I can answer your question.

Edited by cypress
Link to comment
Share on other sites

Biological systems contain functional prescriptive information from inception, passed to them by the causal agent of that biological organism, and this information is indeed needed to allow for cell function, development, and growth.

But what is the causal agent here? God, some kind of alien, a supercomputer simulating us, what?

 

Actually, I have provided one such "causal" agent: the Sun. This is a source of low entropy and solves the problem that you keep bringing up. The low entropy (information or otherwise) comes from the Sun and drives the processes that can create low entropy information.

 

You have not disagreed that a process that uses low entropy energy can produce information, and this is what I am claiming in my posts. If you disagree that energy can be converted to information, then please state that, along with your argument and evidence of disproof. I have provided evidence to support my argument, and you have even posted evidence that supports my argument.

 

It is not just a belief; it is a fact that this process requires and has a source of functional prescriptive information passed to it by its cause.

This is your assertion, and you have yet to prove it or provide any evidence for it.

 

But I wonder, could you accept chance as the source? For example:

 

If a change that could be advantageous could only occur as a 1 in a million billion chance, then this might sound like a long shot. However, let's look at the numbers game.

 

A bacterium can go through around 30 generations in a day. That is around 1 generation every 48 minutes. When they reproduce, they split into two copies.

 

So, starting with just 1 bacterium, let's see how many would exist after 1 day. At a doubling every 48 minutes, that would give us 1,073,741,824 (2^30). That is just over 1 billion. So how many more days to get to a million billion (1,000,000,000,000,000)? Would it take a million days? No, nowhere near it actually.

 

After 2 days you would have: 1,152,921,504,606,846,976 (2^60).

 

In other words, in 2 days there would be enough bacteria that a 1 in a million billion chance would likely occur around 1,000 times. And that is just 2 days.
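A quick check of the arithmetic above (a sketch assuming idealized, unlimited exponential growth):

generations_per_day = 30  # one division roughly every 48 minutes
days = 2

# Idealized population after two days of doubling.
population = 2 ** (generations_per_day * days)
print(f"{population:,}")  # 1,152,921,504,606,846,976 = 2**60

# Expected number of 1-in-a-million-billion events across that population.
p_event = 1 / 1_000_000_000_000_000
print(round(population * p_event))  # ~1153, i.e. "around 1,000 times"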

 

This means that even if there is only a small chance of an event occurring, it can occur, because life reproduces exponentially. Now, this doesn't violate entropy, because entropy is a probability statement. It is a statement that there are more disordered configurations than ordered configurations, thus any random change would more than likely cause the system to enter a disordered state. But if there are enough random changes in the system, then it is still possible that the system could enter an ordered state again.

 

Take for example a box filled with a gas. If we were to force all the gas over to one side of the container, this would be a highly ordered state (low entropy). If we remove whatever it is we were using to force the gas into this state, then through the random motions of the molecules, it is more likely that the gas will spread out and fill up the box (a higher entropy state). But this does not preclude the possibility that the random movements of the gas could make the gas end up all on one side of the box.

 

There is no law of physics against this occurrence. It would not need any low entropy energy input. It is just that it is highly unlikely to occur. But it could occur.
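To make those odds concrete (my own illustration): the chance that all N gas molecules end up in one half of the box at once is (1/2)^N, which shrinks fast but never reaches zero:

# Probability that every one of N molecules sits in the left half at once.
for n in (10, 100, 1000):
    print(n, 0.5 ** n)
# 10 -> ~9.8e-04, 100 -> ~7.9e-31, 1000 -> ~9.3e-302: absurdly unlikely, never forbidden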

 

And that is the crux of the issue. Sure, if you want to force a system into a particular state, then it will take energy and increase entropy, just as forcing a particular data set of information will take a low entropy source.

 

However, evolution is not forcing anything; it is just taking advantage of a random occurrence. One that, although unlikely, has a high chance of occurring because life reproduces exponentially.

 

No, sorry, the modern theory of evolution is described as being driven by a random, unmanaged, undirected process of genetic error acted on by natural selection. Cell processes are carefully managed and highly controlled. They include signal induction circuits, feedback and feed-forward controllers, error correction circuits, inventory and expression control, transportation, messaging and transcription, and adaptive repair. All of this and much more is coordinated by a complex set of molecular controls, and all of it is performed guided by functional prescriptive information.

Actually, evolution doesn't require randomness; it just operates despite it. Evolution can occur in a totally deterministic system.

 

However, what I was saying (if you bother to read my posts) is that the same processes that occur in a cell when it divides, and that allow an organism to grow, are the same processes that occur when an organism reproduces. Namely, cell division.

 

It is also at this point that changes to the genetic code can occur in an organism, as well as when the organism reproduces. When these changes occur in a growing organism, they can lead to cancer, or, as with the human immune system, can lead to protection from diseases (they can be good, bad, or neutral).

 

If you can accept that these changes can occur in the course of normal growth, then you have to accept that these same things can occur during reproduction. Now, if only the good or neutral changes survive to reproduce, then you have evolution.

 

These cell processes that correct mutations are processes and require low entropy energy to drive them. But then this is energy acting to increase information (although that information was changed to begin with).

 

If you can accept that energy can change mutations back to their original state, then why is it so hard to accept that energy can be used to cause that change in the first place?

 

I do not deny that change can occur somehow. I question how this functional information was derived and how large-scale changes occur, given that new form and function requires large, coordinated increases in functional prescriptive information. The evolutionary model has no answer for this, and no wonder, because functional and prescriptive information is formal but not physical, and it seems not possible to demonstrate that physical systems generate functional information.

As I have shown, random changes can lead to an ordered data set if the disordered data sets are rejected.

 

As for co-ordination, the evolution of the eye seems to be an impossibility because of the often quoted argument: "of what use is half an eye?"

 

Actually half an eye is quite useful, even 1/10 of an eye is useful.

 

We know that even some single celled organisms can react to light, as they have photosensitive chemicals in them (which have useful but non-photosensitive precursors, as well as some photosensitive precursors that are not used to detect light). In multicelled organisms, these same chemicals (and their precursors) exist. With such organisms, if they have a patch of cells that expresses these chemicals, then they could react to light or dark.

 

This would be an advantage, as it would allow the organism to detect whether it is light or dark, or whether a potential predator is in the area (but it wouldn't give it any more information than that).

 

However, if the cells under the centre of that patch did not divide as fast or as much (a reduction of growth), then the outer part would curve up a bit and give the patch a slightly concave shape.

 

This makes the patch of photosensitive cells more useful, as the raised edges can cast a shadow if the light is shining from that direction. Thus it gives the organism a new piece of information it can use: direction.

 

If the cells then excrete mucous (a common thing for cells to do), then this mucous membrane would both protect the photosensitive cells and, because it would be of a different refractive index, also cause the light to be refracted. This would give this patch of photosensitive cells a sharper definition of direction.

 

Any changes that caused this mucous to form a better shape would lead to an organism with much better image definition on its photosensitive cells. Eventually, these changes would give the light a high degree of focus on the light sensitive cells, giving the organism an eye.

 

So, starting from the fact that photosensitive cells exist and organisms can react to light detected by them, and using just slight changes that are mere modifications of existing structures, we can take a patch of photosensitive cells and, through evolution, turn it into an eye.

 

This evolution uses natural selection, as each step gives the organism more chance to avoid predators (or even find prey), and if you can avoid predators you are more likely to live to breed and produce offspring (remember exponential reproduction rates). Over time, because more survive to breed, the number of organisms with these changes will grow. This means there is more chance that another beneficial mutation will occur in one of them (because there are so many of them compared to others), and as that new mutation is beneficial, it too will give the same breeding advantage to the new offspring, which will come to dominate the others (exponential increase in numbers again).

 

As any bad mutation would make the offspring less able to avoid predators, organisms with such mutations would be more likely to be eaten and thus less likely to survive to breed. If they don't produce any (or many) offspring, then, due to exponential growth, they will not have a large population (if any at all, if they don't reproduce), and so a mutation needed for further eye development would be less likely to occur in them (and even if it did, it could still occur in the other population, and would still be more likely to anyway).
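
The exponential-reproduction point can be made concrete with a two-lineage sketch (the 5% advantage, starting counts, and generation number are all assumed for illustration, not measured values):

```python
# Two lineages reproducing over generations; the mutant line has a 5%
# survival/reproduction edge (all numbers assumed for illustration).
normal, mutant = 1000.0, 1.0
for _ in range(400):
    normal *= 1.00   # baseline lineage just replaces itself
    mutant *= 1.05   # slightly better at surviving to breed
share = mutant / (mutant + normal)
print(f"mutant share of the population after 400 generations: {share:.2%}")
```

Even starting as one individual in a thousand, the advantaged lineage ends up as essentially the whole population, which is why further beneficial mutations are most likely to land in it.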

 

So, what we have is some chemicals that started off with nothing to do with light detection (IIRC they were used as protection from damage caused by radiation, as they could absorb the radiation and disperse it harmlessly), but because of small changes and selection for advantageous changes, we end up with a highly sophisticated visual receptor (an eye).

 

One such chemical is retinal (a form of vitamin A): see here: http://en.wikipedia.org/wiki/Retinal

 

Yes, as described above, the source is the cause of that organism.

The "Cause" of the organisms. What does this mean. If you don't clarify this better, it is just a form of thought terminating cliche ( http://en.wikipedia.org/wiki/Thought-terminating_clich%C3%A9#Thought-terminating_clich.C3.A9 ).


But what is the causal agent here? God, some kind of alien, a supercomputer simulating us, what?

 

The causal agent for an organism is the parent from which the organism was derived. Life's known causal agent is life. Information's known source is information.

 

Actually, I have provided one such "causal" agent: the Sun. This is a source of low entropy and solves the problem that you keep bringing up. The low entropy (information or otherwise) comes from the Sun and drives the processes that can create low entropy information.
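
The sense in which the Sun is a low entropy source can be made quantitative with a standard textbook estimate (the temperatures below are approximate and are my assumptions, not from the post): energy arrives as ~5800 K sunlight and leaves as ~290 K thermal radiation, so every joule passing through the biosphere can export entropy to space.

```python
# Rough entropy bookkeeping for sunlight (approximate temperatures assumed):
# heat dQ carries entropy dQ/T, so absorbing energy at T_sun and re-emitting
# the same energy at T_earth exports net entropy dQ*(1/T_earth - 1/T_sun).
T_sun = 5800.0      # effective solar photosphere temperature, K
T_earth = 290.0     # mean terrestrial emission temperature, K
per_joule = 1.0 / T_earth - 1.0 / T_sun
print(f"net entropy exported per joule of throughput: ~{per_joule:.5f} J/K")
```

That exported entropy is the budget that local entropy-lowering processes, biological or otherwise, can draw against without violating the second law.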

 

Your proposed solution is a logical fallacy because it moves the goalposts: it answers a question I did not ask. The Sun provides a source of thermal energy and order to power biological systems through the irreversible thermal cycle under which they operate. That is the question you answered, but it was not asked of you. You have not provided a source of functional, prescriptive information and order to power biological systems through the cycle of net gain in functional information and information order necessary to generate the observed diversity. Evolutionary theory posits that all biological diversity is a result of observed evolutionary processes, but these processes do not explain or account for a source for this information.

 

You have not disagreed that a process that uses low entropy energy can produce information, and this is what I am claiming in my posts.

 

Yes I have. There is no indication whatsoever that any physical-only process can increase information without a source of information. Functional, prescriptive information is formal, as opposed to physical, though a representation of information can be stored in, transported, transcribed, and processed by physical systems. Every attempt to generate prescriptive information from physical-only systems, greater than what the probabilistic resources supply, has failed.

 

Take for example a box filled with a gas. If we were to force all the gas over to one side of the container, this would be a highly ordered state (low entropy). If we remove whatever it is we were using to force the gas into this state, then through the random motions of the molecules, it is more likely that the gas will spread out and fill up the box (a higher entropy state). But this does not preclude the possibility that the random movements of the gas could make the gas end up all on one side of the box.

 

There is no law of physics against this occurrence. It would not need any low entropy energy input. It is just that it is highly unlikely to occur. But it could occur.

 

The second law of thermodynamics, the law of entropy, is against this argument.


The second law of thermodynamics, the law of entropy, is against this argument.

 

No, it isn't. See the fluctuation theorem, and, for example:

 

G.M. Wang, E.M. Sevick, E. Mittag, D.J. Searles & Denis J. Evans (2002), Experimental demonstration of violations of the Second Law of Thermodynamics for small systems and short time scales, Physical Review Letters 89: 050601/1–050601/4. doi:10.1103/PhysRevLett.89.050601

 

As they become smaller, the probability that they will run thermodynamically in reverse inescapably becomes greater. Consequently, these results imply that the fluctuation theorem has important ramifications for nanotechnology and indeed for how life itself functions.

 

This is particularly important for very small boxes.
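
A small Monte Carlo run (my sketch, not the Wang et al. experiment) shows the size dependence directly: count how often at least 80% of the particles happen to sit on one side of the box.

```python
import random

# Sample n particles placed independently in a box and count how often a
# large "entropy-lowering" fluctuation (>= 80% of them on one side) occurs.
random.seed(1)

def big_fluctuation_rate(n, trials=100_000):
    hits = 0
    for _ in range(trials):
        left = sum(random.random() < 0.5 for _ in range(n))
        if left >= 0.8 * n or left <= 0.2 * n:
            hits += 1
    return hits / trials

for n in (10, 50, 200):
    print(f"{n:4d} particles: rate = {big_fluctuation_rate(n):.5f}")
```

With ten particles the fluctuation shows up roughly one sample in ten; with a few hundred it effectively never does, which is the FT point about very small boxes.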


Yes I have. There is no indication whatsoever that any physical-only process can increase information without a source of information. Functional, prescriptive information is formal, as opposed to physical, though a representation of information can be stored in, transported, transcribed, and processed by physical systems. Every attempt to generate prescriptive information from physical-only systems, greater than what the probabilistic resources supply, has failed.

How have complex elements arisen purely from the input of energy acting upon hydrogen and helium?


How have complex elements arisen purely from the input of energy acting upon hydrogen and helium?

 

Through deterministic processes, though these heavier elements are not complex in the same sense as functional information, so other than as a logical fallacy I don't see your attempted point. Are you suggesting that these deterministic physical processes generate functional information and information order?

 

No, it isn't. See the fluctuation theorem, and, for example:

 

G.M. Wang, E.M. Sevick, E. Mittag, D.J. Searles & Denis J. Evans (2002), Experimental demonstration of violations of the Second Law of Thermodynamics for small systems and short time scales, Physical Review Letters 89: 050601/1–050601/4. doi:10.1103/PhysRevLett.89.050601

 

This is particularly important for very small boxes.

 

It is incorrect to describe these demonstrations as violations of entropy. It is well known that, mathematically, probability distributions broaden as sample size diminishes; not only is this important for very small boxes, small sample sets are a prerequisite. The law of entropy has probability as its basis. As the sample size is diminished, variation increases. Here is what your link says:

 

Note that the FT does not state that the second law of thermodynamics is wrong or invalid. The second law of thermodynamics is a statement about macroscopic systems. The FT is more general. It can be applied to both microscopic and macroscopic systems. When applied to macroscopic systems, the FT is equivalent to the Second Law of Thermodynamics.

 

Here is a much more accurate description of what is occurring than the fallacy you have attempted to present in your post.

 

It would be interesting to see how one could demonstrate that a process involving even small sample sets could "run in reverse" for any period of time, or in a continuous unbroken sequence greater than the one or two apparently broken event chains demonstrated (apparent because each demonstration seems to involve external energy sources and numerous artificial interventions), so that net system entropy drops. It would be even more interesting for someone to demonstrate that these theorems apply to presumed evolutionary processes, given the posited long time scales, macro population sizes, and presumed unbroken stepwise pathways. His statement regarding life appears to be an opinion in scientistic prose, and I would welcome factual confirmation.

 

Returning to your claim directly, FT does not apply to the situation Edtharan described and I contested. The wiki article you offered contradicts your claim as does the link I offered. Your counter is not supported.


It is incorrect to describe these demonstrations as violations of entropy. It is well known that, mathematically, probability distributions broaden as sample size diminishes; not only is this important for very small boxes, small sample sets are a prerequisite. The law of entropy has probability as its basis.

Indeed. And Edtharan said "There is no law of physics against this occurance.... just that it is highly unlikely to occur." Probability.

 

Note that the FT does not state that the second law of thermodynamics is wrong or invalid. The second law of thermodynamics is a statement about macroscopic systems. The FT is more general. It can be applied to both microscopic and macroscopic systems. When applied to macroscopic systems, the FT is equivalent to the Second Law of Thermodynamics.

I wasn't arguing that the Second Law is wrong. I was arguing what your link argues, and what Edtharan said:

 

This minimum in H, or maximum in entropy, is just a statistical average and real systems will fluctuate about this average. These fluctuations are very small for the large number of molecules in common objects, but the fact remains that entropy will fluctuate up and down.

 

It is statistically highly unlikely that what Edtharan says will occur, but not impossible. That is the point.


The second law is a law of physical chemistry, and it applies to Edtharan's example. There is a law against the occurrence Edtharan described, since his configuration as described applies at a macro level over an extended series of discrete events. Relocating all the gas molecules to one side of the box Edtharan describes would require many billions of discrete macro events, not the situation described by the FT. This thread is about processes posited to have been made up of long series of discrete macro events thought to have occurred over an extended period of time. This is the context. To change the context to a handful of discrete nano events is the logical fallacy of moving the goalposts. Your argument depends on this logical fallacy.


Edtharan didn't specify the size of his box. A larger size makes the scenario correspondingly unlikely, but not physically impossible.

 

In any case, my argument has nothing to do with processes occurring over an extended period of time. It has to do with Edtharan's example. If his example is faulty, that's not my fault.


The second law is a law of physical chemistry, and it applies to Edtharan's example. There is a law against the occurrence Edtharan described, since his configuration as described applies at a macro level over an extended series of discrete events. Relocating all the gas molecules to one side of the box Edtharan describes would require many billions of discrete macro events, not the situation described by the FT. This thread is about processes posited to have been made up of long series of discrete macro events thought to have occurred over an extended period of time. This is the context. To change the context to a handful of discrete nano events is the logical fallacy of moving the goalposts. Your argument depends on this logical fallacy.

 

Yes, unlikely to the point where you wouldn't see it happen even if the universe's lifetime were repeated several times over. Still not impossible, technically, just very, very unlikely, and more unlikely the more particles involved. You do realize that statistical models allow you to calculate probabilities of things even if they haven't been observed, right?

 

If we're relating this to evolution, then we can limit our size to something like a chromosome or smaller; natural selection can select some of the intermediate steps (if they affect fitness), and the organisms have an input of low entropy energy for lowering thermodynamic and information entropy.

