Entropy Question


foodchain

As I currently understand it, entropy is a measure of disorder, but a comparison to what, exactly, allows that definition to exist? I mean, if you say something is in disorder, that means there must be an order, right, or some other state the thing should or could be in besides the disordered one?

 

*Side Note.

I would also like to know what role entropy plays in string theory, in the context of the uncertainty principle. If anyone could point me in a direction to read up on that, it would be great.


As I currently understand it, entropy is a measure of disorder, but a comparison to what, exactly, allows that definition to exist? I mean, if you say something is in disorder, that means there must be an order, right, or some other state the thing should or could be in besides the disordered one?

 

Well, the third law of thermodynamics tells us that a substance in lattice form at absolute zero (zero kelvin) has an entropy of 0, because there are no vibrations, or only negligible vibrations, among the ions. Anything else has a degree of disorder to it.
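
In Boltzmann's statistical form this is immediate. Writing W for the number of microstates compatible with the macrostate, a perfect lattice at absolute zero has exactly one:

```latex
S = k_B \ln W, \qquad W = 1 \;\Rightarrow\; S = k_B \ln 1 = 0
```

Any other state has W > 1 accessible microstates and hence S > 0.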


Well, the third law of thermodynamics tells us that a substance in lattice form at absolute zero (zero kelvin) has an entropy of 0, because there are no vibrations, or only negligible vibrations, among the ions. Anything else has a degree of disorder to it.

 

Yes, but in going to a BEC, which is what I understand absolute zero to be, you have a certain physical phenomenon that occurs, what some may call a fourth state of matter, I guess, or another phase of matter. So how does that actually qualify as order/disorder, really? I mean, it pretty much stops entropy at that point, if my assumptions are correct.


As I currently understand it, entropy is a measure of disorder, but a comparison to what, exactly, allows that definition to exist? I mean, if you say something is in disorder, that means there must be an order, right, or some other state the thing should or could be in besides the disordered one?

 

There is a more exact meaning of disorder in the statistical definition of entropy. You can look up the canonical ensemble.

http://en.wikipedia.org/wiki/Canonical_ensemble

 

There you can see the clear connection between "disorder", microscopic degrees of freedom, and uncertainty.
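
As a toy illustration of that connection (a minimal sketch of my own, in Python; the four energy levels are made up), the canonical ensemble assigns each microstate a Boltzmann weight, and the Gibbs entropy then measures how uncertain we are about which microstate the system occupies:

```python
import math

def canonical_entropy(energies, kT):
    """Gibbs entropy (in units of k_B) of a toy canonical ensemble."""
    weights = [math.exp(-E / kT) for E in energies]  # Boltzmann factors
    Z = sum(weights)                                 # partition function
    probs = [w / Z for w in weights]
    # S/k_B = -sum(p ln p): zero when one state dominates,
    # ln(N) when all N states are equally likely.
    return -sum(p * math.log(p) for p in probs)

levels = [0.0, 1.0, 2.0, 3.0]
print(canonical_entropy(levels, kT=0.1))  # ~0: low T, macrostate pins the microstate
print(canonical_entropy(levels, kT=100))  # ~ln(4): high T, maximal disorder
```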

 

In this context the system typically has two levels of description, the macrolevel and the microlevel. Certain variables are taken to define the macrolevel, for example energy and temperature.

 

If the entropy is zero, it means the macrostate uniquely determines the microstate.

 

The idea is that the microlevel is generally subject to disturbances, and from plain probabilistic reasoning, if there is no prior asymmetry between the microstates (i.e. all the microstates are a priori equally likely), an arbitrary unknown disturbance is more likely to increase the entropy than to decrease it. That's the second law.

 

Low entropy = low uncertainty. However, by the same token, low uncertainty is also less likely. If you throw dice to come up with the microstates, there is a clear relation between the probability of an outcome and its entropy. This is the whole beauty of the concept.
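
The dice picture can be made completely explicit (again a sketch of my own, not from the thread): take ordered pairs of faces as microstates and their sum as the macrostate. The sums with more microstates are simultaneously the more probable ones and the higher-entropy ones:

```python
import math
from collections import Counter
from itertools import product

# Microstates: ordered pairs of faces. Macrostate: their sum.
multiplicity = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in sorted(multiplicity):
    W = multiplicity[total]  # microstates for this macrostate
    p = W / 36               # probability of the macrostate
    S = math.log(W)          # Boltzmann entropy with k_B = 1
    print(f"sum={total:2d}  W={W}  p={p:.3f}  S={S:.2f}")
# sum=7 has the most microstates (W=6), so it is both the most
# probable outcome and the one with the highest entropy.
```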

 

While this is good stuff, it's still important to understand that, in a certain sense, the uncertainty in QM is of another kind. The x-p uncertainty can also be thought of as being the result of a logical entanglement.

 

What can be done with the entropy stuff in the quantum domain is to consider the future of a series of equiprobable surfaces. These surfaces can be deduced from entropy principles, since a high entropy change is more likely than a low entropy change. And unlike in thermodynamics, you can include the kinetics in this reasoning.

 

This is the basic idea behind how an unknown or random microphysics can give rise to nontrivial macrophysics. And this idea can be extended and generalized.

 

I think there are many interesting things going on in these fields even today. Some people are working on deriving general relativity from such generalized entropy principles. This is close to my interests too. The concept of relativity can be reduced to such principles and the concept of a prior probability distribution. Clearly, different observers have different priors. The beauty of this approach, if successful, is that it will probably give completely new insight into GR and into what the most natural extension to the quantum domain is. This is where I personally expect a lot of future physics.

 

I'm currently working on these things, but I don't want to blur things up by posting incomplete math. I was going to post later for comments, but at this point I think it would come out as baloney, so I will keep it for maturation. I've got a few key points to solve first.

 

/Fredrik


Yes, but in going to a BEC, which is what I understand absolute zero to be, you have a certain physical phenomenon that occurs, what some may call a fourth state of matter, I guess, or another phase of matter. So how does that actually qualify as order/disorder, really? I mean, it pretty much stops entropy at that point, if my assumptions are correct.

 

 

BECs are cold, but nothing can attain absolute zero, and they are considered the fifth state — plasma is the fourth.

 

Order means that there are a very limited number of states the particles can attain. Since atoms in a BEC have the same phase, they are basically all in the same state.


Another perspective on entropy (the one used for chemical engineering work) is that it is just a useful linking function and state function, like internal energy, not necessarily a measure of order or possible states. Although you can argue that this perception is wrong and doesn't immediately reflect what's going on, it doesn't have to.


I mean, if you say something is in disorder, that means there must be an order, right, or some other state the thing should or could be in besides the disordered one?

 

If you continue to pursue the abstract interpretations, there are different, more abstract interpretations of entropy. There is something called information entropy, which can be said to be a measure of our ignorance of a system (which could be any variable, or a datastream). It's used in various fields outside physics, for example in data processing, where the entropy can also be thought of as information content. This way, entropy is given a cleanly abstract, information-theoretic meaning, without physical references.

It's all about information, and ultimately the dynamics is about updating your information and making guesses based on incomplete information. I think many of those who work along these lines today hold an opinion somewhere along the lines that reality is about formulating a prognosis based on admittedly incomplete information. You know certain things, and you need to make an educated guess based on what you've got. And this exact procedure can be given a scientific formulation.

The basics of such an approach is to formulate physics from scratch, from much more information-theoretic first principles. Many laws of physics *might* be deducible from such a dynamical model in terms of information mechanics. Elementary things are, for example, logical entanglements (that are there, but seemingly hidden), and the merging of conflicting information, which happens all the time. What is the scientific way of judging conflicting information?
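
The information entropy mentioned above has a standard formula due to Shannon, which is easy to make concrete (a minimal sketch of my own): given the symbol frequencies of a datastream, it measures, in bits, our average ignorance of the next symbol.

```python
import math
from collections import Counter

def shannon_entropy(stream):
    """Shannon entropy, in bits per symbol, of an iterable datastream."""
    counts = Counter(stream)
    n = sum(counts.values())
    # H = -sum(p log2 p): 0 when the stream is perfectly predictable,
    # log2(alphabet size) when every symbol is equally likely.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -- complete information, zero uncertainty
print(shannon_entropy("abababab"))  # 1.0 -- one bit of ignorance per symbol
print(shannon_entropy("abcdabcd"))  # 2.0 -- two bits per symbol
```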

 

The reference, or "right order", you ask for is the case of perfect or complete information, where there is zero uncertainty.

 

/Fredrik


BECs are cold, but nothing can attain absolute zero, and they are considered the fifth state — plasma is the fourth.

 

Order means that there are a very limited number of states the particles can attain. Since atoms in a BEC have the same phase, they are basically all in the same state.

 

Right, plasma, sorry about that one and thanks.


If you continue to pursue the abstract interpretations, there are different, more abstract interpretations of entropy. There is something called information entropy, which can be said to be a measure of our ignorance of a system (which could be any variable, or a datastream). It's used in various fields outside physics, for example in data processing, where the entropy can also be thought of as information content. This way, entropy is given a cleanly abstract, information-theoretic meaning, without physical references. It's all about information, and ultimately the dynamics is about updating your information and making guesses based on incomplete information. I think many of those who work along these lines today hold an opinion somewhere along the lines that reality is about formulating a prognosis based on admittedly incomplete information. You know certain things, and you need to make an educated guess based on what you've got. And this exact procedure can be given a scientific formulation. The basics of such an approach is to formulate physics from scratch, from much more information-theoretic first principles. Many laws of physics *might* be deducible from such a dynamical model in terms of information mechanics. Elementary things are, for example, logical entanglements (that are there, but seemingly hidden), and the merging of conflicting information, which happens all the time. What is the scientific way of judging conflicting information?

 

The reference, or "right order", you ask for is the case of perfect or complete information, where there is zero uncertainty.

 

/Fredrik

 

Ok, I think I understand, so entropy is a definition that can sort of go along with the scientific definition of energy in that regard, in that we say something like steam in a certain mechanical configuration is capable of doing so much work, right? The disorder term comes in basically because the energy is "free" to do something, and all of it collectively is information in which an uncertainty exists? If I am close, please let me know, thank you. In that way, absolute zero would possibly also be information death, right? I think that would be a breach of conservation laws though.


BECs are cold, but nothing can attain absolute zero, and they are considered the fifth state — plasma is the fourth.

 

Order means that there are a very limited number of states the particles can attain. Since atoms in a BEC have the same phase, they are basically all in the same state.

 

I also forgot this, and again you are right as usual. :D

 

"However, a big step was when Cornell and Wieman cooled a small sample of atoms down to only a few billionths (0.000,000,001) of a degree above Absolute Zero! That was what they needed to do to see Bose-Einstein condensation."

 

http://www.colorado.edu/physics/2000/bec/temperature.html


Ok, I think I understand, so entropy is a definition that can sort of go along with the scientific definition of energy in that regard, in that we say something like steam in a certain mechanical configuration is capable of doing so much work, right? The disorder term comes in basically because the energy is "free" to do something, and all of it collectively is information in which an uncertainty exists? If I am close, please let me know, thank you. In that way, absolute zero would possibly also be information death, right? I think that would be a breach of conservation laws though.

 

I'm not sure I understood your suggestion. But if you are referring to the so-called free energies (Gibbs, Helmholtz) in thermodynamics, you are correct that they are strongly related to entropy. If something is in a highly ordered (= unlikely) state, it's a fair guess that such a system is likely to degrade in time, or that it will have a tendency to change into a more likely configuration, though the rate at which it does so is unknown. That tendency can be tamed into a force that can be used to do work, like organisms digesting food and converting the free energy of the oxidation reactions into work (biosynthesis).

 

http://en.wikipedia.org/wiki/Gibbs_free_energy

 

The free energy concepts are what is most commonly used in chemistry and biology, and they are heavily entangled with entropy as well.

 

In the extended treatment of QM that I'm looking for, the conservation laws tend to apply to things like information. This can also explain the logic in other violations.

 

Still, I'm not sure if that's what you meant, but it seems like something thereabouts.

 

/Fredrik


Ok, I think I understand, so entropy is a definition that can sort of go along with the scientific definition of energy in that regard, in that we say something like steam in a certain mechanical configuration is capable of doing so much work, right? The disorder term comes in basically because the energy is "free" to do something, and all of it collectively is information in which an uncertainty exists? If I am close, please let me know, thank you. In that way, absolute zero would possibly also be information death, right? I think that would be a breach of conservation laws though.

 

Entropy is a measure of the quality of energy of a system, as far as being able to do work. As Fredrik notes, you can look at the Gibbs (or Helmholtz) free energy — there is a TS term (temperature*entropy). Energy that's "tied up" in a higher entropy (at a given temperature) isn't available to be used elsewhere.
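
Spelled out, the TS term appears in both of the standard free energies:

```latex
A = U - TS \quad \text{(Helmholtz)}, \qquad
G = H - TS \quad \text{(Gibbs)}, \qquad
\Delta G = \Delta H - T\,\Delta S \;\; \text{at constant } T
```

The larger the TΔS term at a given temperature, the less of the energy change is left free to do useful work.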


This [statistics] is the basic idea behind how an unknown or random microphysics can give rise to nontrivial macrophysics. And this idea can be extended and generalized.

 

I think there are many interesting things going on in these fields even today. Some people are working on deriving general relativity from such generalized entropy principles. [...] The concept of relativity can be reduced to such principles and the concept of a prior probability distribution.

Now this sparks my interest. However, I have no idea how this is supposed to work. Classically, we derive physics from the principle of extremal action, so you could start by interpreting the local action as a probability and might get a probability distribution that, on sufficiently large scales, peaks sharply for the classical solution. Is it somehow related to that, or am I completely on the wrong track? Can you give some additional information, or maybe links to work on that subject?


Hello Atheist. First, I think it should be noted that this is, to my knowledge, not yet a completed approach and there are open wires, but it's what I'm working on, and a number of others are working on similar ideas, though with variations.

 

Yes, it is similar to the action principle, and the infinitesimal action would correspond to a transition probability. The idea is that the transition probability can be derived by microscopically submerging everything in noise; then it boils down to a matter of stability. Unstable and easily excited transitions are more likely, with respect to the current state. Transition probabilities are fundamentally relative, not absolute. Moreover, the transition amplitudes are estimated from the current state of information, in a spirit similar to maximum entropy methods. This means the laws of time evolution look exactly like a learning algorithm based on inference methods. It also means that stuff like mass and energy will receive a new interpretation in a more abstract setting.
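
The maximum entropy method mentioned here has a simple classic illustration (my own sketch, not Fredrik's construction; the example is Jaynes' constrained die): among all distributions consistent with what you know, say a fixed mean, the least-biased estimate is the one of maximum entropy, which takes an exponential form whose multiplier can be found numerically.

```python
import math

def maxent_distribution(values, target_mean, tol=1e-10):
    """Maximum entropy distribution over `values` with a prescribed mean.

    The MaxEnt solution is p_i proportional to exp(-beta * v_i); beta is
    found by bisection, since the mean decreases monotonically with beta.
    """
    def mean_at(beta):
        weights = [math.exp(-beta * v) for v in values]
        Z = sum(weights)
        return sum(w * v for w, v in zip(weights, values)) / Z

    lo, hi = -50.0, 50.0          # bracket for the Lagrange multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_at(mid) > target_mean:
            lo = mid              # mean still too high -> raise beta
        else:
            hi = mid
    beta = (lo + hi) / 2
    weights = [math.exp(-beta * v) for v in values]
    Z = sum(weights)
    return [w / Z for w in weights]

# Least-biased guess for a die known only to average 4.5 instead of 3.5:
print(maxent_distribution([1, 2, 3, 4, 5, 6], target_mean=4.5))
```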

 

I have not found a single source yet which lays out a satisfactory approach in detail, and I resumed working on this myself just a month ago after a 10-year break. But here are some links relating to the ideas.

 

See Ariel Caticha's page

http://www.albany.edu/physics/ariel_caticha.htm

- This guy has started to address some fundamental questions and has ideas on deriving GR from more basic first principles, but I have not seen any recent papers from him. Whether he has succeeded or not, I share his ideas on this. If he hasn't done it, someone else will.

 

RANDOM DYNAMICS

http://www.nbi.dk/~kleppe/random/rel.html

- I didn't analyse this completely, and I am not sure I agree with their exact approach, but no doubt the general spirit is right.

Some other keywords for searches are "information geometry", "entropy dynamics", and "Bayesian inference".

 

The nice thing about this approach is that it starts off from first principles, and stuff like inertia and relativity come naturally. And due to the probabilistic foundations, I think the extension to the QM domain will be natural.

 

I'd be interested in your comments on these ideas, in particular negative criticism. But if you search the existing papers, I wouldn't get hung up on all the details, because the subject seems to be young, and I've seen no complete paper yet.

 

I'll be happy to get back with details as I get more work done myself. Feedback would be nice. But this is a hobby for me and it's slow.

 

/Fredrik


However, I have no idea how this is supposed to work. Classically, we derive physics from the principle of extremal action, so you could start by interpreting the local action as a probability and might get a probability distribution that, on sufficiently large scales, peaks sharply for the classical solution.

 

The basic idea of how it should work is simple, and it is something like this:

 

Relative to the current state of knowledge, we have an uncertainty: things we do not know. Now, if we could quantify our uncertainty into a set of equally likely outcomes (relative to our prior information), then evolution would be playing dice on what we don't know, constrained by what we do know. The problem is how to define some measure of the unknown so that we can define the outcomes. And here the entropy principle can help out. We consider infinitesimal disturbances and calculate the increase in the entropy (or just the number of micro degrees of freedom, like the microcanonical ensemble; the exact entropy definition should not matter). Then we can assign probabilities to each possible disturbance, and a more likely disturbance will appear more frequently.

This will also be self-correcting, because if deviations are observed, our priors will update, but not instantly! Here comes also the correspondence with inertia: the resistance to updating your opinion when exposed to contradictory information. The "internal mass" can be given an interpretation in terms of how fast the prior is updated. Here something like the particle's information capacity will be a fundamental variable. Something that can store, say, only one bit of information will have minimal inertia, and it will align instantly. But a more complex object will consume the deviations and adapt according to its own inertia.
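
A toy version of this reasoning (my own illustrative sketch, with the model and all numbers made up, not Fredrik's actual construction): take a system of N two-state spins, enumerate the small disturbances of the current macrostate, and weight each disturbance by the multiplicity of the macrostate it leads to. Disturbances that raise the multiplicity, i.e. the entropy, then appear more frequently, and an ordered initial state drifts toward the maximum entropy macrostate.

```python
import math
import random

N = 100  # a system of N two-state "spins"; macrostate k = number of up-spins

def multiplicity(k):
    """Number of microstates with k up-spins out of N."""
    return math.comb(N, k)

def disturb(k):
    """Pick k-1 or k+1 with probability proportional to the target's multiplicity."""
    moves = [m for m in (k - 1, k + 1) if 0 <= m <= N]
    weights = [multiplicity(m) for m in moves]
    return random.choices(moves, weights=weights)[0]

k = 5  # start in a highly ordered (unlikely) macrostate
for _ in range(2000):
    k = disturb(k)
print(k)  # typically near N/2 = 50, the maximum entropy macrostate: unknown
          # disturbances are more likely to raise the entropy than to lower it
```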

 

This is some of the philosophy behind the method. And the point is that it starts from very raw first principles, and I see good chances that it will be able to *explain* things previously considered fundamental in terms of the new framework. Another advantage is that it will be closely related to intelligent processing algorithms. I think it may (eventually) provide a unification of physics and artificial intelligence models, probably also bringing more insight into the human brain from another perspective.

 

My own starting point has been to consider how the prior emerges from data consumption, and the basic rules for prior updates. And the next thing is how new dimensions are born out of deviations. I'm also trying to reevaluate the choice of formalism, whether the wavefunction formalism is really the ultimate choice or not.
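
As a cartoon of the "inertia as resistance to prior updates" idea (a hypothetical toy of my own; the class and parameter names are made up, and this is just an exponential moving average, not Fredrik's formalism):

```python
class InertialEstimator:
    """A 'prior' that absorbs new observations at a rate set by its inertia."""

    def __init__(self, prior, inertia):
        self.estimate = prior
        self.inertia = inertia  # larger = slower to change its mind

    def observe(self, value):
        rate = 1.0 / (1.0 + self.inertia)
        self.estimate += rate * (value - self.estimate)
        return self.estimate

light = InertialEstimator(prior=0.0, inertia=0.0)   # "one bit": aligns instantly
heavy = InertialEstimator(prior=0.0, inertia=99.0)  # complex object: adapts slowly
for _ in range(10):
    light.observe(1.0)
    heavy.observe(1.0)
print(light.estimate, heavy.estimate)  # 1.0 vs. ~0.096 after ten contradictions
```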

 

/Fredrik

