
A wrong theory --- Theory of Entropy


shufeng-zhang


If we define heat engine efficiency as η = W/W1, that is, replace Q1 in the original definition η = W/Q1 with W1 (W is still the net work the engine delivers to the surroundings in one cycle, and W1 is the work the engine delivers to the surroundings during that cycle), and take the elementary reversible cycle to be a Stirling cycle, then if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0 as well!
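For concreteness, here is a minimal numerical sketch of the Stirling-cycle bookkeeping described above, assuming one mole of a monatomic ideal gas and arbitrarily chosen temperatures and volumes (my numbers, purely for illustration); it simply adds up ∫dQ/T, ∫dW/T and ∫dE/T over the four reversible legs:

# Sketch: add up the three cycle integrals leg by leg for a reversible
# Stirling cycle of one mole of monatomic ideal gas (illustrative numbers only).
import math

n, R = 1.0, 8.314            # mol, J/(mol*K)
Cv = 1.5 * R                 # constant-volume heat capacity, monatomic ideal gas
Th, Tc = 500.0, 300.0        # hot and cold isotherm temperatures, K
V1, V2 = 0.010, 0.030        # the two isochore volumes, m^3

# Each tuple holds (integral of dQ/T, integral of dW/T, integral of dE/T)
# for one leg, using the closed-form ideal-gas expressions.
legs = [
    (n*R*math.log(V2/V1),   n*R*math.log(V2/V1),  0.0),                  # isothermal expansion at Th
    (n*Cv*math.log(Tc/Th),  0.0,                  n*Cv*math.log(Tc/Th)), # isochoric cooling Th -> Tc
    (n*R*math.log(V1/V2),   n*R*math.log(V1/V2),  0.0),                  # isothermal compression at Tc
    (n*Cv*math.log(Th/Tc),  0.0,                  n*Cv*math.log(Th/Tc)), # isochoric heating Tc -> Th
]

for name, total in zip(("dQ/T", "dW/T", "dE/T"), map(sum, zip(*legs))):
    print(f"cycle integral of {name}: {total:+.3e} J/K")

With this particular working substance all three sums come out to zero, because for an ideal gas dW/T = nR dV/V and dE/T = nCv dT/T happen to be exact differentials as well; whether that observation supports or undermines the argument above is exactly what is in dispute in this thread.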

 

If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 really each defined a new state variable of the system, the three state variables would necessarily differ from one another; on the other hand, their dimensions would all be the same, namely J/K, and they would all be state variables. So we would have to "make" three different state variables of the system with the same dimensions, without knowing what they are; no doubt, this is absurd.

 

In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we see that its prerequisite is a differentiable function y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.

On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.
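For reference, the textbook construction that this argument disputes does not take a limit of the single quotient ΔQ/T; it writes the cycle integral as a limit of sums over ever finer partitions of the cycle,

[math]\oint \frac{dQ}{T} = \lim_{\max|\Delta Q_i| \to 0}\ \sum_i \frac{\Delta Q_i}{T_i},[/math]

in which each individual term tends to zero while the number of terms grows without bound.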

 

So ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are all untenable!

 

See the paper "Entropy: A concept that is not a physical quantity".

 

http://blog.51xuewen.com/zhangsf/article_27631.htm

 

http://www.qiji.cn/eprint/abs/3806.html (PDF)

 

shufeng-zhang china Email: uhsgnahz@126.com


The equation dG = dH - T·dS can be used as an empirical test of the concept of entropy. Remember that entropy is not a directly measurable quantity; it is a state function. One cannot measure the entropy of a system, but one can detect the change in entropy of a system. I'll grant you that, yes, this is subject to how you treat the system mathematically. But when measuring the Gibbs energy of a chemical reaction (dG above), as temperature approaches extreme highs and lows, the calculated Gibbs energy deviates significantly from the observed calorimetry. This is because the growing factor T (temperature) modifies the entropy term (dS) more and more. I am assuming the existence of entropy here, sorry about the somewhat circular logic. I understand what you are saying, but I respectfully disagree.
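To make that last point concrete, here is a minimal sketch in a few lines of Python, with invented reaction numbers (not taken from any measurement): as T grows, the T·dS term increasingly dominates dG = dH - T·dS.

# Sketch: how the T*dS term grows with temperature in dG = dH - T*dS.
# The reaction numbers below are made up purely for illustration.
dH = -92.0e3          # hypothetical reaction enthalpy change, J/mol
dS = -199.0           # hypothetical reaction entropy change, J/(mol*K)

for T in (100.0, 298.0, 500.0, 1000.0, 2000.0):    # temperatures in K
    dG = dH - T * dS                               # Gibbs energy change at this T
    print(f"T = {T:6.0f} K   T*dS = {T*dS/1e3:+8.1f} kJ/mol   dG = {dG/1e3:+8.1f} kJ/mol")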


  • 4 months later...

The equation dG = dH - T·dS can be used as an empirical test of the concept of entropy. Remember that entropy is not a directly measurable quantity; it is a state function. One cannot measure the entropy of a system, but one can detect the change in entropy of a system. I'll grant you that, yes, this is subject to how you treat the system mathematically. But when measuring the Gibbs energy of a chemical reaction (dG above), as temperature approaches extreme highs and lows, the calculated Gibbs energy deviates significantly from the observed calorimetry. This is because the growing factor T (temperature) modifies the entropy term (dS) more and more. I am assuming the existence of entropy here, sorry about the somewhat circular logic. I understand what you are saying, but I respectfully disagree.

 

 

Oh, you really don't understand.


 

See the paper "Entropy: A concept that is not a physical quantity".

 

 

Why is this a problem? We have loads of things that are arguably conceptual rather than physical quantities (electric fields and potentials, phonons, to name a few). What is important is whether it helps us describe nature.


Oh, you really don't understand.

 

No, you don't understand my understanding.

 

Zombie thread?

 

Read any first-year physics or chemistry textbook; you will find the evidence for entropy to be overwhelming. Ranting about entropy not being a physical quantity is irrelevant. Entropy is an intrinsic property of nature; this can be concluded mathematically and by simple observation.

 

The Gaussian distribution is largely conceptual and is only perfectly realized as N reaches infinity, which never happens. Is the Gaussian distribution ridiculous?

 

Please explain how the rates of gas-phase reactions always seem to increase when the calculated [math] \Delta S [/math] is positive.


If we define heat engine efficiency as η = W/W1, that is, replace Q1 in the original definition η = W/Q1 with W1 (W is still the net work the engine delivers to the surroundings in one cycle, and W1 is the work the engine delivers to the surroundings during that cycle), and take the elementary reversible cycle to be a Stirling cycle, then if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0 as well!

 

 

If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 really each defined a new state variable of the system, the three state variables would necessarily differ from one another; on the other hand, their dimensions would all be the same, namely J/K, and they would all be state variables. So we would have to "make" three different state variables of the system with the same dimensions, without knowing what they are; no doubt, this is absurd.

 

 

In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we see that its prerequisite is a differentiable function y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.

 

On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.

 

 

So ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are all untenable!

 

 

See the paper "Entropy: A concept that is not a physical quantity".

 

 

http://blog.51xuewen.com/zhangsf/article_27631.htm

 

 

http://www.qiji.cn/eprint/abs/3806.html (PDF)

 

 

shufeng-zhang china Email: uhsgnahz@126.com

 

OK, you are saying entropy is not a physical quantity.

 

I must assume you mean it is not perfectly physically measurable.

 

Given that entropy is a function of infrared radiation, we would expect from quantum theory that the better you can measure the infrared radiation, the less well you can measure the entropy.

 

However, to claim that Brownian motion does not exist would be silly, and so I do not think you are claiming this.

 

So, put into words: what exactly do you mean when you say that entropy is not a physical quantity?


In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we see that its prerequisite is a differentiable function y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.

On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.

 

Q is not a state function, and it is also a scalar quantity. That's why [math] \Delta Q [/math] does not appear in the definition of entropy but [math]dQ[/math] does.
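A small illustration of that distinction, assuming one mole of a monatomic ideal gas and two arbitrary reversible paths between the same initial and final states (my numbers, not anyone's data): the integral of dQ depends on the path, while the integral of dQ/T does not.

# Sketch: heat is path dependent, but the accumulated dQ/T between the same
# two states is not (reversible paths, one mole of monatomic ideal gas).
import math

n, R = 1.0, 8.314
Cv = 1.5 * R                       # monatomic ideal gas
T1, T2 = 300.0, 600.0              # initial and final temperatures, K
V1, V2 = 0.010, 0.030              # initial and final volumes, m^3

# Path A: isothermal expansion at T1 (V1 -> V2), then isochoric heating (T1 -> T2)
Q_A = n*R*T1*math.log(V2/V1) + n*Cv*(T2 - T1)
S_A = n*R*math.log(V2/V1)    + n*Cv*math.log(T2/T1)

# Path B: isochoric heating at V1 (T1 -> T2), then isothermal expansion at T2 (V1 -> V2)
Q_B = n*Cv*(T2 - T1) + n*R*T2*math.log(V2/V1)
S_B = n*Cv*math.log(T2/T1) + n*R*math.log(V2/V1)

print(f"integral of dQ   : path A {Q_A:10.1f} J,    path B {Q_B:10.1f} J")     # differ
print(f"integral of dQ/T : path A {S_A:10.3f} J/K,  path B {S_B:10.3f} J/K")   # agree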


It is interesting (not positive and not negative, but interesting) to know that I'm not the only one who is often confused about entropy. It also doesn't help to capitalize the word. I think it has been one of the more confusing topics in studying physics, and I'm still not out of the woods. But I actually do believe, now more than before, that entropy does have physical reality, and at least as much physical reality as energy has. Why is it then so confusing? I believe it has to do with the fact that entropy was originally introduced in the age of steam engines. Usually one is first exposed to entropy in the context of some kind of physical system going through the Carnot cycle. After that, one either learns a lot more thermodynamics in physical chemistry, and may soon be completely comfortable computing all the different thermodynamic quantities, Gibbs energy, Helmholtz free energy, enthalpy and all that. In my case, however, the orientation was a little different. The course on chemistry for physics students was offered on a faraway campus and started at 8:15 am, and, as the comedian Lewis Black said on stage about an economics course he flunked, it's not so easy to learn anything so early in the morning with one bloodshot eye. :)

It so happened that the students who got up late, but may have had long discussions till late at night (first about the universe, then perhaps about nature and beauty - and finally the real geeks, drunk on the topic and the beer, about the beauty of women, who had all left the place early enough to get away from them ....), these physics students (including myself) were then confronted again with the by then already forgotten topic of entropy when we got to statistical mechanics, almost all of which I've forgotten by now. By then one had already sufficiently adjusted to more abstract formal systems, for example in QM, and it was simply OK that entropy there was introduced via the relation between energy states and probability in multi-state systems. And so this appeared much easier to comprehend than all the stuff that came from Carnot diagrams, because we didn't know then exactly why something that was discovered and explained in the context of steam engines would have any relevance for such things as black-body radiation and even cosmology. Entropy also becomes extremely important in modern biology. One needs to know about it in neurology, in particular for understanding biological learning systems, but it is also directly relevant to machine learning and Bayesian modelling, because there are plenty of bridges between biology, statistical mechanics and computer science. This topic is interesting for me because I also want to fill in all the gaps I have and want to get some of that nice feeling that comes with "getting it". But for now I'm pretty stupid about all this. I'll try anyway, bear with me.

 

One of the best short and to-the-point explanations of entropy I found in the first 12 pages of Feynman's book Statistical Mechanics: A Set of Lectures (1972, Benjamin Inc.). This is how Feynman does it: he puts down one statement as if it were simply true (it actually is), namely that "The key principle of statistical mechanics is as follows: if a system in equilibrium can be in one of N states, then the probability of having energy E_n is (1/Q) exp(-E_n/kT)", where k is Boltzmann's constant. The first important thing here is that infamous normalization with Z, which is the partition function. Feynman used the symbol Q instead of Z, but Z is somehow more appealing to me: Helmholtz defined the same thing as Z, namely the German "Zustandssumme", which may be loosely translated as "state sum". This general formula with the exponential function can also be derived (somehow, I forget the details) by combinatorial arguments, taking the limit for a system where the number of states N is very large (Stirling's formula is used there). There is one sentence written by Feynman here that lights an LED in my head, even though it sounds somewhat tautological: "Because two states of the same energy are equally probable, the probability of a state having energy E is a function only of the energy; P = P(E)".

Think about it: no matter how the energy is related to other physical quantities, or defined in terms of other physical observables, the probability of the system being in a particular state depends only on that energy itself. Next, one has to understand (or somehow accept) that once a probability function is somehow defined or invented, one can make predictions (or, more to the point, write down equations) about the expectation values of all kinds of quantities. So that's where Feynman immediately writes down the second equation of the book: the expectation of a physical observable associated with a quantum-mechanical operator A can be written as the normalized sum over all states of the inner products, namely as

(1/Z) \sum_n <n| A | n> exp(-E_n/kT), where the quantum mechanical states are written as |n>.

It doesn't seem to matter whether one understands the operator A as an operator on a Hilbert space or as some finite or infinite matrix and replaces <n|A|n> by x^t A x (where x is now some vector-valued state), or whether one replaces the sum by some (approximate or exact) integral over the state space.
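As a toy illustration of that formula (my own invented three-level example, not one from Feynman's book), here is how the thermal expectation value of an observable A comes out numerically:

# Sketch: thermal expectation value of an observable for a toy three-level
# system, written as (1/Z) * sum_n <n|A|n> exp(-E_n/kT).
import numpy as np

k = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                               # temperature, K
E = np.array([0.0, 1.0, 2.5]) * k * T   # toy energy levels E_n, J
A = np.diag([1.0, -1.0, 3.0])           # toy observable, diagonal in the energy basis

w = np.exp(-E / (k * T))                # Boltzmann weights exp(-E_n/kT)
Z = w.sum()                             # partition function
avg_A = (np.diag(A) * w).sum() / Z      # (1/Z) * sum_n <n|A|n> exp(-E_n/kT)
print("Z =", Z, "  <A> =", avg_A)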

How does this lead to entropy? All you have to do is try to compute the expectation of the energy itself. That could look something like this:

<E> = (1/Z) \sum_n E_n exp(-E_n/kT)

The assumption is that Z is known for this. But then one may notice that a term like E_n exp(-E_n/kT) looks (up to a sign) like the derivative of exp(-E_n/kT) with respect to (1/kT). This differentiation can be pulled in front of the sum, and an interesting relation comes out:

<E> = k T^2 d (ln(Z))/dT

which contains the derivative of the logarithm of the partition function with respect to temperature.

So the logarithm is already here, which is not surprising: if the exponential function is in the game, the logarithm is just around the corner.
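A quick numerical sanity check of that relation, again with an invented discrete spectrum of my own choosing: the direct thermal average of E and kT^2 d(ln Z)/dT, computed by a finite difference, give the same number.

# Sketch: check <E> = k T^2 d(ln Z)/dT numerically for a toy spectrum.
import numpy as np

k = 1.380649e-23                               # Boltzmann constant, J/K
E = np.array([0.0, 1.0, 2.0, 5.0]) * 1e-21     # toy energy levels, J

def lnZ(T):
    return np.log(np.exp(-E / (k * T)).sum())  # logarithm of the partition function

def E_direct(T):
    w = np.exp(-E / (k * T))
    return (E * w).sum() / w.sum()             # (1/Z) * sum_n E_n exp(-E_n/kT)

T, h = 300.0, 1e-3                             # temperature and finite-difference step, K
E_from_lnZ = k * T**2 * (lnZ(T + h) - lnZ(T - h)) / (2 * h)
print(E_direct(T), E_from_lnZ)                 # the two values should agree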

 

Now put yourself in the shoes of Helmholtz, who may have convinced himself that the Boltzmann relation is correct, namely that the energy of a state the system is in and the probability of being in that state are directly related by p = (1/Z) exp(-E/kT). But perhaps Helmholtz didn't care about the probability and wanted the energy back. That would be

E = - kT ln(p) - kT ln(Z)

Cool, says Helmholtz, and tries to compute the expectation value <.> of the energy, always assuming that Z is somehow known and equal to its expectation value.

<E> = -kT <ln(p)> - kT ln(Z)

So there is now this weird thing in there, the expectation of the logarithm of the probability. It certainly didn't really go that way, but imagine that Helmholtz simply didn't care much about this weird thing, used the symbol S for S = -k <ln(p)>, and rewrote the above as

<E> = TS - kT ln(Z)

 

Then he rewrote this one more time, because the term kT ln(Z) makes about as much sense at first view as the entropy S, namely as follows:

A = -kT ln(Z) = <E> -TS or F = U-TS (in Feynman's book)

This has the symbol A for Arbeit (German for work, namely physical work, steam engines and such, perhaps early attempts at cars).

 

Actually, A was the "useful work", while TS goes out through the chimney. Now it's called the free energy F, the same thing as the free work A, in the sense of free-to-use or useful work. I have no idea if this is how Helmholtz actually worked it out, but it is interesting to see how enormously general these relations are. In many cases it is possible to compute the partition function directly from a formal model. That gives the free energy. And if it is also possible to formally compute <E>, then it's no problem to compute the entropy explicitly. Sometimes, as in machine learning where this stuff is also useful, there is no way to compute the partition function. But there is usually some cute way to compute the entropy: you just bang your learning model into shape so that it becomes possible, which usually means explicitly defining an energy function in such a way that Z can be formally computed, or approximated.
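As a sanity check on this chain of identities, here is a toy calculation (invented energy levels, my numbers) verifying that F = -kT ln Z equals <E> - TS when S is computed as -k sum_n p_n ln p_n:

# Sketch: for a toy discrete spectrum, check F = -kT ln Z = <E> - T*S
# with S = -k * sum_n p_n ln p_n.
import numpy as np

k = 1.380649e-23                               # Boltzmann constant, J/K
T = 300.0                                      # temperature, K
E = np.array([0.0, 1.0, 3.0, 7.0]) * 1e-21     # toy energy levels, J

w = np.exp(-E / (k * T))                       # Boltzmann weights
Z = w.sum()                                    # partition function
p = w / Z                                      # p_n = (1/Z) exp(-E_n/kT)

E_mean = (p * E).sum()                         # <E>
S = -k * (p * np.log(p)).sum()                 # S = -k <ln p>
F = -k * T * np.log(Z)                         # free energy from the partition function
print(F, E_mean - T * S)                       # the two should coincide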

 

On page 12 of Feynman's book, this whole shebang is already used to compute black-body radiation. And that is another reason I looked into these things: one has to know the origin of the formula for black-body radiation and why it goes with the 4th power of T, because this has recently been quite useful in the context of trying to debunk the so-called deniers of human-caused global warming. A recent new argument is the claim that there is no way to objectively compute the theoretical black-body radiation from the earth and then account for the difference the atmosphere makes. Some deniers now try to tell us that the average surface temperature is not well defined, and hence the arguments for claiming that the earth's surface temperature is rising are void. Even though there are a bunch of trained physicists, who imo went astray, behind that, I think it's total b.s. But I won't get into this now, and I'm not gonna start a flame war here trying to torture "doubters". There are others who have done this with much more elegance.

 

I almost forgot why on earth I felt the need to write all this stuff down. And now I'm running out of steam (darn entropy). That big question is ringing in my head, and I hope to find simple explanations. What does this mean:

 

"Life exports entropy"

 

(Sounds very deep, doesn't it, but is it?) I think in simple terms it means: life needs to use energy while reducing entropy locally, but can only do that by increasing the entropy of the entire system, which includes life itself and the environment. The export of entropy goes out through dumping all kinds of things, or by the smoke stack. Instead of life, call it the biosphere, a little less general than life. The question is then: how does life find a trick that results in the local reduction of entropy, that is, going to less likely states? How does this work on the small scale of a cell, and how on the scale of a complex larger biological system? It's an important question to answer correctly and in detail, because this may be a new wonder weapon against creationists, who always come up with the at best lazy idea that the new information comes from something not explainable by known laws of physics. I mean, even if I or others can't make it clear right away, and even if some pious fool may find a bible story, or a wondrous verse in the Koran or in the Torah, or even a Buddhist sutra, that seems to mention entropy, count on this: I am absolutely going to stay in the physics department.

 

But I would like to see not just a hand-waving explanation for this, but one that really starts from statistical mechanics, no less. (I'm learning it, I'm getting there....)


  • 11 months later...

The theory of entropy is wrong.

In my opinion, any physical quantity should be physically defined; it must then be followed by a mathematical relation. In the case of entropy, there is no clear physical definition. I think entropy can be defined as the energy possessed by the individual constituents of the system, i.e., the internal energy of the system (excluding the internal energies of the individual constituents).



Moderator Note

Topics merged.

shufeng-zhang you have posted this more than once, with no follow-up discussion, which is a violation of the rules.

finiter, if you want to discuss your own interpretation of the concept, please do it in a separate thread.

Do not derail this thread further by responding to this warning



Moderator Note

Topics merged.

 

shufeng-zhang you have posted this more than once, with no follow-up discussion, which is a violation of the rules.

 

finiter, if you want to discuss your own interpretation of the concept, please do it in a separate thread.

 

Do not derail this thread further by responding to this warning

 

Oh, thank you for the alert! In fact, this is a new version of that paper, but I didn't clearly indicate this point!

 

Thank you!

 

If we define heat engine efficiency as η = W/W1, that is, replace Q1 in the original definition η = W/Q1 with W1 (W is still the net work the engine delivers to the surroundings in one cycle, and W1 is the work the engine delivers to the surroundings during that cycle), and take the elementary reversible cycle to be a Stirling cycle, then if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0 as well!

 

 

If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 really each defined a new state variable of the system, the three state variables would necessarily differ from one another; on the other hand, their dimensions would all be the same, namely J/K, and they would all be state variables. So we would have to "make" three different state variables of the system with the same dimensions, without knowing what they are; no doubt, this is absurd.

 

 

In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we see that its prerequisite is a differentiable function y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.

 

On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.

 

 

So ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are all untenable!

 

 

See the paper "Entropy: A concept that is not a physical quantity".

 

Thank you for your eloquent and detailed attempt to explain what mississippichem doesn't understand. We always appreciate it when members take the time to explain their views instead of just dismissing those that they disagree with.

 

Thank you for your comments!

Entropy A concept that is not a physical quantity.pdf


  • 2 weeks later...

Historically - - -

 

Planck: "We have now to seek a physical quantity whose magnitude shall serve as a general measure of the preference of nature for a given state. ... R. Clausius actually found this quantity and called it 'entropy'. ... Conduction of heat to a body increases its entropy, and, in fact, by an amount equal to the ratio of the quantity of heat given the body to its temperature. Simple compression, on the other hand, does not change its entropy. ... In the limiting case, a reversible isothermal cyclical process, the sign of the (heat) equality holds, and therefore the work consumed is zero, and also the heat produced. This law plays a leading role in the applications of thermodynamics to physical chemistry. ... The second law of thermodynamics imposes further limitations on the first law, allowing only certain types of transformations subject to certain conditions. In accordance with this law, the sum of Q/T is equal to or greater than zero. Heat is produced and work is consumed. In the limiting case, the work consumed is zero, the heat produced is zero, and the equality holds." Q = heat.

 

Planck's equation for the general law of entropy is φ = S - (U + pV)/T, where φ is what Planck calls the phase of the system and is linear and homogeneous in S, U and V; S is the entropy, and U is the energy.
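If I am reading the symbols correctly (my identification, not stated in the post), this φ is just the Gibbs function G = H - TS with the sign flipped and divided by T, which ties it back to the dG = dH - T·dS expression discussed earlier in the thread:

[math]\varphi = S - \frac{U + pV}{T} = \frac{TS - H}{T} = -\frac{G}{T}[/math]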

 

Entropy is "... a measure of the preference of nature for a given energy state". Planck used this concept to define the energy states of chemical reactions, which was the foundation for the development of his radiation equation. It is not clear to me how you can apply the concept of entropy to an engine.

