shufengzhang

Posts posted by shufengzhang


Then why have you brought this to a forum where people typically come to debate!?
You are arguing with poor sentence structure. You've yet to address even one argument raised in this thread. I'll accept that as your statement of defeat.
Just because I used entropy in my argument does not mean that I am using circular logic. Show me the circular logic in my argument. I was showing how entropy must exist in order to explain many of the phenomena we observe.
I ask again: what are the natural variables for internal energy? I find it ill-advised for you to invoke fancy stat. mech. arguments when you clearly do not understand rudimentary classical thermodynamics.
You've yet to debate. I wish you would at least attempt to tackle even one of these arguments. People have put time into their posts and you effectively respond with "Nahuh".
And you expect to rebuild it while denying spontaneous endothermic processes, armed with a flawed interpretation of Liouville's theorem.
Thanks for a good laugh. Physical chemistry currently works fine and currently pays my salary. Physical chemistry was already built from thermo and stat. mech. Show where current chemical thermodynamics is inconsistent with these notions from physics.
I'm not going to let you keep making unfounded assertions and snide empty comments. Pony up or shut up. It's quite simple really.
I posted the paper here because I want others to read it and think it over for themselves; that's all.
Thanks for your participation.
Please close this thread.
Shufengzhang,
Two responders have not argued with your paper.
Both posted questions designed to clarify what you are saying and help you with a discussion.
You have been rude and dismissive to both and completely failed (refused?) to clarify your points.
I have not come across a more tolerant scientific forum than this one, so I fear you will soon lose your voice here if you carry on discussion in this way.
I have not yet sought to debate with you since all I have done has been to try to establish the basis for your paper.
I think the following sequence speaks for itself.
Debate can't solve the problem, I have said; just do as you please.
John Cuthber and I have been trying to point out to you that if you do away with the concept of entropy then you have no explanation for why [observed] spontaneous processes occur.
Allow me to clarify:
The internal energy of a system in its natural variables is U(S,V,N); its differential (the fundamental thermodynamic relation) is:
[math] dU = T\,dS - P\,dV + \sum_{i} \mu _{i}\, dN_{i} [/math]
By definition of the differential we have:
[math] dU=\frac{\partial U}{\partial S}dS + \frac{\partial U}{\partial V}dV + \sum_{i} \frac{\partial U}{\partial N_{i}}dN_{i} [/math]
By inspection we can see that T, -P, and the μ_i correspond to the partial derivatives of the internal energy with respect to its natural variables. Which raises the question, by the way: how do you intend to define the internal energy without S?
Alright, so now let's define the Gibbs energy, which is a good measure of spontaneity near equilibrium. We can get an expression for the Gibbs energy, and for all the other thermodynamic state functions, by Legendre transforming the internal energy any number of times. For the Gibbs energy we Legendre transform in V and S:
[math] G = TS - PV + \sum_{i} \mu _{i} N_{i} - V\frac{\partial U}{\partial V} - S\frac{\partial U}{\partial S} [/math]
Differentiating with knowledge of the coefficients established above gives:
[math] dG(P,T,N_{i}) = V\,dP - S\,dT + \sum_{i} \mu _{i}\, dN_{i} [/math]
You can do some algebra and show that the enthalpy is hidden inside the expression (yes, I know that technically you can't integrate the expression as is, because S has a T dependence, but the approximation is valid near equilibrium and for a relatively small temperature change):
[math] dG = dH - S\,dT [/math]
[math] \Delta G = \Delta H - T\,\Delta S [/math]
So for spontaneous processes [not just chemical reactions, by the way] that require a net input of energy, i.e. those with a positive change in enthalpy, there must be a change in some other quantity in order to meet the spontaneity requirement that the change in Gibbs energy be negative. How do you explain that?
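As a numerical illustration of this point, consider the melting of ice, a spontaneous endothermic process (the ΔH and ΔS values below are approximate handbook numbers, assumed here for illustration, not taken from the thread):

```python
# Melting of ice: spontaneous above 0 degC even though it absorbs heat.
# Approximate handbook values (assumed for illustration):
dH = 6010.0  # J/mol, enthalpy of fusion (positive: endothermic)
dS = 22.0    # J/(mol*K), entropy of fusion (positive: liquid has more microstates)

def delta_G(T):
    """Gibbs energy change at temperature T (K): dG = dH - T*dS."""
    return dH - T * dS

print(delta_G(298.15))  # negative: melting is spontaneous at 25 degC
print(delta_G(263.15))  # positive: melting is not spontaneous at -10 degC
print(delta_G(273.15))  # close to 0: equilibrium at the melting point
```

The sign of ΔG flips at T = ΔH/ΔS ≈ 273 K: above that temperature the TΔS term outweighs the positive ΔH, which is exactly the role entropy plays in the argument above.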
This is all from classical thermodynamics. But your argument also makes no sense on a statistical mechanics level. Entropy is a measure of volume in the phase space of an ensemble. I don't see how entropy being not directly experimentally measurable, or not unique, affects that. Do you not agree that for a larger phase space volume there are a greater number of accessible microstates?
I read your paper, by the way. No matter what you justify through the Liouville theorem or by redefining the Carnot cycle, your result must agree with experimental observation.
EDIT: LaTeX hiccup
You are arguing in a circle.
Debate can't solve the problem.
I think it's clear enough that anyone who doesn't understand the relationship between entropy and endothermic reactions does not understand thermodynamics and is, by that fact, unqualified to offer a meaningful opinion about entropy.
On that basis, and also because the OP's refusal to discuss his ideas is a breach of rule 8, I suggest closing the thread.
Thanks for your "suggest closing the thread"; I'm tired of debating.
It is a pity to attribute to someone long dead something he did not say as the basis for a paper.
Clausius actually said
"Die Wärme kann nicht von selbst aus einem kälteren in einen wärmeren Körper übergehen" ("Heat cannot, of itself, pass from a colder into a warmer body").
I see not even the ghost of a cyclic integral in that statement.
What I want to say is: thermodynamics and statistical physics, and from them physical chemistry, will be rebuilt, believe it or not.
!
Moderator Note
shufengzhang,
1. The insults stop.
2. You'll answer their questions. As John Cuthber noted, simply telling members to read the paper does not constitute an answer.
If you cannot comply with the rules of this forum, this thread will be closed.
1) I have answered their questions.
2) I didn't insult him; I only cracked a joke with John Cuthber. If this caused a misunderstanding, I am sorry.
3) Some questions, such as "explain spontaneous endothermic reactions", are unrelated to this topic; I don't think I have to answer every question someone poses.
4) I think I have spoken clearly, but sometimes they could not understand.
5) I would like to answer any question RELATED to this topic, but if someone cannot understand, what should I do next?
While certainly effective at making use of words, simply repeating what you have said before isn't likely to convince anyone who wasn't already convinced. Neither will handwaving and declaring questions you don't want to answer as off topic.
I have answered the questions; if you cannot understand them, I have nothing more to say.
In fact, my paper (the newest version) explains this very clearly.
OK, How does "This Paper" explain spontaneous endothermic reactions?
Also, don't pretend that telling me to read it answers the question. Answer the question here in this thread.
I really don't think one should put forward a question when he is confused and doesn't know what has been said.
Attention: I never said that "This Paper" explains spontaneous endothermic reactions!!!
What I said is:
" I have said "what you said is remote from the subject!"
You (or others) can raise interminable irrelevant questions like this one; surely that doesn't mean that I (or others) should answer them.
Certainly, I can answer this question, but it relates to another paper, and I don't want to release that paper now. "
I sincerely suggest you read these three lines again and again; when you really think you know their meaning, then you can continue to talk here.
Would you like to read this paper (the newest version) carefully, if you are interested in this topic?
I think only if you go Back To the Original may you understand! Why? Because you have been filled with the pre-existing content, and you are firmly convinced by it. Sometimes a theory is not only science but also belief. In fact, even a wrong theory can have many evidences and applications, otherwise it could not survive a long time, e.g., caloric theory, phlogiston theory, and so on. I hope you can read this paper thoroughly and cast aside preconceived ideas; then I think you can distinguish right from wrong.
The present study has demonstrated the nonexistence of Clausius entropy, which simultaneously denies Boltzmann entropy.
In statistical physics, the attempt to deduce entropy directly is untenable. On one hand, it involves a key step that turns an infinitesimal into a differential, which does not hold. On the other hand, the unit (J/K) of entropy (Boltzmann entropy) in statistical physics is carried over from Clausius entropy. So, if Clausius entropy does not exist, there is no source for that unit (J/K). As a result, the entropy in statistical physics is only a pure number, with no physical meaning.
In addition, even if we set aside the issue of the unit: from a pure probability point of view, in the equation S = k ln Ω, Ω is the so-called thermodynamic probability, and the calculation of Ω involves dividing the μ-space into phase cells. A phase cell is 2i-dimensional, where i is the total number of degrees of freedom of the molecules in the system. The essence of the calculation of Ω is the discretization of the continuous μ-space, which is supposed to give it objective meaning. In fact, this approach does not work, and no objective conclusion will come of it regardless of how much work has been done, because there is no objective, physically meaningful criterion for dividing the phase cells; that is, Ω has no objective meaning in physics. Together, the Liouville theorem and our conclusions indicate that Boltzmann entropy can only be taken as a technique for displaying irreversibility from a purely probabilistic point of view.
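A minimal numerical sketch of what the phase-cell size does and does not affect in S = k ln Ω (a toy position-space count with assumed cell sizes, not taken from the paper; the momentum factor is omitted because it cancels in differences): the arbitrary cell size shifts ln Ω only by an additive constant, so entropy differences between states do not depend on it.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def ln_Omega(V, N, cell):
    """Toy count of accessible cells for N independent particles in a box
    of volume V: the position part contributes (V / cell**3)**N cells.
    'cell' is the arbitrary coarse-graining length; the momentum factor
    is omitted since it cancels in the differences taken below."""
    return N * (math.log(V) - 3.0 * math.log(cell))

N = 100
# Entropy change on doubling the volume, computed with two different
# arbitrary phase-cell sizes:
dS_small = k * (ln_Omega(2.0, N, cell=1e-10) - ln_Omega(1.0, N, cell=1e-10))
dS_tiny  = k * (ln_Omega(2.0, N, cell=1e-12) - ln_Omega(1.0, N, cell=1e-12))

# The cell size shifts ln(Omega) by an additive constant only, so the
# entropy difference is the same in both cases, namely N*k*ln(2):
print(dS_small, dS_tiny)
```

Whatever one makes of the objection above, this is the standard reply to it: physical predictions involve entropy differences, and those are independent of the cell-division convention.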
You cannot expect to be taken seriously if you pretend that spontaneous endothermic reactions are irrelevant to a discussion of entropy.
Answer the question.
I have said "what you said is remote from the subject!" If you want to talk about THIS PAPER, maybe I am interested, but what you said is beside the mark.
So much for this; just do as you please.
No it isn't.
But perhaps you should start (as the rules require) by answering the question put to you earlier.
How do you explain spontaneous endothermic reactions?
I have said "what you said is remote from the subject!"
You (or others) can raise interminable irrelevant questions like this one; surely that doesn't mean that I (or others) should answer them.
Certainly, I can answer this question, but it relates to another paper, and I don't want to release that paper now.
I directly addressed the title of your thread.
I asked three questions about your statements to establish a basis for discussion
You did not answer one of them.
Yet you complain when your own questions are not answered.
A discussion is two way and based on what can be agreed, not what is in doubt.
So if you wish a real discussion let us start again.
Just do as you wish. Debate can't solve the problem; for me, so much for this. You can discuss it with others, if you are interested in this paper. (Have you really read this paper carefully?)
Some time ago in the lab I spilled some trimethylpentane on the bench.
It evaporated quite quickly, much more so than water would have done. This is because, at room temperature, water has a much lower vapour pressure.
The two liquids have pretty much the same boiling points.
Why do you think the gradient of the vapour pressure vs temperature curve is so much steeper for isooctane than for water?
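The question can be made quantitative with the Clausius-Clapeyron relation; in the sketch below the boiling points and enthalpies of vaporisation are approximate handbook values (assumed, not stated in the thread):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vapour_pressure(T, Tb, dHvap, Pb=101325.0):
    """Clausius-Clapeyron estimate of vapour pressure (Pa) at T (K),
    integrated from the normal boiling point Tb, treating dHvap as constant."""
    return Pb * math.exp(-dHvap / R * (1.0 / T - 1.0 / Tb))

def slope(T, Tb, dHvap):
    """Gradient of the vapour pressure curve: dP/dT = dHvap * P / (R * T**2)."""
    return dHvap * vapour_pressure(T, Tb, dHvap) / (R * T * T)

# Approximate handbook values (assumed):
water     = dict(Tb=373.15, dHvap=40700.0)  # J/mol
isooctane = dict(Tb=372.35, dHvap=35100.0)  # 2,2,4-trimethylpentane

T = 298.15  # room temperature
# Nearly identical boiling points, but at room temperature the alkane's
# smaller dHvap leaves it with a much higher vapour pressure, and since
# dP/dT is proportional to P, a much steeper local gradient as well:
print(vapour_pressure(T, **water), vapour_pressure(T, **isooctane))
print(slope(T, **water), slope(T, **isooctane))
```

The point of the exercise: the local gradient dP/dT at room temperature is larger for isooctane even though its enthalpy of vaporisation is smaller, because the gradient scales with the vapour pressure itself.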
What you said is remote from the subject!
Perhaps you didn't fully appreciate what I said about indicator diagrams?
In dimensional analysis (Buckingham's theorem), when you take energy and divide it by temperature, you obtain a definite quantity.
The scientific community has chosen to award this quantity the name entropy.
Do you deny this?
You didn't discuss this paper; continuing such a debate would be meaningless and a waste of time. Maybe "entropy" is important to you; just do as you wish! So much for this.
To be honest, after 20 years of never having had occasion to use it, I can't remember the derivation of the expression.
But I can show you where I read it.
http://www.amazon.co.uk/EntropyEnergyLevelsOxfordChemistry/dp/0198554893
If you get a copy (and I'm sure any book on statistical mechanics would do the job just as well) then you can see the answers to your questions.
"1. What is k ln(W)?
2. Why is the factor k?"
It's not a thick book, and most of the explanations are fairly clear. Perhaps you would like to show exactly where they are wrong.
It seems that you didn't answer these two questions here. You are honest, but I really don't think you are suited to discussing this thesis.
Indeed so.
The original macroscopic derivation of entropy does not rely on system granularity.
It took the world one hundred and fifty years to establish the mechanical equivalence of heat and move towards a proper theory of energy. Caloric was dispensed with at that time.
It took a further fifty years to establish the connection between the statistical mechanics of granular systems and the mechanics of continuous systems we call thermodynamics.
Which connection are you arguing with?
"Free fall theory" reigned over Europe for more than 2000 years, although it is wrong.
What I said, "I think only if you go Back To the Original may you understand", means this: if you cast aside preconceived ideas, go back to the time before Clausius "deduced" the concept of "entropy", and then read this paper, could you still think "Clausius entropy" is right?
Lots of words.
None of them stops me calculating K ln(W).
In a way, it even helps.
Here's a quote from the wiki page about Liouville's theorem
"Physical interpretation
The expected total number of particles is the integral over phase space of the distribution: [equation omitted]
A normalizing factor is conventionally included in the phase space measure but has here been omitted. In the simple case of a non-relativistic particle moving in Euclidean space under a force field, with coordinates and momenta, Liouville's theorem can be written: [equation omitted]
This is similar to the Vlasov equation, or the collisionless Boltzmann equation, in astrophysics. The latter, which has a 6-D phase space, is used to describe the evolution of a large number of collisionless particles moving under the influence of gravity and/or electromagnetic field.
In classical statistical mechanics, the number of particles is very large (typically of order Avogadro's number, for a laboratory-scale system). Setting the time derivative to zero gives an equation for the stationary states of the system and can be used to find the density of microstates accessible in a given statistical ensemble."
The number of microstates is the w that I need to calculate the natural log of and multiply by Boltzmann's constant to get the entropy.
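The calculation referred to here is a one-liner; as a sketch (a toy two-state system with assumed particle count, not from the thread), k ln(W) for a mole-sized system is a perfectly ordinary finite number with units of J/K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(W):
    """Boltzmann entropy S = k_B * ln(W) for W accessible microstates."""
    return k_B * math.log(W)

# Toy system: N independent two-state particles, so W = 2**N when both
# states are accessible; using ln(2**N) = N*ln(2) keeps the numbers small.
N = 6.022e23  # roughly one mole of particles
S = k_B * N * math.log(2.0)

print(S)  # about 5.76 J/K: an ordinary, finite physical quantity
```

Note also that S is extensive in this model (doubling N doubles S), which is exactly the behaviour expected of a thermodynamic state function.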
Liouville's theorem says that the net flow of phase cells into and out of a region of phase space is always zero; that is to say, the H-theorem [and likewise k ln(W)] is a technique for displaying irreversibility from a purely probabilistic point of view, NOT a physical principle. Moreover, this paper has strictly demonstrated that the so-called theory of entropy is wrong. Then:
1. What is k ln(W)?
2. Why is the factor k?
I would like to hear how he explains the occurrence of spontaneous processes that are endothermic (dH > 0).
The Liouville theorem and my conclusions indicate that Boltzmann entropy [and so the H-theorem (dH > 0)] can be taken as a technique for displaying irreversibility from a purely probabilistic point of view.
In what circumstances do you suppose that K ln (W) does not exist?
As far as I can tell there are no relevant circumstances, so your suggestion is based on a false premise.
The present study has demonstrated the nonexistence of Clausius entropy, which simultaneously denies Boltzmann entropy.
In statistical physics, the attempt to deduce entropy directly is untenable. On one hand, it involves a key step that turns an infinitesimal into a differential, which does not hold. On the other hand, the unit (J/K) of entropy (Boltzmann entropy) in statistical physics is carried over from Clausius entropy. So, if Clausius entropy does not exist, there is no source for that unit (J/K). As a result, the entropy in statistical physics is only a pure number, with no physical meaning.
In addition, even if we set aside the issue of the unit: from a pure probability point of view, in the equation S = k ln Ω, Ω is the so-called thermodynamic probability, and the calculation of Ω involves dividing the μ-space into phase cells. A phase cell is 2i-dimensional, where i is the total number of degrees of freedom of the molecules in the system. The essence of the calculation of Ω is the discretization of the continuous μ-space, which is supposed to give it objective meaning. In fact, this approach does not work, and no objective conclusion will come of it regardless of how much work has been done, because there is no objective, physically meaningful criterion for dividing the phase cells; that is, Ω has no objective meaning in physics. Together, the Liouville theorem and our conclusions indicate that Boltzmann entropy can only be taken as a technique for displaying irreversibility from a purely probabilistic point of view.
Well, you don't seem to have attracted much action so far here with this paper.
I can assure you that entropy is just as much a physical quantity as say surface tension.
Have you ever heard of indicator diagrams?
There are many pairs of quantities that when multiplied together have the dimensions or units of energy.
Entropy and temperature are one such pair, and entropy was introduced to pair with the already established quantity, temperature, so that when plotted on a T-S indicator diagram a useful statement about energy could be made.
Surface tension and area are another such pair.
In my view, this way provides a more natural introduction to entropy, without the magical connotations so often ascribed.
I think only if you go Back To the Original may you understand! Why? Because you have been filled with the pre-existing content, and you are firmly convinced by it. Sometimes a theory is not only science but also belief. In fact, even a wrong theory can have many evidences and applications, otherwise it could not survive a long time, e.g., caloric theory, phlogiston theory, and so on. I hope you can read this paper thoroughly and cast aside preconceived ideas; then I think you can distinguish right from wrong.
Thanks for your participation; you are welcome.
!Moderator Note
shufengzhang,
The rules of this forum require that you respond to the questions asked of you. Please have a look at the two posts above this by JohnCuthber and mississippichem and respond to them accordingly.
OK, let me try.
thank you!
Let me see if I understand what you are saying.
Considering that the sum of dQ/T is 0 at a given temperature, there is no heat loss nor any work yielded. In this case
∑(ΔQ/T) is also zero,
as is ∮dQ/T=0, ∮dW/T=0 and ∮dE/T=0.
This means that ∑(Q) = 0 for any isothermal cyclic process. No heat is produced and no work is consumed, so this is a reversible process, such as the charging of a capacitor or inductor. Why do you claim that this is "untenable"?
Ref: "Planck's Columbia Lectures", 2005 (ISBN 0965917630)
From what you say above, I think you have misunderstood this paper. Would you like to read the paper (the newest version) again carefully?
Hello everyone, please read the paper carefully and understand what it says before drawing conclusions.
Thanks a lot!
When the last copy of that paper crumbles into dust, S will still be k ln(W), just as it always has been.
We know k ln(W) is a method to explain S; so, if S does not exist, how could k ln(W) exist?
The newest version of this paper:
Entropy: A Concept that is not a Physical Quantity
shufengzhang, China
Email: email removed
We define the heat engine efficiency η as η = W/W1; that is, we replace Q1 in the original definition η = W/Q1 with W1, where W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the work the heat engine does on the surroundings in the cycle. Then, using the Stirling cycle as the elementary reversible cycle, if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0.
If the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 could really define new state variables of the system, such a definition would lead to absurd results.
In fact, in the process of obtaining "entropy", the step in which ∑(ΔQ/T) becomes ∫dQ/T is untenable; therefore, the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are untenable.
The "entropy" defined by Boltzmann is used to interpret the "entropy" of Clausius, so it is denied at the same time.
A new version of this paper:
!
Moderator Note
Topics merged.
shufengzhang, you have posted this more than once, with no follow-up discussion, which is a violation of the rules.
finiter, if you want to discuss your own interpretation of the concept, please do it in a separate thread.
Do not derail this thread further by responding to this warning
Oh, thank you for your alert! In fact, this is a new version of that paper, but I didn't give a clear indication of this point!
Thank you!
When we define the heat engine efficiency as η = W/W1, that is, replacing Q1 in the original definition η = W/Q1 with W1 (W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the work the heat engine does on the surroundings in the cycle), and let the elementary reversible cycle be the Stirling cycle, then if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0!
If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 really defined new state variables of the system, the three state variables would inevitably be different from each other; on the other hand, their dimensions are the same, namely J/K, and they are all state variables. So we would have to "make" three different state variables with the same dimensions, without knowing what they are; no doubt, this is absurd.
In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we know that the prerequisite for a differential is a differentiable function such as y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.
On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.
So, ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are not tenable at all!
See the paper Entropy: A Concept that is not a Physical Quantity.
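Whether ∑(ΔQ/T) tends to a well-defined limit can in fact be checked numerically. The sketch below (a reversible Stirling cycle of one mole of monatomic ideal gas, with assumed temperatures and volumes, not taken from the paper) evaluates the finite sums at increasing resolution:

```python
import math

R  = 8.314    # gas constant, J/(mol*K)
Cv = 1.5 * R  # molar heat capacity of a monatomic ideal gas at constant V

def sum_dQ_over_T(Th, Tc, V1, V2, n):
    """Finite sum of dQ/T around a reversible Stirling cycle of 1 mol of
    ideal gas, each leg cut into n steps. On the isochores, dQ = Cv*dT is
    divided by the temperature at the start of each step, so the sum only
    approximates the integral; the error shrinks as n grows."""
    s = 0.0
    # Isothermal expansion at Th and compression at Tc: dQ = R*T*ln(Vb/Va)
    # per step, so dQ/T = R*ln(Vb/Va) exactly (T is constant on each leg).
    for i in range(n):
        Va = V1 + (V2 - V1) * i / n
        Vb = V1 + (V2 - V1) * (i + 1) / n
        s += R * math.log(Vb / Va)   # expansion at Th
        s += R * math.log(Va / Vb)   # compression at Tc
    # Isochoric cooling Th -> Tc (at V2) and heating Tc -> Th (at V1):
    T = [Th + (Tc - Th) * i / n for i in range(n + 1)]
    for i in range(n):
        s += Cv * (T[i + 1] - T[i]) / T[i]      # cooling step
        s += Cv * (T[i] - T[i + 1]) / T[i + 1]  # heating step
    return s

# The finite sums are not zero, but they shrink steadily as the cycle is
# divided more finely: the limit defining the cyclic integral exists and is 0.
for n in (10, 100, 1000):
    print(n, sum_dQ_over_T(500.0, 300.0, 1.0, 2.0, n))
```

The residual comes entirely from the finite isochoric steps and scales like 1/n, which is exactly the behaviour of a converging Riemann sum rather than of an ill-defined limit.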
Thank you for your eloquent and detailed attempt to explain what mississippichem doesn't understand. We always appreciate it when members take the time to explain their views instead of just dismissing those that they disagree with.
Thank you for your comments!
The theory of entropy is wrong.
Entropy: A Concept that is not a Physical Quantity
shufengzhang, China
Email: email removed
We define the heat engine efficiency η as η = W/W1; that is, we replace Q1 in the original definition η = W/Q1 with W1, where W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the work the heat engine does on the surroundings in the cycle. Then, using the Stirling cycle as the elementary reversible cycle, if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0.
If the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 could really define new state variables of the system, such a definition would lead to absurd results.
In fact, in the process of obtaining "entropy", the step in which ∑(ΔQ/T) becomes ∫dQ/T is untenable; therefore, the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are untenable.
The "entropy" defined by Boltzmann is used to interpret the "entropy" of Clausius, so it is denied at the same time.
The equation dG = dH - T dS can be used as an empirical test for the concept of entropy. Remember that entropy is not a directly measurable quantity; it is a state function. One cannot measure the entropy of a system, but one can detect the change in entropy of a system. I'll grant that, yes, this is subject to how you treat the system mathematically. But when measuring the Gibbs energy of a chemical reaction (dG above), as temperature approaches extreme highs and lows, the calculated Gibbs energy deviates significantly from the observed calorimetry. This is because the growing temperature term T modifies the entropy term (dS) more and more. I am assuming the existence of entropy here, sorry about the somewhat circular logic. I understand what you are saying, but I respectfully disagree.
Oh, you really don't understand.
When we define the heat engine efficiency as η = W/W1, that is, replacing Q1 in the original definition η = W/Q1 with W1 (W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the work the heat engine does on the surroundings in the cycle), and let the elementary reversible cycle be the Stirling cycle, then if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0!
If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 really defined new state variables of the system, the three state variables would inevitably be different from each other; on the other hand, their dimensions are the same, namely J/K, and they are all state variables. So we would have to "make" three different state variables with the same dimensions, without knowing what they are; no doubt, this is absurd.
In fact, replacing ΔQ with dQ is taken for granted. If we review the definition of the differential, we know that the prerequisite for a differential is a differentiable function such as y = f(x); however, there is no function Q = f(T) here at all, so ΔQ cannot become dQ.
On the other hand, as ΔQ tends towards 0, lim(ΔQ/T) = 0, not lim(ΔQ/T) = dQ/T.
So, ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are not tenable at all!
See the paper Entropy: A Concept that is not a Physical Quantity.
http://blog.51xuewen.com/zhangsf/article_27631.htm
http://www.qiji.cn/eprint/abs/3806.html (PDF)
shufengzhang, China. Email: uhsgnahz@126.com
What can you do when you want to get a physics paper?
in Classical Physics
Posted
What can you do when you want to get a physics paper, if you don't subscribe to the journal?
(Several days ago, a professor in Europe asked for my paper "title and publication deleted" via email; I sent the paper to him.
url and email deleted)
No advertising of speculative subjects outside of Speculations.