# Entropy: A concept that is not a physical quantity

## Recommended Posts

Entropy: A concept that is not a physical quantity

shufeng-zhang china

Email: email removed

We define heat engine efficiency η as η = W/W1; that is, we replace Q1 in the original definition η = W/Q1 with W1. Here W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the total work it does on the surroundings during the cycle. Then, using the Stirling cycle as the elementary reversible cycle: if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0.
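
For a concrete check of the premise ∮dQ/T = 0, the Clausius integral can be evaluated in closed form for a reversible ideal-gas Stirling cycle (two isotherms plus two isochores). A minimal sketch, assuming one mole of a monatomic ideal gas; the temperatures and volumes are arbitrary illustrative values:

```python
import math

R = 8.314         # gas constant, J/(mol K)
Cv = 1.5 * R      # molar heat capacity at constant volume, monatomic ideal gas

def clausius_integral_stirling(T1, T2, V1, V2):
    """Sum of the reversible ∫dQ/T contributions over a Stirling cycle:
    isothermal expansion at T1 (V1→V2), isochoric cooling (T1→T2),
    isothermal compression at T2 (V2→V1), isochoric heating (T2→T1)."""
    leg1 = R * math.log(V2 / V1)   # Q = RT1 ln(V2/V1) delivered at constant T1
    leg2 = Cv * math.log(T2 / T1)  # ∫ Cv dT / T along the isochore
    leg3 = R * math.log(V1 / V2)   # isothermal compression at T2
    leg4 = Cv * math.log(T1 / T2)  # isochoric heating back to T1
    return leg1 + leg2 + leg3 + leg4

# The four legs cancel pairwise, so the result is zero up to rounding:
print(clausius_integral_stirling(500.0, 300.0, 1.0, 3.0))
```

The isothermal legs cancel each other and so do the isochoric legs, which is the standard textbook route to the Clausius equality for this cycle.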

If the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 could really define new system state variables, such a definition would lead to absurd results.

In fact, in the derivation of "entropy", the step from ∑(ΔQ/T) to ∫dQ/T is untenable; therefore the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are untenable.

Since the "entropy" defined by Boltzmann is used to interpret Clausius's "entropy", it is denied at the same time.

http://content.yudu....ysical-Quantity

Entropy: A concept that is not Physical Quantity.pdf

Edited by hypervalent_iodine
Personal information removed.
##### Share on other sites

• 3 weeks later...

> Entropy: A concept that is not a physical quantity
>
> shufeng-zhang china
>
> We define heat engine efficiency η as η = W/W1; that is, we replace Q1 in the original definition η = W/Q1 with W1. Here W is still the net work the heat engine does on the surroundings in one cycle, and W1 is the total work it does on the surroundings during the cycle. Then, using the Stirling cycle as the elementary reversible cycle: if ∮dQ/T = 0 is tenable, we can prove ∮dW/T = 0 and ∮dE/T = 0.
>
> If the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 could really define new system state variables, such a definition would lead to absurd results.
>
> In fact, in the derivation of "entropy", the step from ∑(ΔQ/T) to ∫dQ/T is untenable; therefore the formulas ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are untenable.
>
> Since the "entropy" defined by Boltzmann is used to interpret Clausius's "entropy", it is denied at the same time.
>
> http://content.yudu....ysical-Quantity

Let me see if I understand what you are saying.

If the sum of dQ/T is zero at a given temperature, then there is no heat lost and no work yielded. In this case

∑(ΔQ/T) is also zero,

as is ∮dQ/T=0, ∮dW/T=0 and ∮dE/T=0.

This means that ∑(Q) = 0 for any isothermal cyclic process. No heat is produced and no work is consumed, so this is a reversible process, such as the charging of a capacitor or inductor. Why do you claim that this is "untenable"?

Ref: "Planck's Columbia Lectures", 2005 (ISBN 0-9659176-3-0)

##### Share on other sites

• 10 months later...


A new version of this paper:

Entropy A concept that is not a physical quantity.pdf

##### Share on other sites

• 7 months later...

Well you don't seem to have attracted much action so far here with this paper.

I can assure you that entropy is just as much a physical quantity as, say, surface tension.

Have you ever heard of indicator diagrams?

There are many pairs of quantities that when multiplied together have the dimensions or units of energy.

Entropy and temperature are one such pair and entropy was introduced to pair with the already established quantity, temperature, so that when plotted on a T - S indicator diagram a useful statement about energy could be made.

Surface tension and area are another such pair.

In my view, this way provides a more natural introduction to entropy, without the magical connotations so often ascribed.
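
The indicator-diagram point can be made concrete: on a T-S diagram a reversible Carnot cycle is a rectangle, and its enclosed area is the net work per cycle. A minimal sketch with illustrative numbers (not taken from any post above):

```python
# On a T-S indicator diagram a reversible Carnot cycle is a rectangle whose
# enclosed area is the net work per cycle: W = (T_hot - T_cold) * dS.
# Illustrative numbers only.
T_hot, T_cold = 600.0, 300.0  # reservoir temperatures, K
dS = 2.0                      # entropy change along each isotherm, J/K

Q_in = T_hot * dS             # heat absorbed: area under the top edge
Q_out = T_cold * dS           # heat rejected: area under the bottom edge
W = Q_in - Q_out              # enclosed area = net work, J
efficiency = W / Q_in

print(W, efficiency)          # prints: 600.0 0.5 (the Carnot limit 1 - Tc/Th)
```

This is exactly the "T times S has units of energy" pairing: the diagram turns an entropy axis into a statement about work.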

Edited by studiot
##### Share on other sites

When the last copy of that paper crumbles into dust, S will still be k ln(W), just as it always has been.

##### Share on other sites

> When the last copy of that paper crumbles into dust, S will still be k ln(W), just as it always has been.

We know k ln(W) is a way of explaining S; so if S does not exist, how could k ln(W) exist?

##### Share on other sites

> We know k ln(W) is a way of explaining S; so if S does not exist, how could k ln(W) exist?

In what circumstances do you suppose that k ln(W) does not exist?

As far as I can tell there are no relevant circumstances, so your suggestion is based on a false premise.

##### Share on other sites

I would like to hear how he explains the occurrence of spontaneous processes that are endothermic (ΔH > 0).

##### Share on other sites

Hello everyone, please read the paper carefully and understand what it says before drawing conclusions.

Thanks a lot!

##### Share on other sites


Moderator Note

shufeng-zhang,

The rules of this forum require that you respond to the questions asked of you. Please have a look at the two posts above this by JohnCuthber and mississippichem and respond to them accordingly.

##### Share on other sites

> I would like to hear how he explains the occurrence of spontaneous processes that are endothermic (ΔH > 0).

The Liouville theorem and my conclusions indicate that Boltzmann entropy [and so the H-theorem (dH > 0)] can be taken as a technique for displaying irreversibility from a purely probabilistic point of view.

> In what circumstances do you suppose that k ln(W) does not exist?
>
> As far as I can tell there are no relevant circumstances, so your suggestion is based on a false premise.

The present study has demonstrated the non-existence of Clausius entropy, which simultaneously denies Boltzmann entropy.

In statistical physics, the attempt to deduce entropy directly is untenable. On the one hand, it involves a key step of turning an infinitesimal into a differential, which does not hold. On the other hand, the unit (J/K) of entropy in statistical physics (Boltzmann entropy) is carried over from Clausius entropy. So if Clausius entropy does not exist, there is no source for that unit. As a result, the entropy of statistical physics is purely a number, with no physical meaning.

In addition, even if we set aside the issue of the unit, consider the pure probability point of view: in the equation S = k ln Ω, Ω is the so-called thermodynamic probability, and the calculation of Ω involves dividing the μ phase space into cells. Each phase cell is 2i-dimensional, where i is the total number of degrees of freedom of the molecules in the system. The essence of calculating Ω is discretizing the continuous μ space in a way that is supposed to yield an objective result. In fact, this approach does not work, and there will be no objective conclusion no matter how much work people have done on it, because there is no objective, physically meaningful criterion for dividing the phase cells; that is, Ω has no objective meaning in physics. Together, the Liouville theorem and our conclusions indicate that Boltzmann entropy can be taken as a technique for displaying irreversibility from a purely probabilistic point of view.
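
For reference, the role of the phase-cell division discussed above can be seen in a one-particle toy model: in classical statistics the choice of cell size shifts ln Ω only by an additive constant, which cancels whenever entropy differences are taken. A sketch with arbitrary illustrative numbers (the toy model is mine, not from the paper):

```python
import math

def log_omega(L, p_max, cell):
    """ln Ω for one free particle in a 1-D box of length L with |p| <= p_max,
    counting microstates as (phase-space area) / (cell area)."""
    return math.log(2.0 * L * p_max / cell)

# The cell size only shifts ln Ω by an additive constant, so it cancels in
# differences: doubling the box length changes ln Ω by ln 2 for ANY cell.
for cell in (1e-34, 1e-30):
    diff = log_omega(2.0, 1.0, cell) - log_omega(1.0, 1.0, cell)
    print(cell, diff)   # diff ≈ ln 2 ≈ 0.6931 in both cases
```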

> Well you don't seem to have attracted much action so far here with this paper.
>
> I can assure you that entropy is just as much a physical quantity as, say, surface tension.
>
> Have you ever heard of indicator diagrams?
>
> There are many pairs of quantities that when multiplied together have the dimensions or units of energy.
>
> Entropy and temperature are one such pair; entropy was introduced to pair with the already established quantity, temperature, so that when plotted on a T-S indicator diagram a useful statement about energy could be made.
>
> Surface tension and area are another such pair.
>
> In my view, this way provides a more natural introduction to entropy, without the magical connotations so often ascribed.

I think only if you go back to the original will you understand! Why? Because you have been filled with the pre-existing content, and you are firmly convinced by it. Sometimes a theory is not only science but also belief. In fact, even a wrong theory can have many evidences and applications, otherwise it could not survive for a long time, e.g. the caloric theory, the phlogiston theory, and so on. I hope you will read this paper thoroughly and cast aside preconceived ideas; then I think you can distinguish right from wrong.

Thanks for your participation; you are welcome.

> Moderator Note
>
> shufeng-zhang,
>
> The rules of this forum require that you respond to the questions asked of you. Please have a look at the two posts above this by JohnCuthber and mississippichem and respond to them accordingly.

OK, let me try.

Thank you!

> Let me see if I understand what you are saying.
>
> If the sum of dQ/T is zero at a given temperature, then there is no heat lost and no work yielded. In this case
>
> ∑(ΔQ/T) is also zero,
>
> as is ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0.
>
> This means that ∑(Q) = 0 for any isothermal cyclic process. No heat is produced and no work is consumed, so this is a reversible process, such as the charging of a capacitor or inductor. Why do you claim that this is "untenable"?
>
> Ref: "Planck's Columbia Lectures", 2005 (ISBN 0-9659176-3-0)

From what you say above, I think you have misunderstood this paper; would you like to read it (the newest version) again carefully?

##### Share on other sites

Lots of words.

None of them stops me calculating k ln(W).

In a way, it even helps.

Here's a quote from the wiki page about Liouville's theorem

"Physical interpretation

The expected total number of particles is the integral over phase space of the distribution: N = ∫ ρ d^n q d^n p

A normalizing factor is conventionally included in the phase space measure but has here been omitted. In the simple case of a nonrelativistic particle moving in Euclidean space under a force field F with coordinates x and momenta p, Liouville's theorem can be written ∂ρ/∂t + (p/m)·∇_x ρ + F·∇_p ρ = 0

This is similar to the Vlasov equation, or the collisionless Boltzmann equation, in astrophysics. The latter, which has a 6-D phase space, is used to describe the evolution of a large number of collisionless particles moving under the influence of gravity and/or electromagnetic field.

In classical statistical mechanics, the number of particles is very large (typically of order Avogadro's number, for a laboratory-scale system). Setting ∂ρ/∂t = 0 gives an equation for the stationary states of the system and can be used to find the density of microstates accessible in a given statistical ensemble."

The number of microstates is the W that I need to take the natural log of and multiply by Boltzmann's constant to get the entropy.
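
As an illustration of that recipe, S = k ln(W) can be computed directly for a toy system of N two-state spins, where W is the binomial count of microstates compatible with a macrostate. A minimal sketch (the spin model is my own example, not from the book cited below):

```python
import math
from math import comb

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the current SI)

def boltzmann_entropy(N, n_up):
    """S = k ln W for N independent two-state spins, where the macrostate is
    'n_up spins up' and W = C(N, n_up) counts its microstates."""
    W = comb(N, n_up)
    return k_B * math.log(W)

print(boltzmann_entropy(100, 50) > boltzmann_entropy(100, 10))  # → True: mixed macrostates have more microstates
print(boltzmann_entropy(100, 0))                                # → 0.0: a unique microstate, W = 1
```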

Edited by John Cuthber
##### Share on other sites

> Lots of words.
>
> None of them stops me calculating k ln(W).
>
> In a way, it even helps.
>
> Here's a quote from the wiki page about Liouville's theorem:
>
> "Physical interpretation
>
> The expected total number of particles is the integral over phase space of the distribution: N = ∫ ρ d^n q d^n p
>
> A normalizing factor is conventionally included in the phase space measure but has here been omitted. In the simple case of a nonrelativistic particle moving in Euclidean space under a force field F with coordinates x and momenta p, Liouville's theorem can be written ∂ρ/∂t + (p/m)·∇_x ρ + F·∇_p ρ = 0
>
> This is similar to the Vlasov equation, or the collisionless Boltzmann equation, in astrophysics. The latter, which has a 6-D phase space, is used to describe the evolution of a large number of collisionless particles moving under the influence of gravity and/or electromagnetic field.
>
> In classical statistical mechanics, the number of particles is very large (typically of order Avogadro's number, for a laboratory-scale system). Setting ∂ρ/∂t = 0 gives an equation for the stationary states of the system and can be used to find the density of microstates accessible in a given statistical ensemble."
>
> The number of microstates is the W that I need to take the natural log of and multiply by Boltzmann's constant to get the entropy.

Liouville's theorem says that the net flow of phase cells into and out of a region of phase space is always zero; that is to say, the H-theorem [and so k ln(W)] is a technique for displaying irreversibility from a purely probabilistic point of view, NOT a principle of physics. Moreover, this paper has strictly demonstrated that the so-called theory of entropy is wrong. Then:

1. What is k ln(W)?

2. Why is the factor k?

##### Share on other sites

To be honest, after 20 years of never having had occasion to use it, I can't remember the derivation of the expression.

But I can show you where I read it.

http://www.amazon.co.uk/Entropy-Energy-Levels-Oxford-Chemistry/dp/0198554893

If you get a copy (and I'm sure any book on statistical mechanics would do the job just as well) then you can see the answers to your questions.

"1. What is k ln(W)?

2. Why is the factor k?"

It's not a thick book, and most of the explanations are fairly clear. Perhaps you would like to show exactly where they are wrong.

##### Share on other sites

> I think only if you go back to the original will you understand!

Indeed so.

The original macroscopic derivation of entropy does not rely on system granularity.

It took the world one hundred and fifty years to establish the mechanical equivalence of heat and move towards a proper theory of energy. Caloric was dispensed with at that time.

It took a further fifty years to establish the connection between the statistical mechanics of granular systems and the mechanics of continuous systems we call thermodynamics.

Which connection are you arguing with?

##### Share on other sites

> To be honest, after 20 years of never having had occasion to use it, I can't remember the derivation of the expression.
>
> But I can show you where I read it.
>
> http://www.amazon.co.uk/Entropy-Energy-Levels-Oxford-Chemistry/dp/0198554893
>
> If you get a copy (and I'm sure any book on statistical mechanics would do the job just as well) then you can see the answers to your questions.
>
> "1. What is k ln(W)?
>
> 2. Why is the factor k?"
>
> It's not a thick book, and most of the explanations are fairly clear. Perhaps you would like to show exactly where they are wrong.

It seems that you didn't answer these two questions here. You are honest, but I really don't think you are suited to discussing this thesis.

> Indeed so.
>
> The original macroscopic derivation of entropy does not rely on system granularity.
>
> It took the world one hundred and fifty years to establish the mechanical equivalence of heat and move towards a proper theory of energy. Caloric was dispensed with at that time.
>
> It took a further fifty years to establish the connection between the statistical mechanics of granular systems and the mechanics of continuous systems we call thermodynamics.
>
> Which connection are you arguing with?

"Free fall theory" reigned over Europe for more than 2000 years, although it is wrong.

What I said, "only if you go back to the original will you understand", means this: if you cast aside preconceived ideas and go back to the time before Clausius "deduced" the concept of "entropy", and then read this paper, could you still think "Clausius entropy" is right?

##### Share on other sites

Perhaps you didn't fully appreciate what I said about indicator diagrams?

In dimensional analysis (Buckingham's π theorem), if you take energy and divide it by temperature you obtain a definite quantity.

The scientific community has chosen to award this quantity the name entropy.

Do you deny this?

##### Share on other sites

> It seems that you didn't answer these two questions here. You are honest, but I really don't think you are suited to discussing this thesis.

LOL

##### Share on other sites

> Perhaps you didn't fully appreciate what I said about indicator diagrams?
>
> In dimensional analysis (Buckingham's π theorem), if you take energy and divide it by temperature you obtain a definite quantity.
>
> The scientific community has chosen to award this quantity the name entropy.
>
> Do you deny this?

You didn't discuss this paper. Continuing such a debate would be meaningless, just a waste of time. Maybe "entropy" is important to you; do as you wish. So much for this.

##### Share on other sites

Some time ago in the lab I spilled some trimethyl-pentane on the bench.

It evaporated quite quickly, much more so than water would have done. This is because, at room temperature, water has a much lower vapour pressure.

The two liquids have pretty much the same boiling points.

Why do you think the gradient of the vapour pressure vs temperature curve is so much steeper for isooctane than for water?
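
The question can be made quantitative with the Clausius-Clapeyron relation, dp/dT = p·ΔHvap/(RT²): at room temperature isooctane sits much higher on its vapour-pressure curve than water does, so its gradient is steeper there even though its ΔHvap is lower. A rough sketch; the heats of vaporisation and boiling points are approximate handbook values:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def vapour_pressure(T, dH_vap, T_boil, p_boil=101325.0):
    """Integrated Clausius-Clapeyron: p(T) anchored at the normal boiling
    point, assuming dH_vap (J/mol) is independent of temperature."""
    return p_boil * math.exp(-dH_vap / R * (1.0 / T - 1.0 / T_boil))

def slope(T, dH_vap, T_boil):
    """dp/dT = p * dH_vap / (R T^2), in Pa/K."""
    return vapour_pressure(T, dH_vap, T_boil) * dH_vap / (R * T * T)

# Approximate data: water dH ≈ 40.7 kJ/mol, Tb ≈ 373 K;
# isooctane (2,2,4-trimethylpentane) dH ≈ 35.1 kJ/mol, Tb ≈ 372 K.
T_room = 298.0
print(slope(T_room, 40700.0, 373.0))  # water: the shallower gradient
print(slope(T_room, 35100.0, 372.4))  # isooctane: steeper, despite smaller dH
```

The point of the comparison: the slope is proportional to p itself, so the liquid with the higher room-temperature vapour pressure has the steeper curve there.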

##### Share on other sites

> Entropy: A concept that is not a physical quantity
>
> Have you ever heard of indicator diagrams?
>
> Which connection are you arguing with?
>
> Do you deny this?

You did not answer even one of them.

A discussion is two way and based on what can be agreed, not what is in doubt.

So if you wish a real discussion let us start again.

Edited by studiot
##### Share on other sites

> You did not answer even one of them.
>
> A discussion is two way and based on what can be agreed, not what is in doubt.
>
> So if you wish a real discussion let us start again.

Just do as you wish. Debate can't solve the problem; for me, so much for this. You can discuss it with others if you are interested in this paper. (Have you really read it carefully?)

> Some time ago in the lab I spilled some trimethyl-pentane on the bench.
>
> It evaporated quite quickly, much more so than water would have done. This is because, at room temperature, water has a much lower vapour pressure.
>
> The two liquids have pretty much the same boiling points.
>
> Why do you think the gradient of the vapour pressure vs temperature curve is so much steeper for isooctane than for water?

What you said is remote from the subject!

##### Share on other sites

> What you said is remote from the subject!

No it isn't.

But perhaps you should start (as the rules require) by answering the question put to you earlier.

How do you explain spontaneous endothermic reactions?

Edited by John Cuthber
##### Share on other sites

> No it isn't.
>
> But perhaps you should start (as the rules require) by answering the question put to you earlier.
>
> How do you explain spontaneous endothermic reactions?

I have said, "What you said is remote from the subject!"

You (or others) can raise interminable irrelevant questions like this one; surely it doesn't mean that I (or others) should answer them.

Certainly, I can answer this question, but it relates to another paper, and I don't want to release that paper now.

##### Share on other sites

This topic is now closed to further replies.