
Can thermodynamical model be fundamental: reason not result?

Duda Jarek


I always thought that thermodynamics/statistical physics is an effective theory – a statistical result of some more fundamental physics underneath – but recently theories taking 'entropic force' as fundamental have become popular (based on holographic scenarios, as in http://arxiv.org/abs/1001.0785 ).

I was taught that to introduce effective local thermodynamical parameters for a given concrete situation, we average inside some ball around each point to obtain, for example, a local entropy or temperature.

To a simple mathematician like me this sounds like nonsense – in a fundamental theory describing the evolution of everything there should be one concrete history of our universe – there is no place for the direct probabilities of scenarios required to define e.g. entropy.

So I wanted to ask whether someone could explain why we can even think about fundamental 'entropic' theories?


To start the discussion, I would like to briefly outline what seems to me a clear distinction between deterministic and stochastic/thermodynamical models:

DETERMINISTIC models – the future is completely determined

- the evolution of gas in a tank is the full dynamics of all its particles – for a given valve opening, a concrete number of particles escapes,

- it is usually the Lagrangian mechanics of some field – there is some scalar/vector/tensor/'behavior of a functional' (QFT) at each point of our spacetime, such that 'the action is optimized' – each point is in equilibrium with its four-dimensional neighborhood (spacetime is a kind of '4D jello'),

- the evolution equations (Euler–Lagrange) are HYPERBOLIC PDEs – the linearized behavior of coordinates in the eigenbasis of the differential operator is

d_tt x = - lambda x

(0 < lambda = omega^2)

so in the linear approximation we have a superposition of rotations of coordinates – 'unitary' evolution – which is why such PDEs are called wavelike: the basic excitations on a water surface, in EM, GR, or Klein–Gordon are just waves,

- the model has FULL INFORMATION – there is no place for direct probability/entropy in electromagnetism, general relativity, Klein–Gordon etc. – the model has TIME (CPT) SYMMETRY (no 2nd law of thermodynamics – there is still unitary evolution in a thermalized gas or a black hole)
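A quick numerical sketch of the hyperbolic case above (my own illustration, not from the thread): a single eigenmode obeying d_tt x = -lambda*x just rotates in (x, v) phase space, conserving its amplitude – nothing decays, nothing is forgotten:

```python
import math

# Minimal sketch (my own illustration): integrate one eigenmode of a hyperbolic
# PDE, d_tt x = -lambda * x, with a leapfrog scheme. The motion is a pure
# rotation in (x, v) phase space with omega = sqrt(lambda), so the 'energy'
# 0.5*v^2 + 0.5*lambda*x^2 is conserved - the 'unitary' evolution of the post.
lam = 4.0                      # eigenvalue of the spatial operator (omega = 2)
omega = math.sqrt(lam)
dt, t_end = 1e-3, 10.0
x, v = 1.0, 0.0                # start displaced, at rest

energy0 = 0.5 * v * v + 0.5 * lam * x * x
for _ in range(int(t_end / dt)):
    v += -lam * x * (dt / 2)   # half kick
    x += v * dt                # drift
    v += -lam * x * (dt / 2)   # half kick
energy = 0.5 * v * v + 0.5 * lam * x * x

print(abs(energy - energy0) < 1e-4)            # no decay: amplitude conserved
print(abs(x - math.cos(omega * t_end)) < 1e-3) # matches exact cos(omega*t)
```

Information is preserved here: given (x, v) at any time, the whole past and future of the mode can be reconstructed.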


THERMODYNAMICAL/STOCHASTIC models – there is some probability distribution over possible futures

- gas in a tank is usually treated as thermalized, which allows it to be described by a few statistical parameters like entropy (the sum of -p*lg(p)) or temperature (the average energy per degree of freedom) – for a given valve opening, the number of escaped particles is given only by a probability distribution,

- it is used when we don't have full information or want to simplify the picture – so we assume some mathematically universal STATISTICAL ENSEMBLE over POSSIBLE SCENARIOS (like particle arrangements) – maximizing entropy (uniform distribution) or minimizing free energy (Boltzmann distribution),

- thermodynamical/stochastic evolution is usually described by diffusion-like PARABOLIC PDEs – the linearized behavior of coordinates in the eigenbasis of the differential operator is

d_t x = - x / tau

(tau – 'mean lifetime')

so in the linear approximation we have exponential decay (forgetting) of coordinates – this evolution is called thermalization: in the limit only the modes with the longest lifetimes survive – we call it thermodynamical equilibrium, and it can usually be described using just a few parameters,

- these models don't have time symmetry – we cannot fully trace the (unitary?) behavior, so we have INFORMATION LOSS – entropy growth – the 2nd law of thermodynamics.
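The parabolic case can be sketched the same way (again my own illustration): each eigenmode decays as exp(-t/tau), so of two modes the one with the longer mean lifetime dominates at late times – the system 'forgets' toward equilibrium:

```python
import math

# Sketch (my own illustration): the linearized parabolic mode d_t x = -x/tau
# decays exponentially. Comparing two lifetimes shows why, in the limit, only
# the longest-lived modes survive - this is thermalization/forgetting.
def evolve(x0, tau, t, dt=1e-3):
    x = x0
    for _ in range(int(t / dt)):
        x += -x / tau * dt          # explicit Euler step of d_t x = -x/tau
    return x

fast = evolve(1.0, tau=0.5, t=5.0)  # short-lived mode: essentially gone
slow = evolve(1.0, tau=5.0, t=5.0)  # long-lived mode: still present

print(fast < slow)                         # the long-lifetime mode survives
print(abs(slow - math.exp(-1.0)) < 1e-3)   # matches exact solution exp(-t/tau)
```

Unlike the wavelike case, the initial condition cannot be reconstructed from the late-time state – that irreversibility is the information loss above.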


Where am I wrong in this distinction?

I agree that 'entropic force' is extremely powerful, but it is still a statistical result – for example, in a random walk, if instead of maximizing entropy locally (which leads to Brownian motion) we maximize it properly: globally, we thermodynamically obtain the probability density of the lowest quantum state – single defects create macroscopic entropic barriers/wells/interactions.
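That local-vs-global entropy maximization can be sketched on a tiny graph (my own toy illustration, not code from any linked paper; the 5-node path graph is an assumption for brevity):

```python
import math

# Sketch (my own illustration): Generic Random Walk (GRW) vs Maximal Entropy
# Random Walk (MERW) on a 5-node path graph.
# GRW maximizes entropy locally (uniform step to neighbors): stationary
# probability ~ node degree. MERW maximizes entropy globally over whole paths:
# stationary probability ~ psi_i^2, where psi is the dominant eigenvector of
# the adjacency matrix - a quantum-ground-state-like density.
n = 5
adj = [[1 if abs(i - j) == 1 else 0 for j in range(n)] for i in range(n)]

deg = [sum(row) for row in adj]
grw = [d / sum(deg) for d in deg]            # GRW stationary distribution

# Dominant eigenvector by power iteration; shift by 2*I because the path
# graph is bipartite (adjacency eigenvalues come in +/- pairs).
psi = [1.0] * n
for _ in range(500):
    psi = [sum(adj[i][j] * psi[j] for j in range(n)) + 2 * psi[i]
           for i in range(n)]
    norm = math.sqrt(sum(p * p for p in psi))
    psi = [p / norm for p in psi]
merw = [p * p for p in psi]                  # normalized since |psi| = 1

print(grw)               # [0.125, 0.25, 0.25, 0.25, 0.125] - flat in the bulk
print(merw[2] > grw[2])  # MERW concentrates toward the center: ground state
```

The GRW density is set by local degree only, while the MERW density reproduces the squared ground-state wavefunction – which is how single defects can act as macroscopic entropic barriers or wells.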


For me the problem with quantum mechanics is that it sits between these pictures – we usually have unitary evolution, but sometimes entropy grows while the wavefunction collapses – no mystical interpretation is needed to understand it: entropy maximization from the mathematically universal uncertainty principle is enough ( http://arxiv.org/abs/0910.2724 ).


What do you think about this distinction?

Can thermodynamical models be not only effective (a result), but also fundamental (a reason)?

Can quantum mechanics alone be fundamental?

Edited by Duda Jarek

Interesting topic and questions. Forgive me in advance for any misunderstandings, as this area is not something I know a great deal about, but it is intriguing, so I thought I would participate.


One aspect of the deterministic model, as you described it, removes the probability aspect of our current model of entropy, but I wonder if it doesn't just recast it. Even in a deterministic model, many processes should be irreversible. Is it possible that we simply misunderstand the fundamental nature of entropy? Is it possible that entropy is a directional vector that defines or determines events, rather than something statistical? I guess I am attempting to retain entropy in the deterministic concept.


As for your follow-on questions regarding QM: much about our quantum physical model remains incomplete, but I am not seeing how reformulating the statistical model of entropy improves the situation, and I don't understand how entropy can be fundamental, as the first paper argued. Even if it is correct that gravitational effects follow from entropy, we still seem to lack a causal reason why it should be that way, so in my thinking it does not yet follow that entropy is fundamental.


In the only models I can think of as theories of everything (the reason) – deterministic ones – at each point of spacetime there is some single situation, so there is no point in talking about probabilities, and hence entropy.

While building some THERMODYNAMICAL MODEL OVER THIS SOLUTION, at each point we usually consider a ball and average over it, getting effective local parameters like entropy or temperature – for our history of the Universe this allows us to assign a local entropy to each point of spacetime – and the 2nd law of thermodynamics says that we move along the four-dimensional gradient of this entropy – so there had to be an entropy minimum at our Big Bang (or maybe Bounce), and it probably created the entropy gradient giving the 2nd law of thermodynamics.
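That coarse-graining step can be made concrete (my own toy illustration; the binary 1D field and ball radius are assumptions for simplicity):

```python
import math
import random

# Sketch (my own illustration): building a 'thermodynamical model over the
# solution'. Take one concrete deterministic microstate (a 0/1 field on a 1D
# lattice), average inside a ball of radius r around each site to get a local
# density p, and assign the local entropy -p*lg(p) - (1-p)*lg(1-p).
random.seed(0)
state = [random.randint(0, 1) for _ in range(1000)]  # one concrete history

def local_entropy(state, i, r=50):
    ball = state[max(0, i - r): i + r + 1]           # the averaging ball
    p = sum(ball) / len(ball)                        # local density
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

h = [local_entropy(state, i) for i in range(len(state))]
print(all(0.0 <= x <= 1.0 for x in h))  # a binary density carries <= 1 bit
```

Note that the microstate itself contains no probabilities – the entropy field h appears only after choosing the averaging ball, which is why it is an effective rather than fundamental quantity.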


Quantum mechanics by definition ignores the dynamics behind wavefunction collapse and only gives the probability distribution of its result – like thermodynamical models ... which some people interpret as spacetime being an infinitely quickly branching tree of parallel universes ...

I completely disagree – the four-dimensional nature of our spacetime already leads to much nonintuitiveness, like the (confirmed) Wheeler delayed-choice experiment, or the fact that to translate the amplitudes we work with into real probabilities we should square them, against Bell's intuition, or that it allows for powerful 'quantum' computers: http://www.scienceforums.net/forum/showthread.php?p=569143

