Is a quantum event proportional to entropy?


Information entropy is a concept from information theory. It quantifies how much information an event carries. In general, the more certain or deterministic an event is, the less information it contains. More clearly stated, information is an increase in uncertainty or entropy.
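
As a rough illustration of that idea (a minimal sketch of my own, not from the thread), the Shannon entropy \(H = -\sum_i p_i \log_2 p_i\) of an outcome distribution is zero for a certain event and largest for a maximally uncertain one:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # certain outcome  -> 0.0 bits of information
print(shannon_entropy([0.5, 0.5]))   # fair coin flip   -> 1.0 bit (maximal for 2 outcomes)
print(shannon_entropy([0.9, 0.1]))   # biased coin flip -> ~0.47 bits
```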

If we take entropy as being information, then information is not constant, since entropy is not constant.

The Second Law of Thermodynamics states that the entropy of the universe, treated as an isolated system, never decreases over time: \(\Delta S_{\mathrm{universe}} \ge 0\).

If entropy is increasing, then the information in the universe is also increasing.

Suppose it is quantum mechanics that is responsible for this increase in information. Quantum mechanics describes how the fundamental particles behave (for instance, the 12 fermions of the Standard Model), but it is a probabilistic theory. Because measurement outcomes are fundamentally random, new information is created every time a quantum event occurs. In that case, it could be these quantum measurements that are increasing the entropy of the universe, since they create new information.
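
To illustrate that reasoning (a hedged sketch of my own, not part of the original argument), consider repeatedly measuring a qubit prepared in an equal superposition: by the Born rule each outcome is unpredictable, so every measurement adds one fresh bit to the record of outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Qubit prepared in the equal superposition (|0> + |1>)/sqrt(2).
amplitudes = np.array([1.0, 1.0]) / np.sqrt(2.0)
probs = np.abs(amplitudes) ** 2              # Born rule probabilities: [0.5, 0.5]

# Shannon entropy of the outcome distribution, in bits per measurement.
bits_per_measurement = -np.sum(probs * np.log2(probs))
print(bits_per_measurement)                  # 1.0

# Each measurement yields a genuinely new, unpredictable outcome,
# so the accumulated outcome record keeps growing.
n = 1000
outcomes = rng.choice([0, 1], size=n, p=probs)
print(f"{n} measurements -> about {n * bits_per_measurement:.0f} bits of new outcome data")
```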

So the uncertainty of quantum particles might be proportional to the amount of entropy, suggesting that where there is more entropy there will be more quantum events.

Stephen Hawking showed that black holes emit radiation, now known as Hawking radiation. By this, Hawking confirmed that black holes have entropy, and he also determined how much entropy they have. The supermassive black hole at the center of the Milky Way (Sagittarius A*) has about \(10^{91}\) Boltzmann constants of entropy. That is 1000 times as much as the early observable universe, and 10 times more than all the other particles combined. And that is just one black hole. All black holes together account for about \(3\times10^{104}\) Boltzmann constants of entropy, so almost all the entropy in the universe is tied up in black holes.
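
Those figures can be sanity-checked, at least to order of magnitude, with the Bekenstein-Hawking formula \(S = \dfrac{k_B c^3 A}{4 G \hbar}\), where \(A\) is the horizon area. The sketch below is my own back-of-envelope check, assuming a mass of roughly \(4\times10^{6}\) solar masses for the Milky Way's central black hole; it lands around \(10^{90}\) to \(10^{91}\,k_B\), the same ballpark as the figure quoted above.

```python
import math

# Physical constants (SI units)
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
hbar  = 1.055e-34    # reduced Planck constant, J s
c     = 2.998e8      # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

# Assumed mass of Sagittarius A* (roughly 4 million solar masses).
M = 4.0e6 * M_sun

# Bekenstein-Hawking entropy in units of k_B:
#   S / k_B = c^3 A / (4 G hbar),  with horizon area A = 16 pi G^2 M^2 / c^4,
# which simplifies to 4 pi G M^2 / (hbar c).
S_over_kB = 4 * math.pi * G * M**2 / (hbar * c)
print(f"S / k_B ~ 10^{math.log10(S_over_kB):.0f}")   # prints an exponent of about 90
```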

So Hawking radiation, being of very high entropy, should account for most of the quantum events in the universe.

1 hour ago, Oryza sativa said:

More clearly stated, information is an increase in uncertainty or entropy.

If we take entropy as being information

The two statements above are contradictory. The first says that information is an increase in entropy: \(I=\Delta S\). The second says that information is entropy: \(I=S\). Before continuing with the argument, this contradiction needs to be cleared up. Which one is true, \(I=\Delta S\) or \(I=S\)?
