Testing the limit of the second law of thermodynamics with Julia

The second law of thermodynamics is an important rule in physics, and understanding it well gives crucial insight into how physical systems behave.

For more context, see the video here.
The question becomes “what is entropy?”, which in turn becomes “what is information?”. A “gain of entropy” is then associated with a “loss of information” about the system. It is possible that the entropy never really increases; rather, the information just becomes something we don’t care about.
Now, I propose the following test to see whether the information is truly gone.

  1. Start with two batches: one in which differently colored particles begin in separate regions, and one in which the particles are colored at random.
  2. Simulate the particles mixing.
  3. Sample a few of the particles (only a small subset can be measured, not the whole system).
  4. Train a model to distinguish between the two starting conditions (see the sketch after this list).
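
Below is a minimal Julia sketch of this pipeline, under heavy simplifying assumptions: non-interacting particles in a 1D box with reflecting walls stand in for the mixing, a single hand-picked feature summarizes the samples, and the classifier is a hand-rolled logistic regression. All names, constants, and modeling choices are illustrative, not a fixed design.

```julia
using Random, Statistics

const N, STEPS, DT = 200, 2_000, 0.01

# Steps 1-2: build one batch and mix it. `separated = true` colors the
# particles by starting half (the ordered condition); `false` colors at random.
function run_batch(rng; separated::Bool)
    pos = rand(rng, N)                                # positions in [0, 1]
    vel = randn(rng, N)                               # random velocities
    col = separated ? (pos .< 0.5) : rand(rng, Bool, N)
    for _ in 1:STEPS                                  # free flight ...
        pos .+= DT .* vel
        for i in 1:N                                  # ... with reflecting walls
            if pos[i] < 0
                pos[i] = -pos[i];     vel[i] = -vel[i]
            elseif pos[i] > 1
                pos[i] = 2 - pos[i];  vel[i] = -vel[i]
            end
        end
    end
    return pos, col
end

# Step 3: sample only k particles and reduce them to one feature: the mean
# position gap between the two colors, a trace a separated start might leave.
function feature(rng, pos, col; k = 20)
    idx = randperm(rng, N)[1:k]
    a = [pos[i] for i in idx if col[i]]
    b = [pos[i] for i in idx if !col[i]]
    (isempty(a) || isempty(b)) && return 0.0
    return mean(a) - mean(b)
end

# Step 4: logistic regression on that feature, fitted by gradient descent.
σ(z) = 1 / (1 + exp(-z))
function train(X, y; lr = 0.5, epochs = 2_000)
    w, b = 0.0, 0.0
    for _ in 1:epochs
        p = σ.(w .* X .+ b)
        w -= lr * mean((p .- y) .* X)                 # cross-entropy gradients
        b -= lr * mean(p .- y)
    end
    return w, b
end

rng = MersenneTwister(1)
labels = [isodd(i) for i in 1:400]                    # alternate the two classes
X = [feature(rng, run_batch(rng; separated = l)...) for l in labels]
w, b = train(X, Float64.(labels))
println("train accuracy: ", mean((σ.(w .* X .+ b) .> 0.5) .== labels))
```

With this toy free-flight dynamics and long mixing, the feature may carry essentially no signal, so accuracy near 50% would itself be an informative outcome; the interesting experiment is how accuracy decays as mixing time grows.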

If the model can be trained successfully, it means that the information that the particles were once separated can be recovered from a few samples (measurable information, not the state of the entire system). This may give further insight into the nature of the second law of thermodynamics.

Who wants to join? Any suggestions?

If you can train a model that can distinguish between 1) a well-ordered starting condition and 2) a random starting condition… what have you learned?

That the information is not fully gone.

But this is (to my understanding) exactly what the second law is about: if you don’t have complete knowledge of a system, you should assume the most likely state – which very often turns out to be a “disordered mess”. It doesn’t have to be, as was also stated in the video you linked, but for the particles in the box, it will be.

I don’t see how any model can give you a better prediction than this. If you can somehow predict the initial state of your simulation from just a limited observation, you have also found a (theoretical) way to exploit that knowledge, so what you call heat and entropy will have changed, and you can extract (real, physical) work from the system.

There is a brilliant article by E. T. Jaynes about the so-called Gibbs “paradox” (which also involves colored particles in a box), which illustrates, with a few examples, how different observers can come to different conclusions regarding entropy, based on their knowledge and technological advancement. There really is no paradox if we take the association “entropy ~ information” seriously. It might be worth a read: E. T. Jaynes, “The Gibbs Paradox” (1992).


If there were no numerical errors, you could predict the initial positions exactly from the positions and velocities at any instant.

The second law is not incompatible with exact knowledge of the system. You need to define a measure of what counts as information to you, and then see how that measure evolves in time.
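
This point is easy to check numerically. Below is a small sketch, assuming a toy harmonic chain and a velocity Verlet integrator (which is time-reversible in exact arithmetic): run forward, flip the velocities, run back, and whatever gap remains to the initial positions is purely floating-point error. All parameters are arbitrary illustrative choices.

```julia
using Random

# Toy force: independent springs plus weak nearest-neighbor coupling.
force(x) = -x .+ 0.1 .* (circshift(x, 1) .- 2 .* x .+ circshift(x, -1))

# Velocity Verlet: exactly reversible in exact arithmetic,
# only approximately so in floating point.
function verlet!(x, v, dt, steps)
    f = force(x)
    for _ in 1:steps
        v .+= (dt / 2) .* f
        x .+= dt .* v
        f = force(x)
        v .+= (dt / 2) .* f
    end
    return x, v
end

rng = MersenneTwister(0)
x0, v0 = randn(rng, 64), randn(rng, 64)
x, v = verlet!(copy(x0), copy(v0), 1e-3, 100_000)  # forward in time
verlet!(x, -v, 1e-3, 100_000)                      # flip velocities, run back
println("round-trip position error: ", maximum(abs.(x .- x0)))
```

For this non-chaotic chain the round-trip error stays tiny; for a chaotic system (like colliding particles in a box) the same rounding errors grow exponentially, which is why the initial state is not recoverable in practice.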


Also, using a computer to generate random numbers brings to mind that quote by von Neumann: “anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.”
