The second law of thermodynamics is one of the most important principles in physics, so it is worth understanding well.
For more context, see the video here.
The question becomes “what is entropy?”, which in turn becomes “what is information?” A “gain of entropy” is then associated with a “loss of information about the system”. It is possible that the entropy never really increased, just that the information became something we don’t care about.
Now, I propose the following test to see if the information truly is gone.
1. Start with two batches of particles: one where differently colored particles start in separate regions, and one where the colors are assigned at random.
2. Simulate the particles mixing.
3. Sample the particles (only a few of them, not the whole system).
4. Train a model to distinguish between the two starting conditions.
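The steps above can be sketched in code. This is a minimal toy version with several stand-ins I chose purely for illustration: a 1-D random walk plays the role of “mixing”, the color/position correlation of a small sample is the only feature, and a tiny hand-rolled logistic regression is the “model”:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_batch(separated, n=200):
    # Colors are 0 or 1; in the "separated" case color 0 starts in the
    # left half of the box and color 1 in the right half.
    colors = rng.integers(0, 2, n)
    x = rng.uniform(0, 1, n)
    if separated:
        x = np.where(colors == 0, x * 0.5, 0.5 + x * 0.5)
    return x, colors

def mix(x, steps=50, step=0.02):
    # Crude "mixing": a random walk with positions clipped to the box.
    for _ in range(steps):
        x = np.clip(x + rng.normal(0, step, x.shape), 0, 1)
    return x

def feature(x, colors, k=20):
    # Sample only k particles; the feature is the sample correlation
    # between position and color (near zero once fully mixed).
    idx = rng.choice(len(x), k, replace=False)
    return np.corrcoef(x[idx], colors[idx])[0, 1]

# Build a training set: label 1 = the batch started out separated.
X, y = [], []
for _ in range(200):
    sep = rng.random() < 0.5
    x, c = make_batch(sep)
    f = feature(mix(x), c)
    if not np.isfinite(f):
        f = 0.0  # degenerate sample (all one color); treat as uncorrelated
    X.append(f)
    y.append(int(sep))
X, y = np.array(X), np.array(y)

# Tiny logistic regression on the single feature, via gradient descent.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X * w + b)))
    g = p - y
    w -= 0.1 * np.mean(g * X)
    b -= 0.1 * np.mean(g)

acc = np.mean(((1 / (1 + np.exp(-(X * w + b)))) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

With only partial mixing (as here), the sampled correlation still carries a trace of the initial separation; with many more mixing steps the two classes become statistically indistinguishable, which is exactly the question the experiment is probing.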
If the model can be trained successfully, it means the information that the particles were once separated can be recovered from just a few samples (measurable information, not the state of the entire system). This may give further insight into the nature of the second law of thermodynamics.
But this is (to my understanding) exactly what the second law is about: If you don’t have all the knowledge about a system, you should assume the most likely thing – which very often turns out to be a “disordered mess”. It doesn’t have to be, as was also stated in the video you linked, but for the particles in the box, it will be.
I don’t see how any model can give you a better prediction than this. If you can somehow predict the initial state of your simulation from just a limited observation, you have also found a (theoretical) way to exploit that knowledge, so what you call heat and entropy will have changed, and you can extract (real, physical) work from the system.
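That information-to-work connection can even be quantified: by the Szilard/Landauer argument, one bit of information about the system lets you extract at most k_B·T·ln 2 of work. A quick back-of-the-envelope calculation (room temperature of 300 K is my own assumption here):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact by SI definition)
T = 300.0            # assumed room temperature, K
w_per_bit = k_B * T * math.log(2)  # maximum work per bit of information
print(f"{w_per_bit:.3e} J per bit")  # ~2.87e-21 J
```

So recovering the “lost” information is not just a bookkeeping trick; it has a real, if tiny, thermodynamic price tag.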
There is a brilliant article by E. T. Jaynes about the so-called Gibbs “paradox” (which also involves colored particles in a box) that illustrates, with a few examples, how different observers can come to different conclusions about entropy depending on their knowledge and technological capabilities. There really is no paradox if we take the assumption “entropy ~ information” seriously. It might be worth a read:
Also, using a computer to generate random numbers brings to mind that quote by von Neumann: “anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.”
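His point is easy to demonstrate: a seeded pseudorandom generator is fully deterministic, so (for example, with NumPy) two generators given the same seed reproduce the same “random” stream exactly:

```python
import numpy as np

# Two generators with the same seed produce identical "random" streams:
# every bit of randomness in such a simulation is determined by the seed.
a = np.random.default_rng(42).uniform(size=5)
b = np.random.default_rng(42).uniform(size=5)
print(np.array_equal(a, b))  # prints True
```

In other words, the full initial condition of the simulated box is always recoverable by whoever holds the seed, which is worth keeping in mind when interpreting what the trained model does or does not “recover”.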