What is entropy ?

Discussion in 'Physics & Math' started by Mark Turner, Jul 17, 2019.

  1. Q-reeus Valued Senior Member

    Messages:
    3,417
    Nonsense. It's mostly because the number of available high-entropy states vastly exceeds the number of low-entropy states that transitions from high to low entropy are never observed in macroscopic systems. That holds on either a classical or a quantum mechanical basis. Energy constraints are another factor, dispelling some popular and erroneous claims, e.g. that all the air in a room could spontaneously rush into one small corner. No.
    Statistical mechanics was derived classically, but not on a fully self-consistent basis:
    https://www.phas.ubc.ca/~birger/boltzmann/node2.html
    Planck actually gave the vital initial clue well before Feynman path integrals came along: quantization of energy, which solved the 'ultraviolet catastrophe' plaguing classical attempts to explain blackbody radiation. From then on, the previous ad hoc coarse-graining assumptions used by Maxwell, Boltzmann et al. could be dispensed with.

    The word 'Feynman' appears nowhere in this article: https://arxiv.org/abs/quant-ph/0511225
    Please supply a link to a reputable article where Feynman path integrals are actually a basis for developing statistical mechanics quantum mechanically. I'm skeptical.
     
  3. CptBork Robbing the Shalebridge Cradle Valued Senior Member

    Messages:
    5,965
    No: classical mechanics is time-reversible whereas thermodynamics is not. You can't derive those laws from classical postulates without adding some statistical assumptions, which are only justified by quantum mechanics. You can call the statistical assumptions "classical", but they don't follow from Newtonian dynamics.

    I was not talking about the historical development of statistical mechanics and then quantum mechanics thereafter. I was talking about how quantum mechanics justifies the statistical postulates used in statistical mechanics.

    I have gone through a few sources but the connection is complicated and not well-elucidated in the ones I've found so far. I'll get back to you on that later.
     
    exchemist likes this.
  5. CptBork Robbing the Shalebridge Cradle Valued Senior Member

    Messages:
    5,965
    If you're looking to measure the state of a system, you don't need to establish thermal equilibrium with the measuring device. Maxwell's daemon isn't a measuring apparatus, it's a selection apparatus for eliminating certain microstates from the ensemble. In principle you can certainly look at a gas of photons and measure the positions and energies of the individual photons and thus measure them in a specific microstate.

    I think you misunderstand. Physics doesn't say that there's an equal chance of finding all gas molecules on one side of the container as of finding them evenly mixed. It says that a specific outcome with them all on one side is just as likely as a specific outcome with them evenly distributed. Since there are astronomically many more ways of arranging the particles in a relatively even distribution than ways of arranging them all on one side of the container, in practice there is almost zero probability of finding them in such an organized state. From a thermodynamic point of view, each microstate has equal probability, but there are more microstates with certain features than with others, more potential outcomes considered "disordered" than "ordered", and in thermal equilibrium the system is in a superposition of all these states.
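    The counting argument above can be made concrete with a toy calculation. A minimal sketch, assuming a hypothetical system of N = 100 distinguishable particles, each sitting in the left or right half of a container (a real gas has on the order of 10**23 particles, which only makes the disparity more extreme):

```python
# Toy model (assumption, not a real gas): N distinguishable particles, each
# independently in the left or right half of a container. All 2**N microstates
# are equally likely, but macrostates contain wildly different numbers of them.
import math

N = 100  # hypothetical particle count; a real gas has ~10**23

all_on_left = 1                      # exactly one microstate: every particle on the left
evenly_split = math.comb(N, N // 2)  # microstates with exactly N/2 particles per side

print(evenly_split)          # ~1.01e29 microstates, versus 1, already at N = 100
print(all_on_left / 2 ** N)  # probability of the fully "ordered" macrostate
```

    Even at N = 100, the "all on one side" macrostate is realized by one microstate out of roughly 10**30, which is why it is never observed.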
     
  7. exchemist Valued Senior Member

    Messages:
    8,646
    Your point about classical mechanics being time-reversible seems quite profound.
     
  8. Q-reeus Valued Senior Member

    Messages:
    3,417
    Wrong comparison. Quantum mechanics at a 'fundamental' level is governed by linear equations and is just as subject to time reversibility as is classical mechanics.
    A nice book arguing where genuine time irreversibility stems from (non-linear systems exhibiting non-equilibrium processes) is:
    The Arrow of Time - Peter Coveney & Roger Highfield
    https://www.amazon.com/Arrow-Time-Through-Science-Greatest/dp/0449906302
     
  9. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    Well, what I think I actually said was more like: given the set of all binary strings of length k, when k is a large number, the probability of choosing a string which has all 1s in one half, and all 0s in the other half, is almost zero.

    But choosing a string (i.e. any string) from such a set is equivalent to randomly generating a string of length k, using whatever algorithm.
    In that case, I can't see how "selection" is different to "measurement" . . .?
    Moreover, Maxwell's daemon is classical, and so is the mechanism.
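    The probability claim above is easy to check numerically. A minimal sketch, assuming k = 64 as an arbitrary (and still modest) string length:

```python
# Among all 2**k binary strings of even length k, exactly two have all 1s in
# one half and all 0s in the other ('1'*(k//2) + '0'*(k//2) and its mirror),
# so the chance of randomly drawing one collapses rapidly as k grows.
k = 64                   # arbitrary example length
favorable = 2            # '1' * 32 + '0' * 32, and '0' * 32 + '1' * 32
total = 2 ** k

print(favorable / total) # ~1.08e-19: effectively zero already at k = 64
```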
     
    Last edited: Aug 8, 2019
  10. iceaura Valued Senior Member

    Messages:
    29,555
    Random generation of a string of integers will produce repeated patterns with a certain probability - significantly different from 0.
    On the other hand, for every finite sequence of integers there is a mathematical function which will generate it - deterministically.

    There is no way to identify - with certainty - a finite sequence of integers as "random", by appearance alone.

    That depends on your mechanism of choice.
    And randomly generating a string using an algorithm - - - What counts as an "algorithm"?
     
    Last edited: Aug 8, 2019
  11. CptBork Robbing the Shalebridge Cradle Valued Senior Member

    Messages:
    5,965
    To be fair, time reversibility isn't in itself a complete argument against deriving thermodynamics from Newtonian dynamics, because you can model systems with increasingly chaotic behaviour which one might argue implies an increase in entropy. What I think constitutes a more fundamental argument is the application of Liouville's theorem:

    https://en.wikipedia.org/wiki/Liouville's_theorem_(Hamiltonian)

    Classical deterministic systems have the overall volume of their occupied position-momentum phase space conserved. That is to say, if you have a collection of particles evenly occupying some small set of positions and momenta, the set of positions and momenta they occupy later has to be equally small. You can't have a classical isolated system begin in an orderly state where the constituent particles have nearly all the same initial positions and momenta, but later on find particles evenly spread out in virtually every occupiable position, ranging over every possible momentum.
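    Liouville's theorem can be illustrated with the simplest possible case. A minimal sketch, assuming a harmonic oscillator with m = k = 1, whose exact time-t flow map is linear, so phase-space area scales by the determinant of the map's matrix:

```python
# Harmonic oscillator (m = k = 1): the exact flow over time t maps
# (q, p) -> (q cos t + p sin t, -q sin t + p cos t).
# Phase-space area scales by the determinant of this linear map, and
# Liouville's theorem says that determinant must be exactly 1.
import math

t = 0.7  # arbitrary evolution time
flow = [[math.cos(t),  math.sin(t)],
        [-math.sin(t), math.cos(t)]]

det = flow[0][0] * flow[1][1] - flow[0][1] * flow[1][0]
print(det)  # 1.0 up to rounding: occupied phase-space volume is conserved
```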

    In quantum mechanics with the standard Copenhagen interpretation, the wave function only evolves unitarily (in a time-reversible way) until a measurement is made, at which point the wave collapses in an irreversible manner. If you take the deterministic multiverse point of view then such a collapse would in principle be reversible, but each of the parallel universes which obtained a different measurement would have to contribute to that reversal process, which is extremely improbable, thus entropy can increase (or appear to increase) over time as information about the past becomes irretrievably entangled with the surrounding environment and the universe itself.
     
  12. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    https://www.mdpi.com/1099-4300/20/12/934/htm
     
  13. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    Interesting also is the idea of choosing (necessarily so), a definition of irreversible process; usually this is defined thermodynamically for a classical system.
    We say energy is lost from the system, and moves to the environment irreversibly.

    In the information processing context, erasing information is thermodynamically irreversible; storing or copying information is 'reversible' then, when you 'unstore' it by erasing it. Storing a bit of information increases the entropy of a memory, erasing it increases the entropy of the environment.

    Erasure lowers the entropy of a bit of memory because an erased bit has one state, a stored bit has two.
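    The thermodynamic cost of erasure described above is Landauer's bound. A minimal numeric sketch, assuming room temperature T = 300 K:

```python
# Landauer's bound: erasing one bit at temperature T dissipates at least
# k_B * T * ln(2) of heat into the environment.
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
T = 300.0           # assumed room temperature in kelvin

E_min = k_B * T * math.log(2)
print(E_min)  # ~2.87e-21 joules per erased bit
```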
     
  14. iceaura Valued Senior Member

    Messages:
    29,555
    That is a link to a very good empirical test for the appearance of randomness in a finite string.
    The test measures the appearance of randomness.
    Note: they are not claiming certainty or status as random, but "best" appearance of randomness.

    There is no conflict between that link and my post - it serves, rather, as an illustration of the basic fact: there is no way to determine, with certainty, by appearance alone, that any finite string of integers (or other such signifiers) is random or nonrandom.

    According to that test, for example, a string of digits from the decimal representation of the number "e" appears random, matching the established assessment of such strings as having apparently random digit distributions, while strings of digits produced by spins of a roulette wheel are distinguished from the appearance of true randomness, again agreeing with the standard prior evaluation. That is evidence that the test is a good one and that its assessment of appearance is both subtle and reliable: the differences in the appearance of randomness between the digits of e and the roulette-wheel strings are not easy to spot.

    Meanwhile, we know for sure that the digits of "e" are determined, and will be exactly the same every time that string is generated. If you know any digit's position in the overall string, you know not only that digit but also, whenever observed, every other calculated digit whose relationship to it is known. So this link condition:
    is not met, in the context of this thread - physical reality.

    In addition, the link makes the following statement
    which is in the context of this thread completely false. It is impossible, not easy, to determine by appearance alone - with certainty, mind - that a given finite string of integers is nonrandom. A random integer generator can produce - will produce, given enough time - any given finite string of integers, including whatever string is currently being observed.
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    Thanks for the editorial. I have to quibble, though, with the claim that you can't determine with certainty that a finite string is nonrandom; if it is nonrandom, then certainly an algorithm exists that doesn't need to store the entire string.

    My two fixed strings, generated by while loops iterated k/2 times, are, in that sense, an algorithmic proof that neither string is random. I thought that was obvious.

    Maybe the notion "by appearance" needs more work.
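    The while-loop construction mentioned above can be sketched directly. A minimal example (the function name `alternating` is my own label, not from the thread):

```python
# A fixed-size loop, iterated k/2 times, emits the string '0101...01' of any
# even length k. The program's size does not grow with k, which is the
# (informal) sense in which the string is algorithmically nonrandom.
def alternating(k):
    out = []
    i = 0
    while i < k // 2:   # the while loop iterated k/2 times, as in the post
        out.append("01")
        i += 1
    return "".join(out)

print(alternating(8))  # '01010101'
```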
     
    Last edited: Aug 9, 2019
  16. James R Just this guy, you know? Staff Member

    Messages:
    33,360
    Thermodynamics and entropy were around long before anybody thought of quantum mechanics. You are saying that thermodynamics can only be properly justified retrospectively, in light of quantum mechanics.

    So, I'm interested to find out from you what statistical assumptions of thermodynamics must be justified by quantum mechanics, in order to retrospectively justify thermodynamics.

    Thanks.
     
  17. CptBork Robbing the Shalebridge Cradle Valued Senior Member

    Messages:
    5,965
    The very existence of thermal statistics is itself what requires quantum justification: the idea that even if you start with a system in a well-defined, well-ordered microstate, once it has been allowed to reach thermal equilibrium it must exist in a superposition of many microstates covering every possible outcome, where the only certainties you can have about the system, without measuring it and disturbing the equilibrium, are the macroscopic measurements.

    Admittedly, though, I believe quantum mechanics provides more than just a basic justification for the existence of thermal statistics; it also requires that the individual microstates associated with a given macrostate are all equally probable, which is fundamental to Boltzmann's picture. I'm currently looking into the path integral connection I mentioned to Q-reeus, but it's been a long time since I worked with those things and I'm trying to get a grasp on the physical meaning behind the calculations. There's a path integral method for describing and calculating the canonical partition functions you see in stat mech, and I believe it does have physical meaning within quantum mechanics and quantum field theory, but there's a lot of weird Wick-rotation imaginary-time machinery involved and it's hard to find sources that make more than a passing mention of its physical significance. I'm going back through my old Peskin and Schroeder QFT textbook hoping to find more info, but like I say, I need to review a lot of stuff before I can say more.
     
  18. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    Yay. Except that there's only one particle to "work with". The demon only needs a 1-bit memory. So we now have at least two physical examples of a one-particle one-demon model.

    What Maxwell was thinking about was an n-particle one-demon system. Obviously that means a lot more memory.
     
    Last edited: Aug 10, 2019
  19. iceaura Valued Senior Member

    Messages:
    29,555
    You have information beyond the appearance, and that information proves that particular string is not random.

    If you had only the appearance, with no knowledge of how the string was generated, you would not be able to state with certainty that it was not randomly generated. Random generation can produce any string.
     
  20. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    And in order for this information to be relevant (or if you like, to have meaning) I need to store it somewhere. Storing information is thermodynamically costly.

    On the other hand, suppose I have either of the strings 0101...01 or 1010...10; then I can gain information about them by inspecting the first 4 characters, which tells me I can write them as a repeat of 01, or of 10. More inspection tells me I can repeat the repeat, etc. I have to scan, or inspect, the entire string to be certain it can be represented (meaningfully) by a single while loop; then I only need to store a 01 or a 10, plus one loop.

    Scanning or inspecting a string you have no information about is also thermodynamically costly, because you have to store the information when you do "get" it.
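    The inspection procedure described above amounts to finding the shortest repeating unit of a string. A minimal sketch (the helper name `minimal_period` is my own):

```python
# Find the shortest substring whose repetition reproduces the whole string;
# if one exists, only that unit plus a repeat count need to be stored.
def minimal_period(s):
    for p in range(1, len(s) + 1):
        if len(s) % p == 0 and s[:p] * (len(s) // p) == s:
            return s[:p]
    return s

print(minimal_period("0101010101"))  # '01': store 2 characters and a count, not 10
print(minimal_period("0110"))        # '0110': no shorter repeating unit exists
```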
     
  21. CptBork Robbing the Shalebridge Cradle Valued Senior Member

    Messages:
    5,965
    If the string is allowed to take on any value, then you need an external piece of information in order to specify that there's a repeating pattern in it without inspecting the whole string. Some data can be compressed more readily than others, but that doesn't mean they represent states of lower entropy. I don't see how entropy is connected to the amount of memory needed to store the exact state of a system.
     
  22. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    Well, what it's actually called is algorithmic complexity. The connection to entropy is that this external information you mention has to be stored somewhere. Then, as you scan the string, say from left to right, the stored information might need to be updated. That is, what is stored needs to be erased so new information can be stored. This erasure, as mentioned, increases the entropy of the environment and lowers the entropy of the memory.

    The string 1010...10 has low algorithmic complexity. What you know about it is that the substring 10 is repeated, and this doesn't change. If it did change, you would erase the "10" in memory and substitute another pattern if you find one.

    There is another little detail: apart from defining thermodynamic irreversibility, you also need logical irreversibility. For instance, logical AND is irreversible unless the output is 1 (in which case you know both inputs were also 1s).
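    The AND example can be checked by brute force. A minimal sketch enumerating the preimages of each output value:

```python
# Logical AND is not invertible: output 0 has three possible input pairs, so
# the inputs cannot be recovered from it; only output 1 pins them down.
from itertools import product

preimages = {0: [], 1: []}
for a, b in product((0, 1), repeat=2):
    preimages[a & b].append((a, b))

print(preimages[0])  # [(0, 0), (0, 1), (1, 0)]: input information is lost
print(preimages[1])  # [(1, 1)]: the one logically reversible case
```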
     
    Last edited: Aug 11, 2019
  23. arfa brane call me arf Valued Senior Member

    Messages:
    6,554
    I'd comment that anyone wanting to know more about what entropy is, won't be that satisfied with this thread. It's still quite a tricky subject.

    It involves information--storing and erasing it--and it involves the notion of work. In Maxwell's original thought experiment, the demon does a small amount of work, negligible in the circumstances, and the hot gas particles do most of the work.
    So one side of the container heats up without having to do the amount of work needed according to the laws of thermodynamics.

    But the information the demon needs to store so it can 'track' a particle is equivalent to the work the particle does on the system. This is the kicker: storage is like the small amount of work Maxwell envisaged, but no cycle exists until the memory is erased; only then can the demon track another particle (since the information about the one that has passed through the trapdoor is no longer relevant).

    Which gives us an equivalence between energy (as work) and information; so we have entropy of information.

    And maybe a last comment about irreversibility: fairly obviously, since an AND logic gate 'erases' information, logical and thermodynamic irreversibility are equivalent when realized in physical devices.
     
    Last edited: Aug 12, 2019

Share This Page