What is entropy?

Discussion in 'Physics & Math' started by Mark Turner, Jul 17, 2019.

  1. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    I don't know about surely.

    I do have another point though: randomness. Disorder and randomness are sometimes used interchangeably; is a random number generator also "generating disorder" or is that even a question?
     
  2. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    Haven't read running water in over 2 years

    That's the way Iggy works


     
  3. exchemist Valued Senior Member

    Messages:
    12,538
    Not physical disorder, of course. Whether disorder means something in pure maths I don’t know.
     
  4. iceaura Valued Senior Member

    Messages:
    30,994
    The people analyzing the system.
    They are defined so that the reasoning and calculations from them agree with physical measurement and observation.
    It does in Combinatorics - at least, "order" does. Also in Probability.
    - - - -
    One of the most significant (to me) insights I picked up in school days, from thermodynamics - basic, very early on - was that entropy is a physical thing, a measurable, calculable, quantifiable feature or property of an analyzed system. That changed the way I saw the world.
     
    Last edited: Aug 5, 2019
  5. CptBork Valued Senior Member

    Messages:
    6,465
    Disorder refers to the number of different microscopic arrangements available to a system with some fixed macroscopic parameters. For instance, if you have a box filled with an ideal gas, with a given volume, a fixed number of gas particles, and a fixed amount of energy shared amongst them, then there are zillions of different ways the particles can be arranged inside the box, with different individual energies arising from different ways of partitioning the total energy. Classically the number of possible arrangements would be infinite, corresponding to an infinite entropy, so one would only consider differences in entropy between one state and another, without worrying about the total entropy of each state on its own. In quantum mechanics the entropy is always finite, and it reduces to zero or nearly zero at absolute zero temperature (the latter case occurs when the ground state is degenerate, i.e. when more than one microstate shares the lowest available energy).

    The standard entropy formula is given by \(S=k_B\ln\Omega\), where \(k_B\) is a fixed value known as Boltzmann's constant and \(\Omega\) is the number of microstates available to the system with the given macroscopic (i.e. total energy, volume, mole number) parameters. To count the number of available microstates you need to assume certain postulates either from classical or quantum mechanics. You might want to look up "Einstein solids" for an example of how it's done in practice.

    https://en.wikipedia.org/wiki/Einstein_solid
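
    A minimal sketch of that counting, assuming Python (the function name here is illustrative, not from any library): an Einstein solid with \(N\) oscillators sharing \(q\) quanta has \(\Omega = \binom{q+N-1}{q}\) microstates, and the entropy then follows directly from Boltzmann's formula.

        # Entropy of an Einstein solid: N oscillators sharing q energy quanta.
        # Multiplicity: Omega(N, q) = C(q + N - 1, q), then S = k_B * ln(Omega).
        import math

        K_B = 1.380649e-23  # Boltzmann's constant, J/K

        def einstein_solid_entropy(n_oscillators: int, n_quanta: int) -> float:
            omega = math.comb(n_quanta + n_oscillators - 1, n_quanta)
            return K_B * math.log(omega)

        print(einstein_solid_entropy(50, 100))
        print(einstein_solid_entropy(100, 200))  # roughly double: entropy is extensive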
     
    exchemist and James R like this.
  6. James R Just this guy, you know? Staff Member

    Messages:
    39,426
    Actually, river was previously serving a 6 month exclusion period from the Science subforums. That ended, and he lasted about two weeks before I had to re-impose it.
     
    exchemist likes this.
  7. exchemist Valued Senior Member

    Messages:
    12,538
    Ah. I have him on Ignore, so don’t often see his effusions. Anyway great idea to keep him out of the science, thanks.
     
  8. exchemist Valued Senior Member

    Messages:
    12,538
    Very nice summary. One issue that always seems slightly unclear to me is whether we should think of entropy as reflecting the number of microstates themselves, or the way energy is distributed among them. Your explanation, I notice, speaks of energy distribution, as did my description earlier in the thread, but then Q-reeus correctly pointed out there is entropy associated with mixing, which does not seem to alter the way energy is distributed.

    How would you characterise it?
     
  9. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Suppose you define an ordered string as one of 010101...01, or 101010...10.

    Now suppose a function that randomly exchanges pairs of characters in either string. What that should do is 'randomise' the ordered strings, right?
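
    Such a swap function is easy to sketch, e.g. in Python (the names are illustrative):

        import random

        def random_swap(s: str, n_swaps: int = 1) -> str:
            """Randomly exchange pairs of characters in the string."""
            chars = list(s)
            for _ in range(n_swaps):
                i, j = random.sample(range(len(chars)), 2)  # two distinct positions
                chars[i], chars[j] = chars[j], chars[i]
            return "".join(chars)

        ordered = "01" * 8               # 0101...01
        print(random_swap(ordered, 20))  # typically no longer alternating
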
    But what, if anything, does that say about entropy?
     
  10. exchemist Valued Senior Member

    Messages:
    12,538
    Nothing. Entropy is a characteristic of a thermodynamic system.
     
  11. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Ok.
    What (if anything) does it say about disorder or randomness?
     
  12. CptBork Valued Senior Member

    Messages:
    6,465
    When counting the number of microstates available to the system, you must also take account of the positions of the particles, when such information is relevant. Given two distinct substances A and B, a state in which all the particles of substance A are on one side of a box and all particles of substance B are on the other side is considered distinct from a state in which particles of each substance are distributed on both sides, even if the particle energies are the same in either case and some particles of A and B have merely swapped positions.
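
    For mixing in particular there's a standard closed-form result (textbook thermodynamics, quoted here for illustration): mixing ideal gases A and B at mole fractions \(x_A\) and \(x_B\) changes the entropy by \(\Delta S_{mix} = -nR(x_A\ln x_A + x_B\ln x_B)\), which is positive even though the energy distribution is unchanged; for an equimolar mixture it reduces to \(\Delta S_{mix} = nR\ln 2\).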

    Note that, because of the indistinguishability of identical particles in quantum mechanics, such a caveat wouldn't apply if A & B were identical substances, because at the quantum level nature only counts one microstate for each arrangement of positions and energies, without labeling the individual particles as separate entities which can be individually tracked. Classically it was already known that the particles in an ideal gas had to be assumed to be indistinguishable in order to produce the correct entropy formula for ideal gases, and this was known as the Gibbs paradox, before quantum mechanics provided the theoretical justification.

    Also I just want to mention that historically, changes in entropy from one state to another were defined by the accompanying heat absorbed or released, divided by the absolute temperature of the system absorbing or releasing this heat. Later, Ludwig Boltzmann showed that all the postulates and results of thermodynamics could be derived from some basic statistical assumptions about matter and energy at the most fundamental levels, and this is where the formula \(S=k_B\ln\Omega\) comes from. If you know the basic thermodynamic properties of a system, you can readily compute changes in its entropy without any reference to microstates or statistics, but if you don't know any of these properties a priori then you can use the statistical approach to derive them.
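
    As a concrete example of the historical definition (standard textbook values): melting 1 kg of ice reversibly at \(T = 273\ \mathrm{K}\) absorbs a latent heat of about \(Q = 3.34\times 10^5\ \mathrm{J}\), giving \(\Delta S = Q/T = 3.34\times 10^5\ \mathrm{J}/273\ \mathrm{K} \approx 1.2\times 10^3\ \mathrm{J/K}\), with no reference to microstates at all.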
     
    exchemist likes this.
  13. CptBork Valued Senior Member

    Messages:
    6,465
    Given \(N\) as the string length, there are \(2^N\) different possible strings that can be randomly obtained, so if each outcome is equally likely then the entropy of the string would be calculated as \(S=Nk_B\ln 2\) from Boltzmann's entropy formula. If the probabilities for each outcome were different, you could still calculate the entropy, but using the more general Gibbs entropy formula instead.

    https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)#Gibbs_entropy_formula
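
    A minimal sketch of both calculations, assuming Python (the function name is my own):

        # Gibbs entropy S = -k_B * sum(p_i * ln(p_i)) over outcome probabilities.
        import math

        K_B = 1.380649e-23  # Boltzmann's constant, J/K

        def gibbs_entropy(probs) -> float:
            return -K_B * sum(p * math.log(p) for p in probs if p > 0)

        # N independent fair bits recover the Boltzmann result S = N k_B ln 2:
        N = 16
        print(gibbs_entropy([0.5, 0.5]) * N)  # equals N * K_B * ln(2)

        # A biased bit carries less entropy than a fair one:
        print(gibbs_entropy([0.9, 0.1]) < gibbs_entropy([0.5, 0.5]))  # True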
     
  14. exchemist Valued Senior Member

    Messages:
    12,538
    I don’t know a precise enough definition of those terms to answer that.
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    And there it is.

    Note what CptBork says above your post. In the first sentence he also doesn't define random, but uses it as if such a definition exists; everything else in his post then follows, kind of inevitably. Ok, there is the phrase "equally likely" . . .

    The important detail is the number of strings that can be obtained. Given a binary string of length k, say, what makes it random?
    It's the lack of any repeating pattern. Randomness can also be defined in terms of the shortest algorithm that generates a particular string.

    The two ordered strings (my choice, note) have a pretty simple algorithm: a single while loop that prints "01" or "10" k/2 times.
    If a string has no repeated pattern, the algorithm has to store the entire string so it can print it. In between are strings that can be defined with while loops, nested or otherwise.

    And so we have strings which are random (they cannot be compressed), and strings which are generated by a function that swaps characters "randomly" (though that still hasn't been precisely defined).
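
    Both ideas can be sketched, assuming Python, with zlib's compressed size as a rough, computable stand-in for algorithmic (Kolmogorov) complexity, which is itself uncomputable:

        import random
        import zlib

        def ordered_string(k: int) -> str:
            s = ""
            while len(s) < k:  # the single while loop: append "01" k/2 times
                s += "01"
            return s

        def compressed_size(s: str) -> int:
            return len(zlib.compress(s.encode()))

        k = 10_000
        ordered = ordered_string(k)
        scrambled = "".join(random.sample(ordered, k))  # random reshuffle of the same characters
        print(compressed_size(ordered))    # tiny: the repeating pattern compresses away
        print(compressed_size(scrambled))  # far larger: no short pattern survives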
     
    Last edited: Aug 7, 2019
  16. CptBork Valued Senior Member

    Messages:
    6,465
    A sequence of outcomes is considered random if there's no apparent pattern which could allow one to anticipate the next outcome in the sequence. It's not necessary for the outcomes to be equally likely or occur with equal frequency, although that happens to be the case for each of the microstates associated with a single given macrostate in thermodynamics. Obviously in many cases, the apparent randomness has an underlying pattern as in a standard computer random number generator, but without the insider info such sequences are generally indistinguishable from true randomness.
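
    For instance (a minimal sketch in Python): with the insider info, i.e. the seed, the "random" sequence is completely reproducible.

        import random

        rng1 = random.Random(42)  # the seed is the insider info
        rng2 = random.Random(42)
        print([rng1.randint(0, 1) for _ in range(10)])
        print([rng2.randint(0, 1) for _ in range(10)])  # identical sequence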
     
  17. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Right, in which case random means unpredictable, or chaotic.

    There's a way to connect thermodynamics to random strings, which involves Maxwell's demon. Can the demon distinguish molecules of gas from each other and so predict which ones will go through the trapdoor, if the demon opens it at the right time? The answer appears to be no, because any kind of demon will 'see' only a background in thermal equilibrium--a photon gas which is monochromatic.

    The demon can't "write down" a string with any kind of order that will describe the system of particles; 'any string' must therefore be one of a large number of random strings at any time.
    Which is to say, any random string is as good a description of the system as any other.
     
    Last edited: Aug 7, 2019
  18. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Ahem. So the connection between information entropy and thermodynamic entropy is obvious in the above context: in thermal equilibrium a gas is in one of a large number of equally probable states; for large enough binary strings, the number of random strings is also large. If a string is chosen from the set of all strings of the same (very large) length, it's much more probable that the string is random (and not compressible) than that it has some order.
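
    The counting argument behind that claim is short (a standard bound, sketched here): there are at most \(2^{N-c+1}-1\) binary descriptions of length \(\le N-c\), so at most that many strings of length \(N\) can be compressed by \(c\) or more bits.

        # Upper bound on the fraction of length-N strings compressible by >= c bits.
        N = 100
        for c in (10, 20, 30):
            fraction = (2 ** (N - c + 1) - 1) / 2 ** N
            print(f"compressible by {c} bits: fraction < {fraction:.1e}")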
     
  19. CptBork Valued Senior Member

    Messages:
    6,465
    I believe it's actually spelled "daemon" in this case, as in a mythical gate guardian. Anyhow, it is in fact possible to distinguish different microstates of a system, but in the process of making this measurement and forcing a specific microstate to be selected, you're taking the system out of thermal equilibrium with its environment. It's impossible to derive the laws of thermodynamics/thermostatistics from the postulates of classical mechanics, since those postulates require that isolated systems can transition from high to low entropies just as easily as they can from low to high. They can however be derived from quantum mechanics via the Feynman path integral formulation, which requires that a system in thermal equilibrium exists in a superposition of all its possible microstates. You can collapse this superposition and observe the system or part of the system in specific microstates, but the system will become entangled with the environment and return to a superposition once equilibrium is reestablished.

    The actual issue with Maxwell's daemon is a question about whether it's possible to selectively manipulate a closed system at the individual particle level in order to reduce its overall entropy. The answer is in fact no, because the daemon itself would be subject to thermal fluctuations and must therefore experience an equal or greater entropy increase for any decrease it produces in the system.

    So to summarize, you can't specify a microstate for a system when it's in thermal equilibrium with its environment, and while you could use some summarizing info to specify one of the possible microstates as a representative of the overall macrostate, the simplest description with the fewest parameters involves describing the macrostate itself in terms of macroscopically measurable quantities.
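
    A related quantitative statement (Landauer's principle, a standard result not spelled out above): erasing one bit of the daemon's memory at temperature \(T\) dissipates at least \(k_B T\ln 2\) of heat, which at \(T = 300\ \mathrm{K}\) is about \(2.9\times 10^{-21}\ \mathrm{J}\) per bit; that is one way of making the daemon's entropy cost precise.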
     
  20. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    That's a more accurate way to exorcise this daemon, I guess.

    But my description of thermal equilibrium as a monochromatic gas (of photons) still stands; indeed, if these are infrared photons then any device that might correspond to the postulated daemon will also be at the equilibrium temperature. Because of that, any device we could reasonably construct to selectively manipulate the closed system can't, as I said, "write down" any useful information about the system except for a random string.

    This random string can be any of a large number of random strings. This suggests that, if we view the closed system as a generator of random strings, it never generates strings with any order at all, so its output is always algorithmically incompressible. I like that connection between thermodynamics (information about a particle, or any particle) and binary strings, for some reason.

    There's a distinction between the set of all strings of length k, some of which have order (defined algorithmically again), and the thermodynamics of a system of 'free' particles, which never has order at equilibrium. Physicists do say that every microstate of an equilibrium system is equally probable, including microstates where, say, all the gas molecules occupy the same half of a container, but that is never observed in nature . . .
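
    To put a number on "never happens": if each molecule sits independently on either side, the probability that all \(N\) molecules occupy one specified half is \(2^{-N}\), already about \(8\times 10^{-31}\) for \(N = 100\) and beyond astronomically small for a mole of gas.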
     
