What is entropy and what is information?

Discussion in 'Physics & Math' started by arfa brane, Jul 28, 2017.

  1. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Well, there's a problem. The amount of information about a system isn't a measure of entropy, because entropy isn't information. If you had complete information about all the particles in a system, it would have zero entropy.

    Saying "entropy represents the amount of information . . ." isn't an objective statement.
    I'm claiming that information is necessarily a physical thing, whereas entropy isn't, because it's based on probabilities, and a probability isn't an intrinsic property of physical things. It just isn't.
    I think probabilities are determinable, but is probability objective?
     
  3. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    en·tro·py
    \ˈen-trə-pē\
    noun
    • 1: a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder, that is a property of the system's state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system; broadly: the degree of disorder or uncertainty in a system
    • 2 a: the degradation of the matter and energy in the universe to an ultimate state of inert uniformity
      b: a process of degradation or running down or a trend to disorder
    • 3: chaos, disorganization, randomness
    Merriam-Webster

     
  5. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Entropy is defined as \( S = k \ln P \), where P is the thermodynamic probability (in effect, the number of microstates) and k is the Boltzmann constant. Equilibrium means P is maximum.
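
    To put numbers on that, here's a minimal sketch in Python (the particle-splitting example is mine, for illustration; it isn't from the thread):

        import math

        k_B = 1.380649e-23  # Boltzmann constant, J/K

        def boltzmann_entropy(W):
            """S = k ln W, where W is the number of microstates
            (the 'thermodynamic probability' P above)."""
            return k_B * math.log(W)

        # Four distinguishable particles split between two halves of a box:
        # all on one side (W = 1) versus evenly split (W = C(4, 2) = 6).
        print(boltzmann_entropy(1))  # 0.0 J/K -- a single microstate, zero entropy
        print(boltzmann_entropy(6))  # ~2.47e-23 J/K -- the even split; W is maximal

    The even split is the equilibrium-like case: it maximises W, and hence S.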

    The Merriam-Webster dictionary isn't a great place to learn about what entropy is. The first definition is OK, but 2b isn't very accurate; 3 is OK too.
    But "OK" doesn't cut it
     
  7. exchemist Valued Senior Member

    Messages:
    12,518
    This is going nowhere. I give up.
     
  8. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    My "problem" with entropy is the sort of decision process. Who decides?

    Consider a Rubik's cube arrangement. With all colours in order on the faces, the starting state is 100% organised. Randomly twist the faces until the colours are mixed and disorganised. As I understand it, there is a minimum number of moves available to return the cube to 100% organised

    So what is the % of entropy for someone able to do it in the minimum number of moves? What is the % of entropy for me, who cannot solve the cube? I guess the % of entropy for my cube remains the same no matter how much I twist it

    Another example, starting from the same arrangement as above. This time remove one of the 26 visible cubies. (I can solve this one.)

    If you have removed a corner cube which shows 3 colours, is that MORE disorganised than removing an edge cube showing only 2 colours?

    And would that be MORE disorganised than removing a face cube showing only 1 colour?

    My lounge is what could easily be considered a mess. However I can operate in it and know where everything is. So for me it is organised but messy.

    ME - 100% organised
    Others - has ? % entropy

    Coffee time

     
  9. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Thus demonstrating that entropy is in the eye of the beholder.

    Some people are very good at solving a scrambled cube, but that's a deterministic process. The entropy could be quantified as what isn't known about how the cube got into a particular state, but applying a deterministic algorithm means how it got there is irrelevant.

    In other words, it depends on what 'experiments' you do as to how entropy is defined (again, by you). To illustrate that with a different example: a coin has two states, heads and tails, but you can extend the number of states by, say, marking one or both sides of the coin with an arrow. The direction this arrow points gives your coin more degrees of freedom. You can now decide that this direction can only point to one of four quadrants, or to one side of a hexagon, or whatever.

    As to entropy being in the eye of the beholder, suppose you have a two-headed coin. You ask someone who doesn't know this to choose heads or tails.
    This someone 'has' an entropy of \( \log 2 \), the same as for a fair coin; the entropy is not something the coin 'has'.
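
    In Shannon's terms that looks like this (a sketch; the point is that the probabilities below are the observer's assignments, not the coin's properties):

        import math

        def shannon_entropy(probs):
            """H = sum of -p log2 p, in bits, over an observer's distribution."""
            return sum(-p * math.log2(p) for p in probs if p > 0)

        print(shannon_entropy([0.5, 0.5]))   # fair coin: 1 bit
        print(shannon_entropy([0.25] * 4))   # arrow quantised to 4 quadrants: 2 bits
        # The two-headed coin, for someone who doesn't know it's two-headed:
        print(shannon_entropy([0.5, 0.5]))   # still 1 bit -- their entropy
        # The same coin, for its owner:
        print(shannon_entropy([1.0]))        # 0.0 bits -- no uncertainty at all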
     
    Last edited: Aug 3, 2017
  10. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Information is therefore also in the eye of the beholder. Information about a coin can be statistical, but a coin collector would want information about other things--how old it is, how worn it is, and so on.

    You could spend as much time as you want writing down all the properties of a single used coin, all the marks and where they are, or just take pictures. Information is what we define it to be. The entropy involved in any definition of information is something that is also necessarily defined when information is.

    A gas in a sealed container is already defined in an informational sense, by having a constant volume and a constant number of particles. Measuring the temperature is again 'defining' information, but the volume is already a measurement.
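
    For instance (a sketch assuming an ideal gas, which the post doesn't specify; the values of N, V and T are made up): once N and V are fixed by the container, a single temperature reading pins down the pressure via \( pV = NkT \):

        k_B = 1.380649e-23  # Boltzmann constant, J/K

        N = 1e22    # particle count, fixed when the container was sealed (assumed)
        V = 1e-3    # volume in m^3, fixed by the container (assumed)
        T = 300.0   # the one new measurement, in kelvin (assumed)

        p = N * k_B * T / V  # ideal gas law: the macrostate is now fully specified
        print(f"pressure = {p:.1f} Pa")  # ~41419.5 Pa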
     
  11. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    When you have a gas at equilibrium, the number of particles entering a given small (closed) volume anywhere in the bulk is the same as the number leaving that volume; there is no net flow of particles. But this volume has a surface, and the net flow through this surface is also zero.

    In a container, the particles interact with its walls, and again the number moving towards a small part of the wall is the same as the number leaving, because of the equilibrium condition.

    The interaction with the walls is called pressure; it's the time derivative of momentum transferred, per unit area. You can insert a pressure meter which can 'sample' some of this momentum transfer, given your meter has an area the gas interacts with; again, equilibrium says this must read the same pressure as anywhere in the bulk.
    Those are the kinds of things you can say you know about a container of gas.

    You may know that such a gas has a Maxwell-Boltzmann energy distribution, but this tells you nothing about individual particle velocities. To explore individual velocities you need to do a quite different kind of experiment, which does yield such information.
    So clearly it depends on how a system is defined or bounded both physically and by probabilities as to what information is.
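
    A sketch of that distinction, assuming an ideal monatomic gas (argon is my choice, not the thread's): the distribution lets you draw statistically faithful velocities, but says nothing about which particle actually has which one:

        import math
        import random

        k_B = 1.380649e-23  # Boltzmann constant, J/K
        m = 6.6335e-26      # mass of an argon atom, kg (assumed gas)
        T = 300.0           # temperature, K (assumed)

        def sample_velocity():
            """One velocity from the Maxwell-Boltzmann distribution: each
            Cartesian component is Gaussian with variance kT/m."""
            sigma = math.sqrt(k_B * T / m)
            return tuple(random.gauss(0.0, sigma) for _ in range(3))

        v = sample_velocity()
        print(f"a statistically plausible speed: {math.hypot(*v):.0f} m/s")
        # The distribution is knowable; the velocity of particle #7 is not.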
     
    Last edited: Aug 3, 2017
  12. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Why I got a bit exasperated with exchemist (maybe). He certainly seems to have reciprocated.

    He says "I would agree that the entropy represents the amount of information that would be needed to completely specify the state."
    This is conflating the concepts of an absence of information with . . . information.
    And says:
    " That is a way of saying it is a measure of the complexity of the state, the number of degrees of freedom it has." This is a perfectly reasonable thing to say, however.

    Why does it matter? If someone hands you a scrambled Rubik's cube you have no information about how it was scrambled. This is your entropy, not the cube's.

    As it happens, the cube is in a state that may be one of many similar states in the same vertex of a graph, which is a tree (this is a real analytical result!). You have no idea which vertex the cube is in, but you know the 'initial' state is at most 20 moves away--exactly 20 (God's number) if the vertex is extremal, fewer if not. Any moves beyond that represent an uncertainty--or an unknown--so it's another entropy, but connected to your solving algorithm.
    The first is the 'scrambling' entropy: that of a randomly generated word in the move algebra, produced by a process with no memory.

    So guess what starting to solve a cube does to the entropies?
     
    Last edited: Aug 4, 2017
  13. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    In case someone thinks, "that's all very well, but a Rubik's cube or some coins aren't the same as a gas in thermodynamic equilibrium, information and its entropy aren't the same as the gas and its entropy."

    Maybe; but thermodynamic entropy does include the Boltzmann constant, and that surely has nothing to do with systems in which temperature is not relevant.

    Well, you can easily change the algorithmic entropies for a Rubik's cube: the scrambling algorithm could remember what it does, in which case the solution is trivial (and physics is no fun anymore! Sad). You could make the solution algorithm, assuming you need one because there is an entropy, more efficient, so it requires the minimum number of moves.

    But with a closed volume of gas, the states of individual particles aren't just something that isn't known; they're something that can't be known even in principle, because of equilibrium. The exorcism of Maxwell's demon from thermodynamics depends on this lack of information about particle states (the demon's entropy), and on the fact that storing (and, because of finite resources, eventually erasing) information about the particles would require more energy than the gas can have.
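
    The usual quantitative form of that exorcism is Landauer's bound: erasing one bit dissipates at least \( kT \ln 2 \). A back-of-envelope sketch (the particle count and the bits recorded per particle are assumed, purely for illustration):

        import math

        k_B = 1.380649e-23  # Boltzmann constant, J/K
        T = 300.0           # temperature, K (assumed)
        N = 1e20            # particles (assumed)

        erase_one_bit = k_B * T * math.log(2)  # Landauer limit, ~2.87e-21 J
        bits_per_particle = 100                # assumed precision of the record

        demon_bill = N * bits_per_particle * erase_one_bit
        gas_energy = 1.5 * N * k_B * T         # monatomic ideal gas: (3/2) N k T

        print(f"erasure cost: {demon_bill:.2e} J")  # ~2.87e+01 J
        print(f"gas energy  : {gas_energy:.2e} J")  # ~6.21e-01 J
        # Bookkeeping the particle states costs far more energy than the gas has.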
     
  14. The God Valued Senior Member

    Messages:
    3,546
    This "information" only Hawkins knows. It made him a superstar.

    My take is: the first law gives you a mathematical state function called internal energy (U), the second law gives you another state function called entropy (S), and combining these with temperature gives you one more state function, the Gibbs free energy (G); of course T is always there with the zeroth law. So we have to have 'GUTS' to understand a system in totality.
     
  15. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    Can we presume that you mean Hawking?

    The bizarre thing is that when I Google "Hawkins" to see if there is anybody famous named Hawkins, Google shows me Hawking. I have to deliberately filter out Hawking to find out that James Stewart once played a TV detective named Hawkins. Maybe that's who you mean.

    Google seems to have taken entropy in information to heart.
     
  16. The God Valued Senior Member

    Messages:
    3,546
    No need to presume, it is what I meant.
     
  17. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    So, a Rubik's cube is a handy example precisely because it's algorithmic--it's easy to show that scrambling the cube, randomly or otherwise, is an algorithm, and that solving it (deterministically or otherwise) is too:

    If you have a solved cube, that's a state you choose to be a solution. Following a single 'move', how uncertain is the number of moves to the solved state? After two or three moves, what does the uncertainty 'look like'? And so on.

    A solution algorithm doesn't look at how many moves it took to get to whatever state the cube is in, but how many it will take (an unknown, but bounded number) to get to the chosen 'initial' state.

    Since temperature and pressure aren't part of the space the cube and its moves, or operations (permutations of positions and orientations), 'live in', there isn't any thermodynamic entropy. However, applying a solution algorithm to a scrambled state faces the same 'unknowables' about the permutations--the information has been irreversibly destroyed or dissipated (i.e. erased). Something you can't know, or can't find out easily (tracing a scrambling algorithm backwards requires phenomenal amounts of computing resources, and might take centuries), is a single class of abstraction.
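
    A sketch of such a memoryless scrambler (standard face-turn notation; the 25-move length is arbitrary): each move is drawn independently, so a k-move scramble carries \( k \log_2 18 \) bits of uncertainty for anyone who didn't watch it happen:

        import math
        import random

        # 6 faces x 3 turn types (quarter, half, reverse quarter) = 18 moves
        MOVES = [face + turn for face in "UDLRFB" for turn in ("", "2", "'")]

        def scramble(k):
            """A random word of k moves, generated with no memory."""
            return [random.choice(MOVES) for _ in range(k)]

        word = scramble(25)
        print(" ".join(word))
        print(f"{25 * math.log2(len(MOVES)):.1f} bits")  # ~104.2 bits
        # The scrambler knows the word exactly (zero entropy for it); an
        # observer knows only the alphabet and length -- that's their entropy.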

    So when a gas of n particles is at equilibrium, this is equivalent to erasing information as above . . . (or, that's my theory anyway, I hope there isn't too much wrong with it).
     
    Last edited: Aug 6, 2017
  18. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Why a gas in a sealed container is not like a store of information.

    As I think I've pointed out, thermodynamic entropy can't be objectively characterized as "information that represents what isn't known", because it can't be known.
    This is different to a store of information (say, all the books in all the libraries in your city), because a store can in principle be "read" by a receiver of information, or observer. It might take you a long time to read the whole store, which means the store must stay coherent, or stable, for as long as that takes. But that's an erasure problem--a store, any kind of store, will dissipate; errors will occur randomly, requiring correction.
    (Books get damaged over time by their borrowers, eventually they have to be replaced by new copies)

    Without correction of the random errors in the store, the information will eventually be irreversibly transformed. But that's entropy: something that can't be known, yet in a bounded way. Of course any system like a store of information has to be bounded, and so the entropy is too, in a natural way.
    The visible universe must therefore have a finite 'store' of information in it, because the invisible part (including the future expansion) will never be in this store.
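
    A toy version of such a store (the flip rate is assumed): random bit flips degrade the record, and a majority-vote repetition code spends redundancy on correcting them:

        import random

        def noisy_copy(bits, flip_prob=0.05):
            """Each stored bit flips independently with probability flip_prob."""
            return [b ^ (random.random() < flip_prob) for b in bits]

        def store_with_repetition(bits, copies=3):
            return [b for b in bits for _ in range(copies)]

        def read_with_vote(stored, copies=3):
            """Majority vote across the copies corrects isolated flips."""
            return [int(sum(stored[i:i + copies]) > copies // 2)
                    for i in range(0, len(stored), copies)]

        message = [random.randint(0, 1) for _ in range(1000)]
        raw = noisy_copy(message)
        corrected = read_with_vote(noisy_copy(store_with_repetition(message)))

        print(sum(a != b for a, b in zip(message, raw)))        # ~50 errors
        print(sum(a != b for a, b in zip(message, corrected)))  # ~7 errors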

    Any kind of device we can reasonably call a computer must have finite resources, a bound on information (algorithmic complexity), and on entropy. Everything is local . . .

    The particles of gas are bounded too: by a fixed volume (locality!) and by the total amount of local energy, and they 'output' a pressure and a temperature correspondingly. The individual particle energies are an empty store; although this store has a known "size", it can never contain any information. The Boltzmann constant is there because of the bounds.
     
    Last edited: Aug 9, 2017
  19. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Apart from the non-sequitur (entropy is not information), exchemist does a reasonable job of describing entropy.

    But to hammer home the fact that physical objects don't have any entropy, consider that I can show you a scrambled Rubik's cube: you have no idea what I did to it, but I do; I know exactly the sequence of 'moves' I made.

    Where is the entropy? It's in the uncertainty you have about the permutation distance from the solved state to the state I show you. Since there are about 43 quintillion states (the count is sketched below), it's unlikely you will be able to search them with a computer (it would need to be pretty big, too; maybe the size of a planet). The cube has no entropy, and nor do I (my state of knowledge is complete).
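
    For the record, the standard counting argument behind that figure:

        from math import factorial

        corners = factorial(8) * 3 ** 7   # corner positions x orientations (last twist forced)
        edges = factorial(12) * 2 ** 11   # edge positions x flips (last flip forced)
        states = corners * edges // 2     # permutation parity ties corners to edges

        print(states)           # 43252003274489856000
        print(f"{states:.2e}")  # ~4.33e+19 -- about 43 quintillion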
     

    Last edited: Aug 11, 2017
  20. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    While I understand what you are trying to say, I would like to bring up 2 points.

    What if I consider a scrambled arrangement to be my resting state, with no entropy?

    Regarding solving, this link might help:

    The maximum number of face turns needed to solve any instance of the Rubik's Cube is 20, and the maximum number of quarter turns is 26. These numbers are also the diameters of the corresponding Cayley graphs of the Rubik's Cube group.

    https://en.m.wikipedia.org/wiki/Optimal_solutions_for_Rubik's_Cube

     
  21. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    What does that tell you though?
    It says you can be no further from your chosen 'initial' state than 20 face turns, equivalently 26 quarter turns. Not that helpful if you're an observer who doesn't know the actual sequence--the information has been erased from your 'context'.
     
  22. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    To be honest, although I brought up the Rubik's cube in the 'delete a dimension' thread, I'm not sure it fits this thread.

    Although there are a gazillion possible arrangements, nothing is added to or taken from the cube, hence any one of the gazillion arrangements can be the resting state.

    Entropy needs the object to move to a lower energy level

    To reverse the entropy requires energy to be pumped back into the object

    In doing that, you cause the object you are pumping the energy from to undergo entropy.

    Since the Universe is expanding, the moment will come when energy will be exactly evenly spread throughout the Universe.

    There will be no ability to pump energy up, and energy will have no ability to fall further.

    Eventually even the energy binding atoms will escape into the surrounding nothingness

    Bye bye Universe

     
  23. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    But remember, you're the one defining energy: where it is and how much. Energy in the Rubik's cube consists of the noise and the slight rise in temperature when you rotate a part; it's entirely frictional. It represents the energy of a write operation, I guess.

    Since the noise and the heat have no impact on the algorithmic entropy (you can model the cube on a computer), it isn't defined (because that's up to you).
     
