I'm asking this question to clarify my own understanding of science. Is entropy, represented in math by S, the number of ways a system can change? If not, what is it? Thanks in advance.
No. Roughly speaking, it's a measure of how disordered the system is. For example, imagine a 1 x 1 x 1 metre box with gas in it. If all the gas is confined to a 1 x 1 x 1 centimetre cube in the front top left corner of the box, then the gas will be in a state of lower entropy than if it is spread evenly throughout the box.
I have spent months reading various laws and theories. I am more confident in understanding them than I was before reading, though still not 100%. Having said that, my understanding is: not so much the "number of ways of change" as change that leads to disorder. The typical definitions (on the internet) usually refer to a "closed system", without external influences.

I think I get all that: what we observe changes, and when change occurs it will usually move from orderly to disorderly. I usually think of this tree example I read a while back. The tree in my backyard grows, giving presence to leaves. All orderly, while within the tree's structured system. Then one by one they fall to the ground, moving to a mostly disorderly state. Then I go outside and rake them into a pile, an orderly pile. I walk away to have a drink and the wind scatters the pile (I should have mulched them with the mower, lol). The pile again ends up in a disorderly condition, scattered about. With enough time, and no external input (from me or anyone intentionally putting in work or energy), they scatter more over time, even degrading as well.

The important thing to me is that the leaves move from orderly to disorderly, not the reverse: they never go back into a pile or ascend up to their original branches. Not sure if I answered well here or not. I appreciate you bringing this subject up. It allowed me to try and "walk through" my own thoughts on the issue. To me, that is a "good day" in my book!
Lower entropy? I would have thought a denser state was a higher entropy, or do you mean lower as in geometrical positioning inside the box? Would the gas spread out isotropically mean M/V = p, where M is mass, V is volume and p is density? Would F^3 force E = M^3, i.e. plasma? Because UF^3 => U, where U is internal energy.
Shouldn't you be scheduling a doctor's appointment? Entropy is disorder. Lower entropy would be less disorder, or closer to the starting state. In the example, as the gas spreads out, its entropy (disorder) increases.
Well, the voices also tell me science. Einstein told me about energy and how to make plasma work. P.S. He just said energy is proportional to the force cubed... seriously, things keep popping into my head that I shouldn't know. Perhaps I'd better go to the docs.
I like your analogy with the leaves. To be a bit more strictly accurate, entropy is a measure of the dissipation of energy. So if you imagine each leaf carries a bit of energy, which tends to get spread all around your garden over time, that's not far off what an increase in entropy is. You can see perhaps that if heat is more spread out, its temperature will be lower. Entropy has units of energy divided by temperature, e.g. J/K, so you can see that the lower the temperature of a given quantity of heat, the higher the entropy, indicating the heat is more dissipated (and less able to do mechanical work). In James's example in post 2, if you expand a gas its temperature drops, so the heat is more spread out and the entropy is higher.
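To put a number on the box example from post 2: for an ideal gas spreading freely from a small volume into a larger one, the standard textbook result is ΔS = nR ln(V2/V1). A minimal sketch (the one-mole amount and the function name are my own illustrative choices, not from the thread):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def expansion_entropy(n_moles, v_initial, v_final):
    """Entropy change for free expansion of an ideal gas:
    dS = n * R * ln(V2 / V1). Volumes in any consistent units."""
    return n_moles * R * math.log(v_final / v_initial)

# Post 2's example: 1 mol of gas spreading from a 1 cm^3 corner
# into the full 1 m^3 box (= 1e6 cm^3).
dS = expansion_entropy(1.0, 1.0, 1e6)
print(f"dS = {dS:.1f} J/K")  # positive: entropy rises as the gas spreads out
```

The sign is the point: spreading out always gives ΔS > 0, matching the "lower entropy in the corner" description above.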
Entropy increase need not always involve net energy dissipation or net change: https://en.wikipedia.org/wiki/Entropy_of_mixing The more fundamental measure of entropy is the number of possible (micro)states in a system, more particularly the logarithm of that number.
As an ex-chemist you will know this could expand to include Gibbs free energy as it relates to enthalpy and entropy e.g. https://socratic.org/questions/how-is-gibbs-free-energy-related-to-enthalpy-and-entropy Then there's equilibrium vs non-equilibrium thermodynamics - particularly how turbulence complicates otherwise 'ideal' processes. But why only encourage more wordplay?
Well yes but introducing free energy would I think really be an uncalled for excursion into chemical thermodynamics. As for non-equilibrium thermodynamics, that is rather out of my league.
Entropy means that energy, in any form, decreases, and never increases or stays the same. In entropy there is no balance. In other words, in duration or time (as most are more comfortable with), the Universe becomes non-existent.
Entropy is disorder, but only when "order" is first defined. And this definition is, of course, completely arbitrary. Say you want to define a gas as being in an ordered state; then it becomes less ordered if the volume increases and the gas particles have more freedom to move around. Or say you define a wine glass as an ordered state; then a disordered state can be created by smashing the glass. And so on.
Moderator note: river has been excluded from posting to our Science subforums for a period of 2 years. river has had two previous exclusions applied, but we have observed no change in his posting to the Science subforums. This ban is applied in accordance with our published policy on exclusion from the Science sections. Members who are in doubt as to what is required when posting in the Science sections should review our site posting guidelines and the exclusion policy before posting.
Not the way that entropy is defined in physics. A completely arbitrary definition wouldn't be very useful, would it?
Order is, in fact, completely arbitrarily chosen by us. For instance, a block of salt can be said to be in an ordered state, and dissolving the block in water will then mean the salt is disordered, but relative to the (arbitrarily chosen) ordered state. Entropy isn't really disorder, is it (surely you know this)? Defining order means you then have a definition of disorder; this is really just a convenient way to think about entropy. Entropy is really about how many different states some system of particles can be in, but who or what defines these states?