# Entropy in everyday life

Well, is disorder "needed" for order? I suppose so, in the sense that the concept of order needs to be defined relative to its opposite.
To my way of thinking, anyway, it would seem difficult to define order without also defining disorder, and perhaps vice versa. Both are necessary for balance / equilibrium. What are your thoughts on that?

> To my way of thinking, anyway, it would seem difficult to define order without also defining disorder, and perhaps vice versa. Both are necessary for balance / equilibrium. What are your thoughts on that?
It isn't a dichotomy. It's a continuum.

> To my way of thinking, anyway, it would seem difficult to define order without also defining disorder, and perhaps vice versa. Both are necessary for balance / equilibrium. What are your thoughts on that?
I see no reason why a system can't be perfectly ordered, with zero entropy.

It might be simpler to think of a concrete example, such as a deck of cards.
When the cards come out of the pack they are in a specific order...

Hang on... that's not right...

What constitutes "order" in a deck of cards is arbitrary. We say it's ordered because we consider the numbers to be important. But that's subjective.
Objectively speaking, A23 is exactly as "ordered" as 3A2.
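The statistical point can be checked with a toy example: under a fair shuffle, every specific arrangement is equally likely, so no particular sequence is objectively privileged. A minimal sketch in Python (the three-card deck is illustrative, not from the discussion):

```python
import itertools
from fractions import Fraction

# A toy three-card "deck": after a fair shuffle, each of the
# 3! = 6 permutations is equally likely.
deck = ("A", "2", "3")
perms = list(itertools.permutations(deck))

p = Fraction(1, len(perms))  # probability of any one arrangement

# The "ordered" sequence A23 and the "scrambled" 3A2 are each
# just one permutation out of six -- exactly as likely.
assert len(perms) == 6
assert perms.count(("A", "2", "3")) == perms.count(("3", "A", "2")) == 1
assert p == Fraction(1, 6)
```

Probability alone cannot distinguish the "ordered" deck from any other; the distinction only appears once we single out certain arrangements as special.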

I'm confused... Is order/entropy defined subjectively?

> Objectively speaking, A23 is exactly as "ordered" as 3A2.
>
> I'm confused... Is order/entropy defined subjectively?
I was taught to think of entropy in terms of energy instead of things. A23 would have the same energy distribution as 3A2.

> I was taught to think of entropy in terms of energy instead of things. A23 would have the same energy distribution as 3A2.
A roomful of mixing gases has entropy. That's just a deck of cards writ large.

> A roomful of mixing gases has entropy. That's just a deck of cards writ large.
All God's chillun got entropy.

I think we have two options - we can look at entropy as a measure of disorder within a system or a measure of the energy dispersal in the system. It can be subjective or objective, depending on what you're assessing. My idea of disorder (using the messy house example) might be different than another person's opinion of disorder.

The popcorn example would be objective.

Am I right/wrong?

> I think we have two options - we can look at entropy as a measure of disorder within a system or a measure of the energy dispersal in the system. It can be subjective or objective, depending on what you're assessing. My idea of disorder (using the messy house example) might be different than another person's opinion of disorder.
>
> The popcorn example would be objective.
>
> Am I right/wrong?
A measure of energy dispersal is the most correct way to think of entropy, to my understanding. There is a link between that and the degree of disorder of the system measured in other ways, e.g. physical arrangement (solid vs. liquid, etc.).
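The link between energy dispersal and disorder can be made precise with Boltzmann's relation (a standard textbook result, not quoted from anyone in the thread):

```latex
S = k_B \ln W
```

Here $W$ is the number of microstates consistent with the macroscopic state. A state whose energy is dispersed over many particles and positions corresponds to more microstates (larger $W$) and hence higher entropy $S$; a solid, with its constrained arrangement, has fewer accessible microstates than the corresponding liquid.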

> A measure of energy dispersal is the most correct way to think of entropy, to my understanding.

Picked from my thought bubble collection

Could / should it be the other way around?

Currently we think of what we have, Universe-wise, as being orderly.

But everything is settling toward a heat level that will be the same throughout the Universe.

To me that would be more orderly than a Universe with hot and cold spots scattered throughout.

As we sometimes point out to creationists, a stack of bricks is more orderly than a house.

> A measure of energy dispersal is the most correct way to think of entropy, to my understanding. There is a link between that and the degree of disorder of the system measured in other ways, e.g. physical arrangement (solid vs. liquid, etc.).

So, if entropy is basically the measurement of the energy dispersal, how is it (also) a measurement of the uncertainty of a system? -Or- Do we assume that energy dispersal automatically leads to uncertainty? We've discussed that it's a measure of the amount of energy unavailable to do work, would that equate to the uncertainty (disorder)? That is what confuses me, I think.
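The two pictures connect through the Gibbs/Shannon formula: entropy measures the uncertainty about which microstate a system is in, and dispersing energy over more states increases that uncertainty. A rough sketch in Python (the probabilities are illustrative, not a physical model):

```python
import math

def shannon_entropy(probs):
    """Uncertainty in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# All energy concentrated in one state: no uncertainty about where it is.
assert shannon_entropy([1.0]) == 0.0

# Energy dispersed evenly over four states: maximum uncertainty (2 bits).
assert shannon_entropy([0.25] * 4) == 2.0

# Partial dispersal sits in between.
assert 0.0 < shannon_entropy([0.7, 0.1, 0.1, 0.1]) < 2.0
```

So "energy dispersal" and "uncertainty" are not two separate definitions: the more ways the energy could be distributed, the less certain we are about which particular way it actually is.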

exchemist said:
> A measure of energy dispersal is the most correct way to think of entropy, to my understanding. There is a link between that and the degree of disorder of the system measured in other ways, e.g. physical arrangement (solid vs. liquid, etc.).

Absolutely

But you cannot regain the energy expended doing so.

> So, if entropy is basically the measurement of the energy dispersal, how is it (also) a measurement of the uncertainty of a system? -Or- Do we assume that energy dispersal automatically leads to uncertainty? We've discussed that it's a measure of the amount of energy unavailable to do work, would that equate to the uncertainty (disorder)? That is what confuses me, I think.

There is no uncertainty. That is a mathematical concept.

From the three-dimensional physical Universe's perspective, all energy states have always existed, hence the "uncertainty" becomes certain.

There is a cycle of all energy states. All energy states are part of the cycle.

From the extreme heat energy of the galaxies and quasars, to the extreme cold of the Cosmic Web (which we can't see, because it gives off no electromagnetic wave energy, no light waves).

They have all existed, together, at all moments, for infinity.

what?? ^^

Entropy is a measure of uncertainty.

> what?? ^^
>
> Entropy is a measure of uncertainty.

Because of what the physical is doing .

> what?? ^^
>
> Entropy is a measure of uncertainty.
Do not take this bait. It will only end in you pulling your hair out.

> What constitutes "order" in a deck of cards is arbitrary. We say it's ordered because we consider the numbers to be important. But that's subjective.
> Objectively speaking, A23 is exactly as "ordered" as 3A2.
>
> I'm confused... Is order/entropy defined subjectively?
Suppose you know the order of a deck of cards, then the cards are shuffled and placed in a deck face down. The known order and the shuffled one have an entropy.

What you know is subjective--you know an initial ordering and that there is a new ordering, which you have no information about.
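This can be made quantitative by treating entropy as missing information about the arrangement: a known order leaves nothing to learn, while a fairly shuffled deck could be in any of 52! arrangements. A small Python check (information-theoretic units, bits):

```python
import math

# Missing information about a face-down deck, in bits.
# Known order: only 1 arrangement is possible, so log2(1) = 0 bits.
known_bits = math.log2(1)

# Fairly shuffled: all 52! arrangements are equally likely.
shuffled_bits = math.log2(math.factorial(52))

assert known_bits == 0.0
assert 225 < shuffled_bits < 226  # roughly 225.6 bits of uncertainty
```

The deck itself is the same physical object in both cases; what changes is the observer's information about it, which is exactly the subjective element being discussed.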

> Suppose you know the order of a deck of cards, then the cards are shuffled and placed in a deck face down. The known order and the shuffled one have an entropy.
>
> What you know is subjective--you know an initial ordering and that there is a new ordering, which you have no information about.
Right. Which suggests entropy (at least this version of it) is arbitrary.

I could
1] take an ordered deck of cards,
2] declare it in perfect order (because today I happen to like sequential human numbers, and also happen to like spades more than hearts),
3] shuffle it
4] calculate the increase in entropy of the new state
and then
5] declare that the new state is in perfect order (because it spells out my birthday) and that entropy is magically zero again.

Sorry, "entropy is arbitrary" is not the proper phrase; the proper phrase would be: entropy is dependent on the property (or properties) of interest.

> Right. Which suggests entropy (at least this version of it) is arbitrary.
Well, many people are saying, there can be only one version.
> I could
> 1] take an ordered deck of cards,
> 2] declare it in perfect order (because today I happen to like sequential human numbers, and also happen to like spades more than hearts),
> 3] shuffle it
> 4] calculate the increase in entropy of the new state
> and then
> 5] declare that the new state is in perfect order (because it spells out my birthday) and that entropy is magically zero again.

Yes, that's acceptable except the part that says "entropy is magically zero", because the entropy in 4] is 'between' the cards in 1] and in 3].

It's what happens at 3] that is actually the more interesting part. Here is where you now, before you go to 4] and look at the order of the deck, expect to see some 'randomness'. You don't expect that after shuffling a deck of cards they are in the same order; you don't expect to see a repeating pattern either.

In information theory, a message with unexpected information has more "content" than one with expected information. It's a bit back-to-front.
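The "back-to-front" point can be stated numerically: in information theory, the information content (surprisal) of an event with probability p is -log2(p), so rarer, less expected messages carry more bits. A minimal illustration in Python:

```python
import math

def surprisal_bits(p):
    """Information content of an event with probability p, in bits."""
    return -math.log2(p)

# A likely, expected message carries little information...
assert surprisal_bits(1 / 2) == 1.0
# ...while a very unexpected one carries much more.
assert surprisal_bits(1 / 1024) == 10.0
```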

> Well, many people are saying, there can be only one version.
>
> Yes, that's acceptable except the part that says "entropy is magically zero", because the entropy in 4] is 'between' the cards in 1] and in 3].
>
> It's what happens at 3] that is actually the more interesting part. Here is where you now, before you go to 4] and look at the order of the deck, expect to see some 'randomness'. You don't expect that after shuffling a deck of cards they are in the same order; you don't expect to see a repeating pattern either.
Except that randomness is arbitrary too.

For all we know, the 52 cards now happen to count out the first 52 digits of pi ... "in perfect order".

> In information theory, a message with unexpected information has more "content" than one with expected information. It's a bit back-to-front.
Agree (a conclusion I reached when young, before I had ever even heard of information theory).
But how does it inform this topic?
One man's randomness is another man's information.

Hmm. The "order" will never "return", no matter how many times we shuffle a deck of cards, though. So, a "new state" would never be in perfect order.