# Entropy in everyday life

Yeah. We're sort of drifting into information entropy, as distinct from thermodynamic entropy.

I used cards as a concrete example, since it's easy to see an order to them - like having 52 molecules of gas in a room.

Yep, and it was very helpful.

So, going back to the shuffling of cards - the deck becomes disordered because the shuffler made it so. The shuffler’s energy increases to create the disordered deck of cards. How does an increase in entropy show itself?

If I decide that the information I'm interested in is the numerical order as printed on the cards, then a fresh deck contains zero information. Before even opening the package I know exactly where every single card will be. I think you'll agree with this.

If, on the other hand, I decide that the information I'm interested in is the weight of each card, then a fresh deck contains at least some information. I do not know exactly where every single card will be in a fresh deck. I could not predict where every card (by increasing weight) will be in the fresh deck. I would have to measure them, thereby recording information.

The point is: by changing what I - a human observer - decide is the most ordered state, the value of the information entropy changes. So, in this sense, the amount of entropy is not an objective property of a system; it depends on what I - subjectively, arbitrarily and capriciously - decide at any given time is of interest to me.
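That dependence on the chosen observable can be made concrete with a small sketch. This is only illustrative: the "order printed on the cards" and "order by per-card weight" observables come from the posts above, and the uniform-prior assumption over all 52! weight orderings is mine.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Observable 1: the printed order of a fresh deck.
# There is only one possible outcome, with probability 1.
print(shannon_entropy([1.0]))  # 0.0 - a fresh deck carries no information

# Observable 2: the ordering by per-card weight, unknown before measuring.
# If all 52! orderings are a priori equally likely, the entropy of a
# uniform distribution over N outcomes is simply log2(N).
unknown_by_weight = math.log2(math.factorial(52))
print(round(unknown_by_weight, 1))  # about 225.6 bits
```

Same deck, two different entropies, purely because the observer picked a different "interesting" property.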

I'd quibble about how much information is in a fresh deck, given the choice of "what information". You need, as I say, some fixed "information", whatever you decide that is. It could be the volume of a gas bottle, for instance.

But yes, generally we decide what we're interested in, and so what information is. Except that doesn't seem to apply to quantum information which does seem to be an objective thing, somehow.

Lol. Oh, and to make matters worse, I posted this thread in the “Philosophy” subforum.

And who gets the benefit of being able to weigh cards? The person holding the cards and seeing their value, or the person across the table who cannot hold your cards? The dealer?

This makes no sense at all. A visual clue?... OK. Weight in milligrams?...Naaah.
Nobody is sitting at a table playing cards.

You're still drunk. Stop polluting the thread.

Seems you started this subject of weighty playing cards. Now you demand I start a new thread on a subject you introduced?
Galileo's ball drop experiment is one of the three irrelevant references you made.

It's quite clear why
two cards with differing masses but the same dimensions
is not the same as
two balls of differing masses and differing dimensions.

Because I like you, I will spell it out for you:

I'm sorry you're having trouble with the concept, but you don't need to pollute this thread with your difficulties.


> Nobody is sitting at a table playing cards.
>
> You're still drunk. Stop polluting the thread.

> Galileo's ball drop experiment is one of the three irrelevant references you made.
>
> It's quite clear why
> two cards with differing masses but the same dimensions
> is not the same as
> two balls of differing masses and differing dimensions.
(Hint again: surface area and air resistance.) Really, this from a physics grad.

If you ignore the effects of air resistance, all objects accelerate at the same rate. Air resistance is a complicated force that opposes motion: it depends on the cross-sectional area of the object, its shape, and its velocity, while how much it slows the fall depends on the object's mass. Downward acceleration decreases as velocity increases until the object reaches terminal velocity, the fastest rate of fall for that particular object. This occurs because an increase in velocity means an increase in the force of air resistance.

And you speak of weighing cards? That's just weight and has nothing to do with falling at all.

Of course, for basic physics courses, problems are chosen where air resistance is negligible and can be ignored to simplify them.

I'm sorry you're having trouble with it, but you don't need to pollute this thread with your difficulties.

It is clear you do not understand the law of falling bodies. The difference in rate of fall only becomes obvious at terminal speeds.
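For what it's worth, the mass-versus-drag argument can be put to a quick numerical test. This is only a sketch: the masses and the drag coefficient below are invented illustrative values, and real falling cards tumble, which a one-dimensional model ignores.

```python
# One-dimensional fall with quadratic air drag: m*dv/dt = m*g - c*v^2,
# stepped with simple Euler integration (fine for illustration only).
G = 9.81  # m/s^2

def fall_time(mass, drag_coeff, height, dt=1e-4):
    """Seconds for an object released from rest to fall `height` metres."""
    v = y = t = 0.0
    while y < height:
        a = G - (drag_coeff / mass) * v * v
        v += a * dt
        y += v * dt
        t += dt
    return t

# Two "cards" with the same shape (same drag coefficient) but slightly
# different mass, e.g. from differing amounts of ink (made-up numbers).
t_light = fall_time(mass=0.00170, drag_coeff=0.005, height=2.0)
t_heavy = fall_time(mass=0.00178, drag_coeff=0.005, height=2.0)
print(t_light > t_heavy)  # True: the heavier card lands first, if only just
```

With the drag coefficient set to zero the two fall times come out identical, which is the vacuum (hammer and feather) case.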

All things fall according to the same acceleration formula.
Maybe the most famous scientific experiment is Galileo Galilei's dropping of objects from the Leaning Tower of Pisa in order to show that all objects fall at the same rate, whatever their mass.

Galileo used inclined planes for his experiment to slow the acceleration enough so that the elapsed time could be measured. The ball was allowed to roll a known distance down the ramp, and the time taken for the ball to move the known distance was measured. The time was measured using a water clock.
Galileo showed that the motion on an inclined plane had constant acceleration, dependent only on the angle of the plane and not the mass of the rolling body. Galileo then argued, but couldn’t prove, that free-fall motion behaved in an analogous fashion because it was possible to describe a free-fall motion as an inclined plane motion with an angle of 90°. Using Newton’s laws, we can prove Galileo’s theory by decomposing the gravitational force, acting on the rolling balls, into two vectors, one perpendicular to the inclined plane and one parallel to it. http://www.physics.smu.edu...
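Galileo's result, that the acceleration depends on the ramp angle but not on the mass, can be written down directly from the kinematics. A minimal sketch for a frictionless slide (a rolling solid ball would pick up an extra 5/7 factor from its moment of inertia, which still cancels the mass):

```python
import math

G = 9.81  # m/s^2

def time_down_ramp(length_m, angle_deg):
    """Time to traverse a frictionless ramp from rest.
    From d = (1/2)*a*t^2 with a = g*sin(theta); mass never appears."""
    a = G * math.sin(math.radians(angle_deg))
    return math.sqrt(2 * length_m / a)

shallow = time_down_ramp(2.0, 10)    # slow enough to time with a water clock
steep = time_down_ramp(2.0, 60)
free_fall = time_down_ramp(2.0, 90)  # the 90-degree "ramp" is free fall, a = g
print(shallow > steep > free_fall)   # True
```

The 90-degree case is exactly the decomposition argument in the quote above: tilt the plane to vertical and the parallel component of gravity becomes all of g.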

300 lines, and all of it missing the point. Here's two lines:

This is where your hammer and feather thing does become relevant. Which will hit the ground first?

Hint: it'll be the one with the greater mass in relation to its surface area. Feathers fall slow for a reason.

Guess my question will go ignored.

> Guess my question will go ignored.
Well, that's why I'm trying to stop W4U from polluting the thread. Not only is he wrong, but he devotes vast stretches of pages to being wrong. I'm devoting as little as possible to correcting his misinformation.

I didn't ignore your (latest) question; I'm just not sure I have a useful answer yet.

> Well, that's why I'm trying to stop W4U from polluting the thread. Not only is he wrong, but he devotes vast stretches of pages to being wrong. I'm devoting as little as possible to correcting his misinformation.
>
> I didn't ignore your (latest) question; I'm just not sure I have a useful answer yet.
Okay, thanks.

One thing that is for certain - I'll never look at a deck of cards the same way, again.

> Hint: it'll be the one with the greater mass in relation to its surface area. Feathers fall slow for a reason.
Yes, in the earth's atmosphere. In space a feather falls at the same rate as a hammer. Watch the space clip @ post #104.

> The point is: by changing what I - a human observer - decide is the most ordered state, the value of the information entropy changes. So, in this sense, the amount of entropy is not an objective property of a system; it depends on what I - subjectively, arbitrarily and capriciously - decide at any given time is of interest to me.
OK, you have defined the possible subjective selections. But that only means they are all objectively possible from every perspective, and it is that objective range of possibilities which gives you the opportunity to make your subjective selection in the first place. Superposition.

You do not create them; all universal potentials exist outside of any observer. We can only make a "best guess", and sometimes we can prove our observed best guess right or wrong (proof).

> Yes, in the earth's atmosphere.
Which brings us back to the playing card. The more massive one will fall faster.

That was only 53 posts of derail.

A deck of cards is pretty everyday, so is something breaking when it hits the floor.

Things that break because they aren't strong enough are everyday objects that exhibit randomness. As I say, randomness is a thing you expect to see, in this case when a glass or a china cup or plate hits the floor.

Expectation is one of the key ideas in Shannon's version of information entropy. And as I said earlier, you don't expect the order of the cards to be unchanged after shuffling; what you expect is that the shuffled deck is "randomized". You could throw a deck of cards in the air to shuffle it, too.

When a china coffee mug hits the floor and breaks apart you don't expect to see a pattern (but, is it art?).
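That expectation of randomization can be checked empirically. A small sketch: shuffle a fresh deck many times and estimate the entropy of which card lands on top. A good shuffle should push it close to the maximum, log2(52) ≈ 5.7 bits, whereas an unshuffled deck always yields the same top card (0 bits). The trial count and seed are arbitrary choices.

```python
import math
import random
from collections import Counter

def top_card_entropy(n_trials=20000, seed=1):
    """Empirical Shannon entropy (bits) of the top card after a full shuffle."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_trials):
        deck = list(range(52))   # a "fresh" deck in known order
        rng.shuffle(deck)        # Fisher-Yates shuffle: the "randomizing" step
        counts[deck[0]] += 1
    # H = sum over observed outcomes of p * log2(1/p)
    return sum((c / n_trials) * math.log2(n_trials / c) for c in counts.values())

print(round(top_card_entropy(), 2))  # close to log2(52), about 5.70 bits
```

The sample entropy sits just under the theoretical maximum, which is exactly what "I expect the deck to be randomized" means in Shannon's terms.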

> In my opinion, "order" when speaking of a deck of cards means *I* know with certainty where each card is located. Since the uncertainty increases when we start spreading cards, cutting decks, or shuffling the deck, the order is gone, and I can't see it ever coming back. I don't know if the deck of cards analogy is the best for describing entropy in terms of molecules, atoms, etc., but if someone helps me to understand how order returns to the deck (after shuffling), I'll give you a gold star.
This is an important point. The discussion on the thread has now moved entirely away from thermodynamic entropy to the different, though somewhat related, topic of information entropy. Trying to follow all this about decks of cards will give you no insight at all into how energy is distributed among atoms and molecules.

> Yep, and it was very helpful.
>
> So, going back to the shuffling of cards - the deck becomes disordered because the shuffler made it so. The shuffler’s energy increases to create the disordered deck of cards. How does an increase in entropy show itself?
I don't think it is very helpful to mix thermodynamics with card shuffling. The card shuffling business is to do with information entropy, which is not the same thing as entropy in its original thermodynamic usage. Trying to consider energy changes in the process of card shuffling is not going to get anywhere useful. I can't see how, at any rate.

> Which brings us back to the playing card. The more massive one will fall faster.
>
> That was only 53 posts of derail.
And you are still wrong.

Any difference in rate of fall will only show up after the cards reach terminal speed in the earth's atmosphere. Do you believe the weight of a drop of ink is going to significantly affect the rate of fall of flat planes? It's like betting on falling leaves.

Moreover, are you trying to convince me there is anyone on earth who can tell a difference of milligrams of ink, when you can place the cards on a scale and they will all register approximately the same weight?

> And you are still wrong.
>
> Any difference in rate of fall will only show up after the cards reach terminal speed in the earth's atmosphere. Do you believe the weight of a drop of ink is going to significantly affect the rate of fall of flat planes? It's like betting on falling leaves.
>
> Moreover, are you trying to convince me there is anyone on earth who can tell a difference of milligrams of ink, when you can place the cards on a scale and they will all register approximately the same weight?
Oh do stop derailing this thread with this trivial stuff. It is supposed to be about entropy.

If you want a Year 4 discussion about Galileo and gravitational acceleration, have it on another thread.

> I don't think it is very helpful to mix thermodynamics with card shuffling. The card shuffling business is to do with information entropy, which is not the same thing as entropy in its original thermodynamic usage. Trying to consider energy changes in the process of card shuffling is not going to get anywhere useful. I can't see how, at any rate.
Agree. Not sure how the topic shifted to card shuffling, but the initial examples were that of describing disorder, or at least visualizing what disorder might appear like, using a deck of cards.

> Agree. Not sure how the topic shifted to card shuffling, but the initial examples were that of describing disorder, or at least visualizing what disorder might appear like, using a deck of cards.
This passage from the Wiki article on entropy captures my discomfort about equating the two usages:

"The question of the link between information entropy and thermodynamic entropy is a debated topic. While most authors argue that there is a link between the two,[62][63][64][65][66] a few argue that they have nothing to do with each other.[citation needed] The expressions for the two entropies are similar. If W is the number of microstates that can yield a given macrostate, and each microstate has the same a priori probability, then that probability is p = 1/W. The Shannon entropy (in nats) is:

H = −∑ pᵢ ln pᵢ = ln W

and if entropy is measured in units of k per nat, then the entropy is given[67] by:

S = k H = k ln W

which is the famous Boltzmann entropy formula when k is Boltzmann's constant, which may be interpreted as the thermodynamic entropy per nat. There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.[68]"

But clearly it is a live point for debate.
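For a concrete look at that equivalence, one can evaluate both expressions for the same W. The deck's 52! orderings are used here purely as a stand-in for equiprobable "microstates"; the only difference between the two entropies is the constant k.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact in the 2019 SI)

W = math.factorial(52)    # number of equally likely "microstates"

H_nats = math.log(W)      # Shannon entropy in nats, since each p = 1/W
S = K_B * math.log(W)     # Boltzmann entropy S = k ln W, in J/K

print(round(H_nats, 2))               # about 156.36 nats
print(math.isclose(S / H_nats, K_B))  # True: identical up to the factor k
```

Which is the formal version of the Wiki passage's point: the two quantities share one formula, and the debate is over whether sharing a formula means sharing a meaning.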