Agree. Not sure how the topic shifted to card shuffling, but the initial examples used a deck of cards to describe disorder, or at least to visualize what disorder might look like.
This passage from the Wikipedia article on entropy captures my discomfort about equating the two usages:
"The question of the link between information entropy and thermodynamic entropy is a debated topic. While most authors argue that there is a link between the two,[62][63][64][65][66] a few argue that they have nothing to do with each other.[citation needed] The expressions for the two entropies are similar. If W is the number of microstates that can yield a given macrostate, and each microstate has the same a priori probability, then that probability is p = 1/W. The Shannon entropy (in nats) is:

H = −Σ p ln p = ln W

and if entropy is measured in units of k per nat, then the entropy is given[67] by:

S = k ln W

which is the famous Boltzmann entropy formula when k is Boltzmann's constant, which may be interpreted as the thermodynamic entropy per nat. There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.[68]"
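The equivalence the quoted passage describes is easy to check numerically. Here is a minimal sketch in Python; the choice of W = 52 is mine (a nod to the deck of cards), not from the article:

```python
import math

# Number of equally likely microstates yielding a given macrostate
# (52 is an illustrative choice, echoing the card-deck examples)
W = 52

# Each microstate has the same a priori probability p = 1/W
p = 1.0 / W

# Shannon entropy in nats: H = -sum over the W microstates of p*ln(p)
H = -sum(p * math.log(p) for _ in range(W))

# With equal probabilities the sum collapses to ln W
assert math.isclose(H, math.log(W))

# Boltzmann entropy S = k ln W: the same quantity measured in
# units of k (joules per kelvin) per nat
k = 1.380649e-23  # Boltzmann's constant, J/K (exact since the 2019 SI redefinition)
S = k * math.log(W)

print(H)  # ln 52, about 3.951 nats
print(S)  # about 5.455e-23 J/K
```

So the two formulas really do differ only by the constant k, which is the point the passage makes, while the debate is over whether that formal identity reflects a physical identity.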
But clearly it remains a live point of debate.