Are there any explanations which link Shannon entropy to physicality, and what are the criticisms of unembodied concepts of entropy?

Based on the provided sources, the collection reveals a sharp and often contentious division regarding the relationship between Shannon (information) entropy and physical (thermodynamic) entropy.

The sources present two distinct camps: those who argue that Shannon entropy explains the physical nature of thermodynamics (via “missing information” or “erasure costs”), and those who criticize this link as a category error that confuses objective physical reality with subjective mental constructs.

Explanations Linking Shannon Entropy to Physicality

Proponents of the link argue that thermodynamic entropy is fundamentally a measure of information (or the lack thereof) regarding a physical system’s microstates, or that the processing of information has direct physical energy costs.

**1. The “Identity” View (Jaynes & Ben-Naim)**

The strongest theoretical link is provided by E.T. Jaynes and Arieh Ben-Naim, who argue that thermodynamic entropy and Shannon entropy are not merely analogous but identical concepts.

Entropy as Missing Information: Jaynes asserts that thermodynamic entropy represents the amount of uncertainty or “missing information” about the precise microstate of a system given its macroscopic observables (like temperature and pressure)[1]. In this view, maximizing entropy is simply a method of statistical inference—making the least biased prediction possible given the available data[2][3].
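Jaynes’s claim that maximum entropy is “least biased inference” can be made concrete with Shannon’s formula, H = −Σ pᵢ log₂ pᵢ. The sketch below is ours, not from the sources (the function name `shannon_entropy` is an illustrative choice): the uniform distribution, which commits to nothing beyond the known constraints, carries the most missing information.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable microstates: maximal uncertainty, H = 2 bits.
uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# A biased assignment over the same four states encodes extra
# knowledge, so its missing information is strictly lower.
biased = shannon_entropy([0.7, 0.1, 0.1, 0.1])
```

On Jaynes’s reading, the biased case corresponds to an observer holding additional macroscopic data; the equilibrium assignment is simply the one with no commitments beyond the constraints.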

The Argument for Identity: Ben-Naim argues that the only difference between the two is the presence of the Boltzmann constant (kB), which he views as a historical artifact of units. He proposes redefining temperature in units of energy to eliminate kB, thereby making thermodynamic entropy dimensionless and formally identical to Shannon’s missing information[4][5]. He posits that interpreting entropy as “missing information” removes the mystery surrounding the concept and is superior to the “disorder” metaphor[6][7].
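Ben-Naim’s claimed identity amounts to the conversion S = kB · ln 2 · (missing information in bits). A minimal numeric sketch of that conversion (our illustration, assuming the identity holds as he states it):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI)
N_A = 6.02214076e23  # Avogadro constant, 1/mol (exact, 2019 SI)

def entropy_from_bits(n_bits):
    """Ben-Naim's claimed identity: S = k_B * ln(2) * (bits of missing info)."""
    return K_B * math.log(2) * n_bits

# One bit of missing information in thermodynamic units: ~9.57e-24 J/K.
one_bit = entropy_from_bits(1)

# A mole of two-state particles with fully unknown states: ~5.76 J/K
# (equal to R * ln 2, a laboratory-scale entropy).
one_mole = entropy_from_bits(N_A)
```

The critics’ dimensional objection (discussed below) targets exactly this move: the factor kB · ln 2 is, on their view, not a mere unit conversion.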

**2. The Thermodynamic Cost of Information (Landauer & Bennett)**

This explanation links information to physicality through the energy required to manipulate it.

Landauer’s Principle: Rolf Landauer argued that “information is physical” because it is inevitably inscribed in a physical medium. He demonstrated that logically irreversible operations, such as erasure (resetting a bit), must dissipate a minimum amount of heat (kT ln 2) into the environment to satisfy the Second Law[8][9].
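The Landauer bound is easy to evaluate numerically. A minimal sketch (our illustration of the kT ln 2 formula, not code from the sources):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def landauer_bound(temperature_k):
    """Minimum heat dissipated by erasing one bit: E = k * T * ln(2), in joules."""
    return K_B * temperature_k * math.log(2)

# At room temperature (300 K), erasing one bit costs at least ~2.87e-21 J.
per_bit = landauer_bound(300.0)
```

The bound is tiny by engineering standards, which is why it matters conceptually (it exorcises the Demon) long before it constrains real hardware.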

Exorcising Maxwell’s Demon: This principle is used to resolve the paradox of Maxwell’s Demon. Charles Bennett and others argue that the Demon cannot violate the Second Law because the physical act of erasing the information it gathers about particle trajectories generates enough entropy to compensate for the order it creates[10].

Thermodynamics of Information: Recent frameworks treat information as a thermodynamic resource (similar to free energy). Parrondo and colleagues describe how non-equilibrium states with information (e.g., memory) can be used to extract work, creating a “thermodynamics of information” where information flow and energy flow are coupled[13].

**3. Fold Theory and Morphic Entropy**

A more specific theoretical approach in the sources, “Fold Theory,” treats entropy as a derived quantity representing the “breakdown of recursive structure under strain.” It defines “morphic entropy” as the degeneracy of recursive paths a system can take. In this view, measurement collapses these possibilities, generating “informational heat,” thereby linking structural uncertainty directly to thermodynamic emissions[16][17].

--------------------------------------------------------------------------------

Criticisms of Unembodied (Information-Only) Concepts

Many authors in the collection aggressively critique the identification of physical entropy with Shannon entropy, characterizing it as a semantic confusion or an “intellectual snobbery” stemming from a joke by John von Neumann.

**1. The “Subjectivity” and “Anthropomorphism” Critique**

Critics argue that defining entropy as “missing information” makes a physical property dependent on the observer’s state of mind, which they deem scientifically inadmissible.

P.W. Atkins: Atkins explicitly rejects the information-theoretic interpretation to avoid the “muddleheadedness” of implying that entropy requires a cognizant entity. He argues that entropy is an objective physical property, not an aspect of the observer’s mind[18][19].

K.G. Denbigh: Denbigh argues that thermodynamic entropy is “fully objective” and measurable (e.g., via calorimetry), whereas information depends on a specific question or observer. He asserts that Shannon’s theory is too comprehensive to serve as a surrogate for physical entropy, leading to “objectionable consequences” when applied to physics[20][21].

Jeffrey Wicken: Wicken argues that the “uncertainty” in thermodynamics (microstate fluctuation) is fundamental and objective, whereas the “uncertainty” in information theory is merely about the choice of a message. Once a message is selected, its Shannon entropy is zero, but a physical system remains in a macrostate with positive entropy[22][23].

**2. The “Category Error” Critique**

These arguments focus on the distinct mathematical and physical domains of the two concepts.

Incompatible Units: Authors like Jaffe and Thims emphasize that thermodynamic entropy has units of energy divided by temperature (J/K), while Shannon entropy is dimensionless (bits). They argue that equating them is a fundamental dimensional error[24][25].

Static vs. Dynamic: Terrence Deacon and Wicken note that Shannon entropy is a static, logical property of a signal code, whereas thermodynamic entropy is a dynamical property of matter in motion. Shannon entropy does not spontaneously increase (a book does not scramble itself), whereas physical entropy does[26][27].

The “Neumann-Shannon Anecdote”: Multiple sources cite the story where John von Neumann advised Claude Shannon to call his measure “entropy” because “no one knows what entropy really is,” arguing that this joke unleashed a “Pandora’s box” of confusion[28]. Libb Thims calls the equating of these concepts “science’s greatest Sokal affair”[31].

**3. The Failure to Explain Biological Complexity**

Critics in biology argue that unembodied entropy concepts fail to capture the functional nature of life.

Semantic Vacuity: Nicholas Georgescu-Roegen points out that Shannon’s formula attaches the same “information” value to a string of gibberish as it does to Newton’s Principia, provided the statistical distribution of letters is the same. He argues that physical entropy involves qualitative degradation (energy becoming unavailable), which Shannon’s formula cannot capture[32][33].
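Georgescu-Roegen’s point follows directly from the formula: Shannon entropy depends only on symbol frequencies, so any permutation of a text, however meaningless, scores identically. A minimal sketch (our illustration; the strings and function name are invented for the example):

```python
import math
import random
from collections import Counter

def letter_entropy(text):
    """Shannon entropy (bits/symbol) of the empirical letter distribution."""
    counts = Counter(text)
    n = len(text)
    # Sort counts for a deterministic summation order.
    return -sum((c / n) * math.log2(c / n) for c in sorted(counts.values()))

meaningful = "for every action there is an equal and opposite reaction"
# A random permutation of the same letters: pure gibberish,
# but the same multiset of symbol counts.
gibberish = "".join(random.sample(meaningful, len(meaningful)))
```

Both strings yield exactly the same entropy, because the formula sees only the letter frequencies, not the meaning they carry.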

Functional Information: In biological contexts, authors like Johnson and Wills argue that information must be “embodied” or have a physical interpreter (like a ribosome reading RNA) to be meaningful. Disembodied information is viewed as incompatible with the Second Law because it lacks the physical mechanism to effect change[34][35].