Based on the provided sources, the concept of entropy relies fundamentally on boundaries to define the system’s interaction with the universe and on distinctions (observer capability) to define the microstates being counted. Different entropies are combined primarily through summation (additivity) for independent systems or by adjusting for correlations using mutual information.
1. Reliance on Boundaries
The definition and behavior of entropy differ strictly based on how a system is physically delimited from its environment.
• System Definition: A thermodynamic system is defined by its boundary.
◦ Isolated Systems: Have boundaries impervious to matter and energy. Only here does the Second Law strictly dictate that entropy must increase or remain constant[1],[2].
◦ Open Systems: Have permeable boundaries allowing exchange of matter and energy (and thus entropy). In these systems, internal entropy can decrease locally if enough entropy is exported across the boundary to the surroundings (see the entropy balance after this list)[3],[4],[5].
◦ The “System Boundary Problem”: In non-equilibrium thermodynamics (e.g., climate models), predictions using the Maximum Entropy Production Principle (MEPP) can yield contradictory results depending on where the boundary is drawn (e.g., including or excluding the external radiation field)[6],[7].
• Quantization: In quantum mechanics, energy quantization (which leads to discrete states and thus finite entropy) often arises specifically because a system is confined by boundaries to a finite region of space[8].
• Social Systems: In social systems theory, boundaries (distinctions) are necessary to separate the system from the environment; entropy here can measure the inability to distinguish or select states within those boundaries[9].
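A compact way to state the open-system case above, following the entropy-balance notation of the cited Prigogine text, is to split the total entropy change into a production term and an exchange term:

dS = d_iS + d_eS, with d_iS ≥ 0

Here d_iS is the entropy produced inside the boundary (never negative, per the Second Law), and d_eS is the entropy exchanged across the boundary; a sufficiently negative d_eS lets the total dS fall locally even though d_iS cannot.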
2. Reliance on Distinction (The Observer’s Role)
Entropy values depend on the ability or choice to distinguish between different particles or states.
• The Gibbs Paradox: The “entropy of mixing” relies entirely on distinction. If a partition is removed between two volumes of identical gases, there is no entropy change; if the gases are distinguishable (e.g., isotopes), entropy increases (a worked example follows this list).
◦ Subjectivity: This leads to the conclusion that entropy depends on the observer’s knowledge or “willingness to distinguish”[10]. As Ben-Naim notes, if an observer is “color-blind” to the difference between particles, the entropic term for mixing vanishes[10],[11].
• Coarse-Graining: Entropy depends on the resolution (grain size) of the measurement. Finer distinctions (smaller cells in phase space) yield different entropy values than coarser ones[12].
• Information Erasure: In the context of Maxwell’s Demon, the demon’s ability to make distinctions (acquire information) allows it to locally reduce entropy. The thermodynamic balance is restored only when the “distinct” records in the demon’s memory are erased[13],[14].
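To make the Gibbs paradox concrete, here is a minimal sketch in Python using the standard ideal-gas mixing formula ΔS = −nR Σ xᵢ ln xᵢ (the function name and boolean flag are illustrative, not from the sources). The same physical operation yields ΔS = 2R ln 2 or exactly zero depending solely on whether the observer distinguishes the species:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(n_a: float, n_b: float, distinguishable: bool) -> float:
    """Entropy of mixing two ideal gases at equal T and P, in J/K.

    For Ben-Naim's 'color-blind' observer the species are identical,
    removing the partition changes nothing, and dS = 0.
    """
    if not distinguishable:
        return 0.0
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -n * R * (x_a * math.log(x_a) + x_b * math.log(x_b))

# 1 mol of each gas: dS = 2R ln 2 ~ 11.5 J/K if distinguishable, else 0.
print(mixing_entropy(1.0, 1.0, distinguishable=True))   # ~11.53
print(mixing_entropy(1.0, 1.0, distinguishable=False))  # 0.0
```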
3. How Different Entropies Are Combined
Entropies are combined algebraically, often relying on the property of extensivity (additivity), but adjustments are required when systems are correlated or when combining different types of entropy (e.g., physical and algorithmic).
• Additivity (Independent Systems):
◦ For uncorrelated or spatially separated subsystems A and B, the total entropy is the simple sum: S_total = S_A + S_B[15],[16],[17].
◦ Molecular Components: Total molecular entropy is often calculated by summing the independent contributions of translational, rotational, vibrational, and electronic entropies: S_total = S_trans + S_rot + S_vib + S_elec[18],[19] (see the worked example below).
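As an illustration of this sum, the sketch below evaluates the four contributions for N2 at 298.15 K and 1 bar using the standard ideal-gas formulas (Sackur–Tetrode for translation, rigid rotor for rotation, harmonic oscillator for vibration). The molecular constants are textbook values assumed here, not taken from the sources:

```python
import math

# Physical constants (SI)
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s
N_A = 6.02214076e23   # Avogadro constant, 1/mol
R = k * N_A           # gas constant, J/(mol*K)

# N2 at standard conditions (textbook values, assumed)
T = 298.15            # K
p = 1.0e5             # Pa (1 bar)
m = 28.0134e-3 / N_A  # mass of one N2 molecule, kg
theta_rot = 2.88      # rotational temperature, K
theta_vib = 3374.0    # vibrational temperature, K
sigma = 2             # symmetry number (homonuclear diatomic)
g0 = 1                # ground-state electronic degeneracy

# Translational: Sackur-Tetrode, with V/N = kT/p for an ideal gas
S_trans = R * (math.log((2 * math.pi * m * k * T / h**2) ** 1.5 * k * T / p) + 2.5)

# Rotational: high-temperature rigid rotor (linear molecule)
S_rot = R * (math.log(T / (sigma * theta_rot)) + 1)

# Vibrational: harmonic oscillator
x = theta_vib / T
S_vib = R * (x / (math.exp(x) - 1) - math.log(1 - math.exp(-x)))

# Electronic: only the ground state is populated
S_elec = R * math.log(g0)  # = 0 for g0 = 1

S_total = S_trans + S_rot + S_vib + S_elec
print(f"{S_trans:.1f} + {S_rot:.1f} + {S_vib:.4f} + {S_elec:.1f} = {S_total:.1f} J/(mol*K)")
# ~150.4 + 41.1 + 0.0012 + 0.0 = 191.5, close to the measured 191.6 J/(mol*K)
```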
• Sub-additivity (Correlated Systems):
◦ If systems interact or are correlated, simple addition overestimates the entropy. The correct combination subtracts the Mutual Information I: S_AB = S_A + S_B − I(A;B).
◦ This formula relates the joint entropy S_AB to the individual entropies, where I(A;B) measures the correlation or “shared information” between them (see the sketch below)[20],[21],[22].
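A minimal numerical check of this identity, using Shannon entropies of a small joint distribution (the numbers are made up for illustration):

```python
import numpy as np

def shannon(p: np.ndarray) -> float:
    """Shannon entropy in bits; zero-probability entries are skipped."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution over (A, B); correlated, so I(A;B) > 0.
p_ab = np.array([[0.3, 0.1],
                 [0.1, 0.5]])

S_a = shannon(p_ab.sum(axis=1))   # marginal entropy of A
S_b = shannon(p_ab.sum(axis=0))   # marginal entropy of B
S_ab = shannon(p_ab.flatten())    # joint entropy
I = S_a + S_b - S_ab              # mutual information

print(f"S_A={S_a:.3f}, S_B={S_b:.3f}, S_AB={S_ab:.3f}, I={I:.3f}")
# S_AB = S_A + S_B - I; for an independent joint distribution (the outer
# product of the marginals) I would be 0 and the entropies would simply add.
```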
• Zurek’s “Physical Entropy”:
◦ Wojciech Zurek proposes a “physical entropy” (S_d) that combines standard thermodynamic (Shannon) entropy (H) with Algorithmic Randomness (K). This measures the total work potential, including the cost of describing the system: S_d = H(p) + K(p).
◦ Here, H(p) represents ignorance (missing information) and K(p) represents the length of the shortest program to describe the known data (see the sketch below)[23],[24].
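As a toy illustration only, not Zurek’s construction verbatim: the shortest-program length K is uncomputable, so the sketch below substitutes a zlib-compressed length as a crude upper-bound proxy, and takes H from an assumed probability assignment over the still-unknown microstates:

```python
import math
import zlib

def shannon_bits(p) -> float:
    """H(p): missing information in bits for a probability assignment p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def k_proxy_bits(record: bytes) -> int:
    """Stand-in for algorithmic randomness K: compressed size of the record.
    True Kolmogorov complexity is uncomputable; compression upper-bounds it."""
    return 8 * len(zlib.compress(record))

# Hypothetical observer: a probability assignment over 4 remaining
# microstates (H = 2 bits) plus a measurement record acquired so far.
p = [0.25, 0.25, 0.25, 0.25]
record = b"up,down,up,up,down," * 10  # made-up measurement log

S_physical = shannon_bits(p) + k_proxy_bits(record)  # S_d = H + K (proxy)
print(f"H = {shannon_bits(p):.1f} bits, K proxy = {k_proxy_bits(record)} bits")
print(f"physical entropy proxy = {S_physical:.1f} bits")
```

Measuring more (a longer record) lowers H but lengthens the description, raising K; Zurek’s point is that the sum, not H alone, measures the remaining work potential.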
• System Metrics (Simpson):
◦ In systems engineering, metrics are combined by summing a “relational score” (based on physical entropy/energy efficiency) and an “object score” (based on information entropy/component distinctness) to create a total “Subsystem Score”[25].
References
[1] [Book] Atkins - The Laws of Thermodynamics A Very Short Introduction.pdf
[2] [Book] Prigogine - Modern Thermodynamics From Heat Engines to Dissipative Structures 2nd edition.pdf
[3] Life hierarchy and the thermodynamic machinery of planet earth.pdf
[4] Nielsen - The Entropy of Entropy.pdf
[5] [Book] Prigogine - Modern Thermodynamics From Heat Engines to Dissipative Structures 2nd edition.pdf
[6] From Maximum Entropy yo Maximum Entropy Production - a new approach.pdf
[7] From Maximum Entropy yo Maximum Entropy Production - a new approach.pdf
[8] [Book] Atkins - Molecular Quantum Mechanics.pdf
[9] On the Entropy of Social Systems A Revision of the Concepts of Energy and Entropy in Social Context.pdf
[10] [Book] Ben-Naim - A Farewell To Entropy Statistical Thermodynamics Based On Information.pdf
[11] [Book] Tame - Approaches to Entropy.pdf
[12] [Book] Tame - Approaches to Entropy.pdf
[13] Porrondo - Thermodynamics of Information.pdf
[14] Porrondo - Thermodynamics of Information.pdf
[15] [Book] Prigogine - Modern Thermodynamics From Heat Engines to Dissipative Structures 2nd edition.pdf
[16] [Book] Tame - Approaches to Entropy.pdf
[17] [Book] Volkenstein - Entropy and Information.pdf
[18] Statistical Thermodynamics.pdf
[19] [Book] Volkenstein - Entropy and Information.pdf
[20] Porrondo - Thermodynamics of Information.pdf
[21] [Book] Gray - Entropy and Information Theory.pdf
[22] [Book] Zurek - Complexity, Entropy and the Physics of Information.pdf
[23] [Book] Zurek - Complexity, Entropy and the Physics of Information.pdf
[24] [Book] Zurek - Complexity, Entropy and the Physics of Information.pdf
[25] Simpson 2013 - Entropy metrics for system identfification and analysis.pdf
