The authors understand and use the concept of entropy through a dual lens: as a thermodynamic measure of physical disorder (Boltzmann entropy) and as a mathematical measure of uncertainty and freedom of choice (Shannon entropy). They utilize these concepts to explain how complex living systems manage to survive, self-organize, and most importantly, bridge the gap between blind physical laws and purposeful biological function.

Here is how these concepts are applied, particularly concerning the Cybernetic Cut and systemic stability:

1. Thermodynamic Entropy and Stability (Far-From-Equilibrium)

Classical thermodynamics dictates that closed systems inevitably wind down toward maximum entropy, disorder, and heat death, a state of “thermodynamic equilibrium”[1]. If a biological or complex system reaches thermodynamic equilibrium, it is dead[2][3].

Therefore, the authors redefine stability for living systems not as a static, lowest-energy resting state, but as a dynamic metastability. Complex systems and organisms are “dissipative structures”[4][5]. They maintain their internal structure (low entropy) by operating strictly as open systems, continuously drawing low entropy (free energy and data) from their environment and exporting high entropy (heat, errors, and waste) back into it[5][6]. Stability is achieved by operating far from equilibrium, where energy gradients push the system to self-organize and adapt to environmental noise[4].

2. Shannon Entropy (Combinatorial Uncertainty)

While Boltzmann entropy measures physical disorder, Shannon entropy mathematically measures uncertainty, unpredictability, or “freedom of choice”[8][9]. A completely random sequence (like a series of coin flips) holds maximum Shannon entropy and maximum uncertainty[10][11]. David L. Abel points out that in traditional complexity science, maximum complexity is mathematically synonymous with this maximum entropy or randomness (Random Sequence Complexity)[12][13]. While a highly entropic sequence has maximum “freedom,” it contains zero functional instruction or meaning[14].
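This notion of “freedom of choice” can be made concrete with the standard formula H = sum(p_i * log2(1/p_i)), measured in bits per symbol. As an illustrative sketch (my example, not drawn from the source), a fair coin series sits at the 1-bit maximum for two symbols, a constant sequence at zero, and an equiprobable four-letter alphabet at 2 bits:

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy H = sum(p_i * log2(1/p_i)), in bits per symbol."""
    counts = Counter(sequence)
    n = len(sequence)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("HTHTTHHTHT"))  # 1.0 (fair coin: maximum uncertainty)
print(shannon_entropy("AAAAAAAAAA"))  # 0.0 (total order: no uncertainty)
print(shannon_entropy("ACGT" * 5))    # 2.0 (four equiprobable symbols)
```

The first and third results are the maxima for their alphabet sizes, which is exactly the sense in which a random sequence carries maximum Shannon entropy while conveying no instruction.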

3. Entropy and the Cybernetic Cut

The core of David L. Abel’s work uses these definitions of entropy to establish the Cybernetic Cut—the absolute, unbridgeable divide between the physical world of mass/energy (governed by thermodynamics) and the formal world of choice, concept, and cybernetic control[15][16].

Abel argues that spontaneous physical processes driven by thermodynamic necessity and chance (entropy) can only produce two things:

Ordered Sequence Complexity (OSC): Rigid, highly ordered patterns like crystals. These have low entropy but are forced by physical laws and contain minimal information[13][17].

Random Sequence Complexity (RSC): Incompressible, stochastic ensembles generated by heat agitation and noise. These have maximum entropy but zero prescriptive function[13][18].

Neither thermodynamic necessity nor entropic chance can cross the Cybernetic Cut to generate Functional Sequence Complexity (FSC), which is required for life (like the genetic code)[13][18].
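The OSC/RSC distinction can be sketched (my illustration, not a method from the source) with a compression test, since algorithmic compressibility is the usual proxy for sequence order: a law-like repetitive string compresses to almost nothing, while stochastic noise is essentially incompressible. Note that this statistical test cannot detect FSC, which is precisely Abel’s point: a functional sequence is statistically similar to a random one.

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size: near 0 for rigid order, near 1 for noise."""
    return len(zlib.compress(data, 9)) / len(data)

osc = b"ABAB" * 2500      # Ordered Sequence Complexity: rigid, law-like repetition
rsc = os.urandom(10_000)  # Random Sequence Complexity: stochastic noise

print(compression_ratio(osc))  # tiny: order is highly compressible
print(compression_ratio(rsc))  # ~1.0 or slightly above: randomness is incompressible
```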

Crossing the Cybernetic Cut depends on the high Shannon entropy of the system’s physical substrate. For example, the bonds in a DNA backbone allow any of the four nucleotide bases to attach with equal thermodynamic ease[19]. Because physics does not dictate which base attaches, these nodes are “dynamically inert” configurable switches[20][21]. This high Shannon uncertainty (freedom from physical law) provides the blank slate necessary for Choice Contingency: the ability of an active, formal agency to purposefully select specific switch settings to write Prescriptive Information (PI) and achieve a functional, homeostatic goal[19][22].
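A back-of-envelope calculation (my illustration, not the authors’) shows why such nodes make good configurable switches: with four thermodynamically equivalent bases, each site carries log2(4) = 2 bits of Shannon uncertainty, so even a short 100-base string leaves an astronomical number of physically equivalent settings for an agent to choose among:

```python
import math

BASES = "ACGT"

# Each backbone node is a configurable switch: physics leaves all four
# bases equally available, so each site carries log2(4) = 2 bits of
# Shannon uncertainty ("freedom from physical law").
bits_per_site = math.log2(len(BASES))

# Choice contingency then means selecting ONE setting out of the whole
# sequence space; 100 bases leave 4**100 physically equivalent options.
n = 100
settings = len(BASES) ** n

print(bits_per_site)        # 2.0
print(n * bits_per_site)    # 200.0 bits to specify one 100-base choice
print(len(str(settings)))   # the option count runs to 61 decimal digits
```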

4. Thermodynamic vs. Kinetic Dynamics (Rate-Dependent vs. Rate-Independent)

While the authors do not explicitly contrast the standard chemical definitions of “thermodynamic” versus “kinetic” stability, the Relational Biologists (Howard Pattee and Robert Rosen) translate this physical dynamic into the principle of Complementarity and the Epistemic Cut[23][24].

To achieve sustained stability and function, a system must seamlessly unite two formally incompatible realms:

Rate-Dependent Dynamics: The continuous physical, thermodynamic, and kinetic processes governed by inexorable natural laws (e.g., the chemical reactions of enzymes and proteins)[25][26].

Rate-Independent Constraints: The discrete, formal symbolic rules (e.g., the genetic code) that are not driven by the speed or kinetic energy of the physical system, but act as arbitrary instructions[25][26].

True biological stability is achieved through Semantic Closure[23][27]. The rate-independent symbols (DNA) constrain and direct the rate-dependent thermodynamic kinetics (protein folding), but those resulting physical kinetics are absolutely required to actually “read” and execute the symbols[23][27]. Thus, a living system avoids entropic decay by internally synthesizing its own physical catalysts based on formal cybernetic instructions, rendering the organism “closed to efficient causation” and capable of autonomous survival[28][29].
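The rate-independent/rate-dependent split above can be sketched with the genetic code itself: the codon-to-amino-acid mapping is an arbitrary lookup rule, while “reading” it is a physical process. In this minimal sketch (a few standard-code entries only; the table and function names are mine), the dictionary stands in for the rate-independent constraint and the loop for the rate-dependent machinery that executes it:

```python
# A fragment of the standard genetic code. The mapping itself is a
# rate-independent convention: nothing in reaction kinetics forces
# "AUG" to mean methionine.
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def translate(mrna: str) -> list:
    """A stand-in for the rate-dependent process that 'reads' the symbols."""
    residues = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3], "?")
        if residue == "STOP":
            break
        residues.append(residue)
    return residues

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

The point of the sketch is that swapping entries in the table would not violate any physical law, which is what makes the constraint formal rather than dynamic.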