Based on the provided sources, the distinction between structural and symbolic information (often discussed as the difference between syntactic/physical information and semantic/functional information) can be broken down into the following key areas:
1. Definition and Nature
• Structural Information (Syntactic/Physical):
◦ Physical Arrangement: Structural information refers to the “inner structure” or diversity of an animate or inanimate individual[1]. It is the information “bound” into the physical configuration of matter, such as the specific sequence of a DNA molecule or the atomic arrangement of a crystal[2][3].
◦ Latent Potential: Corning and Kline describe this as “latent” information. Like an unread book or a DNA strand waiting to be transcribed, it exists as a physical pattern but performs no work until it is utilized[4][5].
◦ Quantifiable Syntax: In the context of Shannon’s theory, structural information is “syntactic.” It quantifies the statistical rarity or probability of a specific configuration (state) among the possible configurations, disregarding meaning[6][7]. It measures the “capacity” to carry information rather than the meaning itself[8].
• Symbolic Information (Semantic/Functional):
◦ Requires Interpretation: Abbott defines “symbolic entities” (e.g., a sentence, a flag, an algorithm) as those that require an interpreter to exist as such. Without an interpreting agent (a mind or a biological mechanism) to process the syntax, the symbolic entity does not exist; only the physical structure remains[9].
◦ Context and Function: This form of information is “relational” and “context-dependent”[10][11]. It is defined by the interaction between the sign (symbol) and the receiver. For example, a red traffic light has physical properties (structural), but its symbolic information (“stop”) exists only within the context of traffic rules and a driver’s cognition[12].
◦ Control: Corning terms this “control information”: the capacity to control the acquisition and utilization of matter and energy in purposive processes[10].
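The syntactic character of the Shannon measure can be made concrete with a short sketch (the sequences below are invented for illustration): Shannon entropy, H = −Σ pᵢ log₂ pᵢ, scores only the statistical composition of a string, so two sequences with identical symbol frequencies score identically regardless of their order, let alone their meaning.

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    n = len(sequence)
    counts = Counter(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two DNA-like strings with identical composition (3 of each base) but a
# different arrangement: the syntactic measure cannot tell them apart.
a = "ATGCATGCATGC"
b = "AAATTTGGGCCC"
print(shannon_entropy(a))  # 2.0 bits/symbol
print(shannon_entropy(b))  # 2.0 bits/symbol
```

Both strings yield 2.0 bits per symbol (four equiprobable symbols), illustrating that this is a measure of carrying capacity, not of significance.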
2. Relationship to Thermodynamics and Entropy
• Structural:
◦ Thermodynamic Cost: Creating structural information (e.g., forming a polymer or ordering a crystal) requires energy input and involves “bound energy” or “bound information”[13][14]. High structural entropy implies a disordered distribution, while low structural entropy implies high order[13].
◦ Measurability: Structural information is linked to physical entropy and can, in principle, be measured via thermodynamic states (microstates and macrostates)[3].
• Symbolic:
◦ Thermodynamic Independence: Symbolic information (meaning) is “intangible” and carries no thermodynamic burden in itself. The “meaning” of a message (e.g., Shakespeare versus gibberish) cannot be distinguished by weighing the medium or measuring its energy content[15][16].
◦ Functional Value: While storing or transmitting symbols has an energetic cost (a structural matter), the value or significance of the symbol is not a thermodynamic variable[15].
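The Shakespeare-versus-gibberish point can be sketched in a few lines (the example strings are invented): scrambling a sentence leaves every symbol-level physical and statistical measure of the medium unchanged while destroying the semantic content.

```python
from collections import Counter

# Invented example: one English sentence and a scrambling of it.
meaningful = "to be or not to be that is the question"
scrambled = "".join(sorted(meaningful))  # same characters, order destroyed

# Storage cost and symbol composition are identical...
assert len(meaningful.encode("ascii")) == len(scrambled.encode("ascii"))
assert Counter(meaningful) == Counter(scrambled)
# ...so no measurement of the medium distinguishes the two; only an
# interpreter (here, a reader of English) can.
```

Any per-symbol statistic, including Shannon entropy, gives the same value for both strings, which is why meaning must be located in the interpreter rather than in the medium.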
3. Biological Examples
• Genetics:
◦ Structural: The linear sequence of nucleotides in DNA represents structural information. It can be analyzed for its statistical properties (Shannon entropy) or its chemical binding forces[2].
◦ Symbolic: The genetic code is symbolic. It relies on a “physical interpreter” (the ribosome and tRNA machinery) to translate the structural triplet codons into amino acids[17][18]. Without this interpreter, the DNA has structure but lacks symbolic function[19].
• Signaling:
◦ Structural: A pheromone molecule has a specific chemical structure.
◦ Symbolic: To an ant, that structure is a symbol triggering a specific behavior (e.g., “follow this path”). To a human, the same structure lacks that symbolic content[20].
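The genetics example above can be sketched as a toy “physical interpreter” (the codon-to-amino-acid assignments shown are real entries from the standard genetic code; the framing and names are illustrative only). The same string is mere structure on its own and becomes symbolic only when paired with the interpreting table:

```python
# Fragment of the standard codon table, acting as a stand-in interpreter.
CODON_TABLE = {"ATG": "Met", "TTT": "Phe", "GGC": "Gly", "TAA": "STOP"}

def translate(dna: str) -> list:
    """Interpret a DNA string as triplet codons, halting at a stop codon."""
    amino_acids = []
    for i in range(0, len(dna) - 2, 3):
        meaning = CODON_TABLE.get(dna[i:i + 3], "?")
        if meaning == "STOP":
            break
        amino_acids.append(meaning)
    return amino_acids

sequence = "ATGTTTGGCTAA"   # as structure: just a string of four base letters
print(translate(sequence))  # as interpreted symbol: ['Met', 'Phe', 'Gly']
```

Delete `CODON_TABLE` and the sequence still exists physically, but its symbolic content is gone, mirroring the point that the code is symbolic only relative to its interpreter.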
Summary Table
| Feature | Structural Information | Symbolic Information |
|---|---|---|
| Primary Focus | Physical arrangement, syntax, statistics. | Meaning, function, semantics, pragmatics. |
| Dependency | Exists as physical patterns/bound energy. | Requires an interpreter or observer[9][21]. |
| Action | Latent / Potential[4]. | Active / Control (directs work)[10]. |
| Thermodynamics | Linked to entropy and binding energy[22]. | Meaning itself has no mass/energy[15]. |
| Example | The shape of ink marks on a page. | The ideas conveyed by the words. |
References
[1] The relation between Thermodynamics and the Information Theories - the introduction of the term enmorphy.pdf
[2] entropy and evolution.pdf
[3] entropy and evolution.pdf
[4] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[5] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[6] Information Genetics And Entropy.pdf
[7] Rioul - This is IT a primer on Shannon Entropy and Information.pdf
[8] entropy and evolution.pdf
[9] Abbott - Emergence entities entropy and binding forces.pdf
[10] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[11] Information Genetics And Entropy.pdf
[12] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[13] Roegen - The Entropy Law and the Economic Process.pdf
[14] entropy and evolution.pdf
[15] Life information entropy and time - crofts nihms73684.pdf
[16] Life information entropy and time - crofts nihms73684.pdf
[17] Wills Genetic Information Physical Interpreters and Thermodynamics the material information basis of biosemiosis.pdf
[18] Wills Genetic Information Physical Interpreters and Thermodynamics the material information basis of biosemiosis.pdf
[19] Wills Genetic Information Physical Interpreters and Thermodynamics the material information basis of biosemiosis.pdf
[20] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[21] Corning 1998 - Thermodynamics information and life revisited - part 2.pdf
[22] Abbott - Emergence entities entropy and binding forces.pdf
