Question

Which authors are closest to each other in their ideas on how to cope with complexity?

Based on the provided sources, several authors form distinct clusters around shared philosophies and methodologies for coping with complexity. The closest alignments occur among those who advocate cybernetic structuralism, those who champion soft/learning approaches for “wicked” problems, those who favor experimental/evolutionary strategies, those who emphasize cognitive filtering, and those who turn to narrative and metaphor.

1. The Cybernetic Structuralists: Stafford Beer and Patrick Hoverstadt

These two authors are virtually synonymous in their approach, focusing on Ashby’s Law of Requisite Variety and the Viable System Model (VSM).

Shared Core Concept: Both define complexity as “variety” (the number of possible states a system can exhibit) and argue that “only variety can destroy variety”[1],[2].

Common Strategy: Both advise dealing with complexity through “variety engineering”—specifically the use of attenuators (to filter incoming complexity) and amplifiers (to boost management’s response capacity)[1],[2],[3].

Structural Alignment: Both propose the VSM as the primary organizational tool, emphasizing recursion (systems nested like Russian dolls) and autonomy so that complexity is handled at the appropriate level[4],[5].
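The variety arithmetic behind this cluster can be made concrete. The sketch below is a toy illustration of Ashby’s Law of Requisite Variety, not Beer’s or Hoverstadt’s actual notation; the numbers for disturbances, attenuation, and amplification are invented for the example.

```python
# Toy illustration of Ashby's Law of Requisite Variety: a regulator can hold
# outcomes to a small target set only if its variety (distinct responses) is
# at least the variety of the disturbances it must counter.

def residual_variety(disturbances: int, responses: int) -> int:
    """Best-case number of distinct outcomes left unregulated.

    With R distinct responses against D distinct disturbances, at best
    ceil(D / R) outcome states remain -- "only variety can destroy variety".
    """
    return -(-disturbances // responses)  # ceiling division

# Variety engineering (illustrative figures, not from the sources):
raw_disturbances = 1000
attenuated = raw_disturbances // 10   # attenuator: filter out 90% of incoming distinctions
responses = 5 * 20                    # amplifier: 5 managers x 20 delegated response rules

print(residual_variety(raw_disturbances, 5))   # unaided management: 200 outcomes uncontrolled
print(residual_variety(attenuated, responses)) # engineered: 1 outcome, i.e. full regulation
```

The point of the sketch is that attenuation and amplification work multiplicatively from opposite sides, which is why Beer treats them as the two levers of variety engineering.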

2. The “Soft Systems” & “Wicked Problem” Thinkers: Checkland, Churchman, Rittel, and Ackoff

This group rejects the idea of “solving” complexity through engineering or optimization, viewing complex human issues as “messes” or “wicked problems” that require continuous learning and debate.

Checkland and Churchman: Peter Checkland explicitly builds on Churchman’s work, shifting from a “hard” view of the world to a “soft” view where systems are learning processes[6]. Both argue that dealing with complexity requires unfolding worldviews (or Weltanschauungen) and inclusion of multiple perspectives (“sweeping in”) rather than seeking optimal solutions[7],[8].

Ackoff and Checkland: Both define complexity as a “mess” (a system of interacting problems) rather than a simple difficulty[9],[10]. They agree that the component problems cannot be solved in isolation; Checkland advises a “learning cycle”[8], while Ackoff suggests “dissolving” problems through Idealized Design[11].

Rittel and the Group: Rittel’s concept of “wicked problems” (which have no definitive formulation or stopping rule) underpins the logic used by Churchman and Checkland[12],[13]. All four authors advise abandoning linear “first-generation” or reductionist approaches in favor of participatory, argumentative, or dialectical processes[14],[15],[8],[11].

3. The Experimentalists: Snowden, Juarrero, and Cilliers

These authors share a view of complexity as dynamic, non-linear, and emergent, arguing that rigid rules fail and must be replaced by probing, constraints, and modesty.

Snowden and “Claude Shannon”: The source titled Claude Shannon explicitly draws on the Cynefin Framework associated with Dave Snowden[16],[17]. Both sources advise against “best practice” in complex domains, recommending instead a “Probe-Sense-Respond” approach using safe-to-fail experiments[18],[19],[20].

Juarrero and Snowden: Both emphasize that complex systems are shaped by constraints rather than direct causal force. Juarrero advises altering “enabling or governing constraints” to channel behavior[21], while Snowden advocates managing “starting conditions” and using enabling constraints to allow patterns to emerge[22].

Cilliers and the Group: Cilliers aligns with this group by rejecting the ability to have a “total picture” and advocating for “modesty” and distributed control[23],[24]. This mirrors Snowden’s warning against the arrogance of expert diagnosis and the need for distributed cognition[25],[21].

4. The Cognitive Filterers: Simon, Wilk, and TOG

This cluster views complexity largely as a result of human cognitive limitations, advising that we cope by ignoring, filtering, or “satisficing” rather than trying to model everything.

Simon and TOG: Herbert Simon argues that we must “satisfice” (find a solution that is good enough) because of bounded rationality[26]. The TOG source explicitly aligns with this, advising practitioners to “master the art of ignoring variables” and use “minimalist interventions” rather than “boiling the ocean”[27].
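Satisficing has a simple algorithmic shape that contrasts cleanly with optimization. The sketch below is an assumption-laden illustration of the idea, not Simon’s formal model; the candidate scores and the aspiration threshold are invented.

```python
# Sketch of Simon's "satisficing" under bounded rationality: rather than
# scanning every option for the optimum, stop at the first option that
# clears an aspiration level -- trading optimality for tractability.

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level, or None."""
    for opt in options:
        if score(opt) >= aspiration:
            return opt  # "good enough" -- stop searching here
    return None

candidates = [3, 7, 12, 25, 9]
result = satisfice(candidates, score=lambda x: x, aspiration=10)
print(result)  # 12 -- the first candidate clearing the bar, not the maximum (25)
```

An optimizer would examine all five candidates and return 25; the satisficer inspects only three, which is exactly the economy of attention that bounded rationality demands.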

Wilk and TOG: James Wilk argues that complexity is a “perceptual bug” and advises filtering complexity through “rule-out” questioning rather than trying to model it with boxes and arrows[28],[29]. TOG cites Wilk directly, agreeing that complexity is an observer phenomenon and that the goal is to filter out the “muddle”[30],[31].

Simon and Pirsig: Both suggest dealing with complexity by breaking tasks down to manage cognitive load. Simon calls for “hierarchic decomposition” (boxes-within-boxes)[32], while Robert Pirsig suggests “scaling down” scope and using a “slip system” to chunk information and empty the mind[33],[34].

5. The Narrative & Metaphor Advocates: Allen, Flood, and Lakoff

These authors argue that when formal models fail due to complexity, we must rely on story and metaphor to maintain coherence.

Tim Allen and Alicia Juarrero: Both explicitly identify narrative as the essential tool for complexity. Allen argues narrative is robust enough to handle the contradictions of complex systems[35], while Juarrero suggests “hermeneutical narratives” to explain the unique history of a system[36].

Flood and Lakoff: Robert Flood advises using “systemic metaphors” (machine, organism, brain) as filters to organize thinking about complexity[37]. Similarly, George Lakoff argues that coping with complexity requires a “constant shifting of metaphors” because no single model can capture the whole reality[38].