Based on the comprehensive set of sources provided, a clear, singular definition of “systems thinking” versus “complexity science” does not exist. Instead, the sources present a series of binary distinctions or “cleavages” that define the two fields in opposition to—or as an evolution of—one another.
The common themes that emerge center on where the system exists (in the mind vs. in the world), the state of the system (stability vs. instability), and the aim of the observer (control vs. understanding).
Here are the four dominant themes that define these fields, followed by an assessment of whether they provide a clear definition.
1. The Locus of Systemicity: Epistemology vs. Ontology
The most profound philosophical distinction concerns whether a “system” is a mental tool used to understand the world (Epistemology) or a physical entity existing in the world (Ontology).
• Systems Thinking (Epistemological): In the “soft” systems tradition (Checkland, Ackoff, Churchman), systems thinking is a process of inquiry[1]. It is not a description of the world, but a mental construct or “epistemological device” used by an observer to organize thinking about a “messy” reality[2],[3]. For theorists like Ackoff, complexity is subjective; it is a property of the observer’s lack of understanding, not a property of the reality itself[4].
• Complexity Science (Ontological): Conversely, complexity science often treats systems as objective realities[5]. It assumes that “complex adaptive systems” (like ant colonies or economies) exist “out there” and can be modeled[6]. Complexity is viewed as an intrinsic property of the system (e.g., non-simulability or high interconnectivity), regardless of the observer[7].
2. The Nature of Order: Equilibrium vs. Far-from-Equilibrium
This theme traces the fields’ roots in physics and biology, distinguishing them by how they view stability and change.
• Systems Thinking (Stability): Traditional systems thinking (rooted in cybernetics and General Systems Theory) focuses on homeostasis and equilibrium[8],[9]. It emphasizes “negative feedback” loops that dampen change to maintain a steady state[10]. It views systems as “trivial machines” or “ordered” entities where inputs lead to predictable outputs, aiming for stability and “fail-safe” operation[11],[12],[13].
• Complexity Science (Emergence): Complexity science focuses on systems operating far-from-equilibrium or at the “edge of chaos”[11],[14]. It emphasizes “positive feedback” loops that amplify small changes (the butterfly effect), leading to spontaneous emergence and self-organization[15],[10]. Here, the focus shifts from maintaining stability to understanding how novel forms (morphogenesis) arise from instability[16],[17].
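The contrast between damping and amplifying loops can be sketched numerically. The following is a minimal illustration, not drawn from the sources: the update rule, gain values, and names are assumptions chosen only to show how one mechanism (feedback on an error term) yields either stability or runaway amplification depending on its sign.

```python
def simulate(x0: float, gain: float, setpoint: float = 0.0, steps: int = 20) -> float:
    """Iterate x_{t+1} = x_t + gain * (setpoint - x_t).

    A positive gain below 2 acts as negative feedback: each step shrinks
    the deviation from the setpoint (homeostasis). A negative gain acts
    as positive feedback: each step amplifies the deviation.
    """
    x = x0
    for _ in range(steps):
        x = x + gain * (setpoint - x)
    return x

# Negative feedback damps an initial deviation of 10 toward the setpoint.
stable = simulate(x0=10.0, gain=0.5)     # deviation shrinks by half each step
# Positive feedback amplifies the identical deviation.
unstable = simulate(x0=10.0, gain=-0.5)  # deviation grows by half each step
```

The point of the sketch is that the two regimes differ only in the sign of one parameter: the "steady state" behavior traditional cybernetics studies and the runaway amplification complexity science studies are two faces of the same loop structure.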
3. The Capacity for Management: Control vs. Adaptation
This theme defines the fields by their relationship to predictability and human agency.
• Systems Thinking (Design & Control): Systems thinking is often described as normative and teleological (purpose-driven)[2],[18]. It operates on an “engineering metaphor,” assuming that leaders can design ideal future states and steer the system toward them[19]. It seeks to “dissolve” problems through synthesis and design, aiming for optimization or improvement[20],[21].
• Complexity Science (Description & Adaptation): Complexity science is often descriptive, explaining how order emerges without seeking to design it[20],[22]. It argues that because causality is only coherent in retrospect (dispositional states), rigid control is impossible[23],[24]. Instead of design, it advocates for adaptation, “safe-to-fail” experimentation, and navigating environments where the future is unknowable[19],[25].
4. Methodological Rigor: Simulability vs. Narrative
A technical distinction emerges regarding how these systems can be modeled.
• Systems Thinking (Simulable/Simple): In the strict definition provided by Robert Rosen and Tim Allen, systems thinking deals with “simple” systems, defined as those that can be fully simulated or computed[26],[27]. If a system can be captured by a formal model or algorithm, it belongs to the domain of systems thinking[7].
• Complexity Science (Non-Simulable/Narrative): By contrast, “complexity” is defined as the class of systems that are non-simulable; they have no “largest model” that captures all their behaviors[26]. Because mathematical models fail to capture their contradictions and irreducible uncertainty, complexity science often relies on narrative rather than computation to bridge the gaps between different levels of the system[28].
Do these provide a clear definition?
No, a single unified definition does not emerge. Instead, the sources present two conflicting narratives about the relationship between these fields:
Narrative A: Complexity is a Subset of Systems Thinking
In this view, “Systems Thinking” is the broad, overarching discipline (the “neutral science” or “umbrella”)[29],[30]. Complexity science is merely a specific “strand” or sub-discipline within it—specifically the strand that deals with measurement, non-linearity, and mathematical modeling of dynamic behavior[31],[32]. Here, complexity is simply the “science of complexity,” while systems thinking is the broader philosophical stance[33].
Narrative B: Complexity is a Radical Break from Systems Thinking
In this view (championed by Snowden and followers of the “new science”), the two are fundamentally different paradigms[34].
• Systems Thinking is defined narrowly as “old” cybernetics—a tool for the ordered/complicated domain focused on linear causality and engineering[35],[23].
• Complexity Science is the distinct domain of the unordered/complex, requiring fundamentally different tools (probe-sense-respond) and rejecting the “machine metaphor” of traditional systems thinking[36],[37].
Conclusion
If a definition is forced from the synthesis of these sources:
• Systems Thinking is the holistic, qualitative, and often normative approach to inquiring into relationships, prioritizing the design of stable, purposeful wholes and the management of “messes” through human dialogue and synthesis[38],[39].
• Complexity Science is the quantitative, descriptive, and non-linear study of how interactions between agents produce emergent, unpredictable behaviors, prioritizing the understanding of self-organization and instability over control[40],[41].
References
[1] C. West Churchman.md
[2] C. West Churchman.md
[3] OU Course Material.md
[4] Russ Ackoff.md
[5] OU Course Material.md
[6] Peter Checkland.md
[7] Tim Allen.md
[8] Alicia Juarrero.md
[9] MC Jackson.md
[10] Max Boisot.md
[11] Alicia Juarrero.md
[12] Dee Hock.md
[13] Horst Ritell.md
[14] MC Jackson.md
[15] Alicia Juarrero.md
[16] Gregory Bateson.md
[17] Robert Flood.md
[18] C. West Churchman.md
[19] Dave Snowden.md
[20] C. West Churchman.md
[21] Russ Ackoff.md
[22] Humberto Maturana.md
[23] Dave Snowden.md
[24] Claude Shannon.md
[25] Mike McMaster.md
[26] Relational Biologists - Robert Rosen Howard Pattee Dennis Noble.md
[27] Tim Allen.md
[28] Tim Allen.md
[29] John Warfield.md
[30] MC Jackson.md
[31] John Warfield.md
[32] James Ladyman and Ross Ashby.md
[33] Stafford Beer.md
[34] MOM.md
[35] Claude Shannon.md
[36] MOM.md
[37] Relational Biologists - Robert Rosen Howard Pattee Dennis Noble.md
[38] OU Course Material.md
[39] Russ Ackoff.md
[40] OU Course Material.md
[41] James Ladyman and Ross Ashby.md
