Based on the provided texts, the authors universally reject the traditional, mechanistic pursuit of mathematical optimization and rigid stability, arguing that these concepts are dangerous when applied to complex human and ecological systems. Instead, they reframe stability as a dynamic, adaptive state and replace optimization with resilience, “satisficing,” and continuous learning.

Here is how the authors understand these concepts and how they relate to theories of organizational drift (such as Jens Rasmussen’s).

1. The Rejection of Optimization

In classical “hard” systems thinking and traditional management science, the goal is optimization: finding the single most efficient means to achieve a pre-defined objective[1]. The complexity theorists and “soft” systems thinkers in this collection argue that optimization is both computationally impossible and systemically destructive:

• Herbert Simon (Satisficing vs. Optimizing): Simon points out that because of “bounded rationality,” humans lack the computational capacity to find a globally optimal solution in a complex environment[2][3]. Instead of optimizing, humans and organizations must “satisfice”—search for alternatives until they find one that is “good enough” to satisfy immediate constraints[3][4].
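
Simon’s contrast can be made concrete in a few lines of code. The sketch below is our own illustration (the function name and aspiration level are invented, not Simon’s): an “optimizer” must scan every option to find the global maximum, while a satisficer stops at the first option that clears a “good enough” threshold.

```python
import random

def satisfice(options, aspiration):
    """Search options in order and stop at the first one that is
    "good enough" (meets the aspiration level) -- Simon's satisficing."""
    for examined, value in enumerate(options, start=1):
        if value >= aspiration:
            return value, examined       # stop immediately: no global search
    return max(options), len(options)    # nothing satisficed; fall back

random.seed(42)
options = [random.random() for _ in range(10_000)]

best = max(options)                              # "optimizing" scans all 10,000
good_enough, examined = satisfice(options, 0.9)  # satisficing stops early

print(f"optimum {best:.3f} after {len(options):,} evaluations")
print(f"satisficed at {good_enough:.3f} after only {examined} evaluations")
```

The satisficer accepts a slightly worse answer in exchange for a drastically smaller search, which is exactly the trade bounded rationality forces.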

• Donella Meadows and Nassim Taleb (Efficiency breeds Fragility): Meadows warns that optimizing a system for a single variable (like “maximum sustainable yield”) strips the system of its natural diversity and variation, ultimately causing it to collapse[5]. Taleb echoes this, arguing that top-down engineering and the pursuit of efficiency remove vital redundancies, making the system “fragile” and highly vulnerable to unpredictable “Black Swan” events[6][7].
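
Meadows’s “maximum sustainable yield” warning can be illustrated with a standard logistic-growth model (a minimal sketch with arbitrary parameters, not taken from the texts). Harvesting at exactly the MSY quota leaves the stock on a knife-edge equilibrium at half the carrying capacity: any downward shock eliminates the growth surplus and the population collapses, while a modest margin below MSY restores a recovery buffer.

```python
def simulate(n0, r, K, harvest, steps=500):
    """Discrete logistic growth minus a fixed harvest quota.
    Returns the final stock, or 0.0 if it ever collapses."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / K) - harvest
        if n <= 0:
            return 0.0  # stock driven extinct
    return n

r, K = 0.5, 1000.0
msy = r * K / 4  # the quota that maximizes steady-state yield

# At exactly MSY the only equilibrium (K/2) is a knife edge:
print(simulate(K / 2, r, K, msy))             # unperturbed: hangs on at K/2
print(simulate(K / 2 - 10, r, K, msy))        # small shock: total collapse
print(simulate(K / 2 - 10, r, K, 0.9 * msy))  # 10% slack: recovers a buffer
```

Optimizing the yield variable has stripped out the very margin (the gap between growth and extraction) that absorbed variation.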

• H. William Dettmer & Russ Ackoff (The Trap of Local Optimization): Applying analytical optimization to local parts of a complex system (e.g., maximizing efficiency in one specific department) ignores their interdependence and usually degrades the performance of the overall system[8][9].

2. The Reconceptualization of Stability

If optimization is discarded, what does a successful system look like? The authors redefine stability not as a static, immovable state, but as a dynamic process of continuous adaptation.

• Cybernetic Homeostasis (Ashby & Beer): Early cybernetic thinkers view stability as “ultrastability” or “homeostasis”[10][11]. A system is stable not because it doesn’t move, but because it utilizes feedback loops to continuously correct errors and maintain its “essential variables” within viable physiological limits despite massive environmental shocks[12][13].
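
The feedback logic of ultrastability can be sketched as a toy homeostat (our own illustration, not Ashby’s actual device; the setpoint and gain values are arbitrary): an essential variable is buffeted by random disturbances, and a negative feedback loop corrects a fraction of each deviation, holding the variable inside viable limits without ever holding it still.

```python
import random

def run_homeostat(setpoint=37.0, gain=0.6, steps=200, seed=1):
    """An essential variable buffeted by random shocks; a negative
    feedback loop corrects a fraction (gain) of each deviation."""
    rng = random.Random(seed)
    value = setpoint
    worst = 0.0
    for _ in range(steps):
        value += rng.uniform(-0.5, 0.5)      # environmental disturbance
        value -= gain * (value - setpoint)   # corrective feedback
        worst = max(worst, abs(value - setpoint))
    return value, worst

final, worst = run_homeostat()
print(f"final value {final:.2f}, worst excursion {worst:.2f}")
# The variable never strays far from the setpoint -- not because nothing
# moves, but because every deviation is continuously corrected.
```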

• Metastability and Far-From-Equilibrium (Complexity Science): Thinkers like Alicia Juarrero, James Ladyman, and Paul Cilliers argue that classical “equilibrium” (perfect stability) actually means thermal death (entropy)[14][15]. Complex systems must operate far-from-equilibrium[16]. They achieve metastability—a state poised delicately between rigid order (which prevents adaptation) and total chaos (which causes dissolution)[17][18].
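
The “poised between rigid order and chaos” idea is often illustrated with the logistic map (a loose, standard illustration, not drawn from these authors): low growth rates freeze the system onto a single fixed point, high rates dissolve it into chaos, and complex structure lives in the narrow band between.

```python
def logistic_orbit(r, x0=0.3, burn=500, keep=50):
    """Iterate the logistic map x -> r*x*(1-x) and return the long-run orbit."""
    x = x0
    for _ in range(burn):  # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return orbit

# r=2.8:  rigid order -- the orbit freezes onto one fixed point
# r=3.99: near-total chaos -- the orbit never settles
# r=3.57: the narrow band between, where complex structure appears
for r in (2.8, 3.57, 3.99):
    print(r, len(set(logistic_orbit(r))), "distinct long-run values")
```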

• Relationship-Maintaining (Vickers): Geoffrey Vickers shifts the focus from “goal-seeking” (stopping once an optimized state is reached) to “relationship-maintaining,” recognizing that social stability requires continuously balancing an evolving web of internal and external relationships over time[19][20].

3. Connection to Rasmussen and Organizational Drift

The critique of optimization and the need for dynamic stability map directly onto Cognitive Systems Engineering and Jens Rasmussen’s theories of human performance and organizational drift.

John Flach explicitly integrates Jens Rasmussen’s SRK Framework (Skills, Rules, Knowledge) to explain how humans navigate complex work domains[21]. The systemic understanding of organizational drift (often framed as the “drift to danger” or “normalization of deviance”) is a natural consequence of the dynamics described above:

• Optimization Causes Drift: When organizations relentlessly pursue optimization and efficiency, they naturally try to remove what they perceive as “waste.” However, Flach points out that this often removes “essential friction”—the checks, balances, and social negotiations that slow things down[22]. While friction looks like inefficiency, it is actually the buffer that prevents catastrophic errors from cascading[22][23]. By optimizing away this friction, the organization unconsciously drifts toward the boundary of unsafe operations.

• Dynamic Complexity vs. Checklists: Rasmussen and Flach note that complex sociotechnical systems cannot be managed purely by rigid rules. When unpredictable anomalies occur, workers must shift from “Rule-based” behavior to “Knowledge-based” analytical problem-solving[21]. If management has over-optimized the system with strict, “fail-safe” scripts (dumbing down the front line, as John Seddon also critiques[24]), workers lose the requisite variety needed to safely navigate the boundaries of the system.

• Muddling Through vs. Drift: To counteract the drift caused by blind optimization, Flach advocates for “Muddling Through” (Incrementalism)[21]. Because the environment is highly uncertain, workers at the “sharp end” must be empowered to make small, abductive guesses, test them, and use the errors (friction/surprises) as feedback to continually steer the system back to safety[25][26].

• Self-Organized Criticality: Patrick Hoverstadt warns that organizations driven by tight couplings and the momentum of unquestioned structural optimization are prone to “Self-Organized Criticality”—a state where the system drifts into such rigidity that a tiny perturbation causes a total crash[27][28].
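
Self-organized criticality is classically demonstrated with the Bak-Tang-Wiesenfeld sandpile model (a standard illustration from physics; Hoverstadt’s argument concerns organizations, not sand): drop identical grains one at a time, and once the pile has driven itself to the critical state, the same tiny input triggers cascades of wildly varying size.

```python
import random

def drop_grain(grid, size, rng):
    """Bak-Tang-Wiesenfeld sandpile: add one grain at a random site, then
    topple every site holding 4+ grains (one grain to each neighbour;
    grains at the edge fall off). Returns the avalanche size."""
    x, y = rng.randrange(size), rng.randrange(size)
    grid[x][y] += 1
    topples = 0
    unstable = [(x, y)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue  # already relaxed by an earlier topple
        grid[i][j] -= 4
        topples += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < size and 0 <= nj < size:
                grid[ni][nj] += 1
                unstable.append((ni, nj))
    return topples

rng = random.Random(0)
size = 20
grid = [[0] * size for _ in range(size)]
sizes = [drop_grain(grid, size, rng) for _ in range(20_000)]
print("largest avalanche from one identical grain:", max(sizes))
```

Most drops cause nothing; a few cause system-spanning avalanches. The hazard is not in any one grain but in the critical state the system has drifted into.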

In summary, the authors agree that attempting to rigidly optimize a system for a specific, static goal blinds management to the loss of resilience. This blind spot allows the organization to systematically drift toward failure, which can only be prevented by embracing messy, dynamic metastability, preserving essential friction, and empowering local human adaptation.