Question

How is the Law of Requisite Variety fundamental to dealing with complexity, and what are the issues with writers who do not incorporate it into their thinking?

Based on the sources, the Law of Requisite Variety (Ashby’s Law) is the immutable physics of management and systems theory. It provides the mathematical proof for why “command and control” fails in complex environments and defines the only two viable strategies for survival: attenuation (simplifying the world) and amplification (empowering the response).

Here is how the law is fundamental to dealing with complexity and the specific risks associated with approaches that ignore it.

1. How the Law of Requisite Variety is Fundamental

Ashby’s Law, succinctly stated by Stafford Beer and Patrick Hoverstadt, dictates that “only variety can destroy variety”[1],[2]. In simple terms, for a system to be stable, the number of states its control mechanism can adopt (its variety) must be equal to or greater than the number of states the system itself can exhibit[2].
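The arithmetic behind this can be made concrete. As a minimal counting sketch (illustrative numbers, not drawn from the sources): at best, a regulator partitions the disturbances among its responses, so the variety of outcomes it must tolerate is bounded below by ⌈V_D / V_R⌉.

```python
import math

def min_outcome_variety(disturbance_states: int, regulator_states: int) -> int:
    """Counting form of Ashby's Law: the best a regulator can do is
    divide the disturbances among its responses, so outcome variety
    cannot fall below ceil(V_D / V_R)."""
    return math.ceil(disturbance_states / regulator_states)

# A market that can present 12 distinct situations to a controller
# with only 3 distinct responses:
print(min_outcome_variety(12, 3))   # -> 4: four outcome states stay uncontrolled
print(min_outcome_variety(12, 12))  # -> 1: full regulation becomes possible
```

Only when the regulator's variety matches the disturbance variety can the outcome be held to a single state.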

This principle underpins all successful complexity management in the following ways:

A. The Necessity of Variety Engineering

Since the environment always contains more variety than the system managing it (the “Complexity Differential”)[3], managers cannot simply “control” complexity. They must engage in variety engineering by balancing two levers:

Attenuation (Filtering): Reducing the incoming variety to a manageable level. This includes creating “constitutive rules”[4], managing by exception[5], or grouping customers into segments[1]. Max Boisot calls this the “Cognitive Strategy” of filtering out noise to focus on regularities[6].

Amplification (Empowerment): Increasing the variety of the response. This is achieved not by working harder, but by delegating and using technology[5]. By giving autonomy to operational units, the organization amplifies its ability to handle diverse local problems without clogging the central management channel[1].
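A toy balance check (all numbers invented for illustration) shows the two levers operating on opposite sides of the same inequality: attenuation shrinks the environmental variety the system must face, while amplification multiplies the response variety it can mount.

```python
def requisite_variety_met(environment_variety: int, response_variety: int) -> bool:
    """Ashby's condition in its crudest form: the response repertoire
    must at least match the variety the system actually faces."""
    return response_variety >= environment_variety

# Attenuation: segmenting a million distinct customer situations
# into 8 segments means management faces 8 states, not 1e6.
attenuated_environment = 8

# Amplification: 10 autonomous teams, each with 20 distinct local
# responses, instead of 5 responses from the centre alone.
amplified_response = 10 * 20

print(requisite_variety_met(1_000_000, 5))  # False: central control is swamped
print(requisite_variety_met(attenuated_environment, amplified_response))  # True
```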

B. The Architecture of Autonomy (Recursion)

To satisfy Ashby’s Law without dissolving into chaos, organizations must use recursion. Stafford Beer’s Viable System Model (VSM) and Patrick Hoverstadt argue that viable systems must be nested within one another like Russian dolls[7],[8].

• Because no single central brain possesses enough variety to control a complex organization, control must be distributed[9]. Each sublevel (e.g., a department) must have the autonomy to handle its own local variety, passing only the “residual” variety up to the next level[8].
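A hypothetical sketch of this recursive absorption (the numbers and the `residual_variety` model are illustrative, not from the sources): each unit soaks up what it can locally, and only the remainder travels up a level.

```python
def residual_variety(local_variety: int, local_capacity: int) -> int:
    """Variety a unit cannot absorb itself is passed up one recursion level."""
    return max(0, local_variety - local_capacity)

# Three departments, each facing 100 local issue types and able to
# resolve 95 of them autonomously:
passed_up = sum(residual_variety(100, 95) for _ in range(3))
print(passed_up)  # -> 15: the centre handles 15 issue types, not 300
```

Without that local absorption, all 300 issue types would land on the central management channel at once.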

C. The Definition of Resilience

Paul Cilliers and the MOM sources argue that resilience depends on “excess diversity” (variety)[10]. If a system is too efficient (i.e., it has stripped away all redundancy and extra variety), it lacks the internal states required to match unexpected changes in the environment[11]. Therefore, “slack” is not waste; it is the requisite variety needed for survival[10].
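As a toy illustration of the point about slack (capacities and shock sizes are invented): a unit sized exactly for average demand has no state that matches an above-average shock, while a unit carrying modest slack does.

```python
def can_absorb(capacity: int, shock: int) -> bool:
    """A system 'has a state for' a shock only if its capacity covers it."""
    return shock <= capacity

shocks = [48, 50, 53, 55]  # hypothetical demand spikes around an average of 50

# Optimized exactly to the average vs. carrying 10% slack:
print([can_absorb(50, s) for s in shocks])  # [True, True, False, False]
print([can_absorb(55, s) for s in shocks])  # [True, True, True, True]
```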

2. Issues with Writers/Approaches That Do Not Incorporate It

Writers who ignore Ashby’s Law generally fall into the trap of Linear Causality or Optimization. By failing to acknowledge that the system’s variety exceeds their control capacity, they risk catastrophic failure, fragility, or “tightening the mess.”

A. The Delusion of “Root Cause” and Control (vs. Dettmer & TRIZ)

The Issue: Methodologies like the Theory of Constraints (Dettmer) and TRIZ assume that a complex system can be improved by finding a single “root cause”[12] or an “Ideal Final Result”[13].

The Requisite Variety Critique: From the perspective of Snowden and Beer, these approaches dangerously attenuate reality. They reduce a system with massive variety (interactions, feedback loops) down to a single logical chain. In a truly complex system (unlike a “complicated” production line), this reduction ignores the “Dark Variety” of unobserved states. Ladyman and Ashby warn that if the regulator’s capacity is fixed (and low), any attempt to control a high-variety system leaves that system unregulated[14].

Consequence: The “solution” works temporarily but fails to adapt when the system shifts, because the “root cause” was merely a symptom of a dynamic loop, not a static lever.

B. The Trap of Efficiency and Optimization (vs. Bureaucracy & Six Sigma)

The Issue: Traditional management often seeks to maximize efficiency and standardization (reducing the “cost” of complexity).

The Requisite Variety Critique: Paul Cilliers warns that enforcing homogeneity (removing difference) destroys the system’s ability to generate meaning and survive[10]. MOM sources explicitly state that “excessive efficiency kills processes” because it removes the redundancy required to absorb shocks[11].

Consequence: A highly optimized system has low variety. When the environment changes (presenting new variety), the optimized system has no corresponding state to match it, leading to collapse (the “Used Car Law” implies adaptation requires stress/cost, which efficiency seeks to eliminate)[15].

C. The Failure of “Mental Freefalling” (vs. Intuitive/Tacit Models)

The Issue: Some writers, like James Wilk, suggest stripping away models to look at “video descriptions” or “concrete facts”[16], and Pirsig suggests handling data on “slips”[17].

The Requisite Variety Critique: Patrick Hoverstadt and John Warfield argue that the sheer number of interaction permutations in a complex system (billions) exceeds the “channel capacity” of the human brain (the Law of Triadic Compatibility)[18],[19]. Relying on intuition or concrete details without explicit formal models (like VSM) results in “mental freefalling”[20].

Consequence: Without a structure to handle the variety (like Warfield’s “Problematique” or Beer’s VSM), the observer is overwhelmed. Warfield calls complexity the “sensation of frustration” caused by this cognitive overload[21].
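Warfield's point about channel capacity is easy to verify with elementary combinatorics (the element count of 20 is an arbitrary example, not a figure from the sources):

```python
from math import comb

n = 20  # elements in a modestly sized problem field

pairwise_relations = comb(n, 2)              # distinct pairwise interactions
triadic_relations = comb(n, 3)               # distinct three-way interactions
relation_patterns = 2 ** pairwise_relations  # ways of choosing which pairs interact

print(pairwise_relations)  # -> 190
print(triadic_relations)   # -> 1140
print(f"{relation_patterns:.1e}")  # on the order of 1e57 possible structures
```

Even twenty elements generate far more relational structure than any unaided observer can enumerate, which is why Warfield insists on explicit structuring methods.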

D. The “God’s Eye View” Fallacy (vs. Idealized Design)

The Issue: Russ Ackoff advocates “Idealized Design”, planning backward from a perfect future state[22].

The Requisite Variety Critique: This assumes that the designer can foresee and specify the future states of the system. Dave Snowden and Paul Cilliers argue that in complexity, the system is “non-computable” and emergent[23],[24]. The variety of the future is unknown.

Consequence: Attempting to impose an idealized design often fails because the design lacks the requisite variety to handle the real-world friction and emergence that occurs during implementation. Beer would argue one must manage the present balance of variety, not a future ideal[25].

Summary of the Conflict

The core conflict is between Reductionism (which ignores Ashby’s Law to make the problem solvable) and Cybernetics/Systems Theory (which accepts Ashby’s Law and redesigns the organization to cope).

Those who incorporate it (Beer, Hoverstadt, Snowden, Jackson): Accept that they cannot control the system directly. They design frameworks (VSM, Cynefin) that allow the system to control itself through distributed autonomy and loose coupling.

Those who do not (Dettmer, TRIZ, Abel): Treat the system as a puzzle with a definable solution. They attempt to force the system’s variety down to the level of their logic tools. As Ladyman and Ashby note, if you cannot increase your variety, you must “discover constraints”[14]; but if you constrain a complex system too much (treating it as Simple), Snowden warns it will collapse into Chaos[26].


References

[1] Patrick Hoverstadt.md
[2] Stafford Beer.md
[3] Niklas Luhmann.md
[4] TOG - Mastering the Muddle.md
[5] Stafford Beer.md
[6] Max Boisot.md
[7] Patrick Hoverstadt.md
[8] Stafford Beer.md
[9] Paul Cilliers.md
[10] Paul Cilliers.md
[11] MOM.md
[12] Theory of Constraints - William Dettmer.md
[13] Triz.md
[14] James Ladyman and Ross Ashby.md
[15] James Ladyman and Ross Ashby.md
[16] James Wilk.md
[17] Robert Pirsig.md
[18] John Warfield.md
[19] Patrick Hoverstadt.md
[20] Patrick Hoverstadt.md
[21] John Warfield.md
[22] Russ Ackoff.md
[23] Paul Cilliers.md
[24] Relational Biologists - Robert Rosen Howard Pattee Dennis Noble.md
[25] Stafford Beer.md
[26] Dave Snowden.md