The “Cynefin” Distraction: A Critique of Anti-Causal Management

So, I have reviewed this collection of papers and transcripts regarding the so-called “Cynefin framework” by Dave Snowden and his associates. To be frank, it appears to be an elaborate attempt to dismantle the rigorous discipline of Systems Thinking by constructing a straw man argument against “order.” The author seems determined to convince us that our models are not just incomplete, but “fundamentally flawed” because they assume causality[1].

Here is my assessment of why this “complexity” approach is problematic, despite its clever branding.

1. The Denial of Causality

The central premise here is an attack on the very foundation of System Dynamics: the ability to model cause and effect. Snowden argues that while “Complicated” systems (like a Ferrari) have discoverable causality, “Complex” systems (like a rainforest or a corporation) do not[2]. He claims these systems are “dispositional” rather than causal, meaning relationships can only be understood in “retrospect” and never predicted[2],[3]. He explicitly targets Peter Senge and the Learning Organization, dismissing them as “mysticism” or “ordered” thinking that falsely assumes a system can be designed toward a future state[4],[5]. As a systems thinker, I find this defeatist. Just because the feedback loops are non-linear doesn’t mean we should abandon the attempt to model them.

2. A Taxonomy that Claims Not to Be a Taxonomy

They present a framework—Cynefin (pronounced kuh-nevin, apparently)—which divides the world into five domains: Simple (or Clear), Complicated, Complex, Chaotic, and Disorder[6],[7].

The Right Side (Order): They graciously allow that “Best Practice” and “Good Practice” work in the Simple and Complicated domains, where cause and effect are repeatable[3],[8].

The Left Side (Un-Order): But they insist that once you cross into “Complexity,” you must abandon “Best Practice” entirely[9]. They argue that applying standard engineering or business process re-engineering here creates a “catastrophic” collapse into Chaos[6],[10].

This creates a false dichotomy. They claim it is a “sense-making framework” rather than a categorization model, yet they spend an inordinate amount of time categorizing which management styles fit which box[11],[12].

3. “Trial and Error” Rebranded as “Probes”

Because they believe you cannot define a future state in a complex system, they reject “fail-safe” design[13]. Instead, they propose “safe-to-fail” experimentation[13],[14]. They call this “Probe-Sense-Respond”[15],[16]. Essentially, instead of doing the hard work of analyzing the system’s dynamics upfront, they suggest we throw multiple small experiments at the wall to see what patterns emerge[17],[18]. They call this “managing the evolutionary potential of the present” rather than hitting targets[19],[20]. It sounds dangerously like making it up as you go along, justified by the excuse that “the system is too complex to model”[21].
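Strip away the vocabulary and the “Probe-Sense-Respond” loop reduces to parallel trial and error with selective amplification. A minimal sketch of that logic (the probe names, budgets, and scoring are my own invention for illustration, not anything taken from the source material):

```python
import random

def probe_sense_respond(probes, budget_per_probe=20, seed=0):
    """Caricature of a 'safe-to-fail' portfolio: run several cheap probes
    in parallel, sense which patterns emerge, then amplify the best
    probe and dampen the rest."""
    rng = random.Random(seed)
    observed = {}
    for name, hidden_success_rate in probes.items():
        # Probe: spend a small, survivable budget on each experiment.
        wins = sum(rng.random() < hidden_success_rate
                   for _ in range(budget_per_probe))
        # Sense: record the pattern that emerged (a success fraction).
        observed[name] = wins / budget_per_probe
    # Respond: amplify the strongest-looking probe, dampen the others.
    amplify = max(observed, key=observed.get)
    dampen = [name for name in observed if name != amplify]
    return amplify, dampen, observed

best, rest, seen = probe_sense_respond(
    {"mentoring_circles": 0.7, "new_kpi_dashboard": 0.3, "open_plan": 0.5})
```

Which is to say: analysis is deferred until after the experiments have already run on the live system, which is exactly my objection.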

4. The Attack on Data and Modeling

Snowden explicitly warns that our outcome-based targets and models are “lying to us”[22]. He argues that in healthcare and social systems, setting targets creates perverse incentives (Goodhart’s Law)[22]. While true to an extent, his solution is to rely on “distributed ethnography” and “human sensor networks”[23],[17]. He wants to replace expert interpretation with “micro-narratives” tagged by the workforce[2],[13]. He calls this “disintermediation”[2]. Effectively, he values water-cooler gossip (mass narrative) over expert analysis, claiming experts suffer from “entrained thinking” and “invisible gorilla” blindness[1],[24].
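To give the one valid concession its due, the target-gaming effect is trivially demonstrable. A toy model of Goodhart's Law (both functions and the threshold are my own illustration, not drawn from the cited papers):

```python
def true_value(effort_on_target):
    # Real outcomes improve with attention to the target only up to a
    # point; past it, gaming the metric destroys genuine value.
    return effort_on_target - 1.5 * max(0.0, effort_on_target - 5) ** 2

def measured_value(effort_on_target):
    # The reported number keeps climbing regardless: Goodhart's Law.
    return effort_on_target

efforts = range(11)
best_by_metric = max(efforts, key=measured_value)  # maximizes the target
best_by_value = max(efforts, key=true_value)       # maximizes the outcome
```

Conceding the phenomenon, however, argues for better models of the incentive structure, not for abandoning measurement altogether.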

5. “Anthro-Complexity” vs. Real Science

Finally, the author tries to separate his “social complexity” from the mathematical complexity of the Santa Fe Institute[25],[4]. He argues that humans aren’t ants; we have “identity, intelligence, and intention,” which breaks standard agent-based modeling[26]. While human agency is real, using it to declare that systems cannot be engineered is a step too far. He dismisses tools like the Balanced Scorecard as reductionist nonsense[1],[27].
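The “humans aren’t ants” claim can at least be made concrete: a standard agent-based model assumes all agents execute the same fixed rule, while an agent with intention can break the rule deliberately. A toy comparison (entirely my own construction, not a model from the source material):

```python
def step(states, contrarians=()):
    """One tick of a toy agent-based model of opinion copying."""
    majority = sum(states) > len(states) / 2
    crowded = sum(states) > 0.8 * len(states)
    out = []
    for i in range(len(states)):
        if i in contrarians and crowded:
            # "Intention": this agent breaks the shared rule on purpose
            # once conformity gets too high.
            out.append(False)
        else:
            # Ant-like agent: copy whatever the current majority does.
            out.append(majority)
    return out

ants = [True] * 7 + [False] * 3
for _ in range(3):
    ants = step(ants)                        # identical rules: unanimity

humans = [True] * 7 + [False] * 3
for _ in range(3):
    humans = step(humans, contrarians={0})   # one intentional agent
```

One line of “intention” is enough to prevent the unanimity the ant model predicts. But notice what this suggests: extend the model to include intentional agents, rather than declare modeling impossible, which is precisely what a systems engineer would do next.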

Conclusion

In summary, this material is a rhetorical fortress built to protect managers from the responsibility of planning. By labeling difficult problems “Complex” or “Unordered,” they absolve themselves of the need to understand the causal structures driving behavior. They replace the rigorous engineering of systems with the management of “patterns” and “vectors”[28],[29]. It is not a new science; it is a rationalization for the abandonment of control[18].


References

[1] 044_ Making Sense of Human Systems an interview with Professor Dave Snowden.pdf
[2] 044_ Making Sense of Human Systems an interview with Professor Dave Snowden.pdf
[3] Cynefin applying an understanding of complexity to medicine.pdf
[4] Snowden - Multi ontology sense making v2-May05.pdf
[5] snowden_archive.pdf
[6] 044_ Making Sense of Human Systems an interview with Professor Dave Snowden.pdf
[7] Systems Thinking and the Cynefin Framework Final.3 Dettmer.pdf
[8] Snowden - Good fences make good neighbors 2011.pdf
[9] 100825 Origins of Cynefin.pdf
[10] cynefin and ideas.pdf
[11] 100825 Origins of Cynefin.pdf
[12] 100825 Origins of Cynefin.pdf
[13] 044_ Making Sense of Human Systems an interview with Professor Dave Snowden.pdf
[14] Cynefin-book-SAMPLE_Ed01.pdf
[15] The-Cynefin-Mini-book-online.pdf
[16] djmarsay.wordpress on cynefin.pdf
[17] Snowden - Good fences make good neighbors 2011.pdf
[18] Snowden - Perspectives Around Emergent Connectivity Sensemakimg and Asymmetric Threat Managmenet 2006.pdf
[19] Snowden - Naturalising Knowledge Management.pdf
[20] snowden multi-ontology sense making 578-1421-1-PB.pdf
[21] The-Cynefin-Mini-book-online.pdf
[22] When dave snowden tells you your models are lying to you you listen.pdf
[23] Dave-Snowden-IEA14.pdf
[24] Snowden - Good fences make good neighbors 2011.pdf
[25] Snowden - Anticipatory Models for Counter-Terrorism.pdf
[26] Cynefin - separated by a common language stacey zimmerman and snowden.pdf
[27] Snowden - Multi-ontology sense making.pdf
[28] Systems Thinking and the Cynefin Framework Final.3 Dettmer.pdf
[29] cynefin and ideas.pdf