How can the ideas and concepts in V2 help address the issue of contested science around the cause of global warming?

The ideas and concepts in the V2 collection address contested science not by attempting to prove one side “correct” through more data, but by reframing the conflict as a predictable consequence of how humans perceive, model, and interact with complex systems.

1. Understanding the Cognitive Barrier: Systemic Causation

A primary reason the cause of global warming is contested is a fundamental cognitive mismatch.

  • Direct vs. Systemic Causation: George Lakoff argues the human brain is evolutionarily wired for “direct causation” (e.g., an agent applying force to a patient, like a boy breaking a window). Global warming involves systemic causation—diffuse, indirect, and probabilistic chains of cause and effect. People who rely solely on direct causation frames often reject the science because their brains cannot automatically process the logic of a system where a cause in one location leads to a seemingly unrelated effect elsewhere.

  • Hypocognition: The debate suffers from “hypocognition”—the absence of the conceptual frames needed to reason about multifaceted systemic shifts. Addressing the issue therefore requires “cognitive policy”: deliberately building those frames in the public mind.

2. Reframing as a “Wicked Problem”

Systems thinking classifies the causes of global warming as a wicked problem or “mess” rather than a technical “puzzle”.

  • No Stopping Rule: Unlike scientific experiments with clear end points, wicked problems have no definitive formulation and no “stopping rule”—the process of “solving” them is never truly finished.

  • Better/Worse, Not True/False: Because there is no single objective model for a wicked problem, solutions are judged as “better or worse” based on stakeholder values rather than “true or false”. This helps address contested science by shifting the goal from achieving a 100% “proven” truth to negotiating a viable path forward that addresses the most critical risks.

3. Shifting from “Truth” to “Mental Models”

The framework emphasizes that all scientific knowledge is built on models, which are inherently limited representations of reality.

  • Observer Centrality: “Anything said is said by an observer”. Different stakeholders notice different “facts” based on their appreciative settings—internal filters conditioned by their history and values. A housing developer sees global warming as a regulatory “load,” while an ecologist sees it as an existential “threat”.

  • Exposing Assumptions: Donella Meadows advises that the best way to handle contested views is to get mental models “out there where they can be shot at”. By making unspoken assumptions explicit (using pictures, lists, or equations), parties can move from attacking each other to testing the logical consistency of their combined models.
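Meadows' advice can be illustrated with a minimal sketch: encode two stakeholders' mental models as explicit, named parameters so the disagreement becomes inspectable. All values here are invented for illustration, not real climate estimates.

```python
# Minimal sketch: two stakeholders' mental models of warming, made explicit
# as code so their assumptions can be inspected and challenged directly.
# All parameter values are illustrative, not real climate estimates.

from dataclasses import dataclass

@dataclass
class MentalModel:
    name: str
    climate_sensitivity: float  # assumed degC warming per doubling of CO2
    natural_variability: float  # assumed share of warming that is natural (0-1)

    def attributed_warming(self, observed_warming_c: float) -> float:
        """Warming this model attributes to human activity."""
        return observed_warming_c * (1.0 - self.natural_variability)

models = [
    MentalModel("ecologist", climate_sensitivity=3.0, natural_variability=0.1),
    MentalModel("skeptical developer", climate_sensitivity=1.5, natural_variability=0.6),
]

observed = 1.2  # degC of observed warming (illustrative)
for m in models:
    print(f"{m.name}: attributes {m.attributed_warming(observed):.2f} degC to humans")
```

Once the models are written down like this, the argument shifts from "you are wrong" to "which `natural_variability` assumption survives testing against the evidence"—exactly the move Meadows recommends.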

4. Methodologies for Managing Conflict

The collection offers several methodologies for using conflict as a resource rather than a barrier:

  • Strategic Assumption Surfacing and Testing (SAST): This methodology identifies stakeholders and surfaces the hidden beliefs driving their positions. By plotting these assumptions on an Importance/Certainty matrix, a group can focus on the “shaky” assumptions that are most vital to the debate.

  • Hegelian Dialectic: Ian Mitroff suggests structured debates where opposing sides must argue their case using the exact same data. This forces them to realize that their conflict is not about the numbers, but about the underlying worldviews used to interpret those numbers.

  • Accommodation over Consensus: The goal is often “accommodation”—finding a version of the situation that conflicting interests can “live with” to allow progress, even if they do not agree on the underlying values.

5. Managing Uncertainty via “Safe-to-Fail” Probes

In complex domains, cause and effect are often visible only in retrospect.

  • Probe-Sense-Respond: Dave Snowden’s Cynefin framework suggests that when causality is unclear, experts cannot provide a definitive answer in advance. The approach should be to launch multiple, small, parallel, and even contradictory safe-to-fail experiments (probes) to see how the system actually reacts, allowing a more effective practice to emerge.

  • Error-Embracing: Rather than feigning certainty, practitioners should adopt a posture of “error-embracing,” using surprises and anomalies as vital feedback to update their models.
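The probe-sense-respond loop can be sketched as a toy simulation: launch several small, even contradictory probes, sense the (noisy) feedback, then amplify what works and dampen the rest. The probe names and the response function are stand-ins for real observation, not a policy model.

```python
# Toy sketch of probe-sense-respond: run several parallel "safe-to-fail"
# probes, sense which ones the (simulated) system rewards, then amplify
# the best and dampen the rest. The response function is a stand-in for
# real-world observation, with invented baseline values and added noise.

import random

random.seed(0)  # fixed seed so the sketch is reproducible

def system_response(probe: str) -> float:
    """Simulated noisy feedback; in practice this comes from observation."""
    base = {"carbon tax pilot": 0.6, "local retrofit grants": 0.4,
            "do-nothing control": 0.1, "deregulation pilot": 0.2}
    return base[probe] + random.uniform(-0.15, 0.15)

probes = ["carbon tax pilot", "local retrofit grants",
          "do-nothing control", "deregulation pilot"]

results = {p: system_response(p) for p in probes}
amplify = max(results, key=results.get)
dampen = [p for p in probes if p != amplify]
print("amplify:", amplify)
print("dampen:", dampen)
```

Note that even contradictory probes (a tax pilot and a deregulation pilot) run side by side: the experiment, not prior certainty, decides which practice emerges.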

6. Challenging the “Environmental Fallacy”

Contested science often arises from drawing boundaries too narrowly.

  • Suboptimization: The environmental fallacy occurs when a problem is defined in a way that ignores its impact on the broader environment. Solving one localized part of the problem (e.g., carbon emissions alone) without considering the economic and political “enemies” it provokes often leads to “Fixes That Fail”.

  • “Sweeping In”: A Singerian approach demands continuously “sweeping in” new variables—ethics, politics, and the perspectives of future generations—to ensure that a “precise” solution to the wrong problem is not adopted.