Based on the sources provided, the detection of “weak signals” is not merely a matter of an individual’s sensory capability; rather, it is primarily an emergent artifact of the “net”—the framing, boundary judgments, and tools chosen by the observer to filter a complex reality.
While individuals have inherent cognitive limits, it is the “net” cast by organizational systems and modeling techniques that determines which signals are captured and which are discarded as “noise.”
1. Individual Capability: The Bounded Filter
Individual capability is a factor, but it is characterized more by limitations than by raw sensory power.
• Bounded Rationality: Human beings act “intentionally rational, but only limitedly so,” as there are strict limits to the amount of information the human mind can process[1][2].
• Selective Perception: An individual’s “value system” and unique history act as an internal filter, directing them to notice certain characteristics in their world while ignoring others[3][4].
• The “Mental Set”: Individuals often operate within a “mental set” where their thinking follows established lines, making it difficult to perceive explanations or consequences that fall outside that initial frame[5][6].
2. The “Net”: Boundary Judgments and Task Constraints
The sources suggest that what is detected is largely determined by the “spectacles” through which an observer chooses to view a situation[7][8].
• Boundary Judgments: The observer must draw a boundary between the organization and its environment[9][10]. Any data falling outside this chosen boundary is designated as “uncontrollable” or “uncertainty” and is often ignored by the formal system[11][12].
• Procedural Biases: Organizational “nets”—such as costing systems, management information systems (MIS), and reward structures—ensure that managerial attention is limited and biased[13][14].
• Information System Constraints: Traditional MIS often capture only what can be quantified (hard data), effectively “selecting out” the qualitative “soft” signals that are often the “nub of the issue”[15][16].
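As a toy illustration of that last point (the field names and records below are invented for this sketch, not drawn from the sources), a reporting pipeline that keeps only quantifiable fields silently drops the qualitative remark that may be the nub of the issue:

```python
# Two hypothetical unit reports mixing "hard" numbers and a "soft" remark.
reports = [
    {"unit": "plant A", "output": 1120, "defect_rate": 0.02,
     "remark": "veteran operators hint they may leave if shifts change"},
    {"unit": "plant B", "output": 980, "defect_rate": 0.05, "remark": ""},
]

def hard_data_only(report):
    """A 'net' that captures only what can be quantified."""
    return {k: v for k, v in report.items() if isinstance(v, (int, float))}

filtered = [hard_data_only(r) for r in reports]
print(filtered[0])   # → {'output': 1120, 'defect_rate': 0.02}
# The 'remark' field, often the nub of the issue, never reaches the MIS.
```

The filter is not malicious; it is simply a boundary judgment baked into the system's design, which is exactly how such "nets" bias attention.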
3. Station Points: The Observer’s Perspective
Where an observer stands (their “station point”) fundamentally changes the nature of the signal detected.
• Role-Specific Interpretation: A signal that is perceived as a “mandate” (a goal) by one person may be viewed as a “constraint” (a tail or obstacle) by another, depending on their position in the organization[17][18].
• Subjective Reality: There is no “objective” problem; a signal only becomes a “problem” or an “issue” when an individual places a personal construction on events[19][20].
4. Detecting Weak Signals Through Designed Nets
To capture these outliers, the sources recommend using specific Problem Structuring Methods to broaden the “net”:
• The “Oracle Question”: By asking what questions one would pose to someone who could foretell the future, observers can surface “weak signals” that are currently known but missing from their formal models[21][22].
• Cognitive Mapping: This technique creates an explicit “mirror” of an individual’s thinking, allowing them to see the structural properties of their own arguments and identify “busy” nodes or feedback loops that were previously imperceptible[23][24].
• Psychological Negotiation: Merging individual maps into a group “net” allows a team to see a “system of systems,” revealing synergies and contradictions that no single person’s sensory capability could detect alone[25][26].
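The mapping and merging steps above can be sketched as operations on a directed graph: concepts are nodes, causal links are edges, "busy" nodes are those with high total degree, and feedback loops appear as cycles. The concept labels and the degree threshold below are invented for illustration, not taken from the sources; merging is modeled simply as graph union.

```python
from collections import defaultdict

def busy_nodes(edges, threshold=3):
    """Nodes whose total degree (in + out) meets the threshold."""
    degree = defaultdict(int)
    for src, dst in edges:
        degree[src] += 1
        degree[dst] += 1
    return {n for n, d in degree.items() if d >= threshold}

def has_feedback_loop(edges):
    """Return True if the directed graph contains a cycle (DFS colouring)."""
    adj = defaultdict(list)
    nodes = set()
    for src, dst in edges:
        adj[src].append(dst)
        nodes.update((src, dst))
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in nodes}

    def dfs(n):
        colour[n] = GREY
        for m in adj[n]:
            if colour[m] == GREY:                 # back edge: a loop
                return True
            if colour[m] == WHITE and dfs(m):
                return True
        colour[n] = BLACK
        return False

    return any(colour[n] == WHITE and dfs(n) for n in nodes)

# Two individual maps (hypothetical concept labels).
map_a = [("low morale", "high turnover"),
         ("high turnover", "recruitment cost"),
         ("recruitment cost", "budget pressure"),
         ("overtime", "budget pressure")]
map_b = [("budget pressure", "pay freeze"),
         ("pay freeze", "low morale")]

merged = map_a + map_b                            # group "net" as graph union
print(has_feedback_loop(map_a))   # → False  (no loop in one person's map)
print(has_feedback_loop(merged))  # → True   (merged map reveals the loop)
print(busy_nodes(merged))         # → {'budget pressure'}
```

Note that the feedback loop and the busy node only become visible once the two maps are merged, which is the point of treating the group map as a "system of systems."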
In summary, weak signals are usually found in the “variety” that a system fails to explicitly capture[11][27]. If you cast a “net” designed only to catch quantifiable, hard data, you will inevitably miss the qualitative outliers that often drive grand challenges.
Does this perspective on the “net” resonate with your experience? Have you noticed that the reporting tools or “procedures” in your own organization tend to filter out important qualitative signals?
References
[1] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[2] Eden et al. (1979), Thinking in Organizations, Macmillan.
[3] Eden et al. (1979), Thinking in Organizations, Macmillan.
[4] Eden et al. (1983), Messing About in Problems, Pergamon Press.
[5] Eden et al. (1979), Thinking in Organizations, Macmillan.
[6] Eden et al. (1979), Thinking in Organizations, Macmillan.
[7] Eden et al. (1983), Messing About in Problems, Pergamon Press.
[8] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[9] Ackermann et al. (2005), The Practice of Making Strategy, SAGE.
[10] Eden et al. (1979), Thinking in Organizations, Macmillan.
[11] Eden & Harris (1975), Management Decision and Decision Analysis, Macmillan London Ltd.
[12] Eden & Harris (1975), Management Decision and Decision Analysis, Macmillan London Ltd.
[13] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[14] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[15] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[16] Eden, Use of cognitive maps to structure problems.
[17] Ackermann et al. (2005), The Practice of Making Strategy, SAGE.
[18] Ackermann et al. (2005), The Practice of Making Strategy, SAGE.
[19] Eden et al. (1983), Messing About in Problems, Pergamon Press.
[20] Eden et al. (1983), Messing About in Problems, Pergamon Press.
[21] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[22] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[23] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[24] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[25] Eden, Use of cognitive maps to structure problems.
[26] Eden & Ackermann (1998), Making Strategy, Sage Publications.
[27] Eden & Harris (1975), Management Decision and Decision Analysis, Macmillan London Ltd.
