Based on the sources, the detection of “weak signals” or outliers is primarily an emergent artifact of the “net”—the representation, task constraints, and station points chosen by the observer—rather than a mere property of raw sensory capability. While human sensory capacity sets a biological ceiling, the ability to distinguish a meaningful signal from “noise” is determined by how the problem is framed and the internal “discrimination nets” of the observer.

1. Representation as the “Net”

The way information is organized (the “net”) determines which patterns or outliers are “perceptually obvious” and which remain hidden[1][2].

Diagrammatic vs. Sentential Frames: A diagrammatic representation groups information by location, allowing visual entities like discontinuities or maxima (outliers) to be “readily recognized,” whereas a sentential (list-like) representation of the same data makes them difficult to detect[1][3].

Explicitness: A good representation makes certain inferences explicit that were previously only implicit, effectively “filtering” the environment so the solver can find the signal without extensive search[1][4].
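The effect of the "net" on outlier detection can be illustrated with a toy sketch (the dataset and helper below are hypothetical, not from the sources): the same readings encoded sententially (an unordered list of facts) versus "diagrammatically" (indexed by location, so adjacency is explicit).

```python
# Sentential form: unordered facts; the jump is implicit and requires search.
sentential = [("t=3", 9.8), ("t=0", 1.0), ("t=2", 3.1), ("t=1", 2.0), ("t=4", 10.1)]

# Diagrammatic form: organized by location, so adjacency is explicit.
by_location = [1.0, 2.0, 3.1, 9.8, 10.1]

def find_jump(series, threshold=3.0):
    """Return the index where consecutive values jump by more than threshold."""
    for i in range(1, len(series)):
        if series[i] - series[i - 1] > threshold:
            return i
    return None

# In the indexed form the discontinuity is found in one pass over neighbors;
# the sentential form must first be reorganized (sorted) before the same
# inference becomes available at all.
reordered = [value for _, value in sorted(sentential, key=lambda kv: kv[0])]

print(find_jump(by_location))
print(find_jump(reordered))
```

The information content is identical in both encodings; only the representation changes how much search stands between the observer and the outlier.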

2. The Role of “Station Points” (Organizational and Role-Based Framing)

Simon argues that what an observer detects is heavily influenced by their “station point” or role, which serves as a cognitive filter.

Selective Perception: People perceive in a complex stimulus what they are “ready” to perceive; the more ambiguous the stimulus, the more the perception is determined by the observer’s internal state[5].

Internalized Filters: In studies of industrial executives, each perceived those aspects of a situation that related specifically to the goals of their own department[6]. Thus, a “weak signal” regarding human relations might be detected by a medical executive but missed by a legal executive looking at the same “raw” data[6].

World-View Bias: Decision-makers acquire a representation of a situation that focuses on their specific subgoals, interpreting partial information through a “world view” that shields them from other potentially relevant signals[7].
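A station point acting as a cognitive filter can be sketched as follows (the roles, cue words, and report lines are illustrative assumptions, not drawn from the studies cited): each observer keeps only the parts of a shared stimulus that match their internalized cues.

```python
# One shared "raw" report, seen by observers with different station points.
REPORT = [
    "inventory turnover fell 4% last quarter",
    "two supervisors resigned citing morale problems",
    "a supplier contract clause may expose us to liability",
]

# Hypothetical role-specific cue words standing in for internalized goals.
STATION_POINTS = {
    "sales": ["turnover", "quarter"],
    "human_relations": ["morale", "resigned"],
    "legal": ["contract", "liability"],
}

def perceive(role, report):
    """Return only the lines whose cues match the role's internal filter."""
    cues = STATION_POINTS[role]
    return [line for line in report if any(cue in line for cue in cues)]

print(perceive("legal", REPORT))
print(perceive("human_relations", REPORT))
```

On this sketch, the morale signal is "weak" only relative to a role whose filter contains no match for it; the stimulus itself is equally present in everyone's input.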

3. Task Constraints and the “Discrimination Net”

The ability to detect a signal is a function of the observer’s expertise, which Simon defines as an indexed “encyclopedia” of patterns stored in memory.

Chunking: Experts have 50,000 to 200,000 “chunks” of familiar units in long-term memory[8]. When they encounter a situation, familiar cues act as an index that retrieves relevant information[9][10].

The “Aha!” Experience: Detecting a subtle outlier (an “aha!” moment) only happens to the “prepared mind” that already possesses the appropriate discrimination net to categorize that stimulus[11].

If-Then Productions: Detection is governed by “if-then” pairs where the “if” is a pattern to be recognized; if the observer’s “net” does not contain a specific pattern-match for a weak signal, the signal is filtered out as irrelevant noise[12].
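Simon's "if-then" description lends itself to a minimal production-system sketch (the specific conditions and actions below are hypothetical): long-term memory holds condition-action pairs, and a stimulus is "detected" only if some production's condition recognizes it.

```python
# Hypothetical long-term memory: "if" patterns paired with retrieval actions.
PRODUCTIONS = [
    (lambda s: "fever" in s, "retrieve: infection patterns"),
    (lambda s: "audit" in s, "retrieve: compliance checklist"),
]

def recognize(stimulus, productions=PRODUCTIONS):
    """Fire the first production whose 'if' side matches the stimulus."""
    for condition, action in productions:
        if condition(stimulus):
            return action
    # No pattern-match in the net: the signal is filtered out as noise.
    return None

print(recognize("patient reports mild fever"))   # the "prepared mind" fires
print(recognize("unusual vibration in pump 7"))  # unmatched, treated as noise
```

The second stimulus is not sensorily weaker than the first; it simply has no corresponding pattern in this observer's net, which is the sense in which detection is an artifact of the net rather than of the raw input.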

4. Attention as the Scarce Resource

Ultimately, Simon interprets the detection problem as one of attention management rather than of sensory limits.

Filtering: In an information-rich world, the bottleneck is not the “raw world” but our capacity to attend to it[13][14].

The “Ding an sich”: A problem solver never perceives the world exactly as it is (the Ding an sich), but only an external stimulus filtered through their own preconceptions[15][16].

Agenda Setting: Neither problems nor opportunities (signals) exist for the observer until they are noticed and placed on the mental agenda[17].
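Agenda setting under scarce attention can be sketched with a simple salience ranking (the candidate signals, scores, and budget below are invented for illustration): only the items an observer can afford to attend to ever become "problems" at all.

```python
import heapq

# Candidate signals competing for attention, with hypothetical salience scores.
candidates = [
    ("competitor price cut", 0.9),
    ("minor supplier delay", 0.2),
    ("odd uptick in returns", 0.35),  # the weak signal
    ("quarterly report due", 0.8),
]

ATTENTION_BUDGET = 2  # how many items the observer can place on the agenda

def set_agenda(signals, budget=ATTENTION_BUDGET):
    """Keep only the most salient signals; the rest are never noticed."""
    return heapq.nlargest(budget, signals, key=lambda s: s[1])

agenda = set_agenda(candidates)
print([name for name, _ in agenda])
```

On this sketch, the "odd uptick in returns" is present in the raw input yet absent from the agenda; widening the budget or re-scoring salience (i.e., changing the net), not sharpening the senses, is what would surface it.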

In summary, while sensory organs provide the raw data, the detection of a signal is an artifact of the observer’s problem space[18][19]. Complexity and “hiddenness” often lie in the eye of the beholder—a signal is “weak” only if the observer fails to discover a simple description or chooses primitives that are inappropriate for its detection [Previous Turn].

Would you like to explore how organizations specifically design “intelligence units” to help catch these signals before they become crises?[20][21]