The skill of noticing events that didn't happen.
Posted Jul 01, 2016
We react to cues and information that we sense — usually things we see or hear.
But with experience, we also gain the ability to react to events that don’t happen. This important skill doesn’t always get the attention it deserves.
Imagine that you are working on a complex puzzle and you can’t look at the box to identify the picture you are making. So you just sort the pieces, perhaps by color, and try to find matches. You look at each piece and scan for neighbors it might nest beside: a passive strategy. After you make enough progress, you get a sense of what the puzzle depicts, and you can start identifying the pieces you need to find in the pile of unsorted fragments.
Now you are actively searching. You are searching for pieces you don’t have, pieces you expect to have. Pieces that are missing. You can use what you have learned thus far to form expectancies about what these pieces look like so that you don’t need to search randomly anymore.
That’s how we use our experience to spot the gaps.
One of the most famous examples of noticing an event that didn’t happen comes from the Sherlock Holmes story, Silver Blaze, about the kidnapping of a racehorse shortly before an important race.
The local inspector asks Holmes, “Is there any point to which you would wish to draw my attention?”
Holmes: “To the curious incident of the dog in the night-time.”
Inspector: “The dog did nothing in the night-time.”
Holmes: “That was the curious incident.”
Later we learn that the dog was one of the guardians of Silver Blaze’s stable, yet when the horse was taken it neither barked nor grew agitated. That behavior suggested the horse was removed not by a stranger but by someone the dog knew well. So the curious incident was what did not happen.
Spotting these kinds of omissions is not just important in works of fiction. Once I interviewed a Navy officer who described an incident in which he was in command of a small patrol boat. He and his crew were participating in a large-scale exercise. They had gotten off to a late start because of mechanical difficulties and were rushing to catch up to the rest of the flotilla, racing across a busy shipping channel. The weather was terrible — driving rain hammering them from the port (left) side. My informant, the commander, assigned two crew members to keep watch, one on each side, as they sped across the channel, while he had his head down studying navigational charts.

All of a sudden, he realized that he was getting frequent announcements from the lookout on the starboard side but hadn’t heard anything from the port lookout for several minutes. And the port lookout would have been facing directly into the wind, into the downpour. The commander leapt to his feet and scanned the port side of the vessel. He was horrified to see a large tanker bearing down on them. He issued a panicky order to turn hard to starboard and narrowly avoided getting cut in half. In retrospect, he imagined how the port lookout must have scrunched his eyes into slits and, without thinking about it, swiveled his head to the right little by little, flinching at the sheets of rain. That’s why the commander hadn’t been hearing any reports from the port lookout. It was what he wasn’t hearing that caught the commander’s attention.
Parents know to become worried when the sounds of young children playing in the next room subside and it gets too quiet. Military intelligence analysts start to get concerned when the adversary shifts to “radio silence” — often the precursor to an attack.
An interview with a child protective services caseworker surfaced an incident in which a two-month-old boy suffered a concussion. The culprit was his seven-year-old brother, who tried to pick him up from a bed and dropped him onto the corner of a nightstand. The caseworker interviewed the mother, who explained that she had gone to the bathroom for just a minute and wasn’t in the bedroom when the accident occurred. The caseworker was willing to accept the mother’s story but was bothered by what didn’t happen: the mother expressed no guilt or self-recrimination. That’s what worried the caseworker and made her probe more deeply into the mother’s fitness to safeguard her children (and discover a history of drug use that the mother had tried to cover up).
We draw on our expectancies and our expertise to detect missing pieces, words that aren’t said, events that are supposed to happen but don’t. Our expectancies let us anticipate the things that are supposed to occur so that we can be surprised by their absence.
Surprise stems from the violation of expectancies. What is unique about missing pieces is that we are surprised by what didn’t happen, rather than by what did.
Information display designers might keep an eye out for missing pieces. Once my colleagues and I consulted on a project to build a decision support system for an organization that had to rapidly respond to crises such as oil spills. The sponsor had lined up a network of service providers (different types of equipment, logistics for volunteer workers, communications to government agencies and to the media, etc.). In the event of an emergency each of the service providers was to be notified and then would confirm its readiness to assist. The system design team prepared a display showing all the service providers that had logged in. But I argued that the watchstanders really needed to see which service providers had not yet logged in — they were going to be the bottlenecks. They were going to require workarounds. In the original design, they were going to be invisible. I explained why they had to take center stage.
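The redesign amounts to inverting the query: instead of listing who has logged in, compute the set difference between the expected roster and the confirmations received. A minimal sketch in Python, with hypothetical provider names standing in for the real roster:

```python
# Illustrative sketch (all names hypothetical): given a roster of service
# providers and the subset that have confirmed readiness, display the
# providers that have NOT yet logged in -- the likely bottlenecks.

EXPECTED_PROVIDERS = {
    "skimmer-equipment",
    "volunteer-logistics",
    "agency-comms",
    "media-relations",
}

def missing_providers(confirmed):
    """Return the providers that have not yet confirmed, for center-stage display."""
    return sorted(EXPECTED_PROVIDERS - set(confirmed))

print(missing_providers({"agency-comms", "skimmer-equipment"}))
# ['media-relations', 'volunteer-logistics']
```

The design choice mirrors the argument above: the absent confirmations are the actionable items, so they, not the present ones, should dominate the display.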
Big Data strategies also need to contend with missing pieces. One premise of Big Data is to capture enormous amounts of data and conduct powerful analyses to spot trends. But what about the events that don’t happen? How are these to be captured and interpreted? It isn’t easy to flag all the things that don’t occur because there are a lot of those. So pure data capture and analysis may be insensitive to missing pieces. Perhaps some kind of intelligence could be added to the mix, to generate expectancies, but then you are at the mercy of the analysts formulating the expectancies. You no longer have a purely empirical approach. And machine learning methods might struggle when the critical cues are the missing pieces.
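One way to make that "added intelligence" concrete is to encode an expectancy explicitly, for example that every source should report at least once within a given interval, and then flag violations. A minimal sketch, with hypothetical source names and thresholds; the expectancy itself still has to come from an analyst:

```python
# Expectancy-based gap detection: rather than analyzing the events a data
# stream contains, encode an expectancy (each source should report at least
# every `expected_interval` seconds) and flag the sources that violate it.
# Source names, timestamps, and the threshold are all illustrative.

def silent_sources(last_report, now, expected_interval):
    """Return sources whose silence has exceeded the expected reporting interval."""
    return sorted(
        src for src, last_seen in last_report.items()
        if now - last_seen > expected_interval
    )

last_report = {"port-lookout": 100.0, "starboard-lookout": 290.0}
print(silent_sources(last_report, now=300.0, expected_interval=120.0))
# ['port-lookout']
```

The detector is only as good as the expectancy fed to it, which is exactly the trade-off noted above: the approach is no longer purely empirical once analysts must specify what is supposed to happen.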
One final question is about the discoveries triggered by omissions: Which of the three insight paths are involved? I think each of them comes into play. The contradiction path is probably the first to kick in — the violation of expectancies contradicts what we thought would occur and catches our attention. The correction path gets activated because the missing piece surprises us and leads us to change some of our beliefs and assumptions. And the connection path fits in here too. It’s about putting pieces together, except that in this case the piece is a missing piece, the absence of an event, that connects to other data points.
You can easily see what’s in front of your nose. It’s much harder, it takes experience, to see what’s not there.