Why our tendency to fill the gaps can cause us to ignore important anomalies


On 3 July 1988, the USS Vincennes, a US Navy missile cruiser, mistook an Iranian commercial flight for a fighter jet.

The crew thought the Airbus was an attacking F-14 Tomcat fighter. Yet the commercial flight's transponder was emitting a civilian code, the plane was climbing, not diving to attack, and the radio messages warning the Airbus were not broadcast on air traffic control frequencies.

Even if the Iranian pilots did hear the warnings, it wasn't clear the messages were meant for them.

The US crew expected to encounter military aircraft because they were in active conflict with Iranian gunboats in Iranian waters.

The lives of 290 people were lost when the US crew ignored the anomalies and shot down Iran Air Flight 655.

Unfortunately, this tendency to explain away anomalies amid confusion and ambiguity is common.

Psychologist Arie Kruglanski describes this concept as our need for closure. His studies found that this urge to settle discrepancies and make decisions allows us to get things done and function in the world.

It is not a static state; the need for closure varies with the context and with the predispositions of the people involved.

However, it is contagious and is exacerbated when we're under stress. Tiredness, distracting noise, time pressure and being challenged can all increase our desire to make sense of what we're experiencing despite obvious anomalies. Under stress, we're also more likely to place greater trust in those we know and pay less attention to those we don't.

It's not a new phenomenon, but it is a risk that's often overlooked.

In 1949, Harvard psychologists Jerome Bruner and Leo Postman published a study in which participants were briefly shown a set of trick playing cards with some of the colours reversed, i.e., some red cards were printed in black and vice versa.

When questioned, 96% of the participants described the trick cards they'd glimpsed as normal. Bruner and Postman concluded that people denied the anomalies and saw what they expected to see: their preconceptions distorted their experience, and their minds filled in the gaps.

The McGurk effect is another demonstration of the power of the unconscious mind. It can occur when we integrate visual and auditory speech information and favour what we see over what we hear: in the classic demonstration, a listener hears the syllable 'ba' while watching lips mouth 'ga', and perceives 'da'. The effect is more likely when the visual information is clear and the audio quality is poor.

One professor who has studied the McGurk effect for more than 25 years still gets caught out, even though he knows the illusion is coming. That sounds like a good reason, at least sometimes, to close our eyes and concentrate on what we're hearing.

So, what else can we do to avoid the risk of ignoring crucial anomalies before we make an important decision?

Jamie Holmes, author of Nonsense: The Power of Not Knowing, says 'it's good advice to string decisions that involve ambiguity over a period of days and revisit them in different moods'.

Imagine you're a stand-up comedian who has forgotten the punchline: you're up there in the spotlight, going with the flow and ad-libbing until something funny, or enlightening, comes up.

First, try to recognise when you’re feeling threatened. The threat doesn’t have to be physical. It can be as simple as someone challenging your worldview.

Then, slow down if you can because urgency and pressure drive us towards less flexible thinking.

And while you're doing that, keep the possible harmful consequences of your decision front of mind. Other studies have found that this can be effective in countering the effects of urgency and pressure.
