When two and two equals thirty-six in market intelligence analysis
Imagine you have a 15-year-old fashionista daughter, as I do. One day, she discovered a male fashion artist on the internet. She came to me with the following facts:
- He recently moved to another city.
- He has or has had a connection with Amsterdam (this is unclear).
- He lives on Xxxxx-street 319 in Utrecht, Netherlands.
- He runs a shop from his house.
- He is young.
- From the street you can recognize his shop by his logo displayed in the window.
- A picture of the window with the logo features on his website; the picture shows more than just the single window: details of the façade are visible as well.
- A Google Street View analysis shows that flats dominate Xxxxx-street in Utrecht.
As a caring father, I didn’t want my little girl to visit a male artist alone (it would have applied to any male, actually; I hate to discriminate against artists). So we set out for Xxxxx-street 319 in Utrecht. I live near Utrecht and knew the street, so we didn’t use a navigation system. What did we find? Xxxxx-street 319 did not exist in Utrecht.
This discouraged us, but we are optimists. We thought we must have the number wrong. Like Sherlock Holmes, we started checking the windows for the logo and looking for the characteristic façade from the website picture. However, no flat showed the features of that façade, nor did we see the logo anywhere.
We drew different conclusions. I concluded that the internet is a lousy source: we had been duped. She concluded that she had been wrong. She called him and found out he lived in Amsterdam, having recently moved there from The Hague.
Biases in interpreting information
What had happened? When she had connected with the artist, he had sent her an SMS saying he lived on Xxxxx-street 319. She googled Xxxxx-street (without the number) and the first hit was Xxxxx-street in Utrecht. As every millennial knows, Google is never wrong. So, knowing he had moved, she connected the dots by filling in that he had apparently moved from Amsterdam to Utrecht. She checked that the address was a flat; young people live in flats, so she felt ready to go.
I see at least three biases here:
1. Having a limited amount of data but making up a narrative anyway creates an illusion of understanding. We were convinced that the fashion artist must have moved to Utrecht, since there is a Xxxxx-street in Utrecht. The artist is young, young people live in flats and there are flats on Xxxxx-street in Utrecht.
2. Even in the light of new evidence (when we found out that there is no house number 319 on that street in Utrecht), we stuck to our analysis and continued our search. This is known as premature closure: once formed, an opinion is resistant to change.
3. Interpreting an observation so that it confirms a pre-existing belief is known as confirmation bias. That was me declaring that the internet is a lousy source, and my daughter trusting Google.
Who knows if two and two add up to thirty-six or not?
What’s good to know is that these biases in interpreting information are quite common. Another example dates back to April 1968¹, when US submarines were patrolling off the coast of the Soviet Far East.
To their surprise, they discovered that Soviet submarines were extensively and rather unusually using active sonar to look for something. What had the Soviets lost?
The conclusion that offered itself, correctly, was that the Soviets had lost a submarine. There were two obvious narratives for how a Soviet submarine might have been lost: either the submarine had suffered an internal problem that proved fatal, or it had collided with another submarine.
To the Soviets, the first narrative was heresy: Soviet submarines could not have internal problems. Therefore, Soviet military intelligence started looking for evidence to substantiate the second narrative.
A few days after the Soviet submarine was lost, a US submarine moored in Yokosuka harbor in Japan. In principle, that submarine could have been at the location of the hypothetical collision with the Soviet boat, which was just what the Soviet Navy needed to back up its collision narrative.
In addition, this specific submarine, the USS Swordfish (SSN-579), moored with visible damage to its sail and periscope, which the Soviet military regarded as the missing piece of the puzzle. According to them, the damage to the Swordfish unquestionably linked it to the Soviets’ missing boat and the claimed collision. The US Navy took a different view:
“they [i.e. the Soviets] would add two and two and come up with thirty-six”
US underwater research later provided the real narrative. The Soviet boat had indeed suffered an internal problem that proved fatal, while the Swordfish, completely unconnected, had simply hit a small iceberg.
Stay alert: beliefs and biases overrule judgement
What do these two real stories tell us as market intelligence practitioners?
1. Beliefs beat critical thinking hands down.
2. Narratives work best when based on too few facts.
3. Biases delude amateurs and top-notch Soviet intelligence professionals alike.
4. Awareness of biases is no guarantee of protection against them.
1 Sontag, S. and Drew, C., Blind Man’s Bluff: The Untold Story of American Submarine Espionage, Public Affairs, New York, pp. 75–80.
Intelligence Best Practices