Our brains are wired for pattern recognition. It’s an evolutionary trait that helped our ancestors survive (rustling grass = possible lion). Our ancestors rarely came to harm through false positives (they thought it was a lion, and it wasn’t), whereas the price for failing to recognize a link was high. Better to err on the side of caution than become lunch.
Michael Shermer calls this patternicity—seeing correlations where none exist. In business, patternicity may lead to poor decisions based on faulty assumptions.
Andrea’s horse
Andrea buys a horse for 3,000 € and sells it for 4,000 €. Later, Andrea buys the same horse back for 5,000 € and sells it again for 6,000 €. How much profit did Andrea make? *Answer at the end—write yours down first!
Only 45% of U.S. university students got the right answer when working solo. But in teams, that rate jumped to 72%. Why? Collaboration helps break the illusion of false correlation—our tendency to link events that are unrelated.
Andrea bought and sold a horse twice and made a profit each time. The fact that it was the same horse is irrelevant. The events are independent.
This independence is easier to recognize in financial markets. Suppose you buy a stock and sell it at a profit. Later, you buy the same stock at a higher price and sell it at a profit again. You didn’t lose money by paying more the second time; the second trade is independent of the first.
Stop this thought, I want to get off
False correlations interfere with rational decision making. In a famous example, the roulette wheel in a Monte Carlo casino landed on black twenty-six times in a row in 1913. Gamblers believed red was “due.” They bet heavily on red and lost millions, ignoring a basic fact: each spin of the roulette wheel is independent.
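A quick simulation makes the independence concrete. This is a hypothetical sketch of a simplified wheel (red and black only, ignoring the green zero): the frequency of red immediately after a run of blacks matches the overall frequency of red.

```python
import random

random.seed(42)
# Simplified wheel: red or black only, equally likely (ignoring the green zero).
spins = [random.choice("RB") for _ in range(1_000_000)]

# Frequency of red overall vs. frequency of red right after three blacks in a row.
overall_red = spins.count("R") / len(spins)
after_streak = [spins[i] for i in range(3, len(spins))
                if spins[i - 3:i] == ["B", "B", "B"]]
streak_red = after_streak.count("R") / len(after_streak)
```

Both frequencies come out at roughly 50%: a streak of blacks tells you nothing about the next spin, which is exactly what the Monte Carlo gamblers ignored.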
Correlation does not equal causation
Even when two metrics rise together, it doesn’t mean one caused the other. Take ice cream sales and shark attacks: both rise in summer, but eating gelato doesn’t summon sharks. Temperature is the hidden driver behind both.
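The hidden-driver effect is easy to reproduce with synthetic data. In this hypothetical sketch, two series share no causal link, yet both depend on temperature, so they end up strongly correlated (the coefficients and noise levels are made up for illustration):

```python
import random

random.seed(0)
temps = [random.uniform(0, 35) for _ in range(1000)]  # daily temperature in °C

# Both series depend on temperature plus independent noise; neither causes the other.
ice_cream_sales = [5 * t + random.gauss(0, 20) for t in temps]
beach_visits = [12 * t + random.gauss(0, 50) for t in temps]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(ice_cream_sales, beach_visits)
```

The correlation comes out strongly positive even though the only connection between the two series is the shared driver, temperature.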
Business parallel: when sales spike after a marketing campaign, this does not mean the campaign was responsible for the increase. Other factors could have played a role. Was there a competitor’s supply issue? Market tailwind? Or something else?
Mental shortcuts can mislead
Heuristics are useful cognitive shortcuts that save time—but risk warping decisions. Here are some common traps leaders fall into:
Anchoring bias: Over-reliance on initial data, forecasts, or opinions
Availability bias: Overestimating the impact of recent or dramatic events
Confirmation bias: Ignoring disconfirming evidence and doubling down on assumptions
As Daniel Kahneman notes in Thinking, Fast and Slow, heuristics boost speed—but often at the cost of accuracy. An over-reliance on mental shortcuts can lead to flawed risk assessments, misallocation of investments, erroneous decision-making, and decreased competitive advantage.
Recognizing when your intuition might be overriding objective scrutiny, and deliberately slowing down to test your assumptions against robust data, will help you mitigate cognitive traps and make better decisions.
Proactively avoid decision traps
Identify “trigger situations” when you’re most vulnerable to bias and apply appropriate countermeasures. Here are two trigger situations I have identified for myself, along with countermeasures:
Trigger: Decisions piled up on my desk at the end of the day
Countermeasure: Block morning time for complex decisions. Delay decisions if needed.
Trigger: Familiarity bias (“This person reminds me of…”)
Countermeasure: Make an extra effort to see the person with fresh eyes. Ask questions to actively test my assumptions about the person.
Cultivate smarter decision habits
Challenge your assumptions. Ask, “What else could explain this outcome?”
Run A/B tests: Especially for marketing or product tweaks
Use data tools: Run regression analysis to probe causality
Seek disconfirming evidence: Don’t just hunt for validation
Welcome diverse views: They expand your perspective—even if they’re uncomfortable
Review your decision-making process: Think about how you make decisions, especially when the outcome is unexpected or undesirable.
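As a concrete illustration of the A/B-testing habit above, here is a hypothetical sketch: a two-proportion z-test comparing the conversion rates of two campaign variants. The visitor and conversion counts are invented for illustration.

```python
import math

# Hypothetical results: variant A vs. variant B of a landing page.
conversions_a, visitors_a = 120, 2400   # 5.0% conversion rate
conversions_b, visitors_b = 156, 2400   # 6.5% conversion rate

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)

# Standard error of the difference under the pooled null hypothesis.
se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

# Two-sided p-value from the normal approximation.
p_value = math.erfc(abs(z) / math.sqrt(2))
```

With these made-up numbers the p-value lands below 0.05, evidence that the difference is unlikely to be noise alone; the point is that the test, not intuition, decides whether the uplift is real.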
Reflection questions
Have I linked unrelated events in recent decisions?
What are my high-risk moments for cognitive shortcuts?
Am I relying more on intuition or evidence in this decision?
How can I test my assumptions?
Who can I ask for a distinct perspective?
* The correct answer to Andrea’s horse puzzle is 2,000 € profit.
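The arithmetic can be checked with a short Python sketch that treats each buy–sell pair as an independent trade and sums the per-trade profits:

```python
# Andrea's two trades, in euros: (buy price, sell price).
trades = [(3000, 4000), (5000, 6000)]

# Each trade is independent; total profit is the sum of the per-trade profits.
profit = sum(sell - buy for buy, sell in trades)
print(profit)  # 2000
```

The fact that the second purchase cost more than the first sale is irrelevant; each round trip contributes 1,000 € on its own.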
References
Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Maier, N. R. F., & Solem, A. R. (1952). The contribution of a discussion leader to the quality of group thinking: The effective use of minority opinions. Human Relations, 5(3), 277–288.
Schultz, W., Dayan, P., & Montague, P. R. (1997). A neural substrate of prediction and reward. Science, 275(5306), 1593–1599.
Shermer, M. (2002). Why people believe weird things: Pseudoscience, superstition, and other confusions of our time. New York, NY: Henry Holt and Company.
Shermer, M. (2008). Patternicity: Finding meaningful patterns in meaningless noise. Scientific American, 299(5), 48.