Fixing your blind spot: biases in decision making

Daniil Pavliuchkov
Product Coalition

People make decisions and interpret data all the time. We see patterns, rely on past experiences, and connect the dots on an almost intuitive level.

Because we receive and process so much information all the time, we naturally want to do it with as little effort and energy as possible. Every person forms a set of mental shortcuts, or heuristics, which work brilliantly most of the time, but sometimes a little too well, kicking in even in situations where we don’t need them.

I have listed the most common biases that lead to false judgments and how to avoid them. Remember, this list is not meant for ridiculing your colleagues and loved ones when they are biased or commit a logical fallacy. The intended use is to apply it to your own thinking and to steer away from irrational judgment whenever possible.

Confirmation bias

We tend to favor, interpret, and recall information that confirms our previously existing beliefs and ignore information that doesn’t.

This is the king of all biases, one bias to rule them all, and the captain of the USS Bias. Everyone, without exception, has it, and we face it multiple times a day. It can manifest as total stubbornness during an argument, or stay mild and alter our perception just a little, but enough. This process works undetected in the background of our minds to create a convincing illusion that the facts support our beliefs. It is not that we intend to manipulate reality; our brain does it to fortify our worldview with a cohesive narrative.

You might have a narrative that left-handed people are more creative. Whenever you encounter a left-handed designer, video maker, or artist, you mentally tick a checkbox and remember it as evidence for your view. If a creative person happens to be right-handed, you dismiss that information as an exception, or think, “actually, he was not that creative,” and forget the encounter altogether. Only the stories of creative left-handed people live on, proving your narrative.

Confirmation bias is definitely taking place when:

  • Beliefs persist after the evidence for them is shown to be false.
  • People rely on and trust existing information more than newly presented information.
  • People tend to interpret ambiguous evidence as supporting their existing position.
  • A disagreement becomes more tense even though all parties operate with the same evidence.
  • People falsely perceive an association between two events or situations that correlate but have no causal link.

Confirmation bias leads to overconfidence, which makes decisions more costly.

  1. A biased Head of Design who believes that left-handed people are more creative will favor lefties when recruiting and even promoting.
  2. A biased business owner who thinks the target users are CEOs in their mid-50s will mistrust new research results that reveal that the core power users are 35–40 years old.

How to avoid it:

  1. Look for ways to challenge what you think you see. Seek out information from a range of sources, surround yourself with a diverse group of people, and don’t be afraid to listen to dissenting views.
  2. Rephrase the question and recheck the results. Instead of researching the number of left-handed Renaissance painters, investigate what fosters creativity in general.
  3. Be open to new information and absorb it in a neutral frame of mind. Try to be less emotional and look at the facts rather than dismissing them because they contradict your view.
  4. Ponder the concept that there is no such thing as an exception in data. There is only data that supports one outcome or another. Some results are more likely; some are less. An anomaly in the data is then just a low-probability event.

Motivated reasoning

Motivated reasoning is a process of defending a position, idea, or assumption that we hold with emotional investment.

It is confirmation bias taken to the conscious level. If confirmation bias makes us more receptive to information that confirms our ideas, motivated reasoning ensures that once we hold a viewpoint, we do everything we can to support and rationalize it. Still, for most opinions we are reasonably rational and update them as new information comes to our attention.

For example, we can quickly change our memory of a historical fact if we read new information on Wikipedia. But I would not change my belief that the sun is at the center of our solar system unless presented with hard evidence from multiple sources. This system works really well; we update our memories and opinions daily for minor things that we care less about.

But the more solid our belief is, or the more we are invested in our own idea, the harder it is for us to change our position. We feel good when our assumptions turn out to be correct because our brain releases dopamine when that happens. This mechanism positively reinforces our learning and helps us form new skills, which is good. What is terrible is when we defend ideas that no longer make any sense just for the sake of being “correct” and getting that dopamine shot.

The horrible part is that when you have finally proved yourself “right,” you will act on false premises, and your results will be faulty as well. You will then face the situation of “how could this go so wrong when the assumption was correct?” Confirmation bias will kick in, saying “that was just a random fluke,” and you will close the loop of ignorance.

A performance manager believes that users aged 15–25 respond better to video ads than to images. They then go looking for evidence in analytics that supports that point. If the evidence is absent, they google and ask other PMs in the community. They may even search for studies of teenagers’ perception, or mention that “my nephew always watches at least 5 seconds of the video while browsing Facebook.”

How to avoid it:

  1. Ask yourself: “If I had an opposing view, would I conduct the same analysis?” and “Am I presenting the full version of the truth?”
  2. Humans can rationalize and find a reason for everything. Be conscious of that, and remember that unlikely explanations are, by definition, less likely to be the actual cause.
  3. Make sure that objectively stronger assumptions prevail, while weaker ones are disregarded.
  4. Let go of your idea. Views and assumptions change over time; multiple people work on ideas and contribute to the vision, ultimately transforming them into something new.

Appeal to tradition, novelty, or authority

A tendency to lower our logical scrutiny of something or someone that we inherently trust.

While in most cases you can trust the things you believe, you should stay skeptical and look for bits and pieces that don’t fit together. This is especially true for the appeal to authority: what has worked well for other companies and teams won’t necessarily work well for you.

Appeal to tradition — if it has survived the test of time, it must be a good practice.

  • This is how we have always designed landing pages for new products; we should not change what is working.

Appeal to novelty — if it is a new approach, it must be modern and advanced.

  • This is a cutting-edge way to manage projects and the latest thing in product management. We should follow it.

Appeal to authority — if this is being used or promoted by a successful person, it must be trustworthy.

  • A VP of Product from a FAANG company says this practice works incredibly well for building MVPs; let’s use it too.

How to avoid it:

  1. Check the original grounds and whether they still make sense now. My mom told me that to cook a chicken, you have to divide it into four parts and lay them flat, because her mom cooked it like that. In reality, my grandmother had a tiny oven, and a whole chicken simply didn’t fit in it.
  2. Use your judgment and trace the logic. If your circumstances differ from the source’s, the assumption might not hold for you. Practices from big companies are less relevant for small startups than we think, no matter how successful those tech giants are.
  3. Don’t appeal to non-authorities. Don’t cite Albert Einstein on theology or biology; he knew little about those fields, and his primary expertise was physics.

False dichotomy

A situation is presented as either/or while, in reality, there are more options on the table.

When a problem is presented as a binary decision with one of the outcomes being horrendous, you might lean towards the other option even if you don’t like it. Most of the time, there are more ways out of the situation than you can see right now.

  1. Either we increase our marketing budget, or we learn to live with fewer leads (there are other ways of attracting new customers).
  2. I thought you were an experienced manager, but you missed standup today (many good managers miss a standup once in a while).
  3. You either approve the design, or you don’t (you could have a neutral stance or favor just some parts of it).

How to avoid it:

  1. Take a step back. If people are pressing you for an answer here and now, there is probably something they don’t want you to notice.
  2. Look at the big picture and ask for context. Why are only two options presented? Who said those options are mutually exclusive? What will happen if we do something completely different?

Slippery slope

A belief that a small first step will lead to a chain of related events culminating in a significantly negative outcome.

The slippery slope argument is often used as a form of fear-mongering, in which the probable consequences of a given action are exaggerated in an attempt to scare the person off. It can be used to stall decisions or block specific actions completely.

If we release the new dashboard now, some customers will be unhappy with it; they will leave comments on our Twitter and Facebook pages; this will negatively affect our reputation, and we will lose customers. We should not release it until we are 100% sure it is perfect.

How to avoid it:

  1. Don’t accept without further justification that once the first step is taken, the others will follow with 100% probability (see the sketch after this list).
  2. Double-check whether the final catastrophic outcome is the only possible one.
  3. Be ready for what is coming, but don’t allow low-probability events to stop you from acting. Most decisions can be reversed, features rolled back, and damage mitigated.
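
To see why the first point matters, do the arithmetic: a chain of events is only as likely as the product of its links. Here is a minimal Python sketch of the dashboard example above; the step probabilities are invented purely for illustration.

```python
# A toy illustration (invented numbers, not from the article): the probability
# of a chain of events is the product of its steps, so it shrinks quickly even
# when every individual step looks plausible.

steps = {
    "some customers dislike the new dashboard": 0.6,
    "they complain on Twitter and Facebook": 0.4,
    "the complaints damage our reputation": 0.3,
    "the damage makes customers leave": 0.2,
}

chain_probability = 1.0
for event, p in steps.items():
    chain_probability *= p
    print(f"P(chain up to '{event}') = {chain_probability:.3f}")

# The full catastrophe works out to 0.6 * 0.4 * 0.3 * 0.2 = 0.014, i.e. about
# a 1.4% chance, nowhere near the certainty the argument implies.
```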

Sunk cost

A behavioral pattern of attempting to recover a loss (of users, time, money, or effort) because you have already invested so many resources.

Our aversion to losing makes us irrationally cling to the idea of ‘regaining’ what has already been lost. In gambling, this is known as chasing the pot: we make a bet, then another, bigger bet to recoup the original, until we finally go all in. With this strategy, each attempt puts more money at risk, while the odds of winning stay the same.
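
As a minimal arithmetic sketch (the numbers are mine, purely for illustration): with independent bets and a fixed, slightly unfavorable win probability, doubling the stake each round only doubles the expected loss.

```python
# A toy model of "chasing the pot": independent bets, fixed win probability.
p_win = 0.48  # assumed win probability; it does not improve between rounds

stake = 10.0
for round_number in range(1, 6):
    # Expected profit of this single bet: gain the stake with p_win,
    # lose the stake with 1 - p_win.
    expected_profit = p_win * stake - (1 - p_win) * stake
    print(f"round {round_number}: stake={stake:6.2f}, "
          f"expected profit={expected_profit:+.2f}")
    stake *= 2  # doubling down to "win it back" only raises what is at risk

# The win probability stays 0.48 every round, while the expected loss doubles:
# -0.40, -0.80, -1.60, -3.20, -6.40. Escalating recovers nothing on average.
```

The same escalation pattern shows up far beyond gambling: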

  1. A workshop in your office converted only a few leads into customers. You increase the budget for the next one, hoping to win double the leads to offset your initial loss.
  2. You are testing a new ad channel, but the results are poor. You decide to spend even more money in an attempt to find the perfect ad that will finally work.
  3. People sometimes order too much food and then overeat just to “get their money’s worth.”

How to avoid it:

  1. Set your maximum budget upfront and make the cut when the numbers no longer look good.
  2. Don’t increase the stakes when events are independent of each other. If you repeat the same thing, it will most likely yield the same results; only dramatically changing the approach will produce significantly different outcomes.
  3. Be honest with yourself: some things are just lost, and there is no positive side to that, not even the experience you gained.

Conclusion

People like their world to make sense, and brains like to work as little as possible. If the world didn’t make sense, we would have no pre-existing routines to fall back on, and we’d have to think harder to contextualize new information. So when there are gaps in how we understand things, we fill them in with whatever intuitively seems to make sense. Most of the time, multiple biases work hand in hand to help us cope.

While such shortcuts are generally useful, some of them can make our judgments irrational. I covered several cognitive pitfalls in this post, but these are by no means the only ones out there. The website yourbias.is has a decent overview for those who want to dive deeper into the subject.

There is one rule that helps me be less biased, which I call the Rule of 8.

Search for and identify at least eight of the most compelling pieces of evidence, four for and four against a particular perspective, before you make a call.

Remember, we make thousands of decisions every day, some more important than others. Make sure that the ones that do matter are not made in haste, based on bias, but on reflective judgment and critical thinking.

Daniil Pavliuchkov is a product consultant and speaker helping companies grow by improving their OKR and product processes.

Upcoming talks:

  1. “How to grow your PM team into heroes” at the Product Management Festival in Zurich, 13th of November. (Ping me for a 20% discount.)
  2. “Decision-making biases in startups” at the B2B Meetup in Berlin, 28th of November.

Illustrations by Federica Bordoni.