20 cognitive biases to be aware of (and embrace) as a Product Manager

Bogdan Coman · Published in Coman Says · 4 min read · Jan 12, 2020

— Guys, we should go for feature X in the next sprint!
— But we haven’t validated any assumptions yet. It doesn’t fit with the data we have so far…
— But it fits with our Y initiative, everybody knows that, right?… We’ve already invested a lot in Y and it’s too late to drop it or spend additional time on research. Our customers are waiting, the competition already has it. Seems to be a go.

This is a discussion every Product team has faced at some point. What happens afterwards, and whether the team avoids a disaster, depends on its ability to negotiate a compromise, isolate the flaws and make the best decision.

C’mon, “best decision”?… What does “best decision” even mean?

There is no single “best decision” when building a new thing. There are only decisions that might help the team or company reduce waste and deliver something that fits the business and user needs as closely as possible.

Let’s call them “optimal decisions”.

But why can’t we have one best decision instead of multiple possible optimal decisions?

Because of our judgement flaws and cognitive biases.
According to Wikipedia, cognitive biases are systematic patterns of deviation from norm or rationality in judgment.

I don’t agree with the definition, especially with the term “deviation”.

“Norm” is a fuzzy term, rationality is still a wide-open debate among philosophers, and maybe the first rational debate took place in front of a cave, about whether or not to go hunting on a stormy day. Our judgement is based entirely on the interpretation we give to a certain input, not on the input itself. So there is no deviation; it is the course.

We are not machines. Machines are not (yet, and perhaps never) able to build products that humans love. Humans still have to make decisions based on incomplete data, in uncertain territory, guided only by their own (mis)interpretation.

So, is it possible to avoid biases, cognitive flaws and bad judgement patterns?

No, it is not… We can’t be unbiased, but we can be bias-aware.

“The eye sees only what the mind is prepared to comprehend.” (Robertson Davies)

So, here is a list of the most important biases to be aware of when creating products and experiences that other humans will love, in no particular order.

#1 Blind Spot bias

The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.

#2 Experimenter’s bias

The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.

#3 Illusory Truth effect

A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity.

#4 Planning fallacy

The tendency to underestimate task-completion times.

#5 Information bias (often leading to analysis-paralysis)

The tendency to seek information even when it cannot affect action.

#6 Pseudocertainty effect

The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.

#7 Pro-innovation bias

An excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses.

#8 Gambler’s fallacy

The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
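You can check this for yourself. Below is a minimal, purely illustrative Python sketch (my own, not from the article) that simulates a fair coin, conditions on a streak of five heads, and looks at what the sixth flip does:

```python
import random

# Illustrative sketch (not from the article): with a fair coin, the share
# of heads on the flip *after* five consecutive heads is still roughly 50%,
# no matter how "due" tails may feel.
random.seed(42)

sixth_flips = []
while len(sixth_flips) < 10_000:
    flips = [random.choice("HT") for _ in range(6)]
    if flips[:5] == ["H"] * 5:          # keep only runs of five heads
        sixth_flips.append(flips[5])    # record the sixth flip

heads_share = sixth_flips.count("H") / len(sixth_flips)
print(f"P(heads | five heads in a row) ~ {heads_share:.3f}")  # ~ 0.5
```

The past streak carries no information about the next independent flip, which is exactly what the simulated share shows.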

#9 Neglect of probability

The tendency to completely disregard probability when making a decision under uncertainty.

#10 Not invented here

Aversion to contact with or use of products, research, standards, or knowledge developed outside a group.

#11 Irrational escalation (also known as sunk cost fallacy)

The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.

#12 Groupthink (also known as Bandwagon effect)

The psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.

#13 Negativity bias

Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.

#14 Observer-expectancy effect

When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.

#15 Hindsight bias

Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened.

#16 Hot-hand fallacy

The belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.

#17 Overconfidence effect

Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.

#18 Law of the instrument

An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. “If all you have is a hammer, everything looks like a nail.”

#19 Framing effect

Drawing different conclusions from the same information, depending on how that information is presented.

#20 Illusory correlation

Inaccurately perceiving a relationship between two unrelated events. Also discussed as the inability to distinguish correlation from causation.
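Correlation on its own is cheap. As a purely illustrative Python sketch (my own, not from the article), two completely independent random walks, standing in for two unrelated product metrics drifting over time, will often show a sample correlation that looks meaningful:

```python
import random
from statistics import mean

# Illustrative sketch (not from the article): two independent random walks
# have no relationship at all, yet their sample correlation is often far
# from zero simply because both drift over time.

def random_walk(n, seed):
    rng = random.Random(seed)
    position, walk = 0.0, []
    for _ in range(n):
        position += rng.gauss(0, 1)   # independent random step
        walk.append(position)
    return walk

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

series_a = random_walk(500, seed=1)   # e.g. weekly signups (hypothetical)
series_b = random_walk(500, seed=2)   # e.g. an unrelated metric (hypothetical)
print(f"correlation of two unrelated series: {pearson(series_a, series_b):+.2f}")
```

Re-run it with different seeds and the correlation will frequently look “significant” in one direction or the other, which is why two trending metrics should never be read as cause and effect without a controlled comparison.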
