Cognitive Biases And How To Outsmart Them

“I’m probably less biased than most.”

Personally, it was humbling to face reality and realise how biased I was. I thought I was less biased than others, and that belief itself (not seeing the impact of bias on myself) was a bias.

Even the smartest people exhibit biases in their judgments and choices. It’s foolhardy to think we can overcome them through sheer will. But we can anticipate and outsmart them by nudging ourselves in the right direction when it’s time to make a call.

— Harvard Business Review

Today we’ll look at some curious cognitive biases to improve our awareness and sharpen our minds.


Welcome to all new subscribers! This is The Ambitious Writer, a weekly newsletter for fellow writers who want to succeed as creatives.

In case you haven't already done so — THIS is the ideal time to subscribe.

What are cognitive biases?

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behaviour in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

— Mother Wiki

Our brains are powerful and stunning machines, yet they're subject to certain limitations. One in particular is self-induced: the brain seeks efficiency in an attempt to reduce energy expenditure, basically limiting itself. We could see this behaviour as a sort of rule of thumb that helps us make sense of the world around us (in which there are too many inputs) and make decisions at high speed. It happens spontaneously. We don't control it.

Well, this is what often happens with cognitive biases, too: they can be the brain's attempt to simplify information processing.

If you’re walking down an alley and spot a dark shadow that seems to be following you, a cognitive bias might lead you to assume that it’s a mugger and that you need to exit the alley as quickly as possible. The dark shadow may have been caused by a flag waving in the breeze, but relying on mental shortcuts can often get you out of the way of danger when decisions need to be made quickly (example from this article).

While they can often be surprisingly accurate (you instinctively run away from an actual danger), they can also lead to errors in thinking (you instinctively run away from nothing).

On the other hand (and this is our focus here), they can lead us to make inadequate evaluations or wrong decisions. Cognitive biases distort our critical thinking, possibly perpetuating misconceptions or misinformation that can damage others. They drive us to avoid information that may be unwelcome or uncomfortable, rather than investigating it further, even when doing so would lead us to a more accurate outcome. Besides, biases can cause us to see patterns or connections between ideas that aren't necessarily there.

Causes & Impacts

If you had to think about every single possible option when making a decision, it would cost too much (in both energy and time) to face even the simplest choice. It's sometimes necessary to rely on mental shortcuts and act quickly. Many different elements can cause cognitive biases, but these mental shortcuts, known as heuristics, seem to play a major contributing role.

Other factors that contribute to the generation of biases include:

  • Emotions

  • Individual motivations

  • Social pressures

  • Mind's limits on processing information

  • Decreased cognitive flexibility (due to ageing, for example)

Conspiracy theory beliefs, to give you an example, are often influenced by a variety of biases. Their impact is terrible! Biases underlie people’s difficulty in exchanging accurate information or deriving truths.

This is why they are worth studying.

[Extra: Avoid the common mistake of confusing cognitive biases with logical fallacies. A cognitive bias refers to how our internal thinking patterns affect the ways we understand and process information. A logical fallacy is an error in our reasoning structure that weakens or invalidates an argument. Different animals.]

It’s time for my biases playlist!

My Cognitive Bias Checklist

1. Blind Spot Bias

Also known as the "bias blind spot", it sneakily takes control of your mind and blinds you to the problem itself. If you believe you're less biased than others, you won't try to improve (this is why so few people focus on biases!).

You'll know what to say the next time you hear someone say, "I'm probably less biased than most", right?

2. Dunning–Kruger Effect

The Dunning–Kruger effect — aka “being on Mount Stupid" — occurs when people with low ability at a task overestimate their ability.

The miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.

— Mother Wiki

We are generally bad at objectively evaluating our level of competence. When starting an activity for the first time, we can feel an “illusory superiority” — failing to recognise our lack of ability.

I printed out this graph: it’s so true!

Notice that in writing (though you can apply this to almost everything) the first ignorant steps on the path towards mastery are the ones that make you try. Being on Mount Stupid is fun and necessary for committing (I'm kind of an expert, as I spent most of my life there). Now that I'm walking through the Valley of Despair, I'm sure I would never have tried such a masochistic pursuit if I had known up front what it's all about.

3. Anchoring Bias

A classic: anchoring, or "focalism", occurs when one depends too heavily on initial information to make subsequent judgements. That specific piece of information is called the "anchor", and it holds you steady in your position.

It is a human tendency to rely heavily on the first piece of information we receive, regardless of how reliable it is.

E.g. One of your idols, say an actress or a singer, suggests reading a book and declares that it is "well-written and simply outstanding". You buy the book, read it, and then suggest it to all the people you know. When they ask, you answer "it's well-written and simply outstanding". No matter how bad the book actually is, you're biased and your judgement remains aligned with the first thing you heard about it: the anchor.

What’s fascinating is that, once the value of this anchor is set, all future inputs or arguments are discussed in relation to it. Information that aligns with the anchor tends to be assimilated, while information that is more dissonant or less correlated tends to be ignored.

Writers usually leverage this bias in stories by anchoring readers to wrong information in order to surprise them with plot twists.

4. Belief Bias

A widespread one, it’s the tendency to judge the strength of arguments based on their conclusion's plausibility rather than how strongly that conclusion is supported.

It happens because people are more likely to accept an argument that supports a conclusion aligned with their values, beliefs or prior knowledge.

E.g. My grandma thinks Italians are good at cooking. If I told her a story of an Italian cook who served bad food (maybe compared to cooks of other nationalities), she wouldn't believe me. Not even if I had tasted the food myself. The fact is that she believes Italians are good cooks, end of story.

And it is not just Italian grandmas. We can all easily be blinded by our beliefs and come to wrong conclusions. This is why, for example, writers usually ask for feedback when writing important and opinionated stuff — you cannot find your blind spots with your eyes alone!

5. Information Bias

I spent around 20 hours writing this newsletter because I couldn’t stop reading stuff. It’s a good example of information bias, the tendency to seek information even when it cannot affect action.

It's curious to notice that people can often make better predictions or wiser choices with less information, as more is not always better. Often much of the information gathered is irrelevant to the decision at hand.

Here and now, I risk providing you with too much irrelevant information about biases, which could negatively affect your decision to keep reading. As you can see, information bias applies well to writing.

6. Optimism Bias

It's the tendency to be over-optimistic: we underestimate the probability of undesirable outcomes while overestimating that of favourable, pleasing ones. This bias leads to wishful thinking, the production of beliefs based on what might be pleasing to imagine rather than what is rational to expect.

Optimism bias is essentially a product of the conflict between belief and desire — a conflict I personally experience every day.

7. Non-Adaptive Choice Switching

After experiencing a bad outcome with a past decision, you tend to avoid making the same choice when facing similar problems, even when that choice is the optimal one. This is non-adaptive choice switching, a painful bias. It's also known as "once bitten, twice shy" or the "hot stove effect".

Realistically, if you want to sustain a creative career of any sort, you have to remove this bias from the equation — bad outcomes are the most important element of a healthy growth process. I’d rephrase it: “once bitten, twice grown”.

[Extra: Similar to “Non-Adaptive Choice Switching” is the “Outcome Bias”, the tendency to judge a decision by its eventual outcome instead of the quality of the decision itself (at the time it was made).]

8. Planning Fallacy

The tendency to underestimate how long your tasks will take to complete. I could teach this one.

9. Courtesy Bias

The tendency to give an opinion that’s more socially accepted than true so as to avoid offending anyone.

10. False Uniqueness Bias

People tend to see their projects and themselves as more singular than they actually are.

“People only see what they are prepared to see.”

— Ralph Waldo Emerson

See you next Wednesday,

Lorenzo Di Brino