On Cognitive Biases


Remember the optical illusions we saw back in school? The ones where perfectly straight lines appear curved, objects of equal size look disproportionately different, and colors of exactly the same luminance seem to contrast sharply because of their surroundings?

These illusions reveal the imperfections of our visual perception system. We can’t get rid of them, just as we can’t change the fact that, unlike a dragonfly, we have only one lens in each eye, and therefore a single visual perspective on the world.

Remarkably, the same holds true for our cognition. Our thinking, our judgment, and our foresight are, by design, affected by a set of biases that aren’t even conditioned: they’re hardwired in the brain. Cultural conditioning can reinforce those biases (which it most often does) or raise awareness of them. Pitting rationality and logic against emotion is an example of maladaptive conditioning (and, in itself, an example of a bias: dichotomous thinking).

These imperfections of our cognitive system stem in part from the long and complex neurobiological evolution of the human brain. There are structures within it that take over control and make us feel and think like animals when certain triggers fire. Is someone shaming you in front of people you care about? Flee, fight, or freeze, like an animal pursued by a predator. Does an unexpected geopolitical catastrophe ruin the business you’ve been building for years, or the stocks you’ve been amassing for retirement? Sell whatever remains and prepare for the worst.

While these are extreme examples, the truth is that because of those hardwired neuroarchitectural imperfections, our critical and rational thinking gets hijacked by emotion literally in a heartbeat. As the neuroscientist Antonio Damasio once said, when asked whether humans are thinking machines or feeling machines, “We are feeling machines that think.” While that might look like an evasive rhetorical reply, it captures the essential truth: feeling comes before thinking, evolutionarily and architecturally. It works faster and exerts more control over our behavior.

The most dangerous thing, for individuals and organizations alike, is to disown the primacy of emotion and rationalize incorrect judgments and maladaptive decisions with formally logical arguments. That’s where our inherent cognitive biases come in handy: they disguise the underlying emotional struggle and end up wreaking havoc in our lives.

There are two domains in my work that help address this. The first is cultivating emotional intelligence and proactively working to accept the inherent duality of thought and emotion in controlling our behavior. The second is studying and deconstructing cognitive biases, which ultimately helps us improve judgment and foresight even when emotion gets intense enough to overthrow critical thinking. Interestingly, just as the skill of shame resilience doesn’t remove shame from our emotional landscape but keeps it from sabotaging our behavior, awareness of cognitive biases doesn’t mean getting rid of them altogether; it means getting cued when they’re likely to be at work, and then double-checking our judgment before we act on it.

Studying cognitive biases becomes even more important in the era of big data we’re living in right now. The deluge of information we face makes those cognitive shortcuts more likely to be deployed and more costly in the end. Especially in finance and tech, there’s a dangerous sentiment that given the computational power of modern machines, we can safely delegate high-risk decision-making to them, because they are allegedly free from biases. This is wrong for at least two reasons. First, the algorithms by which these machines work are written by humans. Our cognitive biases easily creep into the logic of the code we write, and then we have a computer making the same mistake a human would, but at a bigger scale and with bigger consequences. Second, for all their computational power, computers architecturally cannot account for ethical and emotional factors when making decisions and forecasts. They cannot practice empathy or weigh the ethical benefits of a decision against its monetary costs. While that might be acceptable for small-scale applications, like buying or selling a particular stock, when it comes to shaping organization-level policies and creating long-term strategies, this deficit has huge negative implications.
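To make the first point concrete, here is a minimal, hypothetical sketch in Python. The applicants, numbers, and decision rule are all invented for illustration; the pattern is what matters. A lending rule derived only from applicants that humans previously approved quietly inherits the survivorship bias of those past human decisions:

```python
# Hypothetical illustration: survivorship bias baked into a "data-driven" rule.
# All numbers are invented for this example.

past_applicants = [
    # (income, repaid_loan, was_approved)
    (90, True,  True),
    (80, True,  True),
    (75, False, True),
    (40, True,  False),  # rejected by a biased human: outcome never observed
    (35, True,  False),
    (30, False, False),
]

# The "objective" rule learns only from applicants humans already approved,
# so low-income applicants look like they never repay, simply because
# they were never given the chance to.
training_data = [(inc, repaid) for inc, repaid, approved in past_applicants if approved]

# Approve anyone at or above the lowest income that repaid in the biased sample.
threshold = min(inc for inc, repaid in training_data if repaid)

def approve(income):
    return income >= threshold

print(approve(85))  # True
print(approve(40))  # False: the human bias now runs at machine scale
```

The point isn’t the toy rule itself; it’s that the machine looks objective while faithfully replicating a human bias, automatically and at scale.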

So just as it’s important for us to cultivate the cognitive capacity for paradox and accept the duality of thought and emotion, it’s also key to understand the limited scope of machine learning and computing in decision-making, especially when the stakes are high. Statistical heavy lifting, high-end encryption, and cryptocurrency mining can surely be delegated to machines, but the best foresight and judgment happen when you combine their performance with that of a bias-aware, proactively critical human brain.

In my course on big data and cognitive biases, we explore the most common and pervasive cognitive biases in organizations: confirmation bias, actor-observer bias, present bias, hindsight bias, heuristic thinking, the fallacies of formal logic, and base-rate neglect (a worked example of that last one follows below). We also get clear and granular on the particular emotional underpinnings of each bias. After years of doing this work, I’ve seen that the best way to help people wrap their heads around biases is through focus groups and interactive sessions in which every participant recalls an example of a particular bias at work in their everyday life. Even at this level, group members start seeing how universal cognitive biases are, which helps dismantle the shame surrounding them and fosters curiosity. From there, we move up a level and study how the same biases influence group dynamics, organizational cultures, and corporate policies. Throughout the course, I use role plays and tests to continually measure how much the material helps participants improve their judgment in bias-provoking situations, and adjust the teaching approach accordingly.
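To give a flavor of one of these biases, here is the classic textbook illustration of base-rate neglect, computed in a few lines of Python (the numbers are the standard textbook values, not data from the course):

```python
# Classic base-rate neglect illustration (textbook numbers, not course data).
# A disease affects 0.1% of the population; the test has 99% sensitivity
# and a 1% false-positive rate. What is P(disease | positive test)?

prevalence = 0.001          # base rate: P(disease)
sensitivity = 0.99          # P(positive | disease)
false_positive_rate = 0.01  # P(positive | no disease)

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"{p_disease_given_positive:.1%}")  # ~9.0%
```

Most people intuit an answer near 99%; the actual probability is about 9%, because the rarity of the condition, the base rate, dominates the accuracy of the test.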

While this course improves the critical thinking and judgment of any person and any group, I’ve seen its biggest impact in tech companies, finance corporations, and medical teams. Although these fields might seem unrelated, their people are conditioned, by virtue of formal training and operational guidelines, to believe that their thinking is mostly rational. That’s why they rarely develop curiosity about, and the ensuing awareness of, the biases that are, quite literally, at work alongside them. At the same time, the quality of judgment in these professions has huge implications. When a human life, a large stock investment, or a costly tech innovation is at stake, the last thing we want is for emotions and cognitive biases to interfere with analyzing the information at hand and choosing the best course of action. No amount of insurance or risk mitigation can compare in power to a measurable, observable improvement in our thinking.

Looking to book individual coaching sessions with me? Please refer to the section below.

Connect with me through email and on social media.