The Downward Spiral of Cognitive Bias - And How to Defend Against It
We all make bad decisions. Sometimes we are unlucky, and sometimes we make honest mistakes. But bad decisions are also frequently the result of bad thinking — choices that are impulsive, poorly thought out, or based on misinformation. Quite often, we are victims of our own cognitive biases, which can serve as self-reinforcing spirals of mistaken ideas disengaged from reality. Many of these are preventable – if we can teach ourselves to think more rationally!
I’ve long been interested in the challenge of being rational for its own sake, as the path to truth. Exploring that path has, as I’ve discussed in my longer essays, revealed the deep shortcomings in our ability to know the truth. Yet being rational is also valuable instrumentally in helping us make better, more successful choices in life. Nassim Taleb actually equates rationality with being successful – a somewhat limiting idea that I recently critiqued in Why Nassim Taleb is Wrong.
I’ve also concluded – what may seem obvious – that no one can ever achieve perfect rationality. We are, after all, only human. Yet rationality is a worthy goal. And I’ve argued that we can get better by following four principles, which I call Rational Meta-Principles:
- Humility – Recognize the possibility that any given claim (including your own) may be wrong.
- Reflection – Be aware of the potential biases of those making a given claim (including your own).
- Openness – Be open to possibilities. Weakly supported claims may be true, while strongly supported claims may be false.
- Intention – Try to understand the motivations at work (including your own) and how they may influence a claim.
One of my readers, Simon Saval, an investor and self-proclaimed science geek in LA, recently shared a link to his Survival Guide on Cognitive Bias, which provides a helpful set of recommendations on how to avoid some of the pitfalls of making irrational or ill-advised decisions. I found it a quick and helpful read that fits nicely with my own more philosophical recommendations on how to be rational. Simon illustrates the top ten forms of cognitive bias that interfere with good decision-making, and offers simple tips to avoid them:
- Ambiguity Effect (avoiding uncertainty) – fill in the gaps in your information.
- Availability Heuristic (emotional tilt) – focus on the facts, not the anxiety.
- Bandwagon Effect (what lemmings do) – step back and reassess.
- Bias Blind Spot (the plank in one’s eye) – be honest with yourself.
- Confirmation Bias (hearing what we want to hear) – find arguments on both sides.
- Declinism Bias (everything’s going bad) – defend against nostalgia with data.
- Ostrich Effect (head in the sand) – look hard at the elephant in the room.
- Outcome Bias (it worked last time) – double-check the odds.
- Stereotyping (our first impressions) – recheck the conclusions that you jump to.
- Survivorship Bias (seeing only the victories) – consider all those who tried and failed.
In our increasingly frenetic and distracted world, choosing among the many available options and alternatives can be tough. This applies equally to the judgments and choices we make about politics, people, and relationships — decisions with deep emotional content and significant consequences can be particularly challenging. If we take a few minutes to evaluate our potential cognitive biases before making such judgments and choices, we increase the chances of better outcomes. Rather than spiraling down in what can become a series of bad decisions based on biased, mistaken reasoning, we can reverse the spiral and make progress towards better judgments, better decisions and a better life.
Making good decisions can be tough. Thinking about our biases can make those decisions better!