A few years ago, I was observing a meeting that one of my clients was having with his direct report. I listened to their conversation, heard the direct report share a seemingly simple story of the way he had handled an issue, and watched my client react with anger…to his interpretation of that simple story. From everything I witnessed, his anger was based on his interpretation, not on the actual facts. He was making up a story.
We all make up stories, all day long. It’s the way our brains work, or rather the way our brains have evolved to work, in order to make as much sense of the world as we can. It’s great…until it gets in our way. Until we believe our stories, as if they are absolutely true. Which happens all the time.
The psychology behind this is cognitive bias. We have developed cognitive biases (see the cognitive bias cheat sheet compiled by Buster Benson) that, in many ways, help us understand the world. These biases help us handle:
- too much information (how do we take it all in?) – there is simply too much information in the world for us to process it all, so we have to filter things out and select what we notice. We therefore notice things that stand out or are different (bizarreness effect) or that confirm what we already know (confirmation bias). Unfortunately, we therefore miss things that go against our belief system and things that seem more routine.
- unclear information (what does it all mean?) – we connect the dots and make sense of information that doesn’t make much sense. We reconstruct the world when we don’t have enough information (clustering illusion) and fill in gaps of information with our best guesses and past experiences (stereotyping and authority bias). We think we know what others are thinking (spotlight effect) and more. The challenge is that we forget that we’re making up some of this information, and we believe it to be true.
- information that might slow us down (how do we quickly decide what to do?) – we often need to act fast, without all the information and without time to gather more. We therefore become overconfident in our own abilities (overconfidence effect and optimism bias), push to finish things we have started (sunk cost fallacy), and favor simple options (information bias). Unfortunately, the same biases that help us move quickly also cause us to miss important information and skip the further research that might lead to a more powerful result.
- information that is too much to remember (what do we need to remember?) – we aren’t able to remember everything, and our biases therefore select – somewhat arbitrarily – what we’ll remember and how we’ll remember it. We edit memories (misattribution of memory) and reduce events to their key elements (peak-end rule). This helps us remember what we think we need to remember, but it can make us remember things incorrectly. (Remember, I wrote a memoir. I know this to be true.)
These biases help us in many ways. That is why we’ve evolved to have so many of them. However, as wonderful as these biases are at simplifying things for us, they often simplify things too much, and we take in, understand, decide, and remember things in ways that are flawed. (See this TED talk by J. Marshall Shepherd that highlights three biases that get in our way when we’re examining scientific data.)
So, if we are prone to make up stories – stories that are created and reinforced by the biases in our brains – how can we ever get past this and communicate clearly? How can we come to shared understanding with others? How can we keep our biases from getting in our way – especially as we try to connect with, and lead, others? Here are four first steps:
- Accept that we are often flawed in our thinking – there is a common saying, “awareness is the first step.” If we acknowledge that our thinking – and decision making – is inherently flawed, we can be more open to thinking through and beyond our biases, to get to a fuller truth.
- Learn more about our biases – when we know the various cognitive biases, we’re more likely to be aware of them when they kick in (or after they kick in), which can help us, again, be more open to thinking through and beyond them.
- Ask others to help you – open communication and open-ended questions can help us get less caught in our own stories and interpretations. Ask others, “What do you see? What makes sense to you? How do you interpret this? What am I missing?” and listen to their answers, even – or especially – if they contradict yours.
- Repeat steps 1 through 3 – it is a potentially never-ending cycle.
Our brains evolved to help us navigate the overload of information and facilitate our need to make sense of the world. However, the biases in our brain can get in our way, especially because we’re biased to think we have no biases. By becoming as aware as we can be of the stories we may be making up, and by looking for the potentially flawed thinking behind those stories, we are more likely to find the real truth.
How have you learned to recognize your own cognitive biases? How has this helped you?
Please leave a comment.
If you enjoyed this post, you can read more like it in our book, The Power of Thoughtful Leadership: 101 Minutes To Being the Leader You Want To Be, available on Amazon.
For support in becoming more aware of your “stories,” contact Lisa at firstname.lastname@example.org.
Click here to receive The Thoughtful Leaders™ Blog posts via e-mail and receive a copy of “Ending Leadership Frenzy: 5 Steps to Becoming a More Thoughtful and Effective Leader.”
Photo Credit: cartoon resource/Bigstock.com