Reducing Our Bias

Muslim women holding up a sign saying "Love Hate" and "we the people" - encouraging us to change our bias towards them by Vlad Tchompalov

Cognitive Bias

We humans are stuck with this thing called cognitive bias.

Because of the way our brains evolved, no matter how much we want to be rational and reasonable, we tend to jump to conclusions; make assumptions; judge others by the group to which they belong; change our memories; and give more weight to information that confirms, rather than challenges, our worldview.

In other words, we’re full of isms and biases that our conscious mind may not even notice.

This isn’t all bad, though. Cognitive biases make life easier. They speed up simple decision-making, make routine possible, and allow us to respond to danger before we even realize there’s a problem, thus keeping us safe.

Unfortunately, our biases also influence who we choose as friends, who we hire, who we vote for, who we trust, and, to some degree, who we love and who we hate. They influence the laws we write, how we raise our children, who we pardon for their crimes, where we donate our money, and if we give away money at all.

“First Thought Wrong”

When we realize our subconscious mind’s incredible influence on us, we might feel overwhelmed and dismayed. However, we can slow down our thinking, become more conscious. The twelve-step slogan, “First thought wrong,” speaks to this. Although our first thoughts aren’t always wrong, when our minds are conditioned to choose drugs or dishonesty if we feel anxious, reconsidering that first impulse makes sense. Indeed, it can sometimes save our lives or the lives of others.

Slowing down also makes sense when we hear religious or political views we don’t like, or when our spouse or co-worker makes us angry, or when we feel disgusted by someone we see on the street. If we can pay attention to our beliefs, we can start to question them.

On the other hand, when we don’t realize how irrational we can be, our biases can so distort our thinking that we’ll happily believe a falsehood we’ve been taught rather than accept truths that conflict with it.

The priest and storyteller Anthony de Mello illustrates this with a story about Nasruddin, the Sufi trickster character.

Nasruddin Is Dead

Once while Nasruddin was musing about life and death, he said aloud, “How do we know what death is?”

His wife, working in the kitchen, scoffed. “Men are always so impractical,” she said. “Everyone knows that when a person is cold and rigid, he is dead.”

Impressed with his wife’s wisdom, Nasruddin decided the issue was settled and went out for a walk. The day was quite cold, and he’d forgotten his gloves, so his hands grew cold and numb. His feet got so cold, he could barely move them.

Remembering his wife’s words, and believing her implicitly, he decided that since he was cold and rigid, he must be dead.

“So what am I doing upright?” he asked, and hastily lay down in the middle of the road.

Nasruddin Is Taken to the Cemetery

An hour later, some travelers discovered him lying there. He did not move, nor did he respond to their questions. Was he alive or dead, they wondered.

Nasruddin wanted to tell them he was dead, but he knew dead men didn’t speak, so he kept his mouth shut.

Finally, the travelers decided the man must be dead, so they hauled his body up onto their shoulders and began to carry him to the cemetery. When they came to a fork in the road, however, they started arguing about which way to go.

Finally, Nasruddin could stand it no more. He said, “Excuse me, but the road to the cemetery is on your left. I know dead men do not talk, but I promise I will not say another word.” [1]

We may laugh at this, thinking we’re nothing like the ridiculous Nasruddin. But part of what makes the story so funny is that all of us cling to unconscious biases that are just as ridiculous.

Bias and Our Brains

Why do we do this?

Exploring what cognitive, emotional, mental, social, physical, and neurological aspects of our being contribute to our cognitive biases would take a book. We can, though, identify a few simple explanations.

For instance, what we see is not really what’s there. First, our conscious minds take in only a small fraction of the sights, smells, and sounds around us, which is one reason eyewitness accounts are so unreliable. Second, our eyes don’t even see the scene our brains tell us is there. Although we don’t feel it consciously, our eyes take in information through quick, jerky movements called saccades. If our brains didn’t compensate for what we actually see, smoothing out our eye movements and filling in gaps, we’d have trouble making sense of the world. To understand what we’re experiencing, we unconsciously construct “a coherent sensory story.” [2]

We need that coherence to feel comfortable in the world.

Memory and Rationalization

That’s why our memories tend to change with time. Research shows that the stories we tell right after an event are different from the ones we tell two years later, after we’ve had time to embellish and redefine our experience. Sometimes we change a story to make ourselves look better. At other times, we seek coherence and meaning, or we might change the story to make our actions look rational and reasonable.

For instance, we will make up explanations about why we did something that have little connection to our true purpose. In split-brain patients, the right brain cannot talk to the left. In one study, such patients sat in front of a machine that could send images and words to just one side of the brain. What the right brain knew, the left did not, and vice versa. Since our left brain is the seat of our conscious and rational mind, these people’s consciousness remained ignorant of what the right brain knew.

When the right brain was given an instruction, say, to stand up, the left brain didn’t realize what was going on. So when the person went ahead and stood, you’d think she’d be confused. But when asked why she stood, the person would invent a reason.

Why can’t our left brain simply admit its ignorance? I don’t know, but I’m pretty sure that whatever reason we come up with will be an invention or rationalization of our left brain. [3]

Twisting Scripture

Yet while we can create a bogus reason to explain why we stand up, when asked to explain our moral reasoning, we flounder. We “believe we are reacting to absolute truths” when we decide what is right or wrong, but “[m]ost moral judgments,” Michael Gazzaniga explains in The Ethical Brain, “are intuitive.” We have trouble providing a rational explanation for what we believe. [4]

Additionally, we choose what to believe. I am repeatedly struck by the way religious followers twist their leaders’ teachings to fit their personal prejudices. Take Christianity, for example. Jesus accepted outcasts and told parables like the Prodigal Son, in which the father loves and accepts both the rebel and the “good boy.” How do we find in his words the license to judge and hate?

Christians aren’t the only ones who interpret their scriptures this way. Buddhists, Muslims, Hindus, and atheists find whatever truths they need to justify the biases they hold. Most of us won’t change our attitude unless a disaster forces us to re-examine our worldview.

That’s why Nasruddin would rather believe he was dead than confront the possibility that his wife didn’t know what she was talking about. Add to this our tendency to categorize, and you can see how we form prejudices and condemn entire groups of people.


Babies eagerly learn the differences and similarities between all the various tables they see, thus allowing them to generalize. If children couldn’t do so, they wouldn’t be able to tell their dining-room table from their couch.

Obviously, to understand the world, we must generalize. If we had to take the time to experience everything strictly on its own merit, as unique and unpredictable, our brains would explode. We wouldn’t be able to communicate or make sense of our experiences.

Still, although this is convenient and at times necessary, our tendency to generalize can cause problems. As Sarah-Jane Leslie explains in her paper, “The Original Sin of Cognition,” it leads to prejudice, racism, sexism, and all the other isms that fester within us. Using our rationalizing and generalizing brain, we decide that “immigrants are dangerous,” or “Muslims are terrorists,” or “black people are lazy.” [5]

If you’re like me, you’re telling yourself you don’t do this, that the only people who do it are the Neo-Nazis or the Antifas or whatever group you don’t like. And maybe you don’t do it much. Yet how many of us would be willing to listen to a white supremacist explore why it’s wrong to fight for his tribe? If we have strong feelings about the fate of Confederate statues, can we discuss the question calmly with those who disagree? [6]

How Do We Slow Down?

How do we slow down, ask questions, consider all sides? When is it right to do that? Is there an absolute definition of evil that we can all agree with, a morality we can all espouse? If so, how do we slow our minds and our hearts down enough to find our way to this universal core? If not, how do we honor the core values others hold, even when they seem as ridiculous as the “truth” that Nasruddin was dead?

Usually, unless a crisis occurs that shatters us and our worldview, we’re unlikely to question our beliefs and values. We may not even see the need to do so. That’s one reason addiction can, paradoxically, be liberating: the pain and suffering we experience when active in our addiction is sometimes horrible enough that we’ll do anything to get out of it, even question everything we think we know.

True, some of us question what we’ve been taught more easily than others. Our genetic predisposition, the stories we were told as children, and the values our caretakers emphasized all affect how well we bounce back when our worldview is turned upside down.

Choosing Not to Label

Regardless, there are things we can do to slow our minds down and question our assumptions. Leslie, for example, suggests that we change how we talk about people from different groups, because the words we use greatly influence our judgments. Labeling, for example, cements an object or person in our minds in a certain way.

When we label a child as “thoughtless” or “careless,” for example, we affect not only how we think about the child, but also how the child thinks about himself. If instead we say the child is being thoughtless or careless, we allow the child to behave differently later, to be thoughtless or thoughtful, careless or careful.

To change our racist attitudes, we can start by changing how we talk about other people. Instead of using the labels “African Americans” or “Muslims,” whether or not we identify with them, Leslie suggests saying “people with darker skin” or “people who follow Islam.” [7] How would this change us and our behaviors?

Perhaps we could also stop calling people “white supremacists” or “Confederates” or “Republican” or “Democrat” and instead call them something like “people who identify with tribalism” or “people who cling to a Confederate past” or “people who voted Republican or Democrat.”

Changing our “First Thought”

Yes, this takes more time. It seems awkward. To notice and change our “first thought” feels uncomfortable. It requires learning to see the world differently, to appreciate things we didn’t use to appreciate, and to be open to new experiences and ideas, even when they seem evil.

Another Nasruddin story may offer insight. A man asked him for help with his garden. Although the man planted roses and tended them carefully, only dandelions grew. He didn’t like dandelions. They were ugly; they were weeds. How could he make them go away?

Nasruddin gave him one suggestion after another, and the man said he’d tried each one, yet each one had failed.

“There’s only one thing left,” Nasruddin said. “You must learn to love dandelions.”

Loving Dandelions – And Our Neighbor

The man didn’t like that advice. At first, he fought against it, but after a while, he did indeed teach himself to love dandelions. In the end, this made his life easier.

Does this mean we should learn to love our enemy? Well, Jesus and the Buddha both thought so. The Hasidic leader, the Baal Shem Tov, taught that we should love even those who sin.

So how do we address the problem of evil? Does evil not exist? Is everything relative?

Of course not. Some actions are unacceptable. That doesn’t mean that those who perform them are unacceptable. Nor does it mean that our definition of good and evil is always the best. Indeed, we can do a lot of questioning, and a lot of listening, without encouraging evil to flourish.

Besides, we’re most likely to be biased when we focus on judging others instead of looking at the truth of our own nature. A little internal jihad wouldn’t hurt any of us.

In faith and fondness,



  1. de Mello, Anthony, The Song of the Bird, Anand, India: Gujarat Sahitya Prakash, 1982, 48-49.
  2. Linden, David, The Accidental Mind: How Evolution has Given Us Love, Money, Dreams, and God, Boston, MA: Belknap, 2008, 255.
  3. Ibid., 226-227.
  4. Gazzaniga, Michael, The Ethical Brain, New York: Dana Press, 2005, 172.
  5. Leslie, Sarah-Jane, “The Original Sin of Cognition: Fear, Prejudice, and Generalization,” unpublished version, 24. See
  6. See and
  7. Leslie 41-42.

Photo by Vlad Tchompalov on Unsplash