

How Mentally Strong People Avoid False Beliefs

Cognitive neuroscientist and behavioral economist; CEO of Disaster Avoidance Experts; multiple best-selling author

Kanisha grew up in a Democratic household in Memphis, Tennessee. As far back as she can remember, her family and friends supported left-leaning candidates. She watched liberal-leaning television programs. She read leftist newspapers. Her Facebook friends posted overwhelmingly liberal-friendly news articles, and Facebook’s newsfeed algorithm edited out the articles posted by her few conservative friends. Google and other search engines likewise served her left-leaning information. Kanisha lives in what is known as a filter bubble, in which she rarely sees information at odds with her views. So, what’s your guess on how she votes?

Considering Other Perspectives

Even when Kanisha learns about evidence for perspectives other than her own, she generally does not give that information due weight. For instance, when her teacher presented strong evidence about the negative side effects of raising the minimum wage, Kanisha decided to Google the phrase “why is raising the minimum wage the right thing to do?”

Do you think the articles that came up helped her gain the most accurate perspective on this politically sensitive issue? By phrasing her Google search that way, Kanisha did not give due consideration to other perspectives. This is characteristic of her behavior: when she hears something that makes her question her beliefs, she looks for ways to protect them rather than searching for the truth.

Now, I don’t mean to pick on Kanisha. This technology-enabled filter bubble is a byproduct of the personalization of the web, and it affects many of us. The filter bubble has combined with another novel aspect of the Internet: how easily new media sources can capture our attention. Websites, bloggers, and similar outlets tend to have lower standards of neutrality and professionalism than traditional news sources. Together, these are key contributors to the polarization of political discourse we’ve seen in recent years.

Addressing Our Thinking Errors

I have to acknowledge that I am sometimes guilty of falling for the filter bubble effect myself. However, I fight the effect with my knowledge of cognitive biases (the thinking errors made by our brain’s autopilot system) and strategies for dealing with them.

The worst thinking error that Kanisha, I, and others exhibit when we ignore information that does not fit our prior beliefs is called confirmation bias. Our brains tend to ignore or forget evidence that runs counter to our current perspective, and will even twist ambiguous data to support our viewpoint and confirm our existing beliefs.


The stronger we feel about an issue, the stronger this tendency becomes. At the extreme, confirmation bias turns into wishful thinking, where our beliefs stem from what we want to believe instead of what is true. Confirmation bias is a big part of the polarization of opinions in politics and in other areas of life.

Updating Your Beliefs

So, how do you deal with confirmation bias and other thinking errors? One excellent strategy is to focus on updating your beliefs. The concept of “updating your beliefs” has helped me and many others who attended Intentional Insights workshops deal with thinking errors. To employ this strategy, it helps to practice mentally associating positive emotions, such as pride and excitement, with the decision to change our minds and update our beliefs based on new evidence.

Being proud of changing our minds is not intuitive, because the emotional part of the brain tends to find changing our minds uncomfortable. It often persuades us to reject information that would otherwise lead us to rethink our opinions. However, we can use the rational part of our mind to train the emotional part to notice confusion, re-evaluate cached thinking and other shortcuts, revise our mental maps, and update our beliefs.

In addition to associating positive emotions with changing your mind, you can use these habits to develop more accurate beliefs:

1) Deliberately seek out evidence that contradicts your opinion on a topic, and praise yourself after giving that evidence fair consideration.

2) Focus in particular on updating your beliefs on controversial and emotional topics, as these are harder for the human mind to manage well.


3) It’s especially beneficial to practice changing your mind often: research suggests that those who update their beliefs more frequently tend to hold more accurate ones.

Taking all of these steps and feeling good about them will help you evaluate reality accurately and thus gain agency to achieve your life goals.

Questions for Consideration

  • When, if ever, have confirmation bias and associated thinking errors steered you wrong? What consequences resulted from these thinking errors?
  • How can you apply the concept of updating beliefs to improve your thinking?
  • What other strategies have you found to help you change your mind and gain a clearer evaluation of reality?
  • How do you think reading this post has influenced your thinking about evaluating reality? What specific steps do you plan to take as a result of reading this post to shift your thinking and behavior patterns?