Belief Bias: 3 Belief Bias Examples and How to Avoid the Error
Written by MasterClass
Last updated: Nov 3, 2022 • 4 min read
Belief bias occurs when human reasoning capability breaks down. It causes you to rely on preexisting beliefs rather than follow an argument through to its logical conclusion. This error in cognition is common and hard to detect, given how natural it can feel in the moment. Learn more about belief bias and how to combat it.
What Is Belief Bias?
Belief bias is a cognitive bias in which a person judges the conclusion of an argument based on their prior knowledge and existing beliefs rather than on the validity or invalidity of the argument itself.
Belief bias is closely related to confirmation bias. The two are distinct, however, in that belief bias occurs when a person incorrectly assesses an argument’s conclusion, whereas confirmation bias occurs when a person actively seeks out information to confirm what they already believe. Belief bias is also a form of response bias: it skews how people judge and respond to the arguments put in front of them.
3 Potential Causes of Belief Bias
Belief bias is remarkably common, even among the most astutely rational. Here are three potential reasons people so easily fall prey to this error:
- 1. Cognitive processes: The dual-process theory of reasoning is particularly popular in the field of cognitive psychology. Under this theory, cognitive psychologists assert human beings have two neurological tracks by which they reason—one intuitive (and more prone to belief bias) and the other more actively analytical. On a similar note, others have suggested belief bias occurs when a person leans too heavily on beliefs already held in working memory rather than on analyzing the new information in front of them.
- 2. Deeply held convictions: Falling into belief bias is far more likely when you’re arguing over something you feel particularly passionate about. Ethical, political, and religious discussions, for example, can cause people to rest on their prior knowledge and beliefs. In these circumstances, people might opt for the more personally believable conclusion rather than use objectivity, heuristic techniques, or reasonable decision-making to single out the logical conclusion (even if it differs from what they would like to believe).
- 3. Unclear arguments: Sometimes it’s the person making the argument who is responsible for triggering the other person’s belief bias. If an argument’s premises are unclear, people become more likely to rely on prior beliefs and mental models to assess the argument’s validity. In these cases, the bias arises accidentally from how someone phrases an argument rather than from a more impassioned rejection of the conclusion.
3 Examples of the Belief Bias Effect
Belief bias crops up in all sorts of different scenarios and situations. Consider these three examples:
- 1. An ethical quandary: A dedicated vegetarian might argue it’s never okay to eat meat, citing ethical reasons, even in a life-or-death situation. A different person might argue if a person is starving, eating any kind of food is justified. Both parties will want to reject the other’s conclusion because of their beliefs.
- 2. A political election: Politics serves as a natural battlefield in which to study this error in social psychology and reasoning. Suppose a candidate loses and then insists their opposition could only have won by cheating but fails to produce evidence to substantiate this conclusion. Similarly, suppose a candidate wins by a slim margin and then proceeds to govern as if they have a mandate befitting a landslide election. In either case, these politicians are operating on the basis of prior beliefs rather than objective realities.
- 3. A formally valid syllogism: An argument can be valid in form yet still untrue, pointing to an unbelievable conclusion. Take this syllogism: “All birds can fly. A penguin is a bird; therefore, a penguin can fly.” The conclusion follows necessarily from the premises, so the argument is valid—but it is unsound, because the first premise is false. Belief bias can also run the other way, making a flawed argument feel right: “All chickens lay eggs. This bird has laid an egg; therefore, it must be a chicken” may sound believable, but it is formally invalid, since other birds lay eggs too. Use your logical reasoning ability to assess both an argument’s validity (its formal accuracy) and its truth (its correspondence to reality), rather than falling back on your prior beliefs about either.
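The validity-versus-soundness distinction above can be checked mechanically. Here is a minimal Python sketch (the function names and the propositional encoding are illustrative, not part of any formal logic library) that brute-forces every truth assignment: an argument form is valid exactly when no assignment makes all the premises true and the conclusion false.

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material conditional: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

def is_valid(premises, conclusion, n_vars):
    """Valid = no truth assignment makes every premise true while the conclusion is false."""
    for env in product([True, False], repeat=n_vars):
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a counterexample
    return True

# Skeleton of "All birds can fly; a penguin is a bird; therefore, a penguin can fly,"
# with env[0] = "is a bird" and env[1] = "can fly":
premises = [lambda e: implies(e[0], e[1]),  # if it is a bird, it can fly
            lambda e: e[0]]                 # it is a bird
conclusion = lambda e: e[1]                 # it can fly

print(is_valid(premises, conclusion, 2))  # True: the FORM is valid...
# ...yet the argument is unsound, because the premise "all birds can fly" is false.

# The chicken argument affirms the consequent: "if chicken then lays eggs; it lays
# eggs; therefore chicken," with env[0] = "is a chicken" and env[1] = "lays eggs":
chicken_premises = [lambda e: implies(e[0], e[1]), lambda e: e[1]]
chicken_conclusion = lambda e: e[0]
print(is_valid(chicken_premises, chicken_conclusion, 2))  # False: formally invalid
```

Note that the code never asks whether a premise is actually true; that is precisely the point—validity is a property of form alone, while truth (and therefore soundness) depends on the world.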
How to Avoid Belief Bias
It’s possible to counteract belief bias if you keep a vigilant eye on yourself. Keep these tips in mind to reason more accurately:
- Break the argument down. Accurate information processing relies on breaking arguments into smaller chunks. If you think you might be letting your prior point of view cloud your objective reasoning, take a step back and use the syllogistic reasoning process. Try to state an argument in these simple terms: “If A, then B. If B, then C. Therefore, if A, then C.” If the conclusion follows necessarily from the premises, the argument is valid; if it doesn’t, the argument is invalid—regardless of how believable you find the conclusion.
- Distinguish between validity and truth. The believability of a conclusion bears on the truth of an argument, not its logical validity: a valid argument can still reach a false conclusion if one of its premises is false. Conflating truth and validity can cause you to fall into belief bias easily. Distinguishing between these two attributes so you can identify invalid syllogisms is one of the most important reasoning tasks for any logician.
- Remain objective. Try your hardest to keep an eye on where your personal beliefs might creep into your deductive reasoning. Review your own thinking to identify where belief bias is most likely to surface for you. Perhaps you become particularly prone to the error when you get fired up about politics or religion, or maybe it’s when you feel you have inadequate time to truly reason through an argument.
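The “If A, then B. If B, then C. Therefore, if A, then C” pattern from the first tip can itself be verified exhaustively. This short Python sketch (the variable names are illustrative assumptions) checks every possible truth assignment of A, B, and C for a counterexample—an assignment where both premises hold but the conclusion fails:

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material conditional: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

# Search for a counterexample to the chain
# "If A then B; if B then C; therefore, if A then C"
# across all eight truth assignments of A, B, and C.
counterexamples = [
    (a, b, c)
    for a, b, c in product([True, False], repeat=3)
    if implies(a, b) and implies(b, c) and not implies(a, c)
]

print(counterexamples)  # []  -- no counterexample exists, so the chain is valid
```

Because the search comes up empty, the chained form is valid no matter what A, B, and C stand for—which is exactly why restating an argument in this skeleton helps you judge its structure separately from how believable its content feels.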
Want to Learn How to Be More Empathetic?
Practicing empathy can help you lead more effectively while building stronger relationships across the personal and professional facets of your life. Challenge your perceptions with the MasterClass Annual Membership and take lessons on emotional intelligence from Pharrell Williams, Roxane Gay, Gloria Steinem, Dr. Cornel West, Walter Mosley, Robert Reffkin, and Robin Arzón.