This blog post, written by Dr. Terry Flynn and Tim Li, is based on a research paper by Thomas Wood, Ph.D., Assistant Professor at Ohio State University; and Ethan Porter, Ph.D., Assistant Professor of Media and Public Affairs at George Washington University.
- A “backfire” effect, in which facts and corrections to misinformation actually strengthen belief in false claims, may be rarer than researchers previously thought.
- People are more likely to pay attention to new information and use it to update their beliefs than to argue against it.
- However, communicators should be aware that those with deeply held beliefs can use fact-based information to counter-argue their claims. The key is understanding the depth of the opposing beliefs and the degree to which the fact-based claims conflict with those beliefs.
Implications for Public Relations
Given that people generally use the facts they are provided rather than arguing against them, communicators looking to address rumors and false claims should provide facts to correct the misperceptions.
However, the backfire effect still presents a serious challenge and can undermine efforts to support well-informed decision-making. Communication that is perceived as an attack on an individual’s identity or worldview may provoke counter-arguing, resulting in a backfire effect. Professional communicators should understand the issue and the audience they are dealing with before engaging in a fact-based discussion with those who hold strong, deeply held opposing views.
Prior research has suggested that trying to correct people’s misinformed beliefs by presenting them with facts can actually backfire, strengthening their commitment to those beliefs. People may dismiss the facts as weak arguments if they do not see how they are relevant, or may be motivated to generate their own counter-arguments to reject the new information.
Wood and Porter sought to better understand the instances in which corrections are likely to backfire. They conducted a series of experiments in which they presented participants with statements from US politicians expressing a wide variety of political misconceptions. After each statement, participants were asked how much they agreed with it, either immediately or after being shown a correction based on data from a neutral government source. The experiments varied the complexity of the erroneous claims and whether the claims were presented alone or as part of a longer news article.
The authors found no evidence of a backfire effect in any of their experiments, even when the false claim and its speaker aligned with the participants’ own political affiliation. When provided with a correction, participants generally used the new information to update their beliefs, and in no case did the correction motivate them to strengthen their misperceptions. The findings support the notion that most people tend to avoid the mental effort of arguing against corrections.
Blog post compiled by Dr. Terry Flynn and Tim Li of McMaster University.
Wood, T., & Porter, E. (2018). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135–163. https://doi.org/10.1007/s11109-018-9443-y