This blog post, written by Dr. Terry Flynn and Tim Li, is based on a research paper by Gordon Pennycook, Assistant Professor at the University of Regina, and David G. Rand, Associate Professor of Management Science and Brain and Cognitive Sciences at the Massachusetts Institute of Technology.

Key Findings

• People often fall for misinformation (false or misleading news or information) because they are not thinking carefully or deeply enough about the content they encounter.
• Individuals with a stronger tendency to engage in analysis and evaluation of information are less susceptible to misinformation, even if it aligns with their ideology.

Implications for Public Relations

Understanding why people are susceptible to believing misinformation is critical to developing communication strategies that mitigate its impact. Organizations looking to reduce belief in false or misleading claims should encourage more careful, analytic thinking among the stakeholders those claims affect.

Targeted communication prompts or nudges to evaluate information more carefully, along with specific insights into how fake news spreads, can help stakeholders move away from relying solely on intuition, which is generally guided by factors unrelated to a claim's accuracy, such as how familiar or recent it feels.

The findings of this study offer some optimism that such efforts can reduce the perceived accuracy of misinformation even when it supports a stakeholder's pre-existing attitudes or ideology, suggesting that people are not always biased by their affiliations. In many cases, they simply need to think more carefully and deeply about what they see or hear.


To examine the relationship between reasoning and susceptibility to misinformation, Pennycook and Rand presented more than 3,000 participants with a variety of real and fake political headlines and asked them to judge the accuracy of each claim. Participants then completed a cognitive reflection test, which assessed their ability to override their intuitive, gut reactions and think more analytically to solve a problem.

The results show that more analytic individuals were better at distinguishing real news from fake news. This held true even when the misinformation aligned with their political partisanship, suggesting that people are not simply biased toward believing only information that supports their ideology. More often, they fall for misinformation because accepting it is easier than expending the cognitive resources needed to analyze it.

Processing information requires mental effort, and processing contentious or challenging information requires considerably more. When people do not feel a deep connection to or investment in the information, they tend to avoid exerting that effort, conserving their thinking capacity for other tasks. Instead of critically evaluating a claim, they rely on mental shortcuts, such as the perceived credibility of the communicator or how familiar the claim feels, to decide whether it is accurate.

Blog post compiled by Dr. Terry Flynn and Tim Li of McMaster University.


Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.

