Solutions and Good Practices for Misinformation

The preceding three blog posts, “Misinformation,” “The Origins of Misinformation,” and “Cognitive Processing of Information,” have painted a pessimistic view of the problem of misinformation. This post proposes some solutions to the problem. All four posts draw heavily on a review article in Psychological Science in the Public Interest.1 This post also draws on Nobel Prize-winning psychologist Daniel Kahneman’s two-system view of human cognition.2 According to Kahneman, we have two systems for processing information. System 1, called Intuition, is very fast and seems to work effortlessly. System 2, called Reasoning, is slow and effortful. System 1 is the default system we use when we are walking, driving, conversing, or engaging in any type of skilled performance. System 2 is what we might term conscious thinking. One of the reasons System 1 is so fast is that it employs heuristics and biases in its processing. Although these shortcuts are correct most of the time, they occasionally lead to errors. System 2 is supposed to monitor System 1’s processing and step in to do some thinking to ensure that nothing has gone wrong. (See the Healthymemory Blog post, “The Two System View of Cognition.”)

It is System 1, which facilitates the processing of good information, that also inadvertently processes misinformation. System 2 is supposed to monitor and correct these mistakes, but that is a very difficult task. A person’s worldview, that is, what the person already believes, has an enormous effect on what is regarded as misinformation. One person’s information can be another’s misinformation. Skepticism reduces susceptibility to misinformation when it prompts people to question the origins of information that may later turn out to be false. One way of dealing with worldview is to frame solutions to a problem in worldview-consonant terms. For example, people who oppose nanotechnology because they have an “eco-centric” outlook might be less likely to dismiss evidence of its safety if the use of nanotechnology is presented as part of an effort to protect the environment.

There is a danger to recognize when trying to correct the effects of misinformation, particularly misinformation about complex real-world issues. People rely more on misinformation that is in line with their attitudes and are relatively immune to corrections of it. Retractions can even backfire and strengthen the initially held beliefs.

So much for the difficulties. Four common misinformation problems follow, along with associated solutions and good practices.

Continued Influence Effect. Despite a retraction, people continue to believe the misinformation. The solution is to provide an alternative explanation that fills the gap left by retracting the misinformation, without reiterating the misinformation itself. Then continue to strengthen the retraction through repetition (without reinforcing the myth).

Familiarity Backfire Effect. Repeating the myth increases its familiarity, thereby reinforcing it. The solution is to avoid repeating the myth and to reinforce the correct facts instead. When possible, provide a pre-exposure warning that misleading information is coming.

Overkill Backfire Effect. Simple myths are more cognitively attractive than complicated refutations. The solution is to provide a simple, brief rebuttal that uses fewer arguments in refuting the myth; less is more. Try to foster healthy skepticism, since skepticism about the source of information reduces the influence of misinformation.

Worldview Backfire Effect. Evidence that threatens a person’s worldview can strengthen initially held beliefs. The solution is to frame the evidence in a worldview-affirming manner, endorsing the values of the audience. Self-affirmation of personal values also increases receptivity to evidence.

It should be clear that correcting the effects of misinformation is not easy. Moreover, the effects of corrections are likely to be modest. Nevertheless, misinformation is a serious problem that needs to be addressed. Clearly more research is needed.

We also need to be aware that our own worldviews influence System 1 processing and contribute to our failure to reject misinformation. Here I am reminded of something Mark Twain said: “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106–131.

2Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.
