Cognitive Processing of Information

October 9, 2012

This is the third in a series of four posts on the topic of misinformation and its correction. All four posts draw heavily on a review article in Psychological Science in the Public Interest.1 The first post, “Misinformation,” introduced the problem of misinformation. The second post, “The Origins of Misinformation,” discussed the mechanisms by which misinformation arises. The current post discusses how we process information, assess its truth, and correct misinformation we have received and, mistakenly, believed. This post draws on Nobel Prize-winning psychologist Daniel Kahneman’s two-system view of human cognition.2 According to Kahneman, we have two systems for processing information. System 1, called intuition, is very fast and seems to work effortlessly. System 2, called reasoning, is slow and effortful. System 1 is the default system that we use when we are walking, driving, conversing, or engaging in any type of skilled performance. System 2 is what we might term conscious thinking. One of the reasons that System 1 is so fast is that it employs heuristics and biases in its processing. Although these shortcuts are correct most of the time, they are occasionally wrong. System 2 is supposed to monitor System 1 processing and step in, doing some deliberate thinking, to ensure that nothing has gone wrong. (See the Healthymemory Blog post, “The Two System View of Cognition.”)

System 1’s default mode is to accept information as true, unless the source is questionable at the outset, in which case System 2 raises an alert regarding the accuracy of the information. Were it otherwise, our processing of information would be quite slow, and others would tend to lose patience with us. Accepted information carries a sense of familiarity or fluency. Should the information be unfamiliar, System 2 will likely pay more attention to it, including to its veracity.

System 2 does raise some questions. For example, is the information compatible with what I believe? If it is not, it is likely either to be disregarded or to be examined quite carefully. If the information is a story, System 2 will judge whether it is coherent. If it is incoherent, it will be regarded with suspicion.

Repeated exposure to a statement is known to increase its acceptance as true. Repetition effects can create a perceived social consensus even when no consensus exists. This matters because one of the factors determining whether information is believed is whether others believe it. Social-consensus information is particularly influential when it pertains to one’s reference group. One possible consequence of repetition is pluralistic ignorance: a divergence between the actual prevalence of a belief in a society and what people in the society think others believe. The flip side of pluralistic ignorance is the false-consensus effect, in which a minority of people incorrectly believe that they are in the majority. These effects can be quite strong. People in Australia who hold particularly negative attitudes toward Aboriginal Australians or asylum seekers have been found to overestimate public support for their attitudes by 67% and 80%, respectively. Although only 1.8% of people in a sample of Australians held strongly negative attitudes toward Aboriginal Australians, those few individuals thought that 69% of all Australians (and 79% of their friends) shared their extreme beliefs.
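The size of the false-consensus gap described above can be made concrete with a little arithmetic on the figures the review reports. The following sketch simply subtracts actual prevalence from perceived prevalence; the function name is illustrative, not from the source.

```python
# Illustrative sketch of the false-consensus gap, using the Australian
# figures cited above (1.8% actual prevalence of the extreme attitude,
# versus 69% and 79% perceived agreement). The helper is hypothetical.

def consensus_gap(actual_prevalence: float, perceived_prevalence: float) -> float:
    """Return how far a group's estimate of agreement exceeds the
    actual prevalence of their belief (both as fractions of 1)."""
    return perceived_prevalence - actual_prevalence

actual = 0.018            # 1.8% of the sample held the strongly negative attitude
perceived_public = 0.69   # yet they believed 69% of all Australians agreed
perceived_friends = 0.79  # and 79% of their friends

print(f"Gap vs. general public: {consensus_gap(actual, perceived_public):.1%}")
print(f"Gap vs. friends:        {consensus_gap(actual, perceived_friends):.1%}")
```

A 1.8% minority estimating 69% agreement is off by more than 67 percentage points, which is the scale of distortion that repetition-driven perceived consensus can produce.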

Unfortunately, research indicates that retractions rarely eliminate the influence of misinformation. This is true even when people believe, understand, and later remember the retraction, and it holds even in laboratory research, where misinformation is often retracted immediately and within the same narrative. The situation is worse still when misinformation spreads through media sources: a correction usually appears in a later edition, so the retraction is temporally disjointed from the original report.

Most misinformation is the product of fast System 1 processes, while the failure of retractions reflects faulty System 2 processes. We construct mental models of events. When a retraction disrupts a portion of such a model, System 2 should recognize that the model no longer holds together and revise it. But often we do not; the false information that was retracted is still employed in the model. So System 2 is not doing the strategic monitoring it is supposed to do. Misinformation can carry a fluency and familiarity about it, a product of System 1 processes, that persists when System 2 fails to correct it even though the correct information is available.

There is also the psychological phenomenon of reactance. People generally do not like to be told how to think or how to act, so they may reject especially authoritative retractions. This effect has been documented in courtroom settings where mock jurors are presented with a piece of evidence that is later ruled inadmissible. When jurors are asked to disregard the tainted evidence, their conviction rates are higher when the inadmissibility ruling is accompanied by the judge’s extensive legal explanations than when it is left unexplained.

To this point the presentations have been rather pessimistic. Misinformation is a large problem, produced by many sources and processed by cognitive mechanisms that are vulnerable to misinformation yet fairly indifferent to corrections. The next post, the final one in this series, will offer some partial solutions to this serious problem.

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106–131.

2Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

© Douglas Griffith, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith, with appropriate and specific direction to the original content.