Posts Tagged ‘Misinformation’

What to Do About Disinformation

December 3, 2019

The title of this post is identical to the title of a section in Richard Stengel’s informative book, Information Wars. The book covers not only the State Department, but also the actions Rick Stengel took performing his job there. The most useful part of the book, however, is this section, What to Do About Disinformation. Several posts are needed here, and even then they cannot do justice to the information provided in the book.

The Library of Congress, created in 1800, today holds some 39 million books. The internet generates 100 times that much data every second. Information definitely is the most important asset of the 21st Century. Polls show that people feel bewildered by the proliferation of online news and data, and mixed into this daily tsunami is a lot of information that is false as well as true.

Disinformation undermines democracy because democracy depends on the free flow of information. That’s how we make decisions. Disinformation undermines the integrity of our choices. According to the Declaration of Independence “Governments are instituted among Men, deriving their just powers from the consent of the governed.” If that consent is acquired through deception, the powers from it are not just. Stengel states that it is an attack on the very heart of our democracy.

Disinformation is not news that is simply wrong or incorrect. It is information that is deliberately false in order to manipulate and mislead people.

Definitions of important terms follow:
Disinformation: The deliberate creation and distribution of information that is false and deceptive in order to mislead an audience.
Misinformation: Information that is false, though not deliberately; that is created inadvertently or by mistake.
Propaganda: Information that may or may not be true that is designed to engender support for a political view or ideology.

“Fake news” is a term Donald Trump uses to describe any content he does not like. But Trump did not originate the term. The term was familiar to Lenin and Stalin and almost every other dictator of the last century. Russians were calling Western media fake news before Trump, and Trump in his admiration of Russia followed suit. Stengel prefers the term “junk news” to describe information that is false, cheap, and misleading that has been created without regard for its truthfulness.

Most people regard “propaganda” as pejorative, but Stengel believes that it is—or should be—morally neutral. Propaganda can be used for good or ill. Advertising is a form of propaganda. What the United States Information Agency did during the Cold War was a form of propaganda. Advocating for something you believe in can be defined as propaganda. Stengel writes that while propaganda is a misdemeanor, disinformation is a felony.
Disinformation is often a mixture of truth and falsity; it doesn’t have to be 100% false to qualify as disinformation. Indeed, Stengel writes that the most effective forms of disinformation mix the true with the false.

Stengel writes that when he was a journalist he was close to being a First Amendment absolutist, but he has changed his mind. He writes that in America the standard for protected speech has evolved since Holmes’s line about “falsely shouting fire in a theater.” In Brandenburg v. Ohio, the Court ruled that speech directed to inciting imminent lawless action, and likely to produce it, is not protected by the First Amendment.

Stengel writes that even outlawing hate speech will not solve the problem of disinformation. He writes that government may not be the answer, but it has a role. He thinks that stricter government regulation of social media can incentivize the creation of fact-based content and disincentivize the creation of disinformation. Currently the big social media platforms optimize for content with greater engagement and virality, and such content can sometimes be disinformation or misinformation. Stengel thinks these incentives can be changed in part through regulation and in part through more informed user choices.

What Stengel finds most disturbing is that disinformation is being spread in ways and through means that erode trust in public discourse and democratic processes. This is precisely what these bad actors want to accomplish. They don’t necessarily want you to believe them—they don’t want you to believe anybody.

As has been described in previous healthy memory blog posts, the creators of disinformation use all the legal tools on social media platforms that are designed to deliver targeted messages to specific audiences. These are the same tools—behavioral data analysis, audience segmentation, programmatic ad buying—that make advertising campaigns effective. The Internet Research Agency in St. Petersburg, Russia uses the same behavioral data and machine-learning algorithms that Coca-Cola and Nike use.

All the big platforms depend on the harvesting and use of personal information. Our data is the currency of the digital economy. The business model of Google, Facebook, Amazon, Microsoft, and Apple, among others, depends on the collection and use of personal information. They use this information to show targeted advertising. They collect information on where you go, what you do, whom you know, and what you want to know about, so they can sell that information to advertisers.

The important question is, who owns this information? These businesses argue that because they collect, aggregate, and analyze our data, they own it. In the U.S., the law agrees. But in Europe, under the EU’s General Data Protection Regulation, people own their own information. Stengel and HM agree that this is the correct model. America needs a digital bill of rights that protects everyone’s information as a new social contract.

Stengel’s concluding paragraph is “I’m advocating a mixture of remedies that optimize transparency, accountability, privacy, self-regulation, data protection, and information literacy. That can collectively reduce the creation, dissemination, and consumption of false information. I believe that artificial intelligence and machine learning can be enormously effective in combating falsehood and disinformation. They are necessary but insufficient. All three efforts should be—to use one of the military’s favorite terms—mutually reinforcing.”

POLITIFACT

June 5, 2019

The problem of misinformation is acute, and currently the best means of addressing it is POLITIFACT (politifact.com). POLITIFACT is a winner of the Pulitzer Prize. It is said that all politicians lie, and that is the truth. It is also likely that practically all humans lie. What one discovers in POLITIFACT is that all politicians also tell the truth, if only rarely. Moreover, there is no strict dichotomy between true and false; rather, there are shades of truth and falsehood. That is why POLITIFACT uses a Truth-O-Meter that ranges as follows (a small illustrative sketch in code follows the list):

True
Mostly True
Half True
Mostly False
False
Pants-on-Fire, which is a rating that means the item is flamingly false.
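For readers who like to think in code, the Truth-O-Meter can be treated as an ordered scale rather than a true/false switch. Below is a minimal sketch in Python; the ordering follows the list above, but the numeric values and the leans_true helper are illustrative assumptions of mine, not anything POLITIFACT publishes.

from enum import IntEnum

class TruthOMeter(IntEnum):
    # POLITIFACT's six ratings as an ordered scale, falsest first.
    # The numeric values are arbitrary; only their ordering matters.
    PANTS_ON_FIRE = 0  # flamingly false
    FALSE = 1
    MOSTLY_FALSE = 2
    HALF_TRUE = 3
    MOSTLY_TRUE = 4
    TRUE = 5

def leans_true(rating: TruthOMeter) -> bool:
    # A statement "leans true" if it rates at least Half True.
    return rating >= TruthOMeter.HALF_TRUE

print(leans_true(TruthOMeter.MOSTLY_FALSE))  # prints: False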

You can look up specific issues and see a scorecard (the breakdown of the ratings) as well as a sampling of individual ratings of statements by prominent individuals. HM found this feature especially useful.

Each rating is accompanied by a prose passage explaining its justification. So one doesn’t need to accept the rating at face value, but the justification should be read to understand the basis for the rating.

One can access different editions. There is a national edition; a PunditFact edition, which addresses various pundits; and a Health Check edition, which addresses health topics. There is a Facebook Hoaxes edition, which is especially needed. There are also editions specific to states, though not all states are covered yet.

Certain individuals merit special editions. Visit the website to see who they are.
There is a Promises heading with a Trump-O-Meter and an Obameter.

There is also a Pants-on-Fire heading that allows you to examine the most egregious lies.

It is highly recommended that you visit this website on a regular basis and spend as little or as much time as you want.

The problem of misinformation is chronic, and POLITIFACT provides the best means of dealing with it.

Passing 73

May 6, 2019

Passing 73 means that today HM enters his 74th year. He engages in ikigai, the Japanese term for living a life with purpose, a meaningful life. His purpose, in addition to living a fulfilling life with his wife, is to learn and to share his thoughts and knowledge with others. HM does this primarily through his blog, healthymemory, which focuses on memory health and technology.

HM’s Ph.D. is in cognitive psychology. That field has transitioned into cognitive neuroscience, a field of research and a term that did not exist when HM was awarded his Ph.D. HM is envious of today’s students. However, he is still fortunate enough to be able to keep abreast of current research and to relay relevant and meaningful findings from this field to his readers.

What is most disturbing is the atmosphere of fear and hate that prevails today. It is ironic that technology, which had, and still has, a tremendous potential for spreading knowledge, now largely spreads disinformation, hatred, and fear.

HM understands why this is the case, but, unfortunately, he does not know how to counter it.

The problem can best be understood in terms of Nobel Laureate Daniel Kahneman’s two-system view of cognition. System 1, intuition, is our normal mode of processing and requires little or no attention. Unfortunately, System 1 is largely governed by emotions; fear and hate are System 1 processes. System 2, commonly referred to as thinking, requires our attention. One of the roles of System 2 is to monitor System 1. When we encounter something that contradicts what we believe, the brain sets off a distinct signal. It is easier to ignore this signal and continue System 1 processing; engaging System 2 requires attentional resources to resolve the discrepancy and seek further understanding. To put Kahneman’s ideas into the vernacular, System 2 involves thinking, while System 1 is automatic and requires virtually no cognitive effort. Emotions are a System 1 process, as is identity-based politics: going with people who look like you requires no thinking yet provides social support.

Trump’s lying is ubiquitous. Odds are that anything he says is a lie. His entire candidacy was based on lies. So why is he popular? Identifying lies and correcting misinformation requires mental effort, System 2 processing. It is easier to be guided by emotions than to expend mental effort. The product of this cognitive miserliness is a stupidity pandemic.

Previous healthy memory posts have emphasized the enormous potential of technology. Today people, especially young people, are plugged into their iPhones. Unfortunately, the end result is superficial processing. They get information expeditiously, but they are so consumed with staying in touch with updated information that they have neither time nor attention left for meaningful System 2 processing. Technology, specifically social media, amplifies these bad effects, increasing misinformation, hatred, and fear. Countering these bad effects requires implementing System 2 processes, that is, thinking. A massive failure to do this enables Trump to build his politics on lies that spread hatred and fear.

As has been written in many previous healthy memory posts, System 2 processing will not only benefit politics, but will also decrease the probability of suffering from Alzheimer’s and dementia.

Personally, all this is upsetting. But HM believes it is essential to love one’s fellow humans. He tries to deal with this via meditation. Progress is both difficult and slow, but the work needs to be done. Hatred destroys the one who hates. So HM continues a daily struggle to be a better human being.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Hey US Teachers, leave those climate myths alone

February 22, 2016

The title of this post is the title of an article by Michael Mann in the Feb 20-26, 2016 edition of the New Scientist. He summarized an article in the 12 Feb 2016 issue of Science, whose authors are Eric Plutzer, Mark McCaffrey, A. Lee Hannah, Joshua Rosenau, Minda Berbeco, & Ann H. Reid (doi.org/bcgt). Even though 97% of active climate scientists attribute recent global warming to human causes, and most of the general public accept that climate change is occurring, only about half of U.S. adults believe that human activity is the predominant cause. The U.S. ranked lowest in this belief among 20 nations polled in 2014.

The article examines how the societal debate in the U.S. affects science classrooms. The authors found that whereas most U.S. science teachers include climate change in their courses, their insufficient grasp of the science appears to hinder effective teaching. Generally, teachers devote a paltry one to two hours to this important topic. Despite the fact that 97% of experts agree climate change is mainly human-caused, many teachers still “teach the controversy,” suggesting a sizable “consensus gap” exists. The survey showed that seven in ten teachers mistakenly believe that at least a fifth of experts dispute human-caused climate change. Although they are supposed to be teaching science, many have insufficient knowledge of the discipline they are teaching.

In his book “The Hockey Stick and the Climate Wars,” Michael Mann describes how those with interests in fossil fuels have spent tens of millions of dollars to create the impression of a consensus gap by orchestrating a public relations campaign aimed at attacking the science and the scientists, thus confusing the public about the reality and threat of climate change. They have also created a partisan political divide on the issue. The United States is the only advanced country with a major political party denying the reality of climate change.

These climate myths provide an unfortunate example of the effectiveness of Big Lies. These Big Lies are doing their damage in the United States, and not only on the issue of climate change.

These myths on climate change are exacerbating a problem and bequeathing it to our children. These children need to know the truth so that they can educate their parents.

© Douglas Griffith and healthymemory.wordpress.com, 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

More on Erroneous Eyewitness Testimony

March 11, 2015

This post is based primarily on an article by Steven J. Frenda, Rebecca M. Nichols, and Elizabeth F. Loftus titled “Current Issues and Advances in Misinformation Research,” in Current Directions in Psychological Science (2011), 20, 20-23. They note a recent discussion by Wright, Memon, Skagerberg, and Gabbert (2009), in Current Directions in Psychological Science, 18, 174-178, of the distorting effects witnesses have on the memory of other witnesses. They propose three accounts of why eyewitnesses come to report incorrect information:
A witness’s report may be altered due to normative social influence.  A witness might decide that the cost of disagreeing with law enforcement—or with other witnesses—is too high, and so adjusts her report accordingly.
Through informational social influence processes, a witness comes to endorse a version of events that is different from what he remembers because he believes it to be truer or more accurate than his own memory.
A witness’s memory can become distorted, sometimes as a result of being exposed to incorrect or misleading information.
It is this third possibility that this blog post addresses.

Perhaps the first question is “who is vulnerable?” The short answer is that nobody is immune to the distorting effects of misinformation, but some people are more vulnerable than others. Very young children and the elderly are more susceptible to misinformation than adolescents and adults. People who report lapses in memory and attention are also especially vulnerable. These facts suggest that a poverty of cognitive resources results in an increased reliance on external cues to reconstruct memories. Misinformation effects are easier to obtain when individuals’ attentional resources are limited. Similarly, people who perceive themselves to be forgetful and who experience memory lapses may be less able or willing to depend on their own resources as the sole source of information as they mentally reconstruct an event.

Two major studies containing more than 400 participants explored cognitive ability and personality factors as predictors of susceptibility to misinformation. In these studies participants viewed slides of two crimes and later read narratives of the crimes that contained misinformation. Participants with higher intelligence scores, greater perceptual abilities, greater working memory capacities, and better performance on face recognition tasks tended to resist misinformation and produce fewer false memories. Some personality characteristics were also associated with false memory formation, particularly in individuals with lesser cognitive ability. Being low in fear of negative evaluation and harm avoidance, and high in cooperativeness, reward dependence, and self-directedness, was associated with increased vulnerability to misinformation effects.

Functional magnetic resonance imaging (fMRI) is being used to investigate the brain activity associated with misinformation effects. In one study participants were shown a series of photographs and later listened to an auditory narrative describing them, which included misinformation. Shortly thereafter, they were placed in an MRI scanner and given a test of their memory for the photographs. fMRI data revealed similar patterns of brain activity for both kinds of memories, but the true memories (formed by visual information) showed somewhat more activation in the visual cortex, whereas the false memories (derived from the auditory narrative) showed somewhat more activity in the auditory cortex.

Obviously a critical question is how to protect against misinformation effects. To this end, a cognitive interview (CI) methodology has been developed, consisting of a set of rules and guidelines for interviewing eyewitnesses. For example, the recommended methodology uses free recall, contextual cues, temporal ordering of events, and recalling an event from a variety of perspectives (for example, from the perpetrator’s point of view). The technique also recommends that investigators avoid suggestive questioning, develop rapport with the witness, and discourage witnesses from guessing. Research has supported the idea that the CI reduces or eliminates the misinformation effect.

Here the misinformation effect is considered only in the context of eyewitness testimony. Unfortunately, misinformation is a large problem that has only been exacerbated by the advent of the internet. The central problem is that it is difficult to correct misinformation. I would contend that there is an epidemic of misinformation, with large numbers of people holding notions contrary to science, and it is extremely difficult to correct their misconceptions. To read more about misinformation, simply enter “misinformation” into the healthymemory search box.

More on Avoiding Collapse

March 3, 2013

Preceding posts have been on Costa’s The Watchman’s Rattle: A Radical New Theory of Collapse. The immediately preceding post was on insight, a cognitive capability that Costa believes could prevent collapse. This post expands on that theme. Insight is closely related to creativity, and there have been many healthymemory blog posts on creativity (just enter “creativity” into the search box on the healthymemory blog).

The central thesis of Costa’s The Watchman’s Rattle: A Radical New Theory of Collapse is that societies collapse as a result of beliefs not keeping up with facts. She writes of five supermemes that threaten civilization. They are: Irrational Opposition, The Personalization of Blame, Counterfeit Correlation, Silo Thinking, and Extreme Economics. These supermemes result in defective cognitive processes and unhealthy memories. We need to be aware of them in both ourselves and others. When appropriate, challenge others you find fostering these supermemes. The reality is that the solutions to the vast majority of our problems exist, but these supermemes operate to prevent their implementation.

These supermemes are types of unhealthy memories. And they are unhealthy memories that threaten civilization. They need to be stamped out.

Transactive memory is one of the major topics of the healthymemory blog. There are two types of transactive memory. One is technological, and includes conventional technology, from paper publications to the modern technology of electronic publication and communication; the other resides in our fellow human beings. Many of the solutions can be found there, as well as the technology for the collaborations and discussions that lead to these solutions. Our rapidly changing and increasingly complex societies require collaboration and team efforts to reach solutions. Social interactions are important to maintaining a healthy memory, and interactions among many, many healthy memories are what is needed not only for our civilization to survive, but also for our species to survive.

In addition to the supermemes, one of the risks is the amount of misinformation that is available. What is particularly alarming is that there is ample evidence of concerted efforts by vested interests to disseminate misinformation (see the healthymemory blog post “The Origins of Misinformation”). This willful manufacture of mistaken beliefs has earned its own term, “agnogenesis.” The comic strip Doonesbury introduced an online service, myFacts, that would provide you with facts to support anything you believed or wanted to support. Although Doonesbury is a comic strip, it parodies an underlying reality. One needs to be on the alert for these efforts.

There is an increasing realization that being cognitively active is important not only to reduce or preclude the effects of dementia as we age, but also to allow us to participate effectively in our complex society. Costa writes of businesses, analogous to the gyms and health centers designed for our bodies, where the exercises and workouts are designed to sharpen our minds. The digital brain health market is expanding at a rapid rate. Just enter “Healthy Memory” into a search site such as duckduckgo.com to find a wealth of resources (enter “Healthy Memory Blog” to find the current blog). “Brain fitness” will also return a wealth of sites. Many of these sites are commercial, but others are free. Readers who have found worthwhile sites are encouraged to share those sites and their reviews as comments on this post.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Solutions and Good Practices for Misinformation

October 10, 2012

The preceding three blog posts, “Misinformation,” “The Origins of Misinformation,” and “Cognitive Processing of Information,” have painted a pessimistic view of the problem of misinformation. This post will propose some solutions to the problem. All four posts draw heavily on a review article in Psychological Science in the Public Interest.1 This post also draws on Nobel Prize-winning psychologist Daniel Kahneman‘s two-system view of human cognition.2 According to Kahneman, we have two systems for processing information. System 1, called intuition, is very fast and seems to work effortlessly. System 2, called reasoning, is slow and effortful. System 1 is the default system that we use when we are walking, driving, conversing, or engaging in any type of skilled performance. System 2 is what we might term conscious thinking. One of the reasons that System 1 is so fast is that it employs heuristics and biases in its processing. Although these are correct most of the time, occasionally they are wrong. System 2 is supposed to monitor System 1 processing and step in and do some thinking to ensure that nothing is wrong. (See the healthymemory blog post, “The Two System View of Cognition.”)

It is System 1, which facilitates the processing of good information, that inadvertently processes misinformation. System 2 is supposed to monitor and correct these mistakes, but it is a very difficult task. A person’s worldview, what the person already believes, has an enormous effect on what is regarded as misinformation. One person’s information can be another’s misinformation. Skepticism reduces susceptibility to misinformation effects when it prompts people to question the origins of information that may later turn out to be false. One way of dealing with this worldview is by framing solutions to a problem in worldview-consonant terms. For example, people who might oppose nanotechnology because they have an “eco-centric” outlook might be less likely to dismiss evidence of its safety if the use of nanotechnology is presented as part of an effort to protect the environment.

There is a danger one needs to recognize when trying to correct the effects of misinformation, particularly misinformation about complex real-world issues. People will refer more to misinformation that is in line with their attitudes and will be relatively immune to corrections. Retractions might even backfire and strengthen the initially held beliefs.

So much for the difficulties. Four common misinformation problems follow, along with associated solutions and good practices; a compact summary in code is sketched after the list.

Continued Influence Effect. Despite a retraction, people continue to believe the misinformation. The solution is to provide an alternative explanation that fills the gap left by retracting the misinformation without reiterating the misinformation. Then continue to strengthen the retraction through repetition (without reinforcing the myth).

Familiarity Backfire Effect. Repeating the myth increases familiarity reinforcing it. The solution is to avoid repetition of the myth by reinforcing the correct facts instead. When possible provide a pre-exposure warning that misleading information is coming.

Overkill Backfire Effect. Simple myths are more cognitively attractive than complicated refutations. The solution is to provide a simple, brief, rebuttal that uses fewer arguments in refuting the myth—less is more. Try to foster healthy skepticism. Skepticism about information source reduces the influence of misinformation.

Worldview Backfire Effect. Evidence that threatens worldview can strengthen initially held beliefs. The solution is to affirm worldview by framing evidence in a worldview-affirming manner by endorsing the values of the audience. Self-affirmation of personal values increases receptivity to evidence.
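To make these good practices easy to apply, here is a minimal sketch in Python that condenses the four problem/solution pairings above into a lookup table. The structure and the condensed wording are my own paraphrase, not an official taxonomy from the review article.

# Debunking guidelines condensed from the four effects above.
DEBUNKING_GUIDE = {
    "Continued Influence Effect":
        "Fill the gap with an alternative explanation; repeat the "
        "retraction without repeating the myth.",
    "Familiarity Backfire Effect":
        "Lead with the correct facts; when possible, warn beforehand "
        "that misleading information is coming.",
    "Overkill Backfire Effect":
        "Keep the rebuttal short and simple; foster healthy skepticism "
        "about the information source.",
    "Worldview Backfire Effect":
        "Frame the evidence in worldview-affirming terms and endorse "
        "the values of the audience.",
}

for effect, remedy in DEBUNKING_GUIDE.items():
    print(f"{effect}: {remedy}")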

It should be clear that correcting the effects of misinformation is not easy. Moreover, the effects are likely to be modest. Nevertheless, correcting misinformation is a serious problem that needs to be addressed. Clearly more research is needed.

We also need to be aware that our own worldviews influence System 1 processing and the failure to reject misinformation. Here I am reminded of something Mark Twain said. “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106-131.

2Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Cognitive Processing of Information

October 9, 2012

This is the third in a series of four posts on the topic of misinformation and its correction. All four posts draw heavily on a review article in Psychological Science in the Public Interest.1 The first post, “Misinformation,” introduced the problem of misinformation. The second post, “The Origins of Misinformation,” discussed the mechanisms of misinformation. The current post discusses how we process information, assess its truth, and correct misinformation we have received and, mistakenly, believed. This post draws on Nobel Prize-winning psychologist Daniel Kahneman‘s two-system view of human cognition.2 According to Kahneman, we have two systems for processing information. System 1, called intuition, is very fast and seems to work effortlessly. System 2, called reasoning, is slow and effortful. System 1 is the default system that we use when we are walking, driving, conversing, or engaging in any type of skilled performance. System 2 is what we might term conscious thinking. One of the reasons that System 1 is so fast is that it employs heuristics and biases in its processing. Although these are correct most of the time, occasionally they are wrong. System 2 is supposed to monitor System 1 processing and step in and do some thinking to ensure that nothing is wrong. (See the healthymemory blog post, “The Two System View of Cognition.”)

The default mode for System 1 processing is to treat information as true, unless the source is questionable at the outset, in which case System 2 raises an alert regarding the accuracy of the information. Otherwise our processing of information would be quite slow, and others would tend to lose patience with us. There is also a sense of familiarity or fluency regarding the information. Should it be unfamiliar, System 2 will likely pay more attention to the information, including its veracity.

System 2 does raise some questions. For example, is the information compatible with what I believe? If it isn’t, it is likely either to be disregarded or to be examined quite carefully. If it is a story, System 2 will judge whether it is coherent. If it is incoherent and does not fit together, it will be regarded with suspicion.

Repeated exposure to a statement is known to increase its acceptance as true. Repetition effects can create a perceived social consensus even when no consensus exists. This is important as one of the factors determining whether the information is believed is whether others believe the information. Social-consensus information is particularly influential when it pertains to one’s reference group. One possible consequence of this repetition is pluralistic ignorance, which is the divergence between the actual prevalence of a belief in a society and what people in the society think that others believe. The flip side of pluralistic ignorance is the false-consensus effect in which a minority of people incorrectly feel that they are in the majority. These effects can be quite strong. It has been found that people in Australia who have particularly negative attitudes toward Aboriginal Australians or asylum seekers overestimate public support for their attitudes by 67% and 80% respectively. Although only 1.8% of people in a sample of Australians were found to have strongly negative attitudes towards Aboriginals, those few individuals thought that 69% of all Australians (and 79% of their friends) shared their extreme beliefs.
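As a rough illustration of the truth cues just described (source credibility, familiarity or fluency, compatibility with prior beliefs, coherence, and perceived social consensus), here is a toy model in Python. It is entirely my own construction under those assumptions; the review article contains no such model.

from dataclasses import dataclass

@dataclass
class Claim:
    source_credible: bool
    familiar: bool              # fluency: does the claim feel familiar?
    fits_beliefs: bool          # compatibility with one's worldview
    coherent: bool              # does the story hang together?
    perceived_consensus: float  # fraction of one's group thought to agree

def system1_accepts(claim: Claim, consensus_threshold: float = 0.5) -> bool:
    # Default (System 1) mode: accept the information as true unless
    # some cue raises an alert that hands processing off to System 2.
    alerts = [
        not claim.source_credible,
        not claim.familiar,
        not claim.fits_beliefs,
        not claim.coherent,
        claim.perceived_consensus < consensus_threshold,
    ]
    return not any(alerts)

rumor = Claim(source_credible=False, familiar=True, fits_beliefs=True,
              coherent=True, perceived_consensus=0.8)
print(system1_accepts(rumor))  # prints: False (questionable source raises an alert)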

Unfortunately, research indicates that retractions rarely eliminate the influence of misinformation. This is true even when people believe, understand, and later remember the retraction, and even in laboratory research where the misinformation is retracted immediately and within the same narrative. Of course, the situation is even worse when misinformation is presented through media sources: the correction usually appears in a later edition, temporally disjointed from the original story.

Most misinformation is the result of fast System 1 processes, while the failure of retractions is due to faulty System 2 processes. We construct mental models of events. When a retraction disrupts a portion of this model, System 2 should recognize that the entire model no longer holds together. But it often fails to, and the false information that was retracted is still employed in the model. So System 2 is not doing the strategic monitoring it is supposed to do. Misinformation can have a fluency and familiarity about it, which results from System 1 processes and the failure of System 2 to correct the misinformation even when the correct information is available.

There is also the psychological phenomenon of reactance. People generally do not like to be told how to think or how to act, so they may reject especially authoritative retractions. This effect has been documented in courtroom settings where mock jurors are presented with a piece of evidence that is later ruled inadmissible. When the jurors are asked to disregard the tainted evidence, their conviction rates are higher when the inadmissibility ruling is accompanied by the judge’s extensive legal explanations than when it is left unexplained.

To this point the presentations have been pretty pessimistic. Misinformation is a large problem produced by many sources and processed by cognitive mechanisms that are vulnerable to misinformation but fairly indifferent to corrections. The next post, the final one in this series, will provide some partial solutions to this serious problem.

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106-131.

2Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Origins of Misinformation

October 8, 2012

The immediately preceding post introduced the problem of misinformation. This post discusses the origins of misinformation. The sources discussed here are rumors and fiction, government and politicians, vested interests, and the media. This post, like the preceding post and the next, draws heavily on a review article in Psychological Science in the Public Interest.1

Although the believability of information is one factor determining its propagation, there is a strong preference to pass on information that will invoke an emotional response in the recipient, regardless of its truth value. People also extract information from fictional sources in literature, the movies, and the theater arts. People rely on misinformation acquired from clearly fictitious stories to respond to quiz questions, even when the misinformation contradicts common knowledge and when people are aware that the source was fictional.2 These effects of fictional misinformation are difficult to correct. Prior warnings were ineffective in reducing the acquisition of misinformation from fiction; misinformation was reduced, but not eliminated, only when participants were instructed to actively monitor the content they were reading and to press a key whenever they encountered a piece of misinformation.3 Michael Crichton’s novel State of Fear misrepresented the science of global climate change, yet was introduced as “scientific” evidence into a U.S. Senate committee.

Before the invasion of Iraq in 2003, U.S. Government officials proclaimed that there was no doubt that Saddam Hussein had weapons of mass destruction (WMD). The Bush administration also cast Iraq as the front line in the “War on Terror” by juxtaposing it with the 9/11 terrorist attacks, and it implied that it had intelligence linking Iraq to al-Qaeda. All of this turned out to be misinformation, yet large segments of the American public continued to believe these claims. Moreover, 20% to 30% believed that WMDs had actually been discovered in Iraq after the invasion, and about half the public believed in the links between Iraq and al-Qaeda.

In the political arena, Sarah Palin made the claim that Obama’s health care plan had provisions for “death panels.” Within five weeks 86% of Americans had heard this claim, and half either believed the myth or were uncertain of its veracity. Although the public is aware of politically motivated misinformation, particularly during election campaigns, people are poor at identifying specific instances of it, being unable to distinguish between false and correct information.

There is also ample evidence of concerted efforts by vested interests to disseminate misinformation. This willful manufacture of mistaken beliefs has earned its own term, “agnogenesis.” In 2006 a U.S. Federal Court ruled that cigarette manufacturers were guilty of conspiring to deny, distort, and minimize the effects of cigarette smoking. In the early 1990s, the American Petroleum Institute, the Western Fuels Association, and The Advancement of Sound Science Coalition (TASSC) drafted and promoted campaigns to cast doubt on the science of climate change. These industry groups have formed alliances with conservative think tanks, using a handful of scientists as spokesmen. More than 90% of books published between 1972 and 2005 that expressed skepticism about environmental issues have been linked to conservative think tanks. This review is hardly exhaustive and supplies only a hint of the magnitude of this type of misinformation.

The media, defined roughly as print newspapers and magazines, radio, TV, and the internet, are also a source of misinformation. A variety of factors are at play here. Journalists with weak backgrounds in the subjects they cover can oversimplify them. There is also a strong motivation to sensationalize stories. And sometimes, in an effort to be fair and balanced, journalists can be misleading. For example, an overwhelming majority (more than 95%) of actively publishing climate scientists agree on the fundamental fact that the globe is warming and that this warming is due to greenhouse-gas emissions caused by humans. Yet the media, in an attempt to be even-handed, will give equal time to individuals, often without appropriate backgrounds, who hold a contrary view. Consequently, the public misses the relative weighting of opinion among knowledgeable scientists.

There are also differences among media outlets as to how much misinformation they disseminate. Research4 has shown that the level of belief in misinformation among segments of the public varies according to preferred news outlets. The continuum runs from Fox News (whose viewers are the most misinformed on most issues) to National Public Radio (whose listeners are the least misinformed overall).

This blog has argued that the internet is not the cause of misinformation, but merely the means of communicating it. A good argument can be made that this is not entirely true. There is a phenomenon known as selective exposure that can produce fractionation. Blogs and other social media tend to link to blogs and social media with similar viewpoints and to exclude opposing views. This can lead to “cyber-ghettos,” and it likely bears some responsibility for extremely divergent views in the political arena and an unwillingness to compromise or negotiate.

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106-131.

2Marsh, E.J., Meade, M.L., & Roediger, H.L. III. (2003). Learning Facts from Fiction. Journal of Memory and Language, 49, 519-536.

3Marsh, E.J., & Fazio, L.K. (2006). Learning Errors from Fiction: Difficulties in Reducing Reliance on Fictional Sources. Memory & Cognition, 34, 1140-1149.

4For example Kull, S., Ramsay, C., & Lewis, E. (2003). Misperceptions, the media, and the Iraq war. Political Science Quarterly, 118, 569-598.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Misinformation

October 7, 2012

The explosion of the internet has led many to fear misinformation on the internet. A recent review article1 on misinformation and its correction has motivated this post and the next three posts. As will be seen, misinformation is not a new problem; it has been with us a long time. The problem is more psychological than technological, and it is difficult, but not impossible, to correct.

Misinformation is widespread and persistent. Here are some prominent examples:

Barack Obama’s Birth Certificate. Despite his Hawaiian birth certificate, birth announcements in the local papers, and the fact that his pregnant mother went into a Honolulu hospital and left it cradling a baby, “birthers” claimed that he had been born outside the United States and was not eligible to be president. Undoubtedly Obama had been vetted by the Republican Party and the US government prior to the election. Still, a majority of Republican primary voters believed this myth. Birthers still exist even after Obama eventually, and unnecessarily, produced his birth certificate.

In 1998 in the United Kingdom, a study suggesting a link between a common childhood vaccine and autism generated considerable fear. The UK Department of Health, along with other health organizations, immediately pointed to the lack of evidence for such claims and urged parents not to reject the vaccine. The media also reported that none of the claims had been substantiated. Nevertheless, in 2002 between 20% and 25% of the public continued to believe in the vaccine-autism link, and 39% to 53% still believed there was equal evidence on both sides of the debate. What is even more disturbing is that a substantial number of health professionals continued to believe the misinformation (so don’t assume that your doctor is up to date on the medical literature—you might be more current than your doctor is). Eventually it came out that the first author of the study had failed to disclose substantial conflicts of interest. His co-authors distanced themselves from the study, the journal officially retracted the article, and the first author was found guilty of misconduct and lost his license to practice medicine. All of this misinformation grew from a single article; consider the effort required to correct it. There have been several similar incidents in the United States, and they will continue to occur.

The few lucky North Koreans who manage to escape North Korea and make it to South Korea attend sessions where they learn how to live in a free society. They are also provided some relevant history. In North Korea they were thoroughly indoctrinated in the belief that South Korea and the US imperialists started the war. In spite of their total disillusionment with North Korea, so deep that they risked their lives to escape the North Korean nightmare, they find the correction of this egregious misinformation difficult to accept.

It can be extremely difficult to correct misconceptions. Advertisements for Listerine mouthwash claimed for more than 50 years that the product helped prevent or reduce the severity of colds and sore throats. After a long legal battle, the U.S. Federal Trade Commission mandated corrective advertising that explicitly withdrew the deceptive claims. In spite of a $10 million campaign, the corrections were modest. Overall levels of acceptance of the false claim remained high: 42% of Listerine users continued to believe that the product was promoted as an effective cold remedy, and more than half (57%) reported that the product’s presumed medical effects were a key factor in their purchasing decision.

So misinformation is a serious problem that is neither new nor unique to the internet. This problem is psychological, not technological. The internet is merely a delivery system.

1Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13, 106-131.

© Douglas Griffith and healthymemory.wordpress.com, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.