Posts Tagged ‘Confirmation Bias’

Denying the Grave: Why We Ignore the Facts that Will Save Us

March 13, 2017

“Denying the Grave: Why We Ignore the Facts that Will Save Us” is the third of three books to be reviewed from an article titled, “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

The authors of this book are a psychiatrist, Jack Gorman, and his daughter, Sara Gorman, a public health specialist.  They probe the gap between what science tells us and what we tell ourselves.  Their concern is with those persistent beliefs which are not just demonstrably false, but also potentially deadly, like the conviction that vaccines are hazardous.

The Gormans argue that ways of thinking that now seem self-destructive must at some point have been adaptive. They dedicate many pages to the confirmation bias, which they claim has a physiological component. They cite research suggesting that people experience genuine pleasure, a rush of dopamine, when processing information that supports their beliefs. They observe, “It feels good to ‘stick to our guns’ even if we are wrong.”

The Gormans do not just want to catalogue the ways we go wrong; they want to correct them. Unfortunately, providing people with accurate information does not seem to help; people simply discount it. The Gormans write that “the challenge that remains is to figure out how to address the tendencies that lead to false scientific belief.”

The Enigma of Reason

March 11, 2017

“The Enigma of Reason”  by Hugo Mercier and Dan Sperber is the first of three books to be reviewed from an article titled, “That’s What You Think:  Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

Ms. Kolbert notes that since research in the nineteen-seventies revealed that we humans can’t think straight and that reasonable-seeming people are often totally irrational, the question remains: How did we come to be this way? “The Enigma of Reason” is the first book discussed that attempts to address this question. Mercier and Sperber argue that our biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and also difficult to sustain. They argue that reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; instead it developed to resolve the problems posed by living in cooperative groups.

Mercier and Sperber write, “Reason is an adaptation to the hypersocial niche humans have evolved for themselves.” Habits of mind that seem weird or goofy or just plain dumb from an intellectual point of view prove shrewd when seen from an “interactionist” perspective.

They use confirmation bias to further their argument. This is the tendency we have to embrace information that supports our existing beliefs and to reject information that contradicts them. “Confirmation bias” is the subject of entire textbooks’ worth of experiments. One of the most famous was conducted at Stanford. Researchers rounded up a group of students who had opposing opinions on capital punishment. Half of these students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

These students were asked to respond to two studies, which the students did not know had been made up. One study presented data supporting deterrence and the other presented data against it, and both offered what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing. The students who’d originally opposed capital punishment did the reverse. At the end of the study, the students were again asked about their views. The only difference was that this time they were even more committed to their original positions than they had been at the start.

To further their point, Mercier and Sperber suggest what would happen to a mouse that thinks as we do. If such a mouse were bent on confirming its belief that no cats were around, it would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats, it is a trait that should have been selected against. The fact that both we and the bias have survived, Mercier and Sperber argue, proves that it must serve some adaptive function, and they maintain that this function is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias” to “confirmation bias.” They point out that we humans are not randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. In an experiment illustrating this point, conducted by Mercier and some European colleagues, participants were asked to answer a series of simple reasoning problems. Then they were asked to explain their responses and were given a chance to modify them. Only fifteen percent changed their minds in step two.

In step three, participants were shown the same problems, along with their own answer and the answer of another participant who had come to a different conclusion. However, the responses presented to them as someone else’s were actually their own, and vice versa. Only about half the participants realized what was going on. Among the remaining half, people suddenly became much more critical. Almost 60% rejected the responses they’d earlier been satisfied with.

Why Facts Don’t Matter

August 15, 2016

The title of this post is identical to the title of a column written by David Ignatius in the 5 August edition of the Washington Post. Ignatius began his column by asking, “How did Donald Trump win the Republican nomination despite clear evidence that he had misrepresented or falsified key issues throughout his campaign?” Also read or reread the healthy memory blog posts “Donald Trump is Bending Reality to Get Into the American Psyche” and “Trick or Tweet or Both? How Social Media is Messing Up Politics.” Trump makes outrageous statements, contradicts himself, betrays a woeful ignorance about government and international relations, and claims he is going to fix problems without providing any plans for how he is going to fix them. Nevertheless, people say that they are going to vote for him. When pressed, they say that they are unhappy with current politics and that the country is going in the wrong direction. To this HM asks: if the bridge is crowded and slow moving, does that mean you are going to jump off the bridge, even though you don’t know whether you’ll survive the jump or whether you’ll be eaten by the crocodiles in the water?

There have been prior posts about the confirmation bias and the backfire effect. The confirmation bias refers to our tendency to believe statements or facts that are in consonance with our existing beliefs. The backfire effect refers to the phenomenon in which efforts to correct misinformation actually strengthen belief in the misinformation. Ignatius is referencing an article by Christopher Graves in the February 2015 issue of the Harvard Business Review. Research by Brendan Nyhan and Jason Reifler showed the persistence of the belief that Iraq had weapons of mass destruction in 2005 and 2006, after the United States had publicly admitted that they didn’t exist. They concluded, “The results show that direct factual contradictions can actually strengthen ideologically founded factual belief.”

Graves also examined how attempts to debunk myths can reinforce them, simply by repeating the untruth. The study he cites, published in the Journal of Consumer Research, is titled “How Warnings About False Claims Become Recommendations.” It seems that people remember the assertion and forget whether it’s a lie. The authors wrote, “The more often older adults were told that a given claim was false, the more likely they were to accept it as true after several days have passed.”

Graves noted that when critics challenge false assertions, say, Trump’s claim that thousands of Muslims cheered in New Jersey when the twin towers fell, their refutations can threaten people rather than convince them. And when people feel threatened, they circle their wagons and defend their beliefs. Ego involvement generates great mental effort to defend erroneous beliefs. Not only does the Big Lie work; small lies work as well.

Social scientists understand why the buttons that Trump’s campaign pushes are so effective. “When the GOP nominee paints a dark picture of a violent, frightening America, he triggers the ‘fight or flight’ response that is hard-wired in our brains. For the body politic, it can produce a kind of panic attack.”

So attempts to correct misinformation can backfire, strengthening the very beliefs they target. What, then, can be done? Some possible approaches will be found in the next HM post.

© Douglas Griffith and healthymemory.wordpress.com, 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Mindware

January 17, 2010

Mindware is a term Stanovich uses to refer to specific skills or knowledge that have been acquired through learning.1 You can think of it as software or an application program that the mind has acquired through learning. Scientific reasoning is one kind of mindware. Stanovich writes of “mindware gaps,” which can refer either to a lack of knowledge or to knowledge that is not used. One mindware gap is a failure to consider alternative hypotheses. This failure is quite evident in police work. Consider the case involving the missing Chandra Levy and Congressman Gary Condit. The police homed in on Gary Condit as the chief suspect in the disappearance and, later, the death of Chandra Levy. The alternative hypothesis that someone else did it was not actively considered. During the investigation other women were attacked in the same park by the individual who was eventually convicted of Chandra Levy’s death. I hope this mindware gap is due to cognitive miserliness rather than to a true knowledge deficit. Considering alternative hypotheses needs to be central to investigative technique.

Consider the following data regarding medical treatments:

200 people were given the treatment and improved

75 people were given the treatment and did not improve

50 people were not given the treatment and improved

15 people were not given the treatment and did not improve

Do you think the treatment was effective?

Many people think that the treatment was effective, since 200 people given the treatment improved whereas only 75 people given the treatment did not improve. However, judging the effectiveness of the treatment requires comparing the treatment group against the control group of people who were not given the treatment. Of the 65 people in the control group, 77% improved. Of the 275 people in the treatment group, 73% improved. As you can see, there was improvement in both the treatment and the control groups, and the control group actually improved at a slightly higher rate. There is no support for the treatment being effective.
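For readers who want to check the arithmetic themselves, here is a minimal sketch in Python of the comparison described above. The counts are the ones given in the example; the variable names are simply illustrative.

# Counts from the example above (a 2x2 treatment-by-outcome table).
treated_improved = 200
treated_not_improved = 75
untreated_improved = 50
untreated_not_improved = 15

# Improvement rate within each group.
treatment_rate = treated_improved / (treated_improved + treated_not_improved)      # 200/275, about 0.73
control_rate = untreated_improved / (untreated_improved + untreated_not_improved)  # 50/65, about 0.77

print(f"Treatment group improved: {treatment_rate:.0%}")   # 73%
print(f"Control group improved:   {control_rate:.0%}")     # 77%
print("Treatment better than control?", treatment_rate > control_rate)  # False

Looking only at the 200 versus 75 within the treatment group is what misleads people; the rates show the untreated group did at least as well.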

1Stanovich, K. E. (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven: Yale University Press.

© Douglas Griffith and healthymemory.wordpress.com, 2009. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.