Posts Tagged ‘Elizabeth Kolbert’

Denying the Grave: Why We Ignore the Facts that Will Save Us

March 13, 2017

“Denying the Grave: Why We Ignore the Facts that Will Save Us” is the third of three books to be reviewed from an article titled, “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

The authors of this book are a psychiatrist, Jack Gorman, and his daughter, Sara Gorman, a public health specialist.  They probe the gap between what science tells us and what we tell ourselves.  Their concern is with those persistent beliefs which are not just demonstrably false, but also potentially deadly, like the conviction that vaccines are hazardous.

The Gormans argue that ways of thinking that now seem self-destructive must at some point have been adaptive. They dedicate many pages to the confirmation bias, which they claim has a physiological component. The research they cite suggests that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. They observe, “It feels good to ‘stick to our guns’ even if we are wrong.”

The Gormans do not just want to catalogue the ways we go wrong; they want to correct them. But providing people with accurate information does not seem to help; people simply discount it. They write that “the challenge that remains is to figure out how to address the tendencies that lead to false scientific belief.”

The Enigma of Reason

March 11, 2017

“The Enigma of Reason” by Hugo Mercier and Dan Sperber is the first of three books to be reviewed from an article titled, “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

Ms. Kolbert notes that since research in the nineteen-seventies revealed that we humans can’t think straight and that reasonable-seeming people are often totally irrational, the question remains: How did we come to be this way? “The Enigma of Reason” is the first book to be discussed that attempts to address this question. The argument of Mercier and Sperber is that our biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and also difficult to sustain. They argue that reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; instead it developed to resolve the problems posed by living in cooperative groups.

Mercier and Sperber write, “Reason is an adaptation to the hypersocial niche humans have evolved for themselves.” Habits of mind that seem weird or goofy or just plain dumb from an intellectual point of view prove shrewd when seen from an “interactionist” perspective.

They use confirmation bias to further their argument. This is the tendency we have to embrace information that supports our existing beliefs and to reject information that contradicts them. “Confirmation bias” is the subject of entire textbooks’ worth of experiments. One of the most famous was conducted at Stanford. Researchers rounded up a group of students who had opposing opinions on capital punishment. Half of these students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

These students were asked to respond to two studies, which the students did not know had been made up. One study presented data supporting the deterrent effect of capital punishment, and the other presented data against it; objectively speaking, the two sets of statistics were equally compelling. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing. The students who’d originally opposed capital punishment did the reverse. At the end of the study, the students were again asked about their views. This time, the students held their original views even more strongly than they had at the start.

To further their point, Mercier and Sperber consider what would happen to a mouse that thinks as we do. If such a mouse were bent on confirming its belief that no cats were around, it would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats, it’s a trait that should have been selected against. The fact that we have survived, Mercier and Sperber argue, proves that the bias must have some adaptive function, and they maintain that that function is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias” to “confirmation bias.” They point out that we humans are not randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. In an experiment illustrating this point, conducted by Mercier and some European colleagues, participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses and were given a chance to modify them. Only 15% changed their minds in step two.

In step three, participants were shown the same problems, along with their own answer and the answer of another participant who had come to a different conclusion. However, the responses presented to them as someone else’s were actually their own, and vice versa. Only about half the participants realized what was going on. Among the remaining half, people suddenly became much more critical: almost 60% rejected the responses they’d earlier been satisfied with.