“The Enigma of Reason,” by Hugo Mercier and Dan Sperber, is the first of three books reviewed in Elizabeth Kolbert’s article “That’s What You Think: Why reason and evidence won’t change our minds,” from the 27 February 2017 issue of “The New Yorker.”
Ms. Kolbert notes that since research in the nineteen-seventies revealed that we humans can’t think straight and that reasonable-seeming people are often totally irrational, the question has remained: How did we come to be this way? “The Enigma of Reason” is the first book discussed that attempts to address this question. Mercier and Sperber argue that our biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and also difficult to sustain. They argue that reason developed not to enable us to solve abstract, logical problems, or even to help us draw conclusions from unfamiliar data; instead, it developed to resolve the problems posed by living in cooperative groups.
Mercier and Sperber write, “Reason is an adaptation to the hypersocial niche humans have evolved for themselves.” Habits of mind that seem weird or goofy or just plain dumb from an intellectual point of view prove shrewd when seen from an “interactionist” perspective.
They use confirmation bias to further their argument. This is the tendency we have to embrace information that supports our existing beliefs and to reject information that contradicts them. “Confirmation bias” is the subject of entire textbooks’ worth of experiments. One of the most famous was conducted at Stanford. Researchers rounded up a group of students who held opposing opinions on capital punishment. Half of these students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
These students were asked to respond to two studies, which, unknown to them, had been made up. One study was pro-deterrence and the other anti-deterrence, and the two presented what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who had originally opposed capital punishment did the reverse. At the end of the experiment, the students were again asked about their views. Rather than moderating, they now favored their original positions even more strongly than before.
To further their point, Mercier and Sperber ask what would happen to a mouse that thinks as we do. If such a mouse were bent on confirming its belief that no cats were around, it would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats, it is a trait that should have been selected against. The fact that both we and the bias have survived, Mercier and Sperber argue, proves that it must have some adaptive function, and they maintain that that function is related to our “hypersociability.”
Mercier and Sperber prefer the term “myside bias” to “confirmation bias.” They point out that we humans are not randomly credulous: presented with someone else’s argument, we are quite adept at spotting the weaknesses. In an experiment illustrating this point, conducted by Mercier and some European colleagues, participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses and given a chance to modify them. Only fifteen percent changed their minds in step two.
In step three, participants were shown the same problems, along with their own answer and the answer of another participant who had come to a different conclusion. However, the responses presented to them as someone else’s were actually their own, and vice versa. Only about half the participants realized what was going on. Among the remaining half, people suddenly became much more critical: almost sixty percent rejected the responses they had earlier been satisfied with.