
Motivated Reasoning, Cognitive Dualism, and Scientific Curiosity

December 4, 2016

This post is based on a Feature Article by Dan Jones titled “Seeing reason: How to change minds in a ‘post-fact’ world,” in the December 3, 2016 issue of the New Scientist. The article notes that politicians spin and politicians lie, that this has always been the case, and that to an extent it is a natural product of a free democratic culture. But Jones goes on to note, “Even so we do appear to have entered a new era of ‘post-truth politics’, where the strongest currency is what satirist Stephen Colbert has dubbed ‘truthiness’: claims that feel right, even if they have no basis in fact, and which people want to believe because they fit their pre-existing attitudes.”

However, facts are important, as Brendan Nyhan of Dartmouth College notes: “We need to have discussions that are based on a common set of accepted facts, and when we don’t, it’s hard to have a useful democratic debate.” As Jones writes, “In the real world of flesh-and-blood humans, reasoning often starts with established conclusions and works back to find ‘facts’ that support what we already believe. And if we’re presented with facts that contradict our beliefs, we find clever ways to dismiss them.” Psychologists call this lawyerly tendency motivated reasoning.

A Pew Research Center survey released before the US election showed that compared with Democrats, Republicans are less likely to believe that scientists know that climate change is occurring, that they understand its causes, or that they fully and accurately report their findings. Republicans are also more likely to believe that scientists’ research is driven by careerism and political views. Many liberals think this is a product of scientific illiteracy, which, if addressed, would bring everyone around to the same position. Unfortunately, research by Dan Kahan at Yale University has shown that, in contrast to liberals, among conservatives it is the most scientifically literate who are least likely to accept climate change. Kahan says, “Polarisation over climate change isn’t due to a lack of capacity to understand the issues. Those who are most proficient at making sense of scientific information are the most polarized.”

Kahan attributes this apparent paradox to motivated reasoning: the better one is at handling scientific information, the better one is at confirming one’s own biases and writing off inconvenient truths. For climate-change deniers, studies suggest the motivation is often the endorsement of free-market ideology, which includes objections to the government regulation of business that would be required to address climate change. Psychologist Stephan Lewandowsky of the University of Bristol says, “If I ask people four questions about the free market, I can predict attitudes towards climate science with 60% accuracy.”

Jones writes, “But liberal smugness has no place here.  Consider gun control.  Liberals tend to want tighter gun laws, because, they argue, fewer guns would translate into fewer gun crimes.  Conservatives typically respond that with fewer guns in hand, criminals can attack the innocent with impunity.”

In spite of the best efforts of criminologists, the evidence on this issue is mixed. Kahan has found that both liberals and conservatives react to statistical information about the effects of gun control in the same way: they accept what fits in with the broad beliefs of their political group, and discount what doesn’t. Kahan writes, “The more numerate you are, the more distorted your perception of the data.” Motivated reasoning is found on many other contentious issues, from the death penalty and drug legalization to fracking and immigration.

The UK’s Brexit vote provides another compelling case study of the distorting power of motivated reasoning. Researchers at the Online Privacy Foundation found that although both Remainers and Brexiteers could accurately interpret statistical information when it came to assessing whether a new skin cream caused a rash, their numeracy skills abandoned them when looking at statistics that undermined the rationales for their views, such as figures on whether immigration is linked to an increase or a decrease in crime.

It is not just a matter of political ideology.  Although the bogus link between autism and the vaccine for measles, mumps, and rubella is often portrayed as a liberal obsession, it cuts across politics.  Nyhan says, “There’s no demographic factor that predicts who is most vulnerable to anti-vaccine claims.”

It should not be concluded that myth-busting is a waste of time. Nyhan and Reifler found that during the 2014 midterm elections in the US, fact-checking improved the accuracy of people’s beliefs even when it went against ingrained biases. Both Democrats and Republicans updated their beliefs after having a claim debunked.

Emily Thorson of George Washington University found that misconceptions about issues like how much of the US debt China owns, whether there’s a federal time limit for receiving welfare benefits, and who pays for Social Security could be fixed by a single corrective statement.

Unfortunately, myth-busting loses its power on salient and controversial issues. Nyhan says, “It’s most effective for topics that we’re least concerned about as a democracy. Even the release of President Obama’s birth certificate had only a limited effect on people’s belief that he wasn’t born in this country.” Thorson has found that even when corrections work, for example, getting people to accept that a congressman accused of taking campaign money from criminals did no such thing, the taint of the earlier claim often sticks to the innocent target. This phenomenon is termed “belief echoes.”

Graphical presentation of information can be more effective than verbal presentation, but this benefit requires that people be able to read graphs. Because many people have difficulty understanding graphs, simple graphs have a higher likelihood of success.

Kahan calls the ability to hold two seemingly contradictory beliefs at the same time “cognitive dualism.” Cognitive dualism was found in a recent Pew survey on climate change: just 15% of conservative Republicans agreed that human activity was causing climate change, but 27% agreed that if we change our ways to limit carbon emissions, it would make a big difference in tackling climate change. This same dualism was found among US farmers. A 2013 survey found that only a minority accepted climate change as a fact, yet a majority believed that some farmers would be driven out of business by climate change, and that the rest would have to change current practices and buy more insurance against climate-induced crop failures. By buying crops genetically engineered to cope with climate change and purchasing specialist insurance policies, many of them already have.

Kahan has discovered something interesting about people who seek out and consume scientific information for personal pleasure. He calls this trait scientific curiosity, and he has devised a scale for measuring it. He and his colleagues have found that, unlike scientific literacy, scientific curiosity is linked to greater acceptance of human-caused climate change, regardless of political orientation. On many issues, from attitudes towards porn and the legalization of marijuana to immigration and fracking, scientific curiosity makes both liberals and conservatives converge on views closer to the facts.

So exploiting cognitive dualism and fostering scientific curiosity appear to be the most promising avenues to pursue. It is important to remember that it is scientific curiosity, rather than scientific literacy, that matters here.

© Douglas Griffith, 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith with appropriate and specific direction to the original content.