Posts Tagged ‘Dan Kahan’

Reason

April 17, 2018

Steven Pinker has a chapter called “Reason” in his outstanding book, “Enlightenment Now.” Part of the problem with reasoning is beliefs, as was expounded in a previous healthy memory blog post, “Beliefs: Necessary, but Dangerous.” The legal scholar Dan Kahan has argued that certain beliefs become symbols of cultural allegiance protected by identity-protective cognition. People affirm or deny these beliefs to express not what they know but who they are. Endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational, at least not by the criterion of the immediate effects on the believer. The effects on society and the planet are another matter. The atmosphere doesn’t care what people think about it, and if it in fact warms by 4 degrees Celsius, billions of people will suffer, no matter how many of them had been esteemed in their peer groups for holding a locally fashionable opinion on climate change along the way. Kahan concluded that we are all actors in a Tragedy of the Belief Commons: what’s rational for every individual to believe (based on esteem) can be irrational for the society as a whole to act upon (based on reality). Technology magnifies these differences, resulting in polarization in the political and social domains.

A fundamental problem is that accurate knowledge can be effortful and time consuming to obtain. Predictions are very difficult, as some have noted, especially when they are about the future. Psychologist Philip Tetlock has studied the accuracy of forecasters. He recruited hundreds of analysts, columnists, academics, and interested laypeople to compete in forecasting tournaments in which they were presented with possible events and asked to assess their likelihood. This research was conducted over 20 years, during which 28,000 predictions were made. So, how well did the experts do? On average, about as well as a chimpanzee throwing darts. In other words, no better than chance.

Tetlock and fellow psychologist Barbara Mellers held another competition between 2011 and 2015 in which they recruited several thousand contestants to take part in a forecasting tournament held by the Intelligence Advanced Research Projects Activity (IARPA). Again, the average performance was at chance levels, but in both tournaments the researchers could pick out “superforecasters” who performed not just better than chimps and pundits, but better than professional intelligence officers with access to classified information, better than prediction markets, and not too far from the theoretical maximum. The accurate predictions lasted for about a year. Accuracy declines further into the future and falls to the level of chance around five years out.
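The post does not go into how forecasting accuracy is graded. For context, tournaments of this kind score forecasters with Brier scores, the mean squared difference between the probabilities a forecaster assigned and what actually happened. Here is a minimal sketch in Python; the numbers are invented for illustration:

# Minimal sketch of a Brier score for binary events: the mean squared error
# between forecast probabilities and observed outcomes (lower is better).
def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if the event happened, else 0."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A well-calibrated, appropriately confident forecaster scores close to 0.0;
# always answering "50-50" earns 0.25, i.e., performance at the level of chance.
print(round(brier_score([0.9, 0.8, 0.3], [1, 1, 0]), 3))  # 0.047
print(round(brier_score([0.5, 0.5, 0.5], [1, 1, 0]), 3))  # 0.25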

The forecasters who did the worst were also the most confident; they were the ones with Big Ideas, be they left- or right-wing, optimistic or pessimistic. Here is the summary by Tetlock and Gardner:

“As ideologically diverse as they were, they were united by the fact that their thinking was so ideological. They sought to squeeze complex problems into the preferred cause-effect templates and treated what did not fit as irrelevant distractions. Allergic to wishy-washy answers, they kept pushing their analyses to the limit (and then some), using terms like “furthermore” and “moreover” while piling up reasons why they were right and others wrong. As a result they were unusually confident and likelier to declare things as “impossible” or “certain.” Committed to their conclusions, they were reluctant to change their minds even when their predictions clearly failed.”

Tetlock described the superforecasters as follows:

“pragmatic experts who drew on many analytical tools, with the choice of tool hinging on the particular problem they faced. These experts gathered as much information from as many sources as they could. When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as “however,” “but,” “although,” and “on the other hand.” They talked about possibilities and probabilities, not certainties. And while no one likes to say, “I was wrong,” these experts more readily admitted it and changed their minds.”

The superforecasters displayed what psychologist Jonathan Baron calls “active open-mindedness” with opinions such as these:

People should take into consideration evidence that goes against their beliefs. [Agree]
It is more useful to pay attention to those who disagree with you than to pay attention to those who agree. [Agree]
Changing your mind is a sign of weakness. [Disagree]
Intuition is the best guide in making decisions. [Disagree]
It is important to persevere in your beliefs even when evidence is brought to bear against them. [Disagree]

The manner of the superforecasters’ reasoning is Bayesian. They tacitly use the rule from the Reverend Thomas Bayes on how to update one’s degree of credence in a proposition in light of evidence. It should be noted that Nate Silver (fivethirtyeight.com) is also a Bayesian.
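As an illustration of the rule (this example is not from Pinker’s book, and the numbers are invented), here is a minimal sketch of a single Bayesian update in Python:

# Minimal sketch of Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E).
# Start from the base rate for hypothesis H, then revise it after seeing evidence E.
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior probability of H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

prior = 0.10                                 # hypothetical base rate for H
posterior = bayes_update(prior, 0.70, 0.20)  # E is 3.5x as likely if H is true
print(round(posterior, 3))                   # 0.28 -- credence rises, but far from certainty

Repeating this update as new evidence arrives is, in effect, the discipline behind Tetlock’s guideline below: start with the base rate, then neither overreact nor underreact to the evidence.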

Steven Pinker notes that psychologists have recently devised debiasing programs that fortify logical and critical thinking curricula. They encourage students to spot, name, and correct fallacies across a wide range of contexts. Some use computer games that provide students with practice, and with feedback that allows them to see the absurd consequences of their errors. Other curricula translate abstruse mathematical statements into concrete, imaginable scenarios. Tetlock has compiled the practices of successful forecasters into a set of guidelines for good judgment (for example, start with the base rate; seek out evidence and don’t overreact or underreact to it; don’t try to explain away your own errors but instead use them as a source of calibration). These and other programs are provably effective: students’ newfound wisdom outlasts the training session and transfers to new subjects.

Dr. Pinker concludes, “Despite these successes, and despite the fact that the ability to engage in unbiased, critical reasoning is a prerequisite to thinking about anything else, few educational institutions have set themselves the goal of enhancing rationality. (This includes my own university, where my suggestion during a curriculum review that all students should learn about cognitive biases fell deadborn from my lips.) Many psychologists have called on their field to ‘give debiasing away’ as one of its greatest potential contributions to human welfare.”

It seems appropriate to end this post on reason with the Spinoza quote from the beginning of the book:

“Those who are governed by reason desire nothing for themselves which they do not also desire for the rest of humankind.”


Thinking About Science

July 9, 2017

This is the eighth post in the series on The Knowledge Illusion: Why We Never Think Alone, written by Steven Sloman and Philip Fernbach. “Thinking About Science” is a chapter in this book.

Were it not for science and, more importantly, scientific thinking, we would still be living in the Dark Ages. Our wealth and health are due to scientific thinking. Yet knowledge about science and belief in scientific facts are lacking. HM admires the Amish: although they reject science, they live the humble lives dictated by their beliefs. Too many others enjoy the fruits of science yet reject scientific methods and findings. Their lack of respect for science exposes us to the continued risks of global warming and puts unvaccinated children at risk, to name just two problems.

In 1985, Walter Bodmer, a German-born geneticist and professor at Oxford University in the UK, was appointed by the Royal Society of London to lead a team evaluating the state of attitudes toward science and technology in Britain. The Royal Society was concerned about antiscientific sentiment in Britain, seeing it as a serious risk to societal well-being. The results and recommendations of the study were published in a seminal paper known as the Bodmer Report.

Previous research had focused primarily on measuring attitudes directly, but Bodmer and his team argued for a simple and intuitive idea: opposition to science and technology is driven by a lack of understanding. By promoting a better understanding of science, society can foster more favorable attitudes and take better advantage of the benefits afforded by science and technology. This idea about science attitudes is called the deficit model. According to this model, antiscientific thinking is due to a knowledge deficit. Once this deficit is filled, antiscientific attitudes will be mitigated or will disappear.

The paucity of scientific knowledge and the abundance of antiscientific beliefs have been documented in every society that has been studied. There is only a weak relationship between scientific knowledge and attitudes about science, and attempts to remedy the deficit have failed. In spite of the millions and millions of dollars spent on research, curriculum design, outreach, and communication, little to no headway has been made.

HM thinks that science is so vast and continually expanding that the deficit is simply too large to fill. Although scientists are knowledgeable in their specialties, that knowledge falls off as they move away from those specialties.

But there is another explanation: scientific attitudes are not based on the rational evaluation of evidence, so providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors.

These two explanations are not mutually exclusive. They are likely both operative.

One of the leading voices promoting this new perspective is Dan Kahan, a Yale law professor. He argues that our attitudes are not based on a rational, detached evaluation of evidence, because our beliefs are not isolated pieces of data that we can take up or discard at will. Instead, these beliefs are intertwined with other beliefs, shared cultural values, and our identities. To discard a belief means discarding a whole host of other beliefs, forsaking our communities, going against those we trust and love, and virtually challenging our identities.

Drs. Sloman and Fernbach flesh out this theory with the story of Mike McHargue, now a podcaster and blogger who goes by the moniker Science Mike. Mike once attended a fundamentalist church and held fundamentalist beliefs. When he reached his thirties, he began reading the scientific literature and his faith in these beliefs began to waver. His initial reaction was to lose his faith completely, but for a long time he kept his new beliefs from his community. Eventually a personal experience helped him rediscover his faith, and he is now, once again, a practicing Christian, but he continues to reject his fundamentalist church’s antiscientific beliefs.

Here is Science Mike’s response to a caller who has begun to question many of his beliefs:

Do I have advice on how to live when you’re at odds with your community? Absolutely. Do not live at odds with your community… You are a time bomb right now. Because at some point you won’t be able to pretend anymore, and you will speak honestly, and there will be a measure of collateral damage and fallout in your church. It’s time to move on. It’s time to find a faith community that believes as you believe…When that happens, you’re going to lose relationships. Some people cannot agree to disagree and those relationships can become abusive…There is a lot of pain because there are some people who are dear to me that I can’t talk to anymore…It is not possible for us to have the relationship we once had, and it’s rough. I’m not gonna lie. It’s rough.

This poignant response provides useful and important advice.

HM accepts the fundamental thesis of Drs. Sloman and Fernbach, that our knowledge is inadequate. Scientific evidence can be wrong, but at any given time, the scientific evidence available is the best information to use. We ignore it at our peril.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.