Reason

April 17, 2018

Steven Pinker has a chapter called Reason in his outstanding book, “Enlightenment Now.” Part of the problem with reasoning is beliefs, as was expounded in a previous healthy memory blog post, “Beliefs: Necessary, but Dangerous.” The legal scholar Dan Kahan has argued that certain beliefs become symbols of cultural allegiance protected by identity-protective cognition. People affirm or deny these beliefs to express not what they know but who they are. Endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational, at least not by the criterion of the immediate effects on the believer. The effects on society and the planet are another matter. The atmosphere doesn’t care what people think about it, and if it in fact warms by 4 degrees Celsius, billions of people will suffer, no matter how many of them had been esteemed in their peer groups for holding a locally fashionable opinion on climate change along the way. Kahan concluded that we are all actors in a Tragedy of the Belief Commons: what’s rational for every individual to believe (based on esteem) can be irrational for the society as a whole to act upon (based on reality). Technology magnifies these differences, producing polarization in the political and social domains.

A fundamental problem is that accurate knowledge can be effortful and time-consuming to obtain. Predictions are very difficult, as some have noted, especially when they are about the future. Psychologist Philip Tetlock has studied the accuracy of forecasters. He recruited hundreds of analysts, columnists, academics, and interested laypeople to compete in forecasting tournaments in which they were presented with possible events and asked to assess their likelihood. This research was conducted over 20 years, during which 28,000 predictions were made. So, how well did the experts do? On average, about as well as a chimpanzee throwing darts. In other words, no better than chance.

Tetlock and fellow psychologist Barbara Mellers held another competition between 2011 and 2015 in which they recruited several thousand contestants to take part in a forecasting tournament held by the Intelligence Advanced Research Projects Activity (IARPA). Again the average performance was at chance levels, but in both tournaments the researchers could pick out “superforecasters,” who performed not just better than chimps and pundits, but better than professional intelligence officers with access to classified information, better than prediction markets, and not too far from the theoretical maximum. Even these accurate predictions hold up for only about a year; accuracy declines the further out the forecast, falling to the level of chance around five years out.

The forecasters who did the worst were also the most confident; they were the ones with Big Ideas, whether left- or right-wing, optimistic or pessimistic. Here is the summary by Tetlock and Gardner:

“As ideologically diverse as they were, they were united by the fact that their thinking was so ideological. They sought to squeeze complex problems into the preferred cause-effect templates and treated what did not fit as irrelevant distractions. Allergic to wishy-washy answers, they kept pushing their analyses to the limit (and then some), using terms like ‘furthermore’ and ‘moreover’ while piling up reasons why they were right and others wrong. As a result they were unusually confident and likelier to declare things as ‘impossible’ or ‘certain.’ Committed to their conclusions, they were reluctant to change their minds even when their predictions clearly failed.”

Tetlock described the superforecasters as follows:

“pragmatic experts who drew on many analytical tools, with the choice of tool hinging on the particular problem they faced. These experts gathered as much information from as many sources as they could. When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as ‘however,’ ‘but,’ ‘although,’ and ‘on the other hand.’ They talked about possibilities and probabilities, not certainties. And while no one likes to say, ‘I was wrong,’ these experts more readily admitted it and changed their minds.”

The superforecasters displayed what psychologist Jonathan Baron calls “active open-mindedness,” responding to opinions such as these:

People should take into consideration evidence that goes against their beliefs. [Agree]
It is more useful to pay attention to those who disagree with you than to pay attention to those who agree. [Agree]
Changing your mind is a sign of weakness. [Disagree]
Intuition is the best guide in making decisions. [Disagree]
It is important to persevere in your beliefs even when evidence is brought to bear against them. [Disagree]

The manner of the superforecasters’ reasoning is Bayesian. They tacitly use the rule from the Reverend Thomas Bayes on how to update one’s degree of credence in a proposition in light of evidence. It should be noted that Nate Silver (fivethirtyeight.com) is also a Bayesian.
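To make that update rule concrete, here is a minimal sketch in Python. It is not from Pinker or Tetlock; the forecasting question and all of the probabilities are invented for illustration. Bayes’ rule says the credence in a hypothesis H after seeing evidence E is P(H|E) = P(E|H) × P(H) / P(E):

def update(prior, p_e_given_h, p_e_given_not_h):
    # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical forecast: "the incumbent wins re-election."
prior = 0.60                           # initial degree of credence
posterior = update(prior, 0.80, 0.30)  # a favorable poll: likely if H is true (0.80), less so if false (0.30)
print(round(posterior, 2))             # 0.8

A strong but imperfect signal moves the credence from 0.60 to 0.80. The superforecasters’ habit is to make many such small, explicit adjustments rather than leaping to “certain” or “impossible.”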

Steven Pinker notes that psychologists have recently devised debiasing programs that fortify logical and critical thinking skills. They encourage students to spot, name, and correct fallacies across a wide range of contexts. Some use computer games that provide students with practice, and with feedback that allows them to see the absurd consequences of their errors. Other curricula translate abstruse mathematical statements into concrete, imaginable scenarios. Tetlock has compiled the practices of successful forecasters into a set of guidelines for good judgment (for example, start with the base rate; seek out evidence and don’t overreact or underreact to it; don’t try to explain away your own errors but instead use them as a source of calibration). These and other programs are provably effective: students’ newfound wisdom outlasts the training session and transfers to new subjects.
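The first of those guidelines, starting with the base rate, is just the prior in the Bayesian update sketched above. Another illustration with invented numbers shows why it matters:

def posterior(base_rate, hit_rate, false_alarm_rate):
    # Probability the event occurs given a warning sign (Bayes' rule)
    p_sign = hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
    return hit_rate * base_rate / p_sign

# Hypothetical question: "will the regime collapse this year?"
# Collapses are rare (base rate 0.05), yet a vivid warning sign appears.
print(round(posterior(0.05, 0.90, 0.20), 2))  # 0.19
# Judging from the vivid sign alone (0.90) while skipping the base rate
# would be exactly the overreaction the guideline warns against.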

Dr. Pinker concludes, “Despite these successes, and despite the fact that the ability to engage in unbiased, critical reasoning is a prerequisite to thinking about anything else, few educational institutions have set themselves the goal of enhancing rationality. (This includes my own university, where my suggestion during a curriculum review that all students should learn about cognitive biases fell deadborn from my lips.) Many psychologists have called on their field to ‘give debiasing away’ as one of its greatest potential contributions to human welfare.”

It seems appropriate to end this post on reason with the Spinoza quote from the beginning of the book:

“Those who are governed by reason desire nothing for themselves which they do not also desire for the rest of humankind.”