
The Two Causal Reasoners Inside

July 5, 2017

This is the fourth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The title of this post is identical to the title of a section in that book. Drs. Sloman and Fernbach state that we are engaged in some type of causal reasoning almost all the time, but that not all causal reasoning is the same. Some of it is fast: quick and automatic, as when a man concludes that his hand hurts because he bashed it against the wall. Other causal reasoning is slow and deliberate, as when we try to recall the causes of World War I.

This two-process distinction goes beyond causal reasoning and can be applied to all cognitive processing. Daniel Kahneman formulated this distinction in his best-selling book, “Thinking, Fast and Slow.” There have been many previous posts on this topic; there are sixty-nine hits for Kahneman in the healthy memory blog search box. Normal conversation, driving, and skilled performance are dominated largely by System 1, which is called intuition. When we have to stop and think about something, that is an example of System 2 processing, which is called reasoning. The psychologist Keith Stanovich breaks System 2 processing down into instrumental and epistemic rationality in his effort to develop a Rationality Quotient (RQ) that improves upon the standard IQ.

Professor Shane Frederick has introduced a simple test to determine whether a person is more intuitive or more deliberative. It’s called the Cognitive Reflection Test (CRT). Here’s an example problem.

A bat and a ball together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

Do you think the answer is 10 cents? If you do, you’re in good company. Most people report that as the answer (including the majority of students at Ivy League colleges). 10 cents pops into almost everyone’s mind. This is the product of System 1 processing. However, if System 2 is engaged, one realizes that if the ball costs 10 cents and the bat costs $1 more than the ball, then the bat costs $1.10 and together they cost $1.20. So 10 cents is the wrong answer. The small proportion of people whose System 2 processes kick in realize that 10 cents is wrong, and they are able to calculate the correct answer: 5 cents. Frederick refers to such people as reflective, meaning that they tend to suppress their intuitive response and deliberate before answering.
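The deliberate (System 2) reasoning here is just a small piece of algebra, and it can be checked mechanically. Here is a minimal sketch in Python (the variable names are my own, not from the book):

```python
# bat-and-ball problem: ball + bat = 1.10, and bat = ball + 1.00
# substituting: 2 * ball + 1.00 = 1.10, so ball = (1.10 - 1.00) / 2
intuitive_ball = 0.10
correct_ball = (1.10 - 1.00) / 2  # 0.05

# the intuitive answer fails the constraint: 0.10 + 1.10 = 1.20, not 1.10
assert abs((intuitive_ball + (intuitive_ball + 1.00)) - 1.20) < 1e-9

# the deliberate answer satisfies it: 0.05 + 1.05 = 1.10
assert abs((correct_ball + (correct_ball + 1.00)) - 1.10) < 1e-9
print(f"the ball costs ${correct_ball:.2f}")
```

The intuitive answer survives until you actually substitute it back into the problem’s constraints, which is precisely the step System 1 skips.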

Here is another CRT problem.

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake?

The answer 24 comes to most people’s minds. But if the patch doubles in size every day, then a lake half covered on day 24 would be fully covered on day 25, not day 48, so 24 can’t be correct. The patch covers half the lake one day before it covers all of it: day 47.
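The doubling logic can be run forward as a simulation to confirm the answer. A small sketch (my own illustration, not from the book):

```python
# simulate a patch that doubles daily and covers the whole lake on day 48
LAKE = 1.0
FULL_DAY = 48

# run the growth forward from day 0, where the starting area is whatever
# doubles into full coverage by day 48: area(day) = LAKE / 2**(FULL_DAY - day)
half_day = None
for day in range(FULL_DAY + 1):
    area = LAKE / 2 ** (FULL_DAY - day)
    if area == LAKE / 2:
        half_day = day  # the day the patch covers exactly half the lake

print(half_day)  # day 47, one doubling before full coverage
```

The key property is that exponential growth puts the halfway point one step before the end, no matter how long the process runs; 24 only looks right if you picture linear growth.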

Here’s another CRT problem.

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Try to solve this on your own.




















The correct answer is 5 minutes (each machine takes 5 minutes to make one widget).
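The reasoning is a rate calculation: the setup fixes each machine’s individual rate, and scaling machines and widgets together leaves the time unchanged. A minimal sketch (my own, not from the book):

```python
# 5 machines make 5 widgets in 5 minutes, so the per-machine rate is
# 5 widgets / (5 machines * 5 minutes) = 0.2 widgets per machine-minute,
# i.e. one widget per machine every 5 minutes
rate = 5 / (5 * 5)

# 100 machines making 100 widgets: each machine still makes just one widget
time_needed = 100 / (100 * rate)
print(time_needed)  # 5.0 minutes, not the intuitive 100
```

The intuitive answer of 100 minutes comes from pattern-matching the surface numbers (5-5-5 therefore 100-100-100) instead of computing the rate.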

The solution to all three problems requires invoking System 2 processing. Fewer than 20% of the U.S. population gets all three problems correct. This finding might reflect a reluctance to think and might account for many of the problems the United States is facing. About 48% of students at the Massachusetts Institute of Technology (MIT) got all three problems correct, but only 26% of Princeton students did.

© Douglas Griffith, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith, with appropriate and specific direction to the original content.


The Knowledge Illusion: Why We Never Think Alone

March 12, 2017

“The Knowledge Illusion: Why We Never Think Alone” is the second of three books to be reviewed from an article titled “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

The authors of this book, Steven Sloman and Philip Fernbach, also believe that sociability is the key to how the human mind functions, or, more accurately, malfunctions. In a study conducted at Yale University, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cinder blocks. Then they were asked to write detailed step-by-step explanations of how the devices work, and to rate their understanding again. Doing this revealed to the students their own ignorance, because their self-assessments dropped.

Sloman and Fernbach call this the “illusion of explanatory depth” and find this effect just about everywhere. They say that what allows us to persist in this belief is other people. This is something we are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. They argue that we collaborate so well that we can hardly tell where our own understanding ends and others’ begins. They argue that this borderlessness is crucial to what we consider progress. “As people invented new tools for new ways of living, they simultaneously created new realms of ignorance. If everyone insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.”

Where this gets us into trouble, according to Sloman and Fernbach, is in the political domain. “It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.”

Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also to locate Crimea on a map. The farther off base they were about the geography, the more likely they were to favor military intervention.