Posts Tagged ‘Steven Sloman’

Robots Will Be More Useful If They Are Made to Lack Confidence

July 17, 2017

The title of this post is identical to the title of an article by Matt Reynolds in the News & Technology section of the 10 June 2017 issue of the New Scientist. The article begins, “CONFIDENCE in your abilities is usually a good thing—as long as you know when it’s time to ask for help.” Reynolds notes that as we build ever smarter software, we may want to apply the same mindset to machines.

Dylan Hadfield-Menell says that overconfident AIs can cause all kinds of problems. So he and his colleagues designed a mathematical model of an interaction between humans and computers called the “off-switch game.” In this theoretical setup, robots are given a task to do, and humans are free to switch them off whenever they like. The robot can also choose to disable its switch so the person cannot turn it off.

Robots given a high level of “confidence” that they were doing something useful would never let the human turn them off, because they tried to maximize the time spent doing their task. Not surprisingly, a robot with low confidence would always let a human switch it off, even if it was doing a good job.
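A toy sketch in Python may make the trade-off concrete. This is a drastic simplification for illustration, not Hadfield-Menell’s actual game-theoretic formulation; the confidence cutoff and payoff numbers are invented:

```python
def robot_payoff(confidence: float, true_utility: float,
                 cutoff: float = 0.9) -> float:
    """Toy off-switch dynamic (hypothetical numbers, not the paper's
    model). A robot 'confident' its task is useful disables its off
    switch and acts no matter what; a less confident robot defers,
    so the human can block actions that are actually harmful."""
    if confidence >= cutoff:
        return true_utility           # acts; human cannot intervene
    return max(true_utility, 0.0)     # human switches it off if harmful

print(robot_payoff(0.99, -1.0))   # -1.0: overconfident robot does damage
print(robot_payoff(0.50, -1.0))   #  0.0: deferential robot is stopped
print(robot_payoff(0.50, +1.0))   # +1.0: and still acts when useful
```

Note that this sketch assumes a human overseer who judges perfectly; the calibration problem discussed next arises precisely because real overseers do not.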

Obviously, calibrating the level of confidence is important. It is unlikely that humans would ever grant a machine a level of confidence so high that they could not shut it down. A problem here is that we humans tend to be overconfident and to be unaware of how much we do not know. This human shortcoming is well documented in a book by Steven Sloman and Philip Fernbach titled “The Knowledge Illusion: Why We Never Think Alone.” Remember that transactive memory is information that is found in our fellow human beings and in technology ranging from paper to the internet. We usually learn, eventually, who the best sources of information are among our fellow humans and human organizations, and we likewise need to learn where to find, and how much confidence to place in, information stored in technology, which includes AI robots. Just as we can choose the wrong friends and sources of information, we can have the same problem with robots and external intelligence.

So the title is wrong. Robots may not be more useful if they are made to lack confidence. They should have a calibrated level of confidence, just as we humans should have calibrated levels of confidence depending on the task and how skilled we are at it. Achieving appropriate levels of confidence between humans and machines is a good example of the man-computer symbiosis J.C.R. Licklider expounded in his classic paper “Man-Computer Symbiosis.”

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Conclusion: Appraising Ignorance and Illusion

July 16, 2017

The title of this post is identical to the title of the final chapter of The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The authors note that this book has three central themes: ignorance, the community of knowledge, and the illusion of understanding.

The authors note that ignorance is inevitable simply because there’s too much complexity in the world for any individual to master. The Dunning-Kruger effect, which has been discussed in previous healthy memory posts, is that those who perform the worst overrate their own skills the most. This effect can be found by giving a group of people a task to do and then asking them how well they think they’ve done on the task. Poor performers overestimate how well they’ve done; strong performers often underestimate their performance. This effect has been found many times both in the psychological laboratory and in many real-world environments: among students, in offices, and among doctors. Dunning has collected an impressive amount of evidence that the reason it happens is that those who lack skills also lack the knowledge of what skills they’re missing. Consequently, they think they’re pretty good. However, those who are knowledgeable have a better sense of how matters should be handled and they know what skills they need to improve on. Dunning stresses the importance of this effect because all of us are unskilled in most domains of our lives.

The authors wrote, “As for the community of knowledge, intelligence resides in the community and not in any individual. So decision-making procedures that elicit the wisdom of the community are more likely to produce better outcomes than procedures that depend on the relative ignorance of lone individuals. A strong leader is one who knows how to inspire a community and take advantage of the knowledge within it, and can delegate responsibility to those with the most expertise.”

There are good and bad aspects of the illusion of understanding. We’re more likely to be accurate if we avoid illusion. We then have a good idea of what we know and what we don’t know, and this helps us achieve our goals. We won’t take on projects that are beyond us, we’ll be less likely to disappoint others, and we’ll be better positioned to deliver on our promises.

But they also note that illusion is a pleasure, as many of us spend a significant part of our lives living in illusions quite intentionally. We entertain ourselves with fictional worlds. Illusions can stimulate creative products by inspiring us to imagine alternative worlds, goals, and outcomes. And they can motivate us to attempt what we wouldn’t otherwise attempt.

Many posts have been devoted to this book because it addresses an important topic. It provides us with a more accurate picture of what we know and the consequences of shortfalls in knowledge. The title states “why we never think alone.” Perhaps it should read, “why we should never think alone.” Although he has certainly tried, HM has not done this volume justice, and he encourages you to read it for yourself.


© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Making Smart Decisions

July 15, 2017

This is the twelfth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Making Smarter Decisions is a chapter in this book. Perhaps the one area where it is most important to make smart decisions is personal finance.

Consider the following question: Assume that you deposit $400 every month into a retirement savings account that earns a 10% yearly rate of interest and that you never withdraw any money. How much money do you think you will have in the account (including interest earned) after 10 years, 20 years, 30 years, and 40 years?

Respondents gave a median answer of $223,000 for the forty-year figure. The correct answer is almost $2.5 million. Use a spreadsheet, or the sketch below, to prove this for yourself.
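For readers who would rather check this in code than in a spreadsheet, here is a minimal Python sketch. Monthly compounding at a 10% annual rate, with deposits credited at the end of each month, is assumed; the question does not specify the compounding convention:

```python
def future_value(monthly_deposit=400.0, annual_rate=0.10, years=40):
    """Balance after depositing monthly_deposit every month, with
    interest compounded monthly and nothing ever withdrawn."""
    r = annual_rate / 12                      # monthly interest rate
    balance = 0.0
    for _ in range(years * 12):
        balance = balance * (1 + r) + monthly_deposit
    return balance

for years in (10, 20, 30, 40):
    print(years, round(future_value(years=years)))
# Roughly $82,000 after 10 years, $304,000 after 20, $904,000 after 30,
# and about $2.53 million after 40 -- versus the median guess of $223,000.
```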

HM’s father passed away before he retired, and he was denied a large amount of the pension he should have received. Nevertheless, both his parents had managed their finances carefully. All they invested in were FDIC insured savings accounts and Certificates of Deposit. HM was amazed at the money his Mom accumulated. She was in fine shape, placed no burden on him, and left him a substantial inheritance.

HM’s parents also never carried credit card debt. No one should ever carry credit card debt. Credit card debt compounds and grows at a nonlinear rate just as savings do, but the compounding works against you rather than for you.

Actually, the rates that are charged are usurious and should not be allowed. But the financial industry has effectively bought Congress. (You should know that the United States has the best Congress money can buy.) Added to this are the clever programs where you earn rewards for using the card. Using the card and earning rewards is not a problem unless you fail to pay off the card in full each month when it is due. Remember: if you carry credit card debt, you are losing money.
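The same arithmetic running in reverse can be sketched just as easily. The 24% APR here is an assumed figure for illustration; actual card rates vary:

```python
# A $1,000 balance carried at an assumed 24% APR, compounded monthly,
# with no payments made:
balance = 1000.00
for month in range(12):
    balance *= 1 + 0.24 / 12
print(round(balance, 2))   # 1268.24 -- the same exponential growth
                           # as savings, but working against you
```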

Behavioral economics has some effective ideas to aid in better financial decisions (enter “behavioral economics” into the healthy memory blog search box to find additional posts on the topic). It has found ways to nudge better decisions. Nudging can be done by setting defaults. Rather than having employees opt in to retirement contributions, have them opt out if they do not want to contribute. Make being an organ donor the default option on a driver’s license, and have drivers opt out if they do not want to be donors. The big idea of the nudge approach is that it is easier and more effective to change the environment than it is to change the person. Once we understand what quirks of cognition drive behavior, we can design the environment so that those quirks help us instead of hurting us.

We can apply these lessons to how we make decisions as part of a community of knowledge. Realizing that people are explanation foes, that we usually don’t have the inclination or even the capability to master the details of all our decisions, we can try to structure the environment to help ourselves make good decisions despite our lack of understanding. The authors offer four such lessons:

Reduce Complexity
Simple Decision Rules
Just-in-Time Education
Check Your Understanding


© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Making People Smart

July 14, 2017

This is the eleventh post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Making People Smart is a chapter in this book.

The authors state, “The illusion of comprehension arises because people confuse understanding with familiarity or recognition.” When you reread text at a later time, it seems familiar. Psychologist Paul Kolers provided an extreme case of this by having people read inverted text. More than a year later, the same people could read the same text faster than different text they hadn’t read before. Thus, they had retained a memory, over the course of a year, for how to read specific text.

A problem that we all have is that this sense of familiarity can be confused with actual understanding of the material. It’s one thing to be familiar with some text or to know it by heart, but another to really get a full understanding of its meaning. Comprehension requires processing text with care and effort in a deliberate manner. It requires thinking about the author’s intention. This isn’t obvious to everyone and many students confuse studying with light reading. The knowledge illusion extends to education as well. Learning requires breaking common habits by processing information more deeply.

Sloman and Fernbach neglect to discuss how current technology hinders the development of fuller and deeper understanding. In their chapter on Thinking with Technology they did discuss how technology fools us into thinking we know more than we do. But they did not discuss how being continually plugged in and multitasking prevents fuller and deeper understanding. The belief that we can multitask is mistaken. What we are really doing is switching between, or all too often among, tasks, and the act of switching has attentional and cognitive costs. Fuller and deeper understanding comes from concentrating on one topic for a prolonged period of time, and many such encounters are usually needed. Multitasking fosters superficial, not deep, processing.

We suffer from the knowledge illusion when we confuse what experts know with what we ourselves know. The fact that we can access someone else’s knowledge makes us feel like we already know what we’re talking about. We are not built to become masters of all subjects, but we are built to participate in a community.

The authors write, “A real education includes learning that you don’t know certain things (a lot of things). Instead of looking in at the knowledge you do have, you learn to look for the knowledge you don’t have. To do this, you have to let go of some hubris; you have to accept what you don’t know. Learning what you don’t know is just a matter of looking at the frontiers of your knowledge and wondering what is out there beyond the border. It’s about asking why.”

Since 2006, a course entitled “Ignorance” has been taught at Columbia University. Guest scientists are invited to speak about what “they don’t know, what they think is critical to know, how they might get to know it, what will happen if they do find this or that thing out, and what might happen if they don’t.” The course focuses on all that is not in the textbooks and thus guides students to think about what is unknown and what could be known. The idea is to focus not on what students themselves don’t know, but what entire fields of science don’t know, with the aim of provoking and directing students to ask questions about the foundations of a scientific field. This course requires that students ponder not just some set of scientific theories; it requires that they begin to understand what the entire community has and hasn’t mastered.

Being a cognitive psychologist, HM has needed to learn about many disciplines: computer science, neuroscience, statistics, and linguistics, to name just a few. This is vastly more knowledge than one individual can comprehend. So much knowledge is accepted on faith. What distinguishes this faith from religious faith is that there is a higher power to appeal to: namely, the power of verification. The Dalai Lama is a religious leader who is unique in that he incorporates scientific results into the Buddhist religion.

It was perhaps when HM graduated from high school that he had high confidence in what he knew. His undergraduate education quickly disabused him of this notion, and his graduate and continuing studies have increased his awareness of how much he does not know.

We all need to become better consumers of information. We need to be skeptical when deciphering the media. As has been noted in previous posts, there is a profitable business in false science and false news. Adrian Chen wrote in the New York Times Magazine of a Russian “troll farm,” a business whose employees were assigned pro-Kremlin viewpoints and putative information to propagate by blogging, posting on social media sites, and flooding comment sections of news sites, often using false identities. It is sad that this sort of thing goes on all the time in both the political and commercial domains. All of which emphasizes that we should be modest about what we do know, never hold absolute beliefs in anything, and constantly try to increase our understanding so we can use knowledge more effectively and perhaps contribute to communal understanding.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.


Thinking About Politics

July 11, 2017

This is the ninth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking About Politics is a chapter in this book.

HM remembers that when the Affordable Care Act was being debated, a woman was asked what she thought about it. She remarked that she was strongly in favor of it. However, when she was asked about Obamacare, she said that she was strongly against it. Such is the state of politics in the United States. A survey by the Kaiser Family Foundation in April 2013 found that more than 40% of Americans were not even aware that the Affordable Care Act was law (12% thought it had been repealed by Congress—it hadn’t).

Drs. Sloman and Fernbach write that public opinion is more extreme than people’s understanding justifies. The Americans who most strongly supported military intervention in Ukraine in 2014 were the ones least able to identify Ukraine’s location on a map. A survey out of Oklahoma State University’s Department of Agricultural Economics asked consumers whether the labeling of foods produced with genetic engineering should be mandatory; 80% of the respondents thought that it should. But 80% also approved of a law mandating labels on foods containing DNA, believing that people have the right to know if their food has DNA. In effect, these respondents thought that all meats, vegetables, and grains should be labeled “BEWARE: HAS DNA.” But we would all die if we avoided foods that contain DNA.

We all need to appreciate how little we understand. The authors write, “Taken to its extreme, the failure to appreciate how little we understand, combined with community support, can ignite really dangerous mechanisms. You don’t have to know much history to know how societies can become caldrons in an attempt to create a uniform ideology, boiling away independent thinking and political opposition through propaganda and terror. Socrates died because of a desire for ancient Athenians to rid themselves of contaminated thinking. So did Jesus at the hands of the Romans. This is why the first crusades were launched to free Jerusalem of the infidel, and why the Spanish Inquisition drove Jews and Muslims to convert to Christianity or leave Spain between 1492 and 1501. The twentieth century was shaped by the demons of ideological purity, from Stalin’s purges, executions, and mass killings to Mao’s Great Leap Forward: the herding of millions of people into agricultural communes and industrial working groups, with the result that many starved. And we haven’t even mentioned the incarcerations and death camps of Nazi Germany.”

The authors write, “Proponents of political positions often cast policies that most people see as consequentialist in values-based terms in order to hide their ignorance, prevent moderation of opinion, and block compromise.” They note the health care debate as a perfect example of this. Most people just want the best health care for the most people at the most affordable price, and the national conversation should be about how to achieve this. But that conversation might be technical and boring. So politicians and interest groups make the debate about sacred values instead. One side asks whether the government should be making decisions about our health care, focusing the audience on the importance of limited government. The other side asks whether everybody in the country deserves decent health care, focusing on the value of generosity and preventing harm to others. The authors say that both sides are missing the point. All of us have similar values: we want to be healthy, we want others to be healthy, and we want doctors and other medical professionals to be compensated, but we don’t want to pay too much. The health care debate should not be about basic values, because in most people’s minds basic values are not the issue. The issue is the best way to achieve the best outcomes.

Ideologies and ideologues are the bane of effective government. They constrain alternatives and blind us to obvious solutions. As mentioned in the second post in this series, other advanced countries have effectively addressed the problem of health care with a single payer system in which that single payer is the government. There are already proven examples from which to choose. But in the United States, ideology has deemphasized the role of government, and the single payer system is regarded as a radical solution.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.


Thinking About Science

July 9, 2017

This is the eighth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking About Science is a chapter in this book.

Were it not for science and, more important, scientific thinking, we would still be living in the dark Middle Ages. Our wealth and health are due to scientific thinking. Yet knowledge about science and belief in scientific facts are lacking. HM admires the Amish: although they reject science, they live the humble lives dictated by their beliefs. Too many others enjoy the fruits of science yet reject scientific methods and findings. Their lack of respect for science exposes us to the continued risks of global warming and puts unvaccinated children at risk, to name just two problems.

In 1985, Walter Bodmer, a German-born geneticist and professor at Oxford University, was appointed by the Royal Society of London to lead a team evaluating the state of attitudes toward science and technology in Britain. The Royal Society was concerned about antiscientific sentiment in Britain, seeing it as a serious risk to societal well-being. The results and recommendations of the study were published in a seminal paper known as the Bodmer Report.

Previous research had focused primarily on measuring attitudes directly, but Bodmer and his team argued for a simple and intuitive idea: opposition to science and technology is driven by a lack of understanding. By promoting a better understanding of science, then, society can promote more favorable attitudes and take better advantage of the benefits afforded by science and technology. This idea about science attitudes is called the deficit model. According to this model, antiscientific thinking is due to a knowledge deficit; once the deficit is filled, antiscientific attitudes will be mitigated or will disappear.

The paucity of scientific knowledge and the abundance of antiscientific beliefs have been documented in every society that has been studied. There is only a weak relationship between scientific knowledge and attitudes about science, and attempts to fill the deficit have failed. In spite of the millions and millions of dollars spent on research, curriculum design, outreach, and communication, little to no headway has been made.

HM thinks that science is so vast and continually expanding that the deficit is simply too large to fill. Although scientists are knowledgeable in their specialties, as they move away from those specialties that knowledge falls off.

But there is another explanation: scientific attitudes are not based on the rational evaluation of evidence, so providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors.

These two explanations are not mutually exclusive. They are likely both operative.

One of the leading voices promoting this new perspective is Dan Kahan, a Yale law professor. He argues that our attitudes are not based on rational, detached evaluation of evidence because our beliefs are not isolated pieces of data that we can take up and discard at will. Instead, these beliefs are intertwined with other beliefs, shared cultural values, and our identities. To discard a belief means discarding a whole host of other beliefs, forsaking our communities, going against those we trust and love, and virtually challenging our identities.

Drs. Sloman and Fernbach flesh out this theory with the story of Mike McHargue, now a podcaster and blogger who goes by the moniker Science Mike. Mike once attended a fundamentalist church and held fundamentalist beliefs. When he reached his thirties he began reading the scientific literature, and his faith in those beliefs began to waver. His initial reaction was to lose his faith completely, but for a long time he kept his new beliefs from his community. Eventually a personal experience helped him rediscover his faith, and he is now, once again, a practicing Christian, but he continues to reject his fundamentalist church’s antiscientific beliefs.

Here is Science Mike’s response to a caller who has begun to question many of his beliefs:

Do I have advice on how to live when you’re at odds with your community? Absolutely. Do not live at odds with your community… You are a time bomb right now. Because at some point you won’t be able to pretend anymore, and you will speak honestly, and there will be a measure of collateral damage and fallout in your church. It’s time to move on. It’s time to find a faith community that believes as you believe…When that happens, you’re going to lose relationships. Some people cannot agree to disagree and those relationships can become abusive…There is a lot of pain because there are some people who are dear to me that I can’t talk to anymore…It is not possible for us to have the relationship we once had, and it’s rough. I’m not gonna lie. It’s rough.

This poignant response provides useful and important advice.

HM accepts the fundamental thesis of Drs. Sloman and Fernbach, that our knowledge is inadequate. Scientific evidence can be wrong, but at any given time, the scientific evidence available is the best information to use. We ignore it at our peril.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Thinking with Technology

July 8, 2017

This is the seventh post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking with Technology is a chapter in this book. Much has already been written in this blog on this topic, so this post will try to hit some unique points.

In the healthy memory blog, Thinking with Technology comes under the category of transactive memory, as information in technology, be it paper or the internet, falls into this category. Actually, Thinking with Other People also falls into this category, as transactive memory refers to all information not stored in our own biological brains. Sloman and Fernbach recognize this similarity when they write that we are starting to treat our technology more and more like people, like full participants in the community of knowledge. Just as we store understanding in other people, we store understanding in the internet. We already know that having knowledge available in other people’s heads leads us to overrate our own understanding. We live in a community that shares knowledge, so each of us individually can fail to distinguish whether knowledge is stored in our own head or in someone else’s. This is the illusion of explanatory depth, viz., I think I understand things better than I do because I incorporate other people’s understanding into my assessment of my own understanding.

Two different research groups have found that we have the same kind of “confusion at the frontier” when we search the internet. Adrian Ward of the University of Texas found that engaging in internet searches increased people’s cognitive self-esteem, their sense of their own ability to remember and process information. Moreover, people who searched the internet for facts they didn’t know and were later asked where they found the information often misremembered and reported that they had known it all along. Many completely forgot ever having conducted the search, giving themselves credit instead of Google.

Matt Fisher and Frank Keil conducted a study in which participants were asked to answer a series of general causal knowledge questions like “How does a zipper work?” One group was asked to search the internet to confirm the details of their explanation. The other group was asked to answer the questions without using any outside sources. Next, participants were asked to rate how well they could answer questions in domains that had nothing to do with the questions they were asked in the first phase. Those who had searched the internet rated their ability to answer unrelated questions higher than those who had not.

The risk here should not be underestimated. Interactions with the internet can leave us thinking we know more than we do. It is important to make a distinction between what is accessible in memory and what is available in memory. If you can provide answers without consulting any external sources, then the information is accessible and is truly in your personal biological memory. If you need to consult the internet, some other technical source, or some individual, then the information is available but not accessible. This is the difference between a closed-book test and an open-book test. Unless you can perform extemporaneously and accurately, be sure to consult transactive memory.

Sloman and Fernbach have some unique perspectives. They discount the risk of superintelligence threatening humans, at least for now. They seem to think that there is no current basis for a real superintelligence taking over the world. The reason they offer is that technology does not (yet) share intentionality with us. HM does not quite understand why they argue this, and, in any case, the “yet” is enclosed in parentheses, implying that this is just a matter of time.

To summarize succinctly, technology increases the extent to which we think we know more than we actually know. In other words, it increases the knowledge illusion.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Thinking with Other People

July 7, 2017

This is the sixth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking with Other People is a chapter in this book. The evolution of modern humans from other species of hominids was extremely rapid on an evolutionary time scale. It began with the emergence of the genus Homo on the African savannah 2 to 3 million years ago. Sloman and Fernbach note that the great leap humanity took during that period was cognitive. The brain mass of modern humans is about three times that of our early hominid ancestors.

A compelling hypothesis, the social brain hypothesis, is that the driving force in the evolution of human intelligence was the coordination of multiple cognitive systems to pursue complex, shared goals. Living in a group confers advantages, as in hunting, but it demands certain cognitive abilities: the ability to communicate in sophisticated ways, to understand and incorporate the perspectives of others, and to share common goals. According to the social brain hypothesis, the cognitive demands and adaptive advantages associated with living in a group created a snowball effect: as groups got larger and developed more complex joint behaviors, individuals developed new capabilities to support those behaviors, which in turn allowed groups to get even larger and group behavior to become even more complex.

Anthropologist Robin Dunbar, whom we have encountered previously in healthy memory blog posts, tested the social brain hypothesis against the ecological hypothesis. For many species of primates, he collected data on brain size, on facts about the environment they live in, such as the extent of their roaming territory and their dietary habits, and on facts about their societies, such as average group size. Brain size and group size turned out to be closely related: primate species that live in large groups have bigger brains. Environmental measures such as territory size and diet were unrelated to brain size.

Increased brain size led to language, and what sets people apart from other species is the ability to seamlessly communicate ideas of arbitrary complexity. Members of a hunting party need to understand the intentions of the others in the party so that each can play their respective role.

Sloman and Fernbach argue that we humans have the unique capability of shared intentionality. This is an ability that no machine and no other cognitive system has: we can share our attention with someone else. When we interact with one another, we do not merely experience the same event; we also know we are experiencing the same event. And this knowledge that we are sharing our attention changes more than the nature of the experience; it also changes what we do and what we are able to accomplish in conjunction with others.

Sloman and Fernbach continue, “Sharing attention is a crucial step on the road to being a full collaborator in a group sharing cognitive labor, in a community of knowledge. Once we can share attention, we can share common ground. We know some things that we know others know, and we know that they know that we know (and of course we know that they know that we know, etc.). The knowledge is not just distributed; it is shared. Once knowledge is shared in this way, we can share intentionality; we can jointly pursue a common goal. A basic human talent is to share intentions with others so that we can accomplish things collaboratively.” HM thinks that Sloman and Fernbach are describing the ideal situation. It is not unusual for consultants and training to be required to make this happen, and many organizations continue to function in a state that is far from ideal.

Sloman and Fernbach note that the knowledge illusion is the flip side of what economists call the curse of knowledge: when we know something, we find it hard to imagine that someone else doesn’t know it. The curse of knowledge sometimes comes in the form of hindsight bias. “The curse of knowledge is that we tend to think what is in our heads is in the heads of others. In the knowledge illusion, we tend to think what is in others’ heads is in our heads. In both cases, we fail to discern who knows what.”

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.


The Brain is in the Mind

July 6, 2017

This is the fifth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The title of this post is identical to the title of a section in that book.

When asked where the mind is located, most people respond that it is in the brain. So most people assume that the locus of thought—the most impressive of human capacities—is in the most sophisticated of human organs, the brain. Is this correct? Consider the following experiment.

In this simple experiment, participants are shown photographs of common objects and asked to indicate, by pushing a button with one of their hands, whether each object is upright or upside down. We have no problem with this task and respond in not much more than half a second. But when the experimenters vary one little detail, a detail that shouldn’t matter if the mind is in the brain, the results change. The objects are oriented either to the left or to the right; for example, the handle of a watering can is on the right-hand side in half the pictures and on the left-hand side in the other half. If all we’re doing to decide whether the object is upright or upside down is consulting the knowledge stored in our brain about the object’s orientation, then whether the handle is on the left or right should make no difference. But it does. When responding yes with our right hand, we are faster when the handle is on the right than when the handle is on the left. When we are asked to say yes by pressing a button with our left hand, we are faster when the handle is on the left.

Here’s why. A photograph of a utensil with a handle on the right makes it easier to use our right hand. We see the photograph and immediately and unconsciously start organizing our body to interact with the pictured object. Even though the handle is in a photograph and not real, it is calling for our right hand. The fact that our right hand is primed for action makes us faster to respond with it, even to a question about the orientation of the object, which has nothing to do with action. By priming our hand to interact with the object, our body directly affects how long it takes us to answer the question. We don’t just pull the answer out of our brain. Instead, our body and brain respond in synchrony to the photograph to retrieve an answer.

We use our bodies to think and remember. One study showed that acting out a scene is more effective than other memorization techniques for recalling that scene. Embodiment is a cluster of ideas about the important role the body plays in cognitive processing. Cognition is unified with the objects we’re thinking about and with. When we make music, our thoughts about the music and the music we make with our mouths or instruments are part of the same process and highly interdependent. It’s much easier to move our fingers as if we’re playing a guitar if we actually have a guitar, and it’s much easier to spell a word or do arithmetic if we write down what we’re thinking. The fact that thought is more effective when done in conjunction with the physical world suggests that thought is not a disembodied process that takes place on a slate inside the head. The authors conclude, “Mental activities do not simply occur in the brain. Rather, the brain is only one part of a processing system that also includes the body and other aspects of the world.”

Emotional reactions are also memories. Remember the healthy memory blog post on Dr. Lisa Feldman Barrett and her book “How Emotions Are Made.” Our emotions are the result of our interpretations of, and models based on, our interoceptive responses. We learn to interpret our internal bodily responses in a manner analogous to how we build models of and interpret the external world.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Two Causal Reasoners Inside

July 5, 2017

This is the fourth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The title of this post is identical to the title of a section in that book. Drs. Sloman and Fernbach state that we are engaged in some type of causal reasoning almost all the time, but that not all causal reasoning is the same. Some of it is fast, quick and automatic, as when a man concludes that his hand hurts because he bashed it against the wall. Another, slower type is at work when we try to recall the causes of World War I.

This two-process distinction goes beyond causal reasoning and can be applied to all cognitive processing. Daniel Kahneman formulated this distinction in his best-selling book “Thinking, Fast and Slow.” There have been many previous posts on this topic; there are sixty-nine hits for Kahneman in the healthy memory blog search box. Normal conversation, driving, and skilled performance are dominated largely by System 1, which is called intuition. When we have to stop and think about something, that is an example of System 2 processing, which is called reasoning. The psychologist Keith Stanovich breaks System 2 processing down into instrumental and epistemic components in his efforts to develop a Rational Quotient (RQ) that improves upon the standard IQ.

Professor Shane Frederick has introduced a simple test to determine whether a person is more intuitive or more deliberative. It’s called the Cognitive Reflection Test (CRT). Here’s an example problem.

A bat and a ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

Do you think the answer is 10 cents? If you do, you’re in good company. Most people report that as the answer (including the majority of students in Ivy League colleges). 10 cents pops into almost everyone’s mind. This is the product of System 1 processing. However, if System 2 is engaged, one realizes that if the ball costs 10 cents and the bat costs $1 more than the ball, then the bat costs $1.10 and together they cost $1.20. So 10 cents is the wrong answer. The small proportion of people whose System 2 processes kick in realize that 10 cents is wrong, and they are able to calculate the correct answer: 5 cents. Frederick refers to such people as reflective, meaning that they tend to suppress their intuitive responses and deliberate before responding.

Here is another CRT problem.

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake?

The answer 24 comes to most people’s mind. But if the patch doubles in size every day, then if the lake is half covered on day 24, it would be fully covered on day 25. But the problem states that the lake is fully covered on day 48, so 24 can’t be correct. The correct answer is one day before it’s fully covered, day 47.

Here’s another CRT problem.

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Try to solve this on your own.

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

The correct answer is 5 minutes (each machine takes 5 minutes to make one widget).
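For the skeptical, all three answers can be checked with a few lines of Python (illustrative only):

```python
# Bat and ball: bat + ball = 1.10 and bat = ball + 1.00,
# so 2 * ball = 0.10 and the ball costs 5 cents.
ball = (1.10 - 1.00) / 2
print(ball)                                         # 0.05

# Lily pads: the patch doubles daily and fills the lake on day 48,
# so it must have been half full exactly one day earlier.
size = {day: 2 ** day for day in range(49)}         # relative patch size
print([d for d in size if size[d] * 2 == size[48]]) # [47]

# Widgets: 5 machines make 5 widgets in 5 minutes, so each machine
# makes one widget in 5 minutes; 100 machines therefore make
# 100 widgets in the same 5 minutes.
minutes_per_widget = 5            # per machine
widgets_per_machine = 100 / 100   # one widget each
print(widgets_per_machine * minutes_per_widget)     # 5.0
```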

The solution to all three problems requires invoking System 2 processing. Less than 20% of the U.S. population gets all three problems correct. This finding might reflect a reluctance to think, and it might account for many of the problems the United States is facing. About 48% of students at the Massachusetts Institute of Technology (MIT) got all three problems correct, but only 26% of Princeton students did.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Illusion of Understanding

July 4, 2017

This is the third post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach.

In the 1980s the cognitive scientist Thomas Landauer decided to estimate the size of human memory on the same scale used to measure the size of computer memories. He used several clever techniques to measure how much knowledge people have. For example, he estimated the size of an average adult’s vocabulary and calculated how many bytes would be required to store that much information. Then he used that result to estimate the size of the average adult’s entire knowledge base. The answer was half of a gigabyte. For comparison, HM is currently looking at a USB flash drive with 32 gigabytes of storage.

Dr. Landauer did another study in which he measured the difference in recognition performance between a group that had been exposed to items and a group that had not. This difference is as pure a measure of memory as one can get. He measured the amount of time people spent learning the material in the first place, which told him the rate at which people are able to acquire information that they later remember. He also found a way to take into account the fact that people forget. His analyses found that people acquire information at roughly the same rate regardless of the details of the experimental procedure or the type of material being learned. People learn at approximately the same rate whether the items are visual, verbal, or musical.

Then Dr. Landauer calculated how much information people have on hand, the size of their knowledge base, by assuming they learn at this same rate over the course of a seventy-year lifetime. Every technique he tried gave roughly the same result: about 1 gigabyte. This is just a tiny fraction of what a modern laptop can retain.
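A back-of-the-envelope version of that lifetime calculation is easy to reproduce. The 2-bits-per-second acquisition rate and 16 waking hours per day are assumptions for illustration; Landauer’s measured rates were of this general order:

```python
# Rough reconstruction of a Landauer-style lifetime estimate
# (assumed rate and waking hours, not Landauer's exact figures).
bits_per_second = 2
waking_seconds = 16 * 3600 * 365 * 70      # 70 years, 16 h awake per day
total_bytes = bits_per_second * waking_seconds / 8
print(round(total_bytes / 1e9, 2), "GB")   # ~0.37 GB: a fraction of one
                                           # gigabyte, as Landauer found
```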

Drs. Sloman and Fernbach note that this is shocking only if you believe the human mind works like a computer. The model of the mind as a machine designed to encode and retain memories breaks down when you consider the complexity of the world with which we interact. They conclude that it would be futile for memory to be designed to hold tons of information because there’s just too much out there.

Drs. Sloman and Fernbach note that most of cognition consists of intuitive thought that occurs below the level of conscious awareness. Huge quantities of information are processed in parallel. People are not computers: we don’t just rely on a central processor that reads and writes to a memory in order to think. We rely on our bodies, on the world around us, and on other minds. There’s no way we could store in our heads all there is to know about our environment.

We humans are surprisingly ignorant, more ignorant than we think. We also exist in a complex world, one that is even more complex than one might have thought. But if we’re so ignorant, how can we get around, sound knowledgeable, and take ourselves seriously while understanding only a tiny fraction of what there is to know?

The authors’ answer is that we do so by living a lie. We ignore complexity by overestimating how much we know about how things work, by living life in the belief that we know how things work even when we don’t. We tell ourselves that we understand what’s going on, that our opinions are justified by our knowledge, and that our actions are grounded in justified beliefs even though they are not. We tolerate complexity by failing to recognize it. That’s the illusion of understanding.

The Knowledge Illusion: Why We Never Think Alone (Unabridged) 2

July 2, 2017

If you have not read the immediately preceding post, please scroll down and read it. The present post offers HM’s thoughts about how the need to use knowledge from fellow humans can fail. A good example can be found in the current debates about the Affordable Care Act and its replacement. Although the Affordable Care Act is flawed, information from fellow humans who have successfully dealt with this problem is being ignored.

The United States has the most expensive healthcare costs in the world, but results characteristic of a third world country. And it is the only advanced country that does not have a single payer system in which the single payer is the government. This adoption did not happen all at once. It began in England after WW II, and variants of it were gradually adopted over time by other advanced countries, with the notable exception of the United States.

How can this be explained? It can be explained most simply in one word: “beliefs.” In this case, the belief in free markets. Free markets are good, but what is frequently forgotten is that free markets do not remain free; they are manipulated, and they require government intervention to disrupt the development of monopolies. Moreover, free markets are not universally applicable. Economists have effectively argued that free markets are not appropriate for medical care.

However, even if one believed in the viability of free markets for medical care, how could one ignore the success of single payer systems in the developed world? The problem is that beliefs stymie new and creative thinking and the use of knowledge from knowledgeable people. New beliefs require thinking, and thinking requires mental effort, which many people find uncomfortable.

Then there is the faux “Fair and Balanced” news. When the Affordable Care Act was being proposed, “Fair and Balanced” news featured an English woman who had a complaint about a surgical procedure she had undergone. The woman was livid about how her complaint was presented: although she had complaints about that one procedure, she was highly enthusiastic about the national health system in Britain. Moreover, none of the countries that have adopted a single payer system in which the single payer is the government have abandoned these programs. It should be noted that in the United States Medicare is a single payer system that works quite well. However, Medicare covers only a certain percentage of costs, so supplemental insurance is prudent.

So beliefs can thwart change and innovation. But not all beliefs are necessarily bad. Consider religion, particularly religions for which relieving the suffering of fellow human beings is important. One would think that in the United States, where such religious beliefs are widespread, medical care would be among the best, not the worst. What apparently happens here is compartmentalization. These religious beliefs are thwarted by beliefs about government and the supremacy of market forces. The result of this compartmentalization is that the health and finances of fellow citizens suffer.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Knowledge Illusion: Why We Never Think Alone (Unabridged)

July 1, 2017

“The Knowledge Illusion: Why We Never Think Alone” is an important book by Steven Sloman and Philip Fernbach. An earlier healthy memory blog post with the same title as the book has already been written. That post was based on a summary of the book by Elizabeth Kolbert for the New Yorker. Having now read the entire book, HM feels that this volume deserves more detailed attention.

Drs. Sloman and Fernbach are cognitive scientists. Cognitive science emerged in the 1950s to understand the workings of the human mind. It asks questions such as “how is thinking possible?” What goes on inside the brain that allows sentient beings to do math, understand their mortality, act virtuously and (sometimes) selflessly, and still do simple things, like eat with a knife and fork? Currently no machine, and probably no other animal, is capable of these acts.

The authors write, “The human mind is not like a desktop computer, designed to hold reams of information. The mind is a flexible problem solver that evolved to extract only the most useful information to guide decisions in new situations. As a consequence, we individuals store very little detailed information about the world in our heads. In that sense people are like bees and society a beehive: Our intelligence resides not in individual brains, but in the collective mind. To function, individuals rely not only on knowledge stored within our skulls, but also on knowledge stored elsewhere: in our bodies, in the environment, and especially in other people.” In the lingo of the healthy memory blog, information not held within our individual brains is stored in transactive memory. The authors conclude, “When you put it all together, human thought is incredibly impressive, but it is a product of a community, not of any individual alone.”

The authors make a compelling argument that we all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meager. Unfortunately, we are not adequately aware of the shortcomings in our understanding. We think we understand much, much more than we actually do. Readers of the healthy memory blog should be aware of the risks of holding absolute beliefs; all beliefs should be hedged with some reasonable degree of doubt.

The authors note that history is full of events that seem familiar, that elicit a sense of mild to deep understanding, but whose true historical context is different from what we imagine. The complex details get lost in the mists of time, while myths emerge that simplify and make stories digestible, in part to serve one interest group or another. There is a very interesting book by James W. Loewen titled “Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong.” He argues that history as taught in the public schools is basically propaganda advanced by the school boards that select the texts. HM found this book most instructive. People should be recalled for a defective education, but reading this book is more practical.

It is also important to remember that the study of history is dynamic. New research yields new interpretations of history.

The authors write, “Thought is for action. Thinking evolved as an extension of the ability to act effectively; it evolved to make us better at doing what’s necessary to achieve our goals. Thought allows us to select from among a set of possible actions by predicting the effects of each action and by imagining how the world would be if we had taken different actions in the past.”

It is unlikely that we would have survived had we depended only on the limited knowledge stored in our individual brains. The authors write, “The secret to our success is that we live in a world in which knowledge is all around us. It is in the things we make, in our bodies and workspaces, and in other people. We live in a community of knowledge.”

But not all of this knowledge is accurate, meaning that there are degrees of belief and some knowledge is faux. Understanding that our knowledge is not golden can offer us improved ways of approaching our most complex problems. Recognizing the limits of our understanding should make us more humble and open our minds to other people’s ideas and ways of thinking. The authors note that it offers lessons about how to avoid things like bad financial decisions, and it can enable us to improve our political system and help us assess how much reliance we should place on experts versus how much decision-making power should be given to individual voters.

The authors write, “This book is being written at a time of immense polarization on the American political scene. Liberals and conservatives find each other’s views repugnant, and as a result, Democrats and Republicans cannot find common ground or compromise.” The authors note, “One reason for this gridlock is that both politicians and voters don’t realize how little they understand. Whenever an issue is important enough for public debate, it is also complicated enough to be difficult to understand.” They conclude, “Complexity abounds. If everybody understood this, our society would likely be less polarized.”

Neuroscience is much in the news, as there have been many exciting developments in the field. Little is currently being written about cognitive science, although there are exciting and relevant new findings there as well. The following is quoted directly from “The Knowledge Illusion”: “Our skulls may delimit the frontier of our brains, but they do not limit the frontier of our knowledge. The mind stretches beyond to include the body, the environment, and people other than oneself, so the study of the mind cannot be reduced to the study of the brain. Cognitive science is not the same as neuroscience.”

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Knowledge Illusion: Why We Never Think Alone

March 12, 2017

“The Knowledge Illusion: Why We Never Think Alone” is the second of three books to be reviewed from an article titled “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

The authors of this book, Steven Sloman and Philip Fernbach, also believe that sociability is the key to how the human mind functions, or, more accurately, malfunctions. In a study conducted at Yale University, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cinder blocks. Then they were asked to write detailed step-by-step explanations of how the devices work, and to rate their understanding again. Doing this revealed to the students their own ignorance, because their self-assessments dropped.

Sloman and Fernbach call this the “illusion of explanatory depth” and find this effect just about everywhere. They say that what allows us to persist in this belief is other people. This is something we are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. They argue that we collaborate so well that we can hardly tell where our own understanding ends and others’ begins. They argue that this borderlessness is crucial to what we consider progress: “As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.”

Where this gets us into trouble, according to Sloman and Fernbach, is in the political domain. “It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.”

Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also to locate Crimea on a map. The farther off base they were about the geography, the more likely they were to favor military intervention.