Posts Tagged ‘The Knowledge Illusion’

Robots Will Be More Useful If They are Made to Lack Confidence

July 17, 2017

The title of this post is identical to the title of an article by Matt Reynolds in the News & Technology section of the 10 June 2017 issue of the New Scientist. The article begins, “CONFIDENCE in your abilities is usually a good thing—as long as you know when it’s time to ask for help.” Reynolds notes that as we build ever smarter software, we may want to apply the same mindset to machines.

Dylan Hadfield-Menell says that overconfident AIs can cause all kinds of problems. So he and his colleagues designed a mathematical model of an interaction between humans and computers called the “off-switch game.” In this theoretical setup, robots are given a task to do, and humans are free to switch them off whenever they like. The robot can also choose to disable its off switch so the person cannot turn it off.

Robots given a high level of “confidence” that they were doing something useful would never let the human turn them off, because they were trying to maximize the time spent on their task. Not surprisingly, a robot with low confidence would always let a human switch it off, even when it was doing a good job.
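This qualitative behavior can be sketched with a toy decision rule (the function, its name, and the threshold below are illustrative inventions, not the actual mathematical model from the paper):

```python
def robot_allows_shutoff(confidence, threshold=0.9):
    """Toy rule inspired by the off-switch game (not the published model).

    `confidence` is the robot's belief (0 to 1) that continuing its task
    is valuable. The robot defers to the human's off switch only when its
    confidence falls below `threshold`.
    """
    return confidence < threshold

# An overconfident robot never yields; an underconfident one always does.
print(robot_allows_shutoff(0.99))  # False: it disables its off switch
print(robot_allows_shutoff(0.10))  # True: it always lets the human switch it off
```

The interesting design question, as the article notes, is where to set that threshold so the robot is neither uselessly timid nor dangerously stubborn.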

Obviously, calibrating the level of confidence is important. It is unlikely that humans would ever grant a robot a level of confidence so high that it could not be shut down. A problem here is that we humans tend to be overconfident and to be unaware of how much we do not know. This human shortcoming is well documented in a book by Steven Sloman and Philip Fernbach titled “The Knowledge Illusion: Why We Never Think Alone.” Remember that transactive memory is information held by our fellow human beings and by technology ranging from paper to the internet. We usually learn over time who the best human and organizational sources of information are, and we likewise need to learn where to find, and how much confidence to place in, information stored in technology, which includes AI robots. Just as we can choose the wrong friends and sources of information, we can have the same problem with robots and external intelligence.

So the title is wrong. Robots may not be more useful if they are made to lack confidence. They should have a calibrated level of confidence, just as we humans should have levels of confidence calibrated to the task and to how skilled we are. Achieving the appropriate levels of confidence between humans and machines is a good example of the man-computer symbiosis J.C.R. Licklider expounded upon in his classic paper “Man-Computer Symbiosis.”

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Conclusion: Appraising Ignorance and Illusion

July 16, 2017

The title of this post is identical to the title of the final chapter of The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The authors note that this book has three central themes: ignorance, the community of knowledge, and the illusion of understanding.

The authors note that ignorance is inevitable simply because there’s too much complexity in the world for any individual to master. The Dunning-Kruger effect, which has been discussed in previous healthy memory posts, is that those who perform the worst overrate their own skills the most. This effect can be found by giving a group of people a task to do and then asking them how well they think they’ve done on the task. Poor performers overestimate how well they’ve done; strong performers often underestimate their performance. This effect has been found many times both in the psychological laboratory and in many real-world environments: among students, in offices, and among doctors. Dunning has collected an impressive amount of evidence that the reason it happens is that those who lack skills also lack the knowledge of what skills they’re missing. Consequently, they think they’re pretty good. However, those who are knowledgeable have a better sense of how matters should be handled and they know what skills they need to improve on. Dunning stresses the importance of this effect because all of us are unskilled in most domains of our lives.

The authors wrote, “As for the community of knowledge, intelligence resides in the community and not in any individual. So decision-making procedures that elicit the wisdom of the community are more likely to produce better outcomes than procedures that depend on the relative ignorance of lone individuals. A strong leader is one who knows how to inspire a community and take advantage of the knowledge within it, and can delegate responsibility to those with the most expertise.”

There are good and bad aspects of the illusion of understanding. By avoiding illusion, we’re more likely to be accurate: we have a better idea of what we know and what we don’t know, which should help us achieve our goals. We won’t take on projects that are beyond us, we’ll be less likely to disappoint others, and we’ll be better positioned to deliver on our promises.

But they also note that illusion is a pleasure, as many of us spend a significant part of our lives living in illusions quite intentionally. We entertain ourselves with fictional worlds. Illusions can stimulate creative products by inspiring us to imagine alternative worlds, goals, and outcomes. And they can motivate us to attempt what we wouldn’t otherwise attempt.

Many posts have been devoted to this book because it addresses an important topic. It provides us with a more accurate picture of what we know and the consequences of shortfalls in knowledge. The title states “why we never think alone.” Perhaps it should read, “why we should never think alone.” Although he has certainly tried, HM has not done this volume justice, and he encourages you to read it for yourself.

 


Making Smart Decisions

July 15, 2017

This is the twelfth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Making Smarter Decisions is a chapter in this book. Perhaps the one area where it is most important to make smart decisions is finance.

Consider the following question: Assume that you deposit $400 every month into a retirement savings account that earns a 10% yearly rate of interest and that you never withdraw any money. How much money do you think you will have in the account (including interest earned) after 10 years, 20 years, 30 years, and 40 years?

Respondents gave a median answer of $223,000 after forty years. The correct answer is almost $2.5 million. Use a spreadsheet to prove this for yourself.

HM’s father passed away before he retired, and he was denied a large amount of the pension he should have received. Nevertheless, both his parents had managed their finances carefully. All they invested in were FDIC insured savings accounts and Certificates of Deposit. HM was amazed at the money his Mom accumulated. She was in fine shape, placed no burden on him, and left him a substantial inheritance.

HM’s parents also never carried credit card debt. No one should ever carry credit card debt. This debt compounds and grows at a nonlinear rate, just as savings do. But unlike savings, the compounding works against you.

Actually, the rates that are charged are usurious and should not be allowed. But the financial industry has effectively bought Congress. (You should know that the United States has the best Congress money can buy.) Added to this are the clever programs in which you earn rewards for using the card. Using the card and earning rewards is not a problem unless you fail to pay off the card in full each month when it is due. Remember: if you carry credit card debt, you are losing money.
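The same compounding arithmetic, run against you, shows how quickly a carried balance grows (the 24% APR and $5,000 balance below are illustrative figures, not quotes of any particular card’s terms):

```python
def balance_after(principal, apr, months, monthly_payment=0.0):
    """Toy credit-card balance: each month, interest accrues at apr/12,
    then the payment (if any) is subtracted."""
    r = apr / 12
    balance = principal
    for _ in range(months):
        balance = balance * (1 + r) - monthly_payment
    return balance

# A $5,000 balance carried for a year at 24% APR with no payments
# grows to about $6,341 -- more than a 26% increase.
print(round(balance_after(5000, 0.24, 12)))
```

This is the mirror image of the retirement example above: the exponential curve that quietly builds savings just as quietly buries a debtor.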

Behavioral economics has some effective ideas to aid in better financial decisions (enter “behavioral economics” into the healthy memory blog search box to find additional posts on the topic). It has found ways to nudge better decisions. Nudging can be done by setting defaults. Rather than having employees opt in to retirement contributions, have them opt out if they do not want to contribute. Make being an organ donor the default option on a driver’s license, and have people opt out if they do not want to be donors. The big idea of the nudge approach is that it is easier and more effective to change the environment than it is to change the person. Once we understand which quirks of cognition drive behavior, we can design the environment so that those quirks help us instead of hurting us.

We can apply these lessons to how we make decisions as part of a community of knowledge. Realizing that people are explanation foes—that we usually don’t have the inclination or even the capability to master the details of all our decisions, we can try to structure the environment to help ourselves make good decisions despite our lack of understanding. The authors offer four strategies:

Reduce Complexity
Simple Decision Rules
Just-in-Time Education
Check Your Understanding

 


Making People Smart

July 14, 2017

This is the eleventh post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Making People Smart is a chapter in this book.

The authors state, “The illusion of comprehension arises because people confuse understanding with familiarity or recognition.” When you reread text at a later time, it seems familiar. Psychologist Paul Kolers provided an extreme demonstration of this by having people read inverted text. More than a year later, the same people could read that text faster than comparable text they hadn’t read before. Thus, they had retained a memory for how to read specific text over the course of a year.

A problem that we all have is that this sense of familiarity can be confused with actual understanding of the material. It’s one thing to be familiar with some text or to know it by heart, but another to really get a full understanding of its meaning. Comprehension requires processing text with care and effort in a deliberate manner. It requires thinking about the author’s intention. This isn’t obvious to everyone, and many students confuse studying with light reading. The knowledge illusion extends to education as well. Learning requires breaking common habits by processing information more deeply.

Sloman and Fernbach neglect to discuss how current technology hinders the development of fuller and deeper understanding. In their chapter on Thinking with Technology they did discuss how technology fools us into thinking we know more than we do. But they did not discuss how being continually plugged in and multitasking prevents fuller and deeper understanding. The belief that we can multitask is mistaken. What we are really doing is switching between, or all too often among, tasks, and the act of switching has attentional and cognitive costs. Fuller and deeper understanding comes from concentrating on one topic for a prolonged period of time, and many such encounters are usually needed. Multitasking fosters superficial, not deep, processing.

We suffer from the knowledge illusion when we confuse what experts know with what we ourselves know. The fact that we can access someone else’s knowledge makes us feel like we already know what we’re talking about. We are not built to become masters of all subjects, but we are built to participate in a community.

The authors write, “A real education includes learning that you don’t know certain things (a lot of things). Instead of looking in at the knowledge you do have, you learn to look for the knowledge you don’t have. To do this, you have to let go of some hubris; you have to accept what you don’t know. Learning what you don’t know is just a matter of looking at the frontiers of your knowledge and wondering what is out there beyond the border. It’s about asking why.”

Since 2006, a course entitled “Ignorance” has been taught at Columbia University. Guest scientists are invited to speak about what “they don’t know, what they think is critical to know, how they might get to know it, what will happen if they do find this or that thing out, and what might happen if they don’t.” The course focuses on all that is not in the textbooks and thus guides students to think about what is unknown and what could be known. The idea is to focus not on what students themselves don’t know, but what entire fields of science don’t know, with the aim of provoking and directing students to ask questions about the foundations of a scientific field. This course requires that students ponder not just some set of scientific theories; it requires that they begin to understand what the entire community has and hasn’t mastered.

Being a cognitive psychologist, HM has needed to learn about many disciplines: computer science, neuroscience, statistics, and linguistics, to name just a few. This is vastly more knowledge than one individual can comprehend, so much knowledge is accepted on faith. What distinguishes this faith from religious faith is that there is a higher power to appeal to: namely, the power of verification. The Dalai Lama is a religious leader who is unique in that he incorporates scientific results into the Buddhist religion.

It was perhaps when HM graduated from high school that he had high confidence in what he knew. His undergraduate education quickly disabused him of this notion, and his graduate and continuing studies have increased his awareness of how much he does not know.

We all need to become better consumers of information. We need to be skeptical when deciphering the media. As has been noted in previous posts, there is a profitable business in false science and false news. Adrian Chen wrote in the New York Times Magazine of a Russian “troll farm,” a business whose employees were assigned pro-Kremlin viewpoints and talking points to propagate by blogging, posting on social media sites, and flooding comment sections of news sites, often using false identities. It is sad that this sort of thing goes on all the time in both the political and commercial domains. All of which emphasizes that we should be modest about what we do know, never hold absolute beliefs in anything, and constantly try to increase our understanding so we can use knowledge more effectively, and perhaps contribute to communal understanding.


 

The New Definition of Smart

July 13, 2017

This is the tenth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The New Definition of Smart is a chapter in this book.

The chapter begins by noting that we tend to know of only a few individuals, and oftentimes just one, for an area of accomplishment. Martin Luther King Jr. is known by most people as being key to the civil rights movement. In reality, many people over a prolonged period were involved in the movement. The authors write that the tendency to substitute individuals for complicated entities can be seen in how we talk about institutions. We talk about the Eisenhower administration or the Kennedy administration as if the president of the United States personally carried out all the functions of the executive branch of government. The Affordable Care Act, commonly referred to as Obamacare, runs to about 20,000 pages of legalese. The authors ask how much of it you think Barack Obama himself wrote. Their guess is none. We speak of great scientists as if they changed the world, but they did not do it alone. Wasn’t it the great physicist Sir Isaac Newton who said that if he saw further than other men, it was because he stood on the shoulders of giants?

The authors spend a good deal of time discussing the concept of intelligence, commonly measured as IQ, and how it has developed. Rather than try to summarize their summary, consider the title of a presentation that Robert J. Sternberg made at the Annual Meeting of the Association for Psychological Science in 2017: “Are We Creating a Society of Smart Fools? Lessons from 40+ Years of Research on Human Intelligence, Creativity, and Wisdom.” HM can think of no better authority on this topic.

 

Sloman and Fernbach, building on the concept of a group mind, arrive at the concept of c, for collective intelligence. Diversity is important for successful groups. A group in which everyone has the same expertise is unlikely to be effective. The question is what the objectives for a group are, and what areas of expertise are needed to meet them. Then one searches for these unique pieces and asks what a given individual adds to the group. As the group advances, it is likely that new sources of expertise will be required and that groups will be dynamic. If a new member improves a group, c increases.

This concept of c is relatively new. A team led by Anita Woolley of the Tepper School of Business is developing it. Instead of testing people individually, they gave each of forty teams a variety of tests that included brainstorming possible uses for a brick, Raven’s Advanced Progressive Matrices (often used as a quick assessment of intelligence), a moral reasoning problem, a shopping trip planning task, and a group typing task. Each team did each task together.

All tasks were positively correlated, in that a group that did well on one task was more likely to do well on another task than a group that didn’t do well on the first. Thus, they uncovered the c factor. The new research suggests that the success of a group is not predominantly a function of the intelligence of its individuals, but rather of how well they work together with their respective competencies.
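The logic of extracting a single c factor from positively correlated task scores can be illustrated with made-up numbers (these are not Woolley’s data; a shared factor shows up as a large first eigenvalue of the task-by-task correlation matrix):

```python
import numpy as np

# Hypothetical scores for 6 groups (rows) on 4 tasks (columns).
scores = np.array([
    [8, 7, 9, 8],
    [5, 6, 5, 5],
    [9, 9, 8, 9],
    [3, 4, 3, 4],
    [6, 5, 6, 6],
    [7, 8, 7, 7],
], dtype=float)

corr = np.corrcoef(scores, rowvar=False)   # 4x4 task correlation matrix
eigvals = np.linalg.eigvalsh(corr)         # eigenvalues in ascending order
share = eigvals[-1] / eigvals.sum()        # variance explained by the top factor

print(f"top factor explains {share:.0%} of the variance")
```

When one dominant eigenvalue soaks up most of the variance, a single number per group (its loading on that factor) summarizes performance across all the tasks, which is exactly the structure that justifies talking about a group’s c.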

Obviously c is a very promising concept, one for which much work still needs to be done.


 

 

Thinking About Politics

July 11, 2017

This is the ninth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking About Politics is a chapter in this book.

HM remembers when the Affordable Care Act was being debated, a woman was asked what she thought about it. She remarked that she was strongly in favor of it. However, when she was asked about Obamacare, she said that she was strongly against it. Such is the state of politics in the United States. A survey by the Kaiser Family Foundation in April 2013 found that more than 40% of Americans were not even aware that the Affordable Care Act was law (12% thought it had been repealed by Congress—it hadn’t).

Drs. Sloman and Fernbach write that public opinion is more extreme than people’s understanding justifies. The Americans who most strongly supported military intervention in Ukraine in 2014 were the ones least able to identify Ukraine’s location on a map. A survey out of Oklahoma State University’s Department of Agricultural Economics asked consumers whether the labeling of foods produced with genetic engineering should be mandatory; 80% of the respondents thought that it should. But 80% also approved of a law mandating labels on foods containing DNA, believing that people have the right to know if their food has DNA. So these respondents thought that all meats, vegetables, and grains should be labeled “BEWARE HAS DNA.” But we would all die if we avoided foods that contain DNA.

We all need to appreciate how little we understand. The authors write, “Taken to its extreme, the failure to appreciate how little we understand, combined with community support, can ignite really dangerous mechanisms. You don’t have to know much history to know how societies can become caldrons in an attempt to create a uniform ideology, boiling away independent thinking and political opposition through propaganda and terror. Socrates died because of a desire for ancient Athenians to rid themselves of contaminated thinking. So did Jesus at the hands of the Romans. This is why the first crusades were launched to free Jerusalem of the infidel, and why the Spanish Inquisition drove Jews and Muslims to convert to Christianity or leave Spain between 1492 and 1501. The twentieth century was shaped by the demons of ideological purity, from Stalin’s purges, executions, and mass killings to Mao’s Great Leap Forward: the herding of millions of people into agricultural communes and industrial working groups, with the result that many starved. And we haven’t even mentioned the incarcerations and death camps of Nazi Germany.”

The authors write, “Proponents of political positions often cast policies that most people see as consequentialist in values-based terms in order to hide their ignorance, prevent moderation of opinion, and block compromise.” They note the health care debate as a perfect example of this. Most people just want the best health care for the most people at the most affordable price. The national conversation should be about how to achieve this. But that might be technical and boring. So politicians and interest groups make it about sacred values. One side asks whether the government should be making decisions about our health care, focusing the audience on the importance of limited government. The other side asks whether everybody in the country deserves decent health care, focusing on the value of generosity and preventing harm to others. The authors say that both sides are missing the point. All of us have similar values: we want to be healthy, we want others to be healthy, and we want doctors and other medical professionals to be compensated, but we don’t want to pay too much. The health care debate should not be about basic values, because in most people’s minds basic values are not the issue. The issue is the best way to achieve the best outcomes.

Ideologies and ideologues are the bane of effective government. They constrain alternatives and blind us to obvious solutions. As mentioned in the second post in this series, other advanced countries have effectively addressed the problem of health care with a single-payer system in which that single payer is the government. There are already proven examples from which to choose. But in the United States, ideology has deemphasized the role of government, and the single-payer system is regarded as a radical solution.


 

Thinking About Science

July 9, 2017

This is the eighth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking About Science is a chapter in this book.

Were it not for science and, more importantly, scientific thinking, we would still be living in the Dark Ages. Our wealth and health are due to scientific thinking. Yet knowledge about science and belief in scientific facts are lacking. HM admires the Amish. Although they reject science, they live the humble lives dictated by their beliefs. Too many others enjoy the fruits of science yet reject scientific methods and findings. Their lack of respect for science exposes us to the continued risks of global warming and puts unvaccinated children at risk, to name just two problems.

In 1985, Walter Bodmer, a German-born geneticist, who is a professor at Oxford University in the UK, was appointed by the Royal Society of London to lead a team to evaluate the current state of attitudes toward science and technology in Britain. The Royal Society was concerned about antiscientific sentiment in Britain, seeing it as a serious risk to societal well-being. The results and recommendations of the study were published in a seminal paper known as the Bodmer Report.

Previous research had focused primarily on measuring attitudes directly, but Bodmer and his team argued for a simple and intuitive idea: opposition to science and technology is driven by a lack of understanding. So by promoting a better understanding of science, society can promote more favorable attitudes and take better advantage of the benefits afforded by science and technology. This idea about science attitudes is called the deficit model. According to this model, antiscientific thinking is due to a knowledge deficit; once the deficit is filled, antiscientific attitudes will be mitigated or will disappear.

The paucity of scientific knowledge and the abundance of antiscientific beliefs have been documented in all societies that have been studied. There is only a weak relationship between scientific knowledge and attitudes about science, and interventions premised on the deficit model have failed. In spite of the millions of dollars spent on research, curriculum design, outreach, and communication, little to no headway has been made.

HM thinks that science is so vast and continually expanding that the deficit is simply too large to fill. Although scientists are knowledgeable in their specialties, as they move away from those specialties their knowledge falls off.

But there is another explanation: scientific attitudes are not based on the rational evaluation of evidence, so providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors.

These two explanations are not mutually exclusive. They are likely both operative.

One of the leading voices promoting this new perspective is Dan Kahan, a Yale law professor. He argues that our attitudes are not based on rational, detached evaluation of evidence because our beliefs are not isolated pieces of data that we can take and discard at will. Instead, these beliefs are intertwined with other beliefs, shared cultural values, and our identities. To discard a belief means discarding a whole host of other beliefs, forsaking our communities, going against those we trust and love, and virtually challenging our identities.

Drs. Sloman and Fernbach flesh out this theory with the story of Mike McHargue, a podcaster and blogger who goes by the moniker Science Mike. Mike once attended a fundamentalist church and held fundamentalist beliefs. When he reached his thirties, he began reading the scientific literature, and his faith in those beliefs began to waver. His initial reaction was to lose his faith completely, but for a long time he kept his new beliefs from his community. Eventually a personal experience helped him rediscover his faith, and he is now, once again, a practicing Christian, but he continues to reject his fundamentalist church’s antiscientific beliefs.

Here is Science Mike’s response to a caller who has begun to question many of his beliefs:

Do I have advice on how to live when you’re at odds with your community? Absolutely. Do not live at odds with your community… You are a time bomb right now. Because at some point you won’t be able to pretend anymore, and you will speak honestly, and there will be a measure of collateral damage and fallout in your church. It’s time to move on. It’s time to find a faith community that believes as you believe…When that happens, you’re going to lose relationships. Some people cannot agree to disagree and those relationships can become abusive…There is a lot of pain because there are some people who are dear to me that I can’t talk to anymore…It is not possible for us to have the relationship we once had, and it’s rough. I’m not gonna lie. It’s rough.

This poignant response provides useful and important advice.

HM accepts the fundamental thesis of Drs. Sloman and Fernbach, that our knowledge is inadequate. Scientific evidence can be wrong, but at any given time, the scientific evidence available is the best information to use. We ignore it at our peril.


Thinking with Technology

July 8, 2017

This is the seventh post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking with Technology is a chapter in this book. Much has already been written in this blog on this topic, so this post will try to hit some unique points.

In the healthy memory blog, Thinking with Technology comes under the category of transactive memory, as information in technology, be it paper or the internet, falls into this category. Actually, Thinking with Other People also falls into this category, as transactive memory refers to all information not stored in our own biological brains. Sloman and Fernbach recognize this similarity when they write that we are starting to treat our technology more and more like people, like full participants in the community of knowledge. Just as we store understanding in other people, we store understanding in the internet. We already know that having knowledge available in other people’s heads leads us to overrate our own understanding. We live in a community that shares knowledge, so each of us individually can fail to distinguish whether knowledge is stored in our own head or in someone else’s. This is the illusion of explanatory depth, viz., I think I understand things better than I do because I incorporate other people’s understanding into my assessment of my own understanding.

Two different research groups have found that we have the same kind of “confusion at the frontier” when we search the internet. Adrian Ward of the University of Texas found that engaging in internet searches increased people’s cognitive self-esteem, their sense of their own ability to remember and process information. Moreover, people who searched the internet for facts they didn’t know and were later asked where they found the information often misremembered and reported that they had known it all along. Many completely forgot ever having conducted the search, giving themselves credit instead of Google.

Matt Fisher and Frank Keil conducted a study in which participants were asked to answer a series of general causal knowledge questions like, “How does a zipper work?” One group was asked to search the internet to confirm the details of their explanation. The other group was asked to answer the questions without using any outside sources. Next, participants were asked to rate how well they could answer questions in domains that had nothing to do with the questions they were asked in the first phase. The finding was that those who had searched the internet rated their ability to answer unrelated questions as higher than those who had not.

The risk here should not be underestimated. Interactions with the internet can result in our thinking we know more than we actually know. It is important to make a distinction between what is accessible in memory and what is available in memory. If you can provide answers without consulting any external sources, then the information is accessible and is truly in your personal biological memory. However, if you need to consult the internet, some other technical source, or some individual, then the information is available but not accessible. This is the difference between a closed book test and an open book test. Unless you can perform extemporaneously and accurately, be sure to consult transactive memory.

Sloman and Fernbach have some unique perspectives. They discount the risk of super intelligence threatening humans, at least for now. They seem to think that there is no current basis for some real super intelligence taking over the world. The reason they offer for this is that technology does not (yet) share intentionality with us. HM does not quite understand why they argue this, and, in any case, the “yet” is enclosed in parentheses, implying that this is just a matter of time.

To summarize succinctly, technology increases our knowing more than we know. In other words, it increases the knowledge illusion.


Thinking with Other People

July 7, 2017

This is the sixth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. Thinking with Other People is a chapter in this book. The evolution of modern humans from other species of hominids was extremely rapid on an evolutionary time scale. It began with the emergence of the genus Homo on the African savannah 2 to 3 million years ago. Sloman and Fernbach note that the great leap that humanity took during that period was cognitive. The brain mass of modern humans is about three times that of our early hominid ancestors.

A compelling hypothesis, the social brain hypothesis, is that the driving force of the evolution of human intelligence was the coordination of multiple cognitive systems to pursue complex, shared goals. Living in a group confers advantages, such as cooperative hunting, but it demands certain cognitive abilities: the ability to communicate in sophisticated ways, to understand and incorporate the perspectives of others, and to share common goals. According to the social brain hypothesis, the cognitive demands and adaptive advantages associated with living in a group created a snowball effect: as groups got larger and developed more complex joint behaviors, individuals developed new capabilities to support those behaviors, which in turn allowed groups to get even larger and allowed group behavior to become even more complex.

Anthropologist Robin Dunbar, whom we have encountered previously in healthy memory blog posts, tested the social brain hypothesis against the ecological hypothesis. For many species of primates he collected data on brain size, as well as facts about the environment they live in, such as the extent of their roaming territory and their dietary habits, and facts about their societies, such as their average group size. Brain size and group size turned out to be closely related: primate species that live in large groups have bigger brains. Environmental measures such as territory size and diet were unrelated to brain size.

Increased brain size led to language, and what sets people apart from other species is the ability to communicate ideas of arbitrary complexity seamlessly. Members of a hunting party need to understand the intentions of the others in the party so that each can play their respective roles.

Sloman and Fernbach argue that we humans have the unique capability of shared intentionality. They argue that this is an ability no other machine or cognitive system has: we can share our attention with someone else. When we interact with one another, we do not merely experience the same event; we also know we are experiencing the same event. And this knowledge that we are sharing our attention changes more than the nature of the experience; it also changes what we do and what we’re able to accomplish in conjunction with others.

Sloman and Fernbach continue, “Sharing attention is a crucial step on the road to being a full collaborator in a group sharing cognitive labor, in a community of knowledge. Once we can share attention, we can share common ground. We know some things that we know others know, and we know that they know that we know (and of course we know that they know that we know, etc.). The knowledge is not just distributed; it is shared. Once knowledge is shared in this way, we can share intentionality; we can jointly pursue a common goal. A basic human talent is to share intentions with others so that we can accomplish things collaboratively.” HM thinks that Sloman and Fernbach are describing the ideal situation. It is not unusual for consultants and training to be required to make this happen. And many organizations continue to function in a state that is far from ideal.

Sloman and Fernbach note that the knowledge illusion is the flip side of what economists call the curse of knowledge. When we know something, we find it hard to imagine that someone else doesn’t know it. The curse of knowledge sometimes comes in the form of hindsight bias. “The curse of knowledge is that we tend to think what is in our heads is in the heads of others. In the knowledge illusion, we tend to think what is in others’ heads is in our heads. In both cases, we fail to discern who knows what.”


The Two Causal Reasoners Inside

July 5, 2017

This is the fourth post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach. The title of this post is identical to the title of a section in that book. Drs. Sloman and Fernbach state that we are engaged in some type of causal reasoning almost all the time, but that not all causal reasoning is the same. Some of it is fast, quick and automatic, as when a man concludes that his hand hurts because he bashed it against the wall. Another type of causal reasoning is slow and deliberate, as when we try to remember the causes of WWI.

This two-process distinction goes beyond causal reasoning and can be applied to all cognitive processing. Daniel Kahneman formulated this distinction in his best-selling book “Thinking, Fast and Slow.” There have been many previous posts on this topic; there are sixty-nine hits using Kahneman in the healthy memory blog search box. Normal conversation, driving, and skilled performance are dominated largely by System 1, which is called intuition. When we have to stop and think about something, that is an example of System 2 processing, which is called reasoning. The psychologist Keith Stanovich breaks System 2 processing down into instrumental and epistemic processing in his efforts to develop a Rational Quotient (RQ) that improves upon the standard IQ.

Professor Shane Frederick has introduced a simple test to determine whether a person is more intuitive or more deliberative. It’s called the Cognitive Reflection Test (CRT). Here’s an example problem.

A bat and a ball together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

Do you think the answer is 10 cents? If you do, you’re in good company. Most people report that as the answer (including the majority of students at Ivy League colleges). 10 cents pops into almost everyone’s mind. This is the product of System 1 processing. However, if System 2 is engaged, one realizes that if the ball costs 10 cents and the bat costs $1 more than the ball, then the bat costs $1.10 and together they cost $1.20. So 10 cents is the wrong answer. The small proportion of people whose System 2 processes kick in realize that 10 cents is wrong, and they are able to calculate the correct answer, 5 cents. Frederick refers to such people as reflective, meaning that they tend to suppress their intuitive response and deliberate before responding.
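The algebra behind the correct answer can be checked in a few lines of Python (HM’s sketch; the variable names are our own):

```python
# Let ball be the ball's price in dollars; the bat costs $1.00 more,
# and together they cost $1.10:  ball + (ball + 1.00) = 1.10
ball = (1.10 - 1.00) / 2   # solve 2*ball + 1.00 = 1.10
bat = ball + 1.00

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
assert abs((ball + bat) - 1.10) < 1e-9          # they sum to $1.10
assert abs((bat - ball) - 1.00) < 1e-9          # the bat costs $1 more

# The intuitive System 1 answer fails the check:
intuitive_ball = 0.10
assert intuitive_ball + (intuitive_ball + 1.00) > 1.15  # sums to $1.20, not $1.10
```

The failed check on the last line is exactly the realization that System 2 supplies.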

Here is another CRT problem.

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake?

The answer 24 comes to most people’s mind. But if the patch doubles in size every day, then if the lake is half covered on day 24, it would be fully covered on day 25. But the problem states that the lake is fully covered on day 48, so 24 can’t be correct. The correct answer is one day before it’s fully covered, day 47.

Here’s another CRT problem.

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Try to solve this on your own.

The correct answer is 5 minutes (each machine takes 5 minutes to make one widget).
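The rate reasoning behind this answer can also be made explicit in Python (a sketch; the function name is our own):

```python
# Each machine makes 1 widget in 5 minutes, so machines and widgets scale together.
MINUTES_PER_WIDGET_PER_MACHINE = 5

def time_to_make(widgets, machines):
    """Minutes for `machines` machines working in parallel to make `widgets` widgets."""
    widgets_per_machine = widgets / machines
    return widgets_per_machine * MINUTES_PER_WIDGET_PER_MACHINE

assert time_to_make(5, 5) == 5       # the fact given in the problem
assert time_to_make(100, 100) == 5   # same per-machine workload, same time
print(time_to_make(100, 100))  # 5.0
```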

The solution to all three problems requires the invoking of System 2 processing. Less than 20% of the U.S. population gets all three problems correct. This finding might reflect a reluctance to think and might account for many of the problems the United States is facing. About 48% of students at the Massachusetts Institute of Technology (MIT) got all three problems correct, but only 26% of Princeton students did.


The Illusion of Understanding

July 4, 2017

This is the third post in the series The Knowledge Illusion: Why We Never Think Alone (Unabridged), written by Steven Sloman and Philip Fernbach.

In the 1980s the cognitive scientist Thomas Landauer decided to estimate the size of human memory on the same scale that is used to measure the size of computer memories. He used several clever techniques to measure how much knowledge people have. For example, he estimated the size of an average adult’s vocabulary and calculated how many bytes would be required to store that much information. Then he used that result to estimate the size of the average adult’s entire knowledge base. The answer was half of a gigabyte. Currently HM is looking at his USB flash drive with 32 gigabytes of storage.

Dr. Landauer did another study in which he measured the difference in recognition performance between a group that had been exposed to items and a group that had not. This difference is as pure a measure of memory as one can get. He measured the amount of time people spent learning the material in the first place. This told him the rate at which people are able to acquire information that they later remember. He also found a way to take into account that people forget. His analyses found that people acquire information at roughly the same rate regardless of the details of the experimental procedure or the type of material being learned. People learn at approximately the same rate whether the items are visual, verbal, or musical.

Then Dr. Landauer calculated how much information people have on hand, the size of their knowledge base, by assuming they learn at this same rate over the course of a seventy-year lifetime. A result on the order of a gigabyte was obtained by every technique he tried. This number is just a tiny fraction of what a modern laptop can retain.
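A back-of-envelope version of the lifetime calculation can be sketched in Python. The inputs below (an acquisition rate of about 2 bits per second and 16 waking hours a day) are illustrative assumptions on our part, not the book’s exact figures, but they land in the same ballpark:

```python
# Back-of-envelope version of Landauer's lifetime estimate.
# Assumed inputs (illustrative, not the book's exact figures):
BITS_PER_SECOND = 2        # rough rate of acquiring information later remembered
WAKING_HOURS_PER_DAY = 16
YEARS = 70

waking_seconds = YEARS * 365 * WAKING_HOURS_PER_DAY * 3600
total_bits = BITS_PER_SECOND * waking_seconds
total_gigabytes = total_bits / 8 / 1e9

print(f"{total_gigabytes:.2f} GB")  # 0.37 GB, i.e. a fraction of a gigabyte
```

Changing the assumed rate or waking hours shifts the answer, but any plausible choice stays within an order of magnitude of a gigabyte, which is the point of the estimate.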

Drs. Sloman and Fernbach note that this is only shocking if you believe the human mind works like a computer. The model of the mind as a machine designed to encode and retain memories breaks down when you consider the complexity of the world with which we interact. They conclude that it would be futile for memory to be designed to hold tons of information because there’s just too much out there.

Drs. Sloman and Fernbach note that most of cognition consists of intuitive thought that occurs below the level of conscious awareness. Huge quantities of information are processed in parallel. People are not computers in that we don’t just rely on a central processor that reads and writes to a memory in order to think. We rely on our bodies, on the world around us, and on other minds. There’s no way we could store in our heads all there is to know about our environment.

We humans are surprisingly ignorant, more ignorant than we think. We also exist in a complex world, one that is even more complex than one might have thought. But if we’re so ignorant, how can we get around, sound knowledgeable, and take ourselves seriously while understanding only a tiny fraction of what there is to know?

The authors’ answer is that we do so by living a lie. We ignore complexity by overestimating how much we know about how things work, by living life in the belief that we know how things work even when we don’t. We tell ourselves that we understand what’s going on, that our opinions are justified by our knowledge, and that our actions are grounded in justified beliefs even though they are not. We tolerate complexity by failing to recognize it. That’s the illusion of understanding.