Posts Tagged ‘Donald Rumsfeld’


July 18, 2017

A previous post titled “Making People Smart” discussed a course entitled “Ignorance” that has been taught at Columbia University. Guest scientists are invited to speak about what “they don’t know, what they think is critical to know, how they might get to know it, what will happen if they do find this or that thing out, or what might happen if they don’t.” The course focuses on all that is not in the textbooks and thus guides students to think about what is unknown and what could be known. The idea is to focus not on what students themselves don’t know, but on what entire fields of science don’t know, with the aim of provoking and directing students to ask questions about the foundations of a scientific field. The course requires that students ponder not just some set of scientific theories and ideas; it requires that they begin to understand what the entire community has and hasn’t mastered. The course is taught by Stuart Firestein, who has published the book on which it is based. The book is titled, appropriately enough, “Ignorance: How It Drives Science.”

HM has read the book. It is well written and fairly short. HM recommends this book to anyone who is interested in science. HM will not be writing further posts on this book, but he will be using it as a point of departure to consider a larger truth the book holds. Previous healthy memory blog posts have noted that the rapid improvements in life on this planet come from science, or more particularly, scientific thinking. Truly effective and innovative scientific thinking comes from looking for areas of ignorance, information that we don’t have. In the words of Donald Rumsfeld, these are unknown unknowns.

Contrast this with how most of us think. We comfort ourselves with what we know. We like to stay close to home. However, growth mindsets encourage us to grow our knowledge and skills continually until the end of life. Staying active and continuing to learn is one of the best, if not the best, means of warding off dementia. Another way of looking at this is to identify areas of our own ignorance and pursue those we find most compelling.

Businesses that survive know that it is important to adapt constantly to change. Adaptation is needed if they are, at worst, to survive, or, at best, to thrive. So they examine areas of ignorance and decide which to pursue.

What is generally ignored is that governments also need to adapt to survive. They need to identify areas of ignorance and address them, or difficulties will be encountered and, eventually, even survival will be threatened. This is not to say that conservative views are not valued. But their role is to preclude foolish pursuits, not to preclude addressing pressing problems with new ideas.

A good example of this is the problem of healthcare in the United States. The problem is severe as the United States has the most expensive healthcare in the world, but health statistics characteristic of third world countries. Every advanced country has successfully addressed healthcare and is providing healthcare for all its citizens via a single payer system in which the single payer is the government.
It is unbelievable that in the United States there are people who do not think that healthcare is a right for all people. HM regards people who do not believe this as moral degenerates, and that goes double for such people who profess religious faiths. One party believes in market forces, which are very effective under many circumstances, but not for healthcare. Yet they continue to believe that market forces are universally applicable. When someone has a hammer, everything looks like a nail, and that is the problem here.

Perhaps it is ironic that communism is an ideology that failed because, among other factors, it was too widely applied and ignored market forces. The lesson here is that ideologies preclude effective thinking, and ideologues are the bane of an effective democracy. Think and look around. There are many effective examples of universal healthcare in all advanced countries, except one, the good old USofA.

© Douglas Griffith, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and with appropriate and specific direction to the original content.


A Field Guide to Lies

January 16, 2017

A Field Guide to Lies is a recent book by Daniel J. Levitin.  The  subtitle is “Critical Thinking in the Information Age.”  This information age is embedded in an age of lies.  Hence Levitin’s book is most timely.  One of Levitin’s previous books is “The Organized Mind.”  This book was reviewed in previous healthy memory blog posts.  To find relevant posts enter “Levitin” into the search box of the healthy memory blog.

The importance of being able to think critically in this age of lies cannot be overstated.  The first part of  “A Field Guide to Lies”  is titled “Evaluating Numbers.”  Here he discusses the role of plausibility in the assessment of numerical values.  They should be read critically and subjected to sanity checks.  He has a section titled “Fun with Averages” which illustrates how averages can be used to mislead.  Similar tricks can be done with graphs, which he addresses in a section titled “Axis Shenanigans.”  There are hijinks in how numbers are reported that need to be understood if one is to think critically.  Shenanigans and hijinks can occur early on when the numbers are collected.  As virtually all information is probabilistic, probabilities need to be understood.  People need to be able to think probabilistically, and Levitin provides advice as to how to proceed.
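One classic way averages mislead, of the kind Levitin discusses, is when a skewed distribution makes the mean unrepresentative of the typical case. A minimal sketch, with made-up salary figures (not taken from the book):

```python
# Hypothetical salaries: one outlier drags the mean far above
# what a "typical" employee earns, while the median resists it.
salaries = [30_000, 32_000, 35_000, 38_000, 40_000, 1_000_000]

mean = sum(salaries) / len(salaries)

def median(values):
    """Middle value of a sorted list (average of the two middle values
    when the list has an even number of elements)."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

print(f"mean:   {mean:,.0f}")              # 195,833 -- inflated by the outlier
print(f"median: {median(salaries):,.0f}")  # 36,500 -- closer to the typical case
```

A claim that "average pay here is nearly $200,000" would be arithmetically true and still deeply misleading; a sanity check against the median exposes the trick.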

Part Two is titled “Evaluating Words.”  It begins by discussing how we know.  Particularly in this age of misinformation and of organizations whose mission it is to mislead, it is important to identify expertise.  It is also important to identify potential motivation behind a given expertise.  A common failure is not to consider alternative explanations, and when they are considered, to undervalue them.  The final section in Part Two is titled Counterknowledge.  HM thinks that this section might have the wrong title.  Although most certainly there is legitimate counterknowledge, today counterknowledge is often a set of well-conceived and well-designed lies.  Very frequently, these lies are outlandish, yet they are still believed.

Part Three is titled “Evaluating the World.”  The best way of evaluating the world is with science.  Consequently, “How Science Works” is the title of the first section.  The section on logical fallacies is HM’s  favorite.  For many years HM has been annoyed at Dr. Watson asking Holmes how he deduced something or other.  Apparently, Arthur Conan Doyle did not understand what deduction is.  Deduction is drawing a correct conclusion from a set of premises.  But this is not what Holmes did.  Holmes used abduction to solve crimes.  That is, he came up with a conjecture or hypothesis, which he then tested against the evidence.

Knowing what you don’t know is another subsection of Evaluating the World.  Remember Rumsfeld: “…as we know, there are known knowns;  there are things we know we know.  We also know that there are known unknowns; that is to say we know there are some things that we do not know.  But there are also unknown unknowns—the ones that we don’t know we don’t know.”  To these statements Levitin adds, “A final class that Secretary Rumsfeld didn’t talk about are incorrect knowns—things  that we think are so, but aren’t.  Believing false claims falls into this category.  One of the biggest causes of bad, even fatal, outcomes is belief in things that are untrue.”  To this, HM would add that most of what we know is probabilistic, not absolute, and this complicates the thinking process further.

Bayesian thinking is needed.  Levitin discusses Bayesian thinking in Science and Court, and illustrates this thinking with Four Case Studies.  However, Bayesian thinking is not restricted to just Science and Court.  It should be part of our daily thinking.  Fortunately Levitin dedicates an appendix to the Application of  Bayes’ Rule.
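Bayes’ rule tells us how to update a prior belief in light of new evidence. A minimal sketch with illustrative numbers (the scenario and figures are hypothetical, not drawn from Levitin’s appendix):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(hypothesis | positive evidence) via Bayes' rule.

    prior               -- P(hypothesis) before seeing the evidence
    sensitivity         -- P(evidence | hypothesis true)
    false_positive_rate -- P(evidence | hypothesis false)
    """
    p_evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_evidence

# A disease affecting 1% of people, and a test that catches 90% of
# cases but false-alarms on 9% of healthy people.
posterior = bayes_posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09)
print(f"P(disease | positive test) = {posterior:.3f}")  # about 0.092
```

The counterintuitive result, that a positive result from a fairly accurate test still leaves the probability of disease below ten percent, is exactly the kind of probabilistic reasoning daily thinking tends to get wrong: because the disease is rare, most positives come from the much larger healthy population.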

Levitin’s book provides a good introduction to critical thinking.  Unfortunately we live in an era where lying is epidemic and lying has become a business.  The next post is titled “Lies Incorporated.”

© Douglas Griffith, 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and with appropriate and specific direction to the original content.