Facts, Fiction and the Fallible Mind: Q & A with Dr. Steven Sloman
Internationally renowned cognitive scientist Steven Sloman, Ph.D., was a guest on the May 28 episode of the weekly behavioral health show “In Your Right Mind,” which airs every Sunday at 5 p.m. on radio station KABC. In the episode, titled “Why Facts Don’t Change Our Minds,” Dr. Sloman drew on a wealth of knowledge and experience in discussing the common disconnect between knowledge and behavior with Drs. Lee McIntyre and Tonmoy Sharma and co-host Kristina Kuestner. The show can be heard on demand for free.
Understanding how the human mind perceives and incorporates knowledge helps to explain reasoning and decision-making, and why people do things that contradict what they think they know. Dr. Sloman and his former student Philip Fernbach, Ph.D., explore this concept and more in their new book, The Knowledge Illusion: Why We Never Think Alone, published by Penguin Random House. We recently sat down with Dr. Sloman to learn more about his work, how people mistake other people’s knowledge for their own, and the discrepancy between facts and the belief systems that guide everyday behavior.
Q. Dr. Sloman, thank you so much for taking the time again to speak with us about your fascinating work and how it not only applies to every person but also influences society at large. On the show, you discussed how personal knowledge and understanding are often an illusion; people think they understand things better than they really do. In fact, your research findings revealed that people may confuse other people’s knowledge for their own. Could you please give a few more examples of common situations that illustrate this tendency?
A. The obvious example is in the classroom. Students sometimes think they understand because the teacher or other students understand. But as soon as they leave the classroom and are responsible for the material themselves, they discover they know less than they thought. Another example is when discussing politics. Often people at the dinner table speak as if they are experts on the issue being discussed. In fact, they have the illusion that they understand because they think that others that share their political view understand. They are often not taken to task, because nobody else around the table really understands the issue either.
Q. Would you say that some people overestimate their knowledge more than others? If so, are there any demographic or psychiatric characteristics that would cause certain people to overestimate their personal knowledge?
A. We have evidence that people who are more reflective are less likely to overestimate their understanding. More reflective people don’t just respond with the first thing that comes to mind, but instead think before they speak. There are simple tests one can use to identify such people. The vast majority of people are not reflective.
Q. In this current era of the internet, information wars and so-called “fake news,” how can people know if what they believe is true?
A. It’s usually not so hard. Don’t believe things just because the people around you report them. Instead, see what the sources of the news are. Both scientists’ and mainstream media’s reputations rest on being credible, and so usually they can be trusted. Those things that are acknowledged by outlets that represent a spectrum of political points of view are more likely true than information that comes solely out of outlets that toe a common political line.
Q. In your latest book written with Dr. Fernbach, The Knowledge Illusion: Why We Never Think Alone, you explain how people remain ignorant of their own ignorance. When asked to explain in detail topics they claim to understand, they are often unable to do so. Would you say that the only way to obtain true knowledge is through personal experience?
A. Absolutely not. Most of our knowledge is housed elsewhere: in other people’s heads, in books, on the internet, and in other places. You can’t trust every source of information, but you need to trust some in order to be educated. For instance, you can’t directly experience the fact that men have walked on the moon or Newton’s laws of motion. You need to learn these things from other people.
Q. If enough people believe certain “facts,” do they then become reality for that group of people?
A. Some facts are like that, and some justifiably so. For instance, if enough people believe that money has value, then it does have value. Other things take on a questionable reality. For instance, certain cults believe that the world is going to end on a certain day. Such beliefs are perceived as reality for a while, but then actual reality sets in (if the members of the group survive their day of reckoning). Much of our reality is not determined by a group’s beliefs but by the laws that govern the universe. You can’t just believe the free market will provide everyone with health care in order for everyone to have health care.
Q. In your book, you recommend sticking to certain “rules of thumb” when making decisions to avoid the confusion that ignorance can introduce to situations. Could you give us an example of this?
A. It’s hard for people to appreciate how important it is to start saving for retirement as early as possible. Money saved grows at an accelerating rate over time, because the returns themselves earn returns. This is a hard concept to grasp. So, instead of trying to understand exactly how money compounds, it’s a good idea just to put away a little money on a regular basis. Here’s another example: It’s hard to appreciate how bad Coke is for you, especially when you want some. So, a good rule of thumb is: don’t buy Coke when you go shopping. Keep it unavailable in your household. Shape your environment to encourage good behavior, because our decisions are generally governed by whatever environment we find ourselves in. If everyone around us is smoking, we’re more likely to smoke.
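The accelerating growth Dr. Sloman describes can be made concrete with a short calculation. The sketch below uses the standard future-value formula for regular deposits; the 7% annual return and $100 monthly deposit are hypothetical figures chosen purely for illustration, not numbers from the book.

```python
def future_value(monthly_deposit, annual_rate, years):
    """Future value of a fixed monthly deposit with monthly compounding."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of deposits
    return monthly_deposit * ((1 + r) ** n - 1) / r

# Hypothetical scenario: $100/month at a 7% annual return.
early = future_value(100, 0.07, 40)  # starts saving at 25, retires at 65
late = future_value(100, 0.07, 20)   # starts saving at 45, retires at 65

# Saving for twice as long yields far more than twice the money,
# because growth compounds on itself.
print(f"Start early: ${early:,.0f}")
print(f"Start late:  ${late:,.0f}")
```

The point of the comparison is that doubling the saving period more than quadruples the final balance under these assumptions, which is exactly the counterintuitive effect the rule of thumb (“just put away a little money regularly, starting now”) sidesteps having to reason about.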
Q. You also recommend that individuals be aware of their potential ignorance and remain open-minded and realistic. That sounds like sound advice for all of us to take, especially for those in powerful positions whose decisions affect many people. Is there any advice you would give to our world leaders regarding the knowledge illusion?
A. I think it’s important for leaders to understand the limitations of their own knowledge and expertise. Good leaders know how to ferret out true expertise and incorporate it into their decision-making. It’s ok not to know things; nobody knows very much. The trick is to be able to identify experts and incentivize them to tell the truth.
Q. Is there anything else you would like our readers to know?
A. It’s ok to be ignorant. Everybody is relatively ignorant because the world is complex beyond our imagination. Usually, it’s also ok to live with the illusion that we have more knowledge than we do. It can be empowering, and can make a person more interesting. But it’s critical not to take it too far, because it can have deadly consequences. When individual hubris leads to arguments, terrorism and even war, it’s clear that puncturing illusions is called for.
About Steven Sloman, Ph.D.
Steven Sloman, Ph.D., is a computationally oriented cognitive scientist, Professor of Cognitive, Linguistic and Psychological Sciences at Brown University, and an expert in how people think. Much of his work in recent years has focused on how people reason causally about the world. These interests are reflected in his many scientific papers and two of his important books, Causal Models: How People Think About the World and Its Alternatives and The Knowledge Illusion: Why We Never Think Alone. His current work concerns ignorance and the community of knowledge. Dr. Sloman also serves as the current Editor-in-Chief of the journal Cognition.