Photo credit: Adapted by WhoWhatWhy from Kalvicio de las Nieves / Flickr (CC BY-NC 2.0)

A look at why facts, data, and truth have little to do with what we believe.


We like to think that humans are rational beings. That we process information based on data, a common set of facts that allows us to determine what we call objective truth. The problem is that scientific research shows this is not true.

According to our guest on this week’s WhoWhatWhy podcast, Dr. Gaurav Suri, a psychologist and computational neuroscientist at San Francisco State University, we reach our conclusions based on factors that seldom have to do with truth.

Suri, a Stanford Ph.D. who studies the brain mechanisms that shape motivated actions, discusses how people with the same set of facts can reach different conclusions, with the differences driven mostly by their preexisting motivations.

He details how emotion shapes the way we interpret data and make decisions, and argues that pattern recognition and tribalism, more than facts, are what really shape our beliefs; our cognition then supplies support for conclusions we have already reached. When the facts get in the way of what we believe, our brains struggle to rid us of this “cognitive dissonance,” often through denial.

In the course of this fascinating conversation Suri explains how our social relationships are the key driver of our beliefs: What people we trust say matters more than the facts under our nose. He adds that the speed of information today reinforces the worst aspects of how we think. It accentuates what he calls our “automated thinking” and tamps down our “control-based thinking.” Thinking and processing information at today’s speeds is, according to Suri, as if we are constantly driving 60 mph on an icy road.

An additional factor, Suri says, is our mobility — which means we can (and do) move to live and work with people who believe as we do. This creates, he says, an amplification that further erodes critical thinking, which ultimately leads to groupthink and, at worst, mob action.

The solution to this inborn aspect of human nature, he says, is systematic doubt — learning to question the views we hold while remembering to respect those who believe differently. Given the tinderbox nature of today’s world, anything less will doom us.

Click HERE to Download Mp3


Full Text Transcript: 

As a service to our readers, we provide transcripts with our podcasts. We try to ensure that these transcripts do not include errors. However, due to time constraints, we are not always able to proofread them as closely as we would like. Should you spot any errors, we’d be grateful if you would notify us. 

Jeff Schechtman: Welcome to the WhoWhatWhy podcast. I’m your host Jeff Schechtman. George Orwell said that to see what’s in front of one’s nose needs a constant struggle. Think about why it’s so hard. We think that we take in the information around us, analyze it, and make decisions. If the decision is wrong, it’s because we didn’t have enough facts, or we misinterpreted the data. The problem is that that’s not true. It’s not the way our brains work.

Jeff Schechtman: The events of the past few weeks have certainly shown us that. It was never true, and today, less so. When information is coming at us like water from a fire hose, through pipes that are open 24/7, when TV news wakes us up and doom-scrolling Twitter puts us to sleep, there are many other forces at play. We don’t always follow the facts; sometimes we make decisions based on our instincts, our tribe, and simply the language that’s used to present things to us. The cognitive dissonance between facts and feelings, though, should not doom us. At least not if we understand how it works and what’s going on.

Jeff Schechtman: To help us understand this, I’m joined by my guest, Gaurav Suri. Dr. Gaurav Suri is an experimental psychologist and a computational neuroscientist at San Francisco State University. He studies the brain’s mechanisms that shape motivated actions and decisions, including value-based decision-making, emotion, and emotional regulation. He has a PhD from Stanford, an MBA from UCLA, a master’s in mathematics and computer science from Stanford, and he’s the author of a mathematical novel entitled A Certain Ambiguity. It is my pleasure to welcome Dr. Gaurav Suri here to the WhoWhatWhy podcast.

Jeff Schechtman: Gaurav, thanks so much for joining us.

Gaurav Suri: My pleasure, Jeff. Thank you for having me.

Jeff Schechtman: One of the things that I think has been so maddening and frustrating over the past several months, with respect to the pandemic, with respect to our politics, is the surprise of seeing people deny facts, dispute facts for which there is clear and solid evidence. Talk a little bit about that first.

Gaurav Suri: Jeff, we have the abiding intuition that human decision-making is about finding the right rational answer, and that we have calculation engines in our brain that, when given certain inputs, are able to calculate the rational answer. This is how we see ourselves. This is how we see other people. Because of that, we feel that if somebody is coming up with answers that we believe are wrong, they’re just missing the information because, given the information we had, we came up with another answer. They’re coming up with their own answer. So clearly, they don’t have the right information. So our instinct is to just give them data.

Gaurav Suri: It turns out that giving people data is probably the least effective way of convincing people to change their minds. And the reason is that our brains are not these rational calculation engines that give us the objective truth in all contexts. That’s not what our brains evolved to do. Our brains are pattern finders. They find statistical regularities in our environment. They’re not in the business of finding some objective truth, and I think that causes a lot of confusion amongst people who believe that “Gee, my brain finds the objective truth. How come this other person’s brain isn’t finding that?” So we try to give more data to more people.

Jeff Schechtman: How much of that has to do with language and how much of it has to do with culture?

Gaurav Suri: I think language and culture are really important inputs into what the brain processes. For example, a culture might give us goals. It might give us some ideals, and these ideals and goals might put constraints on our decision-making.

Gaurav Suri: I think that more than language and culture, motivation is the direct modifier of how we arrive at decisions. So we think that the classic intuition on decision-making is we get some inputs, we process them and we come to an answer. Probably what actually happens is that we have some preexisting motivations and these motivations shape information as it comes into our brain, and sort of guide an answer.

Gaurav Suri: It’s very likely that we are in possession of an answer, and the thinking that we do, the cognition that we do, is really finding support for an answer that we already want. So our intuition might be that we think and come to an answer. The reality might be that our motivations give us an answer and then we find the cognitive support for our preferred answer.

Gaurav Suri: Now, where does motivation come from? Motivation may have various sources. It may, as you point out, come from our culture. It may come from the social tribe we belong to within our culture. It may come from a lifetime of learning, meaning people around us, how they behaved. It may even come from the particular connections in our brain. Some people are more promotion-oriented, more reward-oriented. Other people want to avoid punishment.

Gaurav Suri: And so, people have different motivations for a variety of reasons. But the one thing that’s common across these circumstances is that motivation is a powerful shaper of our decisions, and our cognition seems to support, in a large part, what our motivations are nudging us towards.

Jeff Schechtman: What about the idea of being able to maintain two different ideas in our mind at the same time? One that comes from what we want the answer to be, and one that comes from evidence and data that is presented to us?

Gaurav Suri: Right. In psychology, Jeff, that’s called cognitive dissonance: when we have two ideas that contradict each other. People try to reduce dissonance in several ways. Let me just give an example. Let’s say I’ve been a smoker for many years. I’m not, but let’s say that I have, and I get all of this information that smoking is bad for my health. These are two opposing facts. One fact is saying, I’m a smoker, I like to smoke. The other fact is saying smoking is going to kill me. So what might I do?

Gaurav Suri: What people tend to do, and this has been carefully studied over decades, is they either minimize one of the truths. They might say, “Ah, I’m going to die anyway.” Or they might bring in some other fact, such as, “My grandma smoked two packs a day, but she lived to 103.” The attempt in all of these is to reduce the dissonance.

Gaurav Suri: Now, interestingly, cognitive dissonance has also been observed in the animal kingdom. I mean, non-human primates also show a drive to reduce dissonance. So it need not be happening at this linguistic conscious level. But the fact is, that when we have two opposing ideas, we try to change, minimize, or alter one or both of them so that we don’t have to walk around with both ideas equally powerful in our minds. That has to do with the structure of our networks.

Jeff Schechtman: It does seem like part of the response to this — certainly as we’ve seen it play out with aspects of the pandemic, and as we’ve seen it play out with our politics — is simply denialism.

Gaurav Suri: That is one of the easiest ways to reduce dissonance. In the context of the pandemic, if one group of evidence or one group of people is saying, “You should wear a mask. That reduces your risk of getting this disease,” another group of people is saying, “No, it’s weak to wear a mask.” The way we are going to decide this is not by doing a detailed analysis of the facts about wearing a mask. That may enter into the equation, we may actually read an article or two, but the primary driver of this decision is going to be the social tribe that we’ve previously identified with.

Gaurav Suri: If our social tribe is sort of this science-based community, listening to the experts, we’re much more likely to wear a mask. If our social tribe is different, and in this particular instance President Trump was pretty visible saying he may not wear a mask, that’s a powerful signal. If I’m, let’s say, aligned with that social tribe, I may find motivations not to wear a mask and then find reasons to support my decision. And so, a powerful dissonance is created by listening to and hearing from different social tribes. Again, it’s not necessarily reason that’s going to help us decide which social tribe we’re going to go with, it is prior motivations. And in many contexts, these motivations are very powerful and they sort of recruit our reasoning apparatus to find evidence in support of the answers that we know we want.

Jeff Schechtman: Recruiting that apparatus is kind of a gentle way to put it. In many ways, it’s weaponized by individuals against other groups in a way that uses this built-in bias, this built-in dissonance.

Gaurav Suri: Right, Jeff. What’s been weaponized is how strong the motivations are, not just to belong to one social tribe or the other, but to see the social tribe we don’t belong to as being deeply flawed, evil, un-American. The motivation is not just that I identify with my tribe; the motivation is that I want to see the tribe that I don’t belong to destroyed, or owned, or whatever word one might use. So what’s weaponized is that these motivations have been made so powerful.

Gaurav Suri: Reason requires resources, requires time, requires a certain mindset to establish its slow-acting influence. But if we are facing onslaughts of motivation to belong to one tribe, and let’s say, to hate the other tribe, then you’re right: The motivations have been weaponized against any sort of controlled thinking taking hold. What we use our thinking for is to construct clever arguments on social media that support positions we’ve arrived at via motivated processes.

Jeff Schechtman: How much is trust a part of this?

Gaurav Suri: Most systems in the brain, Jeff, are two-way. People ask, “Did this happen first, or did that happen first?” Well, it turns out that, for most things, A influences B, and B influences A. And so, it’s this reciprocity, which is very common in the brain. We see this a lot.

Gaurav Suri: When we’re hungry, we’re more likely to see restaurants. So the hunger is influencing the perception, and to some extent, the act of seeing the restaurant is influencing our hunger. When we see those tasty-looking pizza slices, we might feel hungry. So that’s a reciprocal process.

Gaurav Suri: So consider trust. If I join a social tribe, I join with some drives — motivational drives, historical drives. And now that I’ve joined it and spent time with it, I’m getting more invested in it.

Gaurav Suri: I am changing the tribe I belong to, and the tribe is changing me. It’s a reciprocal process, and as those interactive processes happen, the word trust comes to mind as a shorthand for saying, “I have so much history. I’ve influenced these people around me a great deal and been influenced by them. And these ties are now stronger.” We refer to these strengthened ties with a shorthand, which is, “I trust them and they trust me.” These lead to strongly influenced and influential groups that engender a great deal of motivation.

Jeff Schechtman: To what extent are these cognitive processes impacted by speed? The fact that the world is different today, that so much information is coming at us so quickly, does that put more pressure on these systems than the world as it was 50 years ago, for example?

Gaurav Suri: Great question. It’s probably accurate to say that the brain has multiple sub-networks, all of which interact with each other to come up with a decision. A simplified view of these multiple interactive systems is just to focus on two. I’ll do that, but I wanted to note that there probably are multiple. These are probably not atomic systems. These are interactive systems. But all that said, there are at least two flavors of thinking. One is fast, automatic, and doesn’t take that many resources.

Gaurav Suri: Imagine that you’ve been driving for many years, so driving, for example, would be an automated process. The other process is slower. It requires a lot of resources, attention. Distraction would really influence this process, and this would be akin to somebody first learning to drive. They have to put in all their resources. My son is just learning to drive, and so I see it. When he goes out on the road, he has to put in every last resource because it’s not this automated thing that it is for me now.

Gaurav Suri: So our attention systems require cognitive resources. They’re slower. The brain networks that make this type of thinking possible require sustained attention, a lack of distraction, and the ability to logically build and continue thoughts for a while until we come up with an answer. We’re looking for errors, we’re making sure that we don’t make errors. It’s like driving on a slippery road. Even if somebody knows how to drive really well, if the road is very slippery, all of a sudden we’re completely focused on not making an error. We can’t be talking on the phone, or be really angry at our passengers, and successfully do that, because it requires all our resources.

Gaurav Suri: On the other hand, we can be chewing gum and be angry or talking on the phone, because chewing gum is an automated process. So these things do require a great deal of sustained attention for them to work. Otherwise, they don’t work.

Gaurav Suri: Your original question was about the fact that we’re distracted. Well, if we’re distracted, we don’t have sustained attention. If we’re used to consuming information in small bites, we don’t have sustained attention. If we have many motivational forces acting on us, we don’t have sustained attention. Our slower, attention-based control system, what I’ll call control-based thinking, is delicate. Automated thinking is strong. Most of what we do is automated thinking, and that doesn’t require that many resources. But the thing is that, for new events, new situations, we need this control-based thinking, and we need to have the resources to bring to bear so we can successfully do it.

Jeff Schechtman: Given our limitations as human beings then, and processing that information and the increasing speed at which it’s happening, doesn’t life become almost full-time like driving on that icy road?

Gaurav Suri: Yeah, I think that’s a great metaphor. I think that acknowledging that possibility, that reality, and recognizing that highly emotional states or distracted states rob us of our ability to do control-based thinking, is a necessary first step toward being able to successfully deploy control-based thinking when we need to.

Gaurav Suri: One of the problems is that there’s been an information proliferation, which one would think is a good thing, but there’s also been a motivation proliferation on the internet. And so, one can be guided by some initial motivation and then find a social circle in which those motivational forces are really amplified. And they’re amplified to the extent that they are constantly operating in the background and shaping what we pay attention to and how we pay attention, and really decreasing our control-based thinking, our ability to do control-based thinking.

Gaurav Suri: In previous eras, which you started to refer to, there was an agreement that what’s in the newspaper is a common input for everybody. And because it was a common input for everybody, we had to have a measure of agreement on, yes, we have some degree of trust and we find this information to be reliable. And the people writing the newspaper had to cater to a variety of camps that existed in the population.

Gaurav Suri: That doesn’t exist anymore, because no matter what I believe, I can find a community of people who believe as I believe. And those people are going to amplify my motivations. And once my motivations are amplified, my ability to process data and exert control-based thinking is going to decrease. As you point out, that is akin to driving on an icy road without the resources, because we’re so distracted by these emotional states, and the shaping of our responses is guided by them.

Jeff Schechtman: And then talk about when these groups and these divergent points of view become too wide, too extreme, what that leads to and what we’ve seen it lead to in terms of the extreme of this tribalism, which is this kind of mob mentality.

Gaurav Suri: The essence of politics is looking at policies and understanding that different people have motivations that may differ from ours, and coming to a compromise. That’s only possible if one believes in the goodness, to some extent at least, of the different tribe. If one thinks, these other people are trying to do a policy that’s different from what I’m trying to do, but let me listen to what they have to say, and let’s come to a compromise. That’s sort of the art of politics, but that’s only possible if I don’t think that the other group needs to be wiped off the face of the country. It’s only possible if I allow myself to listen to the other groups without having this internal monologue in my head that the other group is evil or wicked or needs to be utterly destroyed.

Gaurav Suri: Politics can’t start in those circumstances. So as you said at the outset, I’m an experimental psychologist and I look at brain networks, but I’m also an amateur historian. And so, I read a great deal about the American Civil War. It’s striking that where we are now is very similar to some of the influences that were at work in the late 1850s and 1860s, because there was this notion that the other side is just utterly, utterly wanting to destroy the alternative. And this gap is getting wider because we’re more and more in bubbles. We’re less and less able to talk. We’re more and more influenced by motivations that are ever stronger. And our attention span is less and less able to process information that might start sowing seeds of doubt in the views of the social tribe that we belong to. It’s a dangerous time.

Jeff Schechtman: There’s also the degree to which we then have so much at stake, both personally and tribally, in that particular point of view.

Gaurav Suri: Yes. The stakes increase if we are in a de facto state of war with each other. I mean, if one group wants to destroy the other group, the stakes are very high. In the news recently, we have all these reports of armed unrest, and if so-and-so doesn’t agree with us, let’s kill them. One doesn’t get to that conclusion without motivation that is unsullied by any sort of doubt.

Gaurav Suri: I’m a big fan of doubt. I think progress happens when one says to oneself, “Yeah, but wait a minute.” That thought, that linguistic phrase, even, Jeff. “Yeah, but wait a minute.” If one says that to oneself, that’s a signal that one is incorporating new data and thinking about the data in new ways. If we don’t do that, then we’re in more and more polarized camps. We’re further and further away, and the stakes are higher and higher because aggression is a part of our very, very ancient networks. And if we are utterly convinced that it’s us versus them, our survival versus their survival, we come to the conclusion that the other group should not exist. I fear that large parts of our country are now in that state.

Jeff Schechtman: Finally, I guess what’s most disturbing about it, layering onto what you’re saying, is that because we have no historical precedent for a situation like this, where we have information and movement at such an incredible speed, there’s really no way to anticipate the outcome.

Gaurav Suri: There is no way to anticipate the outcome, but we do have the ability to do critical thinking. I want to be clear on this. I think that critical thinking is difficult, it requires a lot of sustained attention, it requires quietened motivation, it’s slow. All that said, we can do it. And so, we can figure out the consequences of these bubbles that we are inhabiting, and we can start to make policies that reduce this utter isolation and the warfare that has descended on our society. I believe it is possible. I believe that crises create great danger and they create great opportunity, and I think that we are in this state, and how we react now, here in January 2021, is going to define how our next decades are going to be.

Gaurav Suri: If we can come up with policies that promote pro-social behavior on the internet, that promote the idea of listening to divergent points of view, that promote a politics based on compromise, then we shall thrive. If we continue to go down this path of “It’s us or them,” and them being just other people in America, then that’s not such a good idea.

Gaurav Suri: One of the things that’s, for example, really galling to me is how the YouTube feed, or the news feed, or the Twitter feed is customized for a person, is configurable by that person. And that person configures it based on their preferences and the social tribe they belong to, and they make it so they receive no articles or information that goes against their current point of view. All of the social platforms are configured to favor people’s preferences over doubt, and the reason is that people don’t like cognitive dissonance, as we discussed. So they don’t want to hear points of view that may require thinking.

Gaurav Suri: Thinking is hard. This control-based thinking is hard. It requires resources. So unless we can find policies that get people into the thinking arena again, and get them away from decisions that are completely colored by motivation, those decisions will stay that way. Motivation has always been, and will always be, a central part of our decisions. But if we can slow that train down as we have in the past, and at least get these control-based systems to have a voice in our decisions, if that happens, we build a better country. If that doesn’t, things will continue to look the way they do.

Jeff Schechtman: Dr. Gaurav Suri, I thank you so much for spending time with us today.

Gaurav Suri: It was my pleasure, Jeff. Thank you.

Jeff Schechtman: Thank you. And thank you for listening and for joining us here on Radio WhoWhatWhy. I hope you join us next week for another Radio WhoWhatWhy podcast. I’m Jeff Schechtman. If you liked this podcast, please feel free to share and help others find it by rating and reviewing it on iTunes. You can also support this podcast and all the work we do by going to whowhatwhy.org/donate.

Related front page panorama photo credit: Adapted by WhoWhatWhy from J Brew / Flickr (CC BY-SA 2.0) and Rob Larsen / Flickr (CC BY 2.0)
