Lying has become a viable strategy for success. How influencers, algorithms, and crowds turn deception into accepted reality in the digital age
We inhabit a fractured information landscape where truth itself has become negotiable.
Lying has not only lost its stigma — it’s become a viable strategy for success.
Politicians lie with impunity, corporate leaders fabricate narratives, and social media influencers craft false personas, all understanding that in the attention economy, authenticity is just another performance metric.
Our guest on this week’s WhoWhatWhy podcast, Renée DiResta, author of Invisible Rulers: The People Who Turn Lies into Reality, reveals the machinery behind this transformation.
The infrastructure of deception, she explains, has become so sophisticated and pervasive that we’ve normalized dishonesty as simply another tool in the communications toolkit.
This isn’t merely about grand political conspiracies, but about how the digital revolution has fundamentally altered our relationship with truth at the most basic human level.
As a researcher at Stanford’s Internet Observatory, she discovered how a complex system of influencers, algorithms, and crowds now determines what information moves through society. The result, she details, is a landscape of bespoke realities, each carefully curated and algorithmically amplified, where making something trend is often perceived as indistinguishable from establishing it as true.
DiResta’s personal journey, from concerned parent to focused researcher, illustrates how those who study online deception often become victims of it themselves. Her work documenting false election claims led to congressional inquiries and professional exile, showing how passionate activists spreading misinformation can overpower truthful voices.
In this conversation, we explore why lying has become consequence-free, how the invisible architecture of influence operates, and whether there’s still time to reclaim our shared understanding of reality in an age when propaganda has been democratized but not necessarily for democracy’s benefit.
Full Text Transcript:
(As a service to our readers, we provide transcripts with our podcasts. We try to ensure that these transcripts do not include errors. However, due to a constraint of resources, we are not always able to proofread them as closely as we would like and hope that you will excuse any errors that slipped through.)
[00:00:09] Jeff Schechtman: Welcome to the WhoWhatWhy podcast. I’m your host, Jeff Schechtman. In the grand tapestry of human communications, we have witnessed epochs of transformation from the printing press to radio, television to the internet. But perhaps no shift has been as seismic or as perilous as the one we’re experiencing today. Where once information flowed through established channels with recognized gatekeepers, we now inhabit a fractured landscape where truth itself has become negotiable. We live in an age where lying has not only lost its stigma, but has become a viable strategy for success. Politicians lie with impunity, knowing their supporters will rationalize it or ignore the deception. Corporate leaders fabricate narratives, confident that algorithmic amplification will outweigh factual correction. Social media influencers craft false personas, understanding that authenticity is now just another performance metric. The infrastructure of deception has become so sophisticated, so pervasive, that we’ve normalized dishonesty as simply another tool in the communications toolkit. This isn’t merely about grand political conspiracies or foreign interference campaigns, though those exist. It’s about how the digital revolution has fundamentally altered our relationship with truth at the most basic human level. When anyone can create a compelling lie, when algorithms reward sensational fiction over mundane facts, when online crowds can be mobilized to defend any narrative, the very concept of shared reality begins to dissolve. The machinery of influence has been democratized, but not necessarily for democracy’s benefit. In this new ecosystem, a handful of skilled propagandists can wield the power once reserved for media moguls and heads of state. They understand that in a world of infinite information streams, the person who masters the attention economy masters the truth. The result is a landscape of bespoke realities, each carefully curated and algorithmically amplified, where making something trend is often indistinguishable from making it true. My guest, Renée DiResta, has spent years as a forensic investigator of this new machinery of deception. Her journey began as a concerned parent confronting vaccine misinformation and evolved into groundbreaking research at Stanford’s Internet Observatory. Her new book, Invisible Rulers: The People Who Turn Lies Into Reality, offers both a chilling diagnosis of how we arrived at this moment and a blueprint for understanding how small communities of propagandists have learned to manipulate public opinion on a massive scale. But DiResta’s story is also deeply personal. She has found herself transformed from researcher into target, facing congressional inquiries, online harassment, and professional exile, all for the sin of studying how lies become accepted truths in the digital age. Her experience reveals the circular nature of our crisis: those who seek to understand the mechanics of deception become victims of those very mechanics. Today, we’re going to explore why lying has become so pervasive and seemingly consequence-free, how the invisible architecture of influence operates in the 21st century, and whether there’s still time to reclaim our shared understanding of reality. It is my pleasure to welcome Renée DiResta here to talk about Invisible Rulers: The People Who Turn Lies Into Reality. Renée, thanks so much for joining us here on the WhoWhatWhy podcast.
[00:03:49] Renée DiResta: Thanks for having me, Jeff.
[00:03:50] Jeff Schechtman: It’s a delight to have you here. Thank you so much. Certainly, there have always been lies from politicians. There have always been propagandists, people trying to make us believe things, but something is fundamentally different in the time we live in today. Talk about that from a 30,000-foot view, first of all.
[00:04:09] Renée DiResta: Right. As you say, propaganda is very, very old. The term comes from Pope Gregory. After the printing press emerged, the Catholic Church began to realize that pamphlets being spread from person to person, the printed word, were leading to a sense that their faith was being undermined, that the true faith specifically was being undermined. The word propaganda comes from the Pope’s exhortation to the bishops, the cardinals, that they must go and propagate the faith, propagate the true faith. It’s a very, very old term. As you know, politicians have also lied for as long as there have been politicians. I think the reason I wrote the book was to explain the kind of complex system between influencers, algorithms, and crowds that exists today, a system that really emerged with the internet, that is very distinct to the internet. What I mean by that is we always, all of us individually, had the power to spread information within our communities. We could sit and we could talk to each other. You could go to maybe your local town hall where you would hang out, play bridge or something like that. You could write an op-ed in your local newspaper. With the origin of what we now kind of quaintly call web one, you could run a blog, you could post your thoughts to the web, and maybe people would find it. But once we had social media, there were virality components built in. Algorithms gave you the power to help share other people’s content as well. We became creators as well as distributors. The system that began to emerge turned everybody into not only people who could speak in their local community, but people who could share information globally. And, in aggregate, collectively, we became the people who curated and distributed information. But not just by ourselves. This was happening in conjunction with algorithms, which also served as gatekeepers, deciding what signal from a sufficient number of people was enough to propagate that information along. So the combination of crowds of people plus algorithms really began to shape the flows of information. So this was distinct and new. And it happened very, very quickly, very rapidly. So the speed and the scale was new. Millions and then eventually, in some cases, billions of people were on these platforms. And then the other thing that I talk about heavily in the book, the sort of third leg of the stool, if you will, is the influencer, which is this very interesting figure that is also unique to the age of social media. We’ve always had celebrities, people that are known for being in music, maybe, or in movies, and media theorists will kind of joke around sometimes and say that the celebrity is known for being well known. Sometimes you become famous and it just kind of becomes a self-referential cycle. But on the internet, ordinary people can kind of amass a following, grow a large number of followers, sometimes into the millions, sometimes larger than a newspaper or even a celebrity. And those people become kind of key nodes in networks that decide what information gets more or less attention just based on the size of their following.
So you have these three things together: the crowd, just groups of ordinary people acting in concert; the influencer, particular folks who are very good at putting out messages that reach a lot of people, that have a lot of resonance, they’re great storytellers, they’re very charismatic; and then the algorithms, which are curatorial functions on social media platforms, serving the interests of the platform, the business interests of the platform, that often really shape the information that we see. And these three things together become this system that really determines what information moves, and it really influences public opinion today.
[00:08:05] Jeff Schechtman: The irony of this, I suppose, is that the original intent of all of this, the idea of moving from top-down information to even a kind of participatory information, was to democratize that information. And it has had more or less the opposite impact.
[00:08:23] Renée DiResta: Well, it is democratized. Just because you don’t like what message moves doesn’t mean it’s not democratic, right? So I think that one of the interesting questions is who participates in which processes? And oftentimes you’ll see institutions and experts and people who are accustomed to the broadcast media environment still thinking that that is the way to get messages out into the world. You know, I remember, I was at Stanford University for five years, and when, as you alluded to in the introduction, the conspiracy theories about me started, I remember Stanford comms saying something that really struck me as incredibly funny, which was, oh, don’t worry about it, it’s not in the Washington Post, Renée. And I thought that that was so incredibly funny because it was such an indication of where this institutional communications team thought that public opinion was still shaped. Why would I care about what people online were saying? It wasn’t in the Washington Post. It wasn’t on the nightly news on a reputable channel, right? It was just some people online that were saying a thing. Because they didn’t think about crowds of people online as being profoundly influential. They didn’t think about the alt-media podcast circuit as having millions and millions of listeners. They still thought of institutional top-down media as being the place where you went to give a quote, and that was the legitimate, important media. And so this democratized bottom-up network, that space where opinion is often shaped for very, very large communities of people, in fact, that to them was a completely different ecosystem. And it just wasn’t even really on their radar as something to consider important.
[00:10:09] Jeff Schechtman: Talk about the ways in which this has played into what we have often referred to as a kind of confirmation bias, and how, in the process, along with these three legs of the stool that you’re talking about, it has created what you call this bespoke reality.
[00:10:28] Renée DiResta: So one of the things that social media algorithms tend to do is they show you more of a thing that they think you will like. So that can be a thing that you’ve engaged with previously. Or it can be a thing that people who it sees as being like you are interested in. So if you are interested in, let’s say gardening, even if you’ve never searched for cooking content, a whole lot of people who are interested in gardening also happen to be interested in cooking. This kind of comes along with the interest base. Or maybe fitness. People who are interested in gardening maybe live healthy lifestyles. And so maybe it’ll push you some fitness content. Not because you’ve ever gone searching for fitness, but because a whole lot of other people who like gardening also like fitness. So the way that the algorithmic suggestions work, the platform wants to keep you on site. And so by making these suggestions, giving you these little nudges, it sees what you go and engage with. If you consistently ignore or click the button that says you don’t want to see those fitness posts, then it’s going to stop showing you them and it’s going to move on to something else. But over time, it’s going to refine its suggestions to be really tailored to what you want to see. And in the realm of gardening and fitness and cooking and sports and whatever else, these can be just very innocuous suggestions that just provide entertainment, maybe information, maybe education. But sometimes when you get into the realm of politics or into the realm of certain types of hot button or controversial topics where there can occasionally be a lot of misleading or polarizing information in the world, that same process can happen. And what that means is when you are clicking and engaging with that kind of content, oftentimes, and this is very natural human behavior, people are more likely to want to continue to click and engage on content from influencers or specific posts that they like because they agree with it. And so you are reinforcing that over time. It doesn’t mean that you’re never going to see other things, particularly if you’re on a platform like X, where you’re going to see people from the other side of an issue and that kind of fighting with other people is part of the point of the platform, actually. You’re going to continue to see it. It’s not like you’re in a hermetically sealed echo chamber. But what you’re reinforcing for the algorithm is that you kind of want to see certain types of content and it’s going to continue to show it to you.
[00:13:02] Jeff Schechtman: And this gives undue power to certain influencers. It develops what you talk about as this asymmetry of passion. Talk about that.
[00:13:11] Renée DiResta: So there are certain issues where people who are on one side really come out and create tons and tons and tons of content, and people on the other side just don’t. And I talked about this in the context of the vaccine conversation. You know, I had my first baby in 2013. I have three kids. And at the time, again, the Facebook recommendation engine realized I’d had a baby, and I posted some baby pictures, and it began to recommend me all kinds of mom content. And like I said, I joined some groups for, like, making your own baby food, which made it decide that I was like a crunchy mom. I’m really not, actually, but it decided I was. And so based on, you know, certain kinds of behaviors and groups that I had joined, it started pushing me anti-vaccine groups, because people who make their own baby food are maybe more likely to also be anti-vaccine. And it was kind of an interesting situation, because there were dedicated anti-vaccine groups on Facebook with, you know, tens of thousands, or in some cases, hundreds of thousands of members in them. But there were not pro-vaccine groups of that size. And that’s because most people just go and vaccinate their children and it doesn’t become their identity. They’re not out there passionately posting about how they vaccinated their kids every day, because that is just not a thing that people did. You just go, you get vaccinated, you move on. And so when the platform is looking for content to recommend, when people are searching for vaccine information, overwhelmingly, the majority of the content that it had available was anti-vaccine content. And this was something that Google struggled with also, for a long time. You know, there’s a very routine shot that babies are given when they’re born, the vitamin K shot, and it’s to prevent brain hemorrhaging. And it’s not a vaccine, there’s no virus associated with it. But, you know, the anti-vaccine movement was creating blog posts about how the vitamin K shot was toxic. And that was what was rising to the top of Google search results, because nobody was producing content about why the vitamin K shot mattered. You had some very kind of boring, bland CDC-type content, maybe some pediatric content. But at the time, it didn’t seem like a thing that ordinary people had to be talking about and creating content about. So the asymmetry of passion really tilted what kind of content was available for platforms and search results to return. We call this a data void sometimes, when a certain keyword won’t have a whole lot of information for it. And so what gets returned is this sort of content. In some cases, it’s even political propaganda: sometimes you can see extremists do this, where they’ll come up with a term and tell people, you should go look for this term. And then when you go and look for it, it’s content that they want you to find. And so that’s how that works.
[00:16:14] Jeff Schechtman: And so much of this has given rise to so many political things that fall under this general rubric of the big lie and the way that has been self-reinforcing.
[00:16:27] Renée DiResta: So with something like the big lie, one of the things that happens, I’ll continue with the vaccine example, because I mentioned that in some of these groups, you know, you have tens of thousands of people in them, right? So once you join some of these groups, you really find a sense of camaraderie. And oftentimes that also leads to people continuing to participate. And it becomes, you know, sort of a part of their identity, a big part of their identity. And this is, you know, we just picked the one example that I have a personal history with, a personal story around. But it doesn’t matter what the particular example is; that happens in a lot of different political communities. And people will begin to very, very strongly and deeply identify with that particular faction, which is the term I use in the book. And they’ll even sometimes put like a little emoji in their bio so that when they’re communicating with somebody online, it’s immediately apparent what factions, what issues they care about. People who really care about, like, biking in cities will have the bike emoji in there. You know, it gets very niche sometimes. And so what you have is this dynamic where, you know, you mentioned the big lie. In that particular case, you had a situation with Trump supporters, right, who, you know, very deeply trust Donald Trump. And they deeply trust the right-wing influencers who they have listened to and followed for a very, very, very long time. And around election 2020, I spent a whole chapter on this in the book, just really kind of explaining how it happened, explaining the buildup over time. They were told for the entirety of the presidential campaign that the election was going to be stolen, that there was going to be massive fraud. Trump hammered on that over and over and over again. And the influencers, the right-wing political influencers, would engage with ordinary people who would see something in their neighborhood, and they would take a picture of it, and they would tag the influencer in. There was a very kind of famous case in Sonoma where a man saw ballot envelopes in a dumpster. And these were ballot envelopes from, I think it was 2018, that had been properly disposed of. And, you know, they had been held for the requisite length of time, and then they had been disposed of in the dumpster. But, you know, this individual took this photo, tagged in the right-wing media, the right-wing influencers, and the story goes viral that mail-in ballots, implying that they’re from 2020, are being destroyed, that they’re just being disposed of, and, you know, of course, they layer on these theories that they’re Trump ballots. And the rumor flies, you know, faster and further than the eventual explanation from the election officials explaining what is happening, that these are old ballots, et cetera, et cetera. And so you have this buildup of distrust combined with this messaging that’s coming from political elites and political influencers saying it’s going to be stolen, it’s going to be stolen, it’s going to be stolen. And so when he loses, what they’ve been hearing is that it’s going to be stolen, it’s going to be stolen, it’s going to be stolen. And on election day, there’s always some irregularity somewhere in the United States, right? Some machine goes down, something goes wrong.
And so those stories that had happened on election day, you see the influencers kind of combing back through them and then pulling them back up and reframing them as here is how they stole it and, you know, pushing them out again. You know, there was a situation in Maricopa County, Arizona, where people were very afraid that Sharpie markers were bleeding through, and they were convinced that this was a conspiracy to make their ballots unreadable. And so they went and they kind of dredged that back up and they said, oh, they stole it in Arizona. You know, so this became an example where evidence was twisted to fit the frame. A frame is a way that you can kind of position events in the world, right? So you have the evidence, you know, examples of real things that happened, twisted and repositioned to fit this frame, which is the election will be stolen. And that is what is communicated to these very passionate people who have been hearing for a long time from people that they trust that this is what is going to happen. And that is how Stop the Steal worked. And unfortunately, it’s a very repeatable process, because these folks have also been told you can’t trust the other media, the other media lies to you. So when other media comes out and says, no, there’s no evidence of this, no, there’s no evidence of a steal, no, there’s no evidence of ballot fraud, et cetera, et cetera, they have been primed already to distrust that media. And that is one of the reasons why it’s very, very hard to break down that barrier and begin to kind of reinhabit a shared reality.
[00:21:39] Jeff Schechtman: And one of the things that we have seen happen, beyond just reinforcement of what people believe to begin with, is that people with sort of a casual engagement with an idea, a casual engagement with a subject, suddenly adopt some of these things as core beliefs.
[00:21:59] Renée DiResta: So in the case of online influencers, in particular, adopting a belief can often translate into more attention and more revenue. So there’s an incentive, potentially, we call this audience capture, to take a position that you think your audience wants you to have; it doesn’t matter if you sincerely hold it or not. What matters is that if you express it, you’ll get the kind of engagement from followers, you’ll get retweets, and you can monetize it. There was an interesting example recently: there’s a debate going around right now about flag burning because of the president’s executive order. And with Matt Walsh, who’s a sort of, you know, right-wing firebrand, somebody pulled up that he had been tweeting in 2019 about how flag burning should absolutely be legal. And then, of course, yesterday, he was tweeting about how it absolutely should not be legal, right? And so there’s this sense of, did he have a sincere change of heart over that time? Or is it more that the political winds of his audience, who are, you know, very heavily Trump supporters and aligned with the president, have shifted, and this is where that belief has gone in that particular political community? This is the belief that comes with that identity now. So you express it, right? And so there’s that question of where is the sincerity, versus are you expressing it in part because this is going to get you a ton of engagement and your posts are monetized on that social media platform. So maybe he’ll come out and say something like, no, no, no, I changed my mind, and here’s how my views evolved on that over time. People do change their minds. But one of the things that you see on social media is a trajectory where people will have this movement into a conversion to the beliefs that their audience has. And that question always lingers in the back of my mind, which is how much of this is sincere and how much of this is, well, these are the people who are subscribing to your Substack. Right.
[00:24:15] Jeff Schechtman: To what extent has this problem and all that we’ve been talking about evolved in conjunction with the way social media itself has evolved, the way people have learned to interact with it, the way it too now is becoming bifurcated in so many ways?
[00:24:34] Renée DiResta: Oh, it’s absolutely, I mean, it’s inseparable. You have to think about it as a complex system, right? There’s that quote by, oh boy, I think it’s Winston Churchill, we shape our buildings and then they shape us, right? I just kind of butchered that, so consider it a paraphrase. But it’s the idea that, you know, you are a product of your environment, right? And when you’re engaging on a social media platform, the incentive for the creator is determined by what the platform algorithm rewards. The creator isn’t only creating content for the human audience, they’re creating content for the algorithm. And that’s something that I think people really, really need to understand. You know, I’ve started making video content a little bit more now, just because people want to get their content on video, and that’s fine. And so I’ll take, you know, a 4,500-word explainer essay that I feel like now I’m mostly writing for the AI to read, you know. And then if I make a video of it for, like, Instagram Reels, Reels is very upfront with you. And it’ll say, if you make this over three minutes long, I’m not going to push it out to anybody other than the people who follow you, right? So there’s the immediate constraint there, right? So you’ve got an immediate time constraint. And then there are certain things that it is going to reward as far as, like, types of formatting, types of cuts. Sometimes on TikTok, it’ll reward use of certain types of sounds. Recency is important. On YouTube, you know, the thumbnail that you use is important. The title that you use is important. What is the title that you’re giving your video? There’s a reason people use these clickbaity-sounding titles. And that’s because that is what the algorithm uses when it’s picking from the millions and millions and millions of pieces of content that it can use to show somebody. Your content is so heavily influenced by what the algorithm is going to show people. And so whether you want to or not, in order to get attention, you are playing that game. And so I think you have to just think about it as, you know, the medium is the message, right? You’re essentially evolving your content, and how you communicate is so heavily shaped by the system that determines whether people see the communication.
[00:27:10] Jeff Schechtman: Talk a little bit about your own experience, Renée, and the way your research into this, the work that you were doing at Stanford, really created a world of problems for you.
[00:27:22] Renée DiResta: Well, I mean, look, I study adversarial abuse online. That’s how I’ve always described it. And, you know, when it’s adversarial, that means that there are people who are going to be mad at you when you point at it, right? So it’s not a surprise. The work that we did in 2020 was chronicling election rumors and misinformation and disinformation. We thought we were going to see a lot of content from state actors. We actually sort of set about in 2020 anticipating we’d see a ton of stuff from Russia and China and Iran, in part because it was the first big presidential election since the Russian interference in 2016, which, I might add, we are now re-litigating as Tulsi Gabbard tries to redefine reality. But Russia did interfere in 2016, and we anticipated that we would see more interference in 2020. So we set up this project, and then, lo and behold, most of the rumors came from the sitting president of the United States. And this presented an interesting challenge, because when you are chronicling things, you know, we just kind of recorded stuff as it happened. We had a team of about 120 student analysts and we had a ticketing system called JIRA, where you could just log things as they happened. And then we could kind of go back and, you know, add more information later. So the thing that is sometimes a challenge is that when the overwhelming majority of the rumors are coming from one side, because only one side is alleging the election is being stolen, then it looks like you are biased against one side. That is the accusation that they make against you. You know, they say things like, oh, all of your tickets are targeting the right. No, no, no, that’s not true. You’ve got that cause and effect backwards. The database at the end chronicled things that happened. The things that happened were rumors spread by people on the right. That is what happened. Had they been spread by people on the left, it would be more evenly distributed, but the left was not alleging election theft. And so there are far fewer rumors from the left in the election rumor database. So, you know, Jim Jordan and his merry band decided to accuse us of a massive, you know, conspiracy to silence conservatives. And this was, you know, they’re just being absolutely wild. I don’t know if I’m allowed to say the word BS here, but there’s an academic definition for the word bullshit, which is, you know, information spread without regard for the truth. Right. And so there were these sort of bloggers and these Twitter Files guys who started saying things like Stanford censored 22 million tweets, which was just a staggering number. And where they got that from was, after the election, we added up the number of tweets in the most viral rumors that spread during the 2020 election. So things like Sharpiegate, right, that Sharpie markers were bleeding through only for Trump voters, or the Dominion one. You might remember there was this lawsuit Fox News paid out $740 million on, over claims that Dominion voting machines had been rigged. We tracked that rumor as it happened. So we added up the most viral rumors, the rumors that everybody saw, everybody saw these rumors. We added them up at the end, in around March of 2021, when we were doing this final report. And they came out to about 22 million tweets.
There had been about 22 million tweets on those kind of top 10 most viral rumors. And we wrote that in a report that sat on the internet publicly, all the stuff was done publicly, for a little over a year. And then, you know, Jim Jordan and these crackpots got their hands on it. And they said that our act of addition, adding up the most viral tweets, was actually the number of tweets that we had censored. And this is what they do, right. And so you’re in this, well, there’s another saying, like if you’re explaining, you’re losing, right. So you find yourself in this bizarro world where they’re making these insane accusations, and they just throw a barrage of accusations at you, and every single one takes, you know, six paragraphs of explanation. Whereas for them, they just say, you know, the woke libs at Stanford University tried to steal the election from Donald Trump by censoring 22 million tweets. And that was the allegation that they made against us, that we ran a censorship cabal out of Stanford and the University of Washington and a couple of other institutions that were partners on this project. And then America First Legal sued us, you know, and so that’s still going on. They sue you to tie you up in paperwork and to shut you up, right, to make it so that you can’t talk. And so that is how that all happened. But in order to sell this story, they also had to turn us into, like, agents of the state. They tried to really make this into, you know, that we were somehow agents of the government who were doing this, because the claim is that we had violated people’s First Amendment rights by doing this too. So they began to allege things like, you know, that I had been a CIA agent and that the CIA had tasked me with doing this, just these absolutely insane allegations. And so then the conspiracy theories about me became personally weird also. And, yeah. And so, you know, I had the whole experience of basically having to start a Substack just to kind of put information out there into the world about the conversations I had actually had with these reporters versus what they wrote and stuff like that.
[00:33:41] Jeff Schechtman: Given how pervasive social media is, the growth of social media, the internet, and now AI layered on top of that and soon to be even deeper, to what extent is what we’re talking about, the fundamental thesis here, virtually inevitable given the reality of this world?
[00:34:03] Renée DiResta: The fundamental thesis in which sense?
[00:34:05] Jeff Schechtman: The fundamental thesis of propaganda and the three-legged stool that you started out with, the way in which that is basically built into all that we have today.
[00:34:16] Renée DiResta: So the reason I mention the history of propaganda in the book, and at the start of our chat, that it’s always existed, is because I think it is important to understand it. It has always existed, and we derive mechanisms for responding to it in its various forms, in various media environments. And so I think that that provides some guidance for where we are today. I think it’s prolific today. I think we’re, like, swimming in it. You know, I think it’s everywhere. And that’s because I think that we all have the power to create it ourselves now, right, to serve as creators or, more importantly, as inadvertent amplifiers, right, when we’re clicking the like and share button, when we’re really participating in the kind of factional political wars that many people get involved in, you know, myself included; everybody gets sucked in in some way. But that also means, though, that it’s worth revisiting some of the ways in which, in the past, we taught people how to recognize it. There was a lot of curriculum in the United States around recognizing rhetoric in the late 1930s. This was as the U.S. was heading into World War II, and that was because there was concern about rising fascism. And there was, for example, I spend the last chapter in the book talking about Father Coughlin, the radio priest, who had millions of listeners, tens of millions of listeners, at a time when the population of the U.S. was only about 128 million people. So this was a wildly influential figure, an American Catholic priest who really became enamored with Mussolini and Hitler and began to use his weekly radio address to praise and to kind of deliver a message in support of fascism, and, you know, eventually in support of what Hitler began to do, to American audiences. And you can see the responses from the radio broadcasters as they tried to figure out how to fact-check this, right? You would have somebody come on afterwards to fact-check him. They would gradually try to make him pre-clear his speeches, try to figure out, how do you respond to this? Eventually the Catholic Church kind of yanked him and was like, that’s it, you’re done. But there was this interesting, you know, I kind of relate the history of that, but one of the other things that happened was the rise of this group called the Institute for Propaganda Analysis that really began to try to teach this rhetorical analysis. And they did it even in the middle and high schools, but they also had these pamphlets that they would give out at, I think they said, the cracker barrel, which is sort of funny to say today, given the current controversy about Cracker Barrel and its logo, but, you know, the actual places, the community spots where people would meet. So that model of thinking about this in terms of propaganda, I think, is the right way to think about it, because for too long, I would argue, we have thought about it in terms of facts and the idea that if you just gave people more facts, they would change their minds. And that’s just not where we are. We have a crisis of trust. We have a significant divergence in what sources people consider legitimate. We have a real division along identity lines of where people get their information.
And I think that when you think about things in terms of trust and identity, and when you think about things in terms of rhetoric and propaganda, then a different set of responses becomes, you know, more in the realm of things that we should be considering. How do we educate people and make them more aware? And what is a media literacy that looks at that, as opposed to a media literacy that tells people, check your sources and consider your facts? And in my opinion, that would be an interesting place for us to be experimenting.
[00:38:27] Jeff Schechtman: And as we look around the world, we see that this problem is certainly a global problem, but it is definitely on steroids here in America. What can we learn from that?
[00:38:38] Renée DiResta: Well, that’s a really interesting question. I think that the crisis of trust has not hit quite so badly in other places. So this is something that comes up quite a bit. There are media literacy efforts conducted by governments that seem to do better than ours have here, but those are often in much more homogenous publics, and they have higher degrees of trust in government. So I think, again, when you come down to it, that trust piece is incredibly significant. And I think, you know, it means that it’s not just a technological problem. And I think for too long, we have focused on the social media challenge from a tech perspective. It doesn’t mean there aren’t tech things to do. I spend a lot of time looking at middleware and user agency or empowerment. But I think that it means that the trust thing has to really be foregrounded.
[00:39:30] Jeff Schechtman: Renée DiResta, her book is Invisible Rulers: The People Who Turn Lies Into Reality. Renée, I thank you so much for spending time with us here today on the WhoWhatWhy podcast.
[00:39:40] Renée DiResta: Thank you so much for having me.
[00:39:42] Jeff Schechtman: Thank you. And thank you for listening and joining us here on the WhoWhatWhy podcast. I hope you join us next week for another WhoWhatWhy podcast. I’m Jeff Schechtman. If you liked this podcast, please feel free to share and help others find it by rating and reviewing it on iTunes. You can also support this podcast and all the work we do by going to WhoWhatWhy.org/donate.