
How the Right Launders Online Propaganda with Renée DiResta

Publish Date: 2024/8/15

On with Kara Swisher


The first half of 2024 was defined by a slew of A-list album releases, but the second half, that belongs to the newcomers. I'm Rihanna Cruz, senior producer of Switched on Pop, and over the course of our brand new series, The Newcomers, we'll be talking to some artists, popular in their own right, that are popping off right now and who we think you should be listening to.

There's our pop darlings, Latin superstars, and those in between. Tune in to Switched on Pop wherever you get your podcasts. Presented by Amazon Prime. Support for this show comes from Amazon Business. We could all use more time. Amazon Business offers smart business buying solutions, so you can spend more time growing your business and less time doing the admin. I can see why they call it smart. Learn more about smart business buying at amazonbusiness.com. It's on!

Hi, everyone, from New York Magazine and the Vox Media Podcast Network. This is On with Kara Swisher, and I'm Kara Swisher. My guest today is Renée DiResta, one of the world's leading experts on online disinformation and propaganda, and the former technical research manager at Stanford's Internet Observatory, also known as SIO. Stanford launched the SIO in order to investigate online abuse and the many ways that people attempt to manipulate, harass, and target others online.

But earlier this summer, the university initiated its own quiet self-destruction after a storm of coordinated congressional inquiries from that famous clown Jim Jordan and class action lawsuits by conservative groups led by people like Stephen Miller, who claimed that the SIO and its researchers colluded illegally with the federal government to censor speech. I'm sorry to tell you, this is nonsense. They are nonsensical, aggressive people who are trying to make trouble by telling lies, almost persistently, in order to stop people from researching important issues around online abuse. All right, alleged lies. They tell alleged lies. The university claims that the SIO has not been dismantled or shut down as a result of outside pressure, but rather forced to refocus due to lack of grant money.

Okay, Stanford, profile in courage. Either way, Renée was caught in the middle of the storm and I'll obviously ask her about what happened there. She also just released a book that showcases some of her research. It's called Invisible Rulers, The People Who Turn Lies Into Reality. And it's about the interplay between influencers, algorithms, and the online crowds who are just as culpable. I've interviewed Renée plenty of times, including on this podcast, and there is so much to get

to today. We'll talk about everything that happened at the Stanford Internet Observatory, her book, of course, and the CEOs behind the platforms that spread propaganda, people like Elon Musk, who may be on the hook for some of the riots happening in the UK right now. As I said, there's a lot to get to. Our expert question today comes from Chris Krebs, the former director of the Cybersecurity and Infrastructure Security Agency, who was fired by Donald Trump via, you guessed it, Twitter, for saying the election was not stolen.

By the way, the election was not stolen. Let's get to Renee now. It is on.

Hi, Renee. Thanks for being on ON. It's great to see you. Great to see you, too. So we've talked a lot over the many years, and we've got a lot to get to today. So let's dive right in. Let's start with the Stanford Internet Observatory. You joined the organization in 2019 as a technical research manager. This past June, you learned that your contract with the university would not be renewed. You weren't the only one. Other staffers were told to look for work elsewhere, despite what the university says. The group essentially is dismantled, and I've talked to a lot of people about that.

There had been a coordinated campaign by Republicans that painted you and the university as part of a conspiracy to silence conservative voices. It's ongoing. Let's first talk about the work there because you worked with Alex Stamos and students and everyone else. So explain what you were doing there and what happened.

Yeah. So Stanford Internet Observatory was an interdisciplinary research organization within the Stanford Cyber Policy Center, and it looked at abuse of online information technologies. I describe it as adversarial abuse. Right. So I was looking at everything from spam and scams to child safety, which was a huge part of our work. Right.

Information integrity. So that was where the disinformation work came into play. A lot of like state actor influence campaigns that we studied over the years. We looked at emerging technologies, how emerging technologies change the information environment. And then how new, you know, how kind of bad actors, right, like people who are making CSAM and other kinds of horrible content were using new technologies to increase their, you know, their scope and scale of operations. Right. So typical. A lot of universities were doing this.

Yeah, yeah, yeah. Stanford, Harvard, a whole bunch. And then there were groups that were doing it too. Yes. But one of the things that we did was we led a couple of inter-institutional projects alongside Kate Starbird's team at UW, the DFRLab over at the Atlantic Council, and Graphika.

And those looked at elections and they looked at vaccines. And in both cases, they were looking at understanding what was the best kind of rapid responses for emerging rumors on the Internet.

So during the election, that might be a rumor that dead people were voting or that Sharpie markers were rendering your ballot useless, things like that. So very much focused on voting procedures. And then the context of the vaccines, rumors that were going viral around the safety of the vaccines, efficacy, conspiracy theories. And again, the thing

with a rumor is you don't actually know the truth, right? It's impossible to know in that moment. So one of the goals of the project was to try to help other people who were able to respond and did know what was actually happening on the ground have some indication that they should be responding when and, you know. So essentially just tracking these conspiracy theories in action and determining whether they were accurate, correct? Correct.

Determining whether they were accurate really relied on looking to see if fact checks came out. So that was where we would actually, we never made a determination about whether or not something was true or false. We would say, like, hey, here's this information over here. And we would put out a rapid response that sometimes would link and say, these fact checkers over here say this thing is true. We would reach out to tech

platforms occasionally. Hey, you've got this thing going viral on your platform, right? They had relationships with fact-checkers. They, in turn, could go and get a label put on content. Exactly. And those platforms were more open before, correct? They were...

Not terrifically open, but somewhat, sometimes. They were trying very hard to make sure that the election was free and fair and that bad actors, particularly state actors, were not using their platforms to manipulate the public. So that's when you got on the radar of conspiracy theorists. So talk about what happened because you're studying conspiracy theories and then the conspiracy theorists don't like this too much.

It actually wound up happening nearly two years after the work was done, right? So in August of 2022, we're doing work on the 2022 election. Again, we were, funny enough, Elon loved a lot of the work at the time. He amplified the work. He boosted the work we did talking about Russian, Chinese, and Iranian operations. Yoel Roth still worked there at the time, in fact, right? There was still the sense that he had just bought the platform and he didn't want it to be a disaster for the 2022 midterms. So that was the environment we were operating in.

But this blog kind of comes out of nowhere. It calls itself the Foundation for Freedom Online. Turns out it's like one guy at the time, and he claims that he's a whistleblower.

which implies he has some inside knowledge. We didn't know who the hell he was. Turned out he worked for the State Department for like two months, but he claimed that during his time at State that he had been the quote-unquote head of cyber at State. This is not true either. That he had seen evidence that there was a vast plot to silence conservatives and that we were behind it. Which they had been thinking for years. I had argued with a bunch of them about, especially on Twitter, that they got pushed down. What was the word they used? The...

Shadow banned, right? That was their thing for a while. That was their thing in 2017, 2018. This took it a step further and alleged that we were the source of election rigging. We had, in fact, swung the election by silencing and suppressing using an AI censorship Death Star super weapon all of the narratives alleging voter fraud in the 2020 election.

This is like stupid on its face to anybody with half a brain who remembers the 2020 election, but that doesn't matter because this person was briefing Jim Jordan and Dan Bishop, and he was arguing in these posts that once they had subpoena power, they should investigate us. So this is laid out in August of 2022. And

And this plays – my point is it plays into this idea that they were being manipulated and that conservative voices were being – whether it was Josh Hawley were being – Senator Hawley being suppressed, correct? Exactly. Yeah. So this just feeds into the grievance cycle that, you know, very long-running grievance cycle. But, of course, because it feeds into that, right-wing media picks it up, right? Right.

It becomes a whole thing. Marjorie Taylor Greene is weighing in on how silenced and censored she is. And so what happens is they manage to connect the dots between the Twitter Files allegations of some vast censorship regime happening within Twitter 1.0 and then the work that we did studying elections and vaccines.

And they allege that we colluded with Twitter to suppress 22 million tweets. There were two separate congressional inquiries into the Stanford Internet Observatory and several class action suits, which I think were more – the Jim Jordan thing was a circus. It was a spectacle, really. And the Twitter Files was a nothing burger

of an investigation and found that the people were doing their jobs essentially. And so they didn't find what they were looking for, but these class action suits by conservative groups, which is a, which is a tactic that Elon uses now and Peter Thiel has used it. Trump uses it. A lot of them use these things. This one was Trump's former advisor, Stephen Miller. And I believe it names you personally.

Yeah. Yeah. The family separation policy guy who now runs a lawsuit mill called America First Legal. Yeah, this is the problem, right? I mean, there's a couple of different levels of annoyance. The first is like the mobs on Twitter that, you know, that decide that, you know, you're some sort of like... You're a CIA agent, correct? Yeah, I'm a CIA agent, evil, like... Explain what you did. You worked for... For the agency? I was an intern, correct?

When I was an undergrad. So you're a CIA agent. That's their allegation. Not only that, the CIA got me my job at Stanford. It placed me there is this theory that they seem to have. Yeah, you're CIA Renee. I've read about you. Right. Yep. And so that, I mean, that just generates a lot of like personal harassment and annoyance. It is what it is. You know, you get threats on the internet. You know this. Right.

It's when Congress picks it up and uses subpoena power to compel you to turn over your emails in response to allegations like this, right? When they then write reports alleging that they have found, you know, some evil nexus. This is a very effective playbook. It's, you know, been shown to work. And then the other piece of it is the civil litigation and the lawfare issue.

And so we wind up getting sued by Stephen Miller and he kind of like grabs these plaintiffs, one of whom we just never even heard of before. And, you know, that's – I can't even comment on it because it's still pending, right? You're just sort of under pending litigation and it really – that is ironically the chilling effect. That is the impact on free speech. So –

Right. And so the university did what? They didn't want you to talk a lot. I know that from talking to a lot of people there and then suddenly not. They put the remaining staffers in the larger Cyber Policy Center. And essentially, you described it as capitulation. Explain your perspective. They say they didn't shut it down, but they did shut it down, right? Right.

Well, nobody really is there anymore. So, no, it's not technically shut down. But, you know, that's unfortunately the reality. There's, I think, just a couple of people who are working on very specific projects. They reoriented it around child safety exclusively. And that's very, very, very important work, to be clear. I mean, my frustration was even as they were trying to decide what to do, the child safety work in particular really needed to continue. Right.

But I would argue, so did the election work, right? And we have a very extremely online election happening right now, right? We have extremely unpredictable platforms. We have new platforms. We have so many that have emerged since 2022, gotten popular. We have all sorts of incentivized actors that want to create chaos and spread rumors and things like this. And one thing that's

That's interesting about it is we also have new and emerging technologies that are shaping the political discourse. And instead of, you know, instead of doing that work that we had done to study rumors and narratives as they emerged, I think just a handful of academic institutions are doing that now. Kate Starbird's team still is, right? But the thing that

got destroyed through the set of actions was the network, right? The network of researchers, the network of communicators. We spoke with state and local election officials because state and local election officials are the people on the ground, independent of party, who actually understand what is happening in their districts and who need to be communicating with their constituents. And by cutting those ties, by making it so that any conversation is collusion and any cooperation is a cabal, what you've essentially done is destroy it. Right. They make normal conversation into collusion. That's correct. Which is not illegal either, correct? I mean, what is the allegation of illegality here from your perspective?

Well, the theory is that we took tips from DHS, the CIA. You know, they really change it depending on which day of the week it is. They told us to tell platforms to censor content.

That the government decided what tweets were going to come down and used Stanford to launder requests for millions of tweets to come down to suppress a particular viewpoint. And the argument is that this is a deep state effort to do that. So we were not funded by the government. The government did not tell us to do this.

But because there's no actual evidence that can back up that specific claim, they just use a whole lot of innuendo. But the perception stays, and that's one of the things that is very challenging to overcome, right? Right, right. And it also creates financial hardship and time and space hardship, correct? Yes, exactly. And you can't speak, right? Right, you can't speak. So they're suppressing speech by suing you.

It's funny how that works, right? The free speech team. Another irony, the researcher who works to understand and expose misinformation and conspiracy theories becomes enmeshed in a conspiracy, although that

seems like the natural way for it to happen, right, in some weird ways. It happens for a reason, right, because the easiest way to address the problematic findings is to smear the individual. This has always happened, right? Whenever you don't like a climate science finding, you attack the scientist. It happened during the McCarthy era many times. Right. McCarthyism is exactly the parallel here, actually. I think it's been disappointing to me that that hasn't been pointed out a bit more often. It is, in fact, the effort to label you as some sort of –

you know, person who is suppressive or censorious or doing a bad thing, you know, the equivalent of, in McCarthyism, of course, it's communism, but to allege that you have some sort of inherent ideological bias and you're trying to subversively do something to some other group. And then the other piece of it is the effort to make the work toxic and make the people who do the work, you know,

dangerous or bad or untouchable. And that is the entire operation. Right, right. There were several people they did this to even before, trying to question their patriotism, whether they were working for someone, branding them as communists. They have very similar tools for doing this, right? You insinuate and then have no proof and then

Media has to sort of report it out, et cetera, et cetera. But the fact of the matter is you're out of a job. Your identity is being questioned. You and your former colleagues have been buried in lawsuits.

You've gotten death threats. What are you doing now since your contract wasn't renewed, correct? Well, I still have projects, I mean. Right, right. I still have work that I want to get done. I think I had an amazing last five years because I was getting paid to do work that I really wanted to do anyway. By Stanford, which is a great institution, right? Exactly, right. So I'm just continuing to do the work that I want to do, the work that I think we need to be doing.

And I can do it as an independent for a while, but it is, you know, as you know, it's a bit frustrating to see institutions not understand the power that they hold in this moment and the need to stand up, right, to learn the lessons of McCarthyism. The people who came out of that era as heroes, people like, you know, Arthur Miller, are people who...

We'll be back in a minute. Support for this episode comes from SAS.

How is AI affecting how you learn, work, and socialize? And what do you need to know to make responsible use of it as a business leader, worker, and human in the world? Find out when you listen to Pondering AI, a podcast featuring candid conversations with experts from across the AI ecosystem. Pondering AI explores the impact and implications of AI, for better and for worse, with a diverse group of innovators, advocates, and data scientists. Check out Pondering AI wherever you get your podcasts.



So you have written a book. You published your first book after this. It's called Invisible Rulers, as I said, The People Who Turn Lies Into Reality. And you write about how we got to this place. Now, again, conspiracy has been around forever. This is not a new thing. The book centers on the creation of, quote unquote, bespoke realities. How is it different than before? Because I've studied propaganda for decades.

In history for a long time and every single – there's not an autocracy who hasn't used it. There's not a democracy that hasn't used it, right? Some kind of conspiracy theories or propaganda, misinformation. How do you like to describe it and how is this different?

I describe it a lot as in those environments, it was very, very top-down. And that's because in the mass media era, control of the ability to propagandize was held by very few, right? And so you had to have some kind of access in order to have that degree of influence.

And now we have a really fascinating information environment where we have this figure, the influencer, right, who has the reach of mass media. It's the kind of figure that can only exist, that is like native to the last 20 years. We've had celebrities. We've had influential people. But this idea of mass reach is –

And somewhat, you know, mass reach for individuals is very new. What I wrote the book about was actually to describe the incentives, right? So there's always a system of incentives underlying propaganda. Chomsky writes about this in the 80s, talking about advertising, talking about creating hatred of the other, right? The sort of filters that go into why mass media propagandizes in some states in some ways.

And he doesn't do it to say, like, you should hate mass media, you should distrust mass media. He does it to say your eyes should be open to the system of incentives that you understand why you're seeing what you're seeing. And that was what I wanted to do, basically saying we have this new environment that is—you asked what's different—

inherently participatory. There's so much focus on the algorithm and the sort of infrastructure of social, and that misses the incentives of the individuals that come into play. The influencer is the other sort of massive figure in the space. But then there's the power of the crowd and the crowd's ability to create a reality, to shape public opinion. And it's happening in niches, where mass media at least tried to do something

Right.

The crowd is what triggers the algorithm to share the influencer's stuff. And the crowd feels together as a group. Yeah, exactly. So all three of these things, like this is the system. I have long described the first time I went over to AOL, there was a group of quilters who'd never met except online, and they made a whole quilt online. And I thought, this is lovely. And then I thought, this could be bad. This could have bad people. These are quilters. They were lovely. They made a beautiful AOL quilt, but they don't have to make that. They can make anything. And I thought, oh, this isn't maybe good.

It's a difference in how power is distributed, right? It went from top-down propaganda being a function of the top-down to now the potential to be from the bottom up. Or anywhere. Right, and very distributed. And it's actually that network that is what matters. And that's why, just to connect the dots back to the last thing we were saying, when you destroy the networks between people who are trying to mount the counter-response, you essentially...

cripple one side of the equation, right? If you maintain the network and you destroy their network, then they are at a disadvantage because networked communication is the entire ballgame. Absolutely. Look at the stories we're talking about with regard to, you know, J.D. Vance and his couch, you know. He did not fuck a couch, everybody. He did not. He did not, just for, you know, to be clear. It needs to be said. It does. It does. You never know. I think it is that concept, though, of like,

Ordinary people can make that the story of the day for, what are we now, three weeks? Yeah, that never would have risen to the top of any old media environment ever, ever. Right. It would have been talked about maybe, ha-ha, but never, never, ever gotten anywhere because it wouldn't have been created in the first place, right? Yes, exactly. This was just a guy who just made it up. Yeah.

and said he made it up, which didn't really matter. It's still, you know, you have a vice presidential candidate making a tasteless joke about it in his acceptance speech. One of the things: you've used propaganda, which I prefer too over misinformation and disinformation. Explain the difference and why you think that. Propaganda works. I keep saying just it's propaganda. Just like let's just like— Yeah, this is where I've been for years. Misinformation—

implies that something is factually wrong, right? That there is a truth that we know and this is the opposite of that, right? And that is just not what is happening

On the internet. I was misinformed. Like, yeah. Right. And that also assumes that you want to be informed and that if somebody just goes and puts out the fact check, the problem is solved. Right. And that was, I think, kind of a prevailing belief in the early quaint days when we called this like fake news in 2016. Right. Mm-hmm.

And that's just not what we saw. Like when we would look at election rumors in 2020, we framed that project originally saying election misinformation. That was still in, you know, four years ago, made sense to us. And then as we watched those moments go viral, it was always that there was going to be a

truth that was going to come out, right? You would know if the ballots in the dumpster were deliberate or accidental or, you know, the facts of the case.

But in that moment, it wasn't misinformation. It was a rumor. It was something where people didn't know the truth. And then the other thing that had been added onto it was that you had these highly partisan figures who picked up these rumors and then they amplified them as if they were true in service to a political agenda. And so they would say things like big if true, right? And it was this phenomenon of taking the way that humans communicate rumors and uncertainty, how we make sense of the world,

But then taking that uncertainty and leveraging it, using innuendo for the political moment. And that was where it becomes information with an agenda at that point. And propaganda has always been the word we've used for that. So I felt like we were torturing ourselves into this model of true and false. Yeah, because misinformation is I misspoke and disinformation is someone did it on purpose as a lie. I just like propaganda and lie. That seems to sort of cover the problem.

It's pretty good. We've had these terms for a while. One of the things you talk about is the virality, and it's something I've talked about for years and years and years, especially I used to say Google is...

fast, quick, speedy, contextual, and accurate, right? When you search for, say, Renée DiResta, you get Renée DiResta, right? You know what I mean? You don't get anything else. It's a utility. And when you do it on social media, you can get just about anything. You can get Michael Shellenberger saying lies about you or whatever. So it's a very different... Virality is the problem, though, when you apply it to a lot of things. Because

speedy and virality together create an architecture of propaganda very beautifully, especially when engagement is involved too. And

And virality is, you know, it's a function of engagement. Very literally, right? You have to click a button in order to make that retweet happen. One of the things I wanted to talk about was the idea of the crowd as passive is wrong. It is not. We are not sitting there on the Internet passively receiving things. J.D. Vance's couch did not go viral because like some, you know, gnomes made it go viral. It went viral because individual people decided it was interesting, entertaining, funny, and they wanted to be part of it.

And that's like people click the button. We are exercising agency in that moment. We are not automatons. And we're saying, okay, we are going to, in this moment, shape what our friends see, shape how other people process information, determine what trends. And it's not necessarily a conscious decision, but virality is active. Virality is not accidental. No, absolutely. It's different than the passive, say, of watching a television or looking at a billboard. You have a chapter in the book called If You Make It Trend, You Make It True. Like anybody can make a trend, which is –

what's powerful about it. And then the media does take it up because when media reports on it, even if it's not true, just like the couch thing, which I think is, you know, I'm not a fan of J.D. Vance, but he didn't fuck a couch. Stop it. That's enough. That's enough with that. It's funny, but it's not, right? It's to say he should not be subject to the same thing as he himself does to people. I thought about it in the context of like,

1960s radical politics guidebooks. I was thinking of Saul Alinsky and his book, Rules for Radicals, where he has this thing that he says. There's two things. He says, you know, ridicule is man's most potent weapon, right? Which is why weird and all these other things are working. You know why they're working. That's the main reason. It's very hard. There's no defense. But the other thing that he says is that a good tactic is one that your people enjoy. Right.

And I think about that a lot because the norms that we've created over the last 10 years of this now is that people think that this is what political activism is, right? And I'm saying that descriptively. I'm not even making a normative judgment here myself. I'm just saying that the left finds this appealing right now because there was this

the sense that you had to take the high road for so long and that the system would somehow restore itself if the adults in the room didn't participate. People will behave.

Right. And that's not, unfortunately, where we are because if only one side is doing that while the other side continues to realize that trolling works for capturing attention, for shaping opinion, for galvanizing a base. Well, you can see the Harris campaign doing it. You know, when they go low, we go high. They're going low now. We're going to go lower. Right.

It is. It's depressing on one level. On the other level, I'm like, OK, well, now we're going to now we're going to see what happens when both sides fight this way. Right. Is there going to be a realization that this is not actually something that we should want? Except it does work. It's going to work. And that's. And actually, the Harris campaign is very funny. Yeah. We'll be back in a minute.

On September 28th, the Global Citizen Festival will gather thousands of people who took action to end extreme poverty. Watch Post Malone, Doja Cat, Lisa, Jelly Roll and Raul Alejandro as they take the stage with world leaders and activists to defeat poverty, defend the planet and demand equity. Download the Global Citizen app to watch live. Learn more at globalcitizen.org.


This episode is brought to you by Shopify. Forget the frustration of picking commerce platforms when you switch your business to Shopify, the global commerce platform that supercharges your selling wherever you sell. With Shopify, you'll harness the same intuitive features, trusted apps, and powerful analytics used by the world's leading brands. Sign up today for your $1 per month trial period at shopify.com slash tech, all lowercase. That's shopify.com slash tech.

Let's dive into the role of incentives of the tech companies, because this is great for them. Enragement equals engagement, right? And the incentives, which I've talked about a lot. And it's also a mess. That's why they want to get out of politics or they want to not police propaganda on their platforms. Yeah.

Talk a little bit about the benefits they get. And then you brought up earlier these companies also made it more difficult for you to look at what's happening, for you to see what's going on behind the curtain at Oz. Meta, just for people that don't know, no longer supports CrowdTangle, which it bought, and Twitter is charging for access to its API. And I wouldn't even trust it.

frankly, given there's a large societal pushback against quote-unquote big tech. Why do they feel they can make these platforms more opaque? And what's the incentives to either hide it or get out of it altogether, fixing it in any way?

Well, for moderation, after 2020 and then 2022, you do see the shift begin to happen because, just to be clear, we were not the only ones subpoenaed by Jim Jordan. I think he sent out 91 subpoenas, right? The man who didn't honor his own subpoena. But it wasn't just us. It was tech platforms that were getting, I don't know if they were subpoenaed or just received letters. Yeah, they did. Yeah.

They did get subpoenaed, right?

And the platforms did not push back on that. And to be clear, neither did institutions, neither did academia, right? Nobody really did, actually. And there was a bit of a, like a freeze, right, where people didn't say, like, actually, a label is not censorship. I did. I got, like, railed in front of Congress, you know, as a result. Right.

You have to decide that it is worth it to have the fight. The other thing that begins to happen, though, is that you have the Europeans, right? So we're talking about this very much in the context of the American election, the American culture war. Then the Europeans do come in and they say, you are going to do the following things. We expect compliance with the Digital Services Act.

And one of the reasons, I think, for the end of CrowdTangle is that they're trying to launch this thing, the Meta Content Library, that they see as being a way for them to comply with what the European Union wants. Unfortunately, it's not as good, candidly, as CrowdTangle was. And that's why I think a lot of people are very upset about it. So the other piece, though, is that for a platform like Twitter –

When you have researchers saying, hey, we see this type of manipulation, in the olden days, pre-2022, you would have actually had, I know for us, you would have had engagement with the platform teams directly. There was a sense that this was going to be, you know. You were plotting with them, Renee. You know that you were plotting.

So there's a – I know this is reframed as a plot, but it's really like, hey, here are these accounts. We think they're this. What do you think that is? It was really quite boring, actually. I mean, they make us sound much more exciting than we are. But that – again, when you – those conversations aren't happening anymore. And conversations between the platforms and government aren't happening anymore. And some people are like, that's fantastic. That's great. Well –

But in June, the Supreme Court overturned a lower court decision that sought to impose limits on the federal government's ability to communicate with social media companies, which was great. In essence, they okayed the Biden administration's contact with platforms during the pandemic. It was a good ruling in that way, but it does make platforms more leery about doing anything, correct? Totally. Even if the Supreme Court has given the government the ability to do so.

Well, and one thing that I think is actually quite reasonable as far as responses is that you should be disclosing, for example, government takedown requests. Like, that is a thing that is good global policy, period. We should absolutely want those lists public. We should want the knowledge that the government is making takedown requests clear. We should have an understanding of why. That is a completely reasonable policy response to this. Most of the investigations

are not being conducted with the intention of actually getting to substantive, meaningful policy responses or legislation. That's the problem. So content moderation is expensive. They only get into trouble. Moderating content, when they deplatformed Trump after January 6th, they only got into trouble. They get the attacks. They get the lawsuits. They get all this stuff. Is there going to be a...

More moderation? No, less is what you're going to say, correct? Much less. Right. That's the expectation. Again, I think it should be different and better, right? And we were doing a bunch of work on, you know, if you think about this from the standpoint of, like, system design, which is what it is,

then you can answer questions like, what should a curation algorithm upvote, right? Moderation is what you do at the end state when something has kind of failed, right? Where something has gone south and you have to respond and react to it. But you can create better incentives earlier on by creating design incentives.

Yes, absolutely. It's an architecture problem. Exactly. It's something I've talked about a lot. It's an architecture problem. If it's incented to go for crazy, it's going to be crazy. And if it's incented to be nuts, it'll be nuts. If it's incented to be happy, it'll be happy, right? And I think that's the difficulty is that it runs smack into their very –

I would say juvenile ideas around free speech of what it is, right? They have these sort of like dummies' guide to free speech whenever I talk to them. But let's, before we get to Elon, who thinks he's Mr. Free Speech but is Mr. Suppressed Speech as far as my experience with him has been. And speaking of which, he's suing a group of advertisers, the World Federation of Advertisers, for allegedly organizing a systematic illegal boycott against X. Let me just say, this is just nonsense. You can't make advertisers...

advertise on your Nazi porn bar that you made, if they don't want to do that now. You can't call yourself a free speech absolutist and then sue people for exercising their right to free speech. It's just, you aren't one. It's just like, let's start with that. He's pulled a similar move, threatening to sue the ADL. As I said, it's his MO, his weird version of content moderation, which is he suppresses the content of people he doesn't like. He's doing exactly what they had long accused Twitter of doing.

And so the same thing happened to you and SIO in many ways. So what do you tell these advertisers? How should they defend themselves? Because he's judge shopping by putting it in a Texas court. Yeah, that's one. I mean, I would say fight. I mean, that's really...

People need to have the fight at this point because the entire intent is to intimidate and drive capitulation, unfortunately. It's very frustrating. I was never a big – I'm not a lawyer. I've never been a big follower of courts or procedures or things. But I have been very interested in this idea of what can be done about the forum shopping problem. I mean, I'm being sued in Louisiana today.

Why? Because, you know, some plaintiff who I'd never heard of lives there, right? Simple, simple. There it is. That's the answer. And they want it there. And they want it there, right. And I actually didn't realize the extent to which that was possible. I was very naive about that, honestly.

And so watching this happen over and over and over again now, it is forum shopping. And even though the case should eventually get tossed, because as you and every other lawyer on Twitter noted, it's a joke. But before that happens, it's going to be two years of procedural fighting, and it's going to bleed whatever funding and budget they have. So the problem for a lot of these groups is that what happens next is

It reinforces how effective the strategy is seen to be and potentially increases the likelihood that it will be deployed against others. Right. No, it works. It works because it's expensive. I think these people should get money from Reid Hoffman and just go to town. That's what I, you know what I mean? Just like fight back.

I think eventually that's what will happen. But Elon is at the forefront of scaling back content moderation efforts there. But secretaries of state in five states recently put pressure on him to fix Grok, X's AI chatbot, saying it had spread election disinformation. But do you think it matters at all? Because no one's really on X, although we'll talk about what happened in Britain. But talk a little bit about this. Like, these secretaries of state are very concerned about the disinformation and he himself is pushing it. He took down something about concentration camps.

because it was a fake headline from a newspaper. It's one thing to have conspiracy theories running around rampant on Twitter, but what's happened here in the UK and Ireland with these riots, talk about that.

I think, so I was working the last couple of days. I'm not intimately versed in every single post that went up or came down, but the... He was suggesting the country's on the brink of civil war. Oh, no, no, I know, yeah. And he tweeted conspiracy theories about the UK police system. And he compared them to the Soviet Union. Police officers have been injured, buildings, cars set afire, people in real danger. Elon is a really terrible person for having done this, but go ahead. The challenge is...

People don't want the government regulation of content, right, particularly not in Western democracies. And I'm very sympathetic to that.

one of the questions is, you know, where is the line with incitement? He's not actively telling anyone to go do a thing, right? So there's not going to be a whole lot of action you can take. And so one of the problems that we have, again, when we get to, like, how do you respond? Elon owns the platform. So the curation and amplification and all of that is set by him, which is an interesting position to be in.

One of the things that confronts us, though, is the question of what kind of, you know, what can you do about societal responses? I don't know that there is an immediate solution to the problem. I wish I had one. No accountability, right? Right, because there is no accountability. Well, there might be in the United Kingdom, even if it's not here, correct? Right, that may be true. And again, they have different laws. There's all sorts of, I'm not intimately familiar with UK speech law where they're going to come down on what he said or...

or implied or whatever, you know, where those lines are. I don't know them as well. Just thinking about what it would look like here in the United States, it's very hard to envision accountability for that sort of thing. And that's, you know, and so the accountability then is put on the people who go and do the...

Right. And that's consistently where we've seen...

You know, the punishment goes to the people who actually do the acting, right? Who take the violent act, which is reasonable. But they could then point to this. I mean, just for people to know, there was a killing of three young girls in Southport in England, and it escalated into violent demonstrations across the country after a false suspect was named online. Demonstrations fueled by anti-migrant, anti-Muslim lies that were spread on X, and people who just like to fight. In under 24 hours, a false name had already received over 30,000 mentions online.

Hi, Cara, and hello, Renee. What a timely interview.

So I guess my question is this. In the eight years since the Russian interference in the 2016 election, is it really the Russians that we need to be worried about? Or has that playbook that they've developed...

finding seams in society, elevating certain discourses and then pouring kerosene on both sides of the divide and stoking that fire. Has that playbook been picked up by other actors? And is it not just the actors we need to be worried about, but is it also the platforms and the players and the people sitting at the top of those platforms? Just...

You think about this all the time. You've written a book about it. Would love to hear where your head's at right now in 2024, on the cusp of yet another most important election of our lifetime.

Yeah, it's a great question. So the 2020 election work that we did that I described earlier, right, the Election Integrity Partnership, and Krebs was the head of DHS CISA during 2020, right? It was the Trump appointees around the government, just to be clear, when we did our election work. Including Chris, who was fired. Yeah, exactly. And again, when we started that project, it was the first presidential election since 2016. And we thought we were going to see a whole lot of state actor involvement.

We thought we would see more hack and leak operations. We thought we would see the, you know, whatever the internet research agency, the sort of Russian troll factory, whatever gunpowder it had left, we assumed we would see that. And we did, right? You know, their accounts have been decimated by two years of platforms, particularly Facebook, constantly taking them down. And so they didn't have, they were not able to reconstitute that and be influential. We saw the Iranians do some stuff. We saw the Chinese do some stuff. It was mostly minor and around the edges.

But the rumors that undermined and led to actual real-world violence came from authentic American voices, authentic American influencers, including the president of the United States himself. And that's where you do see the recognition that appealing to a niche, appealing to divisiveness, driving people apart, and most importantly, convincing your faction that they have to be constantly at war with the other factions, that's so normalized now that the Russian and Chinese and Iranian trolls who exist are such a small part of amplifying that.

So it's real. Democracy doesn't die in darkness. It dies in the full light of day, right? Yeah, it's not. I mean, there are...

There are a lot of people who really want it to be Russia. And the problem is when it's inauthentic state actors, we have a really clear, morally unambiguous response, which is the accounts come down, the pages come down, the network is disrupted, and that's it, right? Which they don't. Yeah, nobody cries for the – nobody's like, oh, they're really censoring those poor Russian trolls. Right.

But when the same thing is done by real people, then we have a speech problem, right? Then we have a culture problem. Then we have a societal problem. And that's where we actually are. And I think the most important thing to be thinking about now is, independent of actor, right? Whether it's a Russian troll, we can, you know, moderate them differently. But ultimately, the question becomes, what do you do to restore that societal cohesion? What do you do to defuse that vitriol? And that's not a technological problem. It can be enhanced by technology. You can use bridging algorithms. There's all sorts of different, again, design and structural things you can do. But the question becomes, how do you diminish the impact

of lies about voting manipulation and things like this when people are not going to trust the people doing the defusing, right? Well, you have Tim Walz come in and change the spark plug and they all feel better. But, you know, in all seriousness, we've got the presidential election coming up, same guy running who has been accused of sparking the insurrection, just like you talked about.

Two things. And then I have one final question. How are platforms prepared to deal with the potential election misinformation or propaganda this time around? And what if he does it again, calls for violence after he loses the election?

What happens then? Yeah, one of the things they need to do is lay that out unambiguously now, right? That is, I think, the number one thing that every platform has to do, which is to say, even if you are a candidate, even if you are an elected official, this is what we are going to do in response to this kind of incitement. And they need to have that there, and then they need to follow through, right? And that's the other piece, which is— Will they? I don't know. No.

But, I mean, that is the role. I mean, these are not poor businesses, right? They can, you know. I get it. But right now Mark Zuckerberg is, look at my pretty chain. That is where he is.

That's where he is. I mean, there's other people within the... I understand that there is, you know, the visibility of the CEO and the CEO setting the tone. But I do think that to whatever extent they have moderation and policy teams, they need to be coming out now with what's going to happen. Who is critical? What platforms are critical to this right now? I think Twitter is a lot smaller than people think it is. I think Twitter is... It attracts the media. Yeah, it attracts the media, but it's also... I mean, you know...

It can galvanize a group to act, right? And that's what we were just talking about in the context of the UK, right? It can provide a spark, create a sense of legitimacy. It also provides a platform where people who go out and act then share their actions online, right? So it sort of gives them an opportunity to build some clout. And meanwhile, platforms like Facebook have seriously deprecated political content already, right? They've already moved away from it. And we just don't see as much in the way of...

So they have to do this in advance before it happens and then say what we're going to do after it and then actually do it, which they won't, none of which they will do. You don't think so? No.

All right. Last question. Are you worried about a similar thing? Because it didn't happen in 2022. It didn't, you know what I mean? There wasn't, but this is a presidential election. If he loses again, will it be seen as a legitimate loss? That, I think, is actually one of the really challenging questions in front of us. Will it be scattered, small-scale political violence? You know, I think that you're seeing a lot of election officials who are very afraid of that.

You're seeing a lot of election officials who are afraid that it's going to come for them personally. And that, I think, is one of the areas where – that's the other area that I think platforms have to be particularly vigilant on, the need to protect election officials from threats and doxing and harassment because these are ordinary people doing their jobs. And we already saw with the two election workers in Georgia who wound up winning a massive defamation suit against Rudy Giuliani. But again, they had to move out of their houses before that happened. Like, the costs are pretty profound here.

So are you worried for this election? Yeah, well, I'm worried about scattered political violence, right? That I am actually very worried about. Caused by online. Yes. Having seen it in Britain.

Yeah, exactly. Like that I think is the biggest. I believe the election will be well run as it has been. I believe that the secretaries of state, be they Democrat or Republican, have made very, very strong statements. And just that sort of bipartisan show that like we are committed to a free and fair election. The actual election will be free and fair. It's a question of...

what incentivized actors do to try to convince the public that it is not so if it does not go their way. That's what I'm most concerned about. All right. Thank you, Renee. Again, her book is Invisible Rulers: The People Who Turn Lies Into Reality. Thank you so much. Thank you.

On With Kara Swisher is produced by Christian Castro-Russell, Kateri Yochum, Jolie Myers, Megan Burney, and Gabriela Bielo. Special thanks to Kate Gallagher, Claire Hyman, and Kaylin Lynch. Our engineers are Rick Kwan, Fernando Arruda, and Aaliyah Jackson. And our theme music is by Trackademics.

If you're already following the show, you are now part of the On With Kara Swisher conspiracy to take over the world. If not, CIA Renee is on to you. That's so ridiculous. Go wherever you listen to podcasts, search for On With Kara Swisher and hit follow. Thanks for listening to On With Kara Swisher from New York Magazine, the Vox Media Podcast Network and us. We'll be back on Monday with more.
