
Is Social Media Turning Us Into Political Extremists?

Publish Date: 2022/9/22

FiveThirtyEight Politics

Transcript


Hey there, listeners, Galen here. Before we get started, I wanted to let you in on some big news, which is that after a two-and-a-half-year hiatus, 538 is going to again be hitting the road for a live show. We're going to be in Washington, DC, on October 25. We will be at Sixth & I, just before the midterm elections. So come and join us. I'm going to put a link in the show notes where you can find

tickets. Again, that's October 25th at Sixth & I in Washington, DC. We look forward to seeing you there. Hello and welcome to the FiveThirtyEight Politics Podcast. I'm Galen Druke. What effect is social media having on our politics and, for that matter, society more broadly?

According to critics, we're living through an unregulated era of social media that will look as quaint as tobacco in its pre-regulation era. Cigarettes, those pleasurable, ubiquitous little sticks, turned out to be highly addictive, detrimental to overall health, and ultimately deadly. It took a decades-long crusade to convince the public as much, and then regulate and punish the industry that promoted them.

New York Times reporter Max Fisher agrees with this view. In his new book, he writes, the early conventional wisdom that social media promotes sensationalism and outrage, while accurate, turned out to drastically understate things. An ever-growing pool of evidence suggests its impact is far more profound. This technology exerts such a powerful pull on our psychology and our identity, and is so pervasive in our lives, that it changes how we think, behave, and relate to one another.

The effect, multiplied across billions of users, has been to change society itself. Fisher describes those changes as contributing to political and social crises.

Social media's effect on politics has been increasingly scrutinized and debated since the 2016 election, and it's again in the spotlight as we head into the midterms. So that is what we're going to talk about today with Max Fisher himself. His new book is called The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Welcome to the podcast. Thanks for being here. Thank you so much for having me. I'm really excited to be on FiveThirtyEight. Awesome. So

Did I do a fair job of distilling the thesis of your book? What would you like to add? I think that that's exactly right. That's the thesis, and it's also the quest of the book: figuring out how we answer the question of what that effect is on us

individually. Because we know that the platforms are, I mean, the cigarette comparison is apt in that they didn't just happen to fill cigarettes with nicotine; they designed them to be deliberately addictive. And we know that social media platforms are designed not just to be deliberately addictive but to deliberately change your behavior in ways that are going to serve the platform's bottom line, and that we have a lot of evidence, a lot of reason to think, have many profound implications for the way that you act, not just online but when you're offline too.

And trying to ascertain that through a combination of traditional New York Times reporting. I mean, this started four years ago as a series of stories for the paper: reporting on the ground on something that had happened and trying to understand what social media's role in it was, tracing it back post by post, talking to people there on the ground.

And combining that with what I think is unique and new with the book and the part that I'm really proud of, which is pulling together what turns out to be this fairly large but disparate and unconnected body of really growing and deep social science research into the effects of this technology. So I spent time with neuroscientists, with cognitive psychologists, with social psychologists, with political scientists, with

with dissidents within Silicon Valley, whistleblowers at the companies who worked on some of these systems, people who were tracking the systems from the outside to understand them, and taking all of these pieces and trying to pull them together to understand, as empirically and as completely and comprehensively as I think we can, what it's doing to us individually and as a whole. And it's a scary picture, I think.

As I'm sure listeners can tell by everything you mentioned there, there's a lot for us to dig into in this conversation. But I'm curious from the outset, we're a politics podcast. Social media gets blamed for a lot of what ails society. I think mental health, for example, is one of the big topics that's come up in Congress and the mainstream media, etc.,

How do you see this impacting our politics? Like, do you think that's the most urgent aspect of how social media is shaping our lives? Or is it, you know, the other broader social impacts? That's a great question. The most urgent. I mean, because especially here in the United States, we're in something of a political crisis right now. That's, I think, a pretty good case for understanding the political impacts. But

Every impact of social media traces back to the same kind of core origins. Let me give you an example of what the effects are on politics, and then, the thing that is really interesting and really telling, because it shows you how deep the effect goes, is that I can show you what I learned in the book about how it produces these changes in you.

There's one study that I write about that was just trying to measure very surface level the impacts of social media on politics where they took these two big groups of people and they had the control group continue to use social media however they normally used it. In the experimental group, they said shut off Facebook and shut off your account for four weeks. So just one platform and just four weeks, which is of course not a very long amount of time when you consider that these platforms have been in our lives for 10 or 15 years.

And what they found, doing a bunch of sentiment tests and a bunch of kind of psych tests throughout this four-week period and before and after it, were two really significant changes for the experimental group that shut off the platform. Number one, their happiness and life satisfaction shot way up, equivalent to the effect of about a quarter to a half of going to therapy, which

blew my mind because therapy is quite expensive and shutting off Facebook is free. And I thought that was also very telling because the researchers wrote this in the paper. That is one of many indications we have that social media is something we use because we are addicted to it and not because we find that it adds value to our lives or makes us happy. In fact, like cigarettes, it tends to make us unhappy. The second effect that's relevant for politics is that they found that

for people who shut off Facebook, this reduced their issue polarization. So not overall polarization, in the way they viewed the other party as a whole, but their polarization on the issues that were salient in that period. It reduced their polarization by the equivalent of 50 percent of the overall increase in polarization in American life over the past 25 years.

Didn't that same study also, though, show that the people who shut off Facebook were just less aware of things that were going on in the world? Not misinformation, but just factual news. So is that also kind of making the argument that if you just don't pay attention, you won't be mad about it,

whether it's Facebook or something else? Sure. So that's a good question. Like you said, we're going to dig into the social science, so I love it. I'm very excited that we're getting into it. I went through, I pulled up the research articles, because it's a complicated issue, and people make some pretty bold claims when it comes to the impacts of social media. And while from a personal perspective I may be biased to think that, yeah, it seems like it's like cigarettes, seems like people are going crazy, like, come on.

I want to make sure that we're responsible here. Of course. Of course. So the people who stopped using social media tended to fill that time with other social activities rather than with other forms of news consumption. And that, again, gets to why we are on social media. It's not to – I mean –

Why are we on it in the sense of what's the compulsion that brings us to it, rather than the kind of conscious choice? And it's the social impulses rather than the desire to be informed. But the tool that social media often uses to derive whatever effect it's trying to derive from you is, quite frequently, news content. So you're going to be encountering less of it, because that's the tool that is used to provoke outrage, to provoke polarization. You're right that people who were not on the platforms were less

informed about the news, but they were typically less informed specifically about news stories that were evocative of polarization. So yes, you were going to know less, but the things you were going to know less about were going to be the things that were going to provoke anger in you. Much like if you were following less celebrity gossip that made you angry at celebrities, you would also know less about what was happening with celebrities. So –

The claim here is that social media makes our politics more polarized. Is that the main thing you sort of focus on in terms of the effect on our politics? Or are there other effects as well? I'm glad you asked that, because I like that study just because it's one very small indicator that it's having an effect on politics. But I think that that

both understates and overstates the effect on politics. I mean, like you said, there's lots of different ways to encounter news. But I think it understates it because that's just something that happens at the far end of a causal chain of how it changes us. Most of how it affects us has to do with sentiment, emotional sentiment, and what we perceive our community around us to believe. I can give you two examples, again, two studies that I think speak – We love a study. I love a study. So this is why this is a great podcast. Yeah, exactly.

that speak to how the platforms hook you in, what they're doing to you, what they're changing, which is on a much deeper level than how you think about the other party. One of them was a study that I really loved where they had a group of people and before the experiment, they sat them down and they tested all of these people on their level of internal outrage. How prone were they as people to feeling or expressing outrage?

And then they had a group of those people send a tweet on a fake Twitter platform, so that they could control the experience, that had some outrage sentiment in it. Because we know from a lot of other research that outrage is the sentiment that the platforms reward really above all else. And when I say reward, what I mean is that it puts it in front of other people so that it will receive a lot of engagement. And outrage here means something more specific than just anger. It means anger at someone

on behalf of a social group for a social transgression. So, um,

you know, if you were to knock my chair and I was maybe a little annoyed with you, that wouldn't be outrage. But if you did something that I felt to be morally offensive, or that was against the group, or I saw you cut in line at the, you know, cafe, that would be outrage. So they had these people send outrage tweets, and then they showed them their fake tweet as if it had received an enormous amount of engagement, which, again, is something that we know the platforms will work very hard to produce regardless of whether or not people out in the world actually like that tweet or not.

And what they found was that when people saw their fake tweet that received a lot of retweets and likes for expressing outrage, they became not just more prone to send tweets like that in the future, but their level of underlying outrage that they felt increased, including when they were not online. Just when they were out in the world, they became more prone to outrage. And the reason for that is because we –

internalize social cues very powerfully. So if we feel that outrage is something that is going to be rewarded, that is something that our brain tells us to do more of, and it rewards it so effectively that it actually makes us feel more outraged, so that we'll express more of it when we're out in the world.

So that, I think, is a powerful way of showing that the platforms are training you to feel certain sentiments. And there's a second study that I thought was really powerful for showing how it impacts not just emotional sentiment but politics.

If you show someone just a news headline about – in this study, they used refugees – a headline that had a false claim in it that was meant to provoke outrage: refugees at the border are all criminals, or they're coming here to dilute the population, or they were sent by George Soros.

In a news context, people were actually pretty good, including Republicans and conservatives they recruited for the study, pretty good at saying that headline is false. And then when they would then say, do you want to share that on Facebook? They would say no, because I know that it's false.

And then if they flipped around just the procedure for how they conducted that study, if they showed people that headline in a Facebook context as a Facebook post and then said, do you want to share this? The people in the study would say, yes, I want to share that. And that speaks to a lot of research that we have from

social scientists and sociologists that when you experience something in a social context, the way that your brain works is not to look at it intellectually and judge whether it is true or not. The way that you look at it in a social context is: is this something that my community thinks, and does it want me to think that? And if you reach that conclusion with the social part of your mind – and in the neurological studies they actually show a different part of your brain than the analytical part firing first, so it's literally a different part of the brain that is examining this –

then your brain convinces you to believe that it's true. So you have this combination of these two forces: these platforms are training you to express outrage and to express other kinds of sentiment that we know are very polarizing, like tribalism, us versus them, and taking very extreme stances on behalf of the in-group and against the out-group, and then they reward that, show it to you in a social context. I think we have a lot of evidence that that is something that can change a lot of things, and politics, I think, is one of them.

The argument that you're making here, though, is not just that this is the outcome of social media. In a way, you're describing, you know, someone gets emphysema if we're going to, you know, use the cigarette analogy or someone gets lung cancer.

But there's another piece to your argument, which is the adding the nicotine part, which is what gets us here. So what is the pipeline? Right, right, right. Yes. Yeah. The distortion, that's a really important part of it. So you open up your social media platform, your Facebook, your Twitter, um,

Or, to a lesser extent, if you open up YouTube, what you think you are seeing, because it's what the platform presents itself as showing you, is the sentiments, the thoughts, the ideas, the political attitudes of people in your community. Maybe that means friends that you follow, or reporters or celebrities or whoever is in your network; you think that you are seeing people,

their emotions and their thoughts, being fed to you through the platform. And that is false. That is not what you are seeing. There's an enormous amount of content on social media, more than you could possibly look at. So what you are seeing when you open it up are the things that these platforms have selected for you to see. And they have these very powerful algorithms, and a lot of these other systems, that have predetermined the specific sentiments and

kinds of words, combinations of words, that they want you to look at, because they find that to be very effective, and have, you know, correctly concluded that those are going to be effective at getting you to not just read and scroll but to engage yourself, and then have presented them in such a way as to pull you in, and maybe have even shown you content from outside of your network. But your brain doesn't

register that, because what you think you're looking at is the sentiment of your community. And that is part of what makes the distorting effect so powerful: you log on and you think everybody's polarized, everybody's outraged, and maybe the platform is playing a role in creating that, but it is a much different sentiment than what people in the network actually feel.

How different is that from, say, a media executive coming to understand that Rush Limbaugh is extremely effective at getting people to tune into his radio show and keeping them hooked throughout the day or selling a tabloid?

Like I've seen many times on my way out of the grocery store that like Amelia Earhart was found. She had a plastic surgery makeover. Oh, that's true. I think. Sure. Yeah. Like in a way, there's nothing new under the sun. How is this any different from a media company realizing what stokes outrage and attention and using that to sell all of the other media forms that already exist?

Right. It's a great question. And the two differences are: you are perceiving it not as a headline that is telling you something; you are perceiving it as the consensus sentiment of your community. And there's actually a name for that, common knowledge, which is a very benign-sounding name, and it can be very benign. But there are a lot of studies showing that if you put people in a social circle and you find some way to make them think that everyone in that social circle believes X,

They will not only be likelier to think that X must have something to it, but they will really internalize that and believe X themselves. And the other component of it is the participation because that's what social media really wants you to do. It doesn't just want you to look at it and scroll and think, boy, that sure is an angry tweet. I guess I'm angry too. They want you to participate in it yourself by posting something. And that is when you feel this social feedback that is incredibly effective at getting you to spend more time on the platforms, which is what they want.

And that also ends up changing your underlying attitudes, emotional sentiment, and even your sense of right and wrong along the way. And that's different from the community that forms around, you know, a charismatic podcast or Rush Limbaugh or whatever. You mean this charismatic podcast? Yes, exactly. That's exactly what I'm talking about.

It is. I mean, community, you're right, a community can be artificially constructed, and has been since the beginning of time. That's what cult leaders do. That's what politicians do. The system is doing something that is very similar to that, except that it is coming from an algorithm and is making you feel as if it is something organic coming from your community. And actually, that community formation, I'm really glad that you raised that, is a really powerful way that social media has inculcated

a lot of the most extreme things that have come out of it. I spent a lot of time in the first chapter of the book talking about the early emergence of anti-vaccine networks, because it was like 2014, kind of before we took a lot of this seriously, and I think it's a good way to show that it's not just crazies who this happens to. It's really everybody who can be pulled into this, and it shows how it works, which is that

moms were one of the first big early-adopter communities on social media. Moms and gamers, funnily enough, were like the two networks that really came on in big numbers. And both of them got radicalized in different ways by the platforms. With moms, for example, it was through the creation of this community, and through something that,

just to pick on one platform in particular, though it's not just Facebook, Facebook's groups system figured out, which is that if you are a mom on Facebook and you're trying to find parenting tips and it just recommends you into a regular parenting-tip group,

You might spend 10 minutes on the platform picking up some tip about early-morning feedings. But if it can recommend you into a group that has health conspiracies, that says maybe doctors are lying to you, that is built around something that is a little bit more provocative and feeds into a sense of danger, feeds into a sense of threat, and feeds into a sense that you as a mom,

or moms collectively as a group, are besieged by some terrifying outside threat – and it's, you know, Bill Gates and his nanobot vaccines, or it's, you know, doctors are trying to push untested treatments – recommending you into those groups is a way to get you to spend, instead of 10 minutes,

an hour or two hours, because then it becomes this sense of community that is radicalized around this idea, in this case that vaccines are scary or that vaccines are untested. And that becomes incredibly addictive for the people in it. And of course, it's not just that they present you with a group and then you join it and you're an anti-vaxxer; it's a much more sophisticated, gradual process of kind of frog-boiling you into it. Right. And you're saying the value there for the social media company is that the more engagement, the more they can sell ads against your attention. Yeah.

Exactly. You know, I'm curious here because the vast, you know, basically everyone uses social media. The vast, vast majority of people do not become political extremists or conspiracy theorists on social media. So,

First of all, what indicates that someone is perhaps more prone to those things? And then two, are we able to see broader trends? Like, I've read plenty of the stories that you describe in your book, and they're really compelling on an individual level. But I'm also curious if we're able to see, over time and across populations, that,

you know, as social media has spread, our information environment has done X, or people's understanding of the world in a factual sense has declined in Y way. You know, like, are we able to see this as a broad-picture trend? Right. So it's kind of the million-dollar question, because it's tough to separate it out. You can't do an experimental trial for, like, well, what if we did a world without social media for 10 years, and what would be different? So you had two questions, the first of which was who's prone to heavy social media use. Yeah.

And then let me answer it and then you can remind me. Yeah, yeah, yeah. Okay. So there's actually some really fascinating research on specific personality traits and even specific neurological traits that make people more prone to heavy social media use and addictive social media use. One of them is people who have a...

I'm going to get the name of this specific part of the brain wrong. I think it's the nucleus accumbens. I was on a podcast with someone who was an actual doctor and I mispronounced it slightly, so now I'm really gun-shy around it. But it's in the book, and you will see it spelled out correctly, unlike my terrible pronunciation. It's a part of the brain that is associated with dopamine production and associated with addiction.

Yeah, yeah. So I guess that part makes sense to me. But the part where like, if everyone's using social media, and social media is so effective at bringing people down these rabbit holes and blah, blah, blah, blah.

The vast, vast majority of people haven't fallen down those rabbit holes. Like what's different about the people who do? Why are we not all QAnon? Sure. Okay. Right. I see your question. And because we're not all QAnon, because 95% of us are not QAnon, is it really the social media that's kind of causing this in the first place? Sure. So I think that's an important question because I think that the most –

profound effects of social media, in terms of the overall societal and political impacts, are on people like you and me: the vast majority of social media users, who check it, you know, eight, 10, 15 times a day, which I think is about the median,

who are not in QAnon and are not anti-vaxxers, but it is still exerting that pull on you, even if it's a little gentler, even if it's a little subtler. 100%. I'm definitely addicted to social media. You know, maybe not as badly as my boss, Nate, but, you know, I'm still a bit addicted.

So how do we know that overall, the big picture effect, right? Well, no, because there's a difference between, okay, people are addicted. That's probably not great. But, you know, I'm also addicted to caffeine, and, like, that's not a societal problem. The societal problem comes when...

people are radicalized or misinformed and behaving in certain ways in society as a result, right? I would actually maybe challenge that. It might be. I mean, it's hard, because how do you measure one thing versus the other? There's definitely an argument that the extreme is on the fringes. And I understand that I do owe you an answer to your question about who those people are and how they become that. But I think that there is a case that it is the

overall majority of us for whom the effect is subtle. And as individuals, you and I are not going out and doing, you know, horrible things because of how social media changes us. But when you multiply that out by a couple billion users, for one example, another study, in Germany: this pair of researchers looked at, like, every town and city in Germany over, I think it was like a two- or three-year period. You've read the book more recently than I have, so if I get a number wrong, like, please do correct me. Um,

And they controlled, using a number of different factors, for whether people in a community were more or less likely to use social media. But they zeroed in on Facebook specifically, rather than internet usage generally. And what they found was, when –

I'm going to try to get this number right for you. When Facebook usage in a particular area, and again, controlling for these other things, was one standard deviation above the national mean in Germany, that ended up producing all of these changes in how people in that community saw, I mean, presumably lots of things, but the one thing they were measuring was attitudes towards refugees. Because this is a time in Germany when there was a big influx, a million refugees from the Middle East and Asia, and it was a big political issue.

And the kind of tip of the iceberg of this effect was that they found there was an increase of something like 10 or 15 percent in the likelihood of vigilante violence against refugees in these towns. But their argument was not that Facebook is making people go out and attack refugees. Their argument was that Facebook is pulling these communities as a whole, because these are communities where Facebook is used so widely. So with this effect

individually playing out, even if it's, you know, making people 5 percent, 7 percent, you know, whatever, more antagonistic and hostile towards refugees, one indicator of that is that the person who would otherwise have been right on the line of whether or not they were going to commit vigilante violence gets pushed over the edge.

That was something that really stuck with me, because that was a way, I think, to show – and there's a lot of different ways to kind of attack this question, so that we're not hanging it all on just one study – a way to show that overall attitudes in a community towards, say, an outgroup or perceived outsiders, which is something that comes up a lot when you're looking at social media, become more hostile when people spend more time on the platform. Which makes sense when you know that the platforms

promote moral outrage, when they promote tribalism, when they promote the sense of us versus them, and especially when you know that they do it all in a way that bypasses our normal kind of cognitive checks against misinformation. Because another big part of this, which I found more anecdotally: when you go to these towns, people believe all sorts of crazy rumors, and people are more inclined to believe, you know, not the QAnon-level stuff, but just a little bit of misinformation about, you know, refugees are dangerous, they're out to get us, so we should, you know,

deport them. A sort of red flag that sticks up for me in the story of how social media is changing our world and our politics is that it's oftentimes the case that folks on the left will say, oh, it's, you know, contributing to the far right. It's responsible for January 6th.

If social media causes extremism and political extremism in particular, like is there something special about it that only radicalizes the right?

What's going on there? Or is it radicalizing the left as well? Or is it something beyond social media? So that is, I think, a fascinating question that we have a couple of data points on, but I will be honest, I don't think we have a conclusive answer for it. There's one study in particular that Twitter actually conducted, an internal study, in which they found the platform promoted engagement more with tweets that had a right-wing valence. And that was something that

played into a theory, and I think at this point it's just a theory, although it's a compelling one, that social media's effects can be more pronounced on the right, because the political right, and not just in the United States but in any country, tends to be more preoccupied with a sense of, um,

traditional demographic boundaries and a sense of, you know, us as a community, where it's, you know, we're the white Christians, or we're Hindu nationalists in India, or the Burmese Buddhist majority in Myanmar, all kind of like ethno-nationalist movements. That's what you call it when it gets to a much more extreme end. But a sense of, like, you know,

questioning whether we want refugees in our society, or immigrants in our society, concerns about demographic change. These are all ideas that are associated with the right and that play more into the us-versus-them tribalism that we know the platforms promote, because it's more effective online. So there's some reason to think that it has more of an effect on the right. But I think it's just as likely that this is an effect that we're seeing coming more from the political system, which then manifests on social media, where I think it's fair to say that there is something of a

maybe you call it a crisis of extremism or political extremism on the American right at the moment. And so that is naturally going to manifest on the platforms. But you do certainly see a lot of these effects on the political left in the United States too, and the pull towards polarization. And all you have to do is spend some time on Twitter where the left tends to be a little bit more culturally dominant. And it's very easy to see false tweets, misinformation that feed into a sense of

partisan tribal antagonism, you know, going viral pretty regularly. The idea being that, sort of, the left that would say social media is radicalizing the right should perhaps look in the mirror? Yeah, I think that's fair. Sure, yeah. And like I said, the things that are happening on the right, there's no way we could pin it all on social media, of course. So it's going to be more pronounced there because there are more things happening. But yeah, for sure, I think you see it on the left as well.


Do we see – I asked a version of this question before, but when we add all of this together, do we see that globally there is a rise in extremism and political violence? Yeah.

I mean, if you were to talk to people who study those phenomena, I think they would say certainly in Western democracies and in democracies globally. But they would also say that there are a number of travails with democracy globally at the moment. So, of course, again, that's not something that we're going to pin completely on social media. I will say that –

you know, I'd say anecdotally, you can look at a number of countries, and I looked in the book at the United States, Western Europe, Brazil, India, a few countries in Southeast Asia. You know, at some point we get pretty close to just all the democracies. So it's possible that a lot of these, the way that social media is playing into, say, Hindu nationalism and extreme anti-Muslim violence in that country and in a few neighboring countries, is particular to dynamics in that country. But when you do see the pattern playing out over and over again of social media giving rise to a

political extremism that is specifically focused on antagonism and vigilante violence towards outgroups, you do have to question whether it's a coincidence. Yeah. I mean, when reading through the different examples that you cite in your book, you keep thinking over and over again, like, okay, how is this different from many of the worst atrocities that

humans have committed throughout time, or worst impulses, right? I mean, you talk about genocide in Myanmar, you talk about truly the worst things that humans can do, and tie in social media. But of course, you know, 99.99999% of human atrocities in history were committed without the help of social media. And I think if you look at the data, we're in, interestingly enough, although it may not seem it if you read the news or if you look at social media,

this unprecedented era of peace, even for the last century. I think in the last decade, something like 0.04% of deaths were the result of violence, whereas in the first half of the 20th century, so throughout two world wars, it was 1%. You look a little beyond that, it was like maybe 2%. And if you look just since the rise of social media, like 2012 to 2020, there's a slight uptick and then downtick in terms of deaths that are attributable to violence.

But it still doesn't match anything from, like, the 70s or 80s, which was also a historically peaceful period. So we're in this historically very peaceful time, even with that slight uptick and then downtick in the past decade.

Is the idea here that, like, we're setting ourselves up for something much worse? Because, like, across the board, I don't see that along with the rise of social media, we've become much more violent or something like that. So, right. I mean, there's...

What you're trying to nail down is the effect of social media. Is it increasing political violence at a time when these much larger world historical trends are decreasing it, right? Because the decline of violence and mass violence is something that's been going on for like 80 years.

And, you know, if we see that uptick in a number of places because social media is there, or if we think we can assign that to social media, it does become kind of this hard question of how do you suss out how much is the effect of the one and how much the other. It's a totally fair question. Let me give you two examples that I think speak to the way we can isolate some effect of social media. One is

In Myanmar, that's a place where you can't pin it all on social media. And I mean, you can't pin all of any of these things on social media. You know, Mark Zuckerberg did not invent racial violence, and he did not invent the act of genocide, any more than Marlboro invented cancer. Like, we can say that cancer rates are down globally, but then they also go up if smoking goes up in a place, and you can still say that, okay, cancer is down, but smoking still causes cancer.

In Myanmar, I was there in 2014, before there was any social media, and there was a lot of racism and there was a lot of racial resentment and there was a lot of suspicion towards Muslims in the country. And then I went back three years later, in 2017, to report on it during the genocide. There were a lot of factors that contributed to the genocide in Myanmar, but it was hard to avoid, and this is something that is anecdotal, but I will get to one that is more empirical, I promise, it was hard to avoid that social media was just

everywhere. I mean, anybody you would talk to who was in any way involved in the genocide, whether in the government, or a clerical figure, or just a person on the street, you would ask them, you know, what should we do? Or what do you think your country should do with the Rohingya, who are the Muslim minority being targeted by this ongoing genocide? And they would pull out their phone and they would show you Facebook, and they would say, Facebook, you know, showed me these horrible things that they're doing. I'm on Facebook all day discussing with other people the horrible ways, or, you know, the things that we need to do to get rid of the Rohingya threat. There was a lot of research after the fact that showed, in a number of ways, the way that

the system would behave. Because the obvious question is, well, was that sentiment just there anyway, and we're just seeing it neutrally reflected through Facebook? A lot of the most viral, most viewed posts on Facebook, the ones that had zillions and zillions of shares saying we need to go kill all the kalar, which is a derogatory slang term for Muslims in the country,

were from actually very small accounts, accounts that would get very little engagement on a neutral platform. But what happened, this effect that we're now very familiar with, is somebody would post this, and all of the other posts from larger accounts that would urge peace, from government accounts saying let's not go kill all the Muslims, or from influential figures in the country, cultural figures who were trying to tell people to calm down, they would get a lot less engagement than this really, really

hateful post of incitement. The platform, having learned that it was going to be a great way to engage people, would pick it up, pass it around, and it would go super viral, and it would route people into groups that were focused on hate and genocide over the groups that were not focused on those things. So it didn't invent antagonism towards Muslims in the country, which goes back decades.

But it was a powerful enough accelerant that even the United Nations, after the fact, called out Facebook. It said that Facebook had played a determining role in the genocide, which I think was kind of a big moment for the reckoning. So the example that is more, I think, empirical is Sri Lanka.

It's a small country off the coast of India; coincidentally, also a country that is majority Buddhist and has a Muslim minority. And I was there in 2018 because there had been this explosion of communal violence. So not state-led like in Myanmar, but completely grassroots, and it blew up in a bunch of different villages at once, targeting the Muslim minority. And everyone was saying that it was linked back to Facebook. Okay, what does that really mean?

And going there, it was very clear that something similar had happened, where you had these very fringe figures who, before they'd gotten on Facebook, had a middling-to-small audience, and then got on Facebook. And these are people who ran hate groups, basically, like the equivalent of the KKK for Sri Lanka.

They would suddenly get these huge audiences, would go ultra-viral with this misinformation targeting Muslims, and then hate speech, again from very small accounts, would suddenly get a suspicious 10,000 likes, 10,000 shares. Posts inciting violence against the Muslims would go viral until you got this accumulation.

And I would go talk to them, people who two or three months before this had happened had not believed that Muslims were particularly a threat to them, had never evinced any show of violent racism, had never believed these conspiracies beforehand, and who suddenly believed them enough to go commit all this violence. And the government, the democratically elected government, even though they didn't want to do it, in the middle of this violence they shut down access to all the platforms, and the violence immediately dissipated.

And a colleague who reported the story with me was talking to a government official, and he made the same point that you did. He was like, look,

He was almost apologetic about it. He was like, social media did not invent racial violence in our country. It did not invent the history of animus between Buddhists and Muslims, and it didn't invent this propensity that people have for racial violence. He put it, I thought, really beautifully. He said that the germs were ours, but Facebook is the wind, and that Facebook had spread these sentiments and multiplied them in a way that would not have happened, and had not happened, absent the platform.

Reading about this in your book did make me think about the studies that have been done on the role that radio played in the Rwandan genocide, and that it went a long way in terms of spreading the racial hatred that ultimately led to the genocide. And you can look at that and say, well, ban radio. Like, no one's going to do that. Sure. In a situation like this...

then what do you do? So I would ask the experts, the people who are studying this a lot, and their answer was actually usually pretty simple. Not simple in terms of ease of execution, but simple in terms of how it would actually work. They would just say, if you just turn off the engagement-maximizing features of the platform, those are the things that are driving all this. And if you just have a more actually neutral platform that doesn't have

algorithms that are sorting and ranking posts based on what's going to give you the most engagement, that doesn't have this groups recommendation feature, that maybe even doesn't have likes and shares. Even Jack Dorsey, the head of Twitter before he left, said maybe having that little like counter on the bottom of tweets is something really hard to control in our society. If you get rid of that and you go back to the social media that we had before, because these platforms did exist before 2006, 2007,

Like radio, like TV, it has the propensity to host horrible things, for sure. But you lose the incredible training, manipulating, distorting effect that the platform has. And that is a platform that is actually much safer for us to have, but is also vastly less lucrative. The social media companies that existed that were like that, like MySpace, were a tiny, tiny fraction of what we have now. So

no one has an idea for how to actually bring that about, but it does technically exist. It's possible to build that. So we've talked about truly some of the worst effects that social media can have, and I obviously hope it never gets to anything close to that in the United States. But we're focused on the midterms right now, and I think here...

There has been a lot of focus post-2016 on how social media shapes our elections. How do you process this? So I talked about this in the book because at some point my editor was like, you know, you should try to answer the question of did social media decide the 2016 election? And the ultimate kind of stance that I took on it is that I think that question both

overstates and understates it. It both gives social media too much credit and not enough credit. I mean, an election that close, you can point to whatever you want. But social media's effects on us, I mean, all the things that we're talking about, the emotions that it trains in you, the way that it

trains you to reorient the way that you situate your own identity in your social community, and that can be race-based or partisanship-based, the way that it changes how you see and interact with politics, and especially the way that it changes how you interact with information, the way that you look for information that will confirm your identity rather than looking at it analytically. Those effects are so multitudinous and so profound that I think, by the time you get to the point of, okay, you're walking to the voting booth and it's time for me to decide who to vote for, so much has already been changed in you, and in the way that you gather information, and the way you place yourself in the world, that I think it's almost understating social media's effects just to look at that last stage of it.

How do you address the complaints from each side? The complaint in general is radicalization and extremism. I think the complaint also, from the left, is the spread of misinformation online that may help promote far-right ideas. True. And even helped, in the case of 2016, an adversarial nation shape views about the American election. I think the argument from the right is like, hey, our content is getting downranked or censored. Is there evidence of that?

No, I don't think so. I think it was Common Cause that looked into whether there's evidence of downranking or censorship and whether it plays out more on the left or the right. And it's...

It's pretty politically neutral, and it's not particularly common. And I really think everyone kind of wants to focus on, and understandably, because it's where we're looking at politics, we want to look at that last step, the surface-level effect. Right, it's the elections. It's conservatives talking about censorship. It's people on the left saying maybe there's not enough removal, why did I see a Nazi on Twitter? That goes to show that Twitter loves Nazis.

And I really think that you want to think of social media's effects, and this speaks to your last question about elections too, as largely, overwhelmingly, atmospheric and subterranean. And I'll give you an example of this that I think is a better answer than what I gave you before to your question about elections. There's, I call it, a set of studies or a set of experiments in Brazil that I write about in the book, on YouTube. Because I'd heard a lot that YouTube had something to do with the Bolsonaro phenomenon. This is the

guy you could, I think, fairly call far-right. Sometimes they call him a mini-Trump, but he's very much his own thing. He became president of Brazil in 2018. He kind of came out of nowhere. He actually started as a YouTuber, but had always been a lawmaker, and then all of a sudden got a huge audience.

We wanted to understand, was it actually true that YouTube had played a role in Bolsonaro's rise? Or did he just happen to be on YouTube while people were flocking to him for some other reason? And how could you possibly attribute that to the systems? And something that we, I and these researchers I worked with, a combined team at a Brazilian research university and at Harvard,

found was that YouTube's recommendation system wasn't just recommending a lot of videos about Bolsonaro to people who were not previously Bolsonaro fans or had not expressed far-right attitudes. And they measured this both by the recommendations and by following comments on the videos, because people leave lots of comments, so you can track how the sentiment changed over time. The system was recommending enormous numbers of Brazilians into this giant

network, basically, of channels and videos that the system had created. Because the way that YouTube works, the way it drives the overwhelming majority of its traffic, is you watch a video, and then at the end of it, it plays another video or it recommends a few videos to you. And YouTube has said that this is where we get a bajillion percent of our traffic, it's why we're so profitable.

What the system had learned to do is, for anybody who expressed any interest in news and politics, or even not in those topics, if you were just a user in Brazil minding your own business watching a gaming video, which is something we heard a lot of people got pulled in through, too,

the system would sooner or later route you into, think of it like a spiral, this spiral of channels and videos that would start not overtly political, but would just be something that was a little engaging. Like there's this guy Nando Moura, who was just a popular, Ben Shapiro-style YouTuber, who would sometimes get into politics, sometimes do comedy, sometimes play video games, play music, and would start to pull people into this network. It would show them one video after another that would slowly bring them around to

politics that were generally aligned with conspiracy theories, with hatred of the government, with the far right, and then would bring them to Bolsonaro. So Bolsonaro might just be 5% of the YouTube diet, but it would be a worldview that had not existed beforehand. In the real world, you would not find all these ideas linked together; the system pulled them all together into this network, at the center of which was Bolsonaro. What they found in this study was that,

pulling in these huge numbers of users and then tracking them over time, they would watch more and more of these videos, their sentiment in the comments would change over time, and then they would get spread out to other channels that were not Bolsonaro-aligned, regular, you know, news channels, political influencer channels, and those users would then push those same attitudes out in the comments. So there were a lot of different ways that we attacked this in the investigation, but this was one way that we found that

that last step, when you go to the polls and vote, are you voting for Bolsonaro or not, it wasn't that, oh, I watched a YouTube video that morning and it made me decide to become far-right and back Bolsonaro. It was that by the time you were even thinking about politics, you had been sucked into this much larger community and identity, this worldview and this set of ideas that the platform had learned was especially engaging, but that just happened to have this guy at the center of it.

One thing I really like about this book is how much you dig into the academic research. You know, you paint these pictures and you speak to a bunch of different people who help tell the story, but a lot of it is based on research. As we've kind of discussed here, there are things that are easy to prove with an academic study. And even that's a complicated statement: there are things that are provable with an academic study, and there are things that aren't. Yes.

Do you think we have all of the information we need to kind of be like, these are cigarettes. This is bad for us. Right. Change it now. Or like, is there something you're still looking for after writing this book? Like, is there more information we need or is it time to close up shop and just say like, tell your kids to get off. Right. You know, you should get off yourself. It's damaging to our elections. Shut it all down around election time. Like, where are you?

That's a good question. And honestly, you might actually be in a better position to answer that, because I'm someone who spent four years mired in it. When you spend four years researching something, you know, you buy into it a lot. So what matters more is what you, a smart person who is coming to it fresh and coming to it very informed, think about it. But I think that there are a number of things.

We have a fuzzy idea, but there's a lot more detail that it would be, I think, important and interesting to learn about. The systems are also changing frequently, which makes it kind of a moving target. So we get like this really good sense of clarity and then things change and now we don't understand it as well. And TikTok comes along. Right. I mean, TikTok is actually, I think that's the, I have no idea what the effect is. Sometimes people ask me, they're like, okay, what's the effect of TikTok? And I talk about it a little in the book, but I don't know. I don't think we have the data or the research to know what it is actually doing to you. I spend...

way too much time on it. And like, I'm sure it's making me crazy in all sorts of ways. Well, that is kind of, I guess, where my final self-reflective question comes from. You say like, oh, you would know better as someone who's reviewing this information fresh. Like, I know from my own experience, the way that

Instagram or Twitter is hooking me isn't by showing me like neo-Nazi content or frankly like any political content. I spend my time watching some French guy make crazy structures out of chocolate and cake and like I'll just – I've seen that one. It's great. When I was supposed to be writing these questions this morning, I was looking at Instagram being like, oh, wow, you can like make that out of chocolate. Yeah.

So where is it? And so my question is, is that also bad for me? Because like I said, so many people actually find political content off-putting, don't want to talk about politics. You know, my closest friends are sending me pictures of dogs doing crazy stuff, cats doing crazy stuff, just little clips from our favorite TV shows of people saying hilarious things. Like,

I get where the argument is that this can lead to political radicalization and that that's deleterious to our society, right? Like, you can argue here and there that humans have impulses that have led to atrocities all along, but I get the connection there. Sure. At the end of the day, is it also the kitty cats and the chocolate statues that are bad for us? Right.

So that's actually a good question, because Instagram is the one social platform that I use regularly, even though I try not to. Is it bad for you? So I think what I will say to that is, I mean, I think if you're a kid, it's bad for you. If you're a kid, it's bad for you because your social need is much higher; it's exaggerated. So it's like, we think of bullying. Oh, well, you don't know me. Okay. I need people. No, I'm kidding. We do. I mean, we all need people. We all need it. Absolutely. It's true. On average, I would say. Adolescents have a much higher need. So the

social feedback, the effect of that, is extremely exaggerated. One of the things that I think is deleterious about social media, even if it's just the doggy videos, is the sense of validation you get from likes. Like, I posted a picture from my office this morning just because I thought I caught a cool angle, and I've checked several times to see how many likes it gets. And if it gets very few likes, I will feel not great. And if I get a lot of likes, then it will feel good, because I will think that I'm getting the sense of social validation that we all crave. But

we know from the research that that's really just a hit of dopamine, and you are not actually getting the chemicals that are associated with social connection. So, and this is a roundabout way of answering your question, it's bad for you if you are using it for politics, for news, because we know where it leads you. I think it's also bad if you are using it for a sense of social connection, which is the thing that I find myself going to it for.

It presents it to you in this kind of gamified, numerical way that is really exaggerated, that is manipulative, because, again, it's not actually, do people like your photo? Do they actually like your cake, or do they like your dog? It's just, did the system choose to push it out to people?

Right, yeah. You describe how, back in the day before social media was more publicly shamed, they described this phenomenon where you go on social media to get a social interaction because you crave social interaction, and you get that hit of dopamine, but then you don't get the actual

well-being benefits of spending time with another person. So that need becomes even greater, so then you go back to social media more and more and more. And you basically describe it as something like a slot machine in that way. So, I'm so glad that you mentioned that one. What blows my mind about that exact example is that I didn't come up with it. That was something that was used by a consultant who worked in social media, who was consulting for the companies, who said, you should do this: have your product simulate an exaggerated version of the effect of getting a real social interaction, because it will addict them, both by the dopamine boost and, like you said, because you're not actually getting the benefit of the social interaction. So you chase that high like you do with a slot machine, and you keep going back to it. And they did do it. So...

If that is happening to you, then it's not good for you. Even if it's being smuggled through a French chef making chocolate. Right, yeah. I mean, if you're just consuming it passively, then it's not so different from just like watching TV. But there's a lot of research finding that when you shut it off, on average, you just become happier and your life satisfaction goes up. And that average is...

broad enough that it makes me think it likely applies to people who are not on the politics version of it. But it's for sure the case that if you are on more benign versions of the platforms, then the effect is, I'm sure, lower.

So the general thing people should do is just get their news straight off of podcasts, actual newspapers, et cetera. That's your advice to folks. But sort of the most urgent thing is how you're consuming news. Don't consume news in a social context. I actually think that, I mean, yes, you should definitely not consume news in a social context, because the part of your brain that's processing it changes. Maybe.

My catch-all piece of advice, which I think encompasses that as well. This is a social context, though. Yeah, it is. Absolutely, yeah. I just made an uh-oh face into the camera for a second.

So maybe you shouldn't be listening to this, folks. Turn it off. It's a... I think a podcast is fine. I mean, it's unidirectional. It's not being filtered through that sense of social feedback. If we said there are, like, 10,000 people agreeing with everything that I say, then that would feed into people's sense of false conformity. Yeah.

The broad piece of advice that I give to people is just to think about it like a drug because it acts like a drug, not just in the sense that it's addictive, but in the sense that like any drug, like caffeine, like nicotine, like alcohol, that it changes your brain chemistry a little bit and then it changes your emotions a little bit. And, you know...

That doesn't mean throw away your smartphone and never use it, which is impossible because like social media has dominated the world. So we all have to use it. But in the same way that like,

You know, I will have a glass of wine tonight, but I know to wait until the evening to have it, and I know to have, like, one or two, and I know to only have it in certain situations. You can just do the same with social media. And also, just in the sense that if I, you know, went out with friends and had two or three drinks, I would be able to identify its effects on me. And I would know that if, I don't know, someone says something and I get a little annoyed at them, okay, that's the alcohol producing that effect, it's not coming from me. So I would know how to cope with living with its effects more responsibly and safely. And I think that that's my advice on how to live with social media, because it's in our world now, like it or not: use it in a way where you think about what's a responsible way to use it, and just be as conscious as you can of the ways that it is pulling at you and distorting how you're thinking and feeling and behaving.

Well, I think we're going to leave it there. I appreciate you entertaining both my more skeptical questions and my own personally interested questions about how social media is affecting me. And I'm glad we could also do this in person. Yeah, me too. Yeah. Thank you so much for having me. It was great.

Max Fisher is the author of The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. My name is Galen Druke. Nash Consing and Kevin Ryder are in the control room. Chadwick Matlin is our editorial director, and Emily Venezky is our intern. You can get in touch by emailing us at podcasts@538.com. You can also, of course, tweet at us with any questions or comments. If you're a fan of the show, leave us a rating or review. Oh, you know, give the podcast positive social feedback, a rating or a review in the Apple Podcast Store, or tell someone about us. Thanks for listening, and we will see you soon.