Facebook's Plan To Be Cool Again

Publish Date: 2022/8/17
Land of the Giants


When you're running a small business, sometimes there are no words. But if you're looking to protect your small business, then there are definitely words that can help. Like a good neighbor, State Farm is there. And just like that, a State Farm agent can be there to help you choose the coverage that fits your needs. Whether your small business is growing or brand new, your State Farm agent is there to help. On the phone or in person. Like a good neighbor, State Farm is there.

Silicon Valley Bank is still the SVB you know and trust. The SVB that delivers human-focused, specialized lending and financial solutions to their clients. The SVB that can help take you from startup to scale-up. The SVB that can help your runways lead to liftoff. The only difference? Silicon Valley Bank is now backed by the strength and stability of First Citizens Bank. Yes, SVB. Learn more at svb.com.

Okay, I'm here to help answer the question. Is Facebook cool? So recently we conducted a very unscientific study. We wanted to get a sense of how young people feel about Facebook. I don't really feel like a lot of people in my generation specifically use Facebook that much.

They did not hold back. Yeah, no. I'm good on Facebook. Every friend request I've gotten for the past three years has been an old family member that I haven't talked to in years or the bully in my high school. It's mostly just, from my observation, like burnouts, people who peaked in high school, some like moms, you know, random creepy people. Everybody who gets into a pyramid scheme messages you on Facebook to try to get you involved.

A chill just ran down Mark Zuckerberg's spine. Because these are the exact users Facebook has been stressed out about losing for years. It's definitely not my first choice for social media, but I do like the idea of being able to join individual communities and being able to hear from people who I may not have otherwise have known of or heard from. I just keep, like, all information off of there about myself because I already consider it kind of like a dying app. I don't even really know the purpose of it.

Okay, so Facebook is far from dead. It has nearly 3 billion monthly users. But this feeling young people have? It suggests that something has gone wrong for Facebook.

Because the company that defines social media, how we connect with each other and express ourselves online, it's now struggling to stay relevant. So Facebook is undertaking one of the most drastic shifts in its 18-year history. An overhaul to the beating heart of the app, the invention we started our series with, the news feed. That place where you've gone to scroll through updates from your friends and family, it isn't going to focus on your friends and family anymore.

The future of the feed will be less social. Which is ironic, because Facebook just spent years trying to make the feed more social. Today: what went wrong with the news feed, and how Facebook is trying to fix it. This is Land of the Giants.

February 2018. Adam Mosseri, then the head of Facebook's News Feed, was on stage at the Code Conference in California. And he was there to admit: the News Feed had gotten messy. As news and public content has grown in the ecosystem and grown really, really rapidly, that has, for a lot of people, crowded out some of the stories that they care about. As in, posts from friends and family.

News stories and recipe videos? That is not what people came to Facebook for. What's remained really consistently true is that when we ask people what they want out of the platform, the number one answer we get across different countries is they want to connect with friends and family. What he didn't quite say aloud? This crowded news feed was a problem of Facebook's own making. Let's rewind. Shirin Ghaffary is craving a Peet's iced mocha. Coffee run?

Once upon a time, our news feeds felt personal. Basically, everything that showed up was something from your friends. Status updates, wall posts. Here's an old friend who wrote, in town for Labor Day. That would never happen now. But then around 2011, Facebook started to take the news part of news feed seriously. Twitter had taken off and Facebook pulled a classic move. Copy the hot new thing.

Which meant recruiting publishers to the platform, selling them on the ability to reach a massive audience through the newsfeed. Established news organizations jumped in. Startups like Upworthy and NowThis also saw an opportunity to make viral content.

This pup is learning how to walk using an underwater treadmill. Meet a nine-year-old girl that just dominated a Navy SEALs obstacle course. This math teacher joined his students for a jump and dance party. With the newsfeed expanding, Facebook needed to figure out how to sort everything, to refine its process of algorithmic ranking, to make sure what was shown at the top of your newsfeed was what you actually wanted to see the most.

The company was tweaking these algorithms all the time. But around 2015, Facebook had a specific goal. We were really looking at time spent. The idea being that if you choose to spend your time with Facebook, we're providing you with a lot of value. This is Tom Allison, who ran engineering for News Feed at the time. He's now the head of the Facebook app at Meta. You heard him say time spent. Here's what that meant.

Whatever kept you on Facebook the longest, Facebook's algorithms would give you more of that kind of content. But eventually, Facebook realized this approach could have some unintended consequences. When you kind of optimize a system for time spent, you get people kind of trying to trick you into watching something that's not that valuable or clicking on something that's not that valuable. Clickbait. Facebook felt spammy.

This is the crowded news feed Adam Mosseri was talking about at the beginning of 2018. In optimizing for time spent, the company realized people weren't getting what they originally came to Facebook for. Mike Schroepfer was the company's chief technology officer back then. He explained this moment to us. We literally are looking at this stuff on a daily, weekly basis and trying to understand, you know, from a long-term value perspective. It's like, what is it that we can do in our products that

help people enjoy it and thus come back and use it. Because if they don't, we sort of have a fundamental business problem. And, you know, the thesis there is that the reason people come to our products is fundamentally about connecting with other people they know rather than sort of being, for example, just entertained.

The business problem of just being entertained on Facebook. People were consuming content, but they weren't posting as much or commenting on their friends' posts. Facebook's own research showed that users felt worse about their time on the platform when they were passively watching videos and not engaging with their friends. Facebook needed to uphold its promise as a social network to keep the newsfeed a place where people wanted to hang out. Not in small part because that's where the ads are. So Facebook did something drastic.

It changed the way the newsfeed worked. We made an adjustment to tune a lot of the ranking a lot further towards emphasizing friends and family content, highlighting meaningful interactions. Mark Zuckerberg on stage for an interview with News Corp CEO Robert Thompson.

That adjustment toward friends and family content he mentioned, it was a brand new era for the newsfeed, referred to internally by the acronym MSI, for Meaningful Social Interactions. Facebook announced it in January 2018. Gone was time spent. In a way, this was going back to basics, using algorithms to prioritize posting with friends.

But what does that look like technically? How could an algorithm recognize a meaningful interaction? Here's Schroepfer again. You ask yourself the question, how do you decide or how do you help figure out what it is people want? And one thought was that when I'm writing comments on a friend's post, that probably is a reasonable indicator that that is something that I found interesting.

Interactions, or basically content that people engaged with, would now be the measure of what came first in the newsfeed.
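Facebook has never published the MSI formula, but the engagement-weighted ranking described here can be sketched roughly in code. Everything in this sketch is an illustrative assumption: the signal names, the weights, and the idea that "meaningful" interactions like comments outweigh passive clicks.

```python
# Hypothetical sketch of engagement-weighted ranking in the spirit of
# Meaningful Social Interactions. Signal names and weights are invented
# for illustration; they are not Facebook's actual formula.

def msi_score(post):
    """Score a post by weighting 'meaningful' interactions more heavily."""
    weights = {
        "comments": 15,   # back-and-forth conversation counts most
        "reshares": 10,   # passing a post along to friends
        "reactions": 5,   # likes, hearts, and other one-tap responses
        "clicks": 1,      # passive consumption counts least
    }
    return sum(weights[signal] * post.get(signal, 0) for signal in weights)

def rank_feed(posts):
    """Order candidate posts so the highest-scoring appear first."""
    return sorted(posts, key=msi_score, reverse=True)
```

Under a scheme like this, a post with a handful of comments can outrank a post with far more passive clicks, which is exactly the behavior the MSI pivot was after.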

The result of that change was to remove about 50 million hours of viral video watching a day. Zuckerberg again, on stage with News Corp. Remarkably, Zuckerberg said he expected meaningful social interactions to lead to people spending less time on Facebook. His hope? That it would be time well spent. Except… So if you just have an engagement-based ranking system, it sort of like naturally goes off the rails.

Jeff Allen was working at Facebook when it made the pivot to this focus on engagement, on social interactions. At the time, another pivot was happening inside the company, one toward integrity. Integrity is Facebook's Orwellian word for employees like Jeff Allen, who are tasked with figuring out how to make its platform safer.

That could mean building defenses against spam or even state-backed influence campaigns. By the time Facebook was shifting to the Meaningful Social Interactions, or MSI, era in 2018, Allen says Integrity had become the hot team to work on. There was a really great moment, right, when it was like, let a thousand flowers bloom, and every org at the company sort of said, okay, let's think about, like, what does Integrity mean for our team? What does Integrity mean for our product? Let's actually dedicate people to thinking about this.

It didn't take long for employees working on Integrity to realize that, yet again, a big News Feed pivot was having unintended consequences. Here's Tom Allison. "MSI really, again, came from the spirit of people want on Facebook to feel like they can interact with one another. Comments were a very good proxy for that. But yeah, I mean, what we ended up seeing too was some people left gnarly comments. Some people abused that system."

Surprise! Not all interactions are positive. Comments could also be arguments about a political story or even worse. And people weren't always resharing the highest quality posts. Jeff Allen again. — Borderline harmful, sensational content fundamentally does better under user engagement. Mark Zuckerberg has even acknowledged this, right? — He did. — In November of 2018, Zuckerberg put up a lengthy post on his Facebook page.

In it, he explained that the closer content veered to violating Facebook's rules — so the closer to bullying or hate speech — the more engagement it was prone to get. Under an engagement-based ranking system, that so-called borderline content could be rewarded by Facebook's algorithms. A problem. And so what you have to do after that is build all these systems to demote things that you think are bad.

So Zuckerberg announced that Facebook would begin demoting content that got closer to violating its rules. So it's like, amplify everything that looks engaging, build all these systems to demote things that look like they're bad. You know, if it looks like misinformation, if it looks like hate speech, if it looks like something else, demote it.
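The amplify-then-demote pattern Zuckerberg described can be sketched as a simple penalty on the ranking score. This is a toy illustration under stated assumptions: the classifier probability, the threshold, and the linear penalty curve are all invented for the example, not Facebook's actual system.

```python
# Illustrative sketch of "demote borderline content": the closer a
# classifier thinks a post is to violating the rules, the more its
# ranking score is scaled down. Threshold and penalty shape are assumptions.

def demoted_score(engagement_score, borderline_prob, threshold=0.5):
    """Scale down a ranking score as content approaches the policy line.

    borderline_prob: a hypothetical classifier's estimate (0.0 to 1.0)
    that the post is near-violating content.
    """
    if borderline_prob <= threshold:
        return engagement_score
    # The closer to violating (prob -> 1.0), the steeper the demotion.
    penalty = (borderline_prob - threshold) / (1.0 - threshold)
    return engagement_score * (1.0 - penalty)
```

The tension Zuckerberg's post described falls out of the shape of this curve: without the demotion step, engagement alone rewards exactly the content sitting nearest the line.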

In Zuckerberg's post, he wrote, quote, "One of the most painful lessons I've learned is that when you connect two billion people, you will see all the beauty and ugliness of humanity." This sentiment from Zuckerberg, it seemed to suggest that all the divisive, low-quality content, it was a people problem, not a technology problem. Zuckerberg had a point. It seems like no matter how Facebook tweaked its algorithms to promote more meaningful conversations, people would find a way to yell at each other.

But in the end, it was still Facebook's technology that picked what beauty and ugliness of humanity to amplify. Technology that made complicated decisions about what to show billions of people, all in a black box. How much...

Should an entity as powerful as Meta be allowed to operate basically fully in the dark? This is Frances Haugen, a former Facebook product manager. I interviewed her the day she attended President Biden's 2022 State of the Union address, a primo invite she got by way of being the highest profile whistleblower in big tech to date.

In late 2021, Haugen leaked a trove of documents from inside Facebook that suggested the company knew more than it let on about the harms its platform caused.

Among those documents, glimpses of how the pivot to MSI was making the newsfeed worse. By the time I showed up, so I showed up in 2019, that system had been in place for 18 months. And it was well known amongst the data scientists and security people that this change had significantly given preference to extreme and polarizing content. One of the leaked internal documents showed that optimizing for engagement was, quote, contributing hugely to political misinformation. Integrity employees estimated that when it came to political content, removing the way Facebook ranked for engagement would decrease misinformation in the news feed by as much as 50 percent. It's unclear what Facebook did following this report.

We should also note that the company says these documents were taken out of context. But still, it's clear that the MSI era of newsfeed, despite the company's intentions, often exacerbated the worst in people. Think comment flame wars with your extended family over politics.

Facebook's mission is to empower people to build community and bring the world closer together. And I absolutely love that mission. Jeff Allen again. It's just disappointing. It is disappointing when you see the examples that are counter to that. It wasn't just that Facebook may have optimized for those political flame wars.

If MSI began as a promise to make time on the platform well-spent, to prioritize higher-quality, meaningful content, Allen says that didn't go according to plan either. The problem that we actually have at Facebook is a really low-quality content problem. Here's an example by way of an internal report Allen put together in October 2019.

He found 15,000 pages on Facebook with large audiences in the U.S. that were actually run out of troll farms in Kosovo and Macedonia. They were basically just trying to figure out, okay, like, what's the easiest way to get attention in American audiences and get them onto our domains where we can show them ads? Allen says they took advantage of what Facebook was optimizing for —

And the tactic that they stumbled into was copying and pasting politically charged and highly divisive articles onto their own domain and then sharing them on Facebook with sensational headlines. He found that the troll farm pages reached a combined 140 million U.S. users monthly.

Jeff Allen left Facebook in 2019. He's since co-founded a nonprofit for industry professionals who want to improve social media platforms called the Integrity Institute. I think there's still enormous opportunity at the company to think about how is community actually built on the internet, right? Like, you know, how can you build systems that reward people who care about communities and are operating in the community's best interest? Questions Facebook is still grappling with.

In the end, meaningful social interactions wasn't the answer to Facebook's problems that it had hoped for. If anything, it made things worse, especially for the kind of user Facebook wanted the most. In March of 2021, more internal research leaked by Frances Haugen painted a damning picture of how young people really felt about Facebook.

"Most young adults perceive Facebook as a place for people in their 40s and 50s. Young adults perceive content as boring, misleading, and negative." "Facebook? Not cool." No punches were pulled. The researchers wrote that young adults have a "wide range of negative associations with Facebook, including privacy concerns and impact to their well-being." The social network that started as a place for college students had become boomertown.

So it's time for yet another newsfeed era, or as it's now called, just feed. It's an entirely new playbook lifted from Facebook's current most formidable competitor. And it moves away from the central philosophy that has shaped the feed since the beginning. Your friends after the break.

Support for Land of the Giants comes from Quince. The summer is not quite over yet, but shifting your wardrobe to the colder months could start now, little by little. You can update your closet without breaking the bank with Quince. They offer a variety of timeless, high-quality items. Quince has cashmere sweaters from $50, pants for every occasion, and washable silk tops.

And it's not just clothes. They have premium luggage options and high-quality bedding, too. Quince's luxury essentials are all priced 50% to 80% less than similar brands. I've checked out Quince for myself, picking up a hand-woven Italian leather clutch for my mom. As soon as she saw it, she commented on how soft and pretty the leather was and told me it was the perfect size for everything she needs to carry out on a quick shopping trip.

Make switching seasons a breeze with Quince's high-quality closet essentials. Go to quince.com slash giants for free shipping on your order and 365-day returns. That's Q-U-I-N-C-E dot com slash giants to get free shipping and 365-day returns. quince.com slash giants

On September 28th, the Global Citizen Festival will gather thousands of people who took action to end extreme poverty. Watch Post Malone, Doja Cat, Lisa, Jelly Roll, and Raul Alejandro as they take the stage with world leaders and activists to defeat poverty, defend the planet, and demand equity. Download the Global Citizen app to watch live. Learn more at globalcitizen.org.

If you want to understand the future of Facebook, just open TikTok. As you swipe through TikTok's main feed, you'll notice that the app is probably showing you videos from accounts you don't know. And you never need to tell TikTok what you like or who you want to follow. It does that for you, getting smarter the more you use the app. Definitely seeing what TikTok did was certainly illuminating. An understatement. That's Tom Allison again, the head of the Facebook app.

TikTok had clearly hit on something. This feed of never-ending entertainment from all over the world recommended to your interests with eerie accuracy? It's addictive, like falling into an attention vortex. Facebook wants in on that kind of attention and out of the problems caused by the old era of the feed. By the time TikTok hit 1 billion users in September of 2021, Allison and his colleagues decided that the newsfeed needed to change. Again.

We went through a lot of discussions with Mark. You know, I mean, Mark is the creator of Facebook. We had multi-hour sessions with him and kind of many people that have led or been involved in Facebook in the past have really kind of together shaping this kind of, I would say, updated vision for how the Facebook app is going to respond to the next generation of people who are going to use it.

They decided that rather than only showing stuff from your friends and pages you follow, they would embrace TikTok's idea of social media. The main feeds of both Facebook and Instagram would use AI to start heavily recommending posts, regardless of where they came from. If people didn't want to engage in the feed anymore, why not just make it as entertaining as possible? The meaningful social interactions era, the era meant to boost connections between friends and the feed, was

over. The company calls this next phase Discovery Engine. This new Discovery Engine approach is already starting to power your feed on Facebook and Instagram. Look closely and you'll notice more and more content from accounts you don't follow. Content with no connection to you, served up by a company that was built on the idea of connection. You heard it when Facebook announced the MSI era. For the most part, it didn't think entertaining, unconnected content was what Facebook was for.
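One way to picture a discovery-engine feed is as a blend of two ranked lists: posts from accounts you follow, and AI-recommended posts from accounts you don't. The sketch below is purely illustrative; the interleaving scheme and the ratio of recommended content are assumptions, not anything Meta has published.

```python
# Hypothetical sketch of a "discovery engine" style blend: interleave
# posts from accounts you follow ("connected") with recommended posts
# from accounts you don't. The rec_ratio is an invented parameter.

def blend_feed(connected, recommended, rec_ratio=0.3):
    """Merge two ranked lists, targeting roughly rec_ratio recommended posts."""
    feed, ci, ri = [], 0, 0
    total = len(connected) + len(recommended)
    for slot in range(total):
        # Insert a recommended post when doing so keeps the running
        # share of recommendations at or below the target ratio.
        want_rec = ri < len(recommended) and (
            ci >= len(connected) or (ri + 1) / (slot + 1) <= rec_ratio
        )
        if want_rec:
            feed.append(recommended[ri])
            ri += 1
        else:
            feed.append(connected[ci])
            ci += 1
    return feed
```

A knob like `rec_ratio` is one way to think about the user complaints described later in the episode: turn it up and the feed fills with strangers' content faster than people are comfortable with.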

But Tom Allison says in a post-TikTok world, that's changed.

I think really the big mental leaps that we've finally made is like, hey, this distinction between connected content and feed and unconnected content, this might actually be kind of a more artificial construct that we've created. And actually what the next generation is looking for is actually a blend of kind of who speaks to them and how in this distinction of like, this is a creator that I'm connected to that really resonates with me versus this is one that's getting

recommended to me, but I'm not connected to. It's not, I don't think it's as strong of a mental model as we believe it has been.

This is a key idea that Allison and Meta bring up while defending the change. This next generation, they want different things from their social media feeds. Now, Facebook itself is around 18 years old. So we're talking about a generation that grew up with social media. And they grew up kind of using it and learning how to integrate it, frankly, into their lives in different ways than I did. You know, who's, you know, I'm not a young adult, you know, spoiler alert.

To Eli Pariser, a noted critic of Facebook's business model and author of the 2011 book The Filter Bubble, a phrase he coined, this discovery engine pivot is about something else. Facebook has failed at the kind of social media it defined. So now it has to change. It's not that people don't want to share things digitally with other people and even with friends and neighbors and what have you.

But I think Facebook has so blown it on the mission of making that feel like this is a good place to do that, that that content no longer is available to them. People are no longer willing to trust that app with this content. Even though people have moved on from the feed, Facebook can't. It's the company's main moneymaker. And so, you know, I partly see this as a defensive move, which is like, we've just got to find a way to get...

If Facebook plans to monetize the metaverse as successfully as it has the feed, well, that's a long way off.

For now, it's playing catch-up to TikTok when it comes to the technology that powers its new strategy. TikTok has spent years getting its discovery system right, creating experiences for users that feel relevant regardless of who you follow.

Facebook's overhauled recommendation engine? Tom Allison says they're working on it. I think these recommendations are going to get better and better and better to the point where they feel just as good, if not even better than some of your connected content and feed. That's our aspiration.

Right now, though, some users have been complaining about all the recommended content in their feeds, especially on Instagram, where this discovery engine approach is being tested more aggressively than on Facebook. But Allison defends the approach. We have more sophisticated AI now that can look at kind of the billions of pieces of content that are being created across Facebook and Meta more broadly and deliver the piece that might be right for you in that moment.

I've been struck while listening to Allison and Meta executives talk about trying to make the feed more entertaining, how they still try to couch everything in the idea of connection. Take this example. Allison says that entertainment in social media feeds has become a jumping-off point for private messaging between friends and family. We have this kind of, you know, mantra in the feed team now. We call it social connection through content. There's this recognition that kind of content, whether it's from your friend network or whether it's from a connected group, can lead to connection because that's what you discuss, that's what you might share, that's what might start a message thread with another person. But what we're really doing is kind of saying, you know what, recommendations can do that too. Sure, sending your family and friends a cool video by a stranger can be fun. It can spark real conversation.

But I can't help hearing a desperation here about not wanting to lose Facebook's soul with this discovery engine pivot. Facebook told the world that its push to emphasize meaningful social interaction was about getting back to its roots by making the feed more about the people you engage with.

And somehow, this pivot to discovery, moving away from communicating with friends in the feed entirely, it's also being sold as a return to form. Here's Meta's former CTO, Mike Schroepfer, again. So I think trying to figure out that integration of people's interests and what they might discover and how that helps them connect and bind and start conversations with other people they care about, that's, I think, our unique place in the world, as opposed to just like, I launched this app to be entertained, which is, I think, what some other people are in the business of doing.

Putting all of Meta's justifications for this change aside, the impact of this discovery engine approach is that the company is taking more responsibility over what we see than ever before. Eli Pariser again. It places an enormous amount, even more power in the hands of

These companies... Because before, what we saw on our feeds was ranked by Facebook's system, but it still basically came from our networks. Now, Facebook's algorithms will reach outside our networks. Facebook will curate our worldviews with a power and freedom it's never had. I would begin by talking about the stakes if something goes wrong. Mehtab Khan is a resident fellow at the Information Society Project at Yale Law School.

She focuses on the ethics of AI and regulation. You know, people's perceptions about the social and political world that we live in is shaped by what the algorithm is showing us. Khan has concerns with the idea of Meta's AI taking even more control over what we see. One example. Automated content moderation is rife with problems. Content moderation. This is one of the ways we've already experienced Meta's AI.

Facebook has already been using AI to demote harmful content in the feed. But we know that a lot of misinformation, bullying, and hate speech still gets through. It's...

not built for context. It does not really do well with content that's not in English. And it also does not capture political and social and economic gender dynamics, largely outside the US and EU. This is something we heard from people who have worked inside Facebook in response to this discovery engine change.

Some people are uneasy, given Meta's track record catching harmful content, especially outside the U.S. and EU. The technology is designed keeping very narrow focus of users in mind, and it doesn't even work that well for that small subset of users. Relying even more on AI to catch stuff and to recommend good content? Scary to some.

But to Tom Allison: It doesn't scare me, but I recognize the responsibility that we have, and that is weighty, if you know what I mean. If there's any company that I think has really risen to the challenge of trying to do this at scale, I do think it's Meta. There's an irony to Facebook leaning more on its own AI and algorithms. Meta's top policy executive, Nick Clegg, told us as much directly. In a strange kind of way,

in the future, we're going to be doing what we have been alleged to do for a long time but didn't. So, you know, if you listen to the kind of sort of Frances Haugen kind of narrative, it's, oh my gosh, they're just spoon-feeding people hate speech and people in this sort of bovine passive state are just ingesting it and then repeating it. Of course, it was nonsense. It was nonsense.

Because the vast, vast, vast majority of content that people saw on Facebook was driven by, of course, by our systems, but also by their own choices, who their friends are, which groups they're part of, what content they engage with, and so on. This is a defense Clegg himself has made before in an attempt to demystify concerns about the company's algorithms.

In March of 2021, Clegg wrote a lengthy essay called "You and the Algorithm: It Takes Two to Tango." His point was Facebook was using your choices to make decisions on what to show you. You were dance partners. But that analogy is starting to break down as the algorithm gains more control over your feed. Maybe this will go well for Meta. Maybe the discovery engine will help Facebook feel cool again.

But as we've seen with past eras of the feed, when Facebook makes a tweak, there are always unintended consequences. And with the company embracing its own recommendations more than ever, if things go wrong, there will be no blaming the users this time. In our next episode, Meta's most expensive acquisition ever and one of the most used communication apps in the world, WhatsApp.

We'll tell the story of WhatsApp's incredible success and challenges ahead through the lens of its largest market, India.

Special thanks to our opening voices, Christopher Cunane, Tristan Green, Esther Heath, Brian Parrish, Susanna Raddick, and Julia Smith. Key conversations also informed this episode, with Sahar Massachi, co-founder of the Integrity Institute, Brandon Silverman, and Joan Donovan. Thank you all for your time. Land of the Giants: The Facebook/Meta Disruption is a production of Recode, The Verge, and the Vox Media Podcast Network. Megan Cunane is our senior producer.

Oluwakemi Aladesuyi is our producer. Production support from Cynthia Betubiza. Jolie Myers is our editor. Richard Sima is our fact checker. Brandon McFarlane composed the show's theme and engineered this episode. Samantha Oltman is Recode's editor-in-chief. Jake Kastrenakes is deputy editor of The Verge. Art Chung is our showrunner. Nishat Kurwa is our executive producer. I'm Alex Heath. And I'm Shirin Ghaffary.

If you liked this episode, as always, please share it and follow the show by clicking the plus sign in your podcast app.