
Googlers vs. Google

Publish Date: 2021/3/23

Land of the Giants


Silicon Valley Bank is still the SVB you know and trust. The SVB that delivers human-focused, specialized lending and financial solutions to their clients. The SVB that can help take you from startup to scale-up. The SVB that can help your runways lead to liftoff. The only difference? Silicon Valley Bank is now backed by the strength and stability of First Citizens Bank. Yes, SVB. Learn more at svb.com.

When you're running a small business, sometimes there are no words. But if you're looking to protect your small business, then there are definitely words that can help. Like a good neighbor, State Farm is there. And just like that, a State Farm agent can be there to help you choose the coverage that fits your needs. Whether your small business is growing or brand new, your State Farm agent is there to help. On the phone or in person. Like a good neighbor, State Farm is there.

On December 2nd, 2020, Dr. Timnit Gebru, co-lead of Google's ethical AI team, was technically on vacation, driving across the country. But she couldn't quite stay away from work. So somewhere in the middle of the U.S., after long hours on the road, Gebru chatted with one of her direct reports. She was just chatting me about random stuff. Like, oh, you know, here is a really great candidate. But then all of a sudden, their conversation took a turn. And then she's like, oh my God, oh my God, oh my God.

Oh my God, did you resign? Did you resign? I'm like, what? Her colleague said she'd gotten an email from Megan Kacholia, a vice president of engineering at Google Research, so a couple of levels higher than Gebru and her ethical AI team. The email said the company had just accepted Gebru's letter of resignation, which was a surprise to Gebru, because she didn't think she'd sent one.

So she called her direct manager immediately and caught him off guard. He's like, hey, how's it going? Sorry, like I was just, you know, hanging out with my kids, blah, blah, blah. I'm like, do you know that everybody's got an email saying I resigned and I got an email to my personal email and I've been cut off my corp and whatever. He was like, what? So he had no idea. No idea that this happened. None.

During the previous few days, Gebru had been in a tense debate with higher-ups at Google because she'd co-authored a research paper that the company didn't want her to publish. The thing is, as far as Gebru was concerned, it was her job to research ethical questions relevant to Google's work in artificial intelligence and to publish her findings. So Gebru didn't understand why she was being told she shouldn't publish this work, and she wasn't ready to comply.

The debate had gotten pretty heated as far as corporate drama goes, and the last email Gebru sent to her higher-ups was a list of conditions. She wanted more clarity on what happened with her paper, and what Google's process was for approving research overall. Gebru said she would make arrangements to leave Google if her conditions couldn't be met, but she didn't think it would escalate to that. And if it did, Gebru didn't think her exit would happen so abruptly, without her direct manager even knowing. She spent the rest of her road trip trying to piece together what happened.

I can't tell you what I drove by, but I'm on the phone in the car. I usually get nauseous. I can't do that in the car. But this time, somehow my body was just, you know, no sleep, whatever. We've got to just... I was just like thinking, you know, what are they going to say?

Gebru and many of her colleagues would soon tell the world that she was wrongfully fired by Google. Google would continue to say she resigned. It's a complicated story that's still unfolding, even as we record this episode. But it's already clear that Gebru's exit is a significant event for the company. To many Googlers, Gebru's departure feels like the final signal of the deep loss of trust between the company's leaders and employees.

Because what does it say about Google's culture when one of its star researchers leaves after she does the work she felt she was hired for, studying the ethics of Google's own technology? This is Land of the Giants. I'm Shirin Ghaffary. And I'm Alex Kantrowitz.

Last week we told you the story of an important turning point in Google's unique culture of debate. The controversy around Google's contract with the Department of Defense ushered in a new era for the company, in which some of its biggest adversaries would be its own employees.

This is part two of that story, The Aftermath, where we explore how trust inside Google has continued to erode over the last couple years and what that means for the future of the company. And for us, its users. Because at a certain point, when enough people who work inside Google don't even trust each other, can we? In November 2018, I drove to Mountain View for an event I never thought I'd cover. Thousands of Google employees walking off their jobs in protest.

The catalyst for this walkout was a bombshell New York Times story from October 2018. It exposed that for years, Google seemed to protect several former executives accused of sexual harassment, including Andy Rubin, the creator of Android, who faced serious allegations of sexual misconduct. According to the New York Times, Google's own internal investigation found those allegations credible. Rubin resigned in 2014. As part of his exit package, the company paid him nearly $90 million. Harassment was not only...

not punished. It was, you know, in the case of Andy Rubin, it was like rewarded. Meredith Whittaker was one of the walkout's seven key organizers, and a lot of Googlers shared her reaction to the Rubin news. As far as they could tell, Rubin walked away with a fortune and his reputation intact. When Larry Page announced Rubin's departure, he didn't say anything about sexual misconduct. He wrote, quote, I want to wish Andy all the best with what's next.

20,000 employees across Google offices around the world participated in the walkout. This was historic. It is incredibly rare for people working in tech to participate in a protest against their employer. Because we all think of tech as one of the best industries to work in, with competitive salaries and luxury perks. The walkout was one of the largest in-person protests by employees of a major tech company to date. A spectacular display that Googlers were ready and willing to take a stand.

which of course Google, the company, knew already. Do you see a direct line there between your activism on Maven and your activism on the walkout? Yeah, absolutely. 2018 was a busy year for Whittaker.

Google announced it would not renew its contract with Project Maven in June, and this walkout happened five months later. That's no coincidence. Whittaker and her colleagues had built a muscle for organizing through the Maven controversy. So by the time the Rubin story broke, employee activism at Google could mobilize quickly. Plus, in the few months since Maven, a lot had happened. In between that, there was Dragonfly.

That was a project to bring censored Google search to China, which Google tried to keep secret from the majority of its employees. The company would eventually terminate Dragonfly. But at the time of the walkout, it was yet another frustration for many Googlers. From their perspective, Google had been keeping secret after secret. The Rubin story was just the latest. It was a nexus, right? That after sort of a punishing...

string of unforced errors and just appalling unethical decisions, that story broke. For a lot of Googlers, the walkout was about much more than Rubin. It was also about diversity at the company. According to Google's diversity report in 2018, only 2.5% of its workforce identified as Black, 3.6% Latinx, and 0.3% Native American.

Organizers of the walkout made several core demands, including better pay transparency and putting a rank-and-file employee on the company's board of directors. But Google ended up giving in to only one, an end to forced arbitration, which meant employees could now take Google to court over issues like sexual harassment. For the organizers, there was much more work to do, and they shared a perception that the company was going to stay tolerant of that work to some degree.

Google was still much more open and receptive to dissent than any other major tech company. And people like Laurence Berland would keep testing its limits. I got invited to a lunch.

of people that had been meeting since the walkout in San Francisco. Berland was a longtime Google engineer. He joined the company in 2005. And except for a few years he took off, Google had been Berland's entire professional life. But it wasn't until after the walkout that he became an activist there. There was this increasingly growing community of people organizing inside the company and

on a pretty wide range of issues that all kind of had as their unifying theme, like workers having a say and kind of the loss of the culture that I think a lot of us for a while maybe thought made this kind of thing unnecessary. - Berland joined Google when its open culture was going strong, but Google felt like a radically different place to him by the summer of 2019. In July, a little more than six months after the walkout, Meredith Whittaker left the company.

In a Medium post announcing her departure, she wrote, "The reasons I'm leaving aren't a mystery. I'm committed to the AI Now Institute, to my AI ethics work, and to organizing for an accountable tech industry. And it's clear Google isn't a place where I can continue this work." Whittaker was the fourth leader of the walkout to leave. So a new core group of internal activists took the helm of organizing at the company, including Berland.

But by Thanksgiving of 2019, he would be fired from Google, along with a few others. And in the process, Google's open culture would sink deeper into existential crisis.

I would say the story of that crisis starts in August 2019, when a small group of Googlers published an open letter to the company about how it should approach business with a controversial government agency. Berland immediately signed this letter in support, so we asked him to read some for us. It has recently come to light that CBP is gearing up to request bids on a massive cloud computing contract. CBP as in the U.S. Customs and Border Protection Agency.

It's time to stand together again and state clearly that we will not work on any such contract. We demand that Google publicly commit not to support CBP, ICE, or ORR with any infrastructure, funding, or engineering resources directly or indirectly. So this petition is referencing a potential cloud contract with CBP, which means, actually, this was pretty reminiscent of Project Maven, since that was also a controversial cloud computing contract Google had with a government agency.

Remember, this is in the context of an international outcry over Trump's hardline immigration policies, like the escalation of family separations and child detention. And ICE and CBP were executing those policies. So to many employees, any support from Google to those agencies wouldn't just violate its AI principles, but also the company's publicly held values. In January of 2017, thousands of Googlers, including our executives, joined together to protest the Trump administration's Muslim ban.

This was the right thing to do, and we are proud to work at a place that reflects these values. This is in reference to a couple things. One, Sergey Brin had showed up at San Francisco airport to protest Trump's travel ban in January 2017, which was big news. It's not often that an otherwise private tech executive shows up to a public protest. And just a couple days after Sergey's protest, Googlers held their own at its Mountain View headquarters, where CEO Sundar Pichai gave a speech criticizing the executive order. I've spoken up strongly.

Sergey participated in the protest. Anyway, I think it's really important. We spent, you know, I see many of the leads from Google here today. We spent two hours this morning talking about all of this. But mainly, I think today is about hearing from other voices. We have spoken up, but it's great to hear the stories. And so hopefully there'll be more and the fight will continue. So that's it.

The fight will continue. That's a pretty strong political statement. Yeah, so this petition two years later was essentially saying, look, Google, here's a moment to be consistent with the values that the leadership seemed to stand for back in 2017. Just say you're not even going to think of applying for a contract with CBP and ICE. Over 1,300 employees signed the letter. But then, on August 22nd, just days later, Business Insider reported that Google had actually given CBP a free test drive of a Google Cloud product.

So Google was open to working with the agency. And it was suddenly much less clear whether the company reflected the values that brought Sergey to SFO two years before. And then another story about Google broke. On October 3rd, a Politico newsletter revealed that Google had made an important hire. Everybody found out that the company had hired this guy, Miles Taylor, who was the chief of staff

at DHS to Kirstjen Nielsen. Which meant Taylor had leadership roles at the Department of Homeland Security during the period of intensified child separation at the border and the travel ban. So a lot of Googlers were confused and upset to find out the company had quietly hired him as its head of national security policy engagement.

And when the Googlers started questioning his hire, something else happened. The company started deleting people's questions, meme posts, and so on if they talked about Miles Taylor by name. Internal posts across the company's various communication channels. And that really pissed me off. They were literally saying, you can't talk about this.

Something you have to know here: Google had recently revised its internal community guidelines to discourage political debate. This was coming after the company had been dealing with increasingly polarized discussions among its conservative and liberal employees on a host of issues connected to the Trump administration. The revised guidelines said, quote, "While sharing information and ideas with colleagues helps build community, disrupting the workday to have a raging debate over politics or the latest news story does not."

The new policy bans statements that "insult, demean or humiliate individuals, including coworkers and public figures." This was a significant change for Google's famously open culture. And it was a limiting one. When I asked Google about it at the time, Google's comms team defended it by saying the no political discussion rule would actually protect its workforce.

Think about how toxic political debate makes any social media platform. Well, Google built its own internal social media with Google+, MemeGen, listservs. The new policy could be read as Google's effort to save those conversations from going off the rails. But the controversy around Miles Taylor made this guideline really difficult for employees to follow and for Google to enforce, because Taylor's hire was a work issue that was fundamentally political. And so a group of us got together to start posting

stuff specifically designed to see what they were willing to take down. Berland said he started uploading memes about Taylor to MemeGen to test the limits of what Google would moderate. They had been complaining to me that, like, you can't single out an individual employee, you know, for their past work. And I was like, well, no, this is like a high-level government figure. So I post another one that just said, Miles Taylor is a Googler. Can I even say that?

And that got deleted too for singling him out. I do also see management's point or like, what do you say to like, well, now this person is working at the company and he feels harassed or bullied or he feels targeted by his colleagues and he can't do his work. It would have probably made it, I don't know, I guess emotionally harder for him to feel like he was welcome to do his work, which was definitely at some level, I think the intention. But really this wasn't about him. You know, this was about management's decision to hire him.

And I have to ask where the line is. Even if the debate was just about defining policies for singling out employees, it was proving to be charged territory. And then, in late October 2019, BuzzFeed News reported that Google deleted an employee question about Miles Taylor's hire ahead of a TGIF all-staff meeting. Back then, Google used an internal system for employees to submit and upvote questions ahead of the meeting. And this was like the first, the top or second most rated question for that TGIF and would have...

would have been asked if they hadn't deleted it. And the question said something like, and I don't have the exact text, but it said something along the lines of, you know, like as a Muslim who works here, who has family abroad, who are interested in coming to the United States, you know, how can I feel safe and welcomed in my own workplace? The Washington Post published a partial recording of the TGIF from that week featuring CEO Sundar Pichai.

To the question about trust, I think it's one of the most foundational things for the company. I take it seriously, listen to the feedback, try to understand when I feel there's something which caused breaking of trust and see what we can do to improve. It's definitely gotten harder to do this at the scale we are doing and the number of people we are adding every year. Pichai was pretty frank about the crisis the company's culture was facing.

I think we are genuinely struggling with some issues, transparency at scale, how to do it, especially at a time when everything we do doesn't stay within the walls. We now know it empirically. We have leaks on difficult issues. Googlers also don't always agree on what's right and wrong, right?

Pichai is not a particularly emotive guy, but it's still clear here that something was broken in his trust of Google's special culture too. Because Google's debates were not just between employees and management anymore. The press was involved, and the world was listening.

Leaks were a real threat to Google's culture, as management saw it. And you could see how a culture where people can't trust each other to keep certain things private would be a real threat to Google's business, too. But on the flip side, you could argue that some employees were only leaking information to the press in the first place because Google had broken their trust. Management had kept too many secrets. Seeing this all go down, it was like watching a couple veer toward a breakup. And so I started looking at the calendars of the people who were doing the censoring.

And I found some really concerning things. Berland said he looked at the Google calendars of some of the community moderators for the company's internal tools, like MemeGen. He says he started with the people who were taking down the memes he and others had posted. He wanted to know if they were watching worker organizing in any closer ways. I found people whose names I recognized from organizing listed as calendar entries, like check on this person's memes again. Like they were surveilling them.

Okay, let me just unpack this. It's not necessarily surveillance or out of bounds for a company to watch what its employees are doing on its corporate network.

But at the same time, organizers within Google were finding that the company may have been watching them more closely than they ever imagined, and beyond their meme activity. Like by the end of October, as some employees were talking more seriously about unionizing and meeting regularly about activism in growing numbers, a few Googlers discovered that management was having meetings of its own with IRI, a known union-busting consulting firm.

In early November 2019, things took a turn for Berland and his colleague Rebecca Rivers, who was a software engineer and also involved in the organizing movement at Google. They put me on leave.

And they put Rebecca on leave the next morning. Both were locked out of their corporate accounts. And after they'd been suspended for a couple of weeks, Berland and Rivers hosted a rally in front of Google's San Francisco office. I covered it. About 200 Googlers came to show their support. And Berland gave an impassioned speech. This is not really about me. It's not about Rebecca. It's about us. All of us. And the open culture we built and treasure together. If they can do this to me, they can do this to anyone.

What is Google without our culture? You know, we had that rally on a Friday. And that Monday, they fired me and Rebecca. And they also fired Paul and Sophie. And they fired us all within about half an hour of each other.

That's Rebecca Rivers, Paul Duke, Sophie Waldman, and Berland. Paul Duke and Sophie Waldman were among a small group of Googlers who'd written that original petition from August, asking Google to stay out of CBP business. The group was quickly dubbed the Thanksgiving Four because they were fired the Monday before Thanksgiving in 2019. And they were all recognized within the company as active members of the growing organizing movement.

To many Googlers, other tech workers, and people watching from the outside, this felt like a blunt move for Google to fire all of them in such close succession. It felt like they were trying to make an example of us. Google won't say exactly which of the Thanksgiving Four was fired for what. It talks about them as a group.

The company said at the time that the four employees were fired for, quote, intentional and often repeated violations of security and conduct policies. In one case, Google said an employee subscribed to other colleagues' calendars and, quote, set up notifications so that they received emails detailing the work and whereabouts of those employees, all without those employees' knowledge or consent. The company said that, quote, no one had been dismissed for looking at a need-to-know document in the ordinary course of their work.

But Google's reasoning is now being disputed by the National Labor Relations Board. In December 2020, the NLRB issued a complaint alleging that Google had violated Laurence Berland's labor rights in firing him. It also included another former Googler in the complaint, Kathryn Spiers, a security engineer fired just weeks after Berland in December. Many employees now actually call the whole group the Thanksgiving Five, or the Fired Five, to include Spiers.

The NLRB complaint alleged that Google illegally surveilled, retaliated against, and interrogated employees engaged in labor organizing. It also claimed that Google selectively enforced its rules around accessing need-to-know documents and who can look at what calendars. Ultimately, it alleged that Google was discouraging its employees from starting a union or engaging in other legally protected worker organizing activities. This NLRB complaint was a moment of vindication for Berland.

But it also came with some really bad news for Google's organizers. The same day that the NLRB issued that complaint is the day that they fired Dr. Gebru in December. Literally the same day. Actually, they were about two hours apart. After the break, I'm going to bring you through a story I reported on as it unfolded. What led to Dr. Timnit Gebru's exit?

Support for Land of the Giants comes from Quince. The summer is not quite over yet, but shifting your wardrobe to the colder months could start now, little by little. You can update your closet without breaking the bank with Quince. They offer a variety of timeless, high-quality items. Quince has cashmere sweaters from $50, pants for every occasion, and washable silk tops.

And it's not just clothes. They have premium luggage options and high-quality bedding, too. Quince's luxury essentials are all priced 50% to 80% less than similar brands. I've checked out Quince for myself, picking up a hand-woven Italian leather clutch for my mom. As soon as she saw it, she commented on how soft and pretty the leather was and told me it was the perfect size for everything she needs to carry out on a quick shopping trip.

Make switching seasons a breeze with Quince's high-quality closet essentials. Go to quince.com slash giants for free shipping on your order and 365-day returns. That's Q-U-I-N-C-E dot com slash giants to get free shipping and 365-day returns. quince.com slash giants

On September 28th, the Global Citizen Festival will gather thousands of people who took action to end extreme poverty. Watch Post Malone, Doja Cat, Lisa, Jelly Roll, and Rauw Alejandro as they take the stage with world leaders and activists to defeat poverty, defend the planet, and demand equity. Download the Global Citizen app to watch live. Learn more at globalcitizen.org.

By 2018, when Dr. Timnit Gebru was considering an offer from Google to co-lead its ethical AI team, she'd already established herself as a leading voice in the field. Gebru and another researcher named Joy Buolamwini had done a study that found a serious flaw in facial recognition software. It couldn't identify Black women with nearly the same accuracy as white men. Several big tech companies were developing that AI, including Microsoft, where Gebru was working when the study came out.

So she'd also already established herself as someone who was unafraid to call out her employer for bias in its technologies. And outside of her research, Gebru had co-founded a group called Black in AI, a support and advocacy network for Black innovators. So when Google recruited her, there was just really no, I mean, what can I say, like illusions about what I believe in, what I stood for. Still, Gebru had reservations about taking the job.

So that was around the time of the whole Maven controversy. Meredith Whittaker was still there. I talked to her. And, you know, I was like, the fact that people like her were there was actually the thing that made me feel like, okay, you know, compared to other companies, like at least there's people like her kind of pushing the envelope. And I was like, okay, you know.

It's a big company and you might be able to move the needle a little bit. It's important to do that. So Gebru signed on. I was the first Black woman to be a researcher there. She said part of the draw was a new AI lab Google was planning to set up in Ghana, its first on the African continent.

Plus, the department she joined, Google Research, was a big deal inside the company and beyond. Research is a critical part of Google's innovation process. The department informs some of the company's most ambitious and world-changing technologies. A bunch of the tech giants have similar teams. Yeah, I mean, I think we do have a bit more of an academic pedigree than a lot of tech startup companies.

Jeff Dean, the head of AI you heard from last week, is also SVP and head of Google Research. So basically, he was Gebru's highest-level boss. Dean joined Google in 1999, so he met Larry and Sergey when they had just reluctantly left their Stanford PhDs behind. And he shared the founders' respect for academia. I mean, I think the research background and sort of ethos of the company still exists in our engineering culture, and I think that's a really healthy aspect of Google.

Google's actually got a public statement describing its research philosophy in detail. And something it calls out there is the importance of basic, fundamental research, where, quote, intellectual freedom is an essential ingredient. We should say here, we interviewed Jeff Dean on December 1st, 2020. That was one day before Gebru's exit. So we just talked more broadly about research at Google. More recently, though, through Google, Dean declined a follow-up interview to talk about what happened with Gebru.

Per Google, thousands of people work in its research organization. During our interview with Dean, he talked about how the research team's work has fed into the development of autonomous driving, which you all know about, and other fields like healthcare. He said Google is already helping doctors better diagnose certain conditions with AI.

But Dean said as Google charged ahead with its innovations, it was important to the company to make sure they were also charging ahead ethically. In fact, that's one of Google's AI principles. That list Google released after Maven, it wasn't just about how Google would work with the military. One of the AI principles is, you know, the system should not perpetuate unfair bias.

This is one of the principles that Gebru was essentially hired to uphold within Google Research. She said Jeff Dean helped recruit her to Google, and the company seemed thrilled to have her. They were excited for me to be there, to go there. Jeff Dean gave me a high five the first day when he saw me at the office.

Gebru was working at the forefront of a growing field looking at AI and bias, which is something Google had run into some problems with in the past. For instance, there was a highly publicized one in 2015 where there was an image recognition model that was misidentifying Black people as gorillas. In 2015, an engineer called out Google on Twitter for auto-tagging pictures of him and his friend, both Black, as gorillas in Google Photos. The tweet went viral.

A Google engineer responded quickly on Twitter, quote, "Holy fuck. This is 100% not okay," unquote. The company later apologized and said it fixed the problem. So what went wrong in the first place? Google didn't intentionally program code to tag human beings as animals and perpetuate racist tropes. People forget, like, when you're creating training data, you're doing it with people.

AI is just trained by people. There's a huge workforce behind it. And so then you're transferring this type of knowledge. And then a lot of the examples that are available that people use for these things could be just people of lighter skin, right? So then you put something out there that doesn't work very well for people of darker skin.

In other words, based on limited training data, AI can absorb human bias. And that, Gebru said, can lead to grave outcomes. Like, for example, what if doctors start using image recognition AI to help identify people's health problems? Would the AI recognize pictures of, say, melanoma on Black people as well as it could on white people? There's just very little rigor in the way you test these kinds of AI models that can be used for such high-stakes scenarios.
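To make that mechanism concrete, here's a minimal toy sketch of the idea, entirely our illustration with invented data, nothing to do with Google's actual models: a classifier trained mostly on one group can be far less accurate for an underrepresented group whose examples look different.

```python
# A toy sketch (invented data; not Google's models): a classifier trained on
# data dominated by one group can be far less accurate for an underrepresented
# group whose examples look different.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, center):
    # One toy feature per example; the true label flips at the group's center.
    X = rng.normal(loc=center, scale=1.0, size=(n, 1))
    y = (X[:, 0] > center).astype(int)
    return X, y

# Training set: 5,000 examples from group A, only 50 from group B.
Xa, ya = make_group(5000, center=0.0)
Xb, yb = make_group(50, center=3.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced held-out sets reveal the gap: near-perfect accuracy for group A,
# close to a coin flip for group B, because the model learned group A's boundary.
for name, center in [("group A", 0.0), ("group B", 3.0)]:
    Xt, yt = make_group(2000, center=center)
    print(f"{name} accuracy: {model.score(Xt, yt):.2f}")
```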

So Gebru set out to build that rigor the way she'd been trained: through systematic research. Outside of Google, Gebru was one of the most highly regarded AI ethics researchers. But she said it was different inside the company. The kind of respect I got externally from Google

is very different from internally at Google. Gebru says she and her team were sometimes left out of emails and important meetings. I wanted to manage one person. Imagine one person. That was a huge deal. And I'm like, you know, I spent yesterday talking to world governments. I spend today coming to a company that wonders if I can manage one person. Gebru was eventually promoted in October 2020.

You have to understand how good my work had to be for that to happen. I was getting awards left and right. I was having papers that were like, you know, it wasn't like, you know, so it would have been so weird if I didn't get promoted. I mean, it was just like, you know what I mean? So that's how hard it is, right? You know, I was almost a little starstruck that she was, you know, on the team.

Alex Hanna is a social scientist and researcher on Google's Ethical AI team, the same team Gebru co-led. Hanna was at Google before Gebru joined. Timnit is someone who is really vocal about these things. If she doesn't like how you're treating someone, she'll let you know. And that didn't always go over so well with everyone. When I talked to Hanna and others who worked with Gebru, they'd bring up how often she'd advocate on behalf of marginalized communities inside Google.

She also centered those communities in her research on the impact of AI. One of the areas in which they're very excited about is large language models. Giant models, big, like they just want to make them larger and larger. Natural language processing is a field of artificial intelligence that uses computing to understand and generate human language. Google says natural language processing impacts the user experience in many of its products, from search to maps to ads. It's an incredibly powerful tool.
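For a concrete feel for what a language model does, here's a minimal sketch using the open-source Hugging Face transformers library. GPT-2 is a small public model standing in for the far larger systems the episode discusses; none of this is Google's internal tooling.

```python
# A minimal sketch of what a language model does, using the open-source
# Hugging Face `transformers` library. GPT-2 is a small public model standing
# in for the much larger systems discussed here; this is not Google's tooling.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model continues a prompt with statistically likely text, learned from
# whatever data it was trained on -- which is why the makeup and scale of
# that training data matter so much.
result = generator("The new research paper argues that", max_new_tokens=25)
print(result[0]["generated_text"])
```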

But Gebru was concerned that Google wasn't paying enough attention to the potential risks of developing this tech so quickly and at such a huge scale. And she wasn't the only one. So at some point, people started asking me at Google, "What kind of things should we consider when we're working on large language models?" Gebru decided to write a paper to answer that question. And she thought she had support from the top for it. In her last performance review, Gebru said Jeff Dean urged her to make sure Google was applying its AI principles to the products that use natural language processing.

And Gebru got to tell Dean that she was, in fact, working on that very paper already. And he goes, "Oh, cool, like this is not my area of expertise, but I'm sure I'll learn a lot waiting for it." You know? So Gebru wrote the paper with several other academics, highlighting the risks of large language models as she said she would. She wrote about the environmental costs of such high-volume computing and the risk of amplifying racist or sexist biases in language data.

Alex Hanna read an early draft of the paper. I thought it was good. I thought it was a good sort of place to summarize a lot of the points that have been made. You know, these were tying a lot of threads together. In other words, the study was an analysis of existing research. The findings weren't entirely new or all that surprising to some people who had been following the field.

Gebru and her co-authors were set to publish this research as part of an upcoming academic conference. But Google needed to sign off on it first, which was standard protocol at the company for any research that's done by in-house staff. So Gebru was surprised when she received an urgent meeting invite to discuss the paper with her manager's manager, Megan Kacholia. All of us are supposed to go on vacation. Thanksgiving week, we're all out.

Thursday afternoon. The Thursday before Thanksgiving. Megan Kacholia, Samy's manager, put a random meeting on my calendar for, like, two hours later. In the meeting, Gebru said she remembers Kacholia telling her that an unspecified group of Google leaders had decided she had to either pull the paper or remove her name from it by November 27th, the day after Thanksgiving.

Through Google, we reached out to Kacholia for comment, and the company declined. Back to that meeting, Gebru said she pressed for more details.

I was like, "Can you tell us, like, are there sections you want to remove? Is there wording that you want to change? Like, what is it that, like, you know, no, no, like, after you retract them maybe." I was so upset. I started crying. I was like, "Now you're like, I can't even do my work, you know?" Gebru said eventually her manager was allowed to read a brief document to her out loud, summarizing the feedback on her research.

In response, Gebru sent a six-page memo to Kacholia, Jeff Dean, and her manager, headlined, "Addressing feedback from the ether at Google." Three days later, on November 30th, Gebru got a follow-up email from Kacholia. She said Gebru could discuss the feedback further with her manager, but also Kacholia asked Gebru to confirm that day that she had withdrawn the paper or removed her name. There was some back and forth by email. And the next day, on December 1st, Gebru wrote back that she wasn't comfortable taking her name off the paper yet.

She laid out a set of conditions Google would need to comply with before she did. Number one, she wanted to know exactly what process Google used to review her paper and decide not to approve it. Crucially, she asked to know who was part of that decision. Two, she wanted managers on her team to recognize something went wrong here and say how they were going to fix that in the future. Three, she wanted Google to spell out the parameters of research at the company. What would Googlers be allowed or not allowed to study?

If you meet these conditions, I'm happy to retract my name off of this paper. If not, I'm going to publish it. And, you know, it would be without a Google affiliation, and we can figure out, like, a last date that would least destabilize the team. And I'd work with my manager, Samy, to do that.

So she'd plan to leave the company if her conditions couldn't be met. And Google decided it couldn't. In particular, a sticking point for Google was the company didn't want to reveal who had objections to Gebru's work. When I asked a spokesperson for Google if it was standard for people reviewing research at the company to remain anonymous, they said there is currently no policy on this and that it's working on making one.

Around the same time that Gebru sent those conditions off, she also wrote an email to the Google Brain Women and Allies group, an internal listserv, sharing her frustrations about Google's rejection of the paper. More broadly, she criticized the slow pace of Google's efforts to hire more women and Black researchers. She said writing internal documents and hosting discussions about the state of diversity at the company wasn't really making a difference. Google's numbers weren't getting much better. You can't just keep on writing these documents. We've written a million. I've written a million documents.

She said, "Your life gets worse when you start advocating for underrepresented people." Gebru encouraged her colleagues to focus on holding Google's leaders accountable and to think through what kind of external pressure they could put on the company. But this email frustrated Google's leadership because in their view, Gebru, who was a manager, had just told her colleagues to stop a form of diversity and inclusion work. Gebru felt like she was just stating the harsh truth and that people needed to turn to other methods.

So by this point, tension had really escalated between Gebru and the company leadership. But this is Google. People ask tough questions and push executives all the time. So Gebru still felt like there was a way for her to stay, or at least leave somewhat amicably. Alex Hanna remembers the time, around 10:30 p.m. Eastern on December 2nd, when she saw the email saying Gebru had elected to resign. My initial thought was, well, that...

I'm sad to see this, but like, you know, she's within her right. But then I texted her and I said, oh, I heard you resigned. And she said, I didn't resign. There's just like no argument here because...

We know how resignations work. This is not how a resignation works. Come on. Like your manager would know, your direct reports would know. Like you just don't do that to people. Many of Gebru's colleagues agreed and they were aghast at the abruptness of it all. Did Google have any idea the kind of reaction it was about to unleash?

The thing that comes to mind that someone said to me is that this is like an own goal for Google. As in, you know, when you accidentally score the other team a point by kicking the ball into your own team's net. It's just, you know, a complete thing that is going to work against them. Because as Google was well aware, Gebru voiced her opinions. I don't know how to describe how I felt. I don't remember. I was just in the, you know, I just felt I had to act.

So she tweeted: I was fired by Jeff Dean for my email to Brain Women and Allies. My corp account has been cut off, so I've been immediately fired. Right after her departure, Gebru's tweet went viral in tech circles and beyond. Last I checked, it was retweeted more than 2,400 times. The day after Gebru heard the news from Google, a group of academics in AI published a petition in her support. The petition called Google's treatment of Gebru "unprecedented research censorship" and "an act of retaliation."

As of March 17th, over 2,600 Googlers have signed it, along with more than 4,300 academic, industry, and civil society supporters. A hashtag started trending: #IStandWithTimnit. There were people who were literally spending so much time trying to help me that I never met before. And they were like organizing on my behalf. Jeff Dean sent a note to staff the day after Gebru's departure, which he later shared publicly, saying that there had been speculation and misunderstanding on social media about the situation.

He wrote that Gebru's paper was submitted late through Google's review process, and that ultimately, a quote, cross-functional team at Google determined it didn't meet the company's bar for publication because it allegedly ignored too much relevant research. I don't think I have ever written a paper with more references than this one.

I mean, it has like 160 references or something like that, right? So it ignored too much relevant research? Are you kidding me? I talked to Google about Gebru's response, and a representative for the company said its problem with Gebru's paper was more than just citations, that it took issue with the paper's conclusions. To Dean's point about timing, Gebru confirmed that she did submit her paper for internal review at Google the day before the deadline to submit it to a conference. So there was a tight timeline for Google to review this paper.

But Gebru, Hanna, and other academics I talked to said that's normal. Hanna and some colleagues even spun up a little study to test their claim. Hanna says they logged the submission data for around 140 Google research papers. "We found just under half of the papers submitted for pub approval are done so with a day or less notice to approvers." Google acknowledged that it's common for its researchers to submit their papers late and still get approval, but that in those cases, the research wasn't as sensitive and far-reaching, in Google's view.
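To picture that tally, here's a rough sketch of the arithmetic Hanna describes. The log file and its field names are hypothetical, since the real submission records are internal to Google.

```python
# A rough sketch of the tally Hanna describes. The CSV file and its field
# names ("submitted", "deadline") are hypothetical; the real approval log
# is internal to Google.
import csv
from datetime import date

def notice_days(row):
    # Days between submitting for internal review and the conference deadline.
    submitted = date.fromisoformat(row["submitted"])
    deadline = date.fromisoformat(row["deadline"])
    return (deadline - submitted).days

with open("pub_approval_log.csv") as f:
    rows = list(csv.DictReader(f))

short_notice = sum(1 for r in rows if notice_days(r) <= 1)
print(f"{short_notice} of {len(rows)} papers "
      f"({short_notice / len(rows):.0%}) gave approvers a day or less")
```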

For many of Gebru's supporters, Google's defense seems like a flimsy argument. But the paper is not the issue. I mean, the issue is to me, the issue is having someone who is vocal, who is a vocal Black woman, who does immense amounts of advocacy work.

Google disputed that Gebru was squeezed out for being vocal about the company's diversity issues. But with the way it handled Gebru's exit, I mean, if people outside Google hadn't heard her scathing critiques of Google's culture before all this, they sure have now. A week after Gebru's departure, Sundar Pichai sent an internal memo to staff where he pledged to investigate what happened around her exit. He said in the memo, "We need to accept responsibility for the fact that a prominent Black female leader with immense talent left Google unhappily."

I've heard the reaction to Dr. Gebru's departure loud and clear. It seeded doubts and led some in our community to question their place at Google. I want to say how sorry I am for that, and I accept the responsibility of working to restore your trust. I'm not exactly sure what he's doing to restore the trust, so I don't understand. And Meg, my colleague's corp access has been cut off for a while now. She still doesn't have it back. And I just don't understand what they're doing. It's just like they're doubling down.

What Gebru's mentioning here is something that happened to Meg Mitchell, the founder of the Ethical AI team. Mitchell publicly criticized Google for what happened to Gebru and was doing her own research about the incident, according to the New York Times. Soon after, she lost access to her corporate account. Mitchell was locked out of her account for weeks, until February 19th this year, when she was fired. I got back on the phone with Alex Hanna right after the news broke. Yeah, I mean, it was a gut punch to...

For me, I mean, I had two bosses that were fired within three months of each other. Google wouldn't actually even have or be seen as a respected leader in this space if it weren't for these two people. When I asked a Google spokesperson about why Meg Mitchell was fired, they wrote that it was for, quote, "multiple violations of our code of conduct as well as of our security policies, which included the exfiltration of confidential business-sensitive documents and private data of other employees."

We asked Mitchell for comment. She sent a statement that read, in part, that she's, quote, scared to respond directly to what Google is saying because, quote, I don't want them to keep escalating, and that she is not sure what the company is referring to about her allegedly violating its code of conduct. Quote, I loved working at Google and spent my time there increasing Google's success and building a culture that's inclusive for tech minorities. I think I was doing a good job.

Google has since said it's restructuring its ethical AI team and appointed Dr. Marian Croak, previously a VP of engineering at the company, to head the new team. Croak was one of the company's most senior Black executives. And in the wake of the outcry over Gebru's departure, in late February, Google made some other internal changes, like tying executive pay partly to reaching diversity goals, enacting new procedures around potentially sensitive employee exits, and increasing staff to help with employee retention. That's according to reporting from Axios.

In an email announcing the changes, Dean said that Google, quote, could have and should have handled the situation with more sensitivity and, quote, for that, I am sorry. But Hanna said that after all the turmoil the team has been through: It's hard to see a future for our team. It's hard to see a future for myself.

This doesn't give me any confidence that there can be any guarantees in doing effective social science research here without being censored. Google sent me a statement. It said it has hundreds of people working on responsible AI and has published over 200 publications on topics like reducing gendered correlations in pre-trained natural language models. Quote, "This research is incredibly important and we're continuing to expand our work in this area in keeping with our AI principles."

But still, many academics in AI and computer science say that what happened to Gebru could have a chilling effect on research in the entire tech industry.

The conference that Gebru submitted the paper to has now suspended Google's sponsorship. And Gebru's exit has made researchers question if there's a place for truly independent academic work at any of the tech giants. Not just Google, but companies like Microsoft, Facebook, Twitter, or Apple. Which is important because in some ways these companies are just as powerful, if not more, than some of the leading research institutions in the world. If you're someone who wants to understand the technology shaping all of our lives, you need access to what's under the hood at a place like Google.

And it's not only academics who are worried. Politicians have gotten involved, too. In December, nine U.S. members of Congress, including Senators Ron Wyden, Cory Booker, and Elizabeth Warren, signed a letter demanding Google answer questions about Gebru's departure and reaffirm its commitment to academic freedom.

So it's come to this: US senators reminding Google about academic freedom. The company that started as a dissertation idea between two Stanford PhD students. The company whose first policies were literally cribbed from Stanford's student handbook. And then of course, Google is still facing the NLRB complaint about the firing of Laurence Berland and Kathryn Spiers. If the company doesn't settle, the NLRB will take its case to an administrative judge. It has a hearing later this year.

With Dr. Timnit Gebru and Laurence Berland and his colleagues, we've just told you two stories about people who've left Google. But there's another side to the fight for its culture. There are the people who stayed.

Right now, inside Google, a small contingent of employees are building a labor organizing movement that big tech has never seen before. And the other tech giants, and even startups, are watching. At the time of this episode, over 800 Googlers now support a new union called the Alphabet Workers Union. This is a tiny fraction of Google's over 100,000 employees, but it's still meaningful because it's been incredibly rare for tech workers to even talk about unionizing before in Silicon Valley.

This new Alphabet Workers Union represents the idea that people at Google are not just going to let its culture go. Google built something special that's still kicking: this demand to talk about the bigger picture, about policy and society that you don't really see at any other major company.

And aside from the engineers who are so often in the headlines, there's another group of rank-and-file workers who have been pushing to have a say in how the company is run. Like the over 2,000 contracted Google cafeteria workers who announced they had formed a union with Unite Here in December 2019, after two years of quietly organizing. This was before the Alphabet Workers Union even existed.

Now look, most people at Google don't really have anything to do with these unions or organizing. People who are just focused on their jobs. But the small groups speaking out, they're making an impact. I know from my reporting that their activism at Google has inspired other action at companies like Amazon and Microsoft. And everyone at Google. No matter where they stand on its internal activism, all of them are about to weather an even larger storm. Because there's another giant coming for the company, the U.S. government.

And it's more ready than ever to regulate big tech. That's next week, in our finale: the story of how Google became the center of antitrust scrutiny, what might happen next in that fight, and what it means for Google that it's going to face this next era of regulation as a divided nation itself. Footage of the walkout and of Berland and Rivers' rally is courtesy of Bruce Hanna. We've talked to a lot of people for this episode to better understand Google's culture. So thanks to Kathryn Spiers, April Christina Curley, Jack Poulson, and Margaret O'Mara.


Art Chung is our showrunner. Nishat Kurwa is our executive producer. I'm Alex Kantrowitz. You can check out my weekly interview series, Big Technology Podcast, on your favorite podcast app. And I'm Shirin Ghaffary. If you like this episode, leave us a rating and review on Apple Podcasts and tell a friend. And subscribe to hear our next episode when it drops.