
(Yet Another) Emergency Pod: Sam Altman Is Back

Publish Date: 2023/11/22

Hard Fork


Transcript

This podcast is supported by KPMG. Your task as a visionary leader is simple. Harness the power of AI. Shape the future of business. Oh, and do it before anyone else does without leaving people behind or running into unforeseen risks. Simple, right? KPMG's got you. Helping you lead a people-powered transformation that accelerates AI's value with confidence. How's that for a vision? Learn more at www.kpmg.us.ai.

Casey, where are you right now? Okay, first of all, you have to stop calling me, Kevin. I'm trying to be on vacation over here. I am literally in the middle of gate E8 at San Francisco International Airport. Like, is it loud there? How are you recording this? It's surprisingly quiet, but only because my flight just got canceled. No! What happened?

Well, step one was they were waiting on some paperwork. Step two was they opened up some sort of compartment and they said that they needed to replace a broken latch. And then during that process, they announced that something was cracked and they were taking the airplane out of service. So the best option for us was to hop on and chat while we wait for our next plane to arrive. And that's how my Thanksgiving has gone so far. I'm so sorry that happened to you. How's your

holiday going so far? Well, I would say it's been very relaxing. I've just been spending, you know, quiet afternoons, you know, reading books and catching up on sleep. No, of course, I have been in a frenzy of reporting and trying to figure out what the heck is going on behind all this drama. This is a sort of triumphal return of Sam Altman and some OpenAI employees have been saying things like, we are so back.

Well, before we get into the aftermath, Kevin, we should probably just quickly tell folks who may not have heard what has happened since we last came to them in an emergency podcast just hours ago. Yes.

I feel like this week has just been one never-ending emergency podcast. I feel like we have really not stopped recording since Friday. But basically, here's the deal. Late Tuesday, OpenAI announced that Sam Altman was being brought back after this five-day campaign that had been waged by Sam and his allies, OpenAI's employees, who had threatened to

quit en masse and go work at Microsoft, as well as the company's investors. They said they had a, quote, agreement in principle for Sam Altman to return and that they were overhauling the company's board of directors. So Adam D'Angelo, who is the chief executive of

Quora and was one of the board directors who had voted to fire Sam Altman, is staying on, but he is the only member of that board who is staying on. Helen Toner and Tasha McCauley, two of the other board members who voted to fire Sam Altman, are leaving the board, and

two new people are joining to replace them. Bret Taylor, who is an early Facebook executive and the former co-CEO of Salesforce, is coming onto the board. He will be the new chairman. And Larry Summers, the former Treasury Secretary, is also coming onto the board. So Casey, what was your reaction to this news?

I mean, on one hand, it was very surprising given that there had been a few failed attempts to return Sam to this position so far. On the other hand, though, I think by the time this happened, Kevin, this really was inevitable. And there was one particular detail I read in some reporting that I want to share right now. And this was the moment where I thought there is no way Sam Altman doesn't come back to this post. Can I just share this with you? Yes, please. So...

This is from a story written by Keach Hagey, Deepa Seetharaman, and Berber Jin at the Wall Street Journal. And in their piece on the matter, they said, and this is, of course, as the board is discussing the situation with some supporters of Sam Altman. And this is the quote. The board agreed to discuss the matter with their counsel. After a few hours, they returned, still unwilling to provide specifics, specifics in this case about why Altman was fired. The story goes on. They said that Altman wasn't candid and often got his way. The board said that Altman had been so deft they couldn't even give a specific example, according to the people familiar with the executives. So when the people who were trying to get Sam back asked the board, hey, no, seriously, why did you fire this guy? Their answer was, he's so good at being bad, we can't even tell you what he did bad. And that was the moment where I thought, this man is going to be CEO of this company again. Yeah, that's a good observation. And it dovetails with some reporting that my colleagues and I have been doing at the Times recently

about why Sam Altman was fired and about some of the conflicts between him and the board that have been going on for a while now. In particular, this conflict between Sam Altman and Helen Toner, one of the board members who is departing over this academic paper that she had written that sort of drew attention to

OpenAI in a negative light. And Sam Altman was upset about this. And this is sort of part of what sparked the disagreement between him and the sort of Helen Toner faction of the board. But obviously, you're right, we still don't know exactly what the trigger was for firing Sam Altman. But it seems to have been vague enough or unconvincing enough that the faction of the board that wanted to push him out was not able to stand their

ground. And ultimately, in these bargaining sessions, they agreed to bring him back in exchange for certain changes to the company's governance. That's right. I do think that that dispute that you mentioned is important to spend another beat on, though, right? Because I do think that the entire conflict is contained in this story. OpenAI is, of course, famously a for-profit company run by a nonprofit board. Helen represented the nonprofit board.

Sam, his duty is to the nonprofit, right? He is hired essentially by the nonprofit. But I also think that his loyalties have been much more to the sort of commercial corporate side of the venture, at least as this most recent drama has been playing out. And the paper that Helen co-wrote was a paper in part about AI safety. And the thing that she and her co-authors wrote was that OpenAI's

rival, Anthropic, which is, of course, co-founded by a bunch of former OpenAI people, had essentially built their product more safely than OpenAI had.

And so you can understand why in Sam's mind that was a betrayal, right? But you can also understand why in Helen's mind that was just her doing her job. Her job is to make sure that AI gets built in the safest manner possible. Her job is not to protect the reputation of OpenAI. And so that appears to be where the schism was. And even if that wasn't the trigger for why Sam got fired, I think it tells you a lot about what happened over the past week.

Yeah, so we'll see who gets added to the board in the coming days. This is not the final composition for the board. Are you throwing your hat in the ring, by the way, Kevin? I will not serve if elected. This company has already cost me too much sleep.

So it remains to be seen who will end up on the final version of the board. This is sort of being seen and portrayed as an interim board that's just there to kind of sort things out and ultimately decide who should be on the nonprofit board going forward. But I would say, you know, a couple things.

One is that Microsoft is definitely going to have a bigger hand in the governance of OpenAI going forward. You know, when Microsoft did this deal with OpenAI, investing billions of dollars in the company, they were kind of a passive investor, right? They did not have a seat on the board. They were not making the decisions about the future of this company, even though this company and its technology have become very important to Microsoft's future

business plans. The best joke I heard about that, by the way, was from Matt Levine, who wrote something like, Microsoft invested in a nonprofit at a valuation of $80 billion. Yeah, so Microsoft obviously will want to ensure that something like this doesn't happen again, that its investment in

OpenAI is not sort of jeopardized by this nonprofit board. And so I expect that they will want a board seat going forward. And the bigger picture here, I think, and this is something that I've been writing about today, is that this war in AI between sort of the capitalists

and the catastrophists. That's catchy. Yeah, thank you. The people who think that AI is going to be a powerful business tool and the people who worry that it's going to take over the world and destroy humanity. That war, I think, as of now,

is over. The capitalists have won. The people who are now in charge of the board of OpenAI are the kind of seasoned dealmakers and Silicon Valley insiders that you would expect to govern a for-profit technology company. They are not these kind of academics and ideologues who worry that AI could become too powerful and need to be shut down. And I think that's mostly the way that things are going to be from here on out.

Certainly, I think the pro-safety people have lost the most important perch of power that they had in this circumstance. At the same time, there is a faction throughout government, academia, journalism, and the industry itself that wants to build this technology in a safe way. So I don't think that disappears. But you're right, it did lose a lot of power. I think my sort of wrapping-up question for you, Kevin, is,

How much do you think this changes OpenAI? Is it the case that Monday morning rolls around and it is just back to business as usual for these folks? Or do you think that this crazy series of events will have affected Sam and the company in some profound way that might change what we expect to see from them going forward?

It's hard to say. I was talking to some OpenAI employees who were going back to the office to celebrate. They were having a party at the office. I did not get invited to that party, but I was hearing dispatches from inside of it. And at one point, the fire alarm of the OpenAI offices was set off by a fog machine. In case you want to do sort of a vibe check on how people at the company are feeling right now, they are very happy. They are celebrating. They are

you know, bonded. Nothing bonds people together like going through a crisis. Like the one we're having right now where I'm recording an emergency episode in the airport. It's true. I've never felt closer to you. Yes, I do feel very bonded to you right now, Casey. Um,

You know, the people I'm talking to, they think that this is going to be a real moment of reinvigoration for the company, that employees are feeling optimistic about the future, and they are now even more devoted to this mission of building AGI.

And, you know, obviously, I think there are going to be some people who come out ahead or behind in this kind of reorganized, reconstituted OpenAI. And there are a lot of questions we still don't know the answers to about how the company will change going forward. But I think if you're looking for a sort of clear before and after picture of OpenAI, I would say before there was this sense that there was this fragile structure

that needed to be sort of balanced, the needs of the business and the needs of the nonprofit. And now I think people feel like the business is firmly in the driver's seat. That all sounds right to me. I think the one thing I would add is that I do think the company and Sam Altman in particular are just going to be under more scrutiny now, right? We all learned a lot about the history of this company and of Sam Altman in particular over the past week. And I think to the extent that the company makes moves that are perceived as sort of

pro-corporate, pro-Microsoft and anti-safety. I just think they're going to get 10 times the attention that they did before all of this happened. And that might be a good thing, right? So at the end of the day, I think that the board that they had did not execute its responsibilities well and did need to go. But I do hope that you and I will keep our eye on some of the concerns that they were raising behind closed doors, even if they would never be straightforward about what those concerns actually were.

Totally. Well, Casey, thank you for taking one for the team and recording an emergency podcast from the airport. I hope that you are able to get on a new flight and make your Thanksgiving plans after all. Yeah. And let me just say to the people of OpenAI, this is the last one I have in me this week. Okay. I don't care what kind of crazy shenanigans you guys get up to with your fog machine and your fire alarms that you're pulling at company headquarters. You're not going to hear my voice again until next Friday. And you can count on that. That's the Hard Fork promise. We are going to take a vacation.

All right, maybe go grab that turkey out of the oven, Kevin. I'm starting to see some smoke coming out from the door behind you. That's just my fog machine in solidarity.


I'm Julian Barnes. I'm an intelligence reporter at The New York Times. I try to find out what the U.S. government is keeping secret. Governments keep secrets for all kinds of reasons. They might be embarrassed by the information. They might think the public can't understand it. But we at The New York Times think that democracy works best when the public is informed.

It takes a lot of time to find people willing to talk about those secrets. Many people with information have a certain agenda or have a certain angle, and that's why it requires talking to a lot of people to make sure that we're not misled and that we give a complete story to our readers. If The New York Times was not reporting these stories, some of them might never come to light. If you want to support this kind of work, you can do that by subscribing to The New York Times.

Thank you.

Buy a bag of Hills Pet Nutrition. Help feed a shelter pet. Buy another bag of Hills Pet Nutrition. Help feed more shelter pets. Buy Hills. Help feed shelter pets. Buy Hills. Help feed shelter pets. Every time you feed your pet Hills, you help feed a shelter pet in need, helping them become healthy, happy, and more adoptable. Science did that. Visit hillspet.com slash radio or tap the banner to learn more.