- archived recording
When you walk in the room, do you have sway?
Can you hear me now? OK, let me turn it down. OK, let’s go the other way. How about now?
Frances Haugen is the Facebook whistleblower, but right now, she’s taking an unusually scientific approach to her sound check.
I am an analog electrical engineer by training. So occasionally, I get to use it — how’s this? Bubblegum watermelon. Bubblegum watermelon.
Why are you — is that the word, bubblegum watermelon?
Yeah, there’s enough dynamic range on it.
Also it’s a thing where I did girls’ chorus in elementary school. And if you forgot the words to things, they would tell you to say bubblegum watermelon because it looks like you’re saying something real.
Oh, like when you’re fake in the background.
Yeah, bubblegum watermelon. [MUSIC PLAYING]
A few decades after her stint in elementary school chorus, Haugen, a former product manager from Facebook’s civic integrity team, rose to the national stage after she leaked tens of thousands of pages of internal research to the Wall Street Journal. She exposed just how much Facebook leadership knew about the harm the platform has caused. Among the revelations: that Facebook was aware of the mental health dangers Instagram poses for teen girls, that Facebook’s algorithm rewards rage and misinformation, and that the company had a different set of rules privileging the most high-profile people on the platform. I, for one, was not particularly surprised after covering Facebook for so long. But this — what Frances Haugen unleashed — was a firehose of bad. Since coming forward, Haugen’s been called one of the greatest sources of the century. She’s testified before American and European lawmakers. And I think she’s done a public service in helping crack open the black box that is Facebook — or Meta — or whatever you want to call the company these days. And now that her mic is working, I wanted to ask her about how she came to blow this particular whistle and what impact she thinks she’ll actually have.
OK, great. Frances, it’s nice to meet you.
Hi, nice to meet you.
So I think we’ll start at sort of the beginning when you joined Facebook in 2019, and you thought it had the potential to bring out the best in people. So why did you think that after all your years in tech, after the 2016 elections, Cambridge Analytica, Beacon — you can go back even further at Facebook. So give me a sort of sense of why you thought this was the place to do it.
Well, I very intentionally did not work at Facebook for a very, very long time. So I went to my first recruiting event there in 2007. And the thing that motivated me to join in 2019 was I had a very close friend who helped me put my life back together after I was ill in 2014. So I met him in 2015. He was an important part of my life throughout 2016. And he got radicalized over the course of the six months before the election online. And it was just like a searing loss for me. And so when I got reached out to by a recruiter, I said very clearly the only thing I would work on at Facebook is civic misinformation. And they happened to come back and be like, oh, by the way, we actually have a job doing exactly that.
So this disappointed you. And you thought it came directly from Facebook? Or was that where he was getting a bulk of his information?
I think it was more 4chan and Reddit, but it was one of those things where as I interacted with Facebook and I saw things — back in 2016, it used to be that when you would click on a post from a friend, this little thing would pop up underneath the post and would show other posts you might want to engage with. And I would sit there, and I would see what kinds of stuff was being shown there. And it was clear that sensationalistic things were getting the most distribution. And this specific thing was being weaponized by folks in Macedonia, and that this was a known vulnerability that was decaying our information environment dramatically in the run-up to the election.
All right. Did you encounter those problems at Google and Pinterest and even Yelp?
So it’s interesting to think about the difference between how Google operates. So each time they change their ranking algorithm, it’s not a decision that’s made blindly by an algorithm. It’s not like they just look at the performance of the metrics. They literally have these people called raters. And they’ll send out maybe 500 search queries, where they’ll run the old version of search quality and the new version of search quality, and they’ll look at the search results. And these humans, they will go in there and very carefully compare the search results and give you a judgment. And that’s a system where humans are deciding where the system goes forward. And what I found at Facebook was this kind of blind faith at many different levels of the org, that if you pick the right metrics, you didn’t have to make judgment decisions.
So I know Pinterest was suffering because I talked to them quite a bit about some of their issues. Was it because of the scale or that they were more aware of it at these other companies?
Hmm. I think that scale is definitely part of the problem at Facebook, where it’s a thing of the subject matter that you’re dealing with is just different. So at Pinterest, I remember in the run-up to the 2016 election, you could see that there were vulnerabilities that were not being adequately protected. But it’s one of these questions of most people don’t ask Pinterest for guidance on political topics, right? Most people don’t get their news from Pinterest. And so while Pinterest had problems, Pinterest also didn’t take on a greater level of responsibility by being the internet in most parts of the world.
And that’s what Facebook has intentionally done. Facebook is free to use in many countries in the world. But if you want to use the free and open internet, you’ll have to pay for it. And by choking off the free and open internet in these places, Facebook has taken on a higher level of responsibility.
They are the internet.
They are the internet.
People don’t realize in this country how much they are the internet everywhere else.
So when you got there, talk to me about your first day there. I have a certain impression of Facebook that has changed over the many years I’ve seen their various headquarters, but what was yours?
So my first day there was pretty innocuous. It’s like, they take your passport. They sign you up for an email account. You log in, that kind of thing. Very rapidly, it became apparent to me how chaotic things were because in every previous place I’ve been at, I’ve gotten to go through boot camp. And when I joined — it was in June, middle of June — my boss was like, we have to have a roadmap in a week or something for civic misinformation. You don’t have time to go to boot camp. And that was a giant red flag from the beginning because the reality of algorithmic systems is there’s enough context where you can’t go from 0 to 1, especially when your entire team is brand new. And my data scientist was new to the company, my engineering manager was new to the company. And that was the three of us. That was it. And so this was specifically around bootstrapping a civic misinformation team. So Facebook has wanted us to focus on their integrity problems in terms of censorship or in terms of content, like the idea that there are good ideas and bad ideas. And they don’t want us to focus on the idea that this is actually a problem about amplification and about systemic biases and product choices that Facebook has made.
Right. Often, Facebook tries to say, well, we’re just like other media. And I’m like, no you’re not. You’re not even close. You’re amplified. And I think I said amplified and weaponized in 2016, and they yelled at me for about an hour. Why did they have this point of view that they were just like other media and you were saying, no, this is amplified, this is weaponized. This is on a scale — it’s not like seeing a billboard and passing it.
Completely different scale.
Why do they not look at it that way from your perspective?
I think there’s a real thing of people are heroes of their own stories. Almost no one is the villain of their own story. And if you acknowledge power, then you also acknowledge responsibility. And I think if you walked around Facebook’s office, very rapidly, you would physically see their fetishization of flatness. They have the largest open floor plan office in the world.
It is. It’s exhausting.
Oh, I know. Oh, my goodness. So it’s a quarter of a mile long. It’s basically an aircraft hangar. But they do that because no one is above anyone else. No one is below, right? It’s flat. And when you refuse to acknowledge that power exists, you actually end up reinforcing the fact that power isn’t flat in the world. And so I think it’s one of these things where they haven’t been able to hit that point of maturation, where they say, actually, no, we have a lot of responsibility here.
So when you’re thinking about that idea of flat, you have the offices of Mark and Sheryl, and they’re always bragging about how they’re out in the open, this and that. You can see into their offices if you care to. Nobody really does. But it is a fetish of all of Silicon Valley, actually, this idea that there is no hierarchy, and in fact, everybody knows there is one. So what do you think that does when they’re making decisions? Is it the lack of accountability, or they don’t want to take responsibility for their actions? How do you look at them as — in that manner?
Yeah, a big part of the management philosophy of Facebook is the idea that if you pick the right metrics and let people run free, they can do whatever — that’s part of how you can empower people fresh out of college to do major actions. You’re like, you can do anything, no matter how crazy the idea, as long as you move the goal metrics. I think the idea is that they want to believe that this is a liberating management philosophy, right? Because it gives people freedom to try these crazy things. But the problem is if the metrics themselves are the problem and when Facebook shifted over to, quote, “meaningful social interactions” in 2018, it meant that it was very difficult to fix that problem within the context of Facebook because that’s a problem that needs leadership to solve, because the metrics are not going to come out and tell you their own problems.
And when you look at that, when you look at Mark’s influence, it’s very clear he’s influential, but pretends that they’re not. I’ve seen it. How did you experience that specifically?
So in the fall of 2019, Facebook came out with — it was the political speech policy. So it was around like, could politicians basically lie in political ads, among other things? So some of the context on that — the B.J.P., which is the political party of Modi in India, began referring to Muslims as termites. And there’s a phenomenon seen very clearly across ethnic violence zones around the world in cases of genocide that one of the precursors, one of the things that’s a red flag that you should really pause and, you know, sit up and take notice when you see, is that when we start describing minorities as vermin, be it insects, be it rodents, when we dehumanize them that way, it makes it easier for us to kill them. And so people inside the company flagged this. People in civic integrity came out and said, this is a giant risk factor. This needs to not be acceptable on Facebook. It’s too dangerous. And so this led to a period of soul searching. So they went out and put 30 researchers — this is people with Ph.D.‘s, political theorists. They went out into the public. They went out to academics. They did a very, very, very rigorous project for three or four months, right? And they came back to Mark with this thoughtful policy, saying, we’ve talked to lots of stakeholders. This is negotiating amongst all these different conflicting needs. And Mark looked at it and said, I don’t like this. And he went home that weekend and wrote his own policy. And the policy that came back is the one that was announced by Facebook. And the only problem with it was, no one had ever talked to anyone in ads before it was released. And so the policy as written was not enforceable because of things like Facebook doesn’t actually know who’s a politician.
And so it’s one of these things where it shows you how, when Mark has complete unilateral control, it doesn’t matter that 30 people who thoughtfully went out and consulted with experts, the public, clergy, all these people, it doesn’t matter that they did all that work because Mark didn’t like it.
Yeah. So was there a thing that got you thinking — was there a catalyst of doing this, in collecting this data? Was there one moment you’re like, that’s it, I’m going to do it. I’ve had enough. Was there a moment for you?
I had a long period of soul searching. So I was really fortunate to live with my mom and dad in 2020. And my mother is an Episcopal priest. She has a little rural parish in Iowa. And if you’re having a crisis of conscience, it’s really great having a residential priest at hand because you can be as neurotic as you want, as much as you want. And so I had a lot of time to think through what were my responsibilities and what were my duties because a thing that really destroys whistleblowers is they live with a secret that people’s lives are on the line and knowing that they personally don’t have the resources to fix those problems. And the moment that kind of catalyzed me to be like, well, now there is no hope of resolving these things internally was Facebook got rid of civic integrity in early December 2020, right after the election. Basically, they were like, there wasn’t blood on the streets. Therefore, we’ve succeeded. We can get rid of civic integrity. And I was just so shocked by this because in order to make organizational change, you have to have critical mass, and you have to have institutional endorsement. And when they dissolved civic integrity, it became very apparent to me that there was no way you were going to have critical mass. And there was no way you were going to have institutional endorsement of change.
Right, so what happened when that happened? You just were like, that’s enough, or what occurred?
So I was working on counterespionage at the time. So I was in the threat intelligence org. And so it took me a long time to think through how I wanted to do it, because I’d been told by multiple friends before I took even the civic misinformation job, given that you are going to be addressing the nation-state actors who are weaponizing Facebook, you should just assume that your devices are compromised. And so think about that for a moment. You’re in a situation where you think that you’re going to have to do something, but you also aren’t sure what devices you can trust. You don’t even know if you can google for help, right? And so I had to be very thoughtful and very careful. And I’m very fortunate that I got introduced to Whistleblower Aid, who was able to teach me about how do you do a lawful whistleblow.
Mm-hmm. And how do you start collecting. So what did you start collecting on? There’s all kinds of — you see pictures of whistleblowers. They print things, they run out, slip it in their underwear, whatever. How did you do that?
So I think it’s fairly obvious from looking at what has been published by Gizmodo or the Wall Street Journal, that there are pictures of my laptop screen. Beyond that, I can’t go into a ton of detail. But by doing that, it meant that nothing ever took place on my laptop.
So how did you decide what to collect? And the revelations you collected, later reported by The Wall Street Journal, run the gamut from Facebook profiting off people’s rage to internal research on the harm of Instagram on teen mental health. Look, I wasn’t shocked. I always thought Facebook was a firehose of bad. But you showed up with these receipts. And so what do you think from your perspective was the most important thing that you pulled out? Or how did you decide what to pull out and what not?
Hmm. So I feel an extreme amount of gratitude for Jeff Horwitz.
This is the Wall Street Journal reporter.
Mm-hmm, the Wall Street Journal reporter because when people know a lot about a topic, it’s not necessarily obvious to them what is or isn’t obvious. And the first thing I ever said to Jeff when he reached out was, I was like, I’m not ready to give you documents. And he was like, that’s totally OK. But if you’re willing to answer questions, that actually is hugely useful for my work. And so the fact that Jeff was around to just even be curious with me, right, be like, oh, why does x happen? And then I could go and poke around a little bit and come to a conclusion. That gave me a really good insight into what might be questions that would be important for the public to have answers to.
So he helped you do that, but was there anything you had seen that you’re like, this is ridiculous and people need to know?
Oh, yeah. My specific expertise is around algorithms. It’s one of these things where the issues around hyper-amplification and around concentration of voice, so in most places in the world, 1% or 2% of the population gets 80% of the reach, which is insane, and then specifically the bias of the most reach going to the most extreme, divisive, and polarizing content. So that was the thing that I wanted to make sure — I wanted to make sure as much as possible on how the newsfeed was constructed got out, as much as possible on groups, as much as possible on hate. And then I was also working on something called adversarial harmful networks towards the end of my time there.
And explain what they are.
Oh, sure. So an adversarial harmful network is something like QAnon, where it’s not just people having a conspiracy online. It’s people actively figuring out how to weaponize Facebook’s vulnerabilities and intentionally engaging in strategies to try to get around Facebook’s security defenses. And the thing that made me very, very concerned was not just that these movements existed because I do think they’re dangerous, but I was concerned about what Facebook was doing to address those movements. And so they were beginning to experiment with strategies on how to make those groups organically dissolve. And I may not agree with QAnon, but I believe that when people use Facebook to organize, they shouldn’t have their group mysteriously fall apart. They should know if they get taken off. And the fact that Facebook, instead of choosing just safer defaults, like putting a little friction in with sharing, going in and being more responsible with how they design their amplification settings — that’s a content-neutral intervention. It’s an idea-neutral intervention. It’s not about good and bad ideas. But instead, Facebook, because they knew they were up against a wall, they were starting to develop tools that I didn’t think could responsibly exist.
In other words, they were choking people in the dark. I think that’s what you’re saying, correct? And didn’t want to take responsibility because of the impact of the fact that they could choke people, which they do.
I would give the analogy of, someone would mysteriously trip in the dark or trip down the stairs. And they’re like, oh, I don’t know how he fell down the stairs. How did that happen?
It was Mark. Anyway, so I’m just wondering about — one thing that did get so much attention was the Instagram for girls thing. Why do you think that got so much attention? Because I think that was probably the least of the surprises. Why do you imagine that happened?
I think there is a real thing where parents are suffering around the United States. They’re watching their kids suffer. They feel really lost. And there is a real psychological — I’ve experienced this before where you get gaslit. And the process of being gaslit, which is when someone repeatedly denies things that are true to you, really erodes your sense of safety and your confidence in yourself. And when you have a moment where suddenly you are reacquainted with reality, it is like a breath of fresh air. Facebook straight up lied to Congress with regard to the safety of its platforms and children. And I think that resolution of that moment, of people knowing that kids are being harmed, having it be denied, and then, like you’re saying, having the receipts come out, I think that was, in many ways, a relief to people.
Yeah, in terms of seeing that. And obviously, politicians grab onto something like that.
Kids. So when you came forward, you got a lot of praise, obviously, but you got some heat. Because one of the things I actually pushed back on with a lot of Facebook people was how quickly they attacked you. And so I want to talk about some of the critiques that were made, many by Facebook and others. And it’s sort of a lightning round. Facebook definitely tried to undermine you, saying things like you had worked for the company less than two years, had no direct reports, never were in the room at the C-level meetings, which I thought was the most hysterical of all of them.
Love that line.
Yeah, I know. What do you say to that?
So the thing that Facebook didn’t disclose was what was my percent seniority in the company. So on our profiles, on the internal directory, you could see how senior you were as a percent of the company. And I don’t remember my exact percentage, but I was in the 60s or 70s in terms of seniority within the company. So I was more senior than 60% or 70% of the company.
So what about the C-level meetings? I thought that was particularly —
Wasn’t in the room, which, of course —
Wasn’t in the room.
Wasn’t in the room. I was like, how Hamilton of them.
Well, I think it’s — remember, I have read more Facebook documents at this point than probably all but a handful of Facebook employees. And so it can be great to say that I wasn’t in the room, but I have notes from meetings directly with Mark, where Mark intentionally chose to not take actions that didn’t cost the company any money to make places like Ethiopia safer and still chose not to. And I’m a data scientist. I have worked on these ranking systems and in more contexts than all but a handful of people in the industry. And so I don’t really like attention. I’m not someone who needs to be on center stage. And I’m happiest when I can sit and play with data. And so, yeah, I haven’t been in a lot of, quote, “C-level meetings,” but I have gotten to look into the guts of how Facebook’s algorithms work. And the results are pretty shocking.
Yeah, most people in C-level meetings are suck-ups, but in any case —
I can’t comment.
No, OK. So they said you’re not just this lone heroic whistleblower. Facebook tweeted about an orchestrated gotcha campaign.
Oh, I love that.
You love it, OK.
They just have the best lines.
I know. A conspiracy theory machine talking about conspiracy theories, in other words.
So they didn’t start using the conspiracy theory line until a broader set of journalists got the documents. And then they had a little panic. So I find it a little sexist when I read comments about how there’s no way I could be as competent as I am, or there’s no way I could be so poised when I speak, because I did national circuit debate in high school. My debate coach was so good, he became the head of the National Debate Association, right? He coached the number one and number two debaters in the country, and I was a mere number 25 when I was a senior.
And I thought I was crappy because —
Debate, chorus, all the things. Facebook is fucked. But go ahead. So the idea is that you couldn’t be articulate.
Exactly. The idea that I couldn’t do these things on my own I find kind of offensive.
Conservative media, of course, suggests that you’ve been advised by a Democratic operative.
I love that one, too. So like I said —
I like that you love them. “I love that personal attack!”
It’s because I’m so entertained. Think of all the juicy political dramas or movies you ever watched. I feel like I’m a character in one of them now. I’m like, ooh, what are they going to come up with next? So there is the attack that members of my comms team have worked for Democratic campaigns in the past.
To be really clear, the reason why my comms team got involved two days before the first Wall Street Journal article was there were not a lot of choices. And so it’s not like I went and surveyed 10 teams and was like, oh, I know what I’m missing from my life. I need a Democratic operative. It’s just, it’s what I had available.
Yeah, in other words, you’re not a spy for Elizabeth Warren.
I am not.
Anyway, next one’s important and the most real — a question of financial incentives. You stand to get a substantial upside if there’s an S.E.C. case. The payout could be 10% to 30% of total fines, which could be quite a lot. Was that a motivator for you?
If your investment strategy is to whistleblow in order to hit it big, it’s not a very wise strategy. Blowing the whistle is never a Plan A or Plan B or even Plan C. It’s Plan J or I or K or whatever. I’ve been really lucky to have had a long career in tech. And I have been lucky enough to make some good investments. And I live on savings. I didn’t need to do this for the money. If something happens, I think it’d be great to be able to invest it in things like tech reform or art.
And you also are a crypto investor, correct, early on?
I am, yeah. I think in general, decentralized technologies are super interesting. I do hold enough different investments in the crypto space that I am an enthusiast.
Yeah, now George Soros is not funding your efforts, but Pierre Omidyar is — someone I know very well, actually. Could you explain what he’s doing?
Now Pierre, for those who don’t know, founded eBay. He’s quite wealthy. He does a lot of advocacy work.
So to be super, super clear, I do not receive anything directly from the Omidyar Network. I have not received from anyone any direct financial compensation. I am just living on savings. What I have gotten direct support from is Whistleblower Aid. So they are my pro bono legal team. And Omidyar has in the past donated to Whistleblower Aid, though by the time my case came up this year, they had already gone through their small grant from them. And I also have received basically operational support in Europe through a non-profit called Luminate, which is part of the Omidyar Network. And to be really clear, there’s a difference between Omidyar’s foundation — he’s a very hands-off philanthropist — and say, having a billionaire with ominous strings dangling from me.
Got it. So no ominous strings.
No ominous strings, OK. So did Facebook at all try to contact you?
I’m not aware of Facebook contacting me. They haven’t contacted my lawyers. I have not seen any direct communications from them. I think they were going through maybe an emotionally frustrating moment. And so maybe they spoke more harshly than they intended. But they have not reached out to me directly.
Frances, they intended. I think they did. I’m sure there was a meeting where they decided. Were you surprised?
Well, that’s the interesting thing, is, like, from a strategy perspective, people have said there is a clear, successful crisis comms playbook. And Facebook did not follow the crisis comms playbook. And so part of why I’m willing to be charitable on some of the stuff is very clearly, they were coming from a very stressed place emotionally because you wouldn’t want to say those kinds of things about me because it makes you look petty.
So Facebook executive Andrew Bosworth, who you know of, recently did an interview with Axios journalist Ina Fried where Boz tried to punt responsibility for what happened at Facebook —
Of course he did.
— off the algorithm and then onto the user. Let’s play the clip and then I’d like your reaction.
- archived recording (andrew bosworth)
People want that information. And I know that you don’t want them to want it. I don’t believe that the answer is, I will deny these people the information they seek, and I will enforce my will upon them. That can’t be the right answer. That cannot be the democratic answer.
- archived recording
Is there somewhere in between where you’re not completely preventing them from getting that information, but you are making it less easy for it to spread?
- archived recording (andrew bosworth)
We’re doing that. We are on the middle path. You just don’t like the answers. But at some point, the onus is and should be in any meaningful democracy on the individual.
So I talk a lot about engagement-based ranking. So this is the idea of giving the most reach to content that can elicit a reaction from you, like a comment or reshare. That plays out on our News Feeds by giving the most reach to the most extreme and polarizing ideas because the fastest path to a click is anger. But it plays out in ads also. An ad that gets more clicks, more comments, more reshares is considered, quote, “a higher quality ad.” And the only problem with that is that Facebook’s advertising systems say higher-quality ads are cheaper ads. You get more engagement, we’re going to let you distribute this ad for way, way, way less money, to the point where an angry, hateful ad is going to be five to 10 times cheaper than an empathetic and compassionate ad. So for Bosworth to come out here and say, no, no, no, no, this is what people want, guess what? Mark Zuckerberg in 2018 wrote a white paper on this saying engagement-based ranking is dangerous because people are drawn to engage with extreme content, even though they don’t like it. We ask people, after they engage with this content, did you like this content? And they say, I didn’t like it. But secondly, they’re making money by subsidizing hate, right? We don’t get to have a free market of ideas because if you can put out five to 10 times as much hate per unit of empathetic, compassionate, finding a middle ground, your democracy will fall apart. So I find that idea that he’s just blaming the victim, I find that completely unacceptable.
Yeah, Andrew likes to put a lot of words together in a row. That’s what I would say. What I tend to think is that enragement equals engagement.
Mm, ooh, I love that line.
Please borrow it freely. I say it all the time. And then secondly, if you give someone a giant gun, they’ll shoot it.
Hmm. Can I give you another Bosworth quote?
So he has one from a couple of years ago about how the only thing that matters is connection. And if people kill themselves, if murders happen as a result of Facebook, it doesn’t matter because connection is more important than any of those things. [MUSIC PLAYING]
We’ll be back in a minute. If you like this interview and want to hear others, follow us on your favorite podcast app. You’ll be able to catch up on “Sway” episodes you may have missed, like my conversation with Sacha Baron Cohen, who also has some issues with Facebook, and you’ll get new ones delivered directly to you. More with Frances Haugen after the break.
Let’s talk about solutions. You’ve now testified before Congress twice. You’ve told them what reforms you wanted to see. What do you think the most essential actions Congress, which hasn’t acted very much when it comes to tech, needs to take if you had to list two or three?
So I really think we are way at the beginning in terms of thinking about what to do with these problems because Facebook has done a really good job of distracting us with the censorship debate. So we are very hotly arguing over, should you take off the platform XYZ content? Is there enough censorship, not enough censorship, when, in reality, the things we should be talking about are about platform design choices. So that’s things like, Twitter wants you to click on a link before you reshare it. Have you been oppressed by clicking on that link before you reshared it? Just stuff like that. And so the main thing I’m currently advocating for is around ways of doing more structured transparency because we are not in a place where we can even really have conversations yet around how to remedy a bunch of these problems. The other thing is I think Facebook would have never gotten as bad as it currently is if it hadn’t been allowed to operate in the dark. If there was a more structured way of doing transparency, in a way where when an academic or someone like you, you’ve been doing this advocacy work for years, and saying, Facebook, here are some problems you have. If Facebook couldn’t just say, oh, that’s anecdotal, it’s not real. If they had to actually articulate, this is what we’re going to do to fix it, and here’s some privacy-conscious aggregate data that would allow you to see am I actually making progress, we would have a completely different platform today.
Yes, in other words, get rid of the black box, the alleged black box.
It can’t be a black box.
It can’t be a black box.
So you think the disclosures will lead to change inside the company? They had an “I’m a victim” problem before, and now they really believe it. Casey Newton and others say naming and shaming is a weapon, and that it’s a good thing. What do you imagine is happening within the company?
So I’m a big proponent of the idea that people rarely change because you villainize them. And so I am still — if I could work at Facebook again, I would go and work at Facebook again because I think it’s —
They are doing some of the most important work in the world. And it’s not fixed yet. But until it’s fixed, I still strongly endorse, we need to go keep trying to fix it. But the secondary thing is they are starting to do — you see occasionally these little leaks on things, where they’re trying to dig more deeply into various actions around kids, more actions around misinformation. And I think there’s a different question, which is will they go far enough, which I don’t think they will until we have more mandated transparency, but that’s a different issue. I strongly believe Facebook has the capacity to change, and that one day, Mark is going to wake up and realize that he could stop living in fear. Because right now, he’s afraid. He’s estranged from the larger community. He doesn’t want to be the villain of the story, which is why he’s trying to change the conversation on all the Meta stuff and video games, the Metaverse. But I think there’s going to be a moment where the cost of staying the same is higher than the cost of changing. And at that point, I have faith that Facebook will change.
I think that’s really interesting, the idea that he’ll wake up someday. But I do not think he will wake up someday and do that. I think that you will flame out before he does.
The — you and people like you.
So one of the things that I lived through in the process of getting better after I was sick — so I was so malnourished from my celiac because I didn’t understand what was happening to me. My digestive system shut down enough that I was literally starving to death. But I was also paralyzed beneath my knees. And when your nerves heal, they hurt like when they get damaged. And so I want you to imagine living with such bad pain for years that it’s like your feet are on fire. So when people say, oh, you’ll flame out before he will, I guarantee you, I will last longer than Mark Zuckerberg will because I already have. And I think there’s a thing of, my intention in 2022 is to begin working on building out — I want to plant a youth movement because this is not the first time we’re going to have to wrestle with how do we govern. What are our rituals of governance for giant technological platforms? It’s just the first time that it’s been a crisis at this scale. And I invite anyone who wants to stop feeling angry to have another way forward because I think we can accomplish a lot, just given enough time.
I think that’s a great way of saying it. You sounded a little bit like a politician there for a second — not in a bad way. Would you ever consider running for office?
It was so interesting going to Europe. I guess it was a month ago now. Because one of the things about having celiac is I am sensitive enough that it is really hard for me to eat outside my home. And we worked really, really hard for me not to get sick, and I got sick at least once a week, every week we were on the road. I do not want a lifestyle where I have to be outside my home all the time. And I’m going to figure out how I can be helpful. But being a politician is a thing that until they figure out how to cure my neuropathy and cure my celiac, that’s not a thing that’s ever going to happen.
All right, we’re coming up on the critical midterm elections and the 2024 presidential elections. What needs to be done to make our democracy safer? And what needs to be done by Facebook in particular?
So I’m a strong proponent that we need to have flat C.P.M.s. So that’s cost per thousand impressions. So it’s a way that ads are priced. I believe we need to have flat ad rates for politicians because like I mentioned before, that idea that a hateful, divisive ad is 10 times cheaper than a loving or compassionate one, our democracy can’t exist if we subsidize tearing our democracy apart. So that’s the thing that has to happen as soon as possible. I think the secondary thing is right now, the way the political ads database is structured, Facebook likes to say the solution to bad speech is more speech. Right now, you can take out targeted ads, spreading misinformation in southeast Ohio. And if you don’t have the right person who meets those ad targets in that place, you will not be aware of what lies are being spread. And so you never get a chance to have counter-speech. And so I’m a strong proponent that the entire political ads database needs to be easily mass downloaded because that’s how people actually analyze these things. And Facebook has actively sabotaged people who tried to look at what was happening with speech in a systematic way on their platform. They’ll block their access to the ads A.P.I. They’ll do lots of things because Facebook doesn’t want anyone seeing the overall picture of how speech is being distorted on the platform. And then I think there’s a third thing that’s a critical, “we have to do it immediately” kind of thing. There are a number of things outlined in my disclosures that would be easy fixes for reducing misinformation. That’s things like, let’s imagine Alice writes something, Bob reshares it. Carol reshares it. So now we’ve just moved beyond friends of friends. It lands in Dan’s newsfeed. Imagine if Dan had to copy and paste whatever that is in order to share it. Has he been oppressed? I would argue no, he hasn’t. But that action alone —
Yeah, it’s friction. Cut the reshare chain at two. That has the same impact as the entire third-party fact checking program, and it works in every single language. And remember, the United States is a language-diverse place. These issues aren’t just about Ethiopia. They’re not about people far across the seas. In the United States, there’s lots of people using the raw version of Facebook that Mark himself says is dangerous.
So you’re saying we should help them stop and think before they do something stupid.
Yes. And Facebook knows these tools exist today, and they come at tiny little costs of profit for Facebook. We’re talking slivers. And so there’s this question of how much is our democracy worth? Is it worth 1% of Facebook’s profits? Is it worth 2% of Facebook’s profits? I would argue, yeah, it is. They shouldn’t be allowed to subsidize their profits with our safety and our democracy.
That’s a very good point. Frances, if you had to see Mark Zuckerberg — I don’t know if you’ve ever met him or anything like that.
I haven’t. I hope one day I do.
It’s a laugh riot. If you had to say one thing to him, what would it be?
I would say, Mark, you are a very smart person. You have so much capacity. You have the ability to build any version of Facebook you want to build. You’re the only person in the world who can govern Facebook. You don’t have to live the life you’ve been living. You don’t have to be afraid of people knocking on your door. You don’t have to be afraid of people glaring at you at a restaurant. You don’t have to be afraid of your kids having horrible things said to them at school. You can change. Facebook can change. You can live in a new world. And you are the single most powerful, most impactful person who can drive that change. You can have a different life. And I want you to have a happy, fulfilled life because that’s the person you can be, and I believe in you.
You’re very kind to them, Frances, I have to say. I know these people. I think you’re too kind to them.
Well, I believe that people rarely change by being vilified.
People change because — hurt people hurt people. And at this point, it seems obvious to me that Mark is a hurt person. And the only way we’ll have a world that we deserve to live in is if we give him a graceful way out because that’s what we all need right now.
Yeah, it’s amazing that we have to make the most powerful people in the world feel safe, but OK, maybe Facebook will bring you back so you can actually implement some of these changes or maybe you’ll just shame them into these changes. Either way works for me.
Or gently guide them.
Gently guide them. You’re a lot kinder than I. Frances, thank you so much.
My pleasure. It’s been great. [MUSIC PLAYING]
“Sway” is a production of New York Times Opinion. It’s produced by Nayeema Raza, Blakeney Schick, Daphne Chen, Caitlin O’Keefe, and Wyatt Orme. Edited by Nayeema Raza, with original music by Isaac Jones, mixing by Sonia Herrero and Isaac Jones, and fact-checking by Kate Sinclair, Michelle Harris, and Kristin Lin. Special thanks to Shannon Busta and Mahima Chablani.
If you’re in a podcast app already, you know how to get your podcasts, so follow this one. If you’re listening on the Times website and want to get each new episode of “Sway” delivered to you, along with a case of bubblegum watermelons, download any podcast app and search for “Sway” and follow the show. Thanks for listening.