What's different for Facebook after this week's whistleblower testimony and massive outage?


On today's episode of 5 Things: Armed with tens of thousands of documents, Facebook whistleblower Frances Haugen testified before Congress this week, warning lawmakers that the company has repeatedly misled the public about how its platforms drive division and harm users, especially children.

Earlier in the week, Facebook saw one of its worst outages ever. Facebook's platforms were down for six hours and over three billion people were affected worldwide.

Facebook went down for a bit on Friday, October 8, 2021, as well.

We've known for a long time that Facebook and Instagram have harmful effects and we've known we're probably a bit too dependent on the platforms.

So why does this moment feel different?

Breaking news reporter Gabriela Miranda, tech reporter Mike Snider and politics reporter Matthew Brown sit down with host Claire Thornton to dissect Frances Haugen's Congressional testimony, explain what we know about Facebook's 'amplification' algorithm and discuss how everyday people had their lives upended by Facebook's massive outage last Monday.

Hit play on the player above to hear the podcast and follow along with the transcript below. This transcript was automatically generated, and then edited for clarity in its current form. There may be some differences between the audio and the text.

Claire Thornton:

Hey there, I'm Claire Thornton, and this is 5 Things. It's Sunday, October 10th. These Sunday episodes are special. We're bringing you more from in-depth stories you may have already heard. Armed with tens of thousands of documents, Facebook whistleblower Frances Haugen testified before Congress this week, warning lawmakers that the company has repeatedly misled the public about how its platforms drive division and harm users, especially children. Frances Haugen is kind of the first example of someone from inside Facebook coming forward with internal information about the company. My generation grew up with social media and we've seen the harmful effects of Facebook and Instagram for a long time. So what's different now? I'm joined by breaking news reporter Gabriela Miranda, tech reporter Mike Snider, and politics reporter Matt Brown. Thank you all so much for being here.

Gabriela Miranda:

Thank you.

Matt Brown:

Thanks for having us.

Mike Snider:

Sure, Claire.

Claire Thornton:

So this is kind of the first time we've had a big Facebook whistleblower with these credible things that she's bringing to the table and sharing with the public. Who is Frances Haugen, this Facebook whistleblower? What do we know about her?

Mike Snider:

Frances Haugen, she's 37, and she worked at Facebook for nearly two years. She worked on what she called the civic integrity team as a product manager to combat election interference and misinformation. Before Facebook, she was also a product manager at Google, Pinterest and Yelp. The reason she's stepping forward now, she said, is that Facebook disbanded the civic integrity team after the 2020 presidential race, slacked off its attention, and that helped lead to the January 6th attack on the US Capitol. So she said she'd lost faith in the company's commitment to protect users. So she copied thousands of pages of internal documents that she says show how Facebook lied to the public about its efforts to root out hate speech, misinformation and violence. She initially leaked that material to the Wall Street Journal, and then, after they'd done several stories, she appeared on 60 Minutes, showed her face, gave her name, and then she ended up in front of the Senate on Tuesday morning and afternoon.

Claire Thornton:

Yeah, what did she say in her testimony? And let's talk more about how, as you said, after the 2020 election Facebook disbanded that group she was a part of, maybe too soon, because two months later the insurrection happened. What else do we know about that?

Mike Snider:

Well, one thing she said was that Facebook is kind of in this workforce loop where it doesn't have enough people to cover all the projects it's doing. So a crisis arises, they focus on that, and then employees leave because of the crisis, and this loop snowballs, and they have fewer and fewer people to handle the problems they have to deal with. And one of the things she said they weren't monitoring enough is counter-espionage and counter-terrorism. In fact, she said this consistent under-staffing is basically, quote, "a national security issue." And she's already talking to other parts of Congress about that. I think she's going to be speaking to another committee next week.

Mike Snider:

So the problem is there's just... Facebook is in so many things and they affect so many people. I think all of us have used the figure that more than 3.5 billion people use Facebook's platforms, and we're talking about Facebook the site, Facebook Messenger the communications app, Instagram-

Claire Thornton:

Facebook Marketplace.

Mike Snider:

Oh yeah, that's all in there, yes, yeah. And then WhatsApp. So there are all these projects that are constantly being updated and people are on there doing things, but Facebook has to monitor some of that, because in any communication system, some people are doing bad things. The question is what is bad and when do they step in.

Claire Thornton:

Matt, you were covering the testimony. Do you want to add anything about what all Frances Haugen said and why it matters?

Matt Brown:

So, one thing about Frances Haugen that's really important to emphasize here is that she worked directly on democracy and misinformation issues for the company. She also worked on counter-espionage issues, where her main priority was to make sure that foreign governments, for instance, weren't interfering with Facebook's platform. So she really had a front-row seat to a lot of the major issues that she's now talking about in these documents and presenting to lawmakers. She didn't really mince words in the testimony. She said that Facebook, quote, "harms children, stokes division and weakens our democracy," and was very clear that Facebook's executives have consistently known about a lot of the issues on the platform and have chosen not to try to correct them because of the profit margins and profit incentives they have as a company. Interestingly, she did express that Facebook is not something she wants to see disbanded or broken up.

Matt Brown:

She said that she wants to see Facebook improved. She actually told the Wall Street Journal, "If people come out of this saying that they don't want to use Facebook, then I have failed," which I found very interesting and which she also made very, very clear to lawmakers. She was also clear that several of the issues she discussed with lawmakers were ones she herself did not work on, and that she didn't necessarily know all the internals at the company. But she didn't have to know all the internals, because she had brought a trove of documents to the Wall Street Journal, now seen by lawmakers as well, that showed specifically, for instance, Facebook's effect on harming children, an issue the company has been very concerned about for a while, Facebook's effects on human trafficking and drug cartels, and the platforms' propensity for causing extremism in some parts of the world. These are all things that Haugen either worked on or had access to information about, and now, really for the first time, we're seeing these internal documents from inside the company.

Claire Thornton:

Yeah, so Gabriela, Matt and myself, we've really grown up with social media and Mike, you've had lots of opportunities to cover Facebook over the years. We've known about the negative impacts of Facebook and Instagram for quite some time, what is different now? And do you guys think that we are at some sort of tipping point? The major Facebook outage last week was one of the worst ever for the company, so at the same time that we're learning more and more about the severity of the negative impacts from Facebook, it's clear that we're as dependent on the platform as ever. So what's different now?

Mike Snider:

Well, I think in the 2016 election, we have evidence that stuff was happening on Facebook that affected that. So there was this big build-up to try to make sure something didn't happen in the 2020 election. Then after the 2020 election, you have this January 6th thing. So on the politics side, we basically have proof that it happened. And now with the whistleblower documents, we apparently have internal proof, evidence of these emotional and personal harms that happened to people, younger people, on Instagram. So you've got that happening. Meanwhile, in the past, we've had issues where Facebook came to Congress to deal with privacy issues, where data had been leaked. So you have all these different things happening. And one of the people I talked to about this called it a perfect storm of things hitting Facebook right now. The problem is how do you regulate this?

Mike Snider:

Congress can hardly agree on anything, let alone how to regulate this. There is a partisan point of view on big tech, we don't need to get into that now, but depending on what party you're in, you probably have a different opinion of what needs to be regulated about big tech. There is something called Section 230 of the Communications Decency Act that could be affected, and what it does is it basically protects online publishers from liability for user-generated content. But what do you do? Do you change that just to affect Facebook? What would the repercussions be for other social media outlets and other online places where people put up content? So there's no agreed-upon solution here. There are some antitrust issues with the state attorneys general and, I think, the FTC that are still unfolding, possibly in the courts, that could lead to a potential breakup of some of the things that Facebook owns.

Mike Snider:

There's been a lot of concern and outcry about the fact that companies like Facebook basically buy up the competition. They bought Instagram, they bought WhatsApp. They didn't create those, they just bought them. So there's a lot of... I don't think anybody has a specific map of what's going to happen, but there's a lot of feeling that there is momentum, that something has to happen here. In fact, Mark Zuckerberg, when he's been to Congress and in things he's posted online, has said he wants regulation. And I think most of big tech wants some kind of regulation. The question is, are they going to get what they want? Or will they like what they get? There has to be some type of regulation.

Claire Thornton:

Yeah, anyone else want to add onto the question of what's different now with what we know about the negative impacts?

Matt Brown:

Well, one of the things that I think was very different, in the testimony at least that Haugen gave and that lawmakers were talking about, is you really just saw a difference in the line of questioning that lawmakers were interested in going down. It wasn't a discussion of what is wrong with Facebook in very specific instances, this or that post, or why was this taken down? It was really a zeroing in on the algorithm itself. Haugen consistently talked about a phenomenon called amplification algorithms, which is basically Facebook saying: if there's engagement with a certain type of content, then we're going to continue to boost that content to people, and that's how we're going to keep people on the platform. If this content is what you all want to see, then we're going to continue to promote it.
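
To make that mechanism concrete, here is a minimal sketch of what engagement-weighted ranking can look like. It is purely illustrative: the names, weights and structure are assumptions made for the example, not Facebook's actual system, which remains private.

```python
# Illustrative sketch of engagement-based "amplification" ranking.
# All names and weights are hypothetical assumptions; Facebook's
# real ranking system is private and far more complex.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int


def engagement_score(post: Post) -> float:
    # Weight reactive signals (comments, reshares) more heavily than
    # passive ones (likes); this is the dynamic Haugen argues favors
    # divisive, "twitchy" content.
    return post.likes + 5.0 * post.comments + 10.0 * post.reshares


def rank_feed(posts: list[Post]) -> list[Post]:
    # Surface the most-engaged-with content first, so posts that
    # already have engagement keep getting boosted: the feedback
    # loop described above.
    return sorted(posts, key=engagement_score, reverse=True)
```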

Matt Brown:

That's very good for Facebook because it's good for their bottom line; it means there's more content out there that can be monetized by advertisers. But that's not necessarily good for individual users, for people, for society, because the content that often goes viral on these platforms, as we all know, is content that makes us angry, content that makes us maybe feel worse about ourselves, content that makes us feel like we're missing out. Haugen described it as needing to make the platform less twitchy, less reactive, less viral, not just to protect people's mental health, but literally for the stability of democracy. We're not just talking about our collective health here, we're talking about the functioning of our government and our civic society. Another thing that was so different about this testimony is that for the longest time, we have understood that misinformation going viral and extremism have been serious issues on these platforms, but that understanding came from people on the outside, who've been able to do the research and say, "We're going to create certain programs that simulate what it's like to be on these platforms."

Matt Brown:

But because of a bit of a wonky term called black box algorithms, we don't have access to what's actually going on inside. So we can do a test and say, "Oh, well, if we make this account and it follows these certain posts, then it's going to go down a rabbit hole that's going to radicalize the person, for instance." But we're not able to actually look at Facebook's code and see, "Oh, well, why is that the case? Is that going to be the case all the time? Does it actually happen at scale?" What Haugen has shown us is that Facebook, for the longest time, has actually been doing those exact experiments with full knowledge of what's going on inside of Facebook. They've been able to do much, much larger-scale experiments, actually knowing the full code and the full process of how the algorithm works, and to come to a much more definitive conclusion that this algorithm is causing mental health issues.
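
The outside-in audits described above can be pictured as a simple loop: register a fresh account, repeatedly follow whatever the recommender surfaces, and record how the content drifts. Here is a toy sketch under that assumption; the recommend() function is a hypothetical stand-in, since the real one is precisely the black box outside researchers cannot inspect.

```python
# Toy sketch of an outside-in "rabbit hole" audit. The recommend()
# function is a hypothetical stand-in: in reality it is the black
# box that outside researchers can only probe behaviorally.

import random


def recommend(history: list[float]) -> float:
    # Stand-in recommender: returns a content "extremeness" score in
    # [0, 1], nudged slightly above whatever was engaged with last.
    last = history[-1] if history else 0.1
    return min(1.0, last + random.uniform(0.0, 0.15))


def audit(steps: int = 20, seed: int = 42) -> list[float]:
    # Simulate a fresh account that always follows the next
    # recommendation, recording how extreme the content becomes.
    random.seed(seed)
    history: list[float] = []
    for _ in range(steps):
        history.append(recommend(history))
    return history


if __name__ == "__main__":
    trajectory = audit()
    print(f"start={trajectory[0]:.2f}, end={trajectory[-1]:.2f}")
```

An auditor can only observe the trajectory, not why it drifts; Haugen's point is that Facebook can run the same kind of experiment at scale with the code in hand.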

Matt Brown:

This algorithm is causing extremism in communities, this algorithm is causing people to be angrier, but it's also causing them to be more engaged. And when researchers at Facebook brought this information to the top executives, they consistently said, these are the effects and if we want to make them better, we have solutions that would potentially make a lot of the issues that are going rampant on this platform better, but they would probably decrease engagement on some level.

Claire Thornton:

And they didn't do anything about it.

Matt Brown:

And consistently, Facebook's executives said, we are going to choose keeping people engaged and increasing the scale of our platforms, instead of saying, we're going to scale this down and maybe not grow as quickly and not increase profits as quickly. So the fact that we now know from inside the company how definitively this went is what really makes this situation so much more of a game changer. Haugen compared Facebook's behavior on this issue to the tobacco companies before regulation came for them, and to the automakers before we were required to have seatbelts in our cars. So she's really treating this as a historic moment that needs regulation. She kept urging that this is something Facebook is not going to correct on its own, that its executives have consistently shown they are not interested in correcting this on their own, even though they know what they need to do. And that's why she's now turning to Congress and saying that someone from outside the company needs to act.

Mike Snider:

That's the civic integrity team she was working on, the one they shut down right after the election. And he's right, yeah, that's the connection there. And he mentions these amplification algorithm aspects. So think of yourself as a person on the platform. The example she used was, you're a girl on Instagram and you might be thinking about food, looking at pictures of food people put up, like, "I went to dinner with my friends and took a picture of the food." But if you keep looking at food, there might be a way the algorithm gets you to anorexia content. And that could make someone think... There are a lot of body image issues among teenagers, and they see this, and it could send them down the wrong path. And the other aspect is the political aspect on these sites. After the election, the big statement was "stop the steal." That could send you to more extremist content because of these engagement algorithms.

Claire Thornton:

So toxic.

Gabriela Miranda:

And just speaking to the toxic environment social media has, I think what's different about this generation, and I'm about 23, is that, like Mike and Matt said, you'll go on social media and there are these picture-perfect images of different people while you're doomscrolling, or you do come across anorexia content or body shaming. And I think our generation, thanks to whistleblowers and other people speaking out, is kind of realizing, okay, maybe our dependency on social media shouldn't be there. And I think with the outage, a lot of people were just like, "Okay, let me reflect on why I'm so attached to these platforms and on the negative sides of it." So I think that's also one of the reasons this feels different, and this whistleblower and these testimonies feel a little bit different, just because I think we're all opening our eyes a little bit.

Claire Thornton:

Yeah, so Frances Haugen's testimony came days after this massive outage that you reported on, Gabriela. Over 3 billion people use Facebook platforms. Facebook dominates the market, to say the least. And when we can't communicate on these platforms, it's a huge deal. I remember the day of the outage, I was actually off from work. I didn't really hear about it until after the workday had ended, and I remember thinking, "Oh my gosh, I'm glad I was off, otherwise my work probably would have been affected." And people's communications with their families were affected. Gabriela, what did people tell you about how the outage affected their lives and those of their families?

Gabriela Miranda:

So similar to you, I was also off the day of the outage, but I use WhatsApp a lot just to communicate with my family in Puerto Rico and the Dominican Republic. So I had it on my mind, and then I came to work on Tuesday and realized a lot of people were tweeting and talking about how the outage affected them and how six hours of silence from their families was torture. Across a lot of Latin America and countries abroad, I spoke to people saying, "I couldn't talk to my family and my dad had been in the hospital," or "I couldn't reach my mom and I know she needed my help." And I think it may seem insignificant to a lot of Americans, who think, "Okay, six hours of not communicating, it's fine. You should just find another application or just call." But international fees are so expensive, texting is so expensive, and WhatsApp and Facebook Messenger provided this free and really quick way to communicate with people. And a lot of sources I talked to were just like, "We only use WhatsApp and that's all we knew."

Gabriela Miranda:

In Brazil and Afghanistan, that was the easiest way, and they didn't have other options like Telegram or Signal downloaded, especially the older community. So, yeah, the outage was terrifying for them, because there were emergency situations, or there were six hours of silence and they didn't know what was happening with their families. So they were able to share those stories and kind of shed light on how serious it was and how dependent they were on these apps.

Claire Thornton:

Did they say anything about how they realized, "I'm too dependent, it's too much for this to have this big of an impact on my life"?

Gabriela Miranda:

Yeah, I talked to three people, and there was a man whose wife was in Mexico; he had just recently moved to Texas on a work visa. They both only had WhatsApp, Facebook Messenger and Instagram downloaded, and all of those platforms offer video chats. So he had planned to watch his daughter's birth over video chat, but his wife went into active labor during the outage. And he just said, "I'm heartbroken. I was so reliant on this app, I never questioned it, and for not having a backup plan, I missed out on something so huge." And another woman, whose family is still in Kabul, Afghanistan, her sister was attacked the night before the outage, and due to the time difference, she basically missed a whole day of updates on her sister's condition. So I think all of these people were just like, "We can't have this happen again. We need to download Signal. We need to download Telegram, Viber." And I know all of those alternative platforms had huge spikes in users during the outage too.

Gabriela Miranda:

So I think everybody that I spoke to at least said, "We have backup plans now, we made our family download Telegram, we made our family download these other apps just in case this happens again," because an outage can be deadly or really scary for these families.

Claire Thornton:

Wow, those are incredible examples. So given all of this, Facebook is just really in hot water this week. Mark Zuckerberg's response to all this was unusually defiant. In the past, when Facebook has had to testify before Congress about its algorithm and extremism on the site, Zuckerberg has apologized and said, "We're trying to do better. We're working to fix this." This time, he didn't take that tack. What does his different response this time around mean?

Mike Snider:

There is a potential that we may not even be able to understand the algorithms that Facebook uses. One thing Frances Haugen said was that Facebook probably has the most expert team of AI folks anywhere. So when Zuckerberg says, "What she's saying doesn't make any sense," or "That's not true," it could turn into a he-said, she-said. Even if we had a third party go into Facebook's software, would someone be able to understand it? I'm not smart enough to tell you the answer to that, I'll be honest with you, but that might be why he's saying something different now. As I said, in the past he has said that there needs to be regulation, because they don't know what to regulate. Now, whether that's fair, whether we should hold that up and say that's a great way to look at it or not, we can debate about that all we want, but it's interesting that his response is basically to deny what she said happened. Again, there's evidence, but does that evidence really bear out if we look at it hard? I'm not sure.

Claire Thornton:

When he says he wants regulation, do you think that could be a strategy on his part to take the blame off himself and his company and kind of put it on someone else's shoulders?

Mike Snider:

For sure, for sure. And then, there's another possibility, I'm not sure how likely this is, but if people start learning what regulation is going to happen, they may push back on legislators. Let's be honest, I think most of us, most of the populace likes Facebook more than it likes Congress.

Claire Thornton:

Okay, I'm following.

Mike Snider:

So if it was... I mean, I don't think we're ever going to throw him into the ring and have a death match and see who comes out standing, but he could also be playing to the crowd that's going to come eventually, when we get closer to some type of regulation. I'm not really sure. And actually, the experts I talked to don't see how this could happen, which is what makes me not feel bad about saying I'm not really sure. I mean, we're barely keeping the checkbook open in our nation right now; we can't decide how to do that properly. Our political leaders can hardly get anything done, so the idea that there would actually be any regulation could be a pie-in-the-sky thought.

Matt Brown:

Well, I do want to push back just a little bit on that last point. One report that we came out with just yesterday was talking about the potential for momentum on this issue. As Mike said, obviously, Washington is Washington and Congress is a very difficult place to see anything get passed. But the tone at the hearing was a bit different in, I'd say, two places. One, we really saw lawmakers asking substantive technical questions about how Facebook works. They weren't asking about their particular partisan pet issues, even though many of them have done that in other hearings. You weren't hearing nearly as much about antitrust and the monopolistic powers of big tech from Democrats; you weren't hearing nearly as much about free speech issues or conservative voices being suppressed as you do from Republicans. You heard a lot of conversation instead about specifically how the algorithms at Facebook work, and Haugen was very good at describing how, at the end of the day, a lot of the issues we're talking about come from this amplification model that Facebook goes after.

Matt Brown:

And it should be noted, again, that a lot of social media companies do this, but Facebook is the biggest of them all, so they are obviously the worst offenders in many ways. The other thing is that, alongside this rhetoric of cooperation on algorithms, you did see a deep focus from lawmakers very specifically on the effects of social media on children. They were very unanimously interested in how we can update laws to make sure that youth and children on these platforms are not being targeted by advertisers, for instance. How can we make sure that, even if no one else is able to escape engagement and algorithmic targeting, young people are not targeted? How can we make sure that young people especially don't have their data collected? It was interesting to see that particular bipartisan space come up on the Hill, where it seemed to be: whatever we think of certain issues around Section 230, for instance, which also came up during the hearing, there's one thing we agree on.

Matt Brown:

We think that we need to be updating some of these laws around child privacy. And that's also the only place you're really seeing bipartisanship on the issue. Democrats and Republicans are both very eager to say, "Well, I have my bill that I've proposed on this issue," but it's not going to go anywhere in the current 50-50 Senate unless it has bipartisan support. And protecting children is the one place where both parties are able to say, "Okay, this is somewhere we think we can make headway." And that's probably another reason why Mark Zuckerberg is so frustrated and/or anxious about the situation, because it is true. We saw in the documents that children are an area where Zuckerberg and Facebook at large have a big anxiety. The reason they were doing all this research on why Facebook and Instagram were bad platforms for children and youth is that young people aren't going there.

Matt Brown:

They're going to Snapchat, they're going to TikTok. They're just watching more stuff on YouTube, they're hanging out on Netflix. They're going to other parts of the internet where Facebook isn't able to keep monetizing them. And that's very, very scary for a company like Facebook that used to be the it kid in town. Zuckerberg started this off when he was in college to appeal to college people, and now it's the old thing. "Facebook is for old people" is how one of the documents summarized it, and no, seriously, that's what their own research told them. And it's an issue for the company when they are looking at their future. So we see these documents saying, "Oh, Facebook was trying to figure out how we can insert ourselves into the playdates of six-year-olds," or "Facebook is trying to figure out how we can make the platform better for teenage girls so that they'll want to hang around here more often."

Matt Brown:

These are issues that are not coming from a place of strength for Facebook. Partly it is because so many of us are already spending all of our time on these platforms, but any sign of weakness there is potentially existential for Facebook. It's social media; the whole reason we're on there is because everyone else is on there. So if all your friends are leaving, then you get a negative feedback loop. And what we're seeing right now is that, within that broader concern Facebook has about no longer being the cool it thing for the young people who make them all the money, the threat to those exact same young people might be the thing that causes Congress to act. And both of those things are existential for Facebook, both in the near term and in the long term.

Claire Thornton:

That is fascinating.

Mike Snider:

Yeah, I think it's just something that's not... It's not going to go away. I mean, even in the next few weeks, I'm sure we'll have more happening on this, but thanks for having us.

Claire Thornton:

Mike, Matt, Gabriela, thank you all so much for being here.

Gabriela Miranda:

Thank you so much, I'm happy to be here.

Mike Snider:

You bet.

Matt Brown:

Thank you so much.

Claire Thornton:

Check out stories from Gabriela, Mike and Matt at the links in the episode notes. I'd love it if you could give us a rating and review on Apple Podcasts, and if you're a big fan of 5 Things, tell your friends about the show, tell your family, tell your neighbors. I know personal recommendations are one of the biggest ways people learn about podcasts. Taylor Wilson will be back tomorrow morning with five things you need to know for Monday. Thanks for listening, I'm Claire Thornton, I'll see you next time.