Facebook, other apps crash hours after “damning” whistleblower interview

by Nate Morris

In a bombshell interview on CBS’s 60 Minutes, a former Facebook employee went public with claims the company profits from hate speech. Hours later, Facebook and other apps it owns went down across the world.

“[Facebook’s] own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” Frances Haugen said in the interview.

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

Thousands of pages of documents suggest Facebook ignored hate speech and misinformation after 2020 election

Haugen left the company in May after realizing Facebook was allowing hate and misinformation to spread. Before quitting, however, she secretly copied thousands of pages of internal documents revealing the organization’s actions.

According to these documents, Facebook knowingly “took action on” as little as 3-5% of hate speech and less than 1% of violent speech on the platform.

When she was recruited in 2019, Haugen said she would only accept the offer if she could fight against misinformation. According to 60 Minutes, Facebook assigned her to the Civic Integrity team created to help protect elections. Immediately after the 2020 election, however, the company did away with the team completely.

“They told us, ‘We’re dissolving Civic Integrity,'” Haugen said.

“Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast forward a couple months, we got the insurrection.”

Company’s actions “a betrayal of democracy”

In addition to getting rid of her team, Facebook also ended several safeguards put in place before the election. The safeguards remained off even as dangerous lies about election fraud spread. Federal investigators and prosecutors claim that many used the platform to openly plan the January 6th insurrection.

“As soon as the election was over, they turned them back off or they changed the settings back to what they were before,” Haugen said, adding that Facebook made the decision “to prioritize growth over safety.”

“That really feels like a betrayal of democracy to me,” she continued.

By Monday morning, just hours after the interview aired, Facebook’s site was down. Other platforms like Instagram and WhatsApp (both owned by Facebook) were also not working.

By Monday afternoon, the New York Times was reporting that operations at Facebook headquarters had also ground to a halt. According to the newspaper:

“Facebook’s internal communications platform, Workplace, also went down on Monday, leaving most employees unable to do their jobs. Two Facebook workers called it the equivalent of a ‘snow day.’”

Facebook was even forced to take to Twitter, its social media rival, to announce updates about the outage. In a tweet on Monday morning, the company said:

“We’re aware that some people are having trouble accessing our apps and products. We’re working to get things back to normal as quickly as possible, and we apologize for any inconvenience.”

Haugen claims company also allowed content deemed “harmful” to teen girls

In addition to claims of willful failure to curb hate speech and misinformation, Haugen also accuses the company of ignoring how its platforms affect young girls.

Scott Pelley, who interviewed Haugen, noted that “One of the Facebook internal studies that you found talks about how Instagram harms teenage girls. One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse; 17% of teen girls say Instagram makes eating disorders worse.”

This research, according to Haugen, also told Facebook that this harmful content “actually makes [teenagers] use the app more.”

“Facebook’s own research says it is not just that Instagram is dangerous for teenagers… it’s that it is distinctly worse than other forms of social media,” she continued.

According to CBS, Facebook declined an interview, but did issue a statement saying, in part, “we continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

Screenshots of internal conversations between employees of the company, however, suggest otherwise. In internal postings after the January 6th attack, many employees voiced frustration with company leadership. One employee called the actions of company executives “wishy-washy,” writing that they struggle “working for a company that does not do more to mitigate the negative effects of its platform.”

Whistleblower to testify before Congress this week to urge regulations

Haugen, for her part, will be testifying before Congress this week. She says her goal is not to inspire more anger or outrage toward Facebook, but instead to inspire change.

“I have a lot of empathy for Mark [Zuckerberg],” she said. “Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.”

“I’m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place,” Haugen said at the end of the interview.

“That’s my hope.”

