In a bombshell interview on CBS’s 60 Minutes, a former Facebook employee went public with claims the company profits from hate speech. Hours later, Facebook and other apps it owns went down across the world.
“[Facebook’s] own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” Frances Haugen said in the interview.
“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
Thousands of pages of documents suggest Facebook ignored hate speech and misinformation after 2020 election
Haugen left the company in May after realizing Facebook was allowing hate and misinformation to spread. Before quitting, however, she secretly copied thousands of pages of internal documents revealing the organization’s actions.
According to these documents, Facebook knowingly “took action on” as little as 3-5% of hate speech and less than 1% of violent speech on the platform.
When she was recruited in 2019, Haugen said she would only accept the offer if she could fight against misinformation. According to 60 Minutes, Facebook assigned her to the Civic Integrity team created to help protect elections. Immediately after the 2020 election, however, the company did away with the team completely.
“They told us, ‘We’re dissolving Civic Integrity,'” Haugen said.
“Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast forward a couple months, we got the insurrection.”
Company’s actions “a betrayal of democracy”
In addition to getting rid of her team, Facebook also ended several safeguards put in place before the election. The safeguards remained off even as dangerous lies about election fraud spread. Federal investigators and prosecutors claim that many of those involved used the platform to openly plan the January 6th insurrection.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before,” Haugen said, adding that Facebook made the decision “to prioritize growth over safety.”
“That really feels like a betrayal of democracy to me,” she continued.
By Monday morning, just hours after the interview aired, Facebook’s site was down. Other platforms like Instagram and WhatsApp (both owned by Facebook) were also not working.
By Monday afternoon the New York Times was reporting that all operations at Facebook headquarters were also shut down. According to the newspaper:
“Facebook’s internal communications platform, Workplace, also went down on Monday, leaving most employees unable to do their jobs. Two Facebook workers called it the equivalent of a ‘snow day.’”
Facebook was even forced to take to Twitter, its social media rival, to announce updates about the outage. In a tweet on Monday morning, the company said:
“We’re aware that some people are having trouble accessing our apps and products. We’re working to get things back to normal as quickly as possible, and we apologize for any inconvenience.”
Haugen claims company also allowed content deemed “harmful” to teen girls
In addition to claims of willful failure to curb hate speech and misinformation, Haugen also accuses the company of ignoring how its platforms affect young girls.
Scott Pelley, who interviewed Haugen, noted that “One of the Facebook internal studies that you found talks about how Instagram harms teenage girls. One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse; 17% of teen girls say Instagram makes eating disorders worse.”
This research, according to Haugen, also told Facebook that this harmful content “actually makes [teenagers] use the app more”.
“Facebook’s own research says it is not just that Instagram is dangerous for teenagers… it’s that it is distinctly worse than other forms of social media,” she continued.
According to CBS, Facebook declined an interview, but did issue a statement saying, in part, “we continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
Screenshots of internal conversations between employees of the company, however, imply otherwise. In posts after the January 6th attack, many employees voiced frustration with company leadership. One employee called the actions of company executives “wishy-washy” and wrote that they struggle “working for a company that does not do more to mitigate the negative effects of its platform.”
Whistleblower to testify before Congress this week to urge regulations
Haugen, for her part, will be testifying before Congress this week. She says her goal is not to inspire more anger or outrage toward Facebook, but instead to inspire change.
“I have a lot of empathy for Mark [Zuckerberg],” she said. “Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.”
“I’m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place,” Haugen said at the end of the interview.
“That’s my hope.”