The Facebook whistleblower finally spoke publicly on 60 Minutes Sunday. Frances Haugen, who worked on Facebook’s Civic Integrity Team until it was dissolved following the 2020 election, called out the social media giant for prioritizing profits over public safety.
“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” Haugen said, later adding, “Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing — it is paying for its profits with our safety.”
Haugen believes Facebook has contributed to divisions in society, and joined the United Nations in blaming the company for real-world ethnic violence. In 2018, the U.N. said Facebook played a “determining role” in the spread of hate against Rohingya in Myanmar. Facebook has also repeatedly come under fire in the U.S. for being a platform on which individuals spread hateful and violent rhetoric, along with mountains of misinformation about subjects ranging from politics to cultural issues.
“When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other,” Haugen said. “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
While Facebook has said in the past that it will work to curb the spread of hateful and violent rhetoric on the platform, Haugen says the company’s algorithm actually promotes it.
“It is optimizing for content that gets engagement, a reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing — it’s easier to inspire people to anger than it is to other emotions,” Haugen said. She later added to that point, saying, “Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction, and the more anger that they get exposed to, the more they interact, the more they consume.”
Ahead of the 2020 election, Facebook put safeguards in place to protect against misinformation about the election and its results. But Haugen says the company removed those safeguards soon after the vote, allowing the platform to be used in planning the Jan. 6 insurrection at the Capitol.
“As soon as the election was over, they turned them back off, or they changed the settings back to what they were before to prioritize growth over safety,” Haugen said, “and that really feels like a betrayal of democracy to me.”