
The story of Carol and Karen: Two experimental Facebook accounts show how the company helped divide America

In 2019, two users joined Facebook. Both had similar interests: young children and parenting, Christianity, civics and community.

"Carol," 41, was a conservative from North Carolina. She was interested in news, politics, then-President Donald Trump and the nation's first family. She followed the official accounts for Trump, first lady Melania Trump and Fox News.

"Karen" was the same age and lived in the same state. But she was a liberal who liked politics, news and Sens. Bernie Sanders and Elizabeth Warren. She disliked Trump. She followed a local news site, pages about North Carolina and the liberal advocacy group MoveOn.

Facebook's algorithms got to work, suggesting what they'd be interested in.

Accepting recommendations for sites supportive of Trump led Carol to suggestions for a site called “Donald Trump is Jesus,” and another for QAnon, a wide-ranging extremist ideology that alleges celebrities and top Democrats are engaged in a pedophile ring. Karen was presented with anti-Trump pages, including one that posted an image showing an anus instead of Trump's mouth.

The two women were not real. They were created by a Facebook researcher to explore how the social media platform deepened political divides in the U.S. by recommending content rife with misinformation and extremism.


The experiment shows that Facebook, which had 2.9 billion monthly active users as of June 30, knew before the 2020 presidential election that its automated recommendations amplified misinformation and polarization in the U.S., yet the company largely failed to curtail its role in deepening the political divide.

Reports describing the experiments are among hundreds of documents disclosed to the Securities and Exchange Commission and provided to Congress in redacted form by attorneys for Frances Haugen, a former Facebook employee. The redacted versions were obtained by a consortium of 17 news organizations, including USA TODAY.


In the summer of 2019, a Facebook researcher created two fictitious accounts with similar demographics but opposite political beliefs. Facebook's recommendation algorithm quickly suggested the users follow accounts on extreme ends of the political spectrum.

Jose Rocha said he's experienced the divisiveness firsthand.

A military veteran who grew up in a Democratic, pro-union family in Selah, Washington, Rocha said Facebook normalized racist views and led him down a rabbit hole to far-right ideologies.

For a time, Rocha said, he became a Nazi sympathizer and a backer of other extremist views – behavior he now blames on Facebook's recommendations system.

"I wouldn't have even known they existed if it wasn't for Facebook. So I wouldn't have went out seeking them," said Rocha, 27.

Bill Navari, 57, a conservative sports commentator from Pittsburgh, said a cousin blocked him on Facebook after he suggested she get her TDS ("Trump derangement syndrome") checked.

“I’ve seen people on Facebook saying, ‘If you are voting for Trump, unfriend me.' But I didn’t see anyone saying, ‘If you are voting for Biden, unfriend me,'” he said. “Facebook has become like oil and water, and never the two shall meet.”

These days, he steers clear of political debates on Facebook.

“I’ll post pics of my family, of my dog, where we went on vacation, and I stay in touch with the friends and family. But posting a meme or putting something on Facebook, it’s not going to change anyone’s mind,” he said. “I just think the conversation has become so coarse.”

Is Facebook to blame? “I don’t like pointing fingers without direct knowledge,” he said. “But I do think that Facebook is a party to this.”

The internal Facebook documents show how swiftly the platform's recommendation algorithms can amplify polarization by sending users to content full of misinformation and extremism.

The company's experiment with the hypothetical conservative user was called "Carol's Journey to QAnon." Within five days of going live on June 2, 2019, the user was barraged by "extreme, conspiratorial and graphic content," the researcher wrote.

One of the recommendations included an image labeling former President Barack Obama a "traitor" with a caption that read, "When we're done he'll claim Kenyan citizenship as a way to escape." (Despite racist claims to the contrary, Obama is a U.S. citizen.)

The report on the fictitious liberal user was called "Karen and the Echo Chamber of Reshares." That account went live on July 20, 2019. Within a week, Facebook's recommendations pivoted to "all anti-Trump content." Some recommendations came from a small Facebook group that had been flagged for "promoting illegal activity," the Facebook researcher wrote.

One image served to Karen showed then-first lady Melania Trump's face superimposed on the body of a bikini-clad woman kneeling on a bed. The caption read, "Melania Trump: Giving evangelicals something they can get behind."

Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee at the Russell Senate Office Building on October 05, 2021, in Washington, D.C. Haugen left Facebook in May and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chooses profits over safety. (Photo by Matt McClain-Pool/Getty Images)

Haugen, the former Facebook employee who has blown the whistle on the company, is a former product manager who worked on Facebook’s Civic Integrity team, focusing on elections. She had a front-row seat to the most divisive political events in recent memory, including the Jan. 6 insurrection in which Trump supporters tried to block Congress from certifying Joe Biden's win in the presidential election.

Concerned that Facebook was prioritizing profits over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May.

The documents, some of which have been the subject of extensive reporting by The Wall Street Journal and CBS News' "60 Minutes," detail company research showing that toxic and divisive content is prevalent in posts boosted by Facebook and shared widely by users.

"I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favor of its own profits," Haugen alleged during a Senate hearing this month. "The result has been more division, more harm, more lies, more threats and more combat."

Haugen has called on Facebook to limit its practice of prioritizing content that has drawn shares and comments from many users.

She has sought federal whistleblower protection from the SEC, alleging that Facebook, a publicly traded company, misled investors. She could get a financial award if the SEC were to penalize the company.

In this file photo illustration, a smartphone displays the logo of Facebook on a Facebook website background, on April 7, 2021, in Arlington, Virginia.

Facebook denies that it is the cause of political divisions in the U.S.

“The rise of polarization has been the subject of serious academic research in recent years but without a great deal of consensus," said spokesman Andy Stone. "But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization."

Facebook cited a research study that showed polarization has declined in a number of countries with high social media use even as it has risen in the U.S.

As for the test accounts, Stone said the experiment was "a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform."


Facebook tweaks its algorithms to increase engagement

After Russia used Facebook to interfere in the 2016 presidential election, pressure built on the company and its CEO, Mark Zuckerberg, to do something about misinformation and divisive content.

Meanwhile, critics charged that the company's apps exploited human psychology to hook people on social media, hijacking their time and undermining their well-being.

Facebook and Instagram ads linked to Russia during the 2016 election.

Especially worrying to company leaders was that users were less engaged on the platform. They scrolled through updates on their timelines, reading articles and watching videos. But they commented and shared posts less than before.

In response, Facebook radically altered the algorithm that determines what to display at the top of users' News Feed, the stream of posts from friends, family, groups and pages. The change was aimed at bringing users more updates from friends and family that spark meaningful social exchanges, the company said at the time.

But the focus on posts with high numbers of comments and likes rewarded outrage and resulted in the spread of more misinformation and divisive content, according to internal documents reviewed by USA TODAY. The more negative or incendiary the post, the further and faster it spread.

The change was noticeable.

Kent Dodds, a software engineer from Utah, said he rarely uses Facebook. In September 2019, he hopped on to voice his support for then-Democratic presidential candidate Andrew Yang.

Soon Dodds' News Feed shifted. Instead of seeing posts from his social circle, he was bombarded by political posts from distant Facebook connections.

“I remember coming away from that thinking, Facebook wants me to fight. They really want me to engage with these friends I haven’t talked to in a long time about their very different political views, and clearly not in a positive way,” Dodds said.

“Whether or not Facebook is intentional about what their algorithm is doing, it is responsible for it, and it’s doing harm to our society and they should be held accountable,” he said.

Dodds' experience wasn't unique. The company's research showed "how outrage and misinformation are more likely to be viral," one internal document says.

Particularly problematic were “deep reshares,” or posts from people who are not your friends and whom you don’t follow on Facebook. “Our data shows that misinformation, toxicity and violent content are inordinately prevalent among reshares,” another document says.

This content often went viral before Facebook could catch it, according to a study.

“This is an increasing liability. For example: Political operatives and publishers tell us that they rely more on negativity and sensationalism for distribution due to recent algorithmic changes that favor reshares,” according to the study.

Facebook tweaked its algorithm last year before the election to reduce the problem in civic and health-related posts. However, Zuckerberg was reluctant to expand those changes to other types of posts if it would decrease overall user engagement, according to an internal memo.

“Mark doesn’t think we could go broad,” the memo said.

Facebook confirmed another experiment that didn't go forward. In 2019, a company researcher proposed a project in which users would be paid for allowing the company to monitor their usage of the platform in the runup to the 2020 presidential election. The goal was to show that the platform does not drive polarization.

Facebook told USA TODAY the study was scuttled because it duplicated another project about the election – a partnership between Facebook researchers and academics – that has not yet been published.

The debates over user engagement and polarization are complex, said Eli Pariser, author of "The Filter Bubble" and a researcher and co-director of New_Public, an incubator seeking to create better digital spaces.

"I think it’s also pretty clear that the company had made a whole bunch of decisions to prioritize engagement, and those have had public consequences,” he said.


One user's rule: Don't mix friends and family on Facebook

Deanie Mills struggled to deal with those consequences.

Mills is a 70-year-old crime novelist and grandmother who lives with her husband on a remote West Texas ranch. Half her family are Democrats; the other half are old-school conservatives, many of them military veterans.

For years she bit her tongue at family gatherings. “I didn’t want to get into a barroom brawl over politics with friends,” she said.

In 2008, she joined Facebook and used her account to speak out against the Iraq War at the urging of her son, a Marine who had become disillusioned with the war effort.

Facebook friend requests from relatives started to roll in. “I thought, oh crap,” said Mills, who backed Barack Obama for president. “I support the troops 100%, but I don’t support this war and I don’t want to lose family over it.”

She created a rule: Don’t mix Facebook with family. Relatives agreed to stay in touch in other ways.

Today she said her heart breaks every time she hears about families and friendships ripped apart by Facebook feuds. The problem, she said, is that people have a predilection for sensationalism, fear and outrage.

“People just want to be whipped up,” Mills said. “And Facebook says, ‘Here’s your drug. Come back here in the alley and I can fix you up.'"

Experts who have studied Facebook say that's how the platform is engineered.

Brent Kitchens, an assistant professor of commerce at the University of Virginia, co-authored a 2020 report that found Facebook users' News Feeds become more polarized as they spend more time on the platform. Facebook usage is five times more polarizing for conservatives than for liberals, the study found.

"Everything leads me to believe it's not malicious, and not intentional, but it's something they're aware of from their engagement-based content curation," Kitchens said.

Chris Bail, the director of Duke University's Polarization Lab, said he believes Facebook has played a role in deepening political divisions, but he cautioned there are other factors. He partly blames social media users who – consciously or not – seek validation and approval from others.

"Changing a few algorithms wouldn't do the trick to change that," said Bail, the author of "Breaking the Social Media Prism."

Alex Mayercik, a 52-year-old from Houston, also blames human nature.

“I have often said to people: It was harder for me to come out as a gay conservative than it was for me to come out,” she said.

Her political views and support of Trump cost her friends on Facebook, including her best friend from grade school, she said. "These were people that were friends, that I knew, that I broke bread with, that I went to church with."

But she also blames Facebook.

“I feel it leans one way politically, and that does not promote open dialogue,” said Mayercik. “People have to disagree. It seems to me that whether it’s Facebook or Twitter or any other social media platform, everybody is entitled to have an opinion.”

Facebook removes guardrails after election

Haugen told U.S. senators this month she was alarmed when, after the 2020 presidential election and before the Jan. 6 Capitol riot, Facebook disbanded her team and turned off safeguards to combat misinformation and dangerous movements.

Removing those measures, such as limits on live video, allowed election fraud misinformation to spread widely and groups to gather on Facebook as they planned to storm the Capitol, she testified.

Protesters attempt to enter the U.S. Capitol building on Jan. 6 after mass demonstrations during a joint session of Congress to ratify President-elect Joe Biden's 306-232 Electoral College win over President Donald Trump.

"Facebook changed those safety defaults in the runup to the election because they knew they were dangerous. And because they wanted that growth back, they wanted the acceleration of the platform back after the election, they returned to their original defaults," Haugen said when she testified before Congress this month.

"The fact that they had to 'break the glass' on Jan. 6 and turn them back on, I think that’s deeply problematic," she said.

Facebook rolled back the measures when conditions returned to normal after the election, a decision "based on careful data-driven analysis," Nick Clegg, Facebook’s vice president of policy and global affairs, wrote in a recent memo to employees.

Some of those measures were left in place through February, he wrote. "And others, like not recommending civic, political or new groups, we have decided to retain permanently."

The Facebook researcher who created Carol and Karen suggested deeper changes. The platform's recommendations should exclude groups or pages with known references to conspiracies in their names, such as QAnon, and those with administrators who broke Facebook's rules.

The researcher, who criticized the company for failing to act sooner, left in August 2020 as Facebook banned thousands of QAnon pages and groups, BuzzFeed News reported. The FBI labeled QAnon a potential domestic terrorism threat in 2019.

Stone, the company spokesman, said Facebook adopted some of the researcher's recommendations earlier this year, such as eliminating the "like" button in the News Feed for pages that had violated the company's rules but had not yet been removed from the platform.

Duke University's Bail said Facebook should change its system in a more fundamental way.

Rather than boost posts that get the most likes, he said, the platform should boost those with a large number of likes from a cross-section of sources, including Democrats and Republicans.

Regardless of whether Facebook makes such changes, it has already lost its hold on Katie Bryan. The interior designer from Woodbridge, Virginia, said she got fed up with the spread of hate and misinformation by Trump supporters when he first ran for president. She responded by unfriending friends and relatives.

Now, she said, “I don’t really even enjoy logging on to Facebook anymore."

Since Haugen came forward, Bryan has deleted the Facebook and Instagram apps from her phone.

Contributing: Grace Hauck and Rachel Axon

