Editor's note: This story has been updated to include additional response from Facebook.
Facebook made a startling discovery while investigating how people used its products to exploit others: a U.S. sex trafficking network recruiting women from overseas and advertising illegal sexual services in domestic massage parlors.
The Facebook team charged with protecting users from real-world harm highlighted the findings in a November 2019 report touting how its investigation had identified and disrupted a criminal network with 40 potential victims.
In truth, law enforcement was already on top of the crime. The FBI had launched an investigation nearly 2½ years earlier after tips from the National Human Trafficking Hotline. A local police department also was working the case in Florida after receiving an anonymous letter saying, “I hope that more girls will not get hurt anymore.”
The internal Facebook report was among disclosures made to the Securities and Exchange Commission and provided to Congress by legal counsel for whistleblower Frances Haugen. The redacted versions were obtained by a consortium of news organizations, including USA TODAY.
The documents related to a group of sex spas raise questions about whether the social media company could have done more sooner to protect victims and how much it benefited financially from human trafficking activity on its platforms.
They could also land Facebook in hot water legally due to a 2018 law that says internet companies can be held criminally and civilly liable for promoting or facilitating sex trafficking.
What’s striking in the documents is internal discussion around how to curb trafficking on Facebook's platforms. The reason company investigators proposed clarifying policies on revenue from human trafficking? To “prevent reputational risk for the company,” according to the documents.
The spas referenced in the Facebook documents were run by Florida resident David Williams, who pleaded guilty to trafficking-related and financial crimes for operating more than a dozen illicit massage parlors across four states.
The documents say Williams and his ex-wife, Qun Shen, used dozens of Facebook pages and accounts to promote the parlors and relied on two marketing firms, one in the U.S. and one in India, to buy Facebook ads filled with keywords for potential sexual services.
Shen, who was not charged, said she never placed advertisements on Facebook and was separated from Williams at the time he was investigated by law enforcement. Williams did not respond to a request for comment through his attorney. He filed for divorce from Shen in September 2017 after less than two years of marriage.
According to the internal documents, the Facebook investigators also discovered that the network targeted women from areas where economic need is high, notably the Philippines. The network used a “Romeo” scheme, they said, posting comments on the women’s photos and sending friend requests. It followed up with romantic Facebook direct messages about marriage and sent money to help the women’s families.
In the U.S., women lived in massage parlors run by the network and worked typically 14 hours a day, the report said. The network turned again to Facebook to message potential clients and arrange appointments. It relied on generic stock photos on Facebook pages and other websites, as do many illicit spas, but also posted suggestive images of the Filipino women to drum up business.
Facebook investigators said in the report that they had “actioned” or disabled all 84 pages and 22 accounts associated with the network. A report was “probably being created,” they noted, to inform law enforcement “currently working this case to take the actions that they deem necessary.”
And, importantly, through the spa case, Facebook was able to identify and understand the modus operandi of the network to “get insights about how this type of problem could abuse our products.”
What Facebook did or didn’t do with that information could be damaging legally for the social media giant. An update to Section 230 of the Communications Decency Act known as FOSTA-SESTA makes it illegal for internet companies to act in “reckless disregard” of sex trafficking on their platforms.
“Facebook can’t stick its head in the sand,” said Maggy Krell, who worked on sex trafficking cases as a supervising deputy attorney general in California, including one against Backpage.com that ultimately led to its closure. “Once on notice that its site is being used to traffic someone, they must act.”
Facebook spokesman Andy Stone said the company proactively investigated the activity on its platforms and reported its findings to law enforcement, although he would not say who it told or when. He also said Facebook took down all accounts associated with Williams.
But USA TODAY found Facebook failed to remove at least three pages for spas associated with the network, including public posts promoting “special” and “full-body sensual massages” with winking emojis. After reporters reached out for comment from Shen, Williams and Facebook, the pages were taken down.
Criminal and court records reviewed by USA TODAY bear no mention of Facebook or the specific evidence it found. People closely involved in the case said they do not recall Facebook or its findings ever coming up.
Williams had been arrested three months before Facebook investigators posted their report to an internal message board in November 2019. He entered a guilty plea the same month.
It’s unclear whether the policy and enforcement changes proposed by investigators to curb exploitation were ever realized. Those proposals included clarifying how Facebook handles ad revenue from human trafficking operations and expanding existing programs to systematically find and disable similar massage parlor pages, as well as activity by suspected traffickers, across the company’s platforms.
Stone said the company prohibits human exploitation “in no uncertain terms.”
“We’ve been combating human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform,” Stone said.
USA TODAY has been investigating human trafficking for three years, finding that the tendrils of criminal rings behind illicit massage parlors can extend across the country and into seemingly legitimate enterprises such as massage schools. As part of that work, reporters were tracking Williams’ case.
During an investigation by the FBI, the IRS, the Department of Homeland Security and a half-dozen other agencies, agents identified more than 100 Chinese nationals with direct ties to Williams. Most were women who traveled extensively through the U.S., which investigators noted “is indicative of trafficking/moving females to different massage parlor locations.”
Williams pleaded guilty Nov. 14, 2019, to racketeering, transportation for the purpose of prostitution, harboring aliens for financial gain and money laundering, according to court documents. He was sentenced to three years in prison.
The internal documents indicate Facebook is well aware its products are used by traffickers but has made only incremental efforts to get ahead of the problem, all while downplaying it in public, according to an anonymous whistleblower disclosure sent to the SEC and reviewed by USA TODAY.
The Williams case is one of several in-depth studies by Facebook employees of human exploitation that resulted in similar proposed changes. Several documents reference a detection mechanism that had expired, which employees asked to be reactivated.
Others focused on a push for “soft actions,” or anything short of removing content from Facebook platforms. Even then, Facebook seemed hesitant to act if there was even a slight risk of affecting its user numbers, a sentiment that Haugen has articulated in hearings with lawmakers.
In one document detailing feedback from CEO Mark Zuckerberg on soft action proposals, an employee said the company was not rolling out the proposals on human trafficking or other dangerous content areas because of concerns over transparency. In another, soft action seemed to be prioritized in developing Facebook Dating, the company’s dating app, which Facebook pushed ahead with despite significant warnings that it could result in sexual exploitation of children.
Krell said Facebook should be taking note of actions by lawmakers and law enforcement in the past decade against other internet companies. That modern timeline begins in 2010, when Craigslist eliminated its adult services section after an outcry from state attorneys general after a man was accused of murdering a female masseuse he met through the site.
In 2015, a Senate subcommittee began a bipartisan investigation into sex traffickers’ use of the internet. Nearly two years later it published a scathing report showing that Backpage.com was deeply complicit in online sex trafficking. Around the same time, Krell’s office was filing charges against Backpage.com and its leaders, as was the state of Texas.
Backpage was seized in 2018 by the U.S. Department of Justice, which called it the internet’s leading forum for prostitution ads. The department filed a 93-count federal indictment against people at Backpage on charges of facilitating prostitution and money laundering. Company CEO Carl Ferrer pleaded guilty to both charges. The trial for several others is ongoing.
Later that year came FOSTA-SESTA, in which Congress specified that it is illegal to benefit from “participation in a venture” of sex trafficking. That law also gave states the right to pursue criminal action against those who violate the statute, and it allows trafficking survivors to sue platforms that were used in their exploitation.
Facebook ultimately supported the legislation.
Afterward, Craigslist further shrank its personals section, and federal authorities shut down CityXGuide, which was described as picking up where Backpage left off, according to a Justice Department press release of the seizure.
Throughout all this action, Krell points out, Facebook’s reach kept growing, hitting 2 billion users in 2017. Then this summer, the Texas Supreme Court ruled that Facebook could not broadly use a Section 230 defense in three civil cases that allege Facebook assisted, facilitated and knowingly benefited from the trafficking of minors on its platforms.
“This context should give Facebook some red flags that it needs to figure out what anti-trafficking policies it has and what it is doing to eliminate human trafficking on its platform,” said Krell, who documents her experience on the Backpage case in a forthcoming book, "Taking Down Backpage."
A review of the internal Facebook documents reveals the company has known its products were part of the life cycle of human trafficking for more than three years, including for recruitment, facilitation and exploitation.
In March 2018, Facebook employees identified Instagram profiles dedicated to selling domestic servants in Saudi Arabia – part of a practice in the Middle East that the United Nations and the U.S. State Department have found constitutes human trafficking. Facebook didn’t consider the profiles a policy violation and took no action at the time, according to the newly disclosed documents.
Facebook’s efforts escalated in 2019 when Apple threatened to remove the company's products from the App Store after a BBC investigation found domestic workers were being bought and sold on social media, including on Facebook-owned Instagram. The internal documents show Facebook knew about that practice before the BBC story broke.
Stone, the Facebook spokesman, pointed to a letter from the company to the U.N. Special Rapporteurs from 2020 regarding the company's mitigation efforts to combat domestic servitude and exploitation.
In the letter, Facebook said it had used technology to detect and remove more than 4,000 pieces of content that violated its policies. The company also launched a quality monitoring program in 2020, the letter said, that identifies gaps in policies and, around the same time, increased the amount of human trafficking content it removed.
Yet, throughout the documents reviewed by USA TODAY, Facebook seemed to prioritize maintaining or increasing its user base and potential revenue over protecting trafficking victims. Authors of one document said warnings directed at vulnerable workers could “alienate buyers,” or those buying labor contracts in potential trafficking situations.
Stone, however, said “it’s absurd to suggest this is the position of Facebook or its leadership, based on a single comment in an internal working document."
Facebook determined that one of the best courses of action was educational campaigns, according to the documents. It suggested a pilot program in the Philippines to warn potential victims of the dangers of recruiters and false advertisement.
Another of Facebook's suggested actions was donating proceeds of ad sales from traffickers. Stone said Facebook provides ad credits to groups that fight human trafficking, including Polaris, which runs the National Human Trafficking Hotline.
Krell said that option falls short of the social media platform's responsibility.
“Giving that money somewhere else, to me that’s so woefully inadequate,” Krell said. “Donating a few thousand dollars to some organization after the fact isn’t any kind of systematic solution to the problem.”
Contributing: Dian Zhang and Marisa Kwiatkowski
Cara Kelly is a reporter on the USA TODAY investigations team, focusing primarily on pop culture, consumer news and sexual violence. Contact her at email@example.com, @carareports or CaraKelly on WhatsApp.
This article originally appeared on USA TODAY: Facebook knew it profited from sex trafficking. Did it break the law?