Striking terror —

Lawsuits: OnlyFans bribed Instagram to put creators on “terrorist blacklist” [Updated]

Thousands of adult entertainers claim they were added to a "blacklist."

(Update, 5:27 pm ET: A GIFCT spokesperson clarified how the “blacklist”—or more accurately, in its terms, its terrorist content database—works to log terrorist content across different online platforms. She says only videos and images are currently hashed, and nothing gets automatically removed from other platforms. Instead, once content is hashed, each platform considers factors like the type of terrorist entity involved and the severity of the content, then weighs those factors against its own policies to decide whether the content qualifies for removal or a content advisory label.

The GIFCT spokesperson also noted that Instagram accounts are not hashed, only Instagram images and videos, and there is no “blacklist” of users, although GIFCT analyzes who produces the content it hashes. The database records hashes that signal terrorist entities or terrorism content based on the UN Security Council’s sanctions list of terrorist entities. All that content remains in the database unless a GIFCT member platform like Meta uses a feedback tool, introduced in 2019, to flag the content as not qualifying as terrorist content. The feedback tool can also be used to recommend modified content labels; currently, that’s the only way to challenge content that gets hashed. GIFCT members also maintain active discussions on content moderation through GIFCT’s “centralized communications mechanism,” and the spokesperson says none of the complaints raised in the lawsuit have been mentioned by members in those discussions.
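For illustration, the decision flow the spokesperson describes, in which an image or video is hashed, context about it is recorded, each member platform weighs that context against its own rules, and entries are contested only through the feedback tool, could look roughly like the sketch below. This is a hypothetical Python sketch: the class, field, and policy names are invented here, and GIFCT’s actual systems are not public.

```python
from dataclasses import dataclass

# Hypothetical model of a hashed database entry, per the description above:
# only images and videos are hashed, and each entry carries context
# (entity type, severity) rather than triggering automatic removal.
@dataclass
class HashedEntry:
    content_hash: str        # fingerprint of an image or video, not an account
    entity_type: str         # e.g. drawn from the UN Security Council sanctions list
    severity: str            # e.g. "graphic", "glorification"
    contested: bool = False  # set via the feedback tool introduced in 2019

def platform_decision(entry: HashedEntry, policy: dict) -> str:
    """Each member platform weighs a hashed entry against its OWN policy.

    Nothing is removed automatically; the outcome might be removal,
    a content advisory label, or no action at all.
    """
    if entry.contested:
        return "no action"  # flagged as not qualifying via the feedback tool
    if entry.severity in policy.get("remove_severities", []):
        return "remove"
    if entry.entity_type in policy.get("label_entities", []):
        return "advisory label"
    return "no action"

# Example: two platforms reach different outcomes for the same hash.
entry = HashedEntry("a3f9...", entity_type="sanctioned group", severity="glorification")
strict = {"remove_severities": ["graphic", "glorification"]}
lenient = {"label_entities": ["sanctioned group"]}
print(platform_decision(entry, strict))   # remove
print(platform_decision(entry, lenient))  # advisory label
```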

About two years ago, GIFCT became an independent nonprofit, and since then, it has released annual transparency reports that provide some insights into the feedback it receives. The next transparency report is due in December.)

Original story: Through the pandemic, OnlyFans took over the online adult entertainment world to become a billion-dollar top dog, projected to earn five times more net revenue in 2022 than in 2020. As OnlyFans’ business grew, content creators on rival platforms complained that social media sites like Facebook and Instagram were blocking their content but seemingly didn’t block OnlyFans content with the same fervor, creating an unfair advantage. OnlyFans’ mounting success while its rivals floundered seemed to underscore its mysterious edge.

As adult entertainers outside of OnlyFans’ content stream looked for answers to their declining revenue, they realized that Meta had allegedly targeted their accounts for bans not only for posting supposedly inappropriate content but seemingly also for suspected terrorist activity. The more they dug into why they had been branded as terrorists, the more they suspected that OnlyFans had paid Meta to put the mark on their heads—resulting in account bans that extended beyond Facebook and Instagram to popular social media apps across the Internet.

Now, Meta has been hit with multiple class action lawsuits alleging that senior executives at Meta accepted bribes from OnlyFans to shadow-ban competing adult entertainers by placing them on a "terrorist blacklist." Meta claims the alleged scheme is “highly implausible” and that it's more likely that OnlyFans beat its rivals in the market through successful strategic moves, like partnering with celebrities. However, lawyers representing three adult entertainers suing Meta say the owner of Facebook and Instagram will likely have to hand over documents to prove it.

Meta and its legal team did not immediately respond to Ars’ request for comment, but in its motion to dismiss, Meta argues that even if “a vast and sophisticated scheme involving manipulation of automated filtering and blocking systems” had been launched by Meta employees, Meta would not be liable. As a publisher, Meta says, it is protected by the First Amendment and the Communications Decency Act and free to moderate content created by adult entertainment performers as it sees fit. The tech company also says it would be against its own interests to manipulate algorithms to drive users off Facebook or Instagram and onto OnlyFans.

Fenix International Limited, which owns OnlyFans, also filed a motion to dismiss, claiming that the lawsuit has no merit and that OnlyFans enjoys the same publisher protections as Meta. Neither Fenix nor its legal team immediately responded to Ars’ request for comment.

A spokesperson for Millberg, the legal team representing the adult entertainers, provided documents filed last week in response to both companies' motions to dismiss, which Millberg considers “meritless.” They say the First Amendment and CDA protections cited by Meta don’t apply because plaintiffs aren’t suing over their content being blocked but over allegations that the companies engaged in unfair business practices and “a scheme to misuse a terrorist blacklist.”

Rather than see their complaint dismissed, plaintiffs asked the judge to reject the motions, which, if granted, would by law prevent discovery in the case, or, if the judge is persuaded by the motions to dismiss, to allow limited discovery before deciding. A Millberg spokesperson says this is just the start of a long legal process and that they expect their request for discovery to be granted. That would mean Meta and OnlyFans would have to share evidence to disprove the claims, which, as of yet, neither has done.

It’s likely that any judgment on the companies’ motions to dismiss will influence how the companies defend against other lawsuits. For the Millberg class action lawsuit, a hearing is scheduled in the Northern District of California on September 8. The judge will be William Alsup, who some may recall received media attention in 2014 for siding with a woman who contested the federal government’s no-fly policy and for recommending a process for correcting mistakes so that the US does not label people as terrorists who aren’t. Adult entertainers are hoping he’ll be equally sympathetic in helping them remove that unearned label.

What is this terrorist watch list?

It’s not just adult entertainers suing. Rival adult entertainment platforms FanCentro and JustFor.Fans are also suing, claiming that their social media traffic dropped so dramatically that “it could not have been the result of filtering by human reviewers.” Instead, they allege that Fenix funneled payments from a “secret Hong Kong subsidiary into offshore Philippines bank accounts set up by the crooked Meta employees” to pay off Meta and tank its rivals’ traffic.

To achieve maximum effect in its alleged mission to delete rival adult content from the Internet, Fenix allegedly asked Meta to add 21,000 names and social media accounts to a terrorist blacklist that would ensure their content wouldn’t be displayed on Facebook, Instagram, Twitter, or YouTube.

The Global Internet Forum to Counter Terrorism (GIFCT) was co-founded in 2017 by owners of major social media platforms and other companies “to prevent terrorists and violent extremists from exploiting digital platforms.” Whenever a content moderation system flags an account on one platform, a digital fingerprint called a hash is shared with all the other platforms so that the image, video, or post won’t show up anywhere.
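In broad strokes, the hash-sharing described above can be pictured like the sketch below: a platform that flags a piece of media computes a fingerprint of the file and adds it to a shared database, and other member platforms compare uploads against those fingerprints. This is a simplified, hypothetical sketch that uses an exact-match SHA-256 digest for brevity; per the update at the top of this story, only images and videos are hashed and nothing is removed automatically, and none of the names here come from GIFCT’s actual tooling.

```python
import hashlib

# Hypothetical shared database of fingerprints contributed by member platforms.
shared_hash_db: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Compute a fingerprint of an image or video file.

    A plain SHA-256 digest is used here for simplicity; a real
    hash-sharing system would rely on hashes designed to match
    visually similar media, not just byte-identical files.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def flag_on_platform(media_bytes: bytes) -> None:
    """One platform's moderation system flags content and shares its hash."""
    shared_hash_db.add(fingerprint(media_bytes))

def check_upload(media_bytes: bytes) -> bool:
    """Another platform checks an upload against the shared database."""
    return fingerprint(media_bytes) in shared_hash_db

# Example: content flagged on one platform is recognized on another.
video = b"...raw video bytes..."
flag_on_platform(video)     # platform A hashes and shares the fingerprint
print(check_upload(video))  # platform B gets a match: True
```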

Critics like the Electronic Frontier Foundation have said the practice limits users’ rights to free expression across the Internet any time a post gets mistakenly flagged, with little recourse for users to get off the terrorist list or even to confirm whether they’re on it. The GIFCT told the BBC that it continually works to “enhance transparency and oversight of the GIFCT hash-sharing database” by extensively engaging with stakeholders.

GIFCT did not immediately respond to Ars’ request for comment. The Millberg legal team says it wants to begin discovery in September by asking Meta and GIFCT to share records that would show whether 21,000 Instagram accounts were improperly branded as terrorists.
