How Twitter's Child Porn Problem Ruined Its Plans For an OnlyFans Competitor

An anonymous reader quotes a report from The Verge: In the spring of 2022, Twitter considered making a radical change to the platform. After years of quietly allowing adult content on the service, the company would monetize it. The proposal: give adult content creators the ability to begin selling OnlyFans-style paid subscriptions, with Twitter keeping a share of the revenue. Had the project been approved, Twitter would have risked a massive backlash from advertisers, who generate the vast majority of the company's revenue. But the service could have generated more than enough to compensate for losses. OnlyFans, by far the most popular of the adult creator sites, is projecting $2.5 billion in revenue this year — about half of Twitter's 2021 revenue — and is already a profitable company.

Some executives thought Twitter could easily begin capturing a share of that money since the service is already the primary marketing channel for most OnlyFans creators. And so resources were pushed to a new project called ACM: Adult Content Monetization. Before the final go-ahead to launch, though, Twitter convened 84 employees to form what it called a “Red Team.” The goal was “to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly,” according to documents obtained by The Verge and interviews with current and former Twitter employees. What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not — and still is not — effectively policing harmful sexual content on the platform.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” the Red Team concluded in April 2022. The company also lacked tools to verify that creators and consumers of adult content were of legal age, the team found. As a result, in May — weeks after Elon Musk agreed to purchase the company for $44 billion — the company delayed the project indefinitely. If Twitter couldn't consistently remove child sexual exploitation content from the platform today, how could it even begin to monetize porn? Launching ACM would worsen the problem, the team found. Allowing creators to put their content behind a paywall would mean that even more illegal material would make its way to Twitter — and more of it would slip out of view. Twitter had few effective tools available to find it. Taking the Red Team report seriously, leadership decided it would not launch Adult Content Monetization until Twitter put more health and safety measures in place. “Twitter still has a problem with content that sexually exploits children,” reports The Verge, citing interviews with current and former staffers, as well as 58 pages of internal documents. “Executives are apparently well-informed about the issue, and the company is doing little to fix it.”

“While the amount of [child sexual exploitation (CSE)] online has grown exponentially, Twitter’s investment in technologies to detect and manage the growth has not,” begins a February 2021 report from the company’s Health team. “Teams are managing the workload using legacy tools with known broken windows. In short, [content moderators] are keeping the ship afloat with limited-to-no-support from Health.”

Part of the problem is scale, the report says; the other part is mismanagement. “Meanwhile, the system that Twitter heavily relied on to discover CSE had begun to break…”

Source: Slashdot – How Twitter’s Child Porn Problem Ruined Its Plans For an OnlyFans Competitor