Twitter discussed creating an OnlyFans clone to monetize the adult content that has been prevalent on the platform for years, but its inability to effectively detect and remove harmful sexual content put the brakes on that idea, according to a Verge investigation. A team Twitter assembled to determine whether the company could pull off such a move concluded this spring that "Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale." The team's findings were "part of a discussion, which ultimately led us to pause the workstream for the right reasons," Twitter spokesperson Katie Rosborough said.
Twitter is said to have halted the Adult Content Monetization (ACM) project in May, not long after it agreed to a $44 billion sale to Elon Musk, a deal that is now up in the air. The company's leadership determined that it couldn't move forward with ACM without enacting more health and safety measures.
The investigation (which you can read in full here) details warnings that Twitter researchers made in February 2021 about the company not doing enough to detect and remove harmful sexual content, such as Child Sexual Abuse Material (CSAM). The researchers are said to have informed the company that the enforcement system Twitter primarily uses, RedPanda, is “a legacy, unsupported tool” that is "by far one of the most fragile, inefficient and under-supported tools" it employs.
While the company has machine learning systems, those seemingly struggle to detect new instances of CSAM in tweets and livestreams. Twitter manually reports CSAM to the National Center for Missing and Exploited Children (NCMEC). However, the researchers noted that the labor-intensive process led to a backlog of cases and a delay in reporting CSAM to NCMEC. Rosborough told The Verge that since the researchers released their report last year, Twitter has significantly increased its investment in detecting CSAM and is hiring several specialists to tackle the issue.
“Twitter has zero tolerance for child sexual exploitation,” Rosborough said. “We aggressively fight online child sexual abuse and have invested significantly in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of bad-faith actors and to help ensure we’re protecting minors from harm — both on and offline.”
Advertisers may have bristled at the notion of Adult Content Monetization (even though porn is widespread on the platform), but the potential financial upside for Twitter was clear. OnlyFans expects to bring in $2.5 billion in revenue this year, roughly half of what Twitter generated in 2021. Twitter already offers creators several ways to directly monetize the large audiences many of them have built on the platform, and adding OnlyFans-style features might have been a goldmine for adult content creators and the company alike. However, those broader safety issues have prevented the company from taking that step, despite the improvements it claims to have made over the last 18 months.
Engadget has contacted Twitter for comment.
via engadget.com