
Twitter wanted to earn money from porn like OnlyFans does. It did not work out.

Vivek Mishra | Sep 01, 2022 | 12:59

Twitter wanted to give adult content creators the ability to begin selling OnlyFans-style paid subscriptions. (Photo: Reuters)

Earlier this year, Twitter considered monetizing adult content on the platform in a bid to make the social media company more profitable.

But the idea was put on hold after Twitter found that it could not effectively detect child sexual abuse material (CSAM) on the platform.

According to a report by The Verge, Twitter wanted to give adult content creators the ability to sell OnlyFans-style paid subscriptions, with Twitter keeping a share of the revenue.


Adult Content Monetization: Twitter already hosts a large amount of adult content and porn. Yet despite its size, the company does not generate profits on the scale of Facebook, Google or Instagram. Executives at Twitter therefore asked: why not monetize the adult content already on the platform?

OnlyFans is an internet content subscription service used primarily for adult content. It is projecting $2.5 billion in revenue this year, and Twitter executives thought the company could easily capture a share of that money, since Twitter is already the primary marketing channel for most OnlyFans creators, The Verge reported.

Twitter found that it lacked tools to verify if creators and consumers of adult content were of legal age. (Photo: Unsplash)

This led to a new internal project called ACM, or Adult Content Monetization.

Why it was not launched: Before going ahead with ACM, Twitter formed a team of 84 employees, known as the "Red Team", to vet the project. During its research, the team found that "Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," The Verge reported.

The team also found that Twitter lacked tools to verify whether creators and consumers of adult content were of legal age. As a result, the project was delayed indefinitely.

Why can't Twitter detect harmful sexual content? To detect child sexual abuse material (CSAM) and other harmful sexual content, Twitter relies on PhotoDNA, a Microsoft-developed technology that matches images against a database of known CSAM, helping platforms quickly identify and remove previously flagged material. But if a piece of CSAM isn't already in that database, it can evade detection; newer or digitally altered images are the typical gap, TechCrunch reported.
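
The underlying technique is hash matching against a database of known images. As a rough illustration only (PhotoDNA itself is proprietary and its hashes are not public), here is a minimal sketch using the open-source Python imagehash library as a stand-in; the file names and distance threshold are hypothetical:

```python
# A minimal sketch of hash-database matching, the general technique behind
# tools like PhotoDNA. PhotoDNA itself is proprietary; the open-source
# `imagehash` library stands in here, and the file names and threshold
# below are hypothetical.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of previously flagged images.
KNOWN_HASHES = {
    imagehash.phash(Image.open(path))
    for path in ("flagged_001.png", "flagged_002.png")
}

MAX_DISTANCE = 5  # Hamming-distance threshold: small edits still match.

def matches_known_image(path: str) -> bool:
    """Return True if the image is close to any hash in the database."""
    candidate = imagehash.phash(Image.open(path))
    # Perceptual hashes tolerate minor alterations, but an image that was
    # never added to the database produces no match at all -- which is why
    # this approach cannot catch new, previously unseen material.
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

The limitation the Red Team flagged follows directly from this design: database matching can only recognize material that has already been hashed and catalogued.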
