X Updates Policies To Allow Sexual Content within the App

Could X be looking to monetize adult content in the future, as another potential revenue path for the app?

Today, the platform formerly known as Twitter updated its policy on adult content, meaning that “consensually produced and distributed adult nudity or sexual behavior” is now officially allowed in the app.

I mean, it goes with the name, I guess.

As per X:

“We believe that users should be able to create, distribute, and consume material related to sexual themes as long as it is consensually produced and distributed. Sexual expression, whether visual or written, can be a legitimate form of artistic expression. We believe in the autonomy of adults to engage with and create content that reflects their own beliefs, desires, and experiences, including those related to sexuality.”

In other words, X is now okay with porn. Which it’s unofficially been okay with since forever, but it’s now making its approach more explicit in this respect.

X says that it still has strong rules against “content promoting exploitation, nonconsent, objectification, sexualization or harm to minors, and obscene behaviors”, while it also won’t allow adult content to be shown in profile photos or profile banners. So overall, its restrictions haven’t changed, but it will now be more open to adult material.

So why the update?

As noted, X has long allowed porn indirectly, by simply not enforcing its rules, and letting users post what they like in the app. That’s led to various challenges around enforcement and exposure (as well as ad placement, a key concern given its more recent struggles on this front), though at the same time, it’s also been a solid driver of traffic and engagement for the app.

So much so, in fact, that back in 2022, then-Twitter management actually explored the possibility of enabling adult content creators to sell subscriptions in the app, as part of a potential push to tap into OnlyFans’ $2.5 billion creator content market.

Given that so much of this content is already present in the app, leaning into this element, rather than just turning a blind eye to it, seems like a logical way for the platform to make more money.

Except, Twitter’s team eventually decided that it couldn’t do it.

Why?

As reported by The Verge:

“Before the final go-ahead to launch, Twitter convened 84 employees to form what it called a ‘Red Team.’ The goal was ‘to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly.’ What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not – and still is not – effectively policing harmful sexual content on the platform.”

Essentially, Twitter’s team couldn’t provide any assurance that it would be able to stop this type of exploitation in the app, because its detection measures were not adequate.

But have they potentially improved under Elon Musk?

Musk has made a lot of noise about addressing child exploitation content in the app, calling it his top priority early on in his tenure as Twitter owner. Yet, since then, Musk has also culled 80% of the platform’s staff, including many from its moderation and safety teams, so it seems unlikely that he’s actually been able to improve this with a much smaller headcount.

But that may also be changing.

Back in January, X announced a plan to build a new “Trust and Safety center of excellence” in Texas, in order to improve its responsiveness in addressing child exploitation specifically.

Maybe, with that in place, X could be in a better position to address such content, which could, at least in theory, enable it to re-examine its adult content monetization plans.

And X certainly needs the money right now, given that its ad revenue remains down by an estimated 50% from pre-Musk levels, while X Premium take-up is nowhere close to picking up the slack.

Add to this the fact that adult content has reportedly increased significantly in the app since Musk’s staff cuts, and there are quite a few indicators that this could be where X is heading.

X also added an “Adult Content” filter for Communities back in March.

Again, X by name…

At this stage, however, X does still have definitive rules in place that restrict adult content monetization, with a note in its Ad Policy stating that:

X prohibits the promotion of adult sexual content globally.

But that, technically, only applies to ads for such content, not creators making money from subscriptions.

And maybe, that too could soon change.