
X Says That Over 500K Users Now Contribute to Community Notes

X (formerly Twitter) has reached another milestone with its crowd-sourced Community Notes fact-checking feature, which now has over 500,000 contributors feeding notes into the system.


As per X:

“There are now over half a million Community Notes contributors in 70 countries around the world. Taking a minute to reflect on what contributors have done: in 2023 we showed 37,000+ notes well over 14 billion times, and in just the first four months of 2024, we’ve already shown 29,000+ notes that have been seen over 9 billion times. An amazing pace of growth, covering more topics in more languages every day.”

Community Notes, originally called Birdwatch, was an initiative spearheaded by Jack Dorsey when he was in charge of Twitter back in 2021. The program enables X users to contribute fact-checks and corrections, with citations and references then displayed in-stream.


The idea is that this expands the platform’s capacity to counter misinformation and limit its spread, and studies have shown that it can be effective in reducing the amplification of false reports.

Indeed, as cited by X, studies have shown that:

  • Users repost content 61% less often after a post gets a Community Note
  • Posts are deleted at much higher rates once a Community Note has been attached
  • Community Notes are perceived as “significantly more trustworthy” than traditional misinformation flags

There is definite value in the Community Notes system, and in enabling users to contribute their own fact-checks and clarifications on posts, and it has had an overall positive effect. But various other studies have shown that Community Notes is not effective at dispelling all types of misinformation in the app, mostly because the process is too slow to limit the spread of false information before it’s had an effect, and because X requires that every note gain agreement from people of opposing political viewpoints before it’s displayed in the app.

And that does have a big impact. According to a study conducted by MediaWise last year, only around 8.5% of the Community Notes created are ever displayed in the app, with many divisive topics never getting accurate fact-checks because the notes fail to reach cross-political agreement.

X is continually refining its system, particularly in terms of the time it takes for a Note to be shown in the app. But the basic requirements of review and agreement do mean that Community Notes fail to dispel many reports before they’ve had a chance to proliferate, while on many topics, Notes are simply never shown because they don’t achieve the required consensus.

That’s not such a big problem when Community Notes is used as a complement to internal moderation systems. But as X continues to put more reliance on crowd-sourced fact-checks as a replacement for internal moderation staff, the potential flaws become more significant.

Indeed, the European Commission recently requested an explanation from X as to why its moderation workforce has decreased from 2,294 staff in September to 1,849 as of April 2024.

X, of course, has cut over 80% of its total staff since Musk took control of the app, and many have praised its cost-cutting, given the relatively minimal impacts on the platform, at least externally.

But at the same time, various reports have indicated that hate speech is on the rise in the app, as well as various other content concerns, including porn bots and spam, which likely would have been weeded out, or at least reduced, by a broader moderation workforce.

But X seems determined that Community Notes is a better way forward, and that its user community is better placed to moderate content, as opposed to a company-appointed set of assessors.

Seemingly, X’s ad partners don’t agree, with the company’s ad intake reportedly still down 50% on previous levels.

But at the same time, Community Notes is improving and expanding, and at some stage its capacity may increase enough to cover the workload lost through the reduction in staff moderators.
