
Mastodon has child abuse material problem: Study

Representative Image (Photo: IANS)

SAN FRANCISCO: Decentralised social media platform Mastodon, viewed as a viable alternative to Twitter (now X), is rife with child sexual abuse material (CSAM), a new study has shown.

According to researchers at the US-based Stanford Internet Observatory, the findings raise serious concerns about the effectiveness of safety efforts on so-called 'decentralised' platforms, which let users join independently run communities with their own moderation rules, particularly when it comes to the internet's most vile content, The Wall Street Journal reports.

Over two days, the researchers found 112 matches for known CSAM, in addition to nearly 2,000 posts that used the 20 hashtags most commonly associated with the exchange of abuse material.

The Internet Observatory searched for CSAM on the 25 most popular Mastodon instances. Researchers used Google's SafeSearch API to identify explicit images, along with PhotoDNA, a Microsoft-developed tool that matches images against hashes of known CSAM.
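For readers curious about the tooling, below is a minimal sketch of what such an automated check might look like, using Google Cloud Vision's SafeSearch detection from Python. This is an illustration only, not the study's actual pipeline; the file name and flagging threshold are hypothetical.

    # Illustrative sketch -- not the Stanford Internet Observatory's pipeline.
    # Assumes the google-cloud-vision client library and valid credentials.
    from google.cloud import vision

    def is_explicit(path: str) -> bool:
        """Flag an image that SafeSearch rates as very likely adult content
        (a hypothetical threshold chosen for this example)."""
        client = vision.ImageAnnotatorClient()
        with open(path, "rb") as f:
            image = vision.Image(content=f.read())
        annotation = client.safe_search_detection(image=image).safe_search_annotation
        return annotation.adult == vision.Likelihood.VERY_LIKELY

    print(is_explicit("example.jpg"))  # "example.jpg" is a placeholder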

“We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organisation of doing any kind of social media analysis, and it’s not even close,” David Thiel, one of the study’s authors, was quoted as saying.

During its search, the team discovered 554 pieces of content that matched hashtags or keywords commonly used by online child sexual abuse groups, all of which were identified as explicit with 'highest confidence' by Google SafeSearch.

There were also 713 uses of the top 20 CSAM-related hashtags on Fediverse (a group of federated social networking services) posts containing media, and 1,217 text-only mentions of 'off-site CSAM trading or grooming of minors'.

According to the study, the open posting of CSAM is 'disturbingly prevalent'.

Decentralised networks do not approach moderation in the same way that mainstream sites such as Facebook, Instagram and Reddit do. Instead, moderation is delegated to each decentralised instance, which can lead to inconsistency across the Fediverse.

IANS