'Search engines rife with content glorifying self-injury can raise suicide risk'

NEW DELHI: Internet search engines like Google, Microsoft Bing, DuckDuckGo, Yahoo! and AOL are rife with content that glorifies or celebrates self-injury, and can act as gateways to self-harm and raise suicide risk, a report warned on Wednesday.

The report, by the Office of Communications (Ofcom), the UK's communications regulator, analysed over 37,000 web pages, images and videos.

The researchers entered common search queries for self-injurious content, as well as cryptic terms typically used by online communities to conceal their real meaning.

They found that one in every five (22 per cent) results linked, in a single click, to content which celebrates, glorifies, or offers instruction about non-suicidal self-injury, suicide or eating disorders.

Nineteen per cent of the very top links on page one linked to content promoting or encouraging these behaviours, rising to 22 per cent across the top five page-one results.

Image searches carried particular risk, delivering the highest proportion of harmful or extreme results (50 per cent), followed by web pages (28 per cent) and video (22 per cent).

Research has already shown that images can be particularly likely to inspire acts of self-injury.

It can also be hard for detection algorithms to distinguish between visuals glorifying self-harm and those shared in a recovery or medical context.

"Search engines are often the starting point for people’s online experience, and we’re concerned they can act as one-click gateways to seriously harmful self-injury content," said Almudena Lara, Online Safety Policy Development Director, in a statement.

"Search services need to understand their potential risks and the effectiveness of their protection measures -- particularly for keeping children safe online -- ahead of our wide-ranging consultation due in Spring," Lara added.

Further, they found that people are six times more likely to find harmful content about self-injury when entering deliberately obscured search terms, a common practice among online communities.

The specific and evolving nature of these terms poses significant detection challenges for services.

One in five (22 per cent) search results were categorised as ‘preventative’, linking to content focused on getting people help -- such as mental health services or educational material about the dangers of self-injury.

The researchers urged search services to take steps to minimise the chances of children encountering harmful content on their services -- including content that promotes self-harm, suicide and eating disorders.
