Teen communities throughout the internet cleverly name and cull their discussions to avoid detection. They present a nearly intractable problem for social media companies under pressure to do something about material on their services that many people believe is causing harm, particularly to teenagers. Those concerns came into sharp focus in recent weeks in a pair of Senate subcommittee hearings: the first featuring a Facebook executive defending her company, and the second featuring a former Facebook employee turned whistle-blower who bluntly argued that her former employer’s products drove some young people toward eating disorders.
The hearings were prompted in part by a Wall Street Journal article that detailed how internal Facebook research showed Instagram, which is owned by Facebook, can make body image issues worse for some young people. On Tuesday, executives from YouTube, TikTok and Snapchat are scheduled to testify before a Senate subcommittee about the effects of their products on children. They are expected to face questions about how they moderate content that might encourage disordered eating, and how their algorithms might promote such content. “Big Tech’s exploiting these powerful algorithms and design features is reckless and heedless, and needs to change,” Senator Richard Blumenthal, a Democrat of Connecticut and the chair of the subcommittee, said in a statement. “They seize on the insecurities of children, including eating disorders, simply to make more money.”
But what exactly can be done about that content — and why people create it in the first place — may defy easy answers. If creators say they don’t intend to glamorize eating disorders, should their claims be taken at face value? Or should the companies listen to users who complain about them? “Social media in general does not cause an eating disorder. However, it can contribute to an eating disorder,” said Chelsea Kronengold, a spokeswoman for the National Eating Disorders Association. “There are certain posts and certain content that may trigger one person and not another person. From the social media platform’s perspective, how do you moderate that gray area content?”
The association advises social media companies to remove content that explicitly promotes eating disorders and to offer help to users who seek it out. But young people have formed online communities where they discuss eating disorders and swap tips for the best ways to lose weight and look skinny. Using creative hashtags and abbreviations to get around filters, they share threads of emaciated models on Twitter as inspiration, create YouTube videos compiling low-calorie diets, and form group chats on Discord and Snapchat to share how much they weigh and encourage others to fast.
Influencers in fashion, beauty and fitness have all been accused of promoting eating disorders. Experts say that fitness influencers in particular can often serve as a funnel to draw young people into extreme online eating disorder communities. YouTube, Snapchat, TikTok and Twitter have policies prohibiting content that encourages eating disorders. The companies should improve their algorithms that can surface such content, Kronengold said. “It becomes an issue, especially when people are coming across this content who can be harmed by it or don’t want to see it,” she said.
The writers are journalists with The New York Times. © 2021 The New York Times