
Moderating Eating Disorder Content Is Harder Than You Think

DATE POSTED: March 18, 2024

Both troubled teens and government agencies are asking, “How thin is thin enough?” The teens are thinking about how thin they want to look, while the government is thinking about what’s too thin to post online.

The refrain is always the same: the platforms need to do more; never mind the difficult details. Platforms need to remove posts that encourage eating disorders—but leave space for people to talk about recovery. They need to increase moderation. But they’re not supposed to use AI, which is fallible, easily gamed, and potentially illegal. And they’re not supposed to use more human moderators, who will be traumatized by their experiences.

Content moderation is impossible to do well at scale. Moderation around eating disorder content is especially fraught. And there’s no way to eliminate ED content online without broadly shutting down entire parts of the Internet.

Newcomers to this topic might think it’s easy to tell whether a particular post qualifies as eating disorder content. Is there a picture of a starving girl or not? We think we know it when we see it. And this is true sometimes: plenty of ED content openly identifies as such, usually in order to cordon itself off from the rest of the Internet.

An account bio might read, “edtwt, minors dni”—shorthand for “eating disorder Twitter, minors do not interact.” The bio might also note whether the person is pro-recovery, whether they post fatphobia, their favorite K-pop group, whether they go to the gym, what kind of eating disorder they have, and what kind of fashion they like.

Behind these accounts are individuals who are complex, imperfect, and hurting. And this content has a context. A post that is obviously eating disorder content on this account may not obviously be eating disorder content when posted elsewhere. To be “pro-recovery” is to want to recover from your eating disorder, or at least to be open to it. The person behind an account may be in therapy or receiving outpatient treatment. Both accounts dedicated to eating disorders and accounts run by people who struggle with eating disorders without centering them may discuss experiences with the disorder and with recovery. Platforms cannot simply ban discussions about eating disorders without sweeping away plenty of honest conversations about mental health, which might also drive the conversation towards less helpful corners of the Internet.

And the platforms probably can’t ban fatphobia, at least not in a way that would take care of accounts that use fatphobia to encourage disordered behavior. If an ED account posts a video of a person with a fatphobic caption, platforms can delete the ED account, but the platform shouldn’t delete the original video just because it could be harmful in the wrong hands. The same goes for pictures of people who happen to be skinny: the distinction between lean and clearly anorexic is not as clear-cut as you might like to imagine. We could have everyone register their BMIs and daily caloric intake with their username and password, to be sure that no unhealthy daily outfit pictures slip past the mods, but short of that appalling dystopia, we’re out of luck.

This is all a bit hypothetical and ridiculous, because after all, platforms currently host plenty of self-identified ED content. The platforms could start by banning that.

And once platforms start banning openly disordered content, ED accounts would quickly stop being open about it. This isn’t speculative. These accounts already have all sorts of alternative names and phrases invented specifically to evade content moderation. They can invent ED dog whistles much faster than platforms can ban them. Or they can create a new account that doesn’t post eating disorder content at all—only borderline content: Kate Moss, particularly thin K-pop idols, girls who might have an eating disorder but might just be really into yoga. Sometimes you know it when you see it, but sometimes you don’t. When is posting a picture of a pack of gum ED content, and when is it just posting a picture of a pack of gum?
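
To make the cat-and-mouse dynamic concrete, here is a deliberately naive sketch of keyword-based moderation. The blocklist, terms, and function name are hypothetical placeholders, not any platform’s real system. A single swapped character, let alone a freshly invented dog whistle, slips straight past an exact-match filter, and borderline content contains no keyword to match at all:

```python
# A deliberately naive blocklist filter, illustrating why static keyword
# lists lose the cat-and-mouse game. The blocklist contents are placeholders;
# real platforms use far more sophisticated (and still evadable) classifiers.

BLOCKLIST = {"bannedterm"}  # hypothetical moderated phrase

def violates_blocklist(post: str) -> bool:
    """Flag a post only if it contains an exact blocklisted word."""
    return any(term in post.lower().split() for term in BLOCKLIST)

print(violates_blocklist("talking about bannedterm openly"))  # True: exact match is caught
print(violates_blocklist("talking about b4nnedterm openly"))  # False: one swapped character evades it
print(violates_blocklist("a photo of a pack of gum"))         # False: borderline content has no keyword at all
```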

Proposals and lawsuits that aim to make platforms liable for harmful eating disorder content are asking for the impossible. The best outcome they could hope for is that platforms crack down on explicitly rule-breaking content and that ED content gets a bit more covert. Because it’s not going away. Companies have been removing ED content since 2001. Eating disorders are older than social media, and advocates who think platforms can moderate ED content out of existence understand neither eating disorders nor content moderation.

No one is saying platforms should do nothing. They should ban accounts and remove content that violates platform rules. And platforms should keep developing their current tools. They should prompt users to reach out to helplines or to take breaks. They should give users broad controls to mute certain topics and ads. But platforms will never be able to remove all eating disorder content. Nor will they be able to remove all users with eating disorders. Nor should they, even if they could.

Social media platforms cannot solve eating disorders. They can and should aim to host better conversations in general, but we should stop expecting them to moderate mental health problems away.

Santana Boulton is a legal fellow with TechFreedom.